Google opens up Gemma, a lightweight large model: is the era of general-purpose AI at hand?
hughmini
Posted on 2024-2-22 11:24:19
On February 21, Google released Gemma, a new family of "open" artificial intelligence models, meaning external developers can build them into their own products. Google thus joins Meta as another major technology company betting on open large models, accelerating the arrival of general-purpose AI.
Google said Gemma is a family of "lightweight" state-of-the-art open models built from the same research and technology used to create the Gemini models. Developers can use the Gemma open models to build AI software free of charge, and the company is releasing key technical assets such as the model weights.
Google CEO Sundar Pichai said, "Gemma demonstrates strong performance and, starting today, will be available globally and can run on a laptop or on Google Cloud."
Market analysts suggest that opening its models may attract software engineers to build on Google's technology stack and drive usage of its newly profitable cloud division. Google said the models have also been optimized for Google Cloud.
Gemma is not fully open source, however: Google can still set the terms of use and retains ownership of the models.
Compared with Google's previously released Gemini models, the Gemma models are reportedly much smaller, offered in 2-billion and 7-billion parameter versions. Google has not disclosed the parameter count of its largest Gemini model.
Google said, "Gemini is the largest and most capable AI model in wide use today. The Gemma models share technology and infrastructure components with Gemini and can run directly on a developer's laptop or desktop."
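The claim that a 2B or 7B model can run on a laptop comes down to the memory the weights occupy. The sketch below is a rough back-of-envelope estimate only; real memory use also depends on the quantization format, the KV cache, and activations, which it ignores.

```python
def weight_footprint_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate memory needed just to hold the model weights, in decimal GB."""
    return n_params * bytes_per_param / 1e9

# Weights-only footprint at common precisions (fp16 = 2 bytes,
# int8 = 1 byte, int4 = 0.5 bytes per parameter).
for name, n_params in [("Gemma 2B", 2e9), ("Gemma 7B", 7e9)]:
    for label, nbytes in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
        print(f"{name} @ {label}: ~{weight_footprint_gb(n_params, nbytes):.1f} GB")
```

At fp16, the 2B model's weights fit in roughly 4 GB and the 7B model's in roughly 14 GB, which explains why the smaller version is practical on an ordinary laptop while the larger one typically needs quantization or a workstation-class GPU.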
The company also emphasized that Gemma surpasses significantly larger models on key benchmarks while adhering to strict standards for safe and responsible outputs.
By comparison, Meta's open-source Llama 2 models top out at 70 billion parameters, and OpenAI's GPT-3 has 175 billion.
In an accompanying technical report, Google compared the 7-billion-parameter Gemma model against several others, including Llama 2 7B, Llama 2 13B, and Mistral 7B, across multiple dimensions. Gemma outperformed these competitors on benchmarks covering question answering, reasoning, mathematics/science, and code.
Nvidia said at Gemma's launch that it has worked with Google to ensure the models run smoothly on its chips, and that it will soon release chatbot software designed to work with Gemma.
Opening up smaller-parameter AI models is also a business strategy for Google. iFlytek previously made a similar choice, open-sourcing smaller versions of its models.
Liu Qingfeng, chairman of iFlytek, explained to a reporter from First Financial News: "For general-purpose large models, the key is who has the best performance; open-source large models are about building an ecosystem. So, from a technical standpoint, open-source large models generally sit slightly below the general-purpose flagship models."
"We have also observed that many companies may hold back their largest model, hoping to maintain a moat for commercialization," a researcher working on large AI models told a First Financial reporter.
Views on open-source large models currently differ: some experts worry that open AI models may be abused, while others support the open approach, arguing that it promotes technological development and broadens who benefits.
LogoMoney.com is an information publishing platform and provides information storage services only.
Disclaimer: The views in this article are solely the author's own, do not represent the position of LogoMoney.com, and do not constitute advice; please treat them with caution.