Opera has introduced a new feature that allows users to download and run large language models (LLMs) locally on their computers. The feature is rolling out to Opera One users on the developer stream, giving them access to over 150 models from more than 50 families, including Meta's Llama, Google's Gemma, Vicuna, Mistral AI, and more.
Krystian Kolondra, EVP of Browsers and Gaming at Opera, said:
“Introducing Local LLMs in this way allows Opera to start exploring ways of building experiences and knowhow within the fast-emerging local AI space."
Opera is shipping these capabilities as part of its "AI Feature Drops Program" and says user data will be kept locally on the device, letting people use generative AI without sending information to a server. The company uses the open-source Ollama framework to run the models on users' computers. Each model variant takes up between 2 and 10 GB of local storage.
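For readers curious what "running a model through Ollama" looks like in practice: Ollama exposes a local HTTP API (on port 11434 by default) that any client can POST prompts to, which is how text generation stays entirely on the machine. The sketch below builds such a request in Python; the helper name and prompt are illustrative, and Opera's actual integration details are not public.

```python
import json

# Ollama serves a local HTTP API; by default it listens at
# http://localhost:11434, and clients generate text by POSTing
# JSON to the /api/generate endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str, stream: bool = False) -> str:
    """Build the JSON body for an Ollama /api/generate call.

    `model` is any locally pulled model tag (e.g. "llama2" or "gemma:2b").
    Because the server runs on localhost, no data leaves the machine.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

body = build_generate_request("llama2", "Summarize this page in one sentence.")
# With a local Ollama server running, this body could be sent with
# urllib.request or any HTTP client; here we only construct the payload.
```

The payload could then be POSTed with any HTTP client once a model has been pulled locally; without a running Ollama server the request would simply fail to connect, which is the point: there is no remote fallback.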
Opera is working hard to jump on the AI bandwagon, although local LLMs aren't the browser's first AI feature. With the launch of Opera One last year, the company made clear that it aims to be an AI-centric flagship browser, and it even launched Aria, its own AI assistant. Opera hopes such unique and innovative features will help it grow its market share; as of March 2024, it holds a 3.15% share of the desktop browser market, per Statcounter.
If you are interested in exploring this new feature, you can upgrade to the newest version of Opera Developer and follow the steps to activate local LLMs on your computer. Once activated, the selected local LLM will be used on your machine in place of Opera's own AI assistant, Aria, until you start a chat with Aria or switch it back on.