Monday, June 24, 2024

How to Enable and Use Local AI Models on Opera One Developer

What to know

  • Opera lets you download and use AI models locally on its Opera One Developer browser.
  • Enable and download AI models from the Aria side panel (on the left), then Choose local AI model > Go to settings.

Opera recently announced the integration of local AI models into its Opera One browser. With this development, which makes Opera the first major browser with built-in local AI models, you can pick from roughly 150 Large Language Model (LLM) variants, store them locally, and use them offline. It’s also a more private and secure way of using AI on your machine. Here’s everything you need to know to start using AI models locally on the Opera One browser.

How to enable Local AI models in Opera One Developer

Currently, the option to download and use AI locally in Opera One is available only for the Developer version of the browser. Here’s how to set up the browser and enable local LLMs.

  1. Go to Opera’s download page for Opera One Developer and click Download now under ‘Opera One Developer’.
  2. Once downloaded, run the setup, and follow the on-screen instructions.
  3. You’ll need to sign in to your Opera account to use AI models. You can sign up with your email or use a Gmail, Facebook, or Apple account.
  4. Next, click on Opera’s default AI ‘Aria’ in the side panel on the left.
  5. Under ‘New chat’, click on Choose Local AI Model.
  6. Click Go to settings.
  7. Here, make sure the option Enable Local AI Models in Chat is toggled on.

How to download and use local AI models on Opera One Developer

Opera One Developer offers several AI models that you can download and start using locally from within the browser itself. Here’s how to do so:

  1. Open Aria from the side panel on the left.
  2. Under ‘Local AI Models’, you’ll find a range of LLMs to pick from, including Meta’s Llama, Google’s Gemma, Vicuna, Mistral AI’s Mixtral, Starcoder, and more. Click on an AI model to view it.
  3. In most cases, you’ll see two or more variants that you can download.
  4. For demonstration purposes, we’re choosing a lightweight AI model – Microsoft’s Phi-2. Click on the download button next to your preferred variant.
  5. Wait for the LLM to download on your machine.
  6. Next, to switch from Opera One’s default AI model, Aria, to the downloaded LLM, click on the Menu button in the top-left corner.
  7. Select New chat.
  8. Under ‘New chat’, click on Choose local AI model.
  9. Then select your downloaded AI model.
  10. And just like that, you’ll have access to the AI model locally in the Opera One browser. Type in a prompt to start using it like any other AI chatbot.

FAQ

Let’s consider a few commonly asked questions about using local AI models on the Opera One browser.

What are the benefits of using AI models locally?

The benefits of using AI models locally include better privacy and security compared to online AI chatbots, which send your data to remote servers. And since the LLM is stored locally, you don’t even need an internet connection to chat with it.

Which AI model should I download for the Opera One browser?

There are several AI models that you can store locally, and most of them come in multiple variants depending on their parameter count (in billions). However, Opera highlights a few interesting LLMs that you can explore, such as Code Llama, which is ideal for coding; Phi-2, which is good for question-and-answer chats; and Mixtral, known for its performance, accessibility, and versatility.

How do I switch back to Aria?

To switch back to Opera’s homegrown AI chatbot, Aria, simply start a new chat.

We hope this guide helped you set up and start using AI models on the Opera One Developer browser locally. Until next time!
