Running Elephas offline with Jan.ai

How to Get Elephas Running Offline: Let's Dive In!

There are two apps that you can use:

  • LM Studio (supports only Apple Silicon Macs: M1, M2, M3)
  • Jan.ai (supports both Intel and Apple Silicon Macs)

In this post, we will cover Jan.ai.

Why Choose Jan.ai?

Jan.ai makes things easy: you download AI models straight to your Mac and run them on a local server, so your data never leaves your machine and you skip the risky unknowns.

Simple privacy! And it works with both Intel and Apple Silicon chips.

 

Installing Jan.ai

Visit the Jan.ai website and download the installer to set it up on your Mac.

Notion image
 

After installing, open Jan.ai → Hub → and search for the model “Llama-3-8B-Instruct-32k-v0.1-GGUF”.

Notion image

There are many AI models available in Jan.ai (Offline AI); we suggest using the Llama 3 quantised model from here.

 

Click “Download”, then “Use” it.

Notion image
 

Click the “Start Server” button.

Notion image
💡
Make sure to enter the server's port in Elephas.
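Before connecting Elephas, you can confirm the local server is actually responding from Terminal. A quick sketch, assuming Jan's usual default port 1337 (use whatever port Jan shows after you click “Start Server”):

```shell
# List the models Jan's local server exposes through its
# OpenAI-compatible API. Port 1337 is an assumed default;
# replace it with the port shown in Jan's server screen.
curl -s http://localhost:1337/v1/models
```

If the server is running, this prints a JSON list of the models you downloaded; if it prints nothing, check that the server was started and the port matches.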

How to connect with Elephas

Now go to Elephas → Preferences → AI Providers → Custom AI, and enter your localhost URL (authorization is optional).
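Under the hood, a Custom AI provider like this talks to Jan's local server over an OpenAI-compatible API. Here is a minimal sketch of such a chat request, assuming the default base URL `http://localhost:1337/v1` and an illustrative model name (both may differ on your setup):

```python
import json
import urllib.request

# Assumed default: Jan's local server usually listens on port 1337
# and exposes an OpenAI-compatible API under /v1.
JAN_BASE_URL = "http://localhost:1337/v1"


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def send(req: urllib.request.Request) -> dict:
    """Send the request to the running Jan server and decode the JSON reply."""
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

This is what Elephas effectively does for you once the Custom AI provider is configured; you would only write code like this yourself for debugging the connection.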

Notion image

Then tap on “Refresh Models”.

How to use Jan.ai in Elephas

Now go to Elephas → Preferences → General. Pick a feature and select any of the Jan.ai (Offline AI) models available from the dropdown.

Notion image
 
 
Notion image
 

Super Chat

In Super Chat as well, you can select the Jan.ai (Offline AI) models.

Notion image

Super Brain

Super Brain indexing still requires an internet connection, as it depends on a backend AI engine; we will add offline capability soon.

 

Need help? We're Here for You!

Contact us at support@elephas.app.

 