
Highlights

  • In the same way the automobile granted us personal freedom to explore, Local III starts our journey towards a new freedom: personal, private access to machine intelligence.
  • Developed by over 100 contributors spanning every timezone, this update includes an easy-to-use local model explorer, deep integrations with inference engines like Ollama, custom profiles for open models like Llama3, Moondream, and Codestral, and a suite of settings to make offline code interpretation more reliable.
  • Local III also introduces a free, hosted, opt-in model via interpreter --model i. Conversations with the i model will be used to train our own open-source language model for computer control.
  • Local III also introduces a free language model endpoint serving Llama3-70B. This endpoint provides users with a setup-free experience while contributing to the training of a small, locally running language model.
  • Where model is a model from Ollama’s model library. This unified command abstracts away all model setup commands. It downloads the model only if you haven’t downloaded it before.
  • Images sent to local models are replaced with a text description generated by Moondream, a tiny vision model. The language model also receives OCR text extracted from the image.
  • If this revolution is to broadly distribute its benefits, it must belong to the people. In classical computing, society transitioned away from the mainframe era of access to build the personal computer. This helped ensure a destiny for computers that we could control.
  • Now, an oligopoly of language model providers stands to control the intelligence age. Open Interpreter is a balancing force against that. Our community is rapidly developing a response to ensure our collective freedom: private, local access to powerful AI agents.