With Open Interpreter, you can execute code (Python, JavaScript, Shell, and more) on your local machine. Open Interpreter has a ChatGPT-like interface that you can access in your terminal by typing $ interpreter after installation.
This gives you a natural-language way to interact with your computer’s general-purpose features:
✅ Make and modify photos, videos, PDFs, etc.
✅ Use a Chrome browser to do research
✅ Visualize, process, and explore large datasets
…and many more.
How to Start Open Interpreter
pip install open-interpreter
Terminal
After installation, run interpreter:
interpreter
Python
import interpreter
interpreter.chat("Plot AAPL and META's normalized stock prices") # Executes a single command
interpreter.chat() # Starts an interactive chat
Interactive Chat
If you want an interactive chat in your terminal, launch the interpreter by typing interpreter on the command line:
interpreter
Or call interpreter.chat() from a .py file:
interpreter.chat()
Programmatic Chat
To send messages directly to the chat function, use .chat(message):
interpreter.chat("Add subtitles to all videos in /videos.")
# ... Streams output to your terminal, completes task ...
interpreter.chat("These look great but can you make the subtitles bigger?")
# ...
Start a New Chat
One feature of Open Interpreter in Python is that it remembers the conversation so far. To clear the history and start over, call the reset method:
interpreter.reset()
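To see why reset matters, here is a minimal sketch of a stateful chat history. This is an illustration only, not Open Interpreter's actual internals; the ChatSession class and its methods are hypothetical stand-ins:

```python
# Hypothetical illustration of why a stateful interpreter needs reset():
# each chat() call appends to a shared history, so earlier turns keep
# influencing later ones until the history is cleared.

class ChatSession:
    def __init__(self):
        self.messages = []  # accumulated conversation history

    def chat(self, text):
        # Record the user's message; a real interpreter would also
        # call a language model and append its reply here.
        self.messages.append({"role": "user", "content": text})
        return self.messages

    def reset(self):
        # Forget everything said so far, like interpreter.reset()
        self.messages = []

session = ChatSession()
session.chat("My name is Killian.")
session.chat("Plot some stock prices.")
print(len(session.messages))  # 2 turns remembered
session.reset()
print(len(session.messages))  # 0 -- history cleared
```

The point of the sketch: state lives on the session object, so every call builds on what came before, and reset is the only way to get a blank slate.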
Save and Restore Chats
If you set return_messages to True, interpreter.chat() will return a List of messages. You can use this list to continue a conversation later by passing it to interpreter.load(messages):
messages = interpreter.chat("My name is Killian.", return_messages=True) # Save messages to 'messages'
interpreter.reset() # Reset interpreter ("Killian" will be forgotten)
interpreter.load(messages) # Resume chat from 'messages' ("Killian" will be remembered)
Customize System Message
Open Interpreter’s system message can be customized and adjusted to enhance its features, change its access, or provide more background.
interpreter.system_message += """
Run shell commands with -y so the user doesn't have to confirm them.
"""
print(interpreter.system_message)
Change the Model
To use Code Llama, you can launch the interpreter locally from the command line:
interpreter --local
For gpt-3.5-turbo, use fast mode:
interpreter --fast
Or, in Python, set the model manually:
interpreter.model = "gpt-3.5-turbo"