How to Connect Local LLMs to MindStudio AI Agents

What You'll Need
- Node.js 18 or newer — Download from nodejs.org
- A local model server — choose one: Ollama (recommended) or LM Studio
- A MindStudio account
Step 1: Install Node.js
- Visit nodejs.org.
- Download the installer for your operating system.
- Run the installer and follow the prompts.
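To confirm the installation worked and that you have version 18 or newer, check from a terminal:

```bash
node --version   # should print v18.0.0 or higher
npm --version    # npm ships with Node.js; you'll need it in Step 3
```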
Step 2: Set Up Your Local Model Server
Option A: Using Ollama (Recommended)
- Download Ollama from ollama.com.
- Install it on your computer.
- After installation, quit the Ollama app if it's running in your system tray (the desktop app occupies the same port that `ollama serve` needs).
- Open Terminal with administrator privileges:
  - Mac: Open Terminal, then run `sudo -s`
  - Windows: Search for "Terminal", press Ctrl+Shift+Enter, then click "Yes"
  - Linux: Press Ctrl+Alt+T, then run `sudo -i`
- Start Ollama:
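```bash
ollama serve   # starts the Ollama server on http://localhost:11434
```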
Leave this Terminal window open and running.
Option B: Using LM Studio
- Download LM Studio from lmstudio.ai.
- Install and open the application.
- Download a model through LM Studio's built-in model browser.
- Enable the local server (in current versions, open the Developer tab and click "Start Server"; it listens on http://localhost:1234 by default).
- Keep LM Studio running.
Step 3: Install the MindStudio Tunnel
Open a new Terminal window (keep your model server running in the first one) and install the tunnel tool:
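Assuming the tunnel is distributed as an npm package named `mindstudio-local` (matching the CLI name used throughout this guide), install it globally:

```bash
# package name assumed from the CLI name; check MindStudio's docs if it differs
npm install -g mindstudio-local
```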
Step 4: Connect to Your MindStudio Account
Run this command to authenticate:
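```bash
mindstudio-local auth
```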
This will open your browser and create an API key in your MindStudio account (found under Developer > API Keys).
Step 5: Download an AI Model (If Needed)
If you're using Ollama and don't have any models yet, download one:
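For example (`llama3.2` is just one choice; any model from the Ollama library works):

```bash
ollama pull llama3.2   # example model; substitute any Ollama model name
```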
If you're using LM Studio, download models through the LM Studio application.
Step 6: Register Your Models
- Ensure that your local server is running.
- Tell MindStudio about your local models:
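```bash
mindstudio-local register
```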
This scans your local setup and makes your models available in MindStudio.
Step 7: Start the Tunnel
Launch the connection between your computer and MindStudio:
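```bash
mindstudio-local start
```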
Keep this Terminal window open while you're using your local models. The tunnel will stay active until you close it or lose internet connection.
Step 8: Verify Everything Works
- Log into MindStudio.
- Navigate to Service Router > Self-Hosted Models.
- Your local models should appear in the list.
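You can also verify from the command line:

```bash
mindstudio-local models   # see available models
mindstudio-local status   # check connection status
```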
Step 9: Use Your Local Model in an AI Agent
- Create a new AI agent in MindStudio.
- Add a "Generate Text" block.
- Click "Model Settings" in that block.
- Look under the Local Models category.
- Select your local model.
- Build and test your agent as usual.
As long as the tunnel is running, your agent will use your local model instead of cloud-based models.
Useful Commands
- `mindstudio-local auth` — Log in to MindStudio
- `mindstudio-local register` — Refresh your model list
- `mindstudio-local start` — Start the tunnel
- `mindstudio-local models` — See available models
- `mindstudio-local status` — Check connection status
- `mindstudio-local logout` — Sign out
Advanced: Custom Server URLs
If your local server runs on a different port:
For Ollama:
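The command below is a sketch: the `--base-url` flag is an assumption, so run `mindstudio-local --help` to confirm the actual option name. Ollama's API defaults to http://localhost:11434:

```bash
# hypothetical flag; adjust the URL to match your Ollama host and port
mindstudio-local register --base-url http://localhost:11434
```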
For LM Studio:
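LM Studio serves an OpenAI-compatible API, by default at http://localhost:1234/v1:

```bash
# same hypothetical flag as above; adjust to your LM Studio port
mindstudio-local register --base-url http://localhost:1234/v1
```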
That's it! You're now running local LLMs while building agents in MindStudio's visual interface. This setup avoids cloud model costs and keeps your data on your own machine.

