How to Connect Local LLMs to MindStudio AI Agents

Connect local language models running on your computer to MindStudio, so you can build AI agents without paying for cloud-based model usage.

What You'll Need

  • Node.js 18 or newer — Download from nodejs.org
  • A local model server — Choose one: Ollama (recommended) or LM Studio, both covered in Step 2

Step 1: Install Node.js

  1. Visit nodejs.org.
  2. Download the installer for your operating system.
  3. Run the installer and follow the prompts.
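
To confirm the installation worked, check the versions from any terminal (Node.js installs also bundle npm):

Terminal
node --version   # should print v18.x or newer
npm --version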

Step 2: Set Up Your Local Model Server

Option A: Using Ollama (Recommended)

  1. Download Ollama from ollama.com.
  2. Install it on your computer.
  3. After installation, quit the Ollama app if it's running in your system tray.
  4. Open Terminal with administrator privileges:
    • Mac: Open Terminal, then run sudo -s
    • Windows: Search for "Terminal", press Ctrl+Shift+Enter, click "Yes"
    • Linux: Press Ctrl+Alt+T, then run sudo -i
  5. Start Ollama:
Terminal
ollama serve

Leave this Terminal window open and running.
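
To sanity-check that Ollama is listening, open a second Terminal window and hit its default endpoint; it should reply with a short status message:

Terminal
curl http://localhost:11434
# should reply: Ollama is running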

Option B: Using LM Studio

  1. Download LM Studio from lmstudio.ai.
  2. Install and open the application.
  3. Download a model through LM Studio.
  4. Enable the local server (check the LM Studio documentation for where this setting lives in your version).
  5. Keep LM Studio running.
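
To check that the local server is actually up, you can query its OpenAI-compatible models endpoint (assuming LM Studio's default port of 1234):

Terminal
curl http://localhost:1234/v1/models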

Step 3: Install the MindStudio Tunnel

Open a new Terminal window (keep your model server running in the first one) and install the tunnel tool. On Windows, run the command without sudo:

Terminal
sudo npm install -g @mindstudio-ai/local-model-tunnel
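
To confirm the package installed globally, ask npm to list it:

Terminal
npm list -g @mindstudio-ai/local-model-tunnel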

Step 4: Connect to Your MindStudio Account

Run this command to authenticate:

Terminal
mindstudio-local auth

This will open your browser and create an API key in your MindStudio account (found under Developer > API Keys).

Step 5: Download an AI Model (If Needed)

If you're using Ollama and don't have any models yet, download one:

Terminal
ollama pull llama3.2

If you're using LM Studio, download models through the LM Studio application.
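
With Ollama, you can confirm which models are downloaded and available before registering them:

Terminal
ollama list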

Step 6: Register Your Models

  1. Ensure that your local server is running.
  2. Tell MindStudio about your local models:
Terminal
mindstudio-local register

This scans your local setup and makes your models available in MindStudio.
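
To double-check what was picked up, the tunnel's models command (also listed under Useful Commands below) prints every model MindStudio can now see:

Terminal
mindstudio-local models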

Step 7: Start the Tunnel

Launch the connection between your computer and MindStudio:

Terminal
mindstudio-local start

Keep this Terminal window open while you're using your local models. The tunnel stays active until you close it or your internet connection drops.
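
If you're ever unsure whether the tunnel is still connected, you can check from another Terminal window without interrupting it:

Terminal
mindstudio-local status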

Step 8: Verify Everything Works

  1. Log into MindStudio.
  2. Navigate to Service Router > Self-Hosted Models.
  3. Your local models should appear in the list.

Step 9: Use Your Local Model in an AI Agent

  1. Create a new AI agent in MindStudio.
  2. Add a "Generate Text" block.
  3. Click "Model Settings" in that block.
  4. Look under the Local Models category.
  5. Select your local model.
  6. Build and test your agent as usual.

As long as the tunnel is running, your agent will use your local model instead of cloud-based models.

Useful Commands

  • mindstudio-local auth — Log in to MindStudio
  • mindstudio-local register — Refresh your model list
  • mindstudio-local start — Start the tunnel
  • mindstudio-local models — See available models
  • mindstudio-local status — Check connection status
  • mindstudio-local logout — Sign out

Advanced: Custom Server URLs

If your local server runs on a non-default host or port, point the tunnel at the correct address. The URLs below are the defaults; substitute your own values:

For Ollama:

Terminal
mindstudio-local set-config --ollama-url http://localhost:11434

For LM Studio:

Terminal
mindstudio-local set-config --lmstudio-url http://localhost:1234
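
For example, if you had started Ollama on port 8080 (a hypothetical value, purely for illustration), you would point the tunnel there:

Terminal
mindstudio-local set-config --ollama-url http://localhost:8080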

That's it! You're now running local LLMs while building agents in MindStudio's visual interface. This setup avoids cloud model costs and keeps your data on your own machine.
