bolt.diy (Previously oTToDev)

bolt.diy: AI-Powered Full-Stack Web Development in the Browser

Welcome to bolt.diy, the official open source version of Bolt.new (previously known as oTToDev and bolt.new ANY LLM), which allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.

Check the bolt.diy Docs for more information.

Also, this pinned post in our community has a bunch of incredible resources for running and deploying bolt.diy yourself!

We have also launched an experimental agent called the "bolt.diy Expert" that can answer common questions about bolt.diy. Find it here on the oTTomator Live Agent Studio.

bolt.diy was originally started by Cole Medin but has quickly grown into a massive community effort to build the BEST open source AI coding assistant!

Table of Contents

Join the community

Join the bolt.diy community here, in the oTTomator Think Tank!

Requested Additions

Features

Setup

If you're new to installing software from GitHub, don't worry! If you encounter any issues, feel free to submit an "issue" using the provided links or improve this documentation by forking the repository, editing the instructions, and submitting a pull request. The following instructions will help you get the stable branch up and running on your local machine in no time.

Let's get you up and running with the stable version of bolt.diy!

Quick Download

Download Latest Release ← Click here to go to the latest release version!

Prerequisites

Before you begin, you'll need to install the following software:

Install Node.js

Node.js is required to run the application.

  1. Visit the Node.js Download Page
  2. Download the "LTS" (Long Term Support) version for your operating system
  3. Run the installer, accepting the default settings
  4. Verify Node.js is properly installed:
    • For Windows Users:
      1. Press Windows + R
      2. Type "sysdm.cpl" and press Enter
      3. Go to "Advanced" tab → "Environment Variables"
      4. Check if Node.js appears in the "Path" variable
    • For Mac/Linux Users:
      1. Open Terminal
      2. Type this command:
        echo $PATH
        
      3. Look for /usr/local/bin in the output
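
Alternatively, you can verify the installation on any platform by asking Node.js and npm to print their versions from a terminal:

    # Each command should print a version number (for example, v20.11.0)
    node --version
    npm --version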

Running the Application

You have two options for running bolt.diy: directly on your machine or using Docker.

Option 1: Direct Installation (Recommended for Beginners)

  1. Install Package Manager (pnpm):

    npm install -g pnpm
    
  2. Install Project Dependencies:

    pnpm install
    
  3. Start the Application:

    pnpm run dev
    

    Important Note: If you're using Google Chrome, you'll need Chrome Canary for local development. Download it here
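
For convenience, the three steps above can also be chained into a single command; this runs exactly the same commands in sequence:

    # Install pnpm globally, install dependencies, then start the dev server
    npm install -g pnpm && pnpm install && pnpm run dev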

Option 2: Using Docker

This option requires some familiarity with Docker but provides a more isolated environment.

Additional Prerequisite

  1. Install Docker: Download Docker

Steps:

  1. Build the Docker Image:

    # Using npm script:
    npm run dockerbuild
    
    # OR using direct Docker command:
    docker build . --target bolt-ai-development
    
  2. Run the Container:

    docker-compose --profile development up
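
The commands below are standard Docker Compose operations rather than anything specific to this project; they can be handy while the container is running:

    # Follow the container logs
    docker-compose --profile development logs -f

    # Stop and remove the containers when you're done
    docker-compose --profile development down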
    

Configuring API Keys and Providers

Adding Your API Keys

Setting up your API keys in bolt.diy is straightforward:

  1. Open the home page (main interface)
  2. Select your desired provider from the dropdown menu
  3. Click the pencil (edit) icon
  4. Enter your API key in the secure input field
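
In addition to the in-app configuration above, some setups read provider keys from environment variables instead. The snippet below is only a hypothetical sketch: the variable names are assumptions, so check the repository (for example, an example env file such as .env.example, if present) for the names your version actually uses.

    # Hypothetical .env.local sketch; variable names are assumptions, verify against the repository
    OPENAI_API_KEY=your-openai-key
    ANTHROPIC_API_KEY=your-anthropic-key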

(Screenshot: API Key Configuration Interface)

Configuring Custom Base URLs

For providers that support custom base URLs (such as Ollama or LM Studio), follow these steps:

  1. Click the settings icon in the sidebar to open the settings menu (screenshot: Settings Button Location)

  2. Navigate to the "Providers" tab

  3. Search for your provider using the search bar

  4. Enter your custom base URL in the designated field (screenshot: Provider Base URL Configuration)

Note: Custom base URLs are particularly useful when running local instances of AI models or using custom API endpoints.
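
For reference, local providers typically listen on well-known default ports: Ollama's API is usually available at http://localhost:11434 and LM Studio's local server at http://localhost:1234. Verify these in each tool's own settings if you have changed the defaults.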

Supported Providers
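
The providers currently supported are those listed in the introduction above: OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, and Groq. The app can also be extended to any other model supported by the Vercel AI SDK.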

Setup Using Git (For Developers only)

This method is recommended for developers who want to contribute to the project or keep up with the latest changes.

Prerequisites

  1. Install Git: Download Git

Initial Setup

  1. Clone the Repository:

    # Using HTTPS
    git clone https://github.com/stackblitz-labs/bolt.diy.git
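
    # OR using SSH (assumes you have SSH keys configured for your GitHub account)
    git clone git@github.com:stackblitz-labs/bolt.diy.git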
    
  2. Navigate to Project Directory:

    cd bolt.diy
    
  3. Switch to the Main Branch:

    git checkout main
    
  4. Install Dependencies:

    pnpm install
    
  5. Start the Development Server:

    pnpm run dev
    

Staying Updated

To get the latest changes from the repository:

  1. Save Your Local Changes (if any):

    git stash
    
  2. Pull Latest Updates:

    git pull origin main
    
  3. Update Dependencies:

    pnpm install
    
  4. Restore Your Local Changes (if any):

    git stash pop
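
If you have no uncommitted local changes, the stash steps are unnecessary and the update reduces to:

    # Pull the latest changes and refresh dependencies
    git pull origin main && pnpm install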
    

Troubleshooting Git Setup

If you encounter issues:

  1. Clean Installation:

    # Remove node modules and lock files
    rm -rf node_modules pnpm-lock.yaml
    
    # Clear pnpm cache
    pnpm store prune
    
    # Reinstall dependencies
    pnpm install
    
  2. Reset Local Changes:

    # Discard all local changes
    git reset --hard origin/main
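
Before resetting, it can help to confirm what state your checkout is in; these commands are read-only and change nothing:

    # Show the current branch and any uncommitted changes
    git status

    # Confirm which remote repository you are tracking
    git remote -v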
    

Remember to always commit your local changes or stash them before pulling updates to avoid conflicts.


Available Scripts
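
The project's npm scripts are defined in package.json. If you are unsure which scripts are available in the version you have checked out, pnpm can list them for you:

    # List every script defined in package.json
    pnpm run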


Contributing

We welcome contributions! Check out our Contributing Guide to get started.


Roadmap

Explore upcoming features and priorities on our Roadmap.


FAQ

For answers to common questions, issues, and to see a list of recommended models, visit our FAQ Page.