Blender MCP connects large language models to Blender through the MCP protocol, allowing developers to use natural language for modeling, scene assembly, material editing, and animation production. It solves the traditional Blender automation challenges of script-heavy workflows, complex setup, and difficult integration debugging. Keywords: Blender, MCP, AI 3D modeling.
The technical specification snapshot provides the essentials at a glance
| Parameter | Description |
|---|---|
| Project Name | blender-mcp |
| Core Capabilities | Natural language control of Blender, AI 3D modeling, scene and material automation |
| Primary Language | Python |
| Communication Protocol | MCP (Model Context Protocol) + local TCP |
| Blender Requirement | 3.0+, 4.0+ recommended |
| Python Requirement | 3.10+ |
| Default Port | 9876 |
| Clients | MCP-compatible agents such as Cursor and Claude |
| Installation Entry Point | uvx blender-mcp |
| Core Dependencies | uv / uvx, Blender plugin addon.py |
| Repository | github.com/ahujasid/blender-mcp |
| Star Count | Not provided in the source article; refer to the live GitHub page |
This solution turns Blender automation into a conversational workflow
The core idea behind blender-mcp is not just “install another plugin.” It exposes Blender as a tool endpoint that an AI agent can call. The language model interprets natural language, and the plugin translates that intent into executable Blender operations.
This changes 3D creation from “writing scripts to drive software” into “using instructions to drive workflows.” For designers, the barrier to entry drops significantly. For developers, the automation pipeline becomes more standardized, reusable, and extensible.
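To make that idea concrete, here is a minimal sketch of how an agent's structured intent could be serialized for a local TCP endpoint. The message schema (`type`, `params`, and the `create_object` name) is purely illustrative, an assumption for demonstration, and not the plugin's actual wire format.

```python
import json

# Hypothetical example: the message schema below is an assumption, used only
# to illustrate the "natural language intent -> structured operation" idea.
def build_command(tool: str, params: dict) -> bytes:
    """Serialize a tool invocation as newline-delimited JSON for a local socket."""
    message = {"type": tool, "params": params}
    return (json.dumps(message) + "\n").encode("utf-8")

# The agent interprets "create a sphere with radius 2" and emits something like:
payload = build_command("create_object", {"kind": "sphere", "radius": 2.0})
```

The key point is the division of labor: the language model produces the structured call, and the Blender-side plugin is the only component that touches the Blender API.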
The minimum environment must be ready before deployment
- Blender 3.0 or later, with 4.0+ recommended
- The latest version of Cursor, or another MCP-compatible client
- Python 3.10+
- Network access to GitHub and Python package sources
```shell
python --version
blender --version
```
Use these two commands to confirm that your local Python and Blender versions meet the minimum requirements.
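If you prefer to script the check, a small Python guard (a sketch, not part of blender-mcp) can verify the interpreter version programmatically:

```python
import sys

# Programmatic equivalent of the manual check: confirm the running
# interpreter meets the Python 3.10+ requirement before continuing setup.
def python_meets_requirement(minimum=(3, 10)) -> bool:
    return sys.version_info[:2] >= minimum
```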
Installing uv is the first critical step in the entire setup chain
blender-mcp depends on uvx to start the service, so uv is the entry point for the entire workflow. If uv is not installed correctly, Cursor cannot launch the MCP service, and Blender cannot establish connectivity.
Use the following installation commands on Windows and macOS
```powershell
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"  # Install uv on Windows
```
This command downloads and runs the official uv installation script on Windows.
```shell
brew install uv  # Install uv on macOS with Homebrew
```
This command quickly installs uv on macOS.
Windows also requires the correct environment variable setup
If your terminal reports `uvx is not recognized as an internal or external command`, the installation usually succeeded, but the user-level PATH change has not taken effect. In that case, add `%USERPROFILE%\.local\bin` to your user Path.
```powershell
$localBin = "$env:USERPROFILE\.local\bin" # Locate the uv executable directory
$userPath = [Environment]::GetEnvironmentVariable("Path", "User") # Read the user Path
[Environment]::SetEnvironmentVariable("Path", "$userPath;$localBin", "User") # Append it to the user environment variables
```
This script permanently adds the uv executable directory to the current user’s environment variables.
The verification commands must return version numbers
```shell
uv --version
uvx --version
```
If both commands print version numbers, the package management layer is ready.
Successful Blender plugin installation is confirmed by the BlenderMCP panel in the sidebar
The core plugin file is addon.py from the repository. You can clone the repository directly or download the archive and extract that file manually. If you installed an older version previously, uninstall it first to avoid panel conflicts.
```shell
git clone https://github.com/ahujasid/blender-mcp.git # Clone the plugin source repository
```
This command retrieves the latest plugin files and gives you a local project directory for future updates.
The installation path is straightforward: Edit > Preferences > Add-ons > Install, then select addon.py and enable Interface: Blender MCP. After that, press N in the 3D viewport to open the sidebar and confirm that you can see the BlenderMCP tab.
Plugin configuration defines the communication boundary between Blender and the agent
The default port is 9876, and in most cases you do not need to change it. As long as the Blender plugin port matches the MCP service configuration, the base communication path works. If the port is already in use, update both sides consistently.
Beyond the core connection, the plugin also supports extended integrations such as Poly Haven, Sketchfab, Hyper3D, and Tencent Hunyuan. These are not required for deployment, but they determine whether the AI can automatically fetch assets, generate 3D content from text, or access cloud resources.
Three parameters have the biggest impact on generation quality
- Octree Resolution: Controls voxel precision and model detail
- Inference Steps: Controls iteration count and generation stability
- Guidance Scale: Controls prompt adherence strength
```json
{
  "port": 9876,
  "octree_resolution": 256,
  "inference_steps": 30,
  "guidance_scale": 7.5,
  "generate_texture": true
}
```
This example shows a balanced set of AI generation parameters.
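Before feeding such a configuration to a generation backend, it can be worth sanity-checking the values. The sketch below does that in plain Python; the accepted ranges are assumptions for demonstration, not documented limits of any specific backend.

```python
# Illustrative sanity checks for the generation parameters shown above.
# The accepted ranges here are assumptions, not documented limits.
def validate_params(cfg: dict) -> list[str]:
    problems = []
    if cfg.get("octree_resolution", 0) not in (64, 128, 256, 512):
        problems.append("octree_resolution should be a power-of-two voxel grid size")
    if not 1 <= cfg.get("inference_steps", 0) <= 100:
        problems.append("inference_steps outside a sensible range")
    if not 1.0 <= cfg.get("guidance_scale", 0.0) <= 20.0:
        problems.append("guidance_scale outside a sensible range")
    return problems

cfg = {"octree_resolution": 256, "inference_steps": 30, "guidance_scale": 7.5}
```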
MCP configuration in Cursor determines whether the AI can discover Blender as a tool
A global MCP configuration is recommended. Its advantage is simple: configure once and reuse across projects. If you only want to enable it for a single project, you can also use .cursor/mcp.json as a project-level configuration.
```json
{
  "mcpServers": {
    "blender": {
      "command": "uvx",
      "args": ["blender-mcp"]
    }
  }
}
```
This JSON configuration tells Cursor to launch the MCP service named blender-mcp through uvx.
After saving it, if the MCP page in Cursor shows the blender service, the agent-side configuration is correct. Next, return to Blender and click the connect button in the plugin panel. When the interface displays Running on port 9876, Blender is actively listening.
The minimum connectivity test should validate basic geometry creation before complex scene generation
After deployment, do not start with a long prompt. The safest approach is to create a few basic objects first and verify that position, scale, and spatial relationships are executed correctly.
```text
Create a sphere with a radius of 2 meters in the Blender scene, then create a cube with an edge length of 1 meter. Place the cube directly above the sphere with a gap of 0.5 meters between them.
```
This instruction is specifically designed to verify that the full chain is working: agent understanding, MCP invocation, and Blender execution.
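Independently of the agent, you can verify the expected geometry by hand. With the sphere centered at the origin, the cube's center must sit at the sphere radius plus the gap plus half the cube's edge:

```python
# Expected placement for the connectivity test: a sphere of radius 2 m at the
# origin, a cube with edge 1 m, and a 0.5 m gap between the sphere's top and
# the cube's bottom face.
sphere_radius = 2.0
cube_edge = 1.0
gap = 0.5

# Center of the cube along the vertical axis:
cube_center_z = sphere_radius + gap + cube_edge / 2  # 2.0 + 0.5 + 0.5 = 3.0
```

If the agent places the cube's center anywhere other than z = 3.0, some link in the chain is misinterpreting the instruction.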
Common failures usually fall into three categories: commands, ports, and configuration format
The first category is uvx being unavailable, which is usually caused by environment variables not taking effect. The second category is a red error state in Cursor services, usually caused by invalid JSON, failed package downloads, or startup errors in uvx blender-mcp. The third category is Blender not responding, often due to port mismatches, firewall blocking, or multiple clients competing for the same service.
Follow this troubleshooting order
```shell
uvx blender-mcp # Manually verify that the service can start
netstat -ano | findstr 9876 # Check whether port 9876 is already in use on Windows
```
These two commands help you confirm whether the MCP service can run and whether another process is already occupying port 9876.
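As a cross-platform alternative to netstat, a short Python probe (a sketch, not part of the project) can tell you whether anything is actually listening on the MCP port:

```python
import socket

def is_port_listening(host: str = "127.0.0.1", port: int = 9876) -> bool:
    """Return True if a TCP connection to host:port succeeds within 1 second."""
    try:
        with socket.create_connection((host, port), timeout=1.0):
            return True
    except OSError:
        return False
```

If this returns False while the Blender panel shows Running on port 9876, suspect a firewall rule or a host/port mismatch rather than the plugin itself.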
The images below show the plugin and interface in a real running state
AI Visual Insight: This image shows the actual working interface after Blender has been connected to an external agent. The key focus is typically the plugin panel on the right, the generated scene objects, or the animation control area. It demonstrates that MCP invocation goes beyond text and has already converted natural language into visible objects, timeline actions, or scene layouts inside Blender.
AI Visual Insight: This image is more focused on tool-call results inside Cursor or another conversational client. You can typically see the AI instruction, the MCP tool response, and the scene status feedback. Technically, it proves that the agent has recognized the blender tool, translated the natural language request into structured operations, and returned the execution results.
The highest-value workflow is to break modeling, materials, lighting, and animation into multiple rounds of instructions
Instead of sending one extremely long prompt, a more reliable strategy is to divide the task into five stages: modeling, materials, lighting, animation, and rendering adjustments. This makes failures easier to isolate and better matches the rhythm of agent-based tool calling.
For example, first create a forest and a wooden cabin, then separately request a unified low-poly style, and then add three-point lighting and Cycles settings. A phased interaction model is more stable than a single complex instruction and gives you more controllable results.
FAQ
Q1: Why did I install the Blender plugin, but Cursor still cannot see the blender tool?
A: In most cases, MCP is not configured correctly. First check whether the mcpServers JSON is valid, then confirm that running uvx blender-mcp in a terminal does not produce errors, and finally make sure Cursor has been restarted and has reloaded the configuration.
Q2: Do I have to use port 9876?
A: No. Port 9876 is only the default value. The critical point is not the number itself, but that the Blender plugin side and the MCP service side must match exactly. If the port is already occupied, change both configurations and try again.
Q3: Is this workflow better for designers or developers?
A: It works well for both. Designers can use natural language directly for repetitive modeling and scene building, while developers can extend custom tools on top of MCP and the open-source plugin to build automated 3D production pipelines.
Core summary
This article reconstructs the full deployment path for blender-mcp, including uv installation, Blender plugin loading, Cursor MCP configuration, two-way connectivity testing, and common troubleshooting. It helps developers quickly enable natural-language-driven AI 3D modeling, material editing, scene assembly, and animation production in Blender.