UnrealGenAISupport
by: prajwalshettydev
UnrealMCP is here! Automatic blueprint and scene generation from AI! An Unreal Engine plugin for LLM/GenAI models and an MCP UE5 server. It supports the Claude Desktop App and Cursor, and includes OpenAI's GPT-4o, Deepseek R1, and Claude 3.7 Sonnet APIs, with Gemini, Grok 3, and audio/realtime APIs planned.
Overview
Purpose: The Unreal Engine Generative AI Support Plugin aims to simplify the integration of various generative AI models into game development, allowing artists and developers to focus on creative tasks without the complexity of LLM/GenAI integration.
Overview: This plugin is designed for Unreal Engine users, providing compatibility with several state-of-the-art generative AI models such as OpenAI's GPT-4o and Claude Sonnet 3.7. It focuses on creating a user-friendly experience for implementing AI features in games and interactive experiences, all while maintaining a long-term support framework.
Key Features:
- Model Integration: Supports multiple LLM/GenAI models for diverse applications in game development, including APIs for OpenAI, Deepseek, and more, targeting real-time interactions and creative generation.
- MCP Support: Integrates a Model Control Protocol (MCP) for enhanced control over scene objects in Unreal Engine, enabling tasks like spawning and manipulating entities directly through prompts or scripts.
- Blueprint Generation: Automates the creation of Blueprints and functions, streamlining the workflow for developers by providing an easy mechanism to generate complex behaviors and interactions in Unreal projects.
Unreal Engine Generative AI Support Plugin
Overview
The Unreal Engine Generative AI Support Plugin enables easy integration of large language models (LLM) and generative AI (GenAI) within Unreal Engine. It abstracts the complexity of integrating AI models so developers can focus on game development.
The plugin currently supports Unreal Engine 5.1 and above, with integration for multiple AI models including OpenAI's GPT, Anthropic Claude, Deepseek, XAI Grok 3, and more. It also features Model Control Protocol (MCP) integration for interactive control of Unreal projects via AI clients such as Claude Desktop App and Cursor IDE.
Warning
This plugin is under rapid development.
- Do not use in production environments.
- Always use version control.
A stable release will be available soon.
Current Supported AI APIs and Progress
| Provider | Feature | Status |
|---|---|---|
| OpenAI | Chat API (various GPT models) | ✅ Completed |
| OpenAI | DALL-E API | 🛠️ In Progress |
| OpenAI | Vision API | 🛠️ In Progress |
| OpenAI | Realtime API | 🛠️ In Progress |
| OpenAI | Structured Outputs | ✅ Completed |
| OpenAI | Whisper API | 🚧 Planned |
| Anthropic Claude | Chat API (claude-3-7-sonnet etc.) | ✅ Completed |
| Anthropic Claude | Vision API | 🚧 Planned |
| XAI (Grok 3) | Chat Completions API | ✅ Completed |
| XAI (Grok 3) | Reasoning API | 🛠️ In Progress |
| XAI (Grok 3) | Image API | 🚧 Planned |
| Google Gemini | Gemini Chat & Imagen API | 🚧 Planned |
| Meta AI | Llama 4 herd | 🚧 Planned |
| Deepseek | Chat API and Reasoning API | ✅ Completed (some features not supported) |
| Baidu | Chat API | 🚧 Planned |
| 3D Generative Models | TripoSR by StabilityAI | 🚧 Planned |
Additional planned or in-progress work includes version control support, lightweight plugin builds, automated testing, and LTS (Long-Term Support) branching.
Statuses:
- ✅ Completed
- 🛠️ In Progress
- 🚧 Planned
- 🤝 Need Contributors
- ❌ Not supported
Setting API Keys
For Editor
Set environment variables for your API keys as follows (`<ORGNAME>` depends on the provider):

Windows:

```
setx PS_<ORGNAME> "your_api_key"
```

Linux/macOS:

```
echo "export PS_<ORGNAME>='your_api_key'" >> ~/.zshrc
source ~/.zshrc
```

Examples of `<ORGNAME>`: `OPENAIAPIKEY`, `DEEPSEEKAPIKEY`, `ANTHROPICAPIKEY`, `METAAPIKEY`, `GOOGLEAPIKEY`.
Restart the Unreal Editor and your IDE after setting the keys.
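Outside Unreal, the same environment variables can be checked from any script or tool before launching the Editor. A minimal Python sketch, assuming the `PS_<ORGNAME>` naming convention above; `require_api_key` is an illustrative helper, not part of the plugin:

```python
import os

def require_api_key(org_name: str) -> str:
    """Return the PS_<ORGNAME> API key, or raise a clear error if it is unset.

    org_name is the suffix after the PS_ prefix, e.g. "OPENAIAPIKEY".
    """
    var_name = f"PS_{org_name}"
    key = os.environ.get(var_name)
    if not key:
        raise RuntimeError(
            f"{var_name} is not set. Set it, then restart the Unreal Editor "
            f"and your IDE so the new environment is picked up."
        )
    return key

if __name__ == "__main__" and "PS_OPENAIAPIKEY" in os.environ:
    # Confirm the OpenAI key is visible to this process.
    print("PS_OPENAIAPIKEY is set")
```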
For Packaged Builds
Storing API keys in packaged builds is a security risk. Follow best practices for API key security by routing requests through a secure backend server.
For test builds, you can set the API key at runtime with `GenSecureKey::SetGenAIApiKeyRuntime` in C++ or Blueprints.
More information: https://help.openai.com/en/articles/5112595-best-practices-for-api-key-safety
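The backend-proxy pattern means the shipped game only talks to your server, which holds the key and signs each upstream request. A minimal hedged sketch of the server-side key-injection step in Python (the helper name is illustrative, not plugin API; the URL is OpenAI's public chat completions endpoint):

```python
import os

# Public OpenAI chat completions endpoint; the backend forwards requests here.
OPENAI_CHAT_URL = "https://api.openai.com/v1/chat/completions"

def build_upstream_headers() -> dict:
    """Build auth headers on the backend; game clients never see the key."""
    key = os.environ.get("PS_OPENAIAPIKEY")
    if not key:
        raise RuntimeError("PS_OPENAIAPIKEY must be set on the backend host")
    return {
        "Authorization": f"Bearer {key}",
        "Content-Type": "application/json",
    }
```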
Setting up MCP (Model Control Protocol)
Note: Skip if not using MCP in your project.
Caution: MCP allows external clients like Claude Desktop App to control your Unreal project. Use only in a controlled environment with backups and version control.
Steps to set up MCP:

- Install a compatible client:
  - Claude Desktop App
  - Cursor IDE
- Configure the MCP JSON config file.
Claude Desktop App config (`claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "unreal-handshake": {
      "command": "python",
      "args": ["<your_project_directory>/Plugins/GenerativeAISupport/Content/Python/mcp_server.py"],
      "env": {
        "UNREAL_HOST": "localhost",
        "UNREAL_PORT": "9877"
      }
    }
  }
}
```
Cursor IDE config (`.cursor/mcp.json`):

```json
{
  "mcpServers": {
    "unreal-handshake": {
      "command": "python",
      "args": ["<your_project_directory>/Plugins/GenerativeAISupport/Content/Python/mcp_server.py"],
      "env": {
        "UNREAL_HOST": "localhost",
        "UNREAL_PORT": "9877"
      }
    }
  }
}
```
- Install the MCP CLI:

  ```
  pip install mcp[cli]
  ```

- Enable the Python Editor Script Plugin in Unreal Engine: Edit → Plugins → enable "Python Editor Script Plugin".
- (Optional) Enable the AutoStart MCP server option in the plugin settings.
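A malformed config file usually makes the client silently ignore the server, so it can be worth sanity-checking the JSON before restarting the client. A small sketch in Python; the checks below only cover the fields used in the configs above, they are not an MCP spec:

```python
import json

# Fields every server entry in the configs above provides.
REQUIRED_SERVER_KEYS = {"command", "args"}

def check_mcp_config(text: str) -> list:
    """Return a list of problems found in an MCP config JSON string."""
    try:
        config = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    servers = config.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        return ["missing or empty 'mcpServers' object"]
    problems = []
    for name, server in servers.items():
        missing = REQUIRED_SERVER_KEYS - set(server)
        if missing:
            problems.append(f"server '{name}' is missing keys: {sorted(missing)}")
        for arg in server.get("args", []):
            # The placeholder from the sample configs must be replaced.
            if "<your_project_directory>" in arg:
                problems.append(
                    f"server '{name}': replace <your_project_directory> with a real path"
                )
    return problems
```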
Adding the Plugin to Your Project
With Git

- Add the plugin repository as a Git submodule:

  ```
  git submodule add https://github.com/prajwalshettydev/UnrealGenAISupport Plugins/GenerativeAISupport
  ```

- Regenerate project files (right-click the `.uproject` file → Generate Visual Studio project files).
- Enable the plugin in the Unreal Editor (Edit → Plugins → enable "GenerativeAISupport").
- For C++ projects, add the module dependency to your `Build.cs`:

  ```csharp
  PrivateDependencyModuleNames.AddRange(new string[] { "GenerativeAISupport" });
  ```
With Perforce
In development.
Unreal Marketplace
Coming soon.
Fetching Latest Plugin Changes
With Git
```
cd Plugins/GenerativeAISupport
git pull origin main
```

Or update all submodules:

```
git submodule update --recursive --remote
```
With Perforce
In development.
Usage
An example Unreal project using this plugin is available:
https://github.com/prajwalshettydev/unreal-llm-api-test-project
OpenAI API
Supports Chat and Structured Outputs in C++ and Blueprints. Tested models include `gpt-4o`, `gpt-4o-mini`, `gpt-4.5`, `o1-mini`, `o1`, and `o3-mini-high`.
Chat - C++ Example
```cpp
void SomeDebugSubsystem::CallGPT(const FString& Prompt,
    const TFunction<void(const FString&, const FString&, bool)>& Callback)
{
    FGenChatSettings ChatSettings;
    ChatSettings.Model = TEXT("gpt-4o-mini");
    ChatSettings.MaxTokens = 500;
    ChatSettings.Messages.Add(FGenChatMessage{ TEXT("system"), Prompt });

    FOnChatCompletionResponse OnComplete = FOnChatCompletionResponse::CreateLambda(
        [Callback](const FString& Response, const FString& ErrorMessage, bool bSuccess)
        {
            Callback(Response, ErrorMessage, bSuccess);
        });

    UGenOAIChat::SendChatRequest(ChatSettings, OnComplete);
}
```
Structured Outputs - C++ Example
Send a custom JSON schema directly (note that every name in a `required` array must also be defined in the corresponding `properties`):

```cpp
FString MySchemaJson = R"({
    "type": "object",
    "properties": {
        "count": { "type": "integer", "description": "The total number of users." },
        "users": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "name": { "type": "string", "description": "The user's name." },
                    "heading_to": { "type": "string", "description": "The user's destination." }
                },
                "required": ["name", "heading_to"]
            }
        }
    },
    "required": ["count", "users"]
})";

UGenAISchemaService::RequestStructuredOutput(
    TEXT("Generate a list of users and their details"),
    MySchemaJson,
    [](const FString& Response, const FString& Error, bool Success) {
        if (Success)
        {
            UE_LOG(LogTemp, Log, TEXT("Structured Output: %s"), *Response);
        }
        else
        {
            UE_LOG(LogTemp, Error, TEXT("Error: %s"), *Error);
        }
    }
);
```
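An easy mistake in hand-written schemas is listing a field under `required` that is never defined in `properties`; such a schema can never be satisfied. A small hedged Python checker for exactly that, independent of Unreal (`undefined_required_fields` is an illustrative helper, not plugin API):

```python
def undefined_required_fields(schema: dict) -> list:
    """Recursively collect names listed in 'required' but missing from 'properties'."""
    problems = []
    if isinstance(schema, dict):
        properties = schema.get("properties", {})
        for name in schema.get("required", []):
            if name not in properties:
                problems.append(name)
        # Descend into nested subschemas (objects and arrays of objects).
        for value in schema.values():
            if isinstance(value, dict):
                problems.extend(undefined_required_fields(value))
            elif isinstance(value, list):
                for item in value:
                    if isinstance(item, dict):
                        problems.extend(undefined_required_fields(item))
    return problems
```

Running it over a parsed schema before sending the request surfaces typos that would otherwise only show up as opaque API errors.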
DeepSeek API
Supports Chat and Reasoning (C++ and Blueprints). Note: System messages are mandatory for the reasoning model.
Warning: Increase the HTTP timeout settings for DeepSeek API calls, which can take a long time to respond, by adding the following to `DefaultEngine.ini`:

```ini
[HTTP]
HttpConnectionTimeout=180
HttpReceiveTimeout=180
```
Chat and Reasoning - C++ Example
```cpp
FGenDSeekChatSettings ReasoningSettings;
ReasoningSettings.Model = EDeepSeekModels::Reasoner; // or EDeepSeekModels::Chat for the Chat API
ReasoningSettings.MaxTokens = 100;
ReasoningSettings.Messages.Add(FGenChatMessage{TEXT("system"), TEXT("You are a helpful assistant.")});
ReasoningSettings.Messages.Add(FGenChatMessage{TEXT("user"), TEXT("9.11 and 9.8, which is greater?")});
ReasoningSettings.bStreamResponse = false;

UGenDSeekChat::SendChatRequest(
    ReasoningSettings,
    FOnDSeekChatCompletionResponse::CreateLambda(
        [this](const FString& Response, const FString& ErrorMessage, bool bSuccess)
        {
            if (!UTHelper::IsContextStillValid(this))
            {
                return;
            }

            UE_LOG(LogTemp, Warning, TEXT("DeepSeek Reasoning Response Received - Success: %d"), bSuccess);
            UE_LOG(LogTemp, Warning, TEXT("Response: %s"), *Response);
            if (!ErrorMessage.IsEmpty())
            {
                UE_LOG(LogTemp, Error, TEXT("Error Message: %s"), *ErrorMessage);
            }
        })
);
```
Anthropic API
Supports Chat for multiple Claude models (C++ and Blueprints).
Chat - C++ Example
```cpp
FGenClaudeChatSettings ChatSettings;
ChatSettings.Model = EClaudeModels::Claude_3_7_Sonnet;
ChatSettings.MaxTokens = 4096;
ChatSettings.Temperature = 0.7f;
ChatSettings.Messages.Add(FGenChatMessage{TEXT("system"), TEXT("You are a helpful assistant.")});
ChatSettings.Messages.Add(FGenChatMessage{TEXT("user"), TEXT("What is the capital of France?")});

UGenClaudeChat::SendChatRequest(
    ChatSettings,
    FOnClaudeChatCompletionResponse::CreateLambda(
        [this](const FString& Response, const FString& ErrorMessage, bool bSuccess)
        {
            if (!UTHelper::IsContextStillValid(this))
            {
                return;
            }

            if (bSuccess)
            {
                UE_LOG(LogTemp, Warning, TEXT("Claude Chat Response: %s"), *Response);
            }
            else
            {
                UE_LOG(LogTemp, Error, TEXT("Claude Chat Error: %s"), *ErrorMessage);
            }
        })
);
```
XAI Grok 3 API
Supports Chat completions (C++ and Blueprints).
Chat - C++ Example
```cpp
FGenXAIChatSettings ChatSettings;
ChatSettings.Model = TEXT("grok-3-latest");
ChatSettings.Messages.Add(FGenXAIMessage{
    TEXT("system"),
    TEXT("You are a helpful AI assistant for a game. Please provide concise responses.")
});
ChatSettings.Messages.Add(FGenXAIMessage{TEXT("user"), TEXT("Create a brief description for a forest level in a fantasy game")});
ChatSettings.MaxTokens = 1000;

UGenXAIChat::SendChatRequest(
    ChatSettings,
    FOnXAIChatCompletionResponse::CreateLambda(
        [this](const FString& Response, const FString& ErrorMessage, bool bSuccess)
        {
            if (!UTHelper::IsContextStillValid(this))
            {
                return;
            }

            UE_LOG(LogTemp, Warning, TEXT("XAI Chat response: %s"), *Response);
            if (!bSuccess)
            {
                UE_LOG(LogTemp, Error, TEXT("XAI Chat error: %s"), *ErrorMessage);
            }
        })
);
```
Model Control Protocol (MCP)
Work in progress. Enables control of Unreal Engine projects by external AI clients.
Usage
- If the AutoStart MCP server option is enabled in the plugin settings, simply open the Unreal Editor and the AI client (Claude Desktop App or Cursor IDE).
- If it is not enabled:
  - Run the MCP server manually:

    ```
    python <your_project_directory>/Plugins/GenerativeAISupport/Content/Python/mcp_server.py
    ```

  - Launch or restart the MCP client.
  - In the Unreal Editor, run the Python script `Plugins/GenerativeAISupport/Content/Python/unreal_socket_server.py` (Tools → Run Python Script).
  - Use the MCP features from the client.
Known Issues
- MCP nodes fail to connect properly; no undo/redo support.
- No streaming support for DeepSeek reasoning model.
- Limited support for complex material generation.
- Occasional issues running LLM-generated Python scripts.
- Limited error handling when compiling blueprints from LLM output.
- Problems spawning certain nodes (getters/setters).
- Context window and docking issues in Unreal Editor.
Contribution Guidelines
Setting up for Development
Install the Unreal Python package to enable scripting and IntelliSense:

```
pip install unreal
```
Further development setup instructions will be added soon.
References
- API key environment variable setup inspired by OpenAI-Api-Unreal.
- MCP server implementation inspired by Blender-MCP.
Quick Links
- OpenAI API Documentation
- Anthropic API Documentation
- XAI API Documentation
- Google Gemini API Documentation
- Meta AI API Documentation
- Deepseek API Documentation
- Model Control Protocol Documentation
- TripoSR Documentation
Support This Project
If you find this plugin helpful, consider sponsoring the developer to support ongoing development.