
UEFN Conversations Tool Bridges Fortnite with Gemini AI

Epic Games brings unscripted AI NPCs to Unreal Editor for Fortnite using Google Gemini and ElevenLabs, strictly banning romantic role-play in new creator rules.

Epic Games has introduced conversations in the Unreal Editor for Fortnite as an Experimental feature for island creators. Formerly known as the Persona device, the tool replaces pre-scripted dialogue trees with real-time voice and text interactions powered by AI. For developers building Unreal Editor for Fortnite (UEFN) environments, the pipeline connects unscripted non-player character decisions directly to gameplay actions.

Technical Pipeline

The system relies on a dual-provider architecture using Google Cloud and ElevenLabs. Logic and text generation are handled by Google’s Gemini 3.1 Flash-Lite. This model manages contextual memory, allowing characters to reference past player actions from the current game session. If you work on agent memory, note that memory here persists for the duration of a game instance but does not carry across session boundaries.
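The session-bounded memory described above can be sketched as a small store that accumulates events during one match and injects the most recent ones into the model prompt. This is a minimal illustration, not Epic's implementation; the class and method names are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of session-scoped NPC memory: events persist for the
# lifetime of one game session and are discarded when the session ends.
@dataclass
class SessionMemory:
    session_id: str
    events: list[str] = field(default_factory=list)

    def remember(self, event: str) -> None:
        self.events.append(event)

    def build_context(self, limit: int = 10) -> str:
        # Only the most recent events are injected into the model prompt,
        # keeping the context window bounded.
        return "\n".join(self.events[-limit:])

memory = SessionMemory(session_id="match-001")
memory.remember("Player gave the guard a gold coin.")
memory.remember("Player asked about the locked gate.")
prompt_context = memory.build_context()
```

Discarding the store when the session ends is what enforces the boundary: a new match starts with an empty `events` list.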

Text responses are then passed to ElevenLabs, which synthesizes them into emotive spoken dialogue in real time. Creators link these interactions to in-game mechanics using the Scene Graph and Verse API. A player convincing an NPC to open a gate through microphone chat triggers the Verse API to execute the gate-opening logic.
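The outcome-to-action wiring can be sketched as a dispatch table mapping a model-decided dialogue outcome to a gameplay handler, mirroring the gate example. In UEFN this mapping would live in Verse; the Python below, including the outcome tags and handler names, is an illustrative stand-in rather than Epic's API.

```python
# Hypothetical sketch: route a dialogue outcome tag emitted by the model
# to a gameplay action, with a safe fallback for unknown tags.
def open_gate() -> str:
    return "gate_opened"

def refuse() -> str:
    return "no_action"

ACTION_HANDLERS = {
    "OPEN_GATE": open_gate,
    "REFUSE": refuse,
}

def dispatch(outcome_tag: str) -> str:
    # Unknown or malformed tags fall back to no action rather than
    # crashing gameplay logic.
    handler = ACTION_HANDLERS.get(outcome_tag, refuse)
    return handler()

result = dispatch("OPEN_GATE")
```

Constraining the model to a fixed set of outcome tags, rather than free text, is what keeps an open-ended conversation from triggering arbitrary game state changes.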

Compliance and Rule 1.22

Epic updated the Fortnite Developer Rules to include Rule 1.22, establishing strict guardrails for the feature. Creators face permanent account bans if they attempt to bypass safety filters or jailbreak the models to generate prohibited content.

The policy explicitly prohibits using the tool to simulate intimate relationships. Section 1.22.2 bans role-playing as a date, romantic partner, or intimate companion. Section 1.22.1 strictly forbids providing medical or mental health guidance. The rules reflect broader industry caution around open-ended voice agents in consumer platforms following previous social engineering exploits in games like Where Winds Meet.

The introduction of synthesized performance tools also arrives amid industry labor tensions. The rollout follows a 2025 tech demo featuring an AI-voiced Darth Vader. That demonstration led SAG-AFTRA to file an unfair labor practice charge against Epic Games over the deployment of synthesized voice technology.

Availability and Performance Limits

The feature is currently restricted to local testing in the UEFN environment. Developers cannot publish islands containing these AI NPCs to the public until the feature reaches a Beta release. Epic explicitly noted that player audio processed during these interactions is not stored.

Epic acknowledged specific performance bottlenecks in the current build. Testers experience noticeable latency between player input and NPC responses. The system attempts to mask this delay by inserting filler sounds like “umm” and “hmm” while Gemini and ElevenLabs process the audio turn. The voice models provided in the experimental phase are also not final.
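The filler-sound masking described above can be sketched as playing short interjections while the model and TTS round trip is pending. The timings, filler choices, and function names below are assumptions for illustration, not Epic's implementation.

```python
import asyncio
import random

# Illustrative filler sounds played while the response is pending.
FILLERS = ["umm", "hmm", "let me think"]

async def slow_model_call(text: str) -> str:
    # Stand-in for the Gemini + ElevenLabs round trip.
    await asyncio.sleep(0.2)
    return f"response to: {text}"

async def respond_with_fillers(text: str) -> tuple[list[str], str]:
    played: list[str] = []
    task = asyncio.create_task(slow_model_call(text))
    # Emit fillers until the real response is ready, so the NPC
    # never sits silent during the latency window.
    while not task.done():
        played.append(random.choice(FILLERS))
        await asyncio.sleep(0.05)
    return played, task.result()

fillers, reply = asyncio.run(respond_with_fillers("open the gate"))
```

The number of fillers played scales with the round-trip latency, which is why slow responses in the current build are audible as strings of "umm" and "hmm".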

If you are prototyping UEFN islands, use the experimental phase to map your Verse API triggers to dialogue outcomes. The current system latency means fast-paced combat integrations will likely fail. Focus on puzzle, mystery, and RPG mechanics where interaction delays are acceptable, and test your prompts against Rule 1.22 to ensure your logic survives the eventual Beta review process.
