AI Entertainment

How One Filmmaker Built a Sci-Fi Universe Using AI Tools

Filmmaker Josh Wallace Kerrigan is using AI tools like Midjourney and Runway to create Neural Viz, a popular sci-fi universe in which he writes, directs, and performs every character himself.

By Chloe Sullivan

Chloe Sullivan is a culture and technology reporter for Neurozzio, focusing on how emerging technologies like AI are influencing creative industries, including film, music, and art.


Filmmaker Josh Wallace Kerrigan has developed a complete cinematic universe called Neural Viz, attracting millions of views across social media platforms. By combining traditional screenwriting with a suite of artificial intelligence tools, he is demonstrating a new path for independent content creation that bypasses the conventional Hollywood studio system.

Kerrigan writes, directs, and performs every character in his sci-fi world, using AI for visual generation, voice synthesis, and facial motion capture. The project, which began in 2024, has grown from a single web series into a multi-show universe with its own lore, characters, and evolving narrative, showcasing a sophisticated use of a technology often associated with low-quality content.

Key Takeaways

  • Josh Wallace Kerrigan is the creator behind the AI-generated sci-fi series Neural Viz and its expanded Monoverse.
  • He uses a combination of AI tools, including Midjourney, Runway, and ElevenLabs, for visuals, animation, and voice work.
  • Kerrigan performs all characters himself, using AI facial motion-capture to map his expressions onto alien models.
  • The project stands out for its narrative depth and quality, challenging the perception of AI-generated video as mere "slop."
  • Neural Viz has gained significant traction, leading to interest from major studios and allowing Kerrigan to pursue filmmaking full-time.

A New Approach to Filmmaking

In Los Angeles, Josh Wallace Kerrigan works from his home computer, directing characters that exist only as digital creations. One such character is Tiggy, a glistening alien who stars in his series. The process is not seamless; generating a single shot, like Tiggy turning his head, can require numerous attempts as the AI struggles with consistency and specific instructions.

Kerrigan's workflow begins with an initial image prompt in a tool like Midjourney, for example, "fat blob alien with a tiny mouth and tiny lips." He then turns to other specialized AI software, such as ElevenLabs to generate voices and Runway to animate scenes. This hands-on process involves constant refinement to overcome the technology's limitations.

Despite the advanced tools, the creative process is iterative and often challenging. The AI might misinterpret a prompt, such as adding a frog's face to a character's head instead of giving it frog-like skin. These quirks require patience and a deep understanding of both filmmaking principles and the software's capabilities.

From Hobby to Cinematic Universe

What began as an experiment in 2024 has evolved into a complex world known as the Monoverse. The first series, Unanswered Oddities, was a mockumentary-style show set in a future where aliens called "glurons" speculate about the extinct "hooman" race.

The initial concept expanded rapidly. Kerrigan introduced new shows set within the same universe, including a police procedural and a parody of paranormal investigation programs. These series began to interconnect, developing subplots, character arcs, and a deeper lore about the fall of humanity. This narrative complexity has set Neural Viz apart from typical AI video experiments.

The Rise of Generative AI Video

Generative AI tools for video creation have become widely accessible, leading to a surge in AI-generated content online. While platforms like Google's Veo and OpenAI's Sora can create visually impressive clips from simple text prompts, much of the output lacks narrative coherence or artistic vision. This has led to the term "slop" to describe low-effort, algorithmically generated content. Creators like Kerrigan represent a different approach, using AI as a production tool to execute a specific creative vision.

The Man Behind the AI Mask

Before launching Neural Viz, Josh Wallace Kerrigan spent over a decade navigating the traditional paths of Hollywood. Originally from a small town in Texas, he studied film and moved to Los Angeles in 2012. He worked various industry jobs, from barista to production assistant, and directed a low-budget horror film.

Kerrigan gained extensive experience in nearly every role on a film set but found it difficult to gain lasting traction. The entertainment industry was shifting due to the streaming bubble, and the 2023 writers' and actors' strikes further disrupted career paths. It was during this period that he began exploring 3D modeling and generative AI.

He found that AI could automate the most time-consuming aspects of animation. Instead of fighting the technology's weaknesses, he built his creative concept around them. He chose to feature talking alien heads because AI was better at that than complex action sequences. He adopted a grainy, retro-TV aesthetic to mask visual imperfections.

"Everything I do within these tools is a skill set that's been built up over a decade plus. I do not believe there's a lot of people that could do this specific thing," Kerrigan said.

Fusing Human Performance with AI

A critical element of Neural Viz's success is Kerrigan's direct involvement in the performances. He writes all the scripts and then acts out every character's part himself. Using Runway's facial motion-capture tool, he records his own facial expressions and voice, which the software then maps onto his alien characters.

This method gives him a level of control over the nuance of a performance that a simple text prompt cannot achieve. It is a modern form of digital puppetry, where the creator's performance is the foundation of the final product. According to Kerrigan, he would consider hiring other actors in the future but currently finds it more efficient to perform all the roles himself.

Turning Glitches into Lore

Kerrigan often incorporates AI errors into the story. When a video generator made a character's skin unusually smooth, he wrote it into the script as the character "metamorphosizing" because he could no longer afford his "morph inhibitors." Another character's speech tic—starting sentences with a long vowel sound—originated as a software glitch that Kerrigan decided to keep as a personality trait.

Navigating Industry Disruption

The unique quality of Neural Viz quickly attracted a following online, with videos earning millions of views. This success also caught the attention of Hollywood executives. Kerrigan reported that he spoke with nearly every major studio and received two job offers.

He turned down an in-house studio role to maintain his independence, instead partnering with a producer to create a new TV pilot unrelated to the Monoverse. With income from his new contract and revenue from YouTube and TikTok, Kerrigan quit his day job in January 2025 to focus on his creative work full-time.

His journey highlights a growing debate within the entertainment industry. While many fear AI will eliminate jobs, others see it as a tool that empowers independent creators. "There is a version of these tools that allows people to become more independent of the system, and I think that’s probably a good thing," Kerrigan stated.

The Future of Creative Work

The rise of generative AI is forcing a reevaluation of creative roles. Fellow AI creators like Zack London, known as Gossip Goblin, and the duo behind TalkBoys Studio have also found success but faced criticism from industry peers concerned about job displacement.

However, Kerrigan's process suggests that the most successful creators will be those who combine artistic vision with technical skill. His work is not about pushing a button; it relies on his decade of experience in writing, cinematography, and directing. The AI serves as a powerful production assistant, not the primary creator.

As AI tools become more sophisticated, they are offering creators more granular control. The release of Runway's Act-Two software, for example, allowed for even more nuanced facial expressions, capturing a character's trembling lip during an emotional scene. This trend suggests that the future of AI in film will favor storytellers who can master the technology to serve their narrative goals, rather than those who simply generate content without intent.