LA’s Tech Week has brought out the AI cognoscenti, starting with a packed house for an AI film competition at Culver Studios.

Last night, Toonstar, a digital-first animation studio, Loyola Marymount University, and Variety VIP+ hosted an LA Tech Week event at Toonstar’s Arts District studio. Toonstar ran this demo of its AI-assisted animation production process while panels were going on in the other room. From a simple character design and a voice, Toonstar’s system can generate animation, and dubbing too: the voice can be cloned and the lip sync regenerated so the character speaks in any language.
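Toonstar’s pipeline is proprietary, but the “one cloned voice, many languages” idea can be sketched with off-the-shelf voice tech. The snippet below is a minimal, hypothetical example using ElevenLabs’ public text-to-speech REST endpoint (ElevenLabs is the voice vendor on the panel below), reusing a single cloned voice ID for translated dialogue; the voice ID, dialogue lines, and file names are placeholders, and the resulting audio would still have to drive a separate lip-sync step like the one Toonstar handles in its own system.

```python
# Sketch only: reuse one cloned voice across languages via ElevenLabs TTS.
# This is NOT Toonstar's system; it just illustrates the dubbing concept.
import os
import requests

API_KEY = os.environ["ELEVENLABS_API_KEY"]   # assumes you have an ElevenLabs API key
VOICE_ID = "your-cloned-voice-id"            # placeholder: ID of a cloned character voice

# Placeholder translated lines for the same piece of dialogue.
LINES = {
    "en": "Welcome to the Arts District studio!",
    "es": "¡Bienvenidos al estudio del Arts District!",
    "ja": "アーツ・ディストリクトのスタジオへようこそ！",
}

for lang, text in LINES.items():
    # The multilingual model speaks each language in the same cloned voice.
    resp = requests.post(
        f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
        headers={"xi-api-key": API_KEY, "Content-Type": "application/json"},
        json={"text": text, "model_id": "eleven_multilingual_v2"},
        timeout=60,
    )
    resp.raise_for_status()
    with open(f"dialogue_{lang}.mp3", "wb") as f:
        f.write(resp.content)  # MP3 audio returned by the endpoint
```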

There was a fireside chat with Susanne Daniels, former entertainment chief for The CW, Lifetime, MTV, and YouTube Originals, followed by a panel with Eric Iverson, CTO of UTA; Carles Reina, VP of Revenue at ElevenLabs; Jose Garcia Moreno, animation professor at Loyola Marymount University; John Attanasio, CEO and co-founder of Toonstar; and Lisa Sterbakov, partner at Orchard Farm Productions (Mila Kunis’s studio).

Where the Robots Grow is billed as AI’s first feature film, and it’s free on YouTube right now. The 87-minute film, produced by Tom Paton’s AiMation Studios, cost just $8,000 per minute, roughly $700,000 in total, an unprecedentedly low figure for a professionally animated feature.

AWS and FBRC.ai kicked off LA Tech Week with The Culver Cup, a generative AI film competition. Filmmakers were tasked with creating innovative short films using cutting-edge AI tools like Luma AI’s Dream Machine and Playbook3D. Out of 50 entries, the top eight finalists, including Joey Daoud and Krystal Trixx, competed in a bracket-style showdown. The competition spotlighted how AI is transforming storytelling, giving filmmakers the ability to produce high-quality, production-ready films faster than ever before.

Synthsaga is the retro-future YouTube channel of the Acosta brothers, Raul and Eduardo, whose studio in Gijón, Spain, has been building XR experiences since 2014. “We are vintage sci-fi enthusiasts,” said Raul. “Almost a year ago we started a retro cyberpunk channel, which evolved into the form Synthsaga has now, a constant game between utopian-dystopian parallel universes of the past.” Fictional nostalgia is the perfect subject matter for generative AI. But it’s the storytelling, the execution, the music, and the attitude of the filmmakers that really matter, and they’re good enough here to make you forget, for a hot second, that this is AI. It’s just a good story that makes us want to know more about the world and the characters. The Acostas use Midjourney for static images; Luma and Minimax for video; Udio for music; and ElevenLabs and custom recordings for sound.

Pixie Dust was created by Davide Bianca, an LA-based, Emmy Award-winning executive creative director, strategist, and technologist. For nearly two decades he has been at the forefront of innovation, telling stories across multiple mediums while helping Hollywood studios, TV networks, streaming platforms, and gaming companies bring their properties to international audiences and shape pop culture. “I have been creating GenAI films since 2019; however, my work in the field of AI dates back to the early 2000s,” says Bianca. “My background is in computer science, and for the past twenty years I have been working at the intersection of tech innovation and storytelling, with a firm belief that when technology is in service of story it becomes the invisible hand that allows us to transcend, indistinguishable from magic.” Bianca’s company has a very, very impressive show reel. The tools used to create “Pixie Dust” were Midjourney 6.1, Kling 1.5, Magnific, Photoshop Beta with Adobe Firefly, ElevenLabs, and CapCut.
