
About
Is open-source video generation finally getting practical? Today, Hunter and Riley break down Mirage AI's release of Alice-T2V-14B-MoE, a new text-to-video model launched with open weights on Hugging Face. Alice produces five-second video clips at 480p or 720p, giving creators hands-on control that closed platforms don't offer. The hosts dig into why this model matters compared to closed options like Runway, Luma, and Kling. They explain the Mixture of Experts approach, why open weights do not mean an effortless experience, and who should actually run Alice (hint: teams and power users, not casual TikTokers). They also discuss how open video models signal a shift from "magic tricks" to true creative infrastructure you can slot into customized pipelines.

You'll find practical advice for working with these new tools: how to build your workflow, why the Apache 2.0 license matters, and common pitfalls with documentation and provenance. The episode is packed with real-world tips, like generating concept shots for pitch decks, running style experiments, and understanding that five seconds of video can be both useful and messy. The hosts also get candid about the risks of increasingly convincing fake footage and offer simple tips for defending against deepfake scams.

If you're curious about the future of content creation, workflow automation, and the growing open-source AI toolkit, this episode is for you. It's not just hype: it's about turning "available" into "actually adopted."