Your World. Your Rules.

Next-generation gaming platform, coming soon

What is Tales?

Tales is a next-gen gaming platform powered by an AI that generates interactive, immersive experiences from text prompts. You dream it, Tales builds it.

Built on our own foundational Large World Model, Tales harnesses the power of advanced AI to generate limitless gaming experiences on demand.

Our visionary team comprises PhDs from world-renowned institutions like Stanford and the University of British Columbia, alongside veteran engineers from industry giants such as IBM, Microsoft, Netflix, Call of Duty, and Shopify. This diverse blend of academic excellence and real-world expertise drives our mission to redefine the boundaries of gaming.

At Tales, we’re not just creating games; we’re opening doorways to infinite worlds, where every player’s imagination becomes an instantly playable reality. Join us as we embark on this extraordinary journey to transform how we play, explore, and experience digital realms.

What is the vision?

The Vision

A next-gen gaming console powered by AI, where users co-create interactive 3D experiences – essentially building a "Westworld" for entertainment and education.

The social experiment

Can a community, driven by shared passion and rewards, generate the vast amounts of data required to create an incredible gaming console?

The challenge

Training a massive Large World Model requires vast amounts of data, capital, and infrastructure.

The solution

a) The Tales team builds the model and platform so that anyone with resources can contribute to the goal. Think of Tales as the “glue” that brings all kinds of resources together: GPUs, server storage, engineers, and gameplay recordings.

b) Tales subsequently trains the AI on that data, building a Large World Model that is actively maintained and built upon by the very same community that helped fund and build it.

c) The outcome is a next-generation video game console for all kinds of interactive, immersive experiences.

What are the use cases?

Interactive Experiences

Watch anything.

Create and enjoy any unique setting or experience.

Interactive Games

Play any game.

Create, refine and play any game you can imagine.

Interactive Education

Learn any subject.

Create a learning environment and then grow your knowledge.

Interactive NPCs

Interact with anyone.

Create characters and engage with them in different scenarios.

How does it all work?

The backbone of Tales is Sia, its Large World Model (LWM): a 3D, spatial-intelligence AI that creates new gameplay and interactive experiences from data gathered from our community. Sia is named after the Egyptian god of perception and mind.

You provide Sia with a prompt in the form of text, image, or video. Sia then works with you to understand the genre, setting, characters, and gameplay mechanics you want to experience before walking you through the creation of your very own interactive world. The same applies to immersive experiences, puzzles, characters, and watchables: Sia is the conduit between you and your wildest imagination.

In layman’s terms: where a Large Language Model (LLM) creates and outputs text, a Large World Model (LWM) creates and outputs 3D environments. We’ve named ours Sia.
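As a rough illustration of this flow, the sketch below shows how a text prompt could be turned into a generated world. The client class, method, and field names are hypothetical assumptions made for illustration; they are not a published Tales or Sia API.

```python
# Hypothetical sketch of the prompt-to-world flow described above.
# TalesClient, WorldPrompt, and all field names are illustrative only;
# no public Tales/Sia API is implied.
from dataclasses import dataclass, field


@dataclass
class WorldPrompt:
    text: str                                             # free-form description
    genre: str | None = None                              # e.g. "fantasy", "sci-fi"
    references: list[str] = field(default_factory=list)   # image/video URLs


class TalesClient:
    """Toy client showing how a prompt could become a playable scene."""

    def create_world(self, prompt: WorldPrompt) -> dict:
        # In a real system this would call Sia's inference service and
        # stream back generated geometry, characters, and game rules.
        return {
            "scene_id": "demo-001",
            "genre": prompt.genre or "unspecified",
            "summary": f"World generated from: {prompt.text[:60]}",
        }


if __name__ == "__main__":
    client = TalesClient()
    world = client.create_world(WorldPrompt(
        text="A rainy cyberpunk market where the player trades with NPC vendors",
        genre="sci-fi",
    ))
    print(world)
```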

Key Features

The product centers on a library of user-generated games and experiences.

Tales Arcade

The Arcade is the library for all games created and shared on Tales. YouTube, but for games.

  • Create a game and share it with the community
  • Browse a library of new gaming experiences
  • The Arcade bridges human imagination and the latest in AI game development
  • The Arcade is collaborative, simple to use, and empowers people to transform their dreams into playable worlds
  • The Arcade is where every story begins

Interactive Education

Interactive Education is the library for all learning-based experiences.

  • Create immersive educational experiences, combining visual, verbal, and interactive formats to transform the way we learn
  • Prompt a Mandarin class, attend an astrophysics lecture, or have Da Vinci show you how he painted the Mona Lisa
  • Engage with learning material like never before

Intelligent NPCs

NPCs are non-player characters that you can create and interact with. It’s your childhood imaginary friend brought to life.

  • Generate intelligent NPCs (non-player characters) 
  • Interact with NPCs in real-time
  • Philosophical debates with Socrates or fireside chats with your favorite artist
  • Sessions with a therapist or personal trainer
  • Transform how you think about virtual companions

Tales Watch

Tales Watch is the library for all immersive experiences. Think Netflix, but you’re in the movies.

  • Tales Watch delivers watchable experiences
  • Place yourself right in the middle of historic or fictional scenes
  • History, your favorite stories, and your wildest fantasies at your fingertips
  • Create it, watch it, interact with it.

Roadmap

Tales is pioneering the future of interactive entertainment with our revolutionary next-gen gaming platform.

Phase 1: Foundation and ML Infrastructure
Duration: 2-3 months
1. Set up core infrastructure
  • Establish cloud computing environment (e.g., AWS, Google Cloud)
  • Configure distributed computing systems for ML training
  • Set up data storage and management systems
2. Develop initial ML model architecture
  • Design neural network architecture for 3D scene understanding
  • Implement basic training pipeline
3. Create data ingestion system (see the validation sketch after this phase)
  • Develop APIs for user-uploaded gameplay footage
  • Implement data preprocessing and cleaning algorithms
  • Set up data validation and quality assurance processes
4. Establish version control and CI/CD pipelines
  • Set up Git repositories
  • Implement automated testing and deployment workflows
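A minimal sketch of the footage ingestion and validation step referenced in item 3 above; the accepted formats, size limit, and field names are assumptions made for illustration, not the actual Tales pipeline.

```python
# Hypothetical sketch of gameplay-footage ingestion and validation (Phase 1, item 3).
# Formats, limits, and field names are illustrative assumptions.
import hashlib
from dataclasses import dataclass

ALLOWED_FORMATS = {"mp4", "webm", "mkv"}   # assumed accepted containers
MAX_SIZE_BYTES = 4 * 1024**3               # assumed 4 GB per-upload cap


@dataclass
class Upload:
    filename: str
    payload: bytes
    contributor_id: str


def validate_upload(upload: Upload) -> dict:
    """Basic validation and fingerprinting before footage enters the training set."""
    extension = upload.filename.rsplit(".", 1)[-1].lower()
    if extension not in ALLOWED_FORMATS:
        raise ValueError(f"unsupported format: {extension}")
    if len(upload.payload) > MAX_SIZE_BYTES:
        raise ValueError("file exceeds size limit")

    # A content hash lets duplicate submissions be rejected and credits each
    # contribution back to the community member who supplied it.
    digest = hashlib.sha256(upload.payload).hexdigest()
    return {
        "contributor_id": upload.contributor_id,
        "sha256": digest,
        "size_bytes": len(upload.payload),
        "status": "queued_for_preprocessing",
    }
```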
Phase 2: Data Training and Transparency
Duration: 3-4 months
1. Implement transparent data training system
  • Develop data provenance tracking (see the sketch after this phase)
  • Create user dashboard for monitoring data contributions
  • Implement data anonymization and privacy protection measures
2. Enhance ML model training
  • Fine-tune model architecture based on initial results
  • Implement transfer learning from pre-trained models
  • Develop data augmentation techniques
3. Create initial data annotation tools
  • Develop user interface for tagging and describing uploaded content
  • Implement semi-automated annotation suggestions
4. Establish ML model evaluation metrics
  • Define key performance indicators (KPIs) for model quality
  • Implement automated evaluation pipelines
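A minimal sketch of the data-provenance tracking referenced above, assuming a simple hash-chained, append-only log; the record layout is illustrative only.

```python
# Hypothetical sketch of data-provenance tracking (Phase 2, item 1).
# Each record chains to the previous one so the history of training-data
# contributions can be audited by the community.
import hashlib
import json
import time


class ProvenanceLog:
    """Append-only log linking each data contribution to its contributor."""

    def __init__(self) -> None:
        self.records: list[dict] = []

    def append(self, contributor_id: str, content_sha256: str) -> dict:
        prev_hash = self.records[-1]["record_hash"] if self.records else "0" * 64
        record = {
            "contributor_id": contributor_id,
            "content_sha256": content_sha256,
            "timestamp": time.time(),
            "prev_record_hash": prev_hash,
        }
        record["record_hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self.records.append(record)
        return record
```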
Phase 3: Basic 3D Simulation Engine
Duration: 4-5 months
1. Develop core 3D engine components
  • Implement rendering pipeline (OpenGL or Vulkan)
  • Create basic physics simulation
  • Develop scene graph and object management system (see the sketch after this phase)
2. Integrate ML model with 3D engine
  • Implement inference pipeline for real-time scene generation
  • Develop system for dynamically loading ML-generated content
3. Create basic world-building tools
  • Develop simple terrain generation system
  • Implement basic object placement and manipulation tools
4. Establish asset pipeline
  • Create system for importing and optimizing 3D models
  • Implement texture and material management
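A toy sketch of the scene-graph idea referenced above. Real engines compose full 4x4 transforms and manage far more state; this version composes translations only, to show how parent/child relationships resolve to world positions.

```python
# Hypothetical scene-graph sketch (Phase 3, item 1). Translation-only for brevity.
from dataclasses import dataclass, field


@dataclass
class SceneNode:
    name: str
    local_position: tuple = (0.0, 0.0, 0.0)
    children: list = field(default_factory=list)

    def add_child(self, child: "SceneNode") -> "SceneNode":
        self.children.append(child)
        return child

    def walk(self, parent_position=(0.0, 0.0, 0.0)):
        """Yield (node, world_position) for this node and all descendants."""
        # A child's world position is its local offset applied on top of
        # its parent's world position.
        world = tuple(p + l for p, l in zip(parent_position, self.local_position))
        yield self, world
        for child in self.children:
            yield from child.walk(world)


if __name__ == "__main__":
    root = SceneNode("world")
    house = root.add_child(SceneNode("house", (10.0, 0.0, 5.0)))
    house.add_child(SceneNode("door", (1.0, 0.0, 0.0)))
    for node, position in root.walk():
        print(node.name, position)   # world/house/door with resolved positions
```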
Phase 4: AI-Driven Content Generation
Duration: 5-6 months
1. Enhance ML model for content generation
  • Implement generative adversarial networks (GANs) for 3D asset creation
  • Develop natural language processing (NLP) system for text-to-scene generation (see the sketch after this phase)
2. Create character system
  • Implement character models with skeletal animation
  • Develop basic AI for character behavior and pathfinding
3. Implement spatial awareness and interaction
  • Develop object interaction system
  • Implement collision detection and response
4. Enhance world-building tools
  • Create procedural generation systems for landscapes, vegetation, and structures
  • Implement more advanced object manipulation and scene editing tools
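A toy sketch of the text-to-scene step referenced above. A production system would rely on a trained language model; this version simply maps keywords in a prompt to placeholder assets to show the intended data flow.

```python
# Hypothetical text-to-scene sketch (Phase 4, item 1). The keyword table and
# asset names are illustrative placeholders, not real Tales content.
ASSET_KEYWORDS = {
    "forest": ["pine_tree", "fern", "dirt_path"],
    "castle": ["stone_wall", "tower", "drawbridge"],
    "market": ["stall", "crate", "npc_vendor"],
}


def prompt_to_scene(prompt: str) -> dict:
    """Turn a free-form prompt into a placeholder scene description."""
    words = prompt.lower().split()
    entities = []
    for keyword, assets in ASSET_KEYWORDS.items():
        if keyword in words:
            entities.extend(assets)
    return {"prompt": prompt, "entities": entities or ["empty_terrain"]}


if __name__ == "__main__":
    print(prompt_to_scene("A foggy forest beside a ruined castle"))
```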
Phase 5: Advanced AI and User Experience
Duration: 6-7 months
1. Implement advanced character AI
  • Develop more sophisticated behavior trees and decision-making algorithms (see the sketch after this phase)
  • Implement natural language generation for character dialogue
2. Enhance scene understanding and generation
  • Improve ML model to handle more complex and diverse scenes
  • Implement style transfer techniques for scene aesthetics
3. Develop user experience and interface
  • Create intuitive UI for scene creation and manipulation
  • Implement real-time collaboration features
4. Optimize performance
  • Implement level-of-detail (LOD) systems
  • Optimize rendering and physics simulations for various hardware configurations
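A minimal behavior-tree sketch for the character AI referenced above. The node types and the example tree are illustrative assumptions; a production system would add running states, blackboards, and many more node types.

```python
# Hypothetical behavior-tree sketch (Phase 5, item 1).
from typing import Callable


class Sequence:
    """Runs children in order; fails on the first child that fails."""
    def __init__(self, *children):
        self.children = children

    def tick(self, npc: dict) -> str:
        for child in self.children:
            if child.tick(npc) == "failure":
                return "failure"
        return "success"


class Selector:
    """Runs children in order; succeeds on the first child that succeeds."""
    def __init__(self, *children):
        self.children = children

    def tick(self, npc: dict) -> str:
        for child in self.children:
            if child.tick(npc) == "success":
                return "success"
        return "failure"


class Action:
    """Leaf node wrapping a function that returns 'success' or 'failure'."""
    def __init__(self, fn: Callable[[dict], str]):
        self.fn = fn

    def tick(self, npc: dict) -> str:
        return self.fn(npc)


def set_action(name: str) -> Callable[[dict], str]:
    def do(npc: dict) -> str:
        npc["action"] = name
        return "success"
    return do


# Example: the NPC greets the player when nearby, otherwise wanders.
tree = Selector(
    Sequence(
        Action(lambda npc: "success" if npc.get("player_nearby") else "failure"),
        Action(set_action("greet")),
    ),
    Action(set_action("wander")),
)

npc_state = {"player_nearby": True}
tree.tick(npc_state)
print(npc_state["action"])   # -> greet
```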
Phase 6: Alpha Launch and Iteration
Duration: 3-4 months
1. Implement prompt-based experience generation
  • Develop natural language interface for scene creation
  • Integrate ML models for interpreting and executing user prompts
2. Create first playable experiences
  • Develop sample games and interactive scenarios
  • Implement basic gameplay mechanics and systems
3. Establish feedback and iteration loop
  • Develop analytics and telemetry systems
  • Create user feedback channels and bug reporting tools
4. Optimize and polish
  • Perform extensive testing and bug fixing
  • Optimize performance across various devices and platforms
Phase 7: Beta and Ecosystem Development
Duration: 4-5 months
1. Implement mod support and SDK
  • Develop plugin architecture for user-created content
  • Create documentation and examples for third-party developers
2. Enhance multiplayer capabilities
  • Implement networking layer for real-time multiplayer experiences
  • Develop server infrastructure for hosting user-created worlds
3. Improve content creation tools
  • Develop more advanced AI-assisted design tools
  • Implement version control and collaboration features for user-created content
4. Establish marketplace and sharing features
  • Develop system for users to share and monetize their creations
  • Implement content curation and recommendation systems
Phase 8: Launch and Beyond
Duration: Ongoing
1. Official launch of Tales V1 'Westworld'
  • Finalize all systems and features
  • Ensure scalability and stability of infrastructure
2. Continuous improvement and expansion
  • Regular updates and feature additions based on user feedback
  • Ongoing ML model training and refinement
3. Ecosystem growth
  • Foster community of developers and content creators
  • Establish partnerships for content and technology integrations
4. Research and development
  • Explore integration of emerging technologies (e.g., VR/AR, haptics)
  • Investigate advanced AI techniques for more realistic and dynamic world simulation

Team

We’re stealth-funded and have been building in stealth mode since August 2023.

Jason Krupat
Head of Product
  • Co-founder & CPO of the Burn Ghost game; exited in 2022
  • Ex-Head of Games at Yahoo!; funded by DraftKings to the tune of $3.1m
  • BSc English, University of Wisconsin
  • 20 years of experience in mobile/web product management
  • Investments in Ditto AI app
  • Favorite game is Clash Royale
  • Prediction: Character.ai will be the best AI app of 2025 as users begin to realize its use cases
Viktor Uzunov
Head of Community
  • Exited Jefferies LLC in 2020 to start his Web3 journey
  • BSc Political Science and Economics, MA in Marketing Strategy, London School of Economics
  • 8 years in conventional investment banking, 5 years in Web3
  • Investments in Solana & BTC since 2014, still holds all of it
  • Favorite game is the Diablo franchise
  • Prediction: Human productivity will vastly increase, rather than decrease as many people assume
Anonymous
Lead Developer
  • PhD, Stanford University
  • 18 years in game theory, development and applied sciences
  • Mined BTC in 2015 and has held it to this day
  • Favorite games are MMORPGs
  • Prediction: Life and technology converge to become indistinguishable