Why Game Studios Are Training Their Own Generative AI Models
May 29, 2024
As ChatGPT surpassed 100 million monthly users and AI image generation models like Midjourney, Leonardo.AI, DALL·E, Google’s Gemini, and Stable Diffusion amassed large followings, generative AI enthusiasts have heralded these tools as “revolutionary” for game development.

However, according to GDC’s 2024 State of the Game Industry report, only 20% of game developers at AAA and AA studios are actively incorporating generative AI tools into their workflows.

This discrepancy highlights a significant gap between the hype surrounding generative AI and its practical application in professional game development, particularly at the enterprise studio level. While general-purpose AI models have become incredibly popular among consumers, they fall short for enterprise adoption, at least outside of limited experimentation.

Many AAA game studios have started to train their own AI models in order to accelerate game development processes while protecting against some of the inherent risks of general-purpose models, like lack of intellectual property security and control.

In a May investors call, Electronic Arts' CEO said that "more than 50% of our development processes will be positively impacted by the advances in generative AI" and went on to talk about their 40 years of proprietary data that could be used to train their own generative AI models.

Why do general-purpose AI models fall short, and how could training AI models on proprietary content build more innovative, creative, and fun games?

Note: In this article, we will focus on AI models for image generation, though many of these insights could similarly be applied to other types of AI models, like LLMs for text.

Where AI Models Fall Short for Enterprise

When we talk to artists who have tried to use a general-purpose model, like Midjourney, for professional work, we mostly hear, “I feel like I’m fighting with it to get it to do what I need.” They may be able to use the tools for ideation or rapid brainstorming, but when it comes to actual creation, general-purpose models fall short.  

When we talk to information security teams responsible for managing software infrastructure, or to legal teams, the concerns grow. A large majority (84%) of game developers indicated they were somewhat or very concerned about the ethics of using generative AI, according to GDC’s report.

The reasons that general-purpose models aren’t being adopted fall into four main buckets:

  • Lack of control
  • Lack of security
  • Lack of governance
  • Lack of ownership

Lack of Control and Customization

While general-purpose models are versatile, they are limited in their ability to provide reliable tools to professional creatives. These models often lack the fine-tuned understanding required to generate content that meets the needs of a studio's intellectual property (IP) and workflow.

General-purpose models may struggle with adhering to the visual style of a studio, especially when that style is unique and distinctive. These models may succeed in producing content that is visually appealing, but they lack the nuanced aesthetics that define a game or studio’s brand. This limits the practical application of AI tools to ideation and brainstorming, rather than directly supporting work in the production pipeline.

Lack of Security & IP Protection

Using any AI tool for production poses security risks, particularly if that tool leverages customer data to improve its model. Studios are reluctant to feed proprietary content into external systems due to fears of IP theft or having their IP used in a way that could advantage competitors.

Further, AI presents new information security risks around unreleased IP. Trained model weights are surprisingly small files that can be easily copied to a thumb drive or cloud service, which makes the models themselves sensitive assets. When working with an AI technology partner, it is important to use services that offer enterprise-grade security and are built to provide assurance over model security.

Lack of Governance & Traceability

Game development requires meticulous tracking of asset creation processes. General-purpose AI tools often lack the metadata tracking and governance features studios need to trace how an asset was generated. As AI tools become more advanced, artists will incorporate them into their creative process in different ways. Some artists may always start from a sketch they’ve created, or use AI only to refine an image they produced in another application, like Photoshop.

Being able to trace every step of an asset’s creation will be crucial, especially as the copyright laws around AI-generated assets become clearer. Studios will need the tools necessary to demonstrate where and how human expression helped author every asset they create.

Lack of Ownership

Most general-purpose models are proprietary models developed by companies or organizations where the source code and training methodologies are not publicly accessible. These companies use customer data to constantly improve their own model, which they then grant access to via a subscription service.

Studios may fully own the outputs that these models create, and they may even be able to have a version of the proprietary base model fine-tuned to their organization for use within the service. However, with closed models and proprietary generation technology, businesses never gain perpetual access to the underlying model or transparency into how the generation process works.

This means any investment in training and adapting these models does not build a lasting asset for the studio, or any durable path toward operationalizing AI. Some indie studios may accept the trade-off of licensing access to their AI models, but enterprise studios are beginning to see these models as part of their intellectual property.

The Move to Training AI Models

Recognizing these limitations, game studios are increasingly opting to train their own generative AI models. Roblox has been vocal about their use of generative AI in all parts of their game experience, including training models on their proprietary content. 

During the 2024 Game Developers Conference, Roblox revealed two new tools to accelerate avatar setup and object texturing, making it easier for users and studio partners to create experiences inside of Roblox. These tools are powered by AI models that Roblox trained in-house using their own proprietary content.

“Gen AI is already streamlining creation in unprecedented ways, and the biggest opportunity we see for AI is in our art workflows,” said Guy de Beer, Chief Operating Officer at award-winning Roblox studio Toya Play. “We create dozens of custom characters each month, and Avatar Auto Setup has the potential to increase productivity dramatically. For the kind of branded experiences we build — with hundreds of bespoke avatars — AI changes the creation economy, providing boundless opportunities for creators to bring their creative visions to life.” 

Hi-Rez Studios has also spoken about taking an artist-centric approach to training AI models on their own proprietary content. In a recent interview, Stew Chisam, CEO of Hi-Rez Studios, shared that “with the ability to securely train models on our assets and give artists direct control over the generation process, we're able to ensure the work we do is centered around the artist's process, and our overall creative direction.”

Electronic Arts’ CEO also recently shared that he believes using their proprietary data to train AI models will give EA a competitive advantage in developing games.

Large game developers are turning to custom AI model training for:

  • More consistency and control in the creation process
  • A higher degree of security for their model and its outputs
  • More observability into how assets get created
  • Full ownership of their model and its underlying weights and training data

Creating Consistent Game Assets and Materials

Custom AI models trained on a studio's proprietary content can generate game assets and materials that are consistent with the game's art style and narrative. This ensures that new characters, environments, and props align with existing visuals, accelerating the timeframe from concept to final render.

AI models can also be trained to produce specific assets necessary in the production process. For example, concept artists can spend hours producing “orthographic” views of character designs after they’ve finished the initial character concept. This is a highly manual and often tedious process that could be accelerated with a custom trained model, giving the artist more time to concept additional characters or go deeper into other aspects of the game. 

In the case of Roblox, they’ve used a custom model to accelerate avatar creation and object texturing for creators and partners in an effort to help create more user-generated content experiences. Other studios are doing the same for their internal creative teams, deploying these custom trained models so that their team can generate IP-appropriate creatures, landscapes, and items that match the game's established visual language.

A Higher Degree of Security for Your Model and Outputs

Custom AI models can offer more security than mainstream AI models because they can be deployed in permissioned or even isolated environments. Large studios typically work with a variety of external vendors, freelancers, and cross-functional teams, and being able to manage role-based access ensures that their AI models remain secure. 
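As an illustration of what role-based access to a privately deployed model might look like, the sketch below gates model actions by role. The roles and permission policy are invented for this example, not drawn from any particular product.

```python
# Hypothetical permission policy for a privately deployed studio model;
# the roles and actions here are invented for illustration.
PERMISSIONS = {
    "internal_artist": {"generate", "train", "download_weights"},
    "external_freelancer": {"generate"},  # can create, not exfiltrate
    "vendor": set(),                      # no direct model access
}

def can(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())

print(can("internal_artist", "download_weights"))    # True
print(can("external_freelancer", "download_weights"))  # False
```

In practice this kind of policy would live in the studio’s identity and access management layer rather than in application code, but the principle is the same: sensitive actions like downloading weights stay restricted to trusted internal roles.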

There are many ways to deploy an AI model to a team, and training your own model typically increases that flexibility. 

More Observability into How Assets Get Created

Game development requires meticulous tracking of asset creation processes, and general-purpose AI tools often lack robust metadata tracking and governance features. For instance, in large-scale game development, it’s essential to document every step of asset creation for legal, creative, and developmental purposes. Without proper tracking, it’s challenging to resolve issues related to asset ownership, originality, or integration into the game.

By using custom-trained AI models, legal teams have more insight into the training data used to create in-game assets, so assets can be traced back to their originating source.

Ownership of the Model and Its Underlying Weights

Depending on the infrastructure used (e.g., openly licensed model infrastructure), studios can fully own their AI models and the underlying training data. This means any investment in training and adapting these models builds a lasting asset for the studio, providing long-term value and independence from external providers.

This ownership allows studios to build valuable, reusable assets that can be monetized and leveraged for future projects without incurring continuous licensing costs. 

Not All Custom-Trained AI Models Are Created Equal

When navigating the landscape of custom-trained AI model providers, studios need to do their due diligence to ensure that the services surrounding the AI model training empower them to control, secure, and own that model. Here are three critical considerations to keep in mind:

A) Model Fidelity and Application Suitability: The first question to ask is whether the vendor's model training product can produce models that generate results at the fidelity your team requires for incorporation into a production workflow. It's crucial that the AI model not only fits your specific use case but also performs at a level that meets your standards for quality and reliability.

B) Ownership and Access: Clarify whether the vendor is offering mere "access" to the model through a licensing agreement or if they are providing the full model files, weights, and training data to your studio. True ownership of the AI model is vital for full integration into your products and services, offering you the flexibility to modify and scale the model as needed without continued dependency on the vendor.

C) Security and Governance Practices: Ensure that the vendor has robust security and governance practices in place to secure your intellectual property (IP). This includes compliance with industry standards such as SOC 2, which is critical for maintaining the confidentiality, integrity, and availability of your data. Additionally, inquire about the measures they take to protect against model extraction attempts and other potential security vulnerabilities.

Many vendors have noted the rising demand for custom-trained AI models and have adjusted their marketing strategies to offer some kind of “custom model”. However, it's crucial to be aware that not all these offerings meet the above criteria. 

Some vendors may provide models that do not generate at the required fidelity for production workflows, or that lack sufficient security and governance measures. Almost no vendor offers full ownership of the custom models that you train on their platform; instead, they provide licensed access through a monthly subscription.

(Disclaimer: Invoke offers a step-by-step AI Model Trainer as part of our Professional Edition, open source model training scripts for single users with hardware to run their model training locally on their computer, and model training consulting for Enterprise customers).

What’s Next?

Adoption of generative AI models and tools is still new, especially at the enterprise level. As more studios begin training and deploying models to their teams, game studios will need to develop the internal capacity to manage the ongoing model training and maintenance process. 

We believe every game studio should be training their own models, have full ownership of the models they train, and see those models quickly driving value for the studio by saving their team time that they can reinvest into the creative process. 

One of the most important things we can do to make great games that players love is to give creatives more time.

More Resources