Jam & Tea Studios is the latest gaming startup implementing generative AI to transform the way players interact with non-playable characters (NPCs) in video games.
Traditionally, video game NPCs follow predetermined scripts, which can feel repetitive, unrealistic, and boring, and which limit the range of experiences available to players. With generative AI involved, however, players can engage in casual conversation and interact with NPCs however they want to (within reason).
Founded by gaming veterans from Riot Games, Wizards of the Coast, and Magic: The Gathering, the company announced on Friday its first game that will leverage generative AI tools to help with gameplay mechanics, content generation, dialogue, and item generation.
Jam & Tea’s debut game, Retail Mage, is a roleplaying game that allows players to take on the role of a wizard working as a salesperson at a magical furniture store. The main goal of the game is to earn 5-star reviews by helping customers. But it’s really up to the players to decide if they actually want to work or cause chaos. With AI NPCs as customers and human players being able to say and do almost whatever they want, the possible outcomes should vary widely.
In Retail Mage, players are approached by customers who each have their own requests. Instead of selecting from preset phrases, players type into a text box how they'd like to respond. A player can ask the AI to "say something charming," for example, and it will offer four different dialogue options.
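Jam & Tea hasn't described its implementation, but the interaction above maps onto a familiar pattern: send the player's free-text direction plus the scene context to a large language model and ask for a fixed number of candidate lines. Below is a minimal sketch of that pattern in Python, using the OpenAI SDK purely for illustration; the model name, prompt wording, and function are assumptions rather than Jam & Tea's actual code.

```python
# Hypothetical sketch: ask an LLM for a fixed number of dialogue options
# matching a player-supplied direction such as "say something charming".
# Model choice and prompt format are assumptions, not Jam & Tea's stack.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def dialogue_options(scene_context: str, player_direction: str, n: int = 4) -> list[str]:
    prompt = (
        "You are writing lines for a player character in a magical furniture shop.\n"
        f"Scene context: {scene_context}\n"
        f"The player wants to: {player_direction}\n"
        f"Write {n} short, distinct lines the player could say, one per line."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    lines = response.choices[0].message.content.strip().splitlines()
    # Strip any list numbering the model adds, then keep the first n lines.
    return [line.lstrip("0123456789.- ") for line in lines][:n]

# Example: dialogue_options("Noreen wants an antelope plush", "say something charming")
```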
Jam & Tea is among several companies competing in the AI-powered NPC space, alongside Artificial Agency, Inworld, and Nvidia. Ubisoft’s AI-powered “Ghostwriter” tool writes NPC dialogue for some of its games.
The new game also comes at a time when there's concern among creatives about the potential challenges posed by the prevalence of generative AI. Last month, SAG-AFTRA, the union representing voice actors and other talent, initiated a strike against major game publishers over AI concerns.
However, Jam & Tea claims it’s taking a balanced approach to the inclusion of AI, and wants to protect artists, writers, and other creatives working in game design.
“Our philosophy is that we believe creatives are going to be only more essential as we move forward in using this technology and in bringing new experiences to players,” co-founder and chief creative officer M. Yichao, formerly a narrative designer on Guild Wars 2, League of Legends, and other titles, told TechCrunch.
“AI will generate all this dialogue, and you can talk to characters endlessly… but it’s going to take the creative eye and lens to really add meaning to that and to craft that into an experience that matters into something with impact, depth, and emotion that carries through stories. That’s going to become more important than ever,” Yichao added.
He explained that creatives are heavily involved throughout the development process, including when it comes to crafting NPCs, giving them motivation, interests, and backstory, as well as providing example lines to help the AI mimic the tone and generate lines in real-time.
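That workflow, in which a writer-authored character sheet (motivation, interests, backstory, example lines) is drawn on by the AI at runtime, can be pictured as a profile rendered into a system prompt. The sketch below is purely illustrative; the field names and prompt format are assumptions, not Jam & Tea's data model.

```python
# Hypothetical sketch of turning a writer-authored character sheet into a
# system prompt so an LLM stays in character. Field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class NPCProfile:
    name: str
    motivation: str
    interests: list[str]
    backstory: str
    example_lines: list[str] = field(default_factory=list)  # written by the narrative team

    def to_system_prompt(self) -> str:
        examples = "\n".join(f'- "{line}"' for line in self.example_lines)
        return (
            f"You are {self.name}, a customer in a magical furniture store.\n"
            f"Motivation: {self.motivation}\n"
            f"Interests: {', '.join(self.interests)}\n"
            f"Backstory: {self.backstory}\n"
            f"Match the tone of these example lines:\n{examples}\n"
            "Never mention items or places that don't exist in the store."
        )

# Illustrative character based on the demo described later in this article.
noreen = NPCProfile(
    name="Noreen",
    motivation="Find the perfect gift for her niece",
    interests=["plush animals", "bargains"],
    backstory="A retired potion brewer who dislikes pushy salespeople.",
    example_lines=["Ooh, is that antelope-shaped? My niece would adore it."],
)
print(noreen.to_system_prompt())
```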
Limitations of AI NPCs
Despite its advantages, generative AI in NPCs has its limitations. One major concern is unpredictability: an NPC's behavior can become excessively erratic, resulting in a frustrating experience for the player. AI can also hallucinate answers, so there's a possibility that the NPC could say something that's wrong or doesn't exist in the game's world.
Continuously improving the AI engine will help mitigate unpredictable NPCs, Yichao believes. Players can also rate the characters' responses, which provides data to help improve the characters' behavior. Plus, Jam & Tea says it has put guardrails in place to prevent inappropriate conversations.
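Neither mitigation is detailed publicly, but both fit simple patterns: a content check applied before an NPC line reaches the player, and a rating log that can feed later tuning. The snippet below sketches the idea under those assumptions; the blocklist, file format, and function names are illustrative, not Jam & Tea's system.

```python
# Hypothetical sketch of the two loops described above: a guardrail check
# before an NPC line is shown, and a player-rating log for later tuning.
import json
import time

BLOCKED_TOPICS = ("real-world violence", "slurs")  # placeholder policy list

def passes_guardrails(npc_line: str) -> bool:
    # A real system would likely use a moderation model; substring matching
    # stands in here purely for illustration.
    lowered = npc_line.lower()
    return not any(topic in lowered for topic in BLOCKED_TOPICS)

def record_rating(npc_id: str, line: str, rating: int, path: str = "ratings.jsonl") -> None:
    # Ratings collected this way could later be used to filter or tune responses.
    with open(path, "a") as f:
        f.write(json.dumps({"npc": npc_id, "line": line, "rating": rating, "ts": time.time()}) + "\n")
```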
Players are still encouraged to be creative, which allows inventive and spontaneous interactions to occur. For example, instead of helping a customer, players can choose other activities, like playing hide and seek, a real scenario that occurred during playtesting.
“Our lead engineer was playtesting one night and went up to the NPCs and just said, ‘I’m bored.’ And the NPC responded by saying, ‘Well, why don’t we play a game? Let’s play hide and seek.’ And so the other NPCs heard and said, ‘Oh, we’re playing too,’” shared co-founder and CTO Aaron Farr. The NPCs proceeded to follow the rules of the game, with one seeker walking throughout the store to find all the hiders.
“None of that was programmed; all of that was emergent behavior. That is part of the delight of when we have what a player wants to do combined with its experience to modify the experience in real-time,” added Farr, a former engineering leader at Riot Games and Singularity 6.
The company has been experimenting with various large language models (LLMs) throughout the testing phase, including models from OpenAI and Mistral AI, Google's Gemma, Meta's Llama, and other open models. It hasn't yet decided which LLM it will ultimately use in the final version of the game, but it is fine-tuning the models to give better responses that stay more "in character."
Generating items out of thin air
Jam & Tea's AI engine goes beyond dialogue generation. Players can also interact with any object in the game and state their intentions with that object, such as picking it up or dismantling it for parts. They can even create items from scratch. The game interprets that intention and determines whether the attempt succeeds.
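The studio hasn't published how intent resolution works, but the loop it describes (parse a stated intention, classify the action, then decide whether it succeeds) can be sketched roughly as below. The action taxonomy, keyword matching, and random skill check are stand-ins for whatever the real engine does with an LLM.

```python
# Hypothetical sketch: interpret a free-text intention toward an object and
# decide whether it succeeds. The action names and scoring are assumptions.
import random
from dataclasses import dataclass

@dataclass
class ActionResult:
    action: str       # e.g. "pick_up", "dismantle", "craft"
    target: str
    success: bool
    narration: str

KNOWN_ACTIONS = {"pick up": "pick_up", "dismantle": "dismantle", "make": "craft", "craft": "craft"}

def resolve_intention(text: str, difficulty: float = 0.3) -> ActionResult:
    # In the real game an LLM would parse the intent; keyword matching stands in here.
    action = next((v for k, v in KNOWN_ACTIONS.items() if k in text.lower()), "improvise")
    target = text.split()[-1] if text else "nothing"
    success = random.random() > difficulty  # a simple skill-check stand-in
    narration = f"You {'succeed' if success else 'fail'} as you try to {text.lower()}."
    return ActionResult(action, target, success, narration)

print(resolve_intention("Dismantle the crate and make an antelope plush"))
```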
In a demo shown to TechCrunch, Yichao interacted with an NPC named Noreen, who asked for an antelope-shaped plush. He then typed a command into an action box and retrieved a pillow resembling an antelope from a crate. The game recognized his action as successful and added the item to his inventory.
Since the item didn't previously exist in the game, players won't physically see an antelope-shaped plush appear; the item simply shows up in the player's inventory as a default image of a pillow. Similarly, when the player performs an action like sitting in a chair, a notification appears on screen indicating that the action was performed.
“One of the things that’s really exciting about this technology is it allows for open-ended creative expression. Like, I can take a piece of meat and say, what if I put it in the bowl and I make a delicious fish stew? We might not have a fish stew [image], but one of the things that I’m working with our artists on is coming up with a creative ability to represent that item in a way that’s satisfying in the world and allows the player’s imagination to fill in some of those blanks, and gives players maximum creative freedom to make things that are unexpected,” Yichao said.
AI technology won’t be used for 2D or 3D asset generation. Real artists will create the images.
Retail Mage is a relatively basic game compared to others. At launch, the company promises to provide a more advanced product than the test version we saw during the demo.
Jam & Tea says the game is primarily intended to demonstrate the technology as the company continues to experiment. Beyond Retail Mage, it is also developing another game, internally referred to as "Project Emily," which will showcase its broader ambitions with more environments and a more sophisticated storyline.
The startup’s scrappy team of eight has a lot of work ahead to reach the level of bigger gaming companies. However, taking action now while there is momentum allows the company to adapt and grow as AI models advance.
Jam & Tea raised $3.15 million in seed funding from London Venture Partners with participation from Sisu Game Ventures and 1Up Ventures. It plans to raise another round later this year.
As for the business model, Jam & Tea will charge $15 to buy the game and offer extra game packs that players can purchase separately. It’ll launch on PCs initially, but the company aims to enable cross-platform functionality within the next few years.
Retail Mage is slated to be released to the public later this fall.