For the most part, AI is exceptionally bad at illustrating hands. They come out six-fingered or four-fingered or, even worse, just some wispy ends that fade into the background. AI has been programming large Western 1940s-era smiles onto people of various cultures. It’s been reshaping images we know and refitting them according to prompts. Depending on the data that it’s fed, though, sometimes AI has solutions, and sometimes it doesn’t.
Video game developers and AI companies want to use these AI tools to streamline and speed up game development. They claim the technology could help solve the problem of video game crunch and automate some of the most tedious parts of the job. But at the same time, wary developers warn that it is advancing at a rate that could make it even harder to break into an industry that is already notoriously underpaid and difficult to enter.
At a panel I moderated at the Game Developers Conference in March, I grilled Microsoft employees who work with artificial intelligence on whether AI would take the jobs of quality assurance testers. At Activision Blizzard, for instance, QA workers placed on performance improvement plans are asked to find bugs and meet a quota. If AI tools can be used to find all the bugs in a game, wouldn’t that take away QA jobs? The Coalition’s Kate Rayner told me that Microsoft doesn’t have bug quotas and that games have so many millions of bugs that developers usually can’t find them all before a title is released to the public.
“If you’re playing a game, once you ship it, you may have had only a few hundred people involved in the creation of that game,” said Rayner, vice president and technical director at The Coalition, the studio in charge of the Gears of War franchise. “When it goes out there, there are millions of people playing the game. So they’re going to find all the bugs, right? So having tools that can simulate that and help us amplify, we get more test coverage. It’s really where the power is.”
On March 23rd, Ubisoft announced a new AI tool called Ghostwriter, which it said would help writers iterate on one line of dialogue 10 different ways. “Listen, get the fuck over here,” calls one non-playable character in the Ubisoft trailer. (Ubisoft declined an interview for this piece.)
These throwaway lines of dialogue, called barks, are one way writers break into game writing. Depending on whether you ask AI evangelists or entry-level game developers, barks and QA bug hunting are either drudgery or vital pathways to steady employment in a tough industry. Automating these basic tasks could cost people jobs, said Janine Hawkins, a freelance games writer who first tweeted about the tool on March 24th.
“I have no doubt that the writers currently working with the tool and tuning it to their needs enjoy using it or find it helpful,” Hawkins told me. “But all it takes is an executive saying ‘Our writers can do twice as many barks now, so why do we need the same number of writers?’ for it to threaten scarce writing jobs.”
Hawkins said the job threat could come from Ubisoft or from other developers who use similar tools. “This is already a very devalued segment of games writing, and it’s so easy to imagine that devaluation snowballing as AI tools tip the scales even more in favor of volume.”
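To make the idea of bark variation concrete, here is a deliberately simple sketch of the concept. This is not Ubisoft’s Ghostwriter, which, per the company’s announcement, uses AI to propose rewrites; this toy merely recombines hand-written fragments. But it illustrates why "ten variants of one line" is exactly the kind of high-volume output such tools target.

```python
import random

# Toy illustration of bark variation (NOT Ubisoft's Ghostwriter).
# Real tools use language models; this just recombines fragments
# a writer might supply, to show the shape of the task.

OPENERS = ["Hey", "Listen", "Over here", "You there"]
ACTIONS = ["get over here", "move it", "watch your back", "take cover"]

def bark_variants(n: int, seed: int = 0) -> list[str]:
    """Return n distinct throwaway NPC lines ("barks")."""
    rng = random.Random(seed)
    combos = [f"{opener}, {action}!" for opener in OPENERS for action in ACTIONS]
    rng.shuffle(combos)
    return combos[:n]

if __name__ == "__main__":
    for line in bark_variants(10):
        print(line)
```

Even this crude generator can emit ten usable variants in an instant, which is precisely what makes the economics Hawkins describes so easy for an executive to act on.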
In China, for instance, some freelancers have noted the dearth of video game job opportunities, according to a Rest of World report from April 11th.
“AI will provide efficiencies especially around some of the more chronic shortcomings in game development like crunch time right before a major deadline,” said Joost van Dreunen, a lecturer on the business of games at the NYU Stern School of Business. “Entry-level jobs have always been high risk. AI may exacerbate this circumstance but certainly will not change the precarious nature of these positions. We do have to wonder, however, what organic intelligence will be lost in the long run and whether that presents a strategic disadvantage.”
It’s true that game development is very difficult and that prototyping a game can take a lot of time. And it’s also true that a lot of these basic roles are repetitive and monotonous.
“Games, and more specifically, art for games, are becoming more and more expensive and time-consuming,” said Konstantina Psoma, founder and CEO of Kaedim, a company that uses machine learning algorithms to turn 2D images into 3D models. “I believe that AI-powered software that is developed to help the pain points of game developers can help bring down costs and time while maintaining the high-fidelity of graphics.”
That’s the very real promise of generative AI, and it can already be witnessed in some of these apps. Right now, I can hop into one of them and generate an avatar of myself in perfect lighting conditions and whatever pose I want.
What used to cost me $100 to $200 and several days to commission from a human artist has become a free process that takes seconds, one I can refine and redo an infinite number of times, assuming the service’s servers can hold up to the strain. I don’t have to worry about an artist becoming fed up with the number of changes I’m requesting, but I do have to worry about the creepy, vacant stares of some of the avatars being created.
“It just opens up a whole can of worms because there was no regulation on AI and how it’s used. There’s no copyright strike on anything that people have done,” said a current game developer, speaking on the condition of anonymity as they were not authorized to speak to media. “They were never made with artists in mind. It was not a bespoke tool. It bypassed artists completely.”
Regulators are starting to grapple with the emerging technology and have given some hints as to their thinking. In February, Michael Atleson, an attorney in the FTC’s division of advertising practices, warned AI-related businesses against false advertising.
As OpenAI CEO Sam Altman put it in New York Magazine: “We are messing around with something we don’t fully understand. And we are trying to do our part in contributing to the responsible path through it.”
So will AI steal game development jobs? The answer, Microsoft employees told me on the panel, depends on whether the proper controls are put in place.
Daniel Kluttz, a director of responsible AI at Microsoft, said on the panel that it was important to bring people in to “really, really stress test those systems and try to identify some of these emergent behaviors that may surprise you pleasantly. They may not surprise you so pleasantly. But you don’t know what you don’t know. And it is so important for those diverse views to come into play there.”
Before we get too ahead of ourselves, it’s important to note that AI is still getting things wrong.
For instance, I asked ChatGPT for examples of its language model being used to write non-playable character lines. It told me that, in 2021, the game developer Eleventh Hour Games used the technology to write dialogue in its game Last Epoch. I then fact-checked this claim with the game studio. Eleventh Hour Games told me in an email that it did not use AI to generate NPC dialogue in Last Epoch and was curious how ChatGPT could have come to that conclusion.
The bottom line is that humans are still in charge — for now.