AI offers new tools for game creation, but developers are worried about their jobs

For the most part, AI is exceptionally bad at illustrating hands. They come out with four or six fingers or, worse, just a few wispy ends that fade into the background. AI puts big, 1940s-style Western smiles on people from different cultures. It reshapes and modifies familiar images according to our directions. But depending on the data it's fed, sometimes AI has solutions and sometimes it doesn't.

Video game developers and AI companies want to use these AI tools to streamline and speed up game development. They claim the technology could help address the problem of crunch in video games and automate some of the most tedious parts of game development. But wary developers warn that the technology is evolving so quickly that it could make it even harder to break into an industry that is already notoriously underpaid and challenging to enter.

During a panel I moderated at the Game Developers Conference in March, I asked Microsoft employees who work with artificial intelligence whether AI would take the jobs of quality assurance testers. QA testers at Activision Blizzard, for example, are asked to find bugs and meet quotas or risk being placed on performance improvement plans. If AI tools could be used to find all the bugs in a game, wouldn't that eliminate QA jobs? The Coalition's Kate Rayner told me that Microsoft has no bug quotas and that games have so many bugs that developers usually can't find them all before a title is released to the public.

"When you play a game, once you release it, you might only have a few hundred people involved in making that game," said Rayner, vice president and technical director at The Coalition, the studio responsible for the Gears of War franchise. "When it comes out there, there will be millions of people playing the game. So they're going to find all the bugs, right? So with tools that can simulate that and help us scale, we get more test coverage. That is where the strength really lies."

On March 23rd, Ubisoft announced a new AI tool called Ghostwriter that would help writers generate a single line of dialogue in 10 different ways. "Listen, get the fuck over here," yells a non-playable character in the Ubisoft trailer. (Ubisoft declined an interview for this piece.)

These basic lines of dialogue, called barks, are one way for writers to break into game writing. Depending on whether you're talking to AI evangelists or entry-level game developers, writing barks and hunting for QA bugs are either forms of drudgery or important paths to a steady job in a tough industry. Automating these basic game development tasks could cost people jobs, says Janine Hawkins, a freelance game writer who first tweeted about the tool on March 24th.

"I have no doubt that the writers who are currently working with the tool and tailoring it to their needs will enjoy using it or find it useful," Hawkins told me. "But all it takes is a manager who says, 'Our writers can now produce twice as many barks, so why do we need the same number of writers?' to threaten already scarce writing jobs."

Hawkins said the job threat could come from Ubisoft or other developers using similar tools. “This is already a very devalued segment of game writing, and it’s so easy to imagine devaluation snowballing as AI tools tip the scale even more in favor of volume.”

In China, for example, some freelancers have noticed video game job opportunities drying up, according to an April 11th report from Rest of World.


“AI will bring efficiency, especially around some of the more chronic shortcomings in game development, such as crunch time right before a major deadline,” said Joost van Dreunen, a lecturer in game business at NYU Stern School of Business. “Entry-level jobs have always been risky. AI may exacerbate this circumstance, but it certainly won’t change the precarious nature of these positions. However, we must ask ourselves which organic intelligence is lost in the long term and whether that creates a strategic disadvantage.”

It is true that game development is very difficult and prototyping a game can take a lot of time. And it’s also true that many of these basic roles are repetitive and monotonous.

"Games, and more specifically art for games, are getting more and more expensive and time-consuming," said Konstantina Psoma, founder and CEO of Kaedim, a company that uses machine learning algorithms to convert 2D images into 3D models. "I believe that AI-powered software built to address game developers' pain points can help reduce costs and time while preserving the high fidelity of the graphics."

That's the very real promise of generative AI, already on display in avatar apps. Currently, I can jump into one of these apps and generate an avatar of myself in perfect lighting conditions and my desired pose.

What used to cost $100 to $200 to commission from a human artist, and used to take several days, has turned into a free process that takes seconds, where I can refine and redo the results an infinite number of times, assuming I'm using a service whose servers can handle the strain. I don't have to worry about the artist getting fed up with the number of changes I request, but I do have to worry about the creepy, vacant stares of some of these generated avatars.

"It just opens up a whole can of worms because there is no regulation about AI and how it is used. There is no copyright strike for anything that people have done," said a current game developer, who spoke on condition of anonymity because they were not authorized to speak to the media. "These tools were never made with artists in mind. It was not a tool built for them. It completely bypassed artists."

Last Epoch.
Image: Eleventh Hour Games

Regulators are looking at how to deal with the new emerging technology and have offered some hints at their thinking. In February, Michael Atleson, an attorney with the FTC’s division of advertising practices, warned AI-related companies about false advertising.

As Sam Altman, CEO of OpenAI, put it to New York Magazine: "We are messing around with something we don't fully understand. And we try to do our part to bring it through in a responsible way."

So will AI steal game development jobs? The answer, Microsoft employees told me at the panel, depends on whether proper controls are put in place.

Daniel Kluttz, a director of responsible AI at Microsoft, said on the panel that it was important to bring people in to “really, really stress test those systems and try to identify some of these emerging behaviors that might pleasantly surprise you. They may not pleasantly surprise you. But you don’t know what you don’t know. And it is so important that these divergent views play a role there.”

Before we get too ahead of ourselves, it’s important to note that AI still gets things wrong.

For example, I asked ChatGPT for examples of the language model being used to write non-playable character lines. It told me that in 2021, the game developer Eleventh Hour Games used the technology to write dialogue for its game Last Epoch. I then fact-checked this claim with the studio. Eleventh Hour Games told me in an email that it did not use AI to generate NPC dialogue in Last Epoch and was curious how ChatGPT could have come to that conclusion.

The bottom line is that humans are still in charge – for now.
