AI has a new task: helping to keep the bugs out of video games.
At the recent Ubisoft Developer Conference in Montreal, the French gaming company unveiled a new AI assistant for its developers. Dubbed Commit Assistant, the goal of the AI system is to catch bugs before they’re ever committed into code, saving developers time and reducing the number of flaws that make it into a game before release.
“I think like many good ideas, it’s like ‘how come we didn’t think about that before?’,” says Yves Jacquier, who heads up La Forge, Ubisoft’s R&D division in Montreal. His department partners with local universities including McGill and Concordia to collaborate on research intended to advance the field of artificial intelligence as a whole, not just within the industry.
La Forge fed Commit Assistant with roughly ten years’ worth of code from across Ubisoft’s software library, allowing it to learn where mistakes have historically been made, reference any corrections that were applied, and predict when a coder may be about to write a similar bug. “It’s all about comparing the lines of code we’ve created in the past, the bugs that were created in them, and the bugs that were corrected, and finding a way to make links [between them] to provide us with a super-AI for programmers,” explains Jacquier.
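The article doesn't detail how Commit Assistant actually represents or compares code, so the following is only a toy illustration of the general idea – indexing patterns from historically buggy lines and flagging new lines that resemble a known past mistake. The example bug patterns and the similarity threshold are invented for illustration.

```python
# Toy sketch of "learn from past bugs, flag similar new code".
# Not Ubisoft's method: the patterns and threshold below are invented.
from difflib import SequenceMatcher

# Hypothetical training data: lines that historically caused bugs,
# paired with the corrections that were applied.
KNOWN_BUGS = [
    ("if (x = 0)", "if (x == 0)"),                               # assignment vs comparison
    ("for i in range(len(a) + 1):", "for i in range(len(a)):"),  # off-by-one
]

def risky(line, threshold=0.8):
    """Return the closest historical bug (and its fix) if the new
    line closely resembles one, otherwise None."""
    for buggy, fix in KNOWN_BUGS:
        if SequenceMatcher(None, line, buggy).ratio() >= threshold:
            return buggy, fix
    return None
```

A real system would learn from millions of commits rather than a hand-written list, but the shape of the problem – match new code against a history of mistakes and their corrections – is the same.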
Ubisoft hopes that Commit Assistant will cut down on one of the most expensive and labour-intensive aspects of game design. The company says that eliminating bugs during the development phase requires massive teams and can absorb as much as 70 per cent of costs. But offloading the bug-killing process to AI, even partially, isn’t without its own challenges. “You need a tremendous amount of data, but also a tremendous amount of power to crunch the data and all the mathematical methods,” he says. “That [allows] the AI to make that prediction with enough accuracy so that the developer trusts the recommendation.”
It’s still early days – Ubisoft is “only starting to pollinate” Commit Assistant to its development teams and, so far, there’s no usage data on how much it’s impacting game creation. There’s also the human factor to account for: Will developers want an AI poking through their code and effectively saying “you’re doing it wrong”?
“The most important part, in terms of change management, is just to make sure that you take people on board to show them that you’re totally transparent with what you’re doing with AI – what it can do, the way you get the data,” says Jacquier. “The fact that when you show a programmer statistics that say ‘hey, apparently you’re making a bug!’, you want him or her [to realise] that it’s a tool to help and go faster. The way we envisage AI for such systems is really an enabler. If you don’t want to use that, fine, don’t use it. It’s just another tool.”
Ubisoft is working on other AI applications beyond Commit Assistant, though Jacquier emphasises that AI is currently only useful for very specific individual tasks – like getting virtual agents to avoid walking into each other. “AI so far is very good at making decisions on very narrow topics, like AlphaGo,” he says. (AlphaGo is the AI system from DeepMind that beat top Go player Ke Jie at the notoriously complex board game in May 2017.)
“We’ll see in the future more and more examples where this works, but in reality, [something like] a self-driving car, you won’t see in our streets probably until 20 years from now,” he says. “Simply because all those self-driving cars would have to avoid other automated vehicles, pedestrians, old-school cars driven by real humans, and rogue factors like wildlife wandering onto roads.”
But improving AI in gaming could help solve some of these real-world problems. Olivier Delalleau, an AI programmer at Ubisoft, spoke at UDC about autonomous driving in Watch Dogs 2. Using an example of a non-player-controlled car driving around the game’s virtual San Francisco, Delalleau showed how, initially, it would often careen out of control when taking corners. The car was programmed with the goal of reaching a destination or looping the streets, providing visual flavour to the game world.
“[We found] cars never braked, because they didn’t find it was a good solution,” Delalleau says. Because braking rarely paid off in the short term, the system never picked up the skill on its own. “It’s pretty difficult [for an AI] to learn to brake, because it [doesn’t see it as] a good solution most of the time. You need to help it find that it is a good solution.”
Delalleau used reinforcement learning, a form of machine learning, to help the AI learn this skill. Ubisoft provided thousands of examples of braking when driving, and the system learned that it could achieve its goals more efficiently by following the rules of the digital road. The outcome was that the AI cars began taking corners more slowly. This made Watch Dogs 2’s representation of San Francisco more realistic and reduced random crashes.
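The talk doesn't specify which reinforcement-learning algorithm Ubisoft used, but the braking problem can be sketched with textbook tabular Q-learning on an invented toy track: going fast is locally attractive, yet taking the corner at speed crashes the car, so the agent only learns the value of braking through trial and error. All positions, rewards and hyperparameters below are made up for illustration.

```python
# Toy Q-learning sketch of "learning that braking is a good solution".
# The track, rewards and parameters are invented, not Ubisoft's setup.
import random

random.seed(0)

ACTIONS = ["fast", "brake"]  # fast advances 2 cells, brake advances 1
GOAL, CORNER = 4, 2          # positions 0..4, with a sharp corner at 2

def step(pos, action):
    """Toy dynamics: taking the corner at speed crashes the car."""
    if pos == CORNER and action == "fast":
        return None, -10.0           # crash ends the episode
    nxt = pos + (2 if action == "fast" else 1)
    if nxt >= GOAL:
        return None, 10.0            # reached the destination
    return nxt, -1.0                 # small time penalty per move

Q = {(p, a): 0.0 for p in range(GOAL) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.2    # learning rate, discount, exploration

for _ in range(2000):                # episodes of trial and error
    pos = 0
    while pos is not None:
        a = random.choice(ACTIONS) if random.random() < eps else \
            max(ACTIONS, key=lambda x: Q[(pos, x)])
        nxt, r = step(pos, a)
        future = 0.0 if nxt is None else max(Q[(nxt, b)] for b in ACTIONS)
        Q[(pos, a)] += alpha * (r + gamma * future - Q[(pos, a)])
        pos = nxt

best_at_corner = max(ACTIONS, key=lambda a: Q[(CORNER, a)])
```

After training, the learned policy at the corner is to brake, even though braking always costs time in the moment – the same counter-intuitive lesson Delalleau describes the in-game cars having to be helped towards.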
Jacquier believes that similar work could help inform AI systems with real-world applications, such as driverless cars. “In terms of ethics, I think that actually the games industry can help,” Jacquier says. “When you’re wondering how an autonomous car will behave in a situation that involves pedestrians or other cars, it’s like the Trolley Problem. That’s something you wouldn’t be able to test in real life, either for moral reasons or cost in some situations. But maybe you can have some fair answers by simulating that in a video game environment, and see how your AI would behave.”
Other areas in which Ubisoft is using AI include non-player characters (NPCs). In the upcoming Far Cry 5, Ubisoft has implemented a virtualised version of Maslow’s hierarchy of needs – the psychological theory of motivating factors for human behaviour – for its NPCs. This gives in-game agents motivations for their actions, modelled largely on the self-preservation stratum of Maslow’s pyramid.
When a player encounters a non-player character in Far Cry 5, two systems are at work: trust and morale. If you raise your weapon at someone you’ve never met before, they will react with distrust or fear, warning you to lower your gun. If the NPC recognises a lingering threat from you, it will launch an attack of its own, fearing for its own ‘life’. When facing a group of enemies, as you pick off members of a gang, individual foes may realise they’re outclassed, lose their thirst for combat, and attempt to flee as they see their ‘friends’ taken out. Elsewhere, animal companions will respond to player activity, cowering close to the ground unprompted when you crouch into stealth, for instance. It’s the sort of work that adds depth and realism to the world.
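The article describes trust and morale as two interacting values driving NPC behaviour, without giving implementation details. A minimal sketch of that pattern might look like the following; all thresholds and decrements are invented, not Far Cry 5's actual values.

```python
# Illustrative trust/morale state model for an NPC.
# Numbers and rules are invented; this is not Ubisoft's implementation.
from dataclasses import dataclass

@dataclass
class NPC:
    trust: float = 0.5    # 0 = hostile suspicion, 1 = full trust
    morale: float = 1.0   # falls as allies are taken out

    def see_weapon_raised(self):
        self.trust = max(0.0, self.trust - 0.25)

    def see_ally_downed(self):
        self.morale = max(0.0, self.morale - 0.5)

    def decide(self):
        if self.morale < 0.25:
            return "flee"      # outclassed: loses the thirst for combat
        if self.trust < 0.25:
            return "attack"    # lingering threat: fears for its own 'life'
        if self.trust < 0.5:
            return "warn"      # "lower your gun"
        return "idle"
```

Raising a weapon once pushes a stranger from idle to a warning; keeping it raised tips them into attacking; and downing enough of their allies drops morale low enough that fleeing wins out – mirroring the behaviours the article describes.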
In future, tools such as Commit Assistant could spread beyond the confines of Ubisoft. La Forge developed the AI in conjunction with Concordia University and published academic papers on how it works. “If someone else wants to implement this kind of method, it’s totally possible to do that by getting those articles, which are public,” says Jacquier.
The system wouldn’t be of use to all developers, though. It thrives in a ‘big data’ environment, needing a vast corpus of past mistakes to learn from. That restriction, for now, makes it most useful to big-budget studios.
But if Ubisoft’s artificial baby matures as expected, the pay-off for players could be significant – fewer release dates pushed back for bug fixes, and fewer bugs in the finished product. Meanwhile, it could free developers to focus their attention on improving other aspects of the game. Perhaps best of all, if everything goes according to plan, you’ll never even notice.