Spirit AI’s technology could bring Westworld-style interactions to video games sooner than you think

Jem Alexander, Friday, July 20th, 2018 4:43 pm

Artificial intelligence is an important aspect of game development, and as the science improves, so too do the interactive experiences we’re able to create. Natural language analysis and generation, which would allow in-game characters to understand conversation and respond in kind, has long eluded AI technology. Spirit AI, a company focused on making strides in this area for the benefit of the games industry, offers two pieces of middleware that help developers integrate these capabilities into their games.

These include meta systems that can help moderate a community within the game world by analysing not just the words players use, but also the intent behind them and the context in which they’re spoken.

“One of the things that we do at Spirit AI is we develop a tool called Ally which is all about using AI to be able to, in a really nuanced way, understand language and context,” explained Mitu Khandaker, Creative Partnerships Director and member of the Spirit AI founding team. “One of the things we’re really good at as a company, which a lot of our tech is based on, is using AI to really understand interactions and the context behind interactions.

“With our Character Engine we’re using that to understand what players are saying to characters but with Ally we’re using that to understand what players are saying to each other. When Spirit AI started and we were faced with being able to develop tech which really understood context and really understood language, immediately for me I said ‘Well yeah, it’s going to be super cool being able to build tools around characters but also as a woman who exists online.’ This was just a year following the worst of GamerGate. [And I] immediately thought ‘It makes sense to be able to develop tools which can help platforms understand what is harassment and what isn’t harassment’. Because it’s more than just language. You can’t always just rely on things like keywords. It’s context, it’s intention, it’s asking ‘Is someone actually being hurt by the things they’re saying?’

“I like to frame it around trying to understand consent because that’s basically what it is. Even I engage in smacktalk when I’m playing games online and that’s fine because that’s a context where there has been consent established between you and the people you’re playing with because they’re your friends. But when it’s a total stranger it’s a completely different context and there’s been no consent established and the same language can be very different in those two scenarios.”
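To make that judgement call concrete, here is a rough, purely illustrative sketch of context-aware filtering in Python. Nothing in it reflects Ally’s actual API: the Message fields and the keyword list are invented, and the point is simply that the same words can pass or get flagged depending on the relationship between the players.

    # Hypothetical sketch of context-aware moderation; not Spirit AI's actual Ally API.
    from dataclasses import dataclass

    TRASH_TALK_TERMS = {"wrecked", "noob", "trash"}  # toy keyword list, purely illustrative

    @dataclass
    class Message:
        sender: str
        recipient: str
        text: str
        relationship: str            # e.g. "friends" or "strangers" (invented field)
        recipient_has_muted_sender: bool = False

    def assess(message: Message) -> str:
        """Judge a message by context, not by keywords alone."""
        contains_trash_talk = any(term in message.text.lower() for term in TRASH_TALK_TERMS)
        if not contains_trash_talk:
            return "ok"
        # The same words between consenting friends read as banter...
        if message.relationship == "friends" and not message.recipient_has_muted_sender:
            return "ok"
        # ...but between strangers, with no consent established, they get flagged.
        return "flag_for_review"

    print(assess(Message("A", "B", "get wrecked, noob", "friends")))    # ok
    print(assess(Message("A", "C", "get wrecked, noob", "strangers")))  # flag_for_review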

Ally uses the same technology as Spirit AI’s Character Engine to make these judgements and it all feeds into Spirit AI’s mission statement of making games more attractive to a diverse audience.

“This was just a year following the worst of GamerGate. [And I] immediately thought ‘It makes sense to be able to develop tools which can help platforms understand what is harassment and what isn’t harassment’.”

“We’re developing both of these different streams at once because it’s based around that core of AI really being able to analyse text and language and context,” Khandaker said. “But ultimately the mission is to help games and other forms of entertainment become more accessible and compelling in various ways. By making more interesting characters we’re making games compelling in that way but we’re also able to allow more people to play games because they’re not being driven off by toxic communities.”

While Ally is the community manager’s tool of choice, Character Engine is geared towards writers. That may seem counterintuitive at first if your assumption, like ours, is that the technology is intended to eventually replace game scriptwriters with procedurally generated dialogue.

“These tools are not about auto-generating stories,” Khandaker clarified. “It’s not about replacing writers. These are tools for writers. The idea is that instead of writing one particular script with lots of different branches, or writing a million different barks with tons of variations, you’re actually writing in a different way. As a writer, you would use it to define a story space that a character exists in, and the engine allows the character to improvise within that space that you as a creative person have defined.”
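As a way of picturing the difference between a branching script and a “story space”, here is a deliberately simple sketch. Every name in it is invented rather than taken from Character Engine: the writer declares what a character knows, wants and won’t discuss, and a small runtime improvises replies inside those limits, with keyword matching standing in for real language understanding.

    # Hypothetical sketch of "story space" authoring; not Character Engine's real format.
    from dataclasses import dataclass, field

    @dataclass
    class StorySpace:
        facts: dict = field(default_factory=dict)        # what the character knows
        goals: list = field(default_factory=list)        # what the character wants from the scene
        boundaries: list = field(default_factory=list)   # topics the character will deflect

    @dataclass
    class Character:
        name: str
        space: StorySpace

        def respond(self, player_utterance: str) -> str:
            """Improvise a line within the authored space (toy keyword matching, not NLP)."""
            text = player_utterance.lower()
            for topic in self.space.boundaries:
                if topic in text:
                    return f"{self.name} changes the subject."
            for key, fact in self.space.facts.items():
                if key in text:
                    return f"{self.name}: {fact}"
            # Nothing matched, so fall back to pursuing a scene goal.
            return f"{self.name}: {self.space.goals[0]}" if self.space.goals else f"{self.name} shrugs."

    bartender = Character("Ada", StorySpace(
        facts={"sheriff": "The sheriff hasn't been seen since Tuesday."},
        goals=["So, what brings you to town, stranger?"],
        boundaries=["basement"],
    ))
    print(bartender.respond("Where is the sheriff?"))
    print(bartender.respond("What's in the basement?"))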

Source: Mitu Khandaker

Putting the words “AI” and “improvise” so close together immediately conjures images of the TV show Westworld. Its futuristic theme park setting is full of robots designed to naturally alter their behaviour according to the actions of the humans around them to fit the story they’ve been programmed to tell. It’s inherently gamey and even features narrative designers as characters, whose job it is to imprint these stories on the robot NPCs. Khandaker sees this as a great analogy for the Character Engine and its place within game development.

“I really appreciate that Westworld exists out there just as a piece of media,” she said. “Because I think Westworld is actually very good at using the language of understanding this tech and understanding how this stuff could work. I often find the UI in Westworld really interesting because I find myself thinking ‘Oh, it’s showing you how his AI is thinking right now’. It’s a really good touchstone.”

But for Khandaker the intention is never to convince the player that an AI is human.

“This is something that’s really important to me,” she noted. “I’m all about trying to develop compelling characters but it’s really not about trying to trick players into thinking they’re interacting with real people. Part of the fun of the interaction with an AI character can come from the fact that you know it’s not real. It’s fun to think ‘This is a character. I’m going to try and find the edges of it. I’m going to try and break it’. And when you can’t or when it still responds in these really delightful ways that’s a really nice feeling. And that can become interesting in itself.

“So I really don’t think it’s about trying to replace people or replicate people. I’m very [against] the idea of the Turing Test because that’s based upon the whole premise of ‘Can you trick someone into thinking an AI is real?’. I don’t think that’s what we should be aiming for. I think that we should just have a class of things that we understand in the world as being AI.”

This sort of technology could be a solution to one of the major problems in virtual reality at the moment: characters that appear as little more than talking mannequins, eerily staring through you while they reel off their pre-recorded dialogue.

“I think storytelling in VR is interesting because you’re confined to this particular space and you’re bounded physically,” Khandaker said. “So I think having interesting, compelling character interactions within that space fulfills this fantasy that we can have of having these realistic interactions. Because in VR everything has that physical fidelity and so you expect interactions you’re having with a character to also have that level of fidelity. Suddenly you feel like you should be able to talk to a character and if they don’t respond then it feels really jarring. I think these tools definitely help with that kind of thing.”

Source: Baobab Studios

On the other end of the spectrum is something that’s dear to Khandaker’s heart: mobile games that simulate text message conversations. They’ve been growing in popularity over the years, with the likes of Mr Robot, A Normal Lost Phone and Simulacra, but very few take advantage of natural language recognition.

“One example I like to talk about – and I’m actually making a game in this sort of vein as well – is the idea of replicating the kinds of conversations we have on phones,” Khandaker explained. “We talk to our friends with our phone; that’s a mental model that everyone has. You text people. I would love to see a whole class of new games about using natural language and texting with characters and stories unfolding via text. And we’ve got some examples of that, right? We’ve got things like Bury Me My Love or Lifeline. It’s on your device and sometimes the characters will nudge you. You’re involved with it. You’re not just sitting down to play for a session. You’re playing as you go about your day.

“Some of that has been constrained so far by having to do lots of authoring for it or even using menu options for text input. You choose what you want to say. I think there is room for natural language games in that space where you just type whatever you want.”
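A bare-bones version of that idea might look like the loop below, where the player types whatever they like and a character texts back. The keyword matching is only a stand-in for real natural language understanding, and the character name and replies are made up for illustration.

    # Toy sketch of a free-text messaging game loop; the keyword matching below is a
    # placeholder for whatever natural language understanding a real game would use.
    REPLIES = [
        (("where", "bus"), "Still stuck at the station. It's freezing here."),
        (("safe", "okay", "ok"), "I'm okay. Tired. I'll message you when we move."),
        (("miss", "love"), "Miss you too. Keep your phone on, alright?"),
    ]

    def character_reply(player_text: str) -> str:
        text = player_text.lower()
        for keywords, reply in REPLIES:
            if any(word in text for word in keywords):
                return reply
        return "Sorry, the signal here is terrible. Say that again?"

    while True:
        typed = input("You: ")
        if typed.strip().lower() in {"quit", "exit"}:
            break
        print("Sam:", character_reply(typed))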

Something that hasn’t been created by Spirit AI, but has a similar vibe, is an app called Replika. It’s an AI chatbot with various applications ranging from therapy to journaling to simply alleviating loneliness.

Source: Google Play Store

“It’s not a game per se, but you can have this nice playful relationship with it,” Khandaker continued. “I think it was originally intended as a therapy bot. You download the app and create a virtual friend and you can talk to it via natural language. You text with it. And every now and again it will come up as a pop up notification as if it’s a text. ‘How are you feeling?’. And the idea is that you can just type to it, because like I said it’s got these therapeutic applications. You can tell this thing, which you know is an AI, things that you wouldn’t necessarily be comfortable telling anyone else. But that’s sort of something that I think starts to point to the way forward in these types of relationships that we can have with more specifically playful game characters.”

The dream for Spirit AI is to have Ally and Character Engine working together, and that’s very much the company’s goal. Developers can get involved and try both out for themselves by heading to the Spirit AI website.

“Get in touch!” Khandaker urged. “Whether you’re interested in Character Engine or Ally or both. Obviously there are different use cases for each but there are also ways that these things can come together. This doesn’t exist yet in Ally, but one thing that we’ve talked about a lot is… Let’s say you have Ally running in your online game and it detects an instance where somebody is uncomfortable or upset by something that’s been said to them by another player. You could have a Character Engine-powered NPC come along and intervene. And actually step in and try and defuse the situation. Or send a message, whether it’s calming down the person who’s upset or trying to re-educate the person doing the harassment. There’s lots of possibilities there. It’s a really exciting time for using AI.”
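Since Khandaker is clear that this hand-off doesn’t exist yet, the following is nothing more than a thought experiment in code: a hypothetical moderation event triggering a hypothetical NPC intervention, with every class and function name invented for the sake of the sketch.

    # Purely hypothetical sketch of the Ally-to-Character-Engine hand-off Khandaker
    # describes; she notes it doesn't exist yet, and none of these names are real APIs.
    from dataclasses import dataclass

    @dataclass
    class HarassmentEvent:
        target_player: str
        offending_player: str

    class ModerationBus:
        """Tiny event bus standing in for the moderation layer's detections."""
        def __init__(self):
            self._handlers = []

        def subscribe(self, handler):
            self._handlers.append(handler)

        def report(self, event):
            for handler in self._handlers:
                handler(event)

    def npc_intervention(event):
        """Stand-in for a Character Engine-driven NPC stepping in to defuse things."""
        print(f"[NPC to {event.target_player}] You alright? Want to squad up with me instead?")
        print(f"[NPC to {event.offending_player}] Ease off. That kind of talk isn't welcome here.")

    bus = ModerationBus()
    bus.subscribe(npc_intervention)
    bus.report(HarassmentEvent("player_17", "player_42"))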
