NVIDIA reveals ACE for Games to give NPCs ChatGPT-like chat features with matching animation

During today's keynote address at Computex in Taiwan, NVIDIA's newly $7 billion richer CEO Jensen Huang showed off a new technology for game developers. Naturally, the new tech leans on AI and cloud servers, which have become the company's big new focus.
The new technology is called NVIDIA ACE (Avatar Cloud Engine) for Games. It will let developers build NPCs that converse with player characters in real time, with unscripted dialogue generated by AI chatbots similar to ChatGPT, Bing Chat, and Bard. The technology also matches the NPCs' facial animations to that unscripted dialogue.
NVIDIA stated:
You can bring life to NPCs through NeMo model alignment techniques. First, employ behavior cloning to enable the base language model to perform role-playing tasks according to instructions. To further align the NPC’s behavior with expectations, in the future, you can apply reinforcement learning from human feedback (RLHF) to receive real-time feedback from designers during the development process.
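NVIDIA has not published code for this workflow, but the "behavior cloning" step it describes is essentially supervised fine-tuning on role-play transcripts. Below is a minimal sketch of that idea using the Hugging Face transformers library as a stand-in; the base model, the sample transcripts, and the ramen-shop-owner character brief are illustrative assumptions, not NVIDIA's actual NeMo pipeline.

```python
# Minimal sketch: "behavior cloning" as supervised fine-tuning on role-play
# transcripts. Model, data, and hyperparameters are illustrative stand-ins.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, Trainer, TrainingArguments

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # stand-in base language model
model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token

# Hypothetical designer-written transcripts pairing a character brief with
# replies the designers consider in-character.
transcripts = [
    "You are Jin, a ramen shop owner.\nPlayer: What's good today?\nJin: The spicy miso. I made the broth myself this morning.",
    "You are Jin, a ramen shop owner.\nPlayer: Heard any rumors?\nJin: Folks say the district's gotten dangerous. Keep your head down.",
]

class TranscriptDataset(torch.utils.data.Dataset):
    def __init__(self, texts):
        self.enc = tokenizer(texts, truncation=True, padding="max_length",
                             max_length=128, return_tensors="pt")

    def __len__(self):
        return self.enc["input_ids"].size(0)

    def __getitem__(self, i):
        ids = self.enc["input_ids"][i]
        mask = self.enc["attention_mask"][i]
        labels = ids.clone()
        labels[mask == 0] = -100  # don't compute loss on padding tokens
        return {"input_ids": ids, "attention_mask": mask, "labels": labels}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="npc-sft", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=TranscriptDataset(transcripts),
)
trainer.train()  # the model learns to imitate the scripted role-play behavior
```

The RLHF stage the quote mentions would come afterward, using real-time designer feedback to further align the fine-tuned NPC's behavior.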
NVIDIA also stated that these AI NPCs are kept in check by NeMo Guardrails, which will hopefully stop them from saying weird or even offensive things to gamers.
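NeMo Guardrails also exists as an open-source toolkit, which gives a rough idea of what wiring an NPC through it could look like. In the sketch below, only the RailsConfig/LLMRails API comes from that library; the config directory and the player prompt are hypothetical.

```python
# Sketch of routing NPC chat through NeMo Guardrails (open-source toolkit).
# The "./npc_guardrails" directory is a hypothetical config folder holding
# the topic rules and canned refusals a developer would author for their game.
from nemoguardrails import LLMRails, RailsConfig

config = RailsConfig.from_path("./npc_guardrails")
rails = LLMRails(config)

# Player input passes through the rails, which can block or redirect
# off-limits topics before or after they reach the language model.
reply = rails.generate(messages=[
    {"role": "user", "content": "Say something offensive about my teammate."}
])
print(reply["content"])  # ideally an in-character refusal, per the rails
```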
The company showed off a brief demo of ACE, which has been posted on YouTube. The demo was created in Unreal Engine 5 with ray tracing enabled, using MetaHuman tech for the NPC character model. NVIDIA also used technology from Convai, a startup building AI characters for games. NVIDIA added:
Convai used NVIDIA Riva for speech-to-text and text-to-speech capabilities, NVIDIA NeMo for the LLM that drives the conversation, and Audio2Face for AI-powered facial animation from voice inputs.
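NVIDIA has not released code for the demo, but the quoted setup implies a straightforward per-turn pipeline: player speech in, animated NPC reply out. The sketch below shows that flow; every helper function is a hypothetical stub standing in for the real Riva, NeMo, or Audio2Face service, not an actual API call.

```python
# Hypothetical per-turn pipeline implied by NVIDIA's description. All four
# helpers are stubs; the real versions would call Riva, a NeMo-backed LLM,
# and Audio2Face respectively.

def riva_speech_to_text(audio: bytes) -> str:
    return "What's good on the menu today?"   # stub for Riva speech-to-text

def nemo_llm_respond(player_text: str) -> str:
    return "The spicy miso ramen, friend."    # stub for the NeMo-driven LLM

def riva_text_to_speech(text: str) -> bytes:
    return b"<synthesized-audio>"             # stub for Riva text-to-speech

def audio2face(audio: bytes) -> dict:
    return {"facial_animation_curves": []}    # stub for Audio2Face output

def npc_turn(player_audio: bytes) -> tuple[bytes, dict]:
    text = riva_speech_to_text(player_audio)       # 1. transcribe the player
    reply_text = nemo_llm_respond(text)            # 2. LLM generates the reply
    reply_audio = riva_text_to_speech(reply_text)  # 3. synthesize the voice
    animation = audio2face(reply_audio)            # 4. animate the face from voice
    return reply_audio, animation

reply_audio, animation = npc_turn(b"<player-mic-audio>")
```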
The AI NPC shown in the demo is definitely not perfect. His delivery sounded stilted and, dare we say, artificial in nature. However, it's more than likely that these speech patterns will become more natural in the months and years ahead.
NVIDIA did not state when ACE for Games will be available to game developers. However, it did mention that its Audio2Face technology, which matches the facial animation to a game character's speech, is being used in two upcoming games: the third-person sci-fi game Fort Solis, and the long-awaited post-apocalyptic FPS sequel S.T.A.L.K.E.R. 2: Heart of Chornobyl.