In-game animation: the next level
iAnimate.net
Online Animation School
iAnimate speaks to top industry experts from Ubisoft, Blizzard, and Defiant Studios, who pool their wisdom to take a look at the current state of in-game animation and what the next generation has to offer…
Interview with:
- Richard Arroyo, Ubisoft
- Jeremy Collins, Blizzard
- Kevin Rucker, Blizzard
- Ricky Wood, Defiant Studios
Adam
What has changed the most in terms of in-game animation since you started in the industry?
iAnimate.net
The fidelity of animation is what has changed the most over the past 10 years as animators and game developers. Everyone agrees that not much has changed in terms of animating characters in Maya or whatever 3D software you use. In-game animation and cinematic animation are closing the gap with each other: more in-game animation has fully featured facial lip sync, corrective pose-based blend shape deformers on rigs are more prevalent, and real-time graphics and game engines are nipping at the heels of pre-rendered cinematics and films. Things like model density, joint count, texture resolution, and so on have become a lot better over the years, but we're still thinking about how the player is going to experience the game.
We have to remember that more polygons do not mean a stronger emotional connection with a videogame character. At the end of the day, it's the performance that people remember, not the technical specs. It's the big, unknown question: "What happens when the player... [fill in the blank]...?" We're always thinking about the player and how we can make cool, believable-looking animation for them to experience.
What we have is better hardware and tech. It's not just about one animation per button press anymore. Systems are super complex, making for some really fluid character performances that begin to rival film.

Adam
What's the biggest challenge with in-game animation?
iAnimate.net
One of the biggest challenges with in-game animation is making sure the gameplay needs are met. When we're asked to create any type of animation for characters or creatures, the first few questions should be about how the designers are going to use them and what they need. What's the timing of these attacks? How fast should they move? Will we need an enter-loop-exit or is this a canned, one-off move? Also, testing is essential before anything is final.
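To make the enter-loop-exit idea concrete, here is a minimal, engine-agnostic C++ sketch; the phase names, clip names, and the Play stand-in are hypothetical illustrations, not any studio's actual code. A channelled attack is split into enter, loop, and exit clips: the loop repeats while the button is held and the exit plays once it's released, whereas a canned, one-off move would just be a single clip played start to finish.

```cpp
#include <string>

// Hypothetical channelled attack split into enter, loop, and exit clips,
// as opposed to a single canned, one-off clip.
enum class Phase { Idle, Enter, Loop, Exit };

struct ChanneledAttack {
    Phase phase = Phase::Idle;

    // Called once per frame with the current input state and a flag
    // saying the currently playing clip has finished.
    void Update(bool buttonHeld, bool clipFinished) {
        switch (phase) {
        case Phase::Idle:
            if (buttonHeld) { Play("attack_enter"); phase = Phase::Enter; }
            break;
        case Phase::Enter:
            if (clipFinished) { Play("attack_loop"); phase = Phase::Loop; }
            break;
        case Phase::Loop:
            if (!buttonHeld) { Play("attack_exit"); phase = Phase::Exit; }
            else if (clipFinished) { Play("attack_loop"); }  // keep looping while held
            break;
        case Phase::Exit:
            if (clipFinished) { phase = Phase::Idle; }
            break;
        }
    }

    // Stand-in for whatever call the engine uses to start a clip.
    void Play(const std::string& clipName) { (void)clipName; }
};
```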
In-game animation is about making actions read clearly while working within the constraints of your game world. We don't have the luxury of an infinite number of frames to sell an action. But those restrictions also end up being the best part of in-game animation because they force animators to think critically and really make sure their pose choices are 100% clear within a set amount of time.
An in-game animator's greatest challenge is the balance between responsiveness and weight. Animations often need to start in their anticipation so that when you press a button on your keyboard or controller you see the expected result instantly. But because we're removing frames from the beginning of our animations to serve responsiveness, we have to find a way to inject weight into our animations with fewer frames. There are many tricks that we do to get around this, such as putting our anticipation at the end of an animation for repeated attacks, making certain attacks unable to be canceled, as well as crafting the spacing of each frame so that the weight of a character is properly felt.
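As a rough illustration of that responsiveness-versus-weight trade-off, here is a small, hypothetical C++ sketch of how it might be expressed as per-attack data; the field names and frame numbers are invented for the example, not taken from any shipping game. Playback starts a few frames into the anticipation so the press feels instant, the hit lands on a fixed contact frame, and cancelling into another move is only allowed once an explicit window opens.

```cpp
// Hypothetical per-attack tuning data for balancing responsiveness and weight.
struct AttackClip {
    float startOffsetFrames = 4.0f;   // anticipation frames trimmed on button press
    float contactFrame      = 12.0f;  // frame where the hit lands
    float cancelOpenFrame   = 18.0f;  // earliest frame another input may interrupt
    float totalFrames       = 36.0f;  // full length, including follow-through

    // Playback starts partway into the clip so the result of the button
    // press is visible immediately.
    float StartTimeSeconds(float fps = 30.0f) const {
        return startOffsetFrames / fps;
    }

    // A heavy attack can set cancelOpenFrame past totalFrames, which makes
    // it uncancellable and preserves its full follow-through.
    bool CanCancel(float currentFrame) const {
        return currentFrame >= cancelOpenFrame;
    }

    bool HasHit(float currentFrame) const {
        return currentFrame >= contactFrame;
    }
};
```

The spacing tricks mentioned above then carry the weight that the trimmed frames used to: how far the character travels between the remaining frames is what sells the impact.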
Really, we need to ensure everything reads clearly in-game. It's one thing to make a nice playblast, but ultimately it's the game that has to look and feel good. We try to get the animations in-game as soon as possible to see if they're working. It'll save so much time in the long run, but we have other new challenges such as scope and players' expectations.

Adam
What engine is used at your studio (Unreal, Unity, proprietary engine) and why?
iAnimate.net
Each studio is different, but if animators want to keep up with the trends, learning a 3D engine is a great asset, and that's one of the reasons iAnimate created the Unreal Workshop. Getting hands-on experience with an engine and integrating your animations so they become part of a system that provides player control is more valuable than you can imagine.
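To give a feel for what "animation as part of a system that provides player control" can look like, here is a condensed, Unreal-flavoured C++ sketch; the class name, the montage asset, and the "Attack" input binding are all placeholder assumptions, and it's a rough outline rather than production code. A button press plays an attack montage on the character, and further presses are ignored while that montage is still running.

```cpp
// WorkshopCharacter.h -- condensed sketch with hypothetical names.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "Components/InputComponent.h"
#include "Animation/AnimInstance.h"
#include "Animation/AnimMontage.h"
#include "WorkshopCharacter.generated.h"

UCLASS()
class AWorkshopCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    // Assigned in the editor: the attack clip authored in Maya,
    // exported and wrapped in a montage.
    UPROPERTY(EditAnywhere, Category = "Animation")
    UAnimMontage* AttackMontage = nullptr;

    virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);
        // "Attack" is assumed to exist as an action mapping in the project settings.
        PlayerInputComponent->BindAction("Attack", IE_Pressed, this, &AWorkshopCharacter::OnAttackPressed);
    }

    void OnAttackPressed()
    {
        UAnimInstance* Anim = GetMesh()->GetAnimInstance();
        if (AttackMontage && Anim && !Anim->Montage_IsPlaying(AttackMontage))
        {
            PlayAnimMontage(AttackMontage); // ACharacter helper: plays the montage on the mesh
        }
    }
};
```

In a real project the same hookup is often authored in Blueprint or an animation blueprint instead; the point is simply that a clip only starts to feel right once it sits inside that input-driven system.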
Adam
What current emerging technology can you see explode in the next few years?
iAnimate.net
Better technology will enable animators to focus on performance by taking on some of the grunt work. Working with a great programmer or technical animator who can help us transform our animation will only help push the creativity to another level.
We wouldn't be surprised if full-body motion matching becomes the norm on next-gen consoles. After that, procedurally generated motion will hit our shores and transform how we work, enabling quality and quantity like never before. This will give indie and smaller studios an opportunity to create games with more content. Similar to how games have helped push the motion capture industry, we should be looking at how these other advancements affect the film and TV industries. With the strong video and game streaming services, the mobile industry, the still very young VR industry, and even motion capture becoming more accessible, there's so much to explore. The mobile industry will continue to benefit from the growing number of console-like smart devices, continued improvements in technology, cellular networks, and streaming services.
However, VR is still expensive and time-consuming to develop for. If the price is cut in half and it gets easier to use, we'll likely see more devs producing deeper VR experiences for next-gen hardware. Until that changes, there's little financial incentive to make VR content compared to well-established and ever-improving pad-based games.
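For readers who haven't run into it yet, the motion matching mentioned above boils down to a brute-force nearest-neighbour search: at regular intervals the runtime compares the character's current pose and the player's desired trajectory against features pre-extracted from a large motion database, then jumps (and blends) to the best-matching frame. Below is a stripped-down, hypothetical C++ sketch of that search, with an invented feature layout and weights.

```cpp
#include <cstddef>
#include <limits>
#include <vector>

// Illustrative feature count: e.g. future trajectory positions, facing
// directions, and current foot positions/velocities.
constexpr std::size_t kFeatureCount = 12;

// One candidate frame from the pre-processed motion database.
struct MotionFrame {
    int   clipIndex;                // which source clip this frame came from
    float time;                     // time within that clip, in seconds
    float features[kFeatureCount];  // pre-extracted matching features
};

// Brute-force search: return the database frame whose features are closest
// (weighted squared distance) to the query built from the character's
// current pose and the player's desired trajectory.
std::size_t FindBestMatch(const std::vector<MotionFrame>& database,
                          const float (&query)[kFeatureCount],
                          const float (&weights)[kFeatureCount])
{
    std::size_t bestIndex = 0;
    float bestCost = std::numeric_limits<float>::max();

    for (std::size_t i = 0; i < database.size(); ++i) {
        float cost = 0.0f;
        for (std::size_t f = 0; f < kFeatureCount; ++f) {
            const float d = database[i].features[f] - query[f];
            cost += weights[f] * d * d;
        }
        if (cost < bestCost) {
            bestCost = cost;
            bestIndex = i;
        }
    }
    return bestIndex; // the runtime would then blend toward this frame
}
```

Production systems layer acceleration structures and careful blending on top; this only shows the core cost comparison that drives the technique.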
