How 'Bambi' paved the way for both 'Fallout 4' and 'Angry Birds'

Adam Bargteil | Updated on: 8 August 2017, 16:15 IST

When “Bambi” premiered in London on August 9, 1942, the fifth film from Walt Disney Animation Studios broke a lot of new ground. It was the first Disney film in which a character’s parent dies early in the film – now a common plot device, as in “The Lion King” and “Frozen.” It was the first Disney film without human characters. And it heralded a new development in animation: a shift away from visual realism and toward abstraction.

From left, a cave painting from Borneo, the perspective of Vermeer and the abstraction of Mondrian. The Conversation US, compiled from Luc-Henri Fage, Johannes Vermeer and Piet Mondrian. CC BY-ND

Since humans began painting in caves more than 40,000 years ago, and all the way through the 17th century, artists aimed to make realistic portrayals of their subjects. But not long after Vermeer cracked the puzzle of depicting three-dimensional perspective accurately on a two-dimensional canvas, something fundamental shifted: Realism began to give way to more stylized and abstract art, from Van Gogh and Monet through Picasso and Dalí to Mondrian, Pollock and the advent of modern art.

The same trend happened in animation. “Snow White,” Disney’s first animated feature and the first full-length movie with every frame drawn by hand, was heralded for its realism. A multiplane camera allowed the film’s artists to use shadows and three-dimensional depth to striking effect. Just five years later, in “Bambi,” the studio was making conscious decisions to avoid realism in the artwork.

As a scholar and researcher of computer animation for movies and video games, I’ve noticed that this general trend toward realism, followed by the intentional adoption of more abstract and stylized imagery, can be seen across the visual arts, including in electronic media.

Starting with realism

When making “Bambi,” Disney’s animators had a strong initial focus on realism, sending artists to the Los Angeles Zoo to observe deer behavior and even keeping two fawns at the studio. That helped make the depictions of Bambi, his mother and the other deer very realistic. But the producers found the initial sketches of the forest background too busy and distracting.

Walt Disney with the two fawns the studio used for animating ‘Bambi.’ Disney

So they intentionally substituted an impressionistic style to depict the forest backgrounds. By incorporating more detail at the center of the action and less detail near the edges, the artists were able to direct viewers’ attention to the characters.

Spreading to video games

The primitive graphics of ‘Space Invaders.’ Scalleja, CC BY-SA

The development of computer animation and video games followed a similar trajectory. The first arcade games, like “Computer Space,” “Pong,” “Space Invaders” and “Asteroids,” displayed simple line drawings and blocky shapes – the electronic equivalents of prehistoric cave art.

Realism improved along with screen technology, and particularly with the increasing prevalence of raster displays, where the image is divided into a grid of individually illuminated pixels. Color brought “Donkey Kong,” and by the mid-1980s middle-class kids were playing “Super Mario Bros.” at home on their televisions.
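
To make the raster idea concrete, here is a minimal Python sketch that treats the screen as a grid of pixels and lights a few of them individually. The tiny 16-by-8 “screen,” the on/off brightness values and the sprite pattern are all invented purely for illustration; real displays and framebuffers are far larger and more elaborate.

```python
# Schematic illustration of a raster display: the image as a grid of pixels,
# each one set individually. Sizes and the sprite are invented for illustration.

WIDTH, HEIGHT = 16, 8
framebuffer = [[0 for _ in range(WIDTH)] for _ in range(HEIGHT)]

# "Illuminate" a few individual pixels to form a crude sprite.
for x, y in [(3, 2), (4, 2), (5, 2), (3, 3), (5, 3), (3, 4), (4, 4), (5, 4)]:
    framebuffer[y][x] = 1

# Print the grid, one character per pixel.
for row in framebuffer:
    print("".join("#" if pixel else "." for pixel in row))
```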

At the end of the 1980s, improvements in graphics hardware achieved the video game equivalent of Vermeer’s camera obscura: game designers could now depict three-dimensional space on two-dimensional screens. Like the paintings of Vermeer, computer games were finally able to achieve the realism of accurate perspective, and to keep the images flowing in real time with the gameplay.
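
The heart of that capability is perspective projection, which can be sketched in a few lines of Python. The focal length and sample points below are made up for illustration, and real-time engines do this with 4-by-4 matrices on graphics hardware, but the divide-by-depth idea is the same.

```python
# A minimal sketch of perspective projection, the core trick behind drawing
# three-dimensional space on a two-dimensional screen: divide by depth.
# The focal length and points are arbitrary illustrative values.

def project(point, focal_length=1.0):
    """Project a 3D point (x, y, z) onto a 2D image plane at depth focal_length."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point must be in front of the camera (z > 0)")
    # Dividing by z makes distant objects smaller, which is exactly the
    # foreshortening that reads as depth on a flat screen.
    return (focal_length * x / z, focal_length * y / z)

# The same offset appears smaller the farther away it is.
print(project((1.0, 1.0, 2.0)))   # nearby point  -> (0.5, 0.5)
print(project((1.0, 1.0, 10.0)))  # distant point -> (0.1, 0.1)
```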

For the next 25 years a primary goal of the video game industry was improving realism by incorporating shadows, texture, increasingly detailed geometry and ever more complex lighting effects. By 2015 and 2016, games like Bethesda Game Studios’ “Fallout 4” and EA DICE’s “Battlefield 1” achieved breathtaking levels of realism, though they also made clear that there remains room for improvement.

Dramatic realism in ‘Fallout 4.’ Bethesda Game Studios

Going mobile

The advent of the iPhone in the late 2000s pushed video games back away from realism. The new devices almost immediately spawned thousands of simple, two-dimensional, abstract and often highly stylized games. In this case the abstraction was more a product of necessity than a conscious choice: Smartphones are not powerful gaming machines and cannot run the advanced graphics algorithms or achieve the realism of high-end gaming computers. But nearly everyone has a smartphone, so the mobile gaming industry has significantly expanded the number of people who play video games.

Part of their popularity may be, in fact, that these new games aren’t realistic, but rather silly and cute. Far more people have played “Where’s My Water?” and “Angry Birds” than “Fallout 4.” There is even an “Angry Birds” movie.

Back to the drawing screen?

This trend toward – and then away from – realism also appears in computer animation for the movie industry. Computers were first used to assist in traditional animation, where a seasoned animator draws a few important “key” frames and a less-skilled assistant draws the frames in-between. In 1974 computers replaced the assistant in producing the (highly stylized) short film “Hunger.”
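
The assistant’s job of generating in-between frames is essentially interpolation, which a computer can sketch in a few lines of Python. The simple three-point poses and the plain linear blend below are illustrative assumptions; production systems interpolate with splines and far richer models of the drawings.

```python
# A minimal sketch of computerized "in-betweening": given two key poses drawn
# by an animator, the computer generates the intermediate frames by
# interpolating each point. The poses and linear blend are only illustrative.

def inbetween(key_a, key_b, t):
    """Blend two key poses (lists of (x, y) points) at parameter t in [0, 1]."""
    return [
        (ax + t * (bx - ax), ay + t * (by - ay))
        for (ax, ay), (bx, by) in zip(key_a, key_b)
    ]

# Two hand-drawn "key" poses of a simple three-point shape.
key_frame_1 = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
key_frame_2 = [(2.0, 0.5), (3.0, 0.5), (2.5, 1.5)]

# The computer fills in the frames in between.
for frame in range(5):
    t = frame / 4
    print(f"frame {frame}: {inbetween(key_frame_1, key_frame_2, t)}")
```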

Over the next two decades filmmakers increasingly used computers for special effects, like the computer-generated “Genesis” sequence in 1982’s “Star Trek II: The Wrath of Khan” and the shape-shifting T-1000 in 1991’s “Terminator 2: Judgment Day.”

It wasn’t until 1995 that Pixar released “Toy Story,” the first feature-length film animated entirely in a computer-generated three-dimensional world. Software models of how light falls on a scene and interacts with the surfaces it hits were still primitive, and resulted in a very “plastic” look – which Pixar turned to its advantage by telling a story about plastic toys. The human characters did not look realistic.
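
As a rough illustration of why early renders looked “plastic,” here is a classic textbook shading calculation in Python: Lambert diffuse plus a Phong-style specular highlight, whose small, uniform highlight tends to read as shiny plastic. The vectors, weights and shininess value are arbitrary, and this is a generic textbook model, not a description of Pixar’s actual rendering software.

```python
# A sketch of the kind of simple lighting model early computer graphics relied
# on: Lambert diffuse plus a Phong-style specular highlight. The tight, uniform
# highlight it produces contributes to the "plastic" look described above.
# Textbook illustration only; parameters are arbitrary.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(normal, to_light, to_eye, shininess=32.0):
    """Return a brightness value in [0, 1] for one point on a surface."""
    n, l, e = normalize(normal), normalize(to_light), normalize(to_eye)
    diffuse = max(dot(n, l), 0.0)
    # Mirror the light direction about the surface normal, then compare it
    # with the view direction; a high exponent gives a small, sharp highlight.
    r = tuple(2.0 * dot(n, l) * nc - lc for nc, lc in zip(n, l))
    specular = max(dot(r, e), 0.0) ** shininess
    return min(0.7 * diffuse + 0.3 * specular, 1.0)

print(shade(normal=(0, 0, 1), to_light=(0, 0.3, 1), to_eye=(0, 0, 1)))
```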

In the two decades since, the animation industry has made enormous progress in creating realistic virtual characters; even the six years separating Clu in “Tron: Legacy” from Grand Moff Tarkin in “Rogue One” show how quickly the technology is advancing. Before the end of the decade, we may see virtual characters that are indistinguishable from real actors.

Moving back to stylized designs

In movies now, we are again seeing artists intentionally abandoning realism for stylization. Director Steven Spielberg originally planned his 2011 adaptation of “The Adventures of Tintin” as a live-action film, but his fellow producer Peter Jackson convinced Spielberg to shoot it in a digital motion-capture studio to create a more stylized look that would more closely resemble the original comic.

The choices of moviemakers to use or not use realism have become artistic in nature. The 2.5-dimensional short “What to do with CO2,” made in 2013, is compelling because it isn’t realistic.

Similarly, in 2015 Netflix used both realism and stylization for artistic effect in “The Little Prince,” where the real world of the Little Girl is portrayed realistically, while her imagined world of the Little Prince is shown in stop-motion-like scenes.

Across all this time and all these technologies, a trend seems clear: Humans try to express reality. Once they do, they go back to making art.

Adam Bargteil, Professor of Computer Science and Electrical Engineering, University of Maryland, Baltimore County

This article was originally published on The Conversation. Read the original article.

First published: 8 August 2017, 16:15 IST