I am a fully grown, 35-year-old adult video gamer, and I’m not the only one.
Who plays video games today?
According to industry group the Entertainment Software Association, “The average gamer is 35 years old and 72 percent are age 18 or older.” 65 percent of American households include at least one person who plays video games at least three hours per week, and most parents (67 percent) also play video games with their children. The industry group’s full report is entitled Essential Facts about the Computer and Video Game Industry: 2017 Sales, Usage, and Demographic Data.
We can blame Nintendo for that.
The average gamer today was born in 1982 (give or take). Allowing for a few years of infancy and toddlerhood, most of us “average gamers” were first exposed to video games in the mid-to-late eighties, when the Nintendo Entertainment System was dominant. That system was released in North America in 1985, sold over 60 million units, and hosted classics like Tetris (1989), Super Mario Bros. (1985?), and The Legend of Zelda (1987).
Fun fact: the release date for Super Mario Bros. in North America is not known with certainty, even by Nintendo. It was released for the Famicom in Japan on September 13, 1985.
Nintendo, then, had a large influence on those of us adult gamers who never grew out of the pastime. One need only look to the nostalgia-fueled hype created by the release of the NES Classic for evidence of how present a simple video game console was during our childhoods.
Video gaming through the years.
The average time a console spends on the market before it is replaced is seven to ten years, though there has been some flexibility to that number. The Super Nintendo Entertainment System was released in North America in 1991, when we average gamers were turning nine years old and were still solidly “children.” This period also saw competition from SEGA, whose Genesis console attracted a large following. It was also the console cycle that saw the release of controversial games such as Mortal Kombat. Such games alarmed parents, lawmakers, and the medical community and sparked debate about the health effects of video games. I wrote about this history in a prior blog post: “Video Games May be Linked to Mental Illness.” I wrote about the therapeutic aspects of gaming in a separate post: “Therapeutic Video Games Exist, and You Are Already Playing Some.”
Fast-forward a bit.
Fast forward another console generation (five years this time) to the release of the Nintendo 64 in 1996. While childhood staples Mario and Zelda continued appearing in the form of Super Mario 64 and The Legend of Zelda: Ocarina of Time, video games were beginning to grow up. Sony’s PlayStation console competed fiercely with Nintendo for those of us transitioning from middle school to high school, from children to young adults. Games such as the horror classic Resident Evil and the action-adventure Tomb Raider (starring the just-a-bit-too-sexy Lara Croft, who would later be played by Angelina Jolie in the 2001 movie) targeted adolescents rather than children, and they were wildly successful. Manufacturers kept the attention of an aging consumer base by creating new, age-appropriate (or even wildly inappropriate) gaming experiences.
The college years.
Freshman and sophomore years of college saw the release of the next generation of consoles: the Nintendo GameCube in 2001, Microsoft’s Xbox (also 2001), and Sony’s PlayStation 2 (2000). While still including the classics, this generation targeted a distinctly grown-up audience, with first-person shooters such as Halo: Combat Evolved, stealth games like Metal Gear Solid 2: Sons of Liberty, and other similarly violent (and increasingly realistic) titles. Those of us tired of Mario could still find excitement dodging alien gunfire and lobbing flash-bangs at enemy troops.
You can pick up on the theme now…
With each successive generation, manufacturers have made sure to target the same group of children, now grown up, that the industry so successfully captured in the mid-1980s. As we became adults, manufacturers even began adding distinctly adult functionality to their consoles. The PlayStation 2 played DVD movies. Microsoft and Sony consoles now play Blu-ray movies, stream Netflix and Hulu, and even include parental control features so that we can lock our own children out of our adults-only games.
So, at 35 years of age, I am the “average video gamer” in the eyes of the industry. Below is my current video game library, which does not include games I have traded in or downloaded digitally. You will see a mix of adults-only games, but you will also notice that I have not been able to give up childhood classics such as Mario and Zelda. I do not have children; those are my games, for my own enjoyment.
And I wonder: how long can the industry keep us captive? If you had asked me in high school what I thought of a 35-year-old video gamer, the words “loser” or even “predator” might have come to mind. As of now, I have no intention of retiring my consoles. In ten years, will we all be 45-year-old gamers? What kind of games does a 45-year-old play?
Check back in a decade or so and I’ll let you know.