Friday, January 9, 2009

1/9/09:

When I was in high school, my younger brother got a pet dragon.

Now he plays in a rock band with his friends when he’s home on vacation, usually between midnight and 3 a.m.

I would never have thought of doing these things, myself — but then, he never went on a River Raid or destroyed Asteroids.

By now you probably know I’m talking about video games.

But I’m not just talking about video games. I’m talking about the way "kids these days" think and act, and how it’s been influenced by video games and new television genres.

I’m talking about it because despite my own relative youth, the way my brother — about 10 years younger than me — thinks is significantly different from the way I do. And I think it’s in large part thanks to his history with video games.

When I was 9, we had an Atari. By the time my brother was 9, we had a Sega and a Super Nintendo, and as he grew older, he went through a Sony PlayStation, a Nintendo 64, an Xbox and several computers’ worth of computer games.

Was I deprived, or was I mercifully spared a virtual childhood?

I think that partly depends on how well the skills we learn in video games translate into real-world know-how.

On a practical level, for instance, thanks to his Halo experience, my brother may know how to outstrategize alien invaders.

That may not sound very practical, but while I probably learned some level of hand-eye coordination from my Atari adventures, my brother learned entire ways of thinking. Increasingly, the skills games require are the ones demanded by a global, information-driven economy: strategic thinking, virtual problem-solving and even (groan) some game theory.

Perhaps most importantly, he learned how to learn-by-video-game.

It’s not likely that his career path will veer into "professional video gamer," nor that all careers in the future will have video-game components. But we’re living in the society predicted by both naysayers and advocates of the Internet’s ubiquity. Our lives are just as much virtual as they are actual these days, and lines between real and not-real are smudging.

The blurring of boundaries heralded by the influx of "reality TV" shows 10 years ago has become much more sophisticated. We still have "Survivor," but we also have "Lost," which is a scripted show (less "real" than "Survivor" in that sense) with interactive elements such as "The Lost Experience" alternate-reality game that people play online (more "real" than "Survivor").

Even in sitcoms, which used to be about setting up elaborate one-line gags no matter how unrealistic the premise, we’ve moved on from "Friends" to "The Office," which trades the laugh track for awkward moments and the quotidian victories of paper company employees in Scranton, Pa. – also more like real life, unless you spend all your days being witty in Central Perk.

And "Lost" and "The Office" took their venture into "reality" one step further, running ads for products that exist only within the series. The uninitiated wouldn’t even know these weren’t real.

If interacting with the real world requires dealing with this much virtual input, maybe we should all be playing video games.

James Paul Gee, the author of "What Video Games Have to Teach Us About Learning and Literacy," says that "we can learn a lot from those young people who play [video] games, if only we take them and their games seriously."

I agree. We all need to learn how to interpret, interact and think in a virtual world. Relying on old standards would be like training as a blacksmith after the Industrial Revolution: useful and quaint, but not quite relevant.
