Gamers Are More Divided Than Ever…And That’s a Good Thing

Become enthralled by one thing long enough and, regardless of what that thing may be, the same set of questions tends to present itself when you begin to look back on it.

While the questions are too varied to cover in full, a host of them will inevitably involve comparing that thing as it once was to how it is now. When doing so, it’s often essential to use your experience to properly separate the past as it actually occurred from the past as you perceive it through the eyes of nostalgia.

That’s a distinction that’s been running through my mind recently as I look at how games have changed, both from the day one origins of the medium and from my personal start as a gamer, to where they are now. In doing so, it’s interesting to discover and distinguish the things that have actually changed from the things that your heart tells you are different.

Specifically, lately I’ve been wondering if the gaming community really is more hostile and more divided than it has ever been before.

My heart tells me the answer is yes. After all, it seemed like the cultural divide among gamers when I was young didn’t extend far past Sega vs. Nintendo. Now, though, we have issues like AAA vs. indies, digital rights management, the treatment of women in video games, the validity of YouTube gaming as a career, the ethics of microtransactions, gaming as art, and many, many more. All of those issues cause a nigh infinite series of divides among the gamers of the world, and that’s before you even get into the traditional Xbox, PS4, Wii U, and PC debates.

The question, then, is this: are we really more divided as a gaming community than ever before? Is there really a more hostile environment between gamers everywhere than there was back in the old days? Or has this always been the case, and is it only the speed at which the internet carries information and opinions from all corners that creates the perception that there are more arguments than ever before?

Even when you approach that topic from an unbiased perspective, the answer will almost always be yes. Gamers are more divided and hostile than ever. The once popular idea of a community of gamers united against the rest of the world’s upturned noses at the very idea of gaming has given way to a civil war, with infighting on nearly every front. While you could argue whether that embodiment of a gaming community with an “us against the world” mentality ever truly existed at all, there’s little doubt that it isn’t the case now.

And you know what? In many ways we’re better off this way.

Well…Most of the Time Anyway

Oh sure, from time to time I see a topic or viewpoint that I personally consider outlandish get very heated, and I want to cite the always popular (yet rarely practical) “Can’t we all just get along?” belief. For the most part, though, I’ve come to accept the constant presence of various heated debates as a good thing for gaming, not a detriment.

The reason is that complacency in any industry is never a good thing. No matter what else you can say against the average gamer, one thing that’s for sure is that they are not a complacent lot. Not only are they quick to turn against something the moment it becomes a little too commonplace and comfortable, but they are always seeking out and confronting hot-button issues without much in the way of fear hindering them. These may not always be the most sophisticated and intelligent debates, mind you, but they are debates nonetheless.

It’s that constant stream of debate that ensures that developers, publishers, journalists, bloggers, websites, and anyone else on the creation side of the industry can never rest on their laurels. If the dissension that exists today on so many topics weren’t there, it’s possible that many of those in gaming wouldn’t feel the pressure (or even the obligation) to create a variety of experiences catering to any number of personal tastes, preferences, and beliefs.

There is a real passion behind many of the various viewpoints in the gaming world, and it is increasingly leading gamers from all walks of life to get creative and make something that perfectly represents their own particular set of thoughts. That not only serves as great entertainment for those who agree, but as fuel for those who do not to do the same and create something of their own in opposition.

Sure, it’s a general attitude that doesn’t really lead to a perfect gaming world (and there are, perhaps, some topics we would be better off being unified on), but it’s never really been a perfect world, has it? The one we have now, though, where gaming is essentially forced to constantly mature, reinvent itself, and provide a variety of experiences precisely because the gamer is no longer a caricature but a group of increasingly outspoken and discerning individuals, is a pretty damn exciting one to live in, at least in the absence of perfection.

If there is one warning that all divided gamers need to heed, though, it’s that we should all remember that, at the end of the day, games are first and foremost meant to be enjoyed and experienced. In that regard, never be afraid to challenge your own views by actively seeking out a variety of games, so that your beliefs, whatever they may be, on whatever topic or style, are ones formed by trying all of the different experiences games have to offer. Limiting yourself only to those that serve your particular notions undoes all of the good that the sometimes hostile and divided culture we enjoy as gamers is actually doing.

Is there a certain appeal to a utopian world where gamers come together to form a “Pleasantville”-like community based on shared essential beliefs? Perhaps. But there’s also an appeal to a more Gotham-like gaming community where hostility and divided beliefs may rule the day, but where those divisions ultimately come together to form an impressive world that can only be forged from the fires of such a variety of passions.

Whether that’s your ideal gaming world or not, it’s time we all stood back and appreciated the beauty and quality that world can so often lead to.

“South Park: The Stick of Truth” Gets Approved in Australia Thanks to Some Creative Censoring

We may never know what exactly is up the collective butts of Australian video game censors, but that hilariously misinformed and outdated group of do-gooders is at it again.

The target this time is “South Park: The Stick of Truth.” Specifically, the censors rejected the game on the basis of a scene involving penis-shaped anal probes and an abortion scene involving vacuums and a wire.

On a side note, isn’t it nice when game adaptations stay so true to the source material?

Anyway, developer Obsidian tried resubmitting the game in slightly toned-down forms, but were rejected at each turn. Finally, they submitted an impressively sarcastic version of the probing scene in which the imagery is replaced with a crying koala while on-screen text informs you as to what is actually happening in the original scene.

Unsurprisingly, considering the board’s traditionally misinformed interpretation of comedy, this version was accepted.

So it looks like the fair Australian gamers of the world will get to play “Stick of Truth” after all, albeit with more static images of koalas in place than were originally intended, as well as some minor mini-games axed entirely, thanks to some creative skirting of the censors.

Seriously though, what is the logic behind the extreme censorship of gaming in Australia? Considering it’s the year 2013 and I can probably pull up a YouTube video of mass genocide set to a dubstep soundtrack and intercut with images of “My Little Pony” fan porn on my phone, does a cartoon video game character’s anal probe encounter really constitute the ultimate line of morality?

Question Their Quality, but Never Deny the Work Behind Popular YouTube Gamers

I hate “The Big Bang Theory.” Understand that I don’t use “hate” often to describe something, but such is the case with that particular show. As an “out and proud” nerd, as it were, every time someone tells me that I must naturally love “The Big Bang Theory,” I tend to involuntarily cringe.

For the most part, I feel the same way about many popular gaming YouTube personalities, and for largely the same reason: I find the quality of their content to be creatively cheap, and a bad image for the culture they have become the most vocal representatives of.

Of course, please understand that isn’t meant as a blanket review of all gaming YouTube personalities. For instance, John Bain (better known by the handle TotalBiscuit) is one of my most trusted gaming critics. For the most part, though, the popular path to YouTube gaming fame of yelling at games and making cheap jokes along the way (let’s call it the PewDiePie effect) just doesn’t appeal to me, and quite honestly, I don’t think it is meant to.

It’s what has me somewhat conflicted about the recent YouTube Content ID incident, which is threatening the livelihood, and in some cases the very existence, of many of those YouTube personalities and their channels.

On one hand, I think that the literal application of archaic property and copyright laws that just don’t easily apply to video games is yet another in a shameful line of examples of the “world at large” not being sure exactly how to incorporate the medium properly into everyday life, business, and culture. I also truly feel that these sanctions (many of which are completely bogus, mind you) are just a taste of the world that is forming, in which the power and abilities of the individual are overshadowed almost entirely by those of the conglomerate, making it closer to impossible every day for that individual to shape their own fortune and make their own mark, regardless of their current position in the world.

On the other hand, in terms of the content we are potentially losing, I’m by and large unaffected. While there are some people hurt by this whom I will miss, in the grand scheme of things, from an entertainment perspective, I’m not ranking this occurrence with, say, the untimely cancellation of “Firefly.”

Maybe you share that opinion. Maybe you don’t. To be honest, I don’t really care. That’s not because I don’t respect your right to have an opinion on that particular subject, but rather because I feel that subject is very much worthy of debate, and of differing opinions.

However, if your stance on this topic is one of joy because you feel that the role of YouTube personality shouldn’t be considered a real job, and that these people have been just coasting along off of a broken system, then I’m here to call you out for being wrong. On that subject, I leave no room for debate.

What you have to understand is this: the people who are potentially most affected by these policies (and the ones still to come) are the people who work hardest at what they do. They are not the ones who throw on a webcam, grab a cheap mic, record their gameplay, and hastily throw it online with some poorly chosen metal music as bookends and call it a day. They are people who have learned genuine skills and talents, and who have put in 70-80 hours a week for years of their lives to get where they are today, which is a position to do what they love for a living.

It’s true that many of them were using pre-existing content as the crux of their work, but since when was that a crime in and of itself? Many of those who are being harmed most by this had the proper permission to use the content they were featuring at the time they used it. To criticize them for doing so is no different than criticizing the “Mystery Science Theater” cast for just piggybacking off old movies, or criticizing “Siskel and Ebert” for just judging original works and making a living off of it. Hell, while you’re at it, you might as well damn every gaming website and blog that makes its living by reporting on the industry as opposed to solely creating original content.

Many people don’t do that, though. Why? What is the difference? Is it the YouTube format? Is that what makes people completely disregard the genuine hard work that went into these people getting to where they are in life, and instead dance on the grave of their dreams while it’s still being dug?

If so, that’s a real shame. Yes, I admit the concept of a grown person essentially playing video games for a living doesn’t really qualify as the most practical, or certainly the most noble, of pursuits. However, it is what they love doing, and through a combination of ambition, luck, skill, ability, persistence, and most importantly hard work, they found a way to use the very slim opening that YouTube afforded them and turn it into something they could not only live off of, but take pride in.

There was a time when that kind of ambition and recklessness was admired and rewarded. It wasn’t always rewarded with financial gain, mind you, but spiritually it was the kind of action treated with respect and looked to for inspiration, for the belief that with the right combination of work and passion you too could make something better for yourself, and maybe even achieve your dreams.

And now that same effort is being mocked. Maybe only by a minute portion of the jaded and uninformed (or possibly just the usual trolls), but even that is too many. The idea that you are not the master of your own fate, but rather a slave to some notion of how things must be, is a mental poison that is corrupting this world a little more each day, and it can in no way be tolerated by anyone with a shred of hope and life left in them.

Call out these YouTube personalities all you want for the quality of their work. Critique them, question them, or just ignore them entirely if you choose. But never, ever deny those who truly deserve it respect for the work they put in to get where they are, and for their willingness to aim for something greater, regardless of whether or not it was through traditional means.

Do that, and you might as well deny all of those born without a silver spoon in their mouth the right to eat.

If They’re Not Careful, Telltale Games Run the Risk of Overexerting Themselves

Ever since acquiring the “Sam & Max” license, Telltale has garnered a reputation as a studio that does things a little bit differently.

It was with that series that the studio kicked off its unique episodic format, in which a series is released in monthly or bi-monthly installments over the course of a season. While the quality of the individual installments varied from great to “meh” with some regularity, the approach was for the most part viewed by many as a gimmick.

That was until the release of “The Walking Dead.”

With that series, Telltale finally made it all click. The choices and consequences in those games made the episodic format actually matter, while the quality of the writing and direction made “The Walking Dead” the first series from the studio to maintain a standard of excellence throughout. The general consensus winner of the 2012 game of the year awards, “The Walking Dead” was a runaway success.

Much like the runaway success of the “Walking Dead” TV show, however, its increased attention also drew increased criticism. Many gamers lashed out against “The Walking Dead” games for not actually being games, seeing them instead as a series of story sequences loosely strung together by the occasional dialogue choice or QTE section. As a result, “The Walking Dead” became one of the most cited titles in the growing debate over whether the term “video game” is still appropriate when describing the state of the medium today.

Regardless of where you stand on that particular issue, though, the sales numbers don’t lie, and the numbers tell us that “The Walking Dead” was a success. It was such a success, in fact, that it allowed Telltale to not only continue “The Walking Dead” series, but begin entirely new series within the high-profile worlds of “Fables,” “Borderlands,” and “Game of Thrones.”

And that’s what worries me.

See, I’m firmly in the crowd that loved “The Walking Dead.” While that’s mostly due to the quality of the game’s storytelling, I also attribute it to the fact that there wasn’t really anything else like “The Walking Dead” series, even in the Telltale canon. It was a breath of fresh air in the gaming world, and it made the choice to buy “The Wolf Among Us” a no-brainer.

By the end of the first episode of that game, though, it became pretty obvious that Telltale had no intention of abandoning the gold mine of design it stumbled on during “The Walking Dead.” I don’t want to sound like I’m writing off “Wolf” as a re-skinned “Walking Dead,” but rather to point out that if the appeal of “The Walking Dead” lay in its uniqueness and quality storytelling, the appeal of “Wolf” lies in its storytelling alone.

That’s fine, but it does raise the question of whether Telltale can justify releasing several high-profile series in succession that all follow that “Walking Dead” style. After all, how many times can you hope to catch lightning in a bottle?

Now, it’s not like I think Telltale should look at the success of “The Walking Dead” and say, “Well, we made a good game, so it’s time to shut down production,” but they already have both “The Walking Dead” Season 2 and “The Wolf Among Us” releasing concurrently, and now apparently have “Game of Thrones” and “Borderlands” titles in the works as well.

There’s no studio in the world that can possibly handle that amount of production and maintain a consistent level of quality, especially if the games they are making all follow the same basic template. We’ve seen before what happens to studios that feel obligated to make annual releases of the same series, and, with few exceptions, the results are not pretty.

In the case of Telltale, however, it would be even more tragic. Here’s a studio that made its name by releasing a game that shook the foundations of gaming and had some questioning the validity of the classification of gaming itself. Going from that to simply doing the same thing in new worlds reminds me of the executives from “South Park” who surmised that if saying “shit” in a TV show was popular and revolutionary, then saying “shit” even more, and in different episodes, was sure to be just as popular and revolutionary.

I believe that Telltale is a great developer, and one that will never intentionally start banging out games routinely in the “Call of Duty” style. However, whether it is their intention or not, unless they start exploring a style beyond that of “The Walking Dead,” or at the very least limit their releases to one series at a time, they run the risk of overexerting themselves and learning a lesson that entertainers everywhere have learned the hard way for years.

After a while, the same act starts to get old.

If Gaming Is to Evolve in the Next Generation, It’s Time to Start Ditching the Cinema

Since I didn’t get my PlayStation in time for the “Final Fantasy VII” craze, my first experience with the series was “Final Fantasy VIII.” While I could make the argument that I got the better game of the deal, that is a topic of heated debate best saved for another day.

Instead, the point of mentioning my first exposure to a “Final Fantasy” on the PlayStation is to reference that moment we all experienced when playing the series for the first time on that platform: the first time you saw one of the games’ cinematics. Though I’m not an expert on human behavior by any means, I feel fairly confident in suggesting that most people’s immediate reaction to viewing one of those beauties was to pick their jaw up off the floor so they could better articulate to anyone who would listen how it was “just like a movie,” and to wonder when all video games would look that good.

Now, “FFVIII” may have been my personal introduction to the wonders of the video game cinema, but it would be far from the last. In fact, you could argue that the PS1 era was the heyday of the video game cinema, as console developers began to realize the incredible (at the time) graphical potential of these scripted sequences, and just how much they could add to the basic video game story, which had previously been viewed by even the most intense fans of the medium as a sort of inevitable handicap whose few exceptions of excellence were best treated as anomalies.

Simply put, cinemas on the PlayStation were nearly universally thrilling exhibitions that showcased a level of potential in gaming that may have been dreamed of, but had never really been considered in earnest as a viable progression.

However, the PlayStation came out in 1994 and hasn’t really been actively developed for in about 12-13 years. Cinemas, though, in a format that strongly resembles the one they debuted under, remain.

“But,” you say, sensing where I’m going from both context clues and the headline, “cinemas have improved greatly since then, and exhibit a level of quality that makes those PS1 examples look archaic and pathetic.”

On that point, I don’t disagree. There is a film-like quality to the modern cinematic that, even during the mind-expanding origins of the PS1 cinemas, I wouldn’t have been able to properly envision. What’s more, cinemas of that quality are so prolific now that they’ve reached a point where their construction and implementation can, from a user standpoint, be viewed as effortless.

The ability of a game to use cinemas to bring its story more in line with the presentation style of films may have reached its awe-inspiring peak in the days of the PlayStation, but in terms of overall quality, you can’t deny that every subsequent year makes them better and better.

However, I hate them. Hate them, hate them, hate them. I hate them nearly every time I see them, and I have had the experience of several otherwise great, or at least enjoyable, titles ruined almost entirely by their presence.

To understand the problem with the modern cinema, you really have to look back at why cinemas came to be in the first place. They existed to invoke the aforementioned reactions of “Wow, this looks like a movie” and “When will games look this good?” and they gave gaming a needed crutch to improve the outward appeal of its storytelling.

However, gaming no longer needs that crutch, and it is becoming weaker and weaker by relying on it. The idea of a video game being able to mimic a film may have once been a fantastic notion, but it is one that can now be accomplished on nearly any reasonable budget.

As such, that same idea is now insulting. While there may have been a time when film was the only known effective way to tell a visual story, that time is no more. To suggest otherwise is to ignore the tremendous strides that certain ambitious developers have made in finding ways to present a story that is uniquely told through the abilities of video games.

Yet again and again, game developers from all walks of life see no problem in creating a tightly scripted, graphically lavish sequence that allows you to do absolutely nothing but put the controller down and watch. When you consider that the one universally defining characteristic of video games is interactivity, putting the player in a position where they are entirely, or even mostly, unable to interact with the game is crippling, and it instantly converts the experience from game to digitally animated film.

What’s more, the use of cinemas to such an insane degree has also spawned a number of other flaws in gaming. Among them, the most consistently annoying would have to be the rise of the QTE. These sequenced button presses are, on occasion, a well-done way to add a level of interactivity to story segments, but for the most part they are used as a begrudging solution developers offer to anyone who may balk at not being able to actually play the game they purchased, instead of just watching it for its presumed technological grandeur and “epic” story.

The game that really highlighted the gravity of this problem for me would have to be “The Last of Us.” While “The Last of Us” has one of the greatest stories in gaming history, it is made nearly unbearable at times by its reliance on traditional cinemas to tell the tale. The cinemas themselves may be better scripted and acted than nearly any others out there, yet they still manage to be groan-worthy, if for no other reason than that they force you to stop playing the game itself: a game that relies heavily on keeping you in the moment, and that gains much of its effectiveness from a tense atmosphere that instantly dissipates the moment a cinema appears.

What’s even worse in that instance is that Naughty Dog exhibits, in the same game no less, the ability to effectively tell a story with nearly no reliance on cinemas. That’s evident both in the banter between Joel and Ellie during levels, which does more to enhance both individual characters and their relationship than any cinematic in the game possibly could, and in the opening moments of the game, which contain perhaps the most gut-wrenching and effective moments of the entire experience while still affording you at least some level of interactivity with consequence.

Now, even as I type this, I feel a twinge of hypocrisy, as I’m among the biggest supporters of Telltale and their “Walking Dead” series, which is more or less an experience made up entirely of cinemas and quick time events. However, the key difference there is that the “Walking Dead” series openly presents itself as that type of experience. It is a point-and-click adventure game, a genre traditionally expected to be lighter on gameplay and heavy on scripted sequences. You know to expect that going in, and the developers are able to put extra work and importance into those sequences, since they are the main focus of the game.

Instead, my real problem with the whole idea of the modern cinema is its appearance in games that otherwise feature an extremely active pace. It’s in those games that I sign up for the action and gameplay, only to instead be spoon-fed cinema after cinema that, regardless of the overall quality of the individual examples, is with few exceptions nowhere near as thrilling, effective, or enjoyable as the very game it is a part of and, ideally, is only in place to enhance.

There was a time when the cinematic was a useful, exciting tool that showcased the potential for gaming to reach new heights of storytelling excellence. That time has passed, and the average pre-rendered, scripted cinematic now remains for no reason beyond laziness and an unwillingness, or creative inability, to pursue a viable storytelling evolution: one that can recreate the feeling and effect of the first time we viewed an elaborate cinema in a game, without harming the game in the process.

Much like 2D gaming and other tropes of the medium once born out of technological necessity, there will always be a place for the video game cinematic, regardless of whether or not it is still universally desired. However, developers everywhere, particularly those with budget to spare, need to really sit down when designing their next titles and ask whether the use of a cinema actually enhances the experience in a meaningful way, or whether it merely prevents the player from actually playing while helping the graphics and storytelling teams flex their creative muscles without purpose, like the design equivalent of a professional bodybuilder.

Do that, and I think many of them will come to the same conclusion on cinemas that many gamers have been acting on for years: just skip them altogether.
