Gamers are More Divided Than Ever…And That’s a Good Thing

Become enthralled by one thing long enough and, regardless of what that thing may be, the same set of questions tends to present itself when you begin to look back on it.

While the questions are too varied to cover in full, a host of them will inevitably concern comparing that thing as it once was to how it is now. When doing so, it’s often essential to use your experience to properly separate the past as it actually occurred from the past as you perceive it through the eyes of nostalgia.

That’s a distinction that’s been running through my mind recently as I look at how games have changed, both from the day-one origins of the medium and from my personal start as a gamer, to where they are now. In doing so, it’s interesting to distinguish the things that have actually changed from the things that your heart tells you are different.

Specifically, lately I’ve been wondering whether the gaming community really is more divided, and more hostile to one another, than it has ever been before.

My heart tells me the answer is yes. After all, it seemed like the cultural divide among gamers when I was young didn’t extend far past Sega vs. Nintendo. Now, though, we have issues like AAA vs. indies, digital rights management, the treatment of women in video games, the validity of YouTube gaming as a career, the ethics of microtransactions, gaming as art, and many, many more. All of those issues cause a nigh-infinite series of divides among the gamers of the world, and that’s before you even get into the traditional Xbox, PS4, Wii U, and PC debates.

The question, then, is this: are we really more divided as a gaming community than ever before? Is the environment between gamers everywhere really more hostile than it was back in the old days? Or has this always been the case, and is it only the speed at which the internet carries information and opinions from all corners that creates the perception that there are more arguments than ever before?

Even when you approach that topic from an unbiased perspective, the answer will almost always be yes. Gamers are more divided and hostile than ever. The once-popular idea of a community of gamers united against the rest of the world’s upturned noses at the very idea of gaming has given way to a civil war with infighting on nearly every front. While you could argue whether a gaming community with an “us against the world” mentality ever truly existed at all, there’s little doubt that it certainly doesn’t exist now.

And you know what? In many ways we’re better off this way.

Well…Most of the Time Anyway

Oh sure, from time to time I see a topic or viewpoint that I personally consider outlandish get very heated, and I want to cite the always popular (yet rarely practical) “Can’t we all just get along?” belief. For the most part, though, I’ve come to accept the constant presence of various heated debates as a good thing for gaming, not a detriment.

The reason is that complacency in any industry is never a good thing. Whatever else you can say against the average gamer, one thing that’s for sure is that they are not a complacent lot. Not only are they quick to turn against something the moment it becomes a little too commonplace and comfortable, but they are always seeking out and confronting hot-button issues without much in the way of fear hindering them. These may not always be the most sophisticated and intelligent debates, mind you, but they are debates nonetheless.

It’s that constant stream of debate that ensures developers, publishers, journalists, bloggers, websites, and anyone else on the creation side of the industry can never rest on their laurels. If the dissension that currently exists on so many topics didn’t, it’s possible that many of those in gaming wouldn’t feel the pressure (or even obligation) to create a variety of experiences that cater to any number of personal tastes, preferences, and beliefs.

There is a real passion behind many of the various viewpoints in the gaming world, and it is increasingly leading gamers from all walks of life to get creative and make something that perfectly represents their own particular set of thoughts. That not only serves as great entertainment for those who agree, but as fuel for those who do not to do the same and create something of their own in opposition.

Sure, it’s a general attitude that doesn’t really lead to a perfect gaming world (and there are, perhaps, some topics we would be better off being unified on), but it’s never really been a perfect world, has it? The one we have now, though, where gaming is essentially forced to constantly mature, reinvent itself, and provide a variety of experiences precisely because the gamer is no longer a caricature but a group of increasingly outspoken and discerning individuals, is a pretty damn exciting one to live in, at least in lieu of perfection.

If there is one warning that all divided gamers need to heed, though, it’s that at the end of the day, games are first and foremost meant to be enjoyed and experienced. In that regard, never be afraid to challenge your own views by actively seeking out a variety of games. Make sure that your beliefs, whatever they may be, on whatever topic or style, are formed by trying all of the different experiences games have to offer. Limiting yourself only to those that serve your particular notions undoes all of the good that the sometimes hostile and divided culture we enjoy as gamers is actually doing.

Is there a certain appeal to a utopian world where gamers come together to form a “Pleasantville”-like community based on shared essential beliefs? Perhaps. But there’s also an appeal in a more Gotham-like gaming community where hostility and divided beliefs may rule the day, but ultimately come together to form an impressive world that could only be forged from the fires of such a variety of passions.

Whether that’s your ideal gaming world or not, it’s time we all stood back and appreciated the beauty and quality that world can so often lead to.

If Gaming Is to Evolve in the Next Generation, It’s Time to Start Ditching the Cinema

Since I didn’t get my PlayStation in time for the “Final Fantasy VII” craze, my first experience with the series was “Final Fantasy VIII.” While I could make the argument that I got the better game of the deal, that is a heated debate best saved for another day.

Instead, the point of mentioning my first exposure to “Final Fantasy” on PlayStation is to reference that moment we all experienced when we first saw one of the series’ cinematics on that platform. Though I’m no expert on human behavior, I feel fairly confident suggesting that most people’s immediate reaction to viewing one of those beauties was to pick their jaw up off the floor, the better to articulate to anyone who would listen how it was “just like a movie,” and to wonder when all video games would look that good.

Now, “Final Fantasy VIII” may have been my personal introduction to the wonders of the video game cinema, but it would be far from the last. In fact, you could argue that the PS1 was the heyday of the video game cinema, as console developers began to realize the incredible (at the time) graphical potential of these scripted sequences, and just how much they could add to the basic video game story, which previously was viewed by even the most intense fans of the medium as a sort of inevitable handicap whose few exceptions of excellence were best treated as anomalies.

Simply put, cinemas on the PlayStation were nearly universally thrilling exhibitions that showcased a level of potential in gaming that may have been dreamed of, but never really considered in earnest as a viable progression.

However, the PlayStation came out in 1994 and hasn’t really been actively developed for in about 12-13 years. Cinemas, though, in a format that strongly resembles the one they debuted under, remain.

“But,” you say, sensing where I’m going from both context clues and the headline, “cinemas have improved greatly since then, and exercise a level of quality that makes those PS1 examples look archaic and pathetic.”

On that point, I don’t disagree. There is a film-like quality to the modern cinematic that, even during the mind-expanding origins of the PS1 cinemas, I wouldn’t have been able to properly envision. What’s more, cinemas of that quality are so prolific now that their construction and implementation can, from a user standpoint, be viewed as effortless.

The novelty of using cinemas to bring a game’s story in line with the presentation style of films may have reached its awe-inspiring peak in the days of the PlayStation, but in terms of overall quality there’s no arguing that every subsequent year makes them better and better.

However, I hate them. Hate them, hate them, hate them. I hate them nearly every time I see them, and I have had several otherwise great, or at least enjoyable, titles ruined almost entirely by their presence.

To understand the problem with the modern cinema, you really have to look back at why cinemas came to be in the first place. They existed to evoke the aforementioned reactions of “Wow, this looks like a movie” and “When will games look this good?”, and they gave gaming a needed crutch to improve the outward appeal of its storytelling.

However, gaming no longer needs that crutch, and it is becoming weaker and weaker by relying on it. The idea of a video game being able to mimic a film may once have been a fantastic notion, but it can now be accomplished on nearly any reasonable budget.

As such, that same idea is now insulting. While there may have been a time when film was the only known effective way to tell a visual story, that time is no more. To suggest otherwise is to ignore the tremendous strides certain ambitious developers have made in finding ways to present a story that can only be told through the abilities of video games.

Yet again and again, game developers from all walks of life see no problem in creating a tightly scripted, graphically elaborate sequence that allows you to do absolutely nothing but put the controller down and watch. When you consider that the one universally defining characteristic of video games is interactivity, putting the player in a position where they are entirely, or even mostly, unable to interact with the game is crippling, and it instantly converts the experience from game to digitally animated film.

What’s more, the use of cinemas to such an insane degree has also spawned a number of other flaws in gaming. Among them, the most consistently annoying would have to be the rise of the QTE, or quick time event. These sequenced button presses are, on occasion, a well-done way to add a level of interactivity to story segments, but for the most part they are used as a sort of begrudging solution offered to anyone who might balk at not being able to actually play the game they purchased, instead of just watching it for its presumed technological grandeur and “epic” story.

The game that really highlighted the gravity of this problem for me would have to be “The Last of Us.” While “The Last of Us” has one of the greatest stories in gaming history, it is made nearly unbearable at times by its reliance on traditional cinemas to tell that tale. The cinemas themselves may be better scripted and acted than nearly all others out there, yet they still manage to be groan-worthy, if for no other reason than that they force you to stop playing the very game itself, a game that relies heavily on keeping you in the moment and draws much of its effectiveness from a tense atmosphere that instantly dissipates the moment a cinema appears.

What’s even worse in that instance is that Naughty Dog exhibits, in the same game no less, the ability to effectively tell a story with nearly no reliance on cinemas. That’s evident in the banter between Joel and Ellie during levels, which does more to enhance both the individual characters and their relationship than any cinematic in the game possibly could, and in the opening moments of the game, which deliver perhaps the most gut-wrenching and effective moments of the entire experience while still affording you at least some level of interactivity with consequence.

Now, even as I type this, I feel a twinge of hypocrisy, as I’m among the biggest supporters of Telltale and their “Walking Dead” series, which is more or less an experience made up entirely of cinemas and quick time events. The key difference, however, is that the “Walking Dead” series openly presents itself as that type of experience. It is a point-and-click adventure game, a genre traditionally expected to be lighter on gameplay and heavier on scripted sequences. You know to expect that going in, and the developers are able to put extra work and importance into those sequences since they are the primary focus of the game.

My real problem with the whole idea of the modern cinema, instead, is its appearance in games that otherwise feature an extremely active pace. In those games I sign up for the action and gameplay, and am instead spoon-fed cinema after cinema that, regardless of the quality of the individual examples, is with few exceptions nowhere near as thrilling, effective, or enjoyable as the very game it is a part of and, ideally, is only in place to enhance.

There was a time when the cinematic was a useful, exciting tool that showcased the potential for gaming to reach new heights of storytelling excellence. That time has passed, and the average pre-rendered scripted cinematic remains based on nothing more than laziness and an unwillingness, or creative inability, to pursue a viable storytelling evolution: one that can recreate the feeling of the first time we viewed an elaborate cinema in a game, without harming the game in the process.

Much like 2D gaming or other tropes of the medium born out of technological necessity, there will always be a place for the video game cinematic, regardless of whether it is still universally desired. However, developers everywhere, particularly those with budget to spare, need to sit down when designing their next titles and ask whether the use of a cinema actually enhances the experience in a meaningful way, or merely prevents the player from being able to play while letting the graphics and storytelling teams flex their creative muscles without purpose, like the design equivalent of a professional bodybuilder.

Do that, and I think many of them will come to the same conclusion on cinemas that many gamers have been exercising for years: just skip them altogether.

The Team Behind “Thief 4” Gives a Small Preview of What the DualShock 4 Can Do

Sony is a company with a checkered history of controller innovation. Sure, they hit a sweet spot with the original PS1 controller, which just felt right in your hands, but the biggest tech additions to that model (analog sticks and vibration) were lifted from the successful innovations of the N64. Even then, Sony was so unsure about the whole analog thing that the original model of that controller had a button that allowed you to disable it, and the first game to require the sticks didn’t come to the PS1 until 1999.

Also, as the SixAxis proved, when it comes to home-brewed innovations the folks at Sony lag behind. It would seem they are really invested in changing that image with the PS4 controller, which maintains the timeless structure of the DualShock model but introduces a miniature touchpad, a share button of somewhat ambiguous functionality, and an LED light bar on top similar to the lights on the PS Move.

While the true test of these features won’t really come until developers have had a year or so to play around with the controller and explore its full benefits, the folks behind “Thief 4” have provided a small preview of what we can expect from it, specifically the LED bar: in “Thief,” it will remain dark when your character is hidden, and light up gradually as you become more and more exposed. They’ve also noted that the touchpad will be used for enhanced menu navigation, and that the more accurate motion sensors let them incorporate bow-aiming mechanics, as well as a motion-controlled dash option.
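Mechanically, that light bar behavior is just a mapping from an in-game visibility value to an LED intensity. Here is a minimal sketch of the idea; the function name and the 0.0-1.0 exposure scale are my own illustrative assumptions, not anything the “Thief” team has described:

```python
def light_bar_color(exposure):
    """Map a visibility value (0.0 = fully hidden, 1.0 = fully exposed)
    to an RGB triple for the controller's light bar."""
    level = max(0.0, min(1.0, exposure))  # clamp out-of-range values
    intensity = int(round(255 * level))   # scale to an 8-bit color channel
    return (intensity, intensity, intensity)
```

A hidden player gets (0, 0, 0), a dark bar, and the bar brightens smoothly toward full white as exposure rises, which is all the “dark when hidden, lit when seen” mechanic really requires.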

They also spoke of incorporating a mechanic that would allow you to blow on the controller to blow out in-game candles, but said it might be removed if it proves “too gimmicky.”

Granted, this isn’t game-changing stuff, but it does remind me of the first time I played “Tiger Woods” on the PS2 and noticed how the enhanced graphics actually allowed me to better read the course at a glance, improving the gameplay through a cosmetic upgrade. It’s a little touch, to be sure, but it’s an interesting first step toward what appears to be a new day for Sony controller integration and innovation.

The NCAA Pulls Its Football Video Game License From EA

The NCAA revealed today that it will no longer provide its football license to EA, effectively spelling the end for college football video games as we know them after the release of “NCAA 14.” Though only briefly touched on in the announcement, the real elephant in the room behind the decision is the use of player likenesses, for which the students involved receive no profit. It’s an issue that has been haunting all aspects of college sports for some time now, and the removal of this license is just one example of a larger problem with no clear answer in sight.

EA, for its part, says it will continue to make college football games, but without the NCAA license, a move that will likely work about as well for them as it did for that non-NFL-licensed 2K football game.

To be honest, my first reaction to this was somewhere between “Who cares?” and “Good riddance.”

While a little pessimistic, that’s a reaction founded somewhere during the years of “Madden-lite” NCAA entries, which turned a game that used to sit on the cinder-block bookshelf, next to the Einstein poster and the dirty laundry pile, in every college dorm room in America into just another half-hearted EA series.

Yes, if you don’t remember, there was in fact a time when the “NCAA” team embraced and implemented the college spirit in its annual entries, and came up with a game that was separate from, but in many ways equal to, the usually more popular “Madden” franchise. The series sported its own cult fan base, and it wasn’t unusual for someone to say they were a fan of “NCAA” but had never played “Madden.”

Of course, as the years went on, the only way to really distinguish the two gridiron series was by the teams’ logos (which, of course, are no longer available).

But the more I think on it, the more it becomes clear that this really is sad, due mostly to those years when “NCAA” was a classic franchise. It was once a rite of passage for every college football fan to have that one game they would forever remember with their college roommate or best friend, and be able to recite play by play on any future drinking occasion.

Now, barring some serious legal changes, that’s likely gone forever.

Ultimately, it’s true that the quality of the games likely had no bearing on the final decision. However, if the series had been able to maintain its former glory, then maybe this would be a story built not entirely around money, but around memories as well.

New Metal Gear Solid 5 Footage Shows the Full Graphical Potential of Next-Gen Gaming

Next-gen gaming is a strange animal in its early days. Oftentimes the best the last generation has to offer comes out right before the transition (as we are very much seeing this year), while developers are still trying to find their footing on the new systems and, as such, don’t always produce experiences that truly exhibit the power and potential of these new machines.

There are exceptions, of course (“Soulcalibur,” “Halo,” and “Mario 64” jump to mind), but more often than not, the above conundrum tends to be the case.

My impressions of the pending next-gen fell in line with that problem: while certain games shown certainly looked incredible on their own merits, in terms of graphical capabilities I didn’t see anything from E3 or elsewhere that gave a true visual idea of what we can expect.

However, it turns out that may have been the result of having to view blurry, second-hand versions of all the footage, as Eurogamer has the 60 FPS HD version of the “Metal Gear Solid 5” trailer, and it looks absolutely incredible.

Unfortunately the video is too high quality to be uploaded properly, but by proceeding here (or here for the 720p version) you can view it in all of its glory. Just know that it takes some respectable performance power to run them uninterrupted.

Now, obviously some of the footage is from cinematics, and is therefore not trustworthy when it comes to representing in-game quality. However, the parts that are clearly gameplay show a level of detail and clarity that is simply not possible on the current generation of console hardware. Looking at only the gameplay sections, you could make a reasonable argument that “MGS5” is the most technically impressive game of all time.

Also, interestingly enough, the pursuit of 60 FPS has been around since the original PlayStation days, but it never became the industry standard for all releases, due in large part to the rise of HD gaming making it more difficult and somewhat less necessary. The team behind “MGS5” wants to make it the standard for their game, though, which may indicate that a shift in the rest of the industry’s AAA releases is soon to follow, and if so, it will only increase the amount of eye candy available to gamers in the years to come.
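The trade-off behind that 60 FPS target is easy to quantify: doubling the frame rate halves the time an engine has to simulate and render each frame, which is why the target gets harder as resolution and detail rise. A quick sketch of the arithmetic (the function name is my own, purely for illustration):

```python
def frame_budget_ms(fps):
    """Time, in milliseconds, available to produce one frame at a given frame rate."""
    return 1000.0 / fps

# At 30 FPS an engine has roughly 33.3 ms per frame;
# at 60 FPS, only roughly 16.7 ms.
```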
