Posts Tagged ‘video games’

11 Jan 2015

Player Consent and Responsibility

Contemplative

Over four years ago (!?!?) I wrote a post concerning how games do and don't account for ambivalence in their pacing.  It boiled down to the point that sometimes the player doesn't want to fully engage with what the game has lined up next, and the game should give them the space to decide when to do so.  It's a topic that's come back to mind as I consider how I choose to engage with Dragon Age: Inquisition.  I haven't invested enough in that game to really judge its specific case one way or the other, but it's led me back to a train of thought about how games are structured around player preferences to maximize engagement.

This question of how and when to engage the player is a matter of achieving player consent to engage (e.g. when to begin the final battle with Lavos in Chrono Trigger.)  While this is not something that should be employed at every turn in a game (there's got to be some room for surprise), the experience must first begin with soliciting and confirming the player's consent to engage in what the game has planned.  It's easy to see how this makes sense in cases of user experience design (allow players to skip dialogue, allow them to save at critical points before proceeding, etc.), but it becomes more complex when applied to the "text" of the game itself.  By "text" I mean the content of the game rather than how the game and player interface with one another.  The player must first choose to listen to what the game has to say before it can successfully proceed in conveying its text.  Anticipating the player's desires and navigating their preferences would appear to be an insurmountable task, but it is something worth considering when evaluating critically successful games.

Following from consent, the game must next instill a sense of responsibility in the player to respect the game's rules.  When the game chooses to speak, the player must understand what is being said, in addition to consenting to hear it, and then accept responsibility for its consequences.  Only after this process of acquiring consent and accepting responsibility can the interaction follow.  For instance, at the very beginning of Super Mario Brothers, World 1-1, we can safely assume consent has been acquired, it being so early in the game.  Responsibility is instilled by creating a coherent set of rules that determine what the game means when it speaks.  Not jumping over the goomba results in failure.  Failure results in restarting the level.  Jumping and maneuvering are required in order to navigate the spoken hazards of the level.  Thus, the game instills in the player a sense of responsibility to protect Mario against those hazards.  Failure comes at the cost of the player's time.  This creates anticipation and tension, but if the cost is too great, the game will lose the player's consent even if they understand and accept responsibility for its rules.  World 1-1 mitigates this by only increasing the cost of failure in step with what it knows the player is capable of.  The player cannot reach the end of World 1-1 without demonstrating that they are capable of understanding what the game has said.

How do I tie this back into Dragon Age: Inquisition?  Many times, when a game makes grand promises, I meet them with greater scrutiny.  I have less time to play games now, and I've played enough of them that I want early assurance that what I'm doing is novel.  I'm also a tad overwhelmed by the sheer breadth of this game in particular.  I don't have confidence that I will play it well at first and, as a result, I don't fully invest myself in it while I learn its systems.  I've consented to the game, but I haven't accepted full responsibility for the meaning of its text.  And that's because it's got so much to say! I'm not really sure what to make of it yet or why I need to care.  So, I joke around, play with the systems, and wait for all of this noise to coalesce into an experience resembling what I think the designers had in mind for me.  At that point, they'll have my full buy-in and I'll fully play the game "in character."  Games like The Walking Dead and Saints Row IV accomplished this for me, and Dragon Age may yet as well.  In the meantime, this will be my experience:

13 Oct 2014

Third-person gaming

Bringing the pain.  Family style.

I can't say that the concept of a "Let's Play" or video game streaming ever grabbed my attention or captured my interest.  Watching a stranger play a game that I, myself, could be playing instead didn't make sense.  It struck me as an entirely redundant and unnecessary part of gaming culture.  But here we are, in a world where PewDiePie commands unrivaled success on what is currently the world's most pervasive source of video content, making millions of dollars.  It's not something I can claim to understand, but it's becoming as much a part of the phenomenon of video games as the games themselves.  How does this fit into a vision of video games as art and expression?

For some, games are a sport with competitors and spectators.  But that by no means comprises the majority of the third-person gaming content that's currently available.  Some of the most successful let's-players cater to a younger audience: one that doesn't have the disposable income to purchase the games it might like to play, and instead lives vicariously through others who do.  That feels like a much more reasonable explanation to my mind, but not an exhaustive one, especially at a time when free-to-play and inexpensive games are becoming increasingly popular.  What makes third-person gaming interesting, and at the same time frustrating, is that it seems to run counter to a core axiom of understanding games as a creative medium: that interaction forms the foundation of games.

Let's take a couple of steps back and consider games from another time.  Arcades, long gone for the most part, attracted those who wanted to play games as well as those who wanted to watch others play them.  You could see people crowd around cabinets watching other people play games for reasons not dissimilar to why you might watch a "Let's Play" today.  You might watch someone play a game in an arcade for a few different reasons.  They might be participating in an entertaining competition with others.  They might be playing the game particularly well.  Or they might be progressing further into a game than you're able to, and you'd like to see those later portions of the game.

I imagine the reasons for hanging out and watching people play games in the arcade extend to watching others play games next to you on the couch, or through a video stream.  Playing games requires time, effort, and money, all of which most folks have in limited supply.  Younger audiences have time, but little money to facilitate the hobby.  Adults obviously have other responsibilities that limit their time and energy.  But none of that stops people from wanting to enjoy these games.  And in some cases, watching a game being played might be the preferred way to consume it.  For me, that game is Skyrim.  It's a game that's open, but not terribly motivating to me.  I'd like to enjoy the game's sights and sounds without investing the time into learning the ins and outs of its systems.  As a result, I've ended up watching my wife play through much of it.

Are these inferior experiences?  Well, there wouldn't be a Skyrim experience for me at all if I didn't tag along for someone else's game.  Doesn't my refusing to interact with the game represent a failure on its part?  Aren't you just watching a poorly composed film?  Well, no.  You're not accepting that the person playing the game is the author of the experience, and you're not accepting that the play-through you're watching is the only way to experience the game.  What you've done is delegate authority to someone to interact on your behalf; to do the things that you can't, or won't, do.  You're still acknowledging the game's systems, mechanics, and rules, while designating someone else to make the decisions you might have made yourself.  While a game's design allows you to manipulate an avatar to perform actions that you yourself could not perform, a Let's Player adds another layer of abstraction to the experience.  Firing a gun in reality is not as simple as it is in a game.  But firing a gun in a game might still be cumbersome for some, and delegating authority to perform that verb is still about performing the verb.  The difference between direct and indirect interaction is like the difference between touring a museum on your own and having a guide provide a narrative for the experience.  The art itself is not inferior for requiring that some consume it with the help of a guide.

There's still value in this approach.  A game can be appreciated for its composition, even without playing it.  We don't have to accept one player's actions when playing a game vicariously through them, but at the same time, we understand the consequences of the interactions we do accept.  Those are not possibilities in a film; there is no alternative experience to be had in a film by playing with its rules.  Furthermore, third-person gaming has the added benefit of providing a means for a more inclusive gaming culture.  It provides another way to consume games that doesn't require an up-front investment in order to appreciate them.  This might be a "well, duh" moment for me, being someone who hasn't really participated in the Let's Play phenomenon, but I think there are interesting possibilities for discussion and analysis of games beyond direct interaction, a concept that's been a shaky crutch for expressing the aesthetic potential of games.  Most recently, I've been able to enjoy another open-world game that I've failed to invest in myself, Fallout: New Vegas, and I've enjoyed the audience-based interactions that drove the game forward.  Hopefully, with the rising popularity of platforms such as Hitbox and Twitch, we'll dig into games more through the lens of third-person gaming.

29 Sep 2014

Comment: It’s okay to ask “What are games?”

Can you find "the real game"?

Asking whether a game is actually a game has become a weapon used to attack games that don't fit a certain profile (or someone's particular tastes) and to discourage discussion of them.  One of the most notable examples of a game that triggers this argument is Gone Home.  Accusing Gone Home of not being a "real" game occurs frequently, but this has not suppressed the game's success within the video game enthusiast community.  It's perhaps for that reason that the "not a real game" argument is thrown out as frequently as it is.  It's an unfair accusation, which insinuates that Gone Home has somehow misrepresented itself, as though it were a Steam Early Access title pretending to be a functioning product.  Those who purchase the game in good faith have received it well; it's not as if there's been a rash of refund requests.  Perhaps protest reviews of the game are meant to counterbalance critical praise and dissuade gamers who expect a certain score to mean a certain quality of graphics, an acceptable genre, or a particular type of gameplay mechanics.  Or it could be the temper tantrum of a gaming tribe that doesn't feel sufficiently catered to.

I think this is an exceedingly unfortunate development.  While it may not discourage discussion of a game like Gone Home, it is going to make video game enthusiasts less inclined to explore what games are.  The #GamerGate train of thought abuses ludological inquiry to impose a set of tastes and preferences on video games and video game enthusiasts.  Video games are a phenomenon, and using questions as weapons will prevent the community from further understanding that phenomenon.  A recurring theme on this blog has been exploring the distinction between video games and software, and dissecting what "interaction" in games really means.  These are concepts worthy of discussion if we want to better understand and capture the elements of games that make them successful and close to our hearts.  So the question shouldn't be whether or not a certain game is worthy of being called a video game, and we shouldn't settle on an arbitrary set of rules.  The question should be "what qualities are unique to games as a phenomenon?"  And so long as software that calls itself a game isn't misleading interested consumers into believing it's something it's not, there's no reason to conduct the gamer version of a red scare.

If there's one thing that can be agreed upon, it's that video games facilitate play.  Play doesn't have to be fun, and it doesn't prescribe a specific brand of interaction.  Video games represent a space we remove ourselves to and engage through play.  It could be to pretend we are someone else, that we are in another place, or that we can do something ordinary people cannot do.  So long as we are a participating party in that space, it's fair to call it a game.  Winners, losers, high scores, multipliers: these are all concepts that can be components of a game, but they are not intrinsic to it.  If there's a need to highlight a deviation from mainstream games, it might be to point out that a game is a bad product rather than a bad game.  So if your concern is for consumers, take the time and consideration to make an appropriate argument, and don't conflate the product with the concept.  Even if Gone Home isn't your favorite game, it might be the start of something different and greater.  And that shouldn't be stifled.

01 Sep 2014

Comment: Gamers and Tribalism

Must be a "real" game.

I’ve been standing back over the last few weeks and watching “controversy” unfold in the gaming community.  I don’t know what to really say about it, other than I’m aghast at the campaigns of harassment and vitriol that have been levied against the likes of Zoe Quinn, Anita Sarkeesian, Phil Fish, and Tim Schafer.  I’m embarrassed to share the same hobby with the people attacking them, and feel pretty depressed with the general state of gaming.  I’ve never seen any of Quinn’s or Sarkeesian’s work, and I’ve only partly completed games by Fish or Schafer.  But what I’ve seen unfold has only served to draw me, and I suspect many others, to their work.  I’m not invested enough in any of the individual “controversies” to comment directly on them, and honestly, I can’t imagine there ever being a controversy in the video game industry that warrants this kind of attention and abuse.  I would like to make some observations about “gamers” as a community and the divide that’s opening among them.

The term "gamer" has been used as code for those who have an affinity for games strong enough that games comprise an important part of their identity.  For much of that time, it was a way for these individuals to identify each other in contexts that weren't exactly game-friendly.  For younger gamers, video games were looked down upon by those in authority (and used as a scapegoat for a long time) and by their own peers, who considered gaming an antisocial activity.  In reality, it was a new activity that grew through smaller demographics but was unfamiliar to most others.  Out of necessity, the gamer label was forged to create community amongst those contending with alienation.  If you wanted to apply an anthropological concept to it, gamers formed a tribe.

The video game industry made appeals to this tribe and reinforced it.  They encouraged this tribe to make THEIR games part of its identity.  And like any business would, they made observations about their audience and played to the primary demographic that games appealed to: young white males.  If I had to speculate on why young white males made up the majority of that early group of gamers, it would be because they were the most likely to have access to disposable income and were receptive/privileged enough to adopt the hobby in spite of it being looked down upon.  Certainly, others who did not fit that profile enjoyed games just as much, but they didn't have a group that embraced them.  There have, without a doubt, been girl gamers (among other demographics) for as long as there have been "gamers", but being included in that tribe meant compromising other aspects of their identity to placate the majority.

Tribalism isn't a model for growth.  But video games as a medium were going to grow no matter what.  "Gamers" have grown up.  Gaming is in the mainstream, and it's a far more acceptable activity than it was 10 or 20 years ago.  The "gamer" tribe has outlived its usefulness, but there are those who cling to it out of fear of compromising their identity by letting it go.  Remaining loyal to the tribe, being a true gamer, means liking certain games, respecting aspects of "gamer" culture, and not challenging the foundations of the tribe.  I was part of this earlier in my life.  But you know what, I'm not afraid anymore that having others join in on the medium means compromising my identity.  For me, the medium is part of my identity, and there are certain games that led me to that.  But particular games, companies, and ideas of who gamers are aren't part of that identity.  It's not about deciding which games are the real games, which are the core games, and which ones signal that you are a true gamer for playing them.

Gaming has outgrown gamers, and that's a natural progression.  You're not a terrible person if you once enjoyed a game that wasn't the most friendly to women or to folks who don't fit the "gamer" profile.  Criticizing aspects of a game you enjoy is not criticizing you.  There have been problems with games, but nobody's perfect, and games are going to continue to improve over time.  Having voices like Sarkeesian's goes a long way in communicating how that can happen.  It's been the same way for every other medium of entertainment.  We can't preserve gaming in amber as it was 15 years ago because we're upset that we once felt alienated by non-"gamers."  It's not fair to the medium, and if you're invested in games as a medium, you're holding it back by doing that.  If we're so insecure about gaming's place in our lives that we're blowing up perceived problems to the level of Watergate, then we're warping the industry into a form of therapy for a manufactured ailment, rather than a form of entertainment.  And people will be justifiably pissed at us for that.  It needs to be acknowledged that gaming will include others, and that doesn't automatically mean competition with the gamer tribe.  But that's how it's being treated: as a zero-sum game where the success of a game like Gone Home somehow means other "real" games lose.

Sarkeesian's videos are about recognizing problematic patterns in game design and how to correct them.  They're not about antagonizing those who identify themselves as part of a tribe.  But between Tropes vs Women in Video Games and "Gamergate", they're being treated like an assault on the livelihood of gamers.  If you believe that suppressing this point of view is important to protect games as you know them, then you're just making it more difficult for folks like me to enjoy games with those outside of "gamers"; people who are important to me and who I want to understand why games are an important part of my identity.  If you think, for example, that women shouldn't have a problem with how other women are portrayed in games because you've managed to rationalize it to yourself, that's not persuasive, and I still don't get to share the experience.  You're acting like gamers are a band of survivors after the apocalypse who can't trust outsiders.  I don't think these non-gamers should excuse flaws in games in order to accommodate those who think games should only be made for "true" gamers.  And I don't want to excuse them either.

I can believe there are people out there, somewhere, who fit the profile of a social justice warrior and are simply intolerant of the existence of games they don't agree with.  But even assuming that's the case, the resulting campaign of harassment against the individuals mentioned above is also telling me that I can't enjoy games with others who don't conform to the tribe's norms and customs.  And while perhaps there was a time when tribalism made some sort of sense in gaming, that time is well past.  The behaviors that follow from tribalism are extreme, irrational, and blinding people to the fact that the games they enjoy are not going away.  Tribalism is an explanation for a spontaneous human phenomenon, not a justification for treating people like shit.  But that's what it has become.  Being a "gamer" now means being driven by fear of games you might not identify with, and by the insecurity of no longer being able to claim an entire medium as your domain.

If you're going to tell me that women shouldn't be allowed to enjoy games because at one point you felt that women alienated you for playing games, then I'm going to tell you to get over it.  You're being an asshole, and you're the one ruining games by trying to hold them back.  I'm not able to identify with the gamer label anymore, and my sympathy for "gamers" has dried up.  This isn't a controversy between SJWs and gamers.  It's a conflict between those who love games and are afraid to share them and those who love games and want others to enjoy them as well.

20 Apr 2014

When does software become a game?


I swear this isn't a post about whether or not games should be considered art. I'll be writing this post under the assumption that they are, and from there begin to identify the point at which game design shifts from being craft to being art. The craft of game design lies in the tasks most commonly associated with making games: programming, graphical design, sound design, and so on. It's a common argument that many of the component parts of a video game could be considered art when evaluated apart from the rest of the game. But when they are applied to video games, they are art assets in service to another goal rather than to their own. You do not play a game strictly to listen to its soundtrack or to admire its scenery. They're part of the reason you are playing, though.

Video games are software, and something that's fascinated me for as long as I care to remember is the question of when software stops being just software and becomes a game. It's a topic I've explored in the past and written about here. It's one of the key reasons I became a software engineer myself, and I've spent the better part of the last 15 years learning about software design. A fair deal of that effort went into learning about game programming. In discerning what makes software a video game, it's easy to say "I know it when I see it." But when it comes to creating one yourself, you're jumping down a rabbit hole that becomes overwhelming. It can be very challenging to become a good programmer and even more challenging to become a good game programmer. So when you are trying to surmount those challenges simultaneously, you end up with a whole lot of ideas that get off the ground only to crash into a mountain of existential doubts. Many times it's easier to derive new games from existing games, since trying to tackle all of these challenges at once is so difficult. But that still doesn't explain where the original idea of a game originates and how it comes to be.


The best analogy I can make to explain the concept of designing and implementing a game as art is to compare game design with the composition and performance of music by an orchestra. To create a game, there are many jobs that must be done in concert with one another, not unlike musicians who are grouped by the type of instrument they play. It requires skilled coordination by a director or team lead to craft a game, perhaps similar to the role a conductor plays. But this analogy requires there to be a composition that's been prepared and is ready to be performed.  Once again, I'm certain that game design, apart from the crafting of a game, is something we know when we see it. That still leaves the question of what exactly it is. The composer knows what instruments are available and what musicians are capable of. The concepts of music theory are harnessed to compose a piece that can be performed in a way that engages the audience. So what are the concepts of game design theory that the game "composer" could arrange to engage the audience?

There are high-level concepts used to describe game design theory: level design, game mechanics, difficulty curves, and many others. Even these concepts are nebulous and rely on assumptions held by video game enthusiasts that aren't easily quantified or agreed upon. Level design can be pointed out in other games. It can be deconstructed and recomposed into new concepts, but you can't simply add "level design" to software and get a game. Nor can you tack "game mechanics" or other high concepts onto software to produce a game. A lower-level foundation is required to achieve those high concepts. You need notes before you can have scales, melodies, or harmonies. I've tried to write about this in the past in trying to separate the concept of games from fun, and I've written about a fair few games using lower-level game concepts as a basis to deconstruct them. It was a bit awkward in its execution, but also an interesting exercise.

Software requires three basic components to become a game and to build toward higher-level design concepts. Verbs, spaces, and impressions are the "notes" used to compose games. Remove any of them, and a game returns to being software. They are entirely conceptual and have no attachment to the crafting of a game. They don't require one to have a background in programming to be able to compose them, and the resulting design could feasibly be handed off to a developer and made a reality. For the purposes of this discussion, the game's design can exist independently of how it is crafted. It can exist in the same way a piece of sheet music exists, and then be performed by any number of "orchestras." We see this many times with classic games. Developers will take a game like Tetris and rebuild it to master the craft of game programming and to admire the design of the experience. It could be said that a game like Tetris is recreated by amateur game developers so often because of its simplicity. But it's a game that's been thoroughly deconstructed and defined. Rather than being simple, its design is accessible and easy to understand when it's been properly executed. It is a demonstration that game compositions can be picked up and "performed" by other developers as though they were pieces of sheet music.


Without repeating the content of earlier posts, let me try to describe verbs, spaces, and impressions. Broadly speaking, verbs describe how players are able to express themselves. What are they capable of doing and "saying" within the confines of the game? Spaces represent the area in which these verbs manifest and determine how the game will respond. The interplay of verbs and spaces is a sort of conversation between the player and the game, and the progression of that conversation produces the game's impressions: how, exactly, the conversation affects and changes the game. I've often seen the case made that the art of games lies in the player acting as a storyteller communicating their experience with a game. This may be so, but the design of the conversation is important; otherwise it is simply a story about software rather than a game.  Once you have these three components, you can begin to build a solid foundation for the concept of game design and composition apart from the crafting of game software.
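To make that a little more concrete, here is a minimal sketch of how I might model verbs, spaces, and impressions in code. This is purely my own illustration of the framework, not anything taken from a real engine; every name in it (Verb, Space, Impression, the goomba example) is hypothetical.

```typescript
// A "verb": something the player can say to the game.
type Verb = "jump" | "move-left" | "move-right" | "wait";

// A "space": the state in which verbs manifest and the game decides how to respond.
interface Space {
  marioX: number;
  marioY: number;
  goombaX: number;
  failed: boolean;
}

// An "impression": how one exchange of the conversation changes the game.
type Impression = (s: Space) => Space;

// The game interprets each verb in the context of the current space,
// producing the impression it leaves behind.
function interpret(verb: Verb): Impression {
  return (s: Space): Space => {
    switch (verb) {
      case "move-right":
        // Walking into the goomba without jumping is failure; restart the conversation.
        if (s.marioY === 0 && s.marioX + 1 === s.goombaX) {
          return { ...s, failed: true };
        }
        return { ...s, marioX: s.marioX + 1 };
      case "move-left":
        return { ...s, marioX: s.marioX - 1 };
      case "jump":
        return { ...s, marioY: 1 };
      case "wait":
        return { ...s, marioY: 0 }; // gravity, crudely
    }
  };
}

// The conversation: the player speaks verbs, the game answers with impressions.
const start: Space = { marioX: 0, marioY: 0, goombaX: 3, failed: false };
const transcript: Verb[] = ["move-right", "move-right", "jump", "move-right", "wait"];
const end = transcript.reduce((space, verb) => interpret(verb)(space), start);
console.log(end); // { marioX: 3, marioY: 0, goombaX: 3, failed: false }
```

The point of the sketch is only that the "composition" lives in the types and the interpret function; any number of "orchestras" (engines, platforms, programmers) could pick it up and perform it.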

I believe this brings us closer to understanding the point at which software becomes a game, but there is still one important, missing piece. Even if I were to take a piece of software like a spreadsheet editor and compare it to a game, I could still identify verbs, spaces, and impressions in both. What makes these components different in a game? It's the end they serve that differentiates them. Spreadsheet editing software serves your need to organize information. It caters to the constraints of the data. The verbs, spaces, and impressions of a game serve the game itself, as its own end: a piece of software that serves no functional purpose other than to engage an audience. Maybe this is the point where you stop suspending your disbelief. Would entertaining an audience with software be the same as providing them with tables of data? If that is your opinion, then I couldn't hope to persuade you otherwise in the space of one blog post. I can only speak from my own experience as a software developer who has spent a lot of time jumping between those two types of software.  But I believe that the point where software begins to serve its own ends as a game exists, and that identifying it is an important part of the art of games, apart from the crafting of games.

05 Apr 2014

#BoRT: The Control Environment


If you were to travel back to a time before the iPhone (seven years ago) and ask gamers what they thought about mobile, touch-driven games, you'd probably hear complaints about how they're a gimmick for Nintendo DS games.  Their explosive growth and adoption on mobile devices over the past several years has come as a surprise to many gamers, whose appreciation of games is tightly coupled with the nuanced control schemes and level designs of console and PC-based games.  A touch-driven game meant sacrificing too much of that nuanced control.  The opinions of the core gaming community can't be projected onto the larger gaming public, though.

For most people, touch-controlled games weren't about giving anything up; they were about having games that were accessible without a high barrier to entry.  An activity that used to require purchasing special hardware, wiring up TVs, wrestling with PC drivers, and picking one place to play became an activity they could take anywhere, at little cost.  Core gamers can grouch about on-screen controls, about the market catering to a "casual" gaming audience, or about publishers trying to cash in on free-to-play games, but the concept of gaming grew tremendously during this time.  And it has reached a far broader audience than any single console or PC game has been able to.

There are few mobile, touch-driven games I can think of that I don't believe would be better on a game console or PC.  I can't deny the value they present to the larger gaming public, though.  I also think this is just the beginning of a significant shift in how we think about games and how we play them.  The concept of "next generation games" is now meaningless.  The technology driving games is improving continuously, and isn't restricted to how graphics are presented.  The way we play games is increasingly about the technology that surrounds us.  Augmented reality games, GPS-enabled games, and RFID-enabled games are all examples of how this trend is reshaping the gaming landscape.

One concept that interests me in particular is the idea of second-screen apps.  We’ve seen them used to augment gaming experiences in trivial ways, but think about how this technology may begin to integrate non-video games.  If you’ve ever played the Battlestar Galactica board game, you know just how fun it can be to compete with your friends, deceive them, and try to fulfill your own goals.  It’s an incredibly elaborate game that includes many sets of cards, game pieces, and rules.  You’ll also know just how long it takes to set the game up, tear it down, or teach someone how to play.  It’s appealing to consider how the game could be digitally managed, but it is not well suited at all to being played via mouse and keyboard, controller, or PC monitor.

However, this sort of game could be translated to phones and tablets that share a game board on one large display.  Each player could use a device to manage their decks privately and to interact with the game board.  You'd no longer have to wrestle with the rule book or worry about finding a space large enough to fit several people and a game board.  And, most importantly, it would allow you to simply enjoy the game rather than spending all of your time managing it.  It's a scenario that utilizes technology and interfaces people are already comfortable with to make games more accessible to those who might enjoy them but can't get past the barrier to entry.  A rough sketch of what that split might look like is below.
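As an illustration only, here's a hypothetical message model for a game split between a shared display and players' devices. Nothing here comes from a real SDK or from the actual Battlestar Galactica rules; all of the type names and fields are invented to show how private hands and the public board could be kept separate.

```typescript
// Hypothetical message model for a second-screen board game.
// The shared display owns the public board; each phone/tablet owns one player's private hand.

interface PublicBoard {
  round: number;
  positions: Record<string, string>;   // playerId -> location on the board
  discardPile: string[];               // cards everyone is allowed to see
}

interface PrivateHand {
  playerId: string;
  cards: string[];                     // never sent to the shared display or other players
}

// Messages players' devices send to the shared display.
type PlayerMessage =
  | { kind: "join"; playerId: string; name: string }
  | { kind: "move"; playerId: string; to: string }
  | { kind: "playCard"; playerId: string; card: string };

// Messages the shared display broadcasts back.
type BoardMessage =
  | { kind: "boardState"; board: PublicBoard }
  | { kind: "yourTurn"; playerId: string };

// The display applies each player message to the public board and broadcasts the result,
// so devices never need to know about each other's hands.
function applyMessage(board: PublicBoard, msg: PlayerMessage): PublicBoard {
  switch (msg.kind) {
    case "join":
      return { ...board, positions: { ...board.positions, [msg.playerId]: "start" } };
    case "move":
      return { ...board, positions: { ...board.positions, [msg.playerId]: msg.to } };
    case "playCard":
      return { ...board, discardPile: [...board.discardPile, msg.card] };
  }
}
```

The important design choice is that private hands never cross the wire to the shared display; only public moves do, which is what makes hidden-role games like Battlestar Galactica feasible in this format.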

Going forward, the technology driving games will be about how we control games through our environment.  Right now, phones and tablets are a key part of that environment.  And soon, perhaps, wearable computers will expand the concept of how we play games even further.  The ways we think about controlling games shouldn't be limited to half-hearted attempts by console and PC gaming companies.  There will always be a need (and an audience) for classic gamepads, but we are no longer limited to them.  PC and console games are now a subset of a larger gaming market.  And we're no longer forced to consider alternative control schemes through the lens of a gaming market that caters to that subset.

There's been a great deal said about the recent purchase of Oculus VR by Facebook.  VR gaming is interesting and will no doubt have a place in the future of the market.  But there's (in my own opinion) a far more interesting development occurring in the second-screen space.  The Google Chromecast is a $35 HDMI dongle that acts as a simple media receiver.  When it was initially released, it was just another device that could stream shows from Netflix via your phone.  But in early February of this year, Google released the SDK for developing your own apps that utilize the Chromecast.

What makes this device interesting is the small footprint of the platform.  It turns your TV into a canvas that's driven by computers, tablets, and phones.  So, instead of the platform (Xbox, Blu-ray player, etc.) requiring the user to set up an app on the platform and then another app on another device, the user can simply set up the Chromecast and then "cast" whatever needs to be on the screen.  To go into technical detail for a moment: the app detects a Chromecast on the same network, connects to it, and then tells it to download a single-page application that handles messages from the first device and any others that connect to it.  Devices running iOS, Android, or the Chrome browser can all connect to it simultaneously and interact with each other.  The simplest example of this is a tic-tac-toe app.  It's HTML5-driven and platform-agnostic.
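For the curious, here's roughly what the receiver side of that tic-tac-toe-style flow looks like. I'm writing this from memory of the 2014-era (v2) Receiver API, so treat the exact object and method names as assumptions rather than documentation; the custom namespace and message shape are entirely hypothetical.

```typescript
// Receiver page the Chromecast loads when the first sender app connects.
// Sketched from memory of the v2 Receiver API; names may not be exact.
declare const cast: any; // provided by the Cast receiver SDK script included in the page

const NAMESPACE = "urn:x-cast:com.example.tictactoe"; // hypothetical custom namespace

const manager = cast.receiver.CastReceiverManager.getInstance();
const bus = manager.getCastMessageBus(NAMESPACE);

// Every connected sender (phone, tablet, Chrome tab) plays on this one shared board.
const board: string[] = Array(9).fill("");

bus.onMessage = (event: { senderId: string; data: string }) => {
  // Senders post JSON moves like {"cell": 4, "mark": "X"}.
  const move = JSON.parse(event.data) as { cell: number; mark: string };
  if (board[move.cell] === "") {
    board[move.cell] = move.mark;
  }
  // Broadcast the updated board so every connected device stays in sync.
  bus.broadcast(JSON.stringify({ board }));
};

manager.start();
```

The senders only ever talk to the TV, never to each other, which is what keeps the whole thing platform-agnostic.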

Hopefully, the low cost of the Chromecast, combined with the ease of building second-screen apps for it, will lead to the proliferation of non-video games like Battlestar Galactica on the platform.  It's too early to tell, but even if the Chromecast were to fail, other devices will emerge to facilitate this process of integrating more games into the digital space.  Expect games to break out of classic formats, relying less on singular platforms and more on the technological landscape surrounding you.  There won't be any one control scheme; it will continuously evolve and diversify as much as games themselves do.

Note: #BoRT stands for Blogs of the Round Table.  The preceding post was an entry to the April/May 2014 theme: The Right Touch.

30 Nov 2013

Games pretending not to be games

Feelings?

Continuing a thought from last week (and on the general theme I've been running with since the Aug./Sep. #BoRT topic): pacing in games and the pacing of their stories can compete with each other to the detriment of the overall experience.  I don't believe this is always the case, but those instances are the ones where a game's status as a game is called into question (e.g. Gone Home, or The Walking Dead.)  In the case of Arkham City, I felt that the development of player skill and strategy was undercut by the need to maintain the narrative's momentum.  Certainly, there are strategies players can employ to complete the game effectively, but Arkham City hardly challenges you to do so.  And I can understand why this would be necessary: above all else, the game wanted to envelop you in a sense of being Batman, and diverting attention away from the plot and its characters would interrupt that experience.

There's something more to this idea, though.  Of course, Arkham Asylum isn't going to start you off at level 1-1 and track your high scores.  But what about a game like Resident Evil 6?  Resident Evil 5 and 6 are known for eschewing conventional survival horror elements in favor of an experience that more closely resembles an action movie.  Or how about the Modern Warfare series?  No health bars, no need to actually play the game; just witness the cool stuff happening around you.  Blockbuster video games increasingly resemble movies in how scenes are "shot" and how their stories are told, and they are often criticized for failing to maintain a movie-like atmosphere.  The larger theme here, I believe, is that publishers are attempting to pander to an audience that doesn't want to be reminded that they are playing a video game.

The response to Roger Ebert's 2005 assertion that games are not art fanned the flames of insecurity in the gaming community.  After years of having our hobby derided as being solely for children's amusement, or being accused of being murder-simulator enthusiasts, we're a bit sensitive.  Showing that games are a form of speech has been imperative to defending video games from legal impediments.  Not all speech is protected, and to persuade many judges that games are a protected form of speech, they had to demonstrate merit as a medium.  More explicitly, at the time, if the topic of games as speech made it to the Supreme Court of the U.S., it would be subject to the Miller test and scrutinized as to "whether the work, taken as a whole, lacks serious literary, artistic, political, or scientific value."  And that is the nerve that Ebert hit with a ton of bricks.

For years and years, the debate over games as art continued (even if it was a mostly one-sided debate.)  Gamers and games writers put a great deal of effort into reassuring us all that video games are indeed worthy of being considered art, have a place in civilized society, or even provide experiences superior to other forms of art.  This provided an opportunity for publishers to jump on board and offer games that resembled what these writers were aspiring to: games that didn't resemble Space Invaders or Mario.  Games that camouflaged themselves against the broader media landscape.  Games that confirmed our assumptions, at least on a superficial level.  What had been signature elements of video games had become liabilities in the push to legitimize games in the eyes of wider culture.  Publishers' desire to cash in on this impulse is uniquely captured by the announcement trailer for Dead Island.  Audiences were given an artful trailer in which scenes of a family being attacked by zombies, and of a little girl killed in the process, are shown alternately in reverse and forward time.  Audiences loved it, and then the game turned out to be nothing like the trailer.  At all.

Publishers were, and continue to be, ready to market and build their games as something you can show off in bits and pieces and call art.  And ironically, the quickest route to this end is to copy movies.  Publishers aren't willing to take the same risks as indie developers and build an entire game around a story.  What we're getting is a split baby: games that attempt to provide the illusion that you're not actually playing a game, which comes at the expense of better game design.  It's not all games, and certainly there are developers that have succeeded in building a game completely around a story and its characters, but I think you can point to examples of games and genres that have suffered as part of this phenomenon, chief among them survival horror games and jRPGs.  These are both genres that can't easily be divorced from their game-like qualities, and both depend on the console gaming market, which is less accommodating to niche genres than the PC games market.  The console games market is cannibalizing itself, and in combination with the pressures of HD game development and the increasing popularity of mobile games, something's got to give.  I'm about ready to jump ship entirely to PC games.

In the meantime, I'll take refuge in playing Final Fantasy IX, again.



