
Wednesday, December 22, 2010

HP7 Part 1: The Game


Format: PS3, Xbox 360, Wii, PC (version played), DS
Developer: EA Bright Light
Publisher: EA
Release date: 19/11/10
Genre: Third-person shooter
Players: Single player
My score: 3/5

Death Eaters have infiltrated Hogwarts, and Voldemort (yes, that’s right, I named him, and it’s not You-Know-Who, guys, it’s U-No-Poo, the constipation sensation that’s gripping the nation) is stronger than ever and intent on taking Harry’s life. With Dumbledore gone, it’s left to Harry and company to find the Horcruxes if they want any hope of defeating You-Know-Who.

Straight away you notice you are less confined to the walls of Hogwarts than in the previous games and are instead offered the freedom of the outside world, where cover-based battles form the focal point. There is, however, less emphasis on exploration, as the story takes a more direct and linear approach than before to cover all the main twists and turns.

You are sometimes offered a number of routes to take, each presenting different situations, but this seems rather pointless as you have to complete all the routes anyway to piece the story together. If you’ve seen the film, you’ll know there’s a lot of content, which has forced the game’s pace to be quick. As a result, some cutscenes are cut so short that they could have offered far more depth and direction to the story.


As you’d expect, your current spells become stronger as you level up and you unlock new ones, but the limited hotkeys make it frustrating and annoying to switch between them in battle. So you will find yourself relying on two main spells to get the job done; in my case, I depended solely on Impedimenta and Confringo.

With Dumbledore out of the way and the Order of the Phoenix set to crumble, you’d expect your good mates Ron and Hermione to fight beside you and help in your quest. Think again: the AI of both seems too lazy for your liking, and Harry is left to do all the work while they idle around. They don’t offer much help and occasionally stand in the way just as you’re about to cast Stupefy. However, Ron does add some humour with his one-liners. For example, when I was under the invisibility cloak he said to Hermione: “these jeans are a bit tight.”

On the whole, the graphics are very good, but the rushed storyline overshadows the game’s presentation. I didn’t notice it at first, but some of the characters’ voices are not authentic, which takes away a little of the realism, although the voice actors aren’t that bad.

It looks like EA simply handed the project to EA Bright Light to be finished by whatever means possible in time to coincide with the film’s release, with little care for how it turned out. Given that there is a final film and game still to come, EA has one last chance to get everything right and leave us Muggles with a lasting impression of the Harry Potter series.





Friday, December 17, 2010

AMD Radeon HD 6970 and Radeon HD 6950 Hands-On Preview

AMD released the mid-range Radeon HD 6800 GPUs in October. The high-end Radeon HD 6900 series was supposed to follow shortly after, but plans sometimes get derailed. The company barely made it for the holiday season, but it's finally ready to unveil the Radeon HD 6970 and Radeon HD 6950. The new GPUs offer a few new features and a handy dose of nomenclature ambiguity to boot.



The Specs:

Radeon HD 6970: $369, 880MHz core clock, 1536 stream processors, 2GB GDDR5, 1375MHz memory clock
Radeon HD 6950: $299, 800MHz core clock, 1408 stream processors, 2GB GDDR5, 1250MHz memory clock
AMD's Radeon HD 6970 will go toe-to-toe with the GeForce GTX 570, while the Radeon HD 6950 spars with the GeForce GTX 470 and replaces the Radeon HD 5870. The mid-range Radeon HD 6870 replaces the Radeon HD 5850, and the HD 6850 boots out the Radeon HD 5830. As far as AMD is concerned, the Radeon HD 5800 series has now been discontinued. The HD 5700-based parts will continue to be manufactured, as will the dual-GPU Radeon HD 5970. The Radeon HD 6970 doesn't come close to displacing the Radeon HD 5970; for that, we'll have to wait a little longer for the Radeon HD 6990.
The recently released 6800 GPUs were basically architecturally refined 5000-series GPUs. The 6900 series takes all those refinements and builds on them. The new chips feature dual graphics engines that provide triple the tessellation performance of the Radeon HD 5870. They also contain second-generation DirectX 11 optimizations, accelerate Blu-ray 3D, and add new antialiasing and anisotropic filtering modes to improve image quality.
Morphological antialiasing provides full-scene antialiasing via a post-processing technique. Traditional AA modes are applied when the scene is being rendered; morphological AA is done after the frame has been generated. The new AA mode is compatible with any DirectX 9/10/11 game.
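To make the "post-processing" part concrete, here is a deliberately simplified sketch of the general idea behind post-process antialiasing: operate on the finished frame, find hard luminance edges, and blend across them. This is only an illustration of the concept, not AMD's actual morphological AA algorithm, and the threshold and blend kernel are made-up values.

```python
import numpy as np

def postprocess_aa(frame, threshold=0.1):
    """Toy post-process AA: frame is an HxWx3 float image in [0, 1],
    already rendered. Hard luminance edges are detected and blended
    with their neighbours -- same basic idea as morphological AA,
    which also runs after the frame has been generated."""
    luma = frame @ np.array([0.299, 0.587, 0.114])  # per-pixel luminance

    # Mark pixels where luminance jumps sharply horizontally or vertically.
    edge = np.zeros(luma.shape, dtype=bool)
    edge[:, 1:] |= np.abs(np.diff(luma, axis=1)) > threshold
    edge[1:, :] |= np.abs(np.diff(luma, axis=0)) > threshold

    # Blend edge pixels with a small cross-shaped neighbourhood average.
    blurred = frame.copy()
    blurred[1:-1, 1:-1] = (frame[1:-1, 1:-1] + frame[:-2, 1:-1] +
                           frame[2:, 1:-1] + frame[1:-1, :-2] +
                           frame[1:-1, 2:]) / 5.0

    out = frame.copy()
    out[edge] = blurred[edge]
    return out
```

Because the filter only needs the final image, it works on any DirectX 9/10/11 game regardless of how the frame was rendered, which is exactly the selling point AMD is making.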
EQAA, or enhanced quality antialiasing, uses up to 16 subsamples to improve image quality. It can be enabled on top of existing antialiasing modes, although it does take a little bite out of performance.
On the back end of the Radeon HD 6970 and HD 6950, you can find two dual-link DVI ports, an HDMI 1.4a port, and two mini-DisplayPort 1.2 connectors. HDMI 1.4a allows the video card to output 3D Blu-ray. DisplayPort 1.2 makes a whole new set of features available. The connector now allows you to daisy chain displays together, and with the use of a splitter/hub, you can also support up to six DisplayPort 1.2 displays through two connectors. We don't have pricing information on these hubs, but AMD indicates that they should be fairly inexpensive. Active DisplayPort 1.1 adapters used to cost $100, but now they're about $20, so we're inclined to think that the hubs won't be all that pricey.
The Radeon HD 6970 consumes 250W at peak power, and the Radeon HD 6950 consumes 200W. Both idle at 20W. On the power connector front, the Radeon HD 6970 requires 1x8-pin and 1x6-pin plug, and the Radeon HD 6950 uses 2x6-pin plugs.
The Competitive Landscape
$260 - GeForce GTX 470
$290 - Radeon HD 5870 
$300 - Radeon HD 6950
$350 - GeForce GTX 570
$370 - Radeon HD 6970
$450 - GeForce GTX 480
$530 - GeForce GTX 580
Radeon HD 6900 Performance Tests
The Radeon HD 6950 proves to be a solid value at its $300 spot. It bests the slightly cheaper GeForce GTX 470 handily in everything but Lost Planet 2. The Radeon HD 6950 also outguns the Radeon HD 5870 quite easily and gets very close to catching up with the more expensive GeForce GTX 570.
AMD's new single-GPU flagship, the Radeon HD 6970, also performs well. It takes out the GeForce GTX 570 in everything but Lost Planet 2, but it likely won't catch up to the GeForce GTX 580 (which we don't have for testing; it's hardly in stock anywhere, which probably explains why).
AMD's new GPUs didn't win the single-GPU crown, but the combination of performance and pricing makes them more than competitive. The 6900 feature set is also very future-proof. DisplayPort 1.2 isn't all that common yet, but with the low price of active DisplayPort adapters, rolling with three monitors (or even six) has never been easier. Also, be on the lookout for cheaper 1GB variants of the Radeon HD 6970 and Radeon HD 6950.
Test System: Core i7 980x, Asus Rampage III Extreme, 6GB OCZ DDR3, Seagate 750GB 7200.11, Windows 7 64-bit. Video card drivers - Catalyst 10.12beta, Forceware 263.09.


Friday, December 10, 2010

Nvidia GeForce GTX 570 Hands-On Preview

(FOUND THIS REVIEW ON GAMESPOT.COM)



Nvidia has officially cranked it up to 11. After botching the launch of the GeForce 400 series, the company had to demonstrate that it was still capable of delivering. And it has. A scant eight months after the launch of the GeForce 400 series, Nvidia popped out the GeForce GTX 580 in late November, and now it's ready to release the GeForce GTX 570.

Rolling in with an MSRP of $350, the GeForce GTX 570 is considerably more affordable than the $500 GeForce GTX 580 and comes packed with 480 CUDA cores running at 1464MHz and 1280MB of GDDR5 RAM running at 3800MHz. The core clock is pegged at 732MHz. We're essentially talking about a GeForce GTX 580 with a slight haircut. Even more stunning, the GeForce GTX 570 beats the GeForce GTX 480 in just about every single stat and costs significantly less!

The GeForce 500 series is based on Nvidia's GF110 architecture, which is itself an extension of the GF100 chipset that the 400 series used. One would expect that the new 500-series parts would be substantially changed from the 400 series, but then one would just be wrong. All the bullet points from the 400 series still stand: DX11 support, 3D Vision, Nvidia Surround, and PhysX. The key benefits of the 500 series are an improved cooler, quieter running speeds, and architectural improvements that basically yield a faster-running 400-series part sans the mind-blowing temperatures.

The dual-slot, 10.5-inch GeForce GTX 570 is just as long as a GeForce GTX 480. Its new copper cooler gives it a little more heft, and a rejiggered fan design lowers operating noise. You'll also need only two six-pin PCI Express power connectors to power it--no fancy eight-pin connectors required. When running at full tilt, the GeForce GTX 570 is nowhere near as noisy as the GeForce GTX 480.


Much like Rowdy Roddy Piper in They Live, Nvidia is out of bubble gum and clearly ready to kick something. At $350, the GeForce GTX 570 is a phenomenally strong GPU. ATI has no single-GPU solution that can even hope to compete with it at the moment, either on price or performance (although that may change very soon). The $450 GeForce GTX 480 is either beaten or matched by the GeForce GTX 570 in our tests. Let's just go over that again: overall better performance than the GeForce GTX 480, lower power consumption, lower heat output, reduced sound levels, and all at a substantially lower price point. There's really nothing to dislike about this turn of events.

Test System: Core i7 980x, Asus Rampage III Extreme, 6GB OCZ DDR3, Seagate 750GB 7200.11, Windows 7 64-bit. Video card drivers: Catalyst 10.11 and Forceware 263.09.


Wednesday, December 8, 2010

The Future of A.I. in Gaming

Artificial intelligence in games has matured significantly in the past decade. Creating effective AI systems has now become as important for game developers as creating solid gameplay and striking visuals. Studios have begun to assign dedicated programming teams to AI development from the outset of a game's design cycle, spending more time and resources on trying to build varied, capable, and consistent non-player characters (NPCs). More developers are also using advances in AI to help their games stand out in what has already become a very crowded marketplace, spawning a slowly growing discussion in the industry about redefining game genres. Think tanks and roundtables on advances in game AI have become prominent at the annual Game Developers Conference (GDC), while smaller AI-dedicated conferences such as the annual Paris Game AI Conference and developer-run online hubs such as AiGameDev.com are garnering a big industry and community following. While industry awareness about the significance of AI in games continues to grow, GameSpot prompted Matthew Titelbaum from Monolith Games, Remco Straatman from Guerrilla Games, and AiGameDev.com founder Alex J. Champandard to share their thoughts on the future and growth of game AI.

The Halo franchise is recognised as a leader in the field of game AI.
Unlocking new possibilities

While faulty AI is easily recognised, an AI system that is doing its job often goes unnoticed. No one stops halfway through a level to admire the idiosyncrasies displayed by NPCs unless they are doing something completely out of character--the more unremarkable, the better the AI system. While achieving this result is still a priority for game developers, making games with an AI system that stands out for being good is a relatively new concept: few studios want to dedicate costly man-hours to chasing innovation in a highly technical field that, for the most part, is likely to go unnoticed. However, there are some exceptions. In 2007, AiGameDev.com launched its annual game AI awards, nominated and voted on by the site's community. The purpose of the awards was to spotlight the games that showed promise in the field of AI, either by trying something different or by exhibiting technical proficiency. In 2009, the Best Combat AI and the overall Best Game AI awards were won by the same studio--Guerrilla Games for Killzone 2. Remco Straatman, lead AI programmer at Guerrilla, says a lot has changed in game AI in the last five to 10 years, with more developers trading low-level scripting for more advanced NPC decision systems.

"In general, I think game AI has gone from the stage where it was an achievement if it did not stand out negatively to the point where AI in most big games is solid, and some titles are using innovative new ideas," Straatman says. "More development teams have also moved from simple state machines to behaviour trees and using planners in NPC AI systems describing knowledge of the world around the NPCs have improved with better knowledge for navigation over changing terrain, and more knowledge about strategic properties of the world such as cover. I also think advances in animation systems with better ways to combine various animations and physics have become available, which now allows for more realistic movement and responses to being hit [in combat AI]. Most of these systems were not around 10 years ago or simply could not run on the hardware available."

Creating a solid game AI system involves successfully networking smaller systems together. For example, a system that deals with the problem-solving capabilities of individual NPCs goes hand in hand with a system that makes sense of the gameworld and its parameters and helps NPCs make relevant decisions. Thankfully, developers don't have to build these systems from scratch: they use specific planners that generate increasingly complex networks.

"At the moment [Guerrilla Games] is using a specific type of planner for our NPCs called Hierarchical Task Network (HTN)," Straatman says. "This is capable of generating more complex plans than what we had before Killzone 2. We also keep on improving things like the CPU performance, which means we can support more NPCs in Killzone 3 than we could in Killzone 2. The terrain-reasoning systems we generate have also evolved over our various titles. We are now able to deal with much more dynamic terrain (like obstacles moving around or changing shape) than ever before. Our data on where there is cover has also become more detailed, something that allows NPCs to deal with more complex environments such as multistory buildings, etc."

Killzone 2's lead AI programmer Remco Straatman believes the industry is still struggling to make NPCs as human as possible.
Back when Straatman and Guerrilla began work on Killzone and Shellshock, the team’s goal was to make the AI system as capable of making its own decisions as possible, realising this would make things all the more fun for players. However, doing this in a consistent way proved to be a lot more work than the team anticipated, particularly when dealing with combat AI. While the goal of normal AI is to emulate the real-life behaviour of a particular nature (for example, doctor, civilian, or shopkeeper), combat AI works very differently. Firstly, its main objective is to be as entertaining as possible. In some cases this means being efficient at killing players; in other cases, it's more about making intentional mistakes and "overacting" by way of signalling to players what is about to happen.

"Where normal AI tries to emulate an expert medical specialist or world champion chess player, game combat AI is more like emulating an actor," Straatman says. "At the end of Killzone 2 we found ourselves looking at the NPCs doing things that we did not expect, and this surprised us positively. Reviews and forum feedback confirmed we had at least partly achieved the vision we had so many years back, and people playing the game recognised and appreciated it."
One of Killzone 2's most commended features in the field of AI was the game's skirmish mode. Because this mode is more team-based and tactical than the single-player campaign, the AI bots in this part of the game need to do more than simply run around and kill one another. Guerrilla based the skirmish AI in Killzone 2 on the real-time strategy model, building two levels of AI on each individual bot. The first is a commander AI, which controls overall strategic decisions; the second is a squad AI, which translates the commander AI's orders into orders for the individual bots. The team then taught the bots how to use the in-game badges as part of the order given to them by the squad. For example, if an engineer bot is ordered to defend an area, he will first build a turret at a tactical position before starting to patrol. While some might argue that AI bots no longer play as important a role in multiplayer games--given that most gamers now play online--Straatman says bots improve gameplay and give players a chance to test out multiplayer strategies before going up against other human players.
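The two-tier structure described above, with a commander AI issuing strategic orders and a squad AI translating them for individual bots, can be sketched very compactly. Everything below is a hypothetical illustration of that layering, not Killzone 2 code, down to the engineer-builds-a-turret example taken from the article.

```python
# Two-tier skirmish AI sketch (hypothetical structure, not Killzone 2 code).

class CommanderAI:
    def strategic_order(self, world):
        # Strategic layer: e.g. defend whichever objective is contested.
        return ("defend", world["contested_area"])

class SquadAI:
    def __init__(self, bots): self.bots = bots
    def issue(self, order):
        # Squad layer: translate the commander's order into per-bot orders.
        verb, area = order
        return [(bot, bot.interpret(verb, area)) for bot in self.bots]

class Bot:
    def __init__(self, badge): self.badge = badge
    def interpret(self, verb, area):
        # Individual layer: the badge/class colours how the order is executed.
        if verb == "defend" and self.badge == "engineer":
            return ["build_turret", f"patrol:{area}"]  # fortify, then patrol
        return [f"{verb}:{area}"]

squad = SquadAI([Bot("engineer"), Bot("assault")])
orders = squad.issue(CommanderAI().strategic_order({"contested_area": "bridge"}))
for bot, actions in orders:
    print(bot.badge, actions)
# engineer ['build_turret', 'patrol:bridge']
# assault ['defend:bridge']
```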

"They give people a testing ground for real multiplayer--getting to know the maps and the game modes in a game against human players can be too much to start with."

According to Straatman, the area that needs the most improvement in the game AI field is buddy AI. Because buddy AI systems often have contradictory constraints, getting this system right is a big challenge: the buddies should be visible and close to the player but not get in his line of fire; they should stay close and respond to the player's movement but not move around all the time; and so on. Buddy AI is also much closer in view to players than enemy AI, making any errors easier to spot.

"Enemy NPCs know what other NPCs of the same faction are going to do because they are all computer-controlled and can tell each other what they will do next. However, players are much harder to predict--if you would look at movement patterns of players, you will see they are quite strange at times. This is made worse by the fact that player turn rates, movement speeds, and acceleration are very high. The last point is the expectation of the player: enemies are only supposed to shoot at you, whereas buddies are supposed to fight and interact with you in a sensible way. We are working hard to make the buddies work better, because we feel that they can add a lot to the player experience when done right."

The AI director in the Left 4 Dead games is an example of how developers can use AI to reach beyond traditional individual NPC behaviour.
Straatman believes the struggle to make NPCs as human as possible is still very much at the top of the list for many AI programmers, with the future set to change the way we think about in-game interaction.

"The ideal is always to immerse the player in the game: the NPCs should feel like they are living and breathing creatures, and this illusion should not be spoiled anywhere. Within the relatively limited interaction you have in a game, it may be achievable to make the distinction very small. I think human behaviour is so interesting, and yet subtle interactions such as conversations are still out of reach of autonomous AI; games rely on clever scripting or cutscenes to get that across. If we as a field will master these types of interactions, more parts of the game can be interactive, and possibly whole new game genres may become feasible."

"I think this will make games more approachable and immersive. If we are able to maintain the immersion by having realistic behaviour in the interactive parts of the game, you will get a seamless experience from cut scenes to combat. I also think we are ready to use AI for more than just individual NPCs--the director system in Left 4 Dead is one interesting first step in that direction. We probably will see more combinations of AI systems that before were limited to one type of game: RTS games will have unit AI that will come closer to what you now see in first-person shooter games. MMOs could also start using more elaborate AI, potentially even to command hordes of NPCs. I hope we will see some brave studios try to create these new systems that are now becoming possible."


Wednesday, December 1, 2010

Social Anxiety

(FOUND THIS ON GAMESPOT) 

Designers behind Civilization, Super Meat Boy, Spore, and more weigh in on the Facebook gaming phenomenon and the morality of social gaming mechanics.

At the risk of understatement, social gaming is huge. The phenomenon of free-to-play, microtransaction-supported games has grown exponentially in recent years, to the point that the estimated worth of leading social publisher Zynga was pegged at $5.51 billion, overtaking that of traditional publishing giant Electronic Arts earlier this month.

Every month, more than 360 million people play Zynga games like FarmVille, Mafia Wars, and FrontierVille through Facebook, MySpace, and iOS devices. To put that number in perspective, that's more than the total number of votes in the last three US presidential elections combined, with more than 1 million to spare.

While social publishers like Zynga, Playdom, and Playfish are presiding over a period of explosive growth in the gaming industry, they are also the cause of much consternation in the development community, partly because of the way the free-to-play business model impacts design choices in these games. The most successful social games to date have used very simple gameplay mechanics, encouraging neither strategy nor dexterity but regular interaction with the game. Although free to play, the games also typically have a microtransaction component, where players can spend real money for in-game items or performance boosts.

Players can also reap some of the same rewards by recruiting their friends to sign up for the game, with each new user giving the developer another potential microtransaction customer. Those transactions aren't always of the "micro" variety, either. In his Game Developers Conference 2009 keynote address, Playfish cofounder Sebastian de Halleux talked about one of Pet Society's more popular items, a sofa shaped like lips that costs players $40 worth of virtual currency.

Addictive or Exploitive?

Although undeniably successful, the existing social game framework has been the subject of much debate among game developers from every corner of the game industry, from the mainstream to the indie community. Some, like Super Meat Boy creator Edmund McMillen, are particularly strident in their assessment.

"Social games tend to have a really seedy and abusive means of manipulation that they use to rope people in and keep them in," McMillen said. "People are so tricked into that that they'll actually spend real money on something that does absolutely nothing, nothing at all…There's a difference between addicting and compelling, and I think all designers want to push toward compelling. Crack is addicting, but it's not a fun game. It's a bad thing. It feels good when you're doing it, I'm sure, but it's not something you want to brag to your friends about doing. It's the difference between bragging to your friends about being addicted to running and being addicted to crack. It's, 'Man, I just ran a marathon and I'm getting better,' versus, 'Man, I just did crack for a week, and now I want so much more.'"

Sid Meier knows a thing or two about addicting and compelling games. His celebrated Civilization series of turn-based strategy games is notorious for sucking gamers in, so much so that the latest installment was promoted with a series of "CivAnon" video shorts. The clips featured rock-bottom accounts from members of a fictional 12-step program for gamers hooked on the series. Despite the marketing, Meier is hesitant to criticize games--his own or others--for being addictive.

"I think that's just the wrong word," Meier said. "It's fun to play. As game designers, we want to make an experience that you want to continue to play and play again and replay. So I'm hesitant to make that a bad thing: that games that are fun, that games are things you want to do, that you want to keep doing. Because that's our goal: to create a great experience. I just want to be careful that we don't make [it] a negative that games are too good. 'They're too much fun, they're too compelling!' Games should be fun. They should be compelling. They should make you want to play."

That's the goal Meier has for his current project, a Facebook-exclusive version of Civilization. While the developer hasn't detailed exactly how the game will work just yet, chances are it will be a more straight-faced attempt at a social game than the first effort of Ian Bogost, associate professor in the School of Literature, Communication, and Culture at Georgia Tech and cofounder of Persuasive Games.

Microtransactions, Macro Trepidation

Bogost was so put off by the trend that he created a satirical Facebook game of his own. Cow Clicker is a game that gives players a single cow that they are allowed to click on once every six hours, and it tallies the total number of clicks made. By purchasing the game's "Mooney" with microtransactions, players can click their cows more often or swap them out with premium cows costing as much as $500 worth of in-game currency.
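The mechanics described above, one free click every six hours, a running click tally, and paid clicks bought with "Mooney", reduce to a very small data model. The sketch below is a hypothetical reconstruction for illustration, not Bogost's actual code, and the exchange rate and premium price are invented.

```python
# Toy model of the Cow Clicker loop described above (hypothetical sketch).
import time

COOLDOWN = 6 * 60 * 60  # one free click every six hours, in seconds

class CowClicker:
    def __init__(self):
        self.clicks = 0
        self.mooney = 0
        self.last_click = -COOLDOWN  # allow an immediate first click

    def click(self, now=None):
        now = time.time() if now is None else now
        if now - self.last_click < COOLDOWN:
            return False             # still on cooldown: wait or pay
        self.clicks += 1
        self.last_click = now
        return True

    def buy_mooney(self, dollars):
        self.mooney += dollars * 100  # made-up exchange rate

    def premium_click(self, cost=50):
        if self.mooney < cost:
            return False
        self.mooney -= cost           # paying skips the cooldown entirely
        self.clicks += 1
        return True

game = CowClicker()
print(game.click())          # True  -- the free click
print(game.click())          # False -- blocked by the six-hour cooldown
game.buy_mooney(5)
print(game.premium_click())  # True  -- the microtransaction bypasses the wait
```

That the entire "game" fits in a few dozen lines is, of course, Bogost's satirical point.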

If players can get their friends to sign up for Cow Clicker as well, they can share clicks and earn rewards more quickly, like a bronze cowbell for the fashion-forward bovine. Bogost said most social game players will never give the developers a dime in microtransactions, which leads to the industry mimicking an uncomfortable business model.

"There are certain industries in which the majority of revenues come from the minority of the customers," Bogost said. "Without citing numbers, it is generally incredibly sinful, morally questionable industries that are like this: alcohol sales, gambling, and tobacco. And we might want to ask ourselves what we think about that. When you have a game that does not have a spending cap and the vast majority of revenue is coming from a minority of players, 10 percent of players generating 90 percent of revenues, how do we feel about that? It's not a simple question, but it is something I think can't simply be brushed under the rug. We can't say, 'Well players will do what they want, and it's none of my business how they spend their free time.' A lot of game developers take that position. I think that's unfortunate."

Bogost also took issue with the way social games treat friends as resources, saying there was "some violence" involved in the process. It's something people do all the time, he admitted, in networking to find a job or getting a message out to a broader audience. But he worries many Facebook games do too little to actually cultivate or strengthen friendships.

"These games seem to consider their friends as mere resources rather than individuals with whom they want to develop sophisticated and expanded relationships," Bogost said. "They rely on compulsion. They prey upon the time we spend away from them. They have this insidious quality of being able to buy out of playing the game entirely through microtransactions. These things bother me, personally, as aesthetics."

Not all the social game concerns are ethical in nature. Chris Hecker, Spore developer and creator of the upcoming SpyParty, sees social games as a potential threat to the long-term health of games. Echoing a presentation he gave at this year's Game Developers Conference, Hecker expressed concern with games that rely on external enticements to keep players engaged. According to Hecker, psychological research suggests that rewarding people for a task like playing games--as with achievements and trophies, or the aforementioned bronze cowbell--can cause them to derive less enjoyment from that task.

"My worry is from the player's standpoint. If the research carries over to gameplay as it does in other [fields], it will actually turn people off games in the long run. It emphasizes the shallow, dumb, non-interesting tasks, and it decreases motivation for interesting tasks that might be intrinsically motivated."

Hecker said his hope for games is that they become the preeminent art form of the 21st century in the same way film was for the 20th century. His concern is that the industry is engaging in trends now that will hold it back from achieving that goal in the future.

"The way you become the 'preeminent art form of the 21st century' is not by giving people more achievements and stuff," Hecker said. "It's by making deeper and more compelling games."


While some might find the social gaming model insidious, other designers find the nascent nature of the genre downright inspirational. One such creator is Spry Fox cofounder and chief creative officer Daniel Cook, whose career has been following an increasingly experimental trajectory since breaking into the industry with Epic Games' 1995 PC shoot-'em-up Tyrian. During a stint with Microsoft, Cook worked on titles like the live game show 1 vs. 100, and for Spry Fox, he has developed the browser-based Steambirds and games for Amazon's Kindle e-reader (Triple Town and Panda Poet).


"I personally adore the microtransactions model," Cook said. "To me, there's always been something fundamentally dishonest about the way retail works. Most games are purchased without reading the reviews. There's a box on the shelf and someone spent an insane amount of marketing dollars to get someone to look at the pretty picture on the box and buy the game. As a game designer, I would much rather have someone try my game for free, and if you like it and find value there, pay a little bit of money. I'm absolutely in love with that model."

Taking Measure of Metrics

Another benefit Cook has spotted in the social gaming model is the abundance of metrics available. It's easy for developers to make minor changes in social games, take them live for a short period of time, and get detailed data on exactly how the player base reacted.

"A lot of game design historically has been designing in the dark," Cook said. "You don't know what people think, and more importantly, you don't know what they're going to do. The metrics give us very up-to-date, rapid information on areas of gameplay that we never would have had insight into previously. It's like someone is turning on a light bulb for the whole design process."

Cook acknowledged the possibility that some designers may rely too heavily on metrics, but he said they were just tools to be used judiciously, like focus groups or a designer's own instincts.

"I've seen intuition create incredibly horrible games, and I've seen metrics create incredibly horrible games," Cook said. "If you use the tools badly, then yes, they will lead you in the wrong direction."

Meier downplayed the impact metrics have on the overall design of his games, saying metrics primarily help with small-scale optimization.

"Most of our design decisions are pretty big and broad," Meier explained. "Take this in, put this out, double that. The idea that we need to add 3 percent to this or make this green instead of blue…those aren't the kind of decisions we focus on to make the game. Our game is based on big ideas, fun concepts, and interesting ways to play a variety of strategies, so metrics are not really at this point a big part of our game design process."

McMillen said Super Meat Boy was designed without the aid of metrics. And while there was one focus testing session for the game, most of the feedback was thrown out. (One tester suggested that having a static loading screen would be preferable to a cutscene that couldn't be skipped for the first few seconds because the level was loading in the background.)

"The funny thing about [metrics] and business in general is the idea that they think they're perfecting something and they're going to be more successful by perfecting it," McMillen said. "When in reality, I guarantee you something will come out in the next few years that will beat out these games, and it will be something nobody knows about, and something nobody knows they wanted, because the thing that people really want is something they don't know exists."

Looking Forward

One common theme expressed to varying degrees by all of the developers above is that the notion of a social game isn't inherently broken and that things could get better.

"I'm sure you can make a responsible, fun, even competitive Facebook game," McMillen said. "Facebook has the ability to become like an Xbox Live system, where if a good game came out and was fun like a Geometry Wars type game and you could compete for high scores, I think that would be just as successful and a more responsible gameplay experience."

Describing himself as naturally optimistic, Hecker focused on how far Facebook games have already come.

"If you look at Facebook before Zynga and Playfish and all these guys, they were selling apps that let you put fake vomit on your friend's wall," Hecker said. "There was no gameplay whatsoever; it was just junk you would spam on your friends' walls. And the game developers basically put all those guys out of business by making just the simplest games. And that gives me some hope because it means that maybe deeper gameplay will steamroller over the current list of really shallow games. It's not a huge step forward, but it seems like it's going in the right direction."

Finally, Meier took a long-range perspective on the issue, like a Civilization player on the first turn, looking at a lone settler unit on barren plains and hoping in time to turn it into the heart of a globe-spanning empire.

"If you're not thrilled with the current crop of games, we're still in the early experimental phase," Meier said. "The first computer games weren't all that awesome, and the first networking games weren't like what we're seeing today. Give us a little time to explore this technology and see what we come up with."
