Game development has been going on for decades, and with each release, play testers and players alike find their share of bugs and glitches. But how many bugs are acceptable, and should those who make games own up to their programming shortcomings by offering quick updates and patches to consumers? On top of that question: should those fixes be free?
Opinions run both ways. Many players argue that games should arrive relatively bug free, while developers tend to side with one another, commenting that modern games are impossible to keep bug free and that they do not intend to weed out each and every bug that occurs. So where is the happy medium?
First, a history lesson from my childhood. I loved Mario, Nintendo, Nintendo Power magazines, and almost any video game I could get my hands on. I guess I was lucky, because most of the games I played and owned were bug free, or at least I never encountered any during my sleepless nights and attempts to dodge homework for more time with the Italian plumber. I came to expect that games always worked. Looking back, that may have been because fewer games were produced then, and those games tended to be relatively simple next to the massive projects being released now in 2011, or even ten years ago.
I recently read an article about a new game released for console and PC, and I concluded that the response given in the interview was the happy medium I spoke of above. The interviewee said, in a general sense, that developers worth their salt, those looking to move forward in game design and produce quality work, should be concerned about bugs and glitches and ought to protect their investment by keeping users happy with free updates and patches if need be. But he added that responsibility also lies with the consumer to treat a game like a piece of software, because that is what it is. He went on to explain that software is written, tested, broken, repaired, and tested again, and that it should be treated with care by those with shallow pocketbooks. Most computer software users at least read some reviews or shop around to find out whether the software is worth the money before they purchase it. His point is that a game, like traditional PC software, should be worth what you pay for it when it comes out, and that it is the responsibility of users to report bugs through forums or another medium supported by the developers, so the developers can carry out their responsibility to repair the issues that threaten that question: is it still worth the money?
I used to think that developers should keep games bug free, or at least free of major glitches, but I have since come around to the view of the developer in that interview. It only makes sense to be patient, read some reviews, and participate in forums or blogs to learn more about your $60 purchase. If a game is full of bugs and the large publications give it a poor score, then simply put, don't pick it up, or wait a while and see whether any support comes out for it.
Games are being produced far faster than they were when I was a kid, and the demands and expectations of myself, my peers, and many other gamers are increasing as well. That being said, bugs and glitches will always be a part of our games, especially around new technology, as gaming continues to evolve onto more powerful consoles.
Who should pay for your character getting stuck? You be the judge and post your opinion!