I started reading this article with a very favorable bias - but the very first point turned me off.
The article says: "Many studios made the leap from PlayStation 1 to PlayStation 2 development without stopping to think whether it was a good idea... Responsible businesses would not have made the leap instantaneously."

Tadhg Kelly should know better. Remember the switch from 16-bit to 32-bit in 1995? Or the 8-bit to 16-bit revolution before that? A lot of people bet on the existing install base rather than switching development over to the new platform. They lost their shirts.
It was a strange phenomenon that baffled some analysts. Towards the end of a machine's life, EVEN BEFORE THE NEW CONSOLE WAS OUT, people quit buying games for their old systems. Maybe they were saving their money for the new machines & new games. Maybe they did what I do and picked up used copies of hit games from two years ago. Maybe the stores were just being very conservative in their orders as they made space for the big new sellers, which were sure to sell TONS because people were so anxious to build up a library for their new box. Whatever the reason - those who relied upon that huge existing user base to support them ended up losing their shirts.
I don't know if that happened to anyone during 2000 - 2001, but businesses learned from the experiences of the past, and no doubt expected that trend to continue. Unless some fundamental shift in customer purchasing habits takes place, it's still going to be a problem in 2005. If you plan on shipping a PS2 game in 2006, expect sales to be dismal.
The article also says: "Customers don't, on the whole, need the most advanced technology in their games in order to buy and enjoy them. They just need good games, need to know that they exist, and find them appealing."

Yeah, they don't NEED them. But try telling THEM that. Customers vote with their wallets - and I've been burned for being on the wrong end of the technology spectrum when I was working for a major publisher on retail games. The driving factor in consumer sales in the 'core' retail market has been GRAPHICS. The eye-candy is what gets them in the door - beautiful screenshots, interesting vistas, cool special effects, and of course sexual titillation and gratuitous violence are usually a draw. Things like gameplay, story, and characters might keep them playing and talking about your game, but it's the graphics that gets them in the door.
You really have four choices:
1) Write your game for the early-adopter set, showing off their cool new hardware
2) Write your game for maximum compatibility - it'll run on almost anything, but it looks like crap compared to the competition
3) Try to go middle-of-the-road - something that looks 'okay' but not great on the top-end machines, while still keeping reasonable compatibility with lower-end tech
4) Try to achieve both #1 and #2 with a graphics system that scales hugely - which effectively means twice the effort, since much of the content has to be re-done from scratch to make the most of each end's limitations and capabilities (see the sketch after this discussion for a rough idea of what that looks like)
#4 is pretty ridiculous, though it has been tried. More often it morphs into #3 partway through development.
#1 cuts out everything but the high end of your market, and as the article states, it's hugely expensive. The advantage is that the market you are keeping is the market that has HISTORICALLY (that's changing now, thank goodness) had the most money to spend, and has proven their willingness to throw great gobs of money at games that let them show off their latest hardware.
#2 cuts out the high end - whether you like it or not. Sure, they can play your game, but 99.9% of them won't give you the time of day. It takes a really stellar game to make them look past the graphics. This may come off sounding really harsh, but it's the industry's attitude: you are selling to the cheapskates who won't even fork over the cash to buy a new videocard. Hey - that's me... I'm still running a 2 gig box with a GeForce 4 - an antique by 'core' gamer standards. HOWEVER - this market is proving that once you can actually find them and market to them, they will buy games. Lots of 'em. So the problem is starting to go away. But we're not there yet.
#3 gives you all of the problems of #2 without many of the associated advantages. You run a real risk of ending up in the 'worst of both worlds.'
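To make option #4 concrete, here's a minimal sketch (in C++, and purely hypothetical - the thresholds, field names, and scoring are mine for illustration, not from any real engine) of the kind of tiered settings selector a scaling graphics system gets built around:

```cpp
#include <iostream>

// Hypothetical knobs a scaling renderer would expose per tier.
struct GraphicsSettings {
    int  maxTextureSize;   // largest texture dimension, in pixels
    int  shadowMapSize;    // 0 = shadows disabled entirely
    bool perPixelLighting; // fall back to vertex lighting when false
};

// Map a rough hardware score to a preset. A real engine would derive
// the score by probing VRAM, shader support, fill rate, and so on.
GraphicsSettings pickSettings(int hardwareScore) {
    if (hardwareScore >= 80) return {2048, 1024, true}; // high-end tier
    if (hardwareScore >= 40) return {1024, 512, true};  // mid-range tier
    return {512, 0, false};                             // low-end tier
}

int main() {
    // Pretend the hardware probe scored this machine a 35 - a low-end box.
    GraphicsSettings s = pickSettings(35);
    std::cout << "max texture: " << s.maxTextureSize
              << ", shadow map: " << s.shadowMapSize
              << ", per-pixel lighting: " << std::boolalpha
              << s.perPixelLighting << "\n";
    return 0;
}
```

Picking the settings is the cheap part. The "twice the effort" comes from the content side: each tier really wants its own art - textures, models, and effects authored for its limits - not just the high-end assets scaled down.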
It's an ugly situation. I think I'm happier to be an indie right now not trying to go after the really big bucks (not that I'd complain if I stumbled into 'em). However, times are changing - so are these problems.
First of all, the 'casual' audience is growing by a huge amount - the more we can start catering and marketing to the low end, the less pressured everyone will be to stay at the bleeding edge. Including customers... I think.
The other thing is that I believe each succeeding generation of hardware makes a less impressive leap than the last, even to the most hardcore gamer. That means we're coming up against the law of diminishing returns, which may finally allow us (as an industry as a whole - I'm not really talking about indies here) to throttle back a bit on our supersonic climb in development costs. Once tools, engines, & pre-generated content options have a chance to catch up and mature, we'll hit the point where the average customer has a tough time telling the difference between games whose budgets were an order of magnitude apart. When that happens, we will have arrived.
Nintendo's already figured this out - except their solution was to create new gimmicks to replace flashier graphics. Maybe they're right, but I like to think they're wrong. After all, they were wrong before... back when they thought catering to their install base would be far more lucrative than keeping pace with technology. Maybe this time the tide will finally turn.
In the meantime, this article is great in theory, and represents ideals I'd love to see happen, but doesn't have much grounding in reality.