Confused on today's low end spec

Discussion in 'Game Development (Technical)' started by JGOware, Aug 8, 2008.

  1. gmcbay

    Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    280
    Likes Received:
    0
    Attempting to hit extreme min spec at the cost of some basic level of special effects or even just at the cost of increased developer time (as has been mentioned previously in this thread, it is actually way easier to do a lot of 3D programming tasks using the shader model, despite the "advanced" stigma they seem to have) is one of the negative legacy memes of the old Dexterity Software articles, IMO.

    Yes, there are people out there running 7 year old computers... but how many of them are in the position to actually shell out money for your game? If they are running 7 year old computers they are either extremely cheap or extremely poor, neither of which makes them bad people, but either of which makes them a very unlikely potential customer.
     
  2. ManuTOO

    Original Member

    Joined:
    Aug 9, 2004
    Messages:
    344
    Likes Received:
    4
    I totally agree with gmcbay.

    The unity3d stats lack one very important thing: the country of the surveyed people.
    My guess would be that the 30% of people without shader support are mainly from poor countries where they don't have the means to buy your games anyway (either because they don't have enough money, or simply because they don't have a credit card).

    The same stats, but for _paid_ customers only, would be great... :)
     
  3. Bad Sector

    Original Member

    Joined:
    May 28, 2005
    Messages:
    2,742
    Likes Received:
    5
    You forget something, however: some people will not buy new hardware if they have no reason to do so. If most of these people use Office and mail, browse the internet, and maybe play an occasional game now and then, they don't have to get new hardware to do so. A computer made 5 or even 7 years ago can do that (and while the latest version of Office may need more resources than the previous versions, again, they don't have to use the latest version).

    And I'm sure that nobody but the hardcore gamers will get a new graphics card, CPU or more memory to play a $20 downloadable game. Besides, if we're talking about 5-year-old hardware (or even 3-year-old hardware in some cases), a user cannot simply swap in a new graphics card or CPU. Today's major parts (like video cards) aren't compatible with hardware made 3-4 years ago, especially with cheap hardware. To upgrade your average 3-4 year old PC today with a new video card you need at least a new motherboard, which in turn will need a new CPU (because the sockets will most likely be different), and if your computer is a little older it might even need new RAM. In some cases you might even need a new PSU. That will cost you around $200 or more.

    And this is assuming you know how to upgrade the PC yourself, of course. If you don't, you'll have to pay a little extra for service ($20-$40 is a normal fee around here), although in my experience most salesmen will try to convince you to get a new PC instead (which in the cheapest case - and assuming you know where to look - won't cost less than $300).

    I doubt someone will be willing to spend $300 (or $200/$220/$240) just to be able to play a few more downloadable games if his old computer otherwise does the job (amazingly, with the exception of games and realtime graphics software, most computers made since 2000 can do a whole lot of stuff).

    However this same person is more likely to spend $20 for a game ($20 is much less than $300).

    EDIT: and let's not forget the people who prefer to use a laptop. Those aren't very upgradeable either.
    EDIT2: if you want low-end hardware, get an EeePC 701 4G Surf. They're as low as you can find in stores these days.
     
  4. vjvj

    Indie Author

    Joined:
    Sep 25, 2004
    Messages:
    1,732
    Likes Received:
    0
    I was going to raise this exact analogy in one of my previous posts... But I elected not to, to avoid the "graphics elitist" persona I've been ever so closely inching towards in this thread :)

    I have an Eee and use that as my min-spec testing machine. It supports SM 2.0 (not arguing with you, just throwing another data point out there). Our action game is currently CPU-bound on the Eee at about 70-80fps, so we've got plenty of room to draw more... Hahaha :)
     
  5. gmcbay

    Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    280
    Likes Received:
    0
    I don't doubt there are people out there who fit your description of having very old computers and yet still spend $20 on a game from time to time. My point isn't that they don't exist, just that I believe they are such an incredibly small percentage of the potential market that they honestly aren't worth worrying about.

    All things being equal, of course you should try to keep the min spec down as much as you can, but I believe a lot of indie authors go extremely overboard here, spending an inordinate amount of effort keeping the min spec low, with the end result being a bad ROI on their development time.
     
  6. Bad Sector

    Original Member

    Joined:
    May 28, 2005
    Messages:
    2,742
    Likes Received:
    5
    The thing is, you don't actually have "hard data" about that (nor do I for the opposite), so the best you can do is plan for the worst case, which is that if you use pixel shaders, 30% of your potential customers won't be able to play the game (if we believe the Unity data, of course).

    Note that I'm not saying don't use pixel shaders; just don't rely on them. Make them optional and write a fallback renderer. Besides, a fallback renderer can be as reusable as a shader: write it once and use it for many games.
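    A minimal sketch of that "make it optional" structure, assuming a capability flag queried at startup - all names here (`Renderer`, `createRenderer`, `supportsSM20`) are hypothetical, not from any real engine:

```cpp
#include <memory>
#include <string>

// Hypothetical renderer interface: the game talks to this, never to the
// graphics API directly, so a shader path and a fixed-function fallback
// can be swapped at startup without touching game code.
struct Renderer {
    virtual ~Renderer() = default;
    virtual std::string name() const = 0;
};

struct ShaderRenderer : Renderer {
    std::string name() const override { return "shader"; }
};

struct FixedFunctionRenderer : Renderer {
    std::string name() const override { return "fixed"; }
};

// Pick the best path the hardware supports; in a real engine 'supportsSM20'
// would come from a capability query at startup.
std::unique_ptr<Renderer> createRenderer(bool supportsSM20) {
    if (supportsSM20)
        return std::make_unique<ShaderRenderer>();
    return std::make_unique<FixedFunctionRenderer>();
}
```

    The game code only ever sees the interface, so the two paths can diverge internally without the rest of the engine caring which one is active.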
     
  7. Applewood

    Moderator Original Member Indie Author

    Joined:
    Jul 29, 2004
    Messages:
    3,859
    Likes Received:
    2
    Leaving thread now. It seems people with a bone to pick can't digest the concept of "optional"...
     
  8. gmcbay

    Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    280
    Likes Received:
    0
    Writing a fallback renderer is like the worst thing you can do, ROI-wise, IMO. My point is that as programmer geeks we all want to create the most robust solution that could possibly exist. We want to abstract out the graphics completely so we can have some super snazzy plugin rendering system that can render to DX, OGL (even if we only ever intend to release on Windows), and some super spiffy software rendering we also write. Creating an elegant solution to this sort of problem calls out to us like a siren's song. But actually doing all of that takes a MASSIVE amount of time. Huge amounts of time to develop, huge amounts of time to test, huge amounts of time to maintain.

    I am of the opinion that that sort of time usage is really not worth it from a business perspective. It scratches an itch we feel as developers, but even if it opens up 30% of the market, it still isn't worth it, because it is going to cost much more than a 30% increase in development time, and that time is going to be spent supporting people who I believe are much less likely to buy your game than the other 70%. To top it off, it is time spent creating solutions that are already legacy as soon as they ship. Whereas if you build your code solely on the shader model, the number of people who can't run your games gets smaller and smaller over time with no effort required on your part, and you get the immediate benefit of less development effort up front on the nuts-and-bolts engine of your game to get the same results as you would using the fixed-function pipeline (or worse, some architecture-astronaut solution that caters to both models).

    Clearly this is somewhat of a religious issue and I'm unlikely to change any minds on this, and Applewood probably has the right idea by just walking away from this, but I wanted to clarify my point at least a little more before I let it drop.
     
  9. Bad Sector

    Original Member

    Joined:
    May 28, 2005
    Messages:
    2,742
    Likes Received:
    5
    Writing a software renderer for rotating and scaling 2D images shouldn't take you more than two days. Optimizing it to the point where it runs on EeePC-level hardware shouldn't take more than a week.

    And there is always the abnormally fast SwiftShader, a Direct3D 9 reimplementation in software which "performs between 50 and 100 times faster than Microsoft's Direct3D® Reference Rasterizer in tests with sample applications and can achieve performance that surpasses low end integrated graphics hardware in many cases." (quote from the site) and has Pixel Shader 2.0 support. I tried it with some older 3D games (you just drop the dll in there) and it works.
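    The two-day rotate/scale renderer described above boils down to one inverse-mapping loop: walk the destination pixels and sample the source at the inverse-transformed position. A minimal, unoptimized C++ sketch with nearest-neighbour sampling (function name hypothetical):

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// For each destination pixel, apply the inverse rotation/scale and sample
// the source (nearest neighbour). Fixed-point stepping along scanlines and
// proper clipping are where the optimization week goes.
void rotateScaleBlit(const std::vector<uint32_t>& src, int sw, int sh,
                     std::vector<uint32_t>& dst, int dw, int dh,
                     float angle, float scale) {
    float cosA = std::cos(-angle) / scale;  // inverse of the forward transform
    float sinA = std::sin(-angle) / scale;
    float cx = dw * 0.5f, cy = dh * 0.5f;   // rotate about the image centres
    float scx = sw * 0.5f, scy = sh * 0.5f;
    for (int y = 0; y < dh; ++y) {
        for (int x = 0; x < dw; ++x) {
            float dx = x - cx, dy = y - cy;
            int sx = (int)(dx * cosA - dy * sinA + scx);
            int sy = (int)(dx * sinA + dy * cosA + scy);
            if (sx >= 0 && sx < sw && sy >= 0 && sy < sh)
                dst[y * dw + x] = src[sy * sw + sx];
        }
    }
}
```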
     
  10. gmcbay

    Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    280
    Likes Received:
    0
    I guess part of the problem here is that we aren't on the same page because I was under the assumption we weren't limiting the discussion to 2D games.

    Yes, writing a pretty good 2D software engine isn't super hard. In that case I'd probably just use SDL which already has hardware acceleration with software fallback or one of the other existing solutions like PopCap, et al.

    When you start talking about actual 3D rendering, though, there is a pretty huge difference in the way you structure everything for fixed pipeline vs shader model with shader model code being much easier to write and maintain once you understand the basics of it. That is the case where writing a software fallback and/or multiple hardware API rendering back-ends is a huge undertaking.
     
  11. Mattias Gustavsson

    Original Member

    Joined:
    Aug 10, 2005
    Messages:
    669
    Likes Received:
    0
    I was assuming we were mainly talking about 2D games... there doesn't seem to be many indie 3D games for sale (at least not on the portals, where I guess most of the indie stuff goes). But maybe I'm wrong about that, so feel free to give examples (I'm not that up to date with the portal stuff. I guess XBLA have plenty of 3D titles?)
     
  12. vjvj

    Indie Author

    Joined:
    Sep 25, 2004
    Messages:
    1,732
    Likes Received:
    0
    Don't forget that 3D rendering does not always == 3D gameplay, or even 3D effects.

    If you are using hardware acceleration for 2D without a perspective transform in sight, somewhere in the driver your render commands are eventually being pushed through a 3D interface anyway. Once you start heading down the path of hardware acceleration and its natural benefits (alpha blending, texture filtering, texture wrap/repeat modes, etc.), the PITA of fixed-function is already creeping up on you before you can even say "three" or "dee".

    I think part of the disconnect in the communication here is that many of us are used to working in a fixed function environment and do not see the drawbacks due to lack of an opposing perspective. I think that's muddling the ROI argument that gmcbay has been stressing in this thread.

    This particular issue hits home for me, because we made the decision to go fixed-function with Meridian 59: Evolution and I'm not so sure that was the right decision in retrospect. With all the spell effects in the game, render state management became a nightmare and I'm not really sure the increase in compatibility resulted in any noteworthy increase in subscriptions; at least, enough to make up for the months lost in fixing render-related bugs.
     
  13. Bad Sector

    Original Member

    Joined:
    May 28, 2005
    Messages:
    2,742
    Likes Received:
    5
    Well... if you have a fixed-pipeline render path working, adding software rendering shouldn't be very hard. In fact, the only difference is that you have to write a rasterizer and do T&L yourself. There is a ton of information on the net about writing an optimized rasterizer, and T&L is a few minutes of work (given that you have a 3D math library - but doing any serious 3D work practically requires one :).
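    The "do T&L yourself" part really is small per vertex: one 4x4 matrix multiply, then a perspective divide and viewport mapping. A minimal C++ sketch of that math with row-major matrices (names hypothetical):

```cpp
#include <array>

using Vec4 = std::array<float, 4>;
using Mat4 = std::array<float, 16>; // row-major

// The "transform" half of T&L: one matrix multiply per vertex with the
// combined world * view * projection matrix.
Vec4 transform(const Mat4& m, const Vec4& v) {
    Vec4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            r[i] += m[i * 4 + j] * v[j];
    return r;
}

// Perspective divide plus viewport mapping: clip space -> screen pixels
// (y flipped so +y in clip space is up on screen).
std::array<float, 2> toScreen(const Vec4& clip, int width, int height) {
    float invW = 1.0f / clip[3];
    return { (clip[0] * invW * 0.5f + 0.5f) * width,
             (0.5f - clip[1] * invW * 0.5f) * height };
}
```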

    However, I wouldn't bother with a software rasterizer for 3D games unless they're simple ones. And even then, I'd write the software 3D renderer mostly for the fun factor (for me), not because I really think one is needed.

    (BTW, from my tests I found that in SDL under Windows, RLE-encoding sprites with transparency - colour keying - in software surfaces is A LOT faster than using hardware blitting.)
     
  14. gmcbay

    Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    280
    Likes Received:
    0
    SDL's historical hardware blitting was based on DirectDraw which isn't very well supported by modern hardware. Newer versions of SDL can use OpenGL textured rects for blitting sprites and I'd be surprised if software surfaces were outperforming that, especially if you're color keying or doing alpha blending at all. On the other hand, OpenGL under Windows poses a host of its own problems, particularly with Intel Integrated Graphics chipsets.

    There are far more differences than just basic polygon rendering. If you want to support shadows in your game, for example, the algorithms to do so in a fixed-function pipeline are far more esoteric and way harder to integrate into an arbitrary engine than doing the same with shaders is. The difference in difficulty to get similar results and the amount of coupling required within your game engine code are just completely night and day here when comparing fixed pipeline to programmable pipeline. Ditto for skeletal animation skinning, particle effects, etc. The programmable pipeline model is just SO much easier to work with when doing any of this stuff and the way you structure everything is so different that trying to have any sort of meaningful fallback to fixed pipeline is really a huge pain in the ass (and IMO, a waste of time these days) if you use any of these features.
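    To make the skinning example concrete: linear blend skinning is just a weighted sum of bone transforms per vertex. The loop below is exactly what a vertex shader runs per vertex on the GPU; under fixed function you either run it on the CPU every frame or lean on vendor-specific matrix-palette extensions. A minimal CPU-side C++ sketch with two bones per vertex (names hypothetical):

```cpp
#include <array>

using Vec3 = std::array<float, 3>;
using Mat4 = std::array<float, 16>; // row-major bone matrix

// Linear blend skinning for one vertex: transform by each influencing bone,
// then blend the results by the bone weights.
Vec3 skin(const Vec3& v,
          const std::array<Mat4, 2>& bones,
          const std::array<int, 2>& idx,
          const std::array<float, 2>& weight) {
    Vec3 out{0.0f, 0.0f, 0.0f};
    for (int b = 0; b < 2; ++b) {
        const Mat4& m = bones[idx[b]];
        for (int i = 0; i < 3; ++i) {
            // Row i of m times (v, 1): rotation/scale plus translation.
            float t = m[i * 4 + 0] * v[0] + m[i * 4 + 1] * v[1]
                    + m[i * 4 + 2] * v[2] + m[i * 4 + 3];
            out[i] += weight[b] * t;
        }
    }
    return out;
}
```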
     
  15. Bad Sector

    Original Member

    Joined:
    May 28, 2005
    Messages:
    2,742
    Likes Received:
    5
    SDL obsoleted SDL_OPENGLBLIT surfaces long ago.
    Unless you mean using SDL as a cross-platform "OpenGL initializer" and pure OpenGL for the graphics - but in that case you aren't using SDL for the graphics, you're using OpenGL.


    On shadows: it depends on how you do them. The only 'esoteric' method I've seen is shadow maps. Projected shadows, planar shadows, volumetric shadows and stencil planar shadows are done pretty much the same way in fixed and programmable pipeline (sometimes you can move some calculations from the CPU to the GPU, but the algorithm is the same). Also, when in doubt, use different shadowing methods. If you really can't afford the time to do proper shadows in fixed function, just do blob shadows.
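    The planar-shadow method really is pipeline-agnostic: it's one 4x4 matrix that squashes geometry onto a plane as seen from a point light, usable as a world matrix under fixed function or folded into a shader's transform. A minimal C++ sketch of the standard matrix (names hypothetical):

```cpp
#include <array>

using Vec4 = std::array<float, 4>;
using Mat4 = std::array<float, 16>; // row-major

// Standard planar-shadow matrix: M = (plane . l) * I - l * plane^T.
// 'plane' is (nx, ny, nz, d) for the plane n.x + d = 0; 'l' is the light
// position, with w = 1 for a point light.
Mat4 shadowMatrix(const Vec4& plane, const Vec4& l) {
    float dot = plane[0]*l[0] + plane[1]*l[1] + plane[2]*l[2] + plane[3]*l[3];
    Mat4 m{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            m[i * 4 + j] = (i == j ? dot : 0.0f) - l[i] * plane[j];
    return m;
}

Vec4 mul(const Mat4& m, const Vec4& v) {
    Vec4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            r[i] += m[i * 4 + j] * v[j];
    return r;
}
```

    After transforming a vertex with this matrix and doing the homogeneous divide, the point lands where the light's ray through it hits the plane; drawing the mesh that way in black (with stencil to avoid double-blending) gives the projected shadow.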
     
  16. coconut76

    Original Member

    Joined:
    Jan 6, 2005
    Messages:
    63
    Likes Received:
    0
    Hello,

    I've just read this thread and it is just what I was looking for...
    Thanks for the Unity link; I'm happy to know that 60-70% of casual gamers have Shader 2.0 support.

    I'm planning to make a new game, but I'm hesitating about moving to Shader 2.0.
    Currently I have my own 3D engine which works with DirectX 8 and the fixed pipeline.
    I've looked at XNA and Irrlicht.

    Why XNA? Because of the Dream Build Play competition going on right now. But the problem is migrating to C#. (I'm a C++ coder!)
    From what I've seen, coding in XNA/C# is easier than in C++.

    Irrlicht is more flexible because it has DX8, DX9, OpenGL and software renderers, and I could reuse some parts of my own engine.

    I'd like to hear some opinions about that.

    Thanks in advance.
     
  17. Desktop Gaming

    Moderator Original Member Indie Author

    Joined:
    Feb 24, 2005
    Messages:
    2,296
    Likes Received:
    12
    I know this thread is a month old, but anyway....

    Today I finished building my 'new' test PC: P3 733MHz, 256MB RAM, PCI GeForce2 MX440 64MB graphics, WinXP. I didn't set out for my test system or minimum requirements to be this spec - it's just bits I had lying around, and the graphics card was £5 off eBay.

    Ran Magicville on it and it runs superbly well - very close, almost unnoticeably different, to how it runs on my dev PCs. I was amazed to say the least, and I'm very happy with the result.
     
  18. JGOware

    Indie Author

    Joined:
    Aug 22, 2007
    Messages:
    1,578
    Likes Received:
    0
    "GeForce2MX440" I have one of those as well on a 800mhz system. Runs some games better than my Intel 82845G, Celeron 2.93hz system dev kit.
     
  19. Desktop Gaming

    Moderator Original Member Indie Author

    Joined:
    Feb 24, 2005
    Messages:
    2,296
    Likes Received:
    12
    Yeah I had a 32MB GF2MX seven years ago and it was a good card. I stand by my previous comments though; I'm amazed at how well it holds itself up on such a low spec system.

    My other PCs have a Radeon 9600, a 7600GT and an Intel GMA965 X3100.
     
  20. JGOware

    Indie Author

    Joined:
    Aug 22, 2007
    Messages:
    1,578
    Likes Received:
    0
    "I stand by my previous comments though; I'm amazed at how well it holds itself up on such a low spec system."

    True, but that's more likely GPU-related than CPU-related. Run your game on an 828 or, even better, an 810-series onboard Intel for an even better test. ;)
     
