Rendering problem when switching render targets.

Discussion in 'Game Development (Technical)' started by Battleline, May 15, 2005.

  1. Battleline

    Original Member

    Joined:
    Feb 15, 2005
    Messages:
    194
    Likes Received:
    0
    When I switch render targets, my models no longer seem to have their faces rendered properly. Faces that should be obscured by faces in front of them are being rendered. I'm not sure what I should do to prevent this from happening. :confused:

    This is what I'm currently doing whenever I switch to render my glow surface.
    Code:
    	g_pDevice->SetRenderTarget( g_pGlowSurface, NULL ); 
    	g_pDevice->SetRenderState(D3DRS_ZENABLE,TRUE);
    
    Then I switch it back to the back surface basically using the same code.
    Code:
    	g_pDevice->SetRenderTarget( g_pBackSurface, NULL ); 
    	g_pDevice->SetRenderState(D3DRS_ZENABLE,TRUE);
    
    I didn't initially call the SetRenderState function to set ZENABLE back to true, but I added it thinking that was perhaps the problem. It didn't seem to have any effect.

    Here's a screen shot.

    Anyone have any ideas?

    Thanks
     
  2. mkovacic

    Original Member Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    127
    Likes Received:
    0
    You're not passing a z-buffer to a second SRT() (you probably don't need it for glow rendering, right?).

    Get a ptr to a z-buffer using IDirect3DDevice8::GetDepthStencilSurface() before your first SRT(), then pass it to the second SRT().
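
    Something like this (just a sketch; g_pZBuffer is my name for it, the rest follows your variable names):
    Code:
        // Before the first SetRenderTarget(): hold on to the implicit z-buffer.
        // GetDepthStencilSurface() AddRef's the surface, so Release() it at shutdown.
        IDirect3DSurface8* g_pZBuffer = NULL;
        g_pDevice->GetDepthStencilSurface( &g_pZBuffer );

        // Glow pass -- no z-buffer here, as above.
        g_pDevice->SetRenderTarget( g_pGlowSurface, NULL );
        // ... render glow ...

        // Back to the back buffer -- re-attach the z-buffer instead of passing NULL.
        g_pDevice->SetRenderTarget( g_pBackSurface, g_pZBuffer );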
     
  3. Battleline

    Original Member

    Joined:
    Feb 15, 2005
    Messages:
    194
    Likes Received:
    0
    Thanks for the reply; I'm not exactly sure what you mean by SRT(), though. Here's what I just tried to do with the stencil buffer.

    I created a global surface variable to hold the stencil buffer, and called the following line after retrieving the back buffer during my initialization:
    Code:
    g_pDevice->GetDepthStencilSurface( &g_pStencilSurface );
    
    Then whenever I set my render target to my back buffer I use the following line:
    Code:
    g_pDevice->SetRenderTarget( g_pBackSurface, g_pStencilSurface ); 
    g_pDevice->SetRenderState(D3DRS_ZENABLE,TRUE);
    
    And when I set my render target to my glow surface I use this:
    Code:
    g_pDevice->SetRenderTarget( g_pGlowSurface, NULL ); 
    g_pDevice->SetRenderState(D3DRS_ZENABLE,FALSE);
    
    However, this seems to create an entirely different problem with the stencil buffer, as you can see in the following screenshot.


    In my presentation parameters during the initialization of my code I call:
    Code:
    g_SavedPresParams.EnableAutoDepthStencil = true;
    g_SavedPresParams.AutoDepthStencilFormat = D3DFMT_D16;
    
    I'm not sure if setting my presentation parameters to use an auto depth stencil would be the issue.

    Thanks for your help :D
     
  4. mkovacic

    Original Member Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    127
    Likes Received:
    0
    SRT() is just short for SetRenderTarget().

    It's hard to say from the screenshot what's going on as I don't know what it's supposed to look like.

    Does everything render correctly if you disable all SetRenderTarget() calls and rendering to the glow texture?

    I'm not sure exactly what you're rendering after the second SRT(). Are you aware that the z-buffer values are preserved across SRT() calls? Do you need a z-buffer clear in there somewhere?
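
    For reference, a depth-only clear would be something like:
    Code:
        // Clear only the z-buffer; the color argument is ignored because D3DCLEAR_TARGET isn't set.
        g_pDevice->Clear( 0, NULL, D3DCLEAR_ZBUFFER, 0, 1.0f, 0 );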
     
  5. Battleline

    Original Member

    Joined:
    Feb 15, 2005
    Messages:
    194
    Likes Received:
    0
    I was not clearing my stencil buffer after each render. By clearing the stencil buffer, I have been able to get much closer to a good-looking glow effect. However, it seems like I need to create a stencil buffer for my glow effect.

    Here are some screen shots:
    Screen Shot

    As you can see, the planet that is not in front of the glow effect is obscuring the glow.

    Screen Shot

    In the second case, the station pylon should be obscuring the glow, but it is not.

    I believe both of these issues can be resolved by creating a stencil buffer for my glow texture.
     
  6. mkovacic

    Original Member Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    127
    Likes Received:
    0
    I don't know exactly what your rendering setup is, but if I take a guess at it, this is what you need to do (sketched in code after the list):

    - get a global ptr to your depthstencil surface
    - SRT(backbuffer, depthstencil)
    - clear color/depth
    - render your scene, ZENABLE=true , ZFUNC = lessequal
    - SRT(glowtexture, depthstencil)
    - clear color only
    - render glowing stuff, ZENABLE=true , ZFUNC = lessequal or equal
    - SRT(backbuffer, NULL) (or SRT(backbuffer, depthstencil), doesn't matter)
    - render fullscreen quad with glow texture, ZENABLE=false, additive blending
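
    In rough D3D8 calls, that sequence might look like this (just a sketch; surface and texture names are placeholders):
    Code:
        // Once, at init: hold on to the device's depth/stencil surface.
        IDirect3DSurface8* g_pZBuffer = NULL;
        g_pDevice->GetDepthStencilSurface( &g_pZBuffer );

        // Pass 1: normal scene into the back buffer.
        g_pDevice->SetRenderTarget( g_pBackSurface, g_pZBuffer );
        g_pDevice->Clear( 0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, 0xFF000000, 1.0f, 0 );
        g_pDevice->SetRenderState( D3DRS_ZENABLE, TRUE );
        g_pDevice->SetRenderState( D3DRS_ZFUNC, D3DCMP_LESSEQUAL );
        // ... render the scene ...

        // Pass 2: glowing geometry into the glow texture, reusing the same z-buffer.
        g_pDevice->SetRenderTarget( g_pGlowSurface, g_pZBuffer );
        g_pDevice->Clear( 0, NULL, D3DCLEAR_TARGET, 0xFF000000, 1.0f, 0 );   // color only
        // ... render the glowing stuff (ZENABLE still TRUE) ...

        // Pass 3: composite the glow texture over the back buffer with a full-screen quad.
        g_pDevice->SetRenderTarget( g_pBackSurface, NULL );
        g_pDevice->SetRenderState( D3DRS_ZENABLE, FALSE );
        g_pDevice->SetRenderState( D3DRS_ALPHABLENDENABLE, TRUE );
        g_pDevice->SetRenderState( D3DRS_SRCBLEND, D3DBLEND_ONE );
        g_pDevice->SetRenderState( D3DRS_DESTBLEND, D3DBLEND_ONE );
        // ... draw a full-screen quad textured with the glow texture ...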

    Anyway, if this still doesn't work, please post exactly what your rendering setup is. There are a lot of ways to achieve a glow effect, and it's difficult to troubleshoot when I don't know exactly what you're doing.
     
  7. Battleline

    Original Member

    Joined:
    Feb 15, 2005
    Messages:
    194
    Likes Received:
    0
    Do I need a second depthstencil buffer for my glow surface? It still isn't working.
     
  8. mkovacic

    Original Member Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    127
    Likes Received:
    0
    No, you want to use the same depth buffer, so that your objects obscure the glow when in front of it. Just make sure you're sending the depth buffer to the second SRT() call (the one where you set the glow texture as the render target).

    Are you using the exact same geometry for the normal and glow rendering? If not, you could be getting z-fighting problems.

    Does glow color get rendered if you comment out this:

    - get a global ptr to your depthstencil surface
    - SRT(backbuffer, depthstencil)
    - clear color/depth
    - render your scene, ZENABLE=true , ZFUNC = lessequal
    //- SRT(glowtexture, depthstencil)
    //- clear color only
    - render glowing stuff, ZENABLE=true , ZFUNC = lessequal or equal
    //- SRT(backbuffer, NULL) (or SRT(backbuffer, depthstencil), doesn't matter)
    //- render fullscreen quad with glow texture, ZENABLE=false, additive blending
     
  9. Battleline

    Original Member

    Joined:
    Feb 15, 2005
    Messages:
    194
    Likes Received:
    0
    Well, the reason I was thinking I needed a second depth buffer is that my back buffer surface is 800x600 and my glow surface is 256x256. When I use the same stencil buffer for both, the glow flickers in and out somewhat erratically.


    Yes... I'm rendering all of my geometry twice. If the texture in the mesh is a glow texture, I render it as is to the glow surface. Any other texture is removed, and the geometry is rendered as black.


    Not rendering the glow textures to my glow surface will still result in the glowing colors being rendered, because I render the "glow stuff" to both the solid surface and the glow surface.
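
    Roughly, my glow pass looks like this (just a sketch of the idea; IsGlowTexture() and the subset loop are stand-ins for what my mesh code actually does):
    Code:
        // Glow pass sketch: every subset is drawn, but only glow materials keep their texture.
        for( DWORD i = 0; i < numSubsets; ++i )
        {
            if( IsGlowTexture( pTextures[i] ) )              // placeholder test
            {
                g_pDevice->SetTexture( 0, pTextures[i] );    // glow texture drawn as-is
                g_pDevice->SetTextureStageState( 0, D3DTSS_COLOROP,   D3DTOP_MODULATE );
                g_pDevice->SetTextureStageState( 0, D3DTSS_COLORARG1, D3DTA_TEXTURE );
            }
            else
            {
                // Non-glow subsets: drop the texture and force the output to solid black.
                g_pDevice->SetTexture( 0, NULL );
                g_pDevice->SetRenderState( D3DRS_TEXTUREFACTOR, 0xFF000000 );
                g_pDevice->SetTextureStageState( 0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1 );
                g_pDevice->SetTextureStageState( 0, D3DTSS_COLORARG1, D3DTA_TFACTOR );
            }
            pMesh->DrawSubset( i );
        }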

    Thanks
     
  10. mkovacic

    Original Member Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    127
    Likes Received:
    0
    Ugh, yes, you'll need a separate depth buffer for the glow render target in that case. Create it using CreateDepthStencilSurface(); just make sure the format matches your render target format (check the docs on CheckDepthStencilMatch()).
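
    Something along these lines (a sketch; the sizes and formats are just examples to match what you described):
    Code:
        // Check that D16 works with the glow target's format on this card...
        if( SUCCEEDED( g_pD3D->CheckDepthStencilMatch( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                       D3DFMT_X8R8G8B8,   // display/adapter format
                                                       D3DFMT_A8R8G8B8,   // glow render target format
                                                       D3DFMT_D16 ) ) )
        {
            // ...then create a depth buffer the same size as the glow target (256x256 here).
            IDirect3DSurface8* g_pGlowZBuffer = NULL;
            g_pDevice->CreateDepthStencilSurface( 256, 256, D3DFMT_D16,
                                                  D3DMULTISAMPLE_NONE, &g_pGlowZBuffer );

            // Use it whenever the glow surface is the render target.
            g_pDevice->SetRenderTarget( g_pGlowSurface, g_pGlowZBuffer );
        }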

    Sorry for misleading you.
     
  11. Battleline

    Original Member

    Joined:
    Feb 15, 2005
    Messages:
    194
    Likes Received:
    0

    Thanks for getting back to me. Well, you found the problem; I'm just not sure what to do about it. Based on what was in the DirectX docs, I put the following lines in my init:

    Code:
        r = g_pD3D->CheckDeviceFormat( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A8R8G8B8,
                                       D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_SURFACE, D3DFMT_D16 );
        if( FAILED( r ) ) {
            DEBUGV( "CheckDeviceFormat FAILED: %s", DXGetErrorString8(r) );
            EXIT_GAME();
        }

        r = g_pD3D->CheckDepthStencilMatch( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                            D3DFMT_A8R8G8B8,   // adapter (display mode) format
                                            D3DFMT_A8R8G8B8,   // render target format
                                            D3DFMT_D16 );      // depth/stencil format
        if( FAILED( r ) ) {
            DEBUGV( "CheckDepthStencilMatch FAILED: %s", DXGetErrorString8(r) );
            EXIT_GAME();
        }
    
    The DX8 docs said the following
    And I have the following lines in my debug log:

    So, both my CheckDeviceFormat() and my CheckDepthStencilMatch() calls failed because my depth format could not be used in conjunction with my adapter format and back buffer format.

    Is this something specific to my machine? If not, is there a list of Depth/Adapter/BackBuffer formats that work together somewhere?

    Thanks
     
  12. mkovacic

    Original Member Indie Author

    Joined:
    Jul 28, 2004
    Messages:
    127
    Likes Received:
    0
    It's specific to the video card; that's why you need to check explicitly. To get it going, just change it to D3DFMT_D24X8 or D3DFMT_D24S8; some cards need the depth/stencil buffer to be the same bit depth as the render target. I assume that's why it's not working for you.

    For production code, you'll typically try every format (there aren't that many), in order of preference, until you find one that is supported.
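
    A simple fallback loop might look something like this (just a sketch; note that the adapter format argument should be the display mode format, typically X8R8G8B8, rather than A8R8G8B8):
    Code:
        // Try depth formats in order of preference until the card accepts one.
        // Assumes an X8R8G8B8 display mode and an A8R8G8B8 render target.
        const D3DFORMAT depthFormats[] = { D3DFMT_D24S8, D3DFMT_D24X8, D3DFMT_D32, D3DFMT_D16 };
        D3DFORMAT chosenDepth = D3DFMT_UNKNOWN;

        for( int i = 0; i < sizeof(depthFormats) / sizeof(depthFormats[0]); ++i )
        {
            if( SUCCEEDED( g_pD3D->CheckDeviceFormat( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                      D3DFMT_X8R8G8B8, D3DUSAGE_DEPTHSTENCIL,
                                                      D3DRTYPE_SURFACE, depthFormats[i] ) ) &&
                SUCCEEDED( g_pD3D->CheckDepthStencilMatch( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                           D3DFMT_X8R8G8B8, D3DFMT_A8R8G8B8,
                                                           depthFormats[i] ) ) )
            {
                chosenDepth = depthFormats[i];
                break;
            }
        }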
     
  13. Battleline

    Original Member

    Joined:
    Feb 15, 2005
    Messages:
    194
    Likes Received:
    0
    Finally!

    I finally got my glow working. For whatever reason, I could not create a second depth/stencil buffer no matter what my settings were, and performing my pixel shader on an 800x600 texture resulted in exceptionally poor performance. What I ended up doing was creating a third texture that was 400x300. I render all of my glow polys to the glow surface, which I left at 800x600, reusing my main stencil buffer, and then render that glow surface onto the 400x300 texture using a screen-sized quad. It worked pretty well... check out the screenshot.

    Engine Glow Screenshot
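
    In case it helps anyone else, the downsample pass is basically just another render-target switch plus one textured quad (rough sketch; g_pSmallGlowTexture and g_pGlowTexture are placeholder names, since the glow render target has to live in a texture so it can be sampled):
    Code:
        // Downsample: draw the 800x600 glow texture onto the 400x300 texture with one quad.
        IDirect3DSurface8* pSmallSurface = NULL;
        g_pSmallGlowTexture->GetSurfaceLevel( 0, &pSmallSurface );   // placeholder 400x300 texture

        g_pDevice->SetRenderTarget( pSmallSurface, NULL );           // no z-buffer needed for a quad
        g_pDevice->SetRenderState( D3DRS_ZENABLE, FALSE );
        g_pDevice->SetTexture( 0, g_pGlowTexture );                  // the 800x600 glow texture
        // ... draw one quad covering the whole 400x300 target; bilinear filtering does the shrink ...

        pSmallSurface->Release();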

    Thanks for all the help... next on the agenda is the streaking effect.
     
  14. soniCron

    Indie Author

    Joined:
    May 4, 2005
    Messages:
    3,664
    Likes Received:
    0
    Glad to hear you finally got ahold of it! I remember when I was working on a post-processing effect pipeline. I didn't have a card capable of decent acceleration of shaders, so I had to program everything on my own. After I got all the effects finished, they ran about six times as fast as a comparable pixel shader on accelerated hardware. My point is, I think that pixel shaders are far overused for simple things that could, and should, be done by creative texturing. For example, I had:
    • Motion blur
    • Film bloom far superior to any I've seen in use today
    • Light glow/burn
    • Depth of field
    and they all ran as fast on a non-pixel-shading card as a pixel shader on a more expensive board, if not faster. All at the same time. ;) I think pixel shaders are fantastic for all sorts of things, from normal mapping to atmospheric distortion. But a lot of the stuff they're used for today, in my opinion, is a waste of the power. I guess what I'm getting at is, if you ever want to support some of these effects for lesser video cards, give me a holler. I've got some experience. And again, congrats on overcoming this!
     
  15. Battleline

    Original Member

    Joined:
    Feb 15, 2005
    Messages:
    194
    Likes Received:
    0
    That would be awesome! I tried locking the surface and doing the manipulation in memory, but that was just painfully slow. If you have a way to do motion blur and bloom without pixel shaders, I would love to hear about it.

    Send me an email if you would like to talk privately:
    support@epochstar.com

    Or just post back to this thread.

    Thanks,
    Battleline
     
  16. soniCron

    Indie Author

    Joined:
    May 4, 2005
    Messages:
    3,664
    Likes Received:
    0
    Well, I don't know what development is like on DirectX, because I use OpenGL, but I'll tell you what I did:

    I'd render the screen (or parts of it, like glow textures) to a very small texture (between 64x64 and 128x128, though it could be rendered higher at the cost of performance), copy the data from that texture into memory, and run various effects over it. Depending on the purpose (i.e. glow, motion blur, etc.) I would run the texture through different filters. For the glow, for example, I would run a max filter over the texture (which causes the light pixels to expand). For the bloom, I'd lighten the darker areas and darken the lighter areas to reduce the "obviously-using-a-bloom-texture" effect. I could go on for days explaining exactly what I did, but it's largely an artistic choice. In every case, I finally ran the result through a blur filter, which eliminated the blockiness from the low texture size. Then I uploaded it back to a texture, drew the screen, and pasted that texture over it, using different blending modes depending on what I wanted to accomplish.
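
    To give a rough idea of the kind of filter I mean (not my actual code, just an illustrative 3x3 max filter over an 8-bit greyscale buffer):
    Code:
        // Illustration only: a 3x3 "max" (dilate) filter over an 8-bit greyscale buffer.
        // Bright pixels expand outward, which is what makes the glow spread.
        void MaxFilter( const unsigned char* src, unsigned char* dst, int w, int h )
        {
            for( int y = 0; y < h; ++y )
            {
                for( int x = 0; x < w; ++x )
                {
                    unsigned char best = 0;
                    for( int dy = -1; dy <= 1; ++dy )
                    {
                        for( int dx = -1; dx <= 1; ++dx )
                        {
                            int sx = x + dx, sy = y + dy;
                            if( sx >= 0 && sx < w && sy >= 0 && sy < h )
                            {
                                unsigned char v = src[sy * w + sx];
                                if( v > best ) best = v;
                            }
                        }
                    }
                    dst[y * w + x] = best;
                }
            }
        }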

    If you're more interested in image manipulation, I suggest you try out FilterMeister for Photoshop, PSP, etc. It has some very nice documentation and tutorials for beginners to multi-dimensional DSP. Just remember, it's mostly a "play with it to see what you like" kinda thing. It's all subjective, anyway! ;)

    Oh, and just so you know, I got your PM, but I thought it'd be best to post my reply on here.
     
  17. Battleline

    Original Member

    Joined:
    Feb 15, 2005
    Messages:
    194
    Likes Received:
    0
    Hmmm... I don't know why it would be any different (maybe the texture I used was too large), but when I used DX8 to lock a texture into system memory, its performance was awful. I actually tried that prior to doing the pixel shader. My game went from 50fps to 5fps. On my card, the pixel shader I implemented doesn't seem to have any performance impact, but I'm running a Radeon 9600, so I'm not sure whether a lesser card would be greatly impacted or not. I may have to pick up some lower-end cards that still support pixel shading to see how it affects things.

    I spoke with someone else a while ago who uses OpenGL and also recommended working with the surfaces in system memory over pixel shaders, but I haven't heard from anyone using DirectX with the same opinion. Maybe I'll start another thread on the topic.

    Any ideas?
     
