Eric Robert

I need to preserve the exact colors (and thus, hard edges) when I read back the rendered image via an offscreen surface filled with GetRenderTargetData. There is an option in the driver control panel that forces anti-aliasing for all applications, and if the quality/performance slider is not set correctly I get anti-aliasing. Is there a way (in code) to prevent that?
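For context, the readback path looks roughly like this (a trimmed sketch; the device pointer, size, and format are placeholders):

    // Trimmed readback sketch. "device" is an initialized IDirect3DDevice9*;
    // the 256x256 size and D3DFMT_A8R8G8B8 format are placeholders and must
    // match the render target for GetRenderTargetData to succeed.
    IDirect3DSurface9* renderTarget = NULL;
    IDirect3DSurface9* offscreen    = NULL;

    device->GetRenderTarget(0, &renderTarget);
    device->CreateOffscreenPlainSurface(256, 256, D3DFMT_A8R8G8B8,
                                        D3DPOOL_SYSTEMMEM, &offscreen, NULL);

    // Copy the GPU render target into the system-memory surface.
    device->GetRenderTargetData(renderTarget, offscreen);

    // Lock and inspect the raw pixel values.
    D3DLOCKED_RECT rect;
    offscreen->LockRect(&rect, NULL, D3DLOCK_READONLY);
    // ... read (BYTE*)rect.pBits, one row every rect.Pitch bytes ...
    offscreen->UnlockRect();

    offscreen->Release();
    renderTarget->Release();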

Re: Game Technologies: Graphics How to disable driver anti-aliasing

windozer_

Eric,

Have you tried setting D3DRS_MULTISAMPLEANTIALIAS to FALSE via SetRenderState on your device? This should disable anti-aliasing for anything rendered by the device. Thanks.
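Something like this (a minimal sketch, assuming an initialized IDirect3DDevice9* named device):

    // Ask the device not to multisample. Only meaningful on a multisampled
    // render target, and the driver control panel may still override it.
    device->SetRenderState(D3DRS_MULTISAMPLEANTIALIAS, FALSE);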

Cale





Re: Game Technologies: Graphics How to disable driver anti-aliasing

BLANC Guillaume

From my understanding, the control panel settings directly tune the driver's behaviour. Thus it is not possible for the application to decide or force something that will be overridden anyway by the driver.





Re: Game Technologies: Graphics How to disable driver anti-aliasing

Eric Robert

Sorry, already tried that :-(



Re: Game Technologies: Graphics How to disable driver anti-aliasing

Eric Robert

OK, so what can I do to have a foolproof way to read back the exact result of the pixel shader output?





Re: Game Technologies: Graphics How to disable driver anti-aliasing

BLANC Guillaume

Maybe you could select a render target with a D3D format that does not support AA (if such a format exists). The driver won't be able to override it because it doesn't support it.
Also, multisampling is not available for all swap effects, so maybe you could also do it that way.
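One way to check that up front (a rough sketch; "d3d" is the IDirect3D9* interface, and the adapter/device-type/windowed values are just examples):

    // Returns true if the given surface format can be multisampled at all.
    // If every type beyond NONE is rejected, there is nothing for the driver
    // to force AA onto for that format.
    bool FormatSupportsAnyMultisampling(IDirect3D9* d3d, D3DFORMAT format)
    {
        for (int samples = D3DMULTISAMPLE_2_SAMPLES;
             samples <= D3DMULTISAMPLE_16_SAMPLES; ++samples)
        {
            HRESULT hr = d3d->CheckDeviceMultiSampleType(
                D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, format,
                TRUE /* windowed */,
                (D3DMULTISAMPLE_TYPE)samples, NULL);
            if (SUCCEEDED(hr))
                return true;
        }
        return false;
    }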





Re: Game Technologies: Graphics How to disable driver anti-aliasing

r3n

Set your multisample type to None in your presentation parameters, and set anti-aliased line enable to false in the device render state.
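In code that is roughly (a sketch; the rest of the presentation parameters are omitted):

    // Request a non-multisampled swap chain, then turn off line AA.
    // "d3dpp" is the D3DPRESENT_PARAMETERS used to create the device and
    // "device" is the resulting IDirect3DDevice9*.
    D3DPRESENT_PARAMETERS d3dpp = {0};
    d3dpp.MultiSampleType    = D3DMULTISAMPLE_NONE;
    d3dpp.MultiSampleQuality = 0;
    // ... fill in the remaining fields and create the device ...

    device->SetRenderState(D3DRS_ANTIALIASEDLINEENABLE, FALSE);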



Re: Game Technologies: Graphics How to disable driver anti-aliasing

Ralf Kornmann

Nvidia allows you to create application-specific profiles that override the global panel setting. Unfortunately, ATI does not.

May I ask what you are doing with the data you read back from the device? In my opinion it is risky to depend on pixel-accurate rendering.

r3n, this will not help, as the driver overrides these settings.






Re: Game Technologies: Graphics How to disable driver anti-aliasing

r3n

I shall have to remember that one. Thanks for the tip.



Re: Game Technologies: Graphics How to disable driver anti-aliasing

Eric Robert

Using another render target with a float texture format seems to be the only way.
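Something along these lines (a rough sketch; the size and the exact float format are placeholders and should be validated with CheckDeviceFormat first):

    // Render into a floating-point texture, then read it back through a
    // matching system-memory surface. "device" is an initialized
    // IDirect3DDevice9*; 256x256 / D3DFMT_A32B32G32R32F are illustrative.
    IDirect3DTexture9* rtTex     = NULL;
    IDirect3DSurface9* rtSurface = NULL;
    IDirect3DSurface9* sysmem    = NULL;

    device->CreateTexture(256, 256, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A32B32G32R32F, D3DPOOL_DEFAULT, &rtTex, NULL);
    rtTex->GetSurfaceLevel(0, &rtSurface);
    device->CreateOffscreenPlainSurface(256, 256, D3DFMT_A32B32G32R32F,
                                        D3DPOOL_SYSTEMMEM, &sysmem, NULL);

    device->SetRenderTarget(0, rtSurface);
    // ... draw with the pixel shader whose output must be exact ...

    // Copy the float render target into system memory and read it.
    device->GetRenderTargetData(rtSurface, sysmem);
    D3DLOCKED_RECT rect;
    sysmem->LockRect(&rect, NULL, D3DLOCK_READONLY);
    // ... each pixel is four floats starting at (BYTE*)rect.pBits ...
    sysmem->UnlockRect();

    sysmem->Release();
    rtSurface->Release();
    rtTex->Release();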

Thanks a lot!





Re: Game Technologies: Graphics How to disable driver anti-aliasing

Eric Robert

Reading back the exact output of the pixel shader can be used for many things: picking, arbitrary computations (www.gpgpu.org). I was just hoping to avoid having to create another render target, as someone else suggested. But it seems I have no choice.



Re: Game Technologies: Graphics How to disable driver anti-aliasing

Ralf Kornmann

Eric, I know these possible usages, and it is fine to do something like this in a controlled environment. If you try to use such techniques in a widely published application, I need to renew my warning: you will never know what hardware you will face, and some do even more "dirty" things than enabling simple AA. Floating-point targets will help you at the moment, but there are already chips that support AA even there.






Re: Game Technologies: Graphics How to disable driver anti-aliasing

Eric Robert

I understand what you're saying.

Maybe you have an alternative for doing picking (for example) without doing tons of collision tests?