Pixel-perfect rendering discrepancies between OpenGL and DirectX
Since this project has a low-res pixel-art style, it is very important that everything is rendered with point-filtered textures and no anti-aliasing; otherwise the intended look is lost. I noticed that my Windows builds had a very "soft" look to them, while my OS X builds were perfectly crisp with no blurriness.

OpenGL vs DirectX pixel offset discrepancy

I have a quad that covers the entire visible area with a pass-through shader on it, used for occasional visual effects. With no effects active, it simply passes the pixels behind it onwards. This is achieved using Shader Forge's "Scene Color" nodes. When I discussed the issue with Shader Forge's developer Joachim Holmér, he suggested that it might come down to a difference in pixel coordinate conventions between DirectX and OpenGL. Sure enough, that was the case, and I was able to account for it with a custom "Code" node combined with Unity's ShaderLab pre-processor macros:

It's a simple edit. A per-platform pixel offset is calculated first: 0 (none) for OpenGL and 0.5 for DirectX, since Direct3D 9's texel-to-pixel mapping is shifted by half a pixel relative to OpenGL's. That offset is then divided by the screen dimensions and used to shift the screen UVs. I'm sure there are other ways to solve this, but it works for me, and hopefully the information is helpful regardless.
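
For reference, here is a minimal sketch of the kind of function such a Code node could wrap. The function name, the use of the UNITY_HALF_TEXEL_OFFSET macro, and the sign of the offset are my assumptions, not the exact contents of the node:

// Sketch only: shifts screen-space UVs by half a pixel on platforms
// (Direct3D 9) whose texel-to-pixel mapping differs from OpenGL's.
// screenUV is the raw screen position UV; _ScreenParams.xy is the
// render target size in pixels (a Unity built-in).
float2 PixelPerfectScreenUV (float2 screenUV)
{
    #if defined(UNITY_HALF_TEXEL_OFFSET)
        float pixelOffset = 0.5;   // DirectX 9: half-pixel offset needed
    #else
        float pixelOffset = 0.0;   // OpenGL and other APIs: no offset
    #endif

    // Convert the offset from pixels to UV space and nudge the UVs.
    // The sign may need flipping depending on how the UVs are derived.
    return screenUV + pixelOffset / _ScreenParams.xy;
}

The adjusted UV then feeds the Scene Color sample, so each screen pixel reads exactly one texel of the scene behind the quad instead of blending between neighbours.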