Work around Nvidia driver bug

https://www.khronos.org/registry/OpenGL/specs/gl/GLSLangSpec.1.20.pdf
Section 4.1.3 says that hexadecimal integer literals are supported, but
Nvidia have never read a specification since their founding, so their
engineers didn't know that hexadecimal integer literals are required to
be supported in order to advertise support for OpenGL versions with GLSL
support.
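
A minimal sketch of the construct that trips the affected drivers (an
illustrative reproduction, not code from this repository): the hex-valued
macros end up in a preprocessor conditional, and the affected compilers
reject the hexadecimal literals even though the specification permits them.

// Illustrative failure case, assuming a driver that mishandles hexadecimal
// literals in GLSL preprocessor expressions despite GLSL 1.20 section 4.1.3.
#version 120

#define FUNC_LEQUAL 0x0203 // hexadecimal literal, as in the old code below
#define FUNC_ALWAYS 0x0207

#if FUNC_LEQUAL != FUNC_ALWAYS // hex values compared by the preprocessor
uniform float alphaRef;
#endif

void main()
{
    gl_FragColor = vec4(1.0); // trivial body; only the preprocessor part matters
}

// The workaround in this commit writes the same GLenum values in decimal,
// e.g. 515 instead of 0x0203, which the affected drivers accept.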
pull/593/head
AnyOldName3 4 years ago
parent a080071588
commit a4f32a469e

@@ -1,12 +1,12 @@
-#define FUNC_NEVER 0x0200
-#define FUNC_LESS 0x0201
-#define FUNC_EQUAL 0x0202
-#define FUNC_LEQUAL 0x0203
-#define FUNC_GREATER 0x0204
-#define FUNC_NOTEQUAL 0x0205
-#define FUNC_GEQUAL 0x0206
-#define FUNC_ALWAYS 0x0207
+#define FUNC_NEVER 512 // 0x0200
+#define FUNC_LESS 513 // 0x0201
+#define FUNC_EQUAL 514 // 0x0202
+#define FUNC_LEQUAL 515 // 0x0203
+#define FUNC_GREATER 516 // 0x0204
+#define FUNC_NOTEQUAL 517 // 0x0205
+#define FUNC_GEQUAL 518 // 0x0206
+#define FUNC_ALWAYS 519 // 0x0207
#if @alphaFunc != FUNC_ALWAYS && @alphaFunc != FUNC_NEVER
uniform float alphaRef;
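
For context, @alphaFunc is not standard GLSL; it is a placeholder that the
engine's shader manager is assumed to replace with one of the FUNC_* values
before the source reaches the driver, so the comparison above is evaluated
by the GLSL preprocessor. A sketch of what the driver might see after
substitution, assuming the alpha function was set to FUNC_LEQUAL:

// Hypothetical post-substitution source, assuming @alphaFunc -> 515:
#define FUNC_NEVER 512 // 0x0200
#define FUNC_ALWAYS 519 // 0x0207

#if 515 != FUNC_ALWAYS && 515 != FUNC_NEVER
uniform float alphaRef; // alpha-test reference, only declared when testing
#endif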
