Shader version error on mac

Hi!

What glsl version should I use on a shader for mac?

I am working on a library that uses a shader. It works fine on Windows, but has compile errors on Mac (tried El Capitan and High Sierra). A #version directive is required, and nothing below 150 is supported, but from 150 onwards you can't use "texture2D"; it has to be changed to just "texture". After changing it, any texture fetch throws an "unknown error" when compiling.

Maybe Processing is adding “uniform sampler2D texture” to the shader?
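For reference, this is the kind of minimal GLSL 1.50 fragment shader I mean (the names here are just placeholders, not my actual code). If the sampler really is declared with the name "texture", that would clash with the built-in texture() function in 1.50, which could explain the error on stricter compilers:

#version 150

uniform sampler2D texMap; // placeholder name; "texture" itself is a built-in function name in GLSL 1.50
in vec4 vertTexCoord;
out vec4 fragColor;

void main() {
    fragColor = texture(texMap, vertTexCoord.st);
}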

Thanks


Generally if you don’t require a specific version you can leave it out entirely and the shader preprocessor will add it in and rewrite keywords and functions. Although AFAIK it should work with a version in place - maybe you’ve found a bug.

Thanks Neil. If I don't define the version I get an error requiring it (only on OS X). I was using some modern syntax, so I tried moving back to attributes and varyings, but it seems like OS X does not support anything older than 150.

Shaders included in the examples work without the version directive, so maybe it's related to using some fancy functions like texture2DLod…

I've also tried defining one of Processing's own shader types, like PROCESSING_TEXLIGHT_SHADER, but it doesn't change anything.

Without a version directive Processing will attempt to rewrite the shader code to suit the GL version (e.g. attributes/varyings to in/out). It shouldn't touch the source with a version in place.
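Roughly speaking, the rewrite maps the legacy keywords onto their GLSL 1.50 equivalents. A sketch of the idea (not Processing's exact output):

// GLSL 1.10/1.20 style           // GLSL 1.50 style
attribute vec4 position;          in vec4 position;      // vertex shader
varying vec4 vertColor;           out vec4 vertColor;    // vertex shader
varying vec4 vertColor;           in vec4 vertColor;     // fragment shader
gl_FragColor = color;             fragColor = color;     // plus declaring "out vec4 fragColor;"
texture2D(tex, uv)                texture(tex, uv)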

Check out https://github.com/processing/processing/blob/master/core/src/processing/opengl/PGL.java#L1948 It might give you a hint to why yours is failing.


Interesting, will look into it. Thanks!

One bug I found on Mac was that declaring and initializing an array outside of any function fails. This works on Linux, but apparently not on Mac. I had to populate the array inside main like this:

vec2 dirs[4];
void main()
{
    dirs[0] = vec2(0.0, 1.0);
    dirs[1] = vec2(0.0, -1.0);
    dirs[2] = vec2(1.0, 0.0);
    dirs[3] = vec2(-1.0, 0.0);
    // ... rest of main
}
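For comparison, this is the form that failed for me on Mac: an array constructor initializer at global scope. As far as I can tell it needs GLSL 1.20 or later, and some drivers seem to reject it even then, so treat this as the "does not work everywhere" version:

const vec2 dirs[4] = vec2[](
    vec2( 0.0,  1.0),
    vec2( 0.0, -1.0),
    vec2( 1.0,  0.0),
    vec2(-1.0,  0.0)
);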

Thanks hamoid, just checked and I’m not declaring arrays beyond the standard Processing lighting arrays. I guess something similar might be happening.

Since I don't usually use Macs, I will put it on GitHub in case someone finds it useful and wants to fix it.

@hamoid – I’m curious – is this filed as a bug somewhere?

Do shader compilation bugs come from the shader compiler provided by specific graphics driver or are they related to Processing?

The graphics driver. One thing I've also noticed is that some drivers (Intel!) are stricter about the spec, where other vendors will let you get away with mixing things from different versions. Mind you, I think the graphics driver on Apple is part of the OS? Apple is also definitely stricter.

GLSL is like Java - write once, test everywhere :smile:

I've been reading, and it might not be an actual bug but something supported in certain versions of GLSL and not in others.

My approach was to try things out and if they work, they are good. But I have discovered what @neilcsmith mentions above: “write once, test everywhere”. The fact that it works on my system doesn’t mean it will work on others, or that it is even correct. Maybe my driver is more tolerant, or includes features it shouldn’t.