Thanks for the replies guys!
@hamoid that code works for me as well. The problem appears when I use textures with raw OpenGL in Java and GLSL shaders.
@neilcsmith I tried setting the texture after binding the shader, but the sketch keeps crashing:
PGL pgl = beginPGL();
sh.bind();
sh.set("txtr", img);  // setting the texture here, after bind(), is what crashes
// ... all the OpenGL drawing calls ...
sh.unbind();
endPGL();
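For reference, the lower-level route I am experimenting with is binding the texture myself through PGL and passing the unit index to the sampler. This is only a sketch and leans on Processing internals (getTexture() and the Texture.glName field), so treat it as an experiment rather than a recipe:
PGL pgl = beginPGL();
sh.bind();
Texture tex = ((PGraphicsOpenGL) g).getTexture(img);  // GL texture backing the PImage
pgl.activeTexture(PGL.TEXTURE0 + 1);                  // use unit 1; unit 0 is Processing's default
pgl.bindTexture(PGL.TEXTURE_2D, tex.glName);
sh.set("txtr", 1);                                    // the sampler2D uniform holds the unit index
// all the OpenGL stuff
pgl.bindTexture(PGL.TEXTURE_2D, 0);
sh.unbind();
endPGL();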
What I am ultimately after is GPGPU. The first step is to get a texture working in the shaders, so I can drive animations from data passed from the CPU to the GPU as textures.
Example: I have 1024 particles. I want to pass a 32x32 (= 1024 texels) texture to the vertex shader so that each particle's color (or size) corresponds to the color at a specific coordinate in the texture (see the uv sketch after the shader below).
For example:
// I am omitting some parts of the code
uniform sampler2D texture;
attribute vec2 uv;  // per-particle texture coordinate
varying vec4 vColor;
void main() {
  // gl_Position setup omitted
  vec3 st = texture2D(texture, uv).rgb;
  gl_PointSize = st.r * 10.0;
  vColor = vec4(st.r, st.g, st.b, 1.0);
}
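To make the 32x32 mapping concrete: the uv of particle i would just be the center of texel i, computed once on the CPU and uploaded as a vertex attribute. A quick sketch of that computation (uvData is a name I made up; the attribute upload itself is omitted):
int side = 32;  // 32 x 32 texture = 1024 particles
float[] uvData = new float[side * side * 2];
for (int i = 0; i < side * side; i++) {
  uvData[2 * i]     = ((i % side) + 0.5f) / side;  // u: column of texel i
  uvData[2 * i + 1] = ((i / side) + 0.5f) / side;  // v: row of texel i
}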
Maybe I am using the wrong approach, or perhaps it works differently in the OpenGL realm (I am porting some code I wrote from WebGL to OpenGL in Processing).
**SMALL UPDATE**
If I pass the sampler2D to the fragment shader instead, it works like a charm. The problem is that then I can only affect the last part of the pipeline (rasterization): AFAIK the fragment shader cannot change the vertex position or the point size.
New fragment shader:
uniform sampler2D txtr;
void main() {
  vec2 uv = vec2(0.5, 0.5);          // uv coordinate on the texture, ideally an attribute
  vec3 s = texture2D(txtr, uv).rgb;  // get the color of that pixel
  gl_FragColor = vec4(s.r, s.g, s.b, 1.0);  // apply the color to the particle
}
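If vertex-stage sampling does end up working, the hard-coded uv above would instead come in as a varying from the vertex shader. A sketch of the pair I am aiming for (assuming a per-particle attribute named uv is supplied by the sketch):
// vertex shader: forward the per-particle uv
attribute vec2 uv;
varying vec2 vUv;
void main() {
  vUv = uv;
  // gl_Position / gl_PointSize setup omitted
}

// fragment shader: sample with the interpolated uv
uniform sampler2D txtr;
varying vec2 vUv;
void main() {
  gl_FragColor = vec4(texture2D(txtr, vUv).rgb, 1.0);
}
In Processing the attribute could presumably be fed with PShape.attrib(), though I have not verified that part.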
So do I have to assume that in Processing's OpenGL I cannot pass textures to the vertex shader?