Strange offset between shader and encoded data

Hey, I'm working on a real-time 2D radiosity lighting demo in pure Processing, similar to the (now inactive) PixelFlow demo.

It uses no external libraries and does not depend on PixelFlow itself.

So far the pipeline runs smoothly in real time, but there is a mismatch between the coordinates sent from the sketch and the coordinates displayed by the shader. As a band-aid fix I've added an offset of exactly 120 pixels to the data sent to the shader. However, the offset changes slightly depending on where the object sits on screen (it looks like a slight scaling difference), so there is still a small but noticeable mismatch.

I've tested the pipeline and the problem seems to stem from the shader that generates the signed distance field.

how the data is sent:

void encode(PImage p, int index){
    // the colour of the object, with luminance packed into alpha
    p.pixels[index*4] = color(col, lum*16f);
    // coordinates to be sent to the shader (with the 120 pixel band-aid offset);
    // adding 30000 because a colour channel cannot hold a negative value
    int ax = int(x) + 30000 - 120,
        ay = (height - int(y)) + 30000 - 120;
    // encoding the coords into colour channels
    p.pixels[index*4+1] = color(ax%256, ay/256, ay%256, ax/256);

    // type, width and height
    p.pixels[index*4+2] = color(type, w%256, h%256, 16*(w/256) + h/256);
}
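As a side note, the byte-packing above can be checked on the CPU. Here is a minimal plain-Java sketch (the names `packCoord`/`unpackCoord` are made up for illustration) confirming that splitting an offset coordinate into high/low bytes round-trips exactly:

```java
public class CoordPack {
    // Offset added so negative world coordinates still encode as positive ints,
    // mirroring the +30000 in the sketch above.
    static final int OFFSET = 30000;

    // Split an offset coordinate into { high byte, low byte },
    // as the sketch does with ax/256 and ax%256.
    static int[] packCoord(int x) {
        int v = x + OFFSET;
        return new int[] { v / 256, v % 256 };
    }

    // Rebuild the coordinate: 256*hi + lo, then remove the offset.
    static int unpackCoord(int hi, int lo) {
        return 256 * hi + lo - OFFSET;
    }

    public static void main(String[] args) {
        for (int x = -500; x <= 500; x += 7) {
            int[] p = packCoord(x);
            if (unpackCoord(p[0], p[1]) != x)
                throw new AssertionError("roundtrip failed at " + x);
        }
        System.out.println("roundtrip ok");
    }
}
```

So the integer packing itself is lossless; whatever goes wrong has to happen after the bytes leave the CPU.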

how the outlines are drawn on the objects:

for(int i = 0; i < allObjects.size(); i++){
    // ...
}

how the shader decodes and re-encodes the data (please add float32 images, Processing, so I don't have to use weird hacks qwq):

vec2 decode(vec4 a){
	return vec2(256.0*(256.0*a.a + a.r) - 30000.0, 256.0*(256.0*a.g + a.b) - 30000.0);
}
vec4 encode(vec2 pos){
	return vec4(mod(pos.x, 256.0)/256.0, floor(pos.y/256.0)/256.0, mod(pos.y, 256.0)/256.0, floor(pos.x/256.0)/256.0);
}

how the shader generates the distance function

float sdBox( vec2 p, vec2 b ){
  vec2 d = abs(p) - b;
  return min(max(d.x,d.y),0.0) + length(max(d,0.0));
}
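For sanity-checking distance values on the CPU, here is a plain-Java port of the same box SDF (a sketch for verification only, not part of the demo; the class name is made up):

```java
public class SdfCheck {
    // Signed distance from point p to an axis-aligned box with half-extents b,
    // centered at the origin -- same formula as the GLSL sdBox above.
    static double sdBox(double px, double py, double bx, double by) {
        double dx = Math.abs(px) - bx;
        double dy = Math.abs(py) - by;
        double ox = Math.max(dx, 0.0), oy = Math.max(dy, 0.0); // distance outside the box
        double inside = Math.min(Math.max(dx, dy), 0.0);       // negative when inside the box
        return inside + Math.hypot(ox, oy);
    }

    public static void main(String[] args) {
        System.out.println(sdBox(0, 0, 1, 1)); // center of a 2x2 box: -1.0
        System.out.println(sdBox(3, 0, 1, 1)); // 2 units right of the box: 2.0
        System.out.println(sdBox(1, 1, 1, 1)); // exactly on the corner: 0.0
    }
}
```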
// data1: rgb colour + lum; data2: pos+30000; data3: r: 0 = circle | 1 = square, gb: size, a: sizemul (r: rotation, gba: other options)
float sdf(vec2 pos, vec4 data1, vec4 data2, vec4 data3){
	int thing = int(256.0*data3.x);
	float w = 256.0*(data3.g + float(int(256.0*data3.a)/16)) - 1.0;
	float h = 256.0*(data3.b + float(int(256.0*data3.a)%16)) - 2.0;
	if(thing == 0){
		return length(decode(data2)-pos) - w;
	} else if(thing == 1){
		return sdBox(decode(data2)-pos, vec2(w, h));
	}
	return 199.0;
}

float sdfAll(vec2 pos){
	float minSdf = 19999.0;
	for(int i = 0; i < totalObjects; i++){
		minSdf = min(minSdf, sdf(pos, ...));
	}
	return minSdf;
}

Here in this screen capture you can see that the black outline (sorry for the poor visibility) sticks out of the black shape. This is very noticeable on the centre-bottom square, and on the circle as it passes through the lit area.
A more visible example: compare the outline of the left-side shape with the right-side shape.

any insight into this issue is appreciated! <3


On further testing, the offset of the object is proportional to the offset I use to allow negative coordinates. The difference in scale between the encoded coordinates and the screen is about 0.995x, and
so far I have patched it up by multiplying the encoded data by that factor. I would still like to know why this happens; I suspect the same effect also interfered with my implementation of the jump flood algorithm, which produced large holes and rips in the output Voronoi tessellation, reminiscent of odd scaling.

My guess is that 0…1 should map to 0…255, not 0…256. I don't think it's a coincidence that 255/256 ≈ 0.996.
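To make that concrete: an 8-bit texture channel holding byte value b is sampled in the shader as b/255, but the decode() above multiplies by 256 as if the channel were b/256, so every decoded coordinate comes out scaled by 256/255 ≈ 1.0039 (equivalently, correcting it means multiplying by 255/256 ≈ 0.996, the factor observed above). A small plain-Java model of the round trip (class and method names are hypothetical):

```java
public class ChannelScale {
    // What the GPU hands the shader for an 8-bit channel holding byte value b.
    static double sample(int b) { return b / 255.0; }

    // The decode from the shader above: assumes channel = byte/256.
    static double decodeBuggy(int hi, int lo) {
        return 256.0 * (256.0 * sample(hi) + sample(lo));
    }

    // Undo the /255 normalization first, then recombine the bytes.
    static double decodeFixed(int hi, int lo) {
        return 256.0 * Math.round(sample(hi) * 255.0) + Math.round(sample(lo) * 255.0);
    }

    public static void main(String[] args) {
        int hi = 117, lo = 48;                   // 30000 = 117*256 + 48
        System.out.println(decodeFixed(hi, lo)); // 30000.0 -- exact roundtrip
        System.out.println(decodeBuggy(hi, lo)); // ~30117.6 -- scaled by 256/255
    }
}
```

The ~118-pixel error on a 30000-offset coordinate also explains why the band-aid offset of roughly 120 pixels almost worked: the error scales with the encoded value, not by a constant.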


Oh, good catch! I'm a dumbass, thanks for the help!


:laughing: it’s easy when you’ve made a similar dumbass mistake in the past!