```
// Bit-hack fast geometric mean.
// Red curve: exact sqrt(m1 * i).
// Green curve: averaging the raw float bit patterns of m1 and i,
// which approximates the geometric mean sqrt(m1 * i) because a
// float's bits are roughly a scaled-and-offset log2 of its value.
void setup() {
  size(500, 500);      // size() must be the first call in setup()
  background(0);
  float m1 = 300f;
  stroke(255, 0, 0);   // red: exact
  for (int i = 0; i < 500; i++) {
    point(i, 499 - sqrt(m1 * i));
  }
  stroke(0, 255, 0);   // green: bit-hack approximation
  for (int i = 0; i < 500; i++) {
    int j = Float.floatToRawIntBits((float) i);
    int k = Float.floatToRawIntBits(m1);
    j = (j + k) >>> 1;  // average the two bit patterns
    point(i, 499 - Float.intBitsToFloat(j));
  }
}
```


Thanks for the link. Floating-point bit hacks are very useful for experimenting with neural-network activation functions. If you just want a little nonlinearity, for example, you can multiply the approximate square root of a number by itself.
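A minimal Java sketch of that idea. The square-root approximation here uses the standard bit trick of halving the exponent via a right shift and re-adding half the exponent bias (`0x1FC00000`); the class and method names are my own, not from the post:

```java
public class SqrtHackActivation {
    // Bit-hack approximate square root (assumes x >= 0).
    // Shifting the raw bits right by one roughly halves the
    // exponent; adding 0x1FC00000 restores half the bias.
    static float approxSqrt(float x) {
        int i = Float.floatToRawIntBits(x);
        i = (i >> 1) + 0x1FC00000;
        return Float.intBitsToFloat(i);
    }

    // Mildly nonlinear "activation": squaring the approximate root
    // gives back roughly x, bent slightly by the approximation error.
    static float activation(float x) {
        float r = approxSqrt(x);
        return r * r;
    }

    public static void main(String[] args) {
        for (float x : new float[] {0.5f, 1f, 2f, 4f, 100f}) {
            System.out.println(x + " -> sqrt~ " + approxSqrt(x)
                               + ", act " + activation(x));
        }
    }
}
```

At powers of four the approximation is exact (e.g. `approxSqrt(4f)` is exactly `2f`), and between them the error produces the gentle bend that serves as the nonlinearity.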

Or if you want to convert two numbers drawn from the Gaussian distribution into one number from the uniform distribution, you can use atan2(x, y): the angle of a point whose coordinates are two independent Gaussians is uniformly distributed around the circle.
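A quick Java check of that property (names are my own; the empirical quadrant test is just one rough way to see the uniformity):

```java
import java.util.Random;

public class GaussianToUniform {
    // An isotropic 2-D Gaussian point (x, y) has a rotation-invariant
    // distribution, so its angle is uniform on (-pi, pi].
    static double gaussianPairToUniform(double x, double y) {
        return Math.atan2(y, x);
    }

    // Rough empirical check: bucket n angles into the four quadrants;
    // if the angle is uniform the counts should be nearly equal.
    static int[] quadrantCounts(long seed, int n) {
        Random rng = new Random(seed);
        int[] quad = new int[4];
        for (int i = 0; i < n; i++) {
            double a = gaussianPairToUniform(rng.nextGaussian(),
                                             rng.nextGaussian());
            // map (-pi, pi] onto buckets 0..3
            int b = (int) Math.min(3.0, (a + Math.PI) / (Math.PI / 2));
            quad[b]++;
        }
        return quad;
    }

    public static void main(String[] args) {
        for (int c : quadrantCounts(1L, 100000)) System.out.println(c);
    }
}
```

(Either argument order works for the distribution itself, since the two Gaussians are symmetric; `atan2(y, x)` is just the conventional order in Java and Processing.)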
