Geometric mean bit hack

Thanks for the link. Floating point bit hacks are very useful for experimenting with neural network activation functions. If you just want a little bit of nonlinearity, for example, you can multiply the approximate square root of a number by itself (sketch below).
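
Here's a minimal sketch in C of what I mean, assuming the classic magic-constant square-root approximation (the constant `0x1FBD1DF5` and the function names are my own choices, not from the linked post). Squaring the approximate root gives something close to the identity, but with small log-periodic wiggles:

```c
#include <stdint.h>
#include <string.h>
#include <stdio.h>

/* Approximate sqrt via the bit hack: halve the exponent field and
 * nudge with a magic constant.  Only valid for x >= 0.              */
static float approx_sqrt(float x) {
    uint32_t i;
    memcpy(&i, &x, sizeof i);      /* reinterpret float bits as int  */
    i = (i >> 1) + 0x1FBD1DF5u;    /* ~ (bits + bias * 2^23) / 2     */
    float y;
    memcpy(&y, &i, sizeof y);
    return y;
}

/* Slightly nonlinear "identity": close to x, but not quite.         */
static float soft_identity(float x) {
    float r = approx_sqrt(x);
    return r * r;
}

int main(void) {
    for (float x = 0.5f; x <= 8.0f; x *= 2.0f)
        printf("x = %.2f  f(x) = %.4f\n", x, soft_identity(x));
    return 0;
}
```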
Or if you want to convert two independent samples from the Gaussian distribution into one sample from the uniform distribution, you can use atan2(x, y): two independent standard normals form an isotropic 2-D Gaussian, so their polar angle is uniform on (-π, π].
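
A sketch of that direction too (essentially Box-Muller run backwards; by rotational symmetry the argument order of atan2 doesn't matter, I just use the conventional atan2(y, x)):

```c
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Two independent standard normals (gx, gy) form an isotropic 2-D
 * Gaussian, so their polar angle is uniform on (-pi, pi].  Rescale
 * to [0, 1) to get a standard uniform sample.                       */
double gauss_pair_to_uniform(double gx, double gy) {
    return (atan2(gy, gx) + M_PI) / (2.0 * M_PI);
}

int main(void) {
    /* e.g. gx = 0.31, gy = -1.12 drawn from N(0, 1) */
    printf("%f\n", gauss_pair_to_uniform(0.31, -1.12));
    return 0;
}
```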
