Patterns with Sine

I am off in some weird space thinking about sub-random projections. To boldly go where no other human has gone and all that. Basically I need some sub-random pattern of sign flips to apply before a fast (Walsh) Hadamard transform to make the sub-random projection. One simple way is to feed a sequence of integers (or some other linear sequence of numbers) into the sine function and take the sign of the result.
Some interesting patterns emerge when you do that.
https://editor.p5js.org/congchuatocmaydangyeu7/sketches/H62P6YbRT
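
For concreteness, here is a minimal numpy sketch of the same idea. It is not the p5.js code above; the step size and offset are illustrative parameters I have made up:

```python
import numpy as np

def sine_sign_flips(n, step=1.0, offset=0.0):
    # flip[i] = sign(sin(step*i + offset)): the sign of the sine of a
    # linear sequence of numbers, giving a structured +1/-1 pattern.
    i = np.arange(n)
    return np.where(np.sin(step * i + offset) >= 0.0, 1.0, -1.0)

print(sine_sign_flips(16))            # step of 1 radian
print(sine_sign_flips(16, step=0.1))  # smaller step -> longer runs of equal signs
```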

This is all to do with Fast Transform neural networks.


Further information: https://discourse.numenta.org/t/sub-random-patterns-for-sub-random-projections/8501


Wow, quite a few words I do not understand here, but the output is intriguing.

I am usually able to follow some complex topics, but I have no idea how to understand this one. Do you have some background info on the task you coded, or on what you are trying to achieve? Or perhaps even a simpler explanation?

The reasons for wanting such patterns are off in outer space. Basically I am looking for patterns that are approximately orthogonal to the basis vectors of the fast Walsh-Hadamard transform, but which are not random; they should have some repeating structure.
Here is the 8-point WHT, with each row (and in fact each column) being a basis vector:
+ + + + + + + +
+ - + - + - + -
+ + - - + + - -
+ - - + + - - +
+ + + + - - - -
+ - + - - + - +
+ + - - - - + +
+ - - + - + + -
https://en.wikipedia.org/wiki/Fast_Walsh%E2%80%93Hadamard_transform
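
As a rough illustration (a sketch, not anyone's production code), here is an in-place fast Walsh-Hadamard transform in Python, with a sine-derived sign-flip pattern applied to the input first to form the sub-random projection:

```python
import numpy as np

def fwht(x):
    # In-place fast Walsh-Hadamard transform, natural (Hadamard) order.
    # len(x) must be a power of two; runs in O(n log n).
    h, n = 1, len(x)
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

# Sub-random projection: sign-flip the input with a structured
# +1/-1 pattern (as in the earlier sketch), then transform.
x = np.random.randn(8)
flips = np.where(np.sin(np.arange(8)) >= 0.0, 1.0, -1.0)
y = fwht((flips * x).copy())
print(y)
```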

Maybe relevant?

Hi @seanc4s ,

Thanks for sharing, it’s beautiful! :wink:

I saw that you posted this on the Numenta HTM forum (I've looked at the Wikipedia page). Could you give more details on that?

The main reason I post on the Numenta site is they tend not to block you. I have my own ideas about associative memory. If you have a hash algorithm with +1/-1 output bits instead of the typical 0/1 output bits, and you take the dot product of that with a weight vector, you can store d (the dimension) <vector, scalar> associations. That is because the d different (effectively random) +1/-1 vectors going into the dot product are approximately orthogonal in higher-dimensional space, so with high probability you can solve for an appropriate weight vector. Any other input will hash to something completely random, and the math says you just get low-level Gaussian noise out. That's not very useful by itself, but you can use a Locality Sensitive Hash instead, where similar inputs to the hash algorithm produce similar bit patterns out. Then you don't have to get the input vector exactly right to get the expected scalar output.
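
A small numpy sketch of that storage argument (the dimension and data are made up; this just checks that a +/-1 key matrix is solvable with high probability, and that an unrelated key reads out as noise):

```python
import numpy as np

d = 64
rng = np.random.default_rng(0)

# d effectively-random +1/-1 hash outputs, one per stored association.
H = rng.choice([-1.0, 1.0], size=(d, d))
s = rng.standard_normal(d)          # the d scalars to associate

# Solve H @ w = s: possible with high probability because d random
# +/-1 vectors in d dimensions are almost surely linearly independent.
w = np.linalg.solve(H, s)

print(np.allclose(H @ w, s))           # recall is exact for stored keys
print(rng.choice([-1.0, 1.0], d) @ w)  # unrelated key -> low-level Gaussian noise
```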
You can use random projections followed by +1/-1 binarization to create such a locality sensitive hash. However, that has little robustness to rotation, scaling, or translation.
An idea, then, is to use sub-random projections, with the help of sub-random patterns, to get some improved properties.
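
Here is a sketch of the random-projection locality sensitive hash just described (the dimensions and noise level are illustrative). Swapping the dense random matrix R for sign flips plus a fast Walsh-Hadamard transform, as above, would give the sub-random version:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 256
R = rng.standard_normal((d, d))     # random projection matrix

def lsh_bits(x):
    # Locality sensitive hash: project, then binarize to +1/-1.
    return np.where(R @ x >= 0.0, 1.0, -1.0)

x = rng.standard_normal(d)
x_near = x + 0.05 * rng.standard_normal(d)   # small perturbation of x
x_far = rng.standard_normal(d)               # unrelated input

# Fraction of matching hash bits: high for nearby inputs, ~0.5 for unrelated ones.
print(np.mean(lsh_bits(x) == lsh_bits(x_near)))
print(np.mean(lsh_bits(x) == lsh_bits(x_far)))
```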
