Color map from grayscale

I am looking for a way to create a heat map from a sequence of grayscale images with nice grayscale blobs. The heat map should look like the maps you know from eye tracking: starting with black, then blue and green, turning red at the most-used spots. Has this been done before?

I could read out the gray values in the image series (20 to 100 images), then maybe add the values to get a summed-up image in an array? I will probably have to normalize them somehow?

Then try to convert the grayscale into a blue-green-red image with some kind of lookup-table color map?
I guess I would have to define 256 colors for the 256 gray values in an array?
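The 256-entry lookup table could be sketched roughly like this. This is plain Java rather than a full Processing sketch (in Processing itself, `lerpColor()` would do the interpolation), and the color stops black → blue → green → red are an assumption taken from the description above:

```java
import java.util.Arrays;

public class HeatmapLut {
    // Color stops for the gradient: black -> blue -> green -> red (assumed stops)
    static final int[][] STOPS = {
        {0, 0, 0}, {0, 0, 255}, {0, 255, 0}, {255, 0, 0}
    };

    // Build a 256-entry lookup table by linear interpolation between the stops,
    // the same idea as Processing's lerpColor().
    static int[][] buildLut() {
        int[][] lut = new int[256][3];
        int segments = STOPS.length - 1;
        for (int i = 0; i < 256; i++) {
            float t = i / 255f * segments;          // position along the whole gradient
            int seg = Math.min((int) t, segments - 1);
            float f = t - seg;                      // fraction within the current segment
            for (int c = 0; c < 3; c++) {
                lut[i][c] = Math.round(STOPS[seg][c] + f * (STOPS[seg + 1][c] - STOPS[seg][c]));
            }
        }
        return lut;
    }

    public static void main(String[] args) {
        int[][] lut = buildLut();
        System.out.println(Arrays.toString(lut[0]));    // black
        System.out.println(Arrays.toString(lut[255]));  // red
    }
}
```

To colorize a grayscale pixel, you would then simply index `lut[gray]`.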

I found this on the web, which has an interesting concept for the lookup table.

What do you think, is this the route I should go? I don't know how to start.
There is not much on OpenProcessing or elsewhere.


Since you have specific colors defining your own spectrum, I would stick to a lookup table, as you mentioned. Also check this next post by Jeremydouglass, as you might find it useful for your case:

I suggest trying to code a basic concept and sharing your code to receive feedback.



Thanks a lot. The hint with lerpColor is great. What a clever idea.
I didn't expect lerp to be this usable for gradients.

I did some tests with the heatmap and it looks good.

I am still struggling with summing up the gray values.
I have a sequence of 20 to 60 grayscale images and I need to reduce them to one, where all the white blobs are visible and the grays should add up somehow, so the more frequently used spots get brighter.
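The summing step described above might look roughly like this, sketched in plain Java on flat brightness arrays standing in for the pixels of each frame (in Processing you would read the same values out of each `PImage.pixels` array):

```java
import java.util.Arrays;

public class SumImages {
    // Sum a sequence of grayscale frames (values 0-255) into one accumulator,
    // then rescale so the brightest accumulated spot maps back to 255.
    static int[] sumAndNormalize(int[][] frames) {
        int n = frames[0].length;
        long[] sum = new long[n];
        for (int[] frame : frames)
            for (int i = 0; i < n; i++)
                sum[i] += frame[i];
        long max = 1;                       // avoid division by zero on all-black input
        for (long v : sum) max = Math.max(max, v);
        int[] out = new int[n];
        for (int i = 0; i < n; i++)
            out[i] = (int) (sum[i] * 255 / max);
        return out;
    }

    public static void main(String[] args) {
        int[][] frames = { {0, 100, 200}, {0, 100, 200}, {0, 50, 200} };
        int[] heat = sumAndNormalize(frames);
        System.out.println(Arrays.toString(heat)); // [0, 106, 255]
    }
}
```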

These are some samples



What you describe sounds like the “ADD” blend mode:

Or you can also do it manually like this:
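For grayscale pixels, the manual version of ADD might look roughly like this (a sketch in plain Java, not the original linked code): sum the two values and clamp at 255, i.e. saturate at pure white instead of rescaling.

```java
import java.util.Arrays;

public class AddBlend {
    // Hand-rolled equivalent of the ADD blend mode for grayscale pixels:
    // sum the values and clamp at 255 (pure white).
    static int[] addClamped(int[] a, int[] b) {
        int[] out = new int[a.length];
        for (int i = 0; i < a.length; i++)
            out[i] = Math.min(255, a[i] + b[i]);
        return out;
    }

    public static void main(String[] args) {
        int[] r = addClamped(new int[] {0, 100, 200}, new int[] {0, 100, 200});
        System.out.println(Arrays.toString(r)); // [0, 200, 255]
    }
}
```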


Thank you very much, great ideas for solving this. Getting closer. The demo for adding up images is nice.

It is still tricky, as the normalization seems to be the problem.
With normalization, the rarely used spots (light gray) get darker. That is correct, but the heatmap becomes unusable.
If I turn off the normalization, the blurriness of the blobs gets lost, as it adds up too quickly.
So I guess the right algorithm must contain some threshold value.
Like if a blob appears only once in a spot, it gets fully recorded; if it appears more often, it should add up slowly, depending on how often it appears?
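One possible way to get this "add up slowly" behaviour is a compressive mapping of the raw sum, for example a logarithm: a single hit stays clearly visible, while repeated hits grow ever more slowly. This is just one candidate approach, sketched here, not necessarily the right algorithm:

```java
public class LogScale {
    // Compress a raw accumulated sum: value -> 255 * log(1 + v) / log(1 + max).
    // A single hit stays bright; repeated hits add up slowly.
    static int compress(long v, long max) {
        return (int) Math.round(255 * Math.log1p(v) / Math.log1p(max));
    }

    public static void main(String[] args) {
        long max = 10 * 255;                     // e.g. 10 fully overlapping white blobs
        System.out.println(compress(0, max));    // black stays black
        System.out.println(compress(255, max));  // a single hit stays fairly bright
        System.out.println(compress(max, max));  // the hottest spot maps to 255
    }
}
```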

When you have the normalized sum you can do what you want with the data.

For example, you can implement a simple S-curve to change the values in a non-linear way.
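A minimal S-curve on the normalized value could be the smoothstep polynomial; this is just one common choice, not necessarily what the linked example uses:

```java
public class SCurve {
    // Smoothstep: a simple S-curve on a normalized value t in [0, 1].
    // Mid-tones keep their slope while the extremes are flattened.
    static double smoothstep(double t) {
        return t * t * (3 - 2 * t);
    }

    public static void main(String[] args) {
        System.out.println(smoothstep(0.0));  // 0.0
        System.out.println(smoothstep(0.5));  // 0.5
        System.out.println(smoothstep(1.0));  // 1.0
        System.out.println(smoothstep(0.25)); // 0.15625 - dark values pushed down
    }
}
```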

I did an example here:

It is a bit slow on OpenProcessing but should run smoothly on your computer.


Wow, this is awesome. This is really cool. I haven't fully understood it, but this is great.
I guess I will add a controllable blur depending on the number of images, and then the heatmaps will be perfect.
So much better than doing it manually in Gimp and Photoshop. Processing has won again. :slight_smile:

Many thanks, I was so stuck. Superb!