@mrjj Sorry for not mentioning this before: I already tried a Gaussian blur, and it didn't help.
The data I have is in the form (xCoord, yCoord, zCoord, value).
For the first phase I am ignoring zCoord.
The problem comes from the granularity of the sample data, which looks like:
(0,0),    (0,100),    …, (0,1500)
(100,0),  (100,100),  …, (100,1500)
…
(1500,0), (1500,100), …, (1500,1500)
I have a canvas of 1500x1500
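For reference, interpolating between the four surrounding sample points (rather than blurring the rendered image) is usually what removes this kind of boxiness. A minimal bilinear-interpolation sketch, assuming samples every 100 px on the 1500×1500 canvas; the function name and the dict-based storage are my own illustration, not the actual data structure:

```python
def bilerp(samples, x, y, step=100):
    """Bilinearly interpolate the value at pixel (x, y) from a sparse
    grid of samples taken every `step` pixels.

    `samples[(gx, gy)]` holds the measured value at pixel
    (gx*step, gy*step). Assumes (x, y) lies strictly inside the grid,
    i.e. x and y in [0, 1500) for a 16x16 sample grid."""
    gx, gy = x // step, y // step                   # cell indices
    fx, fy = (x % step) / step, (y % step) / step   # position inside cell
    # Values at the four sample points surrounding (x, y)
    v00 = samples[(gx, gy)]
    v10 = samples[(gx + 1, gy)]
    v01 = samples[(gx, gy + 1)]
    v11 = samples[(gx + 1, gy + 1)]
    # Interpolate along x on both edges, then along y between them
    top = v00 * (1 - fx) + v10 * fx
    bottom = v01 * (1 - fx) + v11 * fx
    return top * (1 - fy) + bottom * fy
```

Evaluating this per pixel gives smooth gradients between samples while keeping the measured values exact at the sample points, so no sharpness is lost.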
The image before the Gaussian blur (the border is part of a layer above the image, so it isn't going anywhere soon):
[image: 564300eb-b6df-46e3-a59c-cbf0e365343a.png]
After Gaussian blurring, plus some interpolation on my part (the data is slightly different in the two cases, and I did this in a separate application):
[image: ce602226-98b9-4337-a5ed-e1972bdd128e.png]
As you can see, the output still has some boxiness to it, and blurring like this just removes the sharpness of the image [it is eventually meant to visualize input data, not to be a sunset photo; not to mention I am going too far with the blurring here, which makes repaints really heavy].
EDIT:
Forgot to mention: I am mapping value to a continuous hue change here. The proposal, however, seems to assume a limited number of colors, which would make the output even blockier and the blurring even less useful.
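For what it's worth, a continuous value-to-hue mapping of the kind described can be sketched like this (the blue-to-red range and the function name are my assumptions, not the actual mapping used):

```python
import colorsys

def value_to_rgb(value, vmin, vmax):
    """Map a data value to a continuous hue ramp: blue at vmin
    through green and yellow to red at vmax, at full saturation
    and brightness."""
    t = (value - vmin) / (vmax - vmin)   # normalize value to [0, 1]
    hue = (1.0 - t) * 2.0 / 3.0          # hue 2/3 (blue) down to 0 (red)
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)
```

Because the hue varies continuously with the value, interpolated values between sample points get intermediate colors automatically, instead of snapping to the nearest band.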
EDIT 2:
I eventually used a 2D fragment shader in OpenGL.
Latest output:
[image: 9f0a0667-bd19-4e0e-aaeb-51113027e7da.png]
It seems the most promising one yet.