How to combine a point shader with a blur shader?

I would like to display thousands of points on a 3D canvas with a Depth of Field effect. More specifically, I would like to use a z-buffer (depth buffering) to adjust the level of blur of a point based on its distance from the camera.

So far, I have come up with the following point shaders (pointfrag.glsl and pointvert.glsl):


pointfrag.glsl:

#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif

varying vec4 vertColor;
uniform float maxDepth;

void main() {
  float depth = gl_FragCoord.z / gl_FragCoord.w;
  gl_FragColor = vec4(vec3(vertColor) - depth / maxDepth, 1.0);
}

pointvert.glsl:

uniform mat4 projection;
uniform mat4 modelview;

attribute vec4 position;
attribute vec4 color;
attribute vec2 offset;

varying vec4 vertColor;
varying vec4 vertTexCoord;

void main() {
  vec4 pos = modelview * position;
  vec4 clip = projection * pos;

  gl_Position = clip + projection * vec4(offset, 0, 0);

  vertColor = color;
}

I also have a blur shader (originally from the PostFX library):


#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif

uniform sampler2D texture;

// The inverse of the texture dimensions along X and Y
uniform vec2 texOffset;

varying vec4 vertColor;
varying vec4 vertTexCoord;

uniform int blurSize;
uniform int horizontalPass; // 0 or 1 to indicate vertical or horizontal pass
uniform float sigma;        // The sigma value for the gaussian function: higher value means more blur
                            // A good value for 9x9 is around 3 to 5
                            // A good value for 7x7 is around 2.5 to 4
                            // A good value for 5x5 is around 2 to 3.5
                            // ... play around with this based on what you need :)

const float pi = 3.14159265;

void main() {
  float numBlurPixelsPerSide = float(blurSize / 2);
  vec2 blurMultiplyVec = 0 < horizontalPass ? vec2(1.0, 0.0) : vec2(0.0, 1.0);

  // Incremental Gaussian Coefficient Calculation (See GPU Gems 3 pp. 877 - 889)
  vec3 incrementalGaussian;
  incrementalGaussian.x = 1.0 / (sqrt(2.0 * pi) * sigma);
  incrementalGaussian.y = exp(-0.5 / (sigma * sigma));
  incrementalGaussian.z = incrementalGaussian.y * incrementalGaussian.y;

  vec4 avgValue = vec4(0.0, 0.0, 0.0, 0.0);
  float coefficientSum = 0.0;

  // Take the central sample first...
  avgValue += texture2D(texture, vertTexCoord.st) * incrementalGaussian.x;
  coefficientSum += incrementalGaussian.x;
  incrementalGaussian.xy *= incrementalGaussian.yz;

  // Go through the remaining samples (blurSize / 2 on each side of the center)
  for (float i = 1.0; i <= numBlurPixelsPerSide; i++) {
    avgValue += texture2D(texture, vertTexCoord.st - i * texOffset *
                          blurMultiplyVec) * incrementalGaussian.x;
    avgValue += texture2D(texture, vertTexCoord.st + i * texOffset *
                          blurMultiplyVec) * incrementalGaussian.x;
    coefficientSum += 2.0 * incrementalGaussian.x;
    incrementalGaussian.xy *= incrementalGaussian.yz;
  }

  gl_FragColor = avgValue / coefficientSum;
}

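For context, the incremental scheme in that loop is a recurrence for the Gaussian weights: it starts from 1/(sqrt(2*pi)*sigma) and obtains each successive weight by multiplying by a running ratio, which avoids calling exp() once per sample. A quick standalone check (plain Java, nothing shader-specific) confirms that the recurrence reproduces the direct Gaussian formula:

```java
public class IncrementalGaussian {
    public static void main(String[] args) {
        double sigma = 3.0;
        // Incremental state, mirroring the shader's vec3:
        // x = current weight, y = per-step ratio, z = ratio of ratios.
        double x = 1.0 / (Math.sqrt(2.0 * Math.PI) * sigma);
        double y = Math.exp(-0.5 / (sigma * sigma));
        double z = y * y;
        for (int i = 0; i <= 4; i++) {
            // Direct evaluation of the Gaussian weight at pixel offset i
            double direct = Math.exp(-(i * i) / (2.0 * sigma * sigma))
                          / (Math.sqrt(2.0 * Math.PI) * sigma);
            System.out.printf("i=%d incremental=%.6f direct=%.6f%n", i, x, direct);
            // Advance the recurrence, as incrementalGaussian.xy *= incrementalGaussian.yz does
            x *= y;
            y *= z;
        }
    }
}
```

The two columns agree, which is why the shader can accumulate `coefficientSum` on the fly and normalize at the end.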

How can I combine the blur fragment shader with the point fragment shader?

Ideally I’d like to have one single fragment shader that computes the level of blur based on the z-coordinate of a point. Is that even possible?

Any help would be greatly appreciated.

An example sketch displaying points using the pointfrag.glsl and pointvert.glsl shaders above:


import peasy.*;

PeasyCam cam;
PShader pointShader;
PShape shp;
ArrayList<PVector> vectors = new ArrayList<PVector>();

void setup() {
  size(900, 900, P3D);
  cam = new PeasyCam(this, 500);
  perspective(60 * DEG_TO_RAD, width/float(height), 2, 6000);
  double d = cam.getDistance() * 3;
  pointShader = loadShader("pointfrag.glsl", "pointvert.glsl");
  pointShader.set("maxDepth", (float) d);

  for (int i = 0; i < 5000; i++) {
    vectors.add(new PVector(random(width), random(width), random(width)));
  }

  shader(pointShader, POINTS);
  shp = createShape();
  shp.beginShape(POINTS);
  for (PVector v : vectors) {
    shp.vertex(v.x, v.y, v.z);
  }
  shp.endShape();
  shp.translate(-width/2, -width/2, -width/2);
}

void draw() {
  shape(shp, 0, 0);
}

My guess would be to combine the two shaders so that sigma = depth.
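One caveat with using the depth directly: the blur shader divides by sigma (in `incrementalGaussian.x`), so sigma must stay strictly positive, and points at depth zero would otherwise break the weights. The depth would therefore need to be remapped into a safe range first. A plain-Java sketch of one possible remapping (the function name and constants are illustrative, not from any library):

```java
public class DepthToSigma {
    // Map a normalized depth in [0, 1] to a usable sigma.
    // minSigma keeps the Gaussian weights finite for the nearest points;
    // maxSigma caps the blur for the most distant ones.
    static float depthToSigma(float normDepth, float minSigma, float maxSigma) {
        float d = Math.max(0f, Math.min(1f, normDepth));
        return minSigma + d * (maxSigma - minSigma);
    }

    public static void main(String[] args) {
        System.out.println(depthToSigma(0.0f, 0.5f, 5.0f)); // nearest point -> 0.5
        System.out.println(depthToSigma(1.0f, 0.5f, 5.0f)); // farthest point -> 5.0
    }
}
```

The same linear mapping would translate directly into a couple of lines of GLSL inside a merged fragment shader.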

I’m not sure how to do what you are asking, but I thought of a different approach: have a long texture with a bunch of points side by side with different amounts of blur applied to them. Then you use the depth value to choose where to sample that texture from, so you get more or less blur based on z value. This might be the fastest way, since the blur is computed in advance, and you could use a beautiful but slow-to-compute blur.
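To make that concrete, here is a rough sketch (plain Java; `tileRange` and the tile count are made-up names for illustration, not Processing API) of how a normalized depth could select a tile from such a pre-blurred strip:

```java
import java.util.Arrays;

public class BlurStrip {
    // Given a normalized depth in [0, 1] and the number of pre-blurred
    // tiles laid out side by side in the strip, return the horizontal
    // texture-coordinate range [u0, u1] of the tile to sample.
    static double[] tileRange(double normDepth, int numTiles) {
        double d = Math.max(0.0, Math.min(1.0, normDepth));
        int tile = Math.min((int) (d * numTiles), numTiles - 1);
        double u0 = (double) tile / numTiles;
        double u1 = (double) (tile + 1) / numTiles;
        return new double[] { u0, u1 };
    }

    public static void main(String[] args) {
        // A point near the camera samples a sharp tile...
        System.out.println(Arrays.toString(tileRange(0.1, 8))); // [0.0, 0.125]
        // ...while a distant point samples a heavily blurred one.
        System.out.println(Arrays.toString(tileRange(0.9, 8))); // [0.875, 1.0]
    }
}
```

In a shader this lookup would just be an offset added to the sprite's texture coordinate, so the per-fragment cost is a single texture fetch.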
I should try this myself one of these days :)
