Discrete Vectors

Dear all,

I’ve recently stumbled across a series of rendered drawings made with Grasshopper and got particularly interested in this one.

It is made from a combination of components called “Discrete Vectors” (part of the Pufferfish plugin) whose function is to compute “discrete vectors out of continuous or random vector sets”.

I’m not familiar at all with the whole Grasshopper environment/terminology and was hoping that you could help me find an approach to recreate the algorithm used in the picture above. So far the only pieces of information I could find are the descriptions provided on the components’ official page.

In short:

  • the visual complexity of the shape comes from the swarming behavior of agents
  • the “discrete vectors” components are there to straighten the originally curved paths of these agents
  • this straightening is achieved by limiting each agent to a range of possible movements


QUESTIONS:

1/ How can the agents draw such homogeneous and uniform curves in the first place? (linear interpolation? agent-driven splines?)

2/ How would you approach the computation of these “discrete vectors”?

2 Likes

This is interesting. It isn’t totally clear whether the path data should be generated using discrete vectors, or whether the agents could swarm in free space as curves and the points along their paths could then be snapped to a grid before being drawn. Those would produce related but different results: if every agent shared the same grid, snapping would collapse nearby paths together. If every agent was on a slightly different grid, however, then I suspect (?) the results would be more visually similar.

Here is an example of a simple Path object that can render its point list either as a curve or as “discrete paths” on a grid of arbitrary spacing. For the snap behavior it subtracts the modulo remainder (%) from each coordinate, which acts like floor() rather than round().

class Path {
  float[][] points;

  Path() {
  }

  Path(float[][] points) {
    this.points = points;
  }

  // Fill the path with `count` random points inside a cube of half-size `dim`.
  void populate(int count, float dim) {
    points = new float[count][];
    for (int i=0; i<count; i++) {
      points[i] = new float[]{random(-dim, dim), random(-dim, dim), random(-dim, dim)};
    }
  }

  // mode 0: smooth curve through the points; mode 1: points snapped to a grid of spacing `grid`.
  void render(int mode, float grid) {
    beginShape();
    switch(mode) {
    case 0:
      for (int i=0; i<points.length; i++) {
        curveVertex(points[i][0], points[i][1], points[i][2]);
      }
      break;
    case 1:
      for (int i=0; i<points.length; i++) {
        // subtracting the % remainder snaps each coordinate down to the grid
        vertex(points[i][0] - points[i][0]%grid, 
          points[i][1] - points[i][1]%grid, 
          points[i][2] - points[i][2]%grid);
      }
      break;
    }
    endShape();
  }
}

3 Likes

It’s possible it was achieved via a vector field sampled with Perlin noise, like this sketch by yasai:

https://wangyasai.github.io/Perlin-Noise/

In it, each particle uses the perlin noise to determine the direction of flow.

2 Likes

Thanks @tony. @WakeMeAtThree suggested the same approach yesterday (via pm).

I’m on it. Will let you know the outcome when I’m finished.

That’s exactly how I did these images:

Our lives are full of micro-decisions: the color of the socks we wear. The words we write or pronounce. The route to school. Some of them will lead us to very different futures, but in some cases we...

void calcRoute() {
  vertices.clear();
  int iterations = 0;
  do {
    vertices.add(pos.copy());
    // noise-perturbed heading around a base direction
    float tdir = (baseDir + 5 * noise(pos.x * 0.01, pos.y * 0.01, frameCount * 0.003));
    // quantize that heading to one of `angles` allowed directions
    float qdir = baseDir + (TAU / angles) * (int)(degrees(tdir)/(360/angles));
    // ease toward the quantized direction instead of snapping to it
    dir = lerp(dir, qdir, 0.3);
    pos.add(speed*cos(dir), speed*sin(dir));
  } while (pos.dist(center) < width*maxDist && iterations++ < 1000);
}

I used lerp so it’s not hard-quantized to N directions; it turns towards one of those N directions smoothly. The same should work in 3D.

6 Likes

This is the heart of the problem.

If we look at the example pictures above, we can clearly see that the vectors swarm in the same direction while each follows its own path.

If we place the agents on different grids, the paths differ but we lose the coherent behavior.
Conversely, with a flowfield-based approach coherence is preserved but the agents inevitably converge toward a single common path.

I’ve tried several things to keep nearby agents separated:

  • adding a repulsive force when they get too close to each other
  • changing the angle when they get too close to each other
  • changing the noise value of the grid cell when they get too close to each other

but none of those ideas worked well.

So the question is: how can I keep nearby agents apart while preserving each one’s own path?

I’m even starting to wonder whether the flowfield approach is appropriate at all in this case.

What do you think ?

4 Likes

Maybe try assigning each particle a different z-value? With a small enough scale, the Perlin noise will diverge, but not too radically…
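
A minimal sketch of this per-particle z-offset idea (Python mode); the class name, the zoff range and the 0.05 scale are illustrative assumptions to tune, not values from the thread. Every particle reads the same x/y noise field, but on its own slightly shifted z slice, so paths stay similar without being identical.

class NoisyParticle(object):
    def __init__(self):
        self.location = PVector(random(width), random(height))
        self.zoff = random(10)   # personal slice of the 3D noise space

    def flow_angle(self, factor=0.01):
        n = noise(self.location.x * factor,
                  self.location.y * factor,
                  self.zoff * 0.05)          # small z scale -> mild divergence
        return map(n, 0, 1, 0, TWO_PI)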

Thanks @tony, I tried that already but it doesn’t work. I still end up with all the particles sharing the same path.

@solub – the test you are running (with results consolidating to a line) looks as if it would produce the same result even if you weren’t using discrete vectors. That is, the agents are increasing or decreasing y based on the offset to the target, and they never overshoot and never delay in responding – so of course they all end up in a line, whether they are using 8 directions or 360. I don’t think that behavior is affected by whether the vectors are discrete or not. Am I right?

In order to get dispersion while chasing a distant target you would need imperfect agents – ones who wander or have a lag in response, or who stop converging when they get “close enough”, et cetera. Or else use flocking by perfect (or slightly imperfect) agents – they aren’t all following the same thing, they are following each other.
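
A hedged sketch of that “imperfect agent” idea (Python mode): each agent gets a personal wander offset and a personal turning lag, so a group chasing the same quantized direction drifts apart instead of collapsing onto one line. wander_seed, lag and the noise scales are illustrative assumptions, not values from the thread.

class ImperfectAgent(object):
    def __init__(self, x, y):
        self.pos = PVector(x, y)
        self.dir = random(TWO_PI)
        self.wander_seed = random(1000)   # de-correlates agents from each other
        self.lag = random(0.05, 0.2)      # how sluggishly this agent turns

    def step(self, target_dir, speed=2):
        # slowly varying per-agent angular offset
        wander = map(noise(self.wander_seed + frameCount * 0.01), 0, 1, -0.5, 0.5)
        # lagged turn toward the (possibly quantized) target direction;
        # like the calcRoute() example above, lerp on raw angles ignores wrap-around
        self.dir = lerp(self.dir, target_dir + wander, self.lag)
        self.pos.add(speed * cos(self.dir), speed * sin(self.dir))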

Perhaps you can try to create a 3D flowfield using Perlin noise, so that each layer of the field is similar to the others yet slightly different? I’m guessing this would prevent all agents from ending up in the same ‘highway’.

I think this is related to what @tony was suggesting with the z-value on Perlin noise – the Perlin values are 3D, so they should be able to serve as the flowfield. However, setting the right sampling distance (0.001, 0.05, 0.01 etc.) for the values to be similar-yet-different can be difficult – Perlin noise might just not produce the right kind of values, or might be sampled at the wrong scale.
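
One quick, hedged way to gauge a candidate sampling scale (Python mode; slice_difference and its defaults are hypothetical, not from the thread): average the absolute difference between the z = 0 noise slice and a slice shifted by z_offset, over random sample points. Values near 0 mean near-identical fields, larger values mean the slices are essentially unrelated.

def slice_difference(scale, z_offset, samples=200):
    total = 0
    for i in range(samples):
        x, y = random(width) * scale, random(height) * scale
        total += abs(noise(x, y, 0) - noise(x, y, z_offset))
    return total / samples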

Yes, you’re right. And this is totally expected since I’m using a 2D flowfield.

I agree, and that’s the type of behavior I’ve been trying to implement (cf. my previous post). But how exactly would you stop the convergence of nearby agents? As I said before, I failed to keep them apart. To be more specific, I’m unable to change the angle of nearby agents while preserving that straight-line aesthetic. I believe the separate() function in the script below is flawed.

def separate(self, particles):
    desiredseparation = 25.0
        
    x = floor(self.location.x / xstep)
    y = floor(self.location.y / ystep)
        
    # Detection of nearest points
    for p in particles:
        px = floor(p.location.x / xstep)
        py = floor(p.location.y / ystep)
            
        # If other agent in same grid cell
        if px == x and py == y and self.location != p.location:
                
            # distance to nearest agent
            d = PVector.dist(self.location, p.location)
                
            # if distance equal to or below threshold -> change angle
            if d > 0 and d <= desiredseparation:
                    
                new_angle = PVector.fromAngle(QUARTER_PI).setMag(2.5)
                p.applyForce(new_angle)
Full script
W, H = 400, 400

factor = .015
scl = 30
ystep = int(H / scl)
xstep = int(W / scl)
flowfield = [[[] for x in range(scl+1)] for y in range(scl+1)]

da = [QUARTER_PI, HALF_PI, PI - QUARTER_PI, PI, PI + QUARTER_PI, PI + HALF_PI, TWO_PI - QUARTER_PI, TWO_PI]

def setup():
    global particles, flowfield
    size(W, H, P2D)
    background(245)
    smooth(8)
    
    particles = [Particle() for i in range(32)]
    
    for iy, y in enumerate(range(0, height+1, ystep)):
        for ix, x in enumerate(range(0, width+1, xstep)):
            n = int(round(map(noise(x * factor, y * factor), 0, 1, 0, 7)))
            flowfield[ix][iy] = n

        

        
def draw():
                
    for p in particles:
        p.follow(flowfield)
        #p.separate(particles)
        p.update()
        p.edges()
        p.render()



class Particle(object):
    def __init__(self):
        self.location = PVector(random(20, width/2), random(20, height - 20))
        self.velocity = PVector(0, 0)
        self.acceleration = PVector(0, 0)
        self.maxspeed = 1
        self.angle = PVector()

        
    def update(self):
        self.velocity.add(self.acceleration).limit(self.maxspeed)
        self.location.add(self.velocity)
        self.acceleration.mult(0)
        
    def applyForce(self, force):
        self.velocity.sub(force)
        
        
    def separate(self, particles):
        desiredseparation = 25.0
        
        x = floor(self.location.x / xstep)
        y = floor(self.location.y / ystep)
        
        # Detection of nearest points
        for p in particles:
            px = floor(p.location.x / xstep)
            py = floor(p.location.y / ystep)
            
            # If other agent in same grid cell
            if px == x and py == y and self.location != p.location:
                
                # distance to nearest agent
                d = PVector.dist(self.location, p.location)
                
                # if distance equal to or below threshold -> change angle
                if d > 0 and d <= desiredseparation:
                    
                    new_angle = PVector.fromAngle(QUARTER_PI).setMag(2.5)
                    p.applyForce(new_angle)

            
        
    def follow(self, vectors):
        x = floor(self.location.x / xstep)
        y = floor(self.location.y / ystep)
        i = vectors[x][y]     
            
        if self.location.y < height - 20 or self.location.y > 20:
            self.dvector = PVector.fromAngle(da[i]).setMag(2.5)
            self.applyForce(self.dvector)
                
        
    def render(self):
        strokeWeight(1.4)
        stroke(0)
        point(self.location.x, self.location.y)
        if frameCount == 1:
            fill(255)
            strokeWeight(1.2)
            ellipseMode(CENTER)
            ellipse(self.location.x - 1, self.location.y, 6, 6)
        
        
    def edges(self):
        if self.location.x > width - 20:
            fill(0)
            noStroke()
            rectMode(CENTER)
            rect(self.location.x, self.location.y, 8, 16)
            del particles[particles.index(self)]
        if self.location.y >= height - 20 or self.location.y <= 20:
            fill(0)
            ellipse(self.location.x, self.location.y, 6, 6)
            del particles[particles.index(self)]
        if len(particles) == 0: noLoop()
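
For reference, a hedged alternative to the separate() above, closer to Reynolds’ classic separation steer: rather than applying one fixed QUARTER_PI force, each neighbour within range pushes the agent away along the line between them (weighted by 1/distance), and the averaged result is added to the acceleration that update() already consumes. desired and strength are illustrative values to tune so that the flowfield still dominates and the straight-line look survives; this is a sketch, not the thread’s solution.

def separate(self, particles, desired=25.0, strength=0.5):
    steer = PVector()
    count = 0
    for p in particles:
        d = PVector.dist(self.location, p.location)
        if p is not self and 0 < d < desired:
            diff = PVector.sub(self.location, p.location)
            diff.normalize()
            diff.div(d)              # closer neighbours push harder
            steer.add(diff)
            count += 1
    if count:
        steer.div(count)
        steer.setMag(strength)       # keep it weak relative to the flowfield force
        self.acceleration.add(steer)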

1 Like

Thanks @makoho,

Implementing a 3D flowfield does indeed seem like the right solution. However, that suggestion is not without problems.

  • 1/ The example pictures above show vectors rotating around a center, not flowing along an extended path. I could put “walls” around the bounding box (to prevent the vectors from flowing outside the edges) and try to divert them toward the center when they reach an edge. But with noise-based angles pushing them right back against the walls again… the end result would be a mess.

A solution could be to mimic a vortex motion with the sine and cosine of the noise-based angles, just like in the following video (from 1:15 to 1:25):

  • 2/ I tried to reproduce the effect with dir = PVector(cos(n), sin(n), abs(sin(n))) as suggested in the video, but it looks somewhat different (last picture below). I’m not even sure I’m implementing the 3D flowfield correctly.
for iz, z in enumerate(range(0, D, zstep)):
    for iy, y in enumerate(range(0, H, ystep)):
        for ix, x in enumerate(range(0, W, xstep)):
            n = map(noise(x * factor, y * factor, z * factor), 0, 1, 0, PI)
            flowfield[ix][iy][iz] = n
Full script
#add_library('PostFX')
add_library('peasycam')

W, H, D = 600, 1200, 700

factor = .002
scl = 10
ystep = int(H / scl)
xstep = int(W / scl)
zstep = int(D / scl)

flowfield = [[[[] for x in range(scl)] for y in range(scl)] for z in range(scl)]


def setup():
    global particles, flowfield, cam, fx, bbox, boxes
    size(1400, 900, P3D)
    hint(DISABLE_DEPTH_TEST)
    frameRate(1000)
    background(20)
    smooth(8)
    
    fx = PostFX(this)
    cam = PeasyCam(this, 1500)
    

    
    particles = [Particle() for i in range(260)]
    
    bbox = createShape(BOX, W, H, D)
    bbox.setFill(color(2, 170, 255, 0))
    bbox.setStroke(color(255, 100))
    bbox.setStrokeWeight(.5)
    
    boxes = createShape(GROUP)
    
    for iz, z in enumerate(range(0, D, zstep)):
        for iy, y in enumerate(range(0, H, ystep)):
            for ix, x in enumerate(range(0, W, xstep)):
                a = map(noise(x * factor, y * factor, z * factor), 0, 1, 0, PI)
                dvector = PVector(cos(a), sin(a), cos(a)).setMag(10)
                flowfield[ix][iy][iz] = dvector
    
                nbox = createShape(BOX, xstep, ystep, zstep)
                nbox.setFill(color(1, 130, 255, 1))
                nbox.setStroke(color(255, 1))
                nbox.setStrokeWeight(.4)
                nbox.translate(x-(W/2)+(xstep/2), y-(H/2)+(ystep/2), z-(D/2)+(zstep/2))
                boxes.addChild(nbox)
                
        
def draw():
    background(0)
            
    for p in particles:
        p.follow()
        p.update()
        p.edges()
        p.render()

    shape(boxes)
    shape(bbox)
        
        

    #fx.render().bloom(0.01, 10, 90).compose()
    #fx.render().saturationVibrance(0.5, .5).compose()
    #hint(DISABLE_DEPTH_TEST)
    
    cam.beginHUD()
    fill(255)
    textSize(14)
    text(frameRate, 30, height - 40)
    cam.endHUD()
        
        
 
 
class Particle(object):
    def __init__(self):
        self.location = PVector(random(W)-(W/2), random(H)-(H/2), random(D)-(D/2))
        self.velocity = PVector()
        self.acceleration = PVector()
        self.maxspeed = 14
        self.loclist = []
        self.type = random(1)
    
    def update(self):
        self.velocity.add(self.acceleration).limit(self.maxspeed)
        self.location.add(self.velocity)
        self.acceleration.mult(0)
        self.loclist.append(PVector(self.location.x, self.location.y, self.location.z))

        
    def follow(self):
        x = floor((self.location.x + (W/2)) / xstep)
        y = floor((self.location.y + (H/2))  / ystep)
        z = floor((self.location.z + (D/2))  / zstep)
        
        if x < scl and y < scl and z < scl:
                
            dvector = flowfield[x][y][z]
            self.velocity.add(dvector)
        
    def updateLast(self):
        for i in range(len(self.loclist)-1):
            self.loclist[i+1].x = self.location.x
            self.loclist[i+1].y = self.location.y
            self.loclist[i+1].z = self.location.z
        
        
    def render(self):
    
        if self.type > .9:
            if len(self.loclist) > 60:
                del self.loclist[0]
            for i in range(len(self.loclist) - 1):
                c = lerpColor(color(230, 5, 106), color(255, 225, 155), .8 - norm(i, 0, len(self.loclist) - 1))
                stroke(c, i*10)
                strokeWeight(1 - map(i, 0, len(self.loclist)-1, 1, 0))
                line(self.loclist[i].x, self.loclist[i].y, self.loclist[i].z, self.loclist[i+1].x, self.loclist[i+1].y, self.loclist[i+1].z)
        else:
            strokeWeight(3)
            stroke(0, 255, 253)
            point(self.location.x, self.location.y, self.location.z)

    
    def edges(self):
        
        if self.location.x > W/2: 
            self.location.x = 0 - W/2
            self.updateLast()

        if self.location.x < -W/2: 
            self.location.x = W/2
            self.updateLast()

        if self.location.y > H/2: 
            self.location.y = -H/2
            self.updateLast()

        if self.location.y < -H/2: 
            self.location.y = H/2
            self.updateLast()

        if self.location.z > D/2: 
            self.location.z = -D/2
            self.updateLast()

        if self.location.z < -D/2: 
            self.location.z = D/2
            self.updateLast()



dir = PVector(cos(n), sin(n), cos(n))

dir = PVector(cos(n), sin(n), abs(sin(n)))

  • 3/ How can I create a discrete vector in 3D? In 2D I was mapping the noise-based angle to an integer index between 0 and 7, each index corresponding to a specific discrete angle.

      angle = vectors[x][y]
      index = int(round(map(angle, 0, TWO_PI, 0, 7)))

      discrete_angles = [QUARTER_PI, HALF_PI, PI - QUARTER_PI,
                         PI, PI + QUARTER_PI, PI + HALF_PI,
                         TWO_PI - QUARTER_PI, TWO_PI]

      discrete_vector = PVector.fromAngle(discrete_angles[index]).setMag(2.5)
      self.velocity.add(discrete_vector)
    

    In 3D, however, PVector.fromAngle() and heading() can’t be used, so I’m unable to convert my 3D PVector back to a specific angle.

Any guidance on that matter would be greatly appreciated.
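
For reference, one hedged way to build a discrete vector in 3D without PVector.fromAngle(): describe the direction with two spherical angles (azimuth + inclination), snap each angle to a multiple of QUARTER_PI, then rebuild the vector with cos/sin. The second noise sample’s offset (+100) and the factor/mag defaults below are illustrative assumptions, not values from the thread.

def discrete_vector_3d(x, y, z, factor=0.002, mag=10, step=QUARTER_PI):
    # two de-correlated noise samples drive the two spherical angles
    azimuth = map(noise(x * factor, y * factor, z * factor), 0, 1, 0, TWO_PI)
    inclination = map(noise(x * factor + 100, y * factor + 100, z * factor + 100), 0, 1, 0, PI)
    # snap both angles to the nearest allowed direction
    azimuth = round(azimuth / step) * step
    inclination = round(inclination / step) * step
    # spherical -> cartesian
    return PVector(sin(inclination) * cos(azimuth),
                   sin(inclination) * sin(azimuth),
                   cos(inclination)).setMag(mag)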

1 Like

Just answering my 3rd question (how to compute noise-based 3D discrete vectors):

for iz, z in enumerate(range(0, D, zstep)):
    for iy, y in enumerate(range(0, H, ystep)):
        for ix, x in enumerate(range(0, W, xstep)):

            # noise-based angle
            angle = map(noise(x * factor, y * factor, z * factor), 0, 1, HALF_PI, TWO_PI - HALF_PI)

            # indices
            icos = int(round(map(cos(angle), -1, 1, 0, 8)))
            isin = int(round(map(sin(angle), -1, 1, 0, 8)))
            iabsin = int(round(map(abs(sin(angle)), -1, 1, 0, 8)))

            # list of discrete component values
            dA = [-1, -.75, -.5, -.25, 0, .25, .5, .75, 1]

            # discrete vector
            dvector = PVector(dA[icos], dA[isin], dA[iabsin]).setMag(10)
            flowfield[ix][iy][iz] = dvector

I’m not satisfied with the result though:

  • the vectors’ motion lacks coherence
  • this noise-based approach means I don’t have any control over the vectors’ paths
2 Likes

@solub Well, even though your results at this moment don’t look like what you set out to create, they’re still gorgeous.

You are right though, the top picture shows a kind of vortex rather than a random flowfield.

An approach I would try is to create a single layer with a single agent and not bother with 3D just yet. If that works as expected, I would scale up to multiple agents, and then repeat that layer with slightly different parameters and render it on top of the existing layers. Once that works, you can start to refactor the code to generate and draw it all in one go. I’m afraid I don’t have any technical help for you at this moment regarding flowfields…

Keep evolving your solutions; you may produce something along the way that you like better than what you set out to do. I don’t know if this is in any way helpful to you, but I hope so!

1 Like

@makoho – Thanks for your feedback and apologies for the late reply. Ultimately, I decided to opt for something simpler: swarming vectors in 3D with 2D noise only. The result looks very similar to what’s shown in the third picture.

Example Script
add_library('peasycam')

n_line, n_point = 20, 300
factor = .05
offset = 1
da = [HALF_PI, QUARTER_PI, TWO_PI, -QUARTER_PI, - HALF_PI]
pts = [[] for i in range(n_line)]

def setup():
    size(1000, 800, P3D)
    strokeWeight(2)
    smooth(8)
    
    cam = PeasyCam(this, 1200)
    
    py = 0
    for y in range(n_line):
        px = 0
        for x in range(n_point):
          n = int(round(map(noise(x * factor + offset, y * factor + offset), 0, 1, -1, 5)))
          d = PVector.fromAngle(da[n]).setMag(3)
          px += d.x     
          py += d.y  
          
          if py > 60 or py < 0: py -= d.y
          
          pts[y].append(PVector(px, y * 7, py))
        py = 0
        
def draw():
    background(255)
    
    for l in pts:
        for i in range(len(l)-1):
            line(l[i].x, l[i].y, l[i].z, l[i+1].x, l[i+1].y, l[i+1].z)

Regarding the second picture (and following the same logic), I’m wondering if making particles follow a discrete path around a (slightly modified) Lorenz attractor might not be a better solution than a noise-based 3D vector field… I think the outputs would be kind of similar.
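
For what it’s worth, a hedged sketch of that idea (Python mode): Euler-step along the Lorenz velocity, but snap the normalized direction to a small set of component values first, so each segment points in one of a limited number of directions. The names and parameters (sigma, rho, beta, step_len, the component set) are the textbook values / illustrative assumptions rather than something from the thread, and the quantization means the path only loosely follows the attractor.

def lorenz_velocity(p, sigma=10.0, rho=28.0, beta=8.0 / 3):
    # classic Lorenz derivatives at point p
    return PVector(sigma * (p.y - p.x),
                   p.x * (rho - p.z) - p.y,
                   p.x * p.y - beta * p.z)

def snap(c, levels=(-1, -0.5, 0, 0.5, 1)):
    # nearest allowed component value
    return min(levels, key=lambda l: abs(l - c))

def discrete_lorenz_step(p, step_len=0.5):
    v = lorenz_velocity(p)
    v.normalize()
    q = PVector(snap(v.x), snap(v.y), snap(v.z))   # quantized direction
    if q.mag() > 0:
        q.setMag(step_len)
    p.add(q)
    return p

Starting from something like p = PVector(1.0, 1.0, 1.0) and calling discrete_lorenz_step(p) each frame (while appending copies of p to a list) gives a polyline to experiment with.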

Anyway, thank you all for your time and suggestions. I think I’ll stick with a basic 2D noise solution for now.

4 Likes

Hi @solub, I really enjoyed this thread. I was attempting something similar a while ago in Processing (Java), after discovering the work of Oleg Soroko. I was also using a 3D noise flowfield, but gave up before trying to limit the movement to a set of angles :frowning:

Regarding the animated gif you posted above, I wanted to ask whether you had adapted your example script to work with particles or agents travelling in what appears to be a confined, box-like space? (I see them collide with the boundary and appear to change direction.)

3 Likes

Hi @fugitive,

What a beautiful 3D flowfield you have there. Would you mind sharing how you computed the 3D noise for the 2nd picture? (Is it the usual combination of sin(), cos(), sin()?)

Regarding the animated gif, the sketch is based on the exact same script I posted and no agent behavior is involved. I’m just drawing lines from random locations and at different paces. When the lines reach an edge of the bounding box their coordinates are switched (left wall only): ‘x’ becomes ‘z’, ‘y’ stays unchanged and ‘z’ becomes ‘x’.

That’s it.
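
A minimal sketch of that swap (Python mode); it assumes a point p with x/y/z attributes and a left wall at x = -W/2, which is an assumption about the box rather than something taken from the script.

if p.x < -W / 2:             # left wall reached
    p.x, p.z = p.z, p.x      # 'x' becomes 'z', 'z' becomes 'x', 'y' stays unchanged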

2 Likes

Thanks!

I experimented with a range of combinations of (sin(), cos(), sin()). However, I was never able to quite achieve what was shown in the video you posted much earlier: https://www.youtube.com/watch?v=c8qMWsXhI94 (especially using abs(cos()), etc.)

However, by chance more than design, I did try the following, using (tan(theta), sin(theta), cos(theta)):

Script (part of FlowField class)
  float inc = 0.1;
  void run(boolean _display) {
    float zoff = 0;
    for (int z = 0; z < layers; z++) {
      float yoff = 0;
      for (int y = 0; y < rows; y++) {
        float xoff = 0;
        for (int x = 0; x < cols; x++) {
          int index = x + y * cols;
          float angle = noise(xoff, yoff, zoff);
          float theta = map(angle, 0, 1, 0, TWO_PI);
          PVector v = new PVector(tan(theta), sin(theta), cos(theta));
          v.normalize();                             // Set all vectors to unit length
          if (_display) display(x, y, z, v);         // Display the Flow Field scaffold (debug)
          field[z][index] = v;                       // Add vector to 2D array
          xoff += inc;
        }
        yoff += inc;
      }
      zoff += inc;
    }
  }

Which results in:
(flowfield image)

I also used:

PVector v = new PVector(sin(theta), cos(theta), tan(theta));

3 Likes

Back on that thread again.

I just tried the tan(theta), sin(theta), cos(theta) combo and didn’t get the result shown above. I think I’m not computing the angle correctly, so here I am again with that same old question:

  • How do you compute a 3D noise-based angle?

The script from @fugitive is clear, but once the vector v has been computed and normalized, what should I do with it? (Let’s say I want to draw a line showing the noised angle for each cell of the flow field, how do I compute the 3D rotation?)

float inc = 0.1;
  void run(boolean _display) {
    float zoff = 0;
    for (int z = 0; z < layers; z++) {
      float yoff = 0;
      for (int y = 0; y < rows; y++) {
        float xoff = 0;
        for (int x = 0; x < cols; x++) {
          int index = x + y * cols;
          float angle = noise(xoff, yoff, zoff);
          float theta = map(angle, 0, 1, 0, TWO_PI);
          PVector v = new PVector(tan(theta), sin(theta), cos(theta));
          v.normalize();   

Also, the video I posted states that the 3D vectors are stored in a 2-dimensional array. How is that possible since we are in 3D? (no Z axis?)
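
For reference, a hedged sketch (Python mode, hypothetical names) touching both questions: to display a cell’s vector no 3D rotation is needed, you can simply draw a segment from the cell centre toward centre + v * some length; and a 3D grid can live in a “2D” array, matching the field[z][index] layout above, by flattening each z slice with index = x + y * cols.

cols, rows, layers, cell = 10, 10, 10, 20
field = [[None] * (cols * rows) for k in range(layers)]   # one flat slice per z layer

def set_cell(x, y, z, v):
    field[z][x + y * cols] = v        # x + y * cols flattens the z slice

def draw_cell(x, y, z):
    v = field[z][x + y * cols]        # same flattening to read the vector back
    cx, cy, cz = (x + 0.5) * cell, (y + 0.5) * cell, (z + 0.5) * cell
    stroke(255, 80)
    line(cx, cy, cz,
         cx + v.x * cell * 0.5,       # the segment's direction *is* the vector,
         cy + v.y * cell * 0.5,       # so no rotate()/heading computation is needed
         cz + v.z * cell * 0.5)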

1 Like