Help needed to replicate a specific generative pattern

Hi all,

I have recently found a Grasshopper “sketch” that I would like to port to Processing, but I am having difficulty understanding how it was made.

Artwork by Eftihis Efthimiou

When asked on Instagram about the creation process, the author replied:

These are a space filling curve that runs on a grid masked by a raster. The masking is breaking the standard Hilbert look. Then I am doing a multiple polyline offset […].

Leaving the line-offsetting part aside, it is the allusion to Hilbert curves that surprises me. I do believe some sort of space-filling algorithm has been used, but nowhere can I recognize the Hilbert curve pattern.

As you can see, a “standard” Hilbert curve with 90° angles never intersects itself (no closed squares, rectangles or diagonals) and has a very clear, identifiable pattern. A mask would just hide some part of it (replaced by offset lines in the picture) but wouldn’t have any impact on the rest of the curve (no bending, no changes in angle or direction).
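For reference, a plain Hilbert curve can be generated with the classic iterative index-to-coordinate conversion below (standard textbook code, not taken from the author's Grasshopper definition). It makes the two properties above easy to verify: consecutive indices always map to adjacent grid cells, and no cell is visited twice, so the curve cannot cross itself.

```python
def hilbert_d2xy(order, d):
    """Map index d along a Hilbert curve of the given order to an (x, y)
    grid cell, using the classic iterative bit-twiddling conversion."""
    x = y = 0
    s = 1
    t = d
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate/flip the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# Order 3 fills an 8x8 grid: 64 distinct cells, every step a 4-neighbour move.
pts = [hilbert_d2xy(3, i) for i in range(64)]
```

Drawing `pts` scaled to the canvas reproduces the familiar Hilbert look; nothing in it can produce the closed loops and diagonals visible in the artwork.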

  • Do you think the author of that sketch really used a Hilbert curve?
  • If so, how would you explain the changes in the pattern?
  • If not, what kind of algorithm would produce that kind of output (a modified maze generation, …)?

I would be very interested to have your take on this.

EDIT: According to the hashtags under the IG post, the Traveling Salesman algorithm is being used as well. Not sure what to do with this.

Other examples can be found here and here (somewhat NSFW at a distance)

Well, it might actually be a combination of both the traveling salesman and a Hilbert curve.

It seems like the path of the traveling salesman was the basis, then the Hilbert curve was calculated only on the “denser” parts of the traveling salesman path used as a mass, and then the polyline part was added in the still-open spaces… although I don’t know how to calculate a Hilbert curve on only some parts…

And even though the lines themselves also seem to behave very differently from Hilbert curves, that is only if you look at them as a whole. If you look at some specific parts, you can see the typical “behaviour” (although only for 3–4 turns)…

Btw, I’m not good with algorithms, so don’t take what I said for granted. Just giving an additional opinion here :sweat_smile:


Thank you for sharing your input @Lexyth, I appreciate it.

The more I think about it, the more I am convinced that the grid traversal happens iteratively, selecting the next point at each step based on specific conditions. That would mean that neither of the two algorithms mentioned was used, because:

  • most Hilbert curve algorithms are recursive (and that recursive nature wouldn’t allow for such an erratic path)
  • most TSP algorithms are permutative (they evaluate orderings of the whole point set, not one step at a time)

A BFS (best-first search, not breadth-first) type of algorithm would make more sense in my opinion.

Below, a grid masked with 2D Perlin noise and traversed iteratively: at each step the algorithm looks for the closest unvisited point (one of the 4 neighbours, or a more distant point if all neighbours have already been traversed) and visits it.

W, H = 60, 60           # grid dimensions
N = W * H
f = .05                 # Perlin noise frequency

toWalk = []             # indices of the unmasked cells left to visit
dirs = (-W, 1, W, -1)   # up, right, down, left (as index offsets)


def setup():
    size(800, 800, P2D)
    background('#FFFFFF')
    strokeWeight(2)
    noiseSeed(1)
    smooth(8)

    global step, cur, stop

    step = width / float(W)
    stop = False

    # Mask the grid with 2D Perlin noise: keep only the cells below threshold
    for i in xrange(N):
        x = i % W
        y = i // W
        if noise(x * f, y * f) < .5:
            toWalk.append(i)

    # Start from a random unmasked cell
    cur = toWalk[int(random(len(toWalk)))]

    while len(toWalk) > 1:
        toWalk.remove(cur)

        # Unvisited 4-neighbours, discarding moves that would wrap around a row edge
        possibilities = [cur + d for d in dirs
                         if cur + d in toWalk
                         and not (d == 1 and cur % W == W - 1)
                         and not (d == -1 and cur % W == 0)]

        if not possibilities:
            closest = getClosest()
            if stop:
                # the nearest remaining cell is too far away: jump somewhere random
                cur = toWalk[int(random(len(toWalk)))]
                stop = False
                continue
            possibilities = [closest]

        nxt = possibilities[int(random(len(possibilities)))]
        line(cur % W * step, cur // W * step, nxt % W * step, nxt // W * step)

        cur = nxt

    noLoop()


def getClosest():
    '''Return the index of the closest remaining cell (Euclidean distance).'''
    global stop

    minDist = 1e6
    closest = None
    curPos = PVector(cur % W, cur // W)
    for p in toWalk:
        d = curPos.dist(PVector(p % W, p // W))
        if d < minDist:
            minDist = d
            closest = p

    if minDist * step > 200:  # avoid drawing lines that are too long
        stop = True
        return

    return closest

It is a bit different from the picture but somewhat close to it.


I’d say that’s very similar (except the connections between the dense clusters are a bit thicker). If you now fill in the empty spots (and maybe use a double line for the paths), then I’m sure both would look pretty much the same.

Almost there…


Excellent! Are the big nested topographic regions created in a second phase, after the BFS terminates? I was guessing that you would tackle the second phase with repeated convolution / edge detection on pixels[], running passes until one causes no changes. But from the white spaces left in your image, I’m guessing that it isn’t generated by convolution / edge detection…?

Thank you, it is much more complicated than I first thought.

Yes, the collection of offset lines is computed after the BFS. I prefer it that way because the BFS creates jumps between points and sometimes forms “bridges” between non-masked areas; I would like these bridges to be part of the boundaries that the topographic regions are limited to.

To offset the lines I am using a full Python port of the Clipper library (the same library that, I believe, was used in Grasshopper to create the original picture). From what I understand, the OffsetPolyLines() function tries to offset downward / shrink groups of consecutive lines inside the main path. Where there are large empty spaces, the polylines can be offset repeatedly; on the contrary, when space is scarce (inside the maze-like structure), offsets happen once at best, hence the myriad of tiny triangles.
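This is not Clipper's actual algorithm, but to illustrate the shrink idea: a toy inward offset for a single convex polygon, where each edge is pushed inward by d along its normal and consecutive offset edges are intersected. Concave maze-like outlines need a real polygon-offset library, but calling something like this repeatedly on an open region produces the nested "topographic" rings.

```python
def offset_convex_inward(poly, d):
    """Shrink a convex, counter-clockwise polygon by distance d:
    push every edge inward along its normal, then intersect
    consecutive offset edges to get the new vertices."""
    n = len(poly)
    shifted = []
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        ex, ey = x2 - x1, y2 - y1
        length = (ex * ex + ey * ey) ** 0.5
        nx, ny = -ey / length, ex / length       # inward normal (CCW polygon)
        shifted.append(((x1 + nx * d, y1 + ny * d), (x2 + nx * d, y2 + ny * d)))
    out = []
    for i in range(n):
        (ax, ay), (bx, by) = shifted[i - 1]      # previous offset edge
        (cx, cy), (dx, dy) = shifted[i]          # current offset edge
        rx, ry = bx - ax, by - ay
        sx, sy = dx - cx, dy - cy
        # intersection of the two infinite offset lines
        t = ((cx - ax) * sy - (cy - ay) * sx) / (rx * sy - ry * sx)
        out.append((ax + t * rx, ay + t * ry))
    return out
```

For example, shrinking the square (0,0)–(100,100) by 10 yields the square (10,10)–(90,90); repeating until the area collapses gives the concentric rings.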

The library is not without flaws:

Firstly, it appears to be extremely slow. I say “appears” because I don’t know yet whether that has to do with me not handling it correctly or with an imperfect Python port.
Offsetting lines 30 times on a partly masked 60×60 grid took 25 minutes to compute in Processing. For comparison, the same offsets on a 90×70 grid with Clipper in Grasshopper take less than a second to render.

Secondly, the offset algorithm doesn’t follow the topology of the polygon passed as argument: a simple straight-skeleton analysis shows that Clipper creates unnecessary edges for no apparent reason.

Thirdly, the library uses integer coordinates for its computations. This has an impact on the quality of the output: uneven offsets, irregular lines…

Lastly, it is buggy. The offset algorithm often creates artefacts (missing lines, bumps, overlaps…), which is why you can see white spaces in the picture from my previous post.

For all these reasons I would like to come up with my own offset algorithm. But finding the edges/boundaries to shrink (downward offset) is not a trivial task. I could run edge detection on the path structure as you are suggesting, but some aspects are still unclear to me:

  • What tools do I need: a specific library or a custom implementation?
  • Will the detection provide the edges in sorted order (mandatory in this case)?
  • If not, how do I sort them?
  • Will the edges fit the topology of the maze-like path well?
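On the sorted-order question: a custom boundary-following pass gives you the edge cells already in order, with no separate sorting step. Here is a minimal sketch of Moore-neighbour tracing on a binary cell mask (the set-of-cells representation and the stopping rule are my own simplifying assumptions, not taken from the original sketch):

```python
def trace_boundary(mask):
    """Moore-neighbour tracing over a set of (x, y) cells.
    Returns the outer boundary cells, already in clockwise order
    (screen coordinates, y axis pointing down)."""
    # Moore neighbourhood in clockwise order: W, NW, N, NE, E, SE, S, SW
    ring = [(-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1), (0, 1), (-1, 1)]
    start = min(mask, key=lambda c: (c[1], c[0]))     # topmost, then leftmost cell
    boundary = [start]
    p, scan = start, 0                                # we "entered" from the west
    while True:
        for k in range(1, 9):                         # clockwise sweep around p
            d = (scan + k) % 8
            q = (p[0] + ring[d][0], p[1] + ring[d][1])
            if q in mask:
                if q == start:                        # back at the start: done
                    return boundary
                # next sweep starts from the background cell checked just before q
                b = (p[0] + ring[(d + 7) % 8][0], p[1] + ring[(d + 7) % 8][1])
                scan = ring.index((b[0] - q[0], b[1] - q[1]))
                boundary.append(q)
                p = q
                break
        else:
            return boundary                           # isolated single cell
```

Because the walk follows the contour cell by cell, consecutive entries in the result are always neighbours, which is exactly the ordering an inward offset needs. Whether such a cell-level contour fits the maze topology well enough is still the open question.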

I wish I had more time to get my head around all of this.