Hard-core model (random placement and detachment of spheres on a grid, where neighbouring sites are blocked by a sphere)

Hello,

I am now trying to simulate the hard-core model (random placement and detachment of spheres on a grid, where neighbouring sites are blocked by a sphere) using a Markov chain.

The task is the following:

We consider a connected graph G = (V, K) with finitely many vertices V = (v1, …, vN), in my case N = 8, where each vertex is mapped to either 1 or 0. We consider the set E ⊆ {0,1}^V of admissible configurations, characterized by the following property:

No pair of connected vertices is allowed to take the value 1 on both vertices.
To pick one of the admissible configurations I consider the uniform distribution (each admissible configuration equally likely).

To construct a Markov chain X0, …, Xn I have to apply the following algorithm:

  1. Pick an admissible initial configuration X0 in E.
  2. Pick an arbitrary vertex v in V at random and toss a fair coin.
  3. If the event "heads" occurs and if x_n(w) = 0 for all vertices w connected to v, then set x_(n+1)(v) = 1; else set x_(n+1)(v) = 0.
  4. The values of all vertices w != v are not changed, i.e. x_(n+1)(w) = x_n(w).
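For reference, here is how those four steps can be sketched in plain Python (this is my own reading of the algorithm: I assume an N×N grid graph with von Neumann neighbours and use the standard `random` module instead of Processing's `random()`; all names are made up):

```python
import random

N = 8  # grid is N x N; vertices are (row, col) pairs

def neighbors(v):
    """Von Neumann neighbours of vertex v inside the grid."""
    r, c = v
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        if 0 <= r + dr < N and 0 <= c + dc < N:
            yield (r + dr, c + dc)

def step(x):
    """One transition of the chain: steps 2-4 of the algorithm."""
    v = (random.randrange(N), random.randrange(N))  # step 2: random vertex
    heads = random.random() < 0.5                   # step 2: fair coin
    if heads and all(x[w] == 0 for w in neighbors(v)):
        x[v] = 1                                    # step 3: place a sphere
    else:
        x[v] = 0                                    # step 3: clear the vertex
    # step 4: all other vertices are left unchanged
    return x

def is_admissible(x):
    """No two connected vertices may both carry the value 1."""
    return all(x[v] == 0 or all(x[w] == 0 for w in neighbors(v))
               for v in x)

x = {(r, c): 0 for r in range(N) for c in range(N)}  # step 1: empty config
for _ in range(1000):
    step(x)
assert is_admissible(x)  # the chain never leaves the admissible set E
```

Starting from the all-zero configuration (which is admissible), every step preserves admissibility, since a vertex is only set to 1 when all its neighbours are 0.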

Actually I tried to write the code for this algorithm in Processing, and it is the following:

#initialisation of the markov chain    (matrix with values 0 )

   for i in range(n+1):
        for j in range(n+1):
                Ver_diag[ i ][ j ] = 0 

MCMC algo :

for t in range (1000): 
 
1-->        coor_x = int(random(1,n))
              coor_y = int(random(1,n))
              rdm_vertex = Ver_diag[coor_x][coor_y]
     
2-->        flip_coin = int(random(0,1))
              if flip_coin == 1 : #head
                 if Ver_diag[coor_x +1][coor_y]==0 and Ver_diag[coor_x -1][coor_y]==0 and Ver_diag[coor_x][coor_y +1]==0 and Ver_diag[coor_x][coor_y -1]==0 
                 rdm_vertex = 1 
                 else:
                rdm_vertex = 0   
           else:
4-->    #all values of vertices stay the same
                 for i in range(n+1):
                    for j in range(n+1):                           
                      Ver_diag[i != coor_x] [ j !=coor_y] = Ver_diag[ i ] [ j ]

    Ver_diag()

Please check whether my code is correct or not. If it is, how can I write code to animate it?
I don't have any idea how to animate my code.

I will be very happy about your comments, critiques and your help!
Thank you for your time, guys!

Hi @QTPay, welcome to the forum.

What language is this written in? You posted it in the Processing category, but this is not Java-dialect code.

Hello, I used Python Mode in Processing for this simulation.

Hi @QTPay,

There are 2 questions inside your original question:

  • Is the implementation correct?
  • How to make an animation out of it?

Before trying to answer those, a quick clarification.

MCMC (Markov Chain Monte Carlo) isn’t an algorithm per se but rather refers to a class of algorithms including “Metropolis–Hastings”, “Gibbs sampling”, “slice sampling”, etc. These are methods used to approximate a desired distribution. From what I understand, the simulation approach you are trying to implement (simulation of the hard-core model) is sometimes employed as an example to illustrate this notion.

As the topic of statistics is slightly out of the scope of this forum, I would strongly suggest asking your first question on a dedicated platform like Stack Overflow or Cross Validated. That said, a quick look at the original formulation of your algorithm (from page 46 of this book) shows that:

  • at step 3, the reassignment of the vertex value to 0 happens outside the preceding if statement

  • step 4 doesn’t require any explicit instruction (leave the matrix/2Darray as it is)

A reinterpretation of this routine in Python would probably (better ask qualified persons) look like this:

N = 8 # number of rows/cols
T = 1000 # number of iterations
G = [0] * N*N # matrix as a 1D array

for iter in xrange(T):
    i = int(random(N*N)) #index of vertex picked randomly (uniform)
    c = int(random(2)) #coin value picked randomly (uniform)
        
    # if heads and all neighbors = 0 -> assign value 1
    if c and not any(map(lambda x: G[x], neighbors(i))):
        G[i] = 1 
        continue
        
    G[i] = 0 # else -> assign value 0

Here neighbors can be a function or a dictionary returning the von Neumann neighborhood of a vertex at a given index. The simple sketch below tries to render the resulting array in the style of the original author:

Populate Grid
N = 8 # number of rows/cols
T = 100 # number of iterations
G = [0] * N*N # matrix as a 1D array
off = 40

def setup():
    size(off*10, off*10)
    background('#FFFFFF')
    translate(off*1.5, off*1.5)

    #POPULATE GRID
    for iter in xrange(T):
        i = int(random(N*N)) #index of vertex picked randomly (uniform)
        c = int(random(2)) #coin value picked randomly (uniform)
            
        # if heads and all neighbors = 0 -> assign value 1
        if c and not any(map(lambda x: G[x], neighbors(i))):
            G[i] = 1 
            continue
        
        G[i] = 0 # else -> assign value 0
    
    
    #RENDER
    for id, v in enumerate(G):
        x = id%N
        y = id//N
        fill((not v)<<8)
        ellipse(x*off, y*off, off>>1, off>>1)
        if x:line(x*off-(off*.75), y*off, x*off-(off*.25), y*off)
        if y:line(x*off, y*off-(off*.75), x*off, y*off-(off*.25))
        
    
    
def neighbors(id):
        
    nbhood = set()
    
    for i, s in enumerate((-1, 1, -N, N)):
        nid = id+s
        if 0 <= nid < N*N:
            if i < 2 and nid//N != id//N:
                continue
            nbhood.add(nid)
            
    return nbhood
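The neighbors helper is plain Python, so its edge handling (corners, and the guard that stops horizontal moves from wrapping to the next row) can be checked outside of Processing. The indices below follow from the same N = 8 row-major layout:

```python
N = 8  # same row-major 8x8 indexing as in the sketch above

def neighbors(id):
    nbhood = set()
    for i, s in enumerate((-1, 1, -N, N)):
        nid = id + s
        if 0 <= nid < N*N:
            # horizontal steps (i < 2) must stay on the same row
            if i < 2 and nid//N != id//N:
                continue
            nbhood.add(nid)
    return nbhood

print(neighbors(0))   # top-left corner: right and below only
print(neighbors(7))   # top-right corner: left and below (no wrap to index 8)
print(neighbors(27))  # interior vertex: all four von Neumann neighbours
```

Interior vertices get four neighbours, edge vertices three, and corners two, as expected for a von Neumann neighborhood on a grid.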

There is, however, a key element that you seem to be missing. The point of the algorithm is not only to populate a grid through a Markov process but also to do it many times, so as to converge to an approximation of a distribution. Roughly, this is the Monte Carlo part of the process (MCMC).

This brings us to your second issue: the animation. In order for us to provide an answer we first need to know what part of this algorithm you would like to animate:

  • is it the grid population process (as your original post seems to imply)?
  • is it the succession of grids being populated one after the other?
  • is it the converging movement of their distribution?
  • is it all of the above?

If all you need is the grid animation then the sketch above may help you get on the right track.
If on the other hand you want to plot the converging distribution then I think (better ask qualified persons) you would have to:

  • store the total number of positive values (1) for each grid that you populate
  • divide that number by the number of iterations (mean)
  • save this mean value at each generation in a list
  • quantize + normalize + count their occurrences to build a histogram
from collections import Counter

N = 8
sums = ()

for gen in xrange(1000):
    iter = gen+1
    arr = [0] * N*N    
    sums += populate(arr, iter),
        
quantized_sums = map(lambda x: int(x)+round((x%1)*10/2.)*0.2, sums) #quantize to the nearest multiple of 0.2
bins = sorted(Counter(quantized_sums).items())
omax = max(map(lambda x: x[1], bins)) #maximum occurrence (needed for normalization)

for i, (k, v) in enumerate(bins):
    h = norm(v, 0, omax) * 120 #mapping the normalized occurrence to a chosen height
        
    ### draw something here ###
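The quantization expression is the trickiest line of that snippet. Isolated in plain Python (the sample mean values below are made up purely for illustration), it behaves like this:

```python
from collections import Counter

def quantize(x):
    # same expression as in the snippet: snap the fractional part of x
    # to the nearest multiple of 0.2, keeping the integer part as-is
    return int(x) + round((x % 1) * 10 / 2.) * 0.2

samples = [1.07, 1.13, 2.49, 3.92, 1.13]  # made-up mean occupancies
quantized = [round(quantize(s), 1) for s in samples]
print(quantized)

# binning the quantized values, as done with Counter in the snippet
bins = sorted(Counter(quantized).items())
print(bins)
```

Each value is snapped to the nearest 0.2 step, and Counter then counts how many generations fell into each bin, which is exactly the data needed to draw a histogram.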
Full demo
from collections import Counter

N = 8 # number of rows/cols
off = 40

def setup():
    size(off*10, off*10)
    background('#FFFFFF')
    translate(off*1.5, off*1.5)
    
    sums = ()
    
    for gen in xrange(1000):
        iter = gen+1
        arr = [0] * N*N    
        sums += populate(arr, iter),
        
    
    quantized_sums = map(lambda x: int(x)+round((x%1)*10/2.)*0.2, sums) #quantize to the nearest multiple of 0.2
    bins = sorted(Counter(quantized_sums).items())
    omax = max(map(lambda x: x[1], bins)) #maximum occurrence (needed for normalization)

    for i, (k, v) in enumerate(bins):
        h = norm(v, 0, omax) * 120 #mapping the normalized occurrence to a chosen height
        
        ### draw something here ###
        
        

def populate(G, T):
    
    count = 0
    
    for iter in xrange(T):
        i = int(random(N*N)) 
        c = int(random(2))
        
        if c and not any(map(lambda x: G[x], neighbors(i))):
            G[i] = 1
        else:
            G[i] = 0
        count += sum(G) # accumulate the occupancy at every iteration
        
    return count / float(T)



def neighbors(id):
        
    nbhood = set()
    
    for i, s in enumerate((-1, 1, -N, N)):
        nid = id+s
        if 0 <= nid < N*N:
            if i < 2 and nid//N != id//N:
                continue
            nbhood.add(nid)
            
    return nbhood


convergence to a Normal distribution after 1000 generations

In both cases I would suggest first having a look at how the draw() function works in Processing, and then coming back later with your attempts.
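As a starting point for the grid animation: instead of populating the whole grid inside setup(), you can run a single Markov step per frame and redraw. The update itself is ordinary Python (only the rendering calls belong in draw()); here is a standalone sketch of that per-frame step, using the same 1D row-major layout as above and the standard random module in place of Processing's random():

```python
import random

N = 8
G = [0] * (N * N)  # start from the empty (admissible) configuration

def neighbors(id):
    nbhood = set()
    for i, s in enumerate((-1, 1, -N, N)):
        nid = id + s
        if 0 <= nid < N * N:
            if i < 2 and nid // N != id // N:
                continue  # horizontal move wrapped to another row
            nbhood.add(nid)
    return nbhood

def markov_step(G):
    """One chain transition; in Processing, call this once per draw() frame."""
    i = random.randrange(N * N)    # pick a vertex uniformly at random
    heads = random.random() < 0.5  # fair coin
    if heads and not any(G[n] for n in neighbors(i)):
        G[i] = 1
    else:
        G[i] = 0

# stand-in for the animation loop; in Processing this would be:
#   def draw(): markov_step(G)  # ...then redraw the ellipses from G
for frame in range(500):
    markov_step(G)
```

Because draw() runs repeatedly (60 times per second by default), each frame would then show the grid after one more transition of the chain.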
