# Formula works on paper but not in my very simple program

I’m running into an issue where it seems like Processing is miscalculating my formula.

The idea is simple: I draw a line, and I want to draw n points on that line, evenly distributed.

So I came up with a formula that calculates the x and y locations of the points along this line. It works fine in a calculator, but not in my program.

x = starting x of the line
x1 = ending x of the line
xCheck = the point along the line to be calculated
n = number of points
i = the current iteration of the for loop

xCheck = (x/n)*i + (x1/n)*(n-i)
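On paper the formula does produce evenly spaced values. As a sanity check, here is the same formula evaluated in plain Java with floating-point variables and made-up endpoints (x = 0, x1 = 100, n = 5; the actual sketch picks these randomly):

```java
public class FormulaOnPaper {
    public static void main(String[] args) {
        // made-up example endpoints; the real sketch picks them with random()
        float x = 0;
        float x1 = 100;
        int n = 5;
        for (int i = 0; i <= n; i++) {
            // interpolate between x1 (at i = 0) and x (at i = n)
            float xCheck = (x / n) * i + (x1 / n) * (n - i);
            System.out.println(xCheck); // 100.0, 80.0, 60.0, 40.0, 20.0, 0.0
        }
    }
}
```

With floats, the points land exactly 20 apart, as expected.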

```
int x = floor(random(0,width));
int y = floor(random(0,height));
int x1 = floor(random(0,width));
int y1 = floor(random(0,height));

int n = 5;

size(800,800);
stroke(0);
fill(255,0,0);
line(x,y,x1,y1);
println(x,y,x1,y1);
println(" ");

for(int i = 0; i <= n; i++){
float xCheck = (x/n*i)+(x1/n*(n-i));
float yCheck = (y/n*i)+(y1/n*(n-i));
noStroke();
ellipse(xCheck,yCheck,4,4);
print(xCheck,yCheck + "   ");
}
```

The result is that the points are always a few pixels off (around 1 to 5).
Does anyone understand what exactly I am doing wrong here?

EDIT:
It also seems that the bigger the number n is, the bigger the offset of the points. The reason for this I do not understand.

Hello,

Change your variable declarations from int to float and see what happens.

More importantly, understand what is happening.

Look up integer division.
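A minimal illustration of the difference, using a made-up value for x (in the sketch it comes from `random()`):

```java
public class IntDivisionDemo {
    public static void main(String[] args) {
        int x = 157; // made-up value standing in for random(0, width)
        int n = 5;
        int i = 2;

        // x and n are both ints, so x / n truncates: 157 / 5 == 31, not 31.4
        System.out.println(x / n * i);         // 62

        // promoting to float first keeps the fractional part
        System.out.println((float) x / n * i); // 62.8
    }
}
```

The truncation happens once per term and gets multiplied by i and (n-i), which is why the offset grows as n gets larger.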