I have the following problem: I'm writing a neural network class, and at some point in the learning process, for a reason I don't understand, the output suddenly becomes Infinity and then NaN. Here is the code of the classes:
class Neuron {
  float lastinputs[];
  int id;
  float[] weights;
  Neuron(int id_, int lengthweights) {
    id=id_;
    weights=new float[lengthweights];
    try {
      String load[]=loadStrings("weight"+id+".txt");
      for (int i=0; i<weights.length; i++) weights[i]=Float.parseFloat(load[i]);
      String format_controll=load[weights.length-1];
    }
    catch(Exception e) {
      for (int i=0; i<weights.length; i++) weights[i]=random(-1, 1);
      String[] save=new String[weights.length];
      for (int i=0; i<weights.length; i++) save[i]=weights[i]+"";
      saveStrings("weight"+id+".txt", save);
    }
  }
  // Despite the name, this is the learning step: a delta-rule weight update.
  void forewardpropragation(float learningrate, float difference) {
    for (int i=0; i<weights.length; i++) {
      weights[i]+=learningrate*difference*lastinputs[i];
    }
    String[] save=new String[weights.length];
    for (int i=0; i<weights.length; i++) save[i]=weights[i]+"";
    saveStrings("weight"+id+".txt", save);
  }
  float calculate(float inputs[]) {
    float sum=0;
    lastinputs=inputs;
    for (int i=0; i<weights.length; i++) sum+=inputs[i]*weights[i];
    
    return sum;
  }
}
class Network {
  int structure[];
  int structure2[];
  Neuron n[];
  Network(int struc[]) {
    structure=new int[struc.length];
    structure[0]=0;
    // running totals: structure[i] is the index of the first neuron of layer i
    for (int i=1; i<struc.length; i++) structure[i]=struc[i]+structure[i-1];
    println(structure);
    structure2=struc;
    n=new Neuron[structure[structure.length-1]];
    for (int i=0; i<structure.length-1; i++) {
      for (int j=structure[i]; j<structure[i+1]; j++) {
        n[j]=new Neuron(j, struc[i]);
      }
    }
  }
  float calculate(float inputs[]) {
    int max=inputs.length;
    for (int i=0; i<structure.length; i++) if (max<structure2[i]) max=structure2[i];
    float[][] hiddenlayer=new float[structure.length][max];
    hiddenlayer[0]=inputs;
    for (int i=0; i<structure.length-1; i++) {
      for (int j=structure[i]; j<structure[i+1]; j++) {
        hiddenlayer[i+1][j-structure[i]]=n[j].calculate(hiddenlayer[i]);
      }
    }
    return hiddenlayer[structure.length-1][0];
  }
  void supervisedLearningCustom(float learning_rate, float inp[], float correct_answer) {
    for (int i=0; i<n.length; i++) n[i].forewardpropragation(learning_rate, correct_answer-calculate(inp));
  }
}
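To isolate the update rule, here is a standalone single-neuron version of weights[i]+=learningrate*difference*lastinputs[i] in plain Java (made-up starting weights, and a deliberately exaggerated learning rate in the second call):

```java
public class DeltaRuleDemo {
    // One linear neuron trained with the same rule as in the Neuron class:
    // weights[i] += learningrate * difference * lastinputs[i]
    static float train(float lr, float[] x, float target, int steps) {
        float w0 = 0.5f, w1 = -0.5f;            // made-up starting weights
        for (int s = 0; s < steps; s++) {
            float out = w0 * x[0] + w1 * x[1];  // forward pass: plain weighted sum
            float diff = target - out;          // error, as in supervisedLearningCustom
            w0 += lr * diff * x[0];
            w1 += lr * diff * x[1];
        }
        return w0 * x[0] + w1 * x[1];
    }

    public static void main(String[] args) {
        float[] x = {1, 1};
        // lr * (x0*x0 + x1*x1) = 0.5 < 2: stable, output approaches the target
        System.out.println(train(0.25f, x, 1.0f, 100));
        // lr * (x0*x0 + x1*x1) = 4 > 2: the error triples every step,
        // the weights overflow to Infinity and then turn into NaN
        System.out.println(train(2.0f, x, 1.0f, 200));
    }
}
```

With learning rate 0.25 the output settles near the target; with an oversized rate the error grows every step until the weights overflow to Infinity and then NaN, which looks like the same chain reaction.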
This code doesn't cause any problems:
Network n;
void setup() {
  n=new Network(new int[]{2, 5, 1});
  frameRate(30);
}
void draw() {
  n.supervisedLearningCustom(0.25, new float[]{1, 1}, 0);
  n.supervisedLearningCustom(0.25, new float[]{1, 0}, 1);
  println();
  println(n.calculate(new float[]{1, 0}));
  println(n.calculate(new float[]{1, 1}));
}
While this code leads to the Infinity/NaN result:
Network n;
void setup() {
  n=new Network(new int[]{2, 5, 1});
  frameRate(30);
}
void draw() {
  n.supervisedLearningCustom(0.25, new float[]{0, 1}, 1);
  n.supervisedLearningCustom(0.25, new float[]{1, 1}, 0);
  n.supervisedLearningCustom(0.25, new float[]{1, 0}, 1);
  println();
  println(n.calculate(new float[]{0, 1}));
  println(n.calculate(new float[]{1, 0}));
  println(n.calculate(new float[]{1, 1}));
}
I know that once Infinity occurs there is a chain reaction which turns everything into NaN:
- Infinity occurs somewhere
- the result becomes Infinity or NaN
- the error uses it
- it gets built into the weights
- all results become Infinity or NaN
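That chain reaction is just standard IEEE float arithmetic, and it can be reproduced in isolation (a minimal sketch, unrelated to the network code):

```java
public class NanChain {
    public static void main(String[] args) {
        float w = Float.MAX_VALUE;   // a weight that has already grown very large
        w = w * 2.0f;                // overflow: the float becomes Infinity
        System.out.println(w);       // prints Infinity
        float d = 0.0f * w;          // Infinity times a zero input is NaN
        System.out.println(d);       // prints NaN
        w += d;                      // and NaN poisons every later calculation
        System.out.println(w);       // prints NaN
    }
}
```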
Why does this happen, and how can I fix it?