I wrote a JS version of the fixed filter bank neural network.
You keep the neural network weights fixed (by using a fast transform for them) and adjust the non-linear activation functions instead:
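A minimal sketch of the idea, assuming the fast Walsh-Hadamard transform as the fixed filter bank and a per-element two-slope activation as the trainable part (the function and parameter names here are illustrative, not the exact original code):

```javascript
// In-place fast Walsh-Hadamard transform (unnormalized).
// x.length must be a power of 2. This plays the role of the fixed
// "weights" — it is never adjusted during training.
function wht(x) {
  const n = x.length;
  for (let h = 1; h < n; h *= 2) {
    for (let i = 0; i < n; i += 2 * h) {
      for (let j = i; j < i + h; j++) {
        const a = x[j], b = x[j + h];
        x[j] = a + b;
        x[j + h] = a - b;
      }
    }
  }
}

// One layer: fixed transform, then an adjustable two-slope activation.
// Only slopesPos and slopesNeg are trained; the transform stays fixed.
function layer(x, slopesPos, slopesNeg) {
  wht(x);
  for (let i = 0; i < x.length; i++) {
    x[i] *= x[i] >= 0 ? slopesPos[i] : slopesNeg[i];
  }
}
```

With all positive slopes at 1 and all negative slopes at 0 this reduces to a ReLU-like layer over the transform outputs; the optimizer is free to pick a different pair of slopes per element.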
If you rerun the code a few times you can get different final costs, which suggests there are local minima the optimization algorithm can't escape so easily. Still, they are all low cost.