December 3, 2023
Fixed a bug in the optimization loop:
In initial development, I used
activation = model(xinput)
as the performance index, to keep the structure simple.
However, this does not work for optimization: minimizing the raw activation simply drives it toward negative infinity.
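To illustrate the failure, here is a minimal sketch in PyTorch with a placeholder linear model, input shape, and optimizer settings (not the actual project code): gradient descent on the raw output just pushes the activation down without bound.
import torch

# Placeholder model and input, for illustration only.
torch.manual_seed(0)
model = torch.nn.Linear(4, 1)
xinput = torch.randn(1, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for step in range(200):
    optimizer.zero_grad()
    activation = model(xinput)
    activation.backward()   # treats the raw activation itself as the loss
    optimizer.step()

print(model(xinput).item())  # clearly negative, and it keeps decreasing with more steps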
After verifying that everything else worked, I moved to a quadratic performance index
loss = (activation - target) * (activation - target)
or, initially for simplicity,
loss = activation * activation
where target is zero.
However, I forgot to also change
activation.backward()
to
loss.backward()
After fixing that, optimization worked.
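For reference, a sketch of the corrected loop, again with a placeholder model, input, and optimizer (the real ones differ); the key change is calling backward() on loss rather than on activation.
import torch

# Placeholder model and input, for illustration only.
torch.manual_seed(0)
model = torch.nn.Linear(4, 1)
xinput = torch.randn(1, 4)
target = torch.zeros(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for step in range(200):
    optimizer.zero_grad()
    activation = model(xinput)
    loss = (activation - target) * (activation - target)  # quadratic performance index
    loss.backward()   # was activation.backward(), which ignored the quadratic loss
    optimizer.step()

print(model(xinput).item())  # close to the target value of zero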