Breakthrough Machine Learning – Generalizing loss. Transform the loss into the range 0..1 with a model2 that learns the loss of model1. So even if the raw loss oscillates wildly, the generalizing loss doesn't: it stays confined to roughly 0..1. Model1 can then iterate on the same input 100,000 times without degrading. Idea and method by Per Lindholm.

https://peroglyfer.se/wp-content/uploads/2019/12/Screenshot-from-2019-12-09-09-39-07.png

https://peroglyfer.se/wp-content/uploads/2019/12/Screenshot-from-2019-12-09-09-38-23.png

https://peroglyfer.se/wp-content/uploads/2019/12/Screenshot-from-2019-12-09-09-40-22-1.png
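One way to read this idea as code (a minimal sketch, not the author's implementation): model1 trains normally while a tiny one-parameter logistic "model2" tracks model1's loss; the sigmoid output serves as the generalizing loss, confined to (0, 1) by construction even when the raw loss blows up. The tanh target for model2, the learning rates, and the deliberately oversized model1 step size are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# model1: linear regression trained with an intentionally oversized
# learning rate so its raw loss oscillates and grows from step to step.
X = rng.normal(size=(64, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=64)

w1 = np.zeros(3)
lr1 = 3.0            # too large on purpose -> wild raw loss
raw_losses, gen_losses = [], []

# model2: a single logistic unit g(loss) = sigmoid(a * loss + b).
# The sigmoid confines its output to (0, 1) -- my reading of the
# "generalizing loss" bounded between around 0..1.
a, b = 1.0, 0.0
lr2 = 0.1

for step in range(200):
    err = X @ w1 - y
    loss = float(np.mean(err ** 2))
    raw_losses.append(loss)

    # model2 forward pass: bounded generalizing loss
    g = sigmoid(a * loss + b)
    gen_losses.append(g)

    # model2 update: nudge g toward a squashed version of the raw loss
    # (assumption: any bounded proxy target would do here)
    target = np.tanh(loss)
    grad = (g - target) * g * (1.0 - g)
    a -= lr2 * grad * loss
    b -= lr2 * grad

    # model1 update with the oversized step size
    w1 -= lr1 * (X.T @ err) / len(y)

print(f"raw loss range:  [{min(raw_losses):.3g}, {max(raw_losses):.3g}]")
print(f"gen  loss range: [{min(gen_losses):.3g}, {max(gen_losses):.3g}]")
```

Even as the raw loss diverges over many iterations, the generalizing loss cannot leave (0, 1), which matches the claim that it does not degrade under repeated iteration.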

The Halting Problem – You can also learn whether a function is going to converge or not. By random choice, select between the target at = loss for converge = 1 and at = np.random() * loss for converge = 0. Then a model2 learns the converge label. Voila, I guess the Halting problem is solved?

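A minimal sketch of the labelling trick as I read it (the decaying loss curve, trace length, and the logistic-regression "model2" are my choices for illustration, not the post's): converging examples use the loss itself as the target series, non-converging examples multiply it by fresh uniform noise, and a classifier then learns the converge label from the trace.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 50  # length of each loss trace (assumption)

def make_trace(converges):
    """One synthetic loss trace, following the post's recipe:
    converge=1 -> at = loss, converge=0 -> at = rand() * loss."""
    loss = np.exp(-0.1 * np.arange(T))   # a smoothly decaying loss curve
    if converges:
        return loss
    return rng.random(T) * loss          # noisy "non-converging" target

# build a labelled dataset of traces
labels = rng.integers(0, 2, size=400)
X = np.stack([make_trace(c) for c in labels])

# model2: logistic regression on the raw trace, trained by gradient descent
w = np.zeros(T)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = p - labels
    w -= 0.1 * X.T @ g / len(labels)
    b -= 0.1 * g.mean()

# held-out check: can model2 recover the converge label?
test_labels = rng.integers(0, 2, size=100)
X_test = np.stack([make_trace(c) for c in test_labels])
p = 1.0 / (1.0 + np.exp(-(X_test @ w + b)))
acc = ((p > 0.5) == test_labels).mean()
print(f"held-out accuracy: {acc:.2f}")
```

This shows a classifier can separate the two label constructions on synthetic traces; whether that bears on the Halting problem in general is, of course, a much stronger claim.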