Network Math Idea – Does Network Math Follow The Path Of Old Mathematics?

Machine learning as it is today is pretty simple. You have an input a and an output b with a function net(a) in between.

Doesn’t this resemble old math? You have one object a that is proportional to an object b.

When does this not hold? If net(a) is singular for some a = a(k). But how could net(a(k)) become singular? One characteristic of 1/x is perhaps its closeness to a possible singularity at x = 0.

So I thought: what if you have two networks, a rational object if you will? That is, net(a) = net0(a)/net1(a). This might open up new possibilities. Here I will train net0 and net1 as if they were separate neural networks, i.e. ordinary hidden-layer networks.
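A minimal sketch of the rational-network idea, assuming PyTorch. The class name, layer sizes, and the sigmoid-plus-epsilon denominator are my own illustrative choices, not from the original note; the point is only that net0 and net1 are two ordinary hidden-layer networks whose ratio is the model output.

```python
import torch
import torch.nn as nn

class RationalNet(nn.Module):
    """net(a) = net0(a) / net1(a), with net0 and net1 trained jointly."""

    def __init__(self, in_dim, hidden=32, eps=1e-6):
        super().__init__()
        # Numerator network net0(a): an ordinary hidden-layer network
        self.net0 = nn.Sequential(nn.Linear(in_dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))
        # Denominator network net1(a): same structure
        self.net1 = nn.Sequential(nn.Linear(in_dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))
        self.eps = eps  # keeps the denominator strictly away from an exact singularity

    def forward(self, a):
        num = self.net0(a)
        # sigmoid squashes the raw denominator into (0, 1); eps keeps it positive
        den = torch.sigmoid(self.net1(a)) + self.eps
        return num / den
```

Both sub-networks receive gradients from the same loss, so they train together like one model, e.g. `y = RationalNet(3)(torch.randn(8, 3))`.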

So the idea was to include a feature of ordinary math: the closeness to a singularity.

Testing …

Maybe you need to use network objects obj1, obj2, obj3, … as in ordinary math and just train them. Could there exist something like a reduced Taylor series for network objects?
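One hedged reading of the "reduced Taylor series" question, again assuming PyTorch: treat obj1, obj2, obj3, … as small trainable networks and let the model output be their truncated sum, loosely analogous to keeping only the first few terms of a series. The plain-sum combination rule below is an assumption for illustration, not something fixed by the note.

```python
import torch
import torch.nn as nn

class SeriesOfNets(nn.Module):
    """Truncated 'series' of network objects: obj1(a) + obj2(a) + ... + objK(a)."""

    def __init__(self, in_dim, n_terms=3, hidden=16):
        super().__init__()
        # Each term is its own small hidden-layer network (a separate trainable object)
        self.terms = nn.ModuleList(
            nn.Sequential(nn.Linear(in_dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))
            for _ in range(n_terms)
        )

    def forward(self, a):
        # Sum the K retained terms, i.e. the "reduced" series
        return sum(term(a) for term in self.terms)
```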

So here you can create a built-in gain function in the model.

gain: a denominator range of 0..1 maps a bounded numerator to -inf..inf

gain: a denominator restricted to >0..1 gives -big..big instead
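A small numeric check of that gain effect, assuming the denominator is squashed into (0, 1] as in the sketch above (plain Python, values chosen only for illustration): as the denominator moves from 1 toward 0, the same numerator is amplified toward very large values, and a denominator of exactly 0 would be the true singularity.

```python
# Divide a fixed numerator by denominators running from 1 down toward 0
num = 1.0
for den in [1.0, 0.5, 0.1, 0.01, 1e-6]:
    print(f"den={den:g}  gain={num / den:g}")
# gains: 1, 2, 10, 100, 1e+06
```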