In my previous idea I suggested that the complexity problem is solvable with a network. There I connected the weights in a continuous fashion to get a clearer picture: a smoothed image of the weight matrices.

From this I realized that you can connect the weights in a machine learning neural network, so they are not so independent and do not look so random when combined. This gives me the idea that the weights are complex objects, each solvable with a network of its own.
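One possible reading of "connecting the weights in a continuous fashion" is to let a small network generate the weight matrix from each weight's position, so that neighbouring weights get related values instead of independent noise. A minimal sketch, assuming numpy; all sizes and names here are illustrative, not part of the original idea:

```python
import numpy as np

# Sketch: generate a "smooth" weight matrix from a tiny coordinate network.
rng = np.random.default_rng(0)

rows, cols = 8, 8
# Normalized (i, j) coordinates of every weight in the matrix.
coords = np.array([[i / (rows - 1), j / (cols - 1)]
                   for i in range(rows) for j in range(cols)])

# A tiny fixed random MLP maps (i, j) -> weight value, so the weight
# matrix becomes a continuous function of position rather than
# a grid of independent random numbers.
W1 = rng.normal(size=(2, 16))
b1 = rng.normal(size=16)
w2 = rng.normal(size=16)

hidden = np.tanh(coords @ W1 + b1)
weights = (hidden @ w2).reshape(rows, cols)
print(weights.shape)  # (8, 8)
```

Because nearby coordinates pass through the same smooth function, the resulting matrix looks like the "smoothed image" described above rather than random speckle.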

So what input does the weight network have? Here I take inspiration from the universe. Molecules provide a clue: there you have weights connected through bonds. But molecules are not so big.

Then this idea would explain why we have gravity. It's just input data from all the other weights to the particular weight. For the weights to solve more problems, they need more information.

That is, molecular forces provide input from the immediately surrounding weights, and gravity provides information from all the rest of the weights.
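Computationally, the local-plus-global picture above could be sketched as each weight receiving two input channels: one from its immediate neighbours ("molecular") and one from the mean of the whole field ("gravity"). A hedged toy version, assuming numpy; the chain layout and channel names are illustrative assumptions:

```python
import numpy as np

# Toy sketch: local neighbour input plus a global mean-field input.
rng = np.random.default_rng(1)
w = rng.normal(size=16)  # a 1-D chain of weights

# "Molecular" channel: each weight sums its immediate neighbours.
local = np.zeros_like(w)
local[1:] += w[:-1]      # contribution from the left neighbour
local[:-1] += w[1:]      # contribution from the right neighbour

# "Gravity" channel: every weight sees the same aggregate of all weights.
global_signal = np.full_like(w, w.mean())

# Two input channels per weight, as the analogy suggests.
inputs = np.stack([local, global_signal], axis=1)
print(inputs.shape)  # (16, 2)
```

The design choice mirrors the analogy directly: the local channel is sparse and structural, the global channel is a single broadcast value.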

So to sum up this idea: everything that seems complex is just networks. // Per Lindholm

Come to think of it, if energy runs the network, it could resemble the error in a machine learning network. From that I realize that minimizing the error could happen in more than one way. Either you **disperse** the error so it gets evenly distributed, or you **shift** the error to another place in the network. So at least locally you get a low amount of error, and this with a limited number of weights.
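The disperse-versus-shift distinction can be made concrete with a toy example: the total error is held fixed, and only its distribution over the nodes changes. A minimal sketch, assuming numpy; the numbers are purely illustrative:

```python
import numpy as np

# A fixed amount of error, initially concentrated at one node.
error = np.array([4.0, 0.0, 0.0, 0.0])
total = error.sum()

# Disperse: spread the same total error evenly over all nodes.
dispersed = np.full_like(error, total / error.size)

# Shift: move the error somewhere else; the total is unchanged,
# but the first node is now locally error-free.
shifted = np.array([0.0, 0.0, 0.0, total])

print(dispersed)  # [1. 1. 1. 1.]
print(dispersed.sum() == total, shifted.sum() == total)  # True True
```

Neither operation removes error globally; dispersing lowers the worst-case local error, while shifting trades a clean region for a concentrated hot spot elsewhere.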