Inspired by recurrent machine learning networks, where output data can be reused as input, I wonder whether this idea could tell us something about gravity.
There is a similarity in that gravity is a force which is both an output of one weight and an input to other weights.
In my experience, recurrent data in machine learning converges to a stable state fairly quickly. So suppose the universe computes its solutions according to some network theory.
Then I guess the universe's positions of the weights are calculated by this recurrent gravity information force, acting as both input and output.
So when different weights come into range of each other, the gravity information needs to converge again.
With this you get smart positions: the universe positions each weight relative to its surrounding close weights.
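A toy sketch of the idea, purely my own assumption and not real physics: treat each weight as a node whose "gravity signal" is fed back recurrently as input until the positions settle into a fixed point. The function name, the damped update rule, and the drift-toward-centre dynamics are all hypothetical illustrations of recurrent convergence, not an actual model of gravity.

```python
def recurrent_relax(positions, masses, damping=0.1, tol=1e-9, max_iter=10_000):
    """Feed the output positions back in as input until the update
    converges, like a recurrent network settling into a state."""
    pos = list(positions)
    total = sum(masses)
    for step in range(max_iter):
        # "output" of the network: the mass-weighted centre of the system
        centre = sum(m * x for m, x in zip(masses, pos)) / total
        # feed it back as "input": every weight drifts toward the centre
        new_pos = [x + damping * (centre - x) for x in pos]
        # stop once the recurrent loop has converged to a stable state
        if max(abs(a - b) for a, b in zip(new_pos, pos)) < tol:
            return new_pos, step
        pos = new_pos
    return pos, max_iter

positions, steps = recurrent_relax([0.0, 4.0, 10.0], [1.0, 1.0, 2.0])
# all three weights converge to the mass-weighted centre (6.0 here)
```

The point of the sketch is only the shape of the computation: the same quantity appears as both output and input, and repeating the loop drives the system to a converged state fairly quickly.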