I believe the universe takes advantage of machine learning networks' ability to achieve complex things. That is, I believe it is made of many connected networks of different types, from the very small, atom-sized internal networks to the very large empty-space network.
From this I wonder if the idea of determinism could be fixed. What I could not accept is a start-and-let-go scenario, where you initialize the universe once and it goes on predictably forever.
A simple way to fix this, I think, is to enclose the deterministic idea in time. That is, you only have a start-and-let-go for processes that are very predictable and probably short-lived.
Then, to make it simple for the universe to handle things, the starting values of the networks are revealed to the surrounding networks. Since a deterministic process is underway, the starting values are very important. It could be that the starting values of the deterministic process are calculated by all the other networks, so everything is in agreement.
So I guess every time light is emitted from an atom, it is revealing something about the starting values of the parameters of the internal atom network.
From this I guess a particle that undergoes acceleration is a less predictable process and therefore must emit some information about its network.
Similarly for the double-slit experiment: it is just that the photon's space network got a little more complicated, so it had to reveal more about what has happened to its network.
I can almost guess that the positions the photons land on are the output of a network. That's the streaks you see. So it's kind of like digital information, but with more categories or labels than just 0 and 1.
I don't know how many labels you would need; this is just a simple example.
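To make the "more labels than 0 and 1" idea concrete, here is a minimal toy sketch (my own illustration, not a claim about real physics or any actual network): photon landing positions are sampled from an idealized cos² double-slit fringe pattern in dimensionless units, then each position is binned into one of several discrete "labels" — the streaks. The fringe spacing, screen range, and number of labels are all arbitrary assumptions chosen for the demo.

```python
import math
import random

def intensity(x):
    """Idealized double-slit fringe intensity (dimensionless units).

    Toy model: I(x) = cos^2(pi * x), so the fringe spacing is 1 and the
    single-slit diffraction envelope is ignored.
    """
    return math.cos(math.pi * x) ** 2

def sample_landing_positions(n, x_max=4.0, seed=0):
    """Rejection-sample n photon landing positions on [-x_max, x_max]."""
    rng = random.Random(seed)
    hits = []
    while len(hits) < n:
        x = rng.uniform(-x_max, x_max)
        # Accept a candidate position in proportion to the fringe intensity.
        if rng.random() < intensity(x):
            hits.append(x)
    return hits

def label(x, x_max=4.0, n_labels=8):
    """Map a landing position to one of n_labels discrete bins ('streaks')."""
    i = int((x + x_max) / (2 * x_max) * n_labels)
    return min(max(i, 0), n_labels - 1)

hits = sample_landing_positions(1000)
labels = [label(x) for x in hits]
```

With these settings the 1000 simulated photons spread across all eight bins, so the "output" of this toy process really does carry more than two distinct labels.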
So maybe there should exist photon-capture devices that could help in the development of different products, just by revealing something about the networks inside the product.