Machine Learning Battery Idea – Battery Components Adapting To Their Surroundings With secondary voltage = model(random)? For Shared Energy In A Faster Iteration

When iterating a machine learning model( rnge() ), you give it a small input range, for example rnge() = np.array([1, 0]). The model then adapts from that random starting point toward the loss function. Similarly, I imagine that the components in a battery need to adapt to their surroundings, like the ions' paths through the battery. I think it is better to use the ions' own energy, the heat energy (the random part), as little as possible. I guess relying on it could cause low efficiency in the iterations the ions need to find their paths.
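To make the analogy concrete, here is a minimal sketch of what I mean by iterating model( rnge() ): a tiny one-layer model whose random initial weights adapt toward a target value through a loss function. The target value and learning rate are made-up numbers, purely for illustration.

```python
import numpy as np

# Minimal sketch: a one-layer "model(rnge())" whose weights adapt
# from a random start toward a target through a loss function.
def rnge():
    return np.array([1.0, 0.0])   # the small input range from the text

rng = np.random.default_rng(0)
w = rng.normal(size=2)            # random initial weights
target = 3.7                      # made-up target value
lr = 0.1                          # made-up learning rate

for step in range(200):
    x = rnge()
    y = w @ x                     # model(rnge())
    grad = 2.0 * (y - target) * x # gradient of the squared-error loss
    w -= lr * grad                # adapt the weights a little each step

print(w @ rnge())                 # close to 3.7 after the iterations
```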

So my guess is that some crafted random secondary helper voltage could help, providing shared energy for a faster iteration.

Machine Learning Math Idea – Calculating With Probability?

As an experiment I tried to narrow a range from which I randomly sampled numbers, in order to solve a function or an equation = 0. I set it up so that it counted np.count_nonzero(abs(function) < tolerance) / 1000 for 1000 random samples, and compared that hit ratio with a target number. So I got about 0.001 in the first iteration, with 1.0 as the target.

In the beginning the tolerance was large, so there were some hits to work with. Then, as the iteration continued, the range got smarter, or less wide. In the end the range itself was the solution.
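A minimal sketch of how that procedure could look, using the made-up example equation f(x) = x**2 - 2 = 0 (so the expected answer is the square root of 2): sample 1000 points in the current range, measure the hit ratio np.count_nonzero(abs(f(x)) < tolerance) / 1000, narrow the range around the hits, and tighten the tolerance whenever the ratio gets close to 1.0.

```python
import numpy as np

def f(x):
    return x**2 - 2.0             # made-up example equation f(x) = 0

rng = np.random.default_rng(1)
lo, hi = 0.0, 10.0                # initial wide range
tolerance = 1.0                   # large tolerance at first, to get some hits

for it in range(60):
    samples = rng.uniform(lo, hi, 1000)
    close = np.abs(f(samples)) < tolerance
    ratio = np.count_nonzero(close) / 1000
    if ratio > 0:
        hits = samples[close]
        lo, hi = hits.min(), hits.max()   # narrow the range around the hits
    if ratio > 0.9:
        tolerance *= 0.5                  # most samples hit: demand more precision
    if tolerance < 1e-6:
        break

print("estimated solution:", 0.5 * (lo + hi))   # close to sqrt(2) = 1.4142...
```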

I did not manage to make the method particularly exact or fast.

Machine Learning Super Desalination Guess? Could A Fast, Short Processing Path Work? Not Enough Time For Model( H2O + Salt ) Iteration. Make Water Let Go Of The Salt As The Easiest Solution. Impulse Desalination? Smart Networks

If there exists a model( H2O + salt ) acting as the water network, then any change would have to be iterated in. I guess you could then attack the network GAN-style (random input perturbations to get misclassifications). Break the network a little to activate the release-the-salt classification in the molecule.
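As a pure machine learning analogy (nothing here models real molecules), a minimal sketch of "random input to get misclassifications": take a fixed toy classifier with made-up weights and throw small random perturbations at an input until its predicted class flips.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy fixed "network": a linear classifier with made-up weights.
# This is only an ML analogy, not a model of real water molecules.
W = np.array([1.5, -2.0, 0.5])
b = 0.3

def classify(x):
    return 1 if W @ x + b > 0 else 0

x0 = np.array([0.2, 0.1, 0.1])    # a made-up input the toy network labels as 1
original = classify(x0)

# "Break the network a little": try small random perturbations of the input
# until the predicted class flips.
for attempt in range(1, 10_001):
    delta = rng.normal(scale=0.2, size=x0.shape)
    if classify(x0 + delta) != original:
        print(f"class flipped from {original} after {attempt} random tries,"
              f" |delta| = {np.linalg.norm(delta):.2f}")
        break
```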

Voilà, you have a super desalination method inspired by machine learning.

Machine Learning Physics – Maximizing The Probability Of Getting The Highest Speed? Light As A Wave Sampling Unit Algorithm?

Since sum(F) = ma, I wonder: what if you don't need to cancel out forces that are not perfectly aligned? I assume an electron can exist at different locations at the same time.

Can this property be taken advantage of?

Could a particle of some kind pick up forces and reach a higher speed by sampling gravity from different positions?

That is, a random sampling of positions followed by a focused release, resulting in the highest velocity. Perhaps light is such a machine-learning-like sampling particle, following some wave sampling algorithm.
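A toy numerical sketch of the sampling idea, with a made-up inverse-square field: sample many positions around a nominal point and compare the force at the nominal position, the average over the samples, and the best sampled position.

```python
import numpy as np

rng = np.random.default_rng(3)

# Made-up inverse-square field strength from a point source at the origin.
def field_strength(pos):
    r = np.linalg.norm(pos, axis=-1)
    return 1.0 / r**2

nominal = np.array([1.0, 0.0, 0.0])                        # the particle's nominal position
samples = nominal + rng.normal(scale=0.2, size=(1000, 3))  # sampled nearby positions

strengths = field_strength(samples)
print("force at nominal position :", field_strength(nominal))
print("average over sampled spots:", strengths.mean())
print("best sampled position     :", strengths.max())
# The speculation: a "focused release" at the best sampled position
# would pick up more than the single nominal-position value.
```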

Super Simple Image Compression Idea – Using Machine Learning Adaptation Of The Weight Matrix

With this algorithm you let a 3-layer model() adapt its initial layers to the last layer with holes in it. By the last layer I mean the last big weight vector. This way you don't need to store the whole last layer, which is of equal size to the image. Then just iterate the image with x = model( rnge() ), where x is the image.
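One way I can sketch the "store the model instead of the image" idea, though it is my own simplified reading rather than the exact last-layer-with-holes scheme: fit a small coordinate-to-pixel network to a tiny synthetic image with plain gradient descent, and compare the number of stored weights to the number of pixels. All sizes and learning rates here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Tiny synthetic 16x16 "image" (a smooth made-up pattern standing in for a photo).
H = W = 16
yy, xx = np.mgrid[0:H, 0:W] / (H - 1)
image = 0.5 + 0.5 * np.sin(3 * xx) * np.cos(2 * yy)

coords = np.stack([xx.ravel(), yy.ravel()], axis=1)   # (256, 2) pixel coordinates
target = image.ravel()[:, None]                       # (256, 1) pixel values

# Small 3-layer network; its weights are what you would store instead of pixels.
h = 8
W1 = rng.normal(scale=0.5, size=(2, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.5, size=(h, h)); b2 = np.zeros(h)
W3 = rng.normal(scale=0.5, size=(h, 1)); b3 = np.zeros(1)
lr = 0.2

for step in range(5000):
    a1 = np.tanh(coords @ W1 + b1)
    a2 = np.tanh(a1 @ W2 + b2)
    out = a2 @ W3 + b3
    err = out - target
    # plain backpropagation of the mean squared error
    g3 = err / len(coords)
    gW3, gb3 = a2.T @ g3, g3.sum(0)
    g2 = (g3 @ W3.T) * (1 - a2**2)
    gW2, gb2 = a1.T @ g2, g2.sum(0)
    g1 = (g2 @ W2.T) * (1 - a1**2)
    gW1, gb1 = coords.T @ g1, g1.sum(0)
    W3 -= lr * gW3; b3 -= lr * gb3
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

n_params = sum(a.size for a in (W1, b1, W2, b2, W3, b3))
print("pixels:", image.size, "| stored weights:", n_params)
print("reconstruction MSE:", float((err ** 2).mean()))
```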

Super simple compression.

[Image: compressed attempt of the test image, looking yellow. Hmm.]
[Image: original Ubuntu Budgie test image]
I used a denoiser first, after some time; otherwise it would not iterate successfully.

Peace Project Idea – Get The Homeless In Russia Into Farming, Since Vegetables Cost As Much As Meat. Why Not Have A Veggie Food Contract With Previously Homeless Or Poverty-Stricken People Growing Vegetables?

Availability of good choices is the recipe for sustainable peace. Everyone should have them. Therefore I think: why not get the homeless in Russia and elsewhere into farming? I mean, vegetables are on the rise. McDonald's has vegetarian meals.

This gave me an idea: why not have a veggie food contract with previously homeless or poverty-stricken people growing vegetables?

Machine Learning Physics Speculation – M( 2H + O ) = M( H2O )? If The Model M() Starts With An Error, Is The Water Life-Optimized? Has Life-Optimized Water Evolved For A Long Time?

Here the model starts, like a machine learning model(), at a random initial state. In machine learning the model would iterate until the equation holds, but the path taken would be different each time.

I wonder what this means for physics. Would the path be different for each molecule?
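A small sketch of what I mean by "the equation holds but the path is different": gradient descent on the made-up constraint x**2 + y**2 - 1 = 0 from two random initial states. Both runs end up satisfying the constraint, but they start from different errors and follow different paths to different points.

```python
import numpy as np

rng = np.random.default_rng(5)

# Made-up constraint standing in for "the equation must hold": x^2 + y^2 - 1 = 0.
def g(p):
    return p[0]**2 + p[1]**2 - 1.0

def grad_loss(p):
    return 2.0 * g(p) * np.array([2.0 * p[0], 2.0 * p[1]])   # gradient of g(p)**2

for run in range(2):
    p = rng.normal(scale=0.5, size=2)      # random initial state (the starting "error")
    path = [p.copy()]
    for _ in range(500):
        p = p - 0.05 * grad_loss(p)        # iterate until the equation holds
        path.append(p.copy())
    print(f"run {run}: start {path[0].round(2)}, midway {path[10].round(2)},"
          f" end {p.round(3)}, residual {abs(g(p)):.1e}")
# Both runs end up satisfying the same equation, but from different starting
# errors they follow different paths and stop at different points on the circle.
```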