I started thinking that you could get the speed up using a traditional electric motor. Then, to sustain that rotation, I wonder: as a concept, could you use something else to maintain the rotational speed? Something much more efficient, perhaps.
Similarly, applying this concept to magnetism: there could be a starter coil, with a more efficient system taking over later on.
Also for batteries: could a small, efficient flywheel generate high enough current, even from a failing battery or a small battery with irregular voltage? For a mobile phone this is not possible, but for a truck or car a flywheel could maybe be built in.
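As a back-of-the-envelope check, the energy a small flywheel can hold follows from E = ½Iω². The mass, radius, and speed below are my own illustrative guesses, not a design:

```python
# Energy stored in a solid-disc flywheel: E = 1/2 * I * w^2, I = 1/2 * m * r^2.
# All figures are illustrative assumptions.
import math

def flywheel_energy_joules(mass_kg, radius_m, rpm):
    inertia = 0.5 * mass_kg * radius_m ** 2      # moment of inertia, solid disc
    omega = rpm * 2 * math.pi / 60               # angular speed in rad/s
    return 0.5 * inertia * omega ** 2

energy = flywheel_energy_joules(5.0, 0.10, 10_000)  # 5 kg, 10 cm radius, 10,000 rpm
print(f"{energy:.0f} J")                            # about 13.7 kJ
```

For scale, 13.7 kJ is under 4 Wh, so a flywheel this size could smooth short current spikes rather than replace the battery itself.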
I used images as labels for classification instead of binary or integer labels. It worked, but not as well as I had hoped.
Still, I don’t give up. So what about x,y coordinates as labels? I was aiming at something like the typical accuracy maps used in construction.
I thought maybe every point could be drawn on a common image.
This way, a brain, or my primitive model, could adapt its predictions a little and make comparisons with earlier predictions.
The color for each prediction should be different. The labels are the black points. The numbers should not be visible on the image, just the classification point in the corresponding color.
So with this, my model gets a continuous update of its previous predictions via the classification image.
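A minimal sketch of that common image, with sizes, points, and colors of my own choosing: labels drawn as black pixels, each prediction round in its own color:

```python
# Draw label points in black and each prediction round in its own color on
# one shared "classification image". Sizes and coordinates are made up.
import numpy as np

SIZE = 64
canvas = np.full((SIZE, SIZE, 3), 255, dtype=np.uint8)  # white background

labels = [(10, 12), (40, 45), (55, 20)]           # ground-truth (x, y) points
predictions = {                                    # one color per prediction round
    (255, 0, 0): [(12, 14), (38, 47), (50, 22)],   # round 1: red
    (0, 0, 255): [(11, 13), (39, 46), (53, 21)],   # round 2: blue
}

for x, y in labels:
    canvas[y, x] = (0, 0, 0)                       # labels as black points
for color, points in predictions.items():
    for x, y in points:
        canvas[y, x] = color                       # predictions in their color

# The model can now be fed `canvas` so it "sees" its earlier guesses.
print(canvas.shape, (canvas == 0).all(axis=-1).sum())  # (64, 64, 3) 3
```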
The result gave me an idea. Using KMeans to separate the digits, there was an obvious class missing in KMeans: the surrounding class, which is some kind of uncertainty. So we should have 11 classes, 10 for the digits and 1 for the surrounding class.
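One way to sketch that surrounding class (my own rule, on toy data): run KMeans as usual, then reassign any point that sits far from every centroid to an extra uncertainty class. The threshold and data below are illustrative assumptions:

```python
# KMeans plus an extra "surrounding" class for points far from all centroids.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
core = np.vstack([rng.normal(0, 0.3, (50, 2)),   # two tight toy clusters
                  rng.normal(5, 0.3, (50, 2))])
stray = rng.uniform(-10, 15, (5, 2))             # scattered "surrounding" points
X = np.vstack([core, stray])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
dist = km.transform(X).min(axis=1)               # distance to nearest centroid

SURROUND = km.n_clusters                         # label 2 = surrounding class
labels = km.labels_.copy()
labels[dist > 3.0] = SURROUND                    # far-away points are "uncertain"

print(int((labels == SURROUND).sum()))           # how many points got flagged
```

For the 10 digits the same pattern would use `n_clusters=10` and label 10 as the surrounding class.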
I wonder: is there an n-dimensional complex-like number? That is, could you have some power rule for data with many columns? Squaring would shift the element’s data into the previous, lower dimension; similarly, taking the square root (as with √−1) could put the element’s data into the next higher column.
It could be that the new number is hard to understand. So the idea is to separate the data into harder and harder columns. I mean, machine learning does not care. I think it could make more sense if it computed data together with similar data.
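One toy way to read this (my own made-up rule, not established math): give a number n "columns", let squaring a unit in column k land in column k−1 with a sign flip, echoing i·i = −1, and let the "square root" climb one column up:

```python
# Toy "column shifting" number. Purely illustrative; the shift-and-sign
# rule is an invented analogy to i*i = -1, not a real algebra.
class ColumnNumber:
    def __init__(self, coeffs):
        self.coeffs = list(coeffs)             # coeffs[k] sits in column k

    def square(self):
        out = [0.0] * len(self.coeffs)
        for k, c in enumerate(self.coeffs):
            if c:
                out[max(k - 1, 0)] += -c * c   # shift one column down, flip sign
        return ColumnNumber(out)

    def root(self):
        out = [0.0] * len(self.coeffs)
        for k, c in enumerate(self.coeffs):
            if c:
                j = min(k + 1, len(out) - 1)
                out[j] += abs(c) ** 0.5        # shift one column up
        return ColumnNumber(out)

i_like = ColumnNumber([0, 1, 0])    # a unit in column 1, like i
print(i_like.square().coeffs)       # [-1.0, 0.0, 0.0]  (i*i = -1 analogue)
```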
The blog post is meant as inspiration. Reinvent the math wheel.
My previous idea was about separating water into evaporation classes, from the easily evaporative to the desert-sustainable.
From this I speculate that a battery could be improved by using sand. What if you could be picky and select the best material from within the same material? By this I mean that sand can act as a natural separator or cleaner, not only for water. Perhaps the electrolyte could also get a natural good-and-bad separation by dispersion in sand.
I was thinking: what if you rotate an object randomly during training in machine learning? Then recognizing a rotated object would pose less of a problem.
Then, if you rotate some other similar object randomly, you will have many predictions to go by. Improving the accuracy then becomes a matter of discarding the probable misclassifications, since they are not in the majority of cases. Some median of the predictions. Or maybe it’s better to check the sequence of probabilities, that is, whether you get the same result for, say, the first 10 predictions.
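The rotate-and-vote idea can be sketched as simple test-time augmentation. `predict` below is a made-up stand-in for a trained model, and I use evenly spaced angles for simplicity (random ones work too):

```python
# Classify several rotations of the same input and keep the majority answer.
from collections import Counter

def predict(image, angle):
    """Hypothetical stand-in for a trained model: fooled at some angles."""
    return "cat" if angle % 90 < 60 else "dog"

def vote_prediction(image, n_rotations=12):
    angles = [i * 360 / n_rotations for i in range(n_rotations)]
    votes = Counter(predict(image, a) for a in angles)
    return votes.most_common(1)[0][0]            # majority class wins

print(vote_prediction("some-image"))             # -> cat (8 of 12 votes)
```

The same `Counter` could also drive the sequence check: stop early once the first N rotations all agree.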
So from this I guess that the random spin properties of atomic objects could perhaps, in the machine learning sense, be sequence-sampling probability classifications. That is, if the atomic objects are classified as being part of the same atom.
What if you used a hardened surface of a Peltier element and slid it along some solar-heated surface? Then I think you would get a much warmer hot side. At the same time, you cool the upper side with a fan. This should generate more electricity than just placing the Peltier element still on the surface.
So the idea is to sum up small patches of heat. With movement and a low-friction contact, the added energy for motion should be small, like for a sliding bearing. At the least, solar energy could also move the heat collector.
From machine learning: you update the model from an error function. The update is done so that, the next time there is a check, the error function is closer to zero.
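In its simplest form, that update rule is plain gradient descent on a toy error function:

```python
# Nudge a parameter so the error shrinks on the next check.
def error(w):
    return (w - 3.0) ** 2        # toy error, minimal at w = 3

def gradient(w):
    return 2 * (w - 3.0)         # derivative of the error

w = 0.0
for _ in range(100):
    w -= 0.1 * gradient(w)       # step against the gradient

print(round(w, 3))               # -> 3.0 (error driven toward zero)
```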
Then I wonder: if everything that can be a decision is also a machine-learned decision, then there exists a rain network, since that is an easier way to create the smart conditions needed for life.
If so, then there is an error or update function that needs to be followed. Since every network needs to be optimized for life, the goal of the update could be to check whether there is some green growth.
Like watering a plant: you don’t want to water too much or too little.
So maybe all that is needed is a tree corridor with ground vegetation, from the Arabian Sea to the Mediterranean Sea.
The idea is that the rain network will adapt, since there is some purpose to it: sustaining vegetation.
This looks like reinforcement learning. Is it only lacking a bit of vegetation up north to connect the rain?
Cities need to cool down. One way is to take advantage of the problem we have with overheated pavement and roads.
If you draw the heat from these surfaces (air suction?) and combine that with a chilled mass in an underground heat storage, then chill another storage to below-zero temperatures, perhaps using solar energy. By the way, does the cold storage need to be as big as the heat storage?
I wonder if you could then create electricity from a process driven by the temperature difference.
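As a quick plausibility check, the best case for any such process is the Carnot limit between the two storages. The temperatures below are my own guesses:

```python
# Best-case (Carnot) efficiency of a heat engine between a hot,
# pavement-fed storage and a chilled one. Temperatures are illustrative.
def carnot_efficiency(t_hot_c, t_cold_c):
    t_hot = t_hot_c + 273.15     # convert Celsius to kelvin
    t_cold = t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

eff = carnot_efficiency(60.0, -5.0)   # 60 C pavement heat vs -5 C cold store
print(f"{eff:.1%}")                   # -> 19.5%
```

Real machines would get only a fraction of this, which suggests the value is as much in the cooling itself as in the electricity.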
Perhaps something like this could reduce the need for air conditioners, improve conditions for rain (no overheated surfaces), and more.