I bought an Arctic Air cold-water evaporative cooler. It works well for near-field personal cooling.
So even though it's summertime, I wonder: could you heat a room, or create near-field warming, with hot water instead of cold?
Perhaps the water temperature only needs to be 35 degrees Celsius and not boiling, since boiling would break the device. The evaporation then cools it down a bit, but the air should still be warm. A 10-degree drop, I guess.
This could be an efficient way to save money on electricity: a USB-powered heater. But I think there would need to be a special version of the device.
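A quick sanity check on those guessed numbers: 35 °C water minus a roughly 10-degree evaporative drop leaves outlet air only a few degrees above a typical room temperature, so the warming effect would be modest. The room temperature here is an assumption.

```python
# Sanity check of the guessed numbers above (35 C water, ~10 C evaporative drop).
water_c = 35          # guessed water temperature from the text
evap_drop_c = 10      # guessed evaporative cooling drop from the text
room_c = 21           # assumed room temperature for comparison

outlet_c = water_c - evap_drop_c
print(outlet_c, outlet_c - room_c)  # 25 C outlet, only 4 degrees above room
```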
I solved a stiff differential equation, taken from Wikipedia, using the Chainer Python module. The way I did it was to iterate with rng() random values as the input in a while loop. That is, iterate until model(rng()) outputs the correct number.
One problem was that dt could not be too small or update() would get zero. So I put it at 0.01, which I think is pretty high.
The loop for the initial model outputs:
y0 = model0(rng()) # previous value, y0
y1 = model1(rng()) # next value, y1
yp = (y1-y0)/dt # finite-difference derivative
loss_eq = F.mean_squared_error(yp, -15*y0) # equation residual: y' = -15*y
loss_initial = F.mean_squared_error(init, y0) # initial-condition loss
loss = loss_eq + loss_initial
model0.cleargrads()
model1.cleargrads()
loss.backward()
optimizer0.update() # update model0
optimizer1.update() # update model1
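Since the chainer snippet above is only a fragment, here is a minimal self-contained sketch of the same two-loss setup, assuming the stiff test equation from Wikipedia, y'(t) = -15*y(t) with y(0) = 1. To keep it runnable without chainer, the two model outputs are replaced by two plain trainable scalars updated with hand-computed gradient descent; the two loss terms mirror loss_eq and loss_initial above.

```python
# Sketch: fit the previous value y0 and next value y1 directly by
# gradient descent on (equation residual)^2 + (initial condition)^2.
dt = 0.01       # step size; too small and the update vanishes, as noted above
init = 1.0      # initial condition y(0) = 1
lr = 5e-5       # tiny learning rate: the 1/dt factor makes gradients large
y0, y1 = 0.0, 0.0

for _ in range(200_000):
    yp = (y1 - y0) / dt       # finite-difference derivative
    r = yp + 15.0 * y0        # residual of y' = -15*y   (plays the role of loss_eq)
    s = y0 - init             # initial-condition residual (loss_initial)
    # hand-computed gradients of loss = r**2 + s**2
    g0 = 2.0 * r * (-1.0 / dt + 15.0) + 2.0 * s
    g1 = 2.0 * r * (1.0 / dt)
    y0 -= lr * g0
    y1 -= lr * g1

print(round(y0, 3), round(y1, 3))  # converges to y0 = 1.0, y1 = 1 - 15*dt = 0.85
```

This also shows why dt matters: the 1/dt factor inside the gradients blows up as dt shrinks, which matches the update() trouble described above.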
“In computability theory, the halting problem is the problem of determining, from a description of an arbitrary computer program and an input, whether the program will finish running (i.e., halt) or continue to run forever.” – Wikipedia
I think this is possible and quite important if we rewrite the halting problem into a practical, still useful problem. Then “from a description” becomes: from a small number of initial steps.
The problem could then be fed to another machine learning model(), with input data like the model parameters and data from the initial-step calculations: the input, the output, and the error.
The idea is that this practical halting model() will output a number indicating the probability of success.
If this works, the lesson is that even if something seems impossible, you can rewrite the rules a bit and still get something useful back. Do we need to change the approach of the model()?
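As a toy illustration of this rewritten halting problem (every detail here is hypothetical, not from any real system): the “programs” are just counters of the form `while x != 0: x += d`, the description is replaced by the first two observed states, and the model() is a tiny hand-rolled logistic regression that outputs a halting probability.

```python
import math
import random

random.seed(0)

def halts(x, d, budget=1000):
    """Ground-truth label: does the counter program reach 0 within a budget?"""
    for _ in range(budget):
        if x == 0:
            return 1
        x += d
    return 0

def features(x, d):
    """The 'small number of initial steps': only the first two states."""
    x0, x1 = x, x + d
    return [1.0, x1 - x0]      # bias term + observed step direction

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Training data: random toy programs, labelled by actually running them.
data = []
for _ in range(100):
    x, d = random.randint(1, 50), random.choice([-1, 1])
    data.append((features(x, d), halts(x, d)))

# Full-batch gradient descent on a two-weight logistic regression.
w = [0.0, 0.0]
lr = 0.1
for _ in range(2000):
    grad = [0.0, 0.0]
    for f, y in data:
        p = sigmoid(w[0] * f[0] + w[1] * f[1])
        for i in range(2):
            grad[i] += (p - y) * f[i]
    for i in range(2):
        w[i] -= lr * grad[i] / len(data)

p_halt = sigmoid(w[0] + w[1] * (-1))   # program stepping toward zero
p_loop = sigmoid(w[0] + w[1] * (+1))   # program stepping away from zero
print(p_halt, p_loop)                  # high probability vs. low probability
```

The point is only the shape of the scheme: initial-step data in, probability of success out, exactly as described above, for a program family where that prediction happens to be learnable.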
Why battery? Running on battery means you can move the air cooler to wherever you are. Then using a foldable solar charger to charge the battery means that those with unreliable power, or no power at all, could also have a small air cooler. For rural Africa, perhaps.
The USB-powered Arctic Air has three speed settings, so the running time on a powerbank will vary. It came with a USB charger, but according to the manual it can also run on computer USB, so I assumed a powerbank could do the same job. The powerbank was rated at 2.1 A.
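A back-of-envelope runtime estimate for the powerbank setup. Only the 2.1 A rating comes from above; the capacity, conversion efficiency, and per-speed current draws are assumptions for illustration.

```python
# Rough runtime per speed setting on an assumed 10,000 mAh powerbank.
capacity_mah = 10_000               # assumed powerbank capacity (3.7 V cells)
cell_v, usb_v, eff = 3.7, 5.0, 0.9  # assumed cell voltage, USB voltage, efficiency
usable_wh = capacity_mah / 1000 * cell_v * eff   # ~33.3 Wh at the USB port

# Guessed current draw for the three speed settings; even "high" at
# 2.0 A stays under the powerbank's 2.1 A rating.
draws = {"low": 0.5, "mid": 1.0, "high": 2.0}
hours = {speed: usable_wh / (usb_v * amps) for speed, amps in draws.items()}
for speed, h in hours.items():
    print(f"{speed}: {h:.1f} h")
```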
Anyway, 30+ degrees at home without an air conditioner, just fans, is hard. This air cooler works by near-field personal cooling; sitting in front of the cooler is what I mean.
I think I have imagined a new business: the energy truck driver.
The idea is a large foldable solar panel park on wheels, where the operator goes from place to place on demand, partly because the energy grid is not developed. So it could be for emergencies, refugee camps, blackouts, etc. Or maybe just a business for earning money.
Anyhow, this is a very important idea for a resilient world.
You split a problem to solve it more easily. Therefore I wonder if the universe is really one big energy problem, split into many small solvable objects. I mean, would this mean it stores energy in many small singularities? Since a singularity could have a very high value, like a division by zero, this would be a manageable way to store lots of energy in a safe way.
So the problem of the universe is safe energy storage?
I wonder if this learning technique could work. The idea is to split the data into two or more groups, each belonging to a separate model().
The way I would test this is to look at some sort of confidence score for each classification the two models make. Like in long-vector binary (one-hot) encoding, where you want a 1.0 and the rest to be zeros.
The idea is then to move each sample into the model group with the highest confidence. This way the two models compete to have data in their groups.
Then, if I get 100% confidence for each model on the training data within its group, the transformation output for a test-set sample would be taken from the model with the highest confidence score.
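A minimal sketch of this competing-models scheme, with every detail simplified for illustration: two hand-rolled logistic model()s on XOR-style toy data, confidence taken as the winning class probability (how close the output is to a clean 1.0), and each sample moved to whichever model is more confident about it.

```python
import math
import random

random.seed(1)

# XOR-style data: a single linear model cannot fit all of it, so the
# two competing models can win by specialising on different regions.
def cluster(cx, cy, label, n=30):
    return [((cx + random.gauss(0, 0.4), cy + random.gauss(0, 0.4)), label)
            for _ in range(n)]

data = (cluster(-2, -2, 0) + cluster(2, 2, 0) +
        cluster(-2, 2, 1) + cluster(2, -2, 1))

def predict(w, x):
    z = w[0] + w[1] * x[0] + w[2] * x[1]
    return 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))  # clamp vs overflow

def train(samples, steps=300, lr=0.3):
    """Fit one logistic model() on its current group by plain SGD."""
    w = [0.0, 0.0, 0.0]
    for _ in range(steps):
        for x, y in samples:
            g = predict(w, x) - y
            w[0] -= lr * g
            w[1] -= lr * g * x[0]
            w[2] -= lr * g * x[1]
    return w

def confidence(w, x):
    p = predict(w, x)
    return max(p, 1.0 - p)   # how close the output is to a clean 1.0 / 0.0

# Start from a random split, then let the two models compete for samples.
group = [random.randint(0, 1) for _ in data]
for _ in range(5):
    w0 = train([s for s, g in zip(data, group) if g == 0])
    w1 = train([s for s, g in zip(data, group) if g == 1])
    group = [1 if confidence(w1, x) > confidence(w0, x) else 0
             for x, _ in data]
```

At test time, as described above, a sample's output would simply be taken from whichever of w0 and w1 reports the higher confidence on it.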
If Hot Water Freezes Faster Than Cold, Does Pretreated Air Also Cool Faster?
The reason for this effect, I believe, is optimization for life. The fluid will, in some sense, try to reach an acceptable mean temperature: by cooling fast, the mean will be lower if the temperature then skyrockets again.