My idea is simple.
Inspired by an old philosophical question, I believe it's better to assume something is not divisible in the limit, since then you end up with something that still exists.
Then try to divide an odd number by 2 and you end up with a decision: which side should have one more?
3/2 “equals” (1 and 2) or (2 and 1); compare 1.5*2 with 1+2.
1/2 “equals” (0 and 1) or (1 and 0); compare 0.5*2 with 0+1.
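The split above can be sketched in a few lines. The function name `decision_split` is mine, not from the text; it just makes the "which side gets one more" decision explicit.

```python
def decision_split(n, side="left"):
    """Split integer n into two parts differing by at most 1.
    'side' decides which part gets the extra unit when n is odd."""
    low, high = n // 2, n - n // 2
    return (high, low) if side == "left" else (low, high)

print(decision_split(3, "left"))   # (2, 1)
print(decision_split(3, "right"))  # (1, 2)
print(decision_split(1, "left"))   # (1, 0)
```

Either choice sums back to the original, which is the comparison 1.5*2 with 1+2 made above.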
This can be used in algorithms or math, I think.
In machine learning you could use this fairly easily, I believe. Every ratio is a decision number, which means you also get a decision for derivatives dy/dx.
So for differential equations, I wonder if you can have a decision model(data) for when to use a particular finite difference, f’=(f(x+h)-f(x))/h. Left or Right = model(data). That is, the model outputs whether to take the left or right side of the point, so it's a classification problem.
So I called it a decision number because it comes directly from a model-worthy decision.
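A minimal sketch of that classification idea, with a trivial stand-in for the learned model (in the idea proper, model(x) would be trained on data):

```python
import math

def forward_diff(f, x, h=1e-5):
    # right-side finite difference
    return (f(x + h) - f(x)) / h

def backward_diff(f, x, h=1e-5):
    # left-side finite difference
    return (f(x) - f(x - h)) / h

def decided_diff(f, x, model, h=1e-5):
    """model(x) returns 'left' or 'right'; that decision picks the scheme."""
    return backward_diff(f, x, h) if model(x) == "left" else forward_diff(f, x, h)

# Placeholder model: always choose the right side.
d = decided_diff(math.sin, 0.0, lambda x: "right")
print(round(d, 3))  # ≈ 1.0, since d/dx sin(x) at x = 0 is cos(0) = 1
```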
A quick idea.
Why not test if you can use Ogg Vorbis (an audio signal codec) or Opus compression to compress a vec_xyz.flatten vertex vector?
Should be interesting to see how much detail still remains.
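Real Vorbis or Opus encoding needs external libraries, so here is only a stand-in sketch: it flattens the vertex vector and applies 16-bit quantization, which is the first lossy step any audio-codec path would impose on the data. The function names are mine.

```python
def flatten_quantize(vertices):
    """vertices: list of (x, y, z) floats in [-1, 1] -> 16-bit integer samples."""
    flat = [c for v in vertices for c in v]
    return [int(max(-1.0, min(1.0, c)) * 32767) for c in flat]

def dequantize(samples):
    return [s / 32767 for s in samples]

verts = [(0.1, -0.5, 0.9), (0.0, 0.25, -1.0)]
roundtrip = dequantize(flatten_quantize(verts))
max_err = max(abs(a - b)
              for a, b in zip([c for v in verts for c in v], roundtrip))
print(max_err < 1e-4)  # quantization error stays tiny at 16 bits
```

Feeding these integer samples to an actual Opus encoder and decoding them back would then show how much vertex detail survives the psychoacoustic model.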
Inspired by machine learning, where you have data and target values, I wondered: how do differential equations make sense to a network?
The way I see it, differential equation solving takes far too few boundary values. It's pretty much just the amount needed to get the equation = 0, that is, a zero-valued difference.
In machine learning you usually have a lot of target values.
So a differential equation is just a target-value replacement: missing values replaced by an implicit function. A simplification.
How can this make sense to machine learning?
For machine learning we could have missing values. Maybe we should treat every target value as a boundary value and have an implicit equation for some smart interpolation range.
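A toy version of "boundary value as sparse target + equation as implicit target", under my own assumptions: the model is y = a*exp(x), the equation y' - y = 0 holds for any a, and a single boundary target y(0) = 2 pins down the free constant.

```python
import math

def loss(a):
    # sparse target term: the one boundary value y(0) = 2
    data = (a * math.exp(0.0) - 2.0) ** 2
    # implicit-equation term: residual of y' - y = 0 on a grid,
    # with y' taken as a forward finite difference
    xs = [i / 10 for i in range(11)]
    h = 1e-4
    resid = sum(((a * math.exp(x + h) - a * math.exp(x)) / h
                 - a * math.exp(x)) ** 2 for x in xs) / len(xs)
    return data + resid

# crude search over candidate constants
best = min((loss(a), a) for a in [0.5, 1.0, 1.5, 2.0, 2.5])[1]
print(best)  # 2.0 -- the boundary target selects the constant
```

The equation term is satisfied by every candidate here, so the single data term does the deciding, which is exactly the "too few boundary values" observation above.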
Another idea would be to do research on how to come up with algorithms: just use machine learning to see what can be done, and then make a similar, somewhat human-understandable algorithm. This goes for programming code as well. Machine learning could give hints on design.
I wonder: how do you best combine electric motors? Everything needs to be optimized for life, and life can operate at small dimensions. This means efficiency needs to be good at small dimensions. So from this I guess there should exist multi-motor electric engines.
What if you had 4+ small electric motors for the same output as one large motor? I saw that the sewing machine went from a 90 W motor to a small low-voltage 10 W motor, for those solar-powered machines.
What would it look like? Is a 4-cylinder engine inspirational, since it's a proven structure?
So I think it could be an idea to create a powerful low-voltage electric motor system.
There is a simple way to get much faster convergence, I think.
My idea is to just iterate like hell on the first batch. When the loss reaches something low, like 1e-6, you continue with the rest of the training set.
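A minimal sketch of this schedule, assuming plain gradient descent on a one-parameter linear model (my choice of model, just to make the two phases visible):

```python
def sgd_step(w, batch, lr=0.1):
    # model y = w * x, mean squared loss; gradient is mean of 2*(w*x - y)*x
    g = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
    return w - lr * g

data = [(x / 10, 3 * x / 10) for x in range(1, 21)]  # target weight is 3
first, rest = data[:4], data[4:]

w = 0.0
# Phase 1: hammer the first batch until its loss is tiny
while sum((w * x - y) ** 2 for x, y in first) / len(first) > 1e-6:
    w = sgd_step(w, first)
# Phase 2: continue with the rest of the training set
for _ in range(50):
    w = sgd_step(w, rest)

print(round(w, 3))  # ≈ 3.0
```

Whether the head start from phase 1 actually speeds up the full run would need testing on a real model; this only shows the mechanics.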
Generator – Basically, don't be a detector. Just write what you feel. Generate. Use your imagination. Write down all the possibilities you can think of. It does not matter if some judgement feeling gets in the way; you can always rewrite what you have written later.
Detector – Regain some judgement feeling, but not too much. Focus on making the text understandable to someone else. Be kind. Rewrite before reduction; rewrite among the sentences of the idea.
So the idea is that whatever you write, it can be rewritten to make sense to a network.
When you do something together, other than bickering, you learn to work together.
So before peace: learn to work together by building something important together.
There are probably many examples where countries can work together with their expertise.
Water treatment and desalination seem important enough in this region, solar being another. Or why not a Desert To Food project?
A quick idea.
I think I have the start of an idea for measuring data, if you have a simple machine learning model.
Feed that model, y = model(np.random.rand(1)), and measure the time for the total iteration to converge, that is, to reach the data.
This time value is then some measurement of the complexity of the data.
One thing that could be of value here is that ordinary data like sin() needs a large number of initial iterations. I think this could be compensated for in a similar way in other problems.
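A heavily simplified stand-in for the measurement: a one-weight "model" where the iteration count to converge (steadier than wall-clock time) serves as the complexity number. In this toy, complexity collapses to distance from the initialization; a real version would use an actual network and real data.

```python
def iters_to_fit(target, lr=0.1, tol=1e-6, max_iter=10_000):
    """Count gradient steps until the squared loss (w - target)^2 < tol."""
    w = 0.0
    for i in range(max_iter):
        if (w - target) ** 2 < tol:
            return i
        w -= lr * 2 * (w - target)  # gradient of (w - target)^2
    return max_iter

# A target farther from the initialization takes more iterations,
# so the count acts as a crude "how hard is this data" measurement.
print(iters_to_fit(0.1), iters_to_fit(0.9))
```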
// Per Lindholm 2018-05-02
A quick idea.
Since you make assumptions when creating a differential equation by hand, I assume this can be generalised by using machine learning models.
So I guess it's possible to simulate a slope field. The slope field is the equation. I picture a grid with randomly rotated vectors that the model can then iterate toward better and better values.
This way you're not bound by symbolic math, only by the data?
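The picture above can be sketched directly: a grid of randomly oriented slopes relaxed step by step toward target slopes. Here the targets come from a known equation, y' = x - y (my choice), standing in for whatever a learned model would output at each grid point.

```python
import random

random.seed(0)
xs = [i / 4 for i in range(5)]
ys = [j / 4 for j in range(5)]
true_slope = {(x, y): x - y for x in xs for y in ys}
# start from randomly oriented vectors on the grid
field = {k: random.uniform(-2, 2) for k in true_slope}

for _ in range(200):
    for k in field:
        # iterate each vector toward a better and better value
        field[k] += 0.1 * (true_slope[k] - field[k])

err = max(abs(field[k] - true_slope[k]) for k in field)
print(err < 1e-6)  # True: the field has converged to the equation
```

Replace `true_slope` with a model's per-point prediction and the grid is no longer bound to any symbolic form of the equation, only to the data behind the model.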
A quick idea.
You get a normal distribution from a random number generator if you sum or average many samples.
This would cause a problem for water and life, since in a normal distribution of something there should exist concentrations that are high, toxic or bad-tasting. Like putting instant coffee in hot water: you want an even distribution.
So since everything should go towards a normal distribution by default, I guess that water is like a network that calculates, or uses its energy, to get a distribution that is optimized for life.
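The first claim can be checked quickly: sums of uniform random numbers tend toward a normal distribution (the central limit theorem), which is one concrete reading of "everything goes towards a normal distribution".

```python
import random
import statistics

random.seed(1)
# sum 12 uniform [0, 1) samples, 20000 times
sums = [sum(random.random() for _ in range(12)) for _ in range(20_000)]

mean = statistics.mean(sums)     # expected 12 * 0.5 = 6
stdev = statistics.pstdev(sums)  # expected sqrt(12 * 1/12) = 1
print(round(mean, 1), round(stdev, 1))
```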