My idea is simple.
It's inspired by image stacking, where you have four or more noisy images of the same object, taken at different times or with different equipment. That means the noise looks a bit different in each image, so combining them cancels it out.
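As a toy sketch of that stacking step (all numbers here are made up for illustration, and a 1-D signal stands in for an image):

```python
from statistics import median

# A hypothetical "true" signal, plus four noisy captures of it.
# The noise differs per capture, like frames shot at different times.
truth = [10.0, 20.0, 30.0, 40.0]
captures = [
    [10.4, 19.8, 30.1, 39.7],
    [ 9.7, 20.3, 29.6, 40.2],
    [10.1, 20.6, 30.4, 40.5],
    [ 9.5, 19.5, 29.8, 39.9],
]

# Stacking: the per-sample median suppresses noise that differs per capture.
stacked = [median(col) for col in zip(*captures)]

def mean_abs_error(signal):
    return sum(abs(s - t) for s, t in zip(signal, truth)) / len(truth)

# The stacked signal is closer to the truth than any single capture.
print(mean_abs_error(stacked) < min(mean_abs_error(c) for c in captures))  # → True
```

Real stacking software works per pixel on 2-D frames, but the principle is the same: independent noise averages out, shared signal survives.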
I wonder if the same could be done with multicore CPUs. The idea is to crank up the clock frequency while at the same time limiting power consumption.
The assumption I make is that this will induce random computation errors. But with many cores running the same work, it should be possible to correct these errors with noise-reduction methods inspired by image stacking.
So the idea is that the price for very high frequencies is a lot of cooperating cores.
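Carried over to CPUs, the stacking step would become something like redundant execution plus a per-bit majority vote over the cores' results. A minimal sketch, with the fault pattern chosen by hand for illustration (real faults would hit random bits on random cores):

```python
def majority_vote(values, width):
    """Per-bit majority across redundant results: the stacking analogue."""
    out = 0
    for bit in range(width):
        ones = sum((v >> bit) & 1 for v in values)
        if ones * 2 > len(values):  # more than half the cores say this bit is 1
            out |= 1 << bit
    return out

truth = 1234 + 5678  # the value every core should have produced

# Simulate 9 unreliable cores: each returns the right sum except for one
# flipped bit (a different bit per core, standing in for random errors).
faulty = [truth ^ (1 << (i % truth.bit_length())) for i in range(9)]

corrected = majority_vote(faulty, truth.bit_length())
print(corrected == truth)  # → True
```

The vote only works while errors stay rare and uncorrelated per core, which mirrors the image-stacking assumption that each frame's noise is independent.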