Machine Learning Idea – Is Server Heat Dependent On The Type Of Layer Used? LSTM

I wonder about this. My little laptop gets very hot every time I run something like an LSTM. I guess it could be the cache behavior. For many processors in a server room, though, this could make a substantial difference.
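
A quick way to test the hunch would be to time-box a forward-pass loop for an LSTM and for a plain linear layer and read the CPU temperature afterwards. The sketch below is only that, a sketch: it assumes a Linux machine where psutil.sensors_temperatures() works and that PyTorch is installed, and the layer sizes, sequence length, and time budget are arbitrary choices, not a proper benchmark.

```python
import time

import psutil
import torch
import torch.nn as nn


def cpu_temperature():
    """Return the first available CPU temperature reading in °C, or None."""
    sensors = psutil.sensors_temperatures()  # Linux-only in psutil
    for readings in sensors.values():
        if readings:
            return readings[0].current
    return None


def run_for(layer, x, seconds=10.0):
    """Run forward passes for a fixed wall-clock budget, return (iterations, temperature)."""
    start = time.time()
    iterations = 0
    with torch.no_grad():
        while time.time() - start < seconds:
            layer(x)
            iterations += 1
    return iterations, cpu_temperature()


if __name__ == "__main__":
    batch, seq_len, features = 64, 100, 128
    x = torch.randn(batch, seq_len, features)

    lstm = nn.LSTM(features, features, batch_first=True)
    linear = nn.Linear(features, features)

    print("baseline temperature:", cpu_temperature())

    iters, temp = run_for(lstm, x)
    print(f"LSTM:   {iters} forward passes, temperature after: {temp}")

    time.sleep(30)  # let the machine cool down between runs

    iters, temp = run_for(linear, x)
    print(f"Linear: {iters} forward passes, temperature after: {temp}")
```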

So the idea is to optimize algorithms with machine learning for low energy consumption or heat dissipation. Could there be a “lagom” (Swedish for “just right”) speed for the amount of heat?
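
If there is a “lagom” speed, one crude way to look for it is a pacing loop: run a training step, read the temperature, and adjust a pause so the machine hovers around a chosen target instead of running flat out. Everything specific below is an assumption for illustration: the target temperature, the gain, and the do_training_step() placeholder, again relying on psutil on Linux for the temperature reading.

```python
import time

import psutil

TARGET_TEMP = 70.0  # °C -- an arbitrary "lagom" target, not a recommendation
GAIN = 0.05         # seconds of extra pause per degree above the target


def cpu_temp():
    """First available CPU temperature in °C, or None (psutil, Linux only)."""
    sensors = psutil.sensors_temperatures()
    return next((r[0].current for r in sensors.values() if r), None)


def do_training_step():
    """Placeholder for one real training step, e.g. one LSTM mini-batch."""
    time.sleep(0.1)


def lagom_loop(steps=1000):
    """Pace the work so the CPU hovers near TARGET_TEMP instead of running flat out."""
    pause = 0.0
    for _ in range(steps):
        do_training_step()
        temp = cpu_temp()
        if temp is not None:
            # Lengthen the pause when running hot, shorten it when running cool.
            pause = max(0.0, pause + GAIN * (temp - TARGET_TEMP))
        time.sleep(pause)


if __name__ == "__main__":
    lagom_loop(steps=100)
```

The interesting follow-up question would then be how much work each pace gets done per degree (or per joule), which is roughly what the first sketch tries to measure.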