A good direction. We evidently process information, performing basic problem-solving at a fundamental level, via an entropy-reduction process. Whatever form informational entropy takes in the brain (as opposed to Shannon's or some other formal metric), it must align with thermodynamic, if not thermo-geometric, equilibria. Network entropies and efficiencies may well be the very stuff of information and its processing.
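
For what it's worth, the alignment is exact at the formal level: the Gibbs entropy of statistical mechanics is just the Shannon entropy of the microstate distribution, up to a constant factor. These are the standard textbook definitions, nothing specific to brains:

```latex
S_{\mathrm{Gibbs}} = -k_B \sum_i p_i \ln p_i
                   = (k_B \ln 2)\, H_{\mathrm{Shannon}},
\qquad
H_{\mathrm{Shannon}} = -\sum_i p_i \log_2 p_i .
```

So any reduction in the informational entropy of physical degrees of freedom carries a thermodynamic price, whatever metric the brain "natively" uses.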

When someone does intensive mental work, they feel hungry because the brain needs a lot of glucose. Glucose is the brain's fuel. If you supply the brain with fuel and cool it effectively, it should work better, like a processor in a PC.

They demonstrated that the slower a neuron learns, the less heat and entropy it produces, which increases its efficiency.
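
For scale, the hard floor on any such heat budget is Landauer's bound: erasing one bit of information must dissipate at least k_B T ln 2 of heat. At body temperature that works out to (textbook arithmetic, not a figure from the study):

```latex
Q_{\min} = k_B T \ln 2
         \approx (1.38 \times 10^{-23}\,\mathrm{J/K})(310\,\mathrm{K})(0.693)
         \approx 3 \times 10^{-21}\,\mathrm{J\ per\ bit}.
```

Biological synapses are generally thought to operate many orders of magnitude above this floor, which is what leaves room for efficiency gains from slower learning.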

This suggests an interesting tradeoff between speed of adaptation to the environment and efficiency. Maybe it isn't even a rigid tradeoff, and we can buy faster plasticity at the expense of decreased efficiency (e.g., by altering the chemical balance with stimulants like caffeine?).
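
The slow-is-cheap half of that tradeoff is easy to see in a toy stochastic-thermodynamics model. The sketch below is not the model from the study, just the textbook dragged harmonic trap with made-up parameters: moving a "weight" a fixed distance D in time tau dissipates work that scales like D^2/tau, so halving the speed roughly halves the heat.

```python
import numpy as np

# Toy model (not the study's model): an overdamped "weight" x sits in a
# harmonic trap whose center lam is dragged from 0 to D over a time tau.
# The free-energy change is zero, so the mean work done on the weight is
# pure dissipation; for slow dragging it falls off as gamma * D**2 / tau.
# Units: k_B*T = 1, friction gamma = 1, trap stiffness k = 1.

rng = np.random.default_rng(0)

def mean_dissipated_work(tau, D=5.0, k=1.0, dt=0.01, n_traj=2000):
    """Average work to drag the trap center to D in time tau."""
    n_steps = int(tau / dt)
    dlam = (D / tau) * dt                          # trap shift per step
    x = rng.normal(0.0, np.sqrt(1.0 / k), n_traj)  # start in equilibrium
    lam = 0.0
    work = np.zeros(n_traj)
    for _ in range(n_steps):
        # Work increment: dW = (dV/dlam) * dlam = -k * (x - lam) * dlam
        work += -k * (x - lam) * dlam
        lam += dlam
        # Overdamped Langevin step: dx = -k*(x - lam)*dt + sqrt(2*dt)*xi
        x += -k * (x - lam) * dt + np.sqrt(2.0 * dt) * rng.normal(size=n_traj)
    return work.mean()

for tau in [5.0, 10.0, 20.0, 40.0, 80.0]:
    w = mean_dissipated_work(tau)
    print(f"tau = {tau:5.1f}   <W_diss> = {w:6.3f}   D^2/tau = {25.0/tau:6.3f}")
```

In that picture a stimulant would shift where the brain sits on the curve (faster, hotter) rather than break the tradeoff itself.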

Slow learning means low intellectual ability. Which is better: a fast processor with high heat output and a powerful cooler, or a slow processor that needs no cooler? It depends on the specific application.
A processor also works better on clean electrical power, without interference (caffeine, etc.).

A study investigating to what extent learning is an irreversible process would be an interesting extension of the thermodynamic approach to learning theory.
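
One way to pose that precisely, in standard second-law terms (this frames the question, it doesn't answer it):

```latex
\Delta S_{\mathrm{tot}} = \Delta S_{\mathrm{system}} + \Delta S_{\mathrm{environment}} \ge 0,
\qquad \text{reversible} \iff \Delta S_{\mathrm{tot}} = 0 .
```

Since quasi-static protocols can in principle drive the total entropy production toward zero, any strict irreversibility of learning would have to come from finite-time constraints, i.e. from having to learn fast enough to matter.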