Time Symmetries of Memory Determine Thermodynamic Efficiency.

2021 
While Landauer's Principle sets a lower bound on the work required for a computation, that work is recoverable for efficient computations. However, practical physical computers, such as modern digital computers or biochemical systems, are subject to constraints that make them inefficient, irreversibly dissipating significant energy. Recent results show that the dissipation in such systems is bounded by the nonreciprocity of the embedded computation. We investigate the consequences of this bound for different types of memory, showing that different memory devices are better suited to different computations. This correspondence comes from the time-reversal symmetries of the memory, which depend on whether information is stored positionally or magnetically. This establishes that the time symmetries of the memory device play an essential role in determining energetics. The energetic consequences of time symmetries are particularly pronounced in nearly deterministic computations, where the cost of computing diverges as the negative logarithm of the error rate. We identify the coefficient of that divergence as the dissipation divergence. We find that the dissipation divergence may be zero for a computation when implemented in one type of memory while it is maximal when implemented in another. Given flexibility in the type of memory, the dissipation divergence is bounded below by the average state compression of the computation. Moreover, we show how to explicitly construct the memory that achieves this minimal dissipation. As a result, we find that logically reversible computations are indeed thermodynamically efficient, but logical irreversibility comes at a much higher cost than previously anticipated.