Building a Completely Reversible Computer (arxiv.org)
90 points by lainon on April 19, 2017 | hide | past | favorite | 63 comments



OT: when I click on this, researchgate claims "People who read this publication also read 'Is minimal hepatic encephalopathy completely reversible following liver transplantation?'

I don't believe them.


Maybe someone was searching for "completely reversible" and got intrigued? But I guess a recommender system is the more likely explanation.


Not a recommender system or anything that intelligent. This is basically a keyword search of the paper titles. Even something as simple as term-frequency analysis would place the two papers far apart.


A couple questions:

1. How does "reversible" here apply to encryption algorithms? In a sense the encryption key is forgotten, but with enough computing power it can be derived from the output (at least for certain encryption methods). So, in a sense no information is lost.

2. If two separate computers perform the same reversible operation, but one in reverse, are the two computers (if considered as one system) consuming no energy? Or must the operations be performed on the same computer sequentially?

If the system of two computers is valid, then does that mean that somehow certain operations / calculations produce energy and others consume it?

I realize both these questions are somewhat nonsensical, so I clearly am missing something fundamental about this theory.


Reversible (in the paper's sense) is more about the paths information takes than whether it's recoverable in principle. Lossless compression is reversible, but a reversible computer performing it would still need to have storage equal to the size of the entire original data.

It's also important to note that the fact that you are doing calculations isn't the direct cause of energy consumption. Landauer's principle is a consequence of the second law of thermodynamics, which mandates certain heat losses in certain situations but doesn't tell you anything about how they happen. The question of what actually causes the second law of thermodynamics to hold at the level of fundamental interactions is called Loschmidt's paradox, and there's not really a consensus as to whether we have an answer.


" it would still need to have storage equal to the size of the entire original data."

I think it's worse than this. You need the intermediate steps as well. So, for every AND/OR-type calculation, which takes multiple inputs to one output, you need to keep all the input bits.

Feynman has an interesting chapter on this in his lectures on computation, including a cool billiard ball computer (a la Ed Fredkin I think).


On 2: You can't run reversible logic on our current non-reversible hardware and expect to see lower power usage. If you had two reversible computers, one could reverse the output of the other while consuming minimal energy, just as a single reversible computer could reverse its own output. The operations must be run forward at least once, or technically you don't have the outputs to feed into the reverse.


For 1., I don't think reversible computing is about finding inverse functions of algorithms, but rather about reversing the operations themselves while retaining whatever memory is required to reverse otherwise-irreversible steps.


2. Reversible operations have no minimum amount of energy that must be dissipated as heat, unlike irreversible operations, which must dissipate energy every time bits are erased. So each computer individually could be consuming no energy, and both together would consume the sum (zero) of the two.


See also a type system and programming language for reversible computations: http://gradworks.umi.com/35/87/3587675.html


What kind of applications would this actually be useful for? I don't think I really understand the benefit(s) here.


The benefit of a reversible computer is that there is no minimum amount of energy it must dissipate as heat in order to perform its computations.

Every time a bit of information is erased, a certain amount of energy must be dissipated as heat. A reversible computer doesn't (have to) erase information.

Reversible computing is energy-aware computing taken to its logical extreme.
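To put a number on that minimum: Landauer's bound says erasing one bit must dissipate at least kT ln 2 of heat. A quick back-of-the-envelope calculation (my own illustration, not from the paper; the erasure rate is a made-up figure):

```python
import math

# Landauer's bound: erasing one bit dissipates at least k*T*ln(2) joules.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # roughly room temperature, kelvin

e_per_bit = k_B * T * math.log(2)  # ~2.9e-21 J per erased bit

# A hypothetical chip erasing 10^18 bits per second would dissipate,
# at the very minimum, this much power -- only a few milliwatts:
power = e_per_bit * 1e18  # watts
print(f"{e_per_bit:.3e} J/bit, {power:.3e} W")
```

The bound is tiny compared to what real chips dissipate today, which is the point made elsewhere in this thread: we are far from the thermodynamic limit, but a reversible computer would have no such floor at all.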


As the paper points out, there are a bunch of ways in which reversible computers will lose energy, even under ideal circumstances. It's also important not to gloss over the fact that reversible circuits (e.g. CSWAP) are tremendously difficult to create, so it's unlikely we'll be seeing computers built from them anyway; and without reversible circuits at the logic gate level, you don't really gain the expected energy benefits.


Yes, there are large difficulties to creating reversible computers in practice. However, it's the theoretical benefits (such as no minimum power dissipation) that motivate people to work on them.


I get that. The concept of saving energy is what I do understand.

What I don't understand is what kind of applications this could actually be used for. Given the (by-design) limitations of such a system, how would it be actually usable? What real-world scenarios would want this and be able to utilize it?


Reversible computers are Turing-complete. The only requirement is you either have to store all intermediate results (no erasing), or you have to periodically "uncompute" them. I don't remember what the minimum bound on the overhead is, but it's manageable.


Conventional processors will eventually run into physical limits of heat dissipation that will impede Moore's law. Reversible computers are a way of circumventing this physical limitation because they dissipate less energy as heat.


Physics simulations. I work in drug design, and my field would be enormously benefited by this. Quantum mechanics calculations are naturally reversible.


The interior of the computation is reversible, but the measurement performed at the end most definitely is not. If it were, quantum computers would be able to solve NP-complete problems trivially, and complexity theorists would get pissed that physicists were holding out on them for decades.


Can you expand on this?


Sure. I'll use the quantum circuit model of quantum computation since it removes a lot of non-essential details.

A quantum circuit is analogous to an idealized digital one, in that it is composed of gates with (possibly multiple) inputs and outputs, hooked together by "wires" that "carry" quantum states. However, the gates aren't able to perform arbitrary operations; they can only multiply the inputs by unitary matrices to produce the outputs. These kinds of matrices are always invertible, so none of the gates can "lose" any information the way some digital gates do. You also can't hook one output to multiple inputs, though you can ignore an output if you want.

However if you want to do some sort of computation whose output makes it into your brain (or a classical computer) you need to measure one or more of the output states. With digital circuits this is trivial; with quantum circuits you have to decide how you want to make the measurement (essentially along what "axis" between zero and one), and the output is in general probabilistic based on the angle between the measurement axis and the actual value. You also only get to do one measurement per output per instance of computation; the original output state will be replaced by the measured value which will usually not be the original output state, and all the intermediate states will already be consumed by the computation itself.

This is why most quantum algorithms are probabilistic. Shor's algorithm (the RSA breaking one) can actually fail even with an ideal quantum computer. However the likelihood of failure is small enough that running the computation a few times is enough to get the right answer with high probability (and it's easy to check if the output is right).

You could take any computable function that takes in N bits, apply it to a superposition of all 2^N possible inputs (encoded as N qubits) with about the same level of effort it takes to apply it to only one classical input. However there's no measurement which can recover the 2^N different outputs in one shot, or find the input associated with a specific output value. Most of the ingenuity in designing quantum algorithms comes in smuggling data out of quantum states by cleverly causing interference to push the possible output values into alignment with your measurement axis.
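A minimal numpy sketch of the two points above (unitary gates lose nothing; measurement is where reversibility and determinism end), using the single-qubit Hadamard gate as the example:

```python
import numpy as np

# Hadamard gate: a unitary (hence invertible) single-qubit gate.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Unitarity: H times its conjugate transpose is the identity,
# so no information is lost inside the circuit.
assert np.allclose(H @ H.conj().T, np.eye(2))

# Apply H to |0>; the result is an equal superposition of |0> and |1>.
state = H @ np.array([1.0, 0.0])

# Measurement in the computational basis is NOT reversible: all we
# get are outcome probabilities |amplitude|^2, here 50/50, and the
# superposition is destroyed in the process.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5]
```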


Creating a universe.


So reversible computing requires less energy not more right? Last I heard it is damn hard to forget information.


In theory yes, the catch is that you're not allowed to forget information. For instance, you cannot use an "AND" gate, because a result of 0 would be ambiguous, but you can use the Toffoli gate [1] which takes three inputs and produces three outputs, mapping (a,b,c) to (a,b,c^(a&b)).
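A toy Python sketch (my own illustration) of that Toffoli mapping, checking that it's a bijection on 3-bit states and, in fact, its own inverse:

```python
from itertools import product

def toffoli(a, b, c):
    """Toffoli (CCNOT) gate: maps (a, b, c) to (a, b, c XOR (a AND b))."""
    return a, b, c ^ (a & b)

# Reversible: applying the gate twice returns every 3-bit state to itself,
# so it is a bijection (its own inverse, in fact).
for bits in product([0, 1], repeat=3):
    assert toffoli(*toffoli(*bits)) == bits

# With c fixed to 0, the third output is a AND b -- an AND gate whose
# "extra" outputs (a, b) preserve enough information to run it backwards.
assert toffoli(1, 1, 0)[2] == 1
assert toffoli(1, 0, 0)[2] == 0
```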

But what happens to the outputs you don't need? They do need to go somewhere. A reversible circuit ends up having a lot of unused outputs which need to be erased if the circuit is to be reused. In essence, we have traded a heatsink for a bitsink, but they are essentially the same thing.

I don't see reversible computing as categorically different from regular computing. It's regular computing, with careful and detailed heat management.

[1] https://en.wikipedia.org/wiki/Toffoli_gate


I wonder if the "bitsink instead of heatsink" problem might be somewhat mitigable.

So, we are just pushing our 0s and 1s around, instead of permanently destroying and re-creating them. As reversible logic is as universal as, say, NAND logic - wouldn't it be possible to sort the unused output bits of any gates into two pools for 0s and 1s [1] and use these pools for any constant inputs that are needed in the logic? So that we would only have to "create" any new 0s and 1s when one of these two pools runs dry?

[1] E.g. a sorting buffer with 0s at one end and 1s at the other. Or does sorting necessarily mean we are sorting unknown bits at the cost of mixing known bits (constant gate inputs)?


Sorting is not reversible, at least if your output is no larger than your input. [0,1] and [1,0] both sort to [0,1], so you'd already need 1 extra bit to unsort 2 bits.

I don't see how it would help anyway, but maybe I'm misunderstanding your suggestion. The state of these unwanted outputs needs to be kept around, if you know they will be 0 or 1 then the state is already implicit in the machine and the output isn't actually necessary.


If you sort two bits and get 0, 1 you can't know if you started out with 0, 1 or 1, 0.

The entropy of a sorted list of bits is O(log(n)), while the entropy of the unsorted bits is O(n).
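To make that concrete with a small Python check (my own illustration): a sorted list of n bits is determined entirely by its count of ones, so only n+1 distinct outputs are possible, at most log2(n+1) bits of information, while the unsorted list carries up to n bits.

```python
import math
from itertools import product

n = 8

# Unsorted: 2^n equally likely bit strings -> n bits of entropy.
unsorted_entropy = math.log2(2 ** n)

# Sorted: the output is determined by the count of ones alone,
# so only n+1 distinct outputs remain -> at most log2(n+1) bits.
sorted_outputs = {tuple(sorted(bits)) for bits in product([0, 1], repeat=n)}
assert len(sorted_outputs) == n + 1

sorted_entropy_bound = math.log2(len(sorted_outputs))
print(unsorted_entropy, sorted_entropy_bound)  # 8.0 vs ~3.17
```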


[Self reply because this more or less concerns all replies to parent I have seen so far.]

"sorting is not reversible"/"you can't know [the original order]"/"sorting loses information"

These replies have two problems in common:

1. They are wrong in the context of reversible gates, because (as I have hinted in my comment) when you sort bits using a normal reversible gate, then the original order of the bits will be encoded in the unused gate outputs. No information is lost, that's why we call these gates reversible in the first place.

2. The heart of my speculation lies in the question if a special irreversible circuit could be integrated into a reversible CPU, which moves bits into order without unnecessarily grounding charges. The more I think about it, the more I tend to think it could.

Edit: To make a clearer point on where I'm coming from here: I view reversibility not as a goal in itself, but a tool to use to get ultra low power compute capabilities. To achieve this, the unused output charges need to be recycled into the system. If you solve that problem, you have achieved the goal.


Sorting loses information proportional to the size of the list you are sorting.


Couldn't you just uncompute instead of erasing to restore the ground state before computing something else?

To my understanding you just need energy to set the inputs, give it a little push and everything else runs on its own (forwards and backwards).


There's a Philip K Dick book about that. (For any sufficiently weird idea, there usually is.)

Counter Clock World

The most powerful -- and most feared -- organisation in the world is the Library, in charge of expunging the written records of events, which have no longer happened.

https://en.wikipedia.org/wiki/Counter-Clock_World

The novel describes a future in which time has started to move in reverse, resulting in the dead reviving in their own graves ("old-birth"), living their lives in reverse, and eventually returning to the womb where they split into an egg and a sperm during copulation between a recipient woman and a man.

The Hobart Phase

The Hobart Phase is the new order of life where people rise from the dead and are rejuvenated. Time reversal apparently began in 1986. Other than aging, Hobart Phase resurrection has changed nutritional and excretion processes and associated social taboos. People do not eat, but instead consume "Sogum" anally through a pipe, and later "plop" out food orally, which is done in private, due to its 'shameful' nature. As for smoking, cigarettes are no longer smoked, but the smoke instead blown back into them, making them grow back to normal size (this also clears and freshens the air). "Goodbye" and "hello" have reversed their order within standard greetings, and "food" is used as a drop-in replacement for the expletive "shit". It is stated that Mars colonists do not have the Hobart Phase on their world, and it is limited to Earth, and presumably its lunar colonies as well.


You can uncompute, but in that case you need to erase the input to replace it with a new input.


Much less, in theory.

At an imaginary hand waving level imagine the complete state of a cpu as passing charged capacitors around and occasionally charging new ones or discharging some. This is not all that far off of how they work...

If you want to count to 7 reversibly there's an easy way involving hitting a 7 bit shift register, 7 times. In theory there's no reason to drain any capacitors per op or per the entire system, that same unit of charge just meanders down the line. Like a CCD sensor or old fashioned bubble memory kinda. The way chips are designed today that doesn't work, but insert a lot of kinda sorta in theory it could be done.

On the other hand, if you want to count to 7 using three full binary adders, you're going to run into scenarios where two bits enter an adder (say, 001+001) and one bit leaves (result of 010), so at least one capacitor's worth of charge got turned to heat. And that charging and discharging is basically wasting battery power for no good reason. And it's not a lot of power for one capacitor, but toggle a flip flop a couple billion times a second and that not-much adds up over time to a pretty hot CPU.
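The information-loss half of that can be sketched in a few lines of Python (my own illustration, using small integers rather than single bits so the collisions are visible): bare addition (a, b) -> a+b collapses distinct inputs, while the standard reversible form (a, b) -> (a, a+b) keeps one operand and stays a bijection.

```python
from itertools import product

# Irreversible addition: (a, b) -> a + b collapses distinct inputs,
# e.g. 0+2, 1+1 and 2+0 all give 2, so an adder must erase bits.
sums = {}
for a, b in product(range(4), repeat=2):
    sums.setdefault(a + b, []).append((a, b))
assert len(sums[2]) == 3  # (0,2), (1,1), (2,0) all collide

# Reversible variant: carry one operand through, (a, b) -> (a, a + b).
# Every output has exactly one preimage, since b = (a + b) - a.
outputs = {(a, a + b) for a, b in product(range(4), repeat=2)}
assert len(outputs) == 16  # a bijection on all 16 input pairs
```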


That last issue can be handled by encoding each logical bit as a pair of physical signals (like low/high for 0, high/low for 1). I don't think there's any reason you can't implement your reversible counter in binary.


Are there limitations for a reversible computer? Could this run Firefox someday?


It is only capable of reversible operations. A completely reversible computer could not interact with you.

This paper goes into detail about how you can irreversibly initialize such a computer, run it, and then irreversibly read its results, and how much power you can save doing that.

In practice, the reversible computers people could build were always slow and inefficient, nothing comparable to the computers we are used to (not even to the usual microcontrollers).


It's also worth noting that even low-power chip designers would laugh at worrying about nkT losses from throwing bits away. Our technology is very far away from the thermodynamic limit for irreversible computers.


Here https://arxiv.org/abs/1702.08715 is a link to the same article that might be a little easier to use.


what is meant by reversible?


Logically (and therefore thermodynamically) reversible. By Landauer's principle, the only reason a computer must emit heat (and therefore use energy) is to perform irreversible operations. Therefore, if you want a computer that (potentially) uses the minimum energy, it needs to be reversible in this sense.

An irreversible operation is anything that maps many (possible) states to the same state. Setting a bit to zero is irreversible (because it destroys the information about what bit was there previously). An AND gate is irreversible (because TF, FT, and FF all map to F). A NOT gate is reversible (because you can infer the input from the output).

There are universal reversible gates (Toffoli and Fredkin gates are examples, but there are many others), so you can (theoretically) build a (usable) computer whose logical operations are completely reversible.
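A small Python sketch (my own illustration) of the contrast, using the Fredkin (controlled-SWAP) gate as the reversible example:

```python
from itertools import product

def fredkin(c, a, b):
    """Fredkin (controlled-SWAP) gate: swaps a and b when c is 1."""
    return (c, b, a) if c else (c, a, b)

# Reversible: it is its own inverse, hence a bijection on 3-bit states.
for bits in product([0, 1], repeat=3):
    assert fredkin(*fredkin(*bits)) == bits

# Irreversible AND for contrast: three distinct inputs map to output 0,
# so seeing a 0 tells you nothing about which input produced it.
preimages_of_0 = [(a, b) for a, b in product([0, 1], repeat=2) if (a and b) == 0]
assert len(preimages_of_0) == 3  # (0,0), (0,1), (1,0) -- information lost
```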


Does this imply that physical entropy and informational entropy are the same?

Does it imply the computer would not give off heat? Or that a significant amount of the heat generated by a computer is due to information loss?


>Does this imply that physical entropy and informational entropy are the same?

It does, and there is a very strong (but not universally accepted) case that they are the same thing: https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_...

Great lay article about it: http://lesswrong.com/lw/o5/the_second_law_of_thermodynamics_...

(Side note: I've long thought you can derive Carnot efficiency based on information-theoretic arguments about the information contained in knowing only the temperature difference between a heat source and a sink.)

>Does it imply the computer would not give off heat? Or that a significant amount of the heat generated by a computer is due to information loss?

Basically, yes. You can't avoid the heat emission from setting a 0 to a 1 on that computer's storage medium, but you would avoid the loss from any of the intermediate computations that typically make CPUs so hot.


I'm not the person who was asking, but thanks for the insightful comments. Makes sense.

Wouldn't a reversible computer need to take up an exponentially large amount of physical space as it executes?

That might work for small problems but it doesn't seem like it would scale. Or rather, it would scale to fill the known universe.

Sorry I haven't really dug into the attached articles yet, but I will.


See the thread about the asymptotic penalties: https://news.ycombinator.com/item?id=14151390


> Does it imply the computer would not give off heat?

The implication goes the other way: if the computer does not give off heat, it must be reversible.


So, in a sense, heat is unrecoverable information.


In some sense that seems to be true.

But I just meant that while it's easy to make a reversible circuit, no tech we have today can make it as energy efficient as might be theoretically possible.


"Reversible" means that the physical processes which implement the computation (e.g. electricity flowing through wires, marbles running down tracks, etc.) can all be run backwards as well as forwards.

Think about pumping water up a hill: we have to spend energy to move it up; but it's reversible: by moving the water back down we receive energy (e.g. hydroelectric). This isn't completely reversible, since we lose a lot of energy to friction and turbulence that we can't get back. Some microscopic systems don't have such losses, e.g. gas molecules will bounce around without losing energy (gases don't spontaneously cool down). Could we use such microscopic systems to implement a computer?

From the computational side, the only logical operation which is irreversible is discarding information, e.g. overwriting some stored value with a different value. As long as we have the same amount of information out as we put in, we can run a computation in reverse: calculating the input, given the output.

Most computations take a big bunch of data as input, and produce some small output (e.g. averaging a single field from a list of records). This clearly throws away information. We can adjust this calculation to be reversible, by outputting the value we want (e.g. that average) as well as a bunch of extra "junk", which encodes all of the input data that we didn't use. Given a pair of an average and some "junk", we can compute the input which gave rise to that pair.

So why do this? If our computation is reversible, we can run our computer forwards, copy out the portion of the result we care about to somewhere else, then run the computer backwards to get it back to the state it started in. This way, we've performed our computation without using up any energy. There may be a slight loss outside the computer, if we store that result by overwriting some previous value in an irreversible way; but even this energy cost depends only on the size of the value we're copying out. We can perform an arbitrarily large amount of computation, e.g. a lengthy brute force search, and as long as the portion of the output we care about (i.e. not the "junk") is small, say a single integer, we can recover almost all of the energy used by that computation.
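A toy Python sketch of that compute / copy-out / uncompute pattern, using the averaging example above (the function names and "junk" encoding are my own illustration; here the junk is just the raw input, which is the simplest encoding that makes the pair invertible):

```python
def average_reversible(records):
    """Reversible-style average: return the answer plus 'junk' -- the
    extra information needed to run the computation backwards."""
    avg = sum(records) / len(records)
    junk = tuple(records)  # enough information to reconstruct the input
    return avg, junk

def uncompute(avg, junk):
    """Run the computation in reverse: recover the input, discard nothing."""
    records = list(junk)
    assert sum(records) / len(records) == avg  # consistency check
    return records

data = [3, 5, 10]
avg, junk = average_reversible(data)
result = avg                         # copy out only the small answer we care about
assert uncompute(avg, junk) == data  # machine is back in its initial state
print(result)  # 6.0
```

Only the copied-out `result` ever needs to be written irreversibly; everything else round-trips.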


Don't you need energy for the computer to remember where it has been?


I think the idea is that from the output state and the program used to generate it, you have the information to exactly reverse the process and get back to the original input.

In something like an AND gate, you can't do that. I output 0...what were the two inputs? {0,0}, {0,1}, or {1,0}? And electrically, in 3/4 of the cases, you've got an input being sent to ground, using the electricity when you discard the bit of information.



What are the limitations to a completely reversible computer? I'm guessing it's still Turing complete, but do you pay a cost in terms of asymptotic complexity? I'm guessing you can make a program reversible trivially by doing something like making everything immutable and only ever appending new data, but of course that's extremely inefficient in terms of memory. Of course, I might be missing the core concept here.


AIUI, it's just a linear penalty because, at worst, you have to run each computation backwards to "uncompute it".

That's under the (necessary) assumption that you don't even try to avoid erasure operations that are needed for their own sake, rather than as an intermediate step.


It is a bit more complicated than that. The only penalty to time is a constant factor if you are willing to use space proportional to the computation time. I did a bit of work on how to do time-space trade-offs in reversible circuits in my masters thesis: http://hdl.handle.net/10012/10949


So does this mean you can move the entropy cost around? For stack or 3D computing moving the heat cost to directly heatsinked surfaces would be useful.


Direct link to paper at arXiv, https://arxiv.org/pdf/1702.08715.pdf


Link to the abstract: https://arxiv.org/abs/1702.08715

Generally, it is better to link to the abstract than to the PDF. If the paper gets revised, the abstract page's PDF link will point to the latest version of the paper (with links available to prior revisions). The direct PDF link will remain stuck with the old version.


Because I find the abstracts of readable, clear, and interesting papers are often indistinguishable from those of unreadable and dull papers, I prefer the paper itself to its abstract, at least in terms of Hacker News; I am long since past reading scholarly papers for academic purposes. If someone wants to read the abstract, it's right there at the beginning of the paper anyway.


Abstracts are short enough that they can be inlined into a Hacker News comment directly:

> A critical analysis of the feasibility of reversible computing is performed. The key question is: Is it possible to build a completely reversible computer? A closer look into the internal aspects of the reversible computing as well as the external constraints such as the second law of thermodynamics has demonstrated that several difficulties would have to be solved before reversible computer is being built. It is shown that a conventional reversible computer would require energy for setting up the reversible inputs from irreversible signals, for the reading out of the reversible outputs, for the transport of the information between logic elements and finally for the control signals that will require more energy dissipating into the environment. A loose bound on the minimum amount of energy required to be dissipated during the physical implementation of a reversible computer is obtained and a generalization of the principles for reversible computing is provided.

It can save readers a click in deciding if they want to go for the whole paper or not. Think of the abstract as an upper bound on whether the paper is interesting or not. A paper could possibly be less interesting than its abstract, it is much less likely that it is more interesting than its abstract unless the author really messed up.


Personally, I usually click on links based on the title more often than based on the comments. But I know other people have different habits. With a direct link to the PDF, the abstract is at the top of the PDF for anyone who wants to read it.

For me personally, if I have the paper, I skip the abstract and read the introduction because it is more interesting and I don't feel a duty to bore myself reading the abstract. Here, I thought the title was interesting enough to click on the link and the paper interesting enough to provide a link to the less noisy original.


There's a small tradeoff in that not all readers might want to commit up-front to downloading a PDF. In the case of arXiv the abstract will be also updated with later versions of the paper if they appear.


With Firefox, the resource located at the arXiv URL of the PDF is smaller than the resource located at the arXiv URL for the abstract. So goes the modern web. Both are an order of magnitude smaller than the originally submitted link.

For me, the conventional UI of the PDF resource makes it easier to navigate than the UI of the page for its metadata. Admittedly, I don't have a use case that is dominated by revisions to preprint journal articles...I just want to check papers out.




