Is There Anything Beyond Quantum Computing? (pbs.org)
125 points by ca98am79 on April 11, 2014 | 41 comments



Hah.

So Roger Penrose, not content with the now-seemingly-attainable quantum computing, speculates that there is an even more magical quantum gravity computing, which brains just happen to use, which makes them special, and which Turing machines can't simulate?

The man just really hates the idea of AI, doesn't he?


That's nothing new for Penrose; he wrote about it in The Emperor's New Mind. His whole argument is based on noncomputability, and he points out in the book that ordinary quantum computers can't do the noncomputable.

And of course if he were right and we did figure out the physics of it, there'd be nothing to prevent us from building hardware that uses the same physics.


But the more you accelerate the spaceship, the more energy you need, with the energy diverging to infinity as your speed approaches that of light. At some point, your spaceship will become so energetic that it, too, will collapse into a black hole.

That is simply wrong. It is called relativity for a reason. An object might be traveling arbitrarily fast, but in its reference frame it is not moving.

That pretty much invalidates that part of the article.

For curious readers: http://physics.stackexchange.com/questions/3436/if-a-1kg-mas...


>Dear Readers: I apologize; there's one point in this article that's not quite right and ought to be clarified. Namely, when I said that a spaceship traveling exponentially close to the speed of light would collapse to a black hole: that's only true because, when the spaceship inevitably collided with some interstellar particle, the energy in the center-of-momentum rest frame would be exponential. A spaceship in a theoretical vacuum could get arbitrarily close to the speed of light without collapsing.

>However, it's important to understand that this doesn't change the computational situation in any important way. It's still true that, to accelerate exponentially close to the speed of light, you need an exponential amount of energy! And therefore, it will take you exponential time to accelerate to such a speed---unless your fuel tank (or whatever else is providing your energy) is exponentially concentrated, in which case it will exceed the Schwarzschild limit and indeed collapse to a black hole.
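To put rough numbers on that (a back-of-the-envelope sketch of my own, not from the article; the 1000 kg ship mass is an arbitrary assumption): the relativistic kinetic energy (gamma - 1) m c^2 blows up as v approaches c, so pushing the speed to (1 - 2^-n) c costs energy growing roughly like 2^(n/2).

    import math

    C = 299_792_458.0       # speed of light, m/s
    SHIP_MASS_KG = 1000.0   # hypothetical ship mass (illustrative assumption)

    def kinetic_energy_joules(beta, mass_kg=SHIP_MASS_KG):
        """Relativistic kinetic energy (gamma - 1) * m * c^2 at speed v = beta * c."""
        gamma = 1.0 / math.sqrt(1.0 - beta * beta)
        return (gamma - 1.0) * mass_kg * C * C

    # Push the ship within 2^-n of c and watch the required energy take off:
    for n in range(10, 60, 10):
        beta = 1.0 - 2.0 ** -n
        print(f"v = (1 - 2^-{n}) c  ->  KE ~ {kinetic_energy_joules(beta):.2e} J")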


That is only true if it is assumed that the rocket will not gather additional energy on the trip.

Also, the premise that the energy has to be concentrated in a certain area for the rocket to work is wrong. You can spread as much energy over as large a region as you want without reaching the Schwarzschild limit and collapsing into a black hole. The rocket can be of any size and still work.

Even so, a black hole is the perfect source of energy: it converts mass into energy without loss via Hawking radiation, and the mass is collected while moving through the interstellar medium. Such a micro black hole (it has to be small to radiate fast enough) would still weigh a lot, and moving it would be difficult, but not impossible.


Only the black hole part. As speed increases, relativistic mass increases too, and mass is energy. (Which is also why we can't have anywhere near light-speed space travel: there's not enough energy anywhere to keep accelerating the increasing mass.)


An object can accelerate arbitrarily, but in any reference frame it will never be traveling faster than c.

Of course, it won't collapse into a black hole, but the amount of energy needed to accelerate further will grow without bound as its speed approaches c.


That is actually false from the frame of the traveler. Under constant proper acceleration, the traveler will observe constant energy usage, and length contraction guarantees that you can reach speeds arbitrarily close to the speed of light (and also any location in the observable universe) in roughly 50 years of ship time at 1 g acceleration.

http://en.wikipedia.org/wiki/Space_travel_using_constant_acc...
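For concreteness, here is a quick sanity check of the ~50 year figure (my own sketch, using the standard constant-proper-acceleration formula; distances are rough, cosmic expansion is ignored, and the ship accelerates for the first half of the trip and decelerates for the second half):

    import math

    C = 299_792_458.0    # speed of light, m/s
    G = 9.81             # 1 g, m/s^2
    LY = 9.4607e15       # one light-year in metres

    def ship_years(distance_ly, accel=G):
        """Proper (ship) time in years to cover distance_ly, accelerating then decelerating at accel."""
        d = distance_ly * LY
        tau = 2.0 * (C / accel) * math.acosh(1.0 + accel * (d / 2.0) / C ** 2)
        return tau / (3600 * 24 * 365.25)

    print(f"Alpha Centauri (~4.4 ly):          {ship_years(4.4):.1f} ship years")
    print(f"Andromeda (~2.5 million ly):       {ship_years(2.5e6):.1f} ship years")
    print(f"Observable universe (~4.6e10 ly):  {ship_years(4.6e10):.1f} ship years")

The last line comes out to just under 50 ship years, which is where the ~50 year figure comes from; the energy cost discussed above is, of course, a separate matter.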


Yes, but when do you get there? In 50 years in your time, but it may be a few billion years later in their time...


I believe that is the point. Even more time for the stationary computer to finish the calculation.


Note that Scott has made a correction in the comments.


Also, I can't see how, theoretically, your ship turning into a black hole would prevent the answer to your computation from being received.


The most important application of quantum computers actually seems to be enabling scientists who do fundamental research to answer questions from reporters such as "What practical purpose does your research serve?" with a single catch-all buzzword. So the answer to the title of this article is clearly: Yes.


> What’s more, there are recent developments in quantum gravity that seem to support the opposite conclusion: that is, they hint that a standard quantum computer could efficiently simulate even quantum-gravitational processes, like the formation and evaporation of black holes. Most notably, the AdS/CFT correspondence, which emerged from string theory, posits a “duality” between two extremely different-looking kinds of theories. On one side of the duality is AdS (Anti de Sitter): a theory of quantum gravity for a hypothetical universe that has a negative cosmological constant, effectively causing the whole universe to be surrounded by a reflecting boundary. On the other side is a CFT (Conformal Field Theory): an “ordinary” quantum field theory, without gravity, that lives only on the boundary of the AdS space. The AdS/CFT correspondence, for which there’s now overwhelming evidence (though not yet a proof), says that any question about what happens in the AdS space can be translated into an “equivalent” question about the CFT, and vice versa.

If you would like to read more about this, the author of this article has another blog post [0] that discusses the Susskind paper "Computational Complexity and Black Hole Horizons" [1] in its first half.

The key point, for those who don't have time to read the post:

> On one side of the ring is AdS (Anti de Sitter), a quantum-gravitational theory in D spacetime dimensions—one where black holes can form and evaporate, etc., but on the other hand, the entire universe is surrounded by a reflecting boundary a finite distance away, to help keep everything nice and unitary. On the other side is CFT (Conformal Field Theory): an “ordinary” quantum field theory, with no gravity, that lives only on the (D-1)-dimensional “boundary” of the AdS space, and not in its interior “bulk.” The claim of AdS/CFT is that despite how different they look, these two theories are “equivalent,” in the sense that any calculation in one theory can be transformed to a calculation in the other theory that yields the same answer. Moreover, we get mileage this way, since a calculation that’s hard on the AdS side is often easy on the CFT side and vice versa.

[0] http://www.scottaaronson.com/blog/?p=1697

[1] http://arxiv.org/abs/1402.5674


If you liked this, check out the interview "Scott Aaronson on Philosophical Progress" [0]. Possibly my favorite read of last year.

[0] http://intelligence.org/2013/12/13/aaronson/


The actual article is here: http://www.pbs.org/wgbh/nova/blogs/physics/2014/04/is-there-... but Scott's blog has the comment section.


I changed the article url. For those who want the comments, the original was http://www.scottaaronson.com/blog/?p=1781.


http://www.scottaaronson.com/blog/?p=1781

Seems the mods or poster updated the link to point to the PBS site; this is Scott's blog, if you want to read the comments diziet mentions.



That's just a different architecture for a classical, non-quantum computer. (Keep in mind that analog computers came first, so this is a call back to the first half of the 20th century, albeit on a much larger scale.)


One thing we don't know is whether the brain might be able to perform calculations on real numbers, which could allow for different or even more powerful computation than a Turing machine.


No.

The line of thinking is, "Well, if we had real numbers, we could compute uncomputable functions." No shit. Most real numbers are not computable. If you start with something not computable, you end up with something not computable.

The big problem is that there is no way you will get more than several digits of precision on anything. The next problem is that there's zero evidence that anything in the universe is capable of encoding an arbitrary real number, even within a given bound. The last problem is that making a "real-number" machine capable of solving a problem that a Turing machine can't basically requires knowing the answer ahead of time and encoding it into the real-number machine as a constant. Unclear how you'd do that.


How I understood it, and please correct me if I am wrong, is that a machine that can calculate numbers with infinite precision in constant time certainly has a time benefit, but not necessarily a broader class of computable functions compared to a Turing machine. I interpreted z3phyr's reference to Wikipedia as a hint at that idea. But I agree that the most convincing arguments are against the existence of such a machine.

(By the way, writing "No.\n\n" can come across as a rude response. I felt a little bit uncomfortable reading it.)


Oops, sorry for this comment. ("don't drink and internet")


It is extremely unlikely due to noise. It's the same reason we use digital computers right now... for all the "waste" of converting voltages into 1s and 0s, it pays back in spades in that we get reliable computing. You hit the noise floor of an analog system long before you're into any sort of revolutionary-computation territory. It's not much use for something to be theoretically capable of an amazing computation if it can't receive input to that degree of precision, maintain that degree of accuracy during the computation, or output the result to that degree of accuracy.


> If so, then much like in Zeno’s paradox, your computer would have completed infinitely many steps in a mere two seconds!

The next target isn't infinitely many. The target is infinitely infinitely many.

Like something along the lines of forking an infinite number of parallel universes (that is pretty much the "many-worlds" interpretation of quantum superposition, and thus of quantum computing), enhanced with forking of infinitely many time dimensions inside each of the said universes...


Infinitely infinitely many is still infinitely many. N x N has the same cardinality as N. Now if you meant N ^ N then that's actually different. "x" means set product and "^" means function space.
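(A concrete way to see the N x N claim, as an illustrative sketch of my own rather than anything from the thread: the Cantor pairing function is a bijection between pairs of naturals and single naturals.)

    def cantor_pair(i, j):
        """Encode the pair (i, j) of naturals as a single natural, bijectively."""
        return (i + j) * (i + j + 1) // 2 + j

    def cantor_unpair(k):
        """Inverse of cantor_pair: recover (i, j) from k."""
        w = int(((8 * k + 1) ** 0.5 - 1) // 2)   # index of the diagonal containing k
        j = k - w * (w + 1) // 2
        return w - j, j

    # Every pair gets a distinct code, and every code decodes back to its pair:
    assert all(cantor_unpair(cantor_pair(i, j)) == (i, j)
               for i in range(100) for j in range(100))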


This isn't about cardinality, though; it's about the number of steps, which is more properly measured with ordinal numbers. I'm not familiar with the theory of infinite-time Turing machines, but it's at least plausible that there are things you can do with omega^2 steps (or larger countable ordinals) that you couldn't do with only omega steps.


omega == omega^2. They are the same quantity. For every element of omega^2, there is a corresponding unique element of omega.


Actually, no. I just recently read about this.

1 + omega == omega < omega + 1 < omega^2

http://en.wikipedia.org/wiki/Ordinal_number
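For anyone puzzled by that chain, a quick illustrative note (mine, not from the linked article): ordinals compare by order type, not by how many elements there are.

    % Why 1 + \omega = \omega < \omega + 1 < \omega^2 (order types, not sizes):
    % prepending one element to 0 < 1 < 2 < \cdots leaves the order type unchanged,
    % so 1 + \omega = \omega.
    % Appending one element gives 0 < 1 < 2 < \cdots < \star, which has a largest
    % element that \omega lacks, so \omega < \omega + 1.
    % \omega^2 is \omega copies of \omega laid end to end, longer than \omega
    % followed by any finite tail, so \omega + 1 < \omega^2.
    1 + \omega = \omega \;<\; \omega + 1 \;<\; \omega^2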


Weird! Thanks for the link!


Actually, you are both right: omega and omega + 1 (or omega^2) have the same 'number' of elements (there is a one-to-one correspondence), but the correspondence will not be (cannot be) order-preserving. Ordinals are a natural extension of the natural numbers. If you take the natural numbers and add a new element that's bigger than all of them, you get a new structure (called omega + 1) which has the same size as the ordinary natural numbers, but has one new element which is order-distinguishable from all our normal numbers. Some ordinals, which correspond to a jump in 'size' (like omega_1, the first uncountable ordinal), are special and called cardinal numbers.


>N x N has the same cardinality as N

Only until N is aleph-one, if I remember correctly, or do you think that's not the case?

http://en.wikipedia.org/wiki/Cardinal_number


By N, dkarapetyan presumably means the set of natural numbers, or aleph_0. So it isn't aleph_1. That said, it is also true that aleph_1 x aleph_1 = aleph_1. More generally, any aleph_alpha x aleph_alpha = aleph_alpha, for any ordinal alpha; and if we assume the axiom of choice, then any infinite cardinal A can be written as some aleph_alpha, and hence satisfies A x A = A.

However, ultimately none of this is relevant; number of steps is properly measured with ordinals, not cardinals.


Yes. Super Duper Computing.


I laughed.


Scott is making it really difficult for people to follow along. My initial reaction was "wtf am I looking at" followed shortly by "why wasn't the HN link pointed straight to the PBS article?". And clearly others agree given that the top comment at the moment is a PBS link.

No doubt the guy is competent but this just smells like a poorly executed attempt to move traffic to the blog. I fully expect it to get ignored as such despite the PBS article being good.


If you're not familiar, Scott is a professor at MIT, specializing in quantum computing. Many readers of his blog are also well known in this sphere, so they would have more technical insight, more in-depth than the PBS layman's version.


>>If you're not familiar, Scott is a professor at MIT

As I said, I don't doubt his ability. It's the presentation I object to. The HN link leads to some random blog post that barely touches on the topic the headline promises, and the core of the story is in a link contained in that post. By any sane definition that is blog spam, MIT professor or not.

>>Many readers of his blog are also well known

So? It's linked to his blog, not to his readers. I don't care if his readers include Albert Einstein himself. The link did not deliver, despite the author's (proven) ability to deliver on the content.


If you're a writer and you have an article available on some 3rd party site, wouldn't you inform your blog readers of it? Besides, the poster of this link is not Scott Aaronson, so why accuse him of attempting "to move traffic to the blog"?

EDIT: And a blog without advertisements at that.


>so why accuse him of attempting "to move traffic to the blog"?

Remember what I said... one of the first responses (3rd at worst) was someone posting the "proper" link to the actual article. That was there before I posted: someone else read the blog, extracted the relevant info, and concluded that the single link is the core content of the blog post. Then I put the same thing into words and got downvoted to hell. Oh well.

OP has since altered the link to point at the PBS article.

As I said before, I don't doubt the technical ability or integrity of Mr. Aaronson. I was commenting purely on the HN submission because I felt both the submission and the blog post were subpar and did an injustice to the actual content (the PBS post). Downvote away...



