Probability Chip (technologyreview.com)
87 points by eli_s on Aug 18, 2010 | 25 comments



The thesis on which this is based:

http://phm.cba.mit.edu/theses/03.07.vigoda.pdf

edit: p 135 is where he starts talking about implementation in silicon


Here's a nice presentation from him that has diagrams of how the basic logic circuits are built up using this approach:

http://cba.mit.edu/presentations/03.09.Vigoda.ppt
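For example (my own sketch, not from the slides), the "soft" gate idea is just arithmetic on bit probabilities instead of hard 0/1 values; assuming independent inputs:

    # Soft gates operating on P(bit = 1) rather than hard bits.
    # Illustration only, assuming independent inputs; not Lyric's circuit.

    def soft_and(p_a, p_b):
        # P(a AND b = 1)
        return p_a * p_b

    def soft_xor(p_a, p_b):
        # P(a XOR b = 1): exactly one of the two inputs is 1
        return p_a * (1 - p_b) + (1 - p_a) * p_b

    print(soft_xor(0.9, 0.8))  # ~0.26 -- the output is itself a probability

As I understand it, the chip's contribution is doing this kind of arithmetic directly in the analog circuitry rather than in floating point.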


That is quite the thesis.


I'm curious how they deal with probabilities very close to 1 or 0. Usually when people are doing Bayesian things with probabilities they work in logistic (log-odds) space, so that the precision of values close to 1 or 0 is effectively unbounded. That seems like a hard thing to do with an analog circuit.
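For concreteness, the log-odds trick I mean looks like this (just a sketch):

    import math

    def logit(p):
        # Map a probability to log-odds: values near 0 or 1 spread out over
        # a huge range instead of crowding against the endpoints.
        return math.log(p / (1 - p))

    def inv_logit(x):
        return 1 / (1 + math.exp(-x))

    print(logit(0.999999))   # ~13.8
    print(logit(0.5))        # 0.0
    print(inv_logit(13.8))   # ~0.999999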


The founder's thesis mentions that they use a linearizer in their analog circuit, but all that does is give the same precision over the entire logical value range from 0 to 1 (by that I mean the same amount of voltage swing equals the same amount of "logic value change" anywhere in the range).

I suppose they could use a "non-linearizer" to put more of the precision near 0 and 1, but it would come at the expense of precision in the middle. The less voltage swing is involved, the more susceptible you are to noise from various sources.
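Rough numbers to make the trade-off concrete (my own assumption that the analog swing resolves into about 256 noise-limited levels):

    import math

    LEVELS = 256  # assumed noise-limited resolution of the analog swing

    # Linear mapping: each level is a fixed probability step, so the
    # smallest nonzero probability you can represent is about 1/LEVELS.
    print(1.0 / LEVELS)                            # ~0.0039

    # A logit-style "non-linearizer" spanning log-odds of [-8, 8]:
    # the extreme levels reach much smaller probabilities...
    print(1 / (1 + math.exp(8)))                   # ~0.00034
    # ...but the step around p = 0.5 gets coarser than the linear one.
    print(1 / (1 + math.exp(-16 / LEVELS)) - 0.5)  # ~0.016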


Are we likely to see more domain-specific chips in the future? Something like what http://www.deshawresearch.com/ has created: Anton, a custom chip optimised for molecular dynamics simulations.


I think it is likely we will see more of these. But this probability chip sounds more like an analog computer than what D. E. Shaw has done.

The Lyric web site says that they "model relationships between probabilities natively in the device physics", whereas D. E. Shaw's Anton chip sounds like it uses traditional logic gates the same way a GPU does.

P.S. Sorry, I downvoted you by accident -- I meant to upvote you.


Domain-specific chips are a cyclical trend. They come and go; at some times they have advantages, and at others they don't. (Remember Lisp machines? Good initially, but vastly outperformed by the end of their lifespan.) See for example the classic 'wheel of reincarnation' paper on graphics: http://cva.stanford.edu/classes/cs99s/papers/myer-sutherland...

The fundamental problem, as I see it, is that any domain-specific chip will receive only a tiny fraction of the R&D, economies of scale, and amortization that a general-purpose one will, so its advantage is only temporary. As long as Moore's law is operating, this will be true.


To quote the thesis on probabilistic chips:

> In practice replacing digital computers with an alternative computing paradigm is a risky proposition. Alternative computing architectures, such as parallel digital computers have not tended to be commercially viable, because Moore’s Law has consistently enabled conventional von Neumann architectures to render alternatives unnecessary. Besides Moore’s Law, digital computing also benefits from mature tools and expertise for optimizing performance at all levels of the system: process technology, fundamental circuits, layout and algorithms. Many engineers are simultaneously working to improve every aspect of digital technology, while alternative technologies like analog computing do not have the same kind of industry juggernaut pushing them forward.


You're already seeing domain-specific chips, but they're in the form of an FPGA rather than an ASIC. If it can be implemented with traditional gates, an FPGA is the way to go for low to medium volume.

While Lyric may incorporate classic gates in their design, it also sounds like the heart of their technology uses something different from classic gates.



There's a single-quote missing in your link after Hitchhiker and before the letter s: http://en.wikipedia.org/wiki/Technology_in_The_Hitchhiker + ' + s_Guide_to_the_Galaxy (looks like HN comment filters it out)


Thanks!


My Ph.D. advisor will go crazy; he had his European research project on a probability computer turned down a few months ago.


Wouldn't starting just six months ago be a bit late in any case?


Isn't this just the revenge of the analog computer?

Not saying it's a bad idea... I'm really for the idea of revisiting assumptions in computer design.


Sure, if you can represent your problem using probabilities :)

That said, I'm more excited about the use of Lyric's technology in ECC memory. I'm skimming through Vigoda's thesis, and it seems that another very interesting application ought to be making even lower-power mobile backend chips.


Sounds a lot like the ByNase protocol that Ward Cunningham (inventor of the wiki) came up with:

http://c2.com/cybords/wiki.cgi?BynaseProtocol


I thought I'd heard something like this before. From 2004: http://www.eetasia.com/ART_8800354714_499488_NT_92255b4a.HTM

That's a turbo decoder rather than a generic probability calculator, but it's doing probability calculations in the analog domain.
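
For anyone who hasn't seen soft decoding, the flavor of the probability arithmetic is roughly this (a toy digital sketch of a 3x repetition code, not what the analog chip does internally):

    import math

    def llr(p_one):
        # Log-likelihood ratio of "bit is 1" from one noisy observation.
        return math.log(p_one / (1 - p_one))

    observations = [0.8, 0.6, 0.3]                 # per-copy estimates of P(bit = 1)
    combined = sum(llr(p) for p in observations)   # independent observations add
    print(1 / (1 + math.exp(-combined)))           # ~0.72, the combined belief
    print(1 if combined > 0 else 0)                # hard decision: 1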

This sort of thing may make sense for error correction, but I don't think people will run general probability calculations on it. Too difficult to debug :-)

I do wonder, though, whether they can simulate a neuron more efficiently than digital logic can.


Printer friendly, (almost) no ads, no pointless images:

http://www.technologyreview.com/printer_friendly_article.asp...


Looks like they check the referrer and redirect you to the original article :(


Sneaky swine ...


Similar to the fuzzy-logic chips of the '90s?


How does this compare to what Navia Systems is working on?


But how do you connect it to the cup of no tea?



