
The way I internalize it: public voting selects for layman plausibility, not correctness.

Because laymen massively outnumber experts, the layman vote always overwhelms the informed one, so the reaction of people who don’t know the subject is the only thing that matters. Truth only seems to matter because most subjects either can be somewhat intuited by non-experts, or are in a niche that you’re not, so “layman plausibility” means your reaction, too. But the true nature of the dialog reveals itself as soon as people talk about something you’re an expert on.

Answers like this aren’t a bug in a truth machine, they’re a plausibility machine working as designed.




> The way I internalize it: public voting selects for layman plausibility, not correctness.

To lend credence to this idea, I reflexively upvoted you despite not having read any experts on this voting phenomenon.


In that way, it’s a bit like an LLM choosing the most likely answer based on the mass of training material.


Humans are nearly all mimics, at least 98%+. They are LLMs. It's a survival optimization (energy spent copying the existing vs creating/innovating/distributing). It's only fitting that we'd create LLMs in the human mold.

LLMs are to human mimics what AGI will be to human creators/innovators (and then some of course).


> Humans are nearly all mimics, at least 98%+. They are LLMs.

We are GIs, at least 98%+. LLM-like behavior may exist in our cognitive repertoire, but we certainly aren't limited to it. Can an LLM drive locomotion?

I never understood AGI as generating sui generis ideas as a requirement. I thought that AGIs could also be uncreative mimics.


> Can an LLM drive locomotion?

Can't see any principled reason it couldn't, if it were a big enough, sufficiently trained one running on fast-enough hardware, and if you represent the sensor data in its token vocabulary and have the reverse mapping for control outputs.

Quite probably not the most efficient way to drive locomotion, though.
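To make the token-vocabulary idea concrete, here's a toy sketch (Python; everything in it is invented for illustration, not any real robotics stack): quantize normalized sensor readings into discrete token ids a sequence model could attend over, and map predicted token ids back into motor commands.

```python
N_BINS = 256           # vocabulary slots reserved for sensor/motor values (made-up size)
LOW, HIGH = -1.0, 1.0  # assumed range of normalized readings/commands

def encode(value: float) -> int:
    """Map a normalized reading in [LOW, HIGH] to a token id in [0, N_BINS - 1]."""
    clipped = max(LOW, min(HIGH, value))
    return int((clipped - LOW) / (HIGH - LOW) * (N_BINS - 1))

def decode(token: int) -> float:
    """Map a token id back to the center of its value bin."""
    return LOW + (token + 0.5) / N_BINS * (HIGH - LOW)

# A joint-angle reading becomes a token the model can attend over,
# and a predicted token becomes a motor command.
reading = 0.37
print(encode(reading))        # 174
print(round(decode(174), 3))  # 0.363
```

Real systems are far more involved, but that encode/decode boundary is roughly where an LLM-shaped model could plug in.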

> I never understood AGI as generating sui generis ideas as a requirement.

Creativity is among the applications of intelligence that I would deem included in the "G" in AGI; OTOH, like most proposed binary categories, it's probably more useful to view generality as a matter of degree than as a crisp binary attribute.


> Can an LLM drive locomotion?

If it's been trained to, but I'm not sure that's been the focus recently, though I think some research has been done into it. Turns out prediction engines with attention are useful for more than just predicting text; our bodies and brains work on learned assumptions and behaviours.

But certainly I imagine transformer + attention can = learning to walk/perform a task. An LLM specifically, no... because it's trained on language and not motion; it's all in the name. But even then, perhaps motion can be turned into language (non-English tokens, though) and an LLM still used; I know people are working on funky stuff like that as well.


It's basically the question of Turing machines and universal computation. As a lay person I just wouldn't expect an LLM to be good at forms of intelligence that work on something other than sequential symbols.


And, it would seem, that training material is mostly wrong...


And just think, its training material is all this upvoted - and then believed and repeated - BS.


As we know in the age of the internet, truth doesn't matter, only popularity does.


The internet has taught me how many brilliant people there are out there. And how massively outnumbered they are by the rest of us!


there's another reason for some optimism about a voting-truth connection: wisdom of the crowds. As long as there isn't a strong bias to people's estimate, the average will converge on the truth.


> there's another reason for some optimism about a voting-truth connection: wisdom of the crowds. As long as there isn't a strong bias to people's estimate, the average will converge on the truth.

Hmmm ... that doesn't seem to match what actually happens. After false beliefs held humanity back for its entire history, science came along and produced actual, working truth. And science is the opposite of what you say: The crowds don't matter, only the facts. Newton was not a crowd, and the crowds didn't produce anything remotely as true and valuable for all those years. The crowds persecuted Galileo (and many others).

"In matters of science, the authority of thousands is not worth the humble reasoning of one single person." - attributed to Galileo

As someone pointed out, I think here on HN, the intuition of the crowds sucks. If it was any good, we'd have had the right physics in 5,000 BCE, not starting in the 17th century.


I thought Newton was a mathematician, not a scientist.

> the intuition of the crowds sucks. If it was any good, we'd have had the right physics in 5,000 BCE not starting in the 17th century.

Eh. People used to stay in their lane. Only these days can you get a city person voting on proper farming techniques.


Newton was a mathematician and arguably the most important scientist in history. I recommend his biography - it's amazing reading.


I'm the kind of person who is completely uninterested in biographies.


Fine, but then why talk about Newton if you are aware you know nothing about them? Talk about what you know.


It's not as if I haven't been exposed to his laws of motion in physics courses. I just think of them as more math (or heck, even philosophy) than science.


I'm always interested in unique perspectives, but at the same time, English has a meaning outside any individual's concept of it.

> Newton was ... not a scientist.

That has a meaning, and it's false. Whatever you personally think of it, Newton was a scientist. I don't love a wild goose chase.


I guess so. It's hard for me to think of anyone prior to about the mid 1800s as a scientist, but sure, he qualifies by the standards of the day.

I still don't understand why people view Linnaeus' classification as scientific though. I guess maybe because it functioned as a hypothesis of common descent later on?


> I thought Newton was a mathematician, not a scientist.

Newton was a mathematician, scientist, alchemist, theologian (though, by the view of most Christians at the time and now, quite a heterodox one), and high government official who conducted undercover investigations personally. People can sometimes do more than one thing, and Newton did...a lot.


> As long as there isn't a strong bias to people's estimate, the average will converge on the truth.

Yes, as long as the truth is the most significant systematic influence on beliefs, any reasonable method of aggregating belief will converge on the truth with sufficient numbers.

Unfortunately, the required condition for convergence on the truth often doesn't hold, and there is no way of reliably telling when it does other than determining the truth independently and checking whether belief converges on it.

Significant effects of cognitive/perceptual biases on beliefs about facts, especially where the fact is not something easily observable like “is it raining at this instant where you are standing,” are not rare, and these biases often align for similarly situated individuals.
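A quick toy simulation of that condition (Python; the numbers are made up for illustration): averaging many noisy but independent, unbiased guesses lands near the true value, while a bias shared across the crowd survives averaging no matter how many voters you add.

```python
import random

TRUTH = 100.0
CROWD = 100_000

# Independent noise around the truth vs. the same noise plus a shared +15 bias.
unbiased = [TRUTH + random.gauss(0, 20) for _ in range(CROWD)]
biased = [TRUTH + 15 + random.gauss(0, 20) for _ in range(CROWD)]

print(sum(unbiased) / CROWD)  # ~100: independent errors cancel out
print(sum(biased) / CROWD)    # ~115: the systematic bias does not
```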


I am quite unsure as to the veracity of the claim that "the average will converge [upon] the truth". I recall cases being made (as asides) for the opposite conclusion. Intuitively even, this idea of equating truth with convergence towards the average opinion appears contradictory, counterfactual, and ahistorical. Excuse my being brash, but a "wisdom of crowds" seems to me oxymoronic on its face. I'd love to be persuaded otherwise, though, mainly due to my perception of a lack of credence towards your view. Perhaps I have misunderstood your qualifier: "As long as there isn't a strong truth bias to people's estimate . . . "?

Off the top of my head, I can't imagine any scenario in which a mixed population of laypeople and academics/experts would converge towards the same (vote average) findings as a sample of a handful of experts/academics. For example, would The Average converge towards correct mathematics or physics answers? Besides trivial, non-technical questions that do not require complex analysis, I think not. (See: False Memory: Mandela Effect. [0])

My point is that groups' thinking is liable to be compromised. (After all, what has been more important to a human — evolutionarily: the truth or social access?) Also see: Information Cascade. [1]

Post-scriptum: My position is that if averages for answers to questions were taken from the 'crowd' of the whole Earth, then these would diverge significantly and routinely from The Truth. If there are cases in which you feel this not to be the case, I would inquisitively consider such scenarios, waveBidder.

[0]: https://en.m.wikipedia.org/wiki/False_memory#Mandela_effect

[1]: https://en.m.wikipedia.org/wiki/Information_cascade


> I can't imagine any scenario in which a mixed population of laypeople and academics/experts would converge towards the same (vote average) findings as a sample of a handful of experts/academics.

Then you get crap where the experts, even when they agree, "dumb it down" for the crowds. This leads the masses who actually do pay attention to experts to think the wrong ideas are truth.

> After all, what has been more important to a human — evolutionarily: the truth or social access?

I don't think this is required for people to be very wrong. Caring about the truth can easily lead to assuming other people who speak authoritatively know what they're talking about, or to speaking authoritatively yourself when you think you're right.


As a peer comment mentioned, the wisdom of the crowds only functions when people operate independently. When people collaborate, our answers turn to junk again. And any sort of voting system is an inherent collaboration because you are basically seeing what's 'trending' by definition, so it destroys any sort of wisdom of the masses.

The only way you might have it work is if random people were shown random posts from random topics and asked to vote on them, and the ranking was based upon that feedback. There are problems there as well, but probably far fewer than in the current system.
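A minimal sketch of that randomized-exposure idea (Python; the data structures and numbers are invented for illustration): each voter is shown a uniformly random post, and the ranking uses the resulting upvote rate rather than raw vote totals, so early visibility can't snowball.

```python
import random

posts = {"A": [], "B": [], "C": []}  # post id -> list of 0/1 votes

def record_vote(judge):
    """Show a uniformly random post and record the voter's up/down vote."""
    post = random.choice(list(posts))
    posts[post].append(judge(post))

def ranking():
    """Rank by observed upvote rate, with Laplace smoothing for barely-seen posts."""
    rate = lambda votes: (sum(votes) + 1) / (len(votes) + 2)
    return sorted(posts, key=lambda p: rate(posts[p]), reverse=True)

# Made-up voter model: post "B" is genuinely better, so more voters upvote it.
quality = {"A": 0.3, "B": 0.7, "C": 0.5}
for _ in range(10_000):
    record_vote(lambda p: int(random.random() < quality[p]))

print(ranking())  # most likely ['B', 'C', 'A']
```

Because exposure is random rather than popularity-driven, the vote rate estimates each post's quality independently of how it happens to be trending.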


> And any sort of voting system is an inherent collaboration because you are basically seeing what's 'trending' by definition

Massively aggravated by "sorting by top" defaults for both original posts and separately for the comments on those posts.


Unfortunately not, because wisdom of the crowds requires not only a lack of bias but also independence, which, let's face it, is usually impossible to achieve.


That only works when people bet that their guess is correct.


Wisdom of the crowds is obviously dog shit.



