> it works because we as a society have a visceral negative reaction to some labels.
Do you know why we have that reaction? Because of millions upon millions of dead, innocent humans. That is what those ideologies lead to. We learned this lesson once, and we learned it very well. We don't want that to happen again. We don't want to let those ideas spread again. We don't want to see the mass graves they lead to again. We learned that.
I'm saying it's another authoritarian impulse to squash dissent, yes. Smaller magnitude, sure. But that's exactly why you compare things -- to see what's better or worse.
You could potentially have as many displays as you want with you at all times, at whatever size you prefer, and use them wherever you like: desk, couch, lying in bed... VR/AR headsets can be a more convenient and comfortable replacement for traditional displays and desk setups.
Yeah, but all the things mentioned require my full attention for longer spans of time. I don't need separate displays for them; I need one single good display.
I personally would love VR/AR glasses that I could wear all the time and do everything on: phone, CLI, PowerPoint, web browser, unlimited monitor space, Netflix, Xbox, etc.
I know the pixel density isn’t there yet but I don’t want different screens. The AR/VR would be the “single good display” you’re looking for. I want glasses I can use for literally everything I need a screen for.
For the reality-clingers, this is also the perfect set-up. No more closing your laptop and having your TV staring back at you. Your screen is your screen is your screen, and you can turn it off and not have to deal with any of them.
Current headsets' resolution is already surprisingly good with some dev tricks. We use something called Compositor Layers at https://supermedium.com/ to render comic books in VR that look sharp and vivid. Not “retina” resolution yet, but improving very quickly; give it a couple of headset generations. We already have much, much higher-resolution and higher-density panels, but we need to work within other constraints: small form factor, the limited compute of mobile SoCs, weight, thermals, and battery life.
It's still unclear to me what the benefit is, though. The drawbacks are obvious: I have to wear a headset, and I lose the physical interfacing, both in terms of input interfaces and in terms of being able to, say, just put an e-reader down to stop reading it. What do I gain, after giving up all that?
Best is to try. If you have an Oculus Quest handy you can give the Supermedium demo a try; your feedback would be super appreciated.

With a headset you get better ergonomics: resize and position a display at will, and you don't have to hold anything. For comic books in particular, mural-scale pages give you an appreciation of the art that isn't possible on a traditional display. Note that physical input is still available via controllers: similar to a TV remote or gamepad, but also tracked in space, enabling more subtle interactions than “traditional” input. Tracked physical keyboards will eventually be available too.

I agree that wearing / removing a headset is additional friction. We consider it a feature, not a bug. It's hard to focus these days with so many distractions; once you put a headset on, you're committed to the task and it removes all the noise. We see VR as a tool for focus, like shutting the door of a super fancy and private office.
Regulators were definitely not asleep at the wheel. They were apparently actively defending them, for instance by attempting to sue the Financial Times for reporting on the irregularities.
Using a kafkatrap against an opponent you can't beat in debate, when they have just pointed out the tactic, is probably ill-advised; perhaps try something else: ad hominem or motte-and-bailey, for example.
"Kafkatrap" is a meaningless term, beyond "stop calling me a racist just for saying racist things".
Acting like it's an accepted logical fallacy is ridiculous. It's a term ESR made up because people kept rightly calling him a sexist and racist and he didn't like it and threw a tantrum.
Well, let's see... oh, that's odd, that meaningless term appears to have a real meaning: https://debate.fandom.com/wiki/Kafka_Trap . Now why would you be willing to lie about that?
Seems it's a perfectly accepted logical fallacy, and the only people who deny it are the SJW crowd, largely because it is such a favoured tactic within their ranks.
Yes, those are dictionaries, for definitions of words, not repositories of debate tactics; if you'd checked, you'd also notice there's no entry for "motte and bailey fallacy", "appeal to ignorance", or "appeal to authority"; funnily enough, that doesn't prevent those from existing either.
Well, yes, if you limit yourself purely to a single liberal-arts college's list of definitions then you won't; search engines are your friend, however.
https://en.wikipedia.org/wiki/List_of_fallacies
And desperately clinging to any page of whatever website you can find to display it isn't exactly the most secure debating position.
> Using a kafkatrap against an opponent you can't beat in debate, when they have just pointed out the tactic, is probably ill-advised; perhaps try something else: ad hominem or motte-and-bailey, for example.
It describes a fallacy, and I have no idea who coined it. First learned of it on HN, actually.
You don't know who coined "coined", but they may well have been a racist. Are you going to stop using it if so? Does that mean it's no longer useful for communication? Are you going to investigate every word on the chance it might've been and strike those from the lexicon?
It is, in fact, not useful for communication, because it does not honestly communicate anything. It exists only to undermine people who try to call you out on making bigoted remarks. It was coined by ESR, and is popular mainly with people with a strong affinity for bigotry, like him, and also libertarians.
The way Swift does them is the least awkward and most robust I have seen in any language.
> As an example, the colon and the whitespace are sufficient to separate parameters. So is the comma. So we have...both?
Designing a language around the minimum requirements of a parser will not get you a good language. Human language is massively redundant because humans find it easier to communicate when there is redundancy. Computer code is also communication aimed at humans; a bit of redundancy does not hurt there, and can often help.
> There is also the issue that, please correct me if I got this wrong, the label that's in the signature becomes the name of the parameter inside the function.
This is incorrect. You can optionally specify both an internal and external name for any argument. Unlabelled arguments are just a special case of this syntax, where you say "external name does not exist, internal name is this".
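A minimal sketch of how Swift spells this (the function and parameter names here are just illustrative):

    // External label "to", internal name "recipient": callers write the
    // label, while the function body uses the internal name.
    func send(to recipient: String, message body: String) {
        print("Sending \(body) to \(recipient)")
    }

    // An underscore means "no external label"; "text" is purely internal.
    func log(_ text: String) {
        print(text)
    }

    send(to: "alice", message: "hi")  // labels + colons at the call site
    log("done")                       // no label at the call site

With separate external and internal names, the call site reads like prose while the body keeps a sensible local name, which is also why the colon-plus-comma "redundancy" mentioned above doesn't hurt.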
> We learned this lesson once, and we learned it very well. We don't want that to happen again.
Some people have forgotten, though.