Hacker News | DagAgren's comments

> it works because we as a society have a visceral negative reaction to some labels.

Do you know why we have that reaction? Because of millions upon millions of dead, innocent humans. That is what those ideologies lead to. We learned this lesson once, and we learned it very well. We don't want that to happen again. We don't want to let those ideas spread again. We don't want to see the mass graves they lead to again. We learned that.

Some people have forgotten, though.


IDK, I'd say people calling everyone they don't like a Nazi seem like a party which doesn't get it.


Some people remember the horror of Nazi Germany as well as the horror of the Red Terror, Stalinist Russia, and the Cultural Revolution.


All that, and the horrors of McCarthyism too.


You're comparing McCarthyism to the Nazi genocide, the Red Terror, the horror of Stalinism and the Cultural Revolution?


I'm saying it's another authoritarian impulse to squash dissent, yes. Smaller magnitude, sure. But that's exactly why you compare things -- to see what's better or worse.


> we're experimenting with the idea of an app that has a comic reader, book reader, Reddit client, art viewer, video player

Why? I have all of those on my computer without strapping anything to my head.


You could potentially always have with you as many displays as you want, at whatever size you prefer, and use them wherever you like: desk, couch, lying in bed... VR / AR headsets can be a more convenient and comfortable replacement for traditional displays and desk setups.



Yeah, but all the things mentioned are things that require my full attention for longer spans of time. I don't need separate displays for them; I need one single good display.


I personally would love VR/AR glasses that I could keep attached and do everything on. Phone, CLI, PowerPoint, web browser, unlimited monitor space, Netflix, Xbox, etc.

I know the pixel density isn’t there yet but I don’t want different screens. The AR/VR would be the “single good display” you’re looking for. I want glasses I can use for literally everything I need a screen for.


For the reality-clingers, this is also the perfect set-up. No more closing your laptop and having your TV staring back at you. Your screen is your screen is your screen, and you can turn it off and not have to deal with any of them.


A VR headset can also give you a single good display on demand. What about reading a 100” mural-scale comic book while lying in bed?


It definitely will not have the relative resolution of a physical monitor.


Current headsets' resolution is already surprisingly good with some dev tricks. We use something called Compositor Layers at https://supermedium.com/ to render comic books in VR that look sharp and vivid. Not “retina” resolution yet, but improving very quickly; give it a couple of headset generations. We already have much, much higher-resolution, higher-density panels, but need to work within other constraints like small form factors, mobile SoCs' limited compute power, weight, thermals, and battery life.


It's still unclear to me what the benefit is, though. The drawbacks are obvious - I have to wear a headset, and I lose physical interfacing, both in terms of input interfaces and in terms of being able to, say, just put an e-reader down to stop reading it. What do I gain, after giving up all that?


Best is to try. If you have an Oculus Quest handy, you can give the Supermedium demo a try. Your feedback would be super appreciated.

With a headset you get better ergonomics: resize and position a display at will, and don't have to hold anything. For comic books in particular, mural-scale pages give you an appreciation of the art not possible on a traditional display. Note that physical input is still available via controllers: similar to a TV remote or gamepad, and also tracked in space, enabling more subtle interactions than “traditional” input. Tracked physical keyboards will eventually be available too.

I agree that wearing / removing a headset is additional friction. We consider it a feature, not a bug. It's hard to focus these days with so many distractions. Once you put a headset on, you're committed to the task, and it removes all the noise. We see VR as a tool for focus, like shutting the door of a super fancy and private office.


Regulators were definitely not asleep at the wheel. They were apparently actively defending them, for instance by attempting to sue the Financial Times for reporting on the irregularities.


He also chose the word "kafkatrap", a word coined by a notorious racist.


Using a kafkatrap against an opponent you can't beat in debate, when they have just pointed out the tactic, is probably ill-advised; perhaps try something else: ad hominem or motte and bailey, for example.


"Kafkatrap" is a meaningless term, beyond "stop calling me a racist just for saying racist things".

Acting like it's an accepted logical fallacy is ridiculous. It's a term ESR made up because people kept rightly calling him a sexist and racist and he didn't like it and threw a tantrum.


Well, let's see... oh, that's odd, that meaningless term appears to have a real meaning: https://debate.fandom.com/wiki/Kafka_Trap . Now why would you be willing to lie about that?

Seems it's a perfectly accepted logical fallacy; and the only people who deny it are the sjw crowd largely because it is such a favoured tactic within their ranks.



Yes, those are dictionaries: for definitions of words, not a repository of debate tactics. If you'd checked, you'd also notice that there's no entry for "motte and bailey fallacy", "appeal to ignorance", or "appeal to authority"; funnily enough, that doesn't prevent those existing either.



Well, yes, if you limit yourself purely to a single liberal-arts college's list of definitions, then you won't; however, search engines are your friend: https://en.wikipedia.org/wiki/List_of_fallacies

And desperately clinging to any page-not-found of whatever website you can find to display it isn't exactly the most secure display of debate.


Ah yes, Wikipedia, with one source from a libertarian propaganda rag. Very reputable.

Nobody but libertarians looking for excuses for racism use that term, deal with it.


And at last you've taken my advice:

> Using a kafkatrap against an opponent you can't beat in debate, when they have just pointed out the tactic, is probably ill-advised; perhaps try something else: ad hominem or motte and bailey, for example.

Allow me to quote from one of your trusted sources: https://owl.purdue.edu/owl/general_writing/academic_writing/...

> Ad hominem: This is an attack on the character of a person rather than his or her opinions or arguments.


Feel free to prove me wrong by showing a non-libertarian source that takes this term seriously.


Appeal to authority (points for variety at least) is a logical fallacy that I literally pointed out earlier.


You really don't understand how logical fallacies work at all, do you.


Well I must admit I haven't had as much practice at them as you have.


Ironically, you're employing a fallacious debate tactic (ad-hominem attack) while incorrectly trying to label a different debate tactic fallacious.


It describes a fallacy, and I have no idea who coined it. I first learned of it on HN, actually.

You don't know who coined "coined", but they may well have been a racist. Are you going to stop using it if so? Does that mean it's no longer useful for communication? Are you going to investigate every word on the chance it might've been and strike those from the lexicon?


It is, in fact, not useful for communication, because it does not honestly communicate anything. It exists only to undermine people who try to call you out on making bigoted remarks. It was coined by ESR, and is popular mainly with people with a strong affinity for bigotry, like him, and also libertarians.


> The implication is that some external knowledgable entity communicated with the patient

The implication of the actual article, if you read it to the end, is the exact opposite.


The article also mildly makes fun of the people who believe it to be a supernatural phenomenon.


This was to more closely match the behaviour of Objective-C, and it has been removed as better ways of Objective-C interop were developed.


> and Swift is particularly weird.

The way Swift does them is the least awkward and most robust I have seen in any language.

> As an example, the colon and the whitespace are sufficient to separate parameters. So is the comma. So we have...both?

Designing a language around the minimum requirements of a parser will not get you a good language. Human language is massively redundant because humans find it easier to communicate when there is redundancy. Computer code is also communication aimed at humans, and a bit of redundancy does not hurt there, and can often help.

> There is also the issue that, please correct me if I got this wrong, the label that's in the signature becomes the name of the parameter inside the function.

This is incorrect. You can optionally specify both an internal and external name for any argument. Unlabelled arguments are just a special case of this syntax, where you say "external name does not exist, internal name is this".
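For instance (a minimal sketch; the function and parameter names here are made up for illustration, not taken from the discussion):

```swift
// External label "to" is used at call sites; internal name "recipient"
// is what the function body sees.
func send(to recipient: String) -> String {
    return "to \(recipient)"
}

// An underscore suppresses the external label entirely; internally the
// parameter is still named "recipient". Note this is a distinct function
// name, send(_:), overloading send(to:).
func send(_ recipient: String) -> String {
    return "anon \(recipient)"
}

send(to: "Alice")  // labelled call site
send("Bob")        // unlabelled call site
```

Because the label is part of the function's name, `send(to:)` and `send(_:)` are two different functions as far as Swift is concerned.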


This is already the case with the name of your function. And named arguments are basically just an extension of the name of your function.


But this is already the case for the name of the function. Semantically, named arguments are just part of the name of the function.

