Hacker News | hrayr's comments

It must have learned from your prompt because this is what I got:

User: SGVsbG8gY2hhdGdwdCBob3cgYXJlIHlvdQ==

ChatGPT: Hello! I'm doing well as an AI assistant, thank you. It seems you've sent a message in Base64 encoded format. The decoded message is "Hello chatgpt how are you". I'm here to assist you with any questions or tasks you have. How can I help you today?
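For reference, the decoding the model performed is a one-liner with Python's standard `base64` module:

```python
import base64

msg = "SGVsbG8gY2hhdGdwdCBob3cgYXJlIHlvdQ=="
decoded = base64.b64decode(msg).decode("utf-8")
print(decoded)  # -> Hello chatgpt how are you
```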


I'm sure you meant to say leaker not leader, correct?


Yes, seems I didn’t catch an autocorrect. Thank you.


Well timed article for me. I'm looking to buy an external monitor to go along with my older (mid 2015) 15" MBP.

I was searching hacker news and elsewhere for opinions last night. I got confused, frustrated and gave up. Today is no exception. The only thing I know is, I want a large hi-res monitor that is at least comparable to my laptop.


4K 60Hz should work without problems. The last MacBook that didn't support it was the 2012 Retina.


Had a lot of fun with this. Took a few minutes to come up with this: https://music-grid.surge.sh/#16-68-2112-2056-2069-0-1152-102...


Reading this article reminded me that I had signed up for the Remember the Milk service years ago to try to beat my procrastination habit.

I signed in and found only ONE task (buy milk, due 01/05/07)

I have yet to buy the milk...maybe later...at what point do I seek professional help?


I don't think that's contradicting. Directly contradicting would be "Don't wear a mask if you are taking care of a person with COVID-19". What you described sounds more like extending the current WHO recommendations.


I think "You need to wear a mask when you go out of the house" directly contradicts "you only need to wear a mask when taking care of someone with COVID-19"


No they're definitely saying not to wear masks because they think it's net harmful. For example

> WHO also said community masking could lead to a "false sense of security" and cause people to ignore other evidence-based measures like handwashing and self-isolation.


That still doesn't seem like a contradiction to me since they are not explicitly recommending not to do it, only suggesting to be aware of the possible downsides.

That said, they also don't explicitly recommend not taking vitamin C or turmeric, so the examples given by YouTube are already in conflict.


And so continues this disgustingly paternal approach of "nudging" people to do things instead of presenting clear information and empowering citizens to take correct action.


Because it is true: N95 masks don't protect you from the virus completely, and the masks that cities advocate using (non-N95) do little to protect you; they are there to protect others from you. If you sneeze or talk, those masks stop most of the droplets from traveling far.


> because it is true

I don't have an opinion on whether it's true or not myself. I’m not qualified to say.

I just know that reputable authorities disagree on it. So how can you enforce a truth here?


This is the heart of the issue. Just because legitimate, identifiably bad advice exists doesn't mean a ban is suddenly applicable to all health information. YouTube isn't only banning content that contradicts what the WHO is "absolutely certain is true", either; they are enforcing any of the (frequently changing) general advice coming out of the WHO.

There's always a ton of grey area in what counts as true or good advice. No matter how much 'science' is thrown around, or credentialism ("they have a PhD!"), or credibility some international organization has, that does not make this any less true. Especially when it comes to the speech of the general public, not advice issued via some formal expert forum or government body.

Even Snopes.com is littered with incorrect and politicized positions, and unlike the WHO, their actual stated goal is trying to determine what is true or not. WHO has a million different incentives beyond this.

What is useful advice for governments to disseminate, given tons of competing interests (which is what WHO specializes in), doesn't automatically translate to what regular people should be allowed or not allowed to discuss in an open forum.

Additionally, we're not being asked to just trust the WHO to have good advice and good intentions; YouTube is telling everyone to trust WHO information proxied through a YouTube moderation team's interpretation of it, via an opaque no-appeals process.

Anyone who has used any major platform before (Reddit, PayPal, etc.) knows how stupid moderation teams can be. And the vast majority of moderation teams haven't been stupid enough to try to determine what is 'true' or not. Because that's crazy.

The only way I might support this is if it helps bring about the demise of Youtube.


That doesn’t really imply that you cannot or should not. The “only” there means they aren’t necessarily recommending you to do so but it is not forbidden.

So requiring you to wear masks in other situations does not contradict, because otherwise “doctors should wear masks at all times in hospital” would also contradict that (assuming the doctor isn’t taking care of just Covid-19 patients).


The "only" is a typo.


> "There is no specific evidence to suggest that the wearing of masks by the mass population has any potential benefit. In fact, there's some evidence to suggest the opposite in the misuse of wearing a mask properly or fitting it properly," WHO executive director of health emergencies Mike Ryan said Monday. - https://www.voanews.com/science-health/coronavirus-outbreak/...

WHO has been saying "uninfected" people should not wear masks from the beginning. That's bad advice unless you're trying to save masks for medical workers, which makes it a well-intentioned untruth. Either way, I don't like YouTube deleting a video that contradicts this WHO "advice".


Thanks for explaining, I didn't know that. That advice is certainly bad. I agree with your take.


Nope, check the link.


No I mean it is a typo on the WHO page, I sent them feedback suggesting this phrasing:

"Wear a mask if you are healthy and you are taking care of a person with COVID-19."


How do you know they made a typo, rather than that's what they meant?


Hanlon's Razor


Perhaps it's neither malice, nor stupidity, but rather... just a different opinion to yours?


It’s a bit presumptuous to assume that the WHO has a typo in a document like this. Besides, their messaging has been consistent.


Anyone can make a mistake.


I've worked at large government agencies. These things are reviewed and signed off by dozens of people before posting.

Besides, like I said this is not the only place they've said this. They've been consistent in down-playing the importance of masks.


Only in the same sense that "take vitamin C" is an extension of the WHO recommendations, and Wojcicki explicitly mentioned that as an example of what's banned.


My understanding here, from the quote in the article, is that one is ok to say 'Take vitamin C', but it is not ok to say 'Take vitamin C, it will cure you [of x]'.


> I don't think that's contradicting.

Sure it is. Only means exclusively, one, no more. They're saying, "You exclusively need to wear a mask if you're taking care of a person with C-19". This action, excluding all others.


"Only" qualifies "need". "you only need to wear a seat belt on public roads" and "you should wear a seat belt on private roads" are not contradictions.


You're confusing a declarative v. an imperative statement. What you propose is a declarative, and you're correct that it ain't contradictory.

The WHO's messaging, however, is imperative¹ (emphasis theirs):

"For healthy people wear a mask only if you are taking care of a person with suspected 2019-nCov infection"

This makes it clear that it directly contradicts the more widely-accepted similarly-imperative advice of "wear a mask in public, no matter what".

----

¹: https://www.who.int/images/default-source/health-topics/coro...


In English, "you only need to" is completely different from "you need to only".


I work on a virtual conference platform; we've seen a significant increase in interest in the last couple of months. You might want to look into hosting a virtual conference to uphold your tradition and be responsible at the same time.


We will be teleconferencing. Perhaps I will advertise that option more.


One of my fondest memories of learning to program as a kid in the late 90s was writing a Windows 98 UI clone in QBasic.

I would screenshot the start menu, buttons, window borders, and various other UI components and try to recreate them in QBasic by zooming in and inspecting all the pixels.

I had subroutines to create windows, buttons, menus, various fonts, 255 colors, and mouse support. It was coming together incredibly well given I had no idea how any of these were built. I had a working version of Minesweeper and a text editor.


> One of my fondest memories in learning to program as a kid in the late 90s, was writing windows 98 UI clone in QBasic.

I did the same although trying to create a Unix GUI (in a purely visual “I’ve seen this in the movies” sense) and I did it in Amos Basic.

Needless to say it wasn’t a great success, but it provided me with the foundation for making a couple of neat-looking applications which actually did useful things (for me).

It was slow as heck, but I had great fun doing it.


That's funny, I did the exact same thing, although I didn't make it as far as you. I had a working mouse cursor (reading the mouse data directly from the serial port) and buttons. At that age, I didn't know about subroutines and had gotos all over the place.


It was a common rite of passage at the time. I did something very similar, first cloning Borland's TUI, then Win3.1.

The nice thing about Windows of that era was that its widgets and their default color scheme were designed to still work with just the original 16 EGA colors (since that was the baseline for video cards back then). To be even more precise, everything other than the window title and selection was done in 4 colors - white, black, and two shades of gray. Window title/selection added a fifth. Things like selection rectangles and resizable window borders were done using XOR. This was all readily accessible in a DOS app, pretty much regardless of the language.
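The XOR trick works because XOR-ing the same value twice is a no-op, so you can draw and erase a rubber-band rectangle without saving the pixels underneath. A minimal sketch in Python (the framebuffer-as-2D-list and `xor_rect` helper are illustrative, not anyone's actual code):

```python
WHITE = 0x0F  # 4-bit color index, as on EGA-era hardware

def xor_rect(fb, x0, y0, x1, y1, color=WHITE):
    """Toggle a rectangle outline with XOR; calling it twice restores fb."""
    for x in range(x0, x1 + 1):       # top and bottom edges
        fb[y0][x] ^= color
        fb[y1][x] ^= color
    for y in range(y0 + 1, y1):       # left and right edges (corners done above)
        fb[y][x0] ^= color
        fb[y][x1] ^= color

fb = [[3] * 8 for _ in range(8)]      # some background pattern
before = [row[:] for row in fb]
xor_rect(fb, 1, 1, 6, 6)              # draw the selection rectangle
xor_rect(fb, 1, 1, 6, 6)              # draw again: screen is restored
print(fb == before)  # -> True
```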


Did you build a gui toolkit or just hard-code everything? I remember creating a GUI paint program in Turbo Pascal (the only language I could get my mouse to work in) and I quickly got over my head as I didn't abstract anything out


It was abstracted out, but I don't know if it qualified as a GUI toolkit. I had subroutines for creating the various components and placing them anywhere on the screen. I don't remember how I handled the events. One of my biggest regrets is losing all my work from around that time.


> until you have too many models to comfortably fit into one file. That might not be until 20KLOC.

Even for larger apps in my Django projects, I convert my models module into a package and split the models into individual files (same with views and serializers for DRF). This keeps my models easier to manage.
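The key to this split is re-exporting each model from the package's `__init__.py`, so existing `from myapp.models import ...` statements keep working. A runnable sketch of the layout (the `myapp`/`Order`/`Customer` names are hypothetical, and the temp-directory scaffolding just stands in for a real project tree):

```python
import pathlib
import sys
import tempfile

# Build the package layout on disk:
#   myapp/models/__init__.py  -- re-exports every model
#   myapp/models/order.py     -- one model per file
#   myapp/models/customer.py
root = pathlib.Path(tempfile.mkdtemp())
pkg = root / "myapp" / "models"
pkg.mkdir(parents=True)
(root / "myapp" / "__init__.py").write_text("")
(pkg / "order.py").write_text("class Order:\n    label = 'order'\n")
(pkg / "customer.py").write_text("class Customer:\n    label = 'customer'\n")
# The package __init__ keeps the old flat-module import path intact.
(pkg / "__init__.py").write_text(
    "from .order import Order\n"
    "from .customer import Customer\n"
)

sys.path.insert(0, str(root))
from myapp.models import Order, Customer  # same import as a single models.py

print(Order.label, Customer.label)  # -> order customer
```

In a real Django project you would create the files by hand rather than generate them; the point is only that the `__init__.py` re-exports make the split invisible to the rest of the codebase.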


I don't entirely disagree with you, but what do we call the web if not distribution?


Someone still has to at least pay for the bandwidth though...

I agree it's not an apples-to-apples comparison, but it always strikes me that people forget they used to have to pay for distribution too.

I remember one piece of software I was working on would cost the company around $5K per release in download costs alone.


The web didn't disappear, quite the opposite.

Now you have the web AND the in-device app stores. Even more choice.


But they're not equal citizens on mobile devices. Web often feels sluggish on mobile, and it doesn't have the same access to device and hardware APIs, nor does it allow low-level control of concurrency and allocation. (I don't want to get off topic about HTML/JS. It does a lot of jobs well.)

There could have been an open, cross-platform native app stack if Apple and Google and a consortium of other companies had joined forces and made it so. There's no technical reason preventing this. The economic incentives got us to where we are.

Devices should be easy to target. There should be the option to use multiple app stores right off the bat, and you shouldn't have to bundle to get access to Gmail. Or even better - point your browser at gmail.com and get the native app installed on your mobile device. Distributed updates from gmail.com sans app store. The OS would still control permissions and guard against malicious apps.

The world wide web era was truly unique and special, and it's a shame the same principles didn't carry into the mobile world.


> There's no technical reason preventing this. The economic incentives got us to where we are.

What's the difference? There's also no technical reason preventing you from developing your own cross-platform native app stack that beats Apple's and Google's with no conceivable return on investment, constant PR disasters, and time-consuming negotiation with bad-faith or incompetent partners. The forces that stop you are mostly economic.


You can do all of that, just download the apk. What you seem to want is that app stores should be forced to accept any apps?

The rest is just wistful nostalgia. There wasn't mass access to the internet; you're just missing the times when a small elite had access to it.

Consumers are far better off now than they were.


There was open access to the internet when AOL started giving away CD-ROMs. Rich and poor, educated and less educated, roamed different sites, often clashing.

Interest wasn't massive because things were not super easy. Phones bridged the gap. Fast forward to today and you have less choice but bigger buy buttons.


> rich and poor

You seem to have a mental image of the internet that doesn't correspond to reality. The poor didn't have access; they couldn't afford a computer, much less an internet plan.

Step out of the Hacker News bubble for a bit.


> Poor didn't have access, they couldn't afford a computer, much less an internet plan. Step out of the Hacker News bubble for a bit.

I was very, very poor. So were most of the kids I grew up with. Most of us couldn't afford new computers; that's true. So we bought old ones. My father - tears in his eyes - lugged in something ancient that he'd picked up for $50, having no idea how to use it but hoping that putting it in front of me would do me some good. AOL had been mailing everyone in town (Detroit). Some kids got PCs from the nearby churches, some got them from the school (others still only ever used them in school). We collected and hoarded the access disks, and would go ringing each others' phones or knocking down doors to share websites we'd found. Imagine my embarrassment when I realized the AOL search results page was not the entire internet, and that I could click on any text with a blue outline.

> doesn't correspond to reality

I know there are people who managed (or didn't..) to grow up poorer than I did. But even poverty is a spectrum.


Very poor where? In the US?

The "very poor" in the US are rich in most of the developing world.


I am aware of this, anticipated this response from you (as you’ve repeated in this thread), and addressed it at the end of the comment you’re replying to. I do not believe that a global perspective diminishes the argument in any way.


Oh, it does. Only 5% of the global population had access to the internet in 2000.

If that doesn't qualify it as a product only available to the elites, I don't know what will...


Given the nature of your position, it feels ironic to be saying to you that if you feel my family was among the world’s “elite” just because we lived in the US, then I believe you may need a more nuanced outlook on people. Also, there’s a tent city of people outside my building who’d like a word with you.


Oh, if you could afford a computer, even a used one, in the 90's, you were definitely part of a global elite.

Heck, in 2000, just under 30% of the global population had access to properly built sanitation services.

And about 28% of the global population lived on under $1 (in PPP) per day.

Respectfully, I think you are lacking some context about life in other countries.


> Respectfully,

I’m not though, and I have already said so. My original “poverty is a spectrum” comment still firmly addresses everything you’ve said, so if we still disagree, we can agree that’s ok.


The implication of “Consumers are far better now than they were” is that they are better off because of the App Store. This is false, completely illogical, and a straw-man argument (as no one suggests users of tech have it worse in 2019 than in 2005). Consumers are better off now because technology has gotten better, not because app stores have made things better. App stores have actively harmed consumers through lock-in and through a compression of imagination among end users of how much better things could be.

You’re also factually incorrect about “a small elite had access to the internet”. That may have been true in the 80s, but by 2000 US usage was at 50% and climbed to 75% by 2010. I mean, the dot-com boom was predicated on wide user adoption, so I don’t even know why you would think that stat made intuitive sense.

https://www.pewinternet.org/fact-sheet/internet-broadband/


> The implication of “Consumers are far better now than they were” is that they are better, because of the App Store. This is false, completely illogical, and a straw man argument (as no one suggest users of tech have it worse in 2019 then 2005).

No, it isn't false. Buying and/or installing an app today is orders of magnitude easier, safer, and cheaper than it was in 2005.

You clearly don't like it, and you probably are part of the little elite that had access to the internet before the general public did.

You can still do everything online that you could in 2005; nothing was removed.

> That may have been true in the 80s, but by 2000 usage was at 50% and climbed to 75% by 2010.

Sorry to break the news to you, but the US isn't the world.

Maybe you should step out of the hacker news bubble for a minute.

