Yes, I agree. Original content takes a lot of work to produce and could get an extra chance by default, whereas news articles, tweets, and content from large tech companies have their own promotional campaigns.
I'd rather have eclectic ideas and projects from HN users not be overlooked (thus encouraging more of such content), and am less worried about GAFAM announcements, CNBC/Axios/BBC news, or things already popular on Twitter/Reddit.
I'm all in favor of doing more to help obscure sites and having less major-media and $BigCo stuff, but there are limits. A site being obscure or having original content by no means implies that it is interesting in HN's sense. If you try to encode those criteria into software (and we've tried many times) the median-quality post comes nowhere close to clearing that bar, so you still need human curation, and that is basically the status quo. If you look at https://news.ycombinator.com/pool you should see a lot of such sites.
Also, a lot of those media and BigCo stories really are of interest to the community. We try to dampen the stuff that's repetitive, and most of those sites are downweighted by default, but HN would not get better if they were excluded. It's all just more complicated than it seems like it might be.
What ultimately matters is how interesting a story is, not what site it comes from. I'm suspicious of encoding proxies for that, because it would be easy to end up optimizing for the wrong things. https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor...
Yeah, that's totally understandable. I'm not advocating for removing/demoting major media stuff or bumping up obscure sites, and I'm not even saying the scoring algorithm should change.
Rather, I think obscure sites should get more opportunities to be organically upvoted (and if they don't get voted up, then fine) and not just fall off /new after a few hours, seen only by a few people. The BigCo stuff naturally gets posted often, with several (different) links from different people, whereas obscure stuff is only posted by a single person once. So this is about evening the odds.
One idea here could be to have some set of guidelines for a domain, like: is not commercial, is not promoting something, has had past HN front-page discussions. Then those domains could just get a slightly different color in the new stream.
Maybe better would be to weight the first 50 votes or so: if the site has rarely been submitted to HN, every 2 votes count as 3, or whatever variable weight works. The problem is that you can't give blanket +1 votes to submissions from less mainstream sites either, so initial traction might be harder to achieve anyway. I don't know if mods manually upvote some of the new content with this in mind, but yeah, in the end this second-chance pool is pretty equivalent.
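To make the weighting idea concrete, here's a minimal sketch; the 50-vote window, the 1.5x multiplier, and the submission-count cutoff are all illustrative guesses on my part, not anything HN actually does:

    # Hypothetical early-vote boost for rarely-submitted domains.
    # All thresholds and weights here are made up for illustration.

    def weighted_score(votes: int, prior_submissions: int) -> float:
        """Score where early votes on obscure domains count extra."""
        if prior_submissions >= 10:      # mainstream domain: no boost
            return float(votes)
        boosted = min(votes, 50)         # only the first ~50 votes get the boost
        rest = votes - boosted
        return boosted * 1.5 + rest      # "every 2 votes count as 3"

    print(weighted_score(20, prior_submissions=2))   # 30.0
    print(weighted_score(20, prior_submissions=50))  # 20.0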
Looks like you made a calculation error or misunderstood your units. A credit hour is 3 hours of work per week, since a credit hour is "(1) One hour of classroom or direct faculty instruction and a minimum of two hours of out of class student work each week for approximately fifteen weeks for one semester or trimester hour of credit, or ten to twelve weeks for one quarter hour of credit, or the equivalent amount of work over a different amount of time"
So it's 2,700 hours for a 4-year degree, versus 960 hours in Lambda School, just using your method of calculation. That's also not counting the internships or summer programs that students in 4-year schools usually take part in, nor extracurriculars during the school year, like hackathons, interview preparation, programming competitions, student group projects, etc. Finally, you're assuming that an entire half of a college degree is geneds, which is really not the case. It's more like 1/4 geneds, 1/2 required major/concentration courses, and 1/4 electives, which many students use for technical courses. So probably more like 3,200 (minimum) to 5,000 hours in a 4-year college.
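For concreteness, here's the rough arithmetic behind those numbers, assuming a typical 120-credit degree and 15-week semesters (both assumptions on my part):

    # 3 hours of work per week for ~15 weeks = 45 hours per credit
    HOURS_PER_CREDIT = 3 * 15
    TOTAL_CREDITS = 120              # assumed typical 4-year degree

    # Counting only half the degree, as in the parent's method:
    print((TOTAL_CREDITS // 2) * HOURS_PER_CREDIT)       # 2700

    # Counting ~3/4 (1/2 major + 1/4 technical electives):
    print(int(TOTAL_CREDITS * 0.75) * HOURS_PER_CREDIT)  # 4050, before
    # internships, hackathons, and other extracurriculars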
This is a reasonable and balanced analysis of the situation. In retrospect, it seems like the reversion of the 190 patches was an overreaction that ended up causing a lot of confusion: many people, even on HN, misread the comments on the reversions and believed that bad patches had been committed to the source tree or to stable.
But besides the lesson that one ought not to be deceptive when submitting patches, there is also the lesson that the kernel is not as well reviewed as one might hope, and that with some effort it's certainly possible to add an undetected vulnerability. I think that's probably one thing that led to the drama: the fundamental trust and work of the kernel was attacked, and the maintainers felt the need to fight back to protect their reputation.
190 patches were not reverted. 190 patches were proposed to be reverted, and those reverts have been going through the normal kernel review process. In many cases, the patches were confirmed to be correct, and so the revert was dropped. In 42 cases, the commits were found to be inadequate in some way: the commit didn't actually fix the problem, or introduced another problem, or, I believe in one case, actually introduced a security problem(!). Call that last set of patches "good faith hypocrite commits".
Remember, too, that many of these commits were in random device drivers. Consider how often random Windows device drivers crash or cause random blue screens of death. Not all "SECURITY BUGS (OMG!)" are created equal. If it's in some obscure TV digitization card in the media subsystem, it won't affect most systems. Core kernel subsystems tend to have much more careful review. As the ext4 maintainer, I'm a bit slow in accepting patches, because I want to be super careful. Very often, I'll apply the patch and then look at the changed code in the context of the entire source file before approving the commit. Just looking at the diff might not be enough to catch the more subtle problems, especially in error handling code paths.
The problem with device drivers is that in some cases they are written by the hardware engineers who created the hardware, and as soon as the hardware ships, those engineers are reassigned to other teams and the device driver is effectively no longer maintained. One of the reasons some maintainers are super picky about allowing device drivers into the kernel is the concern that the driver author will disappear after the patch is accepted, leaving the subsystem maintainer responsible for maintaining the driver. So if you want to sneak a SECURITY BUG (OMG!) into the kernel, targeting some obscure device driver is going to be your simplest path. But that's really only useful if you are gunning for an IEEE S&P paper. The obscure device is not likely to be in general use, so it's not that useful.
(Unless, of course, you are a hacker working for Mossad, and you are targeting a super obscure device, like, say, a nuclear centrifuge in use by Iran.... in that case, the fact that the security vulnerability only works on a very small number of systems is a feature, not a bug. :-)
I assume it depends on the nature of the diff? I do a lot of code review at $JOB, and sometimes the diff is so obvious that there's no need to look further.
OTOH if the code is something I haven't looked at in a while or don't understand that well, I'll read around a bit and see if there's a way the diff can be improved.
I don't think the latter part is true; my impression is that the kernel people are very well aware of the limits of their review ability and don't pretend to be unfoolable.
There's a wide range of degrees between "unfoolable" and "can be fooled by a persistent student". The impression (at least my impression) used to be that inserting a vulnerability was possible but quite unlikely without state-level effort; now we understand that a properly advised student can get most of their attempted vulnerabilities inserted.
The sheer number of ordinary bugs that aren't caught is already a good indicator that not much special effort is needed. (And "just a persistent student" isn't that little, given that the group also contributed regularly to the kernel and was studying its security; they were quite familiar with the field, and exactly the kind of people a nation state would employ for this.)
In my life, this was an aggressive employee we hired who was not performing but was also passive-aggressive toward everyone, and at the beginning we were obligated to give them the benefit of the doubt (due to their explanations that they were struggling with the pandemic, civil unrest, a family emergency, the elections).
Then, even after we gathered evidence of their poor performance, they would just keep arguing their case, protest to everyone who would listen, and pit people against each other. They claimed they never knew those things were their responsibilities, that they were working hard on them, that they were just about to address the performance issues, etc.
I'd be a little careful with one-line dismissals of entire fields. Each field has hundreds of people dedicating their entire lives to it, and it'd be offensive to them for someone who hasn't done any research to think they can judge the whole field like that. Imagine a high school student who took a community college course saying, "University juniors don't know anything. I discovered that they just copy from Wikipedia." The advice in this thread is a great starting point.
Actually, that is his genuine opinion, and it reflects his interests. I'd also point out that you should be more careful about judging community college courses. There are people doing great things there too, dedicating their entire lives to it.
I think it's just about to take off. Browser support across the board is coming by the end of the year. PNG also took a while to become fully supported, so we just need some patience.
Not for me. I'm in the United States in an urban area, and mobile data is 1 cent per megabyte (my provider is Google Fi). Decoding is pretty fast for me, but the time for page elements to download is in the seconds (not even milliseconds). And if there's any low signal (wifi or mobile data), over 10 seconds is not uncommon.
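As a rough back-of-the-envelope illustration (the image size and low-signal bandwidth here are my own assumptions, not measurements):

    image_mb = 2.0                          # assumed size of a large page image
    print(image_mb * 0.01)                  # 0.02 -- two cents at 1 cent/MB

    low_signal_mbps = 1.0                   # assumed weak wifi/mobile link
    print(image_mb * 8 / low_signal_mbps)   # 16.0 seconds to download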
No. Tuition fees are just one of five money streams that the university relies on, the others being research grants, service provisioning (contract research mostly), subsidies, and assets & investments.
Seems a bit low. Isn't a drink there already $7 for a beer or $12 for a cocktail? That's $10 and $15 with tax and tip respectively, leaving very little for the entree plus appetizer...