A lot of web properties are HTTP-based, and they probably won't switch either, because the owner doesn't care enough to go through all the hassle, or because the site/pages are hosted on a server whose owner doesn't care enough to go through all the hassle.
Just because these owners don't care about their sites doesn't mean the sites aren't valuable.
I was referring specifically to the author's claim that deprecating HTTP and displaying a warning to users is the equivalent of "rendering large parts of the web inaccessible" and "massive book burning".
If you mark something as 'Not Secure' you're effectively telling users not to access it. It doesn't matter in practice if the content is still accessible when the browser tells people that accessing it is dangerous.
A lot of content that exists on the unmaintained web is going to effectively be lost, and maybe that's okay because the benefits are worth it, but that's the argument that needs to be made.
A lot of content which exists on the unmaintained web is going to be lost anyway, regardless of HTTP. Hosting sites shut down after a while. There's a ton of content that was on Geocities, for example, which is now inaccessible.
A migration of newer websites to HTTPS is no more a loss of data than deprecating old versions of HTML in newer versions of Chrome, or upgrading servers to HTTP/2. The old ones will still exist. And if they cease to exist, it's almost certainly not going to be because a browser displays a "not secure" warning in the top left corner of the screen.
There are also entire websites and communities built upon the archival of data:
> It doesn't matter in practice if the content is still accessible when the browser tells people that accessing it is dangerous.
It certainly does matter. There's a fundamental difference between people being discouraged from accessing a website, and said website being completely inaccessible.
Sure it does. Users fear warnings they don't understand, and some will not click past the warning. For those users, you might as well replace the page with a 404 for all the good it does them. Maybe there are not MANY such users; I don't know and would love to find out. But surely there are some.
Wow, good job focusing on minute details instead of addressing the overall point your parent is making.
However, I'll indulge you and hazard a guess:
* The historically naive and somewhat incorrect generalization of some ill-defined "Western" culture (what time frame is he using?)
* the negative portrayal of it
* and the implicit assumption that any part of the portrayal is somehow unique to all Western countries, but not the rest of the world.
I will give you that this criticism of "Western culture" probably doesn't really relate to traditional Marxism, which more or less worked within the framework of Western culture.
I believe your parent might have had the concept of "cultural Marxism" in mind.
Another factor that affects the current prison population is the length of sentences. One country might imprison fewer people, but for much longer, so its total population at any one time might be very high. For example, imprisoning 50 people a year for ten years each gives a steady-state population of 500, while imprisoning 100 people a year for one year each gives only 100.
I think it's a sign that the reasons are horrible. Dig into the specifics and the US does unjustly keep people in prison on a regular basis.
We have huge issues stemming from generations of politics over substance. It only becomes obvious that things are getting worse when you look at the statistics, because nobody has an unbiased understanding of the entire system.
> As of now, text rendering methods are missing and for security reasons you cannot read back pixels from the canvas.
Does anyone have an idea as to what those reasons might be? I've heard of JavaScript access to certain CSS features being limited (e.g. getComputedStyle()), but I'm not sure what the benefit is here. Is there any way that user information could be leaked through a paint worklet context?
We made a mistake in the article there (there aren't any security issues with pixel read-back here). I'll ping Surma on Monday to get it fixed.
The primary reason we did this is to ensure there isn't a performance cliff if you read back pixels. With the current API surface you can record all of the canvas commands and play them back when you need to raster. Additionally, it doesn't leak how many pixels we are actually rasterizing.
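The record-and-replay idea described above can be sketched generically. This is not Chrome's actual implementation; the class names and the single `fill_rect` command are made up purely for illustration of deferring rasterization by recording commands:

```python
# Sketch only: a canvas that records drawing commands instead of
# rasterizing immediately, then replays them onto a target later.
# Names and API are hypothetical, not Chrome's internals.

class RecordingCanvas:
    def __init__(self):
        self._commands = []

    def fill_rect(self, x, y, w, h):
        # Record the command; no pixels are produced yet.
        self._commands.append(("fill_rect", x, y, w, h))

    def replay(self, target):
        # Raster only when needed, onto whatever target is supplied.
        for name, *args in self._commands:
            getattr(target, name)(*args)


class CountingRaster:
    """A stand-in raster target that just logs what it was asked to draw."""
    def __init__(self):
        self.calls = []

    def fill_rect(self, x, y, w, h):
        self.calls.append((x, y, w, h))
```

Because nothing is rasterized until `replay`, the recorder never has to expose pixel data, which matches the "no read-back" property described above.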
The idea behind my script is that the URL encodes the unix time of the first visit to that URL. If you start at the current time and work backwards, the first HTTP 200 you come across should be the current page.
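That backward search can be sketched roughly like this. The `exists()` callback (which would perform the actual HTTP request) and the `max_steps` bound are my assumptions, not details from the original script:

```python
import time

def find_current_page(exists, start_ts=None, max_steps=100000):
    """Walk backwards one second at a time from start_ts; the first
    timestamp for which exists() is true (i.e. the first HTTP 200)
    should correspond to the current page."""
    ts = int(time.time()) if start_ts is None else start_ts
    for _ in range(max_steps):
        if exists(ts):
            return ts
        ts -= 1
    return None  # gave up before finding a live URL

# In practice exists() would issue the HTTP request against the
# (hypothetical) URL scheme, e.g. with the requests library:
#   exists = lambda ts: requests.head(f"{base}/{ts}").status_code == 200
```

Injecting `exists()` keeps the search logic independent of how the URLs are actually fetched.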
What exactly is this referring to?