There are some people that REALLY want to find racism even when it's not there. The "everything is racist" crowd is just as insufferable as the "nothing is racist" crowd.
As they should. Postel's Law was a terrible idea and has created minefields all over the place.
Sometimes, those mines aren't just bugs, but create gaping security holes.
If your client is sending data that doesn't conform to spec, you have a bug, and you need to fix it. It should never be up to the server to figure out what you meant and accept it.
Following Postel's law does not mean accepting just anything. The received data should still be unambiguous.
You can see that in cases where ASN.1 data needs to be exchanged. You could decide to always send it in DER form (conservative) but accept BER (liberal). BER is still an unambiguous encoding for ASN.1 data, but it allows several representations of the same data.
The problem with BER mainly lies with cryptographic signatures, since a signature will only match one specific encoding; that's why DER is used in certificates. But you can still apply Postel's law and accept BER fields when parsing a file. If a field has been encoded in some variant form that is incompatible with the signature, you simply reject it, the same way you would reject it for not being valid DER. You still lessen the burden of making sure every party follows the standard in exactly the same way, and things tend to work more reliably across server/client combinations.
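To make that concrete, here's a rough sketch of "send DER, accept BER" using pyasn1 as the example library (the library choice is my assumption; any ASN.1 codec with separate DER/BER codecs works the same way):

    # Sketch of "send DER, accept BER": the same INTEGER has one DER encoding
    # but several legal BER encodings (e.g. long-form lengths).
    from pyasn1.type import univ
    from pyasn1.codec.der import encoder as der_encoder
    from pyasn1.codec.ber import decoder as ber_decoder

    value = univ.Integer(5)

    # Conservative sender: always emit the canonical DER form (02 01 05).
    der_bytes = der_encoder.encode(value)
    print(der_bytes.hex())  # 020105

    # Liberal receiver: accept a BER variant with a long-form length (02 81 01 05).
    # Different bytes on the wire, but it decodes unambiguously to the same value.
    ber_variant = bytes.fromhex("02810105")
    decoded, _rest = ber_decoder.decode(ber_variant)
    assert int(decoded) == 5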
I agree that being liberal in what you accept can leave technical debt. But my comment was about the place in the code where they set a cookie with JSON content instead of keeping to a format that is known to pass easily through HTTP header parsing, like base64. They should have been conservative in what they sent.
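For the record, the conservative version of that is tiny; something like this (names purely illustrative) keeps the cookie value free of the quotes, commas, and semicolons that header parsers choke on:

    # Conservative sender: base64-encode the JSON before it goes into the cookie.
    import base64
    import json

    payload = {"user": "alice", "prefs": {"theme": "dark"}}

    cookie_value = base64.urlsafe_b64encode(
        json.dumps(payload).encode("utf-8")
    ).decode("ascii")
    set_cookie = f"Set-Cookie: session={cookie_value}; Path=/; HttpOnly"

    # And decoding it on the way back in round-trips cleanly.
    assert json.loads(base64.urlsafe_b64decode(cookie_value)) == payload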
And yet the html5 syntax variation survived (with all its weird, now-codified quirks), and the simpler, stricter xhtml died out. I'm not disagreeing with you; it's just that being flexible, even if it's bad for the ecosystem, is good for surviving in the ecosystem.
There was a lot of pain and suffering along the way to html5, and html5 is the logical end state of postel's law: every possible sequence of bytes is a valid html5 document with a well-defined parse, so there is no longer any room to be more liberal in what you accept than what the standard permits (at least as far as parsing the document goes).
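You can see this with any spec-conformant parser; for example html5lib (just one implementation of the spec's algorithm, used here as an assumption) will happily turn garbage into a well-defined tree instead of raising a parse error:

    # Any character sequence parses to a well-defined tree under the HTML5 algorithm.
    import html5lib

    junk = "</p><table><b>oops<div>never closed"
    tree = html5lib.parse(junk)  # never raises; returns an ElementTree element

    # The implied html/head/body structure is always there around the junk.
    print([el.tag.split('}')[-1] for el in tree.iter()])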
Getting slightly off topic, but I think it's hard to find the right terminology to talk about html's complexities. As you point out, it isn't really a syntax anymore now that literally every sequence is valid. Yet the parsing rules are obviously not as simple as a .* regex. It's syntactically simple, but structurally complex? What's the right term for the complexity represented by how the stack of open elements interacts with self-closing or otherwise special elements?
Anyhow, I can't say I'm thrilled that some deeply nested subtree of divs, for instance, might be closed by an opening <button> tag just because they were themselves inside a button, except when... well, lots of exceptions. It's what we have, I guess.
It's also not a (fully) solved problem; just earlier this year I had to work around an issue in the chromium html parser that caused (IIRC) quadratic parsing behavior in select elements with many options. That's probably the most widely used parser in the world, and a really inanely simple repro. I wonder whether stuff like that would slip through as often were the parsing rules at all sane. And of course encapsulation of a document fragment is tricky due to the context-sensitivity of the parsing rules; many valid DOM trees don't have an HTML serialization.
You could split the difference with a 397 TOLERATING response, which lets you say "okay I'll handle that for now, but here's what you were supposed to do, and I'll expect that in the future". (j/k it's an April Fool's parody)
Misuse of inheritance is often the biggest generator of criticism of inheritance, sadly.
People use the wrong tool for the job, or use it incorrectly, and then blame the tool. It's like using a hammer to play drums, obliterating the drum set, then ranting against hammers.
OOP is a tool that, by design, invites people to hang themselves with it. The only way to avoid misusing it is to be able to see into the future, to see the impact of every single decision involving it, and to make only the ones that don't cause problems.
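For what it's worth, the classic shape of the misuse looks something like this (a hypothetical toy example, not from any real codebase): inheriting purely to reuse code, which leaks invariant-breaking operations into the subclass's interface, where composition would have hidden them:

    # Inheriting just to reuse code: every list operation that can break the
    # LIFO invariant (insert, sort, item assignment, ...) is now part of the API.
    class LeakyStack(list):
        def push(self, item):
            self.append(item)

    # Composition reuses the list but exposes only the operations that make sense.
    class Stack:
        def __init__(self):
            self._items = []

        def push(self, item):
            self._items.append(item)

        def pop(self):
            return self._items.pop()

    leaky = LeakyStack()
    leaky.push(1)
    leaky.insert(0, 99)   # perfectly legal, silently violates LIFO order

    s = Stack()
    s.push(1)
    # s.insert(0, 99)     # AttributeError: the foot-gun simply isn't exposed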
> People who are spending new car money are not going to settle for a product that requires planning and effort to be used outside of one's daily routine.
Maybe YOU won't, but others will.
I paid $60K for my Model 3 Performance. Yes, I chose to plan out my charging stops when I take my annual 1,300-mile road trip from Portland to Santa Clara, or my recent 2,400-mile road trip from Portland to San Diego.
But I CHOSE to plan them. You don't HAVE to. The car's built-in nav will easily plan charging stops for you. I just choose to plan them out ahead of time (using ABetterRoutePlanner.com) to min-max my charging time. E.g., I can tell ABRP "This will be a stop where I expect to spend at least 30 minutes", and it will adjust the rest of the charging plan accordingly. Or I can tell it to stop at specific chargers that might have a specific place I want to eat, or whatever. But my usual workflow is to set all my destinations (actual destinations, not including chargers), hit Plan Drive, and then make some minor adjustments to the charging plan.
I suppose in some way, I'm sort of proving your point. But it's not nearly the chore you make it out to be. In fact, I actually enjoy the planning. Of course, one person's joy is another's drudgery.
Are you sure it's not actually applying friction brakes?
I have a Model 3, and even when the driving mode is set to "Stop" (enabling one-pedal driving), I know that it's applying the friction brakes at low speeds, even when the battery is warm and not full.
Regen isn't enough to slow the car to a stop, even in ideal conditions, and it certainly can't hold the car in place.
Supposedly, first-gen Leafs were known to have pretty nasty degradation due to a lack of sufficient cooling. Combined with an already short-range battery, the belief that you'd need to replace the battery frequently was justified.
Key word: WAS
Of course, modern EVs, and basically all Teslas, have bigger batteries with better cooling, so it's no longer an issue. But the belief won't die, just like how people still make memes about Java being slow as if it's still 1998.
> I don't remember the source so, someone please correct me if I'm wrong but, I read that no EV battery can be made for less than $50K.
Absolute hogwash.
The only way for this to be true is if you amortize the cost of R&D and factory construction over a small number of batteries and fold it into the manufacturing cost. And I think it's incredibly misleading to roll R&D into the cost of a battery, precisely because it lets you make wild claims like this one.
So...for an incumbent manufacturer that's putting very little effort into actually selling EVs, it might be true that it's costing them $50K per battery if you include the cost of setting up the manufacturing. But for someone like Tesla, who has literally sold millions of cars, even if you include that cost, it's closer to $10K.
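Rough illustration of why the amortization makes the headline number meaningless (these figures are completely made up, just to show the shape of the math):

    # Hypothetical numbers only: the same fixed R&D + factory cost looks wildly
    # different per battery depending on how many batteries you spread it over.
    fixed_costs = 5_000_000_000   # assumed R&D + factory build-out, dollars
    marginal_cost = 8_000         # assumed cells + assembly per pack, dollars

    for packs in (100_000, 1_000_000, 5_000_000):
        per_pack = marginal_cost + fixed_costs / packs
        print(f"{packs:>9,} packs -> ${per_pack:,.0f} per battery")

    #   100,000 packs -> $58,000 per battery  (the scary headline number)
    # 5,000,000 packs ->  $9,000 per battery  (what high volume buys you)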
> the Biden administration's decision to never mention Tesla, instead proclaiming that manufacturers like GM were making America Electric (Tesla sold way more electric cars, and they are much more American).
This always bugged me.
Look, I'm no fan of Elon Musk, but Tesla has been the most influential car manufacturer in the EV space. To blatantly ignore them when talking about the electrification of cars in America is simply madness.
75% of the calories in broccoli are from carbs, sure, but because the overall calorie content of broccoli is so low, it's still considered low carb.
https://www.nutritionix.com/food/broccoli/1-cup
A 1-cup, 156-gram serving is 55 calories, with 11g of carbs and 5g of fiber, so it's only 6g of net carbs for keto purposes.
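Back-of-the-envelope check, using the rough 4 kcal/g factor for carbohydrate (how you count fiber calories shifts the percentage a bit, which is why figures like ~75% and ~80% both get quoted):

    # Sanity check on the 1-cup / 156 g serving numbers above.
    calories = 55
    carbs_g = 11
    fiber_g = 5

    net_carbs_g = carbs_g - fiber_g            # 6 g net carbs
    pct_from_carbs = carbs_g * 4 / calories    # ~0.8 if fiber counts at 4 kcal/g

    print(net_carbs_g, round(pct_from_carbs * 100))  # 6 80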