The constitution is merely a few pages of paper produced by the founding fathers together with many more pages of paper produced by the Supreme Court.
Without men and women willing to stand by it and defend it, it is useless. And what we are seeing is that an increasing number of people who have taken an oath to defend the constitution have chosen not to do so.
History is full of cases where a well-written constitution is ignored by the ruling government.
Yes exactly. Google is very careful to say that "Google cannot access Android end-to-end encrypted backup data", and notice it doesn't say that all Android backups are end-to-end encrypted. For all we know, Google could have decided to use non-end-to-end backups in the UK and end-to-end backups everywhere else.
I do not understand the rules by which you inject the epsilon, and I think this is a source of confusion for many people. I had thought that an epsilon could be injected anywhere a REGEX can appear (effectively allowing epsilon as a REGEX), but of course that just leads to an infinite number of parses. Manually injecting epsilon is a highly hacky thing to do; better to account for it when you design the grammar.
I would not worry about "cat:dog:mouse" because intuitively it is clearly correct and it means replacing cat with mouse. With parentheses it could be written as "((cat:dog):mouse)".
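Concretely, here is one hedged reading of that chained form, sketched in Go (strings.ReplaceAll merely stands in for whatever the real substitution engine would do; the example text is made up):

```go
package main

import (
	"fmt"
	"strings"
)

// "cat:dog:mouse" grouped as ((cat:dog):mouse), read as applying the
// substitutions left to right: the net effect is cat -> mouse.
func main() {
	s := "the cat sat"
	s = strings.ReplaceAll(s, "cat", "dog")   // (cat:dog)
	s = strings.ReplaceAll(s, "dog", "mouse") // (...):mouse
	fmt.Println(s)                            // the mouse sat
}
```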
Epsilon injection appears whenever the right or left side of ':' has no operand, e.g.
(:a)
(a:)
a:|b
a|b:
etc
I will try to change the precedence and see how it works.
Btw, what do you think about explicit operators '>' and '<', where '<' works as a usual regex matcher and '>' as a generator? For example, to change 'cat' to 'dog' there could be something like '<cat>dog', where the '<cat' part is a parser and '>dog' is a generator. Thanks.
Yes, it is underspecified. The deletion example shows that an empty string is possibly a REGEX. So you can essentially treat any position as containing as many empty-string regexes as you want, and there are indeed an infinite number of parses.
If we instead require a regex to be non-empty (breaking the deletion examples), then the ambiguity becomes that of concatenation: whether it's '(((c:d)(a:o))(t:g))' or '((c:d)((a:o)(t:g)))'. Assuming associativity, this would not matter.
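As a hedged sketch (in Go, purely for illustration; this is not the original grammar or implementation), you can model each 'x:y' pair as a (match, emit) rule and concatenation as joining rule lists. However the concatenation is grouped, the flattened list comes out the same, which is why associativity makes this ambiguity harmless:

```go
package main

import "fmt"

type rule struct{ match, emit string }

// concat joins two rule lists into a fresh slice.
func concat(a, b []rule) []rule {
	return append(append([]rule{}, a...), b...)
}

func main() {
	c := []rule{{"c", "d"}}
	a := []rule{{"a", "o"}}
	t := []rule{{"t", "g"}}

	left := concat(concat(c, a), t)  // (((c:d)(a:o))(t:g))
	right := concat(c, concat(a, t)) // ((c:d)((a:o)(t:g)))

	fmt.Println(left)  // [{c d} {a o} {t g}]
	fmt.Println(right) // [{c d} {a o} {t g}]
}
```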
I learned to implement my own multiplication and division as part of learning bignum with a high-level language (in my case Pascal). By the time I learned assembly language I simply focused on how to translate high-level language to assembler; basically acting as a manual compiler and comparing my output against a real compiler.
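For a sense of what that exercise looks like, here is a rough schoolbook-multiplication sketch (in Go rather than Pascal, with digits stored least-significant first; not production bignum code):

```go
package main

import "fmt"

// mul multiplies two base-10 digit slices, least-significant digit first.
func mul(a, b []int) []int {
	res := make([]int, len(a)+len(b))
	for i, da := range a {
		carry := 0
		for j, db := range b {
			cur := res[i+j] + da*db + carry
			res[i+j] = cur % 10
			carry = cur / 10
		}
		res[i+len(b)] += carry
	}
	// Trim leading zeros (which sit at the end of the slice).
	for len(res) > 1 && res[len(res)-1] == 0 {
		res = res[:len(res)-1]
	}
	return res
}

func main() {
	// 123 * 45 = 5535; digits are least-significant first.
	fmt.Println(mul([]int{3, 2, 1}, []int{5, 4})) // [5 3 5 5]
}
```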
It seems unnecessary to learn any algorithm including the multiplication algorithm in assembly language.
It depends on the use case. I believe LCO batteries are still more widely used since they are popular in small electronic devices rather than electric vehicles. These do not achieve the thousands of cycles you mention.
In general I don't like articles that simply talk about specs of Li-ion batteries as a whole without specifying the exact chemistry. Apple has different incentives when they choose a battery compared to, say, Ford.
I'd argue that the Go example is worse than the equivalent in C, because Go has a garbage collector and C does not. In C, precisely because of the lack of a garbage collector, people need to explicitly and carefully document who is responsible for deallocating the memory. This makes it clear who owns that memory, and by extension leads to an intuition about whether appending to the original slice is acceptable. If a C function returns such a slice and documents that the caller should free it, you can call realloc and add to it; if the C function documents that the caller should not free it, you naturally wouldn't even try to append to it: you would copy it first.
In Go, the garbage collector frees people from thinking about freeing memory. But it doesn't free people from thinking about owned vs borrowed memory. And that's where bugs can happen.
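A small Go sketch of that pitfall (the variable names are mine, purely for illustration):

```go
package main

import "fmt"

func main() {
	owner := make([]int, 3, 4) // len 3, cap 4: one slot of spare capacity
	copy(owner, []int{1, 2, 3})

	borrowed := owner[:2]           // a view into the same backing array
	borrowed = append(borrowed, 99) // fits in the capacity, so it writes into owner's array
	fmt.Println(owner, borrowed)    // [1 2 99] [1 2 99]: the owner got clobbered

	// The copy-first discipline: clone before appending to a borrowed slice.
	mine := append(append([]int(nil), owner[:2]...), 100)
	fmt.Println(mine, owner) // [1 2 100] [1 2 99]
}
```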
I happen to like the way Go did it, but now that generics and iterators have landed, it should be possible to create an always-make-a-copy vector type (although it'll need to use `get` and `set` like Java Lists, not [42] like slices/arrays).
It's not like you could have appended twice to the same List in Java and expected to get two disjoint arrays, unless you copy first, and `slices.Clone()` makes that easy in Go now too.
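For example (assuming Go 1.21+ for the slices package):

```go
package main

import (
	"fmt"
	"slices"
)

func main() {
	base := []int{1, 2, 3}

	// Clone first, and the two appends stay disjoint.
	a := append(slices.Clone(base), 4)
	b := append(slices.Clone(base), 5)

	fmt.Println(a, b, base) // [1 2 3 4] [1 2 3 5] [1 2 3]
}
```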
If you created the slice, you control it and you can modify its elements, append to it, etc.
If you didn't create the slice, you don't control it. If you want to modify it or append to it, you should copy it first.
This reflects how I've seen Go used professionally by people experienced in the language. The language could offer more help (but probably never will), but it's hardly a regression from C. The real regression from C is the lack of const pointers.
It's not worse. It's just only slightly better. The most damning thing you can say about Go is not that it doesn't improve upon C, it's that it improves only on C (and only in the ways its authors cared about). The authors of Go really didn't examine other languages very closely. So starting with the authors' goals (light on the page, bounds checked, concurrency with message passing, easy to pick up, fast to compile) and a fairly deep knowledge of C (but little else), you pretty much get Go (especially early Go).
Even that isn't really the case, as Go, as it started, was basically Inferno's Limbo in new clothing, with some influence from Oberon-2's method syntax and the SYSTEM package as unsafe.
Unfortunately, they afterwards decided to follow Wirth's quest for Oberon-07-style minimalism instead of Active Oberon.
I've always felt that the behavior of Go's ownership, in regard to minimalism, is a reflection of Go being considered a solution for their internal problems, something outside of what users are requesting or needing.
People have to look at Go-alternative languages like Goplus, V (Vlang), Borogo, etc., for wanted features or greater responsiveness to user demands.
Go enums are the same as in every single other language. After all, all an enum does is count. There isn't anything more you can do with it.
You can introduce data structures and types that utilize enums, with some languages taking that idea further than others, but that's well beyond enums themselves.
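In Go terms, that "an enum just counts" view looks like this (a minimal sketch; the type name is made up):

```go
package main

import "fmt"

// The enumeration is the counting done by iota; the named type Weekday is a
// separate layer the type system adds on top of it.
type Weekday int

const (
	Sunday Weekday = iota // 0
	Monday                // 1
	Tuesday               // 2
)

func main() {
	fmt.Println(Sunday, Monday, Tuesday) // 0 1 2
}
```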
But in defense of Java, modern Java is actually pretty pleasant.
Virtual threads, records and sealed classes, pattern matching, state-of-the-art garbage collectors, a great standard library etc. (and obviously well-behaved enums).
Not to mention the other languages you get for free with the JVM ecosystem.
It might not be as expressive as Rust, but certainly Java/JVM > Go.
No, I don't think HTTP or JSON are cherry-picked. Depending on what kind of code you write, maybe you haven't needed them as often as I have.
Of course, plenty of high-quality third-party libraries exist in the other languages. I've used commons-httpclient and jackson in Java, and everybody and their brother knows about requests in Python (though I prefer aiohttp nowadays). But they are odd omissions from the standard libraries of those languages. At least .NET caught up.
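To make the stdlib point concrete, a self-contained sketch using only Go's standard library (the payload and field names are made up for illustration):

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/http/httptest"
)

type user struct {
	Name string `json:"name"`
	Age  int    `json:"age"`
}

func main() {
	// A throwaway HTTP server, an HTTP client, and JSON encoding/decoding,
	// all without any third-party packages.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		json.NewEncoder(w).Encode(user{Name: "Ada", Age: 36})
	}))
	defer srv.Close()

	resp, err := http.Get(srv.URL)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var u user
	if err := json.NewDecoder(resp.Body).Decode(&u); err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", u) // {Name:Ada Age:36}
}
```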
Like what? CrabLang? Its enums are identical to Go's, unsurprisingly. Above the enum rests a discriminated union that ends up hiding the details of the enum. That is where things begin to differ greatly. You have to do some real trickery to actually get at the underlying enum result in that language. But when you do, you see that it is exactly the same.
> and when you type a parameter as a `foo` you're not at risk of getting 69 or 4328.
That's thanks to the type system, though, not enums. Enums are not a type. Enums are a number generator. Hence the name.
> That's thanks to the type system, though, not enums. Enums are not a type. Enums are a number generator. Hence the name.
What's happened here is that you've mistaken "Things I believe" for "What everybody else believes", but you're talking to other people, not yourself, so this makes you seem like a blithering idiot.
The particular version of this trap you've fallen into is a variation of the "It's all just the machine integers" mental model from C. It's just a model and the problem arises when you mistake that model for reality.
Now, technically this model isn't even correct for C's abstract machine, but it's close enough for many programmers and it tends to match how they think the hardware works, which is even more confusing for the hardware people who know how it actually works, but that's another conversation.
This model is completely useless for languages which don't have the same type system, and so it's no surprise that it immediately led you astray.
No, I am clearly talking to a computer program. It is possible that the program is forwarding the discussion on to people. Perhaps that is what you are trying to allude to? The details of how the software works behind the scenes are beyond my concern. There is no intention to talk to other people, even if the software has somehow created that situation incidentally. If I wanted to talk to people, I would go out and talk to people, not type away at my computer into a box given to me by the software.
> The particular version of this trap you've fallen into is a variation of the "It's all just the machine integers" mental model from C.
As much as I enjoy your pet definition that you've arbitrarily made up on the spot here, the particular trap I have fallen into is the dictionary. It literally states what an enumeration is according to the prevailing usage. It does not describe it as a type; it describes it as the action of mentioning a number of things one by one. Which is exactly what an enum does.
The previous comment is talking about type constraints. You can definitely constrain a type such that it is invalid to use it outside of the numbers generated by the enum, just as you can constrain a type to only accept certain strings, e.g. with a TypeScript template literal type: type Email = `${string}@${string}`. This idea is not limited to enums.
That's definitely a thing, and definitely a thing that could be used in conjunction with an enum, but it is not an enum itself. Not as "enum" is normally used. Of course you can arbitrarily define it however you please. But if you are accepting of each holding their own pet definition, your comment doesn't work. You can't have it both ways.
Quite right. The entire possible space for enums to explore was exhausted in the age of 1960s assembler. There is only so much you can do with a number generator.
Which, I guess, is why we have this desperate attempt to redefine what an enum is, having become bored with a term that has no innovation potential. But, unfortunately, we have not found shared consensus on what "enum" should become. Some think it should be a constraint, others think it should be a discriminated union, others a type declaration, and so on.
All of which already have names, which makes the whole thing particularly bizarre. You'd think if someone really feels the need to make their mark on the world by coining "enum" under new usage in the popular lexicon they would at least pick a concept that isn't already otherwise named.
That is probably true, but has little to do with enums, which are concerned with mentioning things one-by-one (i.e. producing values).
It is true that some type system features are built upon enums. As a previous commenter mentioned, Pascal offers a type that constrains the allowable values of that type to be within the values generated by the enumerator. Likewise, I mentioned in another discussion that in CrabLang the enumerator value is used as the discriminant in its discriminated union types, which achieves a similar effect. I expect that confuses some people into thinking types and enums are the same thing, which may be what you are trying to get at, although it doesn't really apply here. The difference is known to those reading this discussion.
The biggest problem with this desperate attempt to find new meaning for "enum" is: What are we going to call what has traditionally been known as an enum? It does not seem to have another word to describe it.
That's exactly my thought. I don't even need a group chat and instead copy-paste a message to several individual chats, customizing the invitation message along the way.
That works in theory. Again, I'm not an ATC, but it might be better to have reasonable-length workdays (maybe two hours is a reasonable shift, though) and more time off between workdays. Similar to shift work in hospitals, where more handoffs mean more opportunities to fail to hand off successfully. But I suspect it may be easier to hand off an airplane than to hand off an ICU patient.
I know nothing about this field, but if the main claim is personnel shortages, it seems like reducing your workforce's hours by 75% would not be optimal.
If the personnel shortage is because most people can't handle 8-hour shifts, then it's plausible that you'd have more than 4 times as many people who can handle 2-hour shifts.
If 1 in 100k can handle 8 hours and 1 in 10k can handle 2 hours, the candidate pool is ten times larger while you only need four times as many people to cover the same hours, so employing 4 times as many people for 2-hour shifts works out.