I guess the point is that while many people say what you're saying here, many of those same people also say "No! Using lambdas and reduces is great, but writing `+/!10` is impossibly terse".
Essentially that's just the Blub Paradox in action.
Sadly, at the moment I'm on this side of the Blub Paradox in a weird way. I do acknowledge the power of specialist syntax and short notation (I studied Maths, which is full of this), but I haven't dedicated enough time to it for it to feel fluent.
`range(100).reduce(plus, 0)` isn't cryptic; what's cryptic is single-character symbols. Looking at `+/!10`, unless you know K specifically, it's very difficult to figure out that it's a reduce. Whereas `sum(range(100))` is pretty self-explanatory, even to programmers who don't know Python.
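For readers who don't know K, here's a rough Python rendering of the comparison (my reading of K's `+/!10`: `!10` enumerates 0..9 and `+/` folds addition over the result, i.e. a sum-reduce):

```python
from functools import reduce
from operator import add

# The explicit reduce form -- what K's `+/` adverb is doing,
# spelled out with named functions instead of symbols:
total = reduce(add, range(100), 0)

# ...and the self-explanatory version:
assert total == sum(range(100))
print(total)  # 4950
```

Both lines compute the same fold; the argument here is purely about how much the spelling reveals to someone encountering it cold.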
There are no good (or bad) choices for mapping abstract concepts like ranges, mapping, reducing, or combining onto common (ASCII) symbols.
Every choice is inherently arbitrary and carries no meaning whatsoever (besides maybe ergonomics). One could just as well argue that "/" is the ISO 80000-2:2009 item 2-9.6 recommended symbol for division. The same source shows why symbols alone are inadequate as an explicit representation of abstract concepts: according to the standard, the Nabla symbol alone represents 5 different operations depending on context.
Really? That’s quite surprising; I would expect all of them to have some sort of “programming languages” course or similar… what classes do they have then?
I’m a math student minoring in CS and I covered all of that before I’d finished second year. I also covered functional programming in first year.
I’m really surprised your CS degree didn’t cover operating systems, compilers, or algorithms. CS students at my school are required to take all of that stuff (and a lot more, including everything you’ve mentioned) before they even finish 3rd year. In 4th year they’ll be studying things like networking, real-time programming, computer graphics (physically-based rendering), machine learning, and computational math (for simulations, numerical solvers, linear and nonlinear optimization, etc).
Ah yes, operating systems - file that under "I forget what else".
Like I said, I feel that I overpaid for that education. The degree has paid for itself many times over, but I needed to learn basically everything important for a CS career on my own.
"Write an interpreter for a functional-like language of your choosing with XYZ requirements". In Haskell. As a 1st year uni project. I was just getting over Fortran. And prove it! "Can I do it in Turing please?" "No. Haskell." I still curse that exercise to this day, but it enlightened thinking.
A CS or Maths course should certainly include some functional programming element.
For someone reading this who's thinking of studying CS at university, whether there's a requirement or option for such a course would be a good question to ask (you're interviewing them as much as they're interviewing you).