In case people were wondering, the Haskell standard-library answer to his interview question is at https://hackage.haskell.org/package/transformers-0.5.2.0/doc... (ContT is a generalized form that lets you run, for example, computations that involve IO). Though explaining it is still the fun part.
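For the curious, here's a tiny sketch of what "Cont over IO" can look like (greet and main are made-up names for illustration; ContT, runContT and lift are the actual transformers API):

import Control.Monad.Trans.Cont (ContT, runContT)
import Control.Monad.Trans.Class (lift)

-- A computation suspended over IO: it can do IO before handing
-- its result to whatever continuation it is eventually given.
greet :: ContT r IO String
greet = do
  lift (putStrLn "computing a name...")
  return "world"

main :: IO ()
main = runContT greet (\name -> putStrLn ("hello, " ++ name))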
I know it's easy (and fun) to hate on Microsoft, but damn do they employ some great engineers/researchers: Erik Meijer, Anders Hejlsberg, Simon Peyton Jones, etc.
Hejlsberg did amazing work on Turbo Pascal/Delphi, which were basically his personal inventions. He then crossed over and got himself lost in the C#/.NET universe, technology some of us wouldn't touch with the proverbial ten-foot pole, or one of any length, for that matter.
Considering that his (and my) compatriots Stroustrup and Lerdorf did C++ and PHP respectively, a national apology might sort of be in order.
Yeah, when Leslie Lamport was mentioned a few days ago, I thought to myself that MS has many, many more employees I know from reading their work or watching them in videos than Apple, Google, IBM, etc. do.
It's Microsoft Research, specifically, that pulls in most of this great talent. Let's not forget the VMS people, led by Cutler, who made the NT kernel, the Xbox hypervisor, and some other stuff. Add Butler Lampson if we're talking CompSci people. The teams behind Dafny, VerveOS, Ironclad, VCC, etc. are way ahead of most language-based safety or formal verification work in terms of cost/benefit analysis.
Of the three the parent listed, I think two (Meijer and Hejlsberg) were both in DevDiv. Cutler was in the OS division before moving to Xbox of all things, and I think he's at Azure now? It's certainly not only MSR hiring these guys.
I should've broken up the comment a bit, since it looks like I meant all of that was MS Research, as you said. I meant to say something along the lines of (1) MS Research is doing a lot of great stuff, with some examples, and (2) Butler Lampson should've been on the parent's list of top talent at Microsoft. Appreciate you telling me about the mistake, though. :)
Cont stands for Continuation. A continuation basically represents a 'suspended' computation with an intermediate result of type 'a' and a final result of type 'r':
type Cont r a = (a -> r) -> r
Cont is a type constructor that takes two type arguments, r and a. This means that Cont r a can always be substituted by (a -> r) -> r. For example, Cont String Int is equivalent to (Int -> String) -> String
(a -> r) is the type of a function from a to r. For example, Int -> Bool is the type of a function from Int to Bool. (a -> r) -> r is the type of a function that takes a function of type (a -> r) as its argument and returns an r. So Cont String Int takes a function from Int to String as its argument and finally returns a String.
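To make that concrete, here's a small sketch using the synonym above (suspended and result are made-up names for illustration):

type Cont r a = (a -> r) -> r

-- A 'suspended' computation holding the intermediate result 42.
-- It doesn't return 42 directly; it feeds it to whatever
-- (Int -> String) continuation you supply.
suspended :: Cont String Int
suspended k = k 42

result :: String
result = suspended show   -- "42"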
It's not really to do with partial evaluation, or with the unrelated idea of a partial function.
A key motivating (non-Haskell) example behind continuations is the idea of replacing "return" with a function call. This is continuation passing style, and obviously when you call a subroutine in CPS, you need to give it a function to call when it completes: a "continuation" which is contrived to be equivalent to what would happen when that particular subroutine "returned" in normal direct style.
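Here's a tiny sketch of direct style versus CPS (the function names are made up for illustration):

-- Direct style: the result comes back via return.
add :: Int -> Int -> Int
add x y = x + y

-- CPS: instead of returning, pass the result to the continuation k.
addCPS :: Int -> Int -> (Int -> r) -> r
addCPS x y k = k (x + y)

main :: IO ()
main = addCPS 1 2 print   -- same as: print (add 1 2)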
> I'm reading it as 'Container' passed 'r' and 'a' performs 'a' to 'r' which returns 'r'. (I'm guessing on verbs here).
This is an equation; it relates/defines one thing in terms of other things. In particular, there's no concept of "performing" (e.g. there's no time, state, etc. here).
In this case "Cont" is just a name; we can tell that, since the equation tells us that "Cont r a" is equal to some other thing involving "r" and "a", so we could just as well use that other thing (that's what it means for two things to be equal!). Hence, "Cont r a" is just a shorthand for "(a -> r) -> r".
So what is "(a -> r) -> r"? In general, for any types "x" and "y", the type "x -> y" is the type of functions which take an "x" as input and return a "y".
So "a -> r" is a function which takes an "a" and returns an "r".
So "(a -> r) -> r" is a function which takes a function from "a" to "r", and returns an "r".
Cont r a is a computation that knows how to produce an "a" but, instead of returning it directly, passes it to a function that takes the "a" and produces an "r".
Compared to typical sequential execution, this gives more power to the "current" phase of the computation, because it might choose to invoke the passed a -> r function more than once, or not at all, inspect the resulting "r" and change course based on the result, etc.
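A small sketch of that extra power, using the simple synonym from above (twice and demo are made-up names):

type Cont r a = (a -> r) -> r

-- This computation invokes its continuation twice and combines the
-- results, something a function that just "returns" once can't do.
twice :: Cont String Int
twice k = k 1 ++ " and " ++ k 2

demo :: String
demo = twice show   -- "1 and 2"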
How do I “prove” a type forms a monad? I only have to implement `return` and `bind` with the correct behavior, and then I have a monadic interface. Did I prove then that the type forms a monad?
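For the Cont synonym above, the two operations look roughly like this (a sketch with made-up names; note that having the implementations isn't the whole story, since they also have to satisfy the monad laws):

type Cont r a = (a -> r) -> r

-- return/unit: wrap a value by handing it straight to the continuation.
unitCont :: a -> Cont r a
unitCont x = \k -> k x

-- bind: run the first computation, feed its result to f, then continue with k.
bindCont :: Cont r a -> (a -> Cont r b) -> Cont r b
bindCont m f = \k -> m (\a -> f a k)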
In particular, I suggest his Haskell lectures to anyone interested in the theoretical aspects of functional programming. https://channel9.msdn.com/Series/C9-Lectures-Erik-Meijer-Fun...