Even knowing a lot about FP, I still found this worth skimming for the esoteric Python syntax. When it comes to constructing dictionaries with list comprehensions, I would always do something like this:
dict([n, n ** 2] for n in range(5))
But they pointed out an actual "dict comprehension" that I didn't even realize existed:
{ n: n ** 2 for n in range(5) }
And there is a similar "set comprehension":
{ n ** 2 for n in range(5) }
Always amazes me how you can use Python for so many years and still encounter new features in the language.
It's not exactly the same, which is what I thought was interesting. My first example did use a generator expression inside the dict() constructor, but in that case you need to specify the key and value in a tuple or a list.
With the dictionary comprehension you can just separate the key and value with a colon, which is more natural. It might just be sugar on top of a generator expression but it is definitely a special case, syntactically speaking.
Although Raymond Hettinger also called them generator "comprehensions" in the early proposals, the current documentation calls them "generator expressions". Good examples, by the way.
Here's how you make those set/dict comprehensions in Python 2.6, before the native syntax was available:
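(The original snippet isn't reproduced in this comment, but the usual trick is to feed a generator expression to the dict() and set() constructors, roughly like this:)
dict((n, n ** 2) for n in range(5))   # dict "comprehension" in 2.6
set(n ** 2 for n in range(5))         # set "comprehension" in 2.6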
This little compose function is really, really cool because it allows you to build up more interesting functions by piecing together a bunch of small, useful ones.
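(The book's compose isn't reproduced here; a minimal sketch that matches the examples below, applying its arguments left to right, could look like this:)
from functools import reduce

def compose(*funcs):
    # apply the functions left to right: compose(f, g)(x) == g(f(x))
    return lambda x: reduce(lambda acc, f: f(acc), funcs, x)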
def upper(s):
return s.upper()
def exclaim(s):
return s + '!'
# instead of this
really_angry = lambda s: exclaim(exclaim(upper(s)))
really_angry('napster bad') # NAPSTER BAD!!
# we can do this
really_angry = compose(upper, exclaim, exclaim)
really_angry('fire good') # FIRE GOOD!!
# and with partial application it gets even more useful
import operator as op
from functools import partial as p
max(map(compose(p(op.add, 1), p(op.mul, 3)), (1, 2, 3, 4)))  # 15 with the left-to-right compose above
`compose` is a neat function and worth exploring. This is a cool book and I always hope Python gets more light shone on its FP-friendly features.
For compose to really shine, you need to be able to curry/partially apply functions. This part of things is made much more difficult than necessary because of Python's unnecessarily neutered lambda syntax (in fact, I don't think one can claim that Python is FP friendly until this decision is corrected).
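A tiny illustration of the usual workarounds (the names are just for the example; functools.partial is the stock answer, and lambdas are limited to a single expression):
from functools import partial

def add(a, b):
    return a + b

add_one = partial(add, 1)                 # partial application
add_one(41)                               # 42

curried_add = lambda a: lambda b: a + b   # hand-rolled currying
curried_add(1)(41)                        # 42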
It's also worth noting that reduce() was removed as a builtin in Python 3; it now lives in functools.
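So in Python 3 code it looks like this:
from functools import reduce
reduce(lambda acc, x: acc + x, [1, 2, 3, 4])  # 10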
I started with something like that when I was missing function composition in Python. Eventually I ended up with a library [1] that includes a bunch of other stuff for getting rid of some of the duct-tape code you usually need when you just want to compose some functions.
You can write some pretty neat looking concise code with this, but also may regret it later when it comes to debugging, especially when lazy evaluation is involved (which is usually the case). The stacktraces tend to be not so helpful...
Funny how things turned out. I still remember this post[1]; it was profoundly disappointing to see Guido's way of thinking. Much of the damage was reversed, but it still left an indelible impression that there's a lack of vision for what's going to be important if the language is to stay relevant in the future.
"Python Bridge, officially known as High Bridge, is a bridge that spans the canal between Sporenburg and Borneo Island in Eastern Docklands, Amsterdam. It was built in 2001 and won the International Footbridge Award in 2002. The bright red bridge spans 90 meters and was designed by Adriaan Geuze of the architectural firm West 8"
Coincidentally, Amsterdam can be considered the birthplace of Python, where Guido used to work at the Center for Mathematics and Computer Science (CWI).
And now for some obligatory functional python.
Run with
python lambda.py 2>&1 | head -c 200
to avoid filling your screen with maximum-recursion-depth errors.
Notice any pattern in the output?
It's becoming more common to see "Functional Programming in X". Why don't we use functional languages like OCaml or Haskell more often? Are we making the jump in two steps instead of one? I've never written more than a few lines of either, so I can't tell if something "better" is waiting for me in functional land.
Why are there so very few programs written in, say, Haskell, that you actually want to use?
This is a serious question that I haven't found the answer to. Around here people suggest pandoc, shellcheck, and sometimes the xmonad window manager as pretty much the full list of things you can install, use, and hack on that are written in Haskell and whose purpose isn't writing Haskell code.
Given the popularity of Haskell amongst hackers and the various claims about its benefits and strengths, the fact that this list is so small (if it's larger and I'm missing a bunch - please DO let me know, I want to play with them and hack on them!) is something I have difficulty reconciling with Haskell being a useful general-purpose programming language. Maybe those programs are being written now and they're on their way? But Haskell has been around for more than a few years now. Maybe Haskell hackers just mostly hate open source & free software unless it's GHC or a general-purpose library? Seems unlikely. So maybe it's something else I don't yet understand. It is puzzling, since I don't code in the language fluently enough to tell its weaknesses apart from my own weaknesses in coding in it, and it's a point the many lovers of Haskell never seem to address other than with extreme defensiveness, which kind of misses the point of the question.
> Why are there so very few programs written in, say, Haskell, that you actually want to use?
I think a big part of it was struggling with cabal hell. I know that quite a bit of web development and API stuff is happening with Haskell since I've gotten paid to do some for multiple clients (some requesting Haskell).
I think that with the release of stack[0] (and it eventually being merged into the Haskell platform IIRC) many application developers will start to pick up Haskell and create those types of programs.
Not open source, but an application created in Haskell was bump[1].
Probably because it's easier to see the use of functional paradigms in languages you're familiar with. OCaml and Haskell are great languages, but they require a lot of new learning at once.
You are right about it being nice to see functional code in your own language. However, from my learning experience, I understood the functional style much better by using an actual functional language. I started with Scheme, which has a very minimal, easy-to-understand syntax. Many courses ask students to try to forget whatever they know about programming before introducing the functional style. If that helps (for me it did), I think starting with a new language is a good decision.
There are many steps between functional programming in, say, Python, and writing everything in Haskell. Many differences. Not everybody has the same opinion of all those differences.
I'm often missing a feature to make language X act in a purely functional way (i.e., to disable side effects completely in a relevant part of the code).
Also missing is a way to select between strict or lazy evaluation.
Yeah. Python has the functional programming features I expect of any modern language. However, I feel that Python has a lot of unneeded syntax. I always prefer apply() over *, and map() and filter() over list comprehensions.
func(*args)                          # argument unpacking
apply(func, args)                    # apply() (Python 2 only)

[func(a) for a in collection]        # list comprehension
map(func, collection)                # map

[a for a in collection if func(a)]   # filtering comprehension
filter(func, collection)             # filter
I don't see why people use all of this special syntax.
In Python 3, map returns an iterator, so you should really think of
(func(a) for a in collection)
map(func, collection)
as equivalent. If you want a list (and not a generator), you would need to do this:
[func(a) for a in collection]
list(map(func, collection))
For me, the first notation (the comprehensions) has a more mathematical feel to it, i.e. { x^2 | x \in 0...10 }. Just replace the bar with "for" and it's almost the same thing.
I believe the documentation for `filter` even mentions that it is equivalent to the comprehension[1].
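Roughly, these two spellings do the same thing (the names here are just for the example):
nums = [0, 1, 2, 3, 4]
is_odd = lambda n: n % 2 == 1
list(filter(is_odd, nums))        # [1, 3]
[n for n in nums if is_odd(n)]    # [1, 3]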
> It may just be a matter of which a person learned first.
I think this is the case; for me, map is much harder to read. But I also think that comprehensions go back to set-builder notation in math, so I was familiar with this even before learning any programming. Therefore comprehensions clicked immediately for me, and they're by far my favourite Python feature.
I learned both around the same time, but even in good functional languages like Haskell, using a comprehension for anything more than simple problems results in an unreadable mess.
map and filter are much easier to read for complex data manipulations, and as a bonus, their composition rules make it easy to increase performance. For example, if you see two map calls together, you can wrap them in a compose and only map over your elements once. This isn't as immediately obvious when you're using comprehensions.
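A rough sketch of that fusion (the helper names and the left-to-right compose are just for illustration):
from functools import reduce

def compose(*funcs):
    # left-to-right composition: compose(f, g)(x) == g(f(x))
    return lambda x: reduce(lambda acc, f: f(acc), funcs, x)

double = lambda x: x * 2
increment = lambda x: x + 1
xs = [1, 2, 3]

list(map(increment, map(double, xs)))       # two passes: [3, 5, 7]
list(map(compose(double, increment), xs))   # one pass:   [3, 5, 7]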
> [manager.name for manager in set([person.manager for person in employees])]
?
I assume with something like set() or unique() you need to create the intermediate iterable anyway, but without it I have trouble finding an example where doing a single list comprehension wouldn't suffice.
One part that bothers me about Python is that there is no special binding form; you can just do x = 3 to add an x variable to your environment. The second is the not-strictly-lexical scoping.
From another perspective, these things usually bother me when my functions are larger, and ideally functions should be small. So I take it as a sign that I should probably break down my function.
And yeah, tail calls! I expect that from a modern language with functional programming features. Unfortunately, it seems there are no plans to add them to Python.
[(foo(a), bar(a)) for a in collection if condition(a) or alt(a)]
foo_bar = lambda x: (foo(x), bar(x))
condition_or_alt = lambda x: condition(x) or alt(x)
map(foo_bar, filter(condition_or_alt, collection))
As logic gets more complicated, wouldn't list comprehensions become easier to read straight through?
> I don't see why people use all of this special syntax.
Comprehensions are one fairly easy way of thinking about sets of things, and transformations of those sets of things. It may not be your preferred way to think about them, but that doesn't mean it's unneeded or that people are wrong to prefer another way.
If you don't have the function handy and don't want to go through the effort of making a lambda, list (set, dictionary, generator) comprehensions can be more convenient (and you can comprehend over nested lists too).
Just in case: a few years back I saw an article on functional programming in Python. It was mostly arithmetic, but the patterns were very pretty (think Euclid's algorithm, generalized). I never managed to find it again. If that rings a bell for someone, I'll be forever virtually indebted.
Interesting, thanks for the link. Do you know if it has any multicore (i.e. parallel) support? I looked but all I could see was support for concurrency.
For anyone wanting a quick intro to fp in Python I stumbled into this awesome presentation (50 slides) about Functional Programming in Python. I especially like his short but clear examples: http://kachayev.github.io/talks/uapycon2012/#/ . Good intro before getting deeper into the linked book.
Mostly OT, but I like the short-form books O'Reilly and Packt are producing. I never liked the trend in technical books where every single one had to have six chapters of language tutorial, etc.
I am curious why lists do not have .map and .filter methods. IMHO it would be so much better for chaining; right now, using a few maps and filters is inconvenient and looks unreadable.
Plus, something shorter for "lambda"...
It's one of the not-too-many aspects where I prefer JavaScript (especially ES6) to Python.
That would require any new container classes to implement each of those methods again (and will probably still lack useful ones, like groupby) whereas currently they can just implement iteration and get the rest for free.
You could have both approaches, but that would go against one of Python's core principles ("There should be one-- and preferably only one --obvious way to do it.").
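To make the first point concrete, here's a sketch of what the iteration protocol buys you (the class and names are just for illustration): anything that implements __iter__ immediately works with map, filter, and comprehensions.
class Bag:
    # a minimal container that only implements iteration
    def __init__(self, *items):
        self.items = items
    def __iter__(self):
        return iter(self.items)

bag = Bag(1, 2, 3, 4)
list(map(lambda x: x * x, bag))    # [1, 4, 9, 16]
[x for x in bag if x % 2 == 0]     # [2, 4]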