I've never seen it before, but it appears to be inflation notation. It's a subscript on prices, so the year is the original year of the price, and the cost above it is the inflation-adjusted equivalent in present-day dollars.
I've actually been thinking of moving to Atlanta recently. I live in Portland OR right now, so the sprawl of Atlanta scares me, but I've heard there are good walkable pockets of it.
I walk my dog 2 hours a day; are there places where the sidewalk density is high enough and the busy road density low enough that I could do that and have some route optionality so I'm not just doing the same walk all the time?
The Beltline plus Piedmont Park is a perfect fit. The Beltline is linear, but long (it circles the city), and all of the connected neighborhoods are extremely walkable.
In my experience -- I was a professional Clojure developer for 6 years (until 2018), and hung out in the #clojure IRC channel on freenode a lot between 2011 and 2014 -- the Clojure community tends to be very experienced. Clojure is not many people's first language, but tends to be a language you find your way to after dissatisfaction with other languages, or exposure to other lisps.
I'm sure you can find crappy or charlatan Clojure devs, but I was generally impressed with the Clojure devs I met in #clojure or at the cons. Overall I found the community to have high levels of both knowledge/experience and patience in explaining things thoroughly and precisely to people who were asking questions.
Yes. I was at Puppet (nee Puppet Labs) for four years, one of the larger Clojure shops to my knowledge (not that I've been keeping close tabs on who's picked it up recently).
I switched jobs, and language choice was not my #1 criterion. The new role was a very interesting opportunity that aligned well with my values (and paid better to boot), but it was in a completely different tech stack (Python DS ecosystem, with some Scala). I love the language: it's a wonderfully concise yet expressive way to think about code, it runs on a great platform for the web, and it's a great fit for when team size exceeds codebase size (though not so much the other way around). All my fun software projects are still in Clojure or Clojurescript; I just wrote some today to scrape doggy listings.
This is just one of the reasons why Limited (building a 40-card deck from only packs that you and your group open) is clearly the best way to play Magic. Another: even though it seems more variance-prone, it's arguably the most skill-testing format in the game.
Though one of the cool elements of the game is that you can also play with a random stranger you just met. That happened a lot back when I was playing heavily in the 1990s. But that kind of game is less likely to be balanced, and it got increasingly less balanced as more cards came out and the power balance and deck ideas kept shifting around.
Somewhat related, Leslie Valiant believes that if evolution performs a random search in genome space, it is computationally impossible for it to have led to the complexity that it's produced in only a few billion years.
Curiously, the language's user-facing features for runtime checks--clojure.spec and pre & post conditions--are both very easy to disable completely for production. I don't know why they're not used in core: probably a combination of the core team feeling that they don't need them, and not wanting to make the language slower than necessary by default.
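To make that concrete, here's a rough sketch of why spec checks are so cheap to turn off (the function and spec here are made up for illustration): function specs only fire when you explicitly instrument the vars, which you'd normally do only in dev or test.

```clojure
(require '[clojure.spec.alpha :as s]
         '[clojure.spec.test.alpha :as stest])

;; Hypothetical function and spec, just for illustration.
(defn full-name [first-name last-name]
  (str first-name " " last-name))

(s/fdef full-name
  :args (s/cat :first-name string? :last-name string?)
  :ret  string?)

;; Nothing is checked until you opt in:
(stest/instrument `full-name)
(full-name :oops "Smith")   ; now throws a spec error on the bad :args

;; ...and turning the checks back off is a one-liner:
(stest/unstrument `full-name)
```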
They're now in the process of speccing all of core. Hopefully the next release will have it.
I do think slow-by-default is part of it. I've seen a lot of builds deployed to prod without prod optimizations in my experience. That's why I think spec is not instrumented by default: they're too worried people will keep it on at all times, and then Clojure would get a bad rep for being slow.
In this respect, they share the characteristic of being easy to wholly disable (i.e. prevent them from being compiled at all). With pre & post conditions, if the dynamic var `clojure.core/*assert*` is bound to false at compile time, they're elided. The usual pattern is to have them on for testing and to disable them when compiling for production.
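For example (a minimal sketch; the namespace is made up), pre/post conditions expand into assert forms at macroexpansion time, so if `*assert*` is false when the code is compiled they disappear entirely:

```clojure
(defn safe-div [num den]
  ;; :pre/:post compile down to (assert ...) forms.
  {:pre  [(not (zero? den))]
   :post [(number? %)]}
  (/ num den))

(safe-div 1 0)   ; with *assert* true (the default) => AssertionError

;; For a production build, bind *assert* to false before compiling,
;; e.g. around AOT compilation of a (made-up) namespace:
(binding [*assert* false]
  (compile 'my.app.core))
;; The compiled safe-div now contains no checks at all.
```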
It's a great example of worse is better in action: a technically inferior platform winning out because it's better at one or two things that enable virality, which is the only thing that matters when all the money is looking for high growth.
In this case, it's that webapps require zero effort and time from the user to get started with, and they let developers get closer to the "write once, run anywhere" dream than anything else (if you're doing a decent responsive design, you can even get a good experience on both desktops and phones with much less effort and no gatekeeping), so the development effort is a lot lower.
These two attributes make it really hard for a native app to compete on growth terms with a webapp, since it has a higher hurdle for users, higher initial development costs to target the same number of users, and higher iteration costs to ship (and get users to install) a new version. It doesn't matter that the webapp is hilariously inefficient; as long as it's just below the threshold where the user tears their hair out, they're not going to jump ship.
Also, the web is open. I don't need Tim Cook's permission to run something. And despite the lack of a walled garden, I'm much less likely to get something bad from the web than from an app, because the web has a much better sandbox than apps ever will.
Because your data is being held hostage by the service provider. No more grandmas showing photo albums to their grandchildren when Facebook is long gone 30 years from now :'(
But with Dropbox and/or alternatives you can always access the raw files via a file browser. With Facebook, OTOH, there was a story some time ago where you could only download photos at downscaled resolutions, or where downloading native-resolution files was hidden in obscure options menus.