Steve Yegge Grades His Own Predictions (sites.google.com)
315 points by krupan on Sept 3, 2015 | 164 comments



I think he's being overly harsh on himself. As he said in the original rant: "the predictions themselves don't matter! Most of them are probably wrong. The point of the exercise is the exercise itself, not in what results."

Imagine someone in 1850 making the prediction that "by 1950, everyone will have a horse". Though they say "horse", they really mean "everyone will be able to afford their own personal transport", as a reflection on the changing socioeconomic something something. And they would be right. Sure, the horse is mechanical, isn't shaped like a horse, and consumes petrol, but the prediction is correct, even though they didn't know exactly how it would play out.

The same is true of Yegge's XML DB prediction. If he was saying that XML is going to be amazing, then he's wrong. If he's saying that RDBMSes have problems, then he's right. And actually he's saying "RDBMSes suck, so something will replace them, and XML is big enough that it will be XML". So that looks like a 50% score to me.

With Apple, he's basically saying "Apple is shiny, and shiny will win". That prediction has been 100% correct. The fact that iPads disrupted the laptop model means he's more correct than if they had just sold more MacBooks. 100% score here, IMO.


Hindsight is 20/20. You don't get to reinterpret statements made earlier just because he got the general gist.

Like he said, his specificity screwed him (laptops, XML, etc.). He's totally not being too hard on himself; he's being fair and honest in his evaluation.

It would be super easy to predict trends if you're nonspecific: I predict new and shiny will win in 10 years. Bleh, that's a non-statement.


How specific do you need to be? Do you need to provide implementation details? I give full marks for the XML Database prediction because he began with this statement: "Nobody likes to do O/R mapping; everyone just wants a solution." And then: "XML is the most popular vehicle for representing loosely-structured, hierarchical data."

I could quite happily interpret that as "schemaless databases (using the most popular vehicle for representing loosely-structured data) will surpass relational databases in popularity by 2011". I'll take marks off for 2011 being too early, but not for the datatype choice.

It's unfortunate that he went into more detail by describing XPath and current support for XML operations, but I feel that's just justification for the specificity of the datatype choice in the title.

I totally agree with you regarding laptops and apple though. "New and shiny" is hardly a prediction, and the specificity is relevant to the prediction.


> I'll take marks off for 2011 being too early, but not for the datatype choice.

I would. If he had said something like "XML is most popular... but really meant for documents not data, so something similar will probably replace XML" then I would have given full marks, even if he didn't specify JSON or BSON explicitly.

(I would give ~50% though, not 20%)


Agree. It's an interesting post, and I don't mean any criticism toward him (the thought process was good), but in practice I'm not sure he truly got any of them right.

"a new internet community hangout will appear" is not really a prediction. People have been "hanging out" online since BBS's and AOL. Would anyone bet against the same prediction looking from today forward?

The only other one where he gave himself full marks is "someone will make a lot of money by hosting open-source web applications." It's true of course, but PaaS is not specific to open source applications. If you take away the open source part, what he got right was that someone would make money by hosting webapps. And that's been happening in some form since at least the early 2000's when AWS launched.

That probably sounds more critical than I meant it. My hat's off to him for hitting a lot of them pretty well. But I wouldn't say he is going easy on himself with the scoring.


AWS didn't come out until 2006, and it was considered pretty revolutionary at the time.

The way you hosted a website in 2004 was that you signed up with a web hosting provider and either rented a shared webhost for $10/month or paid for a dedicated server for about $60/month. There were few or no provisions for being able to grow your hosting capacity at the click of a button. When you read stories of how e.g. HotOrNot scaled, it involved a lot of panicked calls to investors, and on-the-phone negotiations with hosting providers where they were like "We need more servers now, we'll make sure we can get you the money within 30 days", often followed by a tech going down to the datacenter and physically installing a new server in the colo cage.

The other revolutionary part of his prediction is "Someone will make a lot of money". ISPs & hosting in 2004 were a very competitive commodity business, filled with overcapacity from the dot-com boom.


> AWS didn't come out until 2006, and it was considered pretty revolutionary at the time.

He did work at Amazon until 2005, so he likely saw the internal development that resulted in what would become the publicly-available AWS. And some of his other predictions strike me as being heavily influenced by what he saw working at Amazon. Companies like Amazon and Google tend to encounter problems a few years before the rest of the world realizes that those are even concerns. So making predictions about the industry at large based on the cutting-edge technologies used inside those companies is actually a pretty reasonable approach.

For example, according to Wikipedia, Dynamo and BigTable were developed in 2004. So while NoSQL would have seemed revolutionary to almost everyone outside of Amazon and Google in 2004, to people working inside those companies that prediction was basically predicting that their ideas would work and become very successful. I'd also bet there were a ton of Google engineers who saw MapReduce before it was made public and could have predicted in 2003 that computationally heavy tasks would be primarily distributed in a few years. And there are probably a bunch of Intel employees who could have predicted in 2004 that the chips Intel would be selling a decade later would run at the same GHz as their current offering, since 2005 was about the time they switched to adding more cores.

When you're ahead of the curve, you can see what's coming around the corner before the rest of us.


> You don't get to reinterpret statements made earlier just because he got the general gist.

Of course someone can. You can do whatever you want, too. Predictions are hard, and any correlation over 11 years is impressive, as these are. The question raised is not "is this fair or accurate according to my standards" but rather, "who cares?"


> It would be super easy to predict trends if you are nonspecific

Otherwise known as the Nostradamus principle.


XML databases actually got to be amazing; if that had been his prediction, he would have been totally right. The fact is that they just got amazing a little too late. No one is going to consider an XML database now because XML got a bad reputation (and rightly so).

But XML database technology actually has nothing to do with XML. The idea is that you store nested data nodes. Imagine you have a MongoDB collection, and in it you have documents, and each of those documents would also be a MongoDB collection in which you could insert more documents. You can build full graphs like that, and if you allow references to nodes those graphs can be cyclic too. Pretty powerful, right?

Now you might say yeah of course, but that's got to be crazy slow! And at the time Yegge made his prediction you would have been right. But then academics came onto the scene and developed data structures and algorithms that let you store these recursively nested documents in column stores (think of a SQL table with two columns, a primary key and a piece of data) in such a clever way that you can run whatever query you can think of (recursive, joins, whatever) almost as fast as any indexed search query.
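To make the shredding idea concrete, here's a toy sketch (Python with SQLite, and nothing like MonetDB/XQuery's actual pre/post-order encoding or its performance tricks): every node of the nested document becomes a row pointing at its parent, and a recursive query walks the tree.

  import sqlite3

  # Toy "shredding" of a nested document into a flat node table.
  # Each node is a row; children reference their parent's rowid.
  doc = {"user": {"name": "ada", "posts": [{"title": "hello"}, {"title": "again"}]}}

  con = sqlite3.connect(":memory:")
  con.execute("CREATE TABLE node (id INTEGER PRIMARY KEY, parent INTEGER, key TEXT, value TEXT)")

  def shred(obj, parent=None):
      items = obj.items() if isinstance(obj, dict) else enumerate(obj)
      for key, val in items:
          leaf = None if isinstance(val, (dict, list)) else str(val)
          cur = con.execute("INSERT INTO node (parent, key, value) VALUES (?, ?, ?)",
                            (parent, str(key), leaf))
          if leaf is None:
              shred(val, cur.lastrowid)

  shred(doc)

  # A recursive query over the shredded tree: everything under "user".
  rows = con.execute("""
      WITH RECURSIVE subtree(id) AS (
          SELECT id FROM node WHERE key = 'user'
          UNION ALL
          SELECT n.id FROM node n JOIN subtree s ON n.parent = s.id)
      SELECT key, value FROM node JOIN subtree USING (id)""").fetchall()
  print(rows)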

That magic happens, and is implemented for example in MonetDB/XQuery, which sadly got abandoned a few years ago.

I still think if a crazy person sets off to build a database company (crazy people like that seem to be quite popular in SV these days) they could make a lot of money forking MonetDB/XQuery and search-replacing every instance of X with J and every instance of XML with JSON. Crazier still, MonetDB/XQuery could already format its results as JSON, I think even before MongoDB became popular.


Isn't that just the Object databases from 15 years ago? I remember using one from Versant that seems to be what you're describing.


Could you go into a bit more detail about the data structures and algorithms used in doing recursive queries on nested data in a column store? Pointers to relevant papers/documentation would be great.


This paper summarises the MonetDB/XQuery work, I'm on my phone and couldn't find a free PDF but I bet there's one somewhere. I read this paper while I was an undergraduate, it's very accessible.

http://dl.acm.org/citation.cfm?id=1142527


OpenLink Virtuoso implements a lot of this functionality (xml storage, column store, graphs, rdf, json), optimized for speed and open source. I know because I was lucky enough to have had conversations about it with the guy responsible for it, Orri Erling http://sites.computer.org/debull/A12mar/p3.pdf


Not everyone has a "horse" today. You're thinking of the developed world! (I remember a WIRED article back in 1999 where a panel discussed which technologies would have the biggest impact in the 21st century, and someone suggested the telephone, only then penetrating third world markets; the same person pointed out that the most impactful technology of the 20th century was probably the bicycle.)


> most impactful technology of the 20th century was probably the bicycle

The smallpox vaccine is a strong contender. Smallpox killed 300m people in the 20th century, more than all wars combined. And we wiped it out.


The smallpox vaccine would be a candidate for most impactful invention of the 18th century (the first smallpox vaccination, named for the fact that it was derived from a cowpox blister -- vacca being Latin for cow -- was administered in 1796).


The form of the vaccine that allowed us to undertake global eradication was much newer:

" In the late 1940s and early 1950s, Leslie Collier, an English microbiologist working at the Lister Institute of Preventive Medicine, developed a method for producing a heat-stable freeze-dried vaccine in powdered form.[50][51] Collier added 0.5% phenol to the vaccine to reduce the number of bacterial contaminants but the key stage was to add 5% peptone to the liquid vaccine before it was dispensed into ampoules. This protected the virus during the freeze drying process. After drying the ampoules were sealed under nitrogen. Like other vaccines, once reconstituted it became ineffective after 1–2 days at ambient temperatures. However, the dried vaccine was 100% effective when reconstituted after 6 months storage at 37 °C (99 °F) allowing it to be transported to, and stored in, remote tropical areas. Collier's method was increasingly used and, with minor modifications, became the standard for vaccine production adopted by the WHO Smallpox Eradication Unit when it initiated its global Smallpox Eradication Campaign in 1967 " -- https://en.wikipedia.org/wiki/Smallpox_vaccine


Yes but we're talking about the century in which the technology had an impact, not the century in which it was invented. The telephone is a 19th century invention which is only reaching the majority of the world's population in the 21st. Likewise the bicycle was a 19th century invention that really hit home in the 20th.


If "horse" is a stand-in for non-human-powered transportation, then many people in the developed world don't have one. Plenty of cities (outside of stereotypical US suburbia) are perfectly livable only owning a bicycle because of their density, decent public transportation for when you have guests, and car sharing.

That speaker was right: don't underestimate the impact of bicycles.


I get your point, but the nuclear bomb by preventing major countries from going to war with each other is a good contender for most impactful technology of the 20th century.


Which countries did the nuclear bomb prevent from going to war? There was never any serious prospect of the US invading the USSR or vice versa, but MAD certainly didn't stop them from involving themselves in more wars (generally supporting opposing sides) in the late twentieth century than in the previous four; India and Pakistan haven't exactly softened their stances on Kashmir since acquiring deadly weapons; and the UK even had part of its territory briefly annexed by a foreign power.


Tell that to the NATO and Warsaw Pact armies and all of the plans, materials, and infrastructure they built around fighting the Next Big European/Asian war. Huge amounts of money and piles of weapon systems were developed by both sides with an eye towards fighting that war.

It's easy to say that in hindsight, but given all of the tension around the Iron Curtain and all of the times the US and USSR came very close indeed to going to war despite how likely total nuclear destruction was, it seems short-sighted to just brush off the idea of that war actually happening if neither side had any nuclear weapons.


Depends. Personally, I'd see the number of times the US and USSR nearly went to war (or actually did, but indirectly) despite having nuclear weapons as a pretty good indication that the lack of any feasible strategy for inflicting a decisive military defeat on the other party was a much more significant factor in the Cold War not turning hot than any nuke-induced desire for peace.

Ultimately, one naval officer's minority vote[1] prevented a Soviet sub from launching a tactical nuclear weapon at US shipping. Several thousand miles of ocean prevented the Soviet Union from feasibly achieving decisive military victory against the United States. [1]https://en.wikipedia.org/wiki/Vasili_Arkhipov


It's true that an invasion of the US mainland was never in the cards, given the situation during the cold war. Pretty much nobody was seriously worried about that. What NATO was very worried about was a Soviet invasion of Western Europe, which is a very reasonable worry considering the number and detail of Soviet plans for doing just that.

For their part, the Soviets were very worried about a combined NATO invasion, which seems quite reasonable considering that Germany alone, while also fighting/occupying most of the rest of Europe, managed to get to the suburbs of Moscow, and kill millions of Russians along the way.

Now if the Soviets did manage to invade and conquer Western Europe and maintain control over it, then they might be in a rather better position to consider direct attacks against the US. Which is a good part of the reason why the US invested so much in keeping them out of Western Europe.


I think it was more luck than judgement that we escaped annihilation. There were any number of times where a wrong call would have meant Game Over. And the probability would have been much higher if someone other than Gorbachev had been the last Soviet president.

History isn't over yet. The risk is probably higher now than at any time since it peaked in the mid-80s.


> the nuclear bomb by preventing major countries from going to war with each other is a good contender for most impactful technology of the 20th century.

It didn't prevent the superpowers from having wars with each other. It just shifted the battle fronts, so now the superpowers fight through proxy wars, and the deaths all happen in distant, "developing" countries.


It's speculation, right? Either it had an overwhelming effect OR it had little if any effect. We'll never know -- thank goodness. We do know that penicillin, clean drinking water, vaccines, and the bicycle had enormous effects all over the world.


> It's speculation, right? Either it had an overwhelming effect OR it had little if any effect. We'll never know -- thank goodness.

By that logic, we can 'never know' anything about historical effects, ever, and it's pointless to discuss the consequences of any historical events, including the others you mentioned. But really, the effect of the nuclear bomb in shifting the battlefronts of wars to proxy wars in developing nations under hegemonic control is as clear and as accepted by historians as we can expect.

It may not be quite on par with the "theory" of evolution being all-but-proven fact, but it's in that ballpark.


It's not pointless, but we can tell for certain that some things had a direct impact. It's very hard to claim as a matter of fact that the psychological impact of nuclear weapons was X, and it's nothing at all like the theory of evolution, which is confirmed by ridiculous amounts of direct and indirect evidence.


In a world without nuclear weapons, the USSR would have invaded Europe (and the Middle East) and the entire continent would have been laid waste. They came ridiculously close to doing it anyway despite nuclear weapons. If you think that such a European war would have meant that conflicts in other parts of the world would not have happened, I think you're being a bit naive.


Oh for sure (on the nuclear bomb) -- it's really impossible to compare these things meaningfully. Nuclear bomb? Modern agriculture? Penicillin? Vaccines? It's really impossible to judge.


I consider him much less correct with his mobile phone prediction. "At least 5 years out" is completely, totally wrong. If you made a business decision with that prediction you'd have felt a severe sting.

I agree with you about the Apple and XML ones though.


Not really. Did you see Dan Luu's original post at http://danluu.com/yegge-predictions/ ? Even though the iPhone was released in 2007, smartphone adoption didn't really start exploding until 2009.


Sure, but Yegge's original post wasn't cautioning about speed of adoption being slow, it was ridiculing the notion that companies were even close to knowing what consumers wanted and suggesting there were huge collaboration and content problems, as well as technical issues, questioning the idea of a $500 price point for ordinary consumers and suggesting that smartphones and tablets people actually wanted to buy weren't going to happen for a "long time".

In practice, the ultimate form of today's smartphone was starting to take shape in secret in Cupertino pretty much as he wrote his article, the now ubiquitous design went on sale within 2 and a half years, the App store came the following year and Apple alone had already shipped 30M units before the "at least" 5 years were up. The release of the iPad also took less than six years.

Yegge didn't see significant enough innovations taking place within 5 or so years to make an impact. In fact, all the innovations that led to over a billion people owning smartphones today took place in that time frame, with only incremental improvements since then.

I think it's pretty generous of Steve to give himself more than half marks for that!


The first iPhone didn't have apps.


> If you made a business decision with that prediction you'd have felt a severe sting.

I somehow feel that this would depend on the decision you made.


If the decision was "it's not too late to make it big in this market if we start NOW", I think you would have done okay.


Like BlackBerry tried?


He is being perfectly fair and accurate, which is probably a first for anyone making futurist predictions, so kudos to him.

This is exactly why all futurists are full of shit (see Kurzweil) and predicting the future is impossible. Getting some aspects of your prediction right is virtually guaranteed if you generalize when you re-interpret later.

You'd have to make spectacularly bad predictions that are overly detailed and specific NOT to do half decently if we're going to allow post-hoc retrofitting of which broad trends kind of fit the original prediction.


How did iPads "disrupt" the laptop model?

That's just the bubble you're living in talking. In most of the world, iPad sales have been declining, with Mac revenue surpassing them again. And besides the iPad, the market for tablets is basically non-existent. Really, our world is weirder and weirder.


Steve, if you are reading this, please know that we miss your writing. It would be awesome if you started writing again.


From 2007:

... I'm sure someone will let me know, probably by tying a nice note to a nice red brick and hurling it through my window. You should see the email I get sometimes.

I think you can estimate the maturity of a culture by how easy it is to commit heresy. There is simply no excuse for how people reacted to his writing. It reflects poorly on all of us.

Maybe Alan Kay is right that we (the IT industry) are a pop culture, complete with our own sort of Kanye West.


You mean full of undeservedly famous and ignored genius?


Yeah, it pains me to even read this kind of thing, because it makes me remember the feeling of reading one of his new articles (which was maybe even _irrationally_ exciting). I don't feel that way about any other blogger, past or present.

It must be the combination of interesting and unique ideas with a really entertaining writing style that got me. Maybe it even had something to do with the years he was writing (I was in high-school & college at the time).

I don't really think Yegge is going to get back into it (and I don't blame him), I just hope to find someone else with similar qualities.


I felt mostly the same, and also can't explain why. I also felt like he was a window into the kind of companies I wish I could have worked in (in their "golden years").


The long form writing combined with an entertaining style and real experience to back up his stories made his stuff pretty unique in my opinion. Certainly among the best articles I have read on the net.

I think it was in a Stack Overflow podcast that he said the hateful feedback he got was the main reason he stopped writing. It's sad that we as a community did not get more of that content because of anonymous haters on the internet.

(Pretty similar to Joel Spolsky actually. Both in quality of content and hateful feedback in return.)


> NoSQL data stores in their various incarnations apparently have equalled or slightly surpassed RDBMS as of 2014-2015

Wait, what is this based on? Hacker News postings? There is no way that NoSQL has surpassed RDBMS even among startups.


It would be interesting to know how much data Google stores in traditional RDBMS versus NoSQL stores (BigTable, Spanner[0], etc.). I could probably track down this number, but then I expect I wouldn't be able to talk about it after, so for now I'm saying ignorance is bliss :). I would wager that we're rather more NoSQL heavy than most organizations, so it wouldn't really be a good proxy for the industry in general, but it might be helpful for justifying Steve's claim. It might be particularly helpful if we looked at the workloads Google runs on NoSQL stores that traditionally would have mandated an RDBMS.

[0]: http://research.google.com/archive/spanner.html


Google actually built a RDBMS on top of Spanner: http://research.google.com/pubs/pub38125.html


Indeed! However, many uses of Spanner are what I would call native. It should be possible to separate out the two.


In terms of data volume, the Google Search store alone might move the scale substantially.

I'd probably measure by how much programmer time is spent working in each kind of DB.


You're probably thinking about this as "instances of NoSQL services replacing RDBMS installations" but that's putting the playing field in the wrong place. Look at it as a venn diagram. There are 3 important zones: the zone where RDBMS excels; the zone where RDBMS and NoSQL compete and one or the other may be preferable depending on circumstances; and the zone where RDBMS never makes sense. You're looking at the field as the first two zones, Yegge is looking at all three zones.

Which means that you're missing things like caching engines, such as memcached, and other services which are not replacing an RDBMS but filling a different niche, which are widely deployed in industry.


Even traditional RDBMS software is building in NoSQL functionality - many people simply use Postgres as a key-value store.
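For instance, a minimal sketch of that pattern (this assumes a local Postgres you can connect to and the psycopg2 driver; the "kv" table and the connection string are made up for illustration):

  import psycopg2
  from psycopg2.extras import Json

  # One table, a text key and a jsonb value: Postgres as a simple
  # key-value / document store that you can still query with SQL.
  conn = psycopg2.connect("dbname=test")
  cur = conn.cursor()
  cur.execute("CREATE TABLE IF NOT EXISTS kv (key text PRIMARY KEY, value jsonb NOT NULL)")
  cur.execute(
      "INSERT INTO kv (key, value) VALUES (%s, %s) "
      "ON CONFLICT (key) DO UPDATE SET value = EXCLUDED.value",
      ("user:42", Json({"name": "ada", "tags": ["admin"]})))
  # jsonb operators let you reach inside the stored documents.
  cur.execute("SELECT value->>'name' FROM kv WHERE value @> %s", (Json({"tags": ["admin"]}),))
  print(cur.fetchone())
  conn.commit()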


What's funny is that it performs better than many "pure" NoSQL alternatives. I remember an issue on GitHub where the poster said they replaced MongoDB with Postgres and increased performance by an order of magnitude or something like that.


In case anyone is still reading this and is interested: http://www.enterprisedb.com/postgres-plus-edb-blog/marc-lins... shows that as well.


I think it's more that they're considered and deployed at the same scale as RDBMSes--obviously Oracle still has a pretty strong hold on the database world: http://db-engines.com/en/ranking


One thing that ranking does not quite take into account is SQLite on Android and iPhone. That's got to count for something, as that's a pretty massive deployment.


SQLite, per se, isn't an RDBMS replacement; it's a fopen() replacement.


The reasoning in the original prediction was that O/R mapping sucks, though, and that's something you still gotta do with SQLite. So according to that, it should still count against the prediction.


SQLite is an "SQL database engine", so, no, definitely not a RDBMS replacement.


According to that stat, counting the top 10, NoSQL databases account for 12% of the popularity score. Given that this is measuring website mentions, job offers, and Twitter mentions rather than actual installs, that number isn't indicative of much success. 10% is actually pretty impressive, but not "equalled or slightly surpassed".


> "I had high hopes for Clojure for a while, but they're fairly user-hostile, even if they think (and loudly assert) that they aren't."

Is that a prevalent opinion? I'm often in #clojure/#clojurescript and found the community quite helpful.


I took that to mean the language and tooling were hostile, not the community.


The Clojure community is great, but Clojure/Core has historically not been interested in lowering the barrier to entry with Clojure. I think this is part of what Steve is referring to.


Would love to hear more about this. AFAICT (from using it every day for 4 years), it's all pretty awesome (except startup time, but that doesn't matter much because REPL).


We're discussing this a bit more on Reddit: https://www.reddit.com/r/Clojure/comments/3jjit5/what_does_s...


The startup time bothered me, as I kept crashing my REPL.



Frankly, I'm quite happy clojure has not turned into a "yes" platform as he suggests. I also find Yegge's approach of insulting anyone who doesn't agree with him to be very off putting and not constructive. What I took away from it is that if you disagree with him, you are going to fail and be labelled hostile in his mind.


Look for the post on the mailing list beginning with "I think that one's called "Perl". " That's where the real meat of his argument seems to be.

> Unfortunately I've conflated saying Yes to features vs. saying yes to users. Saying yes to features is clearly a way to get a kitchen sink a la C++ or Perl.

I didn't see any instances of him insulting people, though he was hostile in the same way he accused the clojure community of being hostile. But the points he made seemed sound to me. Saying "no" to users can seem hostile to the person you're saying no to. You need users, or you die. So say yes and survive.


Suggesting that clojure is dying or going to die is some serious hyperbole. The community is growing steadily every year and is being adopted by industry players. Maybe not quite as fast as languages like scala that cater more to what users are familiar with, but that's not a bad thing necessarily.

Speaking of scala, you could also use it as an example of what happens when you say yes too much. In fact, to me, that's scala's major weakness. Sure, users can use it like a better java, but you aren't gaining as much from doing that as you would if you actually adopted a more functional style of programming. This is exactly what clojure encourages. I wouldn't count promoting this philosophy as any more hostile than any other opinionated language.

It certainly sounds like Clojure is not for Yegge, but that's not a valid reason to call the community hostile and claim that it's a dying language. Especially so because it's the complete opposite of what is true. The community is very helpful and welcoming to newcomers and the usage is growing, not declining.


That's not what he said though.

He said that being hostile to users is the quickest way to kill a language. Now I don't know much about the Clojure language or community, but that post from Yegge was over 4 years old. You say that it's growing and I'll take you at your word, but had it reached critical mass by 2011?

Yegge is a languages guy. He knows some of what it takes to make a language successful, and one of the main things is advertising. Being hostile to users is the worst possible form of advertisement. That was his critique 4 years ago.

Maybe things have changed. Maybe the hostility wasn't as bad as what he thought, I don't know. But the comments in that thread aren't hyperbolic if you're associating them with the state of things today.


Thank you very much!


Dan Luu's original post: http://danluu.com/yegge-predictions/


Thanks for posting this. Context is always welcome :)


I really wish more people would revisit predictions to see how good they were. Not enough of this. Great job.


I know exactly what you mean. Most predictions are cast into the wind and only the ones that are correct are recognized later. Nate Silver talks about this in his book "The Signal and The Noise"; I highly recommend it.


It's good to see that Steve is back publicly blogging.


Let's not count our chickens.


This is more of a "one swallow does not a summer make" situation.


Yes, but a bird in the hand is worth two in the bush.


Back? Doesn't look like it. I mean, one post, right? Or am I just not seeing any more?


What exactly is multi-threaded programming in this context?


I interpret it as shared-state concurrency. That is, multiple threads sharing the same address space, performing reads and writes on shared data: atomics, mutexes, etc. From the update I understand that he's considering things like actors, Communicating Sequential Processes (goroutines), and divide-and-conquer data partitioning (MapReduce) as different approaches, even if they happen to be implemented on top of threads sharing the same memory space.
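Roughly, the contrast looks like this (a toy Python sketch; the queue-based workers are just standing in for the actor/CSP style, not a faithful implementation of it):

  import threading, queue

  # Shared-state style: threads mutate a common counter behind a lock.
  counter = 0
  lock = threading.Lock()

  def bump(n):
      global counter
      for _ in range(n):
          with lock:            # explicit synchronization around shared data
              counter += 1

  threads = [threading.Thread(target=bump, args=(10000,)) for _ in range(4)]
  for t in threads: t.start()
  for t in threads: t.join()
  print("shared-state total:", counter)

  # Message-passing style: workers only touch what arrives on a queue.
  tasks, results = queue.Queue(), queue.Queue()

  def worker():
      while True:
          item = tasks.get()
          if item is None:      # sentinel: shut down
              break
          results.put(item * item)

  workers = [threading.Thread(target=worker) for _ in range(4)]
  for w in workers: w.start()
  for i in range(8): tasks.put(i)
  for _ in workers: tasks.put(None)
  for w in workers: w.join()
  print("message-passing results:", sorted(results.get() for _ in range(8)))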

I'm not sure on the relative prevalence but I can say that I've read relatively little about what he's calling multi-threaded programming. Mostly just a handful of lock-free data structures from the C++ guys and Rust's take on how they can do it while avoiding data races due to the type system. The alternatives, on the other hand, have been on HN pretty much continuously since HN started up.


And this is in direct contrast to how much hype there was for programmers to use multi-threaded programming due to leveling off of single-core performance and the prevalence of multi-core systems.


I'm not sure about this; e.g. concurrent data structures are a form of shared-state concurrency, and "worker pools" may or may not use shared state. It seems orthogonal.


C++11 added a ton of support for that kind of stuff though, with language-level atomics, mutexes, etc.


Programming that explicitly creates and manages threads.


I still think he assessed this point too favorably for himself. If you read the whole point he made, he may be right. But from where I sit, all these asynchronous methods are used more and more, so I'd say multi-threaded programming is much bigger than it was 10 years ago.

Look at some languages/frameworks now: if you're programming in C#, you have keywords like "await" which make background processes much easier.
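Python grew the same keyword; here's a trivial sketch of the idea (analogous in spirit to C#'s async/await, not the same machinery):

  import asyncio

  async def fetch(name, delay):
      # Stand-in for a web-service call; await yields control instead of
      # blocking the calling thread while the I/O is in flight.
      await asyncio.sleep(delay)
      return name + ": done"

  async def main():
      # Kick off two "background" calls and wait for both.
      print(await asyncio.gather(fetch("profile", 0.2), fetch("feed", 0.1)))

  asyncio.run(main())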

When you're programming Android in Android Studio, you're warned if you try to call a web service on the GUI thread.

15 years ago, when I started programming commercially (MFC and C++), most of the books said something along the lines of "well, this is multi-threaded programming, but 'there be dragons'..."


Anything around scaling across multiple cores I suppose. It wasn't as nuanced back then.

In 2004 single core computers were still quite prevalent, and the idea of steady significant performance improvements coming from increasing clock speeds was in its death throes. From memory it was about then that Intel had given up on the P4 Netburst architecture and stepped back to improving the P3 to release the new Core and Core Duo architectures.

Nowadays multicore has been reality for a long time and the number of cores in a CPU has increased a lot, and we're thinking much more about different competing ways of achieving better concurrency. Back then it was just about getting better multicore concurrency at all using bog standard threads.


My reading of it based on his grading is "preemptive multitasking with shared state"


Well, that's just as popular as ever, including in his examples (e.g. Go), so your interpretation is wrong.


He should have stated at the beginning (or in the title!) of the post when he originally made these predictions. You have to read to the very end to see the footer "(Published Nov 10th, 2004)".


Regarding #1: he did mention, offhand, that folks might use JSON instead of XML, which is basically what happened.


Would love to hear more about the assertion that clojure is user hostile. He last elaborated in 2012 here:

https://plus.google.com/110981030061712822816/posts/KaSKeg4v...

which basically boiled down to an impression drawn from a conference where a speaker warned against using macros. I fully accept Yegge doesn't dig clojure, and perhaps he owes nobody any further explanation, but given his previous visibility as a champion of lisp and all the cool stuff happening in clojure / clojurescript, it just doesn't add up; his critiques are incomplete in that they fail to acknowledge the positives in the language.


Ah! So it's my fault…


> Apple Macs only have 13% market share as of late 2014

Can anyone find a source for this claim? The 2 I found [1][2] claim OS X has about an 8% market share, which is about 40% lower than his claim.

[1] https://en.wikipedia.org/wiki/Usage_share_of_operating_syste... [2] http://appleinsider.com/articles/15/07/09/new-idc-numbers-sh...


On number 9:

"Apple's laptop sales will exceed those of HP/Compaq, IBM, Dell and Gateway combined by 2010"

So, the prediction mentions "laptop sales", but in his grading, he talks about "Apple Macs only have 13% market share", which I would assume means unit market share? Given the profit share vs market share dichotomy seen in smartphones in particular, I'm left wondering if he isn't a bit too harsh on himself here. It seems likely to me that sales-based market share would substantially exceed unit market share for Apple Macs; surely Asymco or some other smart analyst has run the numbers on this?


He should definitely give himself more credit...in the US at least, Apple laptop revenue exceeds that of all Windows laptops combined. http://betanews.com/2015/08/12/yikes-apple-laptop-revenue-sh...


I guess it depends on your sales metric: number of people now with access to the technology vs. amount of $ padding added to Apple shareholders' bottom line.

Number of people is much more relevant and interesting.


  7) The mobile/wireless/handheld market is still at least 5 years out.  Nope, it was 3 years out.  iPhone came out in 2007.  Pretty easy to calculate this one.  Score: 0.6
Well, when he made the prediction, the smartphone market was already there; it just didn't appeal to consumers. The year of his prediction, the Treo 650 was released, which I would argue was the first fully-fledged smartphone similar to those we use today (the first that I know of, anyway). There were also earlier models of the Treo lacking mobile data. Kyocera made some similar devices as well.


> Treo 650 was released which I would argue was the first fully-fledged smartphone

Or Nokia Communicators from 1996 onwards. https://en.wikipedia.org/wiki/Nokia_Communicator


That's another good example, although it only had a WAP browser until 2004 as far as I can tell.


It already had a web browser in the 1996 model.


Yeah, a WAP browser.


No, not just a wap browser, a web browser.


Substance aside, this is an excellent example of how people will give themselves partial credit even for unequivocally wrong predictions. The temptation to double down on your original hunch can be irresistible.


Right -- all the predictions that mention specific percentages, or include specific dates, are wrong. The ones which are actually correct are the vague ones ("A new internet community-hangout will appear", "Someone will make a lot of money by hosting open-source web applications"). This is why fortune tellers don't mention dates.

The whole point, which Steve himself covers eloquently in his original post, is not to score the predictions but to talk about the ideas. And if you look at intent behind each prediction, some of those "partially-correct" ones are actually completely wrong.

I mean look at these lines on the prediction about the mobile market being at least five years out: "But the carriers don't know what the hell they're doing, and the content and distribution models aren't worked out yet, and won't be for a long time. A lot of players need to get their collective acts together for the mobile market to take off, and it just hasn't been happening. [...] Don't get me wrong: it will happen someday. But I think it's not going to be any time soon. Betcha!" Then Apple told the carriers what they were doing, and figured out "content and distribution models", and this all happened within three years, and the reason it happened so quickly is because Apple didn't rely on any of the large set of third parties and vested interests which Yegge thought someone was going to have to motivate in order to make progress.

That prediction was apparently 60% correct. It's doublethink.


For his 10th prediction: using the Chebyshev inequality (https://en.wikipedia.org/wiki/Chebyshev%27s_inequality) and assuming Yegge means that being within two sigmas of mean programming talent counts as "average" (k=2), we see that the probability of this is at least 0.75 for any distribution. So most of us are average.
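Spelling out the step (standard Chebyshev with k = 2):

  \Pr(|X - \mu| \ge 2\sigma) \le \tfrac{1}{2^2} = 0.25
  \quad\Longrightarrow\quad
  \Pr(|X - \mu| < 2\sigma) \ge 0.75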


> There was also some discussion on HN about whether I thought NBL was JavaScript. Yes indeed I did, and said so on stage at OSCON

NBL?


>NBL?

Next Big Language


I think just the fact that Carmack (and Carmack Jr.) is interested in Racket deserves a bump from 0.0 to 0.1 for the Lisp prediction.

VR has the potential to become a big and interesting market and one of the leading tech guys in that field is fascinated by a Lisp. He might just make it work.

[note: I've never used Lisp for anything, just an observation]


He is not the first to look at Lisp, and I would be surprised if he is the last.

Lisp has some really compelling properties as means of expressing a program.

Unfortunately there is a severe impedance mismatch between Lisp and the Algol-inspired languages of today's mainstream. Overcoming the mismatch is not difficult but takes effort, and few are willing to expend any real effort; Carmack is certainly not the sort to shy away from some effort.

I would recommend trying out Racket or Common Lisp if you can. Common Lisp really needs Emacs, so if you are not familiar with the editor I would recommend the former.


Can you elaborate further on the efforts needed to close up the mismatch? Perhaps something concrete to work on today? ;)


The immediate mismatch is the alien nature of the parenthesis-delimited s-expressions. My advice is that if you find yourself editing text as opposed to the structure of the s-exp, or balancing parentheses manually, you are doing it wrong; Emacs has modes like paredit and smartparens to do it for you.

The next "culture shock", though I think this has lessened now due to other languages catching up, is the concept of closures, functions as values, and higher order functions.

Lisps also tend to be dynamically typed, so if that is not familiar to you then it will take some getting used to.

There are other high-level concepts such as syntax abstractions (macros), but those can wait.

And Common Lisp has CLOS which is a different take, compared to the likes of C++ and Java, on Object Oriented programming.

As for what to do first, I am not really sure; it depends on whether it's Racket or Common Lisp, and a lot on what interests you. If you like doing small algorithmic problems such as Project Euler, then start there.

For Common Lisp there is Practical Common Lisp which is always a good primer.

http://www.gigamonkeys.com/book/

And Learn Lisp the Hard Way (I am not familiar with it though)

http://learnlispthehardway.org/

For a Common Lisp implementation I would recommend SBCL if on Linux (your package manager should have it) and CCL on Windows or MacOS.

SBCL: http://www.sbcl.org/

CCL: http://ccl.clozure.com/

Also use Quicklisp for grabbing Common Lisp projects; it's a lot easier than doing it manually.

https://www.quicklisp.org/beta/

This might also be useful:

http://eudoxia.me/article/common-lisp-sotu-2015/#specific-pr...

Someone else will have to give info on the Racket ecosystem I am not familiar with that.


I am actually familiar with Lisp and was wondering what to do to bring people over, not what to try myself. I doubt the parentheses are a huge problem because, like you said, they can be trivially handled by your favorite tool.

I think what the Lisp world needs is a huge push from one of the big players. Not just an in-house, small-ish tool, but rather a public-facing bet on it, as dramatic as Facebook's bet on JavaScript with React, or Twitter's move to Scala. After a 15-minute survey I couldn't find an example of a big, respected company whose toolchain has a substantial Lisp project for other people to contribute to. Sure, you can find multiple companies building Lisp solutions, but those rarely open-source them, and even when they do, they are nowhere near substantial enough to warrant massive public attention.

I sincerely think that this is the only thing standing between Lisp and universal adoption.


> Lisps also tend to be dynamically typed, so if that is not familiar to you then it will take some getting used to.

Eh, dynamically typed languages like JavaScript, Python, Ruby, PHP and Perl are rather popular.


The comment is obviously aimed at someone who does not use dynamic languages and comes from C++, Java, etc.


Interestingly, Shriram Krishnamurthi seems really insistent that Racket is its own language, taking exception when I described Carmack's interest as being in "Lisp" -- even when I clarified that I just meant Racket is a Lisp language.

https://plus.google.com/+MichelAlexandreSalim/posts/2vhp2GE9...


> 4) Java's "market share" on the JVM will drop below 50% by 2010. Definitely hasn't hit 50%. However, Scala has really tremendous adoption

Yeah, not really [1].

[1] http://www.indeed.com/jobtrends?q=scala%2C+java&l=


He's hard on himself over the smartphone start; the original iPhone had sales numbers that were a rounding error.


I think that's being just a wee bit unfair.

Smartphone sales in the US in the first quarter the iPhone was released (July-September) were ~4.2 million units, Apple sold 1.12 million iPhones that quarter. In other words - the iPhone had ~27% of the US Market sales.

That's... not a rounding error. That's an incredible and wild success. The price drop after September helped of course - but they'd already sold over 1 million iPhones by then anyhow.

By the end of the year they had a 5% market share worldwide. Nothing to sniff at for something only 6 months old.


Sure, if you want to define the market as "things similar to the iPhone" then yes, they did have a good market share. In terms of the mobile phone market, though, the original iPhone made up less than 1% of the market.

It took the 3G to change the world.


Sure, but seeing as that included non-smartphones, that's a bit... well, stupid of a metric to use.

I mean it's like complaining that Aston Martin is a failure in car market because Chevrolet sells more cars.


Well, yes. But as I pointed out below, the iPhone's sales figures were identical to the N95's, and no one is claiming the N95 was a massive success.

When the iPhone launched, the "smartphone market" didn't exist - there were a couple of other smartphones. It was an incredibly niche category.


Quite true. The iPhone didn't really take off until late 2008 when iPhone 3G, the global launch and the App Store happened.

iPhone sales were slacking so much that Apple suddenly dropped the price by $200 after just two months on the market: http://www.nytimes.com/2007/09/07/technology/07apple.html?_r...


The price drop was due to finally getting the phone subsidized by major networks.

If you bought it unlocked (or if you still do) it was the same price. It sucked a little for Early Adopters (like me) but it was completely predictable and wasn't honestly a huge surprise to anyone paying attention.

By the time the price drop happened they had sold over 25% of the smartphones that were bought in the previous quarter. They'd already met their goal of 1 million smartphones by the end of September at that point.

I swear... the rubbish I see over this: the iPhone was a wild success considering that Apple had zero involvement in that business before. It was the kind of success that many companies dream about for new hardware.


I strongly suspect that the vast majority of people--even here--"remember" a number of Apple's biggest wins as being inevitable from Day One. In fact, as you say, the iPhone didn't become a must have smartphone until 3G/3GS timeframes. Similarly, until the 4th gen iPod with the click wheel, the iPod was pretty much just another reasonably well-regarded, albeit pricey, MP3 player.


I think that anyone that doesn't remember the iPhone as being a wild crazy success was just not paying attention - and the numbers back that up.

The 3GS was when it exploded but it was already a big success by that point.

What people have to remember is the success of the iPhone is practically unprecedented in terms of adoption. It's incredibly rare to come out and dominate an industry like that when you're a new entrant into the space.

Even if the iPhone had continued its July-September sales curve it would have been a success. It just got even more so when the other versions came out.


I looked up other phones, and really, no, the numbers do not back that up.

The Nokia N95 sold 7 million units in the year after launch, pretty much exactly the same numbers as the original iPhone.

Unless you are declaring the N95 a wild crazy success you are talking rose tinted nonsense.


I didn't realize Nokia got into the business with the N95... oh wait... (Also, it was a large success for Nokia, so that doesn't help your case much.)

Also - either selling 25% of the U.S. market is not a success to you or you don't understand the market.


I think we're quibbling over terminology. As you say, it was the 3GS that "exploded." (And it's when I personally upgraded from a Treo 650.) I agree that, to a significantly greater degree than the iPod, the iPhone was from Day 1 obviously a big deal even if it wasn't clear that most competitors really needed to get busy working on their gravestones.


Why isn't this guy a professional writer? He's so good at it!


Professional software engineer at Google pays better, in a probability-of-income-per-dollar sense.


The number of writers who make more than a senior Google engineer would be very, very small.


I hear programming pays better.


He also might simply prefer writing programs to writing blog posts.


I think he would've gotten a better score on the Lisp prediction if he didn't specify Lisp and instead went after Lisp-like features. Metaprogramming and homoiconicity are becoming increasingly popular, as are machine learning techniques.


Macro systems are becoming increasingly popular (e.g. Rust and Sweet.js), but arguably neither of those languages is homoiconic in the Lisp sense (though see also Dave Herman's "Homoiconicity isn't the point" http://calculist.org/blog/2012/04/17/homoiconicity-isnt-the-... ).


By homoiconicity, I was referring to Julia and Elixir, both of which are cited as being homoiconic (even though some purists disagree).

Also, I think "Homoiconicity isn't the point" misses the point. Homoiconicity isn't about syntax - indeed, Lisp isn't technically confined to s-expressions, and was originally designed for M-expressions before folks realized that s-expressions were much easier to implement - but rather about the ability to represent a program as ordinary data and to manipulate it as such. Yeah, I can do all sorts of metaprogramming in, say, Ruby or Perl, but you won't be able to go very far before you end up passing strings into `eval`; this is because they're not homoiconic (at least in their current forms). Meanwhile, the ability of languages like Elixir to actually modify a program's syntax tree directly - a.k.a. homoiconicity - makes metaprogramming much easier. The two concepts go hand in hand.

S-expressions happen to be very convenient for defining tree-like structures, like - for example - Lisp programs. They're not strictly required, however.


Given this was 2004 you have to give him full points for his prediction on PG - "He has an impressive resume, and he appears to be only just getting started."


He predicted Ruby's massive success relative to Python over some of that time stretch in another article. He also predicted everyone would be writing code in the browser instead of Emacs & co., and I don't think that has come true.


Ruby's massive success relative to Python? The August 2015 TIOBE index lists Python up to 5th from 7th a year ago, and Ruby holding steady at 13th across both years. I know TIOBE isn't necessarily an ideal source, but this fits my intuition better than your suggestion. Can you elaborate on your perception of the two?


I think that's the point the parent comment is making. That Yegge made two other predictions that were not even close to right.


I started using cloud9 IDE for a few free/open source projects and I have to say, it's awesome. I have 3-4 computers that I work off of and it's a god-send to be able to just fire up a browser and continue where I left off without worrying about managing the environment setup for each machine.

I liked using PyCharm and WebStorm and I love emacs and I've been using Atom lately, but the convenience of an always running virtual machine with a pretty good text editor is hard to beat.


That's awesome! Although I am the founder of a competitor called Codeanywhere, I'm super psyched that you're into a cloud IDE!


I think I tried using Codeanywhere and it didn't quite work for some reason; it may have been when I tried the Android app? Is Codeanywhere free/open source? That's one reason I prefer Cloud9, though I like the 2-factor auth for private/proprietary repos...


It could be the Android app, as there are some issues with some devices. Codeanywhere the web app is free to use but not open source (yet). We also offer 2FA to our users :)


Re #2: Google App Engine and Heroku are not open source. I can't think of an open-source PaaS, or one that is open source and has made somebody a lot of money.


I thought that GitHub met his exact prediction. They took an open source product (Git), hosted it as a webapp, and got people to pay for it (GitHub Enterprise).


Cloud Foundry. The major contributors are Pivotal and IBM.

Pivotal is on track for $100 million in revenue this year, the fastest growing revenue for any open source product in history. Ever. In any category.

Disclaimer: I work at Pivotal Labs, another division of Pivotal.


They aren't open source, but people run lots of open source software on them.


I think I'm not smart enough to get it; lots of servers run on Linux so how's that different?


You don't say "I want to run linux, I'll pay Heroku to do it for me."

You say, for example, "I want to run Wordpress, I'll pay Heroku to do it for me."


I don't care about whether Steve's "score" is 50% or 70% or 80% or something else. The fact that he was able to come up with these 10 predictions back in 2004 shows he has a whole lot of vision.

Steve, what are your next 10 predictions? Obviously, you're at least 10 years ahead of me...


> I had high hopes for Clojure for a while, but they're fairly user-hostile, even if they think (and loudly assert) that they aren't.

I'm curious as to what this means. In what way is clojure hostile? Is there some reason I should avoid it?


> But even I don't use Lisp anymore, other than the occasional Emacs hacking.

What do you use? Why don't you use Lisp anymore?


Personally, I'm still trying to use Lisp every chance I get. My window manager is stumpwm (Common Lisp) and my editor is emacs (elisp). Racket definitely has some very cool ideas, but I just don't like Lisp-1s and I do like the industrial strength of CL.

The early-to-mid 2000s were an exciting time for Lisp. Practical Common Lisp (http://gigamonkeys.com/book/) was coming out; Paul Graham was writing about how Lisp was a secret weapon; reddit was based on it. It seemed like the stars had finally aligned.

But then…nothing happened. Ruby captured a huge amount of the hype cycle, Python captured a huge amount of the pragmatists, and…nothing happened. The Common Lisp community kept on churning out incredible code, building a better mousetrap and waiting for the world to beat a path to its door and…nothing happened. It really is dispiriting to reflect on.


Any good next ten year predictions? Steve, or anyone else?


Management refuses to believe that Error Prone == Evil, far more than programmers do.


He is way off about XML. It is bloated and tedious to work with. For most scenarios, there are immensely better alternatives.

For the record, I was heavily involved with XML years ago and helped create a popular lightweight XML-based CMS.


I think his main point, though, is that storing objects in an RDBMS using O/R mapping is a pain in the ass and there will be better ways to do it. With the advent of NoSQL DBs, and the number of DBs that store their data as JSON, I think this is largely correct. Back in 2004, XML was largely shorthand for "serialized object". He was wrong about the serialization format, but right about the overall idea.


People who hate XML never had to deal with variable-length bit field formats...



