Norvig's Law (2002) (norvig.com)
121 points by saikatsg on Oct 2, 2022 | 65 comments



Those who insist on hyperbolically using percentages greater than 100% to indicate "even more" would disagree with Norvig's Law.

Maybe if they gave it 150%, they could see Norvig's reasoning. It may take more than that, though -- maybe exponentially more.


> Maybe if they gave it 150%, they could see Norvig's reasoning. It may take more than that, though -- maybe exponentially more.

I hope you'll permit me explicitly to single out your mocking invocation of my bête noire. I think that most non-technical authors just confuse 'exponential' with 'super-linear' (if they think even that quantitatively) … but I sometimes worry that even the somewhat more technically minded think that 'exponential' just means 'has an exponent', and so think that quadratic growth is exponential, y'know, because there's an exponent of 2.


For those who don't know:

time*n is linear in time and in n, but the symmetry stops there.

time^n is *polynomial* growth in time (for a fixed n).

n^time is exponential in time.

time! (factorial) doesn't have a common name that I know. It is (in the long run) faster than any exponential growth.
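A minimal C sketch, if you want to see the ordering numerically (doubles are used only to keep the factorials from overflowing; the crossover points are just what the arithmetic gives for these particular bases):

    #include <stdio.h>
    #include <math.h>

    /* Rough comparison of polynomial (t^3), exponential (2^t) and
       factorial (t!) growth; t! passes 2^t at t = 4 and never looks back. */
    int main(void)
    {
        double fact = 1.0;
        for (int t = 1; t <= 30; t++) {
            fact *= t;
            printf("t=%2d  t^3=%10.0f  2^t=%12.0f  t!=%e\n",
                   t, pow(t, 3), pow(2, t), fact);
        }
        return 0;
    }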


I just call it factorial time, or super-exponential if I'm being imprecise.


Does anyone find this interesting?

I respect Peter Norvig as a programmer and a problem solver. I took a course taught by him in the early MOOC days and really enjoyed it.

What I don't understand is how something like this makes it to the top of Hacker News.

I used to visit HN to get smarter; lately I feel like I'm getting dumber.


I find it mildly amusing as a reminder about some of the irrational exuberance floating around the ether.


I just thought it was an amusing little break between more in-depth HN articles.


I have the same feeling; that's why I decided to filter out all that noise for myself and created this blog: https://rubberduck.so. I read the posts every day, filter out the "amusing" topics, and keep the ones that I actually come back to read again later. It is published on Mondays and updated every day with the articles from the day before (https://news.ycombinator.com/front).


> I've taken a course taught by him in the early mooc days that I really enjoyed.

I think I took the same one, but I remember Sebastian Thrun being the better instructor.


Probably people from Google want to put a positive spin on things after the company killed another product.


Related:

Norvig's Law - https://news.ycombinator.com/item?id=7491767 - March 2014 (13 comments)

Norvig's Law - https://news.ycombinator.com/item?id=317170 - Sept 2008 (14 comments)

Norvig's Law: Any technology that surpasses 50% penetration will never double again - https://news.ycombinator.com/item?id=36047 - July 2007 (4 comments)


Another way of putting it: once it's obviously a huge success you're too late.


Did not seem to be true for Bitcoin in the distant past.



Well, maybe you can't double what you've got, but one way to measure past the 50% mark would be to try to halve what remains on the table.


Well, one used to have a family computer; now we have 4... 7 if you count iPads. Same with phones.


This is covered in the OP.


Mynegation’s corollary: anything that can be allocated at a maximum of 1 unit per person can experience at most 33 contiguous periods of doubling (since 2^33 ≈ 8.6 billion, just above the current world population).


Sure it can, but not until the population exceeds 16 billion. Same as with Norvig: the penetration percentage will never double again, but the number of users can still keep going up. As the song says, "population keeps on breeding..."


Fair enough. I probably should have added the condition that the doubling period is significantly shorter than the population's doubling period.


You can double again if you go below 50%.


Or if you redefine the technology. That's the way it usually happens: "Android Gingerbread has 1% market share. 2%. 4%. 8%. 16%, better introduce Android KitKat. 32%. 64%, but look KitKat is now at 4% and climbing exponentially! Gingerbread is now deprecated, KitKat is on a majority of devices, time to introduce Lollipop."

Come to think of it, this applies to a lot of Google's (and Microsoft's, and Apple's, and most tech companies') product strategy.


If it were unbreakable, it would be inconsistent with "laws" like Moore's and Gilder's.


There is probably some discussion inside Google that prompted this.

"We should aim to double our market share!"


> Less familiar are the more pessimistic laws, such as Proebsting's law, which states that compiler research has led to a doubling in computing power every 18 years.

If that were true it would actually be quite extraordinary, but in fact it's still hard to beat C and Fortran.


That’s because C and Fortran also continue to benefit from all the compiler research?


So if you ran benchmarks compiled with the best C compiler from 2004 against the same benchmarks compiled with the best current C compiler, both on 2004-era hardware, you'd see a factor-of-2 performance gain? That's possible, I suppose, but I doubt it.


I have seen that kind of thing happen, yeah. I used to use dumb Fibonacci as an easy microbenchmark for getting a rough idea of language implementation efficiency:

    #include <stdio.h>
    #include <stdlib.h>

    /* naive doubly-recursive Fibonacci: exercises arithmetic, calls, branches */
    __attribute__((fastcall)) int fib(int n)
    {
        return n < 2 ? 1 : fib(n-1) + fib(n-2);
    }
    int main(int c, char **v) { printf("%d\n", fib(atoi(v[1]))); return 0; }
This gives a crude idea of the performance of some basic functionality: arithmetic, (recursive) function calls, conditionals, comparison. But on recent versions of GCC it totally stopped working because GCC unrolls the recursive loop several levels deep, doing constant propagation through the near-leaves, yielding more than an order of magnitude speedup. It still prints the same number, but it's no longer a useful microbenchmark; its speed is just determined by how deeply the unrolling happens.
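If you wanted to keep using it as a microbenchmark, one minimal tweak (just a sketch, assuming a compiler that honors GCC-style attributes) would be to forbid inlining of fib so the recursion can't be unrolled:

    /* Sketch: keep fib as an out-of-line call so GCC can't unroll the
       recursion and constant-propagate through the near-leaf calls. */
    __attribute__((noinline)) int fib(int n)
    {
        return n < 2 ? 1 : fib(n-1) + fib(n-2);
    }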

It's unusual to see such big improvements on real programs, and more recent research has shown that Proebsting's flippant "law" was too optimistic.


Current compiler optimisations are written with current hardware in mind, and I doubt that older optimisations would become pessimisations on newer hardware, so I'd instead compare the best C compiler from 2004 against the current best C compiler on today's hardware.


Turn off optimizations and find out.


Compiler optimizations existed 18 years ago.


It's (mostly) because C and Fortran continue to benefit from all the hardware research.


Is this meant to be a joke, or am I missing something here?


What don't you understand?


If something has more than 50% market share, then it obviously can’t double because then it would exceed 100%.

So this “law” appears to be tongue-in-cheek and not some novel insight.


I attended a talk at the Royal Geographical Society where someone explained that, if current trends continued for fifty years, the super rich would own X00% of the planet. And I never understood it. It's like, yeah, ok, if your model of wealth is that there are literally 100 gold bars somewhere, then yes, that would be a contradiction. But firstly, lots of things are S-curves, not exponentials, and secondly, we can just change what we measure. It looks to me like this comment is talking about something like this article:

>http://edition.cnn.com/TECH/computing/9902/11/50pc.idg/index...

Ok. Well, the US is a few hundred million people in a world of 6-7 billion. So yes, doubling would have been impossible. But it happened. According to some source that I just googled[2], there are 6 billion smartphones right now. So this schmuck thought that computers were hitting the wall coming up to 150 million. That's an order of magnitude of wrongness, and I bet you the average person in the US today has multiple computers more powerful than a 1999 computer: one in their phone, one in their iPad, one in their laptop, one in their fridge, one in their coffee machine, one in the doorbell, one in their robot hoover, one in their thermostat. I mean... it's a mad lack of imagination.

[2]: https://www.bankmycell.com/blog/how-many-phones-are-in-the-w...


I’m sure I’m missing something deeper here: isn’t it tautological that something that is at >50% can’t double again?


It's a joke. Even someone as well known as Peter Norvig is unlikely to be so gauche as to name a "law" after himself except tongue in cheek.


It seems to be intended just as a common-sense reminder that fast growth has to eventually slow/stop due to market saturation.

It's not strictly true, though, since the market itself can grow, so your sales could still double or more from a level that had represented 50% of the market at some point in the past.


Everything true is tautological in some context.


Conversely, you can lose 50% market share per year indefinitely. Just ask Blackberry.


Just because your cryptocurrency lost 90% of its value today, doesn't mean it can't lose 90% of its value tomorrow.


This sounds like a bell curve.


Cute


That Proebsting's law link is of course dead, and redirects to the main page of Microsoft Research. In my experience, it's the natural state of links to Microsoft Research pages. What's up with that?


Can't answer your question, but here's the law (I was curious myself):

> I claim the following simple experiment supports this depressing claim. Run your favorite set of benchmarks with your favorite state-of-the-art optimizing compiler. Run the benchmarks both with and without optimizations enabled. The ratio of those numbers represents the entirety of the contribution of compiler optimizations to speeding up those benchmarks. Let's assume that this ratio is about 4X for typical real-world applications, and let's further assume that compiler optimization work has been going on for about 36 years. These assumptions lead to the conclusion that compiler optimization advances double computing power every 18 years. QED.

> This means that while hardware computing horsepower increases at roughly 60%/year, compiler optimizations contribute only 4%. Basically, compiler optimization work makes only marginal contributions.

> Perhaps this means Programming Language Research should be concentrating on something other than optimizations. Perhaps programmer productivity is a more fruitful arena.

https://proebsting.cs.arizona.edu/law.html
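For what it's worth, the ~4%/year figure quoted above is just the 36th root of the 4x ratio; a tiny sketch of the arithmetic (the 4x and 36-year inputs are the quote's assumptions, not measured data):

    #include <stdio.h>
    #include <math.h>

    /* Proebsting's arithmetic: a 4x total gain spread over 36 years
       implies a doubling every 18 years, i.e. roughly 4% per year. */
    int main(void)
    {
        double total = 4.0, years = 36.0;
        double per_year = pow(total, 1.0 / years);        /* ~1.039 */
        double doubling = years * log(2.0) / log(total);  /* 18.0   */
        printf("annual gain:   %.1f%%\n", (per_year - 1.0) * 100.0);
        printf("doubling time: %.0f years\n", doubling);
        return 0;
    }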


I find that code performance optimization is not worthwhile a lot of the time. But developer performance optimization is almost always worthwhile.

One might argue that cheap overseas development labour makes it a commodity, but I care more for being humane towards humans than CPUs.


Compiler optimizations can actually improve developer productivity, because they allow developers to write clean but inefficient code that the compiler can rewrite into near-optimal form. For example, Rust's iterators are a very convenient and clear interface that is generally zero-cost (sometimes even more efficient) compared to a manual loop implementation. But without optimization, they would be many times slower.


A lot of the time, code compiled with no optimisations (-O0) is unusable. In video, specifically, some software compiled without optimisations won't push frames on time and instead will just keep dropping frames. There was a post a couple of days ago about this being problematic in the games industry, where a game compiled without optimisations is unplayable, while higher optimisation levels are hard to inspect in a debugger, due to the myth of "zero-cost abstractions" in C++.

To put it on its head a bit: when a compiler isn't fast enough (read: not enough work was put into the performance of the compiler itself, mostly at the design level, not really at the micro-optimisation level), the feedback loop is so long that developers stop testing out hypotheses and instead try to do as much as possible in their heads, without verifying, only to avoid the cost of recompiling a project.

Another instance: when a photo-editing application can't quickly give me a preview of the photo I'm editing, I'm going to test fewer possible edits and probably get a worse photo as a result. With websites, if an action doesn't happen within a couple of seconds of me clicking, I often assume the website doesn't work and just close it, even though I know there are a lot of crappy websites out there that are just this slow. Doesn't matter. The waiting usually isn't worth my time and frustration.


> One might argue that cheap overseas development labour makes it a commodity

It was already argued in the 90s, and several companies bet on outsourcing to India. It wasn't a success for everyone.


Some of that computer horsepower increase is due to chip designers learning how compilers generate code and optimizing for compiled code.


The earliest Wayback Machine archive of that link is from August 2000: https://web.archive.org/web/20000824013718/http://research.m...

Looks like in December 2008 (between https://web.archive.org/web/20081204015038/http://research.m... which works, and the next snapshot on Dec 30) it started redirecting to a new URL (https://web.archive.org/web/20090224224249/http://research.m...) which was still working as of 2012-03 (https://web.archive.org/web/20120307142916/http://research.m...). Meanwhile, https://proebsting.cs.arizona.edu/ says that Todd Proebsting joined the University of Arizona in August 2012 after leaving Microsoft, so presumably that's when the link stopped working. He still has it up at his new site: https://proebsting.cs.arizona.edu/law.html


You should make a law about Microsoft research pages and name it after yourself.


Microsoft seems to do a massive restructuring of their website every few years and they break all links in the process. Raymond’s blog has suffered this a few times.


(2002), or maybe (2001) or (2000) or (1999): The Wayback Machine's earliest archive of this page is from June 2002: https://web.archive.org/web/20020603071812/https://norvig.co... and the page itself mentions July 1999, so this page is from some time in 1999–2002.


According to the archived response headers, it was modified in April 2002:

  $ curl -s -I 'https://web.archive.org/web/20020603071812/https://norvig.com/norvigs-law.html' | grep -E '^x-archive-orig-.* [0-9]{4} '
  x-archive-orig-date: Mon, 03 Jun 2002 07:18:15 GMT
  x-archive-orig-last-modified: Thu, 18 Apr 2002 07:27:36 GMT


Ok, we'll put 2002 above. Thanks!


Nonsense. This observation is unworthy of a genius like Norvig, and anyway it's not even generally true: it's all a matter of perspective, and the associated revenue model (purchase vs. subscription model). Whether the glass is half-full or empty depends entirely on perspective: if I'm looking at this as a seller of a device (e.g. smartphone/PC/laptop/tablet), then maybe I only think of once-off purchases. But if I'm Microsoft (software suite/subscription) or Adobe or Netflix or Apple iTunes, then high penetration of my target market is great; it gives me recurring sales/subscriptions (/users on a social network, to serve ads to). If I'm an independent app developer, I love that Android has high penetration, or else that iOS has a market segment of users with a high propensity to spend on both apps and IAP; but whatever I do, in the 2020s I don't target Microsoft Phone/ Nokia/ Blackberry/ PalmOS (RIP). Maybe HarmonyOS. (Also, high penetration and market share have a tertiary effect of squashing potential competition by siphoning revenues that might go to competitors. Anyone remember last.fm [0]? Remember how Microsoft destroyed RealNetworks's business model [1] by giving away streaming-media server software for free? "According to some accounts, in 2000 more than 85% of streaming content on the Internet was in the Real format.")

We will see the rebuttal of Norvig's Law when Netflix launches its ad-supported tiers. Or we already saw it during 2020-2021/Covid, when Amazon aggressively pushed its discounted Prime to fixed-/low-income EBT/Medicaid/other government-assistance recipients (at least in the US) [2,3].

With all due respect to Norvig (and if you've read his AI book or ever seen him speak in person, he's undilutedly brilliant, and also humble), he should get out there and try to sell a subscription-based device/service. Lemonade-Stand-for-web3.0, if you will... "customer acquisition" is not a dirty phrase.

[0] https://en.wikipedia.org/wiki/Last.fm

[1] https://en.wikipedia.org/wiki/RealNetworks#History

[2] https://www.amazon.com/gp/help/customer/display.html?nodeId=...

[3] https://techcrunch.com/2018/03/07/1604211/


Norvig said:

> To be clear, it all depends on what you count. If you're counting units sold, you can double your count by selling everyone 1 unit, then 2, then 4, etc. (In Finland I understand that cell phone usage is above 1 per capita, but still growing.) If you're counting the total number of households that own the product, you can double your count by doubling the population, or by convincing everyone to divorce and become two households. But if you're counting percentage of people (or households), there's just no more doubling after you pass 50%.


I’m running a subscription-based service, but I’ve stalled at 57% market penetration. Can you give me some advice on how I can double my market penetration from this point?

Remember, what I’m looking for is 114% market penetration. Any help you can provide will be gratefully appreciated.


You missed my point entirely with your sarcasm. Partner companies selling apps on your service don't care that you only have 43% of the market left to capture; that's entirely your business problem, not theirs. However, they very much do like that you already have 57% market share; from their perspective, that's good, not bad. That's precisely why I wrote _"it's all a matter of perspective, and the associated revenue model (purchase vs. subscription model). Whether the glass is half-full or empty depends entirely on perspective"_. Understand now?

When Palm and then Blackberry died as platforms, vendors simply moved to a new platform and ported/rewrote.


> However they very much do like that you already have 57% market share […]

But if they want you to double it, you’re going to have some bad news to report.


Bundle your subscription with things that some or most of your customers already have - but make it impossible to migrate data from existing accounts.

So Prime gives them whatever it is, but they can’t cancel their current subscription.

Win-win evil.


Sell 2+ subs to every customer, e.g. separate phone from car from desktop.

I will not make jokes involving the word double and shame on you if you thought of it too.

Definitions are boring; no growth is limitless, thanks to entropy.


I'm willing to help you, but only if you want it done yesterday.



