Ask HN: What are some “10x” software product innovations you have experienced?
441 points by pramodbiligiri on March 16, 2021 | 795 comments
Peter Thiel has written about the "10x rule" for startups, where your innovation has to be 10 times better than the second best option [1].

Have you personally experienced such 10x improvements in your own interactions with software? What were they?

[1] - https://thenextweb.com/entrepreneur/2015/07/13/the-10x-rule-for-great-startup-ideas/




+ using Google for search in 2000 and being amazed at how much better the results were than AltaVista and Yahoo Search.

+ Google Maps in 2004, and dragging the map around interactively. This was a quantum leap beyond Mapquest's page reload-and-reset with its cumbersome arrow buttons, and a paradigm shift that let me explore a geography better than any book atlas. I gave away all my atlases.

+ MS Windows Media Player's ability to cleanly accelerate playback of audiobooks and tutorial videos from slow speakers to 2x, 3x, 4x. Windows 7 had this long before YouTube's player had a 2x playback option.

+ The SQLite library: more than a 10x improvement, since I came from the old school of writing custom formats for persisting data. No more dumping memory structs to disk or writing B-trees in C from scratch.

+ The C++ STL in the late 1990s. It instantly reduced the need to write custom data structures like linked lists or in-house string libraries for common tasks.

+ VMware in the 2000s: more than a 10x productivity enhancement, because I can play with malware in a virtual software sandbox instead of tediously re-imaging the hard drives of air-gapped physical machines.

+ Google Chrome in 2008: a 10x quality-of-life improvement, since a crashing, misbehaving website no longer brings down all the other tabs in my browsing session the way it did in Firefox/Opera.

I probably have more than a hundred examples. Some software 10x improvements are more diffuse. The Reddit and HN websites are a much better use of my time than USENET newsgroups. YouTube recordings of tech conference presentations that I can watch at 2x+ are a better use of my time than physically traveling to the venue.
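The SQLite bullet above is the easiest of these to make concrete. A minimal sketch using Python's built-in sqlite3 module (the table and column names here are purely illustrative) shows how much of the old hand-rolled persistence work the library absorbs:

```python
import sqlite3

# An in-memory database for the example; pass a filename to persist to disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")
conn.executemany("INSERT INTO notes (body) VALUES (?)",
                 [("first note",), ("second note",)])
conn.commit()

# Indexed lookups, ordering, transactions: no hand-written B-trees required.
rows = conn.execute("SELECT body FROM notes ORDER BY id").fetchall()
print(rows)  # [('first note',), ('second note',)]
```

Everything the old custom-format code had to handle by hand (file layout, indexing, crash safety) comes with the library.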


> + Google Chrome in 2008: a 10x quality-of-life improvement, since a crashing, misbehaving website no longer brings down all the other tabs in my browsing session the way it did in Firefox/Opera.

Adding to this: just browser tabs themselves were a 10x improvement for me. I had actually forgotten what the world was like before tabs - a whole lot of unsortable windows, and a lot of clicks of the back button.

It was a godsend to be able to open new links in a tab, and to be able to see those tabs organised neatly in front of you.


This was the entry point to Firefox for millions of people.

Unlike today, back then Firefox had a clearly expressed value proposition: Use this! It has tabs! It will change how you browse the web!

Now the only value proposition is some wishy-washy privacy stuff that is much harder to sell people on, even if it's in their best interest.


I believe Opera had tabs for a while before Firefox. I think it was one of the main reasons I used Opera for a good long while.


You are correct - Opera had tabs way before Firefox, and Opera had a full mail client as well without feeling bloated or slow.


The free version of Opera also had ads at the time: https://upload.wikimedia.org/wikipedia/en/thumb/c/c6/Opera_6...


I strongly believe that if Opera had had a cool name, it would have been a real contender. People don't like operas, they're boring and stodgy.


Opera was a paid option. It was doomed from the start.


I paid for Opera (twice) because it was clearly the superior product. People spend money on a better machine, so why not on a clearly better browser? (At that time).

And although browsers are no longer distinctly better than each other, I still donate to Mozilla.


Because you can’t buy a free machine. But you can download a free browser.


IIRC the rendering engine wasn’t that great either at that time.


Opera Presto was the fastest browser (and js engine) in the world at its prime.


Opera was the best


Still remember the mouse gestures to zip around the pages.

What a fun time.


You don't need to feel deprived.

https://github.com/marklieberman/foxygestures

https://otter-browser.org/ (Tools → Preferences… → Advanced → Mouse → [X] Enable mouse gestures)


I was thinking more what the result looked like.


Websites often broke in Opera because they were written for the "IE" standard.


Not really; I was an Opera user when Firefox 0.8 came out and switched pretty much permanently. Opera also had mouse gestures, which I was actively using and still use with a Firefox extension.


Opera had no real tabs at that time. It was still an MDI with a button bar for fast access. It wasn't the only browser doing this at the time, and it wasn't as good as Mozilla's tab add-on, which was later built into the first Firefox versions.

Opera added true tabs some versions later, while still hanging on to the MDI tradition.


To me, the value proposition is still tabs. I have hundreds open at a time. Once or twice a month I will clean up hundreds of windows and tabs. I've created habits of opening most everything in new tabs and new "projects" in new windows.

I think I do this as a self-remedy for my ADHD, I am likely to allow an interruption to disrupt me but I do need to close the loop on whatever I was working on before an interruption. A window or set of open tabs is enough of a reminder to help me close the loop.


I've long opened everything in a new tab, but would you believe that, just two hours before reading this post, I realised I need to consciously make the effort to open new "projects" in new windows? I think the difficulty is working out when the line into a tangent has been crossed.

Tree-style tabs [0] in Firefox help to keep a train of "research" together.

[0]: https://addons.mozilla.org/en-US/firefox/addon/tree-style-ta...


I use a vertical tab extension, and with that I regularly get to > 1k tabs.


Different strokes for different folks but what on earth do you people do on 1000 or even 100 tabs? I have a handful of sites I regularly go to and maybe 50 that I visit more than occasionally. They’re all bookmarked. When I’m done browsing I close the window. What state are you preserving keeping everything open in a tab?


For instance, I go to the front page of HN and open the stories and comment threads that interest me in new tabs. It's not uncommon to have 15 stories with 15 comment sections (that's 30 tabs already).

Then I sometimes open links from comments, or Wikipedia, etc.

Today I have more than 50 tabs open just starting from HN.

Some stories I close when I've read them, but sometimes I think I need to reread them or save them for later.

After the week I usually have around 150+ tabs that I have left for later. I do the triage and either save them to my notes (interesting OS dev articles, or Rust or Postgres optimization tips...) or close them.

And that's just for HN.


Often it will be documentation + stack overflow results. It can be handy to just leave that stuff open, because then you can find it easily. I do purge all tabs every now and again, but when there's no downside to leaving them open, they can build up for a month or so.


What are those 50 sites you visit? I have like 4 which I visit on a daily basis (reddit/4chan/hckrnews/youtube)


Phoenix/Firebird first sold itself on its incredible speed compared to IE, plus popup blocking. The tabs were a nice bonus, but the other two were the features that got all of us pushing the browser on every friend and relative who'd let us, even if they'd never use/understand tabbed browsing (some don't, to this day).


Extensions were the other 10x thing with Firefox. Even now browsing the web without them is torture.


> Now the only value proposition is some wishy-washy privacy stuff that is much harder to sell people on, even if it's in their best interest.

They'll still use Chrome or Edge/IE, but will use NordVPN or ExpressVPN after seeing their favorite YouTuber claim that the VPN protects their data and gives them privacy...

...while still not using an ad blocker, which makes the VPN almost entirely worthless from a privacy perspective. At most, it blocks your ISP from seeing what you're doing.


> At most, it blocks your ISP from seeing what you're doing.

In fairness, that is a nontrivial privacy win for a decent number of people.


That's not totally fair. They also hide your IP from sites you access. So if somebody sent me a link to an IP logger and I clicked it, they would not know what city or state I actually live in, for example.


True. You're definitely right there.


Which is just another reason why it is so awfully frustrating for me that they won't fix their extension APIs:

I fully believe the extensions were really a huge part of Firefox's moat.


In that case, Opera Browser in 2001 was the first tabbed browser I used. Long before Chrome.

I still miss some features from Opera, like mouse gestures. No, plugins and add-ons can't reproduce the responsiveness of a native implementation; they all fall short.


Good to see someone else remembering the mouse gestures. It was what hooked me, together with being a fast browser at the time.

On a side note, I stopped using Opera when they included a torrent client in their web browser. The amount of bloat was staggering.


I remember using the Avant browser to have tabs back in the day, before I discovered FF. I think it was just an IE wrapper.


Wasn't Avant a reference browser to check standards compliance? I recall toying with it. Then switched to a Firefox plugin which used an HTML linter.


I was a teen so I'm not sure what it was. It just had the magic of tabs. Haha


I was thinking of Amaya. Avant is unrelated.


And Maxthon!


Virtual machines have been a huge win, and still are.

Spending 2-5 days setting up a development environment, then archiving a copy of the whole VM made some of the crappier setups usable. Back before rented software (including the OS) that meant one person could grind through the whole OS/tools/IDE/libraries stack and get something we could all use. I've broken software protection to make this work, because it's just such a huge productivity win.

Being able to have a whole test matrix worth of systems virtualised also helps. One place we interacted with MS-Office, so the matrix was Win95/98/Me/XP/2000 (etc), then three versions of Office {cries}. It would have been just unreasonable as a small company for us to have and maintain 2-3 copies of that matrix as physical machines.


I remember when I switched to Linux around 2004. Compared to Windows XP, setting anything up took minutes instead of days, because everything was just so much simpler to automate and so much faster in general.


Freezing and archiving the VMs used to build a particular release is also a pretty nice trick.

It comes in very handy for those rare times when a major customer that (for some reason) insists on sticking with an ancient, obsolete version of your software desperately needs a critical fix.


> Freezing and archiving the VMs used to build a particular release is also a pretty nice trick.

An essential trick if you have long-lasting products. I've been able to (and asked to!) make fixes in ~2015 to code that was written in Turbo Pascal 6. With firmware for embedded devices it's more necessary and often also harder, because often the hardware to flash the firmware doesn't work with new computers. Or you end up with a chain of USB-Serial adapter to serial-to-weird adapter to "programmer board" to actual device... but I knew that when I carefully packed everything into an antistatic bag 10 years ago :)


> using Google for search in 2000 and being amazed at how much better the results were than AltaVista and Yahoo Search.

Aaahhh. The year Google launched, a friend and I were sitting next to each other, at our PCs. Both of us were techie types and into innovative stuff, although working in a software services bigco at the time.

I casually said to him: "I came across this new search engine a few days ago while browsing for interesting stuff. They say they are fast(er than others)."

Gave him the URL. He typed it in, ran a search, saw the results and the time, like 0.0x seconds or so. And muttered "That's fast".

It was Google.


I still distinctly remember the shock of the minimalism of their homepage at the time; compared to AltaVista, jam-packed to the brim with ads and other stuff, seeing just the logo and the edit box on a plain white page made me pause with some "Whaaat's going on???" and "This can't be true" feelings.


It was so lightweight that when they did user tests, users didn't start typing; they just sat there waiting for the rest of the page to load. So they added the copyright notice in the footer, according to Marissa Mayer.

Copyright never has to be declared, and I think Google is in many ways responsible for websites putting copyright notices in their footers without realising it's not a legal requirement.


Ha ha, yes. So streamlined.


Heck, GMaps is slow and clunky for me, but it's still 10x better than anything else I've ever used. I'm pretty sure it'll be the last GOOG service I manage to get myself off. Is there even a single OSM-based alternative that offers competitive POI search?


For me, Google Maps got slow some 5 years back.


Sadly, OpenStreetMap is weakest in POI coverage.

In a typical area Google simply has much better data - and that is before you factor in an interface optimised for selling ads about POIs, and therefore quite good at displaying them.

I map a lot in OSM, and POIs in my city are still light years behind Gmaps. And even with StreetComplete (an Android app that I recommend), which made adding and resurveying opening-hours data easier, this info is often missing/outdated :/


Using Google search in 2000 and being amazed at how much better the search results were than in 2021


In their defense, SEO was not as developed. The average website was not a scam like it is today.


The SEO was there, just optimizing for different things from what Google cared about. Remember pages of keywords with the same color as the background?


I remember developing in the early 90s and there were no open source coding tools. It was miserable. One had to write everything from scratch or buy expensive libraries.


> I remember developing in the early 90s and there were no open source coding tools.

There were open source coding tools in the early 90s (a lot of the GNU tools, including Emacs and GCC, were first released in the mid-to-late 1980s.)


I know what you're saying but the discoverability of such tools, especially on DOS, was poor. Everyone I knew was buying compilers or, ahem, downloading them from other than authorized sources. It was unreal when I was able to buy a RedHat CD-ROM for a few bucks (from Tucows?) and get access to a full tool suite. You couldn't get huge software packages over 56K modems from BBSes.


I did not have access to a Unix machine, the internet, or the GNU toolchain in the early 90s. I was coding on Macs and DOS PCs, using Borland compilers and Think C on the Mac.


And Matt's Script Archive was mid-90s.


Let me introduce you to comp.sources.unix: https://www.krsaborio.net/unix-source-code/research/1987/081...


I still do that, because I use a programming language that isn't popular enough.


Google Maps was amazing. Back when I showed it to my dad for the first time, he expected it to be a live satellite view of his house.


I'm working on a startup trying to make that happen now. There's an immense amount of "free energy" in the form of 120k+ flights around the world every day, with that number expected to increase in the future with air taxis, delivery drones, etc. The idea is to hitch a ride on these platforms and crowdsource (cloudsource?) as much aerial image data as possible to create a map that updates every few minutes. Planet Labs and Maxar are great and satellites aren't going anywhere, but we see a pretty big unfilled opportunity in the remote sensing market.

Apologies for the shameless plug--we're on IG @notasatellite if you're interested in seeing some examples :^)


Interesting concept, but my gut says it could lead to some malicious use cases.


You're spot on, and that's something I think about daily. Satellite-based incumbents have an average revisit rate (assuming good weather) of roughly 2x/day, so hundreds or thousands of passes per day pose serious privacy and operational risks. We rely on guidance from EthicalGEO and other consumer privacy orgs when drafting standards, but the path to the kind of hyper-revisit we're envisioning will need to be trodden very cautiously.


How might this work? Could you give example of privacy heuristics you are considering?


Also curious. For example, if I run a shop and I normally have N cars parked out front, but anyone can see with 5 minute accuracy that I always have at least N-2 cars, they could easily figure out when nobody's watching the shop (or house).

Edit: also, nobody can ever sunbathe in peace again.
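The inference described above takes almost no code. A hypothetical Python sketch (the observation data is entirely invented) of how coarse but frequent counts leak an "unattended" signal:

```python
# Hypothetical car counts in front of a shop, sampled every 5 minutes.
observed = {"09:00": 4, "09:05": 4, "09:10": 3, "09:15": 2, "09:20": 4}

baseline = max(observed.values())  # the "normally N cars" level
# Times when the count drops to N-2 or below: likely nobody is watching.
quiet_times = [t for t, n in observed.items() if n <= baseline - 2]
print(quiet_times)  # ['09:15']
```

The point is that no single image reveals much; it's the revisit frequency that turns innocuous snapshots into a schedule.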


I would expect that most planes fly on specific "routes", leaving a lot of area uncovered. Do you have any solution for this problem?


You're absolutely right, there are huge swaths of airspace that rarely see traffic. Imaging satellites have an inherent advantage when it comes to total coverage but proportionally, a small % of all images captured in an orbital period are of interest at that point in time. We look at our approach as an opposite, complementary offering that optimizes for revisit in well-established and predictable routes around population centers (and whatever else lies in between takeoff and landing).

Most airports are zoned industrial and located within or on the edge of cities and towns, so we can provide more frequent updates of areas that are of greater value to customers monitoring supply chains, construction/site development, estimating traffic flows, etc. High-revisit is equally important at cruising altitude over much of the country where demand for agricultural/environmental monitoring, crop yield data, etc. is of similar interest.

All this to say coverage is a known limitation, but based on those reasons I think (hope) we'll become the yin to satellite imagery's yang.


Why is privacy intrusion so trendy and tempting? I understand this for hard-to-reach areas or for wildlife preservation, but you're effectively putting a real-time aerial Big Brother over everyone's head.

Is technology too widespread? Do we need moral courses, in addition to technical ones?


I understand where you're coming from, but it's much more nuanced than plainly stated "privacy intrusion". For all the good that has come from tools like Google Maps, there's an opportunity to extend that good. For all the nefarious uses, there's an opportunity to pioneer techniques to mitigate those externalities in a way Google and other map data providers have not or will not.

The world is constantly changing and we need better, faster, and cheaper methods of measuring those changes. We can respect (and actively lobby for) privacy while obtaining these results at the same time. Morality and technology should not run parallel to one another, they should both be woven into the products we use every day.


My grandmother had the same expectation.


I cannot imagine a period of time when the STL didn't exist, which perhaps outs me as a younger dev.


There was arguably a worse period than non-existence, which was when it existed but didn't work very well.


I know of the STL but don't know a damn thing about how to use it. They didn't teach the STL at all at my university, even in the C/C++ classes.


We had to write our own Scanner class in our Java classes because the version of Java we were using didn't have that functionality yet. It came out in the middle of the course.


Many video games still don't use the STL due to performance concerns.


They instead use the proper C counterpart, the CTL, written by a game dev. Nobody should ever use std::set or std::unordered_map, and many despise std::string. B-tree maps and the various open-addressing hash tables are 10x better than their STL counterparts (pointer and iterator stability be damned), and std::string only recently got improved via string_view.


> + MS Windows Media Player's ability to cleanly accelerate playback of audiobooks and tutorial videos from slow speakers to 2x, 3x, 4x. Windows 7 had this long before YouTube's player had a 2x playback option.

I'm pretty certain there were third-party players that could do increased playback speed with unchanged pitch before Windows 7.


Funny how Google went from user's hero to zero.


Only on HN.

Otherwise it's the #1 browser by usage by far, which is saying a lot considering it doesn't come preinstalled on Windows or on Macs, so all those users are making a deliberate choice to switch.

So still the hero outside of HN.


Google is gas-lighting people into using Chrome…

Ever used for example YouTube with an alternative browser?

It's buggy, it's slow (and that's on purpose; Google regularly puts quite a bit of engineering effort into using web features that don't work well in the last independent competing browser). But those Google pages have a big banner saying it will work better in Chrome, and asking the users to "try".

That strategy works very efficiently as Google controls the sites most average people use the whole time.

That's one of the reasons why Google's monopolies need to be broken up by regulation. Actually, they're overdue for a break-up.


> [Youtube is] buggy, it's slow (and that's on purpose; Google regularly puts quite a bit of engineering effort into using web features that don't work well in the last independent competing browser).

https://addons.mozilla.org/firefox/addon/disable-polymer-you...

Yes, it's a work-around, not a long-term solution and does not help with the underlying problem. Spread the word and enjoy it while it lasts.


That's not what gaslighting means.


I don't mind using Youtube on Firefox over here, and I don't find it slow.

With adblocker I have nothing to complain about.


I actually understand the grandparent's "hero to zero" comment, though I might not say that it has fallen all the way to zero.

The reason I understand their comment is because there was a period of time when Google was so cool that people wore Google t-shirts without having ever been affiliated with Google. You could find random college students wearing a Google t-shirt like they would a band t-shirt.


I wish I could hear you but there's this echo ....


It's because most HN users' queries are very niche.


No, it's because the search is more and more based on user-tracking data run through some "AI", so the results become garbage for people who don't use the same five sites on the net the whole time, and who additionally prevent Google's tracking as much as possible.


That last point is something I hadn't fully considered, but it is absolutely valid here. I have watched so much content at 2x speed that I would never have cared about in the slightest at 1x. The human brain is strange like that.


Firebug.

Every developer from around 2006-2008 knows what I'm talking about. Debugging JS in IE6 was like trying to build a house blindfolded with both arms tied behind your back. Firebug is when JS went from a web-augmentation toy that could fail silently while your page still mostly functioned, to being a critical part of a web page (many will see this all as a big mistake).


It's interesting that this is kind of a constant .1x * 10x = 1 cycle we repeatedly go through with new platforms. We rarely realize the cost of a new development platform in the fact that we often have to start from zero with documentation, tools, communities, etc. Tools are arguably the easiest to realize on day 1, when all of a sudden you can't set a breakpoint. So it's really .1x development in many ways until someone reimplements these standard utilities to "10x" us back to 1.


Yes. Web technology has advanced so much that now I can do what I was able to do in FoxPro 25 years ago, in just twice the time.


But think about all the things that you can't do in FoxPro, like build an accessible and beautiful user-experience :)


Ever really built an accessible Web-UI? I mean with all that ARIA stuff, that really works for the blind or other disabled people?

An accessible UI that honors the platform specific conventions and integrates into the OS environment?

Ever built a user experience atop ideas like "form follows function" and "don't make me think" that has beautiful design at the same time?

I actually question whether more than a handful of Web-UIs have ever been built that way, if any!


> Ever really built an accessible Web-UI?

Yes. That's what I currently get paid to do.

> I mean with all that ARIA stuff, that really works for the blind or other disabled people?

Yes. If I didn't do that then we'd be sued by our customers for not being accessible.

> An accessible UI that honors the platform specific conventions and integrates into the OS environment?

Web-applications run in the browser, independent of the host OS. While web-applications do tend to adopt some OS conventions (such as having a "close popup" button in the top-left or top-right corner), web-applications on the whole don't really ape Windows' or macOS's conventions, and the past decade of post-Windows/post-macOS web-application UX development shows that following desktop UI conventions really isn't necessary.

> Ever built a user experience atop ideas like "form follows function" and "don't make me think" that has beautiful design at the same time?

Again, yes. (Beauty is subjective, of course - I've been told my work is "gorgeous" by my boss, so I've got at least one endorsement there)

> I actually question whether more than a handful of Web-UIs have ever been built that way, if any!

Fewer than I'd have liked, I admit. At least with today's "cleaner" look it's easier to visually design something that isn't an eyesore.

I appreciate a huge problem is that since the 2010s the learning curve for front-end web work has steepened considerably, and other barriers to entry have been introduced with no sign of them going away. Since the 2010s, when SPAs and modern CSS (flexbox, grid, responsive layout) came into being, you might have noticed that fewer web dev clients today are making remarks like "my 12yo nephew could have made that!".

Not to sound nostalgic, but in the late 1990s through the mid-2000s one could very easily be a very competent web designer getting by in Photoshop, Fireworks and Dreamweaver without ever needing to learn HTML and JavaScript (not that there was much to JavaScript back then either). Today it's increasingly essential to have at least an undergraduate CS degree to even understand how to correctly use React.js or Angular - and to be comfortable with a command-line terminal.

Project requirements for resolution-independent and form-factor-independent web design ("responsive design" - though I detest the term) mean that the days of fixed-layout, Photoshop-first web design work and the age-old Image Slicing tool are over: tools originally designed for WYSIWYG print layout and graphic design are insufficiently expressive to let someone intuitively, visually define how a CSS-driven layout and document flow should work. In ye olden days it was straightforward for a relative beginner to grok how <table> could be used to make a semi-flexible layout - but explaining how CSS grid's `auto-fit` works to the same person is impossible. And so on and so on.

Web-design has gone from the approachable place where COBOL was ("write business rules in plain-English!") to being a hallowed and exclusive techno-religious cult. I can't personally complain because I appreciate the job security I now have, but I am concerned it may lead to the web becoming less and less accessible to independent self-publishers who instead flock to much easier-to-use, but proprietary, walled-gardens like Facebook pages - or to web platforms that hide the complexity entirely, like Wix, and SquareSpace.


Most web sites aren't accessible. A capability is irrelevant if it's never used.

Meanwhile, beauty is in the eye of the beholder. The modern webtech trends are offensive to my eyes - in addition to being ridiculously compute- and visual-space-inefficient.


> visual-space-inefficient

If you’re referring to the oft discussed “information density” issue, my goodness, scroll a little. That spacing is good for usability and, yes, accessibility. Those larger action targets and the space between them are not only good for us with vision impairment but for people whose only computing device is a phone.


> my goodness, scroll a little

That's an anti-solution. Scrolling is bad for your brain and reduces comprehension - it makes it harder for your brain to figure out what's going on.

There's a continuum between "everything on the screen is 6pt text with no spacing" and gigantic mobile-UI buttons on a desktop environment - and modern webdev errs far too close to the latter.

> That spacing is good [...] accessibility.

Big buttons with small text and no borders - which are part of the design trend - are significantly worse for accessibility.

> That spacing is good for usability

Spacing in general? Yes. The ridiculous amounts of spacing that modern webdev encourages? Absolutely not. There are diminishing returns, and modern webdev goes from "reasonable" into "unreasonable" territory.

> Those larger action targets and the space between them are not only good for us with vision impairment but for people whose only computing device is a phone.

My argument had to do with desktop platforms. You can build websites that are usable on phones while being efficient on desktops, either by building two separate sites, or through responsive design - but modern webdev doesn't, instead choosing to build desktop sites that have all of the aforementioned problems for no good technical reason.


First of all, I appreciate you responding. And confirming that I did understand your complaint.

> There's a continuum between "everything on the screen is 6pt text with no spacing" and gigantic mobile-UI buttons on a desktop environment - and modern webdev errs far too close to the latter.

This is probably not out of laziness or lack of consideration, but because even with pixel-accurate pointing devices, larger targets better serve a wider range of desktop users. If you're upset by this, you're better off focusing on improving vision correction and motor skills than complaining about designs that accommodate more people than just you.

> Big buttons with small text and no borders - which are part of the design trend - are significantly worse for accessibility.

I disagree with your interpretation of the design trend. What I see is buttons with text consistently sized, and with padding fitting to the container to halve the cumulative padding while providing the same tappable area.

> Spacing in general? Yes. The ridiculous amounts of spacing that modern webdev encourages? Absolutely not. There are diminishing returns, and modern webdev goes from "reasonable" into "unreasonable" territory.

This is apparently entirely a matter of experience and opinion. But I do the majority of my web browsing on my phone except when I’m doing dev, and I feel exactly the opposite. Stuff is too crammed in. When I do browse on my computer I don’t see more spacing, just larger sizes of everything.

> My argument had to do with desktop platforms. You can build websites that are usable on phones while being efficient on desktops, either by building two separate sites, or through responsive design - but modern webdev doesn't, instead choosing to build desktop sites that have all of the aforementioned problems for no good technical reason.

What you’re describing is the baseline principle of responsive design. It should be available on every device. If you’re using a touch interface on a tablet, no one can know for sure you’re not on a desktop. So, big touch targets are still necessary for those users.


I’ve noticed a pattern of what I get downvoted for and it doesn’t reflect well on this community.

1. Asking people to show a little empathy to others (eg use your scroll wheel so people can benefit from reading a thing or clicking it at all)

2. Asking people to show a little empathy to others (eg recognizing that people who have violent ideologies are dangerous and don’t deserve free mandatory amplification from others’ resources).

3. Asking people to show a little empathy to others (eg not trashing people’s fun/exploratory/educational/hobby projects).

I don’t get downvoted for anything else. I’m sometimes surprised for the silly stuff I get upvoted for. But give half a damn and it’s just like the hateful crowds on Slashdot and 4chan and every platform they’ve flocked to, who just hate seeing anyone give a fuck about anyone or anything.


Personally, I get downvoted for literally anything I type about Apple. Apple zealots will steadfastly defend the honor of the company even in opposition to their own interests.

Bring on the downvotes!


Well for what it’s worth I’m a lifelong Apple user and have no desire to downvote your criticisms of them.


You jumped to a conclusion about what was said (the informational issue). Then you very combatively and rudely dismissed that idea.

It reads like you had a bone to pick with the information density argument, and just wanted to get into a flamewar when you saw someone who perhaps disagreed with you. I expect people weren't downvoting empathy. I think they were down voting a rude attempt at starting an off-topic flame war.


I appreciate you telling me your perspective. I honestly didn’t know how else to interpret the comment, and I had no rude or combative intentions. I have no energy or desire for a flame war. I do have a direct communication style that can be offputting to some. Thank you for helping me remember to pay attention to that.


Some day, before I die, I hope that when I say "If it doesn't work for the developers, then soon it won't work for the customers," people will know what the hell I'm talking about.


That's why Windows Phone didn't go far, right? Developers were spending so much time on iOS and Android that few companies were willing to spend time on another platform.


Microsoft historically did okay with DevEx issues. Turns out Microsoft doesn’t make money off of too many things. A lot of their software ends up being loss leaders to support Office and Windows sales.

I use my tablet occasionally for Numbers, but it never even occurs to me to try it on my phone.


I know what you mean. But I am a software developer so I dunno if that counts, or if you meant that non-developer people would know.


Yes. Web technology has advanced so much that now I can do with a Web-based GUI editor and some DB-to-JSON-Rest framework, what every child was able to do in VB6 25 years ago, just in 10x the time. We're finally almost there.


> Debugging JS in IE6

Visual Studio (even back when it was called Visual Interdev) supported first-class script debugging with Internet Explorer going as far back as Internet Explorer 4. Ditto cscript/wscript.

It just wasn't popular (and few people knew about it) because most people doing front-end web-work _back then_ weren't using Visual Studio, and in general most people weren't doing back-end web-work in Visual Studio either because (pre-ASP.NET MVC), ASP.NET WebForms is, was, and ever-shall-be, godawful.


WebForms is still one of the quickest things to actually build intranet applications with and it still does it incredibly well.

If I wanted an editable table (datagrid) control, it literally took about 20 minutes and I had something working that people could use.

The problem with it (much like Angular which btw is basically webforms for JavaScript) is that it required you learning how the event lifecycle worked properly. Whereas other things at the time were web scripts and you could just hack against.

It of course had a lot of problems (like everything on the web at the time) but for what it was actually intended for, it was actually really good.


I respect that WebForms had a decent concept for RAD, copying VB6’s “drag and drop components, write glue-logic to make them work together”, the problem is that you had to give up a lot of control over the generated HTML, which was a huge problem because the generated HTML was awful, like “1998’s HTML in 2006”-awful, to put it succinctly. WebForms supported item-templates, but using them was hard, and you still couldn’t override other HTML until Control Adapters came out, and even then it was a lot of work and very, very brittle. Oh, and so many third-party components embedded ancient versions of jQuery. It was a mess.

Other complaints about WebForms weren’t inherent to its architecture and should have been corrected early on, such as the lack of testability and over-use of [ThreadLocal] storage, and the nigh-impossibility of running WebForms outside of IIS, but these were never properly addressed by Microsoft, instead we had ugly band-aids like “HttpRequestBase” and “HttpRequestWrapper”. Le sigh.


I will agree with some of your criticisms. However there are some things that aren't pertinent.

While the HTML in earlier versions was awful, that was IME of no concern. Most intranet apps just need to be functional, not slick. I am sure it served different HTML based on the user-agent, but I could be mis-remembering now as it was well over 15 years ago.

The lack of being able to run it on anything other than IIS isn't a concern. Most places using WebForms were Microsoft shops and whatever your opinion of IIS and Windows server maybe they weren't going to be running their ASP.NET app on a Linux server.

Most of the problems I've encountered was when people tried getting around the framework itself and rebuilding what already worked quite well within the framework (many people used to use large string builder objects in the code behind files in the page load event rather than using Web Components and building their custom HTML in there, I suspect it was because they didn't understand the Page/Component lifecycle).

I'm actually trying to get away from using .NET entirely (I am bored of the language, the frameworks and most importantly I don't like working with the developers who can't do anything outside of Visual Studio to save their life). But I do think the framework gets unfairly maligned because of the quality of programmers that were using it. It is the same with VB.NET: the language isn't the best, but the RAD aspect of it hasn't been bettered by anything IME.


> I am sure it served different HTML based on the user-agent but I could be mis-remembering now as it was well over 15 years ago.

It did - but the "different HTML" you refer to was HTML4 for IE5.5+ or HTML3 to browsers it didn't recognize. Even by 2001 that was a bad idea, and by 2004 Firefox was leagues ahead of IE6's support for HTML and CSS features yet ASP.NET WebForms defaults to rendering HTML3-era HTML to Firefox.

...and don't forget all of the problems inherent in user-agent sniffing. Users with too many third-party IE toolbars or shovelware on their computers would see things break (because said third-party software would frequently alter IE's default user-agent string to add their own product-names, which broke things).

> The lack of being able to run it on anything other than IIS isn't a concern

It was a *huge* concern because it meant that web-application code could not be run by automated testing tools! That was one of the design-objectives when OWIN was introduced: for greater testability (for unit and integration testing). Otherwise the only way to test WebForms applications was by having a dedicated web-testing farm with web-browsers open on the desktop driven by custom-written automation tools - that just wasn't an option for smaller shops. And Visual Studio didn't add built-in unit testing support until 2005 (only for Enterprise edition), and 2008 (or 2010?) for everybody else.

Another problem was that in the days of WebForms 1.x (before .NET 2.0) the only way to run ASP.NET applications, even for development, was by having IIS installed - but IIS was not available for Windows XP Home Edition (ASP.NET 2.0 introduced the "Cassini" local development web-server, but it was severely lacking and did not faithfully recreate the IIS environment, and IIS Express wasn't introduced until around 2008). People doing hobbyist development at home had to either pay to upgrade to Windows XP Professional (to use the rather limited IIS "5.1") or apply a variety of crude registry hacks to trick the Windows 2000-era IIS 5.0 to run on Windows XP Home Edition.

> Most of the problems I've encountered was when people tried getting around the framework itself and rebuilding what already worked quite well within the framework

What works "quite well" is subjective. ViewState just doesn't scale and is unsuitable for public Internet websites because it balloons the rendered page-size: e.g. if you have an <asp:DataGrid>, used as-directed such that it's only loaded on the initial page load, then the entire data-grid's data is persisted in ViewState, which practically quadruples your rendered page-size (due to all the hidden control state) - which was unacceptable in the days when 56K modem use was still widespread (even through to ~2006+). Disabling ViewState was essential for any public web-pages, which increased the amount of work required to get things to _just work_. I could go on...


> It did - but the "different HTML" you refer to was HTML4 for IE5.5+ or HTML3 to browsers it didn't recognize. Even by 2001 that was a bad idea, and by 2004 Firefox was leagues ahead of IE6's support for HTML and CSS features yet ASP.NET WebForms defaults to rendering HTML3-era HTML to Firefox.

Intranet sites were run on highly locked-down machines that were running IE. Don't pretend everyone was running their own browser on their machine. As I said, it was intended for intranet sites; all the documentation and all the examples at the time clearly showed these use cases.

> ...and don't forget all of the problems inherent in user-agent sniffing. Users with too many third-party IE toolbars or shovelware on their computers would see things break (because said third-party software would frequently alter IE's default user-agent string to add their own product-names, which broke things).

So they were running third party things and it broke something else that had no knowledge of them. If someone runs hundreds of chrome extensions and it breaks my site, I wouldn't bother supporting them either. The situation isn't any different.

> It was a huge concern because it meant that web-application code could not be run by automated testing tools! That was one of the design-objectives when OWIN was introduced: for greater testability (for unit and integration testing). Otherwise the only way to test WebForms applications was by having a dedicated web-testing farm with web-browsers open on the desktop driven by custom-written automation tools - that just wasn't an option for smaller shops. And Visual Studio didn't add built-in unit testing support until 2005 (only for Enterprise edition), and 2008 (or 2010?) for everybody else.

Almost none of the shops were doing automated testing at the time. People barely do it now in the .NET space.

>Another problem was that in the days of WebForms 1.x (before .NET 2.0) the only way to run ASP.NET applications, even for development, was by having IIS installed - but IIS was not available for Windows XP Home Edition (ASP.NET 2.0 introduced the "Cassini" local development web-server, but it was severely lacking and did not faithfully recreate the IIS environment, and IIS Express wasn't introduced until around 2008). People doing hobbyist development at home had to either pay to upgrade to Windows XP Professional (to use the rather limited IIS "5.1") or apply a variety of crude registry hacks to trick the Windows 2000-era IIS 5.0 to run on Windows XP Home Edition.

So on an operating system that was intended for consumer use, they couldn't use advanced stuff like IIS that the home user wouldn't have any use for? Seriously?

> What works "quite well" is subjective. ViewState just doesn't scale and is unsuitable for public Internet websites because it balloons the rendered page-size: e.g. if you have an <asp:DataGrid>, used as-directed such that it's only loaded on the initial page load, then the entire data-grid's data is persisted in ViewState, which practically quadruples your rendered page-size (due to all the hidden control state) - which was unacceptable in the days when 56K modem use was still widespread (even through to ~2006+). Disabling ViewState was essential for any public web-pages, which increased the amount of work required to get things to _just work_. I could go on...

None of this was intended for sites on the internet. I was specifically talking about the RAD tooling. Yet you seem to be talking about things it was never intended to do at the time.


I used Visual Interdev, but the debugger experience was bad: it was slow to start, and many times it didn’t work (when it worked it was fine).

I’ll add that to the other valid reasons that you mention.


This didn't really help with the DOM or CSS, though. The developer toolbar that worked with IE6 was sorely lacking.


I think you could explore the DOM tree in the Document Outline window of VS and see node properties in the Properties window, but that could be a false-memory of mine - it's also completely undiscoverable, even today.

Support for CSS "debugging" was completely absent, yes.


You didn't like debugging with window.alert()?


For anyone still/stuck with console debugging, “pro” tip:

JS ignores the value of every statement in a comma-separated expression except the last one. So you can do trash like this:

    const foo = (bar) => (
        console.log(bar),
        someComputationOf(bar)
    );
And log-debug without restructuring your otherwise pure function.
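For the curious, a quick self-contained sketch of the same trick (the `double` function is made up for illustration):

```javascript
// The comma operator evaluates each operand in order and the whole
// expression takes the value of the LAST operand, so the console.log
// runs as a side effect without changing what the function returns.
const double = (n) => (
    console.log('doubling', n),  // logged, value discarded
    n * 2                        // value of the whole expression
);

const result = double(21);  // logs "doubling 21"; result is 42
```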


I remember running in a frame set just so I could document.write instead of a million window.alert. Fun times.


This is a brilliant idea. Where were you back then? I would've benefitted immensely from a Xanga or LiveJournal post about this.


It’s really funny how isolated the Webdev world was back then. (Or at least I was).

I take stackoverflow and blogs for granted.

I did have a hand-rolled blog back then (The Hole Report) but was busy writing about magic tournaments and didn’t think anyone else cared about javascript hacks.


this gave me a vivid image of young me flailing around in the dark trying to understand why undefined is not a function


The worst was when I discovered that all console.log() calls crashed because IE6 left “console” as undefined until the developer tools were open.

Fortunately another bad feature of IE was to alert() the errors, so customers were aware of each of them and could easily send us the bug back...
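The classic defensive workaround from that era was a no-op shim installed before any logging code ran. A minimal sketch - the `ensureConsole` helper is hypothetical, written as a function so it can run outside a browser:

```javascript
// If the global object has no console (as in old IE before the dev
// tools were opened), install a no-op stand-in so that stray
// console.log calls don't throw "'console' is undefined".
function ensureConsole(global) {
    if (!global.console) {
        global.console = {
            log: function () {},
            error: function () {},
            warn: function () {}
        };
    }
    return global.console;
}

// In a real page this would run early in <head>: ensureConsole(window);
```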

This bank employee used PrintScreen to print the screenshot on paper. He then successfully faxed it to his email, zipped it and uploaded it to our bugtracker. I had the zip of a .tiff of a scan of a printed screen. I still wonder whether he did it because he was upset, or whether it was a usual workflow.


probably a temporary ad hoc process that is now permanently etched into their brain


[object Object]


I remember filling js files with alert messages back in 2004. Imagine the joy debugging a 15k line js codebase back then.


This.

I worked at Amazon during this time on our custom web server and before that I had been using telnet to hand type my commands or various perl scripts I wrote to emulate a browser's requests. And... ugh, tcpdump to figure out what exactly browsers were sending before they got to apache.

Firebug was like magic!


Prior to Firebug, when I was unfortunately writing ActionScript in Flash, there was Charles [0].

It's still really good!

[0] https://www.charlesproxy.com/


I used to love Firebug, but couldn't help but notice it getting more and more sluggish over the years, with no apparent cause for it. Does anyone know what happened?


Firebug was the same speed, but Chrome got faster and websites felt comfortable piling on more JS as a result.


For whatever reason Firefox decided to write its own dev tools. I remember the Firebug developer(s) asking: why are you not integrating Firebug itself into Firefox?


Whatever happened to Joe Hewitt (one of my developer heroes back in the day)? Did he retire after Facebook?



As someone who was used to Visual Studio's debugger, Firefox getting one felt like an "oh, they finally caught up to 1995" moment.

To be clear, I mean that Visual Studio had a C/C++ debugger, but outside of that, most languages I used in any other situation were missing good debuggers.


I miss Firebug. Simpler times...


I don't. We have chrome dev tools and firefox dev tools which are both literally the same thing as firebug but with a decade of refinement. There are so many great aspects to those tools that firebug didn't have back then because it was the first version basically.


Seconding that. Although Firebug was wonderful, cross-platform development was not (IE and Firefox). An example of "quirks mode" nonsense:

1. The "hidden" element at the root of the IE DOM tree, only accessible with "*" via CSS

2. inline-block was broken on IE6 (but fixable with hacks), and no transparent PNG support.

3. The outline CSS property was implemented the same as background in IE

Best of all was the dog slow IE 6 JavaScript interpreter, which was like 50x slower than Firefox and IE 7+.


Having a proper doctype and a specific IE stylesheet that added the zoom:1 property to elements could fix a lot of that off the bat, and was a simple one to include on a webpage. Transparent PNGs of course wouldn't work (well, there was a way, but it was an arcane trick and I can't remember it for the life of me now).

TBH I found cross-platform development during that time easier than when smartphones came out. There were about 20-30 slightly incompatible forks of the Android Browser/WebKit, and the processors were slow and everything was JS. Blackberry OS was still a thing. I also don't miss the problems with the iPhone 3GS and 3D background layer flickering.

Also, programming cross-platform at the time showed me how to write lightning-fast JS. That has made me a lot of money when people come to me to fix their crap web apps.


> Transparent PNGs of course wouldn't work (well there was a way but it was an arcane trick and I can't remember it for the life of me now).

It was a DXImageTransform Alpha filter. Complete pain to use.
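For anyone who never had the pleasure: a hedged reconstruction of what that filter looked like. This is IE5.5/6-only proprietary syntax, and the helper function here is just an illustration for building the string:

```javascript
// Build the proprietary IE filter string that faked alpha-channel PNG
// support. IE6 could not composite PNG transparency natively, so you
// pointed this DirectX filter at the image instead.
function alphaImageFilter(src, sizingMethod) {
    return "progid:DXImageTransform.Microsoft.AlphaImageLoader" +
           "(src='" + src + "', sizingMethod='" +
           (sizingMethod || 'scale') + "')";
}

// In IE6 you would then do something like:
//   el.style.filter = alphaImageFilter('logo.png');
//   el.style.backgroundImage = 'none';  // avoid painting the PNG twice
```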


And I’m pretty sure this is where CSS preprocessors were born, right around the time of CoffeeScript and the inevitability of the web being a compilation target.


> Transparent PNGs of course wouldn't work (well there was a way but it was an arcane trick and I can't remember it for the life of me now).

In the recesses of my brain somewhere is a memory that it was related to ActiveX, and/or the CSS filter property. You set some special filter value that invoked ActiveX? Maybe I'm remembering incorrectly but if I think about it too deeply I might start screaming and never stop.


Yup, it was something like that. I think it was the opacity property. IE6 required a custom property, so in order to do opacity you had to set at least two CSS properties; it might have actually been more, because Safari/Firefox/Opera/IE mostly had opacity as a CSS extension.

I don’t miss those days.
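A sketch of the cross-browser opacity dance being described - the property names here are from memory of that era, so treat them as illustrative rather than a faithful spec:

```javascript
// Return the set of style declarations needed to fade an element in
// the mid-2000s browser landscape: the standard property plus the
// vendor-specific variants, plus IE's alpha() filter.
function opacityStyles(value) {
    return {
        'opacity': String(value),        // the eventual standard
        '-moz-opacity': String(value),   // very old Gecko builds
        '-khtml-opacity': String(value), // old KHTML/early Safari
        'filter': 'alpha(opacity=' + Math.round(value * 100) + ')' // IE5.5-8
    };
}
```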


Same here. I miss when I could inspect network headers from the console without having to go over to the Network tab. Updating HTML in realtime without committing to an edit was also really handy since I wouldn't lose the context that I was just editing in.


What do you miss from Firebug that's missing from the built-in tools in any major browser?


Personally I hate that you can no longer copy an entire URL and its GET/POST data in a single click like you previously could.

Now you can "copy URL" or "copy URL parameters" but there's no way to copy them all in one go. You need to manually combine them in an external text editor which is annoying.
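In the meantime, one workaround is to glue the two clipboard pieces back together yourself. A small sketch (the helper name is made up, and this only covers GET-style query strings; POST bodies still need separate handling):

```javascript
// Combine a copied base URL with separately copied query parameters
// into the single string the old one-click copy used to produce.
function combineUrl(baseUrl, params) {
    const qs = new URLSearchParams(params).toString();
    return qs ? baseUrl + '?' + qs : baseUrl;
}
```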


"Copy as cURL" or "Copy as Fetch" don't suit your use case?


Debugging Safari-on-iPad issues without a Mac is still like that. libimobiledevice stops working all the time


> Every developer from around 2006-2008 knows what I'm talking about.

Have been a professional developer since 1999 - no idea what you're talking about.


Firebug came out AFTER IEDevToolbar, and IEDevToolbar was superior to Firebug for years for debugging JavaScript.

So not a very good example at all, in fact quite the opposite, it just happened to ride the back of the anti-M$ wave and wasn't at all innovative, copying its features from already existing products.

Conversely, though it was free, IEDevToolbar was only really advertised to MSDN subscribers, so never got much penetration, especially as by then everyone hated IE6 and a lot of Devs worked in Firefox first.

But my company only supported IE6, and when we finally started supporting Firefox, firebug was an annoyingly sub-par experience when debugging js.

Even Chrome's early Dev tools had some sorely missed features when it first came out, as far as I can remember it took them years to support being able to just hold your mouse over a variable and see its value. And because it was a web-page about a web-page you got all these weird bugs. And the CSS editor was (and still is) super annoying with its attempts to cut up your text.

On top of that visual studio also had a JavaScript debugger, I'm sure there were plenty more paid tools you simply didn't know about.


Do you realize that Firebug was the successor to the Venkman debugger and the original DOM Inspector for Mozilla/Netscape?

<https://web.archive.org/web/20051223164847/https://www.mozil...>

<https://web.archive.org/web/20050206082145/http://www.mozill...>

<https://web.archive.org/web/20050206185242/http://www.mozill...>

(Joe Hewitt created DOM Inspector and checked it in to CVS while working at Netscape. Robert Ginda had separately created Venkman. Firebug used the XPCOM interfaces that had been created to allow for Venkman's debugging capabilities, but IMO was always inferior to the much more powerful Venkman, even if Firebug was slightly prettier to look at. PS: shame on Mozilla for breaking those links. Do you even know how to Web?)

> Firebug came out AFTER IEDevToolbar

I'm not sure that even if you focus purely on Firebug and ignore DOM Inspector and Venkman that this is even true. Do you have a source for that?


Venkman was hard to use, and mostly focused on the internals of Mozilla. Firebug wasn't just prettier; it was much more usable, and it focused on actual website code. Hence the difference in popularity.


> mostly focused on the internals of Mozilla

Absolutely not.

> Firebug wasn't just prettier; it was much more usable

Venkman made some debatable decisions about UI defaults, as is the case with a lot of tools built by old school programmers trying to solve their own problems, but it was more customizable, flexible, and powerful (as is the case with a lot of tools built by old school programmers...). Aside from Firebug's template-based rich console, every feature implemented in Firebug's debug pane was implemented in Venkman, except Venkman gave you even more latitude and control. Firebug's debug pane was a basic UI over a subset of Venkman's APIs, after all, as previously mentioned. It was only many years later (much closer to Firebug's death than its birth) that Firebug got some improvements like showing a variable's value in the source window while stopped at a breakpoint without having to type it into the console (or set a watch on it, or look at its activation record entry). Venkman never had that, but it pales in comparison to all the things that Venkman could do that Firebug did not.

Firebug made its own share of dumb decisions, and the degree of dumb in those instances was higher.

> [Firebug] focused on actual website code

I'm not sure how it could be "focused on actual website code". Venkman was a straightforward JS debugger. (The only way this could possibly make sense is if you're referring to all the stuff baked into Firebug outside its debug pane—style sheets, live view of document nodes, cookies, and so on; all things outside the purview of a language debugger—in which case it becomes an apples and oranges comparison. It would make more sense to criticize DOM Inspector for any of those things than Venkman. And once again—there were plenty of dumb decisions by Firebug where that comparison is concerned.)


> if you're referring to all the stuff baked into Firebug outside its debug pane

Yes, you're focusing on the debugger only. The fact is, most people simply did not use Venkman since it was hard to understand how to use it, for what context it was meant. Firebug tied the debugger into a wider context, making it more accessible, and reached a level of popularity that Venkman never had.


The point remains: Venkman was the better debugger.

> a level of popularity that Venkman never had

You're not saying anything I don't already know.


That feels a bit like advocating that it was mainly Betamax which changed consumer habits in the 90s since it was technically superior to VHS.

It may very well have been the case for a small number of MS centric devs, but for the large majority of users, it was Firebug that really changed the game for the bulk of Frontend devs back then.


As a user of both Firebug and IE's Devtoolbar I can state this is simply not true.

While I agree that IE's devtools had more features regarding XPath integration, you also had the problem that it was based on libxml (yes, the reason trident was exploitable for decades, and probably still is).

IE's devtools didn't have access to the DOM and only had access to the HTML/SGML/actually XML representation _before_ it was parsed into the DOM.

This gave you hundreds of error scenarios where debugging user events was simply impossible if they caused the DOM to change, something like adding a node or debugging a parentNode was impossible in trident.

And oh how often have I seen bugs in websites that were relying on specified flow roots that behaved differently in trident. Something like an unclosed <p> in the wrong place could easily mess up everything in trident and switch it to quirks mode.

Remember IE was exploitable via a parentNode.remove(parentNode), too? The hooks for devtools was the underlying reason.

I am not sure why you claim that IE had superior devtools experience. You must have done a lot of ActiveX related development, because everything else was impossible to debug in my opinion.

The best thing MS did was to create the Edge team that tried to refactor trident, and soon realized that it's impossible and instead started over from scratch.


At least Firebug was attached to a good browser. IE6 was terrible to use so being able to develop in Firefox with a decent debugging experience improved web development


> 10 times better than the second best option

It's not who was first, just who seemed 10x better than the next best option.


Here are a few that I did not see listed by others:

1. Automatic device discovery and driver installation (e.g., with USB devices (also USB device categories, etc.)). Instead of trying to find a driver, things just worked.

2. Automatic updates. Keeping everything updated, largely, fell into the background.

3. Graphical integrated development environments (IDEs) for software development. I realize editors can be contentious, but tab completion of variable names, automatic identification of methods within scope, syntax highlighting, easily dropping breakpoints, etc. are, in my experience, wonderful improvements on productivity.

4. What you see is what you get (WYSIWYG) text / image editors. Thankfully, I did not spend much time in the prior era, but it was, at times, maddening to get something to format correctly.

5. Ad blockers / reader modes. Again, I know these can be contentious, but, for me, these reformatting services are sometimes the only way to make some websites practically readable.

I strongly second:

-The rise of memory-managed languages (e.g., Java, C#, etc.) with pretty robust default library sets, especially for string manipulation, graphics, and network operations.

-Moving map software, especially for mobile GPS mapping.

-Spreadsheet software.

-Being able to easily search for answers to fairly technical programming problems, compiler errors, etc. along with better access to online documentation.


>1. Automatic device discovery and driver installation (e.g., with USB devices (also USB device categories, etc.)). Instead of trying to find a driver, things just worked.

I remember well when installing Windows went from "make sure you have all the driver CDs before you start" to "just make sure you have the network card driver and the disk driver (if needed)" and then it went to "as long as you can connect to the internet Windows Update will get everything".

Before you had to get on the internet and find all the driver files yourself. The last hold out was graphics card drivers IIRC.


In my experience, you'll still want to go to the NVIDIA/AMD website to download the latest drivers. What you get from Windows Update is likely 6+ months out of date.


True, but you at least have a functioning display out of the box, which lets you get to that website.


You poor souls running windows... even Linux can do this with mostly no fuss today.


We must have received a very different version of nvidia.ko if yours updates with no fuss.. either that, or you are using intel/amd in which case lucky you :)


No fuss in my experience. Some distros use dkms, some rebuild the driver on the build farm as a package corresponding to each kernel update. Both are quite reliable.


Sure, but from the user's perspective Linux operates in exactly the same way as Windows. A boatload of drivers available directly from the vendor but they're out-of-date (or in the case of Nvidia not the official drivers). And it's much much more of a PITA to install out-of-tree drivers on Linux than Windows.


I feel like automated kernel module compilation when updating to a new kernel (DKMS) is in a very similar ballpark for me as a user.


I’m at the point now where I won’t code without an IDE. Automatic imports, code completion, running a linter on save, nice git diff displays, finding all usages of a function - this stuff makes my life so much easier.


I'm the opposite: I've come to rely on auto completion so much I cannot spell English. I've had to toggle it off.


> 1. Automatic device discovery and driver installation

Fun story. Had Windows 7 installed on an old HDD. Decided to build myself a new PC, so I got all the parts (completely different setup than what the HDD had been in), put them all together, connected the HDD, and powered it on to see what would happen. I was shocked to see that Windows booted just fine with the new setup, like nothing had changed...


Yeah, Windows got a lot better at handling motherboard swaps.

I remember trying to do a PC overhaul with Windows 2000. I swapped out the motherboard, RAM, and CPU, and Windows would fail to boot, crashing with an INACCESSIBLE_BOOT_DEVICE error. IIRC, I managed to recover without wiping the drive and doing a clean install by putting back in the old mobo/RAM/CPU, booting up, and swapping the IDE driver from a motherboard-specific driver to a generic IDE driver (which comes at a significant performance penalty because I lost UDMA support), then swapping back to the new mobo. It booted fine after that, and I was able to install the proper motherboard drivers to get UDMA support on my IDE drive.


Back in the day, I had a Windows 98 install that had been through two or three very different motherboards. It was a good day if it bluescreened only once, because it usually did more often.


It's a wonderful feature - I've now been through 3 full system swaps with the same OS. The only exception is software that ties its license to a kind of hardware identifier. Even then it's a very minor inconvenience compared with the Win 98/XP years. I have a few teenage memories of pulling all-nighters just to reinstall my OS and programs.


Yep, and you could clone it with xcopy.


You gotta do WYSIWYG right, though. It seems like I break MS Teams once a week and have to start formatting part of my message over again. It's like someone stopped in the middle of implementing its Markdown support. I also frequently find myself trying to get the cursor out of the end of a block of formatting and back to normal formatting. I vaguely remember hearing complaints about Slack's editor at one time, too. There has to be a better way


Gmail.

Using web-based email clients was a nightmare before Gmail. They had limited storage space, and the UX was pretty bad, they were hard to search, etc. You spent all your time figuring out what you wanted to delete, or seeing your emails bounce when people had full inboxes. If you didn't log in for a while, your account would disappear.

And then suddenly, you got a GB of storage. For free. No questions asked. And its UI was simple and easy-to-use. And you could search it.

A lot of other products are 10x better in individual areas. For instance, Google Sheets was much more portable/shareable than Excel when it launched. But even today there's no comparison, Excel is superior for actual spreadsheet functionality. But Gmail was better on every axis, even against local clients like Thunderbird and Outlook.


>And then suddenly, you got a GB of storage. For free. No questions asked. And its UI was simple and easy-to-use. And you could search it.

Also don't forget that Gmail at the time had the most intelligent spam blocking algorithm compared to AOL/Yahoo/Hotmail/etc.

It was a big enough deal that some observers that switched to Gmail considered the email spam problem as "solved" because Gmail seemed so good at it. (On the other hand, many independent people trying to run their own SMTP servers think that Gmail is too aggressive with spam filtering because it also blocks many legitimate senders with low/unknown reputation.)


Funnily enough, Gmail can be on the receiving end sometimes too, e.g. being blocked in SORBS.

The response was:

  451 Currently Sending Spam See:
  http://www.sorbs.net/lookup.shtml?209.85.215.41
Although some may argue that SORBS is a bad system, anyway.


Is there a better spam filter today?


Define better. Gmail does a pretty good job for most use cases, but it's all about compromises. One person's spam is another's ham, so you will unavoidably end up with false positives and/or false negatives. Some organisations might want a system where they can fine-tune those trade-offs themselves.
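As a toy illustration of that trade-off, here's a hypothetical word-weight spam scorer (the word list, weights, and threshold are all invented, nothing like a real filter): moving the threshold down catches more spam at the cost of flagging more legitimate mail.

```python
# Hypothetical spam scorer: average the "spamminess" weights of the words
# in a message and compare against a tunable threshold.
SPAMMY = {"free": 3.0, "winner": 4.0, "click": 2.0, "offer": 2.5}

def spam_score(text: str) -> float:
    words = text.lower().split()
    return sum(SPAMMY.get(w, 0.0) for w in words) / max(len(words), 1)

def is_spam(text: str, threshold: float = 1.0) -> bool:
    # Lower threshold: more spam caught, more false positives on ham.
    return spam_score(text) >= threshold

print(is_spam("click here you are a winner of a free offer"))  # True
print(is_spam("meeting notes attached, see agenda"))           # False
```

The interesting knob is `threshold`: a personal account might run it low, while a business that can't afford to lose a single customer email would run it high and eat the extra spam.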


I studied Google's file system. What Google figured out is that, with the rise of very fast networking such as 10 Gig Ethernet and faster, the network is much faster than local disk. Files were spread across multiple servers so they could all stream different parts of the file off their local disks simultaneously to the client computer faster than the local disk on any one computer could run. Thus, you could have systems like Gmail that could run much faster than even local disk based email clients, even with thousands of users.

Other providers were probably using expensive NASs with huge profit margins built in. Google was using thousands of the cheapest crappiest commodity parts because it was all triple redundant... and it worked faster because the network was really fast and multiple computers could stream different parts of the same file to clients.
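The parallel-streaming idea is easy to sketch. Below is a toy illustration (not Google's actual API; the server names, chunk placement, and simulated reads are all made up) of a client pulling a file's chunks from several chunkservers concurrently and reassembling them in order:

```python
# Illustrative only: each "chunkserver" streams its chunk off its own
# local disk at the same time, so aggregate throughput can exceed any
# single disk. Network reads are simulated with a local function.
from concurrent.futures import ThreadPoolExecutor

def read_chunk(server: str, chunk_id: int) -> bytes:
    # Stand-in for a network read of one chunk replica from `server`.
    return f"data-from-{server}-chunk-{chunk_id}".encode()

# Hypothetical chunk placement: chunk index -> server holding a replica.
placement = {0: "cs-01", 1: "cs-07", 2: "cs-03", 3: "cs-11"}

def read_file() -> bytes:
    with ThreadPoolExecutor(max_workers=len(placement)) as pool:
        futures = {i: pool.submit(read_chunk, s, i)
                   for i, s in placement.items()}
        # Fetches run in parallel; the client reassembles in chunk order.
        return b"".join(futures[i].result() for i in sorted(futures))

print(read_file()[:30])
```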


https://static.googleusercontent.com/media/research.google.c...

Very, very influential reading back in the day, and still interesting.


It also came out at an interesting time, because everyone was trying to push data-to-redundancy ratios to their limits. Since storage was so expensive back then, storing multiple copies of data made little sense when looking at it from a data storage view, even if the speeds were much better

Then Google dropped their MapReduce paper: https://static.googleusercontent.com/media/research.google.c...

Which quite literally paved the way for modern data processing, and works extremely well with the Google Filesystem architecture
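The programming model from that paper fits in a few lines. Here's a minimal single-machine word-count sketch of the map/shuffle/reduce phases (the real system distributes each phase across many machines sitting on top of GFS):

```python
# Word count, MapReduce-style: map emits (key, value) pairs, the
# framework groups them by key (the shuffle), and reduce folds each group.
from collections import defaultdict

def map_phase(doc: str):
    for word in doc.split():
        yield (word, 1)

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    return (key, sum(values))

docs = ["the quick fox", "the lazy dog", "the fox"]
pairs = (p for doc in docs for p in map_phase(doc))
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts["the"])  # 3
```

The reason it pairs so well with GFS is that the map tasks get scheduled onto the machines already holding the input chunks, so the "move computation to the data" trick falls out of the architecture for free.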


Yeah, and then everyone took hadoop and threw it on a NAS or they provision it in the cloud and...throw it on a NAS. Always scratched my head on that one.


Gmail also automatically saved drafts. I can't tell you how many long emails I wrote and lost before hitting the send button with other web email UIs.

Gmail was not just 10x. I think it redefined what a good web-based email experience could be, and it completely changed what people realized the web browser could be, and what they expected from it, from an interactivity standpoint.


And conversation view. Hiding the quoted email being responded to.

Email before always looked like how Twitter threads do today.


Not just a GB. It was also constantly increasing in size, with a live counter to show how much storage you had available. Super gimmicky, but fun.

I guess there are probably a lot of people here who are too young to have ever used those early versions. You also had to scrounge forums for an invite code.


I made my account(s) off invite codes and have "just my last name" at gmail dot com as an address... it's one of my most prized possessions, to this day!


Hanging out in IRC channels in early 2000s was how I wasted my teenage years. But I did get a gmail invite out of it leading to my primary email address to this day: arithmetic@gmail.com


That's awesome, man.


I managed to snag my (very common) italian last name. I get a flood of messages every day. I both love and hate my account.

I don't even speak italian!


I get the same thing. My last name isn't that common though so I've messaged the culprits specifically (especially if they use their phone number when signing up for whatever it is) to ask them to please stop using it.

Someone will get the idea and sign up for 5-6 things all in one burst, which usually inspires my outreach.


do you receive any interesting emails?


I also have <last name>@gmail.com. It’s not a super common last name, but common enough that I get emails almost daily from people accidentally omitting a first initial.


I convinced my mom to buy me a beta invite for $5 on ebay and then spread my invites out to my family and friends.


Gmail was amazing when it launched...I was so excited to get an invite from a friend during the beta period. It made the crappy POP sync for my ISP email account look like a joke. Funny thing is, closing in on 20 years later, I dislike the latest generation of Gmail's web interface so much that I'm back to using a desktop email client.


I dislike it so much out of the box, or with a new account. When I help older people, they look at Gmail like it's written in hieroglyphics they can't understand, because everything is an icon instead of a word. Then there's the sorting of your mailbox for you by default, and the threaded emails. Features with good intentions that just make it impossible to find the emails you are searching for.


> or seeing your emails bounce when people had full inboxes.

For those who don't remember: That was around the time when Yahoo offered 6MB, some others only 2MB.


Found this CNET article: https://www.cnet.com/news/google-to-offer-gigabyte-of-free-e...

They say 2 MB Hotmail, 4 MB Yahoo. Gmail went straight to a gigabyte.


It was so crazy people wrote code to use gmail as a network drive.



Today a single email without attachments can get close to those numbers!


How many emails was that back then?


The Gmail web interface also (re)invented AJAX. The number of web applications that used XMLHttpRequest increased exponentially after the success of Gmail.


It was also written in Java, which was then compiled into client-side JavaScript(!)


Not the original version. The current version is a hybrid of code compiled from Java (J2CL, previously GWT) mostly business logic shared with Android, iOS, and the server backends, and a UI that is built with Closure Library. It's monolithically compiled and globally optimized via Closure Compiler.


They announced it on April Fools. And it seemed like a joke - 1GB for email? Yeah. Right. Hotmail had, what, 10MB? 15?

Best April Fools joke ever.


Pretty sure Yahoo was 5MB and Hotmail 2MB when Gmail came out.


Seems ancient at this point, but a total game changer at the time. I can actually recall getting my invite in 2006.


I bought invites on eBay for myself and a few family members. Maybe $5 each. Although I thought it was closer to 2004.


> If you didn't log in for a while, your account would disappear

That's still true for gmail, but the time is more reasonable at two years.


Agreed. Too bad it seems that email's best days are over now, as email nowadays is mostly for notifications.


>Too bad it seems that email's best days are over now, as email nowadays is mostly for notifications.

Email is still heavily used for business-to-business communication between humans. Talking about business matters is still more natural via email until the participants know each other well enough to switch to texting.

But yes, for personal communication, friends & family have shifted from email to phone texts. E.g. my friend who graduated from college in 1990s used to communicate with his parents with 100% email but now it's 100% text messaging. Email is too much friction for personal comms.


"Notifications" is being generous here. The average user's email inbox is 90% marketing spam.


Any sources on that? I’m guessing the 90% is hyperbole, but I still wonder how much spam the average user actually gets. (For me, close to absolute zero)


It has taken me over a year to wrangle my inbox into shape (unsubscribing from all the marketing email that isn't technically spam and so gets through Google's spam filter), and I would say anecdotally that my incoming email has decreased by at least 90%. I go multiple days without receiving emails now.


Pretty much every company/service out there opts you into spam unless you proactively opt-out of it (which takes effort and knowledge to work around the dark patterns).

Looking at my password manager I've got ~260 logins right now and keep in mind that I don't do social media and try to avoid creating accounts as much as possible (and delete the ones I don't use for a long time), so the average user is likely to have a lot more accounts.

Even if each one of these companies only spammed once a week (most will do more frequently if you let them), that'll already significantly outnumber the amount of legitimate e-mail I receive.


Was Gmail really 10x better than Hotmail?


Hotmail was 10x worse than all alternatives. It was an ugly web page, not an app.

Gmail took XMLHttpRequest and its ActiveX fallback (the technique soon dubbed "Ajax") and proved to the world that we can ship a robust app inside a web browser.

It was well designed, had no ugly banners like Hotmail did, it was really fast, simple and working like a desktop app.


For historical context: XMLHttpRequest was invented for Outlook Web Access, so the idea and use case preceded Gmail.


The first time I saw it was in Google's Orkut, their first failed social network. But it did have cool AJAX.


In terms of mailbox space, it was 500x Hotmail. Before Gmail you had to delete your old emails because if your mailbox was full it would stop receiving. The whole approach of "archive" and being able to search your entire email history from any computer in a webmail interface came from Google.

There was POP3 before this of course, to keep all your mail locally and empty out the server box. But that only works if you have a single computer that you check mail from. Even back in 2004 when Gmail launched that was a non-starter for me, I had email at home and at school.


Gmail was 100x Hotmail, if not even more than that. I think anybody who was on Yahoo Mail/Hotmail (or even something like Roundcube) who switched to Gmail probably realized how much these other companies had been holding stuff back.

I've never cared for invite only stuff, but Gmail really was a total revolution when it came out.


I thought the “invite only” approach was clever. It throttled growth so they could maintain a good user experience and limited automated use by spammers.


> Was Gmail really 10x better than Hotmail?

No.

Gmail did not suck badly enough to be merely 10× better than Hotmail.

Gmail was lightyears beyond other early free webmail services.


Hotmail had a 2MB mailbox limit. Gmail launched with 1GB.


Sadly, this isn’t even snark.


I don't use either, but the Hotmail/Live/Outlook system is a disaster.

I think people who have addresses there must just use them as throwaways, as they don't seem to have much utility. You can be a small email sender with everything set up right: clean IP, no block lists, SPF, DKIM, MTA-STS. You can have no problems with any other major email provider. You can be signed up to SNDS. And they will still routinely block your IP for no reason; it will not even show up in SNDS, and you can't resolve it through the tools they provide to mail senders on SNDS. So you have to go through this stupid process of filling in a support form, which gives an automatic reply saying there is nothing to fix and you can't reply. But you do reply, and then they fix it until next time.

There is no innovation there. I guess some accountant has determined that paying sweat shop labour to untick an IP every time their brain dead system blocks the same sender with no spam history is cheaper than actually fixing their systems. This is a flashback to 1990s Microsoft where their software was buggy as hell and your support options were power cycling or reinstalling.

They not only put their company name on this mess but offer it on outlook.com which creates an association with the pro email solutions they sell to a massive enterprise market. They should be embarrassed.


What gets me is they don't honour whitelisting email addresses when their reasons for blocking can be really spurious.


Long story short, yes.


Hotmail offered 2MB of space, and Yahoo proudly claimed "twice the space of other providers" with their 4MB offer. I had a GMX.de account that was providing 25MB and felt like a king. Then April 1, 2004 came and nothing was the same anymore.


I didn't have that experience with Gmail; I had it with Oddpost, which was 2-3 years before Gmail. Oddpost reproduced Outlook in a webpage (3 panes: folders, list of items in the selected folder, contents of the selected email).

Oddpost was bought by Yahoo.


Gmail was so good that when it was in beta i sold invites for £200 each.


- SSH. Maybe not in terms of performance or efficiency, but before SSH, we were using telnet everywhere and just sending passwords in plaintext all over the internet. Plus, you could give someone your public key and they could give you access to a server instead of the "here's a temporary password, change it as soon as you log in" approach.

- Perl. This was at the time when the other languages available to me as a student were Java or C (mid to late 90s). Those were fine, but Perl definitely felt 10x more productive for me for the things I actually wanted to write. Plus CPAN was the first directory of libraries/modules that I'd encountered of its ilk.

- VMWare/virtualization. We used it for an Operating Systems class so we could learn by actually writing Linux kernel code and running it on a VM. This was huge at the time. Friends at other schools taking Operating Systems had to work on dumbed down simulations and "teaching" OSes. Before VMWare, if you wanted to work on the kernel, you had to have spare hardware and a lot of patience for re-building your system when you did something stupid. With VMWare, you could just restore from a good snapshot and try again.

- apt-get. Coming to Debian from (old, pre-yum) Redhat, being able to type a command and reliably install pretty much anything was a huge improvement over untangling RPM dependencies. Even RPMs were a pretty big improvement over manual compiling or Windows-style installer wizards.

- Numpy (or "Numeric" as it was called at the time). Vector math in clean Python that was mind-blowingly efficient. The only other option that really balanced performance and high level accessibility was MATLAB, but that wasn't suitable for using in an application.
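For anyone who never felt the Numpy difference firsthand, a tiny example (nothing here is specific to the old Numeric API): one vectorized call replaces an explicit Python loop and runs in compiled code.

```python
# Sum of squares of 0..999, as a single vectorized operation.
import numpy as np

a = np.arange(1000, dtype=np.float64)

# The loop lives in compiled code, not the Python interpreter.
sum_sq = float(np.dot(a, a))

# Pure-Python equivalent (what you'd write without Numpy) agrees:
assert sum_sq == sum(x * x for x in range(1000))
print(sum_sq)  # 332833500.0
```

On arrays of millions of elements the vectorized version is typically orders of magnitude faster than the generator expression, which is exactly the MATLAB-like balance the parent describes.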


> apt-get.

I saw a joke the other day that went something like:

"What, is your grandma so senile that she can't type ./configure; make; make install" ?

Installing software on linux truly sucked before package managers.


> Installing software on linux truly sucked before package managers.

Although UN-installing software was possibly even worse!


Who you calling grandma! :)

Ahh, reading the Makefile to find switches you need or don't; documentation, ain't nobody got time for README, yah I'll take the default switches; also moving where the binary will be installed, sometimes an obscure library will not be in the path, have to find it, oops, wrong version, upgrade lib, oops, breaks backward dependency of something else, revert, put new lib in different place, mod path, make clean ... First time I used yum, I thought, my God, my barbarian life!

Also, kernel patching roll-back! If you put in a kernel patch that broke stuff, go get the LiveCD and make some coffee and call your spouse, you gonna be there a while.

Network booting, that too. I mean, is network booting 10x local boot? I can't say, but you're not going to make a data center filled with tens of thousands of nodes if you can't network boot.


That joke persists in the DeFi / yield farming space right now, except with sillier names because the main services have names like Sushi and Pancake


Sorry to have ask, but do you mind explaining this joke? I'm trying to fit in "grandma so senile she can't Pancake; sushi install" but it's not clicking.


Something like "She can't mint aUSDC to farm CAKE".

there are better tweets and memes floating around about how convoluted yield farming really is


Not just packages managers, but repositories. When I started using Linux, I was still downloading or building RPM files, and manually working out dependencies. The idea that you could suddenly netboot a tiny image and `apt-get install xfce4-desktop` was amazing.


When I installed my first Linux (Red Hat 9) I was trying to install Gaim on it. I was trying different methods and I managed to install it by source compilation but not through rpm.


Yeah, I started out on Red Hat (maybe 3 or 4.x) and spent years fighting with RPM dependencies. At one point I installed one of those source only distros on a spare machine for fun ("Source Mage", before Gentoo took over that space). For all the misery involved in having to compile everything from scratch on slow hardware, I was absolutely blown away that mplayer with proprietary codecs just worked out of the box. It always took me hours to get the right combination of packages on Red Hat to make that work and would frequently break whenever I updated anything.


> - Perl

For me, Python. I knew Perl was the "leet" language and was determined to add its power to my toolkit. I made several (three, I think) weeks-long intensive efforts to master it. Every time, I read the book, read PerlMonks every day, did exercises, and got to the point of writing useful scripts using pretty powerful features. I'd think "I've finally got it!" and would be a Perl programmer for a while, but eventually a few weeks would go by without any need to use Perl, and it would turn to mush in my mind, and I'd have to start over again at the beginning.

One day in 2000 I was yet again failing to do things in Perl that had felt easy six weeks ago, and I thought, "Huh, I think I remember how to do this in that language Python I fiddled with for a few minutes once after reading some blog post." I started typing Python and was very quickly able to get it working.

I didn't really want to write such easy, bland, straightforward code. I wanted to be a wizard who could do amazing things with a few cryptic characters. Writing Python was like doing a magic trick while simultaneously explaining exactly how the trick was done. It reduced my style points to zero. But I liked how easy it was, and Perl kept brutally showing me that I wasn't cut out to be a wizard.

Twenty years later, and Python is still an important part of my toolkit.


Yeah, I switched to Python in 2002-2003ish when I started noticing that all the interesting new things I was seeing were using it (and I had to do some scientific/graphics stuff where Numpy and PyGame were much nicer than any of the Perl options). I probably haven't written any Perl in 15 years now, but I have to respect it for what it was at the time. When I learned Perl in 1996 or so, I didn't even own a computer; I just had access to a shell account on the university's unix server that I could telnet into from any computer lab. I basically got started by reading manpages. As hard as that was, it was far more accessible to me than figuring out how to get whatever weird C compiler there was on there working or getting a machine in the one computer lab that had a Java IDE installed on the desktops.

The "wizard" aspect was part of what eventually drove me away from Perl as well. I was never really into the whole JAPH/golf thing and I worked hard to write Perl that was well structured and readable, and there were plenty of other Perl programmers like me. But there was also always that part of the community that wanted things to be as obscure and obfuscated as possible and they were very visible. Python's complete rejection of that was really refreshing.


Good for you, but I personally don't want to read or edit any Perl code again if I can avoid it.


I haven't really written any Perl in probably 15 years now and I mostly avoid it if I can. But put yourself in my shoes back in 1996 or so as a student. Didn't own a computer of my own but had access to random computer labs on campus and an account on the university unix box that I could telnet to. My options for programming were to try to get a spot in the one computer lab that had Java and C++ compilers (and carry around a stack of floppy disks with code I was working on). Those were always booked solid and not open at odd hours (I was a Physics major, not necessarily taking a CS class that would grant me priority access). Or, I could find any Mac or PC in any computer lab, including the 24/7 library ones, telnet in, and write real, useful programs in Perl. The Perl man pages were available right there (and extremely well written), everything was stored on the server, the Perl standard library was pretty comprehensive compared to what was included with most other environments at the time, and my editing environment (pico at first, then vi and emacs as I learned) was consistent no matter what computer I was on. As much as I wouldn't choose Perl for a project in 2021, it was really a game changer for me in 1996 and I'm not even sure if I would've transitioned from Physics to programming if it weren't for Perl.


Joke's on you for trying to read code in a write-only language!


Steam for PC Gaming.

At first, it was just annoying DRM. But it was convenient.

- Before that, you had to manually update your games in order to play the latest version

- Without no-cd cracks, you were required to leave a CD / DVD in your drive

- with the addition of steam workshops, installing mods for certain games became easier. You didn't have to manually copy paste files.

- you have one central friend list and invite system which many PC games use. It took some time until more companies launched their own launchers and fragmented this ecosystem again.

- save game cloud backups became the norm. No need to manually backup a savegame folder if you want to ever reinstall a game.

- the refund system is user friendly (refund if you haven't played for more than 2 hours)

- steam link allows you to stream your games from a PC to other clients locally or through the internet

- steam remote play together allows you to stream a game to a friend for remote couch-coop. Other player doesn't need to own the game and since a recent update, doesn't require a Steam account.

- family sharing lets users easily share a whole game library with friends and family

- big picture mode offers a great gamepad focused UI which is ideal for living room gaming PCs on the TV

- enhanced controller settings that let you configure the Steam Controller (and, after some updates, other controllers too) per game. This even works if the game doesn't have official controller support.

- compared to other launchers it is really fast…looking at you "Xbox Game Pass for PC Launcher" thing


> - Without no-cd cracks, you were required to leave a CD / DVD in your drive

Speaking of which, prior to Steam, I'd definitely consider "Making an ISO of your favorite game and mounting it as a virtual drive using DaemonTools" to be a 10x innovation.


Thanks for reminding me about this! I've just experienced a huge nostalgia rush! Oh, I wish I could go back 15-20 years back just for a day!


Ha, and you'd sometimes have to turn on DaemonTools' SecuROM defeat. CloneCD was also probably the best way at the time to reliably make ISOs.

Slightly related, but moving from doom9.net DVD-rip tutorials (make a VOB, encode to DivX for decent size) to simply using Handbrake and h264 was awesome.


thanks for reminding me about DaemonTools!


Still have all my ISOs backed up somewhere, and a .txt file with the serial codes in. You never know!


Is there an OS these games run on that can run on modern hardware or be emulated now, with good enough support for 3d drivers, etc.?


Oh yes. Been there.


> - the refund system is user friendly (refund if you haven't played for more than 2 hours)

Well, depends on your expectations.

* It wasn't first (that was Origin)

* It wasn't there for a long time after the store launched

* It doesn't meet minimum requirements of consumer law in places such as the EU or Australia.

The rest I agree with though.


> It doesn't meet minimum requirements of consumer law in places such as the EU or Australia.

I'm in Australia. What consumer laws do we have about games?


I forgot how annoying things were before Steam. Manually patching and updating games, making sure the disc you bought had a key that wasn't already used.

That being said, I do miss the little game guides that would come with your game. StarCraft 1 comes to mind. It told you what every unit did, and it was a handy little reference companion wedged behind the front panel of the CD case.


Steam actually has a built-in feature for games to upload manuals, where they get a dedicated link[1]. Hardly any games bother though.

[1]In the current version of Steam, if there is a manual, it's at: Right click the game->Properties->General tab.


I also (even though it was a form of DRM) miss some of the CD keys that existed. My first favorite game was Hillsfar on Windows 3.1 or DOS, I don't remember which.

To play you had to have the little spinner 'secret decoder'. It seemed part of the game, but was obviously just to keep people from playing pirated copies.

But I miss that kind of neat little gee-gaw that used to come with games.


I want to piggyback off this comment and also throw in HL: Alyx for VR games. I was honestly blown away by how good VR felt. Valve did an amazing job, and I think/hope it sets the standard for all other VR games.


It's very likely a smart business decision, but I wish Steam had a resale market. Even allowing just a single resale would, I think, help sort the crap from the gold.

There'd be one more opportunity for profit AND people could buy games for cheap. Sure, there's an opportunity to copy games before you sell them, but I think there are ways to fix that. Sell for Steam credit only, maybe? And there's plenty of opportunity for transparent DRM. Even the threat of a perma-ban for cheating. Dunno.


This is one of the more interesting use cases I've seen posed for NFTs. Use an NFT when a user buys the game and then they can resell the game later.


What is the advantage of an NFT versus a record in Valve's database?


Open standards, sort of. People could in theory take a game license they bought through some other service supporting NFT game licenses, and move it to Steam, or vice-versa. Steam et al would essentially be crypto wallets that let you download+play the games corresponding to the licenses they held. (Better yet if they’re non-custodial, actually. Decouple the responsibilities of originating licenses [i.e. buying games] from fulfilling game installs given licenses.)

Also, presuming Steam was coerced by industry/government pressure into supporting NFTs, it would also mean that you could resell the games even if Valve didn't want to support that. As long as the game-license NFT was a regular ERC-721 token, its owner would always be able to move it around—such as to someone else in exchange for payment.


OK, so when the game starts up it needs to see some proof that you own a license for it. So it needs to see proof that you own an NFT.

I am not aware of the technical details of various NFTs, but I'm going to assume that you prove ownership similarly to how it can be done in Bitcoin: signing a transaction/message with a private key that corresponds to the public key of the asset in question. I would be happy to learn of some other way to do this!

An issue here is that this does not prove unique ownership. I could share the private key with other people, and everyone could play the game at the same time. Admittedly, these would have to be trustworthy people since I imagine that the holder of the private key would be able to transfer ownership. Maybe you could sign a bunch of messages ahead of time and distribute those to your friends? That could be averted by having the game require a random number and/or datecode to be part of the message.

Maybe the game can require a transaction to be posted on the NFT chain saying that you have started playing the game at x time, and eventually a corresponding stopped-playing message. So it would not let you play if it sees that a previous session hasn't been closed? Is your play time now publicly logged?

It isn't intuitive to me (as someone with solid familiarity only with how Bitcoin works) how this could work. I would love to see a more technical analysis.
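The "random number and/or datecode" fix suggested above is a fresh-nonce challenge-response. A toy sketch, using HMAC with a shared secret as a stand-in for a real asymmetric signature (an actual NFT scheme would sign with the wallet's private key, and the verifier would only hold the public key), shows why pre-signed messages become useless once the challenge changes:

```python
# Toy challenge-response. HMAC is symmetric, so the "server" here also
# knows the key -- purely a stand-in for real signature verification.
import hmac, hashlib, secrets

PRIVATE_KEY = b"owner-secret"  # stands in for the wallet private key

def sign(message: bytes) -> str:
    return hmac.new(PRIVATE_KEY, message, hashlib.sha256).hexdigest()

class GameServer:
    def issue_challenge(self) -> bytes:
        self.nonce = secrets.token_bytes(16)  # fresh per login attempt
        return self.nonce

    def verify(self, signature: str) -> bool:
        return hmac.compare_digest(signature, sign(self.nonce))

server = GameServer()
old = server.issue_challenge()
replay = sign(old)            # "pre-signed" response handed to a friend
server.issue_challenge()      # next session gets a new random nonce...
print(server.verify(replay))  # ...so the stale signature fails: False
print(server.verify(sign(server.nonce)))  # live key holder passes: True
```

Note this only stops stockpiled signatures; it does nothing about the key itself being shared, which is the harder problem the parent identifies.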


You prove you own the NFT the same way you prove to your bank that you have money: by letting the bank hold it for you. You deposit or lock the NFT into the service (essentially making it a staking contract). This associates the NFT with an account within the service's smart-contract (of the depositor's choice), which can then be mirrored by an oracle-process observing on the service's backend to become "a record in Valve's database." You then log into Steam as normal.

The difference from "just using Steam" is that, at any time, you can tell the Steam NFT-staking-contract to unlock your NFT and give it back to you. Their oracle will observe this and erase the equivalent "record in Valve's database." (Note how the order of operations flows smart-contract → database in both cases. This isn't an arcade machine, where you have to convince the machine you put your token into to spit it out before you can exchange it back for cash. Effectively, you're asking for your money back at the exchange counter, and it's the arcade's responsibility to then — asynchronously — extract your token from the machine.)

It's easier to make analogies if you don't think about non-fungible tokens (which don't have many real-world analogies besides deeds, and people aren't very familiar with the mechanics of deeds), but rather just with tokens generally.

Say you're at a grocery store. How do you reserve a shopping cart? You insert a token that you own into a slot on the cart, which unlocks the cart from the cart next to it in line. The cart then acts as a staking contract for that token. While it's holding the token, it unlocks permissions for you to do something with the cart itself — wheel it around the store. The cart doesn't need to exist within the same abstract financial system that the token exists in, but merely needs to have some mechanism sticking up into that system — a coin slot — which can observe a token being locked in, and then report that event to a physical backend.

Where the analogy breaks down, is that you have to take your cart back to the chained-up carts in order to get your token out. In equivalent staking-contract scenario, you'd instead withdraw the token, and that would cause the cart to return to being chained up to the rest of the carts. A→B lock, A→B unlock; rather than A→B lock, B→A unlock.

Also, with a physical shopping cart, it's merely social norms and a common-sense observation of the mechanism that suggest that a shopping cart won't "eat" your locked token (i.e. not give it back when you fulfill the unlock condition.) Smart-contracts, meanwhile, can constrain themselves by the interfaces they implement; and can be static-analyzed for their theoretically-possible range of behavior, with the results published by reliable third-parties. So you can actually guarantee whether Steam's smart-contract has any internal capability to "eat" your deposited token or not, without anyone ever having to learn that the hard way.


Thanks, pretty good explanation of a way for it to work.

I don't see what advantage this implementation offers, though (happy to be wrong). The game is checking with Valve to see if I am allowed to play it. I will need Valve to agree to honor my token and talk to the game appropriately in perpetuity. A third party that I sell the token to would also be relying on Valve to honor the validity of the token after I transfer it to them. This has very similar characteristics to what we'd have if Valve built out tools to sell/trade games using their existing database. Valve can decide any day to stop honoring tokens, maybe "locking" currently deposited tokens to your account by making the database record permanent and not caring about tokens any more.

Maybe the game has a facility to check with other services that also implement the staking contract, as sort of a backup in case Valve dies or as an attempt to be vendor agnostic? I think that has some interesting difficulties too.


The idea would be that you could transfer your license NFT away from Steam to, say, Origin; and then the Origin release of the game would be checking with Origin’s servers instead of Valve’s. You’d delete your Steam copy of the game (which would no longer work) and install the Origin copy (which would now work.)

Yes, each individual release of the game checks with some particular private corporate server, rather than checking the NFT’s status directly (because, as you said, a PKI keypair is non-unique, and so cannot[1] be used to enforce max-concurrent usage.) And at any given time, you only have a right to play one individual release of the game (or zero, if your NFT isn’t staked anywhere.) But, by moving the NFT around, you can always exchange [a license to play] release X of the game, for [a license to play] release Y of the game. That’s the “freedom of movement” that NFTs get you.

Steam→Origin isn’t a very interesting example of this, since both releases of the game are likely nearly byte-for-byte identical. Consider instead moving a license NFT you possess for an abstract widely-released game title between Steam and, say, the PS4, or the Switch. These aren’t just different “releases”, they’re different ports, perhaps of differing quality—but they may be legally considered to be the same abstract “game title.” (The license NFT would probably correspond to the same abstract copyrightable work for which the ports are all derivative works.)

Even better, this would allow you to detach a license NFT from a “dead” platform (the Wii U, say), and move it to a living platform (e.g. the Switch) where the game has been re-released. Then I wouldn’t have to ever buy NSMBU again[2]. These platform companies’ business models would never allow implementing that on their own... unless a court of law forced them to do it. And there are no real obstacles stopping interested parties from lobbying for exactly such a law!

[1] Although PKI keypairs can be made unique, by generating the privkey on a TPM it can’t be read out from, such as a smart card. Games in e.g. South Korea already rely on smart-card-based license activation (although IMHO this only works because of the culture — people in SK mostly play games in net cafes rather than at home, so people don’t generally need to own smart-card readers.) It would very much be possible to make the same smart card your custodial wallet for your NFTs, and your proof-of-possession for the game to challenge-response against. Then you wouldn’t need a corporate license server — at least in the case of that release of the game. You’d probably still want to enable interoperation with ecosystems that did use license servers, e.g. the game console ecosystems.

[2] Speaking of Wii U → Switch re-releases: I think Nintendo specifically has foreseen something like this coming, and is trying to get out ahead of it, which is why all their re-releases lately have been enhanced in arguably nontrivial ways — usually by bundling them together with something entirely different — such that they have supporting evidence to claim the re-released game is materially different from the previous version, over-and-above just being a technological upgrade. Even if you have a license NFT for Mario 3D World, that doesn’t necessarily attach to a game-download for Mario 3D World + Bowser’s Fury, y’know? It’s sort of like the stores that get their own model of a product made to trivialize “we’ll beat competitors’ prices” guarantees.

=====

> maybe "locking" currently deposited tokens to your account by making the database record permanent and not caring about tokens any more.

Like I said above, this can be made impossible. Smart contracts are, by default, immutable: their logic can’t be changed once deployed, even by their original author. (The contract author needs to introduce explicit support for upgrading the contract in the contract’s logic before deploying it to the chain.) And this game-theoretic situation is pretty much exactly why smart contracts are immutable: it prevents selfish companies from later reneging on their contractual obligations.

Presuming Valve implements their staking contract as immutable—and probably nobody would trust them with their NFTs if they did any different, just like nobody would trust a non-CDIC-insured bank—then Valve would never be able to later “opt out” of people transferring their tokens away. The contract is what it is. (This is why I was highlighting the importance of an “A→B lock, A→B unlock” flow. Valve can always make their server refuse a withdrawal; but they can’t make an immutable contract refuse a withdrawal, if it didn’t have logic for that in place from the beginning. As long as the contract was programmed to be “in charge” of the withdrawal flow, with the server being a mirror of its state, Valve has no mechanism by which to stop your withdrawal from being processed.)

So if they ever wanted to quit supporting NFTs, they’d be able to either make permanent or purge their own database-record equivalent representation of the NFTs; but even if they made their own records permanent, they wouldn’t be able to erase the tokenized representation, nor stop users from transferring the tokenized representation away. And those tokens would still represent a right to a license for the game, to the other game-download-service-providers in the market. So all they’d be doing by “opting out” would be, at most, giving everyone a free second permanent non-transferable license to their games, separate from the transferable license.


Thank you for putting in so much time to respond. I think this does make sense and does have a useful purpose, particularly in providing cross-vendor usefulness of purchases.

Of course now that I go back and read one of your earlier comments, this is exactly what you were saying all along:

> People could in theory take a game license they bought through some other service supporting NFT game licenses, and move it to Steam, or vice-versa.


That's a good use case, buy and actually own copies of digital assets.


It's Crypto!!!1111


Yeah but I want to be able to only buy one disc, take it out after the game starts on PC 1, and pop it into PC 2 for multiplayer.


Showing my age here, heh:

1. Early Ruby on Rails -- Now that MVC/ORM packages are the norm, it's hard to describe how revolutionary the original '15-minute blog' video was. It really felt like a quantum leap for CRUD apps.

2. Uber/Lyft - It has literally remolded the city I live in, by making large areas that are transit-inconvenient more attractive to live in.

3. Linode -- Access to a cheap server that you could spin up/down in a minute with root access was really great, in an era where a server that wasn't just a junk shared host often required months of commitment and started at 100 bucks a month.

4. Google Maps -- Just head and shoulders above mapquest.


+1 for Uber/Lyft. I remember being SO excited for them to come to my area, as they are so much better than taxis.


+1 for Ruby on Rails. I was developing on ASP.NET back then; it would have taken me at least a week to develop 'the blog'. Rails had so many great ideas baked in (apart from MVC/ORM): dev/production environments, migrations, default locations for css/js/images.


Re: Age

C with Classes was a very welcome advancement over C.

Watcom C/C++ compiler with DOS/4G extender was a pure fucking miracle. Need 1MB in one chunk? Just call the damn malloc().

Windows NT was a phenomenal leap forward for desktop OSes.


I lived half of my adult life in a major city before Uber/Lyft, and yet it is unimaginable what life was like before them. Did I really get ready for a night out and go out in the rain desperately waving for a cab?


> go out in the rain desperately waving for a cab

That's mind boggling. All my life I've been able to call central dispatch and order a taxi to a pick-up address of my choice.


I have no familiarity with #2 so that's interesting to me -- do you or other people actually use Uber/Lyft to get to and from work? Isn't that like $15-20 each way every day, thousands of dollars a year?


It made a big difference for me with occasional trips, I took public transportation to work, but Uber/Lyft meant I could go visit my friends for $10 and 10 minutes instead of $2 and 45 minutes.

Note that a car also costs thousands of dollars a year once you factor in everything, Edmunds estimates the true cost of a corolla at $7k/year: https://www.edmunds.com/toyota/corolla/2020/cost-to-own/


I have multiple experiences with Toyotas driven 200k to 300k miles with just standard repairs. A Corolla costs $20k? $25k?

Let’s say $1k for new tires/brakes/oil changes/filters every 45k miles.

That’s $333 per year for maintenance, and the car lasts 13 to 20 years at 15k miles per year for 200k to 300k miles, so let’s say $2k per year on the high end, $1,200 per year on the low end.

Extremely high liability insurance is $50 a month or less.

Fuel is 10 cents per mile if you assume $3 per gallon and 30 miles per gallon, so $1,500 per year in fuel for 15,000 miles.

$1,500 + $50*12 + $2k + $333 ≈ $4.4k, so call it $5k per year on the high end.

Excluding insurance and fuel costs, you can push a reliable Toyota/Honda’s costs down to 5 cents per mile or less.
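For what it's worth, plugging the stated figures in (their numbers, not my own estimates):

```python
# Reproducing the cost arithmetic from the comment above.
fuel = 15_000 * 3 / 30        # 15k miles/yr, $3/gal, 30 mpg -> $1,500
insurance = 50 * 12           # $50/month liability-only -> $600
depreciation_high = 2_000     # $20-25k car over 13-20 years, high end
maintenance = 333             # ~$1k per 45k miles at 15k miles/year

total_high = fuel + insurance + depreciation_high + maintenance
print(round(total_high))      # -> 4433, which rounds up to the quoted "$5k"
```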


Your maintenance cost seems optimistic to me, and you're also omitting parking and the cost of parking tickets (surprise yay), but besides that, I would say that there's a fairly high cost in terms of time and stress when driving a car.

As someone who doesn't own a car, I don't have to worry about de-snowing my car or the car not starting in winter. I also don't have to worry about finding parking, parking tickets, dealing with repairs and scammy repairmen, paperwork, my car getting stolen or damaged, etc.

I also don't have to worry about driving drunk. It's pretty freeing to go out with friends and just have a car drop you at the door, and another car pick you back up when you want to go, never having to worry about parking or when your parking timer might run out.

And all of this being said, as someone living in a big city, between public transit, bike rentals, walking and ridesharing, the total cost I pay is less than your stated 400/month.


Others have commented on other points, but I'll note "Extremely high liability insurance is $50 a month or less." – the cheapest liability only insurance I can find is $120/month, and I'm 32 & married with a clean driving record. $100–$150/month seems to be the norm, talking to my friends.


I should have written "liability only insurance", meaning it protects me from everything except paying for my own vehicle, which I opted not to get since I can afford to replace or repair my vehicle out of pocket. I have $250k/$500k liability insurance incl uninsured and underinsured for ~$40/month from Amica on the west coast right now. I had it for $45/month or so from Geico on the east coast. Both my wife and I have clean driving record for past 15+ years.


I'm also talking about liability only insurance – from some other comments, it seems to vary a lot depending on exactly you're located. I did find much cheaper insurance when I was living in Maryland.


Different sources I find give between $1,100 and $1,600 as the annual average total auto insurance in the US, with about half of that liability. But there is a lot of regional variation (state averages seem to vary by a factor greater than 2, and rates vary by ZIP code, not just state, so the range between ZIP codes has to be much greater than that between state averages). But, nationally, $120/mo. ($1,440/yr) for liability-only is very high.


AAA does this kind of calculation in more detail and publishes its results regularly for the average cost per mile driven for an average car owner in the US. It's on the order of about a dollar per mile, depending on the kind of car. Average sedan is about half a dollar, so you are still out by a factor of 10. It's obviously going to be higher in certain places too, like in cities, where there are extra and higher costs to car ownership.


Add parking in a major city (the places Uber/Lyft are a viable alternative).


Yes, parking is the big expense that makes Uber/Lyft worth it.


Cars can be a great value, an expensive convenience, or a total burden depending on how these numbers actually work out and where you live and what you do. If you feel like a car is necessary, then the math probably makes sense. If you're at all hesitant, then the numbers would have to be very favorable for it to be at all compelling. Regardless, they only really work out if you actually keep the car for a very long time. If I wanted to purchase one for a year and then resell, I'd lose a lot of money to taxes and depreciation, especially if it's new. Likewise, it's a huge liability that you just don't have if you don't own one.


Not including potential repair costs either


in sf, pre uber, taxis just didn't come. They were extremely unreliable -- call, and they'd come if they felt like it. maybe. and on their own schedule. Also, they were run by utter assholes who knew you had 3-4 companies to choose from, and they all sucked.

uber and lyft later turned it into a service that was highly reliable (came, or at least told you if they weren't coming). It seriously enabled the use of taxis as a reliable part of a transportation solution, or eg to cover the 1-2 days a week you needed a car without having to own a car.


This describes the taxi experience I've had in every city the world over. Not that I've been everywhere, but places in Europe, Asia, North/Central America, that's been my experience. Small towns are generally better because the taxi driver isn't anonymous.


They're fairly polite in Japan and don't try to rip you off, but many taxi drivers are 900 years old and have incomprehensible accents (Japanese ones, not foreign) so it can be a little hard to explain where to go.

I think they're fine in the UK too. But they absolutely aren't in California, Uber was a godsend and far less criminal even considering all the stuff they have done.


Really interesting! Didn't know. In the northeast taxis are generally reliable, although I've never used them with the regularity of someone who uses them to get to work.


Yeah, I think if your baseline is NY or somewhere like that where taxis weren't allowed to screw over the entire city for a couple decades, people don't understand why Uber and Lyft are loved in California. I'm not a fan of their labor practices, but the situation before was indefensible.


I do use Lyft to get to work most days (or did, in the Before Times). I just checked a few hopefully-representative months (June/July/August 2019) and it looks like I was spending about $400/month. Google tells me that the average TCO of a car is about $700/month, so this would seem like a net win. That average might not be representative of my needs though - it's likely being dragged up by people driving around giant SUVs and such, so take this comparison with a grain of salt.


Depends on where you live. Probably subsidized by employers, just like public transit is sometimes also subsidized by employers. Even when it's not, the mental health benefit you can get from not behind the wheel stuck in traffic is certainly worth thousands of dollars per year.


It's too expensive to take to work, but I live in a city (DC) with a downtown centered transit system, so certain 20 minute drives take an hour or more on transit. Add in a 5-10 minute walk on both sides if your home/destination aren't right on top of a metro, and some trips become super inconvenient.

If you know DC, I'm thinking, say, Van Ness to Eastern Market, H-street, or anywhere off-metro in SE over the river. Cabs here are infamously unreliable, and just don't serve many neighborhoods, to the point of illegally refusing to take you once you're already in the cab.

Before Uber, if you were someone who wanted to go out to eat or visit friends in other neighborhoods a few times a week, you had to live as close as possible to metro/bus hubs on the same lines as your social group. If you couldn't afford that, you got a bike and dealt with it being stolen 3x a year.

Now you can reasonably live a little further from transit and still take those trips. You still want to live somewhere that minimizes commute, but I'm no longer optimizing for commute AND social group AND restaurants AND groceries, which widens things quite a bit price wise.


I used to take UberPool (or the Lyft equivalent) to work every day for more than a year (around 2016). It was on average ~$3-4 one way, slightly more than but still comparable to public transit. The door-to-door time would be 15 minutes as opposed to 40. And I often got discounts which made it cheaper than public transit.


250 days is mon-fri, with three days off. 250 * 2 * $20 is $10,000. That sounds like a lot, but - it depends on what you gain/lose. If you can avoid buying a car (plus insurance and gas), and live a little further away, in a place that's a little bit cheaper - you can easily save more than $10,000 a year. Uber-enabled at the edge of the city for $1200 / month vs walkable in the middle of the city at $2000 / month, and you're pretty close to break even.

I still took the bus for fixed commutes (to and from work) - when I woke up on time. But uber is very convenient if you're in a city - the population density makes rides near instantaneous.


Taxis would be at least twice as expensive, for a service that couldn't rate their drivers and customers and sometimes you'd have to wait in a queue to have your call taken so you could get picked up.

Currently there's only one disadvantage to ride-sharing apps, you can't wave down a car if you see one. You have to open the app first to request a ride, and that car might have driven away by then.


I cannot see in the dark, and I live in a city that is not very walkable. Before Lyft, I just didn't go places at night. So, yeah.


Rails + Heroku circa 2011... I became a professional web developer and launched a startup mainly because of these technologies, so quick and easy.

Yes, with Uber and equivalents around the world, I have zero reason to own a car. My driver's license expired years ago and I can't be bothered to renew it.


+1 for Rails. Also Django (Which came out around the same time).


I owe a lot to DHH and Ruby on Rails. It landed me a job in Silicon Valley 11 years ago. It was a game changer for my life. Glad to see it listed here.


Yeah, Uber/Lyft is 10x better than cabs ever were.


USB. It's hard to imagine, but we used to have a thing called SCSI (pronounced "scuzzy", and that's how it felt) which allowed you to connect to exactly one serial device. The plug was huge. And cables cost $50. Mice used an entirely separate interface that was different on Macs and PCs, and since Apple was dying at the time, it was a struggle to get Mac mice for a reasonable price. With USB, you could suddenly attach any device to Mac or PC and often not even need a driver. You could buy a splitter and attach multiple devices. Incredible!

Wifi. I remember seeing Apple's Airport demo. You could connect to the internet WITHOUT WIRES! Magic!


> SCSI ... which allowed you to connect to exactly one serial device

Pedantry, but: SCSI is parallel, not serial. The distinction is the number of data lines in the cable -- serial has one line, and SCSI has 8 or 16. This allows for much faster data transfer rates, at the (significant) expense of greater complexity and cabling cost.

The other advantage of SCSI is the ability to connect to multiple devices through "daisy-chaining". Old-style serial connections (RS-232, RS-422, etc) were strictly point-to-point.

Modern SCSI (SAS) runs the SCSI protocol over a serial connection, because port clock speeds are now fast enough that the parallel advantage isn't important for most uses.

Not to detract from USB though. It was an improvement over all of the above, and nowadays it's pretty fast, too.


>we used to have a thing called SCSI (pronounced "scuzzy", and that's how it felt) which allowed you to connect to exactly one serial device.

No, SCSI was worse than that... you had 5 or so different types, about 5 different connectors, and you had to terminate things, set the dip-switches just right, have the right Adaptec card, with the right drivers, and if you didn't look at things too hard... you might be able to take a $5 CD blank and get a good burn on it... otherwise you had a $5 coaster.

I hate SCSI because I always had the SCSI Blues.


I was super excited the first time I personally owned my first SCSI card though. I look back on my fascination with the tech of those days fondly. It was super cool even though it badly sucked by todays standards.


Indeed, I'm running a lab experiment right now, and I count six USB cables coming out of the computer, one of which goes to a powered hub with a further five cables plugged in. Two of the USB interfaced hardware gadgets are homemade. All controlled by Python. A couple of the devices required downloading drivers from the vendor, but the setup process was utterly uneventful.


I remember when plug-and-play was a big deal, good one.


Nobody has the audacity to mention The Pirate Bay. Ethical and moral issues aside, it was and arguably still is a 10x method for obtaining digital content and software.


In a similar vein, Napster was also a game changer for mp3 distribution.


For me, Napster was a game changer that got me into music, period. Before that, discovering and learning what music I liked was tedious, low signal-to-noise, and high cost; I never developed any interest in music until Napster existed.


Got me into computers really. I didn’t “get” the internet prior to Napster. It felt like a toy with no real value. But I was a teenager at the time and the access to music was a game changer. I was hoarding so much content I had to learn how to build my own systems and storage solutions. I used to download video content (music videos, live concerts, etc) from IRC too around this time.


Kazaa


The only problem I had with Kazaa is how often people would rename files for reasons I'll never understand.

Like...imagine downloading "The Matrix.avi", and you start playing it, and it turns out it was actually a copy of Fight Club.

WHY!?


A large chunk of the internet exists for the lulz. Imagine having to postpone your phone calls for an entire weekend in order to download a 1GB copy of Final Fantasy 7 on a crappy 32 kbps dial-up connection, only to open up the game and realize you got Leisure Suit Larry 6.


I was lucky enough that we had a second phone line dedicated to the modem.


I'm pretty sure there were servers out there taking incoming search strings and slapping them on the end of file names. You could search for RANDOM_STRING and find a "britney spears matrix hot babes unsavory search (2)(1)RANDOM_STRING.mov.wmv.exe"


One thing I liked about Napster and the like is that bootlegs and B-sides circulated along with the official releases. I miss not having these on Spotify. Yes, there are official live albums on there. And yes, you can still find these bootlegs elsewhere. But there's enough friction that I rarely do this.


Yeah, Napster's value still hasn't been replaced in full. I remember pulling some obscure song from another endpoint halfway around the planet and then being able to browse their music collection, which turned me on to other cool music. You can kind of do the same thing on spotify by following people and playlists, but it's not the same discovery magic that Napster had, especially for obscure or hyper-local stuff. I'm not enough of a music nerd anymore to care as much these days, but Napster really was the cat's venerable pj's back then.


For the same reason trolls are everywhere: because they can :/


Pirate Bay, Napster, Limewire etc. are what dragged us into the digital media age. Might not seem like it today but had studios and record companies had their way, Netflix or Spotify would not exist.


True, but Spotify was another 10x improvement. You click on ANY song, and, WELL under 1 second later, you are listening to that song. Closer to a quarter of a second. In 2010 or so, when I first used it.


But was it a 10x improvement on Grooveshark at the time? I went from Grooveshark, to Google Music, and finally to Spotify in around 2018


BitTorrent too: the underlying protocol, with less nonsense than download sites that often want to tease you into a paid subscription. Torrents are as good as it gets, but malware and pretend movies linking to the aforementioned sites are sometimes a problem.


And libgen/scihub for ebooks


Limewire


Stackoverflow. Its closest competitor, experts exchange, was a slow website that required you sign up to see an answer that was rarely useful. There's a direct correlation to SO and dollars of revenue I've driven.


Experts Exchange also used to do that awful thing where they let Google index the solution but would hide it from visitors until they'd signed in.

I believe Google penalised them for doing it, so they used another sneaky trick where they showed an obscured/pixelated answer first and put the actual one further down the page than most people would scroll.

SO was better in every way.


> Experts Exchange also used to do that awful thing where they let Google index the solution but would hide it from visitors until they'd signed in.

This might make someone here angry that they weren't aware of it (or make them smile):

This is how they did it:

The answers were always visible if you just scrolled far enough down the page.

At least that is how I experienced it - and I thought every oldtimer knew! :-)


No, they only started doing that after Google penalised them for underhand tactics.

There's a full description of how it worked on their Wikipedia article.

https://en.wikipedia.org/wiki/Experts_Exchange#Paywall

You should probably check before dishing out patronising and incorrect answers.


TIL that Experts Exchange actually existed, rather than being a meme of "why you should consider how your business name will look as a domain".


What's wrong with www.ExpertSexChange.com? People deserve to hire experts for that, don't they? /s

Somewhere out there must be an RFC that discusses the original thought process behind designing for case-sensitive vs case-insensitive domain names, or why domains should be case-insensitive while URL paths should be case-sensitive... Ironically, this SO post seems to be a decent starting point for my curiosity: https://stackoverflow.com/questions/7996919/should-url-be-ca...
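The upshot of those rules (hostnames case-insensitive per the DNS RFCs, path case left to the server) shows up in how you'd compare URLs. A quick illustrative sketch, with a made-up helper name:

```python
from urllib.parse import urlsplit

def same_resource_maybe(a: str, b: str) -> bool:
    """Compare two URLs: lowercase the scheme and host (which are
    case-insensitive) but leave the path alone, since path case
    sensitivity is up to the server."""
    ua, ub = urlsplit(a), urlsplit(b)
    return (ua.scheme.lower(), ua.netloc.lower(), ua.path) == \
           (ub.scheme.lower(), ub.netloc.lower(), ub.path)

# Host case differences never matter:
assert same_resource_maybe("http://Example.com/faq", "http://example.COM/faq")
# Path case differences might:
assert not same_resource_maybe("http://example.com/FAQ", "http://example.com/faq")
```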



Their old URL doesn't redirect there (take out the dash), so it's understandable that people think it was just a meme/disappeared.


I think poor therapists suffered most from the curious nature of how domain names work. Like http://www.therapist.com/ still exists but redirects. ...

Let's not even mention the plant nursery on the Mole River which ended up with www.molestationnursery.com ...


Are we allowed to make Arrested Development references on Hacker News?


Their weekly podcast where they discussed progress on building the site that week was fascinating too.


I wouldn't give them 10x, but certainly 2-3x for junior developers compared to predecessors and 1.2x for others. Which is still incredibly significant and nothing to sneeze at.


Its closest competitor has been Quora for several years.


Stack Overflow is much better than Quora. Quora has gone downhill over the last few years with paid contributions and modal nags. It seems they will also sign me in after I click a link in their email digests; I don't know how that can be secure.


The abolition of punched cards...finally, it was practical to indent code!

Visual editing...I remember when all text editors used a command language that made you keep a listing of the file next to your terminal so you could translate your markup into editor commands. (And, yes, I still know my way around ed.)

SCCS/CVS/RCS: as wonderful as git/hg/fossil and others are, any source control system is better than none.

Tree-structured file directories, so you could separate files of different projects into different directories.

Yes, I HAVE been around a long time!


Ah, a simpler time that I have not experienced.. since 10.

It's unbelievable how fast technology has progressed. Barely 20 years have passed, and while my favorite tool (vim) is about my age, the technology that you pioneered has transformed the way of life of everyone on this planet beyond imagination.

Bravo. Salute.


1. DOCKER

I don't hear "works on my machine" nearly as much nowadays. Everyone is running the same code in the same environment. It's all there under source control.

Now I can get a project running on a different machine in a few minutes, without any special instructions. That also applies to my colleagues, or people looking at my GitHub projects. My software's interface with the host machine is clearly defined, so there are very few surprises.
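For illustration, a minimal hypothetical Dockerfile for a Python app (file names like app.py and requirements.txt are made up); the entire environment lives in this one version-controlled file:

```dockerfile
# Everything the app needs, declared in one place
FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "app.py"]
```

`docker build -t myapp . && docker run myapp` then behaves the same on any machine with Docker installed, which is exactly why "works on my machine" comes up less.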

2. GOOGLE PHOTOS

No more moving photos around with cables and SD cards. No more tagging anything in Lightroom. I can type "pug" and I'll see every photo of a pug I've taken.

There are no logistics around taking pictures anymore, and it's much better that way.

3. MAPS

Google Maps, but also Open Street Map. Cartography is an incredible blessing, and we take it for granted.


What I like about docker is that everything comes from one Dockerfile


jQuery - abstracted away tons of cross-browser inconsistent behavior, both with the DOM and JavaScript, and added new selectors to make DOM manipulation easier. I think there was a reason it was adopted as fast as it was: it really was a 10x improvement in working client side.


I feel its usefulness has run its course, though. The spirit lives on through ES6.


Boost for C++ likewise. It was a necessary bridge in the long wait for C++0x, but once the language started updating again it's no longer as necessary.


Absolutely it has run its course. There is now a litany of JavaScript frameworks to suit the variety of projects being done in JavaScript (both server and client side), and browsers are now far more compatible (I think partially because browser makers realized, with the popularity of jQuery, how in demand that was), but I was only recognizing what an innovation it was at the time.


Surprised no one has mentioned AWS yet. Anyone who remembers procuring, racking, and imaging physical servers knows how utterly incredible it is to create a cloud VM. And probably a good 20x improvement to use a value-add cloud service like S3 or Netlify.


+1000 to this. At my first startup after college (~2000-2002), I remember clearly our first "deployment" for a paid customer -- we had to order $thousands in Sun SPARC machines, rent a rack at a local data center, hire someone to set it all up, etc. etc. etc.

I don't think people appreciate what an absolute miracle the various cloud providers are. I'd say it was 100x - 1000x improvement, not just 10x.


A bit, but hosting providers were around long before AWS. You could do multi-tenant web / database deployments in the 90s.


>, but hosting providers were around long before AWS.

Yes, but AWS exposed a programmable web API to provision servers and disk. You download an SDK, get an AWS developer key, and then you could create S3 buckets for storage and EC2 instances.

Yes, hosting datacenters like Rackspace in 2003 existed but you had to talk to a human or send an email to provision compute resources. There wasn't a "Rackspace web SDK".

The Amazon AWS that made "cloud" more acceptable was such a paradigm shift that neither Google nor Microsoft had competitive offerings in GCP and Azure for more than a year. AWS has kept its lead from the very beginning.


And the console. They set the standard for self-service. It's the most sophisticated enterprise software platform in the world and you can bootstrap without talking to a salesperson. That was pretty revolutionary.


Their software defined networking & virtualization was a big deal too. Many of the primitives of on-prem or "LAN" networking were translated into the cloud.


That's not 100% true. It wasn't SDK driven, but it was console driven. No need to talk to people back in the days of the hosting market.


AWS? Peuh. Hetzner made it possible to host websites with many millions of pageviews per month without robbing a bank.


As someone working as a web developer for a smaller company, I don't get the use case for AWS. Super confusing naming of their services and nontransparent pricing.

For the vast majority of people some 5 Dollar hosting is more than they need. Just copy your files to the server. Does not get simpler.

And if you need more, it is super easy to set up a dedicated server these days anyway.


$5 hosting is absolutely not enough for most big businesses. The times I've actually gone for simpler solutions, they're usually built on AWS (i.e., Heroku or Netlify).


I never said anything about big business. It is fine for a small WordPress blog and the like. Not everyone is Google. There are lots of small and medium-sized businesses.


Yes, the thing in iOS that automatically clips SMS verification codes sent to you. It saves me from going to my messages and finding the code.


It was both magical and instantly obvious, the best kind of change.


Sorry, but that one is 4x at best.


Maybe if you're only counting brain cycles saved, but brain cycles of pure joy and amazement are worth more than ones spent doing pointless tasks.


4x? 1x at best - saves my brain from memorizing a small thing very infrequently. Might be detrimental to my brain over the long run.:)


Ha yes, my point was just that it is hard to quantify most of these and that 10x ends up meaning so awesome you buy it, which is almost a tautology.

(Thiel actually quantifies it in the examples he offers in the original text, if I remember correctly)


When I took driver's ed many years ago, the official state written material had it as fact that highway driving is "3 times easier" than city driving. It was actually on my written test. Multiple choice, and the other answers were different numbers to chose from... but no: "3 times" is the correct answer.


I take this one for granted. I can't wait until they add the same thing for emails.


Android apps did this first, but by having an all-or-nothing permission to read your texts. Making it part of the keyboard under user control was so much nicer.


Yeah, this was phenomenal the first time I experienced it. I wish it also came with a "don't use SMS for 2-factor authentication" warning though. ;)


What 2FA method is preferred instead of SMS these days?


TOTP, I guess.


Hardware keys. If you can't use those, HOTP (push notification-based - OTP with an HMAC and a counter).


Users aren’t always in control of that though.


Yes, but not being able to copy and paste the code in iMessage by right-clicking on it if you're on macOS reverts this back to 1x. In macOS, right-clicking on any word shows the copy menu with that word selected, but in iMessage, for some reason, the entire message gets selected. You first need to highlight the word you're interested in before right-clicking; major pain.


Reaper. It's simple, fast, and has no licensing bull.

Steam. Just works, simple and easy to use. Copy-Paste the Steam folder to your new system to move your entire game library.

ZoomIt by Sysinternals - excellent, excellent tool that has improved all my presentations/screen-share sessions.

Everything by David Carpenter - super fast system wide search for files that has bookmarks and other features like match using file name/file path/regex etc.

ShareX - Very useful screenshotting/screen-recording tool (and more) with automation capabilities like auto-upload to imgur, etc.


When Steam first started being a thing, I decried it as the death of physical media and fun game inserts. I vowed never to use it, because of what I perceived as the locked-down Steam launcher controlling access/ability to shut off access to games that I purchased with my own money. It seemed too close to a 'license' to play.

Then I used it for the first time and realized how stupid easy it was compared to managing all of the physical disks and keys and what-not.

I mean, I wasn't wrong. But it is easier to use.


Years ago, when MS Flight Simulator X came to Steam, I bought it even though I had MSFSX on DVD.

Eliminating the need for the discs as well as removing the silly DRM in favor of Steam's far more sane DRM was worth the $10.


I also hated steam on day 1. I just wanted to play counterstrike! But obviously Gabe was right!


Reaper also came to my mind. The "everything is a track" philosophy feels so natural to me now, while other DAWs seem convoluted. I look back on my early Cubase and Samplitude days with amusement.

Also, Reaper basically never crashes for me.


> Steam. Just works, simple and easy to use. Copy-Paste the Steam folder to your new system to move your entire game library.

I use Steam, GOG, Epic, Battle.net, Origin, Uplay/Ubisoft Connect. Steam and Battle.net are the only ones that can handle this smoothly, and it's INFURIATING.


I've never used a more stable, feature-filled, and affordable DAW than Reaper. Making music with it is a lot of fun


Sysinternals never ceases to amaze me.


I've been deploying Everything to anyone who loses their documents on our employee workstations. It's a wonderful tool; it's how Windows search should work.


> Copy-Paste the Steam folder to your new system to move your entire game library.

Can't do that. I don't own a hard disk big enough to hold my entire game library.

So I have the most frequent games in an SSD, other games in several HDDs, and so on.


ShareX is a godsend; I wish macOS had something as good.


User experience is usually the 10x differentiator.

An Uber and a regular Taxi will both get me to my location with similar time and cost. The difference was that I could get an Uber by pressing a couple buttons on my phone and monitor the entire process from an app. A taxi required (at the time) phone calls, waiting around for a taxi to arrive, trying to communicate location, and other hassles that disappeared when using Uber.

Same final product (car transportation between points) but the experience was 10x better.


What's crazy is that any taxi company (and there are some large ones which have the resources) could have done this.

Then, after Uber launched, instead of jumping on the bandwagon and doing the same, they dug in their heels and fought it. Only after it was too late did they attempt to do the same.


In other markets, they did. Where I live you need to have an appropriate license to transport passengers for money. Hailo brought the app approach to hiring taxi drivers. Uber now exists here, but all the drivers are actual licensed taxi drivers charging the standard rates. They did briefly try to "disrupt" the market here by ignoring those rules, but they got slapped down.


In Austin, they had a taxi app at the right time. The problem was that they were still taxis, and if no one felt like ever coming to pick you up, then you were just left high and dry. Uber told a specific person to come pick me up. Taxis would take whatever fare they found on the way to my house.


Not quite. There was a post on here arguing that Uber's other big advance was the fact that they don't own taxis, and via demand pricing they incentivize more people to drive during high-demand times. A taxi company can't do that. They can't afford to own enough cars and hire enough drivers to handle high-demand times, as they'd lose money. Just adding an app to taxis is not enough.


This is insightful.

I wonder how many SaaS products have the advantage of not having to talk to another human being. You don't have to align schedules ("busy signal"), convince them if they don't want to ("yes, I moved a block down, can you redirect the taxi please?"), and sometimes deal with a bad day.


> phone calls

This can make things awfully complicated when you and the dispatcher don't speak the same language.

> waiting around for a taxi to arrive

The worst part about waiting for a taxi is the dice roll of whether one will actually arrive. Often in the case of a popular spot (like after the end of an event) the taxi could pick up someone else from the same location - or worse yet, not show up at all. The introduction of feedback associated with a specific driver has completely changed the incentive to actually show up and pick up the correct person.


> dispatcher

I remember a lot of taxi companies used some funky phone-to-CB thing where the dispatcher would be talking to you over CB radio on your phone call, which was just rotten audio quality on top of any lingo barrier. Just hilariously awful.


> An Uber and a regular Taxi will both get me to my location with similar time and cost

I don't know how many places that was true. I gave up on taxi service in my city (Austin) a few years after moving here, because it wasn't a reliable or quick way to get anywhere. Twice I ordered taxis hours ahead of time to get to the airport (once literally the night before) and after being assured on the phone over and over again that the driver would be there "in just a few minutes" ended up driving myself at the last minute and barely making my flight. I also had a few treats of walking miles home after waiting 45 minutes for a cab to arrive. Uber was a game-changer simply because they would show up.


A regular taxi costs you the same as an uber? May I ask what country you live in?

In Australia, a taxi usually costs between 1.6ish - 2.3ish times as much as an uber or didi.


Don't forget the payment process - no tip to worry about, no conveniently "broken" credit card machines, costs and duration are estimated up front.


Postfix. It's the only software package that I use where I am consistently sure that any issue I come up with is me making a mistake in configuration or me not having read the documentation closely enough.

It's stable, it has good error messages, and it supports all the different ways to send email. And its documentation is just really well and precisely written.

It's just really good.

It's not that I think that Postfix is 10x better than other email software that I use - it's 10 times better than any other kind of software I have used in the 20 years that this has been a relevant question.

Thank you Wietse, thank you Viktor, thank you Ralf and thank you Kyle.


And all that without having to learn the amazing m4 macro language so you can ‘easily’ configure Sendmail


If you ever start thinking that m4 is the solution to any problem you have just shred the drive and start over.


I'll avoid dating myself by keeping it sufficiently vague, but Open Source Software. Going from the world of closed source, for-profit software to that of FOSS is like night and day. It's more like a 100x improvement. Today's open-source world is quite different, but I'll still take it hands-down.


I find it interesting how big companies have adapted to use OSS to their advantage, leading to the popularity of JS libraries like React and complicated infrastructure systems like kubernetes.

They've managed to completely move the goalposts for what the baseline needs of a computer program are.


Initially, everything was OSS. Then Gates and Jobs came along and started doing closed-source software. But in the '50s and a good chunk of the '60s, people programming those machines would freely exchange software.


Not sure if mathematics counts as software:

Geometric / Clifford Algebra.

It makes just about every aspect of handling geometry in computer graphics and robotics so much easier. It comes with a ton of upsides and only two downsides: You probably were not taught it yet (so you have to learn it) and if you do, others will have trouble following you unless they learn it too.

I somehow feel like coming from a tribe where we only count up to three, and then being introduced to the concept of natural numbers (and other number classes) by outsiders. As I started to use it myself, it changed the way I think about things, but now I can't communicate with the rest of my tribe anymore. Yet, I think it is worthwhile and about the only silver bullet I have seen.

If you are interested, here [0] is a nice introduction to one class of clifford algebras.

[0] https://arxiv.org/abs/2002.04509


Do you think it is all of GA that helps, or really just the concept of bivectors / multivectors?

(I have a lot of skepticism about the geometric product as a useful object, but I think that multivectors are clearly fantastically useful everywhere. My exposure is mostly in physics where multivectors are basically necessary to make any sense of a bunch of concepts. But I don't work in robotics/graphics so maybe don't know how much good the geometric product does in practice?)


>> Do you think it is all of GA that helps, or really just the concept of bivectors / multivectors?

Yes, definitely all of it. Especially in physics, where every well known equation comes with its own, not so well known algebra. There are dozens (maybe even hundreds?) of them, all with their own notations and shenanigans. The thing is, all of them can be covered by GA. And you only need three instances of GA to cover everything [0]: Classical mechanics, quantum mechanics and relativistic mechanics / space time.

>> I have a lot of skepticism about the geometric product as a useful object

The geometric product is kind of like: "Here is everything you could possibly want to do, all at once!". We usually pick out specific parts of it (called grade projections) to operate on sub-algebras matching the specific problem domain at hand. Yet, all of them are still connected in one über-algebra so to speak. So, we don't need to switch notations.

>> bivectors / multivectors

Multivectors, the vectors of GA, are like "normal" vectors, from linear algebra, just tuples. There is nothing special about them. The genius lies in the idea that unlike "normal" vector algebra, where you have one element in your tuple for every dimension, you also have elements for every combination of dimensions in a multivector. The multivector is usually sorted by the grade (number of dimensions an element combines). That way a multivector has a 0D scalar, 1D vectors, 2D bivectors, 3D trivectors, etc.

[0]: https://www.scirp.org/journal/PaperInformation.aspx?PaperID=...
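To make the grades concrete, here is a toy 2D geometric algebra in a few lines of Python (my own illustrative sketch, not a library): a multivector is the 4-tuple (scalar, e1, e2, e12), and the geometric product follows from e1*e1 = e2*e2 = 1 and e1*e2 = -e2*e1 = e12.

```python
# Toy 2D geometric algebra: a multivector is (scalar, e1, e2, e12).
def gp(a, b):
    """Geometric product of two 2D multivectors."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return (
        a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar (grade 0)
        a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1
        a0*b2 + a2*b0 + a1*b3 - a3*b1,   # e2
        a0*b3 + a3*b0 + a1*b2 - a2*b1,   # e12 (bivector, grade 2)
    )

e1 = (0, 1, 0, 0)
e2 = (0, 0, 1, 0)
print(gp(e1, e2))            # e1*e2 = e12 -> (0, 0, 0, 1)
print(gp(e1, (0, 1, 1, 0)))  # dot + wedge -> (1, 0, 0, 1)
```

The product of two pure vectors yields a scalar part (their dot product) plus a bivector part (their wedge), which is the sense in which the geometric product is "everything at once": grade projections pick out the piece you need.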


I'm very aware of GA and multivectors and the usual equations that are included in GA texts, like the one-term form of Maxwell's equations. I'm specifically interested in whether you know of a good reason to include the geometric product in all this. It seems to me that for the most part you can do all of the useful stuff without ever mentioning it, and the equations it produces tend to be very awkward (compute something, grade project, compute something else, project again...) -- it's very 'just so', and kinda feels like it's all using the wrong tool. But I haven't figured out what the better tool is yet.


Nice. I've always been curious, but also worried about alienating myself. Are there any communities of GA users out there to feel connected to?


I think the biggest gathering is: bivector.net

It has a discourse and a discord.


Do you know 3D computer graphics libraries that use GA?


You are right; while there are already some libraries which provide GA, there are almost none which use it. I know that it is used in some photogrammetry pose estimation / frame registration code, and I am currently porting my vector / path rendering library to GA.

It is still really niche, but gaining traction in the last 3 years or so (considering it was invented in 1878).


Modern Linux distros.

Yeah, I have so many complaints about them, but for a wide range of use-cases and hardwares, they "just work" now.

In the olden days, Linux desktops and servers, and especially laptops were much more "pets" than "livestock", needing constant attention and care to keep them working, correctly configured, and up to date. Half the time when you added a new software package or piece of hardware you'd end up breaking your X-windows configuration and need to spend six hours getting things mostly working again.


Reading your comment (and the replies to it) is a bit interesting, because I almost never see an admission that Linux is anything but god's gift to humanity (yes, I exaggerate, but not too much). It's nice to see more reflection and honesty about the various annoyances that came with it.

I remember being told (circa 2005) that my PC issues would be resolved if I just switched from Windows to Linux.

I tried Linux a few times back then, and it never stuck. I would hit some small issue that I couldn't solve, and eventually I'd give up and go back to Windows (which certainly had its own problems, but I wasn't under any illusion that it was flawless).

I use Linux more now (mostly as part of WSL) and I really enjoy it when I get to use it.


Linux Mint literally just works 99.99999% of the time. It's better than windows, tbh, in terms of stability on my machines.

Compare that to Ubuntu, Mint, or Fedora from even 10 years ago, and it's startling.

Go back to 2004-5, when I was really into this kind of thing in college (and had time to tinker with it), and it's like they're not even the same product.

You're 100% right. Linux desktops used to be pets: very fragile, sensitive pets. Now, they're machines that work. It's remarkable how far it's come.

(and I know this is a fanboy statement forever, but) I genuinely do not know what is holding linux back from massive adoption anymore.


For me, it's the Microsoft Office suite. I used to make documents in LaTeX, and I still do in some cases for text-heavy or math-heavy final documents. Since using PowerPoint, however, I cannot imagine going back to either LaTeX or LibreOffice for presentations or quick intermediate reports. LaTeX is too cumbersome, and LibreOffice is just a pain to use (one part of me still hopes that I am just using it wrong). Apart from that, gaming is also a reason for my personal computer.


Microsoft Office and most desktop games, basically.

Linux is a little too command-line happy, sure, but not being able to run Excel or World of Warcraft is a much bigger deal-breaker for most people.


I found the 0.0001%. Mint could not reliably connect to my run-of-the-mill B&W Brother wireless laser printer. I had to reboot often to get printing working. Same problem on two different PCs.


I remember the old days of having to run a program to generate proper modelines to get XFree86 working with whatever monitor I was trying to use.

Though to be fair, having explicit modelines was once also useful when modifying a BNC Trinitron monitor that ran at one very specific resolution to connect to VGA.

It's nice that monitors mostly just work these days.


You're giving me flashbacks to when I had to compile my own kernel for ubuntu on a macbook.

All to get suspend working.


Java vs. the 90s C/C++ standards. Not having to deal with memory management made it so much easier to write applications.

I'm guessing SQL was a 10x innovation at least when it came out, too.


+1 for this. Java gets shade from the hipsters these days, but it's almost impossible to overstate what a revolution it was for software engineering.


Java had an impact similar to COBOL in the previous generation of business apps. I used to call it COBOL for the 90s.

If you don't know COBOL, it was a revolution compared to programming in Assembler. It's hard to overstate how big an improvement it was for productivity.


Java brought two things to the table compared to C++: garbage collection, and an enormous library.

I don't think that garbage collection, by itself, was a 10x change. For it to be so, programmers would have to be spending 90% of their effort on freeing memory. Don't get me wrong, garbage collection is a big win when you can use it, but I don't think it's quite that big of a win.

The library... that might almost be 10x, by itself, because of everything that you don't have to write. The combination of that plus garbage collection gave Java quite a kick compared to C++.


Not freeing memory, but tracking down memory leaks. We rarely even hear that term nowadays, but most large commercial programs used to have them.


Memory leaks are a thing on the JVM too. They tend to recur at times.

Java brought call stack dumps, a security model, and a lot of standardisation across orgs.


Or you just keep references to things all the way through your program, so leaks are possible in every language.

Still, for the memory management related kind in C++ there are tools like valgrind.


I was part of the C++ -> Java revolution, and I'd add the language being much simpler than C++. It felt like a rare, real 80/20 thing where you get 80% of C++'s power for 20% of the complexity.


In the 90s I used to spend DAYS hunting down memory issues (corruption, leaks, crashes) in C++ code. Your whole schedule could slip by weeks because of one hard to reproduce issue. I haven't seen that kind of problem in a long time.


+another huge benefit - Jar files and class files made the IDEs possible. Studying at the university, moving from Visual Studio/C to Eclipse/Java was a wtf-how-is-that-even-possible moment.


Java saved me from having a C++ career. And that was the 90s C++, not the C++ of today. I dodged a bullet there.


Unfortunately I learned Delphi

Afterwards you could not take the other languages seriously

Like comparing two strings, strA and strB, for equality.

In Delphi, you write: strA = strB

In Java: strA != null && strA.equals(strB)

Delphi just did everything better


I tried to learn Delphi on an existing Delphi project.

My experience was one of hunting dependencies in bookshelves, file servers, (defunct) vendor web sites, and also SourceForge.

That was the point were I grokked the immense value of Maven.


Java was designed when C and C++ were still going strong and there was a feeling (or fear, depending on who you asked) that C++ would be as omnipresent as C was.

Java tried very hard to fix the perceived errors of C++ and operator overloading was high on that list.

Nowadays, we are no longer afraid of operator overloading so I kind of wonder why it hasn't found its way into modern Java.


But they fixed the wrong "errors"

C++ is actually a good language. But everything that comes from C is bad.


There's a very good language in C++ trying to get out.

The problem is that different people consider different subsets of C++ as good.


C++: has generics, inheritance, pointers, macros, const, function overloading, and exceptions

Sun: C++ is too complicated. We need to simplify it by removing generics, pointers, macros, and const

Google: C++ is too complicated. We need to simplify it by removing generics, inheritance, pointers, macros, const, function overloading, and exceptions

Mozilla: C++ is too complicated. We need to simplify it by removing inheritance, function overloading and exceptions


You are ignoring that every one of the languages you're describing here has not only removed stuff but also implemented stuff differently.

Also, Sun said "manual memory management is dangerous", and history proved them right (granted, they were not the first to say so). Java also proved that a VM-based language can be competitive performance-wise with a compiled language. It influenced the design of PLs down the line.

Go, well, I'm not the biggest fan of its design philosophy, but I still would have chosen Go over C++ in the 90s if both had been available back then.

Rust goes well beyond anything C++ has done. C++ failed in replacing C, but Rust might be able to pull that off. Considering that we still have lots of buffer overflows in 2021, I'd say it's about time.

> C++: has generics, inheritance, pointers, macros, const, function overloading, and exceptions

Pointers. Plural. That's a good point. C++ likes pointers, so much it has raw pointers, references, std::auto_ptr, std::unique_ptr and std::shared_ptr.

C++ also has a shitton of other functionality without a coherent design. In that regard, even C is better. They also almost never remove anything. That std::auto_ptr is up for removal just shows how bad it was and that it shouldn't have been added in the first place.


The same, except today, Python vs C/C++

Novice C programmers would get stumped on opening a file, reading and parsing the data.

Now you have:

    with open('foo') as reader:
      # blah blah
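For completeness, a slightly fuller version of that sketch (the file name and contents are made up): open, parse, done, with cleanup handled by the `with` block.

```python
# Write a tiny comma-separated file, then read and parse it.
# In C this would involve fopen/fgets/strtok/fclose plus error checks;
# here the file is closed automatically when each `with` block exits.
with open('data.txt', 'w') as writer:
    writer.write('1,2,3\n4,5,6\n')

with open('data.txt') as reader:
    rows = [[int(x) for x in line.split(',')] for line in reader]

print(rows)  # [[1, 2, 3], [4, 5, 6]]
```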


And to think, Java was a tough sell to companies at one time. I had to sneak it in (same way we snuck Linux in) by building some invisible infrastructure thing with it, having it run without fail for months, and then saying, "Oh hey, did you know thing X is actually Java/Linux?"


Java got it wrong, IMO. There are many things that need the same management. C++ allows this via destructors. You can auto-manage memory, auto-manage locks, auto-manage files, auto-manage GPU resources, auto-manage video players, audio players, etc., etc.

Java you can only auto-manage memory. All the rest you can't.

Yes, I get that C++ has too many footguns, but Java's solution is half-assed. There's so much more to resource management than just memory.


One other thing about Java is that it had great libraries. C++ didn't have things like collections, so you couldn't write a library that used a third-party linked list, because your user might not have a RogueWave license, etc. The STL became popular soon after Java, which helped.


SSDs.

Virtualization.

Git. I've used CVS and SourceSafe before.

LVM. The old MBR way of partitioning is just awful.

systemd and journald, just to be a bit controversial. I really don't miss my days of screwing around with init scripts, or having to parse logs by hand.

Arduino. It makes lots of cool stuff very accessible.

sshfs, it's amazingly convenient.

pulseaudio. Sound on Linux finally works. I spent an unbelievable amount of time fighting with it before.

valgrind. Amazing for debugging memory issues.

Modern hardware. It's only recently that I'm no longer tightly restricted by RAM, disk space or the CPU. I remember the hours spent on freeing up conventional memory, a kernel taking 3 hours to compile, and having room for half the stuff I wanted to install.


SSDs did it twice: the improvement from spinning disk to SSD in desktops was amazing, and then NVMe/M.2 took it ANOTHER 10x. The speeds for storage are now so insanely fast that some of the old trade-off decisions no longer apply.


+1 for git. I learned it at my second job, and I remember at my previous position using Windows shared network drives as a very basic version control system.


I absolutely love systemd and journald. The fact that you can do most things declaratively and have a common interface for logs is just beautiful. I can do things like "show me the logs for this service for the last 5 minutes".

I still don't fully understand why people prefer bash scripts.

But then again, I have never once needed to edit those bash scripts since I usually don't stray too much from defaults.


It all depends of course on what you do. When you try to ship stuff for various Linux distros, having to figure out how SuSE, Red Hat and Ubuntu like doing their init scripts is a pain.

Admin-wise it's nice not to ever have to look at a 3-way merge of stuff in /etc/init.d if somebody touched anything, and then the system was upgraded.

But besides that there's lots of cool benefits. Like not needing to deal with PID files, the fact that you always know what a random process belongs to, and that if you want to stop something, nothing will be left behind. Plus all the nice resource limits, auto-restart and other useful parameters that can be applied to anything.
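For illustration, a minimal sketch of what such a declarative unit might look like (the service name and binary path are hypothetical):

```ini
[Unit]
Description=Example worker (hypothetical service)

[Service]
ExecStart=/usr/local/bin/worker
Restart=on-failure
MemoryMax=512M

[Install]
WantedBy=multi-user.target
```

A restart policy and a memory cap in a few declarative lines, where an init script would need PID files and hand-rolled logic; and because systemd tracks the service's whole control group, stopping it leaves nothing behind.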


It may make the lives of distro developers harder, but doesn't it make the lives of sysadmins (of which there are probably 100,000x more than distro developers) easier?

And we always wax poetic about how doing things in a declarative way ultimately leads to a better experience. So I am always perplexed when people give systemd a hard time when that's exactly what systemd has done, it's established a standard and consistent way to declare services.

Speaking of other things that seem not to want to change: why, in 2021, a decade after PowerShell, do we still pipe unstructured text between processes and perform gymnastics with sed/awk/print? Does that not just feel extremely janky to people? I'm not remotely a fan of Windows servers, but PowerShell got this right from day one, since it started from scratch with the idea of piping objects back and forth. Can we not, as a community, at least have standardized flags to make the various standard commands output JSON, and take inputs as JSON as well? Maybe this has already been tried, but I don't see it.
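As a thought experiment, here is roughly what structured pipes could look like, sketched in Python with an invented JSON-lines format (the field names and the ps-like records are made up for illustration):

```python
import io
import json

# Hypothetical JSON-lines output from a ps-like tool (invented format):
ps_output = io.StringIO(
    '{"pid": 101, "name": "postfix", "rss_kb": 2048}\n'
    '{"pid": 202, "name": "nginx", "rss_kb": 8192}\n'
)

# The downstream "process" parses records and filters on a named field,
# instead of counting whitespace columns with awk.
records = [json.loads(line) for line in ps_output]
big = [r["name"] for r in records if r["rss_kb"] > 4000]
print(big)  # ['nginx']
```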


> LVM. The old MBR way of partitioning is just awful.

Just wait until you find out about ZFS!


Honest question: can you explain the practical advantages of ZFS over LVM + Ext4? I understand that ZFS combines a filesystem and a volume manager, but why should I, a humble OSS user, care?


Not all of ZFS's advantages necessarily apply to "humble" uses like your home PC, but I can think of a few that surely do.

Filesystems share all available space in the pool. Say you have 16GB space available, so with LVM, you might make an 8GB /home and an 8GB /var. Later on, you've run out of space in /home while still having 4GB free in /var. What do? Well, you can probably get by with resize2fs followed by lvresize ... but this is not only cumbersome, but prone to catastrophic failures. With ZFS this is simply never an issue.

Snapshots. LVM has them too, of course, but they're not aware of files so they're far less useful. With ZFS, you can simply run `zfs diff` and it will pretty quickly tell you which files were added, modified, and deleted.

Building on this snapshot functionality, incremental backups become a breeze with `zfs send` and `zfs receive`, obviating the need for clunky (by comparison, at least) solutions like rsnapshot.
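Concretely, that snapshot and backup workflow looks roughly like this (the pool and dataset names are hypothetical; the commands follow the standard OpenZFS CLI):

```shell
# Take point-in-time snapshots of a dataset (pool "tank" is made up)
zfs snapshot tank/home@monday
zfs snapshot tank/home@tuesday

# File-level listing of what changed between the two snapshots
zfs diff tank/home@monday tank/home@tuesday

# Incremental backup: send only the delta since @monday to another pool
zfs send -i tank/home@monday tank/home@tuesday | zfs receive backup/home
```

Snapshots are nearly free thanks to copy-on-write, so taking them hourly from cron is a common pattern.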

ZFS actively detects data corruption, by checksumming everything. Assuming you have redundancy (mirrors or raidz pools), it also automatically fixes them. It's often said that this is only relevant to "enterprises," but I'd disagree: I don't want to discover my personal files have been corrupted any more than an enterprise does their business files. A better argument is that this protection is incomplete if one is not using ECC RAM, however, a partial counter to that is that a lot of recent consumer hardware (namely, AMD's) does support ECC RAM.

Overall I'd say comparing ZFS to LVM+ext4/XFS/whatever is kind of comparing, say, Windows 95 to Windows 2000 (or Unix, if you prefer). It's just not a fair comparison, they're not even playing the same sport. A much more apt comparison would be to BTRFS, but ZFS has been rock-solid for longer than BTRFS has been in existence, and the latter has been eating people's data left and right for most of that time.

On the other hand, one downside to using ZFS at home is that you can't really add one drive at a time. Say you have a raidz2 (similar to RAID-6) pool with 6 drives in it. You can't just add a seventh. The ideal way to grow such a pool in ZFS would be to add a whole other raidz2 "vdev", which would then be striped with the first one. That means buying 6 new drives, not 1.


Thanks so much for this detailed explanation, it really helped underscore the differences! Do you run ZoL or use FreeBSD?


I use both. On Debian stable, ZFS is a piece of cake, because the kernel version never changes. You just install the ZFS package (which is actively maintained) and call it a day. I imagine on something like Arch it could become quite a pain, though (never tried it myself).


A few more questions if you'll indulge me. First, how do you actually install ZFS on Debian? I understand you can install the ZFS package, but that in and of itself won't change the underlying filesystem on disk. Also, doesn't Ext4 also support checksumming? (See https://ext4.wiki.kernel.org/index.php/Ext4_Metadata_Checksu...).


Well, yes. You can't just convert filesystems in place, whether ZFS or otherwise. After installing the package you can create ZFS pools and filesystems. If you want to replace existing filesystems you would move all your stuff over after the fact.

Booting from ZFS will require some extra steps. I actually cheated here and just left /boot and / on ext4, using ZFS for /usr, /home, /var and everything else, so I don't have firsthand experience with this (on Linux anyway, on FreeBSD you can just select ZFS during installation). But if you do want that, it doesn't seem too difficult, https://openzfs.github.io/openzfs-docs/Getting%20Started/Deb...

As far as ext4 checksumming, as your own link says, it only checksums metadata -- actually, a subset of the metadata apparently. ZFS checksums everything, meaning your actual data. Further, since ext4 is not aware of redundancy -- you'd be using lvm or mdraid for that -- it's not clear what it can do if it does detect an error. With ZFS these errors are fixed automatically.


Wireshark. I've solved countless enterprise problems that nobody could debug otherwise, by intercepting traffic and looking at the raw data nobody knew was there.

Python list comprehensions + ipython: it changed how I approach all complex Python code. I iterate over the data until I get it right, then use that as the basis of the program/script.
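That workflow might look something like this (toy records, invented for illustration): refine the comprehension interactively in ipython, then keep the final version as the core of the script:

```python
# Toy records standing in for whatever data is being explored in ipython.
orders = [
    {"id": 1, "total": 30.0, "status": "paid"},
    {"id": 2, "total": 15.5, "status": "refunded"},
    {"id": 3, "total": 99.9, "status": "paid"},
]

# Built up interactively: first the filter, then the projection, then the sum.
paid_totals = [o["total"] for o in orders if o["status"] == "paid"]
revenue = sum(paid_totals)
```

Each intermediate step can be inspected in the REPL before adding the next clause, which is exactly what makes the iterate-until-right loop so fast.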

Youtube: I can learn anything now. It's the real-world version of The Matrix.


Re: YouTube, the internet as a whole has gotten pretty good for "How do I X?" queries.

My favorite anecdote was the time I ripped a hole in the side of my house, then googled "How do I frame a rough opening for a window?" ... and that story turned out fine, with my house having a new window at the end of the day, and not a gaping hole with a tarp tied to it.


Interesting use of Wireshark. I'd assume they'd at least allow you access from one end.


Last time it was: install tcpdump on a Java server doing nasty things, save the pcap, analyze it with Wireshark, and find out the Java app was doing insane things against the Oracle DB even after the vendor said they had "optimized" database access. The good thing is the same technique applies to REST APIs, sockets, load balancer problems, and other databases. You can probably do a better job using each specific stack's debug facilities, but a .pcap and Wireshark is the Swiss Army knife of debugging.

Also: strace for reading what anything in Linux is trying to do when failing.


The rr debugger [1]. I desperately missed it after switching to a mac laptop -- to the point that I now have an older linux system which exists primarily to run rr.

(I've heard that Pernosco [2] - partly built on rr, AFAIK - is even more revolutionary, but haven't yet tried it myself)

[1] https://rr-project.org/ [2] https://pernos.co/


FYI, Jean Yang had Robert O'Callahan on her weekly PL-themed "talk show" on Twitch to talk about rr a couple weeks ago. The video is available until this coming Friday, I believe [0], at which point it will disappear into the ether.

[0] https://www.twitch.tv/videos/938439732


From a musician's standpoint, everything today is 10x what we had just 20 years ago.

If I wanted to record a halfway decent sounding guitar 20 years ago, I had a couple choices:

1. Go to some studio, spend probably at least $1k, get the stems and transfer them to my computer.

2. Purchase recording gear (microphone, preamp, etc.), which would minimum cost around $1k, and on top of that, have a computer and audio interface. Again, $$$.

3. Purchase some early modeller, some audio interface, DAW, etc. Probably $1k just in that.

Simply put, there wasn't any really affordable or efficient method that didn't sound like a$$.

Today, it's a breeze. Free plugins, free DAWs, extremely affordable audio interfaces.

As far as cost, quality, and efficiency goes - everything is 10x of what we had 20 years ago. And almost everything boils down to the software products involved.


Could you point to some professional grade plugins or DAWs that are free? Never heard of that.

Also a decent professional grade audio interface is still quite some $$. It's not like $$k like in the past, but you still have to pay a few hundred bucks.

Compared to, say, 50 years ago, everything audio got really cheap, sure. It was in the tens-of-thousands range; now it's in the few-thousands range for semi-professional studio equipment.

Audio is actually one of the remaining software niches where they still charge prices in the few-hundred-bucks range for an end-user license. Other things got much cheaper much faster.


As far as free options go, there's definitely a lot of tweaking to be done, and more than anything, the end result (IMO) rests a LOT on whether or not you use decent cab impulse responses.

Lepou is one option: http://www.grebz.com/simulator_freeamp_poulin_eng.php

As for audio interfaces - there's no free things there, but if you're only interested in recording one or two tracks, like any stringed instrument, any used $50-$100 interface from the past couple of years will do. No, it will not sound like a $xxxx setup, but good enough for recording demos and even amateur records. Obviously depends on the music you're going for, too.

Now, just for the discussion - let's say you own a guitar or bass, and a computer, and you want to record something for as cheap as humanly possible, one setup could be:

instrument -> audio/microphone input on computer using ASIO4ALL -> Reaper or Cakewalk -> Freeware plugins like LePou amp plugins -> IR loader + Free cab IRs -> built-in effects in DAW.

Obviously will not sound like pro quality, but could do the work, if money really is the pressing issue.

(Another option would be "free" plugins via pirating, but that's another discussion...though many pro-grade plugins offer free trial periods, or heavily stripped down plugins)


there might be something on this list that interests you: https://github.com/ciconia/awesome-music

Reaper is another DAW that is not free, but it's only $60 or something, so it's practically nothing compared to others. I used it for 3 or so years without paying, and the only drawback is you have to wait 5 seconds for the nag screen at the start to close.

I don't want to sound like I'm on commission for Reaper, but I really love how fast it is to open a project and get started (once you have a licence, that is). It has no BS logins that need to connect to a server before it will work. It's insanely customisable, and it's also portable, so you can run it from a USB drive. You can also have different versions running side by side, set up in different ways with different settings etc. There's really nothing like it!


Oh, the section about music programming environments contained a few really awesome things that were new to me.

Thanks a lot!


How about a 10x reduction?

Thinking of when "Paste with style" became the default. :P


This makes me mad every time. And every single thing has a different shortcut for "Paste without style", if they even have one.


On MacOS there’s a native shortcut for pasting as plain text:

Option+Shift+Command+V to paste text without any formatting.

https://superuser.com/a/512502


This is the same on Windows and Linux AFAIK: you have Ctrl + Shift + V. Still makes me angry because it's a stupid default.


It's not universal on Windows: for example it doesn't work in MS Word. The place I'd want it the most! Ctrl+Shift+V does nothing at all there. It's Alt-HVT for this in Word (duh). But you can set the default to be no-formatting.


The following is almost a reflex for me: Win+R -> notepad -> Ctrl+v -> Ctrl+a -> Ctrl+c


In most Office apps you can make this Alt + 1 by setting "Paste As Plain Text" as position 1 in the Quick Access Toolbar (the small list of icons in the top left of the app). But yes, would love for this to be a universal Windows command.


It works in LibreOffice but it pops up a window asking you which sort of paste you want, which wastes more time.


This is why most of my copy-paste is actually: copy-paste_in_SublimeText-copy-paste.


And the most insane thing I find is that it sometimes even works across operating systems, from a VirtualBox guest to the Windows host...

That level of VM-to-host integration, though, is a 10x innovation.


Since there are already a lot of great answers, I'll give a niche one: fzf.

It's a simple concept and implementation, but you can plug it in all kinds of things and quickly compose complex tools in a simple manner. You can use it to search (and navigate to) logs, files, git history, external APIs (e.g. AWS). It's amazing.
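For example (illustrative one-liners; they assume git and fzf are installed, and the interactive picker obviously can't be shown in text):

```shell
# Fuzzy-pick a branch and check it out
git branch --format='%(refname:short)' | fzf | xargs git checkout

# Fuzzy-search shell history (fzf also ships a Ctrl-R binding for this)
history | fzf

# Pick a file with a preview pane and open it in $EDITOR
fzf --preview 'head -100 {}' | xargs -r "$EDITOR"
```

The pattern is always the same: anything that prints lines becomes fzf's input, and whatever you pick becomes the next command's argument.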


APM software (application performance monitoring). It was super expensive, a pain in the ass to maintain, and required a lot of training. But when someone skilled sat down with it: just wow. You could investigate a particular user's problem and see that when she typed "s" in the search form it triggered a few proxies and an SQL query which had a proper execution plan, but it turned out that connection pools were handled poorly.

It could visualize what is the actual architecture (not what we think it is) and show which connections are laggy, or more used, when it should be round robin.

It discovered undocumented connections and could show us laggy requests even if remote system was not monitored by APM - purely based on data from one side.

I could report to developers the particular line of code that is problematic from a performance perspective (like "this takes 40% of the request time, even though it's the simplest task in the process") without knowing much about the program or even coding in general (I'm more of an admin).


What software do you use? It sounds like it's for monitoring a website. Is there something similar for a desktop app?


And for a desktop app I guess it's just called a debugger :> Unless said desktop app still uses HTTP requests or something similar to webapps that can be attached to APM software. Then APM will know where it came from, but you won't get insight into the app itself, just the resulting connections.


I used a few, mostly Dynatrace. Experimented with Pinpoint and did like 3 PoCs of commercial products. Dynatrace won for me, but the prices were insane.

There is now a lot of opensource ones, usually not so fancy on the visual side. Elastic APM, Pinpoint and so on.


Wolfram Alpha. The ability to perform calculations on "nebulous" things is amazing. Things like comparing the population of China to the population of the US (https://www.wolframalpha.com/input/?i=population+of+united+s...) are great. Visual solutions to complex math problems? Awesome.


Maybe the original AJAX in Internet Explorer? Combined with JavaScript it moved the web from being RESTful hypertext applications to enabling mobile code, killing Java applets and OLE/COM plug-ins (including Flash).

I'm a bit sad that tls/http2 basically put the final nail in the coffin, essentially killing the viability (if not possibility) of elegant caching from REST. But arguably it's an architecture that's not needed with today's ample resources (fast networks, fast cpus, ample ram and storage).

Hard to imagine that the defining technology of this decade came out of a classic "screw standards; embrace/extend/extinguish" playbook.

Thanks to Netscape/Mozilla/Firefox (and Opera/Chrome) it went the other way...


When I started as a developer, the internet wasn't there. If you wanted a new library or tool you had to go to a shop and buy the floppy disks (or mail order them). If you wanted docs you used the man pages or had to buy the book; probably the bookstore would order it for you. It's hard to imagine now.


I was 18 and moved to university, where I "met the Internet".

And yet I find it hard to recall life before having mobile phones to easily coordinate meeting friends, google maps to navigate, banking apps for account and bill management.

Clearly I had a life before internet. Clearly I coped. I just can't recall how.


I’m a millennial, and that is hard to imagine. Though, some of my favorite coding exercises have been writing something for which there already exists a library, so I’d probably have learned a lot more of nitty gritty details if I had grown up in that kind of environment.


It was liberating as there wasn't this continual firehose of new applications, tools and ideas. There weren't as many developers around and you also didn't know how good many other teams were. You just had your little world of getting your application to work.


A couple I haven't seen listed:

Tivo - pretty cool when it came out - you didn't have to run to the bathroom/kitchen during commercial breaks and hope you made it back in time. Of course it's obsolete now but a significant improvement over the standard at the time

Streaming music services - i spent an embarrassing amount of time as a kid/early teenager collecting cds and ripping them to my hard drive - if only I could have all that time back

The selfie camera on smartphones - pretty self explanatory

MagSafe connector - why they ever removed this is beyond me. I don't know if it made my life 10x better, but if it extended the life of a laptop a year or two, it's worth its weight in gold


I have never witnessed somebody tripping over a laptop charging cable - does that happen a lot for others?


On the second day of ownership of the first MBP w/out MagSafe my father in-law tripped on the cable and it yanked the new MBP off the table and the force of the MBP bent the USB-C plug something nasty.

The only way I’ll ever forgive Apple is if one of their C-suite exec personally calls me and tells me why on earth they did this.

And we all know why... because MagSafe wasn’t Ive’s idea and since people loved it, he hated it.


Yes, but also pulling your laptop to where the cable is taut and having it yank out. I realized this is more of a hardware innovation


I completely wrecked my PowerBook G4 by accidentally getting my leg on the wrong side of the cable when getting up from the couch in a hurry. It flew up in the air and landed on the floor at a weird angle, and after that it was forever bent...


My cat often pulls the cable, and in a panic could destroy the whole laptop (hasn't happened yet, though).


Some pretty basic but workflow changing improvements for me (maybe 10x is too much, but 5x maybe? :)

1) Tabbed browsing in Opera/Mozilla in the early 2000s. You'd have to open a new window each time before that.

2) Moving from Windows Phone 8 (Nokia Lumia) to Android 4.4:

IE Mobile was okayish and pretty standards compliant browser, but could only handle 6 tabs, and switching between tabs would reload the page. Chrome Mobile would open infinite no. of tabs, and easily keep 10-20 of them in memory, allowing to switch around.

3) Two features of Sublime Text compared to old generation editors/IDEs:

- multiple cursors (ctrl-d) make editing very different

- keeping new unnamed files as drafts when closing editor (instead of asking "do you want to save the newfile1? newfile2? ...")

One fundamental change:

4) Firebug. Debugging websites before that was slapping random CSS `border`-s/`background-color`-s and `alert()` calls.


In the 1980s I was impressed with LapLink software which I used to move files between computers. The copy I bought came with cables to connect the computers, both serial and parallel cables. The serial cable had 25 pin and 9 pin connectors on both ends. If one PC did not have a disk drive, they gave you instructions to write a short command to re-direct the serial input. LapLink would then transfer itself across the serial link and begin running.


Google & stackoverflow

Prior to this, everything was based on tech manuals; now I can find good information fast and memorize pointless trivia far less.


5 years ago I would have agreed.

Now, Google is heavily polluted with SEO garbage, junior-level blog posts churned out by what must be student assignments to "write a blog post about this week's programming assignment", and the truly horrible data scraping sites like xspdf or whatever. General web search is increasingly poor.

And Stackoverflow was great for a long time. Now it is mostly outdated (which usually means incorrect) Q/A data. Worse, the old information that should be retired actually blocks new questions with current answers because of the aggressive system of trying to prevent "duplicate" questions. It's not a duplicate question if there's an 8+ year (or even 3+ year!) gap between the date of the original and the date of the new one. Quite often, even if the original question is close enough to current needs, the answers are very unlikely to be correct now.


Because it wasn't mentioned yet, Dropbox was quite magical as an end user tool. You use your computer normally with folders and everything is sync'd to other computers.

Unfortunately dropbox the company chose to grow into everything except their core product.


Nixpkgs and the nix build.

After rough times with Ubuntu's default package manager I finally installed Nix, and there's no going back.

Previously installing any software used to be a risky affair that could corrupt the packages or bring some dpkg issue. Now I literally install whatever I find. It's truly game changing. I haven't seen a missing library error since then. Being able to reliably install software across machines with different versions is huge and right now there is no better solution than nix.


Banking apps (avoid going to the bank to deposit a check, or going to the post office to pay bills)

Podcasts (alternative is reading a blog)

Airbnb (alternative is a hotel or possibly a BnB)


The personal computer (as a category, not just the IBM PC). I first ran into one in high school, when the computer science lab got five TRS80s. Before that, it was submitting a FORTRAN deck that someone took to an off-site mainframe. Half a week later, you got your results. But with the TRS80, you got them as fast as the program ran.

Now the CS lab did have a Teletype terminal, though I never used it. But the TRS80 had a real interactive display, not just static paper.

I know there were others before the TRS80. It's just the first one I came across.


I am guessing such a long feedback loop required proofreading your code very carefully before submission. Do you still have this skill now when the feedback loop is so much shorter?

I noticed that some very senior programmers check their code for much longer than rookies like me, who just keep banging "Run" until there are no more warnings ;)


Why keep it? If you can find a typo with an hour of inspection, or with a minute of compiling, why do the hour of inspection? Save that for the bugs that will take a day to track down...


The following come to my mind:

- Google Earth: A globe with ultra-zoom just didn't exist before.

- Tex and LaTeX: They revolutionized Academic typesetting and publishing.

- DAWs: I used to use a four-track recorder. When I first got to edit in a DAW with a few built-in effects, that was such a quantum leap. (I was about to write "affordable DAWs" but to be honest these came quite late and my first DAW was a cracked copy of Protools, of course.)

- REALBasic when it was affordable shareware: I've never been more productive than in REALBasic. (Also learned my lesson, though, never rely on commercial tools whose price suddenly might skyrocket.)

- XLisp: I did not end up programming much in it, but this was such good and mature software for hobby programmers interested in Lisp during the 90s (at a time when Common Lisp was hard to get outside your university).

- Wikipedia: Doesn't really count as software but I had to include it. It's possibly the most useful Internet-based tool besides email.


I've experienced some 10x jumps, mainly because I tend to wait until something comes along that's 10x before switching. Not that I'm smart, but stubborn.

- Turbo Pascal. My dad knew nothing about computers, but the Wall Street Journal ran an article about TP, and he knew I was interested in programming, so he got me a copy for my birthday. It allowed me to program almost as quickly as BASIC, but with half a chance of my programs working. (Yes, 50% chance was a 10x improvement). Not to mention, a motivation to learn more disciplined programming.

- HyperCard and then Visual Basic. I never had to learn the guts of a modern OS and its API just to write usable software.

- "Scientific" Python stack. The quality and breadth of the language, tools, and libraries is just stunning to me. I looked into Python because it was getting a lot of "buzz" when I first joined HN. Thanks, y'all.

- Arduino and the higher powered microcontrollers that it supports, for making it easy and quick to whip up applications. But even pre-Arduino, a number of 10x improvements came along for me on the hardware / electronics side, starting with microprocessor interfacing in BASIC, EEPROM based microcontrollers, decent C compilers for embedded.


I literally (as in 30 minutes ago) just fixed a bug in 10 minutes that I thought was going to take me 3 days to fix using https://pernos.co/ (a "timeless" debugging tool that wraps Mozilla's `rr`). This is a factor of speedup on the order of 100x+, not 10x.

I highly recommend checking this tool out if you value your time.


The most recent one I've experienced is ElasticSearch + Kibana + FileBeat/MetricBeat. I've gone from munging through log files in a text editor or maybe LogParser to being able to quickly visualize issues in charts and dashboards, but then zoom into log-level details in seconds. It may not be fair to call notepad the "second best option", but for open source log (and more) analysis, Elastic+Kibana definitely stands out among competition.


Wise (was Transferwise). Literally made my monthly alimony payments less painful and much cheaper compared to international bank wires. I'm talking about saving 200-300EUR per month on fees. (No I have no affiliation with them.)

Slashdot back in the day.

Digg back in the day.

Hackernews, back .. ... ... .

Java in 1997.

Python and Ruby starting around 2004ish.


I guess I would date myself, but:

- syntax highlighting in text editor

- tabs in browsers (in my case that was Firefox before it was called Firefox)

- DVCS. I started with Bazaar, then moved to Git (of course). Used CVS (painful) and SVN (less so) before that.

- REST APIs and discovering Sinatra framework to write them.

- Rails. First time I saw famous "write blog in 15 minutes" my head was blown.


Link to the Ruby on Rails demo for anyone wondering https://www.youtube.com/watch?v=Gzj723LkRJY


Docker Compose: Over the past years I've gone from running some things I really wanted (Samba, Nginx with a website or 2, and maybe Drupal) to just about anything (NextCloud [2 installs for 2 families], Home Assistant, Bitwarden_rs, Minecraft servers, UniFi Controller, Samba, WireGuard ...). And trying something new is soooo easy with Traefik taking care of name-based routing and certs. And it feels good that my entire personal infrastructure can be backed up by copying a single YAML file and a single folder (with many subfolders of course).
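A sketch of what that single file can look like (the service names, image tags, and domain are hypothetical; the labels follow Traefik's v2 Docker provider conventions):

```yaml
version: "3"
services:
  traefik:
    image: traefik:v2.4
    command:
      - --providers.docker=true
      - --entrypoints.websecure.address=:443
    ports:
      - "443:443"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro

  nextcloud:
    image: nextcloud
    volumes:
      - ./nextcloud:/var/www/html
    labels:
      - traefik.http.routers.nextcloud.rule=Host(`cloud.example.org`)
      - traefik.http.routers.nextcloud.entrypoints=websecure
```

Adding another self-hosted app is just another service stanza with its own router label; Traefik picks it up from the Docker socket without a restart.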


Jetbrains IntelliJ IDEA. Using it since about 2004, it was slow as molasses and unstable for my codebase but it was the first widely available refactoring IDE for a mainstream language, and it rocked and it and its offshoots still rock today. They didn't invent automated refactoring but they made it excellent and accessible in a way that VisualAge and similar tools weren't.


Multiple cursors in Sublime Text. Made it incredibly easy to do repetitive edits to json data or html templates. VS Code does a better job of cursor management now, but I think that was inspired by sublime text.

Language and date time services like Moment and i18n. Huge productivity gains from having off the shelf solutions for multi-language features.

Node.js and Express got rid of all the cruft from backend API development. It's a breeze to spin up a new API, whereas years prior you needed a fairly strict environment setup to run reliably.


Reminds me that "columnar select" like BBEdit has is amazingly useful when dealing with tabular data.

https://www.barebones.com/support/bbedit/faqs.html#rectangle


Google the search engine. Before, I had to use HotBot and AltaVista. Google changed the game.

YouTube. The user-generated content helped bring a lot of small time productions to be seen by millions of people. Now you can see a video about almost anything, especially if you need a how-to, tutorial, or learn about anything.

Adobe Photoshop's Content-Aware Fill, Healing Brush, etc. This changed the game for photo manipulation. You no longer had to do it manually by hand using the clone stamp. Now the computer does a pretty good job for you.

Smartphone camera. This is more hardware than software, but they practically destroyed the entire compact camera market. Now everyone has a great camera and camcorder in their pockets. And with their image processing, they're rivaling DSLRs for low-res images.


For me the switch to Mac many years ago was like that. It was like 100 mini annoyances suddenly disappeared from my life. True or not, I believe that's how it felt.


1) Ruby on Rails.

The standardized structure.

Convention over configuration gave the app a level of consistency that I hadn't experienced before.

2) Lyft - being able to see who your driver was and exactly where they were en route to pick you up was amazing.

I think the first couple of years of riding I would always ask the driver when they started driving for Lyft and if they enjoyed it.


* Any Smalltalk IDE (VW, VAST): it's hard at the beginning, but once you get used to evaluating expressions everywhere it's amazing. I haven't used Smalltalk in a long time, but I haven't seen that dev UX in any other editor or IDE.

* MS Word for Windows: today we are used to the spellchecking as you type, but IIRC Word was the first one to have that feature.

* VMWare.. being able to run a Windows VM from Linux in the 2000 was amazing. It also changed the QA processes for desktop apps.

* GMail... 1GB of email for free. It was way beyond any other web email at the time (that, plus good IMAP support and POP3 with TLS; none of the competitors had that when it launched)

* Photoshop... if you used any image editor, PS was 10x better.


In chronological order of my experience:

  Interactive terminals (over punch cards)
  PLATO
  Macintosh
  Windows 3.11 (over DOS)
  C++ (over C)
  The Internet (over BBSes)
  Windows 95 (over Windows 3.11)
  The World Wide Web
  Perl (over shell and C)
  Python (over scripting alternatives)
  Rust (over C++)


Heroku

- Git push to deploy

- Scaling dynos

- Turning add-ons on/off

- Managed PG (backups, rollback, copies)

I can't imagine how much time I would have had to invest in building my first few apps without Heroku. Had a similar experience with Google AppEngine in ~2009; I barely knew how to code and I was able to charge a customer for the first time ever for a working service.


PaaS is still the best model, IMO, and the one I still use today for all of my businesses and personal projects.

IaaS is too much sysadmin overhead; serverless can't be easily self-hosted.


These came to mind

On Demand tv

DVR

SSDs versus regular hard drives

windowing functions in databases

MP3s vs CDs or MiniDiscs. (Suddenly I could hold 1000 songs in my pocket)

Columnar databases (Sybase IQ, SQL Server, Redshift)

20 tools on 1 device in your pocket.. replaced flashlight, magnifying glass, calculator, maps, notepad etc etc

Multicore CPUs.. Oh, I can do 2 or more things while stuff is compiling in the background.. and of course async

Ajax.. no more refreshing the whole page

Automated tests and builds..


I think clipboard managers (like Flycut for Mac) are underrated. I regularly use the last 3-5 clips.

Heck, it'd be great if clipboard history was baked into the OS.


It is on Windows. I use Win+V all the time.


Whoa, mind blown. Thanks for the tip!


Pastebot is pretty good too. I use its sequential paste queue many times a day.


Check out Alfred.


PHP Inspections (EA Extended)[1] by Vladimir Reznichenko, a PHP language static analysis plugin for PHPStorm / JetBrains. I've coded in PHP for many years now, but there are many helpful reminders and checks that come standard with it. Some of the small performance quirks are game changers in long running processes, or intensive methods.

[1]: https://github.com/kalessil/phpinspectionsea


Configuration management.

A few years ago, I timed myself doing a full reinstall on my private notebook to switch to a different partition layout. I was done in 30 minutes, of which most of the time was spent waiting for packages to download and install and for the /home backup to restore. Without configuration management, this would have been a full day of work just installing stuff and figuring out things like "how do you enable palm detection on that touchpad model again".


Which specific software do you use for configuration management?



Maven: not having to hunt dependencies in bookshelves, on file shares, on vendor web sites or sourceforge.

Actual extensions for Firefox (no, not the awful watered-down thing we have to live with today):

Back when Firefox created the extensions API they had two competitions with (I think) rather huge sum of cash as prizes for the three top spots.

The quality and ingenuity of those early extensions were nothing short of astounding IMO.

Later extensions were also brilliant.

(To fully understand just how powerful the early extension API was, consider this: Firebug, for all practical concerns the precursor of today's developer consoles, was just another Firefox extension! Same with adblocking! Yes, ad blockers still exist, but only because of that legacy; no way someone would have proposed and gotten away with implementing that if the extension API had been invented today IMO.)


dbt [1] is a tool that is deceptively simple, but kind of created a new job role: analytics engineer. It allows you to define tables using SQL plus Jinja2, so you can add loops and variables to create your SQL code. It might sound complicated, but it's a pleasure to work with.

This improvement was only possible because of other improvements in cloud databases and reductions in storage costs, which allowed us to go from ETL, where we transform the data outside of the database, to ELT, where we load the raw data in the DB and transform it using SQL.

All these improvements taken together allow a single person to do a job that required a small team not many years ago.

[1] https://www.getdbt.com/
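The loop-over-columns trick is the quickest way to see the appeal. dbt itself renders Jinja2 inside `.sql` model files, but the idea can be sketched in plain Python (the `payment_methods` list and table names are invented for illustration):

```python
# Toy stand-in for dbt's Jinja2 templating: generate one SQL expression per
# column instead of hand-writing them. In real dbt this would be a
# {% for %} loop inside a .sql model file.
payment_methods = ["credit_card", "bank_transfer", "gift_card"]  # hypothetical

select_lines = [
    f"SUM(CASE WHEN method = '{m}' THEN amount END) AS {m}_amount"
    for m in payment_methods
]

sql = ("SELECT order_id,\n       "
       + ",\n       ".join(select_lines)
       + "\nFROM payments\nGROUP BY order_id")
print(sql)
```

Add a fourth payment method and the pivoted query grows itself, which is the whole point: the template is the single source of truth, not the generated SQL.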


Agree, this was a huge improvement on workflow in analytics.


GPT-3: for the first time, you need to read for a few minutes to tell fake from real. I used to play with LSTM language models and they barely made sense ten words at a time. The difference is a huge leap. My wish list for GPT-4: multimodal, so it learns text, images, video and other modalities; multitask, cultivating as many of its skills as possible so maybe it learns to combine skills in new ways; longer sequences and additional memory, maybe even the ability to use search/retrieval for augmentation; and last, recursive calls, the ability to call itself in a loop and solve sub-problems. I hope in the future a pre-trained GPT-n variant will be a standard chip on the motherboard.


When I started making music 10 years ago, I limited myself to using hardware only. Hardware synth, drum machine, mixer, effects etc. It was a great learning experience, which is why I did it. I wanted to understand the fundamentals.

5 years ago I finally took the dive into the digital realm with a mixer/soundcard combo and a DAW (logic X).

Forget 10x software, this was closer to 1000x faster.

I’m sure the early days of DAWs weren’t like stepping into the game in 2015, but the software workflow was stupidly quicker than doing everything physically in real time.

Nowadays we have heaps of choices for DAWs and soundcards and cross-platform compatibility. What an age to be a musician!


My first use of a proper capacitive touch screen on the iPad. It was an absolutely transformative device compared to all the insensitive resistive ones that you had to jab at. Multi-touch for zoom and auto-rotate were also innovations that felt like magic at the time.


Heroku. We used to colo servers or rent dedicated servers. You had to set up Linux/Apache/database, etc. Manage logging and backups. And then to scale up you had to do it all over again. First you split the DB onto its own server, then start sharding/load balancing/replicating. And then you take the whole site down by letting a log file fill a disk or something.

Heroku and 12-factor apps easily made building a business 10x easier/faster/cheaper.


Home Assistant has been a revelation in terms of a platform against which to create sensor networks. Their MQTT discovery isn’t ideal but it’s damn good.

Related to this, Tasmota as a platform that I can run on ESP8266 devices to keep my IoT local to my network.

A document scanner connected to Wi-Fi (wish it had the ability to send directly to my laptop instead of having the laptop pull from it, but still 10x better than connecting a USB cable or emailing attachments).

Last night I discovered Tabula, a GUI for extracting tables from PDFs. Saved me more than 10x the time over copy-pasting by hand. Fuck banks that only let you download the last 60 days of transactions as a CSV.

Discord is 10x better than most other chat platforms (Slack being the workplace competitor). Mainly this is because of how easy the signup process is.

Django is 10x better than the PHP I left.

Pelican is better than WordPress for my needs.

Waze’s ability to search for things along my route (gas, food, coffee, etc.) instead of in my current area.

Reddit. Reddit to me is a community in a box for any new interest. If I pick up knitting, there is already an established knitting community that will have lots of info and helpful people to answer my questions. Same with motorcycles, home improvement, bargain hunting, rug weaving, whatever.

Instagram. It’s 10x better than most social networks for interacting with people. Still sucks, but everything else sucks more.

AirBnB experiences. Had some great tours through them when I visited Italy a couple of years ago and was way easier than the individual scammy-looking tour company sites.


The very first iPhone after its release by Steve Jobs. Easily 100x better than the phones that came before it.

I still have it - though I'm an Android user now...


Strong disagree. It was slow, had barely any features compared to, say, the Nokia N95. Sure, it made leaps with certain key mobile features, e.g. voicemail and the chat-like SMS interface, but it was a big step down from the N95 overall in almost every other area.


- Git

Single-handedly let me keep the benefits of SVN while opening up shared code, with greatly simplified setup and management.

- Linux (Linux Mint - Ubuntu)

After years of crashing Windows and Macs, finally getting on Linux was a dream. App installs, native tools (git), server and database support being top-notch, etc... (i.e., my desktop environment was the same as my server!)

- VScode

After years of struggling through various editors, I found one that focused heavily on webdev, integrated with Git by default, and was super fast and open source.

- OpenSSH Server (not sure this is the right name)

Having private keys shared between computers made accessing remote resources so effortless.

- Ansible

My latest find. This has probably done more than 10x the value invested in learning and setting it up. Managing multiple servers, computers, setting up new environments, migrating between computers, has become FUN because of Ansible. Setting up a fantastic ansible playbook is like a game or satisfying puzzle.

- Macromedia Fireworks

Combined vector and bitmap graphics, had symbols (like shared library resources), non-destructive effects, live editable vector masks, native file format was web accessible PNG. Wiped Photoshop's fanny for doing web and interface design.


Oh man, Macromedia Fireworks was great. Folks made the most beautifully unique websites with it. I wish the app, and those design sensibilities could have survived. The web and app ecosystem today would be less monotonous and more fun to make.

Macromedia made the web a fun place on so many fronts. I really do miss Flash development (and Flash) too.

In an alternative universe we're all just using Macromedia tools and are being very productive and happy.


Not sure if this is the kind of thing you had in mind, but...

`clang-format` saves a non-trivial chunk of time during code reviews, sidestepping a very banal topic.


I'm not sure if it's a 10x improvement, but formatters and linters have definitely cut my time writing code by 50%, between stupid bugs and time wasted moving parentheses.

And enforcing those formatters and linters cuts the time I spend reviewing other people's code by at least 50% because I know those issues have been dealt with.


This was a while ago, but Borland's Turbo languages were revolutionary.


To my chagrin:

1) Robinhood for normal stocks

2) Disney+ vs AmazonPrime/Netflix

-> HDR for no additional fee, remastered exclusive content, a very full non region-locked (AFAIK) library, consistent streaming quality, straight-to-VOD shows, and premium movies.

-> Some might argue not 10x, I can be convinced to agree. It's a solid 1.5x at a minimum though.

3) AWS Workspaces vs RDP

-> Ease of use out of the box is just unparalleled.


> HDR for no additional fee

The first thing I turn off when I get a new TV is motion smoothing. The second thing is HDR.


Why?


Most mid-to-cheap end "HDR" TVs look absolutely awful on HDR. The colours are washed out and the contrast is blown out. If your device sends an HDR signal, in many cases it'll look worse than not having HDR at all.


I completely agree. I was an early adopter of a Sony HDR10 set and am rather worse off for it. Genuinely curious, though: what do you consider a good HDR TV?


The only way HDR looks good is on an OLED screen with perfect black levels, in my opinion.

When the entire screen can be pitch black, and a single pixel can be fully lit up, that's when HDR really shines.

All of the LED/LCD HDR solutions look like shit, in my opinion. Even the screens touting a hundred different lighting zones. Nothing compares to OLED.


OLED is about double the price, but IMO, it's worth it.

Being able to have TRUE blacks with no light bleed anywhere is amazing. Yeah, you've got TVs with a hundred zones as you said, but that just means you get blocks of grey when there's something on top of true black. Depending on the scene, that compromise can look even worse than just allowing all the black to be grey from the backlight.


GitLab - Having 12 products in the SDLC space rolled into a single product was a life-changing productivity benefit.


GitLab team member here. Thanks for this comment! I just shared on our team Slack.


Retool [1]. It's been years since I've run into a software product to support software teams which could so fundamentally change how you build things, and how you build the things you build. Retool can basically cut down the effort of making internal administrative dashboards by, like, a solid 70%. I still sometimes run into dashboards other engineers have made on Retool and say "holy shit, you can do this?" or "how did we have the time to build this? is it really that easy?"

I believe it's possible Retool will become the next Jira, in the sense that it's a totally internal tool that's so valuable companies hire people whose entire job is to just live in it and develop it out.

[1] https://retool.com/


Some parts of the Rust ecosystem belong in this category. For example, the way rustup and Cargo's feature flags with conditional compilation help cross-compile code to new platforms.

Yes, we also have cross-compiling in C/C++, but the extra tooling that cargo/rustup provide makes the 10x difference.


And docs.rs.

Having one place with documentation for every library is just amazing. Cargo as standard layout for all projects is such a force multiplier.


1. Windows 95, as opposed to Win3.x/DOS. 100% worth the hardware/software compatibility pains for GUI/multitasking/multimedia/PnP that just worked, an amazing thing at the time (and honestly, in retrospect, given the preposterous unstandardized diversity of PCs, their peripherals, and software in that decade)

2. SQLite, vs fopen and hand-parsed files for any random data storage that didn't rate a full blown remote DBMS. Never using sscanf again!

3. NodeJS. Memory Safety, Thread Safety, No Deadlocks, No build system nonsense, credibly scalable I/O concurrency, and only ~an order of magnitude slower than C? Absolutely worth the nightmare syntax and (then) lack of decent libraries. As soon as npm came on the scene (0.6?) Java ceased to have any reason to exist.

4. Uber. It's like a taxi, but the UI doesn't suck, and they tell you when the car is coming, and the car actually arrives, and they'll pick up in areas other than the 30% of the city and 50% of the day and 60% of the population that taxis are willing to cover in LA!

5. Linux network namespaces + associated tools. So many things I used to have to do on dedicated boxes/VMs, or multiple network interfaces, or single-purpose network equipment. Worth it for per-process ovpn tunnels alone, but you can scale it up to crazy elaborate SDN schemes if you need to. Blows away almost every middlebox and hypervisor networking tool.

6. AWS. There were lots of stick-a-host-on-the-internet services before, but nothing like this: software provisionable, software networking, reliable host-to-host and host-to-storage networks, no lead times... There are credible competitors now but it was a gamechanger for anyone working at the servers > fingers > datacenters scale.

7. This one's kind of goofy, but TimeSnapper. It just takes a screenshot of my display. Every 10 seconds. And saves it into a time-and-keyword searchable database. You can't remember exactly how you fixed that thing 5 months ago that's broken again? I can. The best productivity tool you can get for $40 with the possible exception of 1440 post-it notes.
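Item 2 in the list above is easy to demonstrate: everything that used to be fopen plus sscanf becomes a few SQL statements against a single file. A minimal sketch with Python's built-in sqlite3 (the table and file are invented; a temp directory stands in for a real config path):

```python
import os
import sqlite3
import tempfile

# One file on disk replaces a hand-rolled binary format.
path = os.path.join(tempfile.mkdtemp(), "state.db")

# "Saving" is just SQL: no struct packing, no custom b-trees.
con = sqlite3.connect(path)
con.execute("CREATE TABLE config (key TEXT PRIMARY KEY, value TEXT)")
con.execute("INSERT INTO config VALUES ('volume', '11')")
con.commit()
con.close()

# "Loading" is a query, not a parser: no sscanf, no version-skew bugs.
con = sqlite3.connect(path)
value = con.execute(
    "SELECT value FROM config WHERE key = 'volume'"
).fetchone()[0]
print(value)
```

You also get transactions, crash safety, and indexing for free, which is most of what people used to get wrong writing persistence by hand.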


You can still have deadlocks in NodeJS, when you play with promises and are not careful.


I'm just here with my popcorn waiting for the responses on NodeJS


Moving to Linux from DOS/Windows 3.1 in the '90s. (An OS that's both free and comes with development tools? I don't have to shell out a hundred dollars for Borland Turbo C++? What an exciting time to be alive.)

The Internet / world wide web.

Also in the late '90s I had a summer job at HP and was able to use their CAD tools (ME-30 and Solid Designer), which put AutoCAD, which I was familiar with, to shame. Unfortunately, they were very expensive programs that ran on very expensive computers.

Switching to ML-style functional programming (Ocaml and later Haskell) over C and Perl.

Switching to Wikipedia over the top Google results for a search query as a source of basic information. (These quickly converged to be the same thing.)


Also, one more I forgot: BitTorrent. Overnight we went from "oh no, everyone else is overloading the ftp sites, I won't be able to download the new Ubuntu / Mandrake / Whatever for a couple weeks until the demand dies down" to "hooray, everyone is downloading the same iso file I am, I'll be able to get it quick!"


Edit and continue, Visual Basic 3, 1994

After adding a major piece of new functionality to my app, I would step through the new code, fixing each bug as it appeared, and then continue stepping. Being able to produce bug-free code after only one run-through just blew my mind.


Visual Studio.

At the time I used this really shitty, platform-dependent tool, which is only an inch better than Notepad. And then, one day, I got to use Visual Studio. It felt like I went from a half-dead donkey to Star Trek-level tech. Amazing.


control^r in bash - literally 10x faster to type things, which (ideally) makes you 10x more productive :)

control^r is reverse search in the command history (for UI people out there).
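Conceptually, reverse search is tiny: walk the history backwards and return the first entry containing what you've typed. A sketch in Python (the history entries, including the host name, are made up):

```python
def reverse_search(history, needle):
    """Mimic bash's ctrl-r: most recent history entry containing needle."""
    for cmd in reversed(history):  # newest first, like the real thing
        if needle in cmd:
            return cmd
    return None

history = [
    "git status",
    "ssh deploy@example.com",        # hypothetical host
    "git commit -m 'fix tests'",
    "ls -la",
]

print(reverse_search(history, "git"))  # -> git commit -m 'fix tests'
```

The real implementation in readline is incremental (it re-runs the search on every keystroke), but the matching rule is this same newest-match-wins scan.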


If you like ctrl^r, you'll _love_ https://github.com/junegunn/fzf


If you like that you should try the fish Shell. Not great for automation but amazing for general use.


It's also Undo in Vim. Cheers!


iOS pausing music when unplugging headphones. Totally obvious nowadays, but a revolution to me when I experienced it the first time (coming from Nokia). It was one key experience that shifted my perception of my phone as something that needed to be “managed” to do what I wanted to something that supports me throughout the day


Google Maps when it first came out fits this description, for those who remember MapQuest and the Before Times.

I would also mention flight tracking and package tracking, which felt like sci-fi the first time you used them.


If this were true more broadly, we'd never get anywhere - most advances in science and technology are incremental, improving slightly on what came before. This applies even to great discoveries (standing on the shoulders of giants). Is there a more precise definition of "10x"? It's not well-defined enough as-is to survive on intuition.

I agree that many of the things posted here were 10x better - but this is a selection of the most notable software projects in memory: Not a reasonable threshold for if a business is successful!


Apache and *nix. For some applications, LAMP was a game-changer for rapid web solutions for me, back in the day.

Started with FreeBSD for my servers and box, then GNU/Linux on my boxes and laptops, as the hardware support got better on GNU/Linux. Apache, but really the whole stack, what a 10x!

No clicking, no checkboxes, no go to this or that website to download, just run a command or a shell script and pull in software you need, configs, etc ...

Of course I wish we had Lisp machines, but this is what I have to work with. Having *nix, learning it over 30 years ago, and it's still at the core of what I do. So much time saved over the years not re-learning the fads. I shudder to think how much time I would waste on the other platforms and their quirks, planned obsolescence, etc.

Was saved relatively early (enough): an iMac I got as a gift and tried to update to the next OS got fried. Perhaps a bug; some say malevolent design by Apple to get consumers to buy new machines. Don't care, done with that platform. Though I did buy an iPad 2 for peanuts, to read on. It's cheap because people can't update it anymore, so there's that.

I suppose I saved about a year of man-hours at this point in life, just using the *nix stack that just keeps on truckin'

& Common Lisp for rapid idea implementation.


Gotta say, Google's pioneering and Facebook's shamelessness in the personal information harvesting space has been, for me, the single most valuable feature of any website on the internet. If they had been even slightly less brazen, I may never have been convinced to opt out as hard as I have. Removing the most addictive and exploitative sites from my life has cut my time online and improved my quality of life dramatically. Could not recommend more.


1) Pandas, for data analysis and manipulation

2) BitTorrent (P2P file distribution). It is the best and fastest way to distribute files; however, its main problem is piracy

3) Self-updating software (Google Chrome was an early example)

4) AJAX - the web wouldn't be what it is today without it

5) VIM

6) System package managers - Brew / apt-get / yum

7) Language package managers - pip / npm / maven


ZFS. So many problems got solved for me all at once: no more having to spend money on hardware RAID cards, or having them end up being the single point of failure... and so having to spend more on RAID cards. zfs send really streamlines the backup process as well, so it actually happens. It basically made it possible to build an enterprise-class SAN that I wouldn't have been able to otherwise afford. So not technically a 10x, more like a 10(x+1).


Docker containers (and the Dockerfile).

Docker has almost completely eliminated the need for OS-level configuration management. That's an entire class of software that is now virtually obsolete. Developers don't understand this, but Ops people got back like 20-50% of our time, and we can now do Immutable Infrastructure with apps. (Though it appears we traded Puppet for Terraform.... F*#&*@^&^!!!)

Creating immutable images with stock packaging tools and allowing you to build on top of them was the core of the revolution. But the other critical part was the Dockerfile. At first it seems almost stupidly simple. But actually, its genius is in its lack of freedom or functionality. It is incredibly impressive that it hasn't been overwhelmed by logic statements, templating, nested frankenstructures, etc. It's also wonderful that it's constructed for humans, not machines. Probably the best configuration format I've ever seen.

It's definitely not perfect, and there's still a lot of improvement needed. But we're never going back to not using containers. (The only alternative for Immutable IaC at the OS-level are entire VM images built by Packer and a shell script, and it's just not the same)


* FaceTime - of course VC has been around a long time, but FT made it all better, usable even for old people, and expected.

* Windows Subsystem for Linux - for what I do with Linux (no GUI, some coding, some admin) it's a 10x improvement over using Gnome/unity as a daily driver GUI, VMs, dual booting, or using a Mac (just IMO).

* Postman - I don't know if it was first, but it made debugging requests 10x easier for me.


FT was a major leap forward in Video Chat. It completely changed the game.


`dplyr`, the data manipulation library in R.

Some prefer base R or data table, or python pandas, but for me, dplyr is essentially a _perfect_ data manipulation library for small to medium data, and nothing comes close.

Given that, for a data scientist, data cleaning is “80% of the work”, and that science is increasingly data science, dplyr has done a LOT of good.

Honorable mention goes to ggplot2, the plotting library, and the rest of the tidyverse.


I use OpenRefine for that (I don't use R though). How comparable is it according to you, apart from the fact one is a lib for a language and the other is a GUI?


Black

Black [0] has been 10x for me. Automatic code formatting frees up a whole bunch of cycles that would otherwise be spent trying to comply with someone else's style guide, or trying to convince someone to write their code in the approved style. I don't have to anymore.

[0] https://black.readthedocs.io/en/stable/


Single quotes are better.


Saleae Logic. Their USB3 “pro” analyzers are fantastic and the software is clean, fast and user-friendly. Using host RAM for the capture buffer is great and completely changed my workflow. I can capture hours or days of trace and sift through it after a fault occurs.

We have the usual other test equipment from the usual big names, stuff that costs 40x as much. For many jobs, I’d rather have the Saleae.


OS X / macOS. After about 5 years of Windows, 10 years of various Linux desktops, I got converted to Mac in 2012 and never looked back.


What makes you feel macOS is 10x better? I am a Mac convert myself but felt like it was more of an incremental improvement for me. Maybe 1.5x?


I made a basic CRUD app that 10xed the productivity of a group I was working with.

The big differences between my app and other solutions that they had used were:

1. I talked to users not only to get the work flow right, but also to refine it. This also helped with buy in once the app started being used.

2. I made it easy to use for the user base (most were not very tech-savvy).

3. Related to 2, I made it hard for a user to fail.


  - collaborative docs like google docs, google sheets, quip
  - cloud sync like dropbox (also evernote, icloud, google drive)
  - group voice chat
  - open source package management (all of rpm, yum, dpkg, ports, npm, cpan, etc.) – at a high level, run 1 command, get all the deps you need (and a license to use them) to build your software.


So many good comments

for me it's a lot of what others said, plus:

- Python as the default for scripting/automation: instead of using bash/batch/autohotkey/<programmingLanguage>, if you need to use some popular API it saves you time with its huge number of packages. So many good built-ins. It's dynamic enough to be comfortable but not insane like JS. So good.


I agree with this but Python is far more dynamic than JS.


It’s just as dynamic, but its typing is stronger.


This is really dumb but I was watching my pupils using a REPL yesterday. They didn’t know about readline so I showed them what up, down, ^A and ^E did.

I felt like I had done them a disservice for not sharing this sooner.

Readline is fantastic. I wish even more OS components were standardised (e.g. Firefox tabs and Inkscape dialogs just being windows; let my WM handle the layout.)


I'm showing my age, but here's an abridged list:

1. Google Search - before Google came out, it was wading through AltaVista and Yahoo's hand-picked pages and ignoring the 50% adult-content spam, or using a physical 'phone book' of web pages

2. Google Email - when it came out, the fact that it had gigabytes of storage was astounding

3. Wikipedia - this became one of the first resources for most queries that weren't highly specialized, and even then, sometimes Wikipedia would come through

4. Stack Overflow - hours or days of debugging are reduced to a search with an occasional copy/paste

5. Arduino - suddenly electronics became within reach and was reduced, for the most part, to software, all for a fraction of what microcontrollers and electronics cost just a decade earlier

6. Amazon, Aliexpress, Ebay - to a certain extent. Each provided a trove of sellers with access to items that were previously very difficult and expensive to find, sometimes with a 10x difference in price or accessibility. They've all kind of normalized out now but there was a time when they were more differentiated.

7. Raspberry Pi - A full linux box with a Ghz processor for $20-$50. There was a time when we were talking about the $100 laptop as the great white whale

8. Archive.org - one of the few resources that has an astounding amount of public domain work that can be searched, sorted and downloaded

9. Github - the amount of free/libre/open source software that can be accessed and used is at least an order of magnitude larger than its closest competitor (Gitlab? Sourceforge?). It's not just investing in Git's source management model, it's also providing a clean interface to search code and present projects cleanly

Maybe these are all obvious but you did ask...

Here are some software projects that I think give me a "10x" boost or I think have large potential:

* Bootstrap - Before Bootstrap, I could barely cobble together a website that didn't look like it came out of the '90s

* Clipperlib - When you need to do 2d polygon boolean operations, in a programmatic way, Angus Johnson's clipperlib is it

* WebAudio - I'm still playing with this but this provides an entry point to music creation that was orders of magnitude more painful before. Currently I'm playing with Gibber (gibber.cc)

* Face Recognition - This is now a Python package that you can use to find faces in images. This used to be bleeding edge technology just a decade ago

* Mozilla's DeepSpeech - though it still has its problems, someone who has a mind to could theoretically make an (offline and FOSS) competitor to Amazon's Alexa and the Google Assistant

Unix/Linux in general provides many orders of magnitude more productivity than any other environment I've worked in (at least for me) so I'm not sure it's worth going into all the tools, "classic" and recent, that help me build software, analyze data, do data wrangling or any of the other myriad of tasks that I do.


Good call on Arduino. The ability to write LED-blinky embedded software on a microcontroller using a simple IDE and a USB cable was definitely 10x from what came before. The barriers to entry in terms of knowledge, tools (compilers), and dev boards were immense before Arduino.


You mean getting an oscillator, caps, breadboard, MCU, avr-gcc, and a JTAG interface was too much work to get an LED to blink? :)

Arduino changed the game; wish I had it in college. Honorable mention to Microchip's PIC line - they had an Arduino-like kit that made a lot of those things easier too.


Came to say Arduino; I cut my teeth on BASIC Stamps in school, but today they still cost over $150 just to get all the hardware you need to get started. And forget about leaving just the brains in a home-monitoring gadget or clock project; they cost over $50 each!

Arduino MCUs can literally be removed from the developer board and function stand-alone with a few cents of external hardware, and the chip itself can be replaced with a blank one for a few dollars.


Didn't see this mentioned, but Wordpress and Drupal have had a huge impact on my early career.

Wouldn't count them as developer tools per se, but as a software provider, I could serve a lot more customers with ease and confidence thanks to hundreds of thousands of man hours put in by the Wordpress and Drupal developers.


Bash was an eye opener after having to deal with CMD for years.


Makes one wonder how someone was able to create a shell as shitty as CMD is.

It is remarkable how bad it is.


Python (coming from the C/C++ world), amazing productivity gains for basic stuff. Sure, there was Perl, Awk, etc, but I didn't learn those

Zsh/Oh-my-Zsh (from Bash)

Virtualization. Making it practical to run 2 or 3 different OSs on the same machine has saved me countless hours. Docker is good but it's a smaller step


1. Automatic driver installation and device discovery. IIRC Windows Me could install my first flash drive without resorting to fumbling with a mini driver CD.

2. Steam Proton: most of my library of games just works, and it is in an easy and convenient format.

3. Norton/Midnight Commander style of file managers.

4. Live CD distributions.

5. Password managers (Keepass)


VS Code's Remote Development. I'm not sure how many folks rely on this, but it's been amazing for me.


While it's amazing, they made the remote extensions proprietary. So projects like VSCodium (which is VSCode minus the telemetry) cannot use it. Very crappy move by Microsoft. I think they are extending that behaviour to other parts of VSCode - I think their python LSP has proprietary bits now.


Unity for indie/mobile game development. Especially around 8 years ago (Unity 4.x era) when it really took off.

It completely redefined what was possible for a solo developer or small team to create. Easy to learn, yet very flexible for experienced users. And moving from C++ to C# was a real game changer.


MDN (Mozilla Developer Network) was a complete game changer.


As a SW developer, the 10x improvement for me is having the call stack on a crash, like in Java, Python, or C with Valgrind. In combination with memory safety, so many bugs can simply be fixed with only the call stack.

10x is quite a lot, so please excuse me if I am a bit conservative.
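In Python this is the default behaviour: an uncaught exception reports every frame from the entry point down to the raise site. A tiny demonstration using the stdlib traceback module (the function names are invented to fake a three-frame crash):

```python
import traceback

def load_config():
    raise ValueError("bad config")  # the actual bug, three frames deep

def start_service():
    load_config()

def main():
    start_service()

try:
    main()
except ValueError:
    # format_exc() captures the whole call chain, from main() down to the
    # raise site. As the comment says, often that alone is enough to fix
    # the bug, with no debugger session needed.
    tb = traceback.format_exc()
    print(tb)
```

Compare that with a C segfault without Valgrind or a core dump, where you get the crash address and nothing about how execution got there.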


1. Power Query / Pivot Tables / Power Pivot. Directly connecting to SQL from Excel for data exploration is 10x for me. Personal record is 77,000,000 rows in Excel via Power Pivot. Lining up count vs count distinct on year-week is my go-to garbage-data detector. Weather data suppliers have an unseemly habit of over- as well as under-reporting. :-)

2. SYGRAPH, the data visualizer by Lee Wilkinson to accompany SYSTAT, before he wrote THE GRAMMAR OF GRAPHICS, which we now know as ~ggplot2

3. NT 3.51 - the first vision of what a stable workstation could be

4. SSDs: "20 times faster", but non-users can't process the meaning of those 3 perfectly good English words


Here's my top 10. Each of them was an "oh sh*t" moment.

1. Pong

2. First BBS connection (on a 300bps modem)

3. Mosaic

4. Altavista

5. Napster

6. Gmail

7. Google Maps

8. iPhone 1 (followed by the first homebrew apps, a full year before AppStore launched on iOS)

9. Patreon (ok, this one is not sw related)

10. Bitcoin

Bonus (both sw and hw): Oculus Quest 2. 10x compared to DK2 or other early VR prototypes I tested in the past decade.


Password managers for me.


Time Machine on Mac OS + the Time Capsule router.

I admit that it was my first real automated backup setup. But I guess it was also the first for many other people.

And the UI for recovering lost files by scrolling back in "the past" was just like magic.


OCRing items on screen with (Alfred App glue + Screen Capture.app + Google Vision).

WebUSB Printing and Postage Scales - (one click postage label creation without installing software/drivers or going through the OS printer queue is magical)


K8. It is night and day how well our organization operates now that we are fully on k8.

ClickHouse/BigQuery have allowed me to tackle massive analytics projects with a tenth of the effort it took when I had to set up a map-reduce/Spark infra.


I assume you mean k8s (kubernetes)?


Correct.


Clipboard History a la CopyClip for Mac

I'm honestly surprised it's not more widely adopted, the ability to have a shortcut key to access the last X things I've copied to the clipboard has made a huge difference in my dev work.
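The core mechanism is tiny - roughly a ring buffer of recent copies. A hypothetical pure-Python sketch (not CopyClip's actual implementation):

```python
from collections import deque

class ClipboardHistory:
    """Keeps the last N copied items, most recent first,
    skipping consecutive duplicate copies."""

    def __init__(self, max_items=10):
        self.items = deque(maxlen=max_items)

    def copy(self, text):
        # Ignore repeat copies of the same text.
        if not self.items or self.items[0] != text:
            self.items.appendleft(text)

    def paste(self, index=0):
        # index 0 is the normal clipboard; 1..N-1 reach back in history.
        return self.items[index]

history = ClipboardHistory(max_items=3)
for snippet in ["foo", "bar", "bar", "baz", "qux"]:
    history.copy(snippet)
print(list(history.items))  # most recent first, capped at 3 entries
```

The real apps just wire this up to the OS clipboard-change notification and a global hotkey.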


Being able to have my programs on a hard drive rather than switching between floppy disks was pretty nifty.

Going from a 300 baud modem to a 1200 baud modem is technically only a 4x difference, but it definitely felt like more at the time.


7zip. God winrar was annoying.


Not software but transition from Floppy Drive to USB was perhaps more than 10X. I remember the first time I used USB, I was amazed at the capacity of 128MB, vs 1.4MB, the speed of transfer and ease of carrying around.


The first time I imported data and ran a query in Google Cloud BigQuery ML / AutoML was an absolute, 100x eye-opener. An honest-to-God revolution. So fast, precise and automated. Many talk about "democratizing AI", but there it was. And every time I recommend or demo it, it gets the same stunned reaction.

Terrific list and discussion! I especially like the old school visual debugging tools: Firebug, Wireshark, PhysX/Nsight, etc. Really brings back the memories, and I really believe that in general having a picture of running processes is 10x more helpful than console output alone ;)


Launchbar for Mac from obdev.at. Makes using a mac so much quicker and simpler.


kubernetes. so much easier to treat "servers" as cattle now rather than pets. The way it manages many things that you used to have to write chef/puppet/ansible thingies for is just magical.


Roon.

Roon is a home audio platform. It plays on all the devices in my house.

The 10x difference is during parties. The iPad with the interface on it gets passed around and people queue up music they like, but also get to read about what is playing. It gets magical when people pipe up "Well, if you like that then try this..."

I don't know why it makes such a difference compared to an iPad with Apple Music, or Spotify. My guess would be that the music reviews, and its "take music seriously" presentation, encourage people to think about the music as more than just background noise? Anyway. Very happy.


Github. There are downsides to centralization, but it's still amazing that you can get a hosted and integrated bundle of version control, bug tracking, CI, patch submission, and Q&A (recent) for free.


I hope I don't get downvoted for this:

Internet Explorer 4.0

As somebody who wrote web apps back in 2000ish, IE4 was light years ahead of Netscape 4. Basically IE4 was close to what we have now, and NS4 (or Opera) could barely switch a color with JavaScript. While lots of the standards were already defined, NS4 would crash constantly (when doing JS). That was at a time when NS4 was the beloved browser and IE4 was trying to catch up (remember the lawsuit because they bundled it with Windows). Well, they also made the much better browser. Robust enough to build the first web apps.


The R packages plyr/reshape2 and subsequently dplyr/tidyr. Data manipulation in base R can be pretty painful (awk arguably even more so, and Pandas didn't exist yet), and going from that to an environment that's syntactically the nicest option available today for complicated manipulations of tabular data was a game changer. I know it's become fashionable to hate on the Tidyverse lately, but IMO it's the single biggest reason (though there's also a very long list of smaller reasons) that R is still relevant in industry.


and ggplot2 for R graphics. just. wow.


* jinja2 as general purpose templates

* org mode

* yaml (sorry, but I like it, and I write it with no more errors than the other formats)

* docker / docker compose

* rss

* rofi -dmenu

* .http files to keep requests

* sentry

* ansible

* atom hydrogen - the most comfortable programming environment, where I can execute my code as I write it; there are other solutions but they are worse for me

* spreadsheets

* vuejs - works as a compiled framework, but integrates everywhere: userscripts, jupyter notebooks, app prototypes, etc.


The boring answer is that most "sexy" features don't make that much of a difference. However, a lot of spreadsheet or manually processed data being automated is well over 10x improvement for a customer. Headline features like AI don't make nearly as much of a difference as not having to perform repetitive manual labor.

How much better is it to run a script to extract image file names out of a big csv file and convert them into a different format in a different directory if you were doing that by hand before? 10x? 100x? more?
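As a hedged sketch of that kind of chore (the CSV column name and paths are made up, and a copy-with-rename stands in for a real image-format conversion):

```python
import csv
import shutil
from pathlib import Path

def collect_images(csv_path, src_dir, dest_dir, new_ext=".png"):
    """Read image file names from the 'image' column of a CSV and copy
    each file into dest_dir under a new extension. The column name and
    the rename step are illustrative placeholders."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            name = Path(row["image"])
            target = dest / name.with_suffix(new_ext).name
            shutil.copyfile(Path(src_dir) / name, target)
            copied.append(target.name)
    return copied
```

Ten lines of script versus an afternoon of dragging files around is where the "boring" 10x lives.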


The GNU C/C++ compiler. Working on a game console with a proprietary compiler that let you pick: a) 16 floating point registers and 32 bit pointers or b) 32 floating point registers and 64 bit pointers. And also a shitty solution for a cpu bug. Or, GNU compiler! I'll have all my floating point registers, but 32 bit pointers, oh, and can we handle the bug without just adding a nop every bloody time it might happen? Thanks! It had been around a long time by that point, but it made a 10x difference on our project.


Macromedia Dreamweaver - It was 10x better than Microsoft FrontPage (I was very young, please don't judge).

Mac OS X - Moving from Windows ME on my Packard Bell tower, to an iMac G3 with Mac OS X was quite a pivotal moment in my life - everything just felt smoother, it was actually nice to use an operating system, not a battle.

Heroku.com - The ease of getting a Rails site live was 10x better than setting up your own server.

Git - I was always messing up SVN systems, as soon as I switched to Git it just didn't seem possible to mess it up.


Lots of good ones mentioned so far, but no one got Calendly.

Instead of dozens of emails and/or texts, phone calls, etc. to find a time when someone (not in my organization, so they can't normally see my calendar) can meet, I just email a link and they pick. It will add call info or Zoom links as appropriate. Tons of control over scheduling (synced to all your calendars, or even several people's; buffers; limits on meetings per day).

Basically it eliminates needing a secretary or recruiter or admin person to arrange meetings.


Version Control. Imagine going back to FTP-ing updated files onto servers not knowing if you've overwritten someone else's work. Or not being able to see a file's history.


One probably underestimates productivity increases a lot. You really only see how much something gives you when you have to go back to the old way.

For instance, large parts of the company where I am currently working still do not have short-lived feature branches. CI means that after checking into the development branch used by everyone else, you have to wait minutes up to hours to see if your changes broke something. Only seeing that pain and wasted time regularly lets you realize how much easier you have it now.


1. Hot reloading in Java. I wasn't aware of Erlang at that time.

2. Erlang. It was a 10x boost after doing a whole lot of CORBA/J2EE crap. I was quite late to the party even in the mid-90's.

3. Google search, maps compared to offerings from Altavista, Excite.

4. Turbo Pascal compared to anything else before it. I don't think I have experienced that level of programming pleasure since then.

5. HotJava on the browser. It was sad how Microsoft neutered the whole Java on the browser initiative.

6. Git. So much faster and pain-free compared to SCCS, RCS ..


For web development, error monitoring services like Sentry are a breath of fresh air: instantly find the exact lines of code that are causing problems, even in front end code, before anyone files a bug report.

Any tool that automatically captures a lot of data needs to be used with care (eg verifying that no sensitive data is sent to the server), but many tools in this space make an effort to scrub the most common fields. In practice, the payoff has been worth it most of the time.
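Error-monitoring SDKs typically expose a hook (e.g. a before-send callback) where this kind of verification can happen. Below is a generic, illustrative scrubber, not any SDK's actual schema; the field list and event shape are made up:

```python
# Keys whose values should never leave the application.
SENSITIVE_KEYS = {"password", "token", "secret", "authorization", "cookie"}

def scrub(event):
    """Recursively replace values of sensitive-looking keys with a marker,
    leaving everything else intact."""
    if isinstance(event, dict):
        return {
            k: "[Filtered]" if k.lower() in SENSITIVE_KEYS else scrub(v)
            for k, v in event.items()
        }
    if isinstance(event, list):
        return [scrub(v) for v in event]
    return event

event = {"user": {"email": "a@b.c", "password": "hunter2"},
         "request": {"headers": [{"Cookie": "session=abc"}]}}
clean = scrub(event)
```

Registering something like this on the send path is how you get the debugging payoff without shipping credentials to a third party.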


Steam Link (and lately nVidia Gamestream). It made my dream viable: having a server rack in my garage or basement, and stream my games to any room in my house. Wanna play in the living room TV? Sure! Want to play from your laptop in your room at full performance? No problem!

Later, I switched to Gamestream because the performance is vastly superior. Eventually, I'd like to enable multiple devices playing games from the same server... I might have to switch to a VM setup.


One of my interests is browser history, so I'll list these. Before I begin, shout-outs to Perl, GNU, Windows 2000, Windows 95, Norton Commander, Windows Commander, Linux, BSD, Digg 1.0, irssi, and, of course, HN.

Mosaic, the original "best viewed with" browser, which brought us inline graphics.

Netscape 3.0, the first "to the masses" browser, which made not only browsing, but authoring too, accessible for anyone and everyone.

Internet Explorer 3.0, not quite 10x, but it was the first browser with a truly beautiful UI/toolbar, still one of the nicest looking ever in my opinion.

Lynx, one of the oldest browsers, allowing Web access to countless individuals in countless scenarios and situations.

Opera 3.0, the super fast browser which tried to stick to standards. Along with my own add-on to give it status-colored tabs, this was my first "better than mainstream" browser experience.

Internet Explorer 4.0, the first browser which had true DOM, true dynamic HTML, if you coded carefully.

Netscape 4.0, the fastest browser for a long time, one I stuck with long after most had given up. Proper caching, and blazing fast rendering of uncomplicated HTML.

Internet Explorer 6.0, with reasonable standards support, DOM, CSS, and embeddability with full DOM access from the parent app, a breakthrough in scraping abilities.

Opera 12.0, even today one of the best standards-supporting rendering engines, fast beyond imagination, fully Acid compliant, stable, and one of the first single-key shortcut supporters.

Mozilla Suite and Firefox 1.0, the best browser of its day, not only fast and flexible, but allowing previously unimaginable UI improvements and enhancements, spurring an era of creativity in development.

Vimium/VimFX, a new level of keyboard accessibility in browsers.

Links, and Links2-GUI, a new level of rendering prowess and accessibility in the console.

w3m, the console browser beast for nerds, with even better accessibility (albeit with a learning curve) and customizability.

And, last but not least, the King of hacker-friendly browsing as of today, qutebrowser, a whole new level of stability, accessibility, and clean design, albeit with a week-long learning curve.


Google Photos. Easy image backup, arranged chronologically, with the ability to arrange stuff into albums. I haven't found anything quite like it since they killed off Picasa... and it has the added bonus of working across my operating systems.

Google being Google... I still backup my photos elsewhere, but I'm more than happy to keep paying for the convenience that it offers. It's amazing that I can pull up photos from 10 years ago on my phone on a whim.


The Microsoft Intellimouse. When I was in school in the '90s, I had to de-gunk mouse rollers almost every time I sat down in a computer lab. A low cost optical mouse for generic surfaces made all that work just evaporate.

(I guess that's not strictly software, since it needs a basic digital camera.)

https://en.wikipedia.org/wiki/IntelliMouse


Sinatra.

The Ruby framework was a game changer for me to quickly prototype API concepts, experience what the client developer experience would be like, and learn best practices.


Using a tiling window manager over a traditional desktop environment. For instance, I use i3. I have it set up with 10 workspaces: w1 is for the browser, w2 is for notepad, w3 is for the code editor, w4 is for the file manager, w5 - misc stuff, w10 - terminals. I can switch between them immediately with Super+(1-0) - no clunky animations, no waste. It's more convenient to work that way on a single monitor than using 2 monitors.
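A minimal sketch of that kind of setup in i3's config syntax (the keybindings and window classes here are illustrative, not the commenter's actual config):

```
# Super acts as the modifier key.
set $mod Mod4

# Super+1..3 jump straight to a workspace, no animation.
bindsym $mod+1 workspace number 1
bindsym $mod+2 workspace number 2
bindsym $mod+3 workspace number 3

# Pin applications to their workspace at launch.
assign [class="firefox"] 1
assign [class="Code"] 3
```

Once every app has a fixed home, "where is that window" stops being a question you ever ask.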


First example that comes to my mind is Jupyter Notebooks with NBDev - a game changer for some kinds of software development. Coding feels way more efficient, even without all the extra gains that come with having tests, documentation website, CI, package management etc built in. (https://github.com/fastai/nbdev)


Staying on topic with HN today I have to say Docker. Getting running dev builds through Docker has definitely saved me a '10x' metric of time.


A lot of people dislike it, but Maven was a massive productivity boost.

A tool that automatically downloads dependencies, manages their transitive dependencies, resolves conflicting artifact versions, is able to check for changes, constructs proper classpath, standardizes builds, compilation, testing, artifact storage and versioning.

The onboarding went from days of fiddling to "just download Maven, run mvn install and off you go"


A more recent one:

Retool

I just started using them, and it saved an ungodly amount of time setting up internal dashboards. Typically speaking internal dashboards are an afterthought, don't work well, accumulate a lot of tech debt, and yet are indispensable to running an online service. Retool isn't perfect, but what you get is way beyond anything you'd be willing to build for yourself early on.


vim - when I started using it around 2004, switching from UltraEdit and Visual Studio. Haven't stopped using it to this day.


Perl.

Before JSON, XML and other standardised data exchange formats became the norm, and while Java was still playing the catch-up game with libraries, Perl was the go-to tool for achieving over a weekend what projects in other languages took months to achieve. This was true as late as 2012.

Those days are behind us. But it was something like a magical tool back then.


Infrastructure-as-code (Ansible, Terraform, CloudFormation). I can easily manage 10x the infrastructure once this is in place.
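For illustration, a minimal Terraform sketch of the idea (the provider, resource name, and AMI are placeholders): once infrastructure is plain text like this, reviewing, versioning, and replicating it scales in a way clicking through consoles never did.

```
resource "aws_instance" "web" {
  ami           = "ami-0123456789abcdef0"
  instance_type = "t3.micro"

  tags = {
    Name = "web-1"
  }
}
```

Run `terraform plan` and the tool diffs this declaration against reality before touching anything.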


California DMV services going online are 100x


I’d say the little known pricespy. I barely even check product reviews any longer because the popularity ranking will always have the highest score for the best product and the filters are almost always spot on.

Before I’d spend weeks researching things, making my own spreadsheets. Now it’s just a five minute ordeal to find the best gift.


Google Earth.

In many ways this was the killer app for first version iPads. It's still top-echelon, and a delight to use (mostly).


First 10x would be floppy drives to CD drives. In '99, I used to copy games from internet centres by splitting them with pkzip across multiple 3.5" floppies and extracting them in MS-DOS on my Windows 3.11 computer; later came CD drives and CD writers with 700MB.

Notepad++ with tabs and keeping the tabs session saved.


Not software, but financially: option spreads. The ability to construct risk/reward-defined outcomes completely changed the effectiveness of my investing.

These never used to be realistically available to retail traders, but they are incredible financial tools if you take the time to understand the mechanics.


The availability of the "mpris" dbus interface[1] for controlling media players. I've used joysticks, & midi mixers, & command line tools, & my phone to control whatever media player is playing. I can keep an "audioscrobbler" program running that will record whatever I'm playing on whatever app I'm using, without needing to configure each app.

It's so nice being able to have programs that extend programs. Rather than needing to build a big robust program with all the features built in, I can have tools that access & enhance whatever other program is running. This is such a better model for software, so much more creative & extensible, & having this modular software environment for media playing has absolutely has been a 10x experience over monolithic applications.

[1] https://specifications.freedesktop.org/mpris-spec/latest/
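A small sketch of why this works: every MPRIS-compliant player exposes the same interface at a well-known bus name, so one generic command reaches any of them. This hypothetical helper just assembles the dbus-send invocation (the player name is an example):

```python
def mpris_command(player, method):
    """Build a dbus-send command targeting any MPRIS-compliant player.
    Every player answers the same object path and interface."""
    return [
        "dbus-send", "--session", "--type=method_call", "--print-reply",
        "--dest=org.mpris.MediaPlayer2." + player,
        "/org/mpris/MediaPlayer2",
        "org.mpris.MediaPlayer2.Player." + method,
    ]

cmd = mpris_command("vlc", "PlayPause")
# On a desktop session, subprocess.run(cmd) would toggle playback in VLC;
# swapping "vlc" for any other MPRIS player needs no other change.
```

That uniformity is what lets joysticks, scrobblers, and phone remotes all drive whatever happens to be playing.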


1. Specifically AWS RDS and Aurora. It is incredible how much time it saved for me in the past 7 years.

2. Resharper from Jetbrains and all IDEs from them. Using something like Eclipse was such a pain and nightmare.

3. Cloud and SaaS as a concept. Everything just became 10x better when moved to SaaS.


Notion. It's the first organization tool that effectively ties together todo lists, project and time planning, and knowledge management. It's the first todo system that stuck for me, since it allows you to work with higher-level goals, scheduling, and hierarchy.


SPARK:

At the time there were some motions towards a "better MapReduce," Scalding (ooh weee that had a learning curve) comes to mind, but I feel like Spark really lowered the bar for getting big data tooling and the ecosystem into mid level and pre-ipo orgs.


For those who mostly work on terminals, GNU screen/tmux is truly a blessing.

Also, if this isn't considered cheating, Python. Now I can automate everything without worrying about the quirkiness of shell scripts.

My work setup hasn't changed much in the last two decades.


Twenty years ago, GUI designers like Visual Basic or Delphi. You drag-and-drop the controls on a form to create a layout, then compile it instantaneously and have a complete GUI app. Nothing else has come close to that productivity since.


VIM and VIM key bindings on other software (e.g., Vimium extension for Firefox and Chrome)


https://www.stitchdata.com/

A happy user. It helps you connect data from multiple places and write it to a single source. Anyone know a similar open source project like that?


ggplot2.

The 10X was not just that it made it easier and faster to make exceptionally beautiful plots. Rather it was that it made me realize for the first time that software was, fundamentally, about an implementation of ideas. That the underlying idea, together with how well it's communicated through the API design and documentation, really matters. Reading Wilkinson's description of the grammar of graphics, and then seeing that translated to a code library... That was amazing. It defined what I think software is, as someone who was new to the world coming from an engineering background where coding was just an easy way to do linear algebra.


Interface Builder in particular, and the NextStep/Cocoa (Mac OS X) system generally.


Ngrok for webhook testing, along with postman. The replay feature in ngrok is amazing.


I can go with a couple of my own experiences:

1. Ride hailing apps like Uber, Lyft

2. Wix.com and similar website builders


> Wix.com and similar website builders

I understand this in the sense of a website builder that gives end users a way to build a website - but wix.com is a world of pain.

It tries to lock you in (no way that I have found to add javascript without using their online IDE), the output is really inefficient and buggy (how on earth did they manage to duplicate one of my menu entries?) and their online editor leaks so much memory that Chrome consistently crashes after a while when I try to use it.

I mean, WordPress has so many annoyances - but compared to Wix it's not nearly as nerve-wracking.


I would love to hear people's thoughts on what hardware saw 10x improvements too:

https://news.ycombinator.com/item?id=26485925


VPN servers and clients - I don't know how to build one. There are many open-source VPNs like WireGuard; I just don't understand the why of this and that. And how do I make a simple VPN? How do they intercept our requests, etc.?


Friendster was an amazing product for the first year or so. No dark patterns, just sign up, find or invite your friends, and explore their friends in a process that felt quite revolutionary and seriously fun at the time.


REST/modern interface specs and DX tooling is probably a 2x over SOAP for me. Not 10x, but the reduction in interface and validation issues has led to a vast improvement in consuming APIs from other organisations.


I know this is an unpopular choice, but I would say WhatsApp. It completely changed text communication, especially in countries like India, where everyone from elders to teens was able to use it without any learning curve.


React Native/Expo

Mobile app development used to take a lot of effort with separate builds for iOS/Android and low code reuse. React Native changed the game and made building mobile apps easy and much less expensive.


Gmail, Git, tmux


I'll give a shoutout to the Microsoft Foundation Classes and Visual C++. That, along with the documentation was a VASTLY better way to write Windows apps than the previous C code and WM message loops.


svn to git/github was a 10x for sure


Hire engineers to fix the root cause instead of hiring "employees to do job X". Engineers can automate almost anyone away by fixing the root cause. Lemonade is doing a great job at this.


- terraform. Declarative, centralized, cross-platform configuration. There's no way I'd be able to manage the amount of stuff I am without it (or some comparable alternative like pulumi)


Firefox! Oh my god, there are tabs! I had never used anything but Internet Explorer. Browsing the web was never, ever the same. 30 minutes in and I didn't know how I'd lived without it.


- Internet

- fibre

- email

- Smartphones

- gpus

- starlink will be one

More personal:

- language servers/linters

- Package Management

- a mouse

- dictation


Great list. I agree with pretty much all of them but I'd also add the following to my list:

- Multi-Core CPUs which enabled switching between applications pretty much instantly.

- Solid State Storage


Solid State was the last great change to our personal hardware; computers went from meh to awesome in a flash.


Breaking out of the 80 cols × 25 rows box. WIMP interfaces in general.


Bitcoin for money you as an individual control and cannot be seized. There is no alternative so it’s infinitely better (altcoins don’t count due to governance or tokenomics issues).


My list is more digital-creative than hacker:

- Photoshop layers (non-destructive editing)

- Bryce (3D for consumers)

- Flash (easy rich media web apps)

- Flash video (online video explosion)

- Plex (easy digital media collection)

- Traktor DJ Studio (digital DJing)

- Wordpress (self-hosted CMS)

- Pacemaker (auto-DJing app)


Python. The productivity and elegance of it are way more than 10x.


Parse, which I got burned by when Facebook shut it down.

But similar services like Firebase are okay, I guess.

For me they were really a jumping-off point to doing similar things myself, slightly more manually.


Power query for Excel. It trivialized data transformation for admin/operations users, who would have spent 50x the time on a VBA solution or (more likely) not bothered.


Showing my age: windows 95 being able to play FULL SCREEN VIDEO


The Weezer video it included! Still love that song!


YES! That’s the one I was thinking of exactly. Such a great video anyway, but even more amazing on a 14-inch monitor that had previously only barely seen 256 colours moving about slowly.


This is a little further from the beaten path, but Wireshark, especially with custom per-project plugins, is a complete game changer for network application debugging.


Svelte.

It's so much better than React or Vue. It's generally the same ergonomics as a developer, but the speed and simplicity when you push into the browser is revolutionary.


Using a terminal instead of punch cards. That was liberating.


Although I started while jQuery was dominant, I have to imagine jQuery was a 10x improvement over vanilla DOM manipulation and dealing with cross-browser issues.


WordPress hands down. Built in CMS, easy to use editor, not to mention the extensible plug-ins system. You could get a site up and running in an hour.


My first 3DFX Voodoo card.

And maybe jumping from a 286-12 to a 486DX-33.


+ Compiler Explorer

+ Jasik Debugger (68000 Macintosh) - One of the best pieces of software that I have ever used. It could still compete today.


Distributed version control was such a huge win that I really couldn't imagine not using it now. I think this is a fairly common sentiment.


Uniswap in 2020. So much easier to buy any altcoin compared to making an account, passing KYC and sending crypto first on all other exchanges.


Storage combinators [1], which are my own, so shameless plug, I guess. When I wrote the paper, I asked colleagues if they could say something about the productivity increase. They wrote back that it was a 2x improvement, which I put in the paper. When I bumped into them at a meet-up, they confided it was actually more like 10x, but they hadn't said that because they felt it sounded unrealistic.

¯\_(ツ)_/¯

[1] http://objective.st/Publications/


Amazed not to see markdown listed anywhere (or textile before it)... major step forward when it comes to inline and in-situ documentation!


Windows NT4.

In two years we went from Win3.11 (delightful cooperative multitasking et al.), via Win95, to something that would survive a nuclear bomb.

Then security happened.


How come everyone is talking about boring software from 10 years ago? Check out Zombo.AI or LightSpace. We are living in the future.


DOS - high memory and expanded memory. It allowed games (OK, and maybe some productivity apps boooorrrinnggg) to take giant leaps.


PKzip - way faster and better compression than previous things like ARC. The first version control I did was project01.zip, project02.zip, etc.

SideKick - Being able to pop up and edit underneath a running MS-DOS program was a game changer for me

Turbo Pascal - Being able to compile programs in less than a second in MS-DOS was magic, compared to 15min - hour waiting for the compiler queue on the VAX at school, only to find out you had an error.

Backpack Portable Hard drive - A piece of hardware, but being able to boot a floppy and have 100 megabytes of storage instantly available was like magic.

EDwin - TurboPower Software - the first text editor (that I used) that could record and playback macros, I did all kinds of cool stuff with it.

GoBack - A system tool that kept all the changes to your hard drives.. the salesman demo involved deliberately infecting a system with a virus... then undoing it via GoBack. Unfortunately the wrong people decided it was too slow and "optimized it for speed", which killed the ability to undo virus attacks.

MultiLink - Allowed running of multiple MS-DOS users with serial terminals, usually a Wyse 60.

The $25 network - Allowed the very slow emulation of networking, with just plain old serial ports and cables. Saved hundreds of dollars if you only need a file now and then, back when Arcnet cards were about $100.

Delphi - GUI development for Windows that just works, and like Turbo Pascal, compiles in a blink. Drag your components into a form, hook up the events, make a report or two, and you're done.

Microsoft Office - This one is way under-rated

  Microsoft Excel - Reactive programming, comprehensible by humans and accountants.

  Microsoft Word - The outliner is quite useful for keeping track of tasks, and the details of projects

  Microsoft Access - being able to do a forms based database with nice reports, master/detail records all with zero SQL required is powerful stuff

  Microsoft Exchange/Outlook - Exchange is *the world's best database* disguised as a task manager/calendar/email server/client. You can make offline changes, and they just work, consistent with expectations.

WebDAV - Uploading by just copying to a folder in Explorer was far more intuitive than FTP.

Mercurial - Being able to keep old versions without sucking up the hard drive was very nice.

GIT/GitHub - Being able to keep all versions, branches and push them almost instantly to the web.

Python - The ability to get a lot done in almost no code is very powerful. It's too bad that there's no good GUI for it that works as well as Delphi.

VMware - Ersatz Capability Based Security - The virtual machine gets a set of resources, and nothing more. It'll do until we get better Operating Systems. Being able to save a machine as a file is a very powerful thing.

ThumbsPlus - A photo organizer from Cerious Software, keeps thumbnails in a database, does tagging, etc.

Picasa 3.0 - Killed by Google; does local photo management with local facial recognition. Helped me tag the more than 10,000 photos of my daughter. 8)

Hugin - Panorama alignment software - very handy for my experiments in virtual focus/synthetic aperture photography, and for doing landscapes.

GIMP - Orders of magnitude better than Microsoft Paint

WSL - Windows Subsystem for Linux - Allows me to run Ubuntu and Windows programs at the same time. VS Code supports running your compiled code in Linux, while it lives in Windows... wizardry!

GNU Radio - If you have a fast enough machine, you can build almost any type of radio you want in a flowgraph, and it supports $25 USB dongles that receive 25-1200 MHz.


Picasa was the best thing, I miss it so.


Kindle


This one little brick has more than my entire bookshelf on it...and isn't even close to full. It's one part of the future I really appreciate.


I was frankly shocked at some of the things mentioned here. Instagram? Wix.com? Uber? Discord? Seriously?

BUT that got me thinking: Could it be that a large portion of the 10x improvement could be in marketing, i.e. you'll have to be able to market your product as "10 times as good", regardless of whether or not it actually is?

AND secondly, could it be that many if not most products that are marketed as 10x as good are also - at the same time - 10x worse?

Just picking random examples from the thread:

- Gmail: Definitely 5-10 times better than Outlook Express or Horde. But also at least 10x worse when it comes to support, privacy, flexibility, agency in regard to your own data and server configuration and many other areas.

- Siri/Alexa: Speech recognition and what you can do with it is easily 10x better than what existed before. However, again at least 10x worse in terms of what you can control about the underlying technology and hardware and in terms of (controlling) what happens with the recordings of your voice. Also, if you want to go so far (someone mentioned dictation, more generally) you could also include lost jobs (e.g. secretarial jobs/assistants). Not convinced so much myself by this argument though.

- AWS Workspace: That might be somewhat more cost-effective and easier to set up than a local VDI infrastructure. But in many ways it is, or can be, 10x or infinitely worse than having your own server.

- Wix.com: 10x easier to use than Notepad++ for web development (if you don't know what you're doing) but arguably at least 10x worse for good websites, web developers and designers, people locked into subscription payments, limits on flexibility etc.

Other "great innovators", like Reddit (severely optimised towards monetisation, data extraction and advertising over the last years), Uber (questionable employment practices and corporate culture), even Google making memorisation unnecessary (everything's just a click away... but what happens if the internet is down?) and Amazon ("killing local businesses since 1994" ;-)) also could be argued to come with severe downsides.

Personally, I would greatly appreciate much more widespread 2x innovation vs. chasing the rare unicorn 10x innovators.

70 MBit/s internet would be much nicer than the 35 I can get right now. I don't need (nor would be willing to pay for) 350, let alone if the 350 are only available in large cities. If Gmail was just one tenth as innovative in parsing my mails for data nuggets and instead cared ten times more about my privacy, I would be more than okay with that.

Personally, real 10x innovations without large apparent downsides (although there might be some) for me right now are Wikipedia and Starlink. I also want to mention Delphi, whose approach to programming and UI design was definitely a 10x innovation over existing solutions at the time (in many ways it can still be 2x as good even today, but unfortunately, they've also increased the price 10x and stopped caring for their users).

On a grander scale, a well-working and ethical interplanetary species/society might also be a 10x improvement over being confined to Earth only. But that might very well turn out to be incorrect.

As always, if you disagree, please DON'T downvote, instead reply and tell me why and how you Think Different™. I post here because I'm curious to discuss, not just to share and read isolated opinions.


You have a point. A product can be 10x better only in one or two aspects. And there will be downsides in other dimensions.

But usually that suffices because the other dimensions don’t really matter or have ceased to matter (like MS Word’s advanced features).


I was going to post something similar. There has been some amazing technology in the past 30 years, but the argument can be made that we are focused on the wrong things. I love that anyone can carry all the world's knowledge in their pocket, but we have also increased the hours people spend working so much that we no longer have TIME to take out an encyclopedia and look something up the old fashioned way.


Often the 10x improvement is just an ∞x - because you're overcoming the barrier for someone to try it. The key is getting them to try it.

Often "worse" solutions win in the apparent 10x improvement because they're the ones that finally get people to try it out.

Uber would have been as useful to me in the beginning if literally all it did was hail a cab for me. The "ridesharing" part was pointless (for me).


ColdFusion - it's derided now, but in the mid-to-late 90s the productivity gains it gave you over other options were amazing.


I have a 0.1x example: every OS ships a file explorer instead of an orthodox file manager like Total Commander.


The Ubiquity plugin for Firefox. It treated the web as an API, sending requests to forms without opening the web pages first.


libtls. I wish this API were more popular than it is, as it makes using TLS way easier. https://github.com/spc476/libtls-examples/blob/master/get1.c


CMA-ES for black box optimization. Insane.


VisiCalc, Turbo Pascal, WATCOM Compilers, Perl, Mathematica, C++ STL, Memory Mapping APIs, cURL, clang/LLVM


Emacs: it's a versatile and easy-to-configure code editor. If you are a fan of Org mode, then you have to use Emacs.


A few of my favorite things: ShellCheck when dealing with bash scripting; Pylint for Python scripting; package managers (yum, pip, brew - all of them); Valgrind for C/C++ development; IntelliJ for Java development; unit testing; Stack Overflow, GitHub code search, Google + DuckDuckGo; Libgen.


VLC Media Player

It just works. No more dealing with codecs, or having multiple players installed for different media types.


Going from traditional polygon modeling to ZBrush.

Reading books on an iPad.

React.

Discord for any interactive online communication (text/voice/video chat).


+ Ruby

I never thought programming could be so enjoyable and elegant.

+ Ruby on Rails

I never thought building a web app from scratch could be so quick and fun.


- Amazon Prime

- VLC

- Netflix

- Slack


Firefox, when I was still using Internet Explorer: it had tabs, which IE did not, and it ran on Linux.


As a Mac user: CleanShot and Alfred. Very well-thought-out products; I'm a happily paying customer.


Notarize.com - I was blown away by how easy, simple and fast it was to get documents notarized.


Laptops. OK, I know that's hardware, but I write software, and they made a 10x difference.


Prisma ORM. It's an ORM done right and makes storing data in an RDBMS fun and convenient.


Zocdoc. It is SO easy to find good doctors now compared to what you used to have to do.


Git + Github


While it helped as a conversation starter here, the 10x rule is hollow punditry.


Running Slack, Skype and Zoom in browser tabs instead of 3 extra Electron apps.


Surprised no one mentioned the iPod Nano, Spotify, and then the iPhone (for everything).


Tab-completion in the shell
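
For a peek at the machinery behind it (assuming bash is available): the `compgen` builtin is what bash's programmable completion uses to generate candidates, and you can call it directly.

```shell
# compgen is the bash builtin behind programmable tab completion.
# Ask what completions TAB would offer for the command prefix "ech":
bash -c 'compgen -c ech'   # lists matching command names, e.g. "echo"
```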


Delphi.


- Revolut (vs. traditional banks and traders)

- Wolt (vs. Lieferando, Germany)

- Hey (vs. Gmail, Fastmail)


AirPods are a 10x in terms of my day to day life. I’d say they count.


I guess torrent clients were pretty useful for sharing large files :P


Safari hooked up with Messages to prefill MFA codes sent via SMS


One-click deployment to prod, as opposed to a deployment ceremony.


Mathematica

Even just using it to simplify equations was worth the asking price.


Webflow - have been using it recently and it is excellent


AWS and DynamoDB were a revelation. Also C# generics and .NET LINQ.


WhatsApp relative to everything else at the time


Genius Scan. I never used a scanner ever again.


+ ZeroTier

+ Homebrew

+ (DIY) Aliases in my shell profile (.bashrc, .zshrc, etc)
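
A minimal sketch of the aliases idea, using a made-up temp file in place of a real ~/.bashrc:

```shell
# Write a few example aliases to a stand-in for ~/.bashrc or ~/.zshrc
cat > /tmp/demo_aliases <<'EOF'
alias ll='ls -lah'
alias gs='git status'
alias ..='cd ..'
EOF

# Source it, as the shell would on startup, then print one definition back
. /tmp/demo_aliases
alias ll
```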


Unlocking my iPhone with a mask on in 14.5


For those not fully versed into Apple stuff:

Parent comment is referring to the new functionality in iOS 14.5 that allows you to use your Apple Watch to unlock your iPhone if it concludes you’re wearing a mask while trying to unlock with Face ID [0]

[0]: https://www.macrumors.com/how-to/unlock-iphone-wearing-mask-...


Not a single mention of Spotify? The first version was magic: click a song and there was no lag, it started playing instantly.


Definitely "find -exec"
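
A minimal sketch of the pattern (the scratch directory and file names below are made up):

```shell
# Create a scratch directory with a mix of files
mkdir -p /tmp/findexec-demo
touch /tmp/findexec-demo/a.log /tmp/findexec-demo/b.log /tmp/findexec-demo/keep.txt

# -exec runs a command on every match; {} stands in for each path.
# Terminating with + batches many paths per invocation (cheaper than \;,
# which forks once per file).
find /tmp/findexec-demo -name '*.log' -exec rm {} +
```

The same shape works for anything from `chmod` to `grep -l`.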


1. what.cd

2. auto-update feature on Google Chrome

3. Uber/Lyft

4. ZFS

5. Cheap VMs (R.I.P. Slicehost; DigitalOcean, Hetzner, etc.)

6. Ruby on Rails

7. Postgres

8. Spotify

9. 1Password

10. Arq backup

11. Dropbox

12. Wireguard


10x is too web 2.0 - we're now on 200x -- get on with the buzzwords if you want to remain in the 200x category.


Steve Jobs once said "don't make your users do shit work", or something to that effect. To me, that boils down to three things:

1) managing resources manually, be it memory, storage, organization, etc.

2) giving the computer information it already knows

3) waiting

So I count as 10x innovations, things which massively reduce those three:

1) Plug-n-Play, DNS, and all forms of auto-configuration.

2) WiFi, LTE, and Broadband. Seriously, I started on 300 bps modems in the 80s, and my first internet access was at 2400 baud in the late 80s over dialup. It's hard to remember how long I used to have to wait, for pretty much everything.

3) Web vs "install". Web surfing created a new stateless paradigm: always up to date, effortless changing, like TV channel surfing, and as soon as you leave the site, you can forget about it. No "uninstalling", no shit-work created for later cleanup. The Web Cache eliminated the need to INSTALL. Regrettably, Steve Jobs reintroduced the old paradigm, which I think is a huge step backwards. Phone apps should stream in as needed and be cached, and be evicted when space is needed. I should never ever have to visit my phone's storage page and delete stuff.

4) Automatic information organization. Search. Web search, Gmail bundling/auto-folders/priority/spam detection, Google Photos. Google Photos is the latest example. I take as many photos as I want. I never organize them. I don't make albums. I don't do anything. Peace of mind. My stuff is there. And when I need to find that photo of my daughter with a panda and a yellow raincoat, it finds it.

Don't waste my time with uploading and organizing my photos.

5) Service economy. Uber, Doordash, Amazon. I need something, or I need to get somewhere? It's incredibly easy to get it, and get it quickly, compared to what it was 20 years ago.

6) the iPhone. I was an early adopter of smart phones for years. PalmPhones, Nokia Communicator, iPAQ, etc. It wasn't until the iPhone combined a full-screen touch interface with a REAL Web browser, not XHTML Mobile, or WML, but a REAL browser experience, combined with fast WiFi, that the phone truly turned into a usable mobile experience for me. Sorry, but even expensive Nokia smartphones, and Java Personal Profile, and Wifi-enabled feature phones, or XHTML Mobile on Opera Mobile, couldn't hold a candle to this. The iPhone 2g also had usable YouTube and Google Maps out of the box. It was a quantum leap.

7) The 3dfx Voodoo1. It's hard to describe, but the Voodoo1 + QuakeWorld was a tipping point. To me it marks the dividing line in history between gimped, shitty wannabe 3D accelerators and the first one that had enough power and capability to run a real-time multiplayer first-person shooter. (Also, hats off to Carmack's ping-compensating netcode.)

8) Linux. I started out on BSD (FreeBSD), and before that on IRIX, HPUX, and Solaris at college, but there's no denying that Linux was the inflection point for widespread adoption of Unix everywhere. And that widespread adoption meant anything else you wanted to do was much easier. In the 80s and 90s, whenever I downloaded Unix software, getting it to compile was an exercise in frustration: patching, Configure scripts, compiler errors from incompatible .h files or libraries on your system, etc.

9) Containerization. Enough said.

10) git / hg, github. If you grew up on CVS and SVN, you know why.

I mean, there's lots more, but I would say all of the innovations boil down to reducing or removing cognitive burdens and lost time. If you can do something for me without me having to worry or work on it, and/or wait for it, you might be 10x.

Don't ask me to do stuff. Don't make me learn stuff. Figure out what I need to do, what I want to do, and help me do it, getting what you need by osmosis, or by asking for a minimal amount of information.

Oh, and if other people have already done something, make it really easy for me to find and reuse it, or to share it.


Erlang


1) Emacs. When I was growing up, it was per-language IDEs for development. There was Visual Basic, Visual C++, Borland Pascal and C++, etc. Emacs was an editor that, to a certain extent, understood every programming language out there, and could be taught new ones using Lisp. It took a long time for me to learn Emacs Lisp and understand the true nature of this power, but saying 'M-x psychoanalyze-pinhead' for the first time gave me a glimpse of it.

2) Framework. The best way to understand this 1980s "integrated software" (as office suites were called in the 80s) is as "Emacs for the office". Usually, until Microsoft Office, integrated packages offered cut-down, entry-level versions of a word processor, spreadsheet, rudimentary database like a cardfile, and maybe a telecommunications program. Framework was different. It was basically a self-contained pseudo-GUI, in which documents and spreadsheets were represented with the unifying metaphor of a frame. Each frame, drilling down to individual cells in a spreadsheet which themselves counted as frames, was addressable, and frames could be nested, allowing for compound documents containing word-processed reports, spreadsheets, imported database data, and even graphs and charts. And, much like Emacs, it was scriptable in a Lisp-like language (though Framework's language had a Lotus-inspired @function syntax). Every frame could have code associated with it that responds to events. It was so powerful, it was marketed as an executive decision making tool, not a productivity tool for office drones. It was WAY ahead of its time for 80s software.

3) The Video Toaster. Professional grade video effects in your bedroom studio. Perhaps single-handedly turned amateur video from "home movies" into actual productions. In an era well before YouTube, when video was still analog and equipment was prohibitively expensive. With LightWave, it also gave you an inexpensive option for the then new and hot technology of 3D CG. Required an Amiga because of course it did; what else could handle all this?

4) Tcl/Tk. Still to date, the fastest way to author a GUI, as it was in the 90s. I would take entering a few lines of shell-like script to lay out a GUI over the (admittedly powerful) form designer tools in environments like Delphi any day, just due to the rat wrestling involved in the latter, and the fact that Tk's layout options do a much better job of placing widgets in various window size and configurations than I could manually in the form designers back then. And recently I tried to put together an Electron app with $HOT_FRAMEWORK_OF_THE_WEEK, and spent several hours figuring out how all the pieces fit together. With Tcl/Tk, I said 'sudo apt-get install tcl tk' and was prototyping in wish immediately after. Tcl may be a crazy-pants stringly-typed language from space, but it's still the best thing for throwing together GUIs quickly.


Dropping off social media.


Breaking the 640K barrier.


The iPhone in 2007 was 10x better than other "smart phones".


ripgrep or silver searcher for searching large code bases.


Webflow & Zapier


Stack Overflow x1000


git - compared with previous source control systems.
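
A taste of the difference from CVS/SVN: a complete repository with full local history, no server involved (the path and identity below are made up for the sketch):

```shell
# Initialise a repository and make one commit, entirely locally
mkdir -p /tmp/git-demo
git -C /tmp/git-demo init -q
echo "hello" > /tmp/git-demo/readme.txt
git -C /tmp/git-demo add readme.txt
git -C /tmp/git-demo -c user.email=demo@example.com -c user.name=Demo \
    commit -q -m "first commit"
git -C /tmp/git-demo log --oneline
```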


Ethereum vs Bitcoin


wireless headphones


Jetbrains products


Datomic database


Package managers:

- apt

- maven

- npm

You can't imagine the pain of not having them.


Prioritisation


* Redis and Memcached

* MapReduce

* Column stores


Ctrl-Alt-Del


Easily AWS


- GMail

- Google Suggest

- IntelliJ IDEA

- Ruby on Rails

- TiVO

- iPhone

- Roomba

- Napster

- TurboPascal


sci-hub


tmux


Ripgrep.


MTailor and their fancy body-measurement algorithm.


del.icio.us


git, alarumist (trading alerts)


VLC.


Using the Palm Tx.

It was one of the first small devices that connected to the Internet. I was in Good Guys when it was closing down. They had a highly crooked third-party liquidator selling off the merchandise.

I'm not into games, but saw Halo 3 Collector's Edition for 5 bucks each. There were 500 games for sale.

I took my beloved Palm Tx to the library across town. It had free Wi-Fi up to 9 pm. (No password, just connect.) I got a rough idea of what the game was going for at retail.

I went back and bought all the units. Sold them all for $30-50 on eBay, and it felt great. So many mothers emailing me, "I have no idea what you sold my son, but he hasn't left his room, and loves the game."

I remember thinking I would make a fortune with this device.

This was just when the internet was getting reliable; most people just had a dumb cell phone.

I blinked, and it seemed like everyone had a smartphone.

Anyway, I really liked Palm products. The quality of the hardware was outstanding. The software was just OK, but for the time, I thought it was great. I once tied a GPS to a Palm Tx and took a river trip down the Rogue River. I could tell when the class 4-5 rapids were coming up.

It's too bad what the company has turned into.


Cryptocurrency over banks. Having your own money, in digital form on your own device, was a paradigm shift for me.


SQL itself was huge compared to ISAM-type programming for data apps. Though I suppose it still has its place. Something like the difference between how and what, or procedural vs. declarative.

Edit: to make it clearer:

>how and what

Meaning, do how I tell you, vs. do what I tell you.


Dark mode for the web.


How exactly did that improve your life by 10x?


My customers need it :) So my customers' satisfaction improved by 10x.


Here's one case where I clocked an over 3x improvement when measuring time alone. It would probably count as a 10x improvement in some respects (accessibility, intuitiveness, etc.): https://www.youtube.com/watch?v=Q2gwzTWADns



