A man who invented VR goggles 50 years too soon (2016) (ieee.org)
140 points by andsoitis 11 months ago | 110 comments



The problem is that the guy didn't "invent" anything; he just had a far-fetched idea and left all "the practical details" for others to work out. That's like saying that Méliès invented space travel - he too had the idea of flying to the Moon in a gun projectile ...

It was only a non-functioning mockup, at best, because the vacuum tube electronics of the era were simply incapable of producing a wireless device with those capabilities weighing 140 (!!!) grams.

Compare that with the first actual working HMD, Sutherland's "Sword of Damocles" boom-mounted display from 1968.

EDIT: It was Georges Méliès, not the Lumière brothers, who made the famous film. Thanks for correcting me.


> That's like saying that Lumiere brothers invented space travel - they too had the idea of flying to the Moon in a gun projectile ...

Just to nitpick, it was Jules Verne's idea. The Lumières just put it on screen.


Just to add more nitpick to the nitpicking, it's a Georges Méliès movie: A Trip to the Moon https://en.wikipedia.org/wiki/A_Trip_to_the_Moon


You are right, of course. I had Méliès in mind; not sure why I wrote the Lumière brothers.


That seems a little unfair to me. In many cases an original idea, regardless of ability to execute, is priceless.

I agree that execution is everything in terms of winning, though.


Ideas are a dime a dozen; anything that can be thought of eventually will be. It's no coincidence that new discoveries tend to be made by multiple people on different sides of the planet at roughly the same time, once they have access to the same technological starting point to work from: Leibniz and Newton, the Wright brothers and Santos-Dumont, Bell Labs and Westinghouse, etc.

This is why patents are harmful imo (without even getting into patent trolls): all they do is restrict commercialization to whoever was the fastest gun in the west at the patent office, and hold back society from developing the technology further, because the patentee has no incentive to improve the idea while they enjoy a two-decade state-sanctioned monopoly.


How do you reconcile a patentless world with things that require massive R&D budgets? Like it or not, many industries depend on very complicated ideas that are extremely expensive and labor-intensive to generate.

For example, new pharmaceuticals (of which very few are being discovered) take over 10 years of R&D and cost multiple billions of dollars to whittle down ~10,000 candidates into a single government-approved drug.

But once that single drug has been identified from the haystack of failing options, rigorously tested, and trialed in humans for years, recreating the molecule is quite trivial. The actual production is very cheap compared to the thousands of people and billions of dollars over 10+ years it took to get there.

This is just one example of how ideas can be very expensive to develop, but very cheap to copy.


Frankly, I don't think such projects should be left to the private sector to develop. If we had done space travel that way, we'd be making the first moon landing in 2038.

This sort of life-saving research is the best contender for R&D by government grants: if it's a worthy cause, it can be supported for a long time (as fusion is being), and when the solution is found it can be immediately shared and sold at cost for the benefit of everyone, not hoarded by one company and sold at a ridiculous price to people who are forced to buy it or die.


It’s a funny comment because I think you could also argue that the government misappropriated a huge amount of public money to put a man on the moon in the 1960s, when it should have invested that into medical research or better transportation infrastructure in America. The fact that the private sector hasn’t developed moon travel yet is a sign that it’s just not that important to most people, not a lack of productivity.


Or maybe it's that the return on investment is data they can't directly commercialize. Though I guess they could auction off pieces of the moon to the highest bidder or something of the sort. But yes, I think we can all agree that that money could've been better, or at least far more efficiently, spent; the same goes for the US's colossal military budget. This is, in a nutshell, the whole mindset behind the EU's spending the way I see it: social programs first, science second, everything else third.


Enjoy it while it lasts. The EU countries have only been able to underspend on their militaries because of US security guarantees. That subsidy is coming to an end one way or another, and they are rapidly increasing defense spending. That means austerity for social programs.

In terms of novel drugs introduced per year (not just repackaging existing drugs), the EU as a bloc is consistently behind the US. So, their scientific spending doesn't seem to be paying off in pharmaceuticals.


I think patents are better overall for society, as they encourage rapid disclosure of new ideas and applications, instead of people withholding information because they are worried about being copied after they go to market.

As an aside, there are plenty of inventions that were developed or rediscovered long after they were technologically feasible. The whole Middle Ages happened partly because we forgot how to do things they knew how to do in ancient times and never documented, like how to use concrete. Other things were just never thought of until much later: the analog phonograph could have been created in antiquity, but no one tried running a pin over a grooved surface.


I think the disclosure part used to matter more in the past, when progress was slow, but today the twenty years given in return is an eternity to block other development for, as technology just keeps compounding on itself. FDM 3D printers, arguably an indispensable tool for all kinds of research, were blocked this way from 1988 to 2008. Maybe Stratasys would've hoarded the design and kept it secret until they could perfect it, maybe the processing power wasn't there yet... or maybe we could've had low-cost 3D printers in the fucking 90s, and actual flying cars or something by now with how much progress that would've enabled. But I digress...


Record players aren't as useful without record cutters. Without electricity, recording and playing records is neither easy nor good-sounding.

A player piano is a better example.

Antiquity probably had many inventions that died out when the local village did. Life was expensive, and capital hard to come by. If a patron wasn't interested and aware, development stalled.


Yeah true, but a player piano requires inventing the piano ;)

It's definitely possible to make a recording without electricity; it just requires a steady source of rotation for engraving, so maybe a spring or motor was a necessary precursor to make it useful.


I just don't see value in original ideas. IMO ideas are statistical outcomes that are bound to come to multiple people, typically with ever-increasing audiences and likelihood as the world advances.

So there may be some first-mover advantage in being lucky enough to be (among) the first to have an idea, but being first mover is itself of debatable advantage.

Help me see why you think an original idea can be priceless?


I think it depends on the nature of the idea. Some are so original that the idea alone, independent of the execution, is worthy of note (famous conjectures in mathematics come to mind).

Some seem so obvious that it seems likely that many people dreamt of them at the same time. Hard to tell in this case.


No, ideas are worthless; anybody can come up with an unlimited number of them. If they were valuable, you'd have infinite value, and that's ludicrous.

Several other things really are valuable. One thing that surprises people is that advertising is actually valuable: if nobody knows you made this thing/service/whatever, then they can't benefit from it. Now, obviously there's a big gap between "Have you heard about bicycles? I was dubious at first, but it was surprisingly easy to learn to use one" and the pop-up insisting that maybe you've got Restless Leg Syndrome and should ask your doctor - but fundamentally, advertising delivers some value for our society.


Just because you can come up with an unlimited number of ideas does not imply that all ideas are of the same value. I can think of an unlimited number of numbers with > 10,000 digits, but almost certainly none that I think of will be special (let's say, prime). But those special numbers (or ideas) are out there, lurking.
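As a rough back-of-the-envelope aside (not from the original comment, and assuming the prime number theorem's density estimate), you can sketch just how rare those "special" numbers are:

```python
import math

# Prime number theorem: near N, roughly 1 in ln(N) integers is prime.
# For a 10,000-digit number, ln(N) is about 10000 * ln(10), i.e. ~23,000,
# so a randomly picked 10,000-digit number is almost never prime.
digits = 10_000
odds = digits * math.log(10)

print(f"a random {digits}-digit number is prime with odds of about 1 in {round(odds):,}")
```

So "special" here really does mean roughly one in tens of thousands, even before asking whether anyone would notice the specialness.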

I think your line of thinking discounts how absolutely novel and groundbreaking even the idea was. It's an example of someone seeing something about reality that basically the rest of the population of the world managed to miss. I think that's pretty special - even if there was no realistic plan to implement the idea.


> No, ideas are worthless, anybody can come up an unlimited number of these ideas

If you're trying to create a commercial device, the idea is such a tiny part of what it takes to deliver a successful product that it is close to worthless on its own.

But the context here is science fiction, where good ideas are a major part of it. (The other major parts are good characters and storytelling, but it's the ideas that distinguish science fiction from other kinds of fiction.)

So give the guy his due.

He looked ahead and tried to envision where technology was going, and, in this case anyway, pretty much nailed it 50 years out.


Frederik Pohl wrote, "Somebody once said that a good science-fiction story should be able to predict not the automobile but the traffic jam. We agree."


Welcome to our patent system.


great, let's see what you've invented


I wonder how that is pertinent to the argument I am making.

Or does calling out click-bait and hype now require filed patents and other "inventor" credentials?

FYI, I have 20+ years of research and commercial experience developing virtual reality solutions, including published papers in scientific journals. Unlike patents and far-fetched ideas left for others to sort out, journal papers actually require results to be demonstrated and undergo peer review.

Does that qualify, Mr throwaway account?


I wish more people knew/appreciated the incrementalism of progress.

A huge amount of constant "boring" percent-point improvement work or price-shifts are required to unlock the interesting changes, and often the people and companies associated with those new discoveries and ways of doing things are not really the first at all, but rather the lucky or the ruthless.


I worked on a tablet computer ca. '99. We were not first, nor did we have any illusions we were. There were novel parts about our business plan, but a tablet was "obvious".

But it's an illustration of what you describe: When the iPhone and later iPad were launched, a lot of people thought they were new.

Even though a whole fad of touch devices in phone and tablet form had come and gone before the iPhone arrived.

A lot of us dismissed them at first because we'd seen that fad come and go, without realising that the genius of Apple with respect to those devices was not in the technical innovation, but in recognising when enough percent-point improvements had been made to unlock something that'd actually appeal to the general public.

"Our" tablet was tethered to a custom DECT base station (wifi wasn't widespread yet), had about 2 hours of battery. The resistive touch was awful, the screen resolution was small, and I had to cram Linux and the GUI, with space left over for a port of Opera, in to 16MB of flash and 16MB RAM in the original prototype...

In retrospect it had no chance in hell. But to a geek like me it was amazingly cool.

Other tablets and PDAs of that era, with or without network and phone, were either attempts at slimming down laptops (e.g. I had a laptop with a touch screen on a swivel back then) that were still way too big, or they were devices like ours, with all sorts of limitations that geeks were fine with and regular people weren't.

So when these devices disappeared one after the other, many of us dismissed the concept as a whole as tried and failed, rather than realise that the problem was one of enough percent-point improvements and not the concept.

And for all of the many issues I have with Apple, while they were not first, they still deserve plenty of credit for simply being smart enough to recognise when the time was right for this category of devices. Because a whole lot of companies got that timing extremely wrong.


I'll never forget the crowd going bananas when Steve Jobs showed that you could scroll on the iPhone by just dragging the list with a finger, or zoom a map/website with two fingers.

Smartphones weren't new, but one where you didn't have to rely on buttons or a stylus on a terrible resistive touchscreen was a big improvement. I still miss phones with QWERTY keyboards, but not the UIs before the iPhone.


The touch certainly was a big issue. I think gestures were less important than getting decent capacitive touch instead of those horrible resistive screens.

This is the one I worked on then:

https://linuxdevices.org/freepad-norways-alternative-to-swed...

The screenshots you see are real (the phone dialler even has the name of one of the employees up...). The GUI was NanoX, which is still around (and despite the name it's not an X11 server, but has partial X11 source-level compatibility; we built our own toolkit for it).

And even that, of course, was a testament to the limitations - we picked NanoX and wrote our own toolkit not because we wanted to, but because fitting an actual X11 server cost too much in extra RAM and flash. It added many months of engineering. Fun months, but time that would've been better spent on more applications.

I still love that project, but the hardware limitations gimped it so painfully.


I remember the UMPC! That was a whole category of portable computers that didn't stand the test of time. Apple got a lot of things right, but I think it was less about timing and more about being usable, and affordable by the masses. Blackberry was great but it cost too much and didn't have a big enough battery.


Being usable was largely about timing, though.

None of the choices that compromised our device - or those of most of our competitors - were choices we made because we thought they were right, or best, but because we were forced to by timing and what was available at what cost, and failed to realize that the result was too compromised to stand a chance.

Put another way: Had Apple launched an iPad in 2000, it would've had exactly the same hardware usability issues. There were no good, cheap enough touch screens, or displays, or cheap enough RAM, or cheap enough flash, to deliver what they did later at that time at any price, much less affordable enough.

Timing it when the available components were good enough and cheap enough was how they got it usable and affordable.

Trying too early and drawing the wrong conclusions was also why a lot of their competitors had moved onto a very different track in terms of design.

If you e.g. look at Palm devices that started to get network connectivity from ~2000, you'll see the grid of (user-installable) apps with touch and no keyboard. Blackberry-like devices with keyboards "won" for a while because people took users' rejection of crappy, limited resistive touch as a rejection of touch. (I walked around with a Palm device for several years from '98 or so and couldn't understand why "normal people" rejected them - it felt like scifi.)

[EDIT: The UMPC was from 2006, and not the type of device I'm talking about - it came a couple of generations of hardware later. If anything, UMPCs were part of that bastardization of the earlier touch-focused devices, as manufacturers declared the "cleaner" devices dead and buried in the marketplace without understanding why]


> Being usable was largely about timing, though.

Most breakthrough inventions, in areas where people fumbled for years or decades, could have been released long before with earlier insight. Maybe not as advanced, but if the right things were done right they still would have been a win.

Many of those "right things" were more about practicality, psychology, or market fit, not just waiting around to jump the moment the "right" tech component arrived.

Breakthrough products also tend to take competitors by surprise. If the breakthrough product didn't happen, it could be years before any other team comes together with the same focus and insights.

So, timing, a little bit. The vision is important, but it was often a common dream. The real rare resources are a very clear vision, stable and highly supportive management, a densely talented team, an intensely iterative focus, and a willingness to go deep and get previously unavailable specialized components made.

Somehow "timing" always follows those last resources.


You really underestimate timing here, and I feel you overestimate how great the first iPhone and iPad were.

They didn't take technical genius. All the elements preceded them. Most of the design preceded them. They didn't engineer anything particularly new. The only previously unavailable component they got made was better glass, which was great, but not where the deal-breaker was for previous iterations.

I find it funny when people want to declare Apple as visionary geniuses over the tech, when the path from earlier devices to the iPhone and iPad was so clear, and given what I know about design discussions etc. in that space.

There would have been multiple devices very similar to the iPad had the components existed. The vision was there, the supportive management was there, the talent was there because frankly none of it is complicated, and people iterated for years. None of that means anything when the components just do not exist yet.


> They didn't take technical genius. All the elements preceded them. Most of the design preceded them. They didn't engineer anything particularly new.

That sounds quite mediocre! Except where were all these non-geniuses with their just-as-not-new phones, putting Microsoft, Palm, Sony, Blackberry, etc., to shame? Most of that was crap in comparison. And despite many (many) revamped product lines (Microsoft!) trying to take smartphones mainstream. (I give the Blackberry kudos for being very good at what it did.)

You are vastly underestimating the value and work of getting lots of details right (so many, and so "unimportant", most people never notice what holds products back), and finding a way to fit it all together in a way that works well.

You know what is true of every breakthrough theorem in math? If you break down the proof, it's just made of steps of already-known relations. It wouldn't be a proof otherwise. Nothing new.


> That sounds quite mediocre! Except where were all these non-geniuses with their just-as-not-new phones, putting Microsoft, Palm, Sony, Blackberry, etc., to shame? Most of that was crap in comparison. And despite many (many) revamped product lines (Microsoft!) trying to take smartphones mainstream. (I give the Blackberry kudos for being very good at what it did.)

They had built phones with the same kind of interfaces 4-5 years earlier, and then moved on because the hardware wasn't ready. You keep comparing to products that appeared because of the market's rejection of the first tablet/touch device fad, and use that to argue it was timing, while ignoring the many devices that were conceptually far more similar a few years before that.

As I've said: Apple's genius was paying attention to when the hardware was ready for another try at the "pure" devices that "everyone" else tried too early.

And bringing up the Blackberry is comical to me, because even at the time it was an aberrant offshoot used by people who scoffed at the touch-focused devices, and not a relevant comparison. They were an entirely different device category.

> You are vastly underestimating the value and work of getting lots of details right (so many, and so "unimportant", most people never notice what holds products back), and finding a way to fit it all together in a way that works well.

No, I'm aware of the products that were actually built over the preceding years and which design decisions were driven purely by what was actually available vs. were intentional choices, and so don't give credit for simply having better, cheaper components available. Apple does great at polishing things, and they deserve credit for that, but what the 1st gen iPhone and iPad brought was not that.

They also did great at timing it right, and I do give them a lot of credit for that - it is important. A whole lot of smart people got that very wrong, and frankly it's a lesson the tech industry needs hammered in over and over, because we have a tendency to rush ahead without asking ourselves if the market is ready and/or if a product is wrong or just not right yet, and end up making stupid decisions. I'm not trying to downplay what Apple achieved. I'm trying to drive home the point that Apple achieved it by vastly outperforming their competitors in deploying their resources more efficiently, by not rushing into something that was doomed (in the 2000 timeframe). That was, and is, impressive.

"Just" building a smartphone and tablet at the level of the original iPhone and iPad was good, but nothing special, nor visionary in terms of engineering. They had one or two features that were unusual (gestures was), and lots of missing features, and great industrial design, but very little technical innovation. Overall they did amazingly well in hitting the right balance for a first version, and that outweighs pushing the envelope on the technical side by a huge margin - it's irrelevant if you make technical advances if your company folds and/or the product disappears of the face of the earth.

> You know what is true of every breakthrough theorem in math? If you break down the proof, its just made of steps of already known relations. It wouldn't be a proof if it was otherwise. Nothing new.

Great, but this again implies that lots of elements were put together in some completely novel way. They were not, and suggesting they were is pure revisionism that just shows you're unaware of what that field was like in those years.

Nobody went around wondering if capacitive touch would be better or not - we all knew it was. Nobody wanted as little RAM or flash as in the early devices. Nobody wanted big bezels. Nobody wanted slow CPUs. Nobody wanted low-capacity batteries. All of the steps to go from ~2000-era tablets and full-screen touch devices to an iPhone- and iPad-level device were cost reduction and component improvements.

Heck, the original iPhone didn't even have installable apps, which was a huge step back compared to earlier devices, and one only rectified later when Apple realised they'd made a huge misstep (remember Steve Jobs pushing web apps as sufficient? I do).

Apple got the timing right, and then they did an amazing job of iterating to fix the many defects in their original product before their competitors realised they'd gotten the timing right. Getting the timing right was genius. That's impressive enough that there's no reason to push unsupportable ideas about things that weren't new. Most of the things I've mentioned weren't new when I worked on them in 2000 either - we invented next to nothing, and we knew we didn't.


The business end of things is not to be forgotten. The real trick was that the iPhone debuted on AT&T with an affordable data plan. That was the key. Blackberries had all the same capabilities, but they were so expensive that only business people could afford them through work.


> They didn't take technical genius. All the elements preceded them. Most of the design preceded them. They didn't engineer anything particularly new.

Actually, the one genius move was doing a 100% touch-with-thumb interface.

They gave up on revenue for replacement styluses though :)


I worked on devices that didn't need styluses 6 years before the iPhone. They were not unusual at all. What kept people using styluses was that resistive touch is shit and imprecise, and on small screens it was awful to use.

So this boils down to capacitive touch of sufficient quality having gotten cheap enough.


Did they offer finger scrolling? PalmOS basically didn't, just because the scroll bars were too thin.


Not gestures, but pulling on scroll bars was certainly a thing. Gestures were definitely an improvement, but pretty much not something you want to try on a bad-quality small resistive screen, unless you want the user to throw their device at the wall in frustration.

The funny thing is, with the number of those kinds of early devices I had for use or testing, they all seemed awesome at the time, but I know I probably shouldn't ever pick any of them up again, because logically I know the experience would be awful given what I know about the tech compared to current iterations of it... It's funny how memory works.

I still wish for the reality where having a pure-unadulterated Linux installation on a tablet that early remained a thing, though - the current locked-down tablet ecosystems in many ways (but not the hardware!) still feels like a big step back in many respects.


This is like watching younger developers scoff at jQuery, not understanding how revolutionary it was at the time.

Compare the iPhone to the mobiles of the day and it becomes obvious why it won.


> Had Apple launched an iPad in 2000, it would've had exactly the same hardware usability issues

And that's what excites me the most about the Vision Pro. When Apple steps in, it's generally a signal that the tech has achieved a certain critical mass.

The Vision Pro is going to incentivise a lot of hardware manufacturers to start building components for the VR market and increase the economies of scale for other VR systems.

Put another way, Android wouldn't exist or be relevant if the iPhone hadn't shown us what a smartphone could be. Now I can go to any store and buy a cheap Android for $200. Is it as good as a modern flagship iPhone? Of course not. But it holds up well against any 4-year-old iPhone.


Well, in this case, stereoscopic photography had been a thing for a long time (https://en.wikipedia.org/wiki/Stereoscopy), and TVs were becoming widespread, so having the idea of combining the two is not some kind of stroke of genius. So I would give more credit to those who work on the incremental improvements that make the thing actually possible. Same as I would give more credit to the Wright brothers (and the other aviation pioneers) than to e.g. Leonardo da Vinci...


A prime example is OpenAI. The general public thinks they opened some sort of Pandora's box, when it has been a long iterative process, much of which was done by others.


OpenAI made the first AI chatbot product that didn't suck. There is a critical mass of innovation, and OpenAI got their 9kg of Uranium-235 together first, as reflected in the market.

The real question isn't why did OpenAI get there first but why Google didn't, they had every advantage.


> OpenAI made the first AI chat bot product that didn't suck.

And let people try it for free. Non-techies were using it and the news was reporting on it. They nailed their launch.


After horrible launches for GPT-2, GPT-3, and GPT-3.5, where most hype and momentum were smothered by a long waitlist, ChatGPT was a 180-degree reversal from their usual release process and an instant hit.


I think this is the thing people like the guy linking to Google's paper on arxiv.org don't get: OpenAI actually delivered something.


Because keeping the user on Google, so they don't need to visit the actual page, means that people won't make web pages anymore. That was a real concern.


Steve Jobs always said that at Apple, if we don't cannibalize our own products, someone else will eat them for us. So the answer is that Google isn't willing to take risks like it used to.


> Google isn't willing to take risks like it used to

Has Google ever taken a risk that endangered their ad business?


My armchair answer to that question is that Google isn’t a product company.

They might have had similar or better technical AI capabilities, but they aren’t great at productizing it.


The one thing that most defines Google products is that each one must collect data from the user in a way that is unique to their other products. If it doesn’t do that it doesn’t get to exist.


>Google isn’t a product company.

I disagree. Google does have great products: Google Maps, Search, Gmail, etc.


Google products by launch date:

Search: 1998
Gmail: 2004
Maps: 2005


Survivorship bias. Any company that launches as many products as Google has to have a couple good ones by pure chance. And all products you mentioned were created two decades ago, when Google was a very different company.


It has free products. I would suggest that all three would struggle as individual paid products if they didn't cross-support each other.


Google had it in 2015 [1]

[1] https://arxiv.org/abs/1506.05869


Honestly, it wasn't even about having the first capable chat LLM. OpenAI released the first LLM with interesting characteristics, based on an architecture they created to be good at this task (Generative Pre-trained Transformers), and then iterated on improving those characteristics - it took them a while before they had a model capable of following instructions well. There was lots of interest in OpenAI when they released GPT-1, and much more when they released GPT-2.

All the while this was going on, Google slept - I think the reasons for this are varied and many, stretching into company culture, structure, processes, etc. This was after researchers at Google created the Transformer architecture in the first place.

Google seemed to panic suddenly when ChatGPT was released. I think this is when the people who could actually make change realised that ChatGPT could be a threat to Google in a number of ways - including for search market share.


There is a saying. How to make a lot of money? Be first, best, or cheat.


I think it's "only" a quote from a movie called Margin Call [1]. But my grandma could have said it. ;)

[1] - https://www.imdb.com/title/tt1615147/quotes/?item=qt1531207&...


https://www.youtube.com/watch?v=ag14Ao_xO4c

It's the first 30s of this clip, for the curious. (Movie is a bit dry but pretty good.)


A lot of companies make money by being second or even third mover. There's a lot of benefit to having someone else create the market and then swoop in with a marginally better product (or even just better marketing).


I have a story like this.

My first job out of engineering school was working for Dr. Robert Jarvik, of artificial heart fame.

He has a patent with a drawing of a person on a treadmill, with cables tied to each joint. The idea is that it's a virtual reality exercise setup - a Nintendo Wii or Kinect before sensors were any good.

https://patents.google.com/patent/US5577981A/en?q=(exercise)...


This is the problem with the public understanding of patents: something superficially similar, like "measuring someone doing exercise", is only a patent violation if the method used is the same as the patented invention. The Wii and Kinect use remote sensors based on video. Clearly that isn't the same idea as your former boss's patent.


I don't think the previous comment implies anything otherwise.


Huh?



Virtual Reality Before it Had That Name: https://www.youtube.com/watch?v=Y2AIDHjylMI


I wish they had a picture




I’ve frequently compared modern VR headsets with cell phones of the 1980s. Big, bulky pieces of metal and glass that will one day look comical. Meanwhile this guy invents VR of the 1960s!


Big and bulky wasn't really the problem; price was. Size/weight isn't an issue with a car-based phone - the fact that the phone was priced as much as a car is.


Size/weight wasn't an issue as long as you were basically talking about a car phone, which was a valid but specific use case. (Back in the day, my senior manager would occasionally give me rides from my car dealer, and he'd put me on the car phone with various senior salespeople if there was something he wanted me to discuss with them.) But you're right that the phone plans were also out of reach for all but execs and very high net-worth individuals. Heck, until the late 90s or so, cell phone usage was pretty expensive for a random person for all but very occasional use.


Even in mid-late nineties I would have never paid for a cell phone personally (too expensive), but they were becoming cheap enough that it was common for employers to see the value in making sure you had one.


Most kids had cell phones in the mid-to-late nineties where I lived. The phones themselves were cheap, but using them was not, yet giving a kid a phone so the parents could reach them when needed seemed popular.


According to [1] (which might be incomplete, IDK) cell phone sales went from 100 million per year in 1997 to 400 million per year in 2000. Subsequently, sales rose to 1.5 billion per year.

So when talking about how many people have cell phones, the difference between "mid 90s" and "late 90s" makes a big difference :)

[1] https://en.wikipedia.org/wiki/List_of_best-selling_mobile_ph...


Indeed, in Norway for example, the tipping point was ~1995: Suddenly the competition had heated up enough that there were "free" phones with contract, and "everyone" got one - in absolute numbers they were still not everywhere, but the rate of increase was dramatic, and you'd go from not knowing anyone with a cellphone (me in '94 - literally zero people) to a large number having one by the summer of '95.

Other countries will have different specific times when this happened, but my impression is that cell phones largely took off in a lot of countries in the same way, because competition and high prices for spectrum meant the large telcos suddenly had a reason to pour vast amounts of cash into pushing cellphones almost "overnight", and suddenly even if you didn't have one yourself, you'd at least know enough people who did that it wasn't anything unusual any more.

I'm 48. I've seen many technical changes and been part of e.g. driving internet adoption (the company I worked at in '95 was an internet service provider I co-founded, which triggered a price war on internet access in Norway - one we soundly lost, but at least it helped significantly change the market). But none of those changes - including internet adoption - had such an extremely rapid, very visible effect on society in such a short period of time.

(I think pervasive internet access has probably made a bigger difference in the long run, but of course part of that was also enabled by cellphones, but it took longer for it to shape things as visibly)


I had cell phones from sometime in the probably late 90s but they were only used for specific purposes like if I were meeting up with someone and I wanted them to be able to reach me. I didn't really start using cell phones until I got a Treo in 2006 and then an iPhone maybe a few years later.


1) Kids? You mean only high schoolers, right? I lived that time period and can't imagine any kid who didn't drive having a cell phone, then.

2) I graduated in the early '00s and I think maybe 5% of the kids in my graduating class, at most, had a cell phone by graduation. Mostly some shitty Nokia with a phone-only or low-texting plan (texting was still crazy-expensive), mainly for contacting parents about after school activity stuff.

Middle-class-spectrum car-dependent exurb. Maybe it was different for, like, kids growing up in Brooklyn or something? Or rich kids, perhaps.

A more-common perk for children of free-spending parents, in my town, was still a second POTS line, though I do think that was starting to tip the other way just at the end.


Where I lived in Sweden, 1998-1999 was when the breakthrough started with prepaid SIM cards with subsidized phones. In my graduating middle school class there were maybe 3-4 kids with mobile phones (none of us were rich at all, just middle class, and all too young to drive), but in the following 3 years in highschool that went up to basically 100%.

The US was far behind on mobile phones for a long time.


There was maybe a 5-10 year period when personal cell phones were something you had for emergencies or when you were meeting up with people but didn't casually use for extended conversations much less to replace a landline.


For one thing, they sounded like absolute shit even compared to digital landlines.


You know I kind of forgot the dynamics of that as a 90s kid, compared to 13 year olds with iphones now. Imagine how much TikTok kids would watch if they were charged 10 cents per video.


Definitely early adopter neighborhood though


I feel like the Quest could be very light if they took the Apple approach and moved the battery and computer to a pocket box. Even before any of these headsets existed I imagined most of them being that way.


That would need quite the cable to support 6 camera video feeds, 2 display video feeds, 2 audio feeds, and some other control data in addition to power.


Would it? How much total throughput and power would that be, vs a skinny USB-C PD 3.2 Gen 2x2 cable that can carry a 100W or more along with 20 Gbps of data? How about 2 or 3 of them in a wrapper?

The Quest can be a PCVR headset with a 3.2 Gen 1 cable, ie 5 Gbps. How different is this?

I’m pretty sure outboarding most of the hardware that’s not speaker, camera, or screen is plenty doable. It just blows Meta’s selling point of making it a self-contained device. I’d probably prefer it though, especially if I could attach the outboard module to the back of the head strap as a counterweight.


>How different is this?

Hardware decoding / encoding all of those video streams requires a lot of processing power on both ends, adds 10s of ms of latency, and makes the video quality worse.


Not video encoding, raw-ish signals.

Let's say DSC is allowed, which costs microseconds and is basically invisible, and can compress down to about 8 bits per pixel.

That USB cable can transmit 20Gbps in each direction. The display needs to do 23 million pixels 90 times a second, and at 8 bits per pixel that's about 17Gbps.

The video cameras going the other way are 13 million pixels for the main cameras plus some amount more for the 6 tracking cameras and the other sensors. And those main cameras are outputting 8 bits per pixel before any compression is applied.

It's a tight fit at that bitrate, but we can upgrade to USB4 and get 40Gbps in each direction. That leaves plenty of room for everything, and makes sure there's headroom for HDR and 100Hz and whatnot. It works out.
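The arithmetic above is easy to sanity-check. This is just a sketch of the numbers used in this thread - the pixel counts, 90 Hz refresh, and ~8 bits/pixel DSC figure are the assumptions stated above, not official headset specs:

```python
def gbps(pixels, bits_per_pixel, hz):
    """Raw video bitrate in gigabits per second."""
    return pixels * bits_per_pixel * hz / 1e9

# Display path: ~23 million pixels across both eyes at 90 Hz,
# DSC-compressed to roughly 8 bits per pixel.
display = gbps(23e6, 8, 90)

# Camera path back to the compute box: ~13 million pixels from the
# main passthrough cameras at 8 bits per pixel (assuming 90 Hz here),
# before adding headroom for the tracking cameras and other sensors.
cameras = gbps(13e6, 8, 90)

print(f"display: {display:.1f} Gbps")  # ~16.6 Gbps: tight at 20 Gbps
print(f"cameras: {cameras:.1f} Gbps")  # ~9.4 Gbps, plus tracking overhead
print(f"fits USB4 (40 Gbps per direction): {display < 40 and cameras < 40}")
```

Since USB runs full duplex, the display and camera streams occupy opposite directions of the link, so each only has to fit its own direction's budget.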


These had nothing to do with VR. It was allegedly only able to show stereoscopic video, like a stereoscope; it was not capable of showing the user a virtual reality they were present in. It's hard to find any actual information about this device, as the articles that refer to it don't cite their sources and probably just copied the information from each other. There is no evidence that it actually worked. There is no mention of using lenses to make it possible to focus on the displays, and the claim that it ran on small batteries leaves me skeptical - not to mention the weight this would be. I would be fairly confident in saying this mockup was nonfunctional and just an art project.

Edit: The source is page 79 of this issue of Life magazine https://archive.org/details/LifeV55n0419630726 Reading it makes clear this was purely a mockup of an idea he had and not a functional device.

>Sex has not distracted him in the slightest, however, from his lifelong interest in electronic gadgetry and in the new horizons being opened by the advance of more orthodox scientific knowledge. Neither has it inhibited his bent for invention on those occasions when he feels that duty and circumstance demand it —although he now invents only in broad outline, leaving the actual mechanics of the thing to others. His television eyeglasses —a device for which he feels millions yearn— constitute a case in point

>When the idea for this handy, pocket-size portable TV set occurred to him in 1936, he was forced to dismiss it as impractical. But a few weeks ago, feeling that the electronics industry was catching up with his New Deal-era concepts, he ordered some of his employees to build a mock-up.

>"It is now perfectly possible to make thin, inch-square cathode tubes," he says, and to run them with low-voltage current from very small batteries with no danger at all of electrocuting the wearer. Sound can be carried to the ear just as in a hearing aid. Television eyeglasses should weigh only about five ounces. Since there will be a picture for each eye, the glasses will make a stereoptical view possible and since they will be masked —like goggles— they can be used in bright sunlight. The user can take them out of his pocket anywhere, slip them on, flip a switch and turn to his favorite station." A V-type aerial protrudes from the top of Gernsback's mock-up of the TV glasses. He likes the effect, which can only be described as neo-Martian.


You can use VR goggles to display static 2D images, and they would still be VR goggles. It's actually a bit disappointing how close he was with his mockup to modern VR goggles (which are still a bulky contraption wrapped around your head).


It's actually a bit disappointing how an art project by a dilettante is promoted over previous work by others, and praised as an early version of VR, when Gernsback meant it as a portable personal stereoscopic TV receiving shows from a broadcast station.

Here's pre-1963 work mentioned in a patent application at https://patents.google.com/patent/US20180224936A1/en?q=(Tele...

> Morton Heilig's next commercial offering was the Telesphere Mask (patented 1960) and was the first example of a head-mounted display (HMD), albeit for a non-interactive film medium without any motion tracking. The headset provided stereoscopic 3D and wide vision with stereo sound.

> In 1961, two Philco Corporation engineers (Comeau & Bryan) developed the first precursor to the HMD as we know it today—the Headsight. It incorporated an independent video screen for each eye and a magnetic motion tracking system, which was linked to a closed circuit camera. The Headsight was not actually developed for virtual reality applications (the term didn't exist then), but to allow for immersive remote viewing of dangerous situations by the military. Head movements would move a remote camera, allowing the user to naturally look around the environment. Headsight was the first step in the evolution of the VR head mounted display but it lacked the integration of computer and image generation.

Here's Heilig's patent for "Stereoscopic-television apparatus for individual use" https://patents.google.com/patent/US2955156A/en?q=(Morton)&i... , filed in 1957 and granted in 1960. Not only was it a head-mounted display, but it included air tubes so the wearer could feel a breeze, and optionally smell scents.

Headsight was on the cover of the magazine "Electronics" in 1961, https://archive.org/details/sim_electronics_1961-11-10_34_45... , and the article shows it actually worked, and was not a mockup.


Using VR goggles that have no capability of viewing anything VR makes them just regular goggles though?

This tank simulator from the 1970s is a great example of early VR: https://www.youtube.com/watch?v=AcQifPHcMLE I wonder if there's anything earlier..


OK, but it seems to me that, even now, the market isn't exactly crazy about VR goggles. They're expensive and the novelty of them is fleeting (I haven't tried the Apple ones).


1963 + 50 = 2013. Given VR goggle adoption rates, and only ~10% weekly usage among those who own them, I'd say he was at least 70 years too soon.


you're not far off the mark

> This article appears in the December 2016 print issue as “Before Virtual Reality Was Cool.”


it's like cellphones. Took decades until the flip phone and then iPhone. But I think it's still too early to say VR is mainstream. It is still a niche.


Anecdote - I remember as a child in the UK, in the 1980s, there was a full-sized lorry that toured, carrying a Harrier jump-jet VR setup. The graphics were wireframe, possibly 8-colour, I'm not sure, but I do remember the VR helmet. It was two small CRT TVs suspended from the roof by cables, and you sat in a chair that had to be adjusted so your head was in the helmet, not the other way around, because of the weight.

All that aside, the thing actually worked - you could turn your head and the display updated and you could fly around. Remarkable for the time.


Neural networks were 'invented' in the 1960s-70s too. We only have the hardware to get something semi useful* out of them ... now.

* actually they have been used before, but they didn't manage to generate religious like fanaticism until the past few years.


It may turn out to still be "too soon" if Meta doesn't up their game.


There was a whole "now we finally have the technology!" push for VR in the late 80s/90s. Google the Forte VFX1, for example. Jaron Lanier made one of the first ever TED Talks in 1990 and it was about how VR will revolutionize everything ( https://youtu.be/lfvOACM-vbE ). It all... "rhymes".

I have this thought that maybe the problem with VR isn't the display technology but the input. Walking forward in a straight line is a more or less unsolvable problem. Motion sickness, the need for a frickin' treadmill - it's just messy. Touch feedback is an unsolvable problem unless you introduce robot gloves that can break your fingers. And then we have the question of use cases. VR solves a very specific spatial problem with an interconnection between perspective and hand movement. Very few problems exist in that space. It sometimes seems like VR creates more problems than it solves, in fact.

I low key believe in AR (although Apple finally played its cards and the result was underwhelming). Something about infinite and freely positioned 3D monitors. But VR? Great for cockpit sims and maybe some very specific professional uses. But useless for 99% of tasks of an average person.


Meta are the ones putting out lightweight cheap VR headsets. If it turns out we currently need a $3000 device to deliver VR people find acceptable, that is a sign it is "too soon" for mass adoption (still useful for many niche use cases though, like frequent travelers working on a plane or in a train).

In any case, the technology seems pretty close. If it takes another 3-10 years for tech to advance enough Meta is in a great position to execute as we get there.


This feels very relevant:

https://www.youtube.com/watch?v=AcQifPHcMLE

It's not really VR, or is it? It's a model.


It makes me sad he can’t be here to try out something like Vision Pro and see the evolution of this concept 50 years on. I imagine you’d struggle to wipe the smile off his face.


Speaking of old(er) VR, I recall trying to play Duke Nukem 3D in a VR helmet and a joystick with 3 buttons. It had to be early 2000s, probably at some tech show.


The cliche in the tech world: you want to be ahead of your time, but you don't want to be too far ahead of your time.


What if you had racing VR inside your car? Your windshield turns into a screen in "fun mode".


CRT on your eye balls, ouch. That said, they look small and light. So maybe CRT is the way to go!


I thought lasers right into the eye were the best option? At least that was presented as the 'holy grail' in a VR book I read in the early 90's (must have been this one, although my German edition looks very different: https://www.amazon.de/-/en/Howard-Rheingold/dp/0671778978).



