Civitello's comments

> Imagine telling the homeowner to recover damage from their insurance company.

That is exactly how these things work in the US.

> The insurance company should recover their costs from Tesla.

These two things are not mutually exclusive.


It is hard to cover Tesla in good faith without frequently reporting on things likely to cause FUD; reality has a well-known anti-Tesla bias.


This would look incredible in VR.


As someone who has worked in compliance testing for tightly controlled software platforms, things like this piss me off. These problems have known solutions.


Works on my <1yr old iPhone phablet. Ship it.


But the known solutions are old, and new is always better


depends on the goal of the speaker


heck there are probably humans who can sight read ROT13


> Every use case we have is essentially “Here’s a block of text, extract something from it.” As a rule, if you ask GPT to give you the names of companies mentioned in a block of text, it will not give you a random company (unless there are no companies in the text – there’s that null hypothesis problem!).

Make it two steps. First:

> Does this block of text mention a company?

If no, good, you've got your null result. If yes:

> Please list the names of companies in this block of text.
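
Concretely, that two-step flow might look something like this (a minimal sketch assuming the OpenAI Python SDK; the model name and exact prompt wording are placeholders, not anything from the parent comment):

    # Two-step extraction: a yes/no gate first, then the actual listing only
    # if the gate passes. Model name and prompts are illustrative placeholders.
    from openai import OpenAI

    client = OpenAI()

    def ask(prompt: str) -> str:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content.strip()

    def extract_companies(text: str) -> list[str]:
        # Step 1: the null check -- does the text mention a company at all?
        gate = ask("Does this block of text mention a company? "
                   "Answer yes or no.\n\n" + text)
        if not gate.lower().startswith("yes"):
            return []  # the null result, with no chance of an invented company
        # Step 2: only now ask for the list itself.
        listing = ask("Please list the names of companies in this block of "
                      "text, one per line.\n\n" + text)
        return [line.strip() for line in listing.splitlines() if line.strip()]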


I'm more worried about possible impacts of an increase in atmospheric CO2 on cognitive function. It probably isn't an easily measurable amount in any one individual, but across the world's population it may have a notable effect.


Why?


Not ready to evolve into homo matrixens quite yet.

And to the people who are going to say, "then don't use it," I counter with the fact that--at least where I live--there is a damned-if-you-do-damned-if-you-don't aspect to phone use. You will be socially isolated if you don't conform, especially if you are a teenager. Tech evolves faster than our minds, morals and laws can handle. I don't think humanity is ready to jump into fully immersive screenlife quite yet, and I fear Apple makes things sexy enough to be the ones to kick us down that well.


Yeah, but I don't think anyone is going to be using these in public (and if they are, it will be super easy to punk them).

Now, I'm not stupid. The moment people saw AI-generated art as art, something fundamentally changed: these were no longer impregnable objects, infinitely arousing our aesthetic sensibilities, containing something within them we might call "human." It was just machines, machines that could appear confusing and scary to us, in a way that has started to feel overwhelming.

The world is turning, but what will end first, the concept of humanity, or humanity itself?


I'm not sure they will be used in public; I'm just worried about the eventual "killer app" that has no one going out in public ever again. That's part of the problem, as I see it.


> homo matrixens

I think that's more of Meta's approach. Apple seems to be aiming for something a bit more like Dennou Coil, where the virtual coexists with real things rather than being an either/or.


> field of view...Limits of the technology

It really isn't, though, at least not for long: as of mid-2023 there are publicly showcased compact, lightweight prototypes with 240° FoV.


Well, there's FOV and there's pixel density, which are both too low right now and antagonistic to one another, feature-wise. There's also display brightness, which is an issue, and I'm not sure how that fits into the FOV/density trade-off. And then more pixels means more compute… That prototypes exist doesn't necessarily mean much in a space that is full of prototypes showing off one particular feature. The very hard thing is to combine all of the desired features into one consumer-ready headset.


> Well, there's FOV and there's pixel density, which are both too low right now.

Exactly. You know what is the best "goggle"-type display I ever saw (and I tried quite a few)? Recent FPV goggles (HDZero) that have 1920x1080 OLED displays at approximately 46 degrees FOV. I'm very glad the manufacturer decided to stick with this low FOV instead of increasing it to 55 degrees or more like others. The picture is insanely crisp, and looks better than looking at a 4K display. Another huge benefit is that the entire picture fits within your "focus cone," so there is no need to gaze around. It is not a VR display (its purpose is different), but it shows us what visual quality is possible.

I'd love it if manufacturers, if they can't make 16K displays that fill the entire FOV, created variable-pixel-density displays: best quality in the center, deteriorating towards the edges. That would be much cheaper, but then for a good illusion one would need eye tracking and motorised optics, which would probably be more expensive in the end...

Oh well, I'm pretty happy with my FPV goggles (for FPV); I just wish there was a way for them to display a different picture for each eye. They already have head tracking. I wonder what VR would be like with these huge-PPI, narrow-FOV goggles. Would it be more or less immersive?
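
For a rough sense of why that narrow FOV reads as so crisp, here's the back-of-the-envelope pixels-per-degree math (the VR-headset figures below are assumed round numbers, and the calculation pretends pixels are spread evenly across the FOV, which real optics don't do):

    # Back-of-the-envelope pixels-per-degree (PPD), assuming an even pixel
    # spread across the stated horizontal FOV. Treat these as rough figures.
    def ppd(horizontal_pixels: float, horizontal_fov_deg: float) -> float:
        return horizontal_pixels / horizontal_fov_deg

    # HDZero-style FPV goggles: 1920x1080 over ~46 degrees.
    print(round(ppd(1920, 46), 1))   # ~41.7 PPD

    # A typical consumer VR headset, as assumed round numbers for illustration:
    # ~2000 horizontal pixels over ~100 degrees.
    print(round(ppd(2000, 100), 1))  # 20.0 PPD

    # 20/20 acuity is often quoted as roughly 60 PPD, which is why the narrow
    # FOV goggles read as "insanely crisp" while wide-FOV headsets still don't.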


For those who don't know: FPV = first-person view, used for drone racing and things like that.


I'm pretty sure one of the Varjo headsets had displays like you describe.


Motorized displays also sound HEAVY, which is a big issue for a device intended to be used for long periods of time.


They are paying the price for their obsession with super high resolution. I think it's a mistake: from my perception, the Quest 3 at 25 PPD is nearly good enough, and its panels are nearly half the resolution. They should have traded 20% of the resolution for 10 degrees of FOV on either side. In fact, with appropriate optics the effective resolution sacrifice could be even less than that.
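
The first-order arithmetic behind that trade looks like this (the 100-degree baseline FOV is my assumed round number, not a spec; the 25 PPD figure is from the comment, and the even pixel spread is the simplification that "appropriate optics" would improve on):

    # First-order arithmetic for trading resolution against FOV, assuming
    # evenly spread pixels. Baseline FOV is an assumed round number.
    base_fov_deg = 100.0
    base_ppd = 25.0
    horizontal_pixels = base_ppd * base_fov_deg  # 2500 px implied

    wider_fov_deg = base_fov_deg + 20            # +10 degrees on either side
    wider_ppd = horizontal_pixels / wider_fov_deg

    print(round(wider_ppd, 1))                   # ~20.8 PPD
    print(round(1 - wider_ppd / base_ppd, 2))    # ~0.17 -> roughly a 17% resolution hit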


How is it to read text on the Quest 3? How would you compare it to a high-density display (such as what is on your phone)?


Well, it's no Retina screen. I'd say it's a bit like reading text on a 1080p CRT at 96 dpi.

It's perfectly doable, but there's still a hint of fuzziness and we're still a couple generations away from crisp LCD text.

I care about my eyes, so I use a 4k screen at 2x scaling for coding, and will not use the Quest 3 for work, unless that involves playing games and watching videos.


Is there any evidence of pixelated text damaging eyes?



They only need hi-res where the user is looking, don't they?

We can't see what's in our peripheral vision clearly, so they should be able to get away with most of the FoV being blocky af, shouldn't they?


The goggle FoV is fixed right in front of you, but your eyes turn.

