Hacker News
SteamVR Tracking (steamgames.com)
202 points by Impossible on Aug 4, 2016 | 50 comments



It's awesome that Valve is doing this! They have said from the start that they were going to open up Lighthouse (the codename for the tracking system) to hardware manufacturers and this is the first part of that follow-through.

Should be exciting to see what kinds of devices come out of this. I know I saw a few devices (like gloves) at SIGGRAPH that attempted to reverse engineer tracking with the base stations, so this should speed up their development considerably. I wouldn't be too surprised if we see at least an announcement of wireless Lighthouse-tracked gloves for VR by the end of the year.

It's not just controllers or HMDs either. I expect to see a beer coozy with tracking so people can pick up drinks on a desk while remaining in VR, and maybe a collar for pets so you don't step on your dog/cat while the headset is on. The possibilities are endless!


> I know I saw a few devices (like gloves) at SIGGRAPH that attempted to reverse engineer tracking with the base stations

I'm attempting to do the same thing at the moment: https://github.com/jb55/libvive. I got basic stuff like buttons and gyros working. Right now I'm wrapping my head around some of the trigonometry that makes the lighthouse tracking work. Hopefully I'll have an open source reference implementation soon.
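For anyone curious, the core timing-to-angle step is roughly this (a Python sketch based on public descriptions of Lighthouse; the 60 Hz rotor rate comes from discussion elsewhere in this thread, and all names here are my own assumptions, not from the SDK):

```python
import math

ROTOR_HZ = 60.0              # assumed sweep rotor speed
PERIOD = 1.0 / ROTOR_HZ      # duration of one full rotation

def sweep_angle(t_sync: float, t_hit: float) -> float:
    """Angle of a photodiode relative to the base station for one sweep axis.

    t_sync: timestamp of the IR sync flash (marks the start of a rotation)
    t_hit:  timestamp at which the sweeping laser crossed the sensor
    """
    dt = (t_hit - t_sync) % PERIOD
    return 2.0 * math.pi * dt / PERIOD   # radians swept since the sync flash
```

Two such angles (horizontal and vertical) per base station give you bearing rays, and intersecting rays from the known sensor geometry is where the harder trigonometry comes in.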


Great! One big annoyance with all the SDKs fighting it out right now is how much extra (and often closed source) junk they pull in that replicates things you likely already have, with quite invasive integration with your rendering pipeline to boot. Access to lower-level primitives rather than integrated solutions is definitely a nice thing.


>I expect to see a beer coozy

Yes, the killer app. I've been hoping for this myself.


VR: Pepsi. Real world: Coke. Brain: shattered.


I wonder if that would "change" the taste of it?


Taste is difficult to measure, but perception, yes. Look up the blind Pepsi taste tests from the '80s.


It certainly holds true for red vs. white wine. Not being able to see the colour makes it very hard for most people to reliably tell the difference, if you avoid the dead-giveaway heavy reds.


If there is one thing I CANNOT wait for, it's these new VR systems being applied to CAD/CAM software.

The ability to move tools, remove material from items in 3D, and move them around as you see fit will revolutionize the world of CAD software.

I'm saying this as someone who is attempting to get into doing some CAD modeling but having an EXTREMELY hard time getting into it. It really is software made for engineers by engineers.


Have a look at the VR support in the Unreal Engine editor[0]. You can actually manipulate the virtual world while inhabiting it! It's not much of a stretch to apply the same principles to CAD.

[0] https://docs.unrealengine.com/latest/INT/Engine/Editor/VR/


I didn't know this exact example but this is precisely what I am dreaming of.

It's just completely intuitive.


Since a little before the launch of the Vive, I've wondered if early VR adoption would have been better off if the platform holders had focused on content creation tools instead of hardcore games and 360 video. There are many examples of content creation tools available or in development in the form of art and music tools. Tiltbrush and Medium are awesome, for example, but most of the applications available are much more on the toy side than the professional tool side currently.


I feel the same exact way. Not only for the design aspect but for debugging and data visualization as well.

Imagine simulating a 3D representation of a nuclear power plant and overlaying information from sensors in pipes into a 3D model.

Rather than thinking "Sensor 19932324Z is malfunctioning," you can think "All of the pipes intersecting at point X are having a shared problem."

When dealing with large and complex systems it is a HUGE issue to deal with data visualization. 3D representations of data basically completely solves that problem for many systems.


But is VR all that relevant, compared to an interactive 3D rendering on a regular screen?


Yes because I feel the mode of interface is entirely different.

Looking around with your face and using your legs to walk (with a correct treadmill setup) will be much better for human-computer interaction, in my opinion.


I love my Vive, but I too am a little skeptical. It is an entirely different experience and it's cool; I am just not yet convinced that in the long term the interface will be as productive. Now, glasses-sized true AR that is lightweight with high FOV/resolution would be a game changer.


That is my opinion as well.

I was very excited for the castAR. They even had a device to clip on top of the glasses to turn them into a VR setup.


I would imagine something more along the lines of the Fantastic Contraption minimap.


There is a pretty high barrier to entry here with a required $3,000 per-person training program. I'm not sure why that is necessary, unless it's really difficult to work with.


They don't want a bunch of people making bad hardware. They would rather put safeguards in place to help ensure that early builders actually implement the tracking well rather than have a bunch of devices that do it poorly and reviews saying it's the tracking tech that is bad.

It also reduces cost of support as, theoretically, each company should have at least 1 person who is well versed in the technology for troubleshooting.


Probably because it's not ready for mass hobbyists. If it were cheap, their manufacturing wouldn't be able to keep up, and Reddit would be super pissed their pre-order took 6 months and the SDK was janky as hell.


Not to mention, it's in-person training, so anyone outside the US is effectively excluded.


I am sure you can come visit and take the training, though perhaps you mean it's more expensive for non-residents of the US to attend.

It's more expensive for people living in other countries, certainly, but perhaps you could work out a deal where you become a certified trainer yourself and go back to your country to offer discounted officially-sanctioned training to folks in your region.

Look for opportunities rather than problems.



Interesting that Valve isn't looking for royalties here. Any speculation as to what the strategy behind this release is?


Valve isn't looking to be in the hardware game in the long-term, with the exception of expediting new technology like they have done with HTC and the Vive for room-scale VR. They want as many room-scale capable devices on the market as possible because they are confident most of those games and such are going to be sold on Steam.

They are not locking down the tracking technology (although in the short term you still need HTC Vive base stations, that's just for now). The OpenVR standard can be implemented by anyone, and there are already integrations not just for the Vive and Rift (via SteamVR) but also OSVR.


Valve looks to be using VR to grow Steam as a platform.


Valve makes money selling software. The more accessories on their system (SteamVR) the more software will be sold.


As an Oculus user, based on Valve work for the VR community, I will very likely move to the Vive for GEN II.


And there it is - you are the strategy.

They're pushing for an open VR ecosystem because they realise it's important to the medium as a whole, since it encourages experimentation through greater access, and, equally importantly, they realise that it matters to you.

At this point they're setting up a playing field where Oculus will be pretty much compelled to put down their dreams of a platform and exclusivity, and all will benefit, particularly the guys who run the ubiquitous game marketplace. Oculus can have a glorious future as a hardware manufacturer, which is where they started, too.


Own the platform.


More to stop Oculus from dominating VR with its own platform and blocking out or disadvantaging Steam. Valve was very friendly with Oculus until the Facebook acquisition (or maybe shortly before).


Yes, notice it is SteamVR tracking, not OpenVR tracking.

They likely have a clause that you need to make a Steam game to get access.


Q. Can I sell my product wherever I want? Do I need Valve to approve my product?

A. Yes, you can sell your own products wherever you want. No, you don’t need Valve’s review and approval to ship your product, but you do need to comply with the license to be able to use our technology, brands, names, or trademarks.

Source: http://steamcommunity.com/app/507090/discussions/0/360671247...


They probably want this tech to benefit robotics and other autonomous research.


Wow, this could be exciting, I'm really glad valve followed through on their word of opening this up, and it seems they've done so in a pretty good way. Providing an open standard for tracking is fantastic.


So I guess you can only track within a range of 5 meters? Any way to daisy-chain the base stations for higher coverage? This is awesome tech, soooo many applications!


Well, the current TDM system has a limit of 2 lighthouses. If you read into how it works, each lighthouse sends out an IR floodlight pulse followed by a horizontal and a vertical laser scan. Each lighthouse has a single IR detector to sync these scanning events (or you can use a cable). As far as I've read, the system runs at 60 Hz (due to the laser scanner RPM), and as far as I can tell they can't really get the current TDM-based system past 2 units without decreasing scanning frequency and, in theory, tracking resolution.

What I find interesting is the idea of switching to a FDM system. I don't know enough about the current hardware to know if the IR detectors are capable of FDM (I think they've discussed this in interviews though), but if they are then you could have a system that supported many more lighthouses and a higher scan rate (since you don't require TDM anymore).
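The TDM scaling argument above can be made concrete with back-of-the-envelope arithmetic (illustrative numbers only, not Valve's actual timing budget):

```python
def per_station_rate(sweep_slots_per_sec: float, n_stations: int, axes: int = 2) -> float:
    """Updates each station can deliver per axis, per second, under TDM.

    With a fixed budget of sweep slots shared round-robin, adding stations
    divides the budget: each station gets sweep_slots / (stations * axes)
    updates per axis per second (assumed simplified model).
    """
    return sweep_slots_per_sec / (n_stations * axes)
```

So if the system had, say, a 120-sweep-per-second budget, one station could update each axis at 60 Hz, two stations at 30 Hz, four at 15 Hz, which is the frequency/resolution trade-off described above; FDM would let stations sweep concurrently instead of taking turns.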


Well, the Vive does exactly this with two base stations. So, I'm guessing: yes


What happened to Neal Stephenson's plan to make a sword-fighting game? Did it fizzle out? Maybe he could make use of this.



Never heard of it. Is it related?

I believe this is the kickstarter video: https://www.youtube.com/watch?v=vuWCEpcTbww


That's correct; he was funding it himself initially and failed to get the backing he needed to get it off the ground. It was an idea a little too early for its time IMO, so maybe this will be the catalyst for a renewed effort!


Seems the wrong way around for a lot of applications, I want the objects to be the dumb emitters and have a central base station that has a low-latency, synchronized (!) state of their positions.

Otherwise all the effort on low-latency seems wasted on a device that has no low-latency path to getting the information out.


I disagree. That's the model the Rift uses, but that means that as you add devices, the extra computation of position lies on the PC (or mobile device, or w/e).

With this Lighthouse system, the positional computation is distributed amongst each device and they report their computed position, so it scales way better as you add devices.


I think it is definitely for scaling. Also you can add any number of devices that are independent of each other. Like for instance using two Vives in the same room, but use the same base stations.


The headset is wired to the PC. That is its low latency path. Latency is less critical for the hands and they have still done a good job with getting it down.

This will allow things like backpack PCs (4 major companies already have them on the way, tailored for VR) and tracked mobile phone holders to have the low latency path.

You could connect such things with cameras that did on-board processing and wirelessly sent the position, but that would fall on the high-latency path.

Also, you have it a bit backwards: optical tracking is high latency and low refresh. All the low-latency stuff is done with IMUs on both Rift and Vive and then combined with optical via Kalman filtering.
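That fusion can be illustrated with a toy 1-D complementary filter (a deliberately simplified stand-in for the Kalman filtering mentioned above; the class, alpha value, and update rates are all illustrative assumptions, not either vendor's implementation):

```python
class PoseFilter:
    """High-rate IMU dead reckoning corrected by low-rate optical fixes."""

    def __init__(self, alpha: float = 0.98):
        self.alpha = alpha   # weight given to the IMU prediction
        self.x = 0.0         # position (1-D for illustration)
        self.v = 0.0         # velocity

    def imu_step(self, accel: float, dt: float) -> None:
        """Integrate accelerometer data (would run at ~1000 Hz)."""
        self.v += accel * dt
        self.x += self.v * dt

    def optical_fix(self, x_measured: float) -> None:
        """Blend in an optical position measurement (would run at ~60 Hz)."""
        self.x = self.alpha * self.x + (1.0 - self.alpha) * x_measured
```

The IMU path gives smooth, low-latency motion between optical updates, while the occasional optical fix pulls the estimate back and cancels the IMU's integration drift.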


Edit: I misread your proposal - I thought you wanted more intelligence in the peripherals. It sounds like you're actually proposing putting the light source on the tracked device. I suspect Valve made the decisions they did in part looking towards a wireless future where sensors can operate far longer on batteries than light sources can.

UNC Chapel Hill developed a system like you describe a number of years ago [0]. The position-evaluation software they developed was amazingly cool, but the size of the tracker (a bit larger than a golf ball) and its cost were both far greater than those of the SteamVR sensors, while the cost of their "lighthouses" was much less than the cost of Valve's. The UNC system works great when you want to track a small number of high-value devices in an extremely large, office-building-style environment, but it scales much less well when you want to track a large number of inexpensive peripherals in a room-sized environment. The approach you describe is great for many applications, but packing the complexity into one or two reusable base stations, as both Valve and Oculus have done, scales much better for a consumer market that is in part interested in additional "inexpensive" aftermarket peripherals.

[0] https://www.cs.unc.edu/~welch/media/pdf/scaat.pdf


If you like podcasts, the man primarily responsible for the Lighthouse system covers exactly this topic on this show: http://embedded.fm/episodes/162

I don't remember where in the episode, but if you're interested in these sorts of things it's a good listen overall. He posts a lot on Reddit, Twitter, and elsewhere and seems very willing to talk about every aspect of the system.


The tracked objects almost always have buttons on them that the user presses to interact with the world.





