Hacker News
Leap Motion Has Launched (leapmotion.com)
153 points by tucif on July 22, 2013 | 68 comments



I have actually developed on the Leap Motion device, and here are a few critiques:

1. There are way too many gestures and motions available. The SDK needs to provide a solid framework for fewer gestures/motions and the company needs to make sure that consumers readily understand them. Unless there is a whole lot of standardization of the platform it will become hard for developers/other companies to produce a consistent and coherent experience.

2. I do not like the let's-throw-it-into-the-wild-and-see-what-people-come-up-with strategy; the tech has enough cool factor that people will make interesting stuff with it. However, will that translate into usable and effective computing software? From my point of view, I have not really seen a single application on Leap which does something 'better' than a touch-based or mouse-based interface can do. Better meaning more intuitively and/or more productively.

3. I think one of the biggest challenges for any gesture controlled device is the shoulder/forearm/wrist/palm pain that is inherently going to be the result of prolonged gesture-driven interface use. Maybe applications have to use gestures sparingly?

On a broader note, I still think humans have an inherent desire to touch/feel interface elements. Being able to directly touch a screen and manipulate it means I can easily manipulate visual behavior through an actual/physical connection. Interfaces which disconnect this physical medium seem more alien, to me at least. Maybe I am no more evolved than a Neanderthal, but I don't think we are ready for a gesture-driven interface.


Your third complaint can be resolved by placing the device lower than your elbow. Most people use it on the desk, which is the problem. It's not the ideal height for the Leap.

When it's higher than your elbow, your shoulder must be engaged and power all your movements. There are also many apps that map directly from x/y/z space. These can be hard to navigate. All your joints move in arcs, not straight lines.

You get to use the Leap however you want, so figure out what's comfortable for yourself.

Leap is quite sensitive. You could look for tiny flicks of the finger, for instance to create a virtual DataHand. These kinds of interfaces should be much gentler on the body than apps which require large sweeping motions, punches, etc.

https://en.wikipedia.org/wiki/Datahand


I don't think a lot of consumers will be OK with trying to find the optimal place to fit a Leap on their computer desks. It is a forced behavior that doesn't address any specific need.

Leap has a long way to go in defining a use case for the technology; something it does better than existing interfaces. Otherwise it will be a really cool niche product used by tech enthusiasts, not something a heavily VC-backed firm is targeting.

Also, there is a reason DataHands never really took off, and it is the same reason Leap might not. Changing human behavior is ridiculously hard, and pouring money into a dev ecosystem (as Leap is doing) is not the way to do it.


I personally think it should be an additional input for your desktop or laptop.

I'm planning on creating my own laptop startup and eventually trying to integrate Leap (or something similar). In many cases it may not apply to general usage, but there are many use cases where it would be preferred.

1) Cases where the user interacts with a GUI while visually engaged. Imagine I am on YouTube; because of the instability of HTML5 and Flash on Linux, I hate using it, especially when I am full-screening. That's something that can be replaced with a gesture. So try doing a pinch gesture (basically zoom out) and the browser can interpret that as putting the video in full screen.

2) Rather than using your whole arm, what about just one finger? Let's say you want to scroll down and both hands are on the keyboard; rather than moving one hand back to the mouse, you can just flick one finger. This gesture could scroll pages or flip through tabs, and you wouldn't even have to take your hands off the keyboard.

3) There are a lot of edge cases, I would say, where using a keyboard/trackpad is uncomfortable. When I am lying down, using a trackpad is very awkward and strains my wrist. This is certainly a case where I would want to flick some stuff around, maybe even talk to the computer.

4) My parents connect their laptop via HDMI cable to the TV, and that's all they use the laptop for, basically to watch YouTube and Netflix. If Leap Motion had the range, it's something I would want to control almost as an imaginary but intuitive remote.
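The finger-flick idea in (2) can be sketched without the real SDK. Everything below is hypothetical (the function name, velocity units, and threshold are invented for illustration); an actual implementation would feed it fingertip velocities read from Leap frames.

```python
# Hypothetical sketch: turn a short burst of fingertip velocity samples
# into scroll events. The 300 mm/s threshold is made up; a real version
# would tune it against Leap tracking data.

def classify_flick(vy_samples, threshold=300.0):
    """Classify vertical fingertip velocities (mm/s) as a scroll gesture.

    A fast downward flick (negative vy) scrolls down, a fast upward
    flick scrolls up, and anything slower is ignored so that ordinary
    hand motion above the keyboard doesn't trigger scrolling.
    """
    if not vy_samples:
        return None
    peak = max(vy_samples, key=abs)
    if abs(peak) < threshold:
        return None
    return "scroll_down" if peak < 0 else "scroll_up"

print(classify_flick([-20.0, -450.0, -80.0]))  # a fast downward flick
print(classify_flick([15.0, 40.0, 10.0]))      # ordinary slow motion
```

The key design point is the dead zone below the threshold: without it, every small movement while typing would scroll the page.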

And this is kind of just the start. I think with the increasing number of input devices available interfaces are going to have to be clever to hide that complexity from the user and make things intuitive.


DataHands cost hundreds of dollars, have huge brick interface boxes, cables everywhere, a mains power brick, they don't let you type one-handed while you drink, eat, mouse, or hold a phone, they take up a lot of space for each hand unit, and they look weird.

But in terms of changing behaviour, they are pretty much a qwerty keyboard. In use, they're not ever-so different at all. I would be very surprised if that was a significant contributing reason to why they 'never really took off'.


I think there are a number of specific areas where Leap Motion can be interesting.

For example, there are some startups that use Kinect to let surgeons interact with a screen showing patient data during an operation. Some researchers have also proposed the use of gesture recognition in the context of a worker having to follow precise instructions for some complex task. Leap Motion + Google Glass could be an interesting combo for interactive training.

I'm sure there are other areas. But I agree with your comment and I fail to see the mass market use case. But I'd be happy to be proven wrong :-)


Leap wants an unspecified cut if you make one of these:

"Specialized Application" means an Application which is: (i) sold, licensed, leased, or otherwise disposed of for a list price of more than US$500 or local equivalent, or more than US$240 per year or local equivalent if on a subscription, lease or similar basis; (ii) sold, licensed, leased or otherwise disposed of as part of, or for use with, another application, system, machine or device (other than a personal computer), having a list price of more than US$500 or local equivalent, or more than US$240 per year or local equivalent if on a subscription, lease or similar basis; (iii) designed for use, or that is primarily used, with or for control, whether direct or indirect, of industrial, commercial, military or medical equipment.


Suddenly, the device seems a lot less interesting.


That (iii) including "industrial" or "commercial" equipment could be interpreted very broadly. In other words, sorry, can't use a Leap to control the lights in a conference room, upscale club, or retail store without special permission from Leap.

I'll pass.


Based on my experience with Leap and its limitations, it sounds like someone wants to create Surgeon Simulator 2014...


I think you're right that it might prove useful in edge cases and custom application scenarios. But as a consumer product, it's just confusing at the moment.

I make mobile applications for kids and I made a prototype of one of our games for some kids to work with, but even after a tutorial it was hard for kids to pick it up.

I think there is something very human about being able to physically interact with things. And it is extremely hard to break that barrier.


I'm mostly just surprised that it seems like they didn't include a trainable gesture recognition app.

It should be pretty doable to hook it up to a neural network and train it to recognize some of your own gestures.
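A trainable recognizer doesn't even need a neural network to get started. Here's a hedged, pure-Python sketch of a template matcher in the spirit of the "$1 recognizer": record one example path per gesture, resample every path to a fixed number of points, and match by mean point-to-point distance. All names and the 16-point resampling are illustrative, not Leap SDK API.

```python
# Minimal trainable gesture matcher (illustrative stand-in for the
# neural-network idea): store one resampled template per gesture name,
# classify a new path by nearest mean point-to-point distance.

from math import hypot

def resample(path, n=16):
    """Resample a list of (x, y) points to n roughly evenly spaced points."""
    pts = list(path)
    total = sum(hypot(x2 - x1, y2 - y1)
                for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
    step = total / (n - 1)
    out, acc, i = [pts[0]], 0.0, 1
    while i < len(pts) and len(out) < n:
        (x1, y1), (x2, y2) = pts[i - 1], pts[i]
        d = hypot(x2 - x1, y2 - y1)
        if d > 0 and acc + d >= step:
            t = (step - acc) / d
            q = (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
            out.append(q)
            pts.insert(i, q)  # keep measuring from the new point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:  # guard against floating-point shortfall
        out.append(pts[-1])
    return out

class GestureTrainer:
    def __init__(self):
        self.templates = {}  # gesture name -> resampled template path

    def train(self, name, path):
        self.templates[name] = resample(path)

    def recognize(self, path):
        p = resample(path)
        def score(tmpl):
            return sum(hypot(ax - bx, ay - by)
                       for (ax, ay), (bx, by) in zip(p, tmpl)) / len(p)
        return min(self.templates, key=lambda name: score(self.templates[name]))

trainer = GestureTrainer()
trainer.train("swipe_right", [(0, 0), (10, 0)])
trainer.train("swipe_up", [(0, 0), (0, 10)])
print(trainer.recognize([(1, 0), (9, 1)]))  # closest to swipe_right
```

A real version would also normalize scale and rotation before matching, but even this bare form shows why "train your own gestures" is very doable on top of a precise tracker.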

I was already looking forward to training up a drumroll gesture for 'deploy to production.'


To prevent accidental deploys, it waits for a second of drumrolling, then asks if you're sure you want to deploy. To confirm, hit the cymbal.



I have also developed for the Leap Motion device [1] and agree with much of your critique.

The only arrangement I have had real success with is far away from the screen, i.e. mounted in a guitar.

My kids have been mildly amused by some of the things I've written and by the apps available in the beta, but with Cut the Rope, for example, which is a lot of fun swiping on touchscreen iOS devices, they couldn't get it to work waving about in empty space with the Leap Motion. "Dad, this is too hard."

[1] https://github.com/heuermh/leap-motion-processing


We're inventing this stuff as we go, right now, which makes gestural and spatial interfaces an exciting thing to work on. Your broader point is definitely well taken. Touch is important. But touch, combined with free-space gesture, plus appropriate use of tools (like the keyboard), and voice, and head tracking, and other kinds of contextual sensing from our computers -- now that starts to look like the future.

I work at Oblong Industries. We're the folks who did the interfaces in Minority Report. We've been building gestural and spatial stuff for a long time.

Regarding (1) - standardization is really important. I'm old enough to remember the early days of mouse-driven interfaces. It took a long time for the standard window manipulation semantics, scrollbars, buttons, and drop-down menus to develop. The original Macintosh team did a lot of quite elegant heavy lifting in that regard. It's very much worth reading the stories on Andy Hertzfeld's site and in the related book. http://www.folklore.org/index.py ; http://www.amazon.com/Revolution-The-Valley-Insanely-Great/d...

If you'd like to experiment with a toolkit that provides a standard framework for building gestural, spatial, multi-screen, multi-device applications, please feel free to check out our Greenhouse SDK. Greenhouse supports the Leap Motion controller, the Kinect, and a bunch of other forms of input. http://greenhouse.oblong.com/

Regarding (3) - we've got a lot of experience now with ergonomics of spatial interfaces. (We've been selling glove-based gestural systems to early adopter bigco customers since 2005.) It turns out that accuracy of the underlying sensing hardware is incredibly important, and then on top of that design of the gesture language is really important, too. Basically, if you have good enough sensing that small motions are precisely tracked, and then if you use that accuracy to track hand movements and poses that are "natural" (glossing over what "natural" means, for the moment), people can comfortably and happily use gestural interfaces all day, every day.

Think about it this way: most people "talk with their hands" all the time. (Some people wave their hands around more than others. But spend some time paying attention to people moving their hands while they talk; it's really interesting to watch.) Our high-end, glove-based systems track finger positions to 0.01 mm at 100 Hz. The tracking volume is big, so your motions aren't constrained. You can walk around and you can go right up to a screen or stand back from it. We recognize hand poses ("one finger point," "two finger point," etc.) "Talking" to the computer with your hands feels a lot like what you do when you use your hands while making a point in a discussion with another person. You don't get tired and you don't feel any strain or overuse pain.

The consumer-priced sensors like the Leap Motion and the Kinect don't yet provide this combination of tracking precision, tracking volume, and robustness to occlusion. But they're getting closer! And when a $50 sensor (or a few $0.50 sensors) give you the same accuracy and precision we have today with Oblong's optically tracked gloves, our collective expectations about interfaces and user experiences are going to radically change. (Or, more accurately, as happened with the invention of the GUI we use today, our expectations will change slowly over fifteen years, starting roughly now, with some moments of punctuated equilibrium analogous to the release of the Mac in 1984 and Windows 95 in, um, 1995.)


Do you plan to support the MYO? How do you think its accuracy will compare to the Leap and Kinect?


Their marketing worked on me. I pre-ordered one. Like many, I suspect, I ordered mine because it looked like a cool toy, not because I thought it filled a need.

I've only played with it for an hour or so in total, but so far I'm returning it. Maybe it's a training issue (on my part; the device doesn't learn, as far as I can tell), but it just doesn't work very well for me.

The demo apps suggest a severe lack of precision when I'm using it, and complain that the room is too bright. Coincidentally, it's a stormy, overcast day, and my office lamp burned out this morning. This is as dark as it'll ever be in here during the day.

I'll definitely give it some more time (you have 30 days from when it ships to return it), but I'm not that interested in learning how to move my hands in a Leap-specific way, or in permanently darkening my office.


> The demo apps [...] complain that the room is too bright. Coincidentally, it's a stormy, overcast day, and my office lamp burned out this morning. This is as dark as it'll ever be in here during the day.

Could it be that it's aimed too much towards the screen? I mean, that's the most obvious remaining light source.


Could be, it is in front of the monitor, but it's flat on my desk in more or less the same place shown in most of their demo videos.


I wonder if your monitor's backlight puts out an inordinate amount of infrared light. I've noticed interference with IR remote control reception when one of my LCD monitor's backlights is turned down close to minimum brightness; I'm assuming that the PWM dimmer frequency (or one of its harmonics) is making it through the filter used by the IR receiver.


I actually fall into a similar category. I fell for their advertising, too (a man can dream...), but it's definitely clear to me that this really is a toy and not really practical for much. Definitely not practical for most (all?) of the apps they currently have available for free. It's still fun to use, but there is no way this will find itself in any productive person's everyday workflow in its current state. We'll see how the Myo turns out...

On the bright side (no pun intended), I actually didn't have any lighting issues though, regardless of how dim the room was. Maybe you just have a defective unit? Giving them a call might be worth a shot...



I am so looking forward to getting mine. I ordered the thing in August of last year. Guess they didn't like my application for a touch-free interface for some hospital software I've worked on, so I didn't get an early development device. I thought the possibility of a zero-infection-risk interface would be a great plus for this. But they wanted apps for the masses, I'm guessing.


My proposal was for a medical data manipulation tool for surgeons in an operating room, and I received one. Maybe I just beat you to it ;)


Well, congrats! I hope it's been fun! I'm excited.


They didn't like that, but they liked my application for a gesture based file browser? I feel bad for being on the dev program now...


This would have been a great app, instead of just rehashed games.


It's also possible that medical applications (and, by extension, deploying the hardware in a hospital) would require them to have a whole different set of dev, manufacturing, and QA procedures, much like flight control and other "mission critical" software.


Could be. But, again, they can use iPads and other random tablets, cheap PCs bought from anywhere on the internet, flea market computer mice and keyboards, cheap flatscreen TVs for the wall, etc., and that doesn't seem to be a problem. :-)


They also didn't seem too interested in our application for a radiation oncology treatment planning tool, so maybe they just weren't into medical stuff?


Just tried mine out and am pretty disappointed. Mouse and keyboard seem pretty revolutionary after 15 minutes with the Leap.


Could a Leap be mounted on the side of a laptop looking sideways at whatever table surface happens to be there (if there is one), so as to functionally give the laptop a big touchpad off to the side?


Leap doesn't work well, or at all (or at least didn't during the beta), if you have a solid object blocking its viewport, even if the blocking object is behind your hand.


I'd be excited to try this if it weren't (as far as I can tell) another use-the-app-store-or-nothing setup. https://www.leapmotion.com/developers points to a signup page with a lengthy contract.


They told me I'm free to distribute apps outside their store. There are some things in the SDK that are not redistributable, but I am fairly certain the core libs required to run a program are allowed.


Thank you! I'll take another look now.


Too bad a lot of people who pre-ordered via Kickstarter haven't even gotten their shipping confirmations yet.


Yeah, I got an update yesterday that my credit card info was no longer valid and had to enter a new one. Guess I must have replaced the card since then. I hope I get mine in a timely manner. I'm glad I am not the only one who hasn't received a confirmation of shipment. Well, not glad... But, perhaps it's just the norm at the moment and that is somewhat encouraging.

Edited to say: "Man, I use the word 'I' enough in that post."


I've cancelled my pre-order. I feel that as a "pre-order" customer I've been let down. After a year of waiting, other non pre-order customers will get their LeapMotion before I do. Why did I pre-order a product when there is no "pre"?

Additionally, developer feedback on the product is not that positive.


Mea culpa: a few hours after I posted this comment, I got my "Your Leap Motion order has shipped!" email.


Yeah, similar problem. Preordered a year ago in June, but no credit card charges have been made, nor have I received any emails about shipping.

I did call them today and they couldn't even find the order, even though it was still fine a couple of weeks ago. (This is an international order, btw.)

Anyways, the product still looks great and the customer support has been fast to reply so I have no complaints about that either. I was just quite eager to start developing some projects for it.


This is annoying. I guess my card expired. I ordered the damn thing in Feb.


It wasn't on Kickstarter. They did pre-orders from the get-go.


I was looking at the order form on their website, and then saw the URL: https://store.leapmotion.com/(S(k0erjjqyuqwxmojbybjv6vpx))/P... Is that a hash? It looks weird, and I have never seen anything like this before.


I'm not surprised that many of the launch apps are wonky. I developed something using a Leap for a hackathon, and the Leap is so accurate that significant time has to be invested in debugging and just playing around with the app to improve the user experience.


The accuracy is definitely a mixed blessing - interpolation, velocity awareness, and (screen-independent) gestures go a long way toward a good user experience.
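The interpolation point above can be made concrete with a tiny sketch. This is purely illustrative (the function name and the alpha value are invented): raw positions from a precise tracker are jittery, so many apps run them through a low-pass filter before moving anything on screen.

```python
# Hypothetical smoothing sketch: exponentially smooth a sequence of
# 1-D position samples so per-frame jitter doesn't shake the cursor.
# alpha trades responsiveness (high) against stability (low).

def smooth(samples, alpha=0.3):
    """Return an exponentially smoothed copy of a sample sequence."""
    out = []
    s = samples[0]
    for x in samples:
        s = alpha * x + (1 - alpha) * s
        out.append(s)
    return out

print(smooth([0.0, 10.0, 0.0, 10.0]))  # the raw jitter gets damped
```

Velocity awareness is often layered on the same idea: scale alpha up when the hand moves fast (so big motions feel immediate) and down when it hovers (so fine pointing stays steady).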


Totally agree, but I just didn't realize how difficult it was to really master those gestures and harness their power.


Used as a pointing device it will have the same problem as other mouse alternatives (and mice). The only way to get feedback is through rigid staring at the screen to keep track of the cursor. This breaks the body's relationship between physical action and computer action. My theory is that this adds stress to the nervous system and adds to the muscular fatigue. If an interface is poorly tuned, you can just feel the anxiety rising when you are trying to get work done.


Depends on what you mean by "feedback". The Leap Motion controller tracks fingertips and objects absolutely in real space. Applications can easily tie cursor movement (or viewspace/camera movement, or other interface elements) to absolute hand position or to spatial offset from an arbitrary position in three-space. This actually works really, really nicely for applications designed from the ground up to think this way.
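As a rough illustration of that absolute mapping: define a box of real space in front of the sensor, normalize the palm position within it, and pin it to screen coordinates. The box bounds, screen size, and function name below are invented for the sketch; the SDK may provide a similar normalization natively.

```python
# Hedged sketch: map an absolutely tracked palm position (sensor
# millimeters) into pixel coordinates via a calibrated interaction box.

def palm_to_cursor(palm_xy, box_min, box_max, screen_w, screen_h):
    """Map a palm (x, y) inside the box to pixel coordinates.

    Positions outside the box are clamped so the cursor pins to the
    screen edge instead of disappearing.
    """
    (x, y), (x0, y0), (x1, y1) = palm_xy, box_min, box_max
    nx = min(max((x - x0) / (x1 - x0), 0.0), 1.0)
    ny = min(max((y - y0) / (y1 - y0), 0.0), 1.0)
    # Sensor y grows upward; screen y grows downward, so flip it.
    return (round(nx * (screen_w - 1)), round((1.0 - ny) * (screen_h - 1)))

# Centre of a 200 mm box maps to the centre of a 1920x1080 screen.
print(palm_to_cursor((0.0, 200.0), (-100.0, 100.0), (100.0, 300.0), 1920, 1080))
```

Because the mapping is absolute rather than relative, there is no drift to correct: putting your hand back in the same place always puts the cursor back in the same place.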

Here, for example, is demo video of a three-space navigation app built on these principles that uses the Kinect: http://vimeo.com/65937620

And a simpler demo using the Leap: http://vimeo.com/66196803

There's no muscle fatigue or stress using these kinds of applications. You get muscle fatigue (gorilla arm, forearm pain, hand pain) when your ergonomic setup -- which includes both hardware and software -- forces your body to conform to motion patterns that are rigidly repetitive, forces you to position yourself in ways that are awkward relative to your joint kinematics, or forces you to over-correct inaccuracy or unpredictability in the interface.


One of our HN Kansai members developed a few demo applications for the Leap Motion Controller: you can check it out here: https://www.youtube.com/watch?v=b67QedK9jhA


Can the Leap Motion be used in addition to the MYO Armband, or would that not make sense?


My thoughts exactly! I've preordered my MYO :) Let's throw a Kinect in for body-scale gestures.


Don't forget WiSee!


I wonder what cool things can be done with a Raspberry Pi and one of these motion controllers! Both are very small in size.


Hopefully lots someday. Nothing today. :-(

The Leap Motion Controller works with computers running Mac OS X 10.7 or 10.8, or Windows 7 or 8. It requires a minimum Intel Core™ i3 or AMD Phenom™ II processor, 2 GB RAM and a USB 2.0 port.


Update: If you go to the SDK, they do have Linux support (version 0.8.0.5300) as a Debian package. From the release notes, it supports Ubuntu Linux 12.04 LTS and Ubuntu 13.04 Raring Ringtail.

I was able to install and run it on my Ubuntu 12.04 system. It had a dependency issue because I had previously upgraded to lts-quantal - the install did not recognize the OpenGL package because it was (from memory) "libgl1-mesa-glx-lts-quantal" instead of the expected "libgl1-mesa-glx", but I did a --force-depends and it installed and ran fine.


I have Mac OS X 10.6.8, and I wasn't planning on updating in a hurry. I guess I'll be sending my Leap Motion back (I have up to 60 days in which to do so).

When I ordered it, back in September 2012, there was no indication that it wouldn't be compatible with my OS. I wonder what there is in 10.7 that they require.


The RPi has a terrible USB system. Hopefully its successor fixes this problem. In the mean time, there are lots of other ARM boards available.


Response to a dead comment ("Just saying that without elaborating isn't very useful. Or credible."):

I elaborate a bit more on the subject in an older blog post: http://blog.mikebourgeous.com/2012/10/02/the-ideal-arm-platf...

You don't have to take my word for it. You can search the web for RPi USB or Raspberry PI USB problems, or even do your own tests.

Try running dd if=/usb/storage/some/file of=/dev/null and see just how slow it goes. The fact that the Ethernet interface is connected over USB is also a problem, because it hogs what precious little bandwidth the RPi has. Try running a bandwidth test from the RPi to another computer on your LAN; you probably won't come close to 100 Mbit/s.

No doubt the RPi is good at a lot of things, but USB performance and network bandwidth are not among those things.


Amazing. I wonder if it's possible to read movements away from the screen? One big drawback of touch/motion is that it covers the screen, but what if we put the detection device lower?

The feedback is tight as long as you can continuously see your input, and motioning lower could help with tiredness.


The device can be positioned anywhere you like, but I'm guessing there's a definite learning period if you have it somewhere other than under your monitor (sort of like what I imagine Wacom users have the first time they use a screen-mapped tablet).


Looked at the page they linked... still have no idea what it is they're launching, having never heard of it before.

Might be a good idea not to assume that people know what your product is when you write such a release page...


This is only the beginning. We would have reacted the same way to the mouse: keyboard shortcuts are better, mouse-arm problems, etc.


Looking forward to combining Leap with the Oculus Rift!


I tried to think of an application for that combo and all I could come up with is this:

You sit down at a virtual PC, and control it by waving your hands over a virtual Leap Motion. Which is very meta, but perhaps not very interesting.


Off topic: But I really like their site design.


This thing is horrible for porn.

My web browser keeps scrolling up and down very rapidly.



