Leap Motion is a revolutionary company, but revolutions take time. In Leap’s case, its motion control device introduces a whole new way for people to interact with computers. As with anything truly new, our first instinct is to map it to what we already know. For computer interaction, what we know is the mouse, which maps two dimensions of a surface to the two dimensions of a screen. And, since the advent of mobile, we know touch, where a gesture on the surface of a screen maps directly to the elements on that screen. The Wikipedia page for Leap Motion supports this common preconception, explaining that the device “supports hand and finger motions as input, analogous to a mouse, but requiring no hand contact or touching.”
After working on the problem for a couple of years, Leap founders Michael Buckwald and David Holz secured Series A funding in May of last year and announced pre-sale of the $79.95 USB drive-sized device. The reception from developers was astounding, and The MIT Technology Review called it “The Most Important New Technology Since the Smart Phone.” Unlike Apple, Leap Motion did not try to control every aspect of the user’s experience, instead sending tens of thousands of kits to developers to allow them to create their own apps in its ecosystem. I wrote, later that summer, that the company was “Putting Its Future Into The Hands Of Developers.”
The actual product was released this past July, six months later than expected, to decidedly mixed reviews. Leap succeeded, almost too well, at creating excitement and awareness around its product. Its design and marketing were first-rate, but the first crop of apps in its Airspace app store was a mixed bag. My own first hands-on experience was a frustrating disappointment. My kids were initially excited, but soon lost interest.
The problem was twofold. First, the hand tracking software was incredibly accurate, but suffered from some glitchy and erratic behaviors. Second, many of the apps did not make it clear exactly what type of behavior they were expecting from the user. A platform like Leap Motion is only as good, to a new user, as the worst app they try out first. Combine these issues with an unfamiliar interaction paradigm and you have a formula for frustration.
But before you write the future of computer interaction off as a gimmick, count to ten. Leap Motion is part of a new hardware product paradigm as well. Unlike the iPhone, which seems designed for hardware obsolescence, these new purpose-built devices can increase their functionality by orders of magnitude with changes to software alone. And in the four months since the Leap Motion Controller was released, this is exactly what the company’s engineers have been busy doing.
The second-generation hand tracking software that I previewed at the company’s San Francisco offices last week (see screen shot above) with founder Buckwald and marketing head Michael Zagorsek addresses the glitches in the first version while capturing more information in a more efficient way. Most importantly, it solves the occlusion problem that occurred when the device’s cameras “lost sight” of the fingertips they were tracking, either from a hand rotating perpendicular to the line of sight or being blocked by the other hand. This had the unexpected effect of causing the representation of the hands to disappear from the screen in the middle of an action.
The new software tracks not only the fingertips and palms of the hands, but each joint as well, making for a much more accurate representation of hand position and motion. The software has also grown up, reaching the “object permanence” stage of Piaget’s model of childhood development: it remembers where a hand is even when it is temporarily out of view. This should eliminate some of the confusing feedback that breaks the illusion of continuity in some of the apps.
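The object-permanence idea can be made concrete with a toy sketch. This is purely illustrative (the class and parameter names here are my own, not Leap Motion’s API): a tracker keeps reporting a hand’s last known position through a brief occlusion, so the on-screen hand does not vanish mid-action.

```python
class PersistentHandTracker:
    """Hypothetical illustration of 'object permanence' in hand tracking:
    hold a hand's last known position alive through brief occlusions."""

    def __init__(self, max_missed_frames=10):
        self.max_missed_frames = max_missed_frames
        self.last_position = None   # last (x, y, z) the sensor reported
        self.missed = 0             # consecutive frames with no detection

    def update(self, detection):
        """detection is an (x, y, z) tuple, or None if the hand is occluded."""
        if detection is not None:
            self.last_position = detection
            self.missed = 0
        else:
            self.missed += 1
        return self.position()

    def position(self):
        # During a short occlusion, keep reporting the remembered position
        # instead of letting the hand disappear from the screen.
        if self.last_position is not None and self.missed <= self.max_missed_frames:
            return self.last_position
        return None
```

For example, after a detection of `(1.0, 2.0, 3.0)`, a few `update(None)` calls still return that position; only after the occlusion outlasts `max_missed_frames` does the tracker give up and return `None`.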
Perhaps even more significant in terms of growing adoption and engagement with the device is the emerging sense of what the Leap Motion Controller is actually good for. As I wrote about in my description of Elliptic’s ultrasound gesture recognition technology, Leap is really overkill for reading simple swipe gestures to turn a page or scroll. Leap is now defining itself as a “motion capture” technology in contrast to mere gesture recognition. This distinction is important for understanding what Leap does best.
Gestures, whether through touch or in the air, are generalized movements. Moving your hand or finger from left to right within certain statistical boundaries can be interpreted as a swipe. But capturing precisely where the different parts of your hand are during the course of that gesture is a whole other matter, and one at which Leap Motion uniquely excels.
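The distinction can be sketched in a few lines of code. The function names and thresholds below are my own inventions, not part of any Leap Motion API: a gesture recognizer collapses a fingertip’s path into a single label, while motion capture preserves every sample of the path.

```python
# Toy illustration (names and thresholds are hypothetical, not Leap's API):
# gesture recognition reduces a fingertip path to one label, while
# motion capture keeps the full sequence of positions.

def recognize_swipe(path, min_travel=100.0, max_drift=30.0):
    """Collapse a list of (x, y) samples into 'swipe-right', 'swipe-left', or None."""
    if len(path) < 2:
        return None
    dx = path[-1][0] - path[0][0]        # net horizontal travel
    dy = abs(path[-1][1] - path[0][1])   # net vertical wander
    if dy > max_drift:                   # too much drift to call it a swipe
        return None
    if dx >= min_travel:
        return "swipe-right"
    if dx <= -min_travel:
        return "swipe-left"
    return None

def capture_motion(path):
    """Motion capture keeps every sample; nothing is thrown away."""
    return list(path)

samples = [(0.0, 0.0), (40.0, 5.0), (90.0, 8.0), (150.0, 10.0)]
print(recognize_swipe(samples))   # one label for the whole movement
print(capture_motion(samples))    # the full trajectory, ready for sculpting
```

A page-turning app only needs the label; a sculpting app like the one described below needs the whole trajectory, which is why fine-grained tracking matters.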
To bring this point home, Leap has just released a new app which it developed internally called Freeform. Freeform allows you to sculpt highly detailed figures within the three-dimensional space of the app directly with your hands and then export these as 3D models that can be printed on any 3D printer. As you can see in the screen shot above, you can choose the material and tools to work with and even rotate the model for lathe or pottery wheel effects.
This is only the second app produced by Leap Motion that has been released into Airspace. The first is a general computer control app called Touchless (available in separate versions for Mac and Windows). The difference between the two apps is striking and marks a major refocusing of the company’s approach to what a Leap Motion app should be. Instead of presenting Leap as a three-dimensional mouse, Freeform highlights the precise control you can achieve with it as an input method. This, more than mouse replacement, is the technology’s unique strength. Leap Motion is really for capturing fine motor movements, as opposed to the gross motor movements captured by devices like Kinect and Elliptic.
There is something more, as well. For Leap Motion to climb out of the chasm of disappointment that followed its release and regain the enthusiasm of its initial reception, it must demonstrate concrete use cases that satisfy the mass audience. Freeform could help do that.
Imagine for a moment the share of the population that has the kind of spatial intelligence required to work physically in three dimensions. Now, imagine the sliver of that population that also has the abstract intelligence required to work with 3D programs on a computer. And even for people in that sliver, consider that it takes three to six months to get up to speed with such programs. This series of constraints is a limitation, in turn, on the adoption of 3D printing. By creating a program that people without highly developed abstract intelligence can get comfortable using in a matter of minutes or hours instead of months (if ever), Freeform could radically alter the landscape of 3D creativity.
And 3D modeling and printing is just one example. In the months ahead, look for Leap’s developer community to refocus its efforts as well on removing these kinds of constraints from all kinds of three-dimensional activities. With this in mind, it is very exciting that SOS Ventures and Founders Fund have announced the formation of a new accelerator program specifically designed to jumpstart the next generation of apps for Leap Motion’s technology. The LEAP.AXLR8R will pick startups with disruptive and achievable ideas, providing seed funding, office space adjacent to Leap Motion’s offices in San Francisco, and access to mentors and other resources to support an intensive three-month development cycle beginning in late January 2014.