Category Archives: WOW

Wearable Tech: Healthcare Win

InformationWeek: HEALTHCARE // MOBILE & WIRELESS
7/28/2014
While most businesses still view wearable computers as little more than toys, healthcare has embraced them. Check out these interesting examples.


Recovery and therapy
After surgery, you often face a recovery period that involves physical therapy. Along comes Google Glass again, this time with an app from a company called Wearable Intelligence. The app guides a therapist through a specific set of exercises for a specific patient, with exact instructions on such things as amount of flexion and limb rotation. The app also uses the camera to record video of the exercise, and even gives a check mark “OK” when all steps have been accomplished. This should help reduce misapplied therapies, and help new therapists rapidly gain the right skill sets.

Image credit: wearableintelligence.com

Zipbuds: an example of “Why didn’t I think of that?”

Zipbuds Juiced 2.0
COMPANY Zipbuds, San Diego
DESIGNER Robin DeFay, president; Erik Groset, director of operations; Justin Liu, director of technology
PRODUCT LAUNCHED November 20, 2013

WHAT IS IT? Zipbuds headphones’ unique zipper design eliminates headphone tangles. The cables are made of military-grade aramid, so they’re built to last. And the company uses proprietary technology to give the earbuds a comfortable and secure fit.

WHY? How do you tie the world’s strongest knot? Put your headphones in your pocket and wait five minutes. Headphones snarl. Sometimes, it seems as if that’s what they’re designed to do. Zipbuds’ design is simple and obvious in retrospect, and that’s what the judges liked about it.

Sales of stereo headphones in the United States amounted to $2.3 billion in 2013. It’s unclear what portion of that was driven by the need to replace headphones whose cables had become impossibly tangled.

“You know a great design when you say, ‘Why didn’t I think of that?’ It’s often hardest to solve a simple problem well. This design nails it.”

–BILL BURNETT, EXECUTIVE DIRECTOR, DESIGN PROGRAM, STANFORD UNIVERSITY

Google picks up incredible visual translation app Word Lens and makes it free

In TechRepublic
May 20, 2014
Word Lens
 Image credit: Apple

 

Back in 2010, a company called Quest Visual debuted a little app called Word Lens. It scarcely seemed possible, but the app translated a number of different languages in real time using just the smartphone’s camera. When traveling in a foreign country, Word Lens users would simply hold the phone up to a sign and the camera would immediately translate it.

Currently, users can translate between English and Portuguese, German, Italian, French, Russian, and Spanish.
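At its core, the visible magic is a substitution step: recognize the words in the camera frame, swap each for its target-language equivalent, and redraw them in place. Here is a toy, runnable sketch of just that substitution step; the glossary and function name are illustrative assumptions, not Quest Visual’s actual code.

```python
# Toy illustration of the word-substitution step behind a visual translator.
# The tiny Spanish-to-English glossary here is a stand-in for a full
# language pack; unknown words pass through unchanged.

GLOSSARY_ES_EN = {"salida": "exit", "peligro": "danger", "abierto": "open"}

def translate_ocr_tokens(tokens, glossary):
    """Replace each recognized word with its translation, keeping
    unrecognized words as-is so the sign stays readable."""
    return [glossary.get(word.lower(), word) for word in tokens]

# A sign reading "SALIDA NORTE" translates only the word it knows:
print(translate_ocr_tokens(["salida", "norte"], GLOSSARY_ES_EN))  # ['exit', 'norte']
```

The hard parts Word Lens solves, of course, are the real-time OCR and re-rendering of the text in the original typeface and perspective; that is well beyond a dictionary lookup.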

It’s easy to see why Google would want to own it — its stated mission is to make all the world’s information searchable in any language — and Google Translate generally does this quite well, at least for web pages.

With Word Lens, iPhone users can translate the world. Apple even featured the app in its recent “Powerful” television ad for the iPhone 5s, and it’s obvious why.

Even better, it doesn’t require a connection to the internet, which is another benefit for business travelers.

Word Lens isn’t perfect. It has trouble with particularly stylized text or handwriting, and the translations will make occasional mistakes. However, most of the time, it will at least get the point across.

Financial terms of the acquisition, which was announced on Quest Visual’s website, were not disclosed. Neither company shared details on what the future holds for Word Lens either, other than the website saying that the app and language packs would be “free to download for a limited time” while the Quest Visual team transitions to Google. Individual language packs previously cost $3 each.

Quest Visual

 Image credit: Quest Visual

The app itself is free to download from the App Store for both the iPhone and iPad, and I couldn’t recommend it more highly. The translations are available via an in-app purchase, though they are currently free. It’s also available on the Google Play Store for Android users.

Because we don’t know how long Word Lens will remain on the stores, I recommend that you pick it up as soon as possible, particularly if you travel internationally.

Word Lens is truly one of the killer apps for mobile, and it should be a staple on every phone. Here’s hoping that Google keeps it around in some form or another and doesn’t kill it off.

What about you? Do you have a situation where you used — or should have used — Word Lens? Let us know in the comments below.

 

SOLAR ROADWAYS


CoolBusinessIdeas.com

Solar Roadways

Eight years ago, Scott and Julie Brusaw had a vision of replacing the asphalt on American roadways and parking lots with energy-producing solar panels that are strong enough to withstand vehicular traffic. After a lot of experimentation and funding struggles, the couple and their company Solar Roadways just unveiled their first parking lot made of hexagonal panels! The road panels don’t just harvest solar energy – they are also equipped with circuit boards, programmable LEDs, and a heating element that melts ice and snow, all covered in “super-strength” textured glass. The parking lot is equivalent to a 3,600-watt solar array.

Not only does the parking lot harvest energy, it also incorporates overhead utilities and repositions underground utilities for more efficient use. Power and data cables line a cable corridor alongside the parking lot, which provides easy access to power and data companies. This eliminates the need for overhead power lines and amazingly removes cell phone dead spots. The cable corridor is able to house all kinds of cables, including TV, fiber optics for high speed internet, and phone. Another function of this incredibly smart parking lot is to store, treat and redistribute storm water.

LED light bulb that doubles as a Bluetooth speaker


 


Photo: AwoX website

AwoX, a provider of multimedia interconnection technologies for the home, released an LED light bulb that doubles as a Bluetooth speaker.

The StriimLIGHT twists into any light socket to provide light and sound that you can control using the AwoX smart control app on your smartphone, laptop or desktop computer.

The app allows the user to brighten or dim the light or change its colour, add music, apply an alarm timer to its functions and control the speaker’s volume.

AwoX CEO Alain Molinie said: “Best of all, StriimLight’s unobtrusive design makes it easy to add music to rooms where space is scarce, such as bathrooms and kitchens.”

The light bulb accepts 110 to 240 volts, delivers LED illumination, and pumps out sound via a 10-watt speaker.

There is also a Wi-Fi version, which operates in the same way as the Bluetooth version, while there’s a mini version for smaller sockets.

Each light bulb costs $99 and is currently available in the US.

Night-Vision Contact Lenses That Use Infrared Technology May Soon Be Possible, Researchers Say


Those night-vision devices used by hunters and soldiers may soon get a lot smaller — small enough, in fact, to be built right into contact lenses.

That’s the word from University of Michigan researchers, who say they’ve created the first-ever full-spectrum infrared light detector that works at room temperature. Conventional night-vision devices require bulky built-in cooling units to work properly.

Night-vision technology makes it possible to see light that is imperceptible to our eyes, and heat that radiates from the bodies of people and animals in the dark.

“We can make the entire design super-thin,” Dr. Zhaohui Zhong, assistant professor of electrical and computer engineering at the university, said in a written statement. “It can be stacked on a contact lens or integrated with a cell phone.”

The key to the new technology is a lightweight and super-strong form of carbon known as graphene. Ordinarily, graphene absorbs only about 2.3 percent of light that hits it — not enough to generate a usable infrared signal. But by combining two layers of graphene with an insulator, the researchers were able to boost the signal dramatically. Sensors made of sandwiched graphene can detect the full infrared spectrum, in addition to visible and ultraviolet light.

Zhong and his team have yet to integrate their technology into contact lenses, but he says the technological pathway to such devices is clear.

“If we integrate it with a contact lens or other wearable electronics, it expands your vision,” Zhong said. “It provides you another way of interacting with your environment.”

And wearable night-vision contacts are just one possible application of the new technology. Infrared devices are also used to identify gas leaks, help doctors find blood vessels and even allow art historians to see sketches under layers of paint.

“Our work pioneered a new way to detect light,” Zhong said in a statement. “We envision that people will be able to adopt this same mechanism in other material and device platforms.”

A paper describing the research was published online March 16 in the journal Nature Nanotechnology.

UC Davis taps Ginger.io for psychotic illness study by passively using smartphones


 

By: Aditi Pai | Mar 10, 2014 in mobihealthnews

University of California Davis’ Early Diagnosis and Preventive Treatment (EDAPT) Clinic is launching a study in partnership with mobile health company Ginger.io. The 12-month study will follow 120 young people who are in the early stages of psychotic illness. Only youth who are enrolled in the UC Davis or City of Sacramento EDAPT clinics are eligible to participate.

UC Davis received $588,000 from the Robert Wood Johnson Foundation to fund this study.

All of the children in the study will receive the Ginger.io app that passively collects data from users’ phones including information on their movement throughout the day, call patterns, and texting patterns. Users can also actively record information about how they are feeling each day. If the app senses that something is off with the user, it will automatically notify the user’s family and physician.
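The core of such a passive-sensing alert is a comparison of recent behavior against the user’s own baseline. The sketch below shows one simple form that check might take; the 50% drop threshold and the idea of counting daily movement events are illustrative assumptions, not Ginger.io’s proprietary models.

```python
# Hypothetical sketch of a passive-data check: flag a large drop in a
# user's daily movement relative to their own recent baseline.
from statistics import mean

def should_notify(baseline_daily_moves, recent_daily_moves, drop_ratio=0.5):
    """Return True when recent average movement falls below a fraction
    of the user's baseline average (an assumed alerting rule)."""
    baseline = mean(baseline_daily_moves)
    recent = mean(recent_daily_moves)
    return baseline > 0 and recent < drop_ratio * baseline

# A user who usually logs ~40 movement events a day but drops to ~10:
print(should_notify([38, 42, 40, 41], [12, 9, 10]))  # True
```

Comparing each user to their own history, rather than to a population norm, is what lets a system like this notice that something is off for that particular person.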

This system makes it easier for physicians to gain insights into how a patient is doing on a day-to-day basis and helps users recall information from the week during sessions.

“We are trying to identify the early warning signals that someone is struggling, so we can intervene earlier and hopefully prevent relapse,” Tara Niendam, assistant professor in the Department of Psychiatry and Behavioral Sciences and director of operations for the EDAPT Clinic said in a statement. “If an individual is having a bad week, we can reach out to them quickly, rather than waiting for them to call us or come in to the clinic for their next appointment.”

Niendam added that there were many warning signals that the study could identify. One metric might be whether forgetting a dose of medication could trigger an increase in symptoms or how the number of arguments a user has with family members affects his or her mood.

If study participants do not have smartphones, UC Davis will provide phones via a partnership with T-Mobile USA.

Ginger.io has partnered with multiple other organizations in the past, including Sanofi US Diabetes, C3N’s IBD Pilot, and UCSF’s Health eHeart initiative.

Ginger.io’s spokesperson Stephanie Wilson told MobiHealthNews in an email that the app is “currently part of the care solutions at Kaiser Permanente and Novant Health and contributes to core research at UCSF, Cincinnati Children’s, and MIT Medical. Ginger.io also recently launched Mood Matters, making the technology directly accessible to people living with depression.”

Blocks to build products, not just for little ones. Like Legos? You’ll love littleBits

By Sarah Ang in Mashable

Playtime is fun. Electronics and circuit boards? Not so much — until now.

littleBits is an open-source library of color-coded electronic modules — minuscule circuit boards with specific functions, such as light, sound, sensors, buttons, thresholds, motors and more — that snap together via tiny magnets in order to make larger circuits. There is no programming, wiring or soldering.
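A littleBits circuit is a linear chain: power, then input modules, then output modules. As a rough software analogy (not how the hardware works internally), each module can be modeled as a function on a signal, and “snapping” modules together is just function composition.

```python
# Modeling littleBits modules as composable functions on a signal value.
# This is an analogy for the snap-together chain, not a circuit simulator.

def power(_):            # power module: emits a full-strength signal
    return 1.0

def dimmer(signal):      # input module: scales the signal down
    return signal * 0.5

def led(signal):         # output module: reports its brightness
    return f"LED at {signal:.0%}"

def snap(*modules):
    """Compose modules left-to-right, like snapping Bits in a row."""
    def circuit(signal=None):
        for module in modules:
            signal = module(signal)
        return signal
    return circuit

print(snap(power, dimmer, led)())  # LED at 50%
```

Reordering the chain changes the behavior, which mirrors how kids experiment with the physical Bits: the order of power, input, and output modules determines what the circuit does.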

As one would imagine, these simple, intuitive blocks promote limitless experimentation, prototyping and learning and have been dubbed the “Legos for the iPad generation.” Projects range from sound-activated bowties to self-scrolling sheet music.

“It’s more this idea of democratizing electronics and putting the power of electronics in everyone’s hands,” says founder and CEO Ayah Bdeir.

The company upholds a firm commitment to open innovation and wants to help create the next generation of inventors by making electronics a fully accessible material for both children and adults — artists, designers and explorers alike. Everything is open sourced.

And as Bdeir explained in her 2012 TED Talk, electronics should be just another transformable building material, not unlike paper or wood. Now, the toy industry has embraced littleBits as an educational toy.

Part of what makes littleBits so intuitive for everyone is that it breaks down complex and abstract concepts — electricity, for one — into tangible, color-coded blocks.

“There’s a brick that’s light, and a brick that’s sound, and a brick that’s a sensor, and you can build it within seconds without having any background in engineering whatsoever,” she says.

“It’s really important that we are able to understand how electricity works — how a light comes on, how our systems communicate with each other, and how all these new sensors that are all around us are gathering data about us.”

An alumna of the MIT Media Lab and a co-founder of the Open Hardware Summit, Bdeir was creating interactive art as a post-grad when she found herself more interested in the tools she used than in the outcome of her work.

Four years ago, littleBits was but a prototype Bdeir posted on her personal website. It was picked up by a friend, received some media attention, and soon she was receiving hundreds of pre-orders for a nonexistent product.

The company’s first seed round in September 2011 received $850K, led by investors including tech and open-source hardware pioneers Joi Ito and Nicholas Negroponte. The first kit was created and sold, and littleBits was officially up and running. Since then, littleBits has received more than $14 million in funding.

The New York-based startup has won 22 awards thus far and has grown to almost 30 employees. It currently offers 37 Bits modules, which are sold in nine different kits. To date, littleBits kits have been sold in 80 countries.

Watch the video above to see how Bdeir grew littleBits from an idea to one of the most ingenious companies in the field.

What do you think of littleBits? Tell us in the comments.

Image: Mashable

Leap Motion Controller Leaps Forward With Software, Sharpens Focus With Apps

11/23/2013 Forbes by Anthony Wing Kosner.

Image: Leap Motion’s second-generation hand tracking

Leap Motion is a revolutionary company, but revolutions take time. In Leap’s case, their motion control device introduces a whole new way for people to interact with computers. As with anything truly new, our first instinct is to map it to what we already know. For computer interaction, what we know is the mouse, which maps two dimensions of a surface to the two dimensions of a screen. And, since the advent of mobile, we know touch, where a gesture on the surface of a screen maps directly to the elements on that screen. The Wikipedia page for Leap Motion supports this common preconception, explaining that the device “supports hand and finger motions as input, analogous to a mouse, but requiring no hand contact or touching.”

After working on the problem for a couple of years, Leap founders Michael Buckwald and David Holz secured Series A funding in May of last year and announced pre-sale of the $79.95 USB drive-sized device. The reception from developers was astounding, and The MIT Technology Review called it “The Most Important New Technology Since the Smart Phone.” Unlike Apple, Leap Motion did not try to control every aspect of the user’s experience, instead sending tens of thousands of kits to developers to allow them to create their own apps in its ecosystem. I wrote, later that summer, that the company was “Putting Its Future Into The Hands Of Developers.”

The actual product was released this past July, six months later than expected, to decidedly mixed reviews. Leap succeeded, almost too well, at creating excitement and awareness around their product. Their design and marketing was all first-rate, but the first crop of apps on its Airspace app store were a mixed bag. My own first hands-on experience was a frustrating disappointment. My kids were initially excited, but soon lost interest.

The problem was twofold. First, the hand tracking software was incredibly accurate, but suffered from some glitchy and erratic behaviors. Second, many of the apps did not make it clear exactly what type of behavior they were expecting from the user. A platform like Leap Motion is only as good, to a new user, as the worst app they try out first. Combine these issues with an unfamiliar interaction paradigm and you have a formula for frustration.

But before you write the future of computer interaction off as a gimmick, count to ten. Leap Motion is part of a new hardware product paradigm as well. Unlike the iPhone which seems designed for hardware obsolescence, these new purpose-built devices can increase their functionality by orders of magnitude with changes to software alone. And in the four months since the Leap Motion Controller was released, this is exactly what the company’s engineers have been busy doing.

The Leap’s second-generation hand tracking software that I previewed at the company’s San Francisco offices last week (see screen shot above) with founder Buckwald and marketing head Michael Zagorsek addresses the glitches in the first version while capturing more information in a more efficient way. Most importantly, it solves the occlusion problem that occurred when the device’s cameras “lost sight” of the fingertips they were tracking, either from a hand rotating perpendicular to the line of sight or being blocked by the other hand. This had the unexpected effect of causing the representation of the hands to disappear from the screen in the middle of an action.

The new software tracks not only the fingertips and palms of the hands, but each joint as well, making for a much more accurate representation of hand position and motion. And the software has also grown up (according to the “object permanence” stage of Piaget’s childhood development model) so that it remembers where a hand is even when temporarily out of view. This will definitely eliminate some of the potentially confusing feedback users can receive that breaks the illusion of continuity in some of the apps.
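The “object permanence” behavior described above can be sketched very simply: when the sensor briefly loses a hand, keep reporting its last known pose for a short grace period rather than dropping it. The frame budget and data shape below are assumptions for illustration, not Leap Motion’s implementation.

```python
# Minimal sketch of occlusion persistence: hold the last known hand
# position through short gaps so the on-screen hand doesn't flicker away.

def track_with_persistence(frames, max_gap=3):
    """frames: per-frame hand positions, or None when the hand is occluded.
    Returns what the app should render each frame."""
    rendered, last_seen, gap = [], None, 0
    for pos in frames:
        if pos is not None:
            last_seen, gap = pos, 0
            rendered.append(pos)
        elif last_seen is not None and gap < max_gap:
            gap += 1
            rendered.append(last_seen)  # hold last pose through a brief occlusion
        else:
            rendered.append(None)       # gap too long: the hand really is gone
    return rendered

# A two-frame occlusion no longer makes the rendered hand vanish mid-action:
print(track_with_persistence([(0, 0), None, None, (2, 1)]))
```

A real tracker would also predict motion during the gap rather than freezing the pose, but even this crude hold removes the jarring mid-action disappearance the article describes.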

Perhaps even more significant in terms of growing adoption and engagement with the device is the emerging sense of what the Leap Motion Controller is actually good for. As I wrote about in my description of Elliptic’s ultrasound gesture recognition technology, Leap is really overkill for reading simple swipe gestures to turn a page or scroll. Leap is now defining itself as a “motion capture” technology in contrast to mere gesture recognition. This distinction is important for understanding what Leap does best.

Gestures, whether through touch or in the air, are generalized movements. Moving your hand or finger from left to right within certain statistical boundaries can be interpreted as a swipe. But the capture of precisely where the different parts of your hand are during the course of that gesture is a whole other matter at which Leap Motion uniquely excels.
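To make the distinction concrete: a swipe detector only needs coarse statistics of the path, whereas motion capture preserves the whole trajectory. The thresholds below are illustrative assumptions for a toy swipe classifier.

```python
# Toy swipe recognizer: a gesture is a generalized movement, so only the
# net horizontal travel and vertical drift of the path matter here.

def classify_swipe(path, min_travel=100, max_drift=30):
    """path: sequence of (x, y) fingertip samples. Returns 'left',
    'right', or None if the path doesn't look like a horizontal swipe."""
    dx = path[-1][0] - path[0][0]
    dy = abs(path[-1][1] - path[0][1])
    if abs(dx) >= min_travel and dy <= max_drift:
        return "right" if dx > 0 else "left"
    return None

samples = [(0, 5), (60, 8), (140, 2)]   # mostly horizontal, left to right
print(classify_swipe(samples))          # right
```

Note how much information the classifier throws away: every intermediate sample is ignored. Motion capture, by contrast, is precisely about keeping all of those samples, which is what apps like Freeform depend on.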

Image: sculpting in Freeform

To bring this point home, Leap has just released a new app which it developed internally called Freeform. Freeform allows you to sculpt highly detailed figures within the three-dimensional space of the app directly with your hands and then export these as 3D models that can be printed on any 3D printer. As you can see in the screen shot above, you can choose the material and tools to work with and even rotate the model for lathe or pottery wheel effects.

This is only the second app produced by Leap Motion that has been released into Airspace. The first is a general computer control app called Touchless (available in different versions for Mac and Windows). The difference between the two apps is striking and marks a major refocusing of the company’s approach to what a Leap Motion app should be. Instead of emphasizing Leap as a three-dimensional mouse, Freeform highlights the precise control you can achieve with it as an input method. This, more than mouse replacement, is the technology’s unique strength. Leap Motion is really for capturing fine motor movements, as opposed to the gross motor movements captured by devices like Kinect and Elliptic.

There is something more, as well. For Leap Motion to regain the enthusiasm of its initial reception from the chasm of disappointment that followed its actual release, it must demonstrate concrete use cases that satisfy the mass audience. Freeform could help do that.

Imagine for a moment the share of the population that has the kind of spatial intelligence required to work physically in three dimensions. Now, imagine the sliver of that population that also has the abstract intelligence required to work with 3D programs on a computer. And even for people in that sliver, consider that it takes 3-6 months to get up to speed with such programs. This series of constraints is a limitation, in turn, on the adoption of 3D printing. By creating a program that people without highly developed abstract intelligence can get comfortable using in a matter of minutes or hours instead of months (if ever), Freeform could radically alter the landscape of 3D creativity.

And 3D modeling and printing is just one example. In the months ahead, look for Leap’s developer community to refocus their efforts as well around opening these kinds of constraints from all kinds of dimensional activities. With this in mind, it is very exciting that SOS Ventures and Founders Fund have announced the formation of a new accelerator program specifically designed to jumpstart the next generation of apps for Leap Motion’s technology. The LEAP.AXLR8R will pick startups with disruptive and achievable ideas, provide seed funding and office space adjacent to Leap Motion’s offices in San Francisco, as well as access to mentors and other resources to support an intensive three-month development cycle beginning in late January 2014.

 

Recon Instruments Launches Jet Heads-Up Display

Recon Instruments Launches Jet Heads-Up Display
New sport eyewear computer puts ride metrics and more in front of your eyes
By Greg Kaplan in Bicycling magazine
 

Vancouver, Canada-based Recon Instruments has a new product that puts ride metrics right in front of your eyes. The company’s Jet combines the computing power of a mobile phone with an easy-to-read, unobtrusive display, powered by a battery that can last for hours, and it all fits on a pair of sunglasses. The new product was unveiled today, and the first units will arrive in December.

Recon Instruments’ co-founder Dan Eisenhardt was a competitive swimmer at the national level in Denmark when he started searching for a device that could give him instant and accurate updates on his performance metrics in the water, but found none. Later, while in graduate school at the University of British Columbia, Eisenhardt and classmate/Recon co-founder Hamed Abdollahi developed a solution that could eventually quench his thirst for data. The pair came up with an idea for a heads-up display for ski goggles, and with the encouragement of a professor, took the idea and ran with it.

The first designs for ski goggles incorporated the processors from a Texas Instruments graphing calculator. Metrics such as speed and airtime were projected onto a small display at the bottom right side of the goggles’ lens, where the information and the actual display hardware were the least obtrusive. With funding from angel investors and government grants, the first version of the goggles, called Snow, went into development in 2008 and finally launched in 2010. Snow sports were a good start, but the company’s reach was limited by the seasonality of skiing. Targeting cyclists, many of whom ride year-round, was a natural progression, and many riders want the same kind of performance data that Recon already provides to skiers, including the ability to connect in real time with friends via mobile devices.

The Jet provides its own GPS-derived data such as speed, altitude, and gradient, and also connects to other networkable devices—power meter, heart rate monitor, speed/cadence sensor, or a smartphone—and displays data from those devices. If connected to a smartphone, Jet can even display more mundane data like text messages and incoming calls. To motivate you further, the glasses are even capable of displaying a ghost rider to pace your workouts, or a Strava KOM “pace indicator” to show your progress on a specific segment.
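A ghost-rider pace indicator boils down to comparing your distance so far with a reference ride’s distance at the same elapsed time. The sketch below uses linear interpolation between logged points; that, and the data shape, are assumptions for illustration, since Recon hasn’t published its method.

```python
# Sketch of a ghost-rider gap calculation: positive means you're ahead
# of the reference effort, negative means you're behind it.

def ghost_gap(reference, elapsed_s, my_distance_m):
    """reference: list of (time_s, distance_m) samples from a prior effort,
    sorted by time. Returns metres ahead (+) or behind (-) at elapsed_s."""
    for (t0, d0), (t1, d1) in zip(reference, reference[1:]):
        if t0 <= elapsed_s <= t1:
            frac = (elapsed_s - t0) / (t1 - t0)
            ghost_distance = d0 + frac * (d1 - d0)   # interpolate the ghost
            return my_distance_m - ghost_distance
    return None  # elapsed time falls outside the reference ride

ref = [(0, 0), (60, 500), (120, 1000)]   # a steady 500 m/min reference ride
print(ghost_gap(ref, 90, 800))           # 50.0 -> 50 m ahead of the ghost
```

The same comparison against a segment’s KOM effort, rather than your own past ride, would give the Strava pace indicator the article mentions.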

The Recon Jet system incorporates three components: the optics themselves, the Recon display and processor, and a swappable power supply. Recon-ready eyewear comes in a range of options, including lens shade, polarization, and size.

The Recon unit itself fits snugly against the lower right side of a pair of sunglasses, on the outside of the lens. A rubber gasket holds the display against the lens and keeps out moisture, preventing condensation. The company says that most users will be able to read the display, regardless of what corrective eyewear they use. You control the display by swiping and tapping a sensor on the unit, just as you would a smartphone. Recon’s battery is on the left side of the glasses, to balance the unit.


The Recon Jet sits on a normal-sized pair of sunglasses.

 

 


The Recon Jet display is based on technologies developed for Recon’s snow sport goggles, but will display metrics such as speed, power, and time.

The Recon Jet system, the company says, adds about one ounce to a pair of glasses. In your hands, it feels heavy and unbalanced compared to a pair of sunglasses. But after five minutes of wearing the system, you stop noticing the added weight.

The brain that drives Jet is a 1GHz dual-core ARM processor, with 1GB memory and 8GB storage—similar to the technology powering some smartphones. The Jet display is 428x240px and incorporates a prism and magnifying lens to make the display appear to be a 30” LCD viewed from seven feet away. The 16:9 display is clear and easy to read. Every Jet unit has a 9-axis sensor array: an accelerometer, a gyroscope, and a magnetometer. The company says the battery will last as long as nine hours, depending on how you use the system, the strength of your GPS signal, and environmental factors including temperature. You can connect to other devices via Wi-Fi, Bluetooth Low Energy, and ANT+. There’s a micro-USB connector for data transfer and charging.
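The claim that a 428x240 panel appears to be a 30-inch LCD seen from seven feet can be sanity-checked with basic trigonometry. This is just arithmetic on the article’s numbers, not data from Recon’s optical design.

```python
# Angular size of a 30" diagonal viewed from 7 feet, and the implied
# angular resolution of Jet's 428x240 display across that diagonal.
import math

diagonal_in = 30.0
distance_in = 7 * 12                      # seven feet, in inches
half_angle = math.atan((diagonal_in / 2) / distance_in)
apparent_deg = math.degrees(2 * half_angle)

pixels_diag = math.hypot(428, 240)        # ~491 px along the diagonal
arcmin_per_px = apparent_deg * 60 / pixels_diag

print(f"apparent size: {apparent_deg:.1f} degrees")   # ~20.3 degrees
print(f"resolution: {arcmin_per_px:.1f} arcmin/px")   # ~2.5 arcmin/px
```

Roughly 2.5 arcminutes per pixel is coarse next to a modern phone held at arm’s length, but it is ample for glanceable ride metrics, which is the point of the design.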

When wearing the Jet, the display is only activated when you look at it; an infrared eye-sensor toggles the display to extend battery life. Glance down, and the display is instantly enabled and viewable; when you return your focus to the road ahead, the display turns off. To the user, it appears seamless. Speed, power, grade, and other metrics are displayed simultaneously in a segmented view, similar to the interface used by Garmin, and you can set which metrics are displayed at any time.

You can also display route information thanks to a navigation interface—especially useful for finding your way home. There’s also a feature that lets you locate friends—ideal for keeping your group together on a Gran Fondo.

Linking a GoPro or Contour camera can provide streaming video to anyone following a Jet wearer; and you can even give yourself a rear view by mounting a rearward-facing camera and connecting it to your display.

You can pre-order a Jet now for $500, and units will ship in December. After July 21, the price will jump to $600. The glasses have a one-year warranty, but it does not cover crash replacement.

The Recon Jet shows a lot of promise, but we hope the glasses become sleeker. As to which cyclists will be best served by this new product, one obvious application is riders who enter a lot of time trials—the ability to watch their power output without breaking from their aero position will be a boon to performance.