SpaceX and Tesla Motors founder Elon Musk has often been compared to a real-life version of Tony Stark, aka Iron Man. Director Jon Favreau has even openly said that Musk inspired his depiction of Stark in the first Iron Man film. But now it seems as though the imitation has come full circle: Musk tweeted last night that he’d “figured out how to design rocket parts just w[ith] hand movements,” and would post a video of the process “next week.” Favreau tweeted at Musk asking: “Like in Iron Man?” And Musk responded in the affirmative.
“I think that people and electronic music could be more free. Normally, electronic musicians aren’t even a little bit free on live stage because they are face to face with the laptop all the time. Just move.” —Electro beat boxer Ryo Fujimoto.
“eyeSight is a leading provider of gesture recognition technologies, powering mass market, embedded touch-free solutions that create new and exciting user experiences.
With eyeSight’s technology, users enjoy a natural user interface, allowing them to easily and intuitively control a variety of devices using simple hand gestures. Devices such as mobile phones, tablets, PCs, TVs, set-top-boxes, in-car infotainment systems, and more can now be easily controlled using natural hand gestures.”
Forget Kinect! WiSee Wants To Bring Gesture Recognition to Your Whole Home
The WiSee Whole-Home Gesture System lets you use gesture recognition wherever you are in the home - on the couch, in the kitchen, and so on.
The system uses the wireless signals already present in your home, from devices like your router, smartphone and laptop. These signals reflect off the human body and back to the devices, and the team’s unique algorithm makes sense of those reflections to translate them into gestures.
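To give a flavour of the idea (this is my own toy sketch, not WiSee’s actual algorithm - the function name and thresholds are made up for illustration): a body moving toward a receiver raises the frequency of the reflected signal (a positive Doppler shift), and moving away lowers it. Stringing those shifts together gives you a coarse gesture.

```python
# Toy sketch of Doppler-based gesture classification (illustrative only,
# not WiSee's real code). A body moving toward the receiver produces a
# positive Doppler shift in the reflected Wi-Fi signal; moving away
# produces a negative shift.

def classify_gesture(doppler_shifts_hz):
    """Map a sequence of Doppler shift measurements to a coarse gesture."""
    # Reduce raw measurements to a sequence of movement directions,
    # collapsing consecutive repeats (e.g. [+, +, -, -] -> [+, -]).
    pattern = []
    for shift in doppler_shifts_hz:
        if abs(shift) < 2:        # ignore shifts below a noise floor
            continue
        direction = "toward" if shift > 0 else "away"
        if not pattern or pattern[-1] != direction:
            pattern.append(direction)
    if pattern == ["toward", "away"]:
        return "push"             # hand moves in, then back out
    if pattern == ["away", "toward"]:
        return "pull"
    return "unknown"

print(classify_gesture([5.0, 8.0, -6.0, -4.0]))  # push
```

The real system obviously has to extract these shifts from messy wireless reflections first - that’s where the clever signal processing lives.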
This system reminds me of the Myo, but only in the sense that you can use gestures anywhere you are to control connected things. What I love is that the founders are taking it one step further by removing the wearable device completely.
Removing wearable devices from the gesture picture is definitely the way of the future.
Wi-Fi Signals Enable Gesture Recognition Throughout Entire Home
University of Washington computer scientists have developed gesture-recognition technology that brings this a step closer to reality. Researchers have shown it’s possible to leverage Wi-Fi signals around us to detect specific movements without needing sensors on the human body or cameras.
Virtual Bridge Lets People From Amsterdam Play with Venice
Created by DropStuff, a virtual “bridge” connecting Amsterdam and Venice will be active from May 27 to June 8.
According to DropStuff: “The bridge is formed by a live connection between two large public screens that will be connected live through internet, and by using a special device; the Kinect. Between May 27 and June 10, at the two sites, visitors can see and meet each other, and play a game together on the spot.”
The video above is a demonstration of the virtual bridge in action on the Venice side.
HTML Toronto is hosting an event tonight in Toronto, Canada, which they will be live streaming for those around the globe. The live stream is a presentation by TVO’s IdeaShaker Innovation Lab on their use of the new gesture-control device, Leap Motion.
TVO will be launching “Caterpillar Count for Leap Motion” when Airspace, Leap Motion’s app store, launches at the end of July. IdeaShaker, TVO’s Innovation Lab, took an existing Flash-based children’s game, modified it for HTML5, added gesture capabilities, and spent a few days testing the game with children.
Please note that they will be streaming the event online for those who are not able to be there. The stream will start at 6:30PM EDT, and presentations begin at 7:00PM EDT.
With Leap Motion technology and Windows, you can do everything that’s possible with multi-touch inputs — without actually touching anything.
Looking forward to getting my hands above(?) this. It certainly looks awesome. The Leap Motion Controller costs $79.99, and the company will start shipping on July 22nd. A Mac OS X demo video is on its way as well.
Business Insider just posted a fantastic article on the “15 Ways Tech is Reinventing Society” covering most, if not all, of the hot future tech topics making headlines today (see a snapshot of their list below).
As you know, my blog, Future Tech Report, is dedicated to exploring how emerging and disruptive technology is impacting our daily lives, so this article is very dear to my heart (great job, BI!).
The BI list of 15 includes:
Wearable Tech - Google Glass
Wearable Tech - Health & Fitness
The Sharing Economy
Smartphones & Connected Devices
Outside use of touchscreens and Wi-Fi
If you want a quick snapshot of the future tech that is just about to change your life, head on over to Business Insider.
I can’t say enough how grateful I am to have been born in this time. We truly are on the brink of a connected revolution, where we will start to see changes in society we haven’t seen since the last great industrial revolution (which was before my time).
What tech are you most excited to see impact the new world, Future Geeks?
Behind the Scenes at Thalmic Labs - Creator of the Wearable Gesture Control Device “Myo”
The Myo is one of the most anticipated gesture control wearable devices expected to be released to early adopters later this year (my order is already in!).
Based in my hometown of Waterloo, Canada, Thalmic Labs opened its doors for this video, which gives a great behind-the-scenes look at the team and the office, shows some great shots of Myo in action, and provides information on the product and their development process that hasn’t been released before.
Myo uses the electrical activity from your muscles as you move your hand to detect what you are doing with your fingers as well as the motion of your hand. These gestures control connected devices via Bluetooth.
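For the curious, here’s a toy illustration of how muscle electrical activity (EMG) can be turned into a pose - this is my own simplified sketch, not Thalmic’s algorithm, and the function names and threshold are invented for the example. One classic feature is root-mean-square amplitude per sensor channel: a clenched fist recruits more muscle and produces a bigger signal than a relaxed hand.

```python
# Illustrative sketch only (not Thalmic Labs' code): classify a window of
# multi-channel EMG samples as "fist" or "rest" using RMS amplitude,
# since stronger muscle activation yields a larger electrical signal.

def rms(samples):
    """Root-mean-square amplitude of one channel's samples."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def detect_pose(channels, threshold=0.3):
    """Average RMS across sensor channels and compare to a threshold."""
    avg_activation = sum(rms(ch) for ch in channels) / len(channels)
    return "fist" if avg_activation > threshold else "rest"

relaxed = [[0.01, -0.02, 0.01], [0.02, 0.0, -0.01]]   # low-amplitude noise
clenched = [[0.5, -0.6, 0.4], [0.7, -0.5, 0.6]]       # strong activation
print(detect_pose(relaxed), detect_pose(clenched))  # rest fist
```

The real device has to distinguish many subtle finger gestures, not just fist-vs-rest, which is a much harder pattern-recognition problem.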
The Myo’s stretchable cuff has been designed to be one-size-fits-all (they even made sure that arm hair doesn’t get in the way).
The team has confirmed that their developer program will launch in the next few months, giving out exclusive access to early versions of the software and devices.
Thalmic Labs believes that the Myo device could revolutionize the way we interact with technology - and I agree.
Look Ma - No Hardware! Software Turns Dumb Paper into a Smart Touchscreen
Fujitsu has created the spatially aware FingerLink Interaction System, which creates an interactive, touchscreen-like experience using objects in the real world - not just flat surfaces like paper and tables, but also curved surfaces like books.
In this way, the system takes “dumb” items and makes them “smart.”
The system doesn’t use any special hardware; it reuses an ordinary webcam and a commercial projector, relying on image processing technology to work its magic.
The video demo shows how you can capture information from the paper by selecting areas on the page to import as data.
The system is designed to respond only to specific gestures and won’t react when you make ordinary motions on the table. It measures finger height accurately in order to recognize a real touch, and it can also be operated by gesture controls.
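The finger-height trick is easy to picture in code. Here’s a hedged sketch of the touch-vs-hover decision described above - the names and thresholds are my own illustration, not Fujitsu’s actual system:

```python
# Illustrative sketch (not Fujitsu's code): the camera estimates how high
# the fingertip is above the surface; only near-zero heights register as
# a touch, so ordinary hand motions above the table are ignored.

def classify_finger(height_mm, touch_threshold_mm=5, hover_threshold_mm=30):
    if height_mm <= touch_threshold_mm:
        return "touch"        # finger is on (or nearly on) the surface
    if height_mm <= hover_threshold_mm:
        return "hover"        # close enough to track, but not a touch
    return "ignore"           # ordinary motion above the table

print(classify_finger(2), classify_finger(15), classify_finger(100))
# touch hover ignore
```

Estimating that height reliably from a single webcam image is, of course, the genuinely hard part.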
Gesture and spatial input for PC and other connected devices have received a lot of press lately with the likes of Thalmic Labs, Leap Motion and the updates from Microsoft for Kinect - but this is the first that I have seen of a company that is attempting to use this type of technology to merge the real world with the digital world outside of a typical screen.
Watches and glasses aren’t the only wearable items we may be using in the near future. Our clothing itself could be connected with the ability to play games, turn on our TV, track our health and more.
Woven is the first complete e-wearable pervasive game platform prototype.
Woven is a graduate project of two Master students at the School of the Arts Utrecht (HKU) in the Netherlands: Patrick Kersten, an Interaction Designer, and Christiaan Ribbens, a Game Designer.
The project, which ends in August, will result in an e-wearable platform with one pervasive game (a game that involves the physical world, not just the digital) and other smaller prototypes (games and sports apps).
Spooky, the pervasive game in development, lets the player experience the hidden dark and funny world of ghosts, spirits and phantoms.
Microsoft Kinect Brings Gesture Control Merchandising to Retail Stores
Kinect was recently updated with gesture controls. This video does a great job of demoing the potential of this technology from an advertising perspective.
Imagine a large screen in the mall or in a retail location. When no one is in front of the screen, images from the retailer’s catalogue scroll automatically.
But when someone gets in front of the screen, Kinect recognizes the individual and gives them control of the experience.
This type of interactivity is meant to create an enhanced in-store experience, and could allow retailers to keep smaller real estate footprints while still merchandising the breadth of their catalogue.
As you all know, I am giddy with excitement over attending the conference of the future this weekend - Engadget Expand. One of the speakers is from my home turf here in Canada - Ariel Garten, the CEO of InteraXon.
InteraXon provides hardware and software interfaces that let you control computing with your mind. Yep - you read that right, your mind! All this talk about voice and gesture control, and here is a company skipping over all of that to link computing directly to our brains.
Their technology works by converting brainwaves into digital signals that are used as inputs to a computer - wearable devices, smartphones, smart TVs, smart cars, or basically any of the connected items we expect through the Internet of Things.
InteraXon believes that “Brainwave-controlled interfaces (BCI) are the next steps in the big evolution in technology”
InteraXon’s newest consumer product is Muse, a brainwave-sensing headband with four built-in EEG sensors that detect the levels, combinations and proportions of the five key types of brainwaves.
Here is how it works according to the website:
Just upload one of our custom applications to your smart phone or tablet.
Pop on your Muse. It should rest comfortably on the upper forehead, and sit behind the ears like a pair of eyeglasses. Clear any big tufts of hair out of the way, so that the earpiece of your Muse headband can make good contact with your skin.
Pop in a pair of your favorite ear buds and start the app. The Bluetooth should connect you once the app is running.
Customize your app—choose the length of time you want to spend, the soundtrack and, when applicable, the type of environment.
Wearable devices, and especially new input controls for computing (including controlling our smartphones and tablets), are making massive headlines lately, especially with the reveal of Google Glass (voice), Leap Motion and Thalmic Labs’ Myo (gesture control). Time will tell which type of input will rule the computing world of the future. My guess is that we may see a combination of voice, touch, gesture and brainwaves moving forward, depending on the interface and use case. One thing is for sure: typing and even touch are looking pretty dated in the face of this sci-fi interaction.