What I’m Doing
I initialize my Node.js server, plug my Leap Motion into the computer’s USB port and hold my hand in the air. I make a gesture that looks like left-clicking a mouse, though I’m not actually touching anything. Suddenly, with a whoosh, the rotors on my drone buzz to life. This particular model, the AR Drone 2.0, is a quadcopter: four rotors arranged in a square, which gives it enhanced stability. The drone lifts into the air and hovers there, waiting for me to issue a command. I move my hand forward and the drone edges away from me.
The further away from center I move my hand, the faster the drone moves, until I realize, almost too late, that it is quickly approaching a tree. I drive my hand down and to the right, and the drone dodges under the looming branches, narrowly escaping disaster. I point with my finger and trace a counter-clockwise circle in the air, and the drone rotates to face me. When I move my hand forward again, the drone accelerates towards me. Glancing back at my computer, I can see myself getting closer in the drone’s video feed, which is streaming in my browser. This is only the beginning of my drone journey.
The implication for robotics is that multiple commands can either be processed simultaneously or chained to occur in a particular sequence. Node.js is asynchronous by default; in languages like Ruby and Python, code traditionally runs synchronously, so commands block one another, which can end in disaster if a single command gets stuck and takes a long time to process.
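The chaining pattern can be sketched without any drone hardware at all, using plain promises. The `CommandChain` helper and the command names below are hypothetical, not part of any real drone library; the point is only that queuing a command never blocks the caller, yet the commands still execute strictly in order.

```javascript
// Minimal sketch of non-blocking command chaining, using plain
// promises rather than a real drone library. CommandChain and the
// command names are hypothetical.
class CommandChain {
  constructor() {
    this.log = [];                  // records commands as they complete
    this.queue = Promise.resolve(); // tail of the chain
  }
  // Queue a named command that "runs" for ms milliseconds. Queuing
  // returns immediately; the command itself starts only after every
  // previously queued command has finished.
  run(name, ms) {
    this.queue = this.queue.then(() => new Promise(resolve => {
      setTimeout(() => {
        this.log.push(name);
        resolve();
      }, ms);
    }));
    return this; // allows chain.run(...).run(...)
  }
}

const chain = new CommandChain();
chain.run('takeoff', 10).run('forward', 10).run('land', 10);
// Nothing has blocked: all three commands were queued instantly,
// but they still complete strictly in sequence.
chain.queue.then(() => console.log(chain.log.join(' -> ')));
// prints "takeoff -> forward -> land"
```

Because each command is attached to the tail of the previous one, a slow command delays only its successors in the chain; the event loop stays free to handle new Leap Motion input in the meantime.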
Why I Used Leap Motion and the Future of Controllers
At this point, Leap Motion’s software is in need of an upgrade and isn’t very effective at detecting finger movements if, for instance, you turn your hand to the side. As a result, I used hand movements for everything except takeoff/landing (done by gesturing with my pointer finger as though left-clicking a mouse) and rotation (done by making a circle in the air with my pointer finger). I have heard that Leap Motion is upgrading its firmware in the next few weeks, and I’m excited that one of the features is much more precise finger tracking. In the meantime, I am leveraging Leap Motion’s hand-position detection along the X, Y, and Z axes to control left/right, up/down, and forward/back movement.
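In a real setup, those axis values come from each Leap frame’s palm position (the leapjs library exposes it as an `[x, y, z]` array in millimeters). The sketch below shows only the mapping itself; the deadzone, saturation distance, neutral height, and command names are all illustrative assumptions, not values from my actual setup.

```javascript
// Hypothetical mapping from a Leap Motion palm position [x, y, z]
// (millimeters relative to the controller) to drone speed commands.
// All thresholds here are made-up illustrative values.
const DEADZONE = 30;       // mm of slack around center where the drone hovers
const MAX_OFFSET = 150;    // offset (mm) at which speed saturates at 1.0
const CENTER_HEIGHT = 200; // assumed neutral hand height (mm) above the device

function axisToSpeed(mm) {
  if (Math.abs(mm) < DEADZONE) return 0;
  const sign = mm < 0 ? -1 : 1;
  // Speed grows with distance from center and caps at 1.0, matching
  // "the further away from center I move my hand, the faster the drone moves".
  return sign * Math.min((Math.abs(mm) - DEADZONE) / (MAX_OFFSET - DEADZONE), 1);
}

function palmToCommands([x, y, z]) {
  return {
    leftRight: axisToSpeed(x),              // X axis: strafe left/right
    upDown: axisToSpeed(y - CENTER_HEIGHT), // Y axis: climb/descend
    frontBack: axisToSpeed(-z)              // Z grows toward the user; negate for "forward"
  };
}

console.log(palmToCommands([150, 200, 0]).leftRight); // prints 1
```

The deadzone keeps the drone hovering when my hand is roughly centered, so sensor jitter near the origin doesn’t translate into constant drift.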
Devices for interacting with the world around us are rapidly increasing in effectiveness. Imagine if you could use something like Leap Motion without being tied to a computer. In early 2014, Thalmic Labs is releasing the Myo, an armband that detects the electrical activity in the muscles associated with finger movements and uses it to wirelessly control digital technologies. In the near future, I may be able to build a pocket-sized autonomous personal drone that follows me down the sidewalk while sending a video feed to my Google Glass; I could then disengage the autopilot and control my drone assistant through hand gestures. I could also send my drone on missions to pick up a burrito, survey surrounding traffic, mow my lawn, or take an aerial picture of me and my buddies, among thousands of other possibilities.
In part 2 of this article, I’ll go through some of the challenges facing the emerging drone industry and why we are on the cusp of a hardware revolution.
To Be Continued…