We are on the cusp of the personal robotics age.  The most optimistic think it will be only another two to three years until we all have personal robotic assistants, drone deliveries, and robotic entertainment for accessing information and interacting with the world around us.  I think we won't see the beginnings of that paradigm shift until sometime in 2018, about five years from now.

For those who don't believe domestic drones will ever be legal: laws passed in 2012 require the FAA to allow commercial drones in domestic airspace by 2015.  Just a couple of months ago, the FAA released a roadmap for drone legalization by 2015, setting the stage for legalized drone use by law enforcement, businesses, universities, and hobbyists.  Although the agency may not hit that exact deadline, we are likely about to witness the emergence of a multi-billion-dollar industry seemingly overnight.

Unfortunately, many people associate drones with military operations, and the press has primarily cast them in a negative light.  The word "drone" makes most people cringe as they think about the dangers of militarized drones and possible reductions in privacy.  However, this way of thinking is akin to fearing computers in the 70s because black hats could use them to wreak havoc on society.

There are thousands of domestic applications for drones that will enhance our world.  Drones will be used in agriculture for targeted weed management, watering, harvesting, and transportation, resulting in less pesticide use, less water waste, and fresher food.  Restaurants and grocery stores will deliver food more quickly, Amazon will deliver packages within hours (though Amazon's timeline is pretty optimistic), and logistical issues like traffic will be monitored in real time.  Little league games will be filmed as though they were professional broadcasts, weddings will be shot from previously impossible angles, extreme athletes will more easily capture epic moments, and journalists will take pictures and video of previously inaccessible areas.

Other benefits to humanity include search and rescue operations, fire and wildfire control, ecological monitoring, deep ocean surveillance (yes, these are technically drones despite not flying), medical first responders, medical supply transportation, transporting food and water to impoverished areas, and disaster relief.

Drones can also be used for entertainment.  Imagine a stadium filled with spectators watching a game of drone quidditch (think Harry Potter), where the snitch is also a drone.  Humans control the drones: the right arm controls movement, while the left arm controls a primary mechanism that depends on the drone's role (offense, defense, and so on).

The future is coming – can you hack it?

What I’m Doing

I start my Node.js server, plug my Leap Motion into the computer's USB port, and hold my hand in the air.  I make a gesture that looks like left-clicking a mouse, though I'm not actually touching anything.  Suddenly, with a whoosh, the rotors on my drone buzz to life.  This particular model, the AR Drone 2.0, is a quadcopter: four rotors arranged in a square, which gives it enhanced stability.  The drone lifts into the air and hovers, waiting for me to issue a command.  I move my hand forward and the drone edges away from me.

The farther from center I move my hand, the faster the drone moves, until I realize, almost too late, that it is quickly approaching a tree.  I drive my hand down and to the right, and the drone dodges right and under the looming branches, narrowly escaping disaster.  I point with my finger and make a counter-clockwise circle in the air, and the drone rotates to face me.  Moving my hand forward again, I send the drone accelerating toward me.  Glancing back at my computer, I can see myself getting closer in the drone's video feed, which is streaming in my browser.  This is only the beginning of my drone journey.

Why Node opens the door for programming robotics with JavaScript

A few years ago, it was virtually impossible to control robots using JavaScript alone: interpreters were slow enough that any application requiring reasonable response times would not function properly.

With Google's V8 JavaScript engine, JavaScript's day has arrived.  V8 is written in C++ and compiles JavaScript directly to native machine code rather than interpreting it, so it is very fast.  Recent benchmarks have put it ahead of PHP, Ruby, and Python, second only to compiled languages like C.  Despite being initially designed to run in Google's Chrome browser, V8 has since been adopted by several JavaScript runtimes, including Node.

Node is used to make web applications responsive by moving data quickly between server and client.  Node also operates asynchronously, so multiple data streams can be handled simultaneously.  This non-blocking approach means Node can begin processing a second and third command without waiting for the first command to finish.  Another key feature of Node is its use of callbacks (functions that run when an operation succeeds or fails) to chain instructions, so you can create a series of commands in which each one runs upon completion of the previous command.
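As a rough sketch of that chaining pattern, the snippet below runs a series of commands strictly in sequence, each one starting only when the previous one's callback reports success.  The `sendCommand` function and the command names are hypothetical stand-ins for real drone instructions:

```javascript
// Hypothetical command sender standing in for a real drone instruction.
// It invokes its callback Node-style: callback(err, result).  A real
// command would complete asynchronously; here the callback fires
// immediately so the sketch stays self-contained.
function sendCommand(name, log, callback) {
  log.push(name);       // record that this command completed
  callback(null, name);
}

// Run commands strictly in sequence: each command starts only after
// the previous one's callback signals success.
function runSequence(commands, log, done) {
  if (commands.length === 0) return done(null, log);
  sendCommand(commands[0], log, function (err) {
    if (err) return done(err);             // abort the chain on failure
    runSequence(commands.slice(1), log, done);
  });
}

var flightLog = [];
runSequence(['takeoff', 'forward', 'land'], flightLog, function (err, log) {
  console.log(log); // the commands complete in order
});
```

Because each step receives the previous step's error, a stuck or failed command stops the chain instead of silently blocking everything behind it.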

The implication for robotics is that multiple commands can either be processed simultaneously or chained to occur in a particular sequence.  The standard runtimes for languages like Ruby and Python are synchronous by default: commands block one another, which can end in disaster if a single command gets stuck and takes a long time to process.

Why I Used Leap Motion and the Future of Controllers

Leap Motion is the first viable product in a paradigm shift that is changing the way we interact with technology.  For those who don't know, Leap Motion is a small sensor that plugs into a computer's USB port.  It can detect and track each of your hands and every finger's movement within the half-dome-shaped space above it.  It has several applications, including video games, computer interfacing and, now, flying drones.  I used the JavaScript library leap.js to translate hand coordinates into drone commands, published those instructions to my Node server using Faye (a simple publish-subscribe messaging system), and then issued movement commands to the drone.
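The publish-subscribe pattern Faye provides can be sketched with a single in-memory bus.  This is only an illustration of the pattern, not Faye itself: the real project sends these messages over HTTP between processes, and the channel name and message fields below are made up for the example:

```javascript
// Minimal in-memory publish-subscribe bus illustrating the Faye-style
// pattern: publishers and subscribers only share a channel name.
function Bus() {
  this.channels = {}; // channel name -> array of subscriber callbacks
}

Bus.prototype.subscribe = function (channel, callback) {
  (this.channels[channel] = this.channels[channel] || []).push(callback);
};

Bus.prototype.publish = function (channel, message) {
  (this.channels[channel] || []).forEach(function (cb) { cb(message); });
};

var bus = new Bus();
var received = [];

// The drone side subscribes to movement instructions
// (the '/drone/move' channel name is illustrative).
bus.subscribe('/drone/move', function (message) {
  received.push(message);
});

// The gesture side publishes an instruction without knowing who listens.
bus.publish('/drone/move', { direction: 'forward', speed: 0.3 });
```

The appeal of this design is decoupling: the gesture-tracking code never needs a direct reference to the drone client, so either side can be swapped out or run on a different machine.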

At this point, Leap Motion's software needs an upgrade: it isn't very effective at detecting finger movements if, for instance, you turn your hand on its side.  As a result, I used hand movements for everything except takeoff and landing (done by gesturing with the pointer finger as though left-clicking a mouse) and rotation (done by drawing a circle in the air with the pointer finger).  I have heard that Leap Motion is upgrading their firmware in the next few weeks, and I'm excited that one of the promised features is much more precise finger tracking.  In the meantime, I am using Leap Motion's hand-position detection along the X, Y, and Z axes to control left/right, up/down, and forward/back movement.
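A minimal sketch of that axis mapping might look like the following.  Leap Motion reports palm position as an `[x, y, z]` array in millimeters, with Y measured up from the device; the hover center, deadzone, and range values here are illustrative assumptions, not the exact numbers from my project:

```javascript
// Map a Leap Motion palm position [x, y, z] (millimeters, Y measured
// up from the device) to drone speeds in the range [-1, 1].
// These tuning constants are illustrative, not exact project values.
var CENTER = { x: 0, y: 200, z: 0 }; // resting "hover" hand position
var DEADZONE = 20;  // mm of slack before the drone reacts at all
var RANGE = 100;    // mm of hand travel for full speed

function axisToSpeed(value, center) {
  var offset = value - center;
  if (Math.abs(offset) < DEADZONE) return 0;        // ignore small jitters
  return Math.max(-1, Math.min(1, offset / RANGE)); // clamp to [-1, 1]
}

function palmToCommand(palmPosition) {
  return {
    right: axisToSpeed(palmPosition[0], CENTER.x), // X axis: left/right
    up: axisToSpeed(palmPosition[1], CENTER.y),    // Y axis: up/down
    back: axisToSpeed(palmPosition[2], CENTER.z)   // Z axis: toward you is +Z
  };
}
```

With a client library like node-ar-drone, each nonzero field could then be forwarded to the corresponding movement call (for example, `client.right(speed)`); the deadzone keeps the drone hovering steadily while your hand rests near center.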

Devices for interacting with the world around us are rapidly improving.  Imagine using something like Leap Motion without being tied to a computer.  In early 2014, Thalmic Labs is releasing the Myo, an armband that detects the electrical activity in the muscles associated with finger movements, allowing wireless control of digital technologies.  In the near future, I may be able to build a pocket-sized autonomous personal drone that follows me down the sidewalk while sending a video feed to my Google Glass, then disengage its autopilot and steer my drone assistant with hand gestures.  I could also send my drone on missions: pick up a burrito, survey surrounding traffic, mow my lawn, take an aerial picture of me and my buddies, and thousands of other possibilities.

In part 2 of this article, I’ll go through some of the challenges facing the emerging drone industry and why we are on the cusp of a hardware revolution.

To Be Continued…