Question about Automated Food Servers using a Quadcopter + Panera Bread IPS

I'll give a $50 tip for each well-researched response.

  1. I recently saw this video of a waiter using a quadcopter, controlled from an iPad, to deliver food.
    http://www.youtube.com/watch?v=uHNkWpPNK9A

  2. Then I saw this TED Talk on how quadcopters can sense objects around them and make judgments so as not to bump into anything (obstacle-avoidance sensors).
    http://youtu.be/4ErEBkj_3PY?t=9m2s

  3. Then I was at Panera Bread and discovered that the wait staff knows where you're sitting because the Table Tracker they give you contains an IPS (indoor positioning system).
    http://www.youtube.com/watch?v=B9ilqbB6Ow0

What software libraries can I use to make the quadcopter deliver food to the location of the Table Tracker? What software libraries/sensors exist to make sure a quadcopter reaches its destination while avoiding obstacles?

I recognize that there would be a lot of user-experience problems to iron out (lack of human touch, etc.), but I'm interested in the current state of software/libraries and how to build a working prototype of this.

If I haven't phrased my question correctly, I apologize, and I can clarify it based on your comments.

Awesome!!
alixaxel 7 months ago
This was a seriously fun question and I really enjoyed discovering possibilities and collaborating on the research with you guys. There's definitely a lot more research needed, but these suggestions seem to be a great stepping stone. Thank you!
akshatpradhan 7 months ago
awarded to ochi
Tags
research


2 Solutions


I haven't done any specific research for this, but I thought I would share my two cents on what I know about it.

The quadcopter in the first video seems remarkably similar to the Parrot AR.Drone - in fact, I'm pretty sure it's the same one. I'm also pretty sure the waiter is flying the drone at all times (talk about being lazy =P). AFAIK, the AR.Drone can fly by itself, but you need to record fixed GPS coordinates of the flight plan beforehand.

I must say, however, that it would be a bit dangerous to carry the plate without the propeller guards on.

I still haven't checked the second and third videos (I will in a couple of hours), and I don't own an AR.Drone, so I don't know what kind of tech it has or doesn't have. From a theoretical point of view, though, you could attach a Raspberry Pi with OpenCV installed to the bottom of the drone and have some code analyze the obstacles in the way (neural networks are usually employed for vision-based path-finding tasks like this one). Since the AR.Drone only has one forward camera, the drone would have to stop and turn every time it saw an obstacle. Also, the Raspberry Pi has a nice-ish graphics processor, but I'm not sure it would be fast enough for real-time image decomposition plus path finding; most probably the flight would be slowish.
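Just to make the idea concrete, here is a minimal Python + OpenCV sketch of a much dumber heuristic than a trained neural network: split the camera frame into left/center/right thirds, measure edge density in each, and steer toward the least cluttered third. The camera index and the threshold are just assumptions for illustration; this is not a proven obstacle detector.

```python
import cv2
import numpy as np

EDGE_THRESHOLD = 0.08  # assumed: fraction of edge pixels above which a region counts as "blocked"

def least_cluttered_direction(frame):
    """Split the frame into left/center/right thirds and pick the one
    with the lowest edge density (a crude stand-in for obstacle detection)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)              # binary edge map
    h, w = edges.shape
    thirds = {
        "left":   edges[:, : w // 3],
        "center": edges[:, w // 3 : 2 * w // 3],
        "right":  edges[:, 2 * w // 3 :],
    }
    density = {name: np.count_nonzero(part) / part.size for name, part in thirds.items()}
    best = min(density, key=density.get)
    center_blocked = density["center"] > EDGE_THRESHOLD
    return best, center_blocked, density

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                     # assumed: camera exposed as /dev/video0
    ok, frame = cap.read()
    if ok:
        direction, blocked, density = least_cluttered_direction(frame)
        print("edge density per third:", density)
        print("obstacle ahead, steer", direction) if blocked else print("path ahead looks clear")
    cap.release()
```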

Another option, probably easier and faster, would be to set up three or four ultrasonic transmitter/receiver pairs at the bottom of the drone, each pointed in a distinct direction. The transmitters would send an ultrasonic pulse and the receivers would measure how long the sound wave takes to come back, just like a sonar. The speed of sound is ≈ 340 m/s, and it varies slightly with temperature and altitude (pressure), but since the drone would be operating under relatively stable conditions, you wouldn't even need to account for that.
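To make the time-of-flight math concrete, here is a minimal sketch assuming an HC-SR04-style ultrasonic module wired to Raspberry Pi GPIO pins (the pin numbers are arbitrary): distance = echo time × 340 m/s ÷ 2, since the pulse travels out and back.

```python
import time
import RPi.GPIO as GPIO

TRIG_PIN = 23           # assumed wiring; adjust to your setup
ECHO_PIN = 24
SPEED_OF_SOUND = 340.0  # m/s, roughly constant indoors

def measure_distance_m():
    """Fire one ultrasonic pulse and return the estimated distance in meters."""
    GPIO.output(TRIG_PIN, True)         # 10 microsecond trigger pulse
    time.sleep(0.00001)
    GPIO.output(TRIG_PIN, False)

    pulse_start = pulse_end = time.time()
    while GPIO.input(ECHO_PIN) == 0:    # wait for the echo line to go high...
        pulse_start = time.time()
    while GPIO.input(ECHO_PIN) == 1:    # ...and back low again
        pulse_end = time.time()

    round_trip = pulse_end - pulse_start
    return round_trip * SPEED_OF_SOUND / 2  # the pulse goes out and back, so halve it

if __name__ == "__main__":
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TRIG_PIN, GPIO.OUT)
    GPIO.setup(ECHO_PIN, GPIO.IN)
    try:
        print("obstacle at %.2f m" % measure_distance_m())
    finally:
        GPIO.cleanup()
```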

The good thing about the sonar approach, besides being cheap, is that you could "hear" obstacles in three or four distinct directions with near-zero computational effort (compared to the OpenCV model); you would just have to feed the path-finding algorithm something along these lines:

  • north is clear for 120m
  • west is clear for 100m
  • south is clear for 10m (don't go!)
  • east is clear for 60m

Then if the target was 150 m to the northwest, the drone would pick north, then west, or just turn northwest right away and check whether that is also clear; that's a heuristic you would have to test and benchmark. You could also do this with only one transmitter/receiver, but probing a single 120 m path already takes about 0.7 s (the echo travels 240 m round trip at 340 m/s), so sweeping several directions sequentially would be slow. As for the hardware, you would probably still need the Raspberry Pi for the path-finding computation, but I'm not sure it would be viable to plug several sound devices into it; I think not. In that case, you could pipe the sound signals through Arduino boards, or even better, Microduino ones.
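Here is a toy version of that heuristic, using the four clearances above and an invented bearing to the table: pick the clear direction whose compass bearing is closest to the target, and hover and re-probe if everything is blocked. The safety margin and all the numbers are made up for illustration.

```python
# Compass bearings (degrees) of the four sonar beams.
DIRECTIONS = {"north": 0.0, "east": 90.0, "south": 180.0, "west": 270.0}

def pick_direction(clearances_m, target_bearing_deg, safety_margin_m=2.0):
    """Return the clear direction whose bearing is closest to the target bearing.

    clearances_m: dict like {"north": 120, "west": 100, "south": 10, "east": 60}
    target_bearing_deg: bearing to the table, e.g. 315 for northwest
    """
    def angular_gap(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)

    candidates = [
        (angular_gap(DIRECTIONS[name], target_bearing_deg), name)
        for name, clearance in clearances_m.items()
        if clearance > safety_margin_m      # skip directions that are basically blocked
    ]
    if not candidates:
        return None                         # boxed in: hover and probe again
    return min(candidates)[1]

if __name__ == "__main__":
    clearances = {"north": 120, "west": 100, "south": 10, "east": 60}
    print(pick_direction(clearances, 315))  # prints "north" ("west" ties at 45 degrees off target)
```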

LIDAR technology would even surpass sonars, but I don't think it would be viable for several reasons:

  • very, very expensive
  • higher power consumption
  • customers would get lasered all the time xD

As for the food-dropping problem, I don't think that's really a problem, because all quadcopters have built-in gyroscopes. As long as you could ensure that the plate's center of gravity stayed within the drone's carrying plate (and that there was enough friction between both surfaces), you would have very comfortable tilt parameters to fly with.
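To put a rough number on "comfortable tilt": the plate starts to slide once tan(tilt) exceeds the static friction coefficient between the two surfaces, so something like this gives a ballpark limit (the friction value is just an assumed example, not a measurement).

```python
import math

def max_tilt_deg(static_friction_coefficient):
    """Tilt angle at which the plate would start sliding: tan(theta) = mu."""
    return math.degrees(math.atan(static_friction_coefficient))

if __name__ == "__main__":
    mu = 0.4  # assumed value for, say, ceramic on a rubberized pad
    print("plate stays put up to roughly %.1f degrees of tilt" % max_tilt_deg(mu))
    # ~21.8 degrees, well beyond what a gently flown quadcopter needs indoors
```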

I also recommend you watch this YouTube video if you haven't already: Quadrocopter Pole Acrobatics.

Now, I don't know if the AR.Drone has some sort of open hardware specification that would allow you to get rid of the Raspberry Pi and pipe the computations to the controlling tablet (I don't think this would be very reliable anyway) or to its internal CPU. I'm also not sure whether you are asking specifically about the AR.Drone or not, so...

I was reading this through, but do you think location accuracy is going to be the biggest issue? Here's how the Table Tracker at Panera Bread works: "When you place your order in a casual dining situation – like Panera – you receive a coaster with a number on it. You find a place to sit and place the coaster on the table. As your order becomes available, a server can see on restaurant floor plan exactly what table you are sitting at. The coaster recognizes the RFID tag under your table and identifies your placement on a floor layout map in the back of the store."
akshatpradhan 7 months ago
@akshatpradhan: GPS accuracy is crucial. Most GPS receivers are accurate to 6-10 meters, but even that is unacceptable in this scenario. I have no idea how good the GPS receiver in the AR.Drone is, sorry. Suppose that the drone gathers the meal from a signal-insulated kitchen - that would also be very problematic to deal with. One possible solution would be to have some sort of beacon at the table that the drone could understand: be it a strong ultrasound signal (for the sonar - prone to signal weakening [windows]), a color sign (computer vision - prone to signal blocking [walls]), or self-powered RFID tags (prone to signal blocking [metals], and they would need an additional predefined location for place orientation).
alixaxel 7 months ago
@akshatpradhan: I plan on checking the remaining videos as soon as I finish something I have on hand, and I'll get back to you if something else pops into my mind. But most likely, you would need to use CV and train the drone (using a neural network) so that it would know "the corners around the place". That's the only way I see it being accurate enough and functional in non-ideal conditions. The DARPA Grand Challenge suddenly became very similar.
alixaxel 7 months ago
I also think I should just go out and buy this Quadcopter configured with Ardupilot and play around with it to see how much "fail" we're really dealing with. http://store.3drobotics.com/products/apm-3dr-quad-rtf
akshatpradhan 7 months ago
@akshatpradhan: It certainly looks more hackable than the AR.Drone, and better equipped as well. I just read the description, but if it has CV built-in, you might be able to get the whole thing done by simply using the "return to launch" and "follow me" features. I don't know how it deals with locations (it seems to be radar-like technology), but if you set up a tablet under each table and have the drone follow it... Just be sure to also get the Telemetry Set.
alixaxel 7 months ago
@alixaxel yeah, and as ochi mentioned below, the Mission Planner seems to be the command-and-control software for planning routes, so we'd have to feed coordinates into the Mission Planner. http://planner.ardupilot.com It has some nice screenshots, though, and helps visualize how all the components come together. This also comes to mind: "allows the trajectory to be plotted in 3D. It can be useful for indoor navigation" http://code.google.com/p/ardupilot-mega/wiki/QGC#3D_View
akshatpradhan 7 months ago
@akshatpradhan: I wouldn't put my hopes on the Mission Planner, though; the AR.Drone has the same feature, but AFAIK it's only useful for free pathways, as it doesn't avoid obstacles (I think the main practical application is aerial video recording). My bet is that a preset flight wouldn't even be useful indoors, as it could crash into passing people and whatnot. I don't know. If you're counting on that, I suggest you contact the developer and ask that specific question.
alixaxel 7 months ago
you're right, I also found this article, just some FYI "10 things to know about indoor positioning systems" http://www.directionsmag.com/articles/10-things-you-need-to-know-about-indoor-positioning/324602
akshatpradhan 7 months ago
Winning solution
This Ardupilot seems really interesting, especially with its GPS waypoints. I wonder how exactly the GPS waypoints work on this thing, especially their accuracy.
akshatpradhan 7 months ago
@ochi I'm just curious, what does the development environment get me? I'm not entirely sure what the possibilities are. https://github.com/diydrones/ardupilot#development-using-virtualbox
akshatpradhan 7 months ago