System Overview

This page provides a system-level overview of the Robotic Explorer (RX) system and its major subsystems:

  • RX Console (single-page web app)
  • RX Guide (robotic navigation aid)
  • RX Rover (robotic exploration tool)
  • RX Server (online data repository)

The Console lets the Rover's (human) supervisor control the Rover, add annotations, etc. The Guide may be somewhat smaller and simpler than the Rover. The Server collects, analyzes, and distributes information.

Note: This page concentrates mostly on the Rover; the designs for the other subsystems are still very fuzzy!

Details

Computers

The Raspberry Pi makes a fine system-level computer, and I really like the fact that the available operating systems (e.g., Linux, FreeBSD) are Unix-inspired. However, it has relatively few low-level interfaces, and a general-purpose OS can make low-level programming more difficult.

So, I plan to use an Arduino for analog I/O (e.g., controlling steppers, accepting sensor data). I can also rely on the iRobot's built-in control computer to run the drive wheels, avoid obstacles, etc. In summary, the Arduino performs low-level sensing and control tasks; the Pi records data, supervises the mapping operation, etc.
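As a rough sketch of how the Pi's side of this split might look, here is a minimal Elixir process that ingests newline-delimited sensor reports arriving from the Arduino over its USB serial link. It assumes the circuits_uart package; the device name ("ttyACM0") and the "name:value" report format are placeholders, not part of the actual design.

```elixir
# Minimal sketch: the Pi reads newline-delimited sensor reports that the
# Arduino emits over its USB serial link, then hands them off for logging.
# Assumes the circuits_uart package; "ttyACM0" and the "name:value" report
# format are placeholders.
defmodule RX.ArduinoLink do
  use GenServer

  def start_link(opts \\ []) do
    GenServer.start_link(__MODULE__, opts, name: __MODULE__)
  end

  @impl true
  def init(_opts) do
    {:ok, uart} = Circuits.UART.start_link()

    :ok =
      Circuits.UART.open(uart, "ttyACM0",
        speed: 115_200,
        active: true,
        framing: {Circuits.UART.Framing.Line, separator: "\n"}
      )

    {:ok, %{uart: uart}}
  end

  # Each complete line arrives as a message, e.g. "sonar_1:1234".
  @impl true
  def handle_info({:circuits_uart, _port, line}, state) when is_binary(line) do
    case String.split(line, ":", parts: 2) do
      [name, value] -> IO.puts("sensor #{name} reported #{value}")  # stand-in for real logging
      _other        -> :ignore
    end

    {:noreply, state}
  end

  def handle_info(_other, state), do: {:noreply, state}
end
```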

Communication

I plan to connect to the Pi mostly via USB. Its four built-in USB ports will be used for devices such as the Wi-Fi transceiver and USB hubs. Lower-rate interfaces (e.g., Arduino, cameras, serial adaptors) will daisy-chain off the USB hubs. The RX subsystems communicate with each other – and possibly the Web Application (WA) Server – using Wi-Fi and/or the Internet.
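One of those serial hops will presumably carry the iRobot Create 2's Open Interface (OI) traffic. Here is a minimal Elixir sketch of that link, again assuming the circuits_uart package; the device name ("ttyUSB0") is a placeholder, and the opcodes (Start = 128, Safe = 131, Drive = 137) come from iRobot's published OI spec.

```elixir
# Minimal sketch: talk to the Create 2 over its serial Open Interface (OI),
# here via a USB-serial adaptor exposed as "ttyUSB0" (a placeholder name).
# Assumes the circuits_uart package; opcodes are from the published OI spec.
defmodule RX.Create2 do
  @baud 115_200   # Create 2 default baud rate

  def open(device \\ "ttyUSB0") do
    {:ok, uart} = Circuits.UART.start_link()
    :ok = Circuits.UART.open(uart, device, speed: @baud, active: false)
    :ok = Circuits.UART.write(uart, <<128, 131>>)   # Start, then Safe mode
    {:ok, uart}
  end

  @doc "Drive at `velocity` mm/s along an arc of `radius` mm (0x7FFF = straight)."
  def drive(uart, velocity, radius \\ 0x7FFF) do
    # Both arguments are 16-bit two's-complement, high byte first, per the OI spec.
    Circuits.UART.write(uart, <<137, velocity::signed-16, radius::signed-16>>)
  end
end

# Example: creep forward at 100 mm/s, then stop.
#   {:ok, c2} = RX.Create2.open()
#   RX.Create2.drive(c2, 100)
#   RX.Create2.drive(c2, 0)
```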

Block Diagram

Note: The Accessible SVG page contains an attempt at a text-based explication of the block diagram shown above.

Legend

  • solid lines denote USB
  • dot-dash lines denote mechanical linkage
  • loose dotted lines denote Wi-Fi
  • tight dotted lines denote analog, RS-232, and TTL

  • black arrowheads denote control
  • white arrowheads denote result

Components

There is a wealth of components that can be used to build an economical, yet capable and extensible Rover. Here is the recipe I'm using for my prototype; the parts cost has yet to hit $500:

  • control and logging computer (e.g., Raspberry Pi 2 Model B)
  • data communication (e.g., USB hub, Wi-Fi transceiver)
  • data logging storage (e.g., 128 GB MicroSD Memory Card)
  • fixed and mobile cameras (for panoramic and selected views)
  • inertial measurement unit (e.g., Adafruit 10-DOF)
  • real-time clock module (e.g., RTC I2C DS1307 AT24C32)
  • robotic mobile platform (e.g., iRobot Create 2)
  • servo-controlled sensor mount (e.g., EMAX ES3103E)
  • ultrasonic sensors (e.g., three MaxBotix MB1360s)
  • miscellaneous infrastructure (e.g., battery, chassis)

Sensors

Extremely cheap USB cameras and hubs are readily available, so there is no need to economize here. Putting eight cameras in a ring, for example, would provide nicely overlapping, time-synchronized images, with no need to re-orient the cameras (at 360°/8 = 45° between optical axes, any lens with a horizontal field of view wider than 45° will overlap its neighbors). Another camera can track each ultrasonic sensor, so it always "sees" the same part of the wall, etc.

Adafruit's 10-DOF "inertial measurement unit" is pretty amazing: it provides barometric pressure/altitude, temperature, and three axes of accelerometer, gyroscopic, and magnetic (compass) data. I'm not sure which of these I'll want (let alone need), but for $29, I'll take it...
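As a rough sketch of pulling one of those readings off the board, here is a minimal Elixir function that reads a raw accelerometer sample from the LSM303DLHC chip on the 10-DOF board over I2C. It assumes the circuits_i2c package and the Pi's "i2c-1" bus; the register addresses follow the LSM303DLHC datasheet and should be double-checked against the actual board revision.

```elixir
# Minimal sketch: read one raw accelerometer sample from the LSM303DLHC on
# the Adafruit 10-DOF board. Assumes the circuits_i2c package; register
# addresses are taken from the LSM303DLHC datasheet and should be verified.
defmodule RX.IMU do
  @accel_addr 0x19      # LSM303DLHC accelerometer I2C address
  @ctrl_reg1_a 0x20     # control register: data rate and axis enables
  @out_auto 0xA8        # OUT_X_L_A (0x28) with bit 7 set for auto-increment

  def read_accel do
    {:ok, bus} = Circuits.I2C.open("i2c-1")

    # 0x57 = 100 Hz data rate, normal power mode, X/Y/Z axes enabled
    :ok = Circuits.I2C.write(bus, @accel_addr, <<@ctrl_reg1_a, 0x57>>)

    # One transaction returns all six output bytes (X, Y, Z; little-endian).
    {:ok, <<x::little-signed-16, y::little-signed-16, z::little-signed-16>>} =
      Circuits.I2C.write_read(bus, @accel_addr, <<@out_auto>>, 6)

    Circuits.I2C.close(bus)
    {x, y, z}   # raw, left-justified counts; scale per the datasheet
  end
end
```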

Servos

Putting three sets of ultrasonic sensors and cameras on a servo-controlled sensor mount serves two purposes. In normal (corridor traversal) mode, they scan the surrounding walls. However, a set can also be pointed backward, at a desired angle, etc. If need be, we can add tilt servos, increasing the range of directions.

Sensor orientations that are "normal" (perpendicular) to a wall will produce the fastest and strongest return signals. So, to accommodate variations in the Rover's path, the servo-controlled sensor mount can swing back and forth (covering a wider range than the expected angular variation).
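Here is a minimal Elixir sketch of that sweep: step the mount through a handful of angles, read the ultrasonic range at each, and keep the angle with the shortest return, i.e., the wall normal. The set_mount_angle and read_range arguments are hypothetical stand-ins for whatever the real servo and sonar interfaces turn out to be.

```elixir
# Minimal sketch of the "find the wall normal" sweep. The two function
# arguments are hypothetical stand-ins for the real servo and sonar drivers.
defmodule RX.WallScan do
  # Candidate angles (degrees) relative to the mount's center position.
  @angles [-30, -20, -10, 0, 10, 20, 30]

  @doc """
  Sweep the mount, read the range at each angle, and return the
  {angle, range} pair with the shortest return (the wall normal).
  """
  def find_normal(set_mount_angle, read_range) do
    @angles
    |> Enum.map(fn angle ->
      set_mount_angle.(angle)
      Process.sleep(50)              # give the servo time to settle
      {angle, read_range.()}
    end)
    |> Enum.min_by(fn {_angle, range} -> range end)
  end
end

# Usage (with real servo/sonar functions substituted):
#   {normal_angle, range_mm} = RX.WallScan.find_normal(&set_servo/1, &ping/0)
```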

The mount can also reduce motion blur in camera images: by rotating backwards at the right speed, it can compensate for (most of) the Rover's forward motion. Finally, tilt servos could swing the sensor pairs up and down, using an approach similar to conical scanning to increase their positional resolution.
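For the blur-compensation trick, the required counter-rotation rate falls out of simple geometry: a point on a wall at perpendicular distance d, viewed from a platform moving forward at speed v, sweeps past at roughly v/d radians per second. A tiny Elixir helper (the function name is mine, not part of the design) makes the arithmetic concrete.

```elixir
# Back-of-the-envelope for motion-blur compensation: a wall point at
# perpendicular distance d_mm, seen from a platform moving at v_mm_s,
# sweeps past at about v/d radians per second, so the mount should
# counter-rotate at roughly that rate while the shutter is open.
defmodule RX.BlurComp do
  @doc "Counter-rotation rate (deg/s) for forward speed `v_mm_s` and wall distance `d_mm`."
  def counter_rate_deg_per_s(v_mm_s, d_mm) when d_mm > 0 do
    v_mm_s / d_mm * 180.0 / :math.pi()
  end
end

# Example: moving at 300 mm/s, 1 m from the wall:
#   RX.BlurComp.counter_rate_deg_per_s(300, 1000)  #=> ~17.2 (degrees per second)
```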

Concurrency

The Rover has to perform a number of concurrent activities (e.g., accepting data, controlling sensors, plotting a course). I'm planning to use Elixir (running on the Pi) for most of this. Similarly, I expect to use the Elixir-based Phoenix Framework on the Server, in order to have channels, fail-soft behavior, and scalability.
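As a rough sketch of what that might look like on the Pi, here is a minimal Elixir supervision tree in which each ongoing activity runs as its own supervised process; if one crashes (say, the course planner), it is restarted without taking down data logging. Apart from RX.ArduinoLink (sketched earlier), the child module names are placeholders, not part of the actual design.

```elixir
# Minimal sketch of the Rover's concurrency layout: one supervised process
# per ongoing activity. RX.DataLogger and RX.CoursePlanner are placeholders.
defmodule RX.Rover.Supervisor do
  use Supervisor

  def start_link(opts \\ []) do
    Supervisor.start_link(__MODULE__, opts, name: __MODULE__)
  end

  @impl true
  def init(_opts) do
    children = [
      RX.ArduinoLink,      # ingests low-level sensor reports (sketched above)
      RX.DataLogger,       # appends readings to the MicroSD card
      RX.CoursePlanner     # updates the map and plots the next move
    ]

    # Restart only the child that fails; its siblings keep running.
    Supervisor.init(children, strategy: :one_for_one)
  end
end
```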

Note: Many thanks to my "brain trust": Gene Dronek, Isaac Wingfield, and Vicki Brown.


This wiki page is maintained by Rich Morin, an independent consultant specializing in software design, development, and documentation. Please feel free to email comments, inquiries, suggestions, etc!
