LBCC SlamBot v0.0.1

by Levi Willmeth -

This thread will hold updates and links for the LBCC SlamBot, a 2d environment-mapping robot which will (hopefully) be able to move around while creating a map of its surroundings.  This project, nicknamed SlamBot for now, will be built in several phases, originally planned over the 2015 Winter and Spring terms.

The nickname SlamBot comes from the intention to build a Simultaneous Localization And Mapping roBot (SLAMBot) which will be capable of creating a 2d map while recording its position on that map.  This will not be an easy project, but I think it is possible if broken into small enough steps.  With help from our club mentors, teachers, and some motivated students, I think we can accomplish many, if not all, of our goals.

I wrote up an initial proposal and budget estimate, which has been accepted by our club advisor David Becker.  In it I suggest breaking the project into 3 phases.  Here are the snippets from the proposal describing each phase:

Phase 1: Stationary 2d Mapping

Keeping our robot in one place will simplify troubleshooting by allowing repeatable tests and eliminating the errors introduced by moving the robot.  It will also let us use predetermined, repeatable distances to test the accuracy of our sensors.

We’ll start out by mounting our laser rangefinder on a spinning platform on top of a table, to create a 2d overhead (x,y) map of the distance to every obstacle in the room.  These distances can be stored as an array and treated as a grid, which can be visualized and debugged.
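
To make that concrete, here's a minimal sketch of the polar-to-grid math in C++.  The grid size, cell size, and function name are all mine, just for illustration:

    #include <cmath>

    const int GRID_SIZE = 40;        // 40x40 cells, sensor at the center
    const float CELL_CM = 10.0;      // each cell covers 10 cm
    char grid[GRID_SIZE][GRID_SIZE]; // ' ' = unknown, '#' = obstacle

    // Convert one rangefinder reading (turret angle in degrees, distance
    // in cm) into a grid cell and mark it as an obstacle.
    void markObstacle(float angleDeg, float distCm) {
        float rad = angleDeg * M_PI / 180.0;
        int col = GRID_SIZE / 2 + (int)(distCm * cos(rad) / CELL_CM);
        int row = GRID_SIZE / 2 + (int)(distCm * sin(rad) / CELL_CM);
        if (col >= 0 && col < GRID_SIZE && row >= 0 && row < GRID_SIZE)
            grid[row][col] = '#';
    }

With the sensor fixed at the grid's center, every sweep of the turret just calls markObstacle() once per reading.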

Using a stationary position will allow us to more easily place obstacles at known distances and confirm that the sensor places them correctly in our grid.


Phase 2: Mapping on the Move

Moving our robot around adds a lot of functionality, but even more complexity.  In phase 2 we will mount our sensor on a remote-controlled chassis and drive it around, using wheel encoders and sensor data to create a real-time map of the places it's been.

One major challenge here will be accuracy.  As the robot moves, it will need to update its position on the map using the wheel encoders, while facing problems like loss of traction and inaccuracies in the wheel-movement algorithms.  We will need to be vigilant against even small errors, because they will add up over time.
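
For anyone curious what the encoder math looks like, the standard approach for a two-wheeled robot is differential-drive dead reckoning.  Here's a rough C++ sketch; the wheel and encoder constants are made up:

    #include <cmath>

    const float TICKS_PER_REV = 360.0; // encoder ticks per wheel turn (assumed)
    const float WHEEL_CIRC_CM = 20.0;  // wheel circumference (assumed)
    const float WHEEL_BASE_CM = 15.0;  // distance between the wheels (assumed)

    float poseX = 0, poseY = 0, poseTheta = 0; // position (cm) and heading (radians)

    // Update the robot's estimated pose from the encoder ticks counted
    // since the last update.
    void updateOdometry(long leftTicks, long rightTicks) {
        float dLeft   = leftTicks  / TICKS_PER_REV * WHEEL_CIRC_CM;
        float dRight  = rightTicks / TICKS_PER_REV * WHEEL_CIRC_CM;
        float dCenter = (dLeft + dRight) / 2.0;
        float dTheta  = (dRight - dLeft) / WHEEL_BASE_CM;
        poseX += dCenter * cos(poseTheta + dTheta / 2.0);
        poseY += dCenter * sin(poseTheta + dTheta / 2.0);
        poseTheta += dTheta;
    }

The catch is exactly the error problem above: every slipped wheel or rounded tick feeds a slightly wrong heading into all future updates.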

During this phase, the SLAMBot will NOT attempt to find its position on the map, and will create a new, blank map each time it starts up.  These maps may be saved, but never loaded.  This is because finding a position on a map is quite difficult, and deserves its own phase.


Phase 3: Persistent Mapping

By now, we should be able to create accurate maps on the move.  The next challenge is using the maps we're generating to find out where our robot is.  This will not be easy.  When the robot starts up, it has no idea where it is and must find its position based on cues from the environment, such as distances between walls and corners, lengths of hallways, and other sensor inputs.

We plan to create a blank map on startup, as before.  As the robot moves and generates the map, it will compare it against a previously recorded ‘master’ map.  By comparing distances and angles, the robot should be able to eliminate poor matches until a certain confidence threshold is reached.

At that point, when the robot is reasonably sure its temporary map matches the master, it can merge the temporary map into the master and begin updating the permanent map directly.
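
As a sketch of what eliminating poor matches could look like, here is one simple idea among many, ignoring rotation for simplicity: slide the temporary grid across the master grid and count agreeing obstacle cells.  The best-scoring offset, if it clears a threshold, is the match.

    const int GRID_SIZE = 40; // same grid convention as the phase 1 sketch

    // Score how well the temporary map matches the master map when the
    // temporary map is shifted by (dRow, dCol) cells.  Higher is better.
    int matchScore(char master[][GRID_SIZE], char temp[][GRID_SIZE],
                   int dRow, int dCol) {
        int score = 0;
        for (int r = 0; r < GRID_SIZE; r++) {
            for (int c = 0; c < GRID_SIZE; c++) {
                int mr = r + dRow, mc = c + dCol;
                if (mr < 0 || mr >= GRID_SIZE || mc < 0 || mc >= GRID_SIZE)
                    continue;
                if (temp[r][c] == '#' && master[mr][mc] == '#')
                    score++; // both maps see an obstacle here
            }
        }
        return score;
    }

A real version would also have to search over headings, not just x,y offsets, which is part of why this step gets its own phase.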


I hope to accomplish phases 1 and 2 during the 2015 Winter term, and invite all interested students to get in touch with me.  If you aren't sure what you can bring to the project but still want to help, that's OK too.  There is plenty of work to be done in several areas, from designing and manufacturing parts for the chassis, to writing code or documentation, to playing with the robot while testing wall arrangements on a tabletop.  We could also use people to write emails seeking sponsors who might buy better parts for us.  There are plenty of ways for anyone, of any skill level, to contribute.

All code will be available on GitHub, because Git is a fantastic way for several programmers to collaborate while keeping our codebase current.  GitHub is also an easy way to show our work to future employers, hint hint.  If you have something to offer, please feel free to fork the project and submit a pull request, or contact me if you'd like help learning how to use Git.

In reply to Levi Willmeth

Re: LBCC SlamBot v0.0.1

by Deleted user -

Have you seen the senseFly? HowToGeek had a pic on their blog from the CES show. I think the name you chose, SLAMBot, is awesome.

In reply to Deleted user

Re: LBCC SlamBot v0.0.1

by Levi Willmeth -

I saw a bunch of really neat UAV projects revealed at CES this year, but I hadn't seen the senseFly yet.  It looks like a really exciting field to be in right now.  On that note, I'm trying to build interest in some sort of quadcopter or UAV project for next semester (Spring 2015).  I'm open to ideas, but I was thinking of attempting to build a relatively simple Arduino-based flight controller which could handle simple tasks like keeping a quadcopter level and at a minimum height off the ground.
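
For a taste of what keeping a quadcopter level means in code, the usual building block is a PID loop on each tilt axis.  This is only a bare-bones sketch of the concept, with placeholder gains, not a real flight controller:

    // One PID loop for one tilt axis: given the current tilt angle from
    // an IMU and a target of zero degrees, compute a motor correction.
    float kP = 1.2, kI = 0.05, kD = 0.3; // placeholder gains
    float integral = 0, lastError = 0;

    float levelCorrection(float tiltDeg, float dtSec) {
        float error = 0.0 - tiltDeg; // we want zero tilt
        integral += error * dtSec;
        float derivative = (error - lastError) / dtSec;
        lastError = error;
        return kP * error + kI * integral + kD * derivative;
    }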

I would love to see the CPU Club invest in a couple of micro quadcopters this semester so several students can practice flying them.  It would help us to better understand the mechanics of flight and what a flight controller needs to do to keep something in the air.

I'm glad you like the name SLAMBot!  It seemed appropriate, even though I know it's not unique.  The acronym SLAM is well established and exactly describes what I'm trying to do, so I just ran with it even though I was a little worried people may think it was a battlebot.

In reply to Levi Willmeth

Re: LBCC SlamBot v0.0.1

by Levi Willmeth -

Week 1 update:

The SLAMBot project is making progress!  I wrote some code and purchased a LIDAR unit, along with what seems to have been a poor choice of stepper motor to control the turret.

  • The LIDAR unit is working very well and seems to spit out consistent and accurate distances under the limited test conditions I've tried so far.
  • The stepper motor turns but is quite slow.  It has 64 steps per revolution through a 64:1 gearbox, for a total of 4096 positions per revolution, which makes it extremely precise but very slow.  In hindsight I don't need resolution down to a tenth of a degree (the quick math is sketched below), so I will buy a second stepper with a lower gear ratio and a higher RPM.
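
For the record, the resolution arithmetic: 360 / 4096 is about 0.088 degrees per step.  Converting between turret angle and step count is one line each way (the function names are mine):

    #include <cmath>

    const long STEPS_PER_REV = 4096; // 64 steps x 64:1 gearbox

    // Motor steps needed to point the turret at a given angle.
    long angleToSteps(float angleDeg) {
        return lround(angleDeg / 360.0 * STEPS_PER_REV);
    }

    // The turret angle implied by a step count.
    float stepsToAngle(long steps) {
        return steps * 360.0 / STEPS_PER_REV;
    }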

The Arduino code in the project's GitHub repo takes the angle of the turret and the LIDAR measurement, finds the X,Y position of the target, and stores it in an array of obstacles.  It's still very much a work in progress, but I'm optimistic about having a working tabletop demo in a week or two.  I need to figure out a good way of displaying the data once it's been collected.
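
On the display question, one low-effort idea (just a sketch, not something in the repo) is to dump the grid as ASCII art over the serial port and watch it in the Arduino serial monitor:

    // Print the occupancy grid to the serial monitor as ASCII art.
    // Assumes the grid[][] / GRID_SIZE layout from the phase 1 sketch
    // and that Serial.begin() was already called in setup().
    void printGrid() {
        for (int r = 0; r < GRID_SIZE; r++) {
            for (int c = 0; c < GRID_SIZE; c++) {
                Serial.print(grid[r][c] == '#' ? '#' : '.');
            }
            Serial.println();
        }
        Serial.println(); // blank line between frames
    }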

Anyone interested in the project is still very much welcome to get in touch and offer suggestions, code, or just talk about ideas.  Enthusiasm counts for a lot at this stage of the project, and I'm learning a lot about this as I go, so don't worry about meeting some minimum skill level.