T-Space at The University of Toronto Libraries > School of Graduate Studies - Theses > Master

Please use this identifier to cite or link to this item: http://hdl.handle.net/1807/31339

Title: Visual Teach and Repeat Using Appearance-based Lidar - A Method For Planetary Exploration
Authors: McManus, Colin
Advisor: Barfoot, Timothy D.
Department: Aerospace Science and Engineering
Keywords: mobile robotics; vision systems; visual teach and repeat
Issue Date: 14-Dec-2011
Abstract: Future missions to Mars will place heavy emphasis on scientific sample-and-return operations, which will require a rover to revisit sites of interest. Visual Teach and Repeat (VT&R) has proven to be an effective method of autonomously repeating any previously driven route without a global positioning system. However, one of the major challenges in recognizing previously visited locations is lighting change, which can drastically alter the appearance of a scene. In an effort to achieve lighting invariance, this thesis details the design of a VT&R system that uses a laser scanner as the primary sensor. The key novelty is to apply appearance-based vision techniques, traditionally used with camera systems, to laser intensity images for motion estimation. Field tests were conducted in an outdoor environment over an entire diurnal cycle, covering more than 11 km with an autonomy rate of 99.7% by distance.
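The abstract's core idea is motion estimation from features matched between intensity images. As an illustrative sketch only (not the thesis's actual pipeline), the snippet below shows the classic SVD-based closed-form solution for recovering a 2D rigid transform from matched keypoint coordinates; the simulated "teach" and "repeat" point sets stand in for features extracted from two lidar intensity images.

```python
import numpy as np

def estimate_rigid_transform(pts_a, pts_b):
    """Estimate R, t such that pts_b ~= R @ pts_a + t (points are 2 x N),
    using the SVD-based closed-form least-squares solution."""
    ca = pts_a.mean(axis=1, keepdims=True)   # centroid of teach-pass points
    cb = pts_b.mean(axis=1, keepdims=True)   # centroid of repeat-pass points
    H = (pts_a - ca) @ (pts_b - cb).T        # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so det(R) = +1
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Simulated matched keypoints (hypothetical stand-in for intensity-image features)
rng = np.random.default_rng(0)
pts_teach = rng.uniform(-5.0, 5.0, size=(2, 20))
theta = 0.1
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([[0.5], [-0.2]])
pts_repeat = R_true @ pts_teach + t_true

R_est, t_est = estimate_rigid_transform(pts_teach, pts_repeat)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

In a real VT&R system the correspondences would come from descriptor matching with outlier rejection (e.g., RANSAC) rather than being given, and the estimate would be 3D; this sketch only shows the geometric core of the motion-estimation step.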
URI: http://hdl.handle.net/1807/31339
Appears in Collections: Master

Files in This Item:

File: McManus_Colin_A_201111_MASc_thesis.pdf
Size: 25.01 MB
Format: Adobe PDF

This item is licensed under a Creative Commons License.

Items in T-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
