Week #1: Introduction, Sensors & Actuators
Yewon Lee
MSc student
Computer Science, UofT
csc477-tas@cs.toronto.edu
Yasasa Abeysirigoonawardena
MSc student
Computer Science, UofT
Radian Gondokaryono
PhD student
Computer Science, UofT
Mission: create algorithms that enable robots to learn to act intelligently in outdoor environments and alongside humans
Manually-controlled inspection robots
Remote-controlled cleaning robot at Fukushima Daiichi, 2011
Remote-controlled cleaning robot at Chernobyl, 1986
da Vinci robot-assisted surgery
farmbot.io
BP Deepwater Horizon Spill, Gulf of Mexico, 2010
Q: When is full or partial autonomy necessary?
Q: When is remote control preferred?
Required: CSC209H5; STA256H5; MAT223H5/MAT240H5; MAT232H5; CSC376
Recommended: MAT224H5; CSC384H5; CSC311H5
~80% coding and the rest theory
Starter code will be provided
Bonus questions will be provided
Accepted languages: Python, C++
You’re going to learn ROS (Robot Operating System) and use the Gazebo simulator
You’re also going to learn numpy and scipy
About 2 weeks to work on each
5-10 mins to complete them
Not cumulative: each quiz covers only one lecture's material
Meant to check whether you have understood basic concepts
CSC477
4 assignments, 15% each = 60%
7 quizzes, 2% each = 14%
1 final exam = 26%
CSC2630
3 assignments, 15% each = 45%
7 quizzes, 2% each = 14%
1 final project = 41%
https://www.udacity.com/course/artificial-intelligence-for-robotics--cs373
https://www.edx.org/course/autonomous-mobile-robots-ethx-amrx-1
https://underactuated.mit.edu/ (more advanced, little overlap with 477)
Use Quercus
Please check your course-related email frequently
Email us at csc477-instructor@cs.toronto.edu and csc477-tas@cs.toronto.edu
Anonymous feedback about anything course-related: https://www.surveymonkey.com/r/H8QH65F
Main question: what is the next state given the current state and controls?
Main question: what are the controls that will take the system from state A to B?
https://www.youtube.com/watch?v=_9VcvGybsDA
Intro to the Robot Operating System (ROS)
Refresher on linear algebra and least squares
Refresher on basic probability and continuous distributions
How to align 3D pointclouds. Demo of the PCL library
How to implement a Kalman Filter
How to implement a Particle Filter
How to approximate functions
A1: Designing a feedback controller for wall-following
A2: Implementing path-planning and feedback control algorithms
A3: Occupancy grid mapping with known robot location
A4: Localization in a known map using particle filters
Devices that can sense and measure physical properties of the environment.
Key phenomenon is transduction (conversion of energy from one form to another), e.g. light → voltage in a camera sensor
Measurements are noisy and difficult to interpret
CCD (charge-coupled device) imaging sensors:
voltage → analog-to-digital converter → pixel value in {0, …, 255}
CMOS (complementary metal-oxide semi-conductor) imaging sensors:
Shutter = mechanism that allows light to hit the imaging sensor
Shutter “speed” = Exposure time = time duration in which the sensor is exposed to light
Each pixel contains an intensity value from 0…255
\(\to\) A matrix of 600 × 1000 × 3 ≈ 1.8 million numbers
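A minimal numpy sketch of this representation (the array shape follows the slide; the values are placeholders):

```python
import numpy as np

# A 600 x 1000 RGB image: one 8-bit intensity per pixel per channel.
image = np.zeros((600, 1000, 3), dtype=np.uint8)

print(image.size)  # 1800000 numbers, i.e. ~1.8 million
```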
I’m seeing a parrot
I’m seeing a toy bicycle
The parrot is riding the bicycle
The bicycle is on top of a desk
Is this physically plausible?
Where is the parrot in 3D w.r.t. the camera?
Where will the parrot go next?
What is the speed of the parrot?
Conclusions/Inference/Deduction/Estimation
We know approximately how a 3D point (X,Y,Z) projects to pixel (x,y)
We call this the pinhole projection model
By similar triangles: x/f = X/Z
So, x = f * X/Z and similarly y = f * Y/Z
Problem: we just lost depth (Z) information by doing this projection, i.e. depth is now uncertain.
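A minimal sketch of this projection (the focal length value is an arbitrary assumption), showing how two points on the same ray map to the same pixel:

```python
def project_pinhole(point_3d, f):
    """Project a 3D point (X, Y, Z) in camera coordinates to pixel (x, y)."""
    X, Y, Z = point_3d
    assert Z > 0, "point must be in front of the camera"
    # By similar triangles: x = f * X / Z, y = f * Y / Z.
    return f * X / Z, f * Y / Z

# Two points along the same ray project to the same pixel: depth is lost.
print(project_pinhole((1.0, 2.0, 5.0), f=500.0))   # (100.0, 200.0)
print(project_pinhole((2.0, 4.0, 10.0), f=500.0))  # (100.0, 200.0)
```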
Unlike the pinhole camera, the lens model is able to model blur.
Drawback: Doesn’t work underwater
Advantages:
Enabled a wave of research, applications, and video games based on real-time skeleton tracking
Despite their drawbacks, RGBD sensors have been used extensively in robotics.
Produces a pointcloud of 3D points and intensities
Works by measuring the time-of-flight of each laser beam, i.e. how long it takes to return to the scanner (see the sketch after this list)
Not very robust to adverse weather conditions: rain, snow, smoke, fog, etc.
Used in most self-driving cars today for obstacle detection. Range < 100 m.
Produces a scan of 2D points and intensities
Certain surfaces are problematic for LIDAR: e.g. glass
Lots of moving parts: motors rapidly rotate the laser beam, and once a full sweep (the angle bound) is complete, a scan is returned. Strictly speaking, the points in a scan are not time-synchronized, even though we usually treat them as such.
Usually around 1024 points in a single scan.
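A minimal numpy sketch (the angle range and ranges are hypothetical) of turning a 2D scan into Cartesian points:

```python
import numpy as np

def scan_to_points(ranges, angle_min, angle_max):
    """Convert a 2D LIDAR scan (one range per beam angle) to (x, y) points.

    Each range comes from time-of-flight: d = c * t_round_trip / 2.
    """
    angles = np.linspace(angle_min, angle_max, len(ranges))
    return np.stack([ranges * np.cos(angles),
                     ranges * np.sin(angles)], axis=1)

# Hypothetical 1024-beam scan spanning 270 degrees, all returns at 2 m.
ranges = np.full(1024, 2.0)
points = scan_to_points(ranges, np.deg2rad(-135.0), np.deg2rad(135.0))
print(points.shape)  # (1024, 2)
```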
Gyroscopes, Accelerometers, Magnetometers
Inertial Measurement Unit (IMU)
Perhaps the most important sensor for 3D navigation, along with the GPS
Without IMUs, plane autopilots would be much harder, if not impossible, to build
Measure angular velocity in the body frame
Often affected by noise and bias
\[ \omega_\text{measured}(t) = \omega_\text{true}(t) + b_g(t) + n_g(t) \]
Measure linear acceleration relative to freefall (measured in g)
A free-falling accelerometer in a vacuum would measure zero g
An accelerometer resting on the surface of the earth would measure 1g
Also affected by bias and noise.
Double integration to get position is very noisy. Errors grow quadratically with time.
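By analogy with the gyroscope model above, accelerometer measurements are often modeled (this specific form is an assumption, not from the slide) as
\[ a_\text{measured}(t) = a_\text{true}(t) + b_a(t) + n_a(t) \]
where \(a_\text{true}\) is the specific force (acceleration relative to freefall). A minimal numpy sketch (the bias value and rate are assumed) of the quadratic error growth: a constant bias \(b\) double-integrates to \(\frac{1}{2} b t^2\):

```python
import numpy as np

dt = 0.01                      # 100 Hz IMU (assumed rate)
t = np.arange(0.0, 60.0, dt)   # one minute of measurements
bias = 0.05                    # constant accelerometer bias in m/s^2 (assumed)

accel_error = np.full_like(t, bias)
vel_error = np.cumsum(accel_error) * dt  # first integration: grows linearly
pos_error = np.cumsum(vel_error) * dt    # second integration: grows quadratically

print(pos_error[-1])  # ~0.5 * 0.05 * 60**2 = 90 m of drift after one minute
```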
Combines measurements from the accelerometer, gyroscope, and magnetometer to output an estimate of orientation with reduced drift (see the sketch after this list)
Does not typically provide a position estimate, due to drift from double integration.
Runs at 100–1000 Hz
Expect yaw drift of 5-10 deg/hour on most modern low-end IMUs
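One standard way to do this fusion is a complementary filter; a minimal single-axis sketch (the blending weight alpha and the interface are assumptions, not from the slides):

```python
import numpy as np

def complementary_filter(theta, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """One update of a single-axis (tilt) complementary filter.

    theta:            current tilt estimate (rad)
    gyro_rate:        angular velocity about this axis (rad/s)
    accel_y, accel_z: gravity components sensed when roughly static (m/s^2)
    """
    theta_gyro = theta + gyro_rate * dt          # accurate short-term, drifts long-term
    theta_accel = np.arctan2(accel_y, accel_z)   # drift-free but noisy tilt from gravity
    # Trust the gyro at high frequencies, the accelerometer at low frequencies.
    return alpha * theta_gyro + (1 - alpha) * theta_accel
```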
Each GPS satellite periodically transmits:
[Coarse/Acquisition code] A 1023-bit pseudorandom binary sequence (PRN code), which repeats every 1 ms, unique for each satellite (low cross-correlation with the other satellites' codes).
[Navigation frame] A 1500-bit packet that contains the satellite's orbit parameters (ephemeris), clock corrections, and constellation almanac/health data.
[Precision code] A 6.2-terabit code for military use.
Carrier frequencies are 1575.42 MHz (L1) and 1227.60 MHz (L2)
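For intuition: by correlating against each PRN code, the receiver measures pseudoranges (true range plus a shared receiver clock offset) to at least four satellites whose positions are known from the navigation frame, then solves for position and clock offset. A minimal Gauss-Newton sketch with made-up satellite positions and noise-free measurements:

```python
import numpy as np

def solve_position(sats, pseudoranges, iters=10):
    """Estimate receiver position and clock offset from >= 4 pseudoranges.

    sats:         (N, 3) known satellite positions (m)
    pseudoranges: (N,) measured ranges, each corrupted by the same
                  receiver clock offset b (expressed in meters)
    """
    x = np.zeros(4)  # [px, py, pz, b], initialized at the origin
    for _ in range(iters):
        diff = x[:3] - sats                      # (N, 3)
        dist = np.linalg.norm(diff, axis=1)      # (N,)
        residual = dist + x[3] - pseudoranges    # (N,)
        # Jacobian: unit vectors toward the receiver, plus a column for b.
        J = np.hstack([diff / dist[:, None], np.ones((len(sats), 1))])
        x -= np.linalg.lstsq(J, residual, rcond=None)[0]
    return x

# Made-up geometry: four satellites at roughly 20,000 km.
sats = np.array([[2.0e7, 0, 0], [0, 2.0e7, 0],
                 [0, 0, 2.0e7], [1.5e7, 1.5e7, 1.0e7]])
true_pos, true_bias = np.array([1e6, 2e6, 3e6]), 1e4
rho = np.linalg.norm(sats - true_pos, axis=1) + true_bias
print(solve_position(sats, rho))  # ~[1e6, 2e6, 3e6, 1e4]
```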
Contains an analog-to-digital converter for encoding the angle of a shaft/motor/axle
Usually outputs the discretized absolute angle of the shaft/motor/axle
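A minimal sketch (the resolution is a hypothetical value) of decoding an absolute encoder reading:

```python
import numpy as np

COUNTS_PER_REV = 4096  # hypothetical 12-bit absolute encoder

def counts_to_angle(count):
    """Convert a raw absolute-encoder count to a shaft angle in radians."""
    return 2.0 * np.pi * (count % COUNTS_PER_REV) / COUNTS_PER_REV

print(counts_to_angle(1024))  # quarter turn: pi/2 ~= 1.5708 rad
```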
They turn continuously at high RPM (revolutions per minute) when voltage is applied. Used in quadrotors, planes, model cars, etc.
Usually includes: DC motor, gears, control circuit, position feedback
Precise control without free rotation (e.g. robot arms, boat rudders)
Limited turning range: 180 degrees
Position feedback, so positioning errors do not accumulate.
Rotates by a predefined step angle.
Requires external control circuit.
Precise control without free rotation.
Constant holding torque while at a standstill (good for robot arms or weight-carrying systems).
Used for creating analog/continuous behavior when the applied voltage is discrete (on/off)
Main idea: switch the motor on and off fast enough that the average voltage equals the desired target (see the sketch after this list)
Used in dimming LEDs, controlling the speed of DC motors, controlling the position of servo motors.
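A minimal numpy sketch of the averaging idea (the supply voltage and duty cycle are assumed values):

```python
import numpy as np

V_SUPPLY = 5.0     # supply voltage in volts (assumed)
duty_cycle = 0.6   # fraction of each period the switch is on

# One PWM period, finely sampled: on for 60% of the time, off otherwise.
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
v = np.where(t < duty_cycle, V_SUPPLY, 0.0)

print(v.mean())  # 3.0 V: the average voltage the load "sees"
```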