Objective

Get a feel for Bayes-filter approximations of the robot's position, and map out a space with the robot



Materials Used



Procedure

Grid Localization using Bayes Filter


The intention of this part of the lab was to write pseudocode implementing the basic Bayes filter steps based on the odometry of the robot.
I began by running all of the pre-written code, which moved the robot in a circle around an object while mapping it on screen; as in last week's lab, the result was very inaccurate without probability-based estimation.
I then went back through the probability lectures to re-familiarize myself with the equations and algorithms behind odometry-based motion models.
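Concretely, the odometry motion model splits the motion between a previous pose (x, y, θ) and a current pose (x', y', θ') into an initial rotation, a translation, and a final rotation:

    \delta_{trans} = \sqrt{(x' - x)^2 + (y' - y)^2}
    \delta_{rot1}  = \mathrm{atan2}(y' - y,\ x' - x) - \theta
    \delta_{rot2}  = \theta' - \theta - \delta_{rot1}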

import numpy as np

# In world coordinates
def compute_control(cur_pose, prev_pose):
    """ Given the current and previous odometry poses, this function extracts
    the control information based on the odometry motion model.

    Args:
        cur_pose  ([Pose]): Current Pose
        prev_pose ([Pose]): Previous Pose 

    Returns:
        [delta_rot_1]: Rotation 1  (degrees)
        [delta_trans]: Translation (meters)
        [delta_rot_2]: Rotation 2  (degrees)
    """
    
    # Pose deltas from the odometry readings
    distX     = cur_pose[0] - prev_pose[0]
    distY     = cur_pose[1] - prev_pose[1]
    prevTheta = prev_pose[2]
    curTheta  = cur_pose[2]

    # Motion model decomposition from lecture 11; angles kept in degrees
    delta_trans = np.sqrt(distX**2 + distY**2)
    delta_rot_1 = np.degrees(np.arctan2(distY, distX)) - prevTheta
    delta_rot_2 = curTheta - prevTheta - delta_rot_1

    return delta_rot_1, delta_trans, delta_rot_2
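As a quick sanity check of the sketch above (assuming poses are (x, y, theta) tuples with theta in degrees), a pure one-meter translation along x should yield zero rotation on both ends:

    # expect (0.0, 1.0, 0.0): no initial rotation, 1 m translation, no final rotation
    print(compute_control((1.0, 0.0, 0.0), (0.0, 0.0, 0.0)))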

# In world coordinates
def odom_motion_model(cur_pose, prev_pose, u):
    """ Odometry Motion Model

    Args:
        cur_pose  ([Pose]): Current Pose
        prev_pose ([Pose]): Previous Pose
        u (rot1, trans, rot2) (float, float, float): A tuple with control data in the
                                                     format (rot1, trans, rot2) with units (degrees, meters, degrees)


    Returns:
        prob [float]: Probability p(x'|x, u)
    """
    
    # Recover the control that would explain moving from prev_pose to cur_pose
    delta_rot_1, delta_trans, delta_rot_2 = compute_control(cur_pose, prev_pose)

    # Probability of each component of the actual control u, assuming independent
    # Gaussian noise on the two rotations and the translation
    prob1 = loc.gaussian(delta_rot_1, u[0], loc.odom_rot_sigma)
    prob2 = loc.gaussian(delta_trans, u[1], loc.odom_trans_sigma)
    prob3 = loc.gaussian(delta_rot_2, u[2], loc.odom_rot_sigma)

    prob = prob1 * prob2 * prob3
    
    return prob

def prediction_step(cur_odom, prev_odom):
    """ Prediction step of the Bayes Filter.
    Update the probabilities in loc.bel_bar based on loc.bel from the previous time step and the odometry motion model.

    Args:
        cur_odom  ([Pose]): Current Pose
        prev_odom ([Pose]): Previous Pose
    """
    # Control that the odometry says was just executed
    u = compute_control(cur_odom, prev_odom)

    # For each current cell, sum the belief flowing in from every previous cell,
    # weighted by the motion model evaluated at the two cell-center poses
    # (loc.mapper.from_map is assumed to convert grid indices to a pose)
    for i in range(loc.MAX_CELLS_X):
        for j in range(loc.MAX_CELLS_Y):
            for k in range(loc.MAX_CELLS_A):
                cur_pose = loc.mapper.from_map(i, j, k)
                bel_bar = 0
                for x in range(loc.MAX_CELLS_X):
                    for y in range(loc.MAX_CELLS_Y):
                        for a in range(loc.MAX_CELLS_A):
                            prev_pose = loc.mapper.from_map(x, y, a)
                            bel_bar += loc.bel[x][y][a] * odom_motion_model(cur_pose, prev_pose, u)
                loc.bel_bar[i][j][k] = bel_bar
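Note that this prediction is O(N²) in the number of grid cells, since every (previous, current) cell pair is visited; a common optimization is to skip previous cells whose belief is below a small threshold, since they contribute almost nothing to the sum.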
 

def sensor_model(obs, u):
    """ This is the equivalent of p(z|x).


    Args:
        obs ([ndarray]): A 1D array consisting of the measurements made in the rotation loop
        u   ([ndarray]): A 1D array of the expected measurements at the current cell (same length as obs)

    Returns:
        [ndarray]: Returns a 1D array of size 18 (=loc.OBS_PER_CELL) with the likelihood of each individual measurement
    """
    
    prob_array = [0] * loc.OBS_PER_CELL

    # Likelihood of each measurement, assuming independent Gaussian sensor
    # noise around the expected measurement for this cell
    for i in range(loc.OBS_PER_CELL):
        prob_array[i] = loc.gaussian(obs[i], u[i], loc.sensor_sigma)

    return prob_array

def update_step(eta, obs, u):
    """ Update step of the Bayes Filter.
    Update the probabilities in loc.bel based on loc.bel_bar and the sensor model.
    """
    
    # Joint measurement likelihood is the product of the individual likelihoods;
    # eta is the normalizer that makes the updated belief sum to one
    loc.bel = eta * np.prod(sensor_model(obs, u)) * loc.bel_bar
    


      

Mapping


This part of the lab was especially challenging due to struggles with Bluetooth. I had been having Bluetooth issues since the PID lab, which is why I had chosen a cabled approach there, but I decided I needed to figure it out sooner or later.
After many hours of debugging and finally a call with Vivek, I discovered that Windows was holding on to control of the Bluetooth module, so I needed to disable it and delete the drivers so the VM could take control. Finally, I could send data over Bluetooth.
I added both Bluetooth and TOF functionality to my PID loop from the last lab, integrating my work from the Bluetooth and TOF labs to send distance and IMU data over Bluetooth to the VM.
Once I got this working, I tried running the robot on my desk and plotting the resulting polar coordinates, which looked about like what I expected. The plot below uses a log radial scale because the data was dominated by readings taken while the robot faced the open room, which maxed out the sensor.
[Figure: polar plot of the ToF scan from my desk, log radial scale]
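For reference, a minimal sketch of how such a plot can be generated, assuming the log holds one whitespace-separated `theta_degrees range_mm` pair per line (the same format the MATLAB script below reads, with its room1.txt filename):

    import numpy as np
    import matplotlib.pyplot as plt

    # hypothetical log file: one "theta_degrees range_mm" pair per line
    data = np.loadtxt('room1.txt')
    theta = np.radians(data[:, 0])
    r = data[:, 1]

    ax = plt.subplot(projection='polar')
    ax.set_rscale('symlog')  # log radial scale so maxed-out readings don't swamp the rest
    ax.scatter(theta, r, s=4)
    plt.show()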
Once I had done this, I moved on to cleaning and setting up my room. Although I initially intended to use the whole room, I found that it was larger than 4 m x 4 m and would be difficult to partition off, so I set up a small corner of the room and placed some boxes and bags in it as obstacles.
[Figure: the corner of my room set up with boxes and bags as obstacles]
I measured out the full space and took measurements from two different points in it so that the robot could see around the obstacles.
[Figure: measured layout of the space with the two scan points marked]
I took two measurements each from the points indicated in the plot above. I then used the rotation-about-z transformation matrix from all the way back in lecture 2 to convert my rotational data into X and Y coordinates.
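For a range reading r taken at heading θ, this rotation about z amounts to converting polar to Cartesian coordinates:

    \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
    =
    \begin{bmatrix}
        \cos\theta & -\sin\theta & 0 \\
        \sin\theta & \cos\theta  & 0 \\
        0          & 0           & 1
    \end{bmatrix}
    \begin{bmatrix} r \\ 0 \\ 1 \end{bmatrix}
    =
    \begin{bmatrix} r\cos\theta \\ r\sin\theta \\ 1 \end{bmatrix}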
Rather than tracking the robot's absolute position, I decided to just subtract an offset for where I had started the robot when calculating x and y; in the script below that start point corresponds to 550 mm in x and 370 mm in y.
    % Load logged (theta, r) pairs: angle in degrees, range in mm
    A = load('room1.txt');

    theta         = A(:,1);
    theta_radians = deg2rad(theta);
    r             = A(:,2);

    x_data = zeros(1, length(r));
    y_data = zeros(1, length(r));

    % Rotate each range reading about z, then shift by the robot's start offset
    for i = 1:length(r)
        r_matrix = [r(i); 0; 1];
        rotation_matrix = [cos(theta_radians(i)) -sin(theta_radians(i)) 0;
                           sin(theta_radians(i))  cos(theta_radians(i)) 0;
                           0                      0                     1];
        new_matrix = rotation_matrix*r_matrix;
        x_data(i) = new_matrix(1) - 550;
        y_data(i) = new_matrix(2) - 370;
    end


    
After running this four times, twice from each point, I obtained the following map.
[Figure: combined Cartesian map from all four scans]
The data is fairly consistent. The primary source of error is the robot not being able to see around obstacles, so each obstacle projects a triangular shadow into the space behind it. I overlaid the data with a map of what the room actually looks like, and it's not bad.
[Figure: scan data overlaid on the measured room layout]
I created line segments based on this map to feed into the simulator, as shown below.
    import numpy as np

    # Start points for each line segment describing the map
    start_points = np.array([[-1.85, -.79],
                             [.2, -.79],
                             [-1.85, -.79],
                             [-1.85, 0],
                             [-1.85, -.18],
                             [-1.3, -.18],
                             [-1.1, -.79], 
                             [-.7, -.79],
                             [-1.1, -.6],
                             [-.5, -.1], 
                             [-.5, -.1]])

    # End points for each line segment describing the map
    end_points = np.array([[.2, -.79],
                           [.2, 0],
                           [-1.85, 0],
                           [.2, 0],
                           [-1.3, -.18],
                           [-1.3, 0],
                           [-1.1, -.6], 
                           [-.7, -.6],
                           [-.7, -.6],
                           [-.5, 0], 
                           [.2, -.1]])
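
To sanity-check the segments before loading them into the simulator, a short sketch like the following (assuming the start_points and end_points arrays above) draws the map:

    import matplotlib.pyplot as plt

    # Draw each wall segment; equal axis scaling keeps the room proportions
    for (x0, y0), (x1, y1) in zip(start_points, end_points):
        plt.plot([x0, x1], [y0, y1], 'k-')
    plt.axis('equal')
    plt.show()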


    

[Figure: the line-segment map rendered in the simulator]