Lab 3

Bits and Bytes of Information

Wanna know more?

Objective

After building circuits to detect IR and audio signals, working with sensors to sense walls, and following the lines of a maze, this lab was meant to put all of those moving parts together and display that interaction on a virtual representation of the maze.

Procedure

This lab required two subgroups: radio and robot. Orby and Caitlin worked with the radio, and Luke and Labhansh worked with the robot.

Let's Break it Down

Audio Detection

Starting at the Sound of the Bell!

While integrating 660 Hz detection with our existing system, we hit an interference problem: the IR wall sensors inject noise onto the DC supply. Because the op-amp and microphone are powered by the same 5 V coming from the Arduino, this noise made 660 Hz hard to detect and occasionally caused false detections. We tried many fixes, including adding capacitors to filter the noise, but nothing worked. Even the TAs gave up! So we considered two work-arounds: move the wall sensors onto their own power supply, or keep the shared rail and avoid the weight of an extra battery.

We initially chose the second option to save weight, but the wall-sensor values kept varying wildly because of the noise on the Vdd line. So in the end we separated the wall sensors onto a different power source, and everything started working fine.

Another issue was the short supply of analog pins. This was an easy fix. Since we had a spare input on our mux (the one used for the three wall sensors), we simply wired the audio signal to the fourth input (the two selector bits cover the three wall sensors plus the audio). Then we selected the correct input using the selector bits inside the audio function, like so:

bool audio_detected() {
  // Save the timer/ADC registers that the FFT sampling clobbers
  // (the old_* variables are globals declared elsewhere in the sketch)
  old_TIMSK0 = TIMSK0;
  old_ADCSRA = ADCSRA;
  old_ADMUX = ADMUX;
  old_DIDR0 = DIDR0;

  bool ret = false;

  // Drive both selector bits HIGH to pick the audio input on the mux
  digitalWrite(12, HIGH);
  digitalWrite(13, HIGH);

  cli();                                // no interrupts while sampling
  for (int i = 0 ; i < 512 ; i += 2) {
    fft_input[i] = analogRead(A1);      // <-- NOTE THIS LINE
    fft_input[i+1] = 0;                 // imaginary part is zero
  }
  fft_window();
  fft_reorder();
  fft_run();
  fft_mag_log();
  sei();                                // re-enable interrupts

  // 660 Hz lands around bins 19-21 at our sampling rate
  if (fft_log_out[19] >= 45 || fft_log_out[20] >= 45 || fft_log_out[21] >= 45)
    ret = true;

  // Restore old values
  TIMSK0 = old_TIMSK0;
  ADCSRA = old_ADCSRA;
  ADMUX = old_ADMUX;
  DIDR0 = old_DIDR0;

  return ret;
}
        

Notice how we save and restore several registers, just as in the robotDetected() (optical detection) function. Also notice the two digitalWrite lines, which select the mux input for audio. This is essentially how we incorporated the audio.

Our State Machine

Welcome to the United States of Arduino

We had to make some changes to our state machine and the functions it uses. For example, we have two functions, goToIntersection() and followLine(). goToIntersection() starts at one intersection and drives to the next one, calling followLine() in its loop, like so:

void goToIntersection() {
  for (int i = 0 ; i < 1000 ; i++) {                      // Start following the line
    followLine();
  }
  while (!(isWhite(backLeft) && isWhite(backRight))) {    // While not at an intersection
    followLine();
  }
  // Stop at the intersection
  stopRight();
  stopLeft();
  delay(500);
}
        

We edited followLine() to NOT include a loop of its own; it is now simply a series of checks (conditionals) that monitor the line sensors and keep the robot on the line. After this, we updated our state machine like so:

state = LISTEN_FOR_START;
while (1) {
  switch (state) {
    case LISTEN_FOR_START: {
      if (audio_detected()) {
        pinMode(7, OUTPUT);
        digitalWrite(7, HIGH);
        state = FOLLOW_LINE;
        //Serial.println("We got 660");
      }
      break;
    }
    case FOLLOW_LINE: {
      goToIntersection();
      if (robotDetected()) {
        state = ROBOT_DETECTED;
      }
      else {
        state = FOLLOW_WALL;
      }
      break;
    }
    case ROBOT_DETECTED: {
      turnAround();
      state = FOLLOW_LINE;
      break;
    }
    case FOLLOW_WALL: {
      if (frontWall()) {
        if (rightWall()) {
          if (leftWall()) {   // F, R, L
            //Serial.println("turning around");
            turnAround();
          }
          else {              // F, R, no L
            //Serial.println("turning left");
            turnLeft();
          }
        }
        else {                // F, no R (L not considered)
          turnRight();
        }
      }
      else {                  // no F
        if (!rightWall()) {   // no F, no R
          turnRight();
        }
      }
      state = FOLLOW_LINE;
      break;
    }
  }
}
        

We added a case called LISTEN_FOR_START, which simply waits until audio_detected() returns true. Once we hear 660 Hz, we drive a digital output high on the mux select pin, which lets us read multiple sensors (the wall sensors and the audio input) without noise or incorrect data.

Also pay special attention to the FOLLOW_WALL case: its conditionals only decide which direction the robot should turn. After that decision, the machine always returns to the FOLLOW_LINE state, which calls goToIntersection().

Here is a video of Arnold starting upon hearing 660 Hz, successfully navigating a maze, AND detecting IR:

Recall that when Arnold detects IR at an intersection, he will execute a U-turn.

And here is a video of the same skills demonstrated in a different maze format!

GUI Interaction

Seeing is Believing

The GUI provided by the lab handout was relatively simple to use. After downloading the relevant files, all that was needed to run the GUI was a simple Python command: python main.py --rows --cols . Running this command opens a browser window with a GUI configured to the given numbers of rows and columns.

In order to update the interface, certain variables could be declared representing the current location of the robot, the presence of walls, the detection of a nearby robot, and the existence of treasures with specific shapes and colors. This data was passed along with integers for the x and y coordinates, as in the statement 2,3,west=false,north=true,south=false, which places the robot at coordinates (2,3) with a wall to the north and none to the west or south.

To start updating the GUI, the simplest and most effective check was to change the x and y coordinates of the virtual robot. Information from the base station is sent with a simple Serial.println statement, so we created global variables holding incrementing x and y values and printed them with a Serial.println statement inside the GettingStarted.ino code provided to run the radio. We then ran the Python command (after making sure the serial monitor was closed, since it would otherwise occupy the port the GUI reads from), and the GUI displayed. A video of this is below:

Because we are only allowed a restricted number of bits and have a lot of data to transmit, we came up with a compact data scheme: the x and y coordinates travel as their own bytes, and the walls and robot detection are packed into a single data byte, with bit 7 = north wall, bit 6 = east wall, bit 5 = south wall, bit 4 = west wall, and bit 3 = robot detected (this is the layout used by create_transmission below). The remaining three bits are reserved for treasure detection (shape and color).

To send this information between two radios connected to two separate boards, the roles of each radio had to be defined. There are two possible roles: transmitter, responsible for sending out data; and receiver, responsible for taking in data. The default role for all radios is to be a receiver, but the role could be switched by inputting ‘T’ or ‘R’ in the serial monitor to flip to transmitter or receiver, respectively.

The GettingStarted.ino code has, within the loop function, conditionals to handle what to do if the radio is in transmitter mode versus receiver mode. Both modes incorporate the movement of data, checking to make sure the transmission was sent/received, and sending an acknowledgement signal to the other radio.

The data transmitted between the two Arduinos concerns the location, orientation, and spatial awareness of the robot, as described above. Some of these variables, specifically the coordinates, cardinal direction, and surrounding walls, needed to be updated and maintained as the robot moved. We created a helper function called create_transmission to build the data byte for the receiver using the scheme above. This helper function is shown below:

byte create_transmission(bool north=false, bool east=false, bool south=false,
                         bool west=false, bool robot_detected=false) {
  byte result = 0b00000000;
  byte tmp = 0b00000000;
  if (north) {
    tmp = 1 << 7;
    result = result | tmp;
  }
  if (east) {
    tmp = 1 << 6;
    result = result | tmp;
  }
  if (south) {
    tmp = 1 << 5;
    result = result | tmp;
  }
  if (west) {
    tmp = 1 << 4;
    result = result | tmp;
  }
  if (robot_detected) {
    tmp = 1 << 3;
    result = result | tmp;
  }
  return result;
}
      

At first, to check that transmissions from the transmitter radio were appropriately received by the base station, we hardcoded values for x, y, and data to be sent. The video below shows the GUI responding to these new values and displaying the robot at a new place in the maze:

In order to simulate robot movement, we created a counter variable and three arrays: data_array, x_array, and y_array.

We then stored values in these arrays according to the movement we wanted to simulate. Within the loop function, if the radio was transmitting, we looped over the arrays, transmitting x, y, and data values (from their respective arrays) and incrementing the counter after each iteration. If the radio was receiving, we printed an API-acceptable string for the GUI. This was done with a helper function create_gui_string, which takes a 3-element array and uses a few conditionals to build the string.

String create_gui_string(byte data[3]) {
  String result = "";
  // adding x, y
  result = String(data[0]) + "," + String(data[1]);
  // decoding existence of walls and other robots
  byte tmp = 0b00000000;
  if ((data[2] & (1 << 7)) != tmp) {
    result = result + ",north=true";
  }
  if ((data[2] & (1 << 6)) != tmp) {
    result = result + ",east=true";
  }
  if ((data[2] & (1 << 5)) != tmp) {
    result = result + ",south=true";
  }
  if ((data[2] & (1 << 4)) != tmp) {
    result = result + ",west=true";
  }
  if ((data[2] & (1 << 3)) != tmp) {
    result = result + ",robot=true";
  }
  return result;
}
      

To mirror our robot's actual movements on the GUI, the following code was written and integrated into our main Arduino file:

The code snippets below outline the procedure used to appropriately update the values for these variables.

The definition of the global variables
//variables for GUI---used to determine location/direction of robot----
int x = 0;
int y = 0;
int dir = 0;
  //0 --> relative north
  //1 --> relative east
  //2 --> relative south
  //3 --> relative west
bool north = false; //existence of north wall (in front when facing north)
bool south = false; //existence of south wall (behind)
bool east = false;  //existence of east wall (to the right)
bool west = false;  //existence of west wall (to the left)
      
Functions to determine what direction the robot was pointing in, taking the initial direction as north
// Orienting Functions ----------------------------------------
// Used to determine coordinate location and cardinal direction of robot
void orient_right_turn() {
  if (dir < 3) {
    dir = dir + 1;
  } else if (dir == 3) {
    dir = 0;
  }
}

void orient_left_turn() {
  if (dir > 0) {
    dir = dir - 1;
  } else if (dir == 0) {
    dir = 3;
  }
}

void orient_turn_around() {
  if (dir == 0) {
    dir = 2;
  } else if (dir == 1) {
    dir = 3;
  } else if (dir == 2) {
    dir = 0;
  } else if (dir == 3) {
    dir = 1;
  }
}
      
Within the FOLLOW_WALL case, the sensors corresponding to each cardinal direction were mapped according to the robot's heading
case FOLLOW_WALL: {
  bool atIntersection = isWhite(backLeft) && isWhite(backRight);
  if (atIntersection) {
    if (dir == 0) {
      north = frontWall();
      east = rightWall();
      west = leftWall();
      south = false;
      y = y + 1;
    }
    if (dir == 1) {
      north = leftWall();
      east = frontWall();
      west = false;
      south = rightWall();
      x = x + 1;
    }
    if (dir == 2) {
      north = false;
      east = leftWall();
      west = rightWall();
      south = frontWall();
      y = y - 1;
    }
    if (dir == 3) {
      north = rightWall();
      east = false;
      west = frontWall();
      south = leftWall();
      x = x - 1;
    }
    create_transmission(x, y, data);
  }
      

Speedbumps

There were a few...

The problems started with hardware issues. The second Arduino we were given was not a genuine Arduino, and did not supply the 3.3 V the radio needed to operate reliably. Once this was spotted, the radio had to be hooked up to a direct 3.3 V power supply.

The easiest way to spot that a radio wasn’t working was to look at the addresses it displayed when the serial monitor was first opened. We hardcoded the message channels as 0x000000002CLL and 0x000000002DLL, so those were the values expected. However, there were issues where the address displayed would be 0x0000000000, even with the radios correctly powered and all of the pins on the radio, adaptor, and board probed for shorts and correct voltages.

The solution was often to re-upload the code to the board, open the serial monitor, and hope that the addresses came up correct. There seemed to be no pattern to when the radio would refuse to take the correct channel addresses (or the code at all), which led to hours spent trying to pinpoint the problem, only to have it work again after re-compiling the same files with no edits made.

After struggling with the radio chip for a while, we borrowed one from another team. We then discovered that, although the radio chip was faulty, our code was also problematic: for some reason it would send data between the Arduinos only twice before stopping, rather than continuing to transmit. The video below shows the serial-monitor output we saw when we ran into this issue.

To figure out the cause, we added println statements after every while loop and if-else statement. From the serial output we determined that one of the variables in a while loop's stopping condition was being incorrectly set inside the loop body. By the end of the lab session, we had not yet figured out why that assignment was a problem.