Connect 4 Robot

Background:

This project is a robot vs. human Connect 4 game where a person plays against the robot on a physical game board as if they were playing another person. A camera scans the board, and then motors move a dispenser along the top of the game board to drop in a piece for the robot. The human player then places their piece in a column, and the robot takes its turn after scanning the board again.

This project was a way for me to learn about robotics, OpenCV, motors, circuits, and coding. I thought it would be fun to make a robot I could play a game against and Connect 4 seemed like a simple game for my first robotics project.

Current Version of Project:

My project is currently a fully functional prototype: a human player can naturally start and finish a game of Connect 4 against the robot. The robot needs to be plugged into a computer to run. (This project uses a PocketBeagle, a small single-board Linux computer similar to a Raspberry Pi.)

Please see the video below to watch a full game between the robot and me:

 
 

Building the Project:

I came up with a series of steps to guide me through the process of creating this robot:

  1. Come up with the simplest design possible to drop a disk in one of seven columns

  2. Select electronic components & create a wiring diagram

  3. Test components

  4. Design a PCB

  5. Write game code to input sensor data and output motor controls

  6. Create mechanical design

  7. Assemble, test, & debug code

1. Preliminary Design:

I brainstormed a variety of mechanisms to drop a robot disk into a specified column on the board. I chose a design with a piece dropper mechanism attached to a linear slider on a ball screw that I would purchase (the bottom sketch in the image below). I did not want to "reinvent the wheel" with my own linear actuator design, so I chose to focus my efforts on the piece dropping mechanism, which was a more interesting design challenge for me.

For the robot to sense where a human player places a piece, I decided to use a camera so that I could gain experience using OpenCV and learn about computer vision. I am aware of other solutions such as beam break sensors, ultrasonic sensors, or even a way to weigh the pieces in each column.

2. Component Selection & Wiring:

With the design concept chosen, I selected the electrical components I would use in the robot.

  • Camera: board imaging for piece detection

  • Stepper motor: linear sliding rail on ball screw

  • Servo motor: piece dropping mechanism

  • LCD screen & button: user interface

  • 5V power supply: robot food

3. Testing Components:

The first component I tested was the 16x2 character LCD display. Wiring was very simple, but I had to install the adafruit_character_lcd.character_lcd Python 3 library to communicate with the display. A potentiometer controls the contrast of the display.
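
As a rough illustration of the setup, a minimal test script with the Adafruit CircuitPython character LCD library might look like this (the pin assignments below are placeholders, not my actual wiring):

# Minimal 16x2 LCD test using the Adafruit CircuitPython character LCD library.
# The PocketBeagle pin assignments below are placeholders; use whichever GPIO
# pins the display is actually wired to.
import board
import digitalio
import adafruit_character_lcd.character_lcd as character_lcd

lcd_rs = digitalio.DigitalInOut(board.P2_2)
lcd_en = digitalio.DigitalInOut(board.P2_4)
lcd_d4 = digitalio.DigitalInOut(board.P2_6)
lcd_d5 = digitalio.DigitalInOut(board.P2_8)
lcd_d6 = digitalio.DigitalInOut(board.P2_10)
lcd_d7 = digitalio.DigitalInOut(board.P2_18)

lcd = character_lcd.Character_LCD_Mono(lcd_rs, lcd_en, lcd_d4, lcd_d5,
                                       lcd_d6, lcd_d7, 16, 2)
lcd.clear()
lcd.message = "Connect 4 Robot\nYour move!"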

Next I worked on connecting the stepper motor. The stepper motor will move the piece dropper along the top of the Connect 4 board to the correct column via a linear sliding rail. I used a stepper motor driver to control the stepper motor more easily. The driver requires 5V control signals, but the PocketBeagle can only output 3.3V, so I used a logic level converter to shift the signals from 3.3V to 5V. I powered the stepper motor from an external 5V, 2A power supply. Wiring was simple, and I found some test code to use from https://github.com/petebachant/BBpystepper.
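
Going from memory of the BBpystepper README, the quick spin test looked something like this (treat the class and method names as assumptions and check the repository before using them):

# Quick stepper spin test based on the BBpystepper project; verify the exact
# API against the repository README before relying on it.
from bbpystepper import Stepper

stepper = Stepper()        # uses the library's default GPIO pins for the driver
stepper.rotate(360, 10)    # one full revolution at 10 RPM
stepper.rotate(-360, 10)   # and back the other way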

Next I connected the 360 degree continuous rotation servo motor. The servo motor will be used in the piece dropper to drop one piece into the correct column (in hindsight, choosing a continuous rotation servo was a mistake since it has poor positional accuracy and control). The servo also required a 5V power supply, and I controlled it through one of the PocketBeagle's PWM pins.
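
For reference, driving a continuous rotation servo from a PocketBeagle PWM pin with the Adafruit_BBIO library looks roughly like this; the pin name and duty cycle values are placeholder assumptions, and on a continuous rotation servo the pulse width sets speed and direction rather than position:

# Rough continuous-rotation servo test using Adafruit_BBIO.PWM.
# "P1_36" is a placeholder PWM-capable pin; adjust to match the actual wiring.
import time
import Adafruit_BBIO.PWM as PWM

SERVO_PIN = "P1_36"
PWM.start(SERVO_PIN, 7.5, 50)        # 50 Hz, ~1.5 ms pulse: roughly neutral (stopped)
time.sleep(1)
PWM.set_duty_cycle(SERVO_PIN, 10.0)  # longer pulse: spin in one direction
time.sleep(0.5)
PWM.set_duty_cycle(SERVO_PIN, 7.5)   # back to neutral to stop
PWM.stop(SERVO_PIN)
PWM.cleanup()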

The last piece of hardware to connect was the USB camera. Connecting the camera required joining the VB and VI pins as well as the ID and GND pins (I simply soldered them together). To test the camera, I used some basic code in the terminal:

python3 
>>> import cv2 
>>> cap = cv2.VideoCapture(0) 
>>> ret, frame = cap.read() 
>>> cv2.imwrite("temp.jpg", frame)

4. PCB Design:

I designed a custom printed circuit board (PCB) for this project to house all of the electronic components. I did not end up ordering the PCB due to cost and time constraints. See below for the schematic, board layout, and a sample Gerber file of the bottom of the board:

5. Writing the Game Code:

I based my project code on Keith Galli's virtual Connect 4 game: https://github.com/KeithGalli/Connect4-Python. I used the Connect 4 AI Keith wrote with the minimax algorithm as the robot's brains. I removed all of the visual pygame elements and made the game text based instead.

Next, I wrote the OpenCV code for the bot to detect where the human player places a piece. This was the most challenging part of the project. I started by writing code that worked on a stock image of a Connect 4 board and then modified it to work with photos from my USB camera. I put a green background behind the game board to make open spaces easier to detect. The program takes a picture of the board and outputs a 6x7 array (the same representation the main game code uses) with the value of each space on the board. I used the OpenCV function HoughCircles to detect all circles in the image and then looked at the color of the pixels at the center of each circle. This produces an array of circles and their colors, which I sort by their y and x positions in the image to build the 6x7 array. To the right is a photo of the code detecting the red circles on an image.
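
Here is a simplified sketch of that detection approach; the Hough parameters, radii, and color thresholds are illustrative assumptions rather than my tuned values:

# Simplified sketch of the board-detection approach: find circles with HoughCircles,
# sample the color at each circle's center, and sort the circles into a 6x7 grid.
# All numeric parameters are illustrative, not tuned values.
import cv2
import numpy as np

ROWS, COLS = 6, 7

def read_board(image_path):
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)

    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=30,
                               param1=100, param2=30, minRadius=10, maxRadius=40)
    if circles is None:
        return None
    circles = np.round(circles[0]).astype(int)

    board = np.zeros((ROWS, COLS), dtype=int)
    circles = sorted(circles, key=lambda c: c[1])                # top to bottom
    for row in range(ROWS):
        row_circles = sorted(circles[row * COLS:(row + 1) * COLS],
                             key=lambda c: c[0])                 # left to right
        for col, (x, y, r) in enumerate(row_circles):
            b, g, red = img[y, x]                                # BGR center pixel
            if red > 150 and g < 100:
                board[row][col] = 1                              # red piece
            elif red > 150 and g > 150:
                board[row][col] = 2                              # yellow piece
            # otherwise leave 0 for an empty (green background) space
    return board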

6. Mechanical Design:

At this point in the project I determined that using a pre-constructed linear actuator instead of designing my own would be best to save time and focus on the parts of the project I was more interested in. The image below shows a NEMA 23 stepper motor attached to a linear sliding rail. This solution allowed for extremely high positional accuracy as well as decent speed (about 1 inch/second travel speed). In hindsight, this motor is overkill for my application. (NOTE: This is a different motor than I used in the earlier testing phase of the project. This motor requires a minimum of 24V to operate and I used an adjustable external power supply for power and a compatible stepper motor driver that allowed me to control it with the output signals from the PocketBeagle.)

I wrote a script using the Adafruit_BBIO.GPIO Python 3 library to control the position of the piece dropper on the linear actuator. I had to test what range of speeds worked for this motor and ball screw, and I wrote code to smoothly accelerate the motor.
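
A stripped-down version of that positioning script might look like the following; the pin names, the steps-per-inch value, and the timing constants are placeholders rather than my actual calibration:

# Stripped-down step/direction control with a simple linear acceleration ramp.
# Pin names and the STEPS_PER_INCH calibration are placeholders.
import time
import Adafruit_BBIO.GPIO as GPIO

STEP_PIN = "P2_4"
DIR_PIN = "P2_6"
STEPS_PER_INCH = 1000          # depends on ball screw pitch and driver microstepping

GPIO.setup(STEP_PIN, GPIO.OUT)
GPIO.setup(DIR_PIN, GPIO.OUT)

def move_inches(distance):
    GPIO.output(DIR_PIN, GPIO.HIGH if distance > 0 else GPIO.LOW)
    steps = int(abs(distance) * STEPS_PER_INCH)
    max_delay, min_delay = 0.002, 0.0005   # start slow, ramp up to full speed
    ramp = min(steps // 2, 200)            # number of steps spent accelerating
    for i in range(steps):
        # Linearly shorten the delay during the ramp, then hold it constant.
        frac = min(i, ramp) / ramp if ramp else 1.0
        delay = max_delay - (max_delay - min_delay) * frac
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(delay / 2)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(delay / 2)

move_inches(3)      # e.g., move the dropper three inches toward a target column
GPIO.cleanup()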

Designing the Piece Dropper

I needed to come up with a way to store twenty-one Connect 4 pieces and release them reliably, one at a time, into the narrow slit at the top of the game board. A simple design I thought of was to stack all of the pieces in a tube and knock the bottom piece off with a rotating bat. I used a continuous rotation servo motor to rotate the bat, pushing a piece through a funnel and onto the board. After testing this mechanism, I realized that the continuous rotation servo did not have the positional accuracy I needed. About 90% of the time it rotates 180 degrees and drops exactly one piece, but occasionally it rotates too much or too little, dropping two pieces or none. In the future I plan to use a stepper motor for this application, or to add an encoder to the servo.

Designing the Structure

I wanted a structure that held the linear actuator above the game board, provided a space for the electronics, and positioned the camera and lights at the optimal location. I chose to make the structure out of plastic since it is easy to work with using the few hand tools I had at home (although choosing acrylic was a mistake since it is brittle). I bought a plastic base and mounted two thick plastic tubes toward the edges to hold up the linear actuator. I 3D printed mounts, tapped the acrylic, and used bolts to secure them. A second sheet of plastic and a second tube position the camera and light in front of the game board.

I designed 3D printed mounts for the linear actuator, limit switches, and the camera and light. I had to take very careful measurements using calipers in order for everything to fit together in the right positions.

Limit Switches

Limit switches are important for safety and for positional accuracy. My robot uses two limit switches, one on either side of the linear actuator, to detect when the piece dropper goes off the rails (pun intended). I also use the limit switches to start each game from a known position: the robot begins by driving until it hits a limit switch, then moves to a column by knowing the distance from that switch to the column.
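
In simplified form, the homing step might look like this; the pin names and timing are placeholders, and I am assuming the switch is wired with a pull-up so the input reads high until the switch is pressed:

# Simplified homing routine: step toward the limit switch until it closes,
# then treat that spot as the known reference position. Pins are placeholders.
import time
import Adafruit_BBIO.GPIO as GPIO

LIMIT_PIN = "P2_8"
STEP_PIN = "P2_4"
DIR_PIN = "P2_6"

GPIO.setup(LIMIT_PIN, GPIO.IN)   # assumes a pull-up in the switch wiring
GPIO.setup(STEP_PIN, GPIO.OUT)
GPIO.setup(DIR_PIN, GPIO.OUT)

def home():
    GPIO.output(DIR_PIN, GPIO.LOW)      # drive toward the limit switch
    while GPIO.input(LIMIT_PIN):        # reads 1 until the switch is pressed
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(0.001)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(0.001)
    # The dropper is now at a known position; column offsets are measured from here.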

7. Putting Everything Together (a.k.a. integration hell):

I ran into many issues when testing this project:

  1. I initially designed the piece funnel too shallow, which caused pieces to get stuck

  2. I had a bug in my limit switch code which resulted in the right limit switch breaking off after the robot moved too far to the right

  3. The 360 degree continuous rotating servo has poor positional accuracy

  4. The camera connectivity was sometimes unreliable

  5. The board-detection OpenCV algorithm occasionally detected non-existent game pieces

  6. The linear slider is overkill in terms of accuracy but also too slow

Next Steps and Improvements:

I had planned to improve the fully functioning robot by organizing wires, making small refinements, and solving the remaining mechanical and software issues. I also wanted to use cron to run my project automatically when the PocketBeagle boots, so the robot could be fully standalone.
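
For the cron idea, the usual approach is an @reboot entry in the crontab (edited with crontab -e), along the lines of "@reboot python3 /path/to/connect4.py", where the script path is a placeholder for the project's actual main script.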

(Previously) Planned Improvements List:

  • Make a wiring schematic so that people can more easily replicate my project

  • Use stepper motor for piece dropping mechanism (For positional accuracy)

  • Use a magnetic limit switch instead of a plunger limit switch (For greater accuracy. Plunger limit switch slightly bends plastic on contact so a magnetic limit switch would eliminate contact altogether.)

  • Add button and reposition screen to create a user interface

  • Create wire management system and an updated custom PCB

  • Improve the bot's algorithm to make it nearly unbeatable

  • Make the robot fully standalone so it can run without being connected to a computer

I did not end up making any of these improvements since I moved on to newer and more exciting projects such as the autonomous airplane.

Code From this Project:

https://github.com/SMSARVER/ENGI301/tree/main/connect4