Micro-Mouse using Image Processing in OpenCV


This video describes our robot, a micro-mouse. Unlike a conventional micro-mouse, where the maze is built from physical walls and the robot carries infra-red sensors, this robot was made for a maze made up of distinct colour lines; as shown by the purple line in the video, these lines serve as the walls. The robot has to generate its own path and exit from the other side of the maze. The track is detected by means of an overhead camera placed at a height of about 3 metres.

The video below shows the first test run of the robot on a simple maze.

Robot Specs:-

  • ATmega8535 microcontroller at a 16 MHz clock
  • USB webcam connected to a PC running OpenCV in Visual Studio 2010
  • A 2-wheeled robot chassis with 2 castor wheels
  • USART to communicate between the computer and the microcontroller
  • PWM to adjust the speed of the motors


The overhead camera (USB webcam) is connected to a computer running the code written in OpenCV. This code processes the image and generates a path for the robot, after which it dynamically sends the path to the robot via USART. After receiving the directions from the computer, the microcontroller actuates the motors accordingly.
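The USART link can be as simple as one byte per command. A minimal sketch of such a protocol (the command names and byte values below are hypothetical, not the ones our robot actually used):

```python
# Hypothetical single-byte command protocol for the PC -> microcontroller link.
# The actual byte values on the robot may differ.
COMMANDS = {
    "FORWARD": 0x46,  # 'F'
    "LEFT":    0x4C,  # 'L'
    "RIGHT":   0x52,  # 'R'
    "STOP":    0x53,  # 'S'
}

def encode_command(name):
    """Translate a direction name into the byte sent over USART."""
    return bytes([COMMANDS[name]])

# On the PC side this byte would then be written to the serial port,
# e.g. with pyserial: serial.Serial("COM3", 9600).write(encode_command("FORWARD"))
```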

Algorithm (in OpenCV):-

Colour Detection: We chose to use the HSV colourspace for the image. We then detected the colour by setting appropriate thresholds on the Hue and Saturation channels of the image while keeping a large range for the intensity (Value) channel. Because of this, the colour detection was largely independent of ambient-light variation, which was not the case with the BGR colour space.
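In OpenCV this step is typically a `cv2.cvtColor` to HSV followed by `cv2.inRange`. A minimal sketch of the thresholding logic itself, written in plain NumPy so the bounds are explicit (the purple range below is illustrative, not our tuned values):

```python
import numpy as np

def colour_mask(hsv, lo, hi):
    """Boolean mask of pixels whose (H, S, V) fall inside [lo, hi].

    hsv: array of shape (height, width, 3) using OpenCV's convention of
    H in [0, 180) and S, V in [0, 255]. Equivalent to cv2.inRange != 0.
    """
    lo = np.asarray(lo)
    hi = np.asarray(hi)
    return np.all((hsv >= lo) & (hsv <= hi), axis=-1)

# Illustrative purple range: narrow Hue/Saturation window, but a wide Value
# window so the detection tolerates ambient-light changes.
PURPLE_LO = (125, 80, 30)
PURPLE_HI = (155, 255, 255)

hsv = np.zeros((4, 4, 3), dtype=np.uint8)
hsv[1, 2] = (140, 200, 120)                    # one "wall" pixel
mask = colour_mask(hsv, PURPLE_LO, PURPLE_HI)  # True only at (1, 2)
```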

Path Generation: After finding the locations of the walls using colour detection, we divided the map into an M x N grid of cells to serve as nodes. Each cell is either connected to an adjacent cell (if there is no wall between the two cells) or not connected to it (if there is a wall between them). This gave us a graph of the maze, to which we applied Dijkstra's algorithm to find the shortest path for the robot to move along.
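The step above can be sketched as follows, with the maze represented as an adjacency list over grid cells (every move costs 1; the toy maze below is illustrative):

```python
import heapq

def dijkstra(neighbours, start, goal):
    """Shortest path on a grid graph.

    neighbours: dict mapping each cell (row, col) to the list of adjacent
    cells it is connected to (i.e. no wall in between). All moves cost 1.
    Returns the list of cells from start to goal, or None if unreachable.
    """
    dist = {start: 0}
    prev = {}
    heap = [(0, start)]
    while heap:
        d, cell = heapq.heappop(heap)
        if cell == goal:                      # reconstruct the path
            path = [goal]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return path[::-1]
        if d > dist[cell]:                    # stale heap entry
            continue
        for nxt in neighbours[cell]:
            nd = d + 1
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = cell
                heapq.heappush(heap, (nd, nxt))
    return None

# Toy 2 x 3 maze: a wall blocks the edge between cells (0,1) and (0,2).
neighbours = {
    (0, 0): [(0, 1), (1, 0)],
    (0, 1): [(0, 0), (1, 1)],
    (0, 2): [(1, 2)],
    (1, 0): [(0, 0), (1, 1)],
    (1, 1): [(1, 0), (0, 1), (1, 2)],
    (1, 2): [(1, 1), (0, 2)],
}
path = dijkstra(neighbours, (0, 0), (0, 2))   # detours around the wall
```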

Location and Orientation of the Robot: The the top part of the robot was made of 2 colours, say C1 and C2. We found the cordinates of centriods of both the colours ie.(x1,y1) and (x2,y2)
Hence the robots location is given by: ( (x1+x2)/2, (y1+y2)/2 )
and the orientation was found out by: arctan( (y2-y1)/(x2-x1) )
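In code, `atan2` is the safer form of the arctan above, since it still works when x2 == x1 and resolves which way the robot is facing. A minimal sketch:

```python
import math

def robot_pose(c1, c2):
    """Pose from the two coloured markers on top of the robot.

    c1, c2: centroids (x1, y1) and (x2, y2) of colours C1 and C2 in image
    coordinates. Returns ((x, y), theta): the midpoint of the two centroids
    and the heading in radians, measured from the image x-axis towards C2.
    """
    (x1, y1), (x2, y2) = c1, c2
    location = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    # atan2 equals arctan((y2-y1)/(x2-x1)) where that is defined, but also
    # handles x2 == x1 and distinguishes front from back.
    theta = math.atan2(y2 - y1, x2 - x1)
    return location, theta

pos, theta = robot_pose((100, 100), (120, 120))  # midpoint (110, 110), 45 deg
```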

Sending Directions to the Robot: The robot's measured location and orientation are compared with those expected from the generated trajectory, and directions are sent to the robot so that it stays on the trajectory. The USART communication was used to achieve this.
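One simple way to turn that comparison into motor commands is a bang-bang controller on the heading error towards the next waypoint (the angular tolerance below is an illustrative value, not our tuned one):

```python
import math

TURN_THRESHOLD = math.radians(15)  # illustrative tolerance, not a tuned value

def next_command(pose, theta, waypoint):
    """Pick the command that steers the robot towards the next waypoint.

    pose: current (x, y); theta: current heading in radians;
    waypoint: target (x, y) from the planned trajectory.
    Note: with image coordinates (y pointing down) the left/right sense
    may need to be flipped.
    """
    desired = math.atan2(waypoint[1] - pose[1], waypoint[0] - pose[0])
    # Wrap the heading error into (-pi, pi] so the robot takes the short turn.
    error = math.atan2(math.sin(desired - theta), math.cos(desired - theta))
    if error > TURN_THRESHOLD:
        return "LEFT"
    if error < -TURN_THRESHOLD:
        return "RIGHT"
    return "FORWARD"
```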

