This is a fun experiment that had been in my head for some months, but I never got around to it. In the end I found the time, and it was not as hard as expected, at least for the basic concept.
The idea was to build an automatic Pong player using OpenCV and some Android devices.
How it works:
A central device is the 'tennis court' and visualizer.
- It animates the ball, calculating collisions against the walls, or against the racket if the ball hits it (a minimal sketch of this update step follows the list).
- The racket position is controlled by means of a UDP socket (see the UDP sketch below).
- A second device, the 'player', looks at the first device's screen using its back camera.
- The camera preview is analyzed using OpenCV.
- It detects the court boundaries (blue squares) and the ball using color blob detection. The perspective is then 'corrected' and the ball position is calculated relative to the court. With this information (and, at this moment, almost no 'intelligence', since the racket simply follows the ball's vertical position) it works out where the racket should be in order to hit the ball (see the detection sketch below).
- Since the devices are connected through Wi-Fi (the player knows the IP and port of the first device), the new racket position is sent over UDP.
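
The ball update on the 'court' device could look roughly like this. It is only a minimal sketch, not the project's actual code: the field size, ball radius, velocities and racket geometry are all assumed values.

```java
// Minimal sketch of the per-frame ball update on the 'court' device.
// Field size, ball radius, velocities and racket geometry are assumptions.
class Ball {
    float x = 400, y = 240;      // position in court pixels
    float vx = 6, vy = 4;        // velocity per frame
    static final float R = 10;   // ball radius

    void step(int width, int height, float racketY, float racketHalfHeight) {
        x += vx;
        y += vy;
        // Bounce on the top and bottom walls
        if (y - R < 0 || y + R > height) vy = -vy;
        // Bounce on the left wall
        if (x - R < 0) vx = -vx;
        // Racket on the right edge: bounce only if the ball actually hits it
        if (x + R > width) {
            if (Math.abs(y - racketY) <= racketHalfHeight) {
                vx = -vx;                    // hit: reflect the ball
            } else {
                x = width / 2f;              // miss: reset to the center
                y = height / 2f;
            }
        }
    }
}
```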
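
The UDP link between the two devices can be very simple. The sketch below shows both ends; the port number and the plain-text "y=<pixels>" message format are assumptions, not necessarily what the project uses.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Sketch of the UDP link. Port and message format ("y=<pixels>") are assumed.
class RacketLink {
    static final int PORT = 5555;

    // Player side: send the desired racket position to the court device.
    static void sendRacketY(String courtIp, int y) throws Exception {
        byte[] data = ("y=" + y).getBytes(StandardCharsets.UTF_8);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(data, data.length,
                    InetAddress.getByName(courtIp), PORT));
        }
    }

    // Court side: block until the next racket position update arrives.
    static int receiveRacketY(DatagramSocket socket) throws Exception {
        byte[] buf = new byte[64];
        DatagramPacket packet = new DatagramPacket(buf, buf.length);
        socket.receive(packet);
        String msg = new String(packet.getData(), 0, packet.getLength(), StandardCharsets.UTF_8);
        return Integer.parseInt(msg.substring(msg.indexOf('=') + 1));
    }
}
```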
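
For the detection side, the last two bullets could be implemented roughly as below. This is a simplified stand-in for OpenCV's color blob detection approach, not the actual project code: the HSV thresholds, the corner ordering (top-left, top-right, bottom-right, bottom-left) and the target court size are all assumptions.

```java
import org.opencv.core.*;
import org.opencv.imgproc.Imgproc;
import org.opencv.imgproc.Moments;
import java.util.ArrayList;
import java.util.List;

// Sketch of the per-frame analysis on the 'player' device.
// HSV ranges, corner ordering and court size are assumed values.
class CourtTracker {
    // Threshold the RGBA camera frame and return the centroid of the largest blob, or null.
    static Point largestBlobCenter(Mat rgba, Scalar hsvLow, Scalar hsvHigh) {
        Mat rgb = new Mat(), hsv = new Mat(), mask = new Mat();
        Imgproc.cvtColor(rgba, rgb, Imgproc.COLOR_RGBA2RGB);
        Imgproc.cvtColor(rgb, hsv, Imgproc.COLOR_RGB2HSV);
        Core.inRange(hsv, hsvLow, hsvHigh, mask);
        List<MatOfPoint> contours = new ArrayList<>();
        Imgproc.findContours(mask, contours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);
        MatOfPoint best = null;
        double bestArea = 0;
        for (MatOfPoint c : contours) {
            double a = Imgproc.contourArea(c);
            if (a > bestArea) { bestArea = a; best = c; }
        }
        if (best == null) return null;
        Moments m = Imgproc.moments(best);
        return new Point(m.m10 / m.m00, m.m01 / m.m00);
    }

    // 'Correct' the perspective: map the detected ball position into court coordinates
    // using the four blue corner squares (TL, TR, BR, BL) found in the camera frame.
    static Point ballInCourt(Point[] corners, Point ball, Size courtSize) {
        MatOfPoint2f src = new MatOfPoint2f(corners);
        MatOfPoint2f dst = new MatOfPoint2f(
                new Point(0, 0), new Point(courtSize.width, 0),
                new Point(courtSize.width, courtSize.height), new Point(0, courtSize.height));
        Mat homography = Imgproc.getPerspectiveTransform(src, dst);
        MatOfPoint2f in = new MatOfPoint2f(ball), out = new MatOfPoint2f();
        Core.perspectiveTransform(in, out, homography);
        return out.toArray()[0];
    }
}
```

With the ball expressed in court coordinates, the 'player' simply takes its y component as the new racket position and sends it over the UDP link above.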
At this moment the 'player' is a bit clumsy and misses some balls. There are things to be improved, such as choosing other colors and making detection more reliable, since the squares used are very small and detection also depends heavily on lighting conditions.