Final Development Week: Part 1
CSE181A Winter 2011
Progress Report for Week #10 – Part 1 (3/2/2011 to 3/9/2011)
Project Title: Super Spray
It is week 10 of Winter quarter 2011! Our senior design project has to be demoed at the university’s senior design showcase on Friday, March 11th. That means that this week we need to get everything wrapped up: working, bug-free, and presentable.
One thing we have struggled with is detecting our laser on the screen. We were taking an image captured from our webcam, converting it from RGB to HSV, thresholding a range of values matching the color of our red laser to create a binary image, and detecting the points in that image. This worked, when we got it right. But tuning HSV values was really problematic, and basically guess-and-check. It was also very sensitive to the room’s lighting conditions, and its effectiveness seemed to vary from day to day.

Early in the week we optimized that OpenCV code: instead of converting to HSV, we left the image as RGB. We took a screenshot of what our laser looks like on the wall to our webcam, opened it in Photoshop, and used the color dropper tool to get precisely what color our laser appears on the wall. We changed the range of values to match, and got a MUCH better and more reliable threshold image. We also no longer have to be nearly as careful about the colors used in our game. Before, we couldn’t use very bright colors, whites, or anything even close to red. Now we pick up the red laser, and only the red laser. It’s so much better! The one catch is that if the laser lands on a target, say a “green” target, the color the camera sees is a mix of green and red that appears brown, so we no longer detect the laser. We remedied this by drawing all of our targets as outlines only, so that the laser always lands on a black background.
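To make the idea concrete, here’s a rough Python sketch of the RGB range check (our actual code is OpenCV, and the LASER_LOW/LASER_HIGH bounds below are made-up placeholders, not the values we pulled from Photoshop):

```python
# Minimal sketch of RGB range thresholding. The bounds are hypothetical;
# the real ones came from color-picking a screenshot of the laser dot.
LASER_LOW = (200, 0, 0)       # placeholder lower bound (R, G, B)
LASER_HIGH = (255, 120, 120)  # placeholder upper bound

def in_laser_range(pixel, low=LASER_LOW, high=LASER_HIGH):
    """Return True if an (R, G, B) pixel falls inside the laser's color range."""
    return all(lo <= c <= hi for c, lo, hi in zip(pixel, low, high))

def threshold_image(image):
    """Turn an RGB image (rows of (R, G, B) tuples) into a binary mask."""
    return [[1 if in_laser_range(px) else 0 for px in row] for row in image]

frame = [
    [(10, 10, 10), (230, 40, 30)],   # black wall, laser dot
    [(120, 200, 90), (15, 12, 10)],  # green target, black wall
]
mask = threshold_image(frame)
# -> [[0, 1], [0, 0]]: only the laser pixel survives the threshold
```

Note how the mixed green-plus-red pixel fails the check, which is exactly why we moved to outline-only targets.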
Even though we were detecting our laser correctly, we still had an issue calibrating the screen. We were baffled as to what in the world our webcam was doing when capturing points… it just didn’t seem to make sense! At first we thought our points were being flipped horizontally, because when we shot the left side and then the right side of the screen to calibrate, the values we printed out weren’t correct. After literally hours of debugging, printing to the console, and researching known OpenCV bugs with our webcam, we realized that the OpenCV sample code we started from was always using a threshold image of the previous frame. When running the sample code in its own project and watching the threshold image in real time, everything looked normal because of the high frame rate; we couldn’t tell that the displayed image was one frame behind where our laser actually was. As soon as we used our code, which only captures a frame when the trigger is pulled, the error became obvious. So now, every time the trigger is pulled, we capture two frames, throw away the first, and use the second. This solved our problem of getting incorrect coordinates. Now our game is calibrated well, and the laser is detected perfectly. Our crosshair appears right where our laser is, and is very reliable.
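Here’s a toy Python model of the bug and the fix (StaleCamera is a stand-in for the webcam capture, not the real OpenCV API):

```python
# Toy model of the stale-frame bug: the capture we used always handed back
# the previous frame, one grab behind the real scene.
class StaleCamera:
    def __init__(self):
        self.scene = "startup"     # what the webcam is actually pointed at
        self._buffer = "startup"   # frame left over from the last grab

    def grab(self):
        frame = self._buffer       # we receive the OLD frame...
        self._buffer = self.scene  # ...while the fresh one sits in the buffer
        return frame

def capture_on_trigger(camera):
    """Our fix: grab two frames, throw away the first, use the second."""
    camera.grab()                  # discard the stale buffered frame
    return camera.grab()           # this one matches where the laser really is

cam = StaleCamera()
cam.scene = "laser_at_left"
stale = cam.grab()                 # -> "startup": one frame behind!
cam.scene = "laser_at_right"
fresh = capture_on_trigger(cam)    # -> "laser_at_right"
```

At a high frame rate the one-frame lag is invisible, which is why the sample code looked fine on its own; capturing only on the trigger pull made it obvious.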
After that, the next step was to get the game running not off of our computer’s keyboard, but off the actual Bluetooth data sent from the microcontroller. This code was finished about two weeks ago, so it was just a matter of integrating it into the project. We added a glutTimerFunc() callback in OpenGL and poll for Bluetooth data inside it. The fastest our microcontroller will ever send data is once every 100 ms (the debounce time for holding the trigger), so we check the incoming Bluetooth buffer at the Nyquist rate of 50 ms. Merging the Bluetooth code into the game proved to be pretty seamless. We did add the ability to tell the microcontroller whether we are calibrating the screen or playing a game, and changed the debounce time of the button accordingly. We wouldn’t want a player calibrating the screen to shoot once and have it pick up more than one point without giving them a chance to re-aim. Also, if the player is on the start screen and wants to recalibrate the game, they can simply shake the gun and it will bring them back to the calibration screen.
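The timing argument can be sketched in a few lines of Python (the 100 ms / 50 ms numbers are from our setup; the packet times are made up for illustration):

```python
# Sketch of why polling every 50 ms is enough: the microcontroller sends at
# most one trigger packet per 100 ms (the debounce time), so each 50 ms
# polling window can contain at most one packet and none are missed.
DEBOUNCE_MS = 100
POLL_MS = DEBOUNCE_MS // 2         # 50 ms

def poll_batches(packet_times_ms, total_ms, poll_ms=POLL_MS):
    """Group packet arrival times by the polling window they fall in."""
    batches = []
    for t in range(poll_ms, total_ms + poll_ms, poll_ms):
        batches.append([p for p in packet_times_ms if t - poll_ms < p <= t])
    return batches

shots = [50, 150, 300]             # hypothetical trigger pulls, >= 100 ms apart
batches = poll_batches(shots, 300)
assert [p for b in batches for p in b] == shots  # every shot is received
assert all(len(b) <= 1 for b in batches)         # at most one per poll
```

In the actual game the poll happens inside the glutTimerFunc() callback rather than a loop like this, but the rate math is the same.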
Another exciting thing that happened this week was that we received a finished rendering of the case that will hold the LCD to our gun. Alvaro (Adrian’s twin brother) really did us a favor by designing it, and it looks great. Below are pictures of the case. It is two pieces, a top and a bottom, held together with four screws.
In the next 2-3 days we have to finish everything. Middle Earth is an undergraduate student housing complex at UCI, and we are both on staff: Jared is the RA of the dorm “Evenstar,” and Adrian is a Middle Earth attendant who works at various facilities around the complex. We both live in Evenstar and plan on letting the residents test our game before Friday, as a last chance to find any bugs and take criticism from an audience. We’ll be back with pictures and video from that experience as well.