CMSC5711 ver.11.5 Programming Exercise 1 - OpenCV - Camera Calibration

1. Motivation of this exercise:
To help you get familiar with the OpenCV library, which is the most frequently used tool for vision and image processing tasks. By doing the exercise, you are expected to:
- Practice your C++ programming skills.
- Get familiar with the basic OpenCV functions and data structures.
- Enhance your understanding of the camera model and the calibration process.
- Understand the world coordinate system and the camera coordinate system.
This exercise counts for 6% of your final grade in this course.

2. Installation and configuration of the OpenCV library:
I. Download and documentation:
Download version 2.1: http://opencv.willowgarage.com/wiki/
Download CMake: http://www.cmake.org/cmake/resources/software.html
Download Microsoft VS2008: https://www.dreamspark.com/default.aspx
Documentation for C++: http://opencv.willowgarage.com/documentation/cpp/index.html
EMGU: OpenCV in C#: http://www.emgu.com/wiki/index.php/Main_Page

II. Installation and compilation:
1. First, install OpenCV and VS2008 on your computer, supposing the OpenCV home directory is "C:\OpenCV2.1".
2. Run the CMake program to generate the OpenCV installation project for VS2008:
   I. For the source directory, choose the OpenCV home directory.
   II. For the target location, select anywhere you want; suppose we put it in a new folder in the OpenCV home directory named "C:\OpenCV2.1\VS2008".
   III. Click "Configure" and select the appropriate version of your Visual Studio.
   IV. Click "Configure" again and, after it finishes, click "Generate". After generation completes, you can find the newly created VS project named "opencv" in the target directory you selected, say "C:\OpenCV2.1\VS2008".

III. Developing an OpenCV project with MS Visual Studio:
With VS2008 on the Windows platform:
http://www.opencv.org.cn/index.php?title=VC_2008_Express%E4%B8%8B%E5%AE%89%E8%A3%85OpenCV2.0&variant=zh-tw

IV.
Introduction to programming with OpenCV:
A step-by-step guide to the use of Microsoft Visual C++ and the Intel OpenCV library:
http://www.site.uottawa.ca/~laganier/tutorial/opencv+directshow/cvision.htm
Updated version for OpenCV 2.1:
http://www.site.uottawa.ca/~laganier/tutorial/opencv+directshow/old/cvision.htm
An OpenCV tutorial on the basic data structures and operations:
http://www.cs.iit.edu/~agam/cs512/lect-notes/opencv-intro/opencv-intro.html

3. Introduction to camera calibration:
According to the pinhole camera model, the projective relationship between the camera coordinate system and the image plane is established with the following formula:

    λ·I = M·Pc                                  (1)

where I is the 2D projected image pixel (in homogeneous coordinates), Pc is the corresponding 3D point in the camera coordinate system, λ is a scaling factor, and M is a 3×3 matrix representing the camera intrinsic parameters. M can be decomposed into the following form:

    M = | fx  0   Ox |
        | 0   fy  Oy |                          (2)
        | 0   0   1  |

where fx and fy represent the focal length of the camera, and Ox and Oy represent the image center. This relationship is illustrated in Figure 1.

However, since the world coordinate system and the camera coordinate system are not identical in most cases, and people prefer to use the world coordinate system to locate 3D points, the ultimate goal of camera calibration is to establish the relationship between image pixels and points in the world coordinate system. This can be done by introducing two extra parameter sets, the rotation and translation vectors, which describe the transformation from the world coordinate system to the camera coordinate system as in Equation 3:

    Pc = R·Po + T                               (3)

where Pc and Po are the 3D points in the camera and world coordinate systems respectively, and R and T are the rotation and translation employed to describe this transformation. Estimating the extrinsic parameters thus means finding the rotation and translation relationship between the two coordinate systems.
By estimating the intrinsic and extrinsic parameters of a camera, the calibration process is accomplished, and thus a projecting relationship can be established as in Equation 4:

    λ·I = M·(R·Po + T)                          (4)

where R is a 3×3 rotation matrix, T is a 3×1 translation vector, and Po is the 3D point in the world coordinate system. Most existing calibration methods collect a set of corresponding image pixels and 3D points, substitute them into Equation 4, and estimate the intrinsic and extrinsic parameters by solving this equation. The correspondence collection procedure can be done with the OpenCV toolbox.

4. Exercise:
As an introductory exercise in programming with OpenCV, you are asked to calibrate your own webcam with the OpenCV routines. By now you should be familiar with the basic OpenCV operations and data structures; if not, please refer to Section 2 and go through it!

Workflow:
i. Make your own checkerboard by printing the pattern shown in ChessBoard.pdf and sticking it onto a flat board. (The horizontal and vertical corner counts of the given pattern are 10 and 7 respectively.)
ii. Create a video capture by calling the OpenCV routines and then fetch each frame from it for processing.
iii. For each frame obtained, employ findChessboardCorners to find the corners on the chess pattern and obtain their 2D coordinates in the image automatically. (No need to care about how it is implemented; just call the function.)
iv. Assume that the X-O-Y plane of the world coordinate system coincides with the board and the origin lies at the top-left corner (or the center, or any corner you like) of the chess pattern. Based on this, you can define the 3D coordinates of the corners yourself. For instance, (0,0,0) for the top-left one, (1,0,0) for its right neighbor, and (0,1,0) for the neighbor below it.
v.
Having got the 3D coordinates and 2D coordinates ready, we can feed them into the calibrateCamera routine, and OpenCV will help you calculate the camera matrices and distortion coefficients. At this point, you have finished the basic task of this exercise.
vi. Finally, if you want to verify the result of your calibration, you can estimate the back-projection error between the 2D and 3D points. Since calibration associates the 2D image plane with the 3D world coordinate system, we can simulate the projection process and project the 3D points back onto the camera plane according to the estimated camera matrices. By comparing the projected 2D image corners with the detected 2D image corners, we can calculate an average back-projection error; if it is small enough, we can conclude that the calibration is accurate. This can be done by employing the projectPoints function.
vii. SUBMIT: submit your source code, a screenshot of the calibration result containing the back-projection error, and an executable file (*.exe) to the course email box firstname.lastname@example.org with your name and student ID.

Marking scheme:
i. Intrinsic matrix (1%): the estimated camera matrix should approximately agree with the form given in Formula 2. Normally, for the webcams we use, fx and fy should be more or less the same.
ii. Back-projection error (2%): the back-projection error per corner should be no more than 10 pixels.
iii. Program (3%):
   a. Your program runs smoothly. (1%)
   b. You modify the sample code rather than copying it, using functions from OpenCV 2.1 instead of older versions. (1.5%)
   c. Coding style and necessary comments. (0.5%)

Hints:
i. You can find the useful functions on this page:
http://opencv.willowgarage.com/documentation/cpp/camera_calibration_and_3d_reconstruction.html
ii.
http://dasl.mem.drexel.edu/~noahKuntz/openCVTut10.html is a piece of sample code that does the calibration with OpenCV. You may refer to it, but remember to implement your own code with the functions provided in OpenCV version 2.1.

Hopefully this document gives you a pleasant and useful first experience with the OpenCV library. Enjoy it!