SURFACE ROUGHNESS PREDICTION IN TURNING PROCESS BY APPLYING COMPUTER VISION METHOD

This paper reports the use of computer vision and backlight techniques to determine the surface roughness of a workpiece under a variety of process parameters. A CCD (Charge-Coupled Device) camera was used to capture an image of the edge of the turned components, with backlighting providing an edge roughness profile. The image was processed using SRVISION software, developed in MATLAB, to extract the profile of the workpiece and calculate the arithmetic average roughness (Ra) and root mean square roughness (Rq). The experiments were carried out on AISI 1045 (medium carbon steel) at various feed rates and cutting speeds, and the surface roughness values obtained by the conventional stylus probe method and the image processing technique were then compared. The comparison indicates that the vision method provides precise and consistent results, with a correlation of up to 0.99 with the traditional stylus method. The mean differences in Ra and Rq between the two methods were just 1.65 and 1.433 percent, respectively. As the vision method is a non-contact procedure, it has significant potential for in-process inspection of components without damaging the machined surfaces, and it allows components to be monitored in a shorter time.


INTRODUCTION
Turning is a common machining process that removes material from a rotating cylindrical component using a single-point cutting tool. The turned component has a certain surface roughness that acts as a significant parameter in its functional performance, influencing friction, wear, lubrication, electrical and thermal resistance, fluid dynamics, vibration, and noise. Different parameters such as feed rate, cutting speed, cutting depth, cutting tool configuration, machine tool, and material of the component all affect the achievable surface roughness and the required product features at an appropriate cost. Roughness may be evaluated using two basic methods: contact and non-contact methods. The contact method utilizes a stylus, which is drawn across the measured surface. The surface waveform is collected through an electronic sensor, commonly a linear variable differential transformer, and from it the parameters of surface roughness are calculated, such as root mean square roughness Rq, average roughness Ra, maximum peak-to-valley height Rt, etc. The main disadvantages of the stylus device are that: (1) it requires direct physical contact, (2) it limits the measuring speed, (3) it cannot be used for online measurement because the workpiece needs to be withdrawn from the machine for monitoring, and (4) it has restricted versatility in handling the specific geometric component to be measured [1].
Non-contact methods may be divided into many categories based on the lighting system used and image analysis. Several investigations were performed utilizing noncontact vision methods for the surface roughness assessment. Lee et al. [1] employed computer vision techniques to predict a workpiece's surface roughness under many cutting operations. The workpiece surface image was first acquired with a digital camera, and then the surface image feature was extracted. A polynomial grid was implemented utilizing a self-organizing adaptive modeling method to create relations between the surface image characteristics and real roughness of the surface through various turning operations. Gadelmawla [2] implemented a vision system to capture images for surfaces to be characterized and software was designed to investigate the captured images based on the "Gray Level Co-occurrence Matrix (GLCM)". 3D plots of the GLCMs for different captured images were implemented, compared, and discussed. Also, several statistical parameters were calculated from the GLCMs and compared with the arithmetic average roughness, Ra.
Al-Kindi et al. [3] developed a technique for using computer vision data to achieve accurate measurement of surface roughness parameters. Stylus-based measurements were obtained utilizing standard and non-standard roughness parameters and compared to vision-based measurements. Two light reflection models were adopted and implemented, namely the "Intensity-Topography Compatible (ITC) model" and the "Light-Diffuse model", to explain the obtained vision data and to allow appropriate roughness parameter calculation. Results revealed that the "ITC model" performed better than the "Light-Diffuse model", with values notably similar to those obtained from conventional stylus-based roughness data. Zhongxiang et al. [4] employed a method for determining the three-dimensional roughness of the surface using profile information. They suggested a three-dimensional measuring technique to investigate surface roughness components on the basis of digital image processing technology, and set up a three-dimensional surface roughness assessment system comprising hardware and software architecture. Fadare et al. [5] developed a computer vision system appropriate for on-line surface roughness measurement of machined components utilizing an "artificial neural network (ANN)" based on digital image processing of the machined surface, consisting of a CCD camera, computer, Microsoft Windows Movie Maker, digital image processing software, and two light sources. The machined surface images were captured and analyzed, and optical roughness characteristics were assessed using the "2-D fast Fourier transform (FFT) algorithm". They concluded that the optical roughness values predicted by the ANN were in good agreement with the measured values (R² = 0.9529).
Shahabi et al. [6] proposed a different method for measuring roughness using a 2-D contour extracted from an edge image of the workpiece surface. A comparison with a stylus type device indicated a maximum variation of 10% in the measurement of average roughness Ra utilizing the visual method. Sridhar et al. [7] used a machine vision method to determine surface roughness through image processing and backlight technique on the turned components. The comparison was then made of the surface roughness values achieved through the image processing technique and the conventional stylus method, which showed that the suggested method provided close and dependable results similar to the traditional stylus method. Balasundaram et al. [8] calculated the amplitude and spacing, in addition to functional surface roughness parameters through the dry cutting of AISI 1035 carbon steel utilizing machine vision. A "DSLR camera" with high shutter speed was employed to capture a blur-free image of the workpiece surface profile perpendicular to the cutting tool. The edge of the surface profile was identified to subpixel precision using the grey level constant moment and the roughness parameters were calculated using the profile. Srivani et al. [9] presented a methodology to characterize the nature of the surface using a computer vision system. For further investigations, a computerized optical microscope was used to collect surface images, and those images were fed into MATLAB software.
Qingqun et al. [10] suggested a different method of on-line turned surface inspection by observing the characteristics of the grey values of the surface digital image. The uniformity of the surface image was evaluated and analyzed by fractal analysis, wavelet transform, and discreteness analysis of the wavelength of the texture profile. The normal texture image was extracted from the average wave profile, which indicated the state of the process and the turned surface conditions. The results indicated that the turned surface condition could be effectively checked on-line. Naresh et al. [11] used the machine vision technique to observe the surface roughness when turning metal matrix composites (MMCs). The machined surfaces were identified during the machining operation utilizing machine vision technology, and a stylus probe instrument was used to measure the surface finish of the machined surfaces. Patel et al. [12] introduced a computer vision system that captured the surface texture contours of the machined surfaces and extracted images. Using the gray-level co-occurrence matrix, the texture feature parameters were extracted and compared to various surface roughness parameters reported by a contact-type surface profilometer. The image analysis was carried out to extract texture characteristics at various levels of roughness. The relation between each texture characteristic and the surface roughness parameter was examined. Multiple regression models were developed to estimate the individual surface roughness parameter (Ra) and to recognize the degree of surface roughness. The linear detection model was found to have better output characteristics than a nonlinear recognition model. The findings showed that surface roughness estimation utilizing a linear regression model was a robust method for non-contact measurement.
Patel et al. [13] presented a surface roughness prediction approach utilizing "Computer Vision", "Image Processing", and "Machine Learning". Two machine learning algorithms, "Stochastic Gradient Boosting" and "Bagging Tree", were compared and assessed on the basis of statistical parameters. It was found that "Stochastic Gradient Boosting" effectively estimated surface roughness for training as well as ten-fold cross-validation. The methods may be utilized for online monitoring and sound evaluation of machined components.
In this paper, a computer vision system for tracking and predicting the surface roughness of turned components under different cutting conditions (cutting speed, feed rate, and cutting depth), utilizing image processing and the backlight technique, is presented. The surface roughness values obtained by the image processing technique and the conventional stylus method are then compared.

METHODOLOGY AND EXPERIMENTATION
The average surface roughness (Ra) and root mean square roughness (Rq) are commonly used as indices to assess a machined surface finish. Estimation of roughness parameters plays a significant role in addressing problems in industrial sectors such as contact deformation, friction, and the tightness and precision of joint contacts.
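For reference, these two parameters have standard definitions over n sampled deviations y_i of the profile from its mean line, consistent with the relationships applied later in the measuring procedure:

```latex
R_a = \frac{1}{n}\sum_{i=1}^{n}\lvert y_i\rvert ,
\qquad
R_q = \sqrt{\frac{1}{n}\sum_{i=1}^{n} y_i^{2}}
```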

Stylus Method Description
The machining process was performed on a WILTON lathe (model no. 52TL1440-3) on 18 medium carbon steel AISI 1045 workpieces, each 30 mm in diameter and 300 mm in length. The chemical composition and mechanical properties of the AISI 1045 material are shown in Tables 1 and 2, respectively. The experiments were conducted using the Taguchi method, varying working parameters such as feed rate and cutting speed at a fixed cutting depth. The workpiece rotation was fixed in the counterclockwise direction, and no coolant was applied during the turning process. Table 3 indicates the values of the parameters used in the turning process. A stylus device was used as the contact method for measuring the surface roughness of the machined components. It contains a diamond stylus probe that is moved perpendicular to the direction of the roughness, and the surface roughness characteristic is recorded at the other end. Because of its advantages, it is the most widely used technique and directly generates the object's profile along a defined direction. Surface roughness measurements of the 18 turned components were performed on a stylus roughness tester (type SRT-6210).

Computer Vision System Description
The fundamental components of the vision system designed to capture images of the surfaces to be inspected consist of two parts: a hardware system and a software system. The hardware system consists of four main items: (1) a Sony DSC-WX100 CCD digital camera with a resolution of 18.2 megapixels, (2) an LED illumination source, (3) a black cardboard tube to block environmental light, and (4) a personal computer (PC) running MATLAB for image processing. The camera was fixed on a special frame designed to move horizontally and vertically, ensuring that the camera's view was always perpendicular to the surface of the workpiece and could scan any area that needed to be measured. A software system named "SRVISION" was developed in MATLAB to run in any Windows environment. The image of the surface to be measured is opened by the software, the variation of the surface profile is plotted, and the surface parameters are calculated. The actual and schematic configurations of the on-machine roughness measurement system are shown in Figs. 1 and 2, respectively.
Fig. 1: The actual setup of the on-machine roughness measurement system.

System Calibration
The horizontal and vertical scaling factors were obtained using a standard block with a length of 2 mm to transform the image dimensions from pixels to real dimensions in microns; the block was located at the same level as the shaft. The block width (in pixels) was measured using the camera calibration toolbox in MATLAB, and each calibration factor was computed as the ratio of the block's actual length to its length in pixels.
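As a concrete illustration of this calibration, the following minimal Python sketch computes a scaling factor from a reference block of known length. The 2 mm block is from the paper; the pixel count and function names here are hypothetical, not part of the SRVISION software.

```python
def scaling_factor(block_length_um: float, block_length_px: float) -> float:
    """Microns represented by one pixel along the calibrated axis."""
    return block_length_um / block_length_px

def px_to_um(value_px: float, factor: float) -> float:
    """Convert a pixel measurement to real dimensions in microns."""
    return value_px * factor

# Example: a 2 mm (2000 um) block imaged as 400 px wide (hypothetical count)
f = scaling_factor(2000.0, 400.0)   # 5.0 um per pixel
height_um = px_to_um(12.0, f)       # a 12 px profile deviation -> 60.0 um
```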

Measuring Procedure
A procedure for the assessment of surface roughness using the image processing method is described below:
1-Preparation of specimens: 18 medium carbon steel AISI 1045 components were turned at varying cutting speeds and feed rates. The surface roughness values were measured by a stylus-type roughness tester.
2-Each component was placed under the CCD camera and the LED illumination was adjusted; the CCD camera was focused to obtain a clear contour image of the specimen edge. An image of the contour edge of the turned component was captured and stored on the computer via a USB cable.
3-The captured image was converted to a grayscale version to reduce the operating time of the algorithm.
4-The area to be measured was cropped from the original image, and unnecessary areas around the shaft edge were deleted.

5-The stored image was retrieved and processed with a median filter (3x3 mask) to remove the noise present in the image.
6-The developed SRVISION software calculated the image gradient in the Y direction to find the change in intensity from white to black, locating the edge of the workpiece.
7-The grayscale image was converted to black and white using a binarization technique; a threshold was applied to the image so that the component area was black and the rest was white.
8-An algorithm scanned the first row to find its first white pixel, then the second row to find its first white pixel, and so on until all such white pixels in the image were found; these pixels traced the surface profile of the workpiece.
9-A best-fit line was fitted to the contour image by least-squares fitting to obtain the mean line of the contour.
10-The average surface roughness (Ra) and root mean square roughness (Rq) were assessed from the image contour by subtracting each pixel of the contour profile from the calculated mean line and applying the following relationships:

Ra = (f/n) Σ |hi| ,   Rq = f sqrt( (1/n) Σ hi² )

where n is the number of data points, hi is the absolute distance of the i-th point on the measured profile from the mean line, and f is the scaling factor.
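The steps above can be sketched in code. The following is a minimal Python/NumPy rendition of steps 7-10; the paper's SRVISION tool is MATLAB-based, and the function names, row/column orientation, and the synthetic binary image below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def edge_profile(img_bw: np.ndarray) -> np.ndarray:
    """Steps 7-8: given a binarized image (True = white background,
    False = black workpiece area), find the first white pixel in each
    row. These pixels trace the workpiece edge profile (in pixels)."""
    # argmax returns the index of the first True value in each row
    return img_bw.argmax(axis=1).astype(float)

def roughness(profile_px: np.ndarray, f: float = 1.0):
    """Steps 9-10: fit a least-squares mean line to the profile, then
    compute Ra and Rq from the deviations. f is the pixel-to-micron
    scaling factor obtained in the calibration step."""
    x = np.arange(profile_px.size)
    slope, intercept = np.polyfit(x, profile_px, 1)  # mean line (step 9)
    h = profile_px - (slope * x + intercept)         # deviations from mean line
    ra = f * np.mean(np.abs(h))                      # arithmetic average roughness
    rq = f * np.sqrt(np.mean(h ** 2))                # root mean square roughness
    return ra, rq

# Synthetic 5x10 binary image: each row turns white at a known column
img = np.zeros((5, 10), dtype=bool)
for row, col in enumerate([3, 4, 3, 5, 4]):
    img[row, col:] = True

profile = edge_profile(img)          # [3, 4, 3, 5, 4]
ra, rq = roughness(profile, f=2.0)   # roughness in microns for f = 2 um/px
```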

RESULTS AND DISCUSSION
The surface roughness results obtained by the stylus and vision methods are presented in this section, and the results of the two methods are then compared and discussed.

Measuring Surface Roughness Using Stylus Method
A stylus instrument was used to provide reference values for comparison with the roughness measured by the vision system. Every surface was measured 5 times at different positions on the workpiece using a cutoff of 0.8 mm. The minimum and maximum surface roughness values obtained by the stylus method are given in Table 4. The difference ΔRa between the minimum and maximum Ra values ranged from 0.24 μm to 0.844 μm over the 18 specimens. The maximum variation for any workpiece, expressed as a percentage of its minimum Ra value, was 13.22%. The difference ΔRq between the minimum and maximum Rq values ranged from 0.13 μm to 1.94 μm. The maximum variation for any workpiece, as a percentage of its minimum Rq value, was 15.36%. The differing surface roughness values on the same workpiece were a result of instability in the machining process performed by the traditional turning machine.

Measuring the Surface Roughness Using the Vision Method
Every workpiece was measured 4 times at different positions in the images. Table 5 indicates the minimum and maximum surface roughness values obtained by the machine vision system. The difference ΔRa between the minimum and maximum Ra values ranged from 0.256 μm to 1.184 μm. The maximum variation for any workpiece, as a percentage of its minimum Ra value, was 12.7%. The difference ΔRq between the minimum and maximum Rq values ranged from 0.377 μm to 0.973 μm. The maximum variation for any workpiece, as a percentage of its minimum Rq value, was 10.76%.

Comparison of Roughness Values Achieved by Stylus and Vision Methods
The results of the measurements of average surface roughness (Ra) and root mean square surface roughness (Rq) using the suggested vision method, and their comparison with the stylus method, are shown in Table 6. The results show that the maximum Ra and Rq differences between the two methods were 3.744% and 3.727%, respectively. The mean and standard deviation of the difference between the two Ra measurements were 1.65% and 1.0%, respectively. Likewise, the mean and standard deviation of the difference for Rq were 1.433% and 1.0%, respectively. Figures 4 and 5 plot the average roughness and root mean square roughness obtained by the suggested vision method (Ra(v), Rq(v)) against those obtained by the stylus measurement (Ra(s), Rq(s)). The data were fitted with a linear trend line, and the correlation value was determined in Microsoft Excel using linear regression. A correlation value of 1 would indicate a perfectly linear relationship between the two data sets. The high correlation of 0.99 indicates that the vision method is capable of giving dependable roughness values for the measurements obtained in this study.
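The comparison step can be sketched as follows. The paper performed the regression in Microsoft Excel; this NumPy version is an equivalent sketch, and the sample arrays are illustrative values, not the paper's measured data.

```python
import numpy as np

# Hypothetical Ra readings (um) for the same specimens by the two methods
ra_stylus = np.array([2.1, 3.4, 4.0, 5.2, 6.3])   # illustrative Ra(s)
ra_vision = np.array([2.0, 3.5, 4.1, 5.1, 6.4])   # illustrative Ra(v)

# Pearson correlation coefficient between the two methods (1.0 would
# mean a perfectly linear relationship)
r = np.corrcoef(ra_stylus, ra_vision)[0, 1]

# Percentage difference per specimen and its mean, as tabulated in Table 6
pct_diff = np.abs(ra_vision - ra_stylus) / ra_stylus * 100.0
mean_diff = pct_diff.mean()
```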

CONCLUSION
A computer vision system and backlight method for assessing the surface roughness of turned medium carbon steel AISI 1045 specimens under different machining conditions were proposed in this study. The computer vision system captured and stored the enlarged contour edge images of the specimens as they were being turned. SRVISION software was developed to calculate the surface roughness directly from the specimen's contour image. The advantage of using a backlighting device is that it is not influenced by industrial environment lighting conditions. The precision of the vision method was compared with the stylus method over many experiments. Comparison graphs between the vision and stylus methods showed a maximum percentage error of 3.75%, and the coefficient of correlation (R²) values were close to one. Hence the vision method is reliable and appropriate for on-line, non-contact surface roughness measurement of machined components.