Open Access. Published by De Gruyter, May 18, 2021. Licensed under CC BY 4.0.

Defect Detection of Printed Fabric Based on RGBAAM and Image Pyramid

Junfeng Jing and Huanhuan Ren
From the journal Autex Research Journal

Abstract

To solve the problem of defect detection in printed fabrics, which is complicated by their abundant colors and varied patterns, a defect detection method based on the RGB accumulative average method (RGBAAM) and image pyramid matching is proposed. First, the minimum period of the printed fabric is calculated by the RGBAAM. Second, Gaussian pyramids are constructed for the template image and the image to be detected, with the minimum periodic unit serving as the template. Third, a similarity measure is used to match the template image and the image to be detected. Finally, the position of the printed fabric defect is marked in the image to be detected by Laplacian pyramid restoration. The experimental results show that the method accurately segments the periodic unit of the printed fabric and locates the defect position, and its calculation cost is low.

1 Introduction

The appearance of a printed fabric is composed of color, pattern, and texture [1]. During fabric processing, defects appear on the surface owing to production or machinery problems and other causes, and they seriously affect the quality and price of the fabric [2, 3]. Therefore, defect detection of printed fabrics is a key link in product quality control, as it can reduce product cost and improve quality. Printed fabrics have abundant colors and varied patterns, and their defects vary in type and shape; the colors and patterns interfere with detection, which makes defect detection of printed fabrics difficult.

Generally, printed fabrics can be divided into periodic and nonperiodic printed fabrics. The shape, size, and pattern of periodic printed fabrics show regularity in the horizontal and vertical directions. The period of a printed fabric is an important parameter characterizing this regularity, and the regularity is destroyed wherever a defect appears; therefore, determining the period of printed fabrics is the basis of subsequent defect detection. Grigorescu and Petkov [4] proposed determining the minimum periodic unit of a fabric by finding the minimum of the local entropy over a sliding window. However, this method can only segment fabric textures whose periodic unit is square, which limits its applicability. Lin [5] extracted the fabric period from gray-level co-occurrence matrix features of the fabric image. Because the gray-level co-occurrence matrix involves choosing pixel directions, distances, and gray levels, the computation is heavy and the adaptability to different textures is poor. Jing et al. [6] proposed measuring the period of printed fabric patterns with the distance matching function (DMF), taking the first peak of the second forward difference of the two-dimensional DMF as the texture periodic unit; this approach is susceptible to noise and adapts poorly to varied patterns. Zhou et al. [7] proposed measuring the fabric texture period with frequency domain analysis and the DMF. This method can calculate the period of fabrics with simple textures, but it cannot measure the minimum periodic unit of printed fabrics with varied patterns. Liu et al. [8] proposed a dictionary learning method based on K-SVD sparse coding to detect the defects of printed fabrics; it works well only on fabrics with a simple background texture. Fu [9] proposed a detection method for printed fabrics based on a Gabor filter and a regular band, but the Gabor filter parameters are complex to tune and the filter is sensitive to noise. Pan et al. [10] proposed detecting printed fabric defects with normalized cross-correlation (NCC), which detects defects such as broken-color and cross-printing; however, this method requires subwindows of different sizes for different types of printed fabrics, so its universality is limited.

Owing to the limitations of existing defect detection algorithms for printed fabrics, this article proposes a printed fabric defect detection algorithm based on the RGB accumulative average method (RGBAAM) combined with an image pyramid. The RGBAAM is used to segment the period of printed fabrics, and it is compared with the autocorrelation function (AF) and the DMF for this task. An image pyramid is then constructed, with the minimum periodic unit as the template, for template matching that detects the defects of printed fabrics. The running time of the algorithm is compared with that of the traditional template matching algorithm to verify the effectiveness of the proposed method.

2 Printed fabric defect detection based on RGBAAM and image pyramid

The proposed method can be divided into the following steps: (1) calculate the minimum periodic unit of a defect-free printed fabric by using the RGBAAM; (2) construct Gaussian pyramids for the template image and the image to be detected by down-sampling, using the minimum periodic unit as the template; (3) match the template image and the image to be detected with a similarity measure; and (4) mark the position of the printed fabric defect in the image to be detected by using Laplacian pyramid restoration. The flow chart of the proposed method is shown in Figure 1.

Figure 1 Overall flow chart of defect detection of printed fabrics.

2.1 RGB accumulative average method

Each pixel in a color image has position information and RGB color information, which together form a five-dimensional array. The RGBAAM [11] converts this five-dimensional array into two independent one-dimensional arrays, the X- and Y-arrays. The minimum repetition lengths in the horizontal and vertical directions are obtained from the X- and Y-arrays, respectively, and together they define the minimum periodic unit.

A color image consisting of X and Y coordinates and RGB components is a five-dimensional array, which is converted into the X- and Y-arrays by the RGBAAM defined in Eqs. (1) and (2):

(1) $X\_\mathrm{array}(x) = \dfrac{1}{Y}\sum_{y=1}^{Y}\bigl(R(x,y)+G(x,y)+B(x,y)\bigr), \qquad x = 1, 2, \ldots, X$

(2) $Y\_\mathrm{array}(y) = \dfrac{1}{X}\sum_{x=1}^{X}\bigl(R(x,y)+G(x,y)+B(x,y)\bigr), \qquad y = 1, 2, \ldots, Y$
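For illustration, the two accumulative average arrays of Eqs. (1) and (2) can be computed as in the following sketch. It uses Python with NumPy and OpenCV rather than the article's Visual Studio/OpenCV implementation, and the function name rgb_accumulative_average and the file name are introduced here only for this example.

```python
import cv2
import numpy as np

def rgb_accumulative_average(image_bgr):
    """Compute the RGBAA profiles of Eqs. (1) and (2).

    X_array(x) averages R + G + B over all rows of column x;
    Y_array(y) averages R + G + B over all columns of row y.
    """
    channel_sum = image_bgr.astype(np.float64).sum(axis=2)  # R + G + B at each pixel
    x_array = channel_sum.mean(axis=0)  # average over y: one value per column (x)
    y_array = channel_sum.mean(axis=1)  # average over x: one value per row (y)
    return x_array, y_array

# Hypothetical usage:
# img = cv2.imread("printed_fabric.png")        # file name is an example only
# x_arr, y_arr = rgb_accumulative_average(img)
```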

The RGB accumulative average (RGBAA) in the X and Y directions of the printed fabric in Figure 2 is shown in Figure 3.

Figure 2 Periodic printed fabric.

Figure 3 RGB accumulative average (RGBAA) results of printed fabrics: (a) is the RGBAA in the Y-direction and (b) is the RGBAA in the X-direction.

The RGBAA of a printed fabric with a single repeating pattern exhibits periodicity in the vertical and horizontal directions, and the minimum repetition length in each direction is obtained from the distance between adjacent peaks. During fabric production, erroneous data appear owing to the uneven forces on the yarns in different directions; some of these errors are marked by the red squares in Figure 3. To handle such errors, a threshold is set in the program to eliminate the erroneous data and improve the accuracy of the periodic segmentation. The minimum periodic unit of the printed fabric in Figure 2 is shown in Figure 4.
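A minimal sketch of this peak-distance step is shown below, assuming SciPy's find_peaks is available; the prominence threshold that discards erroneous peaks is a placeholder for the threshold described above and would need tuning for each fabric.

```python
import numpy as np
from scipy.signal import find_peaks

def minimum_period(profile, prominence=None):
    """Estimate the repetition length (in pixels) from an RGBAA profile.

    Peaks whose prominence is below the threshold are treated as the
    erroneous data described above and ignored; the period is the median
    spacing of the remaining adjacent peaks.
    """
    profile = np.asarray(profile, dtype=np.float64)
    if prominence is None:
        # Heuristic threshold: a fraction of the profile's dynamic range.
        prominence = 0.2 * (profile.max() - profile.min())
    peaks, _ = find_peaks(profile, prominence=prominence)
    if len(peaks) < 2:
        raise ValueError("not enough peaks to estimate a period")
    return int(np.median(np.diff(peaks)))

# period_x = minimum_period(x_arr)        # horizontal repetition length
# period_y = minimum_period(y_arr)        # vertical repetition length
# template = img[:period_y, :period_x]    # one minimum periodic unit
```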

Figure 4 The minimum periodic unit of printed fabric.

2.2 Construct Gaussian pyramid

To improve matching precision, a printed fabric defect detection method based on image pyramid matching is proposed. A complete match is first performed on the top layer of the Gaussian pyramid, and every instance of the target found there is traced down to the bottom of the pyramid. The matching result of each layer is mapped to the next layer, where the search is restricted to a neighborhood around the mapped position to account for the uncertainty of the matching location, and the similarity is computed only within this region of interest. Gaussian pyramid matching thus improves the matching speed while maintaining the matching accuracy.

The Gaussian pyramid is constructed by Gaussian filter smoothing and down-sampling, which generates a low-resolution image sequence [11]. This multiscale operation avoids the influence of target scale changes on the detection results. For an original image of size M × N, the Gaussian pyramid produces the image sequence {L(0), L(1), L(2),…, L(n)}, where L(n) denotes the n-th layer. The sizes of the images in the sequence are {M × N, M/2 × N/2, M/4 × N/4,…}. The pyramid of the printed fabric image is shown in Figure 5.
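As a sketch, the pyramid can be built with OpenCV's pyrDown, which applies the Gaussian smoothing and the factor-of-two down-sampling in one call; the number of layers chosen here is only an example.

```python
import cv2

def build_gaussian_pyramid(image, levels=3):
    """Return [L(0), L(1), ..., L(levels)], with L(0) the original image."""
    pyramid = [image]
    for _ in range(levels):
        # pyrDown smooths with a 5 x 5 Gaussian kernel and halves each dimension.
        pyramid.append(cv2.pyrDown(pyramid[-1]))
    return pyramid

# template_pyr = build_gaussian_pyramid(template)
# image_pyr = build_gaussian_pyramid(img)
```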

Figure 5 Gaussian pyramid of printed fabric.

2.3 Similarity measure

In printed fabric defect detection, template matching slides a standard defect-free sample image, used as the template, pixel by pixel over the printed fabric image to be detected. The similarity between the template image and the detected image determines whether the detected image contains defects. Template matching is robust to noise. Because the NCC matching algorithm is highly adaptable to changes in image grayscale, the NCC is used in this article to measure the similarity between the standard defect-free sample and the detected image [12]. The NCC is defined in Eq. (3):

(3) $R(x,y) = \dfrac{\sum_{x',y'} T(x',y')\, I(x+x',y+y')}{\sqrt{\sum_{x',y'} T(x',y')^{2}\, \sum_{x',y'} I(x+x',y+y')^{2}}}$

I(x, y) represents the detected image and T(x, y) represents the template image.

NCC matching measures the similarity between the template image and the printed fabric image to be detected while reducing computational cost and increasing speed. The larger the NCC coefficient, the more similar the two images are, so defects can be detected by setting an appropriate threshold.
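The coarse-to-fine search of Section 2.2 combined with the NCC of Eq. (3) might look like the following sketch; cv2.matchTemplate with the TM_CCORR_NORMED mode computes the normalized cross-correlation, while the search-window margin and the defect threshold are illustrative values rather than the article's settings.

```python
import cv2

def pyramid_ncc_match(image_pyr, template_pyr, margin=8):
    """Coarse-to-fine NCC matching; returns the top-left match position and score."""
    top = len(image_pyr) - 1
    # Complete NCC search on the top (smallest) pyramid layer.
    result = cv2.matchTemplate(image_pyr[top], template_pyr[top], cv2.TM_CCORR_NORMED)
    _, score, _, loc = cv2.minMaxLoc(result)
    for level in range(top - 1, -1, -1):
        img, tpl = image_pyr[level], template_pyr[level]
        # Map the match down to the finer layer and search only a neighborhood.
        x, y = loc[0] * 2, loc[1] * 2
        x0, y0 = max(x - margin, 0), max(y - margin, 0)
        x1 = min(x + tpl.shape[1] + margin, img.shape[1])
        y1 = min(y + tpl.shape[0] + margin, img.shape[0])
        result = cv2.matchTemplate(img[y0:y1, x0:x1], tpl, cv2.TM_CCORR_NORMED)
        _, score, _, roi_loc = cv2.minMaxLoc(result)
        loc = (x0 + roi_loc[0], y0 + roi_loc[1])
    return loc, score

# loc, score = pyramid_ncc_match(image_pyr, template_pyr)
# is_defective = score < 0.95   # hypothetical threshold; a low NCC suggests a defect
```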

2.4 Laplacian pyramid restoration

The Laplacian pyramid reconstructs the upper-layer image from the lower-layer image, and the i-th layer image is shown in Eq. (4). The Laplacian pyramid is used in conjunction with the Gaussian pyramid to reconstruct the original image.

(4) $L_i = G_i - \mathrm{UP}(G_{i+1}) \otimes g_{5 \times 5}$

In Eq. (4), Gi represents the i-th layer image, the UP operation maps a position (x, y) in the lower-layer image to (2x + 1, 2y + 1) in the upper-layer image, the symbol ⊗ represents the convolution operation, and g5×5 represents the 5 × 5 Gaussian kernel.
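A minimal sketch of Eq. (4) and of the corresponding restoration step, assuming OpenCV: cv2.pyrUp performs the UP mapping followed by the 5 × 5 Gaussian convolution, and the dstsize argument handles odd image dimensions.

```python
import cv2

def laplacian_layer(G_i, G_next):
    """Eq. (4): L_i = G_i - UP(G_{i+1}) convolved with g_{5x5}."""
    h, w = G_i.shape[:2]
    upsampled = cv2.pyrUp(G_next, dstsize=(w, h))  # UP mapping plus Gaussian smoothing
    return cv2.subtract(G_i, upsampled)

def restore_layer(L_i, G_next):
    """Invert Eq. (4) to recover G_i during Laplacian pyramid restoration."""
    h, w = L_i.shape[:2]
    return cv2.add(L_i, cv2.pyrUp(G_next, dstsize=(w, h)))
```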

The defects of printed fabrics are located by NCC matching and threshold setting. Through the Laplacian pyramid restoration, the defect location is marked on the image to be detected.

3 Experimental results and analysis

Printed fabrics of different patterns were selected for experimental testing to verify the effectiveness of the proposed algorithm. The printed fabric images used in the test were acquired with a CanoScan 9000F scanner. The experiments were run on an Intel(R) Core(TM) i5-4460 CPU at 3.20 GHz with 4 GB of memory, under Windows 10, with Visual Studio 2015 as the compilation environment and OpenCV 3.4 for algorithm testing. The samples used in the test are shown in Figure 6; the image size is 280 × 209 pixels.

Figure 6 Samples of printed fabrics to be detected: (a), (b), and (c) are different patterns of printed fabric samples.

3.1 Obtain the minimum periodic unit of printed fabrics

The RGBAAM is used to obtain the minimum periodic unit of printed fabrics, and it is compared with the DMF [6] and the AF [13] for obtaining the fabric period. Some results are shown in Figure 7.

Figure 7 Comparison of different methods for printed fabric period segmentation: (a1)–(a4) are defect-free printed fabrics; (b1)–(b4) are results of periodic segmentation by the DMF; (c1)–(c4) are results of periodic segmentation by the AF; (d1)–(d4) are results of periodic segmentation by the RGBAAM. Abbreviations: AF, autocorrelation function; DMF, distance matching function; RGBAAM, RGB accumulative average method.

It can be seen from Figure 7 that the DMF segments the fabric period with high precision for fabrics with simple textures and a single color, but it cannot accurately segment the minimum periodic unit of fabrics with complex patterns. For the above four types of printed fabrics, the AF is more accurate than the DMF, but it still does not achieve precise segmentation of the periodic unit. The RGBAAM not only accurately segments the minimum period of printed fabrics with rich patterns and abundant colors but is also simple to compute, with a small calculation cost and relatively low hardware requirements.

3.2 Template matching based on image pyramid

The minimum periodic unit is taken as the template image, and image pyramid matching is performed between the printed fabric image to be detected and the template image. Laplacian pyramid restoration is then used to mark the defect on the printed fabric image to be detected. The algorithm is verified on printed fabrics with different patterns and is compared with the visual saliency algorithm [14], which was proposed for detecting mini-jacquard fabric defects. Some results are shown in Figure 8.

Figure 8 Comparative experimental results of defect detection for printed fabrics: (a1)–(d1) oil stain, (a2)–(d2) grinning of white ground, and (a3)–(d3) foreign fiber on different types of printed fabrics detected by the algorithm of this article; (a4)–(d4) oil stain, (a5)–(d5) grinning of white ground, and (a6)–(d6) foreign fiber on different types of printed fabrics detected by the visual saliency algorithm.

It can be seen from Figure 8 that the printed patterns strongly influence the detection result when visual saliency is applied to defect detection on printed fabrics of different patterns. Although the saliency algorithm can locate the defect, it blurs and enlarges the details of the printed fabric defects and mistakenly detects the background pattern as a defect area. In particular, for grinning-of-white-ground defects, only the contour is detected and the shape information of the defect is lost. The algorithm proposed in this article accurately locates the defect position, generalizes well to different types of defects such as oil stain, foreign fiber, and grinning of white ground, and preserves the details of the defects intact.

The TILDA database is used to verify the proposed algorithm on fabric images with different patterns and defect shapes. There are 40 defect images in each of four defect types: broken end, hole, float, and oil stain, and the image size is 256 × 256 pixels. The algorithm is compared with the visual saliency algorithm, and some results are shown in Figure 9.

Figure 9 Partial defect detection results on the TILDA database: (a1), (b1) broken end, (a2), (b2) hole, (a3), (b3) float, and (a4), (b4) oil stain on different types of fabrics detected by the algorithm of this article; (a5), (b5) broken end, (a6), (b6) hole, (a7), (b7) float, and (a8), (b8) oil stain on different types of fabrics detected by the visual saliency algorithm.

It can be seen from Figure 9 that the algorithm proposed in this article also has good adaptability to periodic fabrics in the TILDA database. For different types of defects, the defect position can be accurately located.

The experiments also show that the detection time of the algorithm differs for different patterns of printed fabrics. Real-time detection was carried out for the four types of printed fabrics in Figure 8, and the average running time of the algorithm was obtained from repeated experiments. The results, compared with the traditional template matching algorithm, are shown in Table 1; the time unit is seconds.

Table 1

Detection time of different methods

Method                          (a)      (b)      (c)      (d)
Traditional template matching   0.087    0.125    0.094    0.064
Our method                      0.052    0.054    0.058    0.056

As can be seen from Table 1, the detection time of traditional template matching is nearly twice that of the algorithm proposed in this article, so the proposed algorithm offers relatively high real-time performance. The more complex the pattern is, the more time the algorithm takes to execute.

4 Conclusion

Based on image pyramid correlation theory, a defect detection algorithm for printed fabrics based on the RGBAAM and the image pyramid is proposed. It not only accurately segments the minimum period of printed fabrics but also precisely locates the defect position, and it improves execution efficiency and reduces the running time compared with traditional template matching. The proposed algorithm has so far been tested only in a laboratory environment; optimizing defect detection for the actual production process of printed fabrics is the direction of future work.

References

[1] Xin, J., Wu, J., Yao, P. P., Shao, S. (2018). An empirical study on fabric image retrieval with multispectral images using color and pattern features. In: Progress in Color Studies: Cognition, Language and Beyond, 391. DOI: 10.1075/z.217.21xin

[2] Zhang, H., Li, R., Jing, J., Li, P., Zhao, J. (2015). Fabric defect detection based on Frangi filter and fuzzy C-means algorithm in combination. Journal of Textile Research, 36(9), 120–124.

[3] Zhang, Z.-F., Zhai, Y.-S., Guo, Y.-Y., et al. (2015). Research on method to measure cotton defects based on optoelectronics technique. Laser and Optoelectronics Progress, 52(3), 154–159. DOI: 10.3788/LOP52.031202

[4] Grigorescu, S. E., Petkov, J. M. F. (2003). Texture analysis using Renyi's generalized entropies. In: Proceedings 2003 International Conference on Image Processing, IEEE, Vol. 1, I-241.

[5] Lin, J. J. (2002). Applying a co-occurrence matrix to automatic inspection of weaving density for woven fabrics. Textile Research Journal, 72(6), 486–490. DOI: 10.1177/004051750207200604

[6] Jing, J. F., Yang, P., Li, P. (2015). Determination on design cycle of printed fabrics based on distance matching function. Journal of Textile Research, 36(12), 98–103.

[7] Zhou, J., Wang, J., Pan, R., et al. (2017). Periodicity measurement for fabric texture by using frequency domain analysis and distance matching function. Journal of Dong Hua University, Natural Sciences, 43(5), 629–633.

[8] Liu, S. M., Li, P., Zhang, L., et al. (2015). Defect detection based on sparse coding dictionary learning. Journal of Xi'an Polytechnic University, 29(5), 594–599.

[9] Fu, Q. (2013). Defect detection of printed fabrics. Journal of Xi'an Aeronautical University, 31(5), 50–52.

[10] Pan, R., Gao, W., Qian, X., et al. (2010). Detection of printed fabrics using normalized cross correlation. Journal of Textile Research, 31(12), 134–138.

[11] Kuo, C. F. J., Hsu, C. T. M., Chen, W. H., et al. (2012). Automatic detection system for printed fabric defects. Textile Research Journal, 82(6), 591–601. DOI: 10.1177/0040517511426615

[12] Li, Y., Wang, R., Cui, Z., et al. (2016). Spatial pyramid covariance based compact video code for robust face retrieval in TV-series. IEEE Transactions on Image Processing, 25(12), 5905–5919. DOI: 10.1109/TIP.2016.2616297

[13] Zhu, S., Hao, C. (2012). Fabric defect detection approach based on texture periodicity analysis. Computer Engineering and Application, 48(21), 163–166.

[14] Li, M., Cui, S., Chen, J. (2016). Defect detection for mini-jacquard fabric based on visual saliency. Journal of Textile Research, 37(12), 38–42, 48.

[15] Nakhmani, A., Tannenbaum, A. (2013). A new distance measure based on generalized image normalized cross-correlation for robust video tracking and image recognition. Pattern Recognition Letters, 34(3), 315–321. DOI: 10.1016/j.patrec.2012.10.025

[16] Rao, Y. R., Prathapani, N., Nagabhooshanam, E. (2014). Application of normalized cross correlation to image registration. International Journal of Research in Engineering and Technology, 3(5), 12–16. DOI: 10.15623/ijret.2014.0317003

Published Online: 2021-05-18

© 2021 Junfeng Jing et al., published by Sciendo

This work is licensed under the Creative Commons Attribution 4.0 International License.
