
    Detection of Residual Yarn in Bobbin Based on Odd Partial Gabor Filter and Multi-Color Space Hierarchical Clustering

2023-12-28

ZHANG Jin (張瑾), ZHANG Tuanshan (張團善), SHENG Xiaochao (盛曉超), HUYAN Pengfei (呼延鵬飛)

    1 School of Textile Science and Engineering, Xi’an Polytechnic University, Xi’an 710048, China

    2 School of Mechanical and Electrical Engineering, Xi’an Polytechnic University, Xi’an 710048, China

Abstract: For an automatic bobbin management system that simultaneously detects bobbin color and residual yarn, a composite texture segmentation and recognition method based on an odd partial Gabor filter and multi-color space hierarchical clustering is proposed. Firstly, the parameter-optimized odd partial Gabor filter is used to distinguish bobbin and yarn texture, to explore suitable Gabor parameters for yarn bobbins, and to accurately discriminate the frequency characteristics of yarns and texture. Secondly, multi-color clustering segmentation using color spaces such as red, green, blue (RGB) and CIELUV (LUV) solves the problems of over-segmentation and segmentation errors, which are caused by the difficulty of accurately representing the complex and variable color information of yarns in a single color space and by the low contrast between the target and the background. Finally, the segmented bobbin is combined with the odd partial Gabor edge recognition operator to further distinguish bobbin texture from yarn texture and to locate the position and size of the residual yarn. Experimental results show that the method is robust in identifying complex texture, damaged and dyed bobbins, and multi-color yarns. The residual yarn identification can distinguish texture features from residual yarns well and can be transferred to the detection and differentiation of other complex textures, which is significantly better than traditional methods.

    Key words:residual yarn detection; Gabor filter; image segmentation; multi-color space hierarchical clustering

    0 Introduction

    Winding is the final step in the ring-spinning process, which transfers the spun yarns from the spinning bobbin into a large package form containing a considerable length of yarns for weaving and knitting. Since the bobbins may not be completely withdrawn from the winder, it is necessary to detect the amount of residual yarn and classify the bobbin according to the results. Currently, textile companies have the following three problems with the handling of bobbins. Firstly, there are many types of bobbins and the sorting situation is complicated. Secondly, manual sorting is inefficient and costly. Finally, the intermixing of yarns and yarn-free bobbins makes sorting difficult. How to effectively sort out these mixed bobbins is an urgent problem for textile companies today. The residual yarn inspection technology combined with machine vision not only excels in line inspection efficiency, but also has an irreplaceable position in terms of inspection accuracy and precision, and it has become the mainstream product line inspection technology today. Figure 1 shows bobbins in the factory application process.

Non-contact methods of residual yarn detection have emerged, in which the data are mainly collected using a color sensor or camera to detect the color of the bobbin. By comparing against an empty bobbin of the same type or a preset value, and using a support vector machine or neural network to segment the color, the yarn region is extracted. These solutions rely on the color difference between the bobbin and the yarn. They easily misidentify when the two colors are close together and are not ideal for detecting very small amounts of yarn. In addition, the large variety of yarns and bobbins and their complex combinations of colors, texture, and shapes make it difficult to detect the amount of residual yarn on the bobbin using simple image techniques.

Texture image segmentation is an important tool to solve this problem and is one of the main research hotspots in graphics and computer vision. There are many algorithms for texture image segmentation, which first extract the texture features of the image and then perform the segmentation. The commonly used algorithms can be divided into three categories. (1) Filter-based algorithms, represented by the Gabor filter[1] and the wavelet filter[2], can get good results when combined with level set algorithms based on energy functions. (2) Algorithms based on cluster analysis[3-5] are represented by the fusion of Gabor, steerable and other filters with color information: they first extract texture features with filters and then fuse the color information to achieve the segmentation of texture images. (3) Level set algorithms based on energy functions are represented by the region model[6] used to build the segmentation model and texture features. The third category first establishes segmentation models, then extracts texture features based on local binary patterns (LBPs)[7], local ternary patterns (LTPs)[8], the Gabor filter, the structure tensor[9], the local Chan-Vese (LCV) method based on the extended structure tensor[10], a tensor structure consisting of multiple features[11], and the region-based local similarity factor (RLSF)[12], and finally segments the texture images. All the above methods target medical and remote sensing images, and they are too complex and insufficiently robust for application in real-time systems.

    As a linear filter, the frequency and direction representation of Gabor filter is very close to the banding ability and directivity of the human visual system, and is widely used in edge detection[13], texture segmentation[14], defect detection[15-17]and other fields. Considering the color difference between the bobbin and yarns as well as the difference in surface texture, a parameter-optimized multi-directional and multi-scale filter bank is designed to filter the two-dimensional image signal, extract the edges of the wrapped yarn, and perform preliminary calculation and segmentation of the residual yarn by using the antisymmetry of the odd part of the Gabor filter.

Texture segmentation starts with color segmentation, and there are two main problems in color image segmentation: choosing a suitable color space and choosing a suitable segmentation method. The choice of the color feature space depends on the specific image and segmentation method. Currently, no single color space can replace the others and suit the segmentation of all color images[18]. Many scholars have used more complex feature selection and clustering techniques and then improved the final segmentation results with sophisticated optimization methods. Several segmentation techniques that combine specific theories, methods, and tools have emerged, such as graph-theory-based methods[19-20], the wavelet-domain hidden Markov model[21], mean shift[22], and other information fusion strategies that have been used for image segmentation.

    Combining the same segmentation method or different segmentation methods for the segmentation of multi-color and multi-feature space can effectively solve the two main problems of color image segmentation mentioned above and improve the segmentation effect. The combined method is simple and does not require complex segmentation theories and models. The authors[23-24]verified the feasibility and effectiveness of this strategy in medical image segmentation, remote sensing image segmentation, and natural scene image segmentation. However, there are relatively few research results on feature fusion in image segmentation due to the particular difficulties associated with features[25-26].

In this paper, we use multiple color spaces for clustering, each handling its linear part, and then synthesize the segmentation results. In terms of color space selection, there are six candidates: red, green and blue (RGB); hue-saturation-value (HSV); brightness, in-phase, quadrature-phase (BIQ); XYZ (an international standard); CIELAB (LAB) (luminosity, a, b); and LUV (to further improve and unify color evaluation methods, the International Commission on Illumination proposed a unified LUV color space, where L represents the luminance of an object, and U and V are chromaticities). Firstly, through experiments, we select two color spaces, RGB and LUV, and use a clustering method to initially segment the enhanced RGB and LUV images. Secondly, we use a second clustering step to fuse the two initial segmentation results and obtain the fused segmented images. Finally, the segmentation results are obtained by region merging, which effectively solves the over-segmentation and mis-segmentation problems in natural image segmentation.

The overall structure of this paper is as follows. Section 1 introduces the experimental software and hardware configuration. Section 2 presents the detection algorithm, which is divided into three processes: bobbin image acquisition and extraction of the main regions; the specific method of yarn edge extraction and segmentation based on the Gabor filter; and the color space fusion algorithm based on multi-color space hierarchical clustering and segmentation, combined with Gabor edge detection, to achieve residual yarn detection. Sections 3 and 4 report the experiments and their analysis, and Section 5 summarizes the proposed method.

    1 Experimental Device

The experimental system contains the following equipment. A charge-coupled device (CCD) camera MV-GED500C-T (Shenzhen Minvision, China) is used, with a resolution of 2 448 × 2 048 pixels, 9 frames per second, and an Ethernet interface, which meets the needs of the experiment and actual production applications. For the lens, an industrial lens with a focal length of 25 mm (Zhejiang Dahua Technology Co., Ltd., China) is adopted. The illumination source (Hangzhou Hikvision Digital Technology Co., Ltd., China) is a downward-inclined white LED positive light source with a color temperature of 6 500 K. The experimental platform is shown in Fig.2. The computer operating system is Windows 7, the processor is an i5-2500K@3.3 GHz, the memory is 4 GB, and the graphics card is a GeForce GTX 750. The framework is Visual Studio, and the OpenCV version is 3.4.3.

    According to the process of bobbin inspection, the whole system can be divided into a bobbin transfer module in the first stage, the bobbin online inspection module in the middle stage and the bobbin management module in the last stage. In the first stage, the messy bobbin needs to be pre-sorted, so that the bobbin can be placed more neatly before entering the online inspection in the middle stage to improve the inspection efficiency. The bobbin online inspection module is the core part of the pipeline management system, and the inspection result is directly related to the effect of the later bobbin management module. The bobbin online inspection module mainly realizes the detection of the bobbin with yarns and the color classification detection of the bobbin without yarns.

Among them, yarn-containing bobbin detection precedes the color classification of yarn-free bobbins, and yarn-containing bobbin detection includes broken-bobbin detection, end detection and yarn-presence detection. The color classification of yarn-free bobbins is based on the head and body color of the bobbin. The online inspection system analyzes each bobbin image captured from the conveyor belt with the corresponding image processing algorithm to calculate the pixel length, end pixel width, edge detection, and color classification of the bobbin. According to the results of calculation and analysis, the system communicates with the embedded master computer to separate broken-end, inverted-end and yarn-containing bobbins from the conveyor belt and send them to the corresponding boxes. A qualified bobbin is then processed for color analysis and moves on to the next module. The process diagram is shown in Fig.3.

    Since the system is based on a fixed color background, in order to effectively preserve the target area, it is necessary to remove the support frame, background and other factors, and it is necessary to pixel mask each read image to obtain a binary image containing only 0 and 1 values. In the initially segmented binary image, each pixel with a value of 0 is judged to be within the similarity threshold. If it is within this range, the pixel range is reset according to the processing relationship, and the desired color image is obtained. The whole process is depicted in Figs. 4-6.

    The bobbin is positioned according to the optical band to effectively cut the experimental map and get the region of interest. The bilateral filtering can effectively filter out the noise in the image, make the contour of the bobbin clearer, effectively retain the details of the image, and extract the region of interest (ROI).

    2 Gabor Detection Algorithm

    2.1 Image acquisition of bobbin

As shown in Fig.7(a), the camera captures an image of the bobbin spool directly. Horizontal correction of the image is needed to crop the main area of the bobbin spool. First, the smallest enclosing rectangle of the bobbin axis region is drawn with OpenCV, and the rotation angle of the bobbin is then obtained from OpenCV's built-in functions as

    (1)
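The OpenCV routine used for the angle is not named in the text. As an illustrative sketch only (not the authors' implementation), the tilt of the bobbin region can be recovered from the second-order central moments of its binary mask, which encode the same orientation information as a minimum-area bounding rectangle:

```python
import numpy as np

def bobbin_angle_deg(mask):
    """Principal-axis orientation (degrees) of a binary region,
    computed from second-order central moments."""
    ys, xs = np.nonzero(mask)
    x0, y0 = xs.mean(), ys.mean()
    mu20 = ((xs - x0) ** 2).mean()          # variance along x
    mu02 = ((ys - y0) ** 2).mean()          # variance along y
    mu11 = ((xs - x0) * (ys - y0)).mean()   # covariance
    # Angle of the covariance ellipse's major axis w.r.t. the x-axis.
    return 0.5 * np.degrees(np.arctan2(2.0 * mu11, mu20 - mu02))
```

For a horizontal bobbin the angle is near 0°; rotating the image by the negative of this angle performs the horizontal correction before cropping.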

    2.2 Gabor filter feature extraction

2.2.1 Extraction of yarn edges using odd partial Gabor filter

Gabor filter features have been widely used as effective texture feature descriptors for target detection and recognition. In the spatial domain, a two-dimensional Gabor filter is usually obtained by multiplying a sinusoidal plane wave with a Gaussian kernel function. Gabor filters are self-similar: any Gabor filter can be obtained by scaling and rotating the mother wavelet. In practice, Gabor filters can extract relevant features from images at different orientations and different scales in the frequency domain. For the odd part of a two-dimensional Gabor filter, the functional expression is

G(x, y) = exp[−(x′² + γ²y′²)/(2σ²)]sin(ωx′ + ψ)    (2)

where (x, y) is the pixel coordinate; x′ = xcosθ + ysinθ; y′ = −xsinθ + ycosθ; θ denotes the direction of the parallel stripes of the Gabor function, taking values from 0° to 360°; ω is the central frequency; σ is the variance of the Gaussian distribution; ψ is the phase of the sinusoidal function, and in general, ψ = 0; γ is the aspect ratio, i.e., the spatial aspect ratio, which determines the ellipticity of the Gabor function. When γ = 1, the shape is circular, and when γ < 1, the shape is elongated in the direction of the parallel stripes. σ cannot be set directly. It varies with the bandwidth of the filter's half-response spatial frequency (defined as b). The relationship between them is

σ/λ = (1/π)·√(ln2/2)·(2^b + 1)/(2^b − 1)    (3)

where σ and λ determine the waveform of the odd part of the Gabor filter. When the product of the two is constant, the center frequency only affects the effective response region, while the waveform remains constant. Depending on the number of pixels occupied by a single yarn in the image, a filter bank consisting of filters with the same center frequency is designed, and each bank contains filters with different orientations. In the experiments, σ/λ is set to 1.2, the horizontal direction is ±90°, the number of groups is 3, and the center frequencies are 4.8, 5.6 and 6.4 Hz, respectively. The filter banks are convolved with each RGB channel component separately. The maximum value is taken over different directions at the same center frequency, and the minimum value is taken across different center frequencies to form the filter output. The processing effect of the filter is shown in Fig.8. After parameter optimization, the filter suppresses the texture of the cylinder wall while enhancing the gray gradient of the bobbin.
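The implementation is not given in the paper; the following numpy-only sketch shows the odd (sine) part of a Gabor kernel and the max-over-orientations / min-over-frequencies combination rule described above, with σ/λ fixed at the text's value of 1.2. The kernel size and wavelength parameterization (pixels per cycle rather than Hz) are assumptions.

```python
import numpy as np

def convolve2d_same(img, ker):
    """'Same'-size linear 2-D convolution via FFT (numpy only)."""
    H, W = img.shape
    kh, kw = ker.shape
    s = (H + kh - 1, W + kw - 1)
    full = np.fft.irfft2(np.fft.rfft2(img, s) * np.fft.rfft2(ker, s), s)
    return full[kh // 2:kh // 2 + H, kw // 2:kw // 2 + W]

def odd_gabor_kernel(size, sigma, lam, theta, gamma=1.0):
    """Odd (antisymmetric, sine) part of a 2-D Gabor filter."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    env = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2.0 * sigma ** 2))
    return env * np.sin(2.0 * np.pi * xr / lam)

def filter_bank_response(img, lams, thetas, ratio=1.2, size=31):
    """Maximum over orientations at each centre wavelength, then the
    minimum across centre wavelengths, as described in the text."""
    per_freq = []
    for lam in lams:
        responses = [np.abs(convolve2d_same(img,
                            odd_gabor_kernel(size, ratio * lam, lam, t)))
                     for t in thetas]
        per_freq.append(np.maximum.reduce(responses))
    return np.minimum.reduce(per_freq)
```

In this convention, θ = 0 makes the kernel vary along x, so a vertical step edge (a yarn boundary) yields a strong response, while texture at other center frequencies is suppressed by the min operation.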

2.2.2 Filter waveform optimization

The waveform and output response of the two-dimensional Gabor filter vary with σ, ω, θ, etc. Due to the bandpass characteristics of the two-dimensional Gabor filter, it is necessary to use a multi-center-frequency, multi-direction filter bank and to process the input signal according to yarn fineness, so that the filter produces a greater response at the edge of the cylinder while suppressing other texture. To enhance the edge detection of the odd-part Gabor filter, its waveform should have a more pronounced step feature at the axis x = 0. This step also gives a larger output response of the filter G(x, y). Let E be the integral of the response on one side of the central axis of G(x, y).

    (4)

    After the integral operation, we can get

    (5)

where I(·) represents the imaginary part of the function; L(·) represents the error function. Equation (5) shows that if σ is known, then E follows a Dawson function in ωx (the frequency at x). The maximum of the Dawson function occurs at approximately 0.924 14. When E takes its maximum value, the following equation can be obtained.

    (6)

Figure 9 shows the filter output response curves with λ/σ for the test images. It can be seen that the actual peak occurs around λ/σ = 6.4 and the actual output value is consistent with the theoretical output value.
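Since the body of Eq. (5) is not reproduced above, the quoted peak location can at least be verified numerically: the Dawson function D(x) = e^(−x²)∫₀ˣ e^(t²)dt does attain its maximum near 0.924 14. A numpy-only check by cumulative trapezoidal integration:

```python
import numpy as np

# Evaluate the Dawson function D(x) = exp(-x^2) * integral_0^x exp(t^2) dt
# on a dense grid via a cumulative trapezoidal rule.
t = np.linspace(0.0, 1.5, 30001)
f = np.exp(t * t)
integral = np.concatenate(([0.0],
                           np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(t))))
D = np.exp(-t * t) * integral

peak_x = t[np.argmax(D)]   # location of the maximum, approx. 0.924 14
```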

2.2.3 Filter angle selection

Figure 10 shows the ideal variation curve of the output response of the odd part of the filter with θ. The maximum response of the filter is obtained when θ = 90°. To verify the theoretical results, a total of five filters were selected to filter the test images in the range of 0° to 180° at 45° intervals. It can be seen that filtering is best when the axes of the filters coincide with the step edge lines. When θ is 90°, and at the centrosymmetric angle of 270°, the ideal filtering effect is produced. For a vertical bobbin, too many filters in different directions reduce the operating efficiency and increase the noise.

2.2.4 Filter center frequency selection

    Fig.1 Bobbins in factory: (a) yarn bobbins in textile mills; (b) multi-color and yarn-containing bobbins that require sorting

    Fig.2 Experimental platform

Fig.3 Process diagram: (a) yarn identification and detection system; (b) image acquisition device

    Fig.4 Image of bobbin

    Fig.7 Region of interest automatic process of image: (a) bobbin image; (b) main area of the extraction; (c) main area of bobbin

    Fig.8 Gabor rendering: (a) bobbin backbone area; (b) Gabor filtering effect of odd part

    Fig.9 Output response curve with λ/σ

    Fig.10 Output response curve with θ

    Fig.11 Theoretical output curve with respect to σ/d

    2.3 Fusion segmentation of multi-color spaces

The RGB model is the most common and basic color model in digital image processing. In the RGB color space, any color can be represented as a weighted mixture of the three primary colors red, green and blue. The RGB color space, also known as the additive mixed color space, is characterized by poor color uniformity. This color space model is shown in Fig.12(a). In the LUV color space, for general images the luminance L ranges from 0 to 100, and the values of U and V lie between −100 and 100; +U is red, −U is green, +V is yellow, and −V is blue. The LUV color space model is shown in Fig.12(b).

    Fig.12 Different color spaces: (a) RGB color space; (b) LUV color space

The LUV values are obtained from XYZ through a nonlinear transformation. The specific equations are

    (7)

    (8)

where Yn is 1.0; u′ and v′ describe the test color stimulus.
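The bodies of Eqs. (7) and (8) are not reproduced above. The standard CIE 1976 L*u*v* definitions, which match the quoted symbols (Yn, u′, v′), can be sketched as follows; the white-point chromaticities un, vn are assumed here to be those of illuminant D65, since the text only states Yn = 1.0.

```python
import numpy as np

def xyz_to_luv(X, Y, Z, Yn=1.0, un=0.19784, vn=0.46834):
    """Standard CIE 1976 L*u*v* from XYZ tristimulus values.
    un, vn: white-point chromaticities (D65 assumed here)."""
    d = X + 15.0 * Y + 3.0 * Z
    up = 4.0 * X / d                  # u' of the test color stimulus
    vp = 9.0 * Y / d                  # v' of the test color stimulus
    yr = Y / Yn
    L = np.where(yr > (6.0 / 29.0) ** 3,
                 116.0 * np.cbrt(yr) - 16.0,   # cube-root branch
                 (29.0 / 3.0) ** 3 * yr)       # linear branch near black
    return L, 13.0 * L * (up - un), 13.0 * L * (vp - vn)
```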

    The proposed method of fusion and segmentation of multi-color space based on hierarchical clustering is shown in Fig.13.

    Fig.13 Multi-color space fusion and segmentation method flow

    The specific steps are as follows.

1) The image to be segmented is represented in the RGB and LUV color spaces, and each representation is enhanced.

2) The first fuzzy C-means clustering is applied to each enhanced representation to obtain the initial segmentation results R_RGB and R_LUV.

3) The local class-labeled histogram features of R_RGB and R_LUV are extracted and concatenated into a fused feature vector, and then the second fuzzy C-means clustering is performed to obtain a fusion segmentation result S_fusion.

4) Conduct region merging on S_fusion to obtain the final segmentation result S.

2.3.1 Fusion and segmentation method flow based on hierarchical clustering

    In color image segmentation, choosing an appropriate color space is a difficult problem because it is difficult to represent complex natural scene images in a single-color space. Different color space representations can be seen as images with different channels provided by different sensors. An information fusion strategy combining complementary information from multi-color spaces is an effective way to improve the segmentation effect. Through the segmentation experiment of multiple groups of natural scene images, two color spaces, RGB and LUV, are selected to represent the segmented image and perform fusion segmentation. Image enhancement can highlight the light and dark changes in the image, and enhance the contrast between the background and the target, thereby effectively improving the segmentation effect of grayscale images. It is found that color image segmentation after enhancement processing can highlight the contours of the original image, reduce the number of segmentation blocks and improve the segmentation effect. The mathematical morphology image enhancement method is a simple and effective image enhancement method that can obtain a description of the structural features of an image by the influence of structural elements on the image.

    (9)

2.3.2 Initial segmentation based on fuzzy C-means clustering

In general, the color histogram has a high degree of freedom. Taking the RGB color space as an example, the color histogram has 256³ degrees of freedom. Here, each color component is uniformly quantized to P levels, so the color histogram has P³ degrees of freedom. For any pixel X in the image, the normalized local color histogram h1 in a window R centered on X is computed.

    (10)
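The paper runs fuzzy C-means twice on local histogram features. As the clustering details are not spelled out, the following is a minimal standard fuzzy C-means sketch (the fuzzifier m = 2 and the iteration budget are assumptions), operating on row-wise feature vectors:

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Standard fuzzy C-means. X: (n, d) feature matrix.
    Returns (centers of shape (c, d), membership matrix U of shape (c, n))."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0)                       # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m
        centers = (W @ X) / W.sum(axis=1, keepdims=True)
        # Squared distances of every point to every center.
        d2 = ((X[None, :, :] - centers[:, None, :]) ** 2).sum(-1) + 1e-12
        U = d2 ** (-1.0 / (m - 1.0))         # standard membership update
        U /= U.sum(axis=0)
    return centers, U
```

In the paper's pipeline, X would hold the normalized local color histograms h1 (first round) or the concatenated class-label histograms (second round).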

2.3.3 Fusion of the initial segmentation of multi-color space

The fuzzy C-means clustering method is used to fuse the two initial segmentation results of K1 categories, and the fusion segmentation result S_fusion of K2 different categories is obtained.

For the two initial segmentation results, the feature vectors of the local class-labeled histograms centered on pixel x are extracted respectively, and the class-labeled histograms h2 are calculated in the window Rx.

h2(j) = nj/Nw, j = 0, 1, …, K1 − 1    (11)

where nj is the number of pixels in the window labeled as (j+1); Nw represents the number of pixels in the window. Then, the two local class-labeled histograms are concatenated and normalized to obtain the fused local class-labeled histogram h2(Rx) with vector dimensions of K1 and K2.

2.3.4 Regional consolidation

Since the clustering results are pixel-based, it is necessary to perform region merging on S_fusion to achieve a more complete description of the target. In the segmentation results, the distance between a region R and an adjacent region Ro is denoted as D_merging(R, Ro),

    (12)

where C = {RGB, LUV}; h(R) represents the normalized local color histogram of region R; h(Rx) represents the normalized local color histogram of a pixel x in the adjacent region Ro; DB[h(R), h(Rx)] represents the Bhattacharyya distance between the histograms; h(i, R) and h(i, Rx) represent the occurrence frequency of bin i in the histograms of R and Rx respectively; Nb represents the number of bins in the histogram. By calculating the distance between R and all its adjacent regions, the minimum-distance adjacent region R_min is obtained. If the distance D_merging(R, R_min) is less than the threshold T, then R is merged into R_min. In the experiments of this paper, the segmentation of three different yarn volume fractions achieved a very good effect, which can effectively restrain the bobbin texture and eliminate the influence of reflection. The experimental results are shown in Fig.14.

    Fig.14 Segmentation effect: (a) unsegmented image; (b)segmentation result image
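The body of Eq. (12) is not reproduced above, but its named ingredient, the Bhattacharyya distance between normalized histograms, and the threshold merge test can be sketched as follows. Averaging the distance over the two color spaces is an assumption about how C = {RGB, LUV} enters the missing formula:

```python
import numpy as np

def bhattacharyya(h1, h2):
    """Bhattacharyya distance between two normalized histograms."""
    bc = np.sum(np.sqrt(h1 * h2))          # Bhattacharyya coefficient
    return np.sqrt(max(0.0, 1.0 - bc))

def should_merge(hists_R, hists_Ro, T):
    """hists_R / hists_Ro: dicts keyed by color space ('RGB', 'LUV').
    Merge R into Ro when the averaged distance falls below threshold T."""
    d = np.mean([bhattacharyya(hists_R[c], hists_Ro[c]) for c in hists_R])
    return d < T
```

Identical histograms give distance 0 (always merged), disjoint histograms give distance 1 (never merged), so T directly trades over-merging against over-segmentation.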

    3 Experiment and Analysis

In this study, we select 150 bobbins as the test image set, as shown in Fig.15. We test them in the LUV color space and, following the judgment rules of human eyes, generally use hue, saturation, and luminance. For simplicity, we take saturation as the variable, so the tests are divided into four cases. In the first, the saturation and luminance of the bobbin and yarns are completely distinct. In the second, the saturation of the bobbin and yarns is similar, but the hue is different. In the third, the hue of the bobbin and yarns is similar, but the saturation differs. In the fourth, the bobbin and yarns differ little in both hue and saturation, the industry phenomenon of "the same bobbin and the same yarn". In these four cases, the bobbin texture, slight stains, etc. should be considered. From the classification point of view, bobbin texture belongs to the fourth category.

    Fig.15 Images for testing

Regarding the clustering, theoretically two clusters are enough: one for the color of the bobbin itself and one for the color of the yarn. In practice, however, the bobbin is reflective and its bright band counts as a cluster, and the bobbin texture counts as another, giving four clusters in total. This increases the computational burden and is not suitable for low-cost situations, so three clusters are used. The Gabor filter is used for texture processing. The idea is that, at the boundary of a cluster, the Gabor binarized image is checked. If the yarn shape is satisfied and the width is greater than that of the texture, it is yarn. If both are texture, further clustering of the texture is performed.

An experiment with the yellow bobbin is carried out, because the highly saturated yellow bobbin is easily confused with the white bobbin, and identifying the yellow bobbin is therefore the best test case to validate the system. The main reasons are as follows.

1) The RGB of yellow is (255, 255, 0) and the RGB of magenta is (255, 0, 255). These two colors are easily confused in color components, which means that colors easily distinguished by the naked eye can be mathematically very close.

    2)Yellow is more likely to be contaminated and fade.

3) Yellow (255, 255, 0) can appear visually close to white, as shown in Fig.16. Yellow and light white yarns, which are indistinguishable by normal methods, are theoretically extremely susceptible to interference from the blue color component, resulting in white.

    Fig.16 Test process

    Fig.17 Comparison curves of different algorithms

This process is also the block diagram of our software. From the diagram, it can be seen that three methods are used. Firstly, the Gabor filter is used to get the approximate position of the yarns, as seen in Fig.16; the texture and yarns are mixed and the exact position of the yarns cannot be distinguished, so other means are needed. Secondly, RGB three-component clustering is used, and the green component gives a possible yarn judgment, but stain interference needs to be excluded. Thirdly, three-component clustering in the LUV color space is used, with the L component determining the location of the yarn; this L judgment is more realistic. If the U and V components are connected on the Y-axis at the same time, it can also be considered as having yarn. The results of the three methods are fused by the method of this paper to derive the yarns and their specific locations. The accuracy of the method requires tuning the fusion parameters, which is not developed here. Therefore, yarns and texture that are very close to each other can be differentiated and processed. The situation is ideal when the filter runs at a speed of 50 bobbins per second, as shown in Fig.16.

    To test the accuracy and robustness of the detection algorithm in this paper, different colors of yarns and different kinds of bobbins are selected for test groups, and the test groups are shown in Table 1. Five colors of white, black, blue, red and gray yarns with linear densities of 4.0 tex and 10.0 tex are selected, the true positive rate (TPR) of the residual yarn skeleton is used as the test evaluation index, which is defined as

Q_TPR = E_TP/A_(TP+FN)    (13)

where Q_TPR represents the accuracy rate, with a value range of [0, 1]; E_TP represents the number of bobbins that actually carry residual yarn and are also detected as carrying residual yarn; A_(TP+FN) represents the total number of bobbins that actually carry residual yarn, i.e., those detected correctly plus those missed as empty. The higher the value, the better the classification effect. The test results corresponding to Table 1 are shown in Table 2. Figure 17 shows the comparison curves, from which it can be seen that the accuracy of the traditional algorithm is lower than 65% while that of the algorithm in this paper is higher than 80%.
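With the definitions above, Eq. (13) reduces to the standard true positive rate. A minimal sketch:

```python
def true_positive_rate(actual, predicted):
    """Q_TPR = E_TP / A_(TP+FN): among bobbins that actually carry
    residual yarn (label 1), the fraction detected as such."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    return tp / (tp + fn)
```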

    Table 1 Test groups

    Table 2 Accuracy of test results

    4 Experimental Verification of Indistinguishability

    To further validate the method of this paper, the following are a few cases that are more difficult for the human eye to distinguish.

    4.1 Texture judgment

The bobbin in Fig.18 is very prone to error even when sorted manually. Here, the Gabor filter does not work as well as required, as shown in Fig.18(c), so color clustering must be used to complete the yarn judgment. In the LUV color space, the L component acts obviously; in the RGB color space, the blue component acts obviously, so the state marker of the yarn can be judged after fusion.

    Fig.18 Texture judgment process: (a) camera’s capture; (b) bobbin automatic identification; (c) Gabor filter; (d) L-component clustering; (e) U-component clustering; (f) V-component clustering; (g) R-component clustering; (h) G-component clustering; (i) B-component clustering

    4.2 Loose yarn judgment

This case is less common but still occurs in a certain percentage. Figure 19 shows an example where the yarn is not horizontal with respect to the Y-axis and lies at a certain angle. With the adaptive Gabor filter used in Fig.18, the yarn contour is still distinguished, but when the yarn is close to the bobbin, the texture features are confused with the yarn features. In this identification, the concept of convex hulls is utilized, as shown in Fig.19(g).
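The convex hull of the detected yarn pixels can be computed with Andrew's monotone chain; the following is an illustrative sketch, not the authors' implementation:

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull.
    points: iterable of (x, y) tuples; returns CCW hull vertices."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]
```

The hull vertices then bound the position and extent of the loose yarn even when it is not aligned with the Y-axis.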

    Fig.19 Yarn judgment process: (a) camera’s capture; (b) automatic identification; (c) Gabor filter; (d) L-component clustering; (e) U-component clustering; (f) V-component clustering; (g) R-component clustering; (h) G-component clustering; (i) B-component clustering

    5 Conclusions

In this paper, an odd partial Gabor filter and a multi-color space hierarchical clustering compound texture segmentation operator are used to detect residual yarns. Yarn segmentation is realized by optimizing the design of the Gabor filter banks and adjusting the parameters to maximize the odd-part response amplitude within the passband. To handle the specific yarn width, the most suitable center frequency is explored. By setting a reasonable filter combination, frequencies inconsistent with the yarn direction are removed, noise is suppressed, and the detection efficiency is improved.

At the same time, the method combines fusion segmentation based on RGB and LUV color space hierarchical clustering to solve the over-segmentation and mis-segmentation problems caused by the low contrast between the target and the background in color. Image enhancement techniques are introduced into color image segmentation so that the segmented image better reflects the contours of the original image and highlights the parts of interest.

The results show that the algorithm can accurately detect yarn bobbins with different colors and brightness, and its optimization strategy provides a theoretical reference for research on non-contact bobbin sorting.
