
    Detection of Residual Yarn in Bobbin Based on Odd Partial Gabor Filter and Multi-Color Space Hierarchical Clustering

2023-12-28 09:11:40

    ZHANG Jin(張 瑾),ZHANG Tuanshan(張團(tuán)善),SHENG Xiaochao(盛曉超), HUYAN Pengfei(呼延鵬飛)

    1 School of Textile Science and Engineering, Xi’an Polytechnic University, Xi’an 710048, China

    2 School of Mechanical and Electrical Engineering, Xi’an Polytechnic University, Xi’an 710048, China

Abstract: For an automatic bobbin management system that simultaneously detects bobbin color and residual yarn, a composite texture segmentation and recognition operation based on an odd partial Gabor filter and multi-color space hierarchical clustering is proposed. Firstly, the parameter-optimized odd partial Gabor filter is used to distinguish bobbin and yarn texture, to explore Gabor parameters for yarn bobbins, and to accurately discriminate the frequency characteristics of yarns and texture. Secondly, multi-color clustering segmentation using color spaces such as red, green, blue (RGB) and CIELUV (LUV) solves the problems of over-segmentation and segmentation errors, which are caused by the difficulty of accurately representing the complex and variable color information of yarns in a single color space and by the low contrast between the target and the background. Finally, the segmented bobbin is combined with the odd partial Gabor edge recognition operator to further distinguish bobbin texture from yarn texture and locate the position and size of the residual yarn. Experimental results show that the method is robust in identifying complex textures, damaged and dyed bobbins, and multi-color yarns. The residual yarn identification can distinguish texture features from residual yarns well and can be transferred to the detection and differentiation of complex textures, significantly outperforming traditional methods.

    Key words:residual yarn detection; Gabor filter; image segmentation; multi-color space hierarchical clustering

    0 Introduction

Winding is the final step in the ring-spinning process, which transfers the spun yarns from the spinning bobbin into a large package containing a considerable length of yarn for weaving and knitting. Since the yarn may not be completely withdrawn from the bobbins at the winder, it is necessary to detect the amount of residual yarn and classify the bobbins according to the results. Currently, textile companies face the following three problems in the handling of bobbins. Firstly, there are many types of bobbins and the sorting situation is complicated. Secondly, manual sorting is inefficient and costly. Finally, the intermixing of yarn-carrying and yarn-free bobbins makes sorting difficult. How to effectively sort these mixed bobbins is an urgent problem for textile companies today. Residual yarn inspection combined with machine vision not only excels in production-line inspection efficiency, but also has an irreplaceable position in terms of inspection accuracy and precision, and it has become the mainstream product-line inspection technology. Figure 1 shows bobbins in the factory application process.

Non-contact methods of residual yarn detection have emerged, in which data are mainly collected with a color sensor or camera to detect the color of the bobbin. By comparing against a preset value for the same type of empty bobbin, and using a support vector machine or neural network to segment the color, the yarn region is extracted. These solutions rely heavily on the color difference between the bobbin and the yarn. They are easily misled when the two colors are close and are not ideal for detecting very small amounts of yarn. In addition, the large variety of yarns and bobbins and their complex combinations of colors, textures, and shapes make it difficult to detect the amount of residual yarn on the bobbin with simple image techniques.

Texture image segmentation is an important tool to solve this problem and is one of the main research hotspots in graphics and computer vision. There are many algorithms for texture image segmentation; they generally first extract the texture features of the image and then perform segmentation. The commonly used algorithms can be divided into three categories. (1) Filter-based algorithms, represented by the Gabor filter[1] and the wavelet filter[2], can give good results when combined with level set algorithms for energy functions. (2) Algorithms based on cluster analysis[3-5] are represented by the fusion of Gabor, Steer and other filters with color information: texture features are first extracted with filters and then fused with the color information to finally achieve the segmentation of texture images. (3) Level set algorithms based on energy functions are represented by the regional model[6] used to build segmentation models with texture features. The third category first establishes segmentation models, then extracts texture features based on local binary patterns (LBPs)[7], local ternary patterns (LTPs)[8], the Gabor filter, the structure tensor[9], the local Chan-Vese (LCV) method based on an extended structure tensor[10], tensor structures consisting of multiple features[11], and the local similarity factor (RLSF)[12], and finally segments the texture images. All the above methods are aimed at medical and remote sensing images; they are too computationally complex and insufficiently robust for application in real-time systems.

As a linear filter, the Gabor filter has frequency and direction representations very close to the banding ability and directivity of the human visual system, and it is widely used in edge detection[13], texture segmentation[14], defect detection[15-17] and other fields. Considering the color difference between the bobbin and yarns as well as the difference in surface texture, a parameter-optimized multi-directional and multi-scale filter bank is designed to filter the two-dimensional image signal, extract the edges of the wrapped yarn, and perform preliminary calculation and segmentation of the residual yarn by using the antisymmetry of the odd part of the Gabor filter.

Texture segmentation starts with color segmentation, and there are two main problems in color image segmentation: choosing a suitable color space and choosing a suitable segmentation method. The choice of the color feature space depends on the specific image and segmentation method. Currently, no single color space can replace the others and suit the segmentation of all color images[18]. Many scholars have used more complex feature selection and clustering techniques and then improved the final segmentation results with sophisticated optimization methods. Several segmentation techniques combining specific theories, methods, and tools have emerged, such as graph theory-based methods[19-20], wavelet-domain hidden Markov models[21], mean shift[22], and other information fusion strategies that have been used for image segmentation.

Combining the same segmentation method or different segmentation methods for the segmentation of multi-color and multi-feature spaces can effectively solve the two main problems of color image segmentation mentioned above and improve the segmentation effect. The combined method is simple and does not require complex segmentation theories and models. The authors of Refs. [23-24] verified the feasibility and effectiveness of this strategy in medical image segmentation, remote sensing image segmentation, and natural scene image segmentation. However, there are relatively few research results on feature fusion in image segmentation due to the particular difficulties associated with features[25-26].

In this paper, we use multiple color spaces for clustering, each dealing with its linear part, and then synthesize the segmentation results. For color space selection, there are six candidates: red, green and blue (RGB); hue-saturation-value (HSV); brightness, in-phase, quadrature-phase (BIQ); XYZ (an international standard); CIELAB (LAB) (luminosity, a, b); and LUV (to further improve and unify color evaluation methods, the International Commission on Illumination proposed the unified LUV color space, where L represents the luminance of an object, and U and V are chromaticities). Firstly, through experiments, we select two color spaces, RGB and LUV, and use the clustering method to initially segment the enhanced RGB and LUV images. Secondly, we use a second clustering to fuse the two initial segmentation results and obtain the fused segmented image. Finally, the segmentation result is obtained by region merging, which effectively solves the over-segmentation and mis-segmentation problems in natural image segmentation.

The overall structure of this paper is as follows. Section 1 introduces the experimental hardware and software configuration. Section 2 presents the detection algorithm, which is divided into three processes: bobbin image acquisition and extraction of the main regions; yarn edge extraction and segmentation based on the Gabor filter; and a color-space fusion algorithm based on multi-color space hierarchical clustering that combines Gabor edge detection to achieve residual yarn detection. Section 3 reports the experiments and analysis, Section 4 verifies the method on cases that are difficult to distinguish, and Section 5 concludes.

    1 Experimental Device

The experimental system contains the following equipment. A charge coupled device (CCD) camera MV-GED500C-T (Shenzhen Minvision, China) is used, with a resolution of 2 448 × 2 048 pixels, 9 frames per second, and an Ethernet interface, which meets the needs of the experiment and actual production applications. For the lens, an industrial lens with a focal length of 25 mm (Zhejiang Dahua Technology Co., Ltd., China) is adopted. The illumination source (Hangzhou Hikvision Digital Technology Co., Ltd., China) is a downward-inclined white LED positive light source with a color temperature of 6 500 K. The experimental platform is shown in Fig.2. The computer operating system is Windows 7, the processor is an Intel Core i5-2500K @ 3.3 GHz, the memory is 4 GB, and the graphics card is a GeForce GTX 750. The framework is Visual Studio, and the OpenCV version is 3.4.3.

According to the process of bobbin inspection, the whole system can be divided into a bobbin transfer module in the first stage, a bobbin online inspection module in the middle stage and a bobbin management module in the last stage. In the first stage, the messy bobbins need to be pre-sorted so that they can be placed more neatly before entering the online inspection in the middle stage, improving inspection efficiency. The bobbin online inspection module is the core part of the pipeline management system, and its inspection result is directly related to the effect of the later bobbin management module. The bobbin online inspection module mainly realizes the detection of bobbins with yarn and the color classification of bobbins without yarn.

Among them, yarn-containing bobbin detection precedes yarn-free bobbin color classification, and yarn-containing bobbin detection includes broken bobbin detection, end detection and yarn-presence detection. The color classification of yarn-free bobbins is based on the head and body colors of the bobbin. The bobbin online inspection system analyzes each bobbin image captured from the conveyor belt with the corresponding image processing algorithms to calculate the pixel length, end pixel width, edge detection, and color classification of the bobbin. According to the results of calculation and analysis, the system communicates with the embedded master computer to separate broken-end, inverted-end and yarn-containing bobbins from the conveyor belt and send them to the corresponding boxes. If a bobbin is qualified, it is processed for color analysis and moves on to the next module. The process diagram is shown in Fig.3.

Since the system is based on a fixed color background, in order to effectively preserve the target area, it is necessary to remove the support frame, background and other factors: a pixel mask is applied to each captured image to obtain a binary image containing only 0 and 1 values. In the initially segmented binary image, each pixel with a value of 0 is checked against the similarity threshold. If it is within this range, the pixel value is reset according to the processing relationship, and the desired color image is obtained. The whole process is depicted in Figs. 4-6.

The bobbin is positioned according to the optical band to effectively crop the experimental image and get the region of interest. Bilateral filtering can effectively filter out the noise in the image, make the contour of the bobbin clearer, effectively retain the details of the image, and extract the region of interest (ROI).

    2 Gabor Detection Algorithm

    2.1 Image acquisition of bobbin

As shown in Fig.7(a), the camera captures an image of the bobbin directly. Horizontal correction of the image is needed to crop the main area of the bobbin. First, the smallest outer rectangle of the bobbin axis region is drawn with OpenCV's built-in function, from which the angle of the bobbin is obtained as

    (1)
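The angle recovery behind this correction can be sketched outside OpenCV as well. The following is a minimal illustrative sketch (the function name and synthetic mask are hypothetical, not the paper's code) that recovers a region's orientation from the second-order moments of its pixel coordinates, which is equivalent in spirit to fitting a minimum-area rectangle:

```python
import numpy as np

def bobbin_angle(mask):
    """Orientation (degrees) of the principal axis of a binary region.

    The covariance of the foreground pixel coordinates is diagonalized;
    the eigenvector of the largest eigenvalue gives the major axis.
    """
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs - xs.mean(), ys - ys.mean()])
    eigvals, eigvecs = np.linalg.eigh(np.cov(pts))
    vx, vy = eigvecs[:, np.argmax(eigvals)]   # major-axis direction
    return float(np.degrees(np.arctan2(vy, vx)))

# A synthetic horizontal bar should come out at (a multiple of) 0 degrees.
mask = np.zeros((50, 200), dtype=np.uint8)
mask[20:30, 10:190] = 1
angle = bobbin_angle(mask)
```

Once the angle is known, the image can be rotated so that the bobbin axis is horizontal before cropping the main area.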

    2.2 Gabor filter feature extraction

2.2.1 Extraction of yarn edges using odd partial Gabor filter

Gabor filter features have been widely used as effective texture feature descriptors for target detection and recognition. In the spatial domain, the two-dimensional Gabor filter is usually obtained by multiplying a sinusoidal plane wave and a Gaussian kernel function. Gabor filters are self-similar: any Gabor filter can be obtained by scaling and rotating the mother wavelet. In practice, Gabor filters can extract relevant features from images at different orientations and different scales in the frequency domain. The functional expression of the two-dimensional Gabor filter is

G(x, y) = exp[−(x′² + γ²y′²)/(2σ²)]·exp[i(ωx′ + ψ)] (2)

where (x, y) is the pixel coordinate; x′ = x cos θ + y sin θ; y′ = −x sin θ + y cos θ; θ denotes the direction of the parallel stripes of the Gabor function, taking values from 0° to 360°; ω is the central frequency; σ is the variance of the Gaussian distribution; ψ is the phase of the sinusoidal function, and in general, ψ = 0; γ is the aspect ratio, i.e., the spatial aspect ratio, which determines the ellipticity of the Gabor function. When γ = 1, the shape is circular, and when γ < 1, the shape is elongated in the direction of the parallel stripes. σ cannot be set directly. It varies with the bandwidth of the filter's half-response spatial frequency (defined as b). The relationship between them is

σ/λ = (1/π)·(ln 2/2)^(1/2)·(2^b + 1)/(2^b − 1) (3)

where σ and λ determine the waveform of the odd part of the Gabor filter. When the product of the two is constant, the center frequency only affects the effective response region, while the waveform remains constant. Depending on the number of pixels occupied by a single yarn in the image, a filter bank consisting of filters with the same center frequency is designed, and each bank contains filters with different orientations. In the experiments, σ/λ is set to 1.2, the horizontal direction is ±90°, the number of groups is 3, and the center frequencies are 4.8, 5.6 and 6.4 Hz, respectively. The filter banks are convolved with each RGB channel component separately. The maximum value is taken over different directions at the same center frequency, and the minimum value is taken across different center frequencies to form the filter output. The processing effect of the filter is shown in Fig.8. After parameter optimization, the filter can suppress the texture of the cylinder wall and enhance the gray gradient of the bobbin at the same time.
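The odd-part kernel and the max-over-orientation / min-over-frequency combination described above can be sketched as follows; the kernel size, wavelengths and σ/λ ratio here are illustrative rather than the paper's tuned values.

```python
import numpy as np
from scipy.signal import fftconvolve

def odd_gabor_kernel(size, sigma, lam, theta, gamma=1.0):
    """Odd (sine) part of the 2-D Gabor filter: a Gaussian envelope
    modulated by sin(2*pi*x'/lam), antisymmetric across the x' = 0 axis."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xp = x * np.cos(theta) + y * np.sin(theta)
    yp = -x * np.sin(theta) + y * np.cos(theta)
    env = np.exp(-(xp ** 2 + (gamma * yp) ** 2) / (2.0 * sigma ** 2))
    return env * np.sin(2.0 * np.pi * xp / lam)

def bank_response(img, lams, thetas, ratio=1.2):
    """Per wavelength: max response over orientations; then min across
    wavelengths, as in the combination rule described in the text."""
    per_lam = []
    for lam in lams:
        resp = [np.abs(fftconvolve(img, odd_gabor_kernel(21, ratio * lam, lam, th),
                                   mode="same")) for th in thetas]
        per_lam.append(np.max(resp, axis=0))
    return np.min(per_lam, axis=0)

# A vertical step edge: the bank responds at the edge, not in flat areas.
img = np.zeros((32, 32))
img[:, 16:] = 1.0
r = bank_response(img, [4.0, 5.0], [0.0, np.pi / 2])
```

The antisymmetry of the sine part is what makes the kernel a step-edge detector: it integrates to zero over flat regions and responds strongly across a gray-level jump.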

2.2.2 Filter waveform optimization

The waveform and output response of the two-dimensional Gabor filter vary with σ, ω, θ, etc. Due to the bandpass characteristics of the two-dimensional Gabor filter, it is necessary to use a multi-center-frequency and multi-direction filter bank and to process the input signal according to yarn fineness, so that the filter produces a greater response to the edge of the cylinder while suppressing other texture. To enhance the edge detection of the odd-part Gabor filter, its waveform should have a more pronounced step feature at the axis x = 0. This step also gives a larger output response of the filter G(x, y). Let E be the integral of the response on one side of the central axis of G(x, y).

    (4)

    After the integral operation, we can get

    (5)

where I(·) represents the imaginary part of the function; L(·) represents the error function. Equation (5) shows that if σ is known, then E follows a Dawson function of ωx (the frequency at x). The maximum of the Dawson integral occurs at approximately 0.924 14. It can be seen that when E takes the maximum value, the following equation can be obtained.

    (6)

Figure 9 shows the filter output response curves with λ/σ for the test images. It can be seen that the actual peak occurs around λ/σ = 6.4 and the actual output value is consistent with the theoretical output value.
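The stationary point quoted above can be checked numerically. This quick sketch evaluates the Dawson integral D(x) = e^(−x²)·∫₀ˣ e^(t²) dt with a simple trapezoidal rule and locates its maximum, which falls near 0.924 14:

```python
import numpy as np

def dawson(x, n=2000):
    """Dawson integral D(x) via the trapezoidal rule (adequate here)."""
    t = np.linspace(0.0, x, n)
    f = np.exp(t * t)
    integral = np.sum((f[:-1] + f[1:]) * (t[1] - t[0]) / 2.0)
    return float(np.exp(-x * x) * integral)

# Sweep a range around the expected extremum and locate the peak.
xs = np.linspace(0.5, 1.5, 2001)
vals = [dawson(x) for x in xs]
x_max = float(xs[int(np.argmax(vals))])
```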

2.2.3 Filter angle selection

Figure 10 shows the ideal variation curve of the output response of the odd part of the filter with θ. The maximum response of the filter is obtained when θ = 90°. To verify the theoretical results, a total of five filters were selected to filter the test images in the range of 0° to 180° at 45° intervals. It can be seen that filtering is best when the axes of the filters coincide with the step edge lines. When θ is 90° and the centrosymmetric angle is 270°, these two angles produce the ideal filtering effect. For a vertical bobbin, too many filters in different directions reduce the operating efficiency and increase the noise.

2.2.4 Filter center frequency selection

    Fig.1 Bobbins in factory: (a) yarn bobbins in textile mills; (b) multi-color and yarn-containing bobbins that require sorting

    Fig.2 Experimental platform

    Fig.3 Process diagram : (a) yarn identification and detection system; (b) image acquisition device

    Fig.4 Image of bobbin

    Fig.7 Region of interest automatic process of image: (a) bobbin image; (b) main area of the extraction; (c) main area of bobbin

    Fig.8 Gabor rendering: (a) bobbin backbone area; (b) Gabor filtering effect of odd part

    Fig.9 Output response curve with λ/σ

    Fig.10 Output response curve with θ

    Fig.11 Theoretical output curve with respect to σ/d

    2.3 Fusion segmentation of multi-color spaces

RGB model is the most common and basic color model in digital image processing. In the RGB color space, any color can be represented as a weighted mixture of the three primary colors red, green and blue. The RGB color space, also known as the additive mixed color space, is characterized by poor color uniformity. This color space model is shown in Fig.12(a). In the LUV color space, for general images, the luminance ranges from 1 to 100, and the values of U and V are between -100 and 100: +U is red, -U is green, +V is yellow, and -V is blue. The LUV color space model is shown in Fig.12(b).

    Fig.12 Different color spaces: (a) RGB color space; (b) LUV color space

LUV is obtained from XYZ through a nonlinear transformation. The specific equations are

L = 116(Y/Yn)^(1/3) − 16 (7)

U = 13L(u′ − u′n), V = 13L(v′ − v′n), with u′ = 4X/(X + 15Y + 3Z), v′ = 9Y/(X + 15Y + 3Z) (8)

where Yn is 1.0; u′ and v′ describe the test color stimulus.
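The conversion can be sketched directly (white point Yn = 1.0 as stated above; the function name is illustrative). At the white point itself the output is L = 100 with U = V = 0:

```python
def xyz_to_luv(X, Y, Z, white=(1.0, 1.0, 1.0)):
    """CIE 1976 L*u*v* from tristimulus XYZ values."""
    def chroma(x, y, z):
        d = x + 15.0 * y + 3.0 * z
        return 4.0 * x / d, 9.0 * y / d          # u', v' of the stimulus
    Xn, Yn, Zn = white
    up, vp = chroma(X, Y, Z)
    upn, vpn = chroma(Xn, Yn, Zn)
    yr = Y / Yn
    if yr > (6.0 / 29.0) ** 3:
        L = 116.0 * yr ** (1.0 / 3.0) - 16.0     # cube-root branch
    else:
        L = (29.0 / 3.0) ** 3 * yr               # linear branch near black
    return L, 13.0 * L * (up - upn), 13.0 * L * (vp - vpn)

L, U, V = xyz_to_luv(1.0, 1.0, 1.0)              # the white point
```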

    The proposed method of fusion and segmentation of multi-color space based on hierarchical clustering is shown in Fig.13.

    Fig.13 Multi-color space fusion and segmentation method flow

The specific steps are as follows.

1) The bobbin image is enhanced and represented in the RGB and LUV color spaces respectively.

2) The first fuzzy C-means clustering is performed on each color space to obtain the initial segmentation results RRGB and RLUV.

3) The local class-labeled histogram features of RRGB and RLUV are extracted and serialized into a fused feature vector, and then the second fuzzy C-means clustering is performed to obtain the fusion segmentation result Sfusion.

4) Region merging is conducted on Sfusion to obtain the final segmentation result S.

2.3.1 Fusion and segmentation method flow based on hierarchical clustering

In color image segmentation, choosing an appropriate color space is a difficult problem because it is difficult to represent complex natural scene images in a single color space. Different color space representations can be seen as images with different channels provided by different sensors. An information fusion strategy combining complementary information from multiple color spaces is an effective way to improve the segmentation effect. Through segmentation experiments on multiple groups of natural scene images, two color spaces, RGB and LUV, are selected to represent the segmented image and perform fusion segmentation. Image enhancement can highlight the light and dark changes in the image and enhance the contrast between the background and the target, thereby effectively improving the segmentation of grayscale images. It is found that color image segmentation after enhancement processing can highlight the contours of the original image, reduce the number of segmentation blocks and improve the segmentation effect. The mathematical morphology image enhancement method is a simple and effective approach that obtains a description of the structural features of an image through the interaction of structuring elements with the image.

fE = f + (f∘b − f) + (f − f•b) = f + (f − f∘b) − (f•b − f) (9)

where f∘b and f•b denote the grayscale opening and closing of the image f by the structuring element b, so the enhancement adds the top-hat and subtracts the bottom-hat response.
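Assuming the common top-hat/bottom-hat form of morphological contrast enhancement, the step can be sketched with scipy.ndimage; the window size is illustrative:

```python
import numpy as np
from scipy.ndimage import grey_closing, grey_opening

def morph_enhance(img, size=5):
    """f + top-hat - bottom-hat: bright details are boosted and dark
    details suppressed, stretching local contrast before clustering."""
    opening = grey_opening(img, size=(size, size))
    closing = grey_closing(img, size=(size, size))
    return img.astype(float) + (img - opening) - (closing - img)
```

On a flat image the transform is the identity; isolated bright or dark details are amplified relative to their surroundings.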

2.3.2 Initial segmentation based on fuzzy C-means clustering

In general, the color histogram has a high degree of freedom. Taking the RGB color space as an example, the color histogram has 256³ degrees of freedom. Here, each color component is uniformly quantized to level P, and the color histogram has P³ degrees of freedom. For any pixel X in the image, the normalized local color histogram h1 in a window R centered on X is computed.

h1(i, R) = ni/Nw, i = 1, 2, …, P³ (10)

where ni is the number of pixels in R whose quantized color falls into bin i, and Nw is the number of pixels in the window.
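The quantized local histogram (P levels per channel, hence P³ bins instead of 256³) might be sketched like this; the window size and P are illustrative:

```python
import numpy as np

def local_color_histogram(img, center, win=7, P=4):
    """Normalized local color histogram h1 in a window R centered on a pixel.
    img is an H x W x 3 uint8 image; each channel is quantized to P levels."""
    r = win // 2
    cy, cx = center
    patch = img[max(cy - r, 0):cy + r + 1, max(cx - r, 0):cx + r + 1]
    q = (patch.astype(int) * P) // 256                 # per-channel level in [0, P)
    codes = q[..., 0] * P * P + q[..., 1] * P + q[..., 2]
    hist = np.bincount(codes.ravel(), minlength=P ** 3).astype(float)
    return hist / hist.sum()
```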

2.3.3 Fusion of the initial segmentation of multi-color space

The fuzzy C-means clustering method is used to fuse the two initial segmentation results of K1 categories, and the fusion segmentation result Sfusion with K2 different categories is obtained.

For the two initial segmentation results, the feature vectors of the local class-labeled histograms centered on pixel x are extracted respectively, and the class-labeled histograms h2 are calculated in the window Rx.

h2(j, Rx) = nj/Nw, j = 0, 1, …, K1 − 1 (11)

where nj is the number of pixels in the window labeled as (j + 1), and Nw is the number of pixels in the window. Then, the two local class-labeled histograms are concatenated and normalized to obtain the fused local class-labeled histogram h2(Rx) with vector dimensions of K1 and K2.
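The second clustering pass over the concatenated label histograms uses fuzzy C-means; below is a minimal generic fuzzy C-means sketch (standard alternating updates, not the paper's implementation), demonstrated on synthetic 2-D data:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Standard FCM: alternate fuzzy-membership and centroid updates.
    X is (n, d); returns memberships U (n, c) and centers V (c, d)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        V = (W.T @ X) / W.sum(axis=0)[:, None]          # weighted centroids
        d = np.linalg.norm(X[:, None, :] - V[None], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))                     # inverse-distance weights
        U /= U.sum(axis=1, keepdims=True)
    return U, V

# Two well-separated synthetic groups should be recovered cleanly.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (30, 2)), rng.normal(5.0, 0.1, (30, 2))])
U, V = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)
```

In the paper's pipeline the rows of X would be the concatenated local class-labeled histograms rather than raw coordinates.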

2.3.4 Regional consolidation

Since the clustering results are pixel-based, it is necessary to perform region merging on Sfusion to achieve a more complete description of the target. In the segmentation results, the distance between a region R and an adjacent region Ro is denoted as Dmerging(R, Ro),

Dmerging(R, Ro) = Σc∈C DB[hc(R), hc(Rx)], DB[h(R), h(Rx)] = {1 − Σ(i=1…Nb) [h(i, R)·h(i, Rx)]^(1/2)}^(1/2) (12)

where C = {RGB, LUV}; h(R) represents the normalized local color histogram of region R; h(Rx) represents the normalized local color histogram of a pixel x in the neighborhood Ro; DB[h(R), h(Rx)] represents the Bhattacharyya distance between the histograms; h(i, R) and h(i, Rx) represent the occurrence frequencies of bin i in the histograms of R and Rx respectively; Nb represents the number of bins in the histogram. By calculating the distance between R and all adjacent regions of R, the minimum-distance adjacent region Rmin is obtained. If Dmerging(R, Rmin) is less than the threshold T, then R is merged into Rmin. In the experiments of this paper, the segmentation of bobbins with three different amounts of residual yarn achieved a very good segmentation effect, which can effectively restrain the texture of the bobbin and eliminate the influence of reflection. The experimental results are shown in Fig.14.

    Fig.14 Segmentation effect: (a) unsegmented image; (b)segmentation result image
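Region merging with the Bhattacharyya distance can be sketched as a greedy pass (the region ids and threshold below are hypothetical, not the paper's exact procedure):

```python
import numpy as np

def bhattacharyya(h1, h2):
    """Bhattacharyya distance between two normalized histograms."""
    return float(np.sqrt(max(0.0, 1.0 - float(np.sum(np.sqrt(h1 * h2))))))

def merge_regions(region_hists, adjacency, T=0.3):
    """Each region joins its nearest adjacent region if the histogram
    distance falls below the threshold T."""
    label = {r: r for r in region_hists}
    for r, nbrs in adjacency.items():
        if not nbrs:
            continue
        dists = {o: bhattacharyya(region_hists[r], region_hists[o]) for o in nbrs}
        r_min = min(dists, key=dists.get)
        if dists[r_min] < T:
            label[r] = label[r_min]
    return label

# Regions 0 and 1 share a histogram and merge; region 2 stays separate.
h_a = np.array([0.5, 0.5, 0.0, 0.0])
h_c = np.array([0.0, 0.0, 0.5, 0.5])
labels = merge_regions({0: h_a, 1: h_a.copy(), 2: h_c},
                       {0: {1, 2}, 1: {0, 2}, 2: {0, 1}})
```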

    3 Experiment and Analysis

In this study, we select 150 bobbins as the test image set, as shown in Fig.15. We test them in the LUV color space, generally using chromaticity, saturation, and luminance according to the judgment rules of the human eye. For simplicity, we use saturation as the variable, so the tests are divided into four cases. In the first, the saturation and luminance of the bobbin and yarns are completely distinguished. In the second, the saturation of the bobbin and yarns is similar, but the hue is different. In the third, the hue of the bobbin and yarns is similar, but the saturation differs. In the fourth, there is little difference in hue and saturation between the bobbin and yarns, the industry phenomenon of "the same bobbin and the same yarn". In these four cases, the bobbin texture, slight stains, etc. should be considered. From the classification point of view, bobbin texture belongs to the fourth category.

    Fig.15 Images for testing

Regarding the clustering, theoretically two clusters are enough: one for the color of the bobbin itself and the other for the color of the yarn. In practice, however, the bobbin is reflective and the bright band counts as a cluster, and the bobbin texture counts as another, so there are four clusters in total; this increases the burden on the computer and is not suitable for low-cost situations. Thus three clusters are considered. The Gabor filter is used for texture processing. The idea is that, at the boundary of a cluster, the Gabor binarized image is checked. If the yarn shape is satisfied and the width is greater than that of the texture, it is yarn. If both are texture, further clustering of the texture is performed.

An experimental process with the yellow bobbin is carried out, because the yellow bobbin has saturation as high as that of the white bobbin, and identifying the yellow bobbin is thus the best test set for validating the system. The main reasons are as follows.

1) The RGB of yellow is (255, 255, 0) and the RGB of magenta is (255, 0, 255). These two colors are easily confused in certain color metrics, which means that colors easily distinguished by the naked eye can be mathematically close.

    2)Yellow is more likely to be contaminated and fade.

3) Yellow (255, 255, 0) is visually close to white, as shown in Fig.16. Yellow and light white yarns, which are indistinguishable by normal methods, are theoretically extremely susceptible to interference from the blue color component, resulting in a white color.

    Fig.16 Test process

    Fig.17 Comparison curves of different algorithms

This process is also the block diagram of our software, from which we can see that three methods are used. Firstly, the Gabor filter is used to get the approximate position of the yarns, as seen in Fig.16; the texture and yarns are mixed and the exact position of the yarns cannot be distinguished, so other means are needed. Secondly, RGB three-component clustering is used, and the green component gives a possible judgment of the yarn, but stain interference needs to be excluded. Thirdly, three-component clustering is performed in the LUV color space, and the location of the yarn is determined from the L component; this "L" judgment is more realistic. If the U and V components are connected on the Y-axis at the same time, the region can also be considered as having yarn. The results of the three methods are fused using the method of this paper to derive the yarns and their specific location. The accuracy of the method requires tuning the fusion parameters, which will not be elaborated here. Therefore, yarns and texture which are very close to each other can be differentiated and processed. The situation is ideal when the filter runs at a speed of 50 bobbins per second, as shown in Fig.16.

To test the accuracy and robustness of the detection algorithm in this paper, different colors of yarns and different kinds of bobbins are selected as test groups, as shown in Table 1. Five colors of white, black, blue, red and gray yarns with linear densities of 4.0 tex and 10.0 tex are selected. The true positive rate (TPR) of the residual yarn skeleton is used as the evaluation index, which is defined as

QTPR = ETP/A(TP+FN) (13)

where QTPR represents the accuracy rate, and the value range of QTPR is [0, 1]; ETP represents the number of samples where the bobbin actually carries residual yarn and the detection result is also a bobbin with residual yarn; A(TP+FN) represents the total number of bobbins that actually carry residual yarn, i.e., those detected correctly plus those missed. The higher the value, the better the classification effect. The test results corresponding to Table 1 are shown in Table 2. Figure 17 shows the comparison curves, from which it can be seen that the accuracy of the traditional algorithm is lower than 65% while that of the algorithm in this paper is higher than 80%.
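As a worked instance of Eq. (13) under the standard true-positive-rate reading (TP: correctly detected residual-yarn bobbins; FN: missed ones; the counts below are made up for illustration):

```python
def q_tpr(tp, fn):
    """Q_TPR = E_TP / A_(TP+FN): detected residual-yarn bobbins over all
    bobbins that actually carry residual yarn; range [0, 1], higher is better."""
    return tp / (tp + fn)

rate = q_tpr(42, 8)   # e.g. 42 of 50 residual-yarn bobbins detected correctly
```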

    Table 1 Test groups

    Table 2 Accuracy of test results

    4 Experimental Verification of Indistinguishability

    To further validate the method of this paper, the following are a few cases that are more difficult for the human eye to distinguish.

    4.1 Texture judgment

The bobbin in Fig.18 is very prone to error even when manually sorted. Here, the Gabor filter does not work as well as required, as shown in Fig.18(c), so color clustering must be considered to complete the yarn judgment. In the LUV color space, the L component acts obviously; in the RGB color space the blue component acts obviously, so the state marker of the yarn can be judged after fusion.

    Fig.18 Texture judgment process: (a) camera’s capture; (b) bobbin automatic identification; (c) Gabor filter; (d) L-component clustering; (e) U-component clustering; (f) V-component clustering; (g) R-component clustering; (h) G-component clustering; (i) B-component clustering

    4.2 Loose yarn judgment

This is less common but still exists in a certain percentage. Figure 19 shows an example where the yarn is not horizontal on the Y-axis and has a certain angle. With the adaptive Gabor filter used in Fig.18, the yarn contour is still distinguished, but when the yarn is close to the bobbin, the texture features are confused with the yarn features. In this identification, the concept of convex hulls is utilized, as shown in Fig.19(g).

    Fig.19 Yarn judgment process: (a) camera’s capture; (b) automatic identification; (c) Gabor filter; (d) L-component clustering; (e) U-component clustering; (f) V-component clustering; (g) R-component clustering; (h) G-component clustering; (i) B-component clustering

    5 Conclusions

In this paper, a compound texture segmentation operator combining the odd partial Gabor filter with multi-color space hierarchical clustering is used to detect residual yarns. Yarn segmentation is realized by optimizing the design of the Gabor filter banks and adjusting the parameters to maximize the odd-part amplitude within the passband. To handle the specific yarn width, the most suitable center frequency is explored. By setting a reasonable filter combination, frequencies inconsistent with the yarn direction are removed, noise is suppressed and detection efficiency is improved.

At the same time, the method combines fusion segmentation based on RGB and LUV color space hierarchical clustering to solve the over-segmentation and mis-segmentation problems caused by the low color contrast between the target and the background. Image enhancement techniques are introduced into color image segmentation so that the segmented image better reflects the contours of the original image and highlights the parts of interest.

    The results show that the algorithm can accurately detect yarn bobbins of different colors and brightness levels, and its optimization strategy provides a theoretical reference for research on non-contact bobbin sorting.
