
    An Adaptive Vision Navigation Algorithm in Agricultural IoT System for Smart Agricultural Robots

Computers, Materials & Continua, 2021, Issue 1

Zhibin Zhang, Ping Li, Shuailing Zhao, Zhimin Lv, Fang Du and Yajian An

1 School of Computer Science, Inner Mongolia University, Hohhot, 010021, China

2 Key Laboratory of Wireless Networks and Mobile Computing, School of Computer Science, Inner Mongolia University, Hohhot, 010021, China

3 Simulation Center, Air Force Early Warning Academy, Wuhan, 430019, China

Abstract: As agricultural internet of things (IoT) technology has evolved, smart agricultural robots need both flexibility and adaptability when moving in complex field environments. In this paper, we propose the concept of a vision-based navigation system for the agricultural IoT and a binocular vision navigation algorithm for smart agricultural robots, which can fuse the edge contour and height information of crop rows in images to extract the navigation parameters. First, the speeded-up robust feature (SURF) extracting and matching algorithm is used to obtain feature point pairs from the green crop row images observed by the binocular parallel vision system. Then the confidence density image is constructed by integrating the enhanced elevation image and the corresponding binarized crop row image, in which the edge contour and height information of the crop row are fused to extract the navigation parameters (θ, d) based on the model of a smart agricultural robot. Finally, five navigation network instruction sets are designed based on the navigation angle θ and the lateral distance d, which represent the basic movements for a certain type of smart agricultural robot working in a field. Simulated experimental results in the laboratory show that the algorithm proposed in this study is effective, with small turning errors and low standard deviations, and can provide a valuable reference for the further practical application of binocular vision navigation systems in smart agricultural robots in the agricultural IoT system.

Keywords: Smart agriculture robot; 3D vision guidance; confidence density image; guidance information extraction; agriculture IoT

    1 Introduction

There are many vision-based technologies used in applications of autonomous robots [1–3] and in other applications, such as the real-time visual tracking of the shape and colour features of objects in [4]. Traditional agricultural robots based on vision technology have achieved great success [5,6] and can be operated in several stages of a process to solve demanding problems in agricultural production [7]. Many researchers have studied robot navigation [8], mainly focusing on crop-row line detection. However, the field environment is so complex that navigation information extraction is affected not only by factors such as weeds and variations in illumination, but also by the irregular growth of crops. The irregularity of crop plant growth is particularly obvious in the late growth stage, when the inter-row spaces are narrow, making automatic navigation difficult for traditional agricultural robots guided by vision technology. Thus, it is necessary to develop a smart agricultural robot that can automatically adjust its posture in real time to adaptively move along an irregular crop row, and that can also be maneuvered by the control instructions of an IoT node [9]. This will prevent unevenly growing crop plants from being crushed during the automatic navigation process. Moreover, the smart agricultural robot can also overcome the deficiencies of crop-row line detection caused by dynamic and unpredictable situations such as fixed obstacles [10], which create issues for traditional agricultural robots.

Line-detection vision navigation algorithms for traditional agricultural robots have been proposed using different crop-row recognition methods for different field applications [11–13]. Searcy et al. [14] applied the Hough transform to the extraction of navigation parameters for agricultural robots. In [15], the excess green method was used to separate green crops from the soil background, and vertical projection was then used to determine the candidate points of crop centerlines to extract the row line. The authors of [16] proposed a vision approach for row recognition based on the grayscale Hough transform on intelligently merged images, which was able to detect crop rows at various growth stages. In [17], a novel automatic and robust crop row detection method based on maize field images was proposed. Some navigation algorithms based on stereo vision technology for crop row recognition have also been proposed. For instance, in [18], after a three-dimensional (3D) crop-row structure map of an entire field was created from the acquired images, a feature point tracking algorithm was used to extract the tractor motion indicated by the feature points in continuous stereo images, and the outcomes were then fed to a dynamic model of the tractor to estimate its traveling speed and heading direction. In [19], a stereo vision-based 3D ego-motion estimation system was proposed to track features in image sequences, in which the feature points were matched to obtain 3D point clouds for motion estimation. The authors of [20] proposed an unsupervised algorithm for vineyard detection and evaluation of vine row features based on the processing of 3D point-cloud maps, in which information on local vine row orientations and local inter-row distances was organized in geo-referenced maps to allow automatic path planning along the inter-row spaces. In [21], a branch detection method was developed, which used depth features and a region-based convolutional neural network (R-CNN) for the detection and localization of branches.

However, the aforementioned research does not address the edge information of plant leaves when agricultural robots advance along a crop row using two-dimensional (2D) or 3D row-line recognition, and the methods did not employ IoT technology [22]. This paper proposes a vision navigation algorithm based on the 3D morphological edge and height information of crop rows to guide a smart agricultural robot to adapt to irregular crop rows and avoid crushing crops. Furthermore, a smart agricultural robot advancing along crop rows can obtain essential real-time, non-destructive crop growth information. This information can then be transmitted to a cloud computing server in the smart agricultural IoT system to predict the yield and evaluate the health status of crops. This study makes two primary contributions: 1) we propose the concept of a smart agricultural robot vision navigation system for use in the agricultural IoT; and 2) we propose an adaptive vision navigation algorithm for the smart agricultural robot.

    2 Smart Agricultural Robot Navigation IoT System

To enable the automatic navigation of a smart agricultural robot, we designed a smart agricultural robot navigation IoT system according to the literature [23]. As shown in Fig. 1, in this system an image acquisition layer is used to collect information, a transmission layer is used to transmit data, and a cloud computing layer provides complex computing services. After the data are processed in the cloud computing layer, the results are transmitted to the controller through the transmission layer.

The function modules of the smart agricultural robot are shown in Fig. 2.

Figure 1: Framework of the agricultural IoT system for the smart agricultural robot

Figure 2: Function modules of the smart agricultural robot embedded in an agricultural IoT system

In the data acquisition layer, the image data are acquired using a Bumblebee2 binocular stereoscopic camera installed on the agricultural robot to observe green crops in real time. In the transmission layer, the collected image data are transmitted in real time to the cloud computing layer through 4G/5G protocols. In the cloud computing layer, we propose an adaptive vision navigation algorithm for the agricultural robot, which fuses the 2D and 3D information of the green crop feature points to obtain the navigation parameters of the smart agricultural robot. The robot's control center can also receive the control instructions of the cloud computing services in order to complete autonomous navigation tasks. These IoT capabilities can improve the robustness, flexibility, and reliability of the smart agricultural robot.

    3 Adaptive Vision Navigation Implementation

    3.1 Elevation Image

Stereo matching is the process of constructing corresponding point pairs in the left and right images taken from different perspectives of an object. When these points are matched, their 3D information can be obtained using Eq. (1), where Zc is the camera coordinate; Xw, Yw, and Zw are the world coordinates; and xr and yr are the image coordinates. The relationships between the coordinate systems are shown in Fig. 3. We use the right image as the reference image.

Figure 3: Diagram of relationships between the world coordinate, camera coordinate, and image coordinate systems

In Eq. (1), the rotation matrix R and the translation vector T contain the pose parameters of the camera relative to the world coordinate system [24], and are called the external parameters; hx and hy represent the physical scale of each pixel in the image coordinates and, together with the focal length f of the camera lens, are called the internal parameters. The origin of the image coordinate system is (u0, v0), with 0T = (0, 0, 0). The internal and external parameters can be obtained by the camera calibration process [25]. Based on the parallel binocular vision model, in this study the speeded-up robust feature (SURF) extracting and matching algorithm [26] is used to obtain the 3D spatial information of corresponding point pairs of green crop rows. The matched features are shown in Fig. 4. Then, the elevation image of the crop row can be obtained. As shown in Fig. 5a, the brighter a feature point region in the elevation image, the higher the crop row height it represents, according to Eq. (2).
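The definitions above match the standard homogeneous pinhole projection model; assuming Eq. (1) takes that standard form, it reads

$$ Z_c \begin{bmatrix} x_r \\ y_r \\ 1 \end{bmatrix} = \begin{bmatrix} f/h_x & 0 & u_0 & 0 \\ 0 & f/h_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} $$

As a concrete illustration of the matching step, the following Python sketch (assuming OpenCV built with the contrib modules, which expose cv2.xfeatures2d.SURF_create) ratio-test matches SURF features across a rectified stereo pair and recovers depth from disparity under the parallel binocular model; the focal length and baseline below are placeholder values, not the Bumblebee2 calibration.

```python
import cv2

F_PX = 700.0     # focal length in pixels -- placeholder calibration value
BASELINE = 0.12  # camera baseline in meters -- placeholder calibration value

def surf_stereo_points(left_gray, right_gray, ratio=0.7):
    """Return (x_r, y_r, Zc) triples; the right image is the reference."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp_l, des_l = surf.detectAndCompute(left_gray, None)
    kp_r, des_r = surf.detectAndCompute(right_gray, None)

    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_r, des_l, k=2)

    points = []
    for m, n in matches:
        if m.distance < ratio * n.distance:      # Lowe's ratio test
            xr, yr = kp_r[m.queryIdx].pt
            xl, _ = kp_l[m.trainIdx].pt
            disparity = xl - xr                  # rectified pair: horizontal shift only
            if disparity > 1e-3:
                points.append((xr, yr, F_PX * BASELINE / disparity))
    return points
```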

Figure 4: Results of SURF feature extracting and matching

Figure 5: Process of producing enhanced elevation images. (a) Elevation image (b) Filtered image (c) Enhanced image

In Eq. (2), max and min denote taking the maximum and minimum of Yw, respectively. The function f(Yw) represents the grayscale value at a given point as a function of Yw, denoting the height of the crops in the elevation image. Considering that the 3D morphological and structural characteristics of crop rows are roughly consistent relative to weeds or other plants in the field, we aim to preserve only certain heights of the crop plants according to Eq. (3) to improve the robustness of detecting crop rows, where hc is a threshold value (hc = 16 in the experiments), Yw ∈ (0, 25) cm, and f(Yw) ∈ (0, 255). The processed result is shown in Fig. 5b.

From Fig. 5b, we see that the points that do not meet the height requirement are completely removed. However, the available feature points are relatively sparse, resulting in poor functionality of the elevation image. To eliminate the impact of sparse feature points in elevation images, we dilate the feature points into adjacent regions by using the morphological dilation operator with a template size of 4 × 4. A typical resulting image is shown in Fig. 5c. In this way, the regions of feature points are extended to some extent in the elevation image, increasing the stability and reliability of the navigation parameter extraction process.
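A minimal sketch of the elevation-image enhancement, assuming Eq. (2) is a min-max normalization of Yw to gray levels and Eq. (3) is a hard threshold at hc (the exact forms are not reproduced here):

```python
import cv2
import numpy as np

HC = 16  # height threshold hc used in the experiments

def enhanced_elevation(points, shape):
    """points: (x_r, y_r, Yw) triples from stereo matching; shape: image size.
    Rasterizes heights to gray levels, suppresses low points, then dilates."""
    elev = np.zeros(shape, dtype=np.uint8)
    heights = np.array([p[2] for p in points], dtype=np.float32)
    lo, hi = heights.min(), heights.max()
    for x, y, h in points:
        gray = 255.0 * (h - lo) / (hi - lo + 1e-9)  # brighter = taller (assumed Eq. (2))
        if gray > HC:                               # keep sufficient heights (assumed Eq. (3))
            elev[int(y), int(x)] = int(gray)
    kernel = np.ones((4, 4), np.uint8)              # 4 x 4 template from the text
    return cv2.dilate(elev, kernel)                 # extend sparse feature regions
```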

    3.2 Image Edge Extraction

The elevation image emphasizes only the height information of the crop row, but the crop row edge information is also important for the navigation system of the agricultural robot, particularly for uneven crop rows in the late growth stage under relatively complex field environmental conditions. Therefore, the crop row edge information is extracted to ensure that the crop is not crushed during automatic navigation. First, the excess green method [27] is used to extract green crop rows from field images. The green crop and its soil background are represented by black and white pixels, respectively, as shown in Fig. 6b. Second, the noise points in the corresponding binary image are filtered using a median filter with a template size of 5 × 5. Isolated noise points and weed patches (smaller than five pixels) can be removed completely, as shown in Fig. 6c. Then the LoG operator [28] is used to extract crop edges; a typical resulting image is shown in Fig. 6d. Clearly, the entire outer contour of the row cannot be obtained directly from this result. Therefore, the dilation method with a template size of 5 × 5 is first used to link the edge curve segments detected by the LoG operator, as shown in Fig. 6e. If the template is too small, it will affect the contour connectivity; if it is too large, it may introduce noise points into the crop row edges. Next, we fill the connected regions inside the row using a hole-filling method, as shown in Fig. 6f. Then the erosion method is used to remove the isolated points on the outer edges of the row, using the same template size as the dilation operator above, as shown in Fig. 6g. Finally, we extract the complete edge contours of the rows, as shown in Fig. 6h. These edge contours are overlaid on the original image in Fig. 6i, and it can be seen that they are consistent with the edge boundaries of the real rows.
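This pipeline can be sketched in Python with OpenCV as follows; the excess green index 2G − R − B with Otsu binarization is a common instantiation of [27] rather than its exact settings, and the LoG step is approximated by a Gaussian blur followed by a Laplacian.

```python
import cv2
import numpy as np

def crop_row_edge_contours(bgr):
    """Sketch of the edge-contour pipeline; crop pixels are treated as
    white foreground here (Fig. 6b uses the opposite convention)."""
    b, g, r = cv2.split(bgr.astype(np.float32))
    exg = 2 * g - r - b                                   # excess green index
    exg = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, binary = cv2.threshold(exg, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    binary = cv2.medianBlur(binary, 5)                    # 5 x 5 median filter

    log = cv2.Laplacian(cv2.GaussianBlur(binary, (5, 5), 0), cv2.CV_8U)
    _, edges = cv2.threshold(log, 0, 255, cv2.THRESH_BINARY)  # LoG edge map

    kernel = np.ones((5, 5), np.uint8)
    linked = cv2.dilate(edges, kernel)                    # link broken edge segments

    # Hole filling: flood-fill the background from a corner, invert, OR back.
    flood = linked.copy()
    mask = np.zeros((flood.shape[0] + 2, flood.shape[1] + 2), np.uint8)
    cv2.floodFill(flood, mask, (0, 0), 255)
    filled = linked | cv2.bitwise_not(flood)

    eroded = cv2.erode(filled, kernel)                    # same 5 x 5 template
    contours, _ = cv2.findContours(eroded, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return contours
```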

Figure 6: Extraction process of edge contours of crop rows. (a) Original image (b) Binarized image (c) Filtered result (d) Edge extraction (e) Dilated image (f) Filled image (g) Eroded image (h) Edge contour image (i) Edges detected

    3.3 Confidence-Based Dense Image

To make full use of the crop row growth information, we fuse the height and edge information to produce adaptive navigation parameters for the agricultural robot using Eq. (4); the fused image is called a confidence dense image, as shown in Fig. 7d. During the fusion, if the grayscale value of a point in the fused image exceeds 255, it is set to 255. In Eq. (4), I1(i) is the grayscale value of the ith pixel of the binarized edge image (corresponding to a 2D image); I2(i) is the corresponding value of the elevation image (corresponding to a 3D image), ranging from 0 to 255; and w is a fusing factor that weights the grayscale values in the fused image.
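A minimal numpy sketch of this fusion, assuming Eq. (4) is the clipped weighted sum I(i) = min(I1(i) + w·I2(i), 255), with w below a placeholder value:

```python
import numpy as np

def confidence_dense_image(edge_img, elevation_img, w=0.5):
    """Fuse the binarized edge image (2D cue) and the enhanced elevation
    image (3D cue); grayscale values above 255 are clipped to 255."""
    fused = edge_img.astype(np.float32) + w * elevation_img.astype(np.float32)
    return np.clip(fused, 0, 255).astype(np.uint8)
```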

Figure 7: Process of producing the confidence-based dense image. (a) Original image (b) Elevation image (c) Crop edge image (d) Confidence dense image

The confidence dense image proposed in this paper can be interpreted as the probability of a crop plant occurring at the corresponding position in a row image. If the grayscale value of a pixel in the crop row image is larger than that of another, the probability of this point being a crop row point is relatively higher; if it is smaller, the probability is relatively lower (the threshold value is set to hc/2, as in Eq. (3)). At the same time, the black pixels inside the crop region may be weeds, or crops that do not reach the set threshold height. In this case, the binarized edge image can be used to obtain navigation information, and the elevation image can be used to improve the robustness of recognizing irregular crop rows. Therefore, the confidence-based dense image can be used to reliably extract the parameters needed for the navigation system of the smart agricultural robot.

    3.4 Navigation Instructions

In the experiments, the agricultural robot used a four-wheel differential steering method. The steering model is shown in Fig. 8a. The parameters W and L represent the width and length of the agricultural robot, respectively, and σ represents the steering angle. This is a typical structural model of a smart agricultural robot.

In field environments, the edge contours of crop rows show different morphological features. This characteristic is not considered by existing conventional navigation algorithms, which focus on extracting green crop-row lines. Moreover, when the crop plant is in its late growth stage, its edge contour information is more important than an extracted row line for guiding the smart agricultural robot to avoid crushing crop plant leaves. Accordingly, we have designed five basic adaptive navigation control network instructions, which are sent by the smart agricultural IoT system and are based on tangent lines of the edge contours to extract the navigation parameters. This allows the smart agricultural robot to make adaptive posture adjustments during the automatic navigation process. In some green-leaf fields in particular, such as kale and cabbage in their late growth stages, the boundaries of crop leaves need to be considered in the navigation information of the smart agricultural robot. Otherwise, crushed crop leaves will affect the crop yield prediction and health status analysis when the robot works in the field to transmit spectral image data to the cloud computing server [29].

Figure 8: Steering model and diagram of navigation parameters. (a) Steering model (b) Diagram of navigation parameters

Our navigation parameter extraction model is shown in Fig. 8b, in which we assume that the rectangle formed by the dotted line is one frame of the crop image in the computer buffer taken by the camera. The point O, marked in red, is regarded as a reference point with high density in the x-coordinate direction, and is calculated from the white points of the elevation image. Ll and Lr are tangent lines passing through the two edge points A and B, respectively.

In Fig. 8b, α denotes the angle between Ll and the x-axis, and β denotes the angle between Lr and the x-axis. The navigation control angle θ is obtained by Eq. (5). The distances d1 and d2 are measured from the reference point to the corresponding two edge points of the crop row, respectively. The lateral distance d of the agricultural robot relative to the reference point is expressed by Eq. (6).

In Eq. (5), (x1, y1) and (x2, y2) belong to Ll; (x3, y3) and (x4, y4) belong to Lr.
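Since the later sections refer to θ as the angle difference between the two tangent lines, a plausible sketch computes α and β from the endpoint pairs and takes θ = α − β, with d taken as the signed half-difference of d1 and d2; both forms are illustrative assumptions, not the exact Eqs. (5) and (6).

```python
import math

def navigation_parameters(p1, p2, p3, p4, d1, d2):
    """p1, p2 lie on Ll; p3, p4 lie on Lr; d1, d2 are the distances from
    the reference point O to the two edge points A and B."""
    alpha = math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))  # Ll vs. x-axis
    beta = math.degrees(math.atan2(p4[1] - p3[1], p4[0] - p3[0]))   # Lr vs. x-axis
    theta = alpha - beta      # navigation control angle (assumed form of Eq. (5))
    d = (d1 - d2) / 2.0       # lateral distance (assumed form of Eq. (6))
    return theta, d
```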

Generally, the working status of a smart agricultural robot is either straight moving or turning. The straight moving status is easy to steer, while the turning statuses are relatively complex. Thus, the turning statuses are divided into four cases: left turning, right turning, right turn with straight moving, and left turn with straight moving. The network instruction sets for the corresponding statuses, sent by the smart agricultural IoT system, are expressed in Eqs. (7)–(11), where θt and dt are the thresholds corresponding to the navigation angle θ and the lateral distance d. The thresholds θt and dt are set to 35° and 15 cm, respectively.

The moving instructions are determined by the moving statuses of the smart agricultural robot in the field, which represent its basic moving steps as follows (a schematic decision sketch follows the five cases below).

1) Instruction set of straight moving status

In this case, the distance between the left crop boundary and the wheel is roughly the same as that between the right boundary and the corresponding wheel. When the parameter d satisfies Eq. (7), the smart agricultural robot enters the straight moving status.

2) Instruction set of right turning status

When Eq. (8) is satisfied, the agricultural robot enters the right turning status. This usually occurs when the angle difference θ between the two tangent lines is relatively large; therefore, the possibility of crops being on the right of the frame image is higher.

3) Instruction set of right turn with straight moving status

When Eq. (9) is satisfied, the agricultural robot turns right and then goes straight. In this case, the angle difference θ between the two tangent lines is relatively small, but the possibility of crops being on the right is higher. Therefore, the agricultural robot needs to make a slight adjustment to the right and then advance in a straight line.

4) Instruction set of left turning status

When Eq. (10) is satisfied, the agricultural robot turns left. This usually occurs when the angle difference θ between the two tangent lines is relatively large; therefore, the possibility of crops being on the left of the frame image is higher.

5) Instruction set of left turn with straight moving status

When Eq. (11) is satisfied, the agricultural robot turns left and then moves in a straight line. In this case, the angle difference θ between the two tangent lines is relatively small, but the possibility of crops being on the left of the frame image is high. Therefore, the agricultural robot needs to make a slight adjustment to the left and then advance in a straight line.
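As the exact inequalities of Eqs. (7)–(11) are not reproduced here, the following sketch encodes one plausible reading of the five statuses from the descriptions above, with θt = 35° and dt = 15 cm; the inequalities and the sign convention of d are assumptions.

```python
THETA_T = 35.0  # threshold on the navigation angle theta (degrees)
D_T = 15.0      # threshold on the lateral distance d (cm)

def instruction_set(theta, d):
    """Map the navigation parameters (theta, d) to one of the five
    network instructions; d > 0 is assumed to mean crops are more
    likely on the right of the frame image."""
    if abs(d) <= D_T:
        return "STRAIGHT"                                                       # Eq. (7)
    if d > D_T:
        return "RIGHT_TURN" if abs(theta) > THETA_T else "RIGHT_TURN_STRAIGHT"  # Eqs. (8), (9)
    return "LEFT_TURN" if abs(theta) > THETA_T else "LEFT_TURN_STRAIGHT"        # Eqs. (10), (11)
```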

When the serial image data from the binocular cameras in the data acquisition layer are processed in real time in the cloud computing layer, the resulting instruction sets are transmitted in real time to the controller through the transmission layer to control the corresponding actual movements of the smart agricultural robot.

    4 Experimental Results and Discussions

In the experiments, we used the Bumblebee2 binocular vision system (Model BB2-03S2C-60, Canada) and a smart agricultural robot (manufactured in Shanghai, China). The agricultural robot is 107 cm long and 82.3 cm wide, with a tire width of 15 cm. These specifications are designed according to the field operation requirements for smart agricultural robots in North China. All image processing code was run in a C++ environment on a computer with an Intel Core 2 Duo CPU and 1.96 GB of RAM to test the adaptive navigation algorithm proposed in this study; this setup can only meet the low-speed requirement of less than 0.5 m/s for the smart agricultural robot, and will need to be extended to the cloud computing layer of the smart agricultural robot navigation IoT system designed in this study to speed up the image data processing and accomplish more intelligent operations in the field. The navigation parameter extraction and the designed motion instruction sets were validated in simulation experiments by designing O-type and S-type moving paths, as shown in Figs. 9a and 9b. The moving trails of the smart agricultural robot were recorded by putting black toner on the middle of its tires. We then manually recorded the data of the moving trails and the planned paths. In Figs. 9c and 9d, the black curves represent the actual moving trails of the smart agricultural robot, and the red curves represent the edge contours of the simulated row.

Figure 9: Display of planned paths and moving trails. (a) O-type path (b) S-type path (c) O-type moving trail (d) S-type moving trail

In the experiments, the smart agricultural robot did not crush the simulated crop plant leaves when the navigation parameter d satisfied d ∈ [−15.6, 15.6] cm, according to the crop row space and the robot's width. The results from running the experiment six times are shown in Figs. 10a and 10b, in which the actual measured value of d ranges from −10 cm to 10 cm. This means that the values of the parameter d in the experiments all fell into the required range, indicating that the smart agricultural robot could move normally along a simulated crop row edge contour without crushing its leaves.

Figure 10: Measured d of the two moving trails. (a) d values of the O-type trail (b) d values of the S-type trail

To highlight the edge-based navigation method proposed in this study, contrast experiments based on the maximum-density row-line detection method without edge information proposed in [30] (the speed of the agricultural robot was also less than 0.5 m/s) were conducted on the same O-type and S-type experimental paths. The motion trails of the smart agricultural robot are shown in Figs. 11a and 11b, where the black lines are the robot's actual paths. The values of d were obtained by conducting the experiments six times, as shown in Figs. 11c and 11d.

Figure 11: Comparison with the experimental results of the method proposed in [30]. (a) O-type path (b) S-type path (c) d values of the O-type trail (d) d values of the S-type trail

In these experimental results, some d values located above the red line or below the blue line exceed the required range, indicating that the simulated crop leaves at those positions were crushed by the smart agricultural robot. Their means are 7.18 cm and 8.00 cm, with standard deviations of 4.67 cm and 5.82 cm, respectively. In contrast, the experimental results from running our algorithm, shown in Fig. 10, indicate that such crop-crushing situations never occur, with means of only 3.85 cm and 3.00 cm and corresponding standard deviations of only 2.44 cm and 1.92 cm.

Furthermore, we fit a curve equation relating the steering angle σ and the navigation angle θ of the agricultural robot by using the Fourier fitting function of Matlab14, as shown in Eq. (12), where the edge contour point parameters (σ, θ) are sampled every 4 cm along an O-type planned path; the parameter σ is obtained by running our algorithm procedure, and the parameter θ is measured manually.
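Matlab's first-order Fourier model is f(x) = a0 + a1 cos(wx) + b1 sin(wx); since the order of the fit in Eq. (12) and the coefficient values in Tab. 1 are not reproduced here, the following scipy sketch assumes a first-order model and uses stand-in data in place of the recorded (σ, θ) samples.

```python
import numpy as np
from scipy.optimize import curve_fit

def fourier1(theta, a0, a1, b1, w):
    """First-order Fourier series, matching Matlab's 'fourier1' model."""
    return a0 + a1 * np.cos(w * theta) + b1 * np.sin(w * theta)

# Stand-in samples: theta measured manually, sigma produced by the algorithm,
# taken every 4 cm along the O-type path in the actual experiment.
theta = np.linspace(-40.0, 40.0, 41)
sigma = 12.0 * np.sin(0.04 * theta) + 0.5   # synthetic placeholder data

params, _ = curve_fit(fourier1, theta, sigma, p0=[0.0, 1.0, 1.0, 0.05])
ss_res = np.sum((sigma - fourier1(theta, *params)) ** 2)
ss_tot = np.sum((sigma - sigma.mean()) ** 2)
print("coefficients:", params, "R^2:", 1.0 - ss_res / ss_tot)
```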

In this experiment, the coefficients of the equation are shown in Tab. 1.

Table 1: Function parameter values

To further validate the above function, the testing process was performed on an S-type planned path. The results are shown in Fig. 12.

Figure 12: Comparison of the manually measured σ and the σ calculated by Eq. (12)

Due to the irregularity of crop rows in their late growth stages, the smart agricultural robot needs to adjust its moving posture during navigation. Therefore, the fitted equation is nonlinear, with an R2 of 0.96. The absolute mean of the turning angle error is 0.7°, with an absolute standard deviation of 1.5°, indicating that our navigation algorithm gives the agricultural robot good turning performance. Although the experimental results were obtained in simulated environments, without loss of generality, our proposed algorithm fully fuses the edge and height information of real crop rows. It can therefore be embedded into the smart agricultural IoT, laying a foundation for the vision navigation of smart agricultural robots in the field.

    5 Conclusions

To achieve automatic navigation of a smart agricultural robot, we proposed an adaptive vision navigation algorithm that can be embedded into the smart agricultural robot IoT system we designed. The adaptive vision navigation algorithm fuses the edge contour and height information of crops to extract the navigation parameters of the smart agricultural robot. The navigation network instruction sets designed in this study were successfully validated according to the moving statuses of the smart agricultural robot in the field. The simulated experimental results show that the smart agricultural robot can autonomously advance along S-type and O-type planned paths without crushing the leaves of the crop plants when its speed is less than 0.5 m/s, with an absolute mean turning angle error of 0.7° and an absolute standard deviation of 1.5°. Our work provides a valuable reference for the further practical application of smart agricultural robots responding to green crops in different growth periods.

Acknowledgement: We are grateful to the National Natural Science Foundation of China for its support, and to all reviewers for their work and patience.

Funding Statement: This study was financially supported by the National Natural Science Foundation of China (No. 31760345). The author who received the grant is Zhibin Zhang. The URL of the sponsor's website is http://www.nsfc.gov.cn/.

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
