
    Deep Learning Control for Autonomous Robot

Computers, Materials & Continua, 2022, Issue 8

Rihem Farkh, Saad Alhuwaimel, Sultan Alzahrani, Khaled Al Jaloud and Mohammad Tabrez Quasim

1 College of Engineering, Muzahimiyah Branch, King Saud University, Riyadh, 11451, Saudi Arabia

2 Laboratory for Analysis, Conception and Control of Systems, LR-11-ES20, Department of Electrical Engineering, National Engineering School of Tunis, Tunis El Manar University, 1002, Tunisia

3 King Abdulaziz City for Science and Technology, Saudi Arabia

4 College of Computing and Information Technology, University of Bisha, Bisha, 67714, Saudi Arabia

Abstract: Several applications of machine learning and artificial intelligence have acquired importance and come to the fore as a result of recent advances in these approaches. Autonomous cars are one such application and are expected to have a significant and revolutionary influence on society. Integration with smart cities, new infrastructure, and urban planning with sophisticated cyber-security are some of the current ramifications of self-driving automobiles. The autonomous automobile, often known as a self-driving system or driverless vehicle, is a vehicle that can perceive its surroundings and navigate predetermined routes without human involvement. Cars are on the verge of evolving into autonomous robots, thanks to significant breakthroughs in artificial intelligence and related technologies, and this will have a wide range of socio-economic implications. However, for these automobiles to become a reality, they must be endowed with the perception and cognition necessary to deal with high-pressure real-life events, make proper judgments, and take appropriate action. The majority of self-driving car technologies are based on computer systems that automate parts of vehicle control. These technological components cover a wide range of capabilities, from forward-collision warning and antilock brakes to lane-keeping and adaptive drive control, up to fully automated driving. A self-driving car combines a wide range of sensors, actuators, and cameras. Recent research on computer vision and deep learning is used to control autonomous driving systems. For self-driving automobiles, lane-keeping is crucial. This study presents a deep learning approach to obtain the proper steering angle to keep the robot in the lane. We propose an advanced control scheme for a self-driving robot that uses two controllers simultaneously: convolutional neural networks (CNNs) are employed to predict the car's steering angle, and a proportional-integral-derivative (PID) controller is designed for speed and steering control. This study uses a Raspberry Pi based camera to control the robot car.

Keywords: Autonomous car; cascade PID control; deep learning; convolutional neural network; differential drive system; Raspberry Pi; road lane detector

    1 Introduction

Since the middle of the 1980s, many universities, research centers, car companies, and companies in other industries have studied and developed self-driving cars, also known as autonomous cars or driverless cars (Fig. 1).

Figure 1: Self-driving car

The Navlab mobile platform [1], the University of Pavia and University of Parma's car ARGO [2], and UBM's vehicles VaMoRs and VaMP are all important examples of self-driving car research platforms from the last two decades. DARPA (Defense Advanced Research Projects Agency) held a challenge in 2004 to see whether self-driving cars could complete a course, but no vehicle finished it. Sebastian Thrun led his team to victory in the second challenge, which took place in 2005. Google developed efficient self-driving car technology in a Toyota Prius, which was licensed by the Department of Motor Vehicles in 2012 [3].

Deep learning and artificial intelligence have been key technologies behind many breakthroughs in computer vision and robotics over the last decade [4,5]. They also have a huge influence on the emerging autonomous driving revolution in academia and industry. A self-driving or autonomous car can sense its environment and travel to a pre-determined destination without the assistance of a human driver, on a route that has not been specifically built for it [6].

With advances in technology and transportation, more cars now offer self-driving modes that help protect passengers' health on long drives and reduce the risk of traffic accidents. The advantages of self-driving vehicles include the ability to predict and manage traffic problems and to provide built-up accessibility for all users. Academic and industry research groups have developed various low-cost systems for studying research problems in autonomous driving and control [7-12].

Self-driving vehicles must maintain their lane. Although autonomous vehicles carry numerous sensors, such as radar, LiDAR, ultrasonic sensors, and infrared cameras, ordinary color cameras remain critical because of their low cost and capacity to collect rich data [13,14]. Lane detection is a critical environmental perception tool for creating a self-driving vehicle. It is essential to monitor a vehicle's movement relative to the road lane in which it is driving [10,11,15,16].

After detecting the two lane lines, we compute a virtual line in the center and approximate the offset angle between the vehicle's body and this virtual line, then adjust the vehicle's steering so that the vehicle stays in the middle of the two lines under all circumstances [17].

The above-mentioned steering angle calculations are complicated and can result in several errors. Many roads have no lane markings or only blurry ones. Furthermore, as a car speeds along a sloping road, a camera mounted as described above will point toward the sky and fail to keep up with the lane ahead, which can also lead to erroneous detection [18].

Recent developments in low-cost sensors and small-scale robot cars have provided new incentives and prospects for overcoming these limitations; examples include TurtleBot [19], the Donkey car [20], and DFROBOT [21]. Students and researchers interested in self-driving can use these robot cars. They can also design and build their own robot cars using low-cost sensors and units, such as remote-control cars, Raspberry Pi (RPi), GPS, Wi-Fi, Bluetooth, and single/stereo cameras [22].

Convolutional neural networks (CNNs) have been the most prevalent deep learning architecture since their introduction in the 1990s, owing to their efficacy on image classification problems. Deep learning has been widely used for self-driving vehicles. Obstacle detection, scene recognition, and lane recognition are all major issues in self-driving cars that must be solved [23].

Using CNNs, Reference [24] demonstrated an end-to-end learning method for self-driving vehicles. Reference [25] introduced a system using YOLO and CNN algorithms that detects road lanes and objects and makes suggestions to the driver. However, that system was recommended only for use on highways, not on city streets.

This study proposes a self-driving vehicle prototype in which experimental hardware components and methodologies are applied to a self-driving car. The device consists of an RPi as the key part running the algorithms, cameras linked to the RPi, and various sensors. The Arduino is also a critical component because it controls the vehicle motors and their movements.

    2 Autonomous Robot Car Architecture

Training the car in a real-world environment, such as a motorway or a road with heavy traffic, is a risky and costly task. To manage this, a small RPi-based car and a custom environment were built so that training could be conducted feasibly.

This section describes the architecture and system block. A suitable configuration was chosen for developing a line-tracking robot using a Pi camera connected to the motor driver IC through a Raspberry Pi 3 B+. Fig. 2 illustrates this configuration.

Figure 2: Self-driving robot structure

The system's implementation on the Raspberry Pi provides the following functions:

• Data and image collection using the Pi camera

• Determining the speeds of the left and right motors using the PID controllers and the predicted steering angle

• Directing the robot in the desired direction.

    3 Deep Learning Based PID Control for Self-Driving Vehicle

To make the vehicle autonomous, image processing, AI, and deep learning approaches are used. To control the robot, a cascaded PID controller is designed. Fig. 3 shows the overall proposed control diagram.

Figure 3: Main controller for the autonomous lane-keeping robot

To control the robot's maneuvers, steering control is needed. Therefore, two PID controllers are employed to control the robot's direction. The first PID controls the throttling speed (base speed) based on the steering angle calculation. The steering angle is determined using an image processing controller: the input of the system is an image, and the output is the steering angle in degrees (θ). The algorithm is applied to a single image (one frame) and repeated for all frames. The second PID determines the car's deviation speed using the CNN's θ prediction. The trained network receives an image frame from a single forward-looking camera as input and generates a predicted θ value as output at each control period.

If θ = 90°, the car will drive forward without needing to be steered. A constant throttling speed of 50% PWM is used to ensure that the robot car moves forward at 50% of its total speed. If θ > 90°, the car should steer to the right; otherwise, it should steer left.

    The driving signals for the left and right motors are calculated using the following equations.

Right speed = throttling speed (PID1) − speed correction (PID2)

Left speed = throttling speed (PID1) + speed correction (PID2)

These driving signals are used to generate the motor speed set-points for the robot. The speed control is tuned so that the robot moves according to the speed set-point input using PID3 and PID4. For precise speed control of the two DC motors, encoders and PID controllers are required to obtain the actual positions and shaft rotation speeds.
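As an illustrative sketch (not the authors' exact implementation), the speed mixing above can be written as a small helper; throttle_pid and correction_pid stand for the outputs of PID1 and PID2, and a 0-100% PWM range is assumed.

```python
def mix_motor_speeds(throttle_pid, correction_pid, pwm_max=100.0):
    """Combine throttling speed (PID1) and steering correction (PID2)
    into left/right wheel set-points for the differential drive."""
    right_speed = throttle_pid - correction_pid
    left_speed = throttle_pid + correction_pid
    clamp = lambda v: max(0.0, min(pwm_max, v))   # keep within PWM range
    return clamp(left_speed), clamp(right_speed)

# Example: 50% base throttle with a small correction that steers right.
left, right = mix_motor_speeds(50.0, 8.0)   # left = 58.0, right = 42.0
```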

    3.1 Steering Angle Calculator

Lane detection is a critical component of intelligent transportation systems. For more than a decade, lane identification has been an important research topic, with numerous techniques suggested, such as the Hough transform, Kalman filtering, and probabilistic fitting. Fig. 4 represents the entire operation. The system's input is images and its output is θ.

Figure 4: Steering angle (θ) calculation process

    Camera

The camera records video at a resolution of 100×100 (Fig. 5). Since the frame rate (fps) drops after the processing techniques are applied to each frame, we propose lowering the resolution to obtain a better fps.

Figure 5: Road line
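A minimal capture sketch, assuming the legacy picamera Python API on the RPi; the resolution, frame rate, and exposure time follow the values reported in this paper, while everything else is illustrative.

```python
import picamera
import picamera.array

# 100x100 resolution, 10 fps, 5000 us exposure to limit motion blur.
with picamera.PiCamera() as camera:
    camera.resolution = (100, 100)
    camera.framerate = 10
    camera.shutter_speed = 5000          # exposure time in microseconds

    with picamera.array.PiRGBArray(camera, size=(100, 100)) as stream:
        for _ in camera.capture_continuous(stream, format='bgr',
                                           use_video_port=True):
            frame = stream.array         # H x W x 3 BGR array for OpenCV
            # ... run the lane-detection / CNN pipeline on `frame` ...
            stream.truncate(0)           # reset the buffer for the next frame
```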

Hue, saturation, value (HSV) color space conversion

After taking the video frames from the camera, the next step is to convert each frame into the HSV color space (Fig. 6). The main benefit is that it helps distinguish between colors regardless of their luminance levels [26].

Figure 6: Conversion of the image to the HSV color space

Detect white color and edges

After converting the image to the HSV color space, we detect only the color we are interested in (i.e., white, since it is the color of the lane lines). Edges are then detected using the Canny edge detector to minimize the overall distortion in each frame (Fig. 7) [26].

Figure 7: Canny edge detection
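A minimal OpenCV sketch of the HSV conversion, white-color masking, and Canny edge detection described above; the HSV thresholds and Canny limits are illustrative values, not necessarily those used by the authors.

```python
import cv2
import numpy as np

def detect_edges(frame_bgr):
    """Convert to HSV, keep only near-white pixels, then run Canny."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower_white = np.array([0, 0, 180])      # any hue, low saturation,
    upper_white = np.array([180, 40, 255])   # high value (illustrative)
    mask = cv2.inRange(hsv, lower_white, upper_white)
    return cv2.Canny(mask, 50, 150)          # illustrative hysteresis limits
```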

Select region of interest (ROI)

To concentrate on only one area of the image, an ROI must be chosen. We do not want the robot car to react to everything in its environment, so we focus only on the lane lines in the bottom half of the frame and disregard the rest (Fig. 8) [26].

Figure 8: ROI selection
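A sketch of the ROI step, keeping only the bottom half of the edge image as described above.

```python
import cv2
import numpy as np

def region_of_interest(edges):
    """Mask out everything except the bottom half of the frame."""
    height, width = edges.shape
    mask = np.zeros_like(edges)
    polygon = np.array([[(0, height // 2), (width, height // 2),
                         (width, height), (0, height)]], dtype=np.int32)
    cv2.fillPoly(mask, polygon, 255)
    return cv2.bitwise_and(edges, mask)
```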

Detect line segments

Line segments are then extracted from the masked edge image using the probabilistic Hough transform (Fig. 9).

Figure 9: Identification of lane lines on the road

Average slope and intercept (m, b)

Here, the averages of the slopes and intercepts of the line segments identified by the Hough transform are determined. All lines with a negative slope are assigned to the left lane; the right lane is the opposite. To improve detection accuracy, two boundary lines divide each image into two sections (right and left). All width points greater than the right boundary line contribute to the right lane estimate, and all width points less than the left boundary line contribute to the left lane estimate (Fig. 10) [26].

Figure 10: The left and right lane lines superimposed onto the original image
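A sketch of the segment detection and slope/intercept averaging; the probabilistic Hough parameters are illustrative, and the left/right split follows the sign-of-slope rule described above.

```python
import cv2
import numpy as np

def average_lane_lines(edges):
    """Detect segments with the probabilistic Hough transform and average
    their (slope, intercept) pairs separately for the left and right lanes."""
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=10, minLineLength=8, maxLineGap=4)
    if segments is None:
        return None, None
    left_fits, right_fits = [], []
    for x1, y1, x2, y2 in segments.reshape(-1, 4):
        if x1 == x2:                          # skip vertical segments
            continue
        slope, intercept = np.polyfit((x1, x2), (y1, y2), 1)
        (left_fits if slope < 0 else right_fits).append((slope, intercept))
    left = np.mean(left_fits, axis=0) if left_fits else None
    right = np.mean(right_fits, axis=0) if right_fits else None
    return left, right                        # each is (m, b) or None
```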

    Calculate and display heading line

The heading line determines the direction in which the steering motor should rotate and the speed at which the throttling motors should run. If θ = 90°, the car can drive forward without having to steer. If θ > 90°, the car should turn right; otherwise, it should turn left.
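A sketch of the heading-line computation: the averaged left and right lines are evaluated to obtain a mid-lane target point, and θ is measured so that 90° means straight ahead, matching the steering rule above. The variable names are illustrative.

```python
import numpy as np

def steering_angle(left, right, width=100, height=100):
    """Compute the heading angle theta (degrees) from the averaged
    left/right lane lines, each given as (slope, intercept)."""
    y = height // 2                      # evaluate the lines at mid-height
    xs = []
    for line in (left, right):
        if line is not None:
            m, b = line
            xs.append((y - b) / m)       # x where the lane line crosses y
    if not xs:
        return 90.0                      # no lines detected: drive straight
    x_mid = float(np.mean(xs))           # target point on the heading line
    dx = x_mid - width / 2               # horizontal offset from image centre
    dy = height - y                      # vertical distance to the target
    # 90 deg = straight ahead, > 90 deg = steer right, < 90 deg = steer left.
    return 90.0 + np.degrees(np.arctan2(dx, dy))
```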

    3.2 Steering Angle Estimation Using CNN

    3.2.1 Data Description

When working with deep learning networks, data preparation is needed. The RPi captures images and data as the user drives the vehicle around the track at 4-5 km/h. Approximately 30,000 images paired with steering angles were collected (Fig. 11). The initial image resolution is 100×100 pixels.

To avoid the blur caused by vibration as the vehicle drives on the track, the Pi camera is set to capture at 10 fps with an exposure time of 5000 µs. Fig. 12 provides several examples of images from this dataset.

    3.2.2 Data Augmentations


Because there are too few examples to train on, deep learning models overfit the limited dataset, resulting in a model with poor generalization performance. Data augmentation manipulates the incoming training data to produce more training instances by randomly transforming existing examples into new ones. This approach expands the training set, reducing overfitting. Horizontal flips, brightness changes, random shadows, and height and width shifts are all common transformations. Furthermore, data augmentation is applied only to the training data, not to the validation or test data [27].
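A minimal sketch of some of these augmentations (horizontal flip, brightness change, and small shifts) using OpenCV/NumPy; the random-shadow transform is omitted, the parameter ranges are illustrative, and the steering angle is mirrored around 90° when the image is flipped.

```python
import cv2
import numpy as np

def augment(image, angle):
    """Randomly transform one (image, steering angle) training pair."""
    # Horizontal flip: mirror the image and the angle around 90 degrees.
    if np.random.rand() < 0.5:
        image = cv2.flip(image, 1)
        angle = 180.0 - angle
    # Random brightness change in HSV space.
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[..., 2] *= np.random.uniform(0.6, 1.4)
    image = cv2.cvtColor(np.clip(hsv, 0, 255).astype(np.uint8),
                         cv2.COLOR_HSV2BGR)
    # Small random height/width shift (translation).
    tx, ty = np.random.randint(-5, 6, size=2)
    M = np.float32([[1, 0, tx], [0, 1, ty]])
    image = cv2.warpAffine(image, M, (image.shape[1], image.shape[0]))
    return image, angle
```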

    3.2.3 Training Process

The dataset was divided into three subsets: training (60%), validation (20%), and test (20%). The training set contains 36,000 augmented frames, and the validation set contains 6000 frames. After the model was trained on a desktop computer, it was copied back to the RPi. The vehicle's main controller then feeds each picture frame from the Pi camera to the model as input. Each control cycle can process 10 images per second, and the vehicle's top speed around curves is 5-6 km/h.

Figure 11: Collected dataset

Figure 12: Samples from the driving dataset
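A sketch of the 60/20/20 split described above, assuming the frames and angles are held in NumPy arrays and using scikit-learn's train_test_split; the placeholder arrays merely stand in for the collected dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Placeholders standing in for the collected dataset of 100x100x3 frames
# paired with steering angles (the real dataset has ~30,000 frames).
images = np.zeros((300, 100, 100, 3), dtype=np.uint8)
angles = np.zeros((300,), dtype=np.float32)

# 60% training, then split the remaining 40% into 20% validation / 20% test.
x_train, x_rest, y_train, y_rest = train_test_split(
    images, angles, test_size=0.4, random_state=42)
x_val, x_test, y_val, y_test = train_test_split(
    x_rest, y_rest, test_size=0.5, random_state=42)
```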

    3.2.4 Convolutional Neural Networks

CNNs are a powerful artificial neural network technique. They are multilayer neural networks designed specifically for 2D input, such as video and pictures, and have become widely used in computer vision. These networks preserve the problem's spatial structure, make effective use of structural information and patterns in an image, and were designed for object identification and classification tasks [28].

CNNs get their name from the convolutional layers that make up their structure. The primary task of convolutional layers is to detect specific local characteristics in each area of the input image. A convolutional layer is made up of a number of separate filters. Each filter is slid over the entire image, and the dot product of the filter and patches of the input image is calculated. Each filter is independently convolved with the image to produce a feature map. A feature map can be used for a variety of purposes, including reducing the size of an image while keeping its semantic content.

The network architecture comprises 18 layers, including 6 convolutional layers and 4 fully connected ones. The input image is 100×100×3 (height×width×depth format). The convolutional layers perform feature extraction and were chosen empirically through a series of experiments with varied layer configurations. The convolutional layers use a 3×3 kernel and a stride of 2×2. The respective depths of the layers are 32, 32, 64, 64, 128, and 128, to push the network deeper.

The max-pooling layer is a powerful tool used in CNNs. It resizes large feature maps while keeping the most important information in them. All max-pooling layers use a 2×2 kernel without striding. After the convolutional layers, the output is flattened, followed by a succession of fully connected layers of decreasing size: 512, 256, 64, and 1.

Batch normalization (BN) is a layer that allows the network's layers to learn more independently. It normalizes the output of the preceding layers and can act as regularization to prevent the model from overfitting. Placing the BN after the rectified linear unit (ReLU) yields slightly higher accuracy and lower loss.

All hidden layers are equipped with ReLU activations to improve convergence. From the resulting feature vector, a softmax is applied to calculate the steering wheel angle probability. The entire network has approximately 2,813,921 parameters (Tab. 1) and offers excellent training performance on modest hardware. Additionally, to make the architecture more robust and avoid overfitting, we use batch normalization and dropout [28] at a rate of 0.5 on the last two fully connected layers, which gather most of the parameters.
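A hedged Keras sketch in the spirit of the architecture described above: six 3×3 convolutions of depth 32, 32, 64, 64, 128, and 128 with stride 2, batch normalization after ReLU, 2×2 max pooling, and fully connected layers of 512, 256, 64, and 1 with dropout 0.5 on the two widest dense layers. The exact layer ordering of the authors' Table 1 is not reproduced here, the output is treated as a single regression value, and the parameter count will not necessarily match the reported ~2.8 M.

```python
from tensorflow.keras import layers, models

def build_steering_model(input_shape=(100, 100, 3)):
    """Illustrative lane-keeping CNN for steering-angle prediction."""
    model = models.Sequential([layers.Input(shape=input_shape)])
    for depth in (32, 32, 64, 64, 128, 128):
        model.add(layers.Conv2D(depth, 3, strides=2, padding='same',
                                activation='relu'))
        model.add(layers.BatchNormalization())          # BN after ReLU
        model.add(layers.MaxPooling2D(pool_size=2, strides=1,
                                      padding='same'))  # non-strided pooling
    model.add(layers.Flatten())
    model.add(layers.Dense(512, activation='relu'))
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(256, activation='relu'))
    model.add(layers.Dropout(0.5))
    model.add(layers.Dense(64, activation='relu'))
    model.add(layers.Dense(1))          # predicted steering angle
    return model

model = build_steering_model()
model.compile(optimizer='adam', loss='mse')
```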

Table 1: The convolutional neural network structure


    3.3 Proportional-Integral-Derivative Control

The PID constants were tuned separately in this cascaded PID scheme. A PID controller adjusts its output to keep the measured variable close to the reference set-point. Three parameters can be tweaked to fine-tune this process (Kp, Ki, and Kd). Here is a continuous representation of the well-known PID controller equation.
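In standard form, with e(t) the error and u(t) the control signal,

$$u(t) = K_p\,e(t) + K_i \int_{0}^{t} e(\tau)\,d\tau + K_d\,\frac{de(t)}{dt}$$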

Kp, Ki, and Kd represent the proportional, integral, and derivative gain constants, respectively.

To implement the controller equation in digital form, the backward (reverse) Euler method for numerical integration is employed.
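Under this approximation, the discrete-time control law becomes

$$u(kT) = K_p\,e(kT) + K_i\,T \sum_{j=0}^{k} e(jT) + K_d\,\frac{e(kT) - e((k-1)T)}{T}$$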

u(kT) and e(kT) denote the discrete-time control and error signals, where T is the sampling period.

The PID regulates both the left and right motor speeds based on the measured error. The PID controller generates a control signal that establishes the left and right speeds of the robot wheels (PID value), since the robot is a differential drive system.

The right and left speeds are applied as PWM duty cycles at the input pins of the motor driver IC.

The Ziegler-Nichols tuning method [29] was used to obtain the PID parameters (Kp, Ki, and Kd). To begin, the system's Ki and Kd were set to 0. Kp was then raised from 0 to the ultimate gain Ku, at which the robot oscillated constantly (Tab. 2).

Table 2: Proportional-integral-derivative control tuning parameters
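As a sketch of the classical Ziegler-Nichols rules referred to above: once the ultimate gain Ku and the oscillation period Tu are observed, the gains follow from the standard formulas (the actual tuned values are those reported in Tab. 2).

```python
def ziegler_nichols_pid(ku, tu):
    """Classical Ziegler-Nichols PID rules from the ultimate gain Ku
    and the ultimate oscillation period Tu."""
    kp = 0.6 * ku
    ki = 2.0 * kp / tu        # equivalent to Ti = Tu / 2
    kd = kp * tu / 8.0        # equivalent to Td = Tu / 8
    return kp, ki, kd
```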

Because of the variations in line curvature, a traditional PID controller was found to be inadequate for the line-follower robot after extensive testing. To fix this problem, only the last three error values are summed rather than aggregating all preceding values. The updated controller equation is as follows:
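A plausible reconstruction, with the integral term restricted to the three most recent error samples, is

$$u(kT) = K_p\,e(kT) + K_i\,T \sum_{j=k-2}^{k} e(jT) + K_d\,\frac{e(kT) - e((k-1)T)}{T}$$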

    This technique provides satisfactory performance for lane detection.
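A minimal Python sketch of this windowed-integral PID; the three-sample window follows the description above, while the gains and sampling period shown are purely illustrative.

```python
from collections import deque

class WindowedPID:
    """PID controller whose integral term sums only the last 3 errors."""
    def __init__(self, kp, ki, kd, dt, window=3):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.errors = deque(maxlen=window)   # keeps only the newest samples
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.errors.append(error)
        integral = sum(self.errors) * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * integral + self.kd * derivative

# Example: steering correction toward the straight-ahead set-point of 90 deg.
pid = WindowedPID(kp=1.0, ki=0.1, kd=0.05, dt=0.1)
correction = pid.update(setpoint=90.0, measurement=85.0)
```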

    4 Conclusions

We presented a vision-based navigation autonomous robot in which a camera provides the image input and a CNN model makes the corresponding decision. To provide smooth line tracking, we designed deep learning based cascaded PID controllers. The results indicate that the suggested approach is well suited to autonomous robots, as they can operate with imprecise information. More advanced deep learning based controllers could be developed in the future.

Acknowledgement: The authors acknowledge the support of King Abdulaziz City for Science and Technology.

Funding Statement: The authors received no specific funding for this study.

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding this study.
