
    Computer Vision-Control-Based CNN-PID for Mobile Robot

2021-12-14
    Computers, Materials & Continua, 2021, Issue 7

Rihem Farkh, Mohammad Tabrez Quasim, Khaled Aljaloud, Saad Alhuwaimel and Shams Tabrez Siddiqui

1College of Engineering, Muzahimiyah Branch, King Saud University, P.O. Box 2454, Riyadh 11451, Saudi Arabia

    2College of Computing and Information Technology, University of Bisha, Bisha 67714, Saudi Arabia

    3King Abdulaziz City for Science and Technology, Saudi Arabia

    4Department of Computer Science, College of Computer Science and Information Technology, Jazan University, Jazan, Saudi Arabia

    5Department of Electrical Engineering, Laboratory for Analysis, Conception and Control of Systems, LR-11-ES20, National Engineering School of Tunis, Tunis El Manar University, Tunis, Tunisia

Abstract: With the development of artificial intelligence technology, various sectors of industry have developed. Among them, the autonomous vehicle industry has developed considerably, and research on self-driving control systems using artificial intelligence has been extensively conducted. Studies on the use of image-based deep learning to monitor autonomous driving systems have recently been performed. In this paper, we propose an advanced controller for a serving robot. A serving robot acts as an autonomous line-follower vehicle that can detect and follow a line drawn on the floor and move in specified directions. The robot should be able to follow the trajectory with speed control. Two controllers were used simultaneously to achieve this: a convolutional neural network (CNN) for target tracking and trajectory prediction, and a proportional-integral-derivative (PID) controller for automatic steering and speed control. This study uses a Raspberry Pi, which is responsible for controlling the robot car and performing inference with the CNN based on its current image input.

Keywords: Autonomous car; PID control; deep learning; convolutional neural network; differential drive system; Raspberry Pi

    1 Introduction

From the automobile to pharmaceutical industries, traditional robotic manipulators have been a popular manufacturing method for at least two decades. Additionally, scientists have created many forms of robots other than traditional manipulators to expand the use of robotics. These new types of robots have more freedom of movement, and they can be classified into two groups: redundant manipulators and mobile (ground, marine, and aerial) robots. Engineers have had to invent techniques to allow robots to deal automatically with a range of constraints. Robots equipped with these methods are called autonomous robots [1].

Mobile robots can move from one location to another to perform desired tasks that can be complex [2]. A mobile robot is a machine controlled by software and integrated sensors, including infrared, ultrasonic, webcam, GPS, and magnetic sensors. Wheels and DC motors are used to drive robots through space [3]. Mobile robots are often used for agricultural, industrial, military, firefighting, and search and rescue applications (Fig. 1), allowing humans to accomplish complicated tasks [4].

    Figure 1: Autonomous robots

Line-follower robots can be used in many industrial logistics applications, such as transporting heavy and dangerous materials, the agriculture sector, and library inventory management systems (Fig. 2). These robots are also capable of monitoring patients in hospitals and warning doctors of dangerous symptoms [5].

    Figure 2: Line-follower serving robots

A substantial number of researchers have focused on smart-vehicle navigation because traditional tracking techniques are limited by the environmental instability under which vehicles move. Intelligent control mechanisms such as neural networks are therefore needed; they provide an effective solution to the problem of vehicle navigation because of their ability to learn the non-linear relationship between input and sensor variables. A combination of computer vision techniques and machine learning algorithms is necessary for autonomous robots to develop "true consciousness" [6]. Several attempts have been made to improve low-cost autonomous cars using different neural-network configurations, including the use of a convolutional neural network (CNN) for self-driving vehicles [7]. A collision prediction system was constructed using a CNN, and a method was proposed for stopping a robot in the vicinity of the target point while avoiding a moving obstacle [8]. A CNN has also been proposed for keeping an autonomous driving control system in the same lane [9], and a multilayer perceptron network has been used for mobile-robot motion planning [10]; it was implemented on a PC with an Intel Pentium 350 MHz processor. Additionally, the problem of navigating a mobile robot has been solved using a local neural-network model [11].

A variety of motion control methods have been proposed for autonomous robots: proportional-integral-derivative (PID) control, fuzzy control, neural network control, and combinations of these control algorithms [12]. The PID control algorithm is used by most motion control applications, and PID control methods have been extended with deep learning techniques to achieve better performance and higher adaptability. Highly dynamic, high-end robots with reasonably high accuracy of movement almost always require these control algorithms for operation. For example, a fuzzy PID controller for the electric drives of a differential-drive autonomous mobile robot trajectory application has been developed [13]. Additionally, a PID controller for a laser-sensor-based mobile robot has been designed to detect and avoid obstacles [14].

In this paper, a low-cost implementation of a PID controller combined with a CNN is realized on the Raspberry Pi platform for the smooth control of an autonomous line-follower robot.

    2 Autonomous Line-Follower Robot Architecture

In this section, the architecture and system blocks are described. First, a suitable configuration was selected to develop a line-follower robot using a Pi camera connected through a Raspberry Pi 3 B+ to the motor driver IC. This configuration is illustrated in the block diagram shown in Fig. 3.

    Figure 3: Block diagram of the line-follower robot

Implementing the system on the Raspberry Pi ensured the following:

• moving the robot in the desired direction;

    • collecting data from the Pi camera and feeding it into a CNN;

    • predicting the error between the path line and the robot position;

    • determining the speed of the left and right motors using a PID controller and the predicted error;

    • operating the line-follower robot by controlling the four DC motors; and

    • avoiding obstacles using an ultrasonic sensor.

    2.1 Mobile-Robot Construction

The proposed mobile-robot design can easily be adapted to new and future research studies. The physical appearance of the robot was evaluated, and its design was based on several criteria, including functionality, material availability, and mobility. During the analysis of different guided robots of reduced size and simple structure (as shown in Fig. 4), the authors' experience with mechanical structures for robots was also considered.

    Figure 4: Line-follower robot prototype

    Seven types of parts were used in the construction of the robot:

    (1) Four wheels,

    (2) Four DC motors,

    (3) Two base structures,

    (4) A control board formed using the Raspberry Pi3 B+ board,

(5) An L298N IC circuit for DC-motor driving,

    (6) An expansion board, and

    (7) An ultrasonic sensor HC-SR04.

    2.2 Raspberry Pi Camera

The Raspberry Pi camera module is a lightweight, compact camera that supports the Raspberry Pi (Fig. 5). It communicates with the Raspberry Pi using the MIPI camera serial interface protocol and is commonly used in machine learning, image processing, and security projects. The camera module is connected to the CSI port of the Raspberry Pi. Once the Raspberry Pi has booted to the desktop, typing the command raspistill -o image.jpg in a terminal and pressing Enter stores the image as image.jpg, which can be viewed in the GUI and confirms that the camera is running. A Python script is used to control the Raspberry Pi camera and capture images [15].

    Figure 5: The Raspberry Pi camera

    2.3 Obstacle Detector

The HC-SR04 ultrasonic sensor uses sonar to determine the distance to an object (Fig. 6). It offers excellent non-contact range detection with high accuracy and stable readings in an easy-to-use package. It comes complete with ultrasonic transmitter and receiver modules [16]. Some of its features and specifications are listed below:

    Figure 6: Ultrasonic sensor HC-SR04

• Power supply: +5 V DC

    • Quiescent current: <2 mA

    • Working current: 15 mA

    • Effectual angle: <15°

    • Ranging distance: 2–400 cm / 1 in–13 ft

    • Resolution: 0.3 cm

    • Measuring angle: 30°

    • Trigger input pulse width: 10 µs

    • Dimensions: 45 mm × 20 mm × 15 mm
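The sensor reports distance indirectly: its echo pin stays high for the duration of the ultrasonic round trip, and the distance follows from the speed of sound. The conversion can be sketched as follows (the 343 m/s figure assumes air at roughly 20 °C):

```python
def pulse_to_distance_cm(pulse_s, speed_of_sound=34300.0):
    """Convert an HC-SR04 echo pulse width (seconds) to distance (cm).

    The pulse covers the round trip to the object and back, so the
    one-way distance is half the travelled path. speed_of_sound is in
    cm/s (343 m/s at about 20 degrees Celsius).
    """
    return pulse_s * speed_of_sound / 2.0
```

For example, a 1 ms echo pulse corresponds to 17.15 cm, and the 400 cm maximum range corresponds to a pulse of roughly 23 ms.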

    2.4 Raspberry Pi 3B+

The Raspberry Pi is an inexpensive, credit-card-sized single-board computer developed in the United Kingdom by an educational charity, the Raspberry Pi Foundation (Fig. 7). The Raspberry Pi 3 Model B is the third-generation Raspberry Pi minicomputer, with a 64-bit 1.2 GHz quad-core processor, 1 GB of RAM, and Wi-Fi and Bluetooth 4.1 controllers. It also has 4× USB 2.0 ports, 10/100 Ethernet, 40 GPIO pins, a full-size HDMI 1.3a port, a camera interface (CSI), a combined 3.5 mm analog audio and composite video jack, a display interface (DSI), a MicroSD slot, and a VideoCore IV multimedia/3D graphics core at 400 MHz/300 MHz [17].

The GPIO27 (physical pin 13) and GPIO22 (physical pin 15) are connected to IN1 and IN2 of the L298N module, respectively, to drive the left motors. The GPIO20 (physical pin 38) and GPIO21 (physical pin 40) are connected to IN3 and IN4 of the L298N module, respectively, to drive the right motors.

    Figure 7:Raspberry Pi 3B+ board

    2.5 L298N Motor Driver

The L298N motor driver (Fig. 8) consists of two complete H-bridge circuits, so it can drive a pair of DC motors. This feature makes it ideal for robotic applications, because most robots have either two or four powered wheels operating at a voltage between 5 and 35 V DC with a peak current of up to 2 A. The module incorporates two screw-terminal blocks for motors A and B and another screw-terminal block for the ground pin, the VCC for the motor, and a 5-V pin, which can be either an input or an output. The pin assignments for the L298N dual H-bridge module are shown in Tab. 1. The IN1 and IN2 pins on the L298N board are switched between HIGH and LOW to control the direction, and the controller's output PWM signal is sent to ENA or ENB to control the speed. Forward and reverse speed control of the motor is achieved using this PWM signal [18], which is sent to the Enable pin of the L298N board to drive the motor.

Figure 8: L298N IC motor driver

Table 1: Errors and line positions in the camera frame

    3 Convolutional-Neural-Network-Based Proportional Integral Derivative Controller Design for Robot Motion Control

A CNN-based PID controller was designed to control the robot. An error value of zero refers to the robot being precisely in the middle of the image frame. An error is assigned for each line position in the camera-frame image, as explained in Tab. 1.

A positive error value means that the robot has deviated to the left, and a negative error value means that the robot has deviated to the right. The error has a maximum magnitude of 4 in either direction, corresponding to the maximum deviation.
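As an illustration of how such an error class could be derived from the image, the following sketch maps the detected line centroid's horizontal pixel position to one of the nine error classes. The 640-pixel frame width and the sign convention are assumptions for illustration, not taken from the paper:

```python
def position_to_error(cx, frame_width=640):
    """Map a line-centroid x coordinate to an error class in -4..+4.

    The frame is divided into nine equal vertical bands; the centre
    band maps to 0. The sign convention here is illustrative only.
    """
    band = int(cx * 9 / frame_width)  # 0..8 for cx in [0, frame_width)
    band = max(0, min(8, band))       # guard the right edge
    return band - 4
```

A centroid at the middle of the frame yields error 0; centroids at the far left and far right yield the extreme classes.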

Tab. 2 presents the assignment of errors to real images taken with the Raspberry Pi camera.

Table 2: Errors and real line positions in the Raspberry Pi camera frame

The primary advantage of this approach is that the error extracted from the Raspberry Pi camera image can be fed to the controller to compute the motor speeds such that the error term becomes zero.

The estimated error is used by the PID controller to generate a new output value (control signal). This output value determines the left and right speeds of the two motors on the line-follower robot.

Fig. 9 presents the control algorithm used for the line-follower robot.

    Figure 9: Proportional-integral-derivative and convolutional-neural-network deep-learning control

    3.1 Proportional-Integral-Derivative Control

A PID controller aims to keep an input variable close to a desired set-point by adjusting an output. The controller can be "tuned" by adjusting three parameters: Kp, Ki, and Kd. The well-known PID controller equation is shown here in continuous form:

    u(t) = Kp e(t) + Ki ∫ e(τ) dτ + Kd de(t)/dt

    where Kp, Ki, and Kd refer to the proportional, integral, and derivative gain constants, respectively.

For implementation in discrete form, the controller equation is modified using the backward Euler method for numerical integration:

    u(kT) = Kp e(kT) + Ki T Σ_{i=0}^{k} e(iT) + Kd [e(kT) − e((k−1)T)] / T

    where u(kT) and e(kT) are the control signal and error signal in discrete time, and T is the sampling time.
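A minimal implementation of this discrete-time controller might look as follows; the gains and sampling time are placeholders, not the paper's tuned values:

```python
class DiscretePID:
    """PID controller discretised with the backward Euler method:
    u(kT) = Kp*e(kT) + Ki*T*sum(e) + Kd*(e(kT) - e((k-1)T))/T.
    A sketch matching the equation above; gains are placeholders.
    """

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0     # running sum of e(iT)*T
        self.prev_error = 0.0   # e((k-1)T) for the derivative term

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

Calling update once per sampling period with the latest error yields the control signal u(kT).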

The PID controls both the left and right motor speeds according to the predicted error. The PID controller generates a control signal (PID value), which is used to determine the left and right speeds of the robot's wheels.

It is a differential drive system: a left turn is achieved by reducing the speed of the left motor, and a right turn is achieved by reducing the speed of the right motor.

    Right Speed = Base speed − PID value
    Left Speed = Base speed + PID value

The right and left speeds are used to control the duty cycle of the PWM signal applied at the input pins of the motor driver IC.
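The speed mixing and duty-cycle clamping described above can be sketched as below; the 0–100 duty-cycle range is an assumption:

```python
def wheel_speeds(base_speed, pid_value, max_duty=100.0):
    """Differential-drive speed mixing as described in the text.

    A positive pid_value speeds up the left wheel and slows the right
    one. Duty cycles are clamped to the PWM range; the 0-100 range is
    an assumption, not taken from the paper.
    """
    def clamp(v):
        return float(max(0.0, min(max_duty, v)))

    right = clamp(base_speed - pid_value)
    left = clamp(base_speed + pid_value)
    return left, right
```

For example, a base speed of 50 with a PID value of 10 gives a left duty cycle of 60 and a right duty cycle of 40.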

The PID constants (Kp, Ki, Kd) were obtained using the Ziegler–Nichols tuning method [19]. To start, Ki and Kd were set to 0. Then, Kp was increased from 0 until it reached the ultimate gain Ku, at which point the robot oscillated continuously. Ku and Tu (the oscillation period) were then used to tune the PID (see Tab. 3).
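The classic Ziegler–Nichols rule derives the three gains from Ku and Tu. A sketch follows; the paper's actual tuned values are listed in Tab. 3 and are not reproduced here:

```python
def ziegler_nichols_pid(ku, tu):
    """Classic Ziegler-Nichols PID tuning from the ultimate gain Ku
    and the oscillation period Tu.
    """
    kp = 0.6 * ku          # proportional gain
    ki = 1.2 * ku / tu     # integral gain, equal to 2*Kp/Tu
    kd = 0.075 * ku * tu   # derivative gain, equal to Kp*Tu/8
    return kp, ki, kd
```

For instance, an ultimate gain of 10 with a 2 s oscillation period gives Kp = 6, Ki = 6, Kd = 1.5.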

Table 3: Proportional-integral-derivative control tuning parameters

Substantial testing showed that a classical PID controller was unsuitable for the line-follower robot because of changes in line curvature. To solve this problem, only the last three error values are summed instead of all previous values. The modified controller equation is given as

    u(kT) = Kp e(kT) + Ki T Σ_{i=k−2}^{k} e(iT) + Kd [e(kT) − e((k−1)T)] / T

    This technique provides satisfactory performance for path tracking.
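This modification can be sketched by keeping only a short window of recent errors for the integral term. The following is an illustrative implementation, not the authors' exact code:

```python
from collections import deque


class WindowedPID:
    """PID variant whose integral term sums only the last few errors,
    as described in the text (window of three by default).
    """

    def __init__(self, kp, ki, kd, dt, window=3):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.errors = deque(maxlen=window)  # keeps only recent errors
        self.prev_error = 0.0

    def update(self, error):
        self.errors.append(error)
        integral = sum(self.errors) * self.dt   # windowed integral
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * integral
                + self.kd * derivative)
```

Because the deque discards old errors automatically, the integral term stops accumulating once the window is full, which limits wind-up on long curved segments.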

    3.2 LeNet Convolutional Neural Networks

CNNs are a powerful artificial neural network technique. They are multilayer neural networks built especially for 2D data, such as video and images, and they have become ubiquitous in computer vision. These networks maintain the spatial structure of the problem, make efficient use of patterns and structural information in an image, and have been developed for object recognition tasks [20].

CNNs are so named because of the convolutional layers in their construction. The primary task of the convolutional layers is the detection of local features at every location of the input image. A convolution layer comprises a set of independent filters. Each filter is slid over the complete image, and the dot product is taken between the filter and chunks of the input image; each filter is independently convolved with the image to produce a feature map. Deriving feature maps has several benefits, one of which is reducing the size of the image while preserving its semantic information.

In this paper, we propose using the LeNet CNN (Fig. 10) to take advantage of its ability to classify small single-channel (black and white) images. LeNet was proposed in 1989 by LeCun et al. [21] and was later widely used in the recognition of handwritten characters. It combined convolutional layers with training by the backpropagation algorithm. LeNet has the essential basic units of a CNN: a convolutional layer, a pooling layer, and a fully connected layer. The convolution kernels of the convolutional layers are all 5×5, and the activation function is the Sigmoid function.

    Figure 10:The LeNet convolutional neural network structure

The LeNet CNN comprises five layers configured to process single-channel grayscale images of size 28×28:

• Layer C1: Convolution layer (number of kernels = 4, kernel_size = 5 × 5, padding = 0, stride = 1)

    • Layer S2: Average pooling layer (kernel_size = 2 × 2, padding = 0, stride = 2)

    • Layer C3: Convolution layer (number of kernels = 12, kernel_size = 5 × 5, padding = 0, stride = 1)

    • Layer S4: Average pooling layer (kernel_size = 2 × 2, padding = 0, stride = 2)

    • Layer F5: Fully connected layer (out_features = 10)

The input to the LeNet CNN is a 28×28 grayscale image, which passes through the first convolutional layer with four feature maps (filters) of size 5×5 and a stride of one. The image dimensions change from 28×28×1 to 24×24×4. LeNet then applies an average pooling (sub-sampling) layer with a filter size of 2×2 and a stride of two, reducing the image dimensions to 12×12×4. Next, a second convolutional layer with 12 feature maps, again using 5×5 kernels with a stride of one, produces an output of 8×8×12; in this layer, only 8 of the 12 feature maps are connected to the four feature maps of the previous layer. The fourth layer (S4) is again an average pooling layer with a filter size of 2×2 and a stride of two. It is the same as the second layer (S2) except that it has 12 feature maps, so the output is reduced to 4×4×12. The fifth layer (F5) is a fully connected softmax layer with 10 outputs, corresponding to the digits 0 to 9.
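The layer dimensions above follow from the standard convolution output formula. The following sketch traces the spatial sizes through the five layers to confirm them:

```python
def conv_out(size, kernel, stride=1, padding=0):
    """Output spatial size of a conv/pool layer:
    floor((size + 2*padding - kernel) / stride) + 1.
    """
    return (size + 2 * padding - kernel) // stride + 1


# Trace the 28x28 input through the layers listed above.
c1 = conv_out(28, 5)            # C1: 5x5 conv, stride 1  -> 24
s2 = conv_out(c1, 2, stride=2)  # S2: 2x2 avg pool        -> 12
c3 = conv_out(s2, 5)            # C3: 5x5 conv, stride 1  -> 8
s4 = conv_out(c3, 2, stride=2)  # S4: 2x2 avg pool        -> 4
```

The trace reproduces the 24×24, 12×12, 8×8, and 4×4 spatial sizes given in the text.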

    4 Implementation on Raspberry Pi

A substantial amount of data is required to create a neural network model. These data are gathered during the training process. Initially, the robot is operated wirelessly using VNC Viewer, which allows the Raspberry Pi to be monitored over Wi-Fi. As the robot is driven along the path line, it collects image data through the Raspberry Pi camera. These data are used to train the LeNet neural network model.

The script for the CNN algorithm was developed in Python [22]. This code was transferred to the Raspberry Pi via a virtual environment provided by the Python language, composed of the interpreter, libraries, and scripts. Each project implemented on either the computer or the Raspberry Pi depends on its own Python virtual environment without affecting the dependencies of other projects. The dependencies in the Raspberry Pi virtual environment for running the training and test applications include Python 3, OpenCV, NumPy, Keras, and TensorFlow.

To capture a set of images at an interval of 0.5 s, we used the code listed below:

from picamera.array import PiRGBArray
    from picamera import PiCamera
    import time
    import cv2

    camera = PiCamera()
    camera.resolution = (640, 480)
    camera.framerate = 32
    rawCapture = PiRGBArray(camera, size=(640, 480))
    time.sleep(0.1)  # let the camera warm up

    start = 1
    for frame in camera.capture_continuous(rawCapture, format="bgr", use_video_port=True):
        image = frame.array
        cv2.imshow("Frame", image)
        key = cv2.waitKey(1) & 0xFF
        cv2.imwrite(str(start) + ".jpg", image)
        start = start + 1
        rawCapture.truncate(0)  # clear the stream for the next frame
        if key == ord("q"):
            break
        time.sleep(0.5)  # capture interval of 0.5 s

Once images were captured for many positions of the robot on the track, they were placed in different folders. We trained our LeNet model on the captured images using the following code, after first importing the necessary libraries, such as NumPy, cv2, and Matplotlib.

from keras.models import Sequential
    from keras.layers import Conv2D, MaxPooling2D, Activation, Flatten, Dense

    class LeNet:
        @staticmethod
        def build(width, height, depth, classes):
            model = Sequential()
            inputShape = (height, width, depth)
            model.add(Conv2D(20, (5, 5), padding="same", input_shape=inputShape))
            model.add(Activation("relu"))
            model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
            model.add(Conv2D(50, (5, 5), padding="same"))
            model.add(Activation("relu"))
            model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
            model.add(Flatten())
            model.add(Dense(500))
            model.add(Activation("relu"))
            model.add(Dense(classes))
            model.add(Activation("softmax"))
            return model

The path of each image in the subfolders is registered, stored in the imagePath list, and shuffled. Shuffling provides data randomly from each class when training the model. Each image is then read, resized, converted into a NumPy array, stored in data, and given a label.

The folder name is extracted from the image path of each image. A label of 0 is assigned if the folder name is "error=0." Similarly, labels of −4, −3, −2, −1, 1, 2, 3, and 4 are assigned to images in the folders "error=−4," "error=−3," "error=−2," "error=−1," "error=+1," "error=+2," "error=+3," and "error=+4," respectively.

label = imagePath.split(os.path.sep)[-2]
    print(label)
    if label == 'error=-4':
        label = -4
    elif label == 'error=-3':
        label = -3
    elif label == 'error=-2':
        label = -2
    elif label == 'error=-1':
        label = -1
    elif label == 'error=0':
        label = 0
    elif label == 'error=+1':
        label = +1
    elif label == 'error=+2':
        label = +2
    elif label == 'error=+3':
        label = +3
    else:
        label = +4
    labels.append(label)

The images are normalized so that each pixel value lies between 0 and 1. The images are then separated into a training set and a test set to validate the performance of the model. Next, the LeNet class is used to construct the model using the training data.
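A minimal NumPy-only sketch of this normalization and split step might look as follows; the 25% test fraction and the fixed seed are assumptions, not the paper's values:

```python
import numpy as np


def normalize_and_split(data, labels, test_frac=0.25, seed=42):
    """Scale pixel values to [0, 1] and split into train/test sets.

    The 8-bit pixel range (0-255), the test fraction, and the fixed
    seed are illustrative assumptions.
    """
    data = np.asarray(data, dtype="float32") / 255.0
    labels = np.asarray(labels)
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(data))          # shuffled indices
    n_test = int(len(data) * test_frac)
    test_idx, train_idx = idx[:n_test], idx[n_test:]
    return data[train_idx], labels[train_idx], data[test_idx], labels[test_idx]
```

The shuffled index split keeps each image aligned with its label while producing disjoint training and test sets.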

Once the model is trained, it can be deployed on the Raspberry Pi. The code below was used to control the robot based on the prediction from the LeNet CNN and the PID controller. A PID controller continuously calculates an error and applies a corrective action to resolve it. In this case, the error is the position prediction delivered by the CNN, and the corrective action is changing the power to the motors. We defined a function PID_control_robot that controls the direction of the robot based on the model's prediction.

def PID_control_robot(image):
        error_prediction = np.argmax(model.predict(image))
        pid_adjustment = PID(error_prediction)
        right_speed = Base_speed - pid_adjustment
        left_speed = Base_speed + pid_adjustment

    5 Conclusions

We presented a vision-based self-driving robot in which an image input is taken through the camera and a CNN model is used to make decisions accordingly. We developed a combined PID and CNN deep-learning controller to guarantee smooth line tracking. The results indicated that the proposed method is well suited to mobile robots, as it is capable of operating with imprecise information. More advanced CNN-based controllers could be developed in the future.

Acknowledgement: The authors acknowledge the support of King Abdulaziz City for Science and Technology.

Funding Statement: The authors received no specific funding for this study.

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
