
    Automated Patient Discomfort Detection Using Deep Learning

    Computers, Materials & Continua, 2022, Issue 5

    Imran Ahmed, Iqbal Khan, Misbah Ahmad, Awais Adnan and Hanan Aljuaid

    1 Center of Excellence in Information Technology, Institute of Management Sciences, Peshawar, Pakistan

    2 Computer Sciences Department, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University (PNU), Riyadh, Saudi Arabia

    Abstract: The Internet of Things (IoT) has transformed almost all fields of life, but its impact on the healthcare sector has been particularly notable. Various IoT-based sensors are used in the healthcare sector to offer quality and safe care to patients. This work presents a deep learning-based automated system in which patients' discomfort is detected non-invasively. To do this, an overhead-view patient data set has been recorded. For testing and evaluation purposes, we investigate the power of deep learning by choosing a Convolutional Neural Network (CNN) based model. The model uses confidence maps and detects 18 different key points at various locations on the patient's body. Applying association rules and part affinity fields, the detected key points are then grouped into six main body organs. Furthermore, the distance between subsequent key points is measured using coordinate information. Finally, distance- and time-based thresholds are used to classify movements as associated with discomfort or with normal conditions. The accuracy of the proposed system is assessed on various test sequences. The experimental outcomes reveal the worth of the proposed system by obtaining a True Positive Rate of 98% with a 2% False Positive Rate.

    Keywords: Artificial intelligence; patient monitoring; discomfort detection; deep learning

    1 Introduction

    The IoT has brought smart healthcare systems to the medical sector; such systems are generally comprised of smart sensors, a remote server, and a network. In smart healthcare, it has many applications, including early warning services (emergency, first aid, medical assessment), real-time supervision services (patient monitoring, elderly care), and scheduling and optimization services (medical staff allocation, bed allocation, resource allotment). Patient monitoring systems have been gaining the consideration of researchers in the fields of advanced computer vision and machine learning. It is one of the ongoing research fields because of its broad range of applications, including respiration monitoring, pain detection, depression monitoring, sleep monitoring, patient behavior monitoring, posture monitoring, epilepsy seizure detection, etc. Researchers have developed different systems for patient monitoring; some, for example, use specialized hardware, pressure mattresses, and sensors, but at the cost of additional expense. Similarly, connecting sensors to the body of the patient is undesirable from the patient's point of view. A few studies used signal-based approaches to observe breathing, depth rate, and steadiness of breath besides monitoring the breath time and ratio. Even though pain detection techniques exist, they mainly use facial expressions. The major drawback of such systems is that they require the patient to keep his or her face aligned directly with the camera. Sleep monitoring systems have been developed to detect sleep apnea and sleep disorders; such systems are also based on hardware and sensors installed in patients' beds. Some techniques monitor patient behavior, which helps to analyze their medical condition. However, such techniques are based on the installation of multiple cameras.

    Multiple-camera posture-based monitoring techniques have also been developed, e.g., mainly focusing on the upper body of the patient. Because of these limitations, a non-invasive discomfort detection system is proposed in this work, which utilizes neither specialized hardware/sensors, line-of-sight vision devices, nor any constrained/specialized environment. The introduced system is principally based on a ten-layer Convolutional Neural Network (CNN). A CNN is a class of deep learning model containing input and output layers as well as several hidden layers. The layers are fully connected, which helps to detect and recognize features and patterns. A pre-trained model is used to test/evaluate the patient's discomfort using our newly recorded data set. The CNN model's output is 18 key points detected at different locations on the patient's body using confidence maps. The detected key points are further utilized to form six major body organs. This formation is based on association rules and part affinity fields. The distance of each detected key point from the corresponding key point in successive frames is estimated. Distance- and time-based thresholds are then used to recognize discomfort in a specific organ of the patient's body. Finally, experimental evaluation is carried out using manually created ground truths. The work presented in this paper has the following main contributions:

    ·An automated system is introduced for detection of patient discomfort using a deep learning-based model.

    ·By utilizing the CNN architecture, confidence maps and 18 different key points are detected at various locations on the patient's body;

    ·The detected key points are then converted into six main body parts/organs based on association rules and part affinity fields, and the distance between subsequent key points is measured using coordinate information;

    ·Finally, distance- and time-based thresholds are utilized for the classification of movements as either discomfort or normal conditions.

    The proposed system has many possible applications, such as pain and discomfort analysis, monitoring, and detection; automatic patient monitoring in hospitals or at home; and elderly monitoring. The presented work is organized as follows: a review of the related work is presented in Section 2; the proposed system is introduced in Section 3; Section 4 explains the experimental results; and lastly, Section 5 concludes the presented work and provides future directions.

    2 Literature Review

    In recent years, automated patient monitoring has been gaining the interest of researchers. Different signal, image processing, and computer vision techniques have been developed over the last decade. Some of these techniques are discussed in this section, categorized as follows:

    2.1 Respiration Monitoring Approaches

    Respiration monitoring aims to observe the depth and steadiness of breath besides monitoring the inhalation and exhalation time and their ratio. Cho et al. [1] used a thermal image-based approach to respiration rate monitoring by specifying a region of interest under the nose. In [2], a radio frequency-based method is proposed, which helps to estimate the rate of respiration using a Multiple Signal Classification (MUSIC) algorithm. The authors in [3] presented a contactless breathing monitoring system using a single-camera approach. Ostadabbas et al. [4] proposed a respiration monitoring system for estimating airway resistance non-intrusively using depth data obtained from the Microsoft Kinect sensor. Fang et al. [5] proposed a system for detecting sudden infant death syndrome. Al-Khalidi et al. [6] used facial thermal images of children to monitor their respiration rate. Janssen et al. [7] use intrinsic respiratory features to find the region of interest for respiration and motion factorization to extract respiration signals. Braun et al. [8] divide the input images into blocks and then estimate the motion of each block; these block motions are then classified to find the respiration activity. Wiede et al. [9] introduce a method for remotely monitoring respiration rate using RGB images; this approach finds the region of interest and applies principal component analysis and frequency-finding methods to determine the respiration rate. Frigola et al. [10] produced a video-based non-intrusive technique for respiration monitoring, which detects movement applying optical flow and quantifies the detected movement. Monitoring a patient's respiration can provide insights and help diagnose many conditions such as lung problems and abnormal respiration rates.

    2.2 Pain Detection and Depression Monitoring Approaches

    In the literature, pain detection and depression monitoring have been handled mostly by analyzing facial expressions. The authors in [11] exploited facial appearance for pain detection by using a feature-based method similar to [12–16], i.e., the pyramid histogram of oriented gradients and the pyramid local binary pattern. They used these features to extract the shape and appearance of patients' faces, respectively. The authors in [17] used the Prkachin and Solomon Pain Intensity (PSPI) metric. Other approaches that consider facial emotions to detect pain and/or depression are proposed in [18–22]. In these approaches, facial muscle movements are categorized into different action units. The authors extract the face's canonical appearance using Active Appearance Models (AAMs), which is filtered to extract features. These features are then fed to different SVMs, each trained to measure a separate level of pain intensity. In [23], the authors suggested a system using AAMs to detect patients' pain in videos. In [24,25], the authors introduced a system that could discriminate facial expressions of pain from other facial expressions and applied an SVM to score pain severity. The system has been tested on the UNBC-McMaster database [26] using four different classifiers, namely SVM, Random Forest, and two neural networks. For assessment of the system, they applied the HI4D-ADSIP data set [27]. Nanni et al. [28] classify pain states by proposing a descriptor named Elongated Ternary Patterns (ELTP), which combines the features of the Elongated Binary Pattern (ELBP) [29] and Local Ternary Patterns (LTP).

    2.3 Sleep Monitoring Approaches

    Sleep monitoring encompasses recording and analyzing chest and abdomen movements, as is the case with respiration monitoring. In [30], Al-Naji et al. developed a system for detecting sleep apnea and monitoring respiration rate in children by using the Microsoft Kinect sensor. Li et al. [31] proposed a non-invasive system for monitoring cardiopulmonary signals in various sleeping positions; an infrared light source and an infrared-sensitive camera are used in this approach. Metsis et al. [32] proposed a sleep pattern monitoring system and investigated many factors corresponding to sleep disorders. Malakuti et al. [33] address the problem of sleep irregularities based on pressure data. Liao et al. [34] designed a system that measures sleep quality using infrared video; they used the motion history image technique [35] for analyzing videos to recognize the patterns of patients' movements. Nandakumar et al. [36] introduced a smartphone-based sleep apnea detection system, which analyzes chest and abdominal motion. Saad et al. [37] proposed a device for assessing sleep quality using several sensors in the room; the sensors are used for determining heart rate, temperature, and movement of the body. Hoque et al. [38] attach WISPs [39] to the bed's mattress to infer body positions and thereby monitor sleep; accelerometer data is used for movement detection.

    2.4 Behavior Monitoring Approaches

    Human behavior understanding also plays a vital role in learning much about people. Borges et al. [40] tried to recognize individual activities associated with psychiatric patients by utilizing blob detection and optical flow analysis, and applied decision rules to analyze patients' activities. The authors in [41] proposed a system based on monitoring patients' vital signs to prevent incidents such as falls, injuries, and pain. The system uses the Canny edge detector and the Hough transform for detecting beds. Once a bed is detected, the system determines whether or not a patient is present in the bed by detecting the patient's head. Martinez and Stiefelhagen [42] have applied multiple cameras for observing the behavior of patients in an ICU irrespective of the environmental conditions. By examining a patient's behavior, much information can be collected about his or her medical condition [43].

    2.5 Posture Monitoring Approaches

    Knowing a patient's posture proves helpful for purposes such as fall detection, pressure ulcer detection, and activity recognition. Chang et al. [44] introduced a system based on depth videos for preventing pressure ulcers in bedridden patients by investigating their movement and posture. In [45], the authors introduced a non-invasive patient posture monitoring method. This approach extracts HOG features for the classification of postures. The system also tracks the postures of the patient and generates a report accordingly. Wang et al. [46] introduced a monitoring system for recognizing a person's pose while covered with a blanket. In another approach, [47] proposed a system for determining the upper body parts of a human lying under a blanket utilizing an overhead camera [48–50]. Brulin et al. [51] suggested a technique for monitoring the elderly at home. The proposed method is based on posture recognition: it detects the individual's body and then applies posture identification methods to the human silhouette based on fuzzy logic.

    2.6 Epilepsy Monitoring Approaches

    Many attempts have been made towards vision-based detection and prediction of epileptic seizures. The authors in [52] proposed a method for eyeball detection; the main purpose is to track eye movements to determine the presence or absence of epileptic seizures. Lu et al. [53] used color videos and proposed a method for quantifying limb movement occurring in seizures associated with epilepsy. Cuppens et al. [54] apply the optical flow method to detect epileptic movement. Kalitzin et al. [55] used the optical flow method to find movements associated with epileptic seizures.

    All of the approaches discussed above focus either on a single patient and/or a single bed, and specialized hardware is used. Also, the intrusive approaches among these need sensors connected to the body or bed to record various measurements, which is both costly and unwanted from the patient's point of view. Even though pain detection approaches exist, they depend wholly on facial expressions, requiring the patient to keep his/her face directed towards the camera. On the other hand, the proposed system can work in existing ward setups and monitor more than one patient simultaneously, without advanced beds or specialized equipment other than a single camera. Being non-invasive, it makes no contact with the patient while recording their movements. Recently, scholars have also utilized deep learning-based methods [56–59] for patient discomfort monitoring [60]. In this work, we also use a deep learning-based method for automated patient discomfort detection.

    3 The Proposed Method

    In this section, a deep learning-based sustainable discomfort detection system is introduced. The flow chart presented in Fig. 1 highlights the main steps of the proposed method. The proposed method is mainly based on a Convolutional Neural Network (CNN) architecture [61]. Firstly, the input images of the patient from the IMS-PDD-II data set are passed to the pre-trained model, which detects key points at various locations on the body of a patient. Then, the information of the detected key points is applied for the formation of the patient's body organs using defined association rules. Finally, a distance threshold is applied to recognize discomfort or pain in the organs of a patient's body. The detailed explanation of the proposed method exhibited in Fig. 1 is described with the help of the following steps:

    ·The pre-trained model uses a non-parametric representation called part affinity fields. The part affinity fields contain the orientation and position information used to identify human body parts in the input image. The model employs the CNN architecture shown in Fig. 1 [62]. The input images from the data set are given to the pre-trained model. The trained model mainly has two branches: the top branch is used for predicting the confidence maps and detecting human body parts, while the bottom one is for predicting part affinity fields, which are used to link human body parts, as shown in Fig. 2. Each of the two branches is an iterative prediction architecture that refines the predictions over a number of successive stages.

    Figure 1:Flowchart of the CNN-based discomfort detection method

    Figure 2: Proposed model architecture. (a) shows the input image, (b) the CNN model, (c) the detected key points, and (d) the detected patient's body organs

    ·A set of feature maps, represented by F, is extracted for each input image using the CNN. F is used as the input features to the initial stages of both branches, as shown in Fig. 2. At these initial stages, the network generates a set of detection confidence maps. The detected confidence maps for the initial stage are given as:

    S^1 = ρ^1(F)     (1)

    while for the t-th stage the confidence maps are calculated as:

    S^t = ρ^t(F, S^{t−1}, L^{t−1}), ∀ t ≥ 2     (2)

    In Eq. (2), ρ^1 to ρ^t are the CNNs for inference from the initial stage to the t-th stage of branch 1, as shown in Fig. 2.

    ·The part affinity fields are generated along with the confidence maps S^1. The part affinity field for the initial stage is calculated using the equation below:

    L^1 = φ^1(F)     (3)

    Moreover, for the t-th stage, the part affinity fields are given by Eq. (4):

    L^t = φ^t(F, S^{t−1}, L^{t−1}), ∀ t ≥ 2     (4)

    Here φ^1 to φ^t represent the CNNs for inference from the initial stage to the t-th stage of branch 2. After every succeeding stage, the model concatenates both branches' predictions with the image features. These concatenated features are used for the refined predictions calculated in Eqs. (2) and (4), as shown in Fig. 2.

    ·For the iterative prediction of the confidence maps of the human body parts in the first branch and the part affinity fields in the second branch at each stage, loss functions are calculated. As there are two branches, two loss functions are calculated and applied at each stage. These loss functions are given by Eqs. (5) and (6) [62]. The first loss function, for the first branch, is calculated as:

    f_S^t = Σ_j Σ_p W(p) · ||S_j^t(p) − S_j^*(p)||²     (5)

    In Eq. (5), S_j^* is the ground-truth confidence map of human body part j. The second loss function, for the ground truth of the part affinity fields, is given as:

    f_L^t = Σ_c Σ_p W(p) · ||L_c^t(p) − L_c^*(p)||²     (6)

    where L_c^* is the ground-truth part affinity vector field of limb c. In Eqs. (5) and (6), p is a location in the input image and W is a binary mask with W(p) = 0 when the annotation is missing at location p. The loss function calculated at each stage minimizes the distance between the predicted and the ground-truth confidence maps and part affinity fields.

    ·The overall objective of the calculated loss function for the full architecture shown in Fig. 2 is obtained by adding Eqs. (5) and (6) over all stages:

    f = Σ_{t=1}^{T} (f_S^t + f_L^t)     (7)

    ·The pre-trained model shown in Fig. 3 gives 18 detected key points on the body, as determined in Fig. 4a. The key point information is further utilized to form body organs, as shown in Fig. 4b. Finally, using association rules, six body organs are formed and have been manually highlighted in Fig. 4d (a minimal sketch of this grouping is given after these steps).

    ·When a patient feels any type of discomfort, frequent movement often occurs in some part of the patient's body. For example, the patient may touch/hold his/her head with the hands or move the legs or arms. Furthermore, in some cases, patients may move their legs, arms, or any other part in a disruptive way; for instance, a patient sometimes sits up, lies down, or switches sides frequently. All such random and frequent changes are considered signs of discomfort. If these frequent and random movements last for a long duration, the situation is considered a discomfort condition. The discomfort investigation in a body is based on constant movements of a specific part of the body. The presented system determines a change in a body organ utilizing key point information across time and categorizes the condition as discomfort or normal. The coordinate information of the detected key points is used to identify pain. The movement in any body part or organ is measured using distance information, determined by applying the Euclidean distance across consecutive video frames.

    Figure 3: Sample images showing the heatmap and PAFs for the right elbow. The body part is encoded in the third channel; in this case, the right knee is at index 9

    ·A distance threshold T on the displacement of corresponding key points across consecutive frames is used, measured in pixels; T has been set to 25 pixels. The threshold decides whether movement has occurred in a patient body organ or part b. For instance, a variation in the (x, y) coordinates of detected key points 5, 6, and 7 on the patient's body indicates a movement in the left arm, while a change in the (x, y) coordinates of joints 8, 9, and 10 indicates a movement in the right leg. For this reason, the Euclidean distances of all detected key points of a body organ are examined using Eq. (8) (a sketch of this computation is given after these steps):

    d_i = √((x_i^{f+1} − x_i^f)² + (y_i^{f+1} − y_i^f)²)     (8)

    where (x_i^f, y_i^f) are the coordinates of key point i in frame f.

    Figure 4: Organ formation. (a) shows the detected key points on the patient's body, (b) shows the six different body organs formed using the part affinity method and association rules, and (c) shows the linking of the six body organs

    ·Lastly, to investigate whether a patient is feeling normal or experiencing some discomfort, the video frames are examined for frequent movement occurrences using a time-based threshold T_t, as shown in Eq. (9). (This threshold can be changed depending upon the size and variety of the data set; in this work, ten frames per second has been used due to the limited data set.)

    C_patient = Discomfort, if movement (d_i > T) persists for a duration ≥ T_t; Normal, otherwise     (9)

    where C_patient represents the condition of the patient P, and T_t is the time threshold representing the span of time that discriminates between normal and discomfort movements.
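
    A minimal sketch of the key-point read-out and organ-formation steps is given below. It assumes the common 18-point COCO-style layout produced by two-branch pose models of the kind cited above; the exact index grouping, the helper name keypoints_from_confidence_maps, and the peak-confidence cutoff are illustrative assumptions rather than details taken from the paper (although the grouping is consistent with joints 5–7 forming the left arm and 8–10 the right leg, as stated above).

    ```python
    import numpy as np

    # Assumed COCO-style 18-key-point layout (0 nose, 1 neck, 2-4 right arm,
    # 5-7 left arm, 8-10 right leg, 11-13 left leg, 14-17 eyes/ears). The paper
    # states 18 key points and six organs; the torso grouping below is a guess.
    ORGANS = {
        "head":      [0, 1, 14, 15, 16, 17],
        "right_arm": [2, 3, 4],
        "left_arm":  [5, 6, 7],
        "right_leg": [8, 9, 10],
        "left_leg":  [11, 12, 13],
        "torso":     [1, 2, 5, 8, 11],
    }

    def keypoints_from_confidence_maps(conf_maps, min_conf=0.1):
        """Read one (x, y) location per body part from the model's confidence maps.

        conf_maps: array of shape (18, H, W), one heat map per key point.
        Returns a list of 18 entries, each an (x, y) tuple, or None when the
        peak confidence falls below min_conf (part not detected).
        """
        keypoints = []
        for heatmap in conf_maps:
            y, x = np.unravel_index(np.argmax(heatmap), heatmap.shape)
            keypoints.append((int(x), int(y)) if heatmap[y, x] >= min_conf else None)
        return keypoints
    ```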
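
    The distance- and time-threshold classification of Eqs. (8) and (9) can then be sketched as follows. T = 25 pixels is taken from the paper; the numeric value of T_t is not stated, so ten consecutive frame transitions (one second at the stated 10 fps) is assumed here purely for illustration, and the function names are hypothetical.

    ```python
    import numpy as np

    T_DIST = 25   # distance threshold T in pixels (stated in the paper)
    T_TIME = 10   # assumed time threshold T_t: 10 frame transitions (~1 s at 10 fps)

    def organ_moved(prev_kps, curr_kps, organ_indices, t_dist=T_DIST):
        """True if any key point of the organ moved more than t_dist pixels
        between two consecutive frames (Euclidean distance, Eq. (8))."""
        for i in organ_indices:
            p, c = prev_kps[i], curr_kps[i]
            if p is None or c is None:
                continue
            if np.hypot(c[0] - p[0], c[1] - p[1]) > t_dist:
                return True
        return False

    def classify_condition(frames_kps, organ_indices, t_time=T_TIME):
        """Classify one organ over a clip as 'discomfort' or 'normal' (Eq. (9)).

        frames_kps: per-frame key-point lists, e.g. the output of
        keypoints_from_confidence_maps() applied to every frame.
        The organ is flagged as discomfort when movement persists for at least
        t_time consecutive frame transitions.
        """
        run = 0
        for prev, curr in zip(frames_kps, frames_kps[1:]):
            if organ_moved(prev, curr, organ_indices):
                run += 1
                if run >= t_time:
                    return "discomfort"
            else:
                run = 0
        return "normal"

    # Example usage: condition = classify_condition(frames_kps, ORGANS["left_arm"])
    ```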

    4 Experimental Results and Discussion

    The proposed method has been evaluated on the recorded IMS-PDD-II data set. A brief description of the video clips considered in this work is given in Tab. 1. Experiments have been performed on an HP Core i3 laptop with 8 GB RAM. The frames of the video clips are given as input to the pre-trained model to identify key points and organs of the patient's body. A few output images of detected organs can be observed in Fig. 5. After detecting the key points, the movement frequency of the patient's organs has been analyzed using key point coordinate information. Based on the movement frequency, discomfort in the patient's body has been decided. The results for the different video clips, each containing movement in different organs of the patient's body, are briefly discussed in this section.

    Table 1:Description of video clips used

    Figure 5: Sample output images showing organs of the patient's body

    In video 1, the patient moved his left arm many times, as noted in Fig. 6. To be exact, the left arm involves movement in frames 21–40, 43–54, 57–82, 84–135, 137–157, 160–174, 176–188, 190–207, 209–229, and 239–258. All these changes occur continuously and are greater than the defined threshold. Also, these sequences of movements in the left arm are separated by only one or two frames without movement, indicating that there is continuous movement in the left arm. This determines that there is severe pain (discomfort) in the left arm. The movements in the left arm are also accompanied by movement or changes in the right arm in some frames because the patient keeps touching his left arm with his right hand, as explained in Fig. 7. In video 2, excessive movements occur in the patient's right arm, i.e., in frames 14–22, 24–68, 73–84, 93–102, 148–180, 185–202, 207–218, and 227–236 consecutively. The frequency of the movement in the right arm is large compared to the other organs. In addition, the patient has moved his head, left arm, and both legs in some of the frames, as seen in Fig. 7. However, as the pattern or frequency of the right arm movement is greater than the threshold, this indicates discomfort in the right arm. The reason is that, most of the time, discomfort in one part of the body also causes movements in other parts besides the concerned body part.

    Figure 6:Movement detection in video 1

    Figure 7:Movement detection in video 2

    Video 3 contains movement in the patient's right leg almost continuously throughout the video, with the exception of a few frame gaps. The movement in the right leg is accompanied by movement in the right arm in most of the frames. The patient has also moved his head and left arm, but the movement in the head is a bit more frequent, as depicted in Fig. 8. Runs of ten or more consecutive frames involving movement in the right leg are 3–30, 62–82, 84–102, 134–149, 158–178, 180–209, and 241–255. This situation can be classified as discomfort in the right leg. In video 4, the patient moved both his arms frequently throughout the video, but movements in the right arm are more substantial and last for a longer duration, as is clear from Fig. 9. Here, consecutive frames with movements in the right arm include 2–32, 62–78, 81–105, 120–139, 155–191, 197–213, 216–240, and 255–274. Movements in the left arm also occur almost in parallel to those in the right arm in most of the frames. The patient has also moved his head and right leg in some frames. The frequent movements in the right arm lead to the conclusion that, in this video, the movements in both arms reflect discomfort in the right arm of the patient.

    Figure 8:Movement detection in video 3

    Figure 9:Movement detection in video 4

    Video 5, on the other hand, comprises two patients: the first patient is lying on bed 1 (left side), while the second patient is lying on bed 2 (right side). The results of movement in various organs of both patients are presented in Figs. 10 and 11, respectively. The patient in bed 1 has largely moved his head and both arms, particularly in frames 38–77 and 103–244. All these movements satisfy the time-based threshold, hence indicating that the patient feels some pain in his body. On the other hand, the patient lying in bed 2 also moved various parts of his body. Fig. 11 also shows that, for patient 2, most of the frames contain a change in various body parts, although the frequency of movement is less than the defined threshold, which shows that the movement is normal.

    Figure 10:Movement detection in video 5 Bed 1

    Figure 11:Movement detection in video 5 Bed 2

    For the evaluation of the proposed system, ground truth was labeled manually for each of the video clips, whereby each frame of the video was inspected for the (x, y) coordinates of the detected key points. To measure movement at a particular key point, the Euclidean distance was calculated between the coordinates of the same point in successive frames. Finally, to determine which body part moved, the quantified movements in all the key points associated with each organ of the patient's body were examined against the threshold. The results produced by the system for each video clip are compared to those in the ground truth. The confusion matrices and the derived performance measures are computed as follows:

    ·TP:Movement occurs in a particular organ, and the method also detects it.

    ·TN:Movement does not occur in a particular organ, and the method also does not detect it.

    ·FP:Movement does not occur in a particular organ, but the method detects it.

    ·FN:Movement occurs in a particular organ, but the method does not detect it.

    Various performance measures, such as accuracy, True Positive Rate (TPR), False Positive Rate (FPR), True Negative Rate (TNR), and Misclassification Rate (MCR), are derived from the confusion matrix. The TPR and FPR for each video clip and each body organ are presented in Fig. 12.
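
    As a minimal sketch of how the reported measures follow from the per-organ, per-video confusion-matrix counts defined above (the function name and return layout are illustrative, not taken from the paper):

    ```python
    def performance_measures(tp, tn, fp, fn):
        """Derive the measures reported in the paper from confusion-matrix counts."""
        total = tp + tn + fp + fn
        return {
            "accuracy": (tp + tn) / total,
            "TPR": tp / (tp + fn),      # True Positive Rate (sensitivity)
            "FPR": fp / (fp + tn),      # False Positive Rate
            "TNR": tn / (tn + fp),      # True Negative Rate (specificity)
            "MCR": (fp + fn) / total,   # Misclassification Rate
        }

    # Example with illustrative counts (not the paper's data):
    # performance_measures(tp=98, tn=96, fp=2, fn=2)
    ```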

    It can be observed that the system reveals good results by identifying discomfort in the different body organs. The TPR ranges from 98% to 99%, while the FPR of the proposed system is between 1% and 4%. The organ-wise average performance measures are shown in Tab. 2. The results show average measures across the various videos for different organs of the patient's body, revealing that the proposed system achieves 98% overall average accuracy. The TPR of the proposed system is 99% with an FPR of 2%.

    Figure 12: TPR and FPR for different videos against each body organ (1 to 6), representing Head, Right Arm, Left Arm, Right Leg, Left Leg, and Torso, respectively. The TPR and FPR show the performance of the proposed method. (a) TPR and FPR of video 1, (b) TPR and FPR of video 2, (c) TPR and FPR of video 3, (d) TPR and FPR of video 4, (e) TPR and FPR of video 5 bed 1, (f) TPR and FPR of video 5 bed 2

    Table 2:Average performance measures for all parts in all videos

    5 Conclusion and Future Directions

    In this work, a non-invasive system is developed for automated discomfort detection in the patient's body using a CNN. The proposed system contains a ten-layer CNN model, which detects key points at different locations on the patient's body using confidence maps. The key point information is used to form the main body organs by applying association rules and part affinity fields. Next, discomfort in the patient's body organs is investigated by estimating the distance between corresponding key points in consecutive video frames. Finally, distance- and time-based thresholds are used to classify movement as discomfort or normal. To investigate its performance, the system is tested on a newly recorded data set. Experiments are evaluated using several performance measures, including TPR, FPR, TNR, MCR, and average accuracy. The TPR and FPR of each body organ are measured for all sequences, revealing the proposed system's robustness. The overall average TPR of the system is 98%, with an average FPR of 2%.

    This paper provides several future directions. First, new high-quality, prolonged overhead-view data sets with multiple patients, covering different types of discomfort associated with different diseases, can be recorded in consultation with medical experts. Second, the proposed work might be continued by recording high-resolution data sets, which may capture the facial expressions of patients. This might add a second layer of discomfort detection, as facial expressions are a good way of inferring feelings and emotions. Furthermore, an interactive real-time automated detection system for patients' discomfort might be introduced, in which an overhead camera is accompanied by LEDs installed in the nursing staff room and the medical superintendent's room. The system might generate an alarm upon detection of discomfort. This might help patients immediately receive the attention of the staff on duty.

    Funding Statement:This research was funded by the Deanship of Scientific Research at Princess Nourah bint Abdulrahman University through the Fast-track Research Funding Program.

    Conflicts of Interest:The authors declare that they have no conflicts of interest to report regarding the present study.
