
    Automated Patient Discomfort Detection Using Deep Learning

    Computers, Materials & Continua, 2022, Issue 5

    Imran Ahmed, Iqbal Khan, Misbah Ahmad, Awais Adnan and Hanan Aljuaid

    1Center of Excellence in Information Technology, Institute of Management Sciences, Peshawar, Pakistan

    2Computer Sciences Department, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University (PNU), Riyadh, Saudi Arabia

    Abstract: The Internet of Things (IoT) has transformed almost all fields of life, and its impact on the healthcare sector has been especially notable. Various IoT-based sensors are used in the healthcare sector to offer quality and safe care to patients. This work presents a deep learning-based automated patient discomfort detection system in which patients' discomfort is detected non-invasively. To do this, an overhead-view patient data set has been recorded. For testing and evaluation purposes, we investigate the power of deep learning by choosing a Convolutional Neural Network (CNN)-based model. The model uses confidence maps and detects 18 different key points at various locations on the body of the patient. Applying association rules and part affinity fields, the detected key points are later converted into six main body organs. Furthermore, the distance between subsequent key points is measured using coordinate information. Finally, distance- and time-based thresholds are used for the classification of movements associated with discomfort or normal conditions. The accuracy of the proposed system is assessed on various test sequences. The experimental outcomes reveal the worth of the proposed system by obtaining a True Positive Rate of 98% with a 2% False Positive Rate.

    Keywords: Artificial intelligence; patient monitoring; discomfort detection; deep learning

    1 Introduction

    The IoT has introduced smart healthcare systems into the medical sector, generally comprising smart sensors, a remote server, and a network. In smart healthcare, it has many applications, including early warning services (emergency, first aid, medical assessment), real-time supervision services (patient monitoring, elderly care), and scheduling and optimization services (medical staff allocation, bed allocation, resource allotment). Patient monitoring systems have been gaining the consideration of researchers in the fields of advanced computer vision and machine learning. This is an ongoing research field because of its broad range of applications, including respiration monitoring, pain detection, depression monitoring, sleep monitoring, patient behavior monitoring, posture monitoring, epilepsy seizure detection, etc. Researchers have developed different systems for patient monitoring; e.g., some use specialized hardware, pressure mattresses, and sensors, but at the cost of additional expense. Similarly, connecting sensors to the body of the patient is undesirable from the patient's point of view. A few used signal-based approaches to observe breathing, depth rate, and steadiness of breath besides monitoring the breath time and ratio. Even though pain detection techniques exist, they mainly use facial expressions; the major drawback of such systems is that they require the patient to align the face directly to the camera. Sleep monitoring systems have been developed to detect sleep apnea and sleep disorders; such systems are also based on hardware and sensors installed in patients' beds. Some techniques monitor patient behavior, which helps to analyze their medical condition. However, such techniques are based on the installation of multiple cameras.

    Multiple-camera posture-based monitoring techniques have also been developed, mainly focusing on the upper body of the patient. Because of these limitations, a non-invasive discomfort detection system is proposed in this work, which utilizes neither specialized hardware/sensors nor line-of-sight vision devices nor any constrained/specialized environment. The introduced system is principally based on a ten-layer Convolutional Neural Network (CNN). CNNs are a class of deep learning models containing input, output, and hidden layers. The layers are fully connected, which helps to detect and recognize features and patterns. A pre-trained model is used to test/evaluate the patient's discomfort using our newly recorded data set. The CNN model's output is 18 key points detected at different patient body locations using confidence maps. The detected key points are further utilized to form six major body organs. This formation is based on association rules and part affinity fields. The distance of each detected key point is estimated from the corresponding key point in successive frames. Distance- and time-based thresholds are considered to recognize discomfort in a specific organ of the patient's body. Finally, experimental evaluation is made using manually created ground truths. The work presented in the paper has the following main contributions:

    ·An automated system is introduced for detection of patient discomfort using a deep learning-based model.

    ·By utilizing CNN architectures, confidence maps and 18 different keypoints are detected at various locations of the patient’s body,

    ·The detected keypoints are then converted into six main body parts/organs based on association rules and part affinity fields, and the distance of the following key points is measured using coordinates information,

    ·Finally, distance and the time-based threshold are utilized for the classification of movements as either discomfort or normal conditions.

    The proposed system could have many possible applications, such as analysis, monitoring, and detection of pain and discomfort, automatic patient monitoring in hospitals or homes, and elderly monitoring. The presented work is organized as follows: a review of the related work is presented in Section 2; the proposed system is introduced in Section 3; Section 4 explains the experimental results; lastly, Section 5 concludes the presented work and provides future directions.

    2 Literature Review

    In recent years, automated patient monitoring has been gaining the interest of researchers. Different signal processing, image processing, and computer vision techniques have been developed in the last decade. Some of these techniques are discussed in this section and categorized as follows:

    2.1 Respiration Monitoring Approaches

    Respiration monitoring aims to observe the depth and steadiness of breath besides monitoring the inhalation and exhalation time and their ratio. Cho et al. [1] used a thermal image-based approach to respiration rate monitoring by specifying a region of interest under the nose. In [2], a radio frequency-based method is proposed, which helps to estimate the rate of respiration using a Multiple Signal Classification (MUSIC) algorithm. The authors in [3] presented a contactless breathing monitoring system using a single-camera approach. Ostadabbas et al. [4] proposed a respiration monitoring system for estimating airway resistance non-intrusively using depth data obtained from the Microsoft Kinect sensor. Fang et al. [5] proposed a system for detecting sudden infant death syndrome. Al-Khalidi et al. [6] used facial thermal images of children to monitor their respiration rate. Janssen et al. [7] use intrinsic respiratory features for finding the region of interest for respiration and motion factorization to extract respiration signals. Braun et al. [8] divide the input images into blocks and then estimate motion for each block; these block motions are then classified to find the respiration activity. Wiede et al. [9] introduce a method for remotely monitoring respiration rate using RGB images; this approach finds the region of interest and applies principal component analysis and frequency-finding methods to determine the respiration rate. Frigola et al. [10] produced a video-based non-intrusive technique for respiration monitoring, which detects movement applying optical flow and quantifies the detected movement. Monitoring a patient's respiration can provide insights and help diagnose many conditions, such as lung problems and abnormal respiration rates.

    2.2 Pain Detection and Depression Monitoring Approaches

    In the literature, pain detection and depression monitoring have mostly been handled by analyzing facial expressions. The authors in [11] exploited facial appearances for pain detection by using a feature-based method similar to [12–16], i.e., the pyramid histogram of oriented gradients and the pyramid local binary pattern. They used these features to extract the shape and appearance of patients' faces, respectively. The authors in [17] used the Prkachin and Solomon Pain Intensity (PSPI) metric. Other approaches that consider facial emotions to detect pain and/or depression are proposed in [18–22]. Each facial movement is categorized as a different action unit. The authors extract the face's canonical appearance using Active Appearance Models (AAMs), filtered to extract features. These features are then fed to different SVMs, each trained to measure a separate level of pain intensity. In [23], the authors suggested a system using AAMs to detect patients' pain in videos. In [24,25], the authors introduced a system that could discriminate facial emotions of pain from other facial emotions and applied an SVM for the severity score of pain. The system has been tested on the UNBC-McMaster Database [26] using four different classifiers, namely SVM, Random Forest, and two neural networks. For assessment of the system, they applied the HI4D-ADSIP data set [27]. Nanni et al. [28] classify pain states by proposing a descriptor named Elongated Ternary Patterns (ELTP), which combines the features of the Elongated Binary Pattern (ELBP) [29] and Local Ternary Patterns (LTP).

    2.3 Sleep Monitoring Approaches

    Sleep monitoring encompasses recording and analyzing chest and abdomen movements, as is the case with respiration monitoring. In [30], Al-Naji et al. developed a system for detecting sleep apnea and monitoring respiration rate in children by using the Microsoft Kinect sensor. Li et al. [31] proposed a non-invasive system for cardiopulmonary signal monitoring in various sleeping positions; an infrared light source and an infrared-sensitive camera are used in this approach. Metsis et al. [32] proposed a sleep pattern monitoring system and investigated many factors corresponding to sleep disorders. Malakuti et al. [33] address the problem of sleep irregularities based on pressure data. Liao et al. [34] designed a system that measures sleep quality using infrared video; they used the technique of motion history images [35] for analyzing videos to recognize the patterns of patients' movements. Nandakumar et al. [36] introduced a smartphone-based sleep apnea detection system, which analyzes chest and abdominal motion. Saad et al. [37] proposed a device for determining sleep quality using several sensors in the room; the sensors are used for determining heart rate, temperature, and movement of the body. Hoque et al. [38] attach WISPs [39] to the bed's mattress to determine the positions of the body and thereby monitor sleep; accelerometer data is used for movement detection.

    2.4 Behavior Monitoring Approaches

    Human behavior understanding also plays a vital role in learning much about people. Borges et al. [40] tried to recognize individual activities associated with psychiatric patients by utilizing blob detection and optical flow analysis, and applied decision rules to analyze patients' activities. The authors in [41] proposed a system based on monitoring patients' vital signs to prevent incidents such as falls, injuries, and pain. The system uses the Canny edge detector and Hough transform for detecting beds. Once a bed is detected, the system determines whether or not a patient is present in the bed by detecting the patient's head. Martinez and Stiefelhagen [42] have applied multiple cameras for observing the behavior of patients in an ICU irrespective of the environmental conditions. By examining a patient's behavior, much information can be collected about his medical condition [43].

    2.5 Posture Monitoring Approaches

    Knowing about patient posture proves helpful for purposes like fall detection, pressure ulcer detection, and activity recognition. Chang et al. [44] introduced a system based on depth videos for preventing pressure ulcers in bedridden patients by investigating their movement and posture. In [45], the authors introduced a non-invasive patient posture monitoring method; this approach extracts HOG features for the classification of postures. The system also tracks the postures of the patient and generates a report accordingly. Wang et al. [46] have introduced a monitoring system for recognizing a person's pose while covered with a blanket. In another approach, [47] proposed a system for determining the upper body parts of a human under a blanket utilizing an overhead camera [48–50]. Brulin et al. [51] suggested a technique for monitoring the elderly at home; the proposed method is based on posture recognition. This technique detects the individual body and then utilizes posture identification methods on the human silhouette based on fuzzy logic.

    2.6 Epilepsy Monitoring Approaches

    Many attempts have been made towards vision-based detection and prediction of epilepsy seizures. The authors in [52] proposed a method for eyeball detection; the main purpose is to track the movement of the eyes to determine the presence or absence of epileptic seizures. Lu et al. [53] used color videos and proposed a method for quantification of limb movement occurring in seizures associated with epilepsy. Cuppens et al. [54] apply the optical flow method to detect epilepsy-related movement. Kalitzin et al. [55] used the optical flow method to find movements associated with epileptic seizures.

    All of the above-discussed approaches focus either on a single patient and/or a single bed, and specialized hardware is used. Also, the intrusive approaches among these require sensors connected to the body or bed to record various measurements, which is both costly and unwanted from the patient's point of view. Even though pain detection approaches exist, they depend wholly on facial expressions, restraining the patient to keep his/her face directly towards the camera. On the other hand, the proposed system can work in existing ward setups and monitor more than one patient simultaneously, requiring no advanced beds or special equipment except a single camera. Being non-invasive, it makes no contact with the patient while recording their movements. Recently, scholars have also utilized deep learning-based methods [56–59] for patient discomfort monitoring [60]. In this work, we also use a deep learning-based method for automated patient discomfort detection.

    3 The Proposed Method

    In this section, a deep learning-based sustainable discomfort detection system is introduced. The flow chart presented in Fig. 1 highlights the main steps of the proposed method. The proposed method is mainly based on a Convolutional Neural Network (CNN) architecture [61]. Firstly, the input images of the patient from the IMS-PDD-II data set are passed to the pre-trained model, which detects key points at various locations on the body of a patient. Then, the information of the detected key points is applied for the formation of the patient's body organs using defined association rules. Finally, a distance threshold is applied to recognize discomfort or pain in the organs of a patient's body. The detailed explanation of the proposed method exhibited in Fig. 1 is described in the following steps:

    ·The pre-trained model uses a non-parametric representation called part affinity fields. The part affinity fields contain the orientation and position information used to identify human body parts in the input image. The model employs the CNN architecture shown in Fig. 1 [62]. The input images from the data set are given to the pre-trained model. The trained model mainly has two branches: the top branch is used for predicting the confidence maps and detecting human body parts, while the bottom one predicts part affinity fields, which are used to link human body parts, as shown in Fig. 2. Each of the two branches is an iterative prediction architecture that refines the predictions over a number of successive stages, as sketched below.
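    The following is an illustrative, hedged sketch of such a two-branch, multi-stage network written in PyTorch. It is not the authors' implementation: the backbone, channel sizes, layer counts, and number of stages are assumptions chosen only to show how the confidence-map branch and the part-affinity-field branch are refined stage by stage.

```python
# Illustrative sketch (not the authors' code): a two-branch, multi-stage
# network in the spirit of the part-affinity-field model of [62].
import torch
import torch.nn as nn

class Stage(nn.Module):
    """One refinement stage: predicts confidence maps (branch 1) and
    part affinity fields (branch 2) from the shared features plus the
    previous stage's predictions."""
    def __init__(self, in_ch, n_keypoints=18, n_pafs=38):
        super().__init__()
        self.conf_branch = nn.Sequential(
            nn.Conv2d(in_ch, 128, 3, padding=1), nn.ReLU(),
            nn.Conv2d(128, n_keypoints, 1))
        self.paf_branch = nn.Sequential(
            nn.Conv2d(in_ch, 128, 3, padding=1), nn.ReLU(),
            nn.Conv2d(128, n_pafs, 1))

    def forward(self, x):
        return self.conf_branch(x), self.paf_branch(x)

class TwoBranchNet(nn.Module):
    def __init__(self, n_stages=3, feat_ch=64, n_keypoints=18, n_pafs=38):
        super().__init__()
        # Backbone producing the shared feature maps F (simplified here)
        self.backbone = nn.Sequential(
            nn.Conv2d(3, feat_ch, 3, padding=1), nn.ReLU())
        self.first = Stage(feat_ch, n_keypoints, n_pafs)
        # Later stages also see the previous S and L (Eqs. (2) and (4))
        self.rest = nn.ModuleList(
            Stage(feat_ch + n_keypoints + n_pafs, n_keypoints, n_pafs)
            for _ in range(n_stages - 1))

    def forward(self, img):
        feats = self.backbone(img)
        S, L = self.first(feats)                       # Eqs. (1) and (3)
        outputs = [(S, L)]
        for stage in self.rest:
            S, L = stage(torch.cat([feats, S, L], dim=1))  # Eqs. (2) and (4)
            outputs.append((S, L))
        return outputs

# Example: one 368x368 RGB frame
net = TwoBranchNet()
preds = net(torch.randn(1, 3, 368, 368))
print(len(preds), preds[-1][0].shape)  # 3 stages, final confidence maps
```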

    Figure 1: Flowchart of the CNN-based discomfort detection method

    Figure 2: Proposed model architecture. (a) shows the input image, (b) the CNN model, (c) the detected key points, and (d) the detected patient's body organs

    ·A set of feature maps, represented by $F$, is extracted for each input image using the CNN. $F$ is used as the input to the initial stage of both branches, as shown in Fig. 2. At the initial stage, the network generates a set of detection confidence maps:

$$S^{1} = \rho^{1}(F) \quad (1)$$

    while for the $t$th stage the confidence maps are calculated as:

$$S^{t} = \rho^{t}\left(F, S^{t-1}, L^{t-1}\right), \quad \forall t \ge 2 \quad (2)$$

    In Eq. (2), $\rho^{1}$ to $\rho^{t}$ denote the CNNs for inference at the initial stage to the $t$th stage of branch 1, as shown in Fig. 2.

    ·The part affinity fields are also generated along with the confidence maps $S^{1}$. The part affinity field for the initial stage is calculated using the equation below:

$$L^{1} = \phi^{1}(F) \quad (3)$$

    Moreover, for the $t$th stage, the part affinity fields are given by:

$$L^{t} = \phi^{t}\left(F, S^{t-1}, L^{t-1}\right), \quad \forall t \ge 2 \quad (4)$$

    Here $\phi^{1}$ to $\phi^{t}$ represent the CNNs for inference at the initial stage to the $t$th stage of branch 2. After every succeeding stage, the model concatenates both branches' predictions with the image features; these are used for the refined predictions calculated in Eqs. (2) and (4), as shown in Fig. 2.

    ·For iterative prediction of the confidence maps of human body parts at the first branch and the part affinity fields at the second branch of each stage, loss functions are calculated. As there are two branches, two loss functions are calculated and applied at each stage. These loss functions are given by Eqs. (5) and (6) [62]. The loss function for the first branch is calculated as:

$$f_{S}^{t} = \sum_{j=1}^{J} \sum_{p} W(p)\, \left\lVert S_{j}^{t}(p) - S_{j}^{*}(p) \right\rVert_{2}^{2} \quad (5)$$

    In Eq. (5), $S_{j}^{*}$ is the ground-truth confidence map of human body part $j$. The second loss function, for the part affinity fields, is given as:

$$f_{L}^{t} = \sum_{c=1}^{C} \sum_{p} W(p)\, \left\lVert L_{c}^{t}(p) - L_{c}^{*}(p) \right\rVert_{2}^{2} \quad (6)$$

    where $L_{c}^{*}$ is the ground-truth part affinity vector field. In Eqs. (5) and (6), $p$ is a location in the input image, and $W$ is a binary mask equal to 0 when the annotation is missing at location $p$. The loss calculated at each stage minimizes the distance between the predicted and ground-truth confidence maps and affinity fields.

    ·The main objective of the loss function for the full architecture shown in Fig. 2 is obtained by adding Eqs. (5) and (6) over all stages (a minimal numerical sketch follows):

$$f = \sum_{t=1}^{T} \left( f_{S}^{t} + f_{L}^{t} \right) \quad (7)$$
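    The following is a minimal numerical sketch (not the authors' training code) of how the masked stage losses of Eqs. (5) and (6) and the total objective of Eq. (7) can be evaluated; the array shapes, number of stages, and map resolution are illustrative assumptions.

```python
# Sketch of the masked L2 losses in Eqs. (5)-(7), assuming predictions and
# ground truths are (channels, H, W) arrays and w_mask zeroes unlabeled pixels.
import numpy as np

def stage_loss(pred, gt, w_mask):
    """Masked L2 distance between predicted and ground-truth maps."""
    diff = (pred - gt) ** 2                  # squared error per channel/pixel
    return float(np.sum(w_mask[None] * diff))

def total_loss(conf_preds, paf_preds, conf_gt, paf_gt, w_mask):
    """Eq. (7): sum the confidence-map and PAF losses over all stages."""
    return sum(stage_loss(S_t, conf_gt, w_mask) +
               stage_loss(L_t, paf_gt, w_mask)
               for S_t, L_t in zip(conf_preds, paf_preds))

# Toy example: 2 stages, 18 keypoint maps, 38 PAF channels, 46x46 maps
H = W = 46
conf_gt, paf_gt = np.zeros((18, H, W)), np.zeros((38, H, W))
w_mask = np.ones((H, W))                     # 0 where annotation is missing
conf_preds = [np.random.rand(18, H, W) for _ in range(2)]
paf_preds = [np.random.rand(38, H, W) for _ in range(2)]
print(total_loss(conf_preds, paf_preds, conf_gt, paf_gt, w_mask))
```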

    ·The pre-trained model shown in Fig. 3 gives 18 detected key points on the body, as shown in Fig. 4a. The key point information is further utilized to form body organs, as shown in Fig. 4b. Finally, using association rules, six organs of the body are formed and have been manually highlighted in Fig. 4d; an illustrative grouping is sketched below.
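    The following dictionary illustrates one possible grouping of the 18 key points into the six body organs. The indices for the left arm (5, 6, 7) and right leg (8, 9, 10) follow the examples given later in the text; the remaining groups assume the standard 18-point ordering of the pre-trained model and are therefore illustrative, not the authors' exact association rules.

```python
# Illustrative key-point-to-organ grouping (an assumption, see lead-in above).
ORGAN_KEYPOINTS = {
    "head":      [0, 1, 14, 15, 16, 17],   # nose, neck, eyes, ears
    "right_arm": [2, 3, 4],                # shoulder, elbow, wrist
    "left_arm":  [5, 6, 7],                # as in the text: key points 5, 6, 7
    "right_leg": [8, 9, 10],               # as in the text: joints 8, 9, 10
    "left_leg":  [11, 12, 13],
    "torso":     [1, 2, 5, 8, 11],         # neck, shoulders, hips
}

def organ_points(keypoints, organ):
    """Return the (x, y) coordinates belonging to one organ.
    `keypoints` is a list of 18 (x, y) tuples, with None for undetected points."""
    return [keypoints[i] for i in ORGAN_KEYPOINTS[organ] if keypoints[i] is not None]
```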

    ·When a patient feels any type of discomfort, movement frequently occurs in some part of the patient's body. For example, the patient may touch/hold his/her head with the hands or move the legs or arms. Furthermore, in a few cases, patients may move their legs, arms, or any other part in a disruptive way; for instance, they sometimes sit up, lie down, or switch sides frequently. All such random and frequent changes are considered discomfort signs. If these frequent and random movements last for a long duration, the situation is considered a discomfort condition. The discomfort investigation is based on constant movements of a specific part of the body. The presented system determines a change in a body organ utilizing key point information across time and categorizes the condition as discomfort or normal. The coordinate information of the detected key points is used to identify pain. The movement in any body part or organ is measured using distance information, which is determined by applying the Euclidean distance across consecutive video frames.

    Figure 3: The sample images show the heatmap and PAFs for the right elbow. The body part is encoded in the 3rd channel; in this case, the right knee is at index 9

    ·A threshold $T$ on the distance moved by a key point between consecutive frames, measured in pixels, is used; $T$ has been set to 25 pixels. The threshold decides whether there is movement in a patient's body organ or part $b$. For instance, a variation in the $(x, y)$ coordinates of detected key points 5, 6, and 7 on the body of the patient indicates a movement of the left arm, while a change in the $(x, y)$ coordinates of joints 8, 9, and 10 indicates a movement of the right leg. For this reason, the Euclidean distances of all detected key points of a body organ have been examined using Eq. (8):

$$d_{k} = \sqrt{\left(x_{k}^{f} - x_{k}^{f-1}\right)^{2} + \left(y_{k}^{f} - y_{k}^{f-1}\right)^{2}} \quad (8)$$

    where $(x_{k}^{f}, y_{k}^{f})$ are the coordinates of key point $k$ in frame $f$. A minimal sketch of this check follows.
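    The sketch below shows the per-organ movement check: the Euclidean distance of Eq. (8) is computed for each key point of an organ across two consecutive frames and compared against the pixel threshold T = 25. Function and variable names are illustrative, not taken from the authors' code.

```python
# Sketch of the distance-threshold check of Eq. (8); frames are given as
# dicts mapping key-point index -> (x, y), with None for undetected points.
import math

T_PIXELS = 25  # distance threshold T from the text

def keypoint_distance(p_prev, p_curr):
    """Eq. (8): Euclidean distance of the same key point in consecutive frames."""
    return math.hypot(p_curr[0] - p_prev[0], p_curr[1] - p_prev[1])

def organ_moved(prev_frame, curr_frame, indices, threshold=T_PIXELS):
    """True if any key point of the organ moved more than `threshold` pixels."""
    for i in indices:
        if prev_frame.get(i) is None or curr_frame.get(i) is None:
            continue  # key point not detected in one of the frames
        if keypoint_distance(prev_frame[i], curr_frame[i]) > threshold:
            return True
    return False

# Example: left arm = key points 5, 6, 7 (as in the text)
prev = {5: (100, 200), 6: (120, 260), 7: (140, 320)}
curr = {5: (100, 202), 6: (121, 261), 7: (180, 350)}  # wrist moved ~50 px
print(organ_moved(prev, curr, [5, 6, 7]))  # True -> movement in the left arm
```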

    Figure 4: Organ formation. (a) shows the detected key points on the patient's body, (b) shows six different body organs formed using the part affinity method and association rules, (c) shows the linking of the six body organs

    ·Lastly, to investigate whether a patient is feeling normal or having some discomfort, the video frames are examined for frequent movement occurrences using a time-based threshold $T_{t}$, as shown in Eq. (9). (This threshold can be changed depending on the size and variety of the data set; in this work, ten frames per second have been used due to the limited data set.)

$$C_{patient} = \begin{cases} \text{discomfort}, & \text{if movement in an organ persists for at least } T_{t} \\ \text{normal}, & \text{otherwise} \end{cases} \quad (9)$$

    where $C_{patient}$ represents the condition of the patient $P$, and $T_{t}$ is the time threshold representing the span of time that discriminates between normal and discomfort movements. A sketch of this decision rule is given below.
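    The decision rule of Eq. (9) can be sketched as follows, assuming that the ten-frame figure mentioned above is used as the run length T_t that separates normal from discomfort movements; this interpretation and all names are illustrative.

```python
# Sketch of the time-based classification in Eq. (9).
T_T_FRAMES = 10  # assumed time threshold T_t (see lead-in above)

def classify_condition(movement_flags, t_threshold=T_T_FRAMES):
    """movement_flags: per-frame booleans for one organ (True = moved).
    Returns 'discomfort' if any run of consecutive movement reaches T_t."""
    run = 0
    for moved in movement_flags:
        run = run + 1 if moved else 0
        if run >= t_threshold:
            return "discomfort"
    return "normal"

# Example: 12 consecutive frames of left-arm movement -> discomfort
flags = [False] * 5 + [True] * 12 + [False] * 3
print(classify_condition(flags))
```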

    4 Experimental Results and Discussion

    The proposed method has been evaluated on the recorded IMS-PDD-II data set. A brief description of the video clips considered in this work is given in Tab. 1. Experiments have been performed on an HP Core i3 laptop with 8 GB RAM. The frames of the video clips are given as input to the pre-trained model to identify the key points and organs of the patient's body. A few output images of detected organs can be observed in Fig. 5. After detecting the key points, the movement frequency of the patient's organs has been analyzed using the key point coordinate information. Based on the movement frequency, the presence of discomfort in the patient's body has been decided. In this section, the results of the different video clips, each containing movement in different organs of the patient's body, are briefly discussed.

    Table 1: Description of the video clips used

    Figure 5: Sample output images showing organs of the patient's body

    In video 1, the patient moved his left arm many times, as noted in Fig. 6. To be exact, the left arm involves movement in frames 21–40, 43–54, 57–82, 84–135, 137–157, 160–174, 176–188, 190–207, 209–229, and 239–258. All these changes occur continuously and are greater than the defined threshold. Also, these sequences of movements in the left arm are separated by no movement in only one or two frames, indicating that there is continuous movement in the left arm. This determines that there is severe pain (discomfort) in the left arm. The movements in the left arm are also accompanied by movement or changes in the right arm in some frames because the patient keeps touching his left arm with his right hand, as explained in Fig. 7. In video 2, excessive movements have occurred in the patient's right arm, i.e., in frames 14–22, 24–68, 73–84, 93–102, 148–180, 185–202, 207–218, and 227–236 consecutively. The frequency of the movement in the right arm is large compared to other organs. In addition, the patient has moved his head, left arm, and both legs in some of the frames, as seen in Fig. 7. However, as the pattern or frequency of the right arm movement is greater than the threshold, this indicates discomfort in the right arm. The reason is that, most of the time, discomfort in one part of the body also causes movements in other parts besides the concerned body part.

    Figure 6: Movement detection in video 1

    Figure 7: Movement detection in video 2

    Video 3 contains movement in the patient's right leg almost continuously throughout the video, with the exception of a few frame gaps. The movement in the right leg is accompanied by movement in the right arm in most of the frames. The patient has also moved his head and left arm, but the movement in the head is a bit more frequent, as depicted in Fig. 8. Runs of ten or more consecutive frames involving movement in the right leg are 3–30, 62–82, 84–102, 134–149, 158–178, 180–209, and 241–255. This situation can be classified as discomfort in the right leg. In video 4, the patient moved both his arms frequently throughout the video, but movements in the right arm are more substantial and last for a longer duration, as is clear from Fig. 9. Here, consecutive frames with movements in the right arm include 2–32, 62–78, 81–105, 120–139, 155–191, 197–213, 216–240, and 255–274. Movements in the left arm also occur almost in parallel with those in the right arm in most of the frames. The patient has also moved his head and right leg in some frames. The frequent movements in the right arm lead to the conclusion that, in this video, movements in both arms indicate discomfort in the right arm of the patient.

    Figure 8: Movement detection in video 3

    Figure 9: Movement detection in video 4

    Video 5, on the other hand, comprises two patients: the first patient is lying on bed 1 (left side), while the second patient is lying on bed 2 (right side). The results of movement in various organs of both patients are presented in Figs. 10 and 11, respectively. The patient in bed 1 has largely moved his head and both arms, particularly in frames 38–77 and 103–244. All these movements satisfy the time-based threshold, indicating that the patient feels some pain in his body. On the other hand, the patient lying in bed 2 also moved various parts of his body. Fig. 11 also shows that, for patient 2, most of the frames contain a change in various body parts, although the frequency of movement is less than the defined threshold, which shows that the movement is normal.

    Figure 10: Movement detection in video 5, bed 1

    Figure 11: Movement detection in video 5, bed 2

    For the evaluation of the proposed system, ground truth was labeled manually for each of the video clips, whereby each frame of the video was inspected for the (x, y) coordinates of the detected key points. To measure movement in a particular key point, the Euclidean distance has been calculated between the coordinates of the same point in successive frames. Finally, to determine which body part moved, the quantified movements in all the key points associated with the organ of the patient's body were examined against the threshold. The results produced by the system for each video clip are compared to those in the ground truth. The confusion matrices and the derived performance measures are defined as follows:

    ·TP: Movement occurs in a particular organ, and the method also detects it.

    ·TN: Movement does not occur in a particular organ, and the method also does not detect it.

    ·FP: Movement does not occur in a particular organ, but the method detects it.

    ·FN: Movement occurs in a particular organ, but the method does not detect it.

    Various performance measures, such as accuracy, True Positive Rate (TPR), False Positive Rate (FPR), True Negative Rate (TNR), and Misclassification Rate (MCR), are computed from the confusion matrix, as sketched below. The TPR and FPR for each video clip and for each body organ are presented in Fig. 12.
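    As a small illustration, the measures listed above can be derived from the per-organ confusion-matrix counts as sketched here; the example counts are placeholders, not results from the paper.

```python
# Sketch of the standard confusion-matrix measures used in the evaluation.
def performance_measures(tp, tn, fp, fn):
    total = tp + tn + fp + fn
    return {
        "accuracy": (tp + tn) / total,
        "TPR": tp / (tp + fn),          # true positive rate (sensitivity)
        "FPR": fp / (fp + tn),          # false positive rate
        "TNR": tn / (tn + fp),          # true negative rate (specificity)
        "MCR": (fp + fn) / total,       # misclassification rate
    }

# Example with illustrative counts for one organ in one video
print(performance_measures(tp=98, tn=96, fp=2, fn=1))
```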

    It can be observed that the system produces good results by identifying discomfort in different body organs. The TPR ranges from 98% to 99%, while the FPR of the proposed system is between 1% and 4%. The organ-wise average performance measures are shown in Tab. 2. The results show the average measures across the videos for different organs of the patient's body, revealing that the proposed system achieves 98% overall average accuracy. The TPR of the proposed system is 99% with a 2% FPR.

    Figure 12: TPR and FPR for different videos against each body organ (1 to 6), representing Head, Right Arm, Left Arm, Right Leg, Left Leg, and Torso, respectively. The TPR and FPR show the performance of the proposed method. (a) TPR and FPR of Video 1, (b) TPR and FPR of Video 2, (c) TPR and FPR of Video 3, (d) TPR and FPR of Video 4, (e) TPR and FPR of Video 5 bed 1, (f) TPR and FPR of Video 5 bed 2

    Table 2: Average performance measures for all parts in all videos

    5 Conclusion and Future Directions

    In this work, a non-invasive system is developed for automated discomfort detection in the patient's body using a CNN. The proposed system contains a ten-layer CNN model, which detects key points at different body locations of the patient using confidence maps. The key point information is used to form the main body organs by applying association rules and part affinity fields. Next, the discomfort in the organs of the patient's body is investigated by estimating the distance between corresponding key points in consecutive video frames. Finally, distance- and time-based thresholds are used for the classification of movement as discomfort or normal. To investigate its performance, the system is tested on a newly recorded data set. Experiments are evaluated using several performance measures, including TPR, FPR, TNR, MCR, and average accuracy. The TPR and FPR of each body organ are measured for all sequences, revealing the proposed system's robustness. The overall average TPR of the system is 98%, with an average FPR of 2%.

    This paper suggests several future directions. First, new high-quality, prolonged overhead-view data sets with multiple patients, covering different types of discomfort for different diseases, can be recorded in consultation with medical experts. Second, the proposed work might be continued by recording high-resolution data sets, which may capture the facial expressions of patients. This might add a second layer of discomfort detection, as facial expressions are a good way of inferring feelings and emotions. Furthermore, an interactive real-time automated detection system might be introduced for patients' discomfort, in which an overhead camera would be accompanied by LEDs installed in the nursing staff room and the medical superintendent's room. The system might generate an alarm in case of the detection of discomfort. This would help the patients immediately receive the attention of the staff on duty.

    Funding Statement: This research was funded by the Deanship of Scientific Research at Princess Nourah bint Abdulrahman University through the Fast-track Research Funding Program.

    Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
