
    Proposal of Initiative Service Model for Service Robot

    Man Hao, Weihua Cao, Min Wu, Zhentao Liu, Jinhua She, Luefeng Chen, and Ri Zhang

    1 Introduction

    Service robots are an important part of the robot industry and a key entry point for improving the way people live. As service robots step into all aspects of people's daily lives, e.g., guest service [1], family service [2], care for the elderly and disabled [3-5], and medical rehabilitation [6], simple and mechanical service can no longer meet people's needs. It is hoped that robots can be endowed with human-like capabilities of observation, interpretation, and even emotion expression [7], i.e., that robots can interact with people emotionally, including reacting appropriately to human expectations and behavior. Hence, research on intelligent service has attracted increasing attention.

    In recent years, there has been much research on service robots. The ambient assisted living service system based on the SIFT algorithm proposed in [8] can locate objects for elderly and disabled users who have service demands. A service architecture has been proposed to solve the problem of heterogeneity in multi-robot cooperation [9]. The mobile robot in [10] is able to perceive the surrounding environment and interact with people; however, its capabilities of cognition and of understanding the user's intention are still insufficient to meet human requirements. The Sacario service robot system used in hotel service can realize human-robot interaction, including speech recognition, face location, and gesture recognition [11]. This system, however, lacks the perception of emotion in speech recognition, so it cannot correctly understand the intention of users. Robot behavior adaptation and human-robot interaction have also been studied in [12, 13]. Although these robots are able to execute corresponding service tasks through the instructions mentioned above, the service pattern is passive, which prevents robots from integrating into people's lives. If a robot can serve users initiatively, i.e., provide initiative service for humans, a natural and harmonious human-robot interaction (HRI) could become a reality.

    To realize initiative service in HRI, human communication information is necessary, which is divided into two parts: surface communication information and deep cognitive information [14, 15]. Surface communication information refers to information that can be obtained directly by sensors or in combination with a knowledge base, e.g., speech, facial expression, and gesture. Deep cognitive information relies on human cognitive and inference abilities to understand the intricate physiological or psychological changes in the human mind, e.g., emotional states and personal intention. Most of the service robots mentioned above serve people based on surface communication information: they execute tasks mechanically according to instructions, without understanding the thoughts and feelings of humans (i.e., deep cognitive information). Deep cognitive information reflects people's inner thinking and is an important basis for initiative service if it is used well.

    An initiative service model for service robots is proposed to solve the above problems, in which both surface communication information and deep cognitive information are taken into account. The initiative service model is composed of three layers. From bottom to top, the first is the information acquisition layer, in which surface communication information (e.g., speech, expression, and gesture) and other influencing factors (i.e., personal information and environmental information) are collected by sensors. The second is the decision layer, where deep cognitive information (i.e., emotional state and personal intention) is obtained through emotion recognition and intention understanding; furthermore, a demand analysis model is established to obtain a fuzzy relationship between deep cognitive information and users' demands, from which users' demands are derived. The last is the initiative service execution layer, in which task execution and emotion adaption are accomplished according to users' demands. To validate the proposal, drinking service experiments are performed, in which 3 men and 2 women aged 20-25 are invited as experimenters. The drinking time and water quantity of the experimenters under initiative service and passive service are recorded respectively. The experimental results validate that initiative service is timely and reasonable.

    The remainder of this paper is organized as follows. The architecture of the initiative service model is proposed in Section 2. The initiative service method is introduced in Section 3. Drinking service experiments using the initiative service model are given in Section 4, and conclusions are drawn in Section 5.

    2 Initiative Service Architecture

    In recent years, the service pattern of service robots has mainly been the passive pattern, in which the robot carries out tasks only when an instruction is received. This pattern is too mechanized to meet humanized demands. To solve this problem, an initiative service model is proposed to provide humanized and real-time service.

    2.1 Introduction of passive service and initiative service

    For passive service, service robots need to receive commands (e.g., semantic instructions) from users and then perform the corresponding tasks. For instance, the user explicitly tells the robot that he/she wants to drink water, and then the robot brings him/her water. The schematic of passive service is shown in Fig.1.

    Fig.1 Schematic of passive service.

    For initiative service, emotional states and personal intention are obtained by analyzing surface communication information (e.g., physiological signals and speech), and are used to estimate the user's demand. According to this demand, the robot provides initiative service, including the corresponding task execution and emotion adaption. For example, the robot infers the user's drinking demand from physical condition information and environmental information when the user is short of water, thereby providing water to the user initiatively and generating an emotion adapted to the emotional state of the user. The schematic of initiative service is shown in Fig.2.

    Initiative service is the spontaneous execution of the robot via a series of decisions, without waiting for users' instructions, which makes the service more humanized and effective.

    Fig.2 Schematic of initiative service.

    2.2 Architecture of initiative service

    Users' demands are essential to implement the initiative service of robots. Usually, emotional states and personal intention are closely related to users' demands; therefore, emotion recognition and personal intention understanding are indispensable for initiative service. They can be obtained from surface communication information (e.g., facial expression and speech), personal information, and environmental information. Based on the above analysis, a three-layer framework for the initiative service model is proposed, composed of the information acquisition layer, the decision layer, and the initiative service execution layer, as shown in Fig.3.

    Fig.3 Initiative service architecture.

    In the decision layer, the emotional state of the user is obtained through the identification of multi-modal information, and personal intention is inferred based on emotional states, personal information, and environmental information. In the demand analysis model, machine learning, fuzzy inference, etc., can be used to obtain the relationship between deep cognitive information (i.e., personal intention and emotional states) and users' demands.

    In the initiative service execution layer, there are two kinds of services: task execution, in which the appropriate task is executed according to the user's demands, and emotion adaption, in which an appropriate emotion is generated to adapt to the user's emotional state.

    3 Demand Analysis Based on Deep Cognitive Information

    The premise of realizing initiative service is to obtain the user's demand, which is related to the user's emotional state and personal intention. First, the emotional state is obtained by emotion recognition, including speech emotion recognition, facial expression recognition, etc. Second, personal intention is inferred by intention understanding. A demand analysis model is then established, in which fuzzy inference is adopted to acquire the user's demand.

    3.1 Multimodal emotion recognition

    Human emotion is usually divided into seven basic categories, i.e., neutral, happiness, anger, surprise, fear, disgust, and sadness [16]. In communication between humans and robots, surface communication information (e.g., speech and facial expressions) can reflect the human emotional state, which is obtained by emotion recognition.

    3.1.1 Speech emotion recognition

    As a fast and easily understood means of communication, the speech signal is considered for identifying emotions [17]. It is believed that speech conveys not only the syntactic and semantic content of linguistic sentences but also the emotional state of the speaker [18]. The speech signal is collected by audio sensors and preprocessed. First, emotion features (e.g., fundamental frequency, Mel-frequency cepstral coefficients, and other statistical features) are extracted according to our previous work [19]. After the features are extracted from the preprocessed speech samples, correlation analysis is used to reduce feature redundancy and the dimension of the selected feature set. Next, the selected features are fed into a pre-trained extreme learning machine (ELM) model, by which the speech emotion is recognized.
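    To make the classification step concrete, the following is a minimal sketch of an ELM-style classifier operating on an already-extracted feature matrix. The hidden-layer size, ridge-regularized readout, and the hypothetical variables X_train, y_train, and X_test are illustrative assumptions, not the exact implementation used in this work.

        import numpy as np

        # Minimal ELM-style classifier sketch: a fixed random hidden layer followed
        # by a regularized least-squares readout. Feature extraction (fundamental
        # frequency, MFCC statistics, etc.) is assumed to be done beforehand.
        class SimpleELM:
            def __init__(self, n_hidden=200, reg=1e-3, seed=0):
                self.n_hidden = n_hidden
                self.reg = reg                      # ridge regularization of the readout
                self.rng = np.random.default_rng(seed)

            def _hidden(self, X):
                # Random projection + sigmoid activation (weights fixed after fit).
                return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

            def fit(self, X, y, n_classes):
                self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
                self.b = self.rng.normal(size=self.n_hidden)
                H = self._hidden(X)
                T = np.eye(n_classes)[y]            # one-hot targets
                # Regularized least squares for the output weights.
                self.beta = np.linalg.solve(H.T @ H + self.reg * np.eye(self.n_hidden),
                                            H.T @ T)
                return self

            def predict(self, X):
                return np.argmax(self._hidden(X) @ self.beta, axis=1)

        # Hypothetical usage: X_train holds selected speech-emotion features, y_train
        # holds labels for the seven basic emotions (0 = neutral, ..., 6 = sadness).
        # clf = SimpleELM().fit(X_train, y_train, n_classes=7)
        # emotions = clf.predict(X_test)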

    3.1.2 Facial expression recognition

    Facial expression is the most expressive way to convey emotions, contributing 55% to the effect of the speaker's message [20]. The user's emotional state can be estimated through facial expression recognition. The procedure of facial expression recognition mainly consists of three steps [21]: (1) face detection and segmentation, in which Adaboost learning is adopted to detect the face region and Haar classifiers are used to further detect and segment salient facial patches closely related to facial expressions; (2) feature extraction, in which 2D Gabor and LBP features are combined to extract discriminative facial expression features, and principal component analysis (PCA) is used for dimensionality reduction; (3) facial expression classification, in which an ELM is adopted to classify facial expressions using the extracted features.
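    A compressed sketch of this pipeline is given below, using OpenCV's Haar cascade for face detection, LBP histogram features, and PCA. Gabor filtering and the salient-patch segmentation step are omitted, and a logistic-regression classifier stands in for the ELM, so the code only illustrates the flow of the stages rather than the authors' implementation.

        import cv2
        import numpy as np
        from skimage.feature import local_binary_pattern
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Step 1: face detection with a Haar cascade shipped with OpenCV.
        face_cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        def extract_face_lbp(gray_image, size=(96, 96)):
            """Detect the largest face and return a uniform-LBP histogram feature."""
            faces = face_cascade.detectMultiScale(gray_image, 1.3, 5)
            if len(faces) == 0:
                return None
            x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
            face = cv2.resize(gray_image[y:y + h, x:x + w], size)
            # Step 2: LBP texture features (2D Gabor features would be concatenated here).
            lbp = local_binary_pattern(face, P=8, R=1, method="uniform")
            hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
            return hist

        # Step 3: dimensionality reduction + classification (stand-in for PCA + ELM).
        model = make_pipeline(PCA(n_components=8), LogisticRegression(max_iter=1000))
        # model.fit(np.vstack(train_features), train_labels)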

    3.1.3 Fusion of multi-modal emotion recognition

    To improve the accuracy of the recognition results, multi-modal emotion information is taken into account. There are two fusion levels for multi-modal emotion recognition [21]. The first is feature-level fusion, where the features from all modalities are concatenated into one feature vector, for which an ELM classifier is designed to obtain the final emotion result. The other is decision-level fusion, where a Naive Bayes classifier is designed for every modality. The probabilities of each basic emotion estimated by the individual modalities are multiplied to obtain the final probability, and by this product rule the emotion with the biggest probability is chosen as the final decision result of the multi-modal fusion.
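    The decision-level product rule can be written down directly. The sketch below assumes each per-modality classifier exposes class probabilities over the seven basic emotions; the two probability vectors are hypothetical example outputs.

        import numpy as np

        EMOTIONS = ["neutral", "happiness", "anger", "surprise", "fear", "disgust", "sadness"]

        def product_rule_fusion(prob_per_modality):
            """Multiply per-modality class probabilities and pick the largest product."""
            fused = np.prod(np.vstack(prob_per_modality), axis=0)
            fused /= fused.sum()                    # renormalize for readability
            return EMOTIONS[int(np.argmax(fused))], fused

        # Hypothetical outputs of the speech and facial-expression classifiers:
        p_speech = np.array([0.10, 0.55, 0.05, 0.10, 0.05, 0.05, 0.10])
        p_face   = np.array([0.20, 0.40, 0.05, 0.15, 0.05, 0.05, 0.10])
        label, fused = product_rule_fusion([p_speech, p_face])   # -> "happiness"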

    3.1.4 Emotional state represented in a 3-D emotion space

    A 3-D emotion space named the Affinity-Pleasure-Arousal (APA) space is employed to represent the human emotional state [22], in which the emotional state and its dynamic changes can be presented intuitively, as shown in Fig.4. The three axes of the emotion space are "Affinity", "Pleasure-Displeasure", and "Arousal-Sleep": "Affinity" relates to the mutual relationship among individuals, "Pleasure-Displeasure" relates to favor, and "Arousal-Sleep" relates to liveliness in communication. Moreover, by using the APA emotion space, it is possible to represent not only fixed mental states but also rapid variations in emotional state.
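    For illustration only, a recognized emotion label can be placed in the APA space as a coordinate triple. The coordinates below are hypothetical placeholders, not values taken from [22] or from this work.

        # Hypothetical (affinity, pleasure, arousal) coordinates in [-1, 1]^3;
        # the actual placement in the APA space is a design choice.
        APA_COORDS = {
            "happiness": ( 0.8,  0.9,  0.6),
            "neutral":   ( 0.0,  0.0,  0.0),
            "sadness":   (-0.3, -0.7, -0.5),
            "anger":     (-0.8, -0.8,  0.8),
            "surprise":  ( 0.2,  0.3,  0.9),
            "fear":      (-0.6, -0.6,  0.7),
            "disgust":   (-0.7, -0.5,  0.3),
        }

        def to_apa(emotion_label):
            """Map a recognized basic emotion to a point in Affinity-Pleasure-Arousal space."""
            return APA_COORDS[emotion_label]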

    Fig.4 Affinity-Pleasure-Arousal emotion space.

    3.2 Personal intention understanding

    Intention understanding is mainly based on emotion, and intention is inferred by collecting and analyzing multimodal information (e.g., speech, expression, and gesture). In addition, personal intention can also be affected by personal information and the surrounding environment. Personal information (e.g., gender and exercise) is obtained through sensor detection and the knowledge base, and the surrounding environmental information (e.g., temperature, humidity, etc.) is obtained directly by sensors.

    An intention understanding model based on two-layer fuzzy support vector regression, adopted from our previous work [14], is used to comprehend human inner thoughts. It includes two parts, i.e., a local learning layer and a global learning layer. In the local learning layer, fuzzy C-means (FCM) clustering is used to classify the input data, and the training data set is split into several subsets using the FCM algorithm, from which the centers and spread widths are obtained. After the local learning, the global output of the two-layer fuzzy support vector regression is computed by a fusion model using the fuzzy weighted average algorithm.
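    The two-layer structure can be sketched as follows: a hand-rolled fuzzy C-means step assigns membership degrees, one local SVR is trained per cluster (here via sample weighting rather than hard subset splitting), and the global output is the membership-weighted average of the local predictions. The FCM implementation, kernel choice, and hyperparameters are simplified assumptions, not the exact model of [14].

        import numpy as np
        from sklearn.svm import SVR

        def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
            """Minimal fuzzy C-means: returns cluster centers and membership matrix U (n x c)."""
            rng = np.random.default_rng(seed)
            U = rng.dirichlet(np.ones(c), size=len(X))          # random initial memberships
            for _ in range(iters):
                W = U ** m
                centers = (W.T @ X) / W.sum(axis=0)[:, None]    # fuzzily weighted centers
                d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
                a = d ** (2.0 / (m - 1.0))
                U = (1.0 / a) / (1.0 / a).sum(axis=1, keepdims=True)
            return centers, U

        class TwoLayerFuzzySVR:
            """Local SVRs per fuzzy cluster, fused by membership-weighted averaging."""
            def fit(self, X, y, c=3):
                self.centers, U = fuzzy_cmeans(X, c=c)
                self.models = [SVR(kernel="rbf").fit(X, y, sample_weight=U[:, k])
                               for k in range(c)]
                return self

            def predict(self, X):
                d = np.linalg.norm(X[:, None, :] - self.centers[None], axis=2) + 1e-12
                U = (1.0 / d) / (1.0 / d).sum(axis=1, keepdims=True)  # memberships of new samples
                local = np.column_stack([mdl.predict(X) for mdl in self.models])
                return (U * local).sum(axis=1)                        # fuzzy weighted average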

    3.3 Task execution and emotion adaption

    It is difficult to describe users' demands with an accurate mathematical model; they are better captured by an empirical model. Therefore, fuzzy production rules are adopted in this paper. First, the personal intention and emotional state of the human are fuzzified. Second, the relationship between deep cognitive information (i.e., emotional states and personal intentions) and the user's demands is established, and the user's demands are obtained by fuzzy inference. Third, the robot's tasks and emotion are defuzzified, and the corresponding task execution and emotion adaption are carried out. The model of the fuzzy production rules is shown in Fig.5.

    Fig.5 Fuzzy production rules model.

    3.3.1 Fuzzification

    There are many kinds of user intention when a robot serves people. When the intention is estimated, the intention level (IL) of the user's desire is divided into four levels, i.e., urgent (U), normal (N), slight (S), and needless (NL).

    The human emotional state is usually divided into seven categories, i.e., happiness, surprise, fear, neutral, disgust, sadness, and anger. For each category, the emotional state level (EL) can be divided into five levels, i.e., very strong (VS), strong (S), normal (N), weak (W), and very weak (VW).

    The levels of the robot's tasks and emotional states are fuzzified as IL (i.e., U, N, S, NL) and EL (i.e., VS, S, N, W, VW), respectively, which are the same as the human levels in order to achieve smooth communication between humans and robots.

    3.3.2 Fuzzy inference and defuzzification

    The intention and emotional state of the robot can be obtained by using the IF-THEN rule shown in Eq. (1) on the basis of fuzzification.

    If Humans = {IL and EL}
    Then Robots = {IL and EL}

    (1)

    The initiative service of robots includes two kinds of service, i.e., task execution and emotion adaption.

    For task execution, the robot generates the corresponding intention to perform a task when it receives the user's demand, and the corresponding task execution (TE) follows the user's desired level of demand; the rule is expressed in Eq. (2). For instance, in the drinking service, task execution can be expressed as providing a large amount of water, a normal amount, a small amount, or none.

    If Robot = {IL}
    Then TE

    (2)

    For emotion adaption, the three-dimensional emotion space is used to generate the robot's emotion so as to maintain consistency between the robot and the human: the robot generates its own emotional state to adapt to the emotional state of the user. The emotion adaption (EA) of the robot is expressed through the screen, the mechanical arm, and other actuators, in which each emotional state is divided into very strong (VS), strong (S), normal (N), weak (W), and very weak (VW). The rule is expressed in Eq. (3).

    If Robot = {EL}
    Then EA

    (3)

    4 Experiments on Initiative Service Model for Drinking Service

    In daily life, there is a significant relationship between people's drinking habits and the health of the body. In this paper, experiments on a drinking service are performed to validate the proposal. In the drinking service scenario, how often and how much people should drink is an important problem. The goal of the initiative service model is to ensure that people drink water at the right time and replenish enough water.

    4.1 Drinking water demand analysis

    People often forget to drink water because of tedious daily work, and they only drink when they feel thirsty. However, by the time they feel thirsty, the body already lacks water; people lose fluid so rapidly that the brain cannot respond in time. Therefore, it is hoped that the robot can provide water initiatively according to the demands of users to ensure their health.

    In addition to drinking time, water quantity is another important aspect of drinking habits. Drinking too much water may lead to a condition known as hyponatremia, in which excess water floods the body's cells and causes them to swell. Conversely, lack of water (dehydration) reduces the amount of blood in the body, forcing the heart to pump harder in order to deliver oxygen-bearing cells to the muscles. According to a survey, the daily drinking water requirement of an adult is 1500-2000 mL, and the quantity of each intake is approximately 100-400 mL. Furthermore, factors such as speech, physiological signals, temperature, humidity, and gender affect people's drinking water requirements [23].

    Usually, talking makes a person short of water, and people ordinarily want to drink after speaking continuously for more than 20 minutes. It is therefore a hint that the user needs to drink when he/she speaks a lot or the voice becomes hoarse. The length of speech can be obtained with recording equipment and a timer. First, a fixed threshold is set and two sound acquisition units are set up. The recording equipment starts recording when the intensity ratio of the two sound signals is larger than the threshold, and stops when the ratio falls below it. Meanwhile, the length of speech is measured by the timer during recording. In addition, the frequency of the voice decreases when it becomes hoarse, so a frequency spectrum analysis of the collected speech is performed, in which the change of frequency is used to determine the change of tone.
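    The thresholded two-channel scheme can be sketched as follows; the frame length, threshold value, and channel layout are assumptions made only for illustration.

        import numpy as np

        def speaking_time(ch_near, ch_far, fs, frame_ms=100, ratio_threshold=2.0):
            """Accumulate speaking time (s) for the user near microphone `ch_near`.

            A frame counts as speech when the short-time energy of the near channel
            exceeds `ratio_threshold` times that of the far (reference) channel.
            """
            frame = int(fs * frame_ms / 1000)
            n_frames = min(len(ch_near), len(ch_far)) // frame
            total = 0.0
            for i in range(n_frames):
                sl = slice(i * frame, (i + 1) * frame)
                e_near = np.sum(ch_near[sl].astype(float) ** 2) + 1e-12
                e_far = np.sum(ch_far[sl].astype(float) ** 2) + 1e-12
                if e_near / e_far > ratio_threshold:     # intensity ratio above threshold
                    total += frame_ms / 1000.0
            return total

        # A drinking hint could then be raised once speaking_time(...) exceeds roughly
        # 20 minutes (1200 s) of accumulated speech.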

    The body consumes a lot of water after exercise; therefore, exercise is an important factor affecting people's drinking water demand. Common measures of exercise intensity are oxygen intake and heart rate. Heart rate is easy to measure with a small instrument, whereas the measurement of oxygen intake requires expertise and a large apparatus [24]. Thus, heart rate, obtained by a sphygmomanometer, is adopted to calculate the user's exercise intensity in this paper.
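    One common way to turn heart rate into an exercise-intensity estimate is a Karvonen-style heart-rate-reserve ratio, sketched below; reference [24] discusses which maximum-heart-rate formula to combine with it, and the 220 − age estimate used here is only an example assumption.

        def exercise_intensity(hr_current, hr_rest, age):
            """Estimate exercise intensity as a fraction of heart-rate reserve (Karvonen-style).

            hr_max is approximated with the common 220 - age formula; [24] compares
            several alternative formulas that could be substituted here.
            """
            hr_max = 220 - age
            return max(0.0, min(1.0, (hr_current - hr_rest) / (hr_max - hr_rest)))

        # e.g., exercise_intensity(150, 65, 23) ~= 0.64, i.e., strong exercise, so a
        # large amount of water would be offered after the activity ends.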

    In addition, environmental factors also affect people's drinking water demand. For example, the demand for water increases when the temperature rises or the humidity decreases: the daily water requirement for a given energy expenditure in a temperate climate (20 ℃) can triple in very hot weather (40 ℃), as shown in Fig.6 [25]. Besides air temperature, other environmental factors also modify sweat losses, including relative humidity, air motion, solar load, and the choice of clothing. Personal information (e.g., gender and age) affects drinking demands as well; a study from Maastricht University in the Netherlands found that women lose more water during exercise than men [25].

    4.2 Experiments on drinking service

    Experiments are performed using the multi-modal emotional communication based human-robot interaction (MEC-HRI) system [21], which consists of 3 NAO robots, 2 mobile robots, a Kinect, an eye tracker, 2 high-performance computers (i.e., a server and a workstation), and other wearable sensing devices as well as data transmission and network-connection devices. The topology of the MEC-HRI system is shown in Fig.7. We selected 3 men and 2 women aged 20-25 as experimenters. In the experiment, foundational data such as humidity, temperature, and physiological data are collected under our supervision. If a serious error occurs in acquiring information such as the physical condition or environment, we intervene in the data collection until it returns to normal; thus, the reliability of the collected data is guaranteed by our supervision. The scenario of the drinking service is shown in Fig.8.

    Fig.6 Relationship between temperature and drinking need[25].

    Fig.7 The topology structure of MEC-HRI system.

    Fig.8 Scenario of “drinking service”.

    People often forget to drink water because of busy work and may only drink when they feel thirsty. However, by the time people feel thirsty, the body may already have lacked water for a long time. To solve this problem and ensure human health, the factors that influence drinking water demand are first selected after in-depth analysis, including physical activity, physical condition, and environment. Second, drinking time and water quantity are inferred from the correlation between these factors and water demand. Thus, the robot can provide water to the user in a timely and moderate way even if the user is unaware of the lack of water or forgets to drink. Consequently, initiative service can ensure a timely and reasonable drinking service.

    User information (e.g., speech, physiological signals, and gender) and environmental information (e.g., temperature and humidity) are captured by the underlying hardware, including the Kinect, eye tracker, microphone, and other wearable devices. Then the emotional state and personal intention are obtained by emotion recognition and intention understanding, respectively, and the user's water demand is obtained by the demand analysis model.

    When the intention of drinking is estimated, the intention level is divided into urgent (U), normal (N), slight (S), and needless (NL), and the human emotional state level is divided into very strong (VS), strong (S), normal (N), weak (W), and very weak (VW). According to Section 3.3, the water quantity provided by the robot is divided into four corresponding levels, i.e., large amount (400 mL), normal (300 mL), small amount (100 mL), and none (0 mL). The emotional state level of the robot is likewise divided into VS, S, N, W, and VW.
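    For the drinking scenario specifically, the intention-level-to-quantity mapping stated above can be encoded directly. This is a minimal sketch: the quantities come from the text, while the mirrored emotion level is an illustrative assumption.

        # Water quantity (mL) served for each drinking-intention level.
        WATER_QUANTITY = {"U": 400, "N": 300, "S": 100, "NL": 0}

        def drinking_service(intention_level, emotion_level):
            """Return the amount of water to serve and the robot's adapted emotion level."""
            return WATER_QUANTITY[intention_level], emotion_level

        # e.g., drinking_service("U", "S") -> (400, "S")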

    4.3 Experimental results and discussion

    The experimental results are recorded, including the activities of the 5 experimenters and the drinking time and water quantity at each time, as shown in Table 1. The drinking times at which the robot serves the user initiatively are shown as the solid line in Fig.9a, while the times at which the experimenter feels thirsty and orders the robot to provide water are shown as the dotted line in Fig.9a. The water quantities supplied initiatively to the user each time are shown as the solid line in Fig.9b, and the water quantities that the experimenter orders the robot to provide are shown as the dotted line in Fig.9b.

    In the experiments, two experimenters began to discuss at 9:40; after the robot recorded that the discussion had lasted 20 minutes, it rendered a drinking service at 10:00. However, the experimenters were busy discussing and did not drink until they felt thirsty at 10:30. The temperature rose after 11:00, and the robot provided a normal amount of water once the temperature reached 30 ℃. Air conditioning is used in our laboratory because of the summer heat, which makes the air dry, so the robot provided water to the experimenters at 12:50 and 14:00, whereas the experimenters actually felt thirsty and drank a lot of water at 12:00 and 14:00. The experimenters then ran for 30 minutes during 19:00-19:30; their heart rates were transmitted to the robot, from which it judged that they had exercised strenuously, and it served them a large amount of water at 19:30, since their bodies had lost a lot of water through sweat.

    Fig.9 Experimental results of drinking service.

    Table 1 The activities of experimenters.

    From the experimental results, the drinking time under passive service lags behind that under initiative service by 30 minutes on average; initiative service supplies water 2 more times than passive service on average; and under initiative service, the water quantity each time is decided according to the water lost by the body, which is more reasonable.

    It can be seen that the drinking times of initiative service and passive service coincide when the body's water loss is serious; for example, the drinking time and water quantity provided by initiative service after exercise are the same as those of passive service. However, when the body loses only a small amount of water, the experimenters do not notice it and do not drink until they feel thirsty, which leads to belated water supply under passive service.

    5 Conclusion

    An initiative service model for service robots is proposed. First, multi-modal information (e.g., speech, expression, and gesture) is collected by the underlying hardware (e.g., Kinect, eye tracker, and microphone), and deep cognitive information (i.e., emotional state and personal intention) is acquired through emotion recognition and intention understanding. The user's demands are then obtained by a demand analysis model using the deep cognitive information. Finally, the robot executes the corresponding task and realizes emotion adaptation. Using this model, robots can understand people's actual demands in an initiative and humanized way and provide the corresponding service correctly.

    Drinking service experiments are performed with 3 men and 2 women aged from 20 to 25. With initiative service, the robot can render the drinking service for the user in a timely manner. The experimental results show that the drinking time under passive service lags behind that under initiative service by 30 minutes on average, and that initiative service supplies water 2 more times than passive service on average, which avoids the body being short of water. In addition, with initiative service the water quantity each time is more reasonable, which ensures that the user can replenish enough water.

    Initiative service for service robots is a new concept that promotes the realization of intelligence and humanization. The initiative service model can be applied to many scenarios in daily life, e.g., family service, care for the elderly, and medical rehabilitation.

    Acknowledgement

    This work was supported by the National Natural Science Foundation of China under Grants 61403422 and 61273102, the Hubei Provincial Natural Science Foundation of China under Grant 2015CFA010, the Wuhan Science and Technology Project under Grant 2017010201010133, the 111 project under Grant B17040, and the Fundamental Research Funds for National University, China University of Geosciences (Wuhan).

    [1] A. D. Nuovo, G. Acampora, and M. Schlesinger, Guest editorial: Cognitive agents and robots for human-centered systems, IEEE Transactions on Cognitive & Developmental Systems, vol. 9, no. 1, pp. 1-4, 2017.

    [2] F. Zhou, G. Wang, G. Tian, Y. Li, and X. Yuan, A fast navigation method for service robots in the family environment, Journal of Information & Computational Science, vol. 9, no. 12, pp. 3501-3508, 2012.

    [3] S. Bedaf, G. J. Gelderblom, D. S. Syrdal, and H. Michel, Which activities threaten independent living of elderly when becoming problematic: Inspiration for meaningful service robot functionality, Disability & Rehabilitation: Assistive Technology, vol. 9, no. 6, pp. 445-452, 2014.

    [4] J. Jiang and D. Tu, Human-robot collaboration navigation of service robots for the elderly and disabled in an intelligent space, CAAI Transactions on Intelligent Systems, vol. 9, no. 5, pp. 560-568, 2014.

    [5] S. Xu, D. Tu, Y. He, S. Tan, and M. Fang, ACT-R-typed human-robot collaboration mechanism for elderly and disabled assistance, Robotica, vol. 32, no. 5, pp. 711-721, 2014.

    [6] X. Cheng, Z. Zhou, C. Zuo, and X. Fan, Design of an upper limb rehabilitation robot based on medical theory, Procedia Engineering, vol. 15, no. 1, pp. 688-692, 2011.

    [7] S. Iengo, A. Origlia, M. Staffa, and A. Finzi, Attentional and emotional regulation in human-robot interaction, in Proceedings of the 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, France, 2012, pp. 1135-1140.

    [8] N. Hendrich, H. Bistry, and J. Zhang, Architecture and software design for a service robot in an elderly-care scenario, Engineering, vol. 1, no. 1, pp. 27-35, 2015.

    [9] Y. Cai, Z. Tang, Y. Ding, and B. Qian, Theory and application of multi-robot service-oriented architecture, IEEE/CAA Journal of Automatica Sinica, vol. 3, no. 1, pp. 15-25, 2016.

    [10] A. Hermann, J. Sun, Z. Xue, S. W. Ruehl, J. Oberlaender, A. Roennau, J. M. Zoellner, and R. Dillmann, Hardware and software architecture of the bimanual mobile manipulation robot HoLLiE and its actuated upper body, in Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, San Diego, 2013, pp. 286-292.

    [11] R. Pinillos, S. Marcos, R. Feliz, E. Zalama, and J. Gómez-García-Bermejo, Long-term assessment of a service robot in a hotel environment, Robotics & Autonomous Systems, vol. 79, pp. 40-57, 2016.

    [12] L. F. Chen, Z. T. Liu, M. Wu, F. Y. Dong, and K. Hirota, Multi-robot behavior adaptation to humans' intention in human-robot interaction using information-driven fuzzy friend-Q learning, Journal of Advanced Computational Intelligence & Intelligent Informatics, vol. 19, pp. 173-184, 2015.

    [13] X. Wang and C. Yuan, Recent advances on human-computer dialogues, CAAI Transactions on Intelligence Technology, vol. 1, no. 4, pp. 303-312, 2016.

    [14] L. F. Chen, Z. T. Liu, M. Wu, M. Ding, F. Y. Dong, and K. Hirota, Emotion-age-gender-nationality based intention understanding in human-robot interaction using two-layer fuzzy support vector regression, International Journal of Social Robotics, vol. 7, no. 5, pp. 709-729, 2015.

    [15] B. Schuller, A. Batliner, S. Steidl, and D. Seppi, Recognising realistic emotions and affect in speech: State of the art and lessons learnt from the first challenge, Speech Communication, vol. 53, no. 9, pp. 1062-1087, 2011.

    [16] G. Sandbach, S. Zafeiriou, M. Pantic, and L. Yin, Static and dynamic 3D facial expression recognition: A comprehensive survey, Image & Vision Computing, vol. 30, no. 10, pp. 683-697, 2012.

    [17] M. El Ayadi, M. S. Kamel, and F. Karray, Survey on speech emotion recognition: Features, classification schemes, and databases, Pattern Recognition, vol. 44, no. 3, pp. 572-587, 2011.

    [18] C. N. Anagnostopoulos, T. Iliou, and I. Giannoukos, Features and classifiers for emotion recognition from speech: A survey from 2000 to 2011, Artificial Intelligence Review, vol. 43, no. 2, pp. 155-177, 2015.

    [19] Z. T. Liu, K. Li, D. Y. Li, L. F. Chen, and G. Z. Tan, Emotional feature selection of speaker-independent speech based on correlation analysis and Fisher, in Proceedings of the 34th Chinese Control Conference, Hangzhou, China, 2015, pp. 3780-3784.

    [20] A. Mehrabian, Communication without words, in Communication Theory, pp. 193-200, 2008.

    [21] Z. T. Liu, F. F. Pan, M. Wu, W. H. Cao, L. F. Chen, J. P. Xu, R. Zhang, and M. T. Zhou, A multimodal emotional communication based humans-robots interaction system, in Proceedings of the 35th Chinese Control Conference, Chengdu, China, 2016, pp. 6363-6368.

    [22] K. Hirota, Fuzzy inference based mentality expression for eye robot in affinity pleasure-arousal space, Journal of Advanced Computational Intelligence and Intelligent Informatics, vol. 12, no. 3, pp. 304-313, 2008.

    [23] M. Hao, W. H. Cao, M. Wu, Z. T. Liu, L. F. Chen, and R. Zhang, An initiative service method based on intention understanding for drinking service robot, in Proceedings of IECON 2017, 2017.

    [24] J. She, H. Nakamura, and K. Makino, Selection of suitable maximum-heart-rate formulas for use with Karvonen formula to calculate exercise intensity, International Journal of Automation and Computing, vol. 12, no. 1, pp. 62-69, 2015.

    [25] M. N. Sawka, S. N. Cheuvront, and R. Carter III, Human water needs, Nutrition Reviews, vol. 63, no. s1, pp. S30-S39, 2005.
