
    Technology of Intelligent Driving Radar Perception Based on Driving Brain

    2017-10-10



    Hongbo Gao, Xinyu Zhang*, Jianhui Zhao, and Deyi Li

    Radar is a key sensor for intelligent driving environment perception, enabling the detection of static and dynamic obstacles and the tracking of dynamic obstacles. The models, quantities, and installation locations of radar sensors, as well as the information processing modules, differ across intelligent driving testing platforms, leading to different module counts and interfaces in the intelligent driving system. In this paper, we build a software architecture for intelligent driving vehicles based on the driving brain, which adapts to different types of radar sensors and uses a variable granularity road ownership radar diagram for radar information fusion. Under the condition of complete driving information, increasing or reducing the number of radar sensors, or changing the radar sensor model or installation location, does not directly affect the intelligent driving decision. The architecture thus allows multiple radar sensors to be adapted to different intelligent driving hardware testing platforms.

    driving brain; intelligent driving; radar perception

    1 Introduction

    As early as the 1950s, the United States began research on unmanned vehicles; in 1950, Barrett Electronics developed the world's first autonomous navigation vehicle[1]. US unmanned vehicle research originated at the US Defense Advanced Research Projects Agency (DARPA), one of the world's leaders in this domain. Since the mid-1980s, European countries have developed unmanned driving technology, regarding unmanned vehicles as independent individuals and mixing them into normal traffic flow[2]. In 1987, the PROMETHEUS plan (Program for a European Traffic of Highest Efficiency and Unprecedented Safety), jointly carried out by the Munich Federal Defense Force University, Daimler-Benz, BMW, Peugeot, Jaguar, and other well-known R&D institutions and automotive companies, exerted significant influence worldwide[3]. Since the 1990s, the Advanced Cruise-Assist Highway System Research Association (AHSRA), attached to the Japanese Department of Transport, has run an advanced safety vehicle (ASV) project every five years to develop unmanned driving technology[4]. Chinese unmanned driving research began in the late 1980s, supported by the national "863" plan and related research programs of the Chinese National Defense Science and Technology Commission[5]. Since 2008, with the support of the National Natural Science Foundation of China, China has held competitions addressing the future challenges of smart cars; the number of participating teams grows year by year, the difficulty of the competition increases, and teams complete it increasingly well. Car enterprises have gradually laid a solid foundation for introducing unmanned driving technology into domestic cars, helping unmanned driving technology progress rapidly[6].

    Intelligent driving cannot be separated from perception technology, which mainly senses the internal and external environmental information of the intelligent vehicle through sensors such as visual sensors, pose sensors, and radar sensors. Among them, the radar sensor is mainly used for static and dynamic obstacle detection, dynamic obstacle tracking, and environment mapping. There is substantial research on radar-based obstacle detection. By measuring the height information in front of the autonomous vehicle, the distribution of obstacles can be determined. For example, Yuan et al. proposed a pavement extraction algorithm for 64-line radar that can be applied to structured and semi-structured environments[7]. Wan et al. improved the pavement detection algorithm by using a plane model of the radar and the extended Kalman filter, and realized data classification for single-frame radar[8]. Moosmann et al. used a graph-based method to separate ground and obstacles in 3D radar data; experiments showed that it obtains good results even in curved urban environments[9]. Douillard et al. used both a mean height map and a minimum-maximum height map: the ground was segmented with the mean height map, and the maximum and minimum grid maps were then used to subdivide the obstacles. Using these Hybrid Elevation Maps (HEM), the ground can be extracted and obstacle information obtained[10]. Himmelsbach et al. represented radar data with a polar coordinate grid map, segmented the road through piecewise straight-line fitting, and clustered obstacles through two-dimensional connectivity analysis; experiments showed that this method achieves good obstacle detection[11]. Douillard et al. also used two-dimensional Gaussian process regression to fit the ground and extract obstacles directly on the whole map[10].

    Dynamic obstacle tracking based on radar is an important part of environment understanding; it mainly includes pedestrian tracking and vehicle tracking. For example, Lee et al. detected pedestrian positions by extracting the geometric characteristics of human legs, extracted the frequency and amplitude of leg motion through an inverted pendulum model, and finally used a Kalman filter for pedestrian tracking[12]. Brscic et al. clustered foreground data obtained from the radar to obtain the targets to be tracked and then used Kalman filtering for tracking[13]. Scholer et al. introduced a particle-filter observation model to identify partially or completely occluded pedestrians and to determine the number of pedestrians hidden behind obstacles[14]. Spinello et al. chose suitable hypotheses from the multiple hypotheses generated by a top-down classifier and a bottom-up detector, then used a multi-target, multi-hypothesis tracking method for pedestrian tracking[15]. Navarro-Serment et al. estimated the ground from three-dimensional radar data, computed SOD features from the height of each data point above the ground, and used a Kalman filter for pedestrian tracking[16]. Petrovskaya et al. used a vehicle observation model for importance sampling to evaluate candidate vehicles, and then applied a particle filter to track multiple vehicles[17].

    Radar-based three-dimensional reconstruction and SLAM research has also made a series of advances. For example, Wang et al. used extended-Kalman-filter SLAM, extracted linear features from raw radar data, and applied threshold filtering to the linear-feature observations using data fusion techniques. Taking the weighted Euler distance as the observation, they constructed a correlation matrix from the distances between observations and matched feature-line observations with predictions by minimizing the total distance[18]. Vivet and Pathak et al. summarized application examples of LIDAR in 3D reconstruction and SLAM[19,20]. Park et al. achieved UGV positioning by first aligning two frames of 64-line radar data and then fusing a digital surface model (DSM)[21]. Wu et al. proposed an adaptive threshold method based on integral graphics to compress redundant data while retaining three-dimensional object edge information for 3D scene reconstruction[22]. Lin et al. used worst-case-rejection ICP and feature-based ICP to reconstruct a teaching building and its surroundings, obtaining a precise environment model[23]. Kurniawati et al. used the original ICP algorithm on 64-line radar data without a positioning sensor and performed three-dimensional reconstruction of objects on the water surface[24].

    Radar-based environmental perception technology mainly focuses on road detection, static obstacle detection, and environment modeling. The intelligent vehicle's sensor configuration is the technical basis for perceiving the surrounding environment and obtaining its own state; it is also the most costly and most significant part of a test platform. Because the mechanical structures of vehicle platforms differ, sensor selection and installation locations also differ, and there is no unified scheme. Some research teams rely mainly on visual sensors for environmental perception, typically represented by the smart car team at the VisLab laboratory, University of Parma, Italy[25]. Other teams rely mainly on radar sensors, typically represented by Google's unmanned vehicles. These efforts mainly focus on improving the algorithms and technology themselves. What remains open is how to reduce the impact of the number, type, and installation location of sensors on driving decisions, so that the technical architecture can adapt to intelligent vehicle platforms with different sensor configurations and intelligent decision making is decoupled from sensor information[26]; in particular, under the condition of complete driving information, increasing or reducing the number of radar sensors, or changing the radar sensor model or installation location, should not directly affect the intelligent driving decision. This paper puts forward a technical architecture for intelligent vehicles with the driving brain as the core and achieves convenient transplantation and robustness of radar perception technology and its application.

    This paper describes the technology of radar perception and application using a technical structure based on the driving brain. The classification and application of radar sensors are introduced in the second part. The application of radar sensors in intelligent driving environment perception is introduced in the third part. The structure of driving-brain-based radar environment perception is introduced in the fourth part. The fifth part is the conclusion.

    2 The Classification and Application of Intelligent Vehicle Radar Sensor

    According to the detection principle, radars are divided into millimeter-wave radar and laser radar, and laser radar is further divided into single-line and multi-line types. Single-line laser radar has only one scan line and obtains the depth information of a single line by rotational scanning, such as the SICK series. Multi-line laser radar obtains the depth information of several lines by rotational scanning with several scan lines, such as the 4-line and 8-line laser radars of the IBEO series. Three-dimensional omnidirectional radar obtains spatial depth information by scanning a volume, such as Velodyne's 64-line laser radar.

    2.1 Millimeter-wave radar and application

    Millimeter-wave radar is commonly used in unmanned vehicles for obstacle detection, tracking, and collision warning. It mainly includes the ESR and SRR series, of which the ESR is a "multi-mode" radar integrating wide-viewing-angle mid-distance and narrow-viewing-angle long-distance detection, as shown in Fig. 1. Limited by its working mode, a general millimeter-wave radar can only detect a narrow area in front, so the sides of the radar and the areas to either side in front become blind zones; expanding the detection angle limits the detection distance. A single ESR radar provides both mid-distance wide-coverage and high-resolution long-distance detection modes. The mid-distance wide viewing angle can not only find vehicles cutting in from adjacent lanes but also identify vehicles and pedestrians crossing between large vehicles. The SRR series millimeter-wave radar provides accurate long-distance range and speed data. Table 1 lists the ESR radar parameters, and Fig. 1 shows the ESR radar and its measurement range.

    Table 1 The ESR radar parameters.

    Fig.1 The ESR radar map and measurement range.

    2.2 Single-line laser radar and application

    Single-line laser radar emits a single laser beam and has the advantages of simple structure, low power consumption, and ease of use. It is widely used in the detection of obstacles, road edges, roads, and tracks. Because there is only one scan line, its information content is limited: it cannot obtain the size, shape, or other properties of obstacles, nor resolve the occlusion between obstacles. Table 2 lists the SICK single-line laser radar parameters, and Fig. 2 shows the SICK single-line laser radar and its measurement range.

    Table 2 The SICK single-line laser radar parameters.

    2.3 Multi-line laser radar and application

    Multi-line radar can achieve all the functions of single-line radar, partially solve the problem of obstacles occluding each other, and provide obstacle height, though with poor accuracy. Table 3 lists the ibeo 4-line laser radar parameters. Fig. 3 shows the ibeo 4-line laser radar and its measurement range.

    Fig.2 The SICK single-line laser radar and measurement range.

    Table 3 The ibeo 4-line laser radar parameters.

    Fig.3 The ibeo 4-line laser radar and measurement range.

    2.4 Three-dimensional omnidirectional laser radar and application

    Traditional single-line or multi-line radar can only perform simple functions such as obstacle detection and cannot meet the demands of full environmental perception, such as dynamic obstacle tracking and map construction. To provide more comprehensive, richer, and more accurate environmental information for intelligent vehicles, 3D laser radar is used in intelligent vehicle research. The representative three-dimensional omnidirectional laser radar is the Velodyne 64-line laser radar. Table 4 lists the Velodyne 64-line laser radar parameters, and Fig. 4 shows the Velodyne 64-line laser radar and its measurement range.

    Table 4 The Velodyne 64-line laser radar parameters.

    3 The Application of Radar Sensor in Intelligent Driving Environment Perception

    Intelligent vehicles aim to achieve autonomous driving in unknown environments, so they must be able to obtain reliable external and internal information in real time under various conditions. In the development of intelligent driving technology, the radar sensor is one of the most widely applied for environmental perception. From the US DARPA Challenge to China's Intelligent Vehicle Future Challenge, almost every participating intelligent vehicle is equipped with radar sensors; the areas marked with red rectangles in Fig. 5 are intelligent vehicles using various types of radar sensors, including single-line laser radar, multi-line laser radar, and millimeter-wave radar.

    Fig.4 The Velodyne 64-line laser radar and measurement range.

    Fig.5 Intelligent vehicle equipped with radar equipment.

    4 The Structure of Radar Sensor and Environment Perception

    Intelligent vehicle environment perception covers road information, traffic sign information, pedestrian behavior information, surrounding vehicle information, etc. This information is acquired by a variety of heterogeneous on-board sensors: cameras, radars (laser radar, millimeter-wave radar, ultrasonic radar, infrared radar, etc.), GPS receivers, inertial navigation, and so on. When a human driver performs a complex driving operation such as changing lanes, he or she focuses on different areas to ensure lane safety. Radar sensors of various types can similarly ensure that heterogeneous sensors extract real-time information accurately. Drawing on the features of human recognition, we propose a driving-brain-based technology structure for intelligent vehicles, as shown in Fig. 6. Sensors on different intelligent driving testing platforms have different models, quantities, and installation locations, and the information processing modules also differ, as shown in Fig. 7. The granularity of information provided by different driving maps has no fixed standard, so the number and interfaces of software modules in the unmanned driving system differ as well. Taking the driving brain as the core and formalizing driving cognition, we can design a general unmanned driving software structure using the formal language of driving cognition. In this architecture, the intelligent decision modules are not coupled directly with sensor information; a comprehensive driving situation is formed from sensor information and prior map information to complete intelligent decisions, and perception information is acquired through the variable granularity right-of-way radar diagram[27].

    Fig.6 Driving brain technology architecture.

    Fig.7 The hardware platform of intelligent vehicle.

    Sensor information is acquired by equipment such as cameras, laser radar, millimeter-wave radar, and a combined positioning system. Radar sensors perform static and dynamic obstacle identification by actively emitting electromagnetic waves and obtaining point cloud information through echo analysis; they are only slightly affected by lighting changes and are highly robust. According to the wavelength of the electromagnetic waves, radar sensors can be divided into laser radar, millimeter-wave radar, infrared radar, ultrasonic radar, and other types. Laser radar, which has the highest resolution among them, is the most commonly used on intelligent driving testing platforms. High-resolution radar can not only detect static and dynamic obstacles but also use reflection intensity to identify lane markings and ground traffic signs. The high-precision point cloud data provided by laser radar can also be used for simultaneous localization and mapping[28], building an environment point cloud model while acquiring the precise position of the vehicle. However, laser radar is expensive and is strongly affected by weather conditions and other factors. Millimeter-wave radar is also widely used on intelligent driving testing platforms; it can accurately detect the distance and speed of obstacles, its cost is low, and it is only slightly affected by weather conditions, making it a useful supplement to laser radar.

    4.1 The establishment of the variable granularity right-of-way radar diagram

    The right of way means the road space and time needed for safe vehicle driving, recorded as Row(t). The road ownership radar diagram detects and displays the current right-of-way status of the vehicle in the form of a radar diagram, as shown in Fig.8.

    Fig.8 The right-of-way radar map.

    The white car in Fig.8 indicates the intelligent vehicle and the yellow cars indicate other vehicles. The red part of the road ownership radar diagram indicates the right of way of the intelligent vehicle in its current driving state. The shaded part indicates right of way that cannot be occupied by the intelligent vehicle.

    The variable granularity radar diagram uses grids of different sizes, together with the environment perception information of various sensors such as cameras, to display the right-of-way space the vehicle can occupy and its tendency of change. The establishment method is as follows:

    (1) The road ownership radar diagram takes the geometric center of the vehicle as its center and covers a radius of about 200 m.

    (2) The closer to the intelligent vehicle, the smaller the grid of the road ownership radar diagram and the higher the accuracy; the farther from the vehicle, the bigger the grid and the lower the accuracy. For example, the minimum radial grid length is 5 cm and the maximum is 400 cm. The angular resolution of the road ownership radar diagram can change according to the driving area of concern, or it can be fixed, e.g. Δ = 1°. The variable granularity grid built this way can be matched to an array structure.

    (3) Different types of sensors such as cameras and radars perceive the surrounding environment through physical signals such as sound, light, and electricity; their working mechanisms and installation locations on the vehicle differ. Even sensors of the same type differ in accuracy, effective range, and parameter settings (such as camera focal length and white balance, or the identification distance and angular resolution of laser radar). Through parameter registration and position calibration, the information perceived and processed by different sensors can be mapped into the unified coordinate system of the road ownership radar diagram and assigned to the corresponding grids. The assigned content may include the obstacle height, a confidence level, etc.

    (4) The right of way is always attached to the running vehicle; it relates to the vehicle speed and changes dynamically over time. Its update period is determined by factors such as the sampling frequency of the sensors and the driving status.

    (5) By integrating the detection results of multiple classes of sensors, we obtain the surrounding right-of-way conditions and their tendency of change. On this basis we form the foundational platform for basic driving control strategies, such as the car-following mode in the current lane and the adjacent-lane road modes (including left turn, right turn, going straight, and U-turn), and carry out dynamic local path planning and collaborative navigation.

    (6) Based on the road ownership radar diagram, combined with human driving experience, it is possible to form the intelligent vehicle decision rule base and complete the conversion of qualitative knowledge into quantitative control.

    The variable granularity radar diagram can fuse, in real time, the information sensed by on-board sensors of different categories and installation locations, and judge the right-of-way space the vehicle can occupy and its tendency of change. The fine-grained and coarse-grained cells, constructed according to natural cognitive laws, meet the demands of intelligent vehicle environment perception while using less storage space and computing resources to simulate and compute human driving behavior recognition. This provides important technical support for realizing intelligent driving.
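    As a minimal sketch of step (3) above, a calibrated sensor reading in vehicle coordinates could be assigned to a cell of the variable granularity grid as follows, assuming the polar layout described (rings of increasing radial width, fixed angular resolution). The `ring_radii` table and `cell_of` helper are hypothetical illustrations, not the paper's implementation.

    ```python
    import bisect
    import math

    # Toy table of ring outer radii (metres); the real diagram would use the
    # ~140-ring table produced by the recursion in Sec. 4.2.
    ring_radii = [0.05, 0.15, 0.40, 1.00]
    DELTA_DEG = 1.0  # fixed 1-degree angular resolution, as in the text

    def cell_of(x, y):
        """Map a point (x, y) in vehicle coordinates to a (ring, angle) cell,
        or return None if the point lies beyond the diagram's coverage."""
        r = math.hypot(x, y)
        ring = bisect.bisect_left(ring_radii, r)  # first ring with outer radius >= r
        if ring >= len(ring_radii):
            return None
        ang = int(math.degrees(math.atan2(y, x)) % 360.0 / DELTA_DEG)
        return ring, ang

    print(cell_of(0.0, 0.3))  # a point 0.3 m to the side lands in ring 2, sector 90
    ```

    Once every sensor's detections pass through such a mapping, heterogeneous readings share one array index space, which is what allows the later fusion step to treat them uniformly.
    
    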

    4.2 The method of acquiring information using the variable granularity road ownership radar diagram

    The intelligent vehicle uses the variable granularity road ownership radar diagram to fuse the environmental perception results of various sensors and display the status and change tendency of its right of way. This section describes the settings and calculation method of the variable granularity road ownership radar diagram.

    (1) Determine the coverage of the right-of-way radar map

    The road ownership radar diagram should cover the safe driving distance of the intelligent vehicle. Calculating this safety distance requires considering the vehicle speed, possibly wet and slippery roads, and the driver's reaction time. Article 70 of the Regulations on the Implementation of the Road Traffic Safety Law stipulates that expressways shall sign the speed of each lane and that the maximum speed shall not exceed 120 km/h.

    The adhesion coefficient of a slippery pavement is μ = 0.32, and the typical driver reaction time Tf ranges from 0.35 to 0.65 s. For safety, take Tf = 0.65 s and vehicle speed Vm = 120 km/h. During the reaction time, the vehicle moves forward at uniform speed over the distance S1 = Vm × Tf ≈ 22 m. The vehicle then brakes and decelerates uniformly, moving forward the distance S2 = Vm²/(2μg) ≈ 173 m.

    The safety distance is S = S1 + S2 = 195 m. Therefore, the radius of the road ownership radar diagram should not be less than 195 m.
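    As a check on these figures, the safety distance can be computed directly from the stated parameters. The value g ≈ 10 m/s² is our assumption: it is the gravitational constant that reproduces the paper's rounded totals of 22 m and 195 m.

    ```python
    def safety_distance(v_kmh=120.0, tf=0.65, mu=0.32, g=10.0):
        """Return (reaction distance S1, braking distance S2, total S) in metres.
        g ~ 10 m/s^2 is an assumption chosen to match the paper's rounding."""
        v = v_kmh / 3.6              # speed in m/s
        s1 = v * tf                  # constant-speed travel during reaction time
        s2 = v ** 2 / (2 * mu * g)   # uniform deceleration at mu * g
        return s1, s2, s1 + s2

    s1, s2, s = safety_distance()
    print(round(s1), round(s))       # ~22 m reaction distance, ~195 m total
    ```
    
    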

    (2) Determine the grid accuracy of the right-of-way radar map

    Whether from the driver's experience or from the actual requirements of vehicle control, the required perception accuracy differs with distance: the closer to the intelligent vehicle, the smaller the grid of the road ownership radar diagram and the higher the accuracy; the farther away, the bigger the grid and the lower the accuracy.

    A distance D corresponds to a maximum speed VD that ensures safe travel of the vehicle. The radial length of the fan-shaped grid at distance D is the distance the vehicle travels during one acquisition-sensing-control period T; in practice, T = 100 ms. The road ownership radar diagram consists of multiple rings of grids, and grids located in the same ring have the same radial length.

    The radial accuracy of the grid is Sn = Dn − Dn−1, n ≥ 1.

    Limited by the accuracy of the sensor, the minimum grid accuracy is 5 cm. The Dn sequence is determined by the recursion D0 = 0, Dn = Dn−1 + max(VDn−1 × T, 5 cm), where VD is the maximum speed whose safety distance does not exceed D, i.e. the positive root of VD × Tf + VD²/(2μg) = D.

    After calculating, the radius of the road ownership radar diagram is 197.7 m, with a total of 140 ring grids.

    The minimum radial length of the grid is 5 cm and the maximum is about 336 cm; the radial lengths of the intermediate grids can be calculated from the recursion formula.
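    The recursion can be sketched in a few lines. The `v_safe` formula (the largest speed whose reaction-plus-braking distance fits within d) and g ≈ 10 m/s² are our reconstruction from the stated 5 cm minimum width, ~336 cm maximum width, and ~197.7 m radius, not a formula quoted verbatim from the text; the ring count it yields is close to, but not guaranteed to equal, the quoted 140.

    ```python
    import math

    T = 0.1            # acquisition-sensing-control period, s
    TF = 0.65          # driver reaction time, s
    MU_G = 0.32 * 10.0 # mu * g, assuming g ~ 10 m/s^2 as in the safety-distance check
    MIN_W = 0.05       # minimum radial grid width, m (sensor accuracy limit)
    S_MAX = 195.0      # required coverage radius, m

    def v_safe(d):
        """Largest speed whose stopping distance (reaction + braking) fits in d:
        positive root of v*TF + v**2 / (2*MU_G) = d."""
        return MU_G * (math.sqrt(TF ** 2 + 2.0 * d / MU_G) - TF)

    radii = [0.0]
    while radii[-1] < S_MAX:
        # each ring's width is the distance covered in one period T at the
        # locally safe speed, clipped below by the 5 cm sensor limit
        radii.append(radii[-1] + max(v_safe(radii[-1]) * T, MIN_W))

    rings = len(radii) - 1
    print(rings, round(radii[-1], 1))  # ring count and outer radius
    ```

    With these assumptions the innermost rings are 5 cm wide, the outermost ring comes out near the paper's 336 cm, and the total ring count lands in the neighbourhood of 140.
    
    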

    The angular resolution is analyzed as follows:

    The expressway lane width is L = 3.5 m, and the road ownership radar diagram is divided into N angular sectors. To distinguish different lanes at the farthest distance R of the road ownership radar diagram, the arc length of one sector at R must not exceed the lane width: 2πR/N ≤ L, i.e. N ≥ 2πR/L ≈ 355, so N = 360 (Δ = 1°) is sufficient.

    Following the above demonstration of the range and resolution of the road ownership radar diagram, the diagram consists of 140 × 360 = 50 400 grids, and we form an array of size 140 × 360 for information fusion.
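    The angular-resolution condition can be verified with a one-line computation; the only inputs are the 197.7 m radius and the 3.5 m lane width from the text.

    ```python
    import math

    R = 197.7   # outer radius of the radar diagram, m
    LANE = 3.5  # expressway lane width, m

    # Arc length of one angular cell at the outermost ring must not exceed the
    # lane width: 2*pi*R / N <= LANE, so N >= 2*pi*R / LANE.
    n_min = math.ceil(2 * math.pi * R / LANE)
    print(n_min)                 # 355, so N = 360 (1-degree cells) suffices

    N_RINGS, N_ANGLES = 140, 360
    print(N_RINGS * N_ANGLES)    # 50400 cells in the fused array
    ```
    
    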

    (3) Determine the update cycle of the right-of-way radar map

    The update period of the road ownership radar diagram is determined by factors such as the sampling frequencies of the sensors. Taking the PIKE100C camera as an example, its sampling period is 25~40 ms (it varies somewhat with environmental conditions such as lighting). The sampling periods of the single-line, four-line, and eight-line laser radars and the millimeter-wave radar are all 80 ms. To reflect the sensors' perception information accurately and in real time, the update period should be less than the minimum period of any sensor. Therefore, the update period of the road ownership radar diagram is set to 20 ms in this example.
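    The constraint on the update period amounts to a small sanity check over the sampling periods quoted above: the chosen 20 ms must be below the fastest sensor's period (25 ms for the camera in the best case).

    ```python
    # Sampling periods from the example above, in milliseconds (best case for
    # the camera, whose period varies between 25 and 40 ms with lighting).
    sensor_periods_ms = {
        "PIKE100C camera": 25,
        "single/4/8-line laser radar": 80,
        "millimeter-wave radar": 80,
    }
    update_ms = 20

    # The diagram must refresh faster than every sensor delivers new data,
    # so no sensor frame is skipped between diagram updates.
    assert update_ms < min(sensor_periods_ms.values())
    print(update_ms)  # 20 ms satisfies the constraint
    ```
    
    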

    After fusing the detection results of many types of sensors, the stored array of the road ownership radar diagram reflects the status of the surrounding right of way and its frequency of change. On this basis, we can establish basic control strategies such as the car-following mode in the current lane, the adjacent-lane changing mode, and the intersection mode, and carry out dynamic local path planning and collaborative navigation.

    4.3 The method of acquiring information based on the variable granularity road ownership radar diagram

    This example is based on the military Lion 3 intelligent vehicle, equipped with one single-line laser radar at both the front and rear of the vehicle, one forward 4-line laser radar, one forward 8-line laser radar, one backward millimeter-wave radar, and one forward PIKE100C camera for perceiving the surrounding environment. The coverage areas of the various sensors overlap, as shown in Fig. 9. After position calibration, each coverage area is circular or fan-shaped, so the road ownership radar diagram can be used for information fusion.

    Fig.9 The coverage area of each sensor of the intelligent vehicle: (a) forward single-line radar, (b) backward single-line radar, (c) forward 4-line radar, (d) forward 8-line radar, (e) backward millimeter-wave radar, (f) camera, and (g) all sensors.

    The easiest way to fuse information is to set a fan-shaped grid cell to the obstacle height (in cm) whenever any sensor detects that the cell is occupied, and to set cells not detected by any sensor to zero. This ensures that the fusion result reflects the right-of-way occupation by obstacles around the vehicle. Fig.10 shows an example in which the road ownership radar diagram represents the obstruction of surrounding vehicles on an intercity expressway. The amplitude of a grid cell containing an obstacle is the height of the obstacle; for example, the amplitude of the cell containing the far-right fence indicates that the fence is 90 cm high. In addition, we can introduce a confidence probability for each cell, a real number ranging from 0 to 1, which provides more precise support for subsequent intelligent decision making, path planning, etc.
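    The fusion rule just described can be sketched as follows. The `sensor_reports` structure and the max-height tie-break between disagreeing sensors are our assumptions for illustration; the text only specifies that an occupied cell stores the obstacle height in centimetres and an undetected cell stores zero.

    ```python
    # 140 rings x 360 one-degree sectors, matching the array derived in Sec. 4.2.
    N_RINGS, N_ANGLES = 140, 360
    grid = [[0] * N_ANGLES for _ in range(N_RINGS)]

    # Hypothetical per-sensor reports: {(ring, angle): obstacle height in cm}.
    sensor_reports = [
        {(10, 90): 45},                 # e.g. 4-line lidar sees a 45 cm obstacle
        {(10, 90): 50, (60, 0): 90},    # e.g. 8-line lidar: same cell, plus a 90 cm fence
    ]

    for report in sensor_reports:
        for (ring, ang), height_cm in report.items():
            # any detection marks the cell; keep the largest reported height
            grid[ring][ang] = max(grid[ring][ang], height_cm)

    print(grid[10][90], grid[60][0], grid[0][0])  # 50 90 0
    ```

    Replacing the integer heights with per-cell confidence values in [0, 1], as suggested above, only changes the stored type, not the fusion loop.
    
    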

    Fig.10 The state surrounding the intelligent vehicle using the right-of-way road map.

    Using the variable granularity right-of-way radar diagram to process environment perception information, the Lion intelligent vehicle has successfully completed several Beijing-Tianjin intercity expressway tests.

    5 Conclusion

    Radar-based environment perception technology has achieved massive results and breakthroughs in road detection, static obstacle detection, and environment modeling, but the sensor configuration of the intelligent vehicle remains the hardware basis for perceiving the surrounding environment and obtaining the vehicle's own state, and it is also the most costly and significant part of a testing platform. Because vehicle platform mechanical structures differ, the mechanical and electrical modifications also differ slightly; the sensor selections and installation locations of different research teams differ as well, without a unified scheme. This paper designs a radar sensor perception technology architecture for intelligent vehicles with the driving brain as the core, decouples intelligent decision making from sensor information, reduces the influence of sensor quantity, type, and installation position on the whole technology structure, and facilitates the migration of the technical structure to platforms with different sensor configurations. We also propose a method of obtaining information using the variable granularity radar diagram. Under the condition of complete driving information, through the formal language of driving cognition, the number of radar sensors can be increased or decreased and the radar sensor model or installation location can be changed without directly affecting the intelligent driving decision, so the system can easily be transplanted to different test platforms.

    Acknowledgment

    This work was supported by the National Natural Science Foundation of China under Grants No. 61035004, No. 61273213, No. 61300006, No. 61305055, No. 90920305, No. 61203366, No. 91420202, No. 61571045, and No. 61372148; the National High Technology Research and Development Program ("863" Program) of China under Grant No. 2015AA015401; the National Basic Research Program ("973" Program) of China under Grant No. 2016YFB0100903; the Junior Fellowships for Advanced Innovation Think-tank Program of the China Association for Science and Technology under Grant No. DXB-ZKQN-2017-035; and the Beijing Municipal Science and Technology Commission special major project under Grant No. D171100005017002.

    [1] D. W. Gage, UGV history 101: A brief history of unmanned ground vehicle (UGV) development efforts, Unmanned Systems, vol. 13, pp. 9-32, 1995.

    [2] T. Kanade and C. Thorpe, CMU strategic computing vision project report: 1984 to 1985, Carnegie-Mellon University, the Robotics Institute, pp. 10-90, 1986.

    [3] M. Williams, PROMETHEUS-The European research programme for optimising the road transport system in Europe, in Proceedings of IEEE Colloquium on Driver Information, 1988, pp. 1-9.

    [4] S. Tsugawa, M. Aoki, A. Hosaka, and K. Seki, A survey of present IVHS activities in Japan, Control Engineering Practice, vol. 5, no. 11, pp. 1591-1597, 1997.

    [5] M. Yang, Overview and prospects of the study on driverless vehicles, Journal of Harbin Institute of Technology, vol. 38, no. 8, pp. 1259-1262, 2006.

    [6] H. B. Gao, X. Y. Zhang, T. L. Zhang, Y. C. Liu, and D. Y. Li, Research of intelligent vehicle variable granularity evaluation based on cloud model, Acta Electronica Sinica, vol. 44, no. 2, pp. 365-374, 2016.

    [7] X. Yuan, C. X. Zhao, and H. F. Zhang, Road detection and corner extraction using high definition lidar, Information Technology Journal, vol. 9, no. 2, pp. 1022-1030, 2010.

    [8] Z. T. Wan, Research on LADAR based road and obstacles detection, National University of Defense Technology, 2010.

    [9] F. Moosmann, O. Pink, and C. Stiller, Segmentation of 3D lidar data in non-flat urban environment using a local convexity criterion, in Proceedings of Intelligent Vehicles Symposium, 2009, pp. 215-220.

    [10] B. Douillard, J. Underwood, N. Melkumyan, and S. Singh, Hybrid elevation maps: 3D surface models for segmentation, in Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems, 2010, pp. 1532-1538.

    [11] M. Himmelsbach, F. V. Hundelshausen, and H. J. Wuensche, Fast segmentation of 3D point clouds for ground vehicles, in Proceedings of Intelligent Vehicles Symposium, 2010, pp. 560-565.

    [12] J. H. Lee, T. Tsubouchi, and K. Yamamoto, People tracking using a robot in motion with laser range finder, in Proceedings of International Conference on Intelligent Robots and Systems, 2006, pp. 2936-2942.

    [13] D. Brscic and H. Hashimoto, Tracking of objects in intelligent space using laser range finders, in Proceedings of International Conference on Intelligent Robots and Systems, 2006, pp. 1723-1728.

    [14] F. Scholer, J. Behley, and V. Steinhage, Person tracking in three-dimensional laser range data with explicit occlusion adaption, in Proceedings of International Conference on Robotics and Automation, 2011, pp. 1297-1303.

    [15] L. Spinello, M. Luber, and K. O. Arras, Tracking people in 3D using a bottom-up top-down detector, in Proceedings of International Conference on Robotics and Automation, 2011, pp. 1304-1310.

    [16] L. E. Navarro-Serment, C. Mertz, and N. Vandapel, LADAR-based pedestrian detection and tracking, in Proceedings of IEEE Workshop on Human Detection from Mobile Platforms, California, 2008.

    [17] A. Petrovskaya and S. Thrun, Model based vehicle tracking in urban environments, in Proceedings of International Conference on Robotics and Automation, 2009.

    [18] L. Wang, Simultaneous localization and mapping of mobile robot with whole area coverage, Nanjing University of Science & Technology, 2005.

    [19] D. Vivet, P. Checchin, and R. Chapuis, On the fly localization and mapping using a 360° field-of-view microwave radar sensor, in Proceedings of International Conference on Intelligent Robots and Systems, 2009.

    [20] K. Pathak, A. Birk, N. Vaskevicius, M. Pfingsthorn, S. Schwertfeger, and J. Poppinga, Online three-dimensional SLAM by registration of large planar surface segments and closed-form pose-graph relaxation, Journal of Field Robotics, vol. 27, pp. 52-84, 2010.

    [21] S. Y. Park, S. I. Choi, J. Moon, and J. Kim, Localization of an unmanned ground vehicle using 3D registration of laser range data and DSM, in Proceedings of Applications of Computer Vision, 2009, pp. 1-6.

    [22] B. Wu, Compression of 3D range data with adaptive thresholding using the integral image, in Proceedings of Computer Science and Information Engineering, National University of Kaohsiung, Kaohsiung, 2011.

    [23] Y. C. Lin, Large scale 3D scene registration using data from Velodyne LIDAR, in Proceedings of Computer Science and Information Engineering, National University of Kaohsiung, Kaohsiung, 2011.

    [24] H. Kurniawati, J. C. Schulmeister, T. Bandyopadhyay, G. Papadopoulos, F. S. Hover, and N. M. Patrikalakis, Infrastructure for 3D model reconstruction of marine structures, in Proceedings of International Conference on Robotics and Automation, 2010.

    [25] M. Bertozzi, A. Broggi, and A. Fascioli, VisLab and the evolution of vision-based UGVs, Computer, vol. 39, no. 12, pp. 31-38, 2006.

    [26] E. Guizzo, How Google's self-driving car works, IEEE Spectrum Online, p. 18, 2011.

    [27] X. Y. Zhang, H. B. Gao, M. Guo, G. P. Li, Y. C. Liu, and D. Y. Li, A study on key technologies of unmanned driving, CAAI Transactions on Intelligence Technology, vol. 1, no. 1, pp. 4-43, 2016.

    [28] M. W. M. G. Dissanayake, P. Newman, S. Clark, H. F. Durrant-Whyte, and M. Csorba, A solution to the simultaneous localization and map building problem, IEEE Transactions on Robotics and Automation, vol. 17, no. 3, pp. 229-241, 2001.

    • Hongbo Gao is with the State Key Laboratory of Automotive Safety and Energy, Tsinghua University, Beijing 100083, China.

    • Xinyu Zhang is with the Information Technology Center, Tsinghua University, Beijing 100083, China. Email: xyzhang@tsinghua.edu.cn.

    • Jianhui Zhao is with the Department of Computer Science and Technology, Tsinghua University, Beijing 100083, and Military Transportation University, Tianjin 300161, China.

    • Deyi Li is with the Institute of Electronic Engineering of China, Beijing 100039, China.

    *To whom correspondence should be addressed.

    Manuscript received: 2017-08-06; accepted: 2017-09-18
