
    Online Observability-Constrained Motion Suggestion via Efficient Motion Primitive-Based Observability Analysis

2018-04-16

    Zheng Rong, Shun’an Zhong and Nathan Michael

(1. School of Information and Electronics, Beijing Institute of Technology, Beijing 100081, China; 2. Institute of Electronics, Chinese Academy of Sciences, Beijing 100190, China; 3. Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213, USA)

Reliable state estimation is essential for an autonomous vision-based navigation system, such as a mobile robot, to enable accurate localization and environment perception in unknown environments[1]. Reliable perception systems depend on the availability of sufficiently informative sensor observations and environmental information to ensure continued accurate performance[2]. State estimation methodologies assume the existence and preservation of observability (the ability of the system states to be reconstructed from the system output)[3]. However, this observability condition may be violated in environments with limited or scarce information, resulting in increased estimation uncertainty and even divergence, especially for monocular vision-based autonomous systems.

In recent years, observability analysis techniques have received significant attention. Observability analysis provides a tool to evaluate the observability conditions with respect to the current system state and environment. Observability-constrained active control techniques leverage observability analysis to quantify the implications of sensor observations on state estimate accuracy toward informed trajectory selection[3], faster estimator convergence[4], and optimal sensor placement[5]. Observability-constrained vision navigation systems[6-8] detect the unobservable directions and reduce estimation inconsistency by explicitly prohibiting spurious information gain. Trajectories have been optimized to guarantee the sensing observability and control stability of the system[9]. Sampling trajectories are selected to maximize observability so as to localize the robot and construct the map of the environment[10-11]. Path planning has also incorporated visibility constraints to reduce exploration time in unknown environments[12]. However, these studies focus on the observability analysis of the current system states, without considering future observability or the impact of future motions on system observability. Local observability prediction can be of great value in many problematic situations where the state is difficult to reconstruct and must be sensed actively, such as robot localization in an unknown environment[9]. By characterizing the impact of future motion on system observability, this technique enables informed selection of future motions to avoid potential observability degradation, and consequently to improve state estimation performance.

In this work, an online methodology is proposed that seeks to predict local observability conditions and suggest observability-constrained motion directions, toward enabling robust state estimation and safe operation for an autonomous system in unknown environments. The formulation leverages efficient numerical observability analysis and a motion primitive technique to realize local observability prediction, and accordingly suggests future motion directions to avoid potential estimation degradation due to observability deficiency. Following prior work[13], the empirical observability Gramian (EOG)[14] is employed to enable real-time local observability evaluation. A motion primitive[15] technique is utilized to enable local path sampling and observability-constrained motion suggestion. We assess the implications of potential motions on the system observability and the resulting state estimation performance by evaluating the observability of potential trajectories, and seek to preserve estimation consistency by explicit motion direction selection.

The proposed approach is specialized to a monocular visual-inertial state estimation framework to assess its ability to correctly predict the observability of future motions and effectively make motion suggestions that avoid potential state estimation degeneracy of the perception system. Monocular visual-inertial state estimation is a representative vision-based perception strategy that enables autonomous operation with resource-constrained systems such as micro aerial vehicles and commodity devices. This choice of sensing modalities also represents a particularly challenging state estimation formulation due to the lack of direct metric depth observations[16]. It is thereby essential for such a system to preserve full-state observability in order to achieve accurate state estimation.

    1 Methodology

This section briefly summarizes relevant concepts related to the monocular visual-inertial state estimator and EOG-based observability analysis, based on prior works[2,13,15,17-19], and introduces a method to predict the observability of future motions and propose motion direction suggestions.

    1.1 EOG-based observability evaluation for mono V-I estimator

The optimization-based monocular visual-inertial state estimation problem is formulated with respect to a sliding window that contains n visual keyframes and a local map containing m landmarks observed by the keyframes, and is solved via a recursive optimization strategy. The full state parameters to be estimated are formulated as a vector X ∈ ℝ^(10n+m+3),

    (1)

    (2)

where z_imu and z_cam are the inertial measurement unit (IMU) and camera measurements, r_imu and r_cam are the residuals of the IMU and camera measurements, and P_imu and P_cam are the corresponding residual covariances. An IMU pre-integration technique[20] is employed to represent the IMU measurements. The resulting integrated system is treated as sensor outputs in the optimization and in the EOG state-output simulation.

The system observability is analyzed, given the monocular vision-based sliding-window state estimation formulation, by the efficient EOG computation[13]. The EOG provides insight into the sensitivity of the system outputs with respect to perturbed system states, and captures the ability of the estimator to reconstruct the system states from the sensor outputs by numerically simulating the system state-output behavior. Crucially, computation of the EOG is independent of the state estimation, which makes it a viable observability analysis technique even in a degraded estimation scenario. Benefiting from its additive property, the EOG can be efficiently computed by partitioning the system output into sub-vectors and exploiting the underlying formulation sparsity. The full EOG W is readily computed by perturbing the system initial conditions x ∈ ℝⁿ directly in positive and negative directions and simulating the state-output behavior 2n times:

W(i,j) = (1/(4ε²)) Σ_k [y^(+i)(k) − y^(−i)(k)]^T [y^(+j)(k) − y^(−j)(k)]   (3)

where y^(±i)(k) denotes the simulated output at step k with the i-th initial state perturbed by ±ε.
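As a concrete illustration of the perturb-and-simulate procedure described above, the following sketch computes an EOG for a toy discrete-time system; the function names and the toy system are ours, not the paper's implementation:

```python
import numpy as np

def empirical_obs_gramian(step, output, x0, eps=1e-3, horizon=20):
    """EOG via +/- perturbation of each initial state (2n simulations)."""
    n = len(x0)

    def simulate(x):
        ys = []
        for _ in range(horizon):
            ys.append(output(x))
            x = step(x)
        return np.array(ys)          # horizon x p output sequence

    # output difference sequence for each perturbed state direction
    dY = []
    for i in range(n):
        e = np.zeros(n); e[i] = eps
        dY.append(simulate(x0 + e) - simulate(x0 - e))

    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            W[i, j] = np.sum(dY[i] * dY[j]) / (4 * eps**2)
    return W

# toy 2-state system: position-velocity dynamics, only position measured
A = np.array([[1.0, 0.1], [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
W = empirical_obs_gramian(lambda x: A @ x, lambda x: C @ x,
                          np.array([0.0, 1.0]))
k = np.linalg.eigvalsh(W)[0]   # smallest eigenvalue as observability measure
```

For this observable toy pair the smallest eigenvalue of W is strictly positive; a feature-deficient scenario would drive it toward zero.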

    1.2 Generation and observability evaluation of motion primitive

    Motion primitives are incorporated with the EOG evaluation to enable a motion extrapolation and subsequent local observability prediction. The EOG is computed for motion primitive end points based on the current state to enable the observability prediction locally.

A motion primitive generation strategy[15] is employed to compute primitives based on a discretized set of initial and final conditions with different trajectory durations. While the approach extends readily to three dimensions, in this work, for simplicity of presentation, we compute motion primitives for a system in two dimensions. The computation is decoupled into translation and heading trajectory generation with higher-order end point constraint bounds that correspond to the expected rates of motion exhibited during the simulation and experimental studies. After the motion primitive library is computed, a look-up table is generated to enable efficient online queries for the appropriate end point states for observability evaluation.

    1.3 Trajectory observability evaluation

To predict the local observability condition and suggest the next-step motion direction, a trajectory tree with a specified number of levels and branches is generated to realize the extrapolation. As shown in Fig.1, a 3-level extrapolation frame with 3-branch motion primitives is generated, creating a total of 27 potential trajectories in the local region based on the current state. The observability conditions of potential trajectories, each composed of a series of motion primitive end points, are evaluated by synthesizing the observability measures of the involved end points. For simplicity of representation, and without loss of sensitivity given the wide camera FOV, we compute each motion primitive with three spreading branches. The orientation of the system is kept instantaneously aligned with the direction of motion.

    Fig.1 Example of a 3-level extrapolation frame with 3-branch motion primitives
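The tree sizes implied by the 3-level, 3-branch frame of Fig.1 can be checked with a small sketch (the code is illustrative):

```python
from itertools import product

LEVELS, BRANCHES = 3, 3   # extrapolation frame of Fig.1

# one trajectory = one branch choice per level: 3^3 = 27 trajectories
trajectories = list(product(range(BRANCHES), repeat=LEVELS))

# end points evaluated in the whole tree: 3 + 9 + 27
total_end_points = sum(BRANCHES**i for i in range(1, LEVELS + 1))

# end points in the subtree of a single first-level direction: 1 + 3 + 9
per_direction = sum(BRANCHES**i for i in range(LEVELS))
```

The per-direction count of 13 end points is the number used later when evaluating the observability cost of one motion direction.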

Using the trajectory tree with multi-level extrapolation, the observability evaluation enables local observability prediction with a degree of foresight that helps avoid a sudden dead end from which the state estimator may not recover. The observability cost for each trajectory is computed using a weighted strategy, i.e., closer steps use larger weighting factors, based on the fact that the closest step has a more accurate observability evaluation and is more crucial in near-term motion execution than later steps. Thus, for a specified trajectory with N steps (end points), the observability measure K can be computed from the observability measures k_i of the end points and preset weighting factors w_i:

K = Σ_{i=1}^{N} w_i k_i   (4)

    As the observability prediction for the first step is more accurate than later steps, a more conservative strategy can be applied by executing one-step motion according to the multi-level evaluation result, without loss of foresight. Under this scenario, it is preferable to evaluate the observability measure for a motion direction instead of a specified trajectory. The observability cost for each motion direction can be computed using all the observability measures of the involved motion primitives.

K = Σ_{i=1}^{N} w_i Σ_{j=1}^{M} k(i,j)   (5)

in which N is the number of levels, M is the number of motion primitives in each level of one motion direction, k(i,j) is the observability measure of the j-th motion primitive in the i-th level, and the motion primitives in the same level use an identical weighting factor w_i. Considering the example in Fig. 1, the future motion for the current state S spreads out in three motion directions S1, S2, S3, and the observability cost of each direction can be evaluated using the 13 involved end points. For example, the observability cost of the second direction S2 is computed as

K_{S2} = w_1 k(1,1) + w_2 Σ_{j=1}^{3} k(2,j) + w_3 Σ_{j=1}^{9} k(3,j)   (6)

    Fig.2 Motion suggestions for a predefined simulation scenario and straight trajectory

Thus the observability-constrained motion direction can be suggested according to the resulting observability measures, and then incorporated with other motion constraints to yield an informed motion plan for the next step. Note that in actual applications of this strategy, the system does not have to wait for the completion of the one-step execution before starting the next motion suggestion. Instead, a denser stream of motion suggestions can be generated and followed after the state estimation of each newly arriving frame, so that new sensor observations are added into the system and the environment information is updated, yielding a more accurate local observability prediction and consequently a better motion suggestion.
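The weighted synthesis of a direction's observability cost described above can be sketched as follows (a minimal illustration; the weight and measure values are made up):

```python
def direction_cost(level_measures, weights):
    """Weighted observability cost of one motion direction: each level's
    end-point measures are summed and scaled by that level's weight,
    with closer levels weighted more heavily."""
    return sum(w * sum(ks) for w, ks in zip(weights, level_measures))

# 3-level, 3-branch subtree of one direction: 1 + 3 + 9 = 13 end points
k = [[0.9], [0.8, 0.7, 0.6], [0.5] * 9]
w = [0.5, 0.3, 0.2]          # decreasing with level distance
K = direction_cost(k, w)     # ≈ 1.98
```

The same function evaluates a single trajectory's cost if each level contributes a single end-point measure.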

    1.4 Observability-constrained motion direction suggestion

After computing the observability cost for all motion directions, the strategy in Algorithm 1 can be used to propose the motion suggestion. An example of motion suggestion along a predefined trajectory in a simulated environment is shown in Fig.2. The corresponding online observability prediction results in three directions (left, forward, right) are shown in Fig.3. Different motion directions are suggested due to the different landmark distributions in the three stages. A threshold is used to identify the "forbidden directions". The motion planner prefers the motion direction with a higher observability measure.

    Fig.3 Observability prediction and resulting motion direction suggestion

Algorithm 1  Strategy of motion direction suggestion

Assume there are m motion directions to be evaluated.

Initial condition:

Kmax = 0;

imax = 1;

forbidden_directions.clear();

Loop i: from 1 to m

if Ki < threshold, then forbidden_directions.add(i);

end.

if Ki > Kmax, then Kmax = Ki, imax = i;

end.

end loop.

    Results:

· Forbidden_directions indicates the directions in which the system is believed to experience an observability-deficient condition and is prone to estimation degradation or failure;

· imax indicates the most-suggested direction. Even in the worst case, where Kmax falls below the threshold and all directions are forbidden, imax still identifies the least-risky direction;

· Other directions are moderately suggested; the observability condition in these directions is believed to provide sufficient information to ensure reasonable state estimation performance.
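Algorithm 1 can be sketched in Python as follows (the threshold value and names are illustrative):

```python
def suggest_directions(K, threshold):
    """Classify m motion directions by their observability costs K[0..m-1]:
    directions below the threshold are forbidden; the direction with the
    maximum cost is the most-suggested one."""
    forbidden = []
    K_max, i_max = 0.0, 0
    for i, Ki in enumerate(K):
        if Ki < threshold:
            forbidden.append(i)       # observability-deficient direction
        if Ki > K_max:
            K_max, i_max = Ki, i      # track the most-suggested direction
    return i_max, forbidden

best, forbidden = suggest_directions([0.2, 0.9, 0.5], threshold=0.3)
# direction 1 is most suggested; direction 0 is forbidden
```

A downstream motion planner would intersect the non-forbidden set with its other constraints before committing to a direction.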

A reasonable extrapolation range is essential to ensure that the system can make predictions with sufficient foresight and act in advance to avoid an irreversible degraded situation. Too short an extrapolation distance cannot provide sufficient predictive information, while too large a distance cannot ensure prediction accuracy, as the system cannot obtain sufficient environment information and consequently cannot predict the sensor observations accurately. For example, in a confined environment the camera observation may change significantly along the trajectory, and the system can hardly predict the feature observations at a far end point. The extrapolation distance is controlled by the velocity of the final state and the extrapolation duration. In this work, we assume a constant linear velocity magnitude and choose the extrapolation duration according to the scenario type: in a confined scenario a smaller duration is used, as the effective sensor observation prediction distance is limited by the environment, while in an open scenario a larger duration can be used. The extrapolation width is controlled by the number of branches and the heading separation between them. As a camera with a wide FOV is employed in this study, the configuration of three branches with 45° separation is reasonable without loss of sensitivity or computational efficiency.
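The extrapolation-range reasoning above can be captured in a small sketch; the duration values are illustrative assumptions, not the paper's settings:

```python
def extrapolation_distance(speed, scenario):
    """Extrapolation distance = constant speed x scenario-dependent duration:
    shorter in confined scenes where observation prediction is unreliable,
    longer in open scenes."""
    durations = {"confined": 1.0, "open": 3.0}   # seconds, illustrative
    return speed * durations[scenario]

d_open = extrapolation_distance(1.5, "open")          # 4.5 m lookahead
d_confined = extrapolation_distance(1.5, "confined")  # 1.5 m lookahead
```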

    2 Evaluation and Analysis

The proposed motion direction suggestion approach is evaluated on the optimization-based monocular visual-inertial state estimation formulation through simulation and real-world experiments. The perception system, comprising a monocular camera and an IMU (simulated and real), is carried along trajectories in different environments. Through the simulation analysis we seek to demonstrate the expected ability and efficacy of the proposed approach in both local observability prediction and motion direction suggestion. The experimental study seeks to verify the effectiveness of the proposed methodology in real-world scenarios.

    2.1 Implementation detail

    Fig.5 Predicted observability, actual observability, and estimation covariance of the two trajectories

The simulation and experimental analyses leverage the same algorithm implementations, including the optimization-based sliding-window state estimator and the local observability prediction-based motion suggestion presented in Sect. 1.4. We employ a time-synchronized monocular camera and IMU in the experiment; the simulation model accurately reproduces the associated sensor characteristics and uncertainty. The Ceres solver[22] is utilized to solve the optimization problem, and analytic Jacobians are employed to ensure run-time performance. The full sliding window size of the state estimator is set to 30, and the update rate of the optimization and motion suggestion is 10 Hz. The motion primitives are generated with three directions and four levels for a reasonable extrapolation distance and width, and consequently 120 end-point observability measures are evaluated. The EOG-based observability evaluation uses a fixed perturbation size of 0.001, with the translational states perturbed. Note that we use different thresholds in simulation and experiment to detect observability deficiency, due to the different scenario models. From the actual experiments we find that for similar environment models (for example, outdoor open scenarios) a well-chosen threshold can be reused without further adjustment.

    2.2 Simulation results

In the simulation, two pre-defined trajectories are tested in the same scenario (Fig.4). Straight trajectory-1 experiences observability degradation due to feature deficiency, while trajectory-2 remains within a feature-rich area throughout. The observability measures of the three motion directions are evaluated to make the prediction, and the actual current observability measure is used as a reference (Fig.5).

    Fig.4 Pre-defined simulation scenario and two trajectories

Firstly, the observability deficiency along trajectory-1 (left column) is correctly predicted (by 8.5 s) with a threshold of 0.3 applied to the predicted observability measure. The correctness of the prediction is verified by the actual observability measure and the consequent state estimation degradation indicated by the increasing covariance. Secondly, during the interval [60 s, 100 s], the left direction is strongly suggested, while the actual motion disobeys the suggestion and enters the observability-deficient area, resulting in increased uncertainty and degraded state estimation. In contrast, trajectory-2 (right column) follows the suggested directions to avoid the feature-deficient area, yielding a reasonable observability measure along the trajectory and consistent estimation performance. The estimation performance can be checked via both the estimation covariance (Fig.5) and the estimated path (Fig.6). The observability deficiency in trajectory-1 results in a large jump in the estimated path and end point, while trajectory-2 yields a result with limited uncertainty.

    Fig.6 Visualized estimation results, including estimated path and features

Note that in trajectory-1, the observability degeneracy is predicted 8 s earlier than its actual occurrence, permitting a motion planner to take action before entering a pathological scenario. Although trajectory-2 is much longer than trajectory-1, state estimation along trajectory-2 preserves better performance owing to the well-selected motion directions. Ideally, the best trajectory would follow the motion direction with the highest observability value, but in actual applications motion planning should be constrained by both observability and the user goal. Note that trajectory-2 is generated based on the direction suggestions from trajectory-1 and the actual feature distribution model in the simulated environment.

    2.3 Experimental results

In the experiment, the system moves in a hallway with varying observability conditions representing degraded scenarios such as a white wall, narrow space, sharp turning, and dark lighting. As shown in Fig.7, four representative cases and three prediction points are studied.

    Fig.7 Experiment scenario illustrated by estimated path and environment features

    Fig.8 Predicted observability in three directions, actual observability, and estimation covariance

    Fig.9 Camera images overlaid with online motion direction suggestion in four representative cases

The observability measures in the three motion directions are predicted and motion directions are suggested accordingly (Fig.8). The actual observability measure is given as a reference to verify the efficacy of the prediction, while the estimation covariance is used to evaluate the actual estimation performance. Four representative cases and three prediction points are marked. In case-1, forward motion is most suggested according to the highest observability prediction, and the actual motion follows this direction, which preserves a good observability condition with little state estimation uncertainty. In case-2, the potential degeneracy in forward motion is successfully predicted. As the system keeps moving forward, it experiences, as expected, an actual observability degeneracy, which results in increased estimation covariance. Before the sharp turn in case-3, moving right is strongly recommended due to the highly confined environment. The suggestion is followed by the actual motion; as a result, the actual observability increases significantly and the estimation uncertainty decreases to a low level. In case-4, before the system enters the dark area, the degeneracy in future motion is successfully predicted, and the violation of the suggestion results in significantly increased state uncertainty.

    The four representative scenarios include two true-positive predictions (1, 3) and two true-negative predictions (2, 4). In the true positive cases, the system proposed “should go” direction (green arrows overlaid on the camera images), and the actual motion followed the suggestion and consequently yielded a good state estimation performance. On the contrary, in the true negative cases the system moved along the predicted “forbidden” direction (red arrows on the camera images), eventually entering the degenerate conditions and inducing significant state estimation degradation. The true-positive and true-negative cases demonstrate the correctness of the observability prediction in motion directions with informative and deficient sensor observation, respectively. The corresponding images (Fig.9) captured by the camera are given to exhibit the visual environment conditions. Green arrows (G) indicate the most suggested directions, blue (B) for the moderately suggested directions and red (R) for the forbidden directions. Note that the arrows indicating the motion suggestions are generated and overlaid on the camera image online.

Similar to the simulation test, the degenerate conditions are successfully predicted prior to the actual occurrence of the degeneracy. Three representative prediction points (A′, B′, C′) and the corresponding actual occurrence points (A, B, C) are checked: t_A′ = 12.35 s, t_A = 12.85 s; t_B′ = 78.75 s, t_B = 79.45 s; t_C′ = 86.95 s, t_C = 88.35 s. The predictions are 0.5 s, 0.7 s, and 1.4 s earlier than the degeneracy occurrences, respectively.

    3 Conclusion

An online observability-constrained motion suggestion methodology is proposed in this paper. The proposed approach makes informed motion suggestions for a representative monocular visual-inertial state estimation system to preserve robust estimation performance. An efficient EOG-based observability evaluation technique and motion primitives are combined to enable local observability prediction and real-time motion suggestion, making it possible to evaluate the observability of 120 end points locally within 25 ms using one thread on a 2.9 GHz Intel Core-i7 laptop. The approach is evaluated in both simulation and experiments. The results demonstrate the correctness of the local observability prediction and the efficacy of the motion suggestion.

The observability-constrained motion suggestion is an active strategy to ensure the state estimation performance of a vision-based autonomous system toward safe operation in unknown environments. To achieve this goal, we will integrate the proposed approach with the control system to enable observability-constrained planning in challenging operating environments.

    [1] Lupton T, Sukkarieh S. Visual-inertial-aided navigation for high-dynamic motion in built environments without initial conditions[J]. IEEE Transactions on Robotics,2012, 28(1): 61-76.

    [2] Hinson B T. Observability-based guidance and sensor placement[D]. Washington: University of Washington, 2014.

    [3] Alaeddini A, Morgansen K A. Trajectory design for a nonlinear system to insure observability[C]∥2014 European Control Conference (ECC), Strasbourg, France, 2014.

    [4] Hinson B T, Binder M K, Morgansen K A. Path planning to optimize observability in a planar uniform flow field[C]∥2013 American Control Conference (ACC), Washington, DC, USA, 2013.

    [5] Qi J, Sun K, Kang W. Optimal PMU placement for power system dynamic state estimation by using empirical observability gramian[J]. IEEE Transactions on Power Systems,2015, 30(4): 2041-2054.

    [6] Hesch J A, Kottas D G, Bowman S L, et al. Observability-constrained vision-aided inertial navigation, 2012-001[R]. Minneapolis: Dept of Comp Sci & Eng, MARS Lab, University of Minnesota, 2012.

    [7] Kottas D G, Hesch J A, Bowman S L, et al. On the consistency of vision-aided inertial navigation[C]∥Experimental Robotics, Switzerland, 2013.

    [8] Hesch J A, Kottas D G, Bowman S L, et al. Towards consistent vision-aided inertial navigation[M]. Berlin: Springer, 2013: 559-574.

[9] Alaeddini A, Morgansen K A. Trajectory optimization of limited sensing dynamical systems[EB/OL]. [2017-02-15]. https://arxiv.org/abs/1611.08056.

    [10] Lorussi F, Marigo A, Bicchi A. Optimal exploratory paths for a mobile rover[C]∥IEEE International Conference on Robotics and Automation (ICRA), Seoul, Korea, 2001.

    [11] Devries L, Majumdar S J, Paley D A. Observability-based optimization of coordinated sampling trajectories for recursive estimation of a strong, spatially varying flowfield[J]. Journal of Intelligent & Robotic Systems,2013, 70(1-4): 527-544.

    [12] Richter C, Roy N. Learning to plan for visibility in navigation of unknown environments[C]∥International Symposium on Experimental Robotics (ISER), Tokyo, Japan, 2016.

    [13] Rong Z, Michael N. Detection and prediction of near-term state estimation degradation via online nonlinear observability analysis[C]∥2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Lausanne, Switzerland, 2016.

    [14] Singh A K, Hahn J. On the use of empirical gramians for controllability and observability analysis[C]∥American Control Conference, Portland, Oregon, USA, 2005.

    [15] Mueller M W, Hehn M, D’andrea R. A computationally efficient motion primitive for quadrocopter trajectory generation[J]. IEEE Transactions on Robotics,2015, 31(6): 1294-1310.

    [16] Hansen P, Alismail H, Rander P, et al. Monocular visual odometry for robot localization in LNG pipes[C]∥2011 IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China, 2011.

[17] Lall S, Marsden J E, Glavaški S. Empirical model reduction of controlled nonlinear systems[C]∥IFAC World Congress, New York, USA, 1999.

    [18] Qi J, Sun K, Kang W. Adaptive optimal PMU placement based on empirical observability Gramian[C]∥10th IFAC Symposium on Nonlinear Control Systems NOLCOS, Monterey, California, USA, 2016.

[19] Shen S J, Michael N, Kumar V. Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs[C]∥IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 2015.

[20] Forster C, Carlone L, Dellaert F, et al. On-manifold preintegration theory for fast and accurate visual-inertial navigation[EB/OL]. [2017-02-12]. https://pdfs.semanticscholar.org/ed4e/9f89d1fbcf50bea8c65b947b6397a61b4945.pdf.

    [21] Krener A J, Ide K. Measures of unobservability[C]∥48th IEEE Conference on Decision and Control (CDC/CCC), Shanghai, China, 2009.

[22] Agarwal S, Mierle K. Ceres solver[EB/OL]. [2016-08-14]. http://ceres-solver.org/.
