
    Monocular Visual-Inertial and Robotic-Arm Calibration in a Unifying Framework

2022-10-26 12:24:02
IEEE/CAA Journal of Automatica Sinica, 2022, Issue 1

Yinlong Zhang, Wei Liang, Senior Member, IEEE, Mingze Yuan, Hongsheng He, Jindong Tan, and Zhibo Pang, Senior Member, IEEE

Abstract—Reliable and accurate calibration of the camera, inertial measurement unit (IMU) and robot is a critical prerequisite for visual-inertial based robot pose estimation and surrounding environment perception. However, traditional calibrations suffer from inaccuracy and inconsistency. To address these problems, this paper proposes a monocular visual-inertial and robotic-arm calibration method in a unifying framework. In our method, the spatial relationship between the sensing units and the robotic arm is geometrically correlated. The decoupled estimations of rotation and translation reduce the coupled errors during optimization. Additionally, the robotic calibration moving trajectory is designed in a spiral pattern that enables full excitation of 6-DOF motions repeatably and consistently. The calibration has been evaluated on our developed platform. In the experiments, the calibration achieves rotation and translation RMSEs of less than 0.7° and 0.01 m, respectively. The comparisons with state-of-the-art results demonstrate the consistency, accuracy and effectiveness of our calibration.

    I. INTRODUCTION

MONOCULAR visual-inertial and robotic-arm calibration aims at correlating the spatial relationship between the visual-inertial unit and the robot. A monocular camera and an inertial measurement unit (IMU) typically form a compact and minimal sensor suite. They can be used for robot pose estimation and surrounding environment perception [1]–[5].

The camera and IMU have heterogeneous characteristics regarding quick response, tracking accuracy, and absolute pose observability [6]–[9]. The camera is an exteroceptive sensor: it captures the body pose by tracking visual features (usually blob patterns, corners, or pixel intensities) and minimizing reprojection or photometric errors over an image sequence. By comparison, the IMU is a proprioceptive sensor that captures body inertial measurements, i.e., linear accelerations and angular velocities, from which the absolute metric scale, gravity, and short-term yaw-pitch-roll angles can be derived [10], [11]. In this way, robot tracking works even in the presence of camera tracking failures such as illumination changes, motion blur, and absence of texture [12], [13]. Potential areas for visual-inertial based robot perception include visual-inertial odometry for unmanned aerial and ground vehicles, robotic-arm motion tracking, etc. [14]–[17].

Basically, the performance of monocular visual-inertial based robot pose estimation and surrounding environment perception relies considerably on the accuracy of the calibration among the triplet [18]–[22]. Incorrect calibration results in severe drift over time, which is unacceptable for high-end industrial robot perception. The existing literature on visual-inertial calibration can be classified into two categories: offline and online [23]. Offline calibration, as the name suggests, performs the spatial and temporal calibration before use [7], [24], [25]. The calibration process usually requires users to gently and evenly move the sensor rig in front of a stationary target (typically a chessboard or AprilTags). The whole process is repeated a few times until convergence. Generally, the calibration result has relatively high precision and consistency, since it enables batch optimization over a large visual-inertial dataset. For online calibration, in contrast, the extrinsic parameters as well as the body initial values (e.g., velocity, pitch, and roll) are estimated online [26]. The angular velocity and acceleration biases can be estimated to compensate for temperature changes. However, the calibration performance is sometimes less stable than that of offline methods, since the calibration process lasts only a few seconds and lacks repeatability.

The general goal is to design a visual-inertial & robotic-arm calibration method (illustrated in Fig. 1). In order to solve the inherent calibration issues (for instance, inconsistent calibration movement, unexpected disturbances while holding the visual-inertial rig, and absence of adequate excitation of 6-DOF movement), this work designs an accurate and consistent moving trajectory, which is performed by the 6-DOF robotic arm. The trajectory excites yaw-pitch-roll rotations and x-y-z translations evenly and repeatably. Besides, we develop a unifying model that correlates the spatial relationships among the triplet in an elegant mathematical formulation. The contributions of this paper are summarized as follows.

Fig. 1. Illustration of our proposed method: visual-inertial and robotic-arm calibration. The coordinate frames include the camera frame, IMU frame and robotic-arm frame. The goal is to estimate the visual and inertial intrinsics, as well as the spatial relationships among the visual-inertial and robotic-arm frames.

Fig. 2. Pipeline of our calibration of the camera, IMU and robotic-arm system. It consists of three parts: IMU-manipulator calibration, camera-IMU calibration, and camera-manipulator calibration. Afterwards, the calibration parameters are fed into the convergence model for a sanity check.

    i) A monocular visual-inertial and robotic-arm calibration model is developed in a unifying framework;

ii) We design a spiral moving trajectory that excites yaw-pitch-roll rotations and x-y-z translations evenly, uniformly and repeatably. The unexpected moving jitters and disturbances could be alleviated.

iii) The proposed method has been evaluated on our developed platform. Repeatability tests, systematic analysis, and comparisons with state-of-the-art results have been extensively performed to prove the effectiveness of our method.

The rest of the paper is organized as follows. In Section II, the related works on visual-inertial-robot calibrations are reviewed. In Section III, preliminaries on variable notation and the IMU model are briefly introduced. In Section IV, our proposed method is described in detail. In Section V, the developed platform is briefly introduced, and the calibrations are extensively analyzed and compared with state-of-the-art results. Section VI concludes the paper.

    II. RELATED WORK

    A. Camera-IMU Calibration

By and large, camera-IMU calibrations can be classified into two types, i.e., online and offline. Online calibration aims at correlating the spatial relationship between visual-inertial sensors right before use. The calibration is usually performed in a plug-and-play manner. In [27], Yang et al. analyzed the observability of the spatial and temporal calibration parameters of visual-inertial sensors and verified that the calibrations are observable for random motions but not for degenerate motions (e.g., planar motion), which might result in calibration failures. Similarly, Kelly and Sukhatme showed in [19] that the observability of the visual-inertial transformation requires the platform to undergo both accelerations and rotations in more than two IMU axes. Moreover, [28] introduced a closed-form method to estimate visual-inertial orientation, speeds, scales and biases. In [29], Zheng et al. estimated the calibration parameters within a filtering framework for ground-vehicle VIO. In [26], [30], Yang et al. proposed to calculate the camera-IMU time offset, relative transformation and inertial biases in the VINS framework. Basically, online calibration fits plug-and-play scenarios such as unmanned aerial vehicle (UAV) odometry.

Offline calibration calibrates the camera and IMU sensor suite in advance. It needs a larger number of calibration datasets and achieves more accurate and reliable calibrations. In [31], Lobo and Dias estimated the rotation and translation between camera and IMU in a separate manner. The rotation is first calculated by aligning the visual sensor orientation with the gravitational components (obtained from the inertial accelerations). The translation is computed by using the encoder and turntable. However, this requires precisely placing the IMU right at the center and the camera at the rim of the turntable, which demands expertise and patience to follow the procedures. In [32], the rotation is estimated in the form of quaternions. Afterwards, slerping is applied to synchronize the visual-inertial measurements. The rotation matrix is computed by aligning the quaternion rotation axes, leaving aside the angle part of the derived quaternion. In this way, the noise in the rotation-angle components of the quaternions can be suppressed. In [33], Furgale et al. proposed to estimate both the temporal offset and the spatial transformation in a joint framework. It minimizes error terms that relate to tracked feature reprojections, inertial accelerations, velocities and biases. Generally, offline calibration is more accurate than online calibration in that the moving trajectories can be repeated and estimation inconsistency can be alleviated. Additionally, the camera and IMU measurements lie in a high-dimensional and sparse space [34], from which nonnegative latent useful information can be further extracted via the NLF model in the offline mode, as extensively analyzed by Luo et al. in [35], [36].

    B. Camera and Robotic-Arm Calibration

In camera and robotic-arm calibration, the rotation and translation between the camera frame and the robotic-arm frame are computed by tracking calibration-board features from different viewpoints. Afterwards, reprojection minimization is implemented to align the adjoint attitudes [37]. Camera and robotic-arm calibrations are also categorized into two types, i.e., the eye-in-hand mode and the eye-to-hand mode.

    TABLE I NOMENCLATURE

In [38], [39], Li et al. presented a comprehensive overview of calibration for industrial robots and dual arms. By taking measurement noise and external constraints into the robot kinematic model, the calibration accuracy can be boosted. In the eye-in-hand type, the camera is rigidly mounted to the robotic arm and the calibration board moves within the camera's captured views. In [40], the authors proposed a unified mathematical formulation using Procrustes analysis. In [41], the authors introduced an eye-in-hand calibration for a scanner robotic system: by cyclically capturing images of a 2-D calibration object, the kinematic model parameters are obtained. In the eye-to-hand calibration type, the camera is fixed in a global-view fashion; the robot holds the checkerboard and moves within the camera's capturing view. In [42], Koide and Menegatti proposed to directly incorporate images with calibration patterns into a pose-graph optimization model. In [43], the authors presented a calibration that minimizes two kinematic loop errors. By and large, the eye-in-hand type is preferable because of its flexibility and first-person view.

    III. PRELIMINARIES

    A. Variable Notations

The variables used in the paper are listed in Table I. The coordinate frames used in this paper include the world frame {W}, camera frame {C}, IMU frame {I} (also known as body frame {B}), and robot frame {R}. The transformation from the camera frame {C} to the IMU frame {I} is described by a 4 × 4 transformation matrix in its homogeneous form, which is given by

    B. IMU Model

The IMU consists of a 3-axis accelerometer and a 3-axis gyroscope [44]. The accelerometer measures both local gravitational and linear accelerations. These measurements are usually corrupted by sensor noise and non-linear bias variations with temperature. Unfortunately, these errors accumulate and lead to significant drift in the position, attitude, and velocity outputs. Fortunately, the inertial errors can be considerably compensated by the complementary visual sensing unit.

In this work, the acceleration model is given by [45], where a^W denotes the linear acceleration in the world frame and ã denotes the sampled body acceleration in the body frame, which is coupled with gravitational components and noise. g^W is the gravitational vector in the world frame, and R_WB is the rotation matrix from the world frame to the body frame. b_a denotes the bias, which is caused by low-frequency offsets and changes slowly over time; it usually causes the majority of the error when double-integrating the accelerations. In this work, b_a is modeled as a random-walk process [46]. n_α is the acceleration noise (usually taken as white Gaussian noise).

    Similarly, the angular rate measurement model is
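The display equations themselves did not survive extraction; under the conventions described in the prose (R_WB mapping world-frame vectors into the body frame, g^W the world-frame gravity, n_α white Gaussian noise), the two measurement models take the standard form below. The gyroscope bias b_g and noise n_g names are our reconstruction, and the sign of the gravity term varies between references:

```latex
\tilde{a} = R_{WB}\left(a^{W} - g^{W}\right) + b_{a} + n_{\alpha},
\qquad
\tilde{\omega} = \omega^{B} + b_{g} + n_{g}
```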

    IV. METHOD

In this section, our method is described in detail. As illustrated in Fig. 2, it consists of three modules: camera-IMU calibration, camera-robot calibration, and IMU-robot calibration. The camera-IMU calibration includes the computation of the IMU biases, coordinate rotations and translations. The IMU biases are then fed into the IMU-robot calibration module to derive the corresponding IMU-robot frame transformations. Meanwhile, camera and robot calibration is performed using hand-eye calibration. The calibration completes when the triplet satisfies the convergence condition. The details are described in the following subsections.

    A. Camera-IMU Calibration

In this work, camera-IMU calibration is divided into two parts, i.e., IMU preintegration and camera-IMU intrinsic & extrinsic calibration.

1) Inertial Measurement Preintegration: In order to align the transformation between camera and IMU, it is crucial to implement IMU preintegration [5]. Given time instants i and j between successive camera frames, the body orientation, velocity and position can be preintegrated from the collected inertial measurements. They are given by

IMU measurements can be preintegrated so that they relate only to the visual frames at times t_i and t_j. In (5a), the noise term can be isolated and approximated by its first-order Jacobian, given as

    Similarly, the increment in velocity could also be expressed by its first-order approximation given as
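As a concrete illustration of the preintegration step, the orientation, velocity and position increments between two camera frames can be accumulated from the raw IMU samples. This is a minimal Euler-scheme sketch, not the paper's exact on-manifold formulation; the function and variable names are ours:

```python
import numpy as np

def so3_exp(phi):
    """Rodrigues' formula: map a rotation vector to a rotation matrix."""
    angle = np.linalg.norm(phi)
    if angle < 1e-10:
        return np.eye(3)
    axis = phi / angle
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def preintegrate(accels, gyros, dt, ba, bg):
    """Accumulate orientation, velocity and position increments between
    two camera frames from raw IMU samples (simple Euler integration)."""
    dR = np.eye(3)           # orientation increment
    dv = np.zeros(3)         # velocity increment
    dp = np.zeros(3)         # position increment
    for a, w in zip(accels, gyros):
        a = a - ba           # remove accelerometer bias
        w = w - bg           # remove gyroscope bias
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt**2
        dv = dv + (dR @ a) * dt
        dR = dR @ so3_exp(w * dt)
    return dR, dv, dp
```

When a bias estimate is updated later in the pipeline, the increments can be re-propagated (or corrected to first order), which is exactly why preintegration keeps the IMU terms independent of the absolute states.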

2) Camera-IMU Extrinsic and Intrinsic Calibration: In this work, we assume that the gyroscopic and acceleration biases remain constant. The biases can be calculated by iteratively minimizing the difference between the IMU rotation and the camera rotation. During the iteration, the estimated rotation is used to update the angular velocity bias.

a) Angular velocity bias estimation

The rotation of the camera frame between times t and t+1 can be given by

By combining (4a), (6), and (11), the relationship between the camera rotation and the IMU preintegration over the time period t to t+1 can be given by

Equation (13) can be solved using Levenberg-Marquardt nonlinear optimization. The preintegrated IMU rotations are updated afterwards.

b) Camera-IMU rotation estimation

The rotation can be computed by aligning the camera rotation between frames t and t+1 with the preintegrated IMU rotations. Assume the changes of attitude for the camera frame and the IMU frame are q_C and q_I, respectively, over the time interval [t, t+1]. q_C and q_I represent the same rotation, though in different coordinate reference systems. The quaternion q_IC that converts the camera reference to the IMU reference satisfies

where (·)* denotes quaternion conjugation.

Then the optimal quaternion will be obtained by maximizing the following equation:
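The rotation-alignment idea can be sketched numerically. Instead of the quaternion maximization above, this illustration uses the closely related Kabsch/SVD alignment of the rotation axes of paired camera and IMU relative rotations (same underlying principle: both sensors observe the same rotation, expressed in different frames; function names are ours):

```python
import numpy as np

def rotation_axis(R):
    """Unit rotation axis of a rotation matrix (angle in (0, pi) assumed),
    read off from the skew-symmetric part of R."""
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return v / np.linalg.norm(v)

def align_axes(cam_rots, imu_rots):
    """Solve the fixed rotation R satisfying axis_imu ≈ R @ axis_cam
    for every pair of relative rotations, via the Kabsch/SVD method."""
    A = np.stack([rotation_axis(R) for R in cam_rots])   # N x 3 camera axes
    B = np.stack([rotation_axis(R) for R in imu_rots])   # N x 3 IMU axes
    H = A.T @ B
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T
```

Aligning only the axes (and discarding the rotation angles) mirrors the motivation quoted from [32] in Section II: the angle components carry most of the quaternion noise.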

c) Accelerometer bias estimation

    By combining (4b), (4c), and (11), the relationship between camera translation and inertial preintegrated translation during time interval [i,j] is

The optimal accelerometer bias can be obtained by minimizing the following function:

d) Camera-IMU translation estimation

    B. Camera and Robotic-Arm Calibration

In this work, the camera and IMU are rigidly attached to the robotic wrist (the 6th joint of the robotic arm). The camera and robotic-arm calibration is expressed by the transformation matrix in its homogeneous form.

    C. IMU and Robotic-Arm Calibration

IMU and robotic-arm calibration can be formulated through the homogeneous transformation X_RI [48]. The robotic wrist moves in a predefined trajectory ({R1 → R2 → R3 ···}, seen as the wrist pose sequence). Meanwhile, the IMU follows a similar path ({I1 → I2 → I3 ···}, also seen as the IMU pose sequence).

    Similarly to (19), IMU and robotic wrist poses satisfy the following equation:

In a similar manner, the relative rotation R_RI and translation t_RI between the IMU frame and the robotic wrist frame can be derived by solving the following equations:
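Once the rotation part of such a hand-eye problem A_i X = X B_i is available (e.g., from an axis-alignment step), each relative-pose pair yields a linear constraint (R_{A_i} − I) t = R_X t_{B_i} − t_{A_i} on the translation, which stacks into an ordinary least-squares problem. A sketch under these assumptions (not the paper's exact solver):

```python
import numpy as np

def solve_translation(As, Bs, Rx):
    """Least-squares translation t of X for A_i X = X B_i, given the
    rotation Rx of X. As, Bs are lists of 4x4 relative-pose matrices."""
    M, v = [], []
    for A, B in zip(As, Bs):
        Ra, ta = A[:3, :3], A[:3, 3]
        tb = B[:3, 3]
        M.append(Ra - np.eye(3))     # stack (R_A - I) blocks
        v.append(Rx @ tb - ta)       # stack right-hand sides
    tx, *_ = np.linalg.lstsq(np.vstack(M), np.hstack(v), rcond=None)
    return tx
```

At least two pose pairs with non-parallel rotation axes are needed for the stacked system to be full rank, which is one reason the designed trajectory must excite all rotation axes.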

    D. Calibration Convergence Criteria

In our model, there exists an inherent geometric constraint between the robotic wrist frame {R}, IMU frame {I} and camera frame {C}. Theoretically, the transformation matrices T_RC, T_RI, and T_IC satisfy the following geometric relationship:
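This loop-closure constraint lends itself to a simple numerical sanity check on the three estimated transforms. The tolerances below are our own illustrative choices, loosely matching the error levels reported in the experiments:

```python
import numpy as np

def calibration_consistent(T_RC, T_RI, T_IC, rot_tol_deg=0.7, trans_tol=0.01):
    """Check the loop closure T_RC ≈ T_RI @ T_IC. Returns True when the
    residual rotation angle and translation are within tolerance."""
    E = np.linalg.inv(T_RI @ T_IC) @ T_RC          # residual transform
    # rotation residual angle from the trace of the 3x3 error rotation
    cos_angle = np.clip((np.trace(E[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    rot_err_deg = np.degrees(np.arccos(cos_angle))
    trans_err = np.linalg.norm(E[:3, 3])
    return rot_err_deg <= rot_tol_deg and trans_err <= trans_tol
```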

    E. Robotic Arm Calibration

Because of the structural parameter errors of the robot and robotic dynamics effects, the actual running trajectory of the robot may deviate from the programmed values. However, the errors caused by these issues can be compensated by calibrating the kinematic model together with the robot dynamics factors (e.g., centrifugal force, Coriolis force, dynamic coupling [38]). The robotic-arm calibration model is briefly introduced in this section.

The robotic arm can be described as a multi-link mechanism connected by joints [49]. Its kinematics can be formulated by the D-H model, which is described as

where T_{i−1,i} symbolizes the transformation matrix between the successive links i−1 and i, and cθ and sθ denote the trigonometric functions cos θ and sin θ, respectively. The parameter group {θ_i, α_i, d_i, a_i} contains the variables associated with joint i and link i, namely the joint angle, link twist, link offset, and link length, respectively. The link between successive joints can be fully described by the D-H model.
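The classic D-H link transform can be written out explicitly. A minimal sketch (the row layout follows the standard D-H convention, matching the parameter meanings above):

```python
import numpy as np

def dh_transform(theta, alpha, d, a):
    """Homogeneous transform between links i-1 and i (classic D-H)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_params):
    """Chain the per-joint transforms for (theta, alpha, d, a) rows."""
    T = np.eye(4)
    for row in dh_params:
        T = T @ dh_transform(*row)
    return T
```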

    Additionally, the moving trajectory could also be affected by the robot dynamics. Its dynamic model could be described in the form of Lagrangian equation as follows:

where τ symbolizes the joint torque and θ the joint angle; M(θ) is the inertia matrix; C(θ, θ̇)θ̇ stands for the Coriolis and centrifugal forces; g(θ) is the gravity term.

In this work, the least-squares method is used to calibrate the robot. It computes the inherent geometric parameters (i.e., link parameter errors, joint rotation errors) by minimizing the errors between the theoretical data and the measurements. In this paper, a 3-D position-measuring system with a drawstring displacement sensor configuration is adopted. The robot kinematic errors are then calculated by minimizing the loss function of the errors between the actual end-effector position and its prediction from the D-H model in (34). Please refer to [50] for more details of the calibration implementation.
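The least-squares identification step can be sketched generically: linearize the position model around the nominal parameters with a finite-difference Jacobian and iterate Gauss-Newton updates. The model function and parameter choice here are illustrative, not the paper's exact implementation:

```python
import numpy as np

def numerical_jacobian(f, phi, eps=1e-6):
    """Finite-difference Jacobian of a position model f(phi)."""
    p0 = f(phi)
    J = np.zeros((p0.size, phi.size))
    for k in range(phi.size):
        dphi = phi.copy()
        dphi[k] += eps
        J[:, k] = (f(dphi) - p0) / eps
    return J

def calibrate(f, phi_nominal, measured, iters=5):
    """Gauss-Newton refinement of kinematic parameters phi so that the
    model positions f(phi) match the measured end-effector positions."""
    phi = np.asarray(phi_nominal, dtype=float).copy()
    for _ in range(iters):
        J = numerical_jacobian(f, phi)
        step, *_ = np.linalg.lstsq(J, measured - f(phi), rcond=None)
        phi += step
    return phi
```

For observability, f should stack predicted positions over several distinct joint configurations, exactly as the drawstring measurement campaign collects multiple end-effector positions.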

    V. EXPERIMENTAL RESULTS AND ANALYSIS

    A. Experimental Platform

In our developed system (shown in Fig. 3), the robotic arm is a Kinova Jaco 2 with 6 rotational joints. The total weight of the robot is 5.2 kg. Its maximum reach is 98 cm and its maximum linear arm speed is 20 cm/s. Its communication protocol is RS-485 over a USB 2.0 port. Commands are issued from a Linux Ubuntu system running ROS. Its maximum payload is 2.2 kg.

Fig. 3. The experimental platform consists of a camera, an IMU, and a robotic arm. An Intel D435i (integrating a monocular camera and IMU) is rigidly attached to the 6th joint of the robotic arm (Kinova Jaco 2). It is programmed to move in a predefined pattern to capture the calibration board and body inertial measurements.

Fig. 4. Designed robotic-arm moving trajectory. It moves in a spiral pattern. The x-y-z translations and yaw-pitch-roll rotations are uniformly excited.

The Intel D435i, integrated with a camera and a Bosch IMU, is fixed to the 6th joint. The monocular camera samples images at 20 fps with a resolution of 640×480. The camera intrinsic parameters are calibrated with a standard deviation of less than 0.1 pixel. The IMU samples inertial measurements at 200 Hz. The inertial noise terms are estimated using Allan variance analysis [45]. The acceleration noise and its bias noise are 0.00014 m/s² and 0.00004 m/s³, respectively; the gyroscopic noise and its bias noise are 0.00035 rad/s and 0.000055 rad/s², respectively. The image and inertial data, together with the robot poses, are synchronized by timestamps. The calibration parameters are calculated on a laptop with a 2.4 GHz Intel Core i7 and 8 GB RAM.

    B. Repeatability Test

Since the poses read from the robot arm are used to define the trajectory for calibration consistency, we first test the robot's moving repeatability. The robot is programmed to follow a predefined trajectory and is expected to move to and return from the same set of fixed points. The positions and orientations read from the robot are also expected to be precise. The robotic arm is tested by automatically moving its end-effector to a set of specified spots on a calibration table. A 10×10 grid of chessboard points is used; each point is a 1×1 millimeter square. The robot traverses the chessboard points cyclically more than 100 times. It was found that the robot arm always reaches the grid points with a position error of less than 0.1 mm and a rotation error of less than 0.2°. Hence, the robot's moving repeatability satisfies our requirement.

    C. Designed Trajectory

In the test, we use an Apriltag board [51] as the calibration target. The robotic arm, with the joint-mounted monocular camera and IMU, moves in front of the calibration board to capture the Apriltag patterns in the image sequence. Notably, the moving trajectory proceeds in a spiral fashion, as shown in Fig. 4. The trajectory projections on the x-y, x-z, and y-z planes are shown in Figs. 5, 6, and 7, respectively. This type of trajectory is intentionally designed to fully and uniformly excite the yaw-pitch-roll rotations and x-y-z translations. Besides, to ensure that the calibration tag patterns are captured in the image sequence, the poses are designed so that the camera lens (optical axis denoted by the blue axis in Fig. 4) points toward the calibration board.

    Fig. 5. Manipulator trajectory projected onto x-y plane.

    Fig. 6. Manipulator trajectory projected onto x-z plane.

    Fig. 7. Manipulator trajectory projected onto y-z plane.

    The robot spiral moving pattern is

where s_x = 0.3, s_y = −2, and s_z = 0.3 denote the moving scales along the axes; ω = 1 is the angular rate; θ = 1, ζ = 80, and φ = 1 are the initial phases.

The corresponding yaw-pitch-roll angles are parameterized by

where ξ_y = 0.001, ξ_p = 0.01, and ξ_r = 0.001 are the angular scales on yaw, pitch, and roll, respectively; α = 2, β = −1, and γ = 1 are the initial angular phases.
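Since the exact trajectory expressions did not survive extraction, the following sketch only reconstructs a plausible spiral from the translation ranges reported for the designed trajectory (x and z within ±12.85 cm, y sweeping between 17.2 cm and 80 cm). The functional form and parameter mapping are our assumptions, not the paper's equations:

```python
import numpy as np

def spiral_trajectory(n_poses=80, sx=0.1285, sz=0.1285,
                      y_min=0.172, y_max=0.80, cycles=3):
    """Hypothetical spiral: x and z oscillate sinusoidally (90 degrees out
    of phase, so the x-z projection traces a circle) while y sweeps out
    toward the board and back, giving the spiral its advance."""
    t = np.linspace(0.0, 2.0 * np.pi * cycles, n_poses)
    x = sx * np.sin(t)
    z = sz * np.cos(t)
    y = y_min + 0.5 * (y_max - y_min) * (1.0 - np.cos(t / cycles))
    return np.stack([x, y, z], axis=1)
```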

In robot Cartesian mode, there are some configurations in which the robot loses one or more degrees of freedom (i.e., the robot is unable to move in one direction or another). In order to maximally reduce the odds of singularity, we used the ActivateAutomaticSingularityAvoidance API function and set it to false. However, it should be noted that the designed moving trajectory might still be affected by singular positions inside the robot workspace when calculating its inverse kinematics using ROS MoveIt [52]. In the presence of a singularity, the robotic arm will move arbitrarily or stay still (for instance, when the arm is at full reach, it cannot move any further in the direction it is currently reaching), rather than proceed along the predefined path. To solve this issue, we tried several initial poses and enabled the robot arm to move along the predefined spiral trajectory. Afterwards, the initial pose and its corresponding trajectory without any singularities during the moving process were selected. Additionally, the joint angle series are computed in advance, which saves the computational time of deriving the inverse kinematics during the movement. In total, there are 80 poses during the manipulator movement. The robot followed the spiral trajectory and the whole movement process lasted about 2 minutes. It was repeated 20 times for the spiral curve trajectory. The computational time for each trajectory was approximately 10 minutes.

The translations of the designed trajectory are shown in Fig. 8. The translations on the x and z axes (scale: [−12.85 cm, 12.85 cm]) proceed in a sinusoidal form. The translation on the y axis proceeds in a dog-leg path (scale: [17.20 cm, 80.00 cm]), moving back and forth a few times.

The angular curves of the moving trajectory are shown in Fig. 9. To ensure that the camera y-axis points approximately toward the calibration board, the roll (angle scale: [0°, 17.76°]) and yaw (angle scale: [28.71°, 46.47°]) are designed to move over relatively small ranges. By comparison, the pitch (angle scale: [2.23°, 233.14°]) about the y-axis changes in a dog-leg style over a large range.

Fig. 8. Trajectory curves with respect to x, y, and z.

    Fig. 9. Trajectory curves with respect to yaw-pitch-roll.

    D. Results and Analysis

In our experiments, more than twenty groups of visual-inertial and robot pose data were collected. Each test lasted about 60 seconds. The acceleration biases are [0.105, −0.131, −0.065] m/s²; the gyroscopic biases are [−0.0068, −0.0008, 0.0014] rad/s.

Our method has been compared with the state-of-the-art, i.e., ICSR [53] and Kalibr [25]. The designed trajectories for the traditional calibrations include the typical motions, e.g., circular motion, zig-zag motion, rosette motion, and irregular rotations & translations that excite 6-DOF motions. The tests have been repeated more than 20 times.

In Fig. 10, it can be observed that ours achieves more stable and consistent outcomes (clearly smaller standard deviations on the x-y-z translations and yaw-pitch-roll rotations) than the other two methods, partly because the geometric constraint among the camera, IMU and robotic arm is added to the calibration model. Also, the robotic arm moves in a predefined spiral fashion, which improves moving consistency, accuracy, and stability.

    Fig. 10. Comparisons on calibration methods using Ours, ICSR [53], and Kalibr [25]. It includes the comparisons on rotation: yaw, pitch, roll; and translations: x, y, z.

    Fig. 11. Comparisons on image reprojection error between (a) Ours, (b) ICSR [53], and (c) Kalibr [25].

    TABLE II COMPARISONS OF CALIBRATION METHODS BETWEEN OURS, ICSR, AND KALIBR

The corresponding average and root mean squared errors (RMSE) of the translations and rotations are shown in Table II. The Stds on yaw, pitch, and roll are 0.494°, 0.646°, and 0.621°; the translation Stds on x, y, and z are 0.004 m, 0.009 m, and 0.012 m, respectively. From the results, we can see that ours achieves competitive results compared with the other two methods in terms of both average and RMSE.

The reprojection errors are also analyzed and compared. As shown in Fig. 11(a), most of our reprojection errors fall within a circle of less than 0.25 pixel. By comparison, the convergence areas using ICSR [53] (Fig. 11(b)) and Kalibr [25] (Fig. 11(c)) are relatively larger, at approximately 0.5 pixel.

    Fig. 12. Comparisons on acceleration errors and angular velocity errors using ours, ICSR, and Kalibr. (a)–(c) Acceleration errors in 2d; (d)–(f) Gyroscopic errors in 2d; (g)–(i) Acceleration errors in 3d; (j)–(l) Gyroscopic errors in 3d.

We have also compared the acceleration errors and angular velocity errors in both 2-D and 3-D forms. As can be seen in Figs. 12(a)–12(f), the acceleration and angular velocity errors are all within a small range. By contrast, there exist severe error jitters for both ICSR and Kalibr, which can be attributed to unexpected disturbances and manual jitters during the movement. In Figs. 12(g)–12(i), the acceleration errors are plotted in 3-D form. The acceleration convergence sphere radius using our method is approximately 0.429 m/s², while the other sphere radii are roughly 1 m/s². In a similar manner, the gyroscopic errors are plotted in Figs. 12(j)–12(l); the convergence area radius of our method is 0.08 rad/s, while the radii for ICSR and Kalibr are 0.107 rad/s and 0.114 rad/s, respectively. From the results, it can be observed that the IMU errors are smaller than those of the state-of-the-art, thanks to the spiral moving trajectory design and the decoupled estimation of translation & rotation.

We have also compared the computational time between ours, ICSR [53], and Kalibr [25] to evaluate the efficiency of each method. As can be observed in Table III, the spiral moving trajectory has been performed two times. The dataset of the first test has approximately 2400 images and its capturing sequence lasts about 120 seconds; the dataset of the second test has approximately 3600 images and its capturing sequence lasts about 180 seconds. The symbol “*” represents IMU preintegration. From the results, it can be seen that ours with IMU preintegration takes 1232 seconds (the least amount of time), which can be largely attributed to the inertial preintegration strategy. IMU preintegration saves time in the iterations of the quaternions, which are fed into the velocity and position models. By comparison, ours without preintegration consumes a longer time in deriving the calibration parameters. It can be observed that ICSR and ours show comparable computational times, but ours achieves higher calibration accuracy. Noticeably, Kalibr requires a large amount of time, especially for the larger dataset, which is due to the minimization over the bundled volume of states being processed.

TABLE III COMPARISONS ON THE COMPUTATIONAL TIME BETWEEN OURS (WITH IMU PREINTEGRATION), OURS (WITHOUT IMU PREINTEGRATION), ICSR [53], AND KALIBR [25] USING THE SPIRAL MOVING TRAJECTORY. THE 1ST TEST SET HAS APPROXIMATELY 2400 IMAGES AND THE CAPTURING SEQUENCE LASTS ABOUT 120 SECONDS. THE 2ND TEST SET HAS APPROXIMATELY 3600 IMAGES AND THE CAPTURING SEQUENCE LASTS ABOUT 180 SECONDS. “*” MEANS THE IMU PREINTEGRATION

    VI. CONCLUSION AND FUTURE WORK

In this paper, we have developed a unifying monocular visual-inertial and robotic-arm calibration framework. It geometrically correlates the spatial relationship among the sensing units and the robotic arm. Besides, we have designed the calibration moving trajectory in a spiral pattern. Through this design, the excitation of yaw-pitch-roll rotations and x-y-z translations can be performed uniformly and consistently. The performance of the calibration has been evaluated on our developed platform. In the experiments, the standard deviations on rotations and translations are less than 0.7° and 0.012 m, respectively, which proves its advantages in visual-inertial-robot calibrations.

One drawback of our current calibration method is the lack of systematic comparisons on the typical trajectories, such as the zig-zag, circular, radial, and rosette trajectories. Thus, in the future, we plan to perform more tests and analysis on these trajectories. Besides, we will perform the robot calibration based on [38] to check the repeatability and accuracy of the manipulator qualitatively. Eventually, we plan to design a trajectory that avoids singularities in the robot Cartesian space, since the robot experiences inevitable singularities during the moving process.

亚洲av二区三区四区| 国产高清有码在线观看视频| 欧美日韩黄片免| 色视频www国产| 日韩精品青青久久久久久| 人妻久久中文字幕网| 欧美黄色片欧美黄色片| 国产探花在线观看一区二区| 亚洲av第一区精品v没综合| 日本成人三级电影网站| 国产精华一区二区三区| 欧美日本亚洲视频在线播放| 级片在线观看| 精品午夜福利在线看| 免费av观看视频| 免费人成视频x8x8入口观看| 日本成人三级电影网站| 一级黄片播放器| 婷婷六月久久综合丁香| 久久人妻av系列| 国产精品自产拍在线观看55亚洲| 黄色一级大片看看| 欧美成人一区二区免费高清观看| 此物有八面人人有两片| 日本三级黄在线观看| 久久精品国产清高在天天线| 日韩欧美在线乱码| 色精品久久人妻99蜜桃| 一级黄色大片毛片| 老司机午夜十八禁免费视频| 欧美bdsm另类| 中文字幕人妻熟人妻熟丝袜美| 久久精品综合一区二区三区| 人妻制服诱惑在线中文字幕| 欧美zozozo另类| 国内精品美女久久久久久| 色视频www国产| 欧美在线黄色| 国产精品电影一区二区三区| 亚洲欧美日韩卡通动漫| 又紧又爽又黄一区二区| 亚洲欧美激情综合另类| 久久精品国产自在天天线| 成熟少妇高潮喷水视频| 亚洲av成人av| 日韩精品中文字幕看吧| 亚洲精品久久国产高清桃花| 久久欧美精品欧美久久欧美| 丰满人妻熟妇乱又伦精品不卡| 亚洲自偷自拍三级| 天天一区二区日本电影三级| 精品一区二区三区视频在线| 日韩欧美精品v在线| 亚洲不卡免费看| 丰满乱子伦码专区| 波野结衣二区三区在线| 欧美成狂野欧美在线观看| 亚洲欧美日韩无卡精品| 十八禁国产超污无遮挡网站| 国产成人福利小说| 亚洲av二区三区四区| 精品无人区乱码1区二区| 美女大奶头视频| 又黄又爽又刺激的免费视频.| 他把我摸到了高潮在线观看| 久久人妻av系列| 午夜亚洲福利在线播放| 国产极品精品免费视频能看的| 欧美3d第一页| 国产主播在线观看一区二区| 波多野结衣巨乳人妻| 成人无遮挡网站| 国产精品1区2区在线观看.| 日韩高清综合在线| 国语自产精品视频在线第100页| 狠狠狠狠99中文字幕| 1024手机看黄色片| 757午夜福利合集在线观看| 日韩高清综合在线| 国产亚洲精品久久久久久毛片| 国产亚洲精品久久久com| 在线观看66精品国产| 禁无遮挡网站| 国产精品1区2区在线观看.| 18美女黄网站色大片免费观看| 在线观看免费视频日本深夜| 久久久久久久精品吃奶| 全区人妻精品视频| 亚洲成人久久爱视频| 自拍偷自拍亚洲精品老妇| 午夜福利在线观看吧| 国产精品国产高清国产av| 91久久精品国产一区二区成人| 欧美激情国产日韩精品一区| 99国产综合亚洲精品| 中文资源天堂在线| 制服丝袜大香蕉在线| 久久香蕉精品热| 一边摸一边抽搐一进一小说| 中文亚洲av片在线观看爽| 亚洲人与动物交配视频| 91字幕亚洲| 午夜福利成人在线免费观看| 精品一区二区免费观看| 又爽又黄a免费视频| 精品久久久久久久久av| 熟妇人妻久久中文字幕3abv| av视频在线观看入口| 精品一区二区三区人妻视频| 国内精品久久久久久久电影| av中文乱码字幕在线| 日韩欧美在线二视频| 深夜a级毛片| 精品久久久久久久久久久久久| 九色成人免费人妻av| 麻豆一二三区av精品| 观看美女的网站| 在线观看66精品国产| 俺也久久电影网| 亚洲精品亚洲一区二区| 欧美xxxx性猛交bbbb| 一区福利在线观看| 人妻丰满熟妇av一区二区三区| 日韩欧美三级三区| 9191精品国产免费久久| 亚洲aⅴ乱码一区二区在线播放| 日韩精品中文字幕看吧| 18禁裸乳无遮挡免费网站照片| 国产精品1区2区在线观看.| 能在线免费观看的黄片| 91麻豆精品激情在线观看国产| 亚洲中文日韩欧美视频| 一本一本综合久久|