
    Camera calibration method for an infrared horizon sensor with a large field of view

2023-02-06 09:44:18

Huajian DENG, Hao WANG, Xiaoya HAN, Yang LIU, Zhonghe JIN

1 Micro-Satellite Research Center, Zhejiang University, Hangzhou 310027, China

2 Zhejiang Key Laboratory of Micro-nano Satellite Research, Hangzhou 310027, China

3 Beijing Institute of Tracking and Telecommunications Technology, Beijing 100094, China

Abstract: Inadequate geometric accuracy of cameras is the main constraint on improving the precision of infrared horizon sensors with a large field of view (FOV). An enormous FOV with a blind area in the center greatly limits the accuracy and feasibility of traditional geometric calibration methods. A novel camera calibration method for infrared horizon sensors is presented and validated in this paper. Three infrared targets are used as control points. The camera is mounted on a rotary table; as the table rotates, these control points become evenly distributed across the entire FOV. Compared with traditional methods that combine a collimator and a rotary table, which cannot effectively cover a large FOV and require demanding experimental equipment, this method is easier to implement at a low cost. A corresponding three-step parameter estimation algorithm is proposed to avoid precisely measuring the positions of the camera and the control points. Experiments were implemented with 10 infrared horizon sensors to verify the effectiveness of the calibration method. The results show that the proposed method is highly stable, and that its calibration accuracy is at least 30% higher than those of existing methods.

Key words: Infrared horizon sensor; Ultra-field infrared camera; Camera calibration

    1 Introduction

The infrared horizon sensor provides almost uninterrupted fine attitude knowledge at a relatively low cost (Mazzini, 2016; Nguyen et al., 2018; Modenini et al., 2020), and it is therefore widely used in various space missions (Deng et al., 2017; Gou and Cheng, 2018). To capture the full Earth in low orbit, an infrared horizon sensor equipped with an infrared panoramic annular lens (PAL) camera was designed and has been validated in orbit (Wang H et al., 2021). The PAL camera has a maximum field of view (FOV) of 180° and a ±30° circular blind area in the center (Niu et al., 2007). Its imaging diagram, when it is facing the Earth at an orbital altitude of 500 km, is shown in Fig. 1. Edge points of the Earth are extracted and reprojected to the camera frame to calculate the Earth center's direction. Apparently, the reprojection process depends on the geometric calibration accuracy of the infrared PAL camera, so a highly accurate camera calibration of the infrared horizon sensor is essential.

    Fig.1 Imaging diagram of the horizon sensor

Cameras with a super large FOV, including the PAL camera, are also called ultra-field cameras (Zhang S et al., 2020b). The most widely used calibration method for ultra-field infrared cameras is the infrared plane calibration board (IPCB), which uses IPCB pictures with different attitudes and positions to resolve the camera distortion parameters (Kannala and Brandt, 2006; Scaramuzza et al., 2006). Due to the specific detection range of infrared cameras, many types of IPCBs have been designed to provide higher-contrast infrared characteristics and achieve more accurate control point positioning (Sheng et al., 2010; Vidas et al., 2012; Dias et al., 2013; Zhang Y et al., 2013; Shibata et al., 2017; Usamentiaga et al., 2017; Li XY et al., 2019). Generally, the IPCB method works well for calibrating ultra-field infrared cameras (Chen et al., 2019; Wang ZA et al., 2020; Zhang S et al., 2020a), but several challenges remain when it is applied to the camera calibration of infrared horizon sensors.

Normally, an IPCB with huge chessboard squares is needed to deal with an ultra-field infrared camera of low resolution (Chen et al., 2019). The IPCB needs to surround the central area of the ultra-field infrared camera so that the control points can cover the camera's entire FOV. However, as shown in Fig. 1, the large blind area of the infrared PAL camera causes a discontinuity in the FOV, which makes it difficult to place the IPCB. In particular, it is almost impossible to place conventional IPCBs in the narrow areas above and below the circular blind area. In this case, the distortion parameters will fit only the image area covered by the control points rather than the entire FOV, which is unacceptable in our application.

Another standard calibration method for high-precision cameras uses a rotary table and a collimator, and is widely applied in star tracker calibration tasks (Liebe, 2002; Sun et al., 2013; Zhang H et al., 2017; Fan et al., 2020). Star trackers, which are the most accurate attitude measurement devices on satellites, are equipped with a narrow-field visible light camera. Wei et al. (2014) proposed a calibration method based on integrated modeling of intrinsic and extrinsic parameters to calibrate star trackers. This method is insensitive to errors incurred in installation and alignment, and it is appealing for use in calibrating infrared horizon sensors. However, there are several problems in applying this method to infrared horizon sensors.

The camera should be covered by parallel light when the collimator method is applied. However, this is difficult to accomplish when calibrating cameras with a large FOV, because a small misalignment between the camera and the center of the rotary table may leave the camera outside the parallel light. Thus, the camera should be placed exactly at the center of the rotary table, or the infrared collimator should have a large enough aperture. Furthermore, the frame of the rotary table should not block the camera's FOV. These requirements are too strict to be practical. Therefore, in our work, instead of the traditional collimator method, we use multiple infrared targets. Because it is only necessary to ensure that the infrared targets are within the camera's FOV, the additional requirements for the experimental equipment no longer exist. The targets can easily cover the infrared horizon sensor's super-large FOV at a low cost. In addition, by using multiple targets, we are able to obtain enough data in less experimental time, which means that higher accuracy can be achieved efficiently.

A complete model of the camera's imaging process during rotation is established to obtain the required camera parameters. By considering the distance between the camera and the center of the rotary table, the proposed model can describe the effect of the camera's motion on the imaging process when rotating. The positions of the camera and the infrared targets are required as known parameters in our method, so they would normally have to be precisely measured beforehand. Because the measurement of these parameters is always troublesome, a parameter estimation algorithm can help simplify this operation.

There are many existing algorithms for solving similar parameter estimation problems in star tracker calibration tasks (Li YT et al., 2014; Wei et al., 2014; Zhang CF et al., 2018; Ye et al., 2019). These algorithms rely on the close connection between the ideal model and the distorted model of the star tracker's camera. However, the ideal model for ultra-field cameras varies according to their design, and the widely used distorted model (Scaramuzza et al., 2006) is not directly related to the ideal model, which greatly limits the universality of these algorithms. They must be adapted for use in camera calibration of infrared horizon sensors. Inspired by the existing two-step algorithm (Wei et al., 2014), which is the most widely used algorithm by far, a three-step parameter estimation algorithm is proposed to deal with the unknown parameters. The ideal imaging model and the distorted imaging model of the ultra-field camera are developed for preliminary parameter estimation and optimal parameter estimation, respectively. The connection between the two models is established using polynomial fitting. As a result, the required parameters can be estimated with high accuracy and robustness.

Compared with traditional camera calibration methods, the proposed method has fewer requirements for experimental equipment, wider applicability, and higher accuracy. It effectively solves the problem of camera calibration of infrared Earth sensors with a large FOV. Experimental results validate the excellent robustness, accuracy, and stability of the proposed method.

    2 Calibration method

    2.1 Platform setup

The infrared camera calibration system consists of a two-axis rotary table with a position accuracy of ±3″, a computer, the infrared cameras to be calibrated, and infrared targets. The setup of the calibration system is shown in Fig. 2. The infrared targets and the two-axis rotary table are placed on a stable, vibration-isolated platform. The infrared camera is mounted on the two-axis rotary table and points at the infrared targets. The two-axis rotary table drives the infrared camera to rotate to different angles, and the camera takes pictures of the infrared targets at each pose. The proposed method imposes no additional requirements on the structure of the rotary table: the camera can be mounted anywhere on the rotary table, and the targets can easily cover the camera's large FOV as the rotary table rotates.

    Fig.2 Setup of the infrared camera calibration system

    2.2 Comparison of calibration methods using a collimator and infrared targets

A comparison of the calibration methods using a collimator and infrared targets is shown in Fig. 3, revealing the superiority of the infrared target method. As shown in Fig. 2, the internal frame of the rotary table (P) is established with the rotation center (O) of the rotary table as the origin, the inner axis of the rotary table as the Z axis (Z), and the outer axis as the X axis (X). The Y axis (Y) of frame P is normal to the X-Z plane. Assume that the camera's optical center Oc is on the Z axis, and that the camera's optical axis is parallel to the Z axis. Let the rotary table rotate around the X axis at an angle of W1. To ensure that the FOV of the ultra-field infrared camera is not blocked by the rotary table itself, there may be a large distance between the optical center of the camera and the rotation center of the rotary table. Thus, as shown in Fig. 3, the position of Oc changes from Oc1 to Oc2 when the rotary table rotates.

In the collimator method, the angle change W2 measured by the camera is equal to the rotation angle W1 of the rotary table. As shown in Fig. 3, the aperture of the collimator should be large enough, and Oc and O should coincide as closely as possible; otherwise, the camera will leave the parallel light coverage. This places high demands on the experimental equipment and greatly limits the versatility of the collimator method. In the infrared target method, however, the angle change W3 of the infrared target measured by the camera is equal to W1 + W4, which is affected by the position change of Oc. Therefore, the infrared target method can use the camera's motion to achieve easier coverage of the camera's full FOV, rather than being disturbed by that motion as in the collimator method. In summary, the infrared target solution is less demanding on the experimental equipment, more widely adaptable, and particularly suitable for the calibration of ultra-field cameras.

Fig.3 Comparison of calibration methods using a collimator (top) and infrared targets (bottom)

    2.3 Data acquisition

The specific experimental procedure is as follows: First, the rotary table and infrared targets are adjusted so that the infrared targets are directly in front of the camera. Then the rotary table rotates 90° around the X axis in 5° intervals; at each rotation interval of the X axis, the rotary table rotates 360° around the Z axis in 10° intervals. The infrared camera takes pictures of the infrared targets at each pose. This process produces a large amount of experimental data, and a computer controls the rotary table and collects the infrared camera data in sequence, so that calibration is fully automatic. The workflow of the calibration system is shown in Fig. 4; this automation greatly improves the calibration efficiency. The final effective calibration data comprise about 260 images.

    Fig.4 Workflow of the infrared camera calibration system
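As a rough sketch, the rotation schedule described above can be enumerated as follows. The step sizes are taken from the text; treating every angle pair as a candidate pose is an assumption, since in practice only the images that actually contain the targets are kept (about 260 in total).

```python
# Candidate pose schedule: 0-90 deg about the X axis in 5 deg steps,
# a full turn about the Z axis in 10 deg steps (values from the text).
def pose_schedule(x_max=90, x_step=5, z_step=10):
    return [(wx, wz)
            for wx in range(0, x_max + 1, x_step)
            for wz in range(0, 360, z_step)]

poses = pose_schedule()
print(len(poses))  # 19 X angles x 36 Z angles = 684 candidate poses
```

Most of these candidate poses point the camera away from the targets, which is consistent with only about 260 images being effective.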

    2.4 Infrared targets

The actual image and the obtained infrared image of the infrared targets are shown in Fig. 5. The infrared targets consist of three circular ceramic heating plates with a diameter of 9 mm, which are low-cost off-the-shelf commodities. More data can be obtained in a shorter time by using multiple infrared targets. The heating plates are energized and hung in the air by their own power supply wires during calibration. They are placed on a vibration-free platform in an enclosed room to avoid vibration and interference, so the position accuracy of the targets during the rotation of the table can be ensured. After the plates reach thermal equilibrium, their surface temperature is maintained at 200-500 °C, depending on power consumption, and remains steady for several hours. The heating plates are made of multilayer alumina ceramics, which have a small thermal expansion coefficient and a high sintering temperature, so they do not deform and keep their shape during operation.

    2.5 Positioning of control points

The grayscale images of the 9-mm three-target combination taken by an infrared PAL camera at a distance of about 1 m are shown in Fig. 5b. Considering the symmetry of the circular heating plate, its energy distribution will also be symmetrical. The high contrast of the infrared targets ensures a high signal-to-noise ratio, and the energy distribution can be easily measured in the form of gray pixel values. Therefore, control point positioning based on the energy distribution is a good approach. The specific steps of the control point positioning algorithm are as follows:

Fig.5 Combination of three infrared targets with a diameter of 9 mm: (a) actual image; (b) infrared image

1. Separate the infrared targets and the background in the image based on a grayscale threshold T. Here, the average grayscale of the entire image plus 3.5 times its standard deviation is taken as T. The thresholded grayscale image is then obtained as follows:

F(u, v) = f(u, v) if f(u, v) > T, and F(u, v) = 0 otherwise,

where (u, v) denotes the coordinates of a pixel, f(u, v) denotes the original grayscale of the pixel, and F(u, v) denotes the thresholded grayscale of the pixel.
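A minimal sketch of this thresholding step, on a hypothetical 8×8 synthetic frame (`threshold_image` is our name, not from the paper):

```python
import numpy as np

def threshold_image(img):
    # T = mean of the whole image + 3.5 * its standard deviation.
    T = img.mean() + 3.5 * img.std()
    # Pixels above T keep their grayscale; the rest are set to zero,
    # so F(u, v) can later serve as the centroid weight.
    return np.where(img > T, img, 0.0), T

img = np.zeros((8, 8))
img[3:5, 3:5] = 100.0      # synthetic hot spot standing in for a target
F, T = threshold_image(img)
```

Keeping the grayscale values above T (rather than binarizing) matters because step 3 uses them as weights.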

2. Divide the image into connected components (four-connectivity) according to the thresholded grayscale. The infrared target areas are then identified according to the size and grayscale of the connected components, and images that do not contain the infrared targets are excluded.

3. The weighted centroid positioning algorithm is applied to complete the control point positioning (Stone, 1989):

uc = Σu Σv u·F(u, v) / Σu Σv F(u, v),  vc = Σu Σv v·F(u, v) / Σu Σv F(u, v),

with the sums taken over the n columns (u) and m rows (v) of the target region,

where m denotes the number of rows, n denotes the number of columns, and (uc, vc) denotes the control point position.
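The weighted centroid of step 3 can be sketched as follows, using the thresholded grayscale as the weight (the row/column-to-(u, v) convention here is an assumption):

```python
import numpy as np

def weighted_centroid(F):
    # v indexes the m rows, u indexes the n columns (assumed convention).
    v_idx, u_idx = np.indices(F.shape)
    total = F.sum()
    uc = (u_idx * F).sum() / total
    vc = (v_idx * F).sum() / total
    return uc, vc

F = np.zeros((7, 7))
F[2, 4] = 1.0
F[4, 4] = 1.0   # two equal weights -> centroid midway between them
```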

4. Match the actual targets to the image spots according to relative position features. Owing to the limitations of the ultra-field camera, tangential features distort noticeably in different regions of the image. Therefore, radial features, which are less prone to distortion, are used here. As shown in Figs. 2 and 5, the infrared targets are placed at different distances in the radial direction of the camera, with a margin between them. The positions of the infrared targets and the rotation angles of the rotary table are deliberately planned to ensure that the targets do not cross the central area during rotation. Once the control points are positioned, the radial distance between each control point and the image center is calculated. Control point identification and matching are completed according to this radial distance.
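Step 4 might be sketched as below. The spot coordinates, image center, and nominal per-target radii are made-up illustrative values, and sorting both lists by radius is one simple way to realize the radial-distance matching:

```python
import math

def match_by_radius(spots, center, nominal_radii):
    # Sort detected spots and targets by radial distance from the
    # image center, then pair them off in order.
    def radius(p):
        return math.hypot(p[0] - center[0], p[1] - center[1])
    spots_sorted = sorted(spots, key=radius)
    target_order = sorted(range(len(nominal_radii)),
                          key=lambda i: nominal_radii[i])
    return {t: s for t, s in zip(target_order, spots_sorted)}

spots = [(520, 240), (300, 260), (420, 250)]   # detected control points
matches = match_by_radius(spots, center=(320, 256),
                          nominal_radii=[30, 110, 205])
```

This ordering-based pairing works precisely because the targets are placed with a radial margin and never cross the central area during rotation.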

    3 Parameter estimation

    3.1 Integrated modeling

Camera calibration aims to obtain the intrinsic parameters of the camera. Before estimating the parameters, an accurate integrated model is needed to describe the whole imaging process. The imaging process of the infrared target method is different from that of the collimator method (Wei et al., 2014), because the motion of the camera has an impact on the imaging process. As coordinate transformation errors from different frames accumulate, their effect on the result can be large. Thus, a novel and more accurate calibration model is proposed. The distance between the camera and the center of the rotary table is considered, to describe the effect of the camera's motion on the imaging process when rotating. The infrared camera calibration system and the frames used in the following integrated modeling are shown in Fig. 6. The frames are defined as follows:

    Fig.6 The infrared camera calibration system

Frame P (defined in Section 2.2) rotates with the rotary table. When the rotary table is at its initial position, frame P defines the inertial coordinate frame (B). The origin of the camera coordinate frame (C) is the optical center of the camera (Oc). The X axis (Xc) of frame C is the row direction of the imaging sensor, and the Y axis (Yc) of frame C is the column direction of the imaging sensor. The Z axis (Zc) of frame C is normal to the Xc-Yc plane. The image coordinate frame takes the imaging center of the image sensor (Os) as its origin, and the pixel coordinate frame takes the upper left corner of the image sensor as its origin. In both cases, the row and column of the image sensor are taken as the X axis and Y axis, respectively, and the unit is the pixel.

The installation position deviation and angle deviation between frame C and frame P are taken as one set of extrinsic parameters. The positions of the infrared targets in frame B are taken as another set of extrinsic parameters. The principal point and distortion coefficients of the ultra-field camera are taken as the intrinsic parameters. The integrated calibration model is established in Sections 3.1.1 and 3.1.2.

    3.1.1 Extrinsic parameter model

When the rotary table rotates to the i-th set of angles, assume that CXDj is the position of the j-th infrared target Dj in frame C. The expression is as follows:

where BXDj is the position of the j-th infrared target in frame B, and PXC is the position of the camera's optical center Oc in frame P. RBPi represents the rotation matrix from frame B to frame P under the i-th set of rotation angles. The rotation of the rotary table applied in this study can be described by two independent parameters ωxi and ωzi, and its expression is as follows:

where R represents the rotation matrix corresponding to the Euler angles. RPC represents the rotation matrix from frame P to frame C, which can be described by three independent parameters α, β, and φ, and its expression is as follows:
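The two rotations above can be sketched numerically as elementary rotation matrices; the composition order used here is an assumption, since the paper's matrix expressions are not reproduced in this text:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def R_BP(wx, wz):
    # Assumed order: outer-axis (X) rotation applied first,
    # inner-axis (Z) rotation second.
    return rot_z(wz) @ rot_x(wx)

R = R_BP(np.deg2rad(5.0), np.deg2rad(10.0))
```

Whatever the true order, the composition remains a proper rotation (orthonormal, determinant 1), which is a useful sanity check when implementing the model.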

    3.1.2 Intrinsic parameter model

Assume that an infrared target CXD is projected as a point p′ = [u′, v′]T on the image coordinate frame according to the imaging relationship, which can be expressed as

where λ represents the scaling factor of the infrared target from frame C to the image coordinate frame, and g represents the imaging function that describes the imaging relationship.

To simplify the parameter calculation, an ideal imaging model and a distorted imaging model are established here, for initial estimation and later refinement, respectively.

1. Ideal imaging model

To obtain a large FOV, the PAL camera obeys the equidistance projection relationship (Niu et al., 2007), which can be expressed as

ρ = f·θ, (7)

where f is the focal length, θ is the field angle of the target, and ρ is the image height. Despite the distortion introduced in lens design and manufacturing, the camera generally does not deviate much from this imaging principle, so Eq. (7) roughly fits the imaging function g.
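A minimal sketch of the ideal equidistance projection ρ = f·θ, mapping a camera-frame direction to image-plane coordinates (the focal length value is illustrative):

```python
import math

def equidistance_project(x, y, z, f):
    # Field angle theta between the incident ray and the optical axis.
    theta = math.acos(z / math.sqrt(x * x + y * y + z * z))
    rho = f * theta                 # image height under rho = f * theta
    phi = math.atan2(y, x)          # azimuth of the target
    return rho * math.cos(phi), rho * math.sin(phi)

# A ray 45 degrees off-axis with f = 100 pixels lands at rho = 25*pi.
u, v = equidistance_project(1.0, 0.0, 1.0, f=100.0)
```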

Regardless of the imaging principle, the incident vector can be expressed as the following unit vector:

CXD = λ1·[sinθ·cosφ, sinθ·sinφ, cosθ]T, (8)

where λ1 represents the scaling factor and φ represents the azimuth of the target. The relationship between the image coordinates and the azimuth is as follows:

u′ = ρ·cosφ, v′ = ρ·sinφ. (9)

In addition, the deviation of the principal point needs to be considered. The point p′ on the image coordinate frame is transformed into p = [u, v]T on the pixel coordinate frame by an affine transformation. The rotation matrix in the affine transformation is treated as an identity matrix here, so the affine transformation can be described as

u = u′ + u0, v = v′ + v0, (10)

where [u0, v0]T is the position of the imaging center Os in the pixel coordinate frame. The length unit of both frames is one pixel, so there is no scaling relationship.

2. Distorted imaging model

To describe the distortion of the ultra-field camera more accurately, the polynomial distortion model proposed by Scaramuzza et al. (2006) is used, and the imaging function g is well fitted by the polynomial

λ2·[u′, v′, a0 + a2ρ² + … + aNρ^N]T = CXD, with ρ = √(u′² + v′²), (11)

where λ2 is the scaling factor, and a0, a2, ..., aN are the coefficients of each order. According to Chen et al. (2019), Wang ZA et al. (2020), and Zhang S et al. (2020a), setting N to 4 gives a good calibration effect for an ultra-field lens; subsequent experiments also verified this. In addition, considering the image sensor's imaging process, the inconsistency between the pixel length and width directions, and the tilt deviation of the two pixel axes (Forsyth and Ponce, 2011), the transformation from the image coordinate frame to the pixel coordinate frame can be described as

where k represents the pixel aspect ratio and s represents the tilt coefficient of the pixel axes.
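Under the N = 4 polynomial model, mapping a pixel back to an incident ray (the reprojection direction, which the text later notes is the convenient one) can be sketched as follows. The coefficient values and the exact form of the affine step are illustrative assumptions:

```python
import numpy as np

def pixel_to_ray(u, v, a, u0, v0, k=1.0, s=0.0):
    # Undo the assumed affine step u = k*u' + s*v' + u0, v = v' + v0.
    vp = v - v0
    up = (u - u0 - s * vp) / k
    rho = np.hypot(up, vp)
    a0, a2, a3, a4 = a
    # Third ray component from the polynomial g(rho); note there is
    # no rho^1 term in this model.
    w = a0 + a2 * rho**2 + a3 * rho**3 + a4 * rho**4
    ray = np.array([up, vp, w])
    return ray / np.linalg.norm(ray)

# At the principal point the ray lies along the optical axis
# (the sign of a0 fixes which direction that is).
ray = pixel_to_ray(320.0, 256.0, a=(-150.0, 1e-3, 0.0, 0.0),
                   u0=320.0, v0=256.0)
```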

    3.2 Parameter estimation algorithm

A three-step calibration algorithm is proposed to deal with the unknown parameters, which are too many to be estimated directly. A summary of the three-step calibration algorithm is shown in Fig. 7. The positions of the control points obtained from the images and the corresponding rotation angles are used as input data. Assuming that the camera follows the lens's optical design, the first step provides a reasonable estimate for most parameters using the ideal imaging model. In the second step, the parameters of the ideal imaging model are transformed into parameters of the distorted imaging model by polynomial fitting. In the third step, the distorted imaging model is used to estimate the camera's distortion coefficients and thereby optimize the estimates of all parameters. The three steps are presented in detail in Sections 3.2.1, 3.2.2, and 3.2.3.

    Fig.7 Summary of the estimation algorithm

3.2.1 First step: preliminary estimation of a subset of parameters

When the rotary table rotates to the i-th set of angles, the position of the j-th infrared target Dj in the pixel coordinate frame is measured by the infrared camera. The estimated position of the target in frame C is obtained from the extrinsic parameter model, which can be expressed as

where E represents the coordinate transformation relationship described by Eq. (3).

Then the estimated position of the target image point on the image coordinate frame is obtained from the ideal imaging model, which can be expressed as

where F represents the inverse process of the ideal imaging relationship described by Eqs. (7), (8), and (10). Due to the simplicity of the functional relationship, this inverse process is not difficult to express analytically.

A nonlinear least-squares estimation problem is established, and the optimization objective is the minimization of the following cost function:

J = Σi Σj [(ûij − uij)² + (v̂ij − vij)²], (15)

where (uij, vij) is the measured position of the j-th target at the i-th set of rotation angles and (ûij, v̂ij) is its estimate.

    The variables to be estimated in the first step are

where j = 1, 2, ..., K. The positions here are relative values, so we set BZD1 = 10 according to the actual situation. The Levenberg-Marquardt algorithm is used to solve the nonlinear optimal estimation problem. Ignoring installation errors, the initial guesses of α, β, and φ are given according to the corresponding relations of the coordinate frames. The initial guess of (u0, v0) is the center of the image, while the initial guess of f is the lens design value. The initial guesses of BXDj, BYDj, BZDj, PXC, PYC, and PZC are set roughly according to the site conditions.
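A toy Levenberg-Marquardt iteration on a one-parameter version of this fit (estimating only f from noiseless ρ = f·θ samples; the real first step stacks all the extrinsics as well, and the data here are synthetic) illustrates the damped update:

```python
import numpy as np

thetas = np.linspace(0.1, 1.2, 12)      # synthetic field angles (rad)
rhos_meas = 100.0 * thetas              # ground truth: f = 100

def residuals(f):
    return f * thetas - rhos_meas

f, lam = 60.0, 1e-3                     # deliberately poor initial guess
for _ in range(20):
    r = residuals(f)
    J = thetas.reshape(-1, 1)           # Jacobian of r with respect to f
    H = J.T @ J + lam * np.eye(1)       # damped normal equations
    step = np.linalg.solve(H, -(J.T @ r))
    f += float(step[0])
```

Even from a poor initial guess, the damped Gauss-Newton step converges in a few iterations, mirroring the convergence behavior reported later in Fig. 10.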

3.2.2 Second step: transformation of parameters between the two imaging models

The parameters of the ideal imaging model and those of the distorted imaging model do not correspond to each other directly. A conversion process is essential to make full use of the results of the preliminary parameter estimation. A simple solution is as follows: when 0 < θ < π/2, the following approximation is available according to Eqs. (8), (9), and (11):

a0 + a2ρ² + a3ρ³ + a4ρ⁴ ≈ ρ·cotθ = ρ/tan(ρ/f). (17)

For the f obtained in the first step, a set of discrete θ's is used as the input to Eq. (7), and a corresponding set of ρ's is obtained. Then a set of approximate solutions for a0, a2, a3, and a4 is obtained by polynomial fitting according to Eq. (17). These approximate solutions are used as initial guesses in the following step.
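The second step can be sketched as below. We assume the conversion target follows from combining the equidistance model with the polynomial model's third ray component (ρ·cotθ with θ = ρ/f); the focal length value is illustrative:

```python
import numpy as np

f = 330.0                                        # hypothetical first-step f
theta = np.linspace(0.05, np.pi / 2 - 0.05, 200) # discrete field angles
rho = f * theta                                  # image heights via Eq. (7)
target = rho / np.tan(theta)                     # assumed conversion target

# Least-squares fit of a0 + a2*rho^2 + a3*rho^3 + a4*rho^4 (no rho^1 term).
A = np.column_stack([np.ones_like(rho), rho**2, rho**3, rho**4])
coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
a0, a2, a3, a4 = coeffs
```

The fitted coefficients only need to be rough, since they serve as initial guesses for the third step rather than final values.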

    3.2.3 Third step:optimal estimation of all parameters

Adopting the distorted imaging model on the basis of the first step, the estimated position (ûij, v̂ij)T is obtained. It can be expressed as

where G represents the inverse process of the distorted imaging relationship described by Eqs. (11) and (12). However, this inverse process is difficult to express analytically. The difficulty comes mainly from finding a suitable root of the polynomial in Eq. (11). The distorted imaging model implies that the reprojection process, from a point in the pixel coordinate frame to a vector in the camera coordinate frame, is convenient, whereas the projection process, from a vector in the camera coordinate frame to a point in the pixel coordinate frame, is complex and nonlinear. The opposite situation exists in other kinds of models (Kannala and Brandt, 2006). The computational complexity is unavoidable due to the extensive use of high-order polynomials in the distorted model, but keeping the computational convenience in the reprojection process is more practical, as it helps achieve better real-time performance in measurement applications (Wang H et al., 2021).

The same nonlinear least-squares estimation problem as in Eq. (15) is used. The variables to be estimated in the third step are

where j = 1, 2, ..., K. The Levenberg-Marquardt algorithm is applied, while the Jacobian matrix is approximated numerically because calculating its analytical form is complex. This is realized using the MATLAB function lsqnonlin. The parameters obtained in the first and second steps are adopted as initial values, and the initial guesses for k and s are set to 1 and 0, respectively.

    4 Experiments and analysis

    4.1 Numerical simulations

    4.1.1 Configurations of simulations

The infrared camera used for the simulations is the same as a real infrared PAL camera, including the FOV and the blind area. Its main specifications are shown in Table 1. A combination of three calibration targets is chosen here. The values of the model parameters and their initial guesses during the iteration process are shown in Table 2. The two-axis rotary table is set to rotate around the X axis in steps of 5° and around the Z axis in steps of 10°.

    Table 1 Specifications of the infrared panoramic annular lens(PAL)camera

    Table 2 Simulation parameters and the initial values

Because the true values are known in the simulations, the root mean square error (RMSE) between the reprojected points, which are based on the calibrated model, and the corresponding true points is used to evaluate the calibration effect in the simulations (Scaramuzza et al., 2006). This RMSE is also called the real reprojection error (RRE):

RRE = √( Σi Σj [(ûij − uij)² + (v̂ij − vij)²] / (2N) ), with N the total number of control points,

where (uij, vij) represents the true value. The RRE is the average distance error in the x and y directions, which is slightly different from the Euclidean distance. We performed 200 sets of independent simulations for each condition, and the results shown are the average values.
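The RRE can be sketched as an RMSE over the x and y residuals jointly: the sum of squared residuals is divided by 2N rather than N, which is what makes it slightly different from a mean Euclidean distance. The point values below are made up:

```python
import numpy as np

def rre(reproj, truth):
    # Squared x and y residuals are pooled and divided by 2N
    # (N point pairs), giving a per-axis RMSE rather than a
    # mean Euclidean distance.
    d = np.asarray(reproj, float) - np.asarray(truth, float)
    return float(np.sqrt((d ** 2).sum() / (2 * len(d))))

pts_true = [(0.0, 0.0), (1.0, 1.0)]
pts_rep = [(0.3, 0.0), (1.0, 0.6)]
print(rre(pts_rep, pts_true))   # sqrt((0.09 + 0.16) / 4) = 0.25
```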

    4.1.2 Simulation results

In the simulations, we investigated the feasibility and robustness of the proposed algorithm in the case of inaccurate control point positioning. To this end, Gaussian noise with a mean of 0 and a standard deviation of σ pixels was added to the x and y coordinates of each control point. The noise level was changed from σ = 0.1 pixels to σ = 2 pixels with a step size of 0.1 pixels. The simulation results are shown in Fig. 8. The average RRE of the optimal estimate is much smaller than the noise level under the different simulation conditions, which fully demonstrates the good robustness of the proposed method. The average RRE increases linearly with the noise level in both the optimal estimation and the preliminary estimation, while the gap between the RRE of the preliminary estimation and that of the optimal estimation gradually decreases. This indicates that the distorted imaging model shows greater advantages in low-noise conditions.

    Fig.8 Performance under different noise levels

Furthermore, for σ = 2 pixels, which is larger than the noise caused by control point positioning, the average RRE of the optimal estimation is 0.254 pixels. The real points, measurement points, and reprojected points of one simulation are shown in Fig. 9. Although the measurement points are very noisy relative to the real points, after correction by the proposed method the reprojected points approximate the real points very well.

    Fig.9 Distribution of the real points,measurement points,and reprojected points

The iteration process of one calibration with σ = 2 pixels is shown in Fig. 10. Although the initial cost function J in the preliminary estimation is extremely high due to the poor initial parameter guesses, it still converges within a few cycles, and J decreases further in the optimal estimation, which demonstrates the effectiveness of the proposed parameter estimation algorithm.

    Fig.10 Iteration process of the calibration algorithm

    4.2 Practical experiments

    4.2.1 Experimental results

We used 10 infrared horizon sensors to validate the accuracy and stability of the proposed method. All the control points obtained in the experiment on one infrared horizon sensor are shown in Fig. 11. Owing to the use of three closely spaced targets and the high-density rotation of the rotary table, the control points of the three targets covered the camera's entire FOV evenly, as shown in Figs. 11 and 1.

    Fig.11 Distribution of all control points obtained

The reprojection error distribution of one experiment with three targets is shown in Fig. 12. The errors of all three targets were uniformly distributed and concentrated around zero. This indicates that the control point positioning accuracies of the three infrared targets are similar, and that the constraints provided by the three infrared targets are treated equally by the estimation algorithm. The calibration results of one infrared horizon sensor are shown in Table 3. These parameters will be injected into each infrared horizon sensor's software to help it achieve higher accuracy.

    Fig.12 Distribution of reprojection errors

Because the true values were not known in the experiments, the RMSE between the reprojected points and the corresponding measurement points was used to evaluate the calibration effect in the experiments. This RMSE is also called the measurement reprojection error (MRE):

The experimental results for the 10 cameras are shown in Fig. 13 and Table 4. Comparison experiments were performed on the same set of calibration photos, solving the parameters separately with one of the targets and with all three targets. The average MRE is 0.175 pixels when using only one target, and 0.148 pixels when using three targets, implying an improvement of about 15% in calibration accuracy. The experiments with three targets also showed better stability, with a standard deviation of 0.0028 pixels, compared with 0.014 pixels for the single-target calibration experiments. Compared with a single target, the combination of three targets triples the calibration data without additional time consumption. In addition, the position relationship constraint among the three targets suppresses the random noise of a single target. These factors lead to the better performance of the calibration system when using three targets.

    Fig.13 Calibration experiments of 10 infrared horizon sensors

    During the calibration experiments, the positions of the three targets with respect to the rotary table were kept constant, so we can evaluate the estimation accuracy of the extrinsic parameters from their distribution. According to the estimation algorithm in Section 3.2, the estimates of the extrinsic parameters are relative values based on the assumption of $B_{ZD1}=10$. The estimates of the extrinsic parameters are shown in Table 5. Their error distributions are relatively small, indicating that the extrinsic parameters are estimated effectively by the proposed algorithm.

    4.2.2 Comparison with other geometric calibration methods

    Table 3 Infrared horizon sensor calibration results

    Table 4 Results of calibration experiments

    Table 5 Estimation of extrinsic parameters

    Existing results of other infrared camera calibration methods are listed in Table 6 for comparison. The evaluation criterion was described in Eq. (21), and the criteria used by all of the compared methods are similar. The method proposed in this study achieves higher calibration accuracy than the other methods for ultra-field infrared cameras. Even compared with the best result available, 0.219 pixels (Zhang S et al., 2020a), our method achieves at least a 30% improvement.

    As shown in the last row of Table 6, because of the severe distortion and the low control point positioning accuracy of ultra-field cameras, the geometric calibration of ultra-field infrared cameras is still less accurate than that of narrow-field infrared cameras (Usamentiaga et al., 2017). The distortion model and the control point positioning algorithm for ultra-field infrared cameras will be studied more thoroughly in future work.

    Table 6 Comparison of infrared camera calibration methods

    5 Conclusions

    A high-accuracy camera geometric calibration method for infrared horizon sensors with a large FOV has been proposed. Control points were evenly distributed throughout the entire FOV with the help of a two-axis rotary table and multiple infrared targets. Thus, the calibration accuracy over the entire FOV was ensured. The three-step parameter estimation algorithm based on integrated modeling achieved highly accurate parameter estimation.

    Simulation results showed that the proposed calibration algorithm was effective and robust at different noise levels. Experiments using 10 infrared PAL cameras demonstrated that the proposed method achieved not only high accuracy, with an average MRE of 0.148 pixels, but also good stability, with a standard deviation of 0.0028 pixels. The combination of three targets proposed in this paper improved the calibration accuracy by about 15% compared to a single target. The calibration accuracy of this method is at least 30% better than those of other existing methods.

    The camera calibration method proposed in this paper helps infrared horizon sensors achieve higher accuracy, and it will be an important foundation for other measurement applications with ultra-field infrared cameras.

    Contributors

    Huajian DENG and Hao WANG designed the research. Huajian DENG acquired and processed the data. Huajian DENG and Hao WANG drafted the paper. Xiaoya HAN, Yang LIU, and Zhonghe JIN offered advice. Huajian DENG and Hao WANG revised and finalized the paper.

    Compliance with ethics guidelines

    Huajian DENG, Hao WANG, Xiaoya HAN, Yang LIU, and Zhonghe JIN declare that they have no conflict of interest.

    Data availability

    The data that support the findings of this study are available from the corresponding author upon reasonable request.
