A novel method for measuring center-axis velocity of unmanned aerial vehicles through synthetic motion blur images

Quanxi Zhan, Yanmin Zhou, Junrui Zhang, Chenyang Sun, Runjie Shen, Bin He

Autonomous Intelligent Systems ›› 2024, Vol. 4 ›› Issue (1): 16 · DOI: 10.1007/s43684-024-00073-x
Original Article

Abstract

Accurate velocity measurement of unmanned aerial vehicles (UAVs) is essential in many applications. Traditional vision-based methods rely heavily on visual features, which are often inadequate in low-light or feature-sparse environments. This study presents a novel approach that measures the axial velocity of a UAV from motion blur images captured by a UAV-mounted monocular camera. We introduce a motion blur model that synthesizes a blurred image from neighboring frames to enhance the visibility of the motion blur. Each synthesized blur frame is transformed into a spectrogram using the Fast Fourier Transform (FFT); a binarization step followed by the Radon transform then extracts the light-dark stripe spacing, which encodes the motion blur length. This length is used to establish a model correlating motion blur with axial velocity, allowing precise velocity calculation. Field tests in a hydropower station penstock demonstrated an average velocity error of 0.048 m/s compared to ultra-wideband (UWB) measurements, with a root-mean-square error of 0.025 m/s, an average computation time of 42.3 ms, and a CPU load of 17%. These results confirm the stability and accuracy of our velocity estimation algorithm in challenging environments.
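The pipeline described above (synthesize blur, FFT spectrogram, binarization, Radon transform, stripe spacing) can be illustrated with a short sketch. The Python below is our own reconstruction under stated assumptions, not the authors' code: the blur synthesis is reduced to a plain average of neighboring frames, the binarization threshold (mean plus one standard deviation) and the minima-based stripe-spacing estimate are placeholders for the paper's calibrated steps, and the function name, inputs, and all numeric choices are hypothetical.

```python
import numpy as np
from scipy.fft import fft2, fftshift
from skimage.transform import radon


def estimate_blur(frames):
    """Estimate blur direction and blur length (pixels) from neighboring
    grayscale frames: synthesize blur -> FFT spectrum -> binarize -> Radon."""
    # 1. Synthesize a motion-blurred frame. A plain average of neighboring
    #    frames stands in for the paper's synthesis model (assumption).
    blurred = np.mean(np.stack([np.asarray(f, float) for f in frames]), axis=0)

    # 2. Log-magnitude spectrum. Linear motion blur imprints parallel dark
    #    stripes whose spacing is inversely proportional to blur length.
    spectrum = np.log1p(np.abs(fftshift(fft2(blurred))))

    # 3. Binarize the spectrum (simple global threshold; the paper's actual
    #    threshold is not given in the abstract).
    binary = (spectrum > spectrum.mean() + spectrum.std()).astype(float)

    # 4. Radon transform: the projection angle whose profile varies the most
    #    is aligned with the stripes.
    angles = np.arange(0.0, 180.0, 0.5)
    sinogram = radon(binary, theta=angles, circle=False)
    best = int(np.argmax(sinogram.var(axis=0)))
    blur_angle = angles[best]

    # 5. Stripe spacing = median distance between local minima of the
    #    dominant projection profile (the zeros of the blur kernel).
    p = sinogram[:, best]
    minima = np.where((p[1:-1] < p[:-2]) & (p[1:-1] < p[2:]))[0]
    spacing = np.median(np.diff(minima)) if minima.size > 1 else None

    # For an N-pixel-wide image, stripe spacing ~ N / L, so L ~ N / spacing.
    blur_len = blurred.shape[1] / spacing if spacing else None
    return blur_angle, blur_len
```

As a follow-up note, converting a blur length L (in pixels) to speed commonly uses the pinhole-camera relation v ≈ L · Z / (f · t_e) for motion parallel to the image plane, where Z is scene depth, f the focal length in pixels, and t_e the exposure (or frame-synthesis) interval; the paper's calibrated model relating blur length to center-axis velocity may differ from this textbook relation.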

The original online version of this article was revised: the Data availability and Competing interests statements have been added.

A correction to this article is available online at https://doi.org/10.1007/s43684-025-00108-x.

Keywords

Hydroelectric power plants / UAV / Motion blur / Axial velocity measurement

Cite this article

Quanxi Zhan, Yanmin Zhou, Junrui Zhang, Chenyang Sun, Runjie Shen, Bin He. A novel method for measuring center-axis velocity of unmanned aerial vehicles through synthetic motion blur images. Autonomous Intelligent Systems, 2024, 4(1): 16. DOI: 10.1007/s43684-024-00073-x



Funding

Shanghai Municipal Science and Technology Major Project (2021SHZDZX0100)

RIGHTS & PERMISSIONS

The Author(s)
