ISSN 0021-3454 (print version)
ISSN 2500-0381 (online version)
DOI 10.17586/0021-3454-2022-65-6-430-442

UDC 004.932.2

DEFINING THE ORIENTATION OF THE STAND PLATFORM FOR SEMI-NATURAL MODELING OF THE DYNAMICS OF A NANOSATELLITE'S RELATIVE MOTION

V. D. Meshcheryakov
Samara University; Interuniversity Department of Space Research; Research Laboratory 102; Software Engineer


P. N. Nikolaev
Samara University; Interuniversity Department of Space Research; Postgraduate Student


A. A. Khusainov
Samara University; Interuniversity Department of Space Research; Research Laboratory 102; Software Engineer



Abstract. A technique is proposed for determining the orientation of the platform of a stand for semi-natural modeling of the dynamics of a nanosatellite's angular motion relative to its center of mass. The technique is based on a stereo camera consisting of an infrared camera and a color camera of the visible spectral range, which produce infrared and color images, respectively. Both cameras observe a system of active optical markers emitting in the infrared and visible ranges. In the infrared image, the centers of the optical markers are located using the Hough transform. The same method is applied to the color image, and extraneous artefacts are filtered out by evaluating the fundamental matrix of the stereo pair. The marker color extracted from the color image is then attached as a feature to the marker coordinates obtained from the infrared image. Finally, a triple of vectors is formed in the coordinate system of the stand platform, and the platform orientation is determined in the coordinate system of the infrared camera. According to the results of semi-natural modeling, the orientation error does not exceed 0.5°.
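The artefact-filtering step based on the fundamental matrix can be sketched as follows (a minimal NumPy illustration, not the authors' implementation; the function names, the matrix F, and the tolerance are assumptions made for the example):

```python
import numpy as np

def epipolar_residual(F, x_ir, x_color):
    """Algebraic epipolar residual x_color^T @ F @ x_ir for homogeneous
    image points; it is (ideally) zero for a true stereo correspondence."""
    return float(x_color @ F @ x_ir)

def filter_artefacts(F, pts_ir, pts_color, tol=1e-2):
    """Keep only candidate marker pairs consistent with the epipolar
    constraint of the stereo pair; inconsistent pairs are treated as
    extraneous artefacts and discarded."""
    return [
        (x1, x2)
        for x1, x2 in zip(pts_ir, pts_color)
        if abs(epipolar_residual(F, x1, x2)) < tol
    ]
```

In practice F would come from stereo calibration of the infrared and color cameras, and the tolerance would be chosen from the reprojection noise of the marker detector.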
Keywords: semi-natural modeling, stereo camera, optical system, active markers
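The last step, determining the platform orientation from a triple of vectors, can be illustrated with a TRIAD-style construction (a sketch under the assumption that two non-collinear marker directions are known both in the platform frame and in the infrared-camera frame; the function and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def triad_attitude(v1_plat, v2_plat, v1_cam, v2_cam):
    """Rotation matrix from the stand-platform frame to the infrared-camera
    frame, built from two non-collinear direction vectors observed in both
    frames (the classical TRIAD construction)."""
    def orthonormal_triad(a, b):
        t1 = a / np.linalg.norm(a)
        t2 = np.cross(a, b)
        t2 /= np.linalg.norm(t2)
        t3 = np.cross(t1, t2)
        return np.column_stack((t1, t2, t3))
    # The same orthonormal triad expressed in both frames yields the rotation.
    return orthonormal_triad(v1_cam, v2_cam) @ orthonormal_triad(v1_plat, v2_plat).T
```

For a vector v known in the platform frame, `triad_attitude(...) @ v` gives its coordinates in the camera frame; Euler angles or a quaternion can then be extracted from the resulting matrix.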

References:
  1. Kashirin A.V., Glebanova I.I. Molodoy uchenyy (Young Scientist), 2016, no. 7(111), pp. 855–867. (in Russ.)
  2. https://keldysh.ru/papers/2008/prep38/prep2008_38.html. (in Russ.)
  3. Igritsky V.A., Mayorova V.I. Science and Education of Bauman MSTU, 2011, no. 13, pp. 16. (in Russ.)
  4. Meitu Ye, Jin Liang, Leigang Li, Boxing Qian, Maodong Ren, Mingkai Zhang, Wang Lu, Yulong Zong, Optics and Lasers in Engineering, 2021, vol. 146, pp. 106697. DOI: 10.1016/j.optlaseng.2021.106697.
  5. Jinzhao Yang, Peter Tse, Measurement, 2021, vol. 175, pp. 109104. DOI: 10.1016/j.measurement.2021.109104.
  6. https://sputnix.ru/ru/oborudovanie/elementi-stenda-polunaturnogo-modelirovaniya/. (in Russ.)
  7. https://spacetest.ru/index.php?id=orient. (in Russ.)
  8. https://sputnix.ru/ru/oborudovanie/elementi-stenda-polunaturnogomodelirovaniya/sistema-nezavisimyix-izmerenij. (in Russ.)
  9. Madgwick Sebastian O.H., University of Bristol, 2010, https://www.researchgate.net/publication/267197884_An_efficient_orientation_filter_for_inertial_and_inertialmagnetic_sensor_arrays.
  10. Branko K. FME Transactions, 2015, no. 1(43), pp. 47–54. DOI:10.5937/fmet1501047K.
  11. Hartley R.I., Zisserman A. Multiple View Geometry in Computer Vision, Cambridge, 2003, https://doi.org/10.1017/CBO9780511811685.
  12. https://habr.com/ru/post/130300/ (in Russ.).
  13. Svalbe I.D. IEEE Trans. on Pattern Analysis and Machine Intelligence, 1989, no. 9(II), pp. 286.
  14. CamCal 011 Fundamental Matrix, 2021, http://datahacker.rs/camera-calibration-essential-matrix-computation.
  15. Ribo M. IEEE Instrumentation and Measurement Technology Conference, Budapest, 2001.
  16. https://habr.com/ru/post/181580/. (in Russ.)
  17. Stereo Camera Calibrator App, MathWorks, 2021, https://www.mathworks.com/help/vision/ug/stereo-camera-calibrator-app.html/.