ISSN 0021-3454 (print version)
ISSN 2500-0381 (online version)

Vol. 68, No. 10, October 2025

DOI 10.17586/0021-3454-2025-68-10-838-843

UDC 519.8

ANALYTICAL METHOD FOR FINDING UNKNOWN CONSTANT PARAMETERS OF LINEAR REGRESSION INEQUALITIES

A. M. Zenkin
ITMO University, Saint Petersburg, 197101, Russian Federation; Assistant


A. A. Bobtsov
ITMO University, Saint Petersburg, 197101, Russian Federation; Head of the School of Computer Technologies and Control, Professor at the Faculty of Control Systems and Robotics, Head of the Adaptive and Nonlinear Control Systems Lab

Reference for citation: Zenkin A. M., Bobtsov A. A. Analytical method for finding unknown constant parameters of linear regression inequalities. Journal of Instrument Engineering. 2025. Vol. 68, N 10. P. 838–843 (in Russian). DOI: 10.17586/0021-3454-2025-68-10-838-843.

Abstract. A system of linear regression inequalities with unknown constant parameters, whose number is assumed to be given and finite, is considered. The problem addressed is that of constructing the domain of the components of the unknown parameter vector for which the prescribed inequalities hold. A method is proposed that, based on the dynamic regressor extension procedure and the selection of active constraints, reduces the original problem to solving a square system of linear equations. Applying Cramer’s rule and Hadamard’s inequality to the resulting system yields an analytical upper bound for the components of the unknown parameter vector. The correctness of the proposed method is illustrated by numerical simulation. Unlike numerical optimization methods, the presented approach requires no iterative computations and provides a rigorous guaranteed bound valid for the entire class of admissible data. A theorem establishing this bound in the general case is formulated and proved. The conclusion discusses prospects for further development of the proposed approach.
Keywords: linear regression inequality, dynamic regressor extension, Cramer’s rule, Hadamard’s inequality
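
To illustrate the bounding step described in the abstract: once the active constraints are selected and the problem is reduced to a square linear system A·θ = b, Cramer’s rule expresses each component θ_i as a ratio of determinants, and Hadamard’s inequality bounds the numerator determinant by the product of the column norms. The sketch below (Python/NumPy) is a minimal numerical check of this bound; the function name and the toy 3×3 system are illustrative assumptions, not the authors’ code or data.

```python
import numpy as np

def cramer_solution_and_hadamard_bound(A, b):
    """Solve the square system A @ theta = b by Cramer's rule and return,
    for each component theta_i, the analytical upper bound obtained by
    applying Hadamard's inequality to the numerator determinant.
    Illustrative sketch only; not the authors' implementation."""
    n = A.shape[0]
    det_A = np.linalg.det(A)
    assert abs(det_A) > 1e-12, "active-constraint matrix must be nonsingular"

    col_norms = np.linalg.norm(A, axis=0)      # Euclidean norms of the columns of A
    b_norm = np.linalg.norm(b)

    theta = np.empty(n)
    bound = np.empty(n)
    for i in range(n):
        A_i = A.copy()
        A_i[:, i] = b                          # replace i-th column by the right-hand side
        theta[i] = np.linalg.det(A_i) / det_A  # Cramer's rule
        # Hadamard's inequality: |det A_i| <= ||b|| * prod_{j != i} ||a_j||
        bound[i] = b_norm * np.prod(np.delete(col_norms, i)) / abs(det_A)
    return theta, bound

# Toy numerical check: a random 3x3 system standing in for the active constraints
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
b = rng.normal(size=3)
theta, bound = cramer_solution_and_hadamard_bound(A, b)
assert np.all(np.abs(theta) <= bound + 1e-12)
print(theta, bound)
```

The guaranteed bound |θ_i| ≤ ‖b‖·∏_{j≠i}‖a_j‖ / |det A| is obtained without any iterations, which is the property emphasized in the abstract.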
