<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.3 20210610//EN" "JATS-journalpublishing1-3.dtd">
<article article-type="research-article" dtd-version="1.3" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xml:lang="ru"><front><journal-meta><journal-id journal-id-type="publisher-id">pribor</journal-id><journal-title-group><journal-title xml:lang="ru">Известия высших учебных заведений. Приборостроение</journal-title><trans-title-group xml:lang="en"><trans-title>Journal of Instrument Engineering</trans-title></trans-title-group></journal-title-group><issn pub-type="ppub">0021-3454</issn><issn pub-type="epub">2500-0381</issn><publisher><publisher-name>Национальный исследовательский университет ИТМО</publisher-name></publisher></journal-meta><article-meta><article-id pub-id-type="doi">10.17586/0021-3454-2023-66-3-247-250</article-id><article-id custom-type="elpub" pub-id-type="custom">pribor-103</article-id><article-categories><subj-group subj-group-type="heading"><subject>Research Article</subject></subj-group><subj-group subj-group-type="section-heading" xml:lang="ru"><subject>КРАТКИЕ СООБЩЕНИЯ</subject></subj-group><subj-group subj-group-type="section-heading" xml:lang="en"><subject>BRIEF NOTES</subject></subj-group></article-categories><title-group><article-title>Автоматическое распознавание зрительных стимулов по единичным вызванным потенциалам на электроэнцефалограмме</article-title><trans-title-group xml:lang="en"><trans-title>Automatic Recognition of Visual Stimuli by Single Evoked Potentials on an Electroencephalogram</trans-title></trans-title-group></title-group><contrib-group><contrib contrib-type="author" corresp="yes"><name-alternatives><name name-style="eastern" xml:lang="ru"><surname>Марусина</surname><given-names>М. Я.</given-names></name><name name-style="western" xml:lang="en"><surname>Marusina</surname><given-names>M. Ya.</given-names></name></name-alternatives><bio xml:lang="ru"><p>Мария Яковлевна Марусина - д-р техн. 
наук, профессор; факультет систем управления и робототехники</p><p>Санкт-Петербург</p></bio><bio xml:lang="en"><p>Maria Ya. Marusina - Dr. Sci., Professor; Faculty of Control Systems and Robotics</p><p>St. Petersburg</p></bio><email xlink:type="simple">myamarusina@itmo.ru</email><xref ref-type="aff" rid="aff-1"/></contrib><contrib contrib-type="author" corresp="yes"><name-alternatives><name name-style="eastern" xml:lang="ru"><surname>Бурдаев</surname><given-names>И. В.</given-names></name><name name-style="western" xml:lang="en"><surname>Burdaev</surname><given-names>I. V.</given-names></name></name-alternatives><bio xml:lang="ru"><p>Игорь Владиславович Бурдаев - студент; факультет программной инженерии и компьютерной техники</p><p>Санкт-Петербург</p></bio><bio xml:lang="en"><p>Igor V. Burdaev - Student; Faculty of Software Engineering and Computer Systems</p><p>St. Petersburg</p></bio><email xlink:type="simple">burdaev-igor@mail.ru</email><xref ref-type="aff" rid="aff-1"/></contrib></contrib-group><aff-alternatives id="aff-1"><aff xml:lang="ru">Университет ИТМО<country>Россия</country></aff><aff xml:lang="en">ITMO University<country>Russian Federation</country></aff></aff-alternatives><pub-date pub-type="collection"><year>2023</year></pub-date><pub-date pub-type="epub"><day>26</day><month>11</month><year>2024</year></pub-date><volume>66</volume><issue>3</issue><fpage>247</fpage><lpage>250</lpage><permissions><copyright-statement>Copyright &#x00A9; Национальный исследовательский университет ИТМО, 2024</copyright-statement><copyright-year>2024</copyright-year><copyright-holder xml:lang="ru">Национальный исследовательский университет ИТМО</copyright-holder><copyright-holder xml:lang="en">ITMO University</copyright-holder><license xlink:href="https://pribor.ifmo.ru/jour/about/submissions#copyrightNotice" 
xlink:type="simple"><license-p>https://pribor.ifmo.ru/jour/about/submissions#copyrightNotice</license-p></license></permissions><self-uri xlink:href="https://pribor.ifmo.ru/jour/article/view/103">https://pribor.ifmo.ru/jour/article/view/103</self-uri><abstract><p>Обоснована необходимость повышения эффективности автоматической классификации зрительных стимулов по единичным вызванным потенциалам на электроэнцефалограмме испытуемого. Определены факторы, влияющие на точность распознавания вида предъявляемых зрительных стимулов (живой/неживой, четкий/размытый). Представлен алгоритм обработки данных, позволяющий выявлять значимые различия амплитуд единичных вызванных потенциалов.</p></abstract><trans-abstract xml:lang="en"><p>The necessity of increasing the efficiency of automatic classification of visual stimuli by single evoked potentials on the observer's electroencephalogram is substantiated. The factors affecting the accuracy of recognition of the type of presented visual stimuli (living/non-living, clear/blurred) are determined. A data processing algorithm is developed that makes it possible to identify significant differences in the amplitudes of single evoked potentials.</p></trans-abstract><kwd-group xml:lang="ru"><kwd>распознавание единичных вызванных потенциалов</kwd><kwd>методы глубокого обучения</kwd><kwd>искусственные нейронные сети</kwd></kwd-group><kwd-group xml:lang="en"><kwd>recognition of single evoked potentials</kwd><kwd>deep learning methods</kwd><kwd>artificial neural networks</kwd></kwd-group></article-meta></front><back><ref-list><title>References</title><ref id="cit1"><label>1</label><citation-alternatives><mixed-citation xml:lang="ru">Капралов Н. В., Нагорнова Ж. В., Шемякина Н. В. Методы классификации ЭЭГ-паттернов воображаемых движений // Информатика и автоматизация. 2021. Т. 1, вып. 20. С. 94-132. DOI: 10.15622/ia.2021.20.1.4.</mixed-citation><mixed-citation xml:lang="en">Kapralov N., Nagornova Zh., Shemyakina N. 
Informatics and Automation, 2021, no. 1(20), pp. 94-132, DOI: https://doi.org/10.15622/ia.2021.20.1.4</mixed-citation></citation-alternatives></ref><ref id="cit2"><label>2</label><citation-alternatives><mixed-citation xml:lang="ru">Lotte F. et al. A review of classification algorithms for EEG-based brain-computer interfaces: a 10 year update // J. Neural Eng. 2018. Vol. 15, N 3. P. 031005.</mixed-citation><mixed-citation xml:lang="en">Lotte F. et al. J. Neural Eng., 2018, no. 3(15), pp. 031005.</mixed-citation></citation-alternatives></ref><ref id="cit3"><label>3</label><citation-alternatives><mixed-citation xml:lang="ru">Zhao X., Zhao J., Liu C., Cai W. Deep Neural Network with Joint Distribution Matching for Cross-Subject Motor Imagery Brain-Computer Interfaces // BioMed. Res. Intern. 2020. Vol. 2020, N 7285057.</mixed-citation><mixed-citation xml:lang="en">Zhao X., Zhao J., Liu C., Cai W. Biomed. Res. Int., 2020, vol. 2020, pp. 7285057.</mixed-citation></citation-alternatives></ref><ref id="cit4"><label>4</label><citation-alternatives><mixed-citation xml:lang="ru">Пономарев С. В., Малашин Р. О., Моисеенко Г. А. Автоматическая классификация зрительных стимулов по электроэнцефалограмме наблюдателя // Оптич. журн. 2018. № 8. С. 67-76.</mixed-citation><mixed-citation xml:lang="en">Ponomarev S.V., Malashin R.O., Moiseenko G.A. Journal of Optical Technology, 2018, no. 8, pp. 499-506.</mixed-citation></citation-alternatives></ref><ref id="cit5"><label>5</label><citation-alternatives><mixed-citation xml:lang="ru">Spampinato C., Palazzo S., Kavasidis I., Shah M. Deep learning human mind for automated visual classification // CVPR. 2017 [Электронный ресурс]: &lt;https://arxiv.org/abs/1609.00344&gt;</mixed-citation><mixed-citation xml:lang="en">Spampinato C., Palazzo S., Kavasidis I., Shah M. 
CVPR, 2017, https://arxiv.org/abs/1609.00344.</mixed-citation></citation-alternatives></ref><ref id="cit6"><label>6</label><citation-alternatives><mixed-citation xml:lang="ru">Анодина-Андриевская Е. М., Божокин С. В., Марусина М. Я., Полонский Ю. З., Суворов Н. Б. Перспективные подходы к анализу информативности физиологических сигналов и медицинских изображений человека при интеллектуальной деятельности // Изв. вузов. Приборостроение. 2011. Т. 54, № 7. C. 27-35.</mixed-citation><mixed-citation xml:lang="en">Anodina-Andrievskaya E.M., Bozhokin S.V., Marusina M.Ya., Polonsky Yu.Z., Suvorov N.B. Journal of Instrument Engineering, 2011, no. 7(54), pp. 27-35. (in Russ.)</mixed-citation></citation-alternatives></ref><ref id="cit7"><label>7</label><citation-alternatives><mixed-citation xml:lang="ru">Kiryakova T. N., Marusina M. Ya., Fedchenkov P. V. Automatic methods of contours and volumes determination of zones of interest in MRI images // REJR. 2017. N 7 (2). P. 117-127. DOI: 10.21569/2222-7415-2017-7-2-117-127.</mixed-citation><mixed-citation xml:lang="en">Kiryakova T.N., Marusina M.Ya., Fedchenkov P.V. REJR, 2017, no. 2(7), pp. 117-127, DOI: 10.21569/2222-7415-2017-7-2-117-127.</mixed-citation></citation-alternatives></ref><ref id="cit8"><label>8</label><citation-alternatives><mixed-citation xml:lang="ru">Marusina M. Ya., Karaseva E. A. Automatic Segmentation of MRI Images in Dynamic Programming Mode // Asian Pacific Journal of Cancer Prevention (APJCP). 2018. N 19(10). P. 2771-2775. DOI: 10.22034/APJCP.2018.19.10.2771.</mixed-citation><mixed-citation xml:lang="en">Marusina M.Ya., Karaseva E.A. Asian Pacific Journal of Cancer Prevention, 2018, no. 10(19), pp. 2771-2775, DOI: 10.22034/APJCP.2018.19.10.2771.</mixed-citation></citation-alternatives></ref><ref id="cit9"><label>9</label><citation-alternatives><mixed-citation xml:lang="ru">Marusina M. Y., Mochalina A. P., Frolova E. P., Satikov V. I., Barchuk A. A., Kuznetcov V. I., Gaidukov V. S., Tarakanov S. A. 
MRI Image Processing Based on Fractal Analysis // Asian Pacific Journal of Cancer Prevention (APJCP). 2017. N 18 (1). P. 51-55. DOI: 10.22034/APJCP.2017.18.1.51.</mixed-citation><mixed-citation xml:lang="en">Marusina M.Y., Mochalina A.P., Frolova E.P., Satikov V.I., Barchuk A.A., Kuznetcov V.I., Gaidukov V.S., Tarakanov S.A. Asian Pacific Journal of Cancer Prevention, 2017, no. 1(18), pp. 51-55, DOI: 10.22034/APJCP.2017.18.1.51.</mixed-citation></citation-alternatives></ref><ref id="cit10"><label>10</label><citation-alternatives><mixed-citation xml:lang="ru">Marusina M. Ya., Karaseva E. A. Application of fractal analysis for estimation of structural changes of tissues on MRI images // REJR. 2018. N 8 (3). P. 107-112. DOI: 10.21569/2222-7415-2018-8-3-107-112.</mixed-citation><mixed-citation xml:lang="en">Marusina M.Ya., Karaseva E.A. REJR, 2018, no. 3(8), pp. 107-112, DOI: 10.21569/2222-7415-2018-8-3-107-112.</mixed-citation></citation-alternatives></ref><ref id="cit11"><label>11</label><citation-alternatives><mixed-citation xml:lang="ru">Tang Z., Sun S. Single-trial EEG classification of motor imagery using deep convolutional neural networks // Optik — Intern. Journal for Light and Electron Optics. 2017. Vol. 130. P. 11-18.</mixed-citation><mixed-citation xml:lang="en">Tang Z., Sun S. Optik - International Journal for Light and Electron Optics, 2017, vol. 130, pp. 11-18.</mixed-citation></citation-alternatives></ref><ref id="cit12"><label>12</label><citation-alternatives><mixed-citation xml:lang="ru">Malashin R. O. Extraction of object hierarchy data from trained deep-learning neural networks via analysis of the confusion matrix // Journal of Optical Technology. 2016. Vol. 83. N 10. P. 599-603.</mixed-citation><mixed-citation xml:lang="en">Malashin R.O. Journal of Optical Technology, 2016, no. 10(83), pp. 
599-603.</mixed-citation></citation-alternatives></ref></ref-list><fn-group><fn fn-type="conflict"><p>The authors declare that there are no conflicts of interest present.</p></fn></fn-group></back></article>
