{"id":4182,"date":"2024-04-28T18:21:04","date_gmt":"2024-04-28T23:21:04","guid":{"rendered":"https:\/\/cirics.uqo.ca\/mohand-said-allili\/"},"modified":"2025-02-07T10:04:09","modified_gmt":"2025-02-07T15:04:09","slug":"mohand-said-allili","status":"publish","type":"page","link":"https:\/\/cirics.uqo.ca\/en\/mohand-said-allili\/","title":{"rendered":"Mohand Said Allili"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-page\" data-elementor-id=\"4182\" class=\"elementor elementor-4182 elementor-2523\">\n\t\t\t\t<div class=\"elementor-element elementor-element-e09216a e-flex e-con-boxed e-con e-parent\" data-id=\"e09216a\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-9386cfe elementor-hidden-desktop elementor-hidden-tablet elementor-hidden-mobile e-flex e-con-boxed e-con e-parent\" data-id=\"9386cfe\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-ea9c07d elementor-widget elementor-widget-heading\" data-id=\"ea9c07d\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Welcome<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-70e53c3 elementor-widget elementor-widget-text-editor\" data-id=\"70e53c3\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>If needed, this section can contain some or all of the following:<\/p>\n<ul>\n<li>A large, engaging image of the university, department, or an abstract representation of the academic field can set a professional and inspiring tone.<\/li>\n<li>A brief welcome message or introduction that explains what visitors will find on the page. This could be a short paragraph detailing the purpose of the page, such as highlighting the academic and research achievements of the faculty.<\/li>\n<li>Key facts, achievements or statistics about the professor or department. For instance, number of published papers, years of experience, key projects, or awards won.<\/li>\n<li>Interactive timeline that highlights major milestones, such as significant publications, awards, and other achievements.<\/li>\n<li>A short video where the professor introduces themselves and talk about their work and interests providing a personal touch, and making the page more engaging and approachable.<\/li>\n<\/ul>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-9b8f585 e-flex e-con-boxed e-con e-parent\" data-id=\"9b8f585\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-99badd5 e-con-full my-profs-page-image-card e-flex e-con e-child\" data-id=\"99badd5\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-d8d58ce eael-team-align-centered elementor-widget elementor-widget-eael-team-member\" data-id=\"d8d58ce\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"eael-team-member.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\n\n\t<div id=\"eael-team-member-d8d58ce\" class=\"eael-team-item eael-team-members-simple team-avatar-rounded\">\n\t\t<div class=\"eael-team-item-inner\">\n\t\t\t<div class=\"eael-team-image\">\n\t\t\t\t<figure>\n\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" src=\"https:\/\/cirics.uqo.ca\/wp-content\/uploads\/2024\/04\/Mohand_Said_Allili-300x300-1.webp\" alt=\"\">\n\t\t\t\t\t\t\t\t\t<\/figure>\n\t\t\t\t\n\t\t\t\t\n\t\t\t<\/div>\n\n\t\t\t<div class=\"eael-team-content\">\n\t\t\t\t<h2 class=\"eael-team-member-name\">Mohand Said Allili<\/h2><h3 class=\"eael-team-member-position\"><span>Professor<\/span><br> Universit\u00e9 du Qu\u00e9bec en Outaouais (UQO)<br> Computer Science and Engineering Department<\/h3>\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<ul class=\"eael-team-member-social-profiles\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<li class=\"eael-team-member-social-link\">\n\t\t\t\t\t\t\t\t\t\t<a href=\"https:\/\/uqo.ca\/profil\/allimo01\" target=\"_blank\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<svg aria-hidden=\"true\" class=\"e-font-icon-svg e-fas-university\" viewBox=\"0 0 512 512\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\"><path d=\"M496 128v16a8 8 0 0 1-8 8h-24v12c0 6.627-5.373 12-12 12H60c-6.627 0-12-5.373-12-12v-12H24a8 8 0 0 1-8-8v-16a8 8 0 0 1 4.941-7.392l232-88a7.996 7.996 0 0 1 6.118 0l232 88A8 8 0 0 1 496 128zm-24 304H40c-13.255 0-24 10.745-24 24v16a8 8 0 0 0 8 8h464a8 8 0 0 0 8-8v-16c0-13.255-10.745-24-24-24zM96 192v192H60c-6.627 0-12 5.373-12 12v20h416v-20c0-6.627-5.373-12-12-12h-36V192h-64v192h-64V192h-64v192h-64V192H96z\"><\/path><\/svg>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t\t\t\t\t<\/li>\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<li class=\"eael-team-member-social-link\">\n\t\t\t\t\t\t\t\t\t\t<a href=\"http:\/\/w3.uqo.ca\/allimo01\/\" target=\"_blank\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<svg aria-hidden=\"true\" class=\"e-font-icon-svg e-fab-wordpress\" viewBox=\"0 0 512 512\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\"><path d=\"M61.7 169.4l101.5 278C92.2 413 43.3 340.2 43.3 256c0-30.9 6.6-60.1 18.4-86.6zm337.9 75.9c0-26.3-9.4-44.5-17.5-58.7-10.8-17.5-20.9-32.4-20.9-49.9 0-19.6 14.8-37.8 35.7-37.8.9 0 1.8.1 2.8.2-37.9-34.7-88.3-55.9-143.7-55.9-74.3 0-139.7 38.1-177.8 95.9 5 .2 9.7.3 13.7.3 22.2 0 56.7-2.7 56.7-2.7 11.5-.7 12.8 16.2 1.4 17.5 0 0-11.5 1.3-24.3 2l77.5 230.4L249.8 247l-33.1-90.8c-11.5-.7-22.3-2-22.3-2-11.5-.7-10.1-18.2 1.3-17.5 0 0 35.1 2.7 56 2.7 22.2 0 56.7-2.7 56.7-2.7 11.5-.7 12.8 16.2 1.4 17.5 0 0-11.5 1.3-24.3 2l76.9 228.7 21.2-70.9c9-29.4 16-50.5 16-68.7zm-139.9 29.3l-63.8 185.5c19.1 5.6 39.2 8.7 60.1 8.7 24.8 0 48.5-4.3 70.6-12.1-.6-.9-1.1-1.9-1.5-2.9l-65.4-179.2zm183-120.7c.9 6.8 1.4 14 1.4 21.9 0 21.6-4 45.8-16.2 76.2l-65 187.9C426.2 403 468.7 334.5 468.7 256c0-37-9.4-71.8-26-102.1zM504 256c0 136.8-111.3 248-248 248C119.2 504 8 392.7 8 256 8 119.2 119.2 8 256 8c136.7 0 248 111.2 248 248zm-11.4 0c0-130.5-106.2-236.6-236.6-236.6C125.5 19.4 19.4 125.5 19.4 256S125.6 492.6 256 492.6c130.5 0 236.6-106.1 236.6-236.6z\"><\/path><\/svg>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t\t\t\t\t<\/li>\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/ul>\n\t\t\t\t\t\t\t\t\t\t<p class=\"eael-team-text\">Mohand Said Allili is a full professor in the Department of Computer Science and Engineering at the Universit\u00e9 du Qu\u00e9bec en Outaouais, where he heads the Imaging, Vision and Artificial Intelligence Research Laboratory (LARIVIA). Its research activities revolve around computer vision, machine learning and multimedia data processing, with applications in the semantic analysis of medical and aerial images, and cybersecurity.  <\/p>\n\t\t\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-aaa41d5 e-con-full my-profs-page-publ-card e-flex e-con e-child\" data-id=\"aaa41d5\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-a474f56 elementor-widget elementor-widget-heading\" data-id=\"a474f56\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Productions included in the research:<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ce5ef54 elementor-widget elementor-widget-text-editor\" data-id=\"ce5ef54\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p style=\"text-align: left;\"><strong>AUT<\/strong> (Other), BRE (Patent), <strong>CAC<\/strong> (Refereed publications in conference proceedings), <strong>CNA<\/strong> (Non-refereed paper), <strong>COC<\/strong> (Contribution to a collective work), <strong>COF<\/strong> (Refereed paper), <strong>CRE<\/strong>, <strong>GRO<\/strong>, <strong>LIV<\/strong> (Book), <strong>RAC<\/strong> (Refereed journal), <strong>RAP<\/strong> (Research report), <strong>RSC<\/strong> (Non-refereed journal).<\/p>\n<p style=\"text-align: center;\"><span style=\"text-shadow: 1px 1px 2px gray;\">Year: 1975 to 2024<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-107306a elementor-align-center my-prof-publ-but elementor-widget elementor-widget-button\" data-id=\"107306a\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"button.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-button-wrapper\">\n\t\t\t\t\t<a class=\"elementor-button elementor-button-link elementor-size-sm\" href=\"https:\/\/cirics.uqo.ca\/publications\/?tsr=&#038;auth=141\">\n\t\t\t\t\t\t<span class=\"elementor-button-content-wrapper\">\n\t\t\t\t\t\t<span class=\"elementor-button-icon\">\n\t\t\t\t<svg aria-hidden=\"true\" class=\"e-font-icon-svg e-fas-book-open\" viewBox=\"0 0 576 512\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\"><path d=\"M542.22 32.05c-54.8 3.11-163.72 14.43-230.96 55.59-4.64 2.84-7.27 7.89-7.27 13.17v363.87c0 11.55 12.63 18.85 23.28 13.49 69.18-34.82 169.23-44.32 218.7-46.92 16.89-.89 30.02-14.43 30.02-30.66V62.75c.01-17.71-15.35-31.74-33.77-30.7zM264.73 87.64C197.5 46.48 88.58 35.17 33.78 32.05 15.36 31.01 0 45.04 0 62.75V400.6c0 16.24 13.13 29.78 30.02 30.66 49.49 2.6 149.59 12.11 218.77 46.95 10.62 5.35 23.21-1.94 23.21-13.46V100.63c0-5.29-2.62-10.14-7.27-12.99z\"><\/path><\/svg>\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\t<span class=\"elementor-button-text\">All publications<\/span>\n\t\t\t\t\t<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-eb6a7a7 elementor-widget elementor-widget-heading\" data-id=\"eb6a7a7\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Selected publications<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-29ef351 elementor-widget elementor-widget-shortcode\" data-id=\"29ef351\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"shortcode.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-shortcode\"><div class=\"teachpress_pub_list\"><form name=\"tppublistform\" method=\"get\"><a name=\"tppubs\" id=\"tppubs\"><\/a><\/form><table class=\"teachpress_publication_list\"><tr>\r\n                    <td>\r\n                        <h3 class=\"tp_h3\" id=\"tp_h3_2026\">2026<\/h3>\r\n                    <\/td>\r\n                <\/tr><tr class=\"tp_publication tp_publication_article\"><td class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Abdollahzadeh, S.;  Allili, M. S.;  Boulmerka, A.;  Lapointe, J. -F.<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" onclick=\"teachpress_pub_showhide('546','tp_abstract')\" style=\"cursor:pointer;\">A Vision-Based Framework for Safe Landing Zone Mapping of UAVs in Dynamic Environments<\/a> <span class=\"tp_pub_type tp_  article\">Journal Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span><span class=\"tp_pub_additional_journal\">IEEE Open Journal of the Computer Society, <\/span><span class=\"tp_pub_additional_volume\">vol. 7, <\/span><span class=\"tp_pub_additional_pages\">pp. 492\u2013503, <\/span><span class=\"tp_pub_additional_year\">2026<\/span>, <span class=\"tp_pub_additional_issn\">ISSN: 26441268 (ISSN)<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_546\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('546','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_546\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('546','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span>  |  <span class=\"tp_pub_links_label\">Links: <\/span><a class=\"tp_pub_link\" href=\"https:\/\/www.scopus.com\/inward\/record.uri?eid=2-s2.0-105029942397&amp;doi=10.1109%2FOJCS.2026.3663268&amp;partnerID=40&amp;md5=b11484e035458c84b1d3f6780b92c91c\" title=\"https:\/\/www.scopus.com\/inward\/record.uri?eid=2-s2.0-105029942397&amp;doi=10.1109%2FOJCS.2026.3663268&amp;partnerID=40&amp;md5=b11484e035458c84b1d3f6780b92c91c\" target=\"_blank\"><i class=\"fas fa-globe\"><\/i><\/a><a class=\"tp_pub_link\" href=\"https:\/\/dx.doi.org\/10.1109\/OJCS.2026.3663268\" title=\"Follow DOI:10.1109\/OJCS.2026.3663268\" target=\"_blank\"><i class=\"ai ai-doi\"><\/i><\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_546\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@article{abdollahzadeh_vision-based_2026,<br \/>\r\ntitle = {A Vision-Based Framework for Safe Landing Zone Mapping of UAVs in Dynamic Environments},<br \/>\r\nauthor = {S. Abdollahzadeh and M. S. Allili and A. Boulmerka and J. -F. Lapointe},<br \/>\r\nurl = {https:\/\/www.scopus.com\/inward\/record.uri?eid=2-s2.0-105029942397&doi=10.1109%2FOJCS.2026.3663268&partnerID=40&md5=b11484e035458c84b1d3f6780b92c91c},<br \/>\r\ndoi = {10.1109\/OJCS.2026.3663268},<br \/>\r\nissn = {26441268 (ISSN)},<br \/>\r\nyear  = {2026},<br \/>\r\ndate = {2026-01-01},<br \/>\r\njournal = {IEEE Open Journal of the Computer Society},<br \/>\r\nvolume = {7},<br \/>\r\npages = {492\u2013503},<br \/>\r\nabstract = {Identification safe landing zones (SLZ) for Uncrewed Aerial Vehicles (UAVs) is important to ensure reliable and safe navigation, especially when they are operated in complex and safety-critical environments. However, this is a challenging task due to obstacles and UAV motion. This paper proposes a vision-based framework that maps SLZs in dynamic scenes by integrating several functionalities for analyzing visually static and dynamic aspects of a scene. Static analysis is achieved through context-aware segmentation which divides the image into thematic classes enabling to identify suitable landing surfaces (e.g., roads, grass). For dynamic content analysis, we combine object detection, tracking, and trajectory prediction to determine object occupancy and identify regions free of obstacles. Trajectory prediction is performed through a novel encoder\u2013decoder architecture taking past object positions to predict the most likely future locations. To ensure stable and robust trajectory prediction, we introduce an optimized homography computation using multi-scale image analysis and cumulative updates to compensate UAV motion. We tested our framework on different operational scenarios, including urban and natural scenes with moving objects like vehicles and pedestrians. Obtained results demonstrate its strong performance, and its significant potential for enabling autonomous and safe UAV navigation. \u00a9 2020 IEEE.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {article}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('546','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_546\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Identification safe landing zones (SLZ) for Uncrewed Aerial Vehicles (UAVs) is important to ensure reliable and safe navigation, especially when they are operated in complex and safety-critical environments. However, this is a challenging task due to obstacles and UAV motion. This paper proposes a vision-based framework that maps SLZs in dynamic scenes by integrating several functionalities for analyzing visually static and dynamic aspects of a scene. Static analysis is achieved through context-aware segmentation which divides the image into thematic classes enabling to identify suitable landing surfaces (e.g., roads, grass). For dynamic content analysis, we combine object detection, tracking, and trajectory prediction to determine object occupancy and identify regions free of obstacles. Trajectory prediction is performed through a novel encoder\u2013decoder architecture taking past object positions to predict the most likely future locations. To ensure stable and robust trajectory prediction, we introduce an optimized homography computation using multi-scale image analysis and cumulative updates to compensate UAV motion. We tested our framework on different operational scenarios, including urban and natural scenes with moving objects like vehicles and pedestrians. Obtained results demonstrate its strong performance, and its significant potential for enabling autonomous and safe UAV navigation. \u00a9 2020 IEEE.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('546','tp_abstract')\">Close<\/a><\/p><\/div><\/td><\/tr><tr>\r\n                    <td>\r\n                        <h3 class=\"tp_h3\" id=\"tp_h3_2025\">2025<\/h3>\r\n                    <\/td>\r\n                <\/tr><tr class=\"tp_publication tp_publication_inproceedings\"><td class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Lapointe, J. -F.;  Allili, M. S.;  Hammouche, N.<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" onclick=\"teachpress_pub_showhide('523','tp_abstract')\" style=\"cursor:pointer;\">Field Trials of\u00a0an\u00a0AI-AR-Based System for\u00a0Remote Bridge Inspection by\u00a0Drone<\/a> <span class=\"tp_pub_type tp_  inproceedings\">Proceedings Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span> D., Harris;  W.-C., Li;  H., Kr\u00f6mker (Ed.): <span class=\"tp_pub_additional_booktitle\">Lect. Notes Comput. Sci., <\/span><span class=\"tp_pub_additional_pages\">pp. 278\u2013287, <\/span><span class=\"tp_pub_additional_publisher\">Springer Science and Business Media Deutschland GmbH, <\/span><span class=\"tp_pub_additional_year\">2025<\/span>, <span class=\"tp_pub_additional_isbn\">ISBN: 03029743 (ISSN); 978-303176823-1 (ISBN)<\/span><span class=\"tp_pub_additional_note\">, (Journal Abbreviation: Lect. Notes Comput. Sci.)<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_523\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('523','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_523\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('523','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span>  |  <span class=\"tp_pub_links_label\">Links: <\/span><a class=\"tp_pub_link\" href=\"https:\/\/www.scopus.com\/inward\/record.uri?eid=2-s2.0-85213387549&amp;doi=10.1007%2f978-3-031-76824-8_20&amp;partnerID=40&amp;md5=565ae5dded9cfdf27632e79e702c7718\" title=\"https:\/\/www.scopus.com\/inward\/record.uri?eid=2-s2.0-85213387549&amp;doi=10.1007%2f978-3-031-76824-8_20&amp;partnerID=40&amp;md5=565ae5dded9cfdf27632e79e702c7718\" target=\"_blank\"><i class=\"fas fa-globe\"><\/i><\/a><a class=\"tp_pub_link\" href=\"https:\/\/dx.doi.org\/10.1007\/978-3-031-76824-8_20\" title=\"Follow DOI:10.1007\/978-3-031-76824-8_20\" target=\"_blank\"><i class=\"ai ai-doi\"><\/i><\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_523\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@inproceedings{lapointe_field_2025,<br \/>\r\ntitle = {Field Trials of\u00a0an\u00a0AI-AR-Based System for\u00a0Remote Bridge Inspection by\u00a0Drone},<br \/>\r\nauthor = {J. -F. Lapointe and M. S. Allili and N. Hammouche},<br \/>\r\neditor = {Harris D. and Li W.-C. and Kr\u00f6mker H.},<br \/>\r\nurl = {https:\/\/www.scopus.com\/inward\/record.uri?eid=2-s2.0-85213387549&doi=10.1007%2f978-3-031-76824-8_20&partnerID=40&md5=565ae5dded9cfdf27632e79e702c7718},<br \/>\r\ndoi = {10.1007\/978-3-031-76824-8_20},<br \/>\r\nisbn = {03029743 (ISSN); 978-303176823-1 (ISBN)},<br \/>\r\nyear  = {2025},<br \/>\r\ndate = {2025-01-01},<br \/>\r\nbooktitle = {Lect. Notes Comput. Sci.},<br \/>\r\nvolume = {15381 LNCS},<br \/>\r\npages = {278\u2013287},<br \/>\r\npublisher = {Springer Science and Business Media Deutschland GmbH},<br \/>\r\nabstract = {Bridge inspections are important to ensure the safety of users of these critical transportation infrastructures and avoid tragedies that could be caused by the collapse of these infrastructures. This paper describes the results of field trials of an advanced system for remotely guided inspection of bridges by a drone, which relies on artificial intelligence and augmented reality to achieve it. Results indicate that a high speed network link is critical to achieve good performance. \u00a9 The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.},<br \/>\r\nnote = {Journal Abbreviation: Lect. Notes Comput. Sci.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {inproceedings}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('523','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_523\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Bridge inspections are important to ensure the safety of users of these critical transportation infrastructures and avoid tragedies that could be caused by the collapse of these infrastructures. This paper describes the results of field trials of an advanced system for remotely guided inspection of bridges by a drone, which relies on artificial intelligence and augmented reality to achieve it. Results indicate that a high speed network link is critical to achieve good performance. \u00a9 The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('523','tp_abstract')\">Close<\/a><\/p><\/div><\/td><\/tr><tr class=\"tp_publication tp_publication_inproceedings\"><td class=\"tp_pub_info\"><p class=\"tp_pub_author\"> Valem, L. P.;  Pedronette, D. C. G.;  Allili, M. S.<\/p><p class=\"tp_pub_title\"><a class=\"tp_title_link\" onclick=\"teachpress_pub_showhide('527','tp_abstract')\" style=\"cursor:pointer;\">Contrastive Loss Based on\u00a0Contextual Similarity for\u00a0Image Classification<\/a> <span class=\"tp_pub_type tp_  inproceedings\">Proceedings Article<\/span> <\/p><p class=\"tp_pub_additional\"><span class=\"tp_pub_additional_in\">In: <\/span> G., Bebis;  V., Patel;  J., Gu;  J., Panetta;  Y., Gingold;  K., Johnsen;  M.S., Arefin;  S., Dutta;  A., Biswas (Ed.): <span class=\"tp_pub_additional_booktitle\">Lect. Notes Comput. Sci., <\/span><span class=\"tp_pub_additional_pages\">pp. 58\u201369, <\/span><span class=\"tp_pub_additional_publisher\">Springer Science and Business Media Deutschland GmbH, <\/span><span class=\"tp_pub_additional_year\">2025<\/span>, <span class=\"tp_pub_additional_isbn\">ISBN: 03029743 (ISSN); 978-303177391-4 (ISBN)<\/span><span class=\"tp_pub_additional_note\">, (Journal Abbreviation: Lect. Notes Comput. Sci.)<\/span>.<\/p><p class=\"tp_pub_menu\"><span class=\"tp_abstract_link\"><a id=\"tp_abstract_sh_527\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('527','tp_abstract')\" title=\"Show abstract\" style=\"cursor:pointer;\">Abstract<\/a><\/span> | <span class=\"tp_bibtex_link\"><a id=\"tp_bibtex_sh_527\" class=\"tp_show\" onclick=\"teachpress_pub_showhide('527','tp_bibtex')\" title=\"Show BibTeX entry\" style=\"cursor:pointer;\">BibTeX<\/a><\/span>  |  <span class=\"tp_pub_links_label\">Links: <\/span><a class=\"tp_pub_link\" href=\"https:\/\/www.scopus.com\/inward\/record.uri?eid=2-s2.0-85218461565&amp;doi=10.1007%2f978-3-031-77392-1_5&amp;partnerID=40&amp;md5=cf885303646c3b1a4f4eacb87d02a2b6\" title=\"https:\/\/www.scopus.com\/inward\/record.uri?eid=2-s2.0-85218461565&amp;doi=10.1007%2f978-3-031-77392-1_5&amp;partnerID=40&amp;md5=cf885303646c3b1a4f4eacb87d02a2b6\" target=\"_blank\"><i class=\"fas fa-globe\"><\/i><\/a><a class=\"tp_pub_link\" href=\"https:\/\/dx.doi.org\/10.1007\/978-3-031-77392-1_5\" title=\"Follow DOI:10.1007\/978-3-031-77392-1_5\" target=\"_blank\"><i class=\"ai ai-doi\"><\/i><\/a><\/span><\/p><div class=\"tp_bibtex\" id=\"tp_bibtex_527\" style=\"display:none;\"><div class=\"tp_bibtex_entry\"><pre>@inproceedings{valem_contrastive_2025,<br \/>\r\ntitle = {Contrastive Loss Based on\u00a0Contextual Similarity for\u00a0Image Classification},<br \/>\r\nauthor = {L. P. Valem and D. C. G. Pedronette and M. S. Allili},<br \/>\r\neditor = {Bebis G. and Patel V. and Gu J. and Panetta J. and Gingold Y. and Johnsen K. and Arefin M.S. and Dutta S. and Biswas A.},<br \/>\r\nurl = {https:\/\/www.scopus.com\/inward\/record.uri?eid=2-s2.0-85218461565&doi=10.1007%2f978-3-031-77392-1_5&partnerID=40&md5=cf885303646c3b1a4f4eacb87d02a2b6},<br \/>\r\ndoi = {10.1007\/978-3-031-77392-1_5},<br \/>\r\nisbn = {03029743 (ISSN); 978-303177391-4 (ISBN)},<br \/>\r\nyear  = {2025},<br \/>\r\ndate = {2025-01-01},<br \/>\r\nbooktitle = {Lect. Notes Comput. Sci.},<br \/>\r\nvolume = {15046 LNCS},<br \/>\r\npages = {58\u201369},<br \/>\r\npublisher = {Springer Science and Business Media Deutschland GmbH},<br \/>\r\nabstract = {Contrastive learning has been extensively exploited in self-supervised and supervised learning due to its effectiveness in learning representations that distinguish between similar and dissimilar images. It offers a robust alternative to cross-entropy by yielding more semantically meaningful image embeddings. However, most contrastive losses rely on pairwise measures to assess the similarity between elements, ignoring more general neighborhood information that can be leveraged to enhance model robustness and generalization. In this paper, we propose the Contextual Contrastive Loss (CCL) to replace pairwise image comparison by introducing a new contextual similarity measure using neighboring elements. The CCL yields a more semantically meaningful image embedding ensuring better separability of classes in the latent space. Experimental evaluation on three datasets (Food101, MiniImageNet, and CIFAR-100) has shown that CCL yields superior results by achieving up to 10.76% relative gains in classification accuracy, particularly for fewer training epochs and limited training data. This demonstrates the potential of our approach, especially in resource-constrained scenarios. \u00a9 The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.},<br \/>\r\nnote = {Journal Abbreviation: Lect. Notes Comput. Sci.},<br \/>\r\nkeywords = {},<br \/>\r\npubstate = {published},<br \/>\r\ntppubtype = {inproceedings}<br \/>\r\n}<br \/>\r\n<\/pre><\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('527','tp_bibtex')\">Close<\/a><\/p><\/div><div class=\"tp_abstract\" id=\"tp_abstract_527\" style=\"display:none;\"><div class=\"tp_abstract_entry\">Contrastive learning has been extensively exploited in self-supervised and supervised learning due to its effectiveness in learning representations that distinguish between similar and dissimilar images. It offers a robust alternative to cross-entropy by yielding more semantically meaningful image embeddings. However, most contrastive losses rely on pairwise measures to assess the similarity between elements, ignoring more general neighborhood information that can be leveraged to enhance model robustness and generalization. In this paper, we propose the Contextual Contrastive Loss (CCL) to replace pairwise image comparison by introducing a new contextual similarity measure using neighboring elements. The CCL yields a more semantically meaningful image embedding ensuring better separability of classes in the latent space. Experimental evaluation on three datasets (Food101, MiniImageNet, and CIFAR-100) has shown that CCL yields superior results by achieving up to 10.76% relative gains in classification accuracy, particularly for fewer training epochs and limited training data. This demonstrates the potential of our approach, especially in resource-constrained scenarios. \u00a9 The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.<\/div><p class=\"tp_close_menu\"><a class=\"tp_close\" onclick=\"teachpress_pub_showhide('527','tp_abstract')\">Close<\/a><\/p><\/div><\/td><\/tr><\/table><\/div><\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-5558b72 elementor-hidden-desktop elementor-hidden-tablet elementor-hidden-mobile e-flex e-con-boxed e-con e-parent\" data-id=\"5558b72\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-92ea5b7 elementor-widget elementor-widget-heading\" data-id=\"92ea5b7\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Some Heading<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-de7d7a3 elementor-widget elementor-widget-text-editor\" data-id=\"de7d7a3\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>If needed, this section can contain some or all of the following:<\/p>\n<ul>\n<li>Recent news, updates, or upcoming events related to the professor or their department, such as guest lectures, seminars, and conferences.<\/li>\n<li>Social media feed.<\/li>\n<li>A quote from the professor about their philosophy on education and research or a testimonial from a peer or student adding a personal and inspirational element to the page, placing this information just above the share icons can give visitors current and relevant reasons to engage and share.<\/li>\n<li>Call to Action to attend or participate in some even.<\/li>\n<li>Contact Form<\/li>\n<li>Subscribe form<\/li>\n<\/ul>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-931098e e-flex e-con-boxed e-con e-parent\" data-id=\"931098e\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-ccca712 elementor-widget__width-auto eael-dual-header-content-align-center elementor-widget elementor-widget-eael-dual-color-header\" data-id=\"ccca712\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"eael-dual-color-header.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t<div class=\"eael-dual-header\">\n\t\t\t\t<h2 class=\"title eael-dch-title\"><span class=\"eael-dch-title-text eael-dch-title-lead lead solid-color\">Share Professor Mohand Aaid Allili<\/span> <span class=\"eael-dch-title-text\">publications with your network!<\/span><\/h2><div class=\"eael-dch-separator-wrap\"><span class=\"separator-one\"><\/span>\n\t\t\t<span class=\"separator-two\"><\/span><\/div>\t\t\t<\/div>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Welcome If needed, this section can contain some or all of the following: A large, engaging image of the university, department, or an abstract representation of the academic field can set a professional and inspiring tone. A brief welcome message or introduction that explains what visitors will find on the page. This could be a [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"open","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-4182","page","type-page","status-publish","hentry","entry"],"_links":{"self":[{"href":"https:\/\/cirics.uqo.ca\/en\/wp-json\/wp\/v2\/pages\/4182","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cirics.uqo.ca\/en\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/cirics.uqo.ca\/en\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/cirics.uqo.ca\/en\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/cirics.uqo.ca\/en\/wp-json\/wp\/v2\/comments?post=4182"}],"version-history":[{"count":1,"href":"https:\/\/cirics.uqo.ca\/en\/wp-json\/wp\/v2\/pages\/4182\/revisions"}],"predecessor-version":[{"id":4184,"href":"https:\/\/cirics.uqo.ca\/en\/wp-json\/wp\/v2\/pages\/4182\/revisions\/4184"}],"wp:attachment":[{"href":"https:\/\/cirics.uqo.ca\/en\/wp-json\/wp\/v2\/media?parent=4182"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}