Introducing a Stress Management and Navigation System for Blind Individuals
Volume 10, Issue 2, Page No 12-27, 2025
Authors: Youssef Keryakos 1,*, Youssef Bou Issa 1, Michel Salomon 2, Abdallah Makhoul 2
1 TICKET Lab, Antonine University, Baabda, Lebanon
2 FEMTO-ST Institute, University of Franche-Comté, Besançon, France
*Author to whom correspondence should be addressed. E-mail: youssef.keryakos@ua.edu.lb
Adv. Sci. Technol. Eng. Syst. J. 10(2), 12-27 (2025); DOI: 10.25046/aj100202
Keywords: Stress, Navigation for the blind, Obstacle detection
Navigating outdoors is among the most challenging tasks in the daily life of blind individuals. In this context, we introduce and describe a navigation system that provides two important functions for blind users. First, the system suggests the least stressful route among the various possible paths between a starting point and a destination. Second, it provides real-time navigation guidance and real-time obstacle detection. The process of identifying the least stressful route begins with the identification of all possible routes; an Index of Difficulty (IDR) is then computed for each route, an Index of Stress (ISR) is derived from the IDR, and finally the route with the lowest ISR is selected as the least stressful.
Received: 25 September 2024 Revised: 31 January 2025 Accepted: 08 February 2025 Online: 20 March 2025
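The route-selection process described in the abstract (enumerate candidate routes, compute an IDR per route, derive an ISR from it, pick the minimum) can be sketched as follows. The concrete IDR and ISR formulas below are illustrative assumptions only: the abstract states that ISR is derived from IDR but does not give either definition, so the obstacle-weight sum and the length scaling are placeholders, not the paper's actual metrics.

```python
# Sketch of the least-stressful-route selection step.
# NOTE: index_of_difficulty and index_of_stress below are hypothetical
# stand-ins; the paper's actual IDR/ISR definitions are not given here.

def index_of_difficulty(route):
    """Hypothetical IDR: sum of per-obstacle difficulty weights along the route."""
    return sum(obstacle["weight"] for obstacle in route["obstacles"])

def index_of_stress(route):
    """Hypothetical ISR: IDR scaled by route length (placeholder derivation)."""
    return index_of_difficulty(route) * route["length_km"]

def least_stressful_route(routes):
    """Given all candidate routes, return the one with the lowest ISR."""
    return min(routes, key=index_of_stress)

# Toy example: two candidate routes between the same start and destination.
routes = [
    {"name": "A", "length_km": 1.2, "obstacles": [{"weight": 3}, {"weight": 2}]},
    {"name": "B", "length_km": 0.9, "obstacles": [{"weight": 5}]},
]
best = least_stressful_route(routes)
print(best["name"])  # route B: ISR = 5 * 0.9 = 4.5 < 6.0 for route A
```

The selection itself is a simple argmin over the per-route stress scores; the substance of the method lies in how IDR and ISR are actually defined, which the full paper details.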