Fatma Ben Taher, Nader Ben Amor, and Mohamed Jallouli


  [1] K. Arai and R. Mardiyanto, A prototype of electric wheelchair controlled by eye-only for paralyzed user, Journal of Robotics and Mechatronics, 23, 2011, 66–74.
  [2] X. Xu, Y. Zhang, Y. Luo, and D. Chen, Robust bio-signal based control of an intelligent wheelchair, Robotics, 2, 2013, 187–197.
  [3] N. Mani, A. Sebastian, A.M. Paul, A. Chacko, and A. Ragunath, Eye controlled electric wheel chair, International Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering, 4, 2015, 2494–2497.
  [4] T. Kaufmann, A. Herweg, and A. Kübler, Toward brain–computer interface based wheelchair control utilizing tactually-evoked event-related potentials, Journal of NeuroEngineering and Rehabilitation, 11, 2014, 1–17.
  [5] J. Philips, et al., Adaptive shared control of a brain-actuated simulated wheelchair, Proc. 10th IEEE Int. Conf. on Rehabilitation Robotics, Noordwijk, The Netherlands, 2007, 408–414.
  [6] H. Tran, et al., An EEG-controlled wheelchair using eye movements, Proc. 5th Int. Conf. on Biomedical Engineering, Vietnam, 2015, 470–473.
  [7] T. Carlson and J. del R. Millán, Brain-controlled wheelchairs: A robotic architecture, IEEE Robotics & Automation Magazine, 20, 2013, 65–73.
  [8] G. Massimo, et al., Towards a brain-activated and eye-controlled wheelchair, International Journal of Bioelectromagnetism, 13(1), 2011, 44–45.
  [9] E.C. Lee, J.C. Woo, J.H. Kim, M. Whang, and K.R. Park, A brain–computer interface method combined with eye tracking for 3D interaction, Journal of Neuroscience Methods, 190(2), 2010, 289–298.
  [10] A. Jain, C.W. de Silva, and Q.M.J. Wu, Intelligent fusion of sensor data for product quality assessment in a fish-cutting machine, Control and Intelligent Systems, 32, 2004.
  [11] D. Izadi, J.H. Abawajy, S. Ghanavati, and T. Herawan, A data fusion method in wireless sensor networks, Sensors (Basel), 15, 2015, 2964–2979.
  [12] S. Koelstra and I. Patras, Fusion of facial expressions and EEG for implicit affective tagging, Image and Vision Computing, 31, 2013, 164–174.
  [13] X. Li, A. Dick, C. Shen, Z. Zhang, A. van den Hengel, and H. Wang, Visual tracking with spatio-temporal Dempster–Shafer information fusion, IEEE Transactions on Image Processing, 22, 2013, 3028–3040.
  [14] W. Zheng, B. Dong, and B. Lu, Multimodal emotion recognition using EEG and eye tracking data, Proc. 36th Annual Int. Conf. of the IEEE Engineering in Medicine and Biology Society (EMBC), Chicago, IL, 2014, 5040–5043.
  [15] Emotiv EPOC software development kit, http://www.emotiv.com/.
  [16] P. Viola and M. Jones, Robust real-time face detection, International Journal of Computer Vision, 57(2), 2004, 137–154.
  [17] P.K. Allen, A. Timcenko, B. Yoshimi, and P. Michelman, Automated tracking and grasping of a moving object with a robotic hand-eye system, IEEE Transactions on Robotics and Automation, 9(2), 1993, 152–165.
  [18] J. Rada-Vilela, fuzzylite: A fuzzy logic control library written in C++, 2013.
  [19] OpenCV tutorial: http://www.geckogeek.fr/tutorial-opencvisoler-et-traquer-une-couleur.html.
  [20] A. Urken, Voting theory, data fusion, and explanations of social behaviour, AAAI Spring Symposium Series, North America, 2011.
  [21] L.I. Kuncheva, A theoretical study on six classifier fusion strategies, IEEE Transactions on Pattern Analysis and Machine Intelligence, 24(2), 2002, 281–286.
  [22] K.S. Ahmed, Wheelchair movement control via human eye blinks, American Journal of Biomedical Engineering, 1(1), 2011, 55–58.
  [23] H. Yamada and T. Muto, Using virtual reality to assess factors affecting shipboard accessibility for wheelchair users, Control and Intelligent Systems, 32, 2004.
