Oleg Špakov @ TUNI

2026
[79]
 Müller, P., Špakov, O., Verho, J., Salpavaara, T., Savia, M., Sinkkonen, A., Kamppari, V., Rantala, J., Kallio, P., Surakka, V.
 (2026)
 Self-learning multichannel olfactory display.
 In
 Digital Chemical Engineering
, 18
, Elsevier
2025
[78]
 Surakka, V., Björkbacka, M., Lylykangas, J., Rantala, J., Salpavaara, T., Verho, J., Špakov, O., Kamppari, V., Müller, P., Vehkaoja, A., Kallio, P., Thaploo, D., Hummel, T.
 (2025)
 A new method for automated olfactory threshold testing.
 In
 Chemical Senses
, 50, 2025, bjaf029
, Oxford Academic
[77]
 Singh, R., Ziat, M., Špakov, O., Mäkelä, J., Surakka, V., Raisamo, R.
 (2025)
 Trust and Visual Focus in Automated Vehicles: A Comparative Study of Beginner and Experienced Drivers.
 In
 Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems
, ACM
, pp. 1-11
[76]
 Remizova, V., Špakov, O., Sand, A., Lylykangas, J., Qin, M., Helminen, T. M., Takio, F., Rantanen, K., Kylliäinen, A., Surakka, V., Gizatdinova, Y.
 (2025)
 Foggy fun: an exploration of mid-air gestural interaction with a fogscreen by children with attention-deficits.
 In
 Behaviour & Information Technology
, Taylor & Francis
, pp. 1-24
2023
[75]
 Gizatdinova, Y., Špakov, O., Tuisku, U., Turk, M., and Surakka, V.
 (2023)
 Vision-Based Interfaces for Character-Based Text Entry: Comparison of Errors and Error Correction Properties of Eye Typing and Head Typing.
 In
 Advances in Human-Computer Interaction
, 2023, Article ID 8855764
, Hindawi
, 23 pages
[74]
 Remizova, V., Sand, A., Špakov, O., Lylykangas, J., Qin, M., Helminen, T., Rantanen, K., Kylliäinen, A., Surakka, V. & Gizatdinova, Y.
 (2023)
 Exploring Mid-air Gestural Interfaces for Children with ADHD.
 In
 Proceedings of the 16th ACM SIGGRAPH Conference on Motion, Interaction and Games (MIG’23)
, #7
, ACM
, pp. 1-10
[73]
 Remizova, V., Sand, A., MacKenzie, I. S., Špakov, O., Nyyssönen, K., Rakkolainen, I., Kylliäinen, A., Surakka, V. & Gizatdinova, Y.
 (2023)
 Mid-Air Gestural Interaction with a Large Fogscreen.
 In
 Multimodal Technologies and Interaction
, 7(7), 63
, MDPI
[72]
 Špakov, O., Venesvirta, H., Lylykangas, J., Farooq, A., Raisamo, R., and Surakka, V.
 (2023)
 Multimodal Gaze-Based Interaction in Cars: Are Mid-Air Gestures with Haptic Feedback Safer Than Buttons?
 In
 Marcus, A., Rosenzweig, E., Soares, M.M. (eds) Design, User Experience, and Usability: 25th International Conference on Human-Computer Interaction (HCII’23)
, 14032
, Springer, Cham
[71]
 Špakov, O., Venesvirta, H., Farooq, A., Lylykangas, J., Rakkolainen, I., Raisamo, R., and Surakka, V.
 (2023)
 Cueing Car Drivers with Ultrasound Skin Stimulation.
 In
 Krömker, H. (eds) HCI in Mobility, Transport, and Automotive Systems: 25th International Conference on Human-Computer Interaction (HCII’23)
, 14049
, Springer, Cham
[70]
 Farooq, A., Raisamo, R., Lylykangas, J., Špakov, O., and Surakka, V.
 (2023)
 Comparing Electrostatic and Vibrotactile Feedback for In-Car Touchscreen Interaction using common User Interface Controls.
 In
 Ahram, T., Karwowski, W., Di Bucchianico, P., Taiar, R., Casarotto, L., and Costa, P. (eds) Intelligent Human Systems Integration (IHSI 2023): Integrating People and Intelligent Systems. AHFE (2023) International Conference
, 69
, AHFE Open Access
[69]
 Parttimaa, T., Kylliäinen, A., Lehtonen, E., Saarimäki, H., Špakov, O., Senju, A., Leppänen, J., & Helminen, T.
 (2023)
 Heart Rate Orientation Responses to Faces in Toddlers with Prodromal ASD.
 In
 Poster at the INSAR Annual Meeting, Stockholm
[68]
 Saarimäki, H., Keles, U., Helminen, T., Lehtonen, E., Konopkina, K., Špakov, O., Takio, F., Kliemann, D., Byrge, L., Kennedy, D. P., & Kylliäinen, A.
 (2023)
 Gaze Patterns to Social and Nonsocial Videos in Toddlers with High and Low ADOS Total Scores.
 In
 Poster at the INSAR Annual Meeting, Stockholm
2022
[67]
 Špakov, O., Farooq, A., Venesvirta, H., Hippula, A., Surakka, V., and Raisamo, R.
 (2022)
 Ultrasound Feedback for Mid-air Gesture Interaction in Vibrating Environment.
 In
 Ahram, T., and Taiar, R. (eds) Human Interaction & Emerging Technologies (IHIET-AI 2022): Artificial Intelligence & Future Applications, AHFE’22 International Conference
, 23
, AHFE Open Access
[66]
 Kangas, J., Špakov, O., Raisamo, R., Koskinen, O., Järvenpää, T., and Salmimaa, M.
 (2022)
 Head and Gaze Orientation in Hemispheric Image Viewing.
 In
 Peysakhovich, V. (Ed.), Frontiers in Virtual Reality
, 3:822189
2021
[65]
 Farooq, A., Nukarinen, T., Sand, A., Venesvirta, H., Špakov, O., Surakka, V., and Raisamo, R.
 (2021)
 Where's My Cellphone: Non-contact based Hand-Gestures and Ultrasound haptic feedback for Secondary Task Interaction while Driving.
 In
 Proceedings of IEEE SENSORS 2021
, IEEE
, pp. 1-4
2020
[64]
 Sand, A., Remizova, V., MacKenzie, I.S., Špakov, O., Nieminen, K., Rakkolainen, I., Kylliäinen, A., Surakka, V., and Kuosmanen, J.
 (2020)
 Tactile feedback on mid-air gestural interaction with a large fogscreen.
 In
 AcademicMindtrek '20: Proceedings of the 23rd International Conference on Academic Mindtrek
, ACM
, pp. 161–164
2019
[63]
 Majaranta, P., Räihä, K.-J., Hyrskykari, A., Špakov, O.
 (2019)
 Eye Movements and Human-Computer Interaction.
 In
 Klein, C., & Ettinger, U. (Eds.), Eye Movement Research
, Springer
, pp. 971-1015
[62]
 Špakov, O., Niehorster, D. C., Istance, H., Räihä, K.-J., Siirtola, H.
 (2019)
 Two-way gaze sharing in remote teaching.
 In
 Lamas D., Loizides F., Nacke L., Petrie H., Winckler M., Zaphiris P. (eds) Human-Computer Interaction – INTERACT 2019
, Springer, Cham
, pp. 242-251
[61]
 Siirtola, H., Räihä, K.-J., Istance, H., Špakov, O.
 (2019)
 Dissecting Pie Charts.
 In
 IFIP International Federation for Information Processing
, Springer, Cham
, pp. 688-698
[60]
 Špakov, O., Istance, H., Räihä, K.-J., Viitanen, T., Siirtola, H.
 (2019)
 Eye gaze and head gaze in collaborative games.
 In
 Proceedings of the 2019 Symposium on Eye Tracking Research & Applications, ETRA’19
, #85
, ACM
[59]
 Siirtola, H., Špakov, O., Istance, H., Räihä, K.-J.
 (2019)
 Shared Gaze in Collaborative Visual Search.
 In
 International Journal of Human–Computer Interaction
, 35(18)
, Taylor & Francis Online
, pp. 1693-1705
2018
[58]
 Špakov, O., Istance, H., Hyrskykari, A., Siirtola, H., Räihä, K.-J.
 (2018)
 Improving the performance of eye trackers with limited spatial accuracy and low sampling rates for reading analysis by heuristic fixation-to-word mapping.
 In
 Behavior Research Methods
, 51(6)
, Springer
, pp. 2661-2687
[57]
 Gizatdinova, Y., Špakov, O., Tuisku, O., Turk, M., Surakka, V.
 (2018)
 Gaze and head pointing for hands-free text entry: applicability to ultra-small virtual keyboards.
 In
 Proceedings of the 2018 Symposium on Eye Tracking Research & Applications, ETRA’18
, ACM
, 9 pages
[56]
 Špakov, O., Istance, H., Viitanen, V., Siirtola, H., Räihä, K.-J.
 (2018)
 Enabling unsupervised eye tracker calibration by school children through games.
 In
 Proceedings of the 2018 Symposium on Eye Tracking Research & Applications, ETRA’18
, ACM
, 9 pages
2017
[55]
 Räihä, K.-J., Špakov, O., Istance, H., Niehorster, D. C.
 (2017)
 Gaze-assisted remote communication between teacher and students.
 In
 Radach, R., Deubel, H., Vorstius, C., & Hofmann, M.J. (Eds.), Abstracts of the 19th European Conference on Eye Movements, Journal of Eye Movement Research
 Presented at the 19th European Conference on Eye Movements, ECEM’17
, 10(6)
, Wuppertal
, p. 105
[54]
 Špakov, O., Siirtola, H., Istance, H., Räihä, K.-J.
 (2017)
 Visualizing the Reading Activity of People Learning to Read.
 In
 Journal of Eye Movement Research
, 10(5):5
, pp. 1-12
[53]
 Venesvirta, H., Špakov, O., Gizatdinova, J., Tuisku, O., Rantanen, V., Verho, J., Vetek, A., Lekkala, J., Surakka, V.
 (2017)
 Smile to Save It: Facial Expressions for Lifelogging.
 In
 Proceedings of the 16th International Conference on Mobile and Ubiquitous Multimedia, MUM ’17 [poster]
, pp. 441–448
[52]
 Rantala, J., Majaranta, P., Kangas, J., Isokoski, P., Akkil, D., Špakov, O., Raisamo, R.
 (2017)
 Gaze Interaction with Vibrotactile Feedback: Review and Design Guidelines.
 In
 Journal of Eye Movement Research
, 39 pages
2016
[51]
 Špakov, O., Isokoski, P., Kangas, J., Rantala, J., Akkil, D., Raisamo, R.
 (2016)
 Comparison of Three Implementations of HeadTurn: A Multimodal Interaction Technique with Gaze and Head Turns.
 In
 Proceedings of the International Conference on Multimodal Interfaces, ICMI’16
, ACM
, pp. 289-296
[50]
 Nukarinen, T., Kangas, J., Špakov, O., Isokoski, P., Akkil, D., Rantala, J., Raisamo, R.
 (2016)
 Evaluation of HeadTurn - An Interaction Technique Using the Gaze and Head Turns.
 In
 Proceedings of Nordic Conference on Human-Computer Interaction, NordiCHI’16
, #43
, ACM
[49]
 Špakov, O., Istance, H., Räihä, K.-J., Siirtola, H., Hyrskykari, A.
 (2016)
 Field Testing of a Low Cost Eye Tracker with primary school children in the context of developing a gaze aware reading aid.
 In
 Presented at The Scandinavian Workshop on Applied Eye Tracking, SWAET’16
, University of Turku, Finland
, p. 14
[48]
 Špakov, O., Siirtola, H., Istance, H., Räihä, K.-J.
 (2016)
 GazeLaser: A Hands-Free Highlighting Technique for Presentations.
 In
 Proceedings of the Extended Abstracts on Human Factors in Computing Systems, CHI EA’16
, ACM
, pp. 2648-2654
[47]
 Kangas, J., Špakov, O., Isokoski, P., Akkil, D., Rantala, J., Raisamo, R.
 (2016)
 Feedback for Smooth Pursuit Gaze Tracking Based Control.
 In
 Proceedings of the 7th Augmented Human International Conference, AH’16
, 6
, ACM
[46]
 Špakov, O., Isokoski, P., Kangas, J., Akkil, D., Majaranta, P.
 (2016)
 PursuitAdjuster: An Exploration into the Design Space of Smooth Pursuit–based Widgets.
 In
 Proceedings of the 2016 Symposium on Eye Tracking Research & Applications, ETRA’16
, ACM
, pp. 287-290
[45]
 Majaranta, P., Isokoski, P., Rantala, J., Špakov, O., Akkil, D., Kangas, J., & Raisamo, R.
 (2016)
 Haptic Feedback in Eye Typing.
 In
 Groner R. et al. (Eds.) Journal of Eye Movement Research
, 9(1):3
, pp. 1-13
2015
[44]
 Rantala, J., Kangas, J., Isokoski, P., Akkil, D., Špakov, O., and Raisamo, R.
 (2015)
 Haptic Feedback of Gaze Gestures with Glasses: Localization Accuracy and Effectiveness.
 In
 Proceedings of the International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp’15: PETMEI 2015)
, ACM
, pp. 855-862
[43]
 Venesvirta, H., Surakka, V., Gizatdinova, Y., Lylykangas, J., Špakov, O., Verho, J., Vetek, A., and Lekkala, J.
 (2015)
 Emotional Reactions to Point-Light Display Animations.
 In
 Interacting with Computers
, Oxford Journals
, 18 pages
[42]
 Špakov, O., Rantala, J. and Isokoski, P.
 (2015)
 Sequential and simultaneous skin stimulation with multiple actuators on head, neck and back for gaze cuing.
 In
 Proceedings of World Haptics Conference
, IEEE
, pp. 333-338
[41]
 Akkil, D., Kangas, J., Rantala, J. and Isokoski, P., Špakov, O., Raisamo, R.
 (2015)
 Glance Awareness and Gaze Interaction in Smartwatches.
 In
 Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems
, ACM
, pp. 1271-1276
[40]
 Sharmin, S., Špakov, O., and Räihä, K.-J.
 (2015)
 Dynamic Text Presentation in Print Interpreting – an Eye Movement Study of Reading Behaviour.
 In
 International Journal of Human-Computer Studies
, 78
, Elsevier
, pp. 17-30
2014
[39]
 Käki, K., Majaranta, P., Špakov, O., and Kangas, J.
 (2014)
 Effects of Haptic Feedback on Gaze Based Auto Scrolling.
 In
 Proceedings of the 8th Nordic Conference on Human-Computer Interaction, NordiCHI EA ’14
, ACM
, pp. 947-950
[38]
 Tuisku, O., Rantanen, V., Špakov, O., Surakka, V., Lekkala , J.
 (2014)
 Pointing and Selecting with Facial Activity.
 In
 Interacting with Computers
, Oxford Journals
, 12 pages
[37]
 Špakov, O., Isokoski, P., Majaranta, P.
 (2014)
 Look and Lean: Accurate Head-Assisted Eye Pointing.
 In
 Proceedings of the 2014 Symposium on Eye Tracking Research & Applications, ETRA’14
, ACM
, pp. 35-42
[36]
 Špakov, O., Gizatdinova, Y.
 (2014)
 Real-Time Hidden Gaze Point Correction.
 In
 Proceedings of the 2014 Symposium on Eye Tracking Research & Applications, ETRA’14
, ACM
, pp. 291-294
2013
[35]
 Rantanen, V., Kumpulainen, P., Venesvirta, H., Verho, J., Špakov, O., Lylykangas, J., Vetek, A., Surakka, V., Lekkala, J.
 (2013)
 Capacitive Facial Activity Measurement.
 In
 The e-Journal of the International Measurement Confederation (IMEKO)
, 2 (2)
, Elsevier
, pp. 78–85
[34]
 Rantanen, V., Venesvirta, H., Špakov, O., Verho, J., Vetek, A., Surakka, V., Lekkala, J.
 (2013)
 Capacitive Measurement of Facial Activity Intensity.
 In
 Sensors Journal
, 13 (11)
, IEEE
, pp. 4329-4338
[33]
 Sharmin, S., Špakov, O. and Räihä, K.-J.
 (2013)
 Reading On-Screen Text with Gaze-Based Auto Scrolling.
 In
 Proceedings of the Eye Tracking South Africa, ETSA'13
, ACM
, pp. 24-31
[32]
 Špakov, O.
 (2013)
 Previewable scrolling by gaze.
 In
  K. Holmqvist, F. Mulvey & R. Johansson (Eds.), Book of Abstracts of the 17th European Conference on Eye Movements, ECEM'2013
, 6(3)
, Lund, Sweden
, p. 469
[31]
 Kangas, J., Špakov, O., Majaranta, P., Rantala, J., Isokoski, P., Raisamo, R.
 (2013)
 Defining Gaze Interaction Events.
 In
 Presented at "Gaze Interaction in the Post-WIMP World" Workshop of the Conference on Human Factors in Computing Systems, CHI’13
, Paris, France
[30]
 Špakov, O.
 (2013)
 Defining Standard Gaze Tracking API.
 In
 Presented at "Gaze Interaction in the Post-WIMP World" Workshop of the Conference on Human Factors in Computing Systems, CHI’13
, Paris, France
2012
[29]
 Rantanen, V., Kumpulainen, P., Venesvirta, H., Verho, J., Špakov, O., Lylykangas, J., Vetek, A., Surakka, V., Lekkala, J.
 (2012)
 Capacitive Facial Activity Measurement.
 In
 Proceedings of the XX IMEKO World Congress
, Busan, South Korea
, ISBN 978-89-950000-5-2
Best of IMEKO 2012 award.
[28]
 Špakov, O., and Majaranta, P.
 (2012)
 Enhanced Gaze Interaction Using Simple Head Gestures.
 In
 Proceedings of the 14th International Conference on Ubiquitous Computing (UbiComp’12: PETMEI 2012)
, ACM
, pp. 705-710
Best of PETMEI 2012 award.
[27]
 Sharmin, S., Špakov, O. and Räihä, K.-J.
 (2012)
 The effect of different text presentation formats on eye movement metrics in reading.
 In
 Groner R. et al. (Eds.) Journal of Eye Movement Research
, 5(3):3
, pp. 1-9
[26]
 Špakov, O.
 (2012)
 Comparison of eye movement filters used in HCI.
 In
 Proceedings of the 2012 Symposium on Eye Tracking Research & Applications, ETRA’12
, ACM
, pp. 281-284
[25]
 Gizatdinova, Y., Špakov, O., and Surakka, V.
 (2012)
 Comparison of Video-Based Pointing and Selection Techniques for Hands-Free Text Entry.
 In
 Proceedings of Advanced Visual Interfaces (AVI’12)
, ACM
, pp. 132-139
[24]
 Gizatdinova, Y., Špakov, O., and Surakka, V.
 (2012)
 Face Typing: Vision-Based Perceptual Interface for Hands-Free Text Entry with a Scrollable Virtual Keyboard.
 In
 Proceedings of the 2012 IEEE Workshop on the Applications of Computer Vision (WACV’12)
, IEEE
, 7 pages
2011
[23]
 Špakov, O.
 (2011)
 Comparison of Gaze-to-Objects Mapping Algorithms.
 In
 Proceedings of the 1st Conference on Novel Gaze-Controlled Applications, NGCA’11
, #6
, ACM
2009
[22]
 Špakov, O., and Majaranta, P.
 (2009)
 Scrollable Keyboard for Casual Gaze Typing.
 In
 Boehme, M., Hansen, J.-P., and Mulvey, F. (eds.) PsychNology Journal: Gaze Control for Work and Play
, 7(2)
, pp. 159-173
, ISSN 1720-7525
[21]
 Isokoski P., Joos, M., Špakov, O., and Martin, B.
 (2009)
 Gaze Controlled Games.
 In
 Stephanidis, C., Majaranta, P., & Bates, R. (eds.) Universal Access in the Information Society: Communication by Gaze Interaction
, 8(4)
, Springer
, pp. 323-337
[20]
 Majaranta, P., Majaranta, N., Daunys, G., and Špakov, O.
 (2009)
 Text Editing by Gaze: Static vs. Dynamic Menus.
 In
 Proceedings of COGAIN 2009: Gaze Interaction For Those Who Want It Most
, Technical University of Denmark
, pp. 19-24
[19]
 Majaranta, P., Ahola, U.-K., and Špakov, O.
 (2009)
 Fast Gaze Typing with an Adjustable Dwell Time.
 In
 Proceedings of the Conference on Human Factors in Computing Systems, CHI’09
, ACM
, pp. 357-360
Best of CHI 2009 award.
[18]
 Räihä, K.-J., and Špakov, O.
 (2009)
 Disambiguating Ninja Cursors with Eye Gaze.
 In
 Proceedings of the Conference on Human Factors in Computing Systems, CHI’09
, ACM
, pp. 1411-1414
Nominated for a Best of CHI 2009 award.
2008
[17]
 Špakov, O.
 (2008)
 A New Method for Visualization of Reading and Writing Processes in Translation of Text.
 In
 Presented at Abstracts for the Scandinavian Workshop on Applied Eye Tracking, SWAET’08
, Lund University, Sweden
[16]
 Špakov, O.
 (2008)
 iComponent – Device-Independent Platform for Analyzing Eye Movement Data and Developing Eye-Based Applications.
 Doctoral dissertation
, University of Tampere, TamPub
[15]
 Sharmin, S., Špakov, O., Räihä, K.-J., and Jakobsen, A. L.
 (2008)
 Where on the Screen Do Translation Students Look While Translating, and for How Long?
 In
 S. Göpferich, A. L. Jakobsen, and I. M. Mees (eds.), Proceedings of "Looking at Eyes: Eye-Tracking Studies of Reading and Translation Processing"
, 36
, Copenhagen Studies in Language
, pp. 31-51
[14]
 Špakov, O., and Majaranta, P.
 (2008)
 Scrollable Keyboards for Eye Typing.
 In
 Proceedings of COGAIN 2008: Communication, Environment and Mobility Control by Gaze
, CTU Publishing House
, pp. 63-66
[13]
 Špakov, O., and Räihä, K.-J.
 (2008)
 KiEV: A Tool for Visualization of Reading and Writing Processes in Translation of Text.
 In
 Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, ETRA’08
, ACM
, pp. 107-110
[12]
 Sharmin, S., Špakov, O., Räihä, K.-J., and Jakobsen, A. L.
 (2008)
 Effects of Time Pressure and Text Complexity on Translators’ Fixations.
 In
 Proceedings of the 2008 Symposium on Eye Tracking Research & Applications, ETRA’08
, ACM
, pp. 123-126
2006
[11]
 Špakov, O.
 (2006)
 iComponent – Gaze-data visualization and analysis tool: researcher-friendly approach.
 In
 Presented at Abstracts for the Scandinavian Workshop on Applied Eye Tracking, SWAET’06
, Lund University, Sweden
[10]
 Miniotas, D., Špakov, O., Tugoy, I., and MacKenzie, I. S.
 (2006)
 Speech-Augmented Eye Gaze Interaction with Small Closely Spaced Targets.
 In
 Proceedings of the 2006 Symposium on Eye Tracking Research & Applications, ETRA’06
, ACM
, pp. 67-72
2005
[9]
 Špakov, O., and Miniotas, D.
 (2005)
 Gaze-Based Selection of Standard-Size Menu Items.
 In
 Proceedings of the International Conference on Multimodal Interfaces, ICMI’05
, ACM
, pp. 124-128
[8]
 Špakov, O., and Miniotas, D.
 (2005)
 EyeChess: A Tutorial for Endgames with Gaze-Controlled Pieces.
 In
 Proceedings of COGAIN 2005
, pp. 16-18
[7]
 Špakov, O., and Miniotas, D.
 (2005)
 EyeChess: A Tutorial for End-games with Gaze-Controlled Pieces.
 In
 Book of Abstracts of the 13th European Conference on Eye Movements, ECEM13
, p. 34
2004
[6]
 Špakov, O., and Miniotas, D.
 (2004)
 On-Line Adjustment of Dwell Time for Target Selection by Gaze.
 In
 Proceedings of Nordic Conference on Human-Computer Interaction, NordiCHI’04
, ACM
, pp. 203-206
[5]
 Miniotas, D., Špakov, O., and MacKenzie, I. S.
 (2004)
 Eye Gaze Interaction with Expanding Targets.
 In
 Extended Abstracts of the Conference on Human Factors in Computing Systems, CHI’04
, ACM
, pp. 1255-1258
2003
[4]
 Špakov, O., Evreinova, T., and Evreinov, G.
 (2003)
 Pseudo-Graphic Typeface: Design and Evaluation.
 In
 Proceedings of the 1st Nordic Symposium on Multimodal Communication
, pp. 183-196
[3]
 Miniotas, D., Špakov, O., and Evreinov, G.
 (2003)
 Symbol Creator: An Alternative Eye-Based Text Entry Technique with Low Demand for Screen Space.
 In
 Proceedings of IFIP TC13 International Conference on Human-Computer Interaction, INTERACT 2003
, IOS Press
, pp. 137-143
2001
[2]
 Daunys, G., Laurutis, V., and Špakov, O.
 (2001)
 Eye And Hand Tracking Of 2D Pseudo-Random Targets.
 In
 Proceedings of 3rd Conference Of Sensomotoric Control and Man & Machine Interaction
, p. 28
1999
[1]
 Daunys, G., Miniotas, D., Špakov, O., and Laurutis, V.
 (1999)
 Accuracy of Visual Tracking During Pseudorandom Two-Dimensional Target Motion.
 In
 Book of Abstracts of the 10th European Conference on Eye Movements, ECEM10
, p. 28