key: cord-0058250-nk8obbaa authors: Schwarz, Thorsten; Akbarioroumieh, Arsalan; Melfi, Giuseppe; Stiefelhagen, Rainer title: Developing a Magnification Prototype Based on Head and Eye-Tracking for Persons with Low Vision date: 2020-08-10 journal: Computers Helping People with Special Needs DOI: 10.1007/978-3-030-58796-3_42 sha: 507a01d25daf10b4c511c510fbe6e0fb4096eafa doc_id: 58250 cord_uid: nk8obbaa

Severe visual impairments make it difficult for users to work on a computer. For this reason, there is great demand for new technical aids on the computer to compensate for these limitations. Current magnification software makes it possible to adjust the screen content, but due to the lack of overview and the time-consuming use of the mouse it is sometimes difficult to find the right content. If another physical disability is involved, working on a computer often becomes even more difficult. In this paper, we present the development of an affordable magnification system based on a low-cost eye-tracking device, which can be adjusted to the visual impairment without the need for a mouse or keyboard by using the line of vision derived from eye or head movements. Two studies with experts and potential users showed the usefulness of the system.

Nowadays, computers are all around us and life without them is unimaginable. For most people, computers are the most convenient way to get things done, whether in everyday life or at work. However, not everyone can take full advantage of digital information because of a visual impairment such as albinism, glaucoma, retinopathy, or macular degeneration, often accompanied by nystagmus, strabismus, a limited field of vision, and impaired color vision. Thus, people with low vision have problems when working with computers because they often try to use the same working techniques as people with full sight. They usually use various tools on the computer to be able to work, such as magnifying software (e.g.
ZoomText, SuperNova Magnifier) or screen reader software (e.g. JAWS, NVDA) to compensate for their reduced vision. Current magnification software allows the screen content to be adapted to the visual impairment in terms of magnification factor and color settings. However, since the enlarged screen content does not adapt to the width of the screen, users must constantly scroll to capture the content. Skimming over a text is no longer possible and reading becomes more difficult. In addition, people with low vision can easily lose track of what is happening on the screen due to the magnification. If another physical disability is involved, working on a computer becomes even more difficult and inefficient. Moreover, commercial assistive technologies are expensive, and many people with disabilities cannot afford a suitable, efficient system. It is therefore necessary to develop intuitive solutions that enable efficient work with low vision. Thus, the aim of this work was to develop an affordable magnification system based on a low-cost eye-tracking device, which can be adjusted to the visual impairment without the need for a mouse or keyboard by using the direction of vision derived from eye or head movements.

The paper is structured as follows: First, we present related work; then we describe the development of our prototype, its functions, and the design challenges. In Sect. 4, we present a pilot study with sighted experts and a second study with users with low vision. Finally, we draw our conclusions.

Several studies have already investigated eye tracking as a possible method of controlling the computer instead of using the keyboard and mouse. Xuebai et al. [13] developed an application with low-cost eye trackers. Their application recognizes the user's gaze and places the cursor at the corresponding position of the gaze. Missimer et al.
[8] created a camera-based system that monitors the eyes of users and uses the position of the head to control the cursor. Eriksson-Backa et al. [2] designed a magnification system that uses the gaze point. This system is also suitable for users with physical disabilities and for users with low vision; however, the user must wear special glasses to control the system. Chin et al. [1] designed a system for computer users with disabilities who cannot use their hands due to spinal disabilities. The system uses electromyogram (EMG) signals from muscles in the face and gaze point coordinates as inputs. Lupu et al. [4] developed a system for communication with people with neuro-locomotor disabilities using eye tracking. Their eye-tracking system is based on a webcam mounted on glasses for image processing. The eye movement is recorded by a special device, and the voluntary blinking of the eye is correlated with a pictogram or keyword selection reflecting the patient's needs. In 2013, Lupu et al. proposed another communication system [5] using head-mounted video glasses based on eye tracking. The system tracks the eye movements of the user, and the mouse pointer is moved on the screen accordingly. Salunkhe et al. [11] proposed an eye-tracking system that controls the movement of the computer mouse pointer and simulates mouse clicks by detecting blinks. Similarly, Meghna et al. [7] developed a virtual-mouse tool based on head tracking that uses a classification method to recognize facial features. Shengwen Zhu (2017) [12] developed a magnification system using the inexpensive Tobii EyeX eye tracker. The goal of that work was to design and implement a magnification system for people with impaired vision; almost all requirements from Zhu's work [12] were taken as a basis for this study. WenChung Kao et al.
[3] developed a magnification software that recognizes the user's gaze point on the screen based on a digital camera and changes the size of the local image. The main problem with this approach is that the magnification window jumps around and does not move smoothly. Stephan Pölzer et al. (2018) [9] proposed a modular magnification tool based on the inexpensive Tobii EyeX eye tracker for users with visual impairments such as nystagmus as well as users without visual impairments. The tool has no graphical interface and is started from the command line. It incorporates the technique described in [10] to solve the problem of the moving window for users with nystagmus.

Since the goal of this work was to develop a magnification system and not an eye tracker, the focus was on finding the best algorithm for a stable magnification window. When the magnification system uses the raw gaze points it receives directly from the eye tracker, the magnification window is unstable and wobbles considerably. WenChung Kao's idea of limiting the magnification window was very helpful: a limit within the magnification window helps the system react more stably, because the window only moves when it receives a gaze point outside the limit.

The magnification tool has been developed in an iterative and incremental development process. We pursued the idea of co-design by involving both potential users with visual impairments and experts working with people with visual impairments, in order to get feedback as early as possible for the design of an intuitive interface. We chose the Tobii 4C eye tracker as the core component of the prototype. It has a sampling rate of 80–90 Hz, is connected to a computer via USB, and is mounted in the middle of the bottom edge of the screen. In addition, the system also offers head tracking, which can serve as an alternative if eye tracking is not feasible.
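The gaze-limit (dead-zone) idea described above, combined with a movement speed that is inversely proportional to the zoom factor, could be sketched roughly as follows. The actual tool was written in C#; this Python fragment is only an illustration, and all names and default values (`MagnifierWindow`, `base_speed`, the window dimensions) are assumptions rather than the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class MagnifierWindow:
    cx: float                 # window center x (screen pixels)
    cy: float                 # window center y
    width: float = 800.0      # magnification window size (assumed)
    height: float = 600.0
    zoom: float = 4.0         # magnification factor
    base_speed: float = 60.0  # assumed pixels per update at zoom factor 1

    def gaze_in_inner_box(self, gx: float, gy: float) -> bool:
        # Dead zone ("inner box"): a central region, by default half the
        # window's width and height; gaze jitter inside it is ignored.
        return (abs(gx - self.cx) <= self.width / 4
                and abs(gy - self.cy) <= self.height / 4)

    def update(self, gx: float, gy: float) -> None:
        # Move only when the gaze leaves the inner box, and then glide
        # toward the gaze point at a speed inversely proportional to the
        # zoom factor (slower at zoom 8, faster at zoom 2) instead of
        # jumping straight to it.
        if self.gaze_in_inner_box(gx, gy):
            return                      # window stays put
        dx, dy = gx - self.cx, gy - self.cy
        dist = (dx * dx + dy * dy) ** 0.5
        step = min(self.base_speed / self.zoom, dist)
        self.cx += dx / dist * step
        self.cy += dy / dist * step
```

Called once per gaze sample (the Tobii 4C delivers 80–90 samples per second), small fixational jitter leaves the window stationary, while a sustained gaze shift drags the window smoothly toward the new screen region.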
The software was developed entirely in C# to create the most efficient, accessible tool possible. During our development process, we faced three challenges: (i) a stable magnification window, (ii) a magnification window with smooth movements, and (iii) maintaining an overview of the screen.

Stabilizing the Magnification Window. The biggest challenge in developing magnification software with an eye tracker for people with low vision is the question of when to move the magnification window. We tried to stabilize our magnification window by using the collected gaze points of the eye tracker. It turned out that for users without visual impairment, the gaze point remains in a small area around the fixation target, i.e. in most cases the distance is less than 50 pixels. For users with low vision, however, the distance can be up to 1000 pixels, which would lead to rapid jumps of the magnification window. Thus, we defined a limit for eye movements within which the magnification window remains stable (see Fig. 1). Within this area, eye movements can jump back and forth without affecting the position of the magnification window; only when the gaze moves outside the area does the magnification window move. The size of this boundary is very important: if the inner box is too small, the gaze easily strays outside it and the window starts to wobble; if it is too large, the magnification window becomes difficult to move. The inner box is a small area in the middle of the magnification window whose size depends on the size of the window; its default width and height are set to half the width and height of the magnification window.

Smoothing the Movement of the Magnification Window.
The second challenge was to determine how the magnification window moves. If the window simply jumps to each gaze point outside the inner box, it does not move smoothly, which is not optimal when the user wants to read a text horizontally or vertically. Therefore, the speed of the movement must be controlled: it is set to be inversely proportional to the zoom factor. This means that the window moves more slowly with a zoom factor of 8 and faster with a zoom factor of 2.

Keeping the Overview of the Screen. Another challenge was to determine the correct ratio between the magnification window and the overall screen. With a higher magnification factor, the magnification window covers most of the main screen, which makes it easy for users to lose track of the overall screen. An approach to this problem was developed by Ashmore et al. [6]: they used a fisheye lens with the highest zoom factor in the center and a lower zoom factor at the edges of the lens. This approach was not accepted by the users in our pilot study and was therefore replaced by a "Where am I" function.

In our prototype, we implemented an eye- (head-) controlled and a mouse-controlled magnification mode. The final tool includes the following five functions: (i) magnification, (ii) settings, (iii) automatic determination of the best control mode, (iv) a tool-internal eye-tracker calibration process, and (v) instructions. The goal of our approach was to develop software that is easy to use, covers all expected functions, and provides the functionality of normal magnification software for the majority of users (Fig. 2).

We evaluated the magnification system qualitatively in two steps: first with experts without visual impairment, and second with users with various eye diseases. The tasks for the test procedure were the same for all users: 1.
Use Tobii's own calibration procedure to calibrate the eye tracker. 2. Use the internal calibration process to find the best tracking mode. 3. Set the most convenient configuration. 4. Start using the magnification for reading.

In the first pilot study, six sighted experts working with people with visual impairments were invited to test the software. In all tests, the calibration process of the Tobii eye tracker was successfully completed. The magnification tool was tested with both the head-tracking mode and a combination of head and eye tracking. There was no clear preference regarding the tracking mode: some users preferred the combination of head and eye tracking because the magnification window moved or reacted faster than in eye-tracking mode alone, while for other users eye tracking was the best solution. The goal of this first study was to evaluate our internal calibration tool and the general functions of the magnification software in order to optimize the system. The following two features were added based on user feedback: (1) freezing, i.e. stopping the movement of the magnification window to allow reading even when the eyes move outside the bounding box, and (2) a shortcut key for finding the main menu.

In our second pilot study, three students with low vision (two male and one female) were invited to test the tool. Table 1 shows an overview of the three participants (P) regarding their eye disease, the magnification factor they used, how the calibration was performed, and which tracking method they preferred (eye, head) or whether they preferred to work with the mouse. Participant 1 had problems using the Tobii eye tracker's own calibration process due to the colors, contrasts, and size of the calibration points. Therefore, the calibration was done manually. Another problem was his habit of moving his head too close to the monitor, which meant that the eye tracker could not determine the eye positions.
As a result, the tool could not respond optimally, and the participant was therefore asked to test the software in mouse-controlled mode. The participant also mentioned that the color representation is much better than with other magnification tools.

Participant 2 also could not successfully complete the calibration process of the eye tracker. However, our internal calibration process could be used to select the best eye-tracking mode; the results are shown in Fig. 3. Based on the results of the calibration, the internal tool suggested the tracking mode "right eye". The test result was surprisingly good, and the magnification software was stable during the test. The user found the software easy to use and very promising. Since the user has nystagmus, he mentioned that the freezing function was very useful for him. In the second part of the test, the combined head-eye tracking was tested for comparison. This option causes a faster movement of the magnification window; due to this speed increase combined with the high zoom factor, it was difficult to follow a sentence or find the beginning or end of a sentence. In the third and last part of the evaluation, only head tracking was used. The magnification window was surprisingly stable in the middle of the screen; only when the participant turned his head too far to the left or right did the magnification window become unstable. However, the user mentioned that he would need time to train with this mode. He also said that when reading, it would be better to move the magnification window only along the x-axis and not up and down; locking the direction would be a desirable feature here.

The eyes of the third participant were directed in different directions (strabismus), which tempted the participant to compensate by steering with the "stable" eye. The Tobii calibration process was again not completed successfully (not even by pointing manually at the calibration points on the screen).
The internal calibration process was again used to find the best control mode (see Fig. 4). The result was that the eye tracker could not detect the gaze points in this case and suggested using the head-tracking mode. For this participant, the head-tracking mode worked very well, and she was able to interact with the software in this mode better than the other users. She mentioned that she already noticed she could use the software better the longer she worked with it and the more familiar she became with the controls. Finally, she was asked to test the software in the mouse-driven mode. Although she had been using other magnification tools for years, in her opinion the color filters of the presented magnification tool (e.g. inverted color filters) were better defined than in the other tools.

Results. The following three features were added based on feedback from the participants with low vision: (1) a "Where am I" function, triggered by a key combination, that shows the user the position of the current magnification area with a large red circle on the entire screen; (2) a shortcut key to apply the color inversion filter to the magnification window; (3) the magnification window should follow the keyboard focus.

In this paper, we presented a user-centered approach to the development of a novel software magnification system. It is based on a low-cost eye/head tracking system, where the magnification window is displayed on the screen at the same position as the fixation target. In this way, the user maintains an overview of the screen and can work with the mouse in the original sense instead of using it for scrolling. For optimization, the software is not limited to eye positions but also offers the possibility to evaluate head movements and use them to control the magnification window.
Our pilot studies indicate that the combined eye-head magnification tool presented in this paper is helpful for almost all users with low vision. We also expect that it can provide great added value for people with limited mobility (e.g. paraplegia). The evaluation of the prototype has also shown that interaction via purely eye-based control is very difficult or even impossible for some people with low vision. For this reason, control by head movement was implemented in addition to mouse control in the classical sense. This opens up all possibilities of individual control for the user. The user studies also pointed out that in many cases it makes sense to use only one eye instead of both for control, since most people have a so-called guiding eye. Finally, we plan to continue user testing with a larger number of people with low vision. The tool can be found on our website (https://www.szs.kit.edu) and can be used after registration and the willingness to participate in a survey. We would appreciate any feedback in order to gather further experiences and suggestions for improvement.

References
[1] Integrated electromyogram and eye-gaze tracking cursor control system for computer users with motor disabilities
[2] Vision enhancement technique based on eye tracking system
[3] Real-time image magnifier with visible-spectrum gaze tracker
[4] Mobile embedded system for human computer communication in assistive technology
[5] Eye tracking mouse for human computer interaction
[6] Efficient eye pointing with a fisheye lens
[7] Head tracking virtual mouse system based on AdaBoost face detection algorithm
[8] Blink and wink detection for mouse pointer control
[9] Gaze based magnification to assist visually impaired persons
[10] Assisting people with nystagmus through image stabilization: using an ARX model to overcome processing delays
[11] A device controlled using eye movement
[12] Software magnifier with eye-tracking for visually impaired
[13] Eye tracking based control system for natural human-computer interaction