Article

A Computer Mouse Using Blowing Sensors Intended for People with Disabilities

1 School of Computer Engineering, University of Electronic Science and Technology of China, Zhongshan Institute, Zhongshan 528402, China
2 School of Electrical Engineering and Automation, Jiangxi University of Science and Technology, Ganzhou 341000, China
3 Department of Electronic Engineering, National Taipei University of Technology, Taipei 10608, Taiwan
4 Department of Electronic Engineering, St. John’s University, Tamsui, New Taipei City 25135, Taiwan
* Author to whom correspondence should be addressed.
Sensors 2019, 19(21), 4638; https://doi.org/10.3390/s19214638
Submission received: 12 August 2019 / Revised: 20 October 2019 / Accepted: 22 October 2019 / Published: 25 October 2019

Abstract

The computer is an important medium that allows people to connect to the internet. However, many people with disabilities are unable to use a computer mouse and thus cannot enjoy the benefits of the internet. Various assistive technologies exist for controlling a computer mouse, but they all have some operational inconveniences. In this paper, we propose an innovative blowing-controlled mouse assistive tool to replace the conventional hand-controlled mouse. Its main contribution is that it uses microphones to sense the small signals induced by airflow vibration and then converts each received signal into a corresponding pulse width. The co-design of software programming enables various mouse functions to be implemented by identifying the blowing pulse widths of multiple microphones. The proposed tool was evaluated experimentally, and the results show that its average identification rate exceeds 85%. Additionally, compared with other mouse assistive tools, the proposed mouse has the benefits of low cost and humanized operation. Therefore, the proposed blowing control method can not only improve the quality of life of people with disabilities but also overcome the disadvantages of existing assistive tools.

1. Introduction

With the development of computer and internet technology, people's lives have undergone major changes. Nowadays, people rely heavily on computers and the internet for work, information acquisition, shopping, and even entertainment, all of which deeply affect their daily lives. Among computer peripherals, the mouse is an important device for interacting with the computer, serving as a pointer to access the graphical user interface (GUI) on the screen. However, according to statistics from the World Health Organization, about 250,000–500,000 people suffer disabilities caused by accidents or illnesses every year. Their daily lives rely almost completely on family care, and it is difficult for them to operate computers using conventional hand-controlled mice. Therefore, to improve the quality of life of people with disabilities, it is necessary to design a humanized mouse assistive tool [1].
Apparently, people with disabilities may be unable to operate computers using conventional mice. People with disabilities, especially those with spinal cord injuries, can flexibly move only the parts of their body above the neck, so most existing mouse assistive tools are based on eye, head, or mouth movements, or on blowing and sucking actions, in order to provide the basic mouse functions [2,3,4]. The eye-tracking mouse is one of the most popular mouse assistive tools; it uses eye-movement tracking to control the cursor on a computer screen [5]. A head-controlled mouse, on the other hand, uses sensors such as accelerometers and gyroscopes to detect head actions, or captures information about head movements through computer vision [6]. Brainwave recognition, a developing technology [7,8], allows people with disabilities to use their minds to control mouse movements. Furthermore, some other popular mouse assistive tools use mouth control technologies, including a sip and puff switch [9], bite operation [10], and mouth shape recognition [11]. All of these assistive technologies are described in detail in Section 2. However, in addition to their high manufacturing costs, all of the aforementioned tools have some operational inconveniences and limitations.
Motivated to provide people with disabilities an easy way to access computers and the internet, we developed a blowing-controlled mouse to replace the conventional hand-controlled mouse. The main contribution of the proposed mouse is that it employs multiple electret microphones as blowing sensors and uses a technique that converts a blowing signal into a corresponding pulse width. By identifying the pulse width, various mouse operations such as cursor movement, left/right click, drag, and scroll can be performed. Unlike other assistive mouse methods, the proposed blowing-controlled mouse converts a blowing signal into a pulse width directly in hardware, providing a fast response and a lower implementation complexity than traditional digital signal processing. Since the proposed mouse requires only a slight blow from the user's mouth and a small swing of the head, it allows people with disabilities, and even paralyzed patients, to control the computer relatively easily. In addition, the proposed mouse can be used with different computer operating systems without installing any driver; it is plug-and-play, as it can be set up by simply connecting it to a USB (Universal Serial Bus) port.
This paper is organized as follows. Section 2 reviews the existing control methods of mouse assistive techniques. Section 3 introduces the proposed blowing-controlled mouse. Section 4 illustrates the implementation of the proposed mouse and provides the experimental results. Finally, Section 5 summarizes the features and performance of the proposed mouse.

2. Reviews of Mouse Assistive Technologies

At present, there are various mouse assistive tools on the market, and they use different control types, such as eye-, head-, and mouth-based control methods. Table 1 compares the different mouse assistive technologies. In this section, we discuss the basic operating principles of the various mouse assistive tools in detail.

2.1. Eye Control

Eye tracking has long been one of the most popular control methods in devices intended for people with disabilities. Eye tracking is based on PCCR (pupil center corneal reflection) technology, which tracks eye movements and records the point of gaze relative to the environment [12,13,14,15,16]. Such devices need a near-infrared light source projected toward the user's eyes, and they use a high-definition video camera to sense the light reflected from the eyes. Using image processing and recognition, the cursor can be moved to where the user looks. Burger et al. [14] evaluated the suitability of an inexpensive eye-tracking device for the enhancement of user experience. Jiang et al. [15] surveyed players' visual attention mechanisms across various interaction levels of mobile game interfaces under free-browsing and task-oriented conditions. Antunes et al. [16] used eye tracking as a control mechanism in a first-person shooter game. In summary, there have been many applications of eye control in computer games. However, people with disabilities easily suffer eye fatigue after long periods of gazing, and the cost of constructing such a mouse assistive tool is high due to the amount of required equipment.

2.2. Head Control

This kind of mouse measures the head rotation angle using an accelerometer and a gyroscope as sensors and then maps the head rotation angle to the mouse movement distance; other head-controlled techniques use camera-based computer vision [17,18,19,20] to capture the user's head movements and perform mouse functions. Rudigkeit et al. [17] proposed a human–robot interface that enables tetraplegics to control a multi-degree-of-freedom robot arm in real time solely through head motion, allowing them to perform simple manipulation tasks. Their results showed that the mapping of head motion onto robot motion was intuitive, and smooth, precise, and efficient robot control was achieved. The operation of a head-controlled mouse is very easy, but during long-time usage it can cause people with disabilities, especially those with anemia, to suffer dizziness and discomfort.

2.3. Mouth Control

The most direct method, mouth-based control, is designed for people with disabilities to control mouse movements with a joystick-like stick that they bite, or by moving their tongue across the detection surface of an optical device [21]. Additionally, the user initiates mouse button functions by blowing air into the device. Though this method is simple and easily controllable, its manufacturing cost is high. Additionally, it is uncomfortable for the user's mouth and can cause shoulder and neck aches during long-time usage.
The other type of mouth control is to blow and suck a pipe and then open or close a mechanical sip and puff switch [22] by using the airflow pressure to decide which mouse function to perform based on a number of particular cyclic blowing/sucking actions. However, it can be uncomfortable and inconvenient for people with disabilities to keep biting a pipe, and it can also be a little laborious.
In [23], mouth shape recognition was used as a control method. This mouse device uses a camera to capture images of mouth actions; it then determines the mouth position and classifies the mouth shape as open or closed to realize the mouse functions. The authors claimed that the detection accuracy for an open mouth was higher than for a closed mouth when the mouth stayed still, but the accuracy decreased when the mouth kept moving [23]. For this mouth-controlled mouse, it is necessary to extract feature values from captured images of the mouth and build corresponding models for different mouth shapes. Therefore, this control method requires pre-training before use, and its identification accuracy can decrease when attempting multiple functions or when the mouth does not make a large and distinct action.

2.4. Brainwave Control

Brainwave control technology [24] has been developed recently. A control method based on this technology requires a non-invasive EEG (electroencephalography) headset with many electrodes placed on the user's head to sense the weak signals produced by the brain's neurons through the scalp. Tanaka et al. [25] developed an electroencephalogram-based control method for a mobile robot. The authors employed an algorithm that recognizes directional thinking and applied it to the direct control of a mobile robot, using wavelet transformation for time–frequency domain analysis. In their experiments on EEG-based control of a mobile robot, the success rate of arriving at the desired positions was about 23%. Since brainwaves differ between people, pre-training is necessary, and the mouse device determines the correct mouse function through signal processing and machine learning. However, brainwave control has a large hardware and software complexity, and the system cost is very high. Additionally, the identification rate of brainwave control is not yet good enough for the technology to move beyond the research stage.

3. Proposed Blowing-Controlled Mouse

Though past studies have also used a microphone as a sensor to receive blowing signals, they have mainly focused on the digital signal processing of the blowing signal after amplification and analog-to-digital conversion, which is similar to speech signal processing [26,27]. In such approaches, it is necessary to transform the blowing signal into the frequency domain using a fast Fourier transform (FFT), then analyze and extract the feature values, and finally establish models to determine whether the blow is long or short. The whole process, shown in Figure 1, is more complicated. Moreover, when there are multiple microphone inputs, mutual interference between the microphones may appear, and the hardware cost may increase.
In order to create a more convenient mouse assistive tool for patients with cervical spine injuries or limb defects, we developed a blowing-controlled mouse device based on airflow vibration to assist people with disabilities in manipulating mice. Figure 2 shows the system architecture of the proposed blowing-controlled mouse, in which six small electret microphones are used as sensing receivers of blowing signals; they represent the cursor movements up, down, left, and right, as well as the left and right composite function keys. A hysteresis comparator and a re-triggerable monostable multivibrator (one shot) complete the conversion of blowing signals to pulse widths. A microcontroller unit (MCU) identifies the position of the blown microphone, determines whether the blowing signal is long or short, and then sends the movement coordinate pulses or button inputs to the USB mouse controller chip. Finally, a USB mouse controller chip with the standard human interface device (HID) specification communicates with the computer via the USB protocol to complete the various functions of a mouse, such as cursor movement, left/right click, drag, and scroll. The blowing signal processing, the method of blowing position identification, and the basic working principle of the mouse controller chip are described in detail in the following subsections.

3.1. Blowing Signal Processing

A technology without ADC (Analog-to-Digital Converter) that converts a blowing signal into the corresponding pulse width is proposed, and it includes the following three steps.
  • Based on the principle of airflow vibration, a small blowing signal is captured by an electret microphone.
  • The signal passes through a hysteresis comparator, which produces a train of impulses whose number is proportional to the blowing time.
  • A converted pulse width corresponding to the blowing impulses is generated by a re-triggerable one shot.
Figure 3 illustrates the conversion of the blowing signal for both short and long blows; the converted pulse width is proportional to the blowing time.
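As a rough illustration of these three steps, the following sketch simulates the conversion in C. The sample values, thresholds, and the assumed one-shot width of four timer ticks are illustrative assumptions, not measured values from the actual circuit; the point is only that a longer blow yields a proportionally wider output pulse.

```c
#include <assert.h>
#include <stddef.h>

/* Assumed one-shot pulse width, in sample ticks (illustrative value). */
#define ONE_SHOT_TICKS 4

/* Returns the total output pulse width (in ticks) of the re-triggerable
 * one-shot for a sampled microphone signal: a hysteresis comparator
 * produces trigger impulses, and each impulse re-arms the one-shot so
 * the gaps between oscillation bursts are bridged. */
size_t blow_pulse_width(const int *samples, size_t n,
                        int v_high, int v_low)
{
    int triggered = 0;   /* hysteresis comparator state */
    int remaining = 0;   /* ticks left on the one-shot output */
    size_t width = 0;

    for (size_t i = 0; i < n; i++) {
        if (!triggered && samples[i] > v_high)
            triggered = 1;               /* upper threshold crossed */
        else if (triggered && samples[i] < v_low)
            triggered = 0;               /* lower threshold crossed */

        if (triggered)
            remaining = ONE_SHOT_TICKS;  /* re-trigger the one-shot */

        if (remaining > 0) {
            remaining--;
            width++;                     /* output stays high this tick */
        }
    }
    return width;
}
```

Stretching every impulse by the one-shot width merges the gaps between bursts into one continuous pulse, which is what makes the final width track the blowing time.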
In Figure 4, one of the six sense and conversion circuits for capturing the blowing signals is presented. This circuit for blowing signal processing can be divided into three parts, namely a microphone sensor, a hysteresis comparator, and a re-triggerable one shot; they are described in detail in the following.
(1) Microphone sensor
A low-cost electret microphone is a type of electrostatic capacitor-based microphone. It has a stable dielectric material and high resistance, so it can be used as a receiving sensor to capture a blowing signal. When the microphone is blown into, a weak signal is induced by the airflow-caused vibration of the microphone, and this small blowing signal is obtained after removing the DC bias with a coupling capacitor CS.
(2) Hysteresis comparator
To achieve digital trigger signals corresponding to a blowing signal, a hysteresis comparator is used to convert the blowing signal into a series of impulses (VTRG). Here, by changing a variable resistor (VR), the reference voltage VREF can adjust the sensitivity to sense the blowing signal, where VREF should be set to an appropriate value to obtain an optimum sensitivity. Since VR is equivalent to two resistors R1 and R2 in series connection, VREF can be expressed as:
$V_{REF} = \dfrac{R_1 \parallel R_2}{(R_1 \parallel R_2) + R_3} \times (\pm V_{SAT}) + \dfrac{R_2 \parallel R_3}{R_1 + (R_2 \parallel R_3)} \times 5$
where VSAT is the saturation voltage of the comparator output and the hysteresis voltage VH of the hysteresis comparator is given by:
$V_H = \dfrac{2 \times (R_1 \parallel R_2)}{(R_1 \parallel R_2) + R_3} \times V_{SAT}$
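Assuming the parallel-resistance reading of the two expressions above (with $R_1 \parallel R_2$ denoting the parallel combination, a 5 V supply rail, and the comparator output sitting at $+V_{SAT}$), the reference and hysteresis voltages can be evaluated numerically; the resistor values in the usage note below are hypothetical, not the ones used in the prototype.

```c
#include <assert.h>
#include <math.h>

/* Equivalent resistance of two resistors in parallel. */
double par(double a, double b) { return a * b / (a + b); }

/* Reference voltage of the hysteresis comparator, per the V_REF
 * expression above; the 5.0 term is the supply rail feeding the
 * divider formed with R2 and R3. */
double v_ref(double r1, double r2, double r3, double v_sat)
{
    return par(r1, r2) / (par(r1, r2) + r3) * v_sat
         + par(r2, r3) / (r1 + par(r2, r3)) * 5.0;
}

/* Hysteresis window, per the V_H expression above. */
double v_hys(double r1, double r2, double r3, double v_sat)
{
    return 2.0 * par(r1, r2) / (par(r1, r2) + r3) * v_sat;
}
```

For example, with the hypothetical values R1 = R2 = 10 kΩ, R3 = 100 kΩ, and VSAT = 5 V, these expressions give VREF ≈ 2.62 V and VH ≈ 0.48 V.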
(3) Re-triggerable one shot
The impulses generated by the comparator are discrete, and their intervals are not the same. Therefore, a re-triggerable one shot is required to make these impulses continuous. When these impulses enter the re-triggerable one shot, an appropriate RC time constant needs to be selected such that the output pulse width tp of the one shot is at least twice the maximum time interval ti between the impulses, where tp can be expressed as:
$t_p \geq 2 \times \max\{t_1, t_2, \ldots, t_{n-1}\}$
Through the re-triggering operations, a continuous blowing pulse width (TW) corresponding to the blowing signal is given by:
$T_W = \sum_{i=1}^{n-1} t_i + t_p$
where n denotes the number of impulses generated by a comparator, and it is proportional to the blowing time. Finally, the converted blowing pulse width is sent to the MCU controller for further processing.
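The selection rule for tp and the resulting merged width TW above can be sketched as follows; the impulse intervals used in the test are illustrative, not measured, and the units are arbitrary timer ticks.

```c
#include <assert.h>
#include <stddef.h>

/* Given the n-1 measured intervals t_i between n comparator impulses,
 * pick the minimum admissible one-shot pulse width: at least twice the
 * largest interval, so the one-shot never times out mid-blow. */
double pick_tp(const double *t, size_t n)
{
    double max = 0.0;
    for (size_t i = 0; i < n; i++)
        if (t[i] > max)
            max = t[i];
    return 2.0 * max;
}

/* Merged blowing pulse width: the sum of all impulse intervals plus one
 * trailing one-shot period after the last impulse. */
double blow_width(const double *t, size_t n, double tp)
{
    double sum = 0.0;
    for (size_t i = 0; i < n; i++)
        sum += t[i];
    return sum + tp;
}
```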

3.2. Blowing Position Identification

Since there are multiple microphones on the blowing plate, when one of them is blown into, the adjacent microphones may also sense the blowing signal. In this paper, we propose an identification algorithm that finds the maximum blowing pulse width among all the microphones to reduce their mutual interference; thus, the MCU controller can correctly determine the position of the blown microphone. Of course, there must be an appropriate distance between the six microphones so that the identification accuracy can be improved. Figure 5 shows the flowchart of the identification algorithm for finding the maximum blowing pulse width, where i denotes the microphone number (with a maximum value of 6), DETi denotes the blowing detection of the ith microphone, and CNTi denotes the blowing counter of the ith microphone. The MCU controller can use a timer interrupt to handle the detection and counting of the blowing pulse widths in the algorithm. The algorithm steps are as follows.
  (1) In the beginning, all microphones have a blowing detection (DETi) at a low level, and the initial value of their blowing counter variable (CNTi) is set to zero.
  (2) When a user blows into a particular microphone, the adjacent microphones may also receive a portion of the blowing signal.
  (3) In turn, the algorithm detects whether the DETi of each microphone is high. If it is high, the corresponding counter variable starts counting; otherwise, it decreases by one if the counter variable value is not zero.
  (4) The algorithm checks whether all the blowing detections (DETi) have returned to zero. If so, it stops the detection; otherwise, it repeats Steps (3)–(4).
  (5) Finally, the algorithm compares the blowing counters of all the microphones and finds the maximum blowing pulse width to identify the correct blown microphone position. At the same time, the MCU controller determines whether the signal is a long or short blow according to the counter value of the maximum blowing pulse width.
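The counting steps above can be sketched as a simplified software model: the six detection lines are assumed to be sampled once per timer tick, and the detection trace used below is hypothetical, not captured from the hardware.

```c
#include <assert.h>

#define NUM_MICS 6

/* Maximum-pulse-width identification: each tick, a high detection line
 * increments its counter, while a low line decrements it (down to zero).
 * After all lines have returned low, the microphone with the largest
 * counter is taken as the one actually blown into.  det[t][i] holds the
 * sampled level of detection line i at tick t. */
int identify_mic(int det[][NUM_MICS], int ticks)
{
    int cnt[NUM_MICS] = {0};

    for (int t = 0; t < ticks; t++)
        for (int i = 0; i < NUM_MICS; i++) {
            if (det[t][i])
                cnt[i]++;
            else if (cnt[i] > 0)
                cnt[i]--;
        }

    int best = 0;
    for (int i = 1; i < NUM_MICS; i++)
        if (cnt[i] > cnt[best])
            best = i;
    return best;   /* zero-based microphone index */
}
```

With appropriate spacing between the microphones, the counters of the neighbors stay well below that of the target, so the maximum is robust even when adjacent microphones pick up part of the blow.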
To more clearly understand the identification algorithm, a case study is provided: When a user blows into the second microphone, its detection signal DET2 turns to high from low, and then its counter (CNT2) starts counting until the DET2 turns to low again. At this moment, it is assumed that the first and third microphones may receive a portion of the blowing signal, which will cause their detection signals (DET1 and DET3) to be at high level, but their occurrence times will not necessarily be the same. Their corresponding counters (CNT1 and CNT3) also start counting until DET1 and DET3 turn to low. When all of the DET signals return to low, the current blowing detection is ended. According to the values of the counters (CNT1–CNT3), we can finally find that the value of CNT2 is maximum; therefore, the second microphone is considered as the main blowing target. Figure 6 illustrates the principle of the blowing position detection of the identification algorithm.

3.3. USB Mouse Control Chip

A USB mouse controller chip (for instance, the TP8833) [28] with the standard HID specification can be used to communicate with a computer via the USB protocol to complete various mouse functions, such as cursor movement, left/right click, drag, and scroll. Such a chip typically includes a USB serial interface engine (SIE) and a mouse functional unit; the SIE handles transmission via the USB protocol, and the mouse functional unit provides an LED driver and several optical detectors to receive the photo-couple pulse signals caused by mouse movement. In the proposed mouse, thanks to the mouse controller chip, the MCU does not need to communicate with the computer itself; it only converts the blowing signal into the corresponding mouse button input or the photo-couple pulses of mouse movement. Additionally, the chip is fully compatible with various computer operating systems and supports plug-and-play without the need to develop any USB driver. This type of mouse controller chip is widely used in mechanical wheel mice and optical mice. In addition to providing the inputs of the middle, left, and right key switches, the chip also receives the photo-couple signals corresponding to the coordinates (X, Y, Z) along which the mouse moves or scrolls. These photo-couple signals include the X1 and X2 pulses that indicate horizontal movement, the Y1 and Y2 pulses that indicate vertical movement, and the Z1 and Z2 pulses that indicate up and down scrolling. Figure 7 shows the timing relationship between the coordinate pulses; for instance, the cursor moves left when the X1 pulse leads the X2 pulse; conversely, when the X2 pulse leads the X1 pulse, the cursor moves right. By adjusting the time difference between the two coordinate pulses in each direction, the speed of the cursor movement along that direction can be changed.
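The lead/lag relationship described above amounts to a standard quadrature encoding. The sketch below generates one four-state cycle for each horizontal direction; the exact phase convention is an assumption based on the description of Figure 7, not taken from the TP8833 datasheet.

```c
#include <assert.h>

/* One (X1, X2) photo-couple state. */
typedef struct { int x1, x2; } phase_t;

/* Emit the four quadrature states of one full cycle into out[].
 * For a left movement X1 leads X2 by a quarter period; for a right
 * movement the two phases are swapped so that X2 leads X1. */
void quadrature_cycle(int left, phase_t out[4])
{
    /* Gray-code sequence 00 -> 10 -> 11 -> 01 (X1 leading). */
    static const phase_t lead_x1[4] =
        { {0,0}, {1,0}, {1,1}, {0,1} };

    for (int i = 0; i < 4; i++) {
        out[i] = lead_x1[i];
        if (!left) {                 /* swap phases: X2 leads X1 */
            int tmp = out[i].x1;
            out[i].x1 = out[i].x2;
            out[i].x2 = tmp;
        }
    }
}
```

Feeding these states to the X1/X2 inputs at a fixed rate moves the cursor at a constant speed; changing the step period changes the cursor speed, as noted above.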
After identifying the blowing position and the blowing pulse width, the MCU controller reproduces the coordinate pulses corresponding to the original mouse movement or sends the button inputs for the mouse controller chip to perform the click action of the left and right keys. Finally, the mouse controller chip directly communicates with the computer via the USB protocol; thereby, various functions of the conventional mouse can be implemented.

4. Implementation and Experimental Results

Figure 8 shows the practically implemented prototype of the proposed blowing-controlled mouse. In addition to the main control device connected to the computer via a USB port, the proposed mouse includes a blowing panel mounted on the user's neck. The function configuration of the different microphones is shown in Figure 9, where the left and right microphones represent composite blowing function keys. A short blow into the left-key microphone performs the clicking action, and a long blow activates the drag mode when combined with blows into the microphones labeled with the different movement directions. Similarly, a short blow into the right-key microphone performs the open action, and a long blow activates the scrolling mode, in which blowing into the up and down microphones performs the scrolling function.

4.1. Implementation Procedure

Figure 10 illustrates the implementation procedure of the hardware and software of the proposed blowing-controlled mouse. First, the blowing signal processing was realized in the hardware circuit, which included using the microphone to capture the blowing signal, converting the blowing signal into the corresponding impulses through the hysteresis comparator, and using a re-triggerable one shot to convert the impulses into the corresponding pulse width. To reduce the mutual interference of all the microphones, an identification algorithm to find the maximum blowing pulse width was executed by the MCU. After the identification, the MCU generated the control signals that were sent to the USB mouse controller such as coordinate pulses or the inputs of mouse buttons. Finally, the mouse controller chip directly communicated with the computer via the USB protocol and performed all the mouse functions.

4.2. Waveform Measurement

As shown in Figure 4, after the hardware blowing signal processing, the output of the re-triggerable one shot was converted into a pulse width that was proportional to the blowing time. Figure 11 demonstrates the measured waveforms of the blowing signals and their corresponding converted pulse widths. Figure 11a,b shows the waveforms corresponding to the short blow and long blow, respectively.

4.3. Sensitivity of Blowing Detection

The sensitivity of the blowing detection affects the accuracy of microphone identification; thus, an optimal reference voltage VREF of the hysteresis comparator should be determined to facilitate further signal processing. In the practical experiments, VREF could be changed by adjusting the variable resistor VR, allowing the optimal reference voltage setting to be found by observation. From Table 2 and Figure 4, the one-shot output behaved as follows: VDET remained high when VREF was set too low (<0.5 mV), and VDET produced hardly any pulse width output when VREF was larger than 0.98 V. When VREF was set in the range 0.5–219 mV, the comparator generated a long-impulse-interval output at the end of a long blow, and VDET then produced an intermittent pulse width output, as shown in Figure 12a, which could easily cause a misjudgment. Therefore, by choosing a VREF value within the range 0.22–0.83 V, a complete pulse width output corresponding to the blowing signal (shown in Figure 12b) could be obtained whenever the microphone was blown into.

4.4. Identification Rate

In this subsection, we evaluate the identification rate of the proposed blowing control method and compare it with mature speech recognition methods [29,30]. In general, the identification rate decreased as the distance between the sensor and the signal source increased; an appropriate distance was found to be 3–4 cm. Table 3 shows that the proposed blowing identification method achieved an excellent identification rate of over 90% for short blows and 85% for long blows under conditions without speech interference. The speech identification rate was also higher than 70%; its identification errors were mainly caused by incorrect semantic analysis or the speaker's inaccurate pronunciation. On the other hand, even when someone spoke alongside the user, the identification rate of the proposed blowing identification was almost unaffected. In contrast, the speech recognition was easily disturbed by environmental sounds, and its identification rate immediately decreased below 60% when multiple voices appeared at the same time.

4.5. Evaluation of Mouse Operation

In the development process, the proposed blowing-controlled mouse was operated by different people repeating the same tests. Table 4 shows the usage evaluation for different people, including their age, gender, training time, operational time, and reaction (where the training time is the first practice time needed to learn how to operate the proposed mouse). Since everyone has a different mastery of blowing skills, some people needed more practice to learn how to blow a continuous stream of air for a correct long blow; the average training time was about 5.8 min over the ten evaluated persons. Regardless of age and gender, people could get started quickly after learning the operational points. Because the operational habit of the proposed mouse is similar to that of a hand-controlled mouse and there is no need to blow hard or to swing one's head substantially, none of the users, not even the elderly, felt dizzy after about 30 min of usage. Additionally, some people found the mouse easy and effortless to use and did not feel tired at all. In general, with more practice, people become more familiar with the proposed mouse and are then able to use it smoothly.
As for the measurement of target acquisition time, we asked our subjects to move the cursor from 5 cm around the target to the target by blowing short puffs into the movement direction microphones, and we found that the average target acquisition time was about 4.43 s over 10 measurements in the fine-tuned mode. When the distance between the cursor and the target was larger, for example, from the rightmost side of the screen to the leftmost side, the average target acquisition time was about 4.7 s at a fixed speed of automatic movement triggered by a long blow into the left movement microphone. In addition, to provide operational feedback to users, an LED or a buzzer can be added to indicate whether the blown microphone has been sensed.
The operational skill required is very simple and user-friendly: the user only needs to blow gently against a microphone. For instance, to move the cursor from the far-right side to the leftmost icon on a 27-inch computer screen, the user could give a long, gentle blow into the left movement microphone, and the cursor then shifted left automatically at a speed of about 0.13 m/s. When the cursor approached the target icon, the user could give a short blow into any microphone to stop the cursor movement. When a small position offset remained between the cursor and the target icon, the cursor could be fine-tuned to the target icon by short blows into the up, down, left, and right microphones. Additionally, the response time from blowing into the microphone to the execution of the corresponding action was only about 100 ms. According to the presented operation principle, a user is unlikely to feel tired when manipulating the proposed blowing-controlled mouse, and the mouse can be easily manipulated by simple actions. Due to the high identification rate presented in Section 4.4, almost every microphone that is blown into can be quickly and accurately recognized, and the corresponding action can be performed immediately.

4.6. Discussion on Results

As presented in Table 1 and described in Section 2, the existing mouse assistive tools adopt different control methods, but most of them have operational inconveniences and limitations. After thorough analysis and research, we found a simple control method that is easy to implement through blowing identification, especially by using our proposed conversion of blowing signals to pulse widths, which achieves better results. Since the control method of the proposed mouse differs significantly from the existing mouse control methods in construction and usage, it is difficult to compare the performance of our method directly with that of the others; therefore, Table 5 compares the different mouse assistive tools in terms of operational convenience, complexity, and price.
According to the experimental results provided above, as well as further analysis, we found that the sensitivity of the blowing detection depends on the reference voltage VREF of the hysteresis comparator, which in turn affects the blowing identification rate. Namely, when VREF was set to an optimal value of about 0.22–0.83 V, the proposed blowing-controlled mouse achieved an excellent identification rate (over 90% for short blows and over 85% for long blows) under conditions without speech interference.
To sum up, the proposed blowing-controlled mouse aims to help people with disabilities use computers and the internet easily, thereby improving their quality of life. Moreover, because of its low-cost implementation, it can be widely promoted among people with disabilities.
The novelty and contributions of this work can be summarized as follows:
(1) Hardware-based blowing signal processing without an ADC converts a blowing signal into a corresponding pulse width.
(2) An identification algorithm that finds the maximum blowing pulse width reduces the mutual interference of multiple microphones and thereby increases the identification rate.
(3) The proposed technology requires little effort to operate and provides operational convenience, making it especially suitable for people with disabilities.
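The maximum-pulse-width idea in contribution (2) can be sketched as follows: when a blow is picked up by several microphones at once, it is attributed to the microphone that measured the widest pulse, and pulses below a minimum width are discarded as crosstalk or noise. This is a hedged illustration of the principle only; the `min_width` threshold and the example widths are hypothetical.

```python
def identify_microphone(pulse_widths, min_width=0.05):
    """Attribute a blow to the microphone with the maximum pulse width.

    pulse_widths -- dict mapping microphone name to measured width (s)
    min_width    -- minimum width (s) counted as a deliberate blow
                    (illustrative threshold, not from the paper)
    Returns the winning microphone name, or None if nothing qualifies.
    """
    mic, width = max(pulse_widths.items(), key=lambda kv: kv[1])
    return mic if width >= min_width else None

# The left microphone wins despite crosstalk on its neighbours:
widths = {"up": 0.02, "down": 0.01, "left": 0.31, "right": 0.04}
```

Selecting the maximum rather than accepting every triggered microphone is what suppresses the mutual interference between closely spaced microphones.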

5. Conclusions

To overcome the operational drawbacks and limitations of conventional mouse assistive tools, this paper has proposed a hardware-based blowing signal processing technology that helps people with disabilities, especially paralyzed people, easily control a computer mouse. Through the software co-design of an identification algorithm, the proposed blowing-controlled mouse implements various mouse functions and achieves a stable and accurate identification rate even in the presence of interfering sounds. The experimental results showed that an identification rate of over 85% can be achieved. Moreover, compared with other mouse assistive tools, the proposed mouse offers low cost and user-friendly operation; it can therefore help people with disabilities improve their quality of life.

Author Contributions

H.-C.C. planned this study, completed the circuit design, and presented the analysis of the experimental results. W.-R.T. and C.-L.H. completed the circuit implementation and testing. C.-J.H. handled the experimental measurements and part of the article writing. H.-C.C. and C.-J.H. contributed to drafting and revising the manuscript.

Funding

This work was performed under the auspices of the University of Electronic Science and Technology of China, Zhongshan Institute, China, under Grant 418YKQN11, and was supported in part by the Jiangxi University of Science and Technology, China, under Grant jxxjbs18019.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Conventional blowing signal processing.
Figure 2. The architecture of the proposed blowing-controlled mouse.
Figure 3. Conversion of a blowing signal to a pulse width.
Figure 4. The circuit for blowing signal sensing and conversion.
Figure 5. The identification algorithm for finding the maximum blowing pulse width.
Figure 6. The principle of blowing position detection.
Figure 7. Timing of coordinate pulses corresponding to different mouse movements.
Figure 8. Image of the blowing-controlled mouse prototype.
Figure 9. Function configuration of the different microphones.
Figure 10. The implementation procedure of the proposed blowing-controlled mouse.
Figure 11. The waveforms of (a) a short blow and (b) a long blow.
Figure 12. The pulse width output at (a) VREF (reference voltage) = 114 mV and (b) VREF = 0.56 V.
Table 1. Comparison of different mouse assistive technologies.

| Control Type | Technology | Operational Convenience | Price | Disadvantages |
|---|---|---|---|---|
| Eye Control | Eye tracking | Simple | Higher | Easy to cause eye fatigue |
| Head Control | Camera-based computer vision | Simple | High | Easy to cause dizziness for people with anemia |
| Mouth Control (Sip and Puff) | Control of a sip-and-puff switch | Needs more practice | Medium | Somewhat laborious, uncomfortable, and inconvenient |
| Mouth Control (Bite) | Bite combined with blowing and sucking actions | Easy | High | Uncomfortable for the user's mouth, shoulder, and neck |
| Mouth Control (Mouth Shapes) | Image recognition | Needs pre-training | Higher | Mouth must perform large, distinct actions |
| Brainwave Control | Brainwave recognition | Needs pre-training | Very expensive | High hardware and software complexity, and unsatisfactory identification accuracy |
Table 2. Detection sensitivity dependence on the reference voltage value.

| VREF | Comparator Output | Sensitivity |
|---|---|---|
| <0.5 mV | Self-generated impulses | Too High |
| 0.5–219 mV | Too long an impulse interval at the end of a long blow | High |
| 0.22–0.83 V | Suitable impulse intervals | Optimal |
| 0.84–0.98 V | Fewer impulses for a short blow | Low |
| >0.98 V | No detected impulses | No Response |
Table 3. Comparison of identification rates.

| Situation | Speech Recognition | Blowing Identification (Short Blow) | Blowing Identification (Long Blow) |
|---|---|---|---|
| Without Speech Interference | 72.0% | 94.7% | 90.6% |
| With Speech Interference | 60.0% | 92.3% | 87.8% |
Table 4. Usage evaluation for different people.

| No. | Gender | Age | Training Time | Operational Time | Reaction |
|---|---|---|---|---|---|
| 1 | Male | 11 | 4.5 min | 25 min | No dizziness. Easy |
| 2 | Female | 13 | 8 min | 33 min | No dizziness. Not tired |
| 3 | Male | 22 | 5 min | 35 min | No dizziness. Not tired |
| 4 | Female | 21 | 4.5 min | 30 min | No dizziness. Easy |
| 5 | Male | 33 | 4 min | 28 min | No dizziness. Effortless |
| 6 | Female | 38 | 9 min | 32 min | No dizziness. Not tired |
| 7 | Male | 40 | 3.8 min | 25 min | No dizziness. Easy |
| 8 | Female | 48 | 5 min | 27 min | No dizziness. Easy |
| 9 | Male | 55 | 5.5 min | 30 min | No dizziness. Not tired |
| 10 | Male | 75 | 8.5 min | 28 min | No dizziness. Effortless |
Table 5. Performance comparison of different mouse assistive tools.

| Control Type | Operational Convenience | Complexity | Price |
|---|---|---|---|
| Eye Control | Simple | High | Higher |
| Head Control | Simple | High | High |
| Mouth Control (Sip and Puff) | Needs more practice | Low | Medium |
| Mouth Control (Bite) | Easy | Low | High |
| Mouth Control (Mouth Shapes) | Needs pre-training | High | Higher |
| Brainwave Control | Needs pre-training | Higher | Very expensive |
| Proposed Blowing Control | Easy | Low | Low |
