Ethical framework of assistive devices: review and reflection

Abstract

The ageing population is growing significantly all over the world, and there is an emerging demand for better healthcare services and more care centres. Innovations in Information and Communication Technology have resulted in the development of various types of assistive robots that fulfil the elderly's needs and support their independence while they carry out daily routine tasks. This makes it vital to have a clear understanding of the elderly's needs and expectations of assistive robots. This paper addresses current ethical issues in order to understand the elderly's prime needs. We also consider other general theories of ethics with the purpose of applying these theories to form a proper ethics framework. In such a framework, the ethical concerns of senior citizens are prioritized to satisfy the elderly's needs and also to diminish the expenses related to healthcare services.

Introduction

Demographic reports show that the ageing population is growing significantly all over the world [1, 2]. This increase gives rise to particular needs of elderly people [3,4,5]. Moreover, it leads to substantial issues such as shortages of medical centres, healthcare services, and medical professionals [6] and the burden of enormous healthcare expenses [7]. Recently, there have been noticeable innovations in Information and Communication Technology (ICT). These developments have resulted in the creation of various types of assistive medical robots, such as RIBA, the Paro robot, telerobots, and remote presence robots [7], as well as assistive devices, home automation systems, and canes [8,9,10,11]. Assistive devices and robots are developed with the purpose of fulfilling the elderly's needs and expectations, compensating for their disabilities, boosting their quality of life, and providing assistance in carrying out tasks, whilst maintaining their autonomy [7, 12]. Research studies revealing the primary needs of older adults are listed in Table 1 [13, 14]. Medical assistive robots, including walking devices, can be adopted by the elderly if they prove to be useful, reliable, efficient, effective, and easy to use [15, 16].

Table 1 Primary needs of elderly

Paper organization

This paper is organized as follows: the “Theories of ethics” section introduces a summary of the literature on general ethics, human rights, and values; the “Ethical issues of assistive medical robots” section addresses existing ethical issues related to assistive medical robots; and the “Discussion and conclusion” section summarizes the important role of an ethical framework for both assistive medical robots and walking devices.

Theories of ethics

This section describes general theories of ethics as well as human rights and values.

General theories of ethics and bases of ethics concerns

In the field of ethics, there are general theories that can be applied in different domains, including the ethics of assistive medical robots and walking devices. Robots, specifically assistive medical robots, simulate human behaviours, performance, and actions; therefore, general theories of ethics can be applied in the design and use of medical robots such as walking devices. A prime example is the robot's program, whose code is written on the basis of general theories of ethics whilst taking ethical concerns into consideration. This section describes three relevant general theories of ethics as well as the bases of ethical concerns.

Deontology

The word “deontology” is derived from two Greek words meaning duty and study. Deontological ethics, established by Immanuel Kant, is known as non-consequentialist or duty-based ethics [17]. According to this theory, individuals are morally bound to act in accordance with a series of principles and rules, without considering the outcomes of the actions taken [18]. The theory mainly considers the rightness or wrongness of an action itself rather than its consequences [19]. According to Sullivan [20], this is the first ethical theory to assign priority in decision-making to the person. Moreover, Kant [21] stated that feelings and inclination do not play a significant role in an individual's moral action; the motive for taking an action is therefore based on duty before the action takes place [17]. In other words, according to this theory, individuals are required to take the right, rule-based actions even when the consequences are destructive [22].

A prime example of applying duty-based theory to assistive medical robots is giving medicine to an older adult. If a senior citizen requests a painkiller from his/her assistive medical robot despite being allergic to it, the medical agent is required to follow the rules and provide the medication. It is evident that the medical robot's action endangers the older adult's health. In contrast, under other general theories of ethics such as consequentialism, an action of a medical robot that endangers the older adult's well-being is not acceptable; the assistive agent is therefore required to provide another solution to relieve the older adult's discomfort.
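To make this contrast concrete, the sketch below shows how the same medication request could be handled by a purely duty-based rule and by a consequence-based check. It is a minimal, hypothetical Python example written for this review; the names (`Request`, `dispense_*`, the allergy record) are invented and do not come from any cited system.

```python
# Illustrative sketch only: a hypothetical medication-request handler contrasting a
# duty-based (deontological) rule with a consequence-based (consequentialist) check.
from dataclasses import dataclass

@dataclass
class Request:
    resident_id: str
    drug: str

ALLERGIES = {"resident-42": {"ibuprofen"}}   # hypothetical allergy record

def dispense_deontological(req: Request) -> str:
    # Duty-based rule: a valid request from the resident must be honoured,
    # regardless of the outcome of doing so.
    return f"dispense {req.drug}"

def dispense_consequentialist(req: Request) -> str:
    # Result-based rule: refuse any action whose foreseeable consequence is harm,
    # and offer an alternative instead.
    if req.drug in ALLERGIES.get(req.resident_id, set()):
        return "withhold drug; suggest non-drug pain relief and alert the caregiver"
    return f"dispense {req.drug}"

if __name__ == "__main__":
    req = Request("resident-42", "ibuprofen")
    print(dispense_deontological(req))     # follows the rule even though the resident is allergic
    print(dispense_consequentialist(req))  # blocks the harmful outcome
```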

Virtue

Virtue ethics is recognized as character-based ethics and is oriented far more towards the individual than towards the action, highlighting an individual acting rightly in like circumstances [23]. The theory emphasizes the virtue and moral character of the person carrying out an action rather than the action's consequences or any ethical rule [24]. Its concern is not only the rightness or wrongness of an individual's action; it also commends a range of virtuous behaviours.

Virtue theory is useful when one is inclined to assess another individual's character rather than the goodness or badness of a particular action. In this theory, individuals are required to possess a series of characteristics in order to be virtuous [25].

Character-based theory occasionally aligns with deontological ethics, whilst it stands in contrast to consequentialist ethics. A prime example is helping the needy: according to consequentialism, helping the needy improves well-being; deontology holds that helping the needy is in accordance with a moral rule; and virtue theory argues that such assistance expresses the character trait of generosity.

Consequentialism

This ethical theory is known as the result-based theory and highlights two primary principles. The first states that the rightness or wrongness of an action depends on its results and potential consequences. The second indicates that the better the consequences an action produces, the more right that action is considered [26]. According to consequentialism, an action is favourable if it does not produce harmful consequences.

Hedonism and utilitarianism are two forms of consequentialist ethics. Hedonism indicates that individuals should maximize pleasure, whilst utilitarianism states that individuals should maximize overall human welfare. In addition, another form of consequentialism states that individuals are required to maximize the satisfaction of preferences and happiness.

Cummiskey [27] states that in result-based theory a killing is considered right if its consequences produce a good result. In other words, if a murderer intends to kill a group of innocent individuals, it is acceptable under consequentialism to kill the murderer in order to save the victims' lives. In contrast, under both deontology and virtue theories, killing the murderer is wrong, in spite of the victims' deaths [28].

Human rights and values

Human rights relevant to senior citizens include the right to a standard of living adequate for health and welfare; freedom from discrimination; freedom from torture and inhuman or degrading treatment; and the right to private and family life.

A focus on human rights helps to highlight that the physical and psychological well-being of older adults is as significant as the well-being of other members of society. It is therefore essential to make sure that assistive medical robots embedded into older adults' lives aim at benefiting the elderly and are not introduced merely to diminish the care burden on other people [29]. In addition, it is essential to consider the twelve human values that have been introduced for technological developments [30].

Ethical issues of assistive medical robots

Debates about the ethical actions of robots date back some 70 years [31]. Since 1950, when Asimov presented his three laws [33], there have been arguments about the potency of those particular rules to render robots capable of making ethical decisions independently of human interference. The key argument of Asimov's laws concerned the self-directedness of robots: being autonomous, robots were assumed to have the physical and intellectual capacity to make moral decisions using the knowledge and rationality with which they were equipped [34]. Asimov's three laws expressed the following notions: (1) a robot may not harm a human being or, through inaction, expose a human being to harm; (2) a robot must follow human beings' orders except those which would conflict with the first law; (3) a robot should protect itself as long as such protection does not conflict with the other two laws.
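As a purely illustrative sketch of how such priority-ordered laws could be expressed in software, the example below encodes the three laws as an ordered sequence of checks evaluated before an action is executed. It is a hypothetical simplification written for this review; the `Action` fields are invented, and deciding whether an action "harms a human" would in practice require a far richer world model than any boolean flag.

```python
# Hypothetical sketch: Asimov's three laws as an ordered sequence of vetoes.
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool            # would executing this action injure a human?
    inaction_harms_human: bool   # would *not* acting expose a human to harm?
    ordered_by_human: bool       # was this action ordered by a human?
    endangers_robot: bool        # does this action put the robot itself at risk?

def permitted(action: Action) -> bool:
    # First law: never harm a human, and do not allow harm through inaction.
    if action.harms_human:
        return False
    if action.inaction_harms_human:
        return True  # acting is required to prevent harm, overriding the lower laws
    # Second law: obey human orders unless they conflict with the first law.
    if action.ordered_by_human:
        return True
    # Third law: self-protection, subordinate to the first two laws.
    return not action.endangers_robot
```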

Both researchers and science fiction writers have expressed concerns about a number of ethical issues that the daily use of robots makes possible. However, the robots we use daily are limited to vacuum cleaners, grass cutters, and robot toys; these are not the same as the advanced science fiction robots that are the subject of recent robotics ethics [32]. Consequently, the ethical concerns related to such robots cannot yet be based on empirical data and user studies. Instead, taking Asimov's laws as a starting point for ethical debates [35], researchers tend to discuss ethics according to the robots' potential future applications [36].

Robot-exclusive concerns. Ryan Calo is a law professor who wrote the “Robots and Privacy” chapter in Robot Ethics. He points out that the debates on robots currently focus on ubiquity and that, conceivably, this is not a good thing [37]. Calo identifies three privacy dangers which robots can create: “surveillance”, “access to living and working spaces”, and “social impact”. The anxiety about such access is exacerbated by the research of Denning et al. [38], in which the authors explore security vulnerabilities in several toy robots [37].

Certainly, in areas such as robotics, producers need to be very innovative, and the world is currently witnessing a technological explosion with new possibilities. One argument is that ignoring speculation about future robots and their use can create ethical dilemmas. However, our argument is that it is necessary to adopt a perspective that is in agreement with the experience gained from the empirical use of robots. This will help to complement the current debate on robot ethics.

A list of ethical issues related to the use of assistive medical robots from older adults' perspective is explained in detail in this section and the following sections. A noticeable number of ethical issues have been raised by senior citizens about the use of assistive medical robots. Amongst them are primary issues of significant concern not only to older adults but also to their families and caretakers and to robot designers and developers.

Moreover, trust is a vital element for the formation and preservation of humans’ dynamic relationship with assistive robots [39,40,41].

A lack of trust is the main reason that seniors do not want, do not feel they need, or do not even consider robots.

Lack of trust results from several factors [40, 42]:

  • Privacy: how can the young and the elderly leave their privacy in the hands of a robot?

  • Safety: if a robot is set to undertake physical responsibilities, human–robot physical interaction leads to serious challenges. Besides, improved methods are necessary to eliminate failures caused by safety problems and to confirm the absence of any unreliable behaviour.

  • Robustness: regardless of the circumstances, how can the elderly be convinced that the robot will behave suitably?

  • Security: affirming that the robot is not harmful to the elderly.

  • Data protection: how can the elderly be convinced that their sensitive data are safe?

The ethical issues of assistive medical agents are discussed in the sections below.

Privacy of older adults

The privacy of senior citizens is of such paramount importance that it is closely linked to other ethical issues such as data protection, security, and safety. This issue is of great concern to scholars [43,44,45,46,47,48,49,50,51,52], and it substantially reduces older adults' willingness to adopt smart home technology. The core processes of smart home technology include the collection, transmission, distribution, and exchange of the elderly's private information, and these processes lead some elderly people to refuse the technology [43, 46, 53, 54]. Take home healthcare robots as an example: this kind of robot enables medical specialists to keep a watchful eye on their patients' well-being from remote places by means of various tools such as cameras, ultrasound, and speakers [7].

Ambient Intelligence Technology (AIT) involves various procedures such as collecting, distributing, and storing fully confidential user data [55]. The key functions of this technology are to monitor the robot's user and to fuse data gathered by sensors from different sources in order to obtain details of the user's circumstances [45]. In the process of data collection, personal, medical, and confidential data about the robot user are gathered. In addition, other parties might gain access to and control over the gathered data; therefore, the user's privacy might be abused [55, 56].

In addition, the home automation system is one of the main ICT devices employed for fall prevention. It is a wearable device attached to the user's body by means of a transparent film and a neoprene belt, and its primary function is to detect fall incidents through video monitoring [9, 10]. Various studies assert that a noticeable number of senior citizens are critically concerned about their privacy. Consequently, they far prefer the wearable device to capture blurred images when they are in private spaces such as the bedroom, whereas clear images are acceptable in other rooms such as the living room [10, 57]. It is claimed that privacy concerns slacken older adults' interest in this kind of device, especially visual surveillance or cameras [58].
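A minimal sketch of such a location-dependent privacy policy is given below. It is a hypothetical example, not taken from any cited device: it assumes a monitoring loop in which each frame is tagged with the room it was captured in, uses an invented list of private rooms, and uses OpenCV's Gaussian blur purely as one possible way of degrading the image.

```python
# Illustrative sketch: blur frames captured in rooms the resident has marked as
# private, and keep other frames clear for fall detection.
import cv2  # OpenCV, used here only for Gaussian blurring

PRIVATE_ROOMS = {"bedroom", "bathroom"}   # hypothetical resident preference

def apply_privacy_policy(frame, room: str):
    """Return a heavily blurred frame for private rooms, the original frame otherwise."""
    if room.lower() in PRIVATE_ROOMS:
        return cv2.GaussianBlur(frame, (51, 51), 0)  # kernel size must be odd
    return frame

# A fall detector would then operate on the (possibly blurred) frame, so that
# incidents can still be flagged in private rooms without storing clear images.
```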

Two-way visual contact is a means of communication and connection through webcams and television monitors, though it is not widely used despite its rather low price. It allows family members or employed carers to “look in” on older persons and their homes without needing to commute [59]. If older people feel at ease working with computers, virtual visiting and communication is reasonable and easily established; it is no more difficult than installing Skype and creating an account. There are even virtual visiting systems that are more user-friendly than Skype and operate over local broadband networks.

Data protection

The ethical issue of data protection is closely connected to privacy. In home healthcare services, there must be a connection between medical centre personnel and the robot user's home in order to provide not only safety services but also social care and daily services [49]. In multi-user cases, the intelligent system is in charge of distinguishing different kinds of data, namely the robot user's private data, the caretaker's data, and other information relevant to monitoring the user's well-being [60]. Consequently, it is essential that the collected data be subject to data protection legislation [61].

A primary function of assistive walking devices, including fall detection devices such as home automation systems, is to capture images or record video of older adults. The captured images or recorded videos might be inappropriate or unwanted and are therefore unfavourable to the elderly. Moreover, it is of major concern to older adults if their personal data, namely images or videos, are accessed and viewed by third parties.
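One way this concern could be operationalized, sketched below under purely hypothetical role and consent assumptions invented for this review, is to gate every request for recorded images or video behind an explicit consent check, so that third parties cannot view the data unless the older adult (or an authorized proxy) has granted access.

```python
# Hypothetical access-control sketch for recorded images/videos; the roles and the
# consent register are invented and not part of any cited system.

ALLOWED_WITHOUT_CONSENT = {"resident"}   # the data subject may always view their own data
CONSENT_REGISTER = {                     # roles the resident has explicitly authorized
    "resident-42": {"primary_physician"},
}

def may_view_recording(resident_id: str, requester_role: str) -> bool:
    """Grant access only to the resident or to roles the resident has authorized."""
    if requester_role in ALLOWED_WITHOUT_CONSENT:
        return True
    return requester_role in CONSENT_REGISTER.get(resident_id, set())

assert may_view_recording("resident-42", "primary_physician") is True
assert may_view_recording("resident-42", "insurance_agent") is False  # third party blocked
```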

Some assistive robots are used for remote sensing and monitoring of the elderly in a variety of locations. These robots act as assistants for specialists who want to check on their patients remotely, mainly in critical situations, making use of speakers, lights, cameras, remote controls, ultrasound, and electronic medical recording accessories [62].

Security and safety

The ethical issue of safety and security is closely related to privacy [63]. It is strongly recommended that the elderly's need for safety be balanced against preserving their privacy and autonomy [29, 56, 64]. In addition, it is claimed that older adults and their families and caretakers have contrary points of view about privacy, safety, and security [56]. One study reveals that the families and caretakers of senior citizens are more concerned about safety and security than about privacy and independence [56]. Moreover, although some scholars believe that there should be a balance between privacy and safety [29, 64], others believe that the safety and security of the elderly is of paramount importance [43, 65,66,67].

Regarding the security and safety of walking devices, falls and fall-related injuries among older adults have been one of the most substantial and costly public health issues over recent decades [68,69,70]. It is found that one in three older adults aged sixty-five or above falls each year, often resulting in serious injuries that require treatment in medical centres [69, 71,72,73]. Although there have been significant developments in fall prevention devices, fall incidents still take place with severe consequences, including morbidity and mortality; injuries resulting from falls rank fifth as a cause of mortality in the group aged sixty-five and above [71]. For this reason, the safety and security of assistive walking devices are of great concern to senior citizens.

In addition, it is imperative to ensure that walking devices, especially ICT-based ones that function according to human-written programs, do not cause falls on account of small errors in their programs. Fall incidents also give rise to another ethical issue: responsibility. It is evident that assistive walking devices, including fall detection devices, play an important role not only in the well-being of older adults but also in the occurrence of fall incidents; for this reason, it is crucial to identify who is responsible for such incidents.

Various types of tasks are made possible by the services offered by autonomous service robots. Examples are taking care of old people at home [74] or accompanying guests in multi-level buildings [81].

Robotic service solutions range from the simplest telepresence to the most complex functions supporting caregivers. Examples are the Giraff (www.giraff.org) developed in the ExCITE project [75], AVA (www.irobot.com/ava) and Luna [76], assistance for needy persons in their everyday movements (www.aal-domeo.eu), self-management of long-lasting illness [77], comfort and safety as in the cases of Florence [78] and Robo M.D. [79], and integration into an environment controlled by smart applications [80]. On the other hand, very few robotic applications are dedicated to social services in settings such as smart office buildings [81].

Error and safety

The safety of elderly people using assistive medical robots is of significant concern to older adults, their family members and caregivers, and robot designers and programmers. The assistive medical agent carries out a task in accordance with programs written by robot developers. For this reason, even a small error in the robot's program might endanger an older adult's well-being and cause severe or fatal consequences [82].
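Although the paper does not prescribe any particular safeguard, the kind of defensive check sketched below illustrates how a small programming error, such as a unit mix-up in a computed dose, could be caught before it reaches the older adult. The drug name, the limits, and the function names are invented for this hypothetical example.

```python
# Hypothetical defensive check: validate a computed dose against prescription
# limits before the robot acts on it.

PRESCRIPTION_LIMITS_MG = {"paracetamol": (250, 1000)}   # (min, max) for a single dose

class UnsafeDoseError(ValueError):
    pass

def validated_dose(drug: str, dose_mg: float) -> float:
    """Refuse to dispense when the computed dose falls outside the prescribed bounds."""
    low, high = PRESCRIPTION_LIMITS_MG[drug]
    if not (low <= dose_mg <= high):
        raise UnsafeDoseError(f"{dose_mg} mg of {drug} is outside the range {low}-{high} mg")
    return dose_mg

# A bug elsewhere (e.g. a unit mix-up producing 10000 mg) is caught here instead
# of being silently dispensed:
try:
    validated_dose("paracetamol", 10000)
except UnsafeDoseError as err:
    print("blocked:", err)
```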

Technological caregiving is already a reality in most Western European countries, but the technology usually used is not robotic; on the contrary, some of it is decidedly low-tech. The assistive technology most widely available to old people in the UK includes portable alarms for requesting help; smoke, CO2, and flood sensors; pillboxes or containers designed to help older people take their medication on time; and fall sensors [83].

Responsibility

In Ambient Intelligence Technology (AIT), the artificial robot and its user interact with each other directly. This interaction has led to several issues, such as responsibility for tasks, designation of control, reduction of the human workforce, and allocation of decision-making [43, 45, 46, 82, 84]. Artificial robots are increasingly and pervasively becoming autonomous, which has diminished human participation in some actions, including decision-making. For this reason, liability for autonomous action is of critical concern [45, 85].

Responsibility concerns about robots for older adults

Robots are capable of interacting with human beings and the surrounding environment in very intricate ways, and social robots challenge traditional theories of moral responsibility. The production of robots raises various ethical questions: what are the possible harmful consequences of such production? What will become of key moral concepts such as autonomy and privacy when robots are integrated into human life? Are these robots moral agents? Is it ethical to hold them responsible? These ethical issues result from the growing autonomy of smart technical products, the most remarkable representatives of which are social robots. Can robots be regarded as socially autonomous, responsible, trusted agents that care and, at the same time, perform their duty as technical gadgets?

Whilst most of these concerns also arise in other fields of engineering, the capacity of robots to become ethical agents raises a further set of moral questions, such as those related to the rights and responsibilities of robots [86].

People's ideas about the moral concerns introduced by autonomous products like robots vary and address various notions, such as the application of robots in, for instance, healthcare tasks. These views imply an understanding of what technology can achieve that depends on ideas about the nature of technology and the relation of mind and matter in humans and machines. The usual approach of research in robot ethics focuses on the robot itself, its nature and its thoughts.

This approach helps to answer questions about the intelligence and rationality of robots, to determine whether they are “moral agents”, or it restricts ethical concerns to things that might go wrong in interactions with robots. For most moral philosophers, ethics is related to a sense of responsibility and the appropriateness of someone's actions, and hence to questions of moral status and action [87]. Usually, moral responsibility is attributed only to creatures that possess a tenable level of moral agency, whatever that may mean, and it concentrates on the suitability of what that agent performs, has performed, or can perform [88]. To investigate the ethics of robot technology, Coeckelbergh [88] puts forth an approach that places the human, or the interaction, at the centre: instead of a philosophy of mind concerned with the real nature and thoughts of robots, it is better to adopt a philosophy of interaction and take seriously the ethical importance of external appearance [89].

One of the outcomes of the ACCOMPANY focus group discussions was agreement that, in overseeing the programming of robots, it is necessary to consider the communication of the older person living with a robot with networks of formal and informal carers, instead of simply gratifying the older person's wishes. Still, the data also suggest that at least one approach, the “let's do it together” strategy, may itself undermine autonomy by (perhaps unintentionally) treating older persons like children [83]. A robot would be considered social when it takes responsibility, not when it is merely assigned responsibility.

Human responsibility and robot responsibility

Robots have the power and ability to interact with human beings and human contexts in complicated ways, and robotics brings to the fore a variety of applied ethical questions, some of which are the following: what are the potentially risky consequences of making these robots? What autonomy and privacy concerns will be raised when robots become an inseparable part of human life? Whilst most of these concerns are also expressed in relation to other fields of engineering, the capability of robots to act as ethical entities introduces further moral concerns, amongst them the rights and responsibilities of robots.

These ethical issues have different layers that need to be discussed; the most central concerns deal with the responsibilities of robots [90,91,92] and of human beings [92, 93].

A question shared by many people who are worried about this matter is: who is responsible for the mistakes committed by robots? In cases where a robot does not cross the threshold of autonomous function, at minimum ordinary product liability applies. Given that such robots follow plans and procedures decided by particular persons or companies, those people or companies are clearly responsible for failures (barring misuse). In cases where robots are designed to be programmed by customers, the boundaries of liability remain clear. Still, for semi-autonomous robots such as self-driving cars, the concept of liability becomes complicated, particularly when an accident happens during cooperation between the robot and a human agent.

In cases where the robot is fully autonomous, responsibility may be considered entirely the robot's own, meaning that the robot is not under the direct influence of programs, programmers, or operators [94].

Equal right to use robots

One of the noticeable issues in robot ethics is equal access to assistive medical robots. There have been a great number of debates over whether every individual, or only particular groups of individuals around the world, can afford to utilize and benefit from AIT [45, 95]. It is stated that unequal access to robots and healthcare systems might result in injustice [47].

One of the chief ethical issues is unequal access to assistive walking devices. In other words, it is unjust that particular groups of older adults, owing to factors such as living in developing countries, do not benefit from assistive walking tools. In addition, a noticeable number of senior citizens are strongly concerned about the purchase and maintenance costs of assistive walking devices, which dampens their interest in using them [10, 58, 96, 97].

Social impact

In some cases, the use of assistive medical robots, instead of weakening negative impacts, strengthens adverse effects such as social isolation, which reduces social interaction [29, 82, 98]. Research studies reveal that assistive technologies such as telecare reduce social communication [99]. In addition, Chan et al. [49] believe that smart home technology affects people's relationships and communication with others owing to decreased interaction between robot users and their caretakers.

Although assistive walking devices such as wheeled walkers compensate for elderly people's mobility impairments, there is still room for improvement to reduce fall incidents whilst improving the elderly's appearance in public [8]. Older adults encounter difficulties indoors and outdoors when they use wheeled walkers: when moving along a curve, uphill, downhill, over obstacles, through a doorway, on uneven ground, and while carrying an object. These issues can cause falls, and they can also have a negative effect on older adults' morale, making them feel embarrassed to carry out outdoor activities such as visiting medical doctors, using public transportation, and visiting family members or friends [8].

Technology development

Over the past decades, technology has developed rapidly. This has made it hard for technology users, specifically older adults, to learn and cope with new modern technologies and systems. Weiser and Brown [100] point out that it is important for computer technology to be invisible while assisting users; in other words, technology users should not be required to acquire knowledge about the technology. However, it is also argued that it is essential for technology users to be aware of the advantages and disadvantages of technology's role in their lives [101].

Apart from the ethical issues mentioned above, there is another significant issue that the authors of this paper believe should be taken into consideration and embedded in the ethical framework of assistive medical robots. This issue relates to robot users' feelings towards assistive robots. It is claimed that direct interaction between robots and individuals fosters social isolation, which may lead robot users to develop human feelings, namely love, towards their assistive robots. For this reason, it is important to define appropriate standards of robot behaviour to handle this issue.

Recently, there have been substantial technological developments in assistive walking devices. Some researchers believe that older adults are novice users and therefore prefer simple functions. Besides, older adults behave differently in emergency situations; they may refuse to ask for assistance from their caretakers or nurses [102]. On the other hand, some older adults find the use of technology easy and convenient [103, 104]. The literature review shows that assistive walking devices and robots share common ethical issues; therefore, a proper framework can be formed to alleviate and resolve these issues with the purpose of satisfying the elderly's needs.

Discussion and conclusion

It is evident that assistive walking devices and robots play an imperative role in senior citizens' lives; these assistive agents and devices have pervasively embedded themselves in people's daily tasks. Robots are increasingly empowered, so their actions might have either destructive or beneficial impacts on older adults. In other words, the consequence of an assistive robot's action, including that of a walking device, is of far more significant concern than the action itself. In this respect, the concept of consequentialist ethics can be applied in a framework for assistive walking devices and autonomous agents. Moreover, the ethical issues common to assistive walking devices and robots should be taken into consideration to complete a proper ethics framework that can be applied globally. In addition, a proper ethics framework plays a beneficial role in promoting the elderly's standard of living, improving their satisfaction, and compensating for their disabilities, whilst reducing the burden of expenses related to healthcare services and centres.

References

  1. Ball MM, Perkins MM, Whittington FJ, Hollingsworth C, King SV, Combs BL. Independence in assisted living. J Aging Stud. 2004;18(4):467–83.

  2. Mitzner TL, Chen TL, Kemp CC, Rogers WA. Identifying the potential for robotics to assist older adults in different living environments. Int J Social Robot. 2014;6(2):213–27.

  3. Broadbent E, Tamagawa R, Patience A, Knock B, Kerse N, Day K, MacDonald BA. Attitudes towards health-care robots in a retirement village. Aust J Ageing. 2012;31(2):115–20.

  4. Parker MG, Thorslund M. Health trends in the elderly population: getting better and getting worse. The Gerontologist. 2007;47(2):150–8.

  5. Pigini L, Facal D, Blasi L, Andrich R. Service robots in elderly care at home: users’ needs and perceptions as a basis for concept development. Technol Disabil. 2012;24(4):303–11.

  6. Hassmiller SB, Cozine M. Addressing the nurse shortage to improve the quality of patient care. Health Aff. 2006;25(1):268–74.

  7. Alaiad A, Zhou L. The determinants of home healthcare robots adoption: an empirical investigation. Int J Med Informatics. 2014;83(11):825–40. https://doi.org/10.1016/j.ijmedinf.2014.07.003.

  8. Lindemann U, Schwenk M, Klenk J, Kessler M, Weyrich M, Kurz F, Becker C. Problems of older persons using a wheeled walker. Aging Clin Exp Res. 2016;28(2):215–20. https://doi.org/10.1007/s40520-015-0410-8.

  9. Mathie MJ, Coster AC, Lovell NH, Celler BG, Lord SR, Tiedemann A. A pilot study of long-term monitoring of human movements in the home using accelerometry. J Telemed Telecare. 2004;10(3):144–51. https://doi.org/10.1258/135763304323070788.

  10. Mihailidis A, Cockburn A, Longley C, Boger J. The acceptability of home monitoring technology among community-dwelling older adults and baby boomers. Assist Technol. 2008;20(1):1–12. https://doi.org/10.1080/10400435.2008.10131927.

  11. Wilkinson KA. U.S. Patent No. 4,899,771. U.S. Patent and Trademark Office, Washington, DC. 1990.

  12. Rogers WA, Mynatt ED. How can technology contribute to the quality of life of older adults. The technology of humanity: Can technology contribute to the quality of life; 2003. p. 22–30.

  13. Bonaccorsi M, Fiorini L, Cavallo F, Saffiotti A, Dario P. A cloud robotics solution to improve social assistive robots for active and healthy aging. Int J Soc Robot 2016;8(3):393–408.

  14. Van den Broek G, Cavallo F, Wehrmann C. AALIANCE ambient assisted living roadmap, vol. 6. Amsterdam: IOS press; 2010.

  15. Aquilano M, Salatino C, Carrozza MC. Assistive technology: a new approach to evaluation. In: 2007 IEEE 10th International Conference on Rehabilitation Robotics. IEEE; 2007. p. 809-19.

  16. Venkatesh V, Morris MG, Davis GB, Davis FD. User acceptance of information technology: toward a unified view. MIS Q. 2003;425–78.

  17. Shaw W, Barry V. Moral issues in business. Boston: Cengage Learning; 2015.

  18. Johnson R. Kant’s moral philosophy. Stanford encyclopedia of philosophy. 2008.

  19. Alexander L, Moore M. Deontological ethics. In: Zalta EN, editor. The stanford encyclopedia of philosophy. Winter 2016 edn; 2007.

  20. Sullivan RJ. Immanuel Kant’s moral theory. Cambridge: Cambridge University Press; 1989.

  21. Kant I. Critique of practical reason in LW Beck (ed and trans) (1976) critique of practical reason and other writings in moral philosophy; 1788

  22. Sullivan RJ. An introduction to Kant’s ethics. New York: Cambridge University Press; 1994.

  23. Hursthouse R. Normative virtue ethics. How should one live, Vol. 1; 1996. pp 19–37.

  24. Sandler RL. Environmental virtue ethics. Oxford: Blackwell; 2013.

  25. Crossan M, Mazutis D, Seijts G. In search of virtue: the role of virtues, values and character strengths in ethical decision making. J Bus Ethics. 2013;113(4):567–81. https://doi.org/10.1007/s10551-013-1680-8.

  26. Morin C, Dick DG. The development of the ethical approach scale: an operationalization of moral theory. Acad Manag Proc. 2015;2015(1):13236. https://doi.org/10.5465/AMBPP.2015.13236abstract.

  27. Cummiskey D. Consequentialism. Int Encycl Ethics. 2013. https://doi.org/10.1002/9781444367072.wbiee428.

  28. Peterson M. The dimensions of consequentialism: ethics, equality and risk. Cambridge: Cambridge University Press; 2013.

  29. Sharkey A, Sharkey N. Granny and the robots: ethical issues in robot care for the elderly. Ethics Inf Technol. 2012;14(1):27–40.

  30. Friedman B, Kahn Jr PH. Human values, ethics, and design. In: The human–computer interaction handbook. L. Erlbaum Associates Inc.; 2003, p. 1177–1201.

  31. Bizony P. Asimov’s three laws of robotics engineering and technology magazine, 15; (2015). http://www.auburn.edu/~vestmon/robotics.html.

  32. Ljungblad S, Nylander S, Nørgaard M. Beyond speculative ethics in hri?: Ethical considerations and the relation to empirical data. Paper presented at the proceedings of the 6th international conference on Human-robot interaction. 2011.

  33. Asimov I. I, Robot, 2004 edn. New York, NY: Bantam Dell; 1950.

  34. McBride N, Hoffman RR. Bridging the ethical gap: from human principles to robot instructions. IEEE Intell Sys. 2016; 31(5):76–82.

  35. Norman DA. Emotional design: why we love (or hate) everyday things. New York: Basic books; 2005.

  36. Sharkey N, Sharkey A. The crying shame of robot nannies: an ethical appraisal. Interact Stud. 2010;11(2):161–90. http://www.jbeplatform.com/content/journals/10.1075/is.11.2.01sha.

  37. Calo R. Robots and privacy. In: Patrick Lin GB, Abney K, editors. Robot ethics. Cambridge: MIT Press; 2010.

  38. Denning T, Matuszek C, Koscher K, Smith JR, Kohno T. A spotlight on security and privacy risks with future household robots: attacks and lessons. Paper presented at the proceedings of the 11th international conference on Ubiquitous computing. 2009.

  39. Kaniarasu P, Steinfeld A, Desai M, Yanco H. Potential measures for detecting trust changes. Paper presented at the Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction. 2012.

  40. Schaefer KE. The perception and measurement of human-robot trust. Florida: University of Central Florida Orlando; 2013.

  41. Yagoda RE, Gillan DJ. You want me to trust a ROBOT? The development of a human–robot interaction trust scale. Int J Social Robot. 2012;4(3):235–48.

  42. Leroux C. EU robotics coordination action: a green paper on legal issues in robotics. Paper presented at the proceeding of international workshop on autonomics and legal implications, Berlin. 2012.

  43. Aarts E. Ambient intelligent: a multimedia perspective. IEEE Multimedia. 2004;11(1):12–9. https://doi.org/10.1007/978-3-540-73281-5_11.

  44. Albrechtslund A. House 2.0: towards an ethics for surveillance in intelligent living and working environments. In: Proceedings of the seventh international conference of computer ethics philosophical enquiry, San Diego, USA: University of San Diego; 2007. p. 7–16.

  45. Bohn J, Coroama V, Langheinrich M, Mattern F, Rohs M. Living in a world of smart everyday objects—social, economic, and ethical implications. Hum Ecol Risk Assess. 2004;10(5):763–85. https://doi.org/10.1080/10807030490513793.

  46. Brey P. Freedom and privacy in ambient intelligent. Ethics Inf Technol. 2005;7(3):157–66. https://doi.org/10.1007/s10676-006-0005-3.

  47. Brown I, Adams A. Ethical challenges of ubiquitous healthcare. Int Rev Inf Ethics. 2007;8(12):53–60.

  48. Caire P, Moawad A, Efthymiou V, Bikakis A, Le Traon Y. Privacy challenges in ambient intelligent systems: lessons learned, gaps and perspectives. J Ambient Intell Smart Environ. 2014;1:1–23.

  49. Chan M, Campo E, Estève D, Fourniols JY. Smart homes—current features and future perspectives. Maturitas. 2009;64:90–7. https://doi.org/10.1016/j.maturitas.2009.07.014.

  50. Oishi MMK, Mitchell I, Machiel Van der Loos HFM, editors. Design and use of assistive technology: social, technical, ethical, and economic challenges. New York: Springer; 2010.

  51. Sadri F. Ambient intelligent: a survey. ACM Comput Surv. 2011. https://doi.org/10.1145/1978802.1978815.

  52. Van Heerde HJW, Anciaux NLG, Feng L, Apers PMG. Balancing smartness and privacy for ambient intelligent. In: Proceedings of the 1st European conference on Smart Sensing and Context (EuroSSC). Lecture notes in computer science 4272, 2006; p. 255–8.

  53. Ikonen V, Kaasinen E, Niemelaa M. Defining ethical guidelines for ambient intelligent applications on a mobile phone. In: Proceedings of the 5th international conference on intelligent environments, IOS Press, Amsterdam, 2009; p. 261–8.

  54. Kaasinen E, Kymäläinen T, Niemelä M, Olsson T, Kanerva M, Ikonen V. A user-centric view of intelligent environments: user expectations, user experience and user role in building intelligent environments. Computers. 2013;2:1–33. https://doi.org/10.3390/computers2010001.

  55. Friedewald M, Da Costa O, Punie Y, Alahuhta P, Heinonen S. Perspectives of ambient intelligent in the home environment. Telemat Inform. 2005;22:221–38. https://doi.org/10.1016/j.tele.2004.11.001.

  56. Schülke AM, Plischke H, Kohls NB. Ambient Assistive Technologies (AAT): socio-technology as a powerful tool for facing the inevitable sociodemographic challenges? Philos Ethics Humanit Med. 2010. https://doi.org/10.1186/1747-5341-5-8.

  57. Londei ST, Rousseau J, Ducharme F, St-Arnaud A, Meunier J, Saint-Arnaud J, Giroux F. An intelligent videomonitoring system for fall detection at home: perceptions of elderly people. J Telemed Telecare. 2009;15(8):383–90. https://doi.org/10.1258/jtt.2009.090107.

  58. Steele R, Lo A, Secombe C, Wong YK. Elderly persons’ perception and acceptance of using wireless sensor networks to assist healthcare. Int J Med Informatics. 2009;78(12):788–801. https://doi.org/10.1016/j.ijmedinf.2009.08.001.

  59. Shaw-Garlock G. Looking forward to sociable robots. Int J Social Robot. 2009;1(3):249–60. https://doi.org/10.1007/s12369-009-0021-7.

  60. Mittelstadt B, Fairweather NB, McBride N, Shaw M. Privacy, risk and personal health monitoring. In: Proceedings of ETHICOMP 2013: the possibilities of ethical ICT, 2013; p. 340–351.

  61. Hert PD, Gutwirth S, Moscibroda A, Wright D, Fuster GG. Legal safeguards for privacy and data protection in ambient intelligent. Pers Ubiquit Comput. 2009;13(6):435–44. https://doi.org/10.1007/s00779-008-0211-6.

  62. Shneier M, Hong T, Cheok G, Saidi K, Shackleford W. Performance evaluation methods for human detection and tracking systems for robotic applications. vol. NISTIR, 8045. 2015.

  63. Jones S, Hara S, Augusto JC. eFRIEND: an ethical framework for intelligent environments development. Eth Inf Technol. 2015;17(1):11–25.

  64. Landau R, Auslander GK, Werner S, Shoval N, Heinik J. Families’ and professional caregivers’ views of using advanced technology to track people with dementia. Qual Health Res. 2010;20(3):409–19. https://doi.org/10.1177/1049732309359171.

  65. Nixon P, Wagealla W, English C, Terzis S. Security, privacy and trust issues in smart environments. In: Cook D, Das S, editors. Smart environments: technology, protocols and applications. Hoboken: Wiley; 2004. p. 220–40.

  66. Rashidi P, Mihailidis A. A survey on ambient assisted living tools for older adults. IEEE J Inf Technol Biomed. 2013;17(3):579–90. https://doi.org/10.1109/JBHI.2012.2234129.

  67. Van Hoof J, Kort HSM, Markopoulos P, Soede M. Ambient intelligent, ethics and privacy. Gerontechnology. 2007;6(3):155–63. https://doi.org/10.4017/gt.2007.06.03.005.00.

  68. Al-Aama T. Falls in the elderly spectrum and prevention. Can Fam Physician. 2011;57(7):771–6.

  69. Hill KD, Wee R. Psychotropic drug-induced falls in older people. Drugs Aging. 2012;29(1):15–30. https://doi.org/10.2165/11598420-000000000-00000.

  70. Kalisch BJ, Tschannen D, Lee KH. Missed nursing care, staffing, and patient falls. J Nurs Care Qual. 2012;27(1):6–12.

  71. Ambrose AF, Paul G, Hausdorff JM. Risk factors for falls among older adults: a review of the literature. Maturitas. 2013;75(1):51–61. https://doi.org/10.1016/j.maturitas.2013.02.009.

  72. Callisaya ML, Blizzard L, Schmidt MD, Martin KL, McGinley JL, Sanders LM, Srikanth VK. Gait, gait variability and the risk of multiple incident falls in older people: a population-based study. Age Ageing. 2011;40(4):481–7. https://doi.org/10.1093/ageing/afr055.

  73. Coussement J, De Paepe L, Schwendimann R, Denhaerynck K, Dejaeger E, Milisen K. Interventions for preventing falls in acute- and chronic-care hospitals: a systematic review and meta-analysis. J Am Geriatr Soc. 2008;56:29–36. https://doi.org/10.1111/j.1532-5415.2007.01508.x.

  74. Kartal B et al. Tree search with branch and bound for multi-robot task allocation. In: IJCAI’16 workshop on autonomous mobile service robots. 2016.

  75. Coradeschi S, Cesta A, Cortellessa G, Coraci L, Gonzalez J, Karlsson L, Pecora F. Giraffplus: combining social interaction and long term monitoring for promoting independent living. In: Human system interaction (HSI), 2013 the 6th international conference. IEEE; 2013. pp 578–85.

  76. Ackerman E. Nevada bill would provide tentative roadmap for autonomous vehicles. IEEE Spect. 2011.http://spectrum.ieee.org/automaton/robotics/artificial-intelligence/nevada-bill-would-provide-tentative-roadmap-for-autonomous-vehicles.s.

  77. Simonov M, Bazzani M, Frisiello A. Ubiquitous monitoring & service robots for care. Paper presented at the 35th German conference on artificial intelligence September, Saarbrucken, Germany. 2012.

  78. Frank AW, Labas MC, Johnston JD, Kontulainen SA. Site-specific variance in radius and tibia bone strength as determined by muscle size and body mass. Physiother Can 2012; 64(3):292–301. http://www.utpjournals.press/doi/10.3138/ptc.2010-40BH.

  79. van de Ven AA, Sponselee AMA, Schouten BA. Robo MD: a home care robot for monitoring and detection of critical situations. In: Proceedings of the 28th Annual European conference on cognitive ergonomics. ACM; 2010. pp 375–6.

  80. Cavallo E, Galiani S, Noy I, Pantano J Catastrophic natural disasters and economic growth. Rev Econ Stat 2013; 95(5):1549–61. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1817292.

  81. Rosenthal S, Veloso MM. Mobile robot planning to seek help with spatially-situated tasks. In: Association for the Advancement of Artificial Intelligence (AAAI), Vol. 4, No. 5.3; 2012. p. 1.

  82. Lin P, Abney K, Bekey G. Robot ethics: mapping the issues for a mechanized world. Artif Intell. 2011;175(5):942–9. https://doi.org/10.1016/j.artint.2010.11.026.

  83. Sorell T, Draper H. Robot carers, ethics, and older people. Ethics Inf Technol. 2014;16(3):183–95.

  84. Rouvroy A. Privacy, data protection, and the unprecedented challenges of ambient intelligent. Stud Ethics Law Technol. 2008;2(1):1–51. https://doi.org/10.2202/1941-6008.1001.

  85. Langheinrich M, Coroama V, Bohn J, Friedemann M. Living in a smart environment—implications for the coming ubiquitous information society. Telecommun Rev. 2004;15(1):132–43. https://doi.org/10.1109/ICSMC.2004.1401091.

  86. Lin P, Abney K, Bekey GA. The ethical and social implications of robotics. Cambridge: MIT Press; 2012.

  87. Op den Akker HJA. What do care robots reveal about technology? In: Proceedings of the 1st international conference on social robots in therapy and education, NewFriends. Almere, The Netherlands: Windesheim Flevoland; 2015. pp 82–83.

  88. Coeckelbergh M. Can we trust robots? Ethics Inf Technol. 2012;14(1):53–60.

  89. Coeckelbergh M. Virtual moral agency, virtual moral responsibility: on the moral significance of the appearance, perception, and performance of artificial agents. AI & Soc. 2009;24(2):181–9.

  90. Cesta A, Cortellessa G, Orlandini A, Tiberio L. Long-term evaluation of a telepresence robot for the elderly: methodology and ecological case study. Int J Social Robot. 2016;8(3):421–41.

  91. Koceski S, Koceska N. Evaluation of an assistive telepresence robot for elderly healthcare. J Med Syst. 2016;40(5):1–7.

  92. Van Wynsberghe A. Service robots, care ethics, and design. Ethics Inf Technol. 2016;18(4):311–21.

  93. Allen C, Wallach W. Moral machines: contradiction in terms or abdication of human responsibility. In: Robot ethics: the ethical and social implications of robotics. Cambridge (MA): MIT Press, 2012. pp 55–68.

  94. Malle BF. Integrating robot ethics and machine morality: the study and design of moral competence in robots. Ethics Inf Technol. 2016;18(4):243–56.

  95. Wright D, Gutwirth S, Friedewald M, Vildjiounaite E, Punie Y, editors. Safeguards in a world of ambient intelligent. New York: Springer; 2010.

  96. Demiris G, Rantz MJ, Aud MA, Marek KD, Tyrer HW, Skubic M, Hussam AA. Older adults’ attitudes towards and perceptions of ‘smart home’ technologies: a pilot study. Med Inform Internet Med. 2004;29(2):87–94. https://doi.org/10.1080/14639230410001684387.

  97. Dorsten AM, Sifford KS, Bharucha A, Mecca LP, Wactlar H. Ethical perspectives on emerging assistive technologies: insights from focus groups with stakeholders in long-term care facilities. J Empir Res Hum Res Ethics. 2009;4(1):25–36. https://doi.org/10.1525/jer.2009.4.1.25.

  98. Sun H, De Florio V, Gui N, Blondia C. Promises and challenges of ambient assisted living systems. In: Proceedings of the 6th international conference on information technology: new generations. IEEE; 2009. p. 1201–7. https://doi.org/10.1109/ITNG.2009.169.

  99. Perry J, Beyer S, Holm S. Assistive technology, telecare and people with intellectual disabilities: ethical considerations. J Med Ethics. 2009;35:81–6. https://doi.org/10.1136/jme.2008.024588.

  100. Weiser M, Brown JS. Designing calm technology. PowerGrid J. 1996;1(1):75–85.

  101. Augusto JC, McCullagh PJ, Augusto-Walkden J-A. Living without a safety net in an intelligent environment. ICST Trans Ambient Syst. 2011;11(10–12):e6. https://doi.org/10.4108/trans.amsys.2011.e6.

  102. Johnson M, George A, Tran DT. Analysis of falls incidents: nurse and patient preventive behaviours. Int J Nurs Pract. 2011;17(1):60–6. https://doi.org/10.1111/j.1440-172X.2010.01907.x.

  103. Silveira P, van het Reve E, Daniel F, Casati F, de Bruin ED. Motivating and assisting physical exercise in independently living older adults: a pilot study. Int J Med Informatics. 2013;82(5):325–34. https://doi.org/10.1016/j.ijmedinf.2012.11.015.

  104. Wu GE, Keyes LM. Group tele-exercise for improving balance in elders. Telemed J E Health. 2006;12(5):561–70. https://doi.org/10.1089/tmj.2006.12.56.

Authors’ contributions

Despite the considerable benefits of assistive devices, there are still a great number of ethical issues, from the elderly's perspective, that need to be considered. The authors of this paper provide a review of ethics theories with the purpose of applying the concepts of the related theories to form a proper ethical framework to overcome the current issues encountered by the elderly. It is believed that a clear understanding of the ethical issues related to assistive devices will assist in creating a proper ethical framework that can be applied globally. This paper was written by Nazanin Mansouri and Seyed Ebrahim Hosseini and reviewed by Khaled Goher. All authors read and approved the final manuscript.

Acknowledgements

The authors of this paper would like to thank Lincoln University in New Zealand for offering the funding support for this publication.

Competing interests

The authors declare that they have no competing interests.

Funding

This research was funded by a research grant from Lincoln University, New Zealand.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author information

Corresponding author

Correspondence to Nazanin Mansouri.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Mansouri, N., Goher, K. & Hosseini, S.E. Ethical framework of assistive devices: review and reflection. Robot. Biomim. 4, 19 (2017). https://doi.org/10.1186/s40638-017-0074-2
