
The use of task modeling in interactive system specification

  • Original Article
  • Published in Cognition, Technology & Work

Abstract

Task modeling is undoubtedly a key step in task analysis during the development of interactive systems, since it helps not only in understanding what users want but also in how to design for them. As a consequence, it should be considered indispensable when specifying the requirements of an interactive system. With this in mind, we analyzed the requirement specifications of a typical interactive system, a compulsory element of the final grade for a human–computer interaction course in two Master’s degree programs at the University of Valenciennes. To that end, we used an approach based on project-based learning. Sixty-three requirement specifications produced since 2010 were investigated to find out how task modeling was represented. Moreover, suggestions concerning pedagogical improvement, made by the students in an open question on the evaluation form, were analyzed using the grounded theory method. The results show that, since the requirement specification does not oblige students to perform task modeling, they do not do so even in the case of complex interactive systems. Criticisms, positive points, and improvement issues raised by the students were also identified. In this paper, we present a detailed analysis of this study, which opens up questions about HCI education and the effective learning of task modeling and, as a consequence, its potential use in industry.




Notes

  1. www.scopus.com.

  2. http://login.webofknowledge.com/.

  3. The European system aims to unify European higher education and to facilitate equivalence between countries, in order to promote student mobility within Europe.

  4. Recall: only a few weeks before the project for the AHMS program, and during the previous year for the CS program.

  5. The analysis of the reports shows that the students were motivated: they worked outside class to produce professional-quality reports, sometimes of about thirty pages. They considered that the time in class was not enough, and all compensated with considerable work outside class.


Acknowledgements

The authors would like to acknowledge the financial support granted by CAPES—Science without Borders Program. They warmly thank the students who participated in this study, as well as Bruno Warin (University of Littoral Côte d’Opale, Calais, France), who proposed the evaluation questionnaire. The authors also thank the anonymous reviewers for their numerous constructive remarks.

Author information

Correspondence to Christophe Kolski.

Appendix: Evaluation questionnaire

Preamble: We started with a course that could be described as classic, associated with various supporting materials. Then, in the supervised work classes, I proposed an active pedagogy, supported by the carrying out of a collective mini-project. To improve this pedagogy, I would like to know how you feel about it, and I would therefore ask you to fill out the following questionnaire. The responses will be used anonymously and only for research purposes. Thank you in advance for your help.

1.1 General profile

  1. Gender:

  2. Work investment—About your work investment in the master, you consider yourself as:

     a. Good worker and perfectionist
     b. Good worker
     c. Just enough to achieve the goal (the average mark in an exam, for example)
     d. Irregular
     e. Carefree

  3. Working method—You evaluate yourself as:

     a. Very methodical
     b. Methodical
     c. Pragmatic
     d. Carefree

  4. Work preference—When you have the choice, you prefer to work:

     a. Individually
     b. In pairs
     c. In a team

  5. Freedom of action—When doing the work, you prefer to:

     a. Be guided from the start and throughout the work
     b. Be guided in part of the work
     c. Have the goal and the main lines of resolution, then work freely
     d. Not be guided

1.2 The proposed mini-project

  1. Initial interest—You can say that the theme of the project initially aroused:

     a. Enthusiasm
     b. Interest
     c. A feeling of constraint
     d. A feeling of punishment

  2. Subject comprehension—About your comprehension, you think the subject was:

     a. Too detailed
     b. Well detailed
     c. Not explicit enough
     d. Incomprehensible

  3. Difficulty of the work—You consider the work to be done:

     a. Too difficult
     b. Difficult
     c. At the right level
     d. Easy

  4. Time for performance—Compared to the work required to complete the mini-project, you consider that the time spent in the supervised work classes was:

     a. Very important
     b. At the right level
     c. Not sufficient
     d. Not at all sufficient

1.3 The pedagogy

  1. Initial interest—You can say that the obligation to respect a scenario/methodology initially aroused:

     a. Enthusiasm
     b. Interest
     c. A feeling of constraint
     d. A feeling of punishment

  2. Study of the scenario/method—Did you read the scenario/method (in relation to the project subject)?

     a. I read it very carefully
     b. I read it with average attention
     c. I read little or nothing

  3. Understanding alone—You think the scenario/method (project subject) is:

     a. Very easy to understand by yourself
     b. Easy to understand by yourself
     c. Difficult to understand by yourself
     d. Very difficult to understand by yourself

  4. Understanding in a group—You think the scenario/method (project subject) is:

     a. Very easy to understand in a group
     b. Easy to understand in a group
     c. Difficult to understand in a group
     d. Very difficult to understand in a group

  5. Participation thanks to the scenario/method—Compared to sessions where the teacher presents the knowledge to be learned on the “blackboard” (video presentation), do you think the scenario/method makes the supervised classes more motivating and encourages greater participation?

     a. Absolutely
     b. Almost sure
     c. Probably not
     d. Not at all

  6. Utility of the scenario/method—You think the scenario/method is:

     a. Very relevant for achieving the learning of the subject(s) studied in class
     b. Relevant
     c. Irrelevant
     d. Useless

  7. Group meeting organization—Were the group meetings organized (designation of a facilitator, a rapporteur, agenda, duration, time of individualized speech, etc.)?

     a. Always
     b. Often
     c. Rarely
     d. Never

  8. Frequency of course assessment—Do you think that regular assessments encourage better learning than an overall assessment at the end of the course?

     a. Absolutely
     b. Almost sure
     c. Probably not
     d. Not at all

  9. Scenario/method understanding—You think the scenario/method is:

     a. Very easy to apply
     b. Easy to apply
     c. Difficult to apply
     d. Impossible to apply

  10. Scenario/method application—Did you apply the scenario/method?

     a. Absolutely
     b. Practically yes
     c. Not exactly
     d. Not at all

  11. Quality of the report—Did the application of the scenario/method favor the quality of the final product (the report)?

     a. Yes
     b. No

  12. Knowledge provided by teachers—Was the knowledge acquired by your group, or the course given by the teacher before the project, sufficient to do the required work?

     a. Absolutely
     b. Largely
     c. A little
     d. Not at all

1.4 The evaluation

  1. Workload—Does the system of evaluation by report seem cumbersome?

     a. Absolutely
     b. Binding but bearable
     c. Binding but easy to integrate into your training workload
     d. Not at all

  2. Relevance—Does the evaluation system seem relevant to promote learning?

     a. Absolutely
     b. Highly pertinent
     c. Not very pertinent
     d. Not at all

  3. Preference for a single exam—Would you have preferred a global exam instead of an exam plus the project report?

     a. Absolutely
     b. Strongly
     c. A little
     d. Not at all

1.5 Open question

Do you have suggestions for improving the proposed pedagogy?


About this article


Cite this article

Gonçalves, T.G., de Oliveira, K.M. & Kolski, C. The use of task modeling in interactive system specification. Cogn Tech Work 19, 493–515 (2017). https://doi.org/10.1007/s10111-017-0427-1
