Abstract
Previous research has found that embedding a problem in a familiar context does not necessarily confer an advantage over a novel context in the acquisition of new knowledge about a complex, dynamic system. In fact, a semantically familiar context can be detrimental to knowledge acquisition. This has been described as the “semantic effect” (Beckmann, Learning and complex problem solving, Bonn, Holos, 1994). The aim of this study was to test two competing explanations for the semantic effect: goal adoption versus assumptions. Participants were asked to learn about the causal structure of a computer-simulated linear system with three input and three output variables through goal-free exploration. Across four conditions the level of familiarity was varied experimentally through the use of different variable labels. There was no evidence that goal adoption accounts for poor knowledge acquisition under familiar conditions. Rather, it appears that a semantically familiar problem context invites a high number of a priori assumptions regarding the interdependency of system variables. These assumptions tend not to be tested systematically during the knowledge acquisition phase, and this lack of systematicity is the main barrier to the acquisition of new knowledge. The semantic effect is, in fact, an effect of untested presumptions. Implications for research in problem solving, knowledge acquisition and the design of computer-based learning environments are discussed.
Notes
Curiously, the authors interpreted these results differently. In comparing the final knowledge score between the three experimental conditions—without considering the a priori differences in knowledge—they erroneously arrived at the conclusion that concrete conditions are advantageous to the acquisition of new knowledge.
We refer to control worthiness as a characteristic of a complex, dynamic system that is determined by the semanticity of its output variables. The underlying assumption is that output variables high in semanticity (i.e. with semantic reference to concrete objects in the “real world”) are more likely to trigger control behaviour that aims at optimising levels of output variables according to self-set targets (e.g. increase, decrease, or keep stable) despite the task being to explore the system.
Technically, only four interventions are necessary to completely identify a linear 3 by 3 system: one in which none of the input variables is changed, to identify autonomous changes in the output variables, and three interventions in which exactly one input variable is changed, to identify each input's respective effects on the output variables.
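The identification logic described in this note can be sketched in a few lines, assuming an additive linear system of the form y_next = y + Bx + d, where B holds the input-to-output effects and d the autonomous changes. The particular values of B and d below are illustrative, not those used in the study.

```python
import numpy as np

# Illustrative system parameters (not the study's actual values):
# B maps the three inputs to the three outputs; d is the autonomous change.
B = np.array([[2.0, 0.0, 0.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 0.5]])
d = np.array([0.0, -1.0, 0.0])

def step(y, x):
    """One round of the system: current outputs y, chosen inputs x."""
    return y + B @ x + d

y0 = np.zeros(3)

# Intervention 1: change no input -> the change in outputs reveals d.
d_est = step(y0, np.zeros(3)) - y0

# Interventions 2-4: change exactly one input by one unit per round;
# subtracting the autonomous change isolates that input's column of B.
B_est = np.column_stack([step(y0, e) - y0 - d_est for e in np.eye(3)])

assert np.allclose(d_est, d)
assert np.allclose(B_est, B)
```

More generally, for such an additive linear system with n inputs, n + 1 interventions of this kind suffice for complete identification.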
Given the sample sizes, an existing effect of at least medium size (i.e. d ≥ 0.57), which in this form of analysis translates into an effect as small as 9 % of explained variance, would be detectable with a probability greater than .80.
References
Beckmann, J. F. (1994). Learning and complex problem solving. Bonn: Holos.
Beckmann, J. F., & Guthke, J. (1995). Complex problem solving, intelligence, and learning ability. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European perspective (pp. 177–200). Hillsdale, NJ: Erlbaum.
Blessing, S. B., & Ross, B. H. (1996). Content effects in problem categorization and problem solving. Journal of Experimental Psychology: Learning, Memory, and Cognition, 22, 792–810.
Burns, B. D., & Vollmeyer, R. (2002). Goal specificity effects on hypothesis testing in problem solving. The Quarterly Journal of Experimental Psychology, 55A, 241–261.
Chung, G., Harmon, T. C., & Baker, E. L. (2001). The impact of a simulation-based learning design project on student learning. IEEE Transactions on Education, 44, 390–398.
Cooper, B., & Dunne, M. (2000). Assessing children’s mathematical knowledge: Social class, sex and problem-solving. Buckingham: Open University Press.
de Freitas, S., & Oliver, M. (2006). How can exploratory learning with games and simulations within the curriculum be most effectively evaluated? Computers & Education, 46(3), 249–264.
de Jong, T., & van Joolingen, W. R. (1998). Scientific discovery learning with computer simulations of conceptual domains. Review of Educational Research, 68(2), 179–201.
Dörner, D. (1987). On the difficulties people have in dealing with complexity. In J. Rasmussen, K. Duncan, & J. Leplat (Eds.), New technology and human error. New York: Wiley.
Dunbar, K. (1993). Concept discovery in a scientific domain. Cognitive Science, 17(3), 397–434.
Fang, L., Tan, H. S., Thwin, M. M., Tan, K. C., & Koh, C. (2011). The value simulation-based learning added to machining technology in Singapore. Educational Media International, 48, 127–137.
Funke, J. (1992). Dealing with dynamic systems: Research strategy, diagnostic approach and experimental results. German Journal of Psychology, 16(1), 24–43.
Goldstone, R. L., & Sakamoto, Y. (2003). The transfer of abstract principles governing complex adaptive systems. Cognitive Psychology, 46, 414–466.
Goldstone, R. L., & Son, J. Y. (2005). The transfer of scientific principles using concrete and idealized simulations. The Journal of the Learning Sciences, 14, 69–110.
Goode, N., & Beckmann, J. F. (2010). You need to know: There is a causal relationship between structural knowledge and control performance in complex problem solving tasks. Intelligence, 38, 345–352.
Gopnik, A., Sobel, D., Schulz, L., & Glymour, C. (2001). Causal learning mechanisms in very young children: Two, three, and four-year-olds infer causal relations from patterns of variation and covariation. Developmental Psychology, 37(5), 620–629.
Greiff, S., & Funke, J. (2009). Measuring complex problem solving—The MicroDYN approach. In F. Scheuermann & J. Björnsson (Eds.), The transition to computer-based assessment: Lessons learned from large-scale surveys and implications for testing (pp. 157–163). Luxembourg: Office for Official Publications of the European Communities.
Guthke, J., Beckmann, J. F., & Stein, H. (1995). Recent research evidence on the validity of learning tests. In J. S. Carlson (Ed.), Advances in cognition and educational practice. European contributions to the dynamic assessment (Vol. 3, pp. 117–143). Greenwich: JAI Press.
Hesse, F. W. (1982). Effekte des semantischen Kontexts auf die Bearbeitung komplexer Probleme [Effects of semantic context on complex problem solving]. Zeitschrift für Experimentelle und Angewandte Psychologie, 29(1), 62–91.
Hesse, F. W., Kauer, G., & Spies, K. (1997). Effects of emotion-related surface similarity in analogical problem solving. American Journal of Psychology, 110, 357–385.
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–81.
Klahr, D. (2000). Exploring science: The cognition and development of discovery processes. Cambridge, MA: MIT Press.
Klahr, D., & Nigam, M. (2004). The equivalence of learning paths in early science instruction: Effects of direct instruction and discovery learning. Psychological Science, 15(10), 661–667.
Klayman, J., & Ha, Y. (1987). Confirmation, disconfirmation, and information in hypothesis testing. Psychological Review, 94(2), 211–228.
Koedinger, K. R., & Anderson, J. R. (1998). Illustrating principled design: The early evolution of a cognitive tutor for algebra symbolization. Interactive Learning Environments, 5, 161–180.
Kotovsky, K., & Fallside, D. F. (1989). Representation and transfer in problem solving. In D. Klahr & K. Kotovsky (Eds.), Complex information processing: The impact of Herbert A. Simon (pp. 69–108). Hillsdale, NJ: Erlbaum.
Kröner, S., Plass, J. L., & Leutner, D. (2005). Intelligence assessment with computer simulations. Intelligence, 33, 347–368.
Lainema, T., & Nurmi, S. (2006). Applying an authentic, dynamic learning environment in real world business. Computers & Education, 47, 94–115.
Lazonder, A. W., Wilhelm, P., & Hagemans, M. G. (2008). The influence of domain knowledge on strategy use during simulation-based inquiry learning. Learning and Instruction, 18, 580–592.
Lazonder, A. W., Wilhelm, P., & van Lieburg, E. (2009). Unraveling the influence of domain knowledge during simulation-based inquiry learning. Instructional Science, 37, 437–451.
Leutner, D. (1993). Guided discovery learning with computer-based simulation games: Effects of adaptive and non-adaptive instructional support. Learning and Instruction, 3, 113–132.
Lubienski, S. T. (2000). Problem solving as a means toward ‘Mathematics for All’: An exploratory look through a class lens. Journal for Research in Mathematics Education, 31, 454–482.
McGaghie, W. C., Issenberg, S. B., Petrusa, E. R., & Scales, R. (2006). Effect of practice on standardised learning outcomes in simulation-based medical education. Medical Education, 40, 792–797.
Newson, R. (2006). Confidence intervals for rank statistics: Somers’ D and extensions. The Stata Journal, 6, 309–334.
Njoo, M., & de Jong, T. (1993). Exploratory learning with a computer simulation for control theory: Learning processes and instructional support. Journal of Research in Science Teaching, 30(8), 821–844.
Omodei, M. M., & Wearing, A. J. (1995). The fire chief microworld generating program: An illustration of computer-simulated microworlds as an experimental paradigm for studying complex decision-making behavior. Behavior Research Methods, Instruments & Computers, 27, 303–316.
Ravert, P. (2002). An integrative review of computer-based simulation in the education process. Computers, Informatics, Nursing, 20, 203–208.
Resnick, M. R. (1994). Turtles, termites, and traffic jams. Cambridge, MA: MIT Press.
Schauble, L., Klopfer, L. E., & Raghavan, K. (1991). Students’ transition from an engineering model to a science model of experimentation. Journal of Research in Science Teaching, 28(9), 859–882.
Schoppek, W. (2002). Examples, rules, and strategies in the control of dynamic systems. Cognitive Science Quarterly, 2, 63–92.
Schwartz, D. L., & Black, J. B. (1996). Shuttling between depictive models and abstract rules: Induction and fallback. Cognitive Science, 20, 457–497.
Snodgrass, J., & Corwin, J. (1988). Pragmatics of measuring recognition memory: Applications to dementia and amnesia. Journal of Experimental Psychology: General, 117, 34–50.
Somers, R. H. (1962). A new asymmetric measure of association for ordinal variables. American Sociological Review, 27, 799–811.
van Joolingen, W. R., & de Jong, T. (1993). Exploring a domain through a computer simulation: Traversing variable and relation space with the help of a hypothesis scratchpad. In D. Towne, T. de Jong, & H. Spada (Eds.), Simulation-based experiential learning (pp. 191–206). Berlin: Springer.
Vollmeyer, R., Burns, B. D., & Holyoak, K. (1996). The impact of goal specificity on strategy use and the acquisition of problem structure. Cognitive Science, 20, 75–100.
Wason, P. C. (1966). Reasoning. In B. M. Foss (Ed.), New horizons in psychology. Harmondsworth: Penguin.
Wood, R. E., Beckmann, J. F., & Birney, D. (2009). Simulations, learning and real world capabilities. Education + Training, 51(5/6), 491–510.
Acknowledgments
This research was supported, in part, under the Australian Research Council’s Linkage Projects funding scheme (project LP0669552). The views expressed herein are those of the authors and are not necessarily those of the Australian Research Council.
Cite this article
Beckmann, J.F., Goode, N. The benefit of being naïve and knowing it: the unfavourable impact of perceived context familiarity on learning in complex problem solving tasks. Instr Sci 42, 271–290 (2014). https://doi.org/10.1007/s11251-013-9280-7