
The development of eFun: A game-based app to measure cognitive functions in students

eFun is a self-administered series of iPad games that measures a subset of cognitive functions called ‘executive functions’ in an engaging way. Executive functions have been shown to predict academic and career success and are related to quality of life (Moffitt et al., 2011; Nayfeld, Fuccillo & Greenfield, 2013). eFun was developed through iterative cycles of theory-driven design, was tested in real-world settings, and builds on established executive function theories and user feedback.

The rationale behind the development of eFun

Cognitive assessments can be expensive, lengthy, and fatiguing for students, and are often conducted in an artificial clinical context. Traditional cognitive assessments typically involve children being tested in a private room with one or more assessors. This environment can be anxiety-inducing and stressful, especially for children who are unfamiliar with these types of situations (Shute, Wang, Greiff, Zhao & Moore, 2016). Furthermore, lengthy cognitive assessments can bore children to the point where they disengage from the tasks (McPherson & Burns, 2008). Research is therefore shifting away from traditional cognitive testing towards a more enjoyable, accessible, and easy-to-administer approach (Hawkins, Rae, Nesbitt & Brown, 2013; Vermeir, White, Johnson, Crombez & Van Ryckeghem, 2020). To make the tasks more engaging, assessors have started to add game elements to traditional cognitive task designs (Berg, Rogers, McMahon, Garrett & Manley, 2020; Howard & Melhuish, 2017). Improving the design of traditional cognitive tasks is thought to increase students’ engagement, thereby raising the probability of measuring children’s full cognitive capabilities (Vermeir, White, Johnson, Crombez & Van Ryckeghem, 2020; Lumsden, Edwards, Lawrence, Coyle & Munafò, 2016). However, this is an emerging field that still lacks validated research tools (Berg, Rogers, McMahon, Garrett & Manley, 2020; Blair, Zelazo & Greenberg, 2005), partly because a large share of the gamification literature focuses on cognitive training rather than assessment (Lumsden, Edwards, Lawrence, Coyle & Munafò, 2016; Ninaus et al., 2015; Dovis, Van der Oord, Wiers & Prins, 2015). While testing environments are slowly evolving, most still require one-on-one support from a trained assessor.

Challenges of designing a cognitive assessment tool for students

Designing cognitive assessment tasks for students comes with a number of challenges. The main challenge is to develop tools that are engaging while at the same time reliably assessing cognitive constructs in students.

  • The challenge of designing an engaging task

One problem is that existing validated executive function assessment tools are not always well suited for use with students. For example, if a student perceives a task as too effortful, frustrating, and/or repetitive, this can result in disengagement, which in turn may negatively impact data quality (Lumsden, Edwards, Lawrence, Coyle & Munafò, 2016). Data quality has been shown to suffer when students put low effort into cognitive tasks, including executive function tasks (DeRight & Jorgensen, 2015). Conversely, students who enjoy a task and find it interesting perform better (Schukajlow & Krug, 2014). Task enjoyment has also been found to be positively associated with attention and task persistence (Reeve, 1989; Engelmann & Pessoa, 2014), which can lead to better performance (Engelmann, Damaraju, Padmala & Pessoa, 2009; Pessoa & Engelmann, 2010).

To increase task engagement and effort, some researchers have added game elements to traditional tasks (Lumsden, Edwards, Lawrence, Coyle & Munafò, 2016; Katz, Jaeggi, Buschkuehl, Stegman & Shah, 2014). However, game-like elements can undermine their own motivational benefits if they distract the participant to the point that the construct of interest is no longer measured reliably (McPherson & Burns, 2008; Katz, Jaeggi, Buschkuehl, Stegman & Shah, 2014). Game designers value dynamic game elements; from a cognitive perspective, however, such elements can interfere with the assessment of cognitive skills (Dockterman, Petscher, McAfee, Klopfer, Osterweil & Diefenthaler, 2020). Game elements therefore need to be introduced carefully, without distracting the player from the core task. In line with this, Lumsden et al. (2016) suggest that gamification, applied carefully, can provide a way to develop engaging yet scientifically valid cognitive assessments. The aim is thus to import game design elements into executive function (EF) tasks without undermining their validity, which is expected to improve the quality of the outcome data and enhance the experience for participants (Lumsden et al., 2016).

  • The challenge of maintaining the underlying theoretical construct

Another challenge is to make the tasks enjoyable while maintaining the underlying theoretical construct of the original task. There is a need to design measurement tools for students that use game elements to increase engagement while keeping the task focused on the assessment of psychological capacities consistent with theoretical guidelines (Vermeir, White, Johnson, Crombez & Van Ryckeghem, 2020). This is in line with design-based research (DBR), which emphasizes theory-driven design (Wang & Hannafin, 2005; Design-Based Research Collective, 2003). It means that the development team needs a common understanding of the theoretical underpinnings of the tasks being developed (Oubahssi, Piau-Toffolon, Loup & Sanchez, 2020). At the same time, DBR is conducted in order to generate, advance, and refine theory (Wang & Hannafin, 2005; Design-Based Research Collective, 2003): as hypotheses are rejected or confirmed, theoretical models are refined or re-theorized (Bell, Hoadley & Linn, 2004; Stahl, 2002). Thus, DBR builds on established theories while allowing for theory generation and modification (Wang & Hannafin, 2005).

Summary

To address these challenges, our research aimed to improve cognitive assessment with the new game-based assessment app eFun. The eFun app was designed and developed in collaboration with researchers, teachers, students, and software engineers; it is based on established cognitive theories and was validated through iterative testing in real-world settings. The iterative development process follows design-based research and includes cycles of design exploration, testing, analysis, redesign, and evaluation with students in authentic educational settings. The knowledge gained from this iterative process of designing a valid cognitive function app can inform other researchers aiming to develop cognitive assessment tools in an educational context.


References

Bell, P., Hoadley, C. M. & Linn, M. C. (2004). Design-based research in education. In Internet Environments for Science Education, pp. 73-85. https://doi.org/10.4324/9781410610393

Berg, V., Rogers, S. L., McMahon, M., Garrett, M. & Manley, D. (2020). A Novel Approach to Measure Executive Functions in Students: An Evaluation of Two Child-Friendly Apps. Frontiers in Psychology, 11, p. 1702. https://doi.org/10.3389/fpsyg.2020.01702

Blair, C., Zelazo, P. D. & Greenberg, M. T. (2005). The measurement of executive function in early childhood. Developmental Neuropsychology, 28 (2), pp. 561-571. https://doi.org/10.1207/s15326942dn2802_1

DeRight, J. & Jorgensen, R. S. (2015). I just want my research credit: Frequency of suboptimal effort in a non-clinical healthy undergraduate sample. The Clinical Neuropsychologist, 29 (1), pp. 101-117. https://doi.org/10.1080/13854046.2014.989267

Design-Based Research Collective (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32 (1), pp. 5-8. https://doi.org/10.3102/0013189X032001005

Dockterman, D., Petscher, Y., McAfee, A., Klopfer, E., Osterweil, S. & Diefenthaler, C. (2020). Gaming Considerations for Educational Assessments. https://doi.org/10.31234/osf.io/en23t

Dovis, S., Van der Oord, S., Wiers, R. W. & Prins, P. J. M. (2015). Improving executive functioning in children with ADHD: Training multiple executive functions within the context of a computer game. A randomized double-blind placebo controlled trial. PLoS One, 10 (4), p. e0121651. https://doi.org/10.1371/journal.pone.0121651

Engelmann, J. B., Damaraju, E., Padmala, S. & Pessoa, L. (2009). Combined effects of attention and motivation on visual task performance: transient and sustained motivational effects. Frontiers in human neuroscience, 3, p. 4. https://doi.org/10.3389/neuro.09.004.2009

Engelmann, J. B. & Pessoa, L. (2014). Motivation sharpens exogenous spatial attention. Motivation Science, 1 (S), p. 64. https://doi.org/10.1037/2333-8113.1.S.64

Hawkins, G. E., Rae, B., Nesbitt, K. V. & Brown, S. D. (2013). Gamelike features might not improve data. Behavior Research Methods, 45 (2), pp. 301-318. https://doi.org/10.3758/s13428-012-0264-3

Howard, S. J. & Melhuish, E. (2017). An Early Years Toolbox for Assessing Early Executive Function, Language, Self-Regulation, and Social Development: Validity, Reliability, and Preliminary Norms. Journal of Psychoeducational Assessment, 35 (3), pp. 255-275. https://doi.org/10.1177/0734282916633009

Katz, B., Jaeggi, S., Buschkuehl, M., Stegman, A. & Shah, P. (2014). Differential effect of motivational features on training improvements in school-based cognitive training. Frontiers in human neuroscience, 8, p. 242. https://doi.org/10.3389/fnhum.2014.00242

Lumsden, J., Edwards, E. A., Lawrence, N. S., Coyle, D. & Munafò, M. R. (2016). Gamification of cognitive assessment and cognitive training: a systematic review of applications and efficacy. JMIR serious games, 4 (2), p. e11. https://doi.org/10.2196/games.5888

McPherson, J. & Burns, N. R. (2008). Assessing the validity of computer-game-like tests of processing speed and working memory. Behavior Research Methods, 40 (4), pp. 969-981. https://doi.org/10.3758/BRM.40.4.969

Moffitt, T. E., et al. (2011). A gradient of childhood self-control predicts health, wealth, and public safety. Proceedings of the National Academy of Sciences, 108 (7), pp. 2693-2698. https://doi.org/10.1073/pnas.1010076108

Nayfeld, I., Fuccillo, J. & Greenfield, D. B. (2013). Executive functions in early learning: Extending the relationship between executive functions and school readiness to science. Learning and Individual Differences, 26, pp. 81-88. https://doi.org/10.1016/j.lindif.2013.04.011

Ninaus, M., et al. (2015). Game elements improve performance in a working memory training task. International Journal of Serious Games, 2 (1), pp. 3-16. https://doi.org/10.17083/ijsg.v2i1.60

Oubahssi, L., Piau-Toffolon, C., Loup, G. & Sanchez, E. (2020). From Design to Management of Digital Epistemic Games. International Journal of Serious Games, 7(1), pp. 23-46. https://doi.org/10.17083/ijsg.v7i1.336

Pessoa, L. & Engelmann, J.B. (2010). Embedding reward signals into perception and cognition. Frontiers in neuroscience, 4, p. 17. https://doi.org/10.3389/fnins.2010.00017

Reeve, J. (1989). The interest-enjoyment distinction in intrinsic motivation. Motivation and emotion, 13 (2), pp. 83-103. https://doi.org/10.1007/BF00992956

Schukajlow, S. & Krug, A. (2014). Are interest and enjoyment important for students’ performance? Proceedings of the Joint Meeting of PME 38, pp. 129-136.

Shute, V. J., Wang, L., Greiff, S., Zhao, W. & Moore, G. (2016). Measuring problem solving skills via stealth assessment in an engaging video game. Computers in Human Behavior, 63, pp. 106-117. https://doi.org/10.1016/j.chb.2016.05.047

Stahl, G. (2002). Computer Support for Collaborative Learning: Foundations for a CSCL Community (CSCL 2002 Proceedings). Routledge. https://doi.org/10.3115/1658616

Vermeir, J. F., White, M. J., Johnson, D., Crombez, G. & Van Ryckeghem, D. M. L. (2020). The effects of gamification on computerized cognitive training: Systematic review and meta-analysis. JMIR serious games, 8 (3), p. e18644. https://doi.org/10.2196/18644

Wang, F. & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational technology research and development, 53 (4), pp. 5-23. https://doi.org/10.1007/BF02504682
