Researching Learner, Player, User Personas
Part One: Gathering Information
Reflection
Most importantly, I value purposeful play. In everything I do as a learning experience designer, I filter through the lens of “Is it meaningful? Is it purposeful? Is it designed with intention?” I value games that entertain while also challenging thinking, building problem-solving skills, and supporting educational objectives and goals. I appreciate games that blend conceptual learning with critical, even creative, thinking, providing opportunities to apply knowledge to real-world or simulated scenarios. Additionally, I value empathy-driven games, ones that encourage perspective-taking and compassion.
For learning games, a clear purpose must align with educational standards and specific learning outcomes. Again, purposeful play. It is also important that learning games provide opportunities to practice skills in a meaningful context, such as computational thinking (per our client), design thinking, or even character education (my brain may have just caught fire thinking about that one!). I particularly love puzzle-solving, abstraction, pattern recognition, and deductive reasoning. Since most of my learning experience design work focuses on real-world learning and compassionate innovation, I prefer games that encourage players to reflect and think deeply about their choices.
Though points and leaderboards are fun and motivational, they are not as important to me as purposeful play -- intrinsic engagement through rich, authentic content. Also, with learning games, I’m not keen on heavy competition. If a learning game were collaborative, I might enjoy the competition more.
For entertainment games, I love creative problem-solving. My ADHD mind loves exploring different outcomes and experiencing the range of possibilities. While I don’t lean toward heavy competition, I love social, collaborative play, where shared experiences may lead to meaningful connections. I don’t love fast-paced action games, except maybe Bananagrams. I need a minute to think, process, and explore different scenarios; anything fast-paced creates anxiety in me and causes my mind to freeze.
As a user, I value intuitive design. I have been an Apple girl since my college days, and I appreciate how user-friendly and easy to understand Apple devices are: simple, clear, and accessible. I’m continually growing my accessibility knowledge, as tools should be inclusive of all users. Tools and platforms should engage but not overwhelm.
Data Collection
I located two secondary sources through Google Scholar -- a digital book chapter by a Dordt University professor, published in the Handbook of Research on Global Issues in Next-Generation Teacher Education, and a systematic literature review article from the International Journal of Technology and Design Education.
The research sources offered a combination of quantitative and qualitative data to help inform the learner persona. A few highlights:
- The pre-service teacher survey provided quantitative data through scaled responses, such as comfort with technology rated on a scale of 1-5. It also included open-ended questions that offered specific qualitative insight from pre-service teachers. (A rough sketch of how such scaled responses might be summarized appears after this list.)
- The literature review included quantitative data from research outcomes and qualitative summaries drawn from other studies.
- The Dordt book chapter, a study conducted by Professor David J. Mulder, offered qualitative insights and analyses on the technology integration habits of pre-service teachers, specifically their attitudes, skill levels, and generational differences.
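Purely as an illustration (not part of the actual study data), below is a minimal Python sketch of how 1-5 comfort ratings and a few categorical survey fields could be rolled up into persona-ready summary statistics. The field names and response values here are hypothetical placeholders, not the real survey items.

```python
# Hypothetical illustration: summarizing 1-5 "comfort with technology" ratings
# and a few categorical fields into persona-ready statistics.
# All field names and values below are made up for the sketch.

import statistics
from collections import Counter

# Each dict stands in for one survey response (placeholder data only).
responses = [
    {"tech_comfort": 4, "gbl_experience": "some", "age_range": "18-24"},
    {"tech_comfort": 3, "gbl_experience": "none", "age_range": "18-24"},
    {"tech_comfort": 5, "gbl_experience": "frequent", "age_range": "18-24"},
    {"tech_comfort": 2, "gbl_experience": "none", "age_range": "25-34"},
]

comfort = [r["tech_comfort"] for r in responses]

persona_inputs = {
    # Central tendency of the 1-5 comfort scale
    "median_tech_comfort": statistics.median(comfort),
    # Spread of scaled responses
    "comfort_distribution": dict(Counter(comfort)),
    # Counts of prior GBL experience categories
    "gbl_experience_counts": dict(Counter(r["gbl_experience"] for r in responses)),
    # Most common age range among respondents
    "dominant_age_range": Counter(r["age_range"] for r in responses).most_common(1)[0][0],
}

print(persona_inputs)
```

A summary like this would only feed the quantitative side of the persona; the open-ended responses still need qualitative coding by hand.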
Sources:
- Survey for pre-service students
- A chapter by Professor David J. Mulder from the Handbook of Research on Global Issues in Next-Generation Teacher Education, Dordt University Digital Collection.
- Dong, W., Li, Y., Sun, L. et al. Developing pre-service teachers’ computational thinking: a systematic literature review. Int J Technol Des Educ 34, 191–227 (2024). https://doi.org/10.1007/s10798-023-09811-3
The target audience for this research is pre-service teachers, primarily those in undergraduate teacher education programs. The survey data show that most respondents fall within the 18-24 age range, aligning with typical college-aged students. The literature review likewise focuses on pre-service teachers. The survey and the Dordt chapter centered on pre-service teachers preparing to become classroom teachers, while the literature review focused on pre-service teachers engaged in computational thinking training who are interested in implementing game-based learning and edtech in their future classrooms.
Part Two: Analyze Your Findings
Time was a constraint for this collection process, so I needed to lean on already published research. The systematic literature review offered insightful secondhand research findings and narrowed the focus to computational thinking and pre-service teachers. The Mulder chapter offered a broad generational perspective on digital fluency along with attitudinal and emotional considerations.
I created a side-by-side comparison of the survey and the systematic literature review, as both of those sources focused more on specific teaching skills and technology training.
Below is a summary of what I learned from the survey and research findings to help create a learner persona for pre-service teachers:
- Tech Comfort with New Tools: There was varied familiarity with game-based learning (GBL) and computational thinking (CT) tools.
  - Survey: a range of GBL experience, though not directly correlated with tech integration.
  - Research Papers: indicated gaps in tech integration skills and a need for explicit instruction.
- Training: The two papers specifically mentioned structured training to build confidence and educational technology skills.
- Motivations: All three sources identified real-world relevance and practical experience as key motivators for pre-service teachers. The survey indicated a specific interest in GBL and interactive experiences, and the papers also discussed the importance of confidence and self-efficacy.
- Pain Points: All three sources touched on barriers to technology integration, including gaps in understanding, confidence in using the technology, and access to different learning apps and platforms.
- Learning Preferences: All sources identified hands-on, collaborative, and interactive learning experiences as a preference.
- Game-Based Learning Elements: The two research papers did not directly address GBL, though the literature review included computational thinking learning modules and programs such as Scratch (not game-based, but fun and interactive). The survey did indicate interest in and some experience with GBL, along with an interest in learning more.
Link to Learner Persona Image for higher resolution:
Part Three: Reflect on Your Data Collection Process
The data collection process went well, but time constraints presented a few challenges. Ideally, I would have first interviewed the Subject Matter Expert (SME) to better understand the project scope. That would have allowed me to refine the project's focus before developing surveys, conducting interviews, and gathering additional research. However, I created the survey before speaking with the SME due to scheduling limitations and assignment deadlines. After the initial survey was developed, we shared it with the SME, who provided valuable feedback and insights that informed our revisions. Once the client approves, the plan is to administer the updated survey to the pre-service students currently enrolled in the course.
One challenge was assuming the students would have had prior exposure to learning design classes, which influenced my survey design and choice of participants (juniors and seniors). Though the target audience is pre-service teachers in general, the class is geared toward sophomore pre-service teachers. It would have been more strategic to conduct interviews and gain a thorough understanding of the client's needs before diving into data collection. Given more time and a clearer vision of the project scope, the data collection process would have been more targeted. Going forward, I will prioritize early collaboration with clients and stakeholders before moving to data collection.
Lastly, moving forward with any instructional design work, I would interview the students rather than administer only a survey. When I think of my own children, who are 20 and 22, I learn far more about what they think by listening, asking questions, and probing more deeply when they share. A survey feels two-dimensional; an interview offers so much more. One caveat: college students are busy and may not see the value of investing time in an interview to help design a learning experience.