Project Overview and Purpose

Students in introductory biology courses often report that they are studying hard yet not meeting their goals. Why do so many students struggle to learn despite their efforts? One likely reason, supported by work in the cognitive sciences, is that students are generally not adept at judging how well they know something (a metacognitive practice), nor at modifying their practices to improve their learning process (self-regulated learning; SRL) (Schraw et al., 2006; Stanton et al., 2015). Students who develop SRL strategies have better academic success and are more likely to persist in STEM courses than those who do not (Sebesta & Speth, 2017; Hunter, 2019). SRL also strengthens students’ ability to transfer their knowledge to new settings and events (Bransford et al., 2000; Zhao et al., 2014), and developing such strategies can be especially impactful for students from historically underserved communities (e.g., Rodriguez et al., 2018). However, despite this evidence, few faculty encourage and engage students in SRL, and few resources exist to help create course structures that do so.

We propose to tackle this issue by developing new features for existing computational systems: features that encourage self-reflection and motivation by collecting survey data from students while they are engaged in learning course materials and returning the results to them. Students will be asked to answer questions about their performance goals and motivations, their self-regulatory strategies, and their confidence in their knowledge. Survey results will be returned to students and followed by additional questions about how well they are currently meeting their goals, their likelihood of changing study strategies, and how they determine whether they are learning effectively. Students will also be messaged about upcoming assignments and quizzes, and offered additional resources such as successful study practices and time management tips. After each quiz or test in the course, they will be prompted to interact with the system again, always reflecting back on their prior answers.
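To make this cycle concrete, the sketch below outlines one possible data model for the prompt phases described above. It is purely illustrative: all names (Prompt, ReflectionCycle, the phase labels) and prompt texts are hypothetical placeholders, not part of any existing system, and the eventual implementation will depend on the platform we adopt.

    # A minimal, illustrative sketch of the reflection cycle described above.
    # All names and prompt texts are hypothetical placeholders.
    from dataclasses import dataclass, field

    @dataclass
    class Prompt:
        text: str
        phase: str  # "baseline", "feedback", or "post_quiz"

    @dataclass
    class ReflectionCycle:
        prompts: list = field(default_factory=list)
        # student_id -> list of prior answers, so post-quiz prompts can
        # refer students back to what they said earlier
        responses: dict = field(default_factory=dict)

        def prompts_for(self, phase):
            return [p for p in self.prompts if p.phase == phase]

    cycle = ReflectionCycle(prompts=[
        Prompt("What grade are you aiming for, and what motivates that goal?", "baseline"),
        Prompt("Which study strategies did you use this week?", "baseline"),
        Prompt("Given your survey results, how well are you meeting your goals?", "feedback"),
        Prompt("How likely are you to change your study strategies, and how?", "feedback"),
        Prompt("Looking back at your earlier answers, what will you try before the next quiz?", "post_quiz"),
    ])
    print([p.text for p in cycle.prompts_for("post_quiz")])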

We propose that this technology-based intervention could substantially benefit students by removing many of the barriers they often encounter. It should help them identify their personal motivations, set goals aligned with those motivations, and hone their study strategies by reflecting on their understanding and progress. In addition to developing the system, we will characterize its effects on students, measuring whether repeated self-reflection and evaluation leads to the implementation of more effective learning strategies and to improved course performance.

We will begin the project by researching currently available online platforms that use surveys and analyses of responses to provide targeted feedback to participants, such as Neurowyzr (https://neurowyzr.com), MentorCoach (https://mentorcoach.com), and ECoach (https://www.expertecoach.com/). ECoach, which is institutionalized in the Center for Academic Innovation at the University of Michigan, is likely to have the most overlap with our needs: it is focused on educational environments and offers features such as messaging students, providing feedback before and after exams, and providing space for study plans and reflections. We will investigate whether Canvas and/or Qualtrics can be customized to meet our project’s goals. We anticipate that building on these systems would lead to greater adoption of our designed intervention, since students and educators would not need to install new software or learn a new interface. Additionally, modifying established tools will help us rapidly iterate on our designs and distribute our content more quickly. Our iterative design process will include soliciting feedback from both undergraduate students and faculty during the design and testing phases. We will also work with OIT and/or ASSETT at multiple points to ensure we understand the capabilities of Canvas integration and/or integration of other software if deemed necessary.
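As one minimal sketch of what a Canvas integration might involve, the Python below uses Canvas’s public REST API (via the requests library) to list a course’s upcoming assignments and send students a reminder through the Conversations endpoint. The instance URL, token, course ID, and message text are placeholders, and the exact recipient and bulk-messaging parameters would need to be confirmed against Canvas’s API documentation during development.

    # A hedged sketch, assuming Canvas's public REST API; the URL, token,
    # course ID, and message text below are placeholders, not working values.
    import requests

    CANVAS = "https://canvas.example.edu/api/v1"          # placeholder instance URL
    HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder token
    COURSE_ID = "12345"                                   # placeholder course ID

    # List upcoming assignments; Canvas's assignments endpoint accepts a
    # "bucket" filter whose values include "upcoming".
    resp = requests.get(
        f"{CANVAS}/courses/{COURSE_ID}/assignments",
        headers=HEADERS,
        params={"bucket": "upcoming"},
    )
    resp.raise_for_status()

    for assignment in resp.json():
        # Message the course roster about each upcoming assignment via the
        # Conversations endpoint. Whether a course-level context code like
        # this one is accepted for bulk messages is an assumption to verify.
        requests.post(
            f"{CANVAS}/conversations",
            headers=HEADERS,
            data={
                "recipients[]": f"course_{COURSE_ID}",
                "subject": f"Upcoming: {assignment['name']}",
                "body": "Reminder: this assignment is due soon. "
                        "Your reflection survey link will follow.",
            },
        )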

Stakeholders and Intended Impact

Students (particularly in STEM) need tools to help them succeed in their courses and to understand how to learn. In particular, students from underserved populations, first-generation students, and transfer students often feel unwelcome in the university environment and struggle to identify successful learning strategies. The proposed intervention could help many students at CU, in any discipline. Furthermore, since we will study the effects of this technology, the results will be disseminated widely and have the potential to be adopted at a national level. This is even more likely if the designed tools are easy to integrate into a learning management system. Ultimately, this technology could help promote more equitable learning environments that reduce opportunity gaps.

Team Description/Gaps

Dr. Knight is a biology education researcher with the expertise to qualitatively analyze student responses and to create statistical models of which behaviors promote success. The team also includes MCDB PhD student Zachary Hazlett, undergraduate MCDB major Justin Hein, and Dr. Jason Zietz, Assistant Teaching Professor, Department of Information Science. Dr. Zietz has the computing skills and cognitive science background to help design and iterate on the computational interventions, and he will identify one or more skilled undergraduate students to aid with computer programming. We will also recruit additional undergraduate students to test the system. Hazlett will lead the exploration of current technologies, including attending a summer conference that focuses on ECoach (https://www.seismicproject.org/annual-meetings/2023-summer-meeting/) to learn how that system works.

Intended Scale

Initially, students in one or two courses at CU would be asked to engage in this project, beginning with MCDB’s large Genetics course (~200 students/section, taught in Spring by Knight). Zietz’s courses in computer programming are other likely test cases. Once the system is designed and tested, we can implement it in other courses at CU (e.g., courses with typically high DFW rates, such as Introductory Biology and Chemistry) at the discretion of interested faculty, and ultimately it could be incorporated into any course at CU or elsewhere.

Funding Request and Intended Use of Funds: total $63,057

  • Graduate student Zachary Hazlett will investigate current technology, help plan and guide development of the product, work with Dr. Zietz to troubleshoot the system, and interface with students to determine its efficacy (25% appointment, 4 semesters): $32,224
  • Undergraduate student assistant: Justin Hein, junior MCDB major with experience in education (Learning Assistant, Upward Bound instructor). Justin will be involved in development and testing of the product, as well as interfacing with additional students for feedback. $20/hr, ~75 hours = $1,500.
  • Undergraduate student assistant (TBD) identified from Dr. Zietz’s programming courses. $20/hr, ~75 hours = $1,500.
  • Undergraduate students who have recently taken or are currently taking Genetics (~20, at $20 each) will be interviewed to help characterize their experience using the product: $400
  • Assistant Teaching Professor Dr. Jason Zietz: Two months of summer salary over two years: $21,433
  • Associate Professor Jenny Knight: $3,000 per summer over two years to coordinate the program: $6,000

Anticipated Long Term Needs

If the system works well using off-the-shelf technology, minimal long-term support would be needed for running the system. We would interface with ASSETT to make the system attractive to other users and offer workshops through ASSETT and/or CTL to encourage faculty adoption.

References

Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition. National Academies Press. https://doi.org/10.17226/9853

Hunter, A.-B. (2019). Why undergraduates leave STEM majors: Changes over the last two decades. In E. Seymour & A.-B. Hunter (Eds.), Talking about Leaving Revisited: Persistence, Relocation, and Loss in Undergraduate STEM Education (pp. 87–114). Springer. https://doi.org/10.1007/978-3-030-25304-2

Rodriguez, F., Rivas, M. J., Matsumura, L. H., Warschauer, M., & Sato, B. K. (2018). How do students study in STEM courses? Findings from a light-touch intervention and its relevance for underrepresented students. PLoS ONE, 13(7), 1–20. https://doi.org/10.1371/journal.pone.0200767

Schraw, G., Crippen, K. J., & Hartley, K. (2006). Promoting self-regulation in science education: Metacognition as part of a broader perspective on learning. Research in Science Education, 36(1–2), 111–139. https://doi.org/10.1007/s11165-005-3917-8

Sebesta, A. J., & Speth, E. B. (2017). How should I study for the exam? Self-regulated learning strategies and achievement in introductory biology. CBE—Life Sciences Education, 16(2), 1–12. https://doi.org/10.1187/cbe.16-09-0269

Stanton, J. D., Neider, X. N., Gallegos, I. J., & Clark, N. C. (2015). Differences in metacognitive regulation in introductory biology students: When prompts are not enough. CBE—Life Sciences Education, 14(2), ar15. https://doi.org/10.1187/cbe.14-08-0135

Zhao, N., Wardeska, J. G., McGuire, S. Y., & Cook, E. (2014). Metacognition: An effective tool to promote success in college science learning. Journal of College Science Teaching, 43(4), 48–54.