How Do We Know?

Some components of ARC-Learn were wildly successful, while others led to unanticipated challenges for students and mentors. We used our experiences running two cohorts of this program to explore many facets of URE design, examine how they worked for us, and offer considerations for other program designers.

ARC-Learn Administrative Team

To successfully run a program with wrap-around student support, the administrative team needed to be large and inclusive. Our team included:

  • College-level support (CEOAS): Associate deans for faculty, undergraduate programs, and research
  • Programmatic leadership (CEOAS): ARC-Learn PIs, program coordinator, STEM Research Center researchers
  • Polar research support (CEOAS): Faculty & graduate student mentors
  • Academic support (CEOAS): Academic advisors, experiential education coordinator
  • STEM education research: STEM Research Center researchers, including ARC-Learn PI
  • Mentor training, research, peer learning communities: Contracted inclusive mentorship specialist
  • Expert recommendations & evaluation: Expert Board of Advisors

Program Structure: Cohorts & Research Teams

The ARC-Learn program ran two staggered cohorts, each lasting two years (Cohort 1: Fall 2021–Spring 2023; Cohort 2: Fall 2022–Spring 2024). Each cohort had approximately 20 students and 10 mentors. While details of cohort structure and practices varied between the two cohorts, as we implemented changes based on lessons learned, they shared a common overarching structure:

  • Each cohort was divided into smaller research teams, each with a specific thematic focus. Most research teams were supported by two mentors, and students self-selected into the thematic team of most interest to them.
  • All students participated in bi-monthly cohort sessions facilitated by program leaders, as well as bi-monthly research team meetings led by faculty and graduate student mentors.
  • Mentors participated in a three-session inclusive mentorship training, as well as quarterly peer learning community meetings.
  • Basic expectations for students included: a) using existing data to answer polar research questions, and b) presenting their findings at an undergraduate research poster symposium at the end of the program.

Check out our mid-program formative evaluation for more information about program structure.

Data Sources

This guidebook is informed by student and mentor feedback; by the administrative team’s professional experience in polar science education, undergraduate student engagement, science education, and inclusive mentorship; and by our lived experience implementing the program. Students’ and mentors’ experiences were captured through surveys and interviews at multiple points in the program. Peer-reviewed manuscripts reporting research findings on student research skill development, sense of belonging and STEM persistence, and mentors’ development of inclusive mentorship practices are forthcoming.

Program Adaptations

The program ran two two-year cohorts, offset by one year. We applied lessons learned from the first cohort to the second, based on feedback from surveys and interviews and on administrative team observations and challenges. Program adaptations were also discussed and reviewed by our Board of Advisors.

For example, we experienced a lack of administrative capacity for managing program logistics and non-research student support, so we hired a part-time program coordinator. This change was successful and well received. We also heard from students that the program was too flexible and open-ended, so we added more structure, creating clear assignment benchmarks with deadlines. Other changes, related to mentor expertise, support for online students, data literacy, and mentor availability, were implemented based on student and mentor feedback.