STEM Central

A Community of Practice for NSF STEM Projects

STEP 2013 session II-4 NOTES: 6 Steps to Your STEP Evaluation


GOALS of Session -Exchange ideas -Provide a general framework for evaluation -Create a space for one-on-one discussions and interactions between STEP evaluators

OVERVIEW -Engage leaders in evaluation -Document achievements -Describe challenges and adjustments -Use the advisory committee -Analyze data -Share results in a strategic fashion

STEP 1 Project Leader Collaboration Key personnel participating in the project: everybody should be talking (PI, co-PIs, faculty, mentors, institutional research person ...) -Evaluation is sometimes an afterthought for project leaders -Communicate early and regularly with leadership -Construct a project theory of change and logic models -Document decisions and next steps -Consider how results can be used and leveraged -Be persistent without being annoying

Evaluation should evolve by objective. You must work with project leaders to develop a logic model and the instrumentation for measuring it. If you are brought in late as an evaluator, start as if you are starting from the beginning, even if it is the 4th year of the project.

Everyone should be connected, and the project should be co-constructed with all stakeholders.

Sometimes there will be challenges; however, there are resources to assist. They include: AISL, Advancing Informal STEM Learning; AEA, the American Evaluation Association; Western Michigan University (?); and educational psychology evaluators.

Some institutions have fired their evaluators, so know that it is an option.

STEP 2 Document Achievements -Part of the Logic Model What are the key research questions? How will you document them? Why?

Research questions sometimes are: who, what, when, where and why?

Achievements might include passing classes. Key questions: What programmatic initiatives contribute to retention? How do interventions affect students? How do your achievements compare to national data? Do interventions improve the learning environment? Institutional and student objectives should be limited; do not try to accomplish everything. Tools that can be used to measure the learning environment include CLES, CURE, and SURE (Grinnell College). These surveys can assess learning for students in the classroom.

When documenting data for large populations, disaggregate the data into subgroups.
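Disaggregation here just means computing your outcome measures within each subgroup rather than pooling everyone. A minimal sketch of that idea for a retention metric, using entirely hypothetical student records and made-up subgroup labels:

```python
from collections import defaultdict

# Hypothetical records: (subgroup, retained_next_year) pairs.
# Real data would come from institutional research files.
records = [
    ("first-gen", True), ("first-gen", False), ("first-gen", True),
    ("transfer", True), ("transfer", True),
    ("traditional", False), ("traditional", True), ("traditional", True),
]

# Tally retention within each subgroup instead of across the whole pool
totals = defaultdict(lambda: [0, 0])  # subgroup -> [retained, enrolled]
for subgroup, retained in records:
    totals[subgroup][1] += 1
    if retained:
        totals[subgroup][0] += 1

rates = {group: kept / n for group, (kept, n) in totals.items()}
```

A pooled rate can mask a subgroup that is struggling; the per-group rates make such gaps visible.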

Key Activities/Number of Students Impacted -Use: number of students, surveys, focus groups, journals, impact data, and other sources. To rate the success of experiences: -use GPA, retention, research interest, cohort building, student satisfaction, etc.

Project activities integrated at institutions -offshoots -different versions -look at impacts beyond intended goals, such as socialization, confidence ...

STEP 3 Challenges/Adaptations/Assessment/Sustainability -Need to discuss worst practices as well as best practices; analyze why things did not go well.

What can be learned from a project’s misfires or mistakes in documentation?

-If a project relies heavily on senior faculty, they may retire -Have buy-in from individuals at various levels -Not involving enough people in the admission of students • Need to educate folks on campus • Find different ways to recruit students (e.g., student-tech; letters to parents) • Need to think about who would direct the project if someone leaves o Be aware of the extreme

• Be strategic about what you are assessing o Use what they are doing as evidence; develop a rubric

Challenges faced: grant notification, recruiting participants, funding students, and documenting significant changes

Use intermediate measures & metrics to monitor progress

Data analysis can be quantitative, qualitative, or mixed methods; case studies are good as well.

STEP 4 Advisory Board Involvements How can advisory committee(s) contribute to evaluation?

Boards can be helpful with evaluation. Ways to include them: -facilitating and interpreting data -they are stakeholders; involve them at the beginning -they can help you frame results

External Advisory Board -most important at the end; can help identify key audiences and how to disseminate your work; use them strategically; they can supply resources, information, jobs, etc.

STEP 5 Lessons Learned/Overall Impact

Why should an evaluator adopt methodological pluralism when collecting and analyzing project evaluation data? -Learned that the IRB can be a challenge -The process can be challenging -Lessons learned about personal time management; use it wisely -Community college students are sometimes a challenge; they do not always fit the linear plan we envision -Sometimes this requires us to change our programming strategies for students who do not participate in programs

Internal evaluators (formative) are decision makers for the project: they have control. External evaluators (summative) offer suggestions for the project: no control.

Take time to talk with the IRB to get the right wording and approach for your proposal.

Drawing on different people on campus can be a good practice when looking at data, the project mission, etc.

Keep in mind original goals.

STEP 6 Dissemination of Results

Share successes with your Provost and CAO -they are important stakeholders. Get input from the external advisory board on how to disseminate: -they can help you identify multiple audiences -audiences may include administrators; think of multi-level approaches to dissemination -figure out how best to speak to each audience -project leadership team: use print and digital media for sharing work -look at high-value products -tailor dissemination plans for each project

Dissemination Options: Present at the AAC&U conference (deadline is mid-April for presentations); League for Innovation in the Community College; HICE (Hawaii International Conference on Education). Others include: AERA, NCCET, NASPA, AEA

Conclusions/Questions -Make checklists -Survey program officers on good projects and challenges -Join a project evaluation network and participate in discussion threads -Dissemination may extend beyond the NSF-STEP meeting