
Buffalo, June 12, 2009
Understanding Middle States’
Expectations for Assessment
Linda Suskie, Vice President
Middle States Commission on Higher Education
3624 Market Street, Philadelphia PA 19104
E-mail: [email protected]
1. Understanding Standard 7: Institutional Assessment
2. Sharing assessment results
3. Using assessment results
4. Telling your story to Middle States
5. Questions an MSCHE reviewer might ask
Understanding Standard 7:
Institutional Assessment
Planning & Assessment
as a Four-Step Cycle
1. Goals
2. Programs, Services & Experiences
3. Assessment
4. Using Results
What Goals Are We Talking About?
• Institutional goals (mission & strategic plan)
– Administrative goals
• Division goals
– Administrative unit goals
– Student learning goals
Gen Ed curriculum
Academic programs
Student development programs
Support programs
1. Mission & Goals
2. Planning
3. Resources
4. Leadership/Governance
5. Administration
6. Integrity
7. Institutional Assessment
8. Admissions
9. Student Support Services
10. Faculty
11. Educational Offerings
12. General Education
13. Related Educ. Activities
14. Asmt. of Student Learning
Institutional Effectiveness (Standard 7):
Are We Achieving Our Mission & Goals?
Strategies to Assess Institutional Goals
Assessments of student learning
• Direct evidence (clear, convincing)
Tests & examinations
Assignments, papers, projects
Field experience evaluations
• Indirect evidence
– Retention, graduation, placement rates
– Surveys of students & alumni
– Grades
Performance indicators
• “Measures that are monitored in order to determine the health, effectiveness, & efficiency” of an institution
» Michael Dolence & Donald Norris
Key performance indicators (KPIs)
Key quality indicators (KQIs)
Performance measures
Performance metrics
= Balanced scorecard
= Dashboard indicators
Popular performance indicators
• Student retention & graduation rates
• Job placement rates
• Racial/ethnic enrollment breakdowns
• Dollar value of sponsored research grants
• Licensure & certification exam pass rates
• Faculty workload
– Student/faculty ratio
– Average credit enrollment per FTE faculty
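Most of these indicators reduce to simple ratios of counts the institution already collects. Purely as an illustration, the Python sketch below computes a few of them from hypothetical enrollment and staffing figures; every number and label in it is invented for the example, not taken from this presentation.

```python
# Illustrative only: a few common performance indicators computed
# from hypothetical counts. All figures below are made up.

def pct(part, whole):
    """Return a percentage rounded to the nearest whole point."""
    return round(100 * part / whole)

# Hypothetical fall-to-fall cohort data
cohort_size = 1200          # first-time, full-time entrants
retained_next_fall = 948    # still enrolled one year later
graduated_in_6_years = 684  # completed within 150% of program length

# Hypothetical staffing and enrollment data
fte_students = 4800
fte_faculty = 320
credit_hours = 67200

dashboard = {
    "Retention rate (%)": pct(retained_next_fall, cohort_size),
    "6-year graduation rate (%)": pct(graduated_in_6_years, cohort_size),
    "Student/faculty ratio": round(fte_students / fte_faculty, 1),
    "Credit hours per FTE faculty": round(credit_hours / fte_faculty),
}

for indicator, value in dashboard.items():
    print(f"{indicator}: {value}")
```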
Common state performance indicators
National Center for Public Policy & Higher Education
• Preparation
– Number & quality of teachers graduating in critical fields
• Participation
– Enrollment by race, gender, income
• Affordability
– Discounted tuition & fees as proportion of median income
• Completion
– Actual & predicted graduation rates based on student preparation & aptitude
• Benefits
– Degrees awarded in critical fields
– Sponsored research &
• Learning
Other examples
• Participation rates (e.g., in student activities,
cultural events)
• Expenditures per FTE student
• Counts of contacts, inquiries, etc.
– Questions to library information desk
– Referrals to counseling center
Program reviews (academic & other)
• Common criteria for academic program reviews
– Quality
• Resources, activities, outcomes, etc.
– Need
• Demand for the program
• Competing programs
• Centrality to mission
– Cost and cost-effectiveness
Baldrige National Quality Program
1. Leadership
2. Strategic planning
3. Student, stakeholder, & market focus
4. Measurement, analysis & knowledge management
5. Faculty & staff focus
6. Process management
7. Organizational performance results
Other assessment strategies
Surveys, interviews, focus groups
“Secret shoppers”
Observations of students, meetings, activities
Document reviews
– Meeting minutes, transcript analyses, e-mails, online
• Online institutional portfolios
• Quality improvement tools
– Run charts, histograms, Pareto analyses, Six Sigma
– Activity-based costing: Compare outcome against cost
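As a hedged illustration of one of the quality improvement tools named above, the Python sketch below runs a simple Pareto analysis on hypothetical counseling-center referral counts, ranking categories by frequency and showing each one's cumulative share; the categories and counts are made up for the example.

```python
# Illustrative Pareto analysis: which few categories account for
# most of the volume? All categories and counts are hypothetical.

referral_reasons = {
    "Academic stress": 142,
    "Financial concerns": 97,
    "Career uncertainty": 55,
    "Roommate conflict": 31,
    "Other": 18,
}

total = sum(referral_reasons.values())
cumulative = 0
print(f"{'Reason':<22}{'Count':>6}{'Cum. %':>8}")
for reason, count in sorted(referral_reasons.items(),
                            key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{reason:<22}{count:>6}{round(100 * cumulative / total):>8}")
```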
Your assessment strategy must align
with a goal to be useful.
Sharing Assessment Results
Why are you assessing
the program or curriculum?
– Validate it to others (accountability)
– Make sure it isn’t slipping
– Improve it
Keep assessment summaries
useful to you and your colleagues.
• Who needs to see the results?
• Why? What decisions will they make?
• What do they need to see to make those decisions?
What decisions might the
assessment help with?
• Learning goals
– Are our learning goals sufficiently clear and focused?
• Curriculum
– What is the value of service learning?
– Should our courses have more uniformity across sections?
• Teaching methods
– Is online instruction as effective as traditional instruction?
– Is collaborative learning more effective than lectures?
– Are we developing a community of scholars?
• Assessments
– Have our assessments been useful?
• Resource allocations
– Where should we commit our resources first?
Keep assessment summaries
short and simple.
• Fast and easy to read and understand
– Use short, simple charts, graphs, and lists.
• Use PowerPoint presentations.
• Avoid narrative text.
– First aggregate (sum up) data, then drill down into details
as needed.
– Round results.
– Sort results from highest to lowest.
– Percentages may be more meaningful than averages.
• Avoid complex statistics.
– As you collect results over time, show trends.
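A minimal sketch of these summarizing steps, assuming rubric ratings kept in a plain Python list: it aggregates the ratings, converts counts to rounded percentages, and sorts them from highest to lowest. The rating labels echo the exemplary/adequate/inadequate example later in this presentation, but the counts are invented.

```python
# Minimal sketch: turn raw rubric ratings into a short, readable summary.
# The counts below are invented for illustration.
from collections import Counter

ratings = ["Exemplary"] * 20 + ["Adequate"] * 58 + ["Inadequate"] * 12

counts = Counter(ratings)        # aggregate (sum up) first
total = sum(counts.values())

# Percentages, rounded, sorted from highest to lowest
summary = sorted(((label, round(100 * n / total)) for label, n in counts.items()),
                 key=lambda item: item[1], reverse=True)

for label, percent in summary:
    print(f"{label}: {percent}%")
```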
Tell a story.
• Key questions to address:
– What have you learned about your students’ learning &
other institutional goals?
– What are you going to do about what you have learned?
– When, where, and how are you going to do it?
» Doug Eder
• Focus on “big news.”
– Identify meaningful vs. insignificant differences.
• Find someone skilled at finding the stories in
reams of data.
Using Assessment Results
When Assessment Results
Are Good
When assessment results are disappointing…
Example: Student retention results
• Goals
– Set a special target for male students.
• Program (curriculum)
– Make the advisement program mandatory.
• Implementation (pedagogy)
– Increase professional development for advisors.
• Assessments
– Identify student goals upon entry and upon exit.
• Resource allocations
– Fund professional development for advisors.
Telling Your Story to
Middle States
What Should Institutions Document?
• Clear statements of goals
• Organized, sustained assessment process
– Principles, guidelines, support
– What assessments are already underway
– What assessments are planned, when, & how
• Assessment results documenting progress
toward accomplishing goals
• How results have been used for improvement
How Might Institutions Document This?
• Need not be a fancy bound document!
• Need not be in a consistent format or single document
• An overview in the report to MSCHE
• A chart or “roadmap” in the report to MSCHE or an appendix
• More thorough information in the on-site “resource room,” online, and/or burned onto CD
• A few samples of student work
– Exemplary, adequate, inadequate
Do you need special assessment software?
What are your needs?
How will you use the software?
Are faculty & staff ready to use it?
Do you have IT support?
Ask vendors for references.
What are the real costs?
What is the cost-benefit balance?
Don’t rush; involve faculty in deciding.
Questions an MSCHE Reviewer Might Ask
Is the Institution Engaged in
“Good” Assessment?
Clear, accurate, & truthful results
For Each Goal…
• How is the goal being assessed?
• What are the results of those assessments?
• How have those results been used for improvement?
Do Institutional Leaders Support and
Value a Culture of Assessment?
• Is there adequate support for assessment?
– Overall guidance & coordination
• Are assessment efforts recognized & valued?
• Are efforts to improve teaching recognized & valued?
How Much Has Been Implemented?
• Are there any significant gaps?
What Do Assessment Results Tell Us?
• Do results demonstrate
– Achievement of mission and goals?
– Sufficient academic rigor?
Have Assessment Results Been Used?
• Have they been appropriately shared & discussed?
• Have they led to appropriate decisions?
– Curricula and pedagogy
– Programs and services
– Resource allocation
– Institutional goals and plans
Is the Process Sustainable?
• Simple
• Practical
• Detailed
• Ownership
• Appropriate timelines
Where Is the Institution Going with Assessment?
• Will momentum slow after this review?
• What Commission action will most help the
institution keep moving?
Middle States’ Five “Rules” for Assessment
1. Keep it useful.
2. Tie assessments to important goals.
3. For student learning, include some “direct” evidence.
4. Use multiple measures.
5. Keep doing something everywhere, every year.
Bottom Line on Moving Ahead
✓ Keep assessment useful.
✓ Keep things simple.
• Especially in terms of time
• Don’t create unnecessary rules.
✓ Value assessment.
✓ Just do it!
Volunteer for
Middle States Evaluation Teams!
• Go to our web site
• Click on “Evaluators.”