Monday, January 1, 2018

Indicators for STEM Education

You can follow me on Twitter @dbressoud.

The National Academies have just released the Board on Science Education report on Indicators for Monitoring Undergraduate STEM Education (available at http://sites.nationalacademies.org/DBASSE/BOSE).


This report is a response to the concern raised by the President’s Council of Advisors on Science and Technology that, despite the many initiatives seeking to improve the teaching and learning of STEM subjects, we do not have effective national-scale measures of their success. The core of the charge to the committee that produced this report was to identify objectives for the improvement of STEM education, describe indicators that would tell us whether we are making progress, and catalog what currently exists or could be developed by way of research and data collection to track progress. This extensive report provides this information.

The committee identified eleven objectives, organized into three general goals:

Goal 1: Increase students’ mastery of STEM concepts and skills by engaging them in evidence-based STEM practices and programs.
1.1 Use of evidence-based STEM educational practices both in and outside of classrooms
1.2 Existence and use of supports that help instructors use evidence-based STEM educational practices
1.3 An institutional culture that values undergraduate STEM education
1.4 Continuous improvement in STEM teaching and learning 
Goal 2: Strive for equity, diversity, and inclusion of STEM students and instructors by providing equitable opportunities for access and success.
2.1 Equity of access to high-quality undergraduate STEM educational programs and experiences
2.2 Representational diversity among STEM credential earners
2.3 Representational diversity among STEM instructors
2.4 Inclusive environments in institutions and STEM departments
Goal 3: Ensure adequate numbers of STEM professionals.
3.1 Foundational preparation for STEM for all students
3.2 Successful navigation into and through STEM programs of study
3.3 STEM credential attainment
Each of these objectives is explained in detail, together with indicators of success and suggestions for how these might be measured. To give an indication of the breadth of this report, I’ll summarize some of what it says about the first and third objectives, “Use of evidence-based STEM educational practices both in and outside of classrooms” and “An institutional culture that values undergraduate STEM education.”

The report first explains what evidence-based STEM educational practices entail. For in-class practices, the report includes active learning and formative assessment. Acknowledging the lack of a common definition of active learning, the report uses the term “to refer to that class of pedagogical practices that cognitively engage students in building understanding at the highest levels of Bloom’s taxonomy,” and then elaborates with examples that include “collaborative classroom activities, fast feedback using classroom response systems (e.g., clickers), problem-based learning, and peer instruction.”

This resonates with the CBMS definition, “classroom practices that engage students in activities, such as reading, writing, discussion, or problem solving, that promote higher-order thinking” (https://www.cbmsweb.org/2016/07/active-learning-in-post-secondary-mathematics-education/). The point is to engage students in wrestling with the critical concepts while in class. Thus the emphasis is not on activity as such, but on promoting cognitive engagement in higher-order thinking.

I appreciate the emphasis on formative assessment: frequent, low-stakes, and varied assessments that clarify for students what they actually do and do not know. I have also found these helpful in showing me where student difficulties lie. The Indicators report references a 1998 review of the formative assessment literature by Black and Wiliam, “Inside the Black Box: Raising Standards through Classroom Assessment,” that presents formative assessment as the single most effective means of raising student performance and describes how it needs to be done if it is to have these benefits. (The Black and Wiliam article is available at https://www.rdc.udel.edu/wp-content/uploads/2015/04/InsideBlackBox.pdf.)

Another important insight from this report, also identified in the MAA’s calculus studies, is the importance of course coordination. If a department is to improve instruction, it is essential that its members share a common understanding of the goals of the course. These shape pedagogical and curricular decisions as well as how student accomplishment is to be measured. The degree of coordination is one of the aspects of objective 1.3: An institutional culture that values undergraduate STEM education. As the report states on page 3-12,
A growing body of research indicates that many dimensions of current departmental and institutional cultures in higher education pose barriers to educators’ adoption of evidence-based educational practices (e.g., Dolan et al., 2016; Elrod and Kezar, 2015, 2016a, 2016b). For example, allowing each individual instructor full control over his or her course, including learning outcomes, a well-established norm in some STEM departments, can cause instructors to resist working with colleagues to establish shared learning goals for core courses, a process that is essential for improving teaching and learning.
As I reported last February in "MAA Calculus Study: PtC Survey Results," there is very little departmental coordination around homework, exams, grades, or instructional approaches.

Table. Of the 207 mainstream Calculus I courses with multiple sections taught in 121 PhD-granting departments and 103 such courses in 76 Masters-granting departments, the percentage of courses that have each feature in common across all sections. Source: PtC Census Survey Technical Report, available at https://www.maa.org/sites/default/files/PtC Technical Report_Final.pdf.
Of course, the big issue for an institutional culture that values undergraduate STEM education is how teaching is evaluated and the role it plays in decisions of promotion and tenure. What is deeply discouraging is how poorly most departments do with even these basic questions of coordination.
