Introduction to Biological Anthropology (ANT 101) has been offered annually or more frequently at WCU for nearly two decades. It is a required course for anthropology majors, and for most of that time period non-majors have been permitted to take it to meet a general education distributive requirement. Until the fall semester of 2013, it was configured as a three-hour per week lecture course with no hands-on lab component, and the department had no access to laboratory classroom facilities. For several of those years, the instructor incorporated 3–5 virtual laboratory experiences over the semester using one lecture hour for each. While students said they enjoyed these experiences, assessment data indicated that they still had persistent misconceptions about evolution at course completion.

In fall 2013, a project team at WCU, including the course instructor (a biological anthropologist), a human physiologist experienced in inquiry curricula, an evolutionary biologist, and a psychologist with expertise in assessment and program evaluation, was awarded a three-year TUES (Transforming Undergraduate Education in STEM) grant from the National Science Foundation (NSF). The purpose of this award was to develop an innovative, inquiry-based laboratory curriculum targeting student misconceptions about evolution, student ability to use the scientific method, and student understanding of the investigative tools used by biological anthropologists. To accommodate this new curriculum, the course was redesigned to meet four hours per week in an integrated lecture-lab format, with roughly half of that time devoted to laboratory activities and the other half to lecture and/or discussion.

The project was submitted to the West Chester University Human Subjects Committee and received expedited approval in the summer of 2013. Informed consent was obtained each semester from students enrolled in the course who wished to participate. Over the period of the project, all but one or two students per semester chose to participate.

During each lab period, brief instruction on methodology was provided, as appropriate to the lab, and students were presented with a challenge scenario that required them to apply the scientific process, using the relevant method, to solve a problem (with the challenge scenario providing a structured context in which to do so). In a standard biological anthropology lab curriculum, students might be asked to describe and identify various casts of hominin fossil skulls using characteristics they had learned about, associate these traits with dietary differences, and receive verification of their assessments by the instructor. In the inquiry-based, structured challenge approach developed at WCU, students were given a problem to solve that required them to hypothesize the likely diet of the various hominins or hominids. They were instructed in a technique that allowed them to test one of their hypotheses, then required to state their results in an organized manner, evaluate them, indicate next steps, and so on. Thus, each lab in the curriculum was configured to (1) help students understand how biological anthropologists think about and explore problems using relevant techniques and (2) give students experience with the scientific process. The lab curriculum includes some instruction in and application of basic molecular techniques (e.g., constructing simple primate phylogenies based on morphological vs. genetic variation and doing a DNA fingerprinting exercise to attempt to identify a hypothetical hominin fossil), since the curriculum is also designed to help students make connections between phenotypic observations and the molecular level, in service of the project goal of helping students to better understand evolution. Table 1 provides a list of the labs with descriptions of the inquiry learning activities performed.

Table 1 Schedule of lab topics and inquiry-based learning activities

The full lab manual can be accessed at:

Standard assessments, including periodic exams and laboratory reports, were utilized to measure student learning. Responses to lab challenges at multiple time points were evaluated at the end of each semester using a rubric to measure individual students’ abilities to define the problem, to develop a plan to solve the problem, to analyze and present information, and to interpret findings and solve the challenge problem. Student lab teams also developed a project that they designed and implemented (from hypothesis to interpretation) using one of the methods they learned, and gave group presentations to the class. Other, more formative, measures of student learning were also introduced. For example, during each lab, students completed a pre/post assessment tool, a modified version of the RSQC2 (Recall, Summarize, Question, Connect, and Comment) classroom assessment technique developed by Angelo and Cross (1993). Beginning in the second year of the project, pre- and post-lab clicker questions were incorporated for rapid assessment of each lab’s impact.

Several global surveys were administered at the beginning of each course, prior to any instruction, and again (for all but one survey) on the last day of the course. These included a survey focusing on evolution (17 items in year one, revised to 25 items in the second year) as well as surveys assessing students’ familiarity and comfort level with the scientific process, their level of motivation, and, at the end only, their overall assessment of their course experience. The evolution survey was also administered at WCU for 2 years prior to the course reorganization and lab implementation; data from this period are used for an internal comparison with survey results obtained during the implementation of the new curriculum. Biological anthropology colleagues at three other US universities (reported here as A, B, and C) also administered the evolution concepts survey to their students in introductory courses in biological anthropology during the grant period, for comparison purposes. All of these courses were taught with some version of a more standard laboratory curriculum for this discipline (an example of a standard approach is described above). University ‘A’ is a large, midwestern state school (approximately 40,000 students); University ‘B’ is a sizable state school located in the south (approximately 30,000 students); University ‘C’ is a large, northeastern state school (approximately 30,000 students). At all three, introductory biological anthropology is taught in a large lecture context with smaller recitation sections that meet one hour per week (i.e., two hours of lecture, one hour of recitation or lab). At A and C, these recitations were used for weekly laboratory activities throughout the semester; at B, there were seven labs during the semester. Prior to 2013, the course at University A had no lab at all—only lecture.

The current report first describes the results of the evolution concepts instrument administered at the very beginning of the course and at the end of the course at WCU and across universities. Following a presentation of the results regarding changes in misconceptions we turn our attention to an examination of the specific areas of learning that we believe may have contributed to the reduction in misconceptions, including a look at specific assessments of students’ growing understanding of science as a process throughout the course.

Evolution misconceptions

Two versions of the evolution concepts instrument were used: one administered prior to the start of the grant period and throughout the first year following the grant award, and a revised version used beginning in fall 2014. Each version included statements to which students responded on a 5-option Likert-type scale ranging from strongly agree to strongly disagree, with a ‘no opinion’ option. This instrument was based on a published and freely available tool used by other researchers (Cunningham and Wescott 2009). For purposes of analysis, each item was agreed by the project team to be either true or false, such that strong agreement with a true statement and strong disagreement with a false statement were considered to be ‘correct’ responses. A scale ranging from +2 to −2, including 0 for ‘no opinion’, was constructed, and several variables were computed from these scores, including total score (pre, post), percent of total points earned (pre, post), number of items correct (pre, post), and percent of items correct (pre, post). The use of percent variables was necessitated by a revision of the survey after the first year of curriculum implementation (2013–2014). The initial version of the survey included 24 items, but a qualitative analysis by study consultants resulted in a set of only 17 items deemed usable for the purposes of our study. This initial survey was then revised for use beginning in fall 2014 to include the 17 items kept from the original survey with the addition of 8 new items, resulting in a set of 25 usable items. The 25-question survey can be found in Additional file 1.
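To make the scoring scheme concrete, the keying and derived variables described above can be sketched as follows. This is an illustrative reconstruction, not the project’s actual scoring code: the response codes, the rescaling of total points to a percentage, and the threshold for counting an item ‘correct’ are assumptions.

```python
def score_item(response, keyed_true):
    """Map one Likert response onto the -2..+2 scale described above.

    response: 'SA', 'A', 'NO' (no opinion), 'D', or 'SD' (assumed codes).
    keyed_true: True if the project team keyed the statement as true.
    Agreement with a true item (or disagreement with a false item)
    yields a positive score.
    """
    base = {'SA': 2, 'A': 1, 'NO': 0, 'D': -1, 'SD': -2}[response]
    return base if keyed_true else -base


def summarize(responses, key):
    """Compute per-student variables for one administration (pre or post):
    total score, percent of total points, number correct, percent correct."""
    scores = [score_item(r, k) for r, k in zip(responses, key)]
    n = len(scores)
    total = sum(scores)
    # Assumed convention: rescale the possible range (-2n..+2n) to 0-100.
    pct_points = 100.0 * (total + 2 * n) / (4 * n)
    # The text defines 'correct' as strong agreement/disagreement in the
    # keyed direction (score == 2); counting any agreement (score > 0)
    # would be another plausible convention.
    n_correct = sum(s == 2 for s in scores)
    return {'total': total,
            'pct_points': round(pct_points, 1),
            'n_correct': n_correct,
            'pct_correct': round(100.0 * n_correct / n, 1)}
```

For example, a student who strongly agrees with a true item, disagrees with a false item, and marks ‘no opinion’ on a second true item earns item scores of +2, +1, and 0, for a total of 3 of a possible 6 points.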

Several questions were addressed using the results of the evolution concepts instrument. First, we compared WCU student survey responses to responses from the three other institutions whose students completed the survey. We asked if student performance on the evolution concepts instrument improved from pre- to post-course for all institutions and whether the amount of improvement varied by institution. Second, we examined WCU student survey responses (both pre and post surveys) over time, asking if student performance on the evolution concepts instrument improved both prior to and during the grant implementation period. Next, we asked whether the degree of improvement changed following implementation of our new inquiry-based curriculum, relative to the academic years prior to the grant. Finally, in an attempt to understand which evolution-related misconceptions improved and which did not, we conducted a qualitative analysis of survey items and compared student performance on sets of related items across universities.

WCU course assessments

A variety of measures were used to assess student learning throughout each semester at WCU and to evaluate the effectiveness of particular pedagogical approaches as well as the overall curriculum. Some of these measures were objective and direct measures of student learning. Others were indirect measures: student perceptions of what they learned and/or which laboratory sessions they believed were most helpful to their learning. In this report, we provide results of four of these measures—in-class clicker questions, laboratory challenges, RSQC2 responses, and student confidence ratings—to provide insights about the effectiveness of the curriculum in meeting its primary objectives.

In-class clicker questions

Students were presented with a set of true/false statements or multiple choice questions at the beginning and end of multiple laboratory sessions. Some items were tied directly to misconceptions about evolution, others to students’ understanding of the scientific method, while others were designed to measure more general understanding of the topics covered by the individual laboratory modules. Students responded, via clickers, to these statements presented visually in class. Responses served as an important source of formative assessment but also provided information on the effectiveness of each of the laboratory modules in correcting student misconceptions about evolution and student understanding of the scientific method.

Laboratory challenges

Laboratory modules included “challenge” activities, designed specifically to enable students to apply problem-solving skills within a structured context (Knabb and Misquith 2006). In each of these laboratory challenges, students were asked to state research questions or generate hypotheses, collect data, draw conclusions, report/graph their results, and reflect on those results. Each student completed a laboratory worksheet during each lab module, and all worksheets were submitted as part of student lab notebooks at the end of each semester. Selected lab worksheets were reviewed at the end of each semester by faculty involved with the grant project, using a developmental assessment screening tool created by the project faculty. This screening tool underwent its own iterative refinement, resulting in a final version that included four measures of scientific thinking (i.e., students’ ability to use the scientific method): Defining the Problem, Developing a Plan to Assess the Problem, Analyzing and Presenting Information, and Interpreting Findings and Solving the Problem. Each of these four areas was assessed on a scale of four developmental levels: beginning, developing, appropriately developed, and exemplary. A copy of this scoring rubric can be found in Additional file 2. Developmental changes in these four areas of scientific thinking were assessed by comparing assigned developmental levels following an early-semester laboratory module with assigned developmental levels following a later-semester laboratory module.

RSQC2 (Revised)

A modified version of the RSQC2 classroom assessment technique (Angelo and Cross 1993) was completed by students during each laboratory session. Complete details about the multiple sections of this activity can be found in Additional file 3. For the current report, we present data on one of the sections completed by students at the end of each laboratory session. Students were asked to rate the usefulness of each laboratory session in reaching learning outcomes. Ratings were made on a 4-point Likert scale: 4 = very useful; 3 = somewhat useful; 2 = minimally useful; 1 = not useful. Questions included: “How useful was today’s laboratory session in helping you to understand the important concepts of evolution and human variation discussed in this course and used by biological anthropologists?” and “How useful was today’s laboratory session in helping you to understand the tools used by biological anthropologists to understand the concepts of evolution and human variation?”

Student confidence in using scientific method

WCU students completed a 10-item survey at both the beginning and the end of each semester asking them to rate their level of confidence in their abilities and/or understanding of several pieces of the scientific process. All items were rated on a 5-point Likert scale: 1 = completely doubtful; 2 = somewhat doubtful; 3 = neutral; 4 = somewhat confident; 5 = strongly confident. A copy of this survey is available in Additional file 4.

A variety of both univariate and multivariate linear model procedures were used to address questions of interest involving all student assessments, both within and across time periods and universities (where appropriate). Specifics regarding these analyses are discussed within the Results section.
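As one concrete illustration of the kinds of univariate procedures suited to data like these, a paired comparison can test for pre- to post-course improvement within an institution, and a one-way ANOVA on gain scores can test whether improvement varied across institutions. The sketch below is a hedged example with made-up numbers; it is not the authors’ analysis code, and all values and variable names are hypothetical.

```python
from scipy import stats

# Hypothetical percent-correct scores for five students (illustrative only)
pre  = [42.0, 55.0, 61.0, 48.0, 70.0]   # start of course
post = [58.0, 63.0, 72.0, 60.0, 74.0]   # end of course

# Did performance improve pre- to post-course? (paired t-test)
t_stat, p_paired = stats.ttest_rel(post, pre)

# Did the amount of improvement vary by institution?
# (one-way ANOVA on per-student gain scores, one list per site)
gains_site1 = [16.0, 8.0, 11.0, 12.0, 4.0]
gains_site2 = [5.0, 2.0, 7.0, 3.0, 6.0]
gains_site3 = [4.0, 6.0, 1.0, 5.0, 3.0]
f_stat, p_anova = stats.f_oneway(gains_site1, gains_site2, gains_site3)
```

More elaborate multivariate or repeated-measures models follow the same logic, with institution and time period entering as factors.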

