Kristen Murphy

Assistant Professor

Office: Chemistry 101
Phone: 414-229-4468
e-mail: kmurphy@uwm.edu

Degree(s):

Ph.D., UW-Milwaukee

Research Description

Our chemical education research group focuses on methods of content delivery, student cognition in problem-solving strategies, and assessment in preparatory and introductory college chemistry courses. More specifically, this work centers on three main projects:

  • Measuring and enhancing students' scale literacy
  • Developing a new assessment strategy (rapid knowledge assessment) for understanding students' problem-solving strategies
  • Examining multiple-choice assessments for differential item functioning

Scale Literacy

“…greater emphasis needs to be placed on teaching and learning about scale in general, and small scale below the limits of visibility in particular”

Jones, M.G., Tretter, T., Taylor, A., and Oppewal, T. (2008) International Journal of Science Education, 30, p. 428.

Grasping scale outside the visual realm can be difficult, particularly with regard to the very small. Undergraduate students in preparatory and introductory chemistry courses, for example, are required to begin thinking about chemistry concepts at the particulate level, which is orders of magnitude smaller than the resolving ability of the human eye. The American Association for the Advancement of Science (AAAS) has noted that the development of a student's scale literacy, beyond its applications in chemistry, is an important component of overall science literacy. Research has shown that students need to continue cultivating their understanding of scale, particularly down to the nanometer scale, beyond their elementary and secondary education years. Additionally, it has been found that students who use instrumentation in these very small regimes have a better concept of scale than those who do not.

This project measures changes in both a student's scale perception and unitizing at the atomic level. Unitizing is the development and use of a convenient or familiar unit. For instance, the meter serves as our common unit of length because it is on the order of human size (we unitize to what is most familiar), and it is often only through necessity that we adopt other units (for example, the light year). Students in preparatory or introductory college chemistry are expected to "think conceptually" of atoms and molecules interacting. The precursor to this is unitizing at the atomic level (with the atom as the unit). Once students unitize at the atomic level, the transfer of both enhanced scale perception and atomic unitizing to other specific content areas is measured.
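
As a rough numerical illustration of what unitizing at the atomic level entails, the short sketch below re-expresses familiar lengths with the atomic diameter as the unit. It is an invented example with order-of-magnitude textbook values, not one of the project's instruments:

```python
# A rough illustration of unitizing at the atomic level. Invented example;
# the lengths are order-of-magnitude textbook values, not project data.
ATOM_DIAMETER_M = 1e-10   # typical atomic diameter, ~0.1 nm

lengths_m = {
    "resolving limit of the human eye": 1e-4,   # ~0.1 mm
    "width of a human hair": 8e-5,
    "one meter": 1.0,
}

for name, meters in lengths_m.items():
    atoms = meters / ATOM_DIAMETER_M
    print(f"{name}: {meters:.0e} m ≈ {atoms:.0e} atomic diameters")
```

Even the smallest feature the unaided eye can resolve spans roughly a million atomic diameters, which is the gap students must bridge when they begin to think in atoms.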

This project builds on these previous studies by presenting students in preparatory and introductory chemistry courses with images generated by a real-time, remote-access scanning electron microscope or a portable, in-class scanning tunneling microscope for inquiry-based exercises during lecture. Both informal and formal assessments measure the efficacy of the instrumentation used during the lecture demonstrations in enhancing students' scale perception, atomic unitizing, and transfer to other specific content areas.

We have developed a means to measure two key components of scale literacy (scaling skills and scale conceptual understanding) that have been found to be better predictors of success in general chemistry I than more traditional measures of math or chemistry content knowledge. We are also developing individual supplemental activities designed to target students at specific proficiency levels and to enhance performance on two aspects of scale literacy: measurement and magnification, and the particulate nature of matter.
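
As a hypothetical sketch of the kind of predictor comparison described above, one could compare how strongly each measure correlates with course outcome. The data, variable names, and effect sizes below are invented for illustration; this is not the group's analysis or results:

```python
# Hypothetical predictor comparison on synthetic data; not the group's
# actual instruments, data, or results.
import numpy as np

rng = np.random.default_rng(0)
n = 200
scale_literacy = rng.normal(50, 10, n)   # composite scale-literacy score (invented)
math_placement = rng.normal(50, 10, n)   # traditional math measure (invented)
# synthetic course outcome, constructed to load more heavily on scale literacy
final_grade = 0.6 * scale_literacy + 0.3 * math_placement + rng.normal(0, 8, n)

for name, predictor in [("scale literacy", scale_literacy),
                        ("math placement", math_placement)]:
    r = np.corrcoef(predictor, final_grade)[0, 1]
    print(f"{name}: r = {r:.2f}, r^2 = {r**2:.2f}")
```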

Rapid Knowledge Assessment

“…knowledge levels of learners need to be assessed and monitored continuously during instructional episodes to dynamically determine the design of further instruction”

Kalyuga, S., and Sweller, J. (2004) Journal of Educational Psychology, 96, p. 558.

Improving students' learning and understanding in the science, technology, engineering, and mathematics (STEM) disciplines goes hand in hand with improving methods of assessing that learning and understanding. Cognitive load theory, a theory from cognitive psychology, has been applied in mathematics, where a student's expertise was estimated with a rapid-measurement scheme that assessed the immediate first step taken in the problem-solving process. This measurement, coupled with a measure of the student's reported mental workload, provided greater insight into the problem-solving strategies employed by the student than performance alone. Additionally, once expertise can be approximated, instructional materials can be tailored to further enhance performance on subsequent assessments: research has shown that lower-expertise students performed better after instruction with worked examples, while higher-expertise students performed better with problem-solving instruction.

Building on this work, a rapid knowledge assessment (RKA) for preparatory and introductory college chemistry, modeled after the rapid-measurement scheme used in mathematics, has been developed. Efficiency in students' problem-solving strategies is assessed through the reported immediate first step, the correct or incorrect response to the exercise overall, and student-reported mental effort. Validation of this instrument included expert analysis of student responses, student problem-solving strategies as reported through think-aloud protocols, and longitudinal analysis of similar items from preparatory to introductory college chemistry, as well as internal and external validation against other measures from the instrument and two different standardized, high-stakes exams. Validation of the responses and the subjective reporting of mental workload includes examining time on task, scan-path maps, fixation patterns, and task-evoked pupillary response (TEPR) using a desk-mounted eye tracker for both novices and experts. Instructional materials drawing on expert-performance research and the foundations of deliberate-practice exercises have been developed and field-tested. Further refinement, development, and field-testing in conjunction with the instrument's use are ongoing.
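
One common way in the cognitive-load literature to combine performance and self-reported mental effort into a single efficiency score is the Paas and van Merriënboer formula E = (z_performance - z_effort) / sqrt(2). Whether the RKA uses this exact formula is an assumption here, and the scores below are invented:

```python
# A minimal sketch of a standard cognitive-load "efficiency" calculation:
# E = (z_performance - z_effort) / sqrt(2). Assumed for illustration;
# the scores below are invented.
import math
import statistics as stats

performance = [0.9, 0.4, 0.7, 1.0, 0.5]   # fraction correct per student
effort      = [3.0, 7.0, 5.0, 2.0, 8.0]   # self-reported mental effort (e.g., 1-9 scale)

def z_scores(xs):
    mu, sd = stats.mean(xs), stats.stdev(xs)
    return [(x - mu) / sd for x in xs]

zp, ze = z_scores(performance), z_scores(effort)
efficiency = [(p - e) / math.sqrt(2) for p, e in zip(zp, ze)]
print([round(val, 2) for val in efficiency])   # positive = high performance for low effort
```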

Differential Item Functioning

“Although DIF detection has become routine in testing programs, studies to understand (not just detect) potential sources of DIF are rarely carried out.”

Zenisky, A.L., Hambleton, R.K., and Robin, F. (2003) Educational Assessment, 9, p. 62.

Differential item functioning (DIF) is an item-level characteristic of test items in which an item is found to be statistically easier for members of one demographic comparison group than another. DIF analyses typically involve matching examinees from different subgroups (such as gender, race/ethnicity, socioeconomic status, or language ability) on a proficiency variable, carrying out item analysis for each group, and evaluating the results for statistical significance. Where DIF is present, the item is said to “favor” one group over another, a result that suggests that examinees at equal skill levels from different subgroups do not have an equivalent chance of answering a question correctly, due to subgroup membership. Statistical techniques for detecting DIF include item response theory (IRT), the simultaneous item bias test (SIBTEST), and the Mantel-Haenszel statistic; these analyses can be carried out on both multiple-choice and constructed-response items.
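
To make one of these techniques concrete, the sketch below computes the Mantel-Haenszel common odds ratio and the ETS delta-scale statistic on invented counts. It is a generic textbook calculation, not this project's analysis:

```python
# Mantel-Haenszel DIF sketch on invented data. Each stratum holds examinees
# matched at one total-score level; counts per stratum are:
# (A, B, C, D) = (reference correct, reference incorrect,
#                 focal correct, focal incorrect)
import math

strata = [(30, 10, 20, 20), (40, 5, 30, 15), (20, 20, 10, 30)]

num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
alpha_mh = num / den                    # common odds ratio across strata
mh_d_dif = -2.35 * math.log(alpha_mh)   # ETS delta scale; |D-DIF| >= 1.5 flags large ("C") DIF

print(f"alpha_MH = {alpha_mh:.2f}, MH D-DIF = {mh_d_dif:.2f}")
```

On this scale, a negative D-DIF indicates the item was easier for the reference group after matching, while values near zero indicate negligible DIF.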

In this project, multiple-choice exam items in general chemistry were investigated using trial-tested preliminary exams in preparation for a standardized first-term general chemistry exam. These items were then retested in general chemistry I with both the original items (which were not included in the final released version of the test) and clones of the original items. These were tested under high-, medium-, and low-stakes conditions, with proficiency matched internally (using the score on the assessment itself as the measure of proficiency) and externally (using placement exam scores, ACT scores and subscores, and standardized final exam scores). The items have been tested over three semesters, and the subset of items showing persistent DIF has been coded into an eye tracker. Differences examined between the subgroups solving these tasks with the eye tracker include performance, time on task, scan-path maps, fixation patterns, and TEPR. Additionally, these subgroups will be collectively examined for processing differences correlated with DIF persistence.

In addition to differential performance on tasks, subgroups have been found to differ in their persistence in STEM majors. Drawing on social cognitive theory and social cognitive career theory, four indicators are used to predict persistence in STEM majors: self-efficacy, outcome expectations, interest, and goals. An instrument to measure these in chemistry is in development, and the correlation between persistent DIF and persistence in STEM will be examined.

Selected Publications:

Murphy, K.L.; Holme, T.A.; Zenisky, A.L.; Caruthers, H.; Knaus, K.J. “Building the ACS Exams Anchoring Concept Content Map for Undergraduate Chemistry” Journal of Chemical Education, 2012, 89, 715-720.

Holme, T.A. and Murphy, K.L. “The ACS Exams Institute Undergraduate Chemistry Content Map I: General Chemistry” Journal of Chemical Education, 2012, 89, 721-723.

Murphy, K.L.; “Using a Personal Response System to Map Cognitive Efficiency”, Journal of Chemical Education, accepted 2012.

Schroeder, J.; Murphy, K.L.; Holme, T.A.; “Investigating factors that influence item performance on ACS Exams”, accepted to Journal of Chemical Education; published online January 19, 2012.

Emenike, M.E.; Schroeder, J.; Murphy, K.L.; Holme, T.A.; “Results from a National Needs Assessment Survey: A Snapshot of Assessment Efforts within Chemistry Faculty Departments”, submitted to Journal of Chemical Education, 2011.

Holme, T.A. and Murphy, K.L.; “Assessing Conceptual versus Algorithmic Knowledge in General Chemistry with ACS Exams”, Journal of Chemical Education, 2011, 88, 1217-1222.

Knaus, K.J.; Murphy, K.L.; Blecking, A.; Holme, T.A.; “A Valid and Reliable Instrument for Cognitive Complexity Rating Assignment of Chemistry Exam Items”, Journal of Chemical Education, 2011, 88, 554-560.

Emenike, M.E.; Schroeder, J.D.; Murphy, K.L.; Holme, T.A.; “A Snapshot of Chemistry Faculty Members’ Awareness of Departmental Assessment Efforts”, Assessment Update, 2011, 23, 1 (5).

Holme, T.A., and Murphy, K.L.; “Assessing Conceptual versus Algorithmic Knowledge: Are we engendering new myths in Chemical Education?” ACS Symposium Series, 2011, 1074, 195–206.

Murphy, K.L.; Picione, J.P.; Holme, T.A.; “Data-Driven Implementation and Adaptation of New Teaching Methodologies”, Journal of College Science Teaching, 2010, 40, 78-84.

Authored Educational Materials

Preparing for Your ACS Examinations in Physical Chemistry: The Official Guide (Commonly called the Physical Chemistry Study Guide); Editors Thomas Holme and Kristen Murphy; American Chemical Society, Division of Chemical Education, Examinations Institute, Iowa State University, 2009.

Murphy, Kristen and Thomas Holme; Toledo Placement Exam, 2009, American Chemical Society, Division of Chemical Education, Examinations Institute

Murphy, Kristen; Picione, John; Blecking, Anja; Lecture Exercise Booklet (Active Learning Exercises) for Chemistry 100; 192 pages; Available online or from a local print shop, 2007-present.

Murphy, Kristen; Picione, John; Blecking, Anja; Online homework system; over 1900 problems (to date); utilized by all Chemical Science and General Chemistry I students, 2007-present.

Committee Member, American Chemical Society, Division of Chemical Education, Examinations Institute, General Chemistry, 2007.

Editor, American Chemical Society, Division of Chemical Education, Examinations Institute:

Assessment Materials (released): General Chemistry (2007); High School (2007); Inorganic Chemistry (2008); General Chemistry Conceptual (2008); Diagnostic of Undergraduate Chemical Knowledge (2008); General Chemistry (2009); High School (2009); General Chemistry (2011); High School (2011); General Chemistry, First Term (2012)

Assessment Materials (in progress): Physical Chemistry: Thermodynamics, Quantum Mechanics, and Dynamics (2012); Diagnostic of Undergraduate Chemical Knowledge (2012); Analytical Chemistry (2012)