Yale Center for Teaching and Learning

Statistics

Journals on statistics pedagogy have recently published work on active learning strategies such as Just-in-Time Teaching and flipped classrooms. Instructors also increasingly emphasize building students’ conceptual understanding of statistics.

Instructors can access a variety of teaching and learning resources on this site or on the American Statistical Association’s Education website. The Consortium for the Advancement of Undergraduate Statistics Education (CAUSE) also supports teaching and learning at the university level through workshops, webinars, and more.

Articles and Papers

McGee M, Stokes L, Nadolsky P. (2016). Just-in-Time Teaching in Statistics Classrooms. Journal of Statistics Education, 24(1), 16-26.

Abstract: “Much has been made of the flipped classroom as an approach to teaching, and its effect on student learning. The volume of material showing that the flipped classroom technique helps students better learn and better retain material is increasing at a rapid pace. Coupled with this technique is active learning in the classroom. There are many ways of “flipping the classroom.” The particular realization of the flipped classroom that we discuss in this article is based on a method called “Just-in-Time Teaching (JiTT).” However, JiTT, in particular, and the flipped classroom, in general, is not just about watching videos before class, or doing activities during class time. JiTT includes assigning short, web-based questions to students based on previously viewed material. Typically, Internet-based questions are constructed to elicit common misunderstandings from the students, so that the instructor can correct such misunderstandings immediately in the next class period, hence the name, “Just-in-Time Teaching.” The addition of these pre-class questions is what separates JiTT from a general flipped classroom model. Even as the research on the superiority of JiTT as a learner-centered pedagogical method mounts, aids for the instructor have not, at least not as quickly. This article is focused on the instructor—with aids to help the instructor begin using the JiTT flipped classroom model in statistics courses.”  

Schwartz TA, Andridge RR, Sainani KL, Stangle DK, Neely ML. (2016). Diverse Perspectives on a Flipped Biostatistics Classroom. Journal of Statistics Education, 24(2), 74-84.

Abstract: “‘Flipping’ the classroom refers to a pedagogical approach in which students are first exposed to didactic content outside the classroom and then actively use class time to apply their newly attained knowledge. The idea of the flipped classroom is not new, but has grown in popularity in recent years as the necessary technology has improved in terms of quality, cost, and availability. Many biostatistics instructors are adopting this format, but some remain unsure whether such a change would benefit their students. One potential barrier to adopting a flipped classroom is the common misconception that only a single approach is available. Having adopted the flipped approach in their own courses, the authors participated in an invited panel at the 2014 Joint Statistical Meetings held in Boston, Massachusetts entitled “Flipping the Biostatistics Classroom.” A theme emerged from the panel’s discussions: rather than being a one-size-fits-all approach, the flipped biostatistics classroom offers a high degree of flexibility, and this flipped approach can—and should—be tailored to instructors’ specific target audience: their students. Several of these varied approaches to the flipped classroom and practical lessons learned are described.”

Lesser LM, Pearl DK, Weber III JJ. (2016). Assessing Fun Items’ Effectiveness in Increasing Learning in College Introductory Statistics Students: Results of a Randomized Experiment. Journal of Statistics Education, 24(2), 54-62.

Abstract: “There has been a recent emergence of scholarship on the use of fun in the college statistics classroom, with at least 20 modalities identified. While there have been randomized experiments that suggest that fun can enhance student achievement or attitudes in statistics, these studies have generally been limited to one particular fun modality or have not been limited to the discipline of statistics. To address the efficacy of fun items in teaching statistics, a student-randomized experiment was designed to assess how specific items of fun may cause changes in statistical anxiety and learning statistics content. This experiment was conducted at two institutions of higher education with different and diverse student populations. Findings include a significant increase in correct responses to questions among students who were assigned online content with a song insert compared with those assigned content alone.”  

Pfannkuch M, Budgett S. (2016). Markov Processes: Exploring the Use of Dynamic Visualizations to Enhance Student Understanding. Journal of Statistics Education, 24(2), 63-73.

Abstract: “Finding ways to enhance introductory students’ understanding of probability ideas and theory is a goal of many first-year probability courses. In this article, we explore the potential of a prototype tool for Markov processes using dynamic visualizations to develop in students a deeper understanding of the equilibrium and hitting times distributions. From the literature and interviews with practitioners, we identified core probability concepts, problematic areas, and possible solutions from which we developed design principles for the tool and accompanying tasks. The tool and tasks were piloted on six introductory probability students using a two-person protocol. The main findings highlight that our tool and tasks seemed to assist students to engage with probability ideas, to develop some intuition for Markov processes, to enhance their distributional ideas, to work between representations, and to see structure within the mathematics representations. The implications for teaching and learning are discussed.”  
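The equilibrium and hitting-time distributions mentioned in this abstract can be illustrated with a small numerical sketch. The Python code below is not the authors’ visualization tool; it uses a made-up 3-state transition matrix purely to show what the two quantities are: the stationary distribution is the left eigenvector of the transition matrix with eigenvalue 1, and expected hitting times solve a small linear system.

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1).
P = np.array([
    [0.6, 0.3, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.4, 0.4],
])

# Equilibrium (stationary) distribution: left eigenvector of P
# with eigenvalue 1, normalized so its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

# Expected hitting times of state 2 from states 0 and 1:
# solve h_i = 1 + sum_{j != 2} P[i, j] * h_j, with h_2 = 0.
idx = [0, 1]                                # non-target states
A = np.eye(len(idx)) - P[np.ix_(idx, idx)]  # (I - Q) from the sub-chain
h = np.linalg.solve(A, np.ones(len(idx)))

print("stationary:", pi)
print("hitting times of state 2:", h)
```

Running the chain forward many steps from any start state converges to the same `pi`, which is one way a dynamic visualization can make the equilibrium idea concrete for students.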

Sigal MJ. (2016). Play It Again: Teaching Statistics with Monte Carlo Simulation. Journal of Statistics Education, 24(3), 136-156.

Abstract: “Monte Carlo simulations (MCSs) provide important information about statistical phenomena that would be impossible to assess otherwise. This article introduces MCS methods and their applications to research and statistical pedagogy using a novel software package for the R Project for Statistical Computing constructed to lessen the often steep learning curve when organizing simulation code. A primary goal of this article is to demonstrate how well-suited MCS designs are to classroom demonstrations, and how they provide a hands-on method for students to become acquainted with complex statistical concepts. In this article, essential programming aspects for writing MCS code in R are overviewed, multiple applied examples with relevant code are provided, and the benefits of using a generate–analyze–summarize coding structure over the typical “for-loop” strategy are discussed.”  
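The software described in this article is an R package; purely as an illustration of the generate–analyze–summarize structure the abstract names, here is a minimal Python sketch (the function names, sample sizes, and the exponential population are all hypothetical choices, not the article’s design). Each replication draws a sample, reduces it to a statistic, and the summaries are computed once at the end.

```python
import numpy as np

rng = np.random.default_rng(2016)

def generate(n):
    """Draw one sample of size n from a skewed (exponential) population."""
    return rng.exponential(scale=1.0, size=n)

def analyze(sample):
    """Reduce a sample to the statistic of interest: the sample mean."""
    return sample.mean()

def summarize(results):
    """Summarize the simulated sampling distribution of the statistic."""
    results = np.asarray(results)
    return {"mean": results.mean(), "sd": results.std(ddof=1)}

# Run the simulation: 5000 replications with n = 30 per sample.
reps = [analyze(generate(30)) for _ in range(5000)]
summary = summarize(reps)

# The Central Limit Theorem predicts mean near 1 and
# standard deviation near 1/sqrt(30), about 0.183.
print(summary)
```

Separating the three stages, rather than burying everything inside one for-loop, makes it easy for students to swap in a different population, statistic, or summary without touching the rest of the simulation.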

Makar K, Rubin A. (2009). A framework for thinking about statistical inference. Statistics Education Research Journal, 8(1), 82-105.

Abstract: “Informal inferential reasoning has shown some promise in developing students’ deeper understanding of statistical processes. This paper presents a framework to think about three key principles of informal inference – generalizations ‘beyond the data,’ probabilistic language, and data as evidence. The authors use primary school classroom episodes and excerpts of interviews with the teachers to illustrate the framework and reiterate the importance of embedding statistical learning within the context of statistical inquiry. Implications for the teaching of more powerful statistical concepts at the primary school level are discussed.”    

Castro Sotos AE, Vanhoof S, Noortgate WV, Onghena P. (2007). Students’ misconceptions of statistical inference: A review of the empirical evidence from research on statistics education. Educational Research Review, 2(2), 98-113.

Abstract: “A solid understanding of inferential statistics is of major importance for designing and interpreting empirical results in any scientific discipline. However, students are prone to many misconceptions regarding this topic. This article structurally summarizes and describes these misconceptions by presenting a systematic review of publications that provide empirical evidence of them. This group of publications was found to be dispersed over a wide range of specialized journals and proceedings, and the methodology used in the empirical studies was very diverse. Three research needs rise from this review: (1) further empirical studies that identify the sources and possible solutions for misconceptions in order to complement the abundant theoretical and statistical discussion about them; (2) new insights into effective research designs and methodologies to perform this type of research; and (3) structured and systematic summaries of findings like the one presented here, concerning misconceptions in other areas of statistics, that might be of interest both for educational researchers and teachers of statistics.”  

Delmas R, Garfield J, et al. (2007). Assessing students’ conceptual understanding after a first course in statistics. Statistics Education Research Journal, 6(2), 28-58.

Abstract: “This paper describes the development of the CAOS test, designed to measure students’ conceptual understanding of important statistical ideas, across three years of revision and testing, content validation, and reliability analysis. Results are reported from a large scale class testing and item responses are compared from pretest to posttest in order to learn more about areas in which students demonstrated improved performance from beginning to end of the course, as well as areas that showed no improvement or decreased performance. Items that showed an increase in students’ misconceptions about particular statistical concepts were also examined. The paper concludes with a discussion of implications for students’ understanding of different statistical topics, followed by suggestions for further research.”   

Garfield JB. (2003). Assessing statistical reasoning. Statistics Education Research Journal, 2(1), 22-38.

Abstract: “This paper begins with a discussion of the nature of statistical reasoning, and then describes the development and validation of the Statistical Reasoning Assessment (SRA), an instrument consisting of 20 multiple-choice items involving probability and statistics concepts. Each item offers several choices of responses, both correct and incorrect, which include statements of reasoning explaining the rationale for a particular choice. Students are instructed to select the response that best matches their own thinking about each problem. The SRA provides 16 scores which indicate the level of students’ correct reasoning in eight different areas and the extent of their incorrect reasoning in eight related areas. Results are presented of a cross-cultural study using the SRA to compare the reasoning of males and females in two countries.”