• Journal of Mathematics Research
  • Announcements

JMR, Vol. 16, No. 3, June 2024: Call for Papers

Posted on Apr 7, 2024. We are seeking submissions for the Vol. 16, No. 3, June 2024 issue. Submission deadline: May 5, 2024.

Recruitment for Reviewers

Posted on Mar 19, 2020. We are recruiting reviewers for the journal. If you are interested in becoming a reviewer, we welcome you to join us. Please find the application form and details at http://recruitment.ccsenet.org.

Policy Change of Free Print Journals

Posted on Jan 23, 2018. As you are aware, printing and delivering journals has a significant detrimental impact on the environment. As a responsible publisher that is considerate of the environment...

Current: Vol. 15, No. 6 (2023)

  • A Memoir on Pseudo-Variational Techniques for Parabolic PDE’s Incorporating Boundary Value Constraints (Uchechukwu M. Opara, Festus I. Arunaye, Philip O. Mate)
  • Restoring Environmental Justice: On the Coupled Dynamical Analysis of Lake Powell and Lake Mead (Angelina Shen, Youn-Sha Chan)
  • Logistic Regression Analytically Solves the 3D Navier Stokes Equations (Edward H. Jimenez)
  • Scales Bridging in the Model of Growth of Animals, a Holistic Slant (V. L. Stass)
  • Control of the Hyperbolic Ill-posed Cauchy Problem by Controllability (Bylli André B. Guel, Sadou Tao, Somdouda Sawadogo)
  • Reviewer Acknowledgements for Journal of Mathematics Research, Vol. 15, No. 6 (Sophia Wang)

Journal Publishing Workflow

Please see the workflow for article publication:


Paper Selection and Publication Process

a) Upon receipt of a submission, the Editor sends a confirmation e-mail to the corresponding author within 1-3 working days. If you do not receive this confirmation, your submission or e-mail may have been missed; please contact the Editor promptly.

b) Peer review. We use a double-blind system for peer review: both the reviewers' and the authors' identities remain anonymous. Each paper is reviewed by three experts, typically two external reviewers and one editor from the journal. The review process may take 4-10 weeks.

c) The result of the review is communicated to the authors by e-mail.

d) If the submission is accepted, the authors revise accordingly and pay the article processing charge (formatting and hosting).

e) A PDF version of the article is available for download on the journal's webpage free of charge.

f) From July 1, 2018, we no longer automatically provide authors with free print journals. Free print copies are available to authors who need them; authors are requested to fill out an application form to request them. Additionally, we are happy to provide the journal's eBook in PDF format to authors free of charge; its content is identical to the printed version.

The publisher and journal have a policy of zero tolerance on plagiarism. We check for plagiarism through two methods: reviewer checks and a plagiarism prevention tool (ithenticate.com).

All submissions will be checked by iThenticate before being sent to reviewers.


  • ISSN(Print): 1916-9795
  • ISSN(Online): 1916-9809
  • Started: 2009
  • Frequency: bimonthly

Journal Metrics

  • h-index (December 2021): 22
  • i10-index (December 2021): 78
  • h5-index (December 2021): N/A
  • h5-median (December 2021): N/A

(The data were calculated based on Google Scholar Citations.)

  • Academic Journals Database
  • Aerospace Database
  • BASE (Bielefeld Academic Search Engine)
  • Civil Engineering Abstracts
  • CNKI Scholar
  • DTU Library
  • Elektronische Zeitschriftenbibliothek (EZB)
  • EuroPub Database
  • Google Scholar
  • Harvard Library
  • JournalTOCs
  • PKP Open Archives Harvester
  • ResearchGate
  • SHERPA/RoMEO
  • Standard Periodical Directory
  • Technische Informationsbibliothek (TIB)
  • The Keepers Registry
  • UCR Library
  • Universe Digital Library
  • Sophia Wang, Editorial Assistant
  • [email protected]
  • Journal Home
  • Editorial Team
  • Order Hard Copies

American Mathematical Society

Publications — Over 100 years of publishing excellence

  • Book Author Resources
  • Submit a Book Proposal
  • AMS Rights, Licensing, and Permissions
  • Open Math Notes
  • Frequently asked questions
  • Member Journals
  • Research Journals
  • Translation Journals
  • Distributed Journals
  • Open Access Journals
  • Guidelines and Policies
  • Journal Author Resources

Librarian Resources

  • eBook Collections
  • COUNTER Usage Statistics
  • My Subscriptions
  • Subscription Information
  • Licensing Information

Mathematical Reviews/MathSciNet®

  • MathSciNet ®
  • Reviewer Home
  • MathSciNet ® Subscriptions

Membership — Welcome to your membership center

Join the AMS · Renew Your Membership · Give a Membership · Individual Membership

  • Member Benefits
  • Member Directory
  • Reciprocating Societies
  • Members in Developing Countries

Institutional Membership

  • Domestic Institutions
  • International Institutions
  • Two-Year Institutions
  • Graduate Student Chapter Program

Other Member Types

  • Corporate Memberships
  • Associate Memberships

Meetings & Conferences — Engage with colleagues and the latest research

National Meetings

  • Joint Mathematics Meetings
  • Upcoming JMMs
  • Previous JMMs
  • Special Lectures
  • Professional Enhancement Programs (PEPs)

Sectional Meetings

  • Upcoming Sectionals
  • Previous Sectionals
  • Presenting Papers
  • Hosting Sectionals

Other Meetings, Conferences & Workshops

  • Mathematics Research Communities
  • Education Mini-conference
  • International Meetings
  • Mathematics Calendar
  • Short Courses
  • Workshop for Department Chairs and Leaders

Meetings Resources

  • Suggest a Speaker
  • AMS Meetings Grants
  • Submitting Abstracts
  • Welcoming Environment Policy
  • MathSafe – supporting safe meetings

News & Outreach — Explore news, images, posters, and mathematical essays

News from the AMS

  • AMS News Releases
  • Feature Stories
  • Information for Journalists
  • In Memory Of

Math Voices

  • Feature Column
  • Math in the Media
  • Column on Teaching and Learning

Explorations

  • Recognizing Diverse Mathematicians
  • AMS Posters
  • Mathematics & Music
  • Mathematical Imagery
  • Mathematical Moments

Professional Programs — Resources and opportunities to further your mathematical pursuits

Professional Development

  • Employment Services
  • Mathjobs.org
  • BEGIN Career Initiative
  • Mathprograms.org
  • Mathematical Opportunities Database
  • Research Seminars

Institutional Information and Data

  • Annual Survey of the Mathematical and Statistical Sciences
  • CBMS Survey
  • Other Sources of Data
  • Directory of Institutions in the Mathematical Sciences
  • Professional Directory

Grants & Support

  • AMS-Simons Grants for PUI Faculty
  • Travel Grants
  • Fellowships & Scholarships
  • Epsilon Fund
  • Child Care Grants

Awards & Recognition

  • AMS Prizes & Awards
  • Fellows of the AMS

Education — Resources to support advanced mathematics teaching and learning

For Students

  • Information for Undergraduate and High School Students
  • Research Experiences for Undergraduates (REUs)
  • Considering Grad School
  • Find Grad Programs
  • Applying to Grad School
  • What do Mathematicians Do?

For Teachers

  • Teaching Online
  • Teaching Resources
  • Inclusive Classrooms
  • Assessing Student Learning
  • Education Webinars

For Department Leaders & Mentors

  • Information for Department Leaders
  • paraDIGMS (Diversity in Graduate Mathematical Sciences)

Government Relations — Advocating for the mathematical sciences

Elevating Mathematics in Congress

  • Our Mission
  • Letters, Statements, & Legislation
  • Congressional Briefings

Legislative Priorities

  • Federal Issues of Concern
  • Federal Budget Process

Get Involved

  • Advocacy Resources
  • Take Action

DC-Based Fellowships

  • Congressional Fellowship
  • Mass Media Fellowship
  • Catalyzing Advocacy in Science & Engineering (CASE) Fellowship

Giving to the AMS — Your gifts make great things happen for mathematics   Make a Gift

What You Can Support

  • The 2020 Fund
  • Next Generation Fund
  • Birman Fellowship for Women Scholars
  • JMM Child Care Grants
  • MathSciNet for Developing Countries

Create a Legacy

  • Make a Tribute Gift
  • Create a Permanent Fund
  • Establish a Prize, Award or Fellowship
  • Bequests and Charitable Estate Planning

Honoring Your Gift

  • Donor Stories
  • Donor Wall of Honor
  • Thomas S. Fiske Society
  • AMS Contributors Society
  • AMS Gardens

Giving Resources

  • AMS Development Committee
  • AMS Gift Acceptance Policy

About the AMS — Advancing research. Connecting the mathematics community.

Our Organization

  • Executive Staff
  • Equity, Diversity, & Inclusion
  • Jobs at AMS
  • Customer Service

Our Governance

  • Board of Trustees
  • Executive Committee

Governance Operations

  • Calendar of Meetings
  • Policy Statements & Guidelines


American Mathematical Society

Journals — High quality journals covering a broad range of mathematical disciplines

Notices of the American Mathematical Society

Current issue · All issues

Notices of the American Mathematical Society ISSN 1088-9477 (online) ISSN 0002-9920 (print) MCQ: 0.45

Bulletin of the American Mathematical Society

Bulletin of the American Mathematical Society ISSN 1088-9485 (online) ISSN 0273-0979 (print) MCQ: 0.47

Abstracts of Papers Presented to the American Mathematical Society

All issues: 2009–Present

Abstracts of Papers Presented to the American Mathematical Society ISSN 2689-4831 (online) ISSN 0192-5857 (print) MCQ: 0.00

The Mathematical Citation Quotient (MCQ) measures journal impact by looking at citations over a five-year period.

Communications of the American Mathematical Society

Current volume · All volumes

Communications of the American Mathematical Society ISSN 2692-3688 MCQ: 0.47

Journal of the American Mathematical Society

Journal of the American Mathematical Society ISSN 1088-6834 (online) ISSN 0894-0347 (print) MCQ: 4.79

Representation Theory

Representation Theory ISSN 1088-4165 MCQ: 0.7

Proceedings of the American Mathematical Society

Proceedings of the American Mathematical Society ISSN 1088-6826 (online) ISSN 0002-9939 (print) MCQ: 0.85

Proceedings of the American Mathematical Society Series B

Proceedings of the American Mathematical Society Series B ISSN 2330-1511 MCQ: 0.84

Mathematics of Computation

Mathematics of Computation ISSN 1088-6842 (online) ISSN 0025-5718 (print) MCQ: 1.98

Conformal Geometry and Dynamics

Conformal Geometry and Dynamics ISSN 1088-4173 MCQ: 0.5

Memoirs of the American Mathematical Society

Memoirs Home

Memoirs of the American Mathematical Society ISSN 1947-6221 (online) ISSN 0065-9266 (print) MCQ: 0.51

Transactions of the American Mathematical Society

Transactions of the American Mathematical Society ISSN 1088-6850 (online) ISSN 0002-9947 (print) MCQ: 1.43

Transactions of the American Mathematical Society Series B

Transactions of the American Mathematical Society Series B ISSN 2330-0000 MCQ: 1.79

Electronic Research Announcements

All volumes

Electronic Research Announcements ISSN 1079-6762 MCQ: 0.00


St. Petersburg Mathematical Journal

St. Petersburg Mathematical Journal ISSN 1547-7371 (online) ISSN 1061-0022 (print) MCQ: 0.54

Transactions of the Moscow Mathematical Society

Transactions of the Moscow Mathematical Society ISSN 1547-738X (online) ISSN 0077-1554 (print) MCQ: 0.51

Sugaku Expositions

Sugaku Expositions ISSN 2473-585X (online) ISSN 0898-9583 (print) MCQ: 0.10

Annales Scientifiques de l'École Normale Supérieure

Annales Scientifiques de l'École Normale Supérieure ISSN: 1088-4173 MCQ: 2.09

Astérisque

Astérisque ISSN: 0303-1179 MCQ: 0.45

Bulletin de la Société Mathématique de France

Bulletin de la Société Mathématique de France ISSN 0037-9484 MCQ: 0.70

Theory of Probability and Mathematical Statistics

Theory of Probability and Mathematical Statistics ISSN 1547-7363 (online) ISSN 0094-9000 (print) MCQ: 0.12

Journal of Algebraic Geometry

Journal of Algebraic Geometry ISSN 1534-7486 (online) ISSN 1056-3911 (Print) MCQ: 1.37

Journal of Operator Theory (JOT)

Journal of Operator Theory ISSN 0379-4024 (print) MCQ: 0.60

Quarterly of Applied Mathematics

Quarterly of Applied Mathematics ISSN 1552-4485 (online) ISSN 0033-569X (print) MCQ: 0.60

Moscow Mathematical Journal

Moscow Mathematical Journal ISSN 1609-3321 (print) MCQ: 0.61

Mémoires de la Société Mathématique de France

Mémoires de la Société Mathématique de France ISSN 0249-633X MCQ: 1.83

Journal of the Ramanujan Mathematical Society

Journal of the Ramanujan Mathematical Society ISSN 0970-1249 MCQ: 0.24


Mathematics

On these pages you will find Springer’s journals, books and eBooks in all areas of Mathematics, serving researchers, lecturers, students, and professionals. We publish many of the most prestigious journals in Mathematics, including a number of fully open access journals.

Our book and eBook portfolio comprises monographs, textbook series, reference works and conference proceedings from the world’s most distinguished authors.

Subdisciplines

  • Algebra
  • Analysis
  • Applications
  • Computational Science & Engineering
  • Dynamical Systems & Differential Equations
  • Geometry & Topology
  • History of Mathematical Sciences
  • Mathematical & Computational Biology
  • Mathematical Physics
  • Number Theory & Discrete Mathematics
  • Probability Theory & Stochastic Processes
  • Quantitative Finance


Find our products

Visit our shop on SpringerLink with more than 300,000 books. Read over ten million scientific documents on SpringerLink.

Join our mailing list


Get access to exclusive content, sales, promotions and events. Be the first to hear about new book releases and journal launches. Learn about our newest services, tools and resources.


Publish with us

Selecting the right publisher is one of the most important decisions an author will make. At Springer, we recognize that our authors are the heart of what we do, and we are committed to providing the resources, support, and advice you need to help you succeed.

Journal of the Association for Mathematical Research


About the Journal

The Journal of the Association for Mathematical Research (JAMR) is a Diamond Open Access journal publishing research articles in all branches of mathematics at the level of the best specialized journals. There are no strict page limits. A published article may be accompanied, when appropriate, by other media, including links to GitHub or other repositories for code or data, related notes, and videos relevant to the article.

Announcements

First issue published.

The first issue of JAMR was published on July 21, 2023.

Journal of the Association for Mathematical Research launched.

Current Issue

  • Chvátal–Erdős condition for pancyclicity
  • Effective counting in sphere packings
  • Log-concave poset inequalities

Information

  • For Readers
  • For Authors
  • For Librarians



Journal for Research in Mathematics Education

An official journal of the National Council of Teachers of Mathematics (NCTM), JRME is the premier research journal in mathematics education and is devoted to the interests of teachers and researchers at all levels, preschool through college.

  • eTOC Alerts
  • Latest Issue TOC RSS

Reflecting From the Border Between Mathematics Education Research and Cognitive Psychology

Unitizing Predicates and Reasoning About the Logic of Proofs

This article offers the construct unitizing predicates to name mental actions important for students’ reasoning about logic. To unitize a predicate is to conceptualize (possibly complex or multipart) conditions as a single property that every example has or does not have, thereby partitioning a universal set into examples and nonexamples. This explains the cognitive work that supports students to unify various statements with the same logical form, which is conventionally represented by replacing parts of statements with logical variables p or P ( x ). Using data from a constructivist teaching experiment with two undergraduate students, we document barriers to unitizing predicates and demonstrate how this activity influences students’ ability to render mathematical statements and proofs as having the same logical structure.

How Students Understand Graphical Patterns: Fine-Grained, Intuitive Knowledge Used in Graphical Thinking

Engaging in the construction and interpretation of graphs is a complex process involving concerted activation of context-specific cognitive resources. As students engage in this process, they apply fine-grained, intuitive ideas to graphical patterns: graphical forms. Using data involving pairs of students constructing and interpreting graphs, we expand on the current knowledge base on graphical forms to contribute an empirically based catalog. We also situate our cognitively oriented work in relation to research that has emphasized (a) misconceptions and (b) social practices. In addition, we draw connections to the research on covariational reasoning. We end with implications regarding how graphical forms contribute to our understanding of students’ graphical reasoning and how instructors can support students.

The Journal for Research in Mathematics Education is published online five times a year—January, March, May, July, and November—at 1906 Association Dr., Reston, VA 20191-1502. Each volume’s index is in the November issue. JRME is indexed in Contents Pages in Education, Current Index to Journals in Education, Education Index, Psychological Abstracts, Social Sciences Citation Index, and MathEduc.

An official journal of the National Council of Teachers of Mathematics (NCTM), JRME is the premier research journal in mathematics education and is devoted to the interests of teachers and researchers at all levels, preschool through college. JRME presents a variety of viewpoints. The views expressed or implied in JRME are not the official position of the Council unless otherwise noted.

JRME is a forum for disciplined inquiry into the teaching and learning of mathematics. The editors encourage submissions including:

  • Research reports addressing important research questions and issues in mathematics education
  • Brief reports of research
  • Research commentaries on issues pertaining to mathematics education research

More information about each type of submission is available here. If you have questions about the types of manuscripts JRME publishes, please contact [email protected].

Editorial Board

The JRME Editorial Board consists of the Editorial Team and Editorial Panel. The Editorial Team, led by JRME Editor Patricio Herbst, leads the review, decision, and editorial/publication process for manuscripts. The Editorial Panel reviews manuscripts, sets policy for the journal, and continually seeks feedback from readers. The following are members of the current JRME Editorial Board.

Editorial Staff   

Editorial Panel  

International Advisory Board   

Headquarters Journal Staff  

The editors of the  Journal for Research in Mathematics Education (JRME)  encourage the submission of a variety of manuscripts.

Manuscripts must be submitted through the JRME Online Submission and Review System.

Research Reports

JRME publishes a wide variety of research reports that move the field of mathematics education forward. These include, but are not limited to, various genres and designs of empirical research; philosophical, methodological, and historical studies in mathematics education; and literature reviews, syntheses, and theoretical analyses of research in mathematics education. Papers that review well for JRME generally include these Characteristics of a High-Quality Manuscript. The editors strongly encourage all authors to consider these characteristics when preparing a submission to JRME.

The maximum length for Research Reports is 13,000 words including abstract, references, tables, and figures.

Brief Reports

Brief reports of research are appropriate when a fuller report is available elsewhere or when a more comprehensive follow-up study is planned.

  • A brief report of a first study on some topic might stress the rationale, hypotheses, and plans for further work.
  • A brief report of a replication or extension of a previously reported study might contrast the results of the two studies, referring to the earlier study for methodological details.
  • A brief report of a monograph or other lengthy nonjournal publication might summarize the key findings and implications or might highlight an unusual observation or methodological approach.
  • A brief report might provide an executive summary of a large study.

The maximum length for Brief Reports is 5,000 words including abstract, references, tables, and figures. If source materials are needed to evaluate a brief report manuscript, a copy should be included.

Other correspondence regarding manuscripts for Research Reports or Brief Reports should be sent to

Patricio Herbst, JRME Editor, [email protected].

Research Commentaries

The journal publishes brief (5,000-word), peer-reviewed commentaries on issues that reflect on mathematics education research as a field and steward its development. Research Commentaries differ from Research Reports in that their focus is not to present new findings or empirical results, but rather to comment on issues of interest to the broader research community.

Research Commentaries are intended to engage the community and increase the breadth of topics addressed in  JRME . Typically, Research Commentaries —

  • address mathematics education research as a field and endeavor to move the field forward;
  • speak to the readers of the journal as an audience of researchers; and
  • speak in ways that have relevance to all mathematics education researchers, even when addressing a particular point or a particular subgroup.

Authors of Research Commentaries should share their perspectives while seeking to invite conversation and dialogue, rather than close off opportunities to learn from others, especially those whose work they might be critiquing. 

Foci of Research Commentaries vary widely. They may include, but are not restricted to, the following:

  • Discussion of connections between research and NCTM-produced documents
  • Advances in research methods
  • Discussions of connections among research, policy, and practice
  • Analyses of trends in policies for funding research
  • Examinations of evaluation studies
  • Critical essays on research publications that have implications for the mathematics education research community
  • Interpretations of previously published research in JRME that bring insights from an equity lens
  • Exchanges among scholars holding contrasting views about research-related issues

Read more about Research Commentaries in our May 2023 editorial.

The maximum length for Research Commentaries is 5,000 words, including abstract, references, tables, and figures.

Other correspondence regarding Research Commentary manuscripts should be sent to: 

Daniel Chazan, JRME Research Commentary Editor, [email protected].

Tools for Authors

The forms below provide information to authors and help ensure that NCTM complies with all copyright laws: 

Student Work Release

Photographer Copyright Release

Video Permission

Want to Review?

Find more information in this flyer about how to become a reviewer for JRME.

The Journal for Research in Mathematics Education is available to individuals as part of an NCTM membership or may be accessible through an institutional subscription.

The Journal for Research in Mathematics Education (JRME), an official journal of the National Council of Teachers of Mathematics (NCTM), is the premier research journal in mathematics education and is devoted to the interests of teachers and researchers at all levels, preschool through college.

JRME is published five times a year—January, March, May, July, and November—and presents a variety of viewpoints. Learn more about JRME.

NCTM

© 2024 National Council of Teachers of Mathematics (NCTM)


Minnesota Journal of Undergraduate Mathematics


ISSN: 2378-5810

The Minnesota Journal of Undergraduate Mathematics focuses on original mathematical research, done primarily by undergraduates, in all areas of mathematics and its applications. The journal is currently not accepting new articles while we work to process all current submissions. Authors with submissions should watch for updates via email. We anticipate accepting new submissions later in 2024.

Sponsors: School of Mathematics, Math Center for Educational Programs (MathCEP), Institute for Mathematics and its Applications

Current Issue

Vol. 8 No. 1 (2024): 2023-2024 Academic Year

Published: 2024-02-19

The Legendre Approximation and Arithmetic Bias

Megan Paasche, Ghaith Hiary

Expected Value of Statistics on Type-B Permutation Tableaux

Ryan Althoff, Daniel Diethrich, Amanda Lohss, Xin-Dee Low, Emily Wichert

Some Thoughts on the Search for 5 × 5 and 6 × 6 Additive-Multiplicative Magic Squares

Desmond Weisenberg



The copyright of these individual works published by the University of Minnesota Libraries Publishing remains with the original creator or editorial team. For uses beyond those covered by law or the Creative Commons license, permission to reuse should be sought directly from the copyright owner listed on each article.

Information

  • Author Services

Initiatives


All articles published by MDPI are made immediately available worldwide under an open access license. No special permission is required to reuse all or part of an article published by MDPI, including figures and tables. For articles published under an open access Creative Commons CC BY license, any part of the article may be reused without permission provided that the original article is clearly cited. For more information, please refer to https://www.mdpi.com/openaccess .

Feature papers represent the most advanced research with significant potential for high impact in the field. A Feature Paper should be a substantial original Article that involves several techniques or approaches, provides an outlook for future research directions and describes possible research applications.

Feature papers are submitted upon individual invitation or recommendation by the scientific editors and must receive positive feedback from the reviewers.

Editor’s Choice articles are based on recommendations by the scientific editors of MDPI journals from around the world. Editors select a small number of articles recently published in the journal that they believe will be particularly interesting to readers, or important in the respective research area. The aim is to provide a snapshot of some of the most exciting work published in the various research areas of the journal.


  • Active Journals
  • Find a Journal
  • Proceedings Series
  • For Authors
  • For Reviewers
  • For Editors
  • For Librarians
  • For Publishers
  • For Societies
  • For Conference Organizers
  • Open Access Policy
  • Institutional Open Access Program
  • Special Issues Guidelines
  • Editorial Process
  • Research and Publication Ethics
  • Article Processing Charges
  • Testimonials
  • Preprints.org
  • SciProfiles
  • Encyclopedia

Hom-Lie Superalgebras in Characteristic 2

Journal Description

Mathematics.

  • Open Access — free for readers, with article processing charges (APC) paid by authors or their institutions.
  • High Visibility: indexed within Scopus, SCIE (Web of Science), RePEc, and other databases.
  • Journal Rank: JCR - Q1 (Mathematics) / CiteScore - Q1 (General Mathematics)
  • Rapid Publication: manuscripts are peer-reviewed and a first decision is provided to authors approximately 16.9 days after submission; accepted papers are published within 2.6 days of acceptance (median values for papers published in this journal in the second half of 2023).
  • Recognition of Reviewers: reviewers who provide timely, thorough peer-review reports receive vouchers entitling them to a discount on the APC of their next publication in any MDPI journal, in appreciation of the work done.
  • Sections: published in 13 topical sections.
  • Companion journals for Mathematics include: Foundations, AppliedMath, Analytics, International Journal of Topology, Geometry, and Logics.

Latest Articles

research journal of mathematics

Journal Menu

  • Mathematics Home
  • Aims & Scope
  • Editorial Board
  • Reviewer Board
  • Topical Advisory Panel
  • Instructions for Authors

Special Issues

  • Sections & Collections
  • Article Processing Charge
  • Indexing & Archiving
  • Editor’s Choice Articles
  • Most Cited & Viewed
  • Journal Statistics
  • Journal History
  • Journal Awards
  • Society Collaborations

Conferences

  • Editorial Office

Journal Browser

  • Forthcoming issue · Current issue
  • Vol. 12 (2024)
  • Vol. 11 (2023)
  • Vol. 10 (2022)
  • Vol. 9 (2021)
  • Vol. 8 (2020)
  • Vol. 7 (2019)
  • Vol. 6 (2018)
  • Vol. 5 (2017)
  • Vol. 4 (2016)
  • Vol. 3 (2015)
  • Vol. 2 (2014)
  • Vol. 1 (2013)

Highly Accessed Articles

Latest Books · E-Mail Alert

research journal of mathematics

Topical Collections

Further Information · MDPI Initiatives · Follow MDPI

MDPI

Subscribe to receive issue release notifications and newsletters from MDPI journals

Involve your students in research

If you would like to receive a subscription to the print version of Involve for your institution or department or for personal use, please contact Mathematical Sciences Publishers .

For information on submitting papers, see the Submissions page.

Involve is published by MSP (Mathematical Sciences Publishers), alongside other top journals. MSP is a nonprofit that believes fair-priced, scholar-led subscription journals remain the best stewards of quality and fairness, and it strives to offer the highest quality at the lowest sustainable prices. MSP also developed EditFlow, the popular peer-review web application.

Impartiality Statement

Peer Review

This journal operates a single-anonymized review process (the names of the reviewers are hidden from the author). All contributions will be initially assessed by an editor for suitability for the journal. Papers deemed suitable are then typically sent to a minimum of one independent expert reviewer to assess the scientific quality of the paper. The editors are responsible for the final decision regarding acceptance or rejection of articles. The editors’ decisions are final. Editors are not involved in decisions about papers which they have written themselves.

  • MathSciNet (American Mathematical Society).
  • zbMATH Open (FIZ Karlsruhe).
  • Scopus (Elsevier) and Scimago Journal Rank .

Involve, a Journal of Mathematics https://msp.org/involve First published: 2008. Size/periodicity: 5 issues per year; about 900 pages per year. ISSN (print): 1944-4176. ISSN (electronic): 1944-4184. Dimensions: 7×10in (17.8×25.4cm).

Effects of a Learning Trajectory for statistical inference on 9th-grade students’ statistical literacy

  • Open access
  • Published: 10 April 2024

  • Marianne van Dijke-Droogers (ORCID: 0000-0003-1561-7831)
  • Paul Drijvers (ORCID: 0000-0002-2724-4967)
  • Arthur Bakker (ORCID: 0000-0002-9604-3448)

In our data-driven society, it is essential for students to become statistically literate. A core domain within Statistical Literacy is Statistical Inference, the ability to draw inferences from sample data. Acquiring and applying inferences is difficult for students and, therefore, usually not included in the pre-10th-grade curriculum. However, recent studies suggest that developing a good understanding of key statistical concepts at an early age facilitates the understanding of Statistical Inference later on. This study evaluates the effects of a Learning Trajectory for Statistical Inference on Dutch 9th-grade students’ Statistical Literacy. Theories on informal Statistical Inference and repeated sampling guided the Learning Trajectory’s design. For the evaluation, we used a pre-post research design with an intervention group ( n  = 267). The results indicated that students made significant progress on Statistical Literacy and on the ability to make inferences in particular, but also on the other domains of Statistical Literacy. To further interpret the learning gains of this group, we compared students’ results with national baseline achievements from a comparison group ( n  = 217) who followed the regular 9th-grade curriculum, and with international studies using similar test items. Both comparisons confirmed a significant positive effect on all domains of Statistical Literacy. These findings suggest that current statistics curricula for grades 7–9, usually with a strong descriptive focus, can be enriched with an inferential focus.


In our data-driven society, it is essential for citizens to be statistically literate. Both our daily activities and professional practices increasingly rely on statistical information we obtain, either from taking measurements or through media reports. Statistical Literacy (SL) concerns the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002 ). The growing use of and dependence on statistical data requires an educational approach in which students learn to create and critically evaluate data-based claims (Ben-Zvi et al., 2015 ) and, as such, to become statistically literate.

A core domain of SL is drawing inferences from sample data. However, learning and applying Statistical Inferences (SI) is difficult for students (Castro Sotos et al., 2007 ; Konold & Pollatsek, 2002 ). Therefore, in many countries, including the Netherlands, it is not offered in the pre-10th-grade curriculum. Recent studies suggest that developing, at an early age, a good understanding of key statistical concepts of sample, variability and distributions, facilitates the understanding of SI later on (Ben-Zvi et al., 2015 ; Zieffler et al., 2008 ). Innovative educational software for simulating samples and repeated sampling offers opportunities to make these key concepts accessible (Biehler et al., 2013 ).

To support students’ SI, a Learning Trajectory (LT) for 9th-grade students (14–15 years old) was designed to introduce the key concepts of SI (van Dijke-Droogers et al., 2020 ). Theories of informal Statistical Inference (Makar & Rubin, 2009 ), complemented by ideas of growing samples and repeated sampling (Bakker, 2004 ), constituted the design of the LT. This simulation-based LT comprises an investigative approach that includes all stages of the statistical investigation cycle—from collecting data to interpreting the results—with an emphasis on interpreting sample data and reasoning about probability. Although the focus of the LT is on SI, the approach concretizes broader underlying statistical concepts, such as measures of center and spread, distribution, and correlation, by means of visualizations. As such, our conjecture is that the designed LT for introducing SI will also have a stimulating effect on the other, more descriptive-focused, domains of SL. More details about the design of the LT are elaborated by van Dijke-Droogers et al. ( 2021 ).

Currently, the typical Dutch pre-10th-grade curriculum is mainly focused on the descriptive SL domains. Adding new learning trajectories on top of regular curricula is difficult, e.g., in time and effort. In this regard, the purpose of the LT is to expand the 9th-grade curriculum with SI, the more complex domain of SL, without neglecting the current educational goals on the other domains.

The aim of the study reported here is to evaluate the effects of the designed LT for introducing Statistical Inference on students’ Statistical Literacy. Therefore, we wanted to assess students’ performance on SI, and their achievements on the other descriptive-focused domains of SL as offered in the regular curriculum. Assessment instruments with a specific focus on SI hardly exist for our age group. As such, we developed a pre- and posttest, by adapting and expanding already validated tests. This assessment instrument enabled us to establish students’ performance on both tests, and hence to evaluate the effects of the designed LT for Statistical Inference on students’ SL, and on the SI domain in particular. 

Theoretical background

Domains of statistical literacy

Statistical Literacy (SL) concerns the use of statistical information as evidence in arguments (Schield, 1999). This includes the ability to read and interpret numbers in statements, surveys, tables, and graphs, as well as an understanding of how statistical associations are used as evidence for causal connections. Although SL has several definitions, the most-used one comes from Gal (2002), where SL is portrayed as the ability to interpret, critically evaluate, and communicate about statistical information and messages. According to Rumsey (2002), SL includes the understanding of basic statistical concepts and ideas in data awareness, production, understanding, interpretation, and communication.

Three domains of SL can be distinguished (Watson & Callingham, 2003 ). The average and chance (AC) domain covers determining measures of center and spread, and calculating and interpreting chance issues, as reflected in the mathematics curriculum in most Western countries (Watson & Callingham, 2004 ). The graphing and variation (GV) domain entails creating and interpreting visual representations of data with the variation involved. The sampling and inferences domain focuses on Statistical Inference and, as such, can be considered as the Statistical Inference domain within SL. This SI domain covers working with samples and drawing inferences, where interpreting the relationship between these two is particularly important in the process of statistical decision making.

Many secondary school curricula make a distinction between statistics without probability (descriptive statistics, exploratory data analysis), as addressed in the GV and AC domains, and statistics with probability (inferential statistics) as addressed in the SI domain. The latter is usually taught at upper levels (Burrill & Biehler, 2011 ). This also holds for the Dutch secondary school curriculum, in which statistics education progresses from descriptive statistics in the early years to preparing for a more formal approach to inferential statistics from grade 10 and in higher education (van Dijke-Droogers et al., 2017 ; van Streun & van de Giessen, 2007 ). In the Dutch curriculum for grades 7–9, the first two domains of SL are embedded in the descriptive statistics, whereas the SI domain is not addressed at all.

Statistical inference

Statistical Inference (SI) is at the heart of statistics as “it provides a means to make substantive evidence–based claims under uncertainty when only partial data are available” (Makar & Rubin, 2018, p. 262). As such, SI can be considered both an outcome—evidence-based claims—and a reasoned process for probabilistic generalizations from data—interpreting the uncertainty involved (Makar & Rubin, 2009). SI concerns interpreting sample results, drawing data-based conclusions, and reasoning about probability. For most students, it is difficult to understand SI and the uncertainty involved. Several studies have focused on the introduction and conceptualization of SI. Offering educational activities on SI at an early age at an informal level, combined with the frequent recurrence of such activities later on, seems to make SI accessible for students, in particular at the school level (Makar & Rubin, 2009; Paparistodemou & Meletiou-Mavrotheris, 2008; van Dijke-Droogers et al., 2020; Zieffler et al., 2008). In general, this informal approach focuses on ways in which students without knowledge of formal statistical techniques, such as hypothesis testing, use their statistical knowledge to underpin their inferences about an unknown population based on observed samples. A widely used framework for informal Statistical Inference identifies three main principles: generalization beyond data, data as evidence for these generalizations, and probabilistic reasoning about the generalization (Makar & Rubin, 2009).

SI requires an understanding of the key concepts of sample, variability and distribution—including frequency distribution and (simulated) sampling distribution. These concepts can be introduced at the school level by using ideas of simulating repeated samples (Garfield et al., 2015 ; Manor & Ben-Zvi, 2017 ; Rossman, 2008 ; Saldanha & Thompson, 2002 ; Watson & Chance, 2012 ) and growing samples (Bakker, 2004 ; Ben-Zvi et al., 2012 ; Wild et al., 2011 ). Digital tools such as TinkerPlots ™ offer opportunities for simulating repeated samples and to visualize concepts, such as random behavior, distribution, and probability (Garfield et al., 2012 ; Konold et al., 2007 ; Pfannkuch et al., 2018 ). Working with such simulations stimulates the understanding of statistical models and modeling processes that are essential for SI. In the LT we designed, students start with interpreting the sampling distribution obtained from repeated sampling with a physical black box filled with marbles. As a follow-up, students build and run a model of a real-world situation in TinkerPlots ™ and use this model, by simulating and interpreting the sampling distribution of repeated samples, to understand the real-world situation, and to draw inferences. The details of the LT will be illustrated and discussed later in the methodology section.
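
The following is a minimal sketch of the repeated-sampling idea described above; it is not the authors' TinkerPlots™ model or classroom material, and the population mix, sample size, and number of repetitions are illustrative assumptions.

```python
# Sketch of repeated sampling from a "black box" of marbles (illustrative values, not the LT's).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=0)

population_proportion_red = 0.6   # assumed population mix, unknown to the students
sample_size = 25                  # marbles drawn per sample
n_repetitions = 1000              # number of repeated samples

# Proportion of red marbles observed in each repeated sample.
sample_proportions = rng.binomial(sample_size, population_proportion_red,
                                  size=n_repetitions) / sample_size

# The histogram of these proportions is the simulated sampling distribution that students
# use to reason about sample-to-sample variability and plausible population values.
plt.hist(sample_proportions, bins=20, edgecolor="black")
plt.xlabel("Proportion of red marbles per sample of 25")
plt.ylabel("Frequency over 1000 repeated samples")
plt.title("Simulated sampling distribution")
plt.show()
```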

Assessing statistical literacy and inference

Assessment instruments at the secondary school level for SL, with a focus on SI, are scarce. The situation is very different at the tertiary level; think of the web-based ARTIST project—Assessment Resource Tools for Improving Statistical Thinking—by Garfield et al. ( 2002 ), the CAOS project—Comprehensive Assessment of Outcomes in a First Statistics Course—by delMas et al. ( 2007 ), the GOALS project—Goals and Outcomes Associated with Learning Statistics—by Garfield et al. ( 2012 ), and the BLIS project—Basic Literacy in Statistics—by Ziegler and Garfield ( 2018 ). The latter project, BLIS, involves a compilation of existing items from the other projects supplemented with simulation-based questions. The items in these projects require students to think and reason, not to compute, use formulas, or recall definitions. A study by Novak ( 2014 ) shares content and design with ours as it involves the evaluation of a simulation-based intervention using a pre-post research design.

The only studies that seemed useful for our students were the ones by Watson and Callingham ( 2003 , 2004 ) and the LOCUS project (Whitaker et al., 2015 ), as both focused on grades 6 to 12. Watson and Callingham’s studies appeared to be particularly suited, as they specifically distinguished—in their organization of assessment items—between the three domains of SL. Their approach allowed us to identify students’ SL, and also their performance on the domain of SI in particular. Using archived data from 1993 to 2000, Watson and Callingham empirically developed a 6-level hierarchy of SL that helped to identify the distribution of Australian middle school students’ SL across the levels. Their hierarchical levels for SL are presented in Table  1 . A follow-up study by Callingham and Watson ( 2017 ) showed that the level construct had remained appropriate and stable over time. This finding suggests that the identified levels provide a good basis for determining the level of SL in secondary education. In addition, their longitudinal analysis indicates that the Statistical Literacy hierarchy can be used to monitor students’ progress.

Research question

This study focuses on the question: What are the effects of a Learning Trajectory for Statistical Inference on 9th-grade students’ Statistical Literacy? To answer this question, we examined the effects of the LT on students’ proficiency in the domains of SL. Although the designed LT concentrates on Statistical Inference—the SI domain of SL—we conjectured that a focus on more complex learning activities for SI would also have a positive effect on students’ understanding of the other domains of SL.

To evaluate the effects of the LT, we used a pre-post research design with an intervention group ( n  = 267) who engaged with the LT. Additionally, to further interpret the learning gains of the intervention group, we compared their results with national baseline achievements from a comparison group ( n  = 217) who followed the regular Dutch curriculum, and with level scores of Australian students (Callingham & Watson, 2017 ).

An outline of the Learning Trajectory

A Learning Trajectory (LT) is a design and a research instrument to structure and connect all elements involved in learning a particular topic. An LT consists of a set of learning goals for students, learning activities that will be used to achieve these goals, and conjectures about the students’ learning process. It includes the simultaneous consideration of mathematical goals, student thinking models, teacher and researcher models of students’ thinking, sequences of teaching tasks, and their interaction at a detailed level of analysis of processes (Clements & Sarama, 2004 ).

The designed LT introduces the key concepts for Statistical Inference to 9th-grade students by using an investigative approach with a physical black box and simulation-based methods (van Dijke-Droogers et al., 2020 ); see Table  2 . Ideas of repeated sampling and growing samples instantiate the design, both for working with the physical black box filled with marbles and for simulating samples using TinkerPlots ™. All stages of the statistical investigation cycle are addressed in the LT, as students collect both physical and simulated data, analyze their data using the sampling distribution, and interpret the results to answer the question posed. The emphasis is on interpreting sample data and reasoning about probability. Recent views on statistical models and modeling (Büscher & Schnell, 2017 ; Manor & Ben-Zvi, 2017 ; Patel & Pfannkuch, 2018 ), and educational guidelines on the use of context, digital tools, exchange and comparison of sample results, making predictions, and engagement in both physical and simulation-based activities, are embedded in the design.

The investigative approach and learning activities in the more complex SI domain also attend to the other domains of SL. For example, the AC domain, average and chance, is addressed as students summarize their obtained sample data in measures of center and spread. As another example, the graphing part of the GV domain is given attention in the visualizations of both sample results and population models, and the variation part is targeted as students explore results of repeated samples.

The LT comprises eight learning steps that are split into two similar parts of four. Part one considers only categorical data and includes the following steps: (1) experimenting with a physical black box, (2) visualizing distributions, (3) statistical modeling using TinkerPlots ™, (4) applying models in new real-life contexts. Subsequently, in part two, LT steps (5) to (8) include similar steps, now using more complex numerical data. The eight steps of the LT were organized in two sequences of six 45-min lessons, with a total of 12 lessons. More details about the design of the LT are elaborated by van Dijke-Droogers et al. ( 2021 ).

Design of the assessment instrument

To evaluate the effects of the designed LT, we needed an assessment instrument to measure 9th-grade students’ SL, and SI in particular. In line with Novak ( 2014 ), we chose a pre-post research design to measure the effects of the LT on students’ proficiency, i.e., students’ progress when working with the LT. For the design of the tests, following Ziegler and Garfield ( 2018 ), we used existing items from validated tests by Watson and Callingham ( 2003 , 2004 ) and expanded these with newly designed items on Statistical Inference and simulation.

The pre- and posttest each contained ten clusters of items. Each cluster included two to six items, with a total of 39 and 34 items on the pre- and posttest, respectively. Both tests had a similar composition and a time-duration of 45 min. For each test, we selected five clusters of items from Watson and Callingham ( 2004 ) that covered the three domains of SL. We selected one cluster item applicable for secondary level from the CAOS test (delMas et al., 2007 ). As context was found to be an important factor affecting the difficulty of items for students, the selection of items was based on educational background, as well as on familiarity with the context. Table 3 provides an overview of the composition of the pre- and posttest, with reference to sources and accompanying domains of SL.

Figure 1 shows an example of an item from a validated test, in the AC domain. The level scores in this item refer to Watson and Callingham's (2003) hierarchical levels 1 to 6 for SL, supplemented with the null level for incorrect or uncompleted items. As Fig. 1 shows, not every level could be reached on this item: it was not possible to formulate an answer at levels 1 and 2 (the informal and inconsistent levels), because every possible answer either includes the given context information (level 3 or higher) or is incorrect (level 0).

Figure 1. Item with corresponding level description from Watson and Callingham (2004, p. 138)

Similarly, based on the item context, some items could only be coded to a maximum level score of 4 instead of 6. As such, for the selection of items, the chosen items had to be similar in maximum level score on the pre- and posttest, for each domain of SL, to compare students’ scores on both tests. The average maximum scores for SI items on the pre- and posttest were similar, both around 5.6, and, for the GV items, the average maximum scores were also similar, with around 3.7 for both tests. For AC, however, the maximum scores on the selected items in the pre- and posttest were rather different, with 5.7 and 4.6, respectively. To compensate for this difference, a correction was applied to the posttest results, so that students’ level scores on the pre- and posttest could be properly compared. Using the corrected AC scores, the average maximum score on SL was about 5.5 for both tests. As such, we considered the selected items on the pre- and posttest comparable for both tests, on all domains of SL.

As we were specifically interested in the effects of the LT on students’ understanding of the concepts of SI as addressed in the LT, four additional cluster items were designed for this study, focusing on the SI domain. For the design, we chose recognizable contexts and used the structure and phrasing of items from the two previously described tests. The level scores of these new items were, as with the existing items, based on Watson and Callingham’s ( 2003 ) level descriptions, and on the exemplary items they formulated on the SI domain (2004). See Fig.  2 for an example.

Figure 2. Newly designed item with corresponding level description on the SI domain of Statistical Literacy (SI, Statistical Inference)

To analyze the validity of the designed assessment instrument for our Dutch 9th-grade students, we conducted two pilot tests of the pretest in different classrooms, each consisting of 25 students. Concerning the concurrent validity of the newly designed SI items, we expected the students to score on the newly designed SI items at a similar level to the existing SI items from Watson and Callingham (2004). Students' average level scores in the pilots on newly designed and existing SI items were not significantly different (M_new = 2.49, SD_new = 0.71, M_ex = 2.78, SD_ex = 1.38, n = 50, t(49) = −1.6, p = .11). For the other domains, GV and AC, all items were from already validated tests. To assess the content and construct validity of all test items for our students, the results of each pretest pilot were used for in-depth discussion with experts in this area on content, construct, vocabulary, and clarity. In a similar way, the posttest was piloted in the same two classrooms with 25 students each. The posttest pilots took place after the large-scale implementation of the pretest. Based on our pretest experiences, the initially designed posttest was modified slightly—for example, the number of items was reduced from 38 to 34. The posttest was piloted 4 weeks after the pretest pilots. The 25 students did not follow any statistics education in the intervening weeks. The results of the pre- and posttest pilots were not significantly different (−0.08, t(51) = 0.84, p = .40). Additionally, the posttest was thoroughly examined to ensure the pre- and posttest were comparable.

Concerning the reliability of the tests, Cronbach's alpha values were 0.84 and 0.85 on the pre- and posttest, respectively, indicating good reliability (Taber, 2018). To assess the difficulty of the items, p values were calculated. To assess the discrimination of the items, we used Rit (item–test correlation) and Rir (item–rest correlation), following classical test theory. See Table 4 for an overview of the item characteristics on the pre- and posttest, with accompanying ratings. For the pretest, we observed moderately difficult items, with four easy items (p value > .80) and one difficult item (p value < .20). Rit and Rir values > 0.30 indicate good items, scores between 0.20 and 0.30 medium items, and scores < 0.20 poor items (Ebel & Frisbie, 1991). The pretest Rit values indicated 5 poor, 12 moderate, and 22 good items, and the Rir scores indicated 8 poor, 16 moderate, and 15 good items. For the posttest, we observed moderately difficult items, with 4 easy items and no difficult items. The Rit values indicated 1 poor, 9 moderate, and 24 good items, and the Rir scores indicated 2 poor, 13 moderate, and 19 good items. We considered these item scores on the pre- and posttest acceptable. The pre- and posttest can be found in Appendices A and B.
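
As a rough illustration of the reliability and discrimination statistics reported above, the sketch below computes Cronbach's alpha and the Rit/Rir correlations from a students × items matrix of level scores; the randomly generated scores are placeholders, not the study's data.

```python
# Sketch of Cronbach's alpha and item-test (Rit) / item-rest (Rir) correlations.
# The random 0-6 level scores below are placeholders, not the study's data.
import numpy as np

rng = np.random.default_rng(1)
scores = rng.integers(0, 7, size=(267, 39)).astype(float)  # hypothetical students x items matrix

def cronbach_alpha(x):
    """k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = x.shape[1]
    return k / (k - 1) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

def item_discrimination(x):
    """Rit: correlation of each item with the total score; Rir: with the total minus that item."""
    total = x.sum(axis=1)
    rit = np.array([np.corrcoef(x[:, j], total)[0, 1] for j in range(x.shape[1])])
    rir = np.array([np.corrcoef(x[:, j], total - x[:, j])[0, 1] for j in range(x.shape[1])])
    return rit, rir

rit, rir = item_discrimination(scores)
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
print("good items (Rit > 0.30):", int((rit > 0.30).sum()))
print("poor items (Rir < 0.20):", int((rir < 0.20).sum()))
```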

Participants

To recruit the intervention group, we issued a national call, for instance in newsletters for math teachers and on social media, inviting Dutch teachers who were willing to implement the LT in their regular mathematics lessons. Eleven of them applied, with a total of 267 9th-grade students (aged 14–15 years) from 13 classes in 5 different schools. Two teachers participated with 2 of their classes. The teachers were trained in the LT during 2 similar 3-hour sessions. The first session focused on LT steps 1–4 and included the 45-minute lessons 1 to 6. The teachers worked through the students' lessons and materials themselves, guided by the researcher. The second session was similar to the first one and concentrated on LT steps 5–8, lessons 7 to 12. The project materials consisted of a teacher guidebook and student materials, such as worksheets, datasets, and physical black boxes with marbles. The teachers of the intervention group dropped all the regular 9th-grade statistics lessons to make room for the LT. The participating students from the intervention group were in the pre-university stream and thus belonged to the 15% best-performing students in our educational system.

Due to practical constraints—that is, the number of participants in the intervention group—it was not possible to set up a randomized trial with a control group. To be able to indicate the effect of the LT in comparison with the regular curriculum, we established a "Dutch baseline" from a comparison group. For the participants in the comparison group, through a national call, we invited teachers who were interested in administering a test to identify the SL of their students. Six teachers applied with a total of 217 students in 10 classrooms. When the comparison group was registered, all participants had recently completed the regular 9th-grade statistics education, consisting of 10–16 lessons. The regular curriculum focused on the AC and GV domains of SL, as described earlier in the section on the domains of SL. To identify these students' SL, and to allow a sound comparison with the intervention group, two tests were administered with a 4-week interval between them, and the average of the results was used. The two tests consisted of items from the pre- and posttest of the intervention group. In retrospect, the results of the comparison group on the two tests were found to be similar, both for SL as a whole (−0.02, t(216) = 0.4, p = .65) and for the domains SI (−0.04, t(216) = 0.9, p = .40), GV (+0.09, t(216) = 1.4, p = .18), and AC (−0.11, t(216) = 1.1, p = .26). This could be expected, since no statistics education was given during the intervening 4 weeks. The average results on both tests were used as Dutch baseline achievements for the SL of 9th graders.

We are aware that teachers in the intervention group, who were willing to “go the extra mile”, were possibly more motivated to teach statistics. However, the teachers in the comparison group also volunteered, mainly because they were interested in their students’ performance in the field of statistics. In this regard, the teachers in both groups had an above-average interest in teaching statistics. Students in both groups belonged to the 15% best-achieving students in the Dutch educational system. They had all successfully completed the regular statistics curriculum for the pre-university stream in grades 7 and 8. According to their performance on mathematics and statistics tests, students in both the intervention and the comparison group performed at an average level for their grade. As such, we assumed both groups to be comparable.

Data collection

The data for the intervention group consisted of pre- and posttests. The pretest was administered in months 7–8 of the school year 2019–2020. The participating teachers administered the test to their own students, following clear testing instructions, during their regular 45-min mathematics lessons. The posttest was administered in a similar way in months 9–10 of the school year, after completion of the LT, by the teachers during their regular lessons at their own school.

The data for the comparison group consisted of two tests, taken in months 8 and 9 of the school year. The average of both tests was used to identify the Dutch baseline. The tests were administered by the participating teachers, following clear testing instructions, during their regular mathematics lessons.

Data analysis

For the analysis, two assessors first graded the pre- and posttests of the intervention group with level scores on the domains of SL. Second, we compared the scores of the intervention group with the Dutch baseline achievements from the comparison group and with the scores of Australian students (Callingham & Watson, 2017).

First, to assess students’ proficiency in the domains of SL, the pre- and posttest data for the intervention group were coded with the level scores 0–6 for SL (Watson & Callingham, 2003), as described in the section on the assessment instrument. To indicate students’ progress, we compared changes in students’ pre- and posttest scores. Graphical representations were used for data exploration. Several statistical measures were calculated, such as measures of center and spread and proportions of level scores. For significance, we used paired t tests to compare pre- and posttest results. For students’ proficiency level on SL, we calculated the mean of students’ average scores on the AC, GV, and SI domains, which compensated for the unequal number of items per domain.
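As a rough illustration of this scoring scheme, the sketch below averages the coded 0–6 level scores per domain, takes the unweighted mean over the three domain means as the overall SL score, and applies a paired t test to the pre- and posttest SL scores. The column names, item counts per domain, and random data are hypothetical, not the study’s data.

```python
# A minimal sketch of the reported scoring scheme; column names, item counts,
# and the random data are hypothetical.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)

def make_coded_test(n_students: int) -> pd.DataFrame:
    """Stand-in for a coded test: columns like 'AC_0', ..., 'SI_11' with level scores 0-6."""
    cols = [f"{d}_{i}" for d, n in (("AC", 6), ("GV", 21), ("SI", 12)) for i in range(n)]
    return pd.DataFrame(rng.integers(0, 7, size=(n_students, len(cols))), columns=cols)

def sl_score(df: pd.DataFrame) -> pd.Series:
    """Unweighted mean over the three domain means, compensating for unequal item counts."""
    domain_means = pd.DataFrame({d: df.filter(like=d).mean(axis=1) for d in ("AC", "GV", "SI")})
    return domain_means.mean(axis=1)

pre, post = make_coded_test(267), make_coded_test(267)
t, p = stats.ttest_rel(sl_score(post), sl_score(pre))  # paired t test on overall SL
print(round(t, 2), round(p, 3))
```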

Second, to further interpret the effects of the LT on students’ SL, we compared our findings with a Dutch baseline. For this baseline, the average scores of the comparison group on the two tests were used. The test data were coded with the level scores 0–6, in the same way as for the intervention group. For significance, we used independent samples t tests to compare the intervention group scores with the Dutch baseline. Additionally, we compared our findings with those of Callingham and Watson, in particular with the distribution of Australian students in grades 6 to 9 across the levels of SL. As our assessment instrument was mainly based on their validated tests and hierarchical level construct for SL, we considered the results for our students to be comparable to theirs. In this regard, we expected the level distribution of our 9th graders to be broadly similar to the distribution they found for grade 9, with most students scoring at levels 3–4 for SL. To compare our students’ average level scores with those of Australian students (Callingham & Watson, 2017), we estimated the Australian students’ average level score per grade from the distribution of students across the levels. For significance, we used independent samples t tests and chi-squared tests on the distribution over levels.
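The sketch below illustrates these two comparisons: an independent samples t test between the intervention group and the Dutch baseline, and an average level score estimated as the level values weighted by the published share of students per level. All numbers are made up for illustration and are not the study’s or the Australian data.

```python
# A minimal sketch with made-up numbers: (1) independent samples t test between
# intervention group and Dutch baseline, (2) estimating an average level score
# from a distribution of students over SL levels 1-6.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
intervention_sl = rng.normal(3.3, 0.7, 267)   # hypothetical per-student SL level scores
baseline_sl = rng.normal(3.0, 0.7, 217)
t, p = stats.ttest_ind(intervention_sl, baseline_sl)

levels = np.arange(1, 7)
share_per_level = np.array([0.05, 0.20, 0.40, 0.25, 0.08, 0.02])  # hypothetical distribution
estimated_mean_level = float(levels @ share_per_level)            # weighted mean over levels
print(round(t, 2), round(p, 4), round(estimated_mean_level, 2))
```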

For the reliability of the analysis, a second coder independently graded a random set of 5% (250 items) of the pre- and posttest data containing students’ reasoning. The second coder agreed on 83% of the codes. Deviating codes, which differed by at most one or two levels, were discussed until agreement was reached.
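A simple way to express such double-coding agreement is the proportion of items for which both coders assigned the same level; the sketch below, with hypothetical codes, also reports the largest level difference among deviating codes.

```python
# A minimal sketch of the double-coding agreement check; the codes are hypothetical.
import numpy as np

coder_1 = np.array([3, 4, 2, 5, 3, 4, 1, 3])  # level codes assigned by the first coder
coder_2 = np.array([3, 4, 3, 5, 3, 4, 1, 2])  # level codes assigned by the second coder

agreement = float(np.mean(coder_1 == coder_2))          # share of identical codes
max_deviation = int(np.max(np.abs(coder_1 - coder_2)))  # largest difference among deviations
print(f"agreement: {agreement:.0%}, largest deviation: {max_deviation} level(s)")
```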

Results

In this section, we first present the level scores for the intervention group on the domains of SL at the pre- and posttest. Second, we compare these results with the Dutch baseline achievements from the comparison group and with findings from Callingham and Watson (2017).

Students’ level scores for SL

Table 5 displays students’ proficiency on the domains of SL in level scores at the pre- and posttest for the intervention group, including their progress from pre to post.

Regarding students’ progress on SL, a paired t test between the pre- and posttest for the intervention group indicated that the average posttest score was significantly higher than the pretest score (+ 0.68, t(266) = 13.0, p < .0005). Students’ results on SL confirmed our conjecture that following the LT had a clear positive effect on students’ SL.

With regard to the SI domain of SL, a paired t test between the pre- and posttest indicated that students’ average level score on the posttest was considerably higher than on the pretest (+ 0.89, t (266) = 15.8, p  < .0005). These results were in line with our expectations, as we hypothesized that the investigative approach and more complex learning activities for SI as embedded in the LT would support all domains of SL, and SI in particular.

With respect to students’ progress on GV, a paired t test between the pre- and posttest for the intervention group indicated that their posttest score was significantly higher than their pretest score (+ 0.52, t(266) = 8.7, p < .0005). Regarding students’ level in the GV domain, it is important to note that the average maximum scores for the test items used in this domain were, as elaborated earlier in the Methods section, considerably lower than for items in the other domains. Therefore, the GV level score cannot be used for comparison with the other domains.

Concerning students’ progress on the AC domain, a paired t test between the pre- and posttest for the intervention group indicated that their posttest score was significantly higher than their pretest score (+ 0.63, t(266) = 6.7, p < .05). The findings on the domains of SL confirmed our conjecture that following the LT had a clear positive effect on students’ SL and SI, and more moderate effects on the GV and AC domains.

Students’ level scores in comparison with the Dutch baseline

Table 6 displays the Dutch baseline achievements from the comparison group on the domains of SL in level scores, including a comparison with the pre- and posttest for the intervention group.

When comparing the results for SL on the posttest, an independent samples t test indicated that students who followed the LT were significantly more proficient in SL than the Dutch baseline from the comparison group, who followed the regular curriculum (+ 0.32, t(482) = 4.9, p < .0005). Students’ results on SL confirmed our conjecture that following the LT had a clear positive effect on students’ SL.

With regard to the SI domain of SL, on the posttest, an independent samples t test indicated that the level score for the intervention group, who followed the LT, was considerably higher than the Dutch baseline achievements from the comparison group (+ 0.65, t(482) = 8.7, p < .0005). Concerning the GV domain of SL, an independent samples t test indicated that the posttest score for the intervention group was slightly, but significantly, higher than the Dutch baseline (+ 0.26, t(482) = 3.7, p < .05). Although we expected the intervention group that followed the LT with a focus on SI to progress in the other domains, we did not expect them to reach higher scores than the baseline achievements of students who followed the regular curriculum with its focus on GV and AC. For the AC domain, an independent samples t test indicated that the posttest score for the intervention group that followed the LT was comparable to the Dutch baseline achievements (+ 0.06, t(482) = 0.6, p = .52). These findings confirmed our conjecture that the LT, with its focus on SI, also stimulated the other domains of SL.

When comparing the results for SL on the pretest, an independent samples t test indicated that the average level score for the intervention group on SL was significantly lower than the Dutch baseline (− 0.36, t(482) = 5.9, p < .0005). With regard to the SI domain, on the pretest, the score for the intervention group was slightly, but significantly, lower than the Dutch baseline (− 0.24, t(482) = 3.7, p < .05). We did not expect this lower score. Although the students in the comparison group, the Dutch baseline, had followed the regular statistics curriculum before the test, the SI domain was not offered in the regular lessons, so we expected a similar score for both groups. Regarding the GV domain, the score for the intervention group was, as expected, significantly lower than the Dutch baseline level score (− 0.26, t(482) = 4.2, p < .05). Regarding students’ level in the GV domain, it is important to note that the average maximum scores for the test items used in this domain were, as elaborated earlier in the Methods section, considerably lower than for items in the other domains. Therefore, the GV level score cannot be used for comparison with the other domains. For the AC domain, the score for the intervention group was, as expected, significantly lower than the baseline (− 0.57, t(482) = 4.8, p < .0005). The findings on the domains of SL confirmed our conjecture that following the LT had a clear positive effect on students’ SL and SI, and more moderate effects on the GV and AC domains.

The lower scores on the pretest for the intervention group, in comparison with the Dutch baseline, were to be expected, as the intervention group had not had 9th-grade statistics lessons prior to the pretest. Furthermore, the lower level score of − 0.36 on SL for the intervention group on the pretest relative to the Dutch baseline turned out to be almost equal in size to their higher level score of + 0.32 on the posttest. Since the intervention group had an educational disadvantage of about one school year relative to the baseline at the pretest, their score on the posttest could be interpreted as an advantage of almost one school year. Students’ results on SL confirmed our conjecture that following the LT had a clear positive effect on students’ SL.

Students’ level score on SL in comparison with those of Australian students

To further interpret students’ proficiency, we compared our results with those of Australian students (Callingham & Watson, 2017). In doing so, we compared both the distribution of students over the levels of SL and students’ average level scores on SL. Table 7 presents the distribution of Dutch students over the levels of SL on the pre- and posttest, as well as the distribution of Australian students.

The comparison of students’ distribution over the levels is displayed in Fig. 3. The first two graphs compare the pre- and posttest scores for the intervention group with those of Australian students. The third graph compares the results of the Dutch baseline with those of Australian students. For the Dutch baseline, the results on the two tests for the comparison group were aggregated, since these results were highly similar. First, the graphs in Fig. 3 illustrate that, for the lower levels, the percentage of post-intervention students was lower than the percentage of Dutch baseline or Australian students in grades 6, 7, and 8. Second, the graphs show that, for the higher levels, the percentage of post-intervention students was higher. The pretest scores for the intervention group corresponded most closely to the performance of Australian students in grade 6 (Callingham & Watson, 2017) and, as such, were lower than we expected. A chi-squared test on the distribution over levels in percentages, comparing the pretest scores for the intervention group with each Australian grade from 6 to 9, yielded the highest p value, and with that the best fit, for grade 6 (χ²(4) = 6.26, p = .18). The pretest mean level score for the intervention group, 2.60 (0.70), also corresponded to the estimated mean level score for Australian grade 6. The estimates per grade were calculated using the distribution of their students across the levels.

Table 8 summarizes the comparison of the intervention group and the Dutch baseline with the Australian grade results, based on the distribution of students over the levels and the average level scores. Regarding the posttest scores for the intervention group, the results corresponded most closely to Australian grades 7–8. The chi-squared test confirmed the similarity between the posttest scores for the intervention group and grades 7–8, as the highest p values found were χ²(4) = 6.2, p = .184 and χ²(5) = 11.3, p = .05, for grades 7 and 8, respectively. The posttest average level score for the intervention group, 3.28 (0.69), also corresponded most closely to the estimated level score for Australian grade 8 (3.3). According to the findings by Callingham and Watson, the Dutch baseline corresponded most closely to Australian grades 6–7. The chi-squared test confirmed this similarity, as the highest p values found were for Australian grades 6 and 7 (χ²(4) = 9.3, p = .05 and χ²(4) = 5.8, p = .22, respectively). The mean level score for the Dutch baseline, 2.96, also corresponded to the estimated level scores for Australian grades 6 and 7 (2.6 and 3.1, respectively).

Figure 3. Comparison of students’ level scores on Statistical Literacy (SL) with Australian grade results from the findings by Callingham and Watson (2017)
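To illustrate the grade-matching procedure described above, the sketch below tests the observed Dutch level counts against each Australian grade’s distribution with a chi-squared test and selects the grade with the highest p value as the closest match. All counts and distributions are hypothetical, not the published Dutch or Australian figures.

```python
# A minimal sketch of the best-fitting-grade comparison; all distributions are hypothetical.
import numpy as np
from scipy import stats

dutch_counts = np.array([10, 45, 110, 70, 25, 7])  # observed students per SL level (1-6)
australian_shares = {                              # hypothetical shares per level, per grade
    "grade 6": np.array([0.10, 0.28, 0.38, 0.18, 0.05, 0.01]),
    "grade 7": np.array([0.07, 0.24, 0.39, 0.22, 0.06, 0.02]),
    "grade 8": np.array([0.05, 0.20, 0.40, 0.25, 0.08, 0.02]),
    "grade 9": np.array([0.04, 0.17, 0.38, 0.28, 0.10, 0.03]),
}

fits = {}
for grade, share in australian_shares.items():
    expected = share * dutch_counts.sum()          # expected counts with the same total
    chi2, p = stats.chisquare(dutch_counts, f_exp=expected)
    fits[grade] = (chi2, p)

best_fit = max(fits, key=lambda g: fits[g][1])     # grade with the highest p value
print(best_fit, fits[best_fit])
```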

Concerning the effects of the LT, the posttest score on SL for the intervention group that followed the LT appeared to be more advanced than the Dutch baseline score from the comparison group. Moreover, the comparison with the findings by Callingham and Watson (2017) indicated that the advantage of the intervention group on SL again corresponded, as in our earlier findings, to about one school year. Furthermore, the calculated estimates of students’ average level score per grade from the study by Callingham and Watson indicated that students’ progress per year from grades 6 to 9 is roughly 0.25. When we compare the posttest SL level score of the intervention group, 3.28 (0.78), with the Dutch baseline, 2.96 (0.69), the difference of 0.32 again corresponds to a level difference of more than one school year.

Conclusion and discussion

As the field of statistics and its education are changing rapidly, knowledge about efficient learning trajectories is needed for the successful and sustainable implementation of curriculum changes, both among researchers and teachers (Ben-Zvi et al., 2018 ; Biehler et al., 2018 ). The aim of the study presented here was to evaluate the effects of a Learning Trajectory for Statistical Inference on 9th-grade students’ Statistical Literacy, and on making inferences in particular. Theories of informal Statistical Inference complemented by ideas of growing samples and repeated sampling guided the design of the Learning Trajectory.

Although Statistical Inference is considered a more complex domain of Statistical Literacy, this study demonstrated that the designed Learning Trajectory for Statistical Inference had a significant positive effect on all domains of Statistical Literacy. As such, engaging in (informal) inferential activities also promoted students’ capacity in the other Statistical Literacy domains. This insight into the joint development of (informal) Statistical Inference and Statistical Literacy allows, in educational practice, for an early introduction of Statistical Inference. An early introduction can support a sustainable change in students’ understanding of the statistical concepts required for both making inferences and Statistical Literacy.

Currently, the Dutch curriculum, like that of many other countries, evolves from descriptive statistics in the earlier years to an inferential focus later on. In the early years (pre-10th grade), the focus is on the Statistical Literacy domains of graphing and variation, and average and chance. Later on, attention is given to the domain of Statistical Inference. The results of this research advocate an earlier introduction of Statistical Inference. The positive effects of the Learning Trajectory on the other domains of Statistical Literacy are presumably due to the inquiry-based approach of the Learning Trajectory, in which all phases of the statistical investigation cycle are addressed several times, that is, posing a question, collecting data, and analyzing the data to answer the question posed. This is consistent with previous studies and theories that advocate a holistic approach (Ainley et al., 2006; Franklin et al., 2007; Lehrer & English, 2017; Van Dijke-Droogers et al., 2017).

In discussing these conclusions, there are a few points to consider. The first involves the low level of proficiency of Dutch students on Statistical Literacy relative to Australian students (Callingham & Watson, 2017). We expected Dutch students to score at grade 9 level on the posttest, and not at grades 6–7 and grades 7–8 for the Dutch baseline and the intervention group, respectively. These lower scores may be due to the fact that the Dutch pre-10th-grade statistics curriculum is more limited than the Australian curriculum for the students in Callingham and Watson’s research ( https://www.australiancurriculum.edu.au/ ). Another issue in this respect is that the average maximum attainable score on the graphing and variation items on both tests was lower (about 3.7) than for the other domains (about 5.5), which negatively affected students’ overall Statistical Literacy scores. When we compensate for the lower graphing and variation item scores, the Statistical Literacy average level scores of the participating students increase by about 0.3. When we then compare the adjusted literacy scores with the Australian grade results, the grade results for Dutch students increase by almost one school year and, as such, are closer to our expectations. Regarding graphing and variation, a related issue is whether working by hand or with digital tools affects students’ learning. In our study, the intervention students mainly worked with digital tools, while the Dutch baseline students mainly worked by hand. The posttest scores for the intervention students on the GV domain were significantly higher than the scores for the baseline students. As such, working with digital tools for graphing and variation seems to promote students’ understanding of the GV domain.

The second point concerns effect sizes. The use of effect sizes is complex and disputed and only makes sense when comparing similar studies (Bakker et al., 2019; Cohen, 1988; Schäfer & Schwarz, 2019; Simpson, 2017). The only study we could find that is similar enough to judge the differences found is Novak (2014), since it shares content and design with ours. Novak’s study involved the evaluation of a simulation-based intervention for an introductory statistics course at the university level. A pre-post research design was used with two random intervention groups and a total of 64 students, where both groups followed slightly different simulation-based interventions. By comparing the pre- and posttest, Novak found a significant learning effect on students’ statistical knowledge with Cohen’s d = 0.45, while the effect on students’ conceptual knowledge approached significance with Cohen’s d = 0.18. In comparing our results with these, the effects of the Learning Trajectory on students’ Statistical Literacy and on the Statistical Inference domain appeared considerably positive, with Cohen’s d = 0.90 and d = 1.12, respectively, and we also found clear positive effects on the other two domains.
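For reference, one common way to compute Cohen’s d in a pre-post design divides the mean pre-post gain by the pooled standard deviation of the two measurements. Whether this exact variant matches the calculation used in the study is an assumption, and the data in the sketch below are hypothetical.

```python
# A minimal sketch of a pre-post Cohen's d; the variant (pooled SD of pre and post)
# and the random data are assumptions, not the authors' exact computation.
import numpy as np

def cohens_d_prepost(pre: np.ndarray, post: np.ndarray) -> float:
    """Mean pre-post gain divided by the pooled SD of the pre- and posttest scores."""
    pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
    return float((post.mean() - pre.mean()) / pooled_sd)

rng = np.random.default_rng(2)
pre = rng.normal(2.6, 0.7, 267)    # hypothetical pretest SL level scores
post = rng.normal(3.3, 0.7, 267)   # hypothetical posttest SL level scores
print(round(cohens_d_prepost(pre, post), 2))
```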

Our study has the following limitations. First, we worked with students from the pre-university stream, the 15% best-performing students in our educational system, for both the intervention and the comparison group. As such, the results of this research are not generalizable to regular classrooms without further research. Second, due to practical constraints (that is, the number of participants in the intervention group), it was not possible to set up a randomized control group. To be able to indicate the effect of the Learning Trajectory in comparison with the regular curriculum, we established a Dutch baseline from a comparison group. This comparison group had already completed the regular 9th-grade statistics curriculum. To determine the level of Statistical Literacy of this group, and to be able to directly compare their results with the intervention group, we administered two tests that consisted of items from the pre- and posttest of the intervention group. Although a classical randomized trial with a control group has added value (for example, it enables determining the initial and final levels for the regular 9th-grade curriculum), using the Dutch baseline helped us to indicate the effect of the Learning Trajectory. Third, the items of the pre- and posttest were not identical. Despite careful alignment, through posttest pilots and expert consultation, differences in context, question wording, and visualizations may have affected the results. However, the results of the comparison group (n = 217) on both tests, administered at an interval of only 4 weeks, did not differ significantly, either on Statistical Literacy or on any of the three domains. This finding supports our assumption that both tests were comparable. Fourth, we did not examine differences due to teachers’ or students’ backgrounds. We recommend taking both issues into account in future research.

We present two recommendations. First, in this study, the levels of SL identified by Watson and Callingham (2003, 2004) proved to be well applicable for evaluating the effects of the Learning Trajectory. The development of a pre- and posttest, consisting of items from validated tests (mainly from Watson and Callingham) supplemented by equivalent newly designed items on Statistical Inference, enabled us to assess students’ Statistical Literacy, and their Statistical Inference in particular. Both newly designed and existing test items were found appropriate, with Cronbach’s alpha values of 0.84 and 0.85 on the pre- and posttest. In analyzing the results, the levels of Statistical Literacy proved useful for examining students’ proficiency. Furthermore, the findings by Callingham and Watson (2017) proved useful for interpreting students’ results and, with that, the effect of the Learning Trajectory. Therefore, we recommend that researchers and educators who intend to investigate the Statistical Literacy of secondary school students use the levels by Watson and Callingham for assessing and evaluating students’ results.

Second, for the participating teachers in the intervention group, implementing the Learning Trajectory required considerable effort. In our study, 11 teachers from five different schools were willing to invest in the trajectory. The load for teachers in the comparison group was limited to administering two tests, making it easier for them to participate. Using a Dutch baseline from a comparison group proved to be of added value for interpreting the results of the intervention group. Therefore, we recommend that researchers and educators who are interested in the effects of an LT, but who are for practical reasons confined to an intervention group that demands considerable effort from participating teachers, consider using national baseline achievements from a comparison group. Furthermore, as highlighted by several researchers, much work remains to be done to obtain a good understanding of how to assess the practical and substantive effects of educational interventions. This study contributes by presenting a pre-post research design in which students’ results were compared with Dutch baseline achievements from a comparison group and with findings from international studies.

To conclude, the Learning Trajectory strongly improved students’ performance on Statistical Literacy and Statistical Inference, and we also found significant positive effects for the other two domains, graphing and variation, and average and chance. Although the Learning Trajectory did not focus on these two domains, the investigative approach and the more complex learning activities for Statistical Inference embedded in the trajectory appeared to have a positive effect here as well. These findings indicate that the Learning Trajectory can be used to expand the 9th-grade curriculum with the Statistical Inference domain, without neglecting the current educational goals for the other domains of Statistical Literacy.

References

Ainley, J., Pratt, D., & Hansen, A. (2006). Connecting engagement and focus in pedagogic task design. British Educational Research Journal, 32(1), 23–38.


Bakker, A. (2004). Design research in statistics education . Utrecht University.


Bakker, A., Cai, J., English, L., Kaiser, G., Mesa, V., & van Dooren, W. (2019). Beyond small, medium, or large: Points of consideration when interpreting effect sizes. Educational Studies in Mathematics, 102 , 1–8.

Ben-Zvi, D., Bakker, A., & Makar, K. (2015). Learning to reason from samples. Educational Studies in Mathematics, 88 (3), 291–303.

Ben-Zvi, D., Gravemeijer, K., & Ainley, J. (2018). Design of statistics learning environments. In D. Ben-Zvi, K. Makar, & J. Garfield (Eds.), International handbook of research in statistics education (pp. 473–502). Springer.


Ben-Zvi, D., Aridor, K., Makar, K., & Bakker, A. (2012). Students’ emergent articulations of uncertainty while making informal statistical inferences. ZDM—The International Journal on Mathematics Education, 44 (7), 913–925.

Biehler, R., Ben-Zvi, D., Bakker, A., & Makar, K. (2013). Technology for enhancing statistical reasoning at the school level. In M. A. Clements, A. Bishop, C. Keitel, J. Kilpatrick, & F. Leung (Eds.), Third international handbook of mathematics education (pp. 643–690). Springer.

Biehler, R., Frischemeier, D., Reading, C., & Shaughnessy, J. M. (2018). Reasoning about data. In D. Ben-Zvi, J. Garfield, & K. Makar (Eds.), International handbook of research in statistics education (pp. 139–192). Springer.

Burrill, G., & Biehler, R. (2011). Fundamental statistical ideas in the school curriculum and in training teachers. In C. Batanero, G. Burrill, & C. Reading (Eds.), Teaching statistics in school mathematics: Challenges for teaching and teacher education (A joint ICMI/IASE Study) (pp. 57–69). Springer.

Büscher, C., & Schnell, S. (2017). Students’ emergent modeling of statistical measures—A case study. Statistics Education Research Journal, 16 (2), 144–162.

Callingham, R., & Watson, J. M. (2017). The development of statistical literacy at school. Statistics Education Research Journal, 17 (1), 181–201.

Castro Sotos, A. E., Vanhoof, S., van Den Noortgate, W., & Onghena, P. (2007). Students’ misconceptions of statistical inference: A review of the empirical evidence from research on statistics education. Educational Research Review, 1 (2), 90–112.

Clements, D. H., & Sarama, J. (2004). Learning trajectories in mathematics education. Mathematical Thinking and Learning, 6 (2), 81–89.

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Academic Press.

delMas, R. C., Garfield, J., Ooms, A., & Chance, B. (2007). Assessing students’ conceptual understanding after a first course in statistics. Statistics Education Research Journal, 6 , 28–58.

Ebel, R. L., & Frisbie, D. A. (1991). Essentials of educational measurement (5th ed.) . Prentice-Hall.

Franklin, C., Kader, G., Mewborn, D., Moreno, J., Peck, R., Perry, M., & Scheaffer, R. (2007). Guidelines for assessment and instruction in statistics education (GAISE) report. American Statistical Association.

Gal, I. (2002). Adults’ statistical literacy: Meaning, components, responsibilities. International Statistical Review, 70 (1), 1–25.

Garfield, J., Ben-Zvi, D., Le, L., & Zieffler, A. (2015). Developing students’ reasoning about samples and sampling variability as a path to expert statistical thinking. Educational Studies in Mathematics, 88 (3), 327–342.

Garfield, J., delMas, R., & Chance, B. (2002). The assessment resource tools for improving statistical thinking (ARTIST) Project. NSF CCLI grant ASA- 0206571. https://app.gen.umn.edu/artist/

Garfield, J., delMas, R., & Zieffler, A. (2012). Developing statistical modelers and thinkers in an introductory, tertiary-level statistics course. ZDM, 44(7), 883–898.

Konold, C., & Pollatsek, A. (2002). Data analysis as the search for signals in noisy processes. Journal for Research in Mathematics Education, 33 (4), 259–289.

Konold, C., Harradine, A., & Kazak, S. (2007). Understanding distributions by modeling them. International Journal of Computers for Mathematical Learning, 12 (3), 217–230.

Lehrer, R., & English, L. D. (2017). Introducing children to modeling variability. In D. Ben-Zvi, J. Garfield, & K. Makar (Eds.), International handbook of research in statistics education (pp. 229–260). Springer.

Makar, K., & Rubin, A. (2009). A framework for thinking about informal statistical inference. Statistics Education Research Journal, 8 (1), 82–105.

Makar, K., & Rubin, A. (2018). Learning about statistical inference. In D. Ben-Zvi, K. Makar, & J. Garfield (Eds.), International handbook of research in statistics education (pp. 261–294). Springer.

Manor, H., & Ben-Zvi, D. (2017). Students’ emergent articulations of statistical models and modeling in making informal statistical inferences. Statistics Education Research Journal, 16 (2), 116–143.

Novak, E. (2014). Effects of simulation-based learning on students’ statistical factual, conceptual, and application knowledge. Journal of Computer Assisted Learning, 30 (2), 148–158.

Paparistodemou, E., & Meletiou-Mavrotheris, M. (2008). Developing young students’ informal inference skills in data analysis. Statistics Education Research Journal, 7 (2), 83–106.

Patel, A., & Pfannkuch, M. (2018). Developing a statistical modeling framework to characterize Year 7 students’ reasoning. ZDM, 50 (7), 1197–1212.

Pfannkuch, M., Ben-Zvi, D., & Budgett, S. (2018). Innovations in statistical modelling to connect data, chance and context. ZDM, 50 (7), 1113–1123.

Rossman, A. J. (2008). Reasoning about informal statistical inference: One statistician’s view. Statistics Education Research Journal, 7 (2), 5–19.

Rumsey, D. J. (2002). Statistical literacy as a goal for introductory statistics courses. Journal of Statistics Education , 10 (3).

Saldanha, L. A., & Thompson, P. W. (2002). Conceptions of sample and their relationship to statistical inference. Educational Studies in Mathematics, 51 (3), 257–270.

Schäfer, T., & Schwarz, M. A. (2019). The meaningfulness of effect sizes in psychological research: Differences between sub-disciplines and the impact of potential biases. Frontiers in Psychology, 10 (813), 1–13.

Schield, M. (1999). Statistical literacy: Thinking critically about statistics as evidence. Of Significance, 1(1).

Simpson, A. (2017). The misdirection of public policy: Comparing and combining standardised effect sizes. Journal of Education Policy, 32 (4), 450–466.

Taber, K. S. (2018). The use of Cronbach’s alpha when developing and reporting research instruments in science education. Research in Science Education, 48 , 1273–1296.

Van Dijke-Droogers, M., Drijvers, P., & Tolboom, J. (2017). Enhancing statistical literacy. In T. Dooley & G. Gueudet (Eds.), Proceedings of the tenth congress of the European Society for Research in Mathematics Education (CERME10, February 1–5, 2017) (pp. 860–867). DCU Institute of Education and ERME.

van Dijke-Droogers, M. J. S., Drijvers, P. H. M., & Bakker, A. (2020). Repeated sampling with a black box to make informal statistical inference accessible. Mathematical Thinking and Learning, 22 (2), 116–138.

van Dijke-Droogers, M. J. S., Drijvers, P. H. M., & Bakker, A. (2021). Introducing statistical inference: Design of a theoretically and empirically based learning trajectory. International Journal of Science and Mathematics Education.

van Streun, A., & van de Giessen, C. (2007). Een vernieuwd statistiekprogramma: Deel 1 [A renewed statistical program, Part 1]. Euclides, 82 (5), 176–179.

Watson, J. M., & Callingham, R. (2003). Statistical literacy: A complex hierarchical construct. Statistics Education Research Journal, 2 , 3–46.

Watson, J., & Callingham, R. (2004). Statistical literacy: From idiosyncratic to critical thinking. In G. Burrill & M. Camden (Eds.), Curricular development in statistics education: International Association for Statistical Education roundtable (pp. 116–137). International Association for Statistical Education.

Watson, J., & Chance, B. (2012). Building intuitions about statistical inference based on resampling. Australian Senior Mathematics Journal, 26 (1), 6–18.

Whitaker, D., Foti, S., & Jacobbe, T. (2015). The levels of conceptual understanding in statistics (LOCUS) project: Results of the pilot study. Numeracy, 8 (2). https://doi.org/10.5038/1936-4660.8.2.3

Wild, C. J., Pfannkuch, M., Regan, M., & Horton, N. J. (2011). Towards more accessible conceptions of statistical inference. Journal of the Royal Statistical Society: Series A (statistics in Society), 174 (2), 247–295.

Zieffler, A., Garfield, J., delMas, R., & Reading, C. (2008). A framework to support research on informal inferential reasoning. Statistics Education Research Journal, 7 (2), 40–58.

Ziegler, L., & Garfield, J. (2018). Developing a statistical literacy assessment for the modern introductory statistics course. Statistics Education Research Journal, 17 (2), 161–178.


Acknowledgements

We thank Walter Stevenhagen for his contribution to the design and implementation of the assessment tool.

This research was funded by the Dutch Ministry of Education, Culture and Science under the Dudoc program.

Author information

Authors and Affiliations

Freudenthal Institute, Utrecht University, PO Box 85.170, 3508 AD, Utrecht, the Netherlands

Marianne van Dijke-Droogers & Paul Drijvers

Faculty of Social and Behavioral Sciences, University of Amsterdam, 15776, Nieuwe Achtergracht 127, 1001 NG, Amsterdam, Netherlands

Arthur Bakker


Corresponding author

Correspondence to Marianne van Dijke-Droogers .

Ethics declarations

Ethical approval

The study was conducted according to the FI Data Management Protocol. This contains guidelines for the data collection (e.g., informing participants, consent statements from participants (including parents for participants under 16)), for data storage (e.g., ensuring privacy, making backups), and the use of a secure system to store data. More information about the FI Data Management Protocol can be found at https://www.uu.nl/sites/default/files/FI%20Data%20Management%20Protocol-dec2020.pdf .

Informed consent

Results only include data from participants who provided written consent via informed consent (for participants under 16 through their parents).

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file1 (DOCX 248 KB)

Supplementary file2 (DOCX 1227 KB)

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

van Dijke-Droogers, M., Drijvers, P. & Bakker, A. Effects of a Learning Trajectory for statistical inference on 9th-grade students’ statistical literacy. Math Ed Res J (2024). https://doi.org/10.1007/s13394-024-00487-z


Received : 16 September 2022

Revised : 21 February 2024

Accepted : 04 March 2024

Published : 10 April 2024

DOI : https://doi.org/10.1007/s13394-024-00487-z


Keywords

  • Statistical Literacy
  • Statistical Inference
  • Learning Trajectory
  • Assessment instrument
  • Learning effects