
Designing Assignments for Learning

The rapid shift to remote teaching and learning meant that many instructors reimagined their assessment practices. Whether adapting existing assignments or creatively designing new opportunities for their students to learn, instructors focused on helping students make meaning and demonstrate their learning outside of the traditional, face-to-face classroom setting. This resource distills the elements of assignment design that are important to carry forward as we continue to seek better ways of assessing learning and build on our innovative assignment designs.

On this page:

  • Rethinking Traditional Tests, Quizzes, and Exams
  • Examples from the Columbia University Classroom
  • Tips for Designing Assignments for Learning
  • Reflect on Your Assignment Design
  • Connect with the CTL
  • Resources and References


Cite this resource: Columbia Center for Teaching and Learning (2021). Designing Assignments for Learning. Columbia University. Retrieved [today’s date] from https://ctl.columbia.edu/resources-and-technology/teaching-with-technology/teaching-online/designing-assignments/

Rethinking Traditional Tests, Quizzes, and Exams

Traditional assessments tend to reveal whether students can recognize, recall, or replicate what was learned out of context, and they tend to focus on students providing correct responses (Wiggins, 1990). In contrast, authentic assignments engage students in higher-order thinking: students grapple with real or simulated challenges that help them prepare for their professional lives, and they draw on the course knowledge learned and the skills acquired to create justifiable answers, performances, or products (Wiggins, 1990). An authentic assessment provides opportunities for students to practice, consult resources, learn from feedback, and refine their performances and products accordingly (Wiggins, 1990, 1998, 2014).

Authentic assignments ask students to “do” the subject with an audience in mind and apply their learning in a new situation. Examples of authentic assignments include asking students to: 

  • Write for a real audience (e.g., a memo, a policy brief, a letter to the editor, a grant proposal, a report, a website) and/or for publication;
  • Solve problem sets that have real-world applications;
  • Design projects that address a real-world problem;
  • Engage in a community-partnered research project;
  • Create an exhibit, performance, or conference presentation;
  • Compile and reflect on their work through a portfolio or e-portfolio.

Noteworthy elements of authentic designs are that instructors scaffold the assignment and play an active role in preparing students for the tasks assigned, while students are intentionally asked to reflect on both the process and the product of their work, thus building their metacognitive skills (Herrington and Oliver, 2000; Ashford-Rowe, Herrington, and Brown, 2013; Frey, Schmitt, and Allen, 2012).

It is worth noting that authentic assessments can initially be time-consuming to design, implement, and grade. They have been critiqued as difficult to transfer across course contexts and as raising grading-reliability issues (Maclellan, 2004). Despite these challenges, authentic assessments are recognized as beneficial to student learning (Svinicki, 2004): they are learner-centered (Weimer, 2013), promote academic integrity (McLaughlin and Ricevuto, 2021; Sotiriadou et al., 2019; Schroeder, 2021), and motivate students to learn (Ambrose et al., 2010). The Columbia Center for Teaching and Learning is always available to consult with faculty who are considering authentic assessment designs and to discuss their challenges and affordances.

Examples from the Columbia University Classroom 

Columbia instructors have experimented with alternative ways of assessing student learning, from oral exams to technology-enhanced assignments. Below are a few examples of authentic assignments in various teaching contexts across Columbia University.

  • E-portfolios: Statia Cook shares her experiences with an ePortfolio assignment in her co-taught Frontiers of Science course (a submission to the Voices of Hybrid and Online Teaching and Learning initiative); see also CUIMC's use of ePortfolios;
  • Case studies: Columbia instructors have engaged their students in authentic ways through case studies drawing on the Case Consortium at Columbia University. Read and watch a faculty spotlight to learn how Professor Mary Ann Price uses the case method to place pre-med students in real-life scenarios;
  • Simulations: students at CUIMC engage in simulations to develop their professional skills in The Mary & Michael Jaharis Simulation Center in the Vagelos College of Physicians and Surgeons and the Helene Fuld Health Trust Simulation Center in the Columbia School of Nursing;
  • Experiential learning: instructors have drawn on New York City as a learning laboratory; for example, Barnard's NYC as Lab webpage highlights courses that engage students in NYC;
  • Design projects that address real-world problems: Yevgeniy Yesilevskiy describes Engineering design projects completed using lab kits during remote learning. Watch Dr. Yesilevskiy talk about his teaching and read the Columbia News article;
  • Writing assignments: Lia Marshall and her teaching associate Aparna Balasundaram reflect on their “non-disposable or renewable assignments” that prepare social work students for their professional lives by having them write for a real audience; and Hannah Weaver spoke about a sandbox assignment used in her Core Literature Humanities course at the 2021 Celebration of Teaching and Learning Symposium. Watch Dr. Weaver share her experiences.

Tips for Designing Assignments for Learning

While designing an effective authentic assignment may seem like a daunting task, the following tips can be used as a starting point. See the Resources section for frameworks and tools that may be useful in this effort.  

Align the assignment with your course learning objectives 

Identify the kind of thinking that is important in your course, the knowledge students will apply, and the skills they will practice through the assignment. What kind of thinking will students be asked to do for the assignment? What will students learn by completing this assignment? How will the assignment help students achieve the desired course learning outcomes? For more information on course learning objectives, see the CTL’s Course Design Essentials self-paced course and watch the video on Articulating Learning Objectives.

Identify an authentic meaning-making task

For meaning-making to occur, students need to understand the relevance of the assignment to the course and beyond (Ambrose et al., 2010). For Bean (2011), a “meaning-making” or “meaning-constructing” task has two dimensions: 1) it presents students with an authentic disciplinary problem or asks students to formulate their own problems, both of which engage them in active critical thinking; and 2) the problem is placed in “a context that gives students a role or purpose, a targeted audience, and a genre” (Bean, 2011: 97-98).

An authentic task gives students a realistic challenge to grapple with, a role to take on that allows them to “rehearse for the complex ambiguities” of life, resources and supports to draw on, and a requirement to justify their work and the process they used to inform their solution (Wiggins, 1990). Note that if students find an assignment interesting or relevant, they are more likely to see value in completing it.

Consider the kinds of activities in the real world that use the knowledge and skills that are the focus of your course. How are this knowledge and these skills applied to answer real-world questions and solve real-world problems? (Herrington et al., 2010: 22). What do professionals or academics in your discipline do on a regular basis? What does it mean to think like a biologist, statistician, historian, or social scientist? How might your assignment ask students to draw on current events, issues, or problems that relate to the course and are of interest to them? How might your assignment tap into student motivation and engage them in the kinds of thinking they can apply to better understand the world around them? (Ambrose et al., 2010).

Determine the evaluation criteria and create a rubric

To ensure equitable and consistent grading of assignments across students, make transparent the criteria you will use to evaluate student work. The criteria should focus on the knowledge and skills that are central to the assignment. Building on the criteria identified, create a rubric that makes the expectations for deliverables explicit, and share this rubric with your students so they can use it as they work on the assignment. For more information on rubrics, see the CTL’s resource Incorporating Rubrics into Your Grading and Feedback Practices, and explore the Association of American Colleges & Universities VALUE Rubrics (Valid Assessment of Learning in Undergraduate Education).

Build in metacognition

Ask students to reflect on what and how they learned from the assignment. Help students uncover the personal relevance of the assignment, find intrinsic value in their work, and deepen their motivation by asking them to reflect on their process and their assignment deliverable. Sample prompts might include: What did you learn from this assignment? How might you draw on the knowledge and skills you used on this assignment in the future? See Ambrose et al. (2010) for more strategies that support motivation, and see the CTL’s resource on Metacognition.

Provide students with opportunities to practice

Design your assignment to be a learning experience and prepare students for success on the assignment. If students can reasonably expect to be successful on an assignment when they put in the required effort, with the support and guidance of the instructor, they are more likely to engage in the behaviors necessary for learning (Ambrose et al., 2010). Set students up for success by actively teaching the knowledge and skills of the course (e.g., how to solve problems, how to write for a particular audience), modeling the desired thinking, and creating learning activities that build up to a graded assignment. Provide opportunities for students to practice using the knowledge and skills they will need for the assignment, whether through low-stakes in-class activities or homework activities that include opportunities to receive and incorporate formative feedback. For more information on providing feedback, see the CTL resource Feedback for Learning.

Communicate about the assignment 

Share the purpose, task, audience, expectations, and criteria for the assignment. Students may have expectations about assessments and how they will be graded that are informed by their prior experiences completing high-stakes assessments, so be transparent. Tell your students why you are asking them to do this assignment, what skills they will be using, how it aligns with the course learning outcomes, and why it is relevant to their learning and their professional lives (i.e., how practitioners and professionals use the knowledge and skills in your course in real-world contexts and for what purposes). Finally, verify that students understand what they need to do to complete the assignment. You can do this by asking students to respond to poll questions about different parts of the assignment, by running a “scavenger hunt” of the assignment instructions (giving students questions to answer about the assignment and having them work in small groups to answer them), or by having students share back what they think is expected of them.

Plan to iterate and to keep the focus on learning 

Draw on multiple sources of data to help make decisions about what changes are needed to the assignment, the assignment instructions, and/or the rubric to ensure that it contributes to student learning. Explore assignment performance data. As Deandra Little reminds us: “a really good assignment, which is a really good assessment, also teaches you something or tells the instructor something. As much as it tells you what students are learning, it’s also telling you what they aren’t learning” (Teaching in Higher Ed podcast, episode 337). Assignment bottlenecks, where students get stuck or struggle, can be good indicators that students need further support or opportunities to practice prior to completing an assignment. This awareness can inform teaching decisions.

Triangulate the performance data by collecting student feedback, and noting your own reflections about what worked well and what did not. Revise the assignment instructions, rubric, and teaching practices accordingly. Consider how you might better align your assignment with your course objectives and/or provide more opportunities for students to practice using the knowledge and skills that they will rely on for the assignment. Additionally, keep in mind societal, disciplinary, and technological changes as you tweak your assignments for future use. 

Reflect on Your Assignment Design

Now is a great time to reflect on your practices and experiences with assignment design and to think critically about your approach. Take a closer look at an existing assignment. Questions to consider include: What is this assignment meant to do? What purpose does it serve? Why do you ask students to do this assignment? How are they prepared to complete it? Does the assignment assess the kind of learning that you really want? What would help students learn from this assignment?

Using the tips in the previous section: How can the assignment be tweaked to be more authentic and meaningful to students? 

As you plan forward for post-pandemic teaching, reflect on your practices, and reimagine your course design, you may find the following CTL resources helpful: Reflecting On Your Experiences with Remote Teaching, Transition to In-Person Teaching, and Course Design Support.

The Columbia Center for Teaching and Learning (CTL) is here to help!

For assistance with assignment design, rubric design, or any other teaching and learning need, please request a consultation by emailing [email protected]

Transparency in Learning and Teaching (TILT) framework for assignments: The TILT Examples and Resources page (https://tilthighered.com/tiltexamplesandresources) includes example assignments from across disciplines, as well as a transparent assignment template and a checklist for designing transparent assignments. Each emphasizes the importance of articulating to students the purpose of the assignment or activity, the what and how of the task, and the criteria that will be used to assess students.

The Association of American Colleges & Universities (AAC&U) offers VALUE ADD (Assignment Design and Diagnostic) tools (https://www.aacu.org/value-add-tools) to help with the creation of clear and effective assignments that align with the desired learning outcomes and associated VALUE rubrics (Valid Assessment of Learning in Undergraduate Education). VALUE ADD encourages instructors to explicitly state assignment information such as the purpose of the assignment, what skills students will be using, how it aligns with course learning outcomes, the assignment type, the audience and context for the assignment, clear evaluation criteria, desired formatting, and expectations for completion, whether individual or in a group.

Villarroel et al. (2017) propose a blueprint for building authentic assessments that includes four steps: 1) consider the workplace context; 2) design the authentic assessment; 3) learn and apply standards for judgement; and 4) give feedback.

References 

Ambrose, S. A., Bridges, M. W., and DiPietro, M. (2010). Chapter 3: What Factors Motivate Students to Learn? In How Learning Works: Seven Research-Based Principles for Smart Teaching. Jossey-Bass.

Ashford-Rowe, K., Herrington, J., and Brown, C. (2013). Establishing the critical elements that determine authentic assessment. Assessment & Evaluation in Higher Education, 39(2), 205-222. http://dx.doi.org/10.1080/02602938.2013.819566

Bean, J. C. (2011). Engaging Ideas: The Professor’s Guide to Integrating Writing, Critical Thinking, and Active Learning in the Classroom. Second Edition. Jossey-Bass.

Frey, B. B., Schmitt, V. L., and Allen, J. P. (2012). Defining Authentic Classroom Assessment. Practical Assessment, Research, and Evaluation, 17(2). DOI: https://doi.org/10.7275/sxbs-0829

Herrington, J., Reeves, T. C., and Oliver, R. (2010). A Guide to Authentic e-Learning. Routledge.

Herrington, J. and Oliver, R. (2000). An instructional design framework for authentic learning environments. Educational Technology Research and Development, 48(3), 23-48.

Litchfield, B. C. and Dempsey, J. V. (2015). Authentic Assessment of Knowledge, Skills, and Attitudes. New Directions for Teaching and Learning, 142 (Summer 2015), 65-80.

Maclellan, E. (2004). How convincing is alternative assessment for use in higher education? Assessment & Evaluation in Higher Education, 29(3), June 2004. DOI: 10.1080/0260293042000188267

McLaughlin, L. and Ricevuto, J. (2021). Assessments in a Virtual Environment: You Won’t Need that Lockdown Browser! Faculty Focus. June 2, 2021.

Mueller, J. (2005). The Authentic Assessment Toolbox: Enhancing Student Learning through Online Faculty Development. MERLOT Journal of Online Learning and Teaching, 1(1). July 2005. Mueller’s Authentic Assessment Toolbox is available online.

Schroeder, R. (2021). Vaccinate Against Cheating With Authentic Assessment. Inside Higher Ed. February 26, 2021.

Sotiriadou, P., Logan, D., Daly, A., and Guest, R. (2019). The role of authentic assessment to preserve academic integrity and promote skills development and employability. Studies in Higher Education, 45(11), 2132-2148. https://doi.org/10.1080/03075079.2019.1582015

Stachowiak, B. (Host). (November 25, 2020). Authentic Assignments with Deandra Little (Episode 337). In Teaching in Higher Ed. https://teachinginhighered.com/podcast/authentic-assignments/

Svinicki, M. D. (2004). Authentic Assessment: Testing in Reality. New Directions for Teaching and Learning, 100 (Winter 2004), 23-29.

Villarroel, V., Bloxham, S., Bruna, D., Bruna, C., and Herrera-Seda, C. (2017). Authentic assessment: creating a blueprint for course design. Assessment & Evaluation in Higher Education, 43(5), 840-854. https://doi.org/10.1080/02602938.2017.1412396

Weimer, M. (2013). Learner-Centered Teaching: Five Key Changes to Practice. Second Edition. San Francisco: Jossey-Bass.

Wiggins, G. (2014). Authenticity in assessment, (re-)defined and explained. Retrieved from https://grantwiggins.wordpress.com/2014/01/26/authenticity-in-assessment-re-defined-and-explained/

Wiggins, G. (1998). Teaching to the (Authentic) Test. Educational Leadership. April 1989. 41-47.

Wiggins, G. (1990). The Case for Authentic Assessment. Practical Assessment, Research & Evaluation, 2(2).

Wondering how AI tools might play a role in your course assignments?

See the CTL’s resource “Considerations for AI Tools in the Classroom.”


Center for Educational Innovation


Pedagogy - Diversifying Your Teaching Methods, Learning Activities, and Assignments

[Figure: The Inclusive Teaching at a PWI framework, with three components beneath it: Climate, Pedagogy, and Content. Pedagogy is emphasized here, with the key points: diversify and critically assess teaching methods, learning activities, and assignments.]

Definition of Pedagogy 

In the most general sense, pedagogy is all the ways that instructors and students work with the course content. The fundamental learning goal for students is to be able to do “something meaningful” with the course content. Meaningful learning typically results in students working in the middle to upper levels of Bloom’s Taxonomy. We sometimes find that novice instructors conflate course content with pedagogy. This often results in “teaching as talking,” where the presentation of content by the instructor is confused with the learning of content by the students. Think of your course content as clay and pedagogy as the ways you ask students to make “something meaningful” from that clay. Pedagogy is the combination of teaching methods (what instructors do), learning activities (what instructors ask their students to do), and learning assessments (the assignments, projects, or tasks that measure student learning).

Key Idea for Pedagogy

Diversify your pedagogy by varying your teaching methods, learning activities, and assignments. Critically assess your pedagogy through the lens of BIPOC students’ experiences at a PWI. We visualize these two related practices as a cycle because they are iterative and ongoing. Diversifying your pedagogy likely means shedding some typical ways of teaching in your discipline, or the teaching practices you inherited. It likely means doing more active learning and less traditional lecturing. Transforming good pedagogy into equitable pedagogy means rethinking your pedagogy in light of the PWI context and considering the ways your pedagogy may help or hinder learning for BIPOC students.

PWI Assumptions for Pedagogy

Understanding where students are on the spectrum of novice to expert learning in your discipline or course is a key challenge to implementing effective and inclusive pedagogy (National Research Council 2000). Instructors are typically so far removed from being a novice learner in their disciplines that they struggle to understand where students are on that spectrum. A key PWI assumption is that students understand how your disciplinary knowledge is organized and constructed. Students typically do not understand your discipline or the many other disciplines they are working in during their undergraduate years. Even graduate students may find it puzzling to explain the origins, methodologies, theories, logics, and assumptions of their disciplines. A second PWI assumption is that students are (or should be) academically prepared to learn your discipline. Students may be academically prepared for learning in some disciplines, but unless their high school experience was college preparatory and well supported, students (especially first-generation college students) are likely finding their way through a mysterious journey of different disciplinary conventions and modes of working and thinking (Nelson 1996).

A third PWI assumption is that students’ academic underpreparation reflects their intelligence or capacity to learn. Academic preparation is typically a function of one’s high school experience, including whether that high school was well resourced or underfunded. Whether or not a student receives a quality high school education is usually a structural matter reflecting inequities in our K-12 educational systems, not a reflection of an individual student’s ability to learn. A final PWI assumption is that students will learn well in the ways that the instructor learned well. In reality, most instructors in higher education self-selected into disciplines that align with their interests, skills, academic preparation, and possibly family and community support. Our students have broader and different goals for seeking a college education and bring a range of skills to their coursework, which may or may not align with instructors’ expectations of how students learn. Inclusive teaching at a PWI means supporting the learning and career goals of our students.

Pedagogical Content Knowledge as a Core Concept

Kind and Chan (2019) propose that Pedagogical Content Knowledge (PCK) is the synthesis of Content Knowledge (expertise about a subject area) and Pedagogical Knowledge (expertise about teaching methods, assessment, classroom management, and how students learn). Content Knowledge (CK) without Pedagogical Knowledge (PK) limits instructors’ ability to teach effectively or inclusively. Novice instructors who rely on traditional lectures likely have limited Pedagogical Knowledge and may also be replicating their own inherited teaching practices. While Kind and Chan (2019) write from the perspective of science education, their concepts apply across disciplines. Moreover, Kind and Chan (2019) support van Driel et al.’s assertion that:

high-quality PCK is not characterized by knowing as many strategies as possible to teach a certain topic plus all the misconceptions students may have about it but by knowing when to apply a certain strategy in recognition of students’ actual learning needs and understanding why a certain teaching approach may be useful in one situation (quoted in Kind and Chan 2019, 975). 

As we’ve stressed throughout this guide, the teaching context matters, and for inclusive pedagogy, special attention should be paid to the learning goals, instructor preparation, and students’ point of entry into course content. We also argue that the PWI context shapes what instructors might practice as CK, PK, and PCK. We recommend instructors become familiar with evidence-based pedagogy (or the Scholarship of Teaching and Learning, SoTL) in their fields. Moreover, we advise instructors to find and follow those instructors and scholars who specifically focus on inclusive teaching in their fields in order to develop an inclusive, flexible, and discipline-specific Pedagogical Content Knowledge.

Suggested Practices for Diversifying + Assessing Pedagogy

Although diversifying and critically assessing teaching methods, learning activities, and assignments will vary across disciplines, we offer a few key starting points. Diversifying your pedagogy is easier than critically assessing it through a PWI lens, but both steps are essential. In general, you can diversify your pedagogy by learning about active learning, peer learning, team-based learning, experiential learning, problem-based learning, and case-based learning, among others. There is extensive evidence-based pedagogical literature, and practical guides are readily available for these methods. You can also find and follow scholars in your discipline who use these and other teaching methods.

Diversifying Your Pedagogy

Convert traditional lectures into interactive (or active) lectures.

For in-person or synchronous online courses, break a traditional lecture into “mini-lectures” of 10-15 minutes in length. After each mini-lecture, ask your students to process their learning using a discussion or problem prompt, a Classroom Assessment Technique (CAT), a Think-Pair-Share, or another brief learning activity. Read Lecturing from the Center for Teaching, Vanderbilt University.

Structure small group discussions

Provide both a process and concrete questions or tasks to guide student learning (for example, provide a scenario with three focused tasks such as identifying the problem, brainstorming possible solutions, and listing the pros/cons for each solution). Read How to Hold a Better Class Discussion, The Chronicle of Higher Education.

Integrate active learning

Integrate active learning, especially into courses that are conceptual, theoretical, or otherwise historically challenging (for example, calculus, organic chemistry, statistics, philosophy). For gateway courses, draw upon the research of STEM and other education specialists on how active learning and peer learning improve student learning and reduce disparities. Read the Association of American Universities STEM Network Scholarship.

Include authentic learning

Include authentic learning: learning activities and assignments that mirror how students will work after graduation. What does it mean to think and work like an engineer? How do project teams work together? How does one present research in an educational social media campaign? Since most students seeking a college education will not become academic researchers or faculty, what kinds of things will they do in the “real world”? Help students practice and hone those skills as they learn the course content. Read Edutopia’s PBL: What Does It Take for a Project to Be Authentic?

Vary assignments and provide options

Graded assignments should range from low to high stakes. Low-stakes assignments allow students to learn from their mistakes and receive timely feedback on their learning. Options for assignments allow students to demonstrate their learning, rather than demonstrate their skill at a particular type of assessment (such as a multiple-choice exam or an academic research paper). Read our guide, Create Assessments That Promote Learning for All Students.

Critically Assess Your Pedagogy

Critically assessing your pedagogy through the PWI lens with attention to how your pedagogy may affect the learning of BIPOC students is more challenging and highly contextual. Instructors will want to review and apply the concepts and principles discussed in the earlier sections of this guide on Predominantly White Institutions (PWIs), PWI Assumptions, and Class Climate. 

Reflect on patterns

Reflect on patterns of participation, progress in learning (grade distributions), and other course-related evidence. Look at your class sessions and assignments as experimental data. Who participated? What kinds of participation did you observe? Who didn’t participate? Why might that be? Are there a variety of ways for students to participate in the learning activities (individually, in groups, via discussion, via writing, synchronously/in-person, asynchronously/online)?

Respond to feedback on climate

Respond to feedback on climate from ongoing check-ins and Critical Incident Questionnaires (CIQs) as discussed in the Climate Section (Ongoing Practices). Students will likely disengage from your requests for feedback if you do not respond to their feedback. Use this feedback to re-calibrate and re-think your pedagogy.

Seek feedback on student learning

Seek feedback on student learning in the form of Classroom Assessment Techniques (CATs), in-class polls, asynchronous forums, exam wrappers, and other methods. Demonstrate that you care about your students' learning by responding to this feedback as well. For example: “Here's how students in previous semesters learned this material…” or “I'm scheduling a problem-solving review session in the next class in response to the results of the exam.”

Be diplomatic but clear when correcting mistakes and misconceptions

First-generation college students, many of whom may also identify as BIPOC, have typically achieved a great deal with few resources and significant barriers (Yosso 2005). However, they may be more likely to internalize their learning mistakes as signs that they don't belong at the university. When correcting, be sure to normalize mistakes as part of the learning process. For example: “The correct answer is X, but I can see why you thought it was Y. Many students think it is Y because… But the correct answer is X because… Thank you for helping us understand that misconception.”

Allow time for students to think and prepare for participation in a non-stressful setting

This was already suggested in the Climate Section (Race Stressors), but it is worth repeating. BIPOC students and multilingual students may need more time to prepare, not because of their intellectual abilities, but because of the effects of race stressors and other stressors increasing their cognitive load. Providing discussion or problem prompts in advance will reduce this stress and make space for learning. Additionally, both student populations may experience stereotype threat, so participation in the “public” aspects of the class session may be stressful in ways that are not true for the majority white and domestic students. If you cannot provide prompts in advance, be sure to allow ample individual “think time” during a synchronous class session.

Avoid consensus models or majority rules processes

This was stated in the Climate Section (Teaching Practices to Avoid), but it’s such an entrenched PWI practice that it needs to be spotlighted and challenged. If I am a numerical “minority” and I am asked to come to consensus or agreement with a numerical “majority,” it is highly likely that my perspective will be minimized or dismissed. Or, I will have to expend a lot of energy to persuade my group of the value of my perspective, which is highly stressful. This is an unacceptable burden to put on BIPOC students and also may result in BIPOC students being placed in the position of teaching white students about a particular perspective or experience. The resulting tensions may also damage BIPOC students’ positive relationships with white students and instructors. When suitable for your content, create a learning experience that promotes seeking multiple solutions to problems, cases, or prompts. Rather than asking students to converge on one best recommendation, why not ask students to log all possible solutions (without evaluation) and then to recommend at least two solutions that include a rationale? Moreover, for course content dealing with policies, the recommended solutions could be explained in terms of their possible effects on different communities. If we value diverse perspectives, we need to structure the consideration of those perspectives into our learning activities and assignments. 

We recognize the challenges of assessing your pedagogy through the PWI lens and doing your best to assess the effects on BIPOC student learning. This is a complex undertaking. But we encourage you to invite feedback from your students as well as to seek the guidance of colleagues, including advisors and other student affairs professionals, to inform your ongoing practices of teaching inclusively at a PWI. In the next section, we complete our exploration of the Inclusive Teaching at a PWI Framework by exploring the importance of auditing, diversifying, and critically assessing course content.

Pedagogy References

Howard, Jay. N.D. “How to Hold a Better Class Discussion: Advice Guide.” The Chronicle of Higher Education. https://www.chronicle.com/article/how-to-hold-a-better-class-discussion/#2

Kind, Vanessa and Kennedy K.H. Chan. 2019. “Resolving the Amalgam: Connecting Pedagogical Content Knowledge, Content Knowledge and Pedagogical Knowledge.” International Journal of Science Education. 41(7): 964-978.

National Research Council. 2000. “How Experts Differ from Novices.” Chap 2 in How People Learn: Brain, Mind, Experience, and School: Expanded Edition . Washington D.C.: The National Academies Press. https://nap.nationalacademies.org/catalog/9853/how-people-learn-brain-mind-experience-and-school-expanded-edition

Nelson, Craig E. 1996. “Student Diversity Requires Different Approaches to College Teaching, Even in Math and Science.” The American Behavioral Scientist . 40 (2): 165-175.

Sathy, Viji and Kelly A. Hogan. N.D.  “How to Make Your Teaching More Inclusive: Advice Guide.” The Chronicle of Higher Education . https://www.chronicle.com/article/how-to-make-your-teaching-more-inclusive/?cid=gen_sign_in

Yosso, Tara J. 2005. “Whose Culture Has Capital? A Critical Race Theory Discussion of Community Cultural Wealth.” Race, Ethnicity and Education . 8 (1): 69-91.


Teaching, Learning, & Professional Development Center


How Do I Create Meaningful and Effective Assignments?

Prepared by Allison Boye, Ph.D., Teaching, Learning, and Professional Development Center

Assessment is a necessary part of the teaching and learning process, helping us measure whether our students have really learned what we want them to learn. While exams and quizzes are certainly favorite and useful methods of assessment, out-of-class assignments (written or otherwise) can offer similar insights into our students' learning. And just as creating a reliable test takes thoughtfulness and skill, so does creating meaningful and effective assignments. Undoubtedly, many instructors have been on the receiving end of disappointing student work, left wondering what went wrong… and often, those problems can be remedied in the future by some simple fine-tuning of the original assignment. This paper will take a look at some important elements to consider when developing assignments, and offer some easy approaches to creating a valuable assessment experience for all involved.

First Things First…

Before assigning any major tasks to students, it is imperative that you first define a few things for yourself as the instructor:

  • Your goals for the assignment. Why are you assigning this project, and what do you hope your students will gain from completing it? What knowledge, skills, and abilities do you aim to measure with this assignment? Creating assignments is a major part of overall course design, and every project you assign should clearly align with your goals for the course in general. For instance, if you want your students to demonstrate critical thinking, perhaps asking them to simply summarize an article is not the best match for that goal; a more appropriate option might be to ask for an analysis of a controversial issue in the discipline. Ultimately, the connection between the assignment and its purpose should be clear to both you and your students to ensure that it is fulfilling the desired goals and doesn't seem like “busy work.” For some ideas about what kinds of assignments match certain learning goals, take a look at this page from DePaul University's Teaching Commons.
  • The levels and prior experiences of your students. What do your students already know, and what skills do they bring to this assignment? Consider questions such as the following:
  • Have they experienced “socialization” in the culture of your discipline (Flaxman, 2005)? Are they familiar with any conventions you might want them to know? In other words, do they know the “language” of your discipline, generally accepted style guidelines, or research protocols?
  • Do they know how to conduct research?  Do they know the proper style format, documentation style, acceptable resources, etc.? Do they know how to use the library (Fitzpatrick, 1989) or evaluate resources?
  • What kinds of writing or work have they previously engaged in?  For instance, have they completed long, formal writing assignments or research projects before? Have they ever engaged in analysis, reflection, or argumentation? Have they completed group assignments before?  Do they know how to write a literature review or scientific report?

In his book Engaging Ideas (1996), John Bean provides a great list of questions to help instructors focus on their main teaching goals when creating an assignment (p.78):

1. What are the main units/modules in my course?

2. What are my main learning objectives for each module and for the course?

3. What thinking skills am I trying to develop within each unit and throughout the course?

4. What are the most difficult aspects of my course for students?

5. If I could change my students' study habits, what would I most like to change?

6. What difference do I want my course to make in my students' lives?

What your students need to know

Once you have determined your own goals for the assignment and the levels of your students, you can begin creating your assignment.  However, when introducing your assignment to your students, there are several things you will need to clearly outline for them in order to ensure the most successful assignments possible.

  • First, you will need to articulate the purpose of the assignment. Even though you know why the assignment is important and what it is meant to accomplish, you cannot assume that your students will intuit that purpose. Your students will appreciate an understanding of how the assignment fits into the larger goals of the course and what they will learn from the process (Hass & Osborn, 2007). Being transparent with your students and explaining why you are asking them to complete a given assignment can ultimately help motivate them to complete the assignment more thoughtfully.
  • If you are asking your students to complete a writing assignment, you should define for them the “rhetorical or cognitive mode/s” you want them to employ in their writing (Flaxman, 2005). In other words, use precise verbs that communicate whether you are asking them to analyze, argue, describe, inform, etc.  (Verbs like “explore” or “comment on” can be too vague and cause confusion.) Provide them with a specific task to complete, such as a problem to solve, a question to answer, or an argument to support.  For those who want assignments to lead to top-down, thesis-driven writing, John Bean (1996) suggests presenting a proposition that students must defend or refute, or a problem that demands a thesis answer.
  • It is also a good idea to define the audience you want your students to address with their assignment, if possible – especially with writing assignments. Otherwise, students will address only the instructor, often assuming little requires explanation or development (Hedengren, 2004; MIT, 1999). Further, asking students to address the instructor, who typically knows more about the topic than the student, places the student in an unnatural rhetorical position. Instead, you might consider asking your students to prepare their assignments for alternative audiences such as other students who missed last week's classes, a group that opposes their position, or people reading a popular magazine or newspaper. In fact, a study by Bean (1996) indicated that students often appreciate and enjoy assignments that vary elements such as audience or rhetorical context, so don't be afraid to get creative!
  • Obviously, you will also need to articulate clearly the logistics or “business aspects” of the assignment. In other words, be explicit with your students about required elements such as the format, length, documentation style, writing style (formal or informal?), and deadlines. One caveat, however: do not allow the logistics of the paper to take precedence over the content in your assignment description; if you spend all of your time describing these things, students might suspect that is all you care about in their execution of the assignment.
  • Finally, you should clarify your evaluation criteria for the assignment. What elements of content are most important? Will you grade holistically or weight features separately? How much weight will be given to individual elements, etc.? Another precaution to take when defining requirements for your students is to take care that your instructions and rubric do not overshadow the content; prescribing each element of an assignment too rigidly can limit students' freedom to explore and discover. According to Beth Finch Hedengren, “A good assignment provides the purpose and guidelines… without dictating exactly what to say” (2004, p. 27). If you decide to utilize a grading rubric, be sure to provide that to the students along with the assignment description, prior to their completion of the assignment.

A great way to get students engaged with an assignment and build buy-in is to encourage their collaboration on its design and/or on the grading criteria (Hudd, 2003). In his article “Conducting Writing Assignments,” Richard Leahy (2002) offers a few ideas for building in said collaboration:

• Ask the students to develop the grading scale themselves from scratch, starting with choosing the categories.

• Set the grading categories yourself, but ask the students to help write the descriptions.

• Draft the complete grading scale yourself, then give it to your students for review and suggestions.

A Few Do's and Don'ts…

Determining your goals for the assignment and its essential logistics is a good start to creating an effective assignment. However, there are a few more simple factors to consider in your final design. First, here are a few things you should do :

  • Do provide detail in your assignment description. Research has shown that students frequently prefer some guiding constraints when completing assignments (Bean, 1996), and that more detail (within reason) can lead to more successful student responses. One idea is to provide students with physical assignment handouts, in addition to or instead of a simple description in a syllabus. This can meet the needs of concrete learners and give them something tangible to refer to. Likewise, it is often beneficial to make explicit for students the process or steps necessary to complete an assignment, given that students – especially younger ones – might need guidance in planning and time management (MIT, 1999).
  • Do use open-ended questions.  The most effective and challenging assignments focus on questions that lead students to thinking and explaining, rather than simple yes or no answers, whether explicitly part of the assignment description or in the  brainstorming heuristics (Gardner, 2005).
  • Do direct students to appropriate available resources. Giving students pointers about other venues for assistance can help them get started on the right track independently. These kinds of suggestions might include information about campus resources such as the University Writing Center or discipline-specific librarians, suggestions of specific journals, books, or even sections of their textbook, or lists of research ideas and links to acceptable websites.
  • Do consider providing models – both successful and unsuccessful models (Miller, 2007). These models could be provided by past students, or models you have created yourself.  You could even ask students to evaluate the models themselves using the determined evaluation criteria, helping them to visualize the final product, think critically about how to complete the assignment, and ideally, recognize success in their own work.
  • Do consider including a way for students to make the assignment their own. In their study, Hass and Osborn (2007) confirmed the importance of personal engagement for students when completing an assignment.  Indeed, students will be more engaged in an assignment if it is personally meaningful, practical, or purposeful beyond the classroom.  You might think of ways to encourage students to tap into their own experiences or curiosities, to solve or explore a real problem, or connect to the larger community.  Offering variety in assignment selection can also help students feel more individualized, creative, and in control.
  • If your assignment is substantial or long, do consider sequencing it. Far too often, assignments are given as one-shot final products that receive grades at the end of the semester, eternally abandoned by the student.  By sequencing a large assignment, or essentially breaking it down into a systematic approach consisting of interconnected smaller elements (such as a project proposal, an annotated bibliography, or a rough draft, or a series of mini-assignments related to the longer assignment), you can encourage thoughtfulness, complexity, and thoroughness in your students, as well as emphasize process over final product.

Next are a few elements to avoid in your assignments:

  • Do not ask too many questions in your assignment.  In an effort to challenge students, instructors often err in the other direction, asking more questions than students can reasonably address in a single assignment without losing focus. Offering an overly specific “checklist” prompt often leads to externally organized papers, in which inexperienced students “slavishly follow the checklist instead of integrating their ideas into more organically-discovered structure” (Flaxman, 2005).
  • Do not expect or suggest that there is an “ideal” response to the assignment. A common error for instructors is to dictate content of an assignment too rigidly, or to imply that there is a single correct response or a specific conclusion to reach, either explicitly or implicitly (Flaxman, 2005). Undoubtedly, students do not appreciate feeling as if they must read an instructor's mind to complete an assignment successfully, or that their own ideas have nowhere to go, and can lose motivation as a result. Similarly, avoid assignments that simply ask for regurgitation (Miller, 2007). Again, the best assignments invite students to engage in critical thinking, not just reproduce lectures or readings.
  • Do not provide vague or confusing commands. Do students know what you mean when they are asked to “examine” or “discuss” a topic? Return to what you determined about your students' experiences and levels to help you decide what directions will make the most sense to them and what will require more explanation or guidance, and avoid verbiage that might confound them.
  • Do not impose impossible time constraints or require the use of insufficient resources for completion of the assignment. For instance, if you are asking all of your students to use the same resource, ensure that there are enough copies available for all students to access – or at least put one copy on reserve in the library. Likewise, make sure that you are providing your students with ample time to locate resources and effectively complete the assignment (Fitzpatrick, 1989).

The assignments we give to students don't simply have to be research papers or reports. There are many options for effective yet creative ways to assess your students' learning! Here are just a few:

Journals, Posters, Portfolios, Letters, Brochures, Management plans, Editorials, Instruction manuals, Imitations of a text, Case studies, Debates, News releases, Dialogues, Videos, Collages, Plays, PowerPoint presentations

Ultimately, the success of student responses to an assignment often rests on the instructor's deliberate design of the assignment. By being purposeful and thoughtful from the beginning, you can ensure that your assignments will not only serve as effective assessment methods, but also engage and delight your students. If you would like further help in constructing or revising an assignment, the Teaching, Learning, and Professional Development Center is glad to offer individual consultations. In addition, look into some of the resources provided below.

Online Resources

“Creating Effective Assignments” http://www.unh.edu/teaching-excellence/resources/Assignments.htm This site, from the University of New Hampshire's Center for Excellence in Teaching and Learning,  provides a brief overview of effective assignment design, with a focus on determining and communicating goals and expectations.

Gardner, T.  (2005, June 12). Ten Tips for Designing Writing Assignments. Traci's Lists of Ten. http://www.tengrrl.com/tens/034.shtml This is a brief yet useful list of tips for assignment design, prepared by a writing teacher and curriculum developer for the National Council of Teachers of English .  The website will also link you to several other lists of “ten tips” related to literacy pedagogy.

“How to Create Effective Assignments for College Students.” http://tilt.colostate.edu/retreat/2011/zimmerman.pdf This PDF is a simplified bulleted list, prepared by Dr. Toni Zimmerman from Colorado State University, offering some helpful ideas for coming up with creative assignments.

“Learner-Centered Assessment” http://cte.uwaterloo.ca/teaching_resources/tips/learner_centered_assessment.html From the Centre for Teaching Excellence at the University of Waterloo, this is a short list of suggestions for the process of designing an assessment with your students' interests in mind.

“Matching Learning Goals to Assignment Types.” http://teachingcommons.depaul.edu/How_to/design_assignments/assignments_learning_goals.html This is a great page from DePaul University's Teaching Commons, providing a chart that helps instructors match assignments with learning goals.

Additional References

Bean, J.C. (1996). Engaging ideas: The professor's guide to integrating writing, critical thinking, and active learning in the classroom. San Francisco: Jossey-Bass.

Fitzpatrick, R. (1989). Research and writing assignments that reduce fear lead to better papers and more confident students. Writing Across the Curriculum, 3.2, pp. 15-24.

Flaxman, R. (2005). Creating meaningful writing assignments. The Teaching Exchange. Retrieved Jan. 9, 2008 from http://www.brown.edu/Administration/Sheridan_Center/pubs/teachingExchange/jan2005/01_flaxman.pdf

Hass, M. & Osborn, J. (2007, August 13). An emic view of student writing and the writing process. Across the Disciplines, 4.

Hedengren, B.F. (2004). A TA's guide to teaching writing in all disciplines. Boston: Bedford/St. Martin's.

Hudd, S. S. (2003, April). Syllabus under construction: Involving students in the creation of class assignments. Teaching Sociology, 31, pp. 195-202.

Leahy, R. (2002). Conducting writing assignments. College Teaching, 50.2, pp. 50-54.

Miller, H. (2007). Designing effective writing assignments. Teaching with writing. University of Minnesota Center for Writing. Retrieved Jan. 9, 2008, from http://writing.umn.edu/tww/assignments/designing.html

MIT Online Writing and Communication Center (1999). Creating Writing Assignments. Retrieved January 9, 2008 from http://web.mit.edu/writing/Faculty/createeffective.html.


Original article | Base for Electronic Educational Sciences 2020, Vol. 1(1) 20-26

The Use of Assignments in Education

Ömer Gökhan Ulum

pp. 20-26 | DOI: https://doi.org/10.29329/bedu.2020.253.2 | Manuscript Number: MANU-2006-24-0002.R1

Published online: September 17, 2020

At all educational levels, teachers assign their students different activities to practice and reinforce what they have learnt. Assignments are also seen by teachers, parents, and authorities as valuable educational tools that raise students’ consciousness; they function, in a sense, as a bridge between school and home. Assignments require the effort, time, and dedication of students, families, and teachers, and they are practical tools for developing communicative skills and providing learning experiences that support the desired behavior change. The purpose of this paper is therefore to examine the views of state primary school students on the assignments given by their teachers. The participants were chosen from the most convenient and accessible schools located in Adana, Turkey. The sample consisted of 250 primary school students (178 female and 72 male) who voluntarily participated in the study. Following a qualitative research design, a semi-structured interview designed by the researcher was used to gather the required data, and descriptive statistics (SPSS 22.0) were used to report the participants’ views in numerical form. The findings of the study suggest that assignments are favored by the students, which may be attributed to their school level.

Keywords: Assignment, education, reinforcement, learning, revision


The Education Trust


Classroom Assignments Matter. Here’s Why.

As a former classroom teacher, coach, and literacy specialist, I know the beginning of the school year demands that educators pay attention to a number of competing interests. Let me suggest one thing for teachers to focus on that, above all else, can close the student achievement gap: the rigor and quality of classroom assignments.

Digging into classroom assignments is revealing. It tells a story about curricula, instruction, achievement, and education equity. In the process, it uncovers what teachers believe about their students, what they know and understand about their standards and curricula, and what they are willing to do to advance student learning and achievement. So, when educators critically examine their own assignments (and the work students produce), they have an opportunity to gain powerful insight about teaching and learning — the kind of insight that can move the needle on student achievement. This type of analysis can identify trends across content areas such as English/language arts, science, social studies, and math.

At Ed Trust, we undertook such an analysis of 4,000 classroom assignments and found that students are being given in-school and out-of-school assignments that don’t align with grade-level standards, lack sufficient opportunities and time for writing, and include tasks that require low-level thinking and work production. We’ve seen assignments with little-to-no meaningful discussion and those with teachers over-supporting students, which effectively rob students of the kind of challenging thinking that leads to academic growth. And we’ve seen assignments where the reading looked like stop-and-go traffic, overrun with prescribed note-taking, breaking down students’ ability to build reading flow and deep learning.

These findings served as the basis for our second Equity in Motion convening. For three days this summer, educators from across the country explored the importance of regular and thoughtful assignment analysis. They found that carefully developed assignments have the power to make a curriculum last in students’ minds. They saw how assignments reveal whether students are grasping curricula, and if not, how teachers can adapt instruction. They also saw how assignments give clues into their own beliefs about students, which carry serious equity implications for all students, especially those who have been traditionally under-served. Throughout the convening, educators talked about the implications of their assignments and how assignments can affect overall achievement and address issues of equity. If assignments fall short of what standards demand, students will be ill-equipped to achieve at high levels.

The main take-away from this convening was simple but powerful: Assignments matter!

I encourage all teachers to take that message to heart. This school year, aim to make sure your assignments are more rigorous, standards-aligned, and authentically relevant to your students. Use our Literacy Analysis Assignment Guide to examine your assignments — alone, or better yet, with colleagues — to ensure you’re delivering assignments that propel your students to reach higher and achieve more. Doing this will provide a more complete picture of where your students are in their learning and how you can move them toward skill and concept mastery.

Remember this: Students can do no better than the assignments they receive.


Eberly Center

Teaching Excellence & Educational Innovation

Creating Assignments

Here are some general suggestions and questions to consider when creating assignments. There are also many other resources in print and on the web that provide examples of interesting, discipline-specific assignment ideas.

Consider your learning objectives.

What do you want students to learn in your course? What could they do that would show you that they have learned it? To determine assignments that truly serve your course objectives, it is useful to write out your objectives in this form: I want my students to be able to ____. Use active, measurable verbs as you complete that sentence (e.g., compare theories, discuss ramifications, recommend strategies), and your learning objectives will point you towards suitable assignments.

Design assignments that are interesting and challenging.

This is the fun side of assignment design. Consider how to focus students’ thinking in ways that are creative, challenging, and motivating. Think beyond the conventional assignment type! For example, one American historian requires students to write diary entries for a hypothetical Nebraska farmwoman in the 1890s. By specifying that students’ diary entries must demonstrate the breadth of their historical knowledge (e.g., gender, economics, technology, diet, family structure), the instructor gets students to exercise their imaginations while also accomplishing the learning objectives of the course (Walvoord & Anderson, 1989, p. 25).

Double-check alignment.

After creating your assignments, go back to your learning objectives and make sure there is still a good match between what you want students to learn and what you are asking them to do. If you find a mismatch, you will need to adjust either the assignments or the learning objectives. For instance, if your goal is for students to be able to analyze and evaluate texts, but your assignments only ask them to summarize texts, you would need to add an analytical and evaluative dimension to some assignments or rethink your learning objectives.

Name assignments accurately.

Students can be misled by assignments that are named inappropriately. For example, if you want students to analyze a product’s strengths and weaknesses but you call the assignment a “product description,” students may focus all their energies on the descriptive, not the critical, elements of the task. Thus, it is important to ensure that the titles of your assignments communicate their intention accurately to students.

Consider sequencing.

Think about how to order your assignments so that they build skills in a logical sequence. Ideally, assignments that require the most synthesis of skills and knowledge should come later in the semester, preceded by smaller assignments that build these skills incrementally. For example, if an instructor’s final assignment is a research project that requires students to evaluate a technological solution to an environmental problem, earlier assignments should reinforce component skills, including the ability to identify and discuss key environmental issues, apply evaluative criteria, and find appropriate research sources.

Think about scheduling.

Consider your intended assignments in relation to the academic calendar and decide how they can be reasonably spaced throughout the semester, taking into account holidays and key campus events. Consider how long it will take students to complete all parts of the assignment (e.g., planning, library research, reading, coordinating groups, writing, integrating the contributions of team members, developing a presentation), and be sure to allow sufficient time between assignments.

Check feasibility.

Is the workload you have in mind reasonable for your students? Is the grading burden manageable for you? Sometimes there are ways to reduce workload (whether for you or for students) without compromising learning objectives. For example, if a primary objective in assigning a project is for students to identify an interesting engineering problem and do some preliminary research on it, it might be reasonable to require students to submit a project proposal and annotated bibliography rather than a fully developed report. If your learning objectives are clear, you will see where corners can be cut without sacrificing educational quality.

Articulate the task description clearly.

If an assignment is vague, students may interpret it any number of ways – and not necessarily how you intended. Thus, it is critical to clearly and unambiguously identify the task students are to do (e.g., design a website to help high school students locate environmental resources, create an annotated bibliography of readings on apartheid). It can be helpful to differentiate the central task (what students are supposed to produce) from other advice and information you provide in your assignment description.

Establish clear performance criteria.

Different instructors apply different criteria when grading student work, so it’s important that you clearly articulate to students what your criteria are. To do so, think about the best student work you have seen on similar tasks and try to identify the specific characteristics that made it excellent, such as clarity of thought, originality, logical organization, or use of a wide range of sources. Then identify the characteristics of the worst student work you have seen, such as shaky evidence, weak organizational structure, or lack of focus. Identifying these characteristics can help you consciously articulate the criteria you already apply. It is important to communicate these criteria to students, whether in your assignment description or as a separate rubric or scoring guide . Clearly articulated performance criteria can prevent unnecessary confusion about your expectations while also setting a high standard for students to meet.

Specify the intended audience.

Students make assumptions about the audience they are addressing in papers and presentations, which influences how they pitch their message. For example, students may assume that, since the instructor is their primary audience, they do not need to define discipline-specific terms or concepts. These assumptions may not match the instructor’s expectations. Thus, it is important to specify the intended audience in your assignments (e.g., undergraduates with no biology background, a potential funder who does not know engineering).

Specify the purpose of the assignment.

If students are unclear about the goals or purpose of the assignment, they may make unnecessary mistakes. For example, if students believe an assignment is focused on summarizing research as opposed to evaluating it, they may seriously miscalculate the task and put their energies in the wrong place. The same is true if they think the goal of an economics problem set is to find the correct answer rather than to demonstrate a clear chain of economic reasoning. Consequently, it is important to make your objectives for the assignment clear to students.

Specify the parameters.

If you have specific parameters in mind for the assignment (e.g., length, size, formatting, citation conventions) you should be sure to specify them in your assignment description. Otherwise, students may misapply conventions and formats they learned in other courses that are not appropriate for yours.

A Checklist for Designing Assignments

Here is a set of questions to ask yourself when creating an assignment. Have you:

  • Provided a written description of the assignment (in the syllabus or in a separate document)?
  • Specified the purpose of the assignment?
  • Indicated the intended audience?
  • Articulated the instructions in precise and unambiguous language?
  • Provided information about the appropriate format and presentation (e.g., page length, typed, cover sheet, bibliography)?  
  • Indicated special instructions, such as a particular citation style or headings?  
  • Specified the due date and the consequences for missing it?
  • Articulated performance criteria clearly?
  • Indicated the assignment’s point value or percentage of the course grade?
  • Provided students (where appropriate) with models or samples?

Adapted from the WAC Clearinghouse at http://wac.colostate.edu/intro/pop10e.cfm .


Classroom assignments as measures of teaching quality

We investigate classroom assignments and resulting student work to identify important characteristics of assignments in terms of instructional quality and their validity as measures of teaching quality. We examine assignment quality within a large-scale project exploring multiple measures including classroom observations, teacher knowledge measures, and value-added estimates based on student achievement scores. Analyses included descriptive statistics, multivariate analyses to understand factors contributing to score variance, and correlational analyses exploring the relationship of assignment scores to other measures. Results indicate relatively low demand levels in all teacher assignments, a marked difference in score distributions for mathematics (math) and English language arts (ELA), and a substantial relationship between what was asked of and produced by students. Relationships between assignments scores, classroom characteristics, and other measures of teaching quality are examined for both domains. These findings help us understand the nature of and factors associated with assignment quality in terms of intellectual demand.

  • Intellectual demand of assignments for math and English language arts is limited.
  • Intellectual quality of student work is systematically related to assignment demand.
  • Assignment quality is related to classroom demographics and academic achievement.
  • Assignments complement classroom observations as a measure of classroom interaction.

1. Introduction

1.1. Rationale

Central to recent educational accountability efforts are teacher evaluation systems that include measures of the quality of classroom interactions, with the underlying claim that what teachers do in the classroom matters ( Stodolsky, 1990 ). Classroom observations have received the greatest amount of attention in evaluating classroom interactions (e.g., Bell et al., 2012 , Gitomer et al., 2014 , Kane et al., 2014 , Taylor and Tyler, 2012 ). They provide important evidence about classroom interactions and guide interpretation of classroom interactions to make inferences about an array of classroom qualities including the goals that teachers have for students, the depth of content and reasoning that characterizes a given lesson, and classroom discourse. Formal teacher evaluation, predominantly focused on classroom observations, characterizes most OECD countries ( OECD, 2013a , Scheerens et al., 2012 ). Across OECD countries, the overarching articulated goal of evaluation is to improve teaching quality ( OECD, 2013b ).

Within the context of teacher evaluation systems, very little attention has been given to classroom artifacts as a direct source of evidence about the quality of classroom instruction. Yet, students spend a great deal of their instructional time working on and with assignments, whether they are instructional tasks or some type of assessment. Artifacts have been used as part of larger portfolios of teaching (e.g., OECD, 2013b , Stake et al., 2004 ) but not as standalone evidence of teaching quality.

The current study examines the quality of assignments in middle school mathematics (math) and English language arts (ELA) classrooms as part of a larger study of measures of teaching quality. We define teaching quality as “the quality of interactions between students and teachers; while teacher quality refers to the quality of those aspects of interactions that can be attributed to the teacher” ( Bell et al., 2012 , pp. 63–64). We acknowledge that, in many cases, assignments may not simply reflect instructional decisions of the teacher. Assignments may be part of a curriculum that is determined at the school or district level. Thus, assignments can also provide information about what the district holds as its view of quality teaching, with the teacher acting as “a key connection between policy and practice …” ( Cohen & Hill, 2000 , p. 329).

This study is intended to provide evidence that classroom assignments through collected artifacts can provide complementary interpretations about classroom interaction quality. We investigate how a protocol can be used to assess the quality of teaching practice and student learning by evaluating the quality of assigned quizzes, tests, and in-class work. These artifacts are, in certain ways, more straightforward to interpret than are classroom observations. Specifically, assignment artifacts can make clear what is expected of students and how students respond to those expectations in ways that are not always observable within a set of classroom interactions (see Gitomer & Bell, 2013 ). This study also contributes further validity evidence (see Kane, 2013 ) for the interpretation of scores from an artifact protocol.

As part of a larger study of a broad set of measures of teaching quality, this study investigates the relationship of artifact scores to measures of classroom observations, teacher knowledge, and value-added measures based on standardized student achievement tests. Acknowledging that “no single measurement can capture the full range of teacher performance in different contexts or conditions” ( Looney, 2011 , p. 443), this research enables us to consider what artifacts can contribute to an understanding of teaching quality and how scores on artifacts are related to scores on other teaching quality measures.

As with classroom observations, quality of classroom assignments is a construct that has no absolute definition. Therefore, a given protocol provides a conceptual lens through which quality is defined (e.g., see Dwyer, 1998 ). For this work, we adopt the framework of authentic intellectual work ( Newmann, Bryk, & Nagaoka, 2001 ). Originally proposed by Archibald and Newmann (1988) , the framework characterizes the work students are typically asked to do as contrived and superficial and contrasts that with the kinds of work skilled adults often do. Authentic intellectual work is viewed as relatively complex and socially or personally meaningful.

Newmann et al. (2001) describe authentic intellectual work as having three distinctive characteristics. First, it involves the construction of knowledge, arguing that authentic work requires one to go beyond routine use of information and skills previously learned. Problem solvers must construct knowledge that involves “organizing, interpreting, evaluating, or synthesizing prior knowledge to solve new problems (p. 14).” The second characteristic of authentic intellectual work is disciplined inquiry, which involves the use of prior knowledge in a field, in-depth understanding, and elaborated communication. The final characterizing feature is value beyond school , the idea that work that people do authentically is intended to impact or influence others.

The principles of authentic work derive from philosophies and studies in constructivist traditions, including Bruer (1993), Dewey (1916), Resnick (1987), and Wolf, Bixby, Glenn, and Gardner (1991). Support for these constructivist pedagogies, on which the authentic work framework is based, includes Carpenter et al. (1989), Cobb et al. (1991), and Silver and Lane (1995). In this constructivist tradition, students engage with real-world problems that have legitimacy within their own experiences and that involve the structuring and restructuring of knowledge rather than simply reporting back information that they have reviewed.

A number of studies have provided empirical support. Newmann, Marks, and Gamoran (1996) studied a set of restructured schools that were designed around authentic intellectual engagement and related constructivist practices. They found that authentic pedagogical practice explained approximately 35% of the variance in student performance. D'Agostino (1996) studied compensatory (low-achieving) education third-grade classrooms and found a strong relationship between authentic instruction and math problem solving. Findings for reading comprehension were more ambiguous. Knapp, Shields, and Turnbull (1992) studied high-poverty schools and found that those classrooms that engaged in authentic practices of meaning making, disciplinary thinking, and connections with the real world produced students who were substantially stronger in their academic attainment.

The framework of authentic intellectual engagement is the foundation of the artifact protocol used in this study, the Intellectual Demand Assignment Protocol (IDAP) ( Wenzel, Nagaoka, Morris, Billings, & Fendt, 2002 ). While other assignment protocols build on different frameworks, the assignment protocols cited in the literature all focus on some variation of intellectual demand.

Prior classroom assignment research has provided understanding of the intellectual demands that are placed on students, how students respond, and how these assignments affect student outcomes. Students respond to authentic work that is challenging, constructive, and relevant ( American Institutes for Research, 2003 , Beane, 2004 , Daggett, 2005 , Dowden and Relevant, 2007 , Milanowski et al., 2009 , Ng, 2007 , Paik, 2015 , Prosser, 2006 , Woolley et al., 2013 ). Intellectual demand has also been measured reliably ( Borko et al., 2005 , Clare and Aschbacher, 2001 , Matsumura et al., 2008 ) and is connected to student outcomes ( Matsumura and Pascal, 2003 , Mitchell et al., 2005 , Newmann et al., 2001 ).

This study is situated within a larger validation effort of measures of teaching quality. Following Messick (1989) and Kane (2013) , we investigate evidence of the extent to which scores from an artifact protocol support the appropriateness of inferences about teaching quality in middle school math and ELA classrooms. Adopting a theoretical framework of intellectual demand, this research seeks empirical support for the use of artifacts to make judgments of teaching quality by investigating the following questions:

  • 1. How are scores representing assignment intellectual demand distributed for math and ELA?
  • 2. What is the relationship between the intellectual demand of a given assignment and the student work produced in response?
  • 3. What are the relationships between assignment intellectual demand and other measures of teaching?
  • 4. How are assignment scores related to contextual variables including teacher characteristics, class demographics, schools, or districts?

1.2. Review of research on assignment quality as measures of classroom practice

Initial validation work of IDAP ( Wenzel et al., 2002 ) provided evidence that IDAP scores could support inferences about the quality of classroom assignments in the Chicago Public Schools. Artifacts could be rated reliably, though there was some year-to-year drift. In addition, more challenging artifacts were associated with higher test scores ( Newmann et al., 2001 ). Note, however, these initial validation studies used status scores rather than a value-added measure. They also did not include observation measures as alternative measures. Additional work supporting the validity of using IDAP was done by Mitchell et al. (2005) , who also demonstrated that artifacts could be scored reliably and that scores were related to status achievement scores.

Studies using other assignment protocols have also examined the validity and reliability of classroom assignments. Matsumura and colleagues found that a reliable estimate of ELA classroom assignment quality could be attained with three assignments, that there was overlap among the scales, and that there was a relationship between assignment quality and other measures of teaching quality ( Clare, 2000 , Clare and Aschbacher, 2001 , Clare et al., 2001 , Matsumura and Pascal, 2003 , Matsumura et al., 2008 ). Similar work looking at middle school mathematics and science classes found that a reliable estimate of classroom practice could be based on teacher assignments and student work ( Borko et al., 2007 , Borko et al., 2005 ).

Studies have also explored factors related to the quality of assignments in a variety of contexts. Koh and Luke (2009) also found a relationship between quality of teacher assignments and student work in Singapore middle school social studies, science, English, and math classrooms. Students of different cultural and socioeconomic backgrounds receive lower-quality assignments, even in “progressive” settings ( Auerbach, 2002 , Rubin, 2003 ), suggesting that access to intellectually demanding assignments is not uniform across classrooms and schools.

1.3. Assumptions underlying the IDAP protocol

Every protocol, including IDAP, makes a set of assumptions about its range of application and the specifics of how evidence is to be collected and assessed ( Wenzel et al., 2002 ). Several key assumptions described below guided the design of this study.

1.3.1. Omission of the artifact origin

In considering demand, the focus is on what students are asked to do, independent of who actually designed the artifact. This study is not an evaluation of a teacher's ability to design instructional artifacts, including assessments, nor should scores be influenced by whether the artifact was authored by a commercial publisher, a school district, or the teacher herself.

1.3.2. Differentiation of challenging and typical assignments

Teachers assign particular work to satisfy different educational purposes, and some artifacts will not be as intellectually challenging as others. In some classes, for example, teachers may assign a relatively routine worksheet to develop a set of procedural skills. In another class, the same teacher might assign a complex project that requires extended disciplinary inquiry and communication. Thus, this study followed Wenzel et al.’s model (2002) and collected both challenging and typical assignments as part of IDAP.

1.3.3. Treatment of subject areas

While the construct of intellectual demand cuts across all subject areas ( Newmann, King, & Carmichael, 2007 ), the manifestation of intellectual demand is domain-specific. Construction of knowledge, disciplinary inquiry, and the audiences and purpose have meaning specific to respective disciplines. Thus, the study, following Wenzel et al. (2002) , includes separate protocols for math and ELA, all within the common framework of authentic intellectual demand.

1.3.4. Treatment of grade level

As with subject areas, the framework of authentic intellectual demand is deemed to be appropriate across grade levels ( Newmann et al., 2007 ). The construct of intellectual demand is not based on the grade-associated difficulty of the content expressed in the task but on the depth of the thinking about the content expected of the student.

1.3.5. Treatment of practices

IDAP does not privilege particular teaching practices. Newmann et al. (2007) argue that “no single practice or set of practices has been shown to be most effective for varied intellectual outcomes for most students across several grade levels and subjects” (p. 15).

1.3.6. Selection of artifacts

This research develops from, but differs from, earlier work done using Teacher Work Samples ( Denner, Salzman, & Bangert, 2001 ) by not eliciting specific artifacts, but instead allowing for teacher decision-making ( Dwyer, 1998 ) in selecting artifacts to share, consistent with Wenzel et al. (2002) .

2.1. Sample

The study, conducted across two years, included three school districts within a large metropolitan area in the United States, across 47 middle schools and with 225 math and 225 ELA teachers who volunteered to participate. Teachers were asked to supply six assignments across the school year. Per the IDAP protocol ( Wenzel et al., 2002 ), we asked teachers to provide both typical assignments and those that they considered to be challenging. Teachers had latitude in determining what constituted “typical” and “challenging.” While a typical assignment was described to teachers as “everyday work,” a challenging one was described as “an assignment that gives you the best sense of how well your students are learning a subject or skill at their highest level.” No distinctions were made as to whether the assignments were produced by the teacher, a commercial publisher, the school district, or any other entity.

The structure of the data is summarized in Fig. 1. Two classrooms (class sections) of each teacher were sampled. In the fall, a teacher selected a typical assignment from each of her two class sections and one challenging assignment from one of the class sections. In addition, per Wenzel et al. (2002), 10 samples of student work associated with each challenging assignment were randomly selected from the classroom (in some cases fewer than 10 were available). This procedure was repeated in the spring, except that the challenging assignment came from the other class section. Half the teachers were sampled in each year of data collection, yielding a total of 904 typical and 451 challenging assignments for each subject area. There were 4396 student work artifacts for math and 4390 for ELA.

Fig. 1

Study data structure.

2.2. Instruments

The study used the IDAP ( Wenzel et al., 2002 ) to examine classroom assignments and associated student work within a framework that valued authentic intellectual engagement. Two separate protocols, one for math and one for ELA, were used in the current study. For math, intellectual demand was considered in terms of three dimensions: communication, conceptual understanding, and real-world connections. For ELA, intellectual demand dimensions included: communication, construction of knowledge, and real-world connections. Teacher assignments and student work were each evaluated separately, and the scales for each were similar but not identical. The structure of the scales for assignments and student work is presented in Table 1 . The rubrics for each of the respective scales are available in Appendix A .

Table 1

Scale dimensions (scale range) for teacher assignments and student work in math and ELA.

2.2.1. Raters, training, and scoring

Artifacts were scored within domains by raters who had expertise both in teaching math (n = 17) or ELA (n = 22) and in using scoring rubrics to score constructed responses for other standardized assessment programs. Five highly experienced individuals in constructed-response scoring served as scoring leaders for each domain, providing guidance and oversight throughout the scoring process. Artifacts were distributed and displayed to raters using a proprietary scoring tool. Exemplar selection and rater training were led by experts (one each for math and ELA), who were involved in the original development of the IDAP. Papers were chosen by the experts from the pool of artifacts submitted as part of the study to serve as exemplars for each rubric and each score point.

Training and certification followed the same process for each of the six rubrics in each domain. First, benchmark papers were introduced to illustrate each score point. Then, using a set of 6 training papers, raters first scored the artifact, followed by a discussion with the scoring trainer. Next, each rater independently completed a set of 6 certification papers and was required to assign the pre-determined score in at least four cases (with no deviations of more than 1 score point) in order to begin the actual scoring. Training took between 2.5 and 5 h per scale. During scoring of the assignments, typical and challenging assignments were intermixed and rated in the same session. For student work, only challenging assignments were involved. Scoring leaders regularly read scored samples to ensure reliability. Scoring of the entire sample on each scale took 1–3 days, depending on the complexity of the scale and the number of artifacts.
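
The certification rule described above is simple enough to express as a small check. Below is a minimal sketch in Python of one reading of that rule (at least four exact matches out of six, with no single deviation greater than one score point); the function and example scores are hypothetical, not software used in the study.

    def certified(rater_scores, true_scores, min_exact=4, max_deviation=1):
        """Return True if a rater passes certification under the assumed rule."""
        pairs = list(zip(rater_scores, true_scores))
        exact = sum(r == t for r, t in pairs)          # exact agreements with pre-determined scores
        worst = max(abs(r - t) for r, t in pairs)      # largest deviation on any paper
        return exact >= min_exact and worst <= max_deviation

    print(certified([2, 3, 1, 2, 2, 4], [2, 3, 2, 2, 2, 4]))  # True: five exact, max deviation 1
    print(certified([2, 3, 1, 2, 1, 4], [2, 3, 3, 2, 2, 4]))  # False: one paper deviates by 2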

2.3. Other measures of teaching quality

In addition to the IDAP, the study included other measures associated with teaching quality.

2.3.1. Observation protocols

Each teacher in this study was observed four times, and each lesson was video-recorded. Three different protocols were used to score each of the four videos. All teachers were scored using the Framework for Teaching ( Danielson, 2013 ) and CLASS™ ( La Paro & Pianta, 2003 ). These are protocols designed to capture qualities of instruction, classroom environment, and interpersonal relationships across all subjects and grades. In addition, math teachers were scored using the subject-specific Mathematical Quality of Instruction (MQI) protocol ( Learning Mathematics for Teaching Project, 2011 ), and ELA teachers were scored using the subject-specific Protocol for Language Arts Teaching Observation (PLATO) ( Grossman et al., 2009 ). Additional details on the design, rater training, and collection of the observation protocol data are provided in Bell et al. (2012) . The observation lessons were collected independently from the assignments, meaning that the foci of the two sources of evidence were not linked in terms of specific content being studied. The estimated reliability of the observation scores at the teacher level using four observations was approximately 0.65 ( Bill & Melinda Gates Foundation, 2012 ). Reliability was estimated using a variance decomposition analysis that included teacher, section, lesson, rater, and residual factors.

2.3.2. Value-added modeling

VAM scores were computed at the teacher by class section level, using state standardized achievement tests. The dataset also included scores from the state accountability tests in math, ELA, reading, and science, longitudinally linked to individual students across grades. Finally, the data included student demographic and program participation information including race, gender, free or reduced price lunch status, English language learner (ELL) status, and indicators of participation in gifted or special education programs. VAM scores were computed using the latent regression model ( Lockwood & McCaffrey, 2014 ), which regresses outcome test scores on teacher indicator variables, student background characteristics, and student prior test scores while accounting for the measurement error in the prior test scores. Reliabilities for VAM scores using a variance decomposition analysis were 0.89 for math and 0.80 for ELA ( Lockwood, Savitsky, & McCaffrey, 2015 ).

2.3.3. Teacher knowledge

All math teachers took a version of the Mathematics Knowledge for Teaching (MKT) test ( Hill, Schilling, & Ball, 2004 ). The 38-item instrument was designed to assess “the mathematics teachers use in classrooms” ( Delaney, Ball, Hill, Schilling, & Zopf, 2008 , p. 172) through items that ask questions covering instructional choices and analysis of student answers in the areas of number, operations, and content knowledge as well as patterns, functions, and algebra content knowledge. For ELA, teachers were administered an abridged form (30 items) of the Praxis ® test used for ELA teacher certification. This instrument is intended to assess “ … knowledge, skills, and abilities believed necessary for competent professional practice” ( Educational Testing Service, 2015 , p. 5). Reliabilities for teacher knowledge scores were 0.85 for math and 0.78 for ELA ( Lockwood et al., 2015 ).

3.1. How are scores representing assignment intellectual demand distributed for math and ELA?

To determine scale scores we first calculated an average score of all ratings for a given artifact on a given scale. We then averaged those scale scores to calculate a total assignment score.
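
As an illustration of this two-step averaging, here is a minimal Python/pandas sketch. The column names and values are hypothetical placeholders, not the study’s data layout.

    import pandas as pd

    # Hypothetical long-format ratings: one row per rating of one scale of one artifact.
    ratings = pd.DataFrame({
        "artifact_id": ["a1", "a1", "a1", "a1", "a2", "a2", "a2"],
        "scale": ["communication", "communication", "conceptual", "real_world",
                  "communication", "conceptual", "real_world"],
        "rating": [2, 3, 1, 2, 1, 1, 2],
    })

    # Step 1: average all ratings for a given artifact on a given scale.
    scale_scores = ratings.groupby(["artifact_id", "scale"])["rating"].mean()

    # Step 2: average those scale scores to obtain a total assignment score per artifact.
    total_scores = scale_scores.groupby(level="artifact_id").mean()

    print(scale_scores)
    print(total_scores)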

Initially, however, it is important to establish that any variation in assignment scores is not a function of an undesirable confound, such as raters. In order to understand factors that might influence assignment scores, a variance component analysis was conducted. The assignment scale ratings were modeled via a multi-level cumulative logit mixed model (see Appendix B for details). Fitting this model allowed for a look at the reliability of the raw scores by computing the proportion of modeled variance accounted for by the school, teacher, classroom, assignment, and rater (see Table 2 ). Raters are associated with only 3% of the modeled variance for both math and ELA. In contrast, assignments account for 72% and 95% of the modeled variance for math and ELA, respectively. This suggests that most of the raw scale score variation is due to differences in assignment quality rather than any rater effects.

Table 2

Variance components analysis for total assignment quality scores.
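
This kind of variance decomposition can be illustrated with a small simulation. The study fit a multi-level cumulative logit mixed model (see Appendix B); the sketch below substitutes a Gaussian linear mixed model with crossed variance components in statsmodels, which is only an approximation of that ordinal model, and all factor sizes, effect sizes, and column names are invented.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)

    # Simulate hypothetical ratings: assignments nested in teachers nested in schools,
    # crossed with raters; most of the simulated variance sits at the assignment level.
    n_schools, n_teachers, n_assignments, n_raters = 5, 6, 4, 8
    rater_eff = rng.normal(0, 0.15, n_raters)
    rows = []
    for s in range(n_schools):
        s_eff = rng.normal(0, 0.2)
        for t in range(n_teachers):
            t_eff = rng.normal(0, 0.3)
            for a in range(n_assignments):
                a_eff = rng.normal(0, 0.8)
                for r in rng.choice(n_raters, size=2, replace=False):  # two raters per artifact
                    rows.append({
                        "school": s, "teacher": f"{s}-{t}", "assignment": f"{s}-{t}-{a}",
                        "rater": int(r),
                        "rating": 2 + s_eff + t_eff + a_eff + rater_eff[r] + rng.normal(0, 0.3),
                    })
    df = pd.DataFrame(rows)

    # Crossed random effects in statsmodels: treat all rows as one group and declare
    # each factor as a variance component.
    model = sm.MixedLM.from_formula(
        "rating ~ 1",
        groups=np.ones(len(df)),
        vc_formula={"school": "0 + C(school)", "teacher": "0 + C(teacher)",
                    "assignment": "0 + C(assignment)", "rater": "0 + C(rater)"},
        data=df,
    )
    fit = model.fit()

    # Proportion of modeled variance for each component plus the residual.
    # In recent statsmodels versions, exog_vc.names aligns with fit.vcomp;
    # otherwise the component names can be read from fit.summary().
    parts = dict(zip(model.exog_vc.names, fit.vcomp))
    parts["residual"] = fit.scale
    total = sum(parts.values())
    for name, value in parts.items():
        print(f"{name:>10s}: {value / total:5.1%}")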

To account for any potential residual rater effects, as well as the possibility that some scales are more stringent than others (i.e., it is more difficult to obtain a high score on a particular rubric), a modeling approach was used to estimate true assignment scores. Linacre’s (2015) Many-Faceted Rasch Model (MFRM), the same IRT modeling approach used in Newmann et al. (2001) and Mitchell et al. (2005), was used to estimate assignment intellectual demand (for further information, see http://www.winsteps.com/facetman/theory.htm).

Modeling did not have a substantial effect on the study results. The relative ranking and distribution of assignment scores are highly consistent between the MFRM-adjusted overall scores and the simple averages computed across scales (see Table 3). Correlations between adjusted and raw averages range from 0.97 to 0.99 across all scales. The modeled or “adjusted” scores are used in all further analyses that include overall assignment quality scores, and raw scores are used in analyses focused on specific scales.

Table 3

Teacher assignment raw and adjusted scores (SD).

Note. Range = 1–3.67. df - Typical = 904; df - Challenging = 451.

3.1.1. Math

Average intellectual demand scores are relatively low with respect to the overall scale (see Table 3 ). Fig. 2 shows a large proportion of scores below 2 on a scale that ranges from 1 to 3.67 for both typical and challenging assignments. Though the challenging assignments show higher intellectual demand, they are still relatively lacking in demand for substantial demonstration of conceptual understanding, explanation of solution path, or connection to real-world issues. Differences between typical and challenging scores indicate that there is an increase in demand though the overall distribution of scores remains highly restricted. A paired t -test comparing the average adjusted scores of typical and challenging assignments from the same teacher indicates a small yet statistically significant difference between the two types of assignments ( t (443) = 6.1, p  < 0.001, d  = 0.16). Given the skewness of the data, however, the precise estimate of any effect sizes should be interpreted with appropriate caution.

Fig. 2

Math assignment score distributions.
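
A paired comparison of this kind can be sketched as follows. The scores below are simulated, not the study’s data, and the study does not state which convention it used for Cohen’s d; the sketch uses the pooled standard deviation of the two sets of scores.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Hypothetical per-teacher adjusted scores for one typical and one challenging assignment.
    typical = np.clip(rng.normal(1.55, 0.35, size=444), 1, 3.67)
    challenging = np.clip(typical + rng.normal(0.06, 0.35, size=444), 1, 3.67)

    # Paired t-test on the same teachers' typical vs. challenging scores.
    t_stat, p_value = stats.ttest_rel(challenging, typical)

    # One common convention for Cohen's d: mean difference over the pooled SD.
    diff = challenging - typical
    pooled_sd = np.sqrt((typical.var(ddof=1) + challenging.var(ddof=1)) / 2)
    d = diff.mean() / pooled_sd

    print(f"t({len(diff) - 1}) = {t_stat:.2f}, p = {p_value:.4g}, d = {d:.2f}")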

3.1.2. ELA

There is a relatively low level of intellectual demand for typical assignments, with a mean of 1.58 on a scale that extends to 3.67. Most assignments received a score of less than 1.5. There is a clear difference in the distribution of challenging scores, as many more assignments received scores of 2 or higher (see Table 3 and Fig. 3). Although the data for challenging assignments are still skewed, they are also more evenly distributed across the scale. Typical and challenging ELA assignment adjusted scores from the same section are also statistically different (t(444) = 12.7, p < 0.001, d = 0.48).

Fig. 3

English language arts (ELA) assignment score distributions.

3.1.3. Performance on individual scales

Table 4 presents information on each of the scales contributing to the overall intellectual demand score for both typical and challenging assignments in math and ELA. Differences between the typical and challenging assignments are also statistically significant at the scale level in both domains.

Table 4

Mean assignment demand by scale (SD) (n), including differences between typical and challenging assignments.

Note. For this analysis the n reflects total number of ratings contributing to the scale, including cases with two ratings.

* p  < 0.05, ** p  < 0.01, *** p  < 0.001.

For math, all scales scores are slightly higher for challenging than for typical assignments. More challenging assignments are somewhat more likely to ask students to show their work or provide some types of explanation. The other scales have smaller differences for assignment type, although all score differences between challenging and typical assignments are statistically significant.

For ELA, score differences between typical and challenging assignments are larger. Challenging assignments are more likely to ask for some extended writing as a characteristic associated with rigor and construction of knowledge and more frequently ask students to take on a role in their writing. However, very rarely are these roles judged to be real world for the student (as described by the lack of scores of 3 and 4 on the relevance scale).

Table 5a and Table 5b present the correlations among scales for math and ELA, respectively. Scale scores were aggregated at the teacher level, pooled across all assignments, and correlated using Spearman’s rank correlation (rho). Analyses that included only typical or only challenging assignments yielded similar results. All correlations are statistically significant yet moderate, supporting the idea that the IDAP dimensions each measure related but distinct constructs.

Table 5a

Assignment inter-scale correlations for mathematics.

Table 5b

Assignment inter-scale correlations for English language arts (ELA).

Note. All correlations p  < 0.01.
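
A minimal sketch of this kind of inter-scale correlation is given below, using hypothetical teacher-level scale means; the scale names follow the math dimensions listed earlier, and the values are invented.

    import pandas as pd
    from scipy.stats import spearmanr

    # Hypothetical teacher-level mean scale scores, one row per teacher.
    scores = pd.DataFrame({
        "communication":            [1.4, 2.1, 1.8, 1.2, 2.6, 1.9, 1.5, 2.3],
        "conceptual_understanding": [1.5, 2.0, 1.6, 1.1, 2.4, 2.2, 1.4, 2.1],
        "real_world":               [1.1, 1.8, 1.3, 1.0, 2.0, 1.7, 1.2, 1.9],
    })

    # Pairwise Spearman rho matrix and two-sided p-values (columns treated as variables).
    rho, p = spearmanr(scores)
    labels = scores.columns
    print(pd.DataFrame(rho, index=labels, columns=labels).round(2))
    print(pd.DataFrame(p, index=labels, columns=labels).round(3))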

3.2. What is the relationship between assignment quality and resulting student work in math and ELA?

This analysis examines the relationship between the demands contained in classroom assignments and the corresponding work that students do to fulfill the assignment. Student work scores were regressed onto the teacher assignment ratings using hierarchical linear modeling (HLM; Raudenbush & Bryk, 2002 ) to describe the overall relationship of teacher assignment quality and quality of student work.

To better understand this relationship, we divided each scale into thirds and categorized each assignment as low, medium, or high and then reported the mean student work score for each category (see Table 6 ).

Table 6

Mean student work score by level of assignment demand (SD) (n).
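
Both steps described above (the regression and the banding into thirds) can be sketched with simulated data. The study’s exact HLM specification is not reproduced here; a simple random-intercept model (student work nested within teachers) from statsmodels is used as a stand-in, and all column names and effect sizes are invented.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)

    # Hypothetical data: ~10 pieces of student work per challenging assignment, nested in teachers.
    n_teachers, work_per_teacher = 60, 10
    rows = []
    for t in range(n_teachers):
        assignment_demand = rng.uniform(1, 3.67)
        teacher_eff = rng.normal(0, 0.2)
        for _ in range(work_per_teacher):
            work_score = 1 + 0.4 * (assignment_demand - 1) + teacher_eff + rng.normal(0, 0.4)
            rows.append({"teacher": t,
                         "assignment_demand": assignment_demand,
                         "work_score": float(np.clip(work_score, 1, 3.67))})
    df = pd.DataFrame(rows)

    # Random-intercept model: student work quality regressed on assignment demand,
    # with a random intercept for teacher. To report a standardized coefficient,
    # z-score both variables before fitting.
    result = smf.mixedlm("work_score ~ assignment_demand", data=df, groups="teacher").fit()
    print(result.summary())

    # Descriptive follow-up: split the assignment demand range into thirds and report
    # the mean student work score within each band.
    df["demand_band"] = pd.cut(df["assignment_demand"],
                               bins=np.linspace(1, 3.67, 4),
                               labels=["low", "medium", "high"],
                               include_lowest=True)
    print(df.groupby("demand_band", observed=True)["work_score"].agg(["mean", "std", "count"]))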

3.2.1. Math

For math, more than 2000 student artifacts were produced in response to low-demand assignments and were uniformly rated low in demand. As intellectual demand of the assignment increased, there is a slight increase in quality of associated student work, but student scores remain relatively low. The weak relationship was confirmed by regressing student work quality onto teacher assignment quality ratings with an HLM, which returned a standardized coefficient of only 0.23 (t = 15.5), with an R² value of only 0.14. That is, math assignment ratings accounted for only 14% of the observed variance in student work scores. The most salient point, however, is the relative absence (∼6%) of high-demand teacher assignments: assignments with scores that fell into the top third of the score scale. In fact, only 21 (<10%) of the 225 math teachers had even one assignment score in the high-demand category, and only two (<1%) teachers in the study had more than one assignment rated at that level.

3.2.2. ELA

For ELA, there is a noticeable shift in the quality of student work in response to increased demand in the assignment, with a marked change in distribution as assignments move from low demand to medium and high demand. HLM analysis returned a standardized coefficient of 0.72 (t = 23.8) and an R² value of 0.52, suggesting a strong relationship between ELA assignment demand and student work.

3.3. What is the relationship between assignment quality and other measures of teaching quality in math and ELA?

We analyzed the relationships between teacher assignment and student work scores and other measures used to characterize teaching quality. All correlations were run at the teacher level. Teacher assignment scores included only challenging classroom assignments since they were the only assignments that also included student work. All results described in the following subsections are presented in Table 7 .

Table 7

Teacher-level correlations between challenging assignment quality, student work, and other measures of teaching quality.

Note. * p < 0.05. ** p < 0.01. *** p < 0.001.

3.3.1. Teacher knowledge

No statistically significant correlation was observed between quality of math assignments or student work and scores on the MKT teacher knowledge assessment. For ELA, positive and statistically significant correlations were observed for both assignments and student work.

3.3.2. Observation protocols

For each observation protocol we report relationships with total observation scores. For math, the correlation of assignment scores with overall observation scores was not statistically significant for either protocol. However, small but statistically significant positive relationships with student work were observed for the domain-general protocols but not for the subject-specific MQI. For ELA, all three protocol total scores were statistically significantly and positively related to both teacher assignments and student work.

3.3.3. Value-added measures (VAM)

No relationship was found between VAM and teacher assignment quality in math or ELA, nor was a relationship observed between ELA student work and VAM. For math, there is a statistically significant but small positive association between the quality of student work and VAM.

3.4. How are assignment scores related to contextual variables including teacher characteristics, class demographics, schools, or districts?

In this analysis, we attempt to understand contextual, assignment, and teacher factors associated with assignment quality. These analyses were conducted at the class section level to capture differences between two class sections that a teacher taught. Class section, teacher, and school variables were all incorporated as random effects to account for the nested structure of the data. In addition, a number of contextual variables were included in the model and treated as fixed effects. These variables were time of the school year ( season ) during which the assignment was given, assignment type, different scales, student grade ( grades 6–8 ), and school district ( 3 districts ).

The model also included predictor groups of variables intended to capture important contextual dimensions. A first group, class section demographic variables, included the percentage of students in each section identified as free/reduced-price lunch eligible, minority, or English language learners. A second group, prior achievement variables, included mean class section prior achievement scores as well as the proportion of students in each class section classified as either gifted or special education. A third group, teacher quality variables, consisted of teacher knowledge scores and VAM estimates.

To model the likelihood of different factors being associated with assignment ratings of different quality, scale scores were analyzed via a multi-level cumulative logit mixed model described in Appendix B . A full model incorporating all of the aforementioned contextual variables and predictors was fitted to the data to investigate their relationship with assignment ratings. Estimates from the full model are presented in Table 8 and elaborated on in the following subsections.

Table 8

Model estimates of contextual variables and assignment demand scores.

Note. * p  < 0.05. ** p  < 0.01. *** p  < 0.001. Coefficients are in logits.

There was multicollinearity among the individual variables within each predictor group, making the estimates of single variables within each group problematic to interpret. Therefore, a series of likelihood ratio tests (LRTs) was conducted, each comparing the full model with a model containing all variables except those belonging to a specific predictor group, in order to evaluate the effect of including that group. The p-values of the LRTs are presented in Table 8.
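
The mechanics of such a predictor-group likelihood ratio test can be sketched as follows. The study’s model was a multi-level cumulative logit mixed model; purely for illustration, the sketch substitutes ordinary least squares fits on simulated section-level data, and all variable names and effects are invented.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy import stats

    rng = np.random.default_rng(3)

    # Hypothetical section-level data: a demographic predictor group (pct_frl, pct_ell)
    # and a prior-achievement predictor (prior_mean).
    n = 400
    df = pd.DataFrame({
        "pct_frl": rng.uniform(0, 1, n),
        "pct_ell": rng.uniform(0, 0.4, n),
        "prior_mean": rng.normal(0, 1, n),
    })
    df["demand"] = (2.0 - 0.5 * df["pct_frl"] - 0.3 * df["pct_ell"]
                    + 0.2 * df["prior_mean"] + rng.normal(0, 0.3, n))

    # Full model vs. a model that drops the demographic predictor group.
    full = smf.ols("demand ~ pct_frl + pct_ell + prior_mean", data=df).fit()
    reduced = smf.ols("demand ~ prior_mean", data=df).fit()

    # Likelihood ratio statistic compared against a chi-square with df equal to the
    # number of dropped parameters.
    lr_stat = 2 * (full.llf - reduced.llf)
    df_diff = full.df_model - reduced.df_model
    p_value = stats.chi2.sf(lr_stat, df_diff)
    print(f"LRT: chi2({int(df_diff)}) = {lr_stat:.2f}, p = {p_value:.4g}")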

3.4.1. Math

Statistically significant effects were observed for school district, such that assignments from one district were more intellectually demanding than those from the other two districts. Sixth-grade assignments were more intellectually demanding than assignments in older grades. Both of these findings suggest the possibility of curricular differences across grades and/or districts that might account for the intellectual demand of assignments. As observed in other analyses, challenging assignments differed from typical assignments, and scale distributions differed as well.

Class sections with larger proportions of minority, English language learner, and free or reduced price lunch students were associated with assignments that had lower intellectual demand. Class sections that had higher prior achievement scores, more students classified as gifted, and fewer classified as special education had assignments with greater intellectual demand. However, there was not a significant difference in intellectual demand among classes that differed with respect to the teacher's knowledge and VAM scores.

3.4.2. ELA

For ELA, relationships with contextual variables differed somewhat from math. No district effects were observed, and eighth grade had more challenging assignments than the lower grades. Additionally, there was an effect of season, such that assignments given in the second half of the school year were less challenging than those given in the first half. There were substantial differences across assignment type and scale.

Patterns among ELA predictor group variables paralleled math. Class sections with larger proportions of minority, English language learner, and free or reduced price lunch students were associated with assignments that had lower intellectual demand. Class sections that had students with higher prior achievement scores, more students classified as gifted, and fewer students classified as special education were given assignments with greater intellectual demand. However, there was not a significant difference in intellectual demand among classes that differed with respect to the teacher's knowledge and VAM scores.

4. Discussion

This study presents findings that describe how classroom artifacts can provide unique insights into the quality of instruction experienced by students in mathematics and ELA. These insights are complementary to those provided by existing measures, capturing both the instructional expectations set in a particular classroom and the student responses to these expectations. This research demonstrates that artifacts can provide explicit and reliably judged evidence about the intellectual demand of the work students are asked to do and the intellectual demand in the students’ response, shedding light on curricular and contextual factors as well as teaching quality. The study also provides evidence about the relationship between intellectual demand of assignments and other measures of teaching quality and contextual correlates.

A critical distinction in examining teaching has been the contrast of teacher quality with teaching quality ( Kennedy, 2010 ). Teacher quality focuses on the individual and assumes a causal connection between what the teacher does and what the student learns. Teaching quality focuses on the teaching and the teacher as a central actor in a complex system that includes a host of contextual factors including curriculum, policy, and community ( Bell et al., 2012 ). Studies of teaching quality more often include a focus on normative practice as a way of understanding consistencies in practice across a system. This research is an examination of teaching quality and avoids any claims of causal attribution to individual teachers. Instead, this study focuses on both differences and commonalities across classrooms as a way of understanding teaching quality situated in an understanding of the contextual factors. That is, artifacts provide evidence of instructional quality that is likely attributable to factors that include, but are not limited to, the teacher. This is a significant departure from teacher observations that tend to be centered on a teacher's strengths or weaknesses and, instead, focus on the actual intellectual demand asked of a student.

Evidence from assignments reinforces findings from observations that the intellectual demand of classroom interactions is very limited and relatively consistent across classrooms. As in observation studies, including the study from which these data are drawn (see Bell et al., 2014), dimensions associated with intellectual demand consistently score relatively low on different protocols (Gitomer et al., 2014) and are clustered around a narrow range at the lower part of respective scales. Score variance attributable to raters is very low, and scale inter-correlations are moderate. While the construct of intellectual demand is the basis of IDAP for both math and ELA, the specific features of design and performance that define the domain-specific protocols are unique and consistent with the structure of particular disciplines. For both math and ELA, raters are able to reliably score artifacts and to differentiate scores on three scales designed to capture intellectual demand; the IDAP can therefore be used effectively to rate artifacts as evidence of teaching quality. In addition to this general pattern, there are specific results that are appropriately reported separately by domain.

The construct of intellectual demand is the basis of IDAP for both math and ELA. However, the specific features of design and performance that define the domain-specific protocols are unique and consistent with the structure of particular disciplines. For both math and ELA, raters are able to reliably score artifacts and to differentiate scores on three scales designed to capture intellectual demand. Therefore, the IDAP can be used effectively to rate artifacts as evidence of teaching quality.

In math, the intellectual demand of artifacts is quite low and narrowly distributed for all scales, regardless of whether the teacher categorized the artifact as typical or challenging. Though there is a statistically significant difference in scores for typical and challenging assignments, those that teachers select as more challenging show only very small increases in each of the scales. To the limited extent that math artifacts show more demand, students are more likely to produce work that reflects this demand. Of course, the demands expressed on an assignment do not always reflect the full expectations of the teacher—some may be communicated orally or through other habits developed in the classroom.

The lack of substantial variation in math assignment scores leads to two sets of findings addressing issues of teaching quality. First, correlations with other variables of interest are highly attenuated. Therefore, these correlations with other measures of teaching quality are either not significant or very small. However, the finding of district and grade effects suggests the potential of this kind of artifact analysis to identify systematic differences that are worthy of further exploration. For example, can curricular differences at either the district or grade level account for these differences? Also, score decreases were found to be attributable to reduced attention to real-world relevance in the higher grades.

From a teaching quality perspective, however, this lack of variation is, in fact, quite important. The consistent lack of intellectual demand in math assignments suggests that issues of teaching quality must be addressed systemically. Even the teachers with the highest artifact scores had artifacts that scored low relative to the IDAP scales. While there are significant contextual relationships with demographic and prior achievement variables, the overarching story is that even in classrooms characterized by favorable predictors, the intellectual demand of assignments is quite low. In terms of the protocol, across the vast majority of classrooms in the study, students are asked to solve routine math problems that require little more than procedural execution and lack extended communication or real-world connections. These insights are valuable to stakeholders in a multitude of ways. The most obvious would be to exert pressure on publishers to deliver materials that are relevant, rigorous, and constructive so that these tasks can then be implemented as such in the classroom. Additionally, the small distinction between typical and challenging can be interpreted as a potential area for professional development for teachers in terms of how to effectively provide challenge in the classroom. Finally, in these findings there is a clear call to researchers and educators to further explore the role of relevance in middle years education.

Findings in ELA classrooms are somewhat different. For ELA, many of the challenging assignments are qualitatively different from typical assignments. Challenging assignments, while still modest in terms of the overall scale, are more likely to ask students to provide elaborated communication, to engage in greater construction of knowledge, and to complete more authentic writing tasks in which they attempt to take on a role, even if at a relatively superficial level. The more intellectually demanding ELA assignments are strongly associated with student work that evidences greater intellectual demand.

ELA scores are positively correlated with observation scores. Classrooms with higher intellectual demand in assignments are also characterized by observation scores that describe stronger classroom interactions. These classrooms are also taught by teachers with higher teacher knowledge scores. There are also significant contextual relationships with demographic and prior achievement variables.

For ELA only, there is an effect of the season during which the assignment is given. Assignments given in the spring are less demanding than those given in the fall. One conjecture is that the decrease may reflect an increased focus on standardized test preparation in the spring. Another is that teachers may experience planning fatigue as the year progresses. Also, assignments in eighth grade were more challenging than those in sixth grade, although no difference was noted for seventh grade. Further study is needed to understand this trend and whether it may be related to a perceived need for deeper tasks in preparation for high school.

Artifact research does allow for some important insights that are not always picked up by other measures. For example, differences between the intellectual demand asked of students from different subgroups would be critical to key stakeholders in addressing equity of access. Additionally, if the spring drop in demand is found to be related to the current timing of high-stakes testing, a response may be needed, such as revising the intent and composition of the tests themselves.

Across both domains, correlations of assignment scores with VAM are not statistically significant. This finding raises issues deserving of further study and is consistent with the argument that demand drops in the spring in preparation for the standardized testing from which VAM is calculated. It may be that the items on the achievement tests used to determine VAM estimates do not ask students to engage in intellectually demanding tasks as valued by the IDAP. Similarly, it may be that improvement in achievement test scores can be made simply by focusing on the skills associated with low-demand tasks. A much better understanding of the alignment of test demands with assignment demands is needed to interpret this lack of consistent relationship between the measures.

Grade, district, and seasonal effects suggest the possibility that assignment differences reflect differences in mandated curriculum in addition to any differences that can be attributed to individual teachers. Current teacher evaluation efforts have placed a very substantial focus on individual teacher differences, but there are likely broader contextual factors such as curriculum that also need to be given attention ( Gitomer, 2017 ). The link between mandated curriculum reform and teacher interpretation, as explored by Cohen and Hill (2000) , merits further investigation.

4.1. Limitations

It is important to recognize features of the research that have implications for interpreting the study's findings. As noted, the Newmann et al. (2001) framework represents a particular perspective on instruction. Different frameworks might support different inferences about the quality of assignments. The framework used in this study also does not consider important teaching and learning issues such as learning progressions, adaptive materials for students with special needs, culturally responsive pedagogy, and the relationship of artifacts to curriculum design. The IDAP may have more or less utility for studying these kinds of problems.

The issue of teachers selecting challenging and typical assignments raises potentially interesting questions as well. In this study, teachers were given limited guidance in deciding what constituted a challenging assignment. Thus, inferences in this study are not simply about the quality of an assignment but also include teacher choice of the assignment. While the student data suggests that the teachers were differentiating assignments in ways that were sensitive to the protocol, there is a question of whether different, and perhaps more directive, sampling procedures would lead to different interpretations.

Finally, as with most artifact studies, judgments are made on the basis of evidence contained in the artifact documents. Evidence of what transpired in the classroom around the enactment of the artifacts is not available. Certainly interactions between teachers and students might strengthen or weaken the expectations that can be inferred by only considering the physical documentation.

4.2. Conclusions

As a measure of teaching quality, assignments have the potential to contribute to an understanding of the extent to which intellectual demand is present in classrooms and across classrooms within schools and districts. These insights can then be used to address development needs in each of these areas, leading to improvement in teaching quality and ultimately to improvement of student outcomes ( Yoon, Duncan, Lee, Scarloss, & Shapley, 2007 ). By examining score patterns as we have done, schools and districts may be able to better identify both the extent to which students are asked to engage in intellectually meaningful work and the teacher, classroom, and institutional factors that contribute to students' learning experience. Further, with this insight, schools and districts could set priorities for improvement at a local or system level.

This work was supported by the Bill & Melinda Gates Foundation, Grant# OPP52048, as part of the Understanding Teaching Quality (UTQ) project.

Acknowledgements

The authors are grateful to Steve Leinwand and Carmen Manning for leading the rater training effort and to Lesli Scott and Terri Egan for coordinating data collection and scoring, respectively. We are grateful to our collaborators, Courtney Bell, JR Lockwood, and Dan McCaffrey, for all of their contributions to study design as well as comments on earlier versions of the paper. The authors alone are responsible for the content of this paper.

1 One classroom was team-taught resulting in an extra set of assignments.

2 The total scores were modeled as follows: $\log\left(\frac{P_{nmijk}}{P_{nmij(k-1)}}\right) = B_n - D_i - C_j - F_k$, where $B_n$ is the overall quality of assignment $n$, $D_i$ is the difficulty of the particular scale $i$, $C_j$ is the severity of rater $j$, $F_k$ is the threshold difficulty of rating $k$ versus $k-1$, $P_{nmijk}$ is the probability of receiving rating $k$, and $P_{nmij(k-1)}$ is the probability of receiving rating $k-1$.

Appendix A. 

IDAP protocol

Math - Teacher assignment scoring

Scale 1 - (Written mathematical communication)

3  =  Analysis/Persuasion/Theory . The task requires the student to show his/her solution path and to explain the solution path with evidence.

2  =  Report/Summary . The task requires the student to show her/his solution path or requires explanation of why.

1  =  Just answers . The task requires little more than answers. Students may be asked to show some work.

Scale 2 - (Conceptual Understanding)

4  = The dominant expectation of the assignment requires substantial demonstration of conceptual understanding that is related to one or more mathematical ideas.

3  = The dominant expectation of the assignment requires some demonstration of conceptual understanding that is related to one or more mathematical ideas.

2  = The dominant expectation of the assignment relates to a mathematical idea but primarily requires demonstration of procedural knowledge.

1  = The mathematical expectation of the assignment relates only tangentially, if at all, to a mathematical idea.

Scale 3 - (Real-world connection)

4  = The assignment clearly addresses an authentic situation that entails relevant mathematical questions, issues, or problems that are encountered in the real world and specifies that the work will be meeting the needs of a real and appropriate audience, one other than a teacher or student peers.

3  = The assignment clearly addresses an authentic situation that entails relevant mathematical questions, issues, or problems that are encountered in the real world but does not specify that the work will be meeting the needs of a real and appropriate audience.

2  = The assignment makes an attempt to address mathematical questions, issues, or problems that are encountered in the real world, but the connection is superficial, tangential, contrived, or weak.

1  = The assignment makes a minimal attempt to address the mathematical questions, issues, or problems that are encountered in the real world.

Math - Student work

Scale 1 - (Written mathematical communication)

3  = Student work demonstrates a complete indication of the solution path with complete and appropriate explanation/justification OR complete and appropriate explanation/justification of the student's thinking and conclusions.

2  = Student work demonstrates a clear indication of the solution path but little or no appropriate explanation or justification.

1  = Student work demonstrates little more than an answer, with little or no indication of the solution path and little or no appropriate explanation or justification.

Scale 2 - (Conceptual understanding)

4  = Student work demonstrates substantial conceptual understanding of important mathematical ideas and is free of misconceptions.

3  = Student work demonstrates some conceptual understanding of important mathematical ideas and may contain minor misconceptions.

2  = Student work demonstrates partial conceptual understanding of important mathematical ideas and may contain major misconceptions.

1  = Student work demonstrates little or no conceptual understanding of mathematical ideas.

Scale 3 - (Reasoning)

4  = Student work demonstrates appropriate and successful use of reasoning.

3  = Student work demonstrates reasoning, but it is not entirely appropriate or successful.

2  = Student work demonstrates some use of reasoning strategies.

1  = Student work represents primarily the reproduction of fragments of knowledge OR the application of previously learned procedures OR completely inappropriate reasoning.

ELA - Teacher assignments

Scale 1 - (Elaborated communication: expository//narrative)

4  = The task requires extended writing and asks students to make an assertion—stating a claim OR drawing a conclusion OR suggesting a generalization—AND to support the assertion with evidence offered by examples, details, illustrations, facts, AND/OR reasons.//The task requires extended writing and calls for imaginative texts that demonstrate the tone, style, conventions, and content of the genre AND that include detail.

3  = The task requires extended writing and asks students to EITHER make an assertion—stating a claim OR drawing a conclusion OR suggesting a generalization—OR to support a given assertion with evidence offered by examples, details, illustrations, facts, AND/OR reasons.//The task requires extended writing and calls for imaginative texts, BUT it does not require demonstration of the characteristics of the genre.

2  = Short answers are sufficient to complete the task; the task requires only one or two sentences, clauses, or phrases that complete a thought.

1  = The task is a fill-in-the-blank or multiple-choice task.

Scale 2 - (Construction of knowledge)

3  = The dominant expectation of the task is that students construct knowledge and/or an idea—to generate and explore ideas through interpretation, analysis, synthesis, and/or evaluation of information or through the use of literary strategies and techniques.

2  = There is some expectation that students construct knowledge or an idea—to generate and explore ideas through interpretation, analysis, synthesis, and/or evaluation of information or through the use of literary strategies and techniques for part of the task, BUT these skills are not the dominant expectation of the task.

1  = There is very little or no expectation for students to construct knowledge or an idea. Students can satisfy all or almost all of the requirements of the task by reproducing information they have read, heard, or viewed. For imaginative tasks, students are given very little or no direction in the use of literary strategies and techniques in order to construct an idea.

4  = The task requires students to take on a real-world role. The task also calls for a product that achieves a purpose beyond the simple demonstration of academic competence. In addition, the task requires that work be submitted to an audience other than teachers or students as graders.

3  = The task requires students to take on a real-world role. The task also calls for a product that achieves a purpose beyond the simple demonstration of academic competence. There is no requirement that the work be submitted to a real audience.

2  = The task requires students to take on a role but one that students could not realistically assume now or in the future. The primary purpose of the task is to demonstrate academic competence to teachers or students as graders.

1  = The task does not specify a role for the student. The primary purpose for the task is to demonstrate academic competence.

ELA - Student work

Scale 1 - (Elaborated communication: expository//narrative)

4  = The work includes extended writing that makes an assertion AND supports it with relevant evidence. The work ALSO is sufficiently developed, internally coherent, and effectively organized.//The work includes extended writing and detail. The work ALSO demonstrates the tone, style, conventions, and content of the genre AND evokes meaning through these literary devices.

3  = The work includes extended writing that makes an assertion AND supports it with relevant evidence. However, the work lacks sufficient development, internal coherence, and/or effective organization.//The work includes extended writing and detail. The work ALSO demonstrates the tone, style, conventions, and content of the genre.

2  = The work includes extended writing that EITHER makes an assertion OR provides evidence for a given assertion but does not do both.//The writing does not demonstrate the characteristics of the genre well, although it offers more sustained writing than isolated sentences or short answers.

1  = The writing is a series of isolated sentences or short answers that do not support a central idea, or the writing responds to fill-in-the-blank or multiple-choice tasks.

Scale 2 - (Construction of knowledge)

3  = The dominant part of the work constructs knowledge or an idea; the work generates and explores ideas through interpretation, analysis, synthesis, and/or evaluation of information or through the use of literary strategies and techniques.

2  = The work demonstrates moderate construction of knowledge or an idea. A portion of the work generates and explores ideas through interpretation, analysis, synthesis, and/or evaluation of information or through the use of literary strategies and techniques, BUT these skills are not the dominant part of the work.

1  = There is very little or no construction of knowledge or an idea. All or almost all of the work could have been generated by reproducing information the student has read, heard, or viewed. For imaginative tasks, literary strategies and techniques are not used to construct an idea.

Scale 3 - (Language conventions)

4  = The writing demonstrates competent command of spelling, vocabulary, grammar, and punctuation, AND effective use of language resources. The writing is almost error free.

3  = The writing demonstrates competent command of spelling, vocabulary, grammar, and punctuation. There may be some errors, but they present no problem for understanding the student's meaning.

2  = There are frequent errors in spelling, vocabulary, grammar, and punctuation, and they may interfere with the student's meaning, making the work difficult to understand.

1  = The writing is a series of isolated sentences or short answers that do not support a central idea, or the writing responds to fill-in-the-blank or multiple-choice tasks, or the writing is not the student's own.

Appendix B. 

Cumulative logit mixed model.

The model was fit using the clmm function (with default settings) in the R package ordinal ( Christensen, 2015 ). The dependent variable was the raw scale rating (ordinal), and the predictors were those in the different models described in the manuscript. The model is essentially an extension of the well-known multilevel/hierarchical linear models ( Raudenbush & Bryk, 2002 ) to ordinal dependent variables. Fitting the model with an ordinal outcome requires a logit link, and thus the estimated coefficients are on the logit scale. The hierarchical nature of the data is accommodated by the software estimating both fixed and random effects.
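
For readers unfamiliar with this model class, a minimal sketch of a cumulative logit mixed model with a single random intercept is shown below; the grouping term $u_j$ is written generically here, and the study's actual predictors and random-effects structure are those described in the manuscript.

$$\operatorname{logit} P(Y_{ij} \le k) = \theta_k - \left(\mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_j\right), \qquad u_j \sim N(0, \sigma_u^2),$$

where $Y_{ij}$ is the ordinal rating for observation $i$ in group $j$, $\theta_k$ are the ordered thresholds, $\boldsymbol{\beta}$ contains the fixed-effect coefficients for the predictors $\mathbf{x}_{ij}$, and $u_j$ is the group-level random intercept.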

Center for Teaching Innovation


Using rubrics

A rubric is a type of scoring guide that assesses and articulates specific components and expectations for an assignment. Rubrics can be used for a variety of assignments: research papers, group projects, portfolios, and presentations.  

Why use rubrics? 

Rubrics help instructors: 

  • Assess assignments consistently from student-to-student. 
  • Save time in grading, both short-term and long-term. 
  • Give timely, effective feedback and promote student learning in a sustainable way. 
  • Clarify expectations and components of an assignment for both students and course teaching assistants (TAs). 
  • Refine teaching methods by evaluating rubric results. 

Rubrics help students: 

  • Understand expectations and components of an assignment. 
  • Become more aware of their learning process and progress. 
  • Improve work through timely and detailed feedback. 

Considerations for using rubrics 

When developing rubrics consider the following:

  • Although it takes time to build a rubric, time will be saved in the long run as grading and providing feedback on student work will become more streamlined.  
  • A rubric can be a fillable PDF that can easily be emailed to students. 
  • Rubrics can be used for oral presentations. 
  • They are a great tool for evaluating teamwork and individual contributions to group tasks. 
  • Rubrics facilitate peer-review by setting evaluation standards. Have students use the rubric to provide peer assessment on various drafts. 
  • Students can use them for self-assessment to improve personal performance and learning. Encourage students to use the rubrics to assess their own work. 
  • Motivate students to improve their work by using rubric feedback to resubmit their work incorporating the feedback. 

Getting Started with Rubrics 

  • Start small by creating one rubric for one assignment in a semester.  
  • Ask colleagues if they have developed rubrics for similar assignments, or adapt rubrics that are available online. For example, the AAC&U has rubrics for topics such as written and oral communication, critical thinking, and creative thinking. RubiStar helps you develop your rubric from templates. 
  • Examine an assignment for your course. Outline the elements or critical attributes to be evaluated (these attributes must be objectively measurable). 
  • Create an evaluative range for performance quality under each element; for instance, “excellent,” “good,” “unsatisfactory.” 
  • Avoid using subjective or vague criteria such as “interesting” or “creative.” Instead, outline objective indicators that would fall under these categories. 
  • The criteria must clearly differentiate one performance level from another. 
  • Assign a numerical scale to each level (a minimal sketch of this criteria-by-levels layout appears after this list). 
  • Give a draft of the rubric to your colleagues and/or TAs for feedback. 
  • Train students to use your rubric and solicit feedback. This will help you judge whether the rubric is clear to them and will identify any weaknesses. 
  • Rework the rubric based on the feedback. 
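
To make the criteria-by-levels structure concrete, the sketch below lays out a rubric as a simple table of criteria and numbered performance levels with a total score. It is only an illustration: the criterion names, level labels, and descriptors are hypothetical, and nothing about the format is prescribed.

```python
# A minimal sketch of a rubric as a criteria-by-levels table.
# Criterion names, level labels, and descriptors are hypothetical examples only.
RUBRIC = {
    "Thesis and argument": {
        3: "Excellent: clear, arguable thesis supported with well-chosen evidence.",
        2: "Good: identifiable thesis, but evidence is uneven or loosely connected.",
        1: "Unsatisfactory: no clear thesis; evidence is missing or unrelated.",
    },
    "Organization": {
        3: "Excellent: logical structure with effective transitions throughout.",
        2: "Good: generally organized, with occasional abrupt shifts.",
        1: "Unsatisfactory: ideas appear in no discernible order.",
    },
}

def score_submission(ratings):
    """Validate one rating per criterion and return the total numerical score."""
    for criterion, level in ratings.items():
        if criterion not in RUBRIC or level not in RUBRIC[criterion]:
            raise ValueError(f"Invalid rating: {criterion} = {level}")
    return sum(ratings.values())

# Example: a grader's ratings for one submission.
print(score_submission({"Thesis and argument": 3, "Organization": 2}))  # prints 5
```

The same structure transfers directly to a spreadsheet or to a rubric tool in a learning management system: one row per criterion, one column per level, and a numerical value attached to each level.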

Assessment Rubrics

A rubric is commonly defined as a tool that articulates the expectations for an assignment by listing criteria and, for each criterion, describing levels of quality (Andrade, 2000; Arter & Chappuis, 2007; Stiggins, 2001). Criteria are used in determining the level at which student work meets expectations. Markers of quality give students a clear idea about what must be done to demonstrate a certain level of mastery, understanding, or proficiency (e.g., "Exceeds Expectations" does xyz, "Meets Expectations" does only xy or yz, "Developing" does only x or y or z). Rubrics can be used for any assignment in a course, or for any way in which students are asked to demonstrate what they've learned. They can also be used to facilitate self- and peer-review of student work.

Rubrics aren't just for summative evaluation. They can be used as a teaching tool as well. When used as part of a formative assessment, they can help students understand both the holistic nature and the specific components of the learning expected, gauge the level of learning expected, and then make decisions about their current level of learning to inform revision and improvement (Reddy & Andrade, 2010). 

Why use rubrics?

Rubrics help instructors:

Provide students with feedback that is clear, directed and focused on ways to improve learning.

Demystify assignment expectations so students can focus on the work instead of guessing "what the instructor wants."

Reduce time spent on grading and develop consistency in how you evaluate student learning across students and throughout a class.

Rubrics help students:

Focus their efforts on completing assignments in line with clearly set expectations.

Reflect on their learning, individually and with peers, and make informed changes to achieve the desired learning level.

Developing a Rubric

During the process of developing a rubric, instructors might:

Select an assignment for your course - ideally one you identify as time intensive to grade, or students report as having unclear expectations.

Decide what you want students to demonstrate about their learning through that assignment. These are your criteria.

Identify the markers of quality on which you feel comfortable evaluating students’ level of learning - often along with a numerical scale (e.g., "Accomplished," "Emerging," "Beginning" for a developmental approach).

Give students the rubric ahead of time. Advise them to use it in guiding their completion of the assignment.

It can be overwhelming to create a rubric for every assignment in a class at once, so start by creating one rubric for one assignment. See how it goes and develop more from there! Also, do not reinvent the wheel. Rubric templates and examples exist all over the Internet, or consider asking colleagues if they have developed rubrics for similar assignments. 

Sample Rubrics

Examples of holistic and analytic rubrics : see Tables 2 & 3 in “Rubrics: Tools for Making Learning Goals and Evaluation Criteria Explicit for Both Teachers and Learners” (Allen & Tanner, 2006)

Examples across assessment types : see “Creating and Using Rubrics,” Carnegie Mellon Eberly Center for Teaching Excellence & Educational Innovation

“VALUE Rubrics” : see the Association of American Colleges and Universities set of free, downloadable rubrics, with foci including creative thinking, problem solving, and information literacy. 

Andrade, H. (2000). Using rubrics to promote thinking and learning. Educational Leadership, 57(5), 13–18.
Arter, J., & Chappuis, J. (2007). Creating and recognizing quality rubrics. Upper Saddle River, NJ: Pearson/Merrill Prentice Hall.
Stiggins, R. J. (2001). Student-involved classroom assessment (3rd ed.). Upper Saddle River, NJ: Prentice-Hall.
Reddy, Y., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435–448.

Types of Assignments and Assessments

Assignments and assessments are much the same thing: an instructor is unlikely to give students an assignment that does not receive some sort of assessment, whether formal or informal, formative or summative; and an assessment must be assigned, whether it is an essay, case study, or final exam. When the two terms are distinguished, "assignment" tends to refer to a learning activity that is primarily intended to foster or consolidate learning, while "assessment" tends to refer to an activity that is primarily intended to measure how well a student has learned. 

In the list below, some attempt has been made to put the assignments/assessments into logical categories. However, many of them could appear in multiple categories, so to prevent the list from becoming needlessly long, each item has been allocated to just one category. 

Written Assignments:

  • Annotated Bibliography : An annotated bibliography is a list of citations or references to sources such as books, articles, websites, etc., along with brief descriptions or annotations that summarize, evaluate, and explain the content, relevance, and quality of each source. These annotations provide readers with insights into the source's content and its potential usefulness for research or reference.
  • Summary/Abstract : A summary or abstract is a concise and condensed version of a longer document or research article, presenting the main points, key findings, and essential information in a clear and brief manner. It allows readers to quickly grasp the main ideas and determine whether the full document is relevant to their needs or interests. Abstracts are commonly found at the beginning of academic papers, research articles, and reports, providing a snapshot of the entire content.
  • Case Analysis : Case analysis refers to a systematic examination and evaluation of a particular situation, problem, or scenario. It involves gathering relevant information, identifying key factors, analyzing various aspects, and formulating conclusions or recommendations based on the findings. Case analysis is commonly used in business, law, and other fields to make informed decisions and solve complex problems.
  • Definition : A definition is a clear and concise explanation that describes the meaning of a specific term, concept, or object. It aims to provide a precise understanding of the item being defined, often by using words, phrases, or context that distinguish it from other similar or related things.
  • Description of a Process : A description of a process is a step-by-step account or narrative that outlines the sequence of actions, tasks, or events involved in completing a particular activity or achieving a specific goal. Process descriptions are commonly used in various industries to document procedures, guide employees, and ensure consistent and efficient workflows.
  • Executive Summary : An executive summary is a condensed version of a longer document or report that provides an overview of the main points, key findings, and major recommendations. It is typically aimed at busy executives or decision-makers who need a quick understanding of the content without delving into the full details. Executive summaries are commonly used in business proposals, project reports, and research papers to present essential information concisely.
  • Proposal/Plan : A piece of writing that explains how a future problem or project will be approached.
  • Laboratory or Field Notes:  Laboratory/field notes are detailed and systematic written records taken by scientists, researchers, or students during experiments, observations, or fieldwork. These notes document the procedures, observations, data, and any unexpected findings encountered during the scientific investigation. They serve as a vital reference for later analysis, replication, and communication of the research process and results.
  • Research Paper : A research paper is a more extensive and in-depth academic work that involves original research, data collection from multiple sources, and analysis. It aims to contribute new insights to the existing body of knowledge on a specific subject. Compare to "essay" below.
  • Essay : A composition that calls for exposition of a thesis and is composed of several paragraphs including an introduction, a body, and a conclusion. It is different from a research paper in that the synthesis of bibliographic sources is not required. Compare to "Research Paper" above. 
  • Memo : A memo, short for memorandum, is a brief written message or communication used within an organization or business. It is often used to convey information, provide updates, make announcements, or request actions from colleagues or team members.
  • Micro-theme : A micro-theme refers to a concise and focused piece of writing that addresses a specific topic or question. It is usually shorter than a traditional essay or research paper and requires the writer to present their ideas clearly and concisely.
  • Notes on Reading : Notes on reading are annotations, comments, or summaries taken while reading a book, article, or any other written material. They serve as aids for understanding, retention, and later reference, helping the reader recall essential points and ideas from the text.
  • Outline : An outline is a structured and organized plan that lays out the main points and structure of a written work, such as an essay, research paper, or presentation. It provides a roadmap for the writer, ensuring logical flow and coherence in the final piece.
  • Plan for Conducting a Project : A plan for conducting a project outlines the steps, resources, timelines, and objectives for successfully completing a specific project. It includes details on how tasks will be executed and managed to achieve the desired outcomes.
  • Poem : A poem is a literary work written in verse, using poetic devices like rhythm, rhyme, and imagery to convey emotions, ideas, and experiences.
  • Play : A play is a form of literature written for performance, typically involving dialogue and actions by characters to tell a story or convey a message on stage.
  • Choreography : Choreography refers to the art of designing dance sequences or movements, often for performances in various dance styles.
  • Article/Book Review : An article or book review is a critical evaluation and analysis of a piece of writing, such as an article or a book. It typically includes a summary of the content and the reviewer's assessment of its strengths, weaknesses, and overall value.
  • Review of Literature : A review of literature is a comprehensive summary and analysis of existing research and scholarly writings on a particular topic. It aims to provide an overview of the current state of knowledge in a specific field and may be a part of academic research or a standalone piece.
  • Essay-based Exam : An essay-based exam is an assessment format where students are required to respond to questions or prompts with written, structured responses. It involves expressing ideas, arguments, and explanations in a coherent and organized manner, often requiring critical thinking and analysis.
  • "Start" : In the context of academic writing, "start" refers to the initial phase of organizing and planning a piece of writing. It involves formulating a clear and focused thesis statement, which presents the main argument or central idea of the work, and creating an outline or list of ideas that will support and develop the thesis throughout the writing process.
  • Statement of Assumptions : A statement of assumptions is a declaration or acknowledgment made at the beginning of a document or research paper, highlighting the underlying beliefs, conditions, or premises on which the work is based. It helps readers understand the foundation of the writer's perspective and the context in which the content is presented.
  • Summary or Precis : A summary or precis is a concise and condensed version of a longer piece of writing, such as an article, book, or research paper. It captures the main points, key arguments, and essential information in a succinct manner, enabling readers to grasp the content without reading the full text.
  • Unstructured Writing : Unstructured writing refers to the process of writing without following a specific plan, outline, or organizational structure. It allows the writer to freely explore ideas, thoughts, and creativity without the constraints of a predefined format or order. Unstructured writing is often used for brainstorming, creative expression, or personal reflection.
  • Rough Draft or Freewrite : A rough draft or freewrite is an initial version of a piece of writing that is not polished or edited. It serves as an early attempt by the writer to get ideas on paper without worrying about perfection, allowing for exploration and creativity before revising and refining the final version.
  • Technical or Scientific Report : A technical or scientific report is a document that presents detailed information about a specific technical or scientific project, research study, experiment, or investigation. It follows a structured format and includes sections like abstract, introduction, methods, results, discussion, and conclusion to communicate findings and insights in a clear and systematic manner.
  • Journal article : A formal article reporting original research that could be submitted to an academic journal. Rather than a format dictated by the professor, the writer must use the conventional form of academic journals in the relevant discipline.
  • Thesis statement : A clear and concise sentence or two that presents the main argument or central claim of an essay, research paper, or any written piece. It serves as a roadmap for the reader, outlining the writer's stance on the topic and the key points that will be discussed and supported in the rest of the work. The thesis statement provides focus and direction to the paper, guiding the writer's approach to the subject matter and helping to maintain coherence throughout the writing.

Visual Representation

  • Brochure : A brochure is a printed or digital document used for advertising, providing information, or promoting a product, service, or event. It typically contains a combination of text and visuals, such as images or graphics, arranged in a visually appealing layout to convey a message effectively.
  • Poster : A poster is a large printed visual display intended to catch the attention of an audience. It often contains a combination of text, images, and graphics to communicate information or promote a particular message, event, or cause.
  • Chart : A chart is a visual representation of data or information using various formats such as pie charts, bar charts, line charts, or tables. It helps to illustrate relationships, trends, and comparisons in a concise and easy-to-understand manner.
  • Graph : A graph is a visual representation of numerical data, usually presented using lines, bars, points, or other symbols on a coordinate plane. Graphs are commonly used to show trends, patterns, and relationships between variables.
  • Concept Map : A concept map is a graphical tool used to organize and represent the connections and relationships between different concepts or ideas. It typically uses nodes or boxes to represent concepts and lines or arrows to show the connections or links between them, helping to visualize the relationships and hierarchy of ideas.
  • Diagram : A diagram is a visual representation of a process, system, or structure using labeled symbols, shapes, or lines. Diagrams are used to explain complex concepts or procedures in a simplified and easy-to-understand manner.
  • Table : A table is a systematic arrangement of data or information in rows and columns, allowing for easy comparison and reference. It is commonly used to present numerical data or detailed information in an organized format.
  • Flowchart : A flowchart is a graphical representation of a process, workflow, or algorithm, using various shapes and arrows to show the sequence of steps or decisions involved. It helps visualize the logical flow and decision points, making it easier to understand and analyze complex processes.
  • Multimedia or Slide Presentation : A multimedia or slide presentation is a visual communication tool that combines text, images, audio, video, and other media elements to deliver information or a message to an audience. It is often used for educational, business, or informational purposes and can be presented in person or virtually using software like Microsoft PowerPoint or Google Slides.
  • ePortfolio : An ePortfolio, short for electronic portfolio, is a digital collection of an individual's work, accomplishments, skills, and reflections. It typically includes a variety of multimedia artifacts such as documents, presentations, videos, images, and links to showcase a person's academic, professional, or personal achievements. Eportfolios are used for self-reflection, professional development, and showcasing one's abilities to potential employers, educators, or peers. They provide a comprehensive and organized way to present evidence of learning, growth, and accomplishments over time.

Test and Exam Questions:

  • Multiple-Choice Questions : These questions present a statement or question with several possible answer options, of which one or more may be correct. Test-takers must select the most appropriate choice(s). See CTE's Teaching Tip "Designing Multiple-Choice Questions."
  • True or False Questions : These questions require test-takers to determine whether a given statement is true or false based on their knowledge of the subject.
  • Short-Answer Questions : Test-takers are asked to provide brief written responses to questions or prompts. These responses are usually a few sentences or a paragraph in length.
  • Essay Questions : Essay questions require test-takers to provide longer, more detailed written responses to a specific topic or question. They may involve analysis, critical thinking, and the development of coherent arguments.
  • Matching Questions : In matching questions, test-takers are asked to pair related items from two lists. They must correctly match the items based on their associations.
  • Fill-in-the-Blank Questions : Test-takers must complete sentences or passages by filling in the missing words or phrases. This type of question tests recall and understanding of specific information.
  • Multiple-Response Questions : Similar to multiple-choice questions, but with multiple correct options. Test-takers must select all the correct choices to receive full credit.
  • Diagram or Image-Based Questions : These questions require test-takers to analyze or interpret diagrams, charts, graphs, or images to answer specific queries.
  • Problem-Solving Questions : These questions present real-world or theoretical problems that require test-takers to apply their knowledge and skills to arrive at a solution.
  • Vignettes or Case-Based Questions : In these questions, test-takers are presented with a scenario or case study and must analyze the information to answer related questions.
  • Sequencing or Order Questions : Test-takers are asked to arrange items or events in a particular order or sequence based on their understanding of the subject matter.

Projects intended for a specific audience :

  • Advertisement : An advertisement is a promotional message or communication aimed at promoting a product, service, event, or idea to a target audience. It often uses persuasive techniques, visuals, and compelling language to attract attention and encourage consumers to take specific actions, such as making a purchase or seeking more information.
  • Client Report for an Agency : A client report for an agency is a formal document prepared by a service provider or agency to communicate the results, progress, or recommendations of their work to their client. It typically includes an analysis of data, achievements, challenges, and future plans related to the project or services provided.
  • News or Feature Story : A news story is a journalistic piece that reports on current events or recent developments, providing objective information in a factual and unbiased manner. A feature story, on the other hand, is a more in-depth and creative piece that explores human interest topics, profiles individuals, or delves into issues from a unique perspective.
  • Instructional Manual : An instructional manual is a detailed document that provides step-by-step guidance, explanations, and procedures on how to use, assemble, operate, or perform specific tasks with a product or system. It aims to help users understand and utilize the item effectively and safely.
  • Letter to the Editor : A letter to the editor is a written communication submitted by a reader to a newspaper, magazine, or online publication, expressing their opinion, feedback, or comments on a particular article, topic, or issue. It is intended for publication and allows individuals to share their perspectives with a broader audience.

Problem-Solving and Analysis :

  • Taxonomy : Taxonomy is the science of classification, categorization, and naming of organisms, objects, or concepts based on their characteristics, similarities, and differences. It involves creating hierarchical systems that group related items together, facilitating organization and understanding within a particular domain.
  • Budget with Rationale : A budget with rationale is a financial plan that outlines projected income and expenses for a specific period, such as a month or a year. The rationale provides explanations or justifications for each budget item, explaining the purpose and reasoning behind the allocated funds.
  • Case Analysis : Case analysis refers to a methodical examination of a particular situation, scenario, or problem. It involves gathering relevant data, identifying key issues, analyzing different factors, and formulating conclusions or recommendations based on the findings. Case analysis is commonly used in various fields, such as business, law, and education, to make informed decisions and solve complex problems.
  • Case Study : A case study is an in-depth analysis of a specific individual, group, organization, or situation. It involves thorough research, data collection, and detailed examination to understand the context, challenges, and outcomes associated with the subject of study. Case studies are widely used in academic research and professional contexts to gain insights into real-world scenarios.
  • Word Problem : A word problem is a type of mathematical or logical question presented in a contextual format using words rather than purely numerical or symbolic representations. It challenges students to apply their knowledge and problem-solving skills to real-life situations.

Collaborative Activities

  • Debate : A debate is a structured discussion between two or more individuals or teams with differing viewpoints on a specific topic or issue. Participants present arguments and counterarguments to support their positions, aiming to persuade the audience and ultimately reach a resolution or conclusion. Debates are commonly used in academic settings, public forums, and formal competitions to foster critical thinking, communication skills, and understanding of diverse perspectives.
  • Group Discussion : A group discussion is an interactive conversation involving several individuals who come together to exchange ideas, opinions, and information on a particular subject. The discussion is typically moderated to ensure that everyone has an opportunity to participate, and it encourages active listening, collaboration, and problem-solving. Group discussions are commonly used in educational settings, team meetings, and decision-making processes to promote dialogue and collective decision-making.
  • Oral Report : An oral report is a form of communication in which a person or group of persons present information, findings, or ideas verbally to an audience. It involves speaking in front of others, often in a formal setting, and delivering a structured presentation that may include visual aids, such as slides or props, to support the content. Oral reports are commonly used in academic settings, business environments, and various professional settings to share knowledge, research findings, project updates, or persuasive arguments. Effective oral reports require clear organization, articulation, and engaging delivery to effectively convey the intended message to the listeners.

Planning and Organization

  • Inventory : An inventory involves systematically listing and categorizing items or resources to assess their availability, quantity, and condition. In an educational context, students might conduct an inventory of books in a library, equipment in a lab, or supplies in a classroom, enhancing their organizational and data collection skills.
  • Materials and Methods Plan : A materials and methods plan involves developing a structured outline or description of the materials, tools, and procedures to be used in a specific experiment, research project, or practical task. It helps learners understand the importance of proper planning and documentation in scientific and research endeavors.
  • Plan for Conducting a Project : This learning activity requires students to create a detailed roadmap for executing a project. It includes defining the project's objectives, identifying tasks and timelines, allocating resources, and setting milestones to monitor progress. It enhances students' project management and organizational abilities.
  • Research Proposal Addressed to a Granting Agency : A formal document requesting financial support for a research project from a granting agency or organization. The proposal outlines the research questions, objectives, methodology, budget, and potential outcomes. It familiarizes learners with the process of seeking funding and strengthens their research and persuasive writing skills.
  • Mathematical Problem : A mathematical problem is a task or question that requires the application of mathematical principles, formulas, or operations to find a solution. It could involve arithmetic, algebra, geometry, calculus, or other branches of mathematics, challenging individuals to solve the problem logically and accurately.
  • Question : A question is a sentence or phrase used to elicit information, seek clarification, or provoke thought from someone else. Questions can be open-ended, closed-ended, or leading, depending on their purpose, and they play a crucial role in communication, problem-solving, and learning.

More Resources

CTE Teaching Tips

  • Personal Response Systems
  • Designing Multiple-Choice Questions
  • Aligning Outcomes, Assessments, and Instruction

Other Resources

  • Types of Assignments . University of Queensland.

If you would like support applying these tips to your own teaching, CTE staff members are here to help.  View the  CTE Support  page to find the most relevant staff member to contact.


How technology is reinventing education

Stanford Graduate School of Education Dean Dan Schwartz and other education scholars weigh in on what's next for some of the technology trends taking center stage in the classroom.


Image credit: Claire Scully

New advances in technology are upending education, from the recent debut of new artificial intelligence (AI) chatbots like ChatGPT to the growing accessibility of virtual-reality tools that expand the boundaries of the classroom. For educators, at the heart of it all is the hope that every learner gets an equal chance to develop the skills they need to succeed. But that promise is not without its pitfalls.

“Technology is a game-changer for education – it offers the prospect of universal access to high-quality learning experiences, and it creates fundamentally new ways of teaching,” said Dan Schwartz, dean of Stanford Graduate School of Education (GSE), who is also a professor of educational technology at the GSE and faculty director of the Stanford Accelerator for Learning . “But there are a lot of ways we teach that aren’t great, and a big fear with AI in particular is that we just get more efficient at teaching badly. This is a moment to pay attention, to do things differently.”

For K-12 schools, this year also marks the end of the Elementary and Secondary School Emergency Relief (ESSER) funding program, which has provided pandemic recovery funds that many districts used to invest in educational software and systems. With these funds running out in September 2024, schools are trying to determine their best use of technology as they face the prospect of diminishing resources.

Here, Schwartz and other Stanford education scholars weigh in on some of the technology trends taking center stage in the classroom this year.

AI in the classroom

In 2023, the big story in technology and education was generative AI, following the introduction of ChatGPT and other chatbots that produce text seemingly written by a human in response to a question or prompt. Educators immediately worried that students would use the chatbot to cheat by trying to pass its writing off as their own. As schools move to adopt policies around students’ use of the tool, many are also beginning to explore potential opportunities – for example, to generate reading assignments or coach students during the writing process.

AI can also help automate tasks like grading and lesson planning, freeing teachers to do the human work that drew them into the profession in the first place, said Victor Lee, an associate professor at the GSE and faculty lead for the AI + Education initiative at the Stanford Accelerator for Learning. “I’m heartened to see some movement toward creating AI tools that make teachers’ lives better – not to replace them, but to give them the time to do the work that only teachers are able to do,” he said. “I hope to see more on that front.”

He also emphasized the need to teach students now to begin questioning and critiquing the development and use of AI. “AI is not going away,” said Lee, who is also director of CRAFT (Classroom-Ready Resources about AI for Teaching), which provides free resources to help teach AI literacy to high school students across subject areas. “We need to teach students how to understand and think critically about this technology.”

Immersive environments

The use of immersive technologies like augmented reality, virtual reality, and mixed reality is also expected to surge in the classroom, especially as new high-profile devices integrating these realities hit the marketplace in 2024.

The educational possibilities now go beyond putting on a headset and experiencing life in a distant location. With new technologies, students can create their own local interactive 360-degree scenarios, using just a cell phone or inexpensive camera and simple online tools.

“This is an area that’s really going to explode over the next couple of years,” said Kristen Pilner Blair, director of research for the Digital Learning initiative at the Stanford Accelerator for Learning, which runs a program exploring the use of virtual field trips to promote learning. “Students can learn about the effects of climate change, say, by virtually experiencing the impact on a particular environment. But they can also become creators, documenting and sharing immersive media that shows the effects where they live.”

Integrating AI into virtual simulations could also soon take the experience to another level, Schwartz said. “If your VR experience brings me to a redwood tree, you could have a window pop up that allows me to ask questions about the tree, and AI can deliver the answers.”

Gamification

Another trend expected to intensify this year is the gamification of learning activities, often featuring dynamic videos with interactive elements to engage and hold students’ attention.

“Gamification is a good motivator, because one key aspect is reward, which is very powerful,” said Schwartz. The downside? Rewards are specific to the activity at hand, which may not extend to learning more generally. “If I get rewarded for doing math in a space-age video game, it doesn’t mean I’m going to be motivated to do math anywhere else.”

Gamification sometimes tries to make “chocolate-covered broccoli,” Schwartz said, by adding art and rewards to make speeded response tasks involving single-answer, factual questions more fun. He hopes to see more creative play patterns that give students points for rethinking an approach or adapting their strategy, rather than only rewarding them for quickly producing a correct response.

Data-gathering and analysis

The growing use of technology in schools is producing massive amounts of data on students’ activities in the classroom and online. “We’re now able to capture moment-to-moment data, every keystroke a kid makes,” said Schwartz – data that can reveal areas of struggle and different learning opportunities, from solving a math problem to approaching a writing assignment.

But outside of research settings, he said, that type of granular data – now owned by tech companies – is more likely to be used to refine the design of the software than to provide teachers with actionable information.

The promise of personalized learning lies in generating content aligned with students’ interests and skill levels, and in making lessons more accessible for multilingual learners and students with disabilities. Realizing that promise requires that educators can make sense of the data that’s being collected, said Schwartz – and while advances in AI are making it easier to identify patterns and findings, the data also needs to be in a system and form educators can access and analyze for decision-making. Developing a usable infrastructure for that data, Schwartz said, is an important next step.

With the accumulation of student data comes privacy concerns: How is the data being collected? Are there regulations or guidelines around its use in decision-making? What steps are being taken to prevent unauthorized access? In 2023 K-12 schools experienced a rise in cyberattacks, underscoring the need to implement strong systems to safeguard student data.

Technology is “requiring people to check their assumptions about education,” said Schwartz, noting that AI in particular is very efficient at replicating biases and automating the way things have been done in the past, including poor models of instruction. “But it’s also opening up new possibilities for students producing material, and for being able to identify children who are not average so we can customize toward them. It’s an opportunity to think of entirely new ways of teaching – this is the path I hope to see.”

The Ohio State University

AI Teaching Strategies: Transparent Assignment Design

The rise of generative artificial intelligence (AI) tools like ChatGPT, Google Bard, and Jasper Chat raises many questions about the ways we teach and the ways students learn. While some of these questions concern how we can use AI to accomplish learning goals and whether or not that is advisable, others relate to how we can facilitate critical analysis of AI itself. 

The wide variety of questions about AI and the rapidly changing landscape of available tools can make it hard for educators to know where to start when designing an assignment. When confronted with new technologies—and the new teaching challenges they present—we can often turn to existing evidence-based practices for the guidance we seek.

This guide will apply the Transparency in Learning and Teaching (TILT) framework to "un-complicate" planning an assignment that uses AI, providing guiding questions for you to consider along the way. 

The result should be an assignment that supports you and your students in approaching the use of AI in a more thoughtful, productive, and ethical manner.

Plan your assignment.

The TILT framework offers a straightforward approach to assignment design that has been shown to improve academic confidence and success, sense of belonging, and metacognitive awareness by making the learning process clear to students (Winkelmes et al., 2016). The TILT process centers around deciding—and then communicating—three key components of your assignment: 1) purpose, 2) tasks, and 3) criteria for success. 

Step 1: Define your purpose.

To make effective use of any new technology, it is important to reflect on our reasons for incorporating it into our courses. In the first step of TILT, we think about what we want students to gain from an assignment and how we will communicate that purpose to students.

The SAMR model, a useful tool for thinking about educational technology use in our courses, lays out four tiers of technology integration. The tiers, roughly in order of their sophistication and transformative power, are Substitution, Augmentation, Modification, and Redefinition. Each tier may suggest different approaches to consider when integrating AI into teaching and learning activities.

Questions to consider:

  • Do you intend to use AI as a substitution, augmentation, modification, or redefinition of an existing teaching practice or educational technology?
  • What are your learning goals and expected learning outcomes?
  • Do you want students to understand the limitations of AI or to experience its applications in the field? 
  • Do you want students to reflect on the ethical implications of AI use?  

Bloom’s Taxonomy is another useful tool for defining your assignment’s purpose and your learning goals and outcomes. 

This downloadable Bloom’s Taxonomy Revisited resource, created by Oregon State University, highlights the differences between AI capabilities and distinctive human skills at each Bloom’s level, indicating the types of assignments you should review or change in light of AI. Bloom’s Taxonomy Revisited is licensed under Creative Commons Attribution 4.0 International (CC BY 4.0).

Step 2: Define the tasks involved.

In the next step of TILT, you list the steps students will take when completing the assignment. In what order should they do specific tasks, what do they need to be aware of to perform each task well, and what mistakes should they avoid? Outlining each step is especially important if you’re asking students to use generative AI in a limited manner. For example, if you want them to begin with generative AI but then revise, refine, or expand upon its output, make clear which steps should involve their own thinking and work as opposed to AI’s thinking and work.

  • Are you designing this assignment as a single, one-time task or as a longitudinal task that builds over time or across curricular and co-curricular contexts? For longitudinal tasks, consider the experiential learning cycle (Kolb, 1984). In Kolb’s cycle, learners have a concrete experience followed by reflective observation, abstract conceptualization, and active experimentation. For example, students could record their generative AI prompts, the results, a reflection on the results, and the next prompt they used to get improved output. In subsequent tasks, students could expand upon or revise the AI output into a final product. Requiring students to provide a record of their reflections, prompts, and results can create an “AI audit trail,” making the task and learning more transparent (a minimal sketch of such a log appears after this list).
  • What resources and tools are permitted or required for students to complete the tasks involved with the assignment? Make clear which steps should involve students’ own thinking (as opposed to AI-generated output, for example), which course materials are required, and whether references are expected. Include any ancillary resources students will need to accomplish the tasks, such as guidelines on how to cite AI (in APA 7, for example).
  • How will you offer students flexibility and choice? As of this time, most generative AI tools have not been approved for use by Ohio State, meaning they have not been vetted for security, privacy, or accessibility issues. Many platforms are known to be incompatible with screen readers, and there are outstanding questions about what these tools do with user data. Students may have understandable apprehensions about using these tools or encounter barriers to doing so successfully. So while there may be value in giving students first-hand experience with AI, it’s important to give them the choice to opt out. As you outline your assignment tasks, plan how to provide alternative options to complete them. Could you provide AI output you’ve generated for students to work with, demonstrate use of the tool during class, or allow use of another tool that enables students to meet the same learning outcomes?

Microsoft Copilot is currently the only generative AI tool that has been vetted and approved for use at Ohio State. As of February 2024, the Office of Technology and Digital Innovation (OTDI) has enabled it for use by students, faculty, and staff. Copilot is an AI chatbot that draws from public online data, but with additional security measures in place. For example, conversations within the tool aren’t stored. Learn more and stay tuned for further information about Copilot in the classroom.

  • What are your expectations for academic integrity? This is a helpful step for clarifying your academic integrity guidelines for this assignment, around AI use specifically as well as for other resources and tools. The standard Academic Integrity Icons in the table below can help you call out what is permissible and what is prohibited. If any steps for completing the assignment require (or expressly prohibit) AI tools, be as clear as possible in highlighting which ones, as well as why and how AI use is (or is not) permitted.
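
To make the “AI audit trail” mentioned above concrete, here is a minimal sketch of how a student might log each round of prompting, assuming the instructor simply wants a submittable record of prompts, outputs, and reflections. The field names, example values, and CSV format are illustrative assumptions, not part of the TILT framework or an Ohio State requirement; a shared spreadsheet with the same columns would serve the same purpose.

```python
# Minimal sketch of an "AI audit trail" log a student could keep and submit.
# Field names and example values are illustrative assumptions, not a standard.

import csv
from dataclasses import dataclass, asdict
from datetime import date


@dataclass
class AuditEntry:
    """One round of the prompt -> output -> reflection -> next-step cycle."""
    entry_date: str
    tool: str            # e.g., "Microsoft Copilot"
    prompt: str          # the exact prompt submitted to the tool
    output_summary: str  # brief summary or excerpt of the AI output
    reflection: str      # what worked, what didn't, possible bias or inaccuracy
    next_step: str       # the revised prompt or the student's own revision plan


def save_audit_trail(entries: list[AuditEntry], path: str = "ai_audit_trail.csv") -> None:
    """Write the audit trail to a CSV file that can accompany the final product."""
    fieldnames = list(asdict(entries[0]).keys())
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(asdict(e) for e in entries)


if __name__ == "__main__":
    trail = [
        AuditEntry(
            entry_date=str(date.today()),
            tool="Microsoft Copilot",
            prompt="Outline a lesson plan on photosynthesis for 9th graders.",
            output_summary="Five-part outline; activities were generic and untimed.",
            reflection="Output ignored class length and lab constraints.",
            next_step="Re-prompt with a 50-minute period and available equipment.",
        ),
    ]
    save_audit_trail(trail)
```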

Promoting academic integrity

While inappropriate use of AI may constitute academic misconduct, it can be muddy for students to parse out what is permitted or prohibited across their courses and across various use cases. Fortunately, there are existing approaches to supporting academic integrity that apply to AI as well as to any other tool. Discuss academic integrity openly with students, early in the term and before each assignment. Purposefully design your assignments to promote integrity by using real-world formats and audiences, grading the process as well as the product, incorporating personal reflection tasks, and more. 

Learn about taking a proactive, rather than punitive, approach to academic integrity in A Positive Approach to Academic Integrity.

Step 3: Define criteria for success.

An important feature of transparent assignments is that they make clear to students how their work will be evaluated. During this TILT step, you will define criteria for a successful submission—consider creating a rubric to clarify these expectations for students and simplify your grading process. If you intend to use AI as a substitute or augmentation for another technology, you might be able to use an existing rubric with little or no change. However, if AI use is modifying or redefining the assignment tasks, a new grading rubric will likely be needed.

  • How will you grade this assignment? What key criteria will you assess? 
  • What indicators will show each criterion has been met? 
  • What qualities distinguish a successful submission from one that needs improvement? 
  • Will you grade students on the product only or on aspects of the process as well? For example, if you have included a reflection task as part of the assignment, you might include that as a component of the final grade.

Alongside your rubric, it is helpful to prepare examples of successful (and even unsuccessful) submissions to provide more tangible guidance to students. In addition to samples of the final product, you could share examples of effective AI prompts, reflection tasks, and AI citations. Examples may be drawn from previous student work or models that you have mocked up, and they can be annotated to highlight notable elements related to assignment criteria.
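
To illustrate how such a rubric might be structured, the sketch below represents weighted criteria and level descriptors as a simple data structure and computes a weighted score. The criteria, weights, and descriptors are hypothetical examples for an assignment with limited AI use, not a recommended or official rubric; adapt them to your own learning outcomes and grading scale.

```python
# Minimal sketch of an analytic rubric as a data structure, with a helper that
# converts per-criterion ratings (1-4) into a weighted score out of 100.
# All criteria, weights, and descriptors are hypothetical examples.

RUBRIC = {
    "Quality of final product": {
        "weight": 0.40,
        "levels": {
            4: "Accurate, well organized, and clearly goes beyond the AI output.",
            3: "Mostly accurate and organized; some reliance on unrevised AI text.",
            2: "Significant gaps or errors; limited evidence of the student's own work.",
            1: "Largely unrevised AI output or off-topic submission.",
        },
    },
    "AI audit trail and reflection": {
        "weight": 0.30,
        "levels": {
            4: "Complete log of prompts, outputs, and thoughtful reflections.",
            3: "Log is complete but reflections are superficial.",
            2: "Log is incomplete or reflections are missing.",
            1: "No record of AI use provided.",
        },
    },
    "Critical evaluation of AI output": {
        "weight": 0.20,
        "levels": {
            4: "Identifies inaccuracies, bias, or limitations and corrects them.",
            3: "Notes some limitations but does not address them.",
            2: "Accepts AI output uncritically.",
            1: "No evaluation attempted.",
        },
    },
    "Citation and academic integrity": {
        "weight": 0.10,
        "levels": {
            4: "AI use and all sources cited correctly per assignment guidelines.",
            3: "Minor citation errors.",
            2: "AI use acknowledged but not cited as required.",
            1: "AI use not acknowledged.",
        },
    },
}


def weighted_score(ratings: dict[str, int]) -> float:
    """Combine 1-4 ratings for each criterion into a weighted score out of 100."""
    total = 0.0
    for criterion, rating in ratings.items():
        weight = RUBRIC[criterion]["weight"]
        total += weight * (rating / 4) * 100
    return round(total, 1)


# Example: a submission rated 4, 3, 3, 4 on the four criteria above.
print(weighted_score({
    "Quality of final product": 4,
    "AI audit trail and reflection": 3,
    "Critical evaluation of AI output": 3,
    "Citation and academic integrity": 4,
}))  # -> 87.5
```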

Present and discuss your assignment.

As clear as we strive to be in our assignment planning and prompts, there may be gaps or confusing elements we have overlooked. Explicitly going over your assignment instructions—including the purpose, key tasks, and criteria—will ensure students are equipped with the background and knowledge they need to perform well. These discussions also offer space for students to ask questions and air unanticipated concerns, which is particularly important given the potential hesitance some may have around using AI tools. 

  • How will this assignment help students learn key course content, contribute to the development of important skills such as critical thinking, or support them to meet your learning goals and outcomes? 
  • How might students apply the knowledge and skills acquired in their future coursework or careers? 
  • In what ways will the assignment further students’ understanding and experience around generative AI tools, and why does that matter?
  • What questions or barriers do you anticipate students might encounter when using AI for this assignment?

As noted above, many students are unaware of the accessibility, security, privacy, and copyright concerns associated with AI, or of other pitfalls they might encounter working with AI tools. Openly discussing AI’s limitations and the inaccuracies and biases it can create and replicate will support students to anticipate barriers to success on the assignment, increase their digital literacy, and make them more informed and discerning users of technology. 

Explore available resources.

It can feel daunting to know where to look for AI-related assignment ideas, or who to consult if you have questions. Though generative AI is still on the rise, a growing number of useful resources are being developed across the teaching and learning community. Consult our other Teaching Topics, including AI Considerations for Teaching and Learning, and explore other recommended resources such as the Learning with AI Toolkit and Exploring AI Pedagogy: A Community Collection of Teaching Reflections.

If you need further support to review or develop assignment or course plans in light of AI, visit our Help forms to request a teaching consultation.

Using the Transparent Assignment Template

Sample assignment: AI-generated lesson plan

In many respects, the rise of generative AI has reinforced existing best practices for assignment design—craft a clear and detailed assignment prompt, articulate academic integrity expectations, increase engagement and motivation through authentic and inclusive assessments. But AI has also encouraged us to think differently about how we approach the tasks we ask students to undertake, and how we can better support them through that process. While it can feel daunting to re-envision or reformat our assignments, AI presents us with opportunities to cultivate the types of learning and growth we value, to help students see that value, and to grow their critical thinking and digital literacy skills. 

Using the Transparency in Learning and Teaching (TILT) framework to plan assignments that involve generative AI can help you clarify expectations for students and take a more intentional, productive, and ethical approach to AI use in your course. 

  • Step 1: Define your purpose. Think about what you want students to gain from this assignment. What are your learning goals and outcomes? Do you want students to understand the limitations of AI, see its applications in your field, or reflect on its ethical implications? The SAMR model and Bloom's Taxonomy are useful references when defining your purpose for using (or not using) AI on an assignment.
  • Step 2: Define the tasks involved. List the steps students will take to complete the assignment. What resources and tools will they need? How will students reflect upon their learning as they proceed through each task? What are your expectations for academic integrity?
  • Step 3: Define criteria for success. Make clear to students your expectations for success on the assignment. Create a rubric to call out key criteria and simplify your grading process. Will you grade the product only, or parts of the process as well? What qualities indicate an effective submission? Consider sharing tangible models or examples of assignment submissions.

Finally, it is time to make your assignment guidelines and expectations transparent to students. Walk through the instructions explicitly—including the purpose, key tasks, and criteria—to ensure they are prepared to perform well.
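
As a final illustration, the sketch below pulls the three TILT components into a single structured plan and renders it as a plain-text handout that could be pasted into a syllabus or LMS page. The assignment content shown is hypothetical and is not drawn from the official Transparent Assignment Template; it simply shows how purpose, tasks, and criteria can travel together as one artifact.

```python
# Minimal sketch of a TILT-style assignment plan (purpose, tasks, criteria)
# rendered as a plain-text handout. All example content is hypothetical.

ASSIGNMENT_PLAN = {
    "title": "AI-assisted policy brief (draft and revision)",
    "purpose": [
        "Practice summarizing research for a non-specialist audience.",
        "Evaluate the strengths and limitations of generative AI as a drafting aid.",
        "Reflect on the ethics of disclosing AI assistance in professional writing.",
    ],
    "tasks": [
        "Generate a first-draft outline with an approved AI tool (or use the instructor-provided output).",
        "Revise the outline into a 1,000-word brief using at least three course readings.",
        "Keep an AI audit trail of prompts, outputs, and reflections.",
        "Cite all sources, including AI assistance, per the assignment guidelines.",
    ],
    "criteria": [
        "Accuracy and depth of the final brief (40%)",
        "Quality of revision beyond the AI-generated outline (30%)",
        "Completeness of the AI audit trail and reflection (20%)",
        "Citation and academic integrity (10%)",
    ],
}


def render_handout(plan: dict) -> str:
    """Format the plan as plain text for a syllabus, LMS page, or handout."""
    lines = [plan["title"], ""]
    for section in ("purpose", "tasks", "criteria"):
        lines.append(section.capitalize())
        lines.extend(f"  - {item}" for item in plan[section])
        lines.append("")
    return "\n".join(lines)


if __name__ == "__main__":
    print(render_handout(ASSIGNMENT_PLAN))
```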

  • Checklist for Designing Transparent Assignments
  • TILT Higher Ed Information and Resources

Winkelmes, M. (2013). Transparency in Teaching: Faculty Share Data and Improve Students’ Learning. Liberal Education, 99(2).

Winkelmes, M. (2013). Transparent Assignment Design Template for Teachers. TILT Higher Ed: Transparency in Learning and Teaching. https://tilthighered.com/assets/pdffiles/Transparent%20Assignment%20Templates.pdf

Winkelmes, M., Bernacki, M., Butler, J., Zochowski, M., Golanics, J., & Weavil, K. (2016). A Teaching Intervention that Increases Underserved College Students’ Success. Peer Review.

Related Teaching Topics

  • AI Considerations for Teaching and Learning
  • AI Teaching Strategies: Having Conversations with Students
  • Designing Assessments of Student Learning

Google for Education

Easily distribute, analyze, and grade student work with Assignments for your LMS

Assignments is an application for your learning management system (LMS). It helps educators save time grading and guides students to turn in their best work with originality reports — all through the collaborative power of Google Workspace for Education.

  • Get started
  • Explore originality reports

Bring your favorite tools together within your LMS

  • Make Google Docs and Google Drive compatible with your LMS
  • Simplify assignment management with user-friendly Google Workspace productivity tools
  • Built with the latest Learning Tools Interoperability (LTI) standards for robust security and easy installation in your LMS

Save time distributing and grading classwork

  • Distribute personalized copies of Google Drive templates and worksheets to students
  • Grade consistently and transparently with rubrics integrated into student work
  • Add rich feedback faster using the customizable comment bank

Examine student work to ensure authenticity

  • Compare student work against hundreds of billions of web pages and over 40 million books with originality reports
  • Make student-to-student comparisons on your domain-owned repository of past submissions when you sign up for the Teaching and Learning Upgrade or Google Workspace for Education Plus
  • Allow students to scan their own work for recommended citations up to three times

Trust in high security standards

  • Protect student privacy — data is owned and managed solely by you and your students
  • Provide an ad-free experience for all your users
  • Compatible with LTI version 1.1 or higher and meets rigorous compliance standards

Product demos

Experience Google Workspace for Education in action. Explore premium features in detail via step-by-step demos to get a feel for how they work in the classroom.

“Assignments enable faculty to save time on the mundane parts of grading and...spend more time on providing more personalized and relevant feedback to students.” Benjamin Hommerding, Technology Innovationist, St. Norbert College

Classroom users get the best of Assignments built-in

Find all of the same features of Assignments in your existing Classroom environment

  • Learn more about Classroom

Explore resources to get up and running

Discover helpful resources to get up to speed on using Assignments and find answers to commonly asked questions.

  • Visit Help Center

PDF

Get a quick overview of Assignments to help educators learn how they can use it in their classrooms.

  • Download overview

PDF

Get started guide

Start using Assignments in your courses with this step-by-step guide for instructors.

  • Download guide

Teacher Center Assignments resources

Find educator tools and resources to get started with Assignments.

  • Visit Teacher Center

Video

How to use Assignments within your LMS

Watch this brief video on how educators can use Assignments.

  • Watch video

Turn on Assignments in your LMS

Contact your institution’s administrator to turn on Assignments within your LMS.

  • Admin setup

Explore a suite of tools for your classroom with Google Workspace for Education


How do you prevent free-riding in group assignments? 7 tips!

Professor Mariëtte van den Hoven highlights a new challenge within higher education, sounding the alarm. However, she neither delves into the underlying causes of this problem nor proposes solutions.

Free-riding is certainly not a new issue and has long been a subject of much educational debate. 

Luckily, VU CTL formulated these 7 tips for teachers that will help counteract student free-riding.


Blended and Distance Learning with General Assignment Classrooms

Discover how UW’s advanced classroom technology can enhance your teaching experience. Upper campus has 91 general assignment classrooms designed to support blended and distance learning, along with lecture capture capabilities. While our primary solution is Panopto, instructors also have the flexibility to use Zoom for their teaching needs.

With Panopto and Zoom, you can seamlessly record lectures or conduct live streams directly from your laptop in the classroom. Choose your preferred sources and start capturing valuable content, whether you’re engaging with a remote audience or teaching in person. Once recorded, easily share your content on Canvas, our integrated learning management system.

Teachers and TAs can leverage Zoom’s cloud recording feature to share class meetings directly with students through the Zoom App in Canvas. Plus, the Zoom recordings can automatically migrate into Panopto for added convenience. Panopto recordings can then be shared within Canvas or accessed through the Panopto platform.

Ready to explore these powerful tools further?

Contact Academic Technologies today to schedule a consultation in our Kane Hall demo room and unlock the full potential of Panopto and Zoom for your classroom sessions. Email us at [email protected] or call Academic Technologies at (206) 221-5000, extension 2.


Meta unveils new virtual reality headsets — and a plan for their use in classrooms

Ayesha Rascoe

NPR's Ayesha Rascoe speaks to Nick Clegg, president of global affairs at Meta about the company's new virtual reality headsets and Meta's plans to have the headsets used in classrooms.

AYESHA RASCOE, HOST:

Facebook's parent company, Meta, has a new educational product for their Quest virtual reality headset, intended to go along with third-party educational apps. That's right - this one's headed to the classroom. Now, the headsets, which cost around $300 and are aimed at students who are 13 and older, are already in some schools. Meta's president of global affairs, Nick Clegg, thinks these headsets can engage students by immersing them into virtual environments - study ancient Rome by walking through ancient Rome, dinosaurs by walking among dinosaurs. We've seen Meta's commercials. I asked him, though, if headsets were an answer for students struggling with reading or math, areas where test scores have been at their lowest level in decades.

NICK CLEGG: I was reading a study the other day by I think it's Pricewaterhouse, PwC, who said that the learners that they'd spoken to, who'd been learning in virtual reality, said that they were 150% more engaged during classes than they otherwise would be. And Morehouse College reported much higher average final test scores for students learning in VR than from traditional or even traditional online methods. This isn't just about kind of academic learning. This is also about practical education. So for instance, in Tulsa, there is a welding school where welders of all levels are using VR to upskill their welding, you know, certification, their welding training. So I think there are lots and lots of different applications that educators and teachers are telling us at this very sort of nascent stage of the technology that they're using.

RASCOE: What do you say to those who will be critical of Meta in this space, given Meta's record of creating and marketing social media tools to children and teens that are addictive, and that research shows can have a real negative impact on kids' mental health? Why should Meta or its products be trusted in a classroom?

CLEGG: I don't think it's about whether you do or don't trust a company like Meta. It's do you or don't you trust the judgment of the teacher in the classroom? And we are building these tools so it is entirely controlled by the teacher. It's not controlled by us. It's the teacher that decides whether the headset is used. It's the teacher that decides what the content is on the headset. Students won't be able to access the Meta Quest store. They won't be able to access social media apps and social experiences on the Meta platform.

RASCOE: Separately, since I have you here - we're in an election year. There are major concerns about deepfakes and altered media spreading misinformation online. Meta announced starting next month, it will label AI-generated content and will also label any digitally altered media, AI or not, that it feels poses a particularly high risk of materially deceiving the public on a matter of importance. When it comes to election-related content, why just label the content and not remove it completely if it poses a risk of deceiving the public?

CLEGG: Oh, no. We will continue, of course, to remove content that breaks our rules. It doesn't matter whether it's synthetic or whether it's by a human being. We disable networks of fake accounts. We expect people, if they're going to use AI to produce political ads, to declare that. And if they don't, and they repeatedly fall foul of our rules, we won't allow them to run ads. But we have to work across the industry.

RASCOE: Well, that brings me to this question, because researchers at the New York University Stern Center for Business and Human Rights - they released a report this year arguing that it's not the creation of AI content that's really a threat to election security, but the distribution of, quote, "false, hateful and violent content" via social media platforms. They argue that companies like Meta need to add more humans to content moderation. They need to fund more outside fact-checkers and institute circuit breakers to slow the spread of certain viral posts, that it's really about the distribution of the content versus, like, how it's created by AI or whatever. What is your response to that?

CLEGG: Well, we as Meta so happen to have by far the world's largest network of fact-checkers, over 100 of them around the world, working in over 70 languages. If you look, for instance, at the prevalence of hate speech on Facebook now, what does prevalence mean? That means the percentage of hate speech as a percentage of the total amount of content on Facebook. It's down to as low as 0.01%. And by the way, that's not just my statistic or the statistic from Meta. That's actually an independently vetted statistic.

RASCOE: But you said you have 100 fact-checkers. I mean, there are millions and millions of posts. So is that something where you need more content moderators, you need more fact-checkers - is that something that Meta would consider in a pivotal election year?

CLEGG: Well, as I say, we constantly expand the number of fact-checkers we have. We'll never be perfect. The internet is a big open landscape of content, but I think we are a completely different company now than we were, for instance, back in 2016 at the time of the Russian interference in the U.S. elections then.

RASCOE: That's Nick Clegg, Meta's president for global affairs. Thanks for joining us.

CLEGG: Thank you.

Copyright © 2024 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.

