
Confidence Builds Competence: "I can" statements and differentiated review for food safety success

  • Colleen Farris
  • Dec 7, 2025
  • 3 min read
Image created by Colleen Farris using Canva for Education free images.

Preparing first-year culinary arts students for the World Food Safety Organization (WFSO) Essentials of Food Hygiene (EFH) credential exam requires intentional, equitable assessment design that provides learners with the time, tools, and support they need to achieve mastery (Bloom, 1974; Shepard, 2000). This post presents a manageable, data-driven, competency-aligned, differentiated assessment design that fosters student confidence and builds food safety competency.


The EFH exam aligns with course competency 4.08: Demonstrate readiness for industry certification requirements/exams for food safety. Breaking 4.08 into 14 key concepts helps students see and structure their progress. Students self-assess their food safety concept knowledge using a Google Forms survey paired with a four-column rubric, so responses can be quickly collected and analyzed before and after review and learning activities.


Initial survey introduction.
...“I can” statements communicate learning targets and increase learner agency...

Confidence ratings using “I can” statements communicate learning targets and increase learner agency (Wiggins & McTighe, 2005). The ungraded survey identifies learning needs without punishing students who are not yet proficient. This competency-based approach aligns with industry practices, while avoiding the demoralizing effects of grade averaging (Blum, 2022; Kohn, 2011).


This assessment occurs after a pre-assessment and a gamified class review. Students reflect on what they know, what they don’t know, and where they need help: cornerstones of a learning culture (Shepard, 2000, p. 10). Students receive personalized study checklists via email, while the teacher views responses in a spreadsheet. Timely, targeted feedback benefits students, and analysis of qualitative observation data alongside quantitative survey data gives the teacher actionable insight into student needs.
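For teachers who want to automate the checklist step, the underlying logic is easy to sketch. This is a hypothetical Python illustration, not the author’s actual Google Forms workflow; the concept names and the cutoff value are assumptions for demonstration only.

```python
def study_checklist(ratings: dict[str, int], cutoff: int = 3) -> list[str]:
    """Return the concepts a student rated below the cutoff on the 1-4 scale."""
    return [concept for concept, rating in ratings.items() if rating < cutoff]

# Illustrative ratings; the course's actual 14 key concepts are not listed here.
student_ratings = {
    "handwashing": 4,
    "cross-contamination": 2,
    "time-temperature control": 1,
}
print(study_checklist(student_ratings))
```

Any concept rated below the cutoff lands on the student’s personalized study checklist; concepts rated 3 or 4 are omitted.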


Content-restricted AI reduces risk while capitalizing on its strengths to expand learning opportunities...

Next, students engage with a MagicSchool Room AI Tutor trained on the food safety competency and guided by a custom prompt. Students’ confidence ratings differentiate the conversation-style review the AI Tutor initiates: concepts rated 4 receive minimal review; 3, peer-style support to bridge knowledge gaps; 2, clarification with checks for comprehension; and 1, a full instructional presentation with scaffolding and follow-up checks for understanding.
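The rating-to-review mapping can be expressed as a simple lookup. This Python sketch is illustrative only: the tier labels paraphrase the post, and the function name is invented, not part of MagicSchool’s tooling.

```python
# Hypothetical mapping of 1-4 confidence ratings to AI Tutor review tiers.
REVIEW_TIERS = {
    4: "minimal review",
    3: "peer-style support to bridge knowledge gaps",
    2: "clarification and checks for comprehension",
    1: "full instruction with scaffolding and follow-up checks",
}

def review_tier(rating: int) -> str:
    """Look up the review tier for a 1-4 confidence rating."""
    if rating not in REVIEW_TIERS:
        raise ValueError(f"rating must be 1-4, got {rating}")
    return REVIEW_TIERS[rating]
```

In practice, the custom prompt encodes this same branching for the AI Tutor, so each student’s self-rating selects how much support the conversation provides.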


MagicSchool AI Tutor script based on competency content and custom prompt for student engagement.

The AI Tutor increases fairness through clarification, translation, and vocabulary learning (Montenegro & Jankowski, 2017). Content-restricted AI reduces risk while capitalizing on its strengths to expand learning opportunities (Selwyn, 2011). After working with the AI Tutor, students complete a follow-up survey. This survey collects requests for help, reflections on the review process, and data parallel to the check-in survey. Results generate revised checklists to guide continued learning.


Follow-up survey introduction.

The resulting spreadsheet data measures growth and identifies students who fall below the 75% proficiency threshold for intervention. This process reinforces that it is okay to ask for help (Bloom, 1974). Aggregate survey data also reveals patterns that indicate a need for reteaching, revising, or realigning lesson plans and content presentations, improving equity of access for groups of students whose performance differs from the average (Broadfoot & Black, 2004; Montenegro & Jankowski, 2017).
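A teacher comfortable with scripting could reproduce the spreadsheet’s growth and intervention flags along these lines. This Python sketch assumes scores are stored as fractions of 1.0 and uses invented student labels; the actual sheet layout may differ.

```python
def summarize_growth(pre: dict[str, float], post: dict[str, float],
                     threshold: float = 0.75):
    """Compute per-student growth and flag students below the proficiency threshold."""
    growth = {student: post[student] - pre[student] for student in post}
    needs_intervention = sorted(s for s, score in post.items() if score < threshold)
    return growth, needs_intervention

# Illustrative data, not real student results.
pre = {"Student A": 0.50, "Student B": 0.80}
post = {"Student A": 0.70, "Student B": 0.90}
growth, flagged = summarize_growth(pre, post)
print(flagged)  # Student A improved but remains below the 75% threshold
```

The growth column shows whether the review cycle worked, while the flag list tells the teacher exactly who still needs intervention.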


 ...this approach teaches students that mastery is within their reach.

The electronic assessment course taught me to intentionally assess for learning, using manageable, equitable, differentiated assessments aligned with learning objectives and course competencies. Reflection, support, and encouragement focused on growth rather than grades help empower learners to reach mastery (Stommel, 2021). By allowing students to reframe their learning levels in low-stakes terms through reflection, this approach teaches students that mastery is within their reach. That is the goal of competency-based culinary education and the purpose of this electronic assessment design.


References


Bloom, B. S. (1974). Time and learning. American Psychologist, 29(9), 682–688. https://doi.org/10.1037/h0037632.


Blum, S. D. (2022). The ungrading umbrella. Grow Beyond Grades. https://growbeyondgrades.org/blog/the-ungrading-umbrella.


Broadfoot, P., & Black, P. (2004). Redefining assessment? Assessment in Education: Principles, Policy & Practice, 11(1), 7–26. https://doi.org/10.1080/0969594042000208976.


Kohn, A. (2011). The case against grades. https://www.alfiekohn.org/article/case-grades/.


Montenegro, E., & Jankowski, N. A. (2017). Equity and assessment: Moving towards culturally responsive assessment (Occasional Paper No. 29). NILOA. https://www.learningoutcomesassessment.org/wp-content/uploads/2019/08/OccasionalPaper29.pdf.


Selwyn, N. (2011). Education and technology: Key issues and debates. Continuum International Publishing.


Shepard, L. A. (2000). The role of assessment in a learning culture. Educational Researcher, 29(7), 4–14.


Stommel, J. (2021, February 22). Grades are dehumanizing; Ungrading is no simple solution. https://www.jessestommel.com/grades-are-dehumanizing-ungrading-is-no-simple-solution/.


Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). ASCD.

 
 