Auld, G.; Baker, S.; McGirr, K.; Osborn, K.S.; Skaff, P.
Objective: To confirm the reliability and validity of a previously validated evaluation instrument in a new context.

Methods: In a cross-sectional study, the researchers described the processes and results of testing Cooking Matters' (CM) use of the Expanded Food and Nutrition Education Program's Behavior Checklist as a retrospective pretest/posttest. They determined reliability, face and content validity, and response-shift bias with 95 CM participants.

Results: Most items had acceptable face validity and moderate reliability; other items lacked reliability, face validity, or content validity (ie, they were unrelated to the CM curriculum).

Conclusions and Implications: A proper match between evaluation tools and curricula is needed for appropriate program assessment; without it, outcome data can be misleading or invalid. Confirming validity is essential when adopting others' evaluation tools in new contexts, particularly for widely used programs such as federally funded programs and national nonprofit organizations.