Explanation of Measure Validity, Reliability, Trustworthiness, and Fairness
Measure: Missouri Content Assessment (MoCA)
Evidence regarding validity | The MoCA content area assessments are exit assessments for each certification area and must be passed by every candidate. DESE worked with Pearson to ensure the assessments’ validity with respect to the MoSPE standards and the content requirements for all teachers in Missouri. |
Evidence regarding reliability | The MoCA content test is mandated by the state of Missouri for the APR and is required for initial certification for each completer. DESE has reported no reliability concerns. |
Evidence regarding trustworthiness | P-12 Partners report that MAE program candidates have strong content knowledge. |
Evidence regarding fairness | Pearson’s technical manual addresses the fairness of the assessment. Our candidates have had a 100% pass rate, so we have not observed problems with fairness. In addition, the MAE program receives 13 vouchers from DESE, each valued at $25.00, for students with financial need. |
Measure: GPA Requirements (overall/content/professional)
Evidence regarding validity | Candidates for certification were required to have a minimum overall GPA of 2.75 on a 4.00 scale until Spring 2021, when that requirement was removed by DESE. Professional education coursework and content area coursework also had to be completed with a minimum GPA of 3.00 on a 4.00 scale; this requirement was likewise eliminated by DESE in Spring 2021.
Content validity is ascertained through successful completion of the courses and candidates’ abilities to perform well in field experiences and pass the MoCA. Further, our candidates perform well in their courses, which provides evidence of alignment between the assignments and the instruction. Professional education courses with field experiences rely on our P-12 partners for feedback, as we often work as a team with the candidates and/or completers. It is very rare that an admitted candidate does not meet GPA requirements. |
Evidence regarding reliability | The MAE and Truman faculty who teach the content courses have University approval to teach graduate courses and the required credentials to teach at the undergraduate level. All syllabi are approved through faculty governance, and our faculty have been fairly consistent in how frequently they teach particular courses. For multiple sections of MAE courses, faculty meet periodically to ensure consistency of content and assignments. |
Evidence regarding trustworthiness | Because we use multiple measures for meeting the APR and our students meet these measures well, we consider our GPA measures credible, dependable, and confirmable. |
Evidence regarding fairness | Even in the best-case scenarios, evaluating student work for grades can have variability. Students who feel that their work has been evaluated unfairly can submit a grade appeal petition form to the Registrar’s Office. In addition, students who feel they were discriminated against due to race, religion, etc., can file a complaint through the non-discrimination complaint reporting and resolution procedure with our Title IX office. Finally, students can reach out to the Department Chairs, Deans, and/or the Provost if they have concerns. |
Measure: Professional Disposition Rubric
Evidence regarding validity | When the instrument was first created, faculty used research-based sources to determine skills and categories, which were vetted by mentor teachers, candidates, and completers. The Elementary MAE Faculty used the document for several semesters; the MAE Faculty then recommended changes for MAE program adoption, which were brought forward to the Advisory Board. After that review, we made the changes and have utilized the instrument since. The professional disposition rubric data align with our candidates’ performance as reflected in MEES scores, GPA, and qualitative feedback from mentors and University Supervisors. |
Evidence regarding reliability | The professional disposition rubric is submitted by candidates, P-12 Partners, and MAE Faculty for each field experience. The data are examined by each supervisor for consistency among the stakeholders during that course. In addition, the data are collected by the Education Department each semester, and we analyze them by candidate to ensure consistency over multiple courses. University Supervisors explain the form to the mentors and candidates prior to, or at the beginning of, the clinical experience and then review the data at the end of the experience. |
Evidence regarding trustworthiness | The scores represent the overall strengths of our candidates and completers and are trustworthy when considered alongside high employment rates, successful certification, and mentors who ask to work with our candidates. |
Evidence regarding fairness | We have compared disposition scores across MAE programs, by candidates’ gender, and for the same candidate across multiple courses. No significant differences were found. |
Measure: Social Justice Rubric
Evidence regarding validity | The Social Justice Rubric was created based upon several sources, including Teaching Tolerance (now Learning for Justice), Zaretta Hammond, and others’ work, to support our goals for social justice dispositions. The second version was designed after reviewing data collected from our stakeholders. |
Evidence regarding reliability | The data are self-reported and then examined by the mentor and the university supervisor, so three people report on the candidate’s success. We expect reliability to increase once the measure is revised after the next pilot phase and we begin training on the final product. |
Evidence regarding trustworthiness | In process; this is a new measure, and we have adjusted it again this semester. |
Evidence regarding fairness | In process; this is a new measure, and we have adjusted it again this semester. |
Measure: Professional Development Plan (PDP)
Evidence regarding validity | The PDP document was adapted from the New Jersey Department of Education Optional Teacher Professional Development (PDP) Template and Sample PDP. This instrument was chosen after reviewing numerous other state-used instruments as the most “user friendly” and most “adaptable” while still validly assessing the PD activities of MAE students. |
Evidence regarding reliability | The data are self-reported; consistency comes from all MAE candidates completing the form. A tabulation and initial analysis of PDP Section I (the areas MAE students believe they need to develop further as professional teachers) was conducted in Spring 2020. We plan to continue examining the other two sections, as well as the consistency between the PDP and other self-report measures of development. Once students complete the form, they sign and date the document and have their mentor teacher and University Supervisor sign and date it before turning it in. |
Evidence regarding trustworthiness | Similar to fairness, given the self-report nature of the PDP, there is no reason to believe that a student would be encouraged to respond to any particular item(s) in a given manner or to reflect any particular “socially desirable” responses. |
Evidence regarding fairness | Given that the PDP is entirely self-reported there is no reason to believe there would be bias. Fairness does not seem to be an issue given the structure of the PDP. |
Measure: First-Year Teacher Survey Questionnaire
Evidence regarding validity | This measure was created and subsequently examined for content validity and structural validity by DESE. Content validity was established through an item validation survey and expert review. To determine structural validity, a confirmatory factor analysis of the measure was conducted, but the model fit was poor. A subsequent exploratory factor analysis revealed eight factors accounting for 100% of the variance. Some of these eight factors contained items largely aligned with the standard they were intended to assess (Standards 4, 5, and 7), while other items cross-loaded and thus did not align with only one standard. |
Evidence regarding reliability | Internal consistency was measured using Cronbach’s coefficient alpha. The Cronbach’s α for each of the standards is as follows:
Standard 1: Cronbach’s α=.85
Standard 2: Cronbach’s α=.86
Standard 3: Cronbach’s α=.71
Standard 4: Cronbach’s α=.85
Standard 5: Cronbach’s α=.90
Standard 6: Cronbach’s α=.87
Standard 7: Cronbach’s α=.92
Standard 8: Cronbach’s α=.80
Standard 9: Cronbach’s α=.83
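For reference (this standard formula is supplied here for context and is not taken from the DESE report), Cronbach’s coefficient alpha for a scale of k items is computed as

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_X^2}\right)
\]

where \(\sigma_i^2\) is the variance of item i and \(\sigma_X^2\) is the variance of the total score. By the commonly cited convention that \(\alpha \geq .70\) indicates acceptable internal consistency, all nine standards meet or exceed this threshold. |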
Evidence regarding trustworthiness | The data reported by DESE align with the feedback the MAE Faculty have received from employers who hire program completers. |
Evidence regarding fairness | Only programs with at least 10 respondents are included in the report. Therefore, while the data provided by this measure are helpful, they do not reflect graduates of all teacher education programs in the MAE. |
Measure: Principal of First-Year Teachers Survey Questionnaire
Evidence regarding validity | This measure was created and subsequently examined for content validity and structural validity by DESE. However, there is not a separate technical manual for this survey. DESE reported that this survey is reviewed with the First-Year Teacher Survey Questionnaire. |
Evidence regarding reliability | DESE does not report reliability; however, our scores are consistent year to year. |
Evidence regarding trustworthiness | The data reported by DESE aligns with what we have heard from employers who hire our completers. |
Evidence regarding fairness | Only programs with at least 10 responses are included in the report. Therefore, while the data provided from this measure are helpful, they do not reflect graduates of all teacher education programs in the MAE. |
Measure: Undergraduate Degree
Evidence regarding validity | The Higher Learning Commission has reviewed and approved our undergraduate programs; in addition, all undergraduate and graduate programs and courses undergo Truman’s internal review process through the Faculty Senate (Undergraduate Council and Graduate Council committees). |
Evidence regarding reliability | All graduates of a given undergraduate program are required to take the approved courses for that major, and all are expected to experience the core mission of Truman’s liberal arts and sciences focus. |
Evidence regarding trustworthiness | There are no qualitative measures for this evidence. |
Evidence regarding fairness | The University Assessment Committee examines graduation rates from programs and other measures such as the undergraduate portfolio results to determine fairness. |
Measure: Portfolio
Evidence regarding validity | The portfolio is aligned with MoSPE, which is a valid instrument and part of the APR. In addition, students write two of their reflections to focus on the MAE 2028 priorities of social justice and technology. |
Evidence regarding reliability | Faculty received training on how to score the portfolios, using sample artifacts and reflections from current candidates, in Fall 2020. MAE Faculty were assigned to read and evaluate a select number of the students’ work, while the MAE Department Chair reviewed and evaluated all of the students’ work. |
Evidence regarding trustworthiness | Each artifact needs to support its accompanying reflection, and vice versa. |
Evidence regarding fairness | All candidates are expected to complete the portfolio, and if the scores among the evaluators are not similar, we will review that particular portfolio with another MAE Faculty member to ensure fairness. |
Measure: Graduate Student Exit Questionnaire (GSEQ)
Evidence regarding validity | The Graduate Student Exit Questionnaire was examined by the Graduate Council for validity in 2019. Variations of this survey have been utilized for more than 20 years at TSU. Results are reported by graduate program and aggregated across graduate studies, and each program receives its data yearly. |
Evidence regarding reliability | Scores on this measure have been consistently high. |
Evidence regarding trustworthiness | Students have a comment space where they can add qualitative information, which is reviewed by the MAE Department each year. |
Evidence regarding fairness | The completion of this form is a graduation requirement for all graduate students at the university. |
Measure: MAE Completer Survey
Evidence regarding validity | Dr. Marty Strange (MAE Faculty member) worked with the MAE Department Chair and members of the Assessment Committee to ensure that this measure meets our content needs. In addition, the IRB committee reviewed the study through its IRB application. |
Evidence regarding reliability | Participation has been limited, resulting in a low n. Reports have been consistent across consecutive years, with additional data gathered annually to monitor reliability. |
Evidence regarding trustworthiness | Based on the survey results, focus groups were established to follow up on information from the survey. This provides qualitative data to triangulate with the survey. |
Evidence regarding fairness | As per the IRB, no one is being paid to participate in the study. There are no negative consequences for anyone who chooses not to take the survey. The survey will be sent to all completers for that academic year. |
Measure: MAE Completer Focus Group
Evidence regarding validity | Questions are based on results of the MAE Completer Survey that need further elaboration. |
Evidence regarding reliability | A random sample of completers is asked to participate in focus groups. The intent is to use focus groups each year after the results of the current MAE Completer Survey are examined. Consistency will be measured over time. |
Evidence regarding trustworthiness | The qualitative responses are triangulated with the Completer Survey and possibly other measures such as the First-Year Teacher Survey Questionnaire data and Principal of First-Year Teachers Survey Questionnaire data. |
Evidence regarding fairness | Random sampling of participants will be implemented to maximize fairness. |
Measure: Missouri Educator Evaluation System (MEES)
Evidence regarding validity | DESE worked with P-12 Partners, the Missouri Association of Colleges for Teacher Education, and other constituencies, including Pearson, to establish the content validity of the MEES with respect to the MoSPE. |
Evidence regarding reliability | All MAE Faculty are trained annually for the use of the MEES by the Regional Professional Development Center to ensure inter-rater reliability. |
Evidence regarding trustworthiness | Qualitative data are not expected with the MEES; however, our MAE Faculty discuss the rubric with the P-12 Partner and Candidate during the summative evaluation. |
Evidence regarding fairness | Truman completers have had a 100% pass rate, so problems with fairness have not been observed. |
Measure: Advisory Board
Evidence regarding validity | The items that the MAE brings to the Advisory Board come from discussions at Department Meetings or fit with the MAE 2028 document, ensuring content validity. In addition, discussions of what happens in the context of placements with our P-12 Partners improve the program. |
Evidence regarding reliability | Attendance at meetings is inconsistent, which is not ideal in terms of reliability. Meeting synopses are sent to all members, and feedback is requested from those who could not attend. |
Evidence regarding trustworthiness | The qualitative feedback that we receive from our P-12 partners is invaluable. |
Evidence regarding fairness | When not all members attend a meeting, fairness is not fully met. Sending out the synopsis and asking for feedback mitigates this to some degree, but responses are not received as frequently as desired. |
Measure: School Partnership and Outreach
Evidence regarding validity | Each outreach and partnership has periodic evaluations from the different stakeholders, and the Department Chair compares the goals of the P-12 partners to the MAE mission and outcomes for alignment. |
Evidence regarding reliability | Multiple stakeholders are involved in each partnership and outreach (some of which have lasted multiple years) providing consistency over time. |
Evidence regarding trustworthiness | The qualitative comments support our outcomes and expectations in terms of the impact of these programs and opportunities. |
Evidence regarding fairness | There is a careful attempt to work closely with school partners to ensure fairness and to listen to their concerns/issues (if any) so issues can be addressed immediately. We want our partnerships to be true partnerships. |
Measure: Truman MAE Curriculum Matrix
Evidence regarding validity | DESE approves all matrices for all courses for each MAE program. |
Evidence regarding reliability | Faculty who teach multiple sections of the same courses meet periodically to ensure consistency with outcomes. |
Evidence regarding trustworthiness | Course evaluations indicate trustworthiness. |
Evidence regarding fairness | Course evaluations indicate fairness; if there are any questions, students may go to the Department Chair of Education, the Dean of Health Sciences and Education, or the Academic Office. |
Measure: ESOL Add-on Certification
Evidence regarding validity | Courses have been approved by DESE. |
Evidence regarding reliability | Multiple faculty who teach the particular courses review for consistency among sections. |
Evidence regarding trustworthiness | We examine the number of advisees each semester. |
Evidence regarding fairness | All advisors discuss the ESOL add-on certification, and it is highlighted on the webpage. |
Blank Copies of Locally Developed Instruments
MAE Professional Disposition Rubric
Professional Development Plan
Social Justice Disposition Form Version 1 (used Spring 2021)
Social Justice Disposition Form Version 2 (currently in use)