Exam Info
The CSAEd certifications use computer-based exams composed of multiple-choice questions, administered in a single online session with one predefined break. All questions are included in calculating a candidate's score.
The exams contain the following numbers of multiple-choice questions, with the noted maximum time limits:
- Core certification exam: 150 questions (three (3) hours)
- Specialty certification exams: 50 questions (two (2) hours per exam)
Individuals may request exam accommodations within the online application.
Pilot Program Application & Exam Window
- Applications available: September 27, 2022
- Application deadline: October 23, 2022 at 11:59 pm ET
- Notification of application status and exam registration opens: October 31, 2022
- Exam administration window: November 1 - December 12, 2022
- Exam results shared: By the week of January 9, 2023
- Retake exam window: January 30 - February 10, 2023
Exam Content Outlines
The core exam and each specialty exam cover the same eight domains. For a detailed exam content outline for each certification exam, including a breakdown of the percentage of the exam devoted to each domain, see the Candidate Handbook (Appendix A). Below is a listing of each domain and its description.
Foundations of the Profession: This domain encompasses the foundational understanding of the histories and contexts of higher education systems and connects social justice, inclusive histories, and philosophies to the student affairs profession. This domain includes a commitment to research, professional standards and codes of practice, law, and organizational policies.
Student Learning, Development, and Success: This domain encompasses the application of student learning and development theories while centering and advocating for holistic student learning, development, and success. It includes the design of programs and services that retain, develop, and move students toward completion and graduation.
Assessment and Evaluation: This domain encompasses the appraisal of the quality and effectiveness of higher education work with understanding and appreciation for different contexts, cultures, and backgrounds. Specifically, it includes the practice of assessment and evaluation grounded in outcomes, the use of a variety of methods and tools to carry it out, and the use of the resulting data to identify strengths and opportunities for improvement in programs, delivery, or actions.
Social Justice and Inclusion: This domain encompasses the process, outcomes, and dynamic influence of individual and institutional awareness and action to foster inclusion, create equity, and ensure access, grounded in an understanding of systems of oppression and privilege and how they are perpetuated in our practice and communities. This domain includes our individual dispositions and sense of agency and responsibility for justice for ourselves, others, our community, and the larger global context.
Leadership: This domain encompasses the practices of embracing institutional values and vision to empower and engage others; valuing diverse, inclusive, and equitable views and methodologies to take risks and evolve communities; adaptively approaching problems and challenges; and navigating different types of business, political, personnel, and financial pressures to create transformative change.
Talent Management: This domain encompasses the competencies needed to support the talent life cycle within an organization. By applying effective talent management practices, student affairs educators attract, develop, and retain staff who are enabled and empowered to set and reach personal and organizational goals. In this work, the student affairs educator plays an active role in continual assessment and in providing the supports and interventions needed to develop the full potential of all staff, including full-time, part-time, graduate, and undergraduate members. The student affairs educator addresses the individual needs of staff members so that they can collectively engage in mission-based work that advances student learning, development, and success.
Crisis and Risk Management: This domain encompasses the ability to understand, educate about, plan for, and apply information pertinent to emergency situations and to operationalize risk management. It includes managing uncertainty, using data, and providing direction toward institutional objectives related to crisis response and risk management.
Financial and Facility Management: This domain encompasses contributing to and implementing the effective and efficient delivery of an organization’s strategic and operational goals, managing financial and facility resources that help ensure a safe and productive environment to fulfill the mission of the organization, and practicing ethical and equitable management of financial resources.
Exam Results
The CSAEd Core and CSAEd Specialty Area exams are designed to measure a candidate's performance against a predetermined standard: the level of knowledge and competency in the established domains that can reasonably be expected of mid-level student affairs and services educators, including within six specific functional areas.
Each exam measures what the candidate knows at the time they take the assessment. A candidate's exam performance is not compared to other candidates' performance on the same exam. Passing scores were established for the CSAEd Core and Specialty exams through a panel-based standard setting process using a no-data Angoff method (see Exam Scoring Process and Validity below).
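To make the criterion-referenced approach concrete, here is a minimal Python sketch. The cut score used is purely hypothetical; actual cut scores and score scales are set through the standard setting process described below:

```python
# Criterion-referenced scoring: each candidate is judged against a fixed,
# predetermined standard, never against other candidates' results.
# The cut score below is hypothetical, for illustration only.
HYPOTHETICAL_CUT_SCORE = 105  # e.g., questions answered correctly out of 150

def exam_result(questions_correct: int) -> str:
    """Return a pass/fail outcome against the fixed standard."""
    return "Pass" if questions_correct >= HYPOTHETICAL_CUT_SCORE else "Did Not Pass"

# Two candidates are evaluated independently; neither outcome depends on the other.
print(exam_result(118))  # Pass
print(exam_result(97))   # Did Not Pass
```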
Candidates will be informed via email of whether they passed or did not pass an exam. Candidates participating in the pilot program will receive their results by email in early January 2023. After the conclusion of the pilot program, candidates will receive email notification of their exam results within thirty (30) calendar days of completing a certification exam.
Exam Scoring Process and Validity
A panel-based standard setting process utilizing a no-data Angoff method was conducted in May and June of 2022 for the seven Student Affairs Educator Certification exams.
A panel of fifty-three (53) subject matter experts (SMEs), identified by the seven Consortium founding partner associations, was convened to execute the standard setting process. SME identification consisted of nominations of, and invitations to, each association's membership, including invitations to small college/university and community college member groups, racial and ethnic affinity member groups, and graduate preparation faculty.
A workshop was conducted virtually with the psychometric consultants, Consortium staff, and available panel members. Panel members who were unable to attend were given a recording of the meeting. The standard setting process was reviewed, a discussion of minimally qualified candidates was facilitated, and definitions and item ratings were presented and discussed. Panel members were assigned to either the General (Core) certification or to one of the six specialty certifications and were given two weeks to complete all of their ratings. Raters were asked to rate the entire new item bank for their assigned certification. To protect the integrity of the exam, this exercise was conducted using the standard setting tool in the secure, online Surpass platform.
Once the final ratings were collected, the psychometric team calculated the mean rating for each item (the estimated item difficulty, or item p-value, under Classical Test Theory) and examined overall rater agreement. Overall rater agreement was strong. Of the nine raters assigned to one specialty area, only one rater's average cut score fell more than two standard deviations below the mean and was therefore excluded from the exercise. All other raters' estimated cut scores for each certification fell within two standard deviations of the mean and were retained.
Further, the mean ratings of all but 68 items across all seven certifications fell within two standard deviations of the overall mean of these ratings. The 68 items identified as lacking rater agreement were flagged for future revision by subject matter experts, and none of them were included in the form assembly of the new exams. In addition to their ratings, SMEs were able to provide comments as they reviewed and rated items; these comments were used to further refine item quality.
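As a concrete illustration of the arithmetic in the two paragraphs above, here is a short Python sketch with invented numbers. It is not the Consortium's implementation; it simply shows the two-standard-deviation screening applied to hypothetical rater cut scores and item mean ratings:

```python
from statistics import mean, stdev

def within_two_sd(values):
    """Split values into (retained, flagged) by the two-standard-deviation rule."""
    m, s = mean(values), stdev(values)
    retained = [v for v in values if abs(v - m) <= 2 * s]
    flagged = [v for v in values if abs(v - m) > 2 * s]
    return retained, flagged

# Hypothetical rater cut scores: each is the mean of that rater's per-item
# Angoff ratings (the estimated probability that a minimally qualified
# candidate answers each item correctly). The ninth rater is an outlier.
rater_cuts = [0.68, 0.70, 0.66, 0.72, 0.69, 0.71, 0.67, 0.70, 0.20]
retained, excluded = within_two_sd(rater_cuts)
print("excluded rater cut scores:", excluded)     # [0.20]
print(f"passing standard: {mean(retained):.2f}")  # mean of retained cut scores

# The same rule screens item-level agreement: items whose mean rating falls
# outside two standard deviations of the overall mean of item ratings are
# flagged for SME revision and left out of form assembly (hypothetical values).
item_means = [0.70, 0.68, 0.72, 0.69, 0.71, 0.67, 0.70, 0.20]
_, flagged_items = within_two_sd(item_means)
print("flagged item mean ratings:", flagged_items)  # [0.20]
```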
Based on the item ratings and the approved exam blueprint, two (2) forms of each exam were assembled. The psychometric consultants used the items' predicted difficulty to select combinations of items that gave the two forms similar overall difficulty while ensuring that the content of each matched the approved exam blueprint.
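Form assembly from predicted difficulty can be sketched in the same spirit. The alternating ("snake-draft") heuristic below is an assumption for illustration, not the consultants' actual procedure; it shows how two forms can match a blueprint's per-domain counts while keeping overall difficulty similar:

```python
from statistics import mean

# Hypothetical item pool: (item_id, domain, predicted difficulty), where
# predicted difficulty is the item's mean Angoff rating (estimated p-value).
pool = [
    ("Q1", "Leadership", 0.62), ("Q2", "Leadership", 0.71),
    ("Q3", "Leadership", 0.55), ("Q4", "Leadership", 0.68),
    ("Q5", "Assessment", 0.80), ("Q6", "Assessment", 0.47),
    ("Q7", "Assessment", 0.74), ("Q8", "Assessment", 0.52),
]

# Hypothetical blueprint: how many items each form draws from each domain.
blueprint = {"Leadership": 2, "Assessment": 2}

form_a, form_b = [], []
for domain, n_per_form in blueprint.items():
    # Sort the domain's items by predicted difficulty, then alternate
    # assignments so both forms receive a comparable difficulty mix.
    items = sorted((i for i in pool if i[1] == domain), key=lambda i: i[2])
    for rank, item in enumerate(items[: 2 * n_per_form]):
        (form_a if rank % 2 == 0 else form_b).append(item)

for name, form in (("Form A", form_a), ("Form B", form_b)):
    ids = [item_id for item_id, _, _ in form]
    print(name, ids, f"mean predicted difficulty: {mean(d for *_, d in form):.2f}")
```

Running this sketch yields two forms that each satisfy the blueprint's domain counts, with mean predicted difficulties of roughly 0.61 and 0.66; in practice, assembly would also balance many other constraints.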