With over 115,000 PAs actively working in the US and a projected employment increase of 37% through 2025, it is fair to say that the PA profession is growing.
Along with burgeoning demand for PA clinicians, there is an even greater burden placed on PA academic programs to recognize the individuals best prepared for the rigors of a graduate PA program.
It is, therefore, increasingly important that academic institutions can identify student characteristics associated with success in PA school.
Despite a rigorous screening process, PA programs still experience a 5.2% attrition rate (the percentage of students who do not complete the program), which undermines even their best screening efforts and harms both students and the economic stability of the program.
In addition to completing the academic and clinical requirements, PA students must pass the Physician Assistant National Certifying Examination (PANCE). As we all know, PANCE pass rates are a big part of the accreditation process and can influence the character and quantity of future applicants.
Developing a reliable screening standard would be a win-win: programs could be more confident that the students they accept will complete the program, and students would have a fair and reasonable metric by which to gauge their competencies.
Predictors of PA School Success
The current research on predictors of PA school success is inconsistent and, at times, contradictory.
About half of PA schools require the GRE, while most also consider overall and science GPA minimums, volunteer hours, shadowing hours, direct patient care hours, and specific coursework, in addition to many other factors such as the personal interview, supplemental applications, and the forever-dreaded PA school personal statement.
What Defines PA School Success?
To answer the question of whether a particular metric can predict academic success, we must first ask the question of what it means to succeed in PA school.
The quality of your PA school experience is something that everyone defines a little differently. For you, PA school success might mean getting good grades, preparing well for future practice, having meaningful clinical experience, or making connections that will help you to achieve your goals, among many other options.
Many of these definitions are subjective, personal, or unquantifiable. For this discussion, we will consider two variables used by the majority of PA school admissions teams — successful completion of the PA program from start to finish and passing your PANCE/high PANCE test scores.
Does the GRE Predict PA School Success?
There have been few studies to determine the relationship between GRE scores and PA student success.
The studies that do exist are self-contradictory and have demonstrated a weak (if any) association between GRE scores and PA school performance.
Based on the existing research, "GRE scores have been determined inadequate and inconsistent predictive measures of PA school success."
Also, there is no convincing evidence of a relationship between GRE scores and graduate student success for most graduate school programs. GRE scores turned out to be only “moderate predictors of first semester grades” and “weak to moderate predictors of graduate GPA.”
Despite this existing data, the GRE remains a required prerequisite at more than half of PA schools nationally. Ugh!
Does Undergraduate GPA Predict PA School Success?
Most (if not all) PA schools use overall undergraduate GPA and science GPA as criteria for PA school student selection even though recent research studies have had surprisingly contradictory results.
Two studies reported GPA among the most reliable predictors of success on the PANCE, while another showed that higher admission GPA was positively associated with first attempt PANCE scores.
In contrast, a different study examining 119 PA students failed to identify a correlation between undergraduate GPA and performance on the PANCE, nor was science prerequisite GPA associated with PANCE performance.
What was the most reliable predictor of PANCE success? Performance on foundational coursework taken during PA school, which, of course, is useless when it comes to identifying successful Pre-PA school applicants.
Can the MCAT predict PA School Success?
Each year, some 90,000 aspiring doctors take the MCAT® exam, and scores of admissions teams comb through the results as part of their search for talented future physicians. Surely that means the test must do its job well?
Researchers have spent years studying thousands of students to assess the MCAT® exam. The researchers suggest a strong relationship between MCAT scores and students’ success in the first year of medical school.
Research has found "that, on average, students with higher MCAT scores perform better in their first-year courses." Also, "they found these results at a range of schools that vary in many ways, including their academic missions, applicant pools, teaching practices, and approaches to grading."
What’s more, the numbers indicate that the MCAT predicts how well students perform in second-year courses as well as on Step 1 of the United States Medical Licensing Examination, both significant milestones on the path toward on-time graduation from medical school.
Researchers also wanted to know whether MCAT scores added value beyond what undergraduate GPAs offer in predicting medical school success. The data showed that both measures predicted well — but that using the two metrics together worked much better. “Admissions committees can be reassured that it makes sense to continue using both the MCAT and GPA.”
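The idea of "incremental validity" described above can be sketched with a small synthetic example: fit an outcome on GPA alone, then on GPA plus a test score, and compare the variance explained. All data and coefficients below are made up for illustration; this is not the researchers' actual analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical synthetic data: a test score partially correlated with GPA
gpa = rng.normal(3.3, 0.4, n)
test = 0.5 * (gpa - 3.3) / 0.4 + rng.normal(0, 1, n)
# First-year grades depend on both predictors, plus noise
outcome = 0.5 * (gpa - 3.3) / 0.4 + 0.4 * test + rng.normal(0, 1, n)

def r_squared(predictors, y):
    """Fraction of variance in y explained by a least-squares fit."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_gpa = r_squared([gpa], outcome)
r2_both = r_squared([gpa, test], outcome)
print(f"R^2 with GPA alone: {r2_gpa:.2f}")
print(f"R^2 with GPA + test score: {r2_both:.2f}")
```

When the test carries information that GPA does not, the combined model explains more variance, which is the pattern the MCAT researchers reported.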
Some PA programs accept the MCAT as a PA school admission prerequisite, though to date, there is no conclusive evidence as to the MCAT's utility as a predictor of PA school success.
Can the SAT or ACT Predict PA School Success?
Currently, no studies exist to determine the usefulness of other standardized tests such as the ACT or SAT in relation to PA school success.
But due to the weak association between GRE and PANCE performance, it has been predicted that "ACT and SAT scores would likely show a weak or no association with PA school success."
Can Empathy, Communication Skills, and Teamwork Predict PA School Success?
Research has demonstrated that certain personality types may provide insight as to how a student will perform in PA school.
Specifically, conscientiousness and emotional intelligence are moderately predictive of success in the early years and much more predictive of academic achievement in the later years of education.
Many institutions either use a rating for high school extracurricular involvement in their admissions process or subsume it within a broader “personal qualities” rating. Several institutions have investigated the relationship of extracurricular activities to college academic achievement. One college found no correlation overall, though some of these students did less well than predicted; another found a small positive correlation between the extracurricular rating and college grades.
Are Aptitude Tests a Good Measure of College Completion and Academic Success in Undergraduate Programs?
Most studies find that the correlation between SAT scores and first-year undergraduate college grades is not overwhelming and that SAT scores explain only 10 percent to 20 percent of the variation in first-year GPA.
Generally, higher standardized test scores do show a correlation with college success as it's usually defined. ... In other words, a student who gets a higher score on the ACT or SAT is slightly more likely to be more successful in college, but only slightly.
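To make "only 10 to 20 percent of the variation" concrete: variance explained is the square of the correlation coefficient, so those figures correspond to fairly modest correlations. A quick arithmetic check:

```python
import math

# "Explaining 10-20% of the variance" means R^2 is 0.10-0.20;
# the corresponding correlation coefficient is r = sqrt(R^2).
for r2 in (0.10, 0.20):
    print(f"R^2 = {r2:.2f}  ->  correlation r = {math.sqrt(r2):.2f}")
```

That is, an R^2 of 0.10 to 0.20 corresponds to a correlation of roughly 0.32 to 0.45, which is why a higher score makes success only "slightly" more likely.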
In a report published by the American Enterprise Institute, researchers calculated a student’s likelihood of graduation based on both her high school GPA and her SAT or ACT score. While better marks on both measures predict a better chance of completion, "the relationship between high school GPA and graduation rates was by far the strongest."
What Happened to Evidence-Based Medicine?
In a recent survey conducted by the National Association for College Admissions Counseling, only 51 percent of the colleges surveyed reported conducting predictive validity studies to discover whether aptitude tests tell them anything helpful. Yet nearly 8 out of 10 undergraduate colleges require aptitude exams.
In other words, schools are quick to adopt standardized test scores as a standard without testing to see if these test scores predict academic or future success.
Among the 51 percent that do crunch the numbers, 59 percent conduct validity studies annually and 24 percent every other year. Colleges have their own study protocols for gauging whether SAT or ACT scores mean anything.
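At its core, a predictive validity study of the kind described above just asks whether an admissions score correlates with a later outcome. A minimal sketch on synthetic data (the score scale, slope, and noise level here are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical admissions scores and a later-program GPA that depends
# weakly on them, plus substantial noise (synthetic data)
admit_score = rng.normal(300, 10, n)
program_gpa = 3.0 + 0.01 * (admit_score - 300) + rng.normal(0, 0.3, n)

# Pearson correlation between the admissions metric and the outcome
r = np.corrcoef(admit_score, program_gpa)[0, 1]
print(f"Pearson r between admissions score and program GPA: {r:.2f}")
```

A program running this kind of analysis on its own cohorts, year after year, is what separates an evidence-based admissions metric from an untested one.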
Can the PA-CAT Exam Predict PA School Success?
In full disclosure, my initial feelings regarding the PA-CAT were that it was a money grab.
Exam Master, the privately owned company that has created and marketed The PA-CAT (along with PANCE preparation tools, and test prep for nursing, dental and medical schools), has a substantial financial incentive to bring another assessment tool to market amid a frantic search by PA school administrators desperately seeking reliable metrics for PA school admissions. The market is significant, and there is a lot of money to be made by those hoping to generate a tidy profit.
Not to mention that an exam company such as Exam Master can "double-dip": it can not only market an exam to PA programs but also make and sell PA-CAT preparation resources to pre-PA students. Cha-Ching!!
Let us not forget the "triple-dip" where Exam Master charges Pre-PAs to release those exam results to PA programs as part of their CASPA application!
In theory, if a test is well designed, it only asks questions that target useful skills. If all the items on the test are important, then somebody who can answer 100 percent of them should have a better chance of succeeding than somebody who can answer only 80 percent or 50 percent.
But we know the GRE is virtually useless in predicting PA school success. Still, just like prescriptions for Lipitor, it has been adopted quickly and without evidence by the majority of PA programs.
Data from the MCAT and medical school performance gives me some hope. But the most recent iteration of the MCAT was developed by an advisory committee and survey respondents made up of medical school deans, administrators, basic and clinical science faculty, pre-health advisors, medical students, medical residents, and medical school faculty, not by a private company such as Exam Master.
Still, it is unlikely that any standardized test can ever adequately measure competencies beyond medical knowledge.
In studies that demonstrate a strong correlation between test scores and academic achievement, it is only true for the average; there are always some people who do terribly on a test, then excel later. An overreliance on just one measure of performance, such as the PA-CAT, risks missing a pool of candidates with other valuable attributes to contribute to the health care system.
My final thought is that no school should adopt a requirement such as the PA-CAT until there is evidence that it actually works. We should be especially mindful of the profit motive behind exams such as the PA-CAT, and guard against its influence on our next generation of passionate, driven, and compassionate PA professionals. And most of all, we as PAs, Pre-PAs, and PA students should question the metrics being used in the PA school admissions process and harness the power of "big data" to help identify evidence-based admissions practices.
Have thoughts? Please share them in the comments section!
Stephen Pasquini PA-C
- How well does the MCAT® exam predict success in medical school?
- If GPA is the best predictor of college success, why do colleges cling to ACT and SAT?
- Can Aptitude Tests Really Predict Your Performance?
- College Vine: How Good is the SAT/ACT at Predicting College Success?
- Forbes: What Predicts College Completion? High School GPA Beats SAT Score
- Does the MCAT predict medical school and PGY-1 performance?
- Physician Assistant College Admissions Test (PA-CAT)
- Factors Associated with Academic Performance in Physician Assistant Graduate Programs and National Certification Examination Scores: A Literature Review