The U.S. News Best High Schools rankings, released today, give national recognition to 6,218 top-performing public high schools in all 50 states and the District of Columbia. These schools were awarded gold, silver and bronze medals based on their students’ graduation rates, performance on state tests and college readiness.
Each year, U.S. News is asked why individual schools moved up or down in the rankings compared with previous years. There are many possible reasons schools’ ranks changed in the 2016 edition. The most common explanations are outlined below.
1. Changes to the methodology used to rank the Best High Schools: In a major change, U.S. News factored high school graduation rates — an important measure of student success — into the methodology for the 2016 rankings. U.S. News was able to include this key outcome measure because, for the first time, graduation rates were available from all states and those rates were computed using a standard definition nationwide.
In previous years, schools were evaluated in three stages; for 2016, they were evaluated in four. To pass Step 3 of the new four-step methodology, a high school had to have a rounded graduation rate of 68 percent or greater for the 2013-2014 school year, measured for the cohort of students who entered ninth grade in 2010-2011.
The 68 percent threshold was based on a federal law that requires states to provide additional resources to schools whose graduation rates are 67 percent or lower. U.S. News believes that the 68 percent threshold provides a basic measure to ensure that ranked schools do not struggle to graduate their students.
In future rankings, U.S. News may increase the threshold rate needed to pass Step 3.
Schools without a graduation rate value were allowed to pass Step 3 as well, to account for varying state rules about which high schools a graduation rate is calculated for — a factor that high schools themselves have limited control over.
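The Step 3 rule described above can be sketched as a simple check. This is a hypothetical illustration, not U.S. News's actual code; the function name and inputs are assumptions.

```python
def passes_step3(graduation_rate):
    """Step 3 sketch: a school passes with a rounded graduation rate
    of 68 percent or greater for its 2010-2011 ninth-grade cohort.
    Schools with no reported rate also pass, since some states do not
    calculate a rate for every high school."""
    if graduation_rate is None:   # no rate reported by the state
        return True
    return round(graduation_rate) >= 68

# A school at 82.0 passes; one at 67.4 rounds to 67 and fails;
# a school with no reported rate passes by default.
```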
In total, 190 high schools that would otherwise have been awarded gold, silver or bronze medals failed to meet the 68 percent threshold.
U.S. News also made a smaller but significant change to the first step of the methodology.
This year, for the first time, U.S. News considered absolute performance on state-required tests. The 10 percent of schools with the highest performance on each state’s reading and math assessments automatically passed Step 1, while schools in the bottom 10 percent of their state’s reading and math test results were barred from passing it.
In the past, a school would pass Step 1 if its students’ performance on state tests exceeded what would be statistically expected for that school, based on the percentage of its students in poverty.
U.S. News made this adjustment to Step 1 to reward schools for exceptionally high performance on state assessments, regardless of their poverty level. This change also prevented schools with exceptionally low state test performance from being able to win a gold, silver or bronze medal.
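Taken together, the new Step 1 logic can be sketched as follows. This is a hypothetical illustration; the percentile input and the statistical-expectation flag are stand-ins for U.S. News's actual computation.

```python
def passes_step1(state_percentile, exceeds_expectation):
    """Step 1 sketch for the 2016 methodology.

    state_percentile: the school's standing (0-100) on its state's
        combined reading and math results.
    exceeds_expectation: whether performance beat what would be
        statistically expected given the school's poverty level
        (the sole Step 1 criterion in earlier editions).
    """
    if state_percentile >= 90:   # top 10 percent pass automatically
        return True
    if state_percentile < 10:    # bottom 10 percent are barred
        return False
    return exceeds_expectation   # otherwise, the original test applies
```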
2. Changes in relative performance on state tests: Some schools that were ranked in the 2015 Best High Schools rankings fell out of the 2016 rankings entirely because they were no longer among the best-performing schools on their statewide tests. Either their overall student performance on state tests during the 2013-2014 academic year did not exceed statistical expectations (Step 1 of the rankings methodology), or the performance of their least advantaged students did not match the state average (Step 2 of the methodology).
If they did not pass both of these steps of the methodology, schools were not eligible for a gold, silver or bronze medal.
3. Changes in state assessment data: A number of states made changes to their required tests, their proficiency standards and/or their reporting practices, which may have affected schools’ ranks this year compared with last year. These changes may have affected schools in Alabama, Hawaii, North Carolina, Texas and Utah.
New Common Core-aligned assessments were also piloted in several states in 2013-2014, and as a result, several states reported no new state assessment data. These states included Connecticut, Idaho, Kansas, Montana and South Dakota. For these states, state assessment results from the 2012-2013 school year were reused.
California also piloted Common Core-aligned assessments in 2013-2014, but still administered the California High School Exit Exam that year. Previous years’ Best High Schools rankings have relied on California’s Academic Performance Index, a composite of multiple assessments and other data points, but the index was not calculated for 2013-2014, so proficiency results from the exit exam were used instead.
The change to using the exit exam to measure overall student performance on state assessments resulted in a reduction in the number of California schools passing Step 1 of the methodology.
4. Changes in relative or absolute performance on college-level coursework: Some schools may have moved either up or down in the 2016 rankings compared with last year because of how their 12th-grade class in 2013-2014 compared with the 2012-2013 class, in terms of participation in and performance on Advanced Placement or International Baccalaureate exams.
U.S. News determines the college readiness of each school by analyzing these data for the graduating class cohort in the most recent academic year available — in this case, the 2013-2014 school year. This means we looked at whether these students took and passed any AP or IB exams during their years at the school, up to and including their senior year.
Many schools’ ranks were affected because of changes in their College Readiness Index scores.
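The cohort analysis described above can be sketched roughly as follows. The data shape and field names are hypothetical, and this does not reproduce U.S. News's actual College Readiness Index weighting; it only illustrates the idea of measuring a graduating class's AP/IB participation and pass rates.

```python
def cohort_readiness(seniors):
    """Compute AP/IB participation and pass rates for a graduating
    cohort. Each senior is a dict with 'took_exam' and 'passed_exam'
    flags covering any AP or IB exam taken during high school."""
    n = len(seniors)
    took = sum(1 for s in seniors if s["took_exam"])
    passed = sum(1 for s in seniors if s["passed_exam"])
    return took / n, passed / n

rates = cohort_readiness([
    {"took_exam": True, "passed_exam": True},
    {"took_exam": True, "passed_exam": False},
    {"took_exam": False, "passed_exam": False},
    {"took_exam": True, "passed_exam": True},
])
# participation of 3/4 and a pass rate of 2/4 for this small cohort
```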
5. New medal winners: Some schools were new to the 2016 rankings because they passed Step 1, Step 2 and Step 3 of this year’s methodology but weren’t eligible for a gold, silver or bronze medal last year based on their performance.
Other high schools became eligible to be ranked for the first time in 2016 because they are relatively new schools. They may have had their first 12th-grade class graduate in 2013-2014, or the size of their graduating class may have grown enough to be included in the rankings.
In total, 1,753, or 28 percent, of the high schools that were awarded a gold, silver or bronze medal in the 2016 rankings were not ranked in 2015. Specifically, 55 of this year’s gold medal winners, 496 of the silver medal winners and 1,202 of the bronze medal winners were not ranked in 2015.
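The counts above can be checked with simple arithmetic:

```python
# New-to-2016 medal winners, by medal, as reported above
gold_new, silver_new, bronze_new = 55, 496, 1202
total_new = gold_new + silver_new + bronze_new   # 1,753

total_ranked_2016 = 6218   # medal winners in the 2016 rankings
share_new = total_new / total_ranked_2016
# about 0.28, i.e. 28 percent of 2016 medalists were not ranked in 2015
```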
Overall, in the 2016 rankings, nearly one-third of eligible high schools earned a medal.
6. Suppression of state test results, incomplete state test data, changes in free and reduced-price lunch eligibility or lack of AP test results: Some medal-winning schools that were top performers in terms of college readiness in 2015 weren’t eligible to be ranked in 2016 because their state blocked certain portions of their math and English state test results from being released publicly.
There were also schools that weren’t ranked in 2015 that may have been eligible for medals this year, but certain portions of their state test data were suppressed or missing. Data could have been suppressed by states for various reasons, including protecting the identities of certain students.
States where data suppression appears to have had a large effect on the results include Oklahoma, Utah and West Virginia.
Some states saw large changes from 2012-2013 to 2013-2014 in the number of students eligible for free and reduced-price lunch, a factor used in Step 1 of the Best High Schools methodology. The states affected include Arizona, Delaware, Hawaii, Illinois, Ohio, Utah and West Virginia, as well as the District of Columbia. Though the increases and decreases were verified with the states, in many cases no explanation was provided.
For a detailed explanation of how all of these issues were handled in the analysis, see the Best High Schools technical appendix.
In addition, South Dakota schools weren’t eligible for gold or silver medals in 2016 because U.S. News could not use their AP data to determine their students’ level of college readiness (Step 4 of the methodology). South Dakota was the only state that did not give U.S. News permission to use its schools’ AP data. U.S. News also considers IB data in Step 4, but no South Dakota schools had IB data.
However, South Dakota schools still had the ability to earn bronze medals if they passed the first three steps of the methodology.
In total, 4,465, or 69 percent, of the high schools that were awarded a gold, silver or bronze medal in the 2015 Best High Schools rankings returned to the 2016 rankings as gold, silver or bronze medal winners. That means that 31 percent of the high schools that were ranked in 2015 were not ranked in 2016.
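These figures are consistent with the new-winner counts reported above. Note that the implied 2015 total is an inference from the stated 69 percent, not a figure U.S. News reports.

```python
winners_2016 = 6218   # medal winners in the 2016 rankings
new_in_2016 = 1753    # 2016 medalists not ranked in 2015

returning = winners_2016 - new_in_2016   # 4,465 returning medalists

# 4,465 returners were 69 percent of 2015's medal winners,
# implying roughly 6,470 schools were ranked in 2015 (an inference)
implied_2015 = returning / 0.69
```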
Of the schools that were gold medal winners in the 2015 rankings, 95 percent returned to the 2016 rankings as gold, silver or bronze medal winners. A majority of the 2015 gold medal winners — 76 percent — returned as gold in 2016.
Of the schools that were silver medal winners in the 2015 rankings, 76 percent returned to the 2016 rankings as gold, silver or bronze medal winners. More than half of the 2015 silver medal winners — 67 percent — returned as silver in 2016.
And of the schools that were bronze medal winners in the 2015 rankings, 62 percent returned to the 2016 rankings as gold, silver or bronze medal winners. More than half of the 2015 bronze medal winners — 55 percent — returned as bronze in 2016.
These results show that bronze medal schools were much less consistent in their year-to-year performance than gold medal schools, which returned at a relatively high rate.
More from U.S. News
View the 2016 Top Public High Schools
How States Compare in the 2016 Best High Schools Rankings
How U.S. News Calculated the 2016 Best High Schools Rankings
Why Schools Moved Up or Down in the Best High Schools Rankings originally appeared on usnews.com