This fall, as prospective students embark on the college admissions process, they, like millions before them, will turn to information sourced from one of the most reliable higher education data sets: the Integrated Postsecondary Education Data System (IPEDS). Most won’t realize it, but the facts and figures guiding their decisions largely come from IPEDS.
What is this college’s acceptance rate? Graduation rate? Is it affordable? How much financial aid do students receive? Each year, applicants and their families ask these questions, often relying on resources such as the College Scorecard and other tools built directly on IPEDS data. Because every federally funded college and university is required to report detailed enrollment, admissions and financial data annually, IPEDS provides one of the most comprehensive and trustworthy pictures of the U.S. higher education landscape.
Yet this year, the U.S. Department of Education has put forth a proposal that could undermine the quality and reliability of new admissions data. By Dec. 5 of this year, the department plans to require colleges to submit a supplemental data set called the Admissions and Consumer Transparency Supplement (ACTS). The proposal requires four-year institutions with selective admissions to report highly detailed undergraduate and graduate student data, broken out by multiple student characteristics, along with five years of historical records.
While promoting greater transparency is important, the proposed changes present significant implementation challenges that could affect the accuracy and comparability of the data. Without adjustments, the Education Department risks compiling a data set of uncertain quality that will be difficult to interpret or compare across institutions. To ensure this vital information remains accurate, reliable and useful for students, their families and policymakers, the department should delay, revise or scale back aspects of this proposal.
Our perspective is informed by more than 40 years of combined experience with IPEDS data collection, including participation in more than 20 technical review panels weighing the burden of data collection on institutions against the value of the data produced. Additionally, our organizations are part of a national project that has been working with colleges and universities to collect similar admissions data. These experiences give us a clear view of the options the Education Department now faces, and of better alternatives.
Where the ACTS Proposal Falls Short
As written, the ACTS proposal presents serious technical and methodological challenges that are further complicated by an ambitious timeline. From experience, we know that the following realities will make it extremely difficult for colleges to meet the proposed requirements and may jeopardize the new data collection effort.
- Most “selective” institutions admit most students who apply. The ACTS proposal requires all four-year institutions with “selective admissions” to report new data. However, IPEDS only flags open-admission institutions, meaning any college not flagged as open admission, in theory, is “selective.” In practice, however, most of these institutions are not highly selective. Based on our analysis of IPEDS data, of the roughly 1,700 colleges and universities that have “selective” admissions, more than 80 percent admit at least half of all applicants.
- Not all institutions maintain five years of historical data. Through our own efforts to collect admissions data, we know that not all institutions keep applicant data for five years, as the ACTS proposal requires. Even if institutions did have historical data, these years are not comparable. During the COVID-19 pandemic (2020–21 and 2021–22), many institutions adopted test-optional or modified admissions practices. In 2023–24, the Supreme Court’s decision on race-conscious admissions took effect, and in 2024–25, a delayed Free Application for Federal Student Aid (FAFSA) rollout complicated admissions data.
Each year in this five-year period is unique, making trend analysis unreliable. In addition, first-time data collections almost always surface definitional issues and inconsistencies that need refinement before backfilling multiple years. A more measured approach would start with a single year of data, stabilize definitions and then consider retroactive data collection where possible.
- Graduate admissions data are collected by program. Unlike undergraduate admissions, graduate admissions decisions are typically decentralized—made at the program or department level using criteria specific to each discipline. The admissions process for a master’s in business is very different from that of a Ph.D. in biology. This variation means that combining graduate admissions data into broad categories at the institutional level will produce misleading comparisons, a fact already supported by the Education Department’s own research. Because undergraduates represent 83 percent of all U.S. postsecondary students, the department should prioritize refining undergraduate admissions data before venturing into graduate-level reporting.
- Not all data are consistently available or comparable across the admissions process. Several of the proposed data points in the ACTS proposal either lack consistent definitions or are not routinely collected. And many institutions don’t collect all the data requested across applicants, admits and enrollees. In our experience, this issue has posed long-standing problems for admissions data collection and has forced us to collect fewer variables. Parental education, for example, is rarely collected, and GPA calculations are not consistent across high schools.
Without clear definitions and consistent data availability, institutions will inevitably report metrics differently, undermining comparability and accuracy. The simplest solution is to align ACTS elements with existing IPEDS definitions or to delay adding new elements until consistent standards are developed.
- Breaking data down by multiple characteristics multiplies complexity. The proposal requires slicing the data by race, sex, income, GPA, admission type and more. While this request might seem straightforward, it raises major concerns about student privacy, feasibility and data quality. At some colleges, breaking out data by different characteristics may risk identifying specific students because only a few would meet those criteria.
Each additional data breakdown multiplies the reporting complexity, slows down design and testing, and risks generating unusable data, as the brief sketch below illustrates. A potentially more effective approach would focus on one or two disaggregations tied to clear policy priorities, such as breaking out admissions by race and sex, then building from there.
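To make the combinatorics concrete, the following is a minimal Python sketch, not an analysis of the actual proposal: the category counts (nine race/ethnicity groups, two sexes, five income bands, four GPA bands, three admission types), the uniform spread of students across cells and the fewer-than-10 suppression threshold are all illustrative assumptions rather than figures drawn from ACTS or IPEDS.

```python
import random

# Assumed category counts for a few of the proposed breakdowns;
# these numbers are illustrative, not taken from the ACTS proposal.
breakdowns = {
    "race_ethnicity": 9,
    "sex": 2,
    "income_band": 5,
    "gpa_band": 4,
    "admission_type": 3,
}

# Crossing the breakdowns multiplies the number of reporting cells.
cells = 1
for levels in breakdowns.values():
    cells *= levels
print(f"Reporting cells per institution per year: {cells}")  # 9*2*5*4*3 = 1,080

# Spread a mid-sized entering class uniformly across those cells (purely
# illustrative) to see how many cells hold only a handful of students.
random.seed(0)
class_size = 2000
counts = [0] * cells
for _ in range(class_size):
    counts[random.randrange(cells)] += 1

small = sum(1 for c in counts if 0 < c < 10)
print(f"Non-empty cells with fewer than 10 students: {small} of {cells}")
```

Even under these generous assumptions, most populated cells fall below a common small-cell suppression threshold, which is precisely the privacy and data-quality concern described above.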
A Better Path
If the Department of Education seeks to promote more transparent admissions data, a phased approach may be more effective than attempting to implement everything at once. Institutions are already slated to begin reporting on applicants, admits and newly enrolled students by race and ethnicity for the first time this fall, and these data could be leveraged to compare test scores or analyze outcomes by race and sex. Beginning with a smaller scope—supported by clear, standardized definitions—would allow reporting systems to stabilize, surface challenges and build confidence in the quality of the data.
By prioritizing a thoughtful design over speed and volume, the Education Department can ensure it does right by the millions of prospective college students who rely on IPEDS to guide their next steps. Without reliable, comparable data, the college admissions process would become even more daunting than it already is.