As more and more accounts of abuse and fraud among institutions of higher education come to light, it has become increasingly apparent that current federal regulation is vastly inadequate to protect postsecondary students and taxpayers. In a new report published by the Center for American Progress, TCF senior fellow Robert Shireman, along with the Center’s Elizabeth Baylor and Ben Miller, provides evidence that the failure of regulators to catch bad actors in this sphere is not coincidental, but rather is built into the processes by which the U.S. Department of Education (DOE) oversees these schools.

Offering the first behind-the-scenes view of the audits used to monitor colleges receiving taxpayer funds, the authors find that the thousands of audits and inquiries conducted each year almost completely sidestep issues of major importance and instead focus mostly on minor bookkeeping matters. While identifying minute errors schools make in calculating and distributing federal student aid, both independent auditors and federal investigators systematically fail to consider broader questions about the honesty and principles with which these schools operate, despite the $100 billion in federal grants and loans they take in annually.

The report finds that while annual DOE audits and program reviews should be among the strongest and most flexible tools the federal government has to identify and rein in aggressive recruiting, misleading advertising, inadequate advising, and other abusive practices, little use is made of these tools. The audits are merely annual reviews of financial statements and of compliance with the federal aid programs, conducted by firms hired by the colleges. The program reviews are inquiries conducted by DOE staff to examine whether institutions are behaving in accordance with federal policy. These tools, which are supposed to ensure that federal funds are not squandered, have lost their value over time.

Rather than looking for signs of trouble, such as rapid increases in enrollment, high dropout rates, or high faculty turnover, the audits and program reviews have focused on checking numbers in databases—verifying, for example, the dates on which students receive aid in relation to when they submit the necessary paperwork—to ensure schools are meeting precise federal procedural requirements.

To make matters worse, auditors are working with outdated guidelines, creating significant oversight vulnerabilities in light of the recent expansion of online education and publicly traded schools. Even after an audit is submitted, the DOE does not require auditors to provide documentation supporting their findings, nor does it maintain the resources to consistently fact-check the work of the auditors, who are hired by the schools themselves. These factors have created a perfect storm, allowing independent audits to devolve into little more than box-checking exercises that institutions can easily game to avoid additional questions from regulators.

While program reviews have the benefit of being conducted by unbiased government regulators who are not beholden to the schools for their paychecks, the authors make clear that, even in this process, the DOE has been derelict in its duty to prevent abuse by colleges. Like the auditors, the bureaucracy has adopted a complacent approach focused on technical compliance. The authors’ examination of published program reviews reveals that today’s regulators, forgetting their primary purpose as protectors of consumers and taxpayers, treat schools as their primary clients.

The report suggests that the problems plaguing program reviews are similar to those that prevent school audits from providing a real understanding of institutions and the ways they may be taking advantage of students, families, and taxpayers. A narrow and incomplete scope of examination, the ability of schools to censor government findings prior to publication, and too few resources to examine an adequate number of programs each year all undermine the effectiveness of these federal watchdogs.

With accreditors and state agencies already coming under fire for providing deficient oversight of higher education, Shireman and his coauthors call for an immediate shift in both the approach and the tactics used to conduct audits and program reviews at the federal level. Resolving the problems identified in the report requires a culture shift concerning the purpose of these tools, but there are specific guidelines that can facilitate this transformation, increasing the likelihood that audits and program reviews will deter shady practices that do not serve students well.

Among the specific recommendations outlined are steps to prevent colleges from knowing ahead of time that entire categories of activities will not be subject to review, and a prohibition on schools censoring investigators’ reports. As the authors write: “To keep colleges on their toes, those institutions must know that anything can be examined; that every recruiting call, any advertisement, all enrollment, attendance, and course records, and every employee training session could be reviewed; and that any findings of note—whether it has clearly stepped over a line or the practice simply reflects poorly on the institution—will be reported.”

The DOE spends a tremendous (though insufficient) amount of time and money overseeing the thousands of colleges that receive federal financial aid. Yet, as tens of thousands of defrauded students know all too well, the regulatory system is not currently protecting students and taxpayers as it should. This report is the first step toward identifying ways in which the processes that regulators already conduct can be infused with additional meaning and energy, providing the safeguards that the higher education marketplace, and specifically the for-profit sector, so desperately requires. The cost of avoiding the issue—in terms of wasted federal financial aid dollars, time spent acquiring useless degrees, and disenchantment with postsecondary education—is simply too high.