Barely in existence six years ago, technology schools — termed ‘bootcamps’ for their rigorous, fast-track, intensive learning model — equip students with key tech skills, preparing them to thrive as software engineers, data scientists, and cybersecurity professionals.
In a mostly unaccredited industry, leading bootcamps have worked to self-regulate, aiming to establish credibility without straining credulity. Beyond partnerships with innovative quality assurance and student loan platforms such as Skills Fund, market-leading bootcamps have for years embraced the value of publishing student outcomes data.
However, singular reporting attempts by individual schools provide limited value to prospective students. Because each school uses a different standard, a different means of calculating outcomes, and a different publishing format, a student cannot make an apples-to-apples comparison across peer programs.
Under the leadership of Skills Fund, which underwrites a student’s return on investment as part of its comprehensive vetting of partner schools, the Council on Integrity in Results Reporting (CIRR) was born to change that. With CIRR, bootcamps have created an industry-wide standard for all schools to report the most meaningful student results. (Full disclosure — I work for Skills Fund, which is an active stakeholder member of CIRR.)
CIRR is the first measurement and reporting standard to account for on-time graduation, in-field job placement, and median starting salaries of 100% of enrolled students. Member schools’ outcomes reports are independently reviewed each year by third-party auditors, according to the non-profit’s standards.
CIRR isn’t the first attempt at transparent outcomes reporting in higher education. In framing student expectations of career outcomes, leading schools such as Galvanize, Flatiron School, and General Assembly have individually reported their student outcomes for years.
The University of Texas, in partnership with the U.S. Census Bureau, recently unveiled SeekUT, a database that discloses graduates’ earnings by degree program using census data. The “powerful but imperfect” database is a workaround of the 2008 Higher Education Act’s ban on a federal database linking student-level educational data to national employment data.
And under the Obama Administration, the Department of Education released the College Scorecard, which provides consumer insights into outcomes of students who received federal student aid.
But unlike those singular efforts, the strength of CIRR’s reporting lies in its simplicity and clarity across all students and all primary outcomes, directly creating institutional accountability.
To place this into context: when did your college or university last release audited data on what happened to every single enrolled student in your degree program? Specifically, how many of your peers graduated on time and secured full-time employment, and what did their first jobs pay?
Schools reporting to the CIRR standard do just that.
By using an identical one-page reporting template that enables comparison across peer programs, federal policymakers, state workforce and economic development authorities, state attorneys general, educational investors, the media and, most importantly, prospective students can all access critical student outcomes data to inform enrollment, funding, and other important decisions based upon program quality.
Accordingly, the CIRR model can serve a greater breadth of schools and be used as:
A Job & Skills Training Program Evaluator for Federal Funding:
As Congress explores offering federal funding to short-form skills training programs, policymakers can use CIRR members’ data as a benchmark to identify quality programs and narrow the field of schools that could be eligible to participate in the federal Pell Grant program. A number of CIRR member schools are already eligible to receive federal G.I. Bill funds.
An Accountability Mechanism for Workforce Development Programs & Student Outcomes:
As some question whether workforce training programs actually work, CIRR provides a simple, transparent means to uncover the answer.
State and federal governments could require workforce training programs that receive taxpayer funding to report outcomes according to CIRR standards. Regardless of the quality metrics placed upon these programs, transparency in completion and employment data creates an ongoing accountability instrument for oversight bodies.
A Resource for Investors Looking to Fund Innovative Schools:
With the number of last-mile workforce training programs, bootcamps, and online program management companies (OPMs) steadily rising, good schools will require go-to-market and expansion funding. OPMs, companies that operate online programs under a university’s brand, form a $1.5 billion industry expected to grow at an estimated annual rate of 35% over the coming two years. Investors can use CIRR data to analyze industry demand, volume per cohort, and student completion rates — and to ensure prospective schools aren’t hiding poor outcomes.
Ultimately, higher ed as a whole — students, taxpayers, policymakers, regulators, and investors — needs to know which schools are worthy of investment and effectively equip students to enter their field of choice, be it through a traditional college or a workforce development training program. CIRR offers a tremendous leap forward toward accomplishing this goal.