For decades, employers have relied on educational credentials—like the high school diploma and bachelor’s degree—to sort and filter job applicants. Degrees from highly selective colleges have a powerful brand attached to them that signals valuable skills and traits. But these credentials have always been a blunt instrument—because they reflect the assessment (and conception) of skills from the perspective of educators, not employers.

All that is starting to change, however. Today, big data analytics is enabling employers to look beyond credentials toward the underlying skills and competencies that credentials were designed to signal. They are turning to increasingly sophisticated pre-hire assessments to identify and quantify skills, rather than relying solely on educational institutions or experiences. Technology's transformation of the hiring process holds potential to democratize learning, unleash innovations in the way we learn and work, and advance more equitable opportunities for all Americans. It is fueling the rise of skills as the new currency in an increasingly dynamic knowledge economy. And it may soon render degrees obsolete, at least in terms of hiring.

In a recent Strada-Gallup Employer Survey, we asked hiring managers which factors are most important in making hiring decisions. They told us that it's skills and experiences they value most, far more than credentials. It's a sentiment echoed in pop culture, where Oprah, Steve Martin, and self-improvement guru Cal Newport have argued that when it comes to professional success, it's not the credentials that matter, but the skills. The secret to any kind of professional success, they say, is this: be so good they can't ignore you. Ryan Craig, the co-founder and managing director of University Ventures, has likewise made the case that the best credential isn't a credential at all: it's a job.

. . .

But downplaying or eliminating the role of credentials is easier said than done. The challenge stems from the fact that while cultivating what economists call "human capital"—knowledge, skills, and abilities—has long been one of the chief aims of education, the perceived value of education is wrapped up in credentials. People mistakenly believe that those who attended college but didn't earn a credential either received no benefit or were made worse off.

In fact, the well-demonstrated economic returns of college hold true, to some extent, even for those who don't complete a credential. There is likewise massive variability in the outcomes associated with any given credential. For example, some high school graduates are doing college-level math, while others are only doing math at the middle-school level. Similarly, the majority of four-year college graduates enjoy labor market success, but four out of 10 start their careers underemployed.

The shortcomings of credentials are compounded by a dearth of consumer-friendly data to help learners and employers alike make sense of the hundreds of thousands of unique credentials that have now flooded the education and labor markets. What's more, the economic value of a credential is affected by a person's unique collection of knowledge, skills, personality traits, and relationships.

The credential-based approach to hiring and learning has also led to an array of unintended consequences. Working learners frequently have to spend time relearning skills they already have. Prior learning assessments, the principal tool colleges use to evaluate whether to award students credit for skills they've already mastered, have gained limited traction in higher education, in spite of their demonstrated benefits. Virtually every product and service we buy today is getting faster, cheaper, and more personalized—except for education.

Worst of all, the credential-based approach to hiring screens out otherwise competent job candidates. The upshot is that credentials can actually create barriers to entry that keep people with fewer economic and social advantages from moving up.

The good news is that the fading of the degree's signal may be accelerating in an era where the shelf life of skills is shrinking and the jobs of the future will require learners to reskill and upskill throughout their lives to remain economically relevant—and employed. In the face of new technologies and modes of working, a terminal credential earned at the front end of a lifetime of work no longer carries the weight it once did. The new imperative is less about earning credentials than about fusing work and learning.

But the growing interest in hiring based on skills, rather than using degrees as proxies, also raises new and significant questions for employers. Without credentials, how will employers differentiate among prospective job candidates, and how will knowledge be validated? Alternatives are emerging that might compete with degrees. Firms like Inspiring Minds and ACT are developing pre-hire assessments that measure workers' skills directly: an Ithaka S+R study, for example, found that 63% of HR professionals use pre-hire assessments, and the technology is undergoing a rapid wave of innovation.

As we build the learning ecosystem of the future, rethinking the ways in which we enable individuals not only to develop skills but also to signal them is a central concern. It is a shift that will require new and better forms of data to help individuals identify educational experiences, skills, and credentials more tightly coupled to their specific career and life circumstances. And it holds potential to decouple access to employment (and economic opportunity) from historic and structural barriers to access and equity in both education and hiring.