MindEdge Online Learning

Authentic Learning and the Skills Gap


By Heather Morton
Senior Editor, MindEdge Learning

We are regularly told that American workers lack needed skills and seem relatively disengaged from their workplaces.

In a 2023 survey by Springboard for Business, 70% of business leaders reported that a skills gap was negatively affecting their organization’s business performance. Meanwhile, according to Gallup, employee engagement—which peaked at 40% in June 2020—currently hovers around 33%.

These trends are mirrored in K-12 education. In 2018, U.S. students ranked 11th out of 79 countries in science and 30th in math. Since the pandemic, chronic absenteeism—defined as missing 10% or more of the school year—has roughly doubled.

When faced with these longstanding challenges, educators turn to learning methods that put real-world problem-solving at the heart of pedagogy. The proliferation of these methods—authentic learning, performance-based learning, problem-based learning, and case-based learning (to name just a few)—attests to an irresistible impulse to base learning on something real. All these methods claim they provide needed skills and can motivate learners.

When carefully designed, these programs deliver. They motivate students and provide the skills needed to navigate messy real-world problems.

But there are some drawbacks.

First, authentic learning experiences take much more planning than traditional learning. The central task needs to be challenging but possible. It has to be “real” but narrowly defined. The students or trainees need to be provided with the resources that will allow them to make progress without being told exactly what to do. They have to be guided toward productive approaches. If working in a team, as is often the case, members need to be assigned roles. It takes significantly more time and energy to plan an authentic learning experience than a traditional classroom experience.

Second, the specificity of authentic learning experiences can be a problem. In the United States, for example, only 38% of STEM graduates go on to work in STEM occupations. A chemistry degree based primarily on solving real-world chemistry problems might therefore create fewer transferable skills than a degree taught more traditionally. Even graduates who do secure jobs in chemistry might have significant gaps in their foundational knowledge if they've spent much of their time solving narrowly defined problems.

What Skills Are Needed?

But the deeper issue, one that applies to any type of learning, is how to know or anticipate which skills will be needed. We can train workers only in skills that are already missing, or that we can reasonably guess will be missing in the future. In "How to take a pragmatic (and winning) approach to skills," Josh Bersin gives the example of pharmaceutical companies that want skills that will help them invent the next medical breakthrough.

Bersin makes two points. First, no company will find the unique skill set it needs by using current skills-tagging technology—by scraping job ads, for example. A medical breakthrough requires unique skills that aren't easily found.

Second, although he is a big proponent of skills tagging, Bersin recognizes that identifying needed skills (much less training for them) is not fundamentally a problem of acquiring and deploying data. It's really a business problem that involves anticipating the future.

The same problem—the difficulty of knowing what skills employees need—can be seen from the education side, as well. Professor Will Venters discusses an authentic learning experience he runs for students at the London School of Economics and Political Science. The week-long sprint, sponsored by Visa, defines a problem that teams of LSE students tackle for course credit.

But this authentic assessment isn't entirely authentic. There's a hugely significant real-world prize at stake: the five winning teams are invited to present to Roland Berger Strategy Consultants and Visa, and the top team is traditionally invited to present to Visa's Global Innovation Hub. According to Venters, students tend to become very anxious as they fixate on this zero-sum social prize. Venters works hard to assure them that their LSE grade, which depends on a good application of the systems development tools taught in the first six weeks of the course, is an academic prize that all the teams are likely to win.

Venters' authentic assessment has an outcome based on traditional academic deliverables—a video presentation, a storyboard, a one-page reflection on the sprint teamwork, and so on—and a more authentic outcome based on creativity and innovation. But because of the short time frame, getting to the authentic outcome may require skipping steps. As Venters puts it, "Those groups who win in the 'Dragons' Den' are often those who don't do as well with the consistency, coherence, and design of the process." In other words, the Visa and Roland Berger experts seem to want a different set of skills than do the professors at LSE.

Once again, it's not entirely clear which skills the LSE students need: a good sense of the process, or some difficult-to-define creativity. Still, in my opinion, the LSE has broadly the right approach. It is, after all, genuinely unclear what process leads to an effective and innovative idea. Until we've worked that out, a good compromise is to teach students what we think they need to know to succeed and then allow them to practice that knowledge in an engaging project that mimics the world they are training for.