Is your online learning hurting you? Why it’s beyond time to unfollow the pack and embrace ethical digital design

Imagine it’s your job to find or create an online tool that will help deliver coaching experiences to clients — coaching that is meant to enhance wellbeing, inclusion, and work/life satisfaction.  Simple, right?

If you’re not paying attention, maybe.

Now imagine you’ve been dialled into the headlines, research findings, and social media controversies swirling around ‘Big Tech’, and your journey will hit at least two gaping potholes:

POTHOLE 1: Harm vs help

Almost all current digital tools rely on foundational design elements and features (such as like buttons, notifications, and endless scrolling content) that

  1. Are so common we no longer even ‘see’ them; and
  2. Are patently harmful.

The ‘standard’ digital features we know so well were developed over time and with heaps of venture capital by the likes of Google, Facebook, and Twitter with the aim of inciting addictive and compulsive behaviours, because the ‘Big Tech’ business model requires companies to make money off user engagement, not user wellbeing.

The resultant — and well-documented — harms to cognition, attention, and wellbeing extend to consequences as serious as sleep deprivation, depression, and worse. We all experience these effects so frequently that it’s hard to trace our tiredness, overwhelm, or general sense of never accomplishing what we want back to their cause. But try a digital detox, and you just might.

POTHOLE 2: Bias vs inclusion

A little more addictive Googling and you hit another snag.

On a regular basis, consumers and researchers alike are discovering the dangers of seemingly inconsequential algorithms that are now baked into everything from booking a rideshare to buying groceries or applying for a job.

Unintended racism is a nearly ubiquitous feature of AI systems according to Professor Parham Aarabi, director of the University of Toronto’s Applied AI Group. “Programs that learn from users’ behaviour almost invariably introduce some kind of unintended bias,” he said*.

And most of the larger, venture-funded digital coaching products rely on AI for key functions, such as deciding who should receive coaching and what content a user should or shouldn’t see, even though any assurance that their algorithms are unbiased is likely premature.

Steering clear

So, let’s return to the original task: how do you buy or build an online tool that improves wellbeing and inclusion when the foundations of digital product design are based on features that have an opposite effect?

“Very carefully,” as they say. It’s easy to overlook these issues, to figure that one more app or tool won’t make things much worse, or to rationalise that good intentions will change the outcomes. But doing so adds to an already serious problem, and you could end up exacerbating the very harms you’re trying to solve.

As providers and/or creators of digital technology, we have the power to make a difference in this arena, and to look for better alternatives that respect our audience’s time, autonomy, and attention.

It’s time to #unfollow the pack, and we can start simply by raising our awareness of digital design impacts and re-envisioning an online life built around our needs. It might be an uphill climb, as was the process of designing and building Talking Talent Online, but many hands make light work—so the more we can all embrace the idea of ethical digital design, the easier it will be to move towards it, together.

*Source: Collier, Kevin. NBC News, “Twitter’s racist algorithm is also ageist, ableist, and Islamophobic, researchers find.” 9 Aug 2021.