Dark Design Defined

About this episode.

Have you ever considered what’s happening behind the scenes in your child’s apps, video games, websites and social media platforms? Enter “dark design” – a manipulative feature built into some digital programs that can influence children’s online choices and impact their mental health.

Listen in as we talk with experts and researchers from West Virginia University to explore dark design including what it is, why it exists, how common it is, and strategies for prevention and protection.

Guests:
Amy Gavril, MD, MSCI, Division Chief, Child Safety and Advocacy, West Virginia University
Laurel Cook, PhD, Associate Professor of Marketing, West Virginia University

TRANSCRIPT

Announcer:

Welcome to Well Beyond Medicine, the world’s top-ranked children’s health podcast produced by Nemours Children’s Health. Subscribe on any platform at nemourswellbeyond.org or find us on YouTube.

Carol Vassar, podcast host/producer:

Each week we’re joined by innovators and experts from around the world exploring anything and everything related to the 80% of child health impacts that occur outside the doctor’s office.

I’m your host, Carol Vassar, and now that you are here, let’s go.

MUSIC:

Let’s go, oh, oh, well beyond medicine.

Carol Vassar, podcast host/producer:

Have you ever wondered what happens in the recesses of the apps our children use, the video games they play, the websites they go to, and the social media platforms they post to?

As parents and grandparents, we want to protect our kids, right?

But we might not think much about algorithms or marketing or digital design but our guests today say it’s time for caregivers and policymakers to consider the design of our children’s apps, video games, websites, and social media platforms to prevent them from falling prey to dark design tactics.

Baked into the design of some modern digital programs, dark design can influence our children’s online choices and negatively affect their mental health.

To fully define and discuss dark design, its purpose, and its prevalence, I’m joined by Dr. Amy Gavril, associate professor for the West Virginia University School of Medicine and Division Chief of Child Safety and Advocacy in the WVU Department of Pediatrics.

Also from WVU is Dr. Laurel Cook, an associate professor of marketing. She researches dark design and its impact on children and adults alike from a marketing perspective.

Before beginning research on this podcast episode, I was totally unaware of dark design, so I’m not even going to try and define it. I’ll leave that to one of our experts.

Here’s Dr. Laurel Cook.

Dr. Laurel Cook, West Virginia University:

Essentially, dark design is an intentional interface for any user where it’s designed to really manipulate their behavior.

So you might use nudge techniques or classical conditioning to try to guide a user like a child to a specific action. It comes in a lot of different examples.

Broadly speaking, through all of our hard work and categorizing the wide range of examples of dark design that exist, we have 10 primary categories.

The way dark design might look to a user will include examples like sneaking or misdirection, the use of social factors like social proof, obstruction, forced action, those sorts of things.

And then altogether, including the subcategories that exist for this, are 41 different subcategories. So 41 different types of dark design that exist in the online space. That might be different websites or different apps that a user is on.

Carol Vassar, podcast host/producer:

How prevalent is this? Again, I had never heard of this until I started researching for this particular episode. How prevalent is it out there in cyberspace?

Dr. Laurel Cook, West Virginia University:

Well, when you consider that there are 41 different types, of those 41 types, about 68% are very specific to children. And so I would say it’s very prevalent.

It makes me nervous with how prevalent it is and often keeps me up at night. I love what I do as a researcher but also, my heart goes out to parents and caregivers who face the situation.

I would say that when I describe this topic of dark design as it pertains to children specifically, I jokingly use an analogy and that’s Maslow’s hierarchy of needs.

We think about our basic needs of food and shelter, and children, when they articulate their most basic need, would probably very easily and readily say, Wi-Fi. Internet access is the most basic of all needs. That helps everyone in the room understand the scale that we’re talking about here.

We’ve got about 68% of all the different types of dark design affecting children specifically. And we also can look at data now that actually describe the amount of time and hours that children of all different age ranges are spending online.

For example, children in that eight-to-12 preteen range might spend an average of four to six hours a day online, whereas teens, anyone from the age of 13 to 17, are spending nine hours a day online on average.

I think the scale of the number of examples and how much time they’re spending online really gives some weight to this particular issue and highlights how pervasive and how many repeated points of exposure a child and a teen might be shown.

We also have to acknowledge the fact that today, especially in 2024, some teens just have near constant access to the internet. They have not just one device but multiple devices. That’s a huge issue.

And then I think underlying the seriousness of this issue and why a marketer wants to get involved is because there’s a huge commercial interest here.

That’s been articulated in the form of the attention economy and knowing that brands and businesses often look at children as under-aged adults and have this cradle to grave acknowledgement, like this lifetime value that a child may offer. And so there’s a lot of economic motivation here to guide and capture a child’s interest online and keep them engaged online.

Carol Vassar, podcast host/producer:

There’s an economic interest but what about an ethical responsibility on the part of these companies? Are we losing the ethical weight here?

Dr. Laurel Cook, West Virginia University:

I think, yes, especially because we’ve seen this for years. This term of dark design is about 14 years old now. We’ve been studying it for well over a decade, and in that time, tech companies and brands have just gotten savvier at skirting broad laws, for example, designed to keep children safe online.

And so in answer to your question, unfortunately, I would say that ethical consideration has largely not been prioritized, and now it’s falling on the shoulders of parents and caregivers and policymakers to create interventions after the fact, to halt some of the harmful effects here.

What I’ve noticed in my research, I am a marketer, but again, I use my discipline in service to those I study, and I realize that what I see brands doing is referring to children and teens as digital natives. “Oh, they’re savvier than me. They know how to keep themselves safe online.”

That’s really not an accurate statement, because there are three components that a child lacks. They don’t have the agency, they don’t have the fluency, and they don’t have the citizenship. The internet wasn’t really designed with children in mind.

Those gaps there further underscore how easy it is for a brand or a company or a tech provider to skirt ethical considerations here in this topic specifically.

Carol Vassar, podcast host/producer:

Dr. Gavril, I want to turn to you and ask, are children and teens more vulnerable to these tactics, these dark design tactics? And how do their cognitive and emotional development make them more susceptible?

Dr. Amy Gavril, West Virginia University:

Well, they’re absolutely more susceptible because they’re still developing cognitively and their executive functioning is still developing. They lack skills to recognize and resist these coercive design techniques. Or sometimes, they don’t even have the knowledge that they’re being deceived because it’s all part of the game they’re playing or part of the platform that they’re on.

And as to your second part of your question, how it’s affecting them emotionally? Is that what you were asking?

Carol Vassar, podcast host/producer:

Emotionally, cognitively, etc.

Dr. Amy Gavril, West Virginia University:

Right.

We’re seeing a lot of effects from this.

I think one of the most worrying aspects of dark design is one of its main goals is to keep children online, to keep them playing, to keep them on the platform because the longer they’re there, the more likely that game or that platform is going to make money in some way. This extended usage can come across in a child in lots of different ways.

Sleep deprivation is very common because children are spending more and more time online and that time has to come from somewhere.

It’s affecting their social interactions, their social skills. Some children become so addicted to online time that compulsive usage gets in the way of their day-to-day activities, their interactions with family and friends, schoolwork, and just growing and developing like a normal child.

And then there are other aspects of dark design that are very concerning to me. For instance, sometimes these designs are nudging children to give away private information that children might not otherwise give, because children know and have been taught many times not to do that. But it’s a deceptive practice that can almost trick children into doing that.

And that opens them up to loss of privacy, to cyberbullying. It can open them up to potential predators, child predators, online. There are lots of different areas where it can affect a child.

Carol Vassar, podcast host/producer:

What are some examples of dark design targeting minors, Dr. Cook?

Dr. Laurel Cook, West Virginia University:

Well, some of the examples that I want to share with you underscore what Dr. Gavril just said, because I think we recognize from the medical literature how, for example, a teen’s brain forms and develops, and the dopamine, for example, that comes from a reward.

You see those social media notifications and boy, does that feel good to a youngster. That reward center of their brain develops a lot sooner than the self-regulation part does.

That further just illustrates why these tactics can be so particularly and disproportionately effective with children and teens.

And examples that are specific to this group might include things like friend spam. Something as seemingly innocuous as a little pop-up that might appear in a child’s game, where it will ask the user, the child, to share their contact list. And other things like social proof, where you see influencers whose target audience might include children and teens using engagement practices like “share to unlock.”

And so again, that’s an engagement tactic designed to promote more users and staying online longer for example.

We also see kidified content. This might be very sexualized content, for example, that has been transformed into a cartoon.

Again, this is content that might be a surprise to a parent who is looking over their child’s shoulder and sees that they’re watching a Peppa Pig cartoon, for example, but it’s highly suggestive. That is a very egregious example of dark design, but it no less warrants attention and oversight here.

Carol Vassar, podcast host/producer:

It sounds like they’re taking a page from the tobacco companies in the mid-seventies when they cartoonized their characters and really targeted kids, which got them in a lot of trouble.

Let’s talk about who is responsible for making sure that kids do not fall for dark design. Are parents aware that this is out there? And do parents know what to do with regard to dark design?

Dr. Laurel Cook, West Virginia University:

Well, it’s very interesting you asked this question because I’ve had a chance to research and speak with, anecdotally, parents and caregivers in a number of different English-speaking countries. Here in the U.S., of course, and then counterparts in other parts of the world, including Australia and the U.K., and the answer by parents and caregivers differs greatly.

For example, in the U.S., you see a lot of this reflected in our own public policy. Parents want that control and so they would say, yeah, it’s on our shoulders to really regulate and control what a child is exposed to, whereas their counterparts elsewhere in the world, and especially in the E.U., parents and caregivers would say, “No, no, it’s the public policy makers or the tech providers or the brands themselves who are responsible.”

But my answer would be it’s a combination.

I think it’s very easy for your listeners, and everyone who has a vested interest in keeping children and teens safe online, to acknowledge that the deck is really stacked against parents and caregivers. Even parents with the best of intentions will soon find themselves with information overload, or just recognizing the sheer amount of effort it takes.

In the state of West Virginia, we have a very high number of grandfamilies. These are grandparents who are raising our children. You might imagine how much more effort is required by this particularly vulnerable group of caregivers, who are vastly outnumbered when it comes to tech savviness.

And so I would say the deck is stacked against parents and caregivers and because of that, now intervention at a macro level is warranted. So that requires policy makers to get involved.

I think the policies right now in the U.S. are not as heavy hitting as they are in other parts of the world. So I would like to see stricter oversight over these tech companies and brands.

But I also think that it’s possible for tech companies and brands to self-regulate. As a marketer, I’ve seen advertisers use self-regulation within their industries to try to…

Cigarette advertising is a great analogy, where efforts were made to have a level of standards, ethical standards, that are expected of ethical advertisers and brands, for example.

I think on one hand you have the macro level, the brands themselves, the tech companies who design social media algorithms, and then you have parents and caregivers and together, I think that represents a group who’s responsible for making the internet not only safe for kids, but a place for them to actually enjoy their time and thrive and succeed.

Carol Vassar, podcast host/producer:

Dr. Gavril, I’m curious, are there any real life examples or maybe case studies of minors who have been negatively affected by these dark design tactics that we’ve been talking about?

Dr. Amy Gavril, West Virginia University:

Well, I can talk of a couple cases I’ve run into.

I’ve run into families who come to me and inform me that their child’s just charged hundreds of dollars on a credit card that’s been linked to some sort of online account to build up their avatar in their favorite game.

When you get to know these kids, these are not kids that would steal money out of their parents’ pocketbooks or wallets. But because of the deceptiveness and the nudges to buy, “You need to buy this, that, and the other thing to make your avatar cool and to win the next game,” they do things they wouldn’t otherwise do.

A mother brought her teenage son in to see me very distraught because her son had gone down at the beginning of the weekend to the basement to play video games online with his friends. They have a very hectic house, lots of kids.

The next time she sees him is at the end of the weekend, Sunday evening, before he gets ready for bed to go to school the next day. And she realizes, when she goes down to the basement, that he spent the entire weekend down there.

She found bottles of soda that were actually filled with urine so that he didn’t have to stop playing. And she was so irate. She disciplined him by saying, “No more video games. You can’t play.” And because this child clearly had an addiction at this point in time, he actually physically assaulted her. She said, “This was not my child. My child’s not angry. He is not violent.”

These are situations where these children, because of that compulsive usage, constantly playing, or because a compulsion to win is being sold to them, lack the skills to recognize and resist that they’re being deceived.

These are some of the outcomes.

Carol Vassar, podcast host/producer:

I’m curious what you recommended to that mom with regard to her son, his addiction, his behavior moving forward. Were they able to back out and get him help?

Dr. Amy Gavril, West Virginia University:

Well, yes. We tried to address it in the same way we would address other addictions. Therapy, time, and redirection.

He was a very bright young man and I think after he did that to his mom, he understood the seriousness of it and it scared him. So that was a positive that he was willing to change.

Carol Vassar, podcast host/producer:

Dr. Cook, any firsthand examples you might have through your research and study and work?

Dr. Laurel Cook, West Virginia University:

Well, I’ve had a number of examples, and a common thread among them is some sort of outcome, almost always negative or unfavorable, related to mental health.

A number of them involve teens especially, who might suffer negative effects on their mental health and their well-being, because at the end of the day, humans are social creatures and we’re heavily influenced by others.

And I think especially because the surgeon general here in the U.S. mentioned in his opinion piece over the summer that social media’s effect, especially on teens’ mental health, is such a huge issue that maybe we should even consider having warning labels for social media.

I’m actually testing that now, and there is a lot of truth in what he suggests.

But anyway, it just shows that we’re so social and we are so persuaded by the opinions and thoughts of other people.

More recently, there are examples of a negative outcome such as sextortion. This involves content provided by children themselves. The content they’re providing online is leading to this phenomenon that’s putting them in real, severe danger.

Now we’re having a new line of lawyers come up who are having to defend children in these cases. This just shows you how dark and twisted this broad topic of dark design really can be in the lives of children and teens.

Carol Vassar, podcast host/producer:

And when we talk about sextortion, we’re talking about somebody maybe popping up saying, “Send me an inappropriate picture of you,” and the child sends it and then the person uses it against them in some way, either online or in real life?

Dr. Laurel Cook, West Virginia University:

That’s right.

Often the child is interacting with someone they think is a peer and in most cases, the person they’re interacting with is not a peer but has used technology, in most cases, social media specifically to appear like a fellow student at their school because they have this information readily available about these vulnerable kids.

They know what school they attend, they know what grade they’re in, they know their gender, they know a lot of personal information about children and can use very savvy technology tricks.

For example, on a platform like Snapchat. Children and teens love this particular platform because the information that’s shared is ephemeral, so it disappears quickly and a teen, for example, can say, “Prove you’re real, show a picture, send a snap that you’re real.” And it’s very easy for a 50-year-old guy to use savvy tech tricks to appear like a peer and so that’s even more convincing to a child and a relationship is formed.

And then before you know it, that child is extorted after they’ve been tricked into sharing CSAM, child sexual abuse material, with another user online.

Carol Vassar, podcast host/producer:

Real quickly, I’m wondering, AI is the flavor of the year, the flavor of the decade, and other emerging technologies are around. Do these technologies amplify dark patterns, or do they help mitigate them, especially with regard to minors? What are your thoughts on that, either of you?

Dr. Laurel Cook, West Virginia University:

Well, I am a self-proclaimed tech enthusiast. I love all technology. I see a lot of bright sides to technology.

I don’t think technology can be described as bad or good. It depends on how it’s used, to be fair. And I’ve seen it used to harm others, to make dark design and deceptive patterns possible.

But one thing I would like to suggest or offer: one thing that AI in particular is really good at is looking for patterns. And in the context of dark design, maybe a little safety pal or buddy, some sort of UI interface that’s powered by AI, could be put on a child’s device to help them recognize or spot cases where manipulation or deceptive patterns might be used.

Carol Vassar, podcast host/producer:

Dr. Gavril, I’m curious, do you see a future where dark design is significantly reduced or eliminated? And what do you think it’s going to take for that to happen?

Dr. Amy Gavril, West Virginia University:

I think it’s going to take an increase in knowledge among parents, and also professionals, on how to spot it and what to do about it.

I think it will also require government intervention to assist in controlling these situations and protecting children.

As a pediatrician, gone are the days when we can just talk to families about screen time and feel like we’ve addressed the issue of media affecting children. I think we need to become more involved in talking to families, in more specifics, about how to protect children while online, and that would include talking about things like dark design.

Carol Vassar, podcast host/producer:

Dr. Cook, same question. What’s the future? Are we going to be able to tackle this and really make some inroads in eliminating dark design and what will it take?

Dr. Laurel Cook, West Virginia University:

Yes. I think we’re already starting to see here in the U.S. heavy hitters like the FTC and other policymakers really bring this issue to the forefront and that’s been coupled with the Surgeon General, for example, the greatest medical authority in the land also lending his weight of authority to this topic.

Take laws like the Kids Online Safety Act. It hasn’t been fully passed into law just yet, but it’s close, and I think it’s getting a lot of bipartisan support.

It’s not a political issue now, it’s more of one of, “Hey, let’s wake up and help make this a requirement by brands and tech providers.”

There’s also a part of the discussion about brands recognizing that maybe they shouldn’t sponsor family bloggers, because family bloggers, for example, often require their children to be part of their content. And so now that’s also part of the discussion at the company and brand level.

I also think it’s going to take the acknowledgement of this mentor versus manager spectrum here. I’ve seen a lot of parents who decide, “All right, well, I’ll just take a full manager perspective and oversee everything. I’m going to provide them with a special phone that gives me full oversight.”

Whereas on the other end of the spectrum, they can take more of a mentorship role, where they help educate their children and learn about deceptive social media and other dark design tactics together. I think that’s really useful.

But the age of willful ignorance by parents has to be over. I think it’s important for parents and caregivers who are listening to realize and acknowledge that the internet isn’t an entirely safe space for children and teens, and we can’t be willfully ignorant and hopeful that children will figure it out on their own. We have to be part of this village that keeps them safe together.

Carol Vassar, podcast host/producer:

Dr. Laurel Cook is an associate professor of marketing at West Virginia University. She was joined in conversation about dark design by Dr. Amy Gavril, associate professor for the West Virginia University School of Medicine and Division Chief of Child Safety and Advocacy for the WVU Department of Pediatrics.

MUSIC:

Well Beyond Medicine.

Carol Vassar, podcast host/producer:

Thanks to Dr. Cook and Dr. Gavril for their insights on this little-talked-about issue affecting our children and their health. And thanks to you for listening.

Oh, the places we’ll go to talk with the experts about the issues outside the doctor’s office that affect our children’s health.

Next week, we begin a series of YouTube podcasts from our recent trip to the HLTH conference in Las Vegas. The audio-only version will still be available on your favorite podcast app, with the associated video version available on the Nemours YouTube channel.

Our guest will be Dr. Bayo Curry-Winchell, founder of Beyond Clinic Walls, which is dedicated to sharing accurate and credible short-form health content in the social media and podcast spaces. No dark design.

Missed an episode? No worries. You can catch up by visiting nemourswellbeyond.org. There, you can subscribe to the podcast and leave a review. That’s nemourswellbeyond.org.

Our production team for this episode includes Lauren Teta, Cheryl Munn, and Susan Masucci.

I’m Carol Vassar. Until next time, remember, we can change children’s health for good well beyond medicine.

MUSIC:

Let’s go, oh, oh, well beyond medicine.

Meet Today's Guests

Carol Vassar

Host
Carol Vassar is the award-winning host and producer of the Well Beyond Medicine podcast for Nemours Children’s Health. She is a communications and media professional with over three decades of experience in radio/audio production, public relations, communications, social media, and digital marketing. Audio production, writing, and singing are her passions, and podcasting is a natural extension of her experience and enthusiasm for storytelling.

Amy Gavril, MD, MSCI, Division Chief, Child Safety and Advocacy, West Virginia University

A pediatric child abuse and neglect subspecialist, Dr. Gavril excels in clinical investigation, research, grant writing and program development. She is passionate about education, leadership and advancing child welfare through medical research.

Laurel Cook, PhD, Associate Professor of Marketing, West Virginia University

A dedicated educator and researcher, Dr. Cook specializes in consumer behavior and social marketing. She supports student-led initiatives fostering community engagement and inspiration.
