My parents are fond of retelling the story of a couple they knew who swore up and down their kids would only watch PBS. This was the 1970s — the era before cell phones, before video game systems, before your TV choices consisted of anything more than the three major networks, PBS, and local independent stations that ran reruns of “Our Miss Brooks” and B-movies with titles like “Disembodied Venusian Alien Hands Attack.” If a PBS-only rule could ever have been pulled off, it should have been a cinch back then.
So, imagine my parents’ lack of surprise when they arrived one Saturday afternoon to find their friends’ 6-year-old twins watching “Starsky & Hutch,” with the defeated parents looking on.
“I didn’t know ‘Starsky & Hutch’ was on PBS now,” my dad said. The girls were confused, the parents were slightly roiled and an anecdote was born.
There’s a wider point here, which is that it’s difficult for parents, even with the best intentions, to police their child’s media usage. This is especially true in 2021, when the stakes are somewhat higher than your kids watching a drug dealer kill one of Starsky’s informants or one of them asking what, exactly, Huggy Bear does for a living.
With social media and the internet, your child can be exposed to material and individuals that aren’t just obscene and rebarbative but dangerous, even deadly; parents in 2021 need to ensure their children aren’t able to venture into the seamier corners of the internet, particularly where predators might lie in wait.
Given all that, it’s a wonder that Mark Zuckerberg’s Facebook is proposing a version of its Instagram platform for minors under the age of 13 — a system the Menlo Park, California, social media giant somehow believes is in no way a serious target for sexual predators — as a tool parents can use to fight these dangers. According to the company, the platform will give “parents visibility and control over what their kids are doing.”
Forty-four attorneys general are just as skeptical as you likely are.
In a letter sent to Facebook on Monday, the 44 top state law officials cited a “checkered record” on Facebook’s part when it came to protecting minors from dangers like cyberbullying and pedophiles, according to The Associated Press.
“It appears that Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account,” said the letter, which was spearheaded by Massachusetts Attorney General Maura Healey.
“The attorneys general urge Facebook to abandon its plans to launch this new platform.”
Children under the age of 13 wouldn’t ordinarily be able to join Instagram, and there doesn’t seem to be any groundswell of support for such a product, either — at least among parents, who would presumably be controlling the information diet of a child under 13 with greater strictness than they would for older teenagers.
While the letter was spearheaded by Healey, keep in mind it drew a bipartisan raft of signatories, from New York Attorney General Letitia James, who is currently trying to sue the National Rifle Association out of existence, to Texas Attorney General Ken Paxton, one of the most vociferous opponents of the Democrats’ current push to enact stringent gun control legislation. If you can bring those two together on something, that’s a sign this is an outstandingly poor idea.
The letter detailed Facebook’s poor record of keeping its youngest users safe.
“Facebook has a record of failing to protect the safety and privacy of children on its platform, despite claims that its products have strict privacy controls. Reports from 2019 showed that Facebook’s Messenger Kids app, intended for kids between the ages of six and 12, contained a significant design flaw that allowed children to circumvent restrictions on online interactions and join group chats with strangers that were not previously approved by the children’s parents,” the letter said.
“Just recently, a ‘mistake’ with Instagram’s algorithm promoted diet content to users with eating disorders, where the app’s search function recommended terms including ‘appetite suppressants’ and ‘fasting’ to vulnerable people who were at risk of relapsing.”
The letter also cited research that “increasingly demonstrates that social media can be harmful to the physical, emotional, and mental well-being of children,” including several studies which found, among other things, that viewing selfies led to “decreased self-esteem” and “decreased life satisfaction.”
Another study found that “Instagram … exploits young people’s fear of missing out and desire for peer approval to encourage children and teens to constantly check their devices and share photos with their followers[,]” and “[t]he platform’s relentless focus on appearance, self-presentation, and branding presents challenges to adolescents’ privacy and wellbeing.”
Facebook, the letter pointed out, told a congressional hearing in March that “[t]he research we’ve seen is that using social apps to connect to other people can have health benefits.”
The letter went on to raise another point of concern: Those under 13 “are also simply too young to navigate the complexities of what they encounter online, including inappropriate content and online relationships where other users, including predators, can cloak their identities using the anonymity of the internet.
“One report found an increase of 200% in recorded instances in the use of Instagram to target and abuse children over a six-month period in 2018, and UK police reports documented more cases of sexual grooming on Instagram than any other platform.”
And keep in mind — that was on a platform that isn’t based around pictures from and of children under the age of 13.
The project hasn’t gone live yet. BuzzFeed News first reported on it back in March when it got ahold of leaked Facebook memos.
“I’m excited to announce that going forward, we have identified youth work as a priority for Instagram and have added it to our H1 priority list,” Vishal Shah, Instagram’s vice president of product, reportedly wrote in a post on an employee message board.
“We will be building a new youth pillar within the Community Product Group to focus on two things: (a) accelerating our integrity and privacy work to ensure the safest possible experience for teens and (b) building a version of Instagram that allows people under the age of 13 to safely use Instagram for the first time.”
The last major product the social media giant marketed to minors under the age of 13 was the aforementioned Messenger Kids app.
The company promised there would be no negative impact from letting children that young use digital messaging services. However, according to Wired, many of the “experts” Facebook consulted when developing the kids’ messaging app had also received funding from the social media giant.
Facebook seems to be taking the same tack this time around, although it’s unclear whether the experts it consults will have similar conflicts of interest.
“We agree that any experience we develop must prioritize their safety and privacy, and we will consult with experts in child development, child safety and mental health, and privacy advocates to inform it,” a Facebook spokesperson said, according to CNBC. “We also look forward to working with legislators and regulators, including the nation’s attorneys general.”
Facebook later sent an updated statement saying the product was aimed at kids who were already using the internet: “We want to improve this situation by delivering experiences that give parents visibility and control over what their kids are doing. We are developing these experiences in consultation with experts in child development, child safety and mental health, and privacy advocates.”
Again, the platform isn’t live yet. But Facebook has announced no substantive measures to let parents control their children’s social media usage or to ensure predators won’t use the platform to groom and abuse their kids.
All Facebook has done is assure parents: “Don’t worry, we’ve got it covered.”
That just won’t do, particularly given the prevalence of predatory behavior and cyberbullying on other similar platforms.
This isn’t just your kid watching Starsky and/or Hutch take down a drug dealer. This is a whole different ballgame, one that social media companies should not, and cannot, be allowed to profit from.
The dangers are simply too great and the benefit is totally nonexistent.
This article appeared originally on The Western Journal.