What’s your biggest grievance with social media?
Is it the privacy concerns? Maybe you’re tired of seeing people glued to their screens. Maybe you’re done with your own doomscrolling habits. Or perhaps you’re simply sick to your stomach thinking about all the different ways that social media is being misused by bad actors—deepfake explicit images, child exploitation, sexual predators and so on.
There’s been a lot of debate in the last 10 years or so over the addictive nature of social media. Studies have correlated negative mental health effects—such as depression, anxiety and suicidal ideation—with heavy social media use.
Of course, correlation doesn’t necessarily equal causation. That said, many governments are working to ban social media for children under 16. Australia was the first country to do so, passing its law in late 2024 and putting the ban into effect in December 2025. France has fast-tracked its own legislation to impose a ban. And many U.S. states are likewise looking at ways to limit social media access for kids.
But this week marks a potential turning point in the state of social media.
Social media giants have long argued that their platforms aren’t the problem; it’s the content posted on those platforms that causes issues. So far, they’ve been protected by clauses in Section 230 of the Communications Decency Act, which state that online platforms can’t be held liable for content posted by third parties, and they haven’t been held accountable for the alleged harms that their products cause.
However, new evidence suggests that, while third-party content about self-harm, disordered eating, steroids, drugs and other risky behaviors is certainly harmful, it’s the algorithms created by social media companies that are exacerbating these dangers.
Alice’s Algorithm in Wonderland
Let’s look at an example: If a teenager searches social media for a new gym routine, the algorithm will take note that he or she wants this sort of content. It will also note that teen’s demographics, such as age and gender, and it will comb its archives to see what sort of videos other people in that same demographic, with that same interest, enjoyed: typically, more videos about exercise and probably dieting, too. It will show the teen those videos, and, if the teen engages with them, it will continue down the trail of content searched for and watched by people with similar interests. Pretty soon, teenage girls can find themselves locked into content about eating disorders and plastic surgery; teenage boys might wind up seeing videos about steroids and “looksmaxxing.” From there, things can get much, much darker as the rabbit hole goes deeper.
These teens didn’t ask for this content. They might even become self-aware enough to try to “fix” their algorithms by refusing to engage with harmful content and searching for videos of cute puppies and kittens. But that’s not how the algorithm works. It doesn’t acknowledge that a teenager’s interests might change. It doesn’t recognize that not all users want to continue down the rabbit hole. All it can process is that this user, this teen, engaged with this type of content. And so it’s going to follow its programming and continue showing that content because its own records demonstrate that that’s what the majority of other users did. So if it’s persistent enough, this user will follow suit, too.
The algorithm conforms to who we are … at first. But then, it wants us to conform to it.
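For readers who want to see that mechanism spelled out, here’s a minimal sketch in Python of the kind of engagement-weighted ranking described above. Everything in it (the topic names, the weights, the `recommend` function) is a hypothetical illustration, not any platform’s actual code, and real recommendation systems are far more sophisticated. But it captures the imbalance at issue: the crowd’s past engagement can drown out one user’s attempt to steer away.

```python
from collections import Counter

# Hypothetical engagement data: what other users in the same demographic
# bucket watched. Illustrative numbers only; no real platform data or
# API is represented here.
SIMILAR_USER_ENGAGEMENT = {
    "teen_male": Counter({
        "gym routines": 40, "dieting": 35,
        "steroids": 20, "looksmaxxing": 15,
    }),
    "teen_female": Counter({
        "gym routines": 30, "dieting": 40,
        "plastic surgery": 20, "eating disorders": 15,
    }),
}

def recommend(demographic: str, recent_searches: list[str], k: int = 3) -> list[str]:
    """Rank topics by what similar users engaged with most.

    The user's own recent searches get a tiny weight compared with the
    demographic-wide engagement signal, so a burst of "cute puppies"
    searches barely moves the ranking.
    """
    scores = Counter()
    for topic, count in SIMILAR_USER_ENGAGEMENT.get(demographic, Counter()).items():
        scores[topic] += count * 10   # heavy weight: crowd engagement
    for topic in recent_searches:
        scores[topic] += 1            # light weight: this user's current intent
    return [topic for topic, _ in scores.most_common(k)]

# One search for a gym routine...
print(recommend("teen_male", ["gym routines"]))
# ['gym routines', 'dieting', 'steroids']

# ...and trying to "fix" the feed with puppies and kittens doesn't work.
print(recommend("teen_male", ["cute puppies", "kittens", "cute puppies"]))
# ['gym routines', 'dieting', 'steroids']
```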
Bellwether-ing This Storm
Now, social media giants would probably argue that I’m over-simplifying and over-generalizing this topic, that it’s much more nuanced than that. However, those same companies—Meta (Facebook and Instagram), YouTube, TikTok and Snapchat—are going to court for essentially doing what I just outlined.
The argument is that these companies knowingly and willingly created addictive products and “specifically targeted minors as a core market,” all in the name of generating revenue. And this week, jury selection began for the first of many “bellwether trials” (landmark test cases) that will determine if social media companies will be held liable in the future.
Notably, before the trial began, TikTok and Snapchat both settled out of court with the first plaintiff, a woman known as K.G.M. who is part of a group of thousands suing the social media companies. But that isn’t to say those companies won’t come up again in later trials.
In the weeks to come, parents can expect to hear testimony from other parents and children personally impacted by social media. According to PBS, “K.G.M. alleges she started watching YouTube at age 6, and had Instagram, Snapchat and TikTok accounts by the time she was 14.” She also claims “that a lack of sufficient guardrails and warnings on the social media platforms led to compulsive use and mental health concerns such as depression, anxiety, body dysmorphia, self-harm and risk of suicide.”
Parents can also expect to read internal messages between company employees comparing their products to drugs; emails revealing that company heads (including Meta’s Mark Zuckerberg) prioritized profits from teen engagement over those same teens’ safety; and other documents demonstrating that these companies wanted to create users “for life” by engaging them when they’re still kids, according to Ars Technica.
As this trial (and others) progresses, some parents might be tempted to ban social media in their own households. Others may feel like the details are getting blown out of proportion. And still others might be frozen by indecision, feeling like the situation is a lose-lose no matter what.
I would advise you to talk to your teens about the issue. Talk to them about the trials and ask what they think about them. Ask them if they think your family’s social media habits could use some adjusting. And work together to form some boundaries to keep your family safe and healthy. Here are some questions to get you started:
- Which do you think is the bigger issue: third-party content or the algorithms themselves?
- Do you see any benefits to algorithms—that is, do you think you more frequently receive content that is helpful or harmful?
- Do you think there are times when you use social media too much?
- Do you think there are times when I use social media too much?
- Do you ever want to quit social media? How do you think you would handle that if you did?
- When are some good times in the day/week we can set aside to focus on family time instead of screen time?
- Should we have some no-phone zones in the house?
Whatever happens in the course of these trials, social media’s influence isn’t going anywhere. And even if some good legislation results from these lawsuits, the ultimate safeguard for your teens isn’t the judicial or legal system: It’s you. And we’ll do our best to keep you informed every step of the way.

