• KoboldCoterie@pawb.social
    1 day ago

    Parents giving their children these devices, observing excessive attachment, and not cutting them off bear considerable responsibility.

    While I do agree that parents should bear the brunt of the responsibility here, you must realize that kids are resourceful and no amount of parental oversight will stop a determined kid from accessing this content. Parents aren’t in their presence 24/7, and just like a kid whose parents deny them candy can find plenty of ways to obtain it without their parents knowing, the same is true for social media use. It’s the old adage that the more you tighten your grip, the more slips through your fingers.

    liberty

    You keep using that word, but this isn’t really about personal freedoms at all. It’s about companies that saw that their product was causing harm, and actively made the decision to continue promoting that harmful product in the name of profits. Their products were specifically engineered to cause these outcomes, and you’re defending their right to do that. Do you just propose we allow companies to do whatever they want in the name of profits, no matter the cost to society? If not, where do you draw the line? How much harm do they have to knowingly cause before you think it’s too much?

    When risks are open & obvious, such as the overconsumption of certain foods & legal substances, that’s generally viewed as a matter of personal choice rather than unreasonably dangerous product defect.

    Actually, we restrict alcohol and cigarette use by minors too, precisely because their effects are known to be harmful, so I’m not sure what point you’re trying to make here. Nobody’s talking about making social media use illegal for adults.

    Basically, I think you’re arguing against social media restrictions for kids, which is fine, but that’s a completely different discussion. It’s related, but it’s not what this article is about - this article is about holding corporations responsible for bad behavior. If that isn’t what you want to discuss, why are you here?

    However, even supposing such features defectively make the system unreasonably dangerous in a reasonably foreseeable manner, that only demands that service providers provide fair warning. Once duty to warn has been met, users are reasonably aware of risks and responsibility shifts to risk-takers or parents who give children access despite reasonably knowing the risk.

    Okay, I think you’re just not understanding the situation here. Meta did research on the effects of social media. They found that it was harmful. Even after determining that, they continued to promote it as not harmful. Zuckerberg even testified that evidence of social media being harmful didn’t exist, after the company had already found such evidence. This all came to light because of whistleblower testimony. So even if we accept your premise here, that duty to inform was not met, and that’s part of what’s at issue here.