[Photo: a child playing with a tablet]

🔒 New Rules for Keep Da Keiki Safe Online 📱

⬇️ Pidgin | ⬇️ ⬇️ English

Howzit, gang! Get one big kine news from da U.S. government side. Dey proposing some new rules for make sure da keiki stay safe wen dey online. Dis one real important step for protect our keiki from kine stuffs dat no good for dem on da internet.

Da Federal Trade Commission (F.T.C.) stay saying we gotta make some serious changes for protect da children’s privacy online. Dis one of da biggest moves by da U.S. government for beef up consumer privacy in more than ten years. Dey like make sure da online world safe for our keiki, especially wen dey using social media, gaming apps, and learning tools.

Da plan? Dey like fortify da rules under da Children’s Online Privacy Protection Act of 1998. Dis law stops online services (kine stuffs like social media apps, video game platforms, toy retailers, and digital ad networks) from tracking da young ones. Da F.T.C. like shift da responsibility from da parents to da apps and oddah digital services. Dey also like limit how these platforms use and make money off da keiki’s data. 🛡️👶

Da proposed changes mean certain online services gotta turn off targeted ads for kids under 13 by default. And dey no can use personal info like one kid’s cellphone number for keep dem on da platform longer. No more bombarding da young ones with push notifications.

Da updates also include beefing up security requirements for online services dat collect children’s data. Plus, dey gotta limit how long dey can keep dat info. And for da learning apps and oddah educational-tech providers, dey only can collect kids’ personal details for educational stuffs, not for make money.

Da F.T.C. Chair Lina M. Khan says, “Kids gotta be able to play and learn online without being tracked by companies looking to hoard and monetize their personal data.” She like make sure da companies take more responsibility for keeping da keiki’s data safe.

COPPA, da main federal law protecting keiki online in da U.S., requires online services aimed at children, or those knowing they get kids on their platform, for get permission from one parent before collecting, using, or sharing personal details from a kid under 13.

To stay legal, popular apps like Instagram and TikTok get rules dat kids under 13 no can set up accounts. Social media and gaming apps typically ask users for their birthday wen signing up.

But even with dis, da regulators filed plenty complaints against big tech companies. Dey say some companies no do good with age-checking; show targeted ads to kids without parental permission; let strangers contact kids online; or keep kids’ data even afta parents ask for delete ’em. Amazon, Microsoft, Google, YouTube, Epic Games (da guys who make Fortnite), and Musical.ly (now known as TikTok) all had for pay big fines for break dis law.

And den get dis: 33 state attorneys general wen’ file one lawsuit against Meta, da parent company of Facebook and Instagram. Dey say Meta wen’ break da children’s privacy law. Meta say dey working hard for make online experiences safe for teenagers and dat da complaint no really show what dey doing.

Da F.T.C. proposed these stronger kids’ privacy protections amid growing concerns about da mental health and physical safety risks popular online services might pose to da young ones. Parents, doctors, and children’s groups worried dat social media content recommendation systems show stuff promoting self-harm, eating disorders, and plastic surgery to young girls. Some school officials also worried dat social media platforms distracting students from their school work.

States dis year passed more than a dozen laws restricting minors’ access to social media networks or porn sites. But, industry trade groups been suing for block some of these laws.

Da F.T.C. started reviewing da kids’ privacy rule in 2019 and received more than 175,000 comments. Da result? One proposal running more than 150 pages. Proposed changes include making online services no can use tracking codes for keep kids on their platforms longer without verifiable parental consent.

How online services going follow these changes not clear yet. Da public get 60 days for comment on da proposals, and den da commission going vote.

Da first reactions from industry trade groups mixed. Da Software and Information Industry Association, wit members like Amazon, Apple, Google, and Meta, says dey “grateful” for da F.T.C.’s efforts and like participate in da next phase. But NetChoice, wit members including TikTok, Snap, Amazon, Google, and Meta, say da agency’s proposed changes go too far, overriding what parents might like.

Dis new rule from da F.T.C. going make um more hard for websites for provide services to kids as approved by their parents, according to Carl Szabo, NetChoice’s general counsel.

So, das da scoop, everybody. Da F.T.C. trying for make sure our keiki stay safe online. It’s about making da internet world one safer place for dem. 📱🔒👧👦🌐🤙


NOW IN ENGLISH

U.S. Regulators Propose New Online Privacy Safeguards for Children 📱🔒

Hello everyone! Big news from the U.S. government: they’re proposing significant changes to enhance children’s online privacy. This initiative is one of the most substantial efforts to strengthen consumer privacy in the U.S. in over a decade.

The Federal Trade Commission (F.T.C.) is looking to revamp the key federal rule safeguarding children’s online privacy. The proposed changes aim to strengthen the Children’s Online Privacy Protection Act of 1998. This law currently restricts online tracking of youngsters by services like social media apps, video game platforms, toy retailers, and digital advertising networks. The F.T.C.’s goal is to shift online safety responsibility from parents to apps and other digital services, while also curbing how these platforms use and profit from children’s data. 🛡️👶

The proposed modifications would require online services to disable targeted advertising by default for children under 13. They would also prohibit these services from using personal details, like a child’s cellphone number, to keep them on their platforms longer. This means an end to personal data being used to inundate young children with push notifications.

Additionally, the updates aim to enhance security for services collecting children’s data and limit the duration such information can be held. They also seek to restrict the collection of student data by educational-tech providers, allowing schools to consent to the collection of children’s personal information solely for educational, not commercial, purposes.

Lina M. Khan, the chair of the Federal Trade Commission, emphasized, “Kids must be able to play and learn online without being endlessly tracked by companies looking to hoard and monetize their personal data.” She advocates for firms to safeguard kids’ data, placing direct obligations on service providers.

COPPA, the central federal law protecting children online in the U.S., requires online services targeting children, or aware of their presence on their platform, to obtain parental permission before collecting, using, or sharing personal details from children under 13. Popular apps like Instagram and TikTok enforce rules that prohibit children under 13 from creating accounts.

Despite these rules, regulators have filed numerous complaints against large tech companies, accusing them of failing to properly verify users’ ages, showing targeted ads to children without parental permission, allowing strangers to contact kids online, or keeping children’s data even after parents asked for it to be deleted. Companies like Amazon, Microsoft, Google, YouTube, Epic Games, and Musical.ly (now TikTok) have paid substantial fines for violating the law.

Separately, Meta, the parent company of Facebook and Instagram, faced a lawsuit filed by 33 state attorneys general, accusing it of violating children’s privacy laws. Meta contends it has worked diligently to create safe and age-appropriate online experiences for teenagers and that the complaint misrepresents its work.

The F.T.C.’s proposed changes come amid heightened concerns about the mental health and physical safety risks posed by popular online services to young people. There are worries that social media content recommendation systems expose young users to inappropriate content, and that platforms distract students in educational settings.

States this year have passed more than a dozen laws restricting minors’ access to social media networks or pornographic sites, with industry trade groups suing to block several of them.

The F.T.C. began reviewing the children’s privacy rule in 2019 and received more than 175,000 comments. The resulting proposal, extending over 150 pages, includes changes like restricting online operators from using user-tracking codes to maximize children’s time spent on their platforms without verifiable parental consent.

How online services will adapt to these changes remains to be seen. The public has a 60-day comment period on the proposals, after which the commission will vote.

Industry reactions are mixed. The Software and Information Industry Association, whose members include Amazon, Apple, Google, and Meta, said it was “grateful” for the F.T.C.’s efforts and wants to take part in the next phase, while NetChoice, whose members include TikTok, Snap, Amazon, Google, and Meta, argued the proposed changes go too far and could hinder websites from providing services to children that their parents have approved.

This development from the F.T.C. is a significant step towards creating a safer online environment for children. 📱🔒👧👦🌐🤙

