The bill, the California Age-Appropriate Design Code Act, would require companies like TikTok, Instagram and YouTube to install safeguards for users under 18, including defaulting minors to higher privacy settings and refraining from collecting their location data. It would also require companies to analyze their algorithms and products to determine how they may affect young users, assessing whether they are designed to be addictive or could cause other harm to children.

Child safety advocates applauded the bill, which passed by a 33-to-3 vote, saying similar federal legislation is needed to protect young users. The bill is “a huge step forward toward creating the Internet that kids and families deserve,” said Josh Golin, executive director of the advocacy group Fairplay. “For too long, tech companies have treated dire privacy and security issues as a public relations problem to be addressed only through vague promises, obfuscation and delays,” he said. “Now, tech platforms should prioritize the interests and well-being of young Californians before reckless growth and shareholder dividends.”

Some privacy advocates, meanwhile, have raised concerns about the bill’s broad scope, since it could require all users to verify their age and restrict anonymous web browsing. “The bill would dramatically degrade the Internet experience for everyone and empower a new censorship-focused regulator that has no interest or expertise in balancing complex and competing interests,” wrote Eric Goldman, a law professor and critic of the bill.

California is the first US state to pass legislation governing apps that are “likely to be accessed” by users under 18. The vote comes after the state failed to pass a separate children’s online safety bill earlier in August. That bill, AB 2408, would have allowed companies to be sued for designing features that keep young users addicted to an app.

The bill now faces a final vote before being sent to Governor Gavin Newsom to be signed into law. If enacted, it would go into effect in 2024, and companies found violating its child protection measures could face fines of up to $7,500 per user. The bill mirrors similar legislation, the Age Appropriate Design Code, which came into force in the UK in 2021.

The legislation comes as social media companies face growing scrutiny over their impact on public health, particularly on their youngest and most vulnerable users. In 2021, whistleblower Frances Haugen revealed internal research from Instagram’s parent company, Meta, showing the app’s harmful effects on the mental health of teenage users. Such revelations have intensified calls from advocacy groups to strengthen protections for young users, with many urging that the federal children’s online privacy law, known as COPPA, be updated to better protect children.

The passage of California’s design code is “a monumental step in protecting California’s children online,” said Jim Steyer, founder and CEO of the children’s online safety organization Common Sense Media, though he said more measures need to be taken. The act “is just part of the change we need to better protect young people from the manipulative and dangerous practices used by online platforms today,” he said. “California lawmakers and lawmakers across the country must follow up on this important development by enacting additional online and platform privacy accountability measures.”

