Age verification… for Google? It sounds like a punchline, but it is real policy here in Australia. Australia is considering mandatory age verification just to access search engines and AI tools, with the aim of shielding minors from inappropriate content. On the surface, that sounds responsible. But here is the deeper issue:
1. How do you technically verify someone's age online without overstepping personal privacy?
2. What burden does this place on platforms, publishers, developers, and users?
3. And what happens when access to information becomes gated, segmented, or throttled in the name of "protection"?
This is no longer just about adult content. It is about who gets to search, what they can see, and who controls the filters. From a tech, design, and ethical standpoint, the implications are massive.
Here is the question I am asking: is this the start of a safer digital future, or a slippery slope toward a censored one? What do you think?
Australia's age verification for Google: a slippery slope?
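One answer to the first question above is attribute attestation: a trusted verifier checks a document once and issues a signed "over 18" claim, so the platform never sees a birthdate or identity. The sketch below is a hypothetical toy illustration of that idea (the names `issue_token` and `check_token` and the shared-secret signing are my assumptions; real schemes use public-key signatures or zero-knowledge proofs).

```python
import hashlib
import hmac
import json

# Hypothetical demo key shared by verifier and platform; a real deployment
# would use the verifier's public-key signature instead of a shared secret.
SECRET = b"verifier-demo-key"

def issue_token(over_18):
    """Verifier side: sign a minimal claim containing only a boolean,
    never the birthdate, name, or document that proved it."""
    claim = json.dumps({"over_18": over_18}).encode()
    sig = hmac.new(SECRET, claim, hashlib.sha256).hexdigest()
    return {"claim": claim.decode(), "sig": sig}

def check_token(token):
    """Platform side: verify the signature, then read only the boolean."""
    expected = hmac.new(SECRET, token["claim"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        return False  # tampered or forged token
    return json.loads(token["claim"])["over_18"]

token = issue_token(True)
print(check_token(token))  # True: age asserted, no personal data exposed
```

The design point: the platform can enforce an age gate while learning exactly one bit about the user, which is the privacy property the debate turns on.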
More Relevant Posts
Digitally Native or Digitally Naive? Let's quote a few research facts:
- In the USA, 38% of those who go online are unable to perform more complex digital tasks.
- Many Americans also have difficulty distinguishing facts from opinions online.
- Over 70% of adults can't name the three branches of government. This isn't just embarrassing trivia; it's a democratic emergency.
- Nationwide, only 22% of students achieved civics proficiency.
- There is no comprehensive federal privacy law in the USA.
The digital age has brought an unprecedented transformation of how misinformation and propaganda spread. Technologies for creating manipulative content are becoming cheaper, more sophisticated, and easier to use: a perfect storm for spreading digital deception.
#Google went from the motto "Don't Be Evil" to digital surveillance. Google started tracking users across websites, using cookies to build detailed behavioral profiles, and simple advertising evolved into what we now call #surveillance #capitalism.
#Apple went from "freedom fighter against corporate control" to secretly tracking users even when they opted out; when requested, the company shares data with law enforcement 90 percent of the time. The gap between marketing promises and actual behavior is as wide as their profit margins.
The big tech companies exploit our human psychology to keep us chained to their products while using their vast resources to lobby against privacy legislation that might protect us. Consequently, modern civics education (currently very weak in the USA) needs to integrate #digital #literacy as a core democratic skill. Citizens who understand data exploitation are more likely to protect their privacy, while those who can evaluate information sources resist manipulation. More than just identifying fake news, users need to understand who controls information and how corporate interests shape media narratives.
Americans today face a double crisis: low digital literacy and limited #civic knowledge. That combination leaves people wide open to online manipulation and extremism. The tech giants haven't helped. What once felt like tools of liberation have morphed into engines of surveillance capitalism, profiting from user data while crushing competition.
But it doesn't have to be this way. The EU's #GDPR shows that real #privacy protections are possible, and it has already sparked global momentum. The path forward must combine digital literacy and civic education. Citizens should know not only how to stay safe online, but also how democracy works, and how the two connect. By practicing basic digital hygiene, choosing privacy-first tools, and sharing knowledge in your community, you can protect yourself while strengthening #democracy.
In the end, digital literacy isn't optional anymore; it's a cornerstone of preserving #freedom and democracy in a connected world. This is an excellent book for young and old. I leave the conclusions to you about WHY our world is in the state it is in.
SemaBOX Africa launches YEBO, an encrypted file-sharing platform built for African creators, offering free 2.5GB transfers with full privacy protection and no use of files for AI training.
*Accept all cookies?* Most users associate data protection mainly with the countless cookie banners they encounter online. Yet privacy decisions go far beyond this online interaction, influencing many aspects of our everyday lives.
This year, the annual conference of the Plattform Privatheit centered on the topic of "Data Protection and Digital Policy in Times of Crisis." The keynotes by Guido Scorza and Dr. Katja Muñoz were particularly impactful, addressing the risks of generative AI for privacy, interpersonal relationships, and democracy. Jennifer Klütsch from our team, Ann-Sophie Schenk from the Chair Individual and Technology (iTec) of RWTH Aachen, and Fabian Wörz from the JFF - Institute for Media Education in Research and Practice presented our interdisciplinary, BMFTR-funded project #SoPrivAdo. The conference brought together diverse insights from privacy research and sparked conversations that challenged assumptions and opened up new angles for our work. Many thanks for the excellent organization of the 10th annual conference and the support from Plattform Privatheit this year.
*About the #SoPrivAdo project* SoPrivAdo stands for "Chancen und Risiken sozialer Einflussfaktoren für die Privatheitsentscheidungen Adoleszenter" ("Opportunities and risks of social factors influencing the privacy decisions of adolescents"). The BMFTR-funded project investigates the opportunities and risks that arise for adolescents and young adults through shared technologies. It focuses on privacy decisions influenced by others, which can occur when sharing information in messenger groups or on social media. As friends reveal more about themselves, the social norms governing personal data sharing often shift.
Shared technologies can also result in decisions about one’s privacy being influenced by the surrounding environment—for instance, when friends post a joint photo online without consent. The project aims to enhance the autonomy of adolescents and young adults aged 16 to 25 in these decisions, empowering them to understand and exercise their right to informational self-determination, even in social contexts. Want to learn more about the project? Visit our project homepage: https://lnkd.in/dMeZuJbR or check out the conference proceedings: https://lnkd.in/dybj7xN2
One privacy decision. Multiple people affected. And often, only one gets to decide. 📱 Whether it’s messenger group chats or smart speakers in shared spaces: privacy decisions are often made by someone else, or they affect more than just the person who made them. This creates a particular challenge for adolescents and young adults, who are navigating the tension between a desire for autonomy and the need to belong to a peer group. In our research, we explore - from a psychological perspective - how to support adolescents and young adults in navigating both their own and others’ privacy in social interactions involving shared technologies. We are always happy to share and discuss our ideas, methods, and findings – most recently together with our project partners at the Plattform Privatheit conference, engaging in inspiring exchanges with researchers, practitioners, and representatives from ministries and industry 👇
📍 #Berlin, Plattform Privatheit: A week of engaging research and discussions on privacy and informational self-determination, bringing together researchers, industry representatives, and politicians to explore the opportunities and risks of digital policies to support democracy. Ann-Sophie S., along with project partners Jennifer Klütsch and Fabian Wörz, presented project #SoPrivAdo, which empowers young adolescents to make informed privacy decisions. As debates about #privacy, chat control, and #digital #rights intensify, it's crucial to reflect on how these issues shape our #future and how we can enable others to navigate their own privacy needs while balancing social influences, ultimately fostering informational self-determination. Organized by: Fraunhofer ISI. Funded by: Bundesministerium für Forschung, Technologie und Raumfahrt.
This is one of the most significant RegTech shifts we've seen. Moving to inferred age creates huge implications for privacy and regulatory enforcement. AU10TIX is at the forefront of designing identity solutions that meet the need for fair age estimation and global compliance without compromising user experience.
Sorry, your birthday doesn't matter anymore. YouTube has started using AI to estimate whether a user is under 18, and it automatically restricts access if a user is determined not to meet age requirements. As TIME Magazine reports: "This technology will allow us to infer a user's age and then use that signal, regardless of the birthday in the account, to deliver our age-appropriate product experiences and protections." Read the full article → https://lnkd.in/e7KB6uw7
Regulators have already pushed the move from self-declared dates to inferred ages, with huge implications for compliance, privacy, and user experience. That shift raises new challenges for platforms and businesses alike:
✔️ Stricter global compliance obligations
✔️ Designing fair "age estimation + correction" flows
✔️ Balancing privacy, trust, and youth protection
Explore what's already on the books in AU10TIX's Age Assurance Regulatory Guide: https://lnkd.in/dgrH3erS
#AgeAssurance #OnlineSafety #RegTech #Compliance
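The "estimation + correction" flow above can be sketched as one gating decision: an inferred-age signal restricts by default, and a verified correction supplied by the user overrides the model. This is a hypothetical toy sketch, not YouTube's actual logic; the function name, the probability score, and the 0.5 threshold are all illustrative assumptions.

```python
# Hypothetical sketch of an "estimate + correction" flow: the model's
# inferred-age signal gates restrictions by default, but completed
# verification (e.g. an ID check) always overrides the inference.

def needs_minor_protections(inferred_minor_score, verified_adult=None,
                            threshold=0.5):
    """Return True if the account should receive under-18 protections.

    inferred_minor_score: model's estimated probability the user is a minor.
    verified_adult: None if no correction was filed, else the verified result.
    """
    if verified_adult is not None:
        # A completed correction flow beats the model's guess.
        return not verified_adult
    return inferred_minor_score >= threshold

# Note the self-declared birthday never appears: only the inferred
# signal and the correction outcome drive the decision.
print(needs_minor_protections(0.9))                       # True: restrict
print(needs_minor_protections(0.9, verified_adult=True))  # False: lifted
```

The fairness questions the post raises live in exactly these two assumptions: where the threshold sits, and how burdensome the correction path is for users the model gets wrong.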
Why AI Privacy Matters to Everyday Users
Artificial Intelligence isn't just powering big companies; it's shaping the apps we use daily. But behind every smart feature is our data. 🔐 Protecting privacy isn't a "nice to have"; it's a right. Here are 3 ways you can protect yourself when using AI-driven tools:
1️⃣ Check privacy policies
2️⃣ Limit unnecessary permissions
3️⃣ Use privacy-focused apps
#AIPrivacy #DataProtection #TechTips
You disabled tracking. Google tracked you anyway.
For eight years, 98 million users believed they'd opted out of Google's data collection through the "Web & App Activity" setting. They hadn't. Google continued harvesting their app activity data, packaging it, and profiting from it. Last month, a jury awarded those users $425 million. This week, lawyers are seeking an additional $2.36 billion in forfeited profits. The verdict was clear: Google's conduct was "highly offensive, harmful, and without consent."
But here's what should alarm us most: that's roughly $4 per person whose privacy was violated. Meanwhile, Google made billions from that same data. The maths doesn't add up. When the cost of betraying user trust is a rounding error compared to the profits gained, we haven't created accountability. We've just priced in the violation.
This is privacy theatre at its finest: toggle switches and settings panels that create the illusion of control whilst the machinery of data extraction continues uninterrupted backstage. And Google isn't alone in this performance. Meta has spent years perfecting off-platform tracking, finding creative loopholes to follow users across other apps and the web regardless of their stated preferences. The tracking toggle becomes decoration, not protection.
The truly damning detail? Despite the verdict, Google hasn't changed its privacy disclosures or data collection practices. The company that built its empire on "Don't be evil" is now arguing in court that users simply "misunderstand how our products work." Perhaps we understand perfectly well. Perhaps that's precisely the problem.
We're watching the limits of digital consent being tested in real time. When a company can continue surveillance after users explicitly opt out, absorb a historic fine, and carry on unchanged, we need to ask ourselves what meaningful consent actually looks like in practice. Because right now, it looks like permission is something companies ask for out of courtesy, not necessity. And privacy settings are suggestions, not boundaries.
The question isn't whether tech companies will track us. It's whether we'll continue accepting a system where they can profit enormously from doing so without permission, pay a fraction of those earnings as penance, and face no requirement to actually stop.
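What would a toggle that is a boundary rather than a suggestion look like in code? A minimal sketch (hypothetical, not any company's actual implementation; the class and method names are mine): every collection path checks the user's setting first, and the default when no setting exists is "do not collect".

```python
# Minimal consent-gate sketch: the opt-out toggle sits in front of
# every write, and absence of consent means no collection at all.

class ActivityLog:
    def __init__(self):
        self.consent = {}  # user_id -> True only after explicit opt-in
        self.events = []   # stored (user_id, event) pairs

    def set_consent(self, user_id, allowed):
        self.consent[user_id] = allowed

    def record(self, user_id, event):
        # The toggle is a boundary, not a suggestion: no consent, no write.
        if not self.consent.get(user_id, False):
            return False
        self.events.append((user_id, event))
        return True

log = ActivityLog()
log.set_consent("alice", False)         # Alice switches tracking off
print(log.record("alice", "app_open"))  # False, and nothing is stored
print(log.events)                       # []
```

The design choice that matters is the default in `consent.get(user_id, False)`: privacy by default means the system must prove consent before collecting, not the user prove refusal after the fact.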
The 2025 legislative session makes clear that California is taking an aggressive approach to regulating the tech sector on a number of key issues, including AI-generated content, social media, children and teens’ safety, and consumer data privacy. https://lnkd.in/g-yaVDJh
What worries me is not the intent... it is the implementation. We all want safer digital spaces for kids. But when the solution involves surveillance-level verification just to search the internet, you have to ask: what problem are we solving, and at what cost? This is not just a tech issue. It is a question of digital rights, access, and how much control is too much. Curious where others land on this... Is this smart regulation or regulatory overreach? Thoughts?
Gavin Heaton The Hon. Victor Dominello Ed Husic MP Andrew Birmingham 🕵️♂️ Alastair MacGibbon