Harnessing data and digital technology
Search engines are one of the main ways children are exposed to online pornography and other potentially harmful material. eSafety's research ‘Accidental, unsolicited and in your face’ shows many children's first exposure to pornography is accidental, and this often occurs via a search engine. A new industry code drafted by industry bodies representing search engines including Google, Yahoo and Microsoft (owner of Bing) will provide some very important safeguards. The aim of the code is to protect children from exposure to age-inappropriate material which may be harmful to them, including:
• online pornography
• high-impact violence
• eating disorders
• suicide and self-harm.
All measures will be in place by 27 June next year. Here's how it'll work:
• When children are logged into a search account, the search engine will be required (at a minimum) to filter out pornography and high-impact violence.
• If you're not logged into an account and your search returns pornographic or extremely violent images, these will be blurred by default, to reduce the possibility of children being exposed to them by accident.
In addition, if you enter a search relating to eating disorders, suicide or self-harm, any material promoting these harms will be downranked, while reliable health information and support services will be promoted. Users can still follow links that appear in their search, regardless of whether they are downranked or the images are blurred. If you are over 18, you can choose to opt in or out by logging into an account and changing the settings. These steps codify practices that major search engines have long followed; making them enforceable provides more protections for users and greater accountability for industry. (A short sketch of this logic follows the link below.)
Strengthening children's safety: the new search industry code
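To make the mechanics above concrete, here is a minimal, hypothetical sketch of the minimum safeguards the code describes. Nothing in it comes from the code's actual text or from any search provider's systems; the SearchResult structure, category labels and weighting numbers are illustrative assumptions only.

```python
from dataclasses import dataclass, field

# Hypothetical harm categories matching those named in the code.
FILTER_OR_BLUR = {"pornography", "high_impact_violence"}
DOWNRANK = {"eating_disorder_promotion", "suicide_promotion", "self_harm_promotion"}
PROMOTE = {"health_information", "support_service"}

@dataclass
class SearchResult:
    url: str
    categories: set[str] = field(default_factory=set)
    rank_score: float = 1.0
    image_blurred: bool = False
    visible: bool = True

def apply_safeguards(results: list[SearchResult], logged_in: bool,
                     is_child: bool, adult_opted_out: bool = False) -> list[SearchResult]:
    """Apply the minimum safeguards described above to a result list (illustrative only)."""
    for r in results:
        if r.categories & FILTER_OR_BLUR:
            if logged_in and is_child:
                # Signed-in children: pornography and high-impact violence are filtered out.
                r.visible = False
            elif not (logged_in and adult_opted_out):
                # Signed-out users (and adults who have not changed their settings):
                # such images are blurred by default rather than removed.
                r.image_blurred = True
        if r.categories & DOWNRANK:
            # Material promoting eating disorders, suicide or self-harm is pushed down
            # the rankings, not removed; the link can still be followed.
            r.rank_score *= 0.2
        if r.categories & PROMOTE:
            # Reliable health information and support services are promoted.
            r.rank_score *= 2.0
    return sorted((r for r in results if r.visible),
                  key=lambda r: r.rank_score, reverse=True)
```

Real implementations will differ between providers; the point is simply that filtering for signed-in children, default blurring for signed-out users, and downranking alongside promotion of support services can coexist without removing a user's ability to follow a link.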
Kobi refused a doctor's AI. She was told to go elsewhere
Six years on from the start of Australia's Privacy Act overhaul, there's still room to send the industry scattering and cast doubt on what will actually land. The Productivity Commission is the latest to throw a curveball on where things may be headed, proposing a fresh ‘dual-track’ privacy compliance approach for businesses in its interim Harnessing data and digital technology report this week. It's not the only controversial thing in there: less red tape around AI, a rethink of copyright infringement for feeding the robots and an immediate halt on guardrails for high-risk AI use cases are also on the recommendations list, along with more data accessibility overall. Data Synergies' Peter Leonard sees the privacy take as an example of how privacy reform isn't really being dictated by consumer privacy needs but by bigger government concerns. While the choice of outcomes-based approaches to using consumer data will appeal to digital marketers, he also warns the flip side is even more stringent alternatives from the Privacy Commissioner. ADMA's Andrea Martens points out that privacy progress has "stalled" over the last 18 months, and hopes these latest recommendations will be a "circuit breaker to the unproductive privacy versus business debate". Civic Data's Chris Brinkworth, meanwhile, says the Productivity Commission's modus operandi is less about privacy and more about the internationally competitive product and AI race Australia must face.
Productivity Commissioner throws cat among the pigeons on Privacy Act reform; calls for ‘safe harbour’ on personal data use for brands acting in best interest of consumers, opens door to AI copyright free-for-all
Earlier this year, the UK government ordered Apple to provide access to encrypted data in the company's cloud storage service, iCloud.
A UK Government Order Threatens the Privacy and Security of All Internet Users
Read the responses to the reform areas we're exploring for Pillar 3: Harnessing data and digital technology.
Responses to Pillar 3: Harnessing data and digital technology
In carrying out our key activities, decisions to undertake discretionary regulatory action are taken in accordance with the OAIC's Regulatory action policies.
OAIC regulatory priorities
OpenAI scrambles to remove personal ChatGPT conversations from Google results.
ChatGPT users shocked to learn their chats were in Google search results
Social media minimum age legislation passed
OAIC invites industry, civil society, academia and other interested stakeholders to consult on the development of the Children's Online Privacy Code.
Children’s Online Privacy Code (consultation for industry, civil society, academia and other interested stakeholders)
This tool will help you to assess your business's current privacy practices by providing examples that may apply to your own circumstances.
Privacy Foundations self-assessment tool
Privacy Awareness Week is an annual event to promote and raise awareness of the importance of protecting personal information. This year, it runs from Monday 16 June to Sunday 22 June with the theme: ‘Privacy – it's everyone's business’.
Privacy Awareness Week 2025
An Economist Best Book of the Year. As the data economy grows in power, Carissa Véliz exposes how our privacy is eroded by big tech and governments, why that matters and what we can do about it.
The moment you check your phone in the morning you are giving away your data. Before you've even switched off your alarm, a whole host of organisations have been alerted to when you woke up, where you slept, and with whom. As you check the weather or scroll through your ‘suggested friends’ on Facebook, you continually compromise your privacy. Without your permission, or even your awareness, tech companies are harvesting your information, your location, your likes, your habits, and sharing it amongst themselves. They're not just selling your data. They're selling the power to influence you. Even when you've explicitly asked them not to. And it's not just you. It's all your contacts too.
Digital technology is stealing our personal data and with it our power to make free choices. To reclaim that power and democracy, we must protect our privacy. What can we do? So much is at stake. Our phones, our TVs, even our washing machines are spies in our own homes. We need new regulation. We need to pressure policy-makers for red lines on the data economy. And we need to stop sharing and to adopt privacy-friendly alternatives to Google, Facebook and other online platforms.
Short, terrifying, practical: Privacy is Power highlights the implications of our laid-back attitude to data and sets out how we can take back control. If you liked The Age of Surveillance Capitalism, you'll love Privacy is Power because it provides a philosophical perspective on the politics of privacy, and it offers a very practical outlook, both for policymakers and ordinary citizens.
Privacy is Power – book by Carissa Véliz (introduction)
AI Privacy and Safety Checks © 2024 by Sarah Wood
‘Thrilled’ 23andMe founder buys firm out of bankruptcy | Information Age | ACS
NIB denies using social media to predict health
Learn how Concentric AI protects sensitive data in Microsoft Copilot by identifying and mitigating risks without complex rules or user input.
Is Copilot safe? Microsoft Copilot security concerns explained
New research shows nearly one in five young adults believe tracking a partner's location is to be expected in a relationship.
Location-sharing apps linked to increased risk of digital coercive control, eSafety Commission research finds
Listening to children, young people and parents is crucial to make sure the Children's Online Privacy Code responds to their needs and experiences.
Children, young people and parents invited to help shape online privacy protections
Insights from eSafety’s image-based abuse reporting and removal scheme
Melbourne Council wants facial recognition
Significant changes to the Australian Privacy Act are underway, with the first tranche of major reforms introduced in December 2024. These changes are designed to modernise the legislation and align it with global privacy standards. They will impact how organisations of all sizes handle personal information, with key implications for cyber businesses, government agencies, and small to medium enterprises (SMEs). While further reforms are expected, the scope and timing of the next phase remain unclear.
An Update on Privacy Reforms: What It Means for Cyber, Government and Small Business
Google has backflipped on a promise to allow users to easily opt out of third-party cookies in its market-leading browser, announcing overnight it will not introduce the standalone prompt it flagged last year. It means the ability to opt out of third-party cookies – the tiny packets of data that track users' activity across the internet – will remain buried in the settings of Chrome. The decision comes as Google parent Alphabet faces legal pressure in the US, where a judge has ruled it maintains illegal monopolies on online advertising technology. (A short sketch of how third-party cookies work follows the link below.)
Google backtracks on cookies, again
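For readers unfamiliar with the mechanism, the sketch below uses Python's standard http.cookies module to show what a third-party tracking cookie looks like. The domain and cookie names are made up, and this is not Google's or Chrome's implementation; it is only an illustration of why a cookie set by one widely embedded tracker can link a user's activity across unrelated sites.

```python
from http.cookies import SimpleCookie

# Hypothetical tracker embedded on many unrelated sites (for example via an ad or
# analytics tag). When the browser first loads a resource from tracker.example,
# the tracker's response includes a Set-Cookie header like this:
cookie = SimpleCookie()
cookie["uid"] = "a1b2c3d4"                   # made-up user identifier
cookie["uid"]["domain"] = "tracker.example"  # scoped to the tracker, not the page being visited
cookie["uid"]["path"] = "/"
cookie["uid"]["secure"] = True
cookie["uid"]["samesite"] = "None"           # required for the cookie to be sent in third-party contexts
cookie["uid"]["max-age"] = 60 * 60 * 24 * 365

print(cookie["uid"].OutputString())
# uid=a1b2c3d4; Domain=tracker.example; Max-Age=31536000; Path=/; SameSite=None; Secure

# Later, when news-site-a.example and shop-b.example both embed a resource from
# tracker.example, the browser automatically attaches "Cookie: uid=a1b2c3d4" to those
# requests, letting the tracker link the same user's visits across both sites.
# Blocking third-party cookies means the browser refuses to store or send cookies
# in these cross-site contexts.
```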
How AI Uses your Data Against You
We are developing a Children's Online Privacy Code to help better protect children and young people online, writes Dr Kate Bower, director of our Privacy Reform Implementation and Social Media Taskforce. Every child in Australia has grown up in the shadow of social media.
Sunshine and double rainbows – building a better online environment for children and young people
Once errors creep into the AI knowledge base, they can be very hard to get out.
A weird phrase is plaguing scientific papers – and we traced it back to a glitch in AI training data
Amazon has disabled some Alexa privacy features amid a push to introduce AI capabilities and make more money.
Everything you say to an Alexa speaker will be sent to Amazon – starting today
This report by the Australian Information Commissioner examines the prevalence and use of messaging apps by Australian Government agencies.
Messaging apps: a report on Australian Government agency practices and policies
This new guidance is intended for developers of generative artificial intelligence (AI) models or systems who are subject to the Privacy Act.
Guidance on privacy and developing and training generative AI models
Senators are calling for stronger privacy laws to give Facebook users the ability to block the company from using their posts to train its AI models, as users can in the EU.
Meta admits Australians cannot opt out of ‘predatory’ AI data scrape
Aussies admit to dangerous, antisocial smart glasses use
“I Have Nothing to Hide” – The Dangerous Myth About Privacy
Founder of Session relocates to Switzerland citing ‘hostile’ atmosphere towards privacy-focused technology
International
The Secret System Behind Every Call You Make Is About to Change Hands
The password era is ending. Bad actors know it, which is why they’re desperately accelerating password-related attacks while they still can.
Convincing a billion users to love passkeys: UX design insights from Microsoft to boost adoption and security