
From tall tales to truth: 10 privacy myths we still hear
Privacy misconceptions are everywhere.
Australia | Publication | June 2025
This article was co-authored with Amanda Wescombe.
Privacy misconceptions are everywhere. From boardrooms to bars and backstreets, myths about the requirements of the Privacy Act 1988 (Cth) (Privacy Act) and its Australian Privacy Principles (APPs) abound. These myths persist because privacy law is complex and ever-evolving, and perhaps because, at first glance, some rules seem counterintuitive.
For government legal practitioners, Australian and overseas businesses, privacy officers, and compliance professionals, believing myths can lead to risky compliance gaps. It’s important to bust these myths so that your agency or organisation handles personal information correctly and confidently.
Below, we tackle ten common Australian privacy myths, explaining the reality behind each one and offering practical tips for compliance.
It’s often assumed that once you “de-identify” data (e.g. by removing names), it’s no longer personal information and you can use or share it freely. In reality, this is only true if the de-identification is done properly and the data cannot be re-associated with an individual. Re-association can occur when the data is linked or combined with another data source.
The Privacy Act defines personal information as information about an identified or “reasonably identifiable” individual. Data is considered de-identified only if there is a very low risk that individuals could be re-identified. If there’s enough detail left that someone could re-identify a person (especially when combined with other data), then it’s still personal information and remains protected by the Privacy Act.
De-identification can be difficult to do well, and different methods pose different levels of risk. A superficial scrubbing of names or obvious identifiers may not be enough. Context and technical safeguards matter. The Office of the Australian Information Commissioner (OAIC) notes that any information which can be re-identified from an “anonymous” dataset must be handled as personal information under the law.
Don’t assume data is truly anonymous just because you removed names or ID numbers. Unless you’ve robustly de-identified the dataset (with very low re-identification risk), treat it as personal information and protect it accordingly. If in doubt, seek specialist advice on de-identification techniques.
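To illustrate the linkage risk, below is a minimal sketch (in Python, using the pandas library and entirely hypothetical data) of how a dataset with names stripped out can still be re-identified by joining it with another source on quasi-identifiers such as postcode, date of birth and gender.

```python
import pandas as pd

# A "de-identified" dataset: names removed, but quasi-identifiers retained.
health_records = pd.DataFrame({
    "postcode": ["2000", "3000", "6000"],
    "date_of_birth": ["1985-03-12", "1990-07-01", "1978-11-23"],
    "gender": ["F", "M", "F"],
    "diagnosis": ["asthma", "diabetes", "hypertension"],
})

# A separate, publicly sourced dataset (e.g. a scraped membership list).
public_list = pd.DataFrame({
    "name": ["Jane Citizen", "John Smith", "Mary Jones"],
    "postcode": ["2000", "3000", "6000"],
    "date_of_birth": ["1985-03-12", "1990-07-01", "1978-11-23"],
    "gender": ["F", "M", "F"],
})

# Joining on the shared quasi-identifiers re-associates each diagnosis with a
# name, so the "anonymous" dataset was personal information all along.
reidentified = health_records.merge(
    public_list, on=["postcode", "date_of_birth", "gender"]
)
print(reidentified[["name", "diagnosis"]])
```

Robust de-identification therefore needs to consider what other data sources could plausibly be combined with yours, not just the obvious identifiers you removed.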
Some believe that “business information”, like the details on a business card or a work email address, does not count as personal information. This belief stems from the idea that this type of information is “professional” and public, not actually private, and that privacy law therefore does not apply. In fact, if the information identifies a specific person, it is personal information, even if it’s in a work context and even if it is publicly available.
The Privacy Act’s definition of personal information isn’t limited to someone’s private life; it covers any information about an individual who is identified or reasonably identifiable. This includes work-related details. For example, a person’s name, work email, work phone number, title, and other employment details generally constitute personal information under Australian law. It also includes an opinion about an identified or reasonably identifiable individual.
There’s no blanket exception for contact details just because they’re collected in a business context – they still identify an individual. The only time “business information” would not be personal is if it’s solely about a company or other entity with no identifiable individual (e.g. a company’s financials without personal identifiers). But the contact information on a business card clearly identifies a person, so it’s protected. Opinions linked to that contact information may also be personal information.
Treat work contact details (names, corporate email addresses, direct phone numbers, etc.) as personal information. They fall under the Privacy Act just like personal contact details. Ensure your privacy practices (like securing data and providing privacy notices) cover employee and client contact information, not just “private” data.
The Privacy Act is often said not to apply to “small” businesses (those with $3 million or less in annual turnover). A common myth is that if your business is below this threshold, you have no privacy obligations at all.
In reality, many small businesses do have to comply with the Privacy Act depending on the context of their business. It’s true that the Privacy Act has a general small business exemption (section 6D) for those under the $3 million turnover threshold. However, there are important exceptions. For example, regardless of turnover, the Privacy Act applies to businesses that provide health services and hold health information, trade in personal information, provide services under a Commonwealth contract or are credit reporting bodies, and the tax file number rules apply to anyone who handles tax file numbers.
In short, “small” does not mean “automatically exempt”. The law carves out several categories of small entities that must follow the APPs. Reforms are also under discussion to remove or narrow the small business exemption in the future.
If your business is under the $3 million threshold, carefully consider whether your business falls under an exception. For example, health service providers, businesses trading in personal information, Commonwealth contracted service providers, credit reporting bodies, those handling tax file numbers, etc. all must comply. When in doubt, consider erring on the side of compliance – good privacy practice is advisable even if you might be exempt, and is useful if your business aims to grow beyond the $3 million annual turnover threshold. Remember, even if you do fall within the small business exemption, other privacy regulation may apply, for example State or Territory laws.
With the rise of cloud computing, some organisations think that if they store personal data with a cloud service (especially a big, “trusted” provider), the provider will handle all privacy compliance. This is a myth. Under the Privacy Act, your organisation or agency is responsible and accountable for protecting personal information, even if the information is handled by a cloud or third-party service.
Using a cloud provider doesn’t outsource liability for your legal obligations. In fact, if your cloud service is overseas, Australian law (APP 8 and section 16C of the Privacy Act) makes your organisation accountable for any privacy breach by that overseas service provider as if it were your own. The overseas provider may also be captured under the extraterritorial application of the Privacy Act. You must take reasonable steps to ensure the cloud provider handles the information in accordance with the Australian Privacy Principles. Even if you use domestic cloud services, you are still deemed to “hold” the personal information and must secure it (APP 11) and manage it properly.
The OAIC advises that you use contracts and due diligence to ensure third-party processors (including cloud vendors) protect the data and enable you to comply with the APPs. In short, you can’t pass the buck entirely. If there’s a breach or misuse in the cloud environment, your organisation or agency will be on the hook under the Privacy Act (unless a specific exception applies).
Choose cloud providers carefully, but remember you (the APP entity) are ultimately responsible for how personal information is handled. You can’t simply assume that your service provider will manage the information securely or will do all the other things necessary for privacy compliance. Conduct a Privacy Impact Assessment to understand the type of data you intend to store in the cloud, and the protections that each cloud provider offers to ensure compliance with the Privacy Act. Use contracts to require contextually appropriate privacy and security measures, and monitor compliance through regular audits.
It might seem logical that if personal information is publicly available, for example on a LinkedIn profile or a public registry, then privacy law doesn’t apply. After all, it’s already “out there”. However, the Privacy Act does not have a blanket exemption for publicly available information.
Personal information doesn’t lose protection under the Privacy Act just because it’s public. The Privacy Act’s definition of personal information covers any information about an identifiable individual, whether it’s sensitive, confidential, or public. For example, a person’s address or phone number published online is still their personal information.
An organisation collecting or using such data must still comply with the APPs. This includes, for example, collecting by fair means and only using the information for a legitimate purpose. There are some specific provisions in the APPs that consider context. For example, in some cases APP 3 allows collection of sensitive information if it’s already publicly available through a deliberate action by the individual. But these are narrow exceptions, not a free pass to reuse data in any way. Additionally, other laws (like spam and marketing laws) restrict the use of public contact information for marketing without consent. Privacy laws can also prevent organisations from gathering public information to produce dossiers or to conduct surveillance activities. The bottom line is that “public” doesn’t mean “no privacy rights”. If you compile or use data about people from public sources, you still need to ensure compliance with the applicable APPs.
Don’t assume you can use personal data without limits just because it was public or open source. Privacy laws still apply to publicly available personal information. Apply the same principles of transparency, fairness, and purpose limitation when handling such data. Always consider why the data was public and whether the individual would expect you to collect it and use it. If in doubt, get consent or legal advice.
“Get consent for everything” is a mantra some organisations follow, fearing that any handling of personal information without the individual’s permission is illegal. To be fair, it’s quite a good start. In truth, the Privacy Act does not require express consent for all personal data handling. Instead, consent is just one ground for lawful use, and sometimes you can collect, use or disclose information under other provisions.
Consent is required in certain areas. For example, collecting sensitive information (such as health information) generally requires consent under APP 3, using or disclosing personal information for a secondary purpose often relies on consent where no other exception applies, and some forms of direct marketing under APP 7 require consent.
But the APPs also allow many activities without consent, provided other conditions are met. For instance, you can collect personal information (other than sensitive information) that is reasonably necessary for your functions or activities, use or disclose it for the primary purpose for which it was collected (or a related secondary purpose the individual would reasonably expect), and handle it where that is required or authorised by law or where a permitted general situation applies.
Privacy law doesn’t always mean you must get consent before dealing with personal information – consent is only one option, and the law provides for other bases. Over-relying on consent can even be problematic if people don’t truly understand what they’re agreeing to. The key is to know when consent is required versus when the law allows you to proceed without it (while still respecting privacy).
Importantly, you can’t get individuals to consent to absolving you of all your obligations under the Privacy Act. For example, asking your customers to consent to your business taking no security precautions does not relieve your business of its security obligations under APP 11.
Consent is not the only way to legally handle personal information. Use consent when it’s required or the best choice (e.g. for sensitive information and secondary use cases), but remember that the APPs also permit handling of personal information for core business purposes, contractual necessity, legal obligations, or public interests without consent. Always ensure you have some legal basis – just know that the basis doesn’t always have to be consent.
The Privacy Act’s employee records exemption (section 7B(3) of the Act) is often misunderstood. Some employers think it means “anything goes” with employee personal information, that they can use or disclose employee data however they like because it’s exempt. This is not the case.
The employee records exemption is actually quite narrow. It only applies to acts or practices directly related to a current or former employment relationship and an “employee record” held by the employer. In practice, that means if you’re a private sector employer, you don’t have to follow the APPs for most internal HR matters about current or past employees. For example, handling payroll, managing performance reviews, or keeping emergency contact details on file would be exempt. But there are limits: the exemption does not cover job applicants, contractors or volunteers; it does not extend to acts or practices that are not directly related to the employment relationship (for example, disclosing employee details to a third party for marketing or selling staff data); and it does not apply to Australian Government agencies at all.
OAIC guidance makes clear that the exemption cannot be used as a blanket excuse to misuse employee data. Also, if employee data is used for purposes like commercial research or shared across entities, caution is needed because other laws (and employee trust) come into play. In short, the employee records exemption is there to allow “business-as-usual” human resources management without cumbersome requirements, not to permit exploitation of employee information.
The employee records exemption is limited. It only covers current/former employees and only for purposes directly related to employment. Don’t overstep. Using employee data for non-HR purposes (or dealing with contractors’ or applicants’ data) still triggers full Privacy Act obligations. Even exempt data should be handled with care to maintain trust and meet other legal duties (like confidentiality and Fair Work obligations).
Australia’s Notifiable Data Breaches scheme requires organisations to notify affected individuals and the OAIC when a data breach is “likely to result in serious harm”. A myth has arisen that if stolen or lost data was encrypted, you’re off the hook and no notification is needed. The truth is that encryption is a major mitigating factor, but you must assess the situation: not every incident involving encrypted data is automatically non-notifiable.
It’s correct that strong encryption can reduce the risk of harm. If a laptop is lost but the personal data on it is encrypted to a high standard (and the encryption keys are strong and secure), unauthorised access is unlikely, and that may mean the incident is not an “eligible data breach” under the law. The Privacy Act even notes that if data is lost but later recovered or protected (for example via remote wipe or robust encryption), it might not require notification.
However, you need to consider the specific circumstances: was the encryption robust, and are the decryption keys safe? The OAIC advises that, if the circumstances suggest the data could still be accessed, for example the attacker also obtained the encryption key or the encryption used was outdated or weak, you should not assume the data is secure. If a database was encrypted but hackers got the password, the encryption won’t prevent harm. In such cases, serious harm could be more likely, triggering notification obligations.
In essence, encryption shifts the analysis to the strength of the security: there is probably no need to notify in relation to “adequately encrypted” data that remains inaccessible. But if the encryption is compromised or insufficient, the incident should be treated like any other breach. Always document your reasoning, because you may need to justify why you did or did not notify.
Encryption can save you from having to notify, but only if it truly protected the data. After a breach, assess whether the personal information was encrypted to a high standard and that keys remain secret. If so, and there is no likely harm, you might not have to notify. If there’s any reasonable chance the data can be decrypted, err on the side of notifying and remember to improve your encryption practices going forward.
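As a practical illustration, here is a minimal sketch (in Python, using the third-party cryptography library; the data and key-storage arrangement are hypothetical) of encrypting a record at rest with an authenticated scheme while keeping the key away from the data it protects, the kind of posture that supports an assessment that the information remains inaccessible.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generate a key once and store it separately from the data it protects,
# e.g. in a key management service, never alongside the encrypted records.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a record before it is written to disk or a backup.
record = b"Jane Citizen, DOB 1985-03-12, diagnosis: asthma"
ciphertext = fernet.encrypt(record)

# Without the key, the ciphertext alone reveals nothing useful; if a laptop
# or backup containing only ciphertext is lost, serious harm may be unlikely.
# If the key is lost or stolen with it, the encryption provides no protection.
assert fernet.decrypt(ciphertext) == record
```

The design point is separation: the breach analysis turns as much on where the keys live and how strong the algorithm is as on the fact that encryption was used at all.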
Some organisations think privacy rules only concern digital data, and that information in other forms (like hardcopy files, or video footage from security cameras) falls outside privacy law. This is a myth. The Privacy Act is “technology neutral” and applies to personal information in any format. In fact, the Privacy Act explicitly states it covers information “whether recorded in a material form or not”.
Personal information can be on paper, in a photograph, in an audio recording, on video, a gym workout plan, or just about any medium – what matters is whether it’s about an identifiable individual. A cabinet full of paper customer records certainly contains personal information and must be handled with the same care as an electronic database. Likewise, CCTV footage that can identify people (by their face, licence plate, etc.) is their personal information. If an organisation (that is covered by the Act) operates CCTV, the recordings of individuals are subject to the APPs. For example, they should be secured and only used or disclosed for legitimate purposes. Absolutely do not use the footage to create a “best of” reel for the Christmas party (yes, this happened). Guidance from the OAIC confirms that it does not matter what form personal information is in. Even a video without a name attached is still capturing someone’s image, which can identify them.
Note that, whilst surveillance devices may also be governed by State and Territory laws, from a privacy perspective physical and visual information still counts.
Don’t let the format fool you. Personal information is protected whether it’s on paper, film or stored digitally. Lock those filing cabinets and control access to CCTV footage just as you would protect personal data in a computer system. All APP principles (security, access, use limits, etc.) apply to information in any medium, not just online data.
There’s a lingering notion that technical data, for example IP addresses, device IDs, or other online identifiers, are just numbers and not “personal information”. Lots of tech companies and online advertising organisations yearn for this to be true. In truth, an IP address or similar identifier can be personal information if it can reasonably be linked to an individual. The Privacy Act’s definition is broad and focuses on whether an individual is reasonably identifiable from the information.
An IP address identifies a device’s connection to the internet, and often, through logs or service provider records, it can be tied to a particular user or household. The OAIC and courts have indicated that information such as IP addresses, MAC addresses, or cookies may constitute personal information when combined with other data or when an entity can look up the individual behind them. For example, a web service might know that IP 203.0.X.X belongs to a specific customer’s account, thus the IP address can identify a customer. The Australian Government’s own guidance lists “IP address” among examples of personal information.
Similarly, location data, device identifiers, or other unique numbers can be personal information if they pinpoint an individual. It depends on the context: one isolated IP string might not readily identify someone, but if your systems or processes can associate it with a user profile, it is personal information. Also, under data retention laws, certain telecommunications metadata is expressly treated as personal information. The safest approach is to treat technical identifiers as personal information when they relate to an individual’s device or activity. This ensures you handle things like analytics data or cookie IDs in compliance with the APPs (e.g., securing them, being transparent, etc.).
Don’t write off IP addresses or device IDs as non-personal. If an identifier could be used to single out or recognise an individual, it likely meets the definition of personal information. Apply privacy protections to technical data just as you would to names and email addresses, especially since combining identifiers with other data usually reveals who the person is.
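By way of illustration, the following is a minimal sketch (in Python, with hypothetical log entries and account records) of how an IP address in a web server log becomes personal information the moment your systems can associate it with an account holder.

```python
# Hypothetical web server access log entries: (ip_address, timestamp, path).
access_log = [
    ("203.0.113.7", "2025-06-01T09:15:00", "/account/settings"),
    ("198.51.100.4", "2025-06-01T09:16:10", "/products"),
]

# Hypothetical account records keyed by the last IP address used to log in.
accounts_by_ip = {
    "203.0.113.7": {"customer_id": 1042, "name": "Jane Citizen"},
}

# Because the organisation can look up the individual behind the address,
# the matching log entries are about a reasonably identifiable individual.
for ip, timestamp, path in access_log:
    holder = accounts_by_ip.get(ip)
    if holder:
        print(f"{ip} at {timestamp} ({path}) -> {holder['name']}")
```

Once that lookup is possible, the log data should be secured, minimised and disclosed only as the APPs allow, just like any other personal information.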