Friday, August 8, 2025
Australia's Privacy Laws: Guidance for Financial Advisers

Connor Disselkoen

In December 2024, the first tranche of major reforms to Australia’s Privacy Act came into effect, with more expected by late 2026. Financial advice practices must adapt to new transparency, security, and client-rights requirements, particularly around recording client conversations, automated decision-making, and data protection.
In short, the changes mean:
- Stronger regulator powers – the Office of the Australian Information Commissioner (OAIC) now has expanded enforcement options, including infringement and compliance notices and tiered civil penalties, and new criminal offences apply to doxxing.
- Privacy tort lands mid-2025 – individuals can sue for serious invasions of privacy committed intentionally or recklessly, even without proving damage.
- AI decisions must be highlighted – by December 2026, your privacy policy needs to disclose whether decisions that significantly affect clients are automated, the kinds of decisions involved, and what personal information is used.
- Secure your data – “reasonable steps” under APP 11 now expressly include both technical measures (like encryption) and organisational measures (like policies and training).
- Kids need protection too – a new Children’s Online Privacy Code must be in place by December 2026.
Useful References
- Australian Privacy Principles (APPs). OAIC’s Australian Privacy Principles guidelines. The definitive word on how APPs 1–13 govern data handling obligations – collection, use, disclosure, accuracy, access, security, and correction.
- Privacy Act Tranche 1 reforms overview. OAIC media centre announcement: “Passing of bill a significant step for Australia’s privacy law”. Breakdown of the changes that took effect from December 2024, including the statutory tort, expanded enforcement powers, and automated decision-making transparency.
- Automated Decision-Making (ADM) guidance. Commonwealth Ombudsman, OAIC & AGD’s Automated Decision-Making – Better Practice Guide (2025). Practical advice on what responsible ADM looks like, tied back to administrative law values and privacy principles.
- OAIC guidance on AI and privacy. OAIC’s Guidance on privacy and the use of commercially available AI products. How to handle generative AI: sensitive data, secondary use, consent, collection under APP 3, and caution around chatbots.
What This Means for Financial Advice Practices
- Policy transparency (APP 1) – Your privacy policy will need to call out automated decision-making (by December 2026) if it plays a material role in decisions that significantly affect clients. Keep it client-friendly, plain-English, and specific to your advice tools.
- Security obligations (APP 11) – You’re legally required to have both technical controls (encryption, access controls) and process controls (staff training, breach response plans) in place; a minimal encryption sketch follows this list.
- Access, correction & policy clarity (APPs 12, 13, and 1.3–1.4) – If your policy fails to meet expectations, the OAIC can hit you with infringement or compliance notices, so audit now rather than waiting.
- Legal risk (statutory tort) – Serious misuse of client information, or covert intrusion into a client’s private affairs, could trigger legal exposure, even if no damage is proven.
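By way of illustration only, here is a minimal Python sketch of the kind of technical measure APP 11 contemplates: encrypting a client record at rest with the cryptography library’s Fernet (authenticated symmetric encryption). The file names and key handling are assumptions for the example; in practice the key would sit in a managed secrets store and decryption would be limited to authorised, logged access.

```python
# Minimal sketch only: encrypting a client record at rest with Fernet
# (authenticated symmetric encryption from the `cryptography` package).
# File names and key handling are illustrative assumptions; in practice
# the key belongs in a managed secrets store, not alongside the data.
from pathlib import Path

from cryptography.fernet import Fernet


def encrypt_record(plain_path: Path, enc_path: Path, key: bytes) -> None:
    """Encrypt a client record so it is unreadable if the file is exfiltrated."""
    enc_path.write_bytes(Fernet(key).encrypt(plain_path.read_bytes()))


def decrypt_record(enc_path: Path, key: bytes) -> bytes:
    """Decrypt a record for an authorised, logged access request."""
    return Fernet(key).decrypt(enc_path.read_bytes())


if __name__ == "__main__":
    key = Fernet.generate_key()  # illustrative only: load from a secrets manager instead
    Path("client_note.txt").write_text("Example client file note")
    encrypt_record(Path("client_note.txt"), Path("client_note.enc"), key)
    print(decrypt_record(Path("client_note.enc"), key).decode())
```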
Best Practice Recommendations
- Governance – Update your privacy policy to cover automated decision-making use (APP 1.7–1.9), security measures (APP 11.3), and access and correction pathways (APPs 12 and 13), and keep it in plain English.
- Team awareness – Train staff on:
  - recognising when a decision is automated,
  - explaining it plainly, and
  - responding when a client asks how a decision was made.
- Security posture review – Map your technical safeguards (encryption, MFA) and organisational ones (data access roles, retention, breach response), and keep evidence of both, for the OAIC or your own clarity; a simple evidence-register sketch appears after this list.
- Incident response – Ensure your data breach response plan reflects the current notification thresholds and sets out who does what. Keep it updated and tested.
- Client communication – Before you record meetings, get clear consent. If there’s any automated element in your process, say so.
- Monitoring and adaptation – Watch for:
  - OAIC guidance on ADM transparency (APP 1), and
  - development of the Children’s Online Privacy Code, due by December 2026.
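For the “keep evidence” point above, a lightweight register can be enough. The sketch below is an assumed structure, not an OAIC template: it lists each safeguard, whether it is technical or organisational, who owns it, and when it was last reviewed, then writes the register to CSV for your compliance file.

```python
# Minimal sketch of a safeguards evidence register: an assumed structure,
# not an OAIC template. It records each control, whether it is technical
# or organisational, who owns it, and when it was last reviewed, then
# writes the register to CSV for the practice's compliance file.
import csv
from dataclasses import dataclass, asdict
from datetime import date


@dataclass
class Safeguard:
    name: str            # e.g. "Encryption at rest"
    kind: str            # "technical" or "organisational"
    owner: str           # role responsible for the control
    last_reviewed: date  # evidence of ongoing review


register = [
    Safeguard("Encryption at rest", "technical", "IT provider", date(2025, 7, 1)),
    Safeguard("Multi-factor authentication", "technical", "Practice manager", date(2025, 7, 1)),
    Safeguard("Data breach response plan", "organisational", "Compliance officer", date(2025, 6, 15)),
    Safeguard("Annual privacy training", "organisational", "Practice manager", date(2025, 5, 20)),
]

with open("safeguards_register.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "kind", "owner", "last_reviewed"])
    writer.writeheader()
    writer.writerows(asdict(s) for s in register)
```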