15 September 2022
Data generation is currently increasing at exponential rates, driven by the rise of e-commerce, the continued growth of digital platforms and further digitisation through 5G-connected devices. Alongside growing business demand for ‘deep data sets’ to inform decision-making and power artificial intelligence (AI) applications, more data is being collected from consumers and sourced from third parties through complex data sharing arrangements.
Many businesses have tended to view customer data as their own proprietary information. Under this ‘proprietary model’, customer data is treated as a business asset to be commercialised or protected as a competitive advantage. However, growing consumer demand for transparency is beginning to shift business thinking and the way products and services are marketed, and the ‘proprietary model’ is also being disrupted by changing privacy regulation. Privacy law reform is occurring on a global scale, with many jurisdictions harmonising their laws with the protections under the European Union (EU) General Data Protection Regulation (GDPR).
Globally, regulatory reforms tend to be centred on two themes: strengthening individual control over the collection and use of personal data, and regulating AI and the ‘deep data sets’ that power it.
The GDPR requires that individuals be informed of, and consent to, the collection and use of their personal data, and that this consent be given through a clear, affirmative act. It is not sufficient to rely on a consumer’s silence or on pre-checked default settings. Further, where personal information is processed for multiple purposes, consent must be obtained for each purpose rather than bundled.
Australian privacy law is expected to adopt this approach and be more prescriptive about what constitutes ‘informed consent’. The Federal Government is currently conducting a review of the Privacy Act 1988 (Cth) as part of its response to the Australian Competition and Consumer Commission (ACCC) Digital Platforms Inquiry.
The proposed reforms are expected to include the requirement that consent be informed and ‘clearly, affirmatively and unambiguously’ given for any collection, use and disclosure of personal information that is not necessary to perform the services under the contract to which the consumer is a party. It is also expected that social media services and online platforms will be regulated under a proposed ‘Online Privacy Code’, which will contain detailed requirements for how user consent is obtained and the purposes for which personal information may be used or disclosed.
Australia’s sector-by-sector rollout of the Consumer Data Right (CDR) is also expected to effectively prescribe how data may be transferred between competitors at the express request of the consumer, and has the potential to supersede existing practices of data sharing and data scraping currently leveraged by business.
Some jurisdictions have introduced protections that go much further than the GDPR. For example, in the United States, the California Consumer Privacy Act (CCPA) creates a right for consumers to know when their personal information is sold and disclosed and to whom, as well as a right for consumers to opt-out of the sale of their information to third parties. This is in addition to a right of erasure of personal information found under both the CCPA and GDPR. There is also a suggestion that the state of California will implement a ‘data dividend’ that could see companies charged for the information they collect from consumers. These reforms go further than anything currently being considered in Australia, but demonstrate a global trend towards greater individual control over the use of one’s personal data.
Regulators across the globe are also turning their attention to the use of AI and the ‘deep data sets’ required to drive this technology. The proposed EU AI Act aims to regulate the development of AI by providing a framework of obligations for developers, deployers and users that is underpinned by a risk categorisation system. ‘High risk’ systems would be subject to the most stringent obligations, including that data sets be subject to governance and management practices to identify biases and be checked for inaccuracies, and a requirement to be transparent about when AI is used.
These issues are also being considered in Australia, with the Australian Human Rights Commission recently highlighting the human rights risks associated with the use of AI and recommending legislation to regulate facial recognition technology, together with a moratorium on its use until such legislation is in place.
Going forward, General Counsel should carefully track global privacy reforms, and work closely with CIOs to establish a data governance framework that enables regulatory compliance. The trend towards combining data from different sources within a business to create a single ‘source of truth’ will require navigation of complex legal and ethical considerations. Without proper governance, data originally sourced from a customer database may be used by another part of the business for a purpose it was not collected for, resulting in regulatory issues and the erosion of customer trust.
Through sophisticated governance structures, and by providing consumers with informed choices over how their data is collected and used, businesses will be able to better utilise the data assets they hold, and build and maintain trust with their customers.
Authors
Head of Technology, Media and Telecommunications
Head of Intellectual Property
Senior Associate
Law Graduate
This publication is introductory in nature. Its content is current at the date of publication. It does not constitute legal advice and should not be relied upon as such. You should always obtain legal advice based on your specific circumstances before taking any action relating to matters covered by this publication. Some information may have been obtained from external sources, and we cannot guarantee the accuracy or currency of any such information.