TMT: The Australian Landscape - October 2019 - Corrs Chambers Westgarth - www.corrs.com.au
Corrs Chambers Westgarth is Australia's leading independent law firm.

Welcome to the second edition of TMT: The Australian Landscape.

Many of the articles in this edition consider the issue of personal data, which is the subject of considerable regulatory reform in Australia. We consider the key outcomes of the Australian Competition and Consumer Commission's Digital Platforms Inquiry, which is likely to pave the way for significant data privacy reform in Australia (including consent requirements which go beyond those required under the EU's General Data Protection Regulation), but unlikely to deliver anticipated changes to the regulation of the media sector.

Data privacy is increasingly becoming a consumer protection issue in Australia, with regulators seeking to provide consumers with greater control over data that is collected about them. Companies doing (or considering doing) business in certain industries in Australia will need to be aware of the new 'Consumer Data Right', which will activate data portability for consumers as well as businesses across a range of industries. We also consider new trends in how the Australian Foreign Investment Review Board will review an overseas company's investment in an Australian company with a significant personal data holding, while the Australian stock exchange is welcoming an increasing number of IPOs from overseas tech companies.

As artificial intelligence continues to become more prevalent in Australian businesses, we consider questions of legal liability for AI and how AI might be used to assist with decision making by the boards of Australian companies. We also look at an important decision of an Australian court which could see social media platforms liable for defamatory comments published by their users.

Finally, this edition also contains recommendations for medtech and other start-ups about managing IP and regulatory issues.

Please enjoy this edition and feel free to contact us if you have any queries.

James North
Partner and Head of Technology, Media and Telecommunications
Tel: +61 2 9210 6734 | Mob: +61 405 223 691
james.north@corrs.com.au

Frances Wheelahan
Partner and Editor of TMT: The Australian Landscape
Tel: +61 3 9672 3380 | Mob: +61 419 517 506
frances.wheelahan@corrs.com.au
Contents

Technology
'Informed choice': significant data privacy reforms on the horizon for Australia
Australia builds its open data economy: Consumer Data Right passes parliament
Key legal issues for medtech start-ups
Liability for AI: considering the risks

M&A and Capital Markets
AI in the boardroom: could robots soon be running companies?
Is the ASX becoming the new NASDAQ? How growth-stage tech companies are finding a warm welcome down under
FIRB increases its focus on data

Media
The ACCC Digital Platforms Inquiry Final Report: is media law reform destined for the too hard basket once again?
'Innocent dissemination' and 'secondary publisher' defences no longer available to owners of Facebook pages: NSW Supreme Court

Intellectual Property
Protecting your IP assets: key issues for tech start-ups and emerging companies

Contacts
Technology

'Informed choice': significant data privacy reforms on the horizon for Australia

By James North, Head of Technology, Media and Telecommunications and Jennifer Dean, Special Counsel

Following its recent detailed examination of the functioning of Australia's digital economy, the Australian Competition and Consumer Commission (ACCC) has released its Digital Platforms Inquiry (DPI) Final Report.

The ACCC's recommendations are wide ranging, and include a series of proposals relating to data privacy which, if implemented, would have broad impacts across the entire economy and significant implications for global businesses that deal with Australian consumers. We also see the potential for some unintended adverse outcomes for consumers.

The case for reform

The primary focus of the DPI was digital platforms and the media. Digital platforms typically operate under a distinct business model, providing services to consumers for zero monetary cost in exchange for their attention and use of their data. The platforms then 'monetise' that data by selling targeted advertising, from which they earn the majority of their revenue. This business model poses some specific challenges in terms of data privacy, but the ACCC makes a case for 'economy-wide' reforms, citing a number of other sectors with data practices it considers to be similar, including financial institutions, telecommunications service providers, retailers offering rewards schemes, airlines and media businesses.

Concerns regarding current practices

It is fair to say that Australian data privacy regulation has not kept pace with the multiple ways in which businesses collect, use, share and deal in data as part of the digital economy. For the ACCC, however, this is not only about privacy, but also consumer protection.

In its analysis of consumer welfare, the ACCC places significant weight on consumer survey data which indicates a strong consumer preference for having control over the data collected about them (especially location data and internet browsing data) and how it is used and disclosed. While these results are hardly surprising, what the surveys do not appear to address is whether consumers value this control more than some of the benefits that access to data drives (e.g. improvements to the quality of services or the ability to offer services for free).

The ACCC is highly focused on the importance of consumers being able to make 'informed choices' about the handling of their data. Some of its key findings in this context include:

• Bargaining power imbalances and information asymmetries between digital platforms and consumers create inherent difficulties for consumers in accurately assessing the current and future consequences of providing their user data.
• Consumer consents using click-wrap agreements with take-it-or-leave-it terms that 'bundle' a wide range of consents mean that consent is not truly informed or voluntary.
• Many privacy policies are long, complex, vague and difficult to navigate.

Key recommendations

Most of the ACCC's recommendations would bring Australian privacy law into closer alignment with the European Union General Data Protection Regulation (GDPR). However, the ACCC's recommendations regarding consumer consent appear to be stricter than the GDPR in some respects. The ACCC's key recommendations are:

• Strengthened protections in the Privacy Act (in line with the GDPR). A range of amendments are intended to broaden the definition of 'personal information' to encompass technical data (such as location data and IP addresses) and impose more prescriptive notification requirements at the time of collection.
• Strengthened consent requirements in the Privacy Act. These would require consumer consent for any collection, use or disclosure that is not necessary for the performance of a contract to which the consumer is a party (with some limited exceptions). Significantly, the ACCC does not recommend adoption of the GDPR exception for use or disclosure for the 'legitimate interests' of the collector. Separately, it has recommended that valid consent must be clear, affirmative (i.e. default settings should not allow collection and processing), specific (i.e. consents should not be bundled), unambiguous and informed.
• A prohibition against unfair contract terms. The ACCC has recommended that unfair contract terms be prohibited and not just void, meaning that civil pecuniary penalties would apply to their use. This could add significantly to the compliance burden for businesses contracting with Australian consumers and small businesses.
• A direct individual right of action for an interference with privacy and increased penalties.
• A new Privacy Code specifically for digital platforms.
• A statutory tort for serious invasions of privacy.
• A prohibition against certain unfair trading practices (beyond unfair contracting).

What's ahead?

Global convergence towards GDPR standards means that the ACCC recommendations that align with the European privacy regime are unlikely to impose significant additional regulatory burdens on the majority of businesses operating in Australia. However, the recommendations relating to consent, which are stricter than the GDPR protection standard, are likely to present a greater compliance challenge. In particular, when coupled with unbundling consents, more stringent consent requirements could present real IT system challenges, with systems needing to be able to record and implement diverse consent patterns on an individual consumer level based on the particular services acquired.
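To make the practical implication concrete, the sketch below shows one hypothetical way a system might record unbundled, per-purpose consents so that collection and processing decisions can be checked against an affirmative opt-in. The field names and structure are illustrative assumptions only; they are not drawn from the DPI Final Report or any prescribed Australian format.

```python
# Illustrative sketch only: per-purpose consent records under an 'unbundled'
# consent model, where defaults never imply consent and each purpose is
# recorded separately for each service the consumer has acquired.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ConsentRecord:
    consumer_id: str
    service: str                      # the particular service acquired
    purpose: str                      # e.g. "location_analytics", "ad_personalisation"
    granted: bool                     # affirmative opt-in recorded explicitly
    recorded_at: datetime
    expires_at: Optional[datetime] = None

def may_process(records: List[ConsentRecord], consumer_id: str,
                service: str, purpose: str) -> bool:
    """Return True only if a current, specific consent exists for this purpose."""
    now = datetime.now(timezone.utc)
    return any(
        r.consumer_id == consumer_id
        and r.service == service
        and r.purpose == purpose
        and r.granted
        and (r.expires_at is None or r.expires_at > now)
        for r in records
    )
```

Under a model like this, a single bundled 'I agree' flag is no longer enough: each combination of service and purpose carries its own record, which is what drives the system-level complexity noted above.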
Further, both the consent recommendations and the proposed digital platforms privacy code arguably raise some fundamental issues in relation to the way digital platforms operate. The ACCC has acknowledged that data collection drives the ability to offer valuable services without charge and to improve those services over time. In an individual case, much of the data collected may not be necessary for the provision of the particular digital service a consumer is receiving. However, the potential cumulative impact of successive decisions by consumers to refuse consent for such data collection (or a simple failure to adjust mandated default settings which would prevent the collection) has not been addressed by the ACCC, either in terms of quality of service or the ability to offer services at no charge.

Perhaps the key takeaway from the data privacy sections of the DPI Final Report is that the ACCC does not view data privacy as an issue solely for privacy regulation – instead, it is thinking about it as a consumer issue that may equally be addressed under the Australian Consumer Law.

Australian privacy law reform is perhaps inevitable. In line with other jurisdictions, such as the US and Germany, we also expect to see the ACCC pursue enforcement action under competition or consumer protection legislation to address data privacy issues.
Australia builds its open data economy: Consumer Data Right passes parliament

By Philip Catania, Partner, Arvind Dixit, Partner and Kit Lee, Law Graduate

After approval in the Senate on 1 August 2019, the Consumer Data Right (CDR) Bill has been passed in both houses of parliament. Based on consumer choice in the context of data, the CDR has, at its heart, an increase in competition flowing from a person's right to control their data.

The 'Big Four' banks have already voluntarily implemented the CDR in relation to certain product data available on credit and debit card, deposit and transaction accounts, and must provide access to data related to mortgage accounts by 1 February 2020. The CDR will also be implemented in the energy and telecommunications sectors, followed by other sectors that are yet to be determined.

The CDR is not just relevant to businesses in the sectors noted above, however – all businesses that collect and handle consumer data should familiarise themselves with key aspects of the CDR.

As Australia reforms its privacy regime in a manner that reflects aspects of the European General Data Protection Regulation (GDPR), the CDR is likely to be used as the mechanism to achieve data portability across a range of sectors.

Essentially, the CDR empowers customers to access and use data that businesses hold about them. Consumers can obtain their data held by third parties for themselves, or authorise the secure sharing of their data with accredited third parties (such as comparison services who provide consumers with tools to make more informed choices).

Who does the CDR apply to?

The four key players in the CDR system are:

1. Data holders. Data holders are entities within a particular class of persons who hold CDR data within a prescribed class of information.
2. Accredited data recipients. Accredited data recipients must be licensed to receive data through the CDR system.
3. Designated gateways. Designated gateways are responsible for facilitating the transfer of information between data holders and accredited persons.
4. Consumers who exercise the rights under the CDR. CDR consumers are identifiable or reasonably identifiable persons, including a business enterprise, to whom the CDR data relates because of a supply of a good or service to the person.

The principle of 'reciprocity' applies to accredited data recipients. Under this principle, accredited data recipients can also be classified as data holders for certain data (e.g. where they provide similar services to an entity listed in the designated class), meaning they will be required to share data with other recipients. Organisations that wish to become accredited recipients in order to improve their particular customer service offerings should be aware of their obligations as data holders under this principle. This principle creates a network of back and forth sharing between all entities within the CDR system, creating greater opportunities for consumers.

Given that organisations also qualify as 'consumers', businesses (especially those entities that maintain large data repositories) should contemplate the ways in which they might take advantage of their data rights. Equally, given that large organisations may request transfers of their data, and given the costs and infrastructure required to engage in such transfers, businesses should be aware of the need to facilitate potentially significant transfers of data.

What data does it apply to?

Only data that qualifies as 'Consumer Data' may be transferred under the CDR system. Data will only be considered Consumer Data if it is:

• data generated or collected in Australia by an Australian person; or
• data generated or collected by an Australian person where the data relates to an Australian person or products/services offered to an Australian person.

'Consumer Data' includes all types of data that meet the above requirements, not just personal information. Businesses will need to take steps to identify and categorise the various datasets which fall under the CDR system.

It could comprise the following types of data:

• Data under designation instruments. The CDR only applies to data that has been specified under a designation instrument (e.g. product data available on credit and debit card, deposit and transaction accounts). As noted above, the CDR will be rolled out across relevant sectors in stages, and businesses may have an opportunity to take incremental steps to uplift their systems, depending on the approach adopted under each sector's designation.
• Third party data. It is possible that third party datasets could fall within the definition of 'Consumer Data', as entities commonly collect information about their consumers from third party providers. If such information is subject to proprietary restrictions or confidentiality arrangements, data holders could be requested to share information which they have no contractual right to disclose. Businesses should be alert to this potential conflict if engaging third party data providers that limit the use and disclosure of the relevant data.

• Derived data. CDR data includes data that is 'directly or indirectly derived from other CDR data', meaning that data which has been transformed in the hands of the data recipient or data holder is subject to the regime. This means that where an organisation takes steps to create unique insights about data in relation to a consumer, it may be required to share those insights under the CDR system. This issue was raised by a number of organisations as part of the CDR consultation process, and as a result, data that has been materially enhanced (e.g. data whose value has largely been generated by the actions of the data holder) will not be subject to disclosure with regards to the first tranche of banking data. Instead, only collected data (e.g. raw transaction data) and immaterially derived data (e.g. fees charged, calculated account balances and interest accrued on accounts) are subject to the CDR. However, the issue needs to be addressed on a sector by sector basis.

• Chargeable data. Organisations may charge fees for the use and disclosure of certain datasets, to be determined by the regulator. In determining whether data is 'chargeable', the regulator must consider whether:
  - the data includes intellectual property or would be an acquisition of property;
  - organisations currently charge fees for disclosing data;
  - the incentive to generate data would be reduced; or
  - the marginal cost of disclosure would be significant.
  This clearly contemplates the situation in which value-added data is designated under the CDR system, and attempts to provide organisations with compensation for data which they transform for commercial purposes. However, chargeable data is subject to various restrictions and organisations should be aware of how their current costing structures will be affected.

Privacy safeguards

The consumer data rules establish privacy safeguards, which are additional privacy protections offered to consumers, enforced by the Office of the Australian Information Commissioner (OAIC).

These safeguards provide consumers with avenues to seek remedies for breaches of their privacy or confidentiality (including access to internal and external dispute resolution and direct rights of action), and also establish obligations to provide anonymity and pseudonymity to consumers, and to destroy or de-identify redundant data.

Organisations will need to be aware of the intersecting relationship between the privacy safeguards and the Australian Privacy Principles (APPs):

• For data holders, some of the privacy safeguards apply concurrently with the APPs whilst others do not apply.
• For accredited data recipients, the privacy safeguards will largely substitute the APPs, but only in respect of CDR data.
• For accredited persons, the two regimes apply concurrently, but with the more specific privacy safeguards prevailing.

The implementation of privacy safeguards as an additional set of privacy obligations was met with criticism through the public consultation process. In particular, there was significant concern regarding the multiplicity of obligations that data holders and data recipients would be subject to under the CDR, the APPs and, for entities operating in the EU, the GDPR.

The overlapping application of these regimes will mean that organisations may need to consider segregating their data into specific categories so that the various regulatory requirements under each can be managed and complied with.
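One practical way to approach that segregation, sketched below purely as an illustration, is to maintain a catalogue that tags each dataset with the regimes that may apply to it, so the relevant obligations can be looked up before data is used or disclosed. The regime names and example datasets are assumptions made for this sketch, not a prescribed taxonomy.

```python
# Hypothetical sketch: tagging datasets with the regimes that may apply to them
# so obligations can be identified per dataset before use or disclosure.
from enum import Enum, auto

class Regime(Enum):
    CDR_PRIVACY_SAFEGUARDS = auto()   # Consumer Data Right privacy safeguards
    APPS = auto()                     # Australian Privacy Principles
    GDPR = auto()                     # relevant only to entities operating in the EU

# Example catalogue: dataset name -> regimes assumed to apply (illustrative only).
CATALOGUE = {
    "banking_transactions_raw": {Regime.CDR_PRIVACY_SAFEGUARDS, Regime.APPS},
    "eu_marketing_profiles": {Regime.APPS, Regime.GDPR},
    "materially_enhanced_insights": {Regime.APPS},
}

def regimes_for(dataset_name: str) -> set:
    """Look up which regimes are recorded against a dataset."""
    return CATALOGUE.get(dataset_name, set())

print(regimes_for("banking_transactions_raw"))
```

The value of a catalogue like this lies less in the code than in the discipline it forces: every dataset carries a maintained record of which rules govern it.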
All organisations that collect and handle consumer data should monitor the implementation of the CDR across the banking, energy and telecommunications sectors, and consider the practical measures that can be implemented in order to future-proof their own operations (including accurately auditing and categorising existing, and potential future, data assets).
Key legal issues for medtech start-ups

By Frances Wheelahan, Partner and Suman Reddy, Senior Associate

Australia's medtech sector has experienced rapid growth over the last decade, with a recent focus on digital health, connected devices, and artificial intelligence.

In this article, we discuss some of the key legal issues faced by start-ups entering the medtech space, focusing on proposed changes to the medical device regulatory regime, privacy and intellectual property laws.

1. Changes to the classification rules for SaMD (Software as a Medical Device)

Technology has evolved and diffused dramatically since the last major overhaul of the Australian medical device regime, which occurred in 2002. The changes to the Therapeutic Goods Act 1989 (Cth) (the Act) and the introduction of the Therapeutic Goods (Medical Devices) Regulations 2002 (Medical Devices Regulations) were intended to provide a best practice regulatory regime which harmonised Australia's requirements for quality, safety and performance with the higher standards enforced in Europe at the time.

However, Australia's medical device regulatory framework has not kept pace with the advances in information and communications technology which now underpin the focus of medtech innovation – particularly the development of standalone software and integrated technology platforms which can be used to diagnose or treat disease.

Given this, the Therapeutic Goods Administration (TGA) is poised to recommend the introduction of new regulations to govern SaMD, or software as a medical device. One of the most significant proposed changes will be the requirement to properly classify SaMD according to risk, in contrast to the present situation, which results in all SaMD being classified as Class I (i.e. the lowest risk classification of device), regardless of actual risk. This is because the current classification rules only consider the possible harm caused by a physical interaction of a medical device and a human.

The proposed changes to the rules will result in SaMD which is used directly in diagnosis or therapy being classified as Class IIa to III devices, both for new applications and for existing registrations. The only SaMD to remain as Class I would be lower risk software which directs patient activity based on a non-interactive intervention. This will align with international approaches, for example in the European Union, where rules for higher classifications have already been introduced. However, this will have a dramatic impact on the time and costs involved in registering (or maintaining the registration of) the SaMD on the Australian Register of Therapeutic Goods (ARTG). Medtech companies should review the proposed classification scheme in anticipation of the increased regulatory scrutiny which is likely to be imposed.

2. Cyber security

For any medical device to be included on the ARTG, the manufacturer must demonstrate compliance with the 'Essential Principles' contained in the Medical Devices Regulations. The Essential Principles require the minimisation of risks associated with the design, long-term safety and use of the device, which implicitly includes minimisation of cyber security risks.

However, the Essential Principles currently do not refer specifically to SaMD. This is a recognised gap, and one which the TGA plans to address by recommending changes to the Essential Principles to include clear and transparent requirements for demonstrating the safety and performance of SaMD and other regulated software. Proposed requirements include:

• any cyber security risks associated with network connectivity be minimised;
• that software be designed and produced using best practice software engineering principles;
• best practice cyber security principles be used regarding the risk of unauthorised access to the device; and
• medical devices be designed to facilitate software updates, and information about the clinical risk of an update be provided to the user.

Again, the proposed changes to the regime will necessarily involve additional effort and cost for manufacturers to systemise development and production practices, and document the evidence for assessment. The TGA also notes that in some cases, new quality management and development practices may have to be put in place to demonstrate compliance.

3. New penalties under the Privacy Act

All Australian companies (with limited exceptions) must comply with the Australian Privacy Principles (APPs) contained in the Privacy Act 1988 (Cth) (Privacy Act) when dealing with personal information. The APPs contain higher standards when dealing with health information.
Breaches of the APPs are subject to hefty penalties - up to A$2.1 million for the most serious and repeated breaches. However, it is likely that these penalties, together with the OAIC's enforcement powers, will be increased significantly in the near future. The Australian Government has proposed amendments to substantially strengthen the enforcement regime and align our legal framework more closely with the European GDPR.

The proposed amendments will increase the maximum penalty for entities subject to the Privacy Act to the higher of:

• A$10 million for serious or repeated breaches;
• three times the value of any benefit obtained through the misuse of information; or
• 10% of a company's annual domestic turnover.

The draft legislation is due for consultation before the end of 2019.
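The 'higher of' formula is simple enough to express directly. The sketch below is an illustration only, using the proposed maximums described in this article (they are not current law) and a hypothetical helper function.

```python
# Illustration of the proposed 'higher of' maximum penalty described above.
# Figures reflect the proposal discussed in this article, not current law.
def proposed_max_penalty(benefit_from_misuse: float, annual_domestic_turnover: float) -> float:
    return max(
        10_000_000,                       # A$10 million
        3 * benefit_from_misuse,          # three times the benefit obtained
        0.10 * annual_domestic_turnover,  # 10% of annual domestic turnover
    )

# Example: a A$2 million benefit and A$150 million turnover gives a A$15 million
# maximum, because the turnover limb is the highest of the three.
print(proposed_max_penalty(2_000_000, 150_000_000))
```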
4. Use of data in machine learning

Medical technology is increasingly incorporating elements of machine learning, which relies on continuous data analysis to "train" the algorithm to become more accurate over time. However, given the privacy constraints around secondary uses of health information, consent to the use of such data for machine learning purposes must be obtained from individuals. Data could be de-identified for this purpose; however, it may be argued that the process of de-identifying data is itself a "use" of data which requires consent under the APPs.

Medtech companies should both identify how they need to use the data they collect, and consider the potential ways in which they might plan to use that data in the future, and ensure that they have obtained the required consents to enable those uses.

Under the Privacy Act, APP 1 requires that a company make available a well drafted privacy policy. In addition to that, medtech companies may wish to develop a 'white paper' which provides some further details about the company's data handling and cyber security practices, so that it is clearer and more transparent to potential customers and individuals how their data, and particularly personal information, will be collected, used, stored and disclosed. A privacy policy may deal with this to some extent; however, it is not a legal requirement to describe a company's data protection practices in any detail in such a policy. A white paper can be a good way to provide comfort to consumers of technology that personal information will be handled safely and appropriately.

5. Abolishment of the innovation patent system

Legislation currently before the Australian Parliament – the IP Laws Amendment (Productivity Commission Response Part 2 and Other Measures) Bill 2019 (Cth) (Bill) – will, if passed, have the effect of abolishing Australia's innovation patent system. The innovation patent system provides second tier patent protection of eight years for innovations, as opposed to the 20 year protection for patentable inventions.

The innovation patent system was introduced in 2001 to protect incremental technological developments by Australian small and medium sized enterprises and has been used effectively in the medical device space.

Under the Bill, those who have already obtained or applied for innovation patents will (if the Bill is passed) continue to be able to enforce them. In addition, for a period of 18 months from the Bill receiving royal assent, it will still be possible to apply for innovation patents. After this 'grace period', no more applications will be accepted.

Medtech companies who wish to apply for innovation patent protection should try to obtain these key enforcement tools while they are still available.
Liability for AI: considering the risks

By Simon Johnson, Partner, Michael do Rozario, Partner, David Yates, Partner and Daniel Argyris, Lawyer

The rapid development of artificial intelligence (AI) is transforming the world. The International Data Corporation predicts that global spending on AI will reach USD37.5 billion this year and USD97.9 billion by 2023.[1]

As businesses invest in projects that use AI software and platforms, it is important to consider the liability and litigation issues that arise.

Developing law: liability for artificial intelligence

As companies take advantage of the benefits offered by AI, new fields of liability emerge. The civil law provides ample opportunity for those damaged by AI-powered decisions to seek compensation, damages or other redress. For example, products with inbuilt AI will clearly fall under existing statutory safety and fitness for purpose laws, and the usual norms of product liability law will apply to determine liability, causation and damages. For products and services that use AI (and strict liability instances aside), contractual limitations on liability, terms of use, warnings and notices, exclusions and indemnities will be just as effective as if the product or service relied on human intelligence.

But more complex uses of AI will test the boundaries of current laws and likely give rise to new examples of liability, albeit under existing legal frameworks. The Robo-Debt class action currently being explored against the Australian Government will use the existing administrative law to challenge the validity of the Government's reliance on an algorithmic system to automate the determination of welfare debts.[2]

Companies looking to assess legal risk from the implementation of AI need to take a holistic approach to liability, assessing their risk at multiple levels: liability for the intention and effect of an AI system, liability for the performance of the algorithm, and liability arising from the data used to train the algorithm.

Liability for intention and effect

Companies must be satisfied that their use of AI is socially justifiable and legally acceptable. They should be clear about the problem that the AI is seeking to address, and be vigilant to ensure that the algorithm is operating as intended. The absence of a human decision maker in a process does not mean that liability for the unlawful acts of an AI-powered decision is avoided. For example, unlawful discrimination against a person will be just as unlawful if the discriminatory decision was made by an AI tool rather than a person. Also, EU and US regulators have pursued cases of AI pricing algorithms causing companies to behave in cartel-like ways, notwithstanding there was no human involvement in the price setting. In Australia, the ACCC's Digital Platforms Inquiry flagged personalised pricing algorithms as an area for concern and monitoring. These algorithms set pricing based on data about perceived need and capacity to pay.

Companies should also take a macro view as to whether the intention of the AI they propose to use is consistent with good corporate behaviour. AI algorithms that make decisions affecting individuals' rights can have consequences for a company's reputation even if legal obligations are not contravened. Companies should take note of a growing body of human rights law that is being developed in relation to AI and legislation in order to ensure the ethical development of AI products.[3]

Liability for the actions of the algorithm

Proving the method by which an AI algorithm reached a decision is particularly complex and, in a litigation sense, may be beyond human expert explanation. More likely (and strict liability instances aside), a party seeking to defend a decision reached by algorithm will seek to prove that the outcome was within reasonably acceptable parameters. This will require consideration as to the design of the algorithm itself, the data that the algorithm has trained on and the testing of outcomes. What is acceptable will evolve over time.

A well-known example of algorithmic error is the Uber self-driving car that did not recognise a woman walking a bicycle as an object that required it to stop or take avoidance action, causing a fatality. In that case, a lack of testing and previous data on the LIDAR identification of a human walking a bicycle caused the algorithm to reach several incorrect conclusions, before the brakes were applied too late to avoid the collision.[4]

The example highlights the intersection between the legal notion of foreseeability and the training of an AI system to account and test for all foreseeable outcomes. It must be ensured that the AI's decision making stays within parameters as more data is applied to it and its decision making processes evolve.

A careful auditing process will be critical in establishing the credibility and reliability of an AI system. The UK Information Commissioner's Office has identified five risk areas for analysis and consideration in AI decisions:[5]

1. Meaningful human reviews in non-solely automated AI systems;
2. Accuracy of AI systems outputs and performance measures;
3. Known security risks exacerbated by AI;
4. Explainability of AI decisions to data subjects; and
5. Human biases and discrimination in AI systems.

Liability for the data used in AI algorithms

While it is obvious that incorrect or insufficient data will cause an AI algorithm to make erroneous decisions, particular caution also needs to be had in relation to the collection, use and disclosure of the data that trains or underpins the algorithm.

For AI algorithms dealing with people, it is critical to ensure the protection of personal information and compliance with privacy laws. Companies will be liable for the collection and use of personal information in an AI system, including ensuring that information has not been collected and stored in contravention of privacy laws. Similarly, there is an ongoing obligation to maintain the security and integrity of personal information.

Additionally, algorithms must be tested to ensure that the intended use does not result in the inadvertent disclosure of personal information, such as through model inversion. Model inversion is an AI risk that arises when a user has some data about a person, but can then establish other information about the person by observing the outcome of the algorithm. The issue can arise even if the personal information in the data set has been de-identified, because some models can accurately predict the parameters of the de-identified information to re-identify the particular individual. The same situation would apply to corporate data, resulting in the inadvertent disclosure of sensitive or confidential information, even though that information may not be contained in specie in the data set.

The protection of data from model inversion risk needs to be considered in testing of AI, such that personal information or confidential commercial information is not disclosed as part of normal use, but also as a security measure to ensure that a malicious actor could not obtain that information through intentional misuse of the system.
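To make the model inversion risk more tangible, the sketch below shows a toy attribute-inference test of the kind an internal review might run: an attacker who knows most of a person's attributes and can observe the model's output tries each candidate value of a sensitive attribute and keeps the one that best explains what the model returned. The data, model and field meanings are entirely invented for this illustration; it is not drawn from the article or from any particular auditing framework.

```python
# Toy illustration (assumed setup) of attribute inference, one form of model
# inversion risk: recovering a sensitive training attribute by probing a model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic training data: [age, region_code, sensitive_flag] -> outcome.
n = 1000
age = rng.integers(18, 80, n)
region = rng.integers(0, 50, n)
sensitive = rng.integers(0, 2, n)                  # e.g. a de-identified health flag
outcome = (0.02 * age + 1.5 * sensitive + rng.normal(0, 0.5, n) > 2.0).astype(int)

model = LogisticRegression().fit(np.column_stack([age, region, sensitive]), outcome)

# The attacker knows the target's age and region and can observe the score the
# deployed model produces for the target's full record (unknown to the attacker).
observed_score = model.predict_proba([[64, 12, 1]])[0, 1]

# They then test each candidate value of the sensitive attribute and keep the
# one whose score is closest to what they observed.
guesses = []
for candidate in (0, 1):
    score = model.predict_proba([[64, 12, candidate]])[0, 1]
    guesses.append((abs(score - observed_score), candidate))

print("Inferred sensitive attribute:", min(guesses)[1])
```

If a test like this reliably recovers attributes the organisation intended to keep de-identified, that is a signal the model leaks more than its outputs were meant to reveal.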
Implications

The issue of liability for AI is as far-reaching as the potential use cases. In many cases, liability for AI will be straight-forward and will not test the boundaries of established liability frameworks. However, complex systems will require careful thought and legal analysis. Companies should also have regard to the significant amount of policy development that is underway across the world to establish guidelines on the acceptable parameters of AI use.

Notes
1. See "Worldwide spending on Artificial Intelligence systems", 4 September 2019, https://www.idc.com/getdoc.jsp?containerId=prUS45481219
2. This class action has only recently been foreshadowed; see "Government's 'robo-debt' recovery scheme facing class action", 17 September 2019, https://www.smh.com.au/politics/federal/government-s-robo-debt-recovery-scheme-facing-class-action-20190917-p52s7v.html
3. For more information on developments in the EU, USA and Australia, see https://corrs.com.au/insights/how-are-ai-regulatory-developments-in-the-eu-and-us-influencing-ai-policy-making-in-australia
4. For a detailed explanation of the algorithmic issues that caused the fatality, see https://www.economist.com/the-economist-explains/2018/05/29/why-ubers-self-driving-car-killed-a-pedestrian
5. https://ai-auditingframework.blogspot.com/2019/07/developing-ico-ai-auditing-framework.html
M&A and Capital Markets AI in the boardroom: could The term ‘AI’ is often used synonymously with machine learning, but this is not strictly correct. robots soon be running True AI exhibits features of human-like intelligence and companies? the ability to use human-like judgement in decision- making. This is in contrast to machine learning tools By Justin Fox, Partner, James North, Head of that conduct statistical analysis of data sets to identify Technology, Media and Telecommunications and patterns, but which are not exercising ‘judgement’ to Jennifer Dean, Special Counsel reach conclusions. Despite these differences, both AI and machine learning tools rely on large, high quality Artificial intelligence (AI) and automation data sets to improve, and both will inevitably make mistakes along the way. more broadly continue to be identified as the next frontier in productivity enhancement and Predictions of robots in the boardroom are not far- fetched. In late 2016, OMX-listed Tieto Corporation growth. Last year, McKinsey estimated AI announced that it had appointed an AI platform known could potentially increase economic outputs by as Alicia T to be a member of its executive leadership $13 trillion by 2030, and add to global GDP by team. Alicia T is equipped with a conversational approximately 1.2%.1 interface that allows its human counterparts to ask it questions. The platform even has a vote on some Consistent with the trend, it is likely that management decisions. Australian boards will increasingly look to AI More recently, Hong Kong venture capitalist Deep and machine learning to improve the quality Knowledge Ventures appointed an algorithm known as Vital to help the fund make its investment decisions. of their decision making. But can an algorithm These appointments reflect a growing acceptance that run a company instead of a director? machine learning may be capable of making better business decisions than human beings. 1 McKinsey & Company, Notes from the AI frontier: Modelling the impact of AI on the world economy (April 2018) PAGE 10
Can an algorithm run a company instead of a director?

For the time being, the answer to this question is no. A robot can't be a director under Australian law. By definition, a director must be a 'person'. We do expect, however, that directors will increasingly seek to use machine learning and AI to assist them in their own decision making and to rely on decisions taken elsewhere within the organisation that are the product of the application of AI. In this context, it is critical that directors are aware of the legal risks associated with using AI and how to properly manage them.

AI is not foolproof and directors must expect that some decisions made by AI will be wrong. This may be for numerous reasons, including that:

• the algorithm is incorrect or poorly understood;
• the data set is inappropriate or is contaminated by bias; or
• the decision making was impacted by coding error or malfeasance (e.g. hacking).

Where the AI is wrong, this can result in wrong decisions or even decisions that breach the law. For example, in the human resources context, the use of AI tools in conjunction with data about previous successful employees to predict which candidates are most likely to be successful in the future may simply reinforce existing biases or discrimination in hiring practices.

The issue for directors is whether they might be exposed to a breach of their duty to exercise reasonable care and diligence as a result of the failure of the AI.

AI and safe harbours

There are three important safe harbours available under the Corporations Act to directors who are accused of breaching their duty to exercise reasonable care and diligence. These are:

• the business judgement rule in section 180(2);
• the right of reliance in section 189; and
• the right to delegate in section 190.

Australian courts have not yet had an opportunity to consider how those safe harbours might respond to a case where an impugned decision was made by or with the assistance of AI. However, a first principles assessment suggests that the safe harbours might not be available if directors were to simply adopt decisions made by AI without exercising independent judgement.

10 key questions for directors to ask in relation to AI

1. Where do we use AI in our business?
2. What decisions does it make?
3. Who could be impacted by those decisions?
4. Do we tell people who could be impacted by the decision that we have used AI and that they have a right to have the decision reviewed by a person?
5. Do we understand the algorithm?
6. Is it consistent with our values/objectives?
7. Have we satisfied ourselves that the data source is appropriate for our specific use?
8. Do we have a human decision making process as part of our decision making loop – if not, do we undertake spot checks and trend analysis of the AI generated output?
9. Is the decision making process transparent – can it be audited?
10. Did we buy the tool from a third party vendor? If so, what warranties has the vendor given us as to performance?

Notes
1. McKinsey & Company, Notes from the AI frontier: Modelling the impact of AI on the world economy (April 2018)
Corporate Is the ASX becoming investors, both in Australia and around the globe. Australia’s funds management industry is the largest the new NASDAQ? How in the Asia-Pacific region, in part due to Australia’s compulsory superannuation system. By 2035, its pool growth-stage tech of superannuation assets is expected to reach A$9.5 companies are finding trillion. While resources and financial stocks will continue to draw the lion’s share of that cash, tech is a warm welcome down undoubtedly the fastest growing sector on the ASX. under • Valuation requirements ideal for start-ups Starting a technology company has never been By James North, Head of Technology, Media and easier. The digital revolution has allowed technology Telecommunications, Gaynor Tracey, Partner and entrepreneurs and their investors to go from an idea to Madeleine Kulakauskas, Senior Associate reaching millions of customers at a speed never seen before. With a market heavily weighted towards While Silicon Valley is largely considered the home of financial services and mining companies, the tech start-ups, listing in the US is onerous to the point Australian Securities Exchange (ASX) has that it is not accessible to growth stage companies. traditionally been light on technology stock, In particular, companies trying to go public in the US meaning that investors wanting tech exposure are prone to litigation and enormous expense. Floats have had to look to other markets or exchange are fewer but larger, because by the time the company reaches a stage where it can afford to list, it is mature. traded funds. For a tech company to list on NASDAQ, it needs to be circa-US$1 billion to get any traction. For a tech Now, however, as the benefits of an Australian platform company to get a float worth less than US$3 billiion or are becoming increasingly understood across the US$4 billion underway is almost impossible. world, the ASX’s mission to actively target overseas technology companies looking to raise capital is Conversely, the ASX presents itself as the ideal market starting to come to fruition. for tech companies valued under US$1 billion. Provided they have a minimum number of 300 non-affiliated With numerous growth-stage tech companies from investors (totalling $2,000), a free float of 20% and can Australia, the Asia-Pacific, the US, Europe and Israel satisfy either the profit test (having A$1 million) or the successfully listing on the ASX with good valuations asset test (having A$4 million net tangible assets or and traction for scale, a clear trend has emerged – the A$15 million market capitalisation), companies can list ASX is increasingly being used by tech companies as on the ASX. either a stepping stone to a future dual listing on other exchanges or as a long-term listing venue. • High ranking in the world’s top equity markets In August 2019, WiseTech (one of the top five ASX tech The ASX is consistently ranked in the world’s top ten companies) was the first to cross over the A$10 billion global securities exchanges by value, and is a world market cap threshold. Other recent high profile raisings leader in capital raising, ranked within the top five include those by Silicon Valley growth story Life360, exchanges globally. In 2018, Australia found itself within which launched its IPO in May 2019 and raised A$145 the global top five for IPOs, coming in at A$8.5 billion million, and Minneapolis-based payment platform with 132 new listings. 
• High ranking in the world's top equity markets

The ASX is consistently ranked in the world's top ten global securities exchanges by value, and is a world leader in capital raising, ranked within the top five exchanges globally. In 2018, Australia found itself within the global top five for IPOs, coming in at A$8.5 billion with 132 new listings. In particular, the ASX recorded A$4 billion in tech-focused IPO capital raised between 2013 and 2018. This figure provides clear incentive for growth-stage companies who are looking to raise capital to fund future growth.

• Well-regulated with a stable economy

A listing on a well-regulated exchange helps to build a company's reputation and profile, as it shows they are focused on strict business and accounting procedures and professional management. It can also bring additional credibility when dealing with large multinational customers, which is important for a tech company in the growth stage.
Another great attraction of Australia is its resilient economy and impressive growth record. Over the past 28 years, Australia's economy has grown by an average rate of 3.2% in real terms. This is well above that of all other major developed economies, including the US (2.5%). Further, Australia's tech industries specifically have grown at a yearly average rate of approximately 5.0% over the past 28 years. This high and steady economic growth gives foreign companies confidence and incentive to list.

• Access to a global market

Australia's local market is global, which brings global exposure – each day, approximately 45% of the ASX's trading volume and capital comes from outside Australia. For tech companies who don't want to be limited to investors from a single market, this is a great attraction.

With a market cap of A$1.9 trillion, the ASX has a significant capacity to fund local and global companies, meaning it can be used by tech companies at growth stage as either a stepping stone to a future dual listing on the NASDAQ or as a long-term listing venue.

Following the downturn of the Australian mining boom and recent regulatory scrutiny of the financial services industry, the ASX has looked to redress the majority of the value of its market being tied up in the mining and financial services industries. It has done this by courting US, European and Israeli tech companies and industry bodies and actively encouraging a less concentrated spread with a focus on the tech companies of the future.

• An epicentre of technological innovation

Australia has always prized innovation and punched above its weight for technological advancement – this is the country that brought the world WiFi, ultrasound, the pacemaker, the bionic ear, the underwater torpedo and, most importantly, Vegemite!

With Australian technology companies such as Atlassian, Xero and Canva having global success and a supportive ecosystem, the ASX is open for business to ambitious tech companies, regardless of jurisdiction. While technology stocks currently only make up 2.4%, or approximately A$50 billion, of the A$1.9 trillion worth of companies listed on the ASX, the exchange wants that to grow. ASX executive general manager of listings and issuer services Max Cunningham has recently commented: "ASX is trying to position ourselves as a late-stage VC [venture capital] funding market with companies that have de-risked their model, have proven their revenue and are looking to scale their businesses and potentially go public to provide liquidity for their shareholders and acquisition currency".

The ASX's clear appetite for these stocks means that tech companies desperate for much needed capital to scale will find a warm welcome down under.
FIRB increases its focus on data

By James Morley, Partner and Justin Fox, Partner

Australia's Foreign Investment Review Board (FIRB) is increasing its focus on investment proposals which give foreign acquirers access to personal data of Australian citizens.

While acquisitions of facilities that house data which has national security implications have long been a focus of FIRB, the regulator is now taking a keen interest in any proposal which involves the transfer of personal information offshore or which results in a foreign person owning or having the ability to access personal information, irrespective of whether the data itself raises obvious national security concerns.

The extent of FIRB's regulatory pivot is demonstrated by recent comments made by FIRB Chairman David Irvine, who noted (in apparently unscripted comments to a business forum) that:

"The protection of sensitive data is becoming the issue du jour, and not just sensitive national security data. The development of data-security conditions – conditions on the foreign investor to protect data – continues to be a key area of focus for us."

We are seeing this play out in the conditions that FIRB is seeking to negotiate with foreign acquirers of financial, health and education assets that have access to large amounts of personal data.

By way of background, foreign investment in Australia is regulated, and notifiable foreign investment proposals[1] must be approved by the Australian Treasurer. When making foreign investment decisions the Treasurer is advised by FIRB, which examines foreign investment proposals and advises on the national interest implications.

The Treasurer has the power to block foreign investment approvals that are contrary to Australia's national interest. In practice it is very rare for a proposal to be refused approval. However, the Treasurer also has the power to apply conditions on the way in which a proposal is to be implemented to ensure it is not contrary to the national interest. FIRB's stated preference is not to prohibit transactions, and it sees the imposition of conditions on approvals as playing "an important role in enabling foreign investment to proceed while safeguarding the national interest and managing any identified risks". Negotiating conditions with FIRB is therefore a well understood path to foreign investment in Australia.

FIRB has not published a set of standard data protection conditions. Acquirers of data heavy assets are therefore currently exposed to case by case negotiation. However, we are aware that FIRB has recently required, or has considered, conditions which:

• restrict the storage of data to onshore Australian facilities;
• restrict the ability of certain data to be accessed from overseas, or by upstream investors or personnel;
• require service providers to have certain certifications and safeguards in place to protect data;
• require the acquirer to provide FIRB access to the data upon request;
• require businesses to maintain records of offshore access to data; and
• apply governance or physical access restrictions to support the above undertakings.

While acquirers will need to be prepared to accept and understand the impact of the conditions on the businesses being acquired and their future plans (including whether the conditions apply to anonymised or aggregated data), they should also be aware that the imposition and negotiation of these conditions can delay the approval process, and factor that process into the transaction timetable and strategy.

The strength of FIRB's purpose is illustrated by Mr Irvine's additional comments, at the same forum, that:

"I am having a long-running battle with the Critical Infrastructure Centre, which says critical infrastructure is ports, water, power, energy and telecommunications. I am saying there is another one: it is called data."

The Critical Infrastructure Centre (CIC) was established in 2018 and brings together various Government departments and intelligence agencies to manage national security risks arising from foreign involvement in Australia's critical infrastructure assets, including ports, electricity, water and gas utilities. The CIC also oversees security issues relating to Australia's telecommunications sector.
The CIC is wholly separate from FIRB and has broader reach, in that it advises the Minister for Home Affairs on the use of the Minister's power under the Security of Critical Infrastructure Act 2018 to direct owners or operators of critical infrastructure assets to take or refrain from taking certain action. That role gives the CIC a level of influence over the ongoing operation of critical infrastructure assets which goes beyond advising on the initial foreign investment proposal.

In the context of telecommunications, this has translated into a willingness by the CIC to direct the owners of certain assets as to which third party equipment suppliers they can and cannot use. In contrast, FIRB has not traditionally taken such an interventionist or ongoing role in framing its investment conditions.

It is not clear from his comments whether Mr Irvine is advocating for data assets to be brought within the formal purview of the CIC. That would require a Ministerial direction and would be a significant expansion of the CIC's jurisdiction. What his comments do suggest, however, is that the current trend towards imposing detailed conditions on the acquisition of large data sets is not a passing phase. Foreign acquirers of those assets would be well advised to give thought to the issues that are likely to concern FIRB in advance of making an application, and to pro-actively offer up undertakings to address those concerns.

Notes
1. Whether a proposal is a "notifiable action" will depend on a number of factors, including the size of the acquisition, the characterisation of the investor, the investment sector, and whether the acquisition is occurring in Australia or offshore.
Media

The ACCC Digital Platforms Inquiry Final Report: is media law reform destined for the too hard basket once again?

By James North, Head of Technology, Media and Telecommunications, Adam Foreman, Partner and Jennifer Dean, Special Counsel

The genesis of the Australian Competition and Consumer Commission's (ACCC) Digital Platforms Inquiry (DPI) was a deal the Federal Government did with cross-benchers in 2017 to obtain support for a relaxation of media ownership laws. Journalistic content creation was always intended to be its core focus.

In reading the Final Report and its wide-ranging recommendations, there is a sense that some of this focus has been lost, with many recommendations likely to have only a marginal impact on the sustainability of media business models and much of the work in formulating concrete measures in key policy areas being deferred to future processes.

That being said, the media regulatory framework is complex, overlapping and technical, and the policy implications of reform extend significantly beyond questions of the influence of digital platforms on competition or consumer welfare. Given this context, it seems reasonable that media law harmonisation in particular should be addressed as part of a more holistic review.

The Government has expressed its broad support for the ACCC's recommendations, but its political will to drive legislative change in this challenging policy area is yet to be tested.

In this article, we look at the key findings and recommendations of the DPI in relation to the media sector and consider what concrete reform in this area might look like.

Harmonisation of media regulation

It would be difficult to find a stakeholder who is not broadly in favour of harmonising media regulation across different modes of delivery (the submissions lodged with the DPI reflect this). The real challenge is the 'how', and this is a question the ACCC has not tackled, instead recommending that it be addressed by way of a separate government process.

One of the more interesting (or dispiriting) aspects of the DPI Final Report is the list of previous reviews of media laws going back to 2005 (Appendix C). By our