EU AI Act passed
The European Parliament approved the Artificial Intelligence Act (EU AI Act) on 13 March 2024. It has been hailed as the world’s first comprehensive and binding piece of legislation on AI, although many of its provisions won’t be enforced for at least a year or two. Rather than attempting to regulate specific technologies, the EU AI Act focuses on the level of risk posed by the way in which AI systems are used. So, for example, the use of AI in video games is considered low risk, whereas social scoring systems (such as that used in China) are deemed to pose an unacceptable risk. The Act also attempts to tackle some of the copyright issues inherent in training generative AI (GenAI) products by imposing transparency requirements on systems it terms General Purpose AI (GPAI).
Law firm bans GenAI
Not a day seems to go by without a barrage of press releases about law firms adopting AI products, supposedly to “improve efficiency” (but arguably just to increase profits and jump on the AI bandwagon). In the midst of all the legal AI fanfare, American firm Carlton Fields ruffled a few feathers and generated some heat on LinkedIn when it announced a firm-wide ban on the use of GenAI for research and writing. Making the announcement, Peter J. Winders, General Counsel at the firm, concluded that GenAI “simply can’t be used to produce competent legal product[s] intended to aid a client or assist a court in the important work of fairly deciding disputes and developing the law.” Interestingly, he also argues that even a large language model (LLM) trained on reliable legal information (as is the case with the new Lexis+ AI product) can still produce so-called “digital hallucinations”: “What about a generative AI program trained only on a large library of only reliable material, such as the West system or the preserved research and output of a large law firm? This might cut down on the risk that the generative AI tool has scraped rubbish off the internet at large, but it will do nothing to stop its resort to the fabrication of cases.” These shortcomings of supposedly hallucination-free AI are highlighted in this LinkedIn post.
UK AI and copyright code abandoned
GenAI companies such as OpenAI and Google are facing legal challenges from a variety of publications and writers concerned about the unlicensed use of their copyright material for the training of LLMs. The UK government had sought to balance the interests of rights holders and AI developers by drawing up a voluntary code of practice. Unfortunately, it recently abandoned this initiative, claiming that the working group was unable to reach any consensus. However, there is still a chance of getting this issue back on the government agenda, thanks to the Artificial Intelligence (Regulation) Bill, a private member’s bill sponsored by Lord Holmes of Richmond. Separately, the House of Lords recently produced a report in which it calls on the government to tackle the issue of LLMs being trained on copyrighted materials.
Product Security and Telecommunications Infrastructure Act 2022
Cybersecurity has been in the news recently following the apparent spear phishing attempt targeting a number of MPs. One piece of legislation which attempts to reduce the cybersecurity vulnerabilities of consumer connected devices is the Product Security and Telecommunications Infrastructure Act 2022, the provisions of which are due to come into force on 29 April 2024. The Act requires manufacturers, importers, and distributors of relevant products to take steps to minimise the exposure of consumers to cyberattacks. It’s designed to tighten up the notoriously lax security protocols associated with the Internet of Things (IoT), preventing malicious hackers from breaking into a home network via an insecure smart device, such as a smart speaker shipped with a default password of “1234”.
Clarification of the term “personal data”
A recent judgment handed down by the European Court of Justice, in a Belgian case (C-604/22) regarding the auctioning of personal data for advertising purposes, clarified the definition of “personal data” in the context of the General Data Protection Regulation (GDPR). The court had to consider whether a “Transparency and Consent String” (TC String), which is used by advertisers, in conjunction with cookies, to trade supposedly anonymised user data, fell outside the scope of personal data under the GDPR. It concluded that “the TC String contains information concerning an identifiable user and therefore constitutes personal data within the meaning of the GDPR.” This rather technical judgment should serve as a warning to online advertisers who trade user data, and encourage them to conduct fresh audits to ensure they are not inadvertently breaching any of their data protection obligations. According to the TLT data protection team, although this is an EU ruling, it has the potential to impact UK companies “whether by virtue of those organisations being directly caught by the EU GDPR’s extra-territorial scope, or as a helpful indication of how similar UK GDPR concepts may be interpreted.”
Alex Heshmaty is technology editor for the Newsletter. He runs Legal Words, a legal copywriting agency based in the Silicon Gorge. Email alex@legalwords.co.uk.