Understanding the opportunities and challenges AI and tech present for business
Artificial Intelligence (AI) has enormous transformative potential, with a wide range of applications and significant benefits for businesses across sectors. The development, provision and use of AI also need to navigate a number of risks and practical challenges – including ethical use, data governance, IP and ownership, AI liability, cyber security and resilience, as well as the multi-layered and quickly evolving regulatory landscape.
Rapid advances in AI are having a significant impact on our clients' business models, growth strategies and day to day decision making. Our experts examine the big questions arising in AI deployment, and provide holistic, pragmatic legal advice to help our clients strategically manage risk as they explore AI opportunities.
AI Scholarship Programme with Oxford University
We’re using our global voice in Tech to promote AI safety through a ground-breaking scholarship programme. Jonathan Kewley, our Tech Co-Chair, has founded a scholarship programme with Oxford University to prioritise Tech ethics and computer science research. This supports people from diverse backgrounds to be part of this generationally important debate, and to design safety and fairness into the technology which is now all around us.
The EU’s Artificial Intelligence Act on BBC News
As the EU Artificial Intelligence Act enters its final phase of development, Dessislava Savova, Partner and Head of Continental Europe of the Clifford Chance Tech Group, speaks to BBC News about the European Parliament's vote and its implications for EU and global regulation.
The AI Act will unquestionably have a global impact on regulators and businesses around the world.
Tokyo Partner Michihiro Nishi speaks to David Mitchell from 39 Essex Chambers about current thinking on the regulation of generative Artificial Intelligence (AI) in Japan and the future of Japanese legal disputes surrounding generative AI.
This podcast has been republished with the permission of 39 Essex Chambers.
On 14 June 2023, the European Parliament voted to adopt its negotiating position on the proposed EU regulation on AI (AI Act). This is another major step towards the adoption of the proposed AI Act, undoubtedly one of the most important and anticipated pieces of legislation of the past few years.
As legal, technology and risk-management teams collaborate to support business-critical decisions, establish forward-looking frameworks and embed responsible AI in company strategy, being able to assess and advise on AI with a holistic understanding of the changing legal and policy landscape has never been more important.
On April 11, President Biden's administration quietly launched a consultation that might lead to a slow revolution in the U.S.'s approach to AI regulation. The Department of Commerce's National Telecommunications and Information Administration (NTIA) published a request for comment (RFC) on how to achieve "trustworthy AI." The comment window is open until June 12, after which the NTIA will draft a report on AI policy development.
The Aye-aye (Daubentonia madagascariensis) is a long-fingered lemur native to, you guessed it, Madagascar – but my voice assistant told me it was a Canadian rock band (called Eye Eye).
It is highly likely that you have your own experiences of the (current) limitations of artificial intelligence in similarly superficial contexts. However, when complex technology is used in the insurance value chain, deficiencies such as algorithmic bias can have discriminatory and other significant consequences for policyholders, such as higher premiums, refusal of insurance cover or rejection of claims.
The speed at which Generative AI has gained traction in both businesses and our daily lives has re-ignited the debate around the potential for harm associated with highly automated systems deployed at scale.
Countless social media posts, images and articles have been generated using AI tools such as ChatGPT, Stable Diffusion and Dall-E. But who – if anyone – owns the copyright in the outputs, and do the tools themselves infringe copyright? Our IP & Tech//Digital teams take a deep dive into the issues and the recent infringement claims.
Given the rapid uptake of artificial intelligence (AI), including machine learning (ML) technologies, in the financial services sector, the sector and its regulators have enjoyed a head start in exploring and seeking to navigate the issues that arise. The sector is therefore well-positioned to demonstrate how a pro-innovation, context-specific and risk-based approach to regulating AI can succeed.
When a track by artist "Ghostwriter" was uploaded to and then promptly removed from streaming services in April, it was the latest example of one of 2023's most astonishing trends. The track 'heart on my sleeve' sounded as if it was sung by two of the world's biggest stars, Drake and The Weeknd. In fact, it was someone who had used an AI tool to make his voice sound like theirs.
On 30 March 2023, the Italian Data Protection Authority (the Garante) issued an interim emergency decision ordering OpenAI LLC ("OpenAI") to immediately stop the use of ChatGPT to process the personal data of data subjects located in Italy, pending further investigation.
The arrival of 'generative AI' that can produce new content (text-based responses to queries, images, audio, video and even code) marks a new and exciting phase in the development of mainstream AI applications. The starting gun has been fired for a race to capture the market.
A class action against GitHub Copilot has been filed alleging violations of the rights, including copyright, of authors that have created or contributed to codebases stored as public repositories on GitHub.
A year after its announcement, GitHub Copilot was launched last month to the public at large. The tool promises to "fundamentally change the nature of software development" by providing AI-based coding suggestions, saving developers time and effort. What are the intellectual property implications for those who build or buy software created using Copilot?
Competition authorities around the world have increased their focus on fintech as part of a broader rise in intervention in financial services and tech markets...
The data centre industry is poised for growth in 2023 due to increased demand from businesses. However, factors such as higher costs, a slowing economy, new capacity challenges and increased regulation driven by sustainability concerns about energy and water consumption will impact growth. The pandemic has fuelled the growth of the global data centre market, which is projected to reach 235 billion euros by 2026, a compound annual growth rate of 4.5%. Companies must consider the latest tech trends when selecting a data centre partner or colocation provider.
Last year was a tough one for fintech with the collapse of a number of high-profile industry players, as well as wider economic pressures including the war in Ukraine, supply chain challenges and high inflation.
Evolving technologies, increased digital connectivity, cyber risk, geopolitical tensions, climate change, supply chain disruption and changing markets are shaping government policies, regulation and legal risk in relation to use of data and technology.