FCA criticised over using sensitive data in AI trial with Palantir

The Financial Conduct Authority (FCA) has attracted criticism for turning over sensitive data to controversial US technology firm Palantir as part of a trial to help it sharpen its approach to financial crime. The move has prompted concern about the risks of sharing confidential information with a company that has been linked to a series of controversies.
Background & Context
The FCA, the UK's financial regulator, has been working on developing its artificial intelligence (AI) capabilities to combat financial crime, including money laundering and terrorist financing. As part of this effort, it partnered with Palantir, a company known for its data analytics software, to test its AI-powered tools. However, critics argue that the FCA's decision to share sensitive data with Palantir raises concerns about data protection and the potential for misuse.
Palantir has faced criticism in the past for its involvement in high-profile projects, including work supporting the US Immigration and Customs Enforcement (ICE) agency's deportation efforts. The company's ties to the US intelligence community have also raised concerns that shared data could be put to surveillance purposes.
Impact on Swiss SMEs & Finance
While the FCA's decision to partner with Palantir may not have direct implications for Swiss SMEs, it highlights the growing importance of data protection and AI regulation in the financial sector. As the use of AI and machine learning becomes more widespread, regulators and financial institutions must ensure that they are taking adequate measures to protect sensitive data and prevent potential misuse.
For Swiss SMEs, the episode is a reminder to prioritise data security and compliance with regulatory requirements. As digital technologies become more deeply embedded in day-to-day operations, SMEs need to understand the risks that come with data sharing and take concrete steps to protect their sensitive information.
What to Watch
The FCA's trial with Palantir is set to continue, and its outcome will be closely watched. The development of AI-powered tools to combat financial crime is a key area of focus for regulators, and the FCA's approach is being followed by its European counterparts as a possible template for their own efforts.
Source
Original Article: FCA criticised over using sensitive data in AI trial with Palantir
Published: March 23, 2026
Disclaimer: This article is for informational purposes only and does not constitute financial advice. Consult a licensed financial advisor before making investment decisions.
Original Source
This article is based on "FCA criticised over using sensitive data in AI trial with Palantir", published by Finextra.


