Denise Almeida

3 posts tagged with "Denise Almeida"

Policy and regulation update 2024: Matrix and the GDPR

06.06.2024 07:00 — Foundation Denise Almeida

If you have been following the matrix.org blog for some time, you will know that we’ve never been ones to shy away from complex topics like public policy and its impacts on Matrix. With this blog post series, our aim is to introduce a more regular cadence to our regulatory updates and to be more transparent about where we are focusing our efforts in this area.

Each blog post in the series will focus on a given theme or piece of law, as well as its relevant jurisdiction. We will start this series by taking a deep dive into EU regulation, beginning with the General Data Protection Regulation (GDPR). Future blog posts in the series will cover the digital services package (DMA and DSA), the incoming CRA and the highly controversial CSA Regulation. These will be followed by a series dedicated to the UK, particularly UK applications of European law such as the GDPR and the ePrivacy Directive, as well as the Online Safety Act and the IPA amendment bill. Finally, we will conclude the series by looking across the pond and diving into the Cloud Act, as well as KOSA and other existing proposals.

Continue reading…

Open letter to EU Member States on the proposed CSA Regulation

22.01.2024 00:00 — Foundation Denise Almeida

We join technology companies, trade associations and other supporters in asking EU member states to align the Council's position on the CSA Regulation with the position agreed by the Parliament.

Safeguarding encryption should be a priority in negotiations, ensuring the protection of rights and freedoms around privacy and security of communications.

A copy of the open letter sent to ministers can be read below.

Open letter to EU Member States on the proposed CSA Regulation

Dear Ministers of the Interior, Justice, and Economy of EU Member States,

We write to you as small and medium-sized companies and organizations from Europe, concerned about the proposal for a Regulation on Child Sexual Abuse (CSA). Collectively, we call on you to ensure that your country's position on this file is brought as close as possible to that of the European Parliament (EP).

We all agree that ensuring children are safe online is one of the most important duties of tech companies, and for this reason we find the European Commission's proposed Regulation extremely worrying. If it were implemented as proposed, it would negatively impact children's privacy and security online, while also having dramatic unforeseen consequences for the EU cybersecurity landscape, on top of creating an ineffective administrative burden [1].

The European Parliament recently adopted its position on the file, acknowledging that scanning technologies are not compatible with the aim of having confidential and secure communications. The crucial changes it therefore puts forward for the proposal reflect the opinions of the European Data Protection Supervisor (EDPS) and the Council legal services, as well as countless experts in cryptography and cybersecurity [2]. It also reflects the opinion of between 63% and 69% of the companies, public authorities, NGOs and citizens consulted by the European Commission in its Impact Assessment [3].

As small and medium-sized tech companies and organizations, we share their concerns, as we know that looking for specific content, such as text, photos and videos, in an end-to-end encrypted communication would require the implementation of a backdoor, or of a similar technology called "client-side scanning". Even if this mechanism is created with the purpose of fighting crime online, it would also quickly be used by criminals themselves, putting citizens and businesses more at risk online by creating vulnerabilities for all users alike.
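For readers unfamiliar with the mechanism, a minimal sketch of what "client-side scanning" means in practice follows: content is checked against fingerprints of known material on the sender's device, before end-to-end encryption is applied. Everything here, from the hash choice to the function names, is an illustrative assumption rather than a description of any real system.

```python
import hashlib

# Illustrative fingerprint list of "known material". Real deployments would
# use perceptual hashes rather than SHA-256; this is a simplified stand-in.
BLOCKLIST = {hashlib.sha256(b"known abusive content").hexdigest()}

def client_side_scan(plaintext: bytes) -> bool:
    """Check the *plaintext* against the blocklist on the sender's device.

    The scan necessarily runs before encryption, which is why it is
    argued to be functionally equivalent to a backdoor in E2EE.
    """
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def send_message(plaintext: bytes) -> str:
    if client_side_scan(plaintext):
        # Under a mandated-scanning regime the match would be reported
        # before (or instead of) delivery.
        return "flagged and reported"
    # Stand-in for real E2EE (e.g. Olm/Megolm in Matrix); XOR is NOT crypto.
    ciphertext = bytes(b ^ 0x42 for b in plaintext)
    return "sent ciphertext: " + ciphertext.hex()

print(send_message(b"hello"))                  # delivered normally
print(send_message(b"known abusive content"))  # flagged and reported
```

The structure makes the argument visible: the scan has to see the plaintext, so the confidentiality that encryption later provides is already bypassed at that point.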

Data protection is a strong competitive advantage

As tech companies operating within the European Union, we have built products and services in line with the strong data protection framework of the EU, which still serves as an example and inspiration across the world.

The GDPR allowed for the creation of ethical, privacy-first tech companies in Europe that would otherwise never have been able to compete against Big Tech. It gave European companies a strong competitive advantage in that field internationally and allowed consumers to finally find alternatives to American and Chinese services. Our users, both within the EU and beyond, have come to trust our commitment to safeguarding their data, and this trust is a key driver of our competitiveness. The learning curve for adapting to the necessary administrative burden brought about by the GDPR was steep, but it was worth it.

However, the CSA Regulation could threaten this unique selling point of European IT companies, and would also add a new administrative burden which we fear could overwhelm both our companies and law enforcement bodies. Considering the volume of communications and content transiting through our services, even a seemingly insignificant error rate in the technologies applied to scan for abusive material would result in millions of false positives to be manually reviewed every day.
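To make that scale concrete, a minimal back-of-the-envelope sketch follows; both the message volume and the error rate are illustrative assumptions, not figures from the letter.

```python
# Back-of-the-envelope estimate of the manual review burden created by
# mandatory scanning. Both inputs are illustrative assumptions.
daily_messages = 10_000_000_000   # assumed EU-wide messages per day
false_positive_rate = 0.001       # assumed 0.1% scanner error rate

false_positives_per_day = daily_messages * false_positive_rate
print(f"{false_positives_per_day:,.0f} false positives to review per day")
# -> 10,000,000 false positives to review per day
```

Even cutting the assumed error rate by a factor of ten still leaves a seven-figure daily review queue, which is the letter's point.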

The CSA Regulation could erode trust and safety online

In a world where data breaches and privacy scandals are increasingly common, the EU's reputation for stringent data protection is a unique selling point for businesses operating within its borders. It provides us with a competitive edge, assuring our customers that their information is handled with the utmost care and integrity. This trust, once eroded, is challenging to rebuild, and any measures that compromise it, such as mandatory scanning or mandatory age verification, have the potential to harm businesses both large and small.

Furthermore, the EU has recently adopted Regulation 2023/2841, which mandates that EU institutions and bodies consider the use of end-to-end encryption among their cybersecurity risk-management measures. There are also multiple 'cyber' EU proposals currently on the table, such as the Cyber Resilience Act and the Cybersecurity Act. Supporting the opposite approach in the CSA Regulation would only undermine the EU cybersecurity framework, creating a contradictory, incoherent and inefficient new set of measures that companies would not be able to enforce without putting citizens and businesses at risk.

The EU Parliament's proposal goes in the right direction

Therefore, we applaud the European Parliament for its resolute stance in defending European citizens' right to privacy and secure communication. The European Parliament's commitment to these principles is not only a testament to its dedication to human rights, but also a beacon of hope for businesses like ours that prioritize data protection and security.

The position of the Parliament includes alternatives to scanning which have a minimal impact on cybersecurity and data protection, and which experts believe would be both more effective and more efficient than mandatory scanning. Such a change of paradigm would mean going beyond the false dichotomy between privacy and security, while also making the proposal respect the proportionality principle, as requested by the Regulatory Scrutiny Board. Even if not perfect in our eyes, the changes the European Parliament made in its position are a good compromise to maintain digital security and confidentiality and to better protect children online. We believe these changes strike the right balance between child protection and safeguarding privacy and cybersecurity.

As representatives of the vibrant European small businesses community, we encourage EU Member States to continue championing the values of privacy, cybersecurity and data protection. These principles not only align with the EU's commitment to human rights, but also serve as a foundation for a thriving and competitive business environment. Let us defend and strengthen these principles, ensuring that the EU remains an advocate of privacy in the global marketplace.

For these reasons we call on you to:

  • Ensure that the Council's position is aligned as closely as possible with the European Parliament's. This will allow for a swifter adoption of the Regulation while building on the important work of the European Parliament.
  • Maintain the high level of fundamental rights, and in particular data protection, enjoyed by citizens in the European Union.
  • Refrain from forcing companies like us to conduct mass surveillance of private correspondence on behalf of law enforcement agencies.
  • Guarantee a high level of cybersecurity in the EU by protecting end-to-end encryption and introducing the necessary safeguards into the text. Client-side scanning and backdoors in particular should not be mandated.
  • Preserve the confidentiality of correspondence.
  • Minimize the administrative burden of the proposal by making it more effective and efficient, through alternatives to mass scanning.

Signed,

  • Blacknight Solutions (Ireland)
  • Element (United Kingdom)
  • Mail.de GmbH (Germany)
  • Matrix Foundation (United Kingdom)
  • Nextcloud (Germany)
  • Open-Xchange (Germany)
  • Renvis (Greece)
  • TelemetryDeck (Germany)
  • Tresorit (Switzerland)
  • E Foundation (France)
  • Logilab (France)
  • Mailfence (Belgium)
  • Murena (France)
  • Olvid (France)
  • Proton (Switzerland)
  • Surfshark (Lithuania)
  • Threema (Switzerland)
  • Tuta (Germany)

Trade associations and supporters:

  • ACT | The App Association
  • Defend Democracy
  • Gate 15
  • Myntex
  • Quilibrium
  • Studio Legale Fabiano
  • Cyberstorm
  • Encryption Europe
  • ISOC-CAT
  • Privacy & Access Council of Canada
  • SecureCrypt
[1] A detailed summary of the proposal, drafted by the NGO EDRi, is available here: https://edri.org/our-work/private-and-secure-communications-put-at-risk-by-european-commissions-latest-proposal/

[2] For more information, you can read their statement from July 2023: https://edri.org/wp-content/uploads/2023/07/Open-Letter-CSA-Scientific-community.pdf

[3] See in particular page 134 of the impact assessment: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52022SC0209

How the UK's Online Safety Bill threatens Matrix

19.05.2021 15:47 — Tech Denise Almeida
Last update: 19.05.2021 14:48

Last week the UK government published a draft of the proposed Online Safety Bill, after having initially introduced formal proposals for said bill in early 2020. With this post we aim to shed some light on its potential impacts and explain why we think this bill - despite having great intentions - may actually set a dangerous precedent when it comes to our rights to privacy, freedom of expression and self-determination.

The proposed bill aims to provide a legal framework to address illegal and harmful content online. This focus on "not illegal, but harmful" content is at the centre of our concerns - it puts responsibility on organisations themselves to arbitrarily decide what might be harmful, without any legal backing. The bill itself does not actually provide a definition of "harmful", instead relying on service providers to assess and decide on this. This requirement to identify what is "likely to be harmful" applies to all users, children and adults alike. Our question here is - would you trust a service provider to decide what might be harmful to you and your children, with zero input from you as a user?

Additionally, the bill incentivises the use of privacy-invasive age verification processes, which come with their own set of problems. This complete disregard for people's right to privacy reflects the privileged perspectives of those in charge of drafting this bill, and fails to acknowledge how harmful it would actually be for certain groups of the population to have their real-life identity associated with their online identity.

Our view of the world, and of the internet, is largely different from the one presented by this bill. Now, this categorically does not mean we don’t care about online safety (it is quite literally our bread and butter) - we just fundamentally disagree with the approach taken.

Whilst we sympathise with the government’s desire to show action in this space and to do something about children’s safety (everyone’s safety really), we cannot possibly agree with the methods.

Back in October of 2020 we presented our proposed approach to online safety - ironically also in response to a government proposal, albeit about encryption backdoors. In it, we briefly discussed the dangers of absolute determinations of morality from a single cultural perspective:

As uncomfortable as it may be, one man’s terrorist is another man’s freedom fighter, and different jurisdictions have different laws - and it’s not up to the Matrix.org Foundation to play God and adjudicate.

We now find ourselves reading a piece of legislation that essentially demands these determinations from tech companies. The beauty of the human experience lies in its diversity, and when we force technology companies to make calls about what is right or wrong - or what is "likely to have adverse psychological or physical impacts" on children - we end up in a dangerous place of centralising and regulating relative morals. Worst of all, when the consequence of getting it wrong is criminal liability for senior managers, what do we think will happen?

Regardless of how omnipresent it is in our daily lives, technology is still not a solution for human problems. Forcing organisations to be judge and jury of human morals for the sake of "free speech" will, ironically, have severe consequences for free speech, as risk profiles will change for fear of liability.

Forcing a "duty of care" responsibility on organisations which operate online will not only drown small and medium-sized companies in administrative tasks and costs, it will further entrench the existing monopolies of Big Tech. Plainly, Big Tech can afford the regulatory burden - small start-ups can't. Future creators will have their wings clipped from the outset, and we might just miss out on new ideas and projects for fear of legal repercussions. This is a threat to the technology sector, particularly those building on emerging technologies like Matrix. In some ways, it is a threat to democracy and some of the freedoms this bill claims to protect.

These are, quite frankly, steps towards an authoritarian dystopia. If Trust & Safety managers start censoring something as natural as a nipple on the off chance it might cause “adverse psychological impacts” on children, whose freedom of expression are we actually protecting here?

More specifically on the issue of content moderation: the impact assessment provided by the government alongside this bill predicts that the additional costs for companies directly related to the bill will be in the billions, over the course of 10 years. The cost for the government? £400k, in every proposed policy option. Our question is - why are these responsibilities being placed on tech companies, when evidently this is a societal problem?

We are not saying it is up to the government to single-handedly end the existence of Child Sexual Abuse and Exploitation (CSAE) or extremist content online. What we are saying is that it takes more than content filtering, risk assessments and (faulty) age verification processes for it to end. More funding for tech literacy organisations and schools, to give children (and parents) the tools to stay safe, is the first thing that comes to mind. Further investment in law enforcement cyber units and the judicial system, improving tech companies' routes for abuse reporting and allowing the actual judges to do the judging seem pretty sensible too. What is absolutely egregious is the degradation of the digital rights of the majority due to the wrongdoings of a few.

Our goal with this post is not to be dramatic or alarmist. However, we want to add our voices to the countless digital rights campaigners, individuals and organisations that have been raising the alarm since the early days of this bill. Just like with coercive control and abuse, the degradation of our rights does not happen all at once. It is a slippery slope that starts with something as (seemingly) innocuous as mandatory content scanning for CSAE content and ends with authoritarian surveillance infrastructure. It is our duty to put a stop to this before it even begins.

Twitter card image credit: the film Brazil, which feels all too familiar right now.