We wrote last week about Proton Mail adding an AI assistant that sends just a wafer-thin slice of your email in plaintext to sit on Proton’s servers, as opposed to the zero plaintext that was stored there before…
Email is never stored unencrypted at rest on Proton’s servers. But AI prompts, which are likely your entire draft email, do exist unencrypted at rest on their servers. That’s what has the privacy nerds screaming.
yep! and the important thing to understand about proton is, the end to end encryption (where one end is the sender of a message and the other is the receiver; Proton never handles plaintext at all, beyond a tiny and clearly called out amount of metadata stored as plaintext on their servers for stuff like Calendars) is the whole point of the thing. there’s no reason to use Proton without it.

with this LLM garbage, Proton’s threat model has shifted such that you can’t trust that the other end’s plaintext didn’t get transmitted to Proton’s servers (there’s no way for you, the receiver, to tell that the sender didn’t use the cloud LLM features), which makes Proton a lot less useful for some of the most vulnerable people who use it, such as activists and journalists who might be under legal threat. this plaintext leak allows some of the messages you’ve received to be subpoenaed, and it’s very easy for that to be used in a criminal case against you.
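To make that threat-model shift concrete, here’s a minimal sketch — in spirit only, not Proton’s actual code. PyNaCl stands in for OpenPGP-style end-to-end encryption, and the `send_to_cloud_llm` function is a hypothetical stand-in for the path the writing assistant adds: in the first path the server only ever relays ciphertext, in the second the draft crosses the wire as plaintext.

```python
# Minimal sketch of the trust difference. NOT Proton's code; PyNaCl
# (pip install pynacl) stands in for OpenPGP-style E2E encryption.
from nacl.public import PrivateKey, SealedBox

recipient_key = PrivateKey.generate()  # the receiver's keypair

def send_e2e(draft: str) -> bytes:
    """End-to-end path: the server only ever relays this ciphertext."""
    return SealedBox(recipient_key.public_key).encrypt(draft.encode())

def send_to_cloud_llm(draft: str) -> str:
    """Hypothetical cloud-assistant path: the draft leaves the client as
    plaintext. Whatever happens to it server-side (prompting, logging,
    a subpoena) is outside both the sender's and the receiver's control."""
    return draft  # plaintext crosses the wire

draft = "Meet the source at 9pm, usual place."
ciphertext = send_e2e(draft)       # server sees opaque bytes
prompt = send_to_cloud_llm(draft)  # server sees the words themselves
print(len(ciphertext), prompt)
```

Note what the sketch doesn’t show: nothing the recipient can check tells them whether a message they received stayed on the sender’s device or took the second path first.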
also, Proton’s published security model for their LLM feature (which is ultra-thin, and reads more like a PR puff piece than any security model they’ve published before) states that their no-log policy is what makes the cloud version of the LLM secure. but their no-log policy has gigantic holes in it, and Proton’s response to these concerns is utterly unbefitting of a privacy/security software company.
Ah OK, so it’s sending the email draft in progress, not sending off the content of incoming messages or your final sent messages. Now I understand. Also, that’s still bad…
I’d personally consider that sufficient grounds to accuse Proton of stealing its customers’ data.
At the (minuscule) risk of sounding unnecessarily harsh on tech, any customer data that gets sent to company servers without the customer’s explicit, uncoerced permission should be considered stolen.