this post was submitted on 06 Jul 2024
submitted 4 months ago* (last edited 4 months ago) by Recant to c/foss
 

As mentioned in the comments, plain text keys aren't inherently bad, because they're necessary: you have to have at least one plain text key somewhere in order to be able to use encryption at all.

[–] bjoern_tantau@swg-empire.de 15 points 4 months ago (2 children)

How else should the keys be stored?

[–] Pechente@feddit.org 8 points 4 months ago (1 children)

There are system specific encryption methods like keychain services on iOS to store exactly this kind of sensitive information.
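For illustration, here's a minimal Swift sketch of storing a secret with Keychain Services - the service and account names are placeholders, not anything Signal actually uses:

```swift
import Foundation
import Security

// Minimal sketch: keep a database encryption key in the Keychain as a
// generic-password item instead of writing it to disk in plain text.
private let baseQuery: [String: Any] = [
    kSecClass as String: kSecClassGenericPassword,
    kSecAttrService as String: "example.messenger.db-key", // placeholder
    kSecAttrAccount as String: "default"
]

func storeKey(_ key: Data) -> OSStatus {
    SecItemDelete(baseQuery as CFDictionary) // replace any existing item
    var attrs = baseQuery
    attrs[kSecValueData as String] = key
    attrs[kSecAttrAccessible as String] = kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly
    return SecItemAdd(attrs as CFDictionary, nil)
}

func loadKey() -> Data? {
    var query = baseQuery
    query[kSecReturnData as String] = true
    query[kSecMatchLimit as String] = kSecMatchLimitOne
    var item: CFTypeRef?
    guard SecItemCopyMatching(query as CFDictionary, &item) == errSecSuccess else { return nil }
    return item as? Data
}
```

The item is encrypted at rest by the OS and scoped to the app that created it, so another process can't simply read it off disk.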

[–] ericjmorey@programming.dev 10 points 4 months ago (1 children)

How would that provide additional security in the particular circumstance of someone having access to the Signal encryption keys on someone's phone?

[–] hedgehog@ttrpg.network 5 points 4 months ago

This particular scenario involves the MacOS desktop app, not the phone app. The link is showing just an image for me - I think it’s supposed to be to https://stackdiary.com/signal-under-fire-for-storing-encryption-keys-in-plaintext/

That said, let’s compare how it works on the phone to how it could work on MacOS and how it actually works on MacOS. In each scenario, we’ll suppose you installed an app that has hidden malware - we’ll call it X (just as a placeholder name) - and compare how much data that app has access to. Access to session data allows the app to spoof your client and send+receive messages.

On the phone, your data is sandboxed. X cannot access your Signal messages or session data. ✅ Signal may also encrypt the data and store an encryption key in the database, but this wouldn’t improve security except in very specific circumstances (basically it would mean that if exploits were being used to access your data, you’d need more exploits if the key were in the keychain). Downside: On iOS at least, you also don’t have access to this data.

On MacOS, it could be implemented using sandboxed data. Then, X would not be able to access your Signal messages or spoof your session unless you explicitly allowed it to (it could request access to it and you would be shown a modal). ✅ Downside: the UX to upload attachments is worse.

It could also be implemented by storing the encryption key in the keychain instead of in plaintext on disk. Then, X would not be able to access your Signal messages and session data. It might be able to request access - I’m not sure. As a user, you can access the keychain but you have to re-authenticate. ✅ Downside: None.

It’s actually implemented by storing the encryption key in plaintext, collocated with the encrypted database file. X can access your messages and session data. ❌

Is it foolproof? No, of course not. But it’s an easy step that would probably take an hour of dev time to refactor. They’re even already storing a key, just not one that’s used for this. And this has been a known issue that they’ve refused to fix for several years. Because of their hostile behavior towards forks, the FOSS community also cannot distribute a hardened version that fixes this issue.
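To make the keychain option above concrete: Signal Desktop is an Electron app, so the real fix would go through its own keychain bindings, but the shape of the flow is roughly this. A sketch only, reusing the hypothetical loadKey()/storeKey() helpers from the Keychain example earlier in the thread:

```swift
import Foundation
import Security

// Sketch of the "key in the keychain instead of plaintext on disk" startup
// flow: the database stays encrypted on disk, and the key that unlocks it
// lives only in the OS keychain.
func loadOrCreateDatabaseKey() -> Data {
    if let existing = loadKey() {          // hypothetical helper, see above
        return existing
    }
    // First run: generate a fresh 256-bit key and park it in the keychain
    // rather than writing it next to the encrypted database file.
    var key = Data(count: 32)
    let rngStatus = key.withUnsafeMutableBytes {
        SecRandomCopyBytes(kSecRandomDefault, 32, $0.baseAddress!)
    }
    precondition(rngStatus == errSecSuccess, "secure RNG failed")
    precondition(storeKey(key) == errSecSuccess, "keychain write failed") // hypothetical helper
    return key
}
```

In the scenarios above, X can read a plaintext file sitting next to the database, but it can't silently pull this keychain item without going through the OS.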

[–] scott@lem.free.as 4 points 4 months ago (1 children)

In the device's secure enclave (e.g. TPM).

[–] onlinepersona@programming.dev 13 points 4 months ago* (last edited 4 months ago) (1 children)

How does that help when somebody has access to the phone via your PIN or password?

Anti Commercial-AI license

[–] chris@l.roofo.cc 4 points 4 months ago (1 children)

If I'm not mistaken, you can save keys in these chips so that they cannot be extracted. You can only use the key to encrypt/decrypt/sign/verify by asking the chip to perform those operations with your key.
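For anyone curious what that looks like in practice, here's a rough Swift sketch of creating a key inside Apple's Secure Enclave (the tag name is a placeholder; this is illustrative, not Signal's code). The private key is generated inside the chip, is non-exportable, and can only be used by asking the chip to sign or decrypt:

```swift
import Foundation
import Security

// Create a P-256 key whose private half lives inside the Secure Enclave.
// The key material can never leave the chip; you can only ask it to
// perform operations (here: signing) on your behalf.
var error: Unmanaged<CFError>?

let access = SecAccessControlCreateWithFlags(
    kCFAllocatorDefault,
    kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
    .privateKeyUsage,
    &error
)!

let attributes: [String: Any] = [
    kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
    kSecAttrKeySizeInBits as String: 256,
    kSecAttrTokenID as String: kSecAttrTokenIDSecureEnclave, // hardware-backed
    kSecPrivateKeyAttrs as String: [
        kSecAttrIsPermanent as String: true,
        kSecAttrApplicationTag as String: Data("example.db-key".utf8), // placeholder tag
        kSecAttrAccessControl as String: access
    ]
]

guard let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error) else {
    fatalError("key creation failed: \(error!.takeRetainedValue())")
}

// Hand data to the enclave for signing; the private key never leaves it.
let signature = SecKeyCreateSignature(
    privateKey,
    .ecdsaSignatureMessageX962SHA256,
    Data("hello".utf8) as CFData,
    &error
)
```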

[–] onlinepersona@programming.dev 5 points 4 months ago (2 children)

That sounds only marginally better. Access to the phone still means you can create a backup containing the key, so TPM wouldn't help much.

Anti Commercial-AI license

[–] lud@lemm.ee 3 points 4 months ago (1 children)

No, why would a backup contain non-exportable information? One of the reasons to use TPM to begin with is that sensitive information can't leave it.

[–] onlinepersona@programming.dev 1 points 4 months ago (1 children)

How do you restore a backup on another phone without the keys?

Anti Commercial-AI license

[–] lud@lemm.ee 2 points 4 months ago

You would probably use a recovery key that exists exclusively elsewhere, like on paper in a vault - the way BitLocker does it.

I have no idea if Signal uses a TPM or not, but generally keys in a TPM are non-exportable, which is a very good thing and IMO the primary reason to use a TPM at all.

[–] scott@lem.free.as 1 points 4 months ago (1 children)

One would hope the backup is encrypted.

[–] onlinepersona@programming.dev 3 points 4 months ago* (last edited 4 months ago)

It is. A password is generated that you have to write down. It must've been a compromise: they knew most people would just pick a shitty password if one weren't generated for them, and it would end up on a piece of paper or in some digital form anyway.

Anti Commercial-AI license
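For context, the generated backup password described above is a numeric passphrase (30 digits on Android, if I remember right). Generating one is cheap; a rough Swift sketch, illustrative only and not Signal's actual code:

```swift
import Foundation
import Security

// Rough sketch: generate a random numeric backup passphrase.
// Rejection sampling keeps each digit uniformly distributed.
func generateBackupPassphrase(digits: Int = 30) -> String {
    var passphrase = ""
    while passphrase.count < digits {
        var byte: UInt8 = 0
        precondition(SecRandomCopyBytes(kSecRandomDefault, 1, &byte) == errSecSuccess)
        if byte < 250 { // 250 = largest multiple of 10 that fits in a byte
            passphrase.append(String(byte % 10))
        }
    }
    return passphrase
}
```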