• Max-P@lemmy.max-p.me · 4 months ago

        You don’t have to trust the server, and shouldn’t have to, if the client is doing proper E2E encryption, because you know the maximum amount of metadata the server can have.

        • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 4 months ago

          Your phone number is the metadata that’s not encrypted; that’s literally the whole problem here. The Signal server is able to harvest graphs of which phone numbers interact with one another.
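
          To make the concern concrete, here is a minimal, hypothetical sketch (not Signal’s code; the numbers and log format are invented) of how a server that can see sender and recipient identifiers on each message could assemble a contact graph:

          ```python
          from collections import defaultdict

          # Hypothetical routing log: (sender, recipient) pairs as a server
          # would see them if sender identifiers were visible on every message.
          routing_log = [
              ("+15550001111", "+15550002222"),
              ("+15550001111", "+15550003333"),
              ("+15550002222", "+15550003333"),
          ]

          # Build an undirected "who talks to whom" graph.
          graph = defaultdict(set)
          for sender, recipient in routing_log:
              graph[sender].add(recipient)
              graph[recipient].add(sender)

          for number, contacts in sorted(graph.items()):
              print(number, "->", sorted(contacts))
          ```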

          • notabot@lemm.ee · 4 months ago

            With ‘sealed sender’, your phone number, or any other identifying information, is not included in the metadata on the envelope; only the recipient’s ID is visible, and it’s up to the recipient’s client to validate the sender information that is inside the encrypted envelope. It looks like a step in the right direction, though I don’t use Signal enough to have looked into auditing it myself.
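
            For illustration only, a toy sketch of the ‘sealed sender’ idea described above: the outer envelope exposes only the recipient’s ID, and the recipient’s client checks the sender after decrypting. The names and the stand-in ‘encryption’ are invented; this is not Signal’s actual protocol or wire format.

            ```python
            import json
            from dataclasses import dataclass

            def encrypt_for(recipient_id: str, plaintext: bytes) -> bytes:
                # Stand-in for real public-key encryption to the recipient.
                return plaintext[::-1]  # placeholder, NOT real cryptography

            def decrypt(ciphertext: bytes) -> bytes:
                return ciphertext[::-1]

            @dataclass
            class SealedEnvelope:
                recipient_id: str   # the only identifier visible to the server
                ciphertext: bytes   # sender ID, signature, and body all live inside

            def send(sender_id: str, recipient_id: str, body: str) -> SealedEnvelope:
                inner = json.dumps({"sender": sender_id, "body": body}).encode()
                return SealedEnvelope(recipient_id, encrypt_for(recipient_id, inner))

            def receive(env: SealedEnvelope, known_senders: set) -> str:
                inner = json.loads(decrypt(env.ciphertext))
                # The recipient's client, not the server, checks who sent it.
                if inner["sender"] not in known_senders:
                    raise ValueError("unknown or unverified sender")
                return inner["body"]

            env = send("alice-key", "bob", "hi")
            print(env.recipient_id)              # the server only ever sees "bob"
            print(receive(env, {"alice-key"}))   # prints "hi"
            ```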

            • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 4 months ago

              Again, this is a trust-based system because you don’t know what the server is actually doing. The fact is that the server does collect enough information to trivially link phone numbers to the connections on the network. If ‘trust me, bro’ from Moxie is good enough for you, that’s of course your prerogative.

              • notabot@lemm.ee · 4 months ago

                You’re correct that if you use the system the way it used to work, they can trivially build that connection. But (and I know this is a big assumption) if it now works the way they say it does, they no longer have the information to do that, as the client doesn’t actually authenticate to the server to send a message. Yes, with some network tracing they could probably still work out that you’re the same client that did log in to read messages, and that’s certainly a concern. I would prefer to see a messaging app that uses cryptographic keys as the only identifiers and uses different keys for different contact pairs (sketched below), but given their general architecture it seems they’ve tried to deal with the issue.

                Assuming that you want to use a publicly accessible messaging app, do you have any ideas about how it should be architected? The biggest issue I see is that the client runs on your phone, and unless you’ve compiled it yourself, you can’t know what it’s actually doing.
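
                A hypothetical sketch of the per-contact-pair identifier idea mentioned above (not the design of any existing app): each conversation gets its own random identifier, so nothing like a phone number ties a person’s conversations together.

                ```python
                import secrets

                class Identity:
                    """One user's local identity; nothing here is a global identifier."""

                    def __init__(self):
                        self.pairwise_ids = {}  # contact label -> per-conversation ID

                    def new_conversation(self, label: str) -> bytes:
                        # In a real design this would be a fresh asymmetric keypair;
                        # a random 32-byte token stands in for the public half here.
                        conv_id = secrets.token_bytes(32)
                        self.pairwise_ids[label] = conv_id
                        return conv_id

                alice = Identity()
                id_for_bob = alice.new_conversation("bob")
                id_for_carol = alice.new_conversation("carol")

                # Neither the server nor the contacts can tell, just from these IDs,
                # that both conversations belong to the same person.
                assert id_for_bob != id_for_carol
                ```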

                • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 4 months ago

                  Again, everything you say is based purely on faith. As you acknowledge, the design of the system is such that the people operating the server can trivially build out graphs of user connections. All the same arguments people apply to not trusting server-side encryption apply equally to metadata.

                  Meanwhile, there are plenty of examples of messaging apps that don’t require phone numbers; Matrix, Wire, and SimpleX Chat are just a few. Being able to build your own client is also important, and there is a concept of reproducible builds which allows people to be reasonably sure that a shipped binary is compiled from the source that’s published. These are solved problems, and there is no technical reason for Signal to do what it’s doing.
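
                  As a rough sketch of the reproducible-builds check described above: compile the client from the published source yourself, then compare hashes with the distributed binary. The file names here are placeholders, and in practice signed release artifacts need their signatures stripped before comparison.

                  ```python
                  import hashlib

                  def sha256(path: str) -> str:
                      h = hashlib.sha256()
                      with open(path, "rb") as f:
                          for chunk in iter(lambda: f.read(1 << 20), b""):
                              h.update(chunk)
                      return h.hexdigest()

                  # Placeholders: one artifact you compiled from the published source,
                  # one downloaded from the official distribution channel.
                  local = sha256("my-local-build.apk")
                  official = sha256("official-release.apk")

                  if local == official:
                      print("Reproducible: the shipped binary matches the source build.")
                  else:
                      print("Mismatch: the shipped binary may not come from the published source.")
                  ```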

                  • notabot@lemm.ee · 4 months ago

                    I agree that them having users’ phone numbers isn’t ideal. There are other identifiers they could use that would work just as well. However, both the client and server are open source, so you can build at least the client yourself. If you can satisfy yourself that it does not leak your ID when sending messages, then you don’t need to trust the server, as it does not have the information to build a graph of your contacts. Sealed sender seems to have been announced in 2018, so it’s had time to be tested.

                    Don’t get me wrong, the fact they require a phone number at all is a huge concern, and the reason I don’t really use it much, but the concern you initially stated was addressed years ago and you can build the client yourself to validate that.

            • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 4 months ago

              I’m talking about the information the server has. The encrypted envelope has nothing to do with that. You register with the server using your phone number, which is a unique identifier for your account. When you send messages to other people via the server, it knows which accounts you’re talking to and what their phone numbers are.

              • notabot@lemm.ee · 4 months ago

                Whilst I absolutely agree it’s correct to be skeptical about it, the ‘sealed sender’ process means they don’t actually know which account sent the message, just which account it should be delivered to. Your client doesn’t even authenticate to send the message.

                Now, I’m just going on what they’ve published on the system, so either I could be completely wrong, or they could be being misleading, but it does look like they’ve tried to address the very issue you’ve been pointing out. Obviously it’d be better if they didn’t have your phone number at all, but this does seem to decouple it in a way that means they can’t build a connection graph.

                • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 4 months ago

                  The problem is that there is no way to verify any of this. You’re just putting trust into people operating this service. That’s not how security is supposed to work.

                  • notabot@lemm.ee · 4 months ago

                    Strictly speaking, you’re having to trust the build of the client rather than the people running the server. If the client doesn’t send or leak the information to the server, the people running the server can’t do anything with it. It’s definitely still a concern, and if I’m going to use a hosted messaging app, I’d much rather see the client built and published by a different group, and ideally compile it myself. Apart from that, I’m not sure there’s any way to satisfy your concerns without building and running the server and client yourself.

            • sunzu2@thebrainbin.org · 4 months ago

              First comment to provide a decent counterpoint.

              Looks like Signal and email use both, but it still does not answer the question.

              AI said:

              The server knows who initiated the communication (they handed over their lockbox first), but not the direction of individual messages within a conversation.

              • notabot@lemm.ee · 4 months ago

                ‘Sealed sender’ seems to avoid this by not actually requiring the client to authenticate to the server at all, and relying on the recipient to validate that it’s signed by the sender they expect from the encrypted data in the envelope. As I mentioned in another reply, I’m just going on what they’ve published on the system, so either I could be completely wrong, or they could be being misleading, but it does look like they’ve tried to address this issue.

                • sunzu2@thebrainbin.org · 4 months ago

                  I’m just going on what they’ve published on the system, so either I could be completely wrong, or they could be being misleading

                  Same here overall. Thank you for responding.

          • Max-P@lemmy.max-p.me · 4 months ago

            The identifier is unavoidable for push notifications to work. It needs to know which phone to send them to, after all; even if it doesn’t use Google’s services, it would still need a way to know which device has new messages when it checks in. If it’s not a phone number, it’s gonna be some other kind of ID. Messages need a recipient.
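
            A hypothetical sketch of that point (not any particular app’s design): delivery needs some recipient identifier, but it can be an opaque random token rather than a phone number.

            ```python
            import secrets
            from collections import defaultdict

            # Server-side mailboxes keyed by an opaque device/queue ID.
            mailboxes = defaultdict(list)

            def register_device() -> str:
                # The server only ever learns this random token, never a phone number.
                return secrets.token_urlsafe(16)

            def deliver(device_id: str, ciphertext: bytes) -> None:
                mailboxes[device_id].append(ciphertext)

            def check_in(device_id: str) -> list:
                # This is the "which device has new messages" lookup described above.
                return mailboxes.pop(device_id, [])

            bob_device = register_device()
            deliver(bob_device, b"<encrypted message>")
            print(check_in(bob_device))
            ```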

            Also, Signal’s goal is protecting conversations for the normies, not being bulletproof enough to run the next Silk Road at the cost of usability. Signal wants to upgrade people’s SMS messaging and make encryption the norm, and you have to make some sacrifices for that. Phone numbers were a deliberate decision so that people can just install Signal and start using E2E texting immediately.

            If you want something really private you should be using Tor- or I2P-based solutions, because those are the only systems that can reasonably hide both source and destination completely. Signal has your phone number and IP address, after all. They could track your every movement.

            Most people don’t need to hide who they talk to; they want privacy for their conversations and their content. Solutions with perfect anonymity between users are hard to understand and use for the average person, who is Signal’s target audience.

            • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 4 months ago

              The identifier absolutely does not need to be your phone number, and plenty of other apps are able to do push notifications without harvesting personal information from the users.

              Meanwhile, normies don’t need Signal in the first place since e2ee primarily protects you from things like government agencies snooping on your data.

            • Possibly linux@lemmy.zip · 4 months ago

              Just a side note, but both SimpleX Chat and Briar are free of unique, identifying user IDs.

              SimpleX Chat uses hash tables. It still has a centralized server (which you can self-host), but you can use the built-in Tor functionality to hide your IP.

              Briar is totally decentralized. All messages go directly over Tor, but it can also use Wi-Fi and Bluetooth. It supports group content types such as forums and blogs. The downside is that your device needs to stay connected. You can also use Briar Mailboxes on an old phone to receive messages more reliably.

      • ᗪᗩᗰᑎ@lemmy.ml · 4 months ago

        Signal has been forced by courts to provide all the information they have for specific phone numbers [0][1]. The only data they can provide is the date/time a profile was created and the last date (not time) a client pinged their server. That’s it, because that’s all the data they collect.

        Feel free to browse the evidence below; they worked with the ACLU to ensure they could publish the documents, as they had been served a gag order not to talk about the request publicly [2].

        [0] https://signal.org/bigbrother/

        [1] https://www.aclu.org/news/national-security/new-documents-reveal-government-effort-impose-secrecy-encryption

        [2] https://www.aclu.org/sites/default/files/field_document/open_whisper_documents_0.pdf#page=8

        • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 4 months ago

          Once again, even if this is the way things worked back in 2016 there is no guarantee they still work like that today. This is the whole problem with a trust-based system. You are trusting the people operating the server. It’s absolutely shocking to me that people have such a hard time accepting this basic fact.

          • ᗪᗩᗰᑎ@lemmy.ml · 4 months ago

            Once again, even if this is the way things worked back in 2016 there is no guarantee they still work like that today.

            You have to trust someone. You’re not building all your software and reading every line yourself are you?

            While there’s no guarantees, Signal continues to produce evidence that they don’t collect data. Latest publication August 8th, 2024: https://signal.org/bigbrother/santa-clara-county/

            The code is open and has had a few audits: https://community.signalusers.org/t/overview-of-third-party-security-audits/13243

            This is the whole problem with a trust-based system

            Can you point me to a working trustless system? I’m not sure one exists. You might say peer-to-peer systems are trustless because there’s no third party, but did you compile the code yourself? Did you read every last line of code before you compiled it, and understand exactly what it was doing?

            It’s absolutely shocking to me that people have such a hard time accepting this basic fact.

            What’s shocking to me is the lack of understanding that unless you’re developing the entire platform yourself, you have to trust someone at some point. Signal continues to post subpoenas to prove they collect no data, has an open-source client and server, provides reproducible builds, and continues to be the gold standard recommended by cryptographers.

            I would recommend that anyone reading this rely on the experts and people who are being open and honest rather than those who try to push you to less secure platforms.

            • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 4 months ago

              You have to trust someone. You’re not building all your software and reading every line yourself are you?

              No, you don’t have to trust anyone. That’s literally the point of having secure protocols that don’t leak your personal data. 🤦

              Signal made an intentional choice to harvest people’s phone numbers. The rationale for doing that is very thin, and plenty of other messengers avoid doing this. The fact that Signal insists on doing it is a huge red flag all on its own.

              The code is open and has had a few audits

              Only the people who are actually operating the server know what’s running on it. The fact that Signal aggressively prevents the use of third-party clients and refuses to implement federation that would allow other servers to run is again very suspect.

              Can you point me to a working trustless system?

              SimpleX, Matrix, Briar, and plenty of other chat systems do not collect personal data.

              You might say peer-to-peer systems are trustless because there’s no third party, but did you compile the code yourself? Did you read every last line of code before you compiled it, and understand exactly what it was doing?

              The discussion in this thread is specifically about Signal harvesting phone numbers. Something Signal has no technical reason to do.

              What’s shocking to me is the lack of understanding that unless you’re developing the entire platform yourself, you have to trust someone at some point. Signal continues to post subpoenas to prove they collect no data, has an open-source client and server, provides reproducible builds, and continues to be the gold standard recommended by cryptographers.

              Kind of ironic that you’ve exposed yourself as being utterly clueless on the subject while accusing me of lack of understanding.

              I would recommend that anyone reading this rely on the experts and people who are being open and honest rather than those who try to push you to less secure platforms.

              I would recommend that anyone reading this rely on rational thinking and ignore trolls who tell you to just trust Signal. Privacy and security are not based on trust, and if you ask any actual expert in the field, they will tell you that.

              • ᗪᗩᗰᑎ@lemmy.ml · 4 months ago

                No, you don’t have to trust anyone. That’s literally the point of having secure protocols that don’t leak your personal data. 🤦

                Unless you’re reading all the code, understanding the protocols, and compiling it yourself, you are placing your trust in someone else to do it for you. There’s no way around this fact.

                You suggest SimpleX, Matrix, and Briar (which I believe are great projects, by the way; I’ve used them all and continue to use SimpleX and Matrix), but have you read the code, understood the underlying protocols, and compiled the clients yourself, or are you placing your trust in a third party to do it for you? Be honest.

                I will agree, though: if you absolutely do not trust Signal, you should use Briar or SimpleX, but neither is ready for “everyday” users. Briar doesn’t support iPhones, so it’s basically dead in the water unless you can convince family and friends to switch their entire platform. SimpleX is almost there but it still continues to fail to notify me of messages, continues to crash, and the UX needs significant improvement before people are willing to put up with it.

                The discussion in this thread is specifically about Signal harvesting phone numbers. Something Signal has no technical reason to do.

                Let me give you a history lesson, since you seem to have no clue about where Signal started and why they use phone numbers. Signal started as an encryption layer over standard text/SMS named TextSecure. They required phone numbers because that’s how encrypted messages were being sent. In 2014, TextSecure migrated to using the internet as a data channel to allow them to obscure additional metadata from cell phone providers, as well as provide additional features like encrypted group chats. Signal continued to use phone numbers because it was a text message replacement which allowed people to install the app and see all their contacts and immediately start talking to them without having to take additional action - this helps with onboarding of less technical users. Fast forward to today and Signal is only using phone numbers as a spam mitigation filter and to create your initial profile that is no longer being shared with anyone unless you opt into it.

                Now, you can say they’re collecting phone numbers for other nefarious purposes, but they publish evidence that they don’t. Will they ever get rid of phone numbers? Unlikely, unless they figure out a good alternative for blocking spam accounts.

                Privacy and security are not based on trust

                You’re 100% right. If you read the code, understand the protocols, and build the clients from source, you don’t have to trust anyone 😊

                • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 4 months ago

                  Unless you’re reading all the code, understanding the protocols, and compiling it yourself, you are placing your trust in someone else to do it for you. There’s no way around this fact.

                  That’s why you have a lot of eyes on the code and security experts who dedicate their research to finding flaws and breaking algorithms. It’s certainly a very different scenario from simply trusting the people who run a server. The fact that this even needs to be said is frankly phenomenal. There’s also the concept of reproducible builds, so even if you’re not compiling everything yourself you can be reasonably sure that what’s packaged in the binary was in fact compiled from the source. Again, these are solved problems.

                  SimpleX is almost there but it still continues to fail to notify me of messages, continues to crash, and the UX needs significant improvement before people are willing to put up with it.

                  If people genuinely care about privacy then it’s important to promote apps that actually care about privacy by design and invest in improving these apps instead of just perpetuating the problem by recommending Signal. Even Matrix is far better in terms of privacy and it’s plenty mature at this point.

                  Let me give you a history lesson, since you seem to have no clue about where Signal started and why they use phone numbers.

                  I’m well aware of the history and the justifications. The fact remains that I simply do not trust Signal knowing where it originates.

                  Fast forward to today and Signal is only using phone numbers as a spam mitigation filter and to create your initial profile that is no longer being shared with anyone unless you opt into it.

                  The correct statement is that Signal claims to do this; there is no way for an outside party to verify that this is actually the case, hence it comes down to taking what the people operating Signal say on faith.

                  You’re 100% right. If you read the code, understand the protocols, and build the clients from source, you don’t have to trust anyone 😊

                  Trusting countless researchers and security experts to read the code, understand the protocols, and provide reproducible builds is a lot better than trusting a sketchy US company that was started by the CIA and NED. I guess that’s a concept that’s difficult for some to wrap their heads around, though.

                  • ᗪᗩᗰᑎ@lemmy.ml · 4 months ago

                    Even Matrix is far better in terms of privacy and it’s plenty mature at this point.

                    I would disagree; this guy’s been finding issues and reporting them to Matrix for a while now, and appears to find more every time he glances at the project. I LOVE Matrix. I would recommend it over Discord, Telegram, etc., but I would not recommend Matrix over Signal.

                    The fact remains that I simply do not trust Signal knowing where it originates.

                    This is fair. No critique against this stance.

                    Trusting countless researchers and security experts to read the code, understand the protocols, and provide reproducible builds

                    I agree! Trust the countless researchers, security and cryptography experts.

                    … is a lot better than trusting a sketchy US company that was started by the CIA and NED.

                    You’re gonna have to cite your sources.

          • Possibly linux@lemmy.zip · 4 months ago

            True, but I find the opposite end of the spectrum hard to believe. Extraordinary claims require extraordinary proof.

            What is known is that government agents from countries like Iran, China, and Russia are actively spreading misinformation. Not to say that you are a government agent, but you should doubt the arguments on both sides. For instance, using Signal is way better than not using an audited, encrypted messenger at all. Oftentimes I see people jump to worse platforms. I think it is important to understand the problems with Signal.

      • Possibly linux@lemmy.zip · 4 months ago

        True, however your claim lacks evidence. They have your phone number and a few timestamps. That isn’t going to help much.

        • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 4 months ago

          My claim is that privacy should not be based on trust. This appears to be a very difficult concept for people in this thread to understand.

            • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · 4 months ago

              Yeah, you trust that the encryption algorithm is designed correctly and that it doesn’t leak data because many people have audited it and nobody found a flaw in it. You absolutely will not have to trust the people operating servers, however. If you can figure out why E2EE is important, then I’m sure you’ll be able to extrapolate from that why metadata shouldn’t be seen by the server either.

    • Pherenike@lemmy.ml · 4 months ago

      I’m not very tech-savvy, and that article looks very nice, but it’s kind of old, and it’s true that they haven’t been as transparent (or as frequently audited) as other services, and they still require a phone number to set up an account, even if you can switch to only using a username later. Also, they removed the encrypted database, and Molly brings that back, which is the main reason I use it. Another thing I don’t like about Signal is how ferociously they’ve tried to shut down forks in the past, and how they don’t say that you need Google Play Services for it to work properly. Sadly, it’s the only “privacy-conscious” service I’ve managed to get most of my family and friends to use, after trying for years.

      • Possibly linux@lemmy.zip · 4 months ago

        They only shut down forks that violate Signal’s branding. Mozilla does the same thing with Firefox.

        It is libre, so if you fork it there is nothing they can do. Also, if they were really hostile, they would have used a non-libre license or made it entirely proprietary.