Pavel Durov’s arrest suggests that the law enforcement dragnet is being widened from private financial transactions to private speech.

The arrest of the Telegram CEO Pavel Durov in France this week is extremely significant. It confirms that we are deep into the second crypto war, in which governments are systematically seeking to prosecute developers of digital encryption tools because encryption frustrates state surveillance and control. While the first crypto war in the 1990s was led by the United States, this one is being led by the European Union — now a regulatory superpower in its own right.

Durov, a Russian-born entrepreneur who is now a French citizen, was arrested in Paris on Saturday and has since been indicted. You can read the French accusations here. They include complicity in drug possession and sale, fraud, child pornography and money laundering. These are extremely serious crimes — but note that the charge is complicity, not participation. The meaning of that word “complicity” seems to be revealed by the last three charges: Telegram has been providing users with a “cryptology tool” unauthorised by French regulators.

  • einkorn@feddit.org · 3 months ago

    Well, except Telegram isn’t a good tool for privacy.

    There is no E2EE by default: end-to-end encryption is only available for 1:1 “secret chats” and is turned off unless you enable it. Telegram doesn’t fully disclose how its encryption is implemented, so there is no way to verify its (in)effectiveness. And since Telegram can block channels from its end, it clearly has access to the content on its end, so there is no privacy from Telegram itself either.
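
    To make the trust model concrete: the question is simply who holds the decryption key. Below is a minimal sketch of the two models in Python (using the cryptography package); it illustrates the general idea only, not Telegram’s actual MTProto protocol, and every name in it is made up.

    ```python
    # Minimal sketch (not Telegram's real protocol) of the difference between a
    # provider-readable "cloud chat" and an end-to-end encrypted "secret chat".
    from cryptography.fernet import Fernet

    # Cloud-chat model: the *service* generates and stores the key,
    # so it can decrypt anything it stores or is asked to hand over.
    service_key = Fernet.generate_key()
    service = Fernet(service_key)
    stored = service.encrypt(b"message in a default (cloud) chat")
    print(service.decrypt(stored))        # the provider can always read this

    # E2EE model: only the two clients hold the key; the service relays
    # ciphertext it cannot decrypt (how the clients agree on the key is
    # out of scope here).
    client_key = Fernet.generate_key()    # known to the two endpoints only
    alice, bob = Fernet(client_key), Fernet(client_key)
    relayed = alice.encrypt(b"message in a secret chat")
    print(bob.decrypt(relayed))           # the endpoints can read it
    # The relay holds only `relayed`: no key, so it sees only ciphertext.
    ```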

    • Libb@jlai.lu · 3 months ago

      Well, except Telegram isn’t a good tool for privacy.

      That’s not the point. The hunting down of tools and their creators (and of our right to privacy) is the issue here. At least, imho.

      • Rose@lemmy.zip · 3 months ago

        It has nothing to do with privacy. Telegram is an old-school social network in the sense that it doesn’t even require you to register to view content pages. It’s also a social network taken to the extreme of free-speech absolutism: it doesn’t mind people talking openly about every kind of crime, or using its tools to make the related services easier to obtain. All of that with no encryption at all.

          • Pup Biru@aussie.zone · 3 months ago

            free speech can be good. free speech can also be bad. overall it’s more good than bad, but society seems to agree that free speech has limits - you can’t defame someone, for example

            free speech absolutism is fucking dumb; just like most other absolutist stances

            this also isn’t even about free speech - this is about someone having access to information requested by investigators to solve crimes, and then refusing to give that information

            • istanbullu@lemmy.ml · 3 months ago

              This is pure nonsense.

              Western governments hate Telegram because, until now, Telegram hasn’t cooperated with Western intelligence services the way American social media companies do. Everything on Meta or Google gets fed to the NSA, but Telegram has been uncooperative.

              This will likely change after Durov’s arrest, but it was nice while it lasted.

              • Pup Biru@aussie.zone · 3 months ago

                we don’t disagree about that: governments don’t like that telegram doesn’t cooperate; that’s not in dispute

                where the disagreement comes is the part after. telegram (and indeed meta, google, etc) have that data at their disposal. when served with a legal notice to provide information to authorities or shut down illegal behaviour on their platforms, they comply - sometimes that’s a bad thing if the government is overreaching, but sometimes it’s also a good thing (in the case of CSAM and other serious crimes)

                there are plenty of clear cut examples of where telegram should shut down channels - CSAM etc… that’s what this arrest was about; the rest is academic

                • istanbullu@lemmy.ml · 3 months ago

                  there are plenty of clear cut examples of where telegram should shut down channels - CSAM etc… that’s what this arrest was about; the rest is academic

                  Was it? The French authorities did not provide any convincing evidence, just accusations.

              • octopus_ink@lemmy.ml · 3 months ago

                This will likely change after Durov’s arrest, but it was nice while it lasted.

                Why use a tool that relies on the goodwill of the operator to secure your privacy? It’s foolish in the first place.

                The operator of that tool tomorrow may not be the operator of today, and today’s operator can be compromised by blackmail, legally compelled (see OP), physically compelled, etc., into breaking that trust.

                ANYONE who understood how telegram works and also felt it was a tool for privacy doesn’t really understand privacy in the digital age.

                Quoting @[email protected] :

                Other encrypted platforms: we have no data so we can’t turn over data

                Telegram: we collect it all. No you can’t know who is posting child abuse content

                And frankly, if they have knowledge of who is sharing CSAM, it’s entirely ethical for them to be compelled to share it.

                But what about when it’s who is questioning their sexuality or gender? Or who is organizing a protest in a country that puts down protests and dissent violently? Or… Or… Or… There are so many examples where privacy IS important AND ethical, but in zero of those does it make sense to rely on the goodwill of the operator to safeguard that privacy.

                • istanbullu@lemmy.ml · 3 months ago

                  ANYONE who understood how telegram works and also felt it was a tool for privacy doesn’t really understand privacy in the digital age.

                  Telegram is the most realistic option for breaking Meta’s monopoly. You might like Signal very much, but nobody uses it and the user experience is horrible.

        • Dark Arc@social.packetloss.gg · 3 months ago

          Questionable interpretation. Privacy doesn’t mean mathematically proven privacy. A changing booth in a store provides privacy, but only because the store owner agreed not to monitor it (and in many cases is required by law not to).

          Effectively what you and the original commenter are saying (collectively) is that mathematically proven privacy is the only privacy that matters for the Internet. Operators that do not mathematically provide privacy should just do whatever government officials ask them to do.

          We only have the French government’s word to go off of right now. Maybe Telegram’s refusals are totally unreasonable but maybe they’re not.

          A smarter route probably would’ve been to fight through the court system in France on a case-by-case basis rather than ignoring prosecutors (assuming the French narrative is the whole story). Still, I think this is all murkier than you’d like to think.

          • Rose@lemmy.zip · 3 months ago

            It’s a street, not a changing booth. Also, I’m familiar with every charge against Durov and I personally have seen the illegal content I talked about. If it’s so easily accessible to the public and persists for years, it has nothing to do with privacy and there is no moderation - and Durov’s own words underscore the latter point.

            • Dark Arc@social.packetloss.gg · 3 months ago

              Who said it’s a street? What makes it a street?

              personally have seen the illegal content I talked about.

              Did you seek it out? Neither I nor anyone I know personally has ever encountered anything like what was described on that platform, and I’ve been on it for years.

              Was it the same “channel” or “group chat” that persisted for years?

              What gives them the right or responsibility to moderate a group chat or channel more than say Signal or Threema? Just because their technical back end lets them?

              I mean by that argument Signal could do client side scanning on everything (that’s an enforcement at the platform level that fits their technical limitations). Is that where we’re at? “If you can figure out how to violate privacy in the name of looking for illegal content, you should.”

              Nothing Telegram offers is equivalent to the algorithmic feeds that demand moderation on YouTube, Twitter, Instagram, or Facebook; everything has to be sought out.

              Make no mistake, I’m not defending the content. The people who used the platform to share that content should be arrested. However, I’m not sure I agree with the moral dichotomy we’ve gotten ourselves into where e.g., the messenger is legally responsible for refusing service to people doing illegal activity.

              • Rose@lemmy.zip · 3 months ago

                I won’t go into the specific channels so as not to promote them or what they do, but we can talk about one known example, which is how Bellingcat got to the FSB officers responsible for the poisoning of Navalny via their mobile phone call logs and airline ticket data. They used the two highly popular bots called H****a and the E** ** G**, which allow you to get everything the government and other social networks know about every citizen of Russia for about $1 to $5. They use the Telegram API and have been there for years. How do you moderate that? You don’t. You take it down as the illegal, privacy-violating, and doxxing-enabling content that it is.

                Edit: “Censored” the names of the bots, as I still don’t want to make them even easier to find.

                • Dark Arc@social.packetloss.gg · 3 months ago

                  which is how Bellingcat got to the FSB officers responsible for the poisoning of Navalny via their mobile phone call logs and airline ticket data

                  Was that a bad thing? I’ve never heard the name Bellingcat before, but it sounds like this would’ve been partially responsible for the reporting about the Navalny poisoning?

                  They used the two highly popular bots called H****a and the E** ** G**, which allow you to get everything the government and other social networks know about every citizen of Russia for about $1 to $5.

                  Ultimately, that sounds like an issue the Russian government needs to fix. Telegram bots are also trivial to launch and duplicate so … actually detecting and shutting that down without it being a massive expensive money pit is difficult.

                  It’s easy to say “oh they’re hosting it, they should just take it down.”

                  https://www.washingtonpost.com/politics/2018/10/16/postal-service-preferred-shipper-drug-dealers/

                  Should the US federal government hold themselves liable for delivering illegal drugs via their own postal service? I mean there’s serious nuance in what’s reasonable liability for a carrier … and personally holding the CEO criminally liable is a pretty extreme instance of that.

      • einkorn@feddit.org · 3 months ago

        I am going to quote myself here:

        The issue I see with Telegram is that they retain a certain control over the content on their platform, as they have blocked channels in the past. That’s unlike for example Signal, which only acts as a carrier for the encrypted data.

        If they have control over what people are able to share via their platform, the relevant laws should apply, imho.

        • Libb@jlai.lu · 3 months ago

          I am going to quote myself here:

          Allow me to quote myself too, then:

          That’s not the point.

          I do not disagree with your remarks (I do not use Telegram); I simply think it’s not the point, or that it should not be.

          Obviously, laws should be enforced. What those laws are, and how they are used to erode things that were considered fundamental rights not so long ago, is the sole issue, once again, im(v)ho ;)

          • einkorn@feddit.org · 3 months ago

            It IS the point. If Telegram were designed and set up as a pure carrier of encrypted information, no one could or should fault them for how the service is used.

            However, this is not the case, and they are able to monitor and control the content that is shared. This means they have a moral and legal responsibility to make sure the service is used in accordance with the law.

          • Serinus@lemmy.world · 3 months ago

            The point is that if you’re going to keep blackmail material, you have to share it with the government.

            The easy answer is to stop keeping blackmail material.

    • istanbullu@lemmy.ml · 3 months ago

      Well, except Telegram isn’t a good tool for privacy.

      If Telegram wasn’t good for privacy, Western governments would not be trying to shut it down.

      E2EE is nice, but doesn’t matter if the government can just seize or hack your phone. Much better to use non-Western social media and messaging apps.

        • istanbullu@lemmy.ml · 3 months ago

          Did you miss the entire Snowden revelations? Western governments are hostile to online privacy and freedom.

      • vxx@lemmy.world · 3 months ago

        If it were a good tool for privacy, Russia would be trying to shut it down the same way they did with Signal.

        • istanbullu@lemmy.ml · 3 months ago

          Russia tried for years to ban Telegram. They stopped after Telegram managed to keep itself alive via proxies.

        • chayleaf@lemmy.ml · 3 months ago

          they did ban it, and everyone still used it (Telegram was good at evading the bans back then, but eventually Roskomnadzor became decent at banning it), and then they unbanned it, whatever that means

      • einkorn@feddit.org · 3 months ago

        If Telegram wasn’t good for privacy, Western governments would not be trying to shut it down.

        They are not trying to shut down Telegram; they are trying to control it.

        E2EE is nice, but doesn’t matter if the government can just seize or hack your phone. Much better to use non-Western social media and messaging apps.

        What kind of argument is this supposed to be? Governments can seize your phone anywhere … oh wait … lemmy.ml … yeah, I see…

        • Possibly linux@lemmy.zip · 3 months ago

          They like to poke fun at the “West”, but Russia, China and the others are all somehow worse. At least in most countries it is controversial to attack journalists and encryption.

          • einkorn@feddit.org · 3 months ago

            In case you are serious: Lemmy.ml is known for being a tankie instance. So a nonsensical anti-west statement makes a lot more sense considering the instance the user chose.

  • oktoberpaard@feddit.nl · 3 months ago

    Telegram’s “privacy” is fully based on people trusting them not to share their data - to which Telegram has full access - with anyone. Well, apart from the optional E2EE “secret chat” option with non-standard encryption methods, which can only be used for one-on-one conversations. If it were an actual privacy app, like Signal, they could’ve cooperated with authorities without giving away chat contents and nobody would’ve been arrested. I’m a Telegram user myself and from a usability standpoint I really like it, but let’s be realistic here: for data safety I would pick another option.

    • endofline@lemmy.ca · 3 months ago

      Matrix has the same issue. Most publicly accessible rooms are unencrypted, largely because of E2EE performance problems in big rooms. Encryption comes with a cost that most people don’t need.

  • some_guy@lemmy.sdf.org · 3 months ago

    The crime is not responding to authorities when obviously illegal content such as CSAM is posted. Don’t let the right try to spin this as a free speech thing. It’s not.

    • Possibly linux@lemmy.zip · 3 months ago

      Other encrypted platforms: we have no data so we can’t turn over data

      Telegram: we collect it all. No you can’t know who is posting child abuse content

      • Pilferjinx@lemmy.world · 3 months ago

        Wait, Telegram has collected it, knows about it, and ultimately condones it? Or is it more of a wilful ignorance and resistance to forced compliance?

      • endofline@lemmy.ca · 3 months ago

        That’s clearly wrong. Matrix also has non-encrypted rooms, and honestly most publicly accessible rooms are non-encrypted. Do you consider Matrix to be in the same “bucket” as Telegram? In Matrix you can create encrypted rooms, but they perform very badly with huge numbers of people, like 1000+.
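
        For a rough sense of why big encrypted rooms hurt, here is a back-of-the-envelope sketch. It assumes a Megolm-style design in which each sender has to share its group-session key with every other device individually (and re-share after membership changes); the device counts and rotation policy are illustrative assumptions, not Matrix measurements.

        ```python
        # Rough cost model for E2EE in a large Matrix-style room (illustrative only).
        # Assumption: a sender shares its outbound session key with every other
        # device once per session, and rotates the session whenever someone leaves
        # so the departed device cannot read newer messages.

        def key_shares_per_session(members: int, devices_per_member: int = 2) -> int:
            """Encrypted key-share messages one sender needs to start a session."""
            return (members - 1) * devices_per_member

        def shares_after_churn(members: int, leaves: int, devices_per_member: int = 2) -> int:
            """Key shares one sender performs if the session rotates after each leave."""
            return sum(key_shares_per_session(m, devices_per_member)
                       for m in range(members, members - leaves, -1))

        print(key_shares_per_session(5))            # 8 shares in a small group
        print(key_shares_per_session(1000))         # 1998 shares just to say hello
        print(shares_after_churn(1000, leaves=10))  # 19890 shares across 10 rotations
        ```

        In an unencrypted room the same message costs the sender a single upload, which is roughly why both Matrix and Telegram leave large public rooms unencrypted.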

    • endofline@lemmy.ca · 3 months ago

      We still don’t have a legal definition of “hate speech”. Yes, it’s defined here and there, and it is what it is, but you can’t find any international legal definition, and it’s left to the interpretation of judges. Don’t you consider that worrying?

      As for crime: as far as I know, child abuse and sex content is taken down. Drugs are not - there are many countries with very lax drug policies.

      • some_guy@lemmy.sdf.org · 3 months ago

        I didn’t comment on hate speech. I commented on CSAM, which the sources I’ve read and listened to (podcasts) say Telegram pretty much never answered when contacted.

        • endofline@lemmy.ca · 3 months ago

          Well, I haven’t seen child pornography on Telegram, but I have seen sex channels being removed. By comparison, I haven’t seen that happen on Instagram, where soft pornography involving minors is flourishing. CSAM or terrorism is always a case brought up to take some unpopular things down

          • some_guy@lemmy.sdf.org · 3 months ago

            CSAM or terrorism is always a case brought up to take some unpopular things down

            I’ll concede this point.

  • Oneser@lemm.ee · 3 months ago

    I thought Telegram’s encryption was more or less non-existent? Am I missing something?

    • Pup Biru@aussie.zone · 3 months ago

      that’s correct - the issue here is that he has full access to the information that investigators are requesting and is simply refusing to comply with requests

      this isn’t shit like a conversation you had with a friend about weed - this is CSAM and drug trafficking

  • mox@lemmy.sdf.org · 3 months ago

    It would be easy to dismiss the headline’s claim because Telegram’s design makes it arguably not a privacy tool in the first place.

    However, it is possible that this arrest was chosen in part for that reason, with the knowledge that privacy and cryptography advocates wouldn’t be so upset by the targeting of a tool that is already weak in those areas. This could be an early step in a plan to gradually normalize outlawing cryptographic tools, piece by piece. (Legislators and spy agencies have demonstrated that they want to do this, after all.) With such an approach, the people affected might not resist much until it’s too late, like boiling the proverbial frog.

    Watching from the sidelines, it’s impossible to see the underlying motivations or where this is going. I just hope this doesn’t become case law for eventual use in criminalizing solid cryptography.

    • You’re thinking too far ahead. As someone who knows two people who worked closely with the Swiss government:

      Don’t worry about it. The whole deep state idea is absolutely ridiculous.

      There is no big plan to weaken encryption or anything. There was probably a single prosecutor working on a case involving Telegram who saw his chance and took it.

      Seriously, you should be a lot more worried about Google or Meta, not Western democracies.

      Unless you live in Russia/China/Iran/yourFavouriteDictatorship - then forget whatever I just said. But if you live there, what’s happening in France isn’t a problem for you anymore, since your government does it anyway lol

      But yeah, I’m getting a bit tired of the deep state conspiracies. He broke the law; that’s why he got arrested, not because of some deep state conspiracy.

      • mox@lemmy.sdf.org · 3 months ago

        What are you on about?

        When legislation aiming to restrict people’s rights fails to pass, it is very common for legislators/governments to try again shortly thereafter, and then again, and again, until some version of it eventually does pass. With each revision, some wording might be replaced, or weak assurances added, or the most obvious targets changed to placate the loudest critics. It might be broken up into several parts, to be proposed separately over time. But the overall goal remains the same. This practice is (part of) why vigilance and voting are so important in democracies.

        There’s nothing “deep state” about it. It’s plainly visible, on the record, and easily verifiable.

        As someone who knows two people who worked closely with the Swiss government

        This is an appeal to authority (please look it up) and a laughably weak one at that.

        There is no big plan to weaken encryption or anything.

        You obviously have not been keeping up with events surrounding this topic over the past 30 years.

      • octopus_ink@lemmy.ml · 3 months ago

        There is no big plan to weaken encryption or anything.

        This may not be a symptom of such a plan, but there very much is such a plan.

        Exporting PGP and similar “strong encryption” in the 90s was treated by the DoD as exporting munitions.

        it was not until almost two decades later that the US began to move some of the most common encryption technologies off the Munitions List. Without these changes, it would have been virtually impossible to secure commercial transactions online, stifling the then-nascent internet economy.

        More recently you can take your pick.

        Governments DO NOT like people having encryption that isn’t backdoored. CSAM is literally the “but won’t someone think of the children” justification they use, and while the goals may be admirable in this case, the potential harm of succeeding in their quest to ban consumer-accessible strong encryption seems pretty obvious to me.

        As a bonus - anyone remember Truecrypt?

        https://cointelegraph.com/news/rhodium-enterprises-bitcoin-usd-loan-bankruptcy

        https://www.csoonline.com/article/547356/microsoft-subnet-encryption-canary-or-insecure-app-truecrypt-warning-says-use-microsoft-s-bitlocker.html

  • foremanguy@lemmy.ml · 3 months ago

    The world is turning bad. Telegram is not really a private app, but it has one advantage: it tells all the govs that try to get data on its users to fuck off! Soon govs will forbid encryption so they can comfortably watch over our digital lives. He’s not complicit in these crimes; he’s just offering a tool that makes communication more secure and private, but sadly some bad actors use it as a way to do bad things…

    • exocortex@discuss.tchncs.de · 3 months ago

      Why do they have the data in the first place?

      Your communications on Telegram are not end-to-end encrypted by default. You can have E2E-encrypted 1-on-1 conversations, but group chats are wide open for them to do whatever they want with.

      They had a hilarious argument where they claimed that the key to unlock your chats is stored on a different server than the chats themselves, and that they therefore cannot access them. A company that argues like that (“trust us”) isn’t trustworthy.

      Signal has been audited over and over again by internationally respected cryptographers. They cannot decrypt your chats by design. No need for “trust us bro”.
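
      To make “cannot decrypt by design” concrete, here is a toy end-to-end exchange in Python (using the cryptography package). It is only an illustration of the general idea - Signal’s real protocol adds ratcheting, authentication and more on top - but it shows why a relay that only ever sees public keys and ciphertext has nothing to decrypt with.

      ```python
      # Toy E2EE key agreement: X25519 + HKDF + ChaCha20-Poly1305.
      # Not Signal's actual protocol; just the core "server never learns the key" idea.
      import os
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
      from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
      from cryptography.hazmat.primitives.kdf.hkdf import HKDF

      # Each client generates its own keypair; only the *public* halves ever
      # travel through the server.
      alice_priv, bob_priv = X25519PrivateKey.generate(), X25519PrivateKey.generate()
      alice_pub, bob_pub = alice_priv.public_key(), bob_priv.public_key()

      def session_key(my_priv, their_pub) -> bytes:
          # Both sides derive the same 32-byte key from the Diffie-Hellman secret.
          return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                      info=b"toy-e2ee-demo").derive(my_priv.exchange(their_pub))

      nonce = os.urandom(12)
      ciphertext = ChaCha20Poly1305(session_key(alice_priv, bob_pub)).encrypt(
          nonce, b"only the endpoints can read this", None)

      # The server relays (alice_pub, bob_pub, nonce, ciphertext) - none of which
      # lets it compute the session key, because it never sees a private key.
      print(ChaCha20Poly1305(session_key(bob_priv, alice_pub)).decrypt(
          nonce, ciphertext, None))
      ```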

      • foremanguy@lemmy.ml · 3 months ago

        Yeah, this is true, and I don’t recommend Telegram in any case, but it’s sad that a guy who tries to protect our privacy a bit gets arrested.

      • chayleaf@lemmy.ml · 3 months ago

        I remember them responding to a couple of antipiracy lawsuits in… India, I think? They also make an exception for ISIS-related channels. But mostly all of them, yes.

    • jimmydoreisalefty@lemmy.world (OP) · 3 months ago

      Thanks, here is more information about Crikey:

      Crikey is an independent Australian source for news, investigations, analysis and opinion focusing on politics, media, economics, health, international affairs, the climate, business, society and culture. We are guided by a deceptively simple, old idea: tell the truth and shame the devil.

  • rottingleaf@lemmy.world · 3 months ago

    Telegram is not a privacy tool.

    I mean, if he’s convicted over a privacy tool while it’s not actually a privacy tool, we have a bit of an ambiguity.

    Arguably, advertising something which is not a privacy tool as one is fraud. Maybe even phishing, since TG the company holds all of its users’ chat history in plaintext.

    And this

    The meaning of that word “complicity” seems to be revealed by the last three charges: Telegram has been providing users a “cryptology tool” unauthorised by French regulators.

    in non-libertarian language means something similar, that is, that something not confirmed to be a privacy tool is being provided as a privacy tool.

    I am a libertarian, but in this case they are consistent, if I’m reading this correctly. They are not abusing power, they are doing exactly what they are claiming to be doing.

    Also, maybe I’m just tired of Telegram. It’s engaging, and I have AuDHD, which means lots of energy spent, and I can’t drop it completely because of work, and also because some small communities are only available as TG channels. It would be wonderful were they to move at least to WhatsApp, but it is what it is.

    Still, the ability to easily create a blog (which is what a TG channel really is for its users) that is reachable without bullshit is a niche in huge demand. LJ filled it at some point, Facebook did at another, TG does now.

    Something like this is desperately needed. I’d say the solution should be complementary to Signal - that is, DMs and small groups should not be its thing. Neither should the privacy of huge chats and channels - they’d be public anyway. However, anonymity (with means to counter spam) should be, and so should the privacy of metadata about user activity.

  • Possibly linux@lemmy.zip · 3 months ago

    Honestly, this could go two ways. I really hope people move to more secure platforms, but it is possible they will find something equally problematic.

  • Possibly linux@lemmy.zip · 3 months ago

    In all fairness, Telegram holds unencrypted user data and messages but didn’t turn them over to the authorities. They also allow known criminal activity to thrive.

    • istanbullu@lemmy.ml · 3 months ago

      They also allow known criminal activity to thrive.

      Most scammers I have seen are operating out of Facebook or Instagram.

      • Possibly linux@lemmy.zip · 3 months ago

        What is “most scammers”?

        That’s not a useful metric. What is a “scammer”? Also, it is probably better to look at scammers per capita.

    • airikr@lemmy.ml · 3 months ago

      It is very important to mention that you mean end-to-end encryption. The data is stored encrypted when using cloud chat. Nothing (besides the phone number, as far as I know) is stored in plain text on Telegram’s servers.

      I am not defending Telegram. I am just stating facts.

      Negative votes incoming in 3… 2… 1…

      • mox@lemmy.sdf.org · 3 months ago

        It is very important to mention that you mean end-to-end encryption. The data is stored encrypted when using cloud chat.

        In response, it is very important to mention that point-to-point encryption and encryption at rest are next to meaningless with respect to the chat participants’ privacy. They might be relevant to the case against Durov, but they don’t protect against leaks or compromised servers. Please don’t rely on them for your safety.
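
        A small sketch of that distinction, since “encrypted at rest” sounds reassuring but is not the same thing as E2EE (Python cryptography package; the Server class below is purely illustrative, not Telegram’s architecture):

        ```python
        # Why "encrypted at rest" alone doesn't protect chat participants: the server
        # holds the at-rest key, so anything that goes through the server (a court
        # order, a compromise, a leak of the running system) gets plaintext back out.
        from cryptography.fernet import Fernet

        class Server:
            def __init__(self):
                self._at_rest_key = Fernet(Fernet.generate_key())  # lives with the operator
                self._disk = {}

            def store(self, chat_id: str, message: bytes) -> None:
                # Encrypted before hitting disk: a stolen hard drive alone is useless...
                self._disk[chat_id] = self._at_rest_key.encrypt(message)

            def read(self, chat_id: str) -> bytes:
                # ...but the running service decrypts transparently, so a compelled or
                # compromised operator can still hand over plaintext.
                return self._at_rest_key.decrypt(self._disk[chat_id])

        server = Server()
        server.store("chat42", b"this never left the provider's control")
        print(server.read("chat42"))  # plaintext, no user-held key involved
        ```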

  • istanbullu@lemmy.ml · 3 months ago

    Governments want to make it illegal to have privacy. Durov’s arrest was one of the many steps they are taking in that direction.

    • Possibly linux@lemmy.zip · 3 months ago

      That might be true but in this case Telegram was hosting lots of CSAM and other illegal activity in public group chats.

      Imagine you are the victim of sex abuse. Your nude images are in a public group chat, and yet Telegram does nothing. There is no technical reason they couldn’t remove the images; they just don’t feel like it. What’s worse is that many of the images are of children.