“We’re aware of reports that access to Signal has been blocked in some countries,” Signal says. If you are affected by the blocks, the company recommends turning on its censorship circumvention feature. (NetBlocks reports that this feature lets Signal “remain usable” in Russia.)

  • eldavi@lemmy.ml · 5 months ago

    it was in their initial filing when they started the lawsuit to defend themselves.

    i’ve been sealioned too much on the lemmyverse so you’re going to have to do your own googling.

    • neuracnu@lemmy.blahaj.zone · 5 months ago

      Asking the person you’re debating to look up your own citations is certainly one way to converse. But ok, let’s go for it.

      In August 2023, Forbes published an article describing the “unfettered access” proposal you referred to:

      https://www.forbes.com/sites/emilybaker-white/2023/08/21/draft-tiktok-cfius-agreement/

      In June 2024, the Washington Post reported that the Committee on Foreign Investment in the United States (CFIUS) turned down the proposal, with some broad reporting as to why:

      https://www.msn.com/en-us/news/politics/tiktok-offered-an-extraordinary-deal-the-u-s-government-took-a-pass/ar-BB1nfAcE

      The article isn’t very technical, but it mentions some interesting responsibility problems that the US wouldn’t want to back itself into:

      • throwing open some, but not all, doors to server operations and source code creates a mountain of material for the government to inspect, which would be a workload nightmare
      • the US government’s deepest concerns seem to be about what data is going out (usage insights on the virtuous side, clipboard/mic/camera monitoring on the ultra-shady side) and what content is coming in (bespoke content intended to nudge US residents toward China-aligned goals). Usage insights are relatively benign from a national security perspective (especially when you can just mandate that people in important roles aren’t permitted to use the app). Shady monitoring should be discoverable through source code review, which you can put the app platforms (Apple, Google, whoever else) on the hook for if they continue to insist on having walled app gardens (and if you trust them at all). The content shaping is harder to put your finger on, though, since it’s super easy to abstract that logic as far out of the reviewed code as you need to avoid detection (see the sketch after this list). “Here, look at these 50M lines of code that run stateside, and yeah, there are some API calls to stuff outside the sandbox. Is that such a big deal?” Spoiler: it is a big deal.
      • the US can’t hold ByteDance accountable so long as it remains in China. Let’s say the US agreed to all this and spent the effort to uncover some hidden shady activity it doesn’t like (after an untold amount of time has passed). What then? It can’t legally go after ByteDance’s foreign entity. It can prosecute the US employees, but it’s entirely possible to organize things so that those domestic employees stay clear of any misdeeds, leaving prosecutors with nobody they can meaningfully go after. It’d be a mess.
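
      To make that “abstract the logic out” point concrete, here’s a minimal, purely hypothetical sketch (none of this is ByteDance’s actual code; the endpoint and function names are made up). The client code an auditor would review is deliberately boring: it collects candidates, asks a remote ranking service for an order, and renders the result. Whatever content-shaping policy exists lives behind that API call, outside the reviewed code.

      ```python
      # Hypothetical illustration only -- not ByteDance's code. Shows how a client
      # can look innocuous under source review while the interesting policy runs
      # behind a remote API.
      from dataclasses import dataclass


      @dataclass
      class Video:
          video_id: str
          topic: str


      def fetch_ranking(user_id: str, candidates: list[Video]) -> list[str]:
          """Stand-in for a call to a ranking service outside the audited sandbox.

          In a real app this would be an HTTPS request to something like
          https://ranking.example.com/v1/rank (made-up endpoint). The client only
          gets back an ordered list of IDs; whatever policy produced that order
          is invisible here. Stubbed locally so the sketch runs offline.
          """
          return [v.video_id for v in sorted(candidates, key=lambda v: v.topic)]


      def build_feed(user_id: str, candidates: list[Video]) -> list[Video]:
          # Gather candidates, ask the remote service for an order, display it.
          order = fetch_ranking(user_id, candidates)
          by_id = {v.video_id: v for v in candidates}
          return [by_id[vid] for vid in order if vid in by_id]


      if __name__ == "__main__":
          feed = build_feed("user-123", [Video("a1", "cooking"), Video("b2", "news")])
          print([v.video_id for v in feed])
      ```

      Reviewing those thirty-odd lines tells you nothing about whether the server is boosting or burying particular topics, which is the question an auditor actually cares about.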

      The second article explains this somewhat, but I’m admittedly painting some conjecture on top regarding how a malicious actor could behave. I’ve got no evidence that ByteDance is actually doing any of that.

      But going back to the “influence the public” angle, I’m struggling to see how TikTok differs from NHK America (Japan’s broadcasts for US audiences) or RT (Russian state media aimed at US audiences) aside from being wildly more successful and popular. But I guess that’s all there is to it.

      I’d prefer our leaders also be transparent with us regarding their concerns about TikTok. The reductive “because China!!1!” argument is not compelling on its own.