A reported Free Download Manager supply chain attack redirected Linux users to a malicious Debian package repository that installed information-stealing malware.

The malware used in this campaign establishes a reverse shell to a C2 server and installs a Bash stealer that collects user data and account credentials.

Kaspersky discovered the potential supply chain compromise while investigating suspicious domains, finding that the campaign had been underway for over three years.

    • 30p87@feddit.de · 1 year ago

      And via a website too. That’s like pushing a car. One of the main strengths of Linux is its open repositories, maintained by reputable sources and checked by thousands of reputable people. Packages are checksummed and therefore can’t be swapped out by malicious parties. Even the AUR is arguably a safer and more regulated source. And FDM is actually in there.

    • xkforce@lemmy.world · 1 year ago

      The same people that would have given that poor Nigerian prince their bank account details.

    • TrustingZebra@lemmy.one · 1 year ago

      It’s still my favorite download manager on Windows. It often downloads files significantly faster than the download manager built into browsers. Luckily I never installed it on Linux, since I have a habit of only installing from package managers.

      Do you know of a good download manager for Linux?

        • TrustingZebra@lemmy.one · 1 year ago

          FDM does some clever things to boost download speeds. It splits a download into chunks and somehow downloads them concurrently. It makes a big difference for large files (for example, Linux ISOs).

          • somedaysoon@lemmy.world · 1 year ago

            It only makes a difference if the server caps the speed per connection. If it doesn’t, it won’t make a difference.

            • TrustingZebra@lemmy.one · 1 year ago

              I guess many servers are capping speeds then. Makes sense, since I almost never see downloads actually take advantage of my Gigabit internet speeds.

              • somedaysoon@lemmy.world · 1 year ago

                It’s interesting to me people still download things in that fashion. What are you downloading?

                I occasionally download something from a web server, but not enough to care about a download manager that might make it marginally faster. Most large files I download are either TV shows and movies from torrents and Usenet, or games on Steam, all of which will easily saturate a 1 Gbps connection.

            • everett@lemmy.ml · 1 year ago

              It could make multiple requests to the server, asking each request to resume starting at a certain byte.

                • drspod@lemmy.mlOP · 1 year ago

                  The key thing to know is that a client can do an HTTP HEAD request to get just the Content-Length of the file, and then perform GET requests with the Range request header to fetch a specific chunk of a file.

                  This mechanism was introduced in HTTP 1.1 (byte-serving).
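The HEAD-then-ranged-GET flow described above can be sketched in Python. The range-splitting function is generic and runs as-is; the comments show how a downloader would use its output against a server, and no actual network calls are made here:

```python
# Sketch of HTTP/1.1 byte-serving (Range requests) as described above.
# A client first issues:   HEAD /file        -> Content-Length: N
# then, for each chunk, a  GET with header:  Range: bytes=lo-hi
# and reassembles the 206 Partial Content responses in order.

def split_ranges(content_length: int, chunks: int) -> list[tuple[int, int]]:
    """Divide [0, content_length) into inclusive (lo, hi) byte ranges, one per chunk."""
    base, extra = divmod(content_length, chunks)
    ranges, start = [], 0
    for i in range(chunks):
        size = base + (1 if i < extra else 0)  # spread the remainder over early chunks
        ranges.append((start, start + size - 1))
        start += size
    return ranges

# Build the Range headers a concurrent downloader would send for a 1000-byte file:
for lo, hi in split_ranges(1000, 4):
    print(f"Range: bytes={lo}-{hi}")   # prints bytes=0-249 ... bytes=750-999
```

Servers that support this answer `206 Partial Content` with a matching `Content-Range` header; servers that don’t simply return the whole file with `200 OK`, which is why a real client has to handle both cases.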

        • arglebargle@lemm.ee · 1 year ago

          Just grabbed a gig file - it would take about 8 minutes with a standard download in Firefox. Use a manager or axel and it will be 30 seconds. Then again, speed isn’t everything; it’s also nice to have auto-retry and completion.

      • Xirup@lemmy.dbzer0.com · 1 year ago

        JDownloader, XDM, FileCentipede (this one is the closest to IDM, although it uses closed source libraries), kGet, etc.

      • arglebargle@lemm.ee · 1 year ago

        axel. Use axel -n8 to make 8 connections/segments, which it will assemble when done.

    • lukmly013@lemmy.sdf.org · 1 year ago

      Gotta admit, it was me. I’ve only used a computer for a short time.
      I got my first laptop 3 years ago, and it broke after just 2 months. And anyway, with an AMD Athlon 64 it greatly struggled with a browser. So I only really started seriously using a computer at the start of 2021, when I got another, usable laptop. And that’s when I downloaded freedownloadmanager.deb. Thankfully, I didn’t get the redirect, so it was a legitimate file.

    • Hamartiogonic@sopuli.xyz · 1 year ago

      Oh, I know someone who adds the word “free” to various search words like “free pdf reader” or “free flash player” (happened a very long time ago). He’s also the kind of person who I can imagine having a bunch of viruses and malware on his computer.

    • Honytawk@lemmy.zip · 1 year ago

      People not well versed in Linux.

      You know, the non-techies, which the Linux community claims should know such things but obviously does not.

    • gaael@lemmy.world · 1 year ago

      I’ve installed and used it, and still do.

      My internet connection is not that reliable, and when I download big files that are not torrents (say >1000 MB) and the download is interrupted because of internet disconnect, Firefox often has trouble getting back to it while FDM doesn’t.

      FDM also lets me set download speed limits, which means I can still browse the internet while downloading.

      It’s not my main tool for downloading stuff, but it has its uses.

  • drspod@lemmy.mlOP · 1 year ago

    The article mentions how to check for infection:

    If you have installed the Linux version of the Free Download Manager between 2020 and 2022, you should check and see if the malicious version was installed.

    To do this, look for the following files dropped by the malware, and if found, delete them:

    /etc/cron.d/collect
    /var/tmp/crond
    /var/tmp/bs
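The check above can be sketched as a small POSIX shell script (the paths are the ones listed in the article; everything else is illustrative):

```shell
#!/bin/sh
# Look for the indicator files dropped by the malicious FDM package.
# Paths come straight from the article; run as root so /etc/cron.d is readable.
status=clean
for f in /etc/cron.d/collect /var/tmp/crond /var/tmp/bs; do
    if [ -e "$f" ]; then
        echo "FOUND: $f (delete it and investigate further)"
        status=infected
    fi
done
echo "Result: $status"
```

Deleting the files removes the persistence mechanism, but if any are found it’s worth assuming credentials on that machine were exposed and rotating them.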
    
    • lemmyvore@feddit.nl · 1 year ago

      Back in the day, when most stuff was on FTP and HTTP and your connection was crap and could drop at any time, you’d use a download manager to smooth things along. It could resume downloads when the connection dropped, it could keep a download going for days on end and resume as needed, and it could work around the bandwidth limitations of the source site by using multiple parallel connections that pulled different file chunks. In some ways it was very similar to how we use BitTorrent today.

      It was also useful to keep a history of stuff you’d downloaded in case you needed it again, manage the associated files etc.

      • drspod@lemmy.mlOP · 1 year ago

        and it could work around the bandwidth limitations of the source site by using multiple parallel connections that pulled different file chunks

        Also for files which had multiple different mirror sites you could download chunks from multiple mirrors concurrently which would allow you to max out your bandwidth even if individual mirrors were limiting download speeds.

    • Dhs92@programming.dev · 1 year ago

      It’s a download client that can pause/resume downloads, as well as use multiple connections to download files.

        • schmidtster@lemmy.world · 1 year ago

          Sucks having your connection drop and having to redownload the entire thing. Managers are a fix.

        • db2@sopuli.xyz · 1 year ago

          BitTorrent basically works in chunks, or can download nonlinearly. Downloading from a site in the basic way gets the file from start to finish; a download manager can let you stop it and pick up where you left off, as long as the server you’re getting the file from is configured to allow it.

          https://github.com/agalwood/Motrix

          (Note: I don’t use that or any other download manager and haven’t since Windows 95, it’s linked as example only)

    • puffy@lemmy.world · 1 year ago

      Back in the 2000s, browsers were really bad at downloading big things over slow connections since they couldn’t resume, a brief disconnect could destroy hours of progress. But I don’t think you need this anymore.

  • gabriele97@lemmy.g97.top · 1 year ago

    How is it possible that users noticed strange behavior (new cron jobs) and didn’t check the scripts those jobs launched 😱

  • insaneduck@lemmy.world · 1 year ago

    Finally, Linux is getting popular enough to make viruses for. Yay? Insert Gru meme here.

  • rufus@discuss.tchncs.de · 1 year ago

    Mmmh. You kinda deserve to get infected if you do things like this. Every beginner tutorial specifically tells you not to download random stuff from the internet and ‘sudo’ install it. Every wiki with helpful information has those boxes that tell you not to do it. I’m okay if you do it anyways. But don’t blame anyone else for the consequences. And don’t tell me you haven’t been warned.

    Also, I wonder about the impact this had. It went unnoticed for 3 years, so I can’t imagine it affected many people. The text says it affected few people, and it didn’t have any real impact.

    But supply chain attacks are real. Don’t get fooled. And don’t install random stuff. Install the download manager from your package repository instead.

    • ipkpjersi@lemmy.ml · 1 year ago

      I kind of disagree. Applications often require root permissions to install themselves, since regular users can’t write to certain folders like /opt, etc.

      Also, do you really think that people would actually read the source and then compile all their software themselves? Do you do the same?

      Generally though I do agree, you’re probably fine installing software from your distro’s repos but even that’s not bulletproof and also it’s not like third-party repos are uncommon either.

      • rufus@discuss.tchncs.de · 1 year ago

        Yes. I do it the correct way. I use my favourite distro’s package manager to install software. That way it’s tested, a few people had a look at the changes, and sometimes a CI script automatically determines whether the installer affects other parts of the system. I go to great lengths to avoid doing it any other way. (I’ve been using some Flatpaks recently, though. And sometimes I install things only for a separate user account, mainly when it’s proprietary or niche.)

        It is super rare that I install random stuff from the internet, or ‘curl’ and pipe an installer script into a root shell. And when I do, I put in some effort to see if it’s okay. I think I had a quick glance at most of the install .sh scripts before continuing. So yes, I kinda do my best. And I isolate that stuff and don’t put it in the same container that does my email.

        Most of the time you can avoid doing it the ‘stupid way’. And even programming package managers like ‘npm’, ‘cargo’, … have started to take supply chain attacks seriously.

  • _cnt0@feddit.de · 1 year ago

    malicious Debian package repository

    *laughs in RPM*

    This comment was presented by the fedora gang.

    • puffy@lemmy.world · 1 year ago

      Right, but you could do the same with RPM. Not everyone is aware of this, but installing a package executes scripts with root access over your system.
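To make that concrete: both dpkg and rpm run package maintainer scripts as root during installation. A minimal hypothetical DEBIAN/postinst sketch (everything here is illustrative, not taken from the actual malicious package):

```shell
#!/bin/sh
# Hypothetical postinst sketch: dpkg executes this as root after unpacking,
# which is why installing an untrusted .deb is equivalent to running an
# arbitrary script with sudo. The same applies to RPM %post scriptlets.
set -e
echo "postinst running as uid $(id -u)"   # uid is 0 when run by dpkg
# Nothing stops a malicious package from writing to /etc/cron.d or
# /var/tmp right here, exactly as this campaign's dropper did.
exit 0
```

This is also why signed distro repositories matter: the signature vouches for the whole package, maintainer scripts included, not just the payload files.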

  • rurb@lemmy.ml · 1 year ago

    I had to essentially read the same thing four times before there was any new information in this post. Not sure if that’s a Jerboa thing or what, but probably could have been avoided.

    • drspod@lemmy.mlOP · 1 year ago

      Yeah I agree, sorry about that. I thought that the body-text field was mandatory to fill in, so I used the introductory paragraph from the article so as not to editorialize.