Companies are going all-in on artificial intelligence right now, pouring millions or even billions into the technology while slapping the AI initialism on their products, even when doing so makes little sense.

Heavy investment and increasingly powerful hardware tend to mean more expensive products. To find out whether people would be willing to pay extra for hardware with AI capabilities, TechPowerUp put the question to its forum users.

The results show that over 22,000 people, a massive 84% of the overall vote, said they would not pay more. More than 2,200 participants said they didn’t know, while just under 2,000 voted yes.

  • NekuSoulA · 5 months ago

    I’ve been trying to find some better/original sources [1] [2] [3], and from what I can gather it’s even worse. It’s not even an upscaler of any kind; apparently it just uses an NPU to control clocks and fan speeds to reduce power draw, dropping FPS by ~10% in the process.

    So yeah, I’m not really sure why they needed an NPU to figure out that running a GPU at its limit has always been wildly inefficient. Outside of getting that investor money, of course.
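    The napkin math backs that up, too. As a rule-of-thumb sketch (not measurements from these cards): dynamic power scales roughly with voltage squared times frequency, and voltage has to climb with frequency near the top of the curve, so power grows roughly with the cube of the clock, making the last bit of clock speed disproportionately expensive:

    ```python
    # Rule-of-thumb estimate only: dynamic power ~ C * V^2 * f, and near the top
    # of the voltage/frequency curve V rises roughly in step with f, so power
    # scales roughly with the cube of clock speed. Not measured data for any GPU.
    clock_scale = 0.90              # give up ~10% clock (and roughly ~10% FPS)
    power_scale = clock_scale ** 3  # first-order power estimate

    print(f"~{(1 - power_scale) * 100:.0f}% less power for ~10% less performance")
    # -> ~27% less power for ~10% less performance
    ```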

    • AdrianTheFrog@lemmy.world · 5 months ago

      Ok, I guess it’s just kinda similar to dynamic overclocking/underclocking with a dedicated NPU. I don’t really see why a tiny $2 microcontroller, or just the CPU, couldn’t accomplish the same task, though.
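      For what it’s worth, the control logic being described here amounts to a couple of if-statements in a loop. A rough sketch of the idea in plain Python, with simulated sensor/actuator stubs (the helper names and thresholds are made up for illustration, not any real driver or firmware API):

      ```python
      import random
      import time

      # Hypothetical telemetry/actuator helpers -- stand-ins for whatever
      # interface the real hardware exposes; simulated here so the loop runs.
      def read_power_draw_w() -> float:
          return random.uniform(150, 250)   # pretend GPU power draw in watts

      def read_temperature_c() -> float:
          return random.uniform(60, 85)     # pretend GPU temperature in °C

      def set_core_clock_mhz(mhz: int) -> None:
          print(f"core clock -> {mhz} MHz")

      def set_fan_duty_pct(pct: int) -> None:
          print(f"fan duty   -> {pct} %")

      POWER_TARGET_W = 200                   # back off above ~200 W
      TEMP_TARGET_C = 75                     # ramp the fan above 75 degrees C
      CLOCK_MIN, CLOCK_MAX, CLOCK_STEP = 1500, 2600, 25

      def control_loop() -> None:
          clock, fan = CLOCK_MAX, 40
          while True:                        # runs until interrupted (Ctrl-C)
              power, temp = read_power_draw_w(), read_temperature_c()

              # Step the clock down past the power target, back up when
              # there is headroom.
              if power > POWER_TARGET_W:
                  clock = max(CLOCK_MIN, clock - CLOCK_STEP)
              elif power < POWER_TARGET_W - 10:
                  clock = min(CLOCK_MAX, clock + CLOCK_STEP)

              # Same idea for the fan: chase a temperature target.
              if temp > TEMP_TARGET_C:
                  fan = min(100, fan + 5)
              elif temp < TEMP_TARGET_C - 5:
                  fan = max(30, fan - 5)

              set_core_clock_mhz(clock)
              set_fan_duty_pct(fan)
              time.sleep(0.1)                # a 10 Hz loop is plenty for this

      if __name__ == "__main__":
          control_loop()
      ```

      Nothing in that loop needs a neural accelerator; it would run comfortably on a cheap microcontroller or as a background thread on the host CPU.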