• Lev@europe.pub · 69 points · 12 hours ago

    Daily reminder that Codeberg is always the good alternative to corporate bastards like this idiot

  • Jocker@sh.itjust.works · 16 points · edited · 11 hours ago

    Contrary to the title, this message is not for developers; developers don’t care what the GitHub CEO thinks, and he should know it. It’s more likely aimed at the management of other companies, to get them to allow, or force, AI usage.

  • rimjob_rainer@discuss.tchncs.de · 84 up / 1 down · edited · 15 hours ago

    I don’t get it. AI is a tool. My CEO never cared what tools I used, as long as I got the job done. Why do they suddenly think they have to force us to use a particular tool to get the job done? They are clueless, yet they think they know what we need.

    • buddascrayon@lemmy.world · 16 points · 9 hours ago

      Because, unlike the other tools you use, the CEO of your company is investing millions of dollars into AI, and they want a big return on that investment.

      • DarkSurferZA@lemmy.world · 7 points · 7 hours ago

        Return? No, there is no return on investment from AI. If there really were a return to be had from devs, you wouldn’t have to force them to use it.

        This is a face-saving, ass-covering exercise. Option 1 is “we spent the money, nobody’s using it, the bubble’s gonna burst”; option 2 is “if we can ramp up the usage numbers before the earnings call, we can get some of that sweet investor money to buy us out of being mauled by our shareholders”.

        It’s shitty management making shitty decisions to cover up their previous shitty decisions.

    • sobchak@programming.dev · 13 points · 12 hours ago

      I think part of it is that they think they can train models off developers, then replace them with models. The other part is that the company is heavily invested in coding LLMs and the tooling for them, so they are trying to hype them up.

    • bless@lemmy.ml · 51 points · 16 hours ago

      GitHub is owned by Microsoft, and Microsoft is forcing AI on all of its employees.

      • ksh@aussie.zone · 1 point · 2 hours ago

        They all need to be sued for unethical “Embrace, Extend and Extinguish” practices again

      • TeddE@lemmy.world · 17 points · 15 hours ago

        Honestly, I’ve been recommending setting up a personal git store and cloning any project you like. I imagine the next phase of this is Microsoft claiming that, because Copilot ‘assisted’ all these projects, Microsoft is part owner of them: a gambit to swallow and own open source.
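
The mirroring workflow suggested above can be sketched as a small helper. This is a hypothetical sketch, assuming `git` is installed; the repository URL and the Codeberg remote below are placeholder examples, not projects named in the thread.

```python
from pathlib import Path

def mirror_commands(src_url, backup_remote, workdir="mirrors"):
    """Build the git commands for a full mirror backup of one repository.

    src_url: the repository to preserve (placeholder example below)
    backup_remote: your personal git store or Codeberg remote (placeholder)
    """
    name = src_url.rstrip("/").removesuffix(".git").rsplit("/", 1)[-1]
    dest = str(Path(workdir) / f"{name}.git")
    return [
        # --mirror copies every branch, tag, and ref, not just the default branch
        ["git", "clone", "--mirror", src_url, dest],
        # push --mirror reproduces the full ref layout on your own server
        ["git", "-C", dest, "push", "--mirror", backup_remote],
        # re-running this later refreshes the local mirror
        ["git", "-C", dest, "remote", "update", "--prune"],
    ]

if __name__ == "__main__":
    # print the commands; pass each to subprocess.run(cmd, check=True) to execute
    for cmd in mirror_commands("https://github.com/example/project.git",
                               "git@codeberg.org:you/project.git"):
        print(" ".join(cmd))
```

`git clone --mirror` and `git push --mirror` are real git flags; the helper name, paths, and URLs are purely illustrative.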

      • Corkyskog@sh.itjust.works · 5 up / 1 down · edited · 15 hours ago

        I am surprised they aren’t embracing it… I would. You immediately get some vague non-person to blame all your failures on.

        Employers aren’t loyal enough for the average person to care about their company’s well-being.

        • rozodru@lemmy.world · 5 points · 15 hours ago

          I agree. Let them generate massive tech debt, because right now the majority of my current clients have hired me to clean up their AI slop.

          Is it bad for their users? Oh hell yes it is. Is it great for me and other consultants/freelancers? Hell yes it is. The best thing that’s happened to my wallet recently is vibe coders. I love those dumb prompt monkeys.

    • Jhex@lemmy.world · 16 points · 13 hours ago

      Why do they suddenly think they have to force us to use a certain tool to get the job done?

      Not just that… why do they have to threaten and push people into using a tool that is allegedly fantastic and makes everything better and faster? The answer is that it does not work, but they need to pump the numbers to keep the bubble going.

    • MajorasMaskForever@lemmy.world · 13 points · 13 hours ago

      It’s not about individual contributors using the right tools to get the job done. It’s about needing fewer individual contributors in the first place.

      If AI actually accomplishes what it’s being sold as, a company can maintain or even increase its productivity with a fraction of its current spending on labor. Labor is one of the largest chunks of spending a company has, if not the largest, so reducing it greatly reduces spending. For the same or higher income, that means net profit goes up, and as always, the line must go up.

      tl;dr Modern Capitalism is why they care

      • Tamo240@programming.dev · 2 points · 12 hours ago

        Alternatively, following their logic, they could keep the same number of people and achieve massively higher productivity. But they don’t want that; they want to reduce the number of people having opinions and diluting the share pool, because it’s not about productivity, it’s about exerting control.

    • 0x0@lemmy.zip · 13 points · 15 hours ago

      They are clueless, yet they think they know what we need.

      An accurate description of most managers I’ve encountered.

    • CeeBee_Eh@lemmy.world · 2 up / 1 down · 14 hours ago

      They are clueless, yet they think they know what we need.

      AI make money line go up. It’s not cluelessness; he’s trying to sell a kind of snake oil (OK, not quite “snake oil”, I don’t think AI is entirely bad).

      • ragas@lemmy.ml · 1 point · 9 hours ago

        Snake oil is also not entirely bad. The placebo effect actually works.

        • CeeBee_Eh@lemmy.world · 1 up / 1 down · 8 hours ago

          No, snake oil is extremely bad. It’s a highly exploitative practice that preys on the desperation of sick people.

          That’s what “snake oil” refers to: exploiting someone by playing on their emotions.

          The placebo effect actually works.

          The placebo effect sometimes works. But only in very specific circumstances. A placebo will not cure cancer or heart disease.

          It can help with things related to pain, as mental and emotional state can directly affect the severity of pain. And a placebo can sometimes marginally improve symptoms by reducing stress levels. But that’s why placebos are used during drug trials. If a drug produces the same results as a placebo, then it doesn’t work. And that says a lot about what the placebo effect actually is. It’s just a mental state change that gets expressed as reduced physiological stress.

  • ipkpjersi@lemmy.ml · 12 up / 1 down · edited · 3 hours ago

    Threatening remarks like that are why I learned PHPUnit and Xdebug, and yeah, it made me a better developer, but oftentimes these are just empty statements.

    AI is just another tool in my toolbox, but it’s not everything.

    • MoondropLight@thelemmy.club · 3 up / 1 down · 6 hours ago

      Unit testing and TDD are awesome; but if you can avoid it: Don’t write things in PHP (unless it’s for work).

      • ipkpjersi@lemmy.ml · 1 point · edited · 3 hours ago

        I’ve written a few personal projects in Laravel too, I don’t mind modern PHP tbh. It’s good for spinning up web apps quickly.

  • redlemace@lemmy.world · 24 points · edited · 14 hours ago

    such an easy choice …

    (edit: I followed up and got out. This too is now self-hosted and codeberg when needed)

  • Fedditor385@lemmy.world · 34 up / 1 down · edited · 17 hours ago

    AI can only deliver answers based on training code that developers manually wrote, so how do they expect to train AI in the future if there are no more developers writing code by themselves? You train AI on AI-generated code? Sounds like expected enshittification down the line. Inbreeding, basically.

    Also, the simple fact is that they have invested so much money into AI that they can’t allow it to fail. Such comments never come from people who don’t depend on AI adoption.

    • Showroom7561@lemmy.ca · 7 points · 14 hours ago

      It’s like all those companies who fast-tracked their way into profits by ignoring the catastrophic effects they were having on the environment… down the road.

      Later is someone else’s problem. Now is when the AI-pushers want to make money.

      I hate where things have been heading.

    • WhyJiffie@sh.itjust.works · 6 points · 15 hours ago

      Same as how it goes on the stock market: they don’t care about the long term, only the short term. What happens in the long term is somebody else’s problem; you just have to squeeze out everything and know when to exit.

      They are gambling with our lives, but not with theirs. That’s (one of) the problems: their own lives are never at stake.

  • ZILtoid1991@lemmy.world · 38 points · 18 hours ago

    Expectation: High quality code done quickly by AI.

    Reality: Low-quality AI-generated bug reports being spammed in the hopes that the spammers can collect bug bounties for fixing them, with AI of course.

  • Blackmist@feddit.uk · 19 up / 1 down · 16 hours ago

    I asked an AI to generate me some code yesterday. A simple interface to a REST API with about 6 endpoints.

    And the code it made almost worked. A few fixes here and there to methods it pulled out of its arse, but they were close enough to real ones to be an easy fix.

    But the REST API it made code for wasn’t the one I gave it. Bore no resemblance to it in fact.

    People need to realise that MS isn’t forcing its devs to write all code with AI because they want better code. It’s because they desperately need training data so they can sell their slop generators to gullible CEOs.
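
One way to limit the failure mode described above is to keep the endpoint surface explicit: one hand-written method per documented endpoint, so a hallucinated method name fails immediately instead of almost working. A sketch only; the base URL and endpoint paths below are invented for illustration, since the actual API isn’t named in the thread.

```python
import json
from urllib.request import Request, urlopen

class ApiClient:
    """Minimal REST interface; endpoint paths below are hypothetical."""

    def __init__(self, base_url, token=None):
        self.base_url = base_url.rstrip("/")  # normalize trailing slash
        self.token = token

    def _get(self, path):
        req = Request(f"{self.base_url}{path}")
        if self.token:
            req.add_header("Authorization", f"Bearer {self.token}")
        with urlopen(req) as resp:
            return json.load(resp)

    # one method per documented endpoint: nothing exists that the spec
    # doesn't list, so an invented call raises AttributeError instead of
    # silently targeting the wrong API
    def list_items(self):
        return self._get("/items")

    def get_item(self, item_id):
        return self._get(f"/items/{item_id}")
```

The design choice is simply that the client mirrors the spec one-to-one, which makes generated code that targets a different API easy to spot in review.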

  • medem@lemmy.wtf · 34 points · 19 hours ago

    “Managing agents to achieve outcomes may sound unfulfilling to many”

    No shit, man.

  • aliser@lemmy.world · 25 points · 18 hours ago

    Does “embracing AI” mean replacing all these execs with it? Or is that “too far”?

    • Soup@lemmy.world · 8 points · 15 hours ago

      No, they’re all super special and have an “instinct” that a robot could never have. Of course the same does not go for artists or anyone who does the actual work for these “titans of industry”.

      *by “instinct” we, of course, mean survivorship bias based on what is essentially gambling, exploitation, and being too big to fail.

  • antihumanitarian@lemmy.world · 12 up / 4 down · 15 hours ago

    I’m a professional developer and have tested AI tools extensively over the last few years as they have developed. The economic implications of the advancements made over the last few months are simply impossible to ignore. The tools aren’t perfect, and you certainly need to structure their use around their strengths and weaknesses, but assigned to the right tasks they can cost 10% or less of what a developer would, with better results. I’ve yet to have a project where they didn’t need an experienced engineer to jump in and research an obscure or complex bug, reject a dumb architectural choice, or verify that stuff actually works (they like reporting success when they shouldn’t). But again, the economics: the dev can be doing other stuff 90% of the time.

    Don’t get me wrong, on the current trajectory this tech would probably lead to deeply terrible socioeconomic outcomes, probably techno neofeudalism, but for an individual developer putting food on the table I don’t see it as much of a choice. It’s like the industrial revolution again, but for cognitive work.

    • sobchak@programming.dev · 12 points · 11 hours ago

      I keep hearing stuff like this, but I haven’t found a good use or workflow for AI (other than occasional chatbot sessions). Regular autocomplete is more accurate (no hallucinations) and faster than AI suggestions (especially accounting for needing to constantly review the suggestions for correctness). I guess stuff like Cursor is OK at making one-off tools on very small code bases, but it hits a brick wall when the code base gets too big. Then you’re left with a bunch of unmaintainable code you’re not very familiar with and would have to spend a lot of time trying to fix yourself. Dunno if I’m doing something wrong or what.

      I guess what I’m saying is that using AI can speed you up to a point, while the project accumulates massive amounts of technical debt, and when you take into account all the refactoring and debugging time, it ends up taking longer to produce a buggier project. At least in my experience.

      • antihumanitarian@lemmy.world · 3 up / 1 down · 7 hours ago

        I’ve used it most extensively on greenfield Ruby on Rails apps, and also some JS front ends, some mid-sized Python apps, and some Rust and Nix utilities. You’re absolutely right about it struggling with code-base scale; I had to rework the design process around this. Essentially: design documentation telling the story, workflow documentation describing every possible piece of functionality in detail, and an iteration schedule. So the why, what, and how, formalized and in detail, in that order. It can generate the bulk of those documents given high-level explanations, but they require humans to edit them before being made the ‘golden’ references. Test-driven development is beyond critical; instructing it everywhere to write failing tests first seems to work best.

        So to actually have it do a thing, I load those documents into context, give it a set unit of work from the iteration schedule, and work on something else.

        It does go down some seriously wrong paths sometimes, like writing hacky workarounds when it incorrectly diagnoses some obscure problem. I’ve had a few near misses where it tried to sneak in stuff that would bury future work in technical debt. Most problematic is that it’s just subtle enough that a junior dev might miss it; they’d probably get sent down a rabbit hole with several layers of spaghetti obscuring the problem.

        • kcuf@lemmy.world · 2 points · 7 hours ago

          That sounds like you’re still doing a lot of work. Is that net-new work you wouldn’t have done before (i.e., would you have needed to write those docs anyway)? Writing code never feels like the complicated or time-expensive part to me. Figuring out what I want to do is, and I need to do that with either approach. Thinking through how I’d like to organize things is another time sink, and perhaps that can be replaced or augmented by AI, but organizing things well requires long-term thinking and is very hard to explain.

      • AWistfulNihilist@lemmy.world · 3 points · 10 hours ago

        That’s perfect for higher ups. They don’t care if what you release has bugs as long as you work on them when they pop up, they consider that part of your job. They want a result quickly and will accept 85% if it moves the needle forward.

        These people don’t care about technical debt, they don’t care about exploits until it happens to them, then it’s how bad and how long to fix. No one cares about doxxes anymore, it’s just the cost of doing business. Like recalls.

        This is perfect for CEOs and billionaires because they don’t care how something is done at a 35,000 foot view, they just want it now. AI is a nightmare of exploits that haven’t even begun to be discovered yet. Things that will be easily exploitable, especially by other algorithms.

        Coders are just as affected by supply and demand, and the demand is for AI products.

        • sobchak@programming.dev · 3 points · 9 hours ago

          Hmm, a lot of my career was done doing embedded programming, where mistakes in production are very costly, and software/hardware has to be released with basically zero bugs, so that may be where the disconnect is. I still think bugs and technical debt are costly elsewhere too if the product is going to have a long lifecycle, but executives are just dumb.

          • AWistfulNihilist@lemmy.world · 1 point · 9 hours ago

            There’s been an unbalancing of top-down power, especially in venture capital; we will pay for these decisions down the line.

    • Taldan@lemmy.world · 9 up / 1 down · 13 hours ago

      I’m finding AI effectively automates entry-level jobs and interns. The long-term implication is that very few will be able to enter the field. What do we do when all the experienced engineers retire? How will we shift our economy to work for everyone under this model?

      • ragas@lemmy.ml · 1 point · 8 hours ago

        Can you give an example of what those entry-level jobs may be? I have yet to encounter a position where an AI would be as smart as an entry-level person.

  • alvyn@discuss.tchncs.de · 30 points · 20 hours ago

    So his message is: “let us scrape your code or go away, and we’re gonna scrape it anyway”? (note: scrape = steal)