I came across this article in another Lemmy community that dislikes AI. I’m reposting rather than cross-posting so that we can have a conversation here about how “work” might be changing with advances in technology.

The headline is clickbaity: Altman was referring to how farmers who lived decades ago might look at the work “you and I do today” (Altman included) and not consider it work at all.

The fact is that most of us work many levels of abstraction removed from basic human survival. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back-breaking labor jobs that humans were forced to do generations ago.

In my first job, IT support, it was not lost on me that all day long I pushed buttons to make computers beep in friendlier ways. There was no physical result to see: no produce to harvest, no pile of wood transformed from its natural to its chopped state, nothing tangible to step back and admire at the end of the day.

Bankers, fashion designers, artists, video game testers, software developers, and countless other professionals experience something quite similar. Yet all of these jobs add value to the human experience in some way.

As humanity’s core needs have been met by technology requiring fewer human inputs, our focus has been able to shift toward creating value in less tangible, but perhaps no less meaningful, ways. This has produced a more dynamic and rich life experience than any of those farming generations could have imagined. So while it may not look like the work those farmers were accustomed to, humanity has been able to turn its attention to other kinds of work, for the benefit of many.

I postulate that AI - as we know it now - is merely another technological tool that enables new layers of abstraction. At one time bookkeepers had to write entries in physical ledgers; now software automatically records accounting transactions as they happen. At one time software developers might spend days setting up the framework of a new project; now an LLM can do the bulk of that work in minutes.

These days we have fewer bookkeepers - most companies no longer need armies of clerks - but we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn there will be many more software projects seeking to solve new problems in new ways.

How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn’t have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.

At the end of the day, I suspect we humans are biologically wired with a deep desire to produce rewarding, meaningful work, yet much of the output of our abstracted work is hard to see and touch. Perhaps that is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowers become.

  • 6nk06@sh.itjust.works · 79 points · 19 hours ago

    At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

    No and no. Have you ever coded anything?

    • kescusay@lemmy.world · 31 points · 18 hours ago

      Yeah, I have never spent “days” setting anything up. Anyone who can’t do it without spending “days” struggling with it is not reading the documentation.

        • kescusay@lemmy.world · 2 points · 6 hours ago

          Well, if I’m not, then neither is an LLM.

          But for most projects built with modern tooling, the documentation is fine, and most frameworks provide simple CLIs for scaffolding a new application.

      • HarkMahlberg@kbin.earth · 41 points · 17 hours ago

        Ever work in an enterprise environment? Sometimes a single talented developer cannot overcome the calcification of hundreds of people over several decades who care more about the optics of work than actual work. Documentation can’t help if it’s non-existent or 20 years old. Documentation can’t make teams that don’t believe in automation adopt Docker.

        Not that I expect Sam Altman to understand what it’s like working at a dumpster-fire company; the only job he’s ever held is pouring gasoline.

        • killeronthecorner@lemmy.world · 5 points · 5 hours ago (edited)

          Dumpster-fire companies are the ones he’s targeting, because they’re the most likely to look for quick and cheap ways to treat the symptoms of their problems, and the most likely to want to replace their employees with automation.

    • nucleative@lemmy.world (OP) · 5 points · 17 hours ago

      If your argument attacks my credibility, that’s fine - you don’t know me. We can find cases where developers embrace the technology and cases where they refuse it.

      Do you have anything substantive to add to the discussion of whether LLMs are anything more than a tool that lets workers abstract further, pushing every profession they touch toward some combination of better, faster, cheaper, or easier?

      • HarkMahlberg@kbin.earth · 9 points · 7 hours ago

        Yeah, I’ve got something to add. The ruling class will use LLMs as a tool to lay off tens of thousands of workers to consolidate more power and wealth at the top.

        LLMs also advance no profession at all while they can still hallucinate and be manipulated by their owners, producing more junk that requires a skilled worker to fix. Even my coworkers have said, “If I have to fix everything it gives me, why didn’t I just do it myself?”

        LLMs also have dire consequences outside the context of labor. Because of how easy they are to manipulate, they can be used to manufacture consent and warp public consciousness toward their owners’ ideals.

        LLMs are also a massive financial bubble, ready to pop and send us into a recession. Nvidia is shoveling money into companies so they can shovel it back into Nvidia.

        Would you like me to continue on about the climate?