When the AI bubble pops, what will remain? Cheap GPUs at fire-sale prices, skilled applied statisticians looking for work, and open-source models that already do impressive things and will grow far more impressive once optimized:

  • Zkuld@lemmy.world · 6 days ago

    Hmm, and what about the everyday user who needs to ask AI how long to cook potatoes? What will they do after the pop?

    • rem26_art@fedia.io · 6 days ago

      completely uncharted territory. No one tried to cook a potato until Sam Altman graced us plebs with ChatGPT

    • cubism_pitta@lemmy.world · 6 days ago

      Local models are actually pretty great! They are not great at everything… but for what most people are using LLMs for, they do a fine job.

      That's from llama3.1:8b, and the answer is decent; it took about 20 seconds to generate and used no more power than playing a video game for the same amount of time.
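
      For anyone who wants to try the same thing, here is a minimal sketch assuming Ollama is installed locally and serving llama3.1:8b on its default port; the prompt, endpoint details, and timing are illustrative assumptions, not taken from the comment above.

        import time
        import requests  # assumes a local Ollama server on its default port (11434)

        # Ask a locally hosted llama3.1:8b model a simple cooking question
        # via Ollama's /api/generate endpoint, with streaming disabled so the
        # full answer comes back in one JSON response.
        start = time.time()
        resp = requests.post(
            "http://localhost:11434/api/generate",
            json={
                "model": "llama3.1:8b",
                "prompt": "How long should I boil medium potatoes for mashing?",  # example prompt
                "stream": False,
            },
            timeout=120,
        )
        print(resp.json()["response"])
        print(f"Generated in {time.time() - start:.1f} s")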

      • br3d@lemmy.world · 6 days ago

        Cue 8 paragraphs of “I first learned to cook potatoes with my grandfather back in…” and every reader screaming “Oh my god just get to the recipe…”