When the AI bubble pops, what will remain? Cheap GPUs at firesale prices, skilled applied statisticians looking for work, and open source models that already do impressive things, but will grow far more impressive after being optimized:
Hmm and what about the everyday user who needs to ask AI how long to cook potatoes? What will they do after the pop?
completely uncharted territory. No one tried to cook a potato until Sam Altman graced us plebs with ChatGPT
Local models are actually pretty great! They are not great at everything… but for what most people are using LLMs for they do a fine job.
That's from llama3.1:8b and the answer is decent. It took about 20 seconds to generate and used no more power than if I were to play a video game for the same amount of time.
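For anyone curious what asking a local model actually looks like, here's a minimal sketch assuming an Ollama install serving llama3.1:8b on its default port (the commenter's actual setup isn't specified, so treat the endpoint and timing code as illustrative):

```python
# Minimal sketch: query a locally served llama3.1:8b via Ollama's HTTP API.
# Assumptions: Ollama is running on its default port and the model is pulled.
import time
import requests  # any HTTP client works; requests is just convenient

def ask_local_model(prompt: str, model: str = "llama3.1:8b") -> str:
    """Send a single prompt to a local Ollama server and return the reply."""
    start = time.time()
    resp = requests.post(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    print(f"generated in {time.time() - start:.1f}s")  # roughly the ~20 s mentioned above
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local_model("How long should I boil potatoes for mashing?"))
```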
Boiling time isn’t related to original potato size, it’s related to the size of pieces you cut. So the first half is irrelevant and the second half is overly verbose.
They’ll ask their parents, or look up cooking instructions on actual websites.
Cue 8 paragraphs of “I first learned to cook potatoes with my grandfather back in…” and every reader screaming “Oh my god just get to the recipe…”
How will I remember to put avocado in my BBQ sauce, without AI?