I realize I need to upgrade my little NUC to something bigger for faster inference of larger Llama models. I want something you can still keep on your living room's TV bench (so no monster rack, please) but that also has the necessary muscle for Llama when needed. Budget doesn't matter right now; I want to understand what's good and what's out there. Thanks

  • anamethatisnt@sopuli.xyz · 1 month ago

    Con: Fewer guides, a more complicated setup, and having to handle the translation away from CUDA with IPEX-LLM and so on. Not everything will run.

    Pro: Looking at an Intel Arc Pro B70 with 32GB at less than half the price of an RTX 5090 sure makes one curious to try it.