• SabinStargem@lemmy.today · 4 months ago

    It is going to take at least five years before local AI is user-friendly enough, and performant hardware widespread enough, that ordinary folks would consider buying an AI machine.

    I have a top-end DDR4 gaming rig. It takes a long time for a 100B-sized AI to give some roleplaying output: at least forty minutes with my settings via KoboldCPP with a GGUF. I don’t think a typical person would want to wait more than 2 minutes for a good response. So we will need at least DDR6-era devices before it is practical for everyday people.
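The intuition behind waiting on faster memory can be sketched with some back-of-envelope arithmetic. Local LLM generation is roughly memory-bandwidth-bound: each generated token streams the active weights from RAM once, so tokens/sec is bounded by bandwidth divided by model size. All the numbers below are rough illustrative assumptions, not measurements:

```python
# Rough upper bound on decode speed for a memory-bandwidth-bound model.
# Bandwidth and model-size figures are illustrative approximations.

def tokens_per_second(model_size_gb: float, bandwidth_gb_s: float) -> float:
    """tokens/sec ~= memory bandwidth / bytes of weights read per token."""
    return bandwidth_gb_s / model_size_gb

ddr4_dual = 50.0   # ~dual-channel DDR4-3200, GB/s (approx)
ddr5_dual = 90.0   # ~dual-channel DDR5-5600, GB/s (approx)
model_q4 = 60.0    # ~100B-parameter model at 4-bit quantization, GB (approx)

print(f"DDR4: ~{tokens_per_second(model_q4, ddr4_dual):.1f} tok/s")
print(f"DDR5: ~{tokens_per_second(model_q4, ddr5_dual):.1f} tok/s")
```

Under one token per second either way, which is why a big model on commodity RAM feels unusable for chat.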

    • lmuel@sopuli.xyz · 4 months ago

      A local LLM is still an LLM… I don’t think it’s gonna be terribly useful no matter how good your hardware is

      • maus@sh.itjust.works · 4 months ago

        I have great success with local LLM in some of my workflows and automation.

        I use it for line completion and basic functions/asks while developing that I don’t want to waste tokens on.

        I also use it in automation. I run my own media server for a few dozen people, with an automated request system (Jellyseerr) that adds content. I have automation that leverages a local LLM to look at recent media requests and automatically request similar content.
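That kind of automation can be sketched roughly as below: ask a locally hosted, OpenAI-compatible chat endpoint (llama.cpp and Ollama both expose one) for titles similar to recent requests. The URL, port, model name, and prompt wording here are illustrative assumptions, not the commenter's actual setup:

```python
# Hedged sketch: suggest media similar to recent Jellyseerr requests
# via a local OpenAI-compatible LLM endpoint. URL/model are assumptions.
import json
import urllib.request

def build_prompt(recent_requests: list[str]) -> str:
    """Turn a list of requested titles into a suggestion prompt."""
    titles = "\n".join(f"- {t}" for t in recent_requests)
    return ("Given these recently requested titles:\n"
            f"{titles}\n"
            "Suggest three similar titles, one per line, no commentary.")

def suggest_similar(recent_requests: list[str],
                    url: str = "http://localhost:11434/v1/chat/completions",
                    model: str = "llama3") -> list[str]:
    """Call the local model and return its suggestions as lines."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user",
                      "content": build_prompt(recent_requests)}],
    }).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"].splitlines()
```

The suggestions would then be fed back into the request system's API to queue the downloads.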

      • luridness@lemmy.ml · 4 months ago

        Local AI can be useful. But I would rather see nice implementations that use small but brilliantly tuned models for, let’s say, better predictive text. It’s already somewhat AI-based; I’d just like it to be better.

      • xthexder@l.sw0.com · 4 months ago

        The diminishing returns are kind of insane if you compare the performance and hardware requirements of a 7B and a 100B model. In some cases the smaller model can even perform better, because it’s more focused and won’t be as subtle about its hallucinations.

        Something is going to have to fundamentally change before we see any big improvements, because I don’t see scaling it up further ever producing AGI, or even fixing the hallucinations/logic errors it makes.

        In some ways it’s a bit like the Crypto blockchain speculators saying it’s going to change the world. But in reality the vast majority of applications proposed would have been better implemented with a simple centralized database.

  • Ajinomoto@piefed.social · 4 months ago

    Why not just leave it alone inside a browser tab? If I want AI, and I use it quite a lot, I’ll go to their website. Don’t force it system-wide; it just sucks.

    • Aceticon@lemmy.dbzer0.com (edited) · 4 months ago

      This is pretty much “all tech companies have to jump on the AI hype train” pressure on publicly traded companies and those that need lots of investor money, with little if any customer pressure.

      All investors want their money to be in the same place as those who invested in Google before it made it big, and the AI hype promises exactly that to the “winners” of the AI race.

      Customer needs and demands are well below secondary to investor pressure, especially for companies which have dominant market positions (so general customers have no decent alternatives) and startups whose entire business model is AI.

    • AstralPath@lemmy.ca · 4 months ago

      They want their greasy tendrils all up in your PC’s guts. Every bit of info flowing in your system can be monetized. All they care about is money and dominance and their “AI” in everyone’s devices is their wet dream.

      Cancer is preferable to tech bros, as cancer doesn’t know it’s killing the host. Tech bros know full well their actions are killing the planet and its inhabitants. Their actions are willfully vile and toxic; completely at odds with the needs of humanity.

      Don’t expect them to ever do the right thing for anyone but themselves.

      • AmbiguousProps@lemmy.today · 4 months ago

        I suppose, it just seemed like putting the blame on the consumers rather than greedy, short-sighted executives.

        • Holytimes@sh.itjust.works · 4 months ago

          It’s not; it is, after all, entirely true. The actual blame for the executives is that they don’t understand that the consumer is ignorant beyond reasoning and dumb as rocks.

          They are trying to push a product that’s unreliable to an idiot, when anyone with two brain cells knows not to trust it,

          making it a twice-failed concept from the start.

          A good product should be useable by even a fool and malleable enough to be used by anyone.

    • nyankas@lemmy.world · 4 months ago

      I think it’s quite possible to become confused if you’re used to software that, bugs aside, behaves pretty much completely predictably, then get a feature marketed as “intelligence” which suddenly gives you unpredictable and sometimes incorrect results. I’d definitely be confused if the reliable tools I do my work with suddenly developed a mind of their own.

    • _edge@discuss.tchncs.de · 4 months ago

      Have you recently vibe-edited a Microsoft Copilot Excel sheet on your AI PC (but actually in the cloud)?

    • jj4211@lemmy.world · 4 months ago

      It’s just a softer thing to say than ‘a lot of people hate AI and it’s alienating potential customers’. They can’t come out and say that out loud, they don’t want to piss off Microsoft too much and they aren’t going to try to do NPU-free systems (it’s not really possible). They aren’t going to do anything to ‘fight back’ against the AI that people hate (they can’t), so their best explanation as to why they pull back from a toxic brand strategy is that ‘people just don’t care’ rather than ‘people hate this thing that we are going to keep feeding’.

      But if they need to rationalize the perspective: an “AI” PC does nothing to change the common user’s experience with the AI things they know. It does not change ChatGPT or Opus or anything similar; that stuff is entirely online. So for the common user, all “AI PC” means is a few Windows gimmicks that people either don’t care about or actively complained about (Recall providing yet another way for sensitive data to get compromised).

      In terms of “AI” as a brand value, the ones most bullish about AI are executives who like the idea of firing a bunch of people, and incidentally they actually want to buy fewer PCs as a result. So even where you can find AI-enthusiastic people, they still don’t want AI PCs.

      For most people, their AI experience has been:

      • News stories talking about companies laying off thousands or planning to lay off thousands for AI, AI is the enemy
      • News stories talking about some of those companies having to rehire those people because AI fell over, AI is crap
      • Their feeds being flooded with AI slop and deepfakes, AI is annoying
      • Their google searches now having a result up top that, at best, is about the same as clicking the top non-sponsored link, except that it frequently totally botches information, AI is kind of pointless

      For those that have actually positive AI experience, they already know it has nothing to do with whether the PC is ‘AI’ or not. So it’s just a brand liability, not a value.

  • tal@lemmy.today (edited) · 4 months ago

    Not the position Dell is taking, but I’ve been skeptical that building AI hardware directly into specifically laptops is a great idea unless people have a very concrete goal, like text-to-speech, and existing models to run on it, probably specialized ones. This is not to diminish AI compute elsewhere.

    Several reasons.

    • Models for many useful things have been getting larger, and you have a bounded amount of memory in those laptops, which, at the moment, generally can’t be upgraded (though maybe CAMM2 will improve the situation, move back away from soldered memory). Historically, most users did not upgrade memory in their laptop, even if they could. Just throwing the compute hardware there in the expectation that models will come is a bet on the size of the models that people might want to use not getting a whole lot larger.

    • Heat and power. The laptop form factor exists to be portable. They are not great at dissipating heat, and unless they’re plugged into wall power, they have sharp constraints on how much power they can usefully use.

    • The parallel compute field is rapidly evolving. People are probably not going to throw out and replace their laptops on a regular basis to keep up with AI stuff (much as laptop vendors might be enthusiastic about this).

    I think that a more-likely outcome, if people want local, generalized AI stuff on laptops, is that someone sells an eGPU-like box that plugs into power and into a USB port or via some wireless protocol to the laptop, and the laptop uses it as an AI accelerator. That box can be replaced or upgraded independently of the laptop itself.

    When I do generative AI stuff on my laptop, for the applications I use, the bandwidth that I need to the compute box is very low, and latency requirements are very relaxed. I presently remotely use a Framework Desktop as a compute box, and can happily generate images or text or whatever over the cell network without problems. If I really wanted disconnected operation, I’d haul the box along with me.

    • Euphoma@lemmy.ml (edited) · 4 months ago

      Phones have already come with AI processors for a long time, specifically for speech recognition and camera features. It’s not advertised because it’s from before the bubble started.

    • MonkderVierte@lemmy.zip · 4 months ago

      Historically, most users did not upgrade memory in their laptop, even if they could.

      My Thinkpad x220 begs to differ.

    • cmnybo@discuss.tchncs.de · 4 months ago

      There are a number of NPUs that plug into an m.2 slot. If those aren’t powerful enough, you can just use an eGPU.
      I would rather not have to pay for an NPU that I’m probably not going to use.

    • Goodeye8@piefed.social · 4 months ago

      I’m not that concerned with the hardware limitations. Nobody is going to run a full-blown LLM on their laptop; running one on a desktop would already require building a PC with AI in mind. What you’re going to see being used locally are smaller models (something like 7B using INT8 or INT4). Factor in the efficiency of an NPU and you could get by with 16GB of memory (especially if the models are used in INT4), with little extra power draw and heat. The only hardware concern would be the pace of technological advancement in NPUs, but just don’t be an early adopter and you’ll probably be fine.
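As a rough sanity check on those numbers (a hedged back-of-envelope that assumes weights dominate memory use, with a flat ~20% overhead for KV cache and runtime):

```python
# Approximate RAM footprint of a quantized model:
# parameters * bits-per-weight / 8, plus ~20% overhead (assumption).

def model_footprint_gb(params_billion: float, bits_per_weight: int,
                       overhead: float = 1.2) -> float:
    """Approximate RAM needed to hold the weights plus runtime overhead."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

for bits in (16, 8, 4):
    print(f"7B @ {bits}-bit: ~{model_footprint_gb(7, bits):.1f} GB")
```

A 7B model lands around 4 GB at INT4 and around 8 GB at INT8 under these assumptions, which is why it fits comfortably in a 16GB laptop.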

      But this is where Dell’s point comes in. Why should the consumer care? What benefits do consumers get by running a model locally? Outside of privacy and security reasons, you’re simply going to get a better result by using one of the online AI services, because you’d be using a proper model instead of the cheap one that runs on limited hardware. And even privacy- and security-minded people can just build their own AI server (maybe not today, but when hardware prices get back to normal) that they run from home and expose to their laptop or smartphone. For consumers to desire running a local model (actually locally, and not in a self-hosting kind of way), there would have to be some problem that the local model solves which the over-the-internet solution can’t. Such a problem doesn’t exist today, and there doesn’t seem to be a suitable one on the horizon either.

      Dell is keeping their foot in the door by still putting NPUs into their laptops, so if by some miracle a problem is found that local AI solves, they’re ready. But they realize that NPUs are not something they can actually use as a selling point, because as it stands, NPUs solve no problems; there’s no benefit to running small models locally.

      • jj4211@lemmy.world · 4 months ago

        More to the point, the casual consumer isn’t going to dig into the nitty gritty of running models locally and not a single major player is eager to help them do it (they all want to lock the users into their datacenters and subscription opportunities).

        As for Dell keeping NPUs in their laptops, they don’t really have much of a choice if they want modern processors; Intel and AMD are still all-in on it.

        • Goodeye8@piefed.social · 4 months ago

          Setting up a local model was specifically about people who take privacy and security seriously because that often requires sacrificing convenience, which in this case would be having to build a suitable server and learning the necessary know-how of setting up your own local model. Casual consumers don’t really think about privacy so they’re going to go with the most convenient option, which is whatever service the major players will provide.

          As for Dell keeping the NPUs I forgot they’re going to be bundled with processors.

          • jj4211@lemmy.world · 4 months ago

            My general point is that discussing the intricacies of potential local AI model usage is way over the head of the people that would even in theory care about the facile “AI PC” marketing message. Since no one is making it trivial for the casual user to actually do anything with those NPUs, then it’s all a moot point for this sort of marketing. Even if there were an enthusiast market that would use those embedded NPUs without a distinct more capable infrastructure, they wouldn’t be swayed/satisfied with just ‘AI PC’ or ‘Copilot+’, they’d want to know specs rather than a boolean yes/no for ‘AI’.

    • Nighed@feddit.uk · 4 months ago

      I think part of the idea is: build it and they will come… If 10% of users have NPUs, then apps will find ‘useful’ ways to use them.

      Part of it is actually battery life: if you assume that over the life of the laptop it will be doing AI tasks (unlikely currently), an NPU will be wayyyy more efficient than running them on a CPU, or even a GPU.

      Mostly, though, it’s an excuse to charge more for the laptop. If all the high-end players add NPUs, customers have no choice but to shell out more. Most customers won’t realise that when they use ChatGPT or Copilot on one of these laptops, it’s still not running on their device.

    • Holytimes@sh.itjust.works · 4 months ago

      Weirdly, Dell always seems to understand what normal users want.

      The problem is normal users have beyond low expectations, no standards and are ignorant of most everything tech related.

      They want cheap and easy to use computers that require no service and if there is a problem a simple phone number to call for help.

      Dell has optimized for that. So, hate ’em or not, while their goods have gone to shit quality-wise, they understand their market and have done extremely well in servicing it.

      Thus I’m not surprised at all that Dell understood this. If anything, I would have been more surprised if they didn’t.

      • artyom@piefed.social · 4 months ago

        I think they all understand what we want (broadly), they just don’t care, because what they want is more important, and they know consumers will tolerate it.

        • ouRKaoS@lemmy.today · 4 months ago

          They care, they just care differently. What they want is money, so they’re trying to find what the maximum price is they can sell the minimum amount of product for.

          If they can dress that up as “caring for the consumer” it’s a bonus.

          • artyom@piefed.social · 4 months ago

            You’re not thinking about the bigger picture. They can sell you an irreparable device, design it to fail after a short time so you have to buy another one, upsell you on useless AI shit to pump up investments, and load it with a bunch of invasive software so they can collect and sell information about you. None of this has anything to do with what you, the consumer, want. They know that, but they don’t care, because it’s not what makes them money.

      • anomnom@sh.itjust.works · 4 months ago

        And yet, just before looking at Lemmy, I got an ad for the Dell AI laptop on YouTube (on my TV, still need to get a Pi-hole up and running).

        • vala@lemmy.dbzer0.com · 4 months ago

          Just stop using the TV like that. Hook up a small Linux computer via HDMI and use that instead.

          • anomnom@sh.itjust.works · 4 months ago

            I have an older MacBook with standard HDMI, but there are some creators I really like on YouTube, and we have an ancient Roku stick that still works. The remote is convenient and I usually go pee during the ads.

          • Kazumara@discuss.tchncs.de · 4 months ago

            on YouTube (on my TV, still need to get a Pi-hole up and running)

          Unfortunately that won’t help. The Youtube ads are served from the same domains as the videos, so a DNS based blocker is inherently powerless.

            • Kazumara@discuss.tchncs.de (edited) · 4 months ago

              Can confirm, Firefox with uBlock Origin works. The OS doesn’t seem to matter. I use that combination on Linux (Fedora 43), Windows (10), macOS (15) and Android (16), no YouTube ads anywhere.

          • AmbitiousProcess (they/them)@piefed.social · 4 months ago

            Seconded on Framework. I’ve got the more performant (but heavier, larger, and more expensive) 16, but for most people the 13 will be perfectly usable. The newer 12 model also seems pretty decent and is a bit cheaper.

            They’ve kept their RAM prices relatively stable too, but if you already have other RAM lying around you can just bring your own and save yourself the money. Same for the SSD.

            The main downside is they’re gonna be quite expensive upfront compared to alternatives, so I wouldn’t recommend them to someone price-sensitive, especially in the current economy.

            The main benefit is that since they’re so modular and upgradable, you’ll save money down the line on repair services, replacement parts, or just the cost of buying a whole new device because one component broke that they don’t sell replacements for.

          • Zagorath@quokk.au · 4 months ago

            How is their site (and product) as an option for your non-techy mum? Also does shipping end up being exorbitant if you’re not in the same country they’re based in?

            • RamRabbit@lemmy.world · 4 months ago

              They have a fully prebuilt option for every computer. Which works well for non-techy people.

              No clue what shipping is like in your country. Was fine for me in the US.

          • Damage@feddit.it · 4 months ago

            I’ve got a framework 13 but I wouldn’t suggest them for casual users. They’re very expensive for the specs.

  • ZILtoid1991@lemmy.world · 4 months ago

    > be me
    > installed VScode to test whether language server is just unfriendly with KATE
    > get bombarded with "try our AI!" type BS
    > vomit.jpg
    > managed to test it, but the AI turns me off
    > immediately uninstalled this piece of glorified webpage from my ThinkPad
    

    It seems I’m having to do more jobs with KATE. (Does the LSP plugin for KATE handle stuff differently from the standard in some known way?)

  • manxu@piefed.social · 4 months ago

    Dell is the first Windows OEM to openly admit that the AI PC push has failed. Customers seem uninterested in buying a laptop because of its AI capabilities, likely prioritizing other aspects such as battery life, performance, and display above AI.

    Silicon Valley has always had the annoying habit of pushing technology-first products without much consideration of how they would solve real-world problems, and it’s becoming increasingly bad. When Zuck unveiled the Metaverse it was already starting to be ludicrous, but with the AI laptop wave it turned into Onion territory.

    • Lucidlethargy@sh.itjust.works · 4 months ago

      What do you mean? Do you even have ANY foundation to this accusation?

      Hold on, I need to turn off my heater. 22211123222234663fffvsnbvcsdfvxdxsssdfgvvgfgg

      There it is. The off button. Touch controls are so cool guys.

  • MutantTailThing@lemmy.world · 4 months ago

    For me at least, AI reminds me too much of that thrice cursed MS Word paperclip. I did not want it then and I do not want it now.

    Also, adding ‘personality’ to pieces of software is cringy at best and downright creepy at worst.

    • justsomeguy@lemmy.world · 4 months ago

      Forget about the personality for a minute. They have a different thing in common. Uselessness. I tried AI for a bunch of general use cases and it almost always fails to satisfy. Either it just can’t do the task in the first place or it makes mistakes that then cost too much time to fix.

      There are exceptions and specialized models will have their use but it’s not the Swiss army knife tool AI companies are promising.

      • The_v@lemmy.world · 4 months ago

        AI hardware is a sales pitch without a clear product. Consumers have no clue why they would want to buy something with AI on it.

        For most consumers, AI is a webpage that kids use to cheat on homework and adults attempt to cheat at work with. It makes ugly fake pictures with all sorts of weird errors. It’s also the annoying-as-fuck answering services that you have to yell at 4 or 5 times to get to a real person.

        Why would an AI PC be desirable?

  • ikt@aussie.zone · 4 months ago

    The ‘quiet part out loud’ isn’t being used how it should be

    Microsoft appears to genuinely believe that AI features are what customers want, even if they don’t directly ask for them. Nobody asked for Google’s AI feature, but it’s hugely popular; people freakin’ love it (except here on Lemmy, of course, where it’s the devil). Google sees that users are staying on their site longer and not clicking through to other sites as much, which is a win for them and a win for users.

    A real quiet part out loud would be ‘Microsoft knows its users don’t like its AI features but doesn’t care, because Windows isn’t a money maker for them compared to Microsoft Copilot 365’.

    • jj4211@lemmy.world · 4 months ago

      The real ‘quiet part’ would be that they are avoiding it because a large number of people hate ‘AI’. To say they are ‘confused’ is still keeping the quiet part quiet…

    • kescusay@lemmy.world · 4 months ago

      Is that a win for Google, though? They make most of their money from AdSense, because websites want to display ads. If people aren’t clicking through to websites from their search results, that seems like fewer opportunities to display ads, reducing the viability of AdSense.

      • ikt@aussie.zone · 4 months ago

        seems like so far things are going well, businesses spend more on ads on google 🙃

        Google, for now, is still laughing its way to the bank. The search giant is putting advertisers’ ads within or directly above and below AI Overviews themselves. As Google CEO Sundar Pichai said in an interview last year, “If you put content and links within AI Overviews, they get higher clickthrough rates than if you put it outside of AI Overviews.” Organic links are pushed down.

        Even in 2024, market research company SparkToro’s study of searchers found “almost 30 percent of all clicks go to platforms Google owns.” For every 1,000 Google searches in the United States, 360 clicks go to a non-Google-owned, non-Google-ad-paying property. With the rise of AI Overview, Google’s share can only have grown, while everyone else’s has shrunk.

        https://www.theregister.com/2025/07/29/opinion_column_google_ai_ads/

        I believe Maps and YouTube were money losers for Google for a very long time, but the goal was to keep people within the Google ecosystem, where they can extract money from users in other ways.

  • Blackmist@feddit.uk · 4 months ago

    Yeah, I’m not sure what the point of a cheap NPU is.

    If you don’t like AI, you don’t want it.

    If you do like AI, you want a big GPU or to run it on somebody else’s much bigger hardware via the internet.

    • rumba@lemmy.zip · 4 months ago

      A cheap NPU could have some uses. If you have a background process that runs continuously, offloading the work to a low-cost NPU can save you both power and processing. Camera-based presence detection: if you get up, it locks; if you sit down, it unlocks. No reason to burn a CPU core or the GPU for that. Security/nanny camera recognition. Driving systems monitoring for a driver losing consciousness and pulling over. We can accomplish all of this now with CPUs/GPUs, but purpose-built systems that don’t drain other resources aren’t a bad thing.

      Of course, there’s always the downside that they use that chip for Recall, or malware gets hold of it for Recall-style snooping or ID theft. There’s a whole lot of bad you can do with a low-cost NPU too :)

  • stoy@lemmy.zip · 4 months ago

    I’d much rather have a more powerful generic CPU than a less powerful generic CPU with an added NPU.

    There are very few people who would benefit from an added NPU. OK, I hear you say: what about local AI?

    Ok, what about it?

    Would you trust a commercial local AI tool to not be sharing data?

    Would your grandmother be able to install an open source AI tool?

    What about having enough RAM for the AI tool to run?

    Look at the average computer user, if you are on lemmy, chances are very high that you are far more advanced than the average computer user.

    I am talking about those users who don’t run an adblocker, don’t notice the YT ad-skip button, and who in the past would have installed a minimum of five toolbars in IE without noticing the reduced view of the actual page.

    These people are closer to the average users than any of us.

    Why do they need local AI?

    • biggerbogboy@sh.itjust.works · 4 months ago

      There’s also the fact that many NPUs are pretty much useless unless used for a very specific model built for the hardware, so there’s no real point having them

    • tal@lemmy.today · 4 months ago

      My understanding from a very brief skim of what Microsoft was doing with Copilot is that it takes screenshots constantly, runs image recognition on them, and then makes them searchable as text, with the ability to go back and view those screenshots in a timeline. Basically, adding more search without requiring application-level support.
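The flow described (capture, recognize, index, search) can be sketched roughly as below. The screenshot/recognition step is a stand-in stub here, since that on-device image-to-text work is exactly the part an NPU would accelerate; a real implementation would also use a proper full-text index rather than a LIKE query:

```python
# Hedged sketch of a Recall-style pipeline: capture -> recognize -> index
# -> search. Capture/OCR are stubbed; storage is a throwaway SQLite table.
import sqlite3
import time

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE snapshots (taken_at TEXT, text TEXT)")

def capture_and_index(recognized_text: str) -> None:
    # Stand-in: real code would capture the screen and run recognition here.
    db.execute("INSERT INTO snapshots VALUES (?, ?)",
               (time.strftime("%Y-%m-%d %H:%M:%S"), recognized_text))

def search(query: str) -> list[str]:
    # Simple substring search; a real system would use a full-text index.
    rows = db.execute("SELECT text FROM snapshots WHERE text LIKE ?",
                      (f"%{query}%",))
    return [row[0] for row in rows]

capture_and_index("Quarterly budget spreadsheet open in Excel")
capture_and_index("Chat window discussing travel plans")
print(search("budget"))
```

This also illustrates the privacy concern raised elsewhere in the thread: whatever indexes the screen necessarily stores sensitive text somewhere searchable.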

      They may also have other things that they want to do, but that was at least one.

      • Spuddlesv2@lemmy.ca · 4 months ago

        Do you mean Copilot, the local indexer and search tool or do you mean Copilot the web based AI chat bot or do you mean Copilot the rebranded Office suite or do you mean … etc.

        Seriously, talk about watering down a brand name. Microsoft marketing team are all massive, massive fuck knuckles.

        • ozymandias117@lemmy.world · 4 months ago

          Hey, the last one is great.

          Now when I get asked “what do you think about Copilot,” I can just say, “I prefer LibreOffice”

      • stoy@lemmy.zip · 4 months ago

        Exactly!

        I could even see the cards having RAM slots, so you can add dedicated RAM to the NPU and remove the need to share RAM with the system.

        • unexposedhazard@discuss.tchncs.de · 4 months ago

          The fact that I didn’t know about those means that consumers have zero need for them, and building them into consumer hardware is just an attempt to keep the AI bubble afloat.

    • Endmaker@ani.social · 4 months ago

      an added NPU

      cmiiw but I don’t think NPUs are meant to be used on general-purpose personal computers. A GPU makes more sense.

      NPUs are meant for specialised equipment e.g. object detection in a camera (not the personal-use kind)

      • vithigar@lemmy.ca · 4 months ago

        They are in general purpose PCs though. Intel has them taking up die space in a bunch of their recent core ultra processors.

      • altkey (he\him)@lemmy.dbzer0.com · 4 months ago

        Probably not even on general-purpose GPUs, although we sucked it up when RT and Tensor cores were put on our plates whether we liked it or not. Those at least provided something to the consumer, unlike NPUs.

  • Clent@lemmy.dbzer0.com · 4 months ago

    What a trash clickbait headline. That’s not how “saying the quiet part out loud” works. This isn’t a secret, it’s not unspoken, and it certainly doesn’t reveal some underlying motive.