“General interest in AI PCs has been wavering for a while …”

  • artyom@piefed.social · 9 hours ago

    The only people who have interest in “AI PCs” are the ones trying to sell them to you.

    And there’s not much, if any, support for local AI in commercial software. Copilot doesn’t run local AI, as far as I’m aware. It’s purely marketing.

  • tal@lemmy.today · edited · 8 hours ago

    I’m all for running models locally, as long as one can handle the hardware cost of not sharing hardware with other users, for privacy reasons and so forth, but laptops aren’t a fantastic hardware platform for heavy parallel computation.

    • Limited ability to dissipate heat.

    • Many current laptops have a limited ability to be upgraded, especially on memory. Memory is currently a major limiting factor on model size, and right now, laptops are likely to be memory-constrained due to shortages and due to soldered memory; most can’t be upgraded a couple of years down the line when memory prices are lower.

    • Limited ability to use power on battery.

    In general, models have been getting larger. I think it is very likely that, for almost any area we can think of, we can get a better result by producing a larger model. There are tasks that don’t absolutely need a large model, but the odds are that one could do better with a larger one.

    Another issue is that the hardware situation is rapidly changing, and it may be that significantly better hardware will be out before long.

    So unless you really, really need to run your computation on a laptop, I’d be inclined to run it on another box. I’ve commented on this before: I use a Framework Desktop to do generative AI stuff remotely from my laptop when I want to do so. I need very little bandwidth for the tasks I do, and anywhere I have a laptop and a cell link, it’s available. If I absolutely had to have a high-bandwidth link to it, or use it without Internet access, I’d haul the box along with me. Sure, it needs wall power (or at least a power station), but you aren’t going to be doing much heavy parallel computation on a laptop without plugging it into a wall anyway.
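    To give a concrete sense of the “run it on another box” setup: here’s a minimal sketch of querying a model served on a home desktop from a laptop. It assumes the desktop runs an OpenAI-compatible server such as llama.cpp’s llama-server; the hostname, port, and model name below are made up.

    ```python
    # Sketch: laptop-side client for a model hosted on a home desktop.
    # Assumes an OpenAI-compatible endpoint (e.g. llama.cpp's llama-server);
    # DESKTOP_HOST is a made-up LAN address.
    import json
    import urllib.request

    DESKTOP_HOST = "http://desktop.lan:8080"  # assumed address of the compute box

    def build_chat_request(prompt: str) -> dict:
        """Build the JSON body for a /v1/chat/completions call."""
        return {
            "model": "local-model",  # llama-server serves whatever model it loaded
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 256,
        }

    def ask(prompt: str) -> str:
        """Send the prompt to the desktop and return the model's reply text."""
        body = json.dumps(build_chat_request(prompt)).encode()
        req = urllib.request.Request(
            f"{DESKTOP_HOST}/v1/chat/completions",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["choices"][0]["message"]["content"]
    ```

    The point being: the request and response are a few kilobytes of JSON, which is why so little bandwidth is needed over a cell link.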

    Even with non-laptop hardware, unless you really, really need to do a bunch of parallel computation in the near future, you may be better off waiting until prices come down, especially since a lot of hardware costs have shot up.

    EDIT: I’m also not at all convinced that a lot of the things one thinks might need to be done on-laptop actually need to be done on-laptop. For example, let’s say that one likes Microsoft’s Recall feature. I am pretty sure that someone could put together a bit of software to do the image recognition and tagging on a home desktop when one plugs one’s laptop in at night to charge — log the screenshots at runtime, but do the number crunching later. Maybe also do fancy compression then, to bring the size down further. Yeah, okay, that way the search index doesn’t get updated until maybe that night, but we’ve had non-realtime-updated file indexes for a long time, and they worked fine. I have my crontab on my Linux box update the locate database nightly to this day.
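    The nightly-batch idea above could be as simple as a couple of crontab entries on the desktop; the `caption-screenshots` tool and its paths here are hypothetical, only the `updatedb` line reflects what I actually run.

    ```shell
    # Crontab sketch: caption the day's synced screenshots overnight, then
    # refresh the file index. caption-screenshots is a hypothetical tool.
    # m  h  dom mon dow  command
    30 2  *   *   *   /usr/local/bin/caption-screenshots ~/screenshots --index ~/.screenshot-index
    0  3  *   *   *   updatedb   # nightly locate-database refresh, as mentioned above
    ```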

    EDIT2: Not to mention that if you have a parallel compute box on the network, it can be used by phones or whatever too. And you’ll probably get better image-recognition results from the much larger model that can run on a box like that.

    I mean, you want to always be solving a user problem. Do people want captioned screenshots of what they’ve been doing, to search their usage history? Maybe. Do they need it immediately, and are they willing to accept the associated cost, battery-life, and caption-quality tradeoffs for that immediacy? I’m a lot more skeptical about that.

    EDIT3: And I do get that, if you want to provide remote access to a parallel compute box, self-hosting is hard today. But that still seems like a problem that Microsoft is in a prime position to work on. Make it plug-and-play to associate a parallel compute box with a laptop. Plug it in over USB-C like an eGPU. Have a service to set up a reverse proxy for the parallel compute box, or provide the option to use some other service. Hell, provide the option to do cloud compute on it.
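    As one existing do-it-yourself approach to the remote-access problem: a reverse SSH tunnel from the compute box to a cheap public VPS already works today, it’s just nowhere near plug-and-play. Hostnames and ports here are made up.

    ```shell
    # Sketch: expose the compute box's local model server (port 8080) via a
    # public VPS, so a laptop or phone can reach it from anywhere.
    # vps.example.com is a made-up host you control.
    ssh -N -R 8080:localhost:8080 user@vps.example.com
    # -N: no remote shell, tunnel only
    # -R: forward the VPS's port 8080 back to this box's port 8080
    ```

    This is exactly the kind of fiddly setup a vendor could make one-click.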

    Steam does something like this with Steam Link for local network use, leveraging a “big” box for parallel compute so that small, portable devices can use it for games. Does Microsoft?

    searches

    Yeah.

    https://www.xbox.com/en-US/consoles/remote-play

    Play remotely from your Xbox console

    Play games installed on your console, including titles in the Xbox Game Pass library, on LG Smart TVs, Samsung Smart TVs, Amazon Fire TV devices, and Meta Quest headsets, as well as other browser supported devices like PCs, smart phones, and tablets.

    Yeah. They’re literally already selling parallel compute hardware that you put on your home network and then use on portable devices. I mean, c’mon, guys.