Our News Team @ 11 with host Snot Flickerman


Yes, I can hear you, Clem Fandango!

  • 13 Posts
  • 1.36K Comments
Joined 3 years ago
Cake day: October 24th, 2023

  • I’ve been running Ubuntu on laptops for a lot longer than five years, and the last time I had real WiFi issues was over a decade ago. That’s why I think it may be Debian-related or, based on your description, possibly a closed-source driver issue. There are actually quite a lot of WiFi devices that use chipsets we don’t have proper Linux drivers for at all, and what exists are sort of hacked-together projects that live on GitHub. I’ve had to do that with every Netgear dongle I’ve ever owned: download the drivers from GitHub and compile them myself.



  • Ah, yeah, was there any particular reason you were using LMDE? I’m not sure what parts of systemd it used (especially back then), but I always just edit /etc/systemd/logind.conf to set HandleLidSwitch=ignore and have had zero issues. Pretty sure there’s also a GNOME GUI for changing the same setting, gnome-tweaks.

    I would assume the bad WiFi support was due to it being Debian and Debian being notoriously behind in terms of updates for the sake of stability.
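    The logind tweak mentioned above is a one-line change. A minimal sketch of the relevant stanza (the rest of the file typically ships fully commented out, so only this line needs uncommenting/editing):

    ```ini
    # /etc/systemd/logind.conf
    [Login]
    # Don't suspend when the laptop lid is closed
    HandleLidSwitch=ignore
    ```

    The change takes effect after restarting the service (systemctl restart systemd-logind) or rebooting.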





  • then just rig it up to ping your phone when it’s done…oh who am I kidding these dudes wouldn’t know how to do that.

    Fucking. Exactly. I just made a long comment about how this article feels like they’re talking to non-tech-savvy people who are pretending to be tech-savvy because they talk to a fucking AI.

    Like this dumbfuck kid who “has to keep shipping software” as if that means he’s not shipping it riddled with bugs and security issues since his AI makes the spaghetti code and he just says “I’m sure this is ready for production.”
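    For what it’s worth, the “ping your phone when it’s done” bit really is just a few lines. A minimal sketch, assuming the ntfy.sh push service and a made-up topic name (any HTTP push service works roughly the same way):

    ```python
    import urllib.request

    NTFY_URL = "https://ntfy.sh"  # public push service; the topic name below is made up


    def build_notification(topic: str, message: str) -> urllib.request.Request:
        """Build the HTTP POST that pushes `message` to subscribers of `topic`."""
        return urllib.request.Request(
            url=f"{NTFY_URL}/{topic}",
            data=message.encode("utf-8"),
            method="POST",
        )


    def notify(topic: str, message: str) -> None:
        """Send the push; any phone subscribed to the topic gets a notification."""
        with urllib.request.urlopen(build_notification(topic, message)) as resp:
            resp.read()


    if __name__ == "__main__":
        # e.g. tack this onto the end of a long-running job:
        #   long_job(); notify("my-llm-box", "Job finished")
        notify("my-llm-box", "Job finished")
    ```

    Subscribe to the same topic in the ntfy app on your phone and append notify() to whatever long-running command you kicked off.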


  • This article is so confusing because it seems like everyone they’re talking to is just using online models. Local models get mentioned, but it’s not clear how many of the people being interviewed actually use them, since the whole piece is about laptops. Even the one woman whose screen you can see is just showing a CLI, and it’s not clear she isn’t running the CLI version of Claude.

    I have a mid-range desktop, and running a local LLM can be pretty darn slow on it, especially with an AMD card and ROCm as opposed to Nvidia and CUDA. I have a relatively nice laptop, but its specs are well below my desktop’s, and I just can’t imagine actually running a local LLM on a laptop.

    If they’re not using a local model, then they wouldn’t need to worry about overheating with the lid closed. It’s easy to make a laptop not suspend when the lid is closed via the CLI (at least on Linux, anyway). If they’re offloading all the work to a remote model, their PC can sit essentially idle and draw less power/produce less heat.

    The article also seems strangely focused on Macs? All its mentions of how to keep working with the lid closed are Mac-focused. Did I miss something about the new Apple Silicon being really efficient for local LLMs? Maybe that’s what I’m missing here.

    It just seems almost weirdly narcissistic, like they want people to ask them about it so they can talk about it. Certainly it seems that way with the kid with a startup business that he runs during classes with tokens paid for by his parents.

    Anyway, the whole thing seems odd to me. Either the article is about people who aren’t actually super savvy coders or techies, or they would… just switch it so they can close their fucking laptop… or something about making a show of what they’re doing is part of it. I dunno, weird. Anyway.




  • I have a hard time considering these types of things proper games just because they’re only about “make numbers bigger faster.”

    Also, the sheer number of these games, which are all basically the same in terms of what they do, kind of makes each one just another drop in a sea (pun intended) of them.