It works tho (video)
kersploosh@sh.itjust.works to Programmer Humor@programming.dev · 2 days ago
Jankatarch@lemmy.world · edited 2 days ago
I am actually pretty ok with this type of "messing around" usage, on the condition they also stop killing the environment to train and run these stupid things.
entropicdrift@lemmy.sdf.org · 2 days ago
Yeah, if they were just running it locally off a GPU it would be cooler.
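(For context, a minimal sketch of what "running it locally off a GPU" can look like, assuming llama-cpp-python built with GPU offload support; the model path and prompt below are placeholders, not anything from the thread.)

```python
from llama_cpp import Llama

# Load a quantized model from disk; n_gpu_layers=-1 offloads
# every layer to the GPU (requires a CUDA/Metal-enabled build).
llm = Llama(model_path="./model.gguf", n_gpu_layers=-1)

# One-off completion: only inference happens here, no training.
out = llm("Explain recursion in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```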
psud@aussie.zone · 1 day ago
Running an LLM isn't expensive, whether locally or in the cloud; all the cost is in the training.
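(A rough back-of-envelope illustration of that claim, using the common ~6·N·D FLOPs estimate for training and ~2·N FLOPs per generated token for inference; the 7B-parameter / 2T-token figures below are assumed for illustration, not taken from the thread.)

```python
# Back-of-envelope: training vs. inference compute for an LLM.
# Assumed figures: a 7B-parameter model trained on 2T tokens.
params = 7e9           # N: model parameters
train_tokens = 2e12    # D: training tokens

train_flops = 6 * params * train_tokens   # ~6*N*D heuristic
infer_flops_per_token = 2 * params        # ~2*N per generated token

# How many generated tokens equal one training run's compute?
breakeven_tokens = train_flops / infer_flops_per_token
print(f"Training ~= generating {breakeven_tokens:.1e} tokens")  # ~6.0e+12
```

Under these assumptions, one training run costs as much compute as generating trillions of tokens, which is the sense in which the cost is concentrated in training.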