Father, Hacker (Information Security Professional), Open Source Software Developer, Inventor, and 3D printing enthusiast

  • 3 Posts
  • 88 Comments
Joined 2 years ago
Cake day: June 23rd, 2023

  • It’s not a shame. Have you tried this? Try it now! It only takes a minute.

    Test a bunch of images against ChatGPT, Gemini, and Claude. Ask each one if the image was AI-generated. I think you’ll be surprised.

    Gemini is the current king of that sort of image analysis but the others should do well too.

    What do you think the experts use? LOL! They’re going to run an image through the same exact process that the chatbots would use, plus some additional steps if they don’t find anything obvious on the first pass.


  • I don’t think it’s irresponsible to suggest to readers that they can use an AI chatbot to examine any given image to see if it was AI-generated. Even the lowest-performing multimodal chatbots (e.g. Grok and ChatGPT) can do that pretty effectively.

    Also: Why stop at one? Try a whole bunch! Especially if you’re a reporter working for the BBC!

    It’s not like they give an answer like “yes: definitely fake” or “no: definitely real.” They’ll analyze the image and give you some information about it, such as tell-tale signs that it could have been faked.

    But why speculate? Try it right fucking now: Ask ChatGPT or Gemini (the current king at such things BTW… For the next month at least hahaha) if any given image is fake. It only takes a minute or two to test it out with a bunch of images!

    Then come back and tell us that’s irresponsible with some screenshots demonstrating why.
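
    If you want to run that test over a whole folder of images instead of pasting them into a web UI, a short script like the rough sketch below is enough. This is just an illustration using the OpenAI Python SDK; the model name and prompt are example values, and Gemini and Claude have equivalent APIs you could point it at instead.

    ```python
    # Rough sketch only: asks a vision-capable chatbot whether an image looks
    # AI-generated. Assumes the `openai` package is installed and OPENAI_API_KEY
    # is set. The model name and prompt are example values, not recommendations.
    import base64
    from openai import OpenAI

    client = OpenAI()

    def ask_if_ai_generated(image_path: str) -> str:
        # Send the local image inline as a base64 data URL.
        with open(image_path, "rb") as f:
            b64 = base64.b64encode(f.read()).decode()

        response = client.chat.completions.create(
            model="gpt-4o",  # any multimodal model works here
            messages=[{
                "role": "user",
                "content": [
                    {"type": "text",
                     "text": "Does this image look AI-generated? List any tell-tale signs."},
                    {"type": "image_url",
                     "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
                ],
            }],
        )
        return response.choices[0].message.content

    if __name__ == "__main__":
        print(ask_if_ai_generated("suspect.jpg"))
    ```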


  • Or like isn’t the UK the most surveilled country with their camera system?

    Ahahah! That’s a good one!

    You think all those cameras are accessible to everyone or even the municipal authorities? Think again!

    All those cameras are mostly useless—even for law enforcement (the only ones with access). It’s not like anyone is watching them in real time and the recordings—if they even have any—are like any IT request: Open a ticket and wait. How long? I have no idea.

    Try it: If you live in the UK, find some camera in a public location and call the police to ask them, “Is there an accident at (the location the camera is pointing at)?”

    They will ask you all sorts of questions before answering you (just tell them you heard it through the grapevine or something) but ultimately, they will send someone out to investigate because accessing the camera is too much of a pain in the ass.

    It’s the same situation here in the US. I know because the UK uses the same damned cameras and recording tech. It sucks! They’re always looking for ways to make it easier to use and every rollout of new software actually makes it harder and more complicated!

    How easy is the ticket system at your work? Now throw in dozens of extra government-mandated fields 🤣

    Never forget: The UK invented bureaucracy and needless paperwork!



  • Why’d you give up on local image generation? With FLUX-based models and tools like ComfyUI, it’s actually better than what you get with cloud-based services. You have a lot more control, and the wide availability of LoRAs makes it much more fun/useful, IMHO.

    Having said that, if you don’t have a modern GPU with at least 8GB of VRAM, it’s not going to be a great experience. 16GB is preferable.

    My great wish is for there to be affordable, fast GPUs with at least 32GB of VRAM. That would be enough to play a modern AAA game while also running other AI workloads at the same time (e.g. as a secondary aspect of the game).

    I have two really fantastic game ideas that can’t really exist without the average gamer having access to that level of hardware. Not for fancy graphics; for the AI possibilities 😁
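
    For anyone curious what the non-ComfyUI route looks like, here’s a minimal sketch using Hugging Face diffusers. It’s one way to run a FLUX model locally, not a tuned recipe; the prompt and settings are just example values, and enable_model_cpu_offload() is what makes ~8GB cards survivable.

    ```python
    # Minimal local-generation sketch with diffusers. Assumes a CUDA GPU and the
    # diffusers/torch packages installed; the prompt, resolution, and step count
    # are example values, not recommendations.
    import torch
    from diffusers import FluxPipeline

    pipe = FluxPipeline.from_pretrained(
        "black-forest-labs/FLUX.1-schnell",  # the smaller, faster FLUX variant
        torch_dtype=torch.bfloat16,
    )
    pipe.enable_model_cpu_offload()  # keeps only the active submodule on the GPU

    image = pipe(
        "a macro photo of a 3D-printed gear, studio lighting",
        num_inference_steps=4,   # schnell is distilled for very few steps
        guidance_scale=0.0,      # schnell runs without classifier-free guidance
        height=768,
        width=768,
    ).images[0]

    image.save("output.png")
    ```

    diffusers pipelines can also load LoRA weights on top of the same model, which is where a lot of the extra control and fun comes from.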


  • If the cost of using it is lower than the alternative, and the market willing to buy it is the same. If the current cloud-hosted tools cease to be massively subsidized, and consumers choose to avoid it, then it’s inevitably a historical footnote, like turbine-powered cars, Web 3.0, and LaserDisc.

    There’s another scenario: Turns out that if Big AI doesn’t buy up all the available stock of DRAM and GPUs, running local AI models on your own PC will become more realistic.

    I run local AI stuff all the time from image generation to code assistance. My GPU fans spin up for a bit as the power consumed by my PC increases but other than that, it’s not much of an impact on anything.

    I believe this is the future: Local AI models will eventually take over just like PCs took over from mainframes. There are a few thresholds that need to be met for that to happen, but it seems inevitable. It’s already happening for image generation, where the local AI tools are so vastly superior to the cloud stuff there’s no contest.
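
    As a concrete example of how unremarkable the local setup is, here’s roughly what a local code-assistant call looks like against an Ollama server on its default port. The model name is just an example; nothing leaves your machine.

    ```python
    # Tiny sketch of the "local code assistant" workflow: a prompt sent to a model
    # running entirely on your own PC. Assumes an Ollama server on localhost:11434
    # with some code model already pulled -- the model name below is an example.
    import json
    import urllib.request

    def ask_local_model(prompt: str, model: str = "qwen2.5-coder") -> str:
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(ask_local_model("Write a Python function that parses an ISO 8601 timestamp."))
    ```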




  • Ooh this looks awesome!

    I help people design analog keyboards all the time and I always strongly suggest they use an RP2040 directly instead of plugging in a 3rd-party microcontroller board (e.g. a Black Pill). Yet they always balk at the seeming complexity of doing that!

    Something like this will lower that barrier by a huge amount 👍

    Note: Putting an RP2040 into anything is actually super easy. It’s one of the easiest MCUs to work with! It’s also not very picky about how far away you put things like capacitors and the crystal (even though the data sheet says otherwise). I’ve seen working boards where they placed the crystal and flash chip waaaaay TF too far away (IMHO).
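
    The firmware side is just as low-barrier. Here’s a tiny CircuitPython sketch reading one analog key (e.g. a Hall-effect sensor) on an RP2040; the pin, threshold, and polling rate are made-up example values, not from any particular keyboard design.

    ```python
    # CircuitPython sketch: poll one analog key on an RP2040 and report presses.
    # The pin assignment and threshold are illustrative only; tune per switch.
    import time
    import board
    import analogio

    key_sensor = analogio.AnalogIn(board.A0)   # ADC pin wired to the sensor output
    PRESS_THRESHOLD = 30000                    # raw 16-bit reading (0..65535)

    while True:
        value = key_sensor.value
        if value > PRESS_THRESHOLD:
            print("key pressed:", value)
        time.sleep(0.001)                      # ~1 kHz polling, plenty for a demo
    ```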