This was the weirdest thing I’ve seen today. These are only the ones I’ve spotted.

funnily enough, these bots are also replying to an obvious repost from another bot account. It’s at the top right now! Beautiful

https://www.reddit.com/r/goodnews/comments/1p8dt2a/_/

tipping points:

  1. consuming so much AI content has led to me being able to see subtle patterns
  2. they're all saying "exactly" and repeating the same thing
  3. their usernames are similar, flower/nature related, two words, no profile pictures
  4. all of their profiles have comments in the exact same format: agreement, then a summary
  5. and they all have porn on their profile. oh
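The username pattern in point 3 (two words, flower/nature themed) can be sketched as a rough filter. This is just an illustration — the word list and regex are my own guesses, not anything the bot operator actually uses:

```python
import re

# hypothetical word list for illustration only
NATURE_WORDS = {"flower", "river", "willow", "fern", "meadow", "petal", "moss", "brook"}

def looks_like_bot_name(username: str) -> bool:
    """Rough heuristic: exactly two words, at least one nature-themed."""
    # split CamelCase / underscore-separated names into lowercase words
    words = [w.lower() for w in re.findall(r"[A-Z]?[a-z]+", username)]
    if len(words) != 2:
        return False
    return any(w in NATURE_WORDS for w in words)

print(looks_like_bot_name("WillowBrook"))  # True
print(looks_like_bot_name("xX_gamer_42"))  # False
```

A heuristic like this would obviously misfire on plenty of legitimate accounts; it only becomes useful combined with the other signals above.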

edit: tf?

    • Xylight@feddit.online (OP) · 3 hours ago

      No, when it comes to LLMs there are hardly any "dead giveaways" now. You have to learn to recognize the patterns.

      Omitting the final punctuation is quite a common thing people do; in fact, you did it in your comment. It's probably just part of the system prompt.

      • petersr@lemmy.world · 2 hours ago

        Yeah, an LLM would probably not omit the final punctuation unless specifically prompted to, or unless it's given a ton of example comments to mimic in the prompt.

    • grepe@lemmy.world · 3 hours ago

      i don’t think i (or perhaps anyone) can recognize any single comment as being llm-generated… but when the bots come in force it is still really easy. basically it boils down to this: many replies keep reiterating the same exact points in slightly different ways, with the same exact keywords. if you used chatgpt to summarize each reply, you’d get basically the same thing from all of them.
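That "same keywords, slightly different wording" signal can be turned into a crude automated check without any LLM at all, e.g. Jaccard overlap of word sets across replies. A minimal sketch — the sample replies and the 0.5 threshold are made up for illustration:

```python
import re
from itertools import combinations

def keyword_overlap(a: str, b: str) -> float:
    """Jaccard similarity over lowercase word sets."""
    wa = set(re.findall(r"[a-z']+", a.lower()))
    wb = set(re.findall(r"[a-z']+", b.lower()))
    return len(wa & wb) / len(wa | wb)

# made-up sample replies for illustration
replies = [
    "Exactly, this is such great news for the community",
    "Exactly! Such great news for this community",
    "My cat knocked my coffee over again this morning",
]

# flag pairs that share most of their keywords (threshold is arbitrary)
for (i, a), (j, b) in combinations(enumerate(replies), 2):
    if keyword_overlap(a, b) > 0.5:
        print(f"replies {i} and {j} look suspiciously similar")
# prints: replies 0 and 1 look suspiciously similar
```

A single flagged pair means nothing; a whole thread where most pairs exceed the threshold is exactly the "bots in force" pattern grepe describes.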

    • SGforce@lemmy.ca · 3 hours ago

      I think it’s probably a bug in the script they’re running: it’s cleaning one character too many off the end.
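The off-by-one SGforce describes is an easy bug to write: unconditionally dropping the last character, assuming a trailing newline that isn't always there. A hypothetical sketch of the bug and an obvious fix (function names are mine, not from any real bot script):

```python
def clean_reply(text: str) -> str:
    # buggy: always drops the last character, assuming the model's
    # output ends with a newline -- when it doesn't, the final
    # punctuation gets eaten instead
    return text[:-1]

def clean_reply_fixed(text: str) -> str:
    # safe: only strip actual trailing newlines
    return text.rstrip("\n")

reply = "Exactly, this sums it up perfectly."
print(clean_reply(reply))        # Exactly, this sums it up perfectly
print(clean_reply_fixed(reply))  # Exactly, this sums it up perfectly.
```

That would neatly explain why every bot comment in the thread is missing exactly its final punctuation mark and nothing else.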