Normalize saying “I asked Baal…I asked Baal-Zebub…I asked my tarot cards…” and acting like they’re all fucking equally accurate because they probably are.
I’ve been arguing that people treat ChatGPT like a legit fortune teller. They ask it questions about the future and believe that shit.
I wonder what you’d get if you told Gemini to pull/lay “random tarot cards” and then had ChatGPT explain their meaning
The same thing as any tarot spread generator. The meanings are standardized, so ChatGPT isn’t actually doing heavy lifting. The Gemini side is not randomly picking cards.
This was all possible 20+ years ago on a GeoCities website.
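For what it’s worth, the whole “tarot spread generator” really is just a random sample over a fixed lookup table of standardized meanings. A minimal sketch in Python (the card names and one-line meanings below are an illustrative subset, not a full 78-card deck):

```python
import random

# Illustrative subset of Major Arcana cards with standardized one-line
# meanings; a real generator would list all 78 cards, but the logic
# is identical and would have fit on a GeoCities page.
MEANINGS = {
    "The Fool": "new beginnings, spontaneity",
    "The Magician": "willpower, manifestation",
    "The High Priestess": "intuition, hidden knowledge",
    "The Tower": "sudden upheaval, revelation",
    "Death": "transformation, endings",
    "The Star": "hope, renewal",
}

def draw_spread(n=3, seed=None):
    """Draw n distinct cards and return (card, meaning) pairs."""
    rng = random.Random(seed)          # seedable for reproducibility
    cards = rng.sample(list(MEANINGS), n)  # sample without replacement
    return [(card, MEANINGS[card]) for card in cards]

for card, meaning in draw_spread(3):
    print(f"{card}: {meaning}")
```

No model involved: the “reading” is a dictionary lookup on a random draw, which is all an LLM asked to “explain” a spread is paraphrasing anyway.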