Move Godot to Codeberg. GitHub is too popular. It’s where all the fool kids go and link from their LinkedIn. It’s Microslop all the way down. The proportion of real contributors is likely higher on Codeberg than on GitHub.
Also, doesn’t GitHub have a vibe-coding platform built into it, making it really easy to create AI slop?
Edit: yep, they do! Get the fuck off of GitHub!
They should close PRs to the public and only accept them from contributors who apply to be vetted.
It sucks, but that’s really the only good way to prevent spam.
Or Captcha’d PRs.
AI is quite good at solving captchas; better than many humans. And it doesn’t really slow down the sloppers to set their machine running, come back in an hour, and solve a puzzle manually to submit. A couple of minutes of work every day and they can still drown the world in bullshit.
Something needs to change, but I’m not convinced that would be enough…
Which AI-poisoned captchas?
'cause hallucinogenic ones seem like a pretty effective deterrent.
Seems like a natural next step is some kind of reputation system for a project’s contributors. If you’ve written 50 successfully merged PRs, you’re certainly less likely to submit trash by any method. Create a mentorship hierarchy. It sounds helpful no matter what. Then the people who have merged 0 PRs to the project will likely work harder too, since they know they’ll be under the most scrutiny.
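Loosely, the gating half of that is easy: count a contributor’s merged PRs in the repo before their new one enters the review queue. A minimal sketch, assuming GitHub’s search API and purely hypothetical tier thresholds (the repo name and tiers are just illustrative, and this is not how any existing tool works):

```python
# Sketch: merged-PR count as a crude reputation signal.
# Thresholds and repo are illustrative; a real system would need caching
# and rate-limit handling.
import os
import requests

GITHUB_SEARCH = "https://api.github.com/search/issues"

def merged_pr_count(repo: str, user: str) -> int:
    """Count PRs by `user` that were merged into `repo`."""
    query = f"repo:{repo} type:pr is:merged author:{user}"
    headers = {"Accept": "application/vnd.github+json"}
    token = os.environ.get("GITHUB_TOKEN")  # optional; raises the rate limit
    if token:
        headers["Authorization"] = f"Bearer {token}"
    resp = requests.get(GITHUB_SEARCH, params={"q": query}, headers=headers, timeout=10)
    resp.raise_for_status()
    return resp.json()["total_count"]

def review_tier(repo: str, user: str) -> str:
    """Map merged-PR history to a hypothetical review-priority tier."""
    count = merged_pr_count(repo, user)
    if count >= 50:
        return "trusted"  # light-touch review, can mentor others
    if count >= 5:
        return "known"    # normal review queue
    return "new"          # full scrutiny, maybe paired with a mentor

if __name__ == "__main__":
    print(review_tier("godotengine/godot", "someuser"))
```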
It’s trivial for AI to successfully merge 50 PRs into the projects of other AIs.
It’d be frustrating for a beginner that every time they try to contribute, they’re ignored because they haven’t contributed yet.
When the topic of AI submissions flooding open source projects pops up, my immediate reaction is to think "see, this is why you disregard intentions". Because I genuinely believe a lot of the people submitting this slop are trying to help the project, even if in reality they’re harming it by wasting the maintainers’ time with their crap. They cause harm and deserve to be treated as a source of harm, simple as.
And while most projects could/should use more money, I don’t think that’s the solution; it allows the devs to handle more workload, sure, but the goal should be to reduce it. I think this will be eventually done through pre-sorting contributors: a cathedral for the outsiders, but a bazaar for the insiders.
I think intentions are a spook created by the bourgeoisie to control us, and I wrote an article about it on my blog: https://medium.com/@viridiangrail/8-deeper-levels-to-understand-accidents-and-intentions-308c7dc9b742
There’s a lot in your article I agree with. A lot. I could nitpick some of the middle layers, but the conclusion is the same: we should simply disregard intentions when judging the morality of someone’s actions (including our own).
Especially the 7th layer: what you said there has been living in my mind for a long time, but I was never able to phrase it properly.
About the 8th layer: the bourgeoisie does love to exploit this problem when it helps them to get less blame, since it’s impossible to prove someone doesn’t have good intentions. But I don’t think they created it, I think the problem is older even than our own species, and it comes from developing a theory of mind.
Thank you for sharing it!
I wonder if the influx of slop contributions can be stemmed by a legal document that makes the contributor legally liable for their submission.
Seems like lawyers have been learning the hard way to be accountable for their slop, perhaps these lessons can be leveraged in the open source community.
Legally liable for what? Just being bad code? How are you going to enforce that against some kid in another country?
It’s time to start putting maintainers’ attention behind a paywall. $50 refundable deposit to submit a PR, forfeited if it’s obvious AI slop
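Mechanically, the gate itself is trivial compared to the billing side. A hypothetical webhook sketch (the deposit ledger, endpoint, and names are all made up here) might look like:

```python
# Sketch of a "refundable deposit" gate as a GitHub webhook handler.
# `DEPOSITS` stands in for whatever payment ledger would track the $50;
# the point is only that a PR from someone without a deposit never
# reaches a maintainer's queue.
from flask import Flask, request, jsonify

app = Flask(__name__)

# Hypothetical ledger of users who currently have a deposit on file.
DEPOSITS: set[str] = {"regular_contributor"}

def has_active_deposit(user: str) -> bool:
    return user in DEPOSITS

@app.post("/webhook")
def pull_request_event():
    event = request.get_json(force=True)
    if event.get("action") != "opened":
        return jsonify(status="ignored")
    author = event["pull_request"]["user"]["login"]
    if not has_active_deposit(author):
        # A real setup would call the GitHub API here to label or close
        # the PR and point the author at the deposit instructions.
        return jsonify(status="held", reason="no deposit on file", author=author)
    return jsonify(status="queued", author=author)

if __name__ == "__main__":
    app.run(port=8080)
```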
Real “these kids would be very upset if they could read” situation. Who bothers to pick through the whole EULA before submitting?
Like any open source mass contribution project that’s gained too much popularity, you need extra firebreaks in between the Open Submission and Final Product.
That means adding manpower to submission review, QA, etc, which public projects don’t often have.
Sort of the Achilles’ heel of the open source community. AI is just making the vulnerability extra glaring.
That would be a “closing the gate after the horses have escaped” situation.
Letting unverifiable code in would do damage to developers and users that wouldn’t be easy to disentangle, and it would erode trust in the product, killing it.
Mitchell Hashimoto is trying to build a reputation system to combat this https://github.com/mitchellh/vouch
vibeCoding–
FashSlop*