

I believe there will be people who let LLMs do only untrusted jobs: a human writes a specification, and the AI writes an implementation along with a machine-checkable proof that it adheres to the spec.
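A minimal sketch of what that workflow could look like, in Lean (all names here are made up for illustration): the human writes only the `Spec` predicate; the untrusted AI supplies both the definition and the proof, and the proof checker is the only thing that needs to be trusted.

```lean
-- Spec, written by the human: f must double its input.
def Spec (f : Nat → Nat) : Prop := ∀ n, f n = n + n

-- Implementation, written by the AI.
def double (n : Nat) : Nat := 2 * n

-- Proof, also written by the AI, but checked mechanically.
theorem double_meets_spec : Spec double := by
  intro n
  simp only [double]
  omega
```

If the proof checks, it doesn't matter how untrustworthy the author of `double` was.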


I think a better example is programmers using AI to autocomplete text. They could write the exact same text by hand or use a dumber autocomplete, but there is no reason to. The product is exactly the same, just delivered with slightly less wear on the programmer's fingers.
I learned about this via Guix but didn’t notice that. Nice!