I came home recently to find that my request to a gardener to “cut back the shrubs” led him to absolutely decimate them.

What does ‘cut back’ mean? I meant trim the overgrowth slightly but otherwise leave the shrubs intact. It was a text I sent while travelling and listening to a podcast, without putting much real thought into it. To him it meant significantly reduce the size of the shrubs. The problem was my instruction rather than the gardener’s execution: I had given a vague request that was easy to misinterpret.
I would suggest this is what many users do with LLMs, and it often leads them to blame the machine for what was actually their own failure to provide adequate instructions. Prompting well requires clarity about your own intentions, which is something most of us lack at least some of the time.
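To make the parallel concrete, here is a minimal sketch in Python. The prompts are invented for illustration and no particular model or API is assumed; the point is only the difference between the two instructions:

```python
# The same editing request, written two ways.

# Vague, like my text to the gardener: the model has to guess
# how much to cut and what matters to me.
vague_prompt = "Cut back this paragraph."

# Explicit about scope, degree, and what must survive the edit.
precise_prompt = (
    "Trim this paragraph by roughly 10-20 percent. Remove redundant "
    "phrases, but keep every claim and example, and preserve the "
    "original tone."
)
```

The second version takes perhaps thirty seconds longer to write, and it closes off almost all of the room for misinterpretation.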
