Ask HN: Strangest or most unexpected behavior you've seen from GitHub Copilot?

5 points, posted 11 hours ago
by stikit

Item id: 46642593

3 Comments

wryoak

10 hours ago

That doesn’t seem weird. Wrong, sure, but not out of the ordinary.

stikit

10 hours ago

I would expect that the mode is a feature of the VS Code integration and not something the model would necessarily even be aware of. For it not to operate correctly in ask mode when that mode is properly set seems to me like a bug in the underlying integration. The fact that the model responded the way it did and then corrected itself is interesting.

wryoak

7 minutes ago

Well, you didn’t share the exact text of the original prompt, but you described yourself as having asked it to do something. These models are being trained for agentic behavior on data where an agent is asked _to do something_, and since their output is purely probabilistic, the rewarded response will often include the text "I have done something" even when they have not done it. Perhaps there _is_ an issue with the integration that caused the response you experienced, but based purely on my experience and the limited information you gave, my immediate guess is that the model positively associates your prompt with the generated response.