Ask HN: Strangest or most unexpected behavior you've seen from GitHub Copilot?

7 pointsposted 22 days ago
by stikit

Item id: 46642593

3 Comments

wryoak

22 days ago

That doesn’t seem weird. Wrong, sure, but not out of the ordinary

stikit

22 days ago

I would expect that the mode is a feature of the vs code integration and not something the model would necessarily even be aware of. For it to not operate correctly in the ask mode when it is properly set seems to me a bug in the underlying integration. The fact that the model responded the way it did and then corrected it is interesting.

wryoak

22 days ago

Well, you didn’t share the exact text of the original prompt but you described yourself as having asked it to do something. These models are being trained for agentic behavior on data where an agent is asked _to do something_, and as their output is purely probabilistic, the rewarded response will then often include the text “I have done something” even though they have not done something. Perhaps there _is_ an issue with the integration that caused your experienced response, but purely based on my experience and the limited information you gave, my immediate guess is that the model positively associates your prompt with the generated response