Can confirm. Had a situation with 3.5 where it refused to show me example Ada code in response to the prompt “Give me some Ada code”. It responded with something along the lines of “Ada is not used in Python and therefore I cannot help you with this request”. My custom instruction is “We use Python 3 with strict typing”.
Without knowing the prior conversation, do you think it was confusing Ada with the Americans with Disabilities Act and the associated “code” from governments? Where it was parsing it as some weird combination of the two?
Unexpected twists in responses, coming from the discovery of surprising second- and third-order intersections of concepts, are a pattern I expected to find a lot more of in my conversations with ChatGPT, but it almost never strays from the line of thinking most strongly correlated with the prompt's most strongly correlated context.
It's very likely I just haven't tried hard enough, because I wasn't trying to make this happen; I just expected it would happen anyway.