GPT-3 or GPT-4 can give us "convincing liars", but we still need to figure out how to combine them with actual factual databases and do quick fact-checking/validation/inference on top. GPT-3 shows us a convincing human-like style, but no real substance. It's a massive step forward in any case.
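Roughly what I mean by "combine with a factual database": generate first, then check each claim against a trusted store before trusting the output. A toy sketch, not a real pipeline; every function and the tiny FACT_DB here are hypothetical stand-ins, and a real system would use an actual LLM call, a knowledge base, and a proper claim-extraction/NLI step.

```python
from typing import Dict, List

# Toy "trusted" store; a real system would query a knowledge base.
FACT_DB: Dict[str, str] = {
    "water boiling point (sea level)": "100 C",
}

def generate_draft(prompt: str) -> str:
    """Stand-in for an LLM call: fluent text, not necessarily true."""
    return "Water boils at 100 C at sea level. The Moon is made of cheese."

def extract_claims(text: str) -> List[str]:
    """Naive claim splitter; a real pipeline would use an extraction model."""
    return [s.strip() for s in text.split(".") if s.strip()]

def check_claim(claim: str) -> bool:
    """Accept a claim only if it matches something in the trusted store."""
    return any(value in claim for value in FACT_DB.values())

def fact_checked_answer(prompt: str) -> List[Dict[str, object]]:
    """Generate a draft, then flag which of its claims are supported."""
    draft = generate_draft(prompt)
    return [{"claim": c, "supported": check_claim(c)} for c in extract_claims(draft)]

if __name__ == "__main__":
    for result in fact_checked_answer("Tell me about boiling water."):
        print(result)
```

The point of the split is that the generator stays a "convincing liar" and the substance comes from the verification layer, not the model itself.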
I might try to generate soft-science essays with GPT-3 at one of my universities to see if it passes through TA filters.