> Often we might think we know something and later conclude that we were wrong... but in this case, it's common to say "I thought I knew such-and-such, but I didn't". ... At least, that's one way of using the word "know". So if we're trying to give an analysis of this thing, we can't just decide to discard truth as a requirement of knowledge... the goal is to give an analysis of this thing that people talk about, not replace it with something else.
Let's consider a case like that. A student receives a graded history test, and says "I thought I knew that James Madison was the 3rd president of the United States, but I didn't."
Under the "justified true belief" account, one might say that their knowledge was undermined because they learned it wasn't true. But their teacher, born (say) circa 1970, has no more direct access to the truth of who was the 3rd U.S. President than her students do.
So we might say instead that the student changed their mind about their knowledge because a rather flimsy justification for believing that Madison was the 3rd president ("I studied really hard") was kicked out from under them by a somewhat sturdier justification for a contradictory belief ("My teacher says that Jefferson was the 3rd President, and she knows a lot about history.")
> That said, it sounds like you're sympathetic to a line of thought on which this so-called "traditional" conception of knowledge isn't very useful, and we should focus on the sorts of things that Bayesian epistemology focuses on... something like degrees of confidence.
> Under the "justified true belief" account, one might say that their knowledge was undermined because they learned it wasn't true. But their teacher, born (say) circa 1970, has no more direct access to the truth of who was the 3rd U.S. President than her students do.
I'm not quite sure what you mean, and I think a bit more precision is helpful here. If one adopts a JTB conception of knowledge, then one wouldn't say that the knowledge was undermined; one would say that the student never had knowledge in the first place, since the belief wasn't true. Yes, it's also true that the (weak) justification for believing Madison was the 3rd president was outweighed by the (new and stronger) evidence that he wasn't... but that's about the justification, not the knowledge per se.
(Quick note: philosophers tend to use "undermined" for when something removes the evidential force, rather than countering the force. For instance, suppose I see someone who looks exactly like you in the library. Ordinarily that's pretty good evidence that you were in the library, and it would be reasonable for me to believe you were in the library. However, if I found out that you had an identical twin, this would undermine the evidence, and assuming this is all the evidence I had, it would not be rational for me to believe you were in the library.)
For what it's worth, the post-Gettier literature (and its failure to get much clarity about the concept of knowledge) is one of the things that pushed people towards the more Bayesian approach to epistemology.
(Personally I think the Bayesian approach is avoiding some hard but genuine issues about rationality. Specifically, in Bayesian epistemology everything is relative to your priors. But there are lots of probabilistically coherent priors. Are they all equally good? If so, then there's really not much we can say in general about what's good or bad evidence, or what people should or shouldn't believe; it's all pretty relative. I don't think that's right... it seems like one can be irrational yet coherent, otherwise it's hard to say how people like you and me are more rational than people believing massive conspiracy theories. Lots more could be said here, but that's why I and plenty of other folks aren't happy with an "anything goes" view about priors. But then for folks like me who want to say that not all coherent priors are equally rational, it's hard to say what would make some priors better than others in purely Bayesian terms... so I think we end up back in the realm of traditional epistemology. That said, all this is pretty controversial and this is a pretty quick sketch of an argument, so there's of course lots to be said in response.)
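The "everything is relative to your priors" point can be made concrete with a toy Bayes'-theorem calculation. (This is purely illustrative; the agents, numbers, and function names are invented for the example, not anything from the discussion above.) Two agents agree completely on the likelihoods, each has credences that sum to 1 (so each is probabilistically coherent), and yet after conditioning on the very same evidence they end up with very different posteriors:

```python
def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """P(H | E) by Bayes' theorem, given a prior P(H) and the likelihoods
    P(E | H) and P(E | not-H)."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Both agents assign the same likelihoods: the evidence E is four times
# as probable if H is true as if it is false.
p_e_given_h, p_e_given_not_h = 0.8, 0.2

# An "open-minded" agent starts at P(H) = 0.5; a "skeptic" at P(H) = 0.001.
# Both priors are coherent (each agent's credences in H and not-H sum to 1).
open_minded = posterior(0.5, p_e_given_h, p_e_given_not_h)    # 0.8
skeptic = posterior(0.001, p_e_given_h, p_e_given_not_h)      # ~0.004

print(round(open_minded, 3), round(skeptic, 4))
```

Nothing inside the Bayesian machinery says which starting prior was the more rational one, which is exactly the worry: if we want to say the skeptic's prior is unreasonable, the grounds for saying so have to come from outside the formalism.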
You're right, JTB would say they didn't have knowledge. But why? Is it because it wasn't justified or because it wasn't true? If it's the latter, how can you ever ascertain whether something is or isn't knowledge (i.e. how do you know 'truth')? That's my fundamental problem with this account.
I agree, also, that defining good knowledge as beliefs based on good priors just pushes the problem back a step. I tend to think this regress is unsolvable except in pragmatic terms (what is efficacious when tried against the real world is what is good).
Most folks who do traditional epistemology would say that you could have very good justification yet fail to have knowledge because what you believe isn't true.
And yes, this probably means that in some sense, when you know something, you don't have direct access to the fact that this is knowledge. It's also often the case that you shouldn't be certain that you know. You can't perfectly ascertain whether you have knowledge... though you can have good reason to believe that you know something.
Most philosophers these days deny that knowledge requires either certainty or direct access. For instance, we know lots of things on the basis of testimony and other sorts of non-deductive evidence.
> That said, it sounds like you're sympathetic to a line of thought on which this so-called "traditional" conception of knowledge isn't very useful, and we should focus on the sorts of things that Bayesian epistemology focuses on... something like degrees of confidence.
Yes, I'd say so.