In roughest terms, to be an agnostic is to withhold belief on a matter, whereas to be a fallibilist is to have a belief but recognize that you could be mistaken, that those who disagree with you could have some or all of the truth, and that it is important to comport yourself accordingly.

Then he sets up two epistemic circumstances to illustrate the difference:
Epistemic Circumstance 1 (EC1): You confront a body of presumptive evidence that "reasonable people" (however that is to be understood) generally accept, but you recognize that there are different ways of fitting that evidence into a coherent whole—different "stories" we can tell that fit just as well with the given evidence. In other words, we have certain mutually exclusive holistic ways of seeing the evidence, each of which maps onto the evidence just as well. For simplicity, let us assume there are only two such ways of seeing that fit the evidence equally well, which we will call Worldviews A and B.

While there is no reason, on the evidence, to prefer A or B in EC1, there may be personal reasons. "You might find A more hopeful. Or you might like who you are better when you live as if A is true. Or perhaps you’ve grown up with a community that embraces A, and you continue to have a sense of solidarity with that community. Or perhaps you’ve tried to see the world through the lens of B and it just doesn’t sit right with you because of what you identify as mere quirks of personality. Or perhaps it is a combination of these factors." You make your choice while recognizing that your reasons are idiosyncratic and not required by the evidence.
Epistemic Circumstance 2 (EC2): You confront a body of presumptive evidence that reasonable people generally accept, as well as certain further "apparent truths," that is, things you experience as clearly true/self-evident/obvious/hard to deny/intuitively correct. But some of the people you regard as rational don’t find these apparent truths nearly as apparent as you do, and may instead find other things evident which are hardly evident to you. So, within the total body of "evidence" with which you are confronted, some of it is "shared evidence" whereas some of it is "personal evidence." Now suppose that, as before, Worldviews A and B both map onto the shared evidence (and are the only worldviews you have so far encountered that do this). But now let us suppose, furthermore, that Worldview A maps well onto the conjunction of the shared evidence and your personal evidence, while B doesn’t (accepting B would force you to abandon things that seem clearly right to you). At the same time, Worldview B maps well onto the conjunction of the shared evidence and what is apparently the personal evidence of reasonable people other than you.
In EC2, on the other hand, you do have certain personal evidence that moves you to accept Worldview A rather than B. But note that this choice rests on your personal evidence - evidence which is not accepted by other "reasonable" people. This leads you to hold your personal evidence with less confidence, though it certainly does not mean your personal evidence is wrong.
In EC1, your reasons for favoring A over B are ones that do not appear to you as evidence for the truth of A, and in this sense are seen by you as nothing but pragmatic reasons to operate as if A is true. But in EC2, your reasons for favoring A over B have the "look and feel" of evidence, that is, they seem to be truths that speak in favor of the truth of A. And this makes your epistemic situation clearly different. It means, among other things, that when you endorse A, it is because A seems right to you in a way that B does not. You favor A over B on the basis of considerations that present themselves to you as evidence for the truth of A and against the truth of B.

In EC1 you are agnostic on the theoretical level because you have no reason based on the evidence to hold one over the other, though you may have pragmatic or personal reasons. In EC2 you are not agnostic on the theoretical level because you do have evidence - albeit personal and not universally held - for holding A over B. These features require you to hold an attitude of fallibilism in EC2:
While A just seems right to you in a way that B does not, you also know that you are fallible, and you know that some of the evidence you are using in arriving at A is not regarded as veridical by other people who otherwise seem eminently reasonable. This fact alone does not make the evidence seem less veridical to you, but it does motivate an attitude of due caution, a willingness to investigate, to hear opposing arguments and be open to being moved by them if they do amount to "defeaters" of your presumptive evidence. And it also makes you resistant to condemning those who endorse B.

I'll leave the application of these distinctions as an exercise for the reader. I only wanted to note them here.