Not 100% sure what you're asking, but here's how I read it:
The reference standard is assumed to be accurate to within 0.2 °C (presumably right across the temperature range being checked, though that's not explicitly stated). The Davis sensor is then checked at 6 different temperature points and verified as reading within the tolerance given by the accuracy graph. So the maximum possible error relative to the true temperature is (nominal error from the graph + 0.2 °C).
For example, in an extreme case: with the reference indicating -40 °C, the sensor could read -43 °C and still pass NIST verification (assuming the accuracy graph allows ±3 °C at that point). But the true temperature could actually be -39.8 °C, since the reference itself may be off by 0.2 °C, so the maximum error versus true temperature would be 3.2 °C.
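A quick sketch of that arithmetic in Python, if it helps. The ±3 °C tolerance and 0.2 °C reference uncertainty are just the figures from the example above, not values taken from Davis's actual accuracy graph:

```python
# Worst-case error budget: sensor tolerance plus reference uncertainty.
REF_UNCERTAINTY_C = 0.2  # assumed accuracy of the NIST-traceable reference

def worst_case_error(graph_tolerance_c: float) -> float:
    """Maximum possible error vs. TRUE temperature after verification."""
    return graph_tolerance_c + REF_UNCERTAINTY_C

# The extreme case from the example (tolerance of 3 C assumed at -40 C):
ref_reading = -40.0      # what the reference indicates
sensor_reading = -43.0   # still passes: within 3 C of the reference
true_temp = -39.8        # reference itself may be off by 0.2 C

print(worst_case_error(3.0))            # 3.2
print(abs(sensor_reading - true_temp))  # 3.2 -- matches the worst case
```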
NB In general the errors would be smaller, probably much smaller, than in the example above, but (at least as I understand it) the worst case really is possible.