A 2.0 PSI drop seems beyond what could happen just from temperature, given the game conditions. The change happens as quickly as the ball adjusts to the outside temperature (IOW, the answer is the same as for "how long does it take a ball that was indoors at 70F to reach 40F all the way through when you take it outside into 40F weather?"); I don't know the figure specifically, but it seems like 90 minutes outside would at least get you most of the way there.
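To put a rough number on that equilibration, here's a Newton's-law-of-cooling sketch. The time constant is a pure guess on my part (I haven't seen any measured figure for a football), so treat the output as illustrative only:

```python
import math

# Newton's law of cooling: T(t) = T_out + (T_in - T_out) * exp(-t / tau).
# tau is a made-up illustrative value -- a leather ball full of air
# plausibly has a time constant on the order of tens of minutes, but
# I have no measured number for it.

def ball_temp_f(minutes, t_in=70.0, t_out=40.0, tau_minutes=30.0):
    """Core temperature (F) after `minutes` outside, per Newton's cooling."""
    return t_out + (t_in - t_out) * math.exp(-minutes / tau_minutes)

for t in (30, 60, 90):
    print(f"after {t} min: {ball_temp_f(t):.1f}F")
# With tau = 30 min: ~51F at 30 min, ~44F at 60 min, ~41.5F at 90 min --
# i.e., most of the way to ambient well before halftime.
```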
What was the temperature at halftime (I've only seen references to the kickoff temp)? If it was 50F, then about a 1-1.1 PSI drop is attributable to temperature. If it was 40F, that rises to about 1.6 PSI.
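Those numbers check out with Gay-Lussac's law (pressure proportional to absolute temperature at fixed volume). A quick back-of-the-envelope check, assuming the balls started at 12.5 PSI gauge (the league minimum) in a 70F locker room and that atmospheric pressure is 14.7 PSI; those starting conditions are my assumptions, not reported facts:

```python
# Gay-Lussac's law at fixed volume: P1/T1 = P2/T2, using ABSOLUTE
# pressure and temperature. Assumptions (mine): 12.5 PSI gauge at 70F,
# 14.7 PSI atmospheric.

ATM_PSI = 14.7  # assumed sea-level atmospheric pressure

def f_to_rankine(temp_f):
    """Convert Fahrenheit to Rankine (the absolute scale the gas law needs)."""
    return temp_f + 459.67

def gauge_after_cooling(gauge_psi, indoor_f, outdoor_f):
    """Gauge pressure once the ball equilibrates to the outdoor temperature."""
    p1_abs = gauge_psi + ATM_PSI
    p2_abs = p1_abs * f_to_rankine(outdoor_f) / f_to_rankine(indoor_f)
    return p2_abs - ATM_PSI

for outdoor in (50, 40):
    p2 = gauge_after_cooling(12.5, 70, outdoor)
    print(f"70F -> {outdoor}F: {p2:.2f} PSI gauge (drop of {12.5 - p2:.2f} PSI)")
# 70F -> 50F: ~11.47 PSI gauge (drop ~1.0)
# 70F -> 40F: ~10.96 PSI gauge (drop ~1.5)
```

Note the unit trap: the ratio has to be taken on absolute pressure (gauge + atmospheric) and absolute temperature, or the drop comes out far too large.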
Either way, there's still some 'splainin to do if the difference is 2 PSI. Perhaps that explanation lies in pressure gauges that are only accurate to 0.5 PSI; perhaps it lies in air leakage over time, or in Gronk spikes and game wear (though the fact that the Colts' balls didn't see the same drop, and that the balls seem evenly worn, makes the game-wear/air-leakage explanations pretty unlikely to me). Or perhaps the balls turned in to the refs were below spec, or the balls were tampered with post-measurement.