SumnerH said:
1. I fill my balls on the sideline outside at 50F.
2. You fill yours in the locker room at 70F.
3. The refs call for the balls to be tested. We take them there, where they're immediately tested. Both sets are 12 PSI.
Given those circumstances, you'd expect that if the balls were then held outside for a half at 50F and re-tested, mine would still read 12 PSI (no temperature change for the air inside, so no pressure change), but yours would read a little under 11 PSI (the 20F drop should cost roughly 1 to 1.1 PSI).
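That ~1 PSI figure falls straight out of Gay-Lussac's law: at fixed volume, absolute pressure scales with absolute temperature. A quick sketch (the 12 PSI gauge reading, the 70F/50F temperatures, and the 14.7 PSI atmospheric pressure are the illustrative numbers from above, not official measurements):

```python
ATM_PSI = 14.7  # assumed sea-level atmospheric pressure; gauges read relative to this

def rankine(deg_f: float) -> float:
    """Convert Fahrenheit to the absolute Rankine scale."""
    return deg_f + 459.67

def cooled_gauge_psi(gauge_psi: float, fill_f: float, field_f: float) -> float:
    """Gauge pressure after a fixed volume of gas cools from fill_f to field_f."""
    absolute = gauge_psi + ATM_PSI  # convert gauge reading to absolute pressure
    return absolute * rankine(field_f) / rankine(fill_f) - ATM_PSI

print(round(cooled_gauge_psi(12.0, 70.0, 50.0), 2))  # roughly 11.0 PSI
```

The key subtlety is that the ratio applies to *absolute* pressure (~26.7 PSI), not the gauge reading, which is why a mere 4% temperature drop still shaves a full PSI off the gauge.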
Here's another plausible scenario:
1) Assume all balls are filled indoors at 70-whatever but the Colts are at 13.5 and the Pats are at 12.5.
- They are measured indoors (with gauge) before the game starts, and all pass.
2) After the half, the Colts complain about "deflation" and all balls are brought inside for immediate testing, while they are still cold from the elements.
- As the Peter King video shows, you can test 24 balls pretty darn quick, so it can easily be done while all are cold, and now the Colts' balls are 12.5 while the Pats' have dipped to 11.5
- The spare balls are obtained and tested. All are at 12.5 because they have been indoors the whole time.
3) After the game is over, the balls are brought back inside. There's no hurry to do the test, so the refs go about their normal business and don't perform the test for some 30-60 minutes.
- All balls have returned to room temperature and are now sitting at the acceptable levels of 12.5 and 13.5 respectively.
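Running the same fixed-volume gas-law arithmetic over this scenario's numbers (13.5 and 12.5 PSI fills at an assumed 70F, cooled to an assumed 50F on the field) reproduces the halftime readings described above to within about a tenth of a PSI:

```python
ATM_PSI = 14.7  # assumed sea-level atmospheric pressure

def halftime_psi(fill_psi: float, fill_f: float = 70.0, field_f: float = 50.0) -> float:
    """Gauge reading at field temperature for a ball filled indoors at fill_psi."""
    absolute = fill_psi + ATM_PSI  # gas law uses absolute pressure
    return absolute * (field_f + 459.67) / (fill_f + 459.67) - ATM_PSI

for team, fill in [("Colts", 13.5), ("Pats", 12.5)]:
    print(team, round(halftime_psi(fill), 1))  # Colts ~12.4, Pats ~11.5
```

Both sets lose about the same ~1 PSI, but only the set that started at 12.5 ends up below the legal floor, which is the whole point of the scenario.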
No shenanigans, no evil geniuses, no malfeasant ball boys. Just simple issues involving varying times, temperatures, and different starting PSIs.