In 2006, the CPI’s All-US–All-Items 12-month standard errors increased by more than 50% over the prior year’s median, before returning to typical pre-2006 levels in 2007 and 2008. Because the overall sample size had not been appreciably reduced in 2006, the explanation for this substantial rise had to lie elsewhere. One possibility was that one or more of the individual (replicate) variance components was contributing a disproportionate share of the overall variance. A decomposition analysis of the Stratified Random Groups variance calculation system was produced, and the results showed that one or two major contributors at the lowest level of aggregation accounted for as much as half of the entire All-US–All-Items variance. This paper investigates the nature and genesis of these anomalies and their impact on the overall CPI variance, and then compares how different variance methodologies would have handled them.
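The decomposition described above can be sketched in a few lines. This is a hypothetical illustration, not the BLS production system: it assumes a textbook random-groups estimator in which each stratum supplies k replicate estimates, and stratum-level variances are assumed to add up to the aggregate variance. The function names, data layout, and toy replicate values are invented for the example.

```python
import numpy as np

def random_groups_variance(replicates):
    """Classic random-groups variance of the mean:
    sum((theta_i - theta_bar)^2) / (k * (k - 1))."""
    reps = np.asarray(replicates, dtype=float)
    k = reps.size
    theta_bar = reps.mean()
    return ((reps - theta_bar) ** 2).sum() / (k * (k - 1))

def decompose_by_stratum(strata):
    """Per-stratum variance contributions and each stratum's share
    of the total, assuming independent strata so variances add."""
    contributions = {name: random_groups_variance(reps)
                     for name, reps in strata.items()}
    total = sum(contributions.values())
    shares = {name: v / total for name, v in contributions.items()}
    return contributions, total, shares

# Hypothetical data: stratum_B's replicates are far more dispersed,
# so it dominates the aggregate variance.
contribs, total, shares = decompose_by_stratum({
    "stratum_A": [1.0, 1.1, 0.9, 1.0],
    "stratum_B": [1.0, 3.0, -1.0, 1.0],
})
```

In a table like this, a single stratum whose replicate estimates are much more dispersed than the others can contribute the bulk of the aggregate variance, which is the kind of pattern the decomposition analysis uncovered.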