The Current Employment Statistics State and Area (CESSA) program produces monthly industry employment estimates for subnational areas based on a survey of about 634,000 nonfarm worksites. Before estimates are published, they go through several screening procedures at the micro (individual report) and macro (estimation cell) levels. For extreme outlier detection at the macro level, CESSA adapted a process based on the Fay-Herriot model. The standardized difference between the sample-based estimate (Y1) and the synthetic part (Y2) of the model is used to identify significant deviations as candidates for macro editing. When the standardized difference exceeds a given threshold and analysts cannot find economic reasons to support the extreme movement, the modeled estimate replaces the direct sample-based estimate. This paper examines the process CESSA uses to identify extreme macro outliers and its application to employment estimates at the state and area level. An empirical study explores the effect of this procedure on error variance and bias when extreme estimates are adjusted at different standardized cut-off levels.
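The screening rule described above can be sketched as follows. This is a minimal illustration, not CESSA's actual implementation: the function name, the example figures, the standard error, and the 3.0 cutoff are all hypothetical, and the real procedure operates within a full Fay-Herriot estimation framework.

```python
def screen_macro_outlier(y1, y2, se_diff, cutoff=3.0):
    """Flag an estimation cell for macro editing when the standardized
    difference between the direct sample-based estimate (y1) and the
    synthetic model part (y2) exceeds the cutoff.

    se_diff is the standard error of the difference (y1 - y2).
    Returns (flagged, z), where z is the standardized difference.
    """
    z = (y1 - y2) / se_diff
    return abs(z) > cutoff, z

# Hypothetical cell: direct estimate 10,500 jobs, synthetic 10,000,
# standard error of the difference 150.
flagged, z = screen_macro_outlier(10500, 10000, 150)
# z is about 3.33, so the cell is flagged at the 3.0 cutoff; analysts
# would then look for economic reasons before replacing the estimate.
```

In practice the choice of cutoff trades off error variance against bias, which is the trade-off the empirical study examines.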