HomeMy WebLinkAboutAQ_GEN_PLNG_20220404_SIP_RH-SIP_AppE3 MPE Appendix E-3 Model Performance Evaluation for Particulate Matter and Regional Haze of the CAMx 6.40 Modeling System and the VISTAS II 2011 Updated Modeling Platform October 29, 2020 This page intentionally left blank. Model Performance Evaluation for Particulate Matter and Regional Haze of the CAMx 6.40 Modeling System and the VISTAS II 2011 Updated Modeling Platform for Task 8.0 Prepared for: Southeastern States Air Resource Managers, Inc. 1252 W. Government St., #1375 Brandon, MS 39043 Under Contract No. V-2018-03-01 Prepared by: Alpine Geophysics, LLC 387 Pollard Mine Road Burnsville, NC 28714 and Eastern Research Group, Inc. 1600 Perimeter Park Dr., Suite 200 Morrisville, NC 27560 Final – October 29, 2020 Alpine Project Number: TS-527 ERG Project Number: 4133.00.006 Model Performance Evaluation – PM and Regional Haze October 29, 2020 i This page is intentionally blank. Model Performance Evaluation – PM and Regional Haze October 29, 2020 ii Contents Page 1.0 Introduction .......................................................................................................................1 2.0 Model Performance Evaluation ........................................................................................5 2.1 Graphical Presentations ........................................................................................7 2.2 Ambient Measurement Networks .......................................................................14 2.2.1 Ambient Air Quality Observations .........................................................14 2.2.2 IMPROVE...............................................................................................15 2.2.3 CASTNET...............................................................................................15 2.2.4 CSN .........................................................................................................16 2.3 CAMx Species Mapping .....................................................................................16 2.4 Summary and Comparison to EPA MPE Results ...............................................17 3.0 PM2.5 Sulfate ...................................................................................................................18 4.0 PM2.5 Nitrate ...................................................................................................................25 5.0 PM2.5 Ammonium ...........................................................................................................32 6.0 PM2.5 OC .........................................................................................................................38 7.0 PM2.5 EC .........................................................................................................................43 8.0 Total PM2.5 ......................................................................................................................48 9.0 Performance on 20% Most-Impaired Days.....................................................................53 10.0 PM2.5 Composition and Contributions to Light Extinction ............................................60 Appendix A VISTAS12 Modeling Domain Model Performance Metrics by Network, Station, Pollutant, and Season Appendix A-1 VISTAS12 Modeling Domain CASTNET Model Performance Metrics by Station, Pollutant, and Season Appendix A-2 VISTAS12 Modeling Domain CSN Model Performance Metrics by Station, Pollutant, and Season Appendix A-3 VISTAS12 Modeling Domain 
IMPROVE Model Performance Metrics by Station, Pollutant, and Season Appendix B VISTAS12 Modeling Domain Scatter Plots of PM2.5 Species by Network, Pollutant, and Month Appendix C Scatter, Soccer, and Bugle Plots by Class I Area for the 20% Most Impaired Days and 20% Clearest Days Appendix D VISTAS12 Modeling Domain Soccer Plots of PM2.5 Species by Network, Pollutant, and Month Appendix E VISTAS12 Modeling Domain Bugle Plots of PM2.5 Species by Network, Pollutant, and Month Appendix F VISTAS12 Modeling Domain Observed and Modeled Concentration and Light Extinction Comparisons Model Performance Evaluation – PM and Regional Haze October 29, 2020 iii TABLES Table 2-1. Fine Particulate Matter Performance Goals and Criteria ...............................................6 Table 2-2. Overview of Utilized Ambient Data Monitoring Networks .........................................15 Table 2-3. Species Mapping from CAMx into Observation Network ...........................................16 Table 3-1. Model Performance Statistics for PM2.5 Sulfate by Region, Network, and Season. ....19 Table 4-1. Model Performance Statistics for PM2.5 Nitrate by Region, Network, and Season. ....26 Table 5-1. Model Performance Statistics for PM2.5 Ammonium by Region, Network, and Season ................................................................................................................................33 Table 6-1. Model Performance Statistics for PM2.5 OC by Region, Network, and Season. ..........38 Table 7-1. Model Performance Statistics for PM2.5 EC by Region, Network, and Season............43 Table 8-1. Model Performance Statistics for PM2.5 by Region, Network, and Season. .................48 FIGURES Figure 1-1. IMPROVE Monitor Locations and the VISTAS 12km Domain. .................................4 Figure 2-1. Example Scatter Plot of Average 2011 Monthly Sulfate Concentration at IMPROVE Sites in VISTAS States (left) and 20% Clearest Days at Everglades National Park (right). ...........................................................................................................7 Figure 2-2. Example Box Plot of Monthly Average Nitrate Concentration for Non-VISTAS State CSN Sites. ...................................................................................................................8 Figure 2-3. Example Spatial Plot of Nitrate NMB by Network for Summer Months (Circle = IMPROVE, Square = CASTNET, Diamond = CSN). .........................................................9 Figure 2-4. Example Soccer Plot of Monthly Nitrate Normalized Mean Bias and Error for CASTNET Sites in VISTAS States (left) and PM2.5 Species on the 20% Most Anthropogenically Impaired Days at Great Smoky Mountains National Park (right). .....10 Figure 2-5. Example Bugle Plot of Monthly Mean Fractional Bias as a Function of Modeled Concentration at IMPROVE Sites in VISTAS States (top) and Mean Fractional Error for PM2.5 Species on the 20% Clearest Days at Saint Marks (bottom). ...........................11 Figure 2-6. Example Mass Daily Stacked Bar Chart for PM2.5 Species on the 20% Most Anthropogenically Impaired Days at Shaenandoah. ..........................................................12 Figure 2-7. Example Extinction Daily Stacked Bar Chart for PM2.5 Species on the 20% Most Anthropogenically Impaired Days at Shaenandoah. ..........................................................13 Figure 2-8. 
Example Observed (Obs) and Predicted (Mod) Mass Concentrations (Left) and Light Extinctions (Right) at the Dolly Sods Wilderness on the Observed 20% Most Anthropogenically Impaired Days. ....................................................................................14 Figure 3-1. Boxplot Comparisons of Model Predictions and IMRPOVE Sulfate Observations for Each Climate Region by Month. ..................................................................................20 Model Performance Evaluation – PM and Regional Haze October 29, 2020 iv Figure 3-2. Boxplot Comparisons of Model Predictions and CSN Sulfate Observations for Each Climate Region by Month. ........................................................................................21 Figure 3-3. Boxplot Comparisons of Model Predictions and CASTNET Sulfate Observations for Each Climate Region by Month. ..................................................................................22 Figure 3-4. Spatial Plots of Sulfate NMB by Season and Network (Circle = IMPROVE, Square = CASTNET, Diamond = CSN). ...........................................................................23 Figure 3-5. Spatial Plots of Sulfate NME by Season and Network (Circle = IMPROVE, Square = CASTNET, Diamond = CSN). ...........................................................................24 Figure 4-1. Boxplot Comparisons of Model Predictions and IMPROVE Nitrate Observations for Each Climate Region by Month. ..................................................................................27 Figure 4-2. Boxplot Comparisons of Model Predictions and CSN Nitrate Observations for Each Climate Region by Month. ........................................................................................28 Figure 4-3. Boxplot Comparisons of Model Predictions and CASTNET Nitrate Observations for Each Climate Region by Month. ..................................................................................29 Figure 4-4. Spatial Plots of Nitrate NMB by Season and Network (Circle = IMPROVE, Square = CASTNET, Diamond = CSN). ...........................................................................30 Figure 4-5. Spatial Plots of Nitrate NME by Season and Network (Circle = IMPROVE, Square = CASTNET, Diamond = CSN). ...........................................................................31 Figure 5-1. Boxplot Comparisons of Model Predictions and CSN Ammonium Observations for Each Climate Region by Month. ..................................................................................34 Figure 5-2. Boxplot Comparisons of Model Predictions and CASTNET Ammonium Observations for Each Climate Region by Month. ............................................................35 Figure 5-3. Spatial Plots of Ammonium NMB by Season and Network (Square = CASTNET, Diamond = CSN). ..............................................................................................................36 Figure 5-4. Spatial Plots of Ammonium NME by Season and Network (Square = CASTNET, Diamond = CSN). ..............................................................................................................37 Figure 6-1. Boxplot Comparisons of Model Predictions and IMPROVE Organic Carbon (OC) Observations for Each Climate Region by Month. ...................................................39 Figure 6-2. Boxplot Comparisons of Model Predictions and CSN Organic Carbon (OC) Observations for Each Climate Region by Month. ............................................................40 Figure 6-3. 
Spatial plots of organic carbon (OC) NMB by season and network (Circle = IMPROVE, Diamond = CSN). ..........................................................................................41 Figure 6-4. Spatial plots of organic carbon (OC) NME by season and network (Circle = IMPROVE, Diamond = CSN). ..........................................................................................42 Figure 7-1. Boxplot Comparisons of Model Predictions and IMPROVE Elemental Carbon (EC) Observations for Each Climate Region by Month. ...................................................44 Figure 7-2. Boxplot Comparisons of Model Predictions and CSN Elemental Carbon (EC) Observations for Each Climate Region by Month. ............................................................45 Model Performance Evaluation – PM and Regional Haze October 29, 2020 v Figure 7-3. Spatial plots of elemental carbon (EC) NMB by season and network (Circle = IMPROVE, Diamond = CSN). ..........................................................................................46 Figure 7-4. Spatial plots of elemental carbon (EC) NME by season and network (Circle = IMPROVE, Diamond = CSN). ..........................................................................................47 Figure 8-1. Boxplot Comparisons of Model Predictions and IMPROVE Total PM2.5 Observations for Each Climate Region by Month. ............................................................49 Figure 8-2. Boxplot Comparisons of Model Predictions and CSN Total PM2.5 Observations for Each Climate Region by Month. ..................................................................................50 Figure 8-3. Spatial plots of total PM2.5 NMB by season and network (Circle = IMPROVE, Diamond = CSN). ..............................................................................................................51 Figure 8-4. Spatial plots of total PM2.5 NME by season and network (Circle = IMPROVE, Diamond = CSN). ..............................................................................................................52 Figure 9-1. Observed Sulfate (Top) and Modeled NMB (Bottom) for Sulfate on the 20% Most-impaired Days at IMPROVE Monitor Locations. ....................................................54 Figure 9-2. Observed Nitrate (Top) and Modeled NMB (Bottom) for Nitrate on the 20% Most-impaired Days at IMPROVE Monitor Locations. ....................................................55 Figure 9-3. Observed OC (Top) and Modeled NMB (Bottom) for OC on the 20% Most- impaired Days at IMPROVE Monitor Locations. .............................................................56 Figure 9-4. Observed EC (Top) and Modeled NMB (Bottom) for EC on the 20% Most- impaired Days at IMPROVE Monitor Locations. .............................................................57 Figure 9-5. Observed Total PM2.5 (Top) and Modeled NMB (Bottom) for Total PM2.5 on the 20% Most-impaired Days at IMPROVE Monitor Locations. ...........................................58 Figure 9-6. Observed NACL (Top) and Modeled NMB (Bottom) for NACL on the 20% Most-impaired Days at IMPROVE Monitor Locations. ....................................................59 Figure 10-1. Example Daily Observed (Obs) and Predicted (Mod) Total Mass Concentrations (Top) and Light Extinctions (Bottom) at the St. Mark’s Wildlife Refuge on the Observed 20% Clearest Days.............................................................................................61 Figure 10-2. 
Example Averaged Observed (Obs) and Predicted (Mod) Total Mass Concentrations (Left) and Light Extinctions (Right) at the St. Mark’s Wildlife Refuge on the Observed 20% Clearest Days. .................................................................................62 Model Performance Evaluation – PM and Regional Haze October 29, 2020 vi Abbreviations/Acronym List Alpine Alpine Geophysics, LLC AQS Air Quality Subsystem CAMx Comprehensive Air quality Model with eXtensions CASTNET Clean Air Status and Trends Network CM Coarse Mass CSN Chemical Speciation Network DJF December, January, and February (i.e., Winter) EC Elemental carbon ERG Eastern Research Group, Inc. EGU Electric Generating Unit EPA United States Environmental Protection Agency FCRS Crustal fraction of PM FPRM Fine other primary (diameter ≤ 2.5µm) FRM Federal Reference Method IMPROVE Interagency Monitoring of Protected Visual Environments JJA June, July, and August (i.e., Summer) km Kilometer MAM March, April, and May (i.e., Spring) MB Mean Bias ME Mean Error MFB Mean Fractional Bias MFE Mean Fractional Error MPE Model Performance Evaluation N Number of observations NACL Sodium chloride NH4+ Ammonium Ion NMB Normalized Mean Bias NME Normalized Mean Error NO3- Nitrate O3 Ozone OC Organic carbon OM Organic matter OMC Organic mass carbon OSAT Ozone Source Apportionment Technology PEC Primary elemental carbon PM Particulate matter PM2.5 Fine particle; primary particulate matter less than or equal to 2.5 microns in aerodynamic diameter PNH4 Particulate ammonium PNO3 Particulate nitrate PSAT Particulate Source Apportionment Technology PSO4 Particulate sulfate r Pearson correlation coefficient R Programming software language called R Model Performance Evaluation – PM and Regional Haze October 29, 2020 vii RADM-AQ Regional Acid Deposition Model – aqueous chemistry RHR Regional Haze Rule RMSE Root Mean Squared Error SESARM Southeastern States Air Resource Managers, Inc. SIPS State Implementation Plans SMAT-CE Software for Model Attainment Test – Community Edition SO42- Sulfate SOA Secondary organic aerosol SOAP Secondary organic aerosol partitioning SON September, October, and November (i.e., Fall) VISTAS Visibility Improvement – State and Tribal Association of the Southeast Model Performance Evaluation – PM and Regional Haze October 29, 2020 1 1.0 INTRODUCTION Southeastern States Air Resource Managers, Inc. (SESARM) has been designated by the United States Environmental Protection Agency (EPA) as the entity responsible for coordinating regional haze evaluations for the ten Southeastern states of Alabama, Florida, Georgia, Kentucky, Mississippi, North Carolina, South Carolina, Tennessee, Virginia, and West Virginia. The Eastern Band of Cherokee Indians and the Knox County, Tennessee local air pollution control agency are also participating agencies. These parties are collaborating through the Regional Planning Organization known as Visibility Improvement - State and Tribal Association of the Southeast (VISTAS) in the technical analyses and planning activities associated with visibility and related regional air quality issues. VISTAS analyses will support the VISTAS states in their responsibility to develop, adopt, and implement their State Implementation Plans (SIPs) for regional haze. The state and local air pollution control agencies in the Southeast are mandated to protect human health and the environment from the impacts of air pollutants. 
They are responsible for air quality planning and management efforts including the evaluation, development, adoption, and implementation of strategies controlling and managing all criteria air pollutants including fine particles and ozone as well as regional haze. This project will focus on regional haze and regional haze precursor emissions. Control of regional haze precursor emissions will have the additional benefit of reducing criteria pollutants as well. The 1999 Regional Haze Rule (RHR) identified 18 Class I Federal areas (national parks greater than 6,000 acres and wilderness areas greater than 5,000 acres) in the VISTAS region. The 1999 RHR required states to define long-term strategies to improve visibility in Federal Class I national parks and wilderness areas. States were required to establish baseline visibility conditions for the period 2000-2004, natural visibility conditions in the absence of anthropogenic influences, and an expected rate of progress to reduce emissions and incrementally improve visibility to natural conditions by 2064. The original RHR required states to improve visibility on the 20% most impaired days and protect visibility on the 20% least impaired days.1 The RHR 1 RHR summary data is available at: http://vista.cira.colostate.edu/Improve/rhr-summary-data/ Model Performance Evaluation – PM and Regional Haze October 29, 2020 2 requires states to evaluate progress toward visibility improvement goals every five years and submit revised SIPs every ten years. To demonstrate progress toward the improvement goals, the SESARM partners modeled visibility and air quality conditions for a base year of 2011 and future year of 2028. The SESARM VISTAS II Regional Haze modeling analysis was performed by the contractor team Eastern Research Group, Inc. (ERG) and Alpine Geophysics, LLC (Alpine). The preparation and modeling were conducted over several contract tasks, including emission inventory development, ambient data collection, CAMx modeling, and model performance evaluation of the base year. The VISTAS II modeling included particulate matter simulations and source apportionment studies using the 12-kilometer (km) grid based on EPA’s 2011/2028el modeling platform and preliminary source contribution assessment,2 updated to include a 12km subdomain over the VISTAS region and augmented with revisions to electric generating unit (EGU) and non-EGU point source projections. The air quality modeling was conducted using Comprehensive Air quality Model with extensions (CAMx). A detailed description of the modeling platform can be found in the Task 6 modeling report. Under Task 8 of the Regional Haze Modeling for Southeastern VISTAS II Regional Haze Analysis Project, a thorough model performance evaluation (MPE) was conducted for particulate matter less than or equal to 2.5 microns in aerodynamic diameter (PM2.5) species components and light extinction to examine the ability of the CAMx v6.40 modeling system to simulate 2011 measured concentrations. This report documents the MPE for that base year CAMx modeling. The VISTAS II modeling for 2011 is based on the EPA modeling conducted for Regional Haze Analysis, sometimes referred to as the “2011el” modeling. Updates to the EPA platform in the VISTAS II modeling include updating the version of CAMx from version 6.32 to 6.40. Many updates to the CAMx model were implemented between the 6.32 and 6.40 release. According to the CAMx 6.40 release notes, the significant changes included: 2 EPA. 2017. 
Documentation for the EPA's Preliminary 2028 Regional Haze Modeling. U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards. October. Available at: https://www3.epa.gov/ttn/scram/reports/2028_Regional_Haze_Modeling-TSD.pdf. Model Performance Evaluation – PM and Regional Haze October 29, 2020 3
1. Updates to the chemistry to include a condensed halogen mechanism for ocean-borne inorganic reactive iodine, hydrolysis of isoprene-derived organic nitrate, and SO2 oxidation on primary crustal fine PM. This update includes corresponding changes to the Ozone and Particulate Source Apportionment Technology (OSAT/PSAT) algorithms;
2. Inclusion of in-line inorganic iodine emissions to support halogen chemical mechanisms;
3. A major revision to the Secondary organic aerosol partitioning (SOAP) and secondary organic aerosol (SOA) chemistry algorithm;
4. Updates to the Regional Acid Deposition Model aqueous chemistry (RADM-AQ) algorithm; and
5. A major revision to the wet deposition algorithm to identify assumptions or processes that were unintentionally or otherwise unreasonably limiting gas and PM uptake into precipitation. The wet deposition algorithm was simplified and improved in several ways, resulting in increased scavenging of gases and PM.
In addition to the model version, the CAMx 6.32 and 6.40 simulations contained differences from the EPA modeling platform that had been made subsequent to the 2011el/2028el model release. In the most current 2023en simulation, EPA developed new photolysis rates and ozone column data. These updates were included in the updated modeling platform and resulting CAMx 6.40 simulation and were used in the VISTAS II 2011el simulations. Another configuration difference is how the boundary conditions were mapped for speciation between the two versions of the model. EPA and the VISTAS CAMx 6.32 and 6.40 simulations all used the same boundary condition files. However, when CAMx was updated from 6.32 to 6.40, the species in the SOA scheme changed: SOA5, SOA6, and SOA7 were removed and SOA3 and SOA4 were redefined. Neither EPA nor this study remapped the boundary conditions to account for this change. EPA examined the regional haze summary data for all Class I areas and found that total organic carbon (OC) species (not just SOA) accounted for 1-5% of the boundary condition impairment at the Southeastern Class I areas.3 This is a small impact on regional haze, and the impact of SOA on regional haze is even smaller. 3 Brian Timin, EPA Office of Air Quality Planning and Standards (OAQPS) personal communication, October 11, 2018. Model Performance Evaluation – PM and Regional Haze October 29, 2020 4
Figure 1-1 presents the Interagency Monitoring of Protected Visual Environments (IMPROVE) monitor locations in the VISTAS 12-km domain.
Figure 1-1. IMPROVE Monitor Locations and the VISTAS 12km Domain. Model Performance Evaluation – PM and Regional Haze October 29, 2020 5
2.0 MODEL PERFORMANCE EVALUATION
To assess the ability of CAMx to replicate the 2011 base year concentrations of particulate matter and light extinction, an operational model performance evaluation was conducted following the approach outlined in the modeling protocol. For this evaluation, mean bias and normalized mean bias, mean error and normalized mean error, and Pearson's correlation coefficient were used and directly compared to EPA's results4 using these same statistics and observed concentrations.
In addition, mean fractional bias (MFB) and mean fractional error (MFE) were calculated.
Mean bias (MB) is the average difference between predicted (P) and observed (O) concentrations for a given number of samples (n):
MB (μg/m3 or Mm-1) = \frac{1}{n}\sum_{i=1}^{n}(P_i - O_i)
Mean error (ME) is the average absolute value of the difference between predicted and observed concentrations for a given number of samples:
ME (μg/m3 or Mm-1) = \frac{1}{n}\sum_{i=1}^{n}\lvert P_i - O_i \rvert
Normalized mean bias (NMB) is the sum of the difference between predicted and observed values divided by the sum of the observed values:
NMB (%) = \frac{\sum_{i=1}^{n}(P_i - O_i)}{\sum_{i=1}^{n} O_i} \times 100
Normalized mean error (NME) is the sum of the absolute value of the difference between predicted and observed values divided by the sum of the observed values:
NME (%) = \frac{\sum_{i=1}^{n}\lvert P_i - O_i \rvert}{\sum_{i=1}^{n} O_i} \times 100
4 https://www3.epa.gov/ttn/scram/reports/2028_Regional_Haze_Modeling-TSD.pdf. Model Performance Evaluation – PM and Regional Haze October 29, 2020 6
Pearson's correlation coefficient (r) is defined as:
r = \frac{\sum_{i=1}^{n}(P_i - \bar{P})(O_i - \bar{O})}{\sqrt{\sum_{i=1}^{n}(P_i - \bar{P})^2}\,\sqrt{\sum_{i=1}^{n}(O_i - \bar{O})^2}}
Mean Fractional Bias (MFB) is defined as:
MFB (%) = \frac{2}{N}\sum_{i=1}^{N}\frac{P_i - O_i}{P_i + O_i} \times 100
Mean Fractional Error (MFE) is defined as:
MFE (%) = \frac{2}{N}\sum_{i=1}^{N}\frac{\lvert P_i - O_i \rvert}{P_i + O_i} \times 100
Model predictions of PM species were paired in space and time with observational data from the IMPROVE, Chemical Speciation Network (CSN), and Clean Air Status and Trends Network (CASTNET) monitoring sites. These results are organized by network and season (winter (DJF), spring (MAM), summer (JJA), and fall (SON)) for receptors located within the ten VISTAS states and outside of the region. Recommended benchmarks for photochemical model performance statistics (Boylan, 2006; Emery, 2017) are used to assess the applicability of this modeled simulation for regulatory purposes. The goal and criteria values noted in Table 2-1 below are used for this study.
Table 2-1. Fine Particulate Matter Performance Goals and Criteria
Species NMB Goal NMB Criteria NME Goal NME Criteria
24-hr PM2.5 and Sulfate <±10% <±30% <35% <50%
24-hr Nitrate <±10% <±65% <65% <115%
24-hr OC <±15% <±50% <45% <65%
24-hr EC <±20% <±40% <50% <75%
Appendix A presents the MPE statistics in tabular format for the CASTNET (Appendix A-1), CSN (Appendix A-2), and IMPROVE (Appendix A-3) datasets. Model Performance Evaluation – PM and Regional Haze October 29, 2020 7
2.1 Graphical Presentations
In addition to statistical summaries, graphical displays of data allow for a fuller characterization of model performance. Therefore, plots play a key role in any model performance evaluation. Below are examples of the types of plots that are used in this evaluation.
• Scatter plots (Figure 2-1) present the time- and space-matched pairs, with observations on the x-axis and the model predicted concentrations on the y-axis. These plots are useful for indicating trends of either over or under prediction across the range of values. Scatter plots have been prepared for PM2.5 species by network, pollutant, and month (Appendix B) and for SO4, NO3, EC, OC, OM, NACL, PM2.5, PMC, and soils on the 20% clearest and 20% most anthropogenically impaired days at each Class I area in the VISTAS 12 domain (Appendix C).
Figure 2-1. Example Scatter Plot of Average 2011 Monthly Sulfate Concentration at IMPROVE Sites in VISTAS States (left) and 20% Clearest Days at Everglades National Park (right).
• Box plots (Figure 2-2) are a useful tool for model performance evaluation.
These types of plots show the distribution of observations, model estimates, or performance metrics. In this evaluation, box plots are grouped by monthly observed and modeled concentrations by species, network, and region. Our box plots show several quantities: the 25th to 75th percentiles are represented by the lower and upper extent of the box, the median values by the line across the box, and outliers as points outside the box. Model Performance Evaluation – PM and Regional Haze October 29, 2020 8 The monthly box plots can be used to quickly visualize model performance across the entire year, highlighting the seasonal change in model performance.
Figure 2-2. Example Box Plot of Monthly Average Nitrate Concentration for Non-VISTAS State CSN Sites. Model Performance Evaluation – PM and Regional Haze October 29, 2020 9
• Spatial plots of model performance at monitor locations (Figure 2-3) provide an overall picture of the geographic patterns in model performance. Any performance metric can be plotted in this manner, and we include spatial plots of MB, ME, NMB, and NME. The markers are plotted at the monitor location with the color of the marker keyed to the value of the metric being presented.
Figure 2-3. Example Spatial Plot of Nitrate NMB by Network for Summer Months (Circle = IMPROVE, Square = CASTNET, Diamond = CSN). Model Performance Evaluation – PM and Regional Haze October 29, 2020 10
• The soccer plot (Figure 2-4) is so named because the dotted lines illustrating performance goals resemble a soccer goal. The error is plotted on the y-axis and the bias on the x-axis. The plot is a convenient way to visualize both bias and error on a single plot. As bias and error approach zero, the points are plotted closer to or within the "goal," represented by the dashed boxes. The size of the goal is developed from historical values of the metric for each variable from comparable modeling studies. Soccer plots have been prepared for PM2.5 species by network, pollutant, and month (Appendix D) and by species on the 20% clearest and 20% most anthropogenically impaired days at each Class I area in the VISTAS 12 domain (Appendix C).
Figure 2-4. Example Soccer Plot of Monthly Nitrate Normalized Mean Bias and Error for CASTNET Sites in VISTAS States (left) and PM2.5 Species on the 20% Most Anthropogenically Impaired Days at Great Smoky Mountains National Park (right). Model Performance Evaluation – PM and Regional Haze October 29, 2020 11
• The bugle plot (Figure 2-5) is named for the shape formed by the criteria and goal lines. The plots take this shape because the goal and criteria lines are adjusted based on the average concentration of the observed species: as the average concentration becomes smaller, the criteria and goal lines become larger to account for the model's reduced ability to predict at low concentrations. Bugle plots for the mean fractional bias and mean fractional error have been prepared for PM2.5 species by network, pollutant, and month (Appendix E) and by species on the 20% clearest and 20% most anthropogenically impaired days at each Class I area in the VISTAS 12 domain (Appendix C).
Figure 2-5. Example Bugle Plot of Monthly Mean Fractional Bias as a Function of Modeled Concentration at IMPROVE Sites in VISTAS States (top) and Mean Fractional Error for PM2.5 Species on the 20% Clearest Days at Saint Marks (bottom).
Model Performance Evaluation – PM and Regional Haze October 29, 2020 12
• Mass daily stacked bar charts (Figure 2-6) compare 2011 observations to 2011 model values by PM2.5 species mass concentration. Mass daily stacked bar charts have been prepared for the 20% clearest and 20% most anthropogenically impaired days at each Class I area in the VISTAS 12 domain (Appendix F).
Figure 2-6. Example Mass Daily Stacked Bar Chart for PM2.5 Species on the 20% Most Anthropogenically Impaired Days at Shenandoah. Model Performance Evaluation – PM and Regional Haze October 29, 2020 13
• Extinction daily stacked bar charts (Figure 2-7) compare 2011 observations to 2011 model values by PM2.5 species light extinction. Extinction daily stacked bar charts have been prepared for the 20% clearest and 20% most anthropogenically impaired days at each Class I area in the VISTAS 12 domain (Appendix F).
Figure 2-7. Example Extinction Daily Stacked Bar Chart for PM2.5 Species on the 20% Most Anthropogenically Impaired Days at Shenandoah.
• Mass average stacked bar charts (Figure 2-8) compare 2011 average PM2.5 species mass concentration observations to 2011 average model values. Mass average stacked bar charts have been prepared for the 20% clearest and 20% most anthropogenically impaired days at each Class I area in the VISTAS 12 domain (Appendix F).
• Extinction average stacked bar charts (also Figure 2-8) compare 2011 average PM2.5 species light extinction observations to 2011 average model values. Extinction average stacked bar charts have been prepared for the observed 20% clearest and 20% most anthropogenically impaired days at each Class I area in the VISTAS 12 domain (Appendix F). Model Performance Evaluation – PM and Regional Haze October 29, 2020 14
Figure 2-8. Example Observed (Obs) and Predicted (Mod) Mass Concentrations (Left) and Light Extinctions (Right) at the Dolly Sods Wilderness on the Observed 20% Most Anthropogenically Impaired Days.
2.2 Ambient Measurement Networks
Provided below is an overview of the various ambient air monitoring networks used in this evaluation. Network methods and procedures are subject to change annually due to systematic review and/or updates to the existing monitoring network/program.
2.2.1 Ambient Air Quality Observations
Year 2011 data from available ambient air monitoring networks for PM species are used in the model performance evaluation. Table 2-2 summarizes the routine PM monitoring networks used in this analysis. Alpine focused on the ambient data collected from the IMPROVE network. This network began in 1985 as a cooperative visibility monitoring effort between EPA, federal land management agencies, and state air agencies (IMPROVE, 2011). Data are collected at Class I areas across the United States, mostly at National Parks, National Wilderness Areas, and other protected pristine areas. Currently, there are approximately 181 IMPROVE sites that have complete annual PM2.5 mass and/or PM2.5 species data. There are 110 IMPROVE monitoring sites which represent air quality at the 156 designated Class I areas. The 71 additional IMPROVE sites are "IMPROVE protocol" sites which are generally located in rural areas throughout the U.S. Although these sites use the IMPROVE monitoring samplers and collection routines, they are not located at Class I areas. Model Performance Evaluation – PM and Regional Haze October 29, 2020 15
Table 2-2.
Overview of Utilized Ambient Data Monitoring Networks
Monitoring Network Chemical Species Measured Sampling Period
IMPROVE Speciated PM2.5 and PM10; light extinction data 1 in 3 days; 24-hour average
CASTNET Speciated PM2.5 and O3 Approximately 1-week average
CSN Speciated PM2.5 24-hour average
2.2.2 IMPROVE
The IMPROVE network began in 1985 as a cooperative visibility monitoring effort between EPA, federal land management agencies, and state air agencies (IMPROVE, 2011). Data are collected at Class I areas across the U.S., mostly at national parks, national wilderness areas, and national wildlife refuges. As of 2018, there were approximately 160 IMPROVE sites that have complete annual PM2.5 mass and/or PM2.5 species data. There are 110 IMPROVE monitoring sites which represent air quality at the 156 designated Class I areas. The additional IMPROVE sites are "IMPROVE protocol" sites, which are generally located in rural areas throughout the U.S., although there are also a handful of urban sites. These protocol sites provide additional spatial information across the country, being generally located in areas where there are few Class I areas. The protocol sites use the IMPROVE monitoring samplers and collection routines. In addition to the IMPROVE data available in AQS, the IMPROVE program provides summary datasets that contain the information and pre-calculated data needed for Regional Haze Rule analyses. This includes daily average and annual data for the 20% most impaired and 20% clearest visibility days.
2.2.3 CASTNET
Established in 1987, CASTNET is a dry deposition monitoring network where PM data are collected and reported as weekly average data (U.S. EPA, 2012a). In addition, this network measures and reports hourly ozone concentrations. CASTNET provides atmospheric data on the dry deposition component of total acid deposition, ground-level ozone, and other forms of atmospheric pollution. The data (except for ozone) are collected in filter packs that sample the ambient air continuously during the week. As of 2018, CASTNET comprised 95 monitoring stations across the U.S. The longest data records are primarily at eastern U.S. sites. Model Performance Evaluation – PM and Regional Haze October 29, 2020 16
2.2.4 CSN
CSN, formerly known as the Speciation Trends Network (STN), began operation in 1999 to provide nationally consistent speciated PM2.5 data for the assessment of trends at representative sites in urban areas of the U.S. The CSN was established by regulation and is a companion network to the mass-based Federal Reference Method (FRM) network implemented in support of the PM2.5 NAAQS. As part of a routine monitoring program, the CSN quantifies mass concentrations and PM2.5 constituents, including numerous trace elements, ions (sulfate, nitrate, sodium, potassium, and ammonium), elemental carbon (EC), and organic carbon (OC). As of 2018, there were 52 trends sites in the CSN nationally. CSN trends sites are largely static urban monitoring stations with dedicated sampling protocols for characterizing aerosol mass components in urban areas of the U.S., discerning long-term trends, and providing an accountability mechanism to assess the effectiveness of control programs. In addition, in 2018 there were approximately 100 supplemental speciation sites that are also part of the CSN. The CSN data at trends sites are collected 1 in every 3 days, whereas supplemental sites collect data either 1 in every 3 days or 1 in every 6 days.
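The statistics defined in Section 2.0 are computed from model predictions paired in space and time with observations from these networks. As a minimal illustration only (not the processing code used for this project), the base R sketch below computes the paired statistics for one site, species, and season; the object and argument names are hypothetical.

```r
# Illustrative sketch: paired model performance statistics following the
# definitions in Section 2.0. 'obs' and 'mod' are equal-length numeric vectors
# of observed and predicted 24-hr concentrations (ug/m3); names are hypothetical.
mpe_stats <- function(obs, mod) {
  n   <- length(obs)
  mb  <- mean(mod - obs)                       # Mean Bias (ug/m3)
  me  <- mean(abs(mod - obs))                  # Mean Error (ug/m3)
  nmb <- 100 * sum(mod - obs) / sum(obs)       # Normalized Mean Bias (%)
  nme <- 100 * sum(abs(mod - obs)) / sum(obs)  # Normalized Mean Error (%)
  r   <- cor(mod, obs)                         # Pearson correlation coefficient
  mfb <- 100 * (2 / n) * sum((mod - obs) / (mod + obs))     # Mean Fractional Bias (%)
  mfe <- 100 * (2 / n) * sum(abs(mod - obs) / (mod + obs))  # Mean Fractional Error (%)
  c(N = n, MB = mb, ME = me, NMB = nmb, NME = nme, r = r, MFB = mfb, MFE = mfe)
}

# Example with made-up values:
mpe_stats(obs = c(1.2, 0.8, 2.5, 1.9), mod = c(1.0, 1.1, 2.0, 1.6))
```

Statistics computed in this way can be screened against the NMB and NME goals and criteria in Table 2-1.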
2.3 CAMx Species Mapping
The CAMx model species are not directly comparable with the species measured at the monitoring networks described in Section 2.2. The CAMx species mapping was presented in the modeling protocol and is repeated in Table 2-3.
Table 2-3. Species Mapping from CAMx into Observation Network
Network Observed Species CAMx Species
IMPROVE NO3 PNO3
SO4 PSO4
NH4 PNH4
OM = 1.8*OC SOA1+SOA2+SOA3+SOA4+SOPA+SOPB+POA
EC PEC
SOIL FPRM+FCRS
PM2.5 PSO4+PNO3+PNH4+SOA1+SOA2+SOA3+SOA4+SOPA+SOPB+POA+PEC+FPRM+FCRS+NA+PCL
CSN PM2.5 PSO4+PNO3+PNH4+SOA1+SOA2+SOA3+SOA4+SOPA+SOPB+POA+PEC+FPRM+FCRS+NA+PCL
NO3 PNO3
SO4 PSO4
NH4 PNH4
OM = 1.4*OC SOA1+SOA2+SOA3+SOA4+SOPA+SOPB+POA
EC PEC
Model Performance Evaluation – PM and Regional Haze October 29, 2020 17
2.4 Summary and Comparison to EPA MPE Results
Comparing model performance statistics of EPA's CAMx 6.32 and VISTAS CAMx 6.40 simulations using EPA's 2011el modeling platform showed relatively similar results, with the VISTAS results showing slightly improved performance for all PM2.5 species except sulfate and OC at IMPROVE, CSN, and CASTNET monitors in the southeastern state region. For sulfate and OC, CAMx 6.40 concentrations were lower than CAMx 6.32, creating an under prediction bias across most of the VISTAS12 modeling domain and seasons in the VISTAS simulation compared to EPA's CAMx 6.32 results. For nitrate, ammonium, and EC, the CAMx 6.32 and CAMx 6.40 results differed slightly, with neither version of the model consistently demonstrating performance better than the other. The total PM2.5 performance results were consistent between the two simulations, even though CAMx 6.32 generally showed higher concentrations than CAMx 6.40 at lower concentration levels, with consistent performance at higher concentrations. There appears to be a trend where CAMx 6.40 concentrations are generally slightly higher than CAMx 6.32 during dry periods and CAMx 6.32 concentrations are generally slightly higher during wet periods. This is not surprising given the update to the wet deposition algorithm between CAMx 6.32 and 6.40. The comparison of CAMx 6.32 and 6.40 showed differences in model concentration estimates with little difference noted in performance between the two model configurations for most species. The only noted differences were seen in sulfate performance. This was expected given the changes to the model due to the inclusion of new science in CAMx 6.40. Alpine Geophysics does not see any features in the modeling that would preclude the use of the more up-to-date science in CAMx 6.40 in the VISTAS air quality planning. Model Performance Evaluation – PM and Regional Haze October 29, 2020 18
3.0 PM2.5 SULFATE
Table 3-1 summarizes model performance statistics for PM2.5 sulfate. Boxplot comparisons of model predictions and observations (IMPROVE, CSN, and CASTNET) by month for each climate region are shown in Figures 3-1, 3-2, and 3-3. VISTAS12 modeling domain spatial plots of NMB and NME for each season are shown in Figures 3-4 and 3-5. Sulfate performance across seasons, networks, and regions is generally mixed. A notable under prediction of sulfate is observed across the VISTAS12 domain, consistent with our findings for CAMx v. 6.40 in comparison to the CAMx v. 6.32 simulations from EPA. NMBs range from -37.5% to -3.38% in the VISTAS states across all seasons and networks.
Both the observations and the model consistently predicted the highest average sulfate concentrations in the summer, although the model performance is showing the largest underestimation in the summer. This under prediction is also noticeable during all other seasons, though the magnitude of the under prediction is less. Sulfate is also under predicted outside of the VISTAS states in all networks with the single notable over prediction at non-VISTAS IMPROVE sites in the fall (0.13%). The greatest over prediction of sulfate is seen on the western boundary of the VISTAS12 modeling domain during winter months and in the northeastern region of the domain during spring and summer months. Under predictions are noted along the southern boundary of the domain during summer months. Model Performance Evaluation – PM and Regional Haze October 29, 2020 19 Table 3-1. Model Performance Statistics for PM2.5 Sulfate by Region, Network, and Season. Region Network Season N Avg. Obs. (μg/m3) Avg. Pre. (μg/m3) r NMB (%) NME (%) MB (μg/m3) ME (μg/m3) VISTAS IMPROVE Winter 389 1.65 1.48 0.59 -10.40 34.17 -0.17 0.56 Spring 405 2.24 1.87 0.60 -16.64 34.14 -0.37 0.76 Summer 390 3.28 2.20 0.73 -32.81 38.58 -1.08 1.27 Fall 381 1.61 1.55 0.75 -3.38 33.54 -0.05 0.54 All 1565 2.20 1.78 0.71 -19.13 35.69 -0.42 0.78 CSN Winter 623 1.94 1.60 0.52 -17.40 36.32 -0.34 0.70 Spring 647 2.67 2.20 0.58 -17.60 34.01 -0.47 0.91 Summer 674 3.56 2.52 0.70 -29.17 35.17 -1.04 1.25 Fall 638 1.72 1.63 0.58 -5.39 27.80 -0.09 0.48 All 2582 2.49 2.00 0.70 -19.79 33.82 -0.49 0.84 CASTNET Winter 241 2.16 1.54 0.28 -28.71 39.26 -0.62 0.85 Spring 302 2.84 1.77 0.31 -37.50 42.94 -1.06 1.22 Summer 274 3.75 2.38 0.64 -36.57 43.33 -1.37 1.62 Fall 277 1.70 1.50 0.18 -12.18 50.52 -0.21 0.86 All 1094 2.63 1.80 0.52 -31.43 43.65 -0.83 1.15 Non- VISTAS IMPROVE Winter 1612 1.05 0.86 0.70 -18.17 40.99 -0.19 0.43 Spring 1752 1.32 1.25 0.64 -5.32 41.10 -0.07 0.54 Summer 1703 1.55 1.20 0.78 -22.85 41.62 -0.36 0.65 Fall 1656 0.99 0.99 0.82 0.13 33.44 0.00 0.33 All 6723 1.23 1.08 0.73 -12.46 39.72 -0.15 0.49 CSN Winter 1783 1.88 1.34 0.57 -28.96 42.24 -0.55 0.80 Spring 1888 2.08 1.93 0.71 -7.50 31.91 -0.16 0.66 Summer 1908 2.93 2.32 0.83 -20.86 33.13 -0.61 0.97 Fall 1831 1.66 1.52 0.81 -8.78 29.87 -0.15 0.50 All 7410 2.15 1.79 0.77 -16.96 34.13 -0.36 0.73 CASTNET Winter 427 1.69 0.99 0.54 -41.49 50.85 -0.70 0.86 Spring 551 1.91 1.33 0.40 -30.07 49.04 -0.57 0.94 Summer 521 2.56 1.65 0.51 -35.45 53.51 -0.91 1.37 Fall 530 1.46 1.32 0.38 -9.55 50.33 -0.14 0.74 All 2029 1.91 1.34 0.48 -29.94 51.17 -0.57 0.98 Model Performance Evaluation – PM and Regional Haze October 29, 2020 20 Figure 3-1. Boxplot Comparisons of Model Predictions and IMRPOVE Sulfate Observations for Each Climate Region by Month. Model Performance Evaluation – PM and Regional Haze October 29, 2020 21 Figure 3-2. Boxplot Comparisons of Model Predictions and CSN Sulfate Observations for Each Climate Region by Month. Model Performance Evaluation – PM and Regional Haze October 29, 2020 22 Figure 3-3. Boxplot Comparisons of Model Predictions and CASTNET Sulfate Observations for Each Climate Region by Month. Model Performance Evaluation – PM and Regional Haze October 29, 2020 23 Figure 3-4. Spatial Plots of Sulfate NMB by Season and Network (Circle = IMPROVE, Square = CASTNET, Diamond = CSN). Winter Spring Summer Fall Model Performance Evaluation – PM and Regional Haze October 29, 2020 24 Figure 3-5. Spatial Plots of Sulfate NME by Season and Network (Circle = IMPROVE, Square = CASTNET, Diamond = CSN). 
Winter Spring Summer Fall Model Performance Evaluation – PM and Regional Haze October 29, 2020 25 4.0 PM2.5 NITRATE Table 4-1 summarizes model performance statistics for PM2.5 nitrate. Boxplot comparisons of model predictions and observations (IMPROVE, CSN, and CASTNET) by month for each climate region are shown in Figures 4-1, 4-2, and 4-3. VISTAS12 modeling domain spatial plots of NMB and NME for each season are shown in Figures 4-4 and 4-5. Nitrate performance in the VISTAS12 modeling domain shows strong seasonal variation. The model under predicts at networks in the summer months (-30.96% to -49.69%) and over predicts at networks during the fall (7.60% to 51.78%). Both the model and the observation show the lowest average nitrate concentrations in the summer. Under predictions of nitrate persist across all seasons and networks with low observed nitrate concentrations and significantly over predictions during months when observed nitrate is highest. An exception is noted regarding under prediction in non-VISTAS states in both the CASTNET and CSN observations during the highest observed nitrate concentrations in winter months. Over prediction of nitrate is seen geographically across most of the VISTAS12 modeling domain especially in the northeast during most months and the northwestern quadrant of the domain during the cooler months of winter and fall. Under prediction of nitrate is noted at networks in most of the VISTAS states during the summer months and along the western border of the domain in spring and summer. Model Performance Evaluation – PM and Regional Haze October 29, 2020 26 Table 4-1. Model Performance Statistics for PM2.5 Nitrate by Region, Network, and Season. Region Network Season N Avg. Obs. (μg/m3) Avg. Pre. (μg/m3) r NMB (%) NME (%) MB (μg/m3) ME (μg/m3) VISTAS IMPROVE Winter 389 0.62 0.81 0.55 29.14 75.87 0.18 0.47 Spring 405 0.39 0.46 0.32 20.09 97.74 0.08 0.38 Summer 390 0.18 0.12 0.22 -30.96 78.32 -0.05 0.14 Fall 381 0.24 0.34 0.43 41.04 102.06 0.10 0.25 All 1565 0.36 0.43 0.51 21.25 86.61 0.08 0.31 CSN Winter 623 1.07 1.40 0.52 31.82 70.18 0.34 0.75 Spring 647 0.55 0.68 0.38 23.04 84.80 0.13 0.47 Summer 675 0.28 0.17 0.26 -37.94 62.40 -0.10 0.17 Fall 636 0.39 0.60 0.49 51.78 94.99 0.20 0.37 All 2581 0.56 0.70 0.58 24.18 77.02 0.14 0.43 CASTNET Winter 241 1.26 1.12 0.47 -11.28 60.57 -0.14 0.77 Spring 302 0.61 0.49 0.22 -20.01 77.22 -0.12 0.47 Summer 274 0.28 0.14 0.31 -49.69 78.85 -0.14 0.22 Fall 277 0.52 0.56 0.17 7.60 87.38 0.04 0.45 All 1094 0.65 0.56 0.48 -13.89 72.31 -0.09 0.47 Non- VISTAS IMPROVE Winter 1611 1.05 1.26 0.70 19.69 66.59 0.21 0.70 Spring 1750 0.60 0.75 0.82 25.43 69.75 0.15 0.42 Summer 1703 0.19 0.11 0.52 -39.73 76.22 -0.08 0.15 Fall 1655 0.33 0.50 0.80 52.12 91.85 0.17 0.30 All 6719 0.54 0.65 0.76 20.89 72.17 0.11 0.39 CSN Winter 1784 2.67 2.53 0.70 -5.45 41.71 -0.15 1.11 Spring 1889 1.48 1.62 0.79 9.15 51.33 0.14 0.76 Summer 1899 0.52 0.34 0.52 -34.52 64.58 -0.18 0.34 Fall 1829 0.94 1.14 0.75 20.28 59.15 0.19 0.56 All 7401 1.39 1.39 0.78 0.06 49.46 0.00 0.69 CASTNET Winter 427 1.88 1.77 0.46 -6.09 70.27 -0.11 1.32 Spring 551 0.85 0.99 0.56 17.1 88.84 0.14 0.75 Summer 521 0.33 0.22 0.10 -35.05 99.67 -0.12 0.33 Fall 530 0.73 0.97 0.52 34.12 100.28 0.25 0.73 All 2029 0.90 0.95 0.54 5.56 84.10 0.05 0.76 Model Performance Evaluation – PM and Regional Haze October 29, 2020 27 Figure 4-1. Boxplot Comparisons of Model Predictions and IMPROVE Nitrate Observations for Each Climate Region by Month. 
Model Performance Evaluation – PM and Regional Haze October 29, 2020 28 Figure 4-2. Boxplot Comparisons of Model Predictions and CSN Nitrate Observations for Each Climate Region by Month. Model Performance Evaluation – PM and Regional Haze October 29, 2020 29 Figure 4-3. Boxplot Comparisons of Model Predictions and CASTNET Nitrate Observations for Each Climate Region by Month. Model Performance Evaluation – PM and Regional Haze October 29, 2020 30 Figure 4-4. Spatial Plots of Nitrate NMB by Season and Network (Circle = IMPROVE, Square = CASTNET, Diamond = CSN). Winter Spring Summer Fall Model Performance Evaluation – PM and Regional Haze October 29, 2020 31 Figure 4-5. Spatial Plots of Nitrate NME by Season and Network (Circle = IMPROVE, Square = CASTNET, Diamond = CSN). Summer Fall Winter Spring Model Performance Evaluation – PM and Regional Haze October 29, 2020 32 5.0 PM2.5 AMMONIUM Table 5-1 summarizes model performance statistics for PM2.5 ammonium. Boxplot comparisons of model predictions and observations (CSN and CASTNET) by month for each climate region are shown in Figures 5-1 and 5-2. VISTAS12 modeling domain spatial plots of NMB and NME for each season are shown in Figures 5-3 and 5-4 (note that the IMPROVE network does not measure ammonium). Ammonium is generally under predicted across the VISTAS12 domain in all seasons, with the exception of over prediction in the fall months. In the VISTAS state receptor networks, ammonium is generally under predicted with a significant over prediction observed during the lowest observed concentration fall months in the CSN. While both the model and the observations in the VISTAS states show the lowest average ammonium concentrations in the fall, the model predictions show less seasonal variability than the observations. Over prediction of ammonium is seen across much of the eastern half of the VISTAS12 modeling domain during fall months and along the northern border of the domain during most seasons with noted under prediction seen at peninsular Florida CASTNET sites across most seasons. Model Performance Evaluation – PM and Regional Haze October 29, 2020 33 Table 5-1. Model Performance Statistics for PM2.5 Ammonium by Region, Network, and Season Region Network Season N Avg. Obs. (μg/m3) Avg. Pre. (μg/m3) r NMB (%) NME (%) MB (μg/m3) ME (μg/m3) VISTAS CSN Winter 618 0.82 0.88 0.61 7.65 42.73 0.06 0.35 Spring 644 0.82 0.80 0.61 -2.93 41.52 -0.02 0.34 Summer 673 0.88 0.80 0.69 -8.88 34.35 -0.08 0.30 Fall 624 0.42 0.67 0.68 60.09 70.46 0.25 0.29 All 2559 0.74 0.79 0.63 6.73 43.58 0.05 0.32 CASTNET Winter 241 0.93 0.71 0.57 -23.39 38.57 -0.22 0.36 Spring 302 0.87 0.63 0.42 -28.38 44.38 -0.25 0.39 Summer 274 1.17 0.70 0.61 -40.17 45.97 -0.47 0.54 Fall 277 0.55 0.57 0.32 2.89 59.60 0.02 0.33 All 1094 0.88 0.65 0.48 -26.16 45.97 -0.23 0.40 Non- VISTAS CSN Winter 1781 1.31 1.19 0.69 -9.57 38.97 -0.13 0.51 Spring 1873 1.01 1.10 0.78 8.25 37.59 0.08 0.38 Summer 1884 0.87 0.83 0.79 -5.17 37.97 -0.05 0.33 Fall 1796 0.62 0.82 0.77 32.80 52.20 0.20 0.32 All 7334 0.95 0.98 0.75 3.02 40.46 0.03 0.39 CASTNET Winter 427 1.02 0.82 0.55 -20.05 51.55 -0.20 0.53 Spring 551 0.74 0.71 0.57 -4.44 50.62 -0.03 0.38 Summer 521 0.85 0.59 0.50 -31.14 53.61 -0.27 0.46 Fall 530 0.59 0.69 0.39 16.02 66.97 0.10 0.40 All 2029 0.79 0.70 0.48 -12.06 54.91 -0.10 0.43 Model Performance Evaluation – PM and Regional Haze October 29, 2020 34 Figure 5-1. Boxplot Comparisons of Model Predictions and CSN Ammonium Observations for Each Climate Region by Month. 
Model Performance Evaluation – PM and Regional Haze October 29, 2020 35 Figure 5-2. Boxplot Comparisons of Model Predictions and CASTNET Ammonium Observations for Each Climate Region by Month. Model Performance Evaluation – PM and Regional Haze October 29, 2020 36 Figure 5-3. Spatial Plots of Ammonium NMB by Season and Network (Square = CASTNET, Diamond = CSN). Winter Spring Summer Fall Model Performance Evaluation – PM and Regional Haze October 29, 2020 37 Figure 5-4. Spatial Plots of Ammonium NME by Season and Network (Square = CASTNET, Diamond = CSN). Winter Spring Summer Fall Model Performance Evaluation – PM and Regional Haze October 29, 2020 38 6.0 PM2.5 OC Table 6-1 summarizes model performance statistics for PM2.5 organic carbon (OC). To provide a direct comparison to the observational data, as noted in Table 2-2, CAMx’s OM was divided by 1.8 and 1.4, respectively, to generate OC for IMPROVE and CSN receptors. Boxplot comparisons of model predictions and observations (IMPROVE and CSN) by month for each climate region are shown in Figures 6-1 and 6-2. VISTAS12 modeling domain spatial plots of NMB and NME for each season are shown in Figures 6-3 and 6-4. Both the model and the observations show the highest average OC concentrations in the summer. OC is generally overestimated for the CSN network and underestimated for the IMPROVE network. OC is generally over predicted in the VISTAS12 domain across seasons outside of the summer. The greatest noted NMB includes winter month over prediction (163.33%) in non-VISTAS receptors from the CSN. The most significant over prediction of OC is seen across the northern half of the VISTAS12 modeling domain during winter months with high over predictions also seen in the region during spring and fall seasons. Table 6-1. Model Performance Statistics for PM2.5 OC by Region, Network, and Season. Region Network Season N Avg. Obs. (μg/m3) Avg. Pre. (μg/m3) r NMB (%) NME (%) MB (μg/m3) ME (μg/m3) VISTAS IMPROVE Winter 406 1.32 1.49 0.63 12.62 48.46 0.17 0.64 Spring 433 1.81 1.22 0.35 -32.46 49.52 -0.59 0.90 Summer 425 2.18 1.60 0.31 -26.87 47.47 -0.59 1.04 Fall 411 1.31 1.09 0.38 -16.76 48.67 -0.22 0.64 All 1675 1.66 1.35 0.35 -18.89 48.47 -0.31 0.81 CSN Winter 607 1.94 3.31 0.57 71.02 85.84 1.37 1.66 Spring 612 1.83 2.38 0.60 29.73 51.48 0.55 0.94 Summer 664 2.61 3.78 0.39 44.82 64.15 1.17 1.67 Fall 617 1.68 2.49 0.63 48.12 63.72 0.81 1.07 All 2500 2.03 3.00 0.55 48.22 66.28 0.98 1.34 Non- VISTAS IMPROVE Winter 1666 0.75 1.19 0.51 59.06 87.07 0.44 0.65 Spring 1831 0.84 0.81 0.57 -3.52 56.96 -0.03 0.48 Summer 1764 1.43 1.15 0.49 -19.53 46.28 -0.28 0.66 Fall 1700 0.98 1.06 0.70 8.30 55.31 0.08 0.54 All 6961 1.00 1.05 0.62 4.69 58.08 0.05 0.58 CSN Winter 1706 1.57 4.13 0.52 163.33 169.30 2.56 2.66 Spring 1824 1.27 2.20 0.30 72.88 90.62 0.93 1.15 Summer 1903 2.01 2.35 0.54 16.61 40.83 0.33 0.82 Fall 1763 1.44 2.41 0.64 68.03 76.19 0.98 1.09 All 7196 1.58 2.75 0.40 74.16 89.18 1.17 1.41 Model Performance Evaluation – PM and Regional Haze October 29, 2020 39 Figure 6-1. Boxplot Comparisons of Model Predictions and IMPROVE Organic Carbon (OC) Observations for Each Climate Region by Month. Model Performance Evaluation – PM and Regional Haze October 29, 2020 40 Figure 6-2. Boxplot Comparisons of Model Predictions and CSN Organic Carbon (OC) Observations for Each Climate Region by Month. Model Performance Evaluation – PM and Regional Haze October 29, 2020 41 Figure 6-3. 
Spatial plots of organic carbon (OC) NMB by season and network (Circle = IMPROVE, Diamond = CSN). Winter Spring Summer Fall Model Performance Evaluation – PM and Regional Haze October 29, 2020 42 Figure 6-4. Spatial plots of organic carbon (OC) NME by season and network (Circle = IMPROVE, Diamond = CSN). Winter Spring Summer Fall Model Performance Evaluation – PM and Regional Haze October 29, 2020 43 7.0 PM2.5 EC Table 7-1 summarizes model performance statistics for PM2.5 EC. Boxplot comparisons of model predictions and observations (IMPROVE and CSN) by month for each climate region are shown in Figures 7-1 and 7-2. VISTAS12 modeling domain spatial plots of NMB and NME for each season are shown in Figures 7-3 and 7-4. In the VISTAS states, EC concentrations averaged over the entire year show fairly close agreement with observations with a NMB of 0.20% at the IMPROVE monitors and 14.58% at the CSN monitors. However, on a seasonal basis the model is underestimating EC in the spring and summer and overestimating in the winter at the IMPROVE monitors. At the CSN monitors the model is overestimating except in the summer where the model NMB is a very low 0.26%. Significant over prediction of EC is seen across most of the VISTAS12 modeling domain during winter months with high over predictions also seen in the northern half of the domain during spring and fall seasons. Table 7-1. Model Performance Statistics for PM2.5 EC by Region, Network, and Season. Region Network Season N Avg. Obs. (μg/m3) Avg. Pre. (μg/m3) r NMB (%) NME (%) MB (μg/m3) ME (μg/m3) VISTAS IMPROVE Winter 406 0.30 0.40 0.64 34.89 56.66 0.10 0.17 Spring 433 0.31 0.27 0.38 -10.71 45.46 -0.03 0.14 Summer 423 0.28 0.21 0.46 -24.74 42.01 -0.07 0.12 Fall 412 0.25 0.25 0.60 0.18 38.63 0.00 0.10 All 1674 0.28 0.28 0.45 -0.20 45.98 0.00 0.13 CSN Winter 610 0.67 0.87 0.56 29.28 58.09 0.20 0.39 Spring 613 0.56 0.63 0.49 12.19 48.72 0.07 0.27 Summer 664 0.67 0.67 0.29 -0.26 47.28 0.00 0.32 Fall 619 0.61 0.72 0.55 18.32 49.89 0.11 0.31 All 2506 0.63 0.72 0.49 14.58 51.03 0.09 0.32 Non- VISTAS IMPROVE Winter 1671 0.19 0.31 0.63 62.79 83.18 0.12 0.16 Spring 1829 0.17 0.21 0.65 25.86 59.94 0.04 0.10 Summer 1763 0.21 0.21 0.55 -0.97 44.60 0.00 0.10 Fall 1702 0.20 0.28 0.61 36.63 62.53 0.07 0.13 All 6965 0.19 0.25 0.56 29.70 61.71 0.06 0.12 CSN Winter 1713 0.61 1.10 0.57 80.48 95.49 0.49 0.58 Spring 1834 0.49 0.75 0.48 53.10 72.00 0.26 0.35 Summer 1904 0.70 0.79 0.56 12.60 44.43 0.09 0.31 Fall 1774 0.66 0.94 0.67 42.67 60.96 0.28 0.40 All 7225 0.62 0.89 0.56 44.59 66.31 0.27 0.41 Model Performance Evaluation – PM and Regional Haze October 29, 2020 44 Figure 7-1. Boxplot Comparisons of Model Predictions and IMPROVE Elemental Carbon (EC) Observations for Each Climate Region by Month. Model Performance Evaluation – PM and Regional Haze October 29, 2020 45 Figure 7-2. Boxplot Comparisons of Model Predictions and CSN Elemental Carbon (EC) Observations for Each Climate Region by Month. Model Performance Evaluation – PM and Regional Haze October 29, 2020 46 Figure 7-3. Spatial plots of elemental carbon (EC) NMB by season and network (Circle = IMPROVE, Diamond = CSN). Winter Spring Summer Fall Model Performance Evaluation – PM and Regional Haze October 29, 2020 47 Figure 7-4. Spatial plots of elemental carbon (EC) NME by season and network (Circle = IMPROVE, Diamond = CSN). Winter Spring Summer Fall Model Performance Evaluation – PM and Regional Haze October 29, 2020 48 8.0 TOTAL PM2.5 Table 8-1 summarizes model performance statistics for total PM2.5. 
8.0 Total PM2.5

Table 8-1 summarizes model performance statistics for total PM2.5. Boxplot comparisons of model predictions and observations (IMPROVE and CSN) by month for each climate region are shown in Figures 8-1 and 8-2. VISTAS12 modeling domain spatial plots of NMB and NME for each season are shown in Figures 8-3 and 8-4.

Total PM2.5 is over predicted at both networks during the winter and under predicted at both networks during the summer. Model performance differs between the VISTAS and non-VISTAS regions, especially during the spring and fall: performance at VISTAS state locations is typically slightly better than at non-VISTAS receptors at high observed concentrations and slightly worse at low observed concentrations.

Table 8-1. Model Performance Statistics for PM2.5 by Region, Network, and Season (concentration columns in μg/m3).

Region      Network   Season   N      Avg. Obs.  Avg. Pre.  r      NMB (%)   NME (%)   MB      ME
VISTAS      IMPROVE   Winter   403    5.86       6.96       0.67   18.87     38.66     1.11    2.26
VISTAS      IMPROVE   Spring   413    7.86       6.35       0.53   -19.16    36.82     -1.51   2.89
VISTAS      IMPROVE   Summer   423    10.95      6.68       0.57   -39.02    42.12     -4.27   4.61
VISTAS      IMPROVE   Fall     413    5.79       5.40       0.74   -6.63     31.04     -0.38   1.80
VISTAS      IMPROVE   All      1652   7.64       6.35       0.55   -16.96    38.01     -1.30   2.91
VISTAS      CSN       Winter   627    9.86       11.25      0.64   14.08     35.17     1.39    3.47
VISTAS      CSN       Spring   651    11.00      9.35       0.54   -15.00    33.16     -1.65   3.65
VISTAS      CSN       Summer   677    15.85      11.25      0.52   -29.03    36.52     -4.60   5.79
VISTAS      CSN       Fall     639    8.80       8.84       0.65   0.54      30.89     0.05    2.72
VISTAS      CSN       All      2594   11.45      10.18      0.55   -11.07    34.36     -1.27   3.93
Non-VISTAS  IMPROVE   Winter   1660   4.55       5.97       0.68   31.36     53.57     1.43    2.44
Non-VISTAS  IMPROVE   Spring   1812   5.29       5.11       0.63   -3.30     44.48     -0.17   2.35
Non-VISTAS  IMPROVE   Summer   1762   6.92       4.80       0.66   -30.69    40.01     -2.12   2.77
Non-VISTAS  IMPROVE   Fall     1704   4.54       4.86       0.63   7.08      40.04     0.32    1.82
Non-VISTAS  IMPROVE   All      6938   5.34       5.18       0.61   -3.09     43.93     -0.16   2.35
Non-VISTAS  CSN       Winter   1773   11.26      13.83      0.61   22.84     42.32     2.57    4.76
Non-VISTAS  CSN       Spring   1881   9.44       10.17      0.56   7.70      36.89     0.73    3.48
Non-VISTAS  CSN       Summer   1906   12.75      9.55       0.72   -25.12    32.43     -3.20   4.14
Non-VISTAS  CSN       Fall     1826   8.67       9.82       0.61   13.27     37.14     1.15    3.22
Non-VISTAS  CSN       All      7386   10.54      10.80      0.58   2.47      36.94     0.26    3.89

Figure 8-1. Boxplot Comparisons of Model Predictions and IMPROVE Total PM2.5 Observations for Each Climate Region by Month.

Figure 8-2. Boxplot Comparisons of Model Predictions and CSN Total PM2.5 Observations for Each Climate Region by Month.

Figure 8-3. Spatial Plots of Total PM2.5 NMB by Season (Winter, Spring, Summer, Fall Panels) and Network (Circle = IMPROVE, Diamond = CSN).

Figure 8-4. Spatial Plots of Total PM2.5 NME by Season (Winter, Spring, Summer, Fall Panels) and Network (Circle = IMPROVE, Diamond = CSN).

9.0 Performance on 20% Most-Impaired Days

Spatial plots summarizing IMPROVE observations and model NMB on the 20% most-impaired days are shown in Figures 9-1 through 9-6. In each figure, the top graphic presents the observed concentration and the bottom graphic presents the NMB.
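The most-impaired day lists used for these comparisons are the ones identified by EPA for each IMPROVE site (see Section 10). The Python sketch below outlines, under assumed column names, how paired model-observation records might be restricted to those days and summarized as a per-site NMB for mapping; it is an illustrative outline, not the evaluation's actual post-processing code. The species-by-species results follow.

```python
import pandas as pd

def site_nmb_on_impaired_days(pairs: pd.DataFrame,
                              impaired_days: pd.DataFrame) -> pd.Series:
    """Restrict paired records to each site's 20% most-impaired days and
    return NMB (%) by site, suitable for plotting on a map.

    pairs         : columns [site, date, obs, mod] for one species (ug/m3)
    impaired_days : columns [site, date] listing each site's most-impaired days
    """
    subset = pairs.merge(impaired_days, on=["site", "date"], how="inner")
    grouped = subset.groupby("site")
    return 100.0 * (grouped["mod"].sum() - grouped["obs"].sum()) / grouped["obs"].sum()

# Usage with hypothetical file names:
# pairs = pd.read_csv("sulfate_pairs.csv", parse_dates=["date"])
# impaired = pd.read_csv("epa_most_impaired_days.csv", parse_dates=["date"])
# nmb_by_site = site_nmb_on_impaired_days(pairs, impaired)
```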
For sulfate (Figure 9-1), predictions on the 20% most-impaired days are biased low across all regions, with the largest percentage under predictions occurring in the southwest quarter of the VISTAS12 modeling domain. Isolated over predictions occur at a few Class I areas near the outer domain boundaries and in the northeast.

Nitrate predictions (Figure 9-2) on the 20% most-impaired days in the VISTAS12 modeling domain are mixed, with a high positive bias in the north and a mix of negative and positive biases in the southeast. OC (Figure 9-3) shows a general positive bias across the region on the 20% most-impaired days; in the SESARM states, the OC NMB is approximately the same at monitors with high observed concentrations as at monitors with lower observed concentrations. For EC (Figure 9-4), the model shows a slight under prediction at monitors in the northern portion of the SESARM states and a positive bias at monitors in the southern SESARM region.

On the 20% most-impaired days, total PM2.5 (Figure 9-5) is biased low across most quadrants of the VISTAS12 modeling domain, corresponding closely to the sulfate performance. A slight over prediction of PM2.5 on those days is observed in the Northern Plains and Upper Midwest, primarily along the Canadian border, corresponding closely to the high nitrate concentrations and nitrate performance there. Sodium chloride (NaCl, Figure 9-6) is generally over predicted along boundaries with ocean water bodies (the Atlantic Ocean and Gulf of Mexico) and, as expected, under predicted across the rest of the VISTAS12 modeling domain.

Figure 9-1. Observed Sulfate (Top) and Modeled NMB (Bottom) for Sulfate on the 20% Most-Impaired Days at IMPROVE Monitor Locations.

Figure 9-2. Observed Nitrate (Top) and Modeled NMB (Bottom) for Nitrate on the 20% Most-Impaired Days at IMPROVE Monitor Locations.

Figure 9-3. Observed OC (Top) and Modeled NMB (Bottom) for OC on the 20% Most-Impaired Days at IMPROVE Monitor Locations.

Figure 9-4. Observed EC (Top) and Modeled NMB (Bottom) for EC on the 20% Most-Impaired Days at IMPROVE Monitor Locations.

Figure 9-5. Observed Total PM2.5 (Top) and Modeled NMB (Bottom) for Total PM2.5 on the 20% Most-Impaired Days at IMPROVE Monitor Locations.

Figure 9-6. Observed NaCl (Top) and Modeled NMB (Bottom) for NaCl on the 20% Most-Impaired Days at IMPROVE Monitor Locations.

10.0 PM2.5 Composition and Contributions to Light Extinction

Charts for each Class I area in the VISTAS12 modeling domain can be generated from the Excel file "APP_F_PM_EXTINCTION_MPE.xlsx" provided as Appendix F. These stacked bar charts detail the daily and averaged composition of PM2.5 on the 20% most impaired and 20% clearest days, for both modeled and observed concentrations (μg/m3) and light extinction (bext, Mm-1), at each IMPROVE monitoring site located within the VISTAS12 modeling domain. Total mass plots display the total particle mass built from concentrations of coarse mass (CM), crustal material (soil), ammonium nitrate (NO3), ammonium sulfate (SO4), EC, organic mass carbon (OMC), and sea salt. Daily concentration values are presented for the 20% clearest days at the St. Mark's (SAMA) IMPROVE site in the top panel of Figure 10-1 below. The amount of light extinction due to each of these species by day is displayed in the daily light extinction tab of Appendix F and is presented in the bottom panel of Figure 10-1.
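To make the structure of these total mass plots concrete, the sketch below stacks hypothetical daily species concentrations into a composition bar chart with matplotlib. The species column names and values are illustrative assumptions; the actual charts are generated from the Appendix F workbook rather than from a script like this one.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical daily species concentrations (ug/m3) on a site's clearest days,
# mirroring the species used in the Appendix F total mass tabs.
daily = pd.DataFrame(
    {
        "AmmSO4":  [1.8, 1.5, 2.1],
        "AmmNO3":  [0.4, 0.3, 0.5],
        "OMC":     [1.2, 1.0, 1.4],
        "EC":      [0.2, 0.2, 0.3],
        "Soil":    [0.3, 0.4, 0.3],
        "CM":      [1.0, 0.8, 1.1],
        "SeaSalt": [0.1, 0.1, 0.2],
    },
    index=["2011-01-05", "2011-01-11", "2011-01-17"],
)

# Each bar is one sample day; the stacked segments are the species contributions.
ax = daily.plot(kind="bar", stacked=True, figsize=(8, 4))
ax.set_ylabel("PM2.5 mass (ug/m3)")
ax.set_title("Illustrative total mass composition on clearest days")
plt.tight_layout()
plt.show()
```

An analogous chart of species-specific light extinction is produced by replacing the mass columns with the corresponding extinction contributions.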
An example of the averaged concentration across all days is presented for the St. Mark's (SAMA) 20% clearest days on the left of Figure 10-2 below. The average amount of light extinction due to each species is displayed in the average light extinction tab of Appendix F and is presented on the right of Figure 10-2.

Predicted (modeled) results for all locations are based on all daily CAMx v6.40 results for each Class I area's impairment classification (20% clearest or 20% most anthropogenically impaired), with light extinction calculated using the new IMPROVE equation (a simplified sketch of this calculation follows the figure captions below). The clearest and most impaired days, along with the associated observed concentrations and light extinction data by IMPROVE receptor, were identified and provided by EPA in its preliminary regional haze modeling.5

5 https://www.epa.gov/visibility/regional-haze-guidance-technical-support-document-and-data-file

Figure 10-1. Example Daily Observed (Obs) and Predicted (Mod) Total Mass Concentrations (Top) and Light Extinction (Bottom) at the St. Mark's Wildlife Refuge on the Observed 20% Clearest Days.

Figure 10-2. Example Averaged Observed (Obs) and Predicted (Mod) Total Mass Concentrations (Left) and Light Extinction (Right) at the St. Mark's Wildlife Refuge on the Observed 20% Clearest Days.
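For orientation, the sketch below outlines the revised ("new") IMPROVE light extinction algorithm in simplified form. The coefficients and the small/large mode split follow the published revised IMPROVE equation, but the f(RH) terms are month- and site-specific lookup factors that are replaced here with illustrative constants, the Rayleigh term is site specific, and the NO2 term is omitted; it is an approximation for the reader's reference, not the exact implementation used to produce Appendix F.

```python
def split_modes(total_ug_m3: float) -> tuple[float, float]:
    """Small/large mode split from the revised IMPROVE equation:
    large = total**2 / 20 for total < 20 ug/m3, otherwise all mass is large."""
    large = total_ug_m3 ** 2 / 20.0 if total_ug_m3 < 20.0 else total_ug_m3
    return total_ug_m3 - large, large

def improve_bext(so4, no3, omc, ec, soil, cm, seasalt,
                 f_small=2.5, f_large=3.0, f_ss=3.5, rayleigh=10.0):
    """Approximate light extinction (Mm-1) from species concentrations (ug/m3).
    f_small, f_large, and f_ss stand in for the monthly, site-specific f(RH)
    factors; rayleigh stands in for the site-specific Rayleigh scattering."""
    so4_small, so4_large = split_modes(so4)
    no3_small, no3_large = split_modes(no3)
    omc_small, omc_large = split_modes(omc)
    return (2.2 * f_small * so4_small + 4.8 * f_large * so4_large
            + 2.4 * f_small * no3_small + 5.1 * f_large * no3_large
            + 2.8 * omc_small + 6.1 * omc_large
            + 10.0 * ec
            + 1.0 * soil
            + 1.7 * f_ss * seasalt
            + 0.6 * cm
            + rayleigh)

# Example: a relatively clean day at a coastal site (illustrative values, ug/m3).
print(round(improve_bext(so4=1.8, no3=0.4, omc=1.2, ec=0.2,
                         soil=0.3, cm=1.0, seasalt=0.1), 1))
```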
Appendix A. VISTAS12 Modeling Domain Model Performance Metrics by Network, Station, Pollutant, and Season

Appendix A-1. VISTAS12 Modeling Domain CASTNET Model Performance Metrics by Station, Pollutant, and Season (see MPE by Station and Season-1.pdf)

Appendix A-2. VISTAS12 Modeling Domain CSN Model Performance Metrics by Station, Pollutant, and Season (see MPE by Station and Season-2.pdf)

Appendix A-3. VISTAS12 Modeling Domain IMPROVE Model Performance Metrics by Station, Pollutant, and Season (see MPE by Station and Season-3.pdf)

Appendix B. VISTAS12 Modeling Domain Scatter Plots of PM2.5 Species by Network, Pollutant, and Month

Figure B-1. Scatter Plots of Sulfate by Network and Month for VISTAS and Non-VISTAS Sites.
Figure B-2. Scatter Plots of Nitrate by Network and Month for VISTAS and Non-VISTAS Sites.
Figure B-3. Scatter Plots of OC by Network and Month for VISTAS and Non-VISTAS Sites.
Figure B-4. Scatter Plots of EC by Network and Month for VISTAS and Non-VISTAS Sites.
Figure B-5. Scatter Plots of Total PM2.5 by Network and Month for VISTAS and Non-VISTAS Sites.

Appendix C. VISTAS12 Modeling Domain Scatter, Soccer, and Bugle Plots by Site for the 20% Most Impaired Days and 20% Clearest Days (see "APP_C_maps_pred_obs_mpe_results_station_all_dates_improve.xlsx")

Appendix D. VISTAS12 Modeling Domain Soccer Plots of PM2.5 Species by Network, Pollutant, and Month

Figure D-1. Soccer Plot of Sulfate by Network and Month for VISTAS and Non-VISTAS Sites.
Figure D-2. Soccer Plot of Nitrate by Network and Month for VISTAS and Non-VISTAS Sites.
Figure D-3. Soccer Plot of OC by Network and Month for VISTAS and Non-VISTAS Sites.
Figure D-4. Soccer Plot of EC by Network and Month for VISTAS and Non-VISTAS Sites.
Figure D-5. Soccer Plot of Total PM2.5 by Network and Month for VISTAS and Non-VISTAS Sites.

Appendix E. VISTAS12 Modeling Domain Bugle Plots of PM2.5 Species by Network, Pollutant, and Month

Figure E-1. Bugle Plot of Monthly Sulfate at VISTAS State CASTNET Sites.
Figure E-2. Bugle Plot of Monthly Sulfate at Non-VISTAS State CASTNET Sites.
Figure E-3. Bugle Plot of Monthly Sulfate at VISTAS State CSN Sites.
Figure E-4. Bugle Plot of Monthly Sulfate at Non-VISTAS State CSN Sites.
Figure E-5. Bugle Plot of Monthly Sulfate at VISTAS State IMPROVE Sites.
Figure E-6. Bugle Plot of Monthly Sulfate at Non-VISTAS State IMPROVE Sites.
Figure E-7. Bugle Plot of Monthly Nitrate at VISTAS State CASTNET Sites.
Figure E-8. Bugle Plot of Monthly Nitrate at Non-VISTAS State CASTNET Sites.
Figure E-9. Bugle Plot of Monthly Nitrate at VISTAS State CSN Sites.
Figure E-10. Bugle Plot of Monthly Nitrate at Non-VISTAS State CSN Sites.
Figure E-11. Bugle Plot of Monthly Nitrate at VISTAS State IMPROVE Sites.
Figure E-12. Bugle Plot of Monthly Nitrate at Non-VISTAS State IMPROVE Sites.
Figure E-13. Bugle Plot of Monthly OC at VISTAS State CSN Sites.
Figure E-14. Bugle Plot of Monthly OC at Non-VISTAS State CSN Sites.
Figure E-15. Bugle Plot of Monthly OC at VISTAS State IMPROVE Sites.
Figure E-16. Bugle Plot of Monthly OC at Non-VISTAS State IMPROVE Sites.
Figure E-17. Bugle Plot of Monthly EC at VISTAS State CSN Sites.
Figure E-18. Bugle Plot of Monthly EC at Non-VISTAS State CSN Sites.
Figure E-19. Bugle Plot of Monthly EC at VISTAS State IMPROVE Sites.
Figure E-20. Bugle Plot of Monthly EC at Non-VISTAS State IMPROVE Sites.
Figure E-21. Bugle Plot of Monthly Total PM2.5 at VISTAS State CSN Sites.
Figure E-22. Bugle Plot of Monthly Total PM2.5 at Non-VISTAS State CSN Sites.
Figure E-23. Bugle Plot of Monthly Total PM2.5 at VISTAS State IMPROVE Sites.
Figure E-24. Bugle Plot of Monthly Total PM2.5 at Non-VISTAS State IMPROVE Sites.

Appendix F. VISTAS12 Modeling Domain Observed and Modeled Concentration and Light Extinction Comparisons (see "APP_F_PM_EXTINCTION_MPE.xlsx")