Impact of Environmental Justice on Transportation: Applying Environmental Justice Maturation Model to Benchmark Progress

Adjo A. Amekudzi, Mshadoni K. Smith, Stefanie R. Brodie, Jamie M. Fischer, and Catherine L. Ross

Transportation Research Record: Journal of the Transportation Research Board, No. 2320, Transportation Research Board of the National Academies, Washington, D.C., 2012, pp. 1–9. DOI: 10.3141/2320-01

A. A. Amekudzi, S. R. Brodie, and J. M. Fischer, School of Civil and Environmental Engineering, Georgia Institute of Technology, 790 Atlantic Drive, Atlanta, GA 30332. M. K. Smith, Office of Safety, Federal Highway Administration, 1200 New Jersey Avenue, SE, Washington, DC 20590. C. L. Ross, School of City and Regional Planning, Georgia Institute of Technology, 245 Fourth Street, NW, Atlanta, GA 30332. Corresponding author: A. A. Amekudzi, adjo.amekudzi@ce.gatech.edu.

Presidential Executive Order 12898 on Environmental Justice (EJ) was signed in 1994, and the U.S. Department of Transportation (DOT) issued regulatory guidelines for addressing EJ in transportation in 1997. Transportation agencies have since adopted a range of policies, programs, and activities to identify and address disproportionately high and adverse human health and environmental effects of their policies, programs, and activities on minority and low-income populations. On the basis of the relevant literature and structured interviews, this paper assesses how state DOTs are addressing EJ issues in their decision-making processes and identifies common and effective practices. The results show that several state DOTs have implemented public involvement programs and other procedures to assess the burdens of transportation investment. However, fewer agencies assess the equity of benefits, fewer assess outcomes of EJ actions, and fewer still link EJ analysis outcomes with future funding and policy decisions. On the basis of existing practices and regulatory guidelines, the researchers formulated a performance-based, maturity-scale model that agencies can use to benchmark the effectiveness of their external and internal EJ activities in achieving EJ outcomes in transportation. The model was applied anonymously to selected agencies to demonstrate different maturity levels in addressing EJ.

Since the Presidential Executive Order (EO) on environmental justice (EJ), EO 12898, was signed in 1994, transportation agencies have developed a range of policies and programs that they use to monitor, mitigate, and prevent disproportionately adverse impacts of transportation on minority and low-income communities. Because of the absence of explicit and quantitative guidance, EJ programs are evolving at different rates and with different emphases. This study reviewed the impact of EJ on transportation planning in state departments of transportation (DOTs) and, by extension, the impact on transportation. Relevant literature was reviewed, and the state of the practice was assessed through structured interviews of selected DOTs. On the basis of the findings, the researchers formulated a performance-based framework, the EJ maturity-scale model, that describes and evaluates the maturation of EJ policies, programs, and activities. The model was applied to evaluate different programs and demonstrate different levels of effectiveness in achieving equity outcomes in transportation. The performance-based framework can be used as a benchmarking resource by agencies interested in assessing and improving the level of effectiveness of their EJ policies and programs.

Background: Executive Order, Laws, and Policies

Environmental justice became a national issue in the early 1980s when a North Carolina community protest led to a federal investigation on the location of toxic waste landfills in the South. The resulting study by the U.S. Government Accountability Office revealed that a disproportionately high number of such facilities were sited in low-income and minority neighborhoods throughout the region (1). EJ regulations were formally mandated by President Clinton in 1994 with the signing of EO 12898: Federal Actions to Address Environmental Justice in Minority Populations and Low-Income Populations. The EO requires all federally funded agencies to identify and address disproportionately high and adverse human health and environmental effects of their programs, policies, and activities on minority and low-income populations.

The EO effectively brought together two previous regulations: Title VI of the 1964 Civil Rights Act, which focuses on nondiscrimination, and the 1969 National Environmental Policy Act (NEPA), which focuses on protecting the natural environment. These two acts established the basis and provided the required authority for the concept of EJ. The transportation community, however, did not outline specific goals and regulatory guidelines until the subsequent U.S. DOT Order 5610.2 in 1997 (2), after articulating its proposed EJ strategy (60 FR 33896) in 1995. The order established the process for the DOT and its operating administrations to integrate the goals of the EO and was developed completely within the framework of existing requirements, primarily NEPA and Title VI of the Civil Rights Act, the Uniform Relocation Assistance and Real Property Acquisition Policies Act of 1970 as amended, the Intermodal Surface Transportation Efficiency Act of 1991, and other applicable DOT guidance concerning planning; social, economic, or environmental matters; public health or welfare; and public involvement (3). FHWA issued DOT Order 6640.23 in 1998, and FHWA and FTA issued a memorandum in 1999, each providing more specific details for regulating and monitoring transportation policies and programs for achieving EJ outcomes.

Fundamental Principles of Environmental Justice in Transportation

Specifically, FHWA and FTA define EJ as having three fundamental principles related to burdens, process, and benefits (4):

1. To avoid, minimize, or mitigate disproportionately high and adverse human health and environmental effects, including social and economic effects, on minority populations and low-income populations (burdens);
2. To ensure the full and fair participation by all potentially affected communities in the transportation decision-making process (process); and
3. To prevent the denial of, reduction in, or significant delay in the receipt of benefits by minority and low-income populations (benefits).

These principles are applicable for all phases of project development for any agency receiving federal funds, whether the improvement is federally funded or not. The target groups to be considered in EJ are blacks, Hispanics, Asians, American Indians, Alaskan Natives, low-income people, and, more recently, Native Hawaiians or Other Pacific Islanders.
The elderly, disabled, and child population groups were listed as target groups in EO 13330: Human Service Transportation Coordination (2004); although they were not explicitly identified in the original EO, they are considered in practice (4).

Evolution of Environmental Justice Through Peer-to-Peer Benchmarking

EJ regulations do not include explicit quantitative compliance measures to guide federally assisted agencies in the application of the EJ principles; they require interpretation. For example, how many advocacy groups must be included for a process to "ensure . . . full and fair participation"? The guidance is silent on this matter and others, leaving agencies to interpret the order in their own way. At the time of implementation in the mid-1990s, this lack of guidance caused a great deal of confusion and frustration for public agencies. However, according to some of the practitioners interviewed in this study, the lack of guidance has also provided unique opportunities for agencies to develop EJ programs in a manner that best fits agency and customer needs, and it has allowed agencies to be flexible and creative. Through peer-to-peer benchmarking (i.e., communication and comparison) among states, agencies can organically develop best-practice models that are context sensitive yet benefit from knowledge of the policies, methods, and tools used by peer agencies; in this sense, EJ is an evolving practice. This approach differs from that of many federally regulated programs, which typically identify compliance measures and the means by which agencies must confirm compliance. By 2010, several EJ programs had been in place at state transportation agencies for at least a decade. These programs reflect the opportunities and challenges created by less rigid regulation.

Compliance

Lack of explicit guidance does not remove the oversight requirement for the regulating agency. FHWA and FTA monitor state DOTs' compliance with the EJ regulatory guidelines. Typically, local agencies align their EJ programs with the state DOT and are, in this way, indirectly connected to FHWA and FTA. This connection, however, may vary depending on the location and population size of a local community. For example, a rural municipality with a population of less than 10,000 may adopt its state DOT's EJ policies. In metropolitan areas, however, the metropolitan planning organizations (MPOs) may lead most of the planning activities and thus become the primary points of contact for the federal agencies. As a result, MPOs work closely with FHWA and FTA on EJ compliance.

Ultimately, all agencies must comply with the federal regulatory guidelines on EJ and the oversight requirements of Title VI and NEPA. Each agency must provide a Title VI compliance report annually. The report provides evidence of the activities that the agency has undertaken to meet the requirements of Title VI and EJ. Title VI compliance reporting can often be combined with, or at least aligned with, NEPA compliance. The NEPA process requires documentation of all plan development processes undertaken by an agency that receives federal funds, including documentation of potential impacts on both natural and human resources, along with measures for mitigating such impacts. Through the NEPA process, state and federal partners can review the impacts and mitigation measures for any federal process and produce one of three types of documents: a categorical exclusion, an environmental assessment, or an environmental impact statement.
EJ efforts are reviewed for compliance as part of the review of the Title VI and NEPA documents. In addition to document reviews, the federal government can assess the quality of an EJ program at certification reviews for MPOs and when auditing self-certification documentation for state and local agencies. FHWA and FTA funding is the basis of such federal involvement with MPOs.

EJ Maturation Model

The literature on transportation planning and EJ was reviewed to assess the status of state DOT policies and programs for achieving EJ, including common and effective practices and lessons learned since 1994. The literature discusses a range of policies, public involvement programs, technical analysis methods, before-and-after studies, and ways to link EJ assessments and outcomes with decision making. The literature review identified several agencies that have developed guidance to integrate EJ concerns into their operating procedures (1, 5–7) and found that agencies are continuing to refine their approaches to EJ. The study results allowed for the development of a three-phase maturity-scale, or maturation, model that captures the process of continuous improvement through which EJ programs are being developed by state DOTs.

Considering Burdens and Benefits

The 1994 EO required federally sponsored agencies to develop policies, programs, and activities to ensure full and fair participation of all potentially affected people in decision making; to avoid or mitigate disproportionately high and adverse impacts to minority and low-income populations; and to prevent the denial, reduction, or delay of benefits to minority and low-income populations. A 2008 national survey of MPOs and state DOTs found that both MPOs and DOTs have consistent knowledge of EJ concepts across agencies and that several respondents demonstrated a strong commitment to EJ beyond legislative requirements (1). The response rate for state DOTs was 38% (19 DOTs). DOTs and MPOs expressed similar perspectives about EJ. Agencies reported on EJ compliance strategies including surveying public opinion, assessments based on geographic information systems (GIS), and the use of performance measures. Public involvement was found to be a particularly important, highly visible, and quantifiable element of the EJ process. For performance measures, the survey respondents stressed the need to incorporate decision criteria to measure design. Other results indicated that some disproportionately high and adverse impacts may be mitigated and others avoided. Also, if projects advance largely because of political influence, despite EJ concerns, the implementing agency risks litigation. The 2008 national survey also identified a number of EJ concerns that affect policy decisions: (a) public participation, (b) access to transportation, (c) location of public facilities, (d) access to health care, and (e) transit access. Transit access can be especially important to low-income and elderly populations and minority communities that may not have other transportation options. Owens et al. recommend that planners include affected populations early in a transit planning process and build rapport with such communities (1); without an inclusive process, it is even more difficult to achieve truly equitable outcomes.
Amekudzi and Dixon have similarly reported on the importance of an equitable process for achieving equitable outcomes; they emphasize the linkage between equity in process and equity in outcomes (8).

Macrolevel and Microlevel Analyses

The Arizona DOT proactively conducted an analysis of its EJ program in 2002 (5). The study found that although EJ processes always involve identifying disproportionately high and adverse impacts to minority or low-income populations, impacts have been defined in several ways. They have been associated with (a) the siting of undesirable or environmentally hazardous facilities in areas that are disproportionately populated by minority or low-income residents; (b) public participation in decision making; (c) public transportation access; and (d) funding decisions, in the sense that prioritization of certain projects may have implications for the communities that receive transportation benefits (5). To address all of these impacts, the Arizona DOT study distinguished between macrolevel (policy) and microlevel (project) analyses, as follows (5).

• Macrolevel:
  – Coordinate efforts with other transportation agencies;
  – Create detailed, formalized policies, procedures, and guidance;
  – Communicate policies and procedures with staff and departments; and
  – Consider the effectiveness of the policies and procedures yearly.
• Microlevel:
  – Define project study area,
  – Develop community profile,
  – Analyze impacts,
  – Identify solutions, and
  – Document findings.

Macrolevel analysis refers to the formal incorporation of EJ considerations into agency policies, programs, and procedures. The current study found that only a few DOTs have formalized their EJ policies, programs, and procedures; without this formal incorporation, it is difficult to track EJ compliance. Microlevel analysis determines whether a project will have disproportionately adverse impacts on minority and low-income populations (5). To achieve the full benefit of macrolevel and microlevel analyses, Arizona DOT researchers recommended linking funding decisions with environmental planning and project management. Other organizational recommendations include the formation of community planning groups, transit planning partnerships, and an ongoing EJ task force (5).

Transportation policy decisions ultimately affect the community. For example, some of the policies and procedures defined at the macrolevel may address ways to involve the public or ways to conduct microlevel analysis. Without such guidance in place, an agency may comply with the regulations of the EO and state mandates with respect to the actions taken, but still fail to achieve EJ in the eyes of the public and at the level at which the public perceives disproportionately high and adverse impacts occurring. The experience of the Arizona DOT is a prime example: its study results showed that the agency was on par with peer agencies in terms of EJ actions, yet community groups continued to voice concern over the quantity of transportation options available to low-income and minority communities as well as potential negative impacts to those communities (9). Performance-based, policy-level EJ programs can help an agency monitor and evaluate EJ outcomes and incorporate these results into future transportation decisions.

Performance Measures for EJ

Performance measurement allows agencies to assess and track the effectiveness of their EJ programs.
Performance management is the use of performance measurement information to inform decision making so that performance improves over time in line with agency goals and objectives. The performance management literature recommends measuring inputs (actions and methods), outputs (products and services delivered), and outcomes (consequences of the program outputs for customers) for a comprehensive view of performance (9). Each of these categories of measures can often be applied at both the macrolevel and the microlevel of analysis. Table 1 summarizes performance measures that may be appropriate for use in an EJ assessment, depending on the goals of the agency.

TABLE 1 Performance Measures for Environmental Justice (10, 11)
• Safety and security: pedestrian and bicycle injuries and fatalities; vehicle crashes.
• Accessibility: proximity to transit by type (bus or rail); accessibility to regional amenities (health care, education, etc.).
• Mobility and efficiency: level of service (headways, days and hours of service); number of transfers required for trips between select origin–destination (O-D) pairs; percentage of transit travel time accounted for by transfers; travel times for selected O-D pairs by mode; percentage of congested travel times between select O-D pairs.
• Environmental protection, energy conservation, and quality of life: number of households living within X feet of a busy highway; air pollution concentrations by type of pollutant; incidence rates of respiratory disorders; number of households exposed to noise exceeding X decibels; number of households living within X feet of a bus terminal; percentage of buses servicing the area that use alternative fuels; property takings, household dislocations, and access restrictions.
• System condition: condition of roads and streets; condition of sidewalks; average age and condition of transit vehicles.
• Funding equity: transportation capital expenditures per capita; transportation operating expenditures per capita; identity of users benefiting from a new project or program.
• Public involvement: number of public outreach events; number of participants attending public outreach events; customer satisfaction ratings.
• Economic vitality and competitiveness: number of and accessibility to jobs by type; employer accessibility to workers by skill level; property values by location.

As shown in Table 1, outcome measures appear to be the most common for EJ analysis. Some agencies have begun to measure performance in the areas shown in Table 1. For example, the Colorado DOT has developed a list of qualitative performance measures addressing issues such as accessibility to jobs, travel times to selected activity centers, provision and quality of transit service, and distribution of transportation funding among population groups. The Colorado DOT generally compares these measures across population groups (i.e., minority and nonminority) and assesses them before and after the implementation of a transportation project. The measures were initially qualitative because no data were available for evaluating them; however, the agency's intent was to develop appropriate measures, collect the data, develop the necessary analysis tools, establish a baseline, and evaluate impacts at the statewide and regional levels (6). Public involvement and customer satisfaction measures are in use at several DOTs (11), but they are not necessarily being used for EJ analysis or decision making.

Technical Analysis Methods and Challenges

The literature discusses a range of technical analysis methods used to support EJ processes. Most EJ analyses make use of national census data along with GIS to determine the distribution of burdens and benefits. Depending on the needs of a particular project, analysis may focus on environmental impacts such as air quality and noise (12); social impacts such as accessibility, travel opportunity, and safety (13); or a variety of other effects such as those identified in Table 1. NCHRP Report 532 provides a prescriptive overview of analysis methods for several types of impacts (14). EJ analysis often faces definitional challenges, including (a) how to define and apply concepts of equity, such as disproportionality (15, 16); (b) how to identify target populations, given ambiguous census categories (16); (c) how to define a study area with appropriate boundaries, given that an affected region will rarely coincide with the boundaries of census units (13, 17); and (d) how to account for the modifiable areal unit problem, in which different geographic units of analysis can lead to different results (8). Because every method has both advantages and disadvantages, Hartell recommends that simple statistical tests be applied to review the characteristics and distributions of the data before choosing the most reasonable methods of analysis (16). Klein suggests the use of spatial statistics to enhance GIS analysis and overcome some of the definitional challenges associated with drawing boundaries and tracking cumulative impacts over time (18).
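As a concrete illustration of the census-based screening step described above, the following minimal sketch compares a target group's population share inside a hypothetical study area with its regional share and flags the project for more detailed EJ analysis when the ratio exceeds a chosen threshold. All data, field names, and the 1.2 threshold are illustrative assumptions for this discussion, not values drawn from any agency's guidance, and the GIS overlay that determines which tracts fall in the study area is assumed to have been done beforehand.

```python
# Hypothetical disproportionality screen: compare the share of a target
# population inside a project study area with its regional share.
# Data, field names, and the 1.2 threshold are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Tract:
    tract_id: str
    total_pop: int
    minority_pop: int
    low_income_pop: int
    in_study_area: bool  # e.g., from a GIS buffer or overlay done beforehand


def share(tracts, field, area_only=False):
    """Population share of a target group, regionwide or in the study area only."""
    selected = [t for t in tracts if t.in_study_area] if area_only else tracts
    total = sum(t.total_pop for t in selected)
    target = sum(getattr(t, field) for t in selected)
    return target / total if total else 0.0


def disproportionality_ratio(tracts, field):
    """Ratio of the study-area share to the regional share for one target group."""
    regional = share(tracts, field)
    affected = share(tracts, field, area_only=True)
    return affected / regional if regional else float("inf")


if __name__ == "__main__":
    tracts = [
        Tract("A", 4000, 1800, 900, True),
        Tract("B", 5000, 1000, 600, True),
        Tract("C", 6000, 900, 500, False),
        Tract("D", 7000, 800, 400, False),
    ]
    THRESHOLD = 1.2  # assumed screening threshold; agencies define their own
    for group in ("minority_pop", "low_income_pop"):
        ratio = disproportionality_ratio(tracts, group)
        flag = "flag for detailed EJ analysis" if ratio > THRESHOLD else "no flag"
        print(f"{group}: ratio = {ratio:.2f} -> {flag}")
```

Because the ratio depends on how the study area and analysis units are drawn, such a screen is subject to the boundary and modifiable areal unit issues noted above; recomputing it under alternative geographies is one way to test the robustness of a finding.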
Public Involvement

Public involvement is one of the more explicit elements of the EJ regulatory requirements. The EO identifies specific categories of stakeholders to be included, in addition to those identified by NEPA. Many agencies have merged the EJ and NEPA public involvement practices by paying special attention to including low-income and minority populations within the NEPA process (1). Other agencies have unique public involvement practices designed to meet the diverse needs of their particular regions and projects (4). To highlight a few agencies that go beyond the traditional or status quo approaches to public involvement, FHWA compiled a list of 10 project and program development efforts across the country (4).

The South Carolina case provides an example of effective practice through a nontraditional community outreach approach. Recognizing that its first attempts at public involvement did not produce much minority input, the South Carolina DOT coordinated with local preachers to announce public meetings during church services in a predominantly minority neighborhood. Numerous meetings were held in a variety of settings to make attendance convenient, and the South Carolina DOT committed to building the trust of the community by maintaining communication as decisions were being made (4).

Another example of effective practice is the Wisconsin DOT's proactive needs assessment for two heavily traveled corridors. Given the potential for controversy, the Wisconsin DOT began public involvement efforts nearly 3 years before it formally began the
NEPA-required assessments in order to get input and support from the affected community, one made up of predominantly low-income, minority, and transient residents. In contrast to the South Carolina example, well-established leadership was difficult to find in this community. To reach community members, the Wisconsin DOT creatively partnered with a local middle school, incorporating transportation and land use planning into the curriculum and having students prepare a portion of the needs assessment. A student team presented its findings in a public meeting to an audience of parents, Wisconsin DOT staff, city and county officials, and other interested individuals. The meeting was followed by a community design charrette, from which suggestions were incorporated into the final project design. Going beyond the regulations and following the spirit and intent of the EJ EO, the Wisconsin DOT addressed both the system and social needs of its community by allowing residents to become more involved in community-based transportation planning (4).

A third example of effective practice in public involvement is the Bay Area Rapid Transit (BART) Fruitvale Transit Village, an award-winning transit-oriented development project that originated with community opposition to a proposed parking deck. The success of this project's public involvement hinged on the willingness of BART and the City of Oakland, California, to partner with a local community-based organization, the Unity Council, which supported the interests of the majority-Latino Fruitvale district. The partnership took the form of, first, agency grants to the Unity Council to initiate a community planning process and, second, a representative policy committee formed by means of a memorandum of understanding among all three organizations. By integrating the community's needs and desires into the proposed project, the agency was able to develop a project that garnered enthusiastic community support, improved pedestrian traffic through the adjacent business district, promoted new investment activity around the transit station, and minimized negative environmental impacts (4). These examples indicate that successful inclusive practices within various EJ communities may be nontraditional and context sensitive.

Education and Guidance Materials

Formal integration of EJ programs and procedures into agency programs, policies, and practices helps in managing and tracking the effectiveness of such activities. The Ohio DOT began to formalize the way in which it addresses EJ on the basis of the results of a proactive study conducted by an EJ task force in 2002. The Ohio DOT study was conducted to determine the best approaches for Ohio to use in addressing EJ requirements in transportation, agree on minimal standards for EJ, and create guidance and education materials on EJ from the Ohio perspective. The guidance provides the current regulations, demographic information, quantitative and qualitative test questions, public involvement recommendations, and integration techniques. The guidance also draws on the Mid-Ohio Regional Planning Commission (the Columbus MPO) and the results of its EJ task force (7). This example illustrates the value of a close working relationship between the state DOT and the MPO structured around EJ concerns.
The Colorado DOT has also formalized its EJ procedures, beginning with a list of qualitative performance measures for public involvement (19). This was followed by the publication of EJ and Title VI guidelines for NEPA projects in 2004. These guidelines prescribe procedures for the planning phases of projects and include detailed answers to frequently asked questions on EJ processes (20). These resources serve those involved in the transportation decision-making process by providing information on the regulatory history and background of environmental justice issues, including public involvement and planning techniques.

EJ Maturation Framework

The literature review identified common elements of EJ programs and initiatives, including public involvement programs, project analyses to determine burdens and identify disproportionately high impacts, and documentation. Less common elements include formalized EJ policies; before-and-after studies to determine whether EJ outcomes are being met, including timely benefits to affected groups (e.g., using performance measures to evaluate EJ outcomes, surveying affected communities to assess investment outcomes); and linking EJ analysis results with decision making. What emerges from the review is in essence a framework or model that shows agencies at different levels of maturity in relation to achieving EJ outcomes in transportation (Figure 1). The literature indicates that a good number of agencies have some level of public involvement and technical analysis to evaluate EJ at the project level; fewer, however, have a performance-based process for subsequently evaluating EJ outcomes and incorporating the evaluation results into future policy and funding decisions. In general, then, EJ programs may be activity focused or performance focused. Performance-based programs exhibit a higher level of maturity, in the sense that past outcomes of an EJ program are integrated with future transportation policies and funding decisions. In performance-based programs, EJ is incorporated into all relevant DOT programs, positioning the agency to be more effective in achieving intended outcomes.

Figure 1 summarizes the EJ maturation framework. An agency may be categorized in Phase I, II, or III of maturity. The phases are incremental: Phase III depends on Phase II being in place, and Phase II depends on Phase I being in place. In the first phase, an agency develops formal policies for EJ, identifies potential target groups, and develops public involvement processes. An agency in Phase II builds on Phase I activities by implementing long-term monitoring and evaluation systems to ensure that project burdens are not disproportionately distributed and that benefits are not denied, reduced, or delayed for any populations. Agencies in Phase II begin to conduct before-and-after studies (through public or technical evaluation) to assess whether intended EJ outcomes are being met. Phase III agencies take the results from Phase II activities and link them to policy and funding decisions, to ensure that EJ outcomes are met in ongoing project development and to incorporate evidence from past projects into future funding decisions. Phase III also provides a basis for evaluating new policy development with respect to equity impacts. A performance-based program will include all three phases.
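The incremental structure of the framework (Phase III presupposes Phase II, which presupposes Phase I) can be sketched as a simple classification routine. The practice flags, the function, and the deliberately simplified Phase II condition below are hypothetical illustrations of the logic described in this section and summarized in Table 2, not an agency tool.

```python
# Hypothetical encoding of the EJ maturation framework's incremental phases.
# Practice flags are illustrative; Table 2 lists the practices each level implies.

from dataclasses import dataclass


@dataclass
class AgencyPractices:
    # Phase I (process and burdens)
    formal_ej_policy: bool
    identifies_target_groups: bool
    public_involvement_process: bool
    # Phase II (evaluation of benefits and burdens)
    before_after_studies: bool
    performance_measures: bool
    # Phase III (decision making)
    links_outcomes_to_funding: bool


def maturity_phase(p: AgencyPractices) -> str:
    """Phases are incremental: a later phase counts only if earlier ones are in place."""
    phase1 = (p.formal_ej_policy
              and p.identifies_target_groups
              and p.public_involvement_process)
    phase2 = phase1 and (p.before_after_studies or p.performance_measures)
    phase3 = phase2 and p.links_outcomes_to_funding
    if phase3:
        return "Phase III (performance based, mature stage)"
    if phase2:
        return "Phase II (performance based, early stage)"
    if phase1:
        return "Phase I (activity based)"
    return "Pre-Phase I (EJ activities not yet formalized)"


if __name__ == "__main__":
    example = AgencyPractices(True, True, True, True, False, False)
    print(maturity_phase(example))  # -> Phase II (performance based, early stage)
```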
Programs with Phase I initiatives or activities can only be categorized in the early stages of maturity, having EJ activities but being unable to demonstrate conclusively whether intended outcomes are being achieved. Programs with Phases I and II only will be able to demonstrate the extent to which outcomes are being met but cannot articulate explicitly how decisions are being made to address any shortcomings detected in Phase II. The phase being implemented by an agency is indicative of the relative maturity of its EJ program. However, maturity may also vary within a phase. Specifically, the maturation framework further categorizes three levels of EJ programs within the first phase, as shown in Table 2.

FIGURE 1 EJ maturity-scale model showing application phases.

TABLE 2 Typology of EJ Programs and Key Practices
• Type 1, activity based (Phase I, Level I): establish formal guidelines, identify target groups, and implement a public involvement process.
• Type 2, activity based (Phase I, Level II): standardize technical analysis procedures and maintain citizen group lists.
• Type 3, activity based (Phase I, Level III): create a standing citizen committee for feedback and implement a formal interdivisional process.
• Type 4, performance based, early stage (Phase II): create performance measures and use survey groups to measure outcomes.
• Type 5, performance based, mature stage (Phase III): integrate outcomes into policies and funding decisions.

The EJ maturation model does not imply that all the elements listed at a particular maturity level must be adopted by an agency in order to demonstrate EJ outcomes. It does indicate, however, that agencies need to be thoughtful and strategic in deciding which policies, procedures, and activities to adopt in order to develop a program that demonstrably achieves EJ outcomes.

Survey Results, Analysis, and Discussion: Benchmarking EJ Performance

Methodology

Analysis of the EJ literature led to the identification of several common and effective practices. Effective practices are those that will help agencies achieve EJ outcomes, as illustrated in Figure 1 and reflected in the following survey questions:

1. Does your agency have written, formalized, independent EJ policies or practices? If so, what are they? May we have access?
2. Does your agency systematically survey selected community groups, the public, or other organizations regarding the equity of transportation programs?
3. Does your agency have multimodal policy groups?
4. Does your agency use standard NEPA guidelines to conduct EJ processes?
5. What is your current data analysis methodology? For target areas? For determining impacts? Is it written or standardized in any way?
6. Does your agency have a citizen advisory group or any other structure for soliciting feedback?
7. How does your agency use information obtained from the public?
8. Does your agency have any examples, information, etc., to show a result, outcome, or impact occurring as the result of considering EJ on transportation planning and practice? On transportation?

A targeted survey was conducted to obtain more detailed information representing effective practices (some in more use than others). Because a national survey on EJ practices of state DOTs and MPOs had been completed within 2 years of the current study (1), the researchers did not think that another full-scale national survey would add significant value.
Rather, they focused on identifying a range of effective practices to illustrate different maturation levels in achieving EJ outcomes. The targeted DOTs were identified through the literature and a list of DOTs submitted for review by national experts on EJ practice, including the TRB's Transportation and Environmental Justice Committee and the FHWA Resource Center's Civil Rights Technical Services Team. Structured telephone interviews were then conducted with nine state DOTs during February 2011, with one follow-up interview conducted in June. The personnel interviewed were generally managers or coordinators in Civil Rights, Title VI, or Environmental Divisions. The survey questions listed above were developed to determine the DOTs' current practices, assess a range of approaches, and determine where these approaches fall on the maturity scale of achieving EJ outcomes. Tables 3 and 4 report the survey results anonymously. A review of the tables is informative regarding the maturing state of EJ in transportation in the country and the steps required for advancing EJ programs to the next level on the maturity scale.

Results

Several important observations were made on the basis of the survey results. One important observation is that EJ programs are housed in various units of DOTs. In some states, EJ compliance is handled by the unit that also handles Title VI and Civil Rights compliance. In other states, EJ is seen as an environmental issue, and the process is conducted through the environmental division, often as a part of the NEPA process. Integration of the EJ process with NEPA can have advantages and disadvantages. Incorporating EJ into the NEPA process can ensure that EJ is accounted for in the planning of projects. However, because NEPA focuses on the natural environment and not the social environment, EJ analysis conducted by environmental departments may be limited in scope. Additionally, the full NEPA process is not conducted for all projects; those that qualify for categorical exclusions do not require a full review. In these cases, EJ must be assessed separately. If EJ is housed exclusively in the Civil Rights or Environmental Division, EJ may not be addressed through interdivisional cooperation. More mature EJ programs will have a process that integrates these two divisions and possibly involves others.

Just under half of the agencies studied had formal guidelines for EJ assessment, yet all of the agencies had a process to identify impacts on target groups. By federal mandate, all DOTs must have a public involvement process. However, many agencies have public involvement processes that are not necessarily considered part of EJ; in other cases, public involvement is seen as synonymous with EJ. Although public involvement is critical in EJ, it is not sufficient in and of itself to ensure that an agency achieves EJ outcomes. Because all agencies had one or more of these practices, they have achieved at least Phase I, Level I on the maturity scale (Table 3).

In general, the EJ programs of most DOTs are in the Phase I stage of the maturity scale. However, even within this phase, DOTs exhibit various levels of progress. Three DOTs were clearly in Phase I, Level I (i.e., DOTs 3, 4, and 5). These DOTs all determined disproportionately high and adverse impacts of projects on target groups. In addition, they may or may not have established public involvement processes. Some of them had integrated their EJ processes with NEPA.
Only one of these DOTs (i.e., DOT 3) had a formal EJ plan at the time of the survey, and it did not use NEPA to structure the EJ process. DOTs without EJ plans may develop one and promote EJ as an interdivisional initiative within the agency. DOT 7, for example, had developed an EJ plan. However, this agency did not have any formalized public involvement processes and would not be able to demonstrate an inclusive decision-making process; DOT 7 was therefore categorized at Level I. DOT 7 did, however, have a written technical analysis procedure, an important Level II element, meaning that implementing a public involvement process would quickly move the agency to Level II status. DOT 8 identified disproportionately high and adverse impacts on target groups and documented EJ through NEPA. However, DOT 8 also did not show a formal public involvement process that involved EJ populations in decision making and was therefore categorized as Level I.

The difference between the DOTs in Levels II and III is a result of a policy structure promoting feedback and cooperation within the agency. Only two DOTs (i.e., DOTs 2 and 6) were categorized in Phase I, Level III; both had a structured process for citizen feedback and interdivisional procedures. It should be noted that DOT 1 had a standing committee; however, missing elements such as technical analysis procedures and other Level II practices are needed for this agency to be categorized as Level II or beyond.

Phase II of the model involves monitoring the outcomes of EJ actions to determine whether prior actions are actually achieving intended outcomes. This can be done through technical analysis using various performance measures and through public evaluation (using surveys) of affected communities. Three of the DOTs surveyed monitor and measure EJ outcomes, as shown in Table 4. In a performance-oriented approach, one DOT has created a list of qualitative performance measures that are compared across population groups (i.e., minority and nonminority) and assessed before and after the implementation of a transportation project (1). Performance measures (qualitative or quantitative) used in conjunction with before-and-after studies are a critical tool for assessing EJ outcomes.

Phase III of the model involves the next logical step after creating and evaluating performance measures: incorporating those results into future decision making. Table 4 shows the survey results for Phase III. It should be noted that DOT 9 does not have a distinct EJ program; rather, it has integrated EJ into a DOT-wide project development process. As a result, the outcomes of projects are linked back to funding decisions and policy development. In addition, two state DOTs were working toward the development of Phase III elements in their respective programs at the time of the survey.

TABLE 3 Practices and Survey Results: Phase I, Process and Burdens
Level I practices: formal EJ plan or standard guidelines; documentation of EJ through the NEPA structure; identification of disproportionately adverse impacts of projects on target groups; public involvement process.
Level II practices: internal EJ education program; external EJ education program; maintenance of citizen group lists; standardized technical analysis procedures.
Level III practices: formal internal EJ workgroups; standing committee of citizens for feedback.
DOT 1: X X X. DOT 2: X X X X X X. DOT 3: X X X. DOT 4: X X X. DOT 5: X X. DOT 6: X X X X X X X X. DOT 7: X X X. DOT 8: X X X. DOT 9: X X X X.
Note: X = practice in place; blank cell = not applicable.

TABLE 4 Practices and Survey Results: Phases II and III
Phase II, evaluation of benefits and burdens: survey of target groups to measure performance; measures for process and outcomes; systems evaluation; assessment of cumulative impacts.
Phase III, decision making (linkage to resources): linkage of measured outcomes to funding decisions.
DOT 1: X. DOT 3: X. DOT 6: X. DOT 9: X X. DOTs 2, 4, 5, 7, and 8: none reported.
Note: X = practice in place; blank cell = not applicable.

Impact of EJ Policies, Programs, and Activities on Transportation

The impact of EJ on transportation depends on the demonstrated performance or outcomes of EJ policies, programs, and activities with respect to the quality of transportation for system users, with a particular focus on target groups. The extent to which EJ programs are demonstrably achieving EJ outcomes, as required by EO 12898 and the FHWA–FTA guidance, can be linked to the maturity levels of the EJ programs in state DOTs (and other agencies involved in transportation decision making) around the country.
Specifically, according to the FHWA–FTA EJ principles and the findings of this study, to demonstrate EJ outcomes, agencies must involve target groups in decision making on projects that are expected to affect them (process); measure whether there are disproportionate impacts on burdens and benefits for the identified target groups; implement actions to address disproportionate impacts; monitor outcomes to determine whether the intended outcomes of the EJ actions are being achieved; and address any deficiencies identified through future funding and policy decisions. The results of this study show that state DOTs are at different levels of maturity in developing the capacity to demonstrate EJ outcomes in transportation decision making, and, as with any evolving practice, there is a range of capacities to demonstrate EJ performance, with some agencies taking the lead in defining and demonstrating best practices.

Summary and Conclusions

More than 15 years have passed since the issuance of EO 12898. In that time, general, rather than specific, guidance from the U.S. DOT, FHWA, and FTA has allowed state DOTs to develop EJ programs to suit their individual needs while meeting the requirements of the issued guidelines. As a result, EJ programs across the country have matured to different levels. The EJ maturation model and the results presented in this paper allow state DOTs to benchmark the maturity of their programs against other states. DOTs can identify their maturity within the framework and determine steps for moving their EJ programs to the next level or to best-practice status.

The survey results show that several current practices end at the documentation phase. Therefore, the next step in several state DOT EJ programs is to measure the EJ outcomes of transportation projects. The Arizona DOT's research suggests that there may be a disconnect between an agency's internal evaluation and the external public evaluation of EJ efforts (5). This finding indicates the importance of postimplementation evaluations. To engage in performance-based EJ, agencies need to assess how well funding decisions reflect the knowledge obtained from EJ analysis, particularly how well they address any deficiencies and affect the quality of transportation for target populations and other communities (5, 21). Systematic ways to address these efforts include implementing formal policies and guidelines, (decision) criteria, and performance measures for EJ (1). These measures must then be incorporated into future transportation policy decisions and funding allocations for project development. As transportation agencies continue to improve and refine their EJ programs, a critical measure of the performance of EJ activity on transportation
is the ability of agencies to demonstrate how they are achieving EJ outcomes in all the communities they serve.

Acknowledgment

This study was jointly funded by the Georgia Department of Transportation and the U.S. Department of Transportation through the Georgia Tech University Transportation Center project Impact of Environmental Justice on Transportation Planning.

References

1. Owens, E., G. Goodwin, C. Lewis, and J. Mallory. An Evaluation of Environmental Justice and Environmental Equity: Laws and Issues that Affect Minority and Low-Income Populations. Publication SWUTC/08/167921-1. Southwest Region University Transportation Center, College Station, Tex., 2008.
2. U.S. Department of Transportation, Office of the Secretary. Department of Transportation (DOT) Order to Address Environmental Justice in Minority Populations and Low-Income Populations. Federal Register, Vol. 62, No. 72, April 15, 1997, pp. 18377–18381.
3. U.S. Department of Transportation, Office of the Secretary. Department of Transportation Final Environmental Justice Strategy. Federal Register, Vol. 60, No. 125, June 29, 1995, pp. 33896–33899.
4. Federal Highway Administration. Environmental Justice. www.fhwa.dot.gov/ENVIRONMENT/EJ2.HTM. Accessed Sept. 30, 2010.
5. Jerome, A., and J. Donahue. What Is the Best Way to Address Environmental Justice Issues? Arizona Department of Transportation, Phoenix, 2002. http://www.azdot.gov/TPD/ATRC/publications/project_reports/PDF/AZ506.pdf.
6. Grauberger, C., and D. Van Orden. Integrating Environmental Justice in Transportation Planning Phase II. Colorado Department of Transportation Research Branch, 2003. http://www.coloradodot.info/programs/research/pdfs/2003/environmentaljustice2.pdf/view.
7. Ohio Department of Transportation. Guidance and Best Practices for Incorporating Environmental Justice into Ohio Transportation Planning and Environmental Processes. Ohio Department of Transportation, 2002. http://www.dot.state.oh.us/Divisions/TransSysDev/Environment/NEPA_policy_issues/ENVIRONMENTAL_JUSTICE/Documents/EJ_Book_Complete.pdf.
8. Amekudzi, A. A., and K. K. Dixon. Development of an Environmental Justice Analysis Methodology for Georgia Department of Transportation's Multimodal Transportation Planning Tool. Proc., 8th Transportation Research Board Conference on Planning Applications, Corpus Christi, Tex., April 22–26, 2001.
9. Hatry, H. P., and J. S. Wholey. Performance Measurement: Getting Results, 2nd ed. Urban Institute Press, Washington, D.C., 2007.
10. Robinson, G. Environmental Justice in Transportation Toolkit, Volume 2. Transportation Equity Cooperative Research Program, 2008. http://ejkit.com/the-toolkit/ej-toolkit/ej-toolkit-volume-2/.
11. Midwest Transportation Knowledge Network. Data and Measure Synthesis. DOT State Stats, 2010. http://members.mtkn.org/measures/2010.
12. Forkenbrock, D. J., and L. A. Schweitzer. Environmental Justice in Transportation Planning. Journal of the American Planning Association, Vol. 65, No. 1, 1999, pp. 96–112.
13. Chakraborty, J.
Evaluating the Environmental Justice Impacts of Transportation Improvement Projects in the U.S. Transportation Research Part D, Vol. 11, No. 5, 2006, pp. 315–323.
14. Forkenbrock, D. J., and J. Sheely. NCHRP Report 532: Effective Methods for Environmental Justice Assessment. Transportation Research Board of the National Academies, Washington, D.C., 2004. http://onlinepubs.trb.org/onlinepubs/nchrp/nchrp_rpt_532.pdf.
15. Duthie, J., K. Cervenka, and S. T. Waller. Environmental Justice Analysis: Challenges for Metropolitan Transportation Planning. In Transportation Research Record: Journal of the Transportation Research Board, No. 2013, Transportation Research Board of the National Academies, Washington, D.C., 2007, pp. 8–12.
16. Hartell, A. M. Methodological Challenges of Environmental Justice Assessments for Transportation Projects. In Transportation Research Record: Journal of the Transportation Research Board, No. 2013, Transportation Research Board of the National Academies, Washington, D.C., 2007, pp. 21–29.
17. Boston, T., and L. R. Boston. Beyond Race and Poverty: A Multi-Dimensional Approach to Measuring Environmental Justice. Atlanta Regional Commission, Atlanta, Ga., 2007.
18. Klein, N. Spatial Methodology for Assessing Distribution of Transportation Project Impacts with Environmental Justice Framework. In Transportation Research Record: Journal of the Transportation Research Board, No. 2013, Transportation Research Board of the National Academies, Washington, D.C., 2007, pp. 46–53.
19. Grauberger, C., and D. Van Orden. Environmental Justice Research Study. Colorado Department of Transportation Research Branch, 2002. http://www.coloradodot.info/programs/research/pdfs/2002/env-justice.pdf/view.
20. Colorado Department of Transportation. CDOT's Title VI and Environmental Justice Guidelines for NEPA Projects, Rev. 3. Colorado Department of Transportation, 2004. http://cospl.coalliance.org/fez/view/co:5133.
21. Sanchez, T., and J. F. Wolf. Environmental Justice and Transportation Equity: A Review of Metropolitan Transportation Organizations. Presented at Racial Equity in Transportation: Establishing Priorities for Research and Policy, Roundtable Sponsored by the Civil Rights Project at Harvard, Brookings Institution, Washington, D.C., 2005.

The authors remain exclusively responsible for the material presented in this paper.

The Environmental Justice in Transportation Committee peer-reviewed this paper.