Executive Summary

This report, jointly authored by The Century Foundation, the National Employment Law Project, and Philadelphia Legal Assistance, presents the findings of an intensive study of state efforts to modernize their unemployment insurance (UI) benefit systems. This is the first report to detail how UI modernization has altered the customer experience. It offers lessons drawn from state modernization efforts and recommends user-friendly design and implementation methods to help states succeed in future projects.

While the need for better systems was apparent even before the COVID-19 pandemic struck, that crisis has illuminated the challenges with the existing UI infrastructure. This report includes specific recommendations that can inform the federal and state response to the unprecedented volume of unemployment claims during the pandemic, as well as ideas for longer-term reforms.

What Is Benefits Modernization?

Benefits modernization is the process of moving the administration of unemployment benefits from legacy mainframe systems to a modern application technology that supports web-based services. Many of these mainframe systems were programmed with COBOL, a long-outdated computer language. A few states began to upgrade their systems in the early 2000s, with the pace picking up after targeted federal funds were made available to support modernization in 2009.

Unfortunately, many of the initial modernization projects encountered significant problems. Some were abandoned altogether, while others were poorly implemented. Too often, workers paid the price through inaccessible systems, delayed payments, and even false fraud accusations. The COVID-19 pandemic, which led to an unprecedented spike in unemployment claims, has further exposed weaknesses in these systems.

To date, fewer than half of states have modernized their unemployment benefits systems. Several have plans to modernize or are already in the midst of modernizing. The guidance in this report is meant for them, as well as for modernized states that are looking to improve their systems.

Research Methods

The findings and recommendations in this report are grounded in interviews with officials from more than a dozen states and in-depth case studies of modernization in Maine, Minnesota, and Washington, conducted from October 2018 to January 2020. The case studies involved many hours of in-person discussions with agency leadership and staff, focus groups with unemployed workers, and interviews with legal services organizations, union officials, and other stakeholders.

The report provides lessons for states no matter which pathway to modernization they choose. In fact, the three states featured in the case studies took notably different approaches. Minnesota was one of the first to modernize in 2007, and while it used a private vendor, the code remains the property of the state agency. Washington contracted directly for a proprietary commercial off-the-shelf (COTS) system which rolled out in 2017. Maine also modernized in 2017, but as part of a consortium, meaning that it shares system and maintenance expenses with two other states (Mississippi and Rhode Island).

To complement the interviews and case studies, this report presents new data analysis, suggesting that while timeliness in processing claims and paying benefits improved in many states after they modernized, denial rates went up for workers seeking benefits and the quality of decisions declined.

The report also examines the growing use of artificial intelligence and predictive analytics in unemployment insurance. It concludes that, while some of these tools can improve operations and potentially help workers better understand program requirements, major concerns about fairness, accuracy, and due process remain.

Recommendations

The single strongest recommendation in this report is for states to place their customers at the center of a modernization project, from start to finish. The biggest mistake states made was failing to involve their customers—workers and employers—at critical junctures in the modernization process. This led to systems touted as convenient and accessible, but which claimants often found challenging and unintuitive. Customer-centered design and user experience (UX) testing are widely accepted best practices in the private sector, and should be a core part of any UI modernization effort.

More specifically, the report provides recommendations for states at each of the three major stages of a modernization project.

At the planning stage:

  • setting a realistic timetable and avoiding a rushed implementation;
  • embedding talented agency staff in a modernization effort and getting their buy-in every step of the way;
  • asking customers what they need;
  • being willing to revamp the agency’s business processes along with the technology; and
  • identifying key conditions up front in an RFP (for agencies using outside vendors).

At the design stage:

  • getting user feedback from a broad range of stakeholders;
  • allowing plenty of “sandbox” time for agency staff; and
  • building in a set of key features that will help customers and reduce the burden on agency staff.

At the implementation stage:

  • not going live in the November–March period, when seasonal claims rise;
  • considering rolling out pieces of a new system in stages;
  • training and supporting staff on an ongoing basis;
  • staffing up call centers and deploying additional staff to career centers;
  • having a robust community engagement plan;
  • expecting bugs and having a process in place to fix them; and
  • providing for ongoing feedback from customers and front-line staff.

The pandemic has revealed how critically important unemployment insurance is to workers, their families, and the broader economy. By following the steps outlined in this report, states can build stronger unemployment systems that deliver the services and benefits their customers need.

What States Can Do Right Now

The release of this report coincides with the emergence of one of the greatest challenges the unemployment insurance system has ever faced: the COVID-19 pandemic. More than 10 percent of the workforce filed initial claims for unemployment in a three-week period in March and April, and job losses continued to rise thereafter.

State systems have been overwhelmed by the basic task of accepting claims, and workers are frustrated. Luckily, there are immediate steps states can take to improve access, even within outdated systems. Some states are already moving to implement these reforms, and others should follow their lead as quickly as possible.

Michigan is a good example of a state that has turned a poorly engineered system into one that is serving workers well during the pandemic. Although the original system had been modernized, it was designed with a faulty algorithm that inaccurately flagged workers for fraud and cut off benefits at every decision point. A new governor appointed a claimant representative to head the agency, and less than a year later, the system faced the massive challenge posed by the COVID-19 outbreak. The new leadership identified places in the system that placed an unnecessary hold on benefits and turned off those chokepoints. As a result, the agency became the second-fastest in new benefit processing among the ten states with the largest numbers of claims,1 and one of the first states to stand up the new Pandemic Unemployment Assistance benefit and the Pandemic Unemployment Compensation benefit authorized by the CARES Act.

Our recommendations for what states can do now come from our study of best practices at the state level. While states are unlikely to be able to fully replace their UI systems in the midst of this crisis, they can and must improve their technology. Here are six key areas for immediate improvement.

First, unemployed workers need 24/7 access to online and mobile services. We live in a country where you can shop online at any hour of the day. Filing for unemployment shouldn’t be restricted to nine to five on weekdays.

Second, unemployment websites and applications must be mobile-optimized. More people have mobile phones than desktop or laptop computers, and public access to computers has vanished in an era of social distancing. Low-wage workers and workers of color are particularly likely to rely on their phones for Internet access. While more than 80 percent of white adults report owning a desktop or laptop, fewer than 60 percent of Black and Latinx adults do. States must also allow workers and employers to email documents or upload them from their phones.

Third, states should update their password reset protocols. In some states, workers must be mailed a new password; in others, staff cannot process claims because they are busy answering phone calls about password resets. Technology exists for states to implement secure password reset protocols that do not require action by the agency, which saves time for everyone.

Fourth, states can use call-back and chat technology to deal with the unprecedented volume. These short-term fixes could become part of a permanent solution. “Call back” systems return a worker’s phone call instead of making them wait on hold. Chat-bots, live chats, and thought bubbles can define terms and answer simple questions for workers filing online.

Fifth, states should adopt a triage business model. Many of the questions coming into the call centers relate to passwords or claim status. Using a triage model, states can quickly train staff to handle this part of the volume, and leave more challenging questions to more experienced staff.

Finally, civil rights laws require that states translate their websites and applications into Spanish and other commonly spoken languages. Right now, an unemployed worker with limited English skills may have no choice but to file an application over the phone with an interpreter. With so many seeking help, workers may never get through on the phone or can get stuck on hold for hours. Translating online materials would not only ensure equal access, but also be more efficient.

Even if these measures take a number of weeks or months to implement, the investment would be well worth it. The employment crisis triggered by the pandemic has highlighted gaping holes in accessing unemployment, but it has also created an opportunity. We can build twenty-first-century systems nimble enough to handle disasters and designed to meet the needs of customers who are depending on access to unemployment insurance in this traumatic time.

Introduction

Unemployment insurance (UI) is the nation’s primary social program for jobless workers and their families. The federal–state UI program was established by the Social Security Act of 1935, providing critical income support for workers contending with a national unemployment rate of approximately 25 percent during the Great Depression. More recently, UI was credited with reducing poverty, empowering workers, and stabilizing the economy during the Great Recession.2 The benefits available through these programs keep workers economically stable during periods of job loss, often making the difference in whether a worker keeps access to vital job search resources like a phone or a car, and to basic necessities like a roof over one’s head and food on the table.

While the research for this report was conducted before the COVID-19 crisis erupted, the current economic crisis has put into sharp focus the need for strong unemployment systems. Between March 14 and April 25, 30 million Americans—one-fifth of the workforce that is covered by unemployment insurance programs—filed an initial application for unemployment benefits. These workers experienced overwhelmed phone lines and websites, and most importantly, excruciating delays in receiving benefits. As the crisis erupted in March, only 14 percent of the 11.7 million jobless workers who filed claims received a UI payment. A recent survey revealed that for every ten workers who were able to file for unemployment insurance, three to four additional workers tried to apply but could not get through UI systems to make a claim, and two additional people did not try because it was too difficult.3

This report explains efforts by states to modernize unemployment benefits systems—many of which still rely on 1980s technology such as mainframe computers running the COBOL programming language—and explores ways for these new systems to realize their potential to be more customer-friendly and scalable to challenges like the ones states face today. In looking at challenges and best practices among states, we also identify actions states can take right now, even before modernization, to improve the experience of workers and the performance of UI systems.

Most importantly, as states and the federal government look to rebuild the backbone of this crucial safety net anew, our report suggests a new customer-driven approach geared to meet the needs of jobless workers reaching out for help during the long economic recovery ahead and future economic crises. As those new systems are built, the practical realities of workers’ lives must be kept front and center. Our analysis finds, for example, that states that modernized their benefits systems were more likely to deny benefits because they have imposed more stringent online verification of work search activities that workers struggle to navigate. While these rules have been temporarily eased in many states during the COVID-19 crisis, they could become a major factor as the economy opens up.

While a few states began to modernize in the early 2000s, this trend picked up around the time of the Great Recession, as states began looking for technological improvements that could streamline business processes, reduce costs, and provide better security and privacy protocols for the massive amount of data maintained by the UI system. These UI benefit modernization projects involved moving state unemployment benefits and appeals systems from a “legacy” mainframe-based system to an application technology that supports web-based services. They gained significant traction as states received targeted federal funds in 2009. These upgrades were necessary for security; one state secretary of labor described the mainframe system as held together by “bubblegum and duct tape.”4 The upgraded systems also offered customers the potential for more convenient online filing, notification of claims progress, and appeal filing.

Some early modernization efforts were unsuccessful. As of 2016, 26 percent of projects had failed and been discarded; 38 percent were past due, over budget, or lacking critical features and requirements; and 13 percent were still in progress. System failures can have disastrous human consequences. UI systems that are poorly planned and lack critical user testing limit claimants’ ability to access benefits, and sometimes cut them off from benefits entirely. Many states experienced significant lock-out issues when claimants had no easy way to reset their online account passwords.5 Florida cut off all points of access to its system for anyone not using the online tools, creating significant barriers for workers with language access, computer literacy, or broadband access issues. Systems that incorporated automated decision-making processes generated tens of thousands of incorrect fraud determinations that put workers into massive debt, drove them to bankruptcy, and cut off future access to unemployment benefits.6

However, the last few years have shown great improvement, with more states implementing full systems while controlling costs. Some states learned from past missteps and made improvements in time to meet the challenges presented by COVID-19.

This report is the first to detail how UI modernization has altered claimant experiences. It shares the findings from interviews with officials from more than a dozen states and in-depth case studies of modernization in Maine, Minnesota, and Washington. It also analyzes publicly available UI data to run a comparative analysis of state UI programs at various stages of modernization. The report then draws on those interviews, case studies, and data analysis to present a set of recommendations for states to follow in their modernization projects.

States that have modernized acknowledge that the process is challenging and never perfect, but many have sought to learn from these experiences to build user-centric systems with positive outcomes for workers. Our hope is that this report will aid all states in doing so.

The Role of Unemployment Insurance

UI serves several key policy goals. Most obviously, it provides income stabilization, through a cash benefit, for individuals who are involuntarily unemployed. This stabilization extends to the local economy generally and is especially important during times of economic downturn. The UI system also promotes attachment to the workforce, provides job search assistance, and sets suitability standards so that workers are not pressured into accepting work that does not match their skills, which in turn helps avoid downward pressure on wages.

Generally, during periods of recession, Congress has acted to temporarily extend benefits for workers after they exhaust their standard twenty-six weeks of state UI. During the Great Recession, state and federal UI payments totaled over $600 billion, keeping 11 million workers above the poverty line.7 Economists Alan Blinder and Mark Zandi examined the effect these payments had on the recovery, and found that every dollar in benefits paid generated $1.61 in local economic activity.8 Similarly, the CARES Act of 2020 added thirteen weeks of benefits, temporarily increased payouts by $600 per week, and expanded eligibility to new categories of workers, delivering an estimated $250 billion in support to workers and the broader economy.

Maintaining the federal–state UI program is a macroeconomic balancing act. During periods of economic growth, UI agencies build up their trust funds to prepare to stabilize the workforce and economy during periods of economic recession. However, while the level of funding in a state’s UI trust fund is an important indicator of a state’s recession readiness, just as important—perhaps even more so—is a state’s ability to make benefits available to workers accurately, efficiently, and in a timely manner during a recession. As we evaluate the effect of modernizing IT systems, it is critical that we recognize access to benefits as an important countercyclical tool.

Who Does the System Serve?

UI systems have two primary customers: workers and employers. While worker benefits are the most visible part of the system, employers also interact with the UI system from both a tax and a benefit perspective.

While every state operates its program differently, in general, employers are charged for UI benefits based on their former employees’ experience with the system. This taxation system is referred to as “experience rating.” State UI taxes (assessed per the State Unemployment Tax Act, or SUTA) are levied on the state’s taxable wage base, which ranges from the first $7,000 to the first $46,800 each worker earns, depending on the state. The UI tax structure gives employers an interest in whether or not workers receive benefits, as benefit receipt can put employers on the hook for higher taxes. Employers are also included in initial investigations into benefit eligibility, receive notices about eligibility determinations, may appeal those determinations, and often participate in administrative hearings that address the reason a worker is unemployed.

Workers have historically been assessed for eligibility on two bases: financial eligibility and separation eligibility. Financial eligibility is based on how much money a worker earned during the qualifying base period and how often they earned money. Separation eligibility addresses why the worker is unemployed—that is, whether they left their job for a qualifying reason, such as through a layoff, a discharge without cause, or quitting for good reason.

Workers also face continuing eligibility requirements: to receive benefits, they must file weekly or biweekly certifications in which they report any earnings, show they are able to work, and confirm they are still unemployed. Additionally, workers’ continuing eligibility may be challenged based on their availability for work and the effort they make to find suitable employment. To interact with the system, workers must file initial claims, communicate with agency representatives, file continuing claims, receive notices about eligibility, appeal determinations, and participate in administrative hearings.

The states are obligated under federal law to serve customers in a manner that also ensures due process. Specifically, the Social Security Act of 1935, which created the UI program, requires that the states provide for “methods of administration . . . reasonably calculated to insure the full payment of compensation when due” and for a “fair hearing.”9 These provisions necessitate fair but also rapid and accurate administration of the program so that workers are able to receive benefits within a few weeks of losing work. Failing to conform to these requirements can trigger the loss of a tax credit to employers (per the Federal Unemployment Tax Act, or FUTA) of 5.4 percent of the first $7,000 in worker pay, or up to $378 per employee per year. The U.S. Department of Labor’s Employment and Training Administration (ETA) plays a critical role in enforcing these federal safeguards and ensuring compliance by the states.

Racial Equity Implications of the UI System

The evidence suggests that institutional racism plays a significant role not only in unemployment but also in access to UI systems. If we look at the racial equity implications of modernizing UI systems, we see a compelling reason to center the experiences of Black and brown workers.

Higher Unemployment Rates

Black and Latinx workers face labor market obstacles and exclusions, including rates of hiring discrimination that have remained unchanged over the past twenty-five years.10 The unemployment rate for Black workers across almost every level of education has remained double that of white workers for nearly forty years.11 And in the ten largest majority-Black cities, the unemployment rate of Black residents was 3.9 to 10.8 percentage points higher than that of white residents.12

Lower UI Benefits

Despite facing higher rates of unemployment, evidence shows that Black and Latinx workers do not receive UI benefits at the same rate as white workers. In 2010, following the Great Recession, unemployed non-Latinx Black workers had the lowest rate of UI receipt, at only 23.8 percent, compared to 33.2 percent of unemployed non-Latinx white workers; meanwhile, only 29.2 percent of unemployed Latinx workers received benefits.13 Between April 27 and May 10, 2020, over 71.5 percent of unemployed Black women did not receive unemployment benefits, compared to 54 percent of unemployed white women.

Racial Wealth Gap

Applying for unemployment insurance during normal times can be a complicated and arduous process; add in the built-in waiting week many states impose, and workers are often left struggling with a gap in income. With estimates of the share of workers living paycheck to paycheck ranging from half to 74 percent, a wait for a UI check—or no check at all—can be painful.14 This is particularly true if a worker doesn’t have wealth to fall back on. Being able to wait for unemployment benefits is a luxury afforded to those with savings and wealth; and unsurprisingly, this nation’s racial inequities have created a racial wealth gap.

America’s laws and policies have deprived people of color of an equitable share of the nation’s wealth. The typical net worth of a white family is nearly ten times that of a Black family and seven times that of a Latinx family.15 And more than 25 percent of Black households have zero or negative wealth, compared to less than 10 percent of white households.16 Without wealth to fall back on, workers of color are even more harmed by inefficient and ineffective systems.

Racial Wage Gap

Racial wage gaps mean lower earnings, resulting in smaller UI benefits. Black men earn just 73 cents—and Latinx men earn just 69 cents—for every dollar earned by a white man.17 And while the overall gender wage gap means that the average white woman makes just 79 cents for every dollar earned by a white man, women of color earn even less: 62 cents for Black women, 57 cents for Native American women, and 54 cents for Latinx women.18

Digital and Mobile Divide

How people access information about unemployment benefits and apply for them is also shaped by race. More people have mobile phones than desktop or laptop computers,19 and unemployment websites and applications that are not mobile-responsive place a disproportionate burden on workers of color. Twenty-five percent of Latinx and 23 percent of Black adults, compared to just 12 percent of white adults, are entirely smartphone dependent and do not use broadband at home.20 While more than 80 percent of white adults report owning a desktop or laptop, fewer than 60 percent of Black and Latinx adults do.21 And when it comes to job searches, 55 percent of Black and Latinx workers, compared to just 37 percent of white workers, use their smartphones to get information about a job; when applying for jobs, Black and Latinx workers are more than twice as likely as white workers to apply using their mobile device.22 Ensuring all workers can navigate the UI system requires access to mobile-friendly programs.

Biased Algorithms

According to the research institute Data & Society, “algorithms can be incredibly complicated and can create surprising new forms of risk, bias, and harm.”23 Algorithmic systems can have bias in multiple places, including biases introduced through data input or by the algorithm creator.24 One of the problems with algorithmic systems is that they often make decisions that result in different outcomes based on protected attributes, including race, even if these attributes are not formally entered into the decision-making process.25 For example, evidence shows a racial impact in medical algorithms that ignore social determinants of health or result in Black people needing to be sicker than white people before being offered additional medical help.26 Without external auditing systems that assess how the data is processed, these biases are allowed to go unchecked.27

Due to the layers of institutional racism faced by Black, Indigenous, and other workers of color, we must work alongside worker leaders and organizations to create a racially just, inclusive, and truly accessible unemployment insurance system. A modernized unemployment insurance system should center the experiences of workers of color and collect data on race and ethnicity, to ensure states are adequately meeting the needs of all workers.

Administrative Funding

Critical to the viability of state UI programs is access to adequate funding for administration of the tax, benefits, and appeals systems. The federal government funds the administration of the state UI programs, including eligibility determinations, tax collections from employers, and the appeals process. State systems have been chronically underfunded, and the resulting search for efficient technology solutions is one of the principal motivators behind benefit modernization projects.

Administrative grants are tied to the amount of unemployment insurance claims paid out by the state and therefore drop when there are improvements in the economy and declines in UI recipiency. As a result, federal grants for the administration of unemployment insurance declined by 30 percent from 1999 to 2019 on an inflation-adjusted basis.28

Funding at these levels was barely enough for states to maintain the basic staff needed to operate their UI programs, let alone upgrade and maintain unemployment insurance technology. The approximately $2 billion in annual federal funds available to states before the COVID-19 crisis left state UI programs with threadbare staffs that struggled to address the sudden and major surge of claims. In particular, states lacked flexible technology that could quickly incorporate law changes, bandwidth to process claims, and enough trained staff to ramp up call center and adjudication operations that require interactions with claimants. While the federal government provided an additional $2 billion in state UI administrative funding under the Families First Coronavirus Response Act, which is allowing the states to increase staffing and pursue short-term technology improvements to respond to COVID claims, this “boom and bust” cycle of federal funding is inconsistent with the states’ need for stable funding to sustain their UI programs.

Federal regulations stipulate that states should deliver a first payment to at least 87 percent of eligible applicants within fourteen or twenty-one days of their initial claim for benefits (states with a waiting week are allowed twenty-one days to make a payment).29 The national average of first payment timeliness dropped below the national standard in 2008, which was understandable, given that the surge of unemployment in the early years of the Great Recession came before states got more administrative dollars from the formula. However, timeliness did not recover, even when UI claims fell to their lowest level in fifty years. Throughout our interviews, state officials consistently complained about having to juggle staffing to meet all their requirements, including taking claims, and cited low administrative funding as a reason for driving more claimants to online initial applications.


As a result, states have been forced to look for supplemental funding. Between 2007 and 2016, the National Association of State Workforce Agencies reported a 115 percent national increase in state supplemental spending for UI administration. As of 2017, seventeen states and territories reported the use of a state administrative tax to supplement administrative costs; nine states reported using general revenue, while twelve states said they used other sources for administrative funding.30 The federal government has periodically, but not consistently, provided additional funding that has supported benefits modernization projects, as discussed further below (see “The Federal Role”).

Benefit Modernization: Where Are We Now?

Since the early 2000s, states have been working toward upgrading their UI technology. Driven by concerns about data security and privacy, administrative costs, and efficiency, states have tried a variety of methods to either replace their systems wholesale or gradually improve their components. Most have done so through contracts with private vendors, sometimes as part of a consortium, where system and maintenance expenses are shared across a group of states, but the product itself is customized to some degree to each participating state’s needs. At least one state, Idaho, handled its modernization project entirely in house.31 Consortium models hold the potential to generate even more financial savings if other states join the consortium later, but states still have to navigate governance issues.

For many years, modernization projects trended in the red—encountering significant cost overruns and schedule delays, with several states actually pulling their projects. As noted in the introduction to this report, as of 2016, 26 percent of projects had failed and been discarded; 38 percent were past due, over budget, or lacking critical features and requirements; and 13 percent were still in progress.32 However, the past few years have shown great improvement, with more states implementing final systems while controlling costs.

As of 2019, twenty-two states had completed modernization projects for their UI benefits systems and twenty-one states completed modernization projects for their UI tax collection systems (sixteen states have completed both). Thirteen states reported having UI modernization projects in development,33 including two—Ohio and Montana—that are re-modernizing after an early 2000s implementation.

TABLE 1
States That Modernized Their UI Systems, By Year
Montana 2001
Ohio 2004
Utah 2006
Minnesota 2007
New Hampshire 2009
Illinois 2010
Nevada 2013
New Mexico 2013
Michigan 2013
Massachusetts 2013
Florida 2013
Indiana 2014
Idaho 2014
Louisiana 2015
Tennessee 2016
Mississippi 2016
Missouri 2016
Washington 2017
South Carolina 2017
Maine 2017
Wyoming 2018
North Carolina 2018
Source: NASWA Information Technology Support Center and authors’ analysis

Even completed modernization projects have encountered significant problems, including numerous delays, issues with testing, data conversion errors between legacy and new systems, data loss and security issues, and poor training of the staff who interact with claimants. For example, Massachusetts’ modernized system, built by Deloitte, was $6 million over budget and, after rolling out two years late, was riddled with implementation problems.34 Call center wait times doubled, and there were 100–300 claimant complaints per week, owing in large part to a major increase in system-generated questionnaires sent to claimants, which delayed claims processing.35

Tennessee’s system was developed by a different vendor, Geographic Solutions, but the state experienced some of the same problems as Massachusetts: the system auto-generated numerous non-critical questions about applications that had to be cleared by staff, and the backlog for responding to user questions about claims stretched to eighty-two days after the system was rolled out in May 2016. Data conversion problems between the legacy and new systems caused delays in payments, all during an implementation that a legislative audit later concluded was rushed.36

In several other states, implementation was rushed to meet external deadlines, leading to situations like those in Maine and Washington (described in depth in the case studies), where the states’ systems were not ready for a surge of claimant questions, glitches caused the systems to go down after launch, and recurring problems with core elements like passwords could not be solved.37

Florida’s CONNECT system was riddled with timeliness and accuracy problems when it launched, including more than 400,000 claims documents that were stuck in an “unidentified” queue and could not be processed.38 The implementation of the new system coincided with a new requirement that claimants report five employer contacts per week, which could only be reported online through the new system. As described in a previous NELP report, “the number of workers disqualified because DEO [the Florida Department of Economic Opportunity] found they were not ‘able and available for work’ or not ‘actively seeking work’ more than doubled in the year following the launch of CONNECT, even though weekly claims declined by 20 percent in that same year.”39 The U.S. Department of Labor’s Civil Rights Center found that this aspect of the system had a discriminatory effect on Limited English Proficient claimants, who struggled the most with the new requirement.40 New Mexico’s modernization also faced a civil rights complaint from a legal services organization based on multiple language access problems, including an elderly Spanish-speaking farm worker who was told he could file online on a site that was available only in English.41

The U.S. Department of Labor (DOL) has issued guidance advising states to move away from phone-based work-search verification and instead collect work search activities online through a case management system.42 But as Florida demonstrated, verifying job-seeking efforts online (such as the details of job applications submitted) can be a hurdle for claimants who have difficulty navigating online systems, including those with limited English proficiency.

The DOL guidance on work search is one aspect of the department’s focus on “program integrity,” that is, reducing the incidence of payments made in error. There have been several supplemental federal appropriations related to program integrity, which have been used to fund modernized systems featuring new models to detect improper payments and assess fraud. In one such instance gone badly awry, the state of Michigan used an automatic determination process that falsely charged more than 20,000 claimants with improperly collecting unemployment benefits (such as by both working and collecting UI benefits).43

This report intends to help modernizing states learn from the challenges these early-adopting states faced. It is clear from the research done for this report that UI modernization is maturing, with several states joining forces in consortia in an attempt to reduce the costs of developing and maintaining their systems. Moreover, states are benefiting from products that have been developed by vendors specifically for UI, have been road-tested in these early applications, and can be deployed more smoothly. Still other states have designed their own unemployment insurance technology systems and, as in the case of Idaho, are making their technologies and expertise available to other states. Taken together, these advances put the UI system in a position to improve the outcomes of modernization, both in states that have yet to modernize and in states that have already modernized and are looking to improve their systems. At the same time, it is important to learn from the ways that the COVID-19 pandemic has tested and stressed these systems, and how states have responded.

The Federal Role

While each state UI agency is responsible for its own benefit modernization project, the federal government plays a critical role in funding the projects and in overseeing compliance with the legal safeguards requiring fair and timely processing of benefits. Congress has also played a monitoring role, assisted by research from the U.S. Government Accountability Office (GAO), which issued a series of reports addressing the severe staffing, phone claims system, and IT challenges that compromised access to benefits during the Great Recession.44

Funding

Federal funds have been critical to many benefit modernization projects. During the Great Recession, Congress authorized the release of federal trust fund dollars (called “Reed Act” distributions, which are based on each state’s share of FUTA revenues) to the states to stabilize the solvency of their state trust funds, expand benefits, and support UI administration. In 2009, as part of the American Recovery and Reinvestment Act, Congress passed the UI Modernization Act, which created an incentive program for the states to expand UI benefits for low-wage and part-time workers, many of whom are women, while also helping the states make critical investments in IT, staffing, and other UI administration needs. From 2009 to 2011, thirty-nine states claimed about $4.5 billion in incentives to improve UI systems, including technology upgrades.45

In addition, in recent years, DOL has made varying amounts of supplemental funding available to support state consortia and state IT needs. However, that funding has been limited and typically comes with strings attached, such as mandates that the systems be designed to identify UI overpayments and assurances that the projects will be completed with other sources of funding, if necessary, to cover the full cost.

Oversight

DOL’s Employment and Training Administration (ETA) has set some standards in response to the growing reliance of states on technology to process UI benefits. Of particular significance, in 2015, ETA and DOL’s Civil Rights Center issued state guidance entitled “State Responsibilities for Ensuring Access to Unemployment Insurance Benefits,”46 which relied on the federal UI and civil rights laws to clarify where technology can compound problems of access to UI benefits facing many unemployed workers. On May 11, 2020, DOL updated this 2015 guidance in response to the COVID-19 pandemic, pointing out to states the requirement that they translate vital written, oral, and electronic information into languages spoken by a significant portion of the eligible or affected population, as defined by Department of Justice guidelines.47 Moreover, states must provide access to people with disabilities, including online enrollment systems that incorporate modern accessibility standards for deaf, blind, and otherwise disabled applicants.

This comprehensive interpretation of federal law provides a template for states to ensure full and fair access to benefits when modernizing their IT systems. The federal guidance clarifies that the “use of a website or web-based technology as the sole or primary way for individuals to obtain information about UI benefits or to file UI claims may have the effect of denying or limiting access to members of protected groups in violation of Federal nondiscrimination law.”48 It goes on to caution that state UI agencies “must also take reasonable steps to ensure that, if technology or other issues discussed in this [guidance] interfere with claimants’ access, they have established alternative methods of access, such as telephonic and/or in-person options.”49

On a separate track, in 2018, DOL rolled out a “pre-implementation planning checklist” for states to follow, which candidly recognizes that “recent efforts by states in launching new IT systems have resulted in unexpected disruptions in service to customers, delays in payment of benefits, and the creation of processing delays.”50 Before going “live” with a new UI benefit or tax IT system, the states are required to submit a report to DOL indicating that they have reviewed and addressed each element of the checklist,51 which covers all phases of the process, including functionality and testing of the system, customer access and usability, policies and procedures, implementation preparation, call center operations readiness, vendor support, communications, training, and other core functions and activities.52

The checklist was developed with the assistance of the Information Technology Support Center (ITSC), which is operated by the national organization of state UI and workforce agency administrators (the National Association of State Workforce Agencies, or NASWA). ITSC also provides online UI IT modernization resources and other services funded by DOL grants, which are available only to NASWA members, as well as consulting services during the initial planning phases of a state’s UI IT project.

To date, only a limited number of states have launched new systems that require submission of the planning documents. ITSC is not responsible for reviewing the pre-implementation planning documents to evaluate their compliance with the checklist. And given its limited resources and expertise in UI IT planning and implementation, ETA is likely to defer to the judgment of the states in evaluating the planning documents.

Recommendations for States

This section presents our full list of recommendations to states on how to plan, design, and implement a UI benefits modernization project. These recommendations are grounded in interviews with officials from more than a dozen states; in-depth case studies of UI modernization projects in Maine, Minnesota, and Washington; and analysis of data on UI system performance from all fifty states. (See the remainder of this report for these materials.) This list of recommendations presents:

  • best practices, as identified in our interviews and case studies of state UI modernization projects;
  • lessons learned about missteps to avoid; and
  • best practices from beyond UI, looking at both the public and private sectors.

These recommendations will be most applicable to states that have not yet modernized, or who are in the midst of modernization. However, they may also be helpful to states that have already modernized but are looking to improve their systems.

We recognize that state UI agencies operate with limited human, financial, and technological resources. Nonetheless, we believe the recommendations presented here are achievable even within those constraints, particularly if federal funds are available to bolster state efforts.

We have structured our recommendations to follow each of the three major stages of modernization: planning, design, and implementation. Our single strongest recommendation is to place customers at the center of the project, from start to finish. The biggest mistake we saw states make was failing to involve workers at critical junctures in the modernization process. This led to systems touted as convenient and accessible, but which claimants often found challenging and unintuitive. Customer-centered design and user experience (UX) testing are widely accepted best practices in the private sector, and should be a core part of any UI modernization effort.

Stage 1: Planning

Recommendation 1.1. Set a realistic timetable. Many state officials interviewed for this project said they regretted allowing too little time and having to rush implementation. Allow ample time for planning and design, including user testing and refinement, before you roll out a new system.

Recommendation 1.2. Get buy-in from agency staff. Modernization projects demand full commitment and cooperation across all agency divisions. It may be challenging to divert talented staff away from their daily responsibilities, but to succeed, you have to embed them in the modernization effort and get their buy-in every step of the way. Experienced staff can draw on their institutional knowledge to inform operational change management. Staff with coding experience can help ease the data migration from legacy systems. Seek their input early, to inform your RFP (if you are using an outside vendor) and make sure you don’t omit anything critical.

Recommendation 1.3. Ask customers what they need. As part of your initial needs assessment, reach out to unemployed workers and employers, and ask them what they would like to see in a modernized UI system. Agency staff will provide valuable input, but there is no substitute for going straight to your customers.

Recommendation 1.4. Be willing to revamp your business process. As you plan your modernization project, don’t be held back by your current business process. Assume it can and will change, rather than designing your new system around processes that may no longer make sense in a new environment.

Recommendation 1.5. Identify key conditions in your RFP. If you are using an outside vendor, making these three conditions clear in your RFP will help you negotiate a contract that sets you up for success.

1.5.1. Retain control of the go-live date. You don’t want to be rushed by a vendor into rolling out a new system before you are confident that you are ready.

1.5.2. Allow for extensive usability testing by your staff and customers. Some vendors test a product only with their own software engineers or through narrow user acceptance testing (UAT) with a handful of agency staff. Make sure you have the opportunity for broader user experience (UX) testing, to gather input from workers and employers in your state.

1.5.3. Provide for tech support after the go-live date. Even the best-planned modernization project will not roll out perfectly. Require the vendor to train your staff in advance so they can handle issues as they arise and make changes to the system. In addition, make sure the vendor remains available to you after rollout without further costs.

Stage 2: The Design Process

Recommendation 2.1. Get user feedback from a broad range of stakeholders. Create ample opportunities throughout the design process for workers and employers to try out features of your system, and tell you what makes sense to them and what doesn’t. There are others who will use the system frequently and should be involved in testing as well. They include labor unions, legal aid organizations, community groups, and other social service agencies. Be sure to compensate members of the public for their time and transportation costs when you invite them to participate in focus groups or other consultations.

Pennsylvania and Massachusetts Listen to Stakeholders

 

Two states have now created mechanisms for stakeholder involvement and feedback in these projects. In 2017, Pennsylvania’s legislature created the “Benefit Modernization Advisory Committee” in the bill that provided funding for Pennsylvania’s new project.53 The small committee consists of employer, labor, technologist, and claimant representatives, along with three agency staff members who will use the new system. By law, the committee:

 

– meets at least quarterly with project and agency leadership;

– receives monthly updates on the project;

– monitors the implementation and deployment of the project, providing feedback and both formal and informal recommendations; and

– submits a yearly report on the project’s progress and the committee’s recommendations to the legislature.

 

In 2020, Massachusetts’ legislature created a similar advisory council in the funding mechanism for upgrades to the modernized system that was deployed in 2013, and included the council in the bid selection process.54 The advisory council has responsibilities similar to those of Pennsylvania’s committee, but with broader stakeholder involvement; it also creates greater project transparency and requires feedback from vulnerable communities:

[T]he advisory council shall solicit input on the criteria utilized for the selection of the bid evaluation from low-wage unemployed workers, people with disabilities who use assistive technology, community-based organizations that advocate for people with limited English proficiency, people of color, recipients of unemployment benefits and individuals with technological expertise in systems designed to maximize user accessibility and inclusiveness.55

 

Recommendation 2.2. Allow plenty of “sandbox” time for agency staff. Create both structured and unstructured opportunities for staff to experiment with features of the system as it is being developed and recommend improvements.

Recommendation 2.3. Build in key features that help customers and reduce the burden on agency staff. While the precise design of your system should be guided by the needs of your customers and staff, there are a set of features that we recommend building into any system (and writing into your vendor contract).

2.3.1. Create a substantive, accessible claimant portal. Customers should be able to access a portal where they can perform all essential tasks: filing an initial claim, continuing claim, or appeal; checking on the status of a claim or appeal; completing fact-finding questionnaires; uploading documents and evidence; and reviewing correspondence.

2.3.2. Go for a professional look. Your website should present your agency as the professional operation that it is. The interface should look like the private sector websites customers are used to encountering; if it doesn’t, customers may be less likely to trust it, resulting in higher call center volume and more paper applications.

2.3.3. Make your website mobile-optimized. More people have mobile phones than desktop or laptop computers.56 Low-wage workers and workers of color are particularly likely to rely on their phones for Internet access.57 While more than 80 percent of white adults report owning a desktop or laptop, fewer than 60 percent of Black and Latinx adults do.58 Some states had already been planning to optimize their sites for mobile access before the COVID-19 pandemic struck; Connecticut, for example, quickly made that change afterwards. There’s no need to build an app; just make sure your website reformats automatically for mobile devices.

2.3.4. Design a sensible password reset process. At the start of the pandemic, a major source of delay in filing unemployment claims was the overwhelming number of people getting locked out of their accounts. Technology exists to implement secure password reset protocols that do not require the mailing of a new password or other action by agency staff. Using those protocols saves time and frustration for everyone.
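Below is a minimal sketch of what such a self-service reset flow can look like: a single-use, time-limited token is generated, only its hash is stored, and the raw token is delivered to the claimant by email or text as a reset link. The function names and the in-memory store are illustrative assumptions, not any particular state’s implementation.

```python
# Minimal sketch of a self-service, token-based password reset.
# The in-memory store and save_new_password_hash are hypothetical placeholders.
import hashlib
import secrets
from datetime import datetime, timedelta

RESET_TTL = timedelta(minutes=30)   # how long a reset token stays valid
_pending_resets = {}                # claimant_id -> (hashed token, expiry)

def save_new_password_hash(claimant_id: str, new_password: str) -> None:
    """Placeholder: hash and store the password in the agency's identity store."""
    ...

def start_reset(claimant_id: str) -> str:
    """Generate a single-use reset token; only its hash is kept server-side."""
    token = secrets.token_urlsafe(32)
    _pending_resets[claimant_id] = (
        hashlib.sha256(token.encode()).hexdigest(),
        datetime.utcnow() + RESET_TTL,
    )
    # The raw token is emailed or texted to the claimant as a reset link,
    # so no staff action and no mailed paper password is needed.
    return token

def complete_reset(claimant_id: str, token: str, new_password: str) -> bool:
    """Accept the new password only if the token matches and has not expired."""
    record = _pending_resets.get(claimant_id)
    if not record:
        return False
    hashed, expires = record
    if datetime.utcnow() > expires:
        del _pending_resets[claimant_id]
        return False
    if hashlib.sha256(token.encode()).hexdigest() != hashed:
        return False
    del _pending_resets[claimant_id]    # single use
    save_new_password_hash(claimant_id, new_password)
    return True
```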

2.3.5. Make online and mobile systems available 24/7. In an era when online commerce and banking happens at all hours, workers expect similar access to the UI system. Allowing claims to be filed at any time also reduces pressure on the system when demand surges, by spreading out the claims.

2.3.6. Automatically save incomplete applications, and provide a warning before timing out. Too many systems kick customers out before they have completed their claim. Auto-save can prevent them from having to start all over again if they leave their applications open on their screens for too long while searching for information. Providing a warning before a customer is timed out is an added safeguard, but doesn’t substitute for auto-save.
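As a minimal sketch, assuming a simple in-memory draft store and hypothetical function names (a real system would persist drafts to a database tied to the claimant’s account), auto-save and a pre-timeout warning can work as follows.

```python
# Minimal sketch of auto-saving partial applications and warning before timeout.
# The in-memory store and the timing constants are illustrative assumptions.
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)   # assumed session length
WARNING_BEFORE = timedelta(minutes=5)     # warn this long before timing out

_drafts = {}   # claimant_id -> {"answers": dict, "last_saved": datetime}

def autosave(claimant_id: str, answers: dict) -> None:
    """Called whenever a field changes or a page is completed."""
    _drafts[claimant_id] = {"answers": dict(answers), "last_saved": datetime.utcnow()}

def resume(claimant_id: str) -> dict:
    """Return saved answers so a claimant can pick up where they left off."""
    draft = _drafts.get(claimant_id)
    return draft["answers"] if draft else {}

def should_warn(last_activity: datetime) -> bool:
    """True once the session is within the warning window before timing out."""
    return datetime.utcnow() >= last_activity + SESSION_TIMEOUT - WARNING_BEFORE
```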

2.3.7. Allow customers to choose email or texting as a communications method. It is unrealistic to expect that customers will constantly log back into the system to check for updates. Push information out through email or text, and allow customers to choose which method works best for them. Minimize the use of mailed documents; if it is required by law, make that clear to customers.
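As a minimal sketch, routing each update through the claimant’s chosen channel can be as simple as the following; the send helpers are hypothetical placeholders for whatever email and text-messaging services the agency uses.

```python
# Minimal sketch of honoring a claimant's notification preference.
# send_email and send_text are hypothetical placeholders for agency services.
def send_email(address: str, message: str) -> None:
    ...  # call the agency's email service here

def send_text(number: str, message: str) -> None:
    ...  # call the agency's SMS service here

def notify(claimant: dict, message: str) -> None:
    """Push a status update through whichever channel the claimant chose."""
    if claimant.get("preferred_channel") == "text":
        send_text(claimant["phone"], message)
    else:
        send_email(claimant["email"], message)

notify(
    {"preferred_channel": "text", "phone": "555-0100", "email": "claimant@example.com"},
    "A determination has been issued on your claim. Log in to view the details.",
)
```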

2.3.8. Permit customers to email in or upload documents from a computer or mobile device. The system should be designed to allow photographs and scans of documents to be easily submitted. This is how mobile banking works; UI should be no different.

2.3.9. Avoid automated decision-making. Technology can streamline many aspects of UI, but allowing computers to make decisions is inconsistent with the requirements of due process. As a safeguard against AI-driven error, several of the states we studied required a staff member’s involvement before a claim could be denied.

2.3.10. Use plain language and smart questioning. Use simple, non-bureaucratic language. It may help to gather information from customers through a series of straightforward questions; a vendor we interviewed suggested thinking of the claims process as an interview, rather than a form to fill out. For example, many people use the terms “laid off” and “fired” interchangeably, so posing a series of simple questions about why they lost their job could provide better information and thus reduce errors.
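The sketch below illustrates the interview-style approach with a small branching question tree; the question wording and the separation categories are illustrative assumptions, not any state’s actual questionnaire.

```python
# Minimal sketch of an interview-style separation questionnaire.
# Question wording and category names are illustrative only.
QUESTIONS = {
    "start": {
        "text": "Are you still working for this employer in any way?",
        "answers": {"yes": "still_employed", "no": "who_ended"},
    },
    "who_ended": {
        "text": "Who decided that the job would end?",
        "answers": {"my employer": "employer_reason", "i did": "quit_reason"},
    },
    "employer_reason": {
        "text": "Did your employer say the job ended because there was not enough work?",
        "answers": {"yes": "LAYOFF", "no": "DISCHARGE"},
    },
    "quit_reason": {
        "text": "Did you leave because of a serious problem at work, such as unsafe conditions or unpaid wages?",
        "answers": {"yes": "QUIT_GOOD_CAUSE", "no": "QUIT_OTHER"},
    },
    "still_employed": {
        "text": "Have your hours been reduced?",
        "answers": {"yes": "PARTIAL_UNEMPLOYMENT", "no": "STILL_WORKING"},
    },
}

def classify(answers: dict) -> str:
    """Walk the question tree using the claimant's answers and return a category."""
    node = "start"
    while node in QUESTIONS:
        node = QUESTIONS[node]["answers"][answers[node]]
    return node   # a terminal category such as "LAYOFF" or "QUIT_GOOD_CAUSE"

# A worker who was "let go" because work dried up is routed to LAYOFF without
# ever having to choose between the terms "laid off" and "fired".
print(classify({"start": "no", "who_ended": "my employer", "employer_reason": "yes"}))
```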

2.3.11. Translate the application and other online materials into Spanish and other commonly spoken languages. Civil rights laws require translation of materials to ensure equal access. Translating materials also can save states money, by reducing demand for interpretive services at call centers. Washington provides an entirely Spanish-language version of its website and application, for example.

2.3.12. Minimize the paperwork burdens associated with work search. If your state directs claimants to register on a state government job-search website as part of their work search, look for a way to integrate the two systems. Provide consistent guidance about what is required and avoid unduly burdening claimants. You want people spending their time looking for work, not assembling extensive documentation that your system lacks the capacity to review. Rely instead on the Reemployment Services and Eligibility Assessment (RESEA) program for verification, as needed.

2.3.13. Coordinate technology with other state agencies. If your state plans to move to a single sign-on system, make sure the password mechanism you create can be easily integrated into that new system. If your state handles appeals in a centralized manner, make sure UI claimants can tap into information about their appeals through the UI portal.

2.3.14. Provide a view-only option for non-UI staff. Workers often go to career centers and other public agencies seeking help with UI. Constituent services representatives in legislators’ offices also receive UI inquiries. Allowing public employees to log in and see what’s happening with a claim—but not to make changes that interfere with UI processing—is a best practice that Pennsylvania has adopted.

Designing a Modern Website

 

Websites today include a variety of features customers are used to seeing, all of which would enhance the functioning of a UI website. They include:

– Chatbots (to answer frequently asked questions)

– Live chat (if you have the staff capacity)

– Calendaring

– Drop-down menus

– Progress bars (to track the steps in an application)

– Hover text (to pull up the definition of a term, for example)

– Select-all options (to file a batch appeal, for example)

– Expansive character limits for text fields

– Dark/dim versions

Stage 3: Implementation

Recommendation 3.1. Don’t go live during the busy season. It’s best to avoid the November–March period, so you are not struggling to adjust to a new system at a time when seasonal claims surge.

Recommendation 3.2. Consider rolling out pieces of the new system in stages. Going live with just one element, like the appeals process or Disaster Unemployment Assistance (DUA) claims, gives you a chance to see how the new system is working and make refinements, before everyone has to use it. Idaho and Washington have taken this approach.

Recommendation 3.3. Train and support your staff before going live, and on an ongoing basis. Modernization means making big changes, not only to your computer systems but likely to your business process as well. Your staff, including those on the front lines in call centers and career centers, should feel well prepared for those changes and supported throughout the transition. Maine and Wisconsin reported that call center locations that received the most practice with the new system were the best prepared for the rollout. Minnesota also provided guidance to its staff on how to respond in an empathetic manner to claimants struggling with their new system.

Recommendation 3.4. Staff up your call centers and deploy staff to career centers before going live. Call center usage spikes dramatically when a new system is rolled out, as does the number of people seeking assistance with UI at career centers. Minnesota put additional staff on the phones and Maine placed staff at career centers before going live with the new system, to help manage the demand. If you don’t anticipate being able to answer calls without substantial delays, add a call-back option, as Washington did.

Recommendation 3.5. Have a robust community engagement plan. Your rollout shouldn’t catch the community by surprise. Reach out in advance to stakeholders, educate them about the new system, and ask them to help spread the word. Tap into the same group you asked for help in testing your system before rollout (see Recommendation 2.1, above).

Recommendation 3.6. Expect lots of bugs and have a clear process in place to fix them. If you run into major problems, hold the claims, as Maine did, rather than adjudicating them, as Michigan did. Putting claims on hold during a system malfunction prevents a lot of hardship for workers, and unnecessary workload for staff who process appeals and reversals.

Recommendation 3.7. Provide for ongoing feedback from your customers and front-line staff. Washington, for example, used customer surveys to inform its decisions about business process changes. New Mexico did a particularly thorough job with the usability surveys it sent to claimants and employers. Be sure to dig deeper than just asking customers for their overall level of satisfaction with the experience. Provide the opportunity for feedback at every stage, not just at the end of a transaction, by which point customers may have forgotten exactly what language they found confusing or where they got stuck. Create a mechanism for staff to provide suggestions for improvements as well, and follow up in a timely manner.

Initial Lessons from States

In the early stages of the research for this report, the authors communicated with UI agency officials from a diverse group of about twenty states, of which over a dozen agreed to be interviewed to share the lessons learned from their experience.59 Specifically, the state officials were asked to share the features of their modernized systems they are most proud of, and what they would do differently if they were just starting the process. Most of the states interviewed had completed the transition to modernized benefits processing, and the remaining states were fairly far along in their development.

Highlights from these initial interviews are presented below, structured to follow the three stages of a modernization process: planning, design, and implementation.

1. Planning Lessons from State Interviews

  • Funding is key. As one state official emphatically put it, “FUNDING these projects is the greatest obstacle of all!” Many states relied primarily on the large, one-time infusion of flexible “Reed Act” funding resulting from the 2009 Recovery Act, while others (e.g., Colorado, Utah, and Washington) accessed dedicated state funding or special tax assessments to support their programs. Some agencies have sought out a specific funding mechanism. For example, Pennsylvania currently has an employee-side tax, and at the UI agency’s request, the legislature directed some of the revenue from that tax to fund its current modernization effort.
  • State consortia can be challenging to form, but useful. Most states reported serious challenges forming UI IT consortia due to restrictions on federal funding for such consortia, changing priorities of state political leaders, and other factors. As a result, many UI IT modernization efforts were significantly delayed or, in some cases, abandoned altogether. However, some states have valued and benefited from the formation of consortia to share UI IT infrastructure, expertise, and costs (including the Mississippi/Maine consortium and the Idaho/Vermont/North Dakota consortium).
  • Strong teams representing all functions should be involved in the planning process. Several states (e.g., Utah, Washington, Vermont) emphasized the importance of assembling strong teams that represent all the major functions of the agency to be involved in the planning, design, and implementation processes, and that develop expertise in the new systems. The team members should also play a central role in training front-line staff on the new system, both before the system is launched and on a continuous basis thereafter, while also regularly engaging the staff to solicit feedback on the system.

2. Design Lessons from State Interviews

  • New internal staffing structures and business practices may be needed. Some states (e.g., Minnesota, New Mexico) emphasized the importance of creating internal staffing structures to more efficiently and effectively process and adjudicate UI claims. These “unified integrated” systems break down traditional agency staffing silos (e.g., separate units that handle initial UI claims, adjudications, and overpayments) and instead deploy UI staff where they are most needed (e.g., responding by phone to resolve more complicated UI adjudication issues).
  • Investments in internal IT systems and expertise can be valuable. Several of the states interviewed (e.g., Iowa, Utah, Minnesota, Washington) have worked with commercial off-the-shelf technologies, developed open-source technologies, or invested significantly to develop internal IT staffing and expertise in order to rely less on established vendors and provide greater flexibility to manage their UI IT systems.
  • Data conversion should begin as early as possible. UI agency officials also recommended that states taking on new UI IT modernization efforts begin the process of data conversion from their legacy systems as early as possible and conduct extensive testing of the converted data.
  • Too little usability testing was done. With some exceptions, such as usability studies in New Mexico, the interviews with UI officials made clear that limited feedback has been solicited directly from workers or worker advocates to evaluate the usability of the new UI IT systems for all workers, but especially for the large proportion of unemployed workers whose first language is not English or who have limited computer literacy. Where outreach has been conducted, it has often taken place at the end of the planning and development process, when the opportunity to inform key decisions is more limited.

3. Implementation Lessons from State Interviews

  • Rolling out new systems is challenging and can take a long time. As one state official put it, UI IT modernization is a “roller coaster” ride, fraught with funding, vendor, staffing, and automation challenges. Many of the states interviewed started the process in the early 2000s, and only recently launched their automated benefits systems. Several states also experienced challenging launches of their new systems, which required major adjustments. Accordingly, most UI agency officials strongly advised that states provide for extensive lead time and testing of the technology (e.g., Wyoming tested 1,800 cases with staff before they were put in a live environment), organize the staff to ensure that “all hands are on deck” while rolling out the new system, and, perhaps most importantly, time the launch of the new system to low-volume periods (e.g., the summer months, when fewer people are applying for benefits).
  • Modernization can improve UI access. Uniformly, state UI officials emphasized the significant role that new automated and IT reforms have played in improving communication with UI claimants, with twenty-four-hour access in many cases to a range of online services and direct access to information on the history and status of the individual’s claim. Indeed, the rate of online initial and continued claims filing has increased in most states that modernized. Some states (e.g., Colorado and Mississippi) have provided mobile-ready platforms, which helps workers in more rural communities that do not have reliable broadband access. New Mexico has also developed “personas” to help the platform accommodate particular groups of workers seeking to navigate the online claims systems by anticipating their needs and providing them with “pop ups” and other features that provide clarifying information. Several states (e.g., New Mexico and Mississippi) are also providing or developing “single service sign-on” systems, which allow workers to readily access not just UI benefits, but also job service and reemployment services.
  • UI access “pain points” remain. Due to administrative funding limitations and other pressures, several states indicated that they are reducing access to phone-claims services, which have been critical to many workers who have more challenging claims or have experienced barriers to navigating online systems. Many states also reported that the new automated systems have generated a greater number of eligibility, disqualification, and overpayment issues, which often produce multiple unfavorable determinations that the claimant is required to respond to separately. Where possible, certain states have taken steps to intervene in a timely fashion in these cases, and at least one state (Vermont) has adopted a policy to merge these multiple determinations into a single notice and determination.

Case Study Findings

This section presents findings from detailed case studies of UI modernization in Maine, Minnesota, and Washington. All three states had completed UI IT modernization projects at the time of data collection. Their UI IT projects were generally regarded as successful in terms of current program outcomes or public response—though no modernization effort was without its unique challenges. They also represent states with different populations, economies, and labor forces, as shown in Table 2.

TABLE 2
Three Different Case Study States

State | Population | Unemployment Rate (2018) | Major Industries
Maine | 1.34 million (ranking 42nd nationwide) | 3.24% | Accommodation and Food Services; Retail Trade
Minnesota | 5.64 million (ranking 22nd nationwide) | 2.94% | Trade, Transportation and Utilities; Professional and Business Services; Manufacturing
Washington | 7.61 million (ranking 13th nationwide) | 4.46% | Retail Trade; Manufacturing; Accommodation and Food Services
Source: U.S. Census Bureau July 2019 population estimates; Bureau of Labor Statistics Local Area Unemployment Statistics seasonally adjusted unemployment rate, 2018; Maine Center for Workforce Research and Information; Minnesota Employment and Economic Development; Washington State Employment Security Department.

Maine, Minnesota, and Washington took different approaches to UI IT modernization. Minnesota was one of the earliest states to modernize its benefits system, which went live in 2007. Though the online platform—the Minnesota Unemployment Insurance (UI) Program—was created by private vendors BearingPoint and Deloitte, the code is the property of the state. Minnesota’s Department of Employment and Economic Development (DEED) maintains the code and shares it freely with other state agencies that request it. While building the online system, Minnesota’s UI agency was also engaged in updating its business practices, which it accomplished by reviewing call center management, scripts, and training, among other things.

Washington interviewed several technology vendors for their project. After conducting extensive market research, Washington’s Employment Security Department (ESD) ultimately negotiated a contract with Fast Enterprises that obligated the vendor to provide ongoing system support and maintenance for its commercial off-the-shelf (COTS) product. Though agency employees reported that they have a lot of oversight over the system, it is a proprietary system owned by Fast Enterprises. Unlike Minnesota, Washington’s modernization project, which was rolled out in 2017, included no operational change management.

Maine is the only case study state that is part of a consortium, meaning that it shares system and maintenance expenses with two other states (Mississippi and Rhode Island). The Maine–Mississippi–Rhode Island consortium—ReEmployUSA—uses software developed by Tata Consulting Services (TCS), a subsidiary of the multinational conglomerate Tata Group. Maine’s customized version of the system, called ReEmployME, was rolled out in 2017. Maine made reviewing agency business processes a feature of its modernization project. The consortium owns the supporting code, though TCS provides ongoing application support.

TABLE 3
Case Study Modernization Projects at a Glance

State | UI IT Program | Vendor(s) | Project Duration
Maine | ReEmployME (launched 2017) | Tata Consulting Services (TCS) | 2016–2017
Minnesota | Minnesota Unemployment Insurance (UI) Program (launched 2007) | BearingPoint, Deloitte | 2003–2007
Washington | eServices (launched 2017) | Fast Enterprises | 2015–2017

The case studies presented in this section provide a description of our research methods as well as our findings on each state’s UI IT project and its successes and challenges, with attention given to how modernization has impacted claimants.

Minnesota

Minnesota’s UI agency (the Department of Employment and Economic Development, or DEED) takes pride in its business philosophy and quality of services. It was the first state to modernize both its UI tax collection and benefit payment systems. Before the COVID crisis, Minnesota was one of the few states that regularly met the federal trust fund solvency standard. Furthermore, Minnesota does well on official performance measurements set by the Department of Labor—in 2018, Minnesota’s recipiency rate was ranked fifth highest and its average weekly benefit amount ($462.61) was ranked third highest among all state UI programs. Both measurements improved following modernization. Minnesota has been a national leader, developing and maintaining IT and claims processing systems that have been shared widely with other states. It ranks among the top states on major elements of administrative performance, such as first-payment timeliness, overpayments, nonmonetary time lapse, and nonmonetary quality, a fact the agency is proud of. The feedback provided in focus groups by worker advocates, summarized below, generated helpful recommendations to further improve upon the Minnesota system.

TABLE 4
Minnesota UI Indicators and Modernization

Indicator | 2006 (Before Modernization) | Rank | 2008 (Just After Modernization) | Rank | 2018 (Now) | Rank
Overpayment rate | 11.70% | 15 | 10.70% | 21 | 6.50% | 44
Total denials as a percent of claims | 31.50% | 23 | 30.80% | 15 | 48% | 21
Nonmonetary determination separation quality | 68.70% | 25 | 76.70% | 32 | 84.80% | 11
First payment timeliness | 88.60% | 39 | 88.00% | 30 | 93.20% | 9
Nonmonetary determination timeliness | 82.00% | 23 | 75.90% | 19 | 88.40% | 14
Percent online claims (initial) | 16.00% | 3 | 30.40% | 12 | 87.90% | 15
Note: Indicators that have declined since modernization are shaded dark gray.

Planning in Minnesota

Minnesota began planning its Unemployment Insurance Technology Initiative Project (UITIP) in 2003. Prior to UITIP, unemployment insurance was a paper-based process. During interviews, agency officials indicated that evaluating business practices and eliminating system complexities were compatible project goals.

The hallmark of the modernization project was a triage system to better manage call center operations. Minnesota’s business reorganization was intended to ensure that the majority of claims could be processed efficiently with minimal staff time, freeing staff to spend time on claims with more difficult issues. Under the triage system, call center staff are trained to handle common issues while transferring any exceptional scenarios to a smaller group of subject matter experts (SMEs). Project leaders observed call center staff and analyzed existing datasets to identify successful practices that could be built into the online system. These operational adjustments helped reduce work backlogs and call center wait times. Even during the Great Recession, wait times were less than three minutes, on average.
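
The routing rule at the heart of a triage model is simple; the sketch below (hypothetical issue codes, not DEED's actual categories) illustrates the pattern of keeping common questions with generalists and escalating exceptions to SMEs:

    # Illustrative sketch only; issue codes are hypothetical.
    COMMON_ISSUES = {"password_reset", "payment_status", "how_to_file_weekly_claim"}


    def route_call(issue_code: str) -> str:
        """Common issues stay with generalists; exceptions go to subject matter experts."""
        if issue_code in COMMON_ISSUES:
            return "generalist"
        return "subject_matter_expert"


    print(route_call("password_reset"))        # generalist
    print(route_call("wage_record_dispute"))   # subject_matter_expert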

Minnesota had one of the quickest modernization processes and was able to roll out its system after sixteen months. In retrospect, the agency believed that having the project run by UI experts, rather than technology experts, was critical to its success.

The code that powers Minnesota’s online UI system was developed by third-party vendors. The state purchased the code so that it would have full ownership and control to make any necessary or desired modifications.

Stakeholder Feedback on Planning in Minnesota
  • Stakeholders were not really consulted during the planning stages of the project. While Minnesota has an Unemployment Insurance Advisory Council, it had grown to over forty members around the time of modernization and was not engaged with the modernization project.

Design in Minnesota

Minnesota’s online application was designed with common scenarios in mind. Rules-based routing populates relevant questionnaires for applicants, expediting fact-finding stages. The system automatically detects and flags issues that require further fact-finding and generates relevant questionnaires for the applicant. The design approach from DEED was to “meet claimants where they are” rather than having to chase them down for information later. DEED reviewed prior claims data to determine the decision trees and drop-down options in the application and questionnaires. For weekly work search questions, DEED consulted with an outside academic to identify which questions actually cut to the heart of what it means to “search for work.”
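
As a rough illustration of rules-based routing (a sketch with hypothetical rule and questionnaire names, not DEED's actual rules), a small rules table can map facts reported in the application to the follow-up questionnaires that should be generated:

    # Illustrative sketch only; conditions and questionnaire names are hypothetical.
    RULES = [
        # (condition on the application data, questionnaire to generate)
        (lambda app: app.get("separation_reason") == "discharge", "misconduct_questionnaire"),
        (lambda app: app.get("separation_reason") == "quit", "voluntary_quit_questionnaire"),
        (lambda app: app.get("receiving_pension", False), "pension_offset_questionnaire"),
    ]


    def questionnaires_for(application: dict) -> list:
        """Return every questionnaire whose triggering condition is met."""
        return [form for condition, form in RULES if condition(application)]


    # A discharge combined with a pension triggers two questionnaires up front,
    # so fact-finding can begin without chasing the claimant later.
    print(questionnaires_for({"separation_reason": "discharge", "receiving_pension": True}))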

The online self-service options available to claimants were completely new, since no online access was available prior to the project. At the time of our interview, the online system had a forty-five-minute “time-out” for initial applications, with limited autosave functionality. Claimants and employers, or employer representatives, are able to file claims as well as appeals online. They can also pick their hearing date and time, request interpretation, add witnesses, and add representatives. However, while prior to the UITIP, parties could file a single appeal that covered several issues, the new online appeals system requires a separate appeal for every issue. The system does not permit a party to select multiple issues to be addressed in a single appeal. While DEED still attempts to “batch” many of these appeals on the back end, that process is not automated and there is no way for parties to mark multiple appeals as “related” to ensure batching.
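
One common mitigation for the time-out problem described above is to save each answer as a draft as it is entered, so an expired session does not erase the claimant's work. The sketch below (hypothetical storage and field names; not Minnesota's implementation) shows the basic idea:

    # Illustrative sketch only; an in-memory stand-in for real draft storage.
    import time


    class DraftStore:
        def __init__(self):
            self._drafts = {}  # (claimant_id, field_name) -> (value, saved_at)

        def save(self, claimant_id: str, field_name: str, value: str) -> None:
            """Persist a single answer as soon as it is entered."""
            self._drafts[(claimant_id, field_name)] = (value, time.time())

        def restore(self, claimant_id: str) -> dict:
            """Rebuild the partially completed form after a session time-out."""
            return {f: v for (cid, f), (v, _) in self._drafts.items() if cid == claimant_id}


    store = DraftStore()
    store.save("C-1001", "last_employer", "Acme Staffing")
    print(store.restore("C-1001"))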

All credibility determinations are made by DEED staff, and no overpayment determinations are made without an actual person involved. The backend functionality “pushes” flagged issues to adjudicators, with the oldest issue flagged as top priority, to help streamline claim processing. Still, the system primarily relies on the ability of claimants to answer detailed questionnaires, which were written at a high literacy level.
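
The "oldest issue first" behavior is essentially a priority queue; a minimal sketch (hypothetical fields, not DEED's code) looks like this:

    # Illustrative sketch only; fields are hypothetical.
    import heapq
    from dataclasses import dataclass, field


    @dataclass(order=True)
    class FlaggedIssue:
        flagged_at: float                     # earlier timestamps sort first
        claim_id: str = field(compare=False)
        issue_type: str = field(compare=False)


    class AdjudicationQueue:
        def __init__(self):
            self._heap = []

        def push(self, issue: FlaggedIssue) -> None:
            heapq.heappush(self._heap, issue)

        def next_for_adjudicator(self) -> FlaggedIssue:
            """Hand the adjudicator the oldest flagged issue."""
            return heapq.heappop(self._heap)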

The two-stage design project included user testing. During the first phase, the agency assembled focus groups composed of employer customers and third-party administrators. These user groups were shown prototypes of the self-service system and asked for feedback. The second phase invited employers, third-party administrators (TPAs), and current UI applicants to participate in focus group discussions. Feedback from the focus groups helped the agency make improvements to website navigation and wording prior to the official launch.60

Stakeholder Feedback on Design in Minnesota
  • Hours were limited. Focus group participants were frustrated by the website’s limited service hours. While most Internet users expect 24/7 access to online services, Minnesota’s online UI system is only available from 6:00 AM to 6:00 PM, Monday through Friday. Claimants explained that they found this timeframe inconvenient, especially when they were also actively looking for work.
  • Navigation was difficult. Workers also noted that navigating the website was challenging, given the text-heavy design of many of the webpages. The online application generated long questionnaires that required close and careful reading, often resulting in claimants “timing out” of the system and losing their progress. Most participants revealed that they resorted to drafting their answers in a word-processing application to copy and paste into the application.
  • There was no advocate access point. Legal aid attorneys reported that the system was designed without an online access point through which representatives could file appeals or review documents to help claimants understand agency communications (unlike employer accounts, which have an access point for TPAs). Having an access point into the system would legitimize their role as claimant representatives, one attorney said, and make the new system more transparent for claimants and advocates alike.

Implementation in Minnesota

UITIP went live in 2007, and despite its careful planning, Minnesota immediately experienced several issues with the new system. First, the website was rolled out prior to the 2008 recession, and during the winter months, when many seasonal workers were filing. The agency was down to just thirty full-time employees at that time, which was inadequate for handling the volume of applications received. The call center was inundated with panicked questions about the online application, many of them from users who had forgotten their passwords or been locked out of their accounts. Additionally, staff were struggling to adjust to the new triage system. Some call center staff still treated UI as a “case system” in which every call warranted close, personal examination. Still others treated phone assignments as punishment.

Minnesota’s UI IT modernization project was delivered on time and on budget, though it took time and training for the new program to achieve its high performance metrics post-modernization. When we interviewed DEED staff, approximately 90 percent of UI applications were received online. However, agency leadership affirmed that the core of their UI operations would always be the call center. DEED operates the only dedicated call center training room within the department. New call center workers receive empathy training and attend emotional intelligence seminars to improve customer service prior to taking calls. Training also prepares staff to handle technological questions, as they use the administrative version of the online application for data entry. Call center workers can even see what information claimants have input into the system to assist them with their application. Moreover, the state thinks of their triage system as a model for other states. Along with computer modernization, the state put in a new business process in which calls are first answered by generalists who can answer basic questions, and then are routed to different staff members who are specialists for more difficult questions about benefits and legal issues. DEED staff see this as an adoption of call center models used in the private sector, and see themselves as being in the business of providing service.

However, the original build tied the frontend presentation layer—the user interface—closely to the backend business logic, which prevents the agency from making significant changes to what the customer sees or even offering different language versions of the online application, without changing the code for the entire system. It also prevents changes that would improve the visual appeal of the application and self-service options. DEED is planning to separate the presentation layer so that it can make such changes in the future. Since go-live, DEED has created Spanish, Hmong, and Somali language videos to help claimants through the filing process, but acknowledges that these resources are not a satisfactory replacement for supporting multiple other-language versions of the online application.
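
The remedy DEED is planning, separating the presentation layer from the business logic, can be illustrated with a minimal sketch (hypothetical rule and wording): the core logic returns language-neutral result codes, and the presentation layer owns the screens and translations, so either can change without touching the other:

    # Illustrative sketch only; the rule and the messages are hypothetical.
    def weekly_eligibility(earnings: float, weekly_benefit: float) -> str:
        """Business logic: returns a result code, never user-facing text."""
        if earnings >= weekly_benefit:
            return "INELIGIBLE_EARNINGS"
        return "ELIGIBLE"


    MESSAGES = {
        "en": {"ELIGIBLE": "You are eligible for this week.",
               "INELIGIBLE_EARNINGS": "Your earnings were too high for this week."},
        "es": {"ELIGIBLE": "Usted es elegible para esta semana.",
               "INELIGIBLE_EARNINGS": "Sus ingresos fueron demasiado altos esta semana."},
    }


    def render(code: str, language: str = "en") -> str:
        """Presentation layer: maps result codes to text in the claimant's language."""
        return MESSAGES[language][code]


    print(render(weekly_eligibility(earnings=100.0, weekly_benefit=350.0), "es"))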

Stakeholder Feedback on Implementation in Minnesota
  • System is still paper-based. Focus group participants were surprised by the amount of mailed paperwork they still received given the introduction of a fully online option. Even though the new system allows applicants to file entirely online, questionnaires can generate issues that require further information from applicants, which must be mailed or faxed to the agency. Furthermore, the system does not allow claimants to attach supporting documents to their online application or to send them via email.
One Minneapolis focus group participant explained that by using the online application, she expected all agency communications to be delivered electronically. She was used to checking her online account for new messages. Consequently, she overlooked a mailed notice informing her to attend a mandatory workforce seminar. She missed the appointment and received an electronic communication shortly thereafter announcing that she was no longer eligible for unemployment benefits. She was only able to reinstate her benefits after filing an appeal.
  • Password reset problems occurred. One of the inevitable consequences of most UI modernization projects—a spike in technical support questions related to resetting passwords—continues to be handled by DEED entirely by postal mail. Claimants who get locked out of their online accounts must call DEED to request a password reset. During focus groups, participants who experienced this issue explained that they lost a week of potential benefits while waiting for their new password to arrive by mail.
  • Workers were generally pleased with the website access and call center help. Participants acknowledged that while Minnesota’s online system replicates certain pre-modernization access barriers, the website is convenient—especially for younger workers. One participant explained that he finished his online application in less than ten minutes. He said that most of the questions were easy to understand, even if they weren’t designed to address his particular situation. Focus group participants generally were pleased with DEED’s call center. They reported that call times could occasionally be long, but most of the staff were considerate and empathetic.
  • Triage system was imperfect. Though call center support was expedient, it was not always helpful. A Minneapolis participant explained that it was difficult to get a “straight answer” to a “straightforward question” about his eligibility. Other participants echoed this sentiment, admitting that certain call center staff didn’t have the authority to answer any complex inquiries—an actual feature of the triage system. Another Minneapolis participant suggested that call center workers weren’t “worried about giving me the wrong answer,” but “worried about giving me an answer that I’m going to use later [during] an appeal.”
  • Claimants experienced increased disqualification and appeal problems. Legal aid attorneys raised concerns about the new appeals system and the requirement that claimants file separate appeals for each individual issue. The modernization project resulted in an increase in determinations and disqualifications for their clients, especially concerning wage reporting and continuing eligibility requirements. Attorneys reported that clients often missed appealing an issue and did not understand separate appeals were necessary. The new online system also failed to clearly demarcate at what point an appeal was considered “filed,” and many claimants missed going to the final screen and therefore never timely filed their appeals.
  • Department was responsive to identified problems. Legal services attorneys reported that technical issues they identified in the system after it went live did lead to later positive changes. For example, a prominent button that would dismiss existing appeals was fixed to prevent claimants from clicking it in error, and a notification was added to alert applicants about the thirty-minute time-out window.
  • System was not built for all workers. Legal aid attorneys explained that they represented some of the most vulnerable worker populations, and that their clients experienced language and technological barriers, as well as complex appeals, in the new system. Post-modernization, it became more common to address a client’s multiple issues across multiple hearings (in the past, all issues could usually be resolved in one hearing).

Washington

With funding provided by the state, Washington’s UI agency (the Employment Security Department, or ESD) has made substantial improvements to a system that, like other states’, struggled when it was first launched in 2017. Agency officials actively engaged and trained ESD staff at all levels to continuously upgrade the new system, and collected extensive data to closely monitor its performance. Washington also offers a broad range of user self-service tools to help workers navigate the system, and a website with more design features than the others we studied. Like Minnesota, Washington consistently does well on key measures of performance—in 2018, Washington’s recipiency rate was ranked thirteenth highest among all the states, and its average weekly benefit ($467) was ranked fourth highest among all the state UI programs. It also maintained a trust fund balance that exceeded the federal solvency standard before the COVID-19 pandemic hit. Washington’s worker advocates have also been exceptionally effective, and together with the workers who participated in the project’s focus groups, they identified key priorities to ensure that the state’s most vulnerable workers can readily access UI benefits.

TABLE 5
Washington UI Indicators and Modernization

Indicator | 2016 (Before Modernization) | Rank | 2017 (Just After Modernization) | Rank | 2018 (Now) | Rank
Overpayment rate | 13.60% | 13 | 8.70% | 34 | 19.30% | 8
Total denials as a percent of claims | 24.00% | 42 | 45.00% | 24 | 46.00% | 25
First payment timeliness | 85.50% | 35 | 72.60% | 50 | 81.80% | 46
Nonmonetary quality | 78.80% | 25 | 65.60% | 40 | 59.80% | 47
Nonmonetary determination timeliness | 50.70% | 49 | 52.10% | 52 | 71.30% | 39
Percent online claims (initial) | 58.20% | 32 | 70.60% | 28 | 64.90% | 32
Note: Indicators that have declined since modernization are shaded dark gray.

Washington’s performance, as compared to other states, has been impacted by modernization. The agency experienced a dip in the quality of nonmonetary determinations, falling from above the national standard of 75 percent to below it after modernization. The state agency attributed the problem to an overreliance on the computerized tools and insufficient proactive fact-finding, an issue it is addressing through training; the agency reported seeing improved results beyond the study period. As will be described below, most states, including Washington, experienced an increase in the denial rate after modernization, and Washington went from one of the states least likely to issue a denial to an average state on denial rates. In the agency’s opinion, this is because claims were more likely to be decided accurately the first time, rather than having to be reworked and redone. That said, modernization has also coincided with tighter monitoring of work-search activities, which is the underlying source of this increase in denials.

Planning in Washington

In 2010, ESD requested state funds to help leverage existing federal funds for modernization. The legacy mainframe system, called GUIDE, required thousands of patches and several ancillary systems to manage added agency functions, like the Reemployment Trade Adjustment Assistance (RTAA) program. It was a difficult request to make during an economic recession, but the state legislature was responsive and made a significant investment in improving the system. To its credit, the legislature has continued to invest state dollars in the administration and IT needs of the program.

ESD issued a request for proposals aimed at procuring a commercial off-the-shelf (COTS) product to support their UI IT project. After interviewing six vendors, ESD and Washington’s Attorney General signed a contract with Fast Enterprises. Agency officials were satisfied with the terms of the contract. They reported that they enlisted several experts from Washington’s many technology companies to help with the gap analysis, feasibility study, and training prior to contract negotiations. Consequently, ESD had full control over the go-live date.

The contract also obligated Fast Enterprises to provide system maintenance and security. Fast Enterprises sent their own staff to support ESD during and after the unemployment tax and benefits (UTAB) modernization project. ESD reported being pleased with the support provided by the Fast Enterprises staff, and viewed their involvement positively. Unsurprisingly, our interviews at ESD included several Fast Enterprises employees now working on the twelfth version of the system. ESD found it difficult to keep high quality developers in house and up to speed with FAST, especially given the private sector competition for programmers in the state.

The Fast Enterprises core product had previously been implemented in Michigan. To learn and improve upon Michigan’s experience, ESD project members spent time speaking with Michigan officials about pain points and lessons learned.

As part of the project planning team, ESD pulled staff from all of its units to make sure it had the right people at the table, especially those who would be on the front line of the system implementation. One takeaway from ESD was that “if you do not feel the pain of missing those workers in your daily operation of the system, you did not choose the right people for the project.” ESD also learned that it was best to only make major decisions with a large team while authorizing some project team members to make smaller decisions. ESD also engaged some users in testing of the system through its local workforce centers toward the end of the project, but found that at that point it was too late to make any real changes to the product prior to implementation.

Stakeholder Feedback on Planning in Washington

As in Minnesota, worker advocates in Washington were mostly unaware of the benefit modernization project until implementation. Employers were included in the project only through some initial testing, when ESD brought sixty employers in for beta testing to help identify easy fixes to the system.

Design in Washington

Fast Enterprises already had a core product, so it was able to code the core architecture changes for UTAB quickly. However, like any commercial off-the-shelf product, this meant that the core design of the system was outside of Washington’s control and there was little leeway to make customizations.

One of the highlights of the UTAB design is the range of user self-service tools, which allow claimants more advanced access than the mainframe did. For example, prior to UTAB, if a claimant missed a weekly certification, they could only reopen the claim by speaking with ESD staff on the phone. With UTAB, claimants can now reopen their claims online and are able to file for up to four weeks of backdating (a simple sketch of that backdating check follows the list below). Additional highlights include:

  • Claimants can send messages through the online portal to ESD, and can attach documentation to be added to their claim.
  • Claimants can file appeals through the system, and the system will pre-fill information based on previous data entries.
  • All determinations are available to view online.
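
For illustration, the backdating rule mentioned above can be expressed as a short validation check (a sketch with hypothetical names; it is not the actual UTAB logic):

    # Illustrative sketch only; names are hypothetical.
    from datetime import date, timedelta

    MAX_BACKDATE_WEEKS = 4


    def can_reopen_online(requested_start: date, today: date) -> bool:
        """Allow online reopening if the requested start date is no more than
        four weeks in the past and not in the future."""
        earliest_allowed = today - timedelta(weeks=MAX_BACKDATE_WEEKS)
        return earliest_allowed <= requested_start <= today


    print(can_reopen_online(date(2020, 1, 6), today=date(2020, 1, 27)))  # True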

At the time of our interviews, all elements of UTAB were mobile responsive except for the initial claim application. The initial application has a fifteen-minute auto logout and no autosave feature, although claimants can save their application at the end of each page. Unlike most other systems, the initial application does not ask for significant separation information. Instead, after the application is filed, the system may flag a separation issue and then add a questionnaire to the eServices portal.

One major design change with the initial system was a change to the notices of determination. Prior to UTAB, all determinations were free form, and while there were some templates, adjudicators had full control over the content. Adjudicators commented that the new system “took a lot of thinking” out of the determinations and instead put in place a check-box system. Adjudicators indicated that independent analysis of the facts felt more difficult and that they were limited in what would appear on the determination because they could not write in a free-form manner. A state court later ruled these determinations insufficient to meet due process requirements, and ESD updated its system, although adjudicators still lack much of the flexibility they had prior to UTAB.

However, after reviewing the implementation of the FAST system in Michigan, Washington was extra careful about integrating any automated decision-making into its system. For fraud, every decision is evaluated by a human representative. While certain types of eligibility issues can “go presumptive” based on the burden of proof if one party does not respond, the only automated decision-making occurs with weekly job search reporting. This may explain the substantial increase in disqualifications for work search in the state: the denial rate for these reasons more than doubled after modernization, from 2 percent of all weeks claimed to 5 percent.
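
The "go presumptive" rule described above can be illustrated with a short sketch (hypothetical parameters; not ESD's actual rule), in which an issue is decided without a human only when the party carrying the burden of proof fails to respond by the deadline:

    # Illustrative sketch only; parameters and outcomes are hypothetical.
    from datetime import date


    def resolve_issue(burden_on: str, employer_responded: bool, claimant_responded: bool,
                      deadline: date, today: date) -> str:
        if today <= deadline:
            return "waiting for responses"
        if burden_on == "employer" and not employer_responded:
            return "presumptive decision in claimant's favor"
        if burden_on == "claimant" and not claimant_responded:
            return "presumptive decision against claimant"
        return "route to adjudicator for fact-finding"


    print(resolve_issue("employer", employer_responded=False, claimant_responded=True,
                        deadline=date(2020, 2, 1), today=date(2020, 2, 10)))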

Stakeholder Feedback on Design in Washington
  • Online services reviews. Workers had mixed responses to UTAB and the new online services (which the state calls eServices). Some workers focused on the following advantages:
    • Many of the Seattle workers explained that it was helpful to see all their information listed out in one place, a record of ESD communications, and the amount of benefits money still available to them.
    • Workers found it helpful to be able to appeal in the same system where they saw their determinations.
  Others struggled to navigate eServices:
    • Many workers struggled with the system timeouts. For appeals, workers said when they timed out they lost everything they had written. For weekly claims, they often timed out while trying to answer the weekly work search questions, which were quite long and cumbersome. If they did not affirmatively hit save somewhere in the claim, they would lose everything they had entered.
    • Workers reported that the eServices portal was not intuitive and many encountered problems trying to find their determinations or other important information.
    • Workers noted that sometimes there were multiple claim years available on eServices, but that it was difficult to navigate between them and find the current information.
  • Claimant advocates had no access to eServices. As in Minnesota, claimant advocates were frustrated by their lack of access to the system, which made it difficult for them to see eligibility decisions or help claimants navigate the problems identified above.
  • ESD determinations were unclear. The change in determination structure had a major impact on workers. Workers reported receiving one-line decisions telling them they had been disqualified for benefits, but without any context or description of the facts. Some stated they did not even know what issue was involved in their disqualification. Advocates reported similar problems, and said that administrative law judges could not rule on the determinations in many of these cases due to the lack of information.
  • Out of eleven focus group participants in Sunnyside, Washington, only one was able to use eServices. The migrant workers primarily used the telephone services to contact ESD. Most did not have access to a computer and could not navigate eServices on their phones. A few who tried could not manage to finish the initial application online and had to call. Despite this, workers reported feeling continuously pressured by ESD to use eServices.

Implementation in Washington

Project development was delayed, and the prospect of system changes created some frustration and concerns about job security among staff. Because the agency was not satisfied with the testing and system readiness, the original launch date had to be reset multiple times. Washington’s UI website was ultimately launched in January 2017.

As in Minnesota, Washington’s go-live came during a period of higher unemployment claims related to seasonal industries. Agency staff present at the time recall severe performance issues related to the launch, reflected in Washington’s official performance statistics from 2017. That year, the state was only able to deliver initial payments to 72.6 percent of eligible claimants within twenty-one days—short of the 87 percent federal standard. By 2018, the state’s performance had improved to 81.8 percent.

One of the primary issues during the launch was system overload. Call volume spiked dramatically, resulting in only about a 20 percent chance that a call would connect. Panicked claimants experienced busy signals, dropped calls, and system error messages due to higher-than-expected web traffic.61 Although the rollout was marred by public complaints, ESD and Fast Enterprises staff recalled how the launch helped them redouble their efforts to make the project a success. The agency hired thirty more agents to answer the phones. Even with minimal training, they were able to meet agency benchmarks for call handling times 93 percent of the time and cut wait times by 67 percent. By 2018, the call center connection rate had surged to 99 percent. Washington stood out as a state that collected a large amount of data about the functioning of its modernized systems, and used that data to make improvements.

Since go-live, ESD has expanded telephone hours to increase access and has done empathy training with all of its staff. ESD has also partnered with local workforce centers and twenty-three of these centers across the state have ESD staff placed on site. ESD explained that many of the (primarily migrant) agricultural workers in the center of the state had limited computer skills and relied on workforce staff for assistance with their UI claims.

A 2019 claimant survey that garnered over 16,000 responses found that 85 percent of claimants were using eServices and 62 percent were using the call center. Using the analytical tools within eServices, staff analyzed whether they could safely reduce support for call center operations; specifically, their interactive voice response (IVR) system (an automated routing system for filing weekly claims by telephone). With approximately 33 percent of claimants using the IVR system to file weekly claims, ESD felt that transitioning away from these services would negatively affect claimants and the state’s recipiency rate.

One other major learning experience from ESD implementation was around staff training on the new system. Fast Enterprises had done limited “just-in-time” training prior to the original go-live date, and because the launch was then delayed twice, staff were underprepared by the time the system actually went live. Furthermore, the system training environment didn’t use actual data or bring together all elements of the system in a holistic way. During our interviews, ESD staff recognized that they had not prioritized training during contract negotiations. As a result, ESD had to significantly increase its own staff training and engagement. ESD staff recommend that states embed training staff in the development phases of the project, host expanded morning Q&A sessions during implementation, provide space for training in live environments, and plan for retraining three to six months after implementation.

While the state did have a usability vendor, the agency reported not having enough time to implement many of its suggestions. ESD made efforts to gather feedback on its system after implementation. A feedback form embedded within the eServices portal has led to further operational tweaks and a dramatic reduction in claimant complaints.

Stakeholder Feedback on Implementation in Washington
  • Claimants could not get through on the phone. Workers confirmed that after go-live the phone systems were overwhelmed. Many experienced long wait times or could not get through. Advocates reported that their clients who lacked access to computers, were not technologically literate, or had language barriers were unable to file or navigate their unemployment cases when they could not get through on the phones. However, all focus group participants said they still had to rely on the call center even with the addition of eServices. For most participants, the call center was their primary means of securing UI benefits and a critical resource for when they had questions or issues.
  • The Office of Administrative Hearings (OAH) did not modernize. In Washington, all administrative appeals are centralized in OAH, which is separate from ESD. OAH did not have a modernization project, so although customers received electronic notices from ESD, they only received paper notices from OAH. This led to many workers missing their hearings because they did not know to look for mailed notices. It also caused issues on the back end. Advocates reported much longer wait times for benefits to be released after winning a hearing with OAH, and that instead of having one hearing scheduled to cover multiple appeals for a single client, there were multiple hearings scheduled. ESD plans to align its processes more closely with OAH in the future, and electronic delivery of OAH notices is being turned on as of 2020.
  • Migrant workers lacked trust in the system. Many of the Sunnyside focus group participants’ first or only language was Spanish. While noting that eServices was available in Spanish, the workers overwhelmingly preferred filing by phone and always called ESD with their questions. However, their interactions with ESD representatives often made them feel like ESD was looking for ways to disqualify them or accuse them of fraud. This was especially true for work search. Given the limited number of warehouses in their town, many workers had no way to report three work searches per week. They were also afraid that if they applied to other, less well-paid work, they would have to accept an offer and miss their opportunity to return to their warehouse job when recalled.

Maine

Maine’s Bureau of Unemployment Compensation completed its benefits modernization project in 2017. The agency is part of the ReEmployUSA consortium, which uses software developed by Tata Consulting Services (TCS). Maine’s UI benefits website—ReEmployME—had a difficult launch that received significant public attention. However, recent changes in leadership have led to a renewed openness and focus on enhancing access to services and improving the system to be more user-friendly, as reflected in the state’s participation as a case study in this project. For example, to expand access to the system, the new leadership prioritized increasing the staff in the state’s local one-stop career centers to provide increased direct, in-person services to workers. They also reengaged with key stakeholder groups representing workers, reestablishing crucial lines of communication. The state’s UI program remains above the national average in its recipiency rate (ranked twenty-first among all states in 2018) and is one of the only states that saw benefit denial rates drop after modernization. In 2019, the state’s average weekly benefit ($351) replaced 52 percent of the state’s average weekly wage, which was above the national average of 45 percent. The worker advocates and claimants who participated in the focus groups raised several access concerns, which are summarized below, while applauding the direct service they received when they could reach the agency by phone.

TABLE 6
Maine UI Indicators and Modernization

Indicator | 2016 (Before Modernization) | Rank | 2017 (Just After Modernization) | Rank
Overpayment rate | 12.9% | 17 | 5.4% | 44
Total denials as a percent of claims | 45.3% | 20 | 36.3% | 29
Nonmonetary determination separation quality | 84.9% | 11 | 76.2% | 30
First payment timeliness | 91.9% | 13 | 81.9% | 45
Nonmonetary determination timeliness | 86.3% | 15 | 79.2% | 30
Percent online claims (initial) | 50.1% | 41 | 69.6% | 28
Note: Indicators that have declined since modernization are shaded dark gray.

Table 6 reveals other positive impacts following benefits modernization—notably, the state’s overpayment rate went from one of the highest in the nation (seventeenth) prior to modernization to one of the lowest (forty-fourth). Despite these successes, Maine has struggled to improve its rankings and meet national performance standards post-modernization.

Planning in Maine

After Maine moved its system off the mainframe in 2000, the state began planning for modernization of its remaining legacy design in 2013. The Bureau of Unemployment Compensation decided to proceed with the consortium model, noting that it would require fewer staff and funding resources, and appreciating the shared governance and systemwide updates. Maine joined Mississippi’s ReEmployUSA consortium for a system designed by TCS.

The original ReEmployUSA consortium states included Mississippi, Maine, and Rhode Island; Connecticut and Oklahoma will join the consortium but are still in project development.62 Mississippi went live with its system in early 2017 and led the product development as the original user, having received a grant several years earlier to convert to an online, rules-based system. The U.S. Department of Labor awarded the consortium a $90 million development grant to help Maine and Rhode Island integrate Mississippi’s UI IT system framework under Mississippi’s leadership.63

The ReEmployUSA consortium is cost-effective because it leverages existing technology and eliminates the need for state agencies to design their online platform from scratch. Maine was also able to operate through Mississippi’s procurement process, and all steps were reviewed by legal counsel for both states. The consortium then developed a Memorandum of Understanding for the development stage of the project, and the consortium itself owns the code for the project. Mississippi also sent some of its UI staff to Maine to assist with the project and provide feedback on the Mississippi experience.

Stakeholder Feedback on Planning in Maine

As with the other case study projects, no nongovernmental stakeholders were included in planning the system before it was launched in 2017.

Design in Maine

The consortium model comes with certain limits on design. Similar to a commercial off-the-shelf (COTS) system, many of the core components were predetermined by the Mississippi system, which serves as the “core product.” Maine did perform a gap analysis of Maine and Mississippi law to determine any changes that were necessary to conform to Maine’s UI law. Maine found that there was an 85 percent code match for benefits and appeals, but the gap analysis took more time than was originally expected.

In large part because Maine’s core system was developed in the late 2000s as part of the Mississippi project, the presentation layer does not reflect the modern design and dynamism of Washington’s system, which went live around the same time. Neither the initial application nor the weekly certification systems are mobile-responsive. When the system went live, it also had a five-minute timeout for both types of filing, and only some autosave features for the weekly certifications. However, the weekly certifications are designed to make wage reporting easy to understand, which is a frequent challenge for many claimants.

The portal provides a way for parties to view determinations and appeal online. The portal also includes an internal messaging system. While there is no way in the portal to upload documentation, UI claim examiners often provide their email address so that claimants can send them documents.

There is no automated decision-making in Maine’s system design. Maine’s unemployment law requires scheduled fact-finding by telephone on potentially disqualifying issues. Fact-finding notices are still mailed and provide parties with a scheduled time for the interview.

Stakeholder Feedback on Design in Maine
  • Smartphone usability. While Maine provides for smartphone access, which is a positive feature of the system, the focus groups reported that navigating ReEmployME on smartphones was difficult, especially when trying to access different tabs or parts of the online portal.
  • The online system was not user-friendly, regardless of access point. While some users, mostly younger or technology-savvy workers, were able to navigate the system, many claimants noted the following problems that prevented them from smoothly using the online system:
    • Many reported that the website often crashed or would be unavailable.
    • Claimants could not go back and change answers easily if they made a mistake, because the system did not include a “back” button. Instead, they would have to log out and back in to make changes if they accidentally hit the wrong radio button.
    • Claimants had trouble using the drop-down menus, and did not understand why there was not an “other” with an opportunity to explain in many sections, given the broad scope of workers’ situations.
    • Several claimants commented that the dependents section constantly froze and kicked them out of the application.
    • Claimants consistently timed out of the application if they received a phone call or text message, or had to go look something up.
  • The “old look” design of the system. Many claimants complained about the “old look” of ReEmployME, with some comparing it to “Windows 95,” which made the system feel unwelcoming to them.
  • Workers did not receive notifications. No workers reported receiving electronic alerts when new documents were added to their portal or new information was available. Many expressed concern that they would miss important documents or deadlines. Claimants newer to unemployment also did not understand why the system did not send them a weekly prompt to file their certification.
  • Workers were confused by extra paperwork. Claimants did not understand why they still received extra paperwork many weeks after the date of action. Some reported it felt like they were often asked to fill out paper forms with information they had already provided online. Other times, they only received certain vital information by mail, like fact-finding appointments or hearings, when they felt they should be able to see those notices online. One problematic example was a union worker who took out-of-state gigs: a return-to-work form requiring action on his part was mailed to his home address, rather than delivered electronically where he could actually see it while he was away.
  • Hearing files were still sent by mail and not accessible online. Similarly, claimants only received the file of paperwork for their appeal hearings by mail and had no way to access the information online.

Implementation in Maine

Maine went live with its ReEmployME benefits system in early December 2017. As in many other states, Maine’s decision to implement its system right before seasonal unemployment hit, during the state’s highest quarter of unemployment, led to additional challenges. The state felt forced to go live then because of a hard deadline tied to the project’s funding; otherwise it faced a loss of grant money.

Maine workers experienced well-documented and significant problems with the new system immediately upon its implementation. Workers had difficulties filing initial claims online and logging work search efforts, experienced arbitrary cutoffs at the end of the calendar year, and waited on hold for hours to get help by phone (sometimes only to be disconnected before they got assistance).64 Thousands of claimants were locked out of their ReEmployME accounts after too many login attempts, after having been incorrectly instructed that they could use their usernames and passwords from the old system. Furthermore, most front-line agency staff did not have the authority to unlock an account and reset a password. The resulting delays prevented many claimants from filing a successful claim within the mandated fourteen-day window.65

The online work search documentation required for weekly certifications proved particularly challenging for claimants, who found it difficult to navigate and reported that it could only be completed on Sundays (to cover the previous week).66 State legislators had to step in with a bill allowing work search histories to be filed by phone or in person.67

While the rollout was problematic, some of Maine’s responses to system issues provide a roadmap for how to protect workers from losing benefits. In particular, the state Department of Labor (which administers the UI program through its Bureau of Unemployment Compensation) gave claimants months to fix work search records, rather than disqualifying them based on failure to submit. Despite the widely reported problems with work search, the department said that very few claimants actually lost benefits. The department also provided leniency for claimants who missed filing claims because of login errors or other problems with the system and waived its backdating requirements.

The Department of Labor observed that claimants who lacked computer skills experienced the most challenges. Many claimants struggled with signing into the system and would forget their passwords. As in Minnesota, the most frequent type of call after implementation was from claimants who were locked out of the system. Based on this experience, Department staff recommended that other states seeking to modernize do targeted education on the password reset process as part of implementation. They also suggested that for new projects, states should roll out the registration system and portal early to give claimants a chance to register and explore.

Many of the issues were less about the actual technology than the business practices and staffing of the Department of Labor at the time ReEmployME was launched. For example, while the department had the technology for skills-based routing of telephone calls, they lacked the staff to implement it. The department also did not have enough staff at the time to support the call center hours. Similarly, the new system was designed to pick up eligibility issues that may have been missed by the legacy system, leading to an increase in fact-finding, which required additional claim examiner time. Finally, during the implementation the department reported several critical staff transitions.

The agency did make some efforts to prepare for the rollout. The Department of Labor created a Training and Support Unit that gathered information from the vendor, from Mississippi, and from a separate consulting firm that it hired. Refresher training sessions were offered throughout the project and after implementation. Importantly, the department gave its staff time “in the sandbox,” that is, unstructured time to explore the new system. The department found, however, that structured exploration was the most effective: agency staff were given assignments to complete, such as practice adjudications, so they could understand how the system would work for them in practice.

The Department of Labor also brought in an outside company that specializes in teaching staff how to manage different customer interactions with empathy. Staff rated this training highly, and the department now gives out a quarterly empathy award to staff. Looking forward, department leadership recommended a “soft rollout” with stakeholders to get feedback before having the entire system go live. The general impression within the agency was that it had been on the road to a successful rollout, but the rush to go live led to unnecessary problems.

When new leadership took over in 2019, they took a hard look at changes that were necessary to improve ReEmployME and the Department of Labor’s business practices. To address access issues, the department now pays 50 percent of Employment Services staff salaries for the help they provide to UI claimants at local workforce centers and created a video tutorial to teach the staff how to use ReEmployME. During “mud season,” the spring thaw that slows down the state’s logging economy, the department set up mobile labs to help loggers access ReEmployME.

Maine is somewhat limited in the updates it can make to ReEmployME because of its position within the consortium. At the time of our interview, Maine had only recently finalized the operational memorandum of understanding with the consortium, and all changes to the core product must be agreed upon by all consortium members. When a change request comes from a UI director, the director must first justify the business case for the change; TCS then evaluates the cost, time, and feasibility of the change. One change that was noted during interviews, however, was that Maine recently increased the timeout period on its web applications to fifteen minutes.

Stakeholder Feedback on Implementation in Maine

Claimants confirmed the problems that the media covered at the time of implementation. Additionally, they and advocates noted:

  • Work search was difficult to fill out on smartphones. Even those who attempted to answer the new work search questions online struggled. Trying to fill in detailed information was time-intensive and claimants felt the system should be able to at least automatically populate their previous entries. Additionally, several mentioned that the system required them to enter a zip code for an employer, which they often did not know. Finally, claimants did not understand why the certification was not integrated with the state job search website they were registered on.
  • Department staff could not be reached by phone. The difficulty in reaching help through the telephone system was “demoralizing” and made claimants feel like the system “was not there to help them, and that it did not want them to collect benefits.” People waited for hours on the phone and many still rarely got through. They felt like they spent time waiting on the phone that they could have spent searching for work. Sometimes they would call and just get a message telling them to use the online services. New aspects of the system, like a screen that showed there were “issues” with their claim, without more information, made the inability to talk to department staff even more frustrating.
  • Lack of access was compounded by the workforce centers. Workers thought they could go to workforce centers for help, but were typically turned away or told there was nothing that could be done for them there. The expanded workforce center services prioritized by the new leadership are thus helpful in addressing this concern.
  • Claimants whose calls got through to department staff were treated extremely well. Claimants said they thought the department representatives were empathetic and very helpful on the phone. The representative would “give them plenty of time, almost as if they felt you had earned that time with them.” Workers consistently felt they got the right information from the call center and were relieved to speak with someone about their case.

Data Analysis

The adoption of modernized unemployment insurance systems has occurred during a period when the UI system is reaching fewer unemployed workers than ever before. The percentage of all jobless workers receiving a state unemployment insurance payment dropped from 43.7 percent in 2001 to just 27.8 percent in 2018.68 This metric, termed the unemployment insurance recipiency rate, reached an all-time low of 25.7 percent in 2013. Moreover, going into the COVID-19 crisis, fewer jobless workers were even filing an application. Indeed, the share of jobless workers surveyed by the Census Bureau who filed a UI application fell by more than half, from 51 percent in 2006 to just 23 percent in 2016.69

In examining the decline in recipiency from 2004 to 2017, economist Wayne Vroman points out not only that fewer workers are applying for benefits, but also that the increasing number of administrative disqualifications among those who do apply has driven recipiency down even further.70 As described below, the ways in which modernized UI systems collect, analyze, and adjudicate information submitted by claimants can increase the number of administrative denials. Modernization, however, has nothing to do with another major reason why unemployment insurance recipiency declined, which is the decline in the value of benefits. Responding to increased costs during the Great Recession, nine states reduced their basic unemployment insurance package to fewer than twenty-six weeks.71

This analysis looks closely at differences between states on administrative measures, such as the portion of workers denied benefits. Figure 3 plots the growing number of modernized states alongside the percentage of all submitted UI applications that are denied. The national UI denial rate has steadily increased as the number of states modernizing their benefit systems has grown each year. Because modernization directly affects the way applications are adjudicated, there is a plausible connection between modernization and denial rates, and the growth of modernization appears to be correlated with more denials.

Indeed, most national and state agency officials we interviewed expressed little surprise that modernization has been associated with increasing denial rates. Rather than viewing this as a negative sign of reduced claimant access to benefits, these leaders saw it as a positive sign that modernized systems were more accurately determining benefit eligibility. In their view, modernized systems were able to identify problems with benefit claims sooner through more effective fact-finding and to prevent claims that might later be determined to be inaccurate, or even fraudulent.

While state officials also expressed hope that modernization might impact overpayment rates, the national overpayment rate changed little from 2012 to 2019, from 10.8 to 10.2 percent of benefits.72 Overpayments are a major area of concern among UI officials, as the Office of Management and Budget has identified UI as one of the federal programs with the highest level of improper payments.73

Figure 3
Number of Modernized States and the National UI Denial Rate, by Year

Comparing Modernized and Non-Modernized States

To understand the impact of modernization on access to benefits and the operations of the UI system, we delved into the differences in unemployment insurance data between modernized and non-modernized states. Table 7 compares key UI program indicators between two groups of states: modernized and non-modernized. Specifically, this analysis groups the states that had modernized their programs by 2018 (twenty-two states) and those that had not (twenty-eight states and the District of Columbia). The analysis employed a t-test to see whether observed differences between modernized and non-modernized states were large enough to be statistically significant, or simply within the normal range of differences between states.74

Table 7
Modernization Correlated with Increased Denials but Not Recipiency Rates
Change in Key UI Variables, 2018 vs. 2002
Variable  Modernized States (n=22, 2018)  Not Modernized (n=30, 2018)
Recipiency rate  -0.172  -0.144
Total denials  16.69%  -16.46%
Nonseparation denials  121.68%  7.66%
Nonseparation determinations  72.75%  -6.69%
Separation determinations  -29.77%  -27.06%
Separation denials  -34.73%  -31.75%
Overpayment rates  0.02%  0.03%

Note: Bolded cells represent statistically significant differences.
Source: Author’s analysis of U.S. Department of Labor data.
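To make the comparison in Table 7 concrete, the following is a minimal sketch of the kind of two-sample t-test described above, assuming Python with SciPy. The per-state values are hypothetical placeholders for illustration only; they are not the actual data behind Table 7.

```python
# Minimal sketch of the two-sample comparison described above (illustrative only).
# The lists below are hypothetical placeholders, not the report's state-level data.
from scipy import stats

# Change in total denials, 2002-2018, one value per state (hypothetical).
modernized_changes = [0.21, 0.12, 0.35, 0.08, 0.19]       # states modernized by 2018
non_modernized_changes = [-0.22, -0.10, -0.18, -0.05, 0.02]  # states not modernized

# Welch's t-test does not assume equal variances between the two groups of states.
t_stat, p_value = stats.ttest_ind(modernized_changes, non_modernized_changes,
                                  equal_var=False)

# A p-value below 0.05 corresponds to the "statistically significant at the
# 95 percent level" language used in the report.
print(f"t = {t_stat:.2f}, p = {p_value:.3f}, significant: {p_value < 0.05}")
```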

This analysis shows a systematic connection between modernization and the increasing rates of denials of those who apply for benefits, but not a statistically significant difference in state recipiency rates. In other words, modernization has presented additional challenges for those who make the effort to apply for benefits. Specifically, the data in Table 7 suggests that:

  • Denial rates are statistically different between modernized and non-modernized states. Once the analysis is limited to those workers who have applied for UI benefits, the impacts of modernization are stark. Among modernized states, the number of unemployment insurance denials increased by 16.7 percent from 2002 to 2018. Among non-modernized states, the trend is nearly the opposite—denials decreased by 16.5 percent from 2002 to 2018. This difference is statistically significant at the 95 percent level.
  • Denials relating to work search and availability to work are driving a wedge between the states. The increase in denial rates among modernized states is driven by one type of denials—nonseparation denials. Nonseparation denials typically occur when an unemployed worker is found to have failed to meet the law’s requirement for being able and available for work and searching for a job.75 Nonseparation denials include cases when a worker fails to comply with ongoing eligibility requirements for UI like certifying their weekly work search activities or failing to report to a required appointment with a job counselor. Modernized systems have brought significant changes to the determinations of eligibility for these questions, including the ability to ask more detailed questions to claimants about their availability to work and more regularly request names connected with job search activities. As we learned from the stakeholder feedback in our case studies, these online systems can be more difficult to navigate than the phone-based systems that they replaced.
  • Denials do not appear to have reduced overpayments. The officials we interviewed about modernization stated that the increased number of nonmonetary denials represented an improvement in issue detection, and that many of these cases would have been found to be overpayments after the fact. In other words, modernized systems better track issues such as working while collecting unemployment at the time they occur, rather than detecting them through data cross-matches that only surface after the infraction. However, our data does not find a systematic difference in overpayments between modernized and non-modernized states, so there is no evidence for the claim that increased denials have reduced overpayments.

Modernization Impacts on Timely, Accurate Payment of Benefits

Typically, new information technology implementations disrupt the business processes of an organization. For UI modernization, this impact can be measured by an analysis of federal performance standards, which assess whether states are deciding claims and paying UI benefits on a timely and accurate basis.

Table 8 displays a detailed analysis of the change in UI performance measures from the year before modernization to the year after modernization in that particular state. This analysis compares the rate of change at the time of each individual state’s modernization to the national average during the same time period. A state earns a green indicator of its UI performance if it has improved as compared to the national average and a red indicator if it has declined.
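As a rough illustration of that indicator logic, the sketch below compares a state’s change in a performance measure around its go-live date to the national change over the same period. The function and figures are hypothetical, not the report’s actual computation; for measures where lower values are better, such as the average age of appeals, the comparison would be reversed.

```python
# Illustrative sketch of the red/green indicator logic described above.
# All numbers are hypothetical; the report's actual methodology may differ.

def performance_indicator(state_before, state_after,
                          national_before, national_after,
                          higher_is_better=True):
    """Return 'Improved' if the state's change beat the national change, else 'Declined'."""
    state_change = state_after - state_before
    national_change = national_after - national_before
    if not higher_is_better:  # e.g., average age of appeals, where lower is better
        state_change, national_change = -state_change, -national_change
    return "Improved" if state_change > national_change else "Declined"

# Hypothetical example: first payment timeliness falls faster than the national trend.
print(performance_indicator(state_before=88.0, state_after=82.0,
                            national_before=87.0, national_after=86.0))  # -> "Declined"
```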

Modernization clearly impacts the quality of nonmonetary determinations, as sixteen of the twenty states analyzed did worse than the national average. In addition, just over half of modernized states also experienced a decline in their ability to move through all the steps of the determination process and deliver a payment on time. Future modernizing states should be aware that the changes to business processes that come along with modernization can slow state payment times during the transition and can lead to declines in quality.

Moreover, sixteen states have a red indicator for the average age of appeals, meaning that these states took longer to decide benefit appeals after modernization. In future modernizations, states should be vigilant about the possibility of unacceptably long appeals timelines and take swift action to shorten them.

Table 8

 

Effects of UI Modernization on Program Performance

State  First Payment Timeliness  Nonmonetary Quality  Nonmonetary Timeliness  Quality of Appeals Decision  Average Age of Appeals
Florida  Declined  Declined  Improved  Declined  Declined
Idaho  Improved  Declined  Improved  Declined  Improved 
Illinois  Declined  Declined  Improved Improved  Declined
Indiana  Declined  Declined  Declined  Declined  Declined 
Louisiana  Declined  Declined  Declined  Declined  Declined
Massachusetts  Declined  Declined  Improved  Declined  Declined 
Maine  Declined  Declined  Declined  Improved  Declined 
Michigan  Improved  Improved  Improved  Improved  Declined 
Minnesota  Improved  Improved  Declined  Declined  Declined 
Missouri  Declined  Declined  Declined  Improved  Declined 
Mississippi  Improved  Improved  Improved  Improved  Improved 
Montana  Improved  Declined  Declined  Improved  NA
New Hampshire  Improved  Declined  Improved  Declined  Declined 
New Mexico  Improved  Declined  Improved  Improved  Declined 
Nevada  Declined  Declined  Declined  Improved  Declined 
Ohio  Improved  Declined  Improved  Declined  NA 
South Carolina  Declined  Declined  Declined  Declined  Declined 
Tennessee  Declined  Declined  Improved  Declined  Declined 
Utah  Declined  Improved  Improved  Improved  Declined 
Washington  Declined  Declined  Improved  Declined  Declined 
Number of red indicators (Declined compared to national average at the time)  12 (60%)  16 (80%)  8 (40%)  11 (55%)  16 (80%) 
Source: Author’s analysis of U.S. Department of Labor data.

AI and Predictive Analytics

As state agencies upgrade their technological capabilities, more vendors have joined the market, offering tools that can automate functions previously performed by agency staff. This follows a national trend that has already impacted the public benefit system, especially state Medicaid and SNAP programs. Much remains unknown about the types of automated decision-making, predictive analytics, and artificial intelligence currently in use by state unemployment agencies. While some of these tools can improve the functioning of the agencies, and potentially assist workers in better understanding their UI reporting requirements, major concerns about fairness, accuracy, and due process remain.

One of the driving forces behind modernization efforts has been pressure from the U.S. Department of Labor and state legislatures for UI agencies to address improper payments. Although many improper payments are not the result of fraudulent behavior, an entire cottage industry of vendors has developed to provide tools that identify and prevent fraud.

Fraud Detection and Risk Assessment Tools

Fraud detection and risk assessment tools include tools that aggregate and analyze data from different cross-match sources or databases, as well as tools that use predictive analytics to detect potentially fraudulent behavior or risk. They offer some limited advantages, but mostly raise concerns.

First, the limited advantages. For most of the past fifty years, unemployment systems have had access to different cross-matches with statewide and national databases that have provided information for claims investigations and determination of benefit eligibility. These cross-matches were often analyzed individually by staff, an arduous and time-consuming task. Now, many states have the capability to automatically input this new information into claims and issue fact-finding documentation or determinations, which should catch improper payments at an earlier stage.

However, the aggregation of data more generally can be problematic if the underlying data is of poor quality. Acting on aggregated inaccurate data only creates more barriers to benefits for workers, and more work for the agency. Furthermore, given the vast inequality in our country and the surveillance that results from using public resources, many low-income and minority workers have significantly more publicly accessible data about them than other workers applying for benefits.

Additionally, some vendors, including FAST Enterprises in Washington, are offering “risk assessment” models or “discovery” tools that aggregate cross-matches and other data to produce a “score” of a claimant’s fraud risk. Risk assessment tools have proven constitutionally problematic when used in other contexts. Advocates have raised concerns about, and litigated, the use of risk assessment algorithms in bail-setting in criminal cases. Advocates have also recognized the dangers of discriminatory bias in assessment scores in child abuse and neglect cases, which aggregate data that is inherently biased, especially against individuals of color.76

While human bias has always existed in these decision-making processes, the introduction of this technology threatens to permanently cement that bias. These aggregated scores carry with them a sense of infallibility, on the assumption that computer analysis must be more reliable than human analysis. A state actor may quickly come to rely on the output as a gold standard. However, some researchers have found that the “scores” and predictions from these systems are no more reliable than a coin flip.77

An important question in these systems is what weighting the agency chooses to give the various cross-matches or data that is input into the assessment score. An outside evaluator can help states determine the fairness and accuracy of a scoring or weighting algorithm before it is deployed.
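To illustrate why the weighting question matters, here is a hypothetical weighted scoring sketch. The cross-match names, weights, and threshold are invented for illustration and do not describe any vendor’s actual algorithm; the point is simply that small changes to weights or thresholds move claimants across the flagging line.

```python
# Hypothetical illustration of a weighted cross-match "risk score".
# None of these flags, weights, or thresholds come from an actual vendor system.

def risk_score(flags, weights):
    """Sum the weights of the cross-match flags that fired for a claimant."""
    return sum(weights[name] for name, fired in flags.items() if fired)

weights = {
    "new_hire_crossmatch": 0.5,   # claimant appears in a new-hire database
    "wage_record_mismatch": 0.3,  # reported earnings differ from employer wage records
    "shared_address": 0.2,        # address shared with other claimants
}

claimant_flags = {
    "new_hire_crossmatch": False,
    "wage_record_mismatch": True,
    "shared_address": True,
}

score = risk_score(claimant_flags, weights)
# Adjusting any weight or the 0.5 threshold changes which claimants get flagged,
# which is why the report recommends outside evaluation before deployment.
print(score, "flagged for investigation" if score >= 0.5 else "no flag")
```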

Automated Decision-making

Running a UI system requires thousands of discrete tasks, and one goal of modernized systems has been to increase efficiency. One method of increasing efficiencies is to automate more of the decisions and notices in the system. Automation has always existed in UI systems; for example, almost all initial determinations on financial eligibility are automated based on the wage reporting in the system for the claimant. Traditionally, other eligibility determinations, both separation and nonseparation, have always had a human touch, meaning that a UI representative is evaluating the facts and determining the outcome.

Vendors in the UI space are pushing technology that will remove these human touches. When information is collected, the system will issue a decision based on the programmed algorithm or analysis. Vendor websites specifically advertise that their systems can act “on behalf of” adjudicators.

The states that participated in our case studies took a careful approach to questions of automated decision-making:

  • In Minnesota, the agency structured its decision-making on the idea that “humans do human things, machines do machine stuff—but the humans can always override the machines.” The Minnesota system flags and identifies eligibility issues on its own, which increases operational efficiency but does eliminate some flexibility in claim taking. But on the actual eligibility determinations, there is always a human touch, especially on credibility, which can often be the dispositive factor in UI decisions.
  • In Washington, the agency was very careful about automation, as its vendor, FAST Enterprises, had been involved in the Michigan modernization project that resulted in state and federal litigation over its automated decision-making. Leadership visited Michigan to better understand the pain points so they would not be replicated in Washington. Notably, the agency ensured that every fraud determination in Washington had a human touch; no decisions were “auto-determined.” While most eligibility determinations were not automated, the back-end adjudication system had almost the same effect at the time the project went live. It created a check-box system that left little room for flexibility or individual analysis in each decision, and while determinations had previously run multiple pages, the new ones barely contained any facts or reasoning. Adjudicators were initially left with little choice in how determinations were written, but ESD later improved the back-end system. Finally, the agency did auto-determine work search compliance. As explained above, this report raises significant concerns about the lack of flexibility in work search questions on continuing claims, and auto-determining eligibility based on those questions raises significant access issues.
  • In Maine, the state UI statute provides protection against any form of computer decision-making. The statute requires that, prior to any disqualification based on new information during a continuing claim, a fact-finding telephone interview must be scheduled with the parties.78

However, many concerns remain about how other states are using automated decision-making, especially in the work-search and overpayment contexts.
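To make the distinction concrete, below is a simplified, hypothetical sketch of the difference between auto-determining an eligibility issue and flagging it for a human adjudicator. The issue types, field names, and rule are invented for illustration and are not drawn from any state’s actual system.

```python
# Simplified, hypothetical contrast between auto-determination and
# human-in-the-loop adjudication; not code from any state system.
from dataclasses import dataclass, field

@dataclass
class Issue:
    claim_id: str
    issue_type: str
    answers: dict = field(default_factory=dict)

def auto_determine(issue):
    """Fully automated: a programmed rule issues the decision with no human review."""
    if issue.answers.get("available_for_full_time_work") is False:
        return "denied"
    return "allowed"

def flag_for_adjudicator(issue):
    """Human-in-the-loop: the system only queues the issue with a suggested outcome;
    an adjudicator conducts fact-finding and can override the suggestion."""
    return {
        "claim_id": issue.claim_id,
        "issue_type": issue.issue_type,
        "suggested_outcome": auto_determine(issue),
        "requires_human_decision": True,
    }

issue = Issue("A-1001", "able_and_available", {"available_for_full_time_work": False})
print(auto_determine(issue))        # decision issued with no human involved
print(flag_for_adjudicator(issue))  # queued for a human decision instead
```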

Nudging

Another AI-driven technology in use in at least one state is “nudging”—the use of predictive analytics to apply targeted behavioral psychology to change the conduct of users. The basic premise is promising: states can use aggregated historical data to identify factors, not necessarily personal characteristics, that may pose challenges for claimants to successfully and correctly complete initial and continuing claims. Unlike other forms of AI, which can be used to proactively disqualify claimants, nudging in this fashion does not prescribe immediate negative consequences. A target of nudging should not be in a worse position than a claimant who does not receive any nudges.

At this point, the focus of nudging technology in UI has been to prevent improper payments. New Mexico, working with Deloitte, developed a series of nudging tools and messages aimed at improving compliance in areas where high numbers of improper payments were reported, primarily on continuing claims filing. New Mexico also used nudging and risk assessment models to identify worker misclassification.

Deloitte was able to statistically control which groups of claimants received which messages. Although neither Deloitte nor New Mexico did any user testing or focus groups to develop the text of the messages, they collected data after implementation and continued to update and narrow the messages based on what wording was effective. New Mexico found that personalizing the messages worked best, while scare tactics and legalese failed to have an impact. Overall, however, there was no clear indication that the nudging tactics had a significant impact on improper payments, in part due to message fatigue. New Mexico also abandoned an early effort to use nudging during initial claims after seeing no improvement in claimant response, and shared a concern that too much nudging during initial claims might discourage claimants by leading them to believe they are not eligible for benefits. Importantly, to ensure fair consideration of every claim, no agency claim examiners or interviewers are able to see in the system who received a message as a result of the nudging analytics.
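As an illustration of what statistically controlled message assignment can look like, the sketch below randomly assigns claimants to message groups and compares outcomes afterward. The message text, group names, and error rates are hypothetical and do not represent New Mexico’s or Deloitte’s actual methodology.

```python
# Hypothetical sketch of randomized nudge-message assignment and a simple
# post-implementation comparison; not the actual New Mexico/Deloitte approach.
import random

MESSAGES = {
    "control": None,
    "personalized": "Hi {name}, remember to report all hours you worked this week.",
    "legalese": "Failure to report earnings may result in penalties under state law.",
}

def assign_group(claimant_id):
    """Randomly assign a claimant to a message group (seeded so assignment is stable)."""
    return random.Random(claimant_id).choice(sorted(MESSAGES))

# Hypothetical post-implementation comparison of reporting-error rates by group.
observed_error_rates = {"control": 0.12, "personalized": 0.08, "legalese": 0.12}
baseline = observed_error_rates["control"]
for group, rate in observed_error_rates.items():
    print(f"{group}: {rate:.0%} error rate ({rate - baseline:+.0%} vs. control)")
```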

Nudging, especially when the technology has been outsourced to a private vendor, raises similar concerns to other uses of algorithms and automation in unemployment insurance systems: what data is being used and how is it being evaluated? New Mexico described the algorithm factors used to identify risk as designed and maintained by Deloitte. While they can see the scores, they cannot see the underlying characteristics or weighting upon which the scores are based. The state referred to the nudging technology as “proprietary” to Deloitte. Black box technology, especially when controlled by a private entity, raises due process concerns, both procedural and substantive, for claimants whose eligibility may be affected by the decision-making.

State Responsibility for Effects of AI and Predictive Analytics

While these technologies may improve efficiencies for state agencies, their use should be centered on improving outcomes for the end users of the unemployment system. An efficient system that improperly delays or denies benefits, or incorrectly assesses fraud, runs contrary to the due process rights of claimants and federal law governing fairness in the administration of unemployment systems.79

Recent litigation in Michigan by claimant-plaintiffs alleging due process violations from the state’s automated unemployment fraud detection system, known as the Michigan Integrated Data Automated System (MiDAS), highlights the fact that state officials are not shielded from liability solely because an algorithmic system created by a private vendor caused the rights deprivation. As the federal District Court in the case described:

MiDAS was developed to search for discrepancies in the records of unemployment compensation recipients, automatically determine whether the claimants committed fraud, and execute collection proceedings, which included intercepting tax refunds and garnishing wages. Auto-adjudication is a process that starts with the automated generation of a flag, then leads to the automated generation of questionnaires, then to an automated determination based on logic trees, followed by an automated generation of a notice of fraud determination, then automated collection activity.80

In the case, claimant-plaintiffs allege that the defendants, including several agency officials and the private vendor, “worked together with the state to design, maintain, operate, and implement the robo-fraud-detection and adjudication system . . . [which] labeled them fraudsters, and then assessed and collected fines and penalties, all without notice and an opportunity to be heard.”81 Similar claims were made in an ongoing state court case alleging due process violations under Michigan’s constitution.82 State officials sought qualified immunity in the federal case, but the Sixth Circuit Court of Appeals rejected the defendants’ “invitation to allow state actors to evade liability by utilizing new technologies to effectuate unconstitutional conduct.”83 The court refused to let the state officials “hide behind MiDAS,” finding that “MiDAS did not create itself,” as the officials implemented and oversaw the project and enforced the false fraud determinations it “automatically rendered.”84

State UI agency officials should heed the warning from the federal courts that the use of technology created by a private vendor does not create an automatic legal shield when that technology violates the rights of claimants. But these problems can be avoided if states follow the types of recommendations in this report, which place the user experience at the center of the planning, design, and implementation of modernized benefit systems. As a multitude of vendors flood the market with different fraud detection software, it is vital for states to concretely understand the underlying technology and the data and algorithms it relies upon, and to take proactive measures to ensure that claimants, already navigating these difficult systems, are not unduly harmed.

Conclusion: The Road Ahead

The surge in unemployment claims in 2020, and the failure to meet customer service standards, has caused a reckoning among states and will undoubtedly accelerate modernization efforts. States should heed the lessons of those that have come before them by basing immediate and long-term changes on solid user testing and bedrock principles of claimant access. Too often, modernization efforts have had negative impacts on claimant access, and the future of the UI system depends on a different approach.

Appendix: Research Methods

The case studies of modernization in Maine, Minnesota, and Washington were conducted from October 2018 to January 2020. Each involved many hours of in-person discussions with UI agency leadership and staff, focus groups with unemployed workers, and interviews with legal services organizations, union officials, and other stakeholders.

Site Selection

As discussed earlier, Maine, Minnesota, and Washington were selected as case study sites because they showcased different approaches to UI IT modernization projects. They also represent states with different populations, economies, and labor forces. Following our conversations with several state agencies (see Initial Lessons from States section), we developed additional selection criteria that led us to feature these three states in our final report. These criteria included evaluating the strength of the agency’s business processes; evaluating the agency’s ability to respond to problems and public critiques following the go-live; and identifying a strong presence of legal aid services, labor organizations, and worker advocacy groups invested in improving the claimant side of the state’s unemployment insurance program.

Our final criterion turned out to be critical to determining our case study states: agency management had to be willing participants, which entailed agency staff working in policy and leadership, adjudication, claims taking, appeals, and IT committing to one or two full days of interview meetings and system demonstrations with our research team. While agency leaders were often enthusiastic about our project, it was challenging for them to divert staff away from daily operations for extended periods; early discussions with South Carolina, Mississippi, New Mexico, and Utah did not result in case studies for these reasons.

Our research team had strong existing relationships with many UI agency leaders from previous research on unemployment insurance and through NASWA, which helped us secure commitments from agency leadership in Maine, Minnesota, and Washington. Our agency partners had deep knowledge of the industry and suggested suitable states for our case study research. They were also helpful in initiating introductions when our research team did not have existing relationships.

Focus Group Recruitment

We recognize there are demographic limitations to our final site selection. Minnesota, Maine, and Washington are northern states with seasonal industries and labor forces. They have significantly less diverse populations: 6.8 percent of Minnesota residents, 4.3 percent of Washington residents, and just 1.6 percent of Maine residents are Black or African American, compared to 13.4 percent of the total U.S. population. Similarly, 5.5 percent of Minnesota residents, 12.9 percent of Washington residents, and 1.7 percent of Maine residents are Latinx or Hispanic, compared to 18.3 percent of the national population. However, Minnesota and Washington have slightly larger shares of American Indian and Alaska Native residents (1.4 percent and 1.9 percent, respectively) than the nation as a whole.85

We attended to these demographic limitations by holding focus groups in an urban center as well as a rural township in each state. We coordinated focus group recruitment with legal aid advocates, labor organizations, and social-justice-oriented community groups in each state. Many of these organizations expressly provided services to BIPOC (Black, Indigenous, People of Color) communities, immigrants, and low-income workers. They reached out to their former or current clients to encourage focus group participation. The Century Foundation also ran Facebook advertisements in each state. The Facebook advertisements linked respondents to a short online screening survey designed and hosted by the research team, which collected self-reported demographic information from respondents. These combined efforts helped us select and plan for diverse focus group formations.

Participation in focus groups was limited to individuals who had filed for unemployment insurance in a case study state post-modernization. Participants did not have to have received unemployment insurance to be eligible. Though not always possible, we encouraged participation of individuals who had experience filing for unemployment insurance using both the old and the new system. Unless participants were directly recruited by our partnering organization, research team members contacted all respondents via email to confirm their eligibility. If there was more interest in a focus group than there were available seats, the research team extended invitations to individuals who self-reported as BIPOC first. For each focus group, we planned for up to twelve participants, though typical attendance was between six and nine. Table 9 shows a breakdown of focus group participation.

Table 9
Focus Group Participation by Location
State  Location  Number of Participants
Minnesota  Minneapolis  9
Minnesota  Windom  6
Washington  Seattle  8
Washington  Sunnyside  12
Maine  Portland  8
Maine  Brewer  6

All participants received a $100 per diem. Additionally, we gave all participants a free meal, since many of the focus groups took place after regular business hours and during dinnertime. Focus group conversations ran for approximately two hours and were attended by one moderator and one dedicated notetaker. When it was feasible, focus group conversations were also audio recorded, after obtaining signed and verbal consent from all participants.

Partnering with local organizations to recruit focus group participants was both strategic and mutually beneficial. Legal aid services in Portland, Seattle, and Minneapolis offered to host focus groups. A social-justice-oriented community organization in Sunnyside and a labor temple in Brewer also hosted focus groups. Organizations that volunteered to host our focus groups were given a donation of $500 to show our gratitude for their work.

Interview Methods

Our interview instrument had ten open-ended questions that were designed to analyze the institutional and procedural changes that accompanied UI IT projects. The questions were as follows:

  1. What are the three biggest ways modernization changed your business process?
  2. What are the three biggest ways modernization changed the claimant experience?
  3. Describe the claimants who have the easiest time accessing benefits today, and those who experience the greatest challenges. Would your answer have been different in any way before modernization?
  4. How has modernization changed the initial application experience for the state? For the claimant?
  5. How did modernization change the call center experience for claimants and staff at the time the new system was launched, and on an ongoing basis?
  6. How has modernization changed the continuing claims experience for the state? For the claimant?
  7. How has modernization changed the appeals process for the state? For the claimant?
  8. Did you get input from claimants or employers before implementation, and is your system set up to make continuous improvements?
  9. What advice would you give another state that is considering modernizing?
  10. How would you describe the ideal UI system (in terms of how it operates, not the tax/benefit structure)?

These questions were shared with state agencies prior to the research team’s visit. By sharing the questions beforehand, agency management was able to identify team members with specialized knowledge in areas related to policy and leadership, claims taking, adjudication, and appeals. We also noted our interest in securing a system demonstration and meetings with agency staff from the IT department working in system design, updates, and data management. In Maine and Washington, this included conversations with external IT consultants from Tata Consultancy Services and FAST Enterprises.

Sharing our research goals with agency leaders beforehand helped them create structured itineraries for our visit that made the best use of everyone’s time. Many of our interviews included visual presentations, handouts of UI data trends, and other prepared resources for our benefit. On average, we spent one to one-and-a-half full business days with state agencies in scheduled meetings across several agency departments.

At least three research team members were present during each agency interview. The interview format was necessarily semi-structured to allow agency staff to schedule opportune meetings that would not disrupt their daily operations. Though our questions were asked out of order depending on the availability of departmental staff, we secured meetings and system demonstrations that covered all our interest areas at each state. Furthermore, we always had scheduled time to debrief with agency staff that allowed them to ask questions about our project and research goals. All research team members took notes, although there was always one research team member dedicated to taking detailed notes. Meetings with state agencies were not recorded.

Our research questions did not require much modification to be relevant to other stakeholders, such as legal aid services and labor unions. These semi-structured interviews were shorter (usually two to three hours) and took place after the research team’s visits to state agencies. Questions probed whether worker advocates had played a role in the design and implementation of UI IT projects. Asking the same ten questions that we asked the state agencies proved very illuminating, since worker advocates’ interactions with claimants sometimes changed after modernization. They were able to identify specific aspects of the new system that were challenging for their clients. They also suggested innovative approaches to designing an ideal UI system.

Our strategy for choosing case study sites included assessing whether there were established community organizations with a vested interest in strengthening unemployment insurance. Given our team members’ backgrounds and expertise, we found making these connections and coordinating interviews relatively straightforward—community leaders and legal aid lawyers were interested and enthusiastic about our project. Research team members reached out to their existing connections with legal aid services and labor unions in each state to set up interviews. These working meetings also became the basis for participant recruitment for our focus groups, as several organizations represented clients that were appealing UI agency decisions.

Focus Group Methods

Focus groups are group interview settings that emphasize the importance of shared experiences and interest in specific topics.86 Our research team identified unemployment insurance claimants early on as key stakeholders with vested interests in the design and outcome of UI IT modernization projects. Despite this, claimants are rarely consulted during or even after projects. Consequently, there is a dearth of knowledge about whether claimants find online UI systems to be as efficient, convenient, and beneficial as agencies claim them to be.

Focus group questions are necessarily semi-structured to allow discussion to flow freely. Our focus group questions included a variety of closed-ended and open-ended questions to encourage participation by all. Questions followed the procedural steps of applying for unemployment insurance. We began by asking participants about their experience filling out the initial application and filing weekly claims online. Next, we asked participants about how certain processes (for example, fulfilling the work search requirement) were impacted by moving these processes online. We asked participants to reflect on whether UI modernization changed other program services, such as call center and WorkForce center operations. Finally, we asked participants who had experienced issues or filed appeals to talk about how these processes were handled by the new system.

In each focus group, the moderator and notetaker began the discussion with personal introductions and a description of the research project. Participants were given consent forms to sign that explained the terms of their participation. Participants were asked to respect others’ privacy by not sharing stories outside of the group, and refrain from interrupting when others were speaking. To encourage camaraderie within the group, discussion started with introductions and an ice-breaker activity—each participant shared their story about applying for unemployment and rated their experience with the state UI program. In larger focus groups, participants were asked to use a nametag or name card to allow moderators or participants to refer to everyone by name.

Participants’ ratings of the UI program highlighted how the new system worked for some but not all claimants. Even participants who rated their experience with the UI agency highly were sympathetic towards participants who had struggled with their application. There was natural agreement within most focus groups that applying for unemployment insurance was needlessly complex and embarrassing, which elicited rich discussions about how online systems helped or hindered claimants.

With the moderator asking questions and follow-ups, facilitating equal participation, and occasionally redirecting conversation, the second research team member was free to take detailed notes during the conversation. Transcriptions from recorded focus groups were generated automatically using Otter AI.

Narrative Analysis

After interviews and focus group meetings, research team members debriefed to extrapolate the themes of the day. This included comparing notes, parsing out answers to research questions, and identifying items that required direct follow-up. Our notes from interviews and focus groups, audio recordings and transcripts, and agency-generated data and presentations made up the bulk of materials analyzed for the purpose of this report.

Our objective was to identify how UI modernization impacted the claimant experience. Identifying claimant outcomes that are particular to UI IT systems requires isolating these issues from processes and norms related to UI systems generally. To accomplish this, we asked state agencies, worker advocacy organizations, and claimants themselves to share stories—or narratives—to relate personal experience to institutional change.

Narrative analysis is a group of methods for interpreting texts telling a common story from different angles.87 It is a suitable analytical method for our study given the time elapsed between system go-lives and the time of our interviews. It focuses analysis away from what happened to how people make sense out of what happened and to what effect.88 Agency staff explaining how modernization changed UI told stories about how project decision-making was linked to political actors and events, institutional values, and even the changing seasons. Similarly, claimants’ interactions with unemployment systems were closely tied to personal stories about experiencing job loss as well as other hardships.

Narrative analysis “is an approach to the analysis of qualitative data that emphasizes the stories that people employ to account for events,” Alan Bryman explains.89 It is a method that illuminates overarching narratives through comparison; for example, three separate agencies explaining what went into their decision to go live in the middle of winter. It is a method that humanizes contexts that would otherwise seem diametrically opposed; for example, finding that agency officials and claimants share many of the same values in system design.

Narrative analysis grounded the research team’s analysis. Analyzing our research products generated concrete recommendations for states during planning, design, and implementation stages of modernization projects. While these recommendations are tailored to include claimants during each stage, they are also responsive to the needs of agency employees, worker advocacy organizations, and other key stakeholders in building a UI system that works for everyone.

Notes

  1. Mark Bocchetti, “States navigate jobless flood at different speeds,” Roll Call, (April 28, 2020), https://www.rollcall.com/2020/04/28/states-navigate-jobless-flood-at-different-speeds/.
  2. George Wentworth, “Our Country Has Forgotten the Lessons of the Great Recession,” The Hill, December 22, 2017, https://www.nelp.org/commentary/our-country-has-forgotten-the-lessons-of-the-great-recession/.
  3. Ben Zipperer and Elise Gould, “Unemployment filing failures,” Economic Policy Institute, April 28, 2020, https://www.epi.org/blog/unemployment-filing-failures-new-survey-confirms-that-millions-of-jobless-were-unable-to-file-an-unemployment-insurance-claim/.
  4. Kathy Manderino, Secretary of the Pennsylvania Department of Labor and Industry, “Written Testimony before the House Labor and Industry Committee Regarding the Department of Labor and Industry Unemployment Compensation System,” Pennsylvania State Legislature, Harrisburg, Pennsylvania, March 1, 2017, https://www.legis.state.pa.us/WU01/LI/TR/Transcripts/2017_0021_0001_TSTMNY.pdf.
  5. Lou Ansaldi, NASWA/ITSC, “UI IT Modernization Overview,” presentation at the NASWA UI Directors Conference, Orlando, Florida, November 8, 2017.
  6. Paul Egan, “Judge blasts state agency as court OKs faulty computer system lawsuit,” Detroit Free Press, January 3, 2019, https://www.freep.com/story/news/local/michigan/2019/01/03/unemployment-insurance-agency-michigan/2474723002/;
    Steve Gray and Casey Farrington, “Opinion: Undoing the harm of MiDAS’ fraud designations,” The Detroit News, October 16, 2018, https://www.detroitnews.com/story/opinion/2018/10/16/opinion-undoing-harm-midas-fraud-designations/1649803002/.
  7. William Carrington, “Unemployment Insurance in the Wake of the Recent Recession,” Congressional Budget Office, November 2012, http://www.cbo.gov/sites/default/files/cbofiles/attachments/11-28-UnemploymentInsurance_0.pdf.
  8. Alan S. Blinder and Mark Zandi, “The Financial Crisis: Lessons for the Next One,” Center of Budget and Policy Priorities, October 15, 2015, https://www.cbpp.org/research/economy/the-financial-crisis-lessons-for-the-next-one.
  9. 42 U.S.C. Sections 503(a)(1), (3); 20 C.F.R. Parts 640, 650.
  10. Lincoln Quillian, Devah Pager, Ole Hexel, and Arnfinn H. Midtbøen, “The persistence of racial discrimination in hiring,” Proceedings of the National Academy of Sciences, September 2017, https://www.pnas.org/content/early/2017/09/11/1706255114#ref-17; Erik Sherman, “Hiring Bias Blacks And Latinos Face Hasn’t Improved In 25 Years,” Forbes, September 16, 2017, https://www.forbes.com/sites/eriksherman/2017/09/16/job-discrimination-against-blacks-and-latinos-has-changed-little-or-none-in-25-years/#3830954451e3.
  11. Olugbenga Ajilore, “On the Persistence of the Black-White Unemployment Gap,” Center for American Progress, February 24, 2020, https://www.americanprogress.org/issues/economy/reports/2020/02/24/480743/persistence-black-white-unemployment-gap/;
    Jhacova Williams and Valerie Wilson, “Black workers endure persistent racial disparities in employment outcomes,” Economic Policy Institute, August 27, 2019, https://www.epi.org/publication/labor-day-2019-racial-disparities-in-employment/.
  12. Andre M. Perry, “Black workers are being left behind by full employment,” The Brookings Institution, June 26, 2019,
    https://www.brookings.edu/blog/the-avenue/2019/06/26/black-workers-are-being-left-behind-by-full-employment/.
  13. Austin Nichols and Margaret Simms, “Racial and Ethnic Differences in Receipt of Unemployment Insurance Benefits during the Great Recession,” Urban Institute, June 2012, https://www.urban.org/sites/default/files/publication/25541/412596-Racial-and-Ethnic-Differences-in-Receipt-of-Unemployment-Insurance-Benefits-During-the-Great-Recession.PDF.
  14. Nicole Lyn Pesce, “A shocking number of Americans are living paycheck to paycheck,” MarketWatch, January 11, 2020, https://www.marketwatch.com/story/a-shocking-number-of-americans-are-living-paycheck-to-paycheck-2020-01-07.
  15. Kriston McIntosh, Emily Moss, Ryan Nunn, and Jay Shambaugh, “Examining the Black-white wealth gap,” The Brookings Institution, February 27, 2020,
    https://www.brookings.edu/blog/up-front/2020/02/27/examining-the-black-white-wealth-gap/;
    Danyelle Solomon and Darrick Hamilton, “The Coronavirus Pandemic and the Racial Wealth Gap,” Center for American Progress, March 19, 2020,
    https://www.americanprogress.org/issues/race/news/2020/03/19/481962/coronavirus-pandemic-racial-wealth-gap/.
  16. Janelle Jones, “The racial wealth gap,” Economic Policy Institute, February 13, 2017, https://www.epi.org/blog/the-racial-wealth-gap-how-african-americans-have-been-shortchanged-out-of-the-materials-to-build-wealth/.
  17. Eileen Patten, “Racial, gender wage gaps persist in U.S. despite some progress,” Pew Research Center, July 1, 2016, https://www.pewresearch.org/fact-tank/2016/07/01/racial-gender-wage-gaps-persist-in-u-s-despite-some-progress/.
  18. “Quantifying America’s Gender Wage Gap by Race/Ethnicity,” National Partnership for Women and Families, March 2020,
    https://www.nationalpartnership.org/our-work/resources/economic-justice/fair-pay/quantifying-americas-gender-wage-gap.pdf.
  19. “Mobile Fact Sheet,” Pew Research Center, June 12, 2019, https://www.pewresearch.org/internet/fact-sheet/mobile/.
  20. Andrew Perrin and Erica Turner, “Smartphones help blacks, Hispanics bridge some—but not all—digital gaps with whites,” Pew Research Center, August 20, 2019,
    https://www.pewresearch.org/fact-tank/2019/08/20/smartphones-help-blacks-hispanics-bridge-some-but-not-all-digital-gaps-with-whites/.
  21. Monica Anderson, “Racial and ethnic differences in how people use mobile technology,” Pew Research Center, April 30, 2015, https://www.pewresearch.org/fact-tank/2015/04/30/racial-and-ethnic-differences-in-how-people-use-mobile-technology.
  22. Ibid.
  23. Robyn Caplan, Joan Donovan, Lauren Hanson, and Jeanna Matthews, “Algorithmic Accountability: A Primer,” Data and Society Research Institute, April 18, 2018, https://datasociety.net/wp-content/uploads/2018/04/Data_Society_Algorithmic_Accountability_Primer_FINAL-4.pdf.
  24. Ibid.
  25. “Understanding Algorithms,” Data and Society Research Institute, April 18, 2018, https://datasociety.net/wp-content/uploads/2018/04/Data_Society_Understanding_Algorithms_Explainer_FINAL-web.pdf.
  26. Caplan, Donovan, Hanson and Matthews, “Algorithmic Accountability”; Heidi Ledford, “Millions of black people affected by racial bias in health-care algorithms,” Nature Research, October 24, 2019, https://www.nature.com/articles/d41586-019-03228-6.
  27. Caplan, Donovan, Hanson and Matthews, “Algorithmic Accountability.”
  28. Email from Tom Stengle, U.S. Department of Labor; author’s calculations.
  29. U.S. Department of Labor, “UI Performs Core Measures,” accessed August 1, 2020, https://oui.doleta.gov/unemploy/pdf/Core_Measures.pdf.
  30. “State Supplemental Funding Survey,” National Association of State Workforce Agencies, March 31, 2017, https://www.naswa.org/system/files/document/fy_2016_supplemental_report.pdf.
  31. Internet Unemployment System, Idaho Department of Labor, https://labor.idaho.gov/dnn/iUS, accessed June 26, 2020.
  32. Ansaldi, “UI IT Modernization Overview.”
  33. Information Technology Support Center, Status of State UI IT Modernization Projects, September 2019, http://www.itsc.org/Documents/Status%20of%20State%20UI%20IT%20Modernization%20Projects.pdf.
  34. Megan Woolhouse and Beth Healy, “None Admit Fault on Troubled Jobless Benefits System,” Boston Globe, October 28, 2013, https://www.bostonglobe.com/business/2013/10/28/state-senators-question-deloitte-labor-chief-troubled-unemployment-benefits-system/8HrEnVobsloB9tNiILb7nJ/story.html. Auditor general reports have identified problems with modernized UI systems. See, e.g., Tennessee Comptroller of the Treasury, Single Audit Report, Division of State Audit, for the Year Ended June 30, 2016, March 22, 2017, 372, http://controller.finance.tennessee.edu/wp-content/uploads/sites/7/2017/03/2016_TN_Single_Audit.pdf (hereafter “Tennessee Report”); Louisiana Legislative Auditor, Financial Audit Services Management Letter, December 14, 2016, 2, http://app.lla.state.la.us/PublicReports.nsf/0/75F71BBD2CDA081686258089006513D9/$FILE/00011DB2.pdf (hereafter “Louisiana Report”); Office of the Auditor General, Report Summary of Performance Audit: Michigan Integrated Data System (MiDAS), February 2016, 1, https://audgen.michigan.gov/wp-content/uploads/2016/06/rs641059315.pdf (Michigan).
  35. Monica Halas, consulting attorney, and Hajar Hasani and Larisa Zehr, legal interns, Northeastern University School of Law, “UI Online: The Problem, the Legal Framework and Solutions,” Greater Boston Legal Services.
  36. Tennessee Report, 372.
  37. David Gutman, “State’s new unemployment-benefits website can’t handle traffic at launch,” The Seattle Times, January 3, 2017, https://www.seattletimes.com/seattle-news/politics/states-new-unemployment-benefits-website-crashes-right-after-launch/.
  38. Michael Van Sickler, “State audit highly critical of Florida’s unemployment system CONNECT,” Tampa Bay Times, February 27, 2015,
    https://www.tampabay.com/news/politics/gubernatorial/state-audit-highly-critical-of-floridas-unemployment-system-connect/2219504/#.
  39. George Wentworth and Claire McKenna, “Ain’t No Sunshine: Fewer than one in eight unemployed workers in Florida is receiving unemployment insurance,” National Employment Law Project, September 21, 2015,
    https://www.nelp.org/publication/aint-no-sunshine-florida-unemployment-insurance/.
  40. Miami Workers Center v. Florida Department of Economic Opportunity, Division of Workforce Services (CRC Complaint No. 12-FL-048).
  41. Bruce Krasnow, “Group files civil rights claim vs. New Mexico,” The New Mexican, August 15, 2013, https://www.santafenewmexican.com/news/local_news/group-files-civil-rights-claim-vs-new-mexico/article_f14e0688-ecbd-5a2f-9ddc-8eefdc89ac9e.htm.
  42. U.S. Department of Labor, “Model Unemployment Insurance Work Search Legislation,” Training and Employment Notice 17-19, February 10, 2020, https://wdr.doleta.gov/directives/attach/TEN/TEN_17-19.pdf.
  43. Jonathan Oosting, “Lawsuit over false fraud fiasco revived by Michigan Supreme Court,” The Detroit News, April 5, 2019, https://www.detroitnews.com/story/news/local/michigan/2019/04/05/supreme-court-revives-false-fraud-lawsuit-against-michigan/3377047002/.
  44. “Information Technology: Department of Labor Could Further Facilitate Modernization of States’ Unemployment Insurance Systems,” U.S. Government Accountability Office, GAO-12-957, September 2012; “Unemployment Insurance: States’ Customer Service Challenges and DOL’s Related Assistance,” U.S. Government Accountability Office, GAO-16-430, May 2016.
  45. “Modernizing the Unemployment Insurance Program: Federal Incentives Pave the Way for State Reforms,” National Employment Law Project, May 2012, https://www.nelp.org/wp-content/uploads/2015/03/ARRA_UI_Modernization_Report.pdf.
  46. Unemployment Insurance Program Letter 2-16, October 1, 2015, https://oui.doleta.gov/dmstree/uipl/uipl2k16/uipl_0216.pdf.
  47. Ibid., Change 1.
  48. Ibid., 4.
  49. Ibid., 13.
  50. Unemployment Insurance Program Letter No. 11-18, August 17, 2018, 2, https://wdr.doleta.gov/directives/corr_doc.cfm?DOCN=9114.
  51. Phone interview with Ellen Golombeck, Deputy Director, National Association of State Workforce Agencies, October 10, 2018.
  52. Ibid.
  53. 43 P.S. § 781.4(h)(6) (Pa.).
  54. “An Act Financing the General Governmental Infrastructure of the Commonwealth,” The Acts and Resolves of Massachusetts, chapter 151, 2020, https://malegislature.gov/Laws/SessionLaws/Acts/2020/Chapter151.
  55. Ibid.
  56. “Mobile Fact Sheet,” Pew Research Center, June 12, 2019, https://www.pewresearch.org/internet/fact-sheet/mobile/ (showing that 96 percent of U.S. adults own a cell phone and 81 percent own a smartphone, compared to 74 percent who own a desktop or laptop computer).
  57. Monica Anderson and Madhumitha Kumar, “Digital divide persists even as lower-income Americans make gains in tech adoption,” Pew Research Center, May 7, 2019, https://www.pewresearch.org/fact-tank/2019/05/07/digital-divide-persists-even-as-lower-income-americans-make-gains-in-tech-adoption/ (showing that only 54 percent of adults with incomes below $30,000 have computers and 71 percent have smartphones, while ownership rates for the two types of technology are similar among middle-income and higher-income adults).
  58. Andrew Perrin and Erica Turner, “Smartphones help blacks, Hispanics bridge some—but not all—digital gaps with whites,” Pew Research Center, August 20, 2019, https://www.pewresearch.org/fact-tank/2019/08/20/smartphones-help-blacks-hispanics-bridge-some-but-not-all-digital-gaps-with-whites/.
  59. UI agency officials from the following states were interviewed for this project: Colorado, Iowa, Kansas, Maine, Minnesota, Mississippi, New Mexico, Pennsylvania, Utah, Vermont, Washington, and Wyoming.
  60. State of Minnesota, Office of Enterprise Technology.
  61. David Gutman, “State’s new unemployment-benefits website can’t handle traffic at launch,” The Seattle Times, January 3, 2017,
    https://www.seattletimes.com/seattle-news/politics/states-new-unemployment-benefits-website-crashes-right-after-launch/.
  62. “Status of State UI IT Modernization Projects,” NASWA Information Technology Support Center, updated September 2019, http://www.itsc.org/Documents/Status%20of%20State%20UI%20IT%20Modernization%20Projects.pdf.
  63. Theo Douglas, “Efficiency in Numbers as States Take Unemployment Insurance to the Cloud,” GT: Government Technology, November 6, 2017, https://www.govtech.com/computing/Efficiency-in-Numbers-as-States-Take-Unemployment-Insurance-to-the-Cloud.html.
  64. Colin Ellis, “How Maine bungled rollout of new jobless claims system,” Portland Press Herald, March 11, 2018, https://www.pressherald.com/2018/03/11/state-bungled-rollout-of-new-unemployment-claims-system/.
  65. Ibid.
  66. Robert N. Charette, “Maine’s New Unemployment System Frustrates the Public and State Workers Alike,” IEEE Spectrum, March 23, 2018, https://spectrum.ieee.org/riskfactor/computing/it/maines-new-unemployment-system-frustrates-the-public-and-state-workers-alike.
  67. Colin Ellis, “Lawmakers still seeking fix for Maine unemployment filing system,” Portland Press Herald, March 7, 2018, https://www.centralmaine.com/2018/03/07/state-legislators-still-seeking-fix-for-states-unemployment-filing-system/.
  68. “Application and Recipiency CY 1988–2019,” U.S. Department of Labor, Employment and Training Administration, updated November 1, 2019, https://oui.doleta.gov/unemploy/large_carousel.asp?slide=0.
  69. Stephen A. Wandner and Andrew Stettner, “Why are many jobless workers not applying for benefits?” Monthly Labor Review, June 2000, https://www.bls.gov/opub/mlr/2000/06/art2full.pdf, and Wayne Vroman, “Unemployment insurance recipients and nonrecipients in the CPS,” Monthly Labor Review, October 2009, https://www.impaqint.com/sites/default/files/project-reports/Vromen_UI.pdf.
  70. Wayne Vroman, “Unemployment Insurance Benefits Performance since the Great Recession,” Urban Institute, February 27, 2018, https://www.urban.org/research/publication/unemployment-insurance-benefits.
  71. Andrew Stettner, “Unemployment Trust Fund Recovery Is Helping Employers, not Workers,” The Century Foundation, December 7, 2017, https://tcf.org/content/report/unemployment-trust-fund-recovery-helping-employers-not-workers/.
  72. “Unemployment Insurance Benefit Payment Integrity,” United States Department of Labor, Employment and Training Administration, https://oui.doleta.gov/unemploy/improp_payrate.asp.
  73. Office of Management and Budget, “High-Priority Programs and Programs over $100M in Monetary Loss, 2019,” https://www.paymentaccuracy.gov/payment-accuracy-high-priority-programs/, accessed August 1, 2020.
  74. While this comparison does not establish that modernization is a causal factor, it does show whether the difference is large enough that modernization could plausibly be the cause.
  75. Separation denials concern the reason a worker lost their job and whether that reason qualifies under UI law; all other denials are nonseparation denials.
  76. Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (New York: St. Martin’s Press, 2018).
  77. “Impoverished Algorithms: Misguided Governments, Flawed Technologies, and Social Control,” 46 Fordham Urb. L.J. 364, 379.
  78. See 26 M.R.S. § 1194.
  79. 42 U.S.C. Sections 503(a)(1), (3); 20 C.F.R. Parts 640, 650.
  80. Cahoo v. SAS Inst. Inc., 2020 U.S. Dist. LEXIS 145817, *4 (E.D. Mich. Aug. 11, 2020).
  81. Id. at *5.
  82. See Bauserman v. Unemployment Ins. Agency, 2019 Mich. App. LEXIS 7683 (Mich. 2019).
  83. Cahoo v. SAS Analytics Inc., 912 F.3d 887, 904 (6th Cir. Mich. 2019).
  84. Id.
  85. “Quick Facts: Minnesota, Washington, Maine, United States Population Estimates July 1, 2019,” United States Census Bureau, https://www.census.gov/quickfacts/fact/table/MN,WA,ME,US/PST045219.
  86. Alan Bryman, Social Research Methods, 5th Edition (Oxford: Oxford University Press, 2016), 501.
  87. Catherine Kohler Riessman, Narrative Analysis (Newbury Park, Calif.: Sage, 1993), 11.
  88. Bryman, Social Research Methods, 589. Emphasis added.
  89. Ibid., 590.