key: cord-311521-4calrk5l authors: Bhar, Ramaprasad; Malliaris, A.G. title: Modeling U.S. monetary policy during the global financial crisis and lessons for Covid-19 date: 2020-08-30 journal: J Policy Model DOI: 10.1016/j.jpolmod.2020.07.001 sha: doc_id: 311521 cord_uid: 4calrk5l

The paper formulates the modeling of unconventional monetary policy and critically evaluates its effectiveness in addressing the Global Financial Crisis. We begin with certain principles guiding general scientific modeling and focus on Milton Friedman's 1968 Presidential Address, which delineates the strengths and limitations of monetary policy in pursuing certain goals. The modeling of monetary policy, with its novelty of quantitative easing to target unusually high unemployment, is evaluated by a Markov switching econometric model using monthly data for the period 2002-2015. We conclude by relating the lessons learned from unconventional monetary policy during the Global Financial Crisis to the recent bold initiatives of the Fed to mitigate the economic and financial impact of the Covid-19 pandemic on U.S. households and businesses.

The Federal Reserve Act of 1913 established the U.S. central bank, called the Federal Reserve System, and charged it to design a national currency and conduct a monetary policy that promotes financial stability. In 1977, Congress amended the Federal Reserve Act, stating explicitly the monetary policy objectives of the Federal Reserve: "The Board of Governors of the Federal Reserve System and the Federal Open Market Committee shall maintain long run growth of the monetary and credit aggregates commensurate with the economy's long run potential to increase production, so as to promote effectively the goals of maximum employment, stable prices and moderate long-term interest rates." Among the three goals of maximum employment, stable prices and moderate long-term interest rates, the first two have received precedence and thus most consideration over the years. For example, during the late 1970s and early 1980s, when the Fed was fighting inflation and long-term interest rates had approached very high levels, around 20%, such elevated interest rates did not cause the Fed to deviate from its inflation targeting.

The pursuit of maximum employment and price stability as the Fed's primary goals has evolved into the Fed's "dual mandate". This dual goal involves possible trade-offs between inflation and unemployment, often called the Phillips curve, and the appraisal of their associated risks. For example, the fight against inflation during the 1970s and early 1980s had as a consequence an increase in unemployment. More recently, after the Global Financial Crisis of 2007-09, the Fed pursued maximum employment, understanding the risks of possible inflation. In this paper we accomplish four goals: first, we present the methodological elements of scientific modeling in economics, guided by Milton Friedman's contributions; second, we describe the modeling of unconventional monetary policy during the Global Financial Crisis; third, we evaluate its effectiveness with a Markov switching econometric model; and fourth, we relate the lessons learned to the Fed's response to the Covid-19 pandemic.

The epistemological prominence of models in science has increased significantly during the past several decades, and both philosophers and historians of science have studied these valuable tools. Frigg and Hartmann (2018) present a critical literature overview of models and modelling in science and list a large group such as: computational models, developmental models, explanatory models, testing models, theoretical models, heuristic models, mathematical models, and formal models, among several others. This large variety makes it difficult to propose a uniform definition.
Milton Friedman (1953) participated in debates about the role of models in economic methodology and is known to have argued that models should not be judged by the realism of their assumptions but only by the success of their predictions. For our discussion, a model is a conceptual representation of some aspect of reality we wish to understand; alternatively, a model may highlight important connections among sets of data representing certain variables. According to Friedman, we may not have an exact set of assumptions about the behavior of economic agents, but if the model generates a testable prediction and offers a policy recommendation, then such a model is useful. A model is thus like a tool: if it works, it is useful. This pragmatic methodology of modeling suggested by Friedman (1953) is carefully explored methodologically in Isaac (2013).

How have economists modeled monetary policy? Keynes observed that during the Depression of the 1930s interest rates were very low and investments were unresponsive. He offered a model for monetary policy known as the liquidity trap and argued that monetary policy under such circumstances is ineffective and cannot help an economy overcome the depression. Monetarists, who followed Keynesians in the early 1960s, argued that monetary policy had a major impact on an economy and played an important role in business cycles, exactly in opposition to the Keynesian model. However, what monetary policy can do is prevent money itself from being a major source of economic disturbance, offer a stable background for expectations, and offset major disturbances in the economic system arising from other sources.

How should monetary policy be conducted? Friedman advises that the most appealing guides for monetary policy are exchange rates, the price level as defined by some index, and the quantity of some monetary aggregate. Friedman (1968, p. 15) asserts: "I believe that a monetary total is the best currently available immediate guide or criterion for monetary policy-and I believe that it matters much less which particular total is chosen than that one be chosen."

Fifty years later, Mankiw and Reis (2018) skillfully evaluate the brilliant contribution of Friedman on modeling the role of monetary policy by affirming the intellectual prescience of some of his proposals but also by replacing others that were less enduring. Friedman's view that in the long run the central bank can peg neither unemployment nor real interest rates has persisted. Instead of targeting Friedman's suggested variable of the growth rate of some monetary aggregate, central banks over the last two decades have embraced price stability and, in particular, often advocate a precise 2% inflation target. Also, the emphasis has shifted from using monetary variables as tools to using the very short term interest rate. Mankiw and Reis (2018) give a detailed exposition of the role of the natural rate of interest, introduced by Knut Wicksell, and explain how central banks currently interpret inflation targets in a flexible framework that adjusts to the business cycle. Put differently, the short run flexible decision-making by central banks to achieve their mandates by setting short term interest rates subject to new data may not have appealed to Friedman, who preferred a much longer run strategy.
It was in the long run that "major sources of economic disturbances" occurred, as in wars and periods of financial exuberance or panics, and Friedman's insightful advice suggested "offsetting" such developments. Such "offsetting" is central to the modeling of "unconventional monetary policy" presented next.

We now know that the financial crisis began as a subprime mortgage lending problem in the economy. Pre-crisis experience had also validated that using the fed funds rate as a tool to achieve the dual mandate was most appropriate, and these significant accomplishments of monetary policy were called the "great moderation". So when unemployment reached a high of 10% during the Great Recession, the Fed's objective function had to prioritize targeting unemployment over inflation. Initially, while the economy was in recession, the risks of inflation were very low; but from mid-2009 until the end of 2015, when the Fed increased the fed funds rate from the range of 0%-0.25% to the range of 0.25%-0.50%, the benefits of pursuing easy policies to achieve maximum employment were continuously monitored and assessed against the potential risks of reigniting inflation. The importance of job creation as the primary goal of unconventional monetary policy is expounded in Baghestani (2008), Evans (2010), and Kuttner (2018).

During September 2008, fed funds were between 1.75% and 2%. Three months later, during December 2008, fed funds were between 0% and 0.25%. This great tool of normal monetary policy, used over several decades, had become of no further use. In late 2008, the Fed decided to use unconventional methods to stimulate a U.S. economy constrained by the zero lower bound of fed funds. These tools included the so-called Large-Scale Asset Purchases (LSAP), or Quantitative Easing (QE). The tool of QE consists of the Federal Reserve purchasing longer-term U.S. Treasury securities and agency mortgage-backed securities (MBS) with the aim of driving down longer-term interest rates, thereby stimulating economic activity. In other words, if the short-term fed funds rate has reached the zero level and cannot go any lower, the next option for the Fed to accomplish its dual mandate was to target longer-term interest rates.

Independent of the debate over whether the Fed can influence longer-term interest rates, Bernanke (2012, 2015) argues that QE works via the portfolio balance channel. The simple logic of this channel indicates that different classes of financial assets are not perfect substitutes in the portfolios formed by investors, and if the Fed can purchase large quantities of a certain asset and influence its price and therefore its yield, such changes may, through arbitrage transactions, spread to other asset classes. If the final result of QE is the increase of long-term bond prices and the decline of yields across many asset classes, the overall economic result is an increase in wealth that leads to more consumption and investment. With increases in consumption and investment, the economy grows and generates jobs to achieve the Fed's goal of maximum employment. Not all economists accept this line of reasoning: Thornton (2014) does not find any empirical evidence in support of the portfolio balance channel, and Taylor (2014) is similarly skeptical. Pieces of the unconventional monetary policy modeling evolved gradually. Bernanke (2012) and Kuttner (2018) discuss the progression of QE in detail.
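The portfolio balance channel described above relies on the mechanical inverse relation between bond prices and yields. A minimal illustration (the numbers are ours, chosen only for exposition, and do not appear in the paper): for a 10-year zero-coupon bond with face value 100,

\[ P = \frac{100}{(1+y)^{10}}, \qquad \frac{100}{(1.05)^{10}} \approx 61.4, \qquad \frac{100}{(1.04)^{10}} \approx 67.6, \]

so a purchase-driven price increase from about 61.4 to 67.6 corresponds to the 10-year yield falling from 5% to 4%; the portfolio balance argument is that arbitrage then transmits such yield declines to other asset classes.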
It was not known at the beginning how many rounds of QE would be necessary for the restoration of financial stability in the financial sector and economic recovery in the real economy. Today, with the benefit of historical experience, we know that the Fed executed three main rounds of QE. During the long period of about seven years, 2008-2015, with the fed funds rate remaining essentially at zero, monetary policy followed QE and increased the Fed's balance sheet from about $1 trillion to $4.5 trillion to stimulate an economy that found itself in a liquidity trap. The use of QE was motivated by the pursuit of the Fed's goal of maximum employment. It was modeled on Friedman's belief that it is the role of monetary policy to offset major economic disturbances, and the prediction of lower unemployment was viewed as verification of its appropriateness.

Forward guidance about keeping the fed funds rate at the zero lower bound and pursuing lower longer term interest rates was carefully articulated in numerous Fed communications to reduce financial uncertainty, while at the same time it was made clear that inflationary expectations were carefully monitored to enhance the Fed's hard won credibility as an inflation fighter. Numerous speeches by Fed officials, as well as Bernanke (2015), document in detail these strategic efforts to employ unconventional monetary policy tools.

Jointly using monthly QE data and longer term interest rates (say, for 10-Year Treasury Notes), how have these two series impacted the U.S. labor market? We formulate basic hypotheses connecting monthly QE and longer term interest rates to various data describing labor markets in order to investigate empirically the effectiveness of unconventional monetary policy that targeted maximum employment in the aftermath of the financial crisis. We claim that the unconventional tools of monetary policy targeted primarily the unemployment rate. We also view the Bernanke Fed's targeting of employment as confirmation of the validity of its policies: the Fed's affirmation of its new policies was that QE and lower longer term T-Note rates would yield the forecasted lowering of unemployment. We use all six measures of unemployment, U1, U2, U3, U4, U5, and U6, to perform an exhaustive analysis of this dimension of labor markets; U3 is the most representative measure and the one used most commonly.

These hypotheses display the very practical and realistic goal of unconventional monetary policy that targeted the reduction in unemployment. What do we mean by such a statement? By unconventional monetary policy we mean the use of QE as a tool whose goal was to reduce the steepness of the yield curve, increase financial wealth, eventually increase aggregate demand, and thus bring down unemployment. In the spirit of Friedman's scientific methodology, numerous assumptions are made about these developments, and what matters most is not the exact nature of these assumptions but the prediction that unemployment will decline. An early analysis of QE is found in Gagnon et al. (2011) and more recently in Engen et al. (2015). The comprehensive paper of Gagnon et al. (2011) argues that, in contrast to the Fed's pre-crisis operations, the large-scale purchases of longer-term assets meaningfully reduced longer-term interest rates. Additional theoretical support for the hypotheses we are formulating is provided by Chiarella and Guilmi (2011), who develop a dynamic macroeconomic model in the spirit of Hyman Minsky's (1977) financial instability hypothesis.

In our empirical model, the independent variable is monetary policy.
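Before specifying the proxies and the econometric model, a minimal data-assembly sketch in Python may help fix ideas. The file name "monthly_data.csv" and the column names are hypothetical illustrations, not the paper's actual data sources; only the sample period (monthly data, 2002-2015) and the use of first differences follow the paper.

```python
# Minimal data-assembly sketch (hypothetical file and column names).
import pandas as pd

# Monthly observations of Total Fed Assets, the 10-Year T-Note rate,
# and the six unemployment measures U1-U6.
df = pd.read_csv("monthly_data.csv", index_col="date", parse_dates=True)
df = df.loc["2002-01":"2015-12"]  # sample period used in the paper

cols = ["fed_assets", "tnote10", "u1", "u2", "u3", "u4", "u5", "u6"]
changes = df[cols].diff().dropna()  # first differences, since the levels are non-stationary

print(changes.describe())  # quick sanity check of the transformed series
```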
We employ two proxies for monetary policy during the period studied: first, Total Fed Assets, and second, the 10-Year Treasury Note rate, which under normal conditions is determined solely by market conditions but during the financial crisis was targeted by the Fed via its QE strategies. We hypothesize that increases in Total Fed Assets and declines in the 10-Year T-Note rate, the two proxies of unconventional monetary policy, contributed to reductions in the various measures of unemployment.

Following Hamilton (1994), the matrix of transition probabilities of the unobserved two-state Markov chain is defined as P = [p11, 1 - p11; 1 - p22, p22], where p11 = Pr(S_t = 1 | S_t-1 = 1) and p22 = Pr(S_t = 2 | S_t-1 = 2). The expected duration of the high volatility regime is given by E(S=1) = 1/(1 - p11) and that for the low volatility regime is given by E(S=2) = 1/(1 - p22).

Since we are interested in establishing the efficacy of the modelling approach over this rather long sample period, we initially focus on the diagnostic tests of the model. Although the residual analysis from the regression equation (1) without switching shows no significant serial correlations, the CUSUM-square test shows instability in the variance and/or the parameters. This is further corroborated by the Chow break point test. These results are not included in the paper. Although a GARCH variance captures the time varying nature of conditional variance, it cannot address a structural discontinuity in the level of variance. In order to explore more effectively the influence of the independent variables on the various measures of unemployment, as well as parameter and variance discontinuity over the long sample, we investigate the non-linear modelling approach via the two state Markov chain shown in equation (3). The subscript is used to denote dependence of these parameters on the prevailing Markov state. The law guiding the evolution of this unobserved state variable is a time homogeneous transition probability matrix. The residual term also depends on the Markov state, and the variance of this residual is state dependent as well. In fact, the realization of the residual variance is one way of classifying the states, to which we may be able to attach economic significance.

In contrast to linear models that assume stationary distributions, regime-switching models are based on a mixture of parametric distributions whose probabilities depend on an unobserved state variable. In our model, the economy alternates between two unobserved states of high volatility and low volatility according to a Markov chain process. Since we have identified that the parameters are not constant through time, and some sort of structural break occurs in the series, we have certain modelling choices. One alternative is to estimate the model over different sub-samples if the timing of the break is known. Another alternative is to make the structural break endogenous to the model, since in many cases the timing of the shift may not be known. By making the shift(s) endogenous to the model, we can also make inferences about the process that drives these shifts. Models that shift between various densities allow us to incorporate structural breaks in the estimation procedure.

A source of uncertainty idiosyncratic to regime switching models is the ex-post determination of regimes. In switching models, it is assumed that the occurrence of a regime is observed by the market but not by the econometrician, who must infer it from the model. Until recently, the quality of regime classification was determined by focusing on the smoothed ex-post regime transition probabilities. An innovation in this area is the Regime Classification Measure (RCM) proposed by Ang and Bekaert (2002). The RCM is essentially a sample estimate of the variance of the probability series; it is based on the idea that under perfect classification of regimes the probability series would take values of 0 or 1 and would be a Bernoulli random variable.
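To make the estimation step concrete, the following is a minimal sketch using the Markov switching regression implemented in statsmodels. The file and column names, the single autoregressive lag, and the focus on U3 are our illustrative assumptions; the paper's exact specification in its equations (1) and (3) may differ.

```python
# Two-state Markov switching regression sketch for changes in U3
# (hypothetical file and column names, as in the earlier data sketch).
import pandas as pd
from statsmodels.tsa.regime_switching.markov_regression import MarkovRegression

df = pd.read_csv("monthly_data.csv", index_col="date", parse_dates=True)
data = df.loc["2002-01":"2015-12", ["u3", "fed_assets", "tnote10"]].diff().dropna()
data["u3_lag1"] = data["u3"].shift(1)
data = data.dropna()

endog = data["u3"]
exog = data[["fed_assets", "tnote10", "u3_lag1"]]

# State-dependent intercept, slopes, and residual variance: a sketch in the
# spirit of the paper's two-state specification, not its exact equation (3).
mod = MarkovRegression(endog, k_regimes=2, trend="c", exog=exog,
                       switching_variance=True)
res = mod.fit()
print(res.summary())

# Expected duration of regime i is 1 / (1 - p_ii).
print("Expected regime durations (months):", res.expected_durations)

# Regime Classification Measure (Ang and Bekaert, 2002) for two regimes:
# RCM = 400 * mean(p_t * (1 - p_t)); values near 0 indicate sharp classification.
p = res.smoothed_marginal_probabilities[0]
rcm = 400.0 * (p * (1.0 - p)).mean()
print("RCM:", round(rcm, 2))
```

The regime with the larger estimated residual variance would then be labeled the high volatility state, in line with the classification used in the paper, and the expected durations and RCM can be compared with the values reported in the Tables.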
The dependent variables are the six measures of unemployment published by the Bureau of Labor Statistics:

• U1 = Percentage of the labor force unemployed for 15 weeks or longer.
• U2 = Percentage of the labor force who lost jobs or completed temporary work.
• U3 = Percentage of the labor force who are without jobs and have looked for work in the last four weeks (note that this is the officially-reported unemployment rate).
• U4 = U3 plus the percent of the labor force that count as "discouraged workers," i.e. people who would like to work but have stopped looking because they are convinced that they can't find jobs.
• U5 = U4 plus the percent of the labor force that count as "marginally attached" or "loosely attached" workers, i.e. people who would theoretically like to work but haven't looked for work within the past four weeks.
• U6 = U5 plus the percent of the labor force that count as "underemployed," i.e. part-time workers who would like to work more but can't find full-time jobs.

Graph 1 illustrates these six proxies for the unemployment rate.

First, all six Tables show that all measures of unemployment changes are driven by changes in independent variables that are significant in both the low and the high volatility Markov regimes. We work with first differences because of the non-stationarity of our variables. Second, the two monetary policy instruments used by the Fed after the Lehman Brothers bankruptcy were QE and its impact on reducing longer-term interest rates on government bonds; these instruments are studied and summarized in Williams (2011, 2013) and more recently in Bhar, Malliaris and Malliaris (2015). U1 is driven by the longer-term 10-Year T-Note rate in the low volatility regime and by its own two-period lag in the high volatility regime. U2 is driven by the lagged T-Note rate in the low volatility regime and by Fed Assets in the high volatility regime; autoregressive terms also play a significant role for U2. Third, our results are very interesting for U3 in Table 3. Since this is the official measure of unemployment, it is encouraging to observe that U3 is driven by Fed Assets in both the low (level of significance 1%) and high volatility (level of significance 5%) regimes. It is also important to indicate that the average duration of the high volatility regime for U3 is about 5.95 months, which is higher than that of the low volatility regime, which lasts only 1.91 months. U4 is driven by Fed Assets (significance level 10%), the T-Note rate and autoregressive factors in the low volatility regime. U5, on the other hand, is influenced by Fed Assets in the low volatility regime and by the T-Note rate in the high volatility regime. Finally, U6 is significantly influenced by both Fed Assets and the T-Note rate in the low volatility regime and by autoregressive factors in the high volatility regime.

The overall conclusion of these six Tables is that monetary policy in its unconventional approach, with QE and the 10-Year T-Note rate, appears statistically significant in 9 out of 12 regimes (recall that we analyze six unemployment measures, each in a low and a high volatility regime). If we focus on U3, the official statistic that does not include chronically discouraged workers for whom monetary policy has very limited effectiveness, then our results are much stronger.
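As a quick illustrative check (our arithmetic, not reported in the paper), the expected-duration formula E(duration) = 1/(1 - p_ii) can be inverted to recover the regime-persistence probabilities implied by the U3 durations just cited:

\[ p_{\text{high}} \approx 1 - \frac{1}{5.95} \approx 0.83, \qquad p_{\text{low}} \approx 1 - \frac{1}{1.91} \approx 0.48, \]

so once U3 enters the high volatility regime it tends to remain there from month to month, while the low volatility regime is much less persistent.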
Recall, as noted earlier, that Fed Assets enter as statistically significant in both the low and high volatility regimes of U3 (at the 1% and the 5% level, respectively).

This paper articulates that Friedman's modeling of monetary policy emphasizes its role in offsetting major disturbances in the economic system arising from sources such as a financial crisis. This conviction is in opposition to the Keynesian model that pronounces the ineffectiveness of monetary policy during a liquidity trap. Friedman also argues that models should not be judged by the realism of their assumptions but only by the success of their predictions.

The Bernanke Fed, soon after fed funds were decreased close to zero in late 2008, chose new tools to target unemployment that had reached about 10%. No one knew, during the months that followed the Lehman Brothers bankruptcy, the magnitude, duration and impact of these policies. What was known at the beginning were the hard economic facts of high unemployment, very low inflation, frozen financial markets and both national and international financial instability. With fiscal policy facing both budgetary and political constraints, a consensus was rapidly built by FOMC members to target unemployment as directed by the Fed's dual mandate. This consensus was not formed on the solid analytical grounds of a macroeconomic model describing the channels connecting monetary policy to increased employment, but rather on a careful calculus of risk management, that is, by assessing the benefits from lowering unemployment versus the risks of creating inflation, with a conviction in the effectiveness of monetary policy.

What central bankers learned during the 2007-2015 period is that the effectiveness of QE was due to four factors: first, the rapid and careful formulation of unconventional monetary policy to replace traditional fed funds management when that rate had reached zero; there was no lamenting that monetary policy had arrived at a dead end. Second, the commitment of central bankers to unconventional QE, which was extended over seven years with three major rounds rather than being abandoned after its first or second round. Third, the boldness of policy makers to pursue large QE rather than moving very cautiously, say by only doubling the size of the Fed's balance sheet instead of actually growing it by 350%. And fourth, the achievement of its goal to reduce unemployment substantially. Of course there was some good fortune, meaning that the monthly, data-driven communications and evaluations of QE were never confronted with unpleasant issues of financial instability, rapid inflation or recurring recessions.

In response to the Covid-19 pandemic, the Fed has once again undertaken bold initiatives to support the U.S. economy. These initiatives are described in detail in Fleming, Sarkar, and Van Tassel (2020), who also compare whether the Covid-19 initiatives are similar to the ones taken during the Global Financial Crisis or new and specially designed for Covid-19. Some preliminary comparisons can be made between the two crises, the Global Financial Crisis and the Covid-19 pandemic.

In summary, this paper discusses certain principles guiding general scientific modeling and focuses on Milton Friedman's contributions to the modeling of monetary policy.

References

Regime Switches in Interest Rates
Federal Reserve versus Private Information: Who is the Best Unemployment Rate Predictor?
Aggregate Demand and Long-Run Unemployment
The Impact of Large-Scale Asset Purchases on the S&P 500 Index, Long-Term Interest Rates and Unemployment
The Crisis and the Policy Response
Monetary Policy since the Onset of the Crisis
The Courage to Act: A Memoir of a Crisis and its Aftermath
American Economic Association Presidential Address at the Allied Social Sciences Association Annual Meetings in
The Financial Instability Hypothesis: A Stochastic Microfoundations Framework
The Macroeconomic Effects of the Federal Reserve's Unconventional Monetary Policies
Innovative Federal Reserve Policies During the Great Financial Crisis
Labor Markets and Monetary Policy
The COVID-19 Pandemic and the Fed's Response
Models in Science
The Methodology of Positive Economics
The Role of Monetary Policy
Modeling Without Representation
Large-Scale Asset Purchases by the Federal Reserve: Did They Work?
Monetary Policy Surprises, Credit Costs, and Economic Activity
What Happened: Financial Factors in the Great Recession
Money, Banking and Monetary Policy from the Formation of the Federal Reserve until Today
Monetary Policy and Balance Sheets
Outside the Box: Unconventional Monetary Policy in the Great Recession and Beyond
An Exploration of Optimal Stabilization Policy
Friedman's Presidential Address in the Evolution of Macroeconomic Thought
A Theory of Systemic Fragility
"Monetary Policy and Job Creation". A speech at the University of Maryland Smith School of Business Distinguished Speaker Series
Unwinding Quantitative Easing: How the Fed Should Promote Stable Prices, Economic Growth and Job Creation
QE: Is There a Portfolio Balance Effect?
Unconventional Monetary Policy and Aggregate Bank Lending: Does Financial Structure Matter?
Unconventional Monetary Policy: Lessons from the Past Three Years
Lessons from the Financial Crisis for Unconventional Monetary Policy