Edward P. Herbst and Frank Schorfheide
- Published in print: 2015
- Published online: October 2017
- ISBN: 9780691161082
- eISBN: 9781400873739
- Item type: book
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691161082.001.0001
- Subject: Economics and Finance, Econometrics
Dynamic stochastic general equilibrium (DSGE) models have become one of the workhorses of modern macroeconomics and are extensively used for academic research as well as forecasting and policy analysis at central banks. This book introduces readers to state-of-the-art computational techniques used in the Bayesian analysis of DSGE models. The book covers Markov chain Monte Carlo techniques for linearized DSGE models, novel sequential Monte Carlo methods that can be used for parameter inference, and the estimation of nonlinear DSGE models based on particle filter approximations of the likelihood function. The theoretical foundations of the algorithms are discussed in depth, and detailed empirical applications and numerical illustrations are provided. The book also gives invaluable advice on how to tailor these algorithms to specific applications and assess the accuracy and reliability of the computations. The book is essential reading for graduate students, academic researchers, and practitioners at policy institutions.
Don Harding and Adrian Pagan
- Published in print: 2016
- Published online: January 2018
- ISBN: 9780691167084
- eISBN: 9781400880935
- Item type: book
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691167084.001.0001
- Subject: Economics and Finance, Econometrics
The global financial crisis highlighted the impact on macroeconomic outcomes of recurrent events like business and financial cycles, highs and lows in volatility, and crashes and recessions. At the most basic level, such recurrent events can be summarized using binary indicators showing whether the event occurs or not. These indicators are constructed either directly from data or indirectly through models. Because they are constructed, they have different properties than those arising in microeconometrics, and how they should be used depends heavily on the method of construction. This book presents the econometric methods necessary for the successful modeling of recurrent events, providing valuable insights for policymakers, empirical researchers, and theorists. It explains why it is inherently difficult to forecast the onset of a recession in a way that provides useful guidance for active stabilization policy, with the consequence that policymakers should place more emphasis on making the economy robust to recessions. The book offers a range of econometric tools and techniques that researchers can use to measure recurrent events, summarize their properties, and evaluate how effectively economic and statistical models capture them. These methods also offer insights for developing models that are consistent with observed financial and real cycles.
Yacine Aït-Sahalia and Jean Jacod
- Published in print: 2014
- Published online: October 2017
- ISBN: 9780691161433
- eISBN: 9781400850327
- Item type: book
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691161433.001.0001
- Subject: Economics and Finance, Econometrics
High-frequency trading is an algorithm-based computerized trading practice that allows firms to trade stocks in milliseconds. Over the last fifteen years, the use of statistical and econometric methods for analyzing high-frequency financial data has grown exponentially. This growth has been driven by the increasing availability of such data, the technological advancements that make high-frequency trading strategies possible, and practitioners' need to analyze these data. This comprehensive book introduces readers to these emerging methods and tools of analysis. The book covers the mathematical foundations of stochastic processes, describes the primary characteristics of high-frequency financial data, and presents the asymptotic concepts that their analysis relies on. It also deals with estimation of the volatility portion of the model, including methods that are robust to market microstructure noise, and addresses estimation and testing questions involving the jump part of the model. As the book demonstrates, the practical importance and relevance of jumps in financial data are universally recognized, but only recently have econometric methods become available to rigorously analyze jump processes. The book approaches high-frequency econometrics with a distinct focus on the financial side of matters while maintaining technical rigor, which makes it invaluable to researchers and practitioners alike.
Charles F. Manski
- Published in print: 2019
- Published online: May 2020
- ISBN: 9780691194738
- eISBN: 9780691195360
- Item type: book
- Publisher: Princeton University Press
- DOI: 10.23943/princeton/9780691194738.001.0001
- Subject: Economics and Finance, Econometrics
Although uncertainty is a common element of patient care, it has largely been overlooked in research on evidence-based medicine. This book strives to correct this glaring omission. Applying the tools of economics to medical decision making, the book shows how uncertainty influences every stage, from risk analysis to treatment, and how it can be reasonably confronted. In the language of econometrics, uncertainty refers to the inadequacy of available evidence and knowledge to yield accurate information on outcomes. In the context of health care, a common example is the choice between periodic surveillance and aggressive treatment of patients at risk for a potential disease, such as women prone to breast cancer. While these choices make use of data analysis, the book demonstrates how statistical imprecision and identification problems often undermine clinical research and practice. Reviewing prevailing practices in contemporary medicine, the book discusses the controversy over whether clinicians should adhere to evidence-based guidelines or exercise their own judgment. It also critiques the wishful extrapolation of research findings from randomized trials to clinical practice. Exploring ways to make more sensible judgments with available data, to credibly use evidence, and to better train clinicians, the book helps practitioners and patients face uncertainties honestly. It concludes by examining patient care from a public health perspective and the management of uncertainty in drug approvals. The book explains why predictability in the field has been limited and furnishes criteria for more cogent steps forward.