A risk-averse insurance company controls its reserve, modelled as a perturbed Cramér-Lundberg process, by choosing both the premium p and the deductible K offered to potential customers. The surplus is allocated to financial investment in a riskless asset and a basket of risky assets that are potentially correlated with the insurance risks and thus serve as a partial hedge against them. Assuming customers differ in riskiness, increasing p or K reduces the number of customers n(p,K) and, through adverse selection, increases the arrival rate of claims per customer λ(p,K), with a combined negative effect on the aggregate arrival rate n(p,K)λ(p,K). We derive the optimal premium rate, deductible, investment strategy, and dividend payout rate (consumption by the owner-manager) maximizing expected discounted lifetime utility of intermediate consumption under the assumption of constant absolute risk aversion. Closed-form solutions are provided under specific assumptions on the claim size and claim frequency distributions.

This paper studies the dynamic asset allocation problem faced by an infinitely-lived commodity-based sovereign wealth fund under incomplete markets. Since the non-tradable stream of commodity revenues is finite, the optimal consumption and investment strategies are time-dependent. Using data from the Norwegian Petroleum Fund, we find that the optimal demand for equity should decrease gradually from 60 to 40 percent over the next 60 years. However, the solution is particularly sensitive to the correlation between oil and stock price changes. We also estimate wealth-equivalent welfare losses, relative to the optimal rule, of following alternative suboptimal investment rules.
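The wealth-equivalent welfare losses mentioned above can be illustrated with a minimal sketch under CRRA utility; the function, its homogeneity argument, and all numbers below are illustrative assumptions, not results from the paper.

```python
# A minimal sketch of the wealth-equivalent welfare loss: the fraction of
# initial wealth an investor would give up to move from a suboptimal to the
# optimal investment rule, under CRRA utility with gamma != 1. The value
# function levels used below are hypothetical, not estimates from the paper.
def wealth_equivalent_loss(v_opt, v_sub, gamma):
    """Loss lambda solving V_opt((1 - lambda) * W) = V_sub(W), using the
    homogeneity of CRRA value functions: V(W) proportional to W**(1 - gamma)."""
    return 1.0 - (v_sub / v_opt) ** (1.0 / (1.0 - gamma))

# Example: with gamma = 3, a rule whose value is 20% worse (in value-function
# units) costs roughly 8.7% of initial wealth.
loss = wealth_equivalent_loss(v_opt=-1.0, v_sub=-1.2, gamma=3.0)
```

Expressing losses as a wealth fraction makes suboptimal rules comparable across investors with different wealth levels, which is why this metric is standard in the portfolio-choice literature.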

We study the statistical properties of heterogeneous agent models. Using a Bewley-Huggett-Aiyagari model, we compute the density function of wealth and income and use it for likelihood inference. We study the finite sample properties of the maximum likelihood estimator (MLE) using Monte Carlo experiments on artificial cross-sections of wealth and income. We propose to use the Kullback-Leibler divergence to investigate identification problems that may affect inference. Our results suggest that the unrestricted MLE leads to considerable biases in some parameters. Calibrating the weakly identified parameters allows us to pin down the remaining unidentified parameter without compromising the estimation of the other parameters. We illustrate our approach by estimating the model for the U.S. economy using wealth and income data from the Survey of Consumer Finances.
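The idea of using the Kullback-Leibler divergence as an identification diagnostic can be sketched as follows: if two parameter values imply nearly identical model densities, the KL divergence between them is close to zero and the data are nearly uninformative about which value is correct. The normal densities below are illustrative stand-ins, not the model's wealth and income densities.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2); stands in for a model-implied density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def kl_divergence(p_fn, q_fn, lo=-10.0, hi=10.0, n=4000):
    """Riemann-sum approximation of KL(p || q) on [lo, hi]."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        p, q = p_fn(x), q_fn(x)
        if p > 0 and q > 0:
            total += p * math.log(p / q) * dx
    return total

# Nearby parameter values: tiny KL -> the scale parameter is weakly identified.
kl_near = kl_divergence(lambda x: normal_pdf(x, 0, 1.0),
                        lambda x: normal_pdf(x, 0, 1.05))
# Distant parameter values: larger KL -> the location parameter is well identified.
kl_far = kl_divergence(lambda x: normal_pdf(x, 0, 1.0),
                       lambda x: normal_pdf(x, 1, 1.0))
```

A flat KL profile in some parameter direction signals that the likelihood is nearly flat there too, which is the identification problem the diagnostic is meant to flag.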

This paper shows that the consumption-based capital asset pricing model (C-CAPM) with low-probability disaster risk rationalizes pricing errors. We find that implausible estimates of risk aversion and time preference are not puzzling if market participants expect a future catastrophic change in fundamentals that simply happens not to occur in the sample (a 'peso problem'). A bias in structural parameter estimates emerges as a result of rational pricing errors in quiet times. While the bias essentially removes the pricing error in simple models with constant risk-free rates, time variation may also generate large and persistent estimated pricing errors in simulated data. We also show analytically how the problem of biased estimates can be avoided in empirical research by resolving the misspecification in the moment conditions.
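The peso-problem mechanism can be illustrated with a small Monte Carlo sketch: in samples where the rare disaster happens not to occur, the sample mean return overstates the true expected return, which in turn biases structural estimates. All probabilities and returns below are illustrative assumptions, not calibrations from the paper.

```python
import random

# In samples with no disaster, average returns exceed the true expectation:
# the raw 'peso problem' bias. Numbers are illustrative, not the paper's.
random.seed(0)

P_DISASTER = 0.017             # annual disaster probability
MU_GOOD, SD_GOOD = 0.07, 0.15  # return distribution in normal years
R_BAD = -0.40                  # return in a disaster year
TRUE_MEAN = (1 - P_DISASTER) * MU_GOOD + P_DISASTER * R_BAD

def simulate_sample(years=60):
    """Return (sample mean return, disaster occurred?) for one sample path."""
    disaster = False
    total = 0.0
    for _ in range(years):
        if random.random() < P_DISASTER:
            total += R_BAD
            disaster = True
        else:
            total += random.gauss(MU_GOOD, SD_GOOD)
    return total / years, disaster

# Average the sample mean over paths on which no disaster occurred in-sample.
no_disaster_means = []
for _ in range(20000):
    m, hit = simulate_sample()
    if not hit:
        no_disaster_means.append(m)

biased_mean = sum(no_disaster_means) / len(no_disaster_means)
# biased_mean exceeds TRUE_MEAN: quiet samples overstate expected returns.
```

With these numbers roughly a third of 60-year samples contain no disaster at all, so the conditioning is far from a knife-edge event.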

This paper revisits the fit of disaster risk models in which a representative agent has recursive preferences and the probability of a macroeconomic disaster changes over time. We calibrate the model as in Wachter (2013) and perform two sets of tests to assess the empirical performance of the model in long-run simulations. The model is solved using a two-step projection-based method that allows us to find the equilibrium consumption-wealth ratio and dividend yield for different values of the intertemporal elasticity of substitution. Fixing the elasticity of substitution at one, the first experiment indicates that the overall fit of the model is adequate. However, we find that the amount of aggregate stock market volatility that the model can generate is sensitive to the method used to solve the model. We also find that the model generates near unit root interest rates and a puzzling ranking of volatilities between the risk-free rate and the expected return on government bills. We then solve the model for values of the elasticity of substitution that differ from one. This second experiment shows that while a higher elasticity of substitution helps to increase aggregate stock market volatility and hence to reduce the Sharpe ratio, a lower elasticity of substitution generates a more reasonable level for the equity risk premium and for the volatility of government bond returns without compromising the ability of the price-dividend ratio to predict excess returns.


This document presents the results of an empirical exercise that aims to extract the main stylized facts of the Colombian economy for the period 1994:I-2007:I. The objective is to support the design and specification, as well as the evaluation, of a dynamic stochastic general equilibrium (DSGE) model currently being developed by the Macroeconomic Models Department of the Banco de la República. To this end, we use a database that allows us to decompose some of the main macroeconomic aggregates computed by DANE through the system of annual national accounts into their domestic and imported components, and to construct a measure of the trade margins added to imported consumption and investment goods. Once the data are available, we analyze the structure of the Colombian economy by supply and demand components, closely following the methodology used by Restrepo and Soto (2004) for Chile and Restrepo and Reyes (2000) for Colombia.


There is now an impetus to apply dynamic stochastic general equilibrium models to forecasting. But these models typically rely on purpose-built data, for example on tradable and nontradable sector outputs. How, then, can we know in advance that the model will forecast well? We develop an early warning test of the database-model match and apply it to a Colombian model. Our test reveals where the combination should work (consumption) and where it should not (investment). The test can be adapted to examine many likely sources of DSGE model failure.


In this document we apply the Business Cycle Accounting (BCA) procedure to study which mechanisms are most relevant for interpreting the dynamics of Colombian GDP between 1994 and 2009. The neoclassical growth model with endogenous capital utilization of Cavalcanti et al. (2008) is used as a benchmark. This reference model can be seen as a reduced form of a large class of DSGE models with different frictions commonly used in the literature. This equivalence allows us to assess the importance of each of those detailed models in explaining fluctuations in economic activity. The results suggest that, in order to achieve better predictions for GDP, macroeconomic models should include distortions that alter the consumption-savings decisions and the labor supply of households. These distortions can be captured by including capital and labor wedges in the benchmark model. Furthermore, frictions related to movements in total factor productivity tend to overestimate periods of persistent economic downturn, such as the one registered between 1998 and 2000 after the Asian crisis.
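The labor wedge central to the BCA procedure can be sketched with a stylized computation: it measures the gap between the household's marginal rate of substitution and the firm's marginal product of labor. The log-utility and Cobb-Douglas assumptions, the parameter values, and the data point below are hypothetical illustrations, not calibrations to Colombian data.

```python
# A stylized labor-wedge computation in the spirit of Business Cycle
# Accounting, assuming log utility over consumption and leisure and
# Cobb-Douglas technology. Values are hypothetical, not Colombian data.
ALPHA = 0.35  # capital share in production
PSI = 1.8     # weight on leisure in utility

def labor_wedge(c, l, y):
    """1 - tau_l from the combined household/firm first-order conditions.

    c: consumption, l: hours worked (fraction of the time endowment),
    y: output. In a frictionless economy tau_l = 0, i.e. MRS = MPL.
    """
    mrs = PSI * c / (1.0 - l)    # marginal rate of substitution
    mpl = (1.0 - ALPHA) * y / l  # marginal product of labor
    return 1.0 - mrs / mpl

# A positive wedge acts like an implicit labor income tax in the data.
tau_l = labor_wedge(c=0.6, l=0.3, y=1.0)
```

Computing this wedge period by period from observed consumption, hours, and output series is what lets BCA attribute GDP fluctuations to labor-market distortions rather than to productivity movements.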