
Two Perspectives on Applied Econometrics: Structural and Reduced-Form Methods

The “Credibility Revolution” in Economics

There have been several advances in applied econometric methods over the past few decades. Many of these improvements have come from better research design, visible in the increased use of randomized and natural experiments that generally provide more credible identification of causal effects from the data at hand. Together with improvements in robust inference, such as semi- and non-parametric methods and robust standard errors, this has helped to alleviate the doubts about the validity of empirical results that were still widespread within the profession in the 1980s; in short, it is said that economists have succeeded in taking the “con” out of econometrics.


These developments in methodology are closely connected to a particular type of applied research, referred to here as “non-structural”. Informally also called “reduced-form” or “experimentalist”, such research places its emphasis on the statistical identification of particular (causal) parameters using statistical assumptions rather than economic ones. Economic theory typically plays a supporting role, and the focus is on quantifying things accurately. Card and Krueger’s paper on the effect of a minimum wage increase on fast-food employment in New Jersey, a study that will be familiar to continuing Economics students at UCL, is a prime example of non-structural work.
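To make this concrete, a minimal sketch of what such a reduced-form specification can look like, in the spirit of Card and Krueger’s comparison of New Jersey with neighbouring Pennsylvania (the notation here is illustrative rather than their exact specification), is a difference-in-differences regression

\[
y_{ist} = \alpha + \beta\,(NJ_s \times Post_t) + \gamma\, NJ_s + \delta\, Post_t + \varepsilon_{ist},
\]

where \(y_{ist}\) is employment at fast-food restaurant \(i\) in state \(s\) at time \(t\), \(NJ_s\) indicates New Jersey, and \(Post_t\) indicates the period after the minimum wage increase. The coefficient \(\beta\) is the causal effect of interest under the statistical assumption that employment in the two states would have followed parallel trends in the absence of the policy; no model of labour demand is needed to write the equation down.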


However, not everyone is content with this kind of research. An often-repeated criticism has been that, together with the “con”, such research has taken the “econ” out of econometrics and diverted our attention from important economic problems such as inequality towards catchy “freakonomics”-type research. It is indeed possible to find non-structural work where economics is not used beyond identifying “what is y and what is x”, i.e. what is an outcome and what is an explanatory variable. A more substantive drawback is that reduced-form evidence tends to be less useful for building a complete picture of an economic phenomenon, where a synthesizing role for theory would be necessary. Instead, it is often only able to establish local validity for its results, especially when outcome responses are heterogeneous across the population being studied.


Theory and Measurement

On the other hand, structural econometric work incorporates an explicit theoretical “structure of decision making” into the estimated model. For example, a simple structural study of the effect of welfare benefits on labour supply might begin by formulating a worker’s utility maximization problem, and then use data to estimate features of the labour supply function that results from solving the problem as it is specified. This contrasts with the non-structural “treatment effects” approach discussed above, which would focus on finding a situation with exogenous variation in benefits, without reference to any particular theoretical framework. Thus, at its core, a structural approach is characterized by the use of economic assumptions, in addition to statistical ones, in the identification of parameters from the data.
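As a purely illustrative sketch (the functional forms here are assumed, not taken from any particular study), such a structural model might posit that a worker chooses hours of work \(h\) to maximize utility over consumption \(c\) net of the disutility of work,

\[
\max_{h}\; \ln c \;-\; \theta\,\frac{h^{1+1/\epsilon}}{1+1/\epsilon}
\quad \text{subject to} \quad c = w h + b,
\]

where \(w\) is the wage, \(b\) the welfare benefit, and \((\theta, \epsilon)\) are preference parameters. Solving the problem yields a labour supply function \(h^{*}(w, b; \theta, \epsilon)\), and estimation amounts to choosing the preference parameters so that predicted hours match the hours observed in the data. Once estimated, the same function can be used to simulate labour supply under benefit schedules that have never been observed, which is the counterfactual ability discussed further below.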


This need for additional assumptions turns out to be a major point of contention. Solving and estimating any useful structural model imposes a heavy computational cost on the researcher, which naturally tilts structural researchers towards more tractable models. For the same reason, the statistical assumptions in structural models tend to be stricter than in reduced-form research. Sceptics see all of this as adding a thick layer of largely untestable conditions on top of any empirical results. Even though advances in computing power have allowed structural work to move on from the more blatantly stylized models that dominated the field in the past, a degree of bad reputation has followed structural models to this day. Because structural estimation suffers from the same “curse of dimensionality” that appears in, for example, machine learning, even modern technology does not free structural research from making simplifications for computational reasons.


In response to this criticism, practitioners of structural modelling have tended to emphasize the virtue of making one’s assumptions explicit. Keane (2010) points out that, just like structural models, even the strongest design-based research is often not interpretable without strong a priori assumptions. Ultimately, being able to statistically identify a parameter with very few assumptions is not very useful if we cannot say what exactly is being identified. For example, Keane argues that Angrist’s 1990 study of the effect of military service on earnings, which uses the Vietnam-era draft lottery as an instrument, has a strong claim to an ideal research design, but that the meaning of the coefficient it uncovers is far from certain.
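One standard way to illustrate this point (not necessarily Keane’s own framing) is through the instrumental-variables estimand in that setting. With draft eligibility \(Z\) as the instrument, military service \(D\) as the treatment and earnings \(Y\) as the outcome, the Wald/IV estimand is

\[
\beta_{IV} \;=\; \frac{E[Y \mid Z=1] - E[Y \mid Z=0]}{E[D \mid Z=1] - E[D \mid Z=0]},
\]

which, under the usual validity and monotonicity conditions, equals the average effect of service for “compliers”, i.e. those induced to serve by the lottery. If effects are heterogeneous, this local average treatment effect need not describe what service would do to the earnings of volunteers or of the population as a whole, which is one sense in which the research design alone does not settle what the estimated coefficient means.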


Clarity about mechanisms is in fact one of the greatest rewards a structural approach has to offer, as each parameter has an interpretation in the context of the economic model being considered. Furthermore, a deeper understanding of what links changes in one variable to movements in another gives us confidence that we will be able to observe a similar relationship in the future. This contrasts with the tight context-dependency noted above as a limitation of reduced-form research. From a policy-design perspective, a structural model also grants the ability to study counterfactual policies that are yet to occur, or that would take place in an environment so different that reduced-form evidence can provide only limited guidance.


A False Dichotomy?

In the end, the main difference between structural and non-structural approaches might come down to a disagreement about what types of questions economists should try to answer. A more modest research agenda that focuses on quantifying things would prefer the lighter set of qualifications that reduced-form work carries. On the other hand, more assumptions are perhaps needed to get anywhere with questions of “how” and “why”, or to give empirical results wider generalizability. Interestingly, Nevo and Whinston argue that structural models are much more prevalent in empirical industrial organization than in labour economics because the types of economic problems and the nature of the data studied in the two fields are so different. However, they also note that a great deal of the difference in empirical approach is down to “cultural differences”, such as how large a role economic theory plays in each field overall.


It is therefore too hasty to divide economists into opposing camps based on the generalized description of the two approaches given above. In practice, there is no black-and-white split between structural and reduced-form work, and plenty of research blends elements of both. Rather than being at odds with each other, reduced-form analysis, experiments and structural modelling have in several instances been noted to provide, together, the strongest type of empirical evidence economists are able to offer. Instead of categorically sticking to one approach, it is perhaps better to make a flexible choice depending on the type of question being studied and the data available.


Otso Hao
