Journal of Econometrics 13 (1980) 5-25. © North-Holland Publishing Company

A SURVEY OF FRONTIER PRODUCTION FUNCTIONS AND THEIR RELATIONSHIP TO EFFICIENCY MEASUREMENT*

Finn R. FØRSUND
University of Oslo, Blindern, Oslo 3, Norway

C.A. Knox LOVELL
University of North Carolina, Chapel Hill, NC 27514, USA

Peter SCHMIDT
Michigan State University, East Lansing, MI 48824, USA

1. Introduction

The textbook definition of a production function holds that it gives the maximum possible output which can be produced from given quantities of a set of inputs. Similarly, a cost function gives the minimum level of cost at which it is possible to produce some level of output, given input prices. Finally, a profit function gives the maximum profit that can be attained, given output price and input prices.

For each of the above functions, the concept of maximality or minimality is important. The word frontier may meaningfully be applied in each case because the function sets a limit to the range of possible observations. Thus, for example, one may observe points below the production frontier (firms producing less than maximal possible output) but no points can lie above the production frontier; similar comments apply to suitably defined cost and profit frontiers. The amounts by which a firm lies below its production and profit frontiers, and the amount by which it lies above its cost frontier, can be regarded as measures of inefficiency. The measurement of inefficiency has been the main motivation for the study of frontiers. As we will see, it is possible (under most sets of assumptions so far advanced) to measure at least the average level of inefficiency within an industry; measurement of inefficiency for each individual firm depends on one's assumptions in a more fundamental way.

*The authors would like to thank Dennis Aigner for his help and encouragement in preparing this paper. The last two authors are also grateful to the National Science Foundation for its support of this research under grant SOC 78-12447.


From an econometric point of view, the estimation of frontiers is interesting because the concept of maximality (or minimality) puts a bound on the dependent variable (or, in some models, at least on some component of the dependent variable). Most econometric frontiers assume one-sided disturbances to allow for this, and some statistical problems (of a type not found in the classical regression model) must therefore be faced.

The plan of this paper is as follows. Section 2 gives a theoretical discussion of inefficiency. Section 3 presents a survey of econometric frontier models and other efficiency measurement models. Finally, section 4 raises a few rather philosophical questions about frontiers.

2. Theoretical underpinnings

Consider a firm employing n inputs x = (x₁, ..., xₙ)′, available at fixed prices w = (w₁, ..., wₙ)′ > 0, to produce a single output y that can be sold at fixed price p > 0. Efficient transformation of inputs into output is characterized by the production function f(x), which shows the maximum output obtainable from various input vectors. Under certain regularity conditions an equivalent representation of efficient production technology is provided by the cost function c(y, w) ≡ minₓ {w′x | f(x) ≥ y, x ≥ 0}, which shows the minimum expenditure required to produce output y at input prices w. A vector of cost minimizing input demands can be obtained by means of Shephard's lemma as x(y, w) = ∇_w c(y, w), provided ∇_w c(y, w) exists. Under certain regularity conditions a third equivalent representation of efficient production technology is provided by the profit function π(p, w) = max_{y,x} {py − w′x | f(x) ≥ y, x ≥ 0, y ≥ 0}, which shows the maximum profit available at output price p and input prices w. A vector of profit maximizing output supply and input demands can be obtained by means of Hotelling's lemma as y(p, w) = ∂π(p, w)/∂p, x(p, w) = −∇_w π(p, w), provided the derivatives exist. In the econometric literature the functions f(x), c(y, w), and π(p, w) are typically referred to as frontiers, since they characterize optimizing behavior on the part of an efficient producer and thus place limits on the possible values of their respective dependent variables.

Let us now suppose that the firm is observed at production plan (y⁰, x⁰). Such a plan is said to be technically efficient if y⁰ = f(x⁰), and technically inefficient if y⁰ < f(x⁰). [Note that y⁰ > f(x⁰) is assumed to be impossible.] One measure of the technical efficiency of this plan is provided by the ratio 0 ≤ y⁰/f(x⁰) ≤ 1. Technical inefficiency is due to excessive input usage, which is costly, and so w′x⁰ ≥ c(y⁰, w). Since cost is not minimized, profit is not maximized, and so (py⁰ − w′x⁰) ≤ π(p, w). The plan (y⁰, x⁰) is said to be allocatively efficient if fᵢ(x⁰)/fⱼ(x⁰) = wᵢ/wⱼ, and allocatively inefficient if fᵢ(x⁰)/fⱼ(x⁰) ≠ wᵢ/wⱼ, assuming f to be


differentiable. Allocative inefficiency results from employing inputs in the wrong proportions, which is costly, and so w′x⁰ ≥ c(y⁰, w). Since cost is not minimized, profit is not maximized, and so (py⁰ − w′x⁰) ≤ π(p, w). It follows that observed expenditure w′x⁰ coincides with minimum cost c(y⁰, w) if, and only if, the firm is both technically and allocatively efficient. If w′x⁰ > c(y⁰, w), this difference may be due to technical inefficiency alone, allocative inefficiency alone, or some combination of the two. It also follows that observed input usage x⁰ coincides with cost minimizing input demand x(y⁰, w) if and only if the firm is both technically and allocatively efficient. A combination of technical and allocative inefficiency causes xᵢ⁰ > xᵢ(y⁰, w) for at least some inputs, but may cause xⱼ⁰ ≤ xⱼ(y⁰, w) for some other inputs.

A combination of technical and allocative efficiency is necessary but not sufficient for (py⁰ − w′x⁰) = π(p, w). Necessity is obvious; the combination is not sufficient because the firm could still be scale inefficient. A firm is said to be scale efficient if p = c_y(y⁰, w), and it is said to be scale inefficient if p ≠ c_y(y⁰, w). It follows that (py⁰ − w′x⁰) = π(p, w) if, and only if, the firm is technically, allocatively, and scale efficient. If (py⁰ − w′x⁰) < π(p, w), this difference may be due to any combination of the three types of inefficiency. It also follows that observed output supply y⁰ and input usage x⁰ coincide with profit maximizing output supply y(p, w) and input usage x(p, w) if, and only if, the firm is technically, allocatively and scale efficient.

This discussion has been sufficiently general to accommodate any specification of the production, cost and profit functions that satisfies the usual regularity conditions required by duality theory. In his path-breaking paper Farrell (1957) proposed specific measures of technical and allocative efficiency. While his measures are valid for the restrictive technologies he considered, they do not generalize easily to technologies that are not linearly homogeneous, or to technologies in which strong input disposability and strict quasiconcavity are inappropriate. Generalizations of Farrell's measures have been proposed by Färe and Lovell (1978) and by Førsund and Hjalmarsson (1974, 1979b).
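As a concrete illustration of these duality relationships (a worked example of ours, not part of the original survey), take the Cobb-Douglas technology f(x) = x₁^(1/4) x₂^(1/4), which exhibits decreasing returns to scale so that the profit function exists:

```latex
% Worked duality example for f(x) = x_1^{1/4} x_2^{1/4}:
\begin{align*}
c(y,w) &= \min_{x \ge 0}\{w_1 x_1 + w_2 x_2 \mid x_1^{1/4} x_2^{1/4} \ge y\}
        = 2 y^2 \sqrt{w_1 w_2},\\
x_1(y,w) &= \frac{\partial c}{\partial w_1} = y^2 \sqrt{w_2/w_1}
  \quad \text{(Shephard's lemma)},\\
\pi(p,w) &= \max_{y \ge 0}\{p y - c(y,w)\} = \frac{p^2}{8\sqrt{w_1 w_2}},
  \qquad
  y(p,w) = \frac{\partial \pi}{\partial p} = \frac{p}{4\sqrt{w_1 w_2}}
  \quad \text{(Hotelling's lemma)}.
\end{align*}
% Scale efficiency at (y^0, x^0) requires price equal to marginal cost:
% p = c_y(y^0, w) = 4 y^0 \sqrt{w_1 w_2}.
```

A plan below this frontier with the right input mix is technically inefficient; a plan on the frontier but with f₁/f₂ ≠ w₁/w₂ is allocatively inefficient; a plan on the frontier with the right mix but p ≠ c_y is scale inefficient.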

3. Econometric models

It is the task of the econometrician to recast the preceding theoretical treatment into a framework that permits estimation of the various frontiers, and allows calculation of the magnitudes and costs of the various types of inefficiency relative to these frontiers. Although work has been progressing on this task for over two decades, only recently has it attracted widespread attention. Studies of frontier technology can be classified according to the way the frontier is specified and estimated. First, the frontier may be specified as a


parametric function of inputs, or it may not. Second, an explicit statistical model of the relationship between observed output and the frontier may be specified, or it may not. Finally, the frontier itself may be specified to be either deterministic or random. Not all eight permutations of these possibilities have been considered, but several have, and these will be considered in turn. We will also discuss an alternative approach which investigates efficiency without the explicit use of a frontier.

3.1. Deterministic non-parametric frontiers

The beginning point for any discussion of frontiers and efficiency measurement is the work of Farrell (1957), who provided definitions and a computational framework for both technical and allocative inefficiency. Consider a firm using two inputs x₁ and x₂ and producing output y, and assume that the firm's production function (frontier) is y = f(x₁, x₂). Assume that it is characterized by constant returns to scale, so that it may be written 1 = f(x₁/y, x₂/y). That is, frontier technology can be characterized by the unit isoquant. Let this unit isoquant be denoted UU′ in fig. 1.

Fig. 1

If the firm is observed using (x₁⁰, x₂⁰) to produce y⁰, let point A in fig. 1 represent (x₁⁰/y⁰, x₂⁰/y⁰). (By definition this cannot lie below UU′.) Then the ratio OB/OA measures technical inefficiency; it is the ratio of inputs needed to produce y⁰ to the inputs actually used to produce y⁰, given the input mix used. Now let PP′ represent the ratio of input prices. Then the ratio OD/OB measures allocative inefficiency, since the cost of point D is the same as that of the allocatively efficient point C, and is less than that of the technically efficient but allocatively inefficient point B. Finally, OD/OA measures total efficiency.


The efficient unit isoquant is of course not observable; it must be estimated from a sample of (possibly inefficient) observations like A above. Farrell's approach is non-parametric in the sense that he simply constructs the free disposal convex hull of the observed input-output ratios by linear programming techniques; the frontier is thus supported by a subset of the sample, with the rest of the sample points lying above it. This procedure is not based on any explicit model of the frontier or of the relationship of the observations to the frontier (other than the fact that observations cannot lie below the frontier). Farrell's approach has been extended and applied by Farrell and Fieldhouse (1962), Seitz (1970, 1971), Todd (1971), Afriat (1972), Dugger (1974) and Meller (1976).

The principal advantage of the approach is that no functional form is imposed on the data. The principal disadvantage is that the assumption of constant returns to scale is restrictive, and its extension to non-constant returns to scale technologies is cumbersome; see Farrell and Fieldhouse (1962) and Seitz (1971) for details. A second disadvantage of the approach is that the frontier is computed from a supporting subset of observations from the sample, and is therefore particularly susceptible to extreme observations and measurement error.
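To make the computation concrete, the following is a minimal sketch of Farrell's input-oriented efficiency measure posed as the now-standard constant-returns linear program; the formulation and the scipy-based implementation are our illustration, not Farrell's original computational scheme.

```python
import numpy as np
from scipy.optimize import linprog

def farrell_efficiency(X, y, j0):
    """Input-oriented Farrell technical efficiency of firm j0 under
    constant returns to scale.

    X : (J, n) array of input quantities for J firms and n inputs.
    y : (J,) array of outputs.
    Returns theta in (0, 1]; theta = 1 means firm j0 supports the frontier.
    """
    J, n = X.shape
    # Decision variables: [theta, lambda_1, ..., lambda_J].
    c = np.r_[1.0, np.zeros(J)]               # minimize theta
    # Inputs: sum_j lambda_j * x_ij <= theta * x_i,j0 for each input i.
    A_in = np.c_[-X[j0], X.T]                 # shape (n, 1 + J)
    # Output: sum_j lambda_j * y_j >= y_j0, written as <= with signs flipped.
    A_out = np.r_[0.0, -y][None, :]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(n), -y[j0]],
                  bounds=[(None, None)] + [(0, None)] * J)
    return res.fun

# Example: three firms, two inputs, unit output.
X = np.array([[2.0, 4.0], [4.0, 2.0], [4.0, 4.0]])
y = np.ones(3)
print(farrell_efficiency(X, y, 2))  # firm 2 is dominated: theta = 0.75
```

The optimal θ is the ratio OB/OA of fig. 1: the proportion by which firm j0 could radially shrink its inputs and still produce its output within the convex hull of the sample.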

3.2. Deterministic parametric frontiers

Although Farrell's non-parametric approach has won few adherents, a second approach proposed by Farrell has proved more fruitful. Almost as an afterthought, he proposed computing a parametric convex hull of the observed input-output ratios. The selection of functional forms being somewhat limited at the time, he recommended the Cobb-Douglas form. (It should be noted, however, that an estimated convex hull in the input space provides enough information to determine a production function only in the case of constant returns.) He acknowledged the undesirability of imposing a specific (and restrictive) functional form on the frontier, but also noted the advantage of being able to express the frontier in a simple mathematical form. Unfortunately Farrell did not follow up on his own suggestion, and it was over a decade before anyone else did.

Aigner and Chu (1968) were the first to follow Farrell's suggestion. They specified a homogeneous Cobb-Douglas production frontier, and required all observations to be on or beneath the frontier. Their model may be written

ln y = ln f(x) − u = α₀ + Σᵢ₌₁ⁿ αᵢ ln xᵢ − u,   u ≥ 0,   (1)


where the one-sided error term forces y ≤ f(x). The elements of the parameter vector α = (α₀, α₁, ..., αₙ)′ may be 'estimated' either by linear programming (minimizing the sum of the absolute values of the residuals, subject to the constraint that each residual be non-positive) or by quadratic programming (minimizing the sum of squared residuals, subject to the same constraint). Although Aigner and Chu did not do so, the technical efficiency of each observation can be computed directly from the vector of residuals, since u represents technical inefficiency.

The principal advantages of the parametric approach vis-à-vis the non-parametric approach are the ability to characterize frontier technology in a simple mathematical form, and the ability to accommodate non-constant returns to scale [note that Σᵢ₌₁ⁿ αᵢ = 1 is not imposed in (1)]. However, the mathematical form may be too simple; the parametric approach imposes structure on the frontier that may be unwarranted, although the restrictive homogeneous Cobb-Douglas specification has been relaxed by Førsund and Jansen (1977) and Førsund and Hjalmarsson (1979a), among others.

The parametric approach often imposes a limitation on the number of observations that can be technically efficient. In the homogeneous Cobb-Douglas case, for example, when the linear programming algorithm is used, there will in general be only as many technically efficient observations as there are parameters to be estimated. As was the case with the non-parametric approach, the 'estimated' frontier is supported by a subset of the data and is therefore extremely sensitive to outliers. One possibility, suggested by Aigner and Chu and implemented by Timmer (1971), is essentially just to discard a few observations. If the rate of change of the 'estimates' with respect to succeeding deletions of observations diminishes rapidly, this suggestion will be useful.

A final problem with this approach is that the 'estimates' which it produces really have no statistical properties. That is, mathematical programming procedures produce 'estimates' without standard errors, t-ratios, etc. Basically this is because no assumptions are made about the regressors or the disturbance in (1), and without some statistical assumptions inferential results cannot be obtained.
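The linear-programming variant is easy to state concretely. The following is a minimal sketch (our illustration, using scipy) of the Aigner-Chu procedure for the Cobb-Douglas frontier (1): minimize the sum of residuals subject to every observation lying on or beneath the fitted frontier.

```python
import numpy as np
from scipy.optimize import linprog

def aigner_chu_lp(lnX, lny):
    """'Estimate' the frontier ln y = a0 + sum_i a_i ln x_i - u by linear
    programming: minimize sum_t u_t subject to u_t >= 0 for every t.

    lnX : (T, n) logged inputs; lny : (T,) logged output.
    Returns (alpha, u): coefficients (intercept first) and residuals u_t.
    """
    T, n = lnX.shape
    Z = np.c_[np.ones(T), lnX]             # design matrix with intercept
    # u_t = z_t' alpha - lny_t, so sum_t u_t = (sum_t z_t)' alpha - constant:
    c = Z.sum(axis=0)                      # objective (constant term dropped)
    res = linprog(c, A_ub=-Z, b_ub=-lny,   # z_t' alpha >= lny_t for every t
                  bounds=[(None, None)] * (n + 1))
    alpha = res.x
    u = Z @ alpha - lny                    # one-sided technical inefficiency
    return alpha, u
```

The technical efficiency of observation t is then exp(−uₜ), and, consistent with the text, in general only n + 1 of the uₜ are zero at the LP solution.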

3.3. Deterministic statistical frontiers

The model of the previous section can be made amenable to statistical analysis by making some assumptions. This we proceed to do. Note that the model in (1) can be written as

y = f(x)e⁻ᵘ,   (2)


or

ln y = ln[f(x)] − u,   (3)

where u ≥ 0 (and thus 0 ≤ e⁻ᵘ ≤ 1), and where ln[f(x)] is linear in the Cobb-Douglas case exhibited in (1). The question that must be asked is what to assume about x and u. The answer that has been given most often is to assume that the observations on u are independently and identically distributed (iid), and that x is exogenous (independent of u). Any number of distributions for u (or, equivalently, for e⁻ᵘ) could be specified. Aigner and Chu did not explicitly assume such a model, though it seems clear that it was assumed implicitly. Afriat (1972) was the first to explicitly propose this model. He proposed a two-parameter beta distribution for e⁻ᵘ, and proposed that the model be estimated by the maximum likelihood method. This amounts to a gamma distribution for u, as considered further by Richmond (1974). On the other hand, Schmidt (1976) has shown that if u is exponential, then Aigner and Chu's linear programming procedure is maximum likelihood, while their quadratic programming procedure is maximum likelihood if u is half-normal.

It should be stressed that the choice of a distribution for u is important because the maximum likelihood estimates (MLE) depend on it in a fundamental way: different assumed distributions lead to different estimates. This is a problem because there do not appear to be good a priori arguments for any particular distribution.

A further problem with maximum likelihood in the frontier setting is that the range of the dependent variable (output) depends on the parameters to be estimated, as pointed out by Schmidt (1976). [This is so because y ≤ f(x) and f(x) involves the parameters which are to be estimated.] This violates one of the regularity conditions invoked to prove the general theorem that maximum likelihood estimators are consistent and asymptotically efficient. As a result, the statistical properties of the maximum likelihood estimators needed to be reconsidered. This is done by Greene (1980a), who shows that the usual desirable asymptotic properties of maximum likelihood estimators still hold if the density of u satisfies the following conditions:

of u is zero at u = 0;

(ii) the derivative of the density approaches zero as u approaches

of u with zero.

respect

to

its

parameters

As noted by Greene, the gamma density satisfies this criterion and is thus potentially useful here. However, it is a little troubling that one’s assumption about the distribution of technical inefftciency should be governed by statistical convenience.
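The Schmidt (1976) equivalences noted above follow from a short likelihood calculation; the following sketch is our reconstruction of the exponential case, not reproduced from the survey.

```latex
% Let u_t = ln f(x_t) - ln y_t >= 0. If u is exponential with mean theta,
\begin{align*}
\ln L(\alpha, \theta)
  &= -T \ln\theta - \frac{1}{\theta} \sum_{t=1}^{T}
     \bigl[\ln f(x_t) - \ln y_t\bigr],
  \qquad \ln f(x_t) \ge \ln y_t \;\; \text{for all } t.
\end{align*}
% For any theta > 0, maximizing over alpha means minimizing sum_t u_t
% subject to u_t >= 0: exactly the Aigner-Chu linear program. If instead
% u is half-normal, ln L = const - T ln(sigma) - (1/(2 sigma^2)) sum_t u_t^2,
% and maximizing over alpha minimizes sum_t u_t^2: the quadratic program.
```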


There is also an alternative method of estimation, apparently first noted by Richmond (1974), based on the ordinary least squares results; we will call this corrected OLS, or COLS. Suppose for simplicity that (3) is linear (Cobb-Douglas), and thus of the form of (1). Then if we let μ be the mean of u, we can write

ln y = (α₀ − μ) + Σᵢ₌₁ⁿ αᵢ ln xᵢ − (u − μ),   (4)

where the new error term has zero mean. Indeed the error term satisfies all of the usual ideal conditions except normality. Therefore eq. (4) may now be estimated by OLS to obtain best linear unbiased estimates of (α₀ − μ) and of the αᵢ. If a specific distribution is assumed for u, and if the parameters of this distribution can be derived from its higher-order (second, third, etc.) central moments, then we can estimate these parameters consistently from the moments of the OLS residuals. Since μ is a function of these parameters, it too can be estimated consistently, and this estimate can be used to 'correct' the OLS constant term, which is a consistent estimate of (α₀ − μ). COLS thus provides consistent estimates of all of the parameters of the frontier.

A difficulty with the COLS technique is that, even after correcting the constant term, some of the residuals may still have the 'wrong' sign so that these observations end up above the estimated production frontier. This makes the COLS frontier a somewhat awkward basis for computing the technical efficiency of individual observations. One response to this problem is provided by the stochastic frontier approach discussed below. Another way of resolving the problem is to estimate (4) by OLS, and then to correct the constant term not as above, but by shifting it up until no residual is positive, and one is zero. Gabrielson (1975) and Greene (1980a) have both shown that this correction provides a consistent estimate of α₀.
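As a concrete illustration of the shifted-constant variant just described, here is a minimal sketch (ours) of COLS for the Cobb-Douglas case:

```python
import numpy as np

def cols_frontier(lnX, lny):
    """Corrected OLS (COLS) for the Cobb-Douglas frontier, using the
    Gabrielson/Greene correction: shift the OLS intercept up until no
    residual is positive (and one is exactly zero).

    lnX : (T, n) logged inputs; lny : (T,) logged output.
    Returns (alpha, eff): frontier coefficients (intercept first) and
    per-observation technical efficiencies exp(-u_t).
    """
    T, n = lnX.shape
    Z = np.c_[np.ones(T), lnX]
    beta, *_ = np.linalg.lstsq(Z, lny, rcond=None)  # OLS; slopes consistent
    resid = lny - Z @ beta
    alpha = beta.copy()
    alpha[0] += resid.max()       # corrected intercept: consistent for alpha_0
    u = resid.max() - resid       # u_t >= 0, with min_t u_t = 0
    return alpha, np.exp(-u)
```

Note that the slope estimates are just the OLS estimates; only the intercept is corrected, which is exactly the point made in section 4.1 below.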

Another difficulty with the COLS technique is that the correction to the constant term is not independent of the distribution assumed for u. Consider the one-parameter gamma distribution

g(u; θ) = [1/Γ(θ)] u^(θ−1) e⁻ᵘ,   u ≥ 0,   θ > 0.

[...]

xᵢ/y = Σⱼ₌₁ⁿ aᵢⱼ √θᵢⱼ √(wⱼ/wᵢ),   i = 1, ..., n,   (15)

where θᵢⱼ > 0 measures allocative inefficiency. Using (15), the observed average cost is

w′(x/y) = Σᵢ₌₁ⁿ aᵢᵢwᵢ + ½ Σᵢ₌₁ⁿ Σⱼ≠ᵢ aᵢⱼ(√θᵢⱼ + √θⱼᵢ) √(wᵢwⱼ).   (16)


It is readily seen that w′(x/y) = c(y, w)/y if, and only if, (√θᵢⱼ + √θⱼᵢ) = 2 for all i ≠ j, which requires θᵢⱼ = 1 for all i ≠ j. If any θᵢⱼ ≠ 1, then w′(x/y) > c(y, w)/y, which simply reflects the fact that allocative inefficiency is costly. Note that symmetry holds in the average cost function (13) and the cost-minimizing input-output demand equations (14), but fails in the observed analogues (16) and (15). Hence a test for symmetry is also a test for allocative efficiency.

Toda shows how to estimate the system (15) and (16). His method produces estimates of the cost function parameters aᵢⱼ and the allocative inefficiency parameters θᵢⱼ. From the latter it is possible to infer directions, magnitudes and costs of allocative inefficiency. The outstanding advantage of this model is that it can be superimposed on a flexible functional form. It also appears to be possible to do much more with the model than Toda has done. For example, technical inefficiency might be introduced by adding one-sided disturbances to eqs. (15) and (16). The major difficulty with the model is that the inefficiency parameters θᵢⱼ are not firm-specific, and measure only the systematic portion of allocative inefficiency. To the extent that there are also random deviations from the optimal input ratios [e.g., as modelled by Schmidt and Lovell (1979)] the systematic portion of allocative inefficiency understates the average extent of allocative inefficiency.

We conclude this subsection with a summary observation. In the deterministic and stochastic frontier models inefficiency is introduced via the disturbance term, thereby complicating econometric implementation of those models. In the non-frontier models inefficiency is introduced via varying coefficients (Lau and Yotopoulos) or via asymmetry (Toda), making resort to relatively high-powered econometric techniques unnecessary, but reducing the information which can be obtained.

4. Philosophical questions

In surveying the work on frontiers which has been done or might be done, a number of philosophical points arise. While no claim is made that these can easily be settled, we will discuss a few points which seem important.

4.1. What are the 'frontier' and 'average' functions?

Many discussions of frontiers have made the point that one should estimate the frontier function (usually by some novel but complicated method), whereas OLS provides estimates of the 'average' function. However, not all authors appear to attach the same meaning to the words 'frontier' and 'average'. This is worth a little discussion, at least.


A typical definition of the frontier (whether deterministic or stochastic) is that it gives maximal output which can be attained, given a set of input quantities. But it is possible to think of this maximum as being taken either with respect to those firms (or plants) in the sample, or with respect to all firms (or plants) which could conceivably exist and embody current technology.

In the first case (maximality over the sample), one has defined the frontier as what might be called the best-practice frontier, to use Farrell's phrase. Clearly this is what is estimated by those methods which could be classified as non-statistical - that is, those methods which fit a (parametric or non-parametric) frontier without assuming the form of the distribution of the one-sided error. This would include those methods listed in sections 3.1 and 3.2. On the other hand, in the second case (maximality over all possible sample points, given technology), one has defined what might be called an absolute frontier. This is what is estimated by those methods which assume an explicit distributional form for the one-sided error - as is clear since such methods typically yield no 100%-efficient observation. [See Greene (1980a) on this point.] This would include those methods listed in sections 3.3-3.5.

This distinction could be held to be important since, from a theoretical point of view, it is the absolute frontier which represents current technology. However, the practical importance of the distinction is not likely to be large, since the absolute and best-practice frontiers necessarily converge asymptotically (as sample size grows without bound).

It seems somewhat more important to be clear about the nature of the so-called average function. A statistical parametric model such as (2) assumes that the firms differ only with respect to a technical inefficiency term, u, generated by iid drawings from some distribution. Under these conditions the average function is conceptually identical to the frontier, except for the realized value of the multiplicative efficiency term in (2), since there is only one production technology. The term frontier function is associated with maximal possible output, and the average function is naturally associated with mean output, for given input levels. The whole concept of an average function or average firm is a bit of a red herring in this context. Thus, for example, the fact that the COLS estimates of all the coefficients of the frontier (except the constant term) equal the OLS estimates should be viewed as natural, and not (as it sometimes seems to be) as a defect of the COLS procedure.

The notion of an average function would perhaps be more meaningful in a random coefficients model. An average function can then be defined as the function obtained when the random coefficients attain their expected values. A random coefficients model with a one-sided error (technical efficiency term) is a general type of stochastic frontier; one would not be able to estimate the frontier (i.e., the random coefficients) for individual firms, but only on average.


4.2. The nature of inefficiency

Every profession has its own beliefs, and the strongest belief of economists seems to be in maximizing behavior. Economists may quarrel over what it is that individuals and firms maximize, but they regard as near heresy the suggestion that individuals and firms may simply not maximize anything. From such a point of view it is a bit hard to talk meaningfully about inefficiency.

Consider a simple example: a farmer leaves a gate open, the cows get into the corn, the crop is ruined and the cows get sick. The farmer's neighbors would likely call this a mistake (in the present context, technical inefficiency). However, an economist might well object that it is costly to check every gate carefully enough to make sure that it stays closed, and that therefore the efficient strategy of the firm is to employ the input 'care in gate checking' only to the point where its cost balances the marginal damage prevented. As a result the cows should be expected to get into the corn occasionally. Of course, if this argument is accepted, it still can be pushed further - suppose the cows get into the corn more than occasionally, in the sense that it is clear that the input 'care in gate checking' is being used in less than the profit-maximizing amount. The farmer's neighbors would likely call this another mistake, and we would call it allocative inefficiency. However, again an economist might well object that the farmer must just be maximizing something other than profit, or that he faces unobserved constraints.

The above arguments can be found, for example, in Stigler's (1976) critique of the notion of X-efficiency proposed by Leibenstein (1966, 1976). Stigler basically takes the view that all perceived inefficiency is allocative inefficiency, and even this is perceived because of a failure on the part of the observer - e.g., a failure to measure all relevant inputs, or to correctly perceive what is being maximized, or to account for all of the constraints on the maximization process, etc. This kind of argument seems to be accepted by most economists. However, this is really an act of faith, since the dogma being proposed is not amenable to empirical proof or disproof. Furthermore, its relevance to the type of work surveyed in this paper is really more to the interpretation of the results than to the construction of the models.

Consider the (realistic) setting in which we have data on a cross-section of firms in an industry. The data include output, and the prices and quantities of perhaps four inputs. In such a setting, it is natural - as seen by the fact that many have done so - to write a system consisting of the production (or cost) function, and of the first-order conditions for either profit maximization or cost minimization. None of these equations will fit the data perfectly, so disturbances must be added. The question is what theory tells us about the nature and interpretation of these disturbances.


It is easiest to talk about the disturbances in the first-order conditions. A standard assumption is that these are normal; theory does not really dictate their form. They can be called a measure of allocative inefficiency; if the technology is simple enough to solve explicitly for the cost function from the production function and first-order conditions, we can see how much this raises cost. But the interesting question is of course relative to what state of the world cost is raised. If allocative inefficiency represents mistakes, cost is raised by these mistakes, and this is easy to interpret. But suppose, on the other hand, that we have a putty-clay situation in which the cost-minimizing strategy of the firm dictates that a new plant be built only occasionally, and in which, once the plant is built, certain input substitutions are impossible until the next plant is built. If relative input prices change, such a firm will be (some might prefer to say 'appear to be') allocatively inefficient during any particular year of operation. But to say that this raises cost is wrong; such a statement ignores the costs of adjustment which made the firm's strategy optimal. And it is not hard to find other similar examples. However, it should be noted that this does not argue against error terms on first-order conditions; it merely argues for caution in using the phrase 'allocative inefficiency' to describe the phenomena they capture.

Next we turn to the disturbance on the production function. Production frontiers are modelled with one-sided errors. Stochastic frontiers also have a two-sided error, while deterministic frontiers do not. The one-sided error term represents production below the frontier, and is called technical inefficiency. Now, it is certainly possible to question this setup. Consider an idealized situation in which we observed every detail of the production process, including every conceivable input (including such things as 'care in gate-closing', 'motivation', etc.) and every conceivable external circumstance (weather, behavior of wildlife, etc.). Then output would basically be deterministic. (This is the same as the argument that a coin flipped 1,000 times in exactly the same way will land the same way each time.) However, output is certainly not exactly determined, given a list of only, say, four inputs, and the error term in the production function is an expression of this.

A deterministic (or 'pure') frontier uses a purely one-sided error. Thus it is assumed to be meaningful to be able to define exactly the maximal possible output, given some set of relevant inputs. Thus, for example, given quantities of seed, land, labor, fertilizer and capital, maximal output of corn for a farmer is assumed to be determined (without error). Actual output is maximal output minus a (non-negative) inefficiency error. Clearly this assumes that all other conceivable inputs or external events have a maximal possible (i.e., bounded) effect. For example, it is assumed that there is a best possible state of weather, a best possible set of farming practices, a best


possible behavior by insects, etc., so that under these best possible circumstances frontier output (but no more!) may be attained.

A stochastic frontier uses a mixture of one-sided and two-sided (e.g., normal) errors. Thus, given quantities of a list of inputs, there is a maximal output that is possible, but this maximal level is random rather than exact. This assumes that some other inputs or external effects have maximal possible effects, but others have potentially unbounded effects. For example, the effects of weather and other external events might be regarded as normally distributed (and thus unbounded). Thus the stochastic frontier expresses maximal output, given some set of inputs, as a distribution (typically normal) rather than a point. However, it still must be possible to regard certain other inputs or external events as having maximal (best possible) values, so that their suboptimal values create the one-sided error. (Typically these things would be those that are associated with the management practices of the firm.) Also, it should be stressed that statistical 'noise' is found in every regression equation, and is usually argued to be normally distributed; this is just another reason for the stochastic nature of the frontier. For example, measurement errors on output fit in easily here, but create severe problems for a deterministic frontier.

Finally, it is possible to argue that there is not an optimal value of anything, and hence there is no reason for a one-sided error or error component. In this view the concept of maximality is discarded, and a production function is regarded as merely giving the distribution of output, given certain inputs. If this view is accepted then there is no reason to study frontiers, of course.

Clearly those people who use frontiers must accept the notion of maximality, and they often want to measure technical inefficiency. Thus failure to produce at the frontier is taken to be worth discovering, regardless of the reason for this failure. (This is not to deny that finding the reason for the failure would be worthwhile, if possible.) This is true whether the frontier is deterministic or stochastic. Deterministic frontiers are often argued to be consistent with economic theory, but in fact their chief advantage seems clearly to be the availability of a measure of technical inefficiency for each observation. Their chief disadvantage is that they are bound to be confounded by statistical 'noise'. For stochastic frontiers the situation is exactly reversed. Thus there is not yet a consensus on how one should, or whether one can, measure the technical efficiency of a firm, even if this is agreed to be a useful thing to measure.

As with most philosophical discussions, this one may in the end be too pessimistic. Philosophical arguments have seldom prevented the use of techniques which yield plausible results. In that sense the real test of frontier models is likely to be an empirical one, and the evidence is just now coming in.


References

Afriat, S.N., 1972, Efficiency estimation of production functions, International Economic Review 13, no. 3, Oct., 568-598.
Aigner, D.J. and S.F. Chu, 1968, On estimating the industry production function, American Economic Review 58, no. 4, Sept., 826-839.
Aigner, D.J., T. Amemiya and D.J. Poirier, 1976, On the estimation of production frontiers: Maximum likelihood estimation of the parameters of a discontinuous density function, International Economic Review 17, no. 2, June, 377-396.
Aigner, D.J., C.A.K. Lovell and P.J. Schmidt, 1977, Formulation and estimation of stochastic frontier production function models, Journal of Econometrics 6, no. 1, July, 21-37.
Broeck, J. van den, F.R. Førsund, L. Hjalmarsson and W. Meeusen, 1980, On the estimation of deterministic and stochastic frontier production functions: A comparison, Journal of Econometrics, this issue.
Chu, S.F., 1978, On the statistical estimation of parametric frontier production functions: A reply and further comments, Review of Economics and Statistics 60, no. 3, Aug., 479-481.
Dugger, R., 1974, An application of bounded nonparametric estimating functions to the analysis of bank cost and production functions, Ph.D. Dissertation (University of North Carolina, Chapel Hill, NC).
Färe, R. and C.A.K. Lovell, 1978, Measuring the technical efficiency of production, Journal of Economic Theory 19, no. 1, Oct., 150-162.
Farrell, M.J., 1957, The measurement of productive efficiency, Journal of the Royal Statistical Society A 120, part 3, 253-281.
Farrell, M.J. and M. Fieldhouse, 1962, Estimating efficient production under increasing returns to scale, Journal of the Royal Statistical Society A 125, part 2, 252-267.
Førsund, F.R. and L. Hjalmarsson, 1974, On the measurement of productive efficiency, Swedish Journal of Economics 76, no. 2, June, 141-154.
Førsund, F.R. and L. Hjalmarsson, 1976, Technical progress, best-practice production functions and average production functions in the Swedish dairy industry, Paper presented at the Econometric Society European Meetings, Helsinki, 23-27 Aug.
Førsund, F.R. and L. Hjalmarsson, 1979a, Frontier production functions and technical progress: A study of general milk processing in Swedish dairy plants, Econometrica 47, no. 4, July, 883-900.
Førsund, F.R. and L. Hjalmarsson, 1979b, Generalized Farrell measures of efficiency: An application to milk processing in Swedish dairy plants, Economic Journal, forthcoming.
Førsund, F.R. and E.S. Jansen, 1977, On estimating average and best practice homothetic production functions via cost functions, International Economic Review 18, no. 2, June, 463-476.
Gabrielson, A., 1975, On estimating efficient production functions, Working Paper no. A-85 (Chr. Michelsen Institute, Department of Humanities and Social Sciences, Bergen, Norway).
Greene, W.H., 1980a, Maximum likelihood estimation of econometric frontier functions, Journal of Econometrics, this issue.
Greene, W.H., 1980b, On the estimation of a flexible frontier production model, Journal of Econometrics, this issue.
Kopp, R.J. and V.K. Smith, 1978, The characteristics of frontier production function estimates for steam generating electric plants: An econometric analysis, Unpublished manuscript.
Lau, L.J. and P.A. Yotopoulos, 1971, A test for relative efficiency and application to Indian agriculture, American Economic Review 61, no. 1, March, 94-109.
Lee, L.F. and M.M. Pitt, 1978, Pooling cross-section and time-series data in the estimation of stochastic frontier production function models, Unpublished manuscript.
Lee, L.F. and W.G. Tyler, 1978, The stochastic frontier production function and average efficiency: An empirical analysis, Journal of Econometrics 7, no. 3, June, 385-390.
Leibenstein, H., 1966, Allocative efficiency vs. X-efficiency, American Economic Review 56, no. 2, June, 392-415.
Leibenstein, H., 1976, Beyond economic man (Harvard University Press, Cambridge, MA).
Meeusen, W. and J. van den Broeck, 1977, Efficiency estimation from Cobb-Douglas production functions with composed error, International Economic Review 18, no. 2, June, 435-444.


Meller, P., 1976, Efficiency frontiers for industrial establishments of different sizes, Explorations in Economic Research, Occasional Papers of the National Bureau of Economic Research 3, 379-407.
Olson, J.A., P. Schmidt and D.M. Waldman, 1980, A Monte Carlo study of estimators of stochastic frontier production functions, Journal of Econometrics, this issue.
Richmond, J., 1974, Estimating the efficiency of production, International Economic Review 15, no. 2, June, 515-521.
Schmidt, P., 1976, On the statistical estimation of parametric frontier production functions, Review of Economics and Statistics 58, no. 2, May, 238-239.
Schmidt, P., 1978, On the statistical estimation of parametric frontier production functions: Rejoinder, Review of Economics and Statistics 60, no. 3, Aug., 481-482.
Schmidt, P. and C.A.K. Lovell, 1979, Estimating technical and allocative inefficiency relative to stochastic production and cost frontiers, Journal of Econometrics 9, no. 3, Feb., 343-366.
Schmidt, P. and C.A.K. Lovell, 1980, Estimating stochastic production and cost frontiers when technical and allocative inefficiency are correlated, Journal of Econometrics, this issue.
Seitz, W.D., 1970, The measurement of efficiency relative to a frontier production function, American Journal of Agricultural Economics 52, no. 4, Nov., 505-511.
Seitz, W.D., 1971, Productive efficiency in the steam-electric generating industry, Journal of Political Economy 79, no. 4, July/Aug., 878-886.
Stevenson, R.E., 1980, Likelihood functions for generalized stochastic frontier estimation, Journal of Econometrics, this issue.
Stigler, G.J., 1976, The Xistence of X-efficiency, American Economic Review 66, no. 1, March, 213-216.
Timmer, C.P., 1971, Using a probabilistic frontier production function to measure technical efficiency, Journal of Political Economy 79, no. 4, July/Aug., 776-794.
Toda, Y., 1976, Estimation of a cost function when the cost is not minimum: The case of Soviet manufacturing industries, 1958-71, Review of Economics and Statistics 58, no. 3, Aug., 259-268.
Toda, Y., 1977, Substitutability and price distortion in the demand for factors of production: An empirical estimation, Applied Economics 9, no. 2, June, 203-217.
Todd, D., 1971, The relative efficiency of small and large firms, Committee of Inquiry on Small Firms Research Report no. 18 (HMSO, London).
Trosper, R.L., 1978, American Indian relative ranching efficiency, American Economic Review 68, no. 4, Sept., 503-516.
Yotopoulos, P.A. and L.J. Lau, 1973, A test for relative economic efficiency, American Economic Review 63, no. 1, March, 214-223.