Intl. Trans. in Op. Res. 14 (2007) 217–229

INTERNATIONAL TRANSACTIONS IN OPERATIONAL RESEARCH

Measuring efficiency and productive performance of colleges at the University of Santo Tomas: a nonparametric approach

Babeth Isa S. Fernando and Emilyn C. Cabanda
Graduate School, University of Santo Tomas, España, Manila 1008, Philippines
E-mail: [email protected] [Fernando], [email protected] [Cabanda]

Received 31 August 2005; received in revised form 20 February 2006; accepted 10 March 2006

© 2007 The Authors. Journal compilation © 2007 International Federation of Operational Research Societies. Published by Blackwell Publishing, 9600 Garsington Road, Oxford OX4 2DQ, UK and 350 Main St, Malden, MA 02148, USA.

Abstract

This paper estimates the relative efficiency and productive performance of 13 colleges at the University of Santo Tomas (UST), using data envelopment analysis (DEA) – Malmquist indices and a multi-stage model. DEA is a management evaluation tool that helps identify the most efficient and the inefficient decision-making units (DMUs) relative to the best-practice frontier. Total factor productivity (TFP) is measured for a sample of 13 colleges at UST over the period 1998–2003. Empirical results show that the main contributing factor to TFP growth is efficiency change; that is, UST colleges are operating technically efficiently on the frontier technology, although there is a downward shift in technological advancement. Our results further imply that, for the output–input mix used, UST colleges as a whole have recorded a higher level of technical efficiency than of innovation. These new findings contribute significantly to the existing literature on efficiency and productive performance in the education sector.

Keywords: higher education institution; data envelopment analysis; total factor productivity; technical efficiency; technological change

1. Introduction

Various studies have been conducted in the past to measure efficiency and productivity growth in educational institutions around the world (see e.g. Abbott and Doucouliagos, 1998, 2002; Athanassopoulos and Shale, 1997; Breu and Raab, 1994; Chen, 1997; Grosskopf and Mourtray, 2001; Johnes, 1996; Johnes and Johnes, 1995; Ng and Li, 2000). Data envelopment analysis (DEA) has been the most commonly used method for measuring the productivity of these educational institutions, where the level of profits is an imperfect indicator of performance. In the existing literature, DEA-education studies have focused more on across-university performance in a specific country, with a view to allocating resources so as to enhance efficiency or resource utilization. Studies have examined the relative efficiency of economics departments across universities (see Johnes and Johnes, 1995; Madden et al., 1997) or MBA programs (see Haksever and Muragishi, 1998). However, there is a dearth of studies on the management of universities taking financial resources and efficiency as objectives to give insights into a university's operations across disciplines. This apparent gap in the existing education-related DEA literature is addressed by this paper. Addressing this issue offers significant new empirical contributions on the efficiency and productive performance of academic disciplines in a university.

Aside from the non-profit sector, DEA has also been explored extensively in the profit sector, such as financial institutions, railways, airports, electricity, hospitals and health, shipping, hotels and restaurants, telecommunications, mass media, etc. However, the current interest seems to have been initiated by Charnes et al. (1978, 1994), who proposed DEA as a way of measuring performance in not-for-profit and public sector organizations where success cannot be measured by a single factor such as profit. Classic explanations of the technique are found in Charnes et al. (1989, 1994). Thus, DEA began as a new management science tool for technical efficiency analysis of non-profit sector decision-making units (DMUs). DEA evaluates the relative efficiency of homogeneous DMUs where there is no known relationship between the inputs used by an organizational unit and the outputs that it produces (Taylor and Harris, 2004). The efficiency frontier is therefore not known, but it can be estimated using data on the actual performance of the DMUs under consideration, in terms of the outputs that they produce for the level of inputs that they use. The essential characteristic of DEA is the transformation of the multiple-input, multiple-output DMU into a single 'virtual input' and a single 'virtual output' value for each DMU. The ratio of virtual output to virtual input provides a measure of technical efficiency.
In this paper, DEA models are applied to different colleges within one university only – the University of Santo Tomas in the Philippines. The rest of the paper is organized as follows: Section 2 describes the Philippine education system and provides a background on the University of Santo Tomas. Section 3 discusses the methodology and data sample. Section 4 reports empirical results and conclusions follow in Section 5.

2. Philippine education and the University of Santo Tomas

With an increasing number of young people enrolling in higher education courses, Philippine higher education institutions – like educational institutions in many other countries – have faced the problem of providing higher education in a more effective manner, one that enables existing resources to meet increasing demand. The number of students enrolled in higher education in the Philippines increased to 2.4 million in 2003–2004 [Commission on Higher Education (CHED), 2005]. Since the late 1990s, higher educational institutions in the Philippines have shown interest in providing better-quality education while maintaining their budgets, by attempting to raise the cost efficiency of Philippine universities (see, for instance, CHED, 2003; ADB.org Higher Education News, 2003; Phil. Education Information Resource, 2004). The Philippine higher educational institutions have attempted this primarily by pursuing greater size and scope, distributing higher education institutions across several branches rather than single-campus universities. Other initiatives over the last decade include: the setting-up of the tri-semester system of higher education; restructuring of fees; revising the curriculum of specialized courses; the introduction of competitive allocation of
research funding; and a steady rise in student numbers without a corresponding rise in real funding. This reorganization and growth of Philippine higher education has raised concerns about whether individual universities, and especially their colleges, operate at the best possible level of cost efficiency. The higher education system in the Philippines is classified into public and private higher education institutions. At present, there are 1479 higher education institutions in the country, broken down into 111 state universities and colleges, two CHED-supervised institutions, 44 local universities and colleges, 12 other government schools, five special higher education institutions and 1305 private institutions. Previously, the administration, supervision and regulation of higher education rested with the Department of Education (DepED) through its Bureau of Higher Education. However, in 1994, two laws were passed in Congress: (1) Republic Act No. 7722, creating the Commission on Higher Education (CHED); and (2) Republic Act No. 7796, creating the Technical Education and Skills Development Authority (TESDA). As a result of the trifocalization of education in 1994, the DepED now concentrates only on the administration, supervision and regulation of basic education (elementary and secondary education). TESDA, an agency attached to the Department of Labor, oversees post-secondary technical and vocational education. System and/or institutional governance and policy guidance over public and private higher educational institutions, as well as over degree-granting programs in all post-secondary educational institutions, rests with CHED, a department-level agency (attached to the Office of the President of the Philippines for administrative purposes), independent from and co-equal with DepED. The CHED is responsible for administering and supervising both public and private higher education institutions in the Philippines.
State universities and colleges (SUCs) are institutions funded by the national government. They have their own charters and are thus autonomous from CHED. CHED-supervised institutions, which are non-chartered colleges, are directly under the supervision of CHED, and their annual budget allocation is integrated into the government budget appropriation for CHED. Local universities and colleges, previously called community colleges, are those operated, supported and maintained by local government units. In addition, there are other government schools offering bachelor's and/or graduate degrees and advanced training, such as military and police academies, which are supervised and regulated by the Department of National Defense and the Philippine National Police. Private institutions, on the other hand, are owned and administered by private individuals, groups or corporations. These are classified as either sectarian or non-sectarian colleges and universities. Sectarian schools (325) are usually non-listed, non-profit institutions, owned and operated by religious orders; the University of Santo Tomas thus belongs to the sectarian sector. Meanwhile, the non-sectarian schools (980) are owned by private corporations not affiliated to any religious organization; the majority are listed, a few are non-listed non-profit corporations, and a number are foundations. The University of Santo Tomas, the only Pontifical and Royal Catholic University of the Philippines and a Dominican institution of learning under the inspiration and patronage of St. Thomas Aquinas, commits itself to the pursuit of truth and to the preservation, advancement and transmission of knowledge in the arts and sciences, both sacred and civil, through the use of reason illuminated by faith.
The university is composed of faculties, colleges and schools, which are organically independent from each other, as well as of departments, which are established to assist in the coordination of courses of study common to several or all faculties. Institutes
exist or may be established in the university as organically independent bodies thereof or as adjuncts of a particular faculty or college. At present, the university comprises 1525 faculty members, 238 academic officials, 110 administrative officials, 550 non-academic employees and 32,600 students, with an average of 2200 students per college, except for the colleges of Commerce, Engineering and Arts & Letters, whose student populations are consistently above 3000. The board of trustees is responsible for the formulation and approval of the policies, rules and standards of the university, followed by the Rector, who heads its operation. Implementation of policies and management is vested at the Vice-Rector level, namely VR-Academic Affairs, VR-Religious Affairs and VR-Finance. These Vice-Rector offices are supported by the different departments/offices, which assist the overall operations of the university. The faculties/colleges are composed of the following: Dean, Assistant Dean, Faculty Secretary, Regent, Academic Chairman and the Faculty members. The University of Santo Tomas, after almost four centuries of glorious existence, is as committed as ever to the quest for truth and the promotion of virtue among its students. However, the university has made only modest progress in its pursuit of excellence in teaching as it endeavors to attain its vision of upgrading the quality of instruction to global standards and of becoming a center of excellence in various academic programs. The University of Santo Tomas envisions that by 2011 it will be a center of excellence in various programs of teaching, an acknowledged expert in key areas of research in the pure and applied sciences, a leader in community/extension services, and the Center of Contextualized Theology in Asia. It also envisions for itself an extended physical presence beyond Manila and a more functional networking mechanism with other universities/institutions.
In order to achieve UST's purpose as a center of excellence in various programs of teaching, it is imperative that the university maximize efficiency so that it can grow the system, broaden the range of college performance and ensure that a high-quality education is provided. To this end, this study aims to measure the efficiency and productive performance of 13 UST colleges or faculties. The use of panel data provides information on what constitutes best practice in the sample of colleges. Specifically, this study has three major objectives: (i) to determine the main drivers of efficiency and productivity growth among UST colleges; (ii) to measure the comparative performance, in terms of efficiency and productivity changes, of the UST colleges over the period 1998–2003; and (iii) to identify the best and poorest performers among UST colleges.

3. Methodology and data sample

3.1. DEA-Malmquist index method (output orientated)

As with studies on the performance of non-profit organizations, this paper adopts a DEA approach to performance management. DEA is a nonparametric method of estimation. The advantage of such a method lies in its ability to evaluate the performance of DMUs with a multiple-output and multiple-input production process. It can provide a comprehensive index for measuring the overall efficiency of a DMU relative to a group of DMUs (Charnes et al., 1978).


We follow the Fare et al. (1994, p. 71) output-orientated productivity change index method:

$$ M_o(x^{t+1}, y^{t+1}, x^t, y^t) = \frac{D_0^{t+1}(x^{t+1}, y^{t+1})}{D_0^{t}(x^{t}, y^{t})} \left[ \frac{D_0^{t}(x^{t+1}, y^{t+1})}{D_0^{t+1}(x^{t+1}, y^{t+1})} \cdot \frac{D_0^{t}(x^{t}, y^{t})}{D_0^{t+1}(x^{t}, y^{t})} \right]^{1/2}. \qquad (1) $$

The output-orientated model puts the weight on the expansion of outputs from given input resources. The ratio outside the brackets in Equation (1) measures the change in relative efficiency (how far observed production is from maximum potential production) between year t and t+1. The geometric mean of the two ratios inside the brackets captures the shift in technology between the two periods, evaluated at x^t and x^{t+1}. Mathematically, DEA can be expressed as a linear program with inputs x_1, x_2, ..., x_m, outputs y_1, y_2, ..., y_s, a nonnegative intensity vector and an optimal solution to the problem. The Malmquist productivity index (M_o) indicates the performance of UST colleges or DMUs with the following interpretations: (1) M_o = 1 indicates constant performance, i.e. a DMU is 'efficient', implying that no other DMU or combination of DMUs is more efficient; (2) M_o > 1 indicates improved performance; and (3) M_o < 1 means declining performance. The index is derived with the aid of DEAP version 2.1 (Coelli, 1996). The Malmquist index has five component indices as used in this paper:

- Technical efficiency change (EFFCH), relative to CRS technology;
- Technological change (TECHCH);
- Pure technical efficiency change (PECH), relative to VRS technology;
- Scale-efficiency change (SECH); and
- Total factor productivity change (TFPCH).

Efficiency change is decomposed into scale efficiency change, which is relative to CRS technology, and pure efficiency change (under the variable returns to scale assumption). Scale efficiency change is the ratio of efficiency change to pure efficiency change, or simply EFFCH/PECH.
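As a hedged illustration of the index in Equation (1) and the decomposition just described (the function name and the sample distance values are hypothetical, not taken from the paper; in practice DEAP computes the four distance functions internally via separate DEA programs):

```python
import math

def malmquist(d_t_t, d_t_next, d_next_t, d_next_next):
    """Output-orientated Malmquist TFP index between periods t and t+1.

    Arguments are the four output distance functions:
    d_t_t      = D_0^t(x^t, y^t)
    d_t_next   = D_0^t(x^{t+1}, y^{t+1})
    d_next_t   = D_0^{t+1}(x^t, y^t)
    d_next_next = D_0^{t+1}(x^{t+1}, y^{t+1})
    """
    # Catching-up term: movement of the DMU toward the frontier (EFFCH).
    effch = d_next_next / d_t_t
    # Geometric mean capturing the shift of the frontier itself (TECHCH).
    techch = math.sqrt((d_t_next / d_next_next) * (d_t_t / d_next_t))
    # Total factor productivity change: TFPCH = EFFCH * TECHCH.
    tfpch = effch * techch
    return effch, techch, tfpch

# Hypothetical distance values for one college between two years.
effch, techch, tfpch = malmquist(0.8, 1.1, 0.7, 0.9)
# tfpch > 1 signals productivity improvement; tfpch < 1 signals regress.
```

PECH and SECH would be obtained by repeating the efficiency computation under VRS, with SECH = EFFCH/PECH as stated in the text.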

3.2. Input-orientated multi-stage model

In this paper, efficiency estimates are obtained by using the input-orientated VRS formulation of DEA, pioneered by Banker et al. (1984). The input-orientated approach is an input-reducing measure. The apparent usefulness of the multi-stage DEA approach is its ability to identify efficient projected points whose input and output mixes are as similar as possible to those of the inefficient points, and which are invariant to units of measurement (Coelli et al., 1998). Slack is an inherent feature of the multi-stage model and refers to excess or surplus use of input resources. Input slacks occur when the projection of an inefficient observation onto the efficient plane calls for a further reduction of one or more inputs. The slacks in the model provide the following information: (1) zero slack (= 0) means a DMU is efficient; and (2) positive slack (> 0) implies that a DMU is inefficient.
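A minimal sketch of the slack rule above (the matrix layout, names and toy numbers are my assumptions, not DEAP's interface): given a radial efficiency score and peer weights from the envelopment problem, the input slack is the gap left between the radially contracted inputs and the composite frontier target.

```python
import numpy as np

def input_slacks(X, lambdas, theta, k):
    """Input slacks for DMU k after the radial contraction.

    X is an (inputs x DMUs) matrix; lambdas are the peer weights and theta
    the radial score from the envelopment LP. A positive entry means that
    input must be reduced further; all-zero slacks indicate efficiency.
    """
    target = X @ lambdas            # composite frontier reference point
    return theta * X[:, k] - target

# Two inputs, two DMUs (columns); DMU 1 is benchmarked against DMU 0.
X = np.array([[2.0, 4.0],
              [3.0, 3.0]])
slacks = input_slacks(X, np.array([1.0, 0.0]), 1.0, 1)  # -> [2., 0.]
```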


The input-orientated model can be formally stated as follows:

$$ \min \theta $$

$$ \text{s.t.} \quad \sum_{j=1}^{n} \lambda_j x_{ij} \le \theta x_{i0}, \quad i = 1, 2, \ldots, m, \qquad (2) $$

$$ \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{r0}, \quad r = 1, 2, \ldots, s, \qquad (3) $$

$$ \lambda_j \ge 0, \quad j = 1, 2, \ldots, n, \qquad (4) $$

where DMU_0 represents one of the n DMUs under evaluation; x_{i0} and y_{r0} are the ith input and rth output for DMU_0; s is the number of outputs produced by a DMU; and m is the number of inputs used by a DMU. The DEA model above is under the constant returns to scale assumption. The VRS model is obtained by adding $\sum_{j=1}^{n} \lambda_j = 1$ as a constraint.

3.3. Data sample and variables

The 13 colleges of the University of Santo Tomas evaluated over the test period 1998–2003 are the following: Architecture, Fine Arts & Design, Commerce, Education, Music, Nursing, Pharmacy, Rehabilitation Science, Science, and the Faculties of Arts & Letters, Civil Law, Engineering and Medicine. Three outputs are used.[1] The first output is the total number of equivalent full-time students (EFTS) enrolled. The second is the total number of graduate degrees conferred. The third is the total revenue or income per college.[2] Likewise, three inputs are used. The first input is the total number of academic staff (full-time equivalent). The second is the total number of non-academic staff (full-time equivalent). The third is operating expenses, which include only those directly attributable to the colleges, such as light and water, operating and maintenance, representation and other miscellaneous expenses (office and school supplies, etc.), but exclude the administrative expenses under which the salaries and wages of personnel and faculty members fall. The revenue and expense figures used in this study are both in nominal terms. These output and input measures are calculated using a pooled panel of 78 observations. The data for analysis were obtained from the annual reports of the accounting, auditing, budget and registrar's departments of the University of Santo Tomas.

[1] The educational outputs of these sample colleges are similar and homogeneous in all respects; for instance, faculty members are paid based on their rank and not on their degree qualification. All students are charged tuition fees at the same rate per unit. Thus, these colleges generate the same income per semester, differing only in the number of enrollees.
[2] Total revenue or income was used as an output instead of research because at UST not all sample colleges incorporate research in their academic program, so including research as a variable would break the homogeneity of the sample data. Besides, for the purpose of measuring the productivity of any entity, revenue or income is one of the most commonly used variables for assessing whether an organization is maximizing its resources to the fullest.
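To make the formulation in (2)–(4) concrete, here is a self-contained sketch of the input-orientated VRS envelopment problem using `scipy.optimize.linprog` (the function name and toy data are mine, not UST figures; this is the single-stage radial score, which the multi-stage procedure then augments with slack-maximizing LPs):

```python
import numpy as np
from scipy.optimize import linprog

def vrs_input_efficiency(X, Y, k):
    """Input-orientated VRS (BCC) efficiency score theta for DMU k.

    X is (m inputs x n DMUs), Y is (s outputs x n DMUs); columns are DMUs.
    Solves: min theta  s.t.  X @ lam <= theta * X[:, k],
                             Y @ lam >= Y[:, k],  sum(lam) = 1,  lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    # Decision vector z = [theta, lam_1, ..., lam_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input rows (2):  X @ lam - theta * x_k <= 0
    A_in = np.hstack([-X[:, [k]], X])
    # Output rows (3): -Y @ lam <= -y_k  (i.e. Y @ lam >= y_k)
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, k]]
    # VRS convexity row (the added constraint): sum(lam) = 1
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0.0, None)] * n)
    return res.fun  # theta in (0, 1]; theta = 1 means technically efficient

# Toy data: one input, one output, two DMUs. DMU 1 uses twice the input of
# DMU 0 for the same output, so its input-saving score is 0.5.
X = np.array([[2.0, 4.0]])
Y = np.array([[1.0, 1.0]])
scores = [vrs_input_efficiency(X, Y, k) for k in range(2)]  # -> [1.0, 0.5]
```

Dropping the convexity row recovers the CRS model of Equations (2)–(4).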


4. Empirical results

4.1. Main drivers for efficiency and productivity growth

TFP growth is decomposed into two components, namely technological change and technical efficiency change. Technological change implies a shift in the frontier, while efficiency change is movement toward the frontier. Results shown in Table A1 reveal that the TFP growth of the colleges of the University of Santo Tomas was due to efficiency change, with an index score of 1.009 (0.9% growth/year) for the entire test period. Taken individually, the source of TFP growth for 12 colleges, namely Architecture, Fine Arts & Design, Arts & Letters, Commerce, Education, Music, Nursing, Pharmacy, Rehabilitation Science, Science, Civil Law and Medicine, was efficiency change, with index scores of 1.020, 1.019, 1.000, 1.000, 1.025, 1.003, 1.035, 1.033, 0.928, 1.008, 1.068 and 1.005, respectively. In contrast, only for the College of Engineering was TFP growth due to technological change, with a score of 0.996 (slightly below 1). This result is substantiated by the fact that the College of Engineering was the first to adopt the computerized teaching method. The efficiency score analysis shows that 11 colleges of the University of Santo Tomas are relatively efficient; their efficiency scores are above 1 (Architecture, Fine Arts & Design, Education, Music, Nursing, Pharmacy, Science, Civil Law and Medicine) or equal to 1.00 (Arts & Letters and Commerce). These results show that the resource utilization of these colleges is functioning well. As seen from Table A2, only two colleges (Engineering and Rehabilitation Science) obtained efficiency change scores below 1, with scores of 0.974 and 0.928, respectively. The low efficiency of these two colleges was affected by pure and scale inefficiencies in the case of Engineering and by pure inefficiency in the case of Rehabilitation Science. To further examine the input usage efficiency of UST colleges, we use the multi-stage model to produce slack results.
Four colleges (Engineering, Nursing, Rehab Science and Science) incurred slacks in their input resources in 2003 (see Table A3). This means that these colleges need to reduce their input usage further to be efficient; input-saving efficiency is needed for these four colleges to reach the best-practice frontier. For example, Engineering had an excess of nine academic staff (34.9%) and 10 non-academic staff (71%). To be efficient, this college would theoretically need to shed at least 19 staff. The number of its staff (both academic and non-academic) is not proportionate to the number of students enrolled and graduates produced. Thus, an input saving in its total staff force of about 105.9% is required for best-practice performance. Likewise, the Rehab Science and Science colleges incurred excesses of 10 academic staff and 12 non-academic staff, respectively. These two colleges may have surpluses in their total staff perhaps because of laboratories that need to be maintained and the research workload carried by their academic staff. However, no slack was detected in operating expenses for any college. This implies that UST colleges have utilized operating expenses (financial resources) efficiently to produce their desired outputs.

4.2. Comparative performance of UST colleges

Based on the DEA method, the comparative performance, in terms of efficiency and productivity changes, of UST colleges is computed for the full sample period 1998–2003. For ease of interpretation of the individual performance of each college, Table A4 presents TFP scores for the
sample period. Recall that if the value of the Malmquist index or any of its components is < 1, it denotes regress or deterioration in productivity, whereas values > 1 denote improvements in productivity (Fare et al., 1994). Table A4 shows that, on average, productivity decreased by about 15% from 1998 to 2000, with index scores of 0.982 and 0.838, respectively. This decline may be due to a drastic decrease in productivity in the College of Commerce, from 2.993 (1998–1999) to a below-average 0.341 (1999–2000). One significant reason for this collapse was the implementation of a tuition fee increase for the academic year 1999–2000 that was almost 97% higher than in previous years, which subsequently caused the number of enrollees to decrease by 6%. The high expenditures incurred by the college were due to the acquisition of advanced technology and the installation of modern facilities. Likewise, the payment of three years' back wages to faculty members added to the decline of total productivity in the succeeding year. Furthermore, the analysis reveals that this abrupt decrease in the productivity level of the College of Commerce was attributable to a technological regression, from an above-average score of 1.162 (1998–1999) to a below-average 0.694 (1999–2000). However, productivity increased slightly over the succeeding three-year period (2000–2003), with yearly TFP scores of 0.946, 0.988 and 1.006, respectively. On average, the annual growth was due to improvements in efficiency rather than to innovation. Turning to the results of each college per year, we note that the Colleges of Nursing and Pharmacy are the only two colleges that consistently maintained productivity growth above 1 for the sample period (1999–2003).

4.3. Best and worst performers among UST colleges

Taken as a group, the colleges of UST perform well against each other. The results are not substantially different across the three-input, three-output models. Overall, the level of technical efficiency of the Santo Tomas colleges in the university system appears to be high. However, it cannot be concluded that there is no scope for improvement in efficiency: the results indicate homogeneity in performance within the university system, but they do not rule out the possibility that the entire system may be underperforming. Looking at Table A5, we see that the only top-performing colleges with a productivity level above 1 are Pharmacy (1.056) and Nursing (1.056). On the other hand, the three colleges with TFP scores below 90% are Music (0.886), Arts & Letters (0.874) and Rehabilitation Science (0.851). The rest obtained scores between 90% and 98%. When it comes to efficiency scores, 11 out of the 13 colleges scored above 1. The top five are Civil Law (1.068), Nursing (1.035), Pharmacy (1.033), Education (1.025) and Architecture (1.020), while the two colleges that scored below 1.0 are Engineering and Rehabilitation Science. This result implies that the colleges are better in their technical efficiency, or managerial performance, in utilizing and maximizing their resources. Based on technological performance, only two colleges have scores above 1: Pharmacy and Nursing. Eight of the remaining colleges have scores close to the efficiency level of 1.0, with scores of at least 90%. Only three colleges, namely Fine Arts & Design, Music and Arts & Letters, obtained below-average scores of 80% or less, making them the worst-performing colleges. However, this overall result shows that there is a lack of technological innovation, or shift
in the frontier, in the colleges of UST. This further implies that UST lags behind in injecting new and advanced technology into its individual colleges as well as into the overall operating system of the university. This deterioration in technology needs urgent attention from UST administrators and officials in order to catch up with technological innovation.

5. Conclusions

Universities are an important component of human capital formation. They are also a major expenditure component for taxpayers. The efficiency with which inputs produce desired outputs is thus an important public policy issue. Moreover, with increased global competition for students, the efficiency of universities is an international issue. In this paper, DEA was used to estimate the relative efficiency and productive performance of colleges at the University of Santo Tomas over the period 1998–2003. The technical efficiency results suggest that the Santo Tomas colleges operate at a fairly high level of efficiency relative to each other, although there is still room for improvement within the university. From the results, we can infer that efficient utilization of financial resources can help attain productive performance across academic disciplines. This brings new insights into university operations across disciplines. The first task of university management, after the DEA results have been interpreted, is to ensure that the answers that emerged are accepted throughout the institution, with a view to getting the approach understood and adopted. The task thus involves finding explanations for variations in performance. One way to address this is to inspect the key characteristics of each frontier college and then compare them with those of the inefficient college(s) in the sample. An inefficient college can then learn from the frontier unit and/or explain the causes of its own inefficiency. More administrative attention may then be paid to those that perform poorly, especially colleges with declining innovation performance. For school administrators, it is important to know the colleges that define the frontier, and their relative peers, in order to maintain a good level of performance, because it is the ratio of a college's 'own' efficiency over time (the catching-up effect) that explains its TFP development.
For future research, it is important to explore the extent to which inefficiencies are related to regulation, managerial practices and workplace relations, which are acknowledged limitations of this paper. In addition, there is also a need to compare the operation of the Santo Tomas colleges with institutions or colleges from other universities in the Philippines, and even with universities in other countries.

Acknowledgments

Our sincerest acknowledgment goes to the Administrative Offices and the Graduate School of the University of Santo Tomas, Philippines for the financial support provided for this research and conference presentation. Special thanks are given to Dean Lilian J. Sison, Rector Fr. Tamerlane R. Lana, O.P. and Vice-Rector Fr. Juan V. Ponce, O.P. for their helpful support of this research and presentation. Tim Coelli is hereby acknowledged for making available the DEAP v2.1 software for empirical analysis in this paper. Participants of the DEA Session at the IFORS
Triennial 2005 Conference, held at Hilton Hawaiian Village Beach Resort & Spa, Honolulu, Hawaii on July 11–15, 2005, are also acknowledged for their valuable comments during the presentation of this paper. Remaining errors found in this paper are the responsibility of the authors alone.

References

Abbott, M., Doucouliagos, C., 1998. The efficiency of Australian universities: a data envelopment analysis. Economics of Education Review 22, 89–97.
Abbott, M., Doucouliagos, C., 2002. A data envelopment analysis of the efficiency of Victorian TAFE institutes. The Australian Economic Review 35, 55–69.
Athanassopoulos, A.D., Shale, E., 1997. Assessing the comparative efficiency of higher education institutions in the UK by means of data envelopment analysis. Education Economics 5, 118–134.
Banker, R.D., Charnes, A., Cooper, W.W., 1984. Some models for estimating technical and scale inefficiencies in data envelopment analysis. Management Science 30, 1078–1092.
Breu, T.M., Raab, R.L., 1994. Efficiency and perceived quality of the nation's top 25 national universities and national liberal arts colleges: an application of data envelopment analysis to higher education. Socio-Economic Planning Sciences 28, 33–45.
Charnes, A., Cooper, W., Lewin, A.Y., Seiford, L.M., 1994. Data Envelopment Analysis: Theory, Methodology and Applications. Kluwer Academic Publishers, Dordrecht.
Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429–444.
CHED, 2003. Commission on Higher Education Manual. CHED, Pasig City, Philippines.
CHED, 2005. Commission on Higher Education Manual. CHED, Pasig City, Philippines.
Chen, T.Y., 1997. A measurement of the resource utilization efficiency of university libraries. International Journal of Production Economics 53, 71–80.
Coelli, T., 1996. A Guide to DEAP Version 2.1: A Data Envelopment Analysis (Computer) Program. Department of Econometrics, University of New England, Armidale, NSW, Australia.
Coelli, T., 1998. Productivity in Australian electricity generation: will the real TFP measure please stand up? CEPA Working Paper, Department of Econometrics, University of New England, NSW, Australia.
Coelli, T., Rao, P., Battese, G., 1998. An Introduction to Efficiency and Productivity Analysis. Kluwer Academic Publishers, Dordrecht.
Fare, R., Grosskopf, S., Norris, M., Zhang, Z., 1994. Productivity growth, technical progress, and efficiency change in industrialized countries. The American Economic Review 84, 66–83.
Grosskopf, S., Mourtray, C., 2001. Evaluating performance in Chicago public high schools in the wake of decentralization. Economics of Education Review 20, 1–14.
Haksever, C., Muragishi, Y., 1998. Measuring value in MBA programmes. Education Economics 6, 11–25.
Johnes, J., 1996. Performance assessment in higher education in Britain. European Journal of Operational Research 89, 18–33.
Johnes, J., Johnes, G., 1995. Research funding and performance in U.K. university departments of economics: a frontier analysis. Economics of Education Review 14, 301–314.
Madden, G., Savage, S., Kemp, S., 1997. Measuring public sector efficiency: a study of economics departments at Australian universities. Education Economics 5, 165–185.
Ng, Y.C., Li, S.K., 2000. Measuring the research performance of Chinese higher education institutions: an application of data envelopment analysis. Education Economics 8, 2–139.
Taylor, B., Harris, G., 2004. Relative efficiency among South African universities: a data envelopment analysis. Higher Education 47, 73–89.


Appendix

Table A1
Geometric average of TFP and its components of the colleges of the University of Santo Tomas, 1998–2003

Colleges                  TFPCH    EFFCH    TECHCH
Architecture              0.966    1.020    0.946
Fine Arts                 0.903    1.019    0.887
Arts & Letters            0.874    1.000    0.874
Commerce                  0.961    1.000    0.961
Education                 0.973    1.025    0.949
Engineering               0.970    0.974    0.996
Music                     0.886    1.003    0.883
Nursing                   1.056    1.035    1.020
Pharmacy                  1.056    1.033    1.022
Rehabilitation Science    0.851    0.928    0.917
Science                   0.970    1.008    0.963
Civil Law                 0.982    1.068    0.919
Medicine                  0.929    1.005    0.924
All colleges mean         0.950    1.009    0.942

TFPCH, total factor productivity change; EFFCH, technical efficiency change; TECHCH, technological change.
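As a consistency check, the figures in Table A1 satisfy the standard Malmquist decomposition TFPCH = EFFCH x TECHCH (Fare et al., 1994) up to rounding. A minimal Python sketch of the check, using the values reported above:

```python
# Malmquist decomposition (Fare et al., 1994): TFPCH = EFFCH x TECHCH.
# Values are copied from Table A1; the product matches the reported TFPCH
# up to the three-decimal rounding used in the table.
table_a1 = {
    # college: (TFPCH, EFFCH, TECHCH)
    "Architecture": (0.966, 1.020, 0.946),
    "Fine Arts": (0.903, 1.019, 0.887),
    "Nursing": (1.056, 1.035, 1.020),
    "Civil Law": (0.982, 1.068, 0.919),
    "Rehabilitation Science": (0.851, 0.928, 0.917),
    "All colleges mean": (0.950, 1.009, 0.942),
}
for college, (tfpch, effch, techch) in table_a1.items():
    assert abs(tfpch - effch * techch) < 0.005, college
print("decomposition holds for all rows")
```

Values below 1.000 indicate regress over the period; the decomposition makes explicit that, for most colleges, the shortfall comes from TECHCH (frontier shift) rather than EFFCH (catching up).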

Table A2
Geometric average of efficiency and its components: pure efficiency change and scale efficiency change, 1998–2003

Colleges                  EFFCH    PECH     SECH
Architecture              1.020    1.015    1.006
Fine Arts                 1.019    1.015    1.004
Arts & Letters            1.000    1.000    1.000
Commerce                  1.000    1.000    1.000
Education                 1.025    1.023    1.001
Engineering               0.974    0.985    0.989
Music                     1.003    1.000    1.003
Nursing                   1.035    1.042    0.993
Pharmacy                  1.033    1.049    0.985
Rehabilitation Science    0.928    0.916    1.013
Science                   1.008    1.007    1.000
Civil Law                 1.068    1.023    1.044
Medicine                  1.005    1.000    1.005
All colleges mean         1.009    1.005    1.003

EFFCH, technical efficiency change; PECH, pure efficiency change; SECH, scale efficiency change.
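Analogously, technical efficiency change in Table A2 factors into its pure and scale components, EFFCH = PECH x SECH. A short sketch checking this against the reported values:

```python
# Efficiency-change decomposition: EFFCH = PECH x SECH.
# Values are copied from Table A2; equality holds up to rounding.
table_a2 = {
    # college: (EFFCH, PECH, SECH)
    "Engineering": (0.974, 0.985, 0.989),
    "Pharmacy": (1.033, 1.049, 0.985),
    "Rehabilitation Science": (0.928, 0.916, 1.013),
    "Civil Law": (1.068, 1.023, 1.044),
}
for college, (effch, pech, sech) in table_a2.items():
    assert abs(effch - pech * sech) < 0.005, college
print("EFFCH = PECH x SECH holds for all rows")
```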


Table A3
Summary of input slacks in 2003

Colleges                  Total no. of academic staff    Total no. of non-academic staff    Operating expenses
Architecture              0.000                          0.000                              0.000
Fine Arts                 0.000                          0.000                              0.000
Arts & Letters            0.000                          0.000                              0.000
Commerce                  0.000                          0.000                              0.000
Education                 0.000                          0.000                              0.000
Engineering               8.786                          9.755                              0.000
Music                     0.000                          0.000                              0.000
Nursing                   0.000                          0.147                              0.000
Pharmacy                  0.000                          0.000                              0.000
Rehabilitation Science    10.213                         0.000                              0.000
Science                   0.000                          11.700                             0.000
Civil Law                 0.000                          0.000                              0.000
Medicine                  0.000                          0.000                              0.000
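An input slack is the residual input reduction still available after a college's inputs have been contracted radially to the frontier. The sketch below illustrates the arithmetic; the slack of 8.786 academic staff is Engineering's figure from Table A3, while the observed staffing level and efficiency score are hypothetical numbers used for illustration only.

```python
def dea_input_target(observed, efficiency, slack):
    """Efficient input level: radial contraction of the observed input,
    minus any remaining (non-radial) slack."""
    return observed * efficiency - slack

# Engineering's 2003 academic-staff slack from Table A3 is 8.786.
# The observed staff of 120 and efficiency score of 0.95 are invented
# here purely to show how a slack translates into an input target.
target = dea_input_target(observed=120.0, efficiency=0.95, slack=8.786)
print(round(target, 3))  # 120 * 0.95 - 8.786 = 105.214
```

In words: even after a proportional cut to the frontier, Engineering could shed a further 8.786 academic staff (and 9.755 non-academic staff) without reducing outputs, whereas operating expenses show no slack for any college.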

Table A4
TFP performance of UST colleges per year

Colleges                  1998–1999    1999–2000    2000–2001    2001–2002    2002–2003
Architecture              0.884        0.817        0.999        1.003        1.161
Fine Arts                 0.889        0.778        0.965        0.936        0.961
Arts & Letters            0.867        0.805        0.804        0.940        0.966
Commerce                  2.993        0.341        0.934        0.939        0.913
Education                 0.918        0.911        0.976        0.970        1.102
Engineering               1.025        1.007        0.938        1.004        0.884
Music                     0.741        0.893        0.791        1.064        0.978
Nursing                   0.917        1.020        1.129        1.058        1.177
Pharmacy                  0.992        1.083        1.112        1.075        1.023
Rehabilitation Science    0.735        0.869        0.789        0.957        0.928
Science                   0.912        0.916        1.030        1.014        0.984
Civil Law                 1.054        0.923        0.928        1.007        1.005
Medicine                  0.867        0.862        0.986        0.902        1.041
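The annual indices in Table A4 tie back to Table A1: the geometric average of a college's five annual TFP changes reproduces its 1998–2003 figure. A quick check for two colleges:

```python
from math import prod  # Python 3.8+

# Annual TFP changes from Table A4; their geometric mean should equal the
# 1998-2003 averages reported in Table A1 (0.966 and 1.056, respectively).
yearly = {
    "Architecture": [0.884, 0.817, 0.999, 1.003, 1.161],
    "Nursing": [0.917, 1.020, 1.129, 1.058, 1.177],
}
for college, changes in yearly.items():
    geo_mean = prod(changes) ** (1 / len(changes))
    print(college, round(geo_mean, 3))
# Architecture 0.966
# Nursing 1.056
```

The geometric (rather than arithmetic) average is the appropriate summary here because the annual figures are multiplicative index changes.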


Table A5
Performance ranking of TFP and components

TFP
Rank    Colleges                  TFP
1       Pharmacy                  1.056
1       Nursing                   1.056
2       Civil Law                 0.982
3       Education                 0.973
4       Science                   0.970
4       Engineering               0.970
5       Architecture              0.966
6       Commerce                  0.961
7       Medicine                  0.929
8       Fine Arts                 0.903
9       Music                     0.886
10      Arts & Letters            0.874
11      Rehabilitation Science    0.851

EFFCH
Rank    Colleges                  EFFCH
1       Civil Law                 1.068
2       Nursing                   1.035
3       Pharmacy                  1.033
4       Education                 1.025
5       Architecture              1.020
6       Fine Arts                 1.019
7       Science                   1.008
8       Medicine                  1.005
9       Music                     1.003
10      Commerce                  1.000
10      Arts & Letters            1.000
11      Engineering               0.974
12      Rehabilitation Science    0.928

TECHCH
Rank    Colleges                  TECHCH
1       Pharmacy                  1.022
2       Nursing                   1.020
3       Engineering               0.996
4       Science                   0.963
5       Commerce                  0.961
6       Education                 0.949
7       Architecture              0.946
8       Medicine                  0.924
9       Civil Law                 0.919
10      Rehabilitation Science    0.917
11      Fine Arts                 0.887
12      Music                     0.883
13      Arts & Letters            0.874

TFP, total factor productivity; EFFCH, technical efficiency change; TECHCH, technological change.
