
Evaluation of the Web-based Ranking Methodologies of the World Universities

Yasser M. Abd El-Latif
Department of Mathematics, Faculty of Sciences
[email protected]

Taymoor M. Nazmy and Mohamed F. Tolba
Faculty of Computer & Information Sciences, Ain Shams University, Abbassia (11566), Cairo, Egypt
[email protected], [email protected]

Abstract
The purpose of this paper is to provide an evaluation and comparative study of the methodologies applied by sites dedicated to evaluating and ranking the world universities. A set of web sites dealing with the ranking of higher education is listed; it is not a comprehensive list but rather offers a general sense of the range of existing resources that might be considered in the evaluation of websites. This paper presents a survey of the most important sites in this context: Shanghai, Webometrics, Times Higher Education World University, Maclean's and Education Guardian's. The ranking criteria of each site are stated, and the comparison between the rankings of the Shanghai and Webometrics sites shows a good correlation.

Keywords: Ranking of world universities, ranking criteria.

1. Introduction
Evaluation of educational web sites is a subject area that has not had a lot of published research [1], [2], [3], [4]. With the enormous amount of information and web sites available on the Internet, teachers may feel overwhelmed when it comes to evaluating web sites. The criteria selected for evaluating websites depend on the purpose of the site and on what the user wants from the site. The main aim of this work is to show the methodology utilized for the quantitative evaluation and comparison of sites in the operational phase. Determining the rank of a given university on a given indicator is a simple matter, since the relative size of the indicator is a direct indication of its position relative to the other universities' values on that indicator. However, matters are substantially more complicated in the calculation of the "over-all" rankings. The over-all ranking, as the name suggests, is the rank placement of a university taking into account how all the universities in the comparison group performed on their respective aggregated indicators. We will provide the recipe for doing this in a subsequent section, but first we need to describe what the indicators are and how most evaluation sites, such as Shanghai [5], Webometrics [6], Times Higher Education World University (THE) [7], Maclean's [2] and Education Guardian's [8], weight them. A particular aspect of the evaluation criteria is connectivity: whether, and how easily, one can connect to the site. SantaVicca regards this as the most important criterion: "If one cannot access the information, one cannot evaluate the information." [1]

Caywood also treats connectivity as a significant criterion, considering whether a site is frequently overloaded and whether the URL (Uniform Resource Locator) is stable. The ranking is based on a combined indicator that takes into consideration the volume of the published material on the web, and the visibility and impact of these webpages as measured by the sitations (site citations) or links they receive (inlinks). It is derived from the Web Impact Factor, built on the same idea as the bibliographic-database-based Impact Factor of the Journal Citation Reports published by the Institute of Scientific Information, now Thomson Scientific. The number of global rankings available for comparison purposes is very limited, so only six different sources were used for extracting the data. As absolute values are not comparable, all the figures were converted into rankings. The data providers are:


- The Webometrics Ranking of World Universities is the basis for the comparison, although the data were segregated into components for the different purposes. The data came from the November 2005 round: the ranks for size (number of pages) and rich files (documents in pdf, doc, ps or ppt formats) were used for the productivity tables, the number of external inlinks appears in the Visibility section, and the global Webometrics Rank is used for the impact tables.
- Essential Science Indicators is a powerful database elaborated by The Thomson Corporation that provides the number of papers, and the citations received by those papers, for a large number of institutions worldwide.
- Google Scholar, still in a beta phase, provides access to a large (and increasing) number of interlinked databases of bibliographic and webliographic records of scientific papers.
- The data for popularity (number of visits) were extracted from the Alexa database [10]. This system intercepts a large number of visits through its toolbar and spyware distributed almost evenly worldwide. The rank indicates the position of the institution in the global database, although the positions were reorganized by institution (universities) and regionalized (due to a geographical bias detected in the Far East area), and finally appear in the visibility tables.
- The Academic Ranking of World Universities, updated in 2005, is elaborated by the Institute of Higher Education of the Shanghai Jiao Tong University. The combined indicator was recalculated using their own data to provide a unique position for every one of the 500 universities in the list. This position was used as an impact measurement.
- The Times Higher Education World University Rankings is an international ranking of the world's top universities published by Times Higher Education (THE). A publisher of international education rankings since 2004, THE created a new ranking methodology whose citation database information is compiled in partnership with Thomson Reuters. This rank was also considered as an impact measurement.

The paper is organized as follows. Section 2 gives a brief summary of the evaluation of web sites. Section 3 introduces the different ranking methodologies. Section 4 presents the combined criteria on which the rankings are based, dealing especially with the Shanghai, Webometrics, THE and Maclean's web sites. The analysis and comparison between the rankings of the Shanghai and Webometrics sites, which show a good correlation, are discussed in Section 5. Finally, in Section 6, we conclude this paper with further observations.

2. Web Sites Evaluation
There are many general web sites that outline basic standards for evaluating Internet information. The sample of evaluation sites can be categorized as international evaluation, such as international business schools [9], international universities and international university sites, and regional evaluation, such as the UK, USA, Canada, Germany, Asia and Australia. Table 1 illustrates a sample of evaluation sites for international and regional ranking. We estimate the usability of each evaluation site using the Alexa ranking [10], which ranks sites according to their number of visitors.

(a) International ranking
Ranked Targets | Ranking Site | Alexa Ranking
International Business Schools | EMBA rankings (Executive Masters of Business Administration) (http://rankings.ft.com/rankings//index.jsp) | 1,023,461
International universities | Academic Ranking of World Universities by Shanghai Jiao Tong University (http://ed.sjtu.edu.cn/ranking.htm) | 131,729
International universities sites | World Universities' ranking on the Web (http://www.webometrics.info) | 27,208

(b) Regional ranking
Ranked Targets | Ranking Site | Alexa Ranking
UK | University Guide (http://education.guardian.co.uk/) | 169
USA | U.S. News & World Report: Rankings & Guides (http://www.usnews.com/usnews/rankguide/rghome.htm) | 3,309
Canada | The Maclean's ranking (http://www.macleans.ca/universities/) | 33,519
Germany | DAAD ranking (http://www.daad.de/deutschland/studium/hochschulranking/04690.en.html) | 34,554
Australia | Top 100 ranked universities for international students (http://www.australian-universities.com/rankings/) | 157,113
Asia | Asia's Best Universities 2000 (http://www.asiaweek.com/asiaweek/features/universities2000/index.html) | 28,531,537

Table 1: Sample of Evaluation Sites. (a) International Ranking, (b) Regional Ranking.

Many of the other evaluation criteria outlined in other sites [5], [6], [8], [11] can be included when evaluating educational web sites; however, there are two ways to evaluate a web site: one based on indirect contact (the home page) and the other based on direct contact. Teachers believe that an educational web site and its information need to be given a specific set of criteria for evaluation. There has been much discussion about the international standing of universities, often related back to funding issues. Is it possible under current funding arrangements for universities to be high up in the international league tables? Is any university in the top 50 or top 100 in the world? One answer to the last question has been provided by work done at the Institute of Higher Education at Shanghai Jiao Tong University [5], which has published the Academic Ranking of World Universities (ARWU) since June 2003 together with a methodology for publishing the ranking on the web; we illustrate this in the next section.

3. Ranking Methodologies
3.1 Evaluation based on indirect contact (homepages)
The Ranking Group has scanned every institution that has any Nobel Laureates, Fields Medalists, Highly Cited Researchers, or articles published in Nature or Science. In addition, major universities of every country with a significant number of articles indexed by the Science Citation Index-Expanded (SCIE) and the Social Science Citation Index (SSCI) are also included. In total, more than two thousand institutions have been scanned, and about one thousand institutions have actually been ranked; however, only the list of the top five hundred institutions has been published on the web [5], [6]. Data are also collected automatically from the main search engines, using ad-hoc scripts or the APIs provided by the commercial engines. The following search engines are used:
- Google (www.google.com)
- Yahoo! Search (search.yahoo.com)
- MSN Search (search.msn.com)
- Teoma (www.teoma.com)

3.2 Evaluation based on direct contact
Direct contact is used for collecting data about institutions for the regional rankings. Data can be collected directly through questionnaires or telephone contact. In the first approach, questionnaires are sent to different places and people to collect data that reflect the regional success of academic education; they can cover universities, faculties, students, parents, graduates, companies and secondary schools. In the second approach, a telephone survey, the deans and professors of the universities are called to collect the data. The telephone survey is used not only for data collection but also to measure the waiting time before talking, the responsiveness of the target persons and their degree of cooperation.

3.3 Determinants of International Standing
The Shanghai study found that the international ranking of a university is determined by research performance and its importance as judged by citations [5]. This approach was extended [12] by adding a range of other measures of performance and by allowing for the discipline mix in institutions between laboratory-based and non-laboratory-based disciplines. Attributes are grouped under the following six headings:
i. Quality/international standing of academic staff, as measured by research output, citations, membership of learned academies, and success in obtaining research grants.
ii. Quality of graduate programs, particularly Ph.D. programs, as measured by student surveys, progression rates and successful completions.
iii. Quality of undergraduate entry, as measured by Tertiary Entrance Scores.
iv. Quality of undergraduate programs, as measured by progression rates within the degree and to higher studies, student evaluations, and staff-student ratios.
v. Resources, as measured by total revenue deflated by the size of the institution.
vi. Subjective assessment, as obtained through a survey of educationists.
In the following section we review the international and regional approaches to institutional evaluation and the criteria used in each approach.

4. Ranking Criteria and Weights
Criteria are defined as consisting of characterizing marks or traits, standards on which a decision or judgment may be based, identifying indications and/or a basis of discrimination. In using the word criteria, we are referring to the standards that help us determine whether an educational web site is valuable. Thus, the use of the word criteria is necessary to accurately judge educational web sites; these criteria will be a guideline of identifiers enabling viewers to critically examine such sites. Many evaluation sites have given their attention to the ranking of international universities. We will deal with the most important evaluation sites, Shanghai [5] and Webometrics [6] as international evaluation sites, and Maclean's [2] and Education Guardian's [8] as regional evaluation sites, and show how these sites define their ranking criteria.

4.1 Shanghai
The original purpose of the Shanghai ranking was to find out the position of Chinese universities in the world. The ranking was put on the internet upon the encouragement of colleagues from all over the world; there have been more than 2,000,000 visitors since 2003, an average of 2,000 every day. University rank is determined by several indicators of academic and research performance [13]: alumni and staff winning Nobel Prizes and Fields Medals, Highly Cited Researchers in twenty-one broad subject categories, articles published in Nature and Science, articles indexed in the Science Citation Index-Expanded (SCIE) and the Social Science Citation Index (SSCI), and academic performance with respect to the size of the institution. Table 2 gives the details of the criteria and their weights.
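To make the weighting scheme concrete, the following minimal Python sketch (our illustration, not the official ARWU implementation) combines per-indicator scores using the Table 2 weights, assuming each indicator has already been scaled so that the best-performing institution scores 100; the example institution and its scores are invented.

# Minimal sketch: combining ARWU indicator scores with the Table 2 weights.
# Indicator scores (0-100, best institution = 100) are illustrative, not real data.
ARWU_WEIGHTS = {
    "Alumni": 0.10,  # alumni winning Nobel Prizes and Fields Medals
    "Award":  0.20,  # staff winning Nobel Prizes and Fields Medals
    "HiCi":   0.20,  # highly cited researchers in 21 broad subject categories
    "N&S":    0.20,  # articles published in Nature and Science
    "SCI":    0.20,  # articles indexed in SCIE/SSCI (and A&HCI)
    "Size":   0.10,  # academic performance relative to institution size
}

def arwu_score(indicator_scores):
    """Weighted sum of the six ARWU indicator scores (each scaled 0-100)."""
    return sum(ARWU_WEIGHTS[code] * indicator_scores.get(code, 0.0)
               for code in ARWU_WEIGHTS)

# Hypothetical institution:
example = {"Alumni": 40.0, "Award": 30.0, "HiCi": 55.0,
           "N&S": 60.0, "SCI": 70.0, "Size": 50.0}
print(round(arwu_score(example), 1))  # 52.0

The weighted scores of all institutions are then sorted in descending order to produce the published rank positions.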

4.2 Webometrics
Webometric indicators [6] are provided to show the commitment of institutions to web publication and to worldwide Open Access to knowledge. The ranking is an initiative of a working group devoted to the quantitative study of the internet, and especially of the process of scholarly communication on the Web. In 1996 the group launched an electronic journal, Cybermetrics [14], devoted to the emerging discipline of Webometrics, and set out to build a web-based citation report using hypertext links as an analogue of bibliographic citations. Every January and July the site offers a ranking of universities and research centers worldwide according to the webometric indicators of their institutional web domains.

Criteria | Indicator | Code | Weight
Quality of Education | Alumni of an institution winning Nobel Prizes and Fields Medals | Alumni | 10%
Quality of Faculty | Staff of an institution winning Nobel Prizes and Fields Medals | Award | 20%
Quality of Faculty | Highly cited researchers in 21 broad subject categories | HiCi | 20%
Research Output | Articles published in Nature and Science | N&S | 20%
Research Output | Articles in Science Citation Index-Expanded, Social Science Citation Index, and Arts & Humanities Citation Index | SCI | 20%
Size of Institution | Academic performance with respect to the size of an institution | Size | 10%
Total | | | 100%

Table 2: Criteria and Weights for ARWU (Shanghai, 2004-2011).

From Webometrics [6], the best way to build the ranking is to combine a series of indicators that measure different aspects of the web presence of the institutions. The obvious precedent is the Web Impact Factor (WIF) proposed by Almind & Ingwersen, which is based on link analysis. The most widely accepted calculation of the WIF is the number of external inlinks divided by the number of pages, implying a 1:1 relation between visibility and size. Although there has been some criticism of the WIF, giving extra weight to visibility is justified, as the number of links from other domains is a way of measuring the impact of the contents. Besides, it was necessary to give extra value to a special group of pages, the so-called rich files (pdf, doc, ppt and ps), which are more closely related to publication activities. The new ratio is 4:3, where visibility and size (total number of pages and number of rich files) are converted to ranks before being combined. From a quantitative point of view, Webometrics calculates three different indicators from search engines as follows:

Size
The number of pages is calculated using four engines: Google, Yahoo, MSN and Teoma. For each engine, results are normalized to 1 for the highest value. Then, for each domain, the maximum and minimum results are excluded and every institution is assigned a rank according to the combined sum.
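The following Python sketch illustrates one plausible reading of this normalize-exclude-combine procedure (our interpretation, not the Webometrics code); the page counts per engine are invented.

# Sketch of the Size indicator: normalize per engine, drop the extreme
# values per domain, and rank by the combined sum. Counts are invented.
pages = {  # institution domain -> raw page counts from (Google, Yahoo, MSN, Teoma)
    "univ-a.edu": (9_000_000, 7_500_000, 4_000_000, 2_000_000),
    "univ-b.edu": (5_000_000, 6_000_000, 3_500_000, 1_500_000),
    "univ-c.edu": (1_000_000,   900_000,   800_000,   400_000),
}
n_engines = 4

# Normalize each engine's counts so the highest value becomes 1.
maxima = [max(counts[i] for counts in pages.values()) for i in range(n_engines)]
normalized = {dom: [counts[i] / maxima[i] for i in range(n_engines)]
              for dom, counts in pages.items()}

# For each domain, exclude the maximum and minimum normalized results
# and sum the remaining values.
combined = {dom: sum(sorted(vals)[1:-1]) for dom, vals in normalized.items()}

# Assign ranks by the combined sum (rank 1 = largest web presence).
size_rank = {dom: r for r, dom in
             enumerate(sorted(combined, key=combined.get, reverse=True), start=1)}
print(size_rank)  # {'univ-a.edu': 1, 'univ-b.edu': 2, 'univ-c.edu': 3}

The Visibility and Rich Files indicators are normalized and ranked analogously before entering the weighted combination given below.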

Visibility
The total number of unique external links received (inlinks) by a site can be confidently obtained only from Yahoo and MSN. For each engine, results are normalized to 1 for the highest value and then combined to generate the rank.

Rich Files
After evaluating the "academic" relevance and the volume of different file formats, the following 'rich files' were selected for this purpose:

Extension | File type
.pdf | Adobe Acrobat PDF
.ps | Adobe PostScript
.doc | Microsoft Word
.ppt | Microsoft PowerPoint

These data were extracted using Google, merging the results for each file type after normalizing them in the same way as described above. The three ranks were combined according to a formula in which each one has a different weight:

Webometrics Rank (Position) = 2*Rank(Size) + 4*Rank(Visibility) + 1*Rank(Rich Files), i.e. WR = 2S + 4V + R.

4.3 Times Higher Education World University rankings
After the 2009 rankings, THE decided to break from its original partner Quacquarelli Symonds (QS) in 2010 and signed an agreement with Thomson Reuters to provide the data for its annual World University Rankings from 2010 onwards. The publication developed a new rankings methodology in consultation with its readers, its editorial board and Thomson Reuters; Thomson Reuters collects and analyzes the data used to produce the rankings on behalf of Times Higher Education, and the results have been published annually since September 2010. On 3 June 2010, Times Higher Education revealed the methodology it proposed to use when compiling the new world university rankings [15]. The publication, along with data partner Thomson Reuters, planned to use 13 separate performance indicators to compile the league tables for 2010 and beyond, an increase from just six measures used under the methodology employed between 2004 and 2009. After further consultation, the devisers grouped the criteria under five broad overall indicators to produce the final ranking score, as shown in Table 3.

4.4 Maclean's ranking
The Maclean's ranking is one of many sources prospective students use when choosing a university [2]. Since 1991, Maclean's Magazine has published an annual article ranking all Canadian universities. Table 4 displays the points the University of Calgary garnered on each of the indicators for the 2004 rankings; the ranking of the University of Calgary on each indicator is also given in that table.

4.5 Education Guardian's
Education Guardian's interactive guide to universities and colleges is also one of the most comprehensive sources of information on the UK's academic sector [8]. The ranking of universities in the UK on each indicator is given in Table 5, including the different weights used in the 2004 and 2005-2011 rankings.

Overall indicator | Individual indicators | Percentage weighting
Industry Income – innovation | Research income from industry (per academic staff) | 2.5%
International diversity | Ratio of international to domestic staff | 3%
International diversity | Ratio of international to domestic students | 2%
Teaching – the learning environment | Reputational survey (teaching) | 15%
Teaching – the learning environment | PhDs awarded per academic | 6%
Teaching – the learning environment | Undergraduates admitted per academic | 4.5%
Teaching – the learning environment | Income per academic | 2.25%
Teaching – the learning environment | PhDs/undergraduate degrees awarded | 2.25%
Research – volume, income and reputation | Reputational survey (research) | 19.5%
Research – volume, income and reputation | Research income (scaled) | 5.25%
Research – volume, income and reputation | Papers per research and academic staff | 4.5%
Research – volume, income and reputation | Public research income/total research income | 0.75%
Citations – research influence | Citation impact (normalised average citations per paper) | 32.5%
Total | | 100%

Table 3: Times Higher Education World University rankings indicators (2010 rankings).

Overall indicator | Individual indicators | Percentage weighting
Student Body | The incoming students' average high-school grades; out-of-province and international students | 22%
Classes | The entire distribution of class sizes; the percentage of students in classes | 18%
Faculty | The percentage of faculty with Ph.D.s; the number who win national awards; the success of eligible faculty in winning research grants in the Social Sciences, Humanities, Natural Sciences, Engineering and Health | 17%
Finances | The amount of money available per full-time-equivalent student (3.3%), for student services (4.3%), and for scholarships and bursaries (4.3%) | 12%
Library | Number of full-time-equivalent students; budget allocated to library services (4%); the percentage of the library budget spent on updating the collection (4%) | 12%
Reputation | Reputation with its own graduates, as well as within the community at large; gifts to the university over the past five years | 19%
Total | | 100%

Table 4: University of Calgary results on Maclean's indicators (2004 rankings).

Overall indicator | Individual indicators | 2004 weighting | 2005-2011 weighting
Teacher score | Teaching staff are weighted according to their seniority and qualifications | Not included | 15%
Entry qualifications | Qualifications of newly entering students | 10% | 20%
Spend per student | Costs such as central libraries, information services and central computers; all costs are calculated per student | 15% | 10%
Student:staff ratio | Student:staff ratios | 15% | 20%
Value-added | Degree results achieved by students relative to their entry qualifications (students entering with low grades versus very high entry qualifications) | 10% | 10%
Student destinations | The level of employment for universities in different subjects | 15% | 17%
Inclusiveness | The ability of the institution, at subject level, to attract students from under-represented groups | 8% | 8%
Total | | | 100%

Table 5: The ranking of universities in the UK (Education Guardian's, 2004 and 2005-2011).

5. Correlation between Evaluation Sites
We noted that not all the ranking approaches and systems use the same criteria; some of the criteria used are common while others are not. The criteria used in each ranking approach must be chosen so as to reflect an accurate result of the international institutional ranking. It is not expected that the results from two different ranking approaches will be the same, but we should find some kind of correlation between them. Ranking results should therefore not be accepted uncritically, but must be assessed as to whether they are accurate and reliable. To that end, we study the results of evaluations that do not use the same criteria and give a comparative study of their results. The top-ranked institutions in the International Webometrics Ranking of World Universities Sites ("Webometrics ranking"), the International Academic Ranking of World Universities ("Shanghai ranking") and the Times Higher Education World University Rankings ("THE ranking"), together with their corresponding ranks, are shown in Figure 1. Figure 1 illustrates that the correlation between the results obtained from the three different ranking approaches can give a good pointer to the accuracy and reliability of the ranking results. An important conclusion from the figure is that there is a strong correlation among the ranked lists. Although the primary aim of the Webometrics Ranking is to promote publication on the Web by universities and other research-related institutions, the web indicators produced allow for a comparative analysis.

Institution | Country | Webometrics Rank | Shanghai Rank | THE Rank
California Institute of Technology | USA | 13 | 6 | 2
Columbia University | USA | 12 | – | 18
Cornell University | USA | 4 | 13 | 14
Duke University | USA | 37 | 35 | 24
Harvard University | USA | 2 | 1 | 1
Johns Hopkins University | USA | 15 | – | 13
Massachusetts Institute of Technology (MIT) | USA | 1 | 3 | 3
Northwestern University | USA | 30 | – | 25
Princeton University | USA | 39 | 7 | 5
Stanford University | USA | 3 | 2 | 4
Swiss Federal Institute of Technology Zurich | Switzerland | 23 | – | 15
University of British Columbia | Canada | 33 | 37 | 30
University of California Berkeley | USA | 5 | 4 | 8
University of California Los Angeles | USA | 14 | 12 | 11
University of California San Diego | USA | 36 | – | 32
University of California Santa Barbara | USA | 34 | – | 29
University of Cambridge | UK | 16 | 5 | 6
University of Chicago | USA | 29 | 9 | 12
University of Illinois - Urbana | USA | 19 | – | 33
University of Michigan | USA | 6 | 22 | 16
University of North Carolina, Chapel Hill | USA | 25 | 42 | 31
University of Oxford | UK | 27 | 10 | 7
University of Pennsylvania | USA | 10 | 14 | 19
University of Tokyo | Japan | 34 | 21 | 26
University of Toronto | Canada | 32 | 26 | 17
University of Washington | USA | 8 | 16 | 23
Washington University Saint Louis | USA | 31 | – | 38
Yale University | USA | 28 | 11 | 10

Figure 1: International Ranking of World Universities by Webometrics, Shanghai and THE Ranking (2011).

Figure 2(a) plots the Webometrics ranking results on the x-axis and the Shanghai ranking results on the y-axis. Each point in the graph represents the rank of an institution under both ranking approaches; for example, the point (3, 1) represents the institution that is in rank position 3 in Webometrics and rank position 1 in Shanghai. The ideal correlation line is the line on which both evaluation methods would agree on the rank of an institution, which did not happen here. The figure also shows that there are four common institutions among the top ten ranked institutions. In the next ten ranked institutions, from the 10th to the 20th, there are six common institutions; from the 20th to the 30th there are four; and from the 30th to the 40th there are five. The same observations can be made for Figures 2(b) and 2(c). Table 6 summarizes the results.
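The paper assesses the agreement visually; as an illustrative numerical check (our own addition, not part of the original study), Spearman's rank correlation can be computed over the Figure 1 institutions for which both a Webometrics and a Shanghai rank are listed.

# Spearman's rho between the Webometrics and Shanghai ranks for the 19
# Figure 1 institutions with both values present (order: Caltech, Cornell,
# Duke, Harvard, MIT, Princeton, Stanford, UBC, Berkeley, UCLA, Chicago,
# Michigan, UNC, Oxford, UPenn, Tokyo, Toronto, Washington, Yale).
webometrics = [13, 4, 37, 2, 1, 39, 3, 33, 5, 14, 29, 6, 25, 27, 10, 34, 32, 8, 28]
shanghai    = [ 6, 13, 35, 1, 3,  7, 2, 37, 4, 12,  9, 22, 42, 10, 14, 21, 26, 16, 11]

def to_sample_ranks(values):
    """Re-rank the values within this sample (1 = smallest; assumes no ties)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(x, y):
    """Spearman's rho via the classic formula 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    rx, ry = to_sample_ranks(x), to_sample_ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

print(round(spearman(webometrics, shanghai), 3))

On these pairs the sketch yields a rho of roughly 0.5, a moderate positive correlation that is consistent with the qualitative reading of Figure 2(a).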

Figure 2: The correlation diagrams between (a) the Webometrics ranking and the Shanghai ranking, (b) the Webometrics ranking and the THE ranking, and (c) the Shanghai ranking and the THE ranking.

Figure 2 | #Top 10 ranked | #(10th to 20th) | #(20th to 30th) | #(30th to 40th)
(a) | 4 | 6 | 4 | 5
(b) | 4 | 8 | 4 | 8
(c) | 8 | 5 | 5 | 4

Table 6: Number of common institutions in the same intervals.

The upper and lower threshold correlation lines are parallel to the ideal correlation line and pass through the points (0, 25) and (25, 0) respectively. All institutions lying between the two threshold correlation lines are ranked in both approaches with a difference between their ranking positions of no more than 25. If an institution is not between the two threshold correlation lines, then the position difference between the two ranking approaches is more than 25, and there is disagreement between the results of the two approaches; the disagreement is greater for institutions that remain outside even when the distance between the threshold lines and the ideal correlation line is increased. For example, Princeton University is ranked in position 39 in the Webometrics ranking, while the same university is ranked in position 7 in the Shanghai ranking and position 5 in the THE ranking. Among the first 200 institutions there are several institutions with this property. Such disagreements arise in two situations:
1. A ranking approach gives a lower position to an institution because:
a. it ignored some criteria which could increase the ranking position of the institution ranked lower;
b. it considered some criteria which could decrease the ranking position of the institution ranked lower; or
c. it is biased against the institutions ranked lower.
2. A ranking approach gives a higher position to an institution because:
a. it ignored some criteria which could decrease the ranking position of the institution ranked higher;
b. it considered some criteria which could increase the ranking position of the institution ranked higher; or
c. it is biased toward the institutions ranked higher.
Hence, the correlation among the results coming from different ranking approaches can increase the degree of confidence about ranked institutions whose ranking values lie in the area between the upper and lower correlation lines, especially when the distance between the lines is small. Likewise, it can cast doubt on the reliability of the ranking of institutions whose values lie outside that area, especially when the distance is large. Figure 2(c) illustrates that the Shanghai ranking and the THE ranking are closer to each other than either is to the Webometrics ranking.
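The band test is easy to state programmatically. The following sketch (ours, for illustration) checks whether an institution falls inside the threshold band of width 25 for the Webometrics/Shanghai pair, using ranks taken from Figure 1.

# An institution lies between the two threshold lines iff its two ranks
# differ by at most 25. Pairs are (Webometrics rank, Shanghai rank).
BAND = 25
pairs = {
    "Harvard University":   (2, 1),
    "Princeton University": (39, 7),   # |39 - 7| = 32, outside the band
    "University of Oxford": (27, 10),
}

for name, (wr, sh) in pairs.items():
    diff = abs(wr - sh)
    verdict = "inside" if diff <= BAND else "outside"
    print(f"{name}: |{wr} - {sh}| = {diff} -> {verdict} the band")

This reproduces the Princeton example above: its Webometrics and Shanghai positions differ by 32, so it lies outside the band and the two approaches disagree about it.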

6. Conclusion
The World Wide Web offers information and data from all over the world. Because so much information is available, and because that information can appear to be fairly "anonymous", it is necessary to develop skills to evaluate what you find. When you use a research or academic library, the books, journals and other resources have already been evaluated by scholars, publishers and librarians; every resource you find has been evaluated in one way or another before you ever see it. When you are using the World Wide Web, none of this applies. There are no filters. Because anyone can write a web page, documents of the widest range of quality, written by authors of the widest range of authority, are available on an even playing field, and excellent resources reside alongside the most dubious. The Internet epitomizes the concept of caveat lector: let the reader beware. There are criteria by which scholars in most fields evaluate print information, and this paper shows how the same criteria can be used to assess information found on the Internet. This paper has presented a survey of the most important sites in this context: Shanghai, Webometrics, Times Higher Education, Maclean's and Education Guardian's. The ranking criteria of each site were stated, and the comparison between the rankings of the Shanghai and THE sites shows a good correlation. The Ranking Group will continue its efforts and update ARWU annually with necessary modifications. The Group is investigating the possibility of providing lists of top universities with an engineering (technology) or medical orientation, as well as of ranking universities by broad subject areas such as social sciences, physical sciences, engineering and technology, and life sciences and medicine.

References
[1] SantaVicca, E.F. The Internet as a reference and research tool: A model for educators. The Reference Librarian, 41/42, 225-236, 1994.
[2] Heckman, N., White, R., Wang, S. and Wong, H. Maclean's Ranking of Canadian Universities. Technical Report 189, Department of Statistics, University of British Columbia, October 1999.
[3] Weingart, P. Impact of bibliometrics upon the science system: inadvertent consequences? In: H.F. Moed, W. Glänzel and U. Schmoch (eds), Handbook on Quantitative Science and Technology Research. Dordrecht (The Netherlands): Kluwer Academic Publishers, 2004.
[4] Liu, N.C., Cheng, Y. and Liu, L. Academic Ranking of World Universities Using Scientometrics: a Comment on the "Fatal Attraction". Scientometrics, 64(1), 101, 2005.
[5] Institute of Higher Education, Shanghai Jiao Tong University. Academic Ranking of World Universities – 2004. http://ed.sjtu.edu.cn/ranking.htm
[6] World Universities' ranking on the Web. http://www.webometrics.info
[7] Times Higher Education World University Rankings. http://www.timeshighereducation.co.uk
[8] Guardian Newspapers on the Web. http://education.guardian.co.uk
[9] Executive Masters of Business Administration (EMBA) rankings. http://rankings.ft.com/rankings//index.jsp
[10] Alexa database. http://www.alexa.com
[11] The Maclean's ranking in Canada. http://www.macleans.ca/universities
[12] Williams, R. and Van Dyke, N. The International Standing of Australian Universities. Melbourne Institute Report, University of Melbourne, 2004. http://www.melbourneinstitute.com
[13] van Raan, A.F.J. Fatal Attraction: Ranking of Universities by Bibliometric Methods. Scientometrics, 62(1), 133, 2005.
[14] Cybermetrics Electronic Journal. http://www.cindoc.csic.es/cybermetrics
[15] Baty, P. "THE unveils broad, rigorous new rankings methodology". Times Higher Education. Retrieved 16 September 2010.
