Centre d'Etudes de Populations, de Pauvreté et de Politiques Socio-Economiques International Networks for Studies in Technology, Environment, Alternatives and Development

IRISS Working Papers IRISS-C/I An Integrated Research Infrastructure in the Socio-economic Sciences

Measurement Equivalence and Extreme Response Bias in the Comparison of Attitudes across Europe

by Milosh Kankarash Guy Moors

IRISS Working Paper Series

2008-06

May 2008


Measurement Equivalence and Extreme Response Bias in the Comparison of Attitudes across Europe

Milosh Kankarash Tilburg University, CEPS/INSTEAD Guy Moors Tilburg University

Abstract It is generally accepted that both measurement inequivalence and extreme response bias can seriously distort the measurement of attitudes and subsequent causal models. However, these two issues have rarely been investigated together. In this article we demonstrate the flexibility of a multigroup latent class factor approach in both analysing measurement equivalence and detecting extreme response bias. Using data from the European Values Survey of 1999/2000, we identified an extreme response bias in answers to Likert-type questions on attitudes towards the morals of compatriots. Furthermore, we found measurement inequivalence in the form of direct effects of countries on the response variables. When only one of these two issues, either measurement inequivalence or extreme response bias, was included in the measurement model, the estimated effects of countries on the attitudinal dimension differed from those obtained with a model that includes both measurement issues. Using this all-inclusive model we obtained more valid estimates of the differences between countries on the measured attitude.

Reference IRISS Working Paper 2008-06, CEPS/INSTEAD, Differdange, Luxembourg URL http://ideas.repec.org/p/irs/iriswp/2008-06.html

Milosh Kankarash is sponsored by a PhD grant (“Bourse Formation Recherche” BFR06/040) financed by the Luxembourg Ministère de la culture, de l’enseignement supérieur et de la recherche and administered at CEPS/INSTEAD.

The views expressed in this paper are those of the author(s) and do not necessarily reflect views of CEPS/INSTEAD. IRISS Working Papers are not subject to any review process. Errors and omissions are the sole responsibility of the author(s).

(CEPS/INSTEAD internal doc. #07-08-0404-E)


Milosh KANKARASH ∗ Tilburg University, CEPS/INSTEAD Guy MOORS ° Tilburg University

Paper to be presented at 7th International Conference on Social Science Methodology, Naples (Italy), September 1-5, 2008



∗ Corresponding author: Tilburg University; Department of Methodology & Statistics, Faculty of Social and Behavioural Sciences, PO Box 90153, 5000 LE Tilburg (the Netherlands); e-mail: [email protected].
° Tilburg University; Department of Methodology & Statistics, Faculty of Social and Behavioural Sciences, PO Box 90153, 5000 LE Tilburg (the Netherlands); e-mail: [email protected]


Introduction

In this research we merge two issues that relate to the validity of attitude measurement in cross-cultural research: response bias and measurement equivalence. The latter refers to the question to what extent responses to a set of attitude questions are influenced by the same latent construct in each culture; the former to the idea that responses to a set of attitude questions are influenced not merely by the latent construct one intends to measure, but also by response styles that respondents adopt in the survey process. With the growing number of comparative studies it is of increasing importance to use measurement procedures that enable a researcher to draw valid conclusions from cross-cultural comparisons. These procedures require methodological scrutiny, since in the context of a cross-cultural study the researcher has to deal not only with methodological problems common in intra-cultural research, but also with issues inherent to this specific situation (Hui & Triandis, 1985a). More precisely, in cross-cultural research the problem of comparability of data gathered from different cultural settings has to be addressed (Van de Vijver & Leung, 1997; Poortinga, 1989; Cambré et al., 2000).

In principle, respondents of different countries provide answers to the same attitudinal questions. However, even if a 'perfect' translation of the key questionnaire into the language of the interview is possible, it can still be questioned whether each culture interprets the questions in the same way. To the extent that different cultures understand survey questions in a similar way, the measurement of the latent variables is said to be equivalent. This equivalence, thus, is not to be merely assumed but thoroughly investigated and established before any comparison takes place. While the issue of measurement equivalence comes to the fore in cross-cultural studies, response bias is a topic common to attitude research in general.
It refers to 'a systematic tendency to respond to a range of questionnaire items on some basis other than the specific item content' (Paulhus, 1991: 17). This non-random response bias can seriously distort not only the measurement of attitudes but also the effects of covariates on these attitudes. Both measurement equivalence and response bias are by no means new problems (Billiet & McClendon, 2000; Cheung & Rensvold, 1998; Moors, 2003) but, to the best of our knowledge, they have rarely been researched within the same setting. In this paper we use an approach within the framework of latent class analysis that allows for both investigating measurement equivalence and detecting response bias (Moors, 2003; Moors, 2004). Our methodological analysis is conducted on a substantive example from the European Values Survey (Halman, 2001): a set of 8 questions on attitudes towards the morals of compatriots.

This paper is organized as follows. First we give a short overview of the literature on the concepts of measurement equivalence and response bias. Then we describe how these issues can be investigated within the context of a multigroup latent class factor analysis (Magidson and Vermunt, 2001). After introducing the data and procedure, the empirical results are presented and discussed.

1. Measurement Issues in Cross-Cultural Research

In the introduction we argued that measurement equivalence and response bias are linked to one another. This link becomes obvious if we focus on what Van de Vijver and Leung (1997) refer to as the three basic levels of equivalence: construct, measurement unit and scalar equivalence. These three levels of equivalence are directly related to three major kinds of bias: construct bias, method bias, and item bias. According to Van de Vijver and Leung (1997), construct or structural equivalence is the minimal requirement for any kind of comparison. It is achieved when an instrument measures the same construct across all cultural groups. If, furthermore, the measurement unit is similar in different cultures, the higher level of measurement unit equivalence is established. The latter, however, does not imply that the origin of the scale is the same across cultures. When this is obtained, the highest level, scalar equivalence, is reached.

Response biases such as acquiescence and extreme response behaviour are method biases that might cause shifts in the meaning of the measurement unit and the origin of a scale and consequently deteriorate the level of measurement equivalence (Cheung & Rensvold, 2000). Acquiescence (also called agreement tendency or yea-saying) is the tendency to agree rather than disagree with questions, regardless of question content. Acquiescence can thus cause a lack of scalar equivalence, since the items of a scale are affected to a similar extent by a factor that is independent of the construct studied. Extreme response bias describes a general tendency to choose extreme response categories on a rating scale (e.g., the 1 and/or 5 on a 5-point scale) (Paulhus, 1991; Greenleaf, 1992). In the case of extreme response bias, measurement equivalence is even more strained, as it alters not only the origin but also the measurement unit of the scale, thus endangering both metric and scalar equivalence. After all, adopting an extreme response style simply means that the respondent's score on a scale is 'pushed' towards the extreme values of the scale without necessarily reflecting a 'true' content difference with other respondents who do not adopt such a style. In a way, extreme response style may be thought of as 'outlier' behaviour. To the extent that different cultures exhibit different levels of extreme response style, ignoring this style factor can lead to biased comparisons.

The present study deals with the issue of extreme response bias in a cross-cultural perspective. The presence of extreme response style is an issue that always needs to be addressed when dealing with rating scales, because of the detrimental effects this response style can have on data analysis and the interpretation of results. Extreme response bias (ERB) affects descriptive and inferential statistical analysis, and it also impairs the validity of the obtained results (Cheung & Rensvold, 2000). ERB spuriously increases the standard deviation, which in turn biases estimates of correlation in a number of methods, including regression analysis, factor analysis, and cluster analysis. In the context of cross-cultural research it is especially important to check whether ERB accounts for obtained differences among cultures.
Cross-cultural differences in ERB have been reported in several studies (Greenleaf, 1992; Hui & Triandis, 1989; Gibbons, Zeller, & Rudek, 1999; Arce-Ferrer, 2006). Citizens of Mediterranean and Latin American countries, as well as Latino American and African American groups in the United States, consistently show a higher tendency towards extreme responding than other cultures. It is obvious that erroneous conclusions can be drawn when resulting differences between cultures are due to differences in extreme response bias (Cheung & Rensvold, 2000).


2. Methodological Approach

2.1 Measurement Equivalence

Different techniques have been suggested to investigate (various aspects of) measurement equivalence (Hui and Triandis, 1985a), but the multigroup confirmatory factor analysis approach is clearly the most frequently applied (Steenkamp & Baumgartner, 1998; Vandenberg & Lance, 2000). In this research we make use of a conceptually similar yet distinct approach, i.e. a multigroup latent-class factor analysis (Hagenaars, 1990; Vermunt & Magidson, 2005). It is akin to the LISREL approach in that it is a simultaneous factor analysis in several groups (Clogg & Goodman, 1985). The main difference between the two models is that latent variables and indicators in latent-class models are treated as categorical variables. In other words, instead of using a correlation/covariance matrix as input, a latent-class approach analyses the cross-classification of the responses on the scale items of interest¹.

We have chosen the latent-class factor approach for two especially important benefits. First, it does not require traditional modelling assumptions (e.g. linear relationships, normal distributions or homogeneity of variances), which makes it more robust to biases caused by violations of these assumptions (Magidson & Vermunt, 2002). Second and more importantly, latent-class methods are more flexible in detecting different response biases than the LISREL approach (Moors, 2003). This flexibility mainly comes from the intrinsic exploratory capacity of the technique (Magidson & Vermunt, 2003), which allows unexpected results to emerge from the data. Thus, as shown in this research, even when a researcher is not looking for specific response biases, empirical indications of them may turn up during the analysis.
A multiple group approach is based on the comparison of measurement models that differ in the level of heterogeneity (inequivalence) caused by an exogenous (grouping) variable (McCutcheon, 1987, 2002). Assume a basic model with one nominal exogenous variable Z (which in our case stands for countries), two latent factors X1 and X2, and three (nominal) response variables Y1, Y2 and Y3 (e.g. attitudinal questions), for which the relationships in Figure 1 are defined.

¹ For a technical elaboration of this approach see Hagenaars (1990) and McCutcheon (1987, 2002).


Figure 1. Schematic representation of the homogeneous measurement model

Using the most common parameterization of the basic latent-class factor model, i.e. in terms of unconditional and conditional probabilities (Magidson & Vermunt, 2004), the general probability structure of this model is defined as:

$$\pi(Y_1 Y_2 Y_3 \mid Z) \;=\; \sum_{x_1}\sum_{x_2} \pi(X_1 X_2 \mid Z)\,\prod_{k=1}^{3} \pi(Y_k \mid X_1 X_2) \qquad (1)$$

Here we are modelling the conditional probability of observing a particular set of Y-values given a particular set of Z-values. Several restrictions are imposed in model (1). First, the latent variables X1 and X2 are assumed to depend on the exogenous variable Z. Second, the response variables Y1, Y2 and Y3 are held to be mutually independent given the latent factors they reflect. Third, and most important for our analysis, the response variables Y1, Y2 and Y3 are assumed to be independent of the covariate Z, given the latent variables X1 and X2 (the assumption of local independence). In other words, the exogenous variable Z is related to the response variables Y1, Y2 and Y3 only indirectly, through its effects on the latent variables X1 and X2.
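Equation (1) and the local-independence restriction can be sketched numerically. In the toy model below every probability is invented purely for illustration; the point is that the item-response probabilities do not depend on Z, so Z reaches the items only through the latent factors:

```python
import numpy as np

# Toy sketch of the homogeneous model in equation (1): one grouping
# variable Z (2 countries), two dichotomous latent factors X1, X2, and
# three response items Y1..Y3 with 4 categories each. All numbers invented.

rng = np.random.default_rng(0)

# pi(X1, X2 | Z): joint latent distribution per country, shape (Z, X1, X2)
p_x_given_z = np.array([
    [[0.40, 0.10], [0.30, 0.20]],   # country 0
    [[0.20, 0.20], [0.25, 0.35]],   # country 1
])

# pi(Yk | X1, X2): item-response probabilities, shape (3 items, X1, X2, 4 cats).
# These do NOT depend on Z -- the local-independence restriction that makes
# the model homogeneous.
p_y_given_x = rng.dirichlet(np.ones(4), size=(3, 2, 2))

def p_response_pattern(z, pattern):
    """pi(Y1=y1,Y2=y2,Y3=y3 | Z=z) = sum_x1 sum_x2 pi(X1X2|Z) prod_k pi(Yk|X1X2)."""
    total = 0.0
    for x1 in range(2):
        for x2 in range(2):
            prod = np.prod([p_y_given_x[k, x1, x2, pattern[k]] for k in range(3)])
            total += p_x_given_z[z, x1, x2] * prod
    return total

# Sanity check: the 4^3 = 64 response-pattern probabilities sum to 1
# within each country.
for z in (0, 1):
    s = sum(p_response_pattern(z, (a, b, c))
            for a in range(4) for b in range(4) for c in range(4))
    print(z, round(s, 10))
```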


The conditional probabilities of the latent-class part of the model,

$$\prod_{k=1}^{3} \pi(Y_k \mid X_1 X_2) \qquad (2)$$

are restricted by logit models with linear terms:

$$\eta_{Y_k \mid X_1 X_2} \;=\; \beta_0^{Y_k} + \beta_1^{Y_k}\,\nu_{X_1} + \beta_2^{Y_k}\,\nu_{X_2} \qquad (3)$$

In LCFA, factors are assumed to be discrete ordinal variables, which is why the levels of the latent-factor variables are restricted by using fixed scores for each category of the factors. These scores are equidistant and range from 0 to 1. The β's can be interpreted as factor loadings expressed as log-linear parameters, since they indicate the strength of the relationships between factors and response variables (Magidson & Vermunt, 2004; Moors, 2003). No three-way or higher-order interactions are included in the model. The latent factors are assumed to be independent of one another; however, as in the LISREL confirmatory factor analysis approach, associations between them can be included in a model.

Since the latent structure is the same within each category of the exogenous variable Z, this model is called homogeneous. Two restrictions imposed by the homogeneous model are especially important for our analysis: the direct effects of the exogenous variable Z on the response variables are set equal to zero, and so are the interactions of covariate Z with the latent-class factors on the response variables. When these restrictions are set free, we obtain the unrestricted, heterogeneous model, which makes no assumptions about latent-class factor weights or conditional probabilities. Thus, in the heterogeneous model the exogenous variable Z has two additional sets of effects on the response variables, aside from influencing them through the latent factors: direct effects on the response variables and interaction effects with the latent factors on the response variables:

$$\eta_{Y_k \mid X_1 X_2 Z} \;=\; \beta_0^{Y_k} + \beta_1^{Y_k}\,\nu_{X_1} + \beta_2^{Y_k}\,\nu_{X_2} + \beta_3^{Y_k} Z + \beta_4^{Y_k}\,\nu_{X_1} Z + \beta_5^{Y_k}\,\nu_{X_2} Z \qquad (4)$$

If some of these effects are restricted to be equal across groups, the model is called 'partially homogeneous' (McCutcheon, 2002). Among the many possible partially homogeneous models, the model without any interaction term is of particular interest:

$$\eta_{Y_k \mid X_1 X_2 Z} \;=\; \beta_0^{Y_k} + \beta_1^{Y_k}\,\nu_{X_1} + \beta_2^{Y_k}\,\nu_{X_2} + \beta_3^{Y_k} Z \qquad (5)$$

Leaving the interaction terms out of the equation implies that the strengths (factor loadings) of the relationships between latent factors and response variables are assumed to be the same across the categories (e.g. countries) of the exogenous variable, while the latent and manifest distributions may still differ. Thus, in this model the latent factors have the same meaning in all countries, but the response patterns differ across countries as a consequence of the direct effects of the exogenous variable on the response variables, i.e. as a result of differences in responses among countries that are not caused by differences in the latent variable being measured (in our case, the attitude towards the morals of compatriots). In the heterogeneous model, by contrast, the latent factors have a different meaning in different countries and comparison becomes problematic. The three models described here, the homogeneous, heterogeneous and partially homogeneous model, are theoretical in nature, and many intermediate models are possible in practice. For example, it is not always necessary to include all direct effects of the group variable on the indicators. Thus, from an empirical point of view, the main challenge is to select a model that fits the data well with the lowest possible level of heterogeneity.
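The parameterization in equations (3) and (5) can be sketched numerically. All parameter values below are invented for illustration (they are not the paper's estimates): the loadings are identical across two hypothetical countries, while a country-specific direct effect shifts the response distribution, which is exactly the partially homogeneous situation:

```python
import numpy as np

# Sketch of equations (3) and (5): a 3-level latent factor with fixed
# equidistant scores nu, category-specific intercepts and 'loadings', plus
# a country-specific direct effect. All parameter values are invented.

nu = np.linspace(0.0, 1.0, 3)               # fixed factor scores: 0, 0.5, 1

beta0 = np.array([0.0, 0.5, 0.5, 0.0])      # intercepts, one per category
beta1 = np.array([-2.0, -1.0, 1.0, 2.0])    # loadings: equal in every country
beta3 = {                                    # direct country effects (eq. 5)
    "country A": np.zeros(4),
    "country B": np.array([0.8, -0.4, -0.4, 0.0]),
}

def p_item(country, x_level):
    """pi(Y = m | X, Z) obtained from the linear predictor eta via a
    multinomial logit over the four categories."""
    eta = beta0 + beta1 * nu[x_level] + beta3[country]
    e = np.exp(eta)
    return e / e.sum()

# Same latent level, different response distributions: the gap between the
# two countries is inequivalence not attributable to the latent attitude.
for c in ("country A", "country B"):
    print(c, np.round(p_item(c, 1), 3))
```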

2.2 Response Bias

As with multiple-group comparisons, the issue of response bias can be addressed within the context of structural equation modelling (Billiet & McClendon, 2000; Cheung & Rensvold, 2000) and its latent class variant (Moors, 2003, 2004). Within this setting it is argued that responses to attitudinal questions may be influenced by content, i.e. the concept one intends to measure, and by style, e.g. tendencies to choose extreme response categories or to agree with items regardless of content. The empirical approach involves a confirmatory factor analysis or latent class factor analysis of two sets of balanced items that are assumed to measure two distinct concepts. Both Billiet & McClendon (2000) and Moors (2003) have demonstrated that a third, general style factor can be detected next to the two content factors, which also improved the fit with the data.

The suggestion to use two sets of balanced questions was originally made by Billiet & McClendon in the context of modelling acquiescence. It relies on the assumption that a response bias tendency is a kind of personality trait which, thus, has to be found in scales measuring different constructs in order to be validated as a response bias. Furthermore, it builds on the idea that agreeing with both negatively and positively worded questions in balanced sets reflects the acquiescence style rather than a difference in content. To support this assumption with more confidence, the approach requires two sets of distinct questions. In the case of extreme response style, the intrinsic confusion of 'style' with 'content' is less of an issue, since an extreme response can be detected even if positively worded questions measure something other than negatively worded questions (rather than being bipolar).

The 'two-content, one-style' factor approach is clearly developed from the perspective that response styles reflect a personality trait. However, whether response bias is a personality style or just a content-specific tendency is still a matter of debate. It has been argued that, to count as a personality trait, a response style should be a fairly stable characteristic of the respondent. Some researchers have found that ERB is consistent over different measures and over time (Greenleaf, 1992; Arthur, 1966; Bachman & O'Malley, 1984), while others have found it to be unstable and difficult to generalise over different situations (Hui & Triandis, 1985b; Innes, 1977).
In this paper, however, we do not take an a priori theoretical stance on this issue. Rather, we want to utilize the flexibility of the latent-class factor approach in detecting a response bias tendency without an explicit model set-up based on assumptions about the response style. Our analysis also involves only a single set of items. As a consequence, in order to confirm that the method factor we extract from the data reflects an extreme response tendency of respondents, we have to place stronger emphasis on verifying this factor than was the case in the research cited above.


2.3 Data: The European Values Survey 1999/2000

Our study is based on data from the 1999/2000 wave of the European Values Survey (EVS). It includes 33 European countries² with nearly 40 thousand participants (Halman, 2001). National samples were drawn from the population of adult citizens aged 18 and over. The survey included a set of 8 questions (Table 1) in which participants were asked to rate their opinions of the morals of their compatriots (people from the same country). The introductory question to the items was: 'According to you, how many of your compatriots do the following?' Answers are recorded on a four-point scale ranging from 1, meaning 'Almost all', to 4, meaning 'Almost none'. Thus, higher scores on the scale imply a better opinion of the morals of compatriots. Table 1 presents the items and their corresponding (interpolated) medians and inter-quartile ranges (IQR).
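For readers unfamiliar with interpolated medians: one common definition treats each discrete rating m as a bin with real limits (m − 0.5, m + 0.5] and interpolates within the bin containing the 50th percentile. The sketch below uses invented frequencies for a 4-point item and is not necessarily the exact routine behind Table 1:

```python
def interpolated_median(freq):
    """Interpolated median for counts freq[m-1] of categories m = 1..len(freq).

    Each category m is treated as a bin with real limits (m - 0.5, m + 0.5]
    of width 1; the median is interpolated within the bin that holds the
    50th percentile.
    """
    n = sum(freq)
    half = n / 2.0
    cum = 0
    for m, f in enumerate(freq, start=1):
        if cum + f >= half:
            lower = m - 0.5                  # lower real limit of the bin
            return lower + (half - cum) / f  # bin width is 1
        cum += f

# Invented frequencies for a 4-point item:
print(interpolated_median([100, 400, 350, 150]))  # → 2.5
```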

Table 1. Manifest variables and associated median and IQR statistics

Variable                                                           Median   IQR
Claiming state benefits to which they are not entitled (Benefits)   2.49    1.16
Cheating on tax if they have the chance (Cheat_tax)                 2.20    1.09
Paying cash for services to avoid taxes (Avoid_tax)                 2.28    0.95
Taking the drug marijuana or hash (Drug)                            2.81    0.99
Throwing away litter in a public place (Litter)                     2.26    1.19
Speeding over the limit in built-up areas (Speeding)                2.23    1.08
Driving under the influence of alcohol (Alcohol)                    2.67    1.06
Having casual sex (Sex)                                             2.55    1.11

² Participating countries are: Austria, Belarus, Belgium, Bulgaria, Croatia, Czech Republic, Denmark, Estonia, Finland, France, Germany, Great Britain, Greece, Hungary, Iceland, Ireland, Italy, Latvia, Lithuania, Luxembourg, Malta, The Netherlands, Northern Ireland, Poland, Portugal, Romania, Russia, Slovakia, Slovenia, Spain, Sweden, Turkey, and Ukraine.


2.4 Procedure

When an analysis is conducted on a large sample, as is the case in this study, most of the effects in a model are likely to be significant, regardless of how substantive they are. This means that the common likelihood ratio statistic, used as a criterion for model fit, will probably lead to the rejection of any model that imposes restrictions (Hagenaars, 1990), which in turn makes the heterogeneous model the most likely best-fitting model. In LCFA this problem is addressed by the use of information criteria, the most popular of which is the Bayesian Information Criterion (BIC). BIC weighs the fit of the model against its parsimony (its number of parameters relative to other models), partly compensating for sample size³. Thus, the lower the BIC, the better a particular model fits the data for a given number of parameters. When the BIC value of a model does not decrease further with the addition of new parameters, that model can be accepted as the best-fitting parsimonious model for the given data.

Two research questions guide the analysis in this work: (1) To what extent is the measurement of 'morals of compatriots' comparable among countries, and is the measurement biased by an extreme response style? (2) What are the consequences of ignoring the issues of measurement equivalence and response bias when interpreting differences among countries? We take the following steps in the analysis. First, we compare models with different levels of heterogeneity (inequivalence) in search of the most parsimonious model. In the second step we analyse the model with adjustment for response bias, i.e. we extract two factors, one 'content' and one 'method' factor. Finally, after verifying that the method factor accounts for extreme response bias, we investigate the consequences that measurement inequivalence and response bias have from a substantive point of view. In particular, we examine how differences among countries in attitudes towards the morals of compatriots change once we take these two methodological issues into account.

³ BIC is calculated as follows: BIC = -2·LL + log(N)·npar, where LL is the log-likelihood and npar the number of parameters. In our analysis we made use of the LatentGold 4.5 program (Statistical Innovations Inc.)
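The selection rule from footnote 3 can be sketched as follows. The log-likelihood values below are invented purely for illustration (they are not the paper's estimates), although the parameter counts mirror the one-factor models of Table 2; the point is that a model with a better likelihood can still lose on BIC once the parameter penalty is counted:

```python
import math

# Minimal sketch of BIC-based model selection: BIC = -2*LL + log(N)*npar.
# Log-likelihoods are invented; parameter counts follow Table 2 (1-factor).

def bic(log_likelihood, n_params, n_obs):
    """Lower BIC = better fit for the number of parameters spent."""
    return -2.0 * log_likelihood + math.log(n_obs) * n_params

N = 40_000  # roughly the EVS 1999/2000 sample size

models = {
    # name: (hypothetical log-likelihood, number of parameters)
    "homogeneous":           (-310_000.0,   79),
    "partially homogeneous": (-295_000.0,  799),
    "heterogeneous":         (-294_200.0, 1519),
}

scores = {name: bic(ll, k, N) for name, (ll, k) in models.items()}
for name, score in scores.items():
    print(f"{name:22s} BIC = {score:,.1f}")

# The heterogeneous model fits best in raw likelihood, but the partially
# homogeneous model wins on BIC once parsimony is taken into account.
print("preferred:", min(scores, key=scores.get))
```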


3. Model selection

In this initial step we compare a number of measurement models in which we treat the 8 attitudinal response variables as nominal indicators and the latent class factor as a discrete ordinal variable. Although the measurement level of the 8 response variables is ordinal, these variables are treated as nominal for two reasons. First, it allows us to check whether the relationship of the response variables with the latent class factor is ordinal; second, since every response category is treated as a dummy, this approach is able to diagnose extreme response style behaviour (Moors, 2003), as will be demonstrated in section four of this paper. The models that we compare in search of the best-fitting parsimonious one are the three prototypical models discussed previously, the homogeneous, partially homogeneous and heterogeneous model, with country as the exogenous grouping variable and attitude towards the morals of compatriots as a latent-class factor (the 'content' factor). The one-factor model only includes this 'content' latent variable, whereas the two-factor model adds a 'method' factor. This second factor accounts for extreme response bias, as will be shown in the next section. The purpose, of course, is to find the best-fitting model. A comparison of the three models in terms of their fit to the data is presented in Table 2.

Table 2. Fit statistics for one- and two-factor models

ONE-FACTOR MODEL ('content' factor)
Measurement model                               df      N parameters   L2           BIC
Homogeneous model                               29909   79             134544.615   -173779.317
Partially homogeneous model (best-fitting)      29189   799            101939.433   -198962.330
Heterogeneous model                             28469   1519           98503.677    -194975.916

TWO-FACTOR MODEL ('content' and 'method' factor)
Homogeneous model                               29854   134            108043.933   -199713.028
Partially homogeneous model (best-fitting)      29134   854            86396.681    -213938.110
Heterogeneous model                             28414   1574           83103.383    -209809.239


The partially homogeneous model, which includes direct effects on all 8 response variables, has the best fit (in terms of its BIC value) in both the one- and two-factor analyses. Consequently, the resulting differences across countries are due not only to differences in the latent attitude but also to other factors that are unrelated to the measured attitude (direct effects). It also means that it is justified to argue that there are no substantive differences across countries in the relationships of the latent-class factor(s) with the indicator variables (the factor loadings are similar enough), but that the response pattern differs between countries. At first glance, the similarity in the results of the model comparisons between the content-only (one-factor) and the content + response bias (two-factor) model might be taken as an indication that including the 'method' factor does not substantially improve the comparability of the data. Inspecting the bivariate residuals between the response variables and countries, however, leads to a different conclusion.

Table 3. Bivariate residuals between response variables and Country

Covariate                  Benefits  Cheat_tax  Avoid_tax  Drug   Litter  Speeding  Alcohol  Sex
COUNTRY (1-factor model)   57.54     38.10      38.05      56.49  33.51   31.52     41.21    43.60
COUNTRY (2-factor model)   22.10     18.13      19.36      21.51  15.88   14.23     19.90    17.25

The bivariate residuals reported in Table 3 are Pearson chi-square statistics divided by their degrees of freedom. The larger a residual is (values above 3.84 are significant at the .05 level), the larger the association between the variable pair that is not explained by the model. Bivariate residuals between response variables and an exogenous covariate represent the size of the direct effects of this grouping covariate on the responses in a given model. Thus, they can be used to assess and compare direct effects in different models (Vermunt & Magidson, 2000; Moors, 2004). In our case this diagnostic gives us evidence that the inclusion of the response bias factor substantially reduces the inequivalence in the data caused by direct effects of countries on the response variables. Direct effects that varied between 31 and 57 in the 1-factor 'content only' model are reduced to a range between 14 and 22 in the 2-factor 'content-style' model, a decrease of more than 50% from their original values. This shows that when we account for response bias by including the 'method' factor in the model, we substantially reduce the measurement inequivalence in the model.

Hence, from this analysis we can draw two initial conclusions:

1. Measurement inequivalence is present in the data obtained with the scale intended to measure attitudes towards the morals of compatriots; consequently, the resulting differences between countries on this scale can be validly compared only when we take the direct effects of countries on the response variables into account.

2. Accounting for response bias by including a 'method' factor in the model substantially reduces the level of inequivalence in the model.

In the next section we focus on the interpretation of the latent-class factors, with special attention to the response style factor.
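The bivariate-residual diagnostic used in Table 3 can be sketched as follows; the observed and expected counts below are fabricated for illustration and are not EVS data:

```python
import numpy as np

# Bivariate residual as in Table 3: Pearson X^2 comparing the observed
# country-by-item cross-table with the table expected under the model,
# divided by the table's degrees of freedom. Counts below are invented.

def bivariate_residual(observed, expected):
    """Pearson X^2 / df for an r x c table of counts."""
    observed = np.asarray(observed, dtype=float)
    expected = np.asarray(expected, dtype=float)
    chi2 = ((observed - expected) ** 2 / expected).sum()
    df = (observed.shape[0] - 1) * (observed.shape[1] - 1)
    return chi2 / df

# 3 hypothetical 'countries' x 4 response categories:
obs = [[130,  75,  60,  35],
       [ 90, 110,  70,  30],
       [ 60,  90, 100,  50]]
exp = [[100,  95,  70,  35],
       [ 95,  90,  75,  40],
       [ 75,  95,  85,  45]]

# A value well above 3.84 flags a direct country effect that the model
# leaves unexplained.
print(round(bivariate_residual(obs, exp), 2))
```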

4. Interpreting the latent-class factors

Table 4 brings together the information on how the response variables relate to the latent-class factors in the one- and two-factor models. Recall that the one-factor model represents a 'content-only' approach and that the two-factor model adds a 'method' factor⁴. As indicated before, by treating the indicators as nominal (rather than ordinal) we are able to explore whether the relationship of the items with the content factor is ordinal, and we enable the detection of an extreme response style. The first finding is, of course, that the 'content' factor is virtually identical in both models. For each attitudinal variable the beta weights show a similar pattern: starting with a large negative value for category 1 ('almost all') and increasing gradually across the subsequent categories until reaching a large positive value for category 4 ('almost none'). These ordered weights are consistent with an interpretation of the first latent-class factor as identifying a positive evaluation of the morals of compatriots.
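This reading of the beta weights can be made explicit in a small check. The classification function below is our own shorthand, but the beta values are the Benefits rows reported in Table 4: a 'content' pattern rises monotonically across the four categories, while an extreme-response ('method') pattern is U-shaped, with positive betas on the end categories and negative betas in the middle:

```python
import numpy as np

def pattern(betas):
    """Classify a 4-category beta profile as ordinal or U-shaped."""
    b = np.asarray(betas)
    if np.all(np.diff(b) > 0):
        return "ordinal (content-like)"
    if b[0] > 0 and b[-1] > 0 and np.all(b[1:-1] < 0):
        return "U-shaped (extreme-response-like)"
    return "other"

# Benefits item, 2-factor model, from Table 4:
content_benefits = [-2.073, -1.108, 0.941, 2.240]   # 'content' factor betas
method_benefits  = [ 1.567, -1.383, -1.462, 1.278]  # 'method' factor betas

print(pattern(content_benefits))  # → ordinal (content-like)
print(pattern(method_benefits))   # → U-shaped (extreme-response-like)
```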

⁴ The two factors are not allowed to correlate.


Table 4. Beta statistics for 1- and 2-factor models

                  a) 1-FACTOR MODEL   b) 2-FACTOR MODEL
                  'Content' factor    'Content' factor   'Method' factor
Benefits
  almost all      -2.116              -2.073              1.567
  many            -1.062              -1.108             -1.383
  some             1.005               0.941             -1.462
  almost none      2.172               2.240              1.278
Cheat_tax
  almost all      -2.178              -2.122              0.506
  many            -1.311              -1.422             -1.896
  some             0.834               0.762             -1.303
  almost none      2.654               2.783              2.693
Avoid_tax
  almost all      -1.846              -1.789              0.757
  many            -1.051              -1.167             -1.676
  some             0.787               0.685             -1.284
  almost none      2.110               2.271              2.203
Drug
  almost all      -2.653              -2.181              1.998
  many            -1.042              -1.464             -1.731
  some             0.887               0.440             -1.770
  almost none      2.808               3.205              1.503
Litter
  almost all      -2.464              -2.390              0.479
  many            -1.218              -1.395             -1.709
  some             0.839               0.713             -0.962
  almost none      2.843               3.078              2.191
Speeding
  almost all      -2.856              -2.675              0.470
  many            -1.612              -1.581             -1.889
  some             0.542               0.600             -1.340
  almost none      3.926               3.655              2.759
Alcohol
  almost all      -2.906              -2.805              2.376
  many            -1.116              -1.225             -1.828
  some             1.100               0.971             -1.894
  almost none      2.922               3.059              1.346
Sex
  almost all      -2.309              -2.239              1.143
  many            -1.200              -1.292             -1.633
  some             0.880               0.778             -1.576
  almost none      2.630               2.752              2.067

Results on the second latent-class factor support the notion of an extreme response pattern in the data, i.e. positive beta weights for the two extreme categories (1 and 4) and negative betas for the two middle categories (2 and 3). The same conclusion can be drawn from the profile matrix presented in Table 5, which shows, for each level (class) of the two factors, the probability of giving a particular answer.
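The two response patterns can be verified mechanically from the beta weights. The following sketch is our own illustration, not part of the original analysis (the function name is hypothetical): it classifies a set of four category betas as ordinal (content-like) or U-shaped (extreme-response-like), using the 'Benefits' item from Table 4:

```python
def classify_beta_pattern(betas):
    """Classify the four category betas of a 4-point item.

    'ordinal' - betas increase monotonically across categories (content factor)
    'extreme' - both end categories exceed both middle ones (method factor)
    'mixed'   - neither pattern holds
    """
    ordinal = all(b1 < b2 for b1, b2 in zip(betas, betas[1:]))
    u_shaped = min(betas[0], betas[3]) > max(betas[1], betas[2])
    if ordinal:
        return "ordinal"
    if u_shaped:
        return "extreme"
    return "mixed"

# Betas for the 'Benefits' item in the 2-factor model (Table 4)
print(classify_beta_pattern([-2.073, -1.108, 0.941, 2.240]))  # ordinal
print(classify_beta_pattern([1.567, -1.383, -1.462, 1.278]))  # extreme
```

Applied to all eight items in Table 4, every 'content' column comes out ordinal and every 'method' column U-shaped.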

Table 5. Profile with marginal probabilities for the 2-factor model

                    'Content' factor      'Method' factor
                    Level 1   Level 2     Level 1   Level 2
Factor level size   0.5352    0.4648      0.8403    0.1597
Benefits
  almost all        0.1018    0.0113      0.0219    0.2584
  many              0.6367    0.2160      0.4715    0.2814
  some              0.2473    0.6524      0.4745    0.2311
  almost none       0.0143    0.1203      0.0321    0.2291
Cheat_tax
  almost all        0.2132    0.0492      0.0867    0.4015
  many              0.6723    0.3605      0.5832    0.2333
  some              0.1120    0.5242      0.3244    0.1941
  almost none       0.0025    0.0661      0.0056    0.1711
Avoid_tax
  almost all        0.1716    0.0412      0.0652    0.3517
  many              0.6579    0.3428      0.5603    0.2544
  some              0.1643    0.5394      0.3636    0.2071
  almost none       0.0062    0.0767      0.0109    0.1869
Drug
  almost all        0.0376    0.0019      0.0035    0.1134
  many              0.4847    0.1097      0.3239    0.2393
  some              0.4629    0.7064      0.6276    0.3049
  almost none       0.0148    0.1820      0.0450    0.3423
Litter
  almost all        0.2452    0.0379      0.1051    0.3790
  many              0.6101    0.3029      0.5193    0.1938
  some              0.1413    0.5600      0.3586    0.2165
  almost none       0.0033    0.0992      0.0170    0.2108
Speeding
  almost all        0.2061    0.0329      0.0779    0.3770
  many              0.6746    0.3586      0.5800    0.2525
  some              0.1185    0.5509      0.3381    0.2214
  almost none       0.0007    0.0576      0.0040    0.1491
Alcohol
  almost all        0.0621    0.0022      0.0040    0.1935
  many              0.5851    0.1415      0.3989    0.2733
  some              0.3451    0.7521      0.5778    0.3053
  almost none       0.0077    0.1042      0.0192    0.2279
Sex
  almost all        0.0979    0.0101      0.0228    0.2374
  many              0.6149    0.1971      0.4477    0.2788
  some              0.2816    0.7149      0.5196    0.2908
  almost none       0.0055    0.0779      0.0099    0.1929


In the case of the 'content' factor, level 1 can be considered the 'low morals of compatriots' class, as most of the answers in this level are concentrated in categories 1 and 2, while level 2 stands for the 'high morals of compatriots' class, as most of the answers shift to the two higher categories (3 and 4). This pattern is consistent across all attitudinal variables and again confirms the attitudinal nature of this factor. The responses on the 'method' factor, on the other hand, show a clearly distinctive, and at the same time indicative, pattern. Level 1 of this factor is characterized by an extensive concentration of answers in the two middle categories (2 and 3) and almost no answers in the two extreme categories. The distribution changes completely in level 2, where responses disperse almost equally among all four answer categories. In other words, compared to level 1, answers in level 2 are more extreme in both directions. The fact that they spread in both directions implies that this factor is not connected to the substantive meaning of the questions. Thus, this factor depicts respondents' tendency to answer the questions by choosing extreme categories, irrespective of their content, i.e. it exposes extreme response bias.

To validate the results of Tables 4 and 5, we operationalized an 'extreme response' variable as the sum of respondents' scores on the two extreme categories (answers 1, 'almost all', and 4, 'almost none'), as well as two variables summing the scores on each extreme category taken separately: 'almost all' responses (category 1) and 'almost none' responses (category 4). We calculated the 'almost all' and 'almost none' sumscores to investigate the nature of the method factor in more detail, i.e. to check whether the method factor is similarly associated with both extreme scores, as an extreme-response-bias factor should be by definition. When we correlate these sumscores with respondents' factor scores on the 'content' and 'method' factors (Table 6), we see that the 'extreme responses' variable is barely correlated with the 'content' factor, while the two other variables, 'almost all' and 'almost none', correlate moderately with it in opposite directions, consistent with the content of that factor.
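This operationalization is simple to reproduce. A minimal sketch with NumPy (our own illustration; variable names are hypothetical), assuming the eight items are coded 1 ('almost all') to 4 ('almost none'):

```python
import numpy as np

def extreme_sumscores(responses):
    """responses: (n_respondents, n_items) array of answers coded 1-4.

    Returns per-respondent counts of 'almost all' answers (code 1),
    'almost none' answers (code 4), and their sum, the overall
    'extreme responses' score.
    """
    almost_all = (responses == 1).sum(axis=1)
    almost_none = (responses == 4).sum(axis=1)
    return almost_all, almost_none, almost_all + almost_none

# Two illustrative respondents on the eight items of the scale
answers = np.array([[1, 4, 1, 4, 2, 3, 1, 4],   # extreme responder
                    [2, 3, 2, 2, 3, 3, 2, 3]])  # moderate responder
almost_all, almost_none, extreme = extreme_sumscores(answers)
print(extreme)  # [6 0]
```

These sumscores can then be correlated with the posterior factor scores, as in Table 6.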


Table 6. Correlations of factor scores with the 'extreme responses' variables

                              'Content'   'Method'   Almost ALL  Almost NONE  EXTREME
                              mean score  mean score score       score        score
'Content' factor mean score      1          -.058      -.415        .358        -.089
'Method' factor mean score      -.058        1          .620        .560         .886
Almost ALL score                -.415        .620       1          -.111         .730
Almost NONE score                .358        .560      -.111        1            .598
EXTREME score                   -.089        .886       .730        .598         1

What is more important, however, is that the 'method' factor correlates highly with the 'extreme responses' variable. Furthermore, it correlates similarly, and in the same direction, with the two variables representing opposite extremes. Evidently, we are dealing with a factor that accounts for extreme response bias. Having confirmed the presence of extreme response bias, the next step of our analysis deals with the consequences of ignoring this bias, as well as the measurement inequivalence found in the data, when drawing conclusions from country differences.

5. Consequences of ignoring measurement inequivalence and extreme response bias

From a methodological point of view, investigating measurement inequivalence and response bias is relevant in its own right. However, social researchers may still wonder whether it is truly worth the effort. As long as this methodological fine-tuning of measurement issues does not lead to fundamentally different interpretations of country differences, there is little incentive to adopt the research strategy developed in the previous section. The final question, thus, is whether the resulting differences across countries change once we take into account the measurement inequivalence and response bias found in the data.


To answer this question we compare the results from the initial, content-only homogeneous latent-class factor model with:
- a model in which the method factor that accounts for extreme response bias is included, i.e. a two-factor homogeneous model;
- a model in which measurement inequivalence is taken into account, i.e. a one-factor partially homogeneous model;
- a model that captures both measurement inequivalence and the observed extreme response bias, i.e. a two-factor partially homogeneous model.

By comparing the results for these four situations, we can test whether measurement inequivalence and response bias have an impact on the substantive findings. The extent to which country comparisons based on the homogeneous model differ from those based on the two-factor partially homogeneous model indicates the improvement in the validity of these comparisons when measurement inequivalence and response bias are taken into account. Furthermore, we can also examine the effects of response bias and measurement inequivalence taken separately by comparing the results for situations 2 and 3 with the initial, one-factor homogeneous model. The differences between countries in the aforementioned models are graphically presented in Figure 2 [5]. These differences are expressed through gamma values, whose average across all countries is 0. Hence, the gamma coefficients are measures of a country's relative position with regard to the measured attitude [6]. In our case, the higher the gamma value, the more positive the attitude towards the morals of compatriots in a given country compared to other countries; conversely, the lower the gamma, the more negative that attitude.

[5] In Figure 2 countries are ordered according to the size of their effect on the latent variable in the one-factor homogeneous model.
[6] Gammas are parameters of a logit model used to predict the latent distribution as a function of covariates, i.e. they depict the effects of a selected covariate on the latent factors (Vermunt & Magidson, 2005).


Figure 2. Country differences in the 'Moral of Compatriots' attitude in the four models: 1-factor homogeneous, 1-factor partially homogeneous, 2-factor homogeneous, and 2-factor partially homogeneous. [Bar chart of gamma values per country on a scale from roughly -4.5 to 3.5; the countries, from Turkey, Belarus and Ukraine down to Germany, Great Britain and France, are ordered by their effect in the one-factor homogeneous model.]

The first pattern, and the one most easily discerned, is that the two homogeneous models - with or without the method factor - yield very similar results, i.e. country differences in attitudes toward the morals of compatriots do not change substantially when we only account for the extreme response bias observed in the data. This resemblance between the gammas of the one-factor and two-factor homogeneous models implies that accounting for extreme response bias without adjusting for measurement inequivalence changes little in the results. This finding may be somewhat surprising and could suggest that extreme response bias does not influence the relative positions of countries on the measured attitude. However, that would be a premature and erroneous conclusion, as we will shortly see.

On the other hand, accounting for measurement inequivalence without adjusting for response bias (the one-factor partially homogeneous model) produces strikingly different results compared to the initial, one-factor homogeneous model. It turns out that adding all direct effects dramatically alters the countries' relative weights, to the extent that the gamma weights of these two models hardly correlate (r=0.012). Consequently, for most countries the position relative to other countries changes, and some even switch from below the European average to above it on the attitude towards the morals of compatriots, or vice versa. Countries that show this shift are Great Britain, Germany, Slovakia, Croatia and Ukraine. Contrary to the findings from the one-factor homogeneous model, citizens of Great Britain and Germany are less convinced of their compatriots' morals than the average European, whereas Slovakians, Croatians and especially Ukrainians evaluate the morals of their compatriots more positively than the European average.
In addition, the differences between countries tend to decrease substantially from the one-factor homogeneous model to both partially homogeneous models, which indicates that European countries differ less in their attitudes towards the morals of compatriots than would have been concluded from the initial, homogeneous measurement model. These results clearly demonstrate that country positions on the measured attitude, in both relative and absolute terms, are strongly influenced by the measurement model that is adopted, and that inconsistent conclusions would have been drawn depending on the choice of model. The previous findings suggest that accounting for response bias has no impact on country rankings on the content latent-class factor, whereas measurement inequivalence has a huge effect on these rankings. However,

neither the two-factor homogeneous nor the one-factor partially homogeneous model tells the whole story. Before making inferences about relative country positions we need to extend our comparison to the model that simultaneously takes into account both measurement inequivalence and extreme response bias. As it turns out, the results for the two-factor partially homogeneous model prove to be the most illuminating. First, there are considerable differences between the two partially homogeneous models (r=0.513). For example, in Germany, Great Britain, Portugal, Belgium, Croatia and Belarus the gammas increase (in absolute value) from the one-factor to the two-factor model. In Spain, Latvia, Hungary, Romania, Greece and Turkey, on the other hand, the gammas decrease in absolute terms. Additionally, in Italy, Denmark, Sweden, Iceland, Poland, the Czech Republic, Bulgaria, Russia and Malta, the gamma values in the two models have different signs, indicating that these countries switch from a position above the European average to one below it, or vice versa. As we have seen, the difference between the one- and two-factor partially homogeneous models is the inclusion of the extreme response style factor in the latter. Hence, the difference in country standings between these two models indicates that the extreme response style affects country comparisons in addition to measurement inequivalence. In other words, accounting for extreme response patterns yields a different picture of the countries' standings relative to each other with respect to the attitude on the morals of their compatriots. An even more important result, however, is that this model, which encompasses both measurement inequivalence and response bias, correlates moderately with the initial, homogeneous model (r=0.567).
This implies that the extreme response style partly accounts for the huge differences in country estimates we observed when comparing the homogeneous and partially homogeneous models with one factor. It also shows that the most valid results are neither completely the same as, nor completely different from, the results of the initial homogeneous model, but somewhere in between these two extremes. In summary, these findings indicate that when measurement inequivalence and extreme response bias are both present in the data, accounting for only one of the two issues at a time can lead to erroneous conclusions about the differences between countries in measured latent attitudes. In this research, accounting only for measurement inequivalence would lead to the conclusion that the country differences are completely different from those in the initial homogeneous 'content-only' model. A model that includes an 'extreme response style' factor but does not recognize inequivalence in measurement, on the other hand, did not alter the country comparisons. It is only when we deal with both methodological issues in the same model that we obtain more valid results. These final results are still connected to the original ones, but a great deal has changed too. Hence, by ignoring response style bias we would be prone to drawing erroneous conclusions both about the differences between countries and about how these results relate to the initial ones from the homogeneous model.

Conclusions

Measurement inequivalence can seriously impair the comparison of attitudes among cultural groups, and one of its main causes may be response bias. In this research we have tried to go one step further than is usual in research on equivalence, i.e. to simultaneously check for response bias as well, and to investigate the impact on substantive results when both issues are included. To this end, we made use of a latent-class factor approach, demonstrating its flexibility and adequacy in detecting both measurement inequivalence and extreme response bias. The multigroup latent-class factor approach is still rarely used in social research and we hope that this work will encourage researchers to use it more readily. The results confirm the importance of accounting for measurement inequivalence and response bias in the data. The analysis has shown that these issues, when present, are capable of severely distorting the substantive findings. But the key finding of our work is that country differences in latent attitudes are substantially different when these two methodological issues are accounted for simultaneously compared to when they are treated separately. Hence our conclusion that country comparison is only valid if both measurement inequivalence and response style behaviour are taken into account. We want to underscore the importance of this finding, given that these two issues are very rarely treated at the same time. These results are thus a strong argument for researchers to check for the presence of both measurement inequivalence and response bias in comparative, cross-cultural data and, if both are found, to account for them together in one overall model.


References

Arce-Ferrer, A.J. (2006). An Investigation into the Factors Influencing Extreme-Response Style. Educational and Psychological Measurement, 66(3), 374-392.
Arthur, A.Z. (1966). Response Bias in the Semantic Differential. British Journal of Social and Clinical Psychology, 5, 103-107.
Bachman, J.G., & O'Malley, P.M. (1984). Yea-Saying, Nay-Saying, and Going to Extremes: Black-White Differences in Response Styles. The Public Opinion Quarterly, 48(2), 491-509.
Billiet, J.B., & McClendon, M.J. (2000). Modeling Acquiescence in Measurement Models for Two Balanced Sets of Items. Structural Equation Modeling, 7, 608-628.
Cambré, B., Welkenhuysen-Gybels, J., & Billiet, J. (2002). Is it content or style? An evaluation of two competitive measurement models applied to a balanced set of ethnocentrism items. International Journal of Comparative Sociology, 43, 1-20.
Cheung, G.W., & Rensvold, R.B. (1998). Testing Measurement Models for Factorial Invariance: A Systematic Approach. Educational and Psychological Measurement, 58(6), 1017-1034.
Cheung, G.W., & Rensvold, R.B. (2000). Assessing Extreme and Acquiescence Response Sets in Cross-Cultural Research Using Structural Equations Modeling. Journal of Cross-Cultural Psychology, 31, 187-212.
Clogg, C.C., & Goodman, L.A. (1985). Simultaneous Latent Structure Analysis in Several Groups. Sociological Methodology, 15, 81-110.
Eid, M., Langeheine, R., & Diener, E. (2003). Comparing Typological Structures Across Cultures by Multigroup Latent Class Analysis: A Primer. Journal of Cross-Cultural Psychology, 34, 195-210.
Gibbons, J., Zeller, J., & Rudek, D. (1999). Effects of language and meaningfulness on the use of extreme response style by Spanish-English bilinguals. Cross-Cultural Research, 33, 369-381.
Greenleaf, E.A. (1992). Measuring Extreme Response Style. Public Opinion Quarterly, 56, 323-351.
Hagenaars, J.A. (1990). Categorical longitudinal data: log-linear panel, trend, and cohort analysis. Newbury Park, California: Sage.
Halman, L. (2001). The European Values Study: A Third Wave. Tilburg: EVS, WORC, Tilburg University.
Hui, C.H., & Triandis, H.C. (1985a). Measurement in Cross-Cultural Psychology: A Review and Comparison of Strategies. Journal of Cross-Cultural Psychology, 16(2), 131-152.
Hui, C.H., & Triandis, H.C. (1985b). The Instability of Response Sets. The Public Opinion Quarterly, 49(2), 253-260.
Hui, C.H., & Triandis, H.C. (1989). Effects of culture and response format on extreme response style. Journal of Cross-Cultural Psychology, 20(3), 296-309.
Innes, J.M. (1977). Extremity and 'Don't Know' Sets in Questionnaire Response. British Journal of Social and Clinical Psychology, 16, 9-12.
Magidson, J., & Vermunt, J.K. (2001). Latent Class Factor and Cluster Models, Biplots and Related Graphical Displays. Sociological Methodology, 31, 223-264.
Magidson, J., & Vermunt, J.K. (2002). Non-technical Introduction to Latent Class Models. Statistical Innovations, White Paper 1, 15.
Magidson, J., & Vermunt, J.K. (2003). Comparing Latent Class Factor Analysis with Traditional Factor Analysis for Data-mining. In Bozdogan, H. (Ed.), Statistical Data-mining and Knowledge Discovery (pp. 373-383). CRC Press.
Magidson, J., & Vermunt, J.K. (2004). Latent class models. In Kaplan, D. (Ed.), The Sage Handbook of Quantitative Methodology for the Social Sciences (pp. 175-198). Thousand Oaks: Sage Publications.
McCutcheon, A.L. (1987). Latent class analysis. Beverly Hills: Sage Publications.
McCutcheon, A.L. (2002). Basic Concepts and Procedures in Single- and Multiple-Group Latent Class Analysis. In Hagenaars, J.A., & McCutcheon, A.L. (Eds.), Applied latent class analysis (pp. 56-85). Cambridge: Cambridge University Press.
Moors, G. (2003). Diagnosing response style behavior by means of a latent-class factor approach. Socio-demographic correlates of gender role attitudes and perceptions of ethnic discrimination re-examined. Quality & Quantity, 37, 277-302.
Moors, G. (2004). Facts and artefacts in the comparison of attitudes among ethnic minorities. A multi-group latent class structure model with adjustment for response style behaviour. European Sociological Review, 20, 303-320.
Paulhus, D.L. (1991). Measures of Personality and Social Psychological Attitudes. In Robinson, J.P., & Shaver, P.R. (Eds.), Measures of Social Psychological Attitudes Series (Vol. 1, pp. 17-59). San Diego: Academic Press.
Poortinga, Y.H. (1989). Equivalence of Cross-Cultural Data: An Overview of Basic Issues. International Journal of Psychology, 24, 737-756.
Steenkamp, J.E.M., & Baumgartner, H. (1998). Assessing Measurement Invariance in Cross-National Consumer Research. Journal of Consumer Research, 25, 78-90.
Vandenberg, R.J., & Lance, C.E. (2000). A review and synthesis of the measurement invariance literature: Suggestions, practices, and recommendations for organizational research. Organizational Research Methods, 3, 4-69.
Van de Vijver, F., & Leung, K. (1997). Methods and Data Analysis for Cross-Cultural Research. Thousand Oaks: Sage.
Vermunt, J.K., & Magidson, J. (2000). Latent GOLD User's Guide. Belmont: Statistical Innovations Inc.
Vermunt, J.K., & Magidson, J. (2005). Factor Analysis with Categorical Indicators: A Comparison Between Traditional and Latent Class Approaches. In Van der Ark, L.A., Croon, M.A., & Sijtsma, K. (Eds.), New Developments in Categorical Data Analysis for the Social and Behavioral Sciences (pp. 41-62). Mahwah: Erlbaum.

