
Styles of thinking, abilities, and academic performance.

A cornerstone of modern educational psychology is that a student's level of abilities is one of the major predictors of school success. Though psychologists or educators may differ significantly on the details and on theory, there is established evidence that abilities matter (e.g., Carroll, 1993; Gardner, H., 1983; Guilford, 1967; Horn, 1994; Spearman, 1927; Sternberg, 1985, 1986, 1988b; Thurstone, 1938).

Yet abilities do not predict school performance completely. In the search for other variables that contribute to school achievement, researchers have devoted considerable attention to the so-called stylistic aspects of cognition. The idea of a style reflecting a person's typical or habitual mode of problem-solving, thinking, perceiving, and remembering was initially introduced by Allport (1937). Since then, researchers have developed various theories in attempts to understand the reality of styles (see Curry, 1983; Grigorenko & Sternberg, 1995; Kagan & Kogan, 1970; Kogan, 1983; Riding & Cheema, 1991; Sternberg, 1988a; Vernon, 1973). In an examination of the literature on styles, Grigorenko and Sternberg (1995) found three general approaches to stylistic aspects of learning.

The first approach is cognition-centered, dealing with cognitive styles. Theorists and researchers in this area have sought to investigate "the characteristic, self-consistent modes of functioning which individuals show in their perceptual and intellectual activities" (Witkin, Oltman, Raskin, & Karp, 1971, p. 3). Some of the main styles studied in this literature have been leveling-sharpening (i.e., a tendency to be hypersensitive to small differences versus a tendency to maximize assimilation; Klein, 1954), equivalence range (i.e., a spontaneous differentiation of heterogeneous items into a complex of related groups; Gardner, R., 1953), field dependence-independence (i.e., an ability to differentiate an object from the context; Witkin, 1973), and impulsivity-reflectivity (i.e., a tendency to reflect on versus disregard alternative solutions; Kagan, 1958). There also have been attempts to integrate specific cognitive styles into a larger framework of cognitive functioning. Kagan and Kogan (1970) have matched particular cognitive styles with stages of problem-solving. Fowler (1977, 1980) and Santostefano (1986) have incorporated the notion of styles into a developmental framework, and Royce and Powell (1983) have conceptualized styles as higher-order strategies that control the deployment of lower-order abilities.

A second approach to studying styles is personality-centered. The theory of Myers and Myers (1980), based on the work of Jung (1923), follows this approach. Myers and Myers have distinguished among two attitudes, extroversion and introversion; two perceptual functions, intuition and sensing; two judgmental functions, thinking and feeling; and two ways of dealing with the outer world, judgment and perception. Gregorc (1984) has distinguished between two ways of handling each of space and time. Thus, people can be classified as abstract or concrete with respect to space, and as sequential or random with respect to time. Miller (1987, 1991) has proposed a somewhat different taxonomy, distinguishing among analytic versus holistic, objective versus subjective, and emotionally stable versus emotionally unstable individuals.

The third approach is activity-centered and tends to focus on styles of learning and teaching. These theories have probably had the most direct application in the classroom. For example, Kolb (1974) has identified four styles of learning: convergent versus divergent and assimilational versus accommodational. Dunn and Dunn (1978) have categorized styles in terms of preferred elements in a learning situation, such as various aspects of the environment (e.g., sound and light) and various aspects of interaction with the self and others (e.g., peers and adults). Renzulli and Smith (1978) have distinguished preferred styles of work in the classroom, such as projects, drill and recitation, and peer teaching. A theory of the same kind but more oriented toward the world of work is that of Holland (1973), who has distinguished among realistic, investigative, artistic, social, and enterprising styles on the job.

These three approaches differ not only in the focus of their interest, but also in the ways they address the functional aspects of styles mentioned previously. The cognition- and the personality-centered approaches typically imply that styles are either-or constructs (a person could be either field-independent or field-dependent, but not both). In these approaches, styles are consistent across various tasks and situations and can be modified very little, if at all, by training during the life span. Cognitive and personality styles are most often viewed as structures, with the focus placed on stability over time; as such, styles are "givens" in a training or educational setting (Riding & Cheema, 1991). Cognition- and personality-centered theories also usually have built-in evaluative attitudes, assuming that certain styles are better than others: It is often more beneficial in modern society to be reflective rather than impulsive, or sequential rather than random. These styles are measured primarily by specially designed laboratory tasks. In contrast, styles defined in the third, activity-centered approach are measured by methods more easily usable in educational environments. Most authors working in the activity-centered framework view styles as processes, which can be built on and used to compensate for or to remediate weaknesses. In this interpretation, styles are seen as dynamic, not as "frozen forever." There are also no "bad" or "good" styles -- the aim is to find or develop "optimal" styles for particular situations.

THEORY OF MENTAL SELF-GOVERNMENT

Our goal was to build on this work using the theory of mental self-government (Sternberg, 1988a, 1990, 1994). The objective of the theory is to integrate various approaches to style and to provide new directions for theory applied to educational practice.

The basic idea of the theory of mental self-government (Sternberg, 1988a, 1994) is that people, like societies, have to organize or govern themselves. Thus, the theory addresses the question of how people govern and manage their everyday cognitive activities, within the school and without. In the theory of mental self-government, a style of thinking is defined as a preferred way of thinking. It is not an ability, but rather a favored way of expressing or using one or more abilities. Two or more people with the same levels or patterns of abilities might nevertheless have very different styles of thinking. Also, two people with similar personality characteristics might differ in their thinking styles. Thus, styles of thinking do not reside in the domain of abilities or in the domain of personality, but at the interface between the two (Sternberg, 1988a, 1988b, 1994).

The theory is organized into five major parts: functions, forms, levels, scope, and leanings of mental self-government (see Table 1). Because scope was not used in our studies, we do not describe it further. The basic idea, then, is that people can be characterized and assessed with regard to habitual functions, forms, levels, and leanings in their cognitive activities.
Table 1
Styles of Mental Self-Government
(Each style is listed with its characterization, followed by an example relevant to school settings.)

Functions

 Legislative: is concerned with creating, formulating, imagining, and planning; likes to formulate his or her own activities. Example: students who like to approach assignments in their own ways, who like to wander off from their textbooks, who like to explore, to do science projects, to write poetry and stories, to compose music, and to create original artworks.

 Executive: is concerned with implementing and doing; likes to pursue activities structured by others. Example: students who are always ready for class, who know the assigned material very well, who prefer solving problems over formulating them, and who like developing someone else's ideas more than suggesting their own.

 Judicial: is concerned with judging, evaluating, and comparing; likes to judge the products of others' activities or to judge others themselves. Example: students who like to comment and to critique, who enjoy writing critical essays and commentaries, and who prefer evaluating others' ideas over formulating or implementing them.

Forms

 Monarchic: tends to focus single-mindedly on one goal or need at a time; a single goal or way of doing things predominates. Example: students who like to engage in single projects, whether in art, science, history, or business.

 Hierarchic: tends to allow for multiple goals, each of which may have a different priority; knows how to perform multiple tasks within the same time frame, setting priorities for getting them done. Example: students who know how to divide homework so that more time and energy are devoted to more important and more difficult assignments.

 Oligarchic: tends to allow for multiple goals, all of which are equally important; likes to do multiple tasks within the same time frame but has difficulty setting priorities for getting them done. Example: students who start many projects simultaneously but have trouble finishing them because there is not enough time and because priorities among the projects have not been set.

 Anarchic: tends to eschew rules, procedures, and formal systems; often has difficulty adjusting to the school as a system. Example: students who do not do much planning and tend to choose the projects they work on in a random way, who do not like to follow the established curriculum, and who have difficulties meeting deadlines.

Levels

 Global: prefers to deal with the large picture and abstractions. Example: students who like writing on the global message and meaning of a work of art, or on the significance of a particular discovery for mankind.

 Local: prefers dealing with details and concrete issues. Example: students who like writing on the components of a work of art, or on the details of an experiment.

Leanings

 Liberal: likes to do things in new ways, to have change in his or her life, and to defy conventions. Example: students who like figuring out how to operate new equipment and who like nontraditional, challenging tasks.

 Conservative: likes traditions and stability. Example: students who like to be shown all the steps of operating equipment and who like to be given precise instructions for performing a task.

The theory of mental self-government is rooted in previous work on styles, and so it shares some characteristics with earlier theories. The theory of thinking styles addresses all three domains -- the domain of cognition, the domain of personality, and the domain of activity: We view thinking styles as buffers between such internal characteristics as ability and personality, on the one hand, and the external situation, on the other. The theory of mental self-government provides an insight into individually preferred ways of thinking in various activities. Similar to the cognition- and the personality-centered approaches, some thinking styles imply distinct poles: A person can be either local or global, liberal or conservative, but not both (see Table 1 for definitions of these styles). In contrast, other thinking styles (e.g., legislative, judicial, and executive), like activity-based styles, are not polarizable, because they represent distinct categories that do not lie on a continuum.

The space of thinking styles is multidimensional; the different styles are not orthogonal to each other and tend to correlate and to form profiles. Thus, for example, the executive style often correlates with the conservative style, whereas the legislative style tends to be associated with the liberal one (Martin, 1989). Moreover, although people have a general profile of the ways they choose to think, thinking styles can vary across tasks and situations. A student's preferred style in mathematics, for example, may not necessarily be his or her preferred style in a cooking class. Thus, similar to the activity-based approach, we view styles as dynamic and adaptive, subject to change and optimization. Unlike the cognition- or the personality-based approaches to styles, we believe that thinking styles are not fixed, but rather can vary across the life span. The styles that may lead to adaptive performance (either in learning or teaching) at the elementary-school level are not necessarily those that will work best in advanced graduate training or at work. For example, teachers at the primary level of education tend to favor students with creative thinking styles more than do teachers at the secondary level (Sternberg & Grigorenko, 1995).

Styles are, at least in part, socialized (Sternberg, 1988a; Sternberg & Grigorenko, 1995), and may undergo developmental changes. No thinking style is, in any absolute sense, "good" or "bad." Rather, it can be more or less adaptive for a given task or situation, and what is adaptive in one setting may not be in another.

Finally, thinking styles manifest themselves in any activity, and therefore can be measured in an ecologically valid situation, as well as in a laboratory setting. In other words, thinking styles are reflected in styles of learning and teaching, styles of working and playing, and so on.

In our previous studies (Grigorenko & Sternberg, in press; Sternberg & Grigorenko, 1993, 1995), we operationalized the theory of thinking styles and applied it to various educational activities -- in particular, learning and teaching. Our most relevant findings were that there was significant variation of styles among teachers and students, and that students' thinking styles were predictive of their school success. In the present study, we have attempted to extend our research on thinking styles into the area of gifted education.

METHOD

This study was a part of a large-scale effort to validate Sternberg's triarchic model of intelligence (Sternberg, 1985, 1986, 1988b). The triarchic theory distinguishes among three kinds of intellectual giftedness: analytic, creative, and practical. In brief, the theory suggests that individuals gifted in these different ways excel in different activities. The analytically gifted are strong in analyzing, evaluating, and critiquing; the creatively gifted are good at discovering, creating, and inventing; and the practically gifted are strong in implementing, utilizing, and applying. A complete account of the larger study has been presented elsewhere (Sternberg & Clinkenbeard, 1995; Sternberg, Ferrari, Clinkenbeard, & Grigorenko, in press). In this article, we present only the components of the study relevant to thinking styles.

Study Questions

The general purpose of this study was to investigate relations between different types of abilities (as defined by the triarchic theory) and different thinking styles (as defined by the theory of mental self-government). We addressed four research questions, as follows:

1. Is there a relationship between thinking styles and abilities? In other words, did stylistic preferences differ for gifted and nongifted students? More specifically, did stylistic preferences differ among gifted students of different abilities? For example, did creative students tend to be more legislative, and analytical students tend to be more judicial?

2. After controlling for students' level of abilities, to what degree do thinking styles predict performance? In other words, when the level of abilities was accounted for, did styles contribute anything to understanding variability in academic performance?

3. Given that four different instructional types (analytical, creative, practical, and traditional) were used, do students with certain thinking styles, placed in an instructional group that is matched or mismatched with their ability, perform any better than students with other thinking styles? For example, did creative students who scored high on the judicial thinking style and were placed in the group with analytical instruction perform better than creative students in the same group with lower scores on the judicial thinking style?

4. Given that various tools of performance evaluation (homework, written examinations, and project) were used in this study, did students with particular thinking styles do better in one form of evaluation than another? In other words, did specific forms of evaluation benefit students with particular thinking styles?

Participants

Participants were high school students, ranging in age from 13 to 16 years, who attended the 1993 Yale Summer Psychology Program (YSPP). The program was advertised through brochures and newsletters distributed to schools in the United States and other countries. Schools were asked to submit nominations of gifted students to the Program Committee of the YSPP. The selection procedure was based on the students' performance on the Sternberg Triarchic Abilities Test (STAT), Level H, designed for advanced high school and college students (Sternberg, 1993). The STAT was sent to schools that submitted nominations. The test was administered to the nominated students in their own schools.

A total of 199 students (146 females and 53 males) were selected for participation in the summer program of 1993. (Altogether the YSPP enrolled 225 students, of whom 25 were admitted for pay, to provide tuition for eligible students. These 25 students were excluded from further analyses. Moreover, one student was expelled for discipline problems.)

Of these students, 3 (1.5%) were entering Grade 9, 25 (12.6%) were entering Grade 10, 77 (38.7%) were entering Grade 11, and 94 (47.2%) were entering Grade 12. The program participants were fairly widely distributed ethnically (based on students' own reports): 60% European American, 11% African American, 6% Hispanic American, and 17% American from another ethnic minority. Further, 4% of the students were from South Africa, and 2% were classified as "other."

Materials and Procedures

Different instruments were used for identification, performance assessment, and styles evaluation. All instruments were developed prior to the study, and complete accounts of their psychometric properties can be found elsewhere. Thus, only brief technical descriptions will be provided here.

Identification. Identification and classification of students into ability groups were done using the results of the STAT multiple-choice and essay subtests. There are nine multiple-choice subtests, each including 2 sample items and 4 test items, for a total of 36 items (for details, see Sternberg et al., in press). The item types on the nine multiple-choice subtests are: (1) analytic-verbal -- inferring the meanings of neologisms from natural contexts; (2) analytic-quantitative -- inferring subsequent numbers on the basis of a series of numbers; (3) analytic-figural -- inferring the missing part of each matrix based on the matrix's overall structure; (4) practical-verbal -- performing everyday reasoning; (5) practical-quantitative -- performing everyday math; (6) practical-figural -- performing route planning; (7) creative-verbal -- solving verbal analogies preceded by counterfactual premises; (8) creative-quantitative -- learning and applying novel number operations; (9) creative-figural -- extracting and applying rules for figure transformations. The three essay subtests involved analytical thinking (requiring students to analyze the advantages and disadvantages of having police or security guards in a school building), creative thinking (requiring students to describe how they would reform their school system to produce an ideal one), and practical thinking (requiring students to specify a problem in their life and to state three practical solutions for solving it). Multiple-choice subtests and essays were standardized, and then three primary STAT scores (analytical, creative, and practical) were obtained. The validation of the STAT has been described in detail elsewhere (Sternberg & Clinkenbeard, 1995; Sternberg et al., in press). In brief, a varimax-rotated principal-components analysis of the multiple-choice subtests of the STAT resulted in 9 specific factors (factor loadings varied from .92 to .98) with approximately equal eigenvalues, which ranged from 0.98 to 1.01. The factor structure shows that each subtest represents a unique process (analytic, creative, practical) by content (verbal, quantitative, figural) combination, suggesting that the STAT does tap into different abilities, instead of simply measuring Spearman's general ability (g; Spearman, 1927). The Kuder-Richardson-20 (KR-20) reliability coefficients for the multiple-choice items ranged from .49 to .64. The interrater agreement on the essays ranged from .58 to .69. The multiple-choice subtest scores correlated significantly with the essay scores (p < .01). The correlations with the Watson-Glaser Critical Thinking Appraisal and the Concept Mastery tests used for external validation were significant, but of moderate magnitude.

Based on their STAT performance, all students enrolled in the program were classified into five different groups. The STAT subtest scores were standardized so they could be compared across different subtests. Students were identified as "high" in an aspect of ability based on their strongest test attainment and their score with respect to the group average. Thus, we first constructed three groups: (a) a group in which students demonstrated a high level of analytical ability (N = 39, 19.6%); (b) a group in which students were high in creative ability (N = 38, 19.1%); and (c) a group in which students were high in practical ability (N = 35, 17.6%). For students to be classified as "high" in analytic, creative, or practical ability, their total score for a given ability was required to be at least a half-standard deviation above the group average and at least a half-standard deviation above their own scores for the other two abilities measured by the STAT (e.g., analytic higher than creative and practical). Although the half-standard deviation criterion might seem weak, recall that all students entering the program had first been nominated as gifted by their schools.

For the fourth group, we defined a "balanced" gifted group (N = 40, 20.1%). For students to be classified as balanced, they had to score above the group average for all three abilities. Finally, the fifth group was composed of students who scored at or below the group average for all three abilities (N = 47, 23.6%). These students were classified as not identified as gifted.
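To make the classification rule concrete, the following is a minimal Python sketch (not the authors' code); it assumes the standardized STAT scores sit in a pandas DataFrame with hypothetical columns "analytic", "creative", and "practical", and it adds an explicit fallback for profiles the published rules do not cover.

    # Hypothetical sketch of the five-way ability classification described above.
    import pandas as pd

    def classify_student(row, group_means, margin=0.5):
        # Return one of the five ability groups used in the study (sketch).
        abilities = ["analytic", "creative", "practical"]
        for a in abilities:
            others = [b for b in abilities if b != a]
            # "High" in one ability: at least half an SD above the group average
            # and half an SD above the student's own scores on the other two.
            if (row[a] >= group_means[a] + margin and
                    all(row[a] >= row[b] + margin for b in others)):
                return "high " + a
        if all(row[a] > group_means[a] for a in abilities):
            return "balanced gifted"   # above the group average on all three
        if all(row[a] <= group_means[a] for a in abilities):
            return "not identified"    # at or below the group average on all three
        return "unclassified"          # profile not covered by the rules as described

    # Illustrative usage:
    # means = stat[["analytic", "creative", "practical"]].mean()
    # stat["group"] = stat.apply(classify_student, axis=1, group_means=means)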

Instruction. Students were given an intensive, 4-week, college-level psychology course. The course consisted of three main components: the text (Sternberg, 1995), the lecture series, and the afternoon sections. The first two components were common to all groups, whereas the last constituted the treatment and diverged across groups. There were four types of afternoon sections, in which section leaders emphasized different skills: memory (the traditional educational paradigm), analytical thinking, creative thinking, or practical thinking. The students were divided among instructional groups in such a way that all groups had close to equal numbers of students of each of the five types of ability patterns. Thus, some students were placed in groups in which the type of afternoon section matched their abilities (N = 83; e.g., 10 of 38 creative students were placed in groups in which the section leaders taught for creative thinking), while the remaining students were mismatched (N = 113; e.g., 9 creative students were placed in groups with the section leaders teaching for practical thinking).

Performance Assessment. In our previous studies on thinking styles, we found that thinking styles predict school success: Students were viewed by their teachers as achieving at higher levels when the students' profiles of styles matched those of their teachers (Sternberg & Grigorenko, 1995). In other words, teachers appear to value more highly students who are stylistically similar to themselves. However, the evaluation of academic performance was done through students' class grades; that is, it could have been confounded with teachers' subjective perceptions of a given student. Thus, we were able to show that thinking styles were relevant to school performance, but our measures of school performance were, most likely, confounded with the teachers' views of a given student. In the present study, we had a chance to eliminate the possible bias that may have resulted from the use of class grades as indicators of academic performance. In this study, students' performance was rated by independent raters, who had never met the students and thus made their judgments based only on the quality of the students' writing.

All students received identical kinds of assessments: two major homework assignments, a final project, and two exams. Each of the assessments involved various tasks testing for analytical, creative, and practical skills. Some examples of assessments are (a) compare Freud's theory of dreaming to Jung's [analytical]; (b) design an experiment to test a theory of dreaming [creative]; and (c) discuss the implications of Jung's theory of dreaming for your life [practical].

Four raters scored all performance assessments, rating each task for a corresponding ability (e.g., analytic ratings for the analytic performances). All ratings were on a scale of 1 (low) to 9 (high). The effective reliabilities of quality ratings for four raters varied from .73 to .96. Principal-components analyses were used to extract the common variance in the ratings of the four judges for each of the analytic, creative, and practical ratings for the three types of assessments. The first principal-component score of these analyses was then used to assess achievement in the following analyses. Thus, there were six resulting factor scores: three abilities (analytic, creative, and practical), evaluated in three different assessment settings (two homework assignments, a final project, and two exams).

Using these scores, we created six summary measures. Three measures reflected students' performance on all homework assignments, all exams, and the final project. Three other measures reflected students' performance on analytical, creative, and practical tasks in different assessment settings. These six measures were used in the further analyses.

Evaluation of Styles. In the present study, students' thinking styles were evaluated in two different ways: (a) a self-report questionnaire and (b) a set of thinking-styles tasks. Detailed descriptions of the thinking-styles instruments can be found elsewhere (Grigorenko & Sternberg, in press; Sternberg & Grigorenko, 1995). The purpose of having different measures was to have converging operations (Garner, Hake, & Eriksen, 1956) that measured the same constructs. In this way, sources of bias and error associated with individual measures would be reduced (Campbell & Fiske, 1959). The thinking-styles measures are as follows:

* Thinking Styles Questionnaire (TSQ). This questionnaire consisted of 104 items, 8 for each of 13 scales: legislative, executive, judicial, monarchic, hierarchic, oligarchic, anarchic, global, local, liberal, conservative, internal, and external. Items were in the form of a Likert scale with ratings ranging from 1 (low) to 7 (high). In the further analyses, we used only 11 scales: The internal and external styles were excluded because the other thinking-styles measure (see the following description) did not include tasks measuring these styles. The scales' internal-consistency coefficients, obtained from an independent sample of school students prior to the study, ranged from .55 to .83 (see Table 2), suggesting adequate reliability of the instrument.

* Set of Thinking Styles Tasks for Students (STS). The STS was a set of 16 different tasks and preference items for students. The tasks and preference items were assumed to map directly onto 11 thinking styles: legislative, executive, judicial, monarchic, hierarchic, oligarchic, anarchic, global, local, liberal, and conservative. Students had to solve problems and make choices, and every response was coded via a scoring map of correspondence between responses and styles. For each scale, the sum of the scores across tasks and preference items was considered to be a measure of the thinking style. When preferences and choices were recoded into dichotomous form, the KR-20 reliability coefficients ranged from .59 to .74 (Sternberg & Grigorenko, 1995).

Examples of items from the TSQ and STS, as well as internal-consistency alpha reliabilities of subscales (calculated on independent but comparable samples of students not involved in these studies), are shown in Table 2. The correlations between corresponding scales of the TSQ and STS varied between r = .45 (N = 277, p < .0001) for the conservative style and r = .20 (N = 277, p < .001) for the global style.

TABLE 2 Examples of Some of the Items and Reliability Coefficients of the Scales of the Thinking Styles Questionnaire (TSQ) and the Set of Thinking Styles Tasks for Students (STS)
Thinking Styles Questionnaire (TSQ) -- sample items

 Legislative: When faced with a problem, I use my own ideas and strategies to solve it.
 Executive: Before starting a task or project, I check to see what method or procedure should be used.
 Judicial: I enjoy work that involves analyzing, grading, or comparing things.
 Monarchic: I like to concentrate on one task at a time.
 Hierarchic: In talking or writing down ideas, I like to have the issues organized in order of importance.
 Oligarchic: I prefer to work on a project or task that is acceptable and approved by my peers.
 Anarchic: When there are many important things to do, I try to do as many as I can in whatever time I have.
 Global: I care more about the general effect than about details of a task I have to do.
 Local: I like to collect detailed or specific information for projects I work on.
 Liberal: I like to change routines in order to improve the way tasks are done.
 Conservative: I like situations where I can follow a set routine.

Thinking Styles Questionnaire (TSQ) -- reliability (alpha), N = 277

 Legislative   .81
 Executive     .83
 Judicial      .73
 Monarchic     .84
 Hierarchic    .81
 Oligarchic    .54
 Anarchic      .55
 Global        .83
 Local         .66
 Liberal       .88
 Conservative  .83

Set of Thinking Styles Tasks for Students (STS)

Item Examples: When I am studying literature, I prefer: (a) to make up my own characters and my own plot (legislative); (b) to evaluate the author's style, to criticize the author's ideas, and to evaluate the characters' actions (judicial); (c) to follow the teacher's advice and interpretations of the author's positions, and to use the teacher's way of analyzing literature (executive); (d) to do something else (please describe your preferences in the space below).

An example of a task to distinguish among oligarchic, hierarchical, monarchic, and anarchic thinking styles is: You are the mayor of a large northeastern city. You have a city budget this year of $100 million. Below is a list of problems currently facing your city. Your job is to decide how you will spend the $100 million available to improve your city. Next to each problem is the projected cost to eliminate a problem entirely. In the space on the next page, list each problem on which you will spend city money and how much money you will budget for that problem. Whether you spend money on one, some, or all problems is up to you, but be sure your plan will not exceed the $100 million available. Whether you spend all the money to solve one or a few problems or divide the money partially to solve many problems is up to you. You have one additional problem -- you are up for reelection next year, so consider public opinion when making your decisions.

Problems facing your city: (1) Drug problem ($100,000,000); (2) The roads (they are old, full of potholes, and need to be repaired) ($25,000,000); (3) You have no new land for landfill and you need to build a recycling center ($25,000,000); (4) You need shelters for the homeless ($50,000,000); (5) You must replace subway cars and city buses; you need to buy new ones for the public transportation system ($50,000,000); (6) The public school teachers are demanding a salary increase and they are going to go on strike ($30,000,000); (7) Sanitation workers are demanding a salary increase and they are going to go on strike ($30,000,000); (8) An increase in unemployment has increased the number of welfare recipients ($80,000,000); (9) The AIDS epidemic has created the need for public education on AIDS prevention and you need to build an AIDS hospital ($100,000,000); and (10) You need to build a new convention center to attract out-of-state tourists. This could generate additional revenue for the next fiscal year ($70,000,000).

Principal-components analyses were used to extract the common variance in the scale scores obtained from the TSQ and STS. The first principal-component scores were then used as measures of the styles in the data analyses. The variance explained by the first principal component of each style was 64% for legislative; 69% for executive; 61% for judicial; 58% for monarchic; 50% for hierarchic; 52% for oligarchic; 50% for anarchic; 60% for global; 63% for local; 72% for liberal; and 72% for conservative. This procedure allowed us to separate out questionnaire-specific measurement error, at least to some extent, and to preserve for further analyses the styles scores reflecting only variance shared by the two instruments.
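As an illustration of this step, here is a brief Python sketch (assumed, not the authors' code) of extracting first-principal-component scores for one style from its two scale scores; the arrays tsq_scale and sts_scale are hypothetical.

    # Combine the TSQ and STS scale scores for one style via the first
    # principal component; the printed ratio corresponds to the percentages
    # of shared variance reported above.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    def first_pc_score(tsq_scale, sts_scale):
        X = StandardScaler().fit_transform(np.column_stack([tsq_scale, sts_scale]))
        pca = PCA(n_components=1)
        scores = pca.fit_transform(X).ravel()
        print("variance explained: {:.0%}".format(pca.explained_variance_ratio_[0]))
        return scores

    # e.g., legislative_score = first_pc_score(tsq["legislative"], sts["legislative"])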

Some styles were highly correlated (e.g., local and global, r = -.67, p < .001). To reduce the factor space of styles (i.e., to decrease the number of dependent variables), we performed factor analyses with varimax rotation, using the factor scores described previously. The outcome of this analysis was a nine-factor structure, which explained 98% of the total variance. Four of the styles formed two factors with different directions of factor loadings (FLs). Thus, the liberal and conservative styles loaded on one factor; FLs were .90 and -.89, respectively. The global and local styles also created a two-pole factor, with FLs of .93 and -.90, respectively. All other styles formed independent factors. The FLs were as follows: judicial, .99; executive, .81; legislative, .82; monarchic, .98; hierarchic, .97; oligarchic, .99; and anarchic, .99. The factor scores of these nine factors (local-global, liberal-conservative, judicial, executive, legislative, hierarchic, oligarchic, monarchic, and anarchic) were used in the further analyses.
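A comparable, purely illustrative sketch of the reduction step: factor-analyzing the 11 style scores with a varimax rotation (here via scikit-learn's FactorAnalysis; the DataFrame styles, one column per style and one row per student, is an assumption).

    # Reduce the 11 correlated style scores to a smaller set of rotated factors.
    from sklearn.decomposition import FactorAnalysis

    fa = FactorAnalysis(n_components=9, rotation="varimax", random_state=0)
    factor_scores = fa.fit_transform(styles)   # per-student factor scores
    loadings = fa.components_.T                # rows = styles, columns = factors
    # On data like those described above, the loadings should show bipolar
    # liberal-conservative and global-local factors plus seven single-style factors.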

RESULTS

Based on the results of our previous work (Grigorenko & Sternberg, in press; Sternberg & Grigorenko, 1993, 1995) and in correspondence with the research questions of this study, we formed a set of working hypotheses:

1. We expected to find no association of styles with abilities. That is, we did not expect to find that students with different ability patterns would demonstrate consistent stylistic preferences.

2. We expected that at least some thinking styles would contribute to predictions of overall academic performance, although we did not specify which ones.

3. We did not expect to find any difference in academic performance of matched versus mismatched students of different styles. That is, we did not expect to find any interaction effects between types of instruction (being matched/mismatched) and thinking styles. For example, we did not expect judicial creative students to perform better in the analytical instruction group than did legislative creative students in this group.

4. We expected that students with certain thinking styles would perform better in some forms of evaluation than in others.

We present the results of our study with respect to the formulated research questions.

Styles, Abilities, Gender, and Grade

In the first set of analyses, we tested whether there were any group differences in styles for male versus female students, for students of differing ability patterns, and for students in different school grades. Multivariate analysis of variance was used. None of the main effects or interaction effects was significant. Thus, students' thinking styles did not differ across sex, grade, or ability patterns.

When Abilities Are Taken Into Account, Do Styles Still Predict Academic Performance?

We explored the question of whether thinking styles add anything to the explanatory power of ability measures for predicting academic performance. First, we computed correlations between the academic performance measures, the STAT scores, and the measures of the thinking styles (Table 3). All three STAT components correlated significantly with the assessments of performance across different abilities (the correlations ranged from r = .34, p < .0001, to r = .15, p < .05). Seven out of nine correlations between the STAT components and types of assessments were also significant, but the practical and creative components of the STAT did not correlate significantly with performance on the homework assignments. It is important to note the fairly high correlations between all academic performance measures and scores on the analytical subtest of the STAT. These correlations were stronger than the diagonal correlations between subtests of the STAT and the corresponding performance measures. We explain the presence of these correlations as a reflection of the fact that students' performance on all of the assignments, exams, and the final project was inevitably confounded with the general ability of each student to understand and analyze the theory with which he or she was working. This occurred despite our efforts to formulate the tasks in the manner most suitable for each of the studied abilities. The magnitude of these off-diagonal correlations, however, does not differ significantly from the magnitude of the diagonal correlations. For example, even the largest discrepancy, r = .34 versus r = .17, did not represent a significant difference: z = 1.82, p > .05. In other words, the presence of these off-diagonal correlations does not statistically undermine the discriminant validity of the STAT.
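The comparison of correlation magnitudes mentioned above can be reproduced, at least approximately, with the usual Fisher r-to-z procedure; the sketch below treats the two correlations as independent (a simplification) and assumes N = 199 in both cases.

    # Fisher r-to-z comparison of two correlation coefficients.
    import math

    def fisher_z_compare(r1, r2, n1, n2):
        z1, z2 = math.atanh(r1), math.atanh(r2)          # r-to-z transform
        se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))  # SE of the difference
        return (z1 - z2) / se

    print(round(fisher_z_compare(0.34, 0.17, 199, 199), 2))  # about 1.81,
    # close to the z = 1.82 reported in the text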

TABLE 3 Correlations of Abilities, Academic Performance, and Styles

                                Performance (by ability)        Performance (by assessment)
Item                          Analytic  Creative  Practical    Homework   Exams   Project

Abilities
 STAT-Analytical                .25*      .34*      .34*         .26*      .27*     .31*
 STAT-Creative                  .15*      .21*      .20*         .13       .16*     .20*
 STAT-Practical                 .15*      .17*      .17*         .12       .18*     .15*

Thinking styles (factor scores)
 Legislative                    .17*      .16*      .14          .12       .14*     .17*
 Judicial                       .15*      .20*      .23*         .21*      .18*     .15*
 Executive                     -.15*     -.16*     -.10         -.12      -.07     -.18*
 Monarchic                     -.06      -.06      -.08         -.05      -.04     -.10
 Hierarchic                    -.06      -.16*     -.11          .13       .07      .08
 Oligarchic                    -.06      -.11      -.11         -.12      -.11      .03
 Anarchic                      -.08      -.13      -.12         -.10      -.08     -.12
 Local-Global                  -.07       .04       .09          .05       .01      .12
 Liberal-Conservative           .14       .10       .08          .03       .09      .16

Note: STAT = Sternberg Triarchic Abilities Test, Level H. * p < .05.

Only five styles correlated significantly with measures of performance. Indicators of the judicial style correlated significantly with performance on all tasks and in all assessment settings. In addition, students with the legislative style tended to perform better on both the final projects and the exams and on both the analytical and creative measures. In contrast, the performance of executive students was worse on the final project and on both the creative and analytical tasks. Liberal students tended to do better on the final project, and hierarchic students did better on the creative assignments. Thus, the correlational pattern suggests that the judicial, legislative, and executive styles showed significant associations with academic performance. In particular, students with higher scores on the legislative and judicial thinking styles tended to do better, whereas students who scored high on the executive style tended to do worse, on average.

Among the correlations between the thinking styles measures and the STAT components, only the association between the STAT creative component and the liberal-conservative styles factor was significant (r = .22, p < .005), with liberal students being more creative. This general lack of associations between the styles and the ability measures largely supports our predictions of no differences on thinking styles in groups of students with different ability patterns. Of course, we cannot prove the null hypothesis.

Thus, the simple correlational analyses suggested that students' performance was associated not only with their levels and patterns of abilities, but also with their thinking styles. Moreover, the amount of variance explained by the STAT subtests in the measures of performance was of the same magnitude as the amount of variance explained by some of the styles (e.g., judicial). In addition, the fact that there was only one significant correlation between the STAT measures and the thinking styles suggests that associations of thinking styles with academic performance are independent of the correlations between academic performance and abilities.

To test further the predictive power of styles, we performed a series of multiple-regression analyses. The predicted variables were the measures of performance on tasks requiring the use of various abilities and on tasks in different assessment settings; the predictors were the corresponding STAT component and the five thinking styles that were found to be correlated with the measures of academic performance. Thus, the multiple-regression equations were both theoretically based and informed by the correlational analyses.
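For readers who want the form of these models, the following is a minimal sketch (not the study's code) of one such regression, predicting analytic-task performance from the STAT analytic score and the five style factors; all variable and column names are placeholders.

    # One multiple regression of the kind described: performance regressed on
    # the corresponding ability score plus the correlated style factors.
    import statsmodels.api as sm

    predictors = df[["stat_analytic", "judicial", "legislative",
                     "executive", "liberal_conservative", "hierarchic"]]
    model = sm.OLS(df["analytic_performance"], sm.add_constant(predictors)).fit()
    print(model.summary())   # R-squared corresponds to the "variance explained"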

The independent variables predicted 16% of the variance in the summary measure of performance on the analytic tasks (F = 5.3, p < .0001). The variables that contributed significantly were the STAT analytic component (F = 13.2, p < .0005, B = .18), judicial style (F = 3.6, p < .05, B = .09), legislative style (F = 5.0, p < .05, B = .11), and executive style (F = 3.4, p < .05, B = -.10). For performance on the creative tasks, the predictors explained 15% of the variance (F = 5.0, p < .0001). Significant contributions to the explained variance were from the STAT creative component (F = 7.3, p < .05, B = .14), judicial style (F = 6.8, p < .01, B = .13), and executive style (F = 4.3, p < .05, B = -.11). Finally, the independent variables predicted 13% of the variance in performance on the practical tasks (F = 4.1, p < .001, B = .18). Only two variables contributed significantly: the practical component of the STAT (F = 6.7, p < .01, B = .14) and judicial style (F = 10.6, p < .001, B = .17). Thus, performance on the analytic, creative, and practical tasks depended not only on the level of the corresponding ability, but also on stylistic preferences. In general, the more judicial or legislative a student was, the better his or her performance, whereas the more executive a student was, the worse his or her performance.

The results of the multiple-regression analyses predicting performance on various tasks in different assessment settings were homogeneous. For all three overall dependent measures (exam, homework, and final project evaluations), the analytical component of the STAT was the best predictor (p < .0001 in all three models, B ranging from .24 to .27). The judicial style made a statistically significant contribution in explaining variation in performance on the exams (F = 6.3, p < .01, B = .15), on homework (F = 8.0, p < .005, B = .15), and on the final project (F = 3.3, p < .05, B = .10). Moreover, three other styles contributed significantly to prediction of final project performance: the legislative style (F = 5.1, p < .05, B = .13), the liberal style (F = 3.8, p < .05, B = .12), and the executive style (F = 4.4, p < .05, B = -.13).

Styles and Different Types of Instruction

In the previous analyses, we investigated the predictive power of styles for academic performance over and above abilities across all groups, regardless of the type of instruction. The question for the following set of analyses was whether styles make a difference in academic performance of matched versus mismatched students, that is, whether there is an interaction effect between type of instruction (matched/mismatched) and thinking styles. For example, do judicial creative students perform better in the analytical instruction group than do legislative creative students?

At the first stage of these analyses, we conducted analyses of variance to ensure that there were no accidental clusterings of thinking styles within different instructional groups or within groups of matched and mismatched students. There were not. At the second stage, we performed two series of multiple-regression analyses of academic performance, testing for (a) possible interaction effects between thinking styles and type of instruction, and (b) possible interaction effects between thinking styles and matched/mismatched placement. No main or interaction effects with type of instruction were found. There was an effect of match for the summary score on analytic tasks and homework assignments, with matched students doing better than mismatched students, but no interaction effects with thinking styles were found.

Styles and Different Types of Evaluation

Finally, a last set of analyses was based on the hypothesis that various assessment methods might favor different thinking styles (Sternberg, 1994). Specifically, we suggested that while types of assessments do not differentially benefit students of different ability patterns (e.g., creative students would do as well on the homework assignments as on the exams), various types of evaluation could be differentially beneficial for students of different styles (e.g., a judicial student would do better on the exams than on the final project).

To perform these analyses, we recoded the thinking-styles scores. We analyzed the distribution of thinking-styles scores in the sample and adopted a cut-point of 1.5 standard deviations to detect the 10%-15% of the students who scored the highest on each of the 11 styles (we used raw scores on thinking styles in these analyses). (Due to the limited size of our sample, we could not adopt the traditional, more conservative cut-off of 2 standard deviations above the mean.)
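The recoding can be summarized by a short sketch (illustrative only; styles_raw is a hypothetical DataFrame of raw thinking-styles scores).

    # Flag students scoring at least 1.5 SD above the sample mean on a style.
    def flag_high(scores, cut=1.5):
        z = (scores - scores.mean()) / scores.std()
        return z >= cut                 # boolean: True = "high" on this style

    high_judicial = flag_high(styles_raw["judicial"])
    print(high_judicial.mean())         # proportion flagged; roughly 10%-15% here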

Students above this cut-point were considered to be "high" on a given style. Then we carried out analyses of variance to test whether members of the high groups performed better in particular assessment settings than in others. Table 4 shows the results of these analyses. The table shows that there is a significant difference in the performance of highly judicial, highly liberal, and highly oligarchic thinkers versus all other students on exams, of judicial thinkers versus other students on homework, and of executive and anarchic thinkers versus other students on the final project. Moreover, certain combinations of styles (e.g., judicial and global) tend to enhance performance in different assessment situations.

TABLE 4 Styles and Types of Assessment

Exams
 Judicial                  F(1, 186) = 3.3, p < .05
 Liberal                   F(1, 186) = 3.3, p < .05
 Oligarchic                F(1, 186) = 4.3, p < .05
 Legislative × Global      F(3, 186) = 3.6, p < .01
 Judicial × Global         F(3, 184) = 2.3, p < .05
 Judicial × Hierarchic     F(3, 186) = 4.5, p < .005

Homework Assignments
 Judicial                  F(1, 186) = 5.8, p < .05
 Judicial × Hierarchic     F(3, 186) = 2.5, p < .05

Final Paper
 Executive                 F(1, 186) = 4.1, p < .05
 Anarchic                  F(1, 186) = 4.4, p < .05
 Legislative × Local       F(3, 186) = 3.6, p < .05
 Legislative × Liberal     F(3, 186) = 3.6, p < .05
 Legislative × Anarchic    F(3, 186) = 2.8, p < .05
 Executive × Global        F(1, 186) = 4.9, p < .005
 Executive × Conservative  F(1, 186) = 4.1, p < .01
 Executive × Anarchic      F(1, 186) = 2.9, p < .05

Note: The table shows only the styles and the combinations of styles that yielded statistically significant results. The significance values are corrected for multiple comparisons by adjusting p values.

The patterns of the means suggest that the exam format was most favorable for judicial thinkers (μ = .39 in the group of highly judicial students versus μ = .03 in the group of other students) and for thinkers of other styles who were also high on the judicial style. This format of assessment was least beneficial for legislative/global thinkers (μ = -2.11 in the group of legislative-global thinkers versus μ = -.04 or higher in the other groups), and it was disadvantageous as well for oligarchic students (μ = -.44 in the group of oligarchic thinkers versus μ = .02 in the group of other students).

The format of the independent final project was the least beneficial for executive students (μ = -.44 among executive thinkers versus μ = .07 in the group of all other students) and, moreover, for students with virtually all combinations of other styles with the executive style. In addition, anarchic students tended to do worse than did all other students combined (μ = -.37 versus μ = .08, respectively). The independent project was the most favorable for legislative/local (μ = 2.30 versus μ = .29 or lower in other groups), legislative/anarchic (μ = .51 versus μ = .05 or lower in other groups), and legislative/liberal (μ = 2.28 versus μ = .29 or lower in other groups) thinkers.

Homework assignments were quite variable, and, in addition, there were few constraints on how they had to be done; for example, a student could request a consultation with a teaching fellow, or a group of students could complete the assignments together. Thus, this particular type of assessment did not appear to be consistently beneficial for students with any particular stylistic preferences, except for judicial thinkers (μ = .40 versus μ = -.04 in the group of all other students).

DISCUSSION

This study attempted to investigate patterns of thinking styles in a group of gifted children. We obtained the following results, in summary:

* There are no differences in thinking styles between groups of students with different ability patterns.

* Certain thinking styles contribute significantly to prediction of academic performance.

* The degree of this contribution is not affected by the type of instruction students are given.

* Students with particular thinking styles do better in some forms of evaluation than in others.

A few interesting conclusions can be drawn from the results. First, we saw no distinct patterns of particular thinking styles among the students by abilities, gender, or grade. These results are similar to those of our previous findings in a sample of unselected (nongifted) students in four different schools (Sternberg & Grigorenko, 1995). Thus, in two independent groups of students, we found no direct links between styles and abilities. A variety of styles can be associated with high levels of ability. Moreover, various types of abilities can be associated with a given style.

The finding of no difference in profiles of styles in males versus females is also not a surprising one: Previously, we also found no association between students' gender and their styles (Sternberg & Grigorenko, 1995). A lack of grade differences in styles might be explained by the fact that in both studies we worked primarily with high school students, limiting the age range of participants to 12-17 years of age. We intentionally limited our samples to this age range because we used self-report questionnaires requiring a certain level of metacognitive capacity to reflect stable patterns of preferences and behaviors (Schwab-Stone, Fallon, Briggs, & Crowther, 1994). However, the theory of mental self-government assumes the presence of developmental changes in stylistic preferences; future studies, implementing measures other than self-report ones, might show significant age/grade effects.

In our larger-scale study (Sternberg & Clinkenbeard, 1995; Sternberg et al., in press), we found a significant instruction-by-ability interaction and showed that performance of students with a certain ability pattern was higher on the tasks corresponding to their abilities if these students were placed in a matching instructional group (e.g., analytical students placed in groups with analytical instruction did better on the analytical tasks than did other students). In this study, due to the study design, we had a chance to investigate how, if at all, thinking styles modify the relationships between ability type and mode of instruction. We discovered no significant buffering effects that could have been attributed to styles. For example, creative students placed in instructional groups that did not match their ability did not differ in their performance when their styles were taken into account. These findings, however, should be interpreted with caution: The whole program took only 4 weeks, and it is possible that there simply was not enough time for stylistic differences to manifest themselves.

Further, in the present study, we had a unique opportunity to evaluate the contribution various thinking styles made to students' performance in a given situation, when these evaluations were conducted by psychologists who did not meet the students in person, but judged their performance on the basis of their writing. Of course, a valid argument could be made that such a type of evaluation is biased in favor of students with a high level of writing skill. Yet, taking into account this drawback of our evaluation system, we think that these assessments were less subjective than traditional school teachers' grades, and therefore provided us with an opportunity to test whether thinking styles predict students' performance over and above the students' abilities. This aspect of the design was especially interesting because in our previous research we showed that teachers tend to overestimate the extent to which their students share their own styles, and that students in fact receive higher grades and more favorable evaluations when their styles more closely match those of their teachers (Sternberg & Grigorenko, 1995). Thus, we explored the opportunity to separate out teacher-dependent variance in performance assessment and to study the "purer" contributions of ability and styles.

We found that students' performance was associated not only with their levels and types of abilities, but also with at least three thinking styles (judicial, executive, and legislative). The highest predictive power was demonstrated by the judicial style. In interpreting these results, however, we should note that a significant portion of the YSPP academic activities was based on analytical work, which involved comparing, contrasting, and evaluating. Even though special "create-and-implement" tasks were designed to benefit creative and practical students, these tasks were based on material that needed to be critically evaluated by a student. Thus, even the creative and practical tasks in our program involved a significant amount of analytical effort. We might have found a different "most favorable style" had the same study been carried out in an art school or a scouting program. However, the most general conclusion remains the same: Styles add to our understanding of students' performance, and therefore should be taken into consideration in school settings.

Yet one more illustration of the importance of styles came from the last, and probably most interesting, part of the study, where we compared the performance of students in the "high" and "other" groups for a particular style. In these analyses, we detected a number of interaction effects between various styles and different assessment procedures. We found that the examination format was most beneficial for judicial students, whereas the final projects favored legislative students and disadvantaged executive students. Although our sample was not large enough to study interactions between various styles in more detail, the results clearly suggest that different types of assessment benefit different types of thinkers (see also Sternberg, 1994). The take-home message of these analyses is that styles do matter, and teachers should systematically vary their assessments to meet the needs of a larger number of students.

These latter analyses are of practical value. A teacher of gifted students should try to assess the students' performance by using an array of different assessment procedures. Independent of their patterns of abilities, gifted students of different thinking styles tend to perform better when the method of assessment matches their thinking styles. Our results, obtained in this and other studies, suggest that short-answer/multiple-choice items appear to be most beneficial for executive, local, judicial, and hierarchic thinkers. Macroanalytic essays are advantageous for judicial-global students, whereas microanalytic essays favor judicial-local thinkers. Timed assignments may be beneficial for hierarchic students and detrimental for anarchic ones. Monarchic students tend to shine when the assignment requires high commitment, whereas oligarchic students are able to divide their resources equally among a number of tasks. Open-ended assignments, projects, and portfolios will benefit legislative thinkers and may frustrate executive ones.

In summary, the diversity of styles among students implies that students need a variety of means of assessment in order to demonstrate their talents and achievements to the fullest extent.

REFERENCES

Allport, G. (1937). Personality: A psychological interpretation. New York: Holt.

Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56, 81-105.

Carroll, J. B. (1993). Human cognitive abilities: A survey of factor-analytic studies. New York: Cambridge University Press.

Curry, L. (1983). An organization of learning styles theory and constructs. (ERIC Document Reproduction Service No. ED 235 185).

Dunn, R., & Dunn, K. (1978). Teaching students through their individual learning styles. Reston, VA: Reston Publishing.

Fowler, W. (1977). Sequence and styles in cognitive development. In F. Weizmann & I. Uzgiris (Eds.), The structuring of experience (pp. 265-295). New York: Plenum Press.

Fowler, W. (1980). Cognitive differentiation and developmental learning. In H. Reese & L. Lipsitt (Eds.), Advances in child development and behavior, Vol. 15 (pp. 163-206). New York: Academic Press.

Gardner, H. (1983). Frames of mind: The theory of multiple intelligences. New York: Basic Books.

Gardner, R. (1953). Cognitive style in categorizing behavior. Perceptual and Motor Skills, 22, 214-233.

Garner, W. R., Hake, H. W., & Eriksen, C. W. (1956). Operationism and the concept of perception. Psychological Review, 63, 149-159.

Gregorc, A. F. (1984). Style as a symptom: A phenomenological perspective. Theory Into Practice, 23, 51-55.

Grigorenko, E. L., & Sternberg, R. J. (1995). Thinking styles. In D. H. Saklofske & M. Zeidner (Eds.), International handbook of personality and intelligence (pp. 205-229). New York: Plenum Press.

Grigorenko, E. L., & Sternberg, R. J. (in press). Styles of thinking in school settings. Vestnik Moskovskogo Universiteta. Seria 14. Psikhologia.

Guilford, J. P. (1967). The nature of human intelligence. New York: McGraw-Hill.

Holland, J. L. (1973). Making vocational choices: A theory of careers. Englewood Cliffs, NJ: Prentice-Hall.

Horn, J. L. (1994). Theory of fluid and crystallized intelligence. In R. J. Sternberg (Ed.), The encyclopedia of human intelligence. Vol. 1 (pp. 443-451). New York: Macmillan.

Jung, C. (1923). Psychological types. New York: Harcourt Brace.

Kagan, J. (1958). The concept of identification. Psychological Review, 65, 296-305.

Kagan, J., & Kogan, N. (1970). Individual variation in cognitive processes. In P. A. Mussen (Ed.), Carmichael's manual of child psychology, Vol. 1 (pp. 1273-1365). New York: Wiley.

Klein, G. S. (1954). Need and regulation. In M. R. Jones (Ed.), Nebraska symposium on motivation (pp. 474-505). Lincoln: University of Nebraska Press.

Kogan, N. (1983). Stylistic variation in childhood and adolescence: Creativity, metaphor, and cognitive style. In P. H. Mussen (Ed.), Handbook of child psychology, Vol. 3 (pp. 630-706). New York: Wiley.

Kolb, D. A. (1974). On management and the learning process. In D. A. Kolb, I. M. Rubin, & J. M. McIntyre (Eds.), Organizational psychology (pp. 239-252). Englewood Cliffs, NJ: Prentice-Hall.

Martin, M. (1989). Mind as mental self-government: Construct validation of a theory of intellectual styles. Unpublished manuscript, Yale University, New Haven, Connecticut.

Miller, A. (1987). Cognitive styles: An integrated model. Educational Psychology, 7, 251-268.

Miller, A. (1991). Personality types, learning styles and educational goals. Educational Psychology, 11, 217-238.

Myers, I. B., & Myers, P. B. (1980). Gifts differing. Palo Alto, CA: Consulting Psychologists Press.

Renzulli, J. S., & Smith, L. H. (1978). Learning styles inventory. Mansfield Center, CT: Creative Learning Press.

Riding, R., & Cheema, I. (1991). Cognitive styles: An overview and integration. Educational Psychology, 11, 193-215.

Royce, J., & Powell, A. (1983). Theory of personality and individual differences: Factors, systems and process. Englewood Cliffs, NJ: Prentice-Hall.

Santostefano, S. (1986). Cognitive controls, metaphors and contexts: An approach to cognition and emotion. In D. Bearison & H. Zimiles (Eds.), Thought and emotion: Developmental perspectives (pp. 217-238). Hillsdale, NJ: Lawrence Erlbaum.

Schwab-Stone, M., Fallon, T., Briggs, M., & Crowther, B. (1994). Reliability of diagnostic reporting for children 6-11 years: A test-retest study of the Revised Diagnostic Schedule for Children. The American Journal of Psychiatry, 151, 1048-1054.

Spearman, C. (1927). The abilities of man. New York: Macmillan.

Sternberg, R. J. (1985). Beyond IQ: A triarchic theory of human intelligence. New York: Cambridge University Press.

Sternberg, R. J. (1986). Intelligence applied: Understanding and increasing your intellectual skills. San Diego: Harcourt Brace.

Sternberg, R. J. (1988a). Mental self-government: A theory of intellectual styles and their development. Human Development, 31, 197-224.

Sternberg, R. J. (1988b). The triarchic mind: A new theory of human intelligence. New York: Viking.

Sternberg, R. J. (1990). Thinking styles: Keys to understanding student performance. Phi Delta Kappan, 71, 366-371.

Sternberg, R. J. (1993). Sternberg Triarchic Abilities Test. Unpublished test.

Sternberg, R. J. (1994). Allowing for thinking styles. Educational Leadership, 52(3), 36-39.

Sternberg, R. J. (1995). In search of the human mind. Orlando, FL: Harcourt Brace.

Sternberg, R. J., & Clinkenbeard, P. (1995). A triarchic view of identifying, teaching, and assessing gifted children. Roeper Review, 17, 225-260.

Sternberg, R. J., Ferrari, M., Clinkenbeard, P., & Grigorenko, E. L. (in press). Identification, instruction, and assessment of gifted children: A construct validation of a triarchic model. Gifted Child Quarterly.

Sternberg, R. J., & Grigorenko, E. L. (1993). Thinking styles and the gifted. Roeper Review, 16, 122-130.

Sternberg, R. J., & Grigorenko, E. L. (1995). Styles of thinking in the school. European Journal of High Ability, 6(2), 1-19.

Thurstone, L. L. (1938). Primary mental abilities. Chicago: University of Chicago Press.

Vernon, P. (1973). Multivariate approaches to the study of cognitive styles. In J. R. Royce (Ed.), Contributions of multivariate analysis to psychological theory (pp. 139-157). London: Academic Press.

Witkin, H. A. (1973). The role of cognitive style in academic performance and in teacher-student relations. Unpublished report, Educational Testing Service, Princeton, New Jersey.

Witkin, H. A., Oltman, P. K., Raskin, E., & Karp, S. A. (1971). Embedded Figures Test, Children's Embedded Figures Test, Group Embedded Figures Test Manual. Palo Alto, CA: Consulting Psychologists Press.

ABOUT THE AUTHORS

ELENA L. GRIGORENKO, Associate Research Scientist, Department of Psychology and Child Study Center; and ROBERT J. STERNBERG, IBM Professor of Psychology and Education, Department of Psychology, Yale University, New Haven, Connecticut.

This research was supported under the Javits Act Program (Grant #R206R50001) as administered by the Office of Educational Research and Improvement of the U.S. Department of Education. Grantees undertaking such projects are encouraged to express their professional judgments freely. This article, therefore, does not necessarily represent positions or policies of the government, and no official endorsement should be inferred.

We are grateful to Pamela Clinkenbeard and Michel Ferrari for their assistance in data collection.

Copies of the instruments may be obtained at cost from the authors. Address requests for reprints to Robert J. Sternberg, Department of Psychology, Yale University, Box 208205, New Haven, CT 06520-8205 (E-mail: [email protected]).

Manuscript received January 1996; revision accepted April 1996.