Ed Tech Cases Portal
Exemplary Technology-Supported Schooling Cases in the USA
Methodology of the Exemplary Technology-Supported Schooling Case Studies Project
Ronald E. Anderson, University of Minnesota and Sara Dexter, University of Nevada, Las Vegas

The goals of the “Exemplary Technology-Supported-Schooling Case Studies" project were (1) to identify the most exemplary schools in the United States with innovative classrooms that incorporated technology to improve learning in a major way, and (2) to understand the programs of educational improvement underlying the innovations, including the specification of success factors. To accomplish these objectives, appropriate methodological strategies and procedures were selected and refined.

This project investigated innovative cases in the United States, but the research procedures were coordinated with two major international studies. One is the OECD's Organizational Case Studies project and the other is the IEA's SITES-M2 (Second Information Technology in Education Study, Module 2). Over 30 countries have been participating in one or both of these studies, each one conducting case studies similar to ours, but with each country determining its own case selection criteria. Selected case results from this project were generated and prepared for both international projects.

Prior national surveys (e.g., Anderson, 1993; Becker, 1999) have documented the evolution of information technology in American schools. Such studies even analyzed the relationship between information technology and critical elements in the educational enterprise, such as teacher pedagogical beliefs and practices, school-wide staff development and teacher support systems, and the school's decision-making practices and organization. But research is needed that goes beyond these surveys to examine detailed portraits of innovative pedagogical practices supported with technology. By collecting data about various school contexts, research could help us to understand better what contextual factors are most crucial for creating and sustaining an entire school environment where most of the teachers are exemplary in their uses of technology.

Case Study Methodology

We selected case study methods that made it possible to combine qualitative and quantitative approaches and to balance the competing considerations inherent in the international studies our study was intended to support. A case study is an exploration of a bounded system over time through detailed, in-depth data collection involving multiple sources of information rich in context (Creswell, 1998). It can be exploratory, descriptive, or explanatory (Yin, 1994), and we drew on methods of all three approaches. Across the spectrum of case study methodologies, the common focus is on “the case,” its context, and a commonly used set of techniques. A case may be bounded by a time period, an organizational structure, or a set of events.

The primary focus of our case studies was the people, actions, and contextual conditions linked to the outcomes. Among conditions that may be important to the success of the practice are the ways technology is used by teachers and students; how this application enables and/or draws on associated pedagogy or curricula; the kinds of skills, training, and/or technical support that the teachers seemed to need to implement the application in this way; and the policies, norms, and cultural conventions that supported these practices. Our procedures were designed to investigate the role of these and many more possible conditions. In the analysis of each case we looked for evidence that one or more of these factors made a difference.

The identification of such relationships can promote the improvement of practice. They can provide practitioners and policy makers with a menu of practices that they can assemble for a particular design or situation, anticipating how these assembled causal relationships might interact with each other in ways that advance the intended outcomes. The beauty of the case study approach is that these causal relationships can retain their contextual nuance; they can be viewed within the “cloud of correlated events” (Scarr, 1985; Salomon, 1993) of the particular case in which they were identified.

Our main approach was that of an instrumental case study, in which the analysis focuses on underlying issues, relationships, and causes that may generalize beyond the case (Stake, 1995). Analysis was done at the level of both single cases and multiple cases. For the latter, cross-case analyses were conducted to identify themes that unite and/or distinguish the cases. The conclusions, or more accurately "assertions," were validated through the triangulation of findings across various data sources (Stake, 1995; Miles & Huberman, 1994).

Case studies are not new to the study of innovations; for example, Huberman and Miles (1984) conducted case studies of educational innovations in math and science two decades ago. Recent examples of important case studies of information technology innovations in learning and teaching are Schofield and Davidson (2002) and Means, Penuel, and Padilla (2001). Our study departs from these latter studies in that it is limited to sites that were exemplary. Our study was further challenged by its links to large international comparative research projects. We know of no other study that simultaneously participated in two large international studies, conforming to both sets of required procedures. But what was in many ways a burden yielded a remarkable opportunity for comparative analysis and exploration.
Site Selection

To help us select sites that best fulfilled study objectives while remaining within the constraints of the OECD and IEA studies, the first task was to specify our site selection criteria. The specification of criteria for site selection was a long process that involved extensive discourse with researchers from the two international studies, a special Advisory Committee, and staff of the National Science Foundation and the U. S. Department of Education. For purposes of the U.S. project, we began with the criteria developed by the "Expert Panel on Educational Technology," which was sponsored by the Secretary of Education in 1999 to make awards to the most exemplary "learning programs."

The final six criteria that were used for selecting the sites were as follows: (1) a majority of teachers at the public school are engaged in a school-wide reform or school improvement; (2) a majority of teachers are engaged in an innovative, technology-supported pedagogical practice; (3) the school is committed to meeting high content standards in core subjects; (4) the students are drawn from diverse backgrounds including a number of low income students; (5) the reform effort and the innovative technology-supported teaching practices appear to be sustainable and transferable; and (6) there is compelling evidence that the reform effort and the innovative technology-supported teaching practices have resulted in educationally significant outcomes or gains for the students involved.
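The screening logic described above, in which a site advanced only if the interviews supported all six criteria, can be sketched as a simple checklist. This is an illustrative reconstruction, not code used in the study; the criterion summaries paraphrase the report and the function and variable names are invented.

```python
# Hypothetical sketch of the six-criteria screening checklist.
# The criterion summaries paraphrase the report; the code itself
# is illustrative only and was not part of the original study.

CRITERIA = [
    "Majority of teachers engaged in school-wide reform or improvement",
    "Majority of teachers engaged in innovative, technology-supported practice",
    "School committed to high content standards in core subjects",
    "Students drawn from diverse backgrounds, including low-income students",
    "Reform and practices appear sustainable and transferable",
    "Compelling evidence of educationally significant student outcomes",
]

def screen_site(ratings):
    """Return True only if every criterion is judged to be met.

    `ratings` maps a criterion index (0-5) to the boolean judgment
    reached after the telephone interviews.
    """
    return all(ratings.get(i, False) for i in range(len(CRITERIA)))

# Example: a site judged to fail criterion 2 (technology use not widespread)
judgments = {0: True, 1: False, 2: True, 3: True, 4: True, 5: True}
print(screen_site(judgments))  # False
```

In practice, of course, the judgments were reached through triangulated interviews rather than a mechanical checklist; the sketch only captures the "all six must hold" decision rule.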

We began the site search by sending a solicitation letter to the state technology directors in all fifty states. The letter was drafted and sent by Linda Roberts, Director of the Office of Technology, U. S. Department of Education. Any state technology directors who had not responded by June were called, and in many cases another copy of the letter was faxed to them. Nominations of districts and/or schools were received from 35 states.

Concurrently, nominations were solicited from numerous other sources. Flyers asking for nominations were distributed by University of Minnesota and NCREL staff at all three "Evaluating the Effectiveness of Technology" conferences (held in April, May, and June of 2000). In addition, nominations were received from U. S. Department of Education staff including Linda Roberts, Jenelle Leonard, Judy Segal, Sharon Horn, and Diane Reed. The staff of the Center for Technology in Learning at SRI International, including Robert Kozma and Barbara Means, provided names of schools and districts as well. By August the list of nominations had been narrowed down to about 20 schools, and we sent this list to our project Advisory Committee, asking them to evaluate each candidate site and to nominate any other schools or districts that they felt met the criteria. In December we sent another list of the 11 "finalists," asking for another round of opinions.

Another source of nominations came from directly contacting representatives of school reform programs and projects known to have a major technology component. We began with the projects designated by the Secretary of Education's Expert Panel on Educational Technology. This Panel worked for two years reviewing over 125 applications for status as promising or exemplary with respect to educational technology. In September two educational technology programs were awarded "exemplary" status and five were awarded "promising" status. Nominations were solicited from numerous additional programs emphasizing educational technology including the following: Carnegie Learning, Edison Schools, NetSchools, New American Schools, the IMMEX project, Children Connecting Classrooms Community Curriculum (C5), Challenge 2000 Multimedia Project, One Sky, Many Voices, Apple Classroom of Tomorrow (ACOT), Schools for Thought, Lightspan, and Co-nect. In most instances we received one or more nominations from each of these projects.

As a result of this process we received explicit nominations for nearly a hundred school districts and approximately 125 schools. Nominated districts were contacted for the name of a specific school to contact. The individual schools nominated by districts were fairly evenly divided among elementary, middle, and high schools.

Numerous requests for information were sent to all schools nominated, but not all responded. To gather sufficient information on the 50-some schools in our database that did respond, the University of Minnesota team placed many telephone calls. Schools are busy places, and rarely did one telephone contact result in the completion of an interview that yielded significant additional information. For example, during one month, we completed over 90 calls and sent over 50 faxes and e-mails to possible candidate sites. After reaching a school we attempted to conduct a telephone interview with the principal or a technology coordinator. We supplemented the interview with any information available on the Web. If all this information indicated that the school might meet our selection criteria, we attempted to interview a teacher involved in the technology reform activity. Each telephone interview ranged from 45 to 60 minutes in length and included supporting questions for each of the six criteria.

The six selection criteria provided the foundation that framed the telephone interview questions. A very important part of the selection process was the use of interviews to gather essential information about the school-wide reform and use of technology from district administrators, technology leaders, and classroom teachers. To determine the match between the six site selection criteria and a site’s characteristics, we crafted a number of relevant questions that allowed the researcher to probe how widespread and embedded a school improvement effort was, and the extent to which technology was integral to that improvement effort.

For example, to gather support for criterion 1 (a majority of teachers at the school are engaged in a school-wide reform or school improvement), interviewees were asked to “describe the major school reform or improvement efforts at the school.” This question was followed by several probing questions seeking additional details about the school-wide reform effort. During each telephone interview we sought to ascertain the congruence between the various interviewees’ beliefs about the stated reform and the actual practices of district administrators, classroom teachers, and students. For example, according to one school administrator’s report of the various reform efforts and use of technology at his district’s school, the school met the study criteria; but we did not select it as a study site on the basis of two additional follow-up interviews with classroom teachers, which attempted to triangulate the teachers’ descriptions of reform and their actual classroom practice with the administrator’s statements. These interviews revealed that a couple of “maverick” teachers at the high school were doing interesting things with technology, but that technology use was not widespread, the school was re-framing its reform efforts, and there were only 20 computers in a single computer lab for the entire K-12 school of over 400 students. There also appeared to be a disconnect between teachers’ and school administrators’ views of what constituted school-wide reform. This lack of congruence across interviews concerning the details of the school-wide reform and how technology was used resulted in the candidate site not being selected.
In conducting each candidate site interview, it was imperative that the researcher pay attention to the details given about school-wide reform efforts and technology use in order to formulate additional probing questions that would facilitate identification of incongruence, or of salient details about the infrastructure that the interviewees might not have explicitly stated.

Upon receiving the initial screening call, some candidate sites immediately refused to participate and were dropped from our database. These schools gave various reasons for declining, including that (1) they felt they had been overly studied by other research groups, (2) district priorities would not allow them time to get involved with outside research, (3) the principal and/or district administrators perceived the amount of time required to participate in the study as too demanding, or (4) the district was undergoing restructuring and was not willing to participate. We dropped other sites from further consideration for a variety of reasons, including that (1) the site did not meet all the selection criteria, (2) there was a lack of consonance across interviewees about the details of school-wide reform and/or use of technology, (3) the site’s technology-related programs were being re-evaluated and/or redesigned and undergoing substantial change, or (4) the school’s technology-related programs were in the beginning stages of implementation and thus too new for our study’s purposes.

The site selection process was lengthy and included input from a variety of sources. Groundwork for the decision on each site included many days spent interviewing key personnel at candidate study sites, examining all of the program documents and evaluation reports available, and reviewing relevant websites. These data were compiled and analyzed, and the leading candidates for inclusion were then discussed with our Advisory Committee and researchers at the U. S. Department of Education. Decisions to select a site for inclusion were not made all at once but were generally made 2 or 3 at a time over a period of about 24 months. This gave us the advantage of actually conducting the data collection on-site for some schools before making the final decision on other schools.
Data Collection and Analysis

Each site visit involved a team of two researchers working at the school site for 5 days. These 5 days were used for conducting interviews with the principal, one or more technology coordinators, other administrators relevant to the technology reform program, 4 to 6 teachers, several students in these teachers' classrooms, and several parents of these students. In addition, at each site 2 to 4 classrooms were systematically observed by the researchers. All interviews were recorded and most were videotaped. The classroom observation periods were videotaped with one or two cameras.

While the study was administered by the University of Minnesota and the Minnesota staff coordinated the project, SRI International assisted with the data collection and analysis of the data. Both research teams followed the same procedures, but there were two separate human subjects review processes by the respective Institutional Review Boards (IRBs). In all cases, written (signed) consent was obtained from all participants prior to interviewing, classroom observation, and audio or video recording. The consent process informed all participants that their responses were completely voluntary and all data were confidential unless they waived their right to retain total confidentiality. This applied to both the schools and the staff, students, and others within the school.

As of mid-2003, all of the schools' principals except one had given us written permission to use their school name and to identify the names of staff members in writing up the case reports. The nature of the study gave most participants little concern for privacy or confidentiality of information pertaining to their work or conversations. In fact, many felt honored to be involved in a school activity that was in some sense designated as exemplary.

As soon as each site visit was completed, the interviews were transcribed into document files. The text segments in these files were then coded according to the coding scheme given in Appendix 1 and described in the next section. Site documents were logged and filed for analyses and reference.

Analysis of Data

All interview transcripts and documents were analyzed with a structured coding scheme that was derived from the conceptual frameworks for the study. This scheme contained seven main coding areas. (The full coding scheme is given in Appendix 1.) The first category concerned the innovation or reform itself and was designed to capture information about the technology-supported school-wide innovation or improvement, the history and scope of the innovation, including its goals and origin, the curricular/subject areas involved and its instructional organization. This allowed us to compare reforms on the basis of their purpose and intent to improve the quality of instruction. A second code area was about the school itself and allowed us to organize information about the site, including background information on and the demographics of the school and its community. With this code we also tagged pertinent information about the school culture, its leadership, and any external relationships the school established to aid their technology implementation. This group of codes allowed us to capture relevant meso-level information about the school’s setting and how together they helped to create a favorable context for the classroom uses of technology.

Another set of codes focused on the technology and the technology support present at the site. These codes supported our analysis of the vision for technology, the specifics of what each site had put into place, and how it kept the technology working and teachers prepared for its use. The next two sets of codes focused on students and teachers and their roles, practices, and outcomes. Together, these codes supported the description and analysis of the classroom-based teaching and learning with technology. The final two sets of codes allowed us to capture the elements of the site that contributed to the sustainability and transferability of its innovation. We differentiated between elements of the innovation itself and the classroom, school, and district components. These two codes were often applied as a secondary code alongside some other pertinent code.

As each of the seven categories was divided into several subcategories or codes, the total number of codes or "nodes" was 36. There were a total of 162 separate documents in the analysis. These documents included interview transcripts, observation reports, reports from the school or its website, and curriculum statements. As these documents were all in digital form, they were all included in the analysis.

Each team of two researchers divided up the interviews to code; codes were assigned to sections of transcripts with the qualitative analysis program NVivo (NUD*IST Vivo). This program allows a text segment of any length to be coded with as many codes as the analyst sees fit to apply. After all coding was complete, NVivo was used to gather all text segments from a site's transcripts into a report for each code. These reports were then analyzed to determine the main points and themes within each code area. These points provided the basis for the conclusions reported in the other multicase reports.
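The core retrieval operation described above, in which every segment filed under a given code is gathered into one report, can be sketched in a few lines. This is a minimal illustration of the idea, not NVivo itself; the segment texts and code names below are invented examples.

```python
from collections import defaultdict

# Minimal sketch of the coding-and-retrieval step performed in NVivo:
# each analyst attaches one or more codes to a text segment, and the
# software later gathers every segment filed under a given code into a
# single per-code report. Segments and code names here are invented.

coded_segments = [
    ("Teachers meet weekly to plan technology lessons.",
     {"professional_development", "school_culture"}),
    ("Each student checks out a wireless laptop daily.",
     {"technology_infrastructure"}),
    ("The principal reallocated funds for training.",
     {"leadership", "professional_development"}),
]

def gather_by_code(segments):
    """Group text segments by code, mirroring a per-code node report."""
    report = defaultdict(list)
    for text, codes in segments:
        for code in codes:
            report[code].append(text)
    return dict(report)

reports = gather_by_code(coded_segments)
print(len(reports["professional_development"]))  # 2
```

A segment tagged with several codes simply appears in several reports, which is what lets a single interview passage inform, say, both the leadership and the professional development analyses.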
Summary of the Cases

Table 1 below lists the schools and several demographic characteristics for each. There were four elementary schools, three middle schools, and three senior high schools. One middle school was quite large with over 1,300 students, and one senior high was small with only 240 students. Otherwise, the schools tended to be somewhat average or typical in size. Newsome Park Elementary and New Tech High were magnet schools and only about 5 years old at the time of the data collection. The remaining schools were older, more established schools. Four schools were located in sizable urban areas, five in suburban communities, one in a small town, and one was a virtual school.

There was considerable variation in the racial diversity and family poverty of the schools. Three schools had relatively little diversity and poverty: Frontier Elementary, Mantua Elementary, and Mountain Middle. Six schools had 60% or greater racial minority enrollment and very high poverty levels. About half of New Tech High's students were minority, and because the school did not have a lunch program we were unable to obtain the percentage of students receiving free and reduced-cost lunch. However, the staff told us that the students came from highly diverse economic backgrounds.

Table 1. Demographic Information for Each School

(The original table reported, for each school, the grades served, size of place, percent minority, and percent poverty+; the per-school values are not preserved in this copy.)

Newsome Park Elementary
Canutillo Elementary
Mantua Elementary
Frontier Elementary
Lemon Grove Middle School
Jennings Junior High School
The Mott Hall School
Mountain Middle*
Emerson High School
New Tech High School
The Virtual High School

+Poverty indicator was percent of students eligible for free or reduced cost lunch.
*Indicates school name is a pseudonym.

Overview of the Reforms

Table 2 summarizes, in a phrase for each school, the innovation or school reform investigated.

Table 2. The Schools and the Innovation Studied

School: Reform/Innovation
Newsome Park Elementary School: Project learning using wireless laptops
Canutillo Elementary School: Constructivist learning, supported by technology
Mantua Elementary School: "Basic School" vision powered by technology
Frontier Elementary School: Integrated curriculum, extended school year, and technology focus
Lemon Grove Middle School: "Thin client" system and academic performance
Jennings Junior High School: Inquiry-based, technology-integrated lessons
The Mott Hall School: Laptops for all students and staff
Mountain Middle School*: Technology to support standards-based achievement
Emerson High School: Integration of technology with whole-language curricular reform
New Tech High School: High-tech preparation for a high-tech world
The Virtual High School: Production and online delivery of elective courses within a consortium of schools

*Indicates school name is a pseudonym.

Before discussing these innovative reforms, we would note that while there was considerable diversity in the types of technology-supported programs, the schools were somewhat similar in that they tended to have an average to high density of computers as measured by their student-computer ratio. For example, two schools, Newsome Park Elementary and Mountain Middle, had a student-computer ratio of four or five students per computer, which is approximately the national average. Two of the schools, Mantua Elementary and New Tech High, had a computer for every student, i.e., a one-to-one ratio.
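The density measure used above is simple arithmetic, but it is worth being explicit about its direction: a lower ratio means more computers per student. A short illustration (with invented enrollment and computer counts) follows.

```python
# Illustrative arithmetic for the student-computer ratio discussed above.
# Enrollment and computer counts below are invented for the example;
# a LOWER ratio means a HIGHER density of computers.

def student_computer_ratio(students, computers):
    """Students per computer; 1.0 is a one-to-one school."""
    return students / computers

# A school near the national average cited in the report
# (roughly four to five students per computer)
print(round(student_computer_ratio(500, 110), 1))  # 4.5

# A one-to-one school, such as Mantua Elementary or New Tech High
print(student_computer_ratio(240, 240))  # 1.0
```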

The degree of teacher participation varied across school sites, although in general it was quite high. Of course, there are different types and levels of participation, but in about half of the schools all (100%) of the teachers were participating at a noteworthy level. In most of the remaining schools at least 75% of the teachers were participating.

The school-wide reform of the first school, Newsome Park Elementary, was "project-based learning using wireless laptop computers." This strategy was supported by an intensive 45-hour technology-based professional development program in which 38 of the 40 teachers had participated. The reform program included a variety of software packages and learning activities for the teachers and students to use.

Canutillo Elementary is a medium-sized rural school that serves a majority (94%) Hispanic student population. In addition, all (100%) of its students qualify for free and reduced lunch. The main reform effort's focus is to use technology as a tool integrally embedded in the school's reading improvement program. In addition to the regular school year reading reform effort, students in grades K-6 have the opportunity to further develop their reading and technology skills through a summer "Reading Renaissance Camp."

Mantua Elementary called itself a "basic school powered by technology." This approach was derived from their attempts over a decade to adapt the Boyer Basic School philosophy, which emphasizes a learning community with a coherent curriculum. The teachers with the help of technology specialists developed a variety of strategies for pursuing this philosophy using technology. Among their strategies are a computing unit for every student, a video conferencing center and a full range of assistive technologies. A number of the reform activities appeared to have been initiated by the teachers.

Frontier Elementary opened as an extended-year, technology-rich elementary school in the mid-1990s. Eleven percent of students were from diverse ethnic backgrounds, and 35% qualified for free or reduced-cost lunch. The use of student data is embedded in nearly every aspect of the school and classroom processes. The district-supported student database makes available to teachers continuous, up-to-date data on students' performance and personal history. Teachers are able to use current data to inform their decisions about individual students' curricular and instructional needs.

Lemon Grove Middle, a suburban middle school, also emphasizes student achievement but takes a much different approach with technology. Its reform effort is summarized as "thin client computing supporting students' academic performance." "Thin clients" refers to computer stations that have very little independent capability (either hardware or software) apart from the local network to which they are connected. This ICT strategy has made it possible for the school to attain a very high computer density and quality maintenance with centralized control.

Jennings Junior High is a medium-sized school in a first-tier suburb of a major metropolitan Midwestern city. Like many major metropolitan city schools, Jennings has faced the challenge of improving students' academic achievement. A district-wide reform plan centered on providing teachers with the needed technology training, equipping classrooms with extensive technology, and supporting technology use. The multifaceted technology curriculum integration plan has resulted in the majority of core teachers at Jennings having modernized, technology-rich classrooms.

A highly recognized school within a very large eastern city, The Mott Hall School pioneered an "Anytime, Anywhere Learning" laptop program: 100% of students and teachers have their own laptop. This largely Hispanic (80%) population of gifted students was not bound by stationary desktop computers but could readily use their laptops 24 hours a day. These high-achieving students were developing computing and research skills that allowed them to extend and enrich their own learning.

Mountain Middle, a large suburban school, has a reform program that can best be described as "technology to support standards-based achievement." For some time its school district leadership has pioneered an approach to promote improvements in achievement using technology in a variety of ways. Some of their innovations include a new teacher-support role called "Student Achievement Specialist" and innovation groups called Vanguard Teams.

Emerson High School is a large urban 9-12 school. Students at Emerson were mostly (90%) Hispanic, and 92% received free or reduced-cost lunch. "Project Bulldog" provided the major impetus and structural foundation for the school's technology and curriculum integration effort. Through "Project Bulldog," participating students received desktop computers for their homes and could select courses that integrate technology into the curriculum. Emerson selected the "Coalition of Essential Schools" model to guide the framing of its school-wide reform efforts.

New Tech High School was established to give students "high-tech preparation for a high-tech world." The school thinks of itself as a high-tech "start-up" company where the students are learning to fill technically demanding jobs; but unlike at a vocational school, their education is not seen as ending there, and in fact almost all of the students go on to college. A number of radical improvements have been implemented, and the school has become known as a showcase to which visitors come from all over the world.

Virtual High School is a consortium of high schools that provides Internet-based courses for students in member schools. This innovative course delivery method provided one type of "school without walls." The high quality curriculum content must adhere to a rigorous set of standards developed by an expert panel of teachers and evaluators. This organizational arrangement makes it possible for many students to take high caliber course work in specialized areas that their own school does not offer.


New conceptual and methodological models are needed to adapt to and understand the changes that result from the integration of information technology into education. Rapid changes in education due to information technologies mean that case study methods are useful to identify key factors, uncover hidden meanings, and explore alternative conceptual models. The “Exemplary Technology-Supported-Schooling Case Studies" project exemplifies the need for exploration of new concepts and methods, especially because it was designed so that the data would have some comparability to cases in two large international studies. The methodological decisions and procedures used here may be useful for future investigations of technology's role in schools, especially school improvement efforts.

Our study was indeed challenged by its links to large international comparative research projects. We know of no other study that simultaneously participated in two large international studies, conforming to both sets of required procedures. But what was in many ways a major burden yielded a remarkable opportunity for exploratory and systematic comparative analysis across a large number of schools and countries.

References

Anderson, R. E. (1993). Computers in American schools 1992: An overview. University of Minnesota, Department of Sociology.

Becker, H. J. (1999). Internet use by teachers: Conditions of professional use and teacher-directed student use. University of California, Irvine: Center for Research on Information Technology and Organizations (CRITO). [Available online at http://www.crito.uci.edu/TLC/findings.html]

Creswell, J. (1998). Qualitative inquiry and research design: Choosing among five traditions. Thousand Oaks, CA: Sage.

Dalin, P. (1994). How schools improve: An international report. London: Cassell.

Huberman, A. M., & Miles, M. B. (1984). Innovation up close: A field study in 12 school settings. New York: Plenum.

Louis, K., & Miles, M. (1991). Improving the urban high school: What works and why. New York: Teachers College Press.

Means, B., & Olson, K. (1997). Technology's role in education reform: Findings from a national study of innovating schools. Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

Means, B., Penuel, W. R., & Padilla, C. (2001). The connected school: Technology and learning in high school. San Francisco: Jossey-Bass.

Miles, M., & Huberman, A. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage.

Salomon, G. (1993). No distribution without individuals' cognition. In G. Salomon (Ed.), Distributed cognitions (pp. 111-138). New York: Cambridge University Press.

Scarr, S. (1985). Constructing psychology: Making facts and fables for our times. American Psychologist, 40, 499-512.

Schofield, J. W., & Davidson, A. L. (2002). Bringing the Internet to school. San Francisco: Jossey-Bass.

Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.

Yin, R. (1994). Case study research: Design and methods (2nd ed.). Thousand Oaks, CA: Sage.

Appendix 1

1(1) /IPPUT- history and scope
Description: M2: The history and scope including the goals and origin, the curricular/subject areas involved and the instructional org. of the IPPUT.
1 1) /IPPUT- history and scope/goals, obj of
Description: Objectives of IPPUT.

1 2) /IPPUT- history and scope/Origin, history of
The history behind the IPPUT; the needs and relevance on which it is based. OECD diffusion code (e.g. who adopted first, any patterns to adoption and implementation, adopter's characteristics, etc.).

1 3) /IPPUT- history and scope/Curr areas & org of
The IPPUT's curriculum goals, content and organization (e.g. cross-curricular links, relations with real-world-like problems), flexibility of the curriculum.

1 4) /IPPUT- history and scope/assessment practices of
This IPPUT's forms of assessment (e.g. tests, portfolios, project performances) and organization of assessment (e.g. formative or summative, role of students). This is mainly school-wide shared practices. See 5.2 for individual teacher practices.

2(2) /School
Description: M2: Information about the site itself including background, culture, and relationships.
2 1) /School/background
Background of school: type, location, size, student pop characteristics to understand school setting.

2 2) /School/relationships
Relationships of school (relevant to IPPUT) with: school board; parents; external - such as business partners, colleges. Include also special external funding such as grants or donations.

2 3) /School/culture, ldrshp
The culture (artifacts, symbols, basic assumptions, espoused values) of the school, including its collegiality and general professional development practices i.e. perhaps focused more on the adult’s experience as an employee in their workplace (see 3.3 for ICT-specific professional development). The leadership style and practices of: Principal; other leaders, including teachers; site council.

2 4) /School/Schoolwide reform, imprmt
OECD code: for school-wide improvements or reform that are related to, but larger than, the innovation we are focusing on and considering as the IPPUT.

3(3) /ICT
Description: M2: ICT at the site itself and related to system plans; ICT support structure; ICT in the IPPUT.
3 1) /ICT/Role of in school
Vision of ICT, use of ICT (other than for IPPUT), school policies/plans for ICT. Goals of ICT distribution, e.g., equity in access, etc.

3 2) /ICT/Rel w~ plans
Relationship of ICT in school to local district, state, or national plans (beyond the scope of the IPPUT).

3 3) /ICT/ICT support
ICT technical AND instructional support, including facilities, staff (such as Tech. Coord. or other), ICT-specific prof. dev. (see 2.3 for gen'l prof. dev.), or however staff gained tech competencies, and incentives.

3 4) /ICT/descript of school's
Descriptions of the amount and nature of ICT in the school.

3 5) /ICT/ICT use in IPPUT
Use of ICT by students: communications; information retrieval and processing; multimedia; simulations, data collection and analysis; drill and practice; student-, teacher- and other actor-to-computer interactions; added value (unique contributions of ICT) to learning and teaching of ICT.

4(4) /Students
Description: M2: Which students are involved, their practices and outcomes.
4 1) /Students/describe involved
Description of involved students, including number of students, grade level, experience with ICT, socio-economic status, and cognitive ability.

4 2) /Students/practices in IPPUT
Description: Roles, collaborations, and activities in IPPUT.

4 3) /Students/outcomes and impact of IPPUT & ICT
Student outcomes from IPPUT, including student competencies, attitude and motivation, career skill development. Include differences between classes or groups that have access to IPPUT and ICT and those who do not. OECD Equity hypothesis (#3): equity issues, gaps between high and low students’ access to and abilities with and benefits from.

5(5) /Teacher
Which teachers are involved, their practices, and their outcomes. May also include important non-licensed teaching staff in these categories, to outline their background (use 5.1) and roles in the IPPUT (use 5.2).
5 1) /Teacher/bkgrd, exp, beliefs
Description of involved teachers, including educational background, experience with ICT, norms and beliefs on teaching and ICT, and their innovation history. May also include important non-licensed teaching staff, to outline their background.

5 2) /Teacher/practices in IPPUT
Teacher practices in IPPUT, including instruction methods used, roles, interaction with students, use of curriculum materials and assessment. This is for individual teacher practices; see 1.4 for school-wide shared assessment practices. May also include important non-licensed teaching staff, to outline their roles in the IPPUT.

5 3) /Teacher/outcomes and impact of IPPUT & ICT
Teacher (especially self-identified) outcomes from IPPUT and/or school-wide reform, including competencies, attitudes and beliefs. See also 2.3 for professional development and professional collaboration.

6(6) /Sustainability
Description: the innovation characteristics and the micro, meso and macro level factors that impact the IPPUT. NOTE: These codes might often be used as a second code to some other descriptive information about the school or IPPUT.
6 1) /Sustainability/IPPUT charac~ &
Characteristics of the IPPUT that contribute to or impede sustainability, including implementation issues, barriers, solutions (for OECD future projections). NOTE: This code might often be used as a SECOND, ADDITIONAL code to some other information (e.g. 1.x, 2.2, etc.).

6 2) /Sustainability/micro &
Micro level factors (teachers, classroom factors, students) that contribute to or impede the sustainability of the IPPUT. NOTE: This code might often be used as a SECOND, ADDITIONAL code to some other information (e.g. 2.x, 4.x, 5.x).

6 3) /Sustainability/meso &
Meso level factors (student pop, school-level staff (e.g. prin., tech coord), ICT and ICT support) that contribute to or impede the sustainability of the IPPUT (for school culture use 2.3). NOTE: This code might often be used as a SECOND, ADDITIONAL code to some other information.

6 4) /Sustainability/macro &
Macro level (district-level actors or context; district, state, national ed system and ICT or Ed reform policies) factors that contribute to or impede the sustainability of the IPPUT. NOTE: This code might often be used as a SECOND, ADDITIONAL code to some other information.

7(7) /Transferability/Scalability
Description: M2: The transferability of the innovation and the micro, meso and macro level factors that impact its transferability.
7 1) /Transferability/IPPUT charac &
M2: The transferability or scalability of the innovation and the micro, meso and macro level factors that impact its transferability. NOTE: These codes might often be used as a second code to some other descriptive information about the school or IPPUT.

7 2) /Transferability/meso &
Meso level factors (student pop, school-level actors [beyond classroom, e.g. prin or tech coord], context and culture, ICT and ICT support) that contribute to or impede the transferability or scalability of the IPPUT. NOTE: This code might often be used as a SECOND, ADDITIONAL code to some other information (e.g. 3.x, 4.2).

7 3) /Transferability/macro &
Macro level (district-level actors or context; district, state, national ed system and ICT or Ed reform policies) factors that contribute to or impede the transferability or scalability of the IPPUT. NOTE: This code might often be used as a SECOND, ADDITIONAL code to some other information.

(8) /Does not fit
Description: Use sparingly and only when info absolutely does not fit any other existing category.
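
The coding scheme above is a two-level hierarchy of codes, each with a slash-delimited path and a description. As a minimal illustration only (not part of the study's actual analysis tooling, and using just a few of the categories listed above), the Python sketch below shows one way such a codebook could be modeled and queried in qualitative-analysis software.

```python
# A minimal sketch of the two-level coding scheme in Appendix 1 as a data
# structure. The entries shown are a small, abbreviated subset of the codebook;
# this is an illustration, not the tooling used in the study.

from dataclasses import dataclass, field

@dataclass
class Code:
    path: str                 # slash-delimited path, e.g. "/School/background"
    description: str          # abbreviated description from the appendix
    children: list = field(default_factory=list)

codebook = [
    Code("/IPPUT- history and scope", "History and scope of the IPPUT", [
        Code("/IPPUT- history and scope/goals, obj of", "Objectives of IPPUT"),
    ]),
    Code("/School", "Information about the site itself", [
        Code("/School/background", "Type, location, size, student population"),
    ]),
    Code("/Does not fit", "Use sparingly"),
]

def find_code(codes, path):
    """Depth-first lookup of a code by its full path; returns None if absent."""
    for code in codes:
        if code.path == path:
            return code
        hit = find_code(code.children, path)
        if hit is not None:
            return hit
    return None

print(find_code(codebook, "/School/background").description)
# prints: Type, location, size, student population
```

A structure like this would let coded interview or observation segments be tagged with code paths and then aggregated per category, including the "second, additional code" usage noted for the sustainability and transferability categories.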