
Chapter 4 Methodology


4.8 Data analysis methods

The data collected were analysed at three different levels:

1. School sector level, i.e. Secondary, Primary and Special: The unit of analysis is the school sector. Schools and individuals were grouped by sector and summary statistics were presented. No attempt was made to compare across sectors, as the sectors differ in focus, mission, context and goals.

2. School level: The unit of analysis is the school. At this level the analyses considered differences across schools, for example in the general nature and pattern of IT use by teachers or in the school-averaged performance of the students, and the factors contributing to these variations.

3. Individual student level: The unit of analysis is the individual student. Here the analyses focused on variations across students, such as their attitudes towards IT and their competence in and use of IT in learning-related activities, and on the important contextual factors that affect these variations.

The following section gives a more detailed description of the analyses that were carried out for each instrument.

4.8.1 Questionnaire Surveys

In order to address the Research Question sets, all of the important constructs pertinent to each question set were identified, and the items from the various questionnaires relevant to those constructs were then mapped to them. Descriptive statistics such as means, standard deviations and frequency distributions were computed as appropriate to reveal the overall picture of ITEd implementation in Hong Kong. The methods for estimating weighted means and percentage distributions are explained in Appendix X. Where appropriate, the data were compared with the findings of SITES-M1 and the Preliminary Study to chart progress over the five-year period.
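To make the two estimators concrete, the sketch below shows one way a weighted mean and a weighted percentage distribution could be computed for a questionnaire item. It is illustrative only: the authoritative procedures are those documented in Appendix X, and the single-stage weighting and the column names ("weight", plus the item column) are assumptions.

```python
# Illustrative sketch only: the authoritative estimation procedures are the
# ones in Appendix X. A single sampling weight per respondent and the column
# names used here ("weight", plus the item column) are assumptions.
import pandas as pd


def weighted_mean(responses: pd.DataFrame, item_col: str,
                  weight_col: str = "weight") -> float:
    """Weighted mean of one questionnaire item across respondents."""
    d = responses.dropna(subset=[item_col, weight_col])
    return (d[item_col] * d[weight_col]).sum() / d[weight_col].sum()


def weighted_percentage_distribution(responses: pd.DataFrame, item_col: str,
                                     weight_col: str = "weight") -> pd.Series:
    """Percentage of the weighted sample choosing each response option."""
    totals = responses.groupby(item_col)[weight_col].sum()
    return 100 * totals / totals.sum()
```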

4.8.2 IT Literacy Assessment

There are two sections to the IT Literacy Assessment.

Section 1 is the assessment of students’ stage-specific knowledge and skills in IT. These items were marked according to the model answers, and the numbers of students were summarised by the percentage of items answered correctly. To examine the extent to which the students had met the criterion-based expectations of the IT targets in this section of the assessment, three categories of indicator were used:

• students who answered fewer than 50% of the items correctly,
• students who answered from 50% to 80% of the items correctly, and
• students who answered more than 80% of the items correctly.

Students who scored above 80% were considered to have complete mastery of the stage-specific knowledge and skills since, as mentioned earlier, a 20% tolerance level was set to allow for random variation that may affect test scores from a variety of sources, such as distractions in the assessment environment, the occasion of testing, the rater and the examinee’s state of mind at the time of testing. Those who scored from 50% to 80% were considered to have at least a reasonable grasp.

At the school level, the percentage of students meeting these criteria was summarised school by school, as illustrated in the sketch below.
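A minimal sketch of this classification and the school-level summary follows. The 50% and 80% cut points come from the text above, but the data layout (one row per student, with hypothetical "school_id" and "percent_correct" columns) is an assumption.

```python
# Minimal sketch of the three-band classification and per-school summary
# described above. The cut points follow the text; the column names
# ("school_id", "percent_correct") are hypothetical.
import pandas as pd


def band(percent_correct: float) -> str:
    """Assign a student to one of the three criterion bands."""
    if percent_correct > 80:
        return "above 80% (complete mastery)"
    if percent_correct >= 50:
        return "50% to 80% (reasonable grasp)"
    return "below 50%"


def school_summary(scores: pd.DataFrame) -> pd.DataFrame:
    """Percentage of students in each band, reported school by school."""
    banded = scores.assign(band=scores["percent_correct"].map(band))
    counts = banded.groupby(["school_id", "band"]).size().unstack(fill_value=0)
    return 100 * counts.div(counts.sum(axis=1), axis=0)
```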

Section 2 of the IT Literacy Assessment is the students’ self-assessment of their IT literacy. Frequency distributions of the students’ responses to the items were produced.

The method for computing the percentage distributions is included in Appendix X.


4.8.3 School Visits

The main purpose of the analyses of the School Visit data was to identify unique uses and good practices that were of interest for further investigation. Fieldworkers made independent observations and records and, as a form of data triangulation, reached consensus about these during the post-visit debriefing. The methods of analysis of the data collected are explained below.

School Tour

Fieldwork staff members recorded their observations on the School Tour Observation Form. Some numerical data were analysed descriptively (e.g. frequency counts) to characterise the school IT infrastructure, access and actual usage (for example, the number of computers or peripherals). Field notes on observed IT use inside and outside class, and on whether an IT culture was evident in the school, were treated as qualitative data and analysed using the procedures described in Section 4.8.4 below.

School Document Analysis

Observations were recorded on pre-prepared forms. These notes were treated as qualitative data, as described in Section 4.8.4.

Classroom Visit

The main aim of the analyses of the Classroom Visit data was to identify innovative practices in the classroom and to make inferences about their effects on teaching and learning, particularly on students’ reactions. In the post-lesson debriefing, the observers reflected on their observations of the pedagogical practices and learning outcomes (as defined for the purposes of this Study in Chapter 3, for example teacher-centredness, the extent of IT use and how it relates to the pedagogical paradigm of IT, and the extent of students’ motivation, understanding and participation) and reached consensus through discussion about the classroom practices and the perceived impact of these pedagogical uses of IT. These reflections were treated as qualitative data and were analysed using the procedures described in Section 4.8.4. The Analysis Framework for Classroom Visits is attached in Appendix XI. Specific cases and examples of interest were drawn on to illustrate points emerging from these analyses.

IT Activity Daily Log

The purpose of the IT Activity Daily Log was to measure students’ habits, usage patterns, perceived importance, satisfaction and self-efficacy in using IT. The first part of the analysis was therefore to calculate the means of total time spent, broken down by activity, tool, physical location and related school curriculum subject.

The second part of the analysis considered the students’ ratings of various IT-related activities in terms of importance, interest and self-efficacy. These ratings were weighted with respect to the time the student actually spent on the respective activities, and then averaged to indicate their overall perceived importance, interest and self-efficacy with respect to each category of activities they had engaged in.

The method for estimating the weighted means of students’ responses is included in Appendix X.
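As a rough illustration of the time-weighting idea described above (the authoritative estimation method is the one in Appendix X), the sketch below averages each rating within an activity category, weighting by the minutes the student logged. The layout (one row per logged activity) and the column names are assumptions.

```python
# Illustrative sketch of time-weighted average ratings per activity category.
# The exact estimation method is the one in Appendix X; the column names here
# ("minutes", "activity_category", the rating columns) are assumptions.
import pandas as pd


def time_weighted_ratings(log: pd.DataFrame,
                          rating_cols=("importance", "interest", "self_efficacy"),
                          time_col: str = "minutes",
                          group_col: str = "activity_category") -> pd.DataFrame:
    """Average each rating within an activity category, weighted by time spent."""
    weighted = log[list(rating_cols)].multiply(log[time_col], axis=0)
    weighted[group_col] = log[group_col]
    rating_sums = weighted.groupby(group_col).sum()
    time_totals = log.groupby(group_col)[time_col].sum()
    return rating_sums.div(time_totals, axis=0)
```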

Focus Group Interviews and Individual Interviews

The interviews were transcribed and the transcripts of the individual interviews were sent to the respective interviewees for verification, when appropriate. They were then analysed according to standard protocols for the analyses of qualitative data, as described in Section 4.8.4.

EMB Document Analysis

Four aims were identified for the analysis of the EMB documents:

• to identify specific targets/objectives of the ITEd initiative,
• to chart progress and changes in the utilisation of resources,
• to evaluate reports of major projects,
• to explain the funding model for the ITEd initiative.

The documents were classified according to categories based on the research question sets and the relevant documents were analysed systematically in order to address these issues. The framework for analysis of EMB Documents is attached in Appendix V.

Triangulation of data

In the data analysis process, data from different sources were triangulated with one another to ensure the validity of the interpretations of, and the conclusions drawn from, the data. The following strategies were used to facilitate triangulation of the data:

• The use of a variety of data sources, such as data from different stakeholders and/or instruments within the school sector,

• The use of multiple research methods, that is, quantitative and qualitative, to compare data, for example between the Student Focus Group interviews and the IT Activity Daily Log,

• The use of more than one expert researcher within the research team and/or local and overseas consultants.

The triangulation took the form of checking the consistency of the data collected from different stakeholder groups or via different methods, and/or obtaining agreement between investigators about the analysis and interpretation of the data.

4.8.4 Description of qualitative analysis processes

Qualitative data analysis is the process of systematically arranging and presenting information in order to search for meaning in the data collected (Minichiello et al., 1995). The analysis of qualitative data involves three distinct stages, which were followed strictly in the present study:

• Data reduction and display,
• Identifying “themes” or patterns from the data collected,
• Drawing conclusions from the data.

Chapter 4: Methodology

Figure 4.2 Interactive model of analysing qualitative data (modified from the work of Miles & Huberman, 1994)

To analyse the qualitative interview and observation data, content analysis was used to identify and categorise the primary “themes” or patterns in the data collected (Patton, 1990). This was combined with the constant comparative method, in which incoming data were constantly compared against earlier data for patterned regularities. It is important to note that the themes or patterns emerged from the data rather than being imposed on them prior to data collection and analysis (Patton, 1990).

Steps involved in qualitative data analysis

• Note-taking and tape-recording (for interviews); note-taking and recording on prepared forms (for observations and classroom visits),

• Verbatim transcription of tape-recorded interviews (in the language in which the interview was conducted); verification of individual interview transcripts by the interviewees,

• Coding and recoding (for all data, including interview transcripts, observation notes and forms, open comments from questionnaires, and relevant school documents),

• Drawing/verifying “themes” or patterns: identification of categories of data specific to the instrument used and common across instruments,

• Where appropriate, computing descriptive statistics on the frequency of incidents relating to specific themes (a minimal counting sketch follows this list),

• Identification of specific quotations to support or illustrate the patterns, and translation of these quotations into English where the transcript was in Chinese.
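The following sketch illustrates, under assumptions, the simple theme-frequency counts mentioned in the list above. The structure of a coded segment and its field names are hypothetical; the code frame itself comes from the qualitative coding, not from this script.

```python
# Hypothetical sketch of counting coded incidents per theme across all coded
# segments (interview transcripts, observation notes, open comments). The
# segment structure and field names are assumptions for illustration.
from collections import Counter


def theme_frequencies(coded_segments: list[dict]) -> Counter:
    """Count how many coded segments fall under each theme.

    Each segment is assumed to look like:
    {"source": "interview_07", "theme": "curriculum integration", "text": "..."}
    """
    return Counter(segment["theme"] for segment in coded_segments)
```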

To ensure consistency, at least two fieldwork staff members made independent observations and compared these during the post-observation briefing.

It must be emphasized that the purpose of this description of the proposed data analysis was to give a sense of the direction and the proper protocols, rather than details of the actual variables that were analysed. This is appropriate for the phenomenographic approach used in the qualitative part of this Study, in which the role of the researcher is:

‘noting regularities, patterns, explanations, possible configurations, causal flows, and propositions. The competent researcher holds these conclusions lightly, maintaining openness and skepticism, but the conclusions are still there, inchoate and vague at first, then increasingly explicit and grounded…. qualitative data analysis is a continuous, iterative enterprise…. Qualitative researchers are in a more fluid – and a more pioneering – position’ (Miles and Huberman, 1994, p. 11).
