
Learner readiness for online learning: Scale development and student perceptions

Min-Ling Hung a, Chien Chou a,*, Chao-Hsiu Chen a, Zang-Yuan Own b

a Institute of Education, National Chiao Tung University, 1001 Ta-Hsueh Road, Hsinchu 30010, Taiwan, R.O.C.
b Department of Applied Chemistry, Providence University, 200 Chun-Chi Road, Shalu, Taichung 43301, Taiwan, R.O.C.

Article info

Article history: Received 20 August 2009; received in revised form 15 March 2010; accepted 10 May 2010

Keywords: Distance education and telelearning; Gender studies; Teaching/learning strategies

Abstract

The purpose of this study was to develop and validate a multidimensional instrument for college students' readiness for online learning. Through a confirmatory factor analysis, the Online Learning Readiness Scale (OLRS) was validated in five dimensions: self-directed learning, motivation for learning, computer/Internet self-efficacy, learner control, and online communication self-efficacy. Research data gathered from 1051 college students in five online courses in Taiwan revealed that students' levels of readiness were high in computer/Internet self-efficacy, motivation for learning, and online communication self-efficacy and were low in learner control and self-directed learning. This study found that gender made no statistical differences in the five OLRS dimensions, but that higher grade (junior and senior) students exhibited significantly greater readiness in the dimensions of self-directed learning, online communication self-efficacy, motivation for learning, and learner control than did lower grade (freshman and sophomore) students.

© 2010 Elsevier Ltd. All rights reserved.

1. Introduction

The domains of learning and teaching are experiencing great changes as higher-education institutions rapidly adopt the concepts and practices of e-learning. Many universities nowadays, such as those in Taiwan, are starting to provide web-based courses that complement classroom-based courses. Online courses provide learners with a variety of benefits such as convenience (Poole, 2000), flexibility (Chizmar & Walbert, 1999), and opportunities to work collaboratively and closely with teachers and other students from different schools or even across the world. But are college students ready for online learning?

Over the last few years, researchers have focused on developing a readiness scale for online learning. For example, Smith, Murphy, and Mahoney (2003) conducted a study with college-age students and found two primary factors that predicted student success: self-management of learning and comfort with e-learning. A review of this study, however, reveals that these scales and measures of assessing learners' readiness do not comprehensively cover other dimensions that are critical to online learning and that include technical skills and learner control.

Since online learning has become highly popular in educational institutions, there has been and will continue to be a need for faculty and students to re-examine students' readiness and to develop a more comprehensive measure of that readiness. By undertaking this task, teachers can design better online courses and guide students toward successful and fruitful online learning experiences.

To better understand how to achieve effective online learning, it is necessary to know what dimensions of online learning readiness college students should possess and what dimensions were possibly omitted in past research. Researchers have noted that technical skills involving computers and the Internet are related to learners' performance in Web-based learning environments (Peng, Tsai, & Wu, 2006). Similarly, learners' perceptions of the Internet shape their attitudes and online behaviors (Tsai & Lin, 2004).

In addition to appropriate network-related skills and attitudes, online learning environments that are not highly teacher-centered require students to take a more active role in their learning. In particular, students have to realize their responsibility for guiding and directing their own learning (Hartley & Bendixen, 2001; Hsu & Shiue, 2005), for time-management (Hill, 2002; Roper, 2007), for keeping up with the class, for completing the work on time (Discenza, Howard, & Schenk, 2002), and for being active contributors to instruction (Garrison, Cleveland-Innes, & Fung, 2004).

* Corresponding author. Tel.: +886 3 5731808; fax: +886 3 5738083. E-mail address: cchou@mail.nctu.edu.tw (C. Chou).



Since online learning environments also allow students to have more flexibility in their learning-activity arrangements, students need to make decisions about and to exercise control over their learning activities in terms of pace, depth, and coverage of the content, type of media accessed, and time spent on studying. Thus, the dimension of learner control also becomes an important part of students' readiness (Stansfield, McLellan, & Connolly, 2004).

The online course environment provides communication tools to facilitate interpersonal communication among teachers and students (Hew & Cheung, 2008; Roper, 2007). By using asynchronous tools, such as threaded discussions and email, and synchronous ones, such as live chat, instant messages, and Skype, students can ask questions and exchange ideas to enhance their learning. Since online courses generally lack weekly face-to-face meetings, it is important for students to communicate comfortably and confidently with teachers and classmates through computer-mediated correspondence or discussion, especially those presented in writing (Salaberry, 2000).

The purposes of this study are to re-examine the concept and the underlying dimensions of students' readiness for online learning, and to construct and validate an instrument, the Online Learning Readiness Scale (OLRS). Because this study's OLRS framework is a hypothetical model serving to explain college students' readiness toward online learning, the construct should be validated. Therefore, the present study has used a confirmatory factor analysis (CFA), instead of a traditional exploratory factor analysis (EFA), to establish the construct validity of the OLRS model. CFA is derived from the structural equation modeling (SEM) methodology and is theory-driven, functioning to help determine whether or not the number of factors and the loadings of measured variables on them conform to expectations based on pre-established theory. This study will explore the following four research questions:

1. Could an OLRS model be constructed and validated through CFA?

2. What is college students' readiness for online learning?

3. Does the gender of college students make any difference in their readiness for online learning?

4. Does the grade (i.e., level of accumulated academic credits) of college students make any difference in their readiness for online learning?

2. Literature review

2.1. Measuring learner readiness toward online learning

The concept of readiness for online learning was proposed in the Australian vocational education and training sector by Warner, Christie, and Choy (1998). They defined readiness for online learning in terms of three aspects: (1) students' preferences for the form of delivery as opposed to face-to-face classroom instruction; (2) student confidence in using electronic communication for learning and, in particular, competence and confidence in the use of Internet and computer-mediated communication; and (3) ability to engage in autonomous learning.

In order to concretize the readiness concepts, McVay (2000, 2001) developed a 13-item instrument for measuring readiness for online learning. The instrument focuses on student behavior and attitudes as the predictors. Later, Smith et al. (2003) conducted an exploratory study to test McVay's (2000) Readiness for Online Learning questionnaire. The instrument was administered to 107 undergraduate university students in the United States and Australia and yielded a two-factor structure, "Comfort with e-learning" and "Self-management of learning." The former one, or the need for self-direction, was recognizable as an e-learning-focused dimension identified by Smith (2000) for its broader set of resource-based flexible learning materials. The latter one permeated the concept of distance education, regarding which Evans (2000) commented that self-direction is a prerequisite for effective resource-based learning in distance education. Later, Smith (2005) conducted a survey study with 314 Australian undergraduate university students and confirmed that the McVay Readiness for Online Learning questionnaire may have useful applicability to research and practice in the area of student dispositions and preferences associated with online learning.

The McVay instrument describes a readiness for engagement with the particular form of resource-based learning delivery that is online, rendering the two aspects identified in the instrument as potential factors affecting the present study. However, assessments of online learner readiness have needed to address facets that tend to vary significantly and that include technical computer-use skills, Internet-navigation skills, and learner control over the sequence and selection of materials, facets that, indeed, were absent from McVay's instrument. Thus, the following parts of the current paper review more dimensions that may be involved in the readiness concept.

2.2. Self-directed learning (SDL)

It is important to note some of the highly relevant characteristics of self-directed learners. In the original research of Knowles (1975), SDL is defined as a process in which individuals take the initiative in understanding their learning needs, establishing learning goals, identifying human and material resources for learning, choosing and implementing appropriate learning strategies, and evaluating learning outcomes. Knowles' concept of SDL was further classified and concretized into the Self-Directed Learning Readiness Scale (SDLRS) by Guglielmino (1977) to help in the diagnosis of students' learning needs and personality characteristics and to promote student autonomy. Later, Garrison's (1997) model appeared to yield comprehensive representations regarding SDL, which was defined as an approach that helped stimulate learners' assumption of personal responsibility and collaborative control over the cognitive (self-monitoring) and contextual (self-management) processes in constructing and confirming meaningful and worthwhile learning outcomes.

As online learning programs have been intensively used in the past decades, it is important for distance educators to be proactive in helping potential learners to determine whether they are prepared to take an online course or program. Lin and Hsieh (2001) found that successful online learners make their own decisions to meet their own needs at their own pace and in accordance with their own existing knowledge and learning goals. The presence of this correlation makes it easier for mature students who are self-directed to take responsibility for learning and to be more enthusiastic about the learning activities.

In order to construct the SDL items in our proposed OLRS in this study, we created a pool of items by both writing new items and adapting items from available scales of Garrison (1997), Guglielmino (1977), and McVay (2000, 2001). In this way, we selected nine items covering students' attitudes, abilities, personality characteristics, and affective responses toward online learning. Two examples of these items are "I set up my learning goals" and "I carry out my own study plan" in an online context.

2.3. Motivation for learning

Motivation has had an important influence on learners' attitudes and learning behaviors in educational research and practice (e.g., Deci & Ryan, 1985; Fairchild, Jeanne Horst, Finney, & Barron, 2005; Ryan & Deci, 2000). Learning takes place through interplays between cognitive and motivational variables, and these two aspects have been found to be indivisible (Pintrich & Schunk, 2002; Stefanou & Salisbury-Glennon, 2002).

Students' possession of motivational orientation (intrinsic or extrinsic) has significant effects on the students' learning performance. According to Ryan and Deci (2000), intrinsic motivation is a critical element in cognitive, social, and physical development because it is through acting on one's inherent interests that one grows in knowledge and skills. Intrinsic motivation was found to be associated with a lower dropout rate, higher-quality learning, better learning strategies, and greater enjoyment of school (Czubaj, 2004; Deci & Ryan, 1985). "Extrinsic motivation" was defined by Deci and Ryan (1985) as the performing of a behavior to achieve a specific reward. From students' perspective, extrinsic motivation relative to learning may include getting a higher grade on exams, getting awards, and getting prizes.

In addition, Garrison's (1997) model involved motivation aspects reflecting both perceived value of learning and anticipated success in learning. Motivation included the need to do something out of curiosity and for enjoyment. Furthermore, Garrison identified that motivation and responsibility were reciprocally related and that they were facilitated by collaborative control over the educational transaction. To sustain motivation, students must become active learners who have strong desires for learning (Candy, 1991; Knowles, 1975).

Lepper (1989) and Lepper and Cordova (1992) supported the idea that control along with fantasy, curiosity, and challenge is a critical feature in what makes particular technologies intrinsically motivating. Recently, researchers have investigated the role of motivation in computer-supported collaborative learning (CSCL). For example, Ryan and Deci (2000) identified that learners in an online setting had significant freedom to determine their own learning path, a freedom that might benefit learners with intrinsic motivation. Yang, Tsai, Kim, Cho, and Laffey (2006) found evidence that motivation is positively related to how learners perceive each other's presence in online courses. Saadé, He, and Kira (2007) also noted that intrinsic and extrinsic motivation played an important role in the success or failure of online learning.

The dimension of "motivation for learning" can significantly facilitate learners' efforts to be compatible with the learners' own desires and to enhance their learning, retention, and retrieval. Understanding students' attitudes and preferences toward learning is essential to improving the planning, producing, and implementing of educational resources (Federico, 2000). In order to construct relevant items of motivation for learning in the OLRS, we created a pool of items by both writing new items and adapting items from Ryan and Deci (2000). In this way, we selected seven items, an example of which is "I enjoy challenges" in an online learning context.

2.4. Learner control

By nature, Web-based environments are very different from traditional learning environments. Traditional learning environments, such as textbooks or instructional videos, typically require students to follow a linear sequence. Web-based instruction systems permit more flexibility and freedom in study materials. The learners are allowed to choose the amount of content, the sequence, and the pace of learning with maximum freedom (Hannafin, 1984; Reeves, 1993). The learners are given control over their own instruction and can follow a more individualized approach by repeating or skipping sections and by following subjects regardless of the order in which information has been physically arranged. In the broadest sense, learner control is the degree to which a learner can direct his or her own learning experience and process (Shyu & Brown, 1992). The meaning of learner control has evolved over time to include the characteristics of new learning paradigms as well as new technologies.

The Component Display Theory of Merrill (1983) and the Elaboration Theory of Reigeluth and Stein (1983) have indicated that learner control is an important aspect of effective learning and that the level of learner control may maximize student performance. Merrill (1984) suggested that the learner should be given control over the sequence of instructional material. With this control, individuals can discover how to learn as they make instructional decisions and experience the results of those decisions.

In asynchronous online learning environments, there seems to be no specific instructional sequence that is the most suitable for all learners. Learners may have their own preference, viewing the instructional material in a sequence that best meets their needs (Jonassen, 1986). Wang and Beasley (2002) used a sample of 81 Taiwanese undergraduates and found that students' task performance is affected mainly by learner control in a Web-based learning environment. Thus, online learners who are better empowered to determine their own learning may exhibit better learning performance. Since the way in which each individual would prefer to access and to interact with computer-based learning material varies from individual to individual, the current study proposes related items that accommodate, in our OLRS, learner control over nonlinear, iterative, and active learning styles. An example of these items is "I set up a schedule to view the online instructional material."

2.5. Computer & Internet self-efficacy

Since online courses are delivered through networks, it would be particularly important to have related assessments concerning individuals' perceptions of using a given technology and individuals' ability to use the technology, that is, assessments concerning computer/network self-efficacy. The related idea of self-efficacy stems from social cognitive theory, which offers a conceptual framework for understanding how self-efficacy beliefs regulate human functioning through cognitive, motivational, affective, and decisional processes (Bandura, 1977, 1986, 1997). Compeau and Higgins (1995) developed and validated a 10-item instrument of computer self-efficacy (CSE) and identified that computer self-efficacy had a significant influence on computer-use outcomes, emotional reactions to computers, and actual computer use. The researchers claimed that computer self-efficacy does not reflect simple component skills, such as booting up the computer; instead, it represents an individual's perception of his or her ability to use computers to accomplish a task, such as using software to analyze data.

Similarly, in discussing Internet self-efficacy (ISE), Eastin and LaRose (2000) pointed out that ISE does not result merely in performing some Internet-related tasks, such as uploading or downloading files; rather, ISE is one's ability to apply higher-level skills such as troubleshooting problems. ISE may be different from CSE and may require a set of behaviors for establishing, maintaining, and using the Internet. In addition, Tsai and Tsai (2003) showed that students with high Internet self-efficacy learned better than did students with low Internet self-efficacy in a Web-based learning task. Tsai and Lin (2004) explored adolescents' perceptions and attitudes regarding the Internet among 636 high school students and found that females were more likely than males to perceive the Internet as pragmatic and that males' enjoyment of the Internet was greater than females' corresponding enjoyment.

In order to construct the computer and Internet self-efficacy-related items in our proposed OLRS in this study, we newly developed some items and selected other items from those concerning computer self-efficacy (Compeau & Higgins, 1995) and Internet self-efficacy (Eastin & LaRose, 2000) with a shift to the online learning context. An example of these items is "I feel confident in using the Internet (Google, Yahoo) to find or gather information for online learning."

2.6. Online communication self-efficacy

Online learning may also involve computer-mediated communication. Research findings indicate that shy students tend to participate more in online environments than in traditional environments (Palloff & Pratt, 1999). McVay (2000) reported that it is important to create opportunities for interactions and communications between students and their instructors in Web-based learning. Similarly, Roper (2007) suggested that successful students should make the most of online discussions, which may provide opportunities for richer discourse and thoughtful questions as a technique to engage both fellow students and instructors. Asking questions is a way to go deeper into the subject, and going deeper makes the subject matter more understandable. In addition, to prevent burn-out or loss of interest when studying online, students should take advantage of opportunities to work with other online students, using encouragement and feedback to stay motivated. From the above-mentioned studies, we conclude that communication self-efficacy in online learning is an essential dimension for overcoming the limitations of online communication. In the current study, we created a pool of items for online communication self-efficacy by both writing new items and adapting concepts from McVay (2000) and Roper (2007). One example of these items is "I feel confident posting questions and responses in online discussions."

In sum, by understanding college students' readiness for online learning, not only can instructional designers provide better online courses, but also teachers can help students enhance their online learning experiences. To appropriately categorize students' readiness and to construct an OLRS, we drew on self-directed learning proposed by Garrison (1997) and McVay (2000, 2001), motivation for learning (Ryan & Deci, 2000), computer and Internet self-efficacy (Compeau & Higgins, 1995; Eastin & LaRose, 2000), learner control (Shyu & Brown, 1992), and online communication self-efficacy (McVay, 2000; Roper, 2007). On the basis of the OLRS, we strive to better understand college students' readiness for online learning and whether gender and grade differences are significant characteristics of the students' readiness.

3. Research method

3.1. Contexts and participants

The current study’s participants were online learners who were enrolled in at least one of five online courses in three universities in Taiwan. A total of 1200 questionnaires were distributed by paper or email after the midterm. A sample of 1051 usable responses was obtained from a variety of undergraduate students with different majors, resulting in a response rate of 87.6%. The students were asked to describe themselves in reference to a 5-point Likert-type scale, with anchors ranging from 1 (strongly disagree) to 5 (strongly agree).

The demographic variables included gender, student grade (freshman through senior), and course name. There were more female respondents (589, 56%) than male respondents (462, 44%). Regarding their grade level, 321 (30.5%) participants were juniors, whereas 648 (61.7%) were seniors and 82 (7.8%) were freshmen and sophomores. Regarding the course taken, 658 (62.61%) participants were from the life chemistry course, 169 (16.08%) participants were from calculus, 80 (7.61%) participants were from statistics, 79 (7.52%) participants were from Taiwan ecology, and 65 (6.18%) were from introduction to environmental protections.

The five online courses surveyed in this study were purely distance learning with an asynchronous format. Three courses were elective, so those students had already indicated their preferences for the course format. Although calculus and statistics were required courses, these two online courses were repeated mainly for students who had failed to pass a previous class. Before taking the online course, students would already have been informed of the course format. Thus, they would be able to make decisions based on their preferences concerning course format. Our study focused on three universities: (1) a national university that was located in northern Taiwan and that offered "Calculus" via the Blackboard Learning Management System; (2) a private university that was located in northern Taiwan and that offered the required course "Statistics" and the two general education courses "Introduction to Environmental Protections" and "Taiwan Ecology" via the Moodle Learning Management System; and (3) a private university that was located in central Taiwan and that offered the general education course "Life Chemistry" via a self-developed adaptive learning system. All five courses were asynchronous online courses featuring digital learning materials including videos and slides. Students were asked to post questions and comments on discussion spaces for each week throughout the semester, and the instructors and teaching assistants responded to students' postings.

3.2. Instruments

To ensure that no important aspects of the OLRS were neglected, we conducted OLRS-themed interviews with two college teachers and two students who had online learning experience. They reviewed the initial item list and recommended (1) using simple but unambiguous language and short sentences, and (2) adding some possibly omitted items to cover a broader scope of the variables. Altogether, 26 preselected questions covered these dimensions.


The scale was divided into five dimensions: self-directed learning, learner control, motivation for learning, computer/Internet self-efficacy, and online communication self-efficacy (see the Appendix for the statements in each dimension). In the first part, "self-directed learning" centered on learners' taking responsibility for the learning context to reach their learning objectives, as described by Garrison (1997). The concept of "learner control" centered on online learners' control over their learning (control that manifested itself as repeating or skipping some content) and on efforts by online learners to direct their own learning with maximum freedom. The concept of "motivation for learning" centers on online learners' learning attitudes, and the concept of "computer/Internet self-efficacy" is about online learners' ability to demonstrate proper computer and Internet skills. The final concept is "online communication self-efficacy," which describes learners' adaptability to the online setting through questioning, responding, commenting, and discussing.

4. Results

4.1. Model testing results

We used confirmatory factor analysis (CFA) to evaluate the hypothetical model of this study. The results for the initial measurement model, as shown in Table 1, indicate poor model fit. An examination of the LISREL output indicates that several items had large standardized residuals (greater than 3.0). Following established data-analysis practices (MacCallum, 1986), we deleted problematic items and reevaluated the measurement model. As a result, 8 of the 26 items were removed from the analysis. To assess the influence of item deletion on content validity, we examined the items that remained for each construct. Content validity appeared to be adequate because our measurement of each construct rested on at least three items.

The final model chi-square = 451.18 (p < 0.001) indicates a poor fit. However, some problems exist when relying solely on the chi-square statistic, since chi-square has been shown to be sensitive to sample size. Owing to the large sample size of this study (n = 1051), we further discuss other indices to evaluate model fit.

An adjunct discrepancy-based fit index is the ratio of chi-square to degrees of freedom (χ²/df). A χ²/df ratio of less than 5 indicates an acceptable fit between a hypothesized model and sample data (MacCallum, Browne, & Sugawara, 1996). In the present study, the revised measurement model yielded χ²/df = 3.61, indicating that the proposed model may have an acceptable fit. As can be seen in Table 1, the other indices for model-fit evaluation were RMSEA = 0.050, SRMR = 0.043, GFI = 0.95, and CFI = 0.99. RMSEA values less than 0.05 are indicative of a close fit, values ranging from 0.05 to 0.08 are indicative of a reasonable fit, and values of 0.09 or above are indicative of a poor fit (Browne & Cudeck, 1993; MacCallum et al., 1996). To conclude, the revised measurement model exhibits both a good fit and adequate psychometric properties.

Table 1
Model fit measurement statistics.

Model     χ²        df    χ²/df (<3.0)ᵃ   RMSEA (<0.08)ᵃ   SRMR (<0.05)ᵃ   GFI (>0.90)ᵃ   CFI (>0.90)ᵃ
Initial   4558.27   289   15.772          0.119            0.150           0.75           0.94
Revised   451.18    125   3.609           0.050            0.043           0.95           0.99

ᵃ Represents the range indicating acceptable fit.

Fig. 1. Results of the confirmatory factor analysis: pattern coefficients for online learner readiness. Note. CIS: computer/Internet self-efficacy; SDL: self-directed learning (in an online context); LC: learner control (in an online context); MFL: motivation for learning (in an online context); OCS: online communication self-efficacy.

As shown in Fig. 1, each item has a substantial loading between 0.55 and 0.85 on the five factors, and each loading was statistically significant. Thus, results constitute evidence pointing toward the construct validity of the instrument for learner readiness in an online learning format.
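The authors fitted and revised the measurement model in LISREL. As an illustration only, the sketch below shows how a five-factor CFA of this kind could be specified with lavaan-style syntax in Python's semopy package; the data frame, the item column names (cis1 ... ocs3), and the choice of package are assumptions made for the sketch, not part of the original analysis.

```python
# Hypothetical sketch of a five-factor CFA for the retained OLRS items using semopy.
# (The authors used LISREL; item/column names here are illustrative only.)
import pandas as pd
import semopy

MODEL_DESC = """
CIS =~ cis1 + cis2 + cis3
SDL =~ sdl1 + sdl2 + sdl3 + sdl4 + sdl5
LC  =~ lc1 + lc2 + lc3
MFL =~ mfl1 + mfl2 + mfl3 + mfl4
OCS =~ ocs1 + ocs2 + ocs3
"""

def fit_olrs_cfa(responses: pd.DataFrame):
    """Fit the hypothesized five-factor model to item-level responses
    (one row per student, one column per retained OLRS item)."""
    model = semopy.Model(MODEL_DESC)
    model.fit(responses)
    estimates = model.inspect()           # factor loadings and (residual) variances
    fit_stats = semopy.calc_stats(model)  # chi-square, df, CFI, GFI, RMSEA, SRMR, ...
    return estimates, fit_stats
```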

4.2. Validity and reliability

We evaluated the OLRS measurement model by examining the composite reliability and the convergent and discriminant validities. Studies have suggested that 0.7 is an acceptable value for a reliable construct (Fornell & Larcker, 1981). The values of composite reliability for the five subscales given in Table 2 were acceptable.
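As a minimal sketch (not the authors' code), the composite reliability and AVE values of the kind reported in Table 2, and the square-root-of-AVE diagonal used in Table 3, can be computed from standardized factor loadings as follows; the example loadings below are placeholders, not the published estimates.

```python
# Hypothetical helpers: composite reliability (CR) and average variance
# extracted (AVE) from standardized factor loadings (cf. Fornell & Larcker, 1981).
import numpy as np

def composite_reliability(loadings) -> float:
    loadings = np.asarray(loadings, dtype=float)
    error_var = 1.0 - loadings ** 2                  # item error variances
    return loadings.sum() ** 2 / (loadings.sum() ** 2 + error_var.sum())

def average_variance_extracted(loadings) -> float:
    loadings = np.asarray(loadings, dtype=float)
    return float(np.mean(loadings ** 2))

# Placeholder loadings for a three-item subscale (illustrative values only).
example = [0.72, 0.68, 0.66]
print(composite_reliability(example))                 # CR; should exceed 0.70
print(average_variance_extracted(example))            # AVE; should exceed 0.50
print(np.sqrt(average_variance_extracted(example)))   # square root of AVE (Table 3 diagonal)
```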

The factor loadings from the CFA provide evidence for convergent validity, as all items load sufficiently high on the corresponding constructs. We also evaluated convergent validity by using average variance extracted (AVE), which should exceed 0.50 (Fornell & Larcker, 1981). As indicated in Table 2, all indicator factor loadings exceed the threshold value of 0.50 suggested by Peterson (2000). AVE ranged from 0.486 to 0.686. Two constructs, computer/Internet self-efficacy and learner control, were slightly below 0.50. For discriminant validity, the square root of the AVE of each construct should be greater than the correlation shared between the construct and other constructs in the model and should be at least 0.50 (Fornell & Larcker, 1981). Table 3 displays the correlations among constructs, with the square root of the AVE on the diagonal. All constructs satisfactorily pass the test, as the square root of the AVE (on the diagonal) is larger than the cross-correlations with other constructs. The convergent and discriminant validities of the constructs of the OLRS model are thus acceptable.

4.3. Difference among students' scores of five readiness dimensions on the OLRS

Table 4 presents students' mean scores and standard deviations on the five subscales. To calculate each student's mean score for every factor (dimension), we identified the sum of the answers to each item in that factor, and then divided the sum by the number of that factor's items. As Table 4 indicates, all students' average scores relative to the different dimensions range from 3.60 to 4.37 on a 5-point Likert-type rating scale, indicating that on average these online learners exhibited above-medium levels of readiness toward online learning. In order to investigate the differences among the five factors (dimensions) of the OLRS, we conducted a multivariate, repeated one-way ANOVA. By comparing the means of those five dimensions, the higher the mean score, the more online learning readiness the self-evaluating students assigned to themselves. The comparisons of these five mean scores can indicate the rank of students' readiness on the five dimensions. The results show that Hotelling's Trace was significant (F = 237.323, p < 0.001). A post hoc test further revealed that the mean score of factor MFL (motivation for learning) was greater than the mean scores of factors SDL, OCS, and LC; that the mean score of factor CIS (computer/Internet self-efficacy) was greater than the other four factors' mean scores; that the mean score of factor OCS was greater than the mean scores of factors SDL and LC; and that the mean score of factor SDL was greater than the mean score of factor LC. Table 4 shows the results of a multivariate, repeated, one-way ANOVA and of a post hoc test of the OLRS factors.
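A minimal scoring sketch of the procedure just described, assuming hypothetical item column names, is given below; each dimension score is simply the mean of that dimension's 1-5 Likert items.

```python
# Hypothetical scoring sketch: a student's score on a dimension is the mean of
# that dimension's retained items (column names are illustrative only).
import pandas as pd

ITEM_MAP = {
    "CIS": ["cis1", "cis2", "cis3"],
    "SDL": ["sdl1", "sdl2", "sdl3", "sdl4", "sdl5"],
    "LC":  ["lc1", "lc2", "lc3"],
    "MFL": ["mfl1", "mfl2", "mfl3", "mfl4"],
    "OCS": ["ocs1", "ocs2", "ocs3"],
}

def dimension_scores(responses: pd.DataFrame) -> pd.DataFrame:
    """Return one column per OLRS dimension, holding each student's mean
    of the 1-5 Likert responses to that dimension's items."""
    return pd.DataFrame(
        {dim: responses[items].mean(axis=1) for dim, items in ITEM_MAP.items()}
    )

# scores = dimension_scores(item_level_df)
# scores.agg(["mean", "std"])   # the kind of per-dimension summary shown in Table 4
```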

4.4. Gender difference in online learning readiness

To test for gender differences in the OLRS constructs, we ran a Multivariate Analysis of Variance (MANOVA) that revealed no significant difference between male and female students, as shown in Table 5.
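A hedged sketch of how such a MANOVA could be run on the five dimension scores with statsmodels follows; the package choice and the column names (including the gender factor) are assumptions for illustration, not the authors' analysis pipeline.

```python
# Hypothetical sketch of a one-way MANOVA testing gender differences on the
# five OLRS dimension scores (column names are illustrative only).
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

def gender_manova(scores: pd.DataFrame) -> None:
    """`scores` holds columns CIS, SDL, LC, MFL, OCS plus a categorical 'gender' column."""
    maov = MANOVA.from_formula("CIS + SDL + LC + MFL + OCS ~ gender", data=scores)
    print(maov.mv_test())  # reports Wilks' lambda, Hotelling-Lawley trace, etc.
```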

4.5. Grade difference in online learning readiness

This study furthermore analyzed the relationships between students' grade (i.e., level of accumulated academic credits) and the OLRS dimensions. This study divided the sample students into three groups according to student grade: (1) freshmen and sophomores, (2) juniors, and (3) seniors. The freshman and sophomore students were placed in the same group because there were relatively few students even when these two groupings were combined (about 7.8% were freshman and sophomore students, 30.5% junior students, and 61.7% senior students).

Table 2
Reliability, AVE, and CR of the confirmatory factor analysis.

Measures                              Items   Composite reliability   Average variance extracted (AVE)
Computer/Internet self-efficacy       3       0.736                   0.486
Self-directed learning                5       0.871                   0.577
Learner control                       3       0.727                   0.477
Motivation for learning               4       0.843                   0.573
Online communication self-efficacy    3       0.867                   0.686

Table 3
Correlations among constructs (square root of AVE in diagonal).

                                          CIS     SDL     LC      MFL     OCS
Computer/Internet self-efficacy (CIS)     0.697
Self-directed learning (SDL)              0.087   0.760
Learner control (LC)                      0.052   0.661   0.691
Motivation for learning (MFL)             0.266   0.572   0.511   0.757
Online communication self-efficacy (OCS)  0.151   0.459   0.412   0.621   0.828

Note: diagonal elements represent the square root of the average variance extracted (AVE). Off-diagonal elements represent the correlations among constructs. For discriminant validity, diagonal elements should be larger than off-diagonal elements.


The MANOVA tests revealed that students' grade levels made significant differences in the OLRS (F = 4.519, p < 0.001; Wilks' Lambda = 0.958; partial eta squared = 0.021). As shown in Table 6, a follow-up analysis showed that grade level made significant differences in the mean scores of self-directed learning (F = 14.23, p < 0.01), learner control (F = 13.44, p < 0.01), online communication self-efficacy (F = 8.59, p < 0.01), and motivation for learning (F = 4.39, p < 0.05).

A multiple-comparisons analysis revealed that seniors rated self-directed learning (Scheffe's post hoc analysis, p < 0.01) significantly higher than freshmen, sophomores, and juniors. Seniors rated learner control (Scheffe's post hoc analysis, p < 0.01) significantly higher than juniors, freshmen, and sophomores. Juniors and seniors scored higher than freshmen and sophomores with respect to online communication self-efficacy (Scheffe's post hoc analysis, p < 0.01). In addition, seniors rated motivation for learning (Scheffe's post hoc analysis, p < 0.05) significantly higher than freshmen and sophomores. Students of different grade levels did not express any significant difference regarding their readiness in the computer/Internet self-efficacy dimension.
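As an illustration under assumed column names, the follow-up analysis of grade-level differences could be sketched as a one-way ANOVA per dimension; the Scheffe post hoc comparisons reported above are not reproduced in this sketch.

```python
# Hypothetical follow-up analysis: one one-way ANOVA per OLRS dimension across
# the three grade groups. (The Scheffe post hoc comparisons reported in the
# paper are not reproduced here.)
import pandas as pd
from scipy.stats import f_oneway

DIMENSIONS = ["CIS", "SDL", "LC", "MFL", "OCS"]

def grade_followup_anovas(scores: pd.DataFrame) -> pd.DataFrame:
    """`scores` holds the five dimension columns plus a 'grade_group' column
    with values such as 'fresh_soph', 'junior', 'senior' (names are assumptions)."""
    rows = []
    for dim in DIMENSIONS:
        groups = [grp[dim].to_numpy() for _, grp in scores.groupby("grade_group")]
        f_stat, p_value = f_oneway(*groups)
        rows.append({"dimension": dim, "F": f_stat, "p": p_value})
    return pd.DataFrame(rows)
```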

5. Discussion

5.1. Dimensions of college students’ readiness for online learning

This study presents a conceptual framework for understanding learner readiness in online learning settings and analyzes the validity and the reliability of an instrument, the OLRS, that can facilitate research in this area. Learning attitude, style, and ability have historically provided students and instructors with academic support. However, with the advent and growing popularity of e-learning, it is important to reconsider students' intention and characteristics in online learning environments. This study focuses on addressing preliminary psychometric properties (internal consistency and construct validity) and on confirming the factor structure of the scale by using 1051 college or university students enrolled in at least one of five different online courses, as well as by examining the relationships among the factors.

The confirmatory factor analysis of the OLRS supported the five-dimension (factor) model: self-directed learning, motivation for learning, computer/Internet self-efficacy, online communication self-efficacy, and learner control. All constructs display adequate reliability and discriminant validity. Composite reliability for the five subscales all met the recommended minimum of 0.70 (Fornell & Larcker, 1981). All factor loadings were significant (p < 0.001), indicating that each item was well represented by the factors and that all constructs share more variance with their indicators than with other constructs. Thus, the OLRS was found to be a valid measure of online learner attitude and behavior. These findings provide evidence that although the scale is multidimensional, being composed of five different dimensions, the items reflect, at a more general level, the overall online learner readiness construct.

A comparison of the present study with previous related studies reveals that learners' readiness is indeed an important issue in online learning settings. The OLRS provided by the present study seems more comprehensive than the Readiness for Online Learning questionnaire provided by Smith (2005) and Smith et al. (2003). The current study constructs more dimensions as well as more items that cover the scopes of online learners' attitudes and behaviors. The OLRS instrument can be characterized as containing both general dimensions (e.g., motivation for learning and computer/Internet self-efficacy) and specific dimensions (e.g., online communication self-efficacy). The instrument in this study has sufficient merits to justify further research in the area.

5.2. Students’ readiness scores of five dimensions on the OLRS

Research question 2 concerns college students' readiness for online learning. In this study, students' mean scores in the five dimensions are all higher than the theoretical mean of 3, ranging from 3.60 to 4.37 on a 5-point Likert scale. This finding means that the current study's sample of college students has the highest readiness in the dimension of computer/Internet self-efficacy, followed by motivation for learning and online communication self-efficacy, and the lowest readiness in the dimensions of learner control and self-directed learning. From the above results, we found that college students nowadays may be relatively confident in their computer/network skills (such as managing software, searching for online information, and performing basic software functions), which are requisite for online learning, and thus, the students would be ready to take online courses from these perspectives.

Table 4
Results of the multivariate one-way ANOVA and of the post hoc test.

Dimension                                  Mean   SD      F value (Hotelling's Trace)   Summary of significant differences in post hoc test
CIS: Computer/Internet self-efficacy       4.37   0.602   237.323***                    CIS > MFL > OCS > SDL > LC
SDL: Self-directed learning                3.75   0.654
LC: Learner control                        3.60   0.715
MFL: Motivation for learning               4.01   0.593
OCS: Online communication self-efficacy    3.93   0.673

***p < 0.001.

Table 5
Descriptive statistics and F test of gender on OLRS dimensions.

                                      Male             Female
Dimension                             M       SD       M       SD       F       p       Partial eta squared
Computer/Internet self-efficacy       4.365   0.634    4.366   0.577    0.001   0.973   0.000
Self-directed learning                3.754   0.674    3.743   0.638    0.074   0.785   0.000
Learner control                       3.608   0.759    3.586   0.680    0.241   0.623   0.000
Motivation for learning               4.028   0.631    3.995   0.561    0.788   0.375   0.001
Online communication self-efficacy    3.951   0.703    3.909   0.650    0.980   0.322   0.001


Of course, there exist individual differences that create a need for teachers' special guidance or training relative to online learning (Tsai & Tsai, 2003). For example, online orientation for courses can (1) bring technical training and support to online students so that they can become familiar with the functions of the learning system and (2) reduce future encounters with possible technical difficulties.

In the dimension of motivation for learning, the sampled students who took online courses demonstrated that, already, they had opened themselves to new ideas (online learning), had learned from mistakes, and were willing to share ideas with others. This overall finding is consistent with the finding from Saadé et al. (2007) that motivation may play an important role in online learning. In addition, sampled students in general seem to be somewhat confident in their online communication styles. Online learners' interactions occur mostly through an online threaded discussion that allows not only students and students but also students and instructors to interact in asynchronous ways. It is obvious that students who have better online communication self-efficacy feel relatively comfortable in expressing themselves in writing (McVay, 2000, 2001; Salaberry, 2000; Roper, 2007).

In this study's five readiness dimensions, students' ratings yielded significantly lower mean scores for learner control and self-directed learning than for the other three dimensions. Indeed, most online learners take an online course owing to its convenience and flexibility (Chizmar & Walbert, 1999; Poole, 2000). Thus, it is important that online learners have the ability to develop time-management skills. They should possess self-discipline to devote adequate time to the course, to post discussion-related messages, and to submit their work on time (Discenza et al., 2002; Hill, 2002; Roper, 2007). Furthermore, since the online course is not like the traditional course with face-to-face instructor guidance, learners are easily distracted by other things around them, such as online games and instant messages. In short, it is important for online learners to have self-directed learning ability (Garrison, 1997).

5.3. Gender and grade differences in college students’ readiness

Does gender of college students make any difference in their readiness for online learning, as the third research question asks? The results of this study show no gender differences. This finding means that male and female students had similar levels in all readiness dimensions: they exhibited equal attitudes and behaviors on self-directed learning, motivation for learning, computer/Internet self-efficacy, online communication self-efficacy, and learner control. The above findings are similar to the findings of Bunz, Curry, and Voon (2007) that there appeared to be no gender difference in computer competency, and to the findings of Masters and Oberprieler (2004) that men and women exhibited equal participation in web-based learning environments. However, the current study's findings differ from those in Caspi, Chajut, and Saporta (2008), where women seemed to prefer written communication more than men did. In a study by Kay (2009), male students' perception that interactive classroom communications systems improved the overall learning process was stronger than female students' corresponding perception, regardless of computer-comfort level or computer-use type. In this study, male and female students demonstrated an equal readiness tendency for online learning. It may be possible that male and female students have similar attitudes and beliefs toward their pursuit of academic studies online. Thus, the results show there is no gender difference in college students' online learning readiness.

Grade levels seem to make differences in students' readiness for online courses. For the current study, we further conducted a series of post hoc tests (Scheffe tests) to examine the relationship between grades and OLRS dimensions, the end goal being to answer the fourth research question. The first finding is that seniors exhibited significantly greater readiness in the dimensions of self-directed learning and learner control than did all lower-grade students, and in the dimension of motivation for learning than did freshmen and sophomores. This finding means that students' maturity may play an important role in their monitoring, managing, control, and motivation relative to online learning. The second finding is that juniors and seniors exhibited significantly greater readiness in the dimension of online communication self-efficacy than did freshmen and sophomores. This means that higher-grade students were perhaps more accustomed to communicating with their teachers and peers through computer-mediated communication in the e-learning context. The third finding is that, in terms of computer/Internet self-efficacy, all students demonstrated an equal degree of readiness. The possible reason is that they were confident in their computer and Internet-related skills and knowledge when taking an online course.

This finding is congruent with Wojciechowski and Palmer's (2005) finding that students who were older had few previous course withdrawals and were more likely to be successful in online classes. Thus, Wojciechowski and Palmer's (2005) study and the current one lead to the tentative assertion that relatively mature college students possess greater readiness for enrollment in online courses and would exhibit, in such an environment, a better learning performance than students of less pronounced maturity.

6. Implications

The results of this study reveal that two readiness dimensions need special attention: learner control and self-directed learning. Teachers may need to help students develop self-directed learning and learner-control skills and attitudes, especially for online learning contexts. For example, teachers may need to improve the clarity of their syllabus and course structure before students can direct themselves toward taking full control of their own learning.

Table 6
Descriptive statistics and F test of grade level on OLRS dimensions.

                                      Mean (SD)
Dimension                             (1) Freshman/Sophomore   (2) Junior     (3) Senior     F        Partial eta squared   Post hoc analysis
Computer/Internet self-efficacy       4.30 (0.56)              4.35 (0.60)    4.38 (0.61)    0.78     0.002
Self-directed learning                3.52 (0.65)              3.64 (0.66)    3.83 (0.64)    14.23**  0.027                 (3) > (1), (2)
Learner control                       3.37 (0.78)              3.48 (0.70)    3.68 (0.70)    13.44**  0.025                 (3) > (2) > (1)
Motivation for learning               3.86 (0.56)              3.97 (0.59)    4.04 (0.60)    4.39*    0.008                 (3) > (1)
Online communication self-efficacy    3.63 (0.64)              3.95 (0.68)    3.96 (0.67)    8.59**   0.016                 (2) > (1), (3) > (1)

*p < 0.05; **p < 0.01.


Thus, teachers can help students to establish their own time- and information-management skills and can ensure adequate time for class participation. Teachers should design activities to pull the students in: for example, encouraging students to share real-life experiences and to vote on or comment on issues pertaining to the online courses. When dealing with students who possess relatively low learner control, teachers can provide the students with a pre-test clarifying the entry level of the students' ability and then can instruct students to control both the learning content and the learning process in ways that meet the students' individual learning needs: for example, teachers can instruct students to repeat what they do not understand. Teachers can create a learning community through which group discussions, experience sharing, instant feedback, and so on can keep the students interested in the course. Teachers can, if possible, send an email or make phone calls to relatively passive students to ask them the reason for the passivity and to draw them back into the course.

In addition, teachers of online courses need to encourage students, especially those with lower self-efficacy in online communication, to participate more extensively in the discussions, to bravely express their thoughts, to form better friendships, and to seek assistance when facing problems online. Since motivation is one of the important factors in online learning, teachers should help students stay motivated in online learning. For example, teachers can provide students with an appropriate induction into the world of online learning by having students get to know their teacher or peers through online tools or by responding promptly and positively to students’ inquiries again through online tools. If students seem to face problems or feel discouraged during the process of online learning, quick and supportive intervention and assistance are necessary for the preservation of students’ motivation.

Do the above findings also mean that younger college students would do well to avoid online courses? It is indeed tough for freshmen to adjust their high school learning patterns to college ones, and even tougher to make the adjustment from their high school classrooms to virtual college classrooms. Nevertheless, in college, students should take more responsibility for their own learning experiences. For example, students should figure out how to best divide up their time for readings and assignments. Thus, if students in their first year are unable to schedule themselves independently, discipline themselves to work without direct (face-to-face) guidance from instructors, or spend hours studying learning materials presented on a computer screen, a principled suggestion is that the students not take online courses. From another perspective, if the online course is open to freshmen, the design of the course should be well-organized. For example, students may receive frequent reminders about the deadlines, requirements, and tests (through email or instant messages over cell phones), and the students should be especially encouraged to seek out assistance from teachers, TAs, or other academic advisors during the learning process.

7. Limitations and recommendations for future research

This study revealed several limitations that should be addressed in future research. First, the average variance extracted (AVE) for computer/Internet self-efficacy and learner control is below 0.50, which does not show a satisfactory convergent validity. For these two dimensions, the indicators need to be revised or new relevant indicators need to be added in further studies. Second, this study used 1051 students from five different courses, with almost two-thirds of the sample from the course of life chemistry. Because this study aims to develop an instrument (the OLRS) for all college students, this study did not probe into the learner-readiness differences relative to these five courses. However, in order to examine the usefulness of the OLRS for all academic disciplines, students from diverse colleges and courses may be involved in future research. Third, because of its exploratory nature, this study did not check OLRS criterion-related validity; that is, we did not collect students' data on the OLRS and other similar scales concurrently. Future research may focus on the correlation of the OLRS and other similar scales for more concurrent evidence of validity. In addition, future research may address the test-retest reliability of the OLRS.

This study's new readiness concept is a relatively comprehensive one whose valid and reliable measurement (the OLRS) can strengthen future investigation into students' readiness for online learning. Indeed, this study's readiness scale not only presents distinct dimensions but also extends the scope of previous similar studies. The OLRS provides academic faculty with a framework for understanding both the students' learning styles most conducive to online learning and the communication and technical expertise needed to complete online coursework. Future research should provide evidence as to whether the OLRS can effectively predict student performance, or whether there is a positive correlation between OLRS scores and academic performance in online courses. Specifically, we suggest future studies on the relationship between the readiness of self-directed learning and course topics in the online learning context. The development of the OLRS allows teachers to reconsider their instructional design for online students, especially at different grade levels. It is expected that this scale can help online course teachers to assess either the readiness of individual students or the readiness of groups of students regarding online courses, and to design better courses for maximizing students' online learning experiences.

Appendix.

OLRS dimensions and items

Item no.   Dimension/items

Computer/Internet self-efficacy
CIS1   I feel confident in performing the basic functions of Microsoft Office programs (MS Word, MS Excel, and MS PowerPoint).
CIS2   I feel confident in my knowledge and skills of how to manage software for online learning.
CIS3   I feel confident in using the Internet (Google, Yahoo) to find or gather information for online learning.

Self-directed learning
SDL1   I carry out my own study plan.
SDL2   I seek assistance when facing learning problems.
SDL3   I manage time well.
SDL4   I set up my learning goals.
SDL5   I have higher expectations for my learning performance.


References

Bandura, A. (1977). Self-efficacy: toward a unifying theory of behavioral change. Psychological Review, 84, 191–215.
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice-Hall.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W.H. Freeman.

Browne, M. W., & Cudeck, R. (1993). Alternative ways of assessing model fit. In K. A. Bollen, & J. S. Long (Eds.), Testing structural equation models (pp. 136–162). Newbury Park, CA: Sage Publications.

Bunz, U., Curry, C., & Voon, W. (2007). Perceived versus actual computer-email-web fluency. Computers in Human Behavior, 23(5), 2321–2344.
Candy, P. C. (1991). Self-direction for lifelong learning: A comprehensive guide to theory and practice. San Francisco: Jossey-Bass.

Caspi, A., Chajut, E., & Saporta, K. (2008). Participation in class and in online discussions: gender differences. Computers & Education, 50(3), 718–724.

Chizmar, J. F., & Walbert, M. S. (1999). Web-based learning environments guided by principles of good teaching practice. Journal of Economic Education, 30(3), 248–264.
Compeau, D. R., & Higgins, C. A. (1995). Computer self-efficacy: development of a measure and initial test. MIS Quarterly, 19(2), 189–211.

Czubaj, C. A. (2004). Literature review: reported educator concerns regarding cyberspace curricula. Education, 124(4), 676–683.
Deci, E., & Ryan, R. (1985). Intrinsic motivation and self-determination in human behavior. New York: Plenum Press.

Discenza, R., Howard, C., & Schenk, K. (2002). The design & management of effective distance learning programs. Hershey, PA: Idea Group Publishing.

Eastin, M. A., & LaRose, R. (2000). Internet self-efficacy and the psychology of the digital divide. Journal of Computer Mediated Communication, 6(1). Retrieved September 2000, from http://jcmc.indiana.edu/vol6/issue1/eastin.html

Evans, T. (2000). Flexible delivery and flexible learning: developing flexible learners? In V. Jakupec, & J. Garrick (Eds.), Flexible learning, human resource and organizational development (pp. 211–224). London: Routledge.

Fairchild, A. J., Jeanne Horst, S., Finney, S. J., & Barron, K. E. (2005). Evaluating existing and new validity evidence for the academic motivation scale. Contemporary Educational Psychology, 30(3), 331–358.

Federico, P. (2000). Learning styles and student attitudes toward various aspects of network-based instruction. Computers in Human Behavior, 16(4), 359–379.
Fornell, C., & Larcker, D. F. (1981). Structural equation models with unobservable variables and measurement errors. Journal of Marketing Research, 18(2), 39–50.
Garrison, D. R. (1997). Self-directed learning: toward a comprehensive model. Adult Education Quarterly, 48(1), 18–33.

Garrison, D. R., Cleveland-Innes, M., & Fung, T. (2004). Student role adjustment in online communities of inquiry: model and instrument validation. Journal of Asynchronous Learning Networks, 8(2), 61–74.

Guglielmino, L. M. (1977). Development of the self-directed learning readiness scale. Unpublished doctoral dissertation. Athens, GA: The University of Georgia.

Hannafin, M. J. (1984). Guidelines for using locus of instructional control in the design of computer-assisted instruction. Journal of Instructional Development, 7(3), 6–10. Hartley, K., & Bendixen, L. D. (2001). Educational research in the Internet age: examining the role of individual characteristics. Educational Researcher, 30(9), 22–26. Hew, K. F., & Cheung, W. S. (2008). Attracting student participation in asynchronous online discussion: a case study of peer facilitation. Computers & Education, 51(3), 1112–

1124.

Hill, J. R. (2002). Overcoming obstacles and creating connections: community building in web-based learning environments. Journal of Computing in Higher Education, 14(1), 67–86.

Hsu, Y. C., & Shiue, Y. M. (2005). The effect of self-directed learning readiness on achievement comparing face-to-face and two-way distance learning instruction. International Journal of Instructional Media, 32(2), 143–156.

Jonassen, D. H. (1986). Hypertext principles for text and courseware design. Educational Psychologist, 21(4), 269–292.

Kay, R. H. (2009). Examining gender differences in attitudes toward interactive classroom communications systems (ICCS). Computers & Education, 52(4), 730–740. Knowles, M. S. (1975). Self-directed learning: A guide for learners and teachers. New York: Association Press.

Lepper, M. (1989). Children and computers. American Psychologist, 44(2), 170–178.

Lepper, M., & Cordova, D. (1992). A desire to be taught: instructional consequences of intrinsic motivation. Motivation and Emotion, 16(3), 187–208. Lin, B., & Hsieh, C. T. (2001). Web-based teaching and learner control: a research review. Computers & Education, 37(4), 377–386.

MacCallum, R. C. (1986). Specification searches in covariance structure modeling. Psychological Bulletin, 101, 107–120.

MacCallum, R. C., Browne, M. W., & Sugarwara, H. M. (1996). Power analysis and determination of sample size for covariance structure modeling. Psychological Methods, 1(2), 130–149.

Masters, K., & Oberprieler, G. (2004). Encouraging equitable online participation through curriculum articulation. Computers & Education, 42(4), 319–332.

McVay, M. (2000). Developing a web-based distance student orientation to enhance student success in an online bachelor’s degree completion program. Unpublished practicum report presented to the Ed.D. Program. Florida: Nova Southeastern University.

McVay, M. (2001). How to be a successful distance learning student: Learning on the Internet. New York: Prentice Hall.

Merrill, M. (1983). Component display theory. In C. Reigeluth (Ed.), Instructional-design theories and models: An overview of their status (pp. 279–334). Hillsdale, NJ: Lawrence Erlbaum Associates.

Merrill, M. D. (1984). What is learner control? In R. K. Bass, & C. D. Dills (Eds.), Instructional development: The state of the art II (pp. 221–242) Dubuque, IA: Kendall Hunt Pub Co. Palloff, R. M., & Pratt, K. (1999). Building learning communities in cyberspace: Effective strategies for the online classroom. San Francisco: Jossey-Bass.

Peng, H., Tsai, C. C., & Wu, Y. T. (2006). University students’ self-efficacy and their attitudes toward the Internet: the role of students’ perceptions of the Internet. Educational Studies, 32(1), 73–86.

Peterson, R. (2000). A meta-analysis of variance accounted for and factor loadings in exploratory factor analysis. Marketing Letters, 11, 261–275. Pintrich, P. R., & Schunk, D. H. (2002). Motivation in education: Theory, research, and applications (2nd ed.). Upper Saddle River, NJ: Merrill/Prentice Hall. Poole, D. M. (2000). Student participation in a discussion-oriented online course: a case study. Journal of Research on Computing in Education, 33(2), 162–177. Reeves, T. C. (1993). Pseudoscience in computer-based instruction: the case of lecturer control research. Journal of Computer-based Instruction, 20(2), 39–46.

Reigeluth, C. M., & Stein, F. S. (1983). The elaboration theory of instruction. In C. M. Reigeluth (Ed.), Instructional-design theories and models: An overview of their current status, Vol. 1. Hillsdale, NJ: Lawrence Erlbaum Associates.

Roper, A. R. (2007). How students develop online learning skills. Educause Quarterly, 30(1), 62–64.

Ryan, R. M., & Deci, E. L. (2000). Intrinsic and extrinsic motivations: classic definitions and new directions. Contemporary Educational Psychology, 25(1), 54–67. Saadé, R. G., He, X., & Kira, D. (2007). Exploring dimensions to online learning. Computers in Human Behavior, 23(4), 1721–1739.


Salaberry, M. R. (2000). Pedagogical design of computer mediated communication tasks: learning objectives and technological capabilities. Modern Language Journal, 84(1), 28–37.
Shyu, H. Y., & Brown, S. W. (1992). Learner control versus program control in interactive videodisc instruction: what are the effects in procedural learning? International Journal of Instructional Media, 19(2), 85–95.
Smith, P. J. (2000). Preparedness for flexible delivery among vocational learners. Distance Education, 21(1), 29–48.
Smith, P. J. (2005). Learning preferences and readiness for online learning. Educational Psychology, 25(1), 3–12.
Smith, P. J., Murphy, K. L., & Mahoney, S. E. (2003). Towards identifying factors underlying readiness for online learning: an exploratory study. Distance Education, 24(1), 57–67.
Stansfield, M., McLellan, E., & Connolly, T. M. (2004). Enhancing student performance in online learning and traditional face-to-face class delivery. Journal of Information Technology Education, 3, 173–188.
Stefanou, C., & Salisbury-Glennon, J. (2002). Developing motivation and cognitive learning strategies through an undergraduate learning community. Learning Environments Research, 5(1), 77–97.
Tsai, C.-C., & Lin, C.-C. (2004). Taiwanese adolescents’ perceptions and attitudes regarding the Internet: exploring gender differences. Adolescence, 39, 725–734.
Tsai, M. J., & Tsai, C. C. (2003). Information searching strategies in web-based science learning: the role of Internet self-efficacy. Innovations in Education and Teaching International, 40(1), 43–50.
Wang, L.-C. C., & Beasley, W. (2002). Effects of learner control and hypermedia preference on cyber-students’ performance in a web-based learning environment. Journal of Educational Multimedia and Hypermedia, 11(1), 71–91.
Warner, D., Christie, G., & Choy, S. (1998). Readiness of VET clients for flexible delivery including on-line learning. Brisbane: Australian National Training Authority.
Wojciechowski, A., & Palmer, L. B. (2005). Individual student characteristics: can any be predictors of success in online classes? Online Journal of Distance Learning Administration, 8(2). Retrieved June 8, 2008, from http://www.westga.edu/wdistance/ojdla/summer82/wojciechowski82.htm
Yang, C. C., Tsai, I. C., Kim, B., Cho, M.-H., & Laffey, J. M. (2006). Exploring the relationships between students’ academic motivation and social ability in online learning environments. The Internet and Higher Education, 9(4), 277–286.
