
Influence of polling technologies on student engagement: An analysis of student motivation, academic performance, and brainwave data

Jerry Chih-Yuan Sun*

Institute of Education, National Chiao Tung University, 1001 Ta-Hsueh Road, Hsinchu, Taiwan, ROC

* Tel.: +886 3 5131242; fax: +886 3 5738083. E-mail address: csun@nctu.edu.tw.

Article info

Article history:

Received 29 July 2013
Received in revised form 21 October 2013
Accepted 24 October 2013

Keywords:

Improving classroom teaching
Interactive learning environments
Teaching/learning strategies

Abstract

This study compared clicker technology against mobile polling and the Just-in-Time Teaching (JiTT) strategy to investigate how these methods may differently affect students' anxiety, self-efficacy, engagement, academic performance, and attention and relaxation as indicated by brainwave activity. The study utilized a quasi-experimental research design. To assess the differences between the effects of clickers and mobile polling, the study collected data from two courses at a large research university in Taiwan in which 69 students used either clickers or mobile polling. The results showed that mobile polling along with the JiTT strategy and in-class polls reduce graduate students' anxiety, improve student outcomes in an environment comprising both graduate and undergraduate students, and increase students' attention during polling. However, brainwave data revealed that during the polling activities, students' attention in the clicker and mobile polling groups respectively increased and decreased. Students nowadays do not find smartphones a novelty; however, incorporating them into class is still a potentially effective way to increase student attention and provide a direct way for instructors to observe the learning effects of lectures and improve their teaching approach on that basis.

© 2013 Elsevier Ltd. All rights reserved.

1. Introduction

1.1. Overview of polling technologies

One of the main problems with the traditional lecture format is that under it students tend to have a low level of engagement, with the consequence that their learning may suffer. One way for instructors to better engage students is through class discussions or activities employing anonymous responses. However, using a more traditional hand-raising or response card method of response, Kennedy and Cutts (2005) and Stowell and Nelson (2007) found that participants were either reluctant to respond to a question posed to the class until others had responded or were apt to conform to the majority response. In contrast, several studies have shown an increase in student engagement in classes incorporating electronic feedback devices (Bode, Drane, Kolikant, & Schuller, 2009; Lasry, 2008; Stowell & Nelson, 2007; Trees & Jackson, 2007).

Over the past five years, technology has come to be used in lecture halls to address the issue of low student engagement (Delialioglu, 2012; Koenig, 2010; Mason, 2011; Middlebrook & Sun, 2013; Sun & Rueda, 2012; Walsh, Sun, & Riconscente, 2011). The use of electronic feedback devices (also known as electronic voting systems or clickers, which is the term used herein) for polling purposes is becoming more common in academic settings, especially in higher education (Gilbert, 2005; Martyn, 2007). Clickers are small, portable devices that use infrared or radio-frequency technology to transmit and record student responses to questions, providing instantaneous feedback to the students and their instructor regarding their level of understanding of the material being presented. To date, studies examining the benefits of electronic feedback devices have established that they are responsible for at least some limited range of improvement in academic performance (Anthis, 2011; Elicker & McConnell, 2011; Kennedy & Cutts, 2005; Stowell & Nelson, 2007). Kennedy and Cutts (2005) demonstrated that students who frequently used clickers in class could be categorized as either high or low performers on in-class tests and end-of-term exams, while students who were low-frequency clicker users clustered into a range of moderate to low performance.


Similarly, Cai et al. (2011) found that using clickers can enhance active learning and improve conceptual learning. Conversely, Stowell and Nelson (2007) found no differences in terms of learning outcomes on class quizzes between learners who used clickers and those who responded using more traditional methods, such as response cards or hand raising.

Previous studies on clickers (Koenig, 2010; Martyn, 2007) have investigated their benefits by comparing them with traditional classroom approaches. However, little research has empirically investigated mobile polling (i.e., polls conducted on students' smartphones). Therefore, instead of comparing the clicker approach with the traditional classroom as older studies have done, this study focused on investigating mobile polling and strategies for incorporating polling technologies into the classroom.

1.2. Just-in-Time teaching and polling

Just-in-Time Teaching (JiTT) (Lasry, 2008) using tools such as radio-frequency-based electronic feedback devices has become a popular pedagogical tool in university lectures (Gilbert, 2005; Koenig, 2010; Martyn, 2007), as it allows students to register their answers on a multiple-choice question slide or opinion slide and provides instantaneous feedback via a graph. This feedback is visible to everyone in the session and therefore benefits both the instructor, by providing him or her with guidance on how to best adjust the lecture content, and the students themselves, by assessing their level of preparation and knowledge of the topic. With reference to this feedback, students can compare their answers with those of other students and see whether they need to change their level of preparation or learning strategies to keep up (Kay & LeSage, 2009). Most importantly, clickers encourage student participation in the lecture setting, which is becoming increasingly difficult for teachers to secure in the era of social networking, texting, and online gaming (Hanson, Drumheller, Mallard, McKee, & Schlegel, 2010).

Aside from the benefits for teaching and learning, clickers can also be used for evaluation: when used in a non-anonymous, personalized manner, they can be used to administer quizzes in the lecture setting. While students will immediately be able to determine whether they answered the quiz questions correctly or not, in-class polling does not allow them to change their answers and is thus suitable for the purposes of grading.

1.3. Anxiety

Quizzes and exams are a common and effective way to assess student knowledge in higher education, but they may lead to test anxiety, another variable adopted in this study. Anxiety is a significant issue in universities (Zeidner & Matthews, 2005). Ottens (1991) identified four interrelated characteristics of academically anxious students: 1) disruption in mental activity, 2) psychological distress, 3) misdirected attention, and 4) academically ineffective behavior (e.g., procrastination). Evaluations and timed test-taking conditions (such as quizzes or exams) accentuate the detrimental effects of anxiety. As anxiety interferes with working (or short-term) memory (Zeidner & Matthews, 2005), and effectively deprives students of the working memory's full processing capacity, anxious students may not be able to fully demonstrate their knowledge of the topic at hand. Covington (1992) proposed a useful interaction model for test anxiety, demonstrating that its effects manifest in three stages: appraisal, preparation, and test-taking. In the appraisal stage, students judge an upcoming test to be either a challenge or a threat. In our study, we presume that if in-class polling is not graded, students will perceive it as a challenge rather than a threat, and that this will lead to more effective preparation through review of course content.

1.4. Self-efficacy

None of the studies conducted on polling to date have found any significant increases in academic performance as a result of it. The present study thus sought to examine the relationship between students' self-efficacy for learning course content and their academic performance. For these purposes, self-efficacy was defined as an individual's beliefs about his or her ability to accomplish a task; it does not relate to the amount or quality of skill possessed, but rather to what a person believes he or she can achieve (Bandura, 1977). Our study specifically investigated whether or not differences existed between students' self-efficacy regarding the subject matter and their ultimate performance based on the technology used (clickers versus mobile polling).

1.5. Cognitive engagement

Cognitive engagement refers to an individual's voluntary efforts to understand and master challenging tasks (Fredricks, Blumenfeld, & Paris, 2004). Astin (1984) has also called it psychological effort. This characteristic is not easily observed, since, like emotional engagement but unlike behavioral or physical engagement, it essentially entails the degree to which a learner uses one or more cognitive processes to learn.

One way to address the problem of measuring cognitive engagement is to employ electronic devices for providing feedback, in combination with questions intended to extrapolate learners' cognitive, behavioral, and emotional levels of engagement. Existing studies investigating feedback devices have reported the benefits of increased overall engagement (Bode et al., 2009; Dallaire, 2011; Lasry, 2008; Stowell & Nelson, 2007; Trees & Jackson, 2007), but they have rarely examined the influence of feedback devices on cognitive engagement. Hence, this study aimed to determine how the use of clickers versus that of mobile polling affected learners' "unobservable engagement," that is, their cognitive engagement as described by Fredricks, Blumenfeld, Friedel, and Paris (2005) and Fredricks et al. (2004).

1.6. Brainwaves and polling

Brainwave analysis is now quite advanced in a variety of academic and professional contexts, such as medical care; however, few studies have applied it in the classroom setting. In the past, brainwave experiments required extensive preparation, including the use of gel to fix electrodes to the head of the experimental subject. For these reasons, it was challenging to implement brainwave experiments in the classroom. However, with advances in technology, it is now possible to glean accurate brainwave data with portable devices that require only simple preparation and no gel. For example, Wang and Sourina (2013) used the portable Emotiv device (Stytsenko, Jablonskis, & Prahm, 2011) to monitor the state of participants' brainwaves and provide neurofeedback as a way to treat attention deficit hyperactivity disorder. Brainwave analysis in the classroom is worthwhile, as it can reveal variations in attention and relaxation. Higher relaxation values indicate that the subjects are feeling less stressed; for example, Crowley, Sliney, Pitt, and Murphy (2010) used the Hanoi game to measure the attention and relaxation levels of subjects and evaluate their emotional responses. Our study extends their work by conducting an in-class experiment to observe the emotional responses of students during class activities. In addition, Jimenez, Mesa, Rebolledo-Mendez, and de Freitas (2011) assessed behaviors using human brain frequencies as inputs and indicated the possibility that human behaviors could be studied through attention to cognitive states. Maki et al. (2012) confirmed that it was possible to make subjective, inferential assessments using biosignals. These studies all used the NeuroSky MindSet as the sensor; it has been proven accurate and comfortable for subjects.

Mobile polling has greater potential than clicker systems do. Experiments (Sun, Martinez, & Seli, in press) have shown that it instantly grabs students' attention, allowing them to be polled and to review activities conducted in class anytime and anywhere. It thus extends the functionality of clickers. Unlike that of clickers, however, the potential of mobile polling has scarcely been studied. Nor, to date, have studies examined how different kinds of polling activities influence students' brainwaves. Therefore, to assess students' response to mobile polling, we analyzed their brainwaves to determine the differences between the effects of clickers and mobile polling as class feedback devices. A brief conceptual model of the research question is shown in Fig. 1.

2. Materials and methods

The study utilized a quasi-experimental research design. To assess the differences between the effects of the use of clickers and that of mobile polling, the study collected data from two courses at a large research university in Taiwan in which the students used clickers (control group) or mobile polling (experimental group) to provide feedback during class.

2.1. Participants

The courses selected as sites for this study were courses in educational research methods and sociology of education. Both courses required students to apply foundational knowledge belonging to the field of education, making them suitable for this study. The educational research methods course was a conceptual introduction to the use of scientific methodologies in conducting educational studies, while the other course, sociology of education, helped students understand issues and theories related to education and sociology. Among the 27 students in the educational research methods class, 48.1% (n = 13) were in the clicker group and 51.9% (n = 14) in the mobile polling group. Females (n = 19) represented 70.4% of the participants in that class, all of whom were graduate students. Among the 42 students in the sociology of education class, 45.2% (n = 19) were in the clicker group and 54.8% (n = 23) in the mobile polling group. Females (n = 18) represented 42.9% of participants in this class; 33.3% (n = 14) were graduate students and 66.7% (n = 28) undergraduates. The mean age across both groups was 22.68 years (SD = 2.17).

2.2. Methods and instructional design

Fig. 2 presents the experimental design used in this study. In both groups, two surveys (pre- and post-class) were conducted, at the beginning and end of the experimental session respectively, to assess differences between clicker use and mobile polling. Students participated in in-class polls using either clickers or mobile phones with a software package developed by Turning Technologies in order to facilitate the emergence of any effect of the JiTT strategy. Fig. 3 presents a screenshot of the mobile polling and its results.

When creating the in-class polls, instructors were encouraged to incorporate the four knowledge dimensions and six cognitive process dimensions identified in Anderson and Krathwohl's (2001) adaptation of Bloom's (1956) taxonomy. The former include 1) factual knowledge, 2) conceptual knowledge, 3) procedural knowledge, and 4) meta-cognitive knowledge, and the latter, 1) remembering, 2) understanding, 3) applying, 4) analyzing, 5) evaluating, and 6) creating. The purpose of adopting this approach was to formulate a way for instructors to gauge students' understanding of the content based on lectures and any other learning activities taking place in the classroom, with the aim of providing data that could guide the instructor in making useful modifications to the lecture material and learning activities. To facilitate the instructors' creation of in-class questions, researchers worked with them to identify appropriate questions and communicated with them constantly to ensure that these questions were provided on schedule.

Below is a set of in-class polling questions used in the educational research methods class, which helped instructors to gauge students’ understanding of the concept of internal validity.

Fig. 2. Experimental design.


Which threat to internal validity exists in each of the situations listed below?

1. Maturation
2. Mortality
3. Data collector characteristics
4. Location
5. Instrumentation
6. Regression

In this course, quizzes were conducted in order to measure students' post-test achievement. Students indicated their estimation of how well they would do on a scale of 1–100; this was their "quiz efficacy score." In addition, they were asked to indicate their level of "quiz anxiety" on a scale of 1–100. Teaching assistants uploaded the students' quizzes, along with their efficacy and anxiety scores, to the university's learning management system.

Open-ended questionnaires were used to further explore faculty and student perspectives on how different polling technologies may have influenced the nature and effects of the instruction provided. These questionnaires gauged overall satisfaction with the quality of, and support provided by, the electronic feedback devices, and probed how teachers applied the JiTT techniques in class.

In order to validate students' survey responses on cognitive engagement, brainwave data were collected from two different students in each class session. A total of 32 students across 16 class sessions voluntarily chose to wear the brainwave headphones. Of the two students whose brainwave data were collected in each class session, one responded to the in-class polls with the clicker and the other with the mobile phone (see Fig. 4). Brainwave data were analyzed based on the research methods presented in previous studies in the field (Crowley et al., 2010; Rothkrantz, Wiggers, Wees, & Vark, 2004). We used the first three minutes of the electroencephalograph (EEG) data as the baseline for attention and relaxation levels, since students at this time were in a serene state of mind. We then divided the EEG raw data into two parts: class lecture and in-class activities. For each part, we calculated the percentage in excess of the baseline for attention and relaxation levels, respectively. We then generalized participants' brainwave patterns between lecture and in-class activities based on the observed data, and plotted the EEG diagrams for the selected participants. We combined the EEG observations with the results of the pre- and post-class surveys and open-ended questionnaires in order to analyze students' learning effectiveness during the class. In this way, we were able to explain the variations in brainwave data meaningfully.

Fig. 4. Students wearing the brainwave headphones.
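The analysis scripts are not published with the paper; the sketch below shows one plausible implementation of the percentage-over-baseline computation, assuming per-second attention readings on the 0–100 scale that NeuroSky-style headsets report. The segment boundaries, variable names, and simulated data are illustrative rather than the study's actual values; the same computation applies to the relaxation channel.

```python
import numpy as np

def percent_over_baseline(segment, baseline):
    """Mean of a segment, expressed as a percentage above (negative = below)
    the mean of the baseline segment."""
    base = np.mean(baseline)
    return 100.0 * (np.mean(segment) - base) / base

# Simulated per-second attention readings on the 0-100 scale; a real run
# would use the values recorded by the headset during the session.
rng = np.random.default_rng(0)
attention = rng.normal(50, 10, size=3000).clip(0, 100)

baseline = attention[:180]       # first 3 minutes of class as the baseline
lecture = attention[180:1800]    # lecture segment (boundaries are illustrative)
polling = attention[1800:2100]   # in-class polling segment

print(f"lecture: {percent_over_baseline(lecture, baseline):+.1f}% vs. baseline")
print(f"polling: {percent_over_baseline(polling, baseline):+.1f}% vs. baseline")
```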

2.3. Instruments

The instruments used in this study were adapted from existing validated scales as follows: the test anxiety scale from the Motivated Strategies for Learning Questionnaire (MSLQ) (Pintrich, Smith, Garcia, & McKeachie, 1991) and the cognitive engagement scale developed by Fredricks et al. (2005). The MSLQ was published by the National Center for Research on Improving Postsecondary Teaching and Learning at the University of Michigan in 1986. We used its test anxiety subscale to measure students' test anxiety by polling environment. Similarly, Fredricks et al. (2005) developed the engagement scale to measure behavioral, emotional, and cognitive engagement, and we used the subscale on cognitive engagement. Given that the engagement scale was originally designed to measure children's level of engagement in school, some items had to be modified to measure the engagement levels of graduate and undergraduate students; for example, the item "I follow the rules at school" was revised to "I am compliant with the university's standards of behavior." All of these scales used a 6-point Likert rating (6 = strongly agree, 5 = agree, 4 = somewhat agree, 3 = somewhat disagree, 2 = disagree, and 1 = strongly disagree).

Internal consistency coefficients (Cronbach's α) were computed to identify the reliability of the various scales. The results were as follows: for the test anxiety scale, .852 and .779 in the educational research methods and sociology of education classes, respectively, and for the cognitive engagement scale, .796 and .714, respectively.


3. Results and discussion

All quantitative data were coded and prepared for computer analysis using the Predictive Analytics Software (PASW) 18.0 program. Cronbach's α was computed in order to validate the reliability of each of the measurement scales. The brainwave data were analyzed based on the research methods described in previous studies (Crowley et al., 2010; Rothkrantz et al., 2004). For the descriptive statistics, frequencies were computed for nominal variables, and means and standard deviations for interval variables. Next, Pearson correlation coefficients were computed to examine the intercorrelations among the variables examined in this study. Finally, t-tests and analyses of covariance (ANCOVAs) were conducted to examine the differences in means between the control and experimental groups.

3.1. Anxiety and cognitive engagement

ANCOVA was performed on the post-test results for student anxiety and cognitive engagement to identify any between-group differences, with the pre-test scores as the covariate, the post-test results as the dependent variable, and the polling strategy (clicker or mobile polling) as the fixed factor. The ANCOVA results revealed no significant differences between the two groups in either class (educational research methods and sociology of education) (Tables 1–4).
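The paper reports PASW (SPSS) output; a rough equivalent of the ANCOVA just described, with the pre-test as covariate and polling strategy as fixed factor, could be run in Python as sketched below. The data frame is simulated, since the per-student scores are not published.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Simulated stand-in for the anxiety data (1-6 scale means); the real
# per-student scores are not available, so these values are illustrative.
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "group": np.repeat(["clicker", "mobile"], [13, 14]),
    "pre": rng.normal(3.6, 0.9, 27).round(2),
})
df["post"] = (0.7 * df["pre"] + rng.normal(1.1, 0.4, 27)).round(2)

# ANCOVA: post-test as the dependent variable, pre-test as the covariate,
# and polling strategy as the fixed factor.
model = smf.ols("post ~ pre + C(group)", data=df).fit()
print(anova_lm(model, typ=2))
```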

Although the results of the t-tests for cognitive engagement and anxiety before and after the classes did not show any significant differences (Tables 5 and 6), students in the mobile (experimental) group in the educational research methods class showed reduced levels of anxiety, indicating the possible benefits of mobile polling for graduate students.

3.2. Academic performance

Results for academic performance revealed a significant difference between the control and experimental groups in the sociology of education class (Table 7). In contrast, the t-test revealed no significant differences between the two groups in the educational research methods class. As the participants in the educational research methods class were all graduate students, the reason for this result may be that mobile polling worked better for improving student outcomes in an environment consisting of both graduate and undergraduate students.

Table 1
ANCOVA for cognitive engagement in the educational research methods class.

          n    M      SD    F(1,35)  p
Clicker   13   4.464  .141  .355     .557
Mobile    14   4.632  .136

Table 2
ANCOVA for anxiety in the educational research methods class.

          n    M      SD    F(1,35)  p
Clicker   13   3.645  .173  1.202    .284
Mobile    14   3.444  .166

Table 3
ANCOVA for cognitive engagement in the sociology of education class.

          n    M      SD    F(1,35)  p
Clicker   16   4.169  .085  .433     .515
Mobile    22   4.098  .073

Table 4
ANCOVA for anxiety in the sociology of education class.

          n    M      SD    F(1,35)  p
Clicker   16   3.090  .146  2.172    .149
Mobile    22   3.062  .124

Table 5
T-test for cognitive engagement and anxiety before and after class in the educational research methods class.

                 Clicker (n = 13)           Mobile (n = 14)
                 M (SD)       t (p)         M (SD)        t (p)
CE   Before      4.60 (.65)   .452 (.660)   4.48 (.62)    1.253 (.232)
     After       4.49 (.58)                 4.61 (.59)
ANX  Before      3.72 (.97)   .208 (.839)   3.61 (1.11)   1.047 (.314)
     After       3.68 (.59)                 3.41 (1.04)

Table 6
T-test for cognitive engagement and anxiety before and after class in the sociology of education class.

                 Clicker (n = 16)           Mobile (n = 22)
                 M (SD)       t (p)         M (SD)        t (p)
CE   Before      4.09 (.54)   .000 (1.000)  4.27 (.46)    1.454 (.161)
     After       4.09 (.59)                 4.16 (.44)
ANX  Before      2.86 (.83)   .425 (.677)   3.26 (.84)    .503 (.620)
     After       2.91 (.79)                 3.19 (.90)

Table 7
Academic performance between courses comparing clickers and mobile polling.

                               Control group      Experimental group
Class                          M       SD         M       SD           p
Educational research methods   86.71   4.99       86.73   4.27         .988
Sociology of education         83.47   3.84       86.91   2.71         .002*


3.3. Open-ended questionnaire

At the end of the questionnaire, students were asked to complete three open-ended questions:

1. Have you encountered any issues when using the polling system?
2. What are the advantages of using polling technologies?
3. What are your suggestions for instructors using polling technologies?

For the students in the experimental group, the stability of the Internet, and thus of the polling system, was an important factor. Students reported that the use of mobile polling helped them learn effectively. Students in the control group did not encounter any issues with the polling system; however, they did report that the traditional clickers lacked interactive capabilities and system responses, which may have become a barrier when they were responding to the polls. Both groups were in favor of anonymous polling, which may increase student interest in using such a system as compared to non-anonymous polling (where students' names are attached to their responses).

3.4. Trajectory analysis of quiz efficacy, anxiety, and scores

Throughout the semester, six quizzes were conducted in the educational research methods class. Based on the trajectory analysis of these quizzes, students in the experimental group had lower quiz anxiety than those in the control group. In addition, student anxiety in both groups decreased over time. The quiz efficacy scores and quiz scores did not reveal any further findings of particular interest.
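The per-quiz anxiety values are not reported in the paper, so the sketch below uses invented, purely illustrative numbers that merely mirror the qualitative pattern described (lower anxiety in the experimental group, declining anxiety in both groups); it shows how a simple linear-trend version of the trajectory analysis could be computed.

```python
import numpy as np

# Illustrative per-quiz mean anxiety scores (1-100); these are NOT the
# study's values, only a stand-in matching the reported qualitative pattern.
quiz = np.arange(1, 7)
clicker = np.array([62.0, 58.0, 57.0, 54.0, 52.0, 50.0])
mobile = np.array([55.0, 51.0, 48.0, 46.0, 44.0, 42.0])

for name, anxiety in (("clicker", clicker), ("mobile", mobile)):
    slope, _ = np.polyfit(quiz, anxiety, 1)  # linear trend across six quizzes
    print(f"{name}: mean anxiety {anxiety.mean():.1f}, trend {slope:+.2f} per quiz")
```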

3.5. Brainwave data on attention

In order to validate students' survey responses with regard to engagement, brainwave data were collected from selected students. Overall, in the control group, the results showed that students' attention-related brainwave data increased during polling activities as opposed to the lecture in general. For instance, as depicted in Fig. 5, we found that when the polling activity started, student A's average attention level increased, and decreased again when the teacher resumed lecturing. In order to explain this phenomenon, we examined the results of the open-ended questionnaire given to participants, which indicated that when the polling activity started, students simultaneously concentrated in order to listen to the instructor's explanation of the question. Moreover, in the case of student A, the brainwave also achieved its overall peak value during the polling activity, showing empirically that the use of clickers enhances students' attention in class. Finally, as mentioned above, we can see that the student's attention level dropped dramatically after the polling activity. It may be that after polling, the students felt a sense of relief, seeing that the majority of students had answered the question correctly; as a result, they felt able to pay less attention to the instructor's explanation. However, after a while, it appears that the student sensed some divergence between their own understanding and the lecture, since the level of attention increased during the post-polling lecture period as students reconsidered the difference between their ideas and those of the instructor.

Fig. 5. Brainwave data on control group student A's attention level.

In the experimental group, as shown in Fig. 6, we discovered that when the polling activity started, student B's attention level generally started to decrease. On investigation, we considered this a natural finding, given that smartphones are now nearly universal and are often used by students to play video games or send messages to their friends. This may be why student B's attention level decreased initially. However, after a while, the students' perceptions of the smartphones changed; they started to think of them as tools for polling rather than for play, and thus began to concentrate on the instructor's talk. Furthermore, after polling, student B's attention level did not decrease, revealing that these students could maintain concentration after polling was complete, since they found it more natural to do polling with a mobile device than with a clicker.

Thus, our analysis shows that clickers and mobile polling each have their own strengths and weaknesses. Nevertheless, it is worth investigating the more natural form, mobile polling, in order to exploit its potential and resolve its drawbacks. Mobile polling has the advantage of being extensible as an after-class activity, yet unlike that of clickers, this extension has scarcely been studied. We believe that future study will show that mobile polling can play a pivotal role in education.

3.6. Brainwave data on relaxation

The study also analyzed the relaxation level of students in the classroom. Higher relaxation values indicate that the subjects are in a more relaxed, less stressed state. Relaxation level is therefore an effective way to observe student anxiety, and the brainwave data on relaxation levels also serve as auxiliary data for explaining the questionnaire data on anxiety. Norhazman et al. (2012) researched the relationship between anxiety and relaxation and presented a method for analyzing it; their research focuses on the effect of binaural beats on people experiencing stress, especially the effects on EEG signals, and shows that the relaxation level represented in EEG data is suitable for representing anxiety. In our control group, as shown in Fig. 7, when the polling activity started, the relaxation level increased slightly. After a period, the relaxation value decreased below the baseline, indicating that the student was more stressed after the polling activity. Thus, in our observations, students did not experience stress while using the clicker itself. However, after polling, they seemingly found it difficult to absorb new material while the mind was still integrating the implications of the polling activity; therefore, the student's anxiety level increased.

In contrast, in the experimental group, as illustrated in Fig. 8, the phenomenon was quite different. The relaxation level of student D increased substantially during the polling activity. This might be because the student felt a sense of relaxation when the polling activity was conducted with a mobile phone. Polling using a smartphone was seemingly quite natural to the student, who did not feel anxious during or after the polling activity. Although the relaxation level decreased below the baseline, it soon began to increase again. This suggests that the mobile method provides a better way for students to alleviate, in the after-polling period, the anxiety caused by the in-class activity.

Fig. 6. Brainwave data on student B’s attention in the experimental group.


Thus, in terms of relaxation level, mobile polling was a more intuitive method for students participating in polling. Students might require more time to become accustomed to clickers, and during that period their minds may wander, which can even decrease the effectiveness of polling.

3.7. Relationship between brainwaves and psychological effects

The brainwave data were analyzed in terms of cognitive states. In our experiments, we observed that less anxiety was associated with greater learning motivation. Norhazman et al. (2012) used EEG data to analyze anxiety in students, and we adopted their method. Under stress, students find it difficult to make choices, and when faced with new tools, such as clickers, some students may struggle with the task at hand. When analyzing student responses, we found that while new technology may be quite appealing for personal use, students may feel anxious when it is used in the classroom setting.

The brainwave data served as an effective means of measuring cognitive state, more direct than statistical analysis. Both attention and relaxation levels can be recorded during class. While polling activities in general help students develop a higher level of attention, different polling tools produce quite diverse results. For this reason, future research in this area is still worthwhile.

4. Conclusion

The results of our study suggest that mobile polling with the JiTT strategy and in-class polls may have benefits for students. For example, it may reduce graduate students’ anxiety, improve student outcomes in an environment consisting of both graduate and undergraduate students, and hold students’ attention during polling activities. Future research should look into obtaining both quantitative and qualitative data to assess the reasons for these significant differences.

As educational technologies continue to rapidly change, the results of this research on the role of electronic and mobile feedback devices suggest innovative ways to implement polling strategies, for instance gauging students' understanding with pre-class polls, and offer insights for the benefit of educators who wish to promote cognitive engagement on the part of their students with various types of feedback devices. In the past, students may have been scared of being the subjects of brainwave experiments; however, the new noninvasive EEG devices are more acceptable to students and suitable for use during class. Future research may have available more advanced EEG devices to measure students' concentration levels and provide even more accurate results.

The present experimental results show that brainwave data are appropriate for gauging the effectiveness of educational technology and accurate in observing cognitive states, as Jimenez et al. (2011) proposed. Furthermore, we found that while students nowadays do not consider smartphones a novelty, incorporating them into the classroom is still a potential way to increase student attention and provide a direct means for instructors to observe the learning effects of lectures and so improve their teaching approaches, perhaps even more now that smartphones are ubiquitous. While clickers also seem to be a workable device for classrooms, their application is limited. We believe that improving mobile polling is worthwhile and that brainwave data are a potentially valuable auxiliary tool for observing learning effects on students. Future research may consider creating a tool for instructors to monitor students' concentration and relaxation levels in real time in the classroom. For instance, the instructor's screen might display a summary of every student's brain data so that the teacher could exploit this information for real-time improvement and feedback.
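As a rough illustration of this future-work idea (not anything implemented in the study), a real-time monitoring loop might look like the following sketch; read_attention, the threshold, and the student IDs are all hypothetical placeholders for a headset SDK and a real class roster.

```python
import random
import time

ATTENTION_FLOOR = 40  # illustrative threshold for flagging low attention

def read_attention(student_id: str) -> int:
    """Stand-in for one attention reading (0-100); a real tool would pull
    this from the EEG headset's SDK rather than a random generator."""
    return random.randint(0, 100)

def classroom_snapshot(student_ids):
    """One dashboard refresh: collect readings and flag low-attention students."""
    readings = {s: read_attention(s) for s in student_ids}
    flagged = [s for s, value in readings.items() if value < ATTENTION_FLOOR]
    return readings, flagged

students = ["A", "B", "C", "D"]
for _ in range(3):  # three refresh cycles of a hypothetical instructor display
    readings, flagged = classroom_snapshot(students)
    print(readings, "-> low attention:", flagged)
    time.sleep(1)
```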

In conclusion, mobile polling will have diverse applications, paving the way to help students learn better. Although learning technology and mobile learning in general have been widely studied, few studies have considered the potential of mobile polling. On the basis of the results of our experiment and the auxiliary EEG data, we can confidently predict that mobile polling will be recognized as effective and widely accepted.

Acknowledgements

This study was supported in part by the National Science Council of the Republic of China under contract numbers NSC 100-2511-S-009-012 and NSC 101-2511-S-009-010-MY3. The author would like to thank the instructors and the students who participated in this study and acknowledge the contributions of Chao-Hsiu Chen, Chih-Chien Lin, and William Shao-Chin Chang, who supported this research study and provided valuable comments.


References

Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York: Longman.

Anthis, K. (2011). Is it the clicker, or is it the question? Untangling the effects of student response system use. Teaching of Psychology, 38(3), 189–193.

Astin, A. W. (1984). Student involvement: a developmental theory for higher education. Journal of College Student Personnel, 25(3), 297–308.

Bandura, A. (1977). Self-efficacy: toward a unifying theory of behavioral change. Psychological Review, 84(2), 191–215.

Bloom, B. S. (1956). Taxonomy of educational objectives: The classification of educational goals. New York: Longmans, Green.

Bode, M., Drane, D., Kolikant, Y. B.-D., & Schuller, M. (2009). A clicker approach to teaching calculus. Notices of the American Mathematical Society, 56(2), 253–256.

Cai, T., Qi, Y., Cai, T., Han, J., Yang, S., & Bao, L. (2011). The effective use of clickers in freshmen classrooms. In 2011 International Conference on e-Business and E-Government (ICEE) (pp. 1–4). Shanghai, China: IEEE.

Covington, M. V. (1992). Making the grade: A self-worth perspective on motivation and school reform. New York: Cambridge University Press.

Crowley, K., Sliney, A., Pitt, I., & Murphy, D. (2010). Evaluating a brain-computer interface to categorise human emotional response. In 2010 IEEE 10th International Conference on Advanced Learning Technologies (ICALT) (pp. 276–278). Sousse, Tunisia: IEEE.

Dallaire, D. H. (2011). Effective use of personal response "clicker" systems in psychology courses. Teaching of Psychology, 38(3), 199–204.

Delialioglu, Ö. (2012). Student engagement in blended learning environments with lecture-based and problem-based instructional approaches. Educational Technology & Society, 15(3), 310–322.

Elicker, J. D., & McConnell, N. L. (2011). Interactive learning in the classroom: is student response method related to performance? Teaching of Psychology, 38(3), 147–150.

Fredricks, J. A., Blumenfeld, P., Friedel, J., & Paris, A. (2005). School engagement. In K. A. Moore, & L. Lippman (Eds.), What do children need to flourish? Conceptualizing and measuring indicators of positive development (pp. 305–321). New York: Springer.

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109.

Gilbert, A. (2005). New for back-to-school: 'Clickers'. Retrieved June 20, 2011, from http://news.cnet.com/New-for-back-to-school-clickers/2100-1041_3-5819171.html.

Hanson, T. L., Drumheller, K., Mallard, J., McKee, C., & Schlegel, P. (2010). Cell phones, text messaging, and facebook: competing time demands of today’s college students. College Teaching, 59(1), 23–30.

Jimenez, C. O. S., Mesa, H. G. A., Rebolledo-Mendez, G., & de Freitas, S. (2011). Classification of cognitive states of attention and relaxation using supervised learning algorithms. In 2011 IEEE International Games Innovation Conference (IGIC) (pp. 31–34). Orange, CA: IEEE.

Kay, R. H., & LeSage, A. (2009). Examining the benefits and challenges of using audience response systems: a review of the literature. Computers & Education, 53(3), 819–827.

Kennedy, G. E., & Cutts, Q. I. (2005). The association between students’ use of an electronic voting system and their learning outcomes. Journal of Computer Assisted Learning, 21(4), 260–268.

Koenig, K. (2010). Building acceptance for pedagogical reform through wide-scale implementation of clickers. Journal of College Science Teaching, 39(3), 46–50.

Lasry, N. (2008). Clickers or flashcards: is there really a difference? The Physics Teacher, 46(4), 242–244.

Maki, Y., Sano, G., Kobashi, Y., Nakamura, T., Kanoh, M., & Yamada, K. (2012). Estimating subjective assessments using a simple biosignal sensor. In 2012 IEEE International Conference on Fuzzy systems (FUZZ-IEEE) (pp. 1–6). Sichuan, China: IEEE.

Martyn, M. (2007). Clickers in the classroom: an active learning approach. EDUCAUSE Quarterly, 30(2), 71–74.

Mason, R. B. (2011). Student engagement with, and participation in, an e-Forum. Educational Technology & Society, 14(2), 258–268.

Middlebrook, G., & Sun, J. C.-Y. (2013). Showcase hybridity: a role for blogfolios. In K. V. Wills, & R. Rice (Eds.), ePortfolio performance support systems: Constructing, presenting, and assessing portfolios (pp. 123–133). Fort Collins, CO: The WAC Clearinghouse and Parlor Press.

Norhazman, H., Zaini, N. M., Taib, M. N., Omar, H. A., Jailani, R., Lias, S., et al. (2012). Behaviour of EEG alpha asymmetry when stress is induced and binaural beat is applied. Paper presented at the 2012 IEEE Symposium on Computer Applications and Industrial Electronics (ISCAIE).

Ottens, A. J. (1991). Coping with academic anxiety (2nd ed.). New York: Rosen.

Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1991). A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor, Michigan: The University of Michigan.

Rothkrantz, L. M., Wiggers, P., Wees, J.-W., & Vark, R. (2004). Voice stress analysis. In P. Sojka, I. Kopecek, & K. Pala (Eds.), Text, speech and dialogue (Vol. 3206, pp. 449–456). New York/Berlin, Heidelberg: Springer.

Stowell, J. R., & Nelson, J. M. (2007). Benefits of electronic audience response systems on student participation, learning, and emotion. Teaching of Psychology, 34(4), 253–258.

Stytsenko, K., Jablonskis, E., & Prahm, C. (2011). Evaluation of consumer EEG device Emotiv EPOC. In MEi: CogSci Conference. Ljubljana, Slovenia.

Sun, J. C.-Y., Martinez, B., & Seli, H. (2014). Just-in-time or plenty-of-time teaching? Different electronic feedback devices and their effect on student engagement. Educational Technology & Society, in press.

Sun, J. C.-Y., & Rueda, R. (2012). Situational interest, computer self-efficacy and self-regulation: their impact on student engagement in distance education. British Journal of Educational Technology, 43(2), 191–204.

Trees, A. R., & Jackson, M. H. (2007). The learning environment in clicker classrooms: student processes of learning and involvement in large university-level courses using student response systems. Learning, Media and Technology, 32(1), 21–40.

Walsh, J. P., Sun, J. C.-Y., & Riconscente, M. (2011). Online teaching tool simplifies faculty use of multimedia and improves student interest and knowledge in science. CBE-Life Sciences Education, 10(3), 298–308.

Wang, Q., & Sourina, O. (2013). Real-time mental arithmetic task recognition from EEG signals. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 21(2), 225–232.

Zeidner, M., & Matthews, G. (2005). Evaluation anxiety: current theory and research. In A. J. Elliot, & C. S. Dweck (Eds.), Handbook of competence and motivation (pp. 141–166). New York: Guilford Press.
