Critical Thinking in Online Vs. Face-to-Face Higher Education
Ellen Baker Derwin, Ph.D.
Abstract:
This study compares critical thinking skills for adult learners in online and face-to-face liberal studies classes at a university with locations in California and Washington (N=150). In a between-subjects design, the study analyzed students’ score gains from pre- to post-tests on the California Critical Thinking Skills Test (CCTST). The study also compared students’ grades on critical thinking assignments required at the end of the course.
Results showed that there were no significant differences between face-to-face and online learners for the CCTST score gains or the grades on the final assignments. Results are consistent with previous “no significant difference” studies. The research adds to the literature by specifically addressing outcomes in critical thinking. Future studies may benefit from selecting a variety of critical thinking measures and identifying characteristics required to demonstrate higher-level thinking skills.
During the fall of 2006, 3.5 million college and university students were enrolled in at least one online course, an increase of almost 10% compared to fall 2005 (Allen & Seaman, 2007). This number represents almost 20% of all postsecondary students in the United States, and the number is expected to grow each year (Carnevale, 2005). With the increase in distance learning programs and the diversity of students enrolled, leaders at higher education institutions are concerned about assessing the effectiveness of different delivery methods (Twigg, 2005). In 2005, 62% of chief academic officers surveyed believed that learning outcomes for students enrolled in online courses were the same as or better than outcomes for face-to-face classes (I. E. Allen & Seaman, 2006).
A history of research findings supports the academic leaders’ beliefs by indicating no significant differences in effectiveness between online courses and traditional face-to-face classes (Clark, 2002; Phye, 1997). While a great deal of research compares distance education, in its many varieties and technologies, with onsite education, the outcomes are not clearly defined. The majority of studies focus on overall grades. Such research does not address outcomes for tasks that require critical thinking skills. Yet educators and theorists acknowledge the importance of critical thinking as an essential outcome of quality learning (Wickersham & Dooley, 2006).
No Significant Difference
Hundreds of studies compare instruction that uses technology to traditional instruction. Russell’s 1997 “no significant difference” study and Verduin and Clark’s 1991 summary of studies are the most widely reviewed in the literature (as cited in LeBaron & Tello, 1998). Russell’s work reviews research showing no difference, if not better results, in distance learning as compared to on-campus classes. LeBaron and Tello (1998) analyze these works and raise questions that have prompted considerable debate: “How do we know if ‘distance education’ works? What indeed does ‘working’ mean? If an online university course is taught alongside a traditionally-delivered section offered by the same faculty member, does a simple comparison of student grades adequately assess the merits of one method or the other?” (LeBaron & Tello, 1998, p. 59).
With hundreds of “no significant difference” studies completed and documented, it seems appropriate to investigate more closely the outcome measures that, when compared, indicated strong similarity between online and on-campus classes. Despite the plethora of studies, particularly in the 1990s, research continues. What specific outcomes are being used to measure effectiveness, and are these the outcomes that can provide the most comprehensive and appropriate comparison? Additionally, new technologies used in varying combinations in course development affect potential outcomes.
Studies that compare delivery systems are interesting because they use a wide variety of outcome measures, and they address many different courses at both graduate and undergraduate levels. The delivery systems also vary widely; the research includes technologies that are now obsolete, as well as a range of formats from asynchronous self-paced courses to synchronous interactive courses.
In many studies comparing outcomes for undergraduate and graduate students enrolled in traditional versus online classes, the quality of both formats is similar. In most cases, quality is defined by student self-reports, test grades, and final grades (M. Allen et al., 2004; Fredda, 2000; Neuhauser, 2002; Van Schaik & Barker, 2003; Vroeginday, 2005). More distinctive studies scrutinize grades further by examining whether higher-level learning was required to earn the better scores. In these cases, there was limited detail about the assessments, but some researchers are beginning to address these issues (Parker & Gemino, 2001; Unal, 2005; Witta, 2005). Given the limited information about higher-level learning online versus on-campus, the next section further explores the concept of critical thinking and then reviews studies that address higher-level thinking in the physical and virtual classrooms.
Critical Thinking
If you query theorists and educators, you will find considerable agreement that college students need to improve their critical thinking skills (McLean, 2005). In a national survey of colleges and universities, 72% of the representatives considered critical thinking to be an important skill to assess as an outcome of instruction (Ratcliff, Johnson, La Nasa, & Gaff, 2001). Almost every postsecondary school expects its graduates to be able to demonstrate critical thinking skills. Overall, educators are concerned about improving critical thinking skills among students in higher education and consider such skills a desirable outcome of undergraduate education (Halpern, 2001; Johnson, 1992; McLean, 2005; Ratcliff et al., 2001). While there is a great deal of concern and interest in critical thinking, there is, nevertheless, little evidence that critical thinking instruction is occurring with any regularity in colleges and universities (Reed, 1998). In order to be accountable, institutions must be able to measure these skills (K. A. Williams, 2002). The following sections review definitions of critical thinking and research in both face-to-face and online settings.
One of the challenges of incorporating critical thinking into higher education is the lack of a consistently agreed-upon definition of the term. Clearly defining the term would offer improved opportunities to measure the concept. While critical thinking is often discussed as a desired outcome for college students, measures of assessment have not been consistent (Erwin & Sebrell, 2003).
In response to the multiple definitions, the American Philosophical Association recruited Peter Facione in 1990 to lead a Delphi project to organize research and assessment on critical thinking (Burbach, Matkin, & Fritz, 2004). The resulting consensus definition is as follows: “We understand critical thinking to be purposeful, self-regulatory, judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” (P. A. Facione, 2006, p. 21). The Delphi report lists six skills, each with a corresponding subskill (P. A. Facione, 1990): interpretation, analysis, evaluation, inference, explanation, and self-regulation. The Delphi model is considered the most broadly accepted definition of critical thinking (K. A. Williams, 2002). As Halpern (2001) suggests, even among diverse definitions, the overlap makes it possible to move forward by evaluating the skills common to most of them. A review of several leading critical thinking theorists suggests that this definition touches on all of the major perspectives. Facione is the main author of the California Critical Thinking Skills Test, which the methodology section describes further. Due to the systematic scope of the Delphi report, the current study uses this instrument.
Critical Thinking Studies in Campus-Based Courses
The lack of clarity about critical thinking carries through to faculty perceptions of the topic. Halx and Reybold (2005) implemented a qualitative study to explore faculty perceptions of critical thinking. While most instructors agree that critical thinking is an important component of higher education, the faculty, like theorists, reflect ambiguity in defining it. Findings showed that while the participants sincerely wished to engage their students in critical thinking, they had no formal education in how to do so (Halx & Reybold, 2005). The researchers concluded that instructors must define critical thinking and critical teaching so that they can be measured. In order to teach critical thinking, it is important to ensure that instructors develop their own skill in higher-level thinking.
A number of studies assessed critical thinking in traditional on-campus courses (Bauer, 2001; Burbach et al., 2004; Erwin, 2000; Hatcher, 2006; Jenkins, 1998; Lawson, 1999; Reed & Kromrey, 2001; Solon, 2003; R. L. Williams, Oliver, & Stockdale, 2004). Outcome measures included the Ennis-Weir, a survey about the value of rubrics, the CCTST, the WGCTA, the Cornell Z, and the Psychological Critical Thinking Instrument. These studies found that infusing critical thinking into courses, as well as providing very specific direction such as rubrics and instruction on a critical thinking model, was associated with higher outcome scores than those of control groups. In most cases, the studies considered different aspects of critical thinking and measured an increase in those skills based on an instructional intervention. Faculty believe that critical thinking is valuable, and researchers have incorporated instruction to improve these skills in a variety of ways, using different measures. All of the studies presented occurred in face-to-face courses, and the control groups were in the same setting.
Critical Thinking Online
In addition to reviewing studies that addressed critical thinking and face-to-face learning, it is valuable to consider research about critical thinking and online learning. Online learners have more responsibility to engage themselves in the learning process than students in face-to-face courses, since a classroom instructor can directly draw students into a discussion (Newman, Webb, & Cochrane, n.d.). Koory (2003) found that students with certain characteristics are more likely to be successful online. In interviews, she discovered that successful students are self-directed, task-oriented, independent, and interested in problem solving and immediate application. Some studies have analyzed students’ ability to engage in online discussions. In an example related to critical thinking, Wickersham and Dooley (2006) explored whether learners in small online group discussions would reflect more critical thinking than students in the large full-class group.
Discussion board interaction is a common focus of studies about critical thinking and online learning (Yang, Newby, & Bill, 2005). Yang et al. noted that several previous studies were based on self-reports, and they sought to conduct a quasi-experiment. The purpose of their study was to determine whether students’ critical thinking skills would improve after participating in Socratic dialogues on asynchronous discussion boards. Instructors modeled this type of questioning by continuing to ask questions to stimulate thought. The researchers also wondered whether students would maintain higher levels of critical thinking without the instructor continuing to offer probing questions. The students’ critical thinking skills were measured by the California Critical Thinking Skills Test (CCTST). Results showed that mean CCTST scores were significantly higher at the end of the course than at the beginning.
In a qualitative study, Chang (2002) asked how asynchronous learning promotes critical thinking, and she explored the opportunities adult learners see for critical thinking in an asynchronous online class. She noted that with research still new in this field, few studies specifically address the quality of fully online asynchronous distance education. Chang studied asynchronous web-based graduate courses at a major university in New York City. In one phase of the study, a survey was distributed to students to determine if they perceived their experience in online courses as conducive to critical thinking. The researcher also observed the classes to determine if they met criteria determined to promote critical thinking: authenticity, community, reflection, and multiple perspectives. The study was qualitative, with no quantitative measures, and it did support the claim that online learning can enhance critical thinking. While qualitative measures are interesting, they cannot easily be compared to on-campus courses, and there are other outcomes to consider.
Face-to-Face, Online, and Critical Thinking
Several studies addressed either face-to-face learning and critical thinking or online learning and critical thinking. It was challenging to locate a study that considered critical thinking in both learning environments. Huff (2000) implemented the only study that appeared relevant to the comparison of critical thinking in a distance environment to a traditional classroom, and she states that her research is innovative in comparing critical thinking in distance learning students to those in onsite courses. Her research asked whether critical thinking skills for social work graduate students increased after they participated in a policy course. The study also explored whether on-site and distance education students gained critical thinking skills equally. The author used Facione’s definition of critical thinking from the Delphi report and the CCTST manual. The main skills considered in this definition are interpretation, analysis, inference, evaluation, and explanation.
Rather than looking at online computer-based learning, Huff’s study focused on students enrolled in an interactive television course compared with the same face-to-face course. Huff implemented a pretest-posttest design and found, first, that there was no significant difference between the two groups on the CCTST pretests. Huff also compared the pretests and posttests of both groups, and there were no significant differences. She did find that both groups increased their scores significantly from pretest to posttest. In addition to using the CCTST, Huff compared the two groups on their midterm exams, which tested both objective recall and critical thinking skills. Again, there was no difference between the two groups.
Statement of the Problem
This research provides a comparison of online to onsite learning using outcomes that, in contrast to those of the many “no significant difference” studies, better address higher-level thinking. While various types of distributed learning have existed for many years, distance learning has become dramatically more prevalent in recent years. With the increase in this format, comparisons between traditional or onsite education and distance or online education have become the focus of much research. Although learning outcomes have been compared, the measures have been limited. By measuring only final grades, researchers are missing the opportunity to assess possible differences in higher-level thinking between students who learn onsite and online. Although a long history of research shows no significant differences between face-to-face and distance courses, the types of distance courses varied dramatically, and the measures primarily addressed general outcomes rather than outcomes related to critical thinking. Since both online and adult students tend to have characteristics that are relevant to critical thinking, such as self-directedness, the ability to incorporate prior experience, and skills to integrate new learning, this population may demonstrate higher scores on critical thinking assessments than adult learners in face-to-face settings.
Hypotheses
The hypotheses are as follows:
Hypothesis 1: Adult undergraduate students enrolled in an online liberal studies class will achieve a larger pre- to post-test score increase on a standardized critical thinking test taken at the beginning and the end of the course when compared to pre- and post-test scores for adult undergraduate students enrolled in a face-to-face liberal studies class.
Hypothesis 2: Adult undergraduate students enrolled in an online liberal studies class will achieve a higher score on an instructor-created critical thinking assignment when compared to adult undergraduate students enrolled in a face-to-face liberal studies course.
Methodology
Participants
Participants were undergraduate students at a private university with over 7,000 students at 23 academic campuses located in California and Washington. Programs at the university are designed for nontraditional adult learners who are returning to higher education after being away for a period of time or who did not attend college immediately after high school and are now beginning their studies.
All students enrolled in the Spring 1, 2008 sections of Liberal Studies 300, both online and face-to-face, were invited to participate. In order to obtain additional participants for the online format, the researcher also invited students enrolled in the online course for the Spring 2, 2008 term. A total of 150 students participated across both formats. Total enrollment for all face-to-face classes was 110, and 104 participated, although not all of them completed every component of the study. Participation for face-to-face classes was 96%. Total enrollment for the online classes was 134; 46 students (34%) participated, although not all of them completed every component. There were eight face-to-face classes and seven online classes.
Research Design
The research design was a quasi-experimental nonequivalent control group study (Campbell & Stanley, 1963). Students self-selected into either the face-to-face or online course, and the study used a between-groups approach to compare students studying in the two formats. It was necessary to use already-formed groups because the institution could not mandate the course format a student chose and therefore could not randomly assign students to groups. The researcher selected face-to-face sections based on instructors’ willingness to participate, and eight instructors agreed. All seven online-section instructors agreed to participate.
The research compared student performance in an undergraduate liberal studies course targeted to adult students and taught in online and on-campus formats. The online and on-campus courses were standardized and used the same assignments. The research included two back-to-back 9-week terms of study for the comparison.
Independent and Dependent Variables
The independent variable was the format of the course, either online or face-to-face. The first dependent variable was the difference between the pre- and post-test scores on the CCTST. The same version of the CCTST was used for both the pretest and the post-test. Insight Assessment, the developer of the CCTST, stated that its research shows that taking the same version of the test as a pretest does not affect post-test scores (P. A. Facione, Facione, Giancarlo, & Blohm, 2002).
The second dependent variable examined the scores on the course assignment due at the end of the term. This final paper required students to discuss the ethical challenges in their targeted profession and provide very specific examples of how to deal with these challenges. The instructors asked the students to write a minimum of five pages, demonstrate analytical skills, use logical organization, support their statements with evidence, and show their ability to evaluate concepts. The instructors expected students to demonstrate skills they learned from practicing critical thinking exercises, developing hypotheses, and critiquing readings. The assignment required students to skillfully interpret readings, analyze materials as applied to their own situation, infer solutions, and evaluate ethical dilemmas. These skills are components of critical thinking (P. A. Facione, 1990).
One faculty member at the university who was not teaching the class during the terms used in the study graded all of the papers. This instructor had taught the course in the past and had received positive evaluations from students and deans. The outside grader could not distinguish papers submitted by online students from those of face-to-face students. This independent grading did not affect the students’ actual course grades; the course instructors posted grades independently of this study. The two dependent variables were distinct, and they were analyzed separately.
The study used the California Critical Thinking Skills Test (CCTST) as a pretest at the beginning of the term and a post-test at the end of the term. The difference between the pre- and post-test scores was compared across formats using a one-way analysis of variance (ANOVA). Using this method, the mean change scores of the two groups were compared based on the independent variable.
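To make the analysis concrete, the following is a minimal sketch in Python of the gain-score comparison, assuming a hypothetical data file (scores.csv) with columns format, pre, and post; the study’s actual data handling is not described at this level of detail.

    # A minimal sketch of the gain-score comparison (Hypothesis 1), assuming
    # hypothetical column names ("format", "pre", "post") in scores.csv.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("scores.csv")           # one row per participant
    df["gain"] = df["post"] - df["pre"]      # pre- to post-test change score

    online = df.loc[df["format"] == "online", "gain"]
    f2f = df.loc[df["format"] == "face-to-face", "gain"]

    # One-way ANOVA on the change scores across the two course formats.
    f_stat, p_value = stats.f_oneway(online, f2f)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

With only two groups, the one-way ANOVA is mathematically equivalent to an independent-samples t-test (F = t²), so either procedure yields the same p value.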
Setting
The two course formats used the same syllabus, but the interaction was different due to the nature of the format. The face-to-face class included minimal lecture, group discussion, in-class writing, and problem solving. The classroom instructors typically encouraged student interactions. All of the face-to-face interaction took place within a classroom, and students were present at the same time and place. There was no online interaction.
In contrast, the online course was delivered in the eCollege course management system. The course included written information, PowerPoint presentations, links to interactive activities, videos, threaded discussions, and written assignments. The instructors facilitated the discussions, graded work, and provided frequent feedback to students.
The environment for students taking the CCTST online contrasted with the environment for students taking the test in class. In the face-to-face group, students were in a traditional classroom setting with seats facing the front of the room, although seating was adjusted for group activities. Online students took the test from a home computer or wherever else they did their other course work.
Procedure
For face-to-face students, instructors read the script (in Appendix D) to introduce the research. The instructors also administered the test. Students who chose to participate were asked to complete the survey and the pretest early in the class time period. At the end of the term, the instructor asked study participants to stay in class to complete the post-test.
For online students, an announcement posted in the course explained that in order to participate, students must complete the pretest by the end of the second week. The researcher sent emails to remind students of the deadline. Online students completed the post-test during the last 2 weeks of class. Announcements posted in the course management system and emails reminded them of the deadline.
Face-to-face students who opted to participate signed the informed consent form prior to completing the survey and the CCTST. After signing the consent forms, students first completed the survey. Then they took the CCTST. The CCTST Handbook provided step-by-step instructions for administering the CCTST, and the face-to-face instructors followed them in detail.
The procedures varied slightly for the online classes in order to provide the information and assessments in the online format. In contrast to face-to-face students who completed a paper-and-pencil assessment, online students took the survey and assessment online. The university uses the eCollege course management system, and the researcher provided a link to the e-testing system that the assessment company, Insight Assessment, administers.
Students were asked to complete the pretest during the first week of the term and the post-test during the last week of the term. Announcements were posted and emails sent to remind students to complete the tests during the requested time windows. The test was timed so that students had a maximum of 45 minutes to take the CCTST. Online students were able to move back and forth among questions, so both online and face-to-face students had the opportunity to recheck and change their responses.
Results
In order to test Hypothesis 1, comparing groups on the change from pre- to post-test scores on the CCTST, a one-way analysis of variance (ANOVA) was conducted comparing the mean change scores. This analysis compared the two levels of the independent variable (course format) on the mean of the dependent variable (Mertler & Vannatta, 2005). It allowed a comparison of the online students’ growth in critical thinking to the face-to-face students’ growth in critical thinking.
There was no statistically significant difference between the online and the face-to-face groups for the change scores between the pre- and post-tests, F(1, 130) = 3.10, p = .08. The results revealed that the score increases of the face-to-face and online students were similar.
In order to test Hypothesis 2, comparing the two groups on the final course assignment, a one-way analysis of variance (ANOVA) was conducted to determine whether the scores of the two groups differed significantly. This analysis compared the mean scores on this dependent variable to determine if there was a significant difference between the two means. The assignment was considered independent of the CCTST; therefore, an ANOVA was conducted on each dependent variable separately rather than a multivariate analysis of variance (MANOVA) on both. The two hypotheses address two different dependent variables: one a standardized test that assesses critical thinking, the other the score on a specific course assignment that required students to use the critical thinking skills included in the course.
No significant difference was found between the online and the face-to-face groups for the scores on the assignment, F(1, 101) = 0.03, p = .86. The results revealed that the scores of the face-to-face and online students were similar.
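A parallel sketch of the Hypothesis 2 analysis, under the same hypothetical scores.csv layout with an assumed assignment column, shows the assignment scores analyzed separately from the CCTST gains, per the design described above.

    # Hypothesis 2: a separate one-way ANOVA on the (hypothetical)
    # "assignment" column, analyzed independently of the CCTST gains.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("scores.csv").dropna(subset=["assignment"])
    groups = [g["assignment"] for _, g in df.groupby("format")]
    f_stat, p_value = stats.f_oneway(*groups)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")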
Overall, the results did not support the original hypotheses. Test scores did not improve significantly from pre- to post-test on the CCTST for either the face-to-face or the online students, and the two groups performed similarly on the critical thinking assignment. Experience taking previous online classes did not affect the performance of students in this study; this finding indicates that prior online course experience was neither a significant benefit nor a detriment to the participants’ critical thinking growth.
Implications and Future Studies
Score Comparisons
The first research question asks, “Do adult students enrolled in online classes demonstrate different scores than adult students taking face-to-face classes when comparing identical assessments that require critical thinking, defined as the ability to interpret, analyze, evaluate, and infer?” Hypothesis 1 addressed this question by comparing the difference between the students’ pre- and post-test scores on the CCTST. Since no statistically significant difference was found, the study did not support the suggestion that characteristics of online learners, such as self-directedness, the ability to reflect critically, and the skills to integrate new knowledge, contribute to students’ ability to improve critical thinking skills as measured by the CCTST. While the sample was not examined to determine whether participants had these characteristics, the study acknowledged that online learners tend to possess them (Koory, 2003; Lee & Gibson, 2003; Stevens-Long & Crowell, 2002).
Similarly, the second hypothesis was designed to answer the first research question by determining whether online students’ scores were significantly higher than face-to-face students’ scores on an assignment that required students to interpret, analyze, evaluate, and infer. The data did not support the hypothesis; rather, they suggested that the online and face-to-face students examined had similar skills in completing a course assignment that required critical thinking.
Critical Thinking Skills and Tendencies
The foundation of the hypotheses relies on the concept that adult learners and online learners tend to have skills that correspond to the ability to think critically. Many adult learners are skilled at integrating knowledge into practice (Knowles, Holton, & Swanson, 1998; O’Lawrence, 2006; Sachs, 2001). They also value relevance and seek opportunities to blend their experiences with information in the classroom. Additionally, they tend to be self-directed problem solvers (Forrest & Peterson, 2006). Similarly, online learners are likely to be self-directed and able to critically reflect on course content (Lee & Gibson, 2003; Stevens-Long & Crowell, 2002).
The foundation of the hypotheses assumed that the learners in the study possessed these characteristics. Additionally, Facione and Facione (1994) suggest that in order to actually apply critical thinking skills, individuals must have attributes that include open-mindedness, inquisitiveness, cognitive maturity, truth seeking, analyticity, and critical thinking self-confidence. The items on the CCTST are based on Facione and Facione’s assertion that these characteristics are important to the demonstration of critical thinking. Future studies might challenge whether other characteristics are more important and identify the characteristics that are truly necessary.
Instruments
Along with considering critical thinking instruction and related theories, it is important to examine the results with regard to the measurement instruments. While both the CCTST and the assignment were included in the study to measure critical thinking skills, they are very different instruments. The CCTST is an objective test with a history of reliability and validity, and it is clearly based on Facione’s definition of critical thinking skills, which includes interpretation, analysis, inference, evaluation, and explanation (P. A. Facione, 2006). The purpose of including the assignment was to add a more subjective assessment, guided by a rubric. The fact that the hypotheses were not confirmed for either instrument is itself valuable information; it may indicate that there are similarities in the abilities measured by the CCTST and the assignment. Future studies might correlate the post-test CCTST scores with the assignment grades to ascertain whether there is a relationship between the two instruments.
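As a sketch of that proposed follow-up, again assuming the hypothetical scores.csv layout with assumed cctst_post and assignment columns, a Pearson correlation between the two measures might look like this:

    # Proposed future analysis: correlate post-test CCTST scores with
    # assignment grades (hypothetical columns "cctst_post", "assignment").
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("scores.csv").dropna(subset=["cctst_post", "assignment"])
    r, p_value = stats.pearsonr(df["cctst_post"], df["assignment"])
    print(f"r({len(df) - 2}) = {r:.2f}, p = {p_value:.3f}")

A strong positive correlation would suggest the two instruments tap overlapping abilities; a weak correlation would suggest they measure distinct aspects of critical thinking.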
One of the main emphases of this study was to examine a different set of outcome measures than standardized tests and grade point averages. This study focused on two dependent variables: the gain from the pretest to the post-test on the CCTST and a final assignment that was graded using a rubric emphasizing critical thinking skills. Perhaps these measures were too narrow to assess students’ higher-level thinking. Future studies may benefit from selecting other measures of critical thinking. For example, researchers can record and transcribe student interactions and discussions in face-to-face classes. They can store and review online discussions in distance-based classes. Additionally, researchers can implement longitudinal studies that assess student gains in critical thinking over time after taking courses that infuse critical thinking. This type of study is more challenging due to the many variables that can interfere over time. However, for adult students enrolled in degree programs in order to enhance their employment skills, it would be beneficial to determine whether they are gaining critical thinking skills over time and, if so, whether they are maintaining them and transferring them to the job. A continued comparison between different instructional formats would be interesting.
Methodology
In order to generalize, it would also be preferable to include a larger sample along with randomized selection. The use of a convenience sample hinders the ability to generalize, but since students must be allowed the freedom to choose their own course format, randomization was not possible. Participation in the face-to-face classes was very good at 96%. However, it was much more difficult to encourage online students to participate, and only 34% of those enrolled opted to join the study. Future studies may benefit from providing additional incentives to encourage more involvement from online students; extra credit or another grade incentive for participating in research would be helpful if the Institutional Review Board (IRB) approved it, although the IRB at the participating university discouraged this type of incentive.
It is also possible that while the intention of the liberal studies course is to teach critical thinking skills, the course has not been formally evaluated to determine whether its content meets this expectation. While educators are interested in improving critical thinking skills, there is no evidence that instructors are successfully infusing this type of instruction into courses (Halpern, 2001; Johnson, 1992; McLean, 2005; Ratcliff et al., 2001). The instructors for this course were not measured on their ability to teach critical thinking, and the course itself was not objectively assessed for its critical thinking content. In future studies, the targeted course could be assessed by an expert in critical thinking to determine whether its components do indeed address those skills. This expert could also assess instructors on their existing skills in educational methods that enhance critical thinking and train them to teach in a way that improves students’ opportunities to learn such skills. It is likely that some instructors are more skilled in emphasizing critical thinking than others. The instructors in this study may have lacked the skills to encourage critical thinking regardless of the students’ abilities or the course content, while others may have been better able to mold the content to infuse critical thinking into the course.
With increasing emphasis on critical thinking skills and dramatic growth in online learning, it is valuable to continue to probe whether the format of learning influences critical thinking. This study indicated that there was no difference in the gain of critical thinking skills between students who took a liberal studies course online and those who enrolled in a face-to-face course. This result is consistent with many previous studies, although the “no significant difference” studies included a vast array of demographics, disciplines, and measures (Clark, 2002; LeBaron & Tello, 1998; Phye, 1997). The majority of measures in past studies addressed areas of performance that did not take critical thinking into account.
Although the Huff (2000) study was unique in addressing critical thinking skills and including nontraditional students, it focused on a very specific type of graduate class (social work) and used television technology, which is atypical of today’s distance learning formats. Huff’s study also focused on midterm exams and CCTST pre- and post-tests as outcomes. The current study added the more subjective assignment assessment, which requires different skills than the objective assessments. One benefit of this research, therefore, is its addition to the studies that have found no significant difference between online and face-to-face students; it enhances prior research by adding critical thinking skills as an outcome in which there is no significant difference. The lack of significant difference is encouraging for online learning, which has been criticized for its quality (American Federation of Teachers, 2000; Twigg, 2005).
References
Allen, I. E., & Seaman, J. (2006). Making the grade: Online education in the United States, 2006. Needham, MA: The Sloan Consortium.
Allen, M., Mabry, E., Mattrey, M., Bourhis, J., Titsworth, S., & Burrell, N. (2004). Evaluating the effectiveness of distance learning: A comparison using meta-analysis. Journal of Communication, 54(3), 402-420.
American Federation of Teachers. (2000). Distance education: Guidelines for good practice. Higher Education Program and Policy Council of the American Federation of Teachers. Retrieved August 23, 2006, from http://www.aft.org/pubs-reports/higher_ed/distance.pdf.
Bauer, K. (2001). The effect of participation in undergraduate research on critical thinking and reflective judgment. Paper presented at the Annual Meeting of the Association for Institutional Research.
Burbach, M. E., Matkin, G. S., & Fritz, S. M. (2004). Teaching critical thinking in an introductory leadership course utilizing active learning strategies: A confirmatory study. College Student Journal, 38(3), 482-493.
Campbell, D., & Stanley, J. (1963). Experimental and quasi-experimental designs for research. Chicago: Rand McNally.
Carnevale, D. (2005). Today’s news. The Chronicle of Higher Education. Retrieved June 28, 2005, from www.chronicle.com
Chang, E. A. (2002). The efficacy of asynchronous online learning in the promotion of critical thinking in graduate education. Unpublished doctoral dissertation, Columbia University, New York.
Clark, D. (2002). Psychological myths in e-learning. Medical Teacher, 24(6), 598-604.
Erwin, T. D. (2000). The NPEC sourcebook on assessment, volume I: Definitions and assessment for critical thinking, problem solving, and writing. Washington, D.C.: U.S. Dept. of Education, National Center for Education Statistics.
Erwin, T. D., & Sebrell, K. W. (2003). Assessment of critical thinking: ETS: Tasks in critical thinking. The Journal of General Education, 52(1), 50-70.
Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Millbrae, CA: California Academic Press.
Facione, P. A. (2006). Critical Thinking: What it is and why it counts. Retrieved December 2, 2006, from www.insightassessment.com.
Facione, P. A., & Facione, N. C. (1994). The California critical thinking skills test: Test manual. Millbrae, CA: California Academic Press.
Facione, P. A., Facione, N., Giancarlo, C., & Blohm, S. (2002). The California critical thinking skills test: Test manual. Millbrae, CA: California Academic Press LLP.
Forrest, S. P., & Peterson, T. O. (2006). It’s called andragogy. Academy of Management Learning and Education, 5(1), 113-122.
Fredda, J. V. (2000). Comparison of selected student outcomes for internet versus campus-based instruction (No. 00-08). Ft. Lauderdale, FL: Nova Southeastern University.
Halpern, D. F. (2001). Assessing the effectiveness of critical thinking instruction. The Journal of General Education, 50(4), 270-286.
Halx, M. D., & Reybold, L. E. (2005). A pedagogy of force: Faculty perspectives of critical thinking capacity in undergraduate students. The Journal of General Education, 54(4).
Hatcher, D. L. (2006). Stand-alone versus integrated critical thinking courses. The Journal of General Education, 55(3-4), 247-272.
Huff, M. T. (2000). A comparison study of live versus interactive television for teaching MSW students. Research on Social Work Practice, 10(4), 400-416.
Jenkins, E. K. (1998). The significant role of critical thinking in predicting auditing students’ performance. Journal of Education for Business, 73(5), 274-279.
Johnson, R. H. (1992). The problem of defining critical thinking. In S. P. Norris (Ed.), The generalizability of critical thinking. New York: Teachers College Press.
Knowles, M. S., Holton, E. F., & Swanson, R. A. (1998). The adult learner: The definitive classic in adult education and human resource development (5th ed.). Houston: Gulf.
Koory, M. A. (2003). Differences in learning outcomes for the online and F2F versions of “An introduction to Shakespeare” [Electronic Version]. Journal of Asynchronous Learning Networks, 18-35. Retrieved August 23, 2006.
Lawson, T. (1999). Assessing psychological critical thinking as a learning outcome for psychology majors. Teaching of Psychology, 26(3), 207-209.
LeBaron, J. F., & Tello, S. F. (1998). Evaluating the effectiveness of distance education: What are the questions? Knowledge Quest, 26(3), 59-61.
Lee, J., & Gibson, C. C. (2003). Developing self-direction in an online course through computer-mediated interaction. The American Journal of Distance Education, 17(4), 173-187.
McLean, C. L. (2005). Evaluating critical thinking skills: Two conceptualizations. Journal of Distance Education, 20(2), 1-20.
Mertler, C. A., & Vannatta, R. A. (2005). Advanced and multivariate statistical methods (3rd ed.). Glendale, CA: Pyrczak Publishing.
Neuhauser, C. (2002). Learning style and effectiveness of online and face-to-face instruction. The American Journal of Distance Education, 16(2), 99-113.
Newman, D. R., Webb, B., & Cochrane, C. (n.d.). A content analysis method to measure critical thinking in face-to-face and computer supported group learning [Electronic Version] from http://www.qub.ac.uk/mgt/papers/methods/contpap.html.
O’Lawrence, H. (2006). The influences of distance learning on adult learners. Techniques, 81(5), 47-49.
Parker, D., & Gemino, A. (2001). Inside online learning: Comparing conceptual and technique learning performance in place-based and ALN formats. Journal of Asynchronous Learning Networks, 5(2), 64-74.
Phye, G. D. (1997). Learning and remembering: The basis for personal knowledge construction. In G. D. Phye (Ed.), Handbook of academic learning: Construction of knowledge. San Diego, CA: The Academic Press.
Ratcliff, J. L., Johnson, D. K., La Nasa, S. M., & Gaff, J. G. (2001). The status of general education in the year 2000: summary of a national survey. Washington, D.C.: Association of American Colleges and Universities.
Reed, J. H. (1998). Effect of a model for critical thinking on student achievement in primary source document analysis and interpretation, argumentative reasoning, critical thinking dispositions, and history content in a community college history course. Unpublished Doctoral Dissertation, University of South Florida.
Reed, J. H., & Kromrey, J. D. (2001). Teaching critical thinking in a community college history course: Empirical evidence from infusing Paul’s model. College Student Journal, 35(2), 201-215.
Sachs, J. (2001). A path model for adult learner feedback. Educational Psychology, 21(3), 267-275.
Solon, T. (2003). Teaching critical thinking: The more, the better! The Community College Enterprise (Fall), 1-9.
Stevens-Long, J., & Crowell, C. (2002). The design and delivery of interactive online graduate education. In K. E. Rudestam & J. Schoenholtz-Read (Eds.), Handbook of online learning. Thousand Oaks, CA: Sage Publications, Inc.
Twigg, C. A. (2005). Quality assurance for whom? Providers and consumers in today’s distributed learning environment. The National Center for Academic Transformation. Retrieved August 23, 2006, from http://www.thencat.org/Monographs/Quality.html
Unal, Z. (2005). Comparing the learning outcomes and course satisfaction of Web-based vs. classroom-based instruction. Unpublished Doctoral Dissertation. The Florida State University.
Van Schaik, P., & Barker, P. (2003). A comparison of on-campus and online course delivery methods in southern Nevada. Innovations in Education and Teaching International, 40(1), 5-15.
Vroeginday, B. J. (2005). Traditional vs. online education. Unpublished Doctoral Dissertation. Fielding Graduate University.
Wickersham, L. E., & Dooley, K. E. (2006). A content analysis of critical thinking skills as an indicator of quality of online discussion in virtual learning communities. The Quarterly Review of Distance Education, 7(2), 185-193.
Williams, K. A. (2002). Measurement of critical thinking in college students: Assessing the model. Unpublished Dissertation, James Madison University, Harrisonburg, VA.
Williams, R. L., Oliver, R., & Stockdale, S. (2004). Psychological versus generic critical thinking as predictors and outcome measures in a large undergraduate human development course. The Journal of General Education, 53(1).
Witta, E. L. (2005). Achievement in online vs. traditional classes. In C. Howard, J. V. Boettcher, L. Justice, K. Schenk, P. L. Rogers & G. A. Berg (Eds.), Encyclopedia of distance learning (Vol. II). Hershey, PA: Idea Group Reference.
Yang, Y. C., Newby, T. J., & Bill, R. L. (2005). Using Socratic questioning to promote critical thinking skills through asynchronous discussion forums in distance learning environments. The American Journal of Distance Education, 19(3), 163-181.