Thursday, February 13, 2014

The latest MOOC Research

This from Justin Reich at Digital/Edu:


Framing MOOC Research: How Comparisons Shape Our Judgments
Last month, my colleagues and I on the HarvardX and MITx research teams jointly released a series of reports about the first 17 courses launched by HarvardX and MITx on the edX platform. We released a synthesis report with findings about all of the courses, and then 15 additional reports examining individual courses in more detail.

We tried to provide the public and our internal stakeholders with data that instructors can use to create better courses and that people can use to judge the state of the enterprise. We are keenly aware, however, that our data don’t have a single story to tell, and how people read our research depends upon how they approach the subject.

Here are two sets of facts, two possible frames for thinking about HarvardX courses:
Set 1: In September 2009, Boston’s WGBH published to YouTube a series of 12 videos from Michael Sandel’s Justice course at Harvard University. Each hourlong video is a combination of lecture from Sandel and facilitated Socratic dialogue among the hundreds of students who take his class every semester.

On YouTube, the first video has been played almost 5 million times, the second over a million times, and the next ten about 300,000 to 400,000 times each. All told, the series has about 10 million views.

As a back-of-the-envelope calculation (ignoring people who watch videos multiple times, who don’t finish videos, etc.), it seems unlikely that more than 6% of people who started the series finished the whole thing.
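That back-of-the-envelope estimate is easy to reproduce. A minimal sketch, using the rounded view counts quoted above rather than exact data:

```python
# Back-of-the-envelope completion estimate for the 12-video Justice series,
# using the rounded view counts quoted above (not exact data).
first_video  = 5_000_000       # ~5 million plays
second_video = 1_000_000       # ~1 million plays
remaining    = [350_000] * 10  # videos 3-12: ~300,000-400,000 plays each

total = first_video + second_video + sum(remaining)
print(f"Total views: ~{total:,}")  # ~9.5 million, i.e. roughly 10 million

# If at most ~300,000 people played the final video, while ~5 million
# played the first, then no more than ~6% of starters finished.
print(f"Completion upper bound: {300_000 / first_video:.0%}")  # 6%
```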

Set 2: Researchers at the Community College Research Center at Columbia University’s Teachers College have done some very interesting work examining online courses in community colleges.

Di Xu and Shanna Smith Jaggars have recently published studies that show that online course completion rates in two large systems in Virginia and Washington are lower than face-to-face course completion rates. In Virginia, completion rates in face-to-face courses were 81 percent, while online completion rates were 68 percent; in Washington, the rates of completion were 90 percent for face-to-face versus 82 percent for online. While online completion rates lag behind their on-campus counterparts, the vast majority of students in both conditions earn a passing grade in the courses in which they enroll.

The first set of facts concerns online media available to anyone for learning and personal growth. The second set concerns structured learning experiences offered by institutions of higher education. Both are potential frames of reference for interpreting our research about HarvardX and MITx courses, and each could lead to different comparisons and different judgments.

One of the most important findings from our research is that people use materials in edX courses in all kinds of ways. The 2012-2013 MITx and HarvardX courses had a little over 800,000 registrations. A little over 43,000 people earned a certificate of completion in a course. Nearly 36,000 people opened more than half of the units of a course but did not earn a certificate. Over 450,000 people viewed less than half of the units of a course (without earning a certificate), and nearly 300,000 people who registered for a course never entered the courseware at all.
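As a rough check on those shares, here is a minimal sketch using the rounded counts from the paragraph above (being rounded, they only approximately sum to the headline figure):

```python
# Engagement buckets for 2012-2013 MITx/HarvardX registrations,
# using the rounded counts quoted above.
buckets = {
    "earned a certificate":           43_000,
    "explored >half, no certificate": 36_000,
    "viewed <half, no certificate":  450_000,
    "registered, never entered":     300_000,
}
total = sum(buckets.values())  # ~829,000 vs. the "little over 800,000" headline
for label, count in buckets.items():
    print(f"{label:32} {count:>8,} ({count / total:.1%})")
```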

The figure below is a scatterplot of all 800,000 registrations. On the y-axis is the student’s grade in the course, and on the x-axis is the percentage of the units (or chapters) in the course that the student opened. As you can see, nearly the entire possibility space is full of points. There are people completing every snippet of course content, people auditing courses and ignoring assessments, people dabbling in a fraction of the course, and people never showing up at all.

[Figure: scatterplot of grade vs. percentage of units opened, one point per registration]
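For readers curious how such a figure is drawn, a minimal sketch, assuming a hypothetical table of registration records with a grade column and a fraction-of-units-opened column (the file and column names are illustrative, not the actual dataset’s schema):

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical registration records; the real dataset's schema may differ.
df = pd.read_csv("registrations.csv")  # columns: grade, frac_units_opened

# One translucent point per registration, so dense regions read as darker.
plt.scatter(df["frac_units_opened"] * 100, df["grade"], s=1, alpha=0.1)
plt.xlabel("Percentage of units (chapters) opened")
plt.ylabel("Grade in course")
plt.title("One point per registration")
plt.show()
```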
How should we judge these findings?

One thing we can do is compare these patterns of behavior to the patterns of behavior in community colleges. In community colleges, we are keenly concerned with completion rates. Courses are expensive, and students who fail and drop out not only miss the benefits of learning and certification, but they also lose the money they invested in enrollment. We might make the comparison that only about 6% of HarvardX and MITx registrants finish a course, but in Virginia 68% of online community college registrants finish a course.

We could also compare the online learning content hosted on edX to the PBS Justice videos hosted on YouTube. In online content, we expect to see a funnel of participation. We expect that, on any website, there will be some number of people who navigate to the site, a smaller number who register, a smaller number who participate in some way, and a smaller number still who engage in the deepest possible ways. When we compare the HarvardX version of JusticeX with the PBS version of Justice, we find very similar patterns of participation.

[Figures: paired participation funnels for PBS Justice on YouTube and HarvardX JusticeX]

Patterns of persistence and completion in edX look pretty typical when compared with engagement funnels of online media, and they look pretty lousy when compared to community college retention rates. So which is the right comparison?

Faculty intent should play an important role in deciding the right frame of reference, the right yardstick, for judging open online courses.

Some MOOC faculty are primarily interested in sharing their ideas and relatively uninterested in certifying learners’ competency. Michael Sandel, through PBS Justice, justiceharvard.org, and JusticeX, is primarily interested in helping more people learn more about moral reasoning rather than certifying people at a particular level of skill or knowledge about moral reasoning. If he’s trying to maximize the number of learning experiences that people have, of both lighter touch and deeper engagement, then the frame of online learning media seems to be a fairer point of reference for judging the public impact of JusticeX.

By contrast, many of Sebastian Thrun’s courses at Udacity, especially his pilot programs with San Jose State University and Georgia Tech, are explicitly designed to replace or complement typical courses of study in higher education. Familiar higher education settings make a more sensible frame of reference in understanding these efforts.

As new forms of online learning proliferate, no doubt there will be even more sensible ways of contextualizing and framing new courses and new platforms for learning. As we debate how open online courses might reshape the future of higher education and lifelong learning, it’s worth paying close attention to how the points of comparison that we choose frame our interpretations and judgments.

Justin Reich is the Richard L. Menschel HarvardX Research Fellow and a Fellow at the Berkman Center for Internet & Society. He writes the EdTechResearcher blog for Education Week.

And last May at HarvardX:

Data, data, data (from CS50x)

by David J. Malan, Senior Lecturer on Computer Science and Lead Instructor of CS50x 
Version 1 of CS50x debuted on Monday, 15 October 2012, and concluded (unlike my taxes) on Monday, 15 April 2013. We’ve only just begun to dive into all of the data we collected over those six months, but we thought we’d take a moment to share some preliminary datapoints.
Day 0 began roughly as follows:
2:00pm CS50x goes live
2:02pm 500 users online
3:00pm 10,000 users online

Tommy MacWilliam '13, one of CS50’s head TFs, shares even more detail in his blog.
Day n-1, meanwhile, concluded with:
150,349 students registered
100,953 students engaged
10,905 pset0 submissions
1,482 project submissions
1,388 certificates awarded
whereby students who “engaged” watched content, asked questions, and used apps, even if they didn’t submit work. Receipt of a certificate required submission of all work with scores of 60% or higher.
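A minimal sketch of that certificate rule, reading “scores of 60% or higher” as applying to each piece of submitted work (the post doesn’t spell out whether the threshold is per item or overall):

```python
def earned_certificate(scores, required):
    """Certificate rule as described above: all required work submitted,
    each piece scoring at least 60%. (Our reading; the threshold might
    instead apply to an overall score.)"""
    return all(item in scores and scores[item] >= 0.60 for item in required)

# Hypothetical items and scores, for illustration only:
required = ["pset0", "quiz0", "project"]
print(earned_certificate({"pset0": 0.9, "quiz0": 0.7, "project": 0.65}, required))  # True
print(earned_certificate({"pset0": 0.9, "quiz0": 0.5, "project": 0.65}, required))  # False
```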
Submissions of problem sets and quizzes trended as follows:
[Figure: problem set and quiz submissions over time]
Daily engagement in the course, as measured by Google Analytics in terms of unique visitors, trended similarly, peaking early, then reaching some form of equilibrium (which appeared to trail off as the course’s end loomed!):
[Figure: daily unique visitors over the course’s six months]
If we zoom in on the course’s final weeks, unique visits hovered around 2,500 per day:
[Figure: daily unique visitors during the course’s final weeks]
Visits by country, meanwhile, were ordered as follows, with Google Analytics reporting at least one visitor from every country in the world:
[Figure: visits by country]
As for the students themselves, upon submitting Problem Set 0, students were asked to submit a form that inquired as to their background before CS50x and motivation for taking CS50x. Based on 10,905 submissions (which may or may not be representative of CS50x’s 150,349 registrants), CS50x’s demographics were 20% female and 80% male (whereas CS50 on campus was 36% female in Fall 2012). The average age was 29, with a median of 27 and a mode of 21. CS50x’s eldest student was 80 and youngest was 10. Even younger were Louis and his brother, who said “hello, world” as well, though filming those hellos apparently required one take too many.

Per the below, 56% of CS50x students had no prior background in CS, versus 75% in CS50 on campus:
[Figure: prior coursework in CS]
Most students had at least a high school diploma, if not a degree beyond:
[Figure: highest level of education attained]
Out of the 10,905 students who submitted pset0, 10,137 (93%) intended to do all of the course’s work:
[Figure: plans to complete all of the course’s work]
To be fair, out of the 150,349 students who registered, only 10,137 (7%) intended to do all of the work (else they’d presumably have submitted at least pset0).

Meanwhile, out of the 10,905 students who submitted pset0, 3,381 (31%) took CS50x because of the prospect of a certificate:
[Figure: interest in certificates]
To be fair, out of the 150,349 students who registered, only 3,381 (2%) took CS50x because of the prospect of a certificate (else they’d presumably have submitted at least pset0).

For most students, then, “success” didn’t necessarily mean a certificate. And “completion” wasn’t necessarily the goal. Among students' motivations for registering were:
[Figure: students’ reasons for taking CS50x]
Ultimately, at least 1,388 students will receive a certificate like John Harvard’s here:
[Figure: John Harvard’s certificate]
To put all these numbers into perspective:
  • Among 150,349 students, 100,953 engaged (67%).
  • Among those 100,953 students, 10,905 submitted pset0 (11%).
  • Among those 10,905 students,
    • 3,381 (31%) sought a certificate, and
    • 1,388 (13%) will receive a certificate.
CS50x’s “completion rate” (whereby completion is defined as submission of all work with scores of at least 60%) was thus, as recomputed in the sketch after this list:
  • 41%, if out of 3,381 who sought a certificate.
  • 13%, if out of 10,905 who submitted pset0.
  • 1.4%, if out of 100,953 who engaged.
  • 0.9%, if out of 150,349 who registered.
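Those four rates follow directly from the counts above; a minimal sketch recomputing them:

```python
# CS50x completion rates, recomputed from the counts above.
certified = 1_388

denominators = [
    (3_381,   "sought a certificate"),
    (10_905,  "submitted pset0"),
    (100_953, "engaged"),
    (150_349, "registered"),
]
for count, label in denominators:
    print(f"{certified / count:.1%}, if out of {count:,} who {label}")
# Prints 41.1%, 12.7%, 1.4%, and 0.9%, matching the rounded figures above.
```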
By contrast, 703 out of 706 students (99.6%) “completed” CS50 on campus this past fall. But, to be fair, for most CS50x students, “completion” wasn’t necessarily the goal. Indeed, tens of thousands “engaged” in some form.

For the curious, version 2 of CS50x will debut in late 2013 or early 2014 once we’ve had a chance to re-tool and improve! In the meantime, all of version 1’s content will remain freely available at cs50.tv, at youtube.com/cs50, and at x.cs50.net as well as in iTunes and iTunes U. And even before version 2 of CS50x debuts, content from Fall 2013 of CS50 itself will become available at cs50.net in September 2013.

1 comment:

Anonymous said...

Might not be a fair parallel, but it makes me wonder if all the hysteria about putting more programs online is going to result in greater access but with significantly higher failure rates, unless we are also going to expect professors to allow students to meander through their electronic courses at their own pace.