The question is, as MOOCs continue to grow in popularity and the "graduates" of these courses inevitably enter jobs that use these new skills, will the "certificates" that these places offer come to have their own currency?
In other words, will there come a time when a "degree" from Coursera showing proficiency in a relevant topic becomes more valuable than an actual degree from an accredited four-year institution?
I think that this will happen. Slowly, but it will happen. Software companies will probably lead the way on this. If I were hiring someone for a developer position, I'd have to ask myself which I would rather see:
- An online portfolio complete with a GitHub resume of open source samples, blog posts demonstrating writing skills and competence, and a collection of certificates from Udacity, MITx, and Coursera.
OR...
- A degree from State Tech with transcripts showing an A in Data Structures, a B in databases, and an A in Operating Systems
Obviously there won't be such a clean divide between the two for a long time, but it's always been my belief that software development especially (and engineering in general) is better served by a master-apprentice approach than by a four-year "liberal arts" approach. Seen this way, the rise in "online credentials" is most valuable as a way of selecting those who would most benefit from the intense time and money investment of a master of their field.
As for general education, I do think it would be a shame to throw the baby out with the bathwater and end up with an entire generation of people who are really good at machine learning but never took a higher-level history course. So I'm thinking a good hybrid would be a two-year associate's degree that covers the basics, followed by self-study online using MOOCs (perhaps combined with some kind of enlightened one- or two-year "work-study" program offered by companies like Google).
I'm really just thinking out loud here, so this isn't some kind of formal proposal or argument. I'm just wondering what the most optimal "end game" is.
Most of the courses are, at least currently, not of the quality of those offered by a good conventional institution. Moreover, the multiple-choice and/or keep-trying grading format makes assessment, even self-assessment, challenging.
> And in a vote of confidence in the form, students in both [of two Coursera courses] overwhelmingly endorsed the quality of the course: 63 percent who completed Dr. Agarwal’s course as well as a similar one on campus found the MOOC better; 36 percent found it comparable; 1 percent, worse.
I'd say that the quality of MOOCs varies, but definitely no more than the quality of courses at a formal institution.
Really, people who have spent their time on a class, received some notion of credit for completing it, and then praise it for quality represent the textbook case of biased opinion.
This is generally true. I think I learned a lot in the Scala course, but it would be nice to have a human look at the code at some point. And the assignments were more or less fill-in-the-blank.
OTOH, this language course http://spanishmooc.com/ looks potentially as good as or better than traditional ones. It seems like language courses over the Internet with native speakers could be higher quality than, and competitively priced with, software solutions.
I'm about halfway through completing MITx's 6.00x on edX and Udacity's CS101 and I'm extremely satisfied so far. I think the instruction has been near-perfect. For the first time in my life the material is sticking.
I'm really looking forward to the increases in human skill and potential that come from educating millions from around the world. That creative genius who never learned how to code now has the skills to write innovative software. Or the millions more average people who've increased their potential — that's valuable as well. Think about what the world would be like if everyone was educated to the college level. I think human advancement in science, culture, technology, and the arts would accelerate tremendously.
I'm quite prone to oversubscribing to MOOCs and then dropping out quickly. Sometimes it's because I realize that, though I'm interested in the topic, I'm "long magazine article" interested, not 12-week-class interested. And sometimes I drop out due to lack of time to make the commitment to do well in the course; I find that "coursera guilt" is recognizable to most of my social group, and my threshold is that if I'm more than two weeks behind I'll drop out.
There are some things that I think these outfits could do to improve retention rates for students. Most of these features should be optional, but supporting students in completing courses should be a priority for online learning companies, regardless of their business model.
Herewith, a list of features that would help me stay on top of courses I am taking for reasons somewhere on the spectrum from recreational intellectual interest to professional development:
1. export syllabus/course schedule to calendar; I do this manually for some courses where I really do want to reach the finish line.
2. Optional daily reminders on weekdays; this would encompass the videos you should watch and the readings you should complete that day to stay even with the course schedule, the upcoming class assignments and the results of the last assignment, as well as a digest of forum threads.
3. In tandem with the above, improved course dashboards; there should be one place to go to find out what you should look at, read, or do next.
4. Video editing and presentation coaching for instructors. Some people are really good at presenting course material; others need some editorial coaching to focus their videos. I can think of a couple of courses where the lecturer's lack of stage presence and inability to be concise made watching the videos such a chore that I would rather do anything else than watch them — and that's often the point where I decide my life won't be worse if I drop that course and see if I can find a different one covering the same material.
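Item 1 on that list is easy to approximate by hand today. As a sketch (the course name, start date, and weekly Sunday deadlines are all made up, and no MOOC platform I know of exposes such an export, so this is purely illustrative), a minimal iCalendar generator in Python:

```python
from datetime import date, timedelta

def course_to_ics(name, start, weeks, due_weekday=6):
    """Build a minimal iCalendar string with one all-day event per
    week of the course, placed on `due_weekday` (0=Mon .. 6=Sun).
    Hypothetical helper: real platforms don't publish this data."""
    lines = ["BEGIN:VCALENDAR", "VERSION:2.0", "PRODID:-//mooc-export//EN"]
    for week in range(weeks):
        # First deadline is the next `due_weekday` on/after the start date,
        # then one per week after that.
        day = start + timedelta(days=week * 7 + (due_weekday - start.weekday()) % 7)
        lines += [
            "BEGIN:VEVENT",
            f"UID:{name}-week{week + 1}@example.invalid",
            f"DTSTART;VALUE=DATE:{day:%Y%m%d}",
            f"SUMMARY:{name} - week {week + 1} assignment due",
            "END:VEVENT",
        ]
    lines.append("END:VCALENDAR")
    return "\r\n".join(lines)

# Six weekly deadlines for a made-up course starting Monday, Feb 4, 2013.
ics = course_to_ics("Algorithms I", date(2013, 2, 4), weeks=6)
```

Save the result as a `.ics` file and most calendar apps (Google Calendar, iCal) will import the deadlines directly.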
On another note: Does anyone offer a course on how to read mathematical notation? I often find myself trying to read equations in notations that are not that familiar to me, and it feels like a fundamental lack of literacy to me.
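To make concrete the kind of literacy I mean, here are two standard pieces of notation with their spoken readings (my own glosses, not taken from any particular course):

```latex
% "The sum of i, as i runs from 1 to n, equals n(n+1)/2":
\[ \sum_{i=1}^{n} i = \frac{n(n+1)}{2} \]

% "For every epsilon greater than zero, there exists a delta greater
% than zero such that |x - a| < delta implies |f(x) - f(a)| < epsilon":
\[ \forall \varepsilon > 0 \;\; \exists \delta > 0 :
   |x - a| < \delta \implies |f(x) - f(a)| < \varepsilon \]
```

Being able to "read aloud" an unfamiliar formula like this, quantifier by quantifier, is exactly the skill a notation course could teach.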
I too am prone to enrolling in too many courses. I am always behind on courses. My strategy has been to learn and apply the skills, more than to get a certificate of completion. Some courses take far more time than others, especially ones that need math refreshers. I am pleasantly surprised by the high quality of these courses and intend to complete most of the ones I enrolled in, in my own time, since it is really hard for me, with full-time work, to follow the course schedule.
I once heard the phrase "When the New York Times is reporting on something, it's already old news." But that aside: good, I always hoped this would happen. The internet is not just games, movies, pornography, social networking, and sending PDF files. It's predestined to be a medium for learning.
I really love MOOCs. I've somewhat become addicted to them. One huge problem I have with them, however, is that there is no obvious path for learning in them. Note I am not talking about the course itself but the path from one course to another. Not all of them have this problem (notably Udacity and Saylor). Coursera and edX, however, just seem to be an eclectic jumble of unrelated courses. If I want to take a course but don't meet the requirements, I'm pretty much screwed if the prerequisite course isn't being given. If I want to expand my knowledge a step further, I don't know where to go.
There's an algorithms I and an algorithms II course, for example.
I'm starting Cryptography I next week, and expect that to have a follow-up as well, and Analytic Combinatorics, Part I is in a couple of months.
The logic course that finishes up in a week or so, though — that one seems to come out of nowhere and not have much of a stated follow-on to it, despite, of course, being useful for all manner of computer science.
But as for the groundings for these courses, I don't know what the answer is. I don't think the providers need to provide a basics level, especially if they are getting sign-up numbers of 50k per course. But providing basics courses or self-study pre-courses would be a good move, considering the number of people in the logic course who, unfortunately, didn't have the knowledge base the course requires.
There are some pain points coursera could smooth over, and having a better course search and course reference system is one of them.
I think, because a number of these courses seem to be online equivalents of offline courses, thought hasn't been put into making the course abstracts stand on their own; instead they are just rewordings of the standard course descriptions.