Wednesday, January 28, 2015

Research: Process, Ethics, Validation, and Technicianship?

Derby Wharf, Salem, MA - Jan 2015 (Storm: Juno)
I am sure that last one is a word I just made up on the spot. It's been a slow week in 802.  I was reading Lisa's reflection on Lurking in 802 (she is in last year's cohort, so she is two courses ahead of us in Cohort 7), and how she viewed 802 at the time as a make or break experience for the Ed.D. program.  While 801 last semester was a whirlwind tour of Online and Distance Education, this semester is a whirlwind tour (or boot-camp perhaps) on the topic of Research and how to go about doing it.  The textbook, by Cohen and the gang, is something that I have read before (a few years ago), but this doesn't help a ton ;-)  There is a lot to unpack, and this book is dense.  Even if one could memorize everything (not a good way to learn, by the way), it's like going to a philosophy course - you are there essentially to argue for and support your stance. You are also called upon to explain your underlying frameworks, or mental constructs and assumptions, for what you want to do.  To some extent this is a bit like therapy, where you are asked to reflect on why you hold those views and to unpack your own biases and unspoken, but held, beliefs.

While this is something that friends and colleagues, and I, have been doing for a while, especially with our group in #rhizo14, vis-a-vis online research ethics (hey, remember #massiveteaching courseraGate of 2014 and the discussions on ethics there?), being in class feels different.  In an online environment, while you may engage in these discussions, at some point if you feel like you've had your fill of the discussion you can choose to take a break and not engage any more.  In a course, however, you don't necessarily have that luxury.  You may take a small break from the discussion, and perhaps wait for other cohort-mates to step in, but you can't necessarily stay out of the arena for too long. Even if you could avoid thinking about such heavy subjects during the live seminars, or the asynchronous discussions, you still have homework to complete, which ensures that you will be thinking and articulating something about such weighty subjects.

Even though we are only about to complete week 4 (what? week 4? that's like 1/3 of the semester! Holy cow!), thus far this is a humbling experience on two levels.  First, the readings make my brain hurt (figuratively). I haven't experienced this since Fall of 2010 when I was taking a course on psycholinguistics (which was also a primer on second language acquisition). Even though I had read all of the books and articles over the summer, and I had reviewed them just prior to each class session, I still felt a bit lost with the majority of the readings.  It's not that I didn't get them; it's more that they all meshed in my head and only a small amount of distilled knowledge remained on the surface. It took discussion to really get those "a-ha" moments and make connections with the readings that remained beneath the surface. The second reason why this is humbling is that in the live sessions I don't feel adequately prepared.  This is connected to the first reason (the overwhelming amount of information that is taken in). I usually have something up my sleeve in live sessions (asynchronous discussions allow you to look things up and present your arguments), but I am now in a position (again) to consult my notes, to consult my highlighted text and articles, and the things I scribbled in the margins, and then still say "huh???"  It's a bit of an academic rush, but it's humbling.

And now for a change in topic, but it's still related to 802.  A little while back I received this notification from LinkedIn for a new discussion.  The discussion is about PhD graduates being just technicians - so how can we help them improve? I suspect that by technicians the original poster means that their research is very mechanical in nature and that it follows a cookie cutter approach.  If the overall tone of the AU EdD program is like 802, I suspect that none of us will be cookie cutter researchers.  That said, I think that cookie cutter researchers exist because the overall environment supports them in some sense.  Some in the thread support post-doc work as a way to combat this technician mentality, but I think that this, too, is one way that higher education is prolonging academic adolescence amongst learners. In my view post-doc positions are really temporary holding cells where people go to do more research because jobs aren't open. Those of you in higher education, if you think I am wrong in this assertion please let me know.  The post-doc solution is along the same lines as making doctoral students do more coursework before they have an opportunity to submit a proposal for research.  This is completely wrong in my view.

First proposed solution: If a student has completed an MA in the field in which they are pursuing a doctorate, then they shouldn't need 12-20 additional courses in order to get to the stage where their proposal is put forth and ready for comment by more seasoned academics. Students should have the most minimal of formal coursework, which will have the effect of getting everyone on the same page and having people gel as a group of co-learners.  Courses should not be about content.  They should be about ways of thinking and more "brainy" stuff (a more proper word escapes me).  Courses contribute to a mechanistic view of education and research because, as learners, we are seeking the path that will get us those good marks and high achievements (that the original poster wrote about), but they won't necessarily push us in the ways we need to be pushed to grow.  Students should go out and learn what they need to learn on their own, or with a group of co-learners, under the guidance of someone more seasoned. A class might be better for scheduling purposes (I am still an admin at my day-job), but this isn't necessarily what is best for learners.

Second proposed solution: In lieu of coursework, how about some qualifying papers? One of the pieces of advice that I've received from those who already have their degree is to think of my dissertation in chunks that I can pull out and publish separately with little editing.  I think that this is probably the making of a bad dissertation (or bad articles).  The instances of dissertation topics that can reasonably be transformed from one dissertation into multiple articles are probably few and far between.  It seems that the underlying idea is that new PhDs (I am using this as a generic for "doctor" - EdDs and other Ds would fall into this category too) need publications, and the way to get them is through the dissertation.  This, in my mind, contributes to the mechanization of research because, again, people are looking to get the most bang for the work that they put in.  If, in lieu of courses, students got their hands dirty with smaller research projects, things that culminated in publishable quality papers, then not only would those PhD students get to experience different research methods and ways of approaching knowledge generation, they would also have articles that they can submit for publication. This means that the dissertation could stand on its own without pulling double duty.

Third proposed solution: This may invite the ire of some recently minted PhDs I know, but it needs to be said: newly minted PhDs (or EdDs for that matter) should not supervise students or teach in doctoral programs until they have more experience under their belt.  I think that this is really important.  I don't think that I will know everything I ought to know once I finish my doctorate at AU.  This is not to say that I don't think that AU's is a quality program, quite the contrary!  What I mean is that it's impossible to have certain knowledge without the benefit of more experiences under my belt.  Learning is life-long, and the four (or five) years spent in a doctoral program are not sufficient to then turn around and mentor those who are just beginning their doctoral journey. I think that newly minted PhDs do need more time-on-task, and more intellectual brain-teasing, in order to continue to hone their skills and expand their horizons. It is only through greater experience, and an open mind, that we are able to mentor others. Otherwise we fall into methodological and disciplinary traps of our own making. We end up contributing to the echo chamber that we are in, instead of busting through those walls. Only after a period of mentorship by more experienced faculty, additional training, and greater time-on-task in teaching, in service, and in research, should newly minted PhDs supervise new doctoral students (at which point they won't be newly minted any more).

Fourth proposed solution: Alt-Ac Careers! I know people who've gone into PhD programs and completed them, but then the flame, the spark, the passion, the je ne sais quoi, just isn't there any more for research.  Don't belabor the point.  If someone earned their PhD but they don't do research, or they do it mechanistically because they have to produce something, then help them figure out an alternative to the academic career if that's not what they like to do.  Sometimes mechanistic application is due to lack of training (addressed through points 1-3), and sometimes it's because research is not what motivates people any longer.  It's perfectly fine if people's interests change.  The important thing is to figure out what you can do with your shiny PhD once you are done if you don't like doing research.  I don't think that teaching is the solution.  I think that those who do mechanistic research might have issues with sniffing out bad research, and this is a problem for teaching.  Teaching and research are two sides of the same coin as far as I am concerned.  Your research (or review of the recent research!) points you to things you should be doing in the classroom to help learners pick up new things and to bridge that research-practice gap. If you can't do research well, you might not be able to evaluate it well.  It's not a rule, but something to keep an eye out for.

Well, that's all I have for that.  Your thoughts on this? How would you respond to this LinkedIn thread?




Post-script: We've had snow days the past few days, so I've had time to think about this for a while - hence the photo ;-)


Friday, January 23, 2015

Academic writing, but not in English...

One of the nice things about being a language geek and an academic is that you get access to research that has been published in other languages.  In addition to English, I fare quite well with research written in French, Italian, and Greek. Even though I don't have any formal experience with learning Spanish, I could probably get the gist of Spanish articles based on my familiarity with French and Italian.  When it comes to writing (producing language), the process is a bit painful.  My academic vocabulary isn't as developed in the other languages, and the stylistic conventions of research publications in those languages are a skill that I don't yet have.

I am actually quite interested in developing that competency just for my own edification, but I would also love to be able to publish research in other languages; with practice comes more familiarity with, and ability to use, those languages anyway, and that's a goal that I strive toward.  That said, I started pondering the utility of publishing articles in other languages.  English seems to be the lingua franca for most things relating to my chosen fields of inquiry.  While there are journals and articles published in those other languages, they definitely seem to be the minority. So, inquiring minds want to know: how useful is it to publish in languages other than English, in the broad sense?  Would what I write get as big an audience as the materials in English? I'd like to do my part to increase access to research in other languages, so this may be a way.

I am also wondering what the logistics are to translate, and republish, my work in other journals.  If I publish in Open Access journals I suspect that it's easier to translate and re-publish elsewhere (with the caveat that the work has already been published elsewhere), but do people really do that?  Is this something that is currently acceptable in academia - or would it be too much of a disruption to the current publishing status quo?

International scholars, what are your thoughts?

Thursday, January 22, 2015

Axiology, Ontology, Epistemology, Researchology...

Alright, I made that last one up (probably).  This week (Week 2/14) in EDDE 802 we are tackling knowing, ways of knowing, "valid" knowledge, frameworks for research, and so on.  It's quite a huge topic, and something that even seasoned researchers keep coming back to and analyzing, debating, discussing, and re-evaluating. The prodding question this week, to get our mental gears kicking, is one of introspection - looking at our own beliefs and world views and seeing how those fit with the ontologies and epistemologies that we are reading about in the textbook.

The nice thing is that when I was preparing to teach my own (MEd-level) research methods course, the textbook was the same one we are using now (a previous edition, but the same nevertheless). So, between my experience as a learner (at the MA level) in research methods, my experience designing and teaching a course, and now the experience of being back in the learner's seat, one thing has become clear to me: regardless of how much one engages with this material, and how much it is discussed and debated, there is always more to come back to, scratch your head over, and ponder some more!  This isn't a bad thing; after all, research does not (and should not) apply cookie cutter methods - that approach is simply wrong.

From an ontological point of view, I think that our interactions with animate and inanimate objects give them meaning, but we can't ascribe any old meaning to those objects. There are certain (for lack of a better word) affordances for each object. A stone can be a toy (think game pieces for checkers), it can be a weapon, it can be made into a tool (which I guess is another type of weapon), it can be a measure or counter-weight. The point is that the stone isn't all these things on its own, but it has the potential to be those things, due to its properties, if you add some human ingenuity.

From an epistemological frame, I used to be squarely (more or less anyway) on the qualitative side. Since I was never in the hard sciences, the quantitative was never too strong with me. For the first leg of my graduate work (an MBA, and an MS in Information Technology) most "research" was really there to support decision making.  While there were quantitative components, seeing as I focused on Human Resources, I ended up concentrating more on the qualitative human factors and less on the scientific stopwatch management.

For the last leg of my studies prior to Athabasca, my applied linguistics department was, again more or less, all critical theory all the time - which is a consequence of having a critical theorist running the department and making hiring decisions while the department was small. Those who weren't critical theorists seemed mostly on the qualitative side of things. I do appreciate critical theory and the empowerment that it can bring. However, I think that too much of one thing makes you blind to other possibilities. Sometimes it seems to me that critical theorists use critical theory research to mask their opinions and rhetoric when there is little or no research involved.  I think that this is one of the weak points of critical theory and it should be addressed in some fashion.  Luckily, even though I no longer have daily contact with those critical theory professors (at least from a student/teacher perspective), I still get to flex those critical theory muscles and discuss things with fellow Rhizo14 colleagues :-)

As far as my default goes, if I can call it that, my preferred mode of inquiry seems to always start with the phrase "I wonder what would happen if..." I prefer to look at my collected data, look for patterns (if they exist), and think about what they might signify, instead of starting with a hypothesis and looking to prove or disprove it. Although this, too, might be problematic, because as humans I think we are, by nature, looking for patterns, and we might see patterns where there are none. I wonder if all researchers are this introspective...

What are your thoughts on this topic? How do you approach research and your own biases?

Friday, January 16, 2015

I dream of dissertation...

Week 1 of 15, of semester 2 of 8, of doctoral work is about to end!  The course that my cohort is focusing on this semester is a research methods course. Luckily neither I, nor, it seems, many of my classmates, are that new to research methods.  It's nice to have the group (or at least quite a few members of the group) exposed to the basics so that we can spend some time critiquing and going deeper (and that's something we did on our cohort's facebook group this week anyway).  I also appreciate the fact that the course isn't set up to allow for only one path through it.  There are certainly foundational materials that we are expected to read and know, but for presentations it seems like we have a ton of choices in terms of which research methods we choose to present.

I've been thinking about the assignments, and I think I will spend some time exploring research methods that I haven't had a ton of exposure to, or methods that I've been meaning to go much deeper into.  I think I will spend some time with Discourse Analysis - I've got a few books on the topic on my bookshelf that need some reading - and I think I will focus on autoethnography.  Some members of the #rhizo14 community (and I) are working together on an autoethnographic paper (aka the un-paper) for a special issue of a journal and for an OLC conference presentation.  Autoethnography is new to me, so I guess I'm trying to kill two birds with one stone - both the paper for the journal and something for this course.  The whole aspect of autoethnography is making me think of the dissertation.  I've gone through many potential ideas for a dissertation topic, including using design-based research to convert the course that I teach (INSDSG 684) from a closed, institutional course to an open course.  Seeing that the course was cancelled last fall (for low enrollment) and that this semester I don't even have the minimum number of students to run the course, I am not sure that banking on this approach is wise. I may find myself with a re-designed course, fully open, but without learners.  No learners means no data, and no data means that there is little to write about.

In doing some initial work on autoethnography (and this is really preliminary at this point), I was thinking of using my own experiences as a MOOC learner (going 4 years strong in 2015) to write a dissertation about my MOOC experiences.  I am not sure if I will draw upon the previous 4 years, or if I will spend 18-24 months MOOCing in xMOOCs, cMOOCs, pMOOCs, rMOOCs, and so on, and do much more data gathering than I have done in the previous years.  While I have quite a lot of material on this blog about MOOCs (over 200 blog posts...and counting), the data I currently have might be considered haphazard in its collection.

With the explosion of MOOC platforms, and the languages available, I am thinking that I could really sit down and learn in the various languages I know (including French, Italian, Greek, and so on).  It has been a really long time since I've considered myself an xLL (x = insert language of your choice) Language Learner.  One of the areas of research in linguistics is ELLs (English Language Learners) and how students who have another native language are learning academic materials in a language that is not their own.  When I returned from Greece in 1994 and started high school in the US, I was, in earnest, an ELL.  While the seeds for English were in my head (I was born here and spent some years here before I moved to Greece), my language development wasn't the same as that of fellow classmates who were English-speaking only and had their schooling in English all of their lives.  It's obvious, at this point, that English is a language in which I am no longer considered an ELL. However, how about French, and Italian, and even German (my German isn't that great)? I could pick up new knowledge in MOOCs, interact with classmates (in dreaded discussion forums), and not only pick up something new, but improve my capacity in those languages (in theory). I think this might make an interesting dissertation.

The only trepidation I have is the method: autoethnography.  While I do acknowledge the importance of critical theory in education and in research, and the validity of the researcher and their experiences as the object of research, part of me is uncomfortable with this. Is studying and researching one's self just a tad bit narcissistic? Also, what about the validity and applicability of the research findings, from a scientific point of view?  From a humanistic point of view what I write will be valid, as my own lived experience; however, what would my dissertation committee think of this approach?  Something to ponder.  What do you think?



Saturday, January 10, 2015

Is our current HigherEd setup encouraging prolonged (academic) adolescence?

In a recent posting about doctoral degrees ("academic" versus "professional") there was one line of thought that I meant to explore, but really neglected because it didn't quite fit in with the way the post ultimately flowed. In the ACM eLearn article that really got my mental gears going, and to which my post was a response, the professional doctorate holder "is more likely to consume research" (para. 5).

I find this statement problematic on many levels with regard to a doctoral degree, and the false differentiation between a PhD and an EdD, but I also find it problematic when I think of Higher Education in general.  My initial thought (last week) was that students, at the end of their Masters-level studies, should "consume" research; they shouldn't have to wait until they complete a doctorate in order to consume research.  After some time pondering the point, I started wondering if we've come to a point in Higher Education where we are prolonging the academic adolescence of our learners by pushing activities and skills, such as access to research literature, to higher levels of academic accomplishment.  As much as I don't like the label, I am at the beginning of the millennial generation. Higher Education was promoted to us, in High School, as the thing to do in order to be set up for a career. So, Higher Education was really a means to get a job, or so it was promoted.  Courses on philosophy, and ethics, and English composition - man, those felt like pulling teeth at times because I wasn't prepared to think like that, to think critically, because I always expected to get a knowledge dump in order to do a job. Despite the focus on employment and careers I think I did well; after all, I still talk about those professors with reverence.

Even at the MA level, some disciplines still seem like they are practicing knowledge dumping, or the filling of "empty" vessels (that's a whole other problem, but it's a topic for another post).  Research, and critically analyzing the research of others, isn't always something that we do in our MA-level courses.  Again, I count myself as lucky because throughout my graduate education I have had professors who did push us to think critically about what we were reading.  It felt like pulling teeth at the time, but I think we're much better for it.  This, however, wasn't done systemically, in a curricular way, but rather on a class-to-class, or professor-to-professor, basis.

When it comes to education, the statement above, about the consumption of research, is wrong on two counts.  First, research should not be consumed.  Consumption, to me, has the connotation that you are not thinking about what you are reading.  You are taking in, and taking at face value, what someone else writes, says, or does.  I think it's important, as individuals who have received a certain education, be it a BA, MA, or doctoral education, that we critically analyze what we are reading.  Some research may be bunk, some may be good. But even good research needs to be thought about critically.  Just because the authors of particular research saw it going one way, it doesn't mean that it can't be applied in other, unforeseen, but equally effective, ways.  Thinking about what you are "consuming" is an important aspect of being educated.

The second thing here is that everyone who's been educated, again regardless of the level, should be able to do this.  This is not the sole purview of those who complete doctoral work.  Granted, those with a doctoral background may have an expanded scope through which to view, review, and think critically about research work, but that comes with experience and a prolonged apprenticeship period; 4 years of higher education for the BA versus 10 or more for those with some doctoral degree.  I am wondering if such an attitude toward education (i.e., that the PhD is the domain of research consumption) is prolonging the learner's academic adolescence, not enabling them to be self-sufficient, life-long learners in their respective fields; thus - to some extent - making them dependent on those with a doctorate to feed them what they need to know and to act as gatekeepers for knowledge.

Thoughts?


Monday, January 5, 2015

Of MOOCs, online courses, content, and teaching - whoa, that's a lot!


Alright, being now back from my mini vacation, and back into the regular rhythm of work, reading, and (very soon) classes, I've caught up with a lot of my saved Pocket articles.  The one thing I saw is, still, the very schizo nature of MOOC reporting and commentary. This reminds me a bit of the headlines, back in the day on Engadget and other tech sites, about studies on cell phones causing/not causing cancer. In the MOOC context this is about whether or not MOOCs (in their many forms?) are/aren't good, revolutionary, the best-thing-since-sliced-bread, etc. Sometimes I feel like the Charlie Brooker of the field of education when I write these, but hey, that's not necessarily a bad thing.

Right before the holiday break, one of my colleagues sent me an article from Technology Review which he thought I should publicly respond to (being the crazy MOOCie that I am).  The article is What are MOOCs good for? and I may have read it a while back, but probably didn't write much about it since it really didn't provide me with any food for thought - it seemed, at the time, like more of the same.  On the second reading, prompted by my colleague, I came across three things. The first is what my colleague pointed out, in which the following was written:
“For all the hype, MOOCs are really just content—the latest iteration of the textbook. And just like a book on a library shelf, they can be useful to a curious passerby thumbing through a few pages—or they can be the centerpiece to a well-taught course. On their own, MOOCs are hardly more likely than textbooks to re-create a quality college education in all its dimensions.”
I think that this line of reasoning really shows a massive misunderstanding of MOOCs, or perhaps it's falling into the disillusionment brought on by the MOOC bubble driven by xMOOCs in 2012.  MOOCs aren't just textbooks for the new generation.  MOOCs aren't just about content.  I think that the way that open teaching and open learning have been "realized" (using this term loosely here) in xMOOCs does potentially lead the newbies in the group to say that MOOCs are just content.  This is the same rationale that my dad used about the internet five or so years ago.  He used to say that "it's just a huge library, so what?"  Well, now he is online, reading news, communicating with friends and relatives at a distance, and spreading his daily bits of wisdom to those who want to follow him on facebook.  I would argue that MOOCs, and courses in general, aren't about the content, but about the learning community that develops as part of a learning cohort.  I saw this in recent MOOCs like #Rhizo14 and Connected Courses, and I saw it back in 2011 with the mother of all MOOCs, aka Change11 (man, that was a long MOOC...but I made it to the end...was there a badge for that? ;-)  ).  I could have just taken content from these MOOCs, like I do in the xMOOCs, but I opted to be part of a learning community and write, comment, contribute, and remix as much as I could or as much as I wanted to.

Face-to-face, or traditional online courses, aren't immune to the "courses are only content" mentality.  It is this mentality that has faculty slap copyright notices on their syllabi, their Blackboard course-shells, their assignments, and their rubrics.  As the Greeks would say, όπα ρε μεγάλε!...or, loosely translated, whoa there, partner!  By slapping copyright notices on all of these things you are feeding into the erroneous weltanschauung that education is about content.  Perhaps it is partly about content, but the point is that learning content alone does not an education make. Education is about knowing the world around you, and knowing what to do and how to handle new and unexpected situations (at least that's my stance). You need some data to process first in order to learn, but you don't stay within the box that the data provides for you. Once your training with the data you have is done, you break through that box. Learning is about engagement, and engagement can be measured in many ways (side note: hence the issue of only measuring engagement in MOOCs by way of deficit - the dropout).

The next point in this article is actually about engagement. When talking about a computer science MOOC, one of the professors comments:
The paying Harvard students decide for themselves whether to attend the lectures or just catch them online. “I would like to think there’s a nontrivial psychological upside to the shared experience,” he says, but it’s up to them. Instead of necessarily having all 800 students attend each lecture, “I would rather have 400 students who want to be there,” he adds. Besides, “we’re nearing the point where it’s a superior educational experience, as far as the lectures are concerned, to engage with them online.”
The psychological up-tick may or may not be there, even in the on-campus experience (any researchers out there who have studied this? I haven't looked at the literature for this yet).  I am personally of the opinion that the mere physical act of being in the same room, at the same time, does absolutely nothing for learners. The sense that you were there only really works (for some people), in my book, for historically significant events, like where were you when the moon landing happened? or, for my generation, perhaps where were you when Chernobyl happened? I was too young for Chernobyl, but I certainly remember what I was doing on September 11th, 2001. This sense of shared togetherness only really works, in my book, for events of great shock and awe.  A lecture isn't necessarily inspiring to me.  That said, even in a MOOC, where information may be received asynchronously, we've time-shifted that togetherness through other means.  I don't need to look at the back of my classmate's head to know that I am not alone in the course.  I can see reactions (and engage) in the facebook group, or by following hashtagged content on twitter.

This reminds me a lot of discussions I've had on campus with colleagues in a couple of different departments. They have been concerned that my offering online courses (or allowing campus students to enroll in online courses) would cannibalize their campus offerings.  In effect, they were holding campus students hostage because they were local. As I've pointed out to many colleagues in the past, the question to answer is: what is the benefit of being on campus? If you lecture on campus, and you lecture online, with little interaction, why come to a 5 pm, 6 pm, or 7 pm class and go home close to midnight when you can do it from home, asynchronously? There are definite benefits to on-campus learning, but one needs to leverage those benefits not only for individual courses but for an entire curriculum and program of study.  If you are just going to campus to get lectured at, why not just take the course online, where there actually are opportunities for interaction even if the content is predominantly lecture driven?

Next on my list of Pocket articles is this Wired article about free online courses still being the future of education. First, it astounds me that in 2015 people still use "massively open" for MOOCs when, in most cases, this isn't true for xMOOCs.  There are many sources out there on this, but I would just direct you to read some of David Wiley's stuff as a primer on the co-opting of the term open. In any case, the following was in the Wired article:
This week, a team of researchers out of MIT, Harvard, and China’s Tsinghua University—all schools that offer MOOCs—released a study showing that students who attended a MIT physics class online learned as effectively as students who took the class in person. What’s more, the results were the same, regardless of how well the online students scored on a pre-test before taking the class.
Part of me is amazed that this is news, because those of us in the education field have been keenly aware of the no significant difference phenomenon for quite some time. This has been so firmly established that it's almost pointless to run new studies to try to prove/disprove it. Now, Wired isn't written by, or for, people in education, so this may be understandable, but why not point to the NSD phenomenon and add to it, instead of making this sound like groundbreaking research? The article continues on to say that:
...studies like the one from MIT are providing new fuel for people like Agarwal. It’s an affirmation of the very thing they’ve been saying all along: that it’s possible to get a quality college education without the hefty price tag.
Of course, but as I wrote in my previous post (on PhD/EdD differences), the issue isn't necessarily about education, it's about brand-name recognition. This is precisely what Daniel Lemire wrote in his blog post about lectures not being saleable. You can't make money off the lecture these days; what universities make money on is the credentialing of individuals.  Being able to get a good education for cheap isn't something new.  My father, someone who never went to high school, has been an avid reader all of his life. Even though he never had that formal schooling, and certainly never went to college, he's studied literature, classics, history, biology, anthropology, and a whole list of other topics, on his own, either through books he bought, or by going to the library to get books to read.  MOOCs are certainly not a pioneer of democratized content, and content like this isn't even accessible to everyone (see lack of internet access as an example).

Relatedly, I had a number of articles in my Pocket on the openness of MOOCs and MOOCs being confused with traditional online courses. This confusion is certainly nothing new. As early as 2012, when xMOOCs first started to show up as the new kid on the block, people confused the two (this IHE article is what I point my own students to). For most people the distinction is pretty hard to manage, and this seems pretty normal.  As Wiley points out, when you do an analysis of the differences between traditional online education and the MOOC, the difference is really the ability of the learner to self-register and attend for free.  While in on-campus courses we've welcomed auditors (or people who sit in for free in our lecture halls if there is space), in an online environment, where LMS enrollments are automated and connected with the SIS (student information system), it's not easy for auditors to join in for free because each auditor (at least at my institution) would have to be manually put into the course.  Additionally, as the Technology Review article points out, there is an issue with teaching in open spaces:
Yet while MOOCs’ huge enrollments are fantastic for running educational experiments, it makes them hard to teach. Pritchard’s MOOC represents a much wider range of abilities than his on-campus class at MIT. “It’s like we’re trying to teach from second grade up to seventh,” he says. His new project is an Advanced Placement physics course for high school students. By narrowing the target audience—high school students who believe they’re ready to take AP physics are likely to start within a fairly tight band of knowledge—he thinks he can teach more effectively than would be possible in a more diverse MOOC.
The problem, as I see it, is not one of teaching, but rather of expectations and design.  If you approach Open Teaching and Open Courses with the same design mindset as the one you have for traditional, small, private courses, then you are going to fail.  One needs to fundamentally rethink and reconceptualize what it means to teach, and what it means to design a course for at-scale deployment. When I was an undergraduate, the on-campus at-scale deployment (large auditoria) meant that the course had a lecture section led by a professor, and a discussion section led by a Teaching Assistant. In large xMOOCs we've seen this tactic used, but is this just a case of using existing paradigms for new problems, trying to fit a square peg into a round hole? This approach may work, and we need to experiment to see if it works, but should it be our only experiment?

The other problem is fundamentally a problem of audience. This is something we saw at my institution with one of the MOOCs we offered that was open to anyone. This meant that the MOOC was going to have materials for everyone from kindergartners learning about the topic to post-docs who were interested in learning more and engaging. I predicted that this would be a massive failure, and the reason is that you can't be all things to all people.  You might be able to build a platform that adapts to the learner's level and provides them with the materials that they need at their level, but at that point aren't you just a content provider? What about the learning community that I wrote about earlier? How does that get fostered when you have such a wildly heterogeneous group of learners in your course?

Finally, I want to close out this (lengthy) post with MOOC LMSs.  One of the posts I was catching up on was a Connected Courses post on the curious case of JANUX, the University of Oklahoma MOOC platform. I must admit that JANUX has been on my radar since (at least) last spring.  I signed up for a few courses, but I never had time to participate in them.  I think I was more busy with cMOOCs and other endeavors than anything else.  Since the courses, and their content, were archived, I thought I would make my way through them on my own time. I was not particularly keen on getting any sort of certification, so I let things lapse (as I have done with other open course platforms that have self-paced courses).  The LMS is a little janky, at least in a web browser, but my limited dealings with the iPad app seem to indicate that it's not a bad system.  That said, I don't know why anyone would opt to use this closed system.  In the courses I signed up for I don't see any way to download the videos.  This for me is a must.  In any system that is reported to be "open" I don't want materials to disappear at the end of the semester, which for me means that I want to retain a copy of any work I do (posted on this blog, scribd, or slideshare), and any of the materials that were used in the thought process around my work. This seems like a gamble gone wrong for the University of Oklahoma.  I think OpenEdX is a better choice (and it would be even better with federated search and single sign-on).

What are your thoughts on these topics?  Have I managed to daze and confuse? 

Sunday, January 4, 2015

Online Doctorates, degree designation, and misunderstanding of what it all means...



Happy new year to all! The other day I was catching up on some reading in my Pocket account when I read an article in eLearn Magazine about online doctorates. I feel like I should have a grumpy-cat image on this blog with a big "no" on it, since there were a number of things that seemed really wrong to me about this article. Some of them are probably the author's interpretation, or way of explicating things, and others are widespread academia myths (in my view). I think the article's author means well, as the crux of the article seems to be to research things well before you commit to any online program, but things seem to go awry in the explanations.  First, the author writes:

As the number of master's students from the initial flush of fully online degrees stabilizes, those interested in increased revenue streams have opened up the university gates wide and have started to look to doctoral-level education for the next big democratization of higher education.
I think that this is a gross misassessment of what online degrees do, including the doctorate.  I do think that there is a demand for more education, and because we are all too busy during traditional school hours to attend MA and PhD programs on campus, online programs are growing or expanding to fill the gap.  That said, this should not be mistaken for the democratization of education. Just because higher education is now potentially available to more people, via online delivery, it does not mean that there is an actual democratization of knowledge and information.  People still have to pay tuition and fees in order to be able to take those online programs.  When the financial cost of attending school is not $0, it's not democratization.  Democratization is when knowledge and information become available to all of those who want to partake in them, regardless of their socioeconomic background.  Online degrees, even the doctorate, are still only available to the elite who are able to pay for them!

The second thing that really didn't sit well with me is this artificial distinction between a "real" doctorate (PhD) and the "professional" doctorate (EdD in my discipline).  The distinction between the two, the one that I've heard over the years and the one that the author uses, is the following.  In a PhD (or academic doctorate, as the author refers to them):

The expectation for those in academic doctorates is that they will focus on the creation and dissemination of new knowledge in their disciplines. They will have experience in presenting their ideas to academic communities for criticism and feedback. Academic doctoral students are expected to publish their work in peer-reviewed journals with findings from their original research.
In a "professional" doctorate, like the EdD however:

[doctoral candidates] are trained differently. While much of the coursework will appear similar, those inside the academy can attest to the differences in the kinds of experiences for these two different tracks. Those with practical or professional doctoral preparation will focus on improving practice. They are more likely to learn to consume research rather than producing copious amounts of original research themselves. They focus on translating current research findings into practical implications for those in their fields. They tend to be leaders in their chosen practice areas, and typically don't work in academic appointments.
This, to me, is more or less bunk.  First of all, it seems to me that in most disciplines (i.e., those that are not considered lab sciences) this distinction is irrelevant.  In academia one would never hire a PhD in any of the disciplines I've studied only to conduct research.  Being part of academia, on the tenure track, means that you are conducting research, peer reviewing, serving on committees to provide service to the institution and to the profession, and teaching.  Teaching is about mentoring and about bringing the theoretical and the practical together so that students can see theory in application, and then in turn start hypothesizing and working things out on their own.  As a matter of fact, we've seen a whole alt-ac movement where PhDs who are thinking about not going into academia (because, let's face it, there is a dearth of tenure track jobs available) need the skills required to put theory into practice and to translate that knowledge for other audiences.  Are you telling me that they should have gone for a professional doctorate instead?

On the other hand, you've got the professional doctorate holder who, by this definition, seems to me to be the mama-bird who pre-digests food for her younglings.  This is also wrong. Professional doctorates might have more of a focus on practice (maybe skipping that required skilling for alt-ac); however, this does not absolve their holders of research responsibilities, if they want to be taken seriously.  People who graduate from professional doctoral programs need to be every bit as much a thought leader, to borrow a phrase from a friend of mine, as those graduating from academic doctoral programs. They still need to be able to conduct research as part of their day-to-day job, because that's how we determine whether theory, put into day-to-day use, actually works. No one needs to produce copious amounts of research to be an academic; they just need to produce good research.  Research isn't measured by the pound, but rather by impact. And what if someone with a professional doctorate wants to pursue the tenure track? Should this degree designation prevent them from doing so?

Finally, the author cites a 2005 article on the perceptions of hiring authorities, in academia, about online doctorates.  Leaving aside that this was 10 years ago and things will most likely have changed, I think the conclusion - that those who earned a doctorate by online means are not welcome to apply for academic postings - is too simplistic.  I think that we need to dive deeper into the nitty gritty here and ask: who was offering online doctorates ten years ago?  The only people I know who've earned online doctorates during that time period are people who went to Capella, or maybe even University of Phoenix.  The push-back felt in these cases is not because of the online doctorate, but because of what that online doctorate is associated with: a for-profit institution that is seen to lack quality (in other words, the perception of the diploma mill). Now, if the degree were attained through Harvard Online (if that had existed back then), I would say that an online degree would most likely be welcomed, if it were associated with a positive name.  It's not the online degree, but rather the name.


There is no excuse for a poor academic program - period - be it at the BA, MA, or PhD level. The sense that I am getting from articles like this is that the difference between a PhD and an EdD basically translates to "PhD = more rigorous" and "EdD = be done quickly, call yourself a doc" - this, to me, is a false conclusion. I personally don't see a difference, from a theoretical perspective, between a PhD and an EdD.  Your thoughts?