Monday, August 31, 2009

Conversation Simulation Software

While I was exploring the options for conversation simulation software (I am not that skilled with Flash, so building something myself would take WAY too long), I came across KDSimStudio (via eLearning Learning).

I was really excited to try it out since it seemed straightforward and easy to use, and there was a demo version available. The software looks nice, but I found out that it only supports Roman character sets (maybe even just ASCII), so all my Greek looks like gibberish.
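Just to illustrate what I mean (this is a rough Python sketch of the general encoding problem, not anything taken from KDSimStudio itself), here is what happens when Greek text saved as UTF-8 gets read back by something that only understands a Latin codec or plain ASCII:

# Rough sketch: Greek text written out as UTF-8, then read back by a tool
# that only knows a Latin codec (Latin-1) or plain ASCII.
greek = "Καλημέρα κόσμε"  # "Good morning, world"

utf8_bytes = greek.encode("utf-8")  # what actually ends up on disk

# A Latin-1-only reader maps every byte to a single Latin character,
# so the two-byte Greek sequences come out as gibberish:
print(utf8_bytes.decode("latin-1"))  # prints mojibake instead of Greek

# An ASCII-only reader cannot decode the bytes at all:
try:
    utf8_bytes.decode("ascii")
except UnicodeDecodeError as error:
    print("ASCII can't handle it:", error)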

Too bad because I thought that this software would have been great for language education!

Friday, August 28, 2009

Google for Education

I love the phrase: Collaborating like it's 1999 :-)


Wednesday, August 26, 2009

What does a D stand for?

Earlier this month I was reading the sinkhole ahead blog post on Inside Higher Ed, which prompted me to read this little rant on the D written by the same author.

You know, it's funny: I've been a student for quite some time now and I've never thought much about the "D". One semester in my undergrad I just wanted to get a D in Calculus II so that I could pass and move on. Calculus II wasn't required for any subsequent courses, but I had to take it and pass it, and quite honestly I felt like I was being dragged behind a bus.

In any case, the debate is over what role the D actually serves. I've always thought of the letter grade system as being similar to the grading scale in my Greek elementary school:
A = Άριστο = Excellent
B = Πολύ Καλό = Very Good
C = Καλό = Satisfactory/Good
D = Μέτριο = So-so (not quite fail, not quite satisfactory, needs work)
F = 0 (Zero)

Now, one of the blog posts mentions the following:

D's make some level of sense if you believe the ancient fiction that a 'C' is an average grade. That hasn't been true for a long time, if ever, but if it were true, a 'D' would carry the relatively clear meaning of 'below average, but still acceptable.' Of course, if it were still acceptable, colleges would take it in transfer. But C's aren't really average, and D's aren't really accepted.


Now, my way of thinking about grades, I guess, falls under this 'ancient fiction', but the way I see it is that there is a mismatch between what the instructor thinks his grading system reflects and what the school's grading measures reflect. If we all graded based on a system that meant the same thing to all graders, things would improve. I also think that as people we've been conditioned to think of a C as bad because we all assume we are exceptional students. Should we strive to do our best in class? Of course! Should we all expect to be the crème de la crème? We can expect it, but that doesn't mean it's happening.

Now, to clear the air, I don't believe in bell curves and contrived ways of making students fit into a bell curve. If 95% of the class deserves an A and 5% deserves a D, that is how it should be graded, but a C should either be respectable (IF it means "good" or "satisfactory") or else the grading system needs to be reworked so that it actually makes sense.

Monday, August 24, 2009

Getting out of Grading - Seriously?


Earlier this month I was reading an article on Inside Higher Ed about a Duke administrator who went back to teaching and found grading so tiresome that she decided to outsource it...to her students! Yes indeed, students in her class graded each other's papers.

This professor writes:

I can't think of a more meaningless, superficial, cynical way to evaluate learning than by assigning a grade. It turns learning (which should be a deep pleasure, setting up for a lifetime of curiosity) into a crass competition: how do I snag the highest grade for the least amount of work? how do I give the prof what she wants so I can get the A that I need for med school? That's the opposite of learning and curiosity, the opposite of everything I believe as a teacher, and is, quite frankly, a waste of my time and the students' time. There has to be a better way....


I honestly fail to see what's superficial about grading. It's not a beauty contest among the students. Each class has certain educational outcomes. As the instructor for the course, you are in charge of saying whether the students have realized those educational outcomes and, if they have, to what degree. This isn't some voodoo that you perform to produce a student's grade; it's based on a rubric that you create from your intended educational outcomes!

Now, there are pedagogical reasons for letting classmates grade each other's papers, but that grading can't (1) be the sole grading criterion and (2) be self-guided; it's gotta be based on a rubric! If you don't do this, the students can simply sign a pact to give each other good grades.

I've had classes where I've graded classmates using a given rubric, but that wasn't their final grade. The final grade for that particular project was 75%-80% the teacher's assessment and 20%-25% the peer evaluations.
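If it helps to see the arithmetic, here is a quick Python sketch of that kind of weighted split (the 80/20 split and the scores are made-up numbers, not anything official from that class):

def final_project_grade(instructor_score, peer_scores, instructor_weight=0.80):
    """Blend the instructor's rubric score with the average peer rubric score."""
    peer_average = sum(peer_scores) / len(peer_scores)
    peer_weight = 1.0 - instructor_weight
    return instructor_weight * instructor_score + peer_weight * peer_average

# Hypothetical example: the instructor gives 88/100, three peers give 95, 90, and 92.
print(final_project_grade(88, [95, 90, 92]))  # -> about 88.9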

All things considered this professor comes off as lazy to me.

Friday, August 21, 2009

The Wrath of Khan

A little Friday PhD humor for you:



I have to say that I've never been that inventive with my project names :-) I just go off swearing up a storm when something does not work ;-)

Tuesday, August 18, 2009

On ESL and critical thinking - some reactions

I was reading a post titled Language learning, critical thinking and the role of the teacher on The Linguist the other day and I was really surprised. Now, granted, I am not a member of his listserv; perhaps I should be, to get the whole story, even though ESL isn't my immediate field of interest.

Now, long story short, it appears that some people have their feathers ruffled over whether critical thinking should (or should not) be included in the foreign language curriculum. Personally I think that critical thinking activities should be part of the curriculum in any language learning situation, because when you are learning a language you are also learning about many other things that influence that language - such as culture, history, popular sayings, predispositions of the natives, and so on. Language is not used in a vacuum, and simply learning more vocabulary doesn't mean that you will necessarily be getting more comprehensible input.

Yousef writes (in the comments)
I don't think it's outright racism, but there is certainly an element of cultural superiority and just plain smugness.


Perhaps, perhaps not. The point is that when you are learning a new language, your Weltanschauung changes, or has the potential to change. Some (bad) teachers will be smug about it. Most teachers that I've come in contact with are not smug about the way they think (critically or not). They wanted to help me and my classmates learn.

I find it funny that Steve writes
I would ask them to listen on their MP3 players as much as they can, and to try to reduce their exposure to their native language, so that the brain has a chance to develop an ability to handle English.


It is quite possible that learners of a foreign language do not have access to playback devices like MP3 players. It all depends on the context of the language learning: the who, why, where, and by whom.

I also disagree with Steve about only interacting with texts that are of interest. If you only do that, you are handicapping yourself because you aren't picking up vocabulary and grammatical structures for other situations that you will need to know about. If we all learned about topics we were predisposed to want to learn about in school, we would never be exposed to things that we may like, or that we should know. Situational language, something Steve apparently does not like, is a good springboard to other topics while grounding the learning in something concrete that people will use.


As far as the original question goes: "Is no one here just interested in improving the learners' language skills?"...well, it depends on what you mean by language skills. What are your rubrics? Without rubrics, how will you know how much your learners have improved?

Sunday, August 16, 2009

Should we abolish copyright on academic works?


...my two cents...

I saw this on Techdirt about a month ago and it's been lingering in my Google Reader starred items ever since. I've made a good-faith effort to read the original, but my brain is a bit fried from this summer (and I would like to save a few brain cells for the fall semester).

Here's the abstract for the paper:
The conventional rationale for copyright of written works, that copyright is needed to foster their creation, is seemingly of limited applicability to the academic domain. For in a world without copyright of academic writing, academics would still benefit from publishing in the major way that they do now, namely, from gaining scholarly esteem. Yet publishers would presumably have to impose fees on authors, because publishers would not be able to profit from reader charges. If these publication fees would be borne by academics, their incentives to publish would be reduced. But if the publication fees would usually be paid by universities or grantors, the motive of academics to publish would be unlikely to decrease (and could actually increase) – suggesting that ending academic copyright would be socially desirable in view of the broad benefits of a copyright-free world. If so, the demise of academic copyright should be achieved by a change in law, for the ‘open access’ movement that effectively seeks this objective without modification of the law faces fundamental difficulties.


Now, as a student in academia, my writing has been my writing. No one else could profit from it (i.e., take the credit). Presumably I could take an idea that I had in the classroom, one that I alluded to in some paper, and go out and sell it and make money.

Now, there are many people out there who research, ponder, and write. They create new knowledge (or validate old hypotheses). These people get the street cred; after all, their names are on the paper that they submit and no one can take that away. But the money gained from the purchase of that article does not go back to the original authors but to the journal that printed it or made it available in some form. Working in a library, I know that journal subscriptions cost A LOT of money, none of which the authors see (as far as I know).

What's funny is that many academics I know are willing and complacent in this. They are so concerned with tenure (or getting from one professorial rank to another) and with their course load that they don't seem to mind that other people are profiting from their work!

Should we abolish copyright on academic works? Yes, we should. Academic work should be available under Creative Commons licensing schemes, because academics are creating knowledge that can benefit us all. It seems unethical for people who did not contribute to the knowledge creation cycle to benefit heavily from the work of others and then create a walled garden where content is only accessible to those with fat wallets.

Friday, August 14, 2009

Classes | over

Wow, classes are over!

I suppose I should pop the cork off some wine or something and celebrate - then again school starts again in a couple of weeks so it will be a short lived celebration :-)

This summer went by quite fast. I don't know if it was the crazy weather (mostly gray and rainy), or the fact that I had homework in the summer. Oh well. I still have at least three weeks of homework-free (and Blackboard-free!!!) time to enjoy the rest of the summer :-)

Hopefully this time next year I will be done with my Instructional Design degree!

Wednesday, August 12, 2009

Selecting an LMS

Selecting an LMS is probably not an easy thing for an organization, because different faculty will have different requirements for their classes. The exercise in LMS selection then becomes a balancing act between cost, ease of use, and fulfilling as many of the user requirements as possible.

Last summer, when I was taking INSDSG 619, we spoke about these issues, but only at a surface level since that wasn't the focus of the course. I think it would be great to offer a course on LMS selection and administration so that students can get their hands dirty with a few types of LMS before graduating. This, of course, would require the 800-lb gorilla in the room (Blackboard) to work out a deal with the university/department to allow for cheap or free experimentation :-)

I came across this small checklist for those who are in the process of thinking about an LMS: click here. In lieu of a full course, it's good enough to get you started thinking about the issues :-)
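For what it's worth, one way to make a checklist like that concrete is a simple weighted scoring matrix. Here is a toy Python sketch; the criteria, weights, candidates, and scores are all invented for illustration, not recommendations:

# Toy weighted-scoring sketch for comparing LMS candidates.
# Criteria, weights, and scores below are made up for illustration only.
criteria_weights = {
    "cost": 0.30,               # lower cost scored higher
    "ease_of_use": 0.40,
    "meets_requirements": 0.30,
}

# Scores on a 1-5 scale for each (hypothetical) candidate.
candidates = {
    "Blackboard": {"cost": 2, "ease_of_use": 3, "meets_requirements": 5},
    "Moodle":     {"cost": 5, "ease_of_use": 3, "meets_requirements": 4},
    "Sakai":      {"cost": 5, "ease_of_use": 2, "meets_requirements": 3},
}

def weighted_score(scores, weights):
    """Sum of each criterion's score times its weight."""
    return sum(scores[criterion] * weight for criterion, weight in weights.items())

# Print candidates from best to worst overall score.
for name, scores in sorted(candidates.items(),
                           key=lambda item: weighted_score(item[1], criteria_weights),
                           reverse=True):
    print(f"{name}: {weighted_score(scores, criteria_weights):.2f}")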

Monday, August 10, 2009

Depth or Breadth?

I was reading this on Slashdot the other day about a person going back to school to complete their computer science degree.

Here's a quick quote:

I recently went back to college to finish my CS degree, however this time I moved to a new school. My previous school taught only C++, except for a few higher level electives (OpenGL). The school I am now attending teaches what seems like every language in the book. The first two semesters are Java, and then you move to Python, C, Bash, Oracle, and Assembly. While I feel that it would be nice to get a well-rounded introduction to the programming world, I also feel that I am going to come out of school not having the expertise required in a single language to land a good job. After reading the syllabi, all the higher level classes appear to teach concepts rather than work to develop advanced techniques in a specific language. Which method of teaching is going to better provide me with the experience I need, as well as the experience an employer wants to see in a college graduate?



Now, there are a ton of opinions in the Slashdot thread, and geeks and non-geeks alike should have a look, because it poses a good question about what type of education you should get. Should it be as broad as possible? Or should it be narrower but more comprehensive?

This story also brings up an interesting exchange that I had with my undergraduate advisor in computer science. My computer science program did not take the breadth approach, but rather the narrower one. Yes, we did learn about automata, basic and advanced algorithms, logic, and so on (so all the things that are mentioned in the comments section, and all the things that every computer scientist should know), BUT we didn't do a lot of languages. We covered Java, ANSI C, and x86 Assembly, and if you took specific electives you would get PL/SQL and Smalltalk.

The problem for me was that I was not being familiarized with more of the languages that exist out there in the real world (like C#, for example). What I failed to realize back then is that Java, C, and assembly are really all you need to get started. My advisor told me that the program focuses on concepts (well, duh!) and that I can easily learn any language I want on my own. The issue I had was that the languages in the curriculum weren't used a whole heck of a lot: two semesters of Java, two of C, and one of assembly.

Yes, you need to take the bull by the horns and program your own projects and have what the Greeks call μεράκι (I guess the closest equivalent is the concept of being "jazzed" about something), but as an undergraduate with a full course load and a job, it's not easy to fit in projects just for fun.

Personally, I would have preferred becoming familiar with more languages so that I could then practice more on my own, rather than this uncomfortable in-between.

What do you think?

Saturday, August 8, 2009

The point of college, and other diatribes

This past week I saw an article on the BBC and a blog post on the Brazen Careerist network that go well together - like w(h)ine and cheese. Yes, the bad pun was intended.

The BBC article centers on a jobless graduate in New York who is suing her college because she has failed to get a job since graduation.

As the BBC reports:

She is seeking to recover $70,000 (£42,000) she spent on tuition to get her information technology degree


and

The ex-student, who received her degree in April, says the college's Office of Career Advancement did not provide her with the leads and career advice it had promised.

"They have not tried hard enough to help me," she wrote about the college in her lawsuit.
Her mother, Carol, said her daughter was "very angry at her situation" having "put all her faith" in her college.



On the Brazen Careerist we see yet another pointless article about Personal Branding...or rather the blogger's conviction that colleges should teach Personal Branding. What's funny is that the blog post isn't really about personal branding (whatever that is - as far as I am concerned it's a bullshit term). Rather, the post is about "how useless his college degree is," as he puts it in his own words:

I studied one of the least practical majors at my college: Classical Languages. I learned 5th century BC Attic Greek and Latin. I read Homer and Caesar and Herodotus. I spent hours learning languages I’ll never speak in my life. If I went to Greece, I wouldn’t even be able to ask for directions to the bathroom—that’s how useless my college degree is!


and also how colleges should offer courses on how students could talk up their skills:

Students need to learn how to talk-up their skills and abilities. They need to be able to explain how spending a semester studying Spanish in a third world country translates into desirable traits for an employer. They need to learn how to brand themselves not as the “impractical English major” but as someone who really understands communicating and how to write well.


Quite honestly, both the blogger and the person in the BBC article fail to realize that college isn't about cookie-cutter approaches. It's not about giving you one specific skillset that you can use to get a job, but rather a set of theoretical foundations and practice that you can apply to any job. It's also not the college's job to teach you how to talk up your skills or to find you a job (although that would be a nice cherry on top).

I agree with a commenter when he says:
I disagree though that its colleges responsibility to teach how their degree applies to the real world. Transforming theoretical knowledge to real world requires good analytical brain and ability to adapt. One can't learn these skills in a classroom. They are learned over experiences in life.


Oh well...another day in academia :-)

Wednesday, August 5, 2009

Death by webinar

I was reading about the deadly online seminar (or death by webinar, as I call it) on CogDogBlog recently. I couldn't help but smile because it reminded me of a Death by PowerPoint presentation that I had created a couple of years ago.

I have to say that I echo all of the author's gripes about these types of webinars, and it is the reason I generally hate Wimba sessions when we have them. Most Wimba sessions I've been to have been, essentially, a broadcast of information with little input or feedback from the audience (other than the "raise your hand" feature). It's also really hard to contribute without seeing the faces of the people in the room. The paralinguistic features of communication are really marginalized in Wimba.

Monday, August 3, 2009

Digital Natives - are they really natives?

I was reading this article on Inside Higher Ed recently for a case discussion for one of my classes centered around Dr. iCranky. It is a pretty interesting article, and what's more there are some pretty interesting comments.

Boiling Dr. iCranky's letter down: it's about faculty being forced to adopt shiny new technology in the name of Millennials (aka digital natives), the new type of student filling the lecture halls, and how faculty had better get on board.

I also read this article on First Monday. Here's the abstract:

Educational technology advocates claim today’s students are technologically savvy content creators and consumers whose mindset differs from previous generations. The digital native-digital immigrant metaphor has been used to make a distinction between those with technology skills and those without. Metaphors such as this one are useful when having initial conversations about an emerging phenomenon, but over time, they become inaccurate and dangerous. Thus, this paper proposes a new metaphor, the digital melting pot, which supports the idea of integrating rather than segregating the natives and the immigrants.


I think both of these articles go together. I've seen a lot of students cross my path, from different generations. People who would fall into the digital natives category know their way around MySpace and Facebook, but when it comes to academic computing they don't know that much, and what's more, they don't have the tools to figure it out on their own. I think equating knowledge of one service with broad knowledge of computers and troubleshooting is flawed.

As a side issue, I think that people come to school to learn things that they don't know. While some academic technology can help students, it is important to acclimate students into an environment that they don't know much about. They should know how to use paper based resources to do their work. They should be taught how to be both cyber-sleuths and real-life sleuths when seeking information. And, they should be taught how to be troubleshooters when things don't go well with technology.

I also think that this digital natives argument neglects to mention people who cannot afford to be brought up in the warm embrace of technology. I've met many people, both local and from abroad, who never grew up with a computer; they just know basic email and word processing, and that's it (if that!), yet they are part of the generation that we call Millennials (aka digital natives).