Saturday, September 22, 2012

17 More Universities on Coursera

This video is from the Coursera YouTube Channel.


The video above previews a free online course on digital sound design that Steve Everett of Emory University will teach via the Coursera course management system. The course explores fundamental principles of sound and auditory perception, and it covers techniques for recording, mixing, processing, synthesizing, sampling, analyzing, and editing digital audio. Everett's course is one of three currently offered on Coursera from Emory University.

Emory is one of 17 universities that recently signed agreements with Coursera to offer free online courses in topics such as applied biology, medicine, music, personal finance, economics, business, computer programming, explorations in the humanities, and more. The last time Coursera added this many partners was in mid-July, and if this round of additions mirrors the last, we'll see additional courses from these 17 universities over the next few months. Each currently offers at least a few.

Another interesting Coursera course, at least for those interested in how online courses combine textual, multimedia, and interactive elements, is Jay Clayton's course, Online Games: Literature, New Media, and Narrative. This course will explore what happens when online multiplayer video games are modeled after art, novels, or movies. The class will feature 10- to 20-minute video lectures, written assignments, and in-game interactive assignments as it focuses on the free-to-play elements of The Lord of the Rings Online game.

Rather than course credit, Coursera courses offer students around the world certificates of completion that may be signed by the course instructor. In August, Coursera encountered plagiarism cases in some of its online courses, but it has since made changes, including an honor code, to help discourage plagiarism. Check out the Chronicle of Higher Education for more in-depth discussions about Coursera and plagiarism.


It will be interesting to see what Coursera, which launched earlier this year, holds for us in the months ahead.

Saturday, September 15, 2012

Can Technologies Deteriorate the Ability to Think for Oneself?

Picture Credit: Jamie Bernhardt
A picture I made in Art Studio for iPad.


Thinking for oneself has been an important value in the West since the Age of Enlightenment, but the wide consumption, or “consumerization,” of mobile and web-based information technologies over the past two decades has prompted some to question whether our devices help or hinder our ability to think for ourselves. One stance is that, for people living through or after this proliferation of mobile information technologies, the ability to think for themselves will deteriorate, if it has not already. To an extent, I agree with this stance, but I think it is important to characterize that extent in terms of the skills or abilities used to think for oneself.

The term ‘thinking for oneself’ can specifically mean using cognitive or mental abilities to justify one’s beliefs or belief systems. In a broader sense, it can also mean using one’s cognitive abilities (e.g., attentional resources, working memory, long-term memory, or critical thinking) to understand something in ways that enable one to perform important mental procedures or tasks based on that understanding. For example, using one’s attention, memory resources, and critical thinking skills to construct a mental model of something can enable one to pose a question about something one does not know and then use those abilities to try to answer that question, not simply to justify one’s beliefs. This second, broader sense is the one I take this discussion to concern.

If using these information technologies can deteriorate the ability to think for oneself in any way, the deterioration would involve the following: (a) lack of use of the mental abilities that are essential to or necessary for thinking for oneself; and (b) deterioration (the reverse of development) of those abilities. However, we should note that a decline in the activity or use of something does not necessarily entail a deterioration of that thing’s elements or parts. To defend the claim that using contemporary information technologies at the scale at which many use them has deteriorated or will deteriorate the ability to think for oneself, proponents need to show either (i) that such usage essentially involves deterioration of the abilities required to think for oneself or (ii) that such usage could lead to the deterioration of those abilities.

Using web-based technologies like search engines (e.g., Google) to look up information has become important to the daily, business, and academic lives of Westerners. If people have a question and a (mobile) information technology device, they can type in a keyword, click “search,” and then use their mental abilities to skim or peruse information about that topic or question. People have had to get better at recognizing the type of resource they are looking for, since many different types of informational and multimedia resources of varying credibility are available on the World Wide Web, our primary way of consuming information. Consuming information, in and of itself, does not involve a lack of use of the skills essential to or necessary for thinking for oneself, nor does it involve a deterioration of those skills. Consuming information involves seeing, reading, and using one’s attention, memory resources, critical thinking abilities, and other mental abilities to understand, to build mental representations, and to integrate these with prior knowledge.

However, using technologies to consume this information does not necessarily involve critical, thoughtful use of our mental abilities at high levels of information processing. People who use information technologies and web-based technologies to search for information are thinking for themselves by figuring out what they are looking for, pinpointing a solution (e.g., using Google in such-and-such a way to find some kind of resource), and consuming the information in the resource. But their ability to think critically for themselves may diminish if they spend more time searching than processing information at high levels. This is because they may not be using their abilities, the same ones required for thinking for oneself, at the level necessary for thinking critically for oneself. If people are not using their abilities at such a high level, then, even though the abilities themselves do not deteriorate, their capacity to use those abilities at certain levels (e.g., using an ability or group of abilities to accomplish a finite set of tasks by some time or within a specific duration) may wither despite their having been capable of doing so in the past. This is the relevant sense of “diminish” here.

My suspicion is that what proponents want to argue, at a bare minimum, is that relying on information technologies, not simply using them, will stop people from thinking for themselves at a certain level of use of the required abilities. Still, using technologies does not necessarily entail a lack of performance at a certain level. An individual’s ability to perform complex tasks, such as sustaining a certain level of thinking for oneself while using or relying on an information technology, hinges on that individual’s particularities. Technologies do not obviate the need to think for ourselves, but they can obviate the need to think critically for ourselves if we let them. They improve our access to information, the material we think about, but for those who cannot use their mental abilities to support a certain level of information processing while interacting with technologies, relying heavily (and often) on technologies may mean a decrease in the capacity to think critically (i.e., at high levels) for oneself, not just a decrease in activity.

This does not spell inescapable doom for people who want to use technologies, think for themselves at high levels of cognition, and improve their mental abilities. Those who use technology primarily for mere entertainment rather than for human application, which involves both entertainment and the intent to improve something (e.g., one’s own learning abilities), may not always be able to think for themselves at certain levels of cognition. However, those who engage in the human application of technologies to improve their own learning abilities, not simply their access to information or the amount of information stored in their own minds, can maintain a certain level of thinking for themselves or reach higher levels. We must remember that thinking for oneself, no matter what one is interacting with, takes place within oneself, and it can be heightened only by improving the abilities necessary to do it.