Monday, December 12, 2011
Monday, December 5, 2011
In Cognitive Surplus, Clay Shirky points out that major changes to society often happen so quickly they don't leave time for anyone to adjust. This results in chaos that traditional solutions can't fix. In fact, tradition has been displaced. There is no plan for going forward, but no way to go back. Shirky's main example is the beginning of the Industrial Era in London when a rapid influx of people into the city created social chaos, new opportunities for leisure, and mass drunkenness. Eventually, people began to use their surplus time to organize, become educated, and develop civic infrastructure. Ergo, we have modern democracy. (Sort of)
We are entering an era of chaos. The traditional approaches to education are being challenged by rapid changes in technology and economic pressures. And if the old model falls, we're simply not ready to replace it. We don't have any good ideas. Or...we have a lot of good ideas but no big picture.
In his opening post for Week 13 of the #change11 MOOC, Clark Quinn addresses this problem:
I’m really arguing for the need to come up with a broader perspective on learning. I’ve been calling it learning experience design, but really it’s more. It’s a combination of performance support and learning (and it’s badly in need of some branding help). The notion is a sort-of personal GPS for your knowledge work. It knows where you want to go (since you told it), and it knows where you are geographically and semantically (via GPS and your calendar), and as it recognizes the context it can provide not only support in the moment, but layers on learning along the way. And I think that we don’t know really how to look at things this way yet; we don’t have design models (to think about the experience conceptually), we don’t have design processes (to go from goal to solution), and we don’t have tools (to deliver this integrated experience). Yet the limits are not technological; we have the ability to build the systems if we can conceptualize the needed framework.
[....] There’s lots more: addressing the epistemology of learners, mobile technologies, meta-learning & 21st C skills, and deep analytics and semantic systems, to name a few, but I think we need to start with the right conceptions.
Quinn suggests we think about "slow learning" as a way to make the reality of how our brains work match the pace and functional aspects of education design.
This sounds great. What I don't like is GPS as a metaphor. The problem is that the GPS simultaneously knows too much and too little. Have you ever watched someone follow their GPS around and around the block, expecting the little robot to do all the work? Chances are, if the driver just looked up and read a few street signs or used common sense, he could save 15 minutes of wandering.
We need to get lost. We don't need our locations constantly re-calibrated. Learning often means getting lost in the woods and finding your way out. It doesn't mean having a controlling voice talking you through everything, measuring, assessing, re-assessing. That's one of the big problems with education now. Let's not replace a human program with a digital one. You can't hear yourself think with such a dominant narrator.
A better metaphor, I think, is the Hero's Journey of Joseph Campbell. Most of those myths would have been utterly destroyed with the careful directions and constant updates of a GPS. Heck, even Luke Skywalker, in his world of advanced technical wizardry, needed to close the blast shield and listen to his own voice.
It might be trite, but it might be true: do we need to focus more on the journey, less on the destination? (Destinations are good, too). The GPS won't shut up until the correct result is reached. We become dependent. We need her every time we go somewhere. We never learn to shut her off and read the landscape directly.
Friday, December 2, 2011
In Stumbling On Happiness, Daniel Gilbert says that all psychologists are required, at some point in their careers, to write a sentence that begins with "The human being is the only animal that..." According to Gilbert, the answer is "imagines the future." My personal choice would be "wears socks with sandals," but I haven't done the field research to back that up.
A recent Discover Magazine article takes a crack at this and comes up with the following answer: Humans are the only animals who teach.
At first, this seems like an overreach. Anyone who has watched chimpanzees interact has seen acts of imitation. It seems clear that other primates teach, and, just like humans, often do it for peanuts.
Discover Magazine is ready for this objection:
I know this may come as a surprise, but it does so because we tend to mix up teaching and learning. A young chimpanzee can learn how to smash nuts on a rock by watching an older chimpanzee in action. And when she grows up, her own children can learn by watching her. But in these situations, the students are on their own. They have to watch an action and try to tease apart the underlying rules.
I think we're about to head down a rabbit hole having to do with "intention" and the depth of conscious awareness in primates. Ultimately, this would be a pointless debate. The question is, "Is our children learning?" No, sorry, old joke.
The question is, "Has Learning Occurred?" We aren't necessarily going to know how it happened, who was responsible for it, or what the exact ingredients of the educational cocktail were. Therefore, it doesn't matter whether a child learns to tie his shoes because he was taught or because he imitated an adult. In fact, these distinctions are merely linguistic.
This discussion makes me think of the so-called "self-taught" learner. That term is a misnomer. The only self-taught learners we have on record are children raised by wolves or neglected in massive orphanages. Those children have little or no human contact. They must teach themselves. They don't do so well.
Just like you can't avoid learning (it's in our DNA), you can't avoid teachers. The world is a teacher. Maybe not on purpose, but it is.
Thursday, December 1, 2011
Based on I.S.T. (Internet Standard Time), this 2010 David Carr article about Twitter is ancient. However, I've just come around to David Carr after watching the brilliant Page One, so forgive me for coming late to the party.
Carr, who was initially a Twitter skeptic, has come to find great value in the micro-blogging software:
At first, Twitter can be overwhelming, but think of it as a river of data rushing past that I dip a cup into every once in a while. Much of what I need to know is in that cup: if it looks like Apple is going to demo its new tablet, or Amazon sold more Kindles than actual books at Christmas, or the final vote in the Senate gets locked in on health care, I almost always learn about it first on Twitter.
I find this to be true when preparing for class or exploring ideas for research. If you're following the right people on Twitter, it can be an endless source for material. Carr's quote also made me think of a passage from Shunryu Suzuki's Zen Mind, Beginner's Mind, where he explains the proper way to draw water from a stream:
If you go to Japan and visit Eiheiji monastery, just before you enter you will see a small bridge called Hanshaku-kyo, which means 'half-dipper bridge'. Whenever Dogen-zenji dipped water from the river, he used only half a dipper, returning the rest to the river again, without throwing it away. That is why we call the bridge Hanshaku-kyo, 'half-dipper bridge'. It may be difficult to understand why Dogen returned half of the water he dipped to the river. When we feel the beauty of the river, we intuitively do it in Dogen's way. It is in our nature to do so.
I guess it's best to avoid drinking too deeply from the stream of information, to let some of the water pass back into motion. Carr warns that Twitter's power can wash you away:
All those riches do not come at zero cost: If you think e-mail and surfing can make time disappear, wait until you get ahold of Twitter, or more likely, it gets ahold of you. There is always something more interesting on Twitter than whatever you happen to be working on.
All that gurgling can also be misleading. Carr quotes Here Comes Everybody author Clay Shirky, who has long praised the wisdom of crowd-sourcing your problems and allowing the hive-mind to go to work (how's that for larding up my prose with buzzwords?):
Twitter helps define what is important by what Mr. Shirky has called “algorithmic authority,” meaning that if all kinds of people are pointing at the same thing at the same instant, it must be a pretty big deal.
Maybe. You'll see "Kim Kardashian" trending on Twitter more frequently than "Eurozone." Collective intelligence is powerful, but so is collective ignorance. Sometimes the stream of consciousness is just water under the bridge.
Wednesday, November 30, 2011
Bryan Caplan, who often presents himself as the paragon of reason and reasonableness, has written an incredibly illogical article about education called "The Magic of Education." Here, Caplan uses "magic" as a shorthand for "things he doesn't understand." This is a common trick: self-proclaimed reason-meisters dismiss anything that involves more complexity than a land-line poll as "woo woo." Caplan mocks his own profession (he's an Economics professor, but I bet you guessed that based on his glasses) by describing how he thinks unenlightened teachers view the education process:
Step 1: I open my mouth and talk about academic topics like externalities of population, or the effect of education on policy preferences.
Step 2: The students learn the material.
Step 3: Magic.
Step 4: My students become slightly better bankers, salesmen, managers, etc.
This is, of course, a huge straw man argument. Obviously no one (except the insane) over the age of 8 believes in magic. By using this term, Caplan creates a slam dunk argument for himself. To disagree with him is to believe in fairies, the power of crystals, and auras. This is lazy.
He complains that academics don't live in the real world (more on this later), but that isn't Caplan's problem at all. He needs to get out of his department more and run his articles past the Philosophy Department in order to correct his soft thinking. (This is not to mention that most economic theories have more in common with magic than does traditional pedagogical thinking.)
Here is Caplan's biggest philosophical error, and it's based on such a terribly trite bit of rhetoric, we might suggest he head over to the English Department after visiting the logicians. It's that tired line that some kind of "real world" exists, its outward circumference becoming visible just as College Street begins its ascent up the hill toward the glistening ivory tower where absent-minded, bearded, sandal-wearing gnomes frolic in the clouds and pass around 300-page dissertations on the anti-agrarian symbolism of Joyce's use of the semi-colon in Ulysses. (Okay, some of that is true.)
Here, Caplan is free to leave fantasy land and explore the real world of any office setting. (NBC's The Office, to many people I know, is not a farce, but a striking bit of realism. Many doctors have also told me that Scrubs is the most realistic depiction of hospitals television has ever seen.) This is not to mention the proliferation of magical thinking found in Anywhere, USA. Does Caplan honestly think that harder-working, more reasonable people will be found if we simply hop over the brick walls of the academy and mingle among the "commoners," you know, the residents of the real world?
What a silly, stupid distinction. It crumbles upon the slightest questioning. Is an over-worked adjunct with two kids and a freelance job in the real world or the fake world? Is a lazy, frequently unemployed construction worker who believes in Voodoo in the real world or the fake world? What if he jogs a few blocks to the local college? What if he takes one class on campus? What if he straddles the property line of the campus while holding Chaucer in one hand and the National Enquirer in the other?
If there is no magic occurring in Bryan Caplan's classroom, it's probably his fault.
Wednesday, November 23, 2011
Tuesday, November 22, 2011
Today's philosophies of the future are built on three metaphysical assumptions from the past:
1) The projection of human consciousness onto machines.
2) A teleological world view (influenced perhaps by Pierre Teilhard de Chardin) that posits a civilizational movement toward unity and global interconnection, toward an omega point of history.
3) A belief in the goodness of technology and the benevolence of machines.
Despite the reliance on science and technology, these latent metaphysical assumptions result in a kind of techno-religion, replete with the promise of immortality, transcendence, and enlightenment.
If we are seeing the birth of a new religion, then its central prophet is futurist Ray Kurzweil. In the documentary Transcendent Man, we learn of Kurzweil’s bold prediction: coming advances in genetics, nanotechnology, and robotics will result in intelligent machines that will exceed human capabilities, allowing us to merge with the machines and transcend our biological limitations. This is referred to as The Singularity, and it promises immortality and super-human powers. (Sound familiar?)
It all sounds plausible until the documentary focuses on Kurzweil's strange obsession with bringing his father back to life. Then you see that Kurzweil is just in need of a good Freudian analyst, and that, perhaps, his entire theory is driven by a hyper-inflated fear of death.
Perhaps the apocalyptic metaphysics of the internet are mere wishful thinking? Yet another baroque system designed to fix a deadly system failure?
The Singularity depends on all three assumptions I listed above. Its supporters seem unaware that the theory is as much metaphysics as physics. (Atoms are immortal; Adams are not.)
And while, as I've noted here, advances in artificial intelligence may challenge our human-centric notions of consciousness, we currently seem much too eager to describe the internet and advanced networks in terms that degrade human consciousness and creativity, as if sentience were merely an algorithm.
Sunday, November 20, 2011
[....] if we don’t start writing and advocating for a very different vision of learning in real classrooms, one that is focused not just on doing the things we’ve been doing better but in ways that are truly reinvented, one that prepares kids to be innovators and designers and entrepreneurs and, most importantly, learners, we will quickly find ourselves competing at scale with cheaper, easier alternatives that won’t serve our kids as well.[....] That in this moment, 20th Century rules will not work for 21st Century schools. That direct instruction and standardization will make us less competitive, not more. That those strategies will make our kids less able to create a living for themselves in the worlds they will live in. That as difficult as it may be for some to come to terms with, this moment requires a whole scale “radical rethink” in much different terms from the one Jeb Bush wants, the same type of rethink that newspapers and media and businesses and others are undergoing.
Friday, November 18, 2011
Thursday, November 17, 2011
Wednesday, November 16, 2011
What difference does this abundance make in the big questions humanity faces?
The movie Idiocracy, written and directed by Mike Judge (Beavis & Butthead, Office Space), is not very good. It plays the same note over and over again. Yet despite the comedy, that note is a mournful one, a minor chord that resonates with our perception that public discourse is dumb and getting dumber. It's a weak movie, but a great conversation piece.
In a radical rethinking of what it means to go to school, states and districts nationwide are launching online public schools that let students from kindergarten to 12th grade take some—or all—of their classes from their bedrooms, living rooms and kitchens. Other states and districts are bringing students into brick-and-mortar schools for instruction that is largely computer-based and self-directed.
Tuesday, November 15, 2011
Consider the efforts of Frances Harris, librarian at the magnet University Laboratory High School in Urbana, Illinois. (Librarians are our national leaders in this fight; they’re the main ones trying to teach search skills to kids today.) Harris educates eighth and ninth graders in how to format nuanced queries using Boolean logic and advanced settings. She steers them away from raw Google searches and has them use academic and news databases, too.
But, crucially, she also trains students to assess the credibility of what they find online. For example, she teaches them to analyze the tone of a web page to judge whether it was created by an academic, an advocacy group, or a hobbyist. Students quickly gain the ability to detect if a top-ranked page about Martin Luther King Jr. was actually posted by white supremacists.
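The Boolean narrowing Harris teaches can be sketched in a few lines of Python. Everything here is invented for illustration: the tiny document set, the search terms, and the `matches` helper are hypothetical, and this is a toy model of AND/NOT logic, not her curriculum or any real database's query syntax.

```python
# Toy illustration of how Boolean operators narrow a search.
# Documents and query terms are made up for the example.
docs = {
    "bio": "martin luther king jr led the civil rights movement",
    "hoax": "martin luther king site run by a supremacist group",
    "recipe": "king cake recipe for mardi gras",
}

def matches(text, all_of=(), none_of=()):
    """Require every term in all_of (AND) and forbid every term in none_of (NOT)."""
    return all(t in text for t in all_of) and not any(t in text for t in none_of)

# Query: king AND "civil rights" NOT supremacist
hits = [name for name, text in docs.items()
        if matches(text, all_of=["king", "civil rights"], none_of=["supremacist"])]
print(hits)  # ['bio']
```

A raw one-word search for "king" returns all three documents; the AND clause drops the recipe, and the NOT clause guards against the supremacist page, which is exactly the kind of filtering a nuanced query does that a naive search does not.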
The pipe is more important than the content within the pipe. Our ability to learn what we need for tomorrow is more important than what we know today. A real challenge for any learning theory is to actuate known knowledge at the point of application. When knowledge, however, is needed, but not known, the ability to plug into sources to meet the requirements becomes a vital skill. As knowledge continues to grow and evolve, access to what is needed is more important than what the learner currently possesses.
Friday, November 11, 2011
Other studies have found the same thing: High school and college students may be “digital natives,” but they’re wretched at searching. In a recent experiment at Northwestern, when 102 undergraduates were asked to do some research online, none went to the trouble of checking the authors’ credentials. In 1955, we wondered why Johnny can’t read. Today the question is, why can’t Johnny search?
Thursday, November 10, 2011
Tuesday, November 8, 2011
It includes a great story of some thieves returning Mr. Rogers' car after finding out it was his, leaving a note that said, "If we'd known it was yours, we never would have taken it."
Once, on a fancy trip up to a PBS exec's house, he heard the limo driver was going to wait outside for 2 hours, so he insisted the driver come in and join them (which flustered the host). On the way back, Rogers sat up front, and when he learned that they were passing the driver's home on the way, he asked if they could stop in to meet his family. According to the driver, it was one of the best nights of his life: the house supposedly lit up when Rogers arrived, and he played jazz piano and bantered with them late into the night. Further, as with the reporters, Rogers sent him notes and kept in touch with the driver for the rest of his life.
Monday, November 7, 2011
One other feature of Google+ that makes it a truly "killer app" for education is Hangouts. With a webcam- and mic-enabled computer, or a phone or tablet with a front-facing camera, you can have a real-time, online meeting with up to 10 simultaneous video streams. Google recently added Google Docs integration and screen sharing, making it even more compelling. Now a group of students, or a teacher with some students, can meet "face to face" and edit one document in real time, with side-chat functionality as well.
The use of Hangouts for education is significant. Small-group ad hoc meetings between students from anywhere at any time. Study groups accessed from the palm of your hand. A teacher who can join a group at any time to give feedback and answer questions. The list goes on.
Friday, November 4, 2011
1) Classes before 11:00 AM will now be prohibited. The internal biological clock of 18-23-year-olds is not suitable for morning classes. Holding class at these times violates the U.S. Constitution's First Amendment against cruel and unuseful punishment. Additionally, given that the extended hours of most extracurricular learning activities interfere with the sleep habits required for waking before noon, the student council has ruled in favor of former Red Sox pitcher Samuel Clemens' awesome idea that schooling should never interfere with education.
2) Classes between the hours of 11:00 AM and 2:00 PM are now banned. Given that extracurricular learning activities are notorious for causing dehydration, the munchies, and frequently require greasy foods to reinforce the previous night's lessons, the latter/former(?) hours remain open, now officially sponsored by Doritos and called "Crunch Time" with the slogan, "We're Chipping Away at Education!"
3) Classes shall not be held between the hours 2:00 PM and 5:00 PM because this is when we get sleepy from the chips and such. We think can't w/out sleeping.
4) No more classes between 5:00 PM and 9:00 PM. This is our prep time for getting ready to stay out all night. :)
5) We hereby stipulate that all educational transactions shall occur at flexible hours between 1 and 3 hours before the due date of an assignment whereby you will answer our emails and provide extensive feedback and/or massively detailed summaries of the last seven hours of course content without making that face because (Please see #1) this is not a good time for us!
Thursday, November 3, 2011
Horton Hears a Who! by Dr. Seuss
My rating: 5 of 5 stars
Certainly the finest book in the American Canon. Seuss, initially conceiving the book as a response to the American occupation of Japan, instead constructed a multi-layered allegory addressing the historical pattern of the scientist/mystic at odds with a totalitarian church-state. Thus, on one level, the representation of Horton as the seer (literally and mystically) who is called to action by unheard voices of intuition and other-worldliness while, at the same time, embodying the scientist whose extended techno-organs perceive a substratum the untutored masses merely mock in their ignorance, suggests Seuss is replicating the plight of Meister Eckhart, Galileo, Theresa of Avila, and countless others. Seuss is not content to stop there. The aptly named Whoville ("Who" first as a question, then as a rapturous owl call announcing the night of triumph) becomes a stand-in for vocal, democratic, non-violent resistance (their drum circles reminiscent of Occupy Wall Street, and their total participation in announcing their presence, affirming their existence, reflects nothing other than the multi-cultural, consciousness raising of the 1970s). Here I am not being anachronistic. This is precisely the point. Ultimately Seuss stitches every fabric of allegory together in what can only be described as a Hegelian Historical Dialectic. Horton is not hearing a who. History is hearing its own narrative and responding with a new vision that resounds with echoes of the atemporal fingerprint of God. Horton Hears a Who is, then, not a book exactly, but a sort of opening into the Divine Idea.
View all my reviews