I’ve always reserved a special place in my heart for group projects. It’s the same place that I reserve for pickles, cats, people who talk on their cell phone in quiet places, laptops with no battery life, small talk, and anything that needs to be described as “avant-garde.”
It’s a dark place.
Why is that? According to many educational experts, group projects are excellent teaching tools that help students learn in community and develop the skills they’ll need in the “real” world. According to most students, group projects are a special form of hell created by sadistic professors who probably also pluck the wings off butterflies in their spare time.
Obviously there’s a disconnect somewhere.
What’s the problem? Working together and learning in community sounds great. But, in my experience, group projects look better on paper than they work in reality.
So, help me out here. What has your experience been? Have your group project experiences been like mine, or have you been a part of a few that actually worked well and were good learning experiences? If so, what makes for a good group project? (I can’t believe I just used the words “good,” “group,” and “project” in the same sentence. That has to violate some fundamental rule of English grammar.) Why did it work, and what made it different from other, less effective group projects?
I’m always trying to be careful not to allow my personality and preferences to limit my teaching techniques. Everyone should learn just like me. But, sadly, they don’t. So, I should be open to the possibility that some students might benefit from a teaching tool or methodology that has never held much value for me personally. If group projects have been good learning experiences for you, let me know. I may need to reconsider my long-standing resistance to such assignments.
Am I free? Not legally (I’m not in jail) or metaphysically (who knows if I have “free will”?) but intellectually. Do I have intellectual freedom? After all, I teach at a school with belief commitments. To get my job, I had to sign our Faculty Teaching Position. And, if I ever changed my mind on a core aspect of that document, my job would probably be in jeopardy. In that kind of situation, can I have any kind of real intellectual freedom? Or am I just kidding myself by thinking that I’m an academic?
If you live in a confessional world, do you need to leave your brain at the door?
There’s been a lot of talk lately about whether Roman Catholics have less intellectual freedom than other Christians because of the strongly confessional nature of the Catholic tradition. Michael Patton began the firestorm, and quite a few have chimed in since then. I don’t want to rehearse the whole debate, so check out Brian LePort’s summary for his comments and links to other good posts.
Most of the discussion so far has focused on whether Patton is right about Roman Catholicism. (Hint: The answer is ‘no’.) But, somewhat lost in all of this is his argument that true scholarship and confessional commitment are antithetical to one another. His comments on Catholicism are based on his Cartesian commitment to skepticism as methodologically necessary for real academic work. If you’re not willing to doubt every idea and belief, remaining open to the possibility that you might be wrong, then you’re not really an academic.
If he’s right, then any school with confessional commitments has only limited intellectual freedom (at best). And, based on that argument, the faculty at Western Seminary don’t really have academic freedom. We throw it away when we sign the Faculty Teaching Position. Our job status is connected to at least some of those beliefs. Change those beliefs, and we’re in trouble. So, we’re not really academics. We’re just defending the status quo.
Granted, faculty can always leave and try to find a job at another school. So, we haven’t killed intellectual freedom entirely. We’ve just cut off both its legs. It can still move around, but only by painfully dragging its bloody torso somewhere else.
As someone who teaches at such a school, I think there are some critical things wrong with this (common) argument. Brian LePort explains his reservations (and appreciations) in Five Thoughts on Objectivity, Open-Mindedness, and Scholarship. You should definitely check it out. But, let me add three additional concerns about this argument that I think we need to keep in mind.
1. It over-emphasizes the individual. This is the Enlightenment at its finest. Presuppositions and traditions are the enemy of intellectual progress. They must be challenged and questioned at every turn so that I, as the ultimate human authority in my life, can be confident that I am coming to know things as they actually are and not just how they have been presented to me. You never get the sense that intellectual activity is a communal activity in this approach. Instead, you’re left with the picture of the academic locked away in his/her office or lab, seeking Truth through the power of unimpaired reason. Given Patton’s clear commitments to doing theology in community, that seems like an odd stance.
2. It devalues institutions. This one is connected to the last, but the argument seems to betray a subtle anti-institutionalism. This view of academics makes the professor an independent contractor with no real connection or loyalty to particular institutions. The individual sticks around as long as he/she is satisfied with the institution’s position. And, if you change your mind and can no longer affirm those commitments? No worries, there’s always another one just around the corner. It’s church shopping at the academic level. (I may comment on this more later. This kind of subtle anti-institutionalism is rampant in evangelicalism.)
3. It neglects the importance of presuppositions. Many people make this mistake. Most recognize that we all have our presuppositions. They’re a necessary evil that we have to guard against constantly. And, there is some truth to that. But, people often fail to recognize that presuppositional frameworks have value as well. No scientist is going to waste their time investigating whether the world is flat. They’ll assume that question is settled. It’s part of their presuppositional framework. And this allows them to use their time investigating other issues. The same is true in theology. For me, the deity of Christ is a “settled” issue. Not settled in the sense that everyone agrees, and not even settled in that I think I understand everything about what that means (who does?), but settled in that I think that it’s true and not really open to question. Does that make me less free? I don’t think so. If anything, I think it frees me up to pursue other issues. Recognizing that some doors should stay closed grants me the freedom to go through others. Being “open” to everything leads to bondage, not freedom. So, it’s not just a matter of acknowledging our presuppositions, but embracing them as necessary for real intellectual freedom.
Do I have intellectual freedom? Absolutely. I have the kind of intellectual freedom that comes from knowing who I am as a part of an ecclesial community with a clear sense of its history, identity, and purpose. And, I have the kind of intellectual freedom that comes from a community that raises hard questions and explores new ideas together, supporting each other as we strive toward faithful Christian living in a broken world. And, I have the kind of intellectual freedom that comes from seeing some things as “settled” so that I’m free to spend my time on other issues.
Granted, I don’t have the kind of intellectual freedom that’s willing to throw off all of that in favor of an individualistic pursuit of rational autonomy. But that’s okay. I’m not interested in that kind of freedom anyway.
Few things in life are more frustrating than going through all the work of applying and interviewing for a position that you really want and feel you are very qualified for…and not getting it. Fortunately, I was blessed with a position at Western pretty early in my job-hunting career, so I haven’t experienced this as much as most. But, I feel your pain.
If this has happened to you, or if you think it might happen to you, Timothy Larsen from Wheaton College offers some insight into why you didn’t get that teaching job, addressing the four most common questions people ask when they didn’t get that teaching position they so badly wanted. I can’t imagine actually asking the first question (out loud), but the others are ones I’ve heard more than once.
- Given how eminently well qualified I am for this position, how can you possibly justify eliminating me so early in the process?
- I know I was eliminated over a month ago, so why have you not had the decency to tell me so?
- How in the world can you expect someone applying for an entry-level position to already have a handful of research articles in major peer-reviewed journals and a book contract with a leading university press?
- What did I do wrong?
His answers are well worth reading if you want an insider’s look at how the hiring process works at a major Christian college.
- Craig Carter offers a post In Praise of the Lecture, arguing that the lecture is a moral event, a personal act, and a tribute to metaphysical truth. HT
Today, the lecture is out of favor in politically-correct circles. Like dead white males, high academic standards and absolute truth, it has been consigned to the dustbin of history by enlightened, late-modern, progressives who do not quite believe that God grades on the curve, but who do hold it against Him that He does not.
To all Christians and other lovers of Lewis I would say this—- please during this Christmas season come out and support this film, not least so we may see more of Narnia in the future. This is certainly a film appropriate for families to see, though a couple of the scenes in 3D with the big sea monster may be a little too intense for wee bairns as small as Reepicheep. Be that as it may, we must say— Well done good and faithful servants at Walden. Inherit the Kingdom yourselves.
- Andy Crouch comments on the desire for “authenticity” in church and society and the ways we try to manufacture and “franchise” it. HT
But our longing for “authenticity” also bears a suspicious resemblance to the latest plot twist in the story of consumer culture: the tendency to rapidly replace the squeaky-clean franchise with the “authentic” franchise.
- David Briggs asks if time spent online is cutting into the clergy’s prayer time.
Hearing what he called “the still, small voice of love” amid the cacophony of secular voices calling for attention needs special effort: “It requires solitude, silence and a strong determination to listen.” The Internet has not made the spiritual life any easier.
- The latest issue of Themelios is out with its usual wealth of articles and book reviews.
- And, here’s a list of the Top 10 Unnecessary Sequels.
I love to listen to the rhetoric that older, more established schools and newer, less traditional schools fling back and forth at each other. It’s a fascinating display of semantic nuance. Reading Terry Pratchett’s Unseen Academicals, I came across the following scene in which Lord Vetinari (i.e. The Boss) is describing two such institutions to Archchancellor Ridcully, head of the more traditional school.
“What we have here, gentlemen, is but a spat between the heads of a venerable and respected institution and an ambitious, relatively inexperienced, and importunate new school of learning.”
“Yes, that’s what we’ve got all right,” said Ridcully.
Vetinari raised a finger. “I hadn’t finished, Archchancellor. Let me see now. I said that what we have here is a spat between an antique and somewhat fossilized, elderly and rather hidebound institution and a college of vibrant newcomers full of fresh and exciting ideas.”
“Here, hang on, you didn’t say that the first time,” said Ridcully.
Vetinari leaned back. “Indeed I did.”
Two very different perceptions of the same academic realities. And, as the technology-in-education debate really starts to heat up, it should be interesting to see where the rhetoric goes from here.
- J.R. Daniel Kirk tackles the question of whether Christianity has really done any good for the world, pointing to adoption, the fight against sex trafficking, and peacemaking as key areas of contribution. And Andrew Perriman offers his thoughts as well.
- The Vatican has tightened its rules for disciplining priests involved in sexual abuse cases. NPR has a piece on it here.
- Larry Hurtado is at it again, posting his article “Jesus as Lordly Example.” He also announces the new blog site for his Centre for the Study of Christian Origins.
- Jesus Creed has a nice roundup of recent discussions on science and miracles.
- Inside Higher Ed has a very nice article on the Illinois adjunct who was fired for teaching about Catholic beliefs regarding homosexuality. Unsurprisingly, the situation is more complicated than it first appeared.
- Justin Taylor recommends a book for learning how to write non-fiction book proposals.
- And, apparently spending most of your time sitting down is bad for your health, even if you exercise regularly.
I’ve posted a couple of times recently about the ongoing debate regarding how technology impacts the way that we think and learn (“Testing Your Techno Depravity” and “Wired for Distraction“). Now, I’m not a technophobe arguing that Mark Zuckerberg is the Antichrist or that Twitter is going to bring about the Technocalypse. I just think that everyone involved in any kind of education needs to stay informed about the discussion.
So, to continue the conversation, I thought I’d point out Steven Pinker’s NYT piece today arguing that a lot of the discussion is driven more by media hype than science. (Could that possibly be?) He leads with the observation that “New forms of media have always caused moral panics: the printing press, newspapers, paperbacks and television were all once denounced as threats to their consumers’ brainpower and moral fiber.” And, such claims are rampant in the media about modern technology as well. But Pinker argues that we need a reality check. Instead of declining attention spans and decreased mental capacity, he contends that the sciences and the humanities are flourishing today. So, we simply lack any real evidence that increased technology corresponds to decreased mental capacity.
He also pushes back against claims about how experiences change brain structure. While such changes do occur, he seems to see them as rather superficial and not affecting “the basic information-processing capacities of the brain.”
And so, he concludes his argument with the following suggestions:
Yes, the constant arrival of information packets can be distracting or addictive, especially to people with attention deficit disorder. But distraction is not a new phenomenon. The solution is not to bemoan technology but to develop strategies of self-control, as we do with every other temptation in life. Turn off e-mail or Twitter when you work, put away your Blackberry at dinner time, ask your spouse to call you to bed at a designated hour.
And to encourage intellectual depth, don’t rail at PowerPoint or Google. It’s not as if habits of deep reflection, thorough research and rigorous reasoning ever came naturally to people. They must be acquired in special institutions, which we call universities, and maintained with constant upkeep, which we call analysis, criticism and debate. They are not granted by propping a heavy encyclopedia on your lap, nor are they taken away by efficient access to information on the Internet.
Clearly then, the discussion continues. As I commented in interaction with another person, the issue isn’t so much whether technology is changing the way that we learn and think. That is clear even if we are still debating whether the changes are neurological, behavioral, or something else. The more important questions for us have to do with which of those changes are positive and which are negative (there are surely some of both), as well as how this needs to affect the way that we conduct ourselves as educators.
The New York Times posted an article yesterday, “Your Brain on Computers,” summarizing the debate about whether our constant use of technology is affecting us in mostly positive or negative ways. I commented on this a while back, suggesting that anyone involved in any kind of education/formation needed to be keeping an eye on this discussion. So, if you’re looking for a primer on the debate, this should be helpful.
Although the article is well written and worth reading, I mostly wanted to point out that the article also links to a couple of games designed to test how much you have already been twisted and corrupted by the neurological effects of technological overexposure. (Technically, they just test how much of a multitasker you are; but I like my version better.) One game tests your ability to concentrate in the face of distractions, and another your ability to switch between tasks quickly. According to both tests, my current level of techno depravity is actually rather low (i.e. I don’t distract easily and I switch easily between tasks). Apparently I still spend too much time doing old school things like reading books and talking to people.
Nicholas Carr recently wrote a piece for Wired Magazine on the way that the internet is literally rewiring our brains. The article reports on a 2007 study demonstrating that browsing the internet for as little as five hours actually causes significant changes in the brain’s neural pathways. Given that our brains are constantly adjusting to sensory input, this really isn’t surprising. As Carr points out,
The real revelation was how quickly and extensively Internet use reroutes people’s neural pathways. “The current explosion of digital technology not only is changing the way we live and communicate,” Small concluded, “but is rapidly and profoundly altering our brains.”
So, Carr rightly notes that the real question is, “What kind of brain is the web giving us?” And, he thinks that the answer might be a little troubling.
Dozens of studies by psychologists, neurobiologists, and educators point to the same conclusion: When we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning. Even as the Internet grants us easy access to vast amounts of information, it is turning us into shallower thinkers, literally changing the structure of our brain.
The rest of the article goes on to point out concerns raised in several studies about the quality of learning in an internet environment – particularly the impact that hyperlinks have on reading comprehension.
Of course, this isn’t a new discussion. In a now famous Atlantic Monthly article, Carr asked the question “Is Google Making Us Stupid?” Others have sounded a similarly negative tone, warning us about the cognitive dangers of constant web browsing (see esp. Mark Bauerlein’s The Dumbest Generation). But, many disagree. Don Tapscott’s Grown Up Digital is a great resource for arguments suggesting that the rewiring of the modern brain is actually increasing our cognitive abilities in some very important ways. And Curtis Bonk’s The World Is Open argues that web technology can and should revolutionize the way that we teach. So, like most debates, there are voices on both sides. And, it probably isn’t an either/or. I’m sure our changing cognitive context affects us both positively and negatively.
I’m highlighting all of this because most of the people who read this blog are either already involved in teaching (whether in a church or a classroom) or hope to in the future. If that’s the case, this is a debate you definitely need to be following. Most experts are now convinced that the way people learn is changing, regardless of whether they agree about whether this is good or bad. The question, then, is how (or whether) this should affect the way that we teach. Many schools have taken the posture that the changes may be negative, but they’re inevitable. So, we should alter our teaching to be as effective as possible in the new environment. Other schools are resisting the changes entirely, arguing that one of the tasks of any educational institution is to resist developments that negatively impact people’s ability to learn. And, of course, some schools just think this is all great, and they’re excited to embrace the new opportunities.
I have not come to any easy conclusions on this issue yet. You can probably tell from this blog that I like the internet. I think it’s a tremendous resource. And I think it has great potential to facilitate learning. But, I’m also aware that it can change the way that people read and think in potentially negative ways. I’ve even seen this in myself. I notice that the more time I spend online, the more inclined I am to skim articles and draw conclusions very quickly. Indeed, I find that after an extended period online, it’s difficult for me to really dig into a challenging book. It takes a while for my brain to switch gears and become effective in this different cognitive environment. And, apparently, I’m not alone. The challenge for anyone teaching today, then, is how to tap into the strengths of the internet while avoiding or minimizing its learning pitfalls.
So, no easy answers here. The debate continues. I just wanted to make sure that you were paying attention to it.