As some of you are probably aware, I spent the last 10 months working on an academic plan for my university. I tried to be the conduit for thousands of pages of feedback, multiple collaborative sessions and piles of surveys. I also tried to listen to hundreds of colleagues and students who had stories they wanted to tell about their time here at the university. It was a fascinating process, and the experience of developing a plan with a committee of 22 people was one I will not soon forget. The digital (meaning the difference between what is possible/likely/imposed in a pre-digital vs. digital society) was hiding around every corner. There was the obvious stuff like the ethical implications of learning analytics and conversations about what ‘quality’ might look like in online learning. There were also more subtle things like integrating student services through a ticket management approach and encouraging networked participatory scholarship. You can ignore it, but you can’t avoid it. We need to re-envision huge chunks of our institutions along new lines, taking into account both the affordances and the tyranny of the digital. The systemic impacts of the digital on learning are the subject of a panel I’m chairing at the DLRN conference next week. (4 days left to register)
The digital gives us a new window through which to examine our first principles.
An article was posted in Inside Higher Ed yesterday touting the need to shift from the Carnegie Unit to outcomes-based education. The author juxtaposes the industrial age approach to learning (throw them in a room, block out class time) against the information age (let them advance at their own speed towards outcomes). The idea seems to be that we are currently trying to do both industrial, time-based education AND outcomes-based education at the same time, and this leaves us with a commitment to neither. We need to cast off the timed class hour and rebuild our universities to train students for the information economy. Ok. Yup. We need to change because right now we’re trying to do ALL THE THINGS… but let’s dig a little deeper.
The two parts of this argument we should ignore
Mastery learning – I have come to see the concept of ‘mastery learning’ as code for ‘and we only care about STEM subjects’. It is a rigid system whereby we create a set of standard blocks of ‘knowing’ that people do one after another, only moving to the next step when the previous one is completed. An assembly line of learning, as it were. An industrial model of learning. I am always a little confused by how people use ‘information age’ networked arguments to suggest we should do mastery learning. And, frankly, many STEM grads will go into companies where the daily work life will look like it did 20 years ago. Research labs or construction sites may have incrementally better technologies, but because many of them rigidly protect their intellectual property and have giant marketing budgets to buy TV ads, the ‘information superhighway’ doesn’t intersect with them very often.
Information Age and knowing information – the suggestion here is that we need to produce ‘information havers’ who we can prove ‘have information’ for the information age. This seems a little confusing to me. If we live with an abundance of information, then we need to teach people how to assemble solutions from various levels of knowing. If I’m building a new birdhouse I may be an expert in construction, kinda knowledgeable about birds and suck at the marketing part of selling my birdhouse. The great thing about the world we live in is that (given access – lots of people don’t have it) you can do all those things. That’s part of what’s changed. But it’s not about ‘having all that information’, it’s about knowing how to bring together the information and/or the people to get what you need. We can do that today… we mostly don’t need to be ‘masters’ ahead of time.
But he’s also right – the asynchronous course hour
The asynchronous course hour often drives this conversation. The research that I’ve done on it (this article is representative) suggests that most people have thought about it, understand that it’s an issue, but aren’t really sure what to do about it. Here’s the problem. We have all decided, for convenience’s sake, that we’ll teach about 36 classroom hours to students and expect them to study about 80 hours outside the classroom for each ‘course’. We’ve adapted our curriculum to fit this convention and, ostensibly, try to balance the amount of knowing/work/information/learning (KWIL) to fit that time frame. Early in online learning, we took the amount of KWIL we did in a face2face classroom and used that as the basis for how much KWIL we would use in an online course. This works ok as far as it goes… and then you start to ask questions:
If I record my lectures, is that equivalent to a classroom hour… am I teaching?
If I’m giving the same tests, can I let the students self-pace and finish whenever?
Is my responding in a discussion forum equal to me grading or me teaching?
If I start my course from scratch, how do I imagine 36 hours of classroom teaching?
How can I do online testing without them ‘finding the answers’ on the internet?
We are living with a foot in both worlds, and we are being forced (at least I hope we are) to ask some profound questions about what it means to teach in ‘the information age’. We have weird hybrid monsters like ‘a video camera that watches your eyes to make sure you are only staring at the screen when you’re doing an online test’ and faculty requesting f2f tests for online courses. That walled classroom has its own affordances that get blown up when you work online. The classroom hour structure is only the start of it.
Information control
One of the nice things about keeping people in a boxed-off space when you’re trying to teach them is TOTAL POWER over the information space. If you can keep students quiet, you can totally control the information that is being presented. This makes testing super-easy to monitor. It also allows you to forward one perspective (or multiple ones if you so choose) and create the knowledge narrative that you subscribe to. The digital totally blows this up. Five minutes of clicking can get you a counter to almost any narrative. The ‘information hiding’ that is so critical to the way many still test is next to impossible (Big Brother watching you through your computer notwithstanding). The lessons that this teaches – “hide your information” and “choose the RIGHT narrative” – don’t really map up against the information age story that we are being told.
What is our relationship to information in learning in 2015?
Responsibilities
Most faculty agreements are mapped up against the faculty member spending 36 hours in a classroom. That’s super easy to count. Were you there? Yeah? Ok… you were there. That’s pretty easy, right? There are certainly many other things in place – student evaluations, faculty professionalism, etc. I’m not suggesting that faculty just put in time in their classrooms. I’m suggesting that the whole model of ‘doing your job’ STARTS at being in class. But what does that look like in an online space? What does ‘being in class’ mean when you and your students have access to a classroom space (if you’re using a VLE) 24 hours a day? What if you tried to answer all of your students’ questions when there is an unlimited amount of time for them to ask? I remember trying to find guidance when I taught my first hybrid class (18 hours in class, 18 hours online). I tried my best to make it work out… but how do I know that I’m doing my job? How much is the right amount?
What does it mean to ‘teach an hour’ in 2015?
Fix it with outcomes!!!
The solution to this is to use outcomes-based education instead of hours-based education. The theory here is that as long as we ensure that students ‘get it’, who cares how many hours it takes? But what is ‘IT’? How do we decide what a person needs to know in order to have a Bachelor’s degree in Arts with a major in Philosophy? What outcomes are you going to choose to make a Major in Biology? Can a student finish in 2 years? What about one year? What about 20 years? Is it time-based at all? Well… we could model off of what we have now…
Mastery education advocates often cite professional standards bodies as an alternative way to go with this. They use fields like engineering or computer science as their example and say that when students reach those outcomes, we’ll know they are prepared to go into those fields. The funny thing is that when I talk to engineers and computer scientists I keep hearing about the need for creativity, time management, grit and people skills as much as I hear about the need to know (insert engineering thing that’s easily measured). Those are wonderful things… but they aren’t mastery things. I am not going to master my first block of creativity before moving on to block two of grit. And don’t say I can… because… (angry face)
What outcome do we really want from our universities?
This is just another case where the digital has forced us to consider our first principles. What do we want the ‘outcome’ of a university education to be? As we consider how granular, how technical, how mastery-based we want our outcomes to be, we are deciding what it means to be a knower in our society. Our schools have been both drivers for creating drones to work in our factories and attempts to be places of free thought that allow us to change as a society. They are – always – normative. The way we build them and the ways in which we adjudicate success inside them will be reflections of the society we have created… whether we’ve thought about it or not.
The digital isn’t an evolutionary change; it’s a new toolset that allows us to think about the human experience. The internet is full of humans and the residue of the human experience. Given this moment of reflection that we are forced to confront… what do we want ‘knowing’ to be in 2015?