10 things I think I might think about AI for teaching and learning

I’ve been adding thoughts to this blog since 2005. I come here to get my ideas sorted out, to give myself something to link to for those ideas, and to keep a sense of where my thinking was at a given point in time. The last six months have been AI all the time for me, and I’ve been in some excellent conversations around the topic. Some of these suggestions are the same ones I would have given two years ago, and some even 20 years ago, but they all keep coming up one way or another.

I turned this into a list because I just couldn’t come up with a theme for them. 😛

Discipline specific AI inspired learning goals

This one is starting to float to the top every time I talk about AI. One of the challenges of talking about digital strategy for a whole university is that different fields are impacted in such different ways that the same advice can’t be given across disciplines.

I’m feeling more comfortable with this suggestion. We need to be adding learning objectives/goals/whatever-you-call-thems at the program and maybe even the course level. The details will differ a little by discipline, and even by parts of disciplines, but broadly speaking they would include something like:

  1. How to write good prompts for black box search systems (ChatGPT/google) that return useful/ethical/accurate results in your discipline
  2. Choosing appropriate digital locations/strategies for asking questions
  3. Strategies for verifying/improving/cross-referencing results from these systems
  4. How AI is used by professionals in your discipline (good and bad)

You could say ‘yeah, dave, that’s digital literacy, we’ve been doing it (not doing it) for almost a generation now.’ I agree with you, but I think it’s becoming increasingly important. Search results have been (in my non-scientific, anecdotal discussions) getting less useful, and these GPT-based systems are approaching ubiquity. Students aren’t going to understand the subtlety of how these systems work in our professions. Many of us won’t know either.

Teach/model Humility

This one’s easy. Say “I don’t know” a lot. Particularly if you’re in a secure work position (this can always be tricky with contingent faculty). Encourage your students to say ‘I don’t know’ and then teach them to try and verify things – sometimes to no avail. It takes practice, but it starts to feel really good after a while. There is NO WAY for anyone to know everything. The more we give in to that and teach people what to do when they don’t know, the better things are going to be.

When we get a result from an AI chatbot, we need to know if we are the wrong person to analyze it. We might need to get help.

Spend time thinking about the WHY of your old assessments

I had an awesome conversation with a colleague a few days ago about the how and why of teaching people facts before we expect those people to do things with those facts. This keeps coming up.

  1. We teach facts so that students can have them loosely at their fingertips when they are trying to do more complex things.
  2. We often create assessments so that students will have the scaffolding required to do what is, often, boring memorizing. We know they’ll be happy (or at least more competent) later, so we ‘offer to give them grades’ or ‘threaten to take grades away from them’ if they don’t memorize those things.
  3. Those assessments are now often/mostly completed by students using AI systems in ways that no longer require students to memorize things.

If we need students to have things in their heads, or do critical thinking or whatever, we need to clearly understand what we want AND explain the reasoning to students. At this point, the encouragement/threats that have been our assessments are just not going to be that effective. Does an essay done with GPT4 still teach critical thinking or encourage close reading of texts? Does a summary shoved in an AI system and copy/pasted into a text box help anyone?

Spend even more time thinking about your new AI infused assessments

Lots of interesting conversations recently about how to incorporate AI into activities that students are going to be doing. First and foremost, make sure you read some of the excellent articles out there about the ethical implications of making this decision. We have thousands of years of deep knowledge about the ethical implications of writing the way we’ve been doing it. If we’re going to take a step into this new space, we need to take time to think about the implications. This Mastodon post by Timnit Gebru is a good example of a consideration that just didn’t exist before AI. AI not only produces problematic texts; the more problematic texts it produces, the more problematic texts there are to influence the AI in turn. It’s a vicious cycle.

https://dair-community.social/@timnitGebru/110328180482499454/embed

No really. There are some very serious societal/racial/gender/so-many-other implications to these tools.

Talk to your students about AI, a lot

This is not one of those things you can just kind of ignore and hope will go away. These AI tools are everywhere. Figure out what your position is (for this term) and include it in your syllabus. Bring it up to your students when you assign work for them to do. Talk to them about why they might or might not want to use it for a given assignment.

Try and care about student data

I know this one is hard. Everyone, two months ahead of a course they are going to teach, is going to say “oh yes, I care about what happens to my students’ data”. Then they see something cool, or they want to use a tracking tool to ensure the validity of their testing instrument, and it all goes out the window. No one is expecting you to understand the deep, dark realities of what happens to data on the web. My default is “if I don’t know what’s happening to student data, I don’t do the thing”. Find someone at your institution who cares about this issue. They are, most likely, really excited to help you with this.

You don’t want your own stuff, whether it be your personal information or your professional work, given away to random corporations. Make sure you aren’t doing that to someone else.

Teach less

In my last blog post, I wrote about how to adapt a syllabus and change some pedagogical approaches given all this AI business. The idea from it that I’ll carry forward here is: teach less. If there’s anything in the ‘content’ you teach that isn’t absolutely necessary, get rid of it. The more stuff you have for students to remember or to do, the more they are going to be tempted to find new ways to complete the work. More importantly, people can get content anywhere; the more time you spend demonstrating your expertise and getting into meaningful content and discussions that take a long time to evolve, the more you are providing them with an experience they can’t get on YouTube.

Be patient with everyone’s perspective

People are coming at this issue from all sides right now. I’ve seen students who are furious that other students are getting better marks by cheating. I’ve seen faculty who feel betrayed by their students. People focusing on trust. Others on surveillance. The more time we take to explore each other’s values on this issue, the better we’re all going to be.

Take this issue of a teacher putting wrong answers online to trap students. He really seems to think he’s doing his job. I disagree with him, but calling him an asshole is not really going to change his mind.

Engage with your broader discipline on how AI is being used outside of the academy

This is going to be different everywhere you go, but some fields are likely to change overnight. What might have been true for how something worked in your field in the summer of 2022 could be totally different in 2023. Find the conversations in your field and join in.

Focus on trust

Trying to trap your students, or to track and watch them, seems, at the very least, like a bad use of your time. It also feels a bit like a scary vision of an authoritarian state. My recommendation is to err on the side of trust. You’re going to be wrong some of the time, but being wrong while trusting feels like putting better things into the world. Building trust with your students ‘can’ lead to them having a more productive, more enjoyable experience.

  1. Explain to them why you are asking them to do the work you are giving them
  2. Explain what kind of learning you are hoping for
  3. Explain how it all fits together
  4. Approach transgressions of the social contract (say a student used AI when you both agreed they wouldn’t) as an opportunity to explain why they shouldn’t.
  5. Focus on care.

As teachers/faculty we already have piles of power over students. Yes, it might be slightly less power than we had 50 years ago, but I’m going to go ahead and suggest that that might be a good thing.

11 of 10 – Be aware of where the work is being done

I recognize that full-time, tenured faculty are busy, but their situation is very different from that of a sessional faculty member trying to rewrite a course. Making these adaptations is a lot of work. Much of that work is going to be unpaid. That’s a problem. Also, for that matter, ‘authentic assessment’ is way more work.

Same for students. If you are changing your course, don’t just make it harder/more work.

Final thoughts

I’m wary of posting this, because as soon as I do I’ll think of something else. (As soon as I wrote that, I thought of number 11.) I’ll just keep adding them as they come to mind.

Times are weird. Take care of yourselves.

Adapting your syllabus to an online content/AI generated content world

I did another presentation on campus yesterday talking about what it means that students can now generate answers to assignment questions. I often find that writing out some concepts before a presentation helps me focus on what might be important. My pre-writing turned into a bit of a collection of the things that I’ve been pulling together for my session next week on adapting your syllabus.

I’m going to keep working on it, but I figured I would post it on my lonely blog, just to let it know that I still love it.

Here’s the deck

Introduction

The release of ChatGPT on the 30th of November, 2022, brought into focus a change that has been coming to higher education since the advent of the Internet. Our students have increasing access to people and services on the Internet that provide them with ways to circumvent the intent of many of the assessments traditionally used in higher education.

Schinske and Tanner (2014) describe four purposes for assessments: feedback, motivation, student ranking, and the objective evaluation of knowledge. Regardless of what combination of these purposes you use in your work, or what others you might add, the internet has changed the education landscape through:

  1. The availability of connection to other people to support our learners with assessments. (by text-message, online chat etc…)
  2. The availability of pre-created content (available through search or on sites like Chegg) that can be used to respond to our existing assessments
  3. The availability of generative systems (AI systems like ChatGPT) that can create responses to assessments

This has the potential to undermine the effectiveness of our assessments. It is particularly problematic where our assessments are meant to motivate students to learn. Given the plethora of options students have to circumvent the intent of our assessments, this requires rethinking the way we design our courses.

This document takes a neutral position with regard to the student decision to use these connections and tools. These tools exist, and the tracking tools designed to identify students who have used them are resource-heavy, time-consuming to use effectively, ethically suspect, and ultimately unreliable. We believe that a combination of good strategy in our assessment choices, a focus on student engagement in our classrooms, and the establishment of trust relationships with students will be the most effective way forward.

Considering the purpose of assessments

The assessments we provide in our courses each serve a purpose. It can be helpful, at the start of the process of reconsidering your assessments, to chart what purposes each of your current assessments serve. The following model is developed from the Schinske and Tanner article.

Description of assessment | Feedback on performance | Motivator for student effort | Scaling of students | Objective knowledge

In completing this chart, you will likely find that many assessments fall into several categories. The new tools will impact the reliability of each of these purposes, some more than others. The biggest impact will probably be in the motivation column.

This kind of course redesign is also an excellent time to consider overall equity in your course (see Kansman et al., 2020).

Grades as feedback on student performance 

In this view, grades are given to students to let them know how they are performing in our classes. The evaluative feedback (grades) gives them a quantitative sense of how they are doing based on your measurement of performance. The descriptive feedback (comments) is what you provide in addition to that grade to explain how they can improve their performance or to indicate places of particular strength.

Questions to ask:

  1. Does my approach provide an opportunity for students to improve on their performance in a way that would encourage them to return to their own work and improve upon it?
  2. Do the affordances of existing content websites and AI generation programs impede my ability to provide feedback on the performance of students to help them improve given my current assessment design?

Grades as motivator of student effort 

“If I don’t grade it, they won’t do it.” Whether you consider it a threat or encouragement, this is the idea that we create assessments in order to get students to ‘do the work’. This could be direct (e.g., assessing a piece of writing a student has done to encourage effective writing) or indirect (e.g., assessing a response to a reading to encourage the student to do the reading).

Grant and Green (2013) tell us that extrinsic motivators like grades are more effective at encouraging algorithmic or repetitive tasks, but less effective at encouraging heuristic tasks like creativity or concentration. Content and AI systems are excellent at supporting students in completing algorithmic tasks without the need to learn anything.

Questions to ask:

  1. Does my grading motivate students to learn (as opposed to simply complete the task)?
  2. Is the learning they are doing the learning that I intend?
  3. Do the affordances of existing content websites and AI generation programs impact the motivation of students to learn in response to my assessment motivator?

Scaling – Grades as tools for comparing students 

This is about using grades to create a ranking of students in a given class. There is a great deal of historical precedent for this (Smallwood, 1935), but it is not clear that it is necessary in the modern university. One way or the other, the curving depends on the validity of the assessments.

  1. Is grading on a curve mandatory in my program?
  2. Given the affordances of existing content websites and AI generation programs, do my grades accurately reflect the different performances of students?

Grades as an objective evaluation of student knowledge

This view uses grades to objectively reflect the knowledge a student has of a particular subject. There will doubtless be differing opinions on whether this is possible or even desirable, but, as with the scaling conversation, it is subject to the validity of the assessments.

  1. Do my grades provide reliable information about student learning?
  2. Do the affordances of existing content websites and AI generation programs allow me to accurately measure students’ objective knowledge given my current assessment design?
  3. Is the same objective knowledge necessary for students to memorize in the same way given these new technologies?

Ideas for adapting the syllabus

Teach less

Reducing the amount of content that you are covering in a course can be a great way to focus on critical issues. It also gives students a chance to dig deeper into the material. This opens up new opportunities for assessment.

  1. Iterative assignments – if students are focused on one or a few major themes/projects you can assign work that builds on a previous submission. Process writing is a good example of this. The student submits a pitch for a piece of writing for an assessment. Then the student continues with the same piece of work to improve it based on feedback given to them by the professor.
  2. Have students give feedback on each other’s projects – When assignments start and end in a week, the feedback a student gives another student does not always get reviewed or have a purpose. If students are required to continuously improve their work, this could increase their investment in that work. This approach is, of course, subject to all of the challenges involved in peer assessment.

Lower the stakes for assessment

High-stakes or very difficult assessments (20% or more for a given assessment) make the usage of content or AI systems more valuable in the cost/benefit analysis for students. Lowering the stakes (regular assessments worth 10% or less) could reduce student stress levels and encourage students to do their own work. This does, however, run the risk of turning assessments into busy work. It’s a balance.

Consider total work hours

Review your syllabus and consider how many hours it would take a student to be successful in your course. Consider the time it would take a non-expert to read the material for understanding, the time to do research and the time to complete assignments in addition to time spent in class. Would an average student at your school have time to complete the assessments they’ve been assigned given the total amount of work they need to do in all their classes?

Do more assessed work in class

Doing the assessment in class can be an easy way to ensure that students are at least starting the work themselves. More importantly, it gives students a chance to ask those first questions that can make the work easier for them to engage with. 

Reduce the length of assignments

Based on the total work hours conversation above, consider whether the length of your assignments serves the intent of the assessment. Does a longer essay give students more of a chance to demonstrate their knowledge, or does it fairly reflect their performance? Is it possible that longer assignments only advantage students who have more disposable time to complete them?

Change the format of the submission

Not all work needs to be submitted in academic writing form. If you’re looking for students to consider a few points from a reading or from conversations in class, encourage them to submit something in point form. Changing the format of the submission can provide some flexibility for the students and also make the work more interesting to grade.

Ill-Structured (ill-defined) problems

A well-structured problem is one where the question, the approach to solving the problem, and the solution are all known to the problem setter (or, at least, are knowable). An ill-structured problem is one where one, two, or all of those are not known or not knowable. Well-structured problems encourage students to search for the ‘right answer’. Ill-structured problems require a student to make decisions and apply their own opinion. (see Spiro et al., 1991)

Rhizomatic Learning/Heutagogy

These are approaches to learning where the curriculum in a given course is developed in partnership with students. Often called self-directed learning, these approaches aim to develop a learner’s ability to find, evaluate, and incorporate available knowledge, thereby building essential skills for working in a knowledge landscape of information abundance. (see Cormier, 2008 & Blaschke, 2012)

Effort-based grading

While sometimes controversial, the move to an effort-based grading approach has been shown to be effective at encouraging students to take risks and be more engaged in their assignments while still helping them develop domain-specific knowledge (Swinton, 2010). In this model students aren’t judged on getting work ‘correct’ but rather on their willingness to engage with the material. This can be done with a rubric, or by making assignments pass/fail.

Ungrading

This refers to finding new ways to motivate students to participate in the work without using grades. Even advanced students struggle with this approach without specific guidance on how it can be done (Koehler & Meech, 2022), so it requires a significant shift in how a course is designed.

Contract Grading

A highly interactive approach to designing a course that gives students the choice of what assignments they wish to work on and, in some cases, allows students to decide on the amount of work they choose to do for the course. This approach, when done for an entire course, can potentially conflict with university and departmental guidelines, so you might want to discuss it with colleagues. It might be easier to begin with a section of a course rather than an entire course. (see Davidson, 2020)

Assignment integration across courses

Another approach, which does require coordination between faculty members, is to have assignments apply to more than one course. This could be as simple as an academic writing course supporting the essay writing in another course, or as involved as full integration in a program where a given project grows throughout a student’s movement through the program.

References

Blaschke, L. M. (2012). Heutagogy and lifelong learning: A review of heutagogical practice and self-determined learning. The International Review of Research in Open and Distributed Learning, 13(1), 56–71. https://doi.org/10.19173/irrodl.v13i1.1076

Cormier, D. (2020, June 20). How much ‘work’ should my online course be for me and my students? Dave’s Educational Blog. https://davecormier.com/edblog/2020/06/20/how-much-work-should-my-online-course-be-for-me-and-my-students/

Cormier, D. (2008). Rhizomatic education: Community as curriculum. Innovate: Journal of Online Education, 4(5), 2.

Davidson, C. (2020). Contract Grading and Peer Review. https://pressbooks.howardcc.edu/ungrading/chapter/contract-grading-and-peer-review/

Grant, D., & Green, W. B. (2013). Grades as incentives. Empirical Economics, 44(3), 1563–1592. https://doi.org/10.1007/s00181-012-0578-0

Kansman, J., et al. (2020). Intentionally addressing equity in the classroom. Journal of College Science Teaching (NSTA). Retrieved April 20, 2023, from https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-novemberdecember-2022-1

Koehler, A. A., & Meech, S. (2022). Ungrading Learner Participation in a Student-Centered Learning Experience. TechTrends, 66(1), 78–89. https://doi.org/10.1007/s11528-021-00682-w

Radzilowicz, J. G., & Colvin, M. B. (n.d.). Reducing Course Content Without Compromising Quality.

Schinske, J., & Tanner, K. (2014). Teaching More by Grading Less (or Differently). CBE Life Sciences Education, 13(2), 159–166. https://doi.org/10.1187/cbe.CBE-14-03-0054

Smallwood, M. L. (1935). An historical study of examinations and grading systems in early American universities: A critical study of the original records of Harvard, William and Mary, Yale, Mount Holyoke, and Michigan from their founding to 1900. Harvard University Press. http://books.google.com/books?id=OMgjAAAAMAAJ

Spiro, R. J., Feltovich, P. J., Feltovich, P. L., Jacobson, M. J., & Coulson, R. L. (1991). Cognitive Flexibility, Constructivism, and Hypertext: Random Access Instruction for Advanced Knowledge Acquisition in Ill-Structured Domains. Educational Technology, 31(5), 24–33. http://www.jstor.org/stable/44427517

Swinton, O. H. (2010). The effect of effort grading on learning. Economics of Education Review, 29(6), 1176–1182. https://doi.org/10.1016/j.econedurev.2010.06.014

Trail-Constant, T. (2019). Lowering the stakes: Tips to encourage student mastery and deter cheating. FDLA Journal, 4(1). https://nsuworks.nova.edu/fdla-journal/vol4/iss1/11

ChatGPT search – Autotune for knowledge

Lots of interesting conversation going on in my community right now about the implications of ChatGPT style tools for the education system. Will students use it to cheat? Will we incorporate it in our classrooms? Can we use it to do mundane tasks for us? What are the ethical implications of using this kind of software?

My read is that these tools will do to education what the calculator did to math education. And we’re still fighting about that 40 years later.

Those are important conversations, but I want to talk about something else. I’m interested in how these tools are going to change our relationship to learning, work and knowledge. In a conversation with Nick Baker this morning, we were trying to map out the future workflow of the average person doing an average task.

  • Step 1 – Go to FutureGPT search
  • Step 2 – Ask FutureGPT ‘what does a government middle manager need to know about the Martian refugee crisis. Include three references and tell me at a grade 12 level using the voice of an expert talking to their boss’
  • Step 3 – Look over the response, click send message, include your mid-level manager’s email address.

I figure we’re, maybe, two years away from this? But who knows, we might have it before I post this message.

What does this mean for knowledge?

30 years ago, when I was in college, you went to the card catalogue, found a book that might be relevant, and walked down a long line of library shelves to find your book (once you remembered how the system worked). On that shelf were a bunch of other books that 50 years of librarians had curated to be similar in nature (in one way or another) to the book you were looking for.

The librarians were my algorithm.

Right now, still, I’m using a search engine and a bunch of different practices to try and find information curated by other people somewhere out there on the Internet. I put in a search string, look at what I get back from the algorithm, make some adjustments, and try again. Along the way I land on some websites created by humans about the issue I’m interested in.

The search engine algorithm brings me to a human (probably) made knowledge space.

Starting this year, we’re going to be handed a mishmash of all the information available on the Internet, sorted by mysterious practices (popularity, number of occurrences, validity of sources if we’re lucky) and packaged neatly into a narrative. The algorithm is going to convert that information into knowledge for me.

The algorithm presents me with the knowledge, already packaged.

Autotune for knowledge

In 1998, Cher’s ‘Believe’ hit it big as the first autotuned song to sell tons of, I guess, CDs. Autotuning takes the human voice and ‘removes the flaws’ in it: any place where you might be off key or pitchy, where you might have slowed down or sped up in your singing. Musical purists have been decrying the process ever since, saying it removes the human part of the music. It’s everywhere now. If you listen carefully to most popular songs you can hear the uniformity in the sound.

That’s what’s going to happen to our daily knowledge use.

This, to me, is the real danger. These tools are so convenient, so useful, and save so much time; how is anyone going to rationalize taking the time to actually look into issues and check for nuance? Who is going to pay you to take a week to learn about something deeply enough to give an informed opinion when something that looks like an informed opinion can be generated in seconds?

The real danger is not to people who are experts in their fields. Super-experts in every field will continue to do what they have always done. All of us, however, are novices in almost everything we do. Most of us will never be experts in anything. The vast majority of the human experience of learning about something happens at the novice level.

That experience is about to be autotuned.

Confronting uncertainty with Truth OR Why use metaphors?

In honour of the release of Martin Weller’s book Metaphors of Edtech, I thought I’d address one of the issues that keeps coming up in the classes I’m teaching this term: the difference between an educational concept being ‘true’ and it having explanatory power.

In the introduction to the book Martin suggests that the research provides two key insights about metaphors…

  1. They are fundamental in shaping our interactions with the world
  2. They can be used to understand a new domain

His book is a super interesting journey through the various ways we have used things in our world to shed light on the complex, uncertain world of learning and its accompanying systems.

I had his book in mind when I came across this tweet circulating in my feed this morning.

I started responding to the question on Twitter (potentially rhetorical, but I’m going to take it as if it weren’t), “why in the heck does this still happen?”, and decided it was more of a blog post.

Myths

I understand the word Myth to have originally meant something like ‘the way we explain things’. The word descends to us from the Greek word Mythos, which originally didn’t carry the sense of ‘not true’. Rather, as Brzeziński suggests, “Myths have been created to give answer to the most basic questions concerning human existence.” We have our rationalist friends from the late middle ages to thank for the bait and switch in the meaning. Now myth means something like ‘things that aren’t true.’

I think of myth as a way we confront uncertainty.

Learning styles as myth

This is a hot topic in my classroom. I’m not a huge learning styles guy, because I find that once people find a box to jump in, they don’t like to explore other boxes. I’ve heard too many students/faculty say “I’m an auditory learner” and refuse to watch the video. I know the concept of learning styles is being taught on my campus. The education students I teach are VERY familiar with the concept, and most of them have a clear idea of what their own learning style is supposed to be, regardless of whether they believe in it or not.

In their oft-cited 2010 article, Riener and Willingham suggest that learning styles do not ‘lead to better learning’. As is so often the case in this kind of argument, the idea of what ‘better learning’ might be is taken for granted. They did their research in ‘controlled conditions’ and discovered that ‘learning is not improved’. There is a hint, though, where they say “These preferences are not “better” or “faster”, according to learning-styles proponents, but merely “styles.” In other words, just as our social selves have personalities, so do our memories.” (2010) Willingham, certainly, is a big memory guy (Willingham, 2019).

So. If I’m to read between the lines here, Science has proved that using ‘learning styles’ doesn’t improve memory retention in students. This has very little impact on my teaching. I don’t test my students’ memory in my classes.

The authors do agree with learning styles advocates that all our students are different. That’s good. But what do we do with that? What if we don’t ‘believe’ in learning styles but are still conscious of approaches that different students hate?

Let’s talk about math

Every time I talk about uncertainty in education, someone brings up doctors or math. “I wouldn’t want my doctor to learn that way” (sorry, they do), or they say “2+2 is 4, you just need to shut up and learn the math”.

Two weeks ago I was ‘helping’ my 14 year old with her grade 9 math. By ‘helping’ I mean she was explaining it to me. I relearned grade 9 math two years ago to help my other child who had fallen behind at the start of the pandemic, and promptly forgot it all again. What I noticed this time was that the math she was learning was the same math no one in my PhD program last summer could remember how to do. I mean. There might have been one person, but that person was not putting their hand up.

The topic of ‘how we felt about math’ became a conversation in several of the groups that we were in. Those who felt they ‘weren’t good at math’ were struck by a huge sense of anxiety trying to relearn the math. They had ‘learned’ that math, maybe in grade 9. They passed it. They proved at one point in time that they had it in their memory. But they didn’t consider themselves someone who ‘liked math’ or were ‘good at math’.

If a ‘learning style’ helped a student feel better about math, would that make it worthwhile? Let’s say that it had no impact on their ‘memory’ of math concepts, but they remembered math class with a sense of fondness… would that make learning styles an approach we should consider?

Does it even need to be ‘an approach’? Can we posit the idea that working with student preference leads them to be less frustrated about the work they are being forced to do? Maybe. But you’re never going to be able to ‘science’ that one in the same way. (don’t say we could do a survey)

Confronting uncertainty

Testing memory, as it can be done in ‘controlled conditions’, is one way we can ‘shape the interactions in our world’ and ‘understand a new domain’. It is a way to affix a number to the learning process. It allows us to clamp down on the uncertainty of this crazy learning thing we do and give it some clarity. It’s ‘evidence-based’. When we believe that memory is what we’re trying to do in our schools, we interact with our students differently and we understand success as a high grade on a test.

Same with learning-styles. It’s a way of trying to talk about something that we see in our classrooms. All our kids are different. They seem to learn things in different ways. Different approaches seem to be preferable to different kids. Let’s put a name on it and build some categories so we can teach people to teach to specific learning styles.

People remembering things is helpful. Paying attention to how people prefer to learn is probably a good idea. They are different ways of talking about what we call ‘learning’.

The other myths in the tweet above ‘people remember 10% of what they read’ and ‘the factory model of education’ are similar. They have explanatory power. They allow us to talk about how reading something is not ‘remembering’ something. They allow us to talk about how our schools are shaped to train people in certain ways eerily similar to factories. Are they ‘true’?

Shrug.

I think our desire to move past simply explaining something (‘reading isn’t perfect’) to giving it a ‘model’ or a ‘truth’ is the original sin. We are trying to ‘truthify’ something that often doesn’t respond to true and false statements (though, obviously, some do). And then we follow along behind it and ‘research it’ and discover that it isn’t ‘true’.

Not everything has to be true. If learning styles are a myth, then they do explain something to me. They remind me that my students are different and that I need to pay attention to their preferences.

Even if those preferences aren’t about the math they end up remembering.

The value of metaphor

The nice thing about metaphor is that it allows us to talk about something without troubling ourselves about whether the metaphor perfectly describes the world. It’s a way of making meaning by sharing how we feel about something else. It’s a different way of confronting that uncertainty. You don’t need to go find a lab and prove it isn’t true.

If I said that learning is a weed because you can’t really control what someone learns, you could nod, and think about a weed in your garden. It might connect with something else you’ve been thinking and it might not. But if by some strange happenstance you’ve made it this far down in the post, you might now have some image of a weed… and be thinking about learning.

Martin’s book

Martin’s book is a super-cool journey through the metaphors that people have used in edtech. It’s really interesting to see the different ways people have tried to tell stories about the uncertainty they’ve been confronted with.

His writing style is as good as ever. It’s a book I’m going to keep dropping back into.

  • Paper book is on sale, September 30th http://www.ubcpress.ca/metaphors-of-ed-tech for $24.99.
  • Available now to read online https://read.aupress.ca/projects/metaphors-of-ed-tech

References

Brzeziński, D. (2015). The Notion of Myth in History, Ethnology and Phenomenology of Religion. Teologia i Człowiek, 32(4), 13–26. https://doi.org/10.12775/TiCz.2015.047

Riener, C., & Willingham, D. (2010). The Myth of Learning Styles. Change, 42(5), 32–35. http://www.jstor.org/stable/25742629

Weller, M. (2022). Metaphors of Ed Tech. Athabasca University Press. https://read.aupress.ca/projects/metaphors-of-ed-tech

Willingham, D. T. (2019). The Digital Expansion of the Mind Gone Wrong in Education. Journal of Applied Research in Memory and Cognition, 8(1), 20–24. https://doi.org/10.1016/j.jarmac.2018.12.001

Group reads – Using Barbara Fister’s Principled Uncertainty

It’s been both a humbling and exciting prospect to return to the face to face classroom. I’m teaching two versions of the same course ‘Digital Technology and Social Media’ for the faculty of education at the University of Windsor. It was a last minute offer, and I saw it as a great way to get my mind around the new LMS that we have at the university. That was my overt reason. Also… I just really wanted to be back in the classroom.

My students are first year education students from two different programs. The age range is reasonably broad, but I would suggest that the majority of them are early career. I have a few students who are more ‘mid-career’ with some teaching experience, but the course is not designed with any previous teaching experience expected.

The activity from this week was originally stolen from Dr. Bonnie Stewart. The article I used was sent to me by Lawrie Phipps. I can’t imagine that there are many ideas that I use in the classroom that I actually made up, but in this case I remember where both the content and the approach came from, giving me the opportunity to give credit where it’s due.

I’ve used the approach before in a number of contexts. It’s not a terribly complex process. Take an article, break it into pieces, give each piece to a group of students and get them to report back on what they’re reading.

Why this approach

I’ve been in a number of contexts with students and faculty where it has become abundantly clear that students have not had many opportunities to do deep, opinion based responses to readings. Looked at through a faculty lens, ‘students just don’t read what they are assigned’. They are often given LOTS of readings or asked to do ‘summaries’ of readings… but rarely, it seems, given the opportunity to think about one small piece with time to give a response. It’s also a response to which I apply my own context as the facilitator. So each group has a section of the work and is given, say, half an hour to read that section and come up with a perspective on the piece.

Students in both classes responded very positively to the approach. I overheard a number of ‘why have I never done this before?’ and ‘this is the first time I’ve ever actually read one of these things’. Were those things said so I would overhear? Maybe. The fact remains that they did the reading, and the groups reported back on it. They also seemed to grasp a number of the concepts in ways that I hoped they might. They also took a few of them out of context, which is maybe just as useful in the way it spurred discussion.

What I actually did

  1. I created a slide deck in google slides that had the article title and link and then a slide for each of the 7 sections.
  2. I duplicated the slides so that I had the option of small groups or large groups. (I went with groups of 2 or 3 and did the whole article twice)
  3. In the class I introduced, broadly speaking, some of the concepts in the article. I did a bit of early deconstruction of commonly held beliefs that I thought would be useful to set up the content of the article, but didn’t talk about the article directly.
  4. I explained a ‘group read’ to the class
  5. I walked around the class and assigned students to a particular slide. “Jim and Jane, you guys are slide 3.” (slide 3 explains what part of the article they need to read)
  6. I had to confirm (several times) that their responses were not being graded and that they were not, in actuality, meant to ‘just do a summary’ but rather to have a response.
  7. They took about 25 minutes to get through their article section and to write up a response in their own slide
  8. We took another 25 or so minutes to go around the room and listen to people talk about their response to their section of the article. I tried to engage with each group and occasionally included other groups in that discussion.

So. Takes about an hour. Students do an ungraded, deep dive into an important article in the field and, hopefully, come out the other side with some new ideas or, at least, a better understanding of their existing perspectives.

Why did I choose the Barbara Fister article?

This article introduces many of the concepts that I’m hoping to cover in the first term of the course. It takes on the way we ask questions, integrates it with the way we do assignments in universities, runs through the impact of algorithms and talks about some of the things that the students can do about it. Its central focus on uncertainty matches my own research, at the moment, and allows me to have something external to confirm that I’m not just making things up.

The article is also approachable AND written in an academic style I’d love for the students to emulate — the kind of writing I hope they’ll aspire to.

Outcomes?

I got a lot of what I was hoping for from this activity. We had some talk about how google’s algorithms control how we search for things and how we make truth, which is going to be important for future assignments. We started to talk about how task-based much of their education has been and how the digital can, if you’re not careful, lead you further down that ‘task based learning’ approach. We also teased out some of the balance between ‘believing experts’ and ‘having your own opinion’ which I feel is going to be a theme that goes on for the term.

Notes for improvement

I definitely should have taken more time to talk about what a good response looks like. It would have been perfect if I’d said “and this kind of response is what I want you to do in your discussion forums” which is the activity they had to do for homework. Thought of that right after class. smh.

I started to lose focus on the last couple of slides. I struggle to stay in those things for a long time. Just need to do a better job preparing myself so that I can be as interested in the last comment as the first.

Choral Explanations as a way of opening the conversation

I’m working on several courses this fall and one of my areas of focus is trying to come up with some models for building knowledge collaboratively inside a given course. I was jamming around with my colleagues, sifting through various ideas, and I remembered a post by Michael Caulfield. In this section of his post ‘Choral Explanations and OER: A summary of thinking to date’ from 2016, he describes how the choral explanation is a way of allowing for a wider number of voices to contribute to a discussion and avoid the transactional nature of a simple question and answer scenario.

I’m still trying to think my way through how to set this up for the courses I’m designing, but it’s always useful for my planning if I take some time to think about WHY I might want to do this. This keeps me from wandering off and letting the idea get ahead of me. And, clearly, I can’t think quietly 🙂

For the purposes of this conversation, I’m proposing taking a given concept per week and trying to support a choral explanation by participants. They will be encouraged to ‘add’ to the explanation by providing an example, by offering a citation (with an explanation) or a counter position to existing positions as a ‘YES AND’. The idea is to create a broad image of what a concept means to different people in different contexts, rather than trying to limit ourselves to a simple definition that is by its nature exclusionary of other points of view. If you have a model for doing this, I’d love to see it.

Think about defining learning. Or student success.

Supporting Uncertainty

In a recent Twitter conversation a colleague asked (responding to this interview I did on Teaching in Higher Ed) about how to bring uncertainty into the ‘current container of higher education’. The question stuck with me. I’ve been spending a fair amount of time recently writing about how the way we frame questions can support uncertainty. This approach focuses more on how students actually engage with those questions.

The hope here is to create space for participants to bring multiple perspectives and voices to a conversation without them cancelling each other out. There will be no searching for the ‘right’ answer, as it depends on what you’re trying to say. It also, like any real conversation, will depend on when you come to the conversation. I’m going to need to figure out how to encourage participants to add something new to the conversation without making that annoying. (groups of five might work)

Novices vs. experts

I keep returning to the literature (a couple of links below) distinguishing between novice and expert learners. There is a fair amount of research out there talking about how they should be treated differently. A novice learner needs to ‘get a sense of the basic concepts’. A novice needs to learn the definitions so that they can know what’s going on. The problem, though, is that we often have to simplify a definition in order to explain it to someone who is not part of the field. That definition often leaves out critical voices or gives the illusion that complex concepts can be easily explained.

The idea of a choral explanation allows for a novice (whatever that is to you) to engage in a conversation with nuance and multiple perspectives, without necessarily ‘learning the words first’.

Community as curriculum

One of the reasons I’m super excited to try this is that it encourages participants to turn to each other to understand something. It also allows participants to bring perspectives beyond my own understanding (could be gender, background, race etc…) to our understanding of things. The community of participants becomes the curriculum that they are learning.

Internet as Platform

Maybe most importantly, it allows us to deal with knowledge the way it is actually created. This is certainly true for the way many people come to know things using the internet. But I think I mean it more broadly than that. We have been taught for decades now to cite where our thoughts about something come from. To situate our work in the literature. We have always been in a choral process for making knowledge… with only so many people being allowed to be part of the choir. Mike’s approach will maybe allow me to keep the conversations about concepts open, collaborative and, ideally, choral.

Tabatabai, D., & Shore, B. M. (2005). How experts and novices search the Web. Library & Information Science Research, 27(2), 222–248. https://doi.org/10.1016/j.lisr.2005.01.005
Hmelo-Silver, C. E., & Pfeffer, M. G. (2004). Comparing expert and novice understanding of a complex system from the perspective of structures, behaviors, and functions. Cognitive Science, 28(1), 127–138. https://doi.org/10.1207/s15516709cog2801_7

Future of Education Speaker Series Episode 1 – Students thinking about future skills

As I’ve mentioned various times on this blog, I have had the good fortune of working with about 70 Co-op students throughout the pandemic. They were students who would mostly have gone out to their engineering or kinesiology placements, but could not due to the pandemic. They’ve been wonderful to work with. The students I’ve worked with in the past had mostly self-selected into the work that we were doing. These students had not. They were doing their best (like so many of us) to make the best of a Covid situation. I learned a lot from them…

One of the things we’ve worked on extensively… is about what it means to be ‘prepared’ for the world that we have in front of us.

This Friday at noon EST (April 29th, 2022) my students will be presenting the results of their Futures thinking activity. They have been tasked to consider what skills they might need to succeed in the future. They started with the trends that are part of the SSHRC future challenges and built from there. It is the capstone presentation at the end of their four-month work term in the Office of Open Learning at UWindsor. Join us. We’d love to hear from you.

Why are we doing this?

I do my best to have my student employees do meaningful work. I also try and do the kind of training that might make the experience worthwhile to them. One of the key focuses of that training in the last two years has been the gap between their expectations of what it means to work and what I am actually asking them to do. I have found that I need to spend significant time making students believe that I am actually looking for their ‘opinion’ and not ‘the right answer’. They struggle (at first) confronting things that are uncertain. They want a clear question with a clear answer.

This experience led me to take a closer look at the ‘future preparation of students’ conversation. Whether we are talking about 21st century skills, or preparing people for future jobs or whatever… what are we preparing them for? Is confronting uncertainty a 21st century skill? Are other people in my field seeing the same things from students? What other things should I be trying to prepare them for that haven’t occurred to me? How many of those are the results of my own embodiment and privilege? What, eventually, does this mean we should be doing to change what and how we teach?

I don’t know. I know that I care about the work I’ve done with my students and I believe that there is some kind of disconnect. Over the next ten months I’m going to be hosting a series of conversations to talk about it. I’m planning an open course for the fall. I’m also currently applying for funding for a conference in February of 2023. Stay tuned.

A few thoughts going forward.

Uncertainty

You might believe that uncertainty is the product of our current times (pandemic, war in Europe, housing and oil prices, climate change etc…). You could see the next 20 years as a time of potentially unprecedented uncertainty. You might also believe that the abundance of access to voices and information has unveiled the uncertainty that has always lain underneath the veneer of the post-WWII ‘clear objectives’ global north west. Either way. I feel pretty comfortable suggesting that many of the new challenges our OOL students will be facing in the next 20 years don’t have ‘answers’ and, frankly, the ones we’re handing on (e.g. poverty) don’t have ‘answers’ either.

I think that preparing people for uncertainty is different than preparing them for certainty. I was talking about uncertainty a few weeks ago and was told that we need to ‘teach the basics‘ so that people can even enter the conversation. We need to teach the certainties before we teach the uncertainties. I hear it. We were making the same criticisms of whole language learning in the 80s. I just have this feeling that this conversation about uncertainty and what we can do for it is important.

Futures thinking

Futures thinking is a method of examining current trends through the lens of the future. It is NOT prediction. Take your time machine back 5 years and make some guesses… your predictions were probably wrong. If they were right… no one listened to you.

Futures thinking is about creating ‘possible futures’ that give us a chance to discuss our current trends outside of the current disagreements we may have about them. We combine trends and think about what would happen if they became a dominant trend in our culture. What if, 20 years from now, housing prices went up by 500%? What if advances in cyborginess gave us all unlimited mental storage?

The possibilities are endless, but the future is not the thing. The real advantage of taking a futures approach is the chance to think about the trend. The outcome is a better understanding of what we should be doing in our world right now.

What do I hope to get from this?

Well. I have this conversation that I want to be in. I can’t find it… so I’m hoping to start one version of it and find the others that are already ongoing.

I’m also hoping to pull together the wisdom we come across. We’ll see how things develop, how many voices decide to join. It’d be great if that October open course actually became a MOOC. I’d like that. We’ll see.

Interested?

For sure come to the presentation on Friday. For now drop a comment on this post if you’re interested. I haven’t quite settled on the platform for communications (what with the recent unpleasantness in the social mediasphere).

Teaching for uncertainty vs. teaching the basics

I had a really great time at the #DLsymp22 conference this week. It was my first time back face 2 face, and while I had a few butterflies before I walked up on stage, it all felt pretty natural. I had forgotten how much fun it was to be with a few hundred passionate educators where the things that I care about are the things that they care about. Good times.

Uncertainty

The talk I gave was the first run of the stuff I’ve been working on for the last two years. I started writing a book in the summer of 2019 that became, eventually, about uncertainty and its relationship to learning. A book that, frankly, needed to get rewritten a few times considering how people’s sense of ‘uncertainty’ was impacted by the last few years.

I framed the talk around the needs of the young people coming out of our education system. What do they need to learn ‘for’? I’ve had the opportunity to spend a lot of time around 20ish year olds the past couple of years (I’ve had 70 Co-op students work with me) and their plight, and the way I believe they’ve been misunderstood, has become a real focus of my time. What are they facing and what do we need to do to help them face it? What does our education system need to ‘be’ to get them there?

The first question I asked the crowd was what they thought students were learning for…

Q1. What do you think our students are learning for? What should they be good at when they’re done school?

It’s a great chart, with lots of words that I think are super-important in the process of helping students get ready for what they are going to be facing going forward.

The middle part of the talk was about how I have this suspicion that our education system contributes to a particular ‘syndrome’ that is quite the opposite of what you see in that wonderful list of things to learn. Quaintly named, I’ll admit. But I’m calling a hat a hat on this one… to avoid confusion.

I have consistently struggled with student employees who, early on in their work term, assume that any problem I give them – or question I ask them – has a clear answer. That they can get their job ‘right’. Maybe more importantly, many (most?) seem to believe that I actually already know the answer to any question I ask them… like I’m playing some kind of game by not telling them the answer.

Like they are still playing the game of school. The game where a teacher has something they want you to do, they know what the ‘answer’ is, but they just won’t tell you. Good, compliant, high-achieving students are the ones who figure out what the teacher wants and give it to them. They are rewarded for learning compliance. (note: compliance is not on the list of things we say we want them to learn, and yet it is often what we reward most.)

The next question (further into the talk) that I asked was about what they thought students would be facing. This was not the cheeriest part of the talk. We are living through a time right now where at least 4 things I would have considered black swans (war in Europe, oil prices, climate change, pandemic) in 2010 are happening at the same time. If you include things like housing prices, that number goes up.

I asked the ‘future of our learners’ question through the lens of uncertainty. I defined uncertainty through the lens of ‘ill-structured problems’ or, kind of, the lens of wicked problems. A problem where some or all of the question, the process for addressing the question, or the solution are unknown or unknowable. The list is pretty scary.

What are some real world problems without clear answers?

So many of these real world issues are and will remain uncertain. There’s no ‘solution’ to poverty. There’s only hard work on pieces of the problem, a problem that gets super messy to define if you start to think about it.

The disconnect

If the world our students are facing is full of uncertain problems, can we prepare them for that with right answers? Obviously I don’t think so. And it’s not even about leaving room for ‘failure’. In order to fail at something, someone has to know what success is… and sure, I can fail to fix my water tap, because it’s still leaking when I’m done. There are definitely problems we can fix. A lot of the big ones, though, are not things that can be ‘fixed’.

But we need to teach them the basics

People were really nice to me about the presentation. Many, clearly, were just happy to be together in a big group again. Some people pointed out some very important equity issues with including more ‘uncertainty’ in our teaching. One teacher asked if adding more uncertainty would lead to more anxiety in students or less, because they’d have more practice with uncertainty. Awesome question.

One gentleman (quite jovially) accosted me later that day and said “but, obviously, we have to teach the basics! They can’t be involved in this if we don’t teach the basics first!”

I should be better prepared for this objection… I’ve been hearing it for 15 years. At least. But it always sets me back a bit. I’m not suggesting that students shouldn’t have to learn to identify letters or colours. But I don’t really think the ‘basics’ should be the ‘point’ of learning in most cases.

I think of my journey through carpentry… is hammering a basic? Is it drilling? What about joinery?

I’m not saying any of those things aren’t important, but I’ve learned them in the context of building things, of understanding what they’re useful for, not by hammering 1000 nails into a board so the nail head was perfectly flat.

I’m sure my hammering would be BETTER if I did the 1000 nails thing. But there’s more to hammering a nail than getting the nail flat. Safety. Wood grain. Wood type. Time. So many things that bring context to it. Most importantly I’M NOT A CARPENTER. Most of our students will never need to be ‘amazing’ at anything we’re teaching them.

I don’t wish for a world full of super-scientists, I wish for citizens who understand enough about science and statistics to respond ethically to a pandemic. I wish for citizens who can handle uncertainty and still make good decisions.

A tentative guide for new student employees

You long-suffering readers of this blog (this is year 17) know that my blog is often just where I keep my notes. This one is going to be my ‘working with dave’ notes for new student employees.

I have been fortunate to work with a fair number of excellent students on a number of different projects over the years. At UPEI I worked with the student union for a number of years to introduce some basic project planning and management. I led New Student Orientation for a couple of years, and had some students working for me full-time in a number of previous roles. In the last two years (’cause Covid) that number has shot through the roof, as I’ve had about 80 CoOp students working with me in the Office of Open Learning.

It’s an interesting group to work with. I’ve only got them for four months and, for many of them, this is the first job where they are expected to do things beyond simply repeating a pattern they’ve been given. This term I have three students, two of them returning from previous CoOp terms. It means that instead of allowing the ‘this is how we do stuff’ to come naturally through conversations, I’m going to do some one-on-one training with the new students to avoid forcing the returning students to hear all my introductory advice again. In preparation for that, I thought I’d jot down some notes.

Would love some feedback on this so I can turn it into a long term document.

Choosing to be interested

One of the challenges for CoOp students who are working with us in the department is that they would not (with a few exceptions) ever have imagined working in education. They are predominantly engineering students who have been unable to find in-field CoOp placements because of the pandemic. Many students (and not just engineers) have been sold the fantasy that they can go to school to get a job that is going to be easy to love.

I mean, that can happen, I guess. But it probably won’t. Liking the work you do is mostly about mindset. It’s something I was fortunate to learn in my house growing up. My mother can have fun peeling apples. Or raking the lawn. She turns things into a game for herself (and others), tries to get better at it. Tries to do it more efficiently. That mindset is a critical one to develop for people to live happy lives. Most of us have to work. Most of us will never have jobs that are ‘fun all the time’. Learning to find things to like in your job is critical. Even if that thing you like is playing cards at lunch with coworkers (I learned to play 5-way cribbage working at a lead-silver refinery)

I approach this in a number of ways. I try to bring a positivity and enthusiasm to our staff meetings. Try to model it. More importantly, I talk to students about finding a project during the term that they can invest in. Something that they can find interesting that they are responsible for. That they can take pride in doing well.

You need to choose to be interested.

(but not all tasks are going to be interesting)

Being prepared

Such an easy thing to do. Such an obvious oversight if you don’t do it. If someone says “read this over” then read it. Yes. But read it to the point that you have an opinion about it. Come to our meetings with thoughts about the work that you’ve been given.

You could even read around it. A 2-minute google search where you’ve looked at what other people have done or checked the meaning of an industry term can make all the difference to your enjoying the next meeting.

Be prepared

(it doesn’t have to be a big thing. Just think about the work BEFORE the meeting)

Handling multiple tasks

I have consistently seen students struggle with managing multiple tasks. If I forget (early on in the term) to say “hey, when I give you a new thing to do it doesn’t mean you stop doing the other one” I will have at least one project totally fall off the radar. Students have had their lives jammed into 60 minute sections (classes) for basically all of their lives. They are accustomed to being TOLD when to work on things, and when to change tasks.

This is super easy to fix. You just have to say “look, segment out your day so you can keep working on different projects.”

Be conscious in managing your work day.

(this might take a while to get used to)

Learning to prioritize

“Wasn’t it obvious that this was more important? The VP asked for this!”

This is something I once found coming out of my mouth before I stopped and realized that, obviously, if my employee didn't understand the priorities, it's because I never explained them. A big part of working with student employees is passing along these cultural norms, but you need to say them out loud. They vary a lot depending on people's cultural backgrounds.

I like to think of it in terms of 'right now', 'next 24 hours' and 'next two weeks'. YMMV. But a big part of the job is getting students to think about how they are prioritizing their multiple projects. Talking it through… "hey, I know you're working on three things right now, how have you got them mapped out?" really helps.

Some things are going to be more important than others

(You don’t have to do everything at once, you just have to keep track)

Managing up

This might be the most important piece. We often hire students to do projects that we don't want to do. We also hire them to do things we don't necessarily know HOW to do. So many students end up working for people with very little management experience or, worse, for people who think management is some kind of guessing game where they say 'go do stuff' with some expectation that students know what's hidden in their boss's brain.

I talk to my students about how to manage me. I can get a little scattered, so I tell them it's perfectly ok to remind me that I said I would send them something, or that I still owe them an answer to a question. They can ask me questions about what I'm asking them to do, but I'd like it if they read everything first and get a sense of what's happening before they ask.

That's the way I like to work. But one of the biggest jobs of any employee is to figure out what makes your boss tick. I mean, some of them are just jerks, BUT we all have to work. Learning to manage your boss in a way that lets you find out what success looks like is going to make you happier, and make your boss happier too.

You need to manage your boss as much as they need to manage you

(figure out what success looks like, it’s almost never obvious)

The project charter

I love me a project charter. It's just a document that lays out what a project is and keeps track of high-level issues. The beauty of your average project charter is that it gives you a place to record decisions, clarify outcomes and timelines, and keep track of risks and scope creep.

It becomes the official record of the project and gives you something to go back to, to make sure everyone's on the same page. I use various versions of this one, which you are free to steal.

Filling out a project charter is also a great way to frame your questions for your boss (see managing up): "hey, I was just going through my notes and realized I don't think you ever told me when you needed this finished."
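If you've never seen one, a bare-bones charter can be as simple as a one-page outline. This is just my sketch of the idea (the headings and the example project are made up; adapt them to whatever you're working on):

```text
Project:       Redesign the department FAQ page   (example project)
Sponsor:       your boss          Lead: you

Outcomes:      What does 'done' look like? Who signs off on it?
Timeline:      Key dates, and the final deadline.
Decisions log: Date, decision made, who made it.
Risks:         What could derail this? Who needs to know early?
Scope notes:   Things we've agreed are NOT part of this project.
```

Every blank you can't fill in is a question you should be asking.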

Keep track of what you’ve been told to do

(and everything else that’s been said about your project)

Dealing with uncertainty

This is the hardest one. When I ask a student to do something, or ask for their opinion, they almost always assume that I already know the answer. 15 years of school has taught them that that's what adults do: ask questions they already know the answer to, to test you.

It takes a lot of convincing to get students to realize that when I ask them a question it's because I DON'T KNOW THE ANSWER. It's one of the things that slows the work down the most: the student thinks they're trying to figure out what I'm withholding, while I'm just waiting for the work to get done. It's bad for everybody.

There is no 'right way' to do most things. There are local customs at different places of employment that you need to follow, but a lot of the time you just have to make choices AND learn to be ok with being wrong sometimes. It can take a while before a student gets their mind around that. It requires patience.

Face uncertainty, make decisions

(just not decisions that can get you into too much trouble 🙂 )

Learning to be wrong and to fix it

Today I showed my students the edits that my own supervisor made on a grant application I wrote. It was GLOWING, it had so many edits. Students have been trained to believe that they get one chance to submit something, and that if there's anything 'wrong' with it they have somehow failed.

Real life is mostly not like that. You need to get used to the fact that your supervisor is going to ask you to change things… sometimes in ways that you won't like. You can certainly talk about why you made the decisions you made, but being open to critique is a necessity.

Be open to critique

(just bring it back better next time)

Talk about what you don’t know

This is a touchy one. I want students to tell me when they don't know how to do something, but I also want them to try to figure it out. It's a delicate balance. If you ask me a simple question you could have easily found the answer to, I'm just going to send you a lmgtfy link. So, as a student, you need to try something first. Asking questions without trying just leads to bad questions.

But. If you don’t know how to do something, or you don’t understand the scope of something… your work is going to suck. And people are mostly bad at telling students what to do. That means that students need to advocate for themselves. They also need to make an effort to learn the stuff they don’t know on their own.

Working is all about learning.

(sometimes that learning is your job, sometimes it's your boss's job to help you)

Overall…

Working is hard. You’re going to find, a lot of the time, that you aren’t sure what you’re supposed to be doing or how to do it well. Figuring out the job is a big part of getting good at it.

Open Scholarship: We need to make peer review valuable

A conversation with Mita Williams

For my second discussion I reached out to Leddy Library librarian and long-time open thinker Mita Williams to get a different perspective on open scholarship. Mita was kind enough to listen to the conversation I had with Lenandlar Singh, so we continued that conversation right where we left off.

One of the key ideas that has come up in my exploration so far is that the things I might call open scholarship need not all be called 'research' in order to be recognized by universities. A faculty member is credited for teaching, service and research. Could some of the things we do in the open be credited as service? Can we think of open scholarship as moving towards research as it moves towards artifact and peer review?

Other thoughts and comments from the podcast

  1. When I think of open scholarship, I think of it as the whole ecosystem around open access.
  2. How does your public, open scholarship integrate into the longer term conversation in your field?
  3. Zines!
  4. What counts as ephemera?
  5. Metrics – we don’t just want to ask for numbers
  6. What I keep coming back to is ‘careful readership’. Whatever peer review system we use needs to be valuable.
  7. Lots more!

Creative Commons License
Except where otherwise noted, the content on this site is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.