00:16:51
- Hi folks. My name is Brenna Clarke Gray. I want to thank you so much for joining us for this fundraiser in defence of Ian Linkletter and against surveillance. Before we get started, I have just a few organisational notes. I will try to keep this really short because we have a lot of brilliant people to speak to you today. I want to acknowledge that I'm joining you today from Tk'emlúps te Secwépemc in the unceded
00:17:17
and traditional lands of Secwepemcúl'ecw. I think in particular when we talk about surveillance, it's really critical to consider how the settler state uses surveillance tools for both control and silencing. And so I'm trying to be even more present and aware of my territorial space than I would normally be. As I said off the top, thank you so much for joining us for such a critical conversation and such a critical cause.
00:17:44
As of just before going live, I can report that we have raised $6,909 for Ian's defence fund. If anybody wants to make that 7,000, I will be really, really happy for a rounder number. You'll see a donation button in the corner of the page if you're watching on the againstsurveillance.net site. That links you directly to Ian's GoFundMe and all money that we've raised will go to Ian's defence fund.
00:18:10
Any overage will be donated to the BC Civil Liberties Association. So thank you so much for donating to this cause and helping to defend the right to academic freedom, to critique of these tools and to challenge the kinds of norms that we're seeing in education that many of us are so disturbed by. A couple of just, like, how to participate notes.
00:18:34
If you would prefer to watch a captioned stream of today's event, you'll see a link to the captioned feed in the top left-hand corner, which is also where the donation button is top left-hand corner of the againstsurveillance.net site. When you access the closed caption feed, do make sure you click the closed captioning button at the bottom right to activate those captions.
00:18:58
That's also a fail-safe: if for some reason you're not seeing the feed through this portal, you can click over to that feed and you'll have access to it there. And please do participate either in the YouTube comments or on Twitter using the hashtag AgainstSurveillance. We'll be watching that throughout the event and passing questions back to our esteemed guests. If you are a student, we would particularly like to prioritise your questions
00:19:24
so please self-identify as a student. A really quick way of doing that is to mark your question with an asterisk at the beginning, and we'll keep a special eye out for those both on Twitter and on YouTube. I have a few thank yous because an event like this does not come together quickly, easily or by one person by any stretch of the imagination. So to all the speakers today for their thoughtful help in crafting such a great schedule,
00:19:50
I'm really grateful to all of you for your contributions. Simon Gray and isiLIVE for providing the captioning feed for us today, Reclaim Hosting for their advice and their tech support along the way, Marco Lussetti and Nicole Singular for web development and graphic design help, Hannah McGregor and Lucia Lorenzi for monitoring the Twitter hashtag for us today, the entire Continuity of Care community for troubleshooting,
00:20:15
problem solving and rampant enthusiasm which I have tapped into greatly over the last few weeks. And finally there would be absolutely no event without Brian Lamb and Jon Fulton for their help with logistics, implementation and direction for today's event. So thank you so much. I'm going to introduce our first panel of speakers and then get the heck out of the way so that they can share their brilliance with you.
00:20:39
Our first panel is Maha Bali, Benjamin Doxtdator, Chris Gilliard and sava saheli singh. I'm going to tell you a brief amount about each of them and then I'm going to disappear. So Maha Bali is associate professor of practice at the Centre for Learning and Teaching at the American University in Cairo. She has a PhD in education from the University of Sheffield, she's co-founder of virtuallyconnecting.org and co-facilitator of Equity Unbound
00:21:02
and she writes and speaks frequently about social justice, critical pedagogy and open and online education. Benjamin Doxtdator is a citizen of the Oneida Nation of the Thames, Haudenosaunee Confederacy. As an educator, he values creating the conditions for his students to think critically and compassionately. Benjamin teaches English language arts in the middle school at the International School of Brussels. Dr. Chris Gilliard is a writer, professor and speaker.
00:21:27
His scholarship concentrates on digital privacy, surveillance and the intersections of race, class and technology. He's an advocate for critical and equity-focused approaches to tech in education. He's a visiting research fellow at the Harvard Kennedy School Shorenstein Center, a member of the UCLA Center for Critical Internet Inquiry scholars council and a member of the Surveillance Technology Oversight Project community advisory board. And finally sava saheli singh is a postdoctoral fellow at the University of Ottawa.
00:21:52
And previously, as a postdoc at the Surveillance Studies Centre at Queen's University, she conceptualised, co-created and co-produced Screening Surveillance, a knowledge translation programme for the Big Data Surveillance project. Before that, Sava completed her PhD on academic Twitter at New York University's Educational Communication and Technology programme. Her research interests include educational surveillance, digital labour and surveillance capitalism and critically examining the effects of technology
00:22:18
and techno-utopianism on society. I'm grateful to have you all here and I'm going to disappear and let you do your brilliance. Thank you so much. - Thank you so much, Brenna. Thank you all for joining us. You know, solidarity and anger. I am hearing an echo but hopefully this will go away eventually. So in this conversation, each of us is going to ask the others a question and we're all going to go through and answer it.
00:22:45
And the first question is about how our positionality influences our stance against surveillance. So I'm going to answer it really quickly and then move on to Sava to answer it as well. For me, I just want to say something at the beginning: as a child, my mother taught me that ethics was about doing the right thing when nobody's watching. And I feel like surveillance is the complete opposite of that.
00:23:11
What are we teaching our children, our young people? That the only reason they need to be ethical is because someone's watching them and not only watching them in a, like a real normal way but watching them in a creepy way and invading their homes. And also as a Muslim, when I travelled to the West, I know that I can sometimes be surveilled. I've been taken for random checks and things like that. As an Egyptian, my own government surveils its citizens, right?
00:23:36
And so to have educational institutions that are supposed to be nurturing ethical citizens, that are supposed to be helping us resist inequality and injustice, be the ones that are using things like exam proctoring, this is to me antithetical to what education should be about. And of course I also care about the trauma that students are going through and how these tools even add to it, the anxiety and the stress
00:24:00
and people who have internet connection problems and people who don't have privacy in their homes and all of that. And the last thing, related to Ian Linkletter, is that I had a run-in with the CEO of Proctorio recently in the summer, where I was asking people what they wanted to uninvent in terms of technologies. And he kept going back and forth with Chris here and with Jesse for a really long time. And then towards the end he said,
00:24:23
"Oh, I hope Maha Bali gives a balanced keynote about this." And so the fact that he, the CEO of a tech company, feels like he can tell me what to say in my keynote, the ways in which at tech companies are influencing policy makers rather than educators, this is hugely, hugely problematic. And so I'm gonna move this onto Sava and then Ben and then Chris who are much, much, much more hard-lined about this than me.
00:24:54
- Remembering to unmute myself. Hi Maha. Hi Benjamin. Hi Chris. Hi, everybody else. It's really amazing to be here. And I'm so honoured to be in conversation with these amazing people. To answer Maha's really important question about positionality: I think about my positionality a lot. The intersections of our identities play out differently within different contexts, and as a researcher in academia,
00:25:18
I'm often the only woman of colour, the only person of colour and the only queer person of colour in a room. And that means I have to carry as much of all of those things as I can and be fair to all of those things as I can. And that's hard to do because I don't contain all of the different aspects of all of those identities. But I am also acutely aware of the multiple privileges that needed to have existed for me as an Indian person
00:25:44
to have been able to come to North America, to go to graduate school and to continue living here. I'm also married to a white man. We joke sometimes that it's so that I can have access to his white male privilege but it's not that much of a joke given how often it's been easier for me to have him do something rather than have to deal with the stress of being myself in a situation. It feels good to list all my privileges so I'm not going to. But it's important to think about how each intersection
00:26:11
of who I am carries with it certain privileges and certain drawbacks. And I weigh each of those against each other in every situation, be it at the local grocery store or in the TSA line, remember TSA lines? I'm also very aware of my relative privilege in terms of who else is in the room and where I need to step up if need be. In academia, that is most important when it comes to students
00:26:35
and other folks in academia who are more precarious than me. So it's important for me to constantly be reflective about who I am and where I am and this isn't as much of an active kind of calculation that I'm doing in every situation because we've had to deal with it so often, it's very quick, right? What we've picked up on, what the cues are, in terms of like, alright, well, who am I in relation to this person? How have they responded to a certain thing?
00:27:00
Okay, now I know what to do. And it's instantaneous. We're so used to doing it all our lives that, being in any situation, it's very quick to pick up and adapt, and we've had to do that. So as a brown person, I'm also more likely to be noticed while being surveilled. We're all surveilled, let me be clear, but as a brown person, I'm more likely to be noticed. I'm just more noticeable than some, less noticeable than others
00:27:26
but other elements of who I am allow me to push back against or speak out against some types of surveillance. For example, I don't have to be on Facebook and I am not forced by my employer or my insurance company to include a surveillance device in my life. Owning a gadget or having a smart thing used to be a privilege. And now it's a privilege to be able to opt out of these things. How many Silicon Valley types talk about their children
00:27:52
not being allowed screens? Isn't it so interesting? And by interesting I mean enraging that the trickle down of technology means that you get more surveilled while I can opt out of it. Makes you think, doesn't it? So now I'm going to pass it on to Benjamin. - Unmute. Are we good? Yeah. Thank you both so much. I want to echo a lot of things you were saying,
00:28:20
add some different elements to it. For me, as an Indigenous person that gets coded white, I'm very conscious of the ways that surveillance does and doesn't affect me, because I don't get racialized in my daily life, especially living here in Europe. I'm on that relatively privileged side of things. And often that will lead me to be more vocal, knowing that I'm not being racialized,
00:28:47
that I feel like there's an obligation to speak up. Part of what I want to talk about in my positionality is as a teacher of young kids. So I teach kids who are 13, 14, 15, and I remember going to a conference in Germany last year, and Audrey Watters was there too, and we were looking at some of the tech, and a lot of these companies were trying to sell software
00:29:15
for proctoring exams. And when I said I don't give exams, I teach younger kids, they were very quick to say, well, we can work something out so we can, you know, give you the kind of tech that will enact these same practices of surveillance. And so a lot of my positionality, I'm thinking about me as a teacher in relation to my students and the kinds of ways that I want to treat them
00:29:39
and the kinds of ways I want to make them aware of the surveillance ecosystem. Part of that is that I see the relationship of teaching, and this is something that connects with Indigenous ways of knowing and teaching, that reciprocity is a key part of life. And I think that surveillance undermines that kind of reciprocity in terms of power dynamics. And so reciprocity, I'm live-tweeting a bit of this.
00:30:06
There's a great book called "Braiding Sweetgrass" by Robin Wall Kimmerer and she uses the word reciprocity 91 times talking about relationships between people and land and the future. And I think that every time we opt for surveillance or extractive technology, we undermine a chance for reciprocity between us and our students. Every time that we have a pedagogical situation
00:30:31
where we can be reading it and learning about them and interacting with them and building relationships with them, every time we give up that kind of back and forth reciprocity with our kids and hand it over to systems of surveillance, we lose deep and important pedagogical moments. And so as part of my positionality, I think very much about teaching them explicitly about this. So we just read one of Chris's essays
00:30:54
the other day in class. And the good news to report is kids are wise to this. We looked at a short video of Amazon's Ring and talked about the way that it was looking at public space. And are there legal things you might be doing in public space that you don't want caught on your neighbour's camera? And they were very much all over it, and one of them very quickly compared it to the kinds of surveillance that you might see in prisons. So that's a bit of where I am and where I'm coming from
00:31:20
and I'm really excited to hear what Chris has to say. - [Chris] Thanks everyone. It's a pleasure to be here and again, Benjamin, thank you. I mean, it's a real thrill to know that young people are reading my stuff and I wanted to build on what you said. I mean, one of the things I try to do in my work is I try to remember, first of all, what it was like to be in K through 12
00:31:49
or what it was like to be in college. And I try to think about, you know, what it means to not have access or to have access that is in some way constrained by the whims of other people. And so that's really important to me, like, and what surveillance means to marginalised people, right?
00:32:17
And particularly in the ways it is deployed against Black and Brown people and tends to have a disparate impact on them. And again, I draw on my own kind of youth and growing up Black in Detroit. And one of the stories I always tell or one of the things I always point to is there was an undercover kind of surveillance unit
00:32:41
in Detroit called STRESS. And it stood for Stop The Robberies, Enjoy Safe Streets. And in the span of two years, they killed 20 people; like 18 or 19 of them were Black. And I always think about that in terms of, you know, people tend to make some claims about how surveillance is either helpful
00:33:10
or at least not harmful. It's categorically false. But I, you know, I like to always, I mean, I wish I didn't have that example to point to. But I like to point to examples that display just how pernicious and harmful these technologies are. And to point out that they're an extension of things that have been going on for hundreds of years,
00:33:36
they increased the scale and the scope of them but they're not new, they're just different ways of doing a lot of the harmful stuff that we've been doing for a long time. - Thank you, everyone. I think that sort of gives a lot of different perspectives on this topic. And speaking of things that people use in their classes, I use Sava's films from Screening Surveillance in my class
00:34:05
and we're about to show you "Blaxites". So Jon is going to cue that up. I'm not really sure how that happens, probably magically. (dramatic music) (soft music) - Zola. One Americano with cream, Elm Cafe, pickup half past nine. - [Zola] Americano with cream has been ordered from Elm Cafe for 0930 hours. - Zola, renew prescription for Lorazepam
00:36:14
from the pharmacy on campus. - [Zola] Filing renewal. Your request has been denied. Calling Dr. Vontez. - Hello, Jai Dideero, your emergency appointment with Dr. Vontez is at 12:30 PM today. To confirm this appointment, say confirm. - Confirm. (upbeat music) - Jai, Americano with cream. Have a great day. - How are you doing?
00:37:06
- Honestly, I am kind of freaking out about finals. Are you coming to the library to study? - Yeah, but not till later. I've got to go see the doctor to renew my prescription for anxiety. - You're still having those panic attacks, huh? - Yeah. I can't flunk out of the semester too. I'll text you when I'm done, okay? - You can't, I got rid of my phone. - What! You dropped it again? - No, I opted out. - Opted out of having a phone? - Kind of like that but from everything. I got freaked out about how much was out there about me.
00:37:31
Like I posted about staying up all night playing video games and this company sent me caffeine pills. Can you believe that shit? So I deleted all my social media, I gave up my phone. I'll be at the library. I'll see you later. - Later. (dramatic music) - Hello Jai. I see you are here about your prescription. You were flagged as non-compliant for drinking while taking Lorazepam. - What?
00:38:44
Why? I don't understand. - The system indicated that you posted a photo of wine you'd been drinking and you were warned about drinking and taking Lorazepam when you signed up. - You're tracking my social media? - We make it a policy to keep an eye on our patients. We just want to make sure you're not doing any harm to yourself. - I never agreed to that. - Actually you did but we talked about this.
00:39:10
Okay. Even though you're non-compliant, I can set you up with a prescription at full cost. - I can't afford that, I'm a student. - Okay. Let me see what I can do. We have a programme where we track all of your activities and we set you up with a monitor. And if you're clean for 15 days, we can get you back on your prescription. - I can't wait 15 days.
00:39:43
My finals are in a week. - I'm sorry, Jai. I don't have any other options for you. (Jai crying) - They refused to give me my prescription because I had one glass of wine. - How did they even know? Did you tell them? - No, I posted it online. They track their patients apparently to make sure that they take their medicine properly. Now I have to wear this stupid thing
00:40:52
and stay clean for like two weeks. I don't know how I'm going to get through this without my meds. - The system is so messed up. They're watching us all the time, we have no choice. This is why I quit it all. Listen, I think I can help you out. Come with me. It's going to be fine. Don't worry. We got this. You'll have to turn off your phone before you go in. They have to keep moving locations. (dramatic music)
00:42:03
- Hi, can I get your cell phone please? Thanks. I'll put it in here and you can get it on your way out. Just take a seat please. - Number 79. - [Zola] Call from Dr. Vontez. - Hello. - Good morning, Jai Dideero. You have an emergency appointment scheduled with Dr. Vontez at 9:30 AM today. To confirm this appointment, say confirm. - [Zola] You have an appointment with university counselling services at 10 o'clock,
00:43:08
followed by a study session with Carl at 11 o'clock. - Confirm. Zola, cancel my counselling appointment and text Carl. I'll tell him myself. Hey doc, got some good news for me? - Good morning, Jai. I've called you here this morning because of some concerning behaviour we've noticed. - What is it? I haven't had anything to drink, which you already know because that damn thing goes off all the time,
00:43:44
even at night. - The system reported Lorazepam in your blood and we know it didn't come from us. - Wait, how do you know that? You're only monitoring my blood alcohol levels. - We monitor all activities including other substances in your system. (dramatic music) (Jai crying) (banging on the door)
00:45:03
(Jai crying) (soft music) - So we are going to go with another round of questions. This one's from me. Why don't we all take a quick, deep breath after that film too. That is one of the most impactful things I've watched. It really hit me deeply, both in my own experience as a student and a....
00:47:16
I'm going to tweet out this question too. And so my question is coming from something that Audrey Watters wrote and the link will be in the tweets. And Audrey wrote about how we're encouraged to confuse surveillance for care. And so my question is about how do we as educators
00:47:41
help to make our students aware of the differences between surveillance and care, make the other faculty we teach with aware of these differences and bring these differences to the forefront in this system where we're encouraged, as Audrey Watters says, to confuse the two. So first we go to Sava. - Unmute. Um, this is a perfect question, especially in terms of after watching "Blaxites", because I think my short answer to your question
00:48:15
is that that's exactly what we created these films for: to get people to notice the differences in what surveillance means versus care, versus surveillance within a context of care. In this story, and if you've watched any of the other films, there's the other film in the series called "A Model Employee". In both films, the protagonist has people in her circle
00:48:43
that care about her and look out for her. And in this film, for instance, her friend, Carl, really wants to help her, right? And he takes her to a black site to help her get meds because she needs those meds to survive school. And what ends up happening is that whole cycle starts everything else going wrong for her. And so in a weird way, within that context of care,
00:49:08
where he's tried to help Jai, he's inadvertently brought harm to her, and he didn't intend to, right? And I think that's another aspect of care versus surveillance that we have to think about. Which is, if I'm helping this person, what are the consequences of this action for that person? Because nothing happens to Carl. He's moved on with his life. He's not there at the end. She can't even reach him because he's gotten rid of his phone.
00:49:34
So it's very interesting because at the end she's all alone. And the only thing that she can think to do is get rid of everything that she has attached to her current situation. And in some cases there have been some people who've seen the end of this film as somewhat hopeful saying like, you know, maybe getting rid of all of that. And I think this was one of your students in your class, Maha, who said, you know, at some point getting rid of all of these apps and everything
00:49:57
maybe will allow Jai to just reconnect with herself in a way that she hasn't been able to because she's been surrounded by all these things and has to use all these things. In the second film, in "A Model Employee", there's a young woman who is working in a restaurant run by a very close family friend, almost an uncle, who's looking out for her. And in that context, he wants her to succeed in her job
00:50:21
so that she can get a job later, have these experiences and you know, level up as it were. And he forces her, well kind of, convinces her to put on a monitoring system that monitors where she is, what she's doing, how she's working. And he says that it's helping his business and it'll help her because she'll get perks if she does everything right. And even in both the films, in this film,
00:50:45
you can see Jai kind of being like, well, I don't really want to wear this thing. And in the other film, it's the same thing. She asks all these questions like, why should I wear it? Why should I have it when I'm out on my own, when I'm not at work? And so you can see that each person is questioning those things but because of who's asking them to do it, they trust those people, right? So in the second film, the young woman trusts her uncle because she believes he's looking out for her. And then she puts this thing on
00:51:11
and therefore opens herself up to various levels of surveillance. So it's important for us to think both in terms of, what are we doing to those in our care, in our family, as our students, in any community that we're a part of, what are we doing to them by forcing them to either get on Facebook, or wear a wearable, or do any of these things that bring more surveillance into their lives?
00:51:35
What are the consequences for them? What could be the ripple effects for us? What if we're in the situation where we're asked to wear something? Like I said earlier in answer to my last question, I'm fortunate that my employer doesn't force me to have, you know, all of these surveillance devices on me, but what if I'm in a situation where I'm forced to do that? What does that mean? What recourse do I have to this? Who has agency in any of this?
00:51:58
And actually nobody does. Like only the company, only Amazon, has the power in all of this, right? Because people who are employers are like, we have to do this because we've been asked to do this. So now they're pushing it further down the line onto us. And everyone along the line gets surveilled, and the person at the bottom is hit the hardest. So I think we have to look at it in many directions
00:52:21
in terms of ourselves, who's in our care and what we're doing to them. And in many instances, it's really hard. Like if you are, and I speak in an academic setting because I'm in academia, if you're an administrator and you've been told that this thing needs to be used. Right now, a lot of universities are using proctoring tools. What is the power for an administrator or a teacher or somebody in that position
00:52:46
to say, no, I'm not going to use this, if it's an institutionally mandated request? What can you do in that situation? And it's really hard, because at one level, I want to say, there's a good way, if you are unionised, maybe, to have that as a point that the union brings up and says, you cannot force someone to do this. A lot of student groups are rising up and pushing back against this stuff, which is great.
00:53:09
It's so heartening to see that people are rising up against this and saying like, no, we won't let this happen to us. So I think it's important while examining what those connections are, we also make sure that everybody is supported in their refusal of things. We need to allow for refusal. And I think it's really important. It's an aspect and an element of this stuff that doesn't get talked about enough.
00:53:33
Let people say no. And when they say no, they shouldn't lose access to other things. So I think I'll end on that for now and pass it on to Chris to see what he has to say. - [Chris] Thanks Sava. I mean, one of the things I would say is we need to be super explicit that surveillance is about control. It's not about care, right? So, simply put, the idea that surveillance is care
00:54:01
is a lie. It's a very carefully crafted one, but it is a lie. And I think the other thing that's really important to tell people or to highlight is the lie that's central to the surveillance fetish: that you can predict the future. Right, now if someone came to you
00:54:31
with, you know, like a deck of cards or a magic ball and said they could do that, we would rightly say that they can't, right? But now they come to us with AI and algorithms and people believe them all of a sudden. But the problem is, sorry, my dog is in the background, the problem is that that is a lie. And one of the things you'll notice
00:54:57
is that each time those predictions prove to be wrong, the claim is that there needs to be more data. So it's always that more data is going to somehow produce a more perfect prediction, but it's never true. It can't be true. It's actually not possible. And I think that we need to point these things out
00:55:23
and highlight them. And lastly, as Sava said, to talk about proctoring: it also needs to be highlighted that many of the claims about these systems leading to academic integrity or honesty are only borne out by the companies themselves. So what I mean is there's very little to no independent research that supports their claims about what these things do.
00:55:49
And again, I think that needs to be repeated early and often when we have these conversations. - So I dropped off for a second, for some reason my internet connection dropped, which is really relevant to this discussion actually, because these things happen to students. And if they're being proctored, whether you're doing it through a proctoring tool or through Zoom or whatever, if they've got connectivity problems, this just increases their anxiety.
00:56:16
And if you care about students, I think you care about not increasing their stress in the time of pandemic, in the time of trauma. The stress of being watched in itself in general, it's very different than being watched in person. So, I mean, there are issues with that too but it's not the same as being recorded while you're doing something where someone can play it back. And I think that the hidden curriculum of surveillance
00:56:40
has a lot of those long-term harms. And I'm going to say something really simple. My child said something to her PE teacher recently and her PE teacher didn't believe her. She said, "Get me a medical note." And she made her do something that was not medically good for her. And my child comes home and she's not upset that she did the thing, she's upset that she was considered a liar offhand. And you think about the harm
00:57:04
of treating students as cheaters before they ever do an offence, and what the long-term consequences of that are. My students are saying, "The more the faculty try to surveil us, "the more we're going to find a creative way "to go around it. "But if they assess us in a way that shows respect "for our minds, to recognise that we can collaborate, "that we can use the internet because that's what life is, "we're going to do the work because we're going to be learning."
00:57:29
But just trying to be, you know, I'm sorry, I know it's a lot of work moving things online and I know it's a lot of work to do an assessment that makes sense, but that's kind of what you do when you care: you try to do that work, put the effort into that rather than putting the effort into trying to catch students. So that's my take on it. - And my own quick answer to this about surveillance and care
00:58:00
is something I think a lot about. I'm teaching kids probably closer to the age of Maha's kids than to the university students. And I try to be really explicit with students about ways that they can opt out and ways that they are parts of systems of surveillance. So, you know, for example, I tell them that teachers have access to a dashboard that will let them see everything in their Google Docs.
00:58:24
And so if you want to keep things private, then you need to put it somewhere else. And I try to be really explicit about those things with students. One of the things I'm hearing coming through from quite a few people, like Sava said, is about allowing for refusal, and I think one of the differences is that real care will listen to you when you say no. And I think that's what a lot of surveillance and surveillance practices don't do. So I hear teachers say things
00:58:49
like kids have to have their Zoom on so that I can see if they're paying attention. There was one prominent case this summer where a teacher said that she needed it on so she could physically see her "low-income students" to make sure they weren't being abused. And a lot of this kind of thing just refuses to listen to people who were saying, no, I don't want to be seen, or, I can learn without that. And I think that that's something I try to be very explicit
00:59:18
about in teaching my students: those differences, as Chris said, about control, the ability to opt out, the ability to say no, and to be suspicious of people and companies who aren't listening to you when you say no and refuse. That's a red flag that there is something wrong in that relationship, or that it is not a real relationship. And now we're over to Sava for your question, right?
00:59:44
- Thank you. I think it's so important to think about this care and surveillance. So thank you so much for your question, Benjamin. My question, well, given my cynicism and the particular critical work I do, I'm often asked why I focus on the dystopian. A lot of people see these films as dystopian. And my answer is usually that none of the work I've done is actually dystopian but instead it reflects a lot of where we are now.
01:00:12
And I see that in the way that people relate to the films and see themselves in those stories. The number of times I've heard people like, Oh my God, this happened to me. And Maha just talked about her students and people. And we all had examples that are similar to the things that are happening in these films. So I don't see my role necessarily as providing any answers. I mean, my answer usually to most things
01:00:36
is to burn it all down. And I know Maha hates that and we've had a lot of conversations about that. But instead I want to use these stories to get people to think about their own lives within the larger oppressive systems and how individual choice doesn't always do the thing we think it's going to do but that the systems are the ones that need that change. So a lot of people say like, well, if you don't have anything to hide, you should be okay.
01:01:00
And it's like, that's not quite what it's about. It's not about each individual person. It's about a collective system that's trying to control all of us in particular ways. So I understand the importance of offering examples of things that might move us closer to a more equitable and socialist community. And in so doing, we can also find solidarity with each other and those folks in the other examples. So I want to ask my friends here,
01:01:26
can you share an example or a story that can offer guidance and maybe hope, and not the kind of story that is about someone working four jobs to buy their children Christmas presents. That's a horrible story. No one should have to do that. But a story of a community coming together to effect change on larger scales, something that's wrested power away from the hands of capitalists and corporate greed. And I will be honest, like my quick answer to this
01:01:52
is like, I've seen a lot of stories of young people really resisting and really pushing back against a lot of stuff. And honestly, young people today give me so much hope, and maybe that's because I'm getting older and I'm just like, I hope someone will do this. And I see these young people are really stepping up and I really, really appreciate so much of the energy and love that they bring to so much of this stuff. So my question is, give us an example of something you've seen or done recently that has really effected change and pushed back
01:02:17
against our surveillance state. - I can give a tiny one because I'm sure Benjamin and Chris are going to give a bigger one. So last semester, we were trying to push back in my institution against, you know, adopting proctoring software. My centre were pushing back against it and trying to work on alternative assessments. But some faculty just pushed so hard that we said, okay, we will adopt it. Not me, someone else made the decision
01:02:43
to adopt surveillance software. Around that time, Jesse Stommel will remember this, we published Shea Swauger's article about algorithmic test proctoring. I hope someone picks that one out, it's such a good article. And I read it with my students the day after this happened; like, I changed my lesson plan, we read this article together and we talked about the harms of this and how they felt about it. And one of my students took this article
01:03:09
and organised with other people in his department to make like a decree in the department that no faculty would use proctoring software when they were planning to try it. So I was really proud of the students for taking action in that informed way, you know? It's not a huge win, but it was a win for them, and that's enough to make me proud of them for doing that. - That's amazing. And it must be so empowering to like take up a cause like that
01:03:34
and actually have something done, right. Have people pay attention and listen to them and just make that change. That's really amazing. Benjamin, do you want to go next? - Sure. A lot of the stories I've been trying to read and learn much more about are a lot of the indigenous movements of resistance against surveillance, and they continually give me hope and they have been ongoing.
01:04:02
And I think that is one of the things that often gets written out of Indigenous histories, it's the narrative that Indigenous people are gone or the resistance gets written out of the narratives. And so a lot of the contemporary movements that are pushing back against surveillance, and at the same time against colonialism and capitalism, they trace roots back to, you know, hundreds of years
01:04:28
of resistance movements. So these give me a lot of hope, especially because of the ways that people will come together across Turtle Island, across North America, globally, in solidarity. In Southern Ontario, where I'm from, there is a coalition in Windsor of lawyers supporting these movements,
01:04:52
and a couple of hundred kilometres away, near Caledonia, there's 1492 Land Back Lane. And the ways that these movements are organic, grassroots and interconnected, and it's real activism on multiple levels; they're often led by teenagers and elders. And I think that that is a rare thing to see, that kind of solidarity.
01:05:17
And again, I'm tweeting out some further things to read about it. I'm not the person to take you in full detail through each of the movements. I'm not the person on the front lines of this at all, by any stretch. I'm trying to tweet out some of the examples of people who are, so you can all listen to them. And in Canada, I've noticed particularly, these movements are met with increasing levels of surveillance and prosecution, much like how, here, Linkletter is being sued
01:05:49
and needs to get legal funds. The same thing happens. And there's a similar strategy of targeting individuals and trying to pick off individuals. And it's usually leaders of movements, and in some cases journalists too. So Indigenous journalists who have gone to tell the stories then get targeted by police and by surveillance and get prosecuted for doing their basic journalistic job.
01:06:13
And I think that that is the whole irony of surveillance: from the point of view of the police, they can watch everything with drones, but when an Indigenous journalist shows up and wants to report it, they'll get charged. And so, yeah, I tweeted out a few links to movements like that that give me a lot of hope. And we're over to Chris. - [Chris] Hey. Yeah. Thank you. And I just wanted to quickly point something out
01:06:41
about the previous question, which is, you know, I think instructors and educators will widely understand that surveillance is not care when it's deployed against them. I mean, this is a point I try to emphasise: if there's no other reason that would dissuade you from these systems, it should be, again, widely acknowledged
01:07:08
that it's not only that they're being deployed against instructors now, but that that practice is likely to increase in the future. And so, yeah, I just wanted to point that out. But in terms of your question, Sava, I know that this is a weird thing to say, given how time flows, you know, that the last eight months either seem like they've only lasted a couple of weeks or several years of our lives.
01:07:37
But one of the things that kind of gives me hope is to think about how recently the idea of abolition, of certain technologies, right, and of certain policies or of certain institutions, has become a very real talking point. For so long,
01:08:05
if we look at like the discussions about facial recognition, people would use the phrase, which I don't like, but, you know, I'm quoting what other people would say, that the genie is already out of the bottle. And, you know, I mean, if you read the stories, you know that a lot of times the genie goes back in the bottle. (Chris laughing) And so, you know, that's where I draw hope.
01:08:32
And certainly, as other people have said, so many of these things are led by young people and students; so much of the pushback is from students. And for years, we've been gaslighted by people telling us that students don't care about this stuff. And we know now, very clearly, more so than we did before, that that's not true. - Very much so. Thank you everybody for your answers. And I think, you know, talking about how hopeful we are
01:09:05
that young people are stepping up for this. I think it's important for those of us who are able to create spaces for support for them as well, to reach out to the young people and say, listen, if you're doing stuff and you need help, I'm here to help you, how can I do that? And this goes back to our very first question about positionality and privilege and, you know, who we are in relation to people who are refusing or fighting or doing all that stuff and how we can support that.
01:09:28
So I think gathering those of us in the academy, for instance, if faculty and other folks can get together and really, you know, just show up in support for students who have to deal with this stuff. And I think the other thing too, is like, it's coming for us eventually. A lot of people say, well, it's not happening to me right now, why should I bother about it?
01:09:53
It's coming for you. Like really it is, it's already there and we don't even think about it anymore. I think Chris pointed out that in "Blaxites" there's a moment where she hands over her phone to be put away in a pouch but she forgets to hand over her monitoring device. And it's like, right, there's more than one thing that's tracking us at all times.
01:10:20
And we don't always pay attention to that stuff. And even in that film, she has a voice assistant. This is also something that Chris pointed out and we take that for granted in the film. You know, not a lot of people have pointed that out. So it's also understanding that there are many, many, many different things that are surveilling us at different times. And what are all those things, and how are we protecting ourselves and supporting other people to push back against those things? And I think that's me.
01:10:47
- I think Brenna is gonna join us to share some of the questions she's been curating. Thank you so much Brenna for handling that. - Thank you guys for your fantastic questions. And I think Jon's going to flag them as they come in. A question that came up, I'm going to start with the two questions we've had come in from students. So the first one is a student who actually asked the question both on Twitter and on our email feed.
01:11:13
Which is: "I'm grappling with my orientation to EdTech within an abolitionist framework. My question for the panel: where do you situate yourselves and your work in relation to the carceral state?" And then she also expands that question to include capitalism and colonialism. - My answer is very short. ECAB, always ECAB. I think abolition is very important and I would encourage
01:11:39
everyone to follow Mariame Kaba's work. She's @prisonculture on Twitter, she does amazing work and just, I'm always so inspired by the stuff she's doing. And I've learned so much from just following her on Twitter and reading the stuff that she does. And I think abolition is very important. There have been movements at universities where they're asking for no cops on campus. And I really support all of those efforts.
01:12:08
- I'll jump in. I mean, I'm on record as saying, I'm not the only one certainly, but that EdTech is carceral tech. You know, often they're the same companies. They have the same policies. You know, they have the same end goal. And so to the extent that, you know, again, I think it's really important to talk about tech
01:12:36
that shouldn't exist or tech that we should dismantle and get rid of. You know, whether that's ankle monitors or, you know, remote proctoring. I think they run parallel to each other. Then again, like often they're exactly the same thing. And I think we should say that. - Any other thoughts on that question? Okay.
01:13:05
We've got another great one that came in from a student leader at UBC, who sent this one in via email. Some of the common counter-arguments I get when we stand against proctoring software is, how else can we preserve academic integrity? How do we protect other honest students from the cheaters? I never know what to say other than that we need to reframe the way we teach. And I would appreciate your insights on this. - I hate it. That one I'll jump on right away.
01:13:35
It's such a bullshit premise. Like if you look at what real academic work looks like as you move up, you're always doing things in relation to other people and you just need to teach kids how to situate their work. And I think the work of pedagogy is to do that. We've been building a lot of our ideas about knowledge on what people call sequestered problem-solving
01:14:02
that we're going to like lock you away from all the resources. And you just have to answer the question or do this task without any of these things. And there's been some great work in learning sciences, which isn't something you'll hear me say very often, that shows that if you ask high school students and people who have been through college a challenging question about how you take birds and put them in a new habitat,
01:14:28
it makes it look like college students haven't learned anything because they can't answer it. But if you give people the chance to then rely on the wider pedagogical things they have, such as asking further questions to help them clarify what they should be doing or thinking about, then you see these differences in education, as soon as you give people the ability to access those wider things. And so no one in the real world of academia
01:14:51
is ever going to do these kinds of isolated tasks where they wall themselves off from anything that could be plagiarised. And so I think that you need teachers who can better understand what academic writing and integrity and purpose look like. That's a pedagogical issue where teachers are falling down. - [Chris] Yeah. So let me jump in here. I mean, again, like it's a very carefully crafted lie.
01:15:20
I mean, if you believe that Facebook is about connecting people, then you also believe that proctoring is about academic integrity, right? Like neither of these things is true. But the other thing I would point out is, you know, I would ask people what level of invasion and harm is worth whatever you're thinking of as academic integrity. I mean, there are terrible, terrible examples
01:15:45
of people having panic attacks or having to shine bright lights in their faces or having, you know, remote proctors say creepy things to them or people having to relieve themselves in bottles so they don't have to get up from the chair, on and on and on and on, right? So like, let's get this out of the way. Like again, there's little to no evidence
01:16:11
that these companies do what they say they do, that they maintain integrity. But even if for some reason you want to accept their lie as the truth, right? You want to accept their framing. What level of harm would make you say, "Oh, okay, well maybe we should think "about doing this differently." And for me, the harm that's been proven over and over and over again, that in itself is enough for me to say
01:16:36
we should think about doing something else. - I think a couple of things I wanted to bring up. One, it's so weird to call people cheaters without looking at why they may resort to cheating. And I think it's very important to be like, okay, why does this person feel the need to cheat in order to succeed at whatever task you set them. And that should answer your question in terms of how to structure your work
01:17:02
and your teaching and your pedagogy in order to not get people to even think about what cheating is. Like, it's so weird that so many of these things are aimed at controlling cheaters rather than being like, all right, how do we make this a really good learning experience for them? And how do we really get at what they're learning? So that's one thing. And just to echo what Chris said,
01:17:26
the line that seems to be drawn, like, where we're drawing the line, seems to be shifting every day and it's very, very unnerving to watch that. You know, like Chris was giving all these examples. There was one example I remember of a young woman saying that she had to take a photo of her lap in order to take a test. And to me, that's like, you're asking a young person, a young woman, to take a photo of a part of her body.
01:17:53
That's now going to be owned by a company. And she's at a university, and universities have things like Title IX in the U.S. and other places. It's just like, you're trying to protect people in these situations. And then you're saying, take a photo of your body, young woman, and send it to me so that I can verify this. Or you have people who sit there and watch other people take exams. And it's just like, we cannot normalise these things.
01:18:17
These are not normal things, they really aren't. And I think we really need to be pushing back strong and hard against these things. And I'm really glad that we're doing this particular teach-in and this particular event to raise that awareness a little bit more and get people to be enraged. I think it's important to be angry and to push back against this. Anger is useful and it brings change and we should all be angry. - Yeah.
01:18:42
I think every time we talk about academic integrity, we need to ask ourselves what it is we're really talking about. What's our purpose? What are the values? Because catching something is not solving the problem. That's not the root of the problem, it's not the cause of the problem. And all the things that you would do to promote academic integrity that do not involve surveillance are probably better pedagogy. And so, for example, this is not the same as proctoring,
01:19:05
but turnitin.com teaches students about plagiarism in a technical way as if it's a technical thing whereas you should be working on explaining to them the values behind attribution and the importance of being proud of the originality of your work. We're teaching it the wrong way around. And so what they're getting out of it is, oh, so I need to just not get caught. Again, that kind of problem. And I think someone,
01:19:28
I don't know if this is the same question we're asking, but I know someone was asking, when people say, how do I maintain academic integrity? Every time I've worked with a faculty member to help them think about how they might modify their assessments so that they're more, I don't want to say more difficult to cheat, like they're authentic learning experiences that students will want to do on their own. They always end up happy because they realise that they're assessing something more valuable
01:19:54
than what they were assessing before. So it's also about what's the learning experience. So you make your choices about your values and then you figure out how to do them. You don't just say, this is the easy way out, I'm just going to do it even though it's not perfect. You go with what's the most important value or what's the most important thing you want to do or want to avoid. Don't go there, you know. - So we had one more student question come in that I'm going to just sneak in ahead of some of the other questions we're fielding.
01:20:22
It came in from Twitter, from Sinead Doyle, and she asks, "Is what we saw in the film "an example of what's happening in the world now "or an example of where you see things going?" - I think I said this when I was asking my question, about people asking me why these are so dystopian, and that's exactly right. It is happening now. There are already insurance companies and other places that are like, unless you have this wearable
01:20:49
and let us track everything you do, you're not going to have a lowered rate of coverage, as in the money you pay. So they're doing that in terms of incentivising people and saying that you don't have to pay as much if you wear this thing and we can track you all the time and stuff like that. So that stuff already exists. I think the larger issue of how all of these systems are interconnected
01:21:13
also already exists. Her social media was being monitored. And that's also already being done, both at institutional levels, in terms of tracking academics and what they're saying, in terms of students and tracking what they're doing and saying, in terms of companies tracking you and doing that, in terms of, I don't know, governmental agencies
01:21:38
tracking you and stuff. So I think all of the things depicted are happening right now, and what the films are trying to do is make that more explicit in terms of how those things are connected and how they can affect you. So it's a bit dire, and I apologise for not warning everybody before the film played, saying this is a bit dire, so hang in there. But yes, these are things that are happening right now.
01:22:05
And that's the thing, all of these films feel like they're near future, and they are, but also, are they? More people than I can tell you have told me how closely these things resonate with and affect their lives. Sorry, that was a long way of saying yes. - It was good. Does anybody else want to jump in on that question about the film before we move on to others? - [Chris] It's there now. I mean, there was just an article today about tech companies
01:22:33
that are becoming life insurance companies. And one of the ways they're going to do that is by aggregating a bunch of third-party data about you, in order, you know, like someone's going to be like, "Well, there's actuarial data that does that now." But it's not the same. So there's nothing, nothing in "Blaxites" that couldn't be or isn't being done right now.
01:23:00
- Yeah. Seeing some chat in our backstage area here, we might mention how tracking on social media is why we are having this event today, right? A company not feeling very positive about the way their product was being portrayed on social media, and the consequences thereof. Here's one that came in via email. It was a question about whether or not anyone on the panel
01:23:27
wanted to talk about the news that came out last week around Microsoft 365, which of course is ubiquitous in universities and in businesses, and the productivity scoring for workplace surveillance. - [Chris] Yeah. I mean, it's a pretty open-ended question. A really great researcher named Wolfie Christl pointed this out and a lot of media picked up on it, that there's a productivity score
01:24:00
that Microsoft offers that is pretty clearly, you know, a surveillance tool. And it turns out that Microsoft even has a patent out, right? A patent doesn't necessarily mean that it's going to make the product, but it has a patent out for something that would do things like track people's biometrics, you know, their facial expressions, the temperature of the room and all these other things
01:24:24
in order to assess how well a meeting went. And so, again, to my point earlier, if you think this is great for students, you know, I don't think you'll have the same opinion when it's turned on you. And that's not next year or five years from now, that's right now. I mean, there's a company that famously bragged that they could prevent workers, teachers,
01:24:49
from unionising and striking by monitoring their social media. This is a prominent EdTech company that's used in K through 12. So this is now. So yeah, it's terrible. These are things that shouldn't exist. There we go. - Thank you. Here's one and it's one that I think about a lot. I was pleased to see it come through the chat. Now we've had such an active conversation
01:25:16
in the comments that we've lost the ability to pull older comments forward into the video screen. So I apologise for that. But the question was, "A lot of our higher ed organisations are hosting conferences that are sponsored by surveillance tech. That money is what funds very important work. So what do we do about that?" - That's a really rough question. - I need to tell two stories.
01:25:44
The first story involves Autumm Caines, Michelle Pacansky-Brock and, I'm sorry, I'm forgetting her name right now. I was giving a session, not at a huge conference but at a small one; I was later keynoting a larger conference. At that small conference, we discovered that one of the partnering companies was a sponsor, a little bit too late to do anything about it.
01:26:08
And the person came in late. And then he spoke in the middle of our session as if he's a normal person. I'm sorry to say "as if he's a normal person," because of course he's a human being, but he spoke up without mentioning his affiliation, sort of advertising for the company in the midst of a session that was about care. And can you imagine? It's one thing for the organisation that was organising the conference
01:26:32
to take that money, but to have them sponsor a session about care and to allow them to speak? We later talked about how big of a mistake that was. There were students in that session, and they were like, what's going on here? It was Melody Buckner, by the way; thank you, Michelle, for telling me on YouTube. And at the end, they were trying to frame what they were doing as giving agency to students.
01:26:56
And of course, there's nothing related to agency about what they're doing. They're even sort of, you know, hijacking this kind of space to try to sell what they're doing as working with what we're doing. And so, I'm not the person who's responsible for funding these kinds of things and I empathise with people who need to get the money to do these things. But for my other keynote, I spoke to the organisers
01:27:20
and they removed the proctoring company from at least sponsoring my event, because a lot of EdTech conferences have people who hold different values. But at least the people who have these values should not have to endure being sponsored by someone like that. Especially if you're not paying those keynote speakers honoraria. You're not paying me money and you're going to get a sponsor like that? So this is one thing.
01:27:43
I'll leave others to comment on it. - I think an important thing to bring up too is that a lot of these folks don't see themselves as being evil. It's like that GIF, you know, the comedy GIF where they're all dressed in Nazi regalia and one of them goes, wait, are we the bad guys? And I think that a lot of people don't get that what they're doing is harmful. To be generous to them, they genuinely think that they're doing good in the world.
01:28:10
And in addition to that, a lot of their identity and job and work and family and everything is tied up in this thing. So it's really hard to let go of it. And this is capitalism at work, right? The other thing that's capitalism at work is that all of the money gets, you know, it gets siloed into certain big companies and they're the only ones who can sponsor this stuff so it's also that. It's like if you want to run a conference,
01:28:33
it's very expensive, and who are the only people who have money? I'm being so good at not swearing on this session. I have to tell you. Microsoft obviously has a lot of the money. So they're going to put that out there. And they have, you know, philanthropic arms and stuff which is just like, oh, here's a tax dodge. Let's give money to this good thing. Like, let's give money to academia. And then that way we'll be able to write it off, et cetera. So it's like, all of these systems are so big
01:29:00
and so complicated, and we do need to push back against it, unravel it, separate it out and be like, no, you can't have all the money. No, it can't be in corporate hands. No, you can't be the only ones funding this stuff. So it's very complex, it's very difficult, but we need to do that work. And I think the example Maha gave, asking for her session not to be sponsored by that company,
01:29:23
that's a small way to do that, but at least having this conversation is important. Though I shouldn't just say that; no, we need to burn it all down. Really. - [Chris] I mean, I would just say that I tend to be a hard liner on these things, you know? I mean, would you take money from Palantir? The answer should be no, right? My answer is no. But I think it's also important to note
01:29:51
that part of what these companies do with their money is they attempt to drive the direction of education. You're not taking the money with no strings attached, right? You're taking the money, and in some ways that's helping to drive the conversation about what education should look like. And then that's coming from venture capital and from tech bros
01:30:17
who think that because they went to school that they know something about education and we got to resist that. - Okay, we've got another question from a student that's coming via Twitter. How do we stand up against the ableist aspects of surveillance without being targeted for our disabilities or making ourselves vulnerable to authority figures that either don't understand or think it's more important than protecting academic integrity.
01:30:50
- I'll jump in with a quick one on this. I teach students who have a variety of learning needs, and you've got to address this at a fundamental pedagogical level. There's a really great framework called universal design for learning that's about designing learning to be inclusive, much like we design physical spaces to be inclusive, so that they're as accessible as possible for everyone.
01:31:18
And I think that has to be a key part of this. For students who have different learning needs, the burden shouldn't be on them to come ask for it. Teachers should be proactively planning in all these ways to make things accessible, to have audio versions of things, to make their notes available, all those kinds of things, so that it's not something that students have to come to teachers for. So I see that as a really fundamental pedagogical issue
01:31:47
that teachers need to take up, and that institutions then need to support, because it's a lot of work to do that kind of designing for everyone. And so really institutions need to support educators in doing that too. - So you mean making it systemic, right? Don't make it case by case, where someone has to come forward and say, I have a disability, help me. It should just be the way things are, like, for example, captioning videos
01:32:14
and that kind of thing. And people don't have to explain why they need to use it. It just should be available to everyone and they can choose it when they need to. - I think I saw a comment way back in the comments talking about students with disabilities who had to struggle with some of the technologies because they were required to do certain things that they could not because of their disabilities
01:32:38
and therefore were shut out of being able to participate in their learning online. And I know that one approach to that is to say, all right, so how do we fix proctoring so that disabled students don't get left out, and that's the wrong approach, right? We just don't want proctoring at all. And I think, as both Benjamin and Maha were saying, the focus should be how do we make all of this stuff
01:33:04
accessible to disabled folks? And everybody else will take care of themselves. If our aim in creating things is to serve those of us who need to access them in particular ways, then those of us who are more able-bodied are able to do that anyway. So I think rethinking how we design everything and starting at the beginning
01:33:29
with how do we make this accessible to people with certain kinds of disabilities? - Those furthest from justice. - Sorry? - Those furthest from justice. - Yes. - I love that line. Thank you, Maha. - [Chris] I mean, what I would add is that one of the things that my work with surveillance has shown me is that when these systems come, when they are deployed,
01:33:58
the most harmful effects land on the most marginalised, the earliest and in the most harmful ways. And you know, to me, these are the kinds of things that make these systems a non-starter, right? It's my firm assumption that these systems can't be deployed properly, and if they can't be deployed properly
01:34:23
in a way that's not ableist and racist, they shouldn't be used at all. And so, yeah, that kind of builds on what Sava said. And I'm so grateful to people like Shea who have pointed this out, you know, or highlighted it. And also the students who have highlighted it, and some of them have had to do that not because they wanted to but because they've been outed by the tech. This fact alone of how it discriminates,
01:34:55
how it's ableist technology, means we actually don't need any other reason not to use it. That's a reason, and it's the only reason we would need. And I think it's troubling that that's not widely accepted. - I think that's a pretty phenomenal place to end a pretty phenomenal panel. Sava, Benjamin, Chris, Maha,
01:35:25
thank you so much for all of your thoughts and for joining us here today. There's a lively conversation going on on the hashtag, which I know you will all find and participate in, I'm sure, and in the YouTube comments. So I'm going to say thank you. And I'll introduce our next speaker. - Thank you so much. - Thank you. Okay. So our next speaker is Jesse Stommel. Jesse, I'm going to give you a little introduction and then I'll let you do your thing.
01:35:52
Jesse is co-founder of Digital Pedagogy Lab and Hybrid Pedagogy, the journal of critical digital pedagogy. He has a PhD from the University of Colorado Boulder. He's co-author of "An Urgency of Teachers: The Work of Critical Digital Pedagogy". Jesse is a documentary filmmaker and teaches courses about pedagogy, film and new media. Jesse experiments relentlessly with learning interfaces, both digital and analogue, and his research focuses on higher education pedagogy,
01:36:17
critical digital pedagogy and assessment. And I'll get out of the way and let you do your thing. Thanks so much for being here, Jesse. - Thank you. It's good to get to spend some time with you all, this morning, afternoon, evening, wherever you are. I'm going to try to share a screen briefly and see how that works. Let me just get that going. Okay. There we go. I don't know if someone,
01:37:11
can, you all see that? Yes, there we go. So, I'm going to talk to you a little bit about, I want to talk about civil disobedience a little bit. That's something that came up in the last panel and that's just going to be a small part of what I talk about but I'm hoping that this comes up in the Q & A and that it continues with the next speakers. It's been really wonderful listening to the previous panel.
01:37:35
And I found myself actually rearranging these slides as I was listening to them, because on the fly I was having new ideas, being inspired to think about this in new ways. And let me just start by putting up two quotes that I've juxtaposed here. I'm thinking about them alongside one another, and I've never juxtaposed these two particular quotes
01:38:00
in quite this way before, although these are two quotes that I use frequently and that resonate quite a bit with some of the topics we're talking about. Henry David Thoreau writes, "Let your life be a counter friction to stop the machine. If it is of such a nature that it requires you to be the agent of injustice to another, then I say, break the law." And bell hooks writes in "Teaching to Transgress",
01:38:30
"As a classroom community, "our capacity to generate excitement "is deeply affected by our interest in one another, "in hearing one another's voices, "in recognising one another's presence." I'm thinking about this idea of resistance and this idea of how we say no and who gets to say no. And about the sort of privileged spaces from which that act of saying no can happen.
01:38:53
And as I was thinking about that, I was thinking about all the people who can't say no, oftentimes students, who are in a position where saying no is either impossible or feels dangerous. And I've written quite a bit about the difficulty EdTech critics often face: it's hard to critique EdTech without experiencing repercussions. Obviously that's part of the theme of this entire event,
01:39:20
is how do we do this critical work of inspecting and analysing and pushing back on EdTech while also maintaining our livelihoods. And I think that something here in bell hooks' quote resonates with me: this idea of community and how important community is. She talks a lot about bringing our full selves to the work of education, although she pushes on that by talking about, again,
01:39:43
who has the privilege to bring their full selves, and what kind of risks we face when we bring our full selves. So when we resist, who's able to do that? And I think there's something here about doing that collectively. When we come together and can engage in this work as a community, I think that we're more powerful and that we're able to push back more effectively. So over the last 10 months, I've been disturbed to watch many institutions
01:40:08
actually cut faculty development budgets while massively increasing spending on LMS contracts, proctoring solutions, plagiarism detection software, cameras in classrooms and video conferencing tools. Some of you may have seen some of these ideas in several tweet threads that I've done recently about proctoring. I've really been thinking increasingly about this idea of how institutions show what they value
01:40:35
by what they invest in, by what they're willing to support both financially and infrastructurally. And I often find that what they will support is directly at odds with what they say their missions are. So they might say their mission is one thing, but it's clear from their investments that their choices are not lining up with their mission statements.
01:40:59
The remote proctoring industry is expected to grow from a $4 billion market in 2019 to a nearly $21 billion market in 2023. Massive investment, massive spending. And where's this coming from? Who's paying for these companies as they're birthed and then become billion dollar companies almost overnight?
01:41:23
Students are often paying, institutions are paying. And where is that money coming from? How much money are these institutions investing, and where are they taking funds away from in order to invest in tools like proctoring solutions? The companies themselves are audacious in the way that they're willing to brag about their numbers, brag about how fast they're growing. And honestly, I find myself appalled
01:41:49
when I see some of these CEOs talking about their numbers and the sort of glee with which they talk about how fast this technology is growing. Here's a page from the ProctorU site where they talk about just how many exams they've proctored. And every single one of those is an actual human being engaging with that exam on the other side. And yet they talk about it as exams,
01:42:13
reducing this to the transaction of this set of exams that these students are engaging with. Again, most of our assessment mechanisms in higher education don't assess what our institutions say they value most. Here's the page from Honorlock. They write that their mission is to safeguard academic testing services. And if you've seen the Proctorio website,
01:42:37
they talk about being first in academic integrity. They say, I think even on the front page, that they are the first academic integrity platform, which is a very strange claim to make. Honorlock writes, "Your audience is watching. Prospective students, accrediting bodies and candidate employers want to know if your university's degrees
01:43:01
"are academically sound" Lots of discussion on all of these websites about how institutions prove their legitimacy and that these are the tools that are gonna to help these institutions prove their legitimacy because the degrees that they offer have currency and the students who get those degrees, they can spend that currency with increased wage growth
01:43:25
over the course of their lifetimes. This is the dream that tools like Honorlock and Proctorio and ProctorU are selling. And yet, look at the institutions' own websites. The University of Wisconsin-Madison, where I was a faculty member, is a customer of Honorlock, and a customer that Honorlock brazenly touts on the front page of their website.
01:43:50
From the University of Wisconsin mission, the primary purpose of the University of Wisconsin, Madison is to provide a learning environment in which faculty, staff and students can discover, examine critically, preserve and transmit the knowledge, wisdom and values that will help ensure the survival of this and future generations and improve the quality of life for all.
01:44:15
So it's also a really bold statement, but it's interesting to me how that doesn't line up with what proctoring software companies say they are about, which is turning an educational degree into a form of currency or a kind of legitimising exercise. They focus on the reputation of the institution itself, whereas the University of Wisconsin-Madison says that it has this broad vision
01:44:39
of improving life as we know it. And Honorlock's method for ensuring integrity lacks integrity itself, going further than assuming students cheat by actually baiting them into it. They have patented the use of honeypot websites. Here's an example of a patent that they filed, where they are patenting the use of honeypot websites which offer answers to exams,
01:45:04
tracking students accessing these pages and then giving them incorrect answers. So they're luring students, baiting them into cheating. And interestingly, that certainly helps their bottom line, because if they can lure more students into cheating, they can talk about increased cases of academic dishonesty, and then use those numbers to sell themselves back to institutions.
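To make the mechanism concrete: here is a minimal sketch, under my own assumptions rather than anything in Honorlock's actual patent or code, of what a honeypot "answers" page needs. It comes down to a page of deliberately wrong answers and a log of who requested it; every name below is hypothetical.

```python
# Hypothetical sketch of a honeypot "answers" page: serve deliberately
# incorrect exam answers and log every visitor who requests them.
from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer

FAKE_ANSWERS = b"Q1: B\nQ2: D\nQ3: A\n"  # intentionally wrong answers

class HoneypotHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Record the visit: timestamp, client IP and the requested path
        # (which could carry an identifying query string).
        with open("honeypot_hits.log", "a") as log:
            stamp = datetime.now(timezone.utc).isoformat()
            log.write(f"{stamp} {self.client_address[0]} {self.path}\n")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(FAKE_ANSWERS)))
        self.end_headers()
        self.wfile.write(FAKE_ANSWERS)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), HoneypotHandler).serve_forever()
```

The point of the sketch is how little it takes: the "detection" is just a server log, and the wrong answers do double duty as bait and as evidence.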
01:45:31
So if you google, "Is cheating on the rise?", and this is what most of these companies talk about, how cheating is on the rise in the age of the internet, or the age of computers, or the age of Google, or the age of paper mills, or the age of COVID as more and more institutions pivot online, a Google search finds a half dozen articles like this one
01:45:56
from the Washington Post, which actually cites data reported by ProctorU about the rise of online cheating. So they don't even find any sources outside of the tools themselves. Chris talked earlier about how institutions do not demand that these tools prove that there actually is a problem with cheating, or that these tools can provide a solution to that problem.
01:46:22
I'm not going to share the data that shows up in this Washington Post article, in part because I feel there's a sort of journalistic irresponsibility in the Washington Post even sharing it. If ProctorU's revenue model hinges upon the perception that cheating is on the rise, and they can conveniently prove that by being paid to proctor more exams, then going to ProctorU
01:46:46
and asking them whether cheating is on the rise is probably not the best source to go to. So let's actually look at this question: is cheating on the rise? This is from Lang's "Cheating Lessons", where he gives a really good account of this, and I'm going to summarise it very briefly. "In 1963, Bowers surveyed roughly 100 institutions and found that 75% of the surveyed students
01:47:10
"admitted cheating at least once in their college careers. "The numbers have not changed much since then. "In fact, McCabe et. al. in 2012, "Cheating in College", it includes findings from 150,000 students recently surveyed "showing that between 60 to 70% of respondents "admitted cheating." So this idea that cheating is on the rise is sort of false premise
01:47:35
from which all of these proctoring tools are born. It doesn't mean that cheating doesn't exist. It means that selling us the idea that cheating is on the rise isn't exactly truth in marketing on their part. Wiley surveyed 789 instructors for their 2020 report on "Academic Integrity in the Age of Online Learning", and 93% of those instructors
01:48:00
felt students are more likely to cheat online than in person, with emphasis on the word felt. What follows in that report is a guide to discouraging academic misconduct, without any sources for an actual rise in cheating beyond the imagination of those surveyed. So oftentimes we're relying on the internalised fear and the cultural message that instructors get
01:48:25
that cheating is on the rise in order to feed these tools, to feed this response. This is one of those absolutely ridiculous comments by one of the proctoring CEOs. This is Scott McFarland, the ProctorU CEO. He says, "We can only imagine what the rate of inappropriate testing activity is when no one is watching." What a way to sell your product that watches people,
01:48:49
to say, you know, just imagine how many people would be cheating if no one is watching. There is no neat and tidy technological solution to the challenges we presently face in education. And frankly, it's insulting when institutions throw money at corporate EdTech when so many of their students are struggling and many of their faculty and staff are on furlough. If your institution just spent $500,000
01:49:15
on a proctoring solution, and that's a real figure, a recent Washington Post article reported an institution spending $500,000 on its proctoring software, or $200,000, or even $30,000. Someone wrote to me and said, "Well, my institution only spends $25,000." I don't think the number is really what's at issue here. It's where else could that money be spent? And what I suggest is: find out how much your institution
01:49:40
is spending on these tools and then put that number next to: how many students at your school are food or housing insecure? How many faculty or staff have been furloughed or fired? How many positions are currently frozen? How many faculty or staff have part-time or contingent positions? Then add up all the money your institution spends on extraneous, often pedagogically suspect EdTech, like cameras in classrooms, plagiarism detection software,
01:50:04
the LMS, proctoring solutions, and compare that number to the total budget for the centre for teaching and learning. How much are we investing in preparing teachers and supporting students, and how much are we spending on the technological tools set up to surveil students, to monitor, to rank, to track their behaviour?
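As a back-of-the-envelope version of the exercise Jesse describes, with entirely hypothetical numbers rather than figures from any real institution, the comparison comes down to a few lines of arithmetic:

```python
# Hypothetical annual spending figures; substitute your own
# institution's numbers from its published budgets.
surveillance_edtech = {
    "remote_proctoring": 500_000,
    "plagiarism_detection": 120_000,
    "classroom_cameras": 80_000,
    "lms_contract": 300_000,
}
teaching_centre_budget = 250_000  # centre for teaching and learning

total = sum(surveillance_edtech.values())
print(f"Surveillance EdTech: ${total:,}")
print(f"Centre for teaching and learning: ${teaching_centre_budget:,}")
print(f"Surveillance spend is {total / teaching_centre_budget:.1f}x the centre's budget")
```

Whatever the real numbers are at a given school, it's the ratio that makes the institution's priorities legible.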
01:50:30
Cheating and plagiarism are pedagogical problems, not technological ones. They're not inevitable. We can design proactively and together with students rather than relying on cruel surveillance software that creates a culture of suspicion, which interferes with good pedagogy. And I want to say that it not only interferes with good pedagogy: something like remote proctoring actually does harm to students. And I feel like right at this particular moment, when students are experiencing many kinds of acute trauma,
01:50:54
we need to be thinking about that trauma. And when we're talking about trauma-informed pedagogy, which is something I've been talking about increasingly, I can't see how remote proctoring can be part of a trauma-informed pedagogy. John Warner recently wrote, talking about this event, "We do not do our best, most interesting work when we know we are being watched. Skinner-like behaviourist practises where every moment of a student's day
01:51:18
"is monitored for compliance and correctness "is incompatible with learning." Sarah Goldrick-Rab and I wrote a piece called "Teaching the Students We Have Not The Students We Wish We Had". And the goal of this piece was to start by finding out who your students are. I've pushed back on scaffolding and there are even some versions of universal design that I would push back on when not universal design
01:51:42
or the scaffolding is done in advance of students arriving on the scene. I think we need to bring students into the design of the learning. With our most marginalised students, we can start by designing for them, but I think we also need to design with them and allow them space to design their own educations. So who are these people? And how are exams effective in working with these students?
01:52:11
Food insecurity is a significant factor in determining the average math SAT score: an increase in food insecurity lowers students' math SAT scores. I'm going to move through this data relatively quickly; I could almost spend an hour talking about any of these data points. Ozturk et al. found that students perform more poorly on exams when they are several weeks removed
01:52:35
from receiving food stamp benefits. So when we're talking about proctoring, we need to start by talking about exams: the effectiveness of exams, who exams are effective for and who exams do harm to. And then we should think the same thing about proctoring: who is most harmed by something like remote proctoring? Disabled students were talked about previously. And there are many ways in which the horrors
01:53:01
of remote proctoring are not distributed equally among our students. First-generation students perform more poorly compared to continuing-generation students only when the function of selection was salient, while no difference appeared in the no-selection condition. What this study shows is that first-generation students perform poorly on exams when something in the instructions
01:53:24
or the construction of the exam reminds them of their status: when they're reminded of the competitive nature of the exam, when they're reminded that some students are seen as more deserving than others, things that our assessment systems often do by ranking students against one another. There is consensus in the literature about the benefits of a student's sense of belonging. Researchers suggest that higher levels of belonging lead to increases in GPA, academic achievement
01:53:51
and motivation. There's some recent work that I've been reading about loneliness and the degree to which loneliness affects test performance. And if you think about what we've all been experiencing over the last 10 months, loneliness is something that resonates with me a great deal. And if you think about the environment in which students are taking proctored exams, remotely proctored exams
01:54:14
and how the isolation that they feel at that moment of the testing experience affects their performance. I'm going to stick with this one a little bit longer. Children displayed a statistically significant increase in cortisol level in anticipation of high stakes testing: literally an effect on their hormones, testing affecting their hormones. Large decreases and large increases in cortisol
01:54:40
were associated with under-performance on the high stakes tests: essentially, either too much stress, or a situation constructed such that students actually shut down and stop producing the cortisol that's necessary for their performance. Oh gosh, this is horrifying to me. This is text from a five-page set of instructions for a remotely proctored exam at Laurier University. Some of you may have seen this going around on Twitter,
01:55:07
just looking at this block of text. This is actually only four of the five pages represented here. I went on a rant about the four pages, and someone then noticed that it says one of five, two of five, three of five, four of five. So there's a page of this that I haven't even seen yet. And these pages, this block of text, even things about the formatting, feel like they do violence
01:55:34
to all of the students who have to engage with this. A single page of these exam instructions is bewildering, inscrutable, anxiety-producing, pedagogically bereft. Four pages is abusive. Exam integrity would be better served by three words: I trust you. Those words have psychological and pedagogical benefits. Students are more likely to cheat when we start off by calling them cheaters. So I'm just going to end with this.
01:56:00
These are the four words that I have recently boiled my pedagogy down to: start by trusting students. Think about all of the ways that we construct learning environments and how we can come back to this idea. So I want to open it up to questions; we've got about eight or nine minutes. And in keeping with this, and in keeping with Brenna's call, I would love it if some of the questions came from students
01:56:24
or even if people are willing to share reflections in the chat about their experience with some of these softwares. So I think you're muted, Brenna. - Of course I am. Of course I am. I'm here now. That was fantastic, Jesse, thanks so much. And there's lots of really good reflection going on in the chat, and discussion about EdTech more broadly and what its responsibilities are. I flagged a couple of questions
01:57:04
and there's a couple more on Twitter that I'll bring you. But the first one I want to start with is a question that asks, "If entrapment is illegal for law enforcement, why does it seem to be okay in education?" I like that question because it ties back to the conversation in the first portion of your talk, and in the panel earlier, about the relationship between carceral practise and education. - Sorry, I'm writing in the chat as we're talking.
01:57:32
Can you repeat the last part of the question? I got the bit about the entrapment. - Yeah. So the question was, if entrapment is illegal for law enforcement, why is it okay here? Why does it seem to be okay for teachers? - Well, I think that this isn't a legal relationship, the relationship between teachers and students. I think there are certain legal dimensions to it,
01:57:56
but I actually would push away from the sort of legal characterisation of the relationship. I think too often what we do is set up a set of structures or rules that govern relationships, when I think the thing that's most powerful is relying on our human natures. And I guess the quick answer
01:58:22
is that it's not okay by any stretch of the imagination. What Honorlock is doing is absolutely gross. It's disgusting. (laughing) - There was another question, and I think it was in response to your talking about some of the recent news coverage
01:58:50
in this kind of popular press space about people who are finding innovation and connection teaching online? Now, you write about innovation and connection and teaching online, so I'm wondering if you have any thoughts on the explosion of interest now that it's all so creepy. - Yeah. Well, I think that there are those stories, but to some degree those stories don't make the front page. I mean, I cannot stand this concept
01:59:18
and can't believe it's even going to come out of my mouth, but the idea of if it bleeds, it leads. I think to some degree what we're seeing is the harm being done, and we're seeing fewer stories about all the teachers and students who are working so incredibly hard to make it through this moment. And also I think that truth is not really a narrative that sells nearly as well,
01:59:42
because the truth is that we're fumbling through. We're not all figuring out the answer. I don't think that I could say, "Hey, let's shut down all of these stories about the horror stories of these proctoring solutions, and let's publish stories about all the successes and the amazing marvels," because truthfully, I'm not seeing that either. What I'm seeing is humans fumbling through a really, really difficult situation in the world together.
02:00:08
And I'm seeing and feeling both a few successes, but also a lot of just being together in our grief and being together in our trauma. There is a lot of emphasis on care, but I don't know that we're necessarily getting that right. And I don't know that it's something where you can just put out 10 best practises and suddenly get it right. I think the fumbling through is the story,
02:00:34
but what lede do you write for that, you know, a bunch of people all fumbling through, trying to figure out what online learning is? That doesn't sound nearly as compelling as the other headlines that I'm seeing. But I think that you're right. When I google, even when I tried to find data about changes in cheating, if I just do a Google search for that, I see 10, 15, 20 pages of people saying, "Oh my God,
02:01:02
"cheating is on the rise. And then you dig a little deeper and you find out, no, actually cheating's not on the rise. And there's no evidence that I've been able to find to support that. - I'm going to segue into a question from a student that just came in while you were answering the last one. And that is," I'm a student. "What is your best advice "for actively calling out professors who only offer exams "through Proctorio and similar systems "rather than providing meaningful alternatives?"
02:01:29
I think students are in a really difficult position here, right? Because they have to advocate for themselves in so many situations where the power differentials are troubling. - Yeah. I mean, that's absolutely true. There are so many students that I hear about who want to push back and only feel like they can do so anonymously and in anonymous spaces. And we actually need to tell this story,
02:01:52
we need faces and human beings who are experiencing this. And yet it's so difficult to find that, exactly because the power imbalance is so problematic. What I would say is that there are a lot of advocates for students: people in positions of power, or relative power at least, at the institutions. And the idea of relative power I think is important here. We all can use whatever power we have to support the less powerful people within the system.
02:02:18
And honestly, sometimes people push back on this, but I'll push back on them for the rest of time: the least powerful people in this system are students. And we, as people who have whatever small amount of power we have, whatever platforms we have, need to make sure that we provide space for students' voices. We need to talk about students.
02:02:41
We need to invite students to events. We need to create space for students. We need to provide funding for students. And so ultimately my answer is that it is really our job as educators, those of us who have any space or any power or any funding or any ability to support students, to make sure that students are invited to the table for these discussions, but also to make sure that these spaces
02:03:05
are not hostile to their voices. And that starts by reminding students constantly that a rug is not going to get pulled out from under them, because so often that's exactly what has happened. So to go back to your question, I would say there's safety in numbers to some degree, find other students who are having similar experiences and come together and write and talk about this. - Okay. One last question ties in really nicely to the last one,
02:03:32
sort of flips to the professor perspective but is being posed by a student. "Student here: what would you say to the professors who might not want to use proctoring in their classes, but are in a position where they feel they have to, or are being mandated by institutions?" And I think this will be our last question, Jesse. - Yeah. Well, what I would say is that, and I've talked about this quite a bit with grading. Every time I talk about ungrading or resisting grades
02:03:57
or resisting quantitative evaluation, invariably I get the question: well, that's not allowed at my institution, so what do I do now? And my answer has been to actually find out what the rules and restrictions at your institution are, because usually we internalise a lot more restrictions than are actually there. And I think that's true even with some of these proctoring solutions. At our institutions, we need to actually find out when,
02:04:23
how and where we're required to use those, especially if you're a contingent faculty member who doesn't feel like they have the power to resist. If you have some power within the system, say no, and say no loudly so that other people can hear you say no, because that's what helps contingent people feel like they can say no too. And then, where you have to, I would say two things: find out what the rules actually are
02:04:48
so that you can follow the letter of the law while pushing back on it at the same time. And the other thing I would say is talk to students about these platforms, because honestly, if a student has to go into a proctored exam having had a really honest, frank conversation with their teacher, with their fellow students, with their colleagues about how proctoring works, what it is, why it exists, it goes a long way.
02:05:13
It helps them think critically and engage in what is a traumatic experience in a much more thoughtful, critical way. - Thank you so much, Jesse. That's fantastic. And thank you for addressing those questions and thank you to the students who got their questions in. We really do want to prioritise your voices. If you missed me saying off the top, please do self identify as a student in your questions and we'll make sure we prioritise getting them to our speakers. So thank you, Jesse.
02:05:40
Our next speaker today is Audrey Watters. Audrey, I'm going to give you a little bit of an introduction and then I'll leave you to do your thing. Audrey Watters is a writer who focuses on educational technology: the relationship between politics, pedagogy, business, culture and EdTech. She has worked in the education field for over 15 years, teaching, researching, organising and project managing. Although she was two chapters into her dissertation on a topic completely unrelated to EdTech,
02:06:05
she decided to abandon academia, and she now happily fulfils the one job recommended to her by a junior high aptitude test: freelance writer. Audrey has written for The Atlantic, Edutopia, MindShift, Inside Higher Ed, the School Library Journal, the Huffington Post and elsewhere across the web, in addition to her own blog, "Hack Education". And I'm so delighted to have you here, Audrey. Thank you so much. - Thank you. Thank you for having me. I'm so pleased, but I'm also really outraged to be here today.
02:06:32
And I don't know, maybe I shouldn't have sat in and listened to everyone else's talks so far, because they've been great, but my level of curiosity has just sort of ramped up as the day's gone on. I feel like I'm revving the engine, so you'll have to excuse me if I get a little ranty. Yeah, I'm pleased, but I'm outraged that we're here raising money for Ian Linkletter's defence. More broadly, raising awareness
02:06:58
about the dangers of EdTech surveillance. I mean, it is nice, it's a relief, to be part of an event where everyone is on the same page, I think, politically and pedagogically, and where I don't need to be the only person saying, hey, wait folks, this EdTech stuff is at best snake oil and at worst fascist. You know, the challenge, on the other hand,
02:07:22
is not simply to repeat the things that Sava, Maha, Benjamin, Chris and Jesse have already said. I'm very lucky that those five are not just colleagues but dear friends, and the love and support that they've shown me, and the solidarity that all of you show today, give me great hope that we can build better educational practises and that we aren't stuck with the snake oil or the fascism.
02:07:49
And I will say this, even though it's been stated and restated a dozen or more times today: test proctoring is exploitative and extractive. It is harmful to all students, but particularly to those who are already disadvantaged by our institutions. To adopt test proctoring software is to maintain a pedagogical practise
02:08:13
that's based on mistrust, surveillance and punishment. To adopt test proctoring software is to enrich an unethical industry. And to adopt Proctorio in particular is to align oneself with a company that has repeatedly demonstrated that it sees students, teachers and staff as its enemy. It's a company that has no respect for academic freedom,
02:08:36
for critical inquiry or for the safety and wellbeing of the community that it purports to serve. When we talk about surveillance and EdTech, much of the focus, and rightly so, is on how this affects students. But as we gather here today to raise funds for Ian, a learning technology specialist at UBC, we have to recognise how much surveillance
02:09:02
and EdTech affects teachers and staff as well. And this should not come as a surprise, right? A fair amount of education technology comes from what we call enterprise technology, right? That's the kind of software that's adopted by large corporations and government entities. Zoom, which thankfully we're not using today, for example, was not designed for classroom use.
02:09:25
Although Stanford University was, interestingly, its first paying customer, it was designed to facilitate remote business meetings. And I should say that it was designed pretty poorly even for that. This spring, when schools and workplaces turned to video conferencing tools en masse, the CEO of Zoom admitted that he'd never really thought about privacy and security issues on his platform.
02:09:50
He couldn't fathom why someone would Zoom bomb a class. Enterprise technology is utterly committed to monitoring and controlling workers. Although I think many white collar professionals imagine themselves labouring outside these sorts of constraints, right? Much like professors seem to imagine that plagiarism and proctoring software is only interested in their students' work, not their own.
02:10:18
Microsoft and Google offer schools productivity tools, for example. They make it quite clear, I would hope, that whatever students or staff type or click feeds into financial measurements of efficiency. I mean, we talked about this earlier: quite literally so in the case of Microsoft, which now offers managers a way to measure and score workers based on how often they email, chat
02:10:43
or collaborate in SharePoint. In EdTech circles, we call this learning analytics, right? But it's less about learning than it is about productivity. The learning management system, as the name suggests, is as much a piece of enterprise software as it is educational software. It too is committed to measuring and scoring based on how often its users email, chat or collaborate.
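To show how thin this kind of scoring is, here is a deliberately naive sketch; the field names and weights are hypothetical, not Microsoft's or any LMS vendor's actual metric. An activity-based "productivity score" reduces to counting observable actions, weighting them and ranking people by the total:

```python
# Hypothetical activity-based "productivity score": count logged
# actions, apply arbitrary weights, rank users by the total.
from dataclasses import dataclass

@dataclass
class ActivityLog:
    user: str
    emails_sent: int
    chat_messages: int
    shared_docs_edited: int

# Whoever picks these weights decides what "productive" means.
WEIGHTS = {"emails_sent": 1.0, "chat_messages": 0.5, "shared_docs_edited": 2.0}

def score(log: ActivityLog) -> float:
    return (WEIGHTS["emails_sent"] * log.emails_sent
            + WEIGHTS["chat_messages"] * log.chat_messages
            + WEIGHTS["shared_docs_edited"] * log.shared_docs_edited)

logs = [
    ActivityLog("staff_a", emails_sent=40, chat_messages=200, shared_docs_edited=3),
    # Deep, unlogged work (reading, thinking, teaching) scores near zero.
    ActivityLog("staff_b", emails_sent=5, chat_messages=10, shared_docs_edited=1),
]
for log in sorted(logs, key=score, reverse=True):
    print(f"{log.user}: {score(log):.1f}")
```

The design choice to notice: only what the platform logs can count, so whatever the weights reward becomes the working definition of productive.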
02:11:09
That word management is a dead giveaway that this software is designed neither for students nor for workers, right? The LMS is an administrative tool designed at the very outset to post course materials online in such a way that the institution and not the open web can control access, right? The LMS monitors student performance and productivity
02:11:34
to be sure, but it can also be used to surveil instructors as well, right? To track which grad students are on strike based on their logins, for example, to track how many hours teachers are working during the pandemic. These are real world examples, not hypotheticals. The co-founders of Blackboard incidentally are back with a new startup, an administrative layer on top of Zoom
02:11:58
to make it work better for schools, they say, with features like attendance taking, test proctoring, eye tracking. So congratulations to EdTech for taking a surveillance tool designed for the office and making it even more exploitative by turning it on students and staff. I guess I should pause here for a moment just to make an important point. I think too often when it comes to education technology,
02:12:26
all technology, to be honest, we get really caught up in talking about the material or the digital object itself. We talk about the tool as though it has agency and ideas all its own, rather than recognising that technology is part of broader practises, systems and ideologies at play in the world around us, right? Technologies have history.
02:12:50
The LMS has a history, one that's bound up in the technological constraints of the student information system in the late 1990s, and in the EdTech imagination about what sort of access to the web students should or shouldn't have. And that history gets carried forward, right? Analogue beliefs and practises get hard-coded into software, creating new pedagogical constraints
02:13:15
that students and teachers must work with. Technology never arrives all of a sudden, right? Although that's the story about tech and EdTech: they like to ignore history and highlight innovation, right? Proctorio emerged fully formed out of Mike Olsen's head, right? Like the goddess of wisdom emerged out of Zeus's, but in this case with less wisdom and more lawyers.
02:13:42
Online proctoring has a history. It has an ideology. And test taking itself, right, is a cultural construct, embedded in long standing beliefs and practises about schooling: what we test, how we test, academic integrity, cheating and so on. Again, like Jesse said, this whole idea that there's rampant cheating is a nifty marketing narrative, right?
02:14:06
One that's been peddled for a long time and is, unfortunately I think, the cornerstone, the philosophical cornerstone, for how so much of EdTech gets built. You know, but again, surveillance didn't just appear in our institutions with the advent of the computer. So many of the deeply held beliefs and practises
02:14:30
of education involve watching, monitoring, controlling, right? Watching, monitoring, controlling children and adults, students and teachers. And I would argue that watching, monitoring and controlling are fundamentally behaviourist practises. I know, with me all things come back to Skinner; you'll just have to deal with it.
02:14:53
But we can think about the ways in which education technology emerged from workplace technologies, right? As well as the ways in which schools adopt that language and culture of productivity in order to control our bodies, our minds, our time, our behaviour. But a lot of the labour of education is also reproductive labour, right?
02:15:17
Teaching and learning is the reproduction of knowledge and culture, teaching and learning as acts of care. Another theme I think of today. What does it mean to bring surveillance and behaviourism to bear down on reproductive labour? Wired magazine ran one of those really awful future of work stories, last week I think.
02:15:41
Something about how too many meetings make us unhappy and unproductive and therefore of course, artificial intelligence will optimise all of this by scheduling our meetings for us, scheduling all of our appointments for us, by using facial recognition technology and body language reading software to make sure that we're all paying attention and staying on track and maximising our efficiency.
02:16:07
The underlying message, right: more surveillance makes us better workers. And it's not hard to see how this gets repackaged for schools. So I'm really interested not just in this question of productive labour, but also of reproductive labour. And I apologise that these are sort of loosely formed ideas; maybe this isn't fully baked,
02:16:33
but, you know, what does it mean to automate and to algorithmically programme reproductivity, right? What does it mean when the demands of the workplace and the demands of school enter the site we commonly associate with reproductive labour, that is, the home? For over 100 years now, we've been developing technologies and telling stories about the necessity of automating the home,
02:17:00
and this is the other influence, joining enterprise tech in its influence on EdTech: consumer tech. So let me tell you a quick story that bridges my book "Teaching Machines" with the next project that I'm working on, which is actually on the history of the baby monitor, right? We have this thing called a baby monitor, and we have bought this product for almost a century now.
02:17:27
And all of a sudden folks want to talk about why children are so heavily surveilled, but good Lord, you know, read some history. Anyway, let's talk about Skinner and his efforts to build a behaviourist technology of child-rearing. He called it the air crib. And again, I want to talk about Skinner because I really think we have to recognise that so much of the surveillance in EdTech
02:17:50
comes from his behaviourist bent, right? His belief that we can and must observe learning by monitoring behaviour, and that we can then enhance learning by engineering behaviour, is also fundamental to EdTech. So in 1944, Skinner fabricated a climate controlled environment for his second child.
02:18:15
He first called it the baby tender and then, I kid you not, the heir conditioner, H-E-I-R. It was a device meant to replace the crib, the bassinet, the playpen all in one. He wrote an article in Ladies' Home Journal one year later, and he said, "When we decided to have another child, my wife and I felt it was time to apply a little labour saving invention
02:18:39
"and design to the problems of the nursery. "We began by going over the disheartening schedule "of the young mother step-by-step. "We asked only one question, 'Is this practise important "'for the physical and psychological health of the baby?' "When not, we marked it for elimination. "Then the gadgeteering began." So the crib that Skinner gadgeteered for his daughter was made out of metal. It was larger than a typical crib, higher off the ground.
02:19:05
Labour saving in part because of less bending over, Skinner argued. It had three solid walls, a roof and a safety-glass pane at the front, which could be lowered to move the baby in and out. Canvas was stretched across the bottom to create a floor. The bedding was stored on a spool outside the crib, to be rolled in to replace the soiled linen. It was soundproof and dirt proof, Skinner said,
02:19:31
but the key feature was that it was temperature controlled. So, save for the diaper, the baby was kept unclothed and unbundled, and Skinner argued that clothing created unnecessary laundry, inhibited the baby's movement and therefore the baby's exploration of her world. It was a labour saving machine, and notice we even talk about reproductive labour in terms of productivity.
02:19:54
Skinner boasted that the air crib would only take about one and a half hours each day to change, feed and otherwise care for the baby. He insisted that his daughter, who stayed in the crib for the first two years of her life, was not socially starved or robbed of affection or mother love. He insisted that the compartment did not ostracise the baby. Not surprisingly perhaps, the invention never caught on,
02:20:20
in no small part because the title that Ladies' Home Journal ran with the article was "Baby in a Box", connecting this crib to the Skinner box, the operant conditioning chamber that he'd designed for his experiments on rats and pigeons, right? Associating the crib with the rewards and pellets that he used to modify those animals' behaviour in his lab.
02:20:44
And Skinner described the crib's design, and the practises that he and his wife developed for their infant daughter, as an experiment: a word that he probably didn't mean in a strict scientific sense, but that I think suggested to readers that this was a piece of lab equipment, not really a piece of furniture suited for a baby or for the home. The article that he wrote also opened with the phrase, "In that brave new world,
02:21:11
which science is preparing for the housewife of the future." And I think many readers in the 1940s would have been familiar with Aldous Huxley's novel "Brave New World" and made the connection between the air crib and Huxley's dystopia, in which reproduction and child-rearing were engineered and controlled by a techno-scientific authoritarian government.
02:21:36
But probably most damning was the photo that accompanied the article, with little Deborah Skinner enclosed in the crib, her face and hands pressed up against the glass. Skinner's call to automate child-rearing also coincided with the publication of Dr. Benjamin Spock's book, "The Common Sense Book of Baby and Child Care", which rejected the behaviourist methods
02:22:03
that were promoted by psychologists like John Watson: strict, data-driven timetables for feeding and toilet training and so on. And Spock argued, contrary to all this techno-scientific endeavour, that mothers, and it was mothers, should be flexible, loving and natural. Now, of course, there are plenty of problems
02:22:27
with this naturalised domesticity, to be sure: a naturalised domesticity that is imagined as white, affluent, American and female. And some of these are problems, I think, with the teaching profession today, particularly at the K through 12 level, right? The problem of white womanhood. The air crib, the psychologist Benjamin has argued,
02:22:53
was viewed at the time as a technology of displacement, right? A device that interfered with the way in which human beings normally contact one another, in this case the parent and the child: it displaces the parent. And it's a similar problem, Benjamin argues, that Skinner faced with his other invention, the teaching machine, a concept that he came up with in 1953,
02:23:19
after visiting Deborah's fourth grade classroom. Skinner's inventions failed to achieve widespread adoption, Benjamin argues, because they were seen as subverting valuable human relationships, relationships that were necessary for human development. But I'm not sure that the ideas of either the teaching machine
02:23:45
or the baby monitor were really rejected, right? If nothing else, these ideas keep getting repackaged and reintroduced to us at work, at school and at home. And I think the question before us, I'd argue, is whether or not we want behaviourist technologies.
02:24:09
And again, I think all behaviourist technologies are surveillance technologies. Do we want behaviourist technologies to be central to human development? Remember, B.F. Skinner did not believe in freedom, right? And if we do believe in freedom, then we have to reject not just the latest shiny gadgetry
02:24:34
and anti-cheating bullshittery; we have to reject really over a century of psycho-technologies and pedagogies of oppression. And that is a lot of work for us. But I think if we bite off just one chunk, one tiny chunk that I hope we can all agree on today, it's to make sure that Proctorio is wildly unsuccessful
02:25:02
in all its legal and business endeavours. Thank you. - Thank you, Audrey. That was great. I had a bunch of people texting me saying that your rage was nourishing them, which I love; I also love to be nourished on rage. I'm going to ask students to join in with their questions. Remember that if you self-identify with a little asterisk,
02:25:29
I will put your question first, but I'll start with a question that came in on YouTube, which I think is more of a provocation than a direct question, but I think it's worth exploring: "Certain ways of working will be considered productive. Who defines that? As a knowledge worker I've worked while taking a walk, for example." - Yeah, I think this question actually points to one of the ways in which the language and the imagination
02:25:57
around labour have permeated all of our lives. And I think we probably feel that so profoundly now, when so many of us, not all of us, but so many of us, particularly knowledge workers, are working from home. Because there is no sort of time clock, right? That you punch in at eight and then you get a break and then you're done for the day.
02:26:21
Really you're working all the time. You're working every day of the week. And capitalism loves that. They love us to be working all the time; they love that our whole world now is constrained in this way. And every moment, even moments that should be ours, right, that we should reclaim as our time, not the time of the boss, is now infiltrated, I think,
02:26:52
by these values. - So a question that is recurring throughout each talk, I'm seeing it come up in the chat and it's extremely prevalent in the Twitter hashtag, is this idea of how do we resist? Whether from the perspective of a student, who obviously lacks a certain amount of power within the dynamic, or from the instructor, perhaps contingent, who also feels unable to resist.
02:27:18
I'm posing this question, I've posed it to Jesse as well, and the opening panel tackled it too, but I think it's something to keep coming back to considering what we're here for today. How can we resist within the structures in which we work? - I think that everyone's comments, Jesse's in particular, were really great. I think that it's important to recognise that active resistance and refusal
02:27:42
don't have to sort of come with capital letters. There are everyday small acts of resistance that we take, acts of refusal that we take, that we might not even recognise as such. We all know this. We all know that when we're slower than usual or when we drag our feet, you know, there are little steps that we can take
02:28:10
to make it more difficult for the machinery of surveillance to operate. And they don't actually have to be dramatic things, although that's great. And like Jesse said, if you're in a position where you can be loud and can be overt, then by all means. But I think that there are just lots of ways in which we refuse, in which we say no, and we don't even have to make a big deal about it. You don't have to announce on Twitter that you're leaving Facebook, right?
02:28:34
It doesn't have to be showy at all. You can just not do it. Just not do it. - Somebody just asked in the chat, "Sabotaging?" - I think that's what so much of, I mean, the whole conversation around cheating gets pulled in different ways. But I think that in some ways that's what cheating is too. Cheating is a refusal.
02:29:01
Cheating is sabotage. And so I think that we should, and I think maybe Sava said this earlier, ask more questions about why students are cheating. And if we see some of these acts not so much as academic dishonesty, but really as pushing back on what are exploitative practices, these are acts of refusal and resistance
02:29:26
when students cheat. - Steel has a question in the chat, John, if you could pull it up: "Audrey, I was really taken by your framing questions: should behaviourist technologies be used in our approaches to human development? Curious to know more about what constitutes behaviourism. Is it a scientific, rational approach to older forms of social control, in-group/out-group shunning, ostracism, or something different?"
02:29:52
- Yeah. Well, what Skinner meant by behaviourism, and really educational psychologists have been behaviourists since the beginning of educational psychology, is this: you can't know what's going on in someone's mind, right? Skinner mocked what he called mentalism, the psychoanalytic approach, the Jungian approach to thinking about the mind.
02:30:16
You can really only know what people do, so you can only monitor behaviour. And then you can see if someone changes their behaviour, and that's learning, right? So much of what became educational psychology is built on animal behaviours in the lab, right? So Thorndike, who was a professor at Columbia,
02:30:40
would train mice to run through a maze. And the speed at which the mouse learned the maze became something that he graphed and called the learning curve. So we have adopted all of this behaviourist language and, I would say, behaviourist assessment into so much of what we do in education as a practice,
02:31:05
and in particular, thanks to Skinner and some other psychologists as well, into our technologies, into our education technologies. Skinner's insight was that he would train pigeons in particular to do certain tasks. He trained pigeons to play ping pong, for example,
02:31:32
by rewarding them for good behaviour. And that's a lot of what happens in education technology. - It's not unlike academic writing class, for example. - Yeah. We shouldn't punish students. We shouldn't beat them, you know, condemning corporal punishment. We should reward them when they do things correctly. And I think that sort of behaviourist design runs throughout
02:31:57
so much of education technology, so much of all of our technologies. Silicon Valley likes to call them nudges, right? These are the ways in which our behaviours are shaped in order to do certain things that the engineers decide are good. And in the case of education technology, you know, the outcome is learning. It's a change in behaviour that demonstrates learning.
02:32:23
- Sort of filters nicely into Laura Taub's question in the chat, which is, "Refusal and re-imagination: resisting one kind of world. So what's the alternative world we want to imagine and bring into existence, within education and beyond?" Just a little question for you. - Yeah, I'm really interested in this question. I've spent a lot of time observing the imagination of education technologists
02:32:53
and the things that they talk about, or the literature that they invoke, when they talk about what they imagine school to be like, the kinds of schools that they want to build in the future. I will never forget that Sal Khan from Khan Academy was interviewed in the New York Times, and they asked him about what literature inspires him. And he said that he was inspired to build Khan Academy
02:33:17
by "Ender's Game". And I thought, my God, this is like imperialism, where you're drafting children without their knowledge or consent to fight a war against insects. That's your inspiration? Holy shit! I think, you know, there are people who are really active about building futures that are, I think, pretty horrifying. I do think that, I mean, to me,
02:33:44
the kind of work that we do over the imagination doesn't necessarily have to be utopian, and maybe it even shouldn't be. But I do think that work of imagination is really important, and so is listening to other people when they talk about what they think the future of teaching and learning should look like. You know, is that a future of robot teachers? There was an episode of the "Mandalorian", I'm going really pop culture here, an episode a week or so ago
02:34:10
in which the child was brought into a classroom of the future, I guess, a long time ago in a galaxy far, far away. But there was a robot teacher standing in front of the class lecturing to people. And I think there's this idea that we imagine these more technological futures and they really so often just reinscribe really poor pedagogical practices. A colleague was tweeting this morning
02:34:35
the image of one of these virtual immersive-world classrooms, but it was still, everybody's in an avatar, everybody's a digital representation of the self, but they're still all in the lecture hall. - Yes. - We'll imagine entirely different bodies for ourselves, but we're in a lecture hall and there's one guy at the front. I have one final question for you, Audrey, it's from Brian Lamb. He wants to know, if it were a monster of EdTech, which monster would Proctorio be?
02:35:02
I'll take my answer off the air in case I get sued. - That is wisdom right there. Thank you so much for your time today, Audrey. I've really appreciated this conversation. - Thank you. - Thank you. So our final speaker of our event today is Cory Doctorow. Cory, I'm going to give you a little brief intro and then I will let you do your thing. Cory Doctorow is a science fiction author, activist and journalist. His latest book is "Attack Surface",
02:35:28
a stand-alone adult sequel to "Little Brother", and he's also the author of "How to Destroy Surveillance Capitalism", which is barely relevant today, a non-fiction book about conspiracies and monopolies. He maintains a daily blog at pluralistic.net. He works for the Electronic Frontier Foundation, is an MIT Media Lab research affiliate, a visiting professor of computer science at the Open University, and a visiting professor of practice at the University of North Carolina School of Library and Information Science,
02:35:52
and co-founder of the UK Open Rights Group. Thanks, Cory. I'll hand it over to you. - Thank you very much. It is a pleasure and an honour to be able to do this. You know, as a Canadian, I'm always excited when something terrible happens back home, not here in America, where I live. So it was particularly galling and engrossing to watch Ian's saga unfold,
02:36:16
and, you know, looking at the other speakers that were on the bill today and the kinds of things that we've been talking about in the context of Proctorio, I'm guessing that you've already heard a lot about the pedagogical bankruptcy of high-stakes testing, the cruelty of this kind of remote proctoring, the discrimination intrinsic in it. You know, what if the thing
02:36:41
that your surroundings look like is a hatchback outside of a Taco Bell, because you live in a broadband desert, or you can't afford home broadband because everyone you live with who's a wage earner has been laid off, and the algorithm says, well, you're cheating because that doesn't look like a house to me. The academic shallowness of not being able to imagine a way of evaluating student performance without using these high-stakes tests,
02:37:06
and, you know, the vocational incoherence of it, speaking as someone who's worked in and out of the tech industry for decades now. I don't think that there's a workplace that mirrors the conditions that Proctorio asks you to assess your performance under. I think if your boss found out that you were trying not to look away from your screen
02:37:31
or talk to anyone else or consult any references while you were writing software for them, your boss would either give you a stern talking-to or possibly fire you, because that is not how people work. So if it's not how people learn, and it's not how people work, and it's cruel and terrible, then there's no excuse for it. And you know, that is a thing that many speakers on the bill today
02:37:54
are far better qualified to talk about than me. But there is a thing that I have a specific domain of expertise in, which is talking about the relationship of this stuff specifically to computer science and to academic technology. After all, Ian is an academic technologist and a tech specialist. And you know, one of the places where Proctorio has found its most hospitable home
02:38:18
is, ironically, in the computer science world. And I say that in part because of the stuff I've told you about the vocational incoherence of training computer scientists never to look at references, look away from their screens, or talk to other people while they're working. That's not how we do computer science. But also it's strange because there are specific risks and harms intrinsic to the model of Proctorio
02:38:44
and things like Proctorio that computer science departments should really know better about, and actually, I think, have a moral duty to be warning the rest of us about. So, when people talk about self-driving cars, you may have heard someone who's kind of half right say, well, what about the trolley problem? The trolley problem is this 1960s ethical thought experiment where you imagine that there's a trolley coming down a track
02:39:11
and there's a switch point. And depending on whether you pull the switch or not, the trolley will drive into six children who are playing unwisely on the track, or whose school bus has stopped on the track, or you can pull it the other way and it'll kill a guy who's standing on the other track, again unwisely, for reasons known only to him. And the question is, do you pull the lever and kill the children, or that guy, right?
02:39:35
And so when people apply this to self-driving cars, they say, well, what if your hundred-thousand-dollar self-driving Mercedes detects that it can either kill the millionaire behind its wheel or a bunch of poor people who are crossing the street, which of those people should the Mercedes be programmed to murder? And you know, this is an interesting thought experiment, but I'm here to tell you that as a computer science matter,
02:39:59
this is a really dumb question, because the thing that happens before we ask whether the car should murder its owner is: how did we design a car that sometimes murders its owner, a car that the owner in no way can reprogram so that it never murders them? Because objectively speaking, a product that you own that is designed to never murder you
02:40:26
is much better than a product that is designed to sometimes murder you. And so implied in the AI trolley problem, and I'm not even going to talk about the ethics of it, is a structural assumption. And that structural assumption is that we are headed into a future in which computers treat their owners as adversaries and resist the orders that owners give them
02:40:52
in favour of policies that are set remotely, either in real time by authorities that log into them remotely and say, now you must do a thing. You know, now your thermostat must be turned down. Now your car must be kill-switched because you're in a high-speed car chase. Now your car must be immobilised because you didn't make your bill payment. That the car, as between an order
02:41:16
given by the person who owns it, who's using it, whose life has been entrusted to it, and anyone who has the keys that indicate that the manufacturer has authorised some kind of override, will always let the manufacturer's override win. And, you know, speaking in my capacity here as a part-time dystopian science fiction writer,
02:41:41
this is a terrible idea. We really do not want our computers to be designed by default so that if they get orders that run counter to our interests, potentially lethally, we can't change those orders. That is a structural issue that will come back and bite us in the ass in a million ways as time goes by. Because one of the things that we know
02:42:04
is that manufacturers are not pure of motive, right? We've seen, for example, HP tricking users into installing fake security updates that actually just allow the HP printer to distinguish official HP printer cartridges from third-party cartridges, so it can reject the ink that you've bought. Sometimes manufacturers have insider problems. So even if the manufacturer never takes a decision
02:42:30
to do something that's harmful to you, there might be someone within the firm who makes that decision off their own bat and goes ahead and does bad things to you. We've seen many instances of that. And also, manufacturers are liable to pressure from the state. We saw this play out pretty recently with Apple, who famously make mobile devices
02:42:54
that, if you say to them, "I would like to run this app," even though Apple doesn't like it, they say, you know, "I can't let you do that, Dave." And that's bad enough when it's just that you want to play Fortnite and Apple wants 30% of Epic's revenue, but where it's really bad is if you're a user in China and Apple takes the orders of the Chinese state authorities to remove all working privacy tools from their App Store,
02:43:22
so that the only privacy tools that exist have backdoors through which the Chinese government can surveil your traffic. And the reason they're surveilling the traffic is in part to figure out whether or not to put you in a concentration camp, where there's forced labour, punitive rape, unsanctioned medical experimentation and a host of other horribles that are underway right now in Xinjiang province for all kinds of Turkic Muslim ethnic minorities,
02:43:49
to say nothing of the harvesting of organs from Falun Gong practitioners and so on. So as soon as you design a device that can push around the person who relies on it, people are going to start trying to figure out how to push those people around. And this is a thing that computer science professors, computer science academics, computer science professionals, technologists who work in academic settings,
02:44:14
we should be alive to, because part of our job as academic computer scientists is to inculcate a generation of forthcoming computer scientists with good understandings of how their technological choices will play out, how their design decisions will play out, what they face in the future if they give in to the temptation to build a device that won't allow its owner or user
02:44:40
to override remote policy. Now, one thing I want you to understand, a subtlety that I think computer scientists understand but that people outside of the computer science world tend to elide, is that when we say that a phone can't run a certain app, or that your laptop, when it's running Proctorio software, can't run another app without Proctorio knowing about it,
02:45:09
what we mean is that the device won't do it, right? The device is capable of doing it. The universal computer design that Alan Turing and John von Neumann came up with in the post-war era, the design that we still use for all our devices, says formally that any computer can run any programme that can be expressed in symbolic logic.
02:45:32
I mean, maybe very slowly, right? There are very old computers that, if you tried to run Photoshop on them, would take beyond the heat death of the universe before they finished loading the load screen. But they could, given enough time, run any programme; that is intrinsic to the nature of computer science. So when we say that we have a device that can't run a certain programme or can't do a certain thing, what we mean is that it won't.
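(To make that distinction concrete, here is a deliberately toy sketch, in Python, of what a "can't" like this amounts to; the names are hypothetical and this is no real vendor's code. The machine below is perfectly capable of launching either programme. The refusal is nothing but a policy check, enforced in software that, on a locked-down device, the owner is not permitted to change.)

```python
# Toy illustration only: an allow-list policy standing in for a vendor's
# remotely controlled gatekeeper. All names here are invented.
APPROVED_APPS = {"first_party_browser", "vendor_store_client"}

def launch(app_name: str) -> None:
    """Run an app, unless policy says otherwise."""
    if app_name not in APPROVED_APPS:
        # The hardware could run this programme perfectly well.
        # "Can't" is a decision made by someone other than the owner.
        raise PermissionError(f"{app_name!r} is not authorised on this device")
    print(f"running {app_name} ...")

for app in ("first_party_browser", "privacy_tool"):
    try:
        launch(app)
    except PermissionError as err:
        print("refused:", err)
```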
02:45:56
What we mean is that it has been designed to hide some supervisor process from the user, such that the user cannot detect it or terminate it, right? There isn't an icon on the desktop that says, I am the spyware, that you can just drag into the trash so it goes away. Instead, the operating system and the stuff below the operating system,
02:46:22
all the way down to things like the boot loader, are all designed to obfuscate their action from the user, so that processes that run in this super-privileged mode don't get enumerated when you ask the computer, "Hey, what programmes are you running?" And the files associated with those processes, when you list the directory, they're hidden from the user.
02:46:47
Because again, nobody asked for this stuff, right? Nobody woke up this morning and said, I wish I had a printer, except that it wouldn't let me run ink of my choosing. So if there is a computer that is designed to do things that are adverse to its owner, the only way that can succeed is to hide stuff from the owner. And here's where it gets a little technical, and I'm sorry, I'm trying not to make this too technical. Hiding secrets in computers
02:47:11
that you give to your adversaries has a technical definition in security. And the technical phrase, the word that we use to describe that, is wishful thinking. Because anything that you hide in a device that you then give to anyone who's got, whatever, 18 bucks to get a Netflix subscription or a trial copy of Proctorio, or, you know, a car engine part
02:47:36
that's got some kind of digital lock in it to stop you from replacing it with a generic spare. If you can just go out and get that device, then anyone can, and that anyone includes bored grad students with their own electron tunnelling microscope in the lab, a bunch of undergrads hanging around like a bad smell looking for extra-credit work and nothing to do this weekend. There are people who just routinely decompile this stuff,
02:48:00
take the tops off of chips and examine them with microscopes, do all the things that you can imagine. And it's brittle, right? Because the secret that is in the computer, the secret that stops the owner from making the computer do what they want instead of what the manufacturer, or whoever can lean on the manufacturer, wants, that secret, once it's out, is out, right? Once it's on the internet, it's gone forever.
02:48:24
And particularly when this stuff is hidden in hardware, you can't just update it. Apple's security chip for eight generations, eight years' worth of phones, has been compromised, and there's no way to fix that. In fact, those chips, because they're security chips, are designed so that you can't even take them off the board and put another security chip in; the whole thing is designed so it just sort of falls apart if you even try to do that. So you would need a really full-on lab
02:48:50
just to fix one phone, so that Apple's security chip, which enforces both security for users and security against users, could be replaced with a newer model that didn't share this defect. And so it is not enough to design a computer that hides what it does from its user. If you are committed to a model of computing that says that computers give orders to their owners
02:49:16
instead of the other way around, then you have to have a legal backstop for whatever it is you're doing. You need a law that criminalises looking too hard at a computer, and finding out and reporting on what you find there. And for that we can thank British Columbia. We can thank, not UBC, but British Columbia politicians.
02:49:39
In 2011, James Moore, who was then the MP for Coquitlam, along with Tony Clement, who's now been disgraced and drummed out of parliament, introduced Canada's equivalent to a U.S. law called the Digital Millennium Copyright Act, which contains what's called an anti-circumvention clause. And the anti-circumvention clause says that if you override any kind of access control in a device, you commit a crime, and it's a jailable offence.
02:50:08
And if you tell people how to do this, it's also a crime and a jailable offence, as well as a civil offence. In the U.S. it's a five-year prison sentence and a $500,000 fine for a first offence if you get caught doing this. And so this means that all the stuff we would normally expect to happen with devices, like people modifying them so that they do what the owner wants and not what a third party wants, doesn't happen. You would expect, all other things being equal,
02:50:31
if you bought a printer from HP and it said that you could only use ink that costs more than vintage Clicquot, that someone out there would just make you a chip that allows it to run, you know, regular ink. This law criminalises that, and also creates these vast civil penalties for it. And that's why we don't see these in the market, but it's also why we don't see in-depth security audits
02:50:54
of these products and these tools, because revealing defects that adversaries could exploit is itself legally dangerous. Because remember, these are programmes that are designed not to be visible, legible or interoperable to the users. So if you've got Proctorio operating your webcam in a way that you can't interfere with, then anyone who can impersonate Proctorio or its officers to your system can do that too.
02:51:19
And all of the security that goes in with Proctorio, the security that tries to detect and interdict your efforts to make your computer do your bidding instead of Proctorio's bidding, can be suborned to defend your attacker: to defend a stalker, an identity thief, a state surveillance operative, anyone else who can step into Proctorio's shoes
02:51:43
as far as your computer is concerned. And so we see this all over the place now. You know, there was a kind of watershed in 2005, when Sony BMG, one of the big three record labels now (I think there were five back then; there are three now), was caught selling about six million audio CDs that had been infected with software that poisoned your Windows computer.
02:52:07
So the first time you put the audio CD in your computer, it would rewrite your operating system so that certain programmes could no longer be seen by that computer. If a name started with the string "$sys$", then any programme running with that prepended to it wouldn't show up in the process monitor, and any file with that prepended to it wouldn't show up in the file directory.
02:52:30
And they wrote this because they then wanted to run another programme that would just interrupt your CD burning and ripping. It was just this sort of penny grift, but what it had the effect of was turning every computer that it infected into a computer where, if it got malware, malicious software whose name started with "$sys$", then by design you couldn't know that the malware was there. You couldn't see it in your file listings.
02:52:55
You couldn't stop it from running. So immediately virus writers started to prepend this string to their viruses, to take advantage of this hole in your computer's immune system that had been blown in it by this giant media and technology company. And hundreds of thousands of military and government networks around the world were eventually infected with this rootkit.
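(A minimal sketch of the hiding trick just described, simulated in Python; the real rootkit hooked Windows system calls rather than wrapping a listing function like this, and the file names here are invented. The point is only that the filter cannot tell the vendor's files from a virus writer's: anything that adopts the magic prefix inherits the cloak.)

```python
import os
import tempfile

HIDE_PREFIX = "$sys$"

def honest_listing(path):
    """What is actually on disk."""
    return sorted(os.listdir(path))

def rootkitted_listing(path):
    """What the user is shown: anything prefixed with $sys$ silently
    vanishes, whether it belongs to the vendor or to a virus writer."""
    return [name for name in honest_listing(path)
            if not name.startswith(HIDE_PREFIX)]

with tempfile.TemporaryDirectory() as d:
    for name in ("essay.docx", "$sys$drm_agent.exe", "$sys$virus.exe"):
        open(os.path.join(d, name), "w").close()
    print("on disk:      ", honest_listing(d))
    print("shown to user:", rootkitted_listing(d))
```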
02:53:18
And so that was kind of the starting gun on this standard of recklessness, of making the safety and security needs of computer users subordinate to the desires of the shareholders of the companies that made their devices. And since then, we've seen an absolute metastasis of this model, where we see these kinds of access controls,
02:53:43
this model of controlling users, in devices as diverse as insulin pumps, cars, printers, phones, and browsers, and of course browsers are the user interface for everything else, from medical implants to power plants to pacemakers. And that's why Ian's work is so important. That's the thing that's at stake here, because the duty of academic computer scientists
02:54:07
and educational technologists is to warn us about stuff like this. This is an industry, and Proctorio is kind of the tip of the spear of it, using legal threats to silence academics who point out the dangers of their commercial products, who engage in the normal business of academia, which is to investigate both the built and natural environment around us
02:54:32
and report truthfully and critically on how it functions, so that we can make our own assessments about what we want to do, so that it can feed into policy questions. And, you know, to see the tech industry doing this is to see a playbook being followed. Sorry, I just leaned on my own pocket, pressed my car key and set off my car alarm. It's off, thank you. Speaking of technology gone mad.
02:54:56
So, you know, we've seen this before from other kinds of firms. You may, if you're a Canadian, remember in the late 1990s that Apotex, the giant Canadian pharmaceutical company, went after a researcher at Sick Kids Hospital in Toronto who was doing a trial for Apotex on a drug called deferiprone. Her name was Nancy Olivieri, and she found that her patients were getting gravely ill
02:55:21
and she feared that they might die. And she halted the trial. But Apotex went to her and said, "If you tell your clinical subjects that their health is at risk, we will sue you for violating your nondisclosure." So we can immediately see why this is unethical, but it is, in form, the same thing Ian is doing when he warns educators and students
02:55:45
about the risks that they face from Proctorio and its products. So at a time when computer science departments like Wilfrid Laurier's are doubling down on this nonsense, Ian really stands apart. This is an academic who, rather than going with the flow, is risking everything, risking penury, risking years at what amounts to being a professional litigant, which is what can arise
02:56:09
if you stick to your guns on stuff like this and the firm that you have embarrassed is belligerent enough, and certainly Proctorio has nearly unlimited access to the capital markets if they want to drag this out, depending on how the SLAPP stuff goes. And you know, this is why this issue matters to me, and this is why Ian's plight matters to me, because it is a rare computer scientist, a shamefully rare computer scientist,
02:56:34
who will speak out on these things. And it's an even rarer one who, when threatened, will stick to his guns and say that the truth is more important than your shareholders' acid indigestion. And so we are going to tell the truth, because that is our job. And so that's what I wanted to say to you. It's not about the pedagogy, but about the technology and its bankruptcy,
02:56:57
and the duty of technological academics of all kinds to inform other academics who are structuring their own learning about the risks of using these products, and why, irrespective of all other questions, this is not the model we should be embracing. Thank you. - Thanks so much, Cory. Thanks for being here and for bringing that perspective to our conversation today.
02:57:24
We've been starting with student questions, so I'm going to start there. And it's actually a question that I've seen echoed a few times. This one came through on Twitter from Sinead Doyle. She asks, "Are educational technologies for surveillance viruses? Can you ever truly remove them from your computer?" So as a student who is perhaps being forced to use this technology, what are the long-term ramifications? And would you classify them as viruses? - So they're not viruses. I mean, there are a couple of ways
02:57:48
in which they're not viruses, right? So the most obvious one is that they don't self-spread. They have a different vector, and that vector is a morally different one: educators. - The professors. - Yeah, right. So, I mean, you know, in that sense there's lots of instances in which you see bad things happen with technology that has passed through human hands. And we don't like it, even though it's not fully automated. What I would say is that it shares the formal characteristics of a virus.
02:58:15
Nobody wants a virus. For a virus to run, it has to hide its operations from you. The virus is adverse to the owner of the system and the user of the system. And so, you know, although it's not a virus, it has all the things that make us not like viruses, that make us want to remove viruses. And I'll say that antivirus companies actually struggle with this question very thoroughly.
02:58:42
So when the news about the Sony rootkit came to light in 2005, a bunch of security companies said, "We actually picked that up on our normal malicious-software scanning months ago, but our general counsel said we can't talk about it. Because although this is formally indistinguishable from a virus, undesirable software from the perspective of the user
02:59:06
that is installed covertly without the user's consent and that hides itself from the user, it is there to shore up the interests of a multinational corporation. And so under Section 1201 of the Digital Millennium Copyright Act, under the then-pending Canadian copyright act, under Article 6 of the European Copyright Directive, it is a potential felony for us to tell you
02:59:29
that your computer is being infected by this." So in that sense, it is very much like a virus. - I've got a question that is perhaps a little bit more open-ended that also came through on Twitter. I think it's more an expression of horror, but I'd like to share it nonetheless. Rissa on Twitter shares, "My horror-filled thoughts: we've seriously entrusted the ethics of education to companies who are undermining the trust we build with our students.
02:59:54
"Seriously, why? "Why are we allowing this to happen?" I wonder if you have any thoughts on the sort of weaponization of these tools in education. - So I have the, can I swear on this? - You know what, some people have some people haven't, I feel like we're getting to late night in the UK. - One swear word. I have a thing I call the shitty technology adoption and the shitty technology adoption curve,
03:00:19
and it predicts that when you are trying to do something technologically that the users of this technology will be really angry about, you have to first use the technology on people whose anger doesn't provoke action. You have to find people who have no privilege, who have no clout, whose complaints fall on deaf ears. And so we start with our worst technology.
03:00:44
We inflict it on prisoners, mental patients, asylum seekers, young children. And then, you know, it's blue-collar workers, parolees, older children in elementary schools. And then it kind of works its way up. And you know, you can see this at play: 20 years ago, if you were eating your dinner and there was a CCTV camera up in the corner watching you eat,
03:01:08
it was because you were in a supermax prison. And now it's because you bought an Apple or Google or Amazon home camera system, right? So, you know, science fiction writers can't predict the future; anyone who claims they can predict the future is full of it. Fortune tellers are all con artists or deluded or both. But if you want to at least get a kind of leading indicator of your technological future, look at what people with less privilege right now
03:01:32
are being subjected to against their will. That's your future, because that's kind of the trajectory that this stuff goes in. It's not foreordained, but you know, a body in motion tends to stay in motion unless it encounters some friction. And so in the absence of some friction, eventually the trajectory of this thing is going to reach you and then go past you and will still roll over people even fancier than you. This is Trump going, wait a second,
03:01:57
they wiretapped my devices while I was running for president, I thought that was only supposed to happen to people I didn't like. You know, that's the technology adoption curve. And for many years we have been trialling invasive EdTech on much younger kids, starting with elementary school students and then working up to secondary students. The most aggressive version of this was censorware,
03:02:21
which is supposed to stop you looking at pornography and other undesirable materials on school networks. And the thing about censorware is that the market for censorware is only secondarily educational. The primary market is usually oppressive governments. And so when we look at the companies that provide this, you know, their major customers are the King of Bahrain, right?
03:02:45
Or the Syrian ruling elite, right? And what they do is they take this tool, and, you know, this is more of the technology adoption curve: these companies take this tool that they rubbed the rough edges off of by subjecting people living under totalitarian regimes, run by war criminals, to it. And then they make students use it,
03:03:09
make younger students use it. And I think what we're seeing is the kind of invasive tech that we used to apply to minors because they were minors, where we just said, oh well, they're a kid, they don't have human rights, their human rights take a back seat to our need to protect them. Even if that protection comes in the form of these dingo babysitters, right? These war criminals who we give the entire clickstream and message history and email and everything
03:03:33
in order to keep them safe. Even though, in theory, someone who's cozying up to Bashar al-Assad is someone you need to be kept safe from, not someone who will keep you safe. And now it's just moved up the privilege gradient, right? And, you know, it had some intermediate stops. There was an absolutely ghastly report on women
03:03:58
who are sucked into doing work-from-home phone banking, call centre stuff, where they actually are misclassified as independent contractors. They have to pay to get trained to be, like, a Disney call centre operator. Their calls are listened in on all the time, every click they make is watched, every character they type is watched, and so on. And it's a pyramid scheme: the only way to get ahead is to recruit your friends.
03:04:23
The majority of them are African-American women. You can be fired at any time, but if you quit, you have to pay a cancellation fee for your contract. And so we see that kind of crappy technology and social arrangement being visited on these people. And now Microsoft has just announced that Office 365, which most enterprises are now using, has like 78 metrics that it automatically compiles by default
03:04:48
on every person in the company and then produces these leaderboards for managers. And then, to watch it run up the shitty technology privilege gradient really quickly, those managers are then incentivized to do the same thing to their whole company, to authorise Microsoft to take all of the data generated by their whole company and everything they do
03:05:11
and use it to give them metrics against other companies, to say, you know, your employees are 60% as effective as these other companies'. That's kind of the ugly totalitarian face of it. But it's also all of these companies having all of the data that details their operations given to a company that is this diversified monopolistic conglomerate
03:05:34
that probably competes with them in some way. And so now you've got this competitor that is sucking up all of your operational data, which they can then use in ways that can come back to harm you. So, you know, it goes from prisoners, to elementary school students, to African-American women working phone banks at abusive pyramid-scheme firms, to university students,
03:05:59
to white-collar workers, to white-collar firms, until the actual businesses themselves have now been kind of subsumed into this logic. And so, yeah, we definitely need a big old reset on the way we think about this stuff. - That's depressing. Thanks. (laughing) - Well, you know what, if we're ready to push back,
03:06:24
then we got a lot of allies. - Yeah, we do. And speaking of which, we're going to invite Ian in to have some closing remarks today for the event. So thank you so much for your contribution today. - Thanks for having me. - Ian, I just realised I don't have an introduction for you. Hey pal, how's it going? - I'll speak to that. - I'm going to disappear then. This is Ian. - Hi. - Thanks Brenna. And thank you all for joining the teach-in against surveillance today.
03:06:53
I'm so glad that you were here to share this powerful learning experience and to all the volunteers, teachers, participants, I really hope that your hearts are feeling full because we needed this. We're a community of communities. We don't always get to come together but today we are one community with shared values and purpose. And I don't want anybody to forget this well-earned moment. It's almost time to peace out from the teach-in
03:07:20
but thank you for sharing these last few minutes with me. My name's Ian Linkletter and I have the dubious honour today of being the dude being sued. Three months ago on September 1st, Proctorio filed a strategic lawsuit against public participation against me to try to silence me and to try and intimidate all of you. And three months later, here we are. So how's that going? I don't know.
03:07:43
In their civil claim, Proctorio said that I've harmed them, and that because of my tweets, and this is a quote, "Students could change their behaviour or adopt strategies to circumvent the software." Students could change their behaviour, and that's bad. They're not supposed to know how the software they're required to use works. To begin, I guess I'd like to start by answering Brenna's question, which is one that I've been getting a lot.
03:08:10
I'm doing okay. Thanks for asking. Sometimes I say that just because I'm being polite, but today it's true. And it's true because of all of you. The lawsuit has been a really big burden, as intended. It's cost me financially. It's cost me emotionally. And only my wife and family are ever going to really know. I'm forever grateful because, thanks to their help,
03:08:33
this experience is not going to defeat me. And because of your contributions, we're going to withstand this lawsuit. And with your voice, we will make Proctorio pay. On September 2nd, Proctorio went to the BC Supreme Court for an ex parte, without-notice hearing about my tweets. It sounds kind of fun, ex parte, but I wasn't invited. A judge granted them a temporary injunction
03:09:00
preventing me from distributing copyrighted, confidential, technologically protected materials, which is more commonly known as tweeting YouTube videos. So until I get my day in court, my freedom of expression is constrained. I can't afford to take chances. And until my case is heard, I'm watching every word. So out of an abundance of caution, I'm not able to share with you today the specifics
03:09:25
of why I think their surveillance software is harmful or discriminatory or biased. And that's why I'm really happy to speak last, because you know so much now. So let's make Proctorio pay together. The first step in making them pay is to say their name. When you're talking or writing about Proctorio, say Proctorio. It works for any unethical company; that's just kind of on my mind lately.
03:09:52
Say their name, and never let anybody forget what it means. This simple practice makes it safer for everyone to participate in the conversation. It helps people find us. Companies can afford lawsuits, but they cannot afford to lose market share. And call surveillance by its name too; I was happy to hear that today. If a technology is surveillance technology, say so. If the technology serves no pedagogical purpose,
03:10:18
don't call it EdTech, because it's not. Resisting surveillance happens one conversation at a time, and it only takes one person to hold the line or stop the curve. Be that person. And so I'm going to take a deep breath now, because I've told you to do something that's not easy. So here goes: Proctorio is academic surveillance software. It's easier the second time you do it.
03:10:44
Proctorio is academic surveillance software; it's not EdTech. And they may say that you're being dishonest. This is from the lawsuit: "The defendant falsely asserts that the software is an academic surveillance software product. The software does not engage in academic surveillance, which implies undisclosed spying. Rather, the software provides an automated alternative
03:11:08
to live, in-person examination proctoring," and so on. This is greedsplaining, and you can ignore it. Proctorio, they're not scholars like us. This isn't their conversation, so flex your freedom. Proctorio doesn't like being called surveillance because it's a much harder sell than "learning integrity platform" or "proctoring solution" or whatever else. Use the correct language
03:11:31
and you can change the way that it's funded, who has to support it, how it's perceived and most importantly, how it's regulated. We can stop the curve and hold the line against surveillance by speaking freely, truthfully, and without fear. You only need to look over your shoulder or keep your voice down if nobody has your back and we have you covered. So one more time.
03:11:56
Proctorio is academic surveillance software. I'll sleep soundly tonight. Listening to students is a deliberate practice that will help you design learning experiences centred around trust, and we've talked a lot about that today. Many of us are in roles that don't necessarily interact directly with students. Personally, my role works mostly with faculty, but student voices are all around us, especially now that we're online.
03:12:23
So just seek them out, and you'll know really early if something is wrong. And share those voices with people that might not know. Proctorio says that they're going to surveil over 20 million exams this year, and that's horrifying to me too. Thank you, Jesse, for saying that earlier, because when they brag about growth, I think about harm. Students' outcry against surveillance is everywhere,
03:12:46
and it should drive all of our advocacy. So go to change.org and search for Proctorio or ProctorU or ExamSoft, whatever it is. Read the petitions and read the carefully researched concerns of students. And if you're on Twitter, you can follow an account called Procterio, which tweets three student voices every day. Every day, there's more harm.
03:13:12
So we have to listen to students. We have to believe them, and we have to protect them. Before I say farewell, and we only have a minute left here, I want to provide a bit of perspective about the lawsuit and the figures involved. So today we've raised over $7,000, which is amazing. A lot of people have given so much to make today a success, and I'm never going to forget it.
03:13:38
So what this $7,000 means is that my GoFundMe is almost at $50,000, which, with my own contributions, means that we're about to exceed $100,000. But I don't know how much this is going to cost. Proctorio has already stated their desire to take this case as far as it'll go. On Thursday, we received a notice of constitutional question, which initiates a jurisdictional test of BC's anti-SLAPP legislation
03:14:04
and the federal Copyright Act. Joe Arvay and John Truman of Arvay Finlay, my lawyers, have my complete confidence, no matter how far this goes. And you can read our filings online at defense.linkletter.org. But here's the thing: $100,000 is just not that much to Proctorio. A legal victory for me doesn't make this a bad business decision for them. What we need to do is make them pay by saying their name.
03:14:32
If you're a student, talk to your profs. Tell them how exam surveillance makes you feel and ask if you can opt out. One opt-out opens the door for everyone else. And that opt-out may have your professor talk to the learning support team and ask about alternative assessments. They may have a eureka moment and decide that high-stakes exams just aren't the way to go. One opt-out, one course, one professor, one contract,
03:14:57
one relationship, one conversation at a time, we can cost Proctorio so much more than this lawsuit, and hundreds of contracts are coming up for renewal in the next several months. So this is our moment to work together against surveillance. Normal is up to us, and we've got this. Thank you all so much. Thank you. - Thank you, Ian.
03:15:26
It's a perfect way to close out the day. Thank you so much to the folks who contributed time and energy today. Ian's costs will be ongoing even after this event finishes, as he's just very eloquently explained. So there's a link on the againstsurveillance.net site, where you can get to Ian's GoFundMe in order to donate, if you haven't yet,
03:15:50
or if you wish to contribute again as this progresses. And the event will be archived in that space, so you'll be able to send people back to check out the video if they have questions or if they want to understand more about any of these issues. I hope you'll make use of it as a teaching and learning tool if you're in a teaching position, and I hope you'll share it with folks with decision-making power. That's it for us.
03:16:14
We are now officially over time. Thank you all for being here and for the wonderful live chat and the tremendous discussion on Twitter. I know we will continue it.
End of transcript