Interview – Listen to LTIC’s Academic Director, Dr. Elisa Baniassad, talk about courage on the “EduTrends” podcast

Dr. Elisa Baniassad.

Elisa sat down with Dr. José Pepe Escamilla, Associate Director of the Institute for the Future of Education at Tecnológico de Monterrey, to discuss a variety of topics including her education journey, AI adoption principles, and the importance of leadership from educators in a world filled with AI.

Elisa would like to extend a special thank you to Dr. Escamilla for having her on the podcast.

Have a listen below!

Accessibility Note:

The Spotify player embedded below contains its own heading elements (of various levels). This may result in inconsistencies in this page’s heading hierarchy.


Transcript

Voiceover:

“Tec Sounds presents ‘EduTrends’.”

José Pepe Escamilla (Host):

“Welcome to EduTrends, presented by the Institute for the Future of Education. This is a Tec Sounds podcast, the official podcast of Tecnológico de Monterrey. I am José Pepe Escamilla, Associate Director of the Institute for the Future of Education.

“Today’s guest is Elisa Baniassad, Professor of Teaching from the University of British Columbia. Elisa, it’s a pleasure to have you here.”

Elisa Baniassad (Guest):

“I’m thrilled to be here! Thank you so much for inviting me.”

José:

“Thanks to you for being with us!

“I would like to start with a personal question: what inspired you to dedicate your career to education and, specifically, to the field of computer science?”

Elisa:

“Yeah, I mean, I think – that’s a deep question. I think it came from: my whole childhood, my dad was an educator, and I listened to him every single day talk about the – kind of – activity of education; not even the discipline (he taught architecture), but really, the act of teaching and of trying to work with a student as an individual person, to really meet them where they were, and also build this deep sense of trust and relationship as a person working with another person. The educator relationship was so primary.

“And I think I just kind of became – kind of – enamoured with it, this idea of truly personal education. And that was really in contrast, sadly, to the education that I experienced as a child, which was very de-humanizing.

“I grew up…my first few years were in England, [with] its very Victorian system, and I didn’t feel like an individual who was being educated. I kind of felt like a problem for my teachers. And that continued through to my life in Canada (actually, I have various reasons that I’m an annoying student).

“And I just noticed the way that he talked about his work as an educator was so dramatically different from my experience as a student. And, it just, it was sort of really sad.

“And then, I think the reason that I went into education and computer science – I mean, it was interesting…I think I just was quite good at computers. And so I could have gone into any field, really. It was kind of an arbitrary choice. I assumed I would go into architecture; I kind of got side-quested over into computer science because I was like, ‘well, these are very neat machines. I really like solving puzzles and they’re the ultimate puzzle.’

“But the big puzzle that I really fell in love with was educating at scale and keeping that individuality and that sense of individual dignity of the student. How do you do that when you have 600 students in a room? How do you maintain the dignity and the importance of each individual’s voice in a room that size?

“So, that’s really what brought me to where I am right now, to the role that I’m in, and to really leaping in to teaching huge classes, and making that, really, a life mission – that sense of that personal nurturing of the student.”

José:

“Very interesting and personal story, Elisa. Thank you for sharing. And I think this will connect also to your work, which is – part of your work is on the use of AI in education.

“I happen to travel a lot. I talk to a lot of people, and sometimes I’m with professors that are angry (or teachers), because AI came to schools, you know, and I say, ‘well, you didn’t choose AI’; you didn’t say, ‘I want AI to come to schools’, but that’s something that happened. It’s now there, and – for good or for worse – we have to find [out] how to deal with AI.

“And part of the advantages of AI is that eventually we can do things to scale, but also there are some worries, some worries about ethics, privacy, fairness. So, in your opinion, what can guide universities to adopt AI guidelines that are ethical [and] address the concerns of privacy and fairness?”

Elisa:

“Yeah, I think one of the major principles that all educational institutions have to truly commit to is courage. Institutions need to be truly courageous in looking at, honestly, the truth of these technologies, what they afford, and the challenges they pose. And they have to be willing to engage in the discomfort of the trade-offs with each of these principles.

“So: fairness is often about affordability, but the most affordable thing is the cheapest off-the-shelf thing, and that will compromise ethics. And it will compromise privacy. So it’s almost…like, you have to try and hold this tension, and you have to be really transparent.

“And that’s another aspect of courage: honesty. That these are the choices that we have made. These are the reasons that we’ve made these choices. This is how we will be agile if the landscape changes.

“You also can’t make any kind of institutional decision that you’re going to lock into for any duration of time, because the technology and the hardware supporting the technology is changing so rapidly that even the trade-offs will look different in a couple of years. But if you can stay pinned to your institutional goals of academic excellence, foregrounding the individual student, and maintaining that courage of communication and transparency, then you can hold this tension. You can’t maximize all of them, but you can make clear choices and then readjust as you see that things were maybe overcorrected in a certain direction.”

José:

“Very good approach of your answer. These tensions that we have with things that are accessible because they’re free, but sometimes it’s – uh, I don’t know how to say that in English – a ‘poisonous gift’. We can say now that it’s something that is given, but there are some things hidden. So transparency…”

Elisa:

“Yes; ‘strings attached’. Definitely.”

José:

“‘Strings attached’.”

Elisa:

“Yeah, that’s right. I mean, it’s that classic thing. If something is free, then maybe you’re the product. Maybe your data is being listed.

“So I think protecting students from having to make those trade-offs themselves without clear understanding; also, protecting faculty from making poor choices. I think this technology is going so quickly, we can’t expect everyone to both be a disciplinary expert and an expert in this new technology.

“Institutions play a huge role in centralizing the knowledge about the technology so that faculty don’t have to sidebar over into suddenly becoming experts about AI. They should get help with that so that they can navigate it responsibly in each of their own disciplines.”

José:

“Okay, and also create spaces for them to disclose those implications.

“In your initial answer for the question of how you [became] an educator, you mentioned that one of the things is something that I think you learned from your father. [I’m not] quoting you, but I just…It’s not exactly what you said, that…’the student not as a number, but as an individual person. Not as a total, completely individual person.’

“So how can institutions, professors, [and] teachers embrace AI [while] keeping the human at the centre in what we do?”

Elisa:

“Yeah, absolutely. I mean, my sincere hope is that AI will be an avenue for centring the human (paradoxically).

“So: technology is always, kind of, supposed to set people free from all of the annoying tasks that distract them from human connection (and leisure, and philosophy, and engagement in the world). And AI is just a tremendously powerful one, especially because of the hardware that backs it. I mean, we’ve never put this much computation power towards any technology before, at least not for the populace. And so we’re seeing the populace now using computation at a level it’s never used before.

“We should be using that to regain the opportunities for human connection. We should be actually using this as a way to offload tasks that are busy work, so that we can think more about our disciplines, so that we can think more about our teaching.

“So one of the things that I do every year as an educator is I mess with my gradebook to try and get it to be correct (or whatever). Like, I have several hundred students and I’m sort of building out some sort of view, and I’m trying to look to see if, you know, oh no, which students didn’t get enough help, and can I do an intervention later? And I just sort of stare at this thing. It takes me hours to construct it.

“This is the kind of thing that AI could make for me, so that then I could still be the one to look at it. I don’t want AI to look at it for me, but I want it to save me the 15 hours of putting the thing together. That kind of task.

“So, letting me engage in a human way, so that I can do human tasks, and not actually outsourcing the human aspects of my job. I don’t want the AI to teach instead of me. I want it to do the clerical work, so that all I have to do is teach, so that all I need to do is relate to students, so that all I need to do is talk to them, so that I can assess their knowledge.

“I think…when we look at the way that education has been practiced for the last…well, since our institutions were founded, and my own educational experience as a child (and to some extent an undergrad and high school student), it’s not necessarily that human, right? It’s about…it’s this, you are treated kind of more as a number or as a problem. You’re not seen as the point of the exercise.

“And that, really, is what students are! Each individual student is the point of the exercise. And my real hope is that this technology frees us from all of the scaled issues that we’ve had in the past so that we can remember that students are actually the point. They’re not our problem. They’re our purpose. So yeah, that’s the…But that takes commitment, because if institutions don’t see it like that, then it could go in any direction.

“The commitment to seeing the student as the main purpose of education, which when you say it, it’s like, ‘well, of course, that’s what they are’, but I don’t know. I think sometimes institutions lose track of that. We put students into huge exam halls that terrify them, and then we try and figure out what they learnt, what they can relay while terrified. That’s not a very human experience.

“We test them in ways that make sure that they’re a little bit off kilter. We don’t let them have materials that they’ve been relying on, all so that we can supposedly get these robust grades for the students.

“Some people really believe that those are the things that are important about education, and they aren’t seeing that it is actually the student’s mind that is important about education, and that we should be thinking about the mind of the student, not about just what we can observe about the student in a given, terrifying moment.

“So, I think AI, because it can provide such bespoke help, you can ask it to do any old thing. And because of that, it can fill all the crannies of things that we normally would have to do for ourselves by doing filing or keeping a spreadsheet (or doing whatever), and we can concentrate on the individual minds of the student.

“But I think that leadership has to come from the top of the university. The university has to loudly proclaim this philosophy, and loudly commit to get everybody on board with this mentality and this mindset. So it’s a big challenge.”

José:

“Yes, a very big challenge.

“So, elaborating on what you are saying, I will say that artificial intelligence is making [it] financially possible that a professor can have a more personal contact with his students, you know, because there are so many things that you normally don’t do because you teach for the media[n] of the class and you don’t have a lot of time for many individual interaction[s] with your students. But with AI, there is a lot of repetitive work or work in the background that the AI can do for you so that you can dedicate time to the students, either individually, or in clusters, or in different ways.”

Elisa:

“Yeah, exactly.”

José:

“So, in order for that to work, you say, well, ‘universities’ leaders have to be very bold in creating that vision’, because it’s something that we have to be intentional [about] in the way of using those technologies. So, what [does] this look like (this intention)? These learning technologies, implemented in ways that strengthen this human connection?”

Elisa:

“Mmm hmm. Well, I think, letting the…yeah, I mean, it’s a good question. I think, letting educators really drive the technology support.

“So, one of the big challenges is being agile to what people need in a changing landscape. It’s very difficult. I think the way that education technology usually works is you buy a product – like a learning management system – and then you keep it for 10 years [laughs]. Because people don’t like tech change. They don’t want to think about it. It’s hugely problematic to get off of one video platform and move to a different video platform. But now we have this thing that is changing constantly and educators are using it in a different way every single semester.

“There’s no…like they’re modifying with such rapidity. Sometimes they’ll go to no use of the technology, or they’ll go to ‘just for one type of thing’ and then they’ll use it to create materials; they’re just using it in such different ways. I think we need to listen to the academics about what they need so that they can do aggressive experimentation, safely, and not have to be tech leaders to be able to engage in that experimentation.

“And we have to provide, really, the fabric of the institution. Like, central administration in institutions has to do its role of being central. So, it has to provide the communication fabric so that educators can share what they’ve learned, what worked and what didn’t, what they’re worried about, in, kind of, you know, that mesh that we really need from a collegi[um].

“If we were just 12 faculty or something, sitting around, we would come to the table each week and say, ‘here’s what we’ve learned.’ When you have 7,000 faculty, central administration needs to play the role of bringing people together so that they can learn from each other. It should not be top-down. It should just be supported, heavily supported.

“So I think that’s really the answer, is: letting faculty drive the need, and also, believing them when they say they don’t want to engage with the technology (definitely not forcing it). There are some fields that don’t require that, or only require it for just the clerical aspects of the role that the educator would engage in, but really, organically, relating to the subject matter is vital for students. So they shouldn’t be encouraged to use the technology at all.

“So I think it has to be how each individual educator wants to connect with the minds of their students, and then just being there to support them in that.”

José:

“Yes, you’re right that there are some moments in which the use of AI can be detrimental for the learning experience of the students. So, I want to go into that direction also.

“I will start first by stating – you may be in agreement or not – that [some] of the most important skills for the future [are] those skills that we usually call ‘soft skills’ (I like to call them ‘power skills’). And I will say, one [group] of them is things that we usually think are very human, like empathy, motivation, communication, et cetera. But also, a group of competencies that I classify as ‘complex thinking’.

“So complex thinking is…it has inside [it] critical thinking, scientific thinking, systems thinking and creative thinking. And also, all those kinds of things that are important in the future because they will make you a critical user of AI, [so] that you will understand better the limitations, the benefits, no matter if you are just a casual user, or you’re a medical doctor in the future that is a heavy user of AI, or you’re a writer in the journal that [is] a heavy user of AI, whatever discipline you are in.

“Nevertheless, because of the way that students are using AI, they do cognitive [offloading] onto the tool. It’s like if I register [at] a gym, I pay the gym, and [then] I ask someone else to go to the gym for me. And that’s more or less a metaphor of what can be happening with many students – that in order to develop these higher order skills, you really have to go very in-depth [into] the disciplines. Whatever discipline you are doing, if you’re a chemical engineer, you go very in-depth in there and then you develop critical thinking or scientific thinking (or whatever). It can be the humanities…(no matter where).

“Also, if you’re using these tools (you’re relying on these tools), then there’s a paradox now, that: how can you develop those important [skills] if you’re relying too much on those tools? So I wanted to ask you if you have some thoughts about how we can prevent this dystopian future that [I’m] presenting that could happen.”

Elisa:

“Yeah, I mean…yeah. Outsourcing your thinking, your empathy and, really, any difficult process to the tool is what you’re talking about. Like, I think that individual connection with students is maybe the key, because if you are working, let’s…you know, leaning into your ‘gym’ metaphor: if you have a personal trainer who’s there, who’s talking to you about why you should do this, and to push yourself harder, and to do your exercises, or you’re going to physiotherapy…you could pay somebody to go to physiotherapy.

“But if you’ve built a sense of trust with your physiotherapist, you feel you owe it to them (to some extent) to go in. You go because they are going to be there and you don’t want to stand them up. You know they care about you. You know they’re going to push you, but you also trust that whatever exercises you’re going to do are going to feel really hard.

“It’s a commitment. And you have put your growth in their hands. You said, ‘I want to grow in this way; I know I’m the one who’s going to have to be painfully growing, but I know you can guide me in my growth. I trust you. I’m not going to get injured. I’m not going to get too tired. I’ll still be able to go to work afterwards’ (or whatever it is). And I think that’s the role of the educator: the role of the educator is to build that trust with a student so that the student is willing to embark on this painful journey of learning something.

“Learning is painful! It hurts your brain. I remember sitting in a math class, like, some calculus class or something, and thinking, like: ‘I feel like I can feel my brain paining’ (being in pain) (like the neurons were reconnecting or whatever). I could feel my assumptions about the way things work having to be torn down so that I could build new ones. I could almost feel it happening and it was so painful. But just like [a] really great exercise session, it was a positive pain. And I do think that students shy away from discomfort.

“I think there has been some conflating of discomfort; of personal, emotional discomfort and intellectual discomfort. I think we have to let them trust that intellectual discomfort, just like exercise discomfort, is okay and it can be done safely. And to do that, you have to ensure their social safety (their sort of psychic safety), their sense that they won’t have to apologize for who they are in the room. That you can accept that they have a learning challenge or something like that (some accommodation that they need) and you won’t think they’re bad people for it.

“They feel accepted, deeply psychically accepted, and so they can commit to doing this heavy lift, which is: taking in information and translating it into knowledge that they then maintain forever. I think that’s the, that is truly the key, is really pushing forward that human intervention, because if there’s a crutch this available for students, they’re going to take it. Of course they are! You’d be foolish not to. I do it!

“You know, I’m trying to write an email and I can’t figure out if I have the tone right. Instead of using my brain, I ask an AI, like, ‘can you just check that for me? Is there any way that, you know, somebody could misunderstand what I’ve said in this thing?’

“And it’s so much easier. But at the same time, it means that I’m not doing my own exercise. I’m not trying to push myself in those situations.

“So we don’t always have to push ourselves. Sometimes you eat candy and chips and it’s fine. We don’t always have to be in a state of intellectual tension with ourselves. But there are times when you have to engage and when you have to commit. And you need somebody who’s there to guide you to do it safely so that you don’t get demoralized. You don’t feel humiliated by the challenge.

“You feel instead encouraged by the fact that you found something you don’t know, which gives you something fun to work on next. And really doing that mind activity. But I think, without that humanity, we can’t overcome the lure of this technology.”

José:

“When I’m listening to you, I am trying to think, ‘what are the skills that professors have to have to be that kind of a professor?’ I think there [are] a lot of skills that I’m not sure that all of us have. Some of them [involve] a lot of generosity [and] compassion, but at the same time, self-awareness and awareness of what is happening to other people; being more empathic, more reactive to different cues [from] people.

“And I know that there are some professors that are very naturally like that, but I don’t think it’s part of the standard skill set that we learn when we do some teacher development material ([there is] none on how to be like that). It’s a big challenge.”

Elisa:

“I think that’s true. I think…(ironically, I don’t think we are taught those). Ultimately, these are almost leadership skills. I think academics are taught to think intellectually, and they’re taught to think about their disciplines, and they may even be taught how to be good pedagogically, like what it takes to do good active learning, or what it takes to design a graphic that will engage people, or how to do universal design for learning.

“But to be a good teacher is to actually be a leader. You have to say, ‘we’re going to go in that direction, and I’m going to get us there. And you’re going to have to do the work of walking, but I’m…you follow me. I’m going to take you on a journey.’ And that leadership quality is something, I think, that’s missing in the training of, at least, higher educators, because (I think) we just didn’t realize it was as important.

“I think AI is actually showing us the cracks in our system. It’s showing us that we don’t really know what it is that’s important about education, especially higher education. We thought maybe it was just providing information; well, we don’t need to be the ones to do that. We thought maybe it was giving assessments; ChatGPT can quiz us pretty effectively on topics because it’s really great at that. There are lots of things that we thought were the point of our jobs that are actually not really the point of our jobs. There are even whole learning outcomes that are kind of moot at this point. But I think what’s not moot is that leadership quality: being able to be the grown up in the room; being able to go first; to be the bigger person when a student gets mad at you for something, to not let your ego get involved, to not take it personally, to not get offended. Instead: to be a grown up and just say, ‘I get it, you’re really mad. Let me help figure out why you’re so upset about this and let’s get you past it so that you can keep learning’ and really take on that leadership approach.

“I think…yeah, I think that’s gonna be really key to the next phase of education, because that’s something that ChatGPT and AI really cannot do. I don’t think it can lead people in that way. It can provide a lot of services, but I don’t think it can be a leader in the way that educators need to be leaders.”

José:

“Very interesting reflections.

“I have a last question for you: so, looking ahead in a window of time of five or maybe ten years, you can decide ‘what are the things that excite you the most?’ and ‘what are the things that concern you the most?’ about the future of education and the impact of these technologies.”

Elisa:

“Well, I think what excites me is the idea that we have in our hands the way to get out from under all the stuff we’ve always done, just because we always did it like that. You know, we have all these practices; exams look a certain way, lectures look a certain way. All of these things. These are all imposed on us because of logistical challenges, like I said before. And we can maybe get out from under these. We don’t have to do it like that anymore.

“It’s almost like we’ve been gifted thousands (an infinite number) of teaching assistants. And what would you do if you were gifted an infinite number of teaching assistants, all the teaching assistants you could ever want? What would you do as an educator? What would you do differently? How would you deal out work to those teaching assistants so that they could do some of your job while you’re still the leader (you’re still taking the students on the main journey)? So that’s super exciting. This is [a] really interesting new challenge.

“But I think the flip side of that is the worry: that educators won’t see it like that, that institutions won’t see it like that. They’ll see it as just another tool or technology as opposed to something that can really free us from our shackles from the past. And it’ll just kind of get diluted.

“I guess I worry that it’ll just kind of become part of, you know, the ecosystem; part of the ether. And that institutions and individual educators won’t take up the mantle of this transformation. That they won’t really commit with singularity of purpose to foregrounding individual students. Because this is our chance. We really have a chance to foreground that individual education and that individual courage and leadership.

“So…but I think we can do it. I think this is our, this is our key. If we use it wisely with ethics and fairness and being mindful of impacts, environmental impacts, human impacts, cultural impacts, we have a real chance here.

“Technology can really help us out, or it can kind of be frittered. So that’s my excitement and my worry, all in one.”

José:

“All in one, great. All in one. And I think that’s a very good ending [for] this conversation – that we, as either educational leaders or professors (in whatever role we are), have to be very intentional in designing the kind of future that we want to have, and push in that direction to be more inclusive, fair, and look for the benefit of everyone.

“Thank you, Elisa. Thank you for taking the time to share your insights on education, on the purpose of people, on these things around inclusion and how to be a good educator in the future, with, I think, a very human-like…kind of things that we have to be more aware [of] as professors and as leaders, also – educational leaders.

“This has been a fascinating conversation and I am sure that our audience will like it too.”

Elisa:

“Well, I’ve enjoyed it very much. Thank you so much!”

José:

“Thank you. See you soon.”

Elisa:

“See you soon.”