“Technology is a useful servant, but a dangerous master.” Teach Different with Christian Lous Lange – Technology
How do we prevent technology from controlling our lives?
Today, Steve and Dan Fouts tackle an engaging quote about technology from Norwegian educator and Nobel Peace Prize winner Christian Lous Lange. Through connections to World War I, artificial intelligence, social media, and more, we contemplate the proper role technology should play in shaping our existence.
Teach Different serves educational institutions, families, corporate entities, and mental health communities. If you think the TD method could be effective in your setting, we’d love to hear from you! support@teachdifferent.com
Image source: Store norske leksikon | Av Historia/REX/NTB. | Public Domain
Transcript
Dan Fouts 00:00
Okay, well tonight we've got an interesting quote. I think this is our first quote directly related to technology, and it's by Christian Lous Lange. If I'm not pronouncing that right, I apologize. He was a Norwegian educator and political scientist; he was actually a secondary teacher, I was doing a little reading on him. He was one of the thought leaders behind the idea of internationalism, and his work led to the creation of the League of Nations after World War One. He received the Nobel Peace Prize for a lot of his work in 1921. So, interesting guy. Here's his quote on technology that we're going to break down today. This is great. “Technology is a useful servant, but a dangerous master. Technology is a useful servant, but a dangerous master.” Claim. What's going on?
Steve Fouts 01:18
I don't know why I'm interested in the historical context of this. I'm wondering what type of technology he was thinking about back in the early 20th century. That's just a passing thought.
Dan Fouts 01:36 Claim
Can I make a guess here? This is coming on the heels of... I don't know exactly when he said this; I should have looked it up, my fault. But the technology of militarism leading up to World War One, you know, machine guns? Just the creative, efficient ways, unfortunately, that human beings devised to kill each other. He might be referring to some of that. That's definitely a possibility.
Steve Fouts 02:12
Maybe, but you know, human beings have to hold the weapons, and they have to pull the trigger. So I don't know in what sense technology is a master in that context. I'm not sure. So I'll just go to the more general claim that I think he's making: “Technology is a useful servant, but a dangerous master.” I looked up what technology means: the application of scientific knowledge for practical purposes. So he's saying that scientific knowledge, applied for practical purposes, is useful to us. But we don't want non-humans to be in control of technology. We need to be there; there needs to be something else in control of how technology is used. Because when technology is the master, when it has that power, it's going to be a problem. And this is where it does get interesting. What was he talking about? Because he didn't know about AI. We know that, and we're going to discuss that together. But that's what I think he's saying: technology is good as a tool, but don't rely on it to make decisions and to start dictating things to human beings, presumably.
Dan Fouts 04:03
Well, right. I'm with you on the claim that you can't let it control you. You can't let it be the master, and I think that's a metaphor for anything you create. I'm going to go back to my military hypothesis here. You can't let the creation of war technology be your master. Just because you've created it doesn't mean it should fall out of your control. You just need to be careful with it. And I think that's what happens: people create technology, whether in a military context or, when we think about AI, we create all this technology, but then there's always a danger that it's going to control us and we're going to lose control of it. So I like his “technology is a useful servant.” It's something that serves your needs, but not the other way around. I like how he says it.
Steve Fouts 05:08
Yeah. And he obviously was not alive for the development of the atom bomb, if you want to talk about a good application of this quote, maybe a harbinger of what he saw coming with regard to weapons. Yeah, technology needs to be relegated to practical uses. And he doesn't bring up morality here, and I don't want to read too much into this, but part of me thinks the reason it's a dangerous master is because it doesn't have a conscience. It doesn't have a sense of right and wrong, and people do; some people, hopefully most people, have a sense of the appropriate use of technologies developed for either good or ill ends. If technology is running the show, it's probably going to take the most efficient route, not the route that's the best, or the most virtuous, or the most moral.
Dan Fouts 06:22
The beings with the morality, the beings with the ethics, are the human beings who confront the ‘should’ question. How should we use technology? Technology is often about the how: how do I do this? Let's create a way, the most efficient way, here's how I can do this. But then you need the human being as the master saying, well, should we use it in this way? And if we should use it, how should we use it? Those are questions that must be answered by human beings as the master. I see his claim: technology is a dangerous master because it's devoid of humanity, if you let it get to that point.
Steve Fouts 07:12
Yeah, and we're inferring that. You know, “technology is a useful servant, but a dangerous master.” The other thing I want to point out is that we associate the word technology with things like AI, and the internet, and coding and programming. But technology, in its truest sense, is something as simple as a pencil. It's the wheel. These are all technologies that were developed to make something practical and efficient that wasn't beforehand. And it took foresight, it took an ability to look at nature. I've got a definition for science here: the study of the physical and natural world through observation and experimentation. So what were people doing when they made the first pencil and the first wheel? They were experimenting with things that may or may not work, trying to make it easier to write things down, or to get from point A to point B, and they had to try a bunch of different things. Think of the Wright brothers, and the first flights, and the experimentation there. So technology, I'm just saying, is a broad subject.
Dan Fouts 08:34
Are you supporting the claim here, or are you just defining technology?
Steve Fouts 08:37
Defining technology, because I think that's going to help in assessing this claim, but really more the counterclaim as well.
Dan Fouts 08:50 Counterclaim
Yeah, let's hop to the counter. We don't have to wait for the counterclaim. Let's just get there, right?
Steve Fouts 08:55
You want to take it? What's the counterclaim? “Technology is a useful servant, but a dangerous master.”
Dan Fouts 09:01
Technology is not always a useful servant. I think technology sometimes is a waste.
Steve Fouts 09:11
Okay, give me an example.
Dan Fouts 09:13
I've got to think of an example here. Okay, what would be a good example of technology being a waste, not useful?
Steve Fouts 09:25
I'm going to say this in the most positive way about what's happened to the education profession over the last, say, 10 to 15 years. There's been kind of an obsession with the use of data to drive instruction and to drive decision making in the schools, where there are some very strong opinions on the part of people who have a lot of power in the education system, saying that, you know, behaviors are extremely important to look at, and quantifying students' reactions to certain teaching methodologies and test scores, turning student improvement and learning into numbers. I think that is not useful and practical if that's all you're looking at when you're trying to decide whether or not learning is occurring, or whether you're evaluating a teacher, or whether you're deciding whether or not a school is doing a good job. I think it's not useful. I'm kind of agreeing with you.
Dan Fouts 10:49
But you're talking about quantitative data being not useful. So to the extent that technology is used to render the data, you're saying it's not a useful servant? Am I understanding you right?
Steve Fouts 11:06
Yes, technology is the application of scientific knowledge for practical purposes. So the application of scientific knowledge is happening to the education system, and that's creating all these metrics and benchmarks and ways that people are using to measure success.
Dan Fouts 11:31
Yeah, they're using technology tools to measure success. I get that. Definitely.
Steve Fouts 11:38
And I think that can go too far in the extreme. And it misses a lot of what education is, which is more about inspiration, more spontaneous.
Dan Fouts 11:52
We have to put up all of our artifacts and our observations and fill out so many forms using technology; so much data is mined and put into an online space. And much of it is really not useful. Some of it is, but a lot of it isn't. So that's the angle I would take on the counterclaim. I think people assume, with technology, that just because you can do things, you should be doing them. And I think that's the mistake we make: we go from the how and we assume the should without questioning the should.
Steve Fouts 12:37
Well said, well said, that’s really good.
Dan Fouts 12:42
I'm looking at the “…but a dangerous master” part, and the counterclaim to that. Could it ever... hmm. The question in my head is: what would be a good example of technology being a safe master?
Steve Fouts 13:06
I've got a slightly different angle on that. Think of all the despots and the tyrants we've had over history, and the people running countries and running organizations who are not up to any good. People are dangerous. The one thing about technology is that technology doesn't hold grudges. Technology doesn't seek vengeance. Technology doesn't have psychological hang-ups. I guess, unless you program them in.
Dan Fouts 13:49
Yeah, somebody's programming it, so somebody might have a bias.
Steve Fouts 13:54
You get my point. And this is kind of getting into AI. I don't know what you've heard about AI, but I hear people talking about how the machines are going to take over now. Oh, we're letting them think like us, and all of a sudden they're going to start thinking on their own, and they're going to be making their own decisions, and they're going to be freed from us. That's what people are saying is going to happen. Okay, it's scary. Scary.
Dan Fouts 14:31
You’re back to the claim. That’s a dangerous master.
Steve Fouts 14:36
I was admitting it's scary. But does it have to be? Is it always scary? I mean, how are we doing now, with people in control? How are we doing in the world?
Dan Fouts 14:51
Think of a driverless car. You know, there have been some terrible accidents with them, but on the whole, at least the car manufacturers would argue that it is safer to be in a driverless car than it is to have the car be subject to the whims of a human being making decisions. So in that case, you could say a driverless car, where the technology is the master of the vehicle, is a safer choice, not an unsafe choice, not a dangerous choice.
Steve Fouts 15:38
But the moment we think it wasn't human error, that's when we get nervous. We're team human; we're biased. Most people who think about AI would say that, in the end, human beings are who you want in control of this. You want AI just helping us do things so that we're more impactful, more powerful. The minute we lose control of that... I don't know many people who like that idea. I know there are some, though, the ones who know how to program this stuff. Sure. And part of what I think the issue is with them is that they're just curious. No one ever stopped a scientist from pushing the limits of what they're trying to figure out, and what parts of nature they're trying to predict. You can't stop a scientist. Knowledge...
Dan Fouts 16:41
Knowledge for knowledge's sake. I can't believe we missed this one. In a classroom: social media. To what extent is social media “a useful servant, but a dangerous master?” Would that be true with social media? TikTok, Instagram…
Steve Fouts 17:12
Are you saying it's useful in the sense that it connects us very conveniently with people all around the world? And it gives us chances for new friendships, new relationships; it's entertainment; it's free. You know, it's useful, right? But you finish it: what's the dangerous part?
Dan Fouts 17:35
The dangerous part is you're sitting in class and you can't get off your phone and pay attention to the teacher. You're addicted; it has become your master. You have lost control of your ability to resist the fantastic satisfaction of getting a text, or a like, or whatever. You are kind of captured by it. And I think that can be a negative. So I think I'm flipping back to the claim here. That would be a fruitful discussion. We'll definitely talk about that in the lab next week, how we could flesh that out.
Steve Fouts 18:21
That's definitely where I think you'd want to go with social media. Dangerous master. I have another way you could argue social media is dangerous as a master. There's research out there about, I think, especially teenage girls, and how they're suffering from an epidemic of low self esteem and just kind of social dysfunction. Why? Because they're measuring themselves, their value and their worth, based on who they see on social media. And they're not being authentic and believing that they have a lot to give to the world, regardless of how they look compared to this airbrushed celebrity who's on there, or worse yet, their good friend in the school who has more followers than they do. It's become a master of them. It's become kind of their criteria, their judge. And that's not healthy, you could argue. Of course you're going to argue that; of course it's not healthy.
Dan Fouts 18:28
Definitely, I mean, that's a good application of it. Also misinformation that is spread over social media. It's a great way to get information, but when it becomes your only source of information, you're subjecting yourself to being controlled by a media source that often dispenses inaccurate information. And if that's all you're relying on, that's a danger not only to yourself, but to society and to being a good citizen. All right, this is something that will be great in class; kids will connect with this.
Steve Fouts 20:27
Absolutely. They know what technology is. So what do you think about an essential question? I mean, I jotted one down.
Dan Fouts 20:37 Essential Question
I have just some basic ones. When is technology most useful? And when does it become dangerous?
Steve Fouts 20:48
How does it become dangerous? I have: is technology necessarily dangerous? I mean, the answer to that is no.
Dan Fouts 21:05
Here's one that maybe blends both of them. What are the best ways of using technology without it becoming your master? What are the most useful ways of employing technology without it controlling you? There are different ways to go with that.
Steve Fouts 21:30
Yeah, and then we watch the football game, and there are those robots walking on the field.
Dan Fouts 21:36
You saw that too?
Steve Fouts 21:38
Does anyone else’s skin crawl?
Dan Fouts 21:40
Oh my gosh.
Steve Fouts 21:44
And it's because of that heartless lack of humanity and lack of conscience that it's really nerve-wracking. And AI is scary in the sense that you can have a friend now that will talk back to you, and it's not a real person. But it can give you emotional feelings. It can parade around as something that can be your friend. There's something unnerving about that. But I don't want to cloud my judgment and just think that it's automatically bad. I just don't know how to evaluate it exactly.
Dan Fouts 22:39
Yeah, we're still getting used to the idea. It's weird. I guess we could say at this point, it's weird, and it's something that moves quickly, and we're going to have to deal with it more and more. So this is a great example of a 100-year-old quote that is so relevant today.
Steve Fouts 23:04
You know, maybe a question you could ask anyone who's talking to you about AI and arguing for the latest, greatest advancement, maybe posing a question to them, something like: how are you going to make sure that this new tool you've fallen in love with isn't going to become your own prison, or limit you? How are you going to make sure that you use it as a tool, and that it doesn't become something that limits you and your development as a person, so that you're missing out on things? You know, what do you think you're missing out on by using this technology?
Dan Fouts 23:58
Thoughtful questions like that raise the consciousness, bringing back and reaffirming the humanity behind the technology. Otherwise, I think people just fall into a love-of-technology-at-all-costs kind of mentality, and they don't think about whether they should be doing this or not. So that's definitely something to keep in mind.
Steve Fouts 24:22
It's a great quote. One more thing: I've got to come to the defense of the Teach Different method for a moment, because that really is a technology. How many times did we have to experiment through observation to come up with a claim, a counterclaim, and an essential question framed by a quote? We had all different kinds of ideas at first. We kind of meandered a little bit, and we thought, well, how do you keep a conversation going? But that actually is a practical use that came about because we studied people's reactions, and how you get conversations started, and how you keep them going.
Dan Fouts 25:07
Right. It was all trial and experimentation. It was trying, looking at what works, looking at what doesn't work, and getting all the nuances down. That's exactly what it was. It was extremely scientific, the whole process. And it's interesting as we move forward, Steve, with Teach Different and how we integrate technology with it. We have a very human experience that we're having right now, I know it's over Zoom, but in a classroom or with any kind of audience, there's a human experience that is important with conversations. But there's also a place for technology in our online community, sharing resources like this and making them useful for people who want to get better with conversations. It's a beautiful blend. Christian Lous Lange, thank you. And thank you again, everybody, for showing up. We really love it, and we're looking forward... we are doing all of our podcasts now inside our community, and you are invited to each and every one of them.