During my 40-plus years of integrating technology into business and education, the biggest obstacles were not faulty products or excessive costs, but the objections of people. Those same people were also the heroes of education technology, because some had the vision to see the bigger picture and the potential implications of new tools before anyone else. Early in my career, I found those objections particularly grating because they were what separated me and my technology from the world of successful, completed projects. It took moving to an elite boarding school as IT director to reframe and refocus what I now consider truly important when we introduce technology to people: striking the balance between data and dignity, the idea that we must consider people and their agency when responsibly implementing and using technology.
In 1992, I arrived at a school that had rudimentary administrative computer systems in place and a makeshift academic network that ran copper wire through manholes, meaning the network took rainy days off. There was one dial-up connection to the internet in a networked computer science lab, so most students and teachers did not have access. Consequently, the key proposal of my first strategic plan was to network the campus and add a more robust internet connection. The Board of Trustees liked the idea that our school would be one of the first private boarding schools to network its campus and provide our community with a new spectrum of knowledge (and data about those using the network). The project and financing were approved, and it was time to move forward and find the appropriate vendors.
I mentioned what seemed like a triumph to a few faculty colleagues, and the response was a far cry from that of the trustees. To say they didn’t see the vision would be an understatement. Because ours was a boarding school, some colleagues cried, students would never come out of their rooms if they had internet access, and classrooms would be cluttered with computers that served no purpose. They viewed the project as having only downsides and as an extravagant expense that might have been diverted to faculty compensation. They didn’t yet know enough about the internet to thoughtfully incorporate it into their teaching and advising, and in that light, their caution was not unfounded. Nor had they yet anticipated the impact of the data that would be collected, or the loss of dignity that would result from their online activity and that of their students.
Lesson learned: Don’t make proposals to the board without including faculty and students in the conversation first. What was true in 1996, when the campus network went live without faculty buy-in, is still true today. Educators, not technocrats, need to drive the EdTech adoption process based on learning goals and accept the associated risks to the school community. True, the faculty I worked with couldn’t have foreseen our current understanding of big tech and user data, but they saw risks in the unrestricted use of technology (to data and dignity) where others didn’t. In many ways, it’s a lesson those in education are still learning.
Balancing Data and Dignity
Our best reason for the campus network was that we wanted our students to have the most up-to-date resources for high-quality learning. At the same time, we were mindful in our own way of the risks to students and adults in any school community. That’s where the balance between data and dignity comes into play: the tradeoff between using digital resources to raise the level of human dignity for all and the data captured along the way that, among other things, can create a behavioral narrative of every student.
The truth is, the tools we use have an impact on our well-being, and the companies we trust to provide solutions with a minimum of risk are sometimes irresponsible in the ways they manipulate our time and attention, along with the data they collect from us. Our ability to be responsible citizens and EdTech users, and to ensure that the tools we use are indeed solving a problem or enhancing an experience, will depend on finding the balance between data and dignity. Here are some examples.
In education, we tend to seek “free” tools because budgets are limited and licensing is easier. Unfortunately, those economic realities may harm both our data and our dignity. Some of the best technology minds tell us that these tools are not free; in fact, they can’t be when they are developed by for-profit companies. In a market economy, there is always a paying customer. With tech tools, there are only two possible customers: schools, if they pay for the tool, or advertisers supporting the tool. If the advertisers are paying, then we, or our data, will be manipulated to generate revenues for those advertisers. Free means that some invisible processes are influencing, limiting, or driving you. As an educator, you may find that this economic model conflicts with the values and goals of your school, upsetting the balance between data and dignity.
Not only do we prefer “free” tools, but we are also drawn to “free content” on the internet. Consider that this, too, poses risks. We have traditionally valued intellectual property and copyrighted material through the principle of ownership by the author. On the internet, it is sometimes difficult to determine who owns the content. Even when we do know, the simple process of copy-paste makes it very easy to use that material with poor or no attribution. One teaching colleague of mine asked me a few years ago why anybody would post something on the internet if they wanted to protect their copyright. The introduction of AI large language models makes content ownership issues even more complicated. The result has been sloppier citations and attributions, a topic that dominated news coverage recently when it was revealed that attribution issues were part of the reason for the resignation of Harvard President Claudine Gay. Our students should be learning that if we create something of value, we should be compensated for it in some fashion, or at the very least properly attributed. We should respect that principle when we utilize the knowledge of others. Value the data so that dignity will be valued as well.
We are at a point in the evolution of education technology where we are being asked to reflect deeply on how large language models and generative AI will impact teaching and learning. We must aggressively take the lead in assessing the benefits and the costs so that technology does not end up driving education; as that balance shifts, so does the value of data and dignity. The way we address that challenge is by thinking about the issue from an educator’s perspective: What makes us human, and what can teachers do that AI bots cannot? Good technology can enhance the human experience, but only if we maintain the balance between data and dignity. I expect my colleagues from the ’90s would not have settled for anything less.
This column is courtesy of our partners at ET Magazine.
About the author
Joel Backon is a retired educator, editor, and writer who supports educators doing brave and interesting work, and writers telling the truth. He spent twenty-seven years in independent school education and fifteen years in the graphic arts industry. Joel is currently dedicated to moving educators forward.