When technology is offered to schools free of charge, it always comes with the promise of improving teaching and learning. It also often comes with a catch.
Thirty years ago, Channel One offered schools nationwide $30,000 worth of audiovisual equipment at no cost in exchange for requiring students to view a daily current events program during class. Commercials, shown alongside educational programming, entered one of the last ad-free spaces in children’s lives. Research showed that students did not significantly benefit from the news programming, and were more likely to remember the content of commercials than the news.
Today, the tradeoffs that school leaders and teachers face about technology — whether free or for a fee — are more complex and troubling. It’s not just a question of exposure to advertising and commercial branding, but of the ethics of public education in an increasingly digital world.
As internet users, we implicitly understand that if we’re not paying for a commercial service, our data — and the content we create — are being commodified and sold to others. Revelations about the privacy practices of Facebook only serve to underscore the stakes surrounding the capture and use of personal data. While 41 states have passed student data privacy laws in recent years to protect students from such practices, it’s important to remember that compliance with current privacy regulations is only part of the issue.
Technology that is designed to monitor students in real time and maximize their engagement and progression through lessons generates massive amounts of student data. Yet students and their families are rarely offered a glimpse of the scope of the digital records being assembled. Nor are they offered insights into how the algorithms that determine student outcomes are created. Absent checks on real and potential biases, errors and omissions, students’ future educational prospects are being shaped by software.
In this new digital era for education, we should ask: What rules of the road are needed to ensure that decisions about technology are made in the best interests of students?
In the absence of answers to this question, it is too easy for teachers and administrators to become unwitting brand ambassadors for companies seeking to integrate their products into the classroom, regardless of their impact on student learning and success.
As recent investigations have shown, too many public schools lack clear conflict-of-interest policies to ensure that decisions about whether and when to use specific technology products or services are being made in the best interests of students. The technology may be free, but at what cost to a quality education and to the civil liberties of educators and students?
For example, many of the nation’s largest school systems use Google’s free suite of education tools, which allows teachers to develop their lessons, create assignments and administer assessments online. When these students graduate, they are encouraged to migrate to personal Google accounts, giving the advertising superpower a potentially lucrative stream of data about consumers for life. And, when teachers share lesson plans with one another in closed or proprietary online learning platforms, they may be ceding ownership of their lessons to the technology providers, depending on the agreements made between their school and the provider.
Given that school technology adoption is unlikely to slow, the time has come for transparency.
We need to ensure that all learning materials — including software — are subject to public inspection by taxpayers and parents of students assigned to use them.
We also need to ensure that public schools and their technology partners provide reasonable security and privacy for information gathered about students and school staff. Policymakers should enact cybersecurity standards for schools, and privacy laws should be revisited to reflect changes in technology practices. Policymakers, parents and educators need to remain vigilant about the de facto creation of permanent digital records of students that may be used contrary to students’ best interests and without their full knowledge and consent.
While it is important that technology products improve based on student and teacher input, many companies capture students’ and teachers’ school-related work for private benefit and profit. Students and schools should be able to access and export all of their data, whether to the service providers of their choice or for their own needs.
One of the most promising solutions to many of these challenges comes from the open source community. Open source educational software and openly licensed instructional materials present an alternative to the market-driven requirements of commercial technology vendors. “Open” solutions — which grant the freedom to view, adapt and redistribute software and educational content to others for free — are widely relied upon in other sectors of the economy. School leaders should encourage widespread adoption of open source software and lessons in lieu of licensing high-cost, proprietary services from commercial firms.
Technology can and will play a positive role in the education of future generations of the nation’s youth. Yet competing private interests should not be allowed to diminish the future of public education, which must be predicated on respect for student and educator rights and freedoms. We can regulate the use of technology while also promoting innovation. It is in everyone’s best interests to ensure that schools protect the digital rights of their stakeholders, putting the best interests of students and teachers at the center.
Lisa Petrides is founder and CEO of the nonprofit ISKME, an international leader in information-sharing and innovation in the education sector that manages OER Commons, a public library for open education resources.
Douglas Levin is founder and president of EdTech Strategies, a consulting firm that provides strategic research and counsel on issues at the intersection of education, public policy, technology and innovation.