<?xml version="1.0" encoding="UTF-8"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>surveillance &#8212; Minimalist EdTech</title>
    <link>https://minimalistedtech.org/tag:surveillance</link>
    <description>Less is more in technology and in education</description>
    <pubDate>Wed, 29 Apr 2026 20:24:43 +0000</pubDate>
    <image>
      <url>https://i.snap.as/qrAhYX2v.jpg</url>
      <title>surveillance &#8212; Minimalist EdTech</title>
      <link>https://minimalistedtech.org/tag:surveillance</link>
    </image>
    <item>
      <title>Intentional Forgetting in Edtech</title>
      <link>https://minimalistedtech.org/intentional-forgetting-in-edtech?pk_campaign=rss-feed</link>
      <description>&lt;![CDATA[&#xA;#forgetting #intentionalforgetting #minimalistedtech #edtechminimalism #edtech #privacy #surveillance&#xA;&#xA;We need more forgetful educational technologies. The default mode is always record and preserve first, deal with data issues after that. Privacy policies are not sufficient. We need intentional forgetting in edtech. Here&#39;s why.&#xA;&#xA;Do students know or care that technology is always watching?&#xA;&#xA;This past year many of us have been participants in a grand experiment in surveillance, conducting classes on video meeting platforms like Zoom or Teams or Meet and, for good and noble reasons, recording those videos. I have myself recorded all my classes, both online and face to face, going back at least 2 years and then in more piecemeal fashion before that. My main use case was about accessibility, as a way to allow students who needed extra help with note-taking to access the class for longer, without having to struggle with note-taking during class sessions. Video technologies have put in front of us questions about data retention and student visibility that have been lurking in online platforms for a long time. There was, for example, vigorous debate in the pedagogical twitterverse (and elsewhere) about whether students should be required to have their cameras on during online class sessions. It is a defining feature -- indeed, a selling feature -- of most major edtech products, from LMS to single-purpose tool, that such tools can provide analytics about students. Trends change quickly with tech and, by extension, in edtech. 
In the space of five years it feels like my students have gone from being completely unaware of the amount of data collected by learning platforms to being moderately aware that their logins are tracked, their reading behavior in online &#34;textbooks&#34; is recorded, their actions in any tool are cataloged and analyzed; in most cases, it didn&#39;t seem to faze them. That concerned me, but then the switch to video impacted them more acutely, in that they could feel themselves being watched in real time. In certain ways it was like the ubiquitous passive surveillance was finally visible to them.&#xA;&#xA;Many still didn&#39;t care and took it for granted. This reaction concerns me.&#xA;&#xA;Video is in many ways easier to control and be intentional about. We get immediate feedback that we are being watched and it feels like we can address issues more immediately. Turn off the camera. Log in or log out and you&#39;re there or not. Things like an LMS or various online textbook platforms are more subtle. I have at various points shown students the kind of data that teachers see about logins, page views, and the like in the major LMS platforms. When they are surprised by this, it tends to be surprise that I might look at data like that, or use it to single out students who have, for example, not logged in to the course recently. I tend to think most of these &#34;analytics&#34; fall in the category of junk statistics. There&#39;s some benefit at the extremes -- i.e. students who log in rarely or never or, conversely, those who are spending hours logged on and looking at pages over and over. But that comes through pretty clearly in other areas of their course performance. &#xA;&#xA;So what function do these metrics really serve? Cynically I might say that they are all marketing. It&#39;s something that can be sold as a form of &#34;insight&#34; into student behavior. 
It&#39;s a way to claim &#34;engagement.&#34; Hey look, you have all this visibility into what students are doing in online courses. And, sure, there can be ways that you can use that to understand certain trends, e.g. when students tend to complete assignments, how many submit stuff late and so forth. That&#39;s all well and good. &#xA;&#xA;But is that benefit worth the effect of having students always being watched? It&#39;s not surveillance of the sort that has caused outrage about spying, as with various remote proctoring services. It&#39;s just login data. It seems somehow more innocuous. Or, one might argue, it&#39;s just the same stuff that happens with an Amazon Kindle or some other commercial product.&#xA;&#xA;I am thinking more and more that those seemingly innocuous bits of data retention are a major problem in edtech. It is the ubiquitous surveillance that we take for granted and is even sold as a benefit of edtech products (&#34;analytics&#34;, &#34;insights&#34;, etc.). But the utility of such surveillance remains unproven while its cost is too often underappreciated. Surveillance, even passive surveillance in the form of metrics and logs and &#34;insights&#34;, realigns power relationships in teaching and makes the process of learning dependent upon observation rather than dialogue and mutual meaning-making. &#xA;&#xA;Thus my question: What would a forgetful edtech look like? What would it look like if you could reset things every day, or every week or every so often, if forgetting were built into technology for learning? Would it even be possible to build such a thing in our current commercial landscape? (Preview on the last question: I suspect not.)&#xA;&#xA;The Forgetful Classroom&#xA;&#xA;Educational technology demands that we think of classrooms as spaces for remembering and forgetting in a way that I suspect we wouldn&#39;t have even 10 or 15 years ago. 
&#xA;&#xA;Imagine for a second that your classroom or lecture hall has cameras not just on you but on every student, all the time, recording, for some future and as yet unknown use, when you are in there and even when you&#39;re not. What would you make of that system? Would it seem like a good thing? Would you want to turn it off? Would you change your behavior and, more importantly, would students change theirs?&#xA;&#xA;(It is true that at various points this ubiquitous use of video has been something that educational theorists and futurists have imagined as a positive. It&#39;s a staple of sci-fi to see automated learning that looks like some sort of AI-interactive surveillance system.)&#xA;&#xA;There is, in the physical space, a rhythm and pattern of remembering and forgetting which not only does not translate to digital space but is radically distorted by working in digital space. In situations where you have the same classroom and it isn&#39;t shared by others, there&#39;s a way in which you can reset that space each day, while keeping reminders of what has been done in the past, visible markers of the previous day and weeks of assignments. In other scenarios, with shared space or in the higher-ed situation where typically you visit a classroom space only a few times each week, there is something of a reset each time.&#xA;&#xA;There are a number of benefits to this pedagogical clean slate and stability, to the fact that your basic surroundings are the same. By contrast, in an online course built into an LMS for example, we might say that the interface is stable in a similar way. But there&#39;s a difference in terms of how we are tracked. Physical space doesn&#39;t record the traces of your behavior for a third party to dissect. It is not, fundamentally, a space of surveillance.&#xA;&#xA;By contrast, anything online is, by default and by nature, a place of potential surveillance. It is built into the technology. It is the economic model for Big Tech. 
How can those values help but be baked into educational experiences built on those technologies?&#xA;&#xA;What if, by default, all student interaction on a platform wasn&#39;t logged? What if notifications were not the default, if constant automation were not the default? What if you just arrived at a menu and decided what to open up from there? Would that be so bad? &#xA;&#xA;Further, what if everything were opt-in, where you have to trigger a specific action in order to make anything happen? There is no passive anything. Nothing happens that a human doesn&#39;t trigger intentionally. &#xA;&#xA;We take for granted that educational technologies track our students and, to a certain extent, teachers and anyone who interacts with the system. There are good technical and security reasons to log events and actions in computer systems; however, that paradigm doesn&#39;t have to apply at the level that is closest to users. Forgetful behaviors could be built into software. &#xA;- It would, in my mind, be of great value to be able to say to students that this platform retains your information for the duration of the class, while we are working, but then it is archived for a brief period and deleted. A clear life cycle for everyone&#39;s data goes beyond a simple privacy statement (which is about use and abuse and sharing of data) and foregrounds the pedagogical purpose for which the data is being used. &#xA;- Default behavior which requires students to commit their data or information to the system rather than passively tracking anything they do. At the user level, the UX is completely about intentional actions, never about passive surveillance. &#xA;- Drafts by default. In a typical system like an LMS, student actions on the platform are immediate. You submit an assignment -- that&#39;s the main action. Drafting capabilities lie entirely with teachers. It is a non-trivial technical change but a significant ideological change to make educational platforms places which are about drafts of things. 
Tools like Google Docs work well in part because of the killer feature of auto-save. As much as that violates my idea about intentionality, I recognize that there is power in the way that it turns everything into a working draft, because it is changeable and editable most of the time. If such a tool also forgot about your data at the end and wasn&#39;t doing all manner of other unknown things with it along the way, then it would fit the bill (but of course, it&#39;s the big G, so they are most certainly taking your data for all sorts of purposes, if not right now, then down the road).&#xA;&#xA;I suspect there are more possibilities, perhaps forms of self-destructing data or kinds of encryption that allow for more robust privacy. A more extreme approach would give users control of their data through decentralization or federation. &#xA;&#xA;We need to imagine a more forgetful kind of edtech because the alternative is one that continues to walk in lock-step with the world of big tech, where even well-meaning initiatives by the major platforms are still built upon the assumption that the monetary value of people is in the data that they provide. That is inherent to the platform and the business model of companies like Facebook and Google in toto and many others to significant degrees. (On this last point, I highly recommend reading anything by Jaron Lanier, and especially Ten Arguments for Deleting Your Social Media Accounts Right Now.) &#xA;&#xA;In pedagogical terms, it is an important lesson from the field of memory studies that meaning requires forgetting. The rare individuals who have excessive and near perfect memory (classically, S. in Luria&#39;s Mind of a Mnemonist) are crippled in certain ways precisely because of their inability to forget. To understand in the present we must often forget parts of the past. Growth requires both retaining and leaving behind details. It is as true of personal memory as it is of collective memory. 
Where technologies seem to provide absolute memory, they are in fact failing us as media for making meaning. &#xA;&#xA;Recently in the news: Florida&#39;s legislature approves a bill that allows and, to some extent, encourages students to spy on their teachers in higher ed; meanwhile, student surveillance company Proctorio continues its ill-advised lawsuit against its critics. These are very different examples, but all part of the continuum of creeping classroom surveillance. It is a bad trend. Learning requires the freedom to make mistakes, room to experiment, and support for growth and the messiness that is education. The expectation and acceptance of surveillance, face to face or online through technology, runs counter to those values. &#xA;&#xA;Memory isn&#39;t the same thing as privacy&#xA;Discussions about edtech surveillance and data logging (and data retention in most tech platforms generally) are often framed as issues of privacy. Data privacy policies must spell out how data will be used and for what purposes. While it is true that data retention is a part of that kind of policy language, these policies are much more about making clear commercial and non-commercial uses of data, a legal butt-covering to make sure that possible use cases have been enumerated should anyone find out at a later date that their particular data has been used in some way that has not been cataloged in the privacy policy. &#xA;&#xA;The more robust policies of the GDPR are something I wish would gain traction in the U.S., but even those guidelines and principles are of a different sort than what I am describing here. (The policies around retaining only the minimal data necessary are hugely important steps in the right direction.) But educational technologies require not simply privacy practices (e.g. FERPA) but best practices around intentional forgetting. 
We do a disservice to students to have only flimsy and piecemeal protections against their younger selves, their learning selves, leaving behind lasting traces that they don&#39;t have control over. That is not a problem only for their future, when they might be embarrassed by something they said on video when younger or when novices. That is a problem of the here and now, because knowing that you leave lasting traces, that you are being recorded, changes behaviors. &#xA;&#xA;It&#39;s easy to think of things like anonymity or forgetting as negatives, to imagine that we are losing something, taking away something that should otherwise be preserved. And in our present moment where wearable technologies and the quantified self are sold as an obvious good and inevitable direction of health tech, it&#39;s easy to think that tracking is inevitable. (Side note: measuring the value of self-quantification is complicated.) I don&#39;t think it is inevitable that tracking is the norm forever, particularly if we can articulate alternatives that provide other kinds of value beyond what the marketers and big tech companies want to extract as user data that feeds back into their platforms. &#xA;&#xA;Paradoxically, forgetting is often the most important mechanism for making things meaningful, not just because you know that something must exist in a particular moment, but because you know that the arena for it that matters most is in your own memory, not offloaded onto the computer system. When I think of a forgetful kind of edtech, I think mostly about how forgetting might help create more meaningful experiences with technology. That might involve some combination of temporary anonymization, rolling windows of auto-deleting records, or scrubbing information as early as possible. It&#39;s paradoxical because computing is, by nature, about storing bits and bytes and then doing something with those stored bits and bytes -- it is all built on the mechanics of recording data. 
So thinking about forgetful edtech, or forgetful computing, is an interesting problem. But the field of &#34;intentional forgetting&#34; is an important area of study (see, e.g. http://www.spp1921.de/index.html for the Intentional Forgetting in Organizations projects) and one which may have benefits for education. Many of the insights gleaned from outside education may have larger impacts if implemented for students, providing a direct counterpoint to the seemingly inevitable trend towards educational technologies that record everything, while at the same time opening the way to new, more meaningful educational experiences.&#xA;&#xA;Postscript: This is a first attempt to think through this topic, but there is a lot more to say, particularly in light of the growing literature on intentional forgetting (which I have not referred to much in the above). More coming...&#xA;&#xA;]]&gt;</description>
      <content:encoded><![CDATA[<p><img src="https://i.snap.as/2h2y0k6b.jpg" alt=""/>
<a href="https://minimalistedtech.org/tag:forgetting" class="hashtag"><span>#</span><span class="p-category">forgetting</span></a> <a href="https://minimalistedtech.org/tag:intentionalforgetting" class="hashtag"><span>#</span><span class="p-category">intentionalforgetting</span></a> <a href="https://minimalistedtech.org/tag:minimalistedtech" class="hashtag"><span>#</span><span class="p-category">minimalistedtech</span></a> <a href="https://minimalistedtech.org/tag:edtechminimalism" class="hashtag"><span>#</span><span class="p-category">edtechminimalism</span></a> <a href="https://minimalistedtech.org/tag:edtech" class="hashtag"><span>#</span><span class="p-category">edtech</span></a> <a href="https://minimalistedtech.org/tag:privacy" class="hashtag"><span>#</span><span class="p-category">privacy</span></a> <a href="https://minimalistedtech.org/tag:surveillance" class="hashtag"><span>#</span><span class="p-category">surveillance</span></a></p>

<p>We need more forgetful educational technologies. The default mode is always record and preserve first, deal with data issues after that. Privacy policies are not sufficient. We need <strong>intentional forgetting</strong> in edtech. Here&#39;s why.</p>



<h2 id="do-students-know-or-care-that-technology-is-always-watching">Do students know or care that technology is always watching?</h2>

<p>This past year many of us have been participants in a grand experiment in surveillance, conducting classes on video meeting platforms like Zoom or Teams or Meet and, for good and noble reasons, recording those videos. I have myself recorded all my classes, both online and face to face, going back at least 2 years and then in more piecemeal fashion before that. My main use case was about accessibility, as a way to allow students who needed extra help with note-taking to access the class for longer, without having to struggle with note-taking during class sessions. Video technologies have put in front of us questions about data retention and student visibility that have been lurking in online platforms for a long time. There was, for example, vigorous debate in the pedagogical twitterverse (and elsewhere) about whether students should be required to have their cameras on during online class sessions. It is a defining feature — indeed, a selling feature — of most major edtech products, from LMS to single-purpose tool, that such tools can provide analytics about students. Trends change quickly with tech and, by extension, in edtech. In the space of five years it feels like my students have gone from being completely unaware of the amount of data collected by learning platforms to being moderately aware that their logins are tracked, their reading behavior in online “textbooks” is recorded, their actions in any tool are cataloged and analyzed; in most cases, it didn&#39;t seem to faze them. That concerned me, but then the switch to video impacted them more acutely, in that they could feel themselves being watched in real time. In certain ways it was like the ubiquitous passive surveillance was finally visible to them.</p>

<p>Many still didn&#39;t care and took it for granted. This reaction concerns me.</p>

<p>Video is in many ways easier to control and be intentional about. We get immediate feedback that we are being watched and it feels like we can address issues more immediately. Turn off the camera. Log in or log out and you&#39;re there or not. Things like an LMS or various online textbook platforms are more subtle. I have at various points shown students the kind of data that teachers see about logins, page views, and the like in the major LMS platforms. When they are surprised by this, it tends to be surprise that I might look at data like that, or use it to single out students who have, for example, not logged in to the course recently. I tend to think most of these “analytics” fall in the category of junk statistics. There&#39;s some benefit at the extremes — i.e. students who log in rarely or never or, conversely, those who are spending hours logged on and looking at pages over and over. But that comes through pretty clearly in other areas of their course performance.</p>

<p>So what function do these metrics really serve? Cynically I might say that they are all marketing. It&#39;s something that can be sold as a form of “insight” into student behavior. It&#39;s a way to claim “engagement.” Hey look, you have all this visibility into what students are doing in online courses. And, sure, there can be ways that you can use that to understand certain trends, e.g. when students tend to complete assignments, how many submit stuff late and so forth. That&#39;s all well and good.</p>

<p>But is that benefit worth the effect of having students always being watched? It&#39;s not <a href="https://minimalistedtech.com/surveillance-edtech-is-why-we-need-a-different-approach">surveillance of the sort that has caused outrage about spying, as with various remote proctoring services</a>. It&#39;s just login data. It seems somehow more innocuous. Or, one might argue, it&#39;s just the same stuff that happens with an Amazon Kindle or some other commercial product.</p>

<p>I am thinking more and more that those seemingly innocuous bits of data retention are a major problem in edtech. It is the ubiquitous surveillance that we take for granted and is even sold as a benefit of edtech products (“analytics”, “insights”, etc.). But the utility of such surveillance remains unproven while its cost is too often underappreciated. Surveillance, even passive surveillance in the form of metrics and logs and “insights”, realigns power relationships in teaching and makes the process of learning dependent upon observation rather than dialogue and mutual meaning-making.</p>

<p>Thus my question: What would a forgetful edtech look like? What would it look like if you could reset things every day, or every week or every so often, if forgetting were built into technology for learning? Would it even be possible to build such a thing in our current commercial landscape? (Preview on the last question: I suspect not.)</p>

<h2 id="the-forgetful-classroom">The Forgetful Classroom</h2>

<p>Educational technology demands that we think of classrooms as spaces for remembering and forgetting in a way that I suspect we wouldn&#39;t have even 10 or 15 years ago.</p>

<p>Imagine for a second that your classroom or lecture hall has cameras not just on you but on every student, all the time, recording, for some future and as yet unknown use, when you are in there and even when you&#39;re not. What would you make of that system? Would it seem like a good thing? Would you want to turn it off? Would you change your behavior and, more importantly, would students change theirs?</p>

<p>(It is true that at various points this ubiquitous use of video has been something that educational theorists and futurists have imagined as a positive. It&#39;s a staple of sci-fi to see automated learning that looks like some sort of AI-interactive surveillance system.)</p>

<p>There is, in the physical space, a rhythm and pattern of remembering and forgetting which not only does not translate to digital space but is radically distorted by working in digital space. In situations where you have the same classroom and it isn&#39;t shared by others, there&#39;s a way in which you can reset that space each day, while keeping reminders of what has been done in the past, visible markers of the previous day and weeks of assignments. In other scenarios, with shared space or in the higher-ed situation where typically you visit a classroom space only a few times each week, there is something of a reset each time.</p>

<p>There are a number of benefits to this pedagogical clean slate and stability, to the fact that your basic surroundings are the same. By contrast, in an online course built into an LMS for example, we might say that the interface is stable in a similar way. But there&#39;s a difference in terms of how we are tracked. Physical space doesn&#39;t record the traces of your behavior for a third party to dissect. It is not, fundamentally, a space of surveillance.</p>

<p>By contrast, anything online is, by default and by nature, a place of potential surveillance. It is built into the technology. It is the economic model for Big Tech. How can those values help but be baked into educational experiences built on those technologies?</p>

<p>What if, by default, all student interaction on a platform wasn&#39;t logged? What if notifications were not the default, if constant automation were not the default? What if you just arrived at a menu and decided what to open up from there? Would that be so bad?</p>

<p>Further, what if everything were opt-in, where you have to trigger a specific action in order to make anything happen? There is no passive anything. Nothing happens that a human doesn&#39;t trigger intentionally.</p>

<p>We take for granted that educational technologies track our students and, to a certain extent, teachers and anyone who interacts with the system. There are good technical and security reasons to log events and actions in computer systems; however, that paradigm doesn&#39;t have to apply at the level that is closest to users. Forgetful behaviors <em>could</em> be built into software.</p>

<ul>
<li>It would, in my mind, be of great value to be able to say to students that this platform retains your information for the duration of the class, while we are working, but then it is archived for a brief period and deleted. A clear life cycle for everyone&#39;s data goes beyond a simple privacy statement (which is about use and abuse and sharing of data) and foregrounds the pedagogical purpose for which the data is being used.</li>
<li>Default behavior which requires students to commit their data or information to the system rather than passively tracking anything they do. At the user level, the UX is completely about intentional actions, never about passive surveillance.</li>
<li>Drafts by default. In a typical system like an LMS, student actions on the platform are immediate. You submit an assignment — that&#39;s the main action. Drafting capabilities lie entirely with teachers. It is a non-trivial technical change but a significant ideological change to make educational platforms places which are about drafts of things. Tools like Google Docs work well in part because of the killer feature of auto-save. As much as that violates my idea about intentionality, I recognize that there is power in the way that it turns everything into a working draft, because it is changeable and editable most of the time. If such a tool also forgot about your data at the end and wasn&#39;t doing all manner of other unknown things with it along the way, then it would fit the bill (but of course, it&#39;s the big G, so they are most certainly taking your data for all sorts of purposes, if not right now, then down the road).</li>
</ul>
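The data life cycle described above — retain while the course runs, archive briefly, then delete — can be sketched in a few lines. This is a hypothetical illustration, not any real platform's API; the class name <code>CourseStore</code> and the 30-day grace period are my own assumptions.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Assumed grace period before deletion; not a standard, just an illustration.
ARCHIVE_GRACE = timedelta(days=30)

@dataclass
class CourseStore:
    course_end: date
    records: dict = field(default_factory=dict)  # active, visible during the course
    archive: dict = field(default_factory=dict)  # read-only holding area after the course

    def submit(self, student: str, work: str) -> None:
        # Opt-in by design: data enters the system only through an
        # intentional action, never through passive tracking.
        self.records[student] = work

    def tick(self, today: date) -> None:
        # Called periodically: archive at course end, forget after the grace period.
        if today >= self.course_end and self.records:
            self.archive.update(self.records)
            self.records.clear()
        if today >= self.course_end + ARCHIVE_GRACE:
            self.archive.clear()  # intentional forgetting: nothing survives
```

The point of the sketch is that the deletion schedule is part of the platform's contract with students, not a clause buried in a privacy policy.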

<p>I suspect there are more possibilities, perhaps forms of self-destructing data or kinds of encryption that allow for more robust privacy. A more extreme approach would give users control of their data through decentralization or federation.</p>
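One concrete form of self-destructing data is often called crypto-shredding: store only ciphertext, and "forget" it by destroying the key. A minimal sketch, using a one-time pad so it needs no external libraries; a real system would use an authenticated cipher and proper key management, and <code>ShreddableRecord</code> is a hypothetical name of my own.

```python
import secrets

class ShreddableRecord:
    """Stores data only in encrypted form; forgetting = destroying the key."""

    def __init__(self, plaintext: bytes):
        # A random pad the same length as the data acts as the key (one-time pad).
        self._pad = secrets.token_bytes(len(plaintext))
        self._ciphertext = bytes(a ^ b for a, b in zip(plaintext, self._pad))

    def read(self) -> bytes:
        if self._pad is None:
            raise KeyError("record has been forgotten")
        return bytes(a ^ b for a, b in zip(self._ciphertext, self._pad))

    def forget(self) -> None:
        # Deleting the pad makes the ciphertext unrecoverable, even though
        # the stored bytes still physically exist somewhere.
        self._pad = None
```

The appeal for edtech is that "forgetting" becomes a single, auditable act rather than a promise to track down every copy of the data.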

<p>We need to imagine a more forgetful kind of edtech because the alternative is one that continues to walk in lock-step with the world of big tech, where even well-meaning initiatives by the major platforms are still built upon the assumption that the monetary value of people is in the data that they provide. That is inherent to the platform and the business model of companies like Facebook and Google <em>in toto</em> and many others to significant degrees. (On this last point, I highly recommend reading anything by Jaron Lanier, and especially <a href="https://www.amazon.com/Arguments-Deleting-Social-Media-Accounts/dp/125019668X"><em>Ten Arguments for Deleting Your Social Media Accounts Right Now</em></a>.)</p>

<p>In pedagogical terms, it is an important lesson from the field of memory studies that meaning requires forgetting. The rare individuals who have excessive and near perfect memory (classically, <a href="https://en.wikipedia.org/wiki/Solomon_Shereshevsky">S. in Luria&#39;s <em>Mind of a Mnemonist</em></a>) are crippled in certain ways precisely because of their inability to forget. To understand in the present we must often forget parts of the past. Growth requires both retaining and leaving behind details. It is as true of personal memory as it is of collective memory. Where technologies seem to provide absolute memory, they are in fact failing us as media for making meaning.</p>

<p>Recently in the news: <a href="https://www.insidehighered.com/news/2021/04/16/florida-poised-pass-bill-allowing-students-record-classes">Florida&#39;s legislature approves a bill that allows and, to some extent, encourages students to spy on their teachers in higher ed</a>; meanwhile, <a href="https://www.insidehighered.com/quicktakes/2021/04/23/student-sues-remote-proctoring-company-proctorio">student surveillance company Proctorio continues its ill-advised lawsuit against its critics</a>. These are very different examples, but all part of the continuum of creeping classroom surveillance. It is a bad trend. Learning requires the freedom to make mistakes, room to experiment, and support for growth and the messiness that is education. The expectation and acceptance of surveillance, face to face or online through technology, runs counter to those values.</p>

<h2 id="memory-isn-t-the-same-thing-as-privacy">Memory isn&#39;t the same thing as privacy</h2>

<p>Discussions about edtech surveillance and data logging (and data retention in most tech platforms generally) are often framed as issues of privacy. Data privacy policies must spell out how data will be used and for what purposes. While it is true that data retention is a part of that kind of policy language, these policies are much more about making clear commercial and non-commercial uses of data, a legal butt-covering to make sure that possible use cases have been enumerated should anyone find out at a later date that their particular data has been used in some way that has not been cataloged in the privacy policy.</p>

<p>The more robust policies of the <a href="https://gdpr.eu/what-is-gdpr/">GDPR</a> are something I wish would gain traction in the U.S., but even those guidelines and principles are of a different sort than what I am describing here. (The policies around retaining only the minimal data necessary are hugely important steps in the right direction.) But educational technologies require not simply privacy practices (e.g. FERPA) but best practices around intentional forgetting. We do a disservice to students to have only flimsy and piecemeal protections against their younger selves, their learning selves, leaving behind lasting traces that they don&#39;t have control over. That is not a problem only for their future, when they might be embarrassed by something they said on video when younger or when novices. That is a problem of the here and now, because knowing that you leave lasting traces, that you are being recorded, changes behaviors.</p>

<p>It&#39;s easy to think of things like anonymity or forgetting as negatives, to imagine that we are losing something, taking away something that should otherwise be preserved. And in our present moment, when wearable technologies and the <a href="https://en.wikipedia.org/wiki/Quantified_self">quantified self</a> are sold as an obvious good and the inevitable direction of health tech, it&#39;s easy to think that tracking is inevitable. (Side note: <a href="https://qz.com/quartzy/1644006/the-psychology-of-self-tracking/">measuring the value of self-quantification is complicated</a>.) I don&#39;t think tracking must be the norm forever, especially if we can articulate alternatives that provide other kinds of value beyond what the marketers and big tech companies want to extract as user data that feeds back into their platforms.</p>

<p>Paradoxically, forgetting is often the most important mechanism for making things meaningful: not just because you know that something exists only in a particular moment, but because you know that the arena that matters most is your own memory, not a computer system it has been offloaded onto. When I think of a forgetful kind of edtech, I think mostly about how forgetting might help create more meaningful experiences with technology. That might involve some combination of temporary anonymization, rolling windows of auto-deleting records, and scrubbing information as early as possible. It&#39;s paradoxical because computing is, by nature, about storing bits and bytes and then doing something with those stored bits and bytes — it is all built on the mechanics of recording data. So thinking about forgetful edtech, or <a href="https://link.springer.com/article/10.1007/s13218-018-00574-x">forgetful computing</a>, is an interesting problem. But “intentional forgetting” is an important area of study (see, e.g., <a href="http://www.spp1921.de/index.html">http://www.spp1921.de/index.html</a> for the Intentional Forgetting in Organizations projects) and one which may have benefits for education. Many of the insights gleaned outside education could have larger impacts if implemented for students, providing a direct counterpoint to the seemingly inevitable trend towards educational technologies that record everything, while at the same time opening the way to new, more meaningful educational experiences.</p>
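<p>As a purely hypothetical sketch of what a "rolling window of auto-deleting records" and early scrubbing might look like in practice, here is a minimal version in Python; the record fields and the 30-day window are invented for illustration, not drawn from any real platform:</p>

```python
from datetime import datetime, timedelta, timezone

# Hypothetical policy: keep activity records for 30 days, then forget them.
RETENTION_WINDOW = timedelta(days=30)

def prune_expired(records, now=None):
    """Return only the records still inside the retention window.

    Each record is a dict with a 'created_at' datetime; anything
    older than the window is intentionally forgotten (dropped).
    """
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["created_at"] <= RETENTION_WINDOW]

def scrub_identity(record):
    """Drop identifying fields as early as possible, keeping only
    what an aggregate view of class activity would need."""
    return {k: v for k, v in record.items()
            if k not in ("student_id", "name", "email")}
```

<p>Run on a schedule (or at every write), pruning of this sort makes deletion the default rather than an afterthought.</p>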

<p>Postscript: This is a first attempt to think through this topic, but there is a lot more to say, particularly in light of the growing literature on intentional forgetting (which I have not referred to much in the above). More coming...</p>
]]></content:encoded>
      <guid>https://minimalistedtech.org/intentional-forgetting-in-edtech</guid>
      <pubDate>Fri, 30 Apr 2021 13:17:24 +0000</pubDate>
    </item>
    <item>
      <title>Alternatives to Surveillance Edtech: Students as Publishers</title>
      <link>https://minimalistedtech.org/alternatives-to-surveillance-edtech-students-as-publishers?pk_campaign=rss-feed</link>
      <description>&lt;![CDATA[&#xA;Alternatives to Surveillance Edtech: Students as Publishers&#xA;&#xA;The case against surveillance edtech like Proctorio isn&#39;t really about privacy; in pedagogical terms, it&#39;s about automation and agency.&#xA;&#xA;!--more--&#xA;&#xA;There&#39;s been a fair amount of press and twitter feather-ruffling about surveillance edtech this past year. Critics of these tools have rightly highlighted dubious data practices and the creepiness and invasiveness of tools like Proctorio, Honorlock and all the rest. (I wrote about it a few months ago here.) But I wonder whether framing it as a matter of privacy is, as an argument to admins, to the public at large, and to many teachers and students, not particularly persuasive, despite being the ethical high ground. After all, in the face of the ubiquitous tracking of Big Tech in general, most people are simply resigned to the fact that their privacy is not a given. Or, rather, the trade off is rendered socially acceptable. Convenience today beats the future and unknown danger of someone else having data or spying, so the thinking goes. Out of sight, out of mind. So long as that invasion of privacy doesn&#39;t actually cause a specific problem, then the trade off of privacy for convenience can be rationalized or simply ignored. In America in particular, privacy is an issue which many people react to with resigned passivity. &#xA;&#xA;Now, for some (me included), that privacy argument is compelling. The ethical argument against surveillance edtech is also compelling: how could I promote and facilitate that kind of invasion of privacy to my students? It&#39;s just obviously the wrong direction and not worth the cost for the (purported) benefit.&#xA;&#xA;But I think we need a more compelling pragmatic and pedagogical argument. The case against surveillance edtech needs to focus on the fact that we can teach better without that sort of software. 
The sales pitch of such software is that it solves a particular problem (e.g. cheating); it&#39;s a bit like home security services, relying on fear. Proctoring software depends on your fear as a teacher that students are doing something wrong such that you need this &#34;solution&#34; to protect yourself from being tricked. &#xA;&#xA;But let&#39;s step back from their framing of the problem as one of cheating or verifying identity. What about the project of learning? Surveillance software is, on the contrary, highly counterproductive to the project of teaching and learning. It is not a &#34;tool&#34; or helper; it is a hindrance and unnecessary barrier. It doesn&#39;t actually help students learn or teachers teach.&#xA;&#xA;That kind of argument requires us to reject the assumption that automation is an obvious good and focus instead on technology that promotes and requires the agency of teachers and students. We need not just a criticism of surveillance software as a platform on ethical terms, but also of the software on pedagogical terms. We need to foster classroom design that pushes back against the kinds of pedagogical assumptions underlying this sort of tool. That means, yes, rejecting the notion of &#34;cop shit&#34; in the classroom. It also means doubling down on mechanisms for promoting student and teacher agency. We need to talk about students as active publishers of their own content and we must reject entirely the idea that anyone in a classroom should be subject to passive data gathering in any way. &#xA;&#xA;A distinction here from the everyday world of how we deal with digital files might be worth thinking about. A lot of programs will autosave your work. Word processors, Google Docs and its imitators, spreadsheets, databases -- any number of programs will save a copy of what you are working on while you work. 
This is a convenient feature in that context, but it is also one that, subtly, makes us just a bit passive about versions or drafts of our work. On the other hand, if you work with code then you likely use some sort of versioning system like Git. Git, whether it&#39;s GitHub or a self-hosted alternative, works by forcing a lot of steps on you (at least it will seem that way at first). You have to mark what files you&#39;ve updated (git add), add a short message saying what&#39;s changed and signal your intention to really, really add this to the changes being tracked (git commit), and then you might give a command to move the files from your computer to an online folder (git push). There&#39;s something very intentional about all of that. It requires just a bit more agency in the process of saving a file. &#xA;&#xA;Now, on the specifics, there are obviously ways to do version control in a more automated way and, conversely, ways to make word processors and the like less automatic in their updating. My point isn&#39;t about those kinds of programs in their details. My point is that we have a choice to adjust the degree of automation and control we want to exert. What most edtech does, and surveillance edtech in particular, is remove or obscure much of that choice and control over automation. Take any platform you use regularly for teaching or in the classroom. How much control do you have over the data it collects? How obvious is it when it is tracking something and when it isn&#39;t? &#xA;&#xA;This can be different. Automation is not necessarily the enemy of good pedagogy, but it does require careful thought and design. In the case of materials that students submit to a class or their actions in a class, constant surveillance, whether through proctoring software or, for synchronous classes, with the constant lens of Zoom or its ilk, robs students of a certain degree of agency. 
(This is in part why, when given the choice, students may be so eager to turn off their phones or give you a view that is not all that great. It is, like anything, a small exertion of control with a system that is not entirely under their control.) &#xA;&#xA;What would it look like if the majority of edtech were data neutral and forgetful by default? What if we only used technologies that not only did not track or store data (hence, ethically &#34;private&#34;), but went a step further so as to be, by default, anonymous until a student or teacher chooses to make something public? Maybe that seems unthinkable, as the default mode for edtech is essentially a list of students, authenticated and identified so that they can be tracked as they submit assignments or complete activities or the like. Indeed, for many edtech products, integrating sign-on is one of the thornier problems to navigate (i.e. does it integrate with an LMS, is it SSO, etc.). Put another way, you have to opt-in, with most platforms, to find ways of allowing students to be anonymous. You can, for example, have a survey in an LMS that might not be graded. Or you might have a wiki that students can collaborate on and not look at the names. Or you might hide names when grading. Or you likely just need to use a different platform (but then we run into FERPA issues of course, because you as a teacher are held to a standard of data sharing that the big tech companies can negotiate their way around.). Edtech is designed so that the identity of students is front and center. It is designed with passive surveillance already primary. &#xA;&#xA;What if students had to &#34;push&#34; their commits? What if every edtech product made student activities, by default, private and self-destructing or forgettable? The difference here is between publishing and surveillance. Students have to publish to an audience they define -- the teacher, their class, etc. 
&#xA;&#xA;This would of course mean very different things at different levels. But in principle what I&#39;m getting at is that we already have ways of thinking about how students &#34;participate&#34; in a class. That is our pedagogical paradigm that needs to stand up against the imposition of a surveillance paradigm. &#xA;&#xA;One other point here. This is ultimately all about grades. Why do we need so desperately to pin down identity and live in fear of cheating? It&#39;s about the evaluation and assessment, about the stakes for grading. That&#39;s a topic for another post, but so long as we are wedded to outmoded grading patterns as ways to assess learning, then we&#39;re stuck with systems that trend towards invasive tracking when it comes to technology. &#xA;&#xA;Finally, could we separate out process and product? Is part of an approach to confronting surveillance tech investing in tools that foster students&#39; working more anonymously or without surveillance as part of the process? Students can work on process knowing that that material is only shared with their intention and permission. They can have multiple products -- some failed perhaps, and commit the best one. &#xA;&#xA;There are, currently, some platforms that can be adapted for this sort of anti-surveillance kind of work.  I imagine this as one possible use for something like write.as, e.g. create your own blog that is kept private and then submit assignments from that as you work up material. Most tools though are not aimed at educational markets. And many would require self-hosting in some form and so are not really turnkey for educators. Any sort of document sharing or file sharing utilities that require students to opt-in might be useful; so too shared whiteboards or peertube or discourse. Setting up something with cloudron or yunohost might be one sort of way to go. 
&#xA;&#xA;As with most privacy-related things, right now the solutions would seem to fall on the shoulders of the user and require setting up some sort of shadow IT or secondary infrastructure. That&#39;s not a great solution, but until anonymity-first is part of the thinking behind edtech, we&#39;re going to struggle with issues of privacy where the surveillors are probably going to have the momentum.&#xA;&#xA;#minimalistedtech #edtech #proctorio #surveillance #edtechminimalism]]&gt;</description>
<content:encoded><![CDATA[<h1 id="alternatives-to-surveillance-edtech-students-as-publishers">Alternatives to Surveillance Edtech: Students as Publishers</h1>

<p><img src="https://i.snap.as/Hl2ReNPd.jpg" alt=""/></p>

<p>The case against surveillance edtech like Proctorio isn&#39;t really about privacy; in pedagogical terms, it&#39;s about automation and agency.</p>



<p>There&#39;s been a fair amount of press and twitter feather-ruffling about surveillance edtech this past year. Critics of these tools have rightly highlighted dubious data practices and the creepiness and invasiveness of tools like Proctorio, Honorlock and all the rest. (<a href="https://minimalistedtech.com/surveillance-edtech-is-why-we-need-a-different-approach">I wrote about it a few months ago here</a>.) But I wonder whether framing it as a matter of privacy is, as an argument to admins, to the public at large, and to many teachers and students, not particularly persuasive, despite being the ethical high ground. After all, in the face of the ubiquitous tracking of Big Tech in general, most people are simply resigned to the fact that their privacy is not a given. Or, rather, the trade-off is rendered socially acceptable. Convenience today beats the future, unknown danger of someone else having data or spying, so the thinking goes. Out of sight, out of mind. So long as that invasion of privacy doesn&#39;t actually cause a specific problem, then the trade-off of privacy for convenience can be rationalized or simply ignored. In America in particular, privacy is an issue which many people react to with resigned passivity.</p>

<p>Now, for some (me included), that privacy argument is compelling. The ethical argument against surveillance edtech is also compelling: how could I promote and facilitate that kind of invasion of privacy to my students? It&#39;s just obviously the wrong direction and not worth the cost for the (purported) benefit.</p>

<p>But I think we need a more compelling pragmatic and pedagogical argument. The case against surveillance edtech needs to focus on the fact that <strong>we can teach better without that sort of software</strong>. The sales pitch of such software is that it solves a particular problem (e.g. cheating); it&#39;s a bit like home security services, relying on fear. Proctoring software depends on your fear as a teacher that students are doing something wrong such that you need this “solution” to protect yourself from being tricked.</p>

<p>But let&#39;s step back from their framing of the problem as one of cheating or verifying identity. What about the project of learning? Surveillance software is, on the contrary, highly counterproductive to the project of teaching and learning. It is not a “tool” or helper; it is a hindrance and unnecessary barrier. It doesn&#39;t actually help students learn or teachers teach.</p>

<p>That kind of argument requires us to reject the assumption that automation is an obvious good and focus instead on technology that promotes and <strong>requires</strong> the agency of teachers and students. We need not just a criticism of surveillance software as a platform on ethical terms, but also of the software on pedagogical terms. We need to foster classroom design that pushes back against the kinds of pedagogical assumptions underlying this sort of tool. That means, yes, rejecting the notion of “cop shit” in the classroom. It also means doubling down on mechanisms for promoting student and teacher agency. We need to talk about students as active publishers of their own content and we must reject entirely the idea that anyone in a classroom should be subject to passive data gathering in any way.</p>

<p>A distinction here from the everyday world of how we deal with digital files might be worth thinking about. A lot of programs will autosave your work. Word processors, Google Docs and its imitators, spreadsheets, databases — any number of programs will save a copy of what you are working on while you work. This is a convenient feature in that context, but it is also one that, subtly, makes us just a bit passive about versions or drafts of our work. On the other hand, if you work with code then you likely use some sort of versioning system like Git. Git, whether it&#39;s GitHub or a self-hosted alternative, works by forcing a lot of steps on you (at least it will seem that way at first). You have to mark what files you&#39;ve updated (<code>git add</code>), add a short message saying what&#39;s changed and signal your intention to really, really add this to the changes being tracked (<code>git commit</code>), and then you might give a command to move the files from your computer to an online folder (<code>git push</code>). There&#39;s something very intentional about all of that. It requires just a bit more agency in the process of saving a file.</p>
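<p>The sequence described above can be sketched in a throwaway repository; the file name, commit message, and identity settings below are placeholders for illustration, not anything prescribed:</p>

```shell
mkdir demo-repo && cd demo-repo
git init -q
git config user.name "Demo Teacher"
git config user.email "demo@example.com"

echo "first draft" > essay.txt

git add essay.txt                    # 1. mark what you have updated
git commit -q -m "Add first draft"   # 2. say, deliberately, what changed
# git push origin main               # 3. move it to the online copy (needs a remote)
```

<p>Each step is an explicit choice; nothing is recorded until you say so.</p>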

<p>Now, on the specifics, there are obviously ways to do version control in a more automated way and, conversely, ways to make word processors and the like less automatic in their updating. My point isn&#39;t about those kinds of programs in their details. My point is that we have a choice to adjust the degree of automation and control we want to exert. What most edtech does, and surveillance edtech in particular, is remove or obscure much of that choice and control over automation. Take any platform you use regularly for teaching or in the classroom. How much control do you have over the data it collects? How obvious is it when it is tracking something and when it isn&#39;t?</p>

<p>This can be different. Automation is not necessarily the enemy of good pedagogy, but it does require careful thought and design. In the case of materials that students submit to a class or their actions in a class, constant surveillance, whether through proctoring software or, for synchronous classes, with the constant lens of Zoom or its ilk, robs students of a certain degree of agency. (This is in part why, when given the choice, students may be so eager to turn off their cameras or give you a view that is not all that great. It is, like anything, a small exertion of control within a system that is not entirely under their control.)</p>

<p>What would it look like if the majority of edtech were data neutral and forgetful by default? What if we only used technologies that not only did not track or store data (hence, ethically “private”), but went a step further so as to be, by default, anonymous until a student or teacher chooses to make something public? Maybe that seems unthinkable, as the default mode for edtech is essentially a list of students, authenticated and identified so that they can be tracked as they submit assignments or complete activities or the like. Indeed, for many edtech products, integrating sign-on is one of the thornier problems to navigate (i.e. does it integrate with an LMS, is it SSO, etc.). Put another way, with most platforms you have to opt in to find ways of allowing students to be anonymous. You can, for example, have a survey in an LMS that might not be graded. Or you might have a wiki that students can collaborate on and not look at the names. Or you might hide names when grading. Or you likely just need to use a different platform (but then we run into FERPA issues of course, because you as a teacher are held to a standard of data sharing that the big tech companies can negotiate their way around). Edtech is designed so that the identity of students is front and center. It is designed with passive surveillance already primary.</p>

<p>What if students had to “push” their commits? What if every edtech product made student activities, by default, private and self-destructing or forgettable? The difference here is between publishing and surveillance. Students have to publish to an audience they define — the teacher, their class, etc.</p>
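<p>To make the publish-not-surveil default concrete, here is a hypothetical sketch in Python (the class and method names are invented for illustration, not any real platform) of a submission that stays private, and forgettable, until the student explicitly pushes it to an audience they choose:</p>

```python
class Submission:
    """A student work item that is private by default.

    Nothing is visible to anyone until the owner explicitly
    publishes it to an audience of their choosing, and it can
    be forgotten (deleted) at any time.
    """

    def __init__(self, content):
        self.content = content
        self.audience = set()  # empty set: visible to no one

    def publish(self, *audience):
        # The student's deliberate "git push": choose who may see this.
        self.audience.update(audience)

    def forget(self):
        # Self-destruct: remove the content and the audience entirely.
        self.content = None
        self.audience.clear()

    def visible_to(self, viewer):
        return self.content is not None and viewer in self.audience
```

<p>Here visibility is something the student grants, not something the platform assumes.</p>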

<p>This would of course mean very different things at different levels. But in principle what I&#39;m getting at is that we already have ways of thinking about how students “participate” in a class. That is our pedagogical paradigm that needs to stand up against the imposition of a surveillance paradigm.</p>

<p>One other point here. This is ultimately all about grades. Why do we need so desperately to pin down identity and live in fear of cheating? It&#39;s about the evaluation and assessment, about the stakes for grading. That&#39;s a topic for another post, but so long as we are wedded to outmoded grading patterns as ways to assess learning, then we&#39;re stuck with systems that trend towards invasive tracking when it comes to technology.</p>

<p>Finally, could we separate out process and product? Is part of an approach to confronting surveillance tech investing in tools that foster students&#39; working more anonymously or without surveillance as part of the process? Students can work on process knowing that that material is only shared with their intention and permission. They can have multiple products — some failed perhaps, and commit the best one.</p>

<p>There are, currently, some platforms that can be adapted for this sort of anti-surveillance work. I imagine this as one possible use for something like write.as, e.g. create your own blog that is kept private and then submit assignments from that as you work up material. Most tools, though, are not aimed at educational markets. And many would require self-hosting in some form and so are not really turnkey for educators. Any sort of document sharing or file sharing utilities that require students to opt in might be useful; so too shared whiteboards or PeerTube or Discourse. Setting up something with Cloudron or YunoHost might be one way to go.</p>

<p>As with most privacy-related things, right now the solutions would seem to fall on the shoulders of the user and require setting up some sort of shadow IT or secondary infrastructure. That&#39;s not a great solution, but until anonymity-first is part of the thinking behind edtech, we&#39;re going to struggle with issues of privacy where the surveillors are probably going to have the momentum.</p>

<p><a href="https://minimalistedtech.org/tag:minimalistedtech" class="hashtag"><span>#</span><span class="p-category">minimalistedtech</span></a> <a href="https://minimalistedtech.org/tag:edtech" class="hashtag"><span>#</span><span class="p-category">edtech</span></a> <a href="https://minimalistedtech.org/tag:proctorio" class="hashtag"><span>#</span><span class="p-category">proctorio</span></a> <a href="https://minimalistedtech.org/tag:surveillance" class="hashtag"><span>#</span><span class="p-category">surveillance</span></a> <a href="https://minimalistedtech.org/tag:edtechminimalism" class="hashtag"><span>#</span><span class="p-category">edtechminimalism</span></a></p>
]]></content:encoded>
      <guid>https://minimalistedtech.org/alternatives-to-surveillance-edtech-students-as-publishers</guid>
      <pubDate>Sat, 13 Feb 2021 17:01:15 +0000</pubDate>
    </item>
  </channel>
</rss>