Data, Data, and more Data
Welcome back, my friends and only droogs. I hope you had an enjoyable spring break - we certainly jumped back in with both feet. Today we're talking about data, and some of the debates involved in privacy.
Now, on reading this week's offerings and doing a little digging, I figured out the angle: we're being asked to critique our relationship to our data and what privacy means to us. The TED talk on the speaker's experiment with both isolation and hyper-sharing was somewhat eye-opening. (source) I found it particularly odd that he described the month he spent oversharing, when everyone knew his every activity, as the happiest of his life, and yet railed against it, saying that humanity was disappearing into conformity because of digital devices. That contradiction strikes me as uniquely tied to a completely non-digital concept, one I'll come back to later.
It's true. We live in a controlled digital ecosystem. This isn't a secret, and it isn't new. The base concepts aren't even terribly new; they're just constant, because they live in your hand or in front of your eyes. Social systems are something we need, and their mores and structures are things we construct naturally. Within any such system lives the wide variety of people we know - the good, the bad, and the ugly. There is no social system that will not become predatory depending on who ends up in power. By the same token, there is no digital social system that will not become predatory for the same reason.
It can be scary when you really think about it. The prevalence and depth of data collection (source) is staggering. The ease with which it can be taught and enacted (source) is mind-boggling. The famous line from Antitrust (an anti-Big Brother movie from 2001), that "any kid working in his garage could put me out of business," is true.
And let me give you an example I just encountered. As I was writing, I remembered that quote, so I popped over and Googled it. I thought it came from a real person, but it was written for the movie, so I decided to cite my source. I typed "Antitrust" into the Google search bar to grab the year. It popped up actual antitrust laws, but at the top it said, "Based on recent searches, Anti-Trust Movie". It watched and adapted. Data mining is so much a part of computer advancement that Google can feel 99% of the way to passing the Turing test, which measures whether a machine's responses can pass for human.
And that's the thing. Data, like magic or science or guns, is just data - neither good nor bad on its own. It can have great benefits - keeping you informed (source) about things in your community, helping you make informed decisions (source), or protecting some of our most vulnerable citizens (source). It can connect users with creators who otherwise couldn't make a living - I recently followed a TikTok to a Facebook ad and ended up buying a poster of some beautiful art from some kid in Oklahoma. I'm thrilled with my purchase, and data mining helped me find them.
But it can also be used poorly, and in a macro sense, it is. Some of the earlier sources I cited discuss massive data companies that influence elections and drive social opinion by manipulating those datasets. People who see dollar signs instead of human beings are at the helm, and here's where I get back to my earlier thought: the Kohlberg scale. This framework for evaluating moral development is used to study how people behave in a social capacity, and part of it is very, very salient here.
Kohlberg's scale is divided into six stages across three levels as we grow. (source) The first stage, within the pre-conventional level, is obedience- and punishment-driven morality - i.e., the reason we do not do wrong is that we will be punished, not any objective principle. The conformity Mr. Farid is referencing is directly related to that primitive morality: I will obey the rules of society lest I be punished. It also connects to one of the conventional-level stages, the "social consensus" stage, where the approval of others matters more than individual advantage. Both relate to social constructs, and both are fairly basic.
Then there is post-conventional morality, defined by social contracts and universal ethical principles. How does the digital world interact with these? Does the pressure remain the same, or is it surpassed? How much social expectation is healthy to keep us from becoming growling trolls under a bridge, and how much privacy is healthy for us? What is the humanity that these heralds of disaster are trying to protect?
To put it crudely, jerks exist whether they're in our phones, behind government screens, or next door. And I am not particularly special - I may exist in thousands of databases, but I am important in none of them. I would like my phone to remember my passwords so I don't have to, to tell me that my favorite band is playing next month, and to let me know that my pizza will be free the next time I order.
We need to change our relationship to data and call out and resist rampant abuses, but I feel that obsessively erasing one's footprint is alarmist and contributes to a culture of fear. This is our world. We built it. And we can make it better.
This week, I'm leaving you with an adorable bird sleeping on a warm laptop, mostly because he has the right idea and I too would like a nap. Join me next week for "A Networked Society: Promises and Parakeets." ...er, Paradoxes.
I'll be in tow!