After listening to Dollars for Data, we see drastically different views of data and how our society should manage it. From a capitalist standpoint, everyone should be allowed to sell their data to whomever they’d like if it benefits them financially, yet from a decolonized perspective, our data is an interconnected knowledge base that should be shared and used with discretion. The realization that not everyone fundamentally views their data in the same way gives me concern as a designer who wants to create systems that meet the needs of the user. How can we strike a balance between individuals who want to control and monetize their data and those who would like data to be treated as an extension of their bodies?
Watch | Sarah Gold, “Trust and Digital Rights in Learned Systems,” PAIR UX Symposium 2018
In Sarah Gold’s talk, we are presented with several ideas for how designers could tackle this wicked problem, yet there is still no alignment on how exactly to implement these solutions. Governments around the world are defining the rules around data differently, making it extremely difficult for one solution to extend across all interactions online. So perhaps, instead of placing all of our efforts into the design of the learning system, we could attempt to address these concerns at a policy level first. What design methods could help us approach this problem as a whole rather than as fragmented communities? … Maybe we should call Stuart Candy …
Read | “You Are Now Remotely Controlled” by Zuboff, New York Times.
Finally, I found it extremely interesting that universities are losing data scientists to big tech’s involvement in data collection. Companies like Google and Microsoft have worked to centralize their control over data-tracking technology, recruiting researchers away and leaving a deficit of data scientists in university and academic settings. With this realization, I’m forced to think about my own role as a designer and the influence I have over the creation of tools and systems that impact one’s anonymity online. How do I know if I’m contributing to a system that is actually harming me and my community? How does the perception of a company like Google cloud my judgment when making design decisions around the people I want to “help”?
P.S. After all of these readings, I listened to Sarah Gold’s talk with a more analytical eye simply because it is hosted on a Google Developers YouTube channel. Clearly, the more knowledge and awareness you have, the better!