Inter(M)ing(L)ing

Rough Draft: Ch. 1 & 2 | Helen Armstrong

I love the juxtaposition between Kate Darling’s and Sherry Turkle’s work in this week’s readings. Turkle gives us a peek at dystopia, claiming that by continuing to engage with machines, we forget the essence of our humanity. And before we spiral into a void of uncertainty, Darling offers us a glimpse of hope. She argues that anthropomorphizing machines and assigning them human attributes can instead lead to more empathy. For example, she provides an anecdote about a robot and the colonel assigned to oversee it. The robot’s task was to repeatedly trigger landmines. The colonel eventually became distraught at the sight of the poor robot sacrificing itself each time to set off a landmine, and he canceled the project soon after.

These juxtapositions are useful when designing because they allow us to envision possibilities. They address a larger issue of scope by asking us to consider the global implications for humanity of interacting with machine learning and AI. And then we can consider smaller scopes as well. For example, Carnegie Mellon HCI Professor John Zimmerman uses a great exercise to get students thinking about machine learning: he asks them to propose machine learning capabilities across 400 occupations, pushing them to think broadly and see the world more algorithmically. Rather than envisioning every possibility of a machine that can aid us, he asks his students to look for the opportunities in our everyday experiences that design can serve.

This leads me to the question of what design opportunities machine learning can enact in our everyday lives, so long as they don’t strip us of our humanity, as Turkle warns. What are some good applications of machine learning in our everyday lives that can enrich human interaction? Where are the boundaries to consider? Armstrong asks in her book, “When should designers encourage emotional bonding with AI and when should they actively discourage it?”

Tempting Fascism | Yuval Noah Harari

Data is the most valuable commodity of our time, and Yuval Noah Harari warns us that if the government controls our data, it can manipulate our deepest emotions and desires. Algorithms will become so enmeshed within our lives that whatever suggestions they show us, we’ll believe they are integral to our authentic selves. This seems to be rapidly encroaching on our digital spaces, but I wonder to what extent it encroaches on our physical spaces as well. In a digital space, I get an Amazon suggestion that reminds me that I need a new hand towel. Then I might be influenced by another anticipatory design ad that recommends a new toothbrush. Pretty soon, I’m sucked into a quick Amazon purchase scheme, feeling satisfied that my needs are met, and Amazon walks away with a few more bucks. Rinse and repeat, and I’m continually hooked on the next Amazon drug.

But how does the manipulation of data transcend the digital world and encroach upon the physical? In the digital space, I can see a proliferation of targeted product placement, and I recognize that physical spaces like grocery stores are already strategically designed to manipulate customers into choosing certain items. That’s why they continually stack holiday merchandise next to the checkout lanes, in the hope that people are enticed into picking something up. But that manipulation is already manufactured; I might see a personalized ad, yet so far my physical space itself hasn’t changed in response to my data. Perhaps this is only true of the present, and it will become more troubling in the future. Who knows?

But I think it’s an important question to consider. The videos have also inspired me to ask a few more:

How might we look to our past triumphs to overcome this? What strategies have been employed previously, and how can we apply those principles here?

What tactics can we use to avoid data manipulation and resist succumbing to dark patterns? Is this something we should address collectively or on an individual level? How do we keep ourselves from plummeting into excess?

Resources

Yuval Noah Harari, “Why Fascism Is So Tempting—and How Your Data Could Power It.”

Helen Armstrong, rough draft, Ch. 1 & 2 of Big Data. Big Design.