Cage-free AI

Caroline Sinders visited NCSU this January and discussed her work with machine learning and AI, and her human-centered approach to this technology. I was inspired by her call for individuals to think critically about these tools, the negative impacts they can have on civilians, and the role designers play in mitigating and preventing the havoc they can cause. As Caroline puts it, we need to be asking for more transparency. As a society, we deserve to know where our data is being sold, who is requesting it, how it is being stored, and why it is being collected. Without this knowledge, vulnerable populations are at risk of further discrimination and harm. Take, for example, an AI algorithm designed by a group of white men and then trained on data biased toward white men. What do you get? A product that works only for white men. Without asking ourselves what kind of impact these algorithms will have on society as a whole, we subject entire groups of people to potential discrimination and harm simply by excluding them.

As designers, we need to ask ourselves this: How can we expose harmful products like these and give people a choice (an actual choice) about how their data is used? When we have transparency, change can happen. Just look at the food industry. When people became more aware of the impact their food had on their health, they wanted to know how it was sourced and whether it was produced ethically. This, in turn, led to more ethically produced and transparent products like organic fruits and vegetables and cage-free eggs. So as we enter this AI-driven future, our role as designers will be crucial in creating products that tell the whole story and empower individuals to decide how they participate in and use products built with AI.