The 50% Opacity Box

[Illustration: The 50% Opacity Box, with lots of dots going into a box in a smooth line and coming out scattered]

Since coming to NC State three semesters ago, I’ve been wondering why the professors here won’t make it more obvious which design philosophies are the “right” ones to subscribe to. They even go so far as to assign readings from design academics they disagree with. Like the “black box” of machine learning software, students emerge from graduate school with changed perceptions and new design philosophies, but it’s unclear to me what happens in the middle. At least, that’s the way it seemed; I was having trouble figuring out how the alumni got from point A to point B.

Now, I think the process is more like a “slightly opaque box”. Having had real conversations with professors about design, I know that you can just ask them what they think and they’ll tell you. “This guy’s ideas are outdated,” “I like her writing,” etc. But they are very careful about steering you in any one direction. There is still something about the “grad school algorithm” that is blurry to me, something that is shaping us (the students) that I can’t get a good focus on.

Having recently attended a workshop on conversation and cybernetics with Paul Pangaro, I have been thinking a lot about the power that conversation has. AI is improving at exponential speed, but artificial conversational agents are still pretty dull. They don’t understand irony or sarcasm, and they are just products of the training data given to them. My grad school colleagues, my professors, and I have also been shaped by our own design educations, careers, and the work we consume. But we are also deeply affected by non-design interests, varied backgrounds, families, spirituality, art, and more.

Graduate school is a series of conversations: with ourselves, with our peers, and with our own work. These conversations have a cybernetic (steering) quality to them. But the conversations we have with each other shape our design perspectives in more complex ways than can be illustrated in an algorithm. There is a certain magic to it, though, not unlike the black box of a machine learning algorithm.