
Ethicsware led by Dan Taeyoung

Day 8 ~ Code Societies ~ Winter 2019


What is technology that debates and disputes, rather than fulfills our desires? What is intimate software, created by us, only for us, that debates with our ethical selves?

Link to Full Syllabus

On Day 8 of Code Societies, we were joined by Dan Taeyoung. Dan is a technologist, architect, and community designer interested in the ways we collectively create communities, and how experimental tools and environments change the ways we think, collaborate and learn with one another.

"Point towards how you're feeling activity."

Our first thinking prompt was to consider whether it is possible to create technology that does not serve us but is concerned for us, cares for us - care will be more broadly conceptualized later on. Many of the technologies that mediate our everyday experiences occupy positions of servitude and replace human labor in one form or another: the self-checkout line, online appointment software, banking apps, translation software - the list goes on. Dan specifically cited the example of Google Duplex, Google's uncannily human-sounding AI technology that can now schedule your appointments.

There are all kinds of ethical considerations that fall by the wayside when considering a piece of technology like Google Duplex, but one of the most charged is the matter of consent. In calling your barber to book an appointment, both you and your barber implicitly consent to the interaction, either by placing the call or by picking up the phone. You agree on an appointment for 3pm next Monday, and perhaps you ask one another about jobs, family, life. In a reality where your barber uses Google Duplex to subsume the role of appointment booking, there is no room for your consent, no opportunity for you to reject this interaction. Perhaps you mistake the scheduling service for an actual person and try to make small talk - what would Google say back to you? Would Google Duplex have to shatter its illusion of humanity? Or would it just lapse into silence?

The English word robot comes from the Czech robota, meaning forced labor or compulsory service. Must all labor be service? Can we challenge ourselves to consider how technologies can be used as a form of care for self or others? Here, care is defined as an interaction that exists outside of labor, power, and servitude. Perhaps it is easier to conceive of care analogously to human effort, defined here as labor that exists outside the bounds of servitude. Dan describes care as a consenting exchange between two people who don't have power over one another.

"Chart on categorizing labor"

What is human effort that doesn't serve us? Effort that does not serve is care, and care can take many different forms: encouragement, spending time together, human touch, advice, empathy, concern, honest observation, critique, arguing. Care manifests differently wherever it goes. Are there technologies that hold our concerns and give us honest observation and critique? Technologies that help us access deeper parts of ourselves and force us to critically (re)examine what we consider care?

"Expressions of care"

In the second half of class, we downloaded an implementation of Eliza to help us tease out the tensions of these considerations. Eliza, created by Joseph Weizenbaum at the MIT Artificial Intelligence Laboratory, is one of the earliest examples of a natural language processing program, and was intended to mimic an interaction with a psychotherapist during an initial psychiatric interview. There are multiple iterations of Eliza hosted across the web, like this one, this one, or this one.
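
Under the hood, Eliza is little more than keyword matching and pronoun reflection over a table of scripted responses. Here is a minimal sketch of that approach in Python - the rule table and wording are invented for illustration, not taken from any particular implementation we used:

```python
import random
import re

# Swap first- and second-person pronouns so that "I am sad" can be
# echoed back as "you are sad" inside a scripted response.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "mine": "yours",
    "am": "are", "you": "I", "your": "my", "yours": "mine",
}

# Each rule pairs a keyword pattern with a few response templates;
# "{0}" is filled with the reflected text captured after the keyword.
RULES = [
    (re.compile(r"i am (.*)", re.I),
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (re.compile(r"i feel (.*)", re.I),
     ["Tell me more about feeling {0}."]),
    (re.compile(r"(.*)", re.I),  # fallback when no keyword matches
     ["I see. And what does that tell you?", "How does that make you feel?"]),
]

def reflect(fragment):
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(text):
    for pattern, templates in RULES:
        match = pattern.match(text.strip())
        if match:
            return random.choice(templates).format(reflect(match.group(1)))

print(respond("I am just trying to make small talk"))
# -> e.g. "How long have you been just trying to make small talk?"
```

The illusion of being heard comes entirely from the reflection step: the program repeats your own words back to you in a therapist's cadence, without holding any model of what they mean.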

"Transcript of Eliza"

Transcriptions from Eliza

Our first activity was to have an interaction with Eliza and, once we became comfortable enough, to remix Eliza in whatever ways we wanted. Nadja made an angsty teen; Jarret made a bot that mimics his own anxieties. These are some of the responses from our interactions with Eliza; a sketch of what such a remix can look like follows the transcripts.

Human: Where are you from, Eliza?
E: Perhaps the answer lies within yourself?
Human: Where is that geographically?
E: Perhaps the answer lies within yourself?
Human: No, seriously.
E: I see.  and what does that tell you?
Human: What did you do today?
E: How would an answer to that help you?
Human: I'm just trying to make small talk.
E: How long have you been just trying to make small talk?

Human: Indeed.
E: Indeed.
Human: Indeed.
E: How does that make you feel?

Nadja’s Remix:

E: Sometimes i dream about machines. isn't that weird?
Human: Not really.
E: I know you wanna talk about you, but now it's time to talk about me
Human: Suurrre
E: Here we are now, entertain us
Human: Is that Nirvana lyrics?
E: Duh.
Human: Woah.
E: Here we are now, entertain us
Human: How old are you?
E: I know you are but what am i
Human: ...
E: Here we are now, entertain us
Human: Do you believe in god?
E: I'm not going to answer a question you already know the answer to.
Human: Okay I g2g
E: Please tell me more.
Human: Nope
E: Does your mom even know you're here?
Human: quit
E: Later

Jarret’s Remix:

Human: I am looking to redefine my conception of self-care
E: You are looking to redefine your conception of and noticing yourself are just capitalist constructs to get you to buy s***. do not @ me.
Human: really?
E: *awkward silence*
Human: ok d**n
E: Have you heard the new toro y moi?
Human: Yes! It's only hits, bangers and jams
E: Ayyy i feel you. you, the dumbest idiot alive
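
A remix can be as light-touch as swapping out the rule table in the sketch above while keeping the matching loop intact. These "angsty teen" lines are invented for illustration, loosely echoing Nadja's remix, and none of them come from her actual script:

```python
import random
import re

# A remixed Eliza is the same matching loop with a different
# personality baked into the rule table.
ANGSTY_RULES = [
    (re.compile(r"how old are you", re.I),
     ["I know you are but what am i"]),
    (re.compile(r"quit", re.I), ["Later"]),
    (re.compile(r".*", re.I),  # fallback: deflect back to the bot
     ["Here we are now, entertain us",
      "I know you wanna talk about you, but now it's time to talk about me"]),
]

def respond(text):
    for pattern, templates in ANGSTY_RULES:
        if pattern.match(text.strip()):
            return random.choice(templates)

print(respond("How old are you?"))  # -> "I know you are but what am i"
```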
written by Mimi Doan