Avish Gordhan’s SXSW Diary: Day #2


Avish Gordhan, joint-ECD at M&C Saatchi Sydney, is attending SXSW 2019 in Austin, Texas. Here Avish shares his experiences straight from the conference exclusively for Campaign Brief.

A Black Mirror Reality

 

Ideas are everything in our business. But so is execution.

On my second day of SXSW, I sat through a discussion about AI and the future of storytelling. One of the speakers referenced a project she’d been working on – Dimensions in Testimony: Virtual Conversations With Holocaust Survivors. The idea behind it is to record and display testimony in a way that will preserve the dialogue between Holocaust survivors and learners far into the future.

It’s an interactive display that allows people to have a conversation with a Holocaust survivor who’s now passed away. Makes sense. Holocaust survivors are dying. And the best engagement with this subject is from first-hand accounts.

Everything I’ve learnt about AI so far (and I’ll caveat this by saying I’m not an expert in this subject) tells me that the input stimulus dramatically affects the output. Microsoft’s racist ‘Tay’ Twitter bot is an example of that. But an idea where the input is from the last remaining Holocaust survivors feels like a good thought. From a comms perspective, specifically, it has an elegance to it. The use of tech is central to the human cause.

But then the conversation moved on to the application of this tech at a personal level. Imagine being able to hold onto the stories of your family. Imagine feeding your grandfather, mother, wife or husband, anyone, into the learning process so that you could talk to them after they’ve passed away.

If you’ve ever lost someone you care about, you will probably have very strong opinions about this, one way or another.

I was put off.   

This execution ‘feels’ different. It lacks humanity. It’s dystopian. And I think it ignores the reality of grief and the process that goes with it. In an age of personalisation, I think this is (dare I say it) too personal. And it’s probably damaging to our psyches too.

Same idea. Different executions. And, in my opinion, vastly different results.

This is one of the clearest expressions I’ve come across of how tech can be pointed in the wrong direction.

As creators, we have a small but important duty of care. This particular talk was a reminder of why that’s necessary. We have stories to tell and we have new ways of telling them. But there needs to be humanity in our technology. We are responsible for what shape that humanity takes. We need to remember that just because we can doesn’t mean we should.