
My virtual sketchbook from University

Misidentified #4 RunwayML


RunwayML is a machine learning platform with a number of pre-trained models you can experiment with. The first thing I tested was a model that detects faces in images: I wanted to see whether it would recognise a face from a scene of a deepfake video, which it did. I then started thinking about how I could create something that tests whether people can tell the difference between a machine response and a human response, essentially recreating the Turing test, using a model that generates a text response from a prompt the user types, but that didn't work out too well.

I wasn't sure what to do next, but I wanted to test race, so I fed in images of myself and darker skin toned family members to see if the models still worked. This is where I identified a problem. One of the models was meant to hide a person's identity by essentially swapping out their face, but I soon realised the dataset was the issue: it replaced Obama's face, a brown skinned person's face, with a light skinned person's face.

I informed RunwayML of this error right away and asked whether there was a way of fixing their dataset. They told me this wasn't possible for now. I wanted to see what difference a more inclusive dataset would make; most likely a mistake like this simply wouldn't happen again, but I couldn't investigate properly because the trial period for training a model was very short and training took some time. Also, transferring the model into a website was tricky and involved using TensorFlow.js.
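For anyone curious what that last step roughly involves, below is a minimal sketch of loading a converted model in the browser with TensorFlow.js and running an image through it. It assumes the trained model has already been exported to the web format (typically via the tensorflowjs_converter tool); the model path, the 224×224 input size and the "face-detector" name are my own placeholders, not RunwayML's actual export.

```ts
import * as tf from '@tensorflow/tfjs';

// Hypothetical path to a model converted for the browser with
// tensorflowjs_converter; not RunwayML's real export location.
const MODEL_URL = '/models/face-detector/model.json';

async function runOnImage(image: HTMLImageElement): Promise<number[]> {
  const model = await tf.loadGraphModel(MODEL_URL);

  // Turn the <img> element into a tensor, resize and normalise it
  // to the (assumed) shape the model expects: [1, 224, 224, 3].
  const input = tf.tidy(() => {
    const pixels = tf.browser.fromPixels(image);
    const resized = tf.image.resizeBilinear(pixels, [224, 224]);
    return resized.toFloat().div(255).expandDims(0);
  });

  // Run inference; the output format depends entirely on the model,
  // so here it is just read back as a flat array of numbers.
  const output = model.predict(input) as tf.Tensor;
  const values = Array.from(await output.data());

  // Free the memory held by the tensors.
  input.dispose();
  output.dispose();

  return values;
}
```

Even in this stripped-down form you can see why it was fiddly: every model needs its own pre-processing and its own interpretation of the raw output, which the hosted RunwayML interface normally hides from you.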