In March 2016, Microsoft launched Tay, a bot with the face and mannerisms of a teenage girl, designed to learn from and interact with users on Twitter. Within hours, Tay had been bombarded with sexual abuse and taught to defend Hitler, which is what happens when you give Twitter a baby to raise. The way Tay was treated by fellow Twitter users was chilling, but not without precedent – the earliest bots and digital assistants were designed to appear female, in part so that users, who were presumed to be male, could exploit them without guilt.
This makes sense when you consider that a great deal of the work we anticipate robots may one day do is currently done by women and girls, for low pay or no pay at all. Last week, a report by the ONS finally quantified the annual value of the “home production economy” – the housework, childcare and organisational chores done largely by women – at £1 trillion, almost 60 per cent of the “official” economy. From nurses, secretaries and sex workers to wives and girlfriends, the emotional labour that keeps society running is still feminised – and still stigmatised.
Right now, as we’re anticipating the creation of AIs to serve our intimate needs, organise our diaries and care for us, and to do it all for free and without complaint, it’s easy to see how many designers might be more comfortable with those entities having the voices and faces of women. If they were designed male, users might be tempted to treat them as equals, to acknowledge them as human in some way, perhaps even offer them an entry-level salary and a cheeky drink after work.
Laurie Penny on artificial emotion.