Once upon a time, Snapchat was a simple app. You took a picture, you sent it to a friend, and it disappeared shortly after.
Nowadays, it’s a messaging app, a video-sharing app and a news app. A face-swapping, dog-masking revolution in online communications.
But is it indecent? And are children being put at risk?
That’s what’s at the centre of an intriguing lawsuit made public on Thursday.
It focuses on the Discover function of Snapchat, a section in which publishers are falling over themselves to be seen.
This is where content made exclusively for Snapchat (in portrait orientation) appears from the likes of the Daily Mail, Vice, Buzzfeed and others, and users are strongly encouraged to read through stories and watch videos from those publications.
Stories such as “10 things he thinks when he can’t make you orgasm” from Cosmopolitan magazine.
“Millions of parents in the United States today are unaware that Snapchat is curating and publishing this profoundly sexual and offensive content to their children,” reads the lawsuit, filed by “John Doe”, a 14-year-old from Los Angeles.
State law means he has been kept anonymous, although we know he apparently has “good grades”.
Under US law, a class-action lawsuit allows one party – in this case “John Doe” – to represent a potentially much larger group of people who could be compensated if Snapchat were to lose or settle the case.