Nona Belomesoff is dead after meeting someone she knew only through Facebook. Should she have known better? No. That's too harsh. The rules are changing, and we all have a lot to learn. But Facebook isn't helping. Perhaps it needs bringing into line.
The core problem is that the very idea of Facebook privacy is a contradiction.
As users, we want to limit the information we disclose about ourselves, to control who sees what. As Mark Pesce writes, this control goes to the heart of trust and personal safety. In theory Facebook agrees. "You should have control over what you share," says its privacy guide.
Yet Facebook's business model is best served by exposing your personal information as widely as possible. To advertisers, so they can target advertising more accurately and pay more for the privilege. To other users, to encourage them to share more as well. To search engines, to bring more traffic to Facebook. To anyone who wants to pay.
Throughout its six-year history, as this infographic shows, every time Facebook has changed its privacy controls, the new default settings have reduced your privacy.
Sure, you don't have to accept those default settings. You can lock down your profile - even to the point where only specific people can see it. But when users log into Facebook and are confronted with a request to review and update their privacy settings, what do most of them do? Stop what they're doing to read through dozens of questions? Or click "OK" and get on with checking their party invitations and playing FarmVille?
If Facebook were serious about providing privacy settings that are "simple and easy to understand", the process would look very different indeed. At the very least, whenever the underlying privacy model changed, Facebook's new "recommended settings" would be no more permissive than what you'd set previously - and they certainly wouldn't default to "everyone".
Facebook is "an intrinsically dangerous model", says David Vaile, executive director of the Cyberspace Law and Policy Centre at the University of New South Wales.
In the case of Nona Belomesoff, Vaile says it's "so clear and so stark" that Facebook was giving out intimate personal information that enabled a stranger to model the sort of person she'd like to meet.
"I think the fact that many of the behaviours of the various software components are not sort of obvious, that the interface is so complex that only a programmer could love it, I think all of these things would point us to the need to review this," he told the Patch Monday podcast.
But Facebook's attitude is unlikely to change voluntarily, given the views of founder Mark Zuckerberg.
Privacy is no longer a social norm, Zuckerberg claimed in January. He derided companies that don't act like Facebook as being "trapped by the conventions and their legacies of what they've built".
Zuckerberg has always seen personal information as a commodity to be traded, it seems. Leaked chat logs from 2004, when he was 19 years old and "TheFacebook" was solely for students at Harvard University, reveal him mocking his fellow students for trusting him.
"Yeah so if you ever need info about anyone at Harvard - just ask," Zuckerberg told a friend. "I have over 4,000 emails, pictures, addresses, SNS [social network service addresses]."
When asked how he'd gotten them he replied, "People just submitted it. I don't know why. They 'trust me'. Dumb f***s."
There are also allegations that Zuckerberg used private login data from TheFacebook to break into users' external email accounts. Facebook, for its part, dismisses these claims. "We're not going to debate the disgruntled litigants and anonymous sources who seek to rewrite Facebook's early history or embarrass Mark Zuckerberg with dated allegations," the company stated.
Privacy concerns have now boosted web searches for "delete facebook account" and even spawned a Quit Facebook Day for May 31.
But though there are alternatives, leaving Facebook is difficult. Microsoft social researcher danah boyd says many people feel they need to be part of it "for work, for personal reasons, because they got to connect with someone there that they couldn't connect with elsewhere".
"Facebook is now a utility for many. The problem with utilities is that they get regulated," boyd writes.
Vaile agrees, citing Facebook as an example of market failure.
"One sort of market failure is where you have a near-monopoly," he says. "If you've got 400 million users and many commercial entities feeling like they have to put a Facebook link on their pages, then you start to get some of the capacity for abuse that you get with monopolies. The other aspect of market failure is when you don't have a proper disclosure of, say, risks and costs in [a] way that actually gets through and enables sensible decisions to be made."
The jurisdictional problems are obvious, yet both boyd and Vaile see the safety issue as paramount.
"I find James Grimmelmann's argument that we think about [privacy as product safety] to be an intriguing frame," boyd writes. "I'd expect to see a whole lot more coming down the line in this regards. And Facebook knows it. Why else would they bring in a former Bush regulator to defend its privacy practices?"
And as Vaile puts it, "Now that it's clear it has potentially fatal consequences, that takes it from something that's just goodie-goodie airy-fairy wouldn't-it-be-nice kind of thinking to one where it's a question of a life-saving issue to take on board."