Commentary: Facebook’s Deletion Policy
Why do people post on Facebook what they do? For instance, a man in Thailand live-streamed himself killing his baby daughter and then committing suicide. http://electronicnewsnetwork.com/news/baby-murder-suicide-facebook-201032349/ A man in Memphis, Tennessee, set his phone to record as he doused himself with kerosene, lit a match and committed suicide. http://www.stuff.co.nz/world/americas/92648925/man-commits-suicide-on-facebook-live-by-setting-himself-on-fire Videos of rapes, “revenge porn” (attempts to use intimate images to shame, humiliate or gain revenge against a person), and ISIS beheadings mingle online with images of family feasts and frolicking kittens. And any of these can be downloaded and saved to an institutional or personal archive.
The Guardian published a series of articles it called “The Facebook Files,” based on “more than 100 training manuals, spreadsheets and flowcharts” leaked to the newspaper that show how Facebook is dealing with violent content on its service. The company uses algorithms and is working with artificial intelligence to address the problem, trying to walk the line between censorship and free speech. Facebook also has a team of 3,000 people—and growing—who monitor postings and decide what to delete. As one man described a monitor’s job: “You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see.” Several nongovernmental organizations also have teams monitoring content, particularly watching for images of child abuse. All of these organizations have “safeguard programs” to support the mental well-being of the monitors who work in these psychologically stressful jobs. (When archivists must identify and redact documents for human rights lawsuits or process records from truth commissions and criminal courts, these same stresses are apparent and the same care for the health of staff members is essential.) https://www.theguardian.com/news/series/facebook-files
Facebook told The Guardian it is “going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.” What is not clear, however, is how law enforcement will be able to act on the complaints—or even conduct investigations—if Facebook truly deletes the content. Or is Facebook only disabling public links while retaining the content for use by legitimate investigators and prosecutors or defense counsel? We don’t know—or, at least, The Guardian doesn’t tell us. If Facebook does keep the information and it is available to law enforcement, archivists will eventually have to handle that evidence as part of their responsibility for police and attorney records. But if that repulsive information is not available for the use of those who protect our human rights, we will all be less secure and the posters—that is, the ones who are still alive—will be free to offend once more.