Last December, two journalists in Myanmar, Wa Lone and Kyaw Soe Oo, were invited to dinner by police officers to discuss their research into war crimes carried out by the military. The officers handed them some documents, then immediately arrested them under the country’s Official Secrets Act for possessing the classified material.
A similar case recently emerged in Canada, stemming from an alleged breach of a provincial Freedom of Information web portal. Although details of the case are still emerging, it appears the province left sensitive documents on publicly accessible URLs, alongside material it had cleared for release. A researcher then downloaded these files using a program that automatically retrieves every available file connected to a website, a practice known as scraping. The researcher has since been charged with unauthorized use of a computer and may face up to 10 years in prison.
This was not a hack. Website scraping is a legitimate research tool, routinely used by journalists and researchers who find a website's built-in search and retrieval functions cumbersome and inefficient. The alleged perpetrator has since told journalists he thought the material was open to the public.
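The technique is also trivially simple. As a minimal sketch, assuming a hypothetical portal that serves documents at sequential numeric URLs (the portal address and path below are invented for illustration), a scraper only needs to enumerate those addresses and download whatever comes back:

```python
def candidate_urls(base_url, first_id, last_id):
    """Return one URL per document ID in [first_id, last_id]."""
    return [f"{base_url}/{doc_id}" for doc_id in range(first_id, last_id + 1)]

# Fetching each URL (e.g. with urllib.request.urlopen) would then
# retrieve every file the server exposes, whether or not it was
# meant for release; the scraper has no way to tell the difference.
urls = candidate_urls("https://example.gov/foi/docs", 1, 3)
```

Nothing in such a script bypasses any access control; it simply asks the server for pages, one after another, exactly as a browser would.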
Some have suggested that the researcher is being used as a scapegoat, to deflect embarrassment over the security failures that led the province to distribute sensitive documents in the first place.
It may not be fair to compare this case to that of the Myanmar journalists, who were deliberately entrapped, but the stories are similar in that both turned on the good faith and competence of official sources. When researchers visit a government website, they should be able to assume that the material made available there is intended for public consumption and has been adequately vetted. Imagine a police van pulling up on your street, dropping a few boxes of classified files onto the sidewalk, and then arresting anyone who stopped to take a look.
We still don’t know all the facts in this case. Maybe it will emerge that the researcher was, in fact, looking to access material he knew was not meant to be public.