Monthly Archives: July 2017

Report states Australians do not trust telcos to keep their data safe and private

A report from Essential Research has highlighted that Australians do not trust telcos and ISPs to store their data safely, even though trust is rising for governments, law enforcement, and other businesses.

Australians are losing trust in telecommunications and internet service providers’ (ISPs) ability to store their data safely and securely, with a report from Essential Research highlighting only 4 percent of respondents have “a lot of trust” in the industry.

29 percent of the 1,020 respondents surveyed for the report [PDF] said they have some sort of trust in telcos and ISPs, a 3 percent drop from the previous year’s results.

Security agencies such as the Australian Federal Police (AFP), local police, and ASIO were found to be trusted by 64 percent of respondents, an increase from the 49 percent that said they trusted security agencies to store personal data safely and in a way that would prevent abuse in 2015.

Governments were found to be trusted 3 percent more than they were a year prior, with 43 percent having faith in those elected into office to protect their personal information.

It was revealed last week that Medicare card information was up for sale on the dark web, with the federal government responding swiftly with a statement saying the reports were being taken seriously. The system used to access Medicare card details is now undergoing a review.

However, Minister for Human Services Alan Tudge downplayed the seriousness of the issue, commenting that the only information available was a Medicare card number, which was not sufficient to access any personal health record.

The federal government accidentally published the full names, nationalities, locations, arrival dates, and boat arrival information of nearly 10,000 asylum seekers housed both on the Australian mainland and Christmas Island in February 2014.

KPMG said human error and a push to get immigration data up on deadline resulted in the details being published on the Department of Immigration and Border Protection’s website by mistake.

Last month, the Queensland Crime and Corruption Commission (CCC) alleged that two male police officers accessed the state’s criminal records database on a handful of unauthorised occasions.

According to the CCC, a 60-year-old former sergeant undertook checks on the Queensland Police Records and Information Management Exchange (Qprime) for personal purposes. The 31-year-old serving sergeant was accused of accessing Qprime on 10 occasions.

A 43-year-old serving detective senior constable from State Crime Command was similarly charged in March, and another was fined in May for 80 instances of unauthorised Qprime access.

A report from the Office of the Australian Information Commissioner (OAIC) in May revealed that only 53 percent of people it surveyed were able to nominate an organisation to report the misuse of their information to.

The OAIC said that when asked, only 47 percent admitted awareness of a Privacy Commissioner — either federal or state level — but a mere 7 percent said they would report misuse of information to a Privacy Commissioner. Rather, 12 percent would prefer to report such acts to the police, and 9 percent would rather directly contact the organisation involved.

The survey found that Australians have awarded the highest level of trust to health service providers, followed by financial institutions, and then both state and federal government departments.

Of the 1,800 Australians surveyed, 16 percent said they would avoid dealing with a government agency because of privacy concerns, while 58 percent would avoid dealing with a private company for the same reasons.

Another question asked by Essential Media was whether the individual surveyed had fallen victim to a handful of cyber-related crimes.

33 percent said they had a computer virus that damaged their computer or data; 22 percent admitted to having their credit card information stolen; 14 percent had been the victim of online fraud; cyber bullying was experienced by 10 percent of respondents; online stalking, invasion of privacy, or high levels of harassment was reported by 9 percent; and 6 percent claim to have had their identity stolen.

50 percent — 510 individuals — said they had not fallen victim to any of the cyber-related crimes.

A computer virus was reported by more males than females, while cyber bullying was experienced by more females than males, with those aged 18 to 34 the most likely to be on the receiving end of the anti-social behaviour. Similarly, online stalking was experienced more by females, with those aged 18 to 34 again the most targeted.


Henry Sapiecha

Organisations need a ‘moral imagination’ to build ethical data services that are fair for all punters. That’s where Centrelink said ‘Oops!’

Many recent discussions on privacy and data governance have focused on the practical, and on the short term. That’s understandable, given that here in Australia our mandatory data breach notification regime looms large, and Europe’s General Data Protection Regulation (GDPR) follows soon after. But balance is a good thing.

I was pleased, therefore, that the Data + Privacy Asia Pacific conference in Sydney on Wednesday kicked off with a look at the ethics of data stewardship. Not the everyday what, where, and how of data operations, but the why of doing any of these data things in the first place.

This framing was deliberate, Australia’s Information and Privacy Commissioner, Timothy Pilgrim, told ZDNet.

“There is no irony in the fact that often the most personal information is the richest in its potential for public data use,” said Pilgrim in his opening remarks. Therein lie the ethical problems.

How do we balance personal risks with the opportunity for public good, or at least the good of the organisation collecting the data? What counts as having a “genuine interest” in collecting the data, as opposed to sucking in as much as possible as soon as possible?

New ways of analysing data, re-identifying supposedly anonymous data, and reaching conclusions are being developed rapidly. Even the biggest players like Google and Facebook would admit they’ve no idea what might be possible even a few years ahead.

So how do we work within the ocean of future possibilities when data can be bought, sold, lost, stolen, or leaked?

“I think you’re exactly right in pointing to this as the main challenge, not just for us in this discussion, but for all of us here today,” said Rob Sherman, deputy chief privacy officer at Facebook. “We don’t know 20 years down the road what technology is going to look like.”

“You have to be willing to iterate. We have to have principles that are established, that reflect our views on the way to do this, independent of technology, and independent of specific use of data.”

Great. But how do you develop principles in the abstract?

According to Dr Simon Longstaff, executive director of The Ethics Centre, a useful tool here is the “veil of ignorance”, a thought experiment proposed in 1971 by American philosopher John Rawls.

Imagine that you’re developing the operating principles for, say, an on-demand transport service.

Now imagine that you know nothing about yourself, your natural abilities, or your position in society. You know nothing about your gender, race, language skills, health … none of these things. The veil of ignorance has descended.

How would you set up the rules for this service when you could be any of the people involved — driver, passenger, shareholder, brown-skinned, pregnant, mentally ill, drunk, whatever? Or even people not directly involved, such as vehicle manufacturers, regulators trying to minimise their overhead, or residents dealing with any environmental effects?

What principles would be fair and reasonable for everyone involved?

“[By doing that] you can start to get a sense of what you would build, that is technology-neutral, and effective in terms of dealing with our interests,” said Longstaff.

Such a “moral imagination”, as Longstaff described it, would go a long way towards addressing one of Startupland’s most obvious problems — that services are imagined by, built by, and built for a narrow demographic that’s mostly male, mostly white, mostly privileged, mostly aged under 30, and mostly besotted with their own “understanding” of how the world works.

Such a moral imagination might have helped create an on-demand transport service very different from Uber.

Remember the real reason for Uber?

“We wanted to get a classy ride. We wanted to be baller in San Francisco. That’s all it was about,” said founder Travis Kalanick in 2013.

Such a moral imagination might have helped the creators of the service which, it is alleged, discouraged poor students from university, rather than suggesting ways to help them follow their dream. They might have imagined how a teenager might feel being told, “Nah, I wouldn’t bother.”

And such a moral imagination might have helped human services minister Alan Tudge navigate his way through the Centrelink robodebt debacle, where shoddy algorithms and processes led to unreasonable demands for money being sent to welfare recipients. He might have imagined what it’d be like on the receiving end.

Things work very differently in Canada.

“Governments want to link data, so it might be to cut off somebody’s benefits, for example, because you’re declaring income which was not [previously] known,” said Michael McEvoy, Deputy Commissioner in the Office of the Information and Privacy Commissioner for British Columbia.

“What we’re working with government on that is to say that you can do that by machine process, but if you’re going to disentitle somebody, or in some way be prejudicing that individual, a human being has to look at that before any decision is made.”

While imagining Tudge with a moral imagination may be a stretch of the imagination, it’s not quite as unrealistic to expect an organisation’s board to include these issues under the heading of corporate social responsibility.


With just one wiretap order, US authorities listened in on 3.3 million phone calls

The order was carried out in 2016 as part of a federal narcotics investigation.

NEW YORK, NY — US authorities intercepted and recorded millions of phone calls last year under a single wiretap order, authorized as part of a narcotics investigation.

The wiretap order authorized an unknown government agency to carry out real-time intercepts of 3.29 million cell phone conversations over a two-month period at some point during 2016, after the order was applied for in late 2015.

The order was signed to help authorities track 26 individuals suspected of involvement with illegal drug and narcotic-related activities in Pennsylvania.

The wiretap cost the authorities $335,000 to conduct and led to a dozen arrests.

But the authorities noted that the surveillance effort produced no incriminating intercepts, and none of those arrested has been brought to trial or convicted.

The revelation was buried in the US Courts’ annual wiretap report, published earlier this week but largely overlooked.

“The federal wiretap with the most intercepts occurred during a narcotics investigation in the Middle District of Pennsylvania and resulted in the interception of 3,292,385 cell phone conversations or messages over 60 days,” said the report.

Details of the case remain largely unknown, likely in part because the wiretap order and several motions that have been filed in relation to the case are thought to be under seal.

It’s understood to be among the largest numbers of calls intercepted under a single wiretap order in years, though the exact number of Americans whose communications were caught up in the order is not known.

We contacted the US Attorney’s Office for the Middle District of Pennsylvania, where the wiretap application was filed, but did not hear back.

Albert Gidari, a former privacy lawyer who now serves as director of privacy at Stanford Law School’s Center for Internet and Society, criticized the investigation.

“They spent a fortune tracking 26 people and recording three million conversations and apparently got nothing,” said Gidari. “I’d love to see the probable cause affidavit for that one and wonder what the court thought on its 10 day reviews when zip came in.”

“I’m not surprised by the results because on average, a very very low percentage of conversations are incriminating, and a very very low percent results in conviction,” he added.

When reached, a spokesperson for the Justice Department did not comment.

Contact me securely

Zack Whittaker can be reached securely on Signal and WhatsApp at 646-755-8849, and his PGP fingerprint for email is: 4D0E 92F2 E36A EC51 DAAE 5D97 CB8C 15FA EB6C EEA5.

If you see something, leak something. Telling the world holds people in office accountable, no matter how big or small the story may be.

There are a number of ways to contact me securely, ranked in order of preference.

Encrypted calls and texts

I use both Signal and WhatsApp for end-to-end encrypted calling and messaging. The apps are available for iPhones and Android devices.

You can reach me at +1 646-755-8849 on Signal or WhatsApp.

I will get back to you as soon as possible if I don’t immediately respond.

Encrypted instant messaging

You can also contact me using “Off The Record” messaging, which allows you to talk to me in real time on your computer. It’s easy to use once you get started. This helpful guide will show you how to get set up.

You will need a Jabber instant messaging account. There are many options to choose from. For anonymity, you should create an account through the Tor browser.

You can reach me at: during working hours.

When you verify my fingerprint, it should match this: 914F503C 03771A5F A9E2AC91 95861FDA 9B3A7EAD.

Send me PGP email

My email address is (remove the dot for PGP).

PGP, or “Pretty Good Privacy,” is a great (but tricky-to-use) way of emailing someone encrypted files or messages. PGP works on almost every email account and computer, but using it on your work or home email address won’t hide who you are, or the fact that you sent a reporter an email.

If you want to remain anonymous, go somewhere that isn’t your home or work network. Then, you should use the Tor browser, which hides your location, to access a free email service (like this one or this one).

The EFF has a set of easy-to-use tutorials on how to get started.

You will need my public PGP key to email me securely, available here.

You can also verify my PGP fingerprint to be sure it’s me: 4D0E 92F2 E36A EC51 DAAE 5D97 CB8C 15FA EB6C EEA5.
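The workflow above — get the public key, check its fingerprint, then encrypt — can be sketched with GnuPG’s command line. This is a minimal sketch assuming GnuPG 2.1 or later; it uses a throwaway keyring and a generated demo key (the name and address are placeholders) so you can try it safely, where in practice you would `gpg --import` my published key file instead:

```shell
# Use a throwaway keyring so this demo never touches your real GnuPG setup
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Generate a demo key standing in for the published key
# (in practice: gpg --import zack-pubkey.asc)
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'Demo <demo@example.com>' default default never 2>/dev/null

# Print the key's fingerprint and compare it, character by character,
# against the fingerprint published above
gpg --fingerprint demo@example.com

# Encrypt a message; --armor produces ASCII text you can paste into any email
echo 'hello' | gpg --armor --encrypt --recipient demo@example.com \
    --trust-model always > msg.asc
head -1 msg.asc   # -----BEGIN PGP MESSAGE-----
```

Only the matching private key can decrypt `msg.asc`, which is why comparing the fingerprint first matters: it confirms the key really belongs to the person you think it does.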

You can also get this information on my Keybase profile.

When all else fails…

You can always send me things through the mail. My work address is:

Zack Whittaker c/o CBS,
28 E. 28th Street,
New York, NY 10016,
United States of America.

(Updated: January 14 with additional Keybase details.)
(Updated: April 30 with new Jabber fingerprint.)
