Glenn Greenwald and his colleagues at The Intercept have just released an extensive report on the NSA's use of XKEYSCORE. And here’s a video on the same topic:
It’s simple. Whenever Bruce Schneier speaks, listen.
How we sold our souls – and more – to the internet giants
Sunday 17 May 2015 11.00 BST
Last year, when my refrigerator broke, the repairman replaced the computer that controls it. I realised that I had been thinking about the refrigerator backwards: it’s not a refrigerator with a computer, it’s a computer that keeps food cold. Just like that, everything is turning into a computer. Your phone is a computer that makes calls. Your car is a computer with wheels and an engine. Your oven is a computer that cooks lasagne. Your camera is a computer that takes pictures. Even our pets and livestock are now regularly chipped; my cat could be considered a computer that sleeps in the sun all day.
Computers are being embedded into all sorts of products that connect to the internet. Nest, which Google purchased last year for more than $3bn, makes an internet-enabled thermostat. You can buy a smart air conditioner that learns your preferences and maximises energy efficiency. Fitness tracking devices, such as Fitbit or Jawbone, collect information about your movements, awake and asleep, and use that to analyse both your exercise and sleep habits. Many medical devices are starting to be internet-enabled, collecting and reporting a variety of biometric data. There are – or will be soon – devices that continually measure our vital signs, moods and brain activity.
This year, we have had two surprising stories of technology monitoring our activity: Samsung televisions that listen to conversations in the room and send them elsewhere for transcription – just in case someone is telling the TV to change the channel – and a Barbie that records your child’s questions and sells them to third parties.
All these computers produce data about what they’re doing and a lot of it is surveillance data. It’s the location of your phone, who you’re talking to and what you’re saying, what you’re searching and writing. It’s your heart rate. Corporations gather, store and analyse this data, often without our knowledge, and typically without our consent. Based on this data, they draw conclusions about us that we might disagree with or object to and that can affect our lives in profound ways. We may not like to admit it, but we are under mass surveillance.
Internet surveillance has evolved into a shockingly extensive, robust and profitable surveillance architecture. You are being tracked pretty much everywhere you go, by many companies and data brokers: 10 different companies on one website, a dozen on another. Facebook tracks you on every site with a Facebook Like button (whether you’re logged in to Facebook or not), while Google tracks you on every site that has a Google Plus +1 button or that uses Google Analytics to monitor its own web traffic.
Most of the companies tracking you have names you’ve never heard of: Rubicon Project, AdSonar, Quantcast, Undertone, Traffic Marketplace. If you want to see who’s tracking you, install one of the browser plug-ins that let you monitor cookies. I guarantee you will be startled. One reporter discovered that 105 different companies tracked his internet use during one 36-hour period. In 2010, the seemingly innocuous site Dictionary.com installed more than 200 tracking cookies on your browser when you visited.
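The kind of monitoring such browser plug-ins do can be sketched in a few lines. The following Python sketch (with entirely hypothetical page content and tracker domains) scans a page's HTML for third-party resources, which is a crude proxy for trackers; a real plug-in would also watch the cookies those hosts set:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class TrackerScanner(HTMLParser):
    """Collects third-party domains referenced by a page's
    script, img and iframe tags -- a crude proxy for trackers."""

    def __init__(self, first_party):
        super().__init__()
        self.first_party = first_party
        self.third_parties = set()

    def handle_starttag(self, tag, attrs):
        if tag not in ("script", "img", "iframe"):
            return
        src = dict(attrs).get("src", "")
        host = urlparse(src).hostname
        # Relative URLs have no hostname; same-site hosts are first-party.
        if host and not host.endswith(self.first_party):
            self.third_parties.add(host)

# Hypothetical page embedding two third-party resources:
page = """
<html><body>
  <script src="https://ads.example-tracker.com/beacon.js"></script>
  <img src="https://pixel.example-analytics.net/1x1.gif">
  <img src="/images/logo.png">
</body></html>
"""
scanner = TrackerScanner("news-site.example")
scanner.feed(page)
print(sorted(scanner.third_parties))
# -> ['ads.example-tracker.com', 'pixel.example-analytics.net']
```

On a real news site, a scan like this routinely turns up a dozen or more such domains, which is exactly what the plug-ins mentioned above make visible.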
It’s no different on your smartphone. The apps there track you as well. They track your location and sometimes download your address book, calendar, bookmarks and search history. In 2013, the rapper Jay Z and Samsung teamed up to offer people who downloaded an app the ability to hear the new Jay Z album before release. The app required that users give Samsung consent to view all accounts on the phone, track its location and see who the user was talking to. The Angry Birds game even collects location data when you’re not playing. It’s less Big Brother and more hundreds of tittle-tattle little brothers.
Most internet surveillance data is inherently anonymous, but companies are increasingly able to correlate the information gathered with other information that positively identifies us. You identify yourself willingly to lots of internet services. Often you do this with only a username, but increasingly usernames can be tied to your real name. Google tried to enforce this with its “real name policy”, which required users to register for Google Plus with their legal names, until it rescinded that policy in 2014. Facebook pretty much demands real names. Whenever you use your credit card number to buy something, your real identity is tied to any cookies set by companies involved in that transaction. And any browsing you do on your smartphone is tied to you as the phone’s owner, although the website might not know it.
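The correlation step is mechanically trivial. A sketch with entirely hypothetical data: once a single transaction ties a cookie ID to a real name, one join turns a pseudonymous browsing log into a named profile.

```python
# Hypothetical logs: a pseudonymous browsing log, and a purchase record
# that shares the same cookie ID because an ad network saw the checkout.
browsing_log = [
    {"cookie": "c7f3", "url": "clinic.example/anxiety"},
    {"cookie": "c7f3", "url": "jobs.example/resign-letter"},
    {"cookie": "91aa", "url": "news.example/sports"},
]
purchases = [
    {"cookie": "c7f3", "name": "Alice Example", "card": "****1234"},
]

# One join is enough to de-anonymize every entry sharing that cookie.
names = {p["cookie"]: p["name"] for p in purchases}
profile = [(names.get(e["cookie"], "unknown"), e["url"]) for e in browsing_log]
print(profile)
# -> [('Alice Example', 'clinic.example/anxiety'),
#     ('Alice Example', 'jobs.example/resign-letter'),
#     ('unknown', 'news.example/sports')]
```

Data brokers do this at scale, across many identifiers at once (cookies, device IDs, email hashes), which is why "anonymous" tracking data rarely stays anonymous.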
Surveillance is the business model of the internet for two primary reasons: people like free and people like convenient. The truth is, though, that people aren’t given much of a choice. It’s either surveillance or nothing and the surveillance is conveniently invisible so you don’t have to think about it. And it’s all possible because laws have failed to keep up with changes in business practices.
In general, privacy is something people tend to undervalue until they don’t have it anymore. Arguments such as “I have nothing to hide” are common, but aren’t really true. People living under constant surveillance quickly realise that privacy isn’t about having something to hide. It’s about individuality and personal autonomy. It’s about being able to decide who to reveal yourself to and under what terms. It’s about being free to be an individual and not having to constantly justify yourself to some overseer.
This tendency to undervalue privacy is exacerbated by companies deliberately making sure that privacy is not salient to users. When you log on to Facebook, you don’t think about how much personal information you’re revealing to the company; you chat with your friends. When you wake up in the morning, you don’t think about how you’re going to allow a bunch of companies to track you throughout the day; you just put your cell phone in your pocket.
But by accepting surveillance-based business models, we hand over even more power to the powerful. Google controls two-thirds of the US search market. Almost three-quarters of all internet users have Facebook accounts. Amazon controls about 30% of the US book market, and 70% of the ebook market. Comcast owns about 25% of the US broadband market. These companies have enormous power and control over us simply because of their economic position.
Our relationship with many of the internet companies we rely on is not a traditional company-customer relationship. That’s primarily because we’re not customers – we’re products those companies sell to their real customers. The companies are analogous to feudal lords and we are their vassals, peasants and – on a bad day – serfs. We are tenant farmers for these companies, working on their land by producing data that they in turn sell for profit.
Yes, it’s a metaphor, but it often really feels like that. Some people have pledged allegiance to Google. They have Gmail accounts, use Google Calendar and Google Docs and have Android phones. Others have pledged similar allegiance to Apple. They have iMacs, iPhones and iPads and let iCloud automatically synchronise and back up everything. Still others let Microsoft do it all. Some of us have pretty much abandoned email altogether for Facebook, Twitter and Instagram. We might prefer one feudal lord to the others. We might distribute our allegiance among several of these companies or studiously avoid a particular one we don’t like. Regardless, it’s becoming increasingly difficult to avoid pledging allegiance to at least one of them.
After all, customers get a lot of value out of having feudal lords. It’s simply easier and safer for someone else to hold our data and manage our devices. We like having someone else take care of our device configurations, software management, and data storage. We like it when we can access our email anywhere, from any computer, and we like it that Facebook just works, from any device, anywhere. We want our calendar entries to appear automatically on all our devices. Cloud storage sites do a better job of backing up our photos and files than we can manage by ourselves; Apple has done a great job of keeping malware out of its iPhone app store. We like automatic security updates and automatic backups; the companies do a better job of protecting our devices than we ever did. And we’re really happy when, after we lose a smartphone and buy a new one, all of our data reappears on it at the push of a button.
In this new world of computing, we’re no longer expected to manage our computing environment. We trust the feudal lords to treat us well and protect us from harm. It’s all a result of two technological trends.
The first is the rise of cloud computing. Basically, our data is no longer stored and processed on our computers. That all happens on servers owned by many different companies. The result is that we no longer control our data. These companies access our data—both content and metadata—for whatever profitable purpose they want. They have carefully crafted terms of service that dictate what sorts of data we can store on their systems, and can delete our entire accounts if they believe we violate them. And they turn our data over to law enforcement without our knowledge or consent. Potentially even worse, our data might be stored on computers in a country whose data protection laws are less than rigorous.
The second trend is the rise of user devices that are managed closely by their vendors: iPhones, iPads, Android phones, Kindles, ChromeBooks, and the like. The result is that we no longer control our computing environment. We have ceded control over what we can see, what we can do, and what we can use. Apple has rules about what software can be installed on iOS devices. You can load your own documents onto your Kindle, but Amazon is able to delete books it has already sold you. In 2009, Amazon automatically deleted some editions of George Orwell’s Nineteen Eighty-Four from users’ Kindles because of a copyright issue. I know, you just couldn’t write this stuff any more ironically.
It’s not just hardware. It’s getting hard to just buy a piece of software and use it on your computer in any way you like. Increasingly, vendors are moving to a subscription model—Adobe did that with Creative Cloud in 2013—that gives the vendor much more control. Microsoft hasn’t yet given up on a purchase model, but is making its MS Office subscription very attractive. And Office 365’s option of storing your documents in the Microsoft cloud is hard to turn off. Companies are pushing us in this direction because it makes us more profitable as customers or users.
Given current laws, trust is our only option. There are no consistent or predictable rules. We have no control over the actions of these companies. I can’t negotiate the rules regarding when Yahoo will access my photos on Flickr. I can’t demand greater security for my presentations on Prezi or my task list on Trello. I don’t even know the cloud providers to whom those companies have outsourced their infrastructures. If any of those companies delete my data, I don’t have the right to demand it back. If any of those companies give the government access to my data, I have no recourse. And if I decide to abandon those services, chances are I can’t easily take my data with me.
Political scientist Henry Farrell observed: “Much of our life is conducted online, which is another way of saying that much of our life is conducted under rules set by large private businesses, which are subject neither to much regulation nor much real market competition.”
The common defence is something like “business is business”. No one is forced to join Facebook or use Google search or buy an iPhone. Potential customers are choosing to enter into these quasi-feudal user relationships because of the enormous value they receive from them. If they don’t like it, goes the argument, they shouldn’t do it.
This advice is not practical. It’s not reasonable to tell people that if they don’t like their data being collected, they shouldn’t email, shop online, use Facebook or have a mobile phone. I can’t imagine students getting through school anymore without an internet search or Wikipedia, much less finding a job afterwards. These are the tools of modern life. They’re necessary to a career and a social life. Opting out just isn’t a viable choice for most of us, most of the time; it violates what have become very real norms of contemporary life.
Right now, choosing among providers is not a choice between surveillance or no surveillance, but only a choice of which feudal lords get to spy on you. This won’t change until we have laws to protect both us and our data from these sorts of relationships. Data is power and those that have our data have power over us. It’s time for government to step in and balance things out.
Adapted from Data and Goliath by Bruce Schneier, published by Norton Books. To order a copy for £17.99 go to bookshop.theguardian.com. Bruce Schneier is a security technologist and CTO of Resilient Systems Inc. He blogs at schneier.com, and tweets at @schneierblog
Some thoughts for the weekend… listen especially to the first six and a half minutes of this clip below about the conspiracy theories surrounding the recent mysterious death of Dave Goldberg, the husband of Facebook Chief Operating Officer Sheryl Sandberg – the “Facebook-NSA Queen”.
(Above) Photo credit: dailydotcom
Are you wondering why this “problem” (data overload – see article below) did not happen earlier…?
NSA is so overwhelmed with data, it’s no longer effective, says whistleblower
Summary: One of the agency’s first whistleblowers says the NSA is taking in too much data for it to handle, which can have disastrous — if not deadly — consequences.
By Zack Whittaker for Zero Day | April 30, 2015 — 14:29 GMT (22:29 GMT+08:00)
NEW YORK — A former National Security Agency official turned whistleblower has spent almost a decade and a half in civilian life. And he says he’s still “pissed” by what he’s seen leak in the past two years.
In a lunch meeting hosted by Contrast Security founder Jeff Williams on Wednesday, William Binney, a former NSA official who spent more than three decades at the agency, said the US government’s mass surveillance programs have become so engorged with data that they are no longer effective, losing vital intelligence in the fray.
That, he said, can — and has — led to terrorist attacks succeeding.
Binney said that an analyst today can run one simple query across the NSA’s various databases, only to become immediately overloaded with information. With about four billion people — around two-thirds of the world’s population — under the NSA and partner agencies’ watchful eyes, according to his estimates, there is too much data being collected.
“That’s why they couldn’t stop the Boston bombing, or the Paris shootings, because the data was all there,” said Binney. Because the agency isn’t carefully and methodically setting its tools up for smart data collection, that leaves analysts to search for a needle in a haystack.
“The data was all there… the NSA is great at going back over it forensically for years to see what they were doing before that,” he said. “But that doesn’t stop it.”
Binney called this a “bulk data failure” — in that the NSA programs, leaked by Edward Snowden, are collecting too much for the agency to process. He said the problem runs deeper across law enforcement and other federal agencies, like the FBI, the CIA, and the Drug Enforcement Administration (DEA), which all have access to NSA intelligence.
Binney left the NSA a month after the September 11 attacks in New York City in 2001, days after controversial counter-terrorism legislation — the Patriot Act — was enacted in the wake of the attacks. Binney stands jaded by his experience leaving the shadowy eavesdropping agency, but impassioned for the job he once had. He left after a program he helped develop was scrapped three weeks prior to September 11, replaced by a system he said was more expensive and more intrusive. Snowden has said that Binney’s case in part inspired him to leak thousands of classified documents to journalists.
Since then, the NSA has ramped up its intelligence gathering mission to indiscriminately “collect it all.”
Binney said the NSA is today not as interested in phone records — such as who calls whom, when, and for how long. Although the Obama administration calls the program a “critical national security tool,” the agency is increasingly looking at the content of communications, as the Snowden disclosures have shown.
Binney said he estimated that a “maximum” of 72 companies were participating in the bulk records collection program — including Verizon, but said it was a drop in the ocean. He also called PRISM, the clandestine surveillance program that grabs data from nine named Silicon Valley giants, including Apple, Google, Facebook, and Microsoft, just a “minor part” of the data collection process.
“The Upstream program is where the vast bulk of the information was being collected,” said Binney, talking about how the NSA tapped undersea fiber optic cables. With help from its British counterparts at GCHQ, the NSA is able to “buffer” more than 21 petabytes a day.
Binney said the “collect it all” mantra now may be the norm, but it’s expensive and ineffective.
“If you have to collect everything, there’s an ever increasing need for more and more budget,” he said. “That means you can build your empire.”
They say you never leave the intelligence community. Once you’re a spy, you’re always a spy — it’s a job for life, with few exceptions. One of those is blowing the whistle, which he did. Since then, he has spent his retirement lobbying for change and reform in industry and in Congress.
“They’re taking away half of the constitution in secret,” said Binney. “If they want to change the constitution, there’s a way to do that — and it’s in the constitution.”
An NSA spokesperson did not immediately comment.
Have to feel sorry for Snowden here…
Here’s an exclusive story (below) from Al Jazeera neither Google nor the NSA wants you to know.
Exclusive: Emails reveal close Google relationship with NSA
National Security Agency head and Internet giant’s executives have coordinated through high-level policy discussions
May 6, 2014 5:00AM ET
by Jason Leopold
Email exchanges between National Security Agency Director Gen. Keith Alexander and Google executives Sergey Brin and Eric Schmidt suggest a far cozier working relationship between some tech firms and the U.S. government than was implied by Silicon Valley brass after last year’s revelations about NSA spying.
Disclosures by former NSA contractor Edward Snowden about the agency’s vast capability for spying on Americans’ electronic communications prompted a number of tech executives whose firms cooperated with the government to insist they had done so only when compelled by a court of law.
But Al Jazeera has obtained two sets of email communications dating from a year before Snowden became a household name that suggest not all cooperation was under pressure.
On the morning of June 28, 2012, an email from Alexander invited Schmidt to attend a four-hour-long “classified threat briefing” on Aug. 8 at a “secure facility in proximity to the San Jose, CA airport.”
“The meeting discussion will be topic-specific, and decision-oriented, with a focus on Mobility Threats and Security,” Alexander wrote in the email, obtained under a Freedom of Information Act (FOIA) request, the first of dozens of communications between the NSA chief and Silicon Valley executives that the agency plans to turn over.
Alexander, Schmidt and other industry executives met earlier in the month, according to the email. But Alexander wanted another meeting with Schmidt and “a small group of CEOs” later that summer because the government needed Silicon Valley’s help.
“About six months ago, we began focusing on the security of mobility devices,” Alexander wrote. “A group (primarily Google, Apple and Microsoft) recently came to agreement on a set of core security principles. When we reach this point in our projects we schedule a classified briefing for the CEOs of key companies to provide them a brief on the specific threats we believe can be mitigated and to seek their commitment for their organization to move ahead … Google’s participation in refinement, engineering and deployment of the solutions will be essential.”
Jennifer Granick, director of civil liberties at Stanford Law School’s Center for Internet and Society, said she believes information sharing between industry and the government is “absolutely essential” but “at the same time, there is some risk to user privacy and to user security from the way the vulnerability disclosure is done.”
The challenge facing government and industry was to enhance security without compromising privacy, Granick said. The emails between Alexander and Google executives, she said, show “how informal information sharing has been happening within this vacuum where there hasn’t been a known, transparent, concrete, established methodology for getting security information into the right hands.”
The classified briefing cited by Alexander was part of a secretive government initiative known as the Enduring Security Framework (ESF), and his email provides some rare information about what the ESF entails, the identities of some participant tech firms and the threats they discussed.
Alexander explained that the deputy secretaries of the Department of Defense, Homeland Security and “18 US CEOs” launched the ESF in 2009 to “coordinate government/industry actions on important (generally classified) security issues that couldn’t be solved by individual actors alone.”
“For example, over the last 18 months, we (primarily Intel, AMD [Advanced Micro Devices], HP [Hewlett-Packard], Dell and Microsoft on the industry side) completed an effort to secure the BIOS of enterprise platforms to address a threat in that area.”
“BIOS” is an acronym for “basic input/output system,” the system software that initializes the hardware in a personal computer before the operating system starts up. NSA cyberdefense chief Debora Plunkett in December disclosed that the agency had thwarted a “BIOS plot” by a “nation-state,” identified as China, to brick U.S. computers. That plot, she said, could have destroyed the U.S. economy. “60 Minutes,” which broke the story, reported that the NSA worked with unnamed “computer manufacturers” to address the BIOS software vulnerability.
But some cybersecurity experts questioned the scenario outlined by Plunkett.
“There is probably some real event behind this, but it’s hard to tell, because we don’t have any details,” wrote Robert Graham, CEO of the penetration-testing firm Errata Security in Atlanta, on his blog in December. “It’s completely false in the message it is trying to convey. What comes out is gibberish, as any technical person can confirm.”
And by enlisting the NSA to shore up their defenses, those companies may have made themselves more vulnerable to the agency’s efforts to breach them for surveillance purposes.
“I think the public should be concerned about whether the NSA was really making its best efforts, as the emails claim, to help secure enterprise BIOS and mobile devices and not holding the best vulnerabilities close to their chest,” said Nate Cardozo, a staff attorney with the Electronic Frontier Foundation’s digital civil liberties team.
He doesn’t doubt that the NSA was trying to secure enterprise BIOS, but he suggested that the agency, for its own purposes, was “looking for weaknesses in the exact same products they’re trying to secure.”
The NSA “has no business helping Google secure its facilities from the Chinese and at the same time hacking in through the back doors and tapping the fiber connections between Google base centers,” Cardozo said. “The fact that it’s the same agency doing both of those things is in obvious contradiction and ridiculous.” He recommended dividing offensive and defensive functions between two agencies.
Two weeks after the “60 Minutes” broadcast, the German magazine Der Spiegel, citing documents obtained by Snowden, reported that the NSA inserted back doors into BIOS, doing exactly what Plunkett accused a nation-state of doing during her interview.
Google’s Schmidt was unable to attend the mobility security meeting in San Jose in August 2012.
“General Keith.. so great to see you.. !” Schmidt wrote. “I’m unlikely to be in California that week so I’m sorry I can’t attend (will be on the east coast). Would love to see you another time. Thank you !” Since the Snowden disclosures, Schmidt has been critical of the NSA and said its surveillance programs may be illegal.
Army Gen. Martin E. Dempsey, chairman of the Joint Chiefs of Staff, did attend that briefing. Foreign Policy reported a month later that Dempsey and other government officials — no mention of Alexander — were in Silicon Valley “picking the brains of leaders throughout the valley and discussing the need to quickly share information on cyber threats.” Foreign Policy noted that the Silicon Valley executives in attendance belonged to the ESF. The story did not say that mobility threats and security were the top agenda item, alongside a classified threat briefing.
A week after the gathering, Dempsey said during a Pentagon press briefing, “I was in Silicon Valley recently, for about a week, to discuss vulnerabilities and opportunities in cyber with industry leaders … They agreed — we all agreed on the need to share threat information at network speed.”
Google co-founder Sergey Brin had attended previous meetings of the ESF group but, because of a scheduling conflict, according to Alexander’s email, he too could not attend the Aug. 8 briefing in San Jose; it is unknown whether someone else from Google was sent in his place.
A few months earlier, Alexander had emailed Brin to thank him for Google’s participation in the ESF.
“I see ESF’s work as critical to the nation’s progress against the threat in cyberspace and really appreciate Vint Cerf [Google’s vice president and chief Internet evangelist], Eric Grosse [vice president of security engineering] and Adrian Ludwig’s [lead engineer for Android security] contributions to these efforts during the past year,” Alexander wrote in a Jan. 13, 2012, email.
“You recently received an invitation to the ESF Executive Steering Group meeting, which will be held on January 19, 2012. The meeting is an opportunity to recognize our 2012 accomplishments and set direction for the year to come. We will be discussing ESF’s goals and specific targets for 2012. We will also discuss some of the threats we see and what we are doing to mitigate those threats … Your insights, as a key member of the Defense Industrial Base, are valuable to ensure ESF’s efforts have measurable impact.”
A Google representative declined to answer specific questions about Brin’s and Schmidt’s relationship with Alexander or about Google’s work with the government.
“We work really hard to protect our users from cyberattacks, and we always talk to experts — including in the U.S. government — so we stay ahead of the game,” the representative said in a statement to Al Jazeera. “It’s why Sergey attended this NSA conference.”
Brin responded to Alexander the following day even though the head of the NSA didn’t use the appropriate email address when contacting the co-chairman.
“Hi Keith, looking forward to seeing you next week. FYI, my best email address to use is [redacted],” Brin wrote. “The one your email went to — firstname.lastname@example.org — I don’t really check.”
Facebook ‘tracks all visitors, breaching EU law’
Exclusive: People without Facebook accounts, logged out users, and EU users who have explicitly opted out of tracking are all being tracked, report says
Facebook tracks the web browsing of everyone who visits a page on its site even if the user does not have an account or has explicitly opted out of tracking in the EU, extensive research commissioned by the Belgian data protection agency has revealed.
The researchers now claim that Facebook tracks computers of users without their consent, whether they are logged in to Facebook or not, and even if they are not registered users of the site or explicitly opt out in Europe. Facebook tracks users in order to target advertising.
The issue revolves around Facebook’s use of its social plugins such as the “Like” button, which has been placed on more than 13m sites including health and government sites.
Facebook places tracking cookies on users’ computers if they visit any page on the facebook.com domain, including fan pages or other pages that do not require a Facebook account to visit.
When a user visits a third-party site that carries one of Facebook’s social plug-ins, the plug-in detects the tracking cookies and sends them back to Facebook – even if the user does not interact with the Like button, Facebook Login or another extension of the social media site.
EU privacy law states that prior consent must be given before issuing a cookie or performing tracking, unless it is necessary for either the networking required to connect to the service (“criterion A”) or to deliver a service specifically requested by the user (“criterion B”).
A cookie is a small file placed on a user’s computer by a website that stores settings, previous activities and other small amounts of information needed by the site. They are sent to the site on each visit and can therefore be used to identify a user’s computer and track their movements across the web.
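The mechanism described above can be sketched in a few lines. This is a minimal simulation (not a real HTTP server; the names are illustrative) of how a unique-ID cookie, issued once, lets a server link every subsequent visit into one browsing profile:

```python
import uuid

class TrackingServer:
    """Minimal sketch of cookie-based tracking: on a first visit the
    server issues a unique-ID cookie; the browser sends it back on
    every later request, letting the server link all visits together."""

    def __init__(self):
        self.history = {}  # visitor id -> list of pages seen

    def handle_request(self, page, cookie=None):
        if cookie is None:                 # first visit: mint an identifier
            cookie = uuid.uuid4().hex
        self.history.setdefault(cookie, []).append(page)
        return cookie                      # sent as Set-Cookie in a real server

server = TrackingServer()
c = server.handle_request("/sports")        # browser stores the cookie...
server.handle_request("/health", cookie=c)  # ...and replays it automatically
server.handle_request("/jobs", cookie=c)
print(server.history[c])
# -> ['/sports', '/health', '/jobs']
```

The browser replays the cookie without any user action, which is why a cookie set on one facebook.com page can later identify the same browser on any site carrying a social plug-in.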
“We collect information when you visit or use third-party websites and apps that use our services. This includes information about the websites and apps you visit, your use of our services on those websites and apps, as well as information the developer or publisher of the app or website provides to you or us,” states Facebook’s data usage policy, which was updated this year.
Facebook’s tracking practices have ‘no legal basis’
An opinion published by Article 29, the pan-European data regulator working party, in 2012 stated that unless delivering a service specifically requested by the user, social plug-ins must have consent before placing a cookie. “Since by definition social plug-ins are destined to members of a particular social network, they are not of any use for non-members, and therefore do not match ‘criterion B’ for those users.”
The same applies for users of Facebook who are logged out at the time, while logged-in users should only be served a “session cookie” that expires when the user logs out or closes their browser, according to Article 29.
The Article 29 working party has also said that cookies set for “security purposes” can only fall under the consent exemptions if they are essential for a service explicitly requested by the user – not general security of the service.
The social network tracks its users for advertising purposes across non-Facebook sites by default. Users can opt out of ad tracking, but an opt-out mechanism “is not an adequate mechanism to obtain average users’ informed consent”, according to Article 29.
“European legislation is really quite clear on this point. To be legally valid, an individual’s consent towards online behavioural advertising must be opt-in,” explained Brendan Van Alsenoy, a researcher at ICRI and one of the report’s authors.
“Facebook cannot rely on users’ inaction (ie not opting out through a third-party website) to infer consent. As far as non-users are concerned, Facebook really has no legal basis whatsoever to justify its current tracking practices.”
Opt-out mechanism actually enables tracking for the non-tracked
The researchers also analysed the opt-out mechanism used by Facebook and many other internet companies including Google and Microsoft.
Users wanting to opt out of behavioural tracking are directed to sites run by the Digital Advertising Alliance in the US, the Digital Advertising Alliance of Canada in Canada or the European Digital Advertising Alliance in the EU, each of which allows bulk opting-out from 100 companies.
But the researchers discovered that far from opting out of tracking, Facebook places a new cookie on the computers of users who have not been tracked before.
“If people who are not being tracked by Facebook use the ‘opt out’ mechanism proposed for the EU, Facebook places a long-term, uniquely identifying cookie, which can be used to track them for the next two years,” explained Günes Acar from Cosic, who also co-wrote the report. “What’s more, we found that Facebook does not place any long-term identifying cookie on the opt-out sites suggested by Facebook for US and Canadian users.”
The finding was confirmed by Steven Englehardt, a researcher at Princeton University’s department of computer science who was not involved in the report: “I started with a fresh browsing session and received an additional ‘datr’ cookie that appears capable of uniquely identifying users on the UK version of the European opt-out site. This cookie was not present during repeat tests with a fresh session on the US or Canadian version.”
Facebook sets an opt-out cookie on all the opt-out sites, but this cookie cannot be used for tracking individuals since it does not contain a unique identifier. Why Facebook places the “datr” cookie on computers of EU users who opt out is unknown.
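To make the distinction concrete, here is a minimal, hypothetical sketch (Python standard library only; the header strings are invented for illustration) of how one might tell a persistent, potentially identifying cookie apart from a plain session cookie by its expiry attributes:

```python
from http.cookies import SimpleCookie

def classify_cookie(set_cookie_header: str) -> str:
    """Toy classifier: a cookie carrying Max-Age or Expires survives
    browser restarts and can serve as a long-term identifier; a cookie
    without either attribute dies when the browsing session ends."""
    cookie = SimpleCookie()
    cookie.load(set_cookie_header)
    for name, morsel in cookie.items():
        if morsel["max-age"] or morsel["expires"]:
            return f"{name}: long-term (persists across sessions)"
        return f"{name}: session-only (expires when the browser closes)"
    return "no cookie found"

# A header resembling the two-year cookie described in the report
print(classify_cookie("datr=abc123xyz; Max-Age=63072000; Path=/"))
# A plain session cookie with no expiry attributes
print(classify_cookie("sessionid=tmp42; Path=/"))
```

Real opt-out and tracking cookies carry more attributes (Domain, Secure, HttpOnly); the point here is only that expiry metadata is what turns a cookie into a multi-year identifier.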
For users worried about tracking, third-party browser add-ons that block tracking are available, says Acar: “Examples include Privacy Badger, Ghostery and Disconnect. Privacy Badger replaces social plug-ins with privacy preserving counterparts so that users can still use social plug-ins, but not be tracked until they actually click on them.”
“We argue that it is the legal duty of Facebook to design its services and components in a privacy-friendly way,” Van Alsenoy added. “This means designing social plug-ins in such a way that information about individual’s personal browsing activities outside of Facebook are not unnecessarily exposed.”
A Facebook spokesperson said: “This report contains factual inaccuracies. The authors have never contacted us, nor sought to clarify any assumptions upon which their report is based. Neither did they invite our comment on the report before making it public. We have explained in detail the inaccuracies in the earlier draft report (after it was published) directly to the Belgian DPA, who we understand commissioned it, and have offered to meet with them to explain why it is incorrect, but they have declined to meet or engage with us. However, we remain willing to engage with them and hope they will be prepared to update their work in due course.”
“Earlier this year we updated our terms and policies to make them more clear and concise, to reflect new product features and to highlight how we’re expanding people’s control over advertising. We’re confident the updates comply with applicable laws including EU law.”
Van Alsenoy and Acar, authors of the study, told the Guardian: “We welcome comments via the contact email address listed within the report. Several people have already reached out to provide suggestions and ideas, which we really appreciate.”
“To date, we have not been contacted by Facebook directly nor have we received any meeting request. We’re not surprised that Facebook holds a different opinion as to what European data protection laws require. But if Facebook feels today’s releases contain factual errors, we’re happy to receive any specific remarks it would like to make.”
Let’s continue on the Facebook topic from yesterday and hear it this time from software freedom activist and computer programmer Richard Stallman (also known as rms).
Do you need convincing reasons to leave Facebook for good? Look no further than this video clip and Guardian article below.
To be honest, I signed up to Facebook only late last year but used it exclusively to promote this blog. Yet, I’m always having second thoughts…
Leave Facebook if you don’t want to be spied on, warns EU
European Commission admits Safe Harbour framework cannot ensure privacy of EU citizens’ data when sent to the US by American internet firms
Thursday 26 March 2015 19.11 GMT
The European Commission has warned EU citizens that they should close their Facebook accounts if they want to keep information private from US security services, finding that current Safe Harbour legislation does not protect citizens’ data.
The comments were made by EC attorney Bernhard Schima in a case brought by privacy campaigner Maximilian Schrems, looking at whether the data of EU citizens should be considered safe if sent to the US in a post-Snowden revelation landscape.
“You might consider closing your Facebook account, if you have one,” Schima told advocate general Yves Bot in a hearing of the case at the European court of justice in Luxembourg.
When asked directly, the commission could not confirm to the court that the Safe Harbour rules provide adequate protection of EU citizens’ data as it currently stands.
The US no longer qualifies
The case, dubbed “the Facebook data privacy case”, concerns the current Safe Harbour framework, which covers the transmission of EU citizens’ data across the Atlantic to the US. Without the framework, it is against EU law to transmit private data outside of the EU. The case collects complaints lodged against Apple, Facebook, Microsoft, Microsoft-owned Skype and Yahoo.
Schrems maintains that companies operating inside the EU should not be allowed to transfer data to the US under Safe Harbour protections – which state that US data protection rules are adequate if information is passed by companies on a “self-certify” basis – because the US no longer qualifies for such a status.
The case argues that the US government’s Prism data collection programme, revealed by Edward Snowden in the NSA files, which sees EU citizens’ data held by US companies passed on to US intelligence agencies, breaches the EU’s Data Protection Directive “adequacy” standard for privacy protection, meaning that the Safe Harbour framework no longer applies.
Poland and a few other member states as well as advocacy group Digital Rights Ireland joined Schrems in arguing that the Safe Harbour framework cannot ensure the protection of EU citizens’ data and therefore is in violation of the two articles of the Data Protection Directive.
The commission, however, argued that Safe Harbour is necessary both politically and economically and that it is still a work in progress. The EC and the Irish data protection watchdog argue that the EC should be left to reform it with a 13-point plan to ensure the privacy of EU citizens’ data.
“There have been a spate of cases from the ECJ and other courts on data privacy and retention showing the judiciary as being more than willing to be a disrupting influence,” said Paula Barrett, partner and data protection expert at law firm Eversheds. “Bringing down the safe harbour mechanism might seem politically and economically ill-conceived, but as the decision of the ECJ in the so-called ‘right to be forgotten’ case seems to reinforce that isn’t a fetter which the ECJ is restrained by.”
An opinion on the Safe Harbour framework from the ECJ is expected by 24 June.
Facebook declined to comment.
It’s mid-week… thought I should share something light for a change: an alternative comic look into privacy and the government takeover of the internet in our daily lives.
Use only end-to-end encryption programs and apps like SpiderOak, Signal, RedPhone and TextSecure, according to Snowden – see article below.
And never ever anything like Dropbox, Facebook and Google, as he has previously stressed (watch this video clip):
The apps Edward Snowden recommends to protect your privacy online
Mar 05, 2015 9:57 AM ET
Andrea Bellemare, CBC News
There are a host of free, easy-to-use apps and programs that can help protect your privacy online, and if everybody uses them it can provide a sort of “herd immunity”, said Edward Snowden in a live video chat from Russia on Wednesday.
Snowden appeared via teleconference in an event hosted by Ryerson University and Canadian Journalists for Free Expression (CJFE) to launch the CJFE’s online database that compiles all of the publicly released classified documents the former U.S. National Security Agency contractor leaked. In response to a Twitter question, Snowden expanded on what tools he recommends for privacy.
“I hardly touch communications for anything that could be considered sensitive just because it’s extremely risky,” said Snowden.
But Snowden did go on to outline a few free programs that can help protect your privacy.
“You need to ensure your communications are protected in transit,” said Snowden. “It’s these sort of transit interceptions that are the cheapest, that are the easiest, and they scale the best.”
Snowden recommended using programs and apps that provide end-to-end encryption for users, which means the computer on each end of the transaction can access the data, but not any device in between, and the information isn’t stored unencrypted on a third-party server.
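As a rough illustration of that idea – and emphatically not real cryptography – here is a toy one-time-pad sketch in Python: the two endpoints share a random key, so any relay in the middle sees only ciphertext. The message and variable names are invented for the example:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # One-time-pad XOR: the same operation both encrypts and decrypts.
    return bytes(d ^ k for d, k in zip(data, key))

# Sender and receiver share a random key; the relay server never sees it.
message = b"meet at noon"
shared_key = secrets.token_bytes(len(message))

ciphertext = xor_bytes(message, shared_key)   # what actually travels in transit
recovered = xor_bytes(ciphertext, shared_key) # only an endpoint holding the key can do this
assert recovered == message
```

Real end-to-end systems like Signal use authenticated key exchange and modern ciphers rather than a shared pad, but the property is the same: intermediaries carry bytes they cannot read.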
“SpiderOak doesn’t have the encryption key to see what you’ve uploaded,” said Snowden, who recommends using it instead of a file-sharing program like Dropbox. “You don’t have to worry about them selling your information to third parties, you don’t have to worry about them providing that information to governments.”
“For the iPhone, there’s a program called Signal, by Open Whisper Systems, it’s very good,” said Snowden.
He also recommended RedPhone, which allows Android users to make encrypted phone calls, and TextSecure, a private messaging app by Open Whisper Systems.
“I wouldn’t trust your lives with any of these things, they don’t protect you from metadata association but they do strongly protect your content from precisely this type of in-transit interception,” said Snowden.
He emphasized that encryption is for everyone, not just people with extremely sensitive information.
“The more you do this, the more you get your friends, your family, your associates to adopt these free and easy-to-use technologies, the less stigma is associated with people who are using encrypted communications who really need them,” said Snowden. “We’re creating a kind of herd immunity that helps protect everybody, everywhere.”
Above photo credit: http://background-kid.com/blurred-people-background.html
Great, now there’s a new technology to get truly clear pictures out of blurred CCTV images – just when we learned last week that there are gadgets to hide one’s identity from the prying eyes of facial recognition programs like the FBI’s US$1 billion futuristic facial recognition program, the Next Generation Identification (NGI) System.
Fujitsu, the Japanese multinational information technology equipment and services company, recently said it has invented a new, first of its kind image-processing technology that can detect people from low-resolution imagery and track people in security camera footage, even when the images are heavily blurred to protect privacy. See full story below.
Sad to say, this is probably the easiest, most effective and most feasible solution:
Fujitsu tech can track heavily blurred people in security videos
By Tim Hornyak
IDG News Service | March 6, 2015
Fujitsu has developed image-processing technology that can be used to track people in security camera footage, even when the images are heavily blurred to protect their privacy.
Fujitsu Laboratories said its technology is the first of its kind that can detect people from low-resolution imagery in which faces are indistinguishable.
Detecting the movements of people could be useful for retail design, reducing pedestrian congestion in crowded urban areas or improving evacuation routes for emergencies, it said.
Fujitsu used computer-vision algorithms to analyze the imagery and identify the rough shapes, such as heads and torsos, that remain even if the image is heavily pixelated. The system can pick out multiple people in a frame, even if they overlap.
Using multiple camera sources, it can then determine if two given targets are the same person by focusing on the distinctive colors of a person’s clothing.
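The clothing-colour matching Fujitsu describes can be sketched in miniature: build a coarse colour histogram for each detection and compare the histograms. This toy Python example (invented pixel data, standard library only) is a guess at the general technique, not Fujitsu’s actual algorithm:

```python
from collections import Counter
from math import sqrt

def color_histogram(pixels, bins_per_channel=4):
    # Quantize each (r, g, b) pixel into a coarse bin and count occurrences.
    step = 256 // bins_per_channel
    return Counter((r // step, g // step, b // step) for r, g, b in pixels)

def cosine_similarity(h1, h2):
    # Treat the histograms as sparse vectors and compare their directions.
    keys = set(h1) | set(h2)
    dot = sum(h1[k] * h2[k] for k in keys)
    norm = sqrt(sum(v * v for v in h1.values())) * sqrt(sum(v * v for v in h2.values()))
    return dot / norm if norm else 0.0

# Fabricated detections: two views of a person in red clothing, one in blue.
red_cam1 = [(200, 30, 30)] * 90 + [(50, 50, 50)] * 10
red_cam2 = [(210, 40, 35)] * 85 + [(60, 60, 60)] * 15
blue_cam = [(30, 40, 200)] * 90 + [(50, 50, 50)] * 10

same = cosine_similarity(color_histogram(red_cam1), color_histogram(red_cam2))
diff = cosine_similarity(color_histogram(red_cam1), color_histogram(blue_cam))
assert same > diff  # the two red-clothed detections match far better
```

Coarse binning is what makes this robust to blurring: heavy pixelation destroys faces but largely preserves the dominant colours of a head-and-torso silhouette.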
An indoor test of the system was able to track the paths of 80 percent of test subjects, according to the company. Further details of the trial were not immediately available.
“The technology could be used by a business owner when planning the layout of their next restaurant/shop,” a Fujitsu spokesman said via email. “It would also be used by the operators of a large sporting event during times of heavy foot traffic.”
People-tracking know-how has raised privacy concerns in Japan. Last year, the National Institute of Information and Communications Technology (NICT) was forced to delay and scale down a large, long-term face-recognition study it was planning to carry out at Osaka Station, one of the country’s busiest rail hubs.
The Fujitsu research is being presented at a conference of the Information Processing Society of Japan held at Tohoku University in northern Japan. The company hopes to improve the accuracy of the system with the aim of commercializing it in the year ending March 31, 2016.
Fujitsu has also been developing retail-oriented technology such as sensors that follow a person’s gaze as he or she looks over merchandise as well as LED lights that can beam product information for smartphones.
Sending an email message is like sending a postcard. That’s the message Hillary Clinton probably now wishes she had heard earlier.
Andy Yen, a scientist at CERN – the European Organization for Nuclear Research – co-founded ProtonMail, an encrypted email startup based in Geneva, Switzerland. As he explained in this TEDTalk, it is possible to make encryption easy for everyone to use and to keep all email private.
But curiously, it seems so much like PGP.
“Those kinds of restrictive practices I think would ironically hurt the Chinese economy over the long term because I don’t think there is any US or European firm, any international firm, that could credibly get away with that wholesale turning over of data, personal data, over to a government.”
That’s a quote from Obama reported in The Guardian (see article below).
Oh great, so Obama actually understood the consequences of government gaining backdoors into encryption? He should give the same advice to his NSA director Mike Rogers who somehow struggled when asked about the issue recently.
Building backdoors into encryption isn’t only bad for China, Mr President
Wednesday 4 March 2015 16.15 GMT
Want to know why forcing tech companies to build backdoors into encryption is a terrible idea? Look no further than President Obama’s stark criticism of China’s plan to do exactly that on Tuesday. If only he would tell the FBI and NSA the same thing.
In a stunningly short-sighted move, the FBI – and more recently the NSA – have been pushing for a new US law that would force tech companies like Apple and Google to hand over the encryption keys or build backdoors into their products and tools so the government would always have access to our communications. It was only a matter of time before other governments jumped on the bandwagon, and China wasted no time in demanding the same from tech companies a few weeks ago.
As President Obama himself described to Reuters, China has proposed an expansive new “anti-terrorism” bill that “would essentially force all foreign companies, including US companies, to turn over to the Chinese government mechanisms where they can snoop and keep track of all the users of those services.”
Obama continued: “Those kinds of restrictive practices I think would ironically hurt the Chinese economy over the long term because I don’t think there is any US or European firm, any international firm, that could credibly get away with that wholesale turning over of data, personal data, over to a government.”
Bravo! Of course these are the exact arguments for why it would be a disaster for US government to force tech companies to do the same. (Somehow Obama left that part out.)
As Yahoo’s top security executive Alex Stamos told NSA director Mike Rogers in a public confrontation last week, building backdoors into encryption is like “drilling a hole into a windshield.” Even if it’s technically possible to produce the flaw – and we, for some reason, trust the US government never to abuse it – other countries will inevitably demand access for themselves. Companies will no longer be in a position to say no, and even if they did, intelligence services would find the backdoor unilaterally – or just steal the keys outright.
For an example of how this works, look no further than last week’s Snowden revelation that the UK’s intelligence service and the NSA stole the encryption keys for millions of Sim cards used by many of the world’s most popular cell phone providers. It’s happened many times before too. As security expert Bruce Schneier has documented with numerous examples, “Back-door access built for the good guys is routinely used by the bad guys.”
Stamos repeatedly (and commendably) pushed the NSA director for an answer on what happens when China or Russia also demand backdoors from tech companies, but Rogers didn’t have an answer prepared at all. He just kept repeating “I think we can work through this”. As Stamos insinuated, maybe Rogers should ask his own staff why we actually can’t work through this, because virtually every technologist agrees backdoors just cannot be secure in practice.
(If you want to further understand the details behind the encryption vs. backdoor debate and how what the NSA director is asking for is quite literally impossible, read this excellent piece by surveillance expert Julian Sanchez.)
It’s downright bizarre that the US government has been warning of the grave cybersecurity risks the country faces while, at the very same time, arguing that we should pass a law that would weaken cybersecurity and put every single citizen at more risk of having their private information stolen by criminals, foreign governments, and our own.
Forcing backdoors will also be disastrous for the US economy, just as it would be for China’s. US tech companies – which have already suffered billions of dollars of losses overseas because of consumer distrust over their relationships with the NSA – would lose all credibility with users around the world if the FBI and NSA succeed with their plan.
The White House is supposedly coming out with an official policy on encryption sometime this month, according to the New York Times – but the President can save himself a lot of time and just apply his comments about China to the US government. If he knows backdoors in encryption are bad for cybersecurity, privacy, and the economy, why is there even a debate?
Forget Google Glass, there’s something more fun and useful (picture above) but first, consider this picture below.
It may sound like the Hollywood movie The Matrix but let’s face it: everyone will sooner or later have their photo captured in public spaces.
Consider, for example, the FBI’s US$1 billion futuristic facial recognition program – the Next Generation Identification (NGI) System – which is already up and running with the aim of capturing photographs of every American and everyone on US soil.
The pictures above are an example of what the US government has collected on one individual – she filed a Freedom of Information Act request to see what had been collected, and the Department of Homeland Security subsequently released the data gathered under the Global Entry Program.
But apart from immigration checkpoints, and potentially other files from other government departments (local and global), we are also subjected to the millions of CCTV cameras in public areas and the facial recognition programs scanning through the captured images (and also those on the internet and social networks).
So it’s good to know there may be a potential solution – though it’s still early days and it may not apply to cameras at immigration checkpoints.
The (computer) antivirus software company AVG is working on a “privacy glasses” project. These glasses (above) are designed to obfuscate your identity and prevent any facial recognition software from figuring out who you are, either by matching you with the pictures in their database or creating a new file of you for future use.
Find out more from this article below.
NSA Director Admiral Michael Rogers said at a cybersecurity conference in Washington DC on Monday that the government needs to develop a “framework” so that the NSA and law enforcement agencies can read encrypted data when they need to. He was immediately challenged by top security experts from the tech industry, most notably Yahoo’s chief information security officer Alex Stamos (see transcript).
This is probably the most telling moment of how US President Barack Obama is still on the wrong frequency on cyber matters…
Obama blamed the “impact on their [the tech companies] bottom lines” for the mistrust between the government and Silicon Valley in the aftermath of the Snowden revelations. These were his words, straight from the POTUS’s mouth rather than read from a script, in an exclusive interview with Re/code’s Kara Swisher (see video below) following the well-publicized cybersecurity summit at Stanford University last Friday, where he signed an executive order to encourage the private sector to share cybersecurity threat information with other companies and the US government.
Contrast that with the high-profile speech by Apple CEO Tim Cook (see video below), who warned about “life and death” and “dire consequences” in sacrificing the right to privacy as technology companies had a duty to protect their customers.
His speech was delivered before Obama’s address to the summit – which the White House organized to foster better cooperation and the sharing of private information with Silicon Valley, but which is best remembered for the absence of leaders from tech giants like Google, Yahoo and Facebook, who snubbed Obama amid growing tensions between Silicon Valley and his administration. These are the heavyweights whom Obama counted as “my friends” in the Re/code interview (watch his expression closely at the 39th second of the clip above).
This is really nothing new but I’m posting it because similar “news” resurfaced again the past week.
If you’ve already bought one, the easy solution is to cover the webcam with duct tape unless you need to use it.
This is bad news with far-reaching global implications – and it affects not only those based in China.
News has surfaced over the weekend that some foreign-based virtual private network (VPN) vendors found their services in China had been disrupted following a government crackdown – which the authorities labeled as an “upgrade” of its Internet censorship – to block the use of VPNs as a way to escape the so-called Great Firewall.
Many China-based internet users use VPNs to access external news sources but this is also bad news for companies and government offices based in China as well as anyone visiting the Chinese mainland – as many businessmen and executives use VPNs, as part of their company (and security) practice, on their business trips. Many foreigners and businesses residing in China also use VPNs for their day-to-day communications.
The VPNs provide an encrypted pipe between a computer or smartphone and an overseas server, such that any communications are channeled through the designated pipe, which effectively shields internet traffic from government filters that set criteria on what sites can be accessed.
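A toy sketch can show why such a pipe defeats a keyword-based filter: the filter can only match bytes it can read, and a tunnelled payload looks like random noise until the exit server reverses the transform. The filter rule and XOR “cipher” below are simplifications invented for illustration (a real VPN would use AES or similar):

```python
import secrets

# Hypothetical filter rule: block any packet mentioning this domain.
BLOCKED_KEYWORDS = (b"news-site.example",)

def firewall_allows(packet: bytes) -> bool:
    # A content filter can only act on bytes it can actually read.
    return not any(keyword in packet for keyword in BLOCKED_KEYWORDS)

def tunnel_transform(payload: bytes, key: bytes) -> bytes:
    # Toy repeating-key XOR standing in for the VPN's real cipher;
    # applying it twice with the same key restores the original bytes.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(payload))

request = b"GET http://news-site.example/ HTTP/1.1"
key = secrets.token_bytes(32)

assert not firewall_allows(request)        # plain request: the filter sees the domain
tunneled = tunnel_transform(request, key)  # what actually crosses the firewall
assert firewall_allows(tunneled)           # ciphertext matches no filter rule
assert tunnel_transform(tunneled, key) == request  # the exit server recovers it
```

This also explains the crackdown's counter-move: if the filter cannot read the payload, the censor must instead detect and block the tunnel itself, which is exactly what disrupting VPN protocols amounts to.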
Find out more about this news below. And as China moves fast beyond the “factories of the world” tag to become a global economic powerhouse and an important trading partner to many developed and developing countries, this is one development to keep a close watch on.