STE WILLIAMS

Monday review – the hot 27 stories of the week

In case you missed any stories over the last seven days, here’s our weekly catch up.

Watch the top news in 60 seconds, and then check out the individual links to read in more detail.

Monday 14 October 2013

Tuesday 15 October 2013

Wednesday 16 October 2013

Thursday 17 October 2013

Friday 18 October 2013

Saturday 19 October 2013

Would you like to keep up with all the stories we write? Why not sign up for our daily newsletter to make sure you don’t miss anything? You can easily unsubscribe if you decide you no longer want it.

Article source: http://feedproxy.google.com/~r/nakedsecurity/~3/1aV8RWiAuWM/

Chrome support for XP to continue after Microsoft ditches it – helpful, or dangerous?

Chrome and Windows XP logos Google has pledged to continue supporting its Chrome browser on Windows XP until at least April 2015, a full year after Microsoft officially ends support for the legacy platform in April 2014.

Google’s rationale behind the decision is that some people will find the transition away from XP a difficult process, and that allowing them to ensure their browsers are kept free of vulnerabilities will ease that transition.

But could its decision end up dissuading people from moving away from XP in a timely manner?

Windows XP has now been superseded by three separate, fully-fledged Windows versions (if, that is, one counts the widely despised Vista). Its mainstream support phase ended way back in 2009, and the current extended, patch-only support period is rapidly drawing to a close.

This end of life has been described as a “perpetual zero-day”, leaving lingering XP users exposed to all manner of dangers, many of which will likely be easy to reverse-engineer from bugs publicised by Microsoft itself after they are spotted and fixed on later Windows versions.

So the best advice is for anyone still clinging to XP to bite the bullet and move on to something else if at all possible. The deadline for the end-of-life has been well-known for a long time, so there’s no excuse to be taken by surprise.

There are plenty of options available – people not keen to pay for newer and safer Windows versions can take their pick from all manner of well-built, well-supported and user-friendly Linux distros these days, and some have even suggested that Google’s decision to extend Chrome support may be a sneaky tactic to persuade people to move to its Chrome OS.

But the main message a lot of people are going to take from Google’s announcement is: don’t worry, there’s no big rush, you’ve now got an extra year to think about your options and finally get moving.

Don’t fall for this. OK, so during that extra year there will be at least one browser being maintained and patched, but the rest of the OS, and likely most of the other software you’re running on it, will be falling ever deeper into obsolescence and vulnerability.

The availability of a fully-patched Chrome could be more of a danger than a help – it could be giving a false sense of security and further delaying the switch to more modern platforms.

Patching shouldn’t be a partial process – you should be keeping everything running on your system fully up to date. That means anti-malware products, browsers, office suites and PDF readers, and anything else you use, but most of all the core operating system itself.

Don’t be lulled into thinking a well-patched browser is all you need to keep you safe. I know many people out there have developed a trust and fondness for XP that’s going to be hard to break, but break it you must.

If you’re still putting off upgrading from XP for no other reason than that you’ve got it, you’re used to it and you like it, don’t be tempted to keep delaying: just hurry up and move on.

It may be, of course, that you really have no choice. You may have XP embedded in some vital system which continues to run fine and isn’t due to be replaced for many years to come, or you may have some legacy apps which will only run on XP.

In these edge cases there’s not a whole lot you can do. But really, if such systems do need to stay in operation, you don’t really need to use them to check your Gmail, keep up with your friends’ holiday snaps on Facebook, or watch amusing cat videos.

Keep them running if you really must, but minimise their interaction with the web and keep them as secure as possible.


Article source: http://feedproxy.google.com/~r/nakedsecurity/~3/vTT69OCy4OA/

If there’s somethin’ strange in your network ‘hood. Who y’gonna call? Google’s DDoS-busters


Google will shelter charities and activists from distributed denial-of-service attacks by wrapping their websites in its protection technologies.

The advertising giant announced its Project Shield initiative on Monday, and said it wanted to “protect free expression online.”


“Project Shield is a service that currently combines Google’s DDoS [distributed denial-of-service] mitigation technologies and Page Speed Service (PSS) to allow individuals and organizations to better protect their websites by serving their content through Google’s own infrastructure, without having to move their hosting locations,” the company wrote in an FAQ document outlining the tech.

As is typical with modern Google product launches, the internet goliath neglected to give precise technical details on how Project Shield’s web-flood and packet-attack mitigation magic actually works.

But we can surmise that the company is probably using thousands of edge servers around the world to act as a scalable, elastic front-end for incoming traffic deluges, as this is the same approach taken by DDoS-mitigation experts CloudFlare.
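The principle can be illustrated with a toy per-client token bucket, the kind of mechanism edge servers use to absorb a burst and shed a sustained flood. The rates and limits below are invented for the sketch, not Google’s or CloudFlare’s actual configuration.

```python
import time

class TokenBucket:
    """Toy per-client limiter: requests spend tokens, which refill over time."""

    def __init__(self, rate: float, burst: int):
        self.rate = rate               # tokens refilled per second
        self.capacity = burst          # maximum burst size
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False                   # request dropped at the edge

bucket = TokenBucket(rate=10, burst=5)
results = [bucket.allow() for _ in range(20)]   # an instantaneous 20-request burst
assert results[:5] == [True] * 5                # the burst allowance passes
assert sum(results) < 20                        # the sustained flood does not
```

Run across thousands of servers, with state per client address, this is roughly how a front-end can let legitimate visitors through while soaking up a deluge.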

Due to the distributed nature of Google’s underlying infrastructure management, provisioning, and monitoring systems, it’s likely the company has fairly effective early-warning systems in place to watch for faults as they develop.

By plugging activist and charity sites into the company’s cloud infrastructure, Google is able to subsume targets beneath a protective cloud of servers that can absorb and route large amounts of malicious traffic.

For now, Project Shield is free, and is accepting applications from websites serving news, human rights, and election-related information.

The ‘Page Speed’ component of the service is available for free now, but may cost money to use in the future, the company said. Page Speed slurps content from a company’s servers, applies “web performance best practices” to it, then serves it to end users via Google’s global server network, according to an FAQ on the tech.

CloudFlare chief Matthew Prince reckons the service is a sign of forthcoming consolidation among edge network service providers.

“The challenges that websites face in both performance and security are substantial so it’s inevitable there will be a consolidation of the edge of the network,” CloudFlare chief Matthew Prince told El Reg via email. “In the future, there will likely be two to six companies that run the edge of the web. We’ve been predicting for some time those companies will be Akamai, Amazon, CloudFlare, and Google.”

CloudFlare offers a free DDoS mitigation service that, Prince says, “provides at least equivalent DDoS protection to what Google is offering.”

Google is not giving any guarantees to sites using Project Shield, but did say in its FAQ document: “Google has designed its infrastructure to defend itself from quite large attacks and this initiative is aimed at providing a similar level of protection to third-party websites.” ®


Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/10/21/google_project_shield_ddos/

CryptoSeal shutters consumer VPN service


VPN service CryptoSeal has followed Lavabit’s example and shuttered its consumer service, saying its CryptoSeal Privacy service architecture would make it impossible to comply with a government order without handing over the crypto keys to its entire system.

The company, which will continue offering business services, made the announcement via a notice to users trying to log into the service, which has been posted to ycombinator here.


“With immediate effect as of this notice, CryptoSeal Privacy, our consumer VPN service, is terminated. All cryptographic keys used in the operation of the service have been zerofilled, and while no logs were produced (by design) during operation of the service, all records created incidental to the operation of the service have been deleted to the best of our ability,” the notice states.
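“Zerofilled” is worth unpacking: the idea is to overwrite key material in place rather than merely delete or free it. A minimal sketch, assuming the key is held in a mutable buffer; a garbage-collected runtime may still retain stray copies elsewhere, which is presumably why the notice hedges with “to the best of our ability”.

```python
def zerofill(buf: bytearray) -> None:
    """Overwrite every byte of the buffer in place with zeros."""
    for i in range(len(buf)):
        buf[i] = 0

key = bytearray(b"\x13\x37" * 16)   # stand-in for a 32-byte session key
zerofill(key)
assert all(b == 0 for b in key)     # nothing recoverable from this buffer
```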

Referring to the pen register issues that drove Lavabit’s decision to close, the post continues: “Our system does not support recording any of the information commonly requested in a pen register order, and it would be technically infeasible for us to add this in a prompt manner. The consequence, being forced to turn over cryptographic keys to our entire system on the strength of a pen register order, is unreasonable in our opinion, and likely unconstitutional, but until this matter is settled, we are unable to proceed with our service,” it continues.

Founder Ryan Lackey confirmed the shutdown on Twitter.

Paid subscribers are offered a one-year subscription to a non-US VPN service and a refund of their balance. CryptoSeal says it’s looking at legal ways to relaunch a consumer service, and subscribers will also be offered a year of free service should it bring a new service to market. ®


Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/10/22/cryptoseal_shutters_consumer_vpn_service/

New leak claim: NSA saw hole in Mexican prez’s email box


America’s relationship with its nearest southerly neighbor is frostier than before after it was claimed that in May 2010 the NSA conducted an operation dubbed Flatliquid that hacked the contents of the then-Mexican president’s inbox.

According to documents leaked to Der Spiegel, a division of the NSA dubbed Tailored Access Operations (TAO) reported successfully penetrating the public email systems of President Felipe Calderón, who stepped down from office in December 2012. The account was used for communication with other staff and was described in the once top-secret report as “a lucrative source.”


NSA report on hacking Mexican president's email

‘South of the border, down Mexico way’

“TAO successfully exploited a key mail server in the Mexican Presidencia domain within the Mexican Presidential network to gain first-ever access to President Felipe Calderon’s public email account,” the partially redacted document states.

The alleged backdoor yielded “diplomatic, economic and leadership communications which continue to provide insight into Mexico’s political system and internal stability,” states the report, leaked by NSA whistleblower Edward Snowden.

Calderón and the Mexican government have reacted angrily to the news. On his personal Twitter account, the former president said that the NSA’s actions were “more than personal, are an affront to the nation’s institutions, since they were carried out during my tenure as president of the republic,” and called on the Mexican authorities to investigate.

The Mexican foreign ministry has already announced an investigation by the Attorney General and called the actions “unacceptable, illegal.” In a statement to the BBC it said “in a relationship between neighbours and partners, there is no place for the alleged practices.”

The leak is an embarrassing one for the US, which is already under fire over claims that the NSA hacked the email of current Mexican president Enrique Peña Nieto last summer during his election campaign. Documents slipped to Brazilian news television programme Fantastico in September showed the NSA had copies of Peña Nieto’s email conversations, including discussions about likely ministerial appointments.

The same reports suggested that the NSA had also comprehensively pwned the email accounts of the Brazilian president Dilma Rousseff and some of her aides, as well as those of the state oil company Petrobras.

President Rousseff cancelled a visit to the US in protest, and at the recent G20 meetings in Russia President Obama promised that both hacking reports would be investigated.

“What I got from President Obama was a commitment to a full investigation… and if they turn out to be true to impose corresponding sanctions,” said President Peña Nieto at the time. El Reg wonders how this latest report will factor into negotiations. ®


Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/10/22/nsa_tailored_ops_squad_hacked_mexican_presidents_inbox_report/

Fraudster bought names and addresses from Experian, says Krebs


Brian Krebs alleges that a subsidiary of data aggregator Experian was duped by a scammer into selling personal information about millions of Americans.

Detailing his investigations here, Krebs accuses a Vietnamese national indicted in New Hampshire, Hieu Minh Ngo, of using the handle “hieupc” to operate Superget.info, which marketed itself as allowing lookups of individuals’ social security numbers, drivers’ license records, and financial information.


The link to Experian, Krebs reports, came via its acquisition of a company called Court Ventures. He writes that Ngo gained access to Court Ventures’ databases by posing as a private investigator, and that according to Marc Martin, CEO of US Info Search (which had a data sharing arrangement with Court Ventures), payments for access to the datasets came as transfers from Singapore.

Experian told Krebs it has worked with authorities on the arrest of Ngo, via this statement: “Experian acquired Court Ventures in March, 2012 because of its national public records database. After the acquisition, the US Secret Service notified Experian that Court Ventures had been and was continuing to resell data from US Info Search to a third party possibly engaged in illegal activity. Following notice by the US Secret Service, Experian discontinued reselling US Info Search data and worked closely and in full cooperation with law enforcement to bring Vietnamese national Hieu Minh Ngo, the alleged perpetrator, to justice. Experian’s credit files were not accessed. Because of the ongoing federal investigation, we are not free to say anything further at this time.”

Krebs says Ngo operated a second, similar site called findget.me, and that through it and Superget.info he held and offered access to data on “more than half a million people”.

Meanwhile, data aggregators – in particular, their control over whom they sell data to – are bound to come under the spotlight even more than they already have, with hints that the Federal Trade Commission is becoming more active in the field. ®


Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/10/22/fraudster_bought_names_and_address_from_experian_says_krebs/

How To Avoid Breaches Where You Least Expect Them

In the real world of constrained budgets and limited personnel, prioritization of security resources is a must. Many departments prioritize practices based on the severity of vulnerabilities, the value of a target, and the likelihood of a threat hitting said target. However, the flip side of that is to remember the real world is also a connected one. And as many security experts can attest, enterprises often forget to account for how attacks against the vulnerabilities in less critical systems can jeopardize the crown jewels.

“Most companies focus their efforts on locking down vital assets, such as the infrastructure, servers, mission-critical applications, and work machines, and when assessing risk put too much emphasis on these as opposed to other systems deemed not as vital,” says Vann Abernethy, senior product manager for NSFOCUS. “But we have seen attacks against these soft targets that either led to serious damage or were used as a way into the systems that were thought to be better protected.”

A great example of what it looks like when an organization chooses not to secure these incidental soft systems happened back in 2011 at the Hong Kong Stock Exchange (HKEX), Abernethy explains. HKEX ran a simple informational news site that wasn’t prioritized for protection because it was a low-risk system with no connection to trading platforms and seemingly no connection to the organization’s core trading functions. Nevertheless, a DDoS attack against this site actually kept a number of prominent companies from trading while that site was down.

[Your organization has been breached. Now what? See Establishing The New Normal After A Breach.]

“The news site is where companies posted announcements to comply with disclosure regulations, and when those statements could not be posted, trading was halted,” Abernethy says. “So a site with minimal protection and a lower perceived risk value can cause several major stocks to go untraded when taken out — and result in a huge loss in revenue.”

It is a good lesson in how organizations have to exercise a higher level of thinking about potential threats to seemingly low-priority systems. In that case, the system in question was not necessarily connected to more sensitive systems of data. But often deprioritized soft targets are ideal for attackers because these systems have back-end connections to other systems that IT staff may not be aware of or have forgotten about. Similarly, some soft targets may not necessarily be connected to sensitive systems but could still hold sensitive data due to lack of policies or lack of enforcement of existing policies. Take, for instance, test databases for development work — in many organizations, these databases will contain real production data. But they’re not considered high-priority systems and don’t have anywhere near the level of controls on them that production databases do.

So how does IT find those systems that could prove to be soft targets for attackers? It starts with becoming more comprehensive in asset discovery and tracking — a task that’s helpful not just for vulnerability management, but for many other security investments that need to be made, says John Walton, principal security manager at Microsoft, in charge of the Office 365 security engineering team. Walton recommends using as many different sources of data as possible to put together an asset list, starting first with subnet-based scanning and moving outward from there.

“So think about things like your log data, maybe netflow data or network routing information, your asset data in Active Directory, and any other number of sources you may have available or could start collecting from,” he says. “Then really try to combine those different sources because the more you can identify, the closer you can get to having a complete asset list.”
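That merging step is essentially a set union over whatever inventories you can gather. A minimal sketch, with invented source names and addresses; the point is that the union exposes hosts any single source misses, and the gap against the formally managed set is where forgotten soft targets tend to live.

```python
# Invented inventory fragments from four data sources; a real deployment
# would parse these out of scan results, logs and directory exports.
subnet_scan   = {"10.0.0.4", "10.0.0.9", "10.0.0.12"}
dns_logs      = {"10.0.0.9", "10.0.0.33"}                 # hosts seen resolving names
netflow_peers = {"10.0.0.12", "10.0.0.33", "10.0.0.41"}   # hosts seen on the wire
active_dir    = {"10.0.0.4", "10.0.0.9"}                  # formally managed hosts

all_assets = subnet_scan | dns_logs | netflow_peers | active_dir
unmanaged = all_assets - active_dir   # candidates for "forgotten" systems
print(sorted(unmanaged))   # → ['10.0.0.12', '10.0.0.33', '10.0.0.41']
```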

Even before developing that list, though, netflow data can also be particularly helpful for identifying existing compromises of seemingly low-risk systems connected to and endangering more critical systems.

“If you are seeing large and unexpected flows of data from an internal origination point to other computers on the network or to external addresses, this can indicate an attempt to exfiltrate data from your company,” says A. N. Ananth, CEO of EventTracker. “Netflow data is a useful way to spot these unexpected information flows.”
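A toy version of that check over summarised flow records makes the idea concrete; the records and threshold below are invented, and a real tool would also baseline each host’s normal volume.

```python
from collections import defaultdict

# Invented flow summaries: (source, destination, bytes transferred)
flows = [
    ("10.0.0.7",  "203.0.113.5",  120_000),
    ("10.0.0.7",  "203.0.113.5",  95_000_000),   # one unusually large transfer
    ("10.0.0.21", "198.51.100.2", 40_000),
]

totals = defaultdict(int)
for src, dst, nbytes in flows:
    totals[(src, dst)] += nbytes

THRESHOLD = 50_000_000   # bytes per reporting window; tune per environment
suspects = [pair for pair, total in totals.items() if total > THRESHOLD]
print(suspects)   # → [('10.0.0.7', '203.0.113.5')]
```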

However, keeping tabs on netflow data may be only addressing symptoms of a deeper problem. Part of the issue at hand is that organizations are assessing risks to their assets in a bubble, says John Pescatore, director of emerging trends for SANS Institute.

“There is generally no real connection to real-world threats on how best to protect the business or the customer’s information,” he says.

He says that all too often organizations use a small imaginary number to estimate the probability of a security incident, a large imaginary number to estimate the cost of one, and then multiply those two numbers together to get a medium-size imaginary number. The exercise, Pescatore adds, is done purely to tell auditors that an assessment was performed.

Instead, he says, it is important to home in on a controls-based priority list. This can be done by relying on a community of experts who can look at real-world threats and prioritize which security controls are most valuable in deterring those threats. Then they can prioritize solutions that implement those controls with as much automation as possible to improve efficiency and effectiveness.

“Work your way down the priority list until you run out of budget,” Pescatore says.
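That approach can be sketched as a greedy walk down a value-ordered control list; the controls, scores and costs below are invented for illustration, not SANS’s actual rankings.

```python
# Invented controls: (name, deterrence value per the expert community, cost)
controls = [
    ("patch management",         9, 40),
    ("application whitelisting", 8, 30),
    ("admin-privilege control",  7, 20),
    ("full-disk encryption",     4, 25),
]

budget = 90
chosen = []
for name, value, cost in sorted(controls, key=lambda c: -c[1]):
    if cost <= budget:          # fund it if anything is left in the pot
        chosen.append(name)
        budget -= cost

print(chosen)   # the three highest-value controls fit; encryption must wait
```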

Most importantly, though, organizations need to be comprehensive when seeking IT assets eligible for these controls. While mission-critical systems certainly deserve the most attention to details, security professionals must also keep an eye out for the fringes of IT infrastructure. It is there — in the places where high-priority and low-priority systems may be interconnected — where business processes create a tenuous connection between unrelated systems, and where data lurks in unexpected places. It is that gray area where the biggest propensity for compromise awaits.

“Companies should take a very serious look at all assets and be very comprehensive in looking at the consequences of an attack,” Abernethy says. “Don’t overlook the mundane because, as the HKEX found out, it may very well be a critical risk area.”


Article source: http://www.darkreading.com/vulnerability/how-to-avoid-breaches-where-you-least-ex/240162928

ObserveIT Monitors Third Parties And Privileged Users

NEW YORK–(BUSINESS WIRE)–October 21, 2013–ObserveIT, an industry-leading provider of user activity recording and auditing technology, today introduced ObserveIT 5.6.8. The behavior monitoring software is already in use in more than 800 corporations in 70+ countries, where it is used for auditing, tracking and recording 3rd-party and internal privileged user behavior with real-time, context-aware video monitoring within any server environment. ObserveIT augments traditional access management and logging systems by adding video playback of user activities, helping IT management clearly and easily understand who did what and when.

ObserveIT version 5.6.8 includes new features that enable deeper insight into user activity and help flag interactions with sensitive data. New tracking parameters, detailed graphical charts and database administrator (DBA) auditing all add granularity and valuable detail to the system’s reporting capabilities. Seeing user activities that fall outside normal behavior is as simple as pushing play to view the video.

Monitoring users’ activities within corporate computer systems is increasing in importance, in part because of ongoing insider threats and the growing reliance on outside consultants. A Verizon survey published in May 2013 showed that misuse by accredited, privileged users accounted for 13% of breaches in the preceding 12 months. Compliance requirements are also dramatically increasing – with monitoring, auditing and tracking data all necessary for compliance with regulations such as PCI, HIPAA, ISO 27001 and Sarbanes-Oxley.

“A single video is worth a thousand logs,” said Gaby Friedlander, CTO and co-founder of ObserveIT. “For most companies, once the ‘privileged’ user gains access to sensitive data systems, there’s been no easy way to track their behavior. With ObserveIT, we’ve been able to offer corporations a window into user behavior that was previously obscured.”

ObserveIT captures a detailed textual log and video recording of every action in areas where a company feels it useful to track user activities. Reporting is generated in natural language, not buried in a list of security protocols and logging software. Additionally, with ObserveIT video sessions, irregular actions are flagged and reported to administrators, eliminating the need to sift through hours of video.

“We need to be able to see exactly what internal and 3rd-party users were doing on our systems,” said Paul Leger, CISA/Manager of Information Security for Atlantic Lottery Corporation. “ObserveIT’s solution can tell us this in fine-grained detail that you can’t get anywhere else.”

Other new features of ObserveIT 5.6.8 include:

Threat Detection Console – A new dashboard that highlights non-compliant activity of users, such as unusual amounts of night/weekend accessing of information, usage from infrequently used applications or computers, or remote access sessions of unauthorized users.

DBA Activity Auditing – Monitoring SQL queries executed by DBAs against production databases. This highlights the date, results, users, server, login ID or any text contained within the queries.

Graphical User Server Statistics – Charting of individual user or server activity in a pie chart and bar graph of number or recorded frames, by day, during a selected time period.

Advanced Keylogger for Unix/Linux – While ObserveIT has always captured every user-executed command, the new version captures all output to the terminal screen.

URL Exclude for Web Browser Capture – Administrators can now specify website URLs which will not be recorded.
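The night/weekend flagging idea behind a console like the one above is easy to sketch. The events and the business-hours window below are invented, not ObserveIT’s actual rules.

```python
from datetime import datetime

# Invented activity log: (user, timestamp of access)
events = [
    ("alice", datetime(2013, 10, 19, 14, 30)),   # Saturday afternoon
    ("bob",   datetime(2013, 10, 16, 10, 5)),    # Wednesday, office hours
    ("carol", datetime(2013, 10, 17, 2, 45)),    # Thursday, middle of the night
]

def off_hours(ts: datetime) -> bool:
    """Flag weekends and anything outside an assumed 08:00-18:00 window."""
    return ts.weekday() >= 5 or not (8 <= ts.hour < 18)

flagged = [user for user, ts in events if off_hours(ts)]
print(flagged)   # → ['alice', 'carol']
```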

About ObserveIT

ObserveIT, a leading provider of server monitoring and auditing technology, makes life easier for IT managers, security and compliance officers. Founded in 2006, ObserveIT’s fifth-generation system offers IT executives a method to clearly and easily understand who did what, when within the corporate IT environment. ObserveIT enhances the ability to audit, monitor, track and record user behavior with real-time, context-aware video monitoring within the server. ObserveIT is currently in use in more than 800 corporations in 70+ countries, and has become a standard for many network environments with market-leading and enterprise-grade functionality. For more information on the company, visitwww.observeIT.com or on Twitter @ObserveIT.

Article source: http://www.darkreading.com/privacy/observeit-monitors-third-parties-and-pri/240162930

Catching Malware With DNS As A Service

Three years ago, Jim Clark, chief technology officer of West Liberty University, had a number of IT headaches.

A botnet had resisted efforts to eradicate it from the West Virginian university’s network, and students complained that Internet requests were often slow. Clark considered doubling the campus’ 100-Mbps connection, but first decided to try out the free domain-name system (DNS) service offered by OpenDNS. The move to the globally available DNS service helped bring latencies on Internet requests down, and OpenDNS’s increasing focus on helping customers with security helped delay the spread of the botnet until Clark’s two-person information technology team could clean it out.

“In the age of bring-your-own-device … you need to know what your network is doing, how it’s breathing, and how it’s performing,” Clark says. “We really haven’t had a botnet or malware problem since we turned it on.”

While companies look to antivirus deployments and security-intelligence systems to help lock down their computers and networks, many of the features of today’s DNS services can help secure the corporate network. In addition to helping speed DNS resolution, cloud-based DNS services are gaining many adherents because they give organizations better visibility into the traffic leaving their networks and the ability to filter out traffic to certain types of sites or block the command-and-control communications to known malicious sites.

[No evidence thus far to confirm that the Syrian Electronic Army embedded malware on redirected Web pages, but investigation continues. See No Proof Of Malware In New York Times DNS Hijacking Attack.]

While larger companies have the ability to deploy DNS servers in their internal networks, cloud services have quickly begun offering much of the flexibility of internal configurations while delivering on a passel of security features as well, says Patrick Foxhoven, chief technology officer for cloud security firm Zscaler.

“We see the days of deploying another security appliance on-premise are numbered,” he says. “Everything that you could do on the inside, you can do as a service, especially with DNS.”

Zscaler places its service as a proxy between workers’ devices and the Internet, so that all traffic, including DNS requests, is first sanitized through its service.

Many variants of malware, created specifically to evade detection by popular antivirus programs, still communicate with collections of domains in a way that can be detected. In September, Zscaler published an analysis of a banking Trojan known as Caphaw that used domain-generation algorithms to create pseudo-random domain names and communicate with its command-and-control servers. Other malware uses fast-flux domain resolution, which quickly changes the IP addresses assigned to a domain to hinder takedown efforts. Still other malware just communicates with known-bad or relatively unknown domains. Each strategy can be detected through a cloud DNS service.
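A domain-generation algorithm can be sketched in a few lines. This one is purely illustrative, not any real malware family’s algorithm, but it shows why the resulting lookups stand out at the DNS layer.

```python
import hashlib

def dga(seed: str, day: str, count: int = 3) -> list:
    """Derive a deterministic list of rendezvous domains from a shared seed."""
    domains = []
    state = f"{seed}:{day}".encode()
    for _ in range(count):
        state = hashlib.sha256(state).digest()
        name = "".join(chr(ord("a") + b % 26) for b in state[:12])
        domains.append(name + ".net")
    return domains

# Both the malware and its operator can compute today's list independently;
# defenders at the DNS layer see bursts of lookups for exactly such
# high-entropy, mostly unregistered names.
print(dga("botnet-seed", "2013-10-21"))
```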

Cloud security services based on DNS also have significant benefits to companies dealing with the bring-your-own-device trend. Mobile users can still be protected, even once they have left the corporate network, says David Ulevitch, founder and CEO of OpenDNS.

“Enterprise security is predicated on having either endpoint software or network visibility,” he says. “And when you have employees who have their own devices, you have no endpoint software, and when they are using it outside your network, you have no visibility.”

By routing DNS traffic through a service provider’s servers, mobile users retain the security benefits of the DNS proxy service without requiring that corporate IT manage the devices.
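The filtering step itself reduces to a blocklist check before resolution. The domains, sinkhole address and stand-in upstream resolver below are invented; a real service maintains its blocklist from threat-intelligence feeds.

```python
BLOCKLIST = {"evil-c2.example", "phish.example"}
SINKHOLE = "0.0.0.0"

def filtered_resolve(name: str, upstream) -> str:
    key = name.rstrip(".").lower()    # DNS names are case-insensitive
    if key in BLOCKLIST:
        return SINKHOLE               # swallow command-and-control callbacks
    return upstream(key)              # otherwise resolve normally

fake_upstream = {"example.org": "93.184.216.34"}.get   # stand-in resolver
print(filtered_resolve("EVIL-C2.example.", fake_upstream))   # → 0.0.0.0
print(filtered_resolve("example.org", fake_upstream))        # → 93.184.216.34
```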

DNS services that run on internal servers cannot serve mobile users and require expensive maintenance to keep patched. While the initial cost of deploying a DNS server can be next to nothing, most companies want the reliability and extra features, says Rodney Joffe, senior vice president and fellow at domain registry and services provider Neustar. Implementing such a service inside the network means that IT is responsible for load balancing, patching, implementing DNSSEC, and fending off denial-of-service attacks.

“If you are running a business, and reliability and security are important to you, then you absolutely need to use a service,” he says.


Article source: http://www.darkreading.com/services/catching-malware-with-dns-as-a-service/240162945

Furious French choke on chardonnay over NSA’s phone spying in France


The latest documents leaked by NSA whistleblower Edward Snowden have the French government up in arms: the spying agency collected data on seven million calls and texts a day in the land of fine wine and cheese.

That’s according to a dossier published by Le Monde on Monday.


The files reveal that the NSA had two spying operations running to capture French phone calls under the code names “DRTBOX” and “WHITEBOX”. Between December 10, 2012 and January 8, 2013, French citizens’ “telephone data” was logged in 70.3 million records by the agency for analysis: making a call in France is enough to trigger a recording of the conversation, we’re told. Uncle Sam’s spooks also intercepted text messages and kept logs of who was contacting whom.

The documents show the NSA, at its peak on Christmas Eve 2012, intercepted seven million French calls and texts a day – perhaps checking who had been naughty or nice – but that surveillance dropped to zero between December 28 and 31. That’s possibly because the NSA was at the time waiting for Congress to approve its latest spying operations under Section 702 of the Foreign Intelligence Surveillance Act. The average interception rate was three million records a day.

France was the third most highly spied-on European state, according to the dossier, after the UK and Germany. The leak states that between February 8 and March 8, 2013, the NSA collected 124.8 billion telephone data items and 97.1 billion computer data items from foreign countries across the world.

The news of the leaks was timed for maximum embarrassment, as US Secretary of State John Kerry has just arrived in France and may have been hoping for an easy ride given the extent to which France and the US are willing to collaborate on Syria.

Instead of a welcome with open arms and cheek kissing, Kerry will be facing tough questions and the US ambassador has been summoned for an official complaint from the French government; its foreign minister Laurent Fabius described the spying as “unacceptable.”

“We have extremely useful cooperation with the United States in the struggle against terrorism, but this cooperation does not justify everything,” he told reporters, Reuters reports. “So we’ve asked the United States to provide clarifications, explanations and justifications extremely quickly.”

US ambassador Charles Rivkin declined to comment on the content of the meeting, but said the French government’s concerns would be relayed to Washington and commented that US-French relations were the best they have been in a generation. ®


Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/10/21/france_snowden_nsa_call_eavesdropping/