STE WILLIAMS

Fury as OS X Mavericks users FORCED to sync contact books with iCloud

Free Regcast : Managing Multi-Vendor Devices with System Centre 2012

Apple has removed from Mac OS X Mavericks the ability to directly sync an iPhone’s contact list with its owner’s computer – forcing the user to instead upload their address book to Cupertino’s cloud and download it to the local computer.

Anyone who updates to the latest release of Apple’s desktop operating system, version 10.9, can only synchronize personal contacts and calendar information via Apple’s off-site iCloud, because the feature to do the job via iTunes (specifically version 11) no longer exists.


The change – confirmed by El Reg – comes with no forewarning to those about to upgrade, which will annoy privacy-conscious fanbois already spooked by revelations about the NSA and GCHQ’s surveillance of anyone using an electronic device.

“Apple seems to now want to force its clients, who tend to be high end, to export their contact information to a nation that is at best shoddy at keeping secrets and that is already spying on everyone,” said a Reg reader who tipped us off about the OS X change.

“You could argue the US probably has that information already, but as far as I can tell, a business that uses iCloud for business contacts is in the same position as a business that uses Gmail.”

Our reader is not the only person to have spotted the update: irritated Apple fans have filed support queries about the change, and argued that resorting to setting up a CardDAV, CalDAV or similar network server is not an acceptable workaround.

The iCloud-only contact sync policy emerged as Apple published a transparency report that documented the number and types of requests it has received for copies of users’ personal records from cops and intelligence agencies around the world. The US topped the charts, and also bans Apple from revealing specific details about the info slurped – the iPhone maker is opposed to this gagging order.

Spook agencies such as Uncle Sam’s NSA don’t necessarily need the contents of your emails and messages for analysis: just the metadata describing who contacted whom and when, or simply who is associated with whom, is enough – thus, address books slurped from cloud providers are a treasure chest of intelligence.

And by mandating iCloud-only calendar synching as well as contacts, Apple could end up handing over details of your whereabouts past and future. Of course, if the NSA really wanted someone’s contact book, it could certainly find some way of snaffling it – but tapping up the iCloud is so much easier for them.

Apple goes out of its way to say it gives the privacy of its customers “consideration from the earliest stages of design for all our products and services”, a stance that goes against the iCloud-only sync change, which activists will likely view as a step in the wrong direction by Cupertino.

Apple Mac OS X Mavericks help center … a warning lies in the small print

Up close … we zoom in on the cloud sync wording

Mavericks includes a new iCloud Keychain that can store all website usernames, passwords, credit card numbers and Wi-Fi network information, and keeps the data up to date across a user’s Apple devices, including iPhones and iPads. While we’re told the data is encrypted using the AES256 algorithm, security researchers including Mike Shema, director of engineering at cloud security firm Qualys, expressed mixed feelings about the password management feature: it helps people juggle their login credentials, but ultimately users are in the hands of software developers.

“It’s one thing to hear advice that users should have separate passwords for each of their accounts, it’s another to actually follow through on the advice since adhering to it can be such a hassle,” Shema said.

“Something like [Apple’s iCloud] keychain essentially makes this effortless and uniform across a user’s devices – of course, only their Apple devices.

“However, the keychain solves some of the user’s password management problems but none of the app’s. In other words, there may still be weaknesses in how the app handles password storage and password resets, for example. One of the biggest problems in identity security is that apps still equate users with email addresses for password-reset mechanisms.”

On a more positive note, Qualys separately praised Apple for updating Mavericks to mitigate the infamous BEAST SSL snooping attack. Other security improvements in OS X 10.9 Mavericks include forcing Adobe Flash Player plugins to run in a locked-down sandbox within Apple’s Safari browser software. ®


Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/11/07/apple_mandates_icloud_contact_syncing/

Furious Google techie on NSA snooping: ‘F*CK THESE GUYS’

The business case for a multi-tenant, cloud-based Recovery-as-a-Service solution

Eric Schmidt’s indignation over the NSA’s reported spying on links between Google’s data centres pales in comparison to the righteous indignation of his engineers.

The latest leaks from whistleblower Edward Snowden provide evidence that Google and Yahoo! data centre interconnects were being tapped by the NSA’s spies, as part of a program code-named MUSCULAR.


Both Yahoo! and Google are knowing participants in the NSA’s even more notorious PRISM web surveillance dragnet program.

But PRISM apparently wasn’t enough for the signals intelligence agency, hence its decision to use MUSCULAR to covertly hoover up any of the bits it might have missed by tapping into fibre-optic links leased or run by Google (and others) between its data centres.

All this is in addition to GCHQ’s Tempora program for wholesale collection of traffic through transatlantic fibre-optic cables and Bullrun – the bete noire of security professionals – which is the NSA’s effort to work with hardware and software technology vendors to weaken encryption standards and their underlying components.

Google’s executive chairman branded the NSA’s surveillance of its data centres as “outrageous” in an interview with the Wall Street Journal, while Google engineers Mike Hearn and Brandon Downey have gone further – much, much further – in lambasting the NSA and GCHQ for their snoopy ways.

Downey, after saying that he was speaking in a personal capacity, took to Google Plus to say “fuck these guys” in a splendid rant comparing the NSA to the lesser denizens of Mordor.

I’ve spent the last ten years of my life trying to keep Google’s users safe and secure from the many diverse threats Google faces.

I’ve seen armies of machines DoS-ing Google. I’ve seen worms DoS-ing Google to find vulnerabilities in other people’s software. I’ve seen criminal gangs figure out malware. I’ve seen spyware masquerading as toolbars so thick it breaks computers because it interferes with the other spyware.

I’ve even seen oppressive governments use state sponsored hacking to target dissidents.

But after spending all that time helping in my tiny way to protect Google – one of the greatest things to arise from the internet – seeing this, well, it’s just a little like coming home from War with Sauron, destroying the One Ring, only to discover the NSA is on the front porch of the Shire chopping down the Party Tree and outsourcing all the hobbit farmers with half-orcs and whips.

The US has to be better than this; but I guess in the interim, that security job is looking a lot more like a Sisyphus thing than ever.

Hearn, a British colleague of Downey’s who worked on anti-hacking systems for Google for two years and is based in Switzerland, backed his colleague’s “fuck you, NSA” message in a similar (equally angry but perhaps less mythologically inclined) rant, also posted on Google Plus:

I now join him in issuing a giant Fuck You to the people who made these slides. I am not American, I am a Brit, but it’s no different – GCHQ turns out to be even worse than the NSA.

We designed this system to keep criminals out. There’s no ambiguity here. The warrant system with skeptical judges, paths for appeal, and rules of evidence was built from centuries of hard-won experience. When it works, it represents as good a balance as we’ve got between the need to restrain the state and the need to keep crime in check. Bypassing that system is illegal for a good reason.

Unfortunately we live in a world where all too often, laws are for the little people. Nobody at GCHQ or the NSA will ever stand before a judge and answer for this industrial-scale subversion of the judicial process. In the absence of working law enforcement, we therefore do what internet engineers have always done – build more secure software. The traffic shown in the slides below is now all encrypted and the work the NSA/GCHQ staff did on understanding it, ruined.

Thank you Edward Snowden. For me personally, this is the most interesting revelation all summer.

MUSCULAR was possible because Google wasn’t encrypting traffic on dedicated leased lines running between its data centres. It’s easy to be wise in hindsight, but this looks like a serious shortcoming.

The Operation Aurora cyber-espionage attacks against Google and other hi-tech firms back in 2009, and blamed on China, ought to have acted as a wake-up call prompting the Chocolate Factory to improve its security. Improvements were undoubtedly made but they obviously weren’t comprehensive enough.

Security experts have welcomed Hearn and Downey’s impassioned diatribes against the NSA. “If only Google’s legal team were as angry as Google’s security engineers,” said Christopher Soghoian, principal technologist and senior policy analyst at the American Civil Liberties Union in an update to his personal Twitter account. ®

5 ways to reduce advertising network latency

Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/11/07/google_engineers_slam_nsa/

KitKat swats yet another Android ‘MasterKey’ bug


Android 4.4 contains a fix for yet another – albeit weaker – variant of the so-called MasterKey bug that first surfaced in July.

The vulnerability first shook the security world when mobile security startup Bluebox Security warned about a class of flaw that potentially affected 99 per cent of Android devices. The problem revolved around how Android handled the verification of the integrity of apps.


Security shortcomings meant that malicious parties could alter some of the contents bundled in an app without changing its cryptographic signature. Apps for Android come as .APKs (Android Packages), which are effectively just ZIP archives. Bluebox discovered it was possible to pack an installation file with files whose names match those already in the archive. These duplicate files could easily contain malicious code. Bluebox discovered the gaping security hole in February and notified Google, but a fix didn’t arrive until July.

The issue arose because Android checked the cryptographic hash of the first version of any repeated file in an APK archive, but the installer extracted and applied the last version – which might be anything, and wouldn’t be checked provided it had the same file name as an earlier (legitimate) component.
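For the curious, that first-vs-last disagreement is easy to reproduce with Python’s zipfile module standing in for Android’s verifier and installer (the file name and contents here are invented purely for illustration):

```python
import io
import warnings
import zipfile

# Build an archive containing two entries with the SAME name:
# a "legitimate" file first, then a malicious replacement.
buf = io.BytesIO()
with warnings.catch_warnings():
    warnings.simplefilter("ignore")  # zipfile warns about duplicate names
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("classes.dex", b"legitimate code")
        zf.writestr("classes.dex", b"malicious code")

with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    # Both entries survive in the archive ...
    print([e.filename for e in zf.infolist()])  # ['classes.dex', 'classes.dex']
    # ... but extraction by name silently returns the LAST one -- the
    # same kind of disagreement Android's verifier (which hashed the
    # first copy) had with its installer (which extracted the last).
    print(zf.read("classes.dex"))               # b'malicious code'
```

The moral is the one the eventual patch enforced: code that verifies an archive must walk it exactly the same way as the code that extracts it.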

A similar bug, discovered by Chinese Android researchers, was also fixed in July. It was Java-based but had the same practical consequences – miscreants could upload Trojan-laden .APK files onto online marketplaces that carried the same digital signature as the legitimate app. Both earlier issues were resolved in Android 4.3 Jelly Bean, which was released in July.

Investigation of the recently released Android 4.4 source code by Jay Freeman, a mobile security developer best known for his work on iOS and Cydia*, has revealed that it contains a patch for a third flaw along the same lines. The third flaw is less easy to exploit than the two previous variants, but is still potentially problematic. It arises because it is possible to manipulate the filename length field in a ZIP file’s metadata.

“The local header filename length is deliberately set so large that it points past both the filename and the original file data,” explains veteran antivirus expert Paul Ducklin on the Sophos Naked Security blog. “This presents one file to the verifier, and a different file to the operating system loader.”
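The field Ducklin describes can be inspected with nothing more than Python’s struct module: in a ZIP local file header, the filename length is the little-endian 16-bit value at offset 26. This sketch merely reads the field from a freshly built archive – an attacker would instead overwrite it with an oversized value, as described above:

```python
import io
import struct
import zipfile

# Build a one-file archive in memory (file name invented for illustration).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("app.txt", b"payload")
data = buf.getvalue()

# A ZIP local file header starts with the signature PK\x03\x04;
# the filename-length field is the little-endian uint16 at offset 26.
assert data[:4] == b"PK\x03\x04"
(name_len,) = struct.unpack_from("<H", data, 26)
print(name_len)   # 7 -- the length of "app.txt"

# The flaw: the verifier trusted this field, so a deliberately huge
# value made it skip past the real file data and hash different bytes
# from the ones the loader would later execute.
```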

Android maintainers have quashed the latest bug by altering the Java-based validation code “so that it follows a similar path through the data to that used by the loader,” according to Ducklin, who describes this as an effective (if not holistic) fix.

Freeman has published a detailed analysis of the flaw, along with proof-of-concept code, here. The third flaw was found at around the same time as the others, but only patched this month.

All three flaws stem from features of the Zip file format, designed in an earlier era of computing, which included filename redundancy in case files had to be split across multiple floppy disks. These and other antiquated features are hard-wired into the Zip format, and Android Packages, built on its foundations, inherit the resulting security issues.

Sources have confirmed that all three bugs have been fixed in Android 4.4 and that Google’s OEM hardware partners have been notified. It might still take some time for device manufacturers to roll out the update, if the progress of previous updates through the Android ecosystem is any guide.

El Reg was able to confirm through Romanian software security firm Bitdefender that the latest MasterKey vulnerability has been fixed.

Bogdan Botezatu, senior e-threat analyst at Bitdefender, said: “The code committed into the linked Git repository has changed in the 4.4 RC1 iteration and the attack vector described in the article has – to our knowledge – been mitigated.

“We also tried to reproduce the described exploit in the compiled AOSP builds that started showing up since Friday with no avail. However, we are looking into the unit to see if special scenarios could allow for similar exploits,” he added.

Bitdefender’s Botezatu discovered two benign gaming apps featuring the original MasterKey vulnerability in the official Google Play store two weeks after the problem first surfaced, so his reassurance that there are no further hidden problems in Android along the same lines, at least for now, is welcome.

Bootnote

*Cydia is an application that lets fanbois search for and install software packages on jailbroken iOS Apple devices.


Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/11/07/third_android_master_key_vuln_squashed/

Prototype Encrypts Data Before Shipping It To The Cloud

Researchers at Georgia Tech have built a prototype that encrypts files before they are sent to the cloud for storage.

The so-called “CloudCapsule” system can be used with cloud storage services, such as Dropbox and Google Drive, for locking down files prior to their storage in the cloud and for accessing them without a proxy. The technology can be used for desktops, laptops, and mobile devices, but the researchers initially have built a prototype for just mobile devices.

“We thought its greatest utility would be in the mobile space,” given the explosion in BYOD, says Paul Royal, associate director of the Georgia Tech Information Security Center (GTISC), where the prototype was created. “This lets us combine some of the reasonable process-isolation present in mobile OSes with a seamless and transparent way of encrypting data you want to place into the cloud.”

It’s the classic conundrum with the cloud: balancing utility with security. According to a new report published today by GTISC, corporate information stored in the cloud is typically secured solely with what the cloud storage provider offers. And encrypting data in the cloud via private-key encryption typically makes the cloud less useful, the report says.

CloudCapsule basically uses a virtual machine instance that lets a user from the same machine go into encrypted mode and access encrypted files stored in the cloud. The operating system and malware have no “knowledge” of the data, according to GTISC, nor can the cloud provider read the files.
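The encrypt-before-upload idea at the heart of CloudCapsule can be sketched in a few lines of Python. To stay self-contained this sketch uses a toy SHA-256 counter keystream in place of a vetted cipher – a real tool would use something like AES-GCM – and the key, file name and helper names are all invented for illustration:

```python
import hashlib
import secrets

# Toy keystream cipher, for illustration ONLY: hash key+nonce+counter
# to produce a keystream, then XOR it with the plaintext.
def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_for_cloud(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    return nonce + ct   # only this opaque blob ever leaves the device

def decrypt_from_cloud(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))

key = secrets.token_bytes(32)            # stays on the endpoint
blob = encrypt_for_cloud(key, b"quarterly-figures.xlsx contents")
# The storage provider (Dropbox, Google Drive, ...) sees only `blob`;
# the user can round-trip it locally.
assert decrypt_from_cloud(key, blob) == b"quarterly-figures.xlsx contents"
```

As the sketch makes plain, the cloud provider never holds the key – which is exactly the key-management burden CipherCloud’s Leidig raises below.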

“CloudCapsule is an interesting approach and from the details available … it seems specific to DHS, which may not be ideal for other users. A potential issue that enterprises might encounter is in the deployment,” says Paige Leidig, senior vice president at CipherCloud, a cloud security firm.

Leidig says CloudCapsule would be difficult to scale compared with a single gateway model — the approach CipherCloud takes — because it’s deployed on endpoints. “The other potential problem for the endpoint approach is key management — if the user loses the keys, they would need to be revoked and replaced, which adds more complexity, especially for large enterprises with hundreds of thousands of users,” Leidig said in an email interview.

But searching encrypted information remains problematic. GTISC researchers also have been working on techniques for “searchable encryption” so users can more easily find their protected data and files in the cloud. “We are trying to design types of encryption that support … performance requirements” of real-world users, GTISC’s Royal says. “You do need to encrypt data before it goes into the cloud, but you would still like to do basic keyword searches over that data. That’s something we’ve been working on at Tech.”

Striking a balance between securing the data and indexing or searching it is complicated, he says. “There are going to be fundamental tradeoffs between security and efficiency,” he says. “In some cases, there’s a desire not to introduce significant overhead, so, for example, in some cases, we are turning the problem on its head and asking a person who would use this in the real world what they consider acceptable performance.”
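One common building block for searchable encryption – not necessarily the scheme the GTISC team is designing – is a keyword index of deterministic HMAC tokens: the client tokenises keywords with a secret key, so the server can match searches without ever learning the words. All names and data below are hypothetical:

```python
import hashlib
import hmac

# Client-side keyword tokenisation: only the client holds `key`.
def token(key: bytes, word: str) -> str:
    return hmac.new(key, word.lower().encode(), hashlib.sha256).hexdigest()

key = b"client-side secret key"

# Indexing: the client registers each (encrypted) document under
# the tokens of its keywords; the server stores only opaque tokens.
server_index = {}
for doc_id, words in {"doc1": ["budget", "q3"], "doc2": ["travel"]}.items():
    for w in words:
        server_index.setdefault(token(key, w), []).append(doc_id)

# Search: the client sends a token, never the keyword itself.
print(server_index.get(token(key, "budget")))   # ['doc1']
```

The sketch also illustrates Royal’s trade-off: deterministic tokens are fast to match, but they leak when the same keyword is queried twice – security traded for efficiency.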

[The cyberespionage gang out of China that recently hacked into media outlet networks is now using Dropbox and WordPress in its attacks, rather than traditional email phishing and server compromise. See Dropbox, WordPress Used As Cloud Cover In New APT Attacks.]

Georgia Tech researchers also have built an email encryption prototype called “Very Good Privacy,” a more user-friendly option than the existing Pretty Good Privacy email encryption tool. Very Good Privacy software sits atop the user interface and can be used with cloud-based email services. The tool intercepts and encrypts the text as it’s typed in, before it gets to the email service. “Plain text never gets entered into an application,” Royal says. But the look and feel of the process remains unchanged for the user, so it’s transparent, he says.

The full Georgia Tech Emerging Cyber Threats Report for 2014 is available here (PDF) for download.


Article source: http://www.darkreading.com/authentication/prototype-encrypts-data-before-shipping/240163657

Schneier: Make Wide-Scale Surveillance Too Expensive

As custodians of the Internet mull over the lessons that revelations about National Security Agency (NSA) surveillance offer about the insecurity of the Internet’s infrastructure, architects must find ways to make wholesale spying more expensive. So said noted cryptographer and security evangelist Bruce Schneier in a talk today about Internet hardening at the Internet Engineering Task Force (IETF) plenary session.

“There are a lot of technical things we can do. The goal is to make eavesdropping expensive,” Schneier said. “That’s the way to think about this, is to force the NSA to abandon wholesale collection in favor of targeted collection of information.”

The NSA’s surveillance efforts are aided and abetted by today’s information economy, he explained. With data collected about consumers at every step of their movement online, and very little of it purged from corporate systems, it is only a matter of time before someone puts that data to use.

“This is not a question of malice in anybody’s heart, this is the way computers work. So what you’re ending up with is basically a public-private surveillance partnership,” he says. “NSA surveillance largely piggybacks on corporate capabilities—through cooperation, through bribery, through threats and through compulsion. Fundamentally, surveillance is the business model of the Internet. The NSA didn’t wake up and say let’s just spy on everybody. They looked up and said, ‘Wow, corporations are spying on everybody. Let’s get ourselves a cut.'”

[How do you know if you’ve been breached? See Top 15 Indicators of Compromise.]

According to Schneier, groups like IETF need to find a way to get everyone to understand that a secure Internet is in everybody’s best interest. And beyond the political and legal solutions to the problems, technologists must find ways to make it more onerous for wide-scale surveillance to be carried out.

This starts with ubiquitous encryption on the internet backbone, Schneier said, along with usable application-layer encryption. Additionally, thought needs to be put into target dispersal.

“We were safer when our email was at 10,000 ISPs than it was at 10,” he said. “It [consolidation] makes it easier for the NSA and others to collect. So anything to disperse targets makes sense.”

Additionally, increasing use of endpoint security products and better-integrated anonymity tools can help thwart widespread spying. Finally, security and technology assurance needs to be fixed, so that back doors aren’t left behind for any one person or group to take advantage of.

“This is a hard one, but it’s an important one,” he said. “We need some way to guarantee, to determine, and to have some confidence that the software we have does what it’s supposed to do and nothing else.”

Additionally, people need to understand that while the NSA is in the limelight at the moment, it is a symptom of a much bigger disease. The NSA is not the only government agency in the world engaging in these behaviors, and, to some extent, private organizations do the same.

“This is a fundamental problem of data-sharing and of surveillance as a business model. This is about the benefits of big data versus the individual risks of big data,” he said. “When you look at behavioral data of advertising, of health data, of education data, of movement data, the question becomes how do we design systems that benefit society as a whole while protecting people individually. I believe this is the fundamental issue of the information age.”


Article source: http://www.darkreading.com/vulnerability/schneier-make-wide-scale-surveillance-to/240163668

Apple: How we slip YOUR data to govts – but, hey, we’re not Google


Apple has joined Facebook, Google, Microsoft, Twitter, and Yahoo!’s transparency club, releasing a detailed report on the numbers and types of requests for personal records it has received from law enforcement and government agencies around the world.

“We have reported all the information we are legally allowed to share,” the report, issued Tuesday, states, “and Apple will continue to advocate for greater transparency about the requests we receive.”


The report’s Account Information Requests table, below, lists the exact number of requests received, acted upon, and other details from the 31 countries from which Apple received such requests. “Some countries are not listed in this report,” a note reads, “because Apple has not received any information requests from the government there.”

Among the 31, only one country bars companies from revealing the exact number of requests. Yes, you guessed right: the good ol’ U.S. of A.


Account Information Requests listing from Apple transparency report

“At the time of this report,” Apple notes, “the U.S. government does not allow Apple to disclose, except in broad ranges, the number of national security orders, the number of accounts affected by the orders, or whether content, such as emails, was disclosed. We strongly oppose this gag order, and Apple has made the case for relief from these restrictions in meetings and discussions with the White House, the U.S. Attorney General, congressional leaders, and the courts.”

Account requests, Apple says, commonly involve law enforcement asking for information regarding robberies or other crimes, as well as searches for missing persons or kidnapping victims.

“In very rare cases,” the report says about account requests, “we are asked to provide stored photos or email. We consider these requests very carefully and only provide account content in extremely limited circumstances.”

In the report, Apple manages the somewhat contortionistic feat of simultaneously patting itself on the back while sticking its thumbs in the eyes of such companies as Google, Facebook, Twitter, and the like. After saying that the privacy of their customers is “a consideration from the earliest stages of design for all our products and services” and that they “work hard to deliver the most secure hardware and software in the world,” the thumbs come out:

Perhaps most important, our business does not depend on collecting personal data. We have no interest in amassing personal information about our customers. We protect personal conversations by providing end-to-end encryption over iMessage and FaceTime. We do not store location data, Maps searches, or Siri requests in any identifiable form. … Unlike many other companies dealing with requests for customer data from government agencies, Apple’s main business is not about collecting information.

In addition to the information on requests for account information, Apple also provides details on device requests, of which they say “the vast majority” relate to lost or stolen devices. “These types of requests frequently arise when our customers ask the police to assist them with a lost or stolen iPhone, or when law enforcement has recovered a shipment of stolen devices.”

Device Information Requests listing from Apple transparency report

Apple also notes that it has never received an order to release information under Section 215 of the USA PATRIOT* Act.

That section, under challenge by such civil liberties organizations as the EFF and the ACLU, allows the FBI – and who knows what other federal authorities – to obtain secret clearance from the FISA court to obtain information from a company about you and your activities, ostensibly “to protect against international terrorism or clandestine intelligence activities.” The company must hand over that info to the investigators under a gag order that prevents them from ever informing you+world+dog that they even received the order.

“We would expect to challenge such an order if served on us,” Apple says. However, we may never know whether or not they were so served, or if they challenged such an order. Section 215 remains the law of the land here in the good ol’ U.S. of A. ®

Bootnote

* Do know that the USA PATRIOT Act is so capitalized because its common name is an acronym for its full name: the Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act of 2001.


Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/11/05/apple_transparency_report_sticks_thumb_in_eyes_of_google_facebook/

Microsoft, Facebook: We’ll pay cash if you can poke a hole in the INTERNET


While Facebook and Microsoft already run security bug bounty programs of their own, the two companies are now working together to reward researchers who can find flaws in some of the underlying technologies behind online communications.

The Internet Bug Bounty program will pay a minimum of $5,000 for flaws in sandboxed applications or for bugs in fundamental internet technologies such as DNS and SSL. Lower payouts are offered for spotting problems in Ruby, Python, PHP, Apache, Perl, and other software.


“Our collective safety is only possible when public security research is allowed to flourish. Some of the most critical vulnerabilities in the internet’s history have been resolved thanks to efforts of researchers fueled entirely by curiosity and altruism,” the two companies said on the bounty program’s website.

“We owe these individuals an enormous debt and believe it is our duty to do everything in our power to cultivate a safe, rewarding environment for past, present, and future researchers.”

To qualify, flaws must be found in code that is in widespread use, be of serious or critical severity, or be an unusual or novel hack that no one has thought of yet. Once reported and verified, software providers will have 180 days to fix the problem before any announcement is made of money paid out.

The 10-person judging panel is dominated by Microsoft and Facebook staff, but there will be input from Google security researcher Chris Evans, Etsy director of security engineering Zane Lackey, and iSec penetration tester Jesse Burns.

The contest is open to anyone in the world, except those countries under US trade embargo. There’s no age limit, but if you’re not yet a teenager then a parent or guardian will have to claim the money for you.

If researchers choose to donate their winnings to charity, the program may increase the end payout as a gesture of altruism. It’s a sad fact of life that the baseline payouts on offer here are far less than what weaponized exploits against unpatched security bugs can fetch on the open market – although the Internet Bug Bounty sets no upper limit on payments for some security holes. ®


Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/11/07/microsoft_and_facebook_offer_cash_for_other_peoples_coding_flaws/

Truly secure clouds? Possible but not likely say Georgia Tech boffins


Georgia Tech has added itself to the chorus, nay, throng of voices warning that poorly-implemented cloud computing and the world of BYO mobile devices are threats to enterprise security.

In its Emerging Cyber Threats 2014 report, GT’s Information Security Center joins World+Dog in noting that the Snowden NSA whistle-blowing has concentrated minds wonderfully on the question “who’s reading my cloud?”


However, trying to secure what leaves the premises comes at a cost, says GTISC director Wenke Lee: “Encryption in the cloud often impacts data accessibility and processing speed. So we are likely to see increased debate about the tradeoffs between security, functionality and efficiency.”

Even if a company bites the bullet and encrypts everything going to the cloud services it has bought on contract from an enterprise provider, the report notes that employees’ individual use of “shadow” services like Dropbox, Box.com and Google’s sharing services can undermine that security (although The Register notes that Google began encrypting enterprise-level cloud data in August, and with more recent NSA revelations, the encryption deployment will probably expand).

In the mobile space, GTISC points to the university’s own work on AppStore vetting bypasses and malicious chargers. No matter how robust vendors’ security models might be, GTISC says this only deals with large-scale attacks: targeted attacks that can be used against smaller groups or individuals still remain a threat.

GTISC also highlights the burgeoning enthusiasm for the Internet of Things as an embryonic threat for the future. The report notes that the simplicity of IoT devices can be an attack point. Detecting, for example, counterfeit devices in an IoT environment is resource-intensive, the report notes, which works against the low-power and simplicity sought by device makers.

In the industrial space, the report also criticises system designers for failing to build defences against side-channel vulnerabilities such as timing attacks. ®

Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/11/07/cloud_mobile_keep_sysadmins_awake_georgia_tech/

Another zombie ‘bogus app’ bug shambles out of Android

Jay Freeman, aka @saurik, has detailed another Zip implementation bug in pre-4.4 (KitKat) versions of Android which, similarly to the notorious APK vulnerability exposed earlier this year, opens a hole that malware can sneak through.

Freeman – whose previous credentials include security analysis of Google Glass and uncovering the dodginess of the “iMessage for Android” app – has written in a blog post that he uncovered the extra vulnerability in June, but waited until Android 4.4 (with a fix) was shipping.


Freeman’s dense post is here, and is unpicked and explained by Sophos’ Paul Ducklin at Naked Security here.

In brief, the extra APK vulnerability offered a path for an attacker to exploit the way Android used Zip file headers to verify the software. As Ducklin explains, Zip still carries an obsolete piece of its history around with it: lots of filename redundancy, dating from the days when files had to be split across multiple floppy (remember those?) disks. To help a program navigate a file, the local header includes a filename-length field – this lets an extractor skip past the header to where the file data starts.
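For the curious, the local file header layout Ducklin describes is easy to poke at with Python’s standard library. A minimal sketch (the offsets come from the Zip specification; the in-memory archive is a throwaway built just for illustration):

```python
import io
import struct
import zipfile

# Build a one-file Zip in memory, then read the filename-length field
# straight out of the local file header with struct.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:   # default: stored, uncompressed
    zf.writestr("app.dex", b"payload")

raw = buf.getvalue()

# Local file header: 4-byte signature "PK\x03\x04", then fixed fields;
# the filename length is the unsigned short at offset 26, the extra
# field length at offset 28, and the fixed header is 30 bytes long.
assert raw[:4] == b"PK\x03\x04"
(name_len,) = struct.unpack_from("<H", raw, 26)
(extra_len,) = struct.unpack_from("<H", raw, 28)
print(name_len)  # 7 == len("app.dex")

# File data begins after the fixed header, filename and extra field
data_start = 30 + name_len + extra_len
print(raw[data_start:data_start + 7])  # b'payload'
```

An extractor that trusts this length field jumps exactly this way to find the data – which is precisely the behaviour the attack below abuses.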

As Ducklin writes, the problem is this: “The Java code in Android 4.3 and earlier, that extracts the file data to verify it, uses the filename length from the central directory. But the C code that extracts the file to install and execute it uses the filename length in the local header.”

An attacker could then take a verified app, add their malware, and modify the header length the C-code loader uses to point not to the legitimate app, but to the malware. Ducklin’s illustration shows this simply:

Paul Ducklin's illustration of the APK vulnerability

Image: Paul Ducklin, Naked Security

As Saurik writes: “The central directory includes a file offset for each local header, so that once the Java code has finished verifying a file, it can jump directly to the next one, thus avoiding the local header data that would cause it to skip forward incorrectly. The imposter data, squeezed between the legitimate file and the next local header, is simply ignored.”
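The verifier/loader mismatch can be sketched with a toy model – this is not real APK parsing, and the field layout and names here are simplified inventions for illustration, but the logic mirrors Ducklin’s description: the “Java verifier” trusts an independent copy of the filename length (the central directory), while the “C loader” trusts the copy inside the local header, which the attacker controls.

```python
import struct

FILENAME = b"classes.dex"
LEGIT = b"LEGITIMATE-CODE"    # what gets signature-checked
MALWARE = b"SMUGGLED-CODE!!"  # what the attacker squeezes in after it

def build_entry(local_name_len):
    # Toy "local header": 2-byte filename length, filename, file data
    return struct.pack("<H", local_name_len) + FILENAME + LEGIT + MALWARE

def read_with_central_dir(blob, central_name_len, data_len):
    # "Java verifier": uses the central directory's filename length
    start = 2 + central_name_len
    return blob[start:start + data_len]

def read_with_local_header(blob, data_len):
    # "C loader": trusts the length field inside the local header itself
    (name_len,) = struct.unpack_from("<H", blob, 0)
    start = 2 + name_len
    return blob[start:start + data_len]

# Attacker inflates the local length to cover filename AND legit data,
# so the loader skips straight over the verified bytes to the malware
evil = build_entry(len(FILENAME) + len(LEGIT))

verified = read_with_central_dir(evil, len(FILENAME), len(LEGIT))
executed = read_with_local_header(evil, len(MALWARE))
print(verified)   # b'LEGITIMATE-CODE'
print(executed)   # b'SMUGGLED-CODE!!'
```

Both readers parse the same bytes, yet one checks the legitimate code while the other runs the imposter data – the essence of the bug.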

The fix in KitKat is to force the Java code to look at the same data as the C loader, so that any discrepancy is identified. ®

Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/11/07/another_zombie_bogus_app_bug_shambles_out_of_android/

IPMI in Supermicro servers vulnerable says HD Moore

Metasploit’s HD Moore is gnawing at the security of the Intelligent Platform Management Interface (IPMI) again, this time zeroing in on the firmware implementation from vendor Supermicro.

Moore had looked at IPMI in general in July, at which time he pointed to vulnerabilities in Supermicro’s UPnP implementation.


His latest work at Rapid7, here, takes a closer look at the baseboard management controller (BMC) on motherboards using the SMT_X9_226 firmware.

His findings are that the firmware includes a small host of vulnerabilities: static credentials, buffer overflows, and directory traversals. Taking them in order:

  • Static Encryption Keys (CVE-2013-3619) exist in the lighttpd web server SSL interface and the Dropbear SSH daemon. Users can update the SSL keys but not the SSH keys.
  • The OpenWSMan interface (CVE-2013-3620) has a static password (admin) for the digest authentication file, providing an attacker with a backdoor.
  • Various built-in CGI applications contain buffer overflows that give attackers root access for remote code execution – these are listed as CVE-2013-3621, CVE-2013-3622, and CVE-2013-3623.
  • A directory traversal attack exists in the url_redirect.cgi application, and various other CGI applications include unbounded calls to functions like strcpy(), memcpy(), and sprintf().
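The directory-traversal class of bug is worth a quick sketch. The handler name and web root below are hypothetical, not Supermicro’s actual code; the point is the pattern: joining an attacker-supplied path onto a document root without normalising it lets ".." segments escape the root.

```python
import os

WEB_ROOT = "/var/www"  # hypothetical document root

def unsafe_path(user_path):
    # Vulnerable pattern: ".." segments in user_path escape the root
    return os.path.join(WEB_ROOT, user_path)

def safe_path(user_path):
    # Defensive pattern: normalise first, then confirm the result is
    # still inside the web root before touching the filesystem
    candidate = os.path.normpath(os.path.join(WEB_ROOT, user_path))
    if not candidate.startswith(WEB_ROOT + os.sep):
        raise ValueError("directory traversal attempt")
    return candidate

print(unsafe_path("../../etc/shadow"))  # /var/www/../../etc/shadow
print(safe_path("index.html"))          # /var/www/index.html
```

On a BMC running as root, the unsafe version hands an attacker any file on the flash – which is why traversal bugs in management firmware are so prized.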

As he stated back in July, Moore says there are 35,000-plus Supermicro IPMI interfaces visible to the internet (El Reg supposes his source is the ever-reliable Shodan search engine). ®

Article source: http://go.theregister.com/feed/www.theregister.co.uk/2013/11/07/ipmi_in_supermicro_servers_vulnerable_says_moore/