Mining Android Secrets (Decoding Android App Resources)


By Jeff McJunkin

As a pen tester and avid Android user, I’m keenly interested in the security of Android applications. Even without looking at the code, we can gain a tremendous understanding of what happens in the deep, dark corners of an application. All we need to do is dig away at the Android resources.

Evaluating Android applications is more than just looking at the decompiled source or runtime analysis of packet capture data. When I’m examining applications, for example, the stuff other than code (“resources” in Android terms) can help me figure out the purpose of the application itself. Josh Wright found the same thing to be true when he examined a number of “ghost detection” Android apps over the summer.

As a general rule, if the app includes 10,000 MP3 files of creepy sounds, it probably isn’t “tapping into the non-corporeal aether.”


While we commonly think of resources as files, they can just as well be strings or any other arbitrary data. For internationalization (commonly abbreviated “i18n”), application developers typically ship an application’s strings in multiple locale-specific files rather than hard-coding English text, for example. Similarly, it’s convenient to move constants such as URLs, local HTML, and images out of the code itself and into resources instead.
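For example, string resources live in locale-specific files that Android selects at runtime; a minimal (entirely made-up) sketch might look like this:

```xml
<!-- res/values/strings.xml : default (English) strings -->
<resources>
    <string name="app_name">Example App</string>
    <string name="api_base">https://api.example.test/v1/</string>
</resources>

<!-- res/values-fr/strings.xml : French overrides -->
<resources>
    <string name="app_name">Appli Exemple</string>
</resources>
```

Note how the URL rides along in the same resource mechanism as the translated text, which is exactly why resources are worth mining.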

Unzipping an APK

All that resource information is bundled together in a binary file named resources.arsc and stored in the final .APK file. Since an APK file is just a .zip file that has been renamed, you can use the unzip utility to open it up and take a look:

jeff@blue:~/example$ unzip mobile-release.apk 
Archive:  mobile-release.apk
  inflating: AndroidManifest.xml     
  inflating: META-INF/CERT.RSA       
  inflating: META-INF/CERT.SF        
  inflating: META-INF/MANIFEST.MF    
  inflating: classes.dex             
  inflating: jsr305_annotations/Jsr305_annotations.gwt.xml  
  inflating: res/anim-v21/design_bottom_sheet_slide_in.xml  
[...skipping a lot of output...]
  inflating: res/transition-v21/lb_vertical_grid_return_transition.xml  
  inflating: res/xml/allowed_media_browser_callers.xml  
  inflating: res/xml/automotive_app_desc.xml  
 extracting: resources.arsc          

Here I extracted an APK file that I built from Google’s open source project.

Unfortunately, I wouldn’t recommend looking around in these folders manually as a regular activity (feel free to do so for the first few applications you extract, though).

Why? Because production applications have a lot of files.

Really. A lot of files:

$ unzip mobile-release.apk >/dev/null
$ find . -type f | wc -l

In other words, it’s easy to get lost exploring these dark passageways, and even more difficult to thoroughly inspect all of the data manually.

Since today’s topic is resource files, let’s take a look at those in particular. I’ll borrow a great idea from our DFIR friends and look for the least frequently occurring file types with some Bash command-line kung fu below:

$ cd res
$ find . -type f -exec file -b {} \; | cut -d, -f1 | sort | uniq -c | sort -n
      4  Ogg data
    380  Android binary XML
    999  PNG image data

In this command, find identifies regular files (that’s what -type f is for, excluding directories and the like) and runs file -b on each one, guarding against mislabeled file extensions. The cut command keeps just the portion of file -b’s output before the first comma, since file appends details such as image resolution. The first sort orders the output, which uniq -c needs in order to count adjacent duplicates. Finally, sort -n presents the counts in numeric order, so we see the least frequently occurring file types first.

The “Android binary XML” output that the file command returned tells us that those XML files aren’t actually plain text. They are a binary representation that is a lot more hassle to deal with. Fortunately, we don’t have to go file system spelunking manually; we can rely on other tools to help with the analysis.

Decoding an APK

First up is Apktool, a command-line utility that works very reliably to decode Android application data. If other, perhaps more automated tools are failing you, I’d highly recommend going back to Apktool for your Android application analysis needs.

Here is the basic usage of Apktool against an APK file:

jeff@blue:~/example$ ls *.apk
jeff@blue:~/example$ apktool d mobile-release.apk 
I: Using Apktool 2.2.1 on mobile-release.apk
I: Loading resource table...
I: Decoding AndroidManifest.xml with resources...
I: Loading resource table from file: /home/jeff/.local/share/apktool/framework/1.apk
I: Regular manifest package...
I: Decoding file-resources...
I: Decoding values */* XMLs...
I: Baksmaling classes.dex...
I: Copying assets and libs...
I: Copying unknown files...
I: Copying original files...

We can see that Apktool loaded and decoded the resource files (including that “binary XML” we saw before) and wrote them to the filesystem. Let’s verify that using the same find usage as before:

jeff@blue:~/example$ cd mobile-release/res/
jeff@blue:~/example/mobile-release/res$ find . -type f -exec file -b {} \; | cut -d, -f1 | sort | uniq -c | sort -n
      4 Ogg data
    544 XML 1.0 document
    999 PNG image data

Sweet! Not only did we get more files, the XML files are now in plain text. What treasures shall we find inside? Well, for quick analysis of the application, I like to search in the decoded strings.xml file as follows:

jeff@blue:~/example$ cd mobile-release/res/values
jeff@blue:~/example/mobile-release/res/values$ less strings.xml 

Of course, if you’re looking for back-end web services, you could also try to find URLs inside some of those XML files.
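For example, a quick grep over the decoded res/ tree will surface any hard-coded URLs. The sketch below fabricates a tiny stand-in for apktool’s decoded output (the file contents are made up) so you can see the pattern in action:

```shell
# fabricate a stand-in for apktool's decoded resource tree
mkdir -p mobile-release/res/values
cat > mobile-release/res/values/strings.xml <<'EOF'
<resources>
    <string name="app_name">Example</string>
    <string name="api_base">https://api.example.test/v1/</string>
</resources>
EOF

# pull every http(s) URL out of the decoded XML files
grep -rEoh 'https?://[^"<]+' mobile-release/res/
# prints: https://api.example.test/v1/
```

Against a real decoded APK, skip the fabrication step and point grep at the mobile-release/res/ directory apktool created.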

What else do we have to work with? If you’re looking for a GUI and you’re willing to accept a little less reliability, there are two tools I commonly reach for that I’d like to recommend.

First up: JadX. I like JadX for a more thorough examination of the code inside the application itself, as it gives you an IDE-like viewer of the decompiled code, like in the following image:

Another tool I like is called APK Studio. If I ever plan to edit the source of an Android application, I reach for APK Studio, because it makes re-zipping the resources, signing the package with a certificate (which you unfortunately have to create outside of APK Studio), and installing it onto your Android device very easy. Here’s a screenshot of the application itself:

Whew! That was a lot of content for one blog article, but I hope it helps you get a good start to examining Android applications.

Next time you’re evaluating an Android app, make sure you investigate the app resources. You never know what unexpected and exciting speleothem you may find.

Happy hacking!

– Jeff McJunkin


Upcoming SANS Special Event – 2018 Holiday Hack Challenge


SANS Holiday Hack Challenge – KringleCon 2018

  • Free SANS Online Capture-the-Flag Challenge
  • Our annual gift to the entire Information Security Industry
  • Designed for novice to advanced InfoSec professionals
  • Fun for the whole family!!
  • Build and hone your skills in a fun and festive role-playing video game, by the makers of SANS NetWars
  • Learn more:
  • Play previous versions for free, 24/7/365:

Player Feedback!

  • “On to level 4 of the #holidayhackchallenge. Thanks again @edskoudis / @SANSPenTest team.” – @mikehodges
  • “#SANSHolidayHack Confession – I have never used python or scapy before. I got started with both today because of this game! Yay!” – @tww2b
  • “Happiness is watching my 12 yo meet @edskoudis at the end of #SANSHolidayHack quest. Now the gnomes #ProudHackerPapa” – @dnlongen

Ghost in the Droid: Reverse Engineering Android Apps

By Joshua Wright

For the past few years I’ve been invited to speak at the SANS HackFest conference. This is a great opportunity for me to present new research and useful pen testing techniques to a hungry audience.

It’s also a highly competitive event among speakers. Each year my stuff needs to be bigger and better than the year before.

Ghost Box app for Android

Over the summer, my daughter and I watched a show about a haunted house, and the reenactor used an Android app to communicate with ghosts. I saw two excellent opportunities:

  • An opportunity to answer the timeless question: can Android apps detect ghosts?
  • An opportunity to get better at Android application reverse engineering.

The Plan

My budget-manager-for-crazy-projects and I agreed that I would spend $200 on Android apps that claim to detect ghosts. I excluded anything that was free, marked as entertainment, or that otherwise admitted it did not actually detect real ghosts. Instead, I focused on apps that were labeled for professional use, claimed to meet or exceed the capabilities of commercial ghost detection tools, and claimed to perform genuine ghost detection.

The apps I chose ranged from $0.99 to $29.99 in price, and generally fall into one or more of four ghost detection categories:

  • Electromagnetic field (EMF) measurement tools
  • Electronic Voice Phenomenon (EVP) measurement (ghost audio)
  • Ghost radar and visual identification apps
  • Ghost communication tools

Ghost Detection Apps
I evaluated 20 Android ghost detection apps in total, and at HackFest I revealed my analysis results for 5 apps:

  • Ghost Hunter
  • Joe’s Ghost Box
  • Ghost Speaker
  • P-SB7 Ghost Box
  • My Own Ouija Board

Spoiler alert: None of the Android apps could be confirmed as actually capable of identifying ghosts. Sorry to disappoint!

The Tools

My work primarily consisted of dynamic analysis (install the app, capture network traffic, monitor the filesystem, and look at logging messages) followed by static analysis (unzip the Android APK file, examine the embedded resources, decompile and analyze the source code). Somewhat unsurprisingly, I found my usual cache of tried-and-true tools served me well:

  • Android Emulator, Genymotion or Android-x86 to virtualize Android devices (making analysis a bit easier than using a physical Android device)
  • Wireshark for packet capture analysis
  • Burp Suite for proxy interception
  • JadX for static application reverse engineering
  • Android Studio to import, annotate, and refactor decompiled sources
  • Apktool to extract Android resources and decompile low-level Android bytecode
  • 7-zip or any other unzip tool to extract resources from an Android APK file

In particular, I started using Genymotion as my go-to emulator for runtime analysis of Android applications. Genymotion uses VirtualBox as the behind-the-scenes hypervisor to emulate Android devices, but wraps it in a very simple user interface. Free for personal use, it’s quick and easy to download and install on Windows, Mac OS X, or Linux.

After starting Genymotion, you’ll have the option to create new Android virtual devices, choosing from different Android versions and hardware configurations. From there, it’s easy to start the Android device, scale the window, and install or copy files to the virtual device with drag-and-drop.


Although Genymotion lacks the “-tcpdump” and “-http-proxy” features that come with the official Android Emulator and its QEMU hypervisor, it’s easy enough to capture Genymotion network traffic from your guest using Wireshark. HTTP proxy settings can be configured on the Genymotion virtual WiFi interface, or through the Genymotion settings before starting the virtual device.


The Results

The apps were pretty lame.


Several of the apps used a random number generator to “detect” ghosts at various intervals, using static images to populate “radar” systems, or to serve up “ghostly” audio clips. Other apps used static wordlists to spook the user into thinking they could communicate with the dead, or hooked into cloud-based chat bot services. One app overlaid a picture of a Ouija board on chat responses pulled directly from such a bot, changing each occurrence of “bot” to “spirit”, “ghost”, “psyche”, or “soul”.


“Stay classy, ghost box app developers.” -me


My budget-manager-for-crazy-projects asked me if the $200 and hundreds of hours spent analyzing Android ghost box apps was worth it.

Yes! Analyzing crappy ghost box apps was well worth the time and money investment.

Going into this, I hoped I would find evidence of ghost detection capabilities that would defy scientific understanding. Instead, I found developers using the compass as an RNG to graph “energy values that paranormal entities might be projecting”. While that was mildly disappointing, I am well-pleased with this project.

I’m a better Android app reverse engineering analyst now than when I started.

Looking at all these apps forced me to improve my understanding of how the Android SDK works. I found bugs in some of my favorite tools, and figured out how to overcome them. I was able to optimize my workflow, to analyze an app in less time, and to produce better results. I was able to leverage Android Studio as a mechanism to aid my reverse engineering efforts, building a better understanding of how these apps work.

Going into this project, I was motivated to find out what I could about ghost box apps, and I wanted to build my skills as a mobile application security analyst. I had clearly defined goals and a strong motivation to keep working through the challenges that inevitably creep up in any project. At the end, I hadn’t identified any ghosts, but I felt smarter and more capable to evaluate Android applications, a skill that I can apply in customer pen test engagements going forward.

You can check out my presentation from the SANS HackFest 2016 conference. While you’re there, check out the other presentations too.

In the meantime, if you come across a ghost box app that you think actually detects ghosts…then drop me a note in the comments section below.




iOS 10 is Apple’s Gift to Android Users

How the latest update to iOS 10 will dramatically improve Android security

At the Apple WWDC conference in June, Ivan Krstic, Apple Head of Security Engineering & Architecture, made a bold declaration:

“At the end of 2016, Apple will make ATS mandatory for all developers who hope to submit their apps to the App Store.”

ATS is App Transport Security, Apple’s security feature that requires the use of TLS/1.2 encryption for all network transport data (specifically, the libraries used for interacting with web servers, NSURLSession and NSURLConnection). Apple introduced ATS in iOS 9, but offered developers the ability to work with servers on insecure domains by specifying exceptions in the application Info.plist file.
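For reference, such an exception looked something like the following Info.plist fragment (the domain here is a placeholder, not from any real app); these are the carve-outs the iOS 10 enforcement was set to close:

```xml
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <key>legacy.example.test</key>
        <dict>
            <!-- permit plain HTTP to this one domain -->
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
        </dict>
    </dict>
</dict>
```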

According to Krstic and Lucia Ballard, Apple’s Secure Transports Engineering Manager, the enhancements to ATS will no longer accommodate the exceptions previously available for insecure domains. Further, ATS will now require TLS/1.2 with AES-128 and SHA2 hashing, while introducing support for ECDHE forward secrecy (to prevent a private key compromise from decrypting passively collected traffic), OCSP stapling (to mitigate information leaks from CRL checking), and certificate transparency (to defeat compromised CA attacks). The ATS changes will dramatically improve network transport security of iOS apps. This bears repeating:

ATS changes will dramatically improve the network transport security of iOS apps

Of course, transitioning from plain-old HTTP to TLS/1.2 is no simple affair, particularly for large application vendors. For a quick test, I downloaded a list of the top 20 iOS apps (as of 6/23/2016, before the Pokémon Go debut):

Top 20 Free iOS Apps, 6/23/2016

For each app, I took a packet capture with the rvictl utility while using the application, then evaluated the capture in Wireshark using the http display filter. From this top 20 list, only 7 exclusively used HTTPS encryption for network transport, bringing shame to the remaining 13 for continuing to use HTTP.

Top Free iOS Apps using HTTP, 6/23/2016

Looking at the network traffic, I see this pattern often:

Instagram Network Traffic in Wireshark


Whether hosted on AWS or other cloud services, or on traditional Content Distribution Networks (CDNs) such as Akamai or MaxCDN, big and small app developers alike will need to change their apps and infrastructure to meet the ATS requirement in iOS 10. CDN use is not iOS-specific; many platforms share the same infrastructure for iOS, Android, and web applications.

With this change for iOS 10 users, Apple forces developers to migrate their platforms to TLS. Instagram, WhatsApp, Spotify, and other companies will accommodate this change to continue serving the iOS market. As they do, Android apps will also reap the benefits of this transition, even if Google continues to maintain a position of stubborn passivity in encouraging developers to do the right thing.

Apple’s you-must-TLS decision is not without its drawbacks. Migrating an HTTP infrastructure to TLS is not free, and companies must find a way to pay for this added overhead. Further, TLS has distinct disadvantages for end-users: more network overhead, slower network responsiveness, and reduced battery life due to increased CPU overhead.

For end-users, an app update to Kik that causes their battery to die at 3:00 pm (instead of 4:30 pm; these are iPhones we’re talking about) will be met with derision, cries of outrage, and 1-star ratings. Regardless, TLS/1.2 is something Apple will require of iOS app developers. Why? Because HTTP must die.


It’s long past time when we need to protect mobile device users from the deficiencies in HTTP. Just like I tell my kids when they don’t like the taste of medicine, sometimes we have to do things that we don’t like because they are good for us.

For Android users, you’ve been given a gift with iOS 10 and ATS enhancements: it will naturally lead Android app developers to migrate to TLS and strong network transport security for your apps as well. It’s a much-needed positive change for mobile security. Thank you, Apple.

–Joshua Wright
SANS Instructor and Course Author

SANS Note: Joshua Wright is the course author of SANS SEC575: Mobile Device Security and Ethical Hacking. This course is offered in-person numerous times each year and always available 24/7 in an online format – OnDemand.


Upcoming Training Opportunity:

Learn more about the latest attacks and techniques used against organizations at the SANS Pen Test HackFest Training & Summit. This year’s HackFest Summit features two days of leading talks from top experts and then six days of hands-on, immersion-style pen test training in one of our seven courses to choose from! Learn and develop your offensive techniques as you strive to better defend your environment. Whether you are a penetration tester, red team member, a forensics specialist, or cyber defender, the techniques covered at HackFest represent the latest and most powerful attacks every organization needs to thwart. You NEED to be there!

For more free educational resources, follow:

Mobile Device Security Checklist

By Lee Neely & Joshua Wright

We often get asked what users can do to help keep their mobile devices secure. Here’s a quick list of simple steps you can take to ensure that your mobile devices are running with at least some security. All of these steps are free, and they raise the bar against both unauthorized use of your device and tampering with the applications you’re running on it. Our goal here is not to make your device impenetrable to attack, but simply to raise the bar.


Security Tips For Android Devices

  • Turn on disk encryption (not explicitly tied to PIN/screen lock).
  • Use biometrics for everyday unlocking, combined with a longer passcode (instead of a simple 4-digit PIN).
  • Disable developer access (off by default).
  • Disable third-party app store access (off by default, but commonly enabled).
  • Evaluate and uninstall apps with excessive permissions, using permission-auditing apps or other tools.
  • Install Android platform updates when they become available.
  • Compare your Android version to recent releases. Is your phone getting updates? If not, it’s time for a new phone. (This is hard, because most users will find that Android phones are poorly supported and require more frequent replacement, which ends up being more costly than iOS devices over time.)
  • Do your research before you buy a new phone. Nexus has the best record for security update delivery and support, followed by Samsung, and then by LG. Everyone else is the pits for security updates.
  • Turn on “Android Device Manager” remote location services for lost devices, or use a third-party “Find my Android” tool if your device doesn’t support this feature.
  • Periodically erase your network settings to forget old, insecure WiFi networks you don’t use anymore.
  • When plugging into USB, don’t accept “Trust this PC” when prompted unless it is a personally owned system.
  • Set a strong Google password; better still, enable two-factor authentication.
  • Complain to your cell phone carrier about unwanted applications on the device and loss of control. There’s no pushback currently, so the carriers do what they want.


Security Tips for iOS Devices

  • Make sure you update iOS when new updates come out.
  • Periodically erase your network settings to forget about old, insecure WiFi networks you don’t use anymore.
  • Make sure “Find my iPhone” is turned on for locating or wiping lost devices.
  • Use TouchID with a longer passcode in lieu of a 4-digit PIN.
  • When plugging in USB, don’t say yes to “Trust this Computer” when prompted, unless it is a personally owned system.
  • Turn off iCloud backup unless you are comfortable with your pictures being stored in the cloud.
  • Use iTunes to make a backup with a password to both encrypt and to capture all your settings.
  • Set a strong Apple iTunes password.
  • Review the Settings | Privacy settings, revoking permissions from apps that are unnecessarily greedy.

Security Tips for Both iOS and Android Devices

  • Disable wireless and leave it off unless you’re actively using it.
  • Install a VPN (proXPN, Private Internet Access, etc.) for when you need to use Wi-Fi, and always use the VPN when connecting to Wi-Fi.
  • Only use known Wi-Fi connections; beware of free public Wi-Fi.
  • Don’t leave your device unattended; treat it like your wallet.
  • Use caution lending your device to others; they can quickly make unauthorized changes.
  • Disable premium-rate messages via your cell carrier! If you manage cell phones for your organization, turn them off for all.
  • Uninstall unused apps.
  • Factory reset phones before returning for service.

Want to learn more about this topic? You really should check out SEC575: Mobile Device Security and Ethical Hacking.  It’s an amazing course covering mobile device security attacks and much more!

-Lee Neely


TLS/SSL Failures and Some Thoughts on Cert Pinning (Part 1)

By Chris Crowley

It’s going to happen sooner or later…sooner, probably. You’re going to be asked about your company’s mobile app, or a mobile app your company wants to install across all mobile devices. They’ll put the request in the “yet another duty as assigned” (YADAA) bucket. You look at the network traffic; it’s using TLS, so you can’t see the content. Cool, right? Maybe, maybe not. Can an attacker gain a man-in-the-middle position by tricking users, or by attacking the very root of trust (well, one of the 100 or so root CAs) used to sign a TLS server cert? It is a complicated situation, and addressing it requires a thorough understanding of several systems and protocols. I want you to be smart on the subject, so when that YADAA comes your way, you can answer the question knowledgeably and authoritatively. So here you go…read on.

The first installment of this two-part blog post provides some background on TLS (Transport Layer Security) certificates so you can understand why your mobile apps should implement certificate pinning. In the second part, I’ll discuss a recently released tool called TrustKit, which makes it easy for iOS developers to implement certificate pinning with no change to the iOS app (presuming the URLs to be protected are already requested over TLS). No analogous project exists for Android yet.


TLS provides transport encryption to protect data in transit over a network, providing confidentiality and integrity protection against untrusted and unknown network nodes. This is a mainstay of our protection of information in transit. But in addition to recent attacks against TLS/SSL implementations (iOS/OS X goto fail;, Heartbleed, POODLE), there are fundamental trust issues within the deployment of TLS: any trusted certification authority (CA) can issue certificates for any resource.

The TLS handshake involves client verification of the certificate the server provides, followed by an optional server validation of client certificates. In practice, servers rarely validate client certificates, but the spec allows for it.


Diagram Source:

The client validates five basic items when assessing the certificate the server presents:

  1. Is this certificate current and still valid based on the expiration date in the cert compared to the current system date?
  2. Is this certificate intended for the purpose I’m attempting to use it for? This capability is defined in the certificate itself. For a client connecting to a server over HTTPS, the certificate must be marked as valid for identifying a server.
  3. Is this certificate on a revocation list? In practice, we rarely see OCSP (Online Certificate Status Protocol, RFC 6960), a real-time certificate status query mechanism, used. The client may have previously downloaded CRLs stored locally, and the offered certificate may include a query URL.
  4. Is this certificate issued to the resource I’m requesting? If the browser requested one hostname but the certificate was issued to a different server name, the browser (or mobile app) shouldn’t allow the connection to proceed.
  5. Is this certificate issued by a certification authority I trust? The certification authority (CA) included with the mobile device or web browser is stored in a structure called a certificate store. The list of trustworthy authorities usually numbers in the tens of CAs up to 100 or more CAs. This means that any of these CAs (companies, organizations, or government entities) could issue a certificate for any server name. In practice, the CA is supposed to vet the requester of a certificate. In practice, the CA is only supposed to issue certificates to trustworthy people who actually represent the organization the certificate is assigned to. In practice, the CA has your personal privacy as its primary concern. In reality, we’ve seen a small number of violations of these practical considerations.
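You can exercise several of these checks yourself with openssl. The sketch below (assuming OpenSSL 1.1.1 or later for the -addext flag) generates a throwaway self-signed certificate as a stand-in for a server-presented cert, then inspects the validity window, intended purpose, and issued-to name (checks 1, 2, and 4):

```shell
# throwaway self-signed cert; a real check would inspect the server's cert
openssl req -x509 -newkey rsa:2048 -nodes -keyout tls-key.pem -out tls-cert.pem \
    -days 30 -subj "/CN=www.example.test" \
    -addext "extendedKeyUsage=serverAuth" 2>/dev/null

openssl x509 -in tls-cert.pem -noout -dates                       # 1. validity window
openssl x509 -in tls-cert.pem -noout -purpose | grep "SSL server" # 2. usable as a TLS server cert?
openssl x509 -in tls-cert.pem -noout -subject                     # 4. issued-to name
```

Checks 3 and 5 depend on data outside the certificate itself: the CA's revocation infrastructure and the client's trust store.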

To put this in perspective, iOS 9 trusts 218 CAs:

$ curl -s | grep '</tr><tr><td>' | wc -l


We’ve seen some attacks in the past related to certificate manipulation: complete CA compromises used to issue unauthorized certs, CAs tricked into handing out certs, and users failing to protect CA-issued private keys, which resulted in unauthorized certificates being issued. Below I cite two reported instances (India CCA and TURKTRUST) where government organizations issued certificates for servers in an unauthorized fashion; these two incidents were limited to unauthorized host certificates rather than full CA compromises. Also, many organizations using HTTP proxies will configure CA certs on client endpoints, allowing transparent interception of HTTPS communication.

We’ve seen theoretical and in-the-wild attacks against TLS and its related certificate authority infrastructure and organizations. Why, there’s even an RFC, “Summarizing Known Attacks on Transport Layer Security (TLS) and Datagram TLS (DTLS)” (RFC 7457).

Some of my favorite failures are below. I like studying failure; it helps us to improve.

2001 : VeriSign issues Microsoft certificates to unauthorized party:

2008 : MD5 Collision Construction allows rogue CA certs

2009 : TLS Renegotiation attack

2011: DigiNotar

2011: Duqu – Code Signing used in malware

2013 : TURKTRUST – Unauthorized certs

2014 : India CCA – Unauthorized Google certs

2014: Heartbleed

2014 : goto fail;

2015 : Microsoft revokes certs associated with

In many of the cases listed above, some fundamental component of the protocol or its implementation undermined the security. Several of the attacks involved entirely fraudulent certificates. You can’t have endpoint protection when there is a broken implementation on the endpoint (as in Apple’s goto fail;) or in the server OS (as in Heartbleed). But we can address the frequent (2001, 2009, 2011, 2013, 2014, 2015) scenario of fraudulent certificates on the network by requiring our mobile applications to trust only the specific certificates we intend for use.

Certificate Pinning

Certificate pinning is the application-specific requirement that some specific certificate or CA be used for a TLS connection, rather than accepting any of the CAs the phone trusts. Pinning can also remove the user from the negotiation; it is most effective when the user is unable to override it.

For protecting your data, this is a useful thing. Is it foolproof? Certainly not. For example, there’s a tweak available for jailbroken iPhones called SSL Kill Switch, a “blackbox tool to disable SSL certificate validation - including certificate pinning - within iOS Apps.” Instead of falling victim to the perfect-solution fallacy, I recommend developers use certificate pinning to minimize the likelihood of data interception, while acknowledging that a jailbroken or rooted endpoint is not going to provide the same data protection as a non-rooted device.
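The pins themselves (as used by TrustKit and by HTTP Public Key Pinning) are typically base64-encoded SHA-256 hashes of the certificate’s SubjectPublicKeyInfo. As a rough sketch, here is how one can be computed with openssl, using a throwaway self-signed certificate in place of a real server’s:

```shell
# throwaway cert standing in for the server's; against a live server you
# would fetch the certificate with `openssl s_client` first
openssl req -x509 -newkey rsa:2048 -nodes -keyout pin-key.pem \
    -out pin-cert.pem -days 1 -subj "/CN=pinned.example.test" 2>/dev/null

# SubjectPublicKeyInfo -> DER -> SHA-256 -> base64: the pin value
openssl x509 -in pin-cert.pem -pubkey -noout \
    | openssl pkey -pubin -outform der \
    | openssl dgst -sha256 -binary \
    | base64
```

Pinning the SPKI hash rather than the whole certificate lets the server renew its certificate without breaking the pin, as long as the underlying key pair stays the same.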

Cert pinning provides your application with a layer of certificate control, allowing you to specify which certificates you’ll accept and substantially limiting the likelihood of unauthorized information disclosure or alteration during network transmission. One scenario where pinning may be undesirable is if your organization uses a privately issued CA to introduce a man-in-the-middle (MITM) TLS inspection point for data loss prevention. Google implements one of the largest certificate pinning deployments in the world via its Chrome browser, and addresses this nuance by allowing private trust store CAs to override pinning while prohibiting public trust store CAs from violating it.

Here’s a simple example of how certificate pinning could help you protect your organization’s data. A user, alibo, reported on Google Groups that Google Chrome was displaying a certificate error. As it turned out, the certificate pinning capability in the Chrome browser had identified unexpected certificates being used to sign for servers in the domain. That was a public example of detecting the DigiNotar compromise.

Stay Tuned…

In my next blog installment, I’ll discuss the specific implementation of certificate pinning in iOS applications using TrustKit. Let me know on Twitter (@CCrowMontance) if you found this useful.

-Chris Crowley

P.S. If you like this kind of thing, you really should consider joining me for the SANS Security 575 course on mobile device penetration testing and security in New Orleans in January. We call the event SANS Security East, and it’s gonna be an awesome event full of great strategies, tactics, techniques, and practical advice on securing your organization’s mobile devices! I hope you can make it!

What’s the Deal with Mobile Device Passcodes and Biometrics? (Part 2 of 2)

By Lee Neely

In the first installment of this 2-parter, I discussed  the use of mobile device  fingerprint scanners to unlock the device.  As a follow-up, I’d like to discuss how a developer can integrate the scanner into their applications.  This discussion may provide some insights into how to secure mobile apps, or even inspire some hacking ideas (this is a pen test related blog, after all).  At the end of the article below, I’ll discuss some ideas for compromising this type of environment.

In part one, I introduced the secure environment used to manage fingerprints. This environment goes by a couple of names, most commonly the Trusted Execution Environment (TEE) or the Secure Enclave. In both cases, the terms describe a separate environment, consisting of hardware and software, which performs specific tasks related to managing and validating trusted information, isolated from the primary device applications and operating system.


GlobalPlatform, an international standards organization, has developed a set of standards for the TEE’s APIs and security services. These were published in 2010, and the corresponding APIs between the Trusted Application and the Trusted OS were completed in 2011. The image below describes the TrustZone standard.

[Image: TrustZone standard]

With this published, we saw the introduction of implementations from Apple and Samsung using a secure micro-kernel on their application coprocessor. A TrustZone enabled processor allows for hardware isolation of secure operations. Rooting or Jailbreaking the device Operating System does not impact the secure micro-kernel.

Both platforms are using the same paradigm. There is now a Normal World (NW) and Secure World (SW) with APIs for NW applications to access information in the SW. In both cases the Fingerprint Reader/Scanner is connected to the SW, so the fingerprint scan itself cannot be accessed by NW applications. Both platforms have API calls that mask the details of the implementation from an application developer, making it simple to incorporate this technology into applications.

Apple Implementation Details

Apple introduced their Secure Enclave when they released the iPhone 5S which included Touch ID and their A7 processor. Apple creates a Secure Enclave using encrypted memory and includes a hardware random number generator, and a micro-kernel based on the L4 family with modifications by Apple. The Secure Enclave is created during the manufacturing process with its own Unique ID (UID) that reportedly not even Apple knows. At device startup, an ephemeral key is created, entangled with its UID, and used to encrypt the Secure Enclave’s portion of the device’s memory. Secure Enclave data written to the file system is encrypted with a key that is entangled with the UID and an anti-replay counter. During device startup, the Secure Enclave coprocessor also uses a secure boot process to ensure the software is verified and signed by Apple. In the event the Secure Enclave integrity cannot be verified, the device enters device firmware upgrade (DFU) mode and you have to restore the device to factory default settings.

Using Touch ID with applications:

Third-party apps can use system-provided APIs to ask the user to authenticate using Touch ID or a passcode. The app is only notified as to whether the authentication was successful; it cannot access Touch ID or the data associated with the enrolled fingerprint.

Keychain items can also be protected with Touch ID, to be released by the Secure Enclave only on a fingerprint match or entry of the device passcode. App developers also have APIs to verify that a passcode has been set by the user, and are therefore able to authenticate or unlock keychain items using Touch ID.

Apple APIs

After all that, it may seem like it would be hard to make a call to Touch ID for authentication. It isn’t: Apple publishes sample code and API documentation through the Apple iOS Developer Library. Note: Downloading the iOS SDK requires an Apple Developer Program membership.

This code sample Local Authentication Framework from Apple’s iOS Developer Library makes it pretty simple to add a Touch ID authentication prompt to an application:


[Image: Local Authentication Framework Touch ID code sample]

Samsung Implementation Details

Samsung introduced their Trusted Execution Environment in the Galaxy S III. Samsung uses a micro-kernel named MobiCore. Developed by Giesecke & Devrient GmbH (G&D), it uses the TrustZone security extension of ARM processors to create a secure program execution and data storage environment that sits next to the “rich” operating system of the device. This isolation creates the TEE. Secure applications that run inside MobiCore are called trustlets. Applications communicate with trustlets through the MobiCore library, service, and device drivers.

While third-party application developers can create their own trustlets, they need to be incorporated into Mobicore by G&D.

The following figure, published by G&D, demonstrates MobiCore’s architecture:

[Image: MobiCore architecture]

Using Samsung Fingerprint Scanner with applications:

Samsung’s fingerprint scanner can be used with Web sign-in and verification of a Samsung account. Additionally, it will be incorporated into the authorization process for the new Samsung Pay application.

The application workflow for fingerprint authorization is as follows:

  1. User opens app (that uses fingerprint)
  2. User goes to authorize something (such as payment)
  3. App launches fingerprint Trustlet
  4. Secure screen takes over and provides UI for reading fingerprint
  5. Fingerprint scanner accepts input
  6. Trustlet verifies fingerprint by comparing it to stored scan data from registration and provides authorization back to App
  7. Authorization is completed
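The workflow above can be simulated with a toy model. This is only a conceptual sketch in Python, not Samsung’s API: the Trustlet class, the SHA-256 template derivation, and the token format are invented for illustration. The point it demonstrates is that the Normal World app receives only an authorization result, never the scan data:

```python
import hashlib
import secrets

class Trustlet:
    """Toy stand-in for a Secure World trustlet: it holds the enrolled
    template and never reveals it; the Normal World gets only a verdict."""

    def __init__(self, enrolled_scan: bytes):
        # Store a derived template rather than the raw scan
        # (the real derivation is vendor-specific; SHA-256 is illustrative).
        self._template = hashlib.sha256(enrolled_scan).digest()

    def verify(self, candidate_scan: bytes):
        """Compare inside the 'Secure World'; hand back an opaque
        authorization token on success, never the scan data itself."""
        if hashlib.sha256(candidate_scan).digest() == self._template:
            return secrets.token_hex(16)
        return None

# The Normal World app can only ask for a verdict:
trustlet = Trustlet(enrolled_scan=b"enrolled-ridge-pattern")
assert trustlet.verify(b"enrolled-ridge-pattern") is not None   # authorized
assert trustlet.verify(b"someone-else") is None                 # rejected
```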

Samsung API

To access fingerprints on Samsung devices, you use the Pass SDK. It works with both the swipe (S5, Note 4) and touch (S6) sensors. The Pass SDK not only checks a fingerprint match; it also provides mechanisms to register fingerprints and to throw a handled exception in the event your application is run on a device that doesn’t support a fingerprint scanner.
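The capability-check-plus-handled-exception pattern the Pass SDK encourages looks roughly like this Python sketch (the exception name, function, and device dictionary are illustrative stand-ins, not Pass SDK identifiers):

```python
class FingerprintUnavailableError(Exception):
    """Raised when the device has no usable fingerprint scanner,
    mirroring the handled exception the SDK lets you catch."""

def authenticate_with_fingerprint(device: dict) -> bool:
    # 'device' stands in for the SDK's capability query and scan result.
    if not device.get("has_fingerprint_scanner", False):
        raise FingerprintUnavailableError("no fingerprint scanner present")
    return bool(device.get("scan_matches_enrolled", False))

# On unsupported hardware, fall back gracefully instead of crashing:
try:
    result = authenticate_with_fingerprint({"has_fingerprint_scanner": False})
except FingerprintUnavailableError:
    result = None   # e.g., fall back to password authentication
assert result is None
```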

Sample code from the Pass SDK to authenticate with a fingerprint:

[Image: Pass SDK fingerprint authentication code sample]


So now that you know how the architecture works and how to access it, the question becomes: how can this be compromised? What are the risks?

The first option is to directly attack the fingerprint database. The FireEye team found that HTC was storing the fingerprint database as a world-readable, world-writable file, which means that any application could read or modify its contents. Patches have reportedly been issued to address this.
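The HTC flaw was a simple file-permission mistake, and checking for that class of problem is straightforward. Here is a small POSIX-flavored Python sketch; the function name is mine, and the temporary file merely stands in for the fingerprint database:

```python
import os
import stat
import tempfile

def world_accessible(path: str) -> bool:
    """True if 'other' users can read or write the file -- the class of
    flaw FireEye reported in HTC's fingerprint database storage."""
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_IROTH | stat.S_IWOTH))

# Demonstrate with a throwaway file standing in for the fingerprint db.
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o666)              # world-readable, world-writable: vulnerable
assert world_accessible(path)
os.chmod(path, 0o600)              # owner-only: the intended permission
assert not world_accessible(path)
os.remove(path)
```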

The second option is to interfere with the communication between the fingerprint reader and the SW. If you could attach a process to that device, you could collect the fingerprints of everyone accessing the device or, even more insidiously, insert your own fingerprint into the legitimate fingerprint database. Again, the FireEye team found cases where this was possible, and the vendors issued patches.

The third option is to modify the device driver that sits between the NW and SW portions of the device, potentially altering the fingerprint scan check to always return true. Other data stored only in the SW cannot be faked so easily, so there are limits to how successful this would be. As luck would have it, another presentation at Black Hat 2015 found an exploit.

Di Shen’s presentation from Black Hat 2015 shows how an insecure implementation of the TEE OS and driver in Huawei devices with the HiSilicon chipset can be used to alter trusted memory, inject shell code, access fingerprint data, and otherwise compromise the TEE. These compromises do require kernel (rooted) access to be successful.

A common thread in addressing these weaknesses is patches and updates. Android users have struggled to get these updates consistently and in a timely fashion. On August 5th, Google announced monthly Over the Air (OTA) security updates for their Nexus devices, and Samsung announced a fast-track OTA security update process. Samsung is negotiating with mobile operators worldwide to ensure the process will be successful. With luck, other device manufacturers will follow suit. An important thing to note is the backward compatibility of the updates. Typically, Android devices only receive updates for eighteen months, if at all. By contrast, Apple iOS 8.4.1 is still supported on the four-year-old iPhone 4s. Once a platform with updates is chosen, it is still necessary to check for and apply the updates that are distributed.

To stay informed about security updates, subscribe to data feeds that carry announcements of updates and fixes. Google recently created an Android Security Updates Google Group. Apple announces their updates through their security page and security-announce mailing list.


When considering the use of biometrics either for device or application authentication, a risk-based decision has to be made, as with any external authentication service, whether to trust it. Fingerprint readers are still subject to bypass techniques as were described in part one. The simplest mitigation is to rely on device and application integrity checks which must pass prior to allowing the external authentication to either be enabled or succeed. Additionally, being aware of the physical security of the mobile device is paramount. Protect it like you would protect your wallet.

If the risk is acceptable, users are again provided with an authentication mechanism that cannot be merely observed to circumvent. The authentication information itself is securely stored, the application verifies it via calls through a secure API designed to detect tampering, and the application is no longer managing passwords.

One advantage an application developer has is they may elect to have layered authentication. If the target is a corporate application, working with the mobile device administration team to ensure a device level passcode is also configured would then ensure two levels of authentication prior to allowing access to your application.

One more option an application developer has is to require added authentication, within the application, prior to allowing access. Perhaps, that might be a PIN plus a Fingerprint. While this is appealing from a security perspective, from an end-user perspective this may be a tough option to accept.


In short, adding support for biometric authentication has a nominal impact and, as such, making a pitch for using it when developing applications should be an easy option for management to accept. I suggest seriously considering it for your next mobile application development project.

-Lee Neely

Want to learn more on this topic? You really should check out SEC575: Mobile Device Security and Ethical Hacking, an amazing course covering mobile device security, attacks, and much more!

Apple iOS Security Guide:

Brute Force Android PIN:

Samsung Fingerprint 4.4:

Samsung S5 Fingerprint Hacked:

Samsung/Apple face-off:

Improvements to iPhone 6 Fingerprint scanner:

CCC how-to make a fake fingerprint:

FAR/FRR Graph:

Entrust Blog:

iOS Authenticate with Touch ID:

Samsung Pass SDK:

Samsung KNOX Security Overview:

MobiCore description:

BlackHat Presentation:

Attack TrustZone:



What’s the Deal with Mobile Device Passcodes and Biometrics? (Part 1 of 2)

By Lee Neely


Mobile device administrators and end users need to be more cognizant of the risks of allowing unauthorized access to their smartphones and take steps to raise the bar on accessing those devices to mitigate those risks.

This is part one of two articles on securing mobile device access. In this article, I am going to focus on securing access to the physical device itself. In part two, I will discuss on-device security APIs and how one would know they are still in place.

The case for a strong passcode

When the first smartphones were introduced, they were corporate owned, managed, and secured to business standards. Device access was on par with accessing corporate laptop systems. The number and variety of applications, and the amount of personal or sensitive information stored on the device, were far less than we see in modern iOS, Android, Windows Mobile, and other devices. While there were some devices owned and managed by end users, this was not as significant as it is today. The seminal events that tie to the explosion of content, applications, data, and personal use are the introduction of the iPhone in 2007 and Android in 2008. With this change in use and ownership, device administrators and end users now need to worry about access to the information on the smartphone, usually with less control over the device itself.

When the first iPhone lock-screen bypass “bug” was announced in 2008, I found the reaction unexpectedly blasé. Further research indicated that, at the time, the majority of users weren’t setting a passcode anyway, so there was no lock screen to bypass. A 2014 survey by Consumer Reports found that while 47% of users surveyed were setting a passcode, gesture, or other mechanism to lock the device screen, 77% of those users were only using a four-digit PIN. Further, users were unlikely to do anything more, such as configuring an automatic wipe after a specified number of passcode failures. Observation indicates that while setting a passcode is becoming more standard, a four-digit PIN remains the de facto setting chosen by users.

While the use of a passcode is better than a device that has no screen lock, a four digit PIN is relatively simple to bypass using one of the following techniques:

  • Observing the numbers used:
    • SANS instructor Chris Crowley has found that he can reliably observe the numbers used in a four digit PIN from across a room, quite a feat but not at all implausible.
  • Finding a likely four-digit PIN
    • Searching social media, phone directories, and other on-line sources for four digit numbers of significance to the given user.
    • Users typically pick passcodes that are easy to recall, such as a birth or anniversary date, a home address, or a simple pattern (1111, the four corners, etc.). For more on the most common PINs, check out
  • Brute force:
    • Devices that connect to an iOS device, such as an IP-Box, can try all 10,000 four digit PINs in about 17 hours.
      • Users often don’t configure the device to wipe after ten failed passcode attempts. Additionally, there were several scenarios prior to iOS 8.3 that allow the device wipe on ten failed attempts to be bypassed.
    • Devices that connect to an Android, such as a  USB Rubber Ducky  (using a USB OTG cable) can try all 10,000 four digit PINs in about 16 hours.
      • Default behavior on the Android is forcing a 30 second pause after five failed passcode attempts, so the brute force tools simply pause as well.
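The quick arithmetic behind those brute-force figures, sketched in Python (the per-attempt times are derived from the totals quoted above):

```python
# Rough arithmetic behind the brute-force timings quoted above.
pins = 10_000                                    # all four-digit PINs

ios_secs_per_try = 17 * 3600 / pins              # IP-Box against iOS
android_secs_per_try = 16 * 3600 / pins          # USB Rubber Ducky against Android

print(f"iOS: ~{ios_secs_per_try:.1f} s per attempt")          # ~6.1 s
print(f"Android: ~{android_secs_per_try:.1f} s per attempt")  # ~5.8 s

# On average an attacker finds a specific PIN halfway through the keyspace:
print(f"expected iOS search time: ~{17 / 2:.1f} hours")       # ~8.5 hours
```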

The argument for stronger authentication

So, as the four-digit PIN can be compromised, and with current uses including email, shopping, banking, payment, contacts, notes, and social media applications, many configured to log in as the owner without prompting for additional credentials, a strong device passcode becomes increasingly important.

The need for a strong passcode to access their smartphone, and all the potentially sensitive information on it, is not hard for users to understand. Yet device administrators get little support when they suggest strong passcodes. In my experience, most users get stuck on the idea of using, and more specifically entering, a complex password on a device they interact with constantly and control personally. I have found that if the more secure option is too difficult, users will find simpler, less secure ways to get the job done. Entering a long, complex passcode every time they want to use their smartphone is not quick or easy; an alternative secure authentication mechanism is needed.

A mentor once told me “We hire the smartest people on the planet to solve problems; don’t be a problem they solve.”


Enter biometric authentication: authentication based on some intrinsic characteristic of the user. Biometrics used for authentication include fingerprints, retina scans, face recognition, and voice prints. Enabling biometric authentication can offset the impact of requiring a strong passcode, a win for both device administrators and users, while also reducing the occasions when a passcode can be observed and subsequently entered by someone else.

When Android OS 4.1 was introduced, an option appeared to allow the smartphone to unlock when it saw the configured user’s face. The big challenge with that option was that a photo of the user’s face worked as well as the real face. Updates were made to require the user to blink, differentiating a live face from a photograph. This option generally means that the device camera is on and watching full time, which may not always be appropriate or desirable. I recommend disabling this option.

Another popular form of biometric authentication has emerged, namely fingerprint readers (cue the bionic sound effect from the Six Million Dollar Man television series). These appear in one of two styles, a finger press or a finger swipe. The finger-press style has seen a large degree of user acceptance because it is easier to use than either a finger swipe or passcode entry and has fewer false rejections.

While biometric authentication is not free from issues, it has removed most of the situations where a user has to enter the passcode; and therefore is seeing increased adoption and acceptance.

Biometric Sensitivity

It is important to remember that regardless of the biometric system used, if it determines that users are not who they claim to be, they will need to enter the device passcode. Vendors work hard to adjust the False Accept Rate (FAR) and False Reject Rate (FRR) to ensure that fake users are turned away and genuine users are able to authenticate. The Crossover Error Rate (CER), sometimes called the sensitivity or the Equal Error Rate (EER), is the rate at which the FRR and FAR are equal. The graph below illustrates this phenomenon. When a vendor delivers a product that operates with a good CER, user acceptance is high and widespread adoption is possible.
[Image: FAR/FRR crossover graph]
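The FAR/FRR trade-off can also be illustrated numerically. In this Python sketch the two error curves are invented for illustration, not measured sensor data; the search shows how the crossover point falls out of the two rates:

```python
# Toy error curves over a match-score threshold t in [0, 1]; the shapes are
# illustrative, not measured data from any real sensor.
def far(t):
    """False Accept Rate: falls as the threshold is tightened."""
    return (1 - t) ** 2

def frr(t):
    """False Reject Rate: rises as the threshold is tightened."""
    return t ** 2

# The Crossover Error Rate (CER/EER) is where the two curves meet.
thresholds = [i / 1000 for i in range(1001)]
cer_t = min(thresholds, key=lambda t: abs(far(t) - frr(t)))
print(f"crossover at threshold {cer_t:.2f}, error rate {far(cer_t):.2f}")
```

A vendor tuning a real sensor is effectively choosing where on these two curves the product operates; a lower crossover point means fewer errors of both kinds.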

Why focus on Android/iOS?

According to data from International Data Corporation (IDC) in Q4 of 2014, 96% of the worldwide smartphone market is comprised of Android and iOS devices. Given the substantial margin over other options, I am going to focus on iOS and Android device solutions.


Current Solutions

The iPhone 5s and Samsung Galaxy S5 both introduced fingerprint readers to enable biometric authentication. In both cases, groups such as the Chaos Computer Club (CCC) devised mechanisms for creating a fake fingerprint that would unlock the device. Joshua Wright, author of the SANS Institute SEC575 course on Mobile Device Security and Ethical Hacking, illustrates the CCC process for creating a fake fingerprint to trick Touch ID in the course excerpt below. A similar process can be used against Samsung devices. With the introduction of the iPhone 6 and Samsung Galaxy S6, the sensitivity of the readers has increased, meaning marginal fake fingerprints will no longer work; at times the reader may even pick up the fingerprint behind the fake, again causing the fingerprint to be rejected. This change in sensitivity also means some legitimate fingerprints will be rejected.


Protection of Biometric information

One of the most important security measures in biometric authentication is protection of the biometric information. In both Samsung and Apple implementations, the fingerprint scan is not actually stored. Instead, a mathematical representation of the fingerprint is made which is then stored in a secure location on the device that is not replicated to the cloud or backups. This is of critical importance since, unlike passwords, fingerprints cannot be simply changed if they are compromised.

Samsung Fingerprint Scan Data Security

Samsung devices use a trusted execution environment to protect fingerprint data. Per Samsung, the fingerprint framework works as follows:

  • Actual fingerprints or biometric data are not stored. A hash is created from the scan, and the resulting hash is stored in TrustZone, the ARM architecture TEE (Trusted Execution Environment)
  • Fingerprint scanner & UI are in TEE
  • Fingerprints cannot be accessed outside the TEE
  • Fingerprint scanner hardware cannot be accessed outside TEE
  • Scanner is connected such that only TEE can access it physically
  • TEE takes over display to show trusted UI for input
  • Fingerprint data is not accessible outside TEE
  • Trustlet provides results of scan, possibly some key protected by successful scan, but no scan information

Apple Touch ID Fingerprint Data Security

On Apple iOS devices, fingerprint representations are stored in the Secure Enclave. The Secure Enclave is a co-processor with its own boot and update processes, as well as encrypted memory with a unique key that is assigned during fabrication. The Secure Enclave maintains the security of the data contained within it, even if the main kernel is compromised. Apple provides limited System APIs for applications that wish to use Touch ID for authentication, restricting access to how the fingerprint reader is used.

Biometric Impacts on Daily Use

Using Biometric authentication has impacts, both good and bad, on how users access their devices and on privacy.

While the fingerprint readers appear to eliminate the need to enter the device passcode, there are still a few times you need to enter it. Here is a comparison of where Touch ID and Samsung’s Fingerprint Scanner still require the passcode:

Scenario | Apple Touch ID | Samsung Fingerprint Scanner
Reboot/power cycle | Must enter | Only required if on-device encryption is enabled
Idle more than 48 hours | Must enter | Can still use fingerprint
Five unsuccessful fingerprint match attempts | Must enter | Must enter (new with Samsung’s S6; previously, no limit was set)
Enrolling/managing fingerprints | Must enter; a passcode must exist before enrolling; maximum five fingerprints stored | Either may be used; requires a “Backup Password” of at least six characters, including at least one number; maximum four fingerprints stored
Device receives a remote lock command | Must enter passcode | Must use the configured lock password (remote lock changes the screen lock to password instead of fingerprint)

Device administrators and users should consider the table above when debating the use of biometrics in conjunction with strong passcodes.

Beyond the fingerprint replication threats publicized by CCC, using fingerprint authentication also introduces privacy concerns for some users. For example, Law Enforcement Agencies (LEA) can compel a suspect to turn over “something you have,” but the restrictions on compelling disclosure of “something you know” are much stricter. Applied to biometric authentication, LEA can compel you to unlock your phone with a fingerprint, but do not have the US legal authority to compel you to reveal a PIN or password used to lock a device.

If the stored fingerprint is connected to Apple Pay, PayPal, or Samsung Pay on your mobile device, potential impacts of using a fake fingerprint are increased.

Peace of Mind – Practical Recommendations

It is important to remember that creating a fake fingerprint takes time, resources, and skills. This means the chances of an attacker or street thief getting into a user’s device before the user or the device administrator has had a chance to remotely wipe it are low. Additionally, if the device is protected by a strong passcode that can’t be easily guessed, falling back to compromising the device passcode is also non-trivial.

Other things that must be done:

Enable on-device encryption on Android devices. (This is always on for iOS devices.) This ensures the data on the device cannot be retrieved by merely dumping the NVRAM to another computer for analysis or use.

In addition to any corporate Mobile Device Management (MDM) solution in use, verify the device is setup for either Find My iPhone  or Android Device Manager. This gives the users the ability to remotely locate, ring, lock and/or wipe their device in the event something happens.

Configure the device to wipe after a pre-set number of failed passcodes. Don’t give bad guys unlimited license to try passcodes. Device administrators should set this number high enough to account for difficulties entering a passcode so as to avoid accidental data wiping. I suggest ten. If that is an unacceptable risk, set it to no lower than five.

Lastly, regardless of using a passcode, fingerprint, or any other authentication mechanism, physical possession of the device is always important. Train users to treat their smartphone as they would their wallet: don’t leave it where others can easily take it and use caution revealing authentication mechanism details.

Conclusion for Part 1

Following these recommendations, access to a user’s device and the corresponding sensitive information on it becomes much harder. Further, shoulder surfing a fingerprint and then replicating it to access a device is much harder than simply capturing the PIN or password a user is entering.

Raising the bar on accessing the sensitive information on mobile devices, particularly as we find more places they enable actions and protect a greater amount of our personal information, helps us be a little more certain about who has access to that information. But, this approach also raises the question of how else these security mechanisms could be used.

Part 2

In part two, we will talk about how applications access fingerprint data, how the security of those APIs are maintained, and how one would know they were still in place.

-Lee Neely

Want to learn more on this topic? You really should check out SEC575: Mobile Device Security and Ethical Hacking, an amazing course covering mobile device security, attacks, and much more!



Modifying Android Apps: A SEC575 Hands-on Exercise, Part 2

By Joshua Wright



In the last installment of this article, we looked at the IsItDown application, and how it is designed not to run in the Android Emulator, and to include a super-annoying banner ad. We showed how the Apktool utility can be used to decompile an Android APK file, and how we can evaluate and modify the produced Smali code to manipulate the application’s functionality.

In this final installment, we’ll re-build the IsItDown application with our Smali file changes, then we’ll generate the necessary keys and sign the application so that it can be used in the modified form on a virtual or a physical Android device. We’ll also look at techniques that can be applied to defend against this hack, with advice that you can pass on to Android developers.

Rebuild the App

With the completed changes to the Smali files saved, we can re-build the app. We return to the IsItDown directory created by Apktool when we decompiled the app, and rebuild:


As long as there are no syntax errors in your modified Smali code, Apktool will build a new Android APK file in the dist directory:


Sign the App

We can’t install and run this new version yet though, because the Android platform requires signed application packages. Never fear though – the signature isn’t anything that is checked (by anyone, ever), and can be self-signed.

First, use the JDK tools to run the keytool utility and create your own signing key as shown here:


Note that the path to the keytool.exe utility might be different for you depending on your version of the JDK. Feel free to enter any values you find entertaining here; just make the alias “IsItDown” as shown. Next, use the jarsigner.exe utility to leverage the key and generate a signed application package:


With the jarsigner utility, specify the location of your keystore file, the APK filename, and the keystore alias “IsItDown”. Enter the password you supplied when you generated the keystore and ignore the warning. Voila! A signed Android package!

Run the App

Finally, we get to reap the benefits of our work. Uninstall the old version of IsItDown, and install the modified version.


Next up, run the app and see your changes in action. AWESOME!


Defend the App

Now that we’ve seen the steps to manipulate the app, we should take a look at what developers can do about this. As a pen tester, it’s important to always leave your customer with more than “this is broken”: we need to address the audience and give them advice on what to do about the issues we’ve identified.

Here… it’s hard. The bottom line is that anyone can modify and manipulate an Android app or any software (on any device not using end-to-end trusted execution policies). The developer can do things that make it harder for an attacker, but ultimately there is no reasonable way to stop a determined attacker from manipulating the application.

One easy opportunity for Android developers is to check the integrity of the application signature. When we generate a new APK file with Apktool, we have to also generate a self-signed key and sign the APK file before we can install it. Since we don’t have the original developer’s signing key, we can never reproduce the original signature information.

A developer can periodically check the certificate information for the application at runtime. I first saw this technique in Android Security Cookbook by Keith Makan and Scott Alexander-Bown (2013, Packt Publishing, page 179). Their code doesn’t work on newer Android API levels, but I’ve modified it, with a complete example available at

Essentially, the developer generates the signing certificate, and calculates a SHA1 hash of the certificate, embedding it in the application source:


Periodically during the application execution, the developer calls getPackageInfo() with the PackageManager.GET_SIGNATURES argument to retrieve the app signing certificate information, comparing the embedded SHA1 certificate hash with the hash of the current certificate:
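The comparison at the core of that check can be sketched in Python; this is a platform-neutral illustration, not the book’s Java code, and the certificate byte strings are stand-ins (on Android the real bytes come from the getPackageInfo() call described above):

```python
import hashlib

# SHA1 of the developer's signing certificate, embedded at build time.
# (The byte string is a stand-in; a real app embeds the hash of its
# actual signing certificate.)
EMBEDDED_CERT_SHA1 = hashlib.sha1(b"developer-signing-cert-der").hexdigest()

def signature_intact(current_cert_der: bytes) -> bool:
    """True only if the app still carries the developer's certificate."""
    return hashlib.sha1(current_cert_der).hexdigest() == EMBEDDED_CERT_SHA1

# A re-signed (tampered) APK presents the attacker's certificate instead:
assert signature_intact(b"developer-signing-cert-der")
assert not signature_intact(b"attacker-self-signed-cert-der")
```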


This is… OK. An attacker who can modify the Smali code can also modify the embedded SHA1 of the developer’s certificate to match their own signing certificate, or just disable the certificate checks altogether. The only “superior” defense is to move sensitive or critical code into a natively-compiled library with the Android Native Development Kit (NDK). Here, the developer would re-write their Java code in C/C++ to make it harder to manipulate the app’s functionality (harder, but still not impossible, since many attackers can manipulate native libraries too).


In this article we’ve looked at the techniques for manipulating Android apps. Our IsItDown target app was a straightforward target, but you can use the same techniques on many other Android apps with just a little added patience.

Remember to use these techniques for good, and not evil. That game you like to play was written by someone who needs to make money too; turning off their banner ads or multiplying the “coin” value x100 may be fun for you, but it makes them think twice about building their next project. Use these techniques ethically.

As a parting note: my friends who write iOS apps sputter about how iOS apps are natively compiled and can’t be reverse-engineered this same way. Whenever I do iOS reverse-engineering, I take a look for the Android version of the same app. Since Android reverse-engineering is so much easier, we can spend a little time looking at the Android code before moving onto the iOS app to get a feel for how the app functions. If I’m attacking the back-end servers, I don’t have to reverse-engineer the iOS version of the app; the Android version will do just fine.




Upcoming SANS Special Event – 2018 Holiday Hack Challenge


SANS Holiday Hack Challenge – KringleCon 2018

  • Free SANS Online Capture-the-Flag Challenge
  • Our annual gift to the entire Information Security Industry
  • Designed for novice to advanced InfoSec professionals
  • Fun for the whole family!!
  • Build and hone your skills in a fun and festive role-playing video game, by the makers of SANS NetWars
  • Learn more:
  • Play previous versions for free 24/7/365:

Player Feedback!

  • “On to level 4 of the #holidayhackchallenge. Thanks again @edskoudis / @SANSPenTest team.” – @mikehodges
  • “#SANSHolidayHack Confession – I have never used python or scapy before. I got started with both today because of this game! Yay!” – @tww2b
  • “Happiness is watching my 12 yo meet @edskoudis at the end of #SANSHolidayHack quest. Now the gnomes #ProudHackerPapa” – @dnlongen

Modifying Android Apps: A SEC575 Hands-on Exercise, Part 1

By Joshua Wright



As a security professional, I’m called on to evaluate the security of Android applications on a regular basis. This evaluation process usually takes on one of two forms:

  • Evaluate app security from an end-user perspective
  • Evaluate app security from a publisher perspective

While there is a lot of overlap between the two processes, the difference effectively boils down to this: whose risk perspective does my customer care about the most?

When an app publisher wants me to evaluate the security of their Android app, I need to determine if the app employs sufficient controls to protect the required app functionality and publisher brand. Often, this requires me to identify critical app components (critical for my customer, such as how they make revenue from the app, or the integrity of the data transmitted by the app), and determine if I can manipulate them in interesting ways.

As a guide, I developed an Application Report Card system to steer an analyst through the application evaluation process. One of these tasks is to manipulate the Android application by modifying the low-level Smali bytecode, removing the intended publisher functionality.

Blog 1a

In my SEC575: Mobile Device Security and Ethical Hacking course, we use a custom application, IsItDown, as our evaluation target. In the exercise, participants have to install and run the application to identify the app functionality and constraints, then decompile, modify, reassemble, and re-sign the application so the app can be used without restrictions.

In this 2-part article, we’ll take a peek at the SEC575 hands-on lab exercise for modifying Android apps. In the first part, we’ll take a look at how we can leverage the Apktool utility to decompile an Android app, followed by a quick primer on reading Smali code. In the second part, we’ll generate custom keys to sign the modified application so that it can be used on a virtual or physical Android device, and look at defense techniques.

Feel free to download the IsItDown application and follow along. You’ll also need the following tools for your system:

Evaluate the App


After starting a virtual Android device (or connecting your physical Android device over USB), use the adb utility to install the IsItDown.apk application as shown above. Next, run the app and experiment.

IsItDown is a basic IsItDownForJustMeOrForEveryoneElseToo app… except it’s only testing from your mobile device. Frankly, it’s not that useful, but it serves our needs fine.

If you are working from a virtual device, you’ll quickly be disappointed. Not only does IsItDown have an obnoxious banner ad on the bottom of the page, it only teases you with the promise of functionality before it tells you to go away.


Our goal here is to modify the Android application to allow it to run in the Android emulator, and to remove that annoying banner ad.

Decompile the App

First, we’ll use Apktool, written by Ryszard Wisniewski and Connor Tumbleson, to convert the IsItDown.apk file into Smali code. Smali code is a representation of the app code using the Android Dalvik opcodes: essentially an intermediate representation of the code between the original Java source and the processor-specific assembly instructions. Apktool allows us to take the Android APK file, convert it into a Smali source representation that can be modified, and then recompile it back into a new APK file.

Note: The pedantic reader will no doubt be questioning my use of “decompile” as a verb here. Converting an Android APK file to Smali code is not quite decompilation, but it’s not quite disassembly either. It’s somewhere in the middle. I’ll keep using decompilation, since you are getting a high-level representation of the app code in the Smali file, but if you are more comfortable with the disassembly verb, feel free to search+replace this article.

Apktool isn’t so much a hacking tool as it is a mechanism to evaluate Android applications. Sure, people with ill intent can manipulate an application for evil purposes, but it can also be used for Android application troubleshooting, and for the localization (adding local language support) for applications as well. Don’t use it to do evil things, OK? (Ed.: I second that.)

Blog 2a

Make sure you have apktool.jar (renamed from apktool-2.0.0.jar), apktool.bat and IsItDown.apk all in the same directory (or put apktool.jar and apktool.bat in your system PATH somewhere). Next, use apktool.bat to decompile the APK file, as shown here:

Blog 4

Apktool will generate a new directory IsItDown that holds the application resources, AndroidManifest.xml declarations, the original signature information from the developer, and the Smali code itself.


Modify the App

Browsing to the smali directory, we’ll see a directory structure created for the application package names (e.g. com/willhackforushi/isitdown/). For our simple application, only a handful of Smali files are generated, as shown here:

Blog 3a

Most of these Smali files include auto-generated code, so we don’t have to look through all of them. First, let’s search for a reference to the string “No emulator use permitted.” using the Windows findstr.exe utility:

Blog 4a

Here we see the string is in the MainActivity.smali file. This is unusual; typically, string values will be defined in the res/values/strings.xml file, making it easier to localize the application. Here, the developer just got lazy, and embedded the string directly in the Java source. This makes it a little easier for us to evaluate the Smali code though: simply open the MainActivity.smali file with your favorite editor and skip to the line where the string is defined.

Blog 6

At this point, if you’re not a developer, or have never seen Smali code before, you might be like “WTW? I’m supposed to read this?”. Have no fear! You just need to know a few things about Smali:

  • Smali uses declared registers as placeholders for objects and variables. v0 is a local register, p0 is a parameter register (e.g. something passed to the function/method)
  • Syntax in Smali is always destination, then source
  • Types are specified with a single capital letter at the end of an object or method reference:
    • V – Void
    • Z – Boolean
    • B – Byte
    • S – Short
    • C – Char
    • I – Int
    • J – Long (64 bits)
    • F – Float
    • D – Double (64 bits)
    • L – Object
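To make the mapping concrete, here’s a small (hypothetical) helper that renders a Java method signature as its Smali-style descriptor using the type letters above; objects use the Lpackage/Class; form:

```java
import java.util.LinkedHashMap;
import java.util.Map;

class SmaliTypes {
    // Primitive type descriptors from the table above.
    static final Map<String, String> DESCRIPTORS = new LinkedHashMap<>();
    static {
        DESCRIPTORS.put("void", "V");
        DESCRIPTORS.put("boolean", "Z");
        DESCRIPTORS.put("byte", "B");
        DESCRIPTORS.put("short", "S");
        DESCRIPTORS.put("char", "C");
        DESCRIPTORS.put("int", "I");
        DESCRIPTORS.put("long", "J");
        DESCRIPTORS.put("float", "F");
        DESCRIPTORS.put("double", "D");
    }

    // Render a method descriptor, e.g. isEmulator()Z. Types not in the
    // primitive table are treated as objects: Ljava/lang/String; etc.
    static String descriptor(String name, String[] paramTypes, String returnType) {
        StringBuilder sb = new StringBuilder(name).append('(');
        for (String p : paramTypes) {
            sb.append(DESCRIPTORS.getOrDefault(p, "L" + p.replace('.', '/') + ";"));
        }
        return sb.append(')')
                 .append(DESCRIPTORS.getOrDefault(returnType,
                         "L" + returnType.replace('.', '/') + ";"))
                 .toString();
    }
}
```

For example, IsItDown’s isEmulator() method, which returns a Boolean, appears in Smali as isEmulator()Z.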

A cheat sheet that explains common Smali opcodes and their functionality is also a useful thing to have. Here’s one from my SEC575 course!

Blog 8

Finally, the reference maintained by Gabor Paller at is awesome, and frankly, is how I learned to read Smali code. You should bookmark that link!

Looking at that Smali code, there is a lot of stuff we don’t care about. We know that the application shows us an error “Go away” and refuses to run. Logically, before that message is displayed, we should see some code that determines whether or not to display that error. Sure enough, check out line 350:

Blog 9

Line 350 invokes a method called isEmulator(), which returns a Boolean value (see the big “Z” at the end?). The Boolean result is moved to the v13 register, and then the if-eqz opcode determines if the value is 0 (or “False”). If v13 is equal to 0, then the code jumps to the cond_0 block. Otherwise, we get the nasty “No emulator use permitted.” message.
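In Java terms, the branch works out to something like this sketch (the real isEmulator() inspects device build properties at runtime; the version here is a stand-in so the control flow can be followed):

```java
class EmulatorGate {
    // Stand-in for the app's isEmulator() check; the real method would
    // inspect Build properties (e.g., Build.FINGERPRINT) at runtime.
    static boolean isEmulator(String fingerprint) {
        return fingerprint.startsWith("generic");
    }

    // Equivalent of the Smali: if-eqz v13, :cond_0 jumps to the happy
    // path when isEmulator() returned 0 (false); otherwise execution
    // falls through to the error message.
    static String checkDevice(String fingerprint) {
        if (!isEmulator(fingerprint)) {
            return "running normally";           // :cond_0 block
        }
        return "No emulator use permitted.";     // fall-through path
    }
}
```

Inverting the opcode to if-nez simply negates the condition in the if statement above.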

Not so hard, right? Knowing this, we can make a simple change to this code. Consider this small “fix”:

Blog 11

See what I did on line 354? I changed if-eqz to if-nez, effectively inverting the test. Now, even though isEmulator() still returns True on emulated devices, the code behaves as if the device were not an emulator, jumping to the cond_0 block.

This is not completely desirable, though, since the inverted test breaks the functionality for legitimate devices that install the modified IsItDown.apk file. You could modify the isEmulator() method to always return False as another option, but this is what I ended up doing:

Blog 12

On line 354 is the original if-eqz test, and then I added the inverted test immediately afterward. This way, regardless of what the isEmulator() method returns, the code always skips the “No emulator use permitted” message. It’s a little hackish, but it gets the job done. (Ed.: Did Tom Liston teach you that, Josh?)

Next up: removing that banner ad. Banner ads are usually displayed in a WebView object, which is effectively a little tiny web browser in the app. Searching for the string “http:” reveals two references that look like the source of our advertising:

Blog 13

Searching for that same string in the MainActivity.smali and MainActivity$1.smali files reveals code like this:

Blog 14

In this example, line 268 loads the reference to the WebView into the v2 register, while line 270 loads the ad URL into the v3 register. Line 272 loads the URL content into the WebView. Easy enough to change that behavior:

Blog 15

Simply commenting out the invoke-virtual opcode that fills the WebView is enough to stop the banner ad from loading. Repeat this step for both MainActivity.smali and MainActivity$1.smali, and you’re done!

Concluding Part One

In this first part of our article, we saw how Apktool can be used to decompile an Android application, and looked at the Smali code with a focus on changing the code to overcome Android emulator restrictions and to disable an annoying banner ad. In Part Two we’ll pick up where we left off and re-build the application and sign it with the Java Development Kit utilities so we can run the modified app on an emulated or real Android device. Finally, we’ll address defensive techniques that you can use in your next Android pen test report with suggestions to pass on to Android developers, and briefly discuss how Android testing can also be useful when evaluating iOS targets.

Until next time,




How Pen Testers Can Deal with Changes to Android SD Card Permissions

By Lee Neely & Chris Crowley

Recent updates to the Android OS have changed the permission model for external storage, and these changes will likely impact the way pen testers assess the actions and corresponding risks associated with applications, both malicious and benign, particularly when analyzing how they interact with external storage.

Consider this scenario: You are provided an application from an unknown third party to assess. Your assignment is to assess both the behavior and trustworthiness of the application. Because of the permission model changes, the application behaves differently when trying to access external storage than it would have in earlier releases of the Android OS.

In this article, we’ll provide information on how the permission model changed and some tips and techniques you can leverage when you are assessing an application in your next Android pen test.

What changed?

There were two changes which we will discuss separately. They have different impacts when assessing application behavior. Additionally, based on feedback after the KitKat release, Lollipop (5.0) introduced a new intent that allows application developers to return to the more familiar behavior. More on the Lollipop change later.

The first change, which has the greater impact on applications, was a change in the permission model for external storage. The second is a refinement in the treatment of the application private storage area on secondary storage, which could be a bonus for malware.


Android external storage is defined to be a case-insensitive filesystem with immutable POSIX permission classes and modes. External storage can be provided by physical media, such as an SD card, or by exposing a portion of internal storage through an emulation layer. (see )

Android versions prior to KitKat (4.4) provided a single access model for external storage. Access to this storage was protected by a single permission, WRITE_EXTERNAL_STORAGE. Starting in Jelly Bean (4.1), read access was protected with the READ_EXTERNAL_STORAGE permission. Prior to Jelly Bean, read access did not require any special permissions.

KitKat introduced two storage models for external storage: one known as primary storage, which is essentially unchanged, and another known as secondary storage. Primary storage is a part of the device internal storage, and the APIs for accessing it are unchanged. Secondary storage is provided by physical media, such as an SD card. Secondary storage implements new permissions, such that writes outside the application’s private storage area (/storage/extSDCard/Android/Data/[Package Name]) are not permitted without additional permissions. The concept of these application-specific directories was introduced in Froyo (2.2).

Permissions on external storage are synthesized. This can be confusing if you’re used to accessing SD cards without any concern for permissions, since the FAT file system they typically use doesn’t support permissions natively. Starting in KitKat, the owner, group and modes of files on external storage devices are synthesized based on directory structure. These synthesized permissions are accomplished by wrapping raw storage devices in a FUSE daemon.

The First Change

When Honeycomb (3.0) was introduced, a new paradigm for secondary storage access was also included, with the relevant additional permissions WRITE_MEDIA_STORAGE and READ_MEDIA_STORAGE. (Note the change from *_EXTERNAL_STORAGE.) These permissions allow read and/or write access across the secondary storage device and are granted to system, manufacturer and mobile operator applications. Initially, some device manufacturers also granted this on the fly to applications with the WRITE_EXTERNAL_STORAGE permission. With the KitKat updates, this behavior has disappeared, requiring applications to use new API calls to perform updates on items in secondary storage; thus the change has received the most attention with the KitKat release.

Applications with only the READ_EXTERNAL_STORAGE permission are largely not impacted. While the new permission model also tightened the permissions on the application-specific directories on secondary storage, most applications were not accessing those locations anyway. The WRITE_EXTERNAL_STORAGE permission also includes the READ_EXTERNAL_STORAGE permission.

The Second Change

The application-specific folders (/storage/extSDCard/Android/Data/[Package Name]) are now deleted upon application uninstall. That means that if you store important information in that directory and then uninstall the application that created it, your data will also be deleted. While this behavior could be desirable for an application trying to hide its tracks, or for malware, it could be frustrating during a forensic examination, or when you expect data to persist between actions taken while testing a device.

When analyzing application behavior, be sure to examine the application specific folders. Also, compare the folders on Jelly Bean and KitKat devices for content differences due to the permission changes.

How does that permission model work?

These permissions are implemented via kernel-level groups.

The WRITE_EXTERNAL_STORAGE permission grants membership to the sdcard_rw and sdcard_r groups, which used to have full access to the secondary storage device.

The READ_EXTERNAL_STORAGE permission grants membership to the sdcard_r group, and therefore grants permission to read the secondary storage device.

The WRITE_MEDIA_STORAGE permission grants membership to the media_rw and media_r groups. These are the groups system applications use to access the entire secondary storage device.

It turns out there is also a READ_MEDIA_STORAGE permission with a corresponding media_r group. It is not clear how this differs from the READ_EXTERNAL_STORAGE permission.

Groups are associated with their permissions in system/etc/permissions/platform.xml. Researchers wishing to restore the old behavior are modifying this file to add the media_rw group to the WRITE_EXTERNAL_STORAGE permission. This modification requires a rooted device.
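For reference, the relevant entry in platform.xml looks roughly like the following on stock builds (exact contents vary by device); the modification described above adds the final group line:

```xml
<permission name="android.permission.WRITE_EXTERNAL_STORAGE" >
    <group gid="sdcard_r" />
    <group gid="sdcard_rw" />
    <!-- Added on rooted devices to restore pre-KitKat write access: -->
    <group gid="media_rw" />
</permission>
```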

Show me!

To illustrate, here are secondary storage directory listings from a KitKat and Jelly Bean device:

Here is the private application storage directory on a Jelly Bean device: (Android/data on the secondary storage device.)

shell@android:/storage/sdcard0 $ ls -la Android/data
-rw-rw-r-- root     sdcard_rw        0 2013-03-28 15:04 .nomedia
drwxrwxr-x root     sdcard_rw          2011-12-31 16:01
drwxrwxr-x root     sdcard_rw          2013-03-28 15:32
drwxrwxr-x root     sdcard_rw          2013-03-28 15:04
drwxrwxr-x root     sdcard_rw          2013-03-28 15:04
drwxrwxr-x root     sdcard_rw          2013-03-22 16:11

Here is the same directory on a KitKat device:

shell@d2vzw:/storage/extSdCard $ ls -la Android/data
-rwxrwx--- root     sdcard_r        0 2014-12-15 19:58 .nomedia
drwxrwx--- u0_a84   sdcard_r          2014-12-15 20:11
drwxrwx--- u0_a137  sdcard_r          2014-12-15 20:14
drwxrwx--- u0_a56   sdcard_r          2014-12-15 19:58
drwxrwx--- u0_a124  sdcard_r          2014-12-15 19:58
drwxrwx--- u0_a92   sdcard_r          2014-12-15 20:48
drwxrwx--- u0_a6    sdcard_r          2014-12-15 19:58
drwxrwx--- u0_a153  sdcard_r          2014-12-15 20:12 com.vcast.mediamanager

If you examine the two listings, you can see that the group controlling access to the private application directories has changed, and the ownership of the individual directories has changed to match the application owner/user on the device. Also notice that the world read/execute permissions on these directories are removed in KitKat. With those changes, a given application can no longer write to other applications’ directories.

Note: the sdcard_r group doesn’t actually have write permissions to these directories; the FUSE daemon is synthesizing the results.

Let’s take a look at the permissions at the top level of the secondary storage device:

Storage Permissions in Jelly Bean:

shell@android:/storage/sdcard0 $ ls -la
drwxrwxr-x root     sdcard_rw         2014-11-10 16:02 .downloadTemp
drwxrwxr-x root     sdcard_rw         2011-12-31 16:01 .face
drwxrwxr-x root     sdcard_rw         2014-12-15 21:27 .thumbnails
drwxrwxr-x root     sdcard_rw         2011-12-31 16:01 Alarms
drwxrwxr-x root     sdcard_rw         2011-12-31 16:01 Android
drwxrwxr-x root     sdcard_rw         2011-12-31 16:01 Application
drwxrwxr-x root     sdcard_rw         2011-12-31 16:01 DCIM

Storage Permissions in KitKat:

shell@d2vzw:/storage/extSdCard $ ls -la
drwxrwx--x root     sdcard_r         2014-12-15 19:58 Android
drwxrwx--- root     sdcard_r         2014-12-15 19:58 LOST.DIR
drwxrwx--- root     sdcard_r          2013-02-21 04:57 books
drwxrwx--- root     sdcard_r         2013-02-21 06:32 camera
drwxrwx--- root     sdcard_r         2013-02-21 06:31 documents
drwxrwx--- root     sdcard_r         2013-02-21 04:57 downloads
drwxrwx--- root     sdcard_r         2013-02-21 04:57 music

From here, it looks like nothing can write to the top level under KitKat. In practice, the system applications are able to read and write just fine.

So, when you’re assessing the operations of an application, you now need to consider the overt READ/WRITE_EXTERNAL_STORAGE permissions, actions taken in the application’s private storage area, and lastly, use of the new intents designed to manage access to this storage.

What is the reasoning behind the change?

In short, the media card was unstructured storage and not well managed, which made it easy for applications to both write and read that data. That’s why the Open Handset Alliance (OHA) is continually adjusting the behavior: there is a critical balance between access to the phone owner’s data and the risk of exposing that data. The new model wraps access to external storage in the FUSE daemon, and cleans up and normalizes access to the media card.

From the web site:

The WRITE_EXTERNAL_STORAGE permission must only grant write access to the primary external storage on a device. Apps must not be allowed to write to secondary external storage devices, except in their package-specific directories as allowed by synthesized permissions. Restricting writes in this way ensures the system can clean up files when applications are uninstalled.

What are the impacts?

Apps can no longer write ad hoc to the media card, so things like third-party photo editors don’t work without changing how they access those files. Read permissions still work, so third-party media players/readers still work. Files are lost on application uninstall, and device backup applications have had to adjust where files are stored. Additionally, running applications from a media card no longer works. Sideloading applications from the media card still works, provided the APK file is in a viable location, such as the downloads directory, and the options are set to allow application installation from other sources.

What does that mean when assessing an application?

  1. Application use of primary rather than secondary storage.
    1. This is not changed. If an application makes no use of secondary storage, your assessment is not impacted by these changes.
    2. An application may avoid the whole issue by never trying to write there.
  2. Application developers may try new ways to access secondary storage.
    1. Look for applications trying to add the WRITE_MEDIA_STORAGE permission in the AndroidManifest.xml file. Note: this only works for system/manufacturer applications.
    2. Look for calls to the new APIs. Pen testers can assume malware has already adjusted to the new paradigm.
  3. Examine use of private application storage space.
    1. Check for use of the new private application space on secondary storage, and consider how the application may try to leverage its deletion on uninstall. Conversely, if you uninstall the application, remember that this data gets deleted.
  4. Does the application use the new intent introduced in Lollipop?
    1. Lollipop contains a new intent designed to emulate the old behavior.
    2. This is the supported/preferred mechanism for accessing secondary storage and isn’t necessarily an indication of a problem.
  5. Just to be sure, make sure the platform.xml file hasn’t been modified.
    1. Examine system/etc/permissions/platform.xml to see if the media_rw group was added to the WRITE_EXTERNAL_STORAGE definition, which restores the old behavior and could be an indication of device modification/rooting.
    2. It could be illuminating to compare the application behavior on a Jelly Bean device.

New in Lollipop

With Lollipop, a new ACTION_OPEN_DOCUMENT_TREE intent was introduced that allows application developers to return to the more familiar behavior. It is expected that applications will implement this intent as the simplest mechanism to restore the expected behavior. Google Software Engineer Jeff Sharkey explained the change and is quoted below.


After the change was rolled out in KitKat, Google heard loud and clear that developers wanted richer access beyond these directories, so in Lollipop they added the new ACTION_OPEN_DOCUMENT_TREE intent.  Apps can launch this intent to pick and return a directory from any supported DocumentProvider, including any of the shared storage supported by the device.  Apps can then create, update, and delete files and directories anywhere under the picked tree without any additional user interaction.  Just like the other document intents, apps can persist this access across reboots.

This gives apps broad, powerful access to manage files while still involving the user in the initial selection process.  Users may choose to give your app access to a narrow directory like “My Vacation Photos,” or they could pick the top-level of an entire SD card; the choice is theirs.

To make it easy for developers to transition to these new APIs, there’s a new DocumentFile support library class.  It looks and feels just like a traditional java.lang.File object, which makes it easy to adapt existing code:
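The code sample is missing here, but the adaptation Sharkey describes looks roughly like this sketch (assuming an Activity that launched ACTION_OPEN_DOCUMENT_TREE via startActivityForResult(); the REQUEST_TREE request code and file names are illustrative, and this only runs on an Android device):

```java
// Inside an Activity, after the user picks a directory tree.
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == REQUEST_TREE && resultCode == RESULT_OK) {
        Uri treeUri = data.getData();

        // Persist access to the picked tree across reboots.
        getContentResolver().takePersistableUriPermission(treeUri,
                Intent.FLAG_GRANT_READ_URI_PERMISSION
                        | Intent.FLAG_GRANT_WRITE_URI_PERMISSION);

        // DocumentFile looks and feels like java.io.File.
        DocumentFile pickedDir = DocumentFile.fromTreeUri(this, treeUri);
        DocumentFile newFile = pickedDir.createFile("text/plain", "notes.txt");
        for (DocumentFile child : pickedDir.listFiles()) {
            Log.d("TREE", child.getName());
        }
    }
}
```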

-Lee Neely & Chris Crowley

p.s. If you want to secure your mobile infrastructure and apps using world-class methodologies and techniques, you should definitely check out SANS Security 575, a GREAT course on mobile device security and pen testing.

Chris Crowley will be teaching it in Houston from March 28-28, 2015.

Josh Wright will be teaching it at SANS Orlando from April 13-18, 2015.