The definitive iOS security testing checklist for developers

Tom Colvin
6 min read · Feb 11, 2022


Padlock on a turquoise door
Photo by Kaffeebart on Unsplash

Apple has been careful to foster an image of their mobile platform, iOS, as secure and privacy-focused. The locked-down nature of the platform, including the fact that Apple acts as gatekeeper for anything you might want to do on your iPhone, goes some way towards realising this.

However, although Apple has codified certain best practices around security, they are not exhaustive and the often-haphazard review process mostly doesn’t enforce them anyway.

So security and privacy remain firmly in the hands of the developers. And there are so many pitfalls that it’s actually very difficult to find an app which does get it completely right.

This article is therefore intended as a checklist for iOS developers: the critical areas of your app to check. Your app may have other requirements too, depending on the class of data it stores and local regulations.

Clearly define sensitive data

The security testing journey always begins with an understanding of what data in your app is sensitive. That could come from user input, the app’s back end, or other sources online. Once you’ve carefully defined what’s sensitive and what’s not, you can work out which bits of your code handle that data. Then you have a checklist of code areas to test all the following against.

Encrypt sensitive data at rest

Small pieces of authentication data (for example passwords, keys and tokens) should be stored using the iOS keychain. That is a super-secure lock-box, backed by the device's own encryption hardware.

Larger pieces of sensitive data need to be encrypted on disk, for example using iOS's built-in Data Protection file attributes.
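
For illustration, here's a minimal sketch of storing a token in the keychain with the Security framework; the service and account names are placeholders, not values from this article.

```swift
import Foundation
import Security

// Minimal sketch: store a small secret (e.g. an auth token) in the keychain.
// The service and account names are hypothetical placeholders.
func saveToken(_ token: String, account: String = "authToken") -> Bool {
    guard let data = token.data(using: .utf8) else { return false }

    var query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.myapp",   // hypothetical service identifier
        kSecAttrAccount as String: account
    ]

    SecItemDelete(query as CFDictionary)   // replace any existing item

    query[kSecValueData as String] = data
    // Readable only while the device is unlocked; never migrated to another device.
    query[kSecAttrAccessible as String] = kSecAttrAccessibleWhenUnlockedThisDeviceOnly

    return SecItemAdd(query as CFDictionary, nil) == errSecSuccess
}
```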

Work around the iOS keychain bug

iOS has a long-standing bug where keychain data is not deleted when an app is uninstalled (unlike the rest of the app’s data). This means a new install of an app will pick up that old data, potentially causing it to use credentials from years ago. A user would reasonably have expected this to have been purged.

You can’t work around this bug by deleting keychain data yourself when the app is uninstalled, simply because iOS won’t let you hook into the uninstall process. But you can and should add a mechanism to delete keychain data when you detect a new install.
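One common approach, sketched below on the assumption that a simple UserDefaults flag is an acceptable first-run marker (UserDefaults is wiped on uninstall, the keychain is not):

```swift
import Foundation
import Security

// Sketch: a missing UserDefaults flag indicates a fresh install, and therefore
// potentially stale keychain data. The "hasRunBefore" key is a hypothetical example.
func purgeKeychainOnFreshInstall() {
    let defaults = UserDefaults.standard
    guard !defaults.bool(forKey: "hasRunBefore") else { return }

    // Delete every keychain item belonging to this app, class by class.
    let itemClasses: [CFString] = [kSecClassGenericPassword, kSecClassInternetPassword,
                                   kSecClassCertificate, kSecClassKey, kSecClassIdentity]
    for itemClass in itemClasses {
        SecItemDelete([kSecClass as String: itemClass] as CFDictionary)
    }

    defaults.set(true, forKey: "hasRunBefore")
}
```

Something like this would run early in application(_:didFinishLaunchingWithOptions:), before any keychain reads.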

Encrypt all data in transit

Everything that goes over the network should be encrypted, to prevent prying eyes from reading or altering it. If your app uses only REST APIs, this is often a simple case of ensuring that they are only accessed via HTTPS. Otherwise, you need to make sure that whatever connection you use is encrypted with TLS.
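App Transport Security already blocks most plaintext HTTP by default, but a simple guard against accidental ATS exceptions or hard-coded http:// URLs doesn't hurt. A minimal sketch:

```swift
import Foundation

// Sketch: refuse to build a request unless the URL uses HTTPS.
func secureRequest(to urlString: String) -> URLRequest? {
    guard let url = URL(string: urlString), url.scheme?.lowercased() == "https" else {
        assertionFailure("Refusing to send data over a non-HTTPS connection")
        return nil
    }
    return URLRequest(url: url)
}
```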

Enforce password complexity

Length is the most important factor in a password's ability to withstand a brute-force attack. It's also valuable to ensure users pick from a wide range of characters (uppercase, lowercase, numbers, symbols). Finally, attackers maintain dictionaries of easily guessed passwords, and apps should check candidate passwords against such a list.

Password policy should ideally be implemented at the authentication server, but that is sometimes not possible (we’re looking at you, Firebase Auth), hence it’s good practice to implement it in the app too.
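A minimal sketch of such a client-side check; the rules and the deny-list below are purely illustrative, and the same policy should be mirrored server-side wherever possible:

```swift
import Foundation

// Sketch of client-side password checks with illustrative rules and deny-list.
func isAcceptablePassword(_ password: String,
                          denyList: Set<String> = ["password", "123456", "qwerty"]) -> Bool {
    guard password.count >= 12 else { return false }                       // length first
    guard !denyList.contains(password.lowercased()) else { return false }  // common passwords

    let hasUpper  = password.rangeOfCharacter(from: .uppercaseLetters) != nil
    let hasLower  = password.rangeOfCharacter(from: .lowercaseLetters) != nil
    let hasDigit  = password.rangeOfCharacter(from: .decimalDigits) != nil
    let hasSymbol = password.rangeOfCharacter(from: .punctuationCharacters.union(.symbols)) != nil

    return hasUpper && hasLower && hasDigit && hasSymbol
}
```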

Remove all app logging — or at least sensitive logs

Best practice is to suppress all console logging for production builds. But if that’s not possible you must at least suppress logs which contain sensitive data.

By the way, if you're searching your code for logs to suppress, it's insufficient to simply search for "print" (in Swift) or "NSLog" / "printf" (in Objective-C). There are lots of other ways to log data, and ultimately a manual review is needed to capture all possibilities.
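One common pattern is a small wrapper that compiles to nothing outside debug builds; a minimal sketch, assuming the DEBUG flag is set only for debug configurations (Xcode's default):

```swift
// Sketch: call debugLog(...) instead of print(...) throughout the app.
// In release builds the body compiles away entirely.
func debugLog(_ items: Any..., separator: String = " ") {
    #if DEBUG
    print(items.map { "\($0)" }.joined(separator: separator))
    #endif
}
```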

Disable auto-correct for sensitive keyboard input

Text fields which accept sensitive input should have their autocorrectionType property set to UITextAutocorrectionTypeNo (.no in Swift). Why? Because auto-correct stores vast quantities of the data that users type to help future predictions.
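For example (the spell-checking and secure-entry lines are additional hardening beyond the auto-correct setting discussed above):

```swift
import UIKit

// Sketch: harden a text field that accepts sensitive input.
func configureSensitiveField(_ field: UITextField) {
    field.autocorrectionType = .no     // UITextAutocorrectionTypeNo in Objective-C
    field.spellCheckingType = .no      // additional hardening: no spell-check caching
    field.isSecureTextEntry = true     // additional hardening for secrets: masks the input
}
```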

Only use standard crypto implementations

Don’t try to implement your own encryption or hashing algorithms! It will end badly.

If you think you need something non-standard then your architecture is probably wrong.
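On iOS that generally means Apple's own frameworks, such as CryptoKit. A minimal sketch of hashing and authenticated encryption:

```swift
import CryptoKit
import Foundation

// Sketch: lean on CryptoKit rather than rolling your own algorithms.
func hashAndEncryptExample() throws {
    // Hashing with SHA-256.
    let digest = SHA256.hash(data: Data("sensitive value".utf8))
    print(digest)

    // Authenticated encryption (AES-GCM) with a freshly generated 256-bit key.
    let key = SymmetricKey(size: .bits256)
    let sealedBox = try AES.GCM.seal(Data("secret payload".utf8), using: key)
    let decrypted = try AES.GCM.open(sealedBox, using: key)
    print(String(decoding: decrypted, as: UTF8.self))
}
```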

Check your permission requests

Confirm that you really do need each and every permission request that is made. Make sure you’re only asking for permissions at the time that they are needed, not before.
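For example, here's a sketch of requesting camera access only at the moment the user starts a (hypothetical) scanning feature, rather than at launch:

```swift
import AVFoundation
import Foundation

// Sketch: ask for camera access only when the user actually taps "scan".
func startScanningIfAuthorised(onReady: @escaping () -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        onReady()
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { granted in
            if granted { DispatchQueue.main.async { onReady() } }
        }
    default:
        break   // denied/restricted: explain in the UI why the feature is unavailable
    }
}
```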

Sanitise EVERY input

This is one of the biggest problems we see in our security testing of clients’ apps. Each and every input must be checked for bad characters and escape sequences. If not, you leave yourself open to things like SQL Injection and cross-site scripting attacks (XSS).

This doesn’t just apply to checking what the user types in, but also the results of any external query such as to your app’s back end. After all, an attacker can in many situations affect the data that the app receives via the internet.
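For user-supplied strings, an allow-list check is usually safer than trying to strip out bad characters, and for SQL specifically you should always use parameterised queries rather than string concatenation. A minimal allow-list sketch, with illustrative rules:

```swift
import Foundation

// Sketch: validate a field that should only ever contain letters, digits and
// a few safe punctuation characters. Reject, rather than "clean up", anything unexpected.
func isSafeUsername(_ input: String) -> Bool {
    let allowed = CharacterSet.alphanumerics.union(CharacterSet(charactersIn: "._-"))
    return !input.isEmpty
        && input.count <= 64
        && input.unicodeScalars.allSatisfy { allowed.contains($0) }
}
```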

Deserialise objects safely

Similar to the previous point on sanitisation, any input into NSCoding or other mechanisms for serialising and deserialising data needs to be carefully reviewed. You should also check its provenance and avoid deserialising data you can’t prove you created.

There is a significant problem with NSCoding: during decoding, an archive can cause an unexpected class to be instantiated, and its code runs before you have a chance to confirm the object is valid. NSSecureCoding is an almost-drop-in replacement which requires you to declare the expected classes up front; use it in place of NSCoding to avoid this.
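A minimal sketch of a secure-coding round trip, restricting decoding to a named set of classes:

```swift
import Foundation

// Sketch: archive and unarchive with secure coding so only the classes you
// name can be instantiated during decoding.
func roundTrip(_ settings: NSDictionary) -> NSDictionary? {
    guard let data = try? NSKeyedArchiver.archivedData(withRootObject: settings,
                                                       requiringSecureCoding: true) else {
        return nil
    }
    // unarchivedObject(ofClasses:from:) refuses to create anything outside this list.
    let object = try? NSKeyedUnarchiver.unarchivedObject(
        ofClasses: [NSDictionary.self, NSString.self, NSNumber.self],
        from: data)
    return object as? NSDictionary
}
```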

Safe WKWebViews

WKWebViews provide browser-like functionality in your app. They are enormously powerful, but they need careful handling. In particular:

  1. You should disable JavaScript unless it's essential, using the javaScriptEnabled property of WKPreferences (or, on iOS 14+, allowsContentJavaScript on WKWebpagePreferences); see the configuration sketch after this list. If JavaScript is needed, make sure you only load code from trustworthy sources (such as hard-coded app assets) and consider whether you need to verify its integrity cryptographically.
  2. Don’t enable file access (via the file:// protocol scheme) unless essential.
  3. WKWebViews can interact with native code via window.webkit.messageHandlers. If you have implemented an API like this, make sure it’s as small as possible and that all inputs are carefully sanitised.
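
A minimal configuration sketch, assuming a bundled "help.html" asset as a hypothetical example of content you control:

```swift
import WebKit

// Sketch of a locked-down WKWebView with JavaScript disabled.
func makeHelpWebView() -> WKWebView {
    let configuration = WKWebViewConfiguration()
    if #available(iOS 14.0, *) {
        configuration.defaultWebpagePreferences.allowsContentJavaScript = false
    } else {
        configuration.preferences.javaScriptEnabled = false   // pre-iOS 14 property
    }

    let webView = WKWebView(frame: .zero, configuration: configuration)

    // Load only content you control, such as an asset shipped inside the app.
    if let url = Bundle.main.url(forResource: "help", withExtension: "html") {
        webView.loadFileURL(url, allowingReadAccessTo: url.deletingLastPathComponent())
    }
    return webView
}
```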

Finally, note that the old UIWebView (superseded by WKWebView in iOS 8 and formally deprecated since iOS 12) is enormously insecure. If your app still uses it, you should upgrade to WKWebView immediately.

Remove all back doors and debugging code

A client of mine once implemented an in-app purchase into their app. In order to test the fully-fledged app, one of their developers implemented a hidden feature: tap six times on the logo to get everything for free. Cue thousands in lost revenues when a member of the public found this out and told Twitter about it. Basically: don’t be that developer.

That includes removing any hard-coded passwords or security system bypasses to help debugging.

Whilst you’re at it, make sure that any testing back-end URLs or account details are mentioned in comments only, not in code that will compile into the final app.
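Where internal builds genuinely need different configuration, conditional compilation at least keeps it out of the release binary. A sketch with placeholder URLs:

```swift
import Foundation

// Sketch: debug-only configuration behind conditional compilation, so it is
// stripped from release builds entirely. Both URLs are hypothetical.
enum Backend {
    #if DEBUG
    static let baseURL = URL(string: "https://staging.example.com/api")!
    #else
    static let baseURL = URL(string: "https://api.example.com")!
    #endif
}
```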

Carefully select what goes to iCloud

Apple really likes sending data to their iCloud service. Automatic backups will send your data up to the cloud without your users really noticing, and this can have both security and privacy implications for sensitive data. After all, it means that an attacker wouldn’t have to have physical access to your device to get your data.

Apps have control over what gets backed up to the cloud, and these controls should be used judiciously. Use the isExcludedFromBackup resource value (NSURLIsExcludedFromBackupKey) to help.
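A minimal sketch of excluding a file from iCloud and iTunes backups:

```swift
import Foundation

// Sketch: mark a file so it is excluded from backups.
// The attribute is backed by NSURLIsExcludedFromBackupKey.
func excludeFromBackup(fileAt url: URL) throws {
    var fileURL = url
    var values = URLResourceValues()
    values.isExcludedFromBackup = true
    try fileURL.setResourceValues(values)
}
```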

Educate your users on sensitive data and how to protect themselves

For example, when you send them a PIN or recovery key, stress how important it is that they don't share it. If they can enable authentication via fingerprint, make sure they realise that doing so opens their account to anyone whose fingerprint is registered on their device. If you detect that the device is jailbroken, make sure they understand the potential effect that could have on their security.

To conclude…

This blog post has listed 15 of the most basic things you need to do in order to ensure that your iOS app is secure. Depending on your app, the data it handles and other regulations, it’s quite possible that other rules around data protection will apply too. But the above is good practice which applies to any app.

Tom Colvin is CTO of Conseal Security, the mobile app security experts; and Apptaura, the app development specialists. Do get in touch if I can help with any mobile security or development projects!
