Security breaches are reported almost every day. Data gets stolen so often that there are already services like https://haveibeenpwned.com/ that monitor such attacks and tell you if your credentials have been compromised.
And while you might only have read about those reports on the Internet, for us as an app-building company this is an everyday threat, since we work in probably the two most data-sensitive industries: healthcare and fintech. Compromising such data can lead to anything from digital identity theft to life-threatening situations.
For instance, just recently a woman died because of a software failure in a German hospital. The failure was caused by a cyber attack and forced doctors to move the patient to another facility, but she died on the way there.
Security Testing as an Answer
So what do we do to minimize the possibility of such outcomes? (We can only imagine the feelings of the QA engineers who – although most probably unintentionally – overlooked the vulnerability that made this attack possible.)
There are a number of techniques that fall under one broad term: "security testing". Its scope varies from project to project and may include or combine activities such as finding and removing system vulnerabilities and weaknesses, penetration testing, line-by-line code inspection, and others.
Instead of diving into technical aspects of our work, we'd like to dedicate a few sentences to its importance.
Companies often save money on security testing or postpone it, mainly for three reasons:
- Their business is too small/unimportant to be targeted by hackers
- There is no return on investment
- Software can never be perfectly secure, or it's much easier to install some third-party tool for data protection
However, that attitude is exactly why data breaches happen so often. Yes, databases leak every day, and a single database might be of no significant value. But when databases are combined, the resulting big data provides an enormous amount of information about people: from the time they usually come home from work and their most frequently ordered pizza to the passwords of their email accounts and the CVV codes of their credit cards.
Now, security testing might seem very vague to a non-technical person. How can you know that your development team is actually working on your product's security?
Asked this question, almost any app-building company would say something like:
"Our QA team finds a unique testing approach for every project, applying experience from a decade of work in this area."
Yet in real life, only the most reliable teams will actually give you an answer to that question as you work with them.
Most often, you’ll get detailed reports on what vulnerabilities and breaches were found and fixed. But that's not all. Security testing also reveals additional ways to protect user data.
For instance, here are just a couple of things we suggested and developed in our latest projects:
- In one of our healthcare projects, users could download medical files to their devices through the app. We programmed the app to delete those files automatically after some time: not only because a copy may become outdated, but also because we noticed that files are usually downloaded for a one-time interaction (or imported into another app for further processing).
- In another project, our software communicated with the client's proprietary scanners. Initially, we intended to work with those scanners remotely. But we decided to have them delivered to our office to check whether the hardware could be tampered with to alter its behavior, and how our code could prevent that.
- When we were making an app for a bank, security testing required quite a large amount of client data, so we invested resources in faking it. It is reasonable to assume that we could present a breach possibility on our side too.
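The auto-deletion from the first example above can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical download directory and a 24-hour retention period; a real app would pick the retention period from product requirements and could also wipe files on logout.

```python
import time
from pathlib import Path

# Hypothetical retention period; the real value would come from product requirements.
RETENTION_SECONDS = 24 * 60 * 60  # 24 hours

def purge_stale_downloads(download_dir: str, retention: int = RETENTION_SECONDS) -> list:
    """Delete downloaded files older than the retention period.

    Returns the paths that were removed so the caller can log them.
    """
    removed = []
    now = time.time()
    for path in Path(download_dir).glob("*"):
        if path.is_file() and now - path.stat().st_mtime > retention:
            path.unlink()
            removed.append(str(path))
    return removed
```

A background task (or an on-launch hook) would call this periodically, so stale medical files never linger on a lost or stolen device.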
You might also know that all healthcare applications in the US must follow the HIPAA Security Standards. Those include solid requirements for data encryption, user identification, emergency access, and other aspects of such software. To follow them, we implement features like password hashing, biometric identification, automatic logoff, file change logging, and others.
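Of the features just listed, password hashing is the easiest to illustrate. Here is a minimal sketch using PBKDF2 from Python's standard library; the iteration count is illustrative, and a production system would follow current password-storage guidance rather than this exact number.

```python
import hashlib
import hmac
import os

# Illustrative parameter; production values should follow current
# password-storage guidance (e.g. OWASP recommendations).
ITERATIONS = 600_000

def hash_password(password: str):
    """Return (salt, digest) -- only these are stored, never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time to resist timing attacks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

The point is that even if the database leaks, attackers get salted digests instead of plaintext passwords, which is exactly the kind of requirement HIPAA-grade software has to satisfy.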
Sometimes we go as far as sending our clients cheat sheets like this to help them implement healthcare security practices across their whole organization.
How Can You See That EGO Pays Attention to Security?
Now, the main outcome of successful security testing can be hard to see: your app works as intended, and hacks are either unsuccessful or absent altogether.
So here are a few security practices we often turn to when working on software projects with risky functionality (mainly payment features, file uploading, and personal data processing). If you see them when interacting with your development team, you can be sure the security aspect is treated seriously on your project.
Most of the applications we maintain or improve already have a real-world userbase.
To make sure we won't affect their user experience, we create different environments:
- A demo environment to show new or updated functionality to the client
- A staging environment for smoke testing purposes
- A few QA environments for our internal needs (mostly for testing new features separately from each other)
Additionally, as in the example earlier, we create fake or seed data for testing purposes.
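Creating fake or seed data can be as simple as a deterministic generator. Below is a minimal sketch with made-up field names; real seed data would mirror the project's actual schema, but the principle is the same: no real user information ever enters a test environment.

```python
import random
import string

def make_fake_user(rng: random.Random) -> dict:
    """Build one purely synthetic user record (hypothetical fields)."""
    name = "".join(rng.choices(string.ascii_lowercase, k=8))
    return {
        "name": name.capitalize(),
        "email": f"{name}@example.com",          # reserved test domain
        "card_last4": f"{rng.randrange(10000):04d}",  # fake digits, not a real card
    }

def seed_users(count: int, seed: int = 42) -> list:
    """Generate a reproducible batch of fake users for a QA environment."""
    rng = random.Random(seed)  # fixed seed so test runs are deterministic
    return [make_fake_user(rng) for _ in range(count)]
```

Seeding with a fixed value keeps test runs reproducible, so a bug found against the fake dataset can always be reproduced against the very same records.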
It's very common to build MVPs without paying special attention to user roles and permissions. However, if a project works with data sensitive enough for its security to matter, this is a critical issue to address.
Who can access the code and user data?
Who can edit it?
Who can dump it?
In teams where people combine various roles and responsibilities, the answers to those questions are not obvious at all. And the better the security policy, the lower the chances of internal data theft, a social engineering attack, or a banal mistake.
As an example, we sometimes get our hands on projects that initially allow us to access user data. Many mobile apps let users sign up, and the APIs implementing this functionality give developers the ability to access, dump, or alter the database of users and their passwords.
In such cases, we change the app logic and permissions to make such data invisible and inaccessible for developers (at least, by default).
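A deny-by-default permission table is one simple way to express "invisible and inaccessible for developers by default". The roles and action names below are hypothetical, a sketch of the idea rather than any project's real policy:

```python
from enum import Enum, auto

class Role(Enum):
    DEVELOPER = auto()
    SUPPORT = auto()
    ADMIN = auto()

# Hypothetical policy table: which roles may perform which actions on user data.
# Anything not listed here is forbidden for everyone.
POLICY = {
    "read_user_data": {Role.SUPPORT, Role.ADMIN},
    "edit_user_data": {Role.ADMIN},
    "dump_database": {Role.ADMIN},
}

def is_allowed(role: Role, action: str) -> bool:
    """Deny by default: unknown actions are rejected outright."""
    return role in POLICY.get(action, set())
```

The design choice that matters is the default: a new action added to the system is blocked for every role until someone explicitly grants it, which is the safer failure mode for sensitive data.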
However appealing the software product might be, if the payment procedure is unclear or, even worse, goes wrong, a user might not only refuse to use your app but also leave a negative review and share their experience on social networks.
To make sure the payment procedure works flawlessly, quality assurance engineers from the EGO app development agency create a sandbox and test credentials (a card number, expiration date, random name, etc.) for testing purposes.
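One small building block of such sandbox testing is the Luhn checksum, which card numbers must satisfy; payment sandboxes accept well-known Luhn-valid test numbers that can never charge a real account. A minimal sketch of the check:

```python
def luhn_valid(card_number: str) -> bool:
    """Validate a card number with the Luhn checksum.

    Non-digit characters (spaces, dashes) are ignored, as in typical
    card-entry forms.
    """
    digits = [int(c) for c in card_number if c.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:        # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9        # equivalent to summing the two digits
        checksum += d
    return checksum % 10 == 0

# "4242 4242 4242 4242" is a widely used Luhn-valid sandbox test number.
```

This is how test credentials stay realistic enough to exercise the full payment flow while remaining safely fake.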
Where's a fly in the ointment?
Although we skipped many more boring technical details – like how we perform penetration testing and security auditing – one question is still hanging in the air:
Can EGO guarantee your product's security?
The answer is "No", but also "No, and nobody can".
In September 2020, another zero-day vulnerability was discovered in Windows 10. A zero-day vulnerability means there's a chance it was found earlier by an attacker and exploited an unknown number of times for an unknown purpose.
Who knows how many more such vulnerabilities are in the operating system most of our clients use – clients who may have access to the code base or the user database of their product. And this is only one of many attack vectors we cannot do anything about.
And then there are new security testing tools coming out on the market. And new testing approaches. And new people joining our team.
The growth strategy our QA team has developed over their years of work is to find ways to make better-informed decisions: decisions on the optimal testing approach for every project, on testing plans and estimates, on testing tools and reports.
And then, following those decisions, we never stop paying attention to details. Because if a single idiom could describe the essence of quality assurance, it would be "God is in the detail".