The leak of a database containing the records of users of Apple HealthKit and Google Fitbit services, alongside several other fitness tracker brands, has highlighted once again the critical importance of securing enterprise databases, and could put more than 61 million people – including an unknown number in the UK – at risk of compromise by opportunistic cyber criminals.
The unsecured, 16.7GB database, which was left exposed to the public internet without password protection, was uncovered by Website Planet and security researcher Jeremiah Fowler, and is owned by GetHealth, a New York-based provider of health data services.
Data points exposed in the leak included names, dates of birth, weight, height, gender and location. Affected individuals were located all over the world, said Fowler, who uncovered the database on 30 June 2021, according to ZDNet.
“I immediately sent a responsible disclosure notice of my findings and received a reply the following day thanking me for the notification and confirming that the exposed data had been secured,” he said.
Fowler said it was unclear how long the records had been exposed or whether they had been accessed by malicious actors, and he did not imply any wrongdoing by GetHealth, its customers or partners.
“We are only highlighting our discovery to raise awareness of the dangers and cyber security vulnerabilities posed by IoT [internet of things], wearable devices, fitness and health trackers, and how that data is stored,” he said.
While most owners of wearable devices might be tempted to assume that no cyber criminal could possibly be interested in their daily step count, this is not necessarily the case. For example, such information could theoretically be used to track the movements of someone who walks their dog at the same time every day, and therefore to work out when they are unlikely to be at home.
Although the average burglar is unlikely to go to such lengths to target a victim, Fowler pointed out that as wearable technology develops and iterates, devices collect increasingly intimate data that is correspondingly more valuable to malicious actors. For example, criminals could use data on people who have set weight-loss goals to target them with phishing emails using diet or personal training plans as a lure.
Commenting on the incident, ProPrivacy’s Hannah Hart urged users of fitness-tracking apps and devices to check their privacy settings immediately, and be vigilant against possible follow-on incidents.
“While wearable devices have made it that much easier to track our weight, sleep patterns, and even our relationship with alcohol – we hardly want this information to be widely accessible, as a person’s health history should be utterly confidential,” she said. “While GetHealth has since secured the affected database, it is still unclear who might have had access to the previously unsecured database, and for how long.”
Comforte AG’s Trevor Morgan said the rapid rise and development of fitness trackers reflected the fact that people enjoy tracking their own progress towards their goals.
“The ‘quantified self’ movement not only gained traction but went from zero to 100mph very quickly,” he said. “Of course, this data ultimately winds up in repositories, allowing us to analyse that information from many different angles and then perform historical comparisons as time goes on. That’s a lot of personal data about a highly sensitive topic most of us are hoping is kept wholly secure.”
Morgan said the incident highlighted the need for data responsibility, security and privacy to be baked into organisational cultures, and noted that it also made a strong case for moving away from traditional protection methods, such as passwords, perimeter security and simple data access management. Adopting data-centric security policies can go some way towards reducing the risk, he said, while tokenising key data elements can help to ensure data cannot be exploited by the wrong person if it does leak.
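Morgan did not describe a specific implementation, but the minimal Python sketch below illustrates the general idea of tokenisation he refers to: sensitive fields are swapped for random tokens before the data reaches an analytics database, so a leaked copy of that database exposes nothing directly identifying. The names TokenVault and tokenise_record, and the choice of fields, are illustrative assumptions, not any vendor’s product or API.

```python
# Illustrative sketch of data-centric tokenisation (assumption: a simple
# in-memory vault; real deployments would use a hardened vault service or
# format-preserving encryption).
import secrets


class TokenVault:
    """Holds the only mapping between tokens and real values."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenise(self, value: str) -> str:
        # Reuse the same token for repeated values so joins and
        # historical comparisons still work on the tokenised data.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenise(self, token: str) -> str:
        return self._token_to_value[token]


def tokenise_record(record: dict, sensitive_fields: set, vault: TokenVault) -> dict:
    """Return a copy of the record with sensitive fields replaced by tokens."""
    return {
        key: vault.tokenise(str(value)) if key in sensitive_fields else value
        for key, value in record.items()
    }


if __name__ == "__main__":
    vault = TokenVault()
    record = {"name": "Jane Doe", "dob": "1984-03-02", "weight_kg": 68, "steps": 9412}
    safe = tokenise_record(record, {"name", "dob"}, vault)
    print(safe)                             # tokens in place of name and dob
    print(vault.detokenise(safe["name"]))   # real value recoverable only via the vault
```

The design point is that the analytics store and the vault can be secured and governed separately: even if the analytics database is left exposed, as in this incident, the tokens alone reveal nothing about individuals.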
“At the end of the day, utilising as many protection methods as possible is the right way to go,” he said. “The alternative is an exercise in incident management and the accompanying negative fallout – and that’s the most punishing workout of all for any enterprise.”
From a compliance standpoint, ProPrivacy’s Hart said the incident highlighted wider privacy concerns around wearable technology itself. In the US, for example, federal law protects health data from being disclosed without patient consent under the Health Insurance Portability and Accountability Act (HIPAA) of 1996.
“HIPAA regulations would usually protect this data, but since the information collected by wearables isn’t considered PHI [protected health information] unless shared with a doctor or hospital, some companies may be able to sell or share it with third parties,” she said.