Yale University’s Privacy Lab is calling on Google and Android app developers to increase transparency into privacy and security practices related to dozens of trackers built into popular Google Play apps such as Uber, Tinder, Skype, Twitter, Spotify and Snapchat.
Privacy Lab teamed up with France-based non-profit Exodus Privacy to uncover “clandestine surveillance software that is unknown to Android users at the time of app installation.”
“These trackers vary in their features and purpose, but are primarily utilized for targeted advertising, behavioral analytics, and location tracking,” the team writes.
Using signature-based detection similar to that employed by traditional AV software, Exodus researchers have so far uncovered 44 such trackers, including 25 used as a sample in their joint research with Yale.
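In broad strokes, signature-based tracker detection means matching code artifacts extracted from an app against a database of known tracker fingerprints. The sketch below illustrates the idea by matching Java class-name prefixes; the tracker names and prefixes are invented for illustration and are not Exodus Privacy's actual signature set.

```python
# Illustrative signature database: tracker name -> Java class-name prefixes.
# These entries are hypothetical, for demonstration only.
TRACKER_SIGNATURES = {
    "ExampleAds": ["com.exampleads.sdk."],
    "TrackCo Analytics": ["io.trackco.analytics.", "io.trackco.core."],
}

def find_trackers(class_names):
    """Return the known trackers whose signatures match any of the fully
    qualified class names extracted from an app's bytecode."""
    found = set()
    for name in class_names:
        for tracker, prefixes in TRACKER_SIGNATURES.items():
            if any(name.startswith(p) for p in prefixes):
                found.add(tracker)
    return found

# Example: class names as they might appear after decompiling an APK.
classes = [
    "com.example.myapp.MainActivity",
    "io.trackco.analytics.SessionReporter",
]
print(sorted(find_trackers(classes)))  # prints ['TrackCo Analytics']
```

Real scanners match on more than class names (manifest entries, network endpoints, native libraries), but prefix matching against embedded SDK packages captures the core of the technique.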
While analyzing the trackers, the team quickly learned that their activity spans numerous legal jurisdictions, since the same trackers are deployed internationally with identical functionality.
“Lack of transparency about the collection, transmission, and processing of data via these trackers raises serious privacy concerns and may have grave security implications for mobile software downloaded and in active use by billions of people worldwide,” Yale researchers warn.
More than three quarters of the apps analyzed by Exodus contained trackers. Apps advertised as “clean” today may still contain trackers that have not yet been identified, researchers believe.
Privacy Lab urges the information security community to help find trackers that it believes are more deeply hidden, as it aims to uncover all, or at least most of them.
A similar audit of iOS apps is much more difficult because of the way Apple stewards the App Store and how iOS apps are packaged and sandboxed.
However, many companies selling trackers also do business with iOS developers, and many developers distribute the same apps on both iOS and Android, so it’s reasonable to ask whether those practices are also unfolding in Apple’s walled garden.
The team doesn’t settle for simply offering a general view of the situation. In its paper, which it believes is of the utmost public interest, it offers a rather worrying example of how trackers could be modified to send highly personal information to third-party servers.
“FaceGrok recognizes faces in view of the camera, a simple demonstration of the type of data which may be collected and transmitted via trackers,” Privacy Lab folks write. “Though FaceGrok does not transmit any facial recognition data, it could do so with simple modifications. The process of Android app development and submission to the Google Play store has revealed the ease of adding tracker code and the ubiquity of trackers, as well as a glimpse into Google Play policies and app review.”
The versatile nature of trackers is exemplified by FidZup, a tracker that pairs with sonic emitters: the emitters diffuse a tone inaudible to the human ear, and phones running apps with FidZup embedded pick up the tone, revealing their presence.
Such a tracker is presumably used by apps like Bottin Gourmand, which works as a guide to restaurants and hotels in France. Other apps engaging in this practice include car magazine app Auto Journal and TV guide app TeleStar, the authors claim.
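On the phone side, detecting such a beacon amounts to sampling microphone audio and checking for energy at a near-ultrasonic frequency. The sketch below shows one plausible way to do that with the Goertzel algorithm, using simulated audio; the 19 kHz beacon frequency and the detection threshold are assumptions for illustration, not FidZup's actual protocol.

```python
import math

SAMPLE_RATE = 44100   # a common phone microphone sample rate (Hz)
BEACON_FREQ = 19000   # assumed beacon tone, inaudible to most adults (Hz)

def goertzel_power(samples, target_freq, sample_rate):
    """Goertzel algorithm: signal energy at a single target frequency."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def beacon_present(samples, threshold=1000.0):
    """True if the captured audio carries energy at the beacon frequency."""
    return goertzel_power(samples, BEACON_FREQ, SAMPLE_RATE) > threshold

# Simulated microphone captures: a quiet 19 kHz tone vs. silence.
n = 4410  # 100 ms of audio
tone = [0.05 * math.sin(2 * math.pi * BEACON_FREQ * t / SAMPLE_RATE)
        for t in range(n)]
silence = [0.0] * n
print(beacon_present(tone), beacon_present(silence))  # prints True False
```

The Goertzel filter is a natural fit here because the app only cares about one frequency bin, making it far cheaper than a full FFT on a battery-powered device.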
There’s more. Among the apps analyzed using the Exodus platform were several finance and healthcare apps, some found to contain as many as six trackers. High-profile apps on that list include Aetna, American Red Cross, WebMD, American Express, Discover, HSBC, Wells Fargo, and PayPal. Needless to say, apps like these handle highly sensitive data, precisely the kind of information bad actors would love to get their hands on.
Privacy Lab is therefore calling upon developers and app store vendors to increase transparency into the privacy and security practices employed by their apps and the venues used to sell them.
“Android users, and users of all app stores, deserve a trusted chain of software development, distribution, and installation that does not include unknown or masked third-party code,” the team’s final message reads.
This is a Security Bloggers Network syndicated blog post authored by Filip Truta. Read the original post at: HOTforSecurity