Uber Wasn’t ‘Tracking’ Users, But It Was Breaking Apple’s Rules. Here’s How.

In a lengthy profile of Uber CEO Travis Kalanick yesterday, the New York Times revealed that at one point in early 2015, Kalanick was summoned to Apple HQ for a meeting with Tim Cook. Apple had discovered that the Uber iOS app was “secretly identifying and tagging iPhones even after its app had been deleted and the devices erased,” a method sometimes known as “fingerprinting.”

Fingerprinting is one of many methods that software developers use to identify and recognize returning users. The benefits of these systems are obvious: If you delete an app and reinstall it, you can presumably pick up right where you left off, with your preferences, and sometimes even your data, carried over from the last use. There are aboveboard ways of doing this on iOS, such as using your Apple iCloud account as the identifier; iCloud is a persistent user identifier, but it is not tied to the hardware itself.
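For developers curious what the sanctioned route looks like, here is a minimal sketch using Apple’s public NSUbiquitousKeyValueStore. It’s an illustration, not anything Uber shipped, and the key name is a placeholder: an identifier generated once and stashed in iCloud survives deleting and reinstalling the app, but it follows the Apple ID rather than the hardware.

```swift
import Foundation

// A hypothetical sketch of the aboveboard approach: keep an app-generated
// identifier in the user's iCloud key-value store. It survives deleting and
// reinstalling the app, but it travels with the Apple ID, not the hardware.
// (Requires the iCloud key-value storage entitlement; the key name is made up.)
func returningUserIdentifier() -> String {
    let store = NSUbiquitousKeyValueStore.default
    let key = "com.example.returning-user-id"   // placeholder, not Uber's

    if let existing = store.string(forKey: key) {
        return existing                          // seen this user before
    }
    let fresh = UUID().uuidString
    store.set(fresh, forKey: key)
    store.synchronize()
    return fresh
}
```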

What Uber was doing, on the other hand, was below-board. At the time of Kalanick’s meeting with Cook, the app was using a private software framework known as IOKit, which in turn gave it access to device-hardware information. “Private,” in this case, means that iOS developers are prohibited from using it under their agreement with Apple, the platform holder. According to a copy of the Uber app from 2014 examined by Will Strafach, a developer and the founder of Verify.ly, Uber was reading the serial number from users’ phones.
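To make that concrete, the snippet below shows roughly what reading the serial number through IOKit looks like, using the framework’s public macOS entry points (on the Mac, IOKit is fair game). On iOS the same framework is private, which is exactly why reaching into it, however an app manages to do so, breaks Apple’s rules. This is an illustration, not Uber’s code.

```swift
import Foundation
import IOKit

// Illustrative only: reading a Mac's serial number through IOKit's public
// macOS entry points. On iOS the same framework is private, and an app that
// called into it this way would be violating Apple's developer agreement.
func platformSerialNumber() -> String? {
    let service = IOServiceGetMatchingService(kIOMasterPortDefault,
                                              IOServiceMatching("IOPlatformExpertDevice"))
    guard service != 0 else { return nil }
    defer { IOObjectRelease(service) }

    // "IOPlatformSerialNumber" is the registry key that holds the hardware serial.
    let value = IORegistryEntryCreateCFProperty(service,
                                                "IOPlatformSerialNumber" as CFString,
                                                kCFAllocatorDefault,
                                                0)
    return value?.takeRetainedValue() as? String
}
```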

Along with the device’s serial number, IOKit could be used to read a persistent value known as the IMEI, a unique identifier assigned to every mobile phone. These are values that do not change even if the phone is wiped — which meant that Uber could store the serial number on its servers and then recognize when the app was deleted and reinstalled on that device, or when someone logged out of one account and into another, even if the phone had been completely erased between uses.
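The value of such an identifier lies almost entirely on the server. A hypothetical sketch of that bookkeeping, with made-up names rather than anything from Uber’s backend, is enough to show the idea:

```swift
import Foundation

// Hypothetical server-side bookkeeping, keyed by a hardware identifier that
// survives wipes and reinstalls. All names here are invented for illustration.
struct DeviceLedger {
    private var firstSeen: [String: Date] = [:]   // serial number -> first contact

    // Returns true if this hardware has been seen before, even if the app was
    // deleted, the phone erased, and a different account logged in afterward.
    mutating func isReturningDevice(serial: String, now: Date = Date()) -> Bool {
        if firstSeen[serial] != nil { return true }
        firstSeen[serial] = now
        return false
    }
}

var ledger = DeviceLedger()
let first = ledger.isReturningDevice(serial: "C39XXXXXXXXX")   // false: new hardware
let again = ledger.isReturningDevice(serial: "C39XXXXXXXXX")   // true: recognized after a wipe
print(first, again)
```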

This is what is important to understand: When we say that Uber could “track” a user even after uninstalling the app, it doesn’t mean that Uber was keeping some nefarious background program on your phone that could monitor your device. To its credit, iOS has always been heavily sandboxed, a term meaning that apps are mostly walled off from each other in the system architecture, and can only exchange data in strictly controlled ways. For as long as the App Store has been around, it has been impossible to leave executable code on an iOS device after an app has been uninstalled. The sort of “tracking” Uber did meant that if you deleted and re-downloaded the app, the system could recognize that the same device had been used with the service before. It couldn’t, however, see what the device had been up to between installations. In that sense, it was an intermittent type of tracking, not persistent. (The Times’ article was updated, without a formal correction, changing “tracking” to “identifying and tagging.”)

In China, where Uber was making a heavy push at the time, the company was offering drivers incentives to sign up new users. It would have been easy for drivers to simply create a bunch of new accounts on the same phone. By knowing that all of those accounts were being created from a single device, the company could identify fraudulent sign-ups.
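The fraud check itself is trivial once a hardware identifier is in hand. Here’s a hypothetical sketch — names and threshold invented for illustration, not Uber’s actual logic — that simply counts how many distinct accounts have appeared on one device:

```swift
import Foundation

// Hypothetical fraud check: flag a device once too many distinct accounts have
// been created from the same hardware identifier. Names and the threshold are
// invented for illustration; this is not Uber's logic.
struct SignupMonitor {
    private var accountsByDevice: [String: Set<String>] = [:]
    let maxAccountsPerDevice = 3

    // Returns true when a "new" rider account looks like a duplicate sign-up.
    mutating func looksFraudulent(accountID: String, deviceSerial: String) -> Bool {
        accountsByDevice[deviceSerial, default: []].insert(accountID)
        return accountsByDevice[deviceSerial]!.count > maxAccountsPerDevice
    }
}
```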

Using a private framework like IOKit is, as noted above, a violation of Apple’s developer agreement and of its App Store guidelines. When reviewing apps for entry into the store, Apple’s systems can detect whether an app calls a private function, and doing so results in automatic rejection. To get around that review, Uber “geofenced” Apple’s campus in Cupertino, drawing a virtual boundary around it; inside that boundary, the app simply wouldn’t make the private-function call.
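The Times didn’t publish Uber’s code, but the mechanics of such a geofence are simple to sketch with Apple’s public CoreLocation framework. The coordinates and radius below are placeholders, not whatever Uber actually drew around Cupertino:

```swift
import CoreLocation

// Illustrative sketch of a geofence check using Apple's public CoreLocation
// framework. The coordinates (1 Infinite Loop) and the radius are placeholders.
let appleHQ = CLLocation(latitude: 37.3318, longitude: -122.0312)
let fenceRadius: CLLocationDistance = 5_000   // meters

func isInsideFence(_ current: CLLocation) -> Bool {
    return current.distance(from: appleHQ) < fenceRadius
}

func fingerprintIfAllowed(currentLocation: CLLocation) {
    // The reported trick: skip the forbidden call whenever the device looks
    // like it's on or near Apple's campus, so reviewers never trigger it.
    guard !isInsideFence(currentLocation) else { return }
    // ...the private-framework call would have gone here...
}
```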

Once on the App Store, Uber’s app could function as intended, making use of the private framework it wasn’t supposed to access. Since then, iOS has gone through a few iterations, and these methods no longer work. When Kalanick was summoned to his meeting with Cook, the App Store review process was the screen that prevented apps from making private calls; as of iOS 10, that screening happens in the operating system itself.

So, how bad was Uber’s violation? Obviously, pretty bad — Uber may not have been “tracking” users by the most common definition of the word, but it was violating rules intended to provide phone users with some measure of privacy. Worse, it very clearly understood that it was breaking the rules: Arguably more scandalous than the fingerprinting technique that Uber used was the geofence technique that it reportedly used to hide the fingerprinting from Apple. And then when Apple found out, the two companies resolved it privately, without informing consumers. Using fingerprinting may have obvious benefits, but Apple believes those benefits are outweighed by users’ privacy concerns, and iPhone users have bought into the Apple ecosystem, in part, with an understanding that their privacy and security are protected. Neither Uber nor Apple did the right thing for consumers here.

 

from: http://nymag.com/selectall/2017/04/how-uber-tracked-iphones-and-broke-apples-rules.html
