While apps like Waze, Uber, and Apple Maps may help ease some of your transportation woes, recent stories suggest we should think twice before blindly trusting them to work in our best interests.
Apps designed to route people – either you or drivers looking to pick you up – rely on a lot of data: traffic data, location data, user-reported data, even personal data (Who is requesting a pickup? Which places do you visit late at night?). What if this app data could be manipulated, either intentionally by the companies that distribute the apps or by users submitting false or misleading data?
This appears to be exactly what is happening with GPS mapping provider Waze and ridesharing behemoth Uber.
Many “Waze” Home
People assume apps like Waze are programmed to guide you to your destination by the fastest route, taking into account traffic conditions and any other measurable factors. While that is mostly true, Waze’s algorithms are also designed to proactively reduce roadway congestion by spreading traffic onto underused roads, perhaps including the quiet residential street you live on. This is causing real friction in local communities that now have streams of commuters flowing through once-quiet residential streets. “Don’t Trust Your Apps,” reads one road sign in Fremont, California, where fed-up residents tried to make cut-through drivers think they had taken a wrong turn onto a slow route.
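To make the idea concrete, here is a minimal, purely hypothetical sketch in Python. Waze’s actual routing code is proprietary and not public; the route names, delay model, and numbers below are all invented for illustration. The point is simply that any router that penalizes congestion can end up preferring a quiet residential street the moment the main arterial backs up.

```python
# Illustrative sketch only: Waze's real routing logic is proprietary.
# This toy "load-spreading" router picks among candidate routes by
# penalizing congestion, so an underused residential street can win
# even if the arterial road is nominally faster when empty.

from dataclasses import dataclass

@dataclass
class Route:
    name: str
    free_flow_minutes: float   # travel time with no traffic
    congestion: float          # 0.0 = empty, 1.0 = jammed

def expected_minutes(route: Route) -> float:
    # A crude delay model: congestion inflates travel time.
    return route.free_flow_minutes * (1 + 2.5 * route.congestion)

def choose_route(candidates: list[Route]) -> Route:
    return min(candidates, key=expected_minutes)

if __name__ == "__main__":
    arterial = Route("congested arterial", free_flow_minutes=12, congestion=0.8)
    residential = Route("quiet residential street", free_flow_minutes=18, congestion=0.1)
    best = choose_route([arterial, residential])
    print(f"Routing you via: {best.name} (~{expected_minutes(best):.0f} min)")
    # Once the arterial backs up, the residential street "wins" --
    # which is exactly how commuter traffic ends up on your block.
```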
City planners, law enforcement, and transportation officials must now scramble to come up with ways to “block” these commuters and make these routes undesirable – by narrowing drivable space, making streets one way, instituting rush hour restrictions, or other previously unthinkable measures. Fremont added rush hour traffic restrictions to many of its streets and then sent the information to Waze to integrate into its algorithms. So far, the plan has worked for Fremont, where residential traffic is back to more normal levels. [Source]
In Miami, local police found it useful to post false reports of speed traps in Waze, which had the desired effect of slowing traffic by leading drivers to believe an officer was watching. [Source]
With the power to route hordes of people away from one neighborhood and into another with just a few tweaks to its algorithms, Waze can turn one neighborhood into a ghost town overnight while making another an instant traffic hotspot. One can only hope it continues to use this power with balance – optimizing our transportation routes while preserving our local communities.
Uber’s Fake Data
Ridesharing company Uber stands accused of violating ridesharing laws in many cities, states, and countries, and of aggressively evading regulators with a software tool called Greyball.
Uber regularly enters markets where rideshare services are illegal. In these markets, city officials sometimes pose as regular riders, hoping to catch Uber drivers in the act of violating city ordinances.
This is where Greyball comes in. Using proprietary algorithms and research, Uber identifies these suspected officials and “Greyballs” them, making them ineligible for pickup by an Uber driver. A Greyballed official who opens the app sees Uber car icons all around, as if they are in a hot Uber market, but those cars are ghosts generated by a fake version of the app. Real drivers are never shown to Greyballed riders, so every attempted hail appears to end with the driver canceling, and the official can never catch a driver in the act – protecting Uber and its drivers from fines and penalties. [Update: Since this story broke, Uber has vowed to stop using Greyball to target actions by local regulators.]
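Conceptually, the scheme reporters described could be sketched like this. The code below is purely illustrative: the flagging heuristics, names (`SUSPECTED_REGULATORS`, `is_greyballed`), and data structures are invented for this post and are not Uber’s implementation. It only models the observable behavior – a flagged account sees a map full of fake cars and can never complete a hail.

```python
# Purely illustrative sketch of the "Greyball" behavior described above;
# the account-flagging logic and data structures here are invented
# for illustration, not Uber's actual code.

import random

# Hypothetical: accounts flagged by heuristics (government buildings,
# agency-linked credit cards, app usage patterns, etc.)
SUSPECTED_REGULATORS = {"rider_4411"}

REAL_NEARBY_DRIVERS = ["driver_17", "driver_52", "driver_88"]

def is_greyballed(rider_id: str) -> bool:
    return rider_id in SUSPECTED_REGULATORS

def map_view(rider_id: str) -> list[dict]:
    """Return the car icons this rider sees in the app."""
    if is_greyballed(rider_id):
        # Show convincing ghost cars at plausible ETAs; none are real,
        # so no driver ever actually accepts the hail.
        return [{"car_id": f"ghost_{i}", "eta_min": random.randint(2, 8), "real": False}
                for i in range(6)]
    return [{"car_id": d, "eta_min": random.randint(2, 8), "real": True}
            for d in REAL_NEARBY_DRIVERS]

def request_ride(rider_id: str):
    if is_greyballed(rider_id):
        return None  # surfaces in the app as driver after driver "canceling"
    return REAL_NEARBY_DRIVERS[0]

print(map_view("rider_4411"))      # a map full of cars...
print(request_ride("rider_4411"))  # ...but no ride ever arrives (None)
```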
If Uber can fake data in its app to evade prosecution or penalties, what else might it fake data for? Considering the ethical lapses reportedly on display at Uber in recent years (including a work culture, led by CEO Travis Kalanick, accused of condoning sexual harassment and encouraging winning at all costs, laws be damned), it is not far-fetched to imagine other “winning” uses of fake app data.
In a budding market where ridesharing is not yet commonplace and Uber doesn’t yet have many drivers, showing fake cars to users might deter competitors like Lyft from trying to enter that market. In a similar vein, showing fake cars in a more developed market might convince riders to always check Uber first, instead of Lyft, because Uber seems to have much “better coverage.” The revenue implications of such practices would be enormous, and these two ideas only scratch the surface of what is possible.
There are also ethical considerations. What if a “Greyballed” rider, legitimately expecting to be picked up in a sketchy part of town, watches driver after driver seemingly “cancel” and is mugged while waiting for a ride that will never come? Win at all costs, Uber might say – one less rider to worry about.
Real Concerns
Ridesharing and mapping apps can have an incredibly positive impact on people’s lives. They reduce travel time, lower transportation costs, increase carpooling and create a whole economy of part-time drivers, while connecting people in communities across the world. But they are for-profit businesses operating in a cutthroat environment. These businesses ultimately exist to maximize profits. Their goals may run entirely in opposition to the goals of individuals and cities, whose mandates include things like public safety and preventing excessive traffic and noise in residential streets.
Do not blindly trust the data in your apps. The information could be biased, hacked, or even deliberately faked. Something to think about the next time Waze recommends a detour from your usual route or Uber reassures you there are plenty of drivers nearby (“just be patient”).