Online price discrimination

How zip codes are used for price discrimination.

Online pricing policies are generally not transparent to customers, and are based on parameters that are completely unknown to us. This opens up a range of opportunities for online price discrimination. Zip codes and users’ location data are commonly used to this end, and this has been demonstrated empirically. In this article I review price discrimination cases in different areas, after which you may realize that you are also a victim.

How is users’ location data used to make discriminatory decisions?

Online pricing policies are generally not transparent to customers, and are based on parameters that are completely unknown. This opens up several opportunities for so-called price differentiation and price discrimination.

But, let’s clarify certain terms that can lead to confusion. There is an important difference between price DISCRIMINATION and price DIFFERENTIATION.

Price differentiation describes a strategy of setting the price of a product or service based on the needs of a potential customer; it does not depend on the personal characteristics of that customer.

However, in price discrimination, the price is determined based on the personal attributes of a potential customer, such as location, financial status, possessions, gender or behavior.

From a technical point of view, an online platform can use many kinds of techniques to identify a user, and this identification is the starting point for price discrimination.

In practice, browser fingerprints provide more information about a client than cookie-based methods do, including software attributes such as the user agent in use or the installed plugins.
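As a rough illustration of the idea (not any particular platform’s implementation), a fingerprint can be assembled by hashing whatever attributes a site can observe about the client; the attribute names below are assumptions chosen for the example.

```python
import hashlib
import json

def browser_fingerprint(attributes: dict) -> str:
    """Combine client-visible attributes into a single stable identifier."""
    # Serialize deterministically so the same client always yields the same hash.
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical visitor: the attributes are whatever the page can read
# (user agent, language, screen size, time zone, plugins...).
visitor = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    "language": "en-US",
    "screen": "1920x1080",
    "timezone": "UTC-5",
    "plugins": ["PDF Viewer", "Widevine"],
}
print(browser_fingerprint(visitor)[:16])  # a short, stable identifier
```

Two visitors whose setups differ in any attribute end up with different identifiers, without any cookie being stored.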

But there is nothing better than proving facts scientifically. In the paper “An Empirical Study on Online Price Differentiation”, real-world browser fingerprints were used to simulate different systems and analyze the corresponding price changes.

To achieve this, the authors implemented an automated price scanner capable of disguising itself as an arbitrary system by reusing real-world fingerprints, and they looked for price differences related to (a rough sketch of such a scanner follows the list):

1. User location,

2. Specific systems represented by their fingerprints, and

3. Unique features of fingerprints.
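Conceptually, such a scanner repeats the same request while presenting different fingerprints and compares the prices it gets back. The sketch below is only a minimal illustration under that assumption: the URL is a placeholder and the price-extraction step is site-specific, so it is not the authors’ actual tool.

```python
import requests

# Placeholder URL: replace with a page you are allowed to test.
TARGET_URL = "https://example-booking-site.test/room/123"

# Two simulated systems, expressed as request headers derived from fingerprints.
PROFILES = {
    "windows_desktop_us": {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "Accept-Language": "en-US,en;q=0.9",
    },
    "iphone_germany": {
        "User-Agent": "Mozilla/5.0 (iPhone; CPU iPhone OS 16_0 like Mac OS X)",
        "Accept-Language": "de-DE,de;q=0.9",
    },
}

def fetch_page(headers: dict) -> str:
    """Request the same product page while pretending to be a given system."""
    response = requests.get(TARGET_URL, headers=headers, timeout=10)
    return response.text  # a real scanner would parse the price out of this

for name, headers in PROFILES.items():
    page = fetch_page(headers)
    print(name, len(page))  # compare the extracted prices across profiles
```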

In this empirical study, several accommodation booking websites and a car rental platform were examined to identify which parameters affect the price of an asset.

The results showed the existence of price discrimination based on location, while price changes based on a system’s fingerprint were found only in isolated cases and did not reveal systematic discrimination.

Another paper, “Proxy Discrimination in Data-Driven Systems”, explains several examples of models (decision trees) used by a bank to accept loan applications.

The bank has access to race, zip code and the customer’s level of interest in the loan. Zip codes w1 and w2 are predominantly white, while zip codes b1 and b2 are predominantly black. Interest in the loan (high or low) is independent of race.

[Figure: the paper’s example decision trees a) to d) for loan decisions]

In example a), the bank explicitly uses race to assess loan eligibility. This explicit use of protected information can be discovered by existing black-box experimentation methods that establish causal effects between inputs and outputs.

In example b), the applicants’ zip code is indicative of their race. Therefore, the bank can use the zip code instead of race to assess loan eligibility.

Example b) does not use race directly; it infers it through an association (with the zip code) and then uses it. Existing methods can detect such associations between protected classes and outcomes in observational data.

Example d), the masked use of a proxy, is a more insidious version of example b). To mask the association between the outcome and race, the bank offers loans not only to the white population, but also to customers who express low interest in a loan, people who would be unlikely to accept a loan if offered one. Figure d) is an example of such an algorithm.

While there is no association between race and outcome in either example c) or example d), there is a key difference between them. In example d) there is an intermediate computation based on zip codes that predicts race, and this predictor is used to make the decision; it is therefore a case of proxy use.
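To make the proxy idea concrete, here is a toy sketch of the three behaviours just described; it is my own illustration of the concept, not code from the paper, and it uses the invented zip codes above.

```python
# Toy models for the loan examples. Zip codes "w1"/"w2" stand for
# predominantly white areas, "b1"/"b2" for predominantly black areas.

def explicit_use(race: str, interest: str) -> bool:
    # Example a): the protected attribute is used directly.
    return race == "white"

def proxy_use(zip_code: str, interest: str) -> bool:
    # Example b): race is never read, but the zip code predicts it,
    # so the decision ends up equivalent to explicit_use.
    inferred_race = "white" if zip_code in {"w1", "w2"} else "black"
    return inferred_race == "white"

def masked_proxy_use(zip_code: str, interest: str) -> bool:
    # Example d): also approve low-interest applicants (who rarely accept),
    # which removes the statistical association between race and outcome
    # while the zip-code-based race predictor still drives the decision.
    inferred_race = "white" if zip_code in {"w1", "w2"} else "black"
    return inferred_race == "white" or interest == "low"

print(proxy_use("b1", "high"))        # False: denied through the proxy
print(masked_proxy_use("b1", "low"))  # True: offered, but unlikely to be accepted
```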

To maximize profits, sellers like to engage in price discrimination: setting higher prices for consumers who are willing to pay more and lower prices for consumers who are willing to pay less.

Driven by big data, algorithmic price discrimination can segment the population of potential customers into increasingly thin subcategories, each with a different price.

In some cases, sellers can even set personalized prices, moving down the demand curve and charging a different price to each individual consumer.
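As a simple illustration of this segmentation (the tiers and prices are invented for the example), a seller might map an estimated willingness to pay to a price:

```python
# Invented example: bucket customers into segments and attach a different
# price to each. In the limit of one segment per customer, this becomes
# fully personalized pricing.
def price_for(estimated_wtp: float) -> float:
    tiers = [
        (50.0, 39.0),    # low estimated willingness to pay -> low price
        (100.0, 79.0),   # mid tier
        (200.0, 149.0),  # high tier
    ]
    for wtp_ceiling, price in tiers:
        if estimated_wtp <= wtp_ceiling:
            return price
    # Above all tiers: charge just under the estimate itself,
    # i.e. effectively a personalized price.
    return round(estimated_wtp * 0.95, 2)

for wtp in (40, 90, 180, 400):
    print(wtp, "->", price_for(wtp))
```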

B&Q, a British multinational, tested digital price tags in its physical stores that interacted with customers’ phones and adjusted the displayed price based on loyalty-card data and the customer’s spending habits.

Price discrimination is based on the ability to identify the willingness to pay (WTP) of the consumer. In fact, this data is essential for any discussion about price discrimination.

A consumer’s willingness to pay is a function of preferences and of perceptions, which may be mistaken. Think about signing up at the gym: many clients falsely believe that they will attend at least once a week and overestimate the benefit of membership. Therefore, they are willing to pay more.

Large online platforms use big data and sophisticated algorithms to identify this willingness to pay and set a personalized price equal to it, even when it rests on a misperception.

When the willingness to pay of the consumer reflects both preferences and misperceptions that inflate demand, price discrimination harms consumers even more and can also reduce efficiency.

The damage to consumers increases: they end up paying a price equal to their perceived benefit, which exceeds the benefit they actually receive.
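A quick worked example with invented numbers makes the harm concrete:

```python
# Invented numbers for the gym example above.
perceived_benefit = 60.0  # what the member believes a month of gym is worth
actual_benefit = 25.0     # what it turns out to be worth, given real attendance

# Personalized price set at the (perceived) willingness to pay:
personalized_price = perceived_benefit
surplus_personalized = actual_benefit - personalized_price  # -35.0: a loss

# A uniform market price, for comparison:
uniform_price = 35.0
surplus_uniform = actual_benefit - uniform_price            # -10.0: a smaller loss

print(surplus_personalized, surplus_uniform)
```

The misperception already hurts the consumer, but pricing exactly at the inflated willingness to pay transfers the entire perceived surplus to the seller and deepens the loss.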

Let’s go back to the buyer’s location. Price discrimination is based on various types of data, including the consumer’s location, the time of day, the characteristics of the consumer’s computer (such as the operating system and the browser), and their purchase history.

In the paper “Measuring Price Discrimination and Steering on E-commerce Web Sites” the authors demonstrated that sellers can collect this information themselves, or buy it from data brokers.

The authors analyzed real-world data from 300 consumers who visited 16 e-commerce websites and found evidence of personalization at four general retailers and five travel websites, including cases where sites altered prices by hundreds of dollars.

Evidence suggests that a consumer’s geographic location affects sellers’ pricing decisions. For example, Uber collects rich geographic data about its customers and uses it to discriminate on price.

In addition, it was discovered that Amazon, Staples and the Steam video game store vary prices by up to 166% depending on geographic location.


Source: “Detecting price and search discrimination on the Internet” https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.352.3188&rep=rep1&type=pdf

According to reports, Uber uses time, in addition to location, to discriminate on price, calculating a passenger’s willingness to pay for a particular route at a certain time of day and adjusting prices accordingly.

Not all prices are the same. Visitors to the Home Depot website may assume that they get the same treatment as everyone else, but the reality is that the retailer charges higher or lower prices based on each individual visitor’s zip code.

Personalized prices are more likely to occur in markets where consumers have no direct information about how much other consumers pay, and in markets where differential pricing is combined with heterogeneous products, so that comparisons between consumers become difficult.

A better-known tactic, used by airlines, alters fares based on information such as the time of day, the day of the week and the traveler’s zip code.

The same-day booking app HotelTonight recently introduced two features that show discounted rates to users based on their location. How is this possible? The moment you land on an online seller’s domain, the company can see everything from your browsing history to your zip code.

This technology allows providers to track users across multiple browsers. It is not surprising, then, that Amazon can update the prices it shows each customer every 10 minutes.

We live in the era of algorithms. Increasingly, decisions that affect our lives are made by mathematical models: where we go to high school or university, whether we get a loan, what we pay when buying a car, whether we get a job, how much we pay for health insurance. In theory, this should lead to greater fairness: everyone is judged according to the same rules, and bias is eliminated.

But, as Cathy O’Neil reveals in her book “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy”, algorithmic models are opaque, unregulated and incontestable, even when they are wrong.

The most worrying thing is that they reinforce discrimination. In her book, O’Neil explains that if a poor student cannot get a loan because a lending model considers him too risky (a decision driven by his zip code), then the kind of education that could lift him out of poverty is cut off from him.

And there is a vicious and discriminatory spiral that prevents an entire disadvantaged social class from getting out of the hole.

“The models catapult the lucky ones and punish the oppressed, creating a toxic cocktail for democracy. Welcome to the dark side of Big Data.”

The next time you go to a store and the cashier asks for your zip code, tell them you refuse to give that information. If they get annoyed, or tell you they cannot complete the sale without that data (which they will cross-reference with your debit/credit card, name, purchase amount and the products or services bought in order to build a profile), tell them you want to speak with their Data Protection Officer and assert your right not to hand over certain data.
