Trust, transparency, choice and consequence

Seemingly innocuous

I notice a pattern: it starts with a seemingly innocuous event, like using a loyalty card or visiting a website, and what follows is a narrative about why these particular things aren’t what they seem, along with a list of possible consequences beyond what’s on display.

The question was: ‘What would happen if you overheard the person in front of you in line get a better rate at a hotel?’ Most people answered with a sense of resignation or mild curiosity; the tone was generally conciliatory, far from indignant or something worth writing about.

Digital red lining

Historically, banks in the U.S. used location information to overtly discriminate against African Americans, denying them loans or offering unfair and unreasonable rates [1]. Banks would draw red lines on a map around areas where racial minorities predominantly lived, and they were less likely to offer people in those areas loans. The proxy for actual discrimination was the postal code.

In an online environment, similar digital red lines can be drawn, and proxies for overt discrimination can be buried in data sets. What appears on the surface to be a request for some personal data can therefore enable actual discrimination. Price discrimination, such as offering a higher credit rate, becomes a mechanism for real discrimination when that decision is based on something like gender, racial identity or socioeconomic status. If those decisions are buried behind a proxy association, the business determines who it does or doesn’t want, and the consequence is discriminatory. With data mining and the data sets no doubt available to many companies, there is plenty of opportunity to establish relationships between something we might think innocuous, like the music we like, and something we’d like to keep private, like socioeconomic status.
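The proxy mechanism above can be sketched in a few lines of code. This is a hypothetical illustration, not any real lender’s logic: the group labels, postal areas, rates and correlation strength are all invented assumptions. The point is that a pricing rule that never reads the protected attribute can still produce systematically different outcomes for the two groups, because the attribute it does read is correlated with the protected one.

```python
# Hypothetical sketch: a pricing rule that never sees the protected
# attribute can still discriminate through a correlated proxy
# (here, postal area -- echoing the historical red-lining pattern).
# All names and numbers are illustrative assumptions.
import random

random.seed(42)

def make_applicant():
    # Assumption: group membership correlates strongly with postal area.
    group = random.choice(["A", "B"])
    if group == "A":
        postal_area = random.choices(["north", "south"], weights=[9, 1])[0]
    else:
        postal_area = random.choices(["north", "south"], weights=[1, 9])[0]
    return {"group": group, "postal_area": postal_area}

def offered_rate(applicant):
    # The rule uses only the postal area -- no protected attribute.
    return 4.0 if applicant["postal_area"] == "north" else 7.5

applicants = [make_applicant() for _ in range(10_000)]

def mean_rate(group):
    rates = [offered_rate(a) for a in applicants if a["group"] == group]
    return sum(rates) / len(rates)

print(f"mean rate, group A: {mean_rate('A'):.2f}%")
print(f"mean rate, group B: {mean_rate('B'):.2f}%")
# Despite never reading 'group', the rule yields systematically
# higher rates for group B than for group A.
```

Running the sketch shows a clear gap between the two groups’ average rates, even though the decision function only ever looks at the proxy.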

Does the general public seem to accept revenue management?

Revenue management is a term that describes how a company can charge two different prices for the same product, depending on how much it knows about you. If something indicates you would be willing to pay more, there is an economic incentive to charge you more. For instance, if it was determined that Mac users were generally willing to pay more for things, then a hotel website visited from a Mac might present higher rates.
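The Mac-user example can be made concrete with a short sketch. Everything here is an assumption for illustration: the base rate, the 15% markup and the idea of matching the string “Macintosh” in the browser’s user-agent header are invented details, not a description of any real site’s implementation.

```python
# Hypothetical sketch of user-agent-based price steering, as in the
# Mac-user example above. The markup and detection logic are assumed.

BASE_NIGHTLY_RATE = 120.00  # assumed base price in dollars

def quoted_rate(user_agent: str) -> float:
    # Assumption: the site treats a Macintosh user agent as a signal
    # of higher willingness to pay and applies a 15% markup.
    if "Macintosh" in user_agent:
        return round(BASE_NIGHTLY_RATE * 1.15, 2)
    return BASE_NIGHTLY_RATE

mac_ua = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)"
win_ua = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

print(quoted_rate(mac_ua))  # 138.0
print(quoted_rate(win_ua))  # 120.0
```

The same product, two prices: the only input that changes is a signal the visitor never chose to disclose as pricing information.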

McMahon-Beattie describes revenue management as something that happens to customers rather than for them, which implies that a question about the public’s ‘acceptance’ of it rests on an assumption that misrepresents reality [2]. Acceptance requires that someone be given a choice, or at least the ability to express consent, and current implementations of revenue management do anything but conjure up images of consumer choice, let alone of asking. Acquiescence better describes the general public’s relationship to revenue management. Ignorance and misunderstanding have also been used to describe that acquiescence [3]. Turow et al. use the phrase ‘open to financial exploitation’ to indicate the degree to which the general public ‘accepts’ revenue management.

Against this pricing system

A case against revenue management can be framed by concerns around trust, fairness and justice [4]. If the conditions around price discrimination are not clearly communicated, or the process by which one can obtain a discount is not apparent, then it is more likely to be perceived as an unfair mechanism, which undermines trust between consumer and business.

Further arguments against this pricing system can be found in discussions of antitrust and consumer protection law, human rights and privacy protection [5]. In a section titled ‘It is Already Illegal (Or Should Be) Under Antitrust Law’, Miller references the anti-competitive effects of loyalty programs and bundling, and notes corporate legal liability (in American law) where the circumstances around revenue management raise concerns of monopolization. The recently introduced GDPR (General Data Protection Regulation) contains provisions on online price discrimination, and significant fines can be levied against corporations that do not respect a consumer’s right not to be subjected to certain discriminatory pricing decisions [6]. Finally, the practice of revenue management often involves automated decisions based on the collection and use of personal information. There are many reasons people desire anonymity on the internet, one of them being to avoid the reduction in personal welfare that comes from the ubiquity of surveillance capitalism.

Another way

Revenue management requires information from both the consumer and the business, and most problems arise either from a customer not having enough information about how a business allocates discounts and to whom, or from the business having too much information about the customer for the practice to be considered an improvement to social welfare. The GDPR, in flipping the dynamic from treating the protection of personal information as a luxury to treating it as a right, has also created a compelling reason to re-imagine (from a technical perspective) how personal information can be put back into the hands of the people it belongs to. Tim Berners-Lee’s Solid project looks like a promising endeavour. Anything that looks to put users back in control of their data would improve current revenue management practices.


  1. A. Taylor and J. Sadowski, “Digital Red Lining,” Nation, vol. 300, no. 24, pp. 24–27, Jun. 2015.
  2. U. McMahon-Beattie, “Trust, fairness and justice in revenue management: Creating value for the consumer,” J. Revenue Pricing Manag., vol. 10, no. 1, pp. 44–46, Jan. 2011.
  3. J. Turow, L. Feldman, and K. Meltzer, “Open to Exploitation: America’s Shoppers Online and Offline,” Rep. Annenberg Public Policy Cent., Univ. Pa., Jun. 2005.
  4. U. McMahon-Beattie, “Trust, fairness and justice in revenue management: Creating value for the consumer,” J. Revenue Pricing Manag., vol. 10, no. 1, pp. 44–46, Jan. 2011.
  5. A. A. Miller, “What Do We Worry about When We Worry about Price Discrimination – The Law and Ethics of Using Personal Information for Pricing,” J. Technol. Law Policy, no. 1, p. 41, 2014.
  6. R. Steppe, “Online price discrimination and personal data: A General Data Protection Regulation perspective,” Comput. Law Secur. Rev., vol. 33, pp. 768–785, Dec. 2017.


I am a developer with an 18-year career in the information technology sector. Over the last half-decade, I’ve dedicated myself to advancing my expertise in intelligent information systems through a Master of Science in Information Systems (MScIS) degree. Notably, I recently completed a substantial socio-technical study examining the feasibility of implementing responsible AI (RAI) within the public sector. Prior to my role in the public service, I undertook diverse software development roles as a contractor and team lead, and provided services to post-secondary institutions. My driving passion revolves around the convergence of technology and the law, with a particular focus on the capacity of ethical AI systems to shed light on critical issues.
