This is the property of the Daily Journal Corporation and fully protected by copyright. It is made available only to Daily Journal subscribers for personal or collaborative purposes and may not be distributed, reproduced, modified, stored or transferred without written permission.

Nov. 29, 2023

Antitrust in AI World: The Robots Did It

Barbara A. Reeves

JAMS

Labor and employment, sports and entertainment, health care, trade secrets, business/commercial, insurance

1925 Century Park E Ste 1400
Los Angeles, CA 90067-2715

Phone: (310) 309-6255

Email: breeves@jamsadr.com

Harvard Univ Law School

On Oct. 30, President Biden issued an executive order on the safe, secure and trustworthy development and use of artificial intelligence. It provides eight guiding principles and priorities, focusing on national security, privacy and intellectual property issues, but also sounds an alert about competition-related issues of concern to antitrust enforcers at the Department of Justice (DOJ) and the Federal Trade Commission (FTC), referencing the need “to ensure fair competition in the AI marketplace and to ensure that consumers and workers are protected from harms that may be enabled by the use of AI.”

Antitrust regulators and scholars have been raising concerns for years about algorithmic pricing and whether pricing algorithms can enable pricing collusion. See, e.g., Algorithms and Collusion – Note by the United States, available at https://www.justice.gov/atr/case-document/file/979231/download. The recent advances in generative AI highlight the areas of concern.

Agreements between competitors to raise prices or limit supply are per se illegal under Section 1 of the Sherman Act. 15 U.S.C. § 1. A dominant firm that employs exclusionary or predatory conduct to monopolize, attempt to monopolize, raise prices above those that would be charged in a competitive market, or exclude competition violates Section 2 of the Sherman Act. 15 U.S.C. § 2. Historically, enforcement has focused on collusion by competitors, whether explicit agreement or conscious parallelism, meaning conduct by people at the firms.

Enter artificial intelligence (AI), algorithms, and computers. Can computer-determined pricing be susceptible to coordination, just as human-determined pricing can?

What if businesses agree to match prices and then leave it to their computer algorithms to monitor and enforce the agreement?

What if competitors in a market adopt a common pricing algorithm, whether by agreement or just because it is highly recommended?

What if competitors unilaterally design a pricing algorithm to react in certain ways to changing market conditions, with the expectation that other competitors are developing and implementing similar algorithms?

What if the competitors unilaterally design algorithms to maximize profit by monitoring supply, demand, costs and other market factors, and then the algorithms, learning through ongoing feedback, independently determine that profit is maximized by raising prices, signaling price changes and retaliating against a competitor’s algorithm that undercuts the supracompetitive pricing?

What if a dominant firm adopts an algorithm to produce exclusionary conduct, such as predatory pricing, inflated pricing and self-preferencing?
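The dynamic raised in the questions above can be illustrated with a deliberately simplified toy model. Nothing below reflects any real firm's algorithm; the pricing rule, cost and price-ceiling figures are all invented for illustration. Each seller applies the same unilateral rule: match a rival who prices higher, probe a small increase when prices are tied, and never price below cost. With no agreement and no communication, the two rules together ratchet prices toward the ceiling.

```python
# Toy simulation of two sellers whose unilateral pricing rules interact.
# All numbers and the rule itself are hypothetical, for illustration only.

COST = 10      # marginal cost (assumed)
CEILING = 30   # highest price the toy market will bear (assumed)
STEP = 1       # size of an upward "probe"

def next_price(mine, rivals):
    """One seller's unilateral rule: no communication with rivals."""
    best_rival = min(rivals)
    if best_rival > mine:
        return min(best_rival, CEILING)   # rival is higher: match upward
    if best_rival == mine:
        return min(mine + STEP, CEILING)  # tied: probe a small increase
    return max(best_rival, COST)          # rival is lower: match down, never below cost

def simulate(p_a, p_b, rounds=50):
    """Sellers update in turn, each observing the other's current price."""
    for _ in range(rounds):
        p_a = next_price(p_a, [p_b])
        p_b = next_price(p_b, [p_a])
    return p_a, p_b

print(simulate(12, 11))  # → (30, 30): both reach the ceiling with no agreement
```

The point of the sketch is that neither rule mentions a rival firm's identity or embodies any agreement; the probe-and-follow interaction alone drives prices to the supracompetitive level, which is precisely the enforcement puzzle the article describes.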

Algorithms and AI have revolutionized the way we make decisions, process information, and forecast the weather. Business models rely on self-learning, or generative, algorithms that learn from experimentation and the data they process to make decisions in nanoseconds. Online platforms in retail, air travel, concert tickets and other areas have been using pricing algorithms for years to adjust prices based on supply and demand, also known as dynamic pricing.
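Dynamic pricing of the kind described above can be reduced to a very small rule. The sketch below is a generic, hypothetical example, not any platform's actual algorithm: the price is nudged up when recent demand outpaces available supply and down when it lags, with a `sensitivity` parameter (invented here) controlling how aggressively the price moves.

```python
# Minimal dynamic-pricing sketch (hypothetical rule, for illustration only):
# nudge the price up when demand exceeds supply, down when it falls short.

def dynamic_price(base_price, units_demanded, units_available, sensitivity=0.05):
    """Return a price adjusted by a simple demand/supply ratio."""
    if units_available == 0:
        raise ValueError("no inventory to price")
    pressure = units_demanded / units_available   # > 1 means demand exceeds supply
    return round(base_price * (1 + sensitivity * (pressure - 1)), 2)

print(dynamic_price(100.0, 150, 100))  # demand 1.5x supply → 102.5
print(dynamic_price(100.0, 50, 100))   # demand 0.5x supply → 97.5
```

A rule this simple reacts only to the firm's own objective inputs; the antitrust questions arise, as the article goes on to explain, when such rules begin reacting to rivals' prices as well.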

Generative AI is now enabling algorithms to take over marketplace roles previously played by humans, not only setting prices, but also responding to prices and distribution practices set in motion by other algorithms in the marketplace. Can algorithms learn to collude? Can they create an environment in which they predict each other’s moves and strategies? And whom can you blame for the results if the programmer of the algorithms employed a neutral, profit-maximizing set of instructions? A competitor’s unilateral efforts to maximize its profits are not, by themselves, illegal.

Antitrust regulators are grappling with these issues. Spokespersons from the Antitrust Division of the DOJ and the FTC have pointed to cases in which pricing algorithms used by competitors can lead to collusion in the marketplace, potentially resulting in higher prices or a reduction in competition, and have called out the need to bring antitrust enforcement in line with market realities. See https://www.justice.gov/opa/speech/principal-deputy-assistant-attorney-general-doha-mekki-antitrust-division-delivers-0. The DOJ and FTC have announced that they are hiring data scientists, computer scientists and economists to help them better understand and detect anticompetitive conduct by algorithms, and that they are developing new guidance on the antitrust risks associated with algorithms.

On Nov. 2, the Federal Trade Commission released new details in its antitrust case against Amazon about Amazon’s secret pricing algorithm, code-named “Project Nessie,” which is alleged to have generated more than $1 billion in extra profits for the company. The FTC and 17 states sued Amazon in September, alleging the company was abusing its position in the marketplace to inflate prices on and off its platform, overcharge sellers and stifle competition. The FTC alleges that Amazon activated the algorithm to predict where it could raise prices and have other online shopping sites follow suit, and to keep the higher prices in place once competitors matched them. Nessie would automatically raise the prices of selected items and then monitor competitors to make sure Amazon was not being undercut.

Antitrust regulators and plaintiffs’ attorneys are studying these developments. The tools exist to “read” algorithms being used by businesses under investigation.

With this background, counsel will need to be prepared to advise clients about potential liability. To what extent will liability be imputed to the person who created the algorithm? To what extent is the algorithm acting as programmed, and to what extent is the algorithm acting on its iterative learning? Programming an algorithm not to fix prices may seem simple, but as the algorithm learns, it may program itself to engage in conduct that resembles collusion or predatory retaliation.

We don’t know exactly how AI and pricing algorithms will evolve, but at this point, counsel should be advising clients on measures to avoid antitrust risk, including monitoring their algorithms to understand how they function and training their IT teams on the antitrust implications of the algorithms they create. Pricing algorithms should be based on objective factors, such as cost, supply and demand, and firms should document that the pricing decisions are made independently, not through cooperation with competitors. Finally, algorithms must be monitored as they change in order to address and mitigate antitrust risks as they occur.
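The documentation measure suggested above can be operationalized in a straightforward way. The following sketch is purely illustrative, with an invented cost-plus pricing formula: the idea is simply that every pricing decision records the objective inputs (cost, supply, demand) that produced it, creating an audit trail a firm could later use to show the decision was made independently.

```python
# Hypothetical audit-trail wrapper (illustration only): record the objective
# inputs behind each pricing decision so independence can be documented later.
import datetime

def price_with_audit(cost, supply, demand, log):
    # Illustrative, invented formula: cost-plus with a supply/demand adjustment.
    price = round(cost * 1.2 + 0.01 * (demand - supply), 2)
    log.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "inputs": {"cost": cost, "supply": supply, "demand": demand},
        "price": price,
        "basis": "unilateral: cost-plus with supply/demand adjustment",
    })
    return price

audit_log = []
print(price_with_audit(cost=50.0, supply=200, demand=350, log=audit_log))  # → 61.5
print(audit_log[-1]["basis"])
```

Whatever the actual pricing logic, the design point is the same: the record of inputs and rationale is generated at decision time, not reconstructed after an investigation begins.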

A company that is alleged to have used an algorithm to engage in anticompetitive conduct may find itself raising the following defense: The robots did it!

Disclaimer: The content is intended for general informational purposes only and should not be construed as legal advice. If you require legal or professional advice, please contact an attorney.

Barbara A. Reeves is a mediator, arbitrator and special master with JAMS. She specializes in commercial cases, sports and entertainment law, intellectual property matters, health care business disputes and employment cases.

