Amazon Secures Court Order Halting Perplexity AI’s Shopping Agent
A U.S. federal court in San Francisco has temporarily barred Perplexity AI from using its Comet browser-based assistant to make purchases on Amazon, marking one of the first major legal clashes over how far AI shopping agents can go when interacting with third‑party platforms.
U.S. District Judge Maxine Chesney granted Amazon a preliminary injunction on Monday, siding, at least at this early stage, with Amazon’s argument that Perplexity accessed password‑protected Amazon accounts without proper authorization, even when users themselves had given Comet permission to act on their behalf.
This is not a final decision on the merits. A preliminary injunction is essentially a “pause button” the court can press while a case moves forward, used when a judge believes the plaintiff has shown a plausible legal claim and a risk of ongoing harm. The broader, unresolved issue remains squarely in dispute: whether AI agents can log in, browse, and transact on large commercial platforms without those platforms’ explicit consent.
How the Dispute Began
Amazon’s lawsuit, filed in November 2025, accuses Perplexity of violating the federal Computer Fraud and Abuse Act (CFAA) as well as a similar California computer fraud statute. According to Amazon, Perplexity’s Comet tool did not simply help users compare prices or gather product information; it allegedly logged in to users’ Amazon accounts and executed actions in a way Amazon says bypassed its rules and technical protections.
Central to Amazon’s complaint is the claim that Perplexity masked Comet’s automated activity to resemble normal web browsing traffic from Google Chrome. In Amazon’s telling, this disguise allowed the AI agent to slip past systems designed to detect and curb automated or unauthorized access.
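To see what “disguising traffic as Chrome” means in practice, consider the User-Agent header every HTTP client sends. The court filings describe the alleged behavior, not its implementation, so the snippet below is a generic illustration, not Perplexity’s code: a transparent bot announces itself, while a disguised one sends a browser-style string that detection systems treat as ordinary Chrome traffic.

```python
# Illustrative only: how a User-Agent header identifies (or disguises) a client.
# The hostnames and agent names here are invented for the example.
from urllib.request import Request

# A transparent automated client announces itself as a bot:
bot_request = Request(
    "https://example.com/",
    headers={"User-Agent": "ExampleShoppingAgent/1.0 (+https://example.com/bot-info)"},
)

# A client masquerading as an ordinary browser sends a Chrome-style string instead:
disguised_request = Request(
    "https://example.com/",
    headers={
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                      "AppleWebKit/537.36 (KHTML, like Gecko) "
                      "Chrome/120.0.0.0 Safari/537.36"
    },
)

# urllib normalizes stored header names, so "User-agent" retrieves the value:
print(bot_request.get_header("User-agent"))
print(disguised_request.get_header("User-agent"))
```

Real bot-detection systems look at far more than this one header (TLS fingerprints, timing, mouse events), which is why Amazon’s filings describe the alleged evasion as an ongoing cat-and-mouse adaptation rather than a single spoofed string.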
Amazon also argues that Perplexity ignored multiple warnings. The company says it notified Perplexity at least five times, beginning in November 2024, that the AI startup’s practices violated Amazon’s terms of use and should be stopped. When warnings failed, Amazon moved to technical countermeasures.
Escalation After Amazon’s Technical Block
By August 2025, Amazon says it took a more aggressive step: implementing a technical block aimed at preventing Comet from logging in and performing actions on behalf of users. Rather than backing off, Perplexity allegedly modified Comet’s behavior to work around those defenses.
In court filings, Amazon portrays this as deliberate evasion, not an accidental conflict between incompatible systems. That framing is key to the CFAA and state law claims, both of which hinge on whether access to a computer system was “without authorization” or in a way that “exceeds authorized access.”
Perplexity, for its part, has maintained that its service acts only with users’ express consent and operates as a kind of automated assistant doing what users could do themselves through a standard browser. The injunction, however, indicates the judge found Amazon’s interpretation of “authorization” at least plausible enough to justify a temporary halt while the facts are sorted out.
Why the CFAA Matters for AI Agents
The CFAA was written long before consumer AI agents were imaginable, but it has become a central tool in disputes over automated access to websites and platforms. At its core, the law penalizes accessing a computer “without authorization” or in a way that “exceeds authorized access,” language that courts have wrestled with for years.
In the context of AI agents, the law raises tricky questions:
– If a user gives an AI agent permission to log in to their account, does that automatically mean the platform has authorized the agent’s method of access?
– Can a platform’s terms of service or technical barriers convert what users view as permitted help into legally “unauthorized” access?
– Where is the line between acceptable automation (like password managers or browser extensions) and forbidden scraping or bot-driven interaction?
Judge Chesney’s injunction does not definitively answer those questions, but it signals that courts may view platform‑imposed limits and technical blocks as legally meaningful, even when end users want the assistance.
The Core Disagreement: User Consent vs. Platform Control
At the heart of the Amazon-Perplexity clash lies a fundamental tension: who ultimately controls how an online account can be used, the user who owns it or the platform that hosts it?
Perplexity’s position leans heavily on user autonomy. If a customer chooses to let an AI tool log in, search for products, compare deals, and even complete a purchase, the startup argues, that should be treated as the digital equivalent of granting a human assistant access.
Amazon presents a different view. The company’s stance is that it has the right to define and enforce rules governing automated access to its services, regardless of user intent. In Amazon’s telling, when Perplexity allegedly disguised its traffic and circumvented technical defenses, it stepped over a bright legal line, no matter how many customers were on board with the idea.
The injunction suggests that, for now, the court is more persuaded by Amazon’s framing, at least enough to prevent Comet from operating on Amazon while the case continues.
Implications for E‑Commerce and AI Startups
This early court win for Amazon carries implications far beyond a single shopping assistant:
1. AI shopping agents face higher legal risk. Startups building tools that log into user accounts on retail, travel, or banking platforms may now need to assume that crossing a platform’s technical or contractual boundaries can trigger serious litigation.
2. Platforms are likely to tighten their defenses. Seeing Amazon succeed, at least preliminarily, could encourage other major services to update terms of use, boost bot detection, and formalize bans on AI agents operating without explicit platform-level approval.
3. Enterprise partnerships may become the safe path. Rather than building tools that operate from the outside, AI companies may be pushed toward official integrations, APIs, and co‑branded solutions that give platforms more control and visibility.
4. Users’ expectations will be tested. Many consumers may assume that “it’s my account, I can use any tool I want.” This case hints that the law may not always align with that intuition when platforms have clearly declared and enforced boundaries.
A Test Case for the Future of Autonomous Commerce
The Amazon-Perplexity dispute also serves as an early test case for what more fully autonomous commerce might look like. In theory, AI agents could eventually:
– Maintain shopping lists;
– Monitor prices across dozens of merchants;
– Apply coupons and rewards automatically;
– Schedule deliveries to match a user’s calendar; and
– Handle returns or disputes without direct user intervention.
All of that depends on agents being able to move freely across multiple platforms, log in securely, and execute transactions. This lawsuit underscores that such freedom is not simply a technical problem; it is a legal and contractual one.
If courts ultimately side with large platforms, AI agents may become fragmented tools, constrained to officially sanctioned integrations. If, instead, judges recognize broader user rights to delegate access, the landscape might tilt more toward open, cross‑platform automation.
Privacy, Security, and Liability Questions
Beyond authorization, the case also raises thorny questions about privacy and security:
– Who is liable when something goes wrong? If an AI agent makes a mistaken purchase, falls for a scam listing, or misinterprets a user’s instructions, it is not yet clear whether responsibility will rest with the AI provider, the platform, or the user.
– How should credentials be handled? Tools like Comet often require storing or handling users’ login details or tokens. That creates new attack surfaces and potential vulnerabilities that platforms did not design or vet.
– What counts as informed consent? Even if users click “agree” to allow an AI agent to control their account, regulators and courts may question whether they fully grasp the scope of actions that the agent can take and the risks involved.
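One way platforms could address the credential problem is the pattern already familiar from OAuth-style delegation: instead of handing an agent the user’s password, the platform issues a signed token that names the user, the permitted actions, and an expiry. The sketch below is hypothetical, with invented function names and fields; no real Amazon or Perplexity API is implied. It shows why scoped tokens limit damage: an agent authorized to search and compare simply cannot present a valid credential for purchasing.

```python
# Hypothetical sketch of scoped, expiring delegation tokens for an AI agent.
# All names and fields are invented for illustration.
import base64
import hashlib
import hmac
import json
import time

SECRET = b"platform-signing-key"  # held by the platform, never shared with the agent


def issue_agent_token(user_id: str, scopes: list, ttl_seconds: int) -> str:
    """Sign a token naming the user, the allowed actions, and an expiry time."""
    payload = json.dumps({
        "user": user_id,
        "scopes": scopes,  # e.g. allow search/compare but not purchase
        "expires": time.time() + ttl_seconds,
    }).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig


def check_agent_token(token: str, action: str) -> bool:
    """Verify the signature, the expiry, and that the action is within scope."""
    encoded, sig = token.rsplit(".", 1)
    payload = base64.urlsafe_b64decode(encoded)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or forged token
    claims = json.loads(payload)
    return time.time() < claims["expires"] and action in claims["scopes"]


token = issue_agent_token("user-123", ["search", "compare"], ttl_seconds=3600)
print(check_agent_token(token, "search"))    # within scope
print(check_agent_token(token, "purchase"))  # out of scope, rejected
```

A design like this also answers part of the consent question: the scope list is an explicit, machine-checkable record of exactly what the user delegated, rather than a blanket “agree” click.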
Amazon’s claims, focused on masked traffic and bypassed blocks, subtly tap into these concerns, suggesting that Perplexity’s behavior could undermine the safeguards Amazon has put in place to protect accounts and data.
The Business Stakes for Perplexity
For Perplexity AI, the injunction is more than a technical setback. Comet’s ability to actually perform tasks, like completing purchases or modifying orders, has been a core part of the company’s pitch that AI agents should do more than just answer questions.
Losing access to one of the world’s largest online retailers weakens that value proposition and may force the company to rethink how it integrates with major platforms. It also sends a signal to investors and competitors that aggressive, “ask forgiveness later” approaches to automation now carry a higher legal cost.
Perplexity may try to pivot toward official partnerships, more transparent technical architectures, or alternative use cases that do not require deep account access. But as the case progresses, each new court filing will shape how other AI players calibrate their own risk.
What to Watch as the Case Moves Forward
The preliminary injunction is just an opening chapter. Several developments will be important to track:
– Discovery and technical evidence. How, exactly, did Comet log in, route traffic, and respond to Amazon’s blocks? The technical record could clarify where courts will draw the line between clever engineering and unlawful circumvention.
– Interpretation of “authorization.” Subsequent rulings may refine how user consent interacts with platform-imposed restrictions under the CFAA and state laws.
– Settlement vs. precedent. The case could quietly settle, with Perplexity agreeing to new rules or licensing terms. Alternatively, it could move toward a decision that sets a precedent for AI agents across the industry.
– Regulatory interest. Lawmakers and regulators following the rise of consumer AI tools may view this dispute as a signal that clearer statutory guidelines are needed for automated access to online services.
The Emerging Rules of AI‑Driven Shopping
For now, the immediate outcome is straightforward: Perplexity’s Comet assistant is, by court order, blocked from making purchases on Amazon while the lawsuit plays out. The deeper story is that the legal framework governing AI agents is being written in real time through cases like this.
Companies building AI tools that act on users’ behalf will need to:
– Treat platform terms of service as legally consequential, not optional reading.
– Design systems that respect technical barriers instead of trying to sneak past them.
– Be transparent about automation to both users and the platforms they interact with.
– Prepare for a world in which user consent alone may not be enough to justify any form of automated access.
Amazon’s early victory does not close the debate. It does, however, make one thing clear: the era of AI agents quietly slipping into existing web ecosystems without asking permission is coming to an end. The next generation of shopping bots and digital assistants will be shaped as much in courtrooms as in code.
