
AI startup Anthropic is accused of bypassing anti-scraping rules

Websites accuse AI startup Anthropic of bypassing anti-scraping protocols, causing disruptions and sparking debates over compliance and licensing.

In recent news, AI startup Anthropic, known for developing the Claude family of large language models, has been accused by multiple websites of disregarding their anti-scraping protocols. Freelancer and iFixit have raised concerns over Anthropic’s alleged behaviour, claiming that the company’s web crawler has been excessively active on their sites.

Freelancer’s complaints

Matt Barrie, CEO of Freelancer, has stated that Anthropic’s ClaudeBot is “the most aggressive scraper by far.” Barrie said the crawler visited Freelancer’s website 3.5 million times within four hours, causing significant disruption. That traffic volume is reportedly “about five times the volume of the number two” AI crawler. Barrie noted that the aggressive scraping has hurt the site’s performance and revenue. After initially trying to refuse the bot’s access requests, Freelancer ultimately blocked Anthropic’s crawler entirely to prevent further issues.

iFixit’s experience

Kyle Wiens, CEO of iFixit, echoed similar concerns. Wiens said on social media platform X (formerly Twitter) that Anthropic’s bot hit iFixit’s servers one million times within 24 hours. The volume of requests put considerable strain on iFixit’s infrastructure, triggering the team’s high-traffic alerts and waking developers at 3 AM. The situation improved only after iFixit specifically disallowed Anthropic’s bot in its robots.txt file.
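
For illustration, a robots.txt directive that shuts out a single crawler is only a couple of lines. The sketch below assumes the crawler identifies itself with the user-agent string “ClaudeBot” mentioned above; the exact rules iFixit added have not been reproduced here.

User-agent: ClaudeBot
Disallow: /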

This isn’t the first time an AI company has been accused of ignoring the Robots Exclusion Protocol, or robots.txt. Back in June, Wired reported that AI firm Perplexity had been crawling its website despite the presence of a robots.txt file, which instructs web crawlers on which pages they can and cannot access. Adherence to robots.txt is voluntary, and it is precisely the badly behaved bots that tend to ignore it. After Wired’s report, startup TollBit revealed that other AI firms, including OpenAI and Anthropic, have also bypassed robots.txt signals.
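
Because compliance is voluntary, it falls to the crawler operator to check these rules before fetching a page. A minimal sketch of that check, using Python’s standard urllib.robotparser module, is shown below; the domain and user-agent are placeholders, not taken from any of the companies involved.

from urllib import robotparser

# Load and parse the site's robots.txt (example.com is a placeholder domain)
parser = robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()

# A well-behaved crawler asks before fetching each URL
user_agent = "ExampleBot"  # hypothetical crawler name
url = "https://example.com/guides/some-page"
if parser.can_fetch(user_agent, url):
    print("Allowed to fetch", url)
else:
    print("robots.txt disallows", url)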

Anthropic’s response and ongoing issues

Anthropic has responded to these accusations, telling The Information that it respects robots.txt and that its crawler “respected that signal when iFixit implemented it.” The company said it aims to minimise disruption by being thoughtful about how quickly it crawls the same domains, and that it is investigating the issue to ensure compliance.

AI firms frequently use web crawlers to collect content to train their generative AI technologies. However, this practice has led to multiple lawsuits from publishers accusing these firms of copyright infringement. Companies like OpenAI have started forming partnerships with content providers to mitigate the risk of further legal action. OpenAI’s content partners include News Corp., Vox Media, the Financial Times, and Reddit.

Wiens from iFixit is willing to discuss a potential licensing agreement with Anthropic, suggesting that a formal deal could benefit both parties. This approach could pave the way for a more collaborative relationship between content providers and AI developers, reducing the friction caused by unauthorised scraping activities.
