IAB/ABC International Spiders & Bots List
The IAB/ABC International Spiders & Robots list helps filter known, non-human activity that can significantly inflate ad impression and site traffic counts. Effective use of this data can lead to more transparent and accurate measurement for you and your clients.
Implementation of this list is recommended in the UK by the ABC & JICWEBS ‘Web and AV Reporting Standards’, and is a requirement in the US under the IAB’s ‘Ad Impression Measurement Guidelines’ and the MRC’s ‘Invalid Traffic Detection and Filtration Guidelines’.
- Best practice guidelines
- Technical support
- Industry-leading spider & bot detection and filtering
The IAB/ABC International Spiders & Robots List
To help exclude known robotic traffic, the IAB/ABC International Spiders & Bots List contains User-Agent strings previously found to generate material levels of robotic activity.
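As a minimal sketch of the single-pass approach, a hit can be discarded whenever its User-Agent matches an entry from the exclude list. The file format and matching rule below are assumptions for illustration (one pattern per line, matched case-insensitively as a substring); the real list's format and matching rules are specified in the documentation supplied with a subscription.

```python
# Single-pass exclusion sketch. Assumes the exclude list is one
# substring pattern per line; '#' lines are treated as comments.
# These are assumptions for illustration, not the real file format.

def load_patterns(lines):
    """Parse an iterable of lines into lowercase match patterns."""
    return [line.strip().lower() for line in lines
            if line.strip() and not line.strip().startswith("#")]

def is_robot(user_agent, exclude_patterns):
    """Return True if the User-Agent matches any exclude pattern."""
    ua = user_agent.lower()
    return any(pattern in ua for pattern in exclude_patterns)
```

For example, with an illustrative pattern list such as `["googlebot", "bingbot"]`, a hit from `"Mozilla/5.0 (compatible; Googlebot/2.1)"` would be excluded, while an ordinary browser User-Agent would be counted.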
The IAB/ABC International list of valid browsers
Media Owners may wish to use a more rigorous robot exclusion process by adopting the dual-pass approach recommended by the IAB. This approach first checks that the User-Agent matches the include list and then excludes it if it matches the exclude list.
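The dual-pass logic described above can be sketched as follows. The pattern lists and the substring matching rule are illustrative assumptions, not the real list contents or matching specification:

```python
# Dual-pass sketch: a hit is counted only if its User-Agent matches
# the valid-browser include list AND does not match the robots
# exclude list. Substring matching is an assumption for illustration.

def is_countable(user_agent, include_patterns, exclude_patterns):
    ua = user_agent.lower()
    # Pass 1: the User-Agent must match a known valid-browser pattern.
    if not any(pattern in ua for pattern in include_patterns):
        return False
    # Pass 2: it must not match a known robot pattern.
    return not any(pattern in ua for pattern in exclude_patterns)
```

The advantage over single-pass filtering is that traffic with unrecognised User-Agents (for example, a new robot not yet on the exclude list) fails the first pass and is never counted.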
The ABC IP Address Exclusion List
ABC also publishes a list of IP addresses that have previously been found to make material levels of invalid robotic activity. These IP addresses typically represent activity from external site monitoring tools, and from certain robots which use generic User-Agents.
The IAB US, working with our sister company, the Alliance for Audited Media (AAM), performs the evaluation and management services associated with maintaining and publishing the industry Spider and Robot list.
A Spider & Robot Policy Board oversees and approves modifications to the list, which is updated monthly (on or before the 25th of each month) to reflect changes that are brought to the attention of ABC UK, AAM and the Policy Board.
Spiders & Robots Policy Board (as of 1/1/2019)
- Graeme Halls, ABC (UK)
- Sophie Wallace, ABC (UK)
- Martin Liljenback, Adobe
- Steve Guenther, Alliance for Audited Media
- Todd Martens, Alliance for Audited Media
- Richard Thurman, AOL
- Jeff Gilbert, Conversant
- Albert Roux, Criteo
- Michael Ying, Criteo
- Gabriel Burete, Extreme Reach
- Jamie Polster, Extreme Reach
- Neha Bensal, Google
- Per Bjorke, Google
- Sam Tingleff, IAB Tech Lab
- Ankit Patel, Microsoft
- Jaelene Price, Microsoft
- Saif Jafri, Yahoo
The IAB/ABC International Spiders & Robots List is made up of two text files, both of which are lists of User-Agents. There are two ways of implementing these lists: the single-pass method (excluding robots only), or the dual-pass method (including valid browsers, then excluding remaining robots), which is recommended by the IAB US.
The ABC IP Address Exclusion List typically represents activity from external site monitoring tools and from certain robots which use generic User-Agents. The current IP address exclusion list is always available in three formats: CIDR, plain text and Perl-style regular expressions.
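Consuming the CIDR-format file is straightforward with standard networking libraries. The sketch below uses Python's `ipaddress` module; the ranges shown are documentation-reserved examples, not entries from the real list:

```python
import ipaddress

# CIDR exclusion sketch: parse CIDR lines and test whether a client
# IP falls inside any excluded range. The ranges used in the usage
# example are illustrative only, not taken from the real list.

def load_cidrs(lines):
    """Parse an iterable of CIDR strings into network objects."""
    return [ipaddress.ip_network(line.strip()) for line in lines
            if line.strip()]

def is_excluded(ip, networks):
    """Return True if the IP address falls within any excluded network."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in networks)
```

For example, with the illustrative ranges `192.0.2.0/24` and `198.51.100.17/32`, the address `192.0.2.45` would be excluded while `203.0.113.9` would not. CIDR containment checks like this are typically faster and less error-prone than applying the regex-format list line by line.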
Both the Bots file (approx. 40kb) and the IP Exclusion file (approx. 3kb) download in a text file format. Access can be direct through our website, or we can set up access via FTP and/or SFTP.
Subscriptions start at £3,000 per annum (pro rata) and, while members get discounts, you don’t have to be a member of ABC to subscribe to the list. Click here to enquire.