The IAB/ABC International Spiders & Bots List helps filter known non-human activity that can significantly inflate ad impressions and site-traffic counts.
Effective use of the data leads to more transparent and accurate measurement for you and your clients.
Implementation of the List is required in the UK by the ABC & JICWEBS 'Web Reporting Standards', and in the US by the IAB's 'Ad Impression Measurement Guidelines' and the MRC's 'Invalid Traffic Detection and Filtration Guidelines'.
To help exclude known robotic traffic, the IAB/ABC International Spiders & Bots List contains User-Agent strings previously found to generate material levels of robotic activity.
Media Owners may wish to apply a more rigorous robot-exclusion process by adopting the dual-pass approach recommended by the IAB: first check that the User-Agent matches the include (valid-browser) list, then exclude it if it also matches the exclude list.
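The dual-pass check described above can be sketched as follows. The pattern tokens here are hypothetical stand-ins; the real List entries are licensed from IAB/ABC and use their own matching rules, so this is only an illustration of the two-pass logic.

```python
# Hypothetical sample tokens -- NOT the real IAB/ABC List entries.
INCLUDE_TOKENS = ["mozilla", "opera"]            # pass 1: valid-browser include list
EXCLUDE_TOKENS = ["bot", "spider", "crawler"]    # pass 2: known-robot exclude list

def is_valid_browser(user_agent: str) -> bool:
    """Dual-pass check: the User-Agent must match the include list
    AND must not match the exclude list to count as human traffic."""
    ua = user_agent.lower()
    # Pass 1: the UA must look like a known valid browser.
    if not any(token in ua for token in INCLUDE_TOKENS):
        return False
    # Pass 2: exclude it anyway if it matches a known robot pattern.
    if any(token in ua for token in EXCLUDE_TOKENS):
        return False
    return True
```

With these sample tokens, a generic tool UA such as `curl/7.68.0` fails pass 1, while a crawler that spoofs a browser prefix (e.g. `Mozilla/5.0 (compatible; Googlebot/2.1)`) passes pass 1 but is caught by pass 2.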
ABC also publishes a list of IP addresses previously found to generate material levels of invalid robotic activity. These addresses typically represent activity from external site-monitoring tools and from certain robots that use generic User-Agents.
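IP-based exclusion of this kind might be implemented as below. The networks shown are reserved documentation ranges used purely as placeholders; the real entries come from the ABC-published list.

```python
import ipaddress

# Placeholder entries from reserved documentation ranges -- NOT the real ABC list.
EXCLUDED_NETWORKS = [
    ipaddress.ip_network("192.0.2.0/24"),      # stand-in for a site-monitoring tool's range
    ipaddress.ip_network("198.51.100.17/32"),  # stand-in for a single robotic host
]

def is_excluded_ip(addr: str) -> bool:
    """Return True if the request's source IP falls within any excluded network."""
    ip = ipaddress.ip_address(addr)
    return any(ip in network for network in EXCLUDED_NETWORKS)
```

Matching on CIDR networks rather than exact addresses lets a single list entry cover a monitoring tool's whole address block.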