The DataDome marketplace is the first to offer a practical way for data producers and big data companies to enter into mutually beneficial agreements that govern the data harvesting process, and to optimize the process itself.
The DataDome API provides an interface that enables e-commerce platforms, media sites and big data companies to manage which bots can access content, which data they can extract, and under which terms and conditions.
We do understand. It doesn't intuitively seem all that attractive to start paying for data that you currently access and use without compensating the source. But we hope to convince you that using the DataDome marketplace to access the data you need offers a host of benefits, not just another cost of doing business.
Would you like to optimize your engineering resources? Ensure stellar data and service quality? Eliminate legal risks?
We thought so. Let us show you how.
Better resource allocation
The data your bots are collecting is free, but the bots themselves are not. They need continuous development, testing, updating and maintenance. And as bot detection and blocking technologies become increasingly sophisticated, it takes additional development efforts to work around the blockers.
The DataDome API lets you quit the arms race, make better use of your engineering resources, and give your developers more interesting challenges. Instead of performing relatively mundane data harvesting tasks, your teams can use their skills and creativity to optimize your technology and develop value-added features that differentiate your services.
Data quality and reliability
Webmasters and IT security professionals are increasingly aware of the exponential growth in bot traffic to their sites, and they have more and more sophisticated tools at their disposal for protecting their content from bots.
As a growing proportion of sites protect their content from scraping, the data that remains available to feed into your services becomes less representative. And having access to more limited data sets may skew your analyses and jeopardize the quality of your service.
Furthermore, accessing data through the DataDome API ensures that you always have real-time data and eliminates the risk of crises resulting from indexing delays.
Thanks to the DataDome API, you can easily enter and manage agreements with all the data sources you need, and ensure uninterrupted access to reliable and representative data sets.
Precise queries, optimized resources
Current scraping processes typically require you to harvest vast amounts of data in order to extract the precise information you need. The results are excessive granularity, wasted server resources, and limited scalability.
The DataDome API enables you to make very complex queries and extract only the data you need from any data provider. For instance, if all you want to know is the number of comments or social shares a media article got, you don’t have to extract the entire content of the comments or the names of the people who shared it.
Other examples of available queries include the volume of mentions of named entities, update frequency, and author information.
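As an illustration of what such a narrow query might look like, the sketch below builds a request payload that asks only for aggregate counts rather than full page content. The function name, parameter names, and field names (`comment_count`, `share_count`, and so on) are illustrative assumptions, not the actual DataDome API schema; consult the real API documentation for the exact interface.

```python
# Hypothetical sketch of a narrow data query, assuming a JSON-based API.
# All endpoint, parameter, and field names here are illustrative only.

def build_query(source_urls, fields):
    """Build a payload requesting only the listed fields for each source,
    instead of scraping and parsing every full page."""
    return {
        "sources": source_urls,
        "fields": fields,   # e.g. ["comment_count", "share_count"]
        "format": "json",
    }

# Ask only for the number of comments and shares on one article:
# no comment bodies, no names of the people who shared it.
query = build_query(
    ["https://example.com/articles/some-article"],
    ["comment_count", "share_count"],
)
print(query["fields"])  # ['comment_count', 'share_count']
```

The point of the sketch is the contrast with bulk scraping: the response stays small and focused, so server resources on both sides are spent only on the data you actually use.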
Legal protection
Last but not least, entering into commercial agreements with data providers eliminates the risk of legal conundrums and protects your reputation.
Yes, there is currently much legal uncertainty in this domain, but things are moving in favor of data sources. The 2015 Court of Justice of the European Union ruling in Ryanair v PR Aviation established that website operators in the EU can use their website terms and conditions to prohibit content scraping. Should you choose to ignore such clauses and continue scraping, the website owner can bring a breach of contract claim against you.
The DataDome API provides the legal framework that formalizes mutually beneficial agreements between data sources and big data companies. It protects you as a data user against breach of contract claims and other legal risks and expenses.