“We can’t let go of our data like this”

How can media sites turn their automated traffic (scraping bots) into new revenue lines? The topic was debated at a conference organized on December 7 by GESTE, a French consortium of online content and service providers, in partnership with DataDome.

Highly engaged in the topic, Jean-Charles Falloux, Managing Director Digital Media & Tech Innovation at France’s #1 business daily Les Echos, shared insights gained from his experience as an early user of the DataDome technology.

He shared the podium with two representatives from the big data sector: Raphael Labbé, founder of the content distribution platform WizTopic, and Mickaël Réault, founder of Sindup, a competitive intelligence and e-reputation monitoring platform.

The debate highlighted a series of challenges to be addressed by publishers and big data companies alike. Excerpts:

Reduce infrastructure costs

Jean-Charles Falloux: “Crawlers account for around 50% of our traffic, and this traffic generates substantial infrastructure costs. We can no longer afford to have these production costs that don’t correspond to any revenue line.”

Generate new revenue lines

Jean-Charles Falloux: “When it comes to monetizing editorial content, we already have partners and that works well. Here, we’re looking at monetizing a different layer of information. We’re discovering that we can do more than just sell an article through syndication: there’s a wealth of information surrounding our content that is also valuable, and that we need to identify and exploit. It’s not about incurring additional costs, but about generating additional revenue from content that already exists.”

Raphael Labbé, founder of WizTopic

Raphael Labbé: “We don’t use crawlers, but we work with communications departments whose environments have become a lot more complex. Content distribution has changed a lot: new formats, new channels, new audiences…

At the same time, the teams are often very lean, and reporting is a real headache for them. We all know that Advertising Value Equivalency is a useless metric, but it’s all they have.

The WizTopic platform helps them put their work and their results into perspective, and to that end we are purchasing data. Rather than developing crawlers — which we really don’t want to do — we’re happy to pay for quality data in order to enrich what we offer to our customers.”

Mickaël Réault: “Certain types of information aren’t easily available, but they’re very valuable to communications departments. Did an article mentioning the brand make the front page? Was it published in the Companies section or the Business section? How many comments and shares did it generate, and was the brand mentioned in the comments? Who wrote the article? This data has real value, and we’re ready to pay for it.”

Find the right price for each item

Jean-Charles Falloux: “What’s the right price for this behavioral data: the shares, the comments, the number of views? What’s the value of this content? We’re currently in the testing phase with DataDome, which lets us try out pricing and adjust as we go.”

Fabien Grenier: “We advise you on prices and usage. Before founding DataDome, my cofounder Benjamin Fabre and I built a social media monitoring company where we were using crawlers extensively. We’ve been operating in this industry for years, we know the use cases, and we can guide you.”

Quickly identify new entrants

Jean-Charles Falloux: “Our approach is to be active in this market and to understand how it works. It’s important to be able to quickly identify the new bots that keep arriving. Who are they? What are they after?

The DataDome back office has revealed crawlers that we really didn’t suspect, including lots of American bots. We’ve become aware of how complex this topic is.”

Fabien Grenier: “Thanks to technical innovation, it’s become possible to scan every hit on a web server and to see all the bots. We offer you a simple dashboard to manage it all in real time.

You can block bots that are coming for editorial content, but the main objective is to identify potential partners and generate new revenue lines.”
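The visibility Fabien Grenier describes starts with looking at every hit in the server logs. As a rough illustration only — this is not DataDome’s actual detection, which goes far beyond self-declared user agents — the following Python sketch tallies hits per user agent in a standard combined-format access log and estimates the share coming from well-known crawlers. The log path and the crawler signature list are assumptions made for the example.

```python
# Minimal sketch: tally hits per user agent in a combined-format access log
# and flag the ones matching well-known crawler signatures. Illustration only;
# real bot detection relies on far more than self-declared user agents.
import re
from collections import Counter

LOG_PATH = "access.log"  # assumed location of the server's combined-format log

# A few widely documented crawler user-agent substrings (non-exhaustive).
KNOWN_CRAWLERS = ("Googlebot", "bingbot", "AhrefsBot", "SemrushBot", "python-requests")

# In the combined log format, the user agent is the last quoted field on the line.
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

hits_per_agent = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = UA_PATTERN.search(line)
        if match:
            hits_per_agent[match.group(1)] += 1

total_hits = sum(hits_per_agent.values())
bot_hits = sum(
    count for agent, count in hits_per_agent.items()
    if any(sig in agent for sig in KNOWN_CRAWLERS)
)

print(f"Total hits: {total_hits}")
if total_hits:
    print(f"Hits from known crawlers: {bot_hits} ({100 * bot_hits / total_hits:.1f}%)")
for agent, count in hits_per_agent.most_common(10):
    print(f"{count:8d}  {agent}")
```

Even this crude tally makes the point Les Echos raised: once every hit is counted, the share of automated traffic, and the identity of the heaviest crawlers, becomes visible and measurable.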

Strength in numbers

Mickaël Réault: “We’ve been waiting for a solution that can improve market conditions. Right now, the cowboys who don’t play by the rules put those of us who want to follow good practices at a disadvantage.

It’s encouraging to see everyone come together here: DataDome, the CFC*, representatives from the media. Everyone wants a clear and transparent value chain, healthy practices, and clear offers that redefine how content is currently used.”

*) Editor’s note: the French Reproduction Rights Organization

Jean-Charles Falloux: “The purpose isn’t to block all these bots, but to establish partnerships. With better quality content, these big data companies can sell their services at a higher price, but we want to be compensated.

It’s not our core business, but we can’t let go of our data like this. However, we can only change the market if we stand together. For it to work, the majority of publishers must get on board.”

Conclusion

This fruitful exchange was the culmination of a process that started a few months ago and has allowed Jean-Charles Falloux and his team to take stock of bot activity on Lesechos.fr and to discover the business potential revealed by the DataDome Traffic Quality Report. The report is designed to assess the quality of a site’s traffic, and more specifically to identify bot traffic, so that publishers can take action against bad bots and develop new business opportunities with big data companies.

DataDome is continuing its discussions with major French national and regional newspaper publishers who are looking for ways to control automated traffic to their sites and to generate new revenue streams.