Using the DataDome dashboards

Once DataDome is up and running on your website, you will have access to your own personal dashboards.

Bots generate over half of a website’s traffic – that’s a fact. But what types of bots visit your website most often? Are they hurting your business? Are they contributing to your SEO ranking? Could a potential new business partner be hiding behind its bots?

The DataDome dashboards were created with two goals in mind:

  • Keep an eye on overall traffic quality, or analyse bot activity in detail.
  • Define and adjust actions against bots.

Analytics

Share of voice: good vs. bad vs. monetisable

The indicators are updated in real time and can easily be shared with your colleagues. Alerts keep you informed as well.

Of course, not all bots are bad. Some of them contribute to your website’s SEO ranking or to your content’s visibility. Needless to say, you should make sure they get the right content from you as quickly as possible (a sketch of how a search engine crawler can be verified follows this list). The main good bot categories include:

  • Search engines such as Google, Bing, DuckDuckGo, etc. regularly come and visit your website to check for fresh content.
  • Aggregators are key to your content’s dissemination online.
  • Social networks (Facebook, Twitter, etc.) need to access your content to promote it on their own platform.
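
DataDome classifies this traffic for you in the dashboard, but the underlying principle is worth knowing: major search engines let you verify that a visitor claiming to be their crawler really is one, using a reverse then forward DNS lookup. The Python sketch below is purely illustrative – the function name and the hard-coded IP are hypothetical examples, not part of DataDome:

```python
import socket

def is_verified_search_engine_bot(ip: str) -> bool:
    """Illustrative check: reverse-resolve the IP, make sure the hostname
    belongs to a known crawler domain, then forward-resolve the hostname
    and confirm it maps back to the original IP."""
    crawler_domains = (".googlebot.com", ".google.com", ".search.msn.com")
    try:
        hostname = socket.gethostbyaddr(ip)[0]              # reverse DNS
        if not hostname.endswith(crawler_domains):
            return False
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward DNS
        return ip in forward_ips                             # must match the original IP
    except (socket.herror, socket.gaierror):
        return False

# A visitor whose User-Agent claims to be Googlebot but whose IP fails
# this check is an impersonator rather than a search engine.
print(is_verified_search_engine_bot("66.249.66.1"))
```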

Bad bots are there to hurt you. Four different types exist:

  • Hackers look for technical flaws they can exploit to attack your applications.
  • Impersonators are after your users’ personal information.
  • Ad fraudsters click on your ads, causing your CPC to rise and your revenue to decrease.
  • Scrapers feed on your content. Thousands of pages can be scraped from your website in just a few minutes.

Thousands of companies worldwide build their business on content and items collected from the web, which they then sell on to their own customers. These companies are potential new business partners for you, and this booming industry is highly varied:

  • Competitive intelligence
  • SEO
  • Marketing database
  • Etc.

Timelines

You wouldn’t check the same website twice a day at exactly the same time, and neither do bots. That is why it’s paramount to understand the context surrounding massive peaks of non-human traffic.

There are a thousand different reasons why hackers, impersonators or scrapers might launch an attack against your website:

  • Has a security flaw just been discovered in one of the most widely used web applications?
  • Has personal information leaked elsewhere on the web, prompting an impersonation attack?
  • Is a competitor updating their website and therefore benchmarking their content against yours?
  • Etc.

Depending on the kind of business they run, your potential new partners will come and collect your content in different ways:

  • Business intelligence solutions need to be kept constantly up to date – they can’t afford to miss any content you publish.
  • SEO agencies will probably crawl you more intensively when you make changes to your website’s structure.
  • Marketing-related needs will probably be more seasonal: a brand commissioning a market analysis from a consultancy, or an organisation that wants to inventory digital spaces based on their visitors’ profiles.
  • Etc.

Take action

  • Improve your website’s performance: higher SEO ranking, better user experience, more page views, etc.

    → BLOCK all bad bots and ALLOW all good bots.

  • Increase ad revenue: optimise your ad campaigns and your conversion rates.

    → BLOCK ad fraudsters.

  • Make sure your users’ personal information is safe

    → BLOCK impersonators.

  • Protect / regain control over your content

    → BLOCK scrapers.

  • Discover new potential business / partnerships

    → FILTER monetisable bots.
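
Taken together, these recommendations amount to a simple per-category policy. The sketch below is purely illustrative – it is not DataDome’s API, just a hypothetical Python mapping from the bot categories described above to the ALLOW / BLOCK / FILTER actions:

```python
from enum import Enum

class BotCategory(Enum):
    SEARCH_ENGINE = "search engine"   # good: contributes to SEO
    AGGREGATOR = "aggregator"         # good: disseminates content
    SOCIAL_NETWORK = "social network" # good: promotes content
    HACKER = "hacker"                 # bad: probes for flaws
    IMPERSONATOR = "impersonator"     # bad: targets personal data
    AD_FRAUDSTER = "ad fraudster"     # bad: click fraud
    SCRAPER = "scraper"               # bad: steals content
    MONETISABLE = "monetisable"       # potential business partner

class Action(Enum):
    ALLOW = "allow"
    BLOCK = "block"
    FILTER = "filter"

# Hypothetical policy table mirroring the recommendations above.
POLICY = {
    BotCategory.SEARCH_ENGINE: Action.ALLOW,
    BotCategory.AGGREGATOR: Action.ALLOW,
    BotCategory.SOCIAL_NETWORK: Action.ALLOW,
    BotCategory.HACKER: Action.BLOCK,
    BotCategory.IMPERSONATOR: Action.BLOCK,
    BotCategory.AD_FRAUDSTER: Action.BLOCK,
    BotCategory.SCRAPER: Action.BLOCK,
    BotCategory.MONETISABLE: Action.FILTER,
}

def decide(category: BotCategory) -> Action:
    """Return the action for a classified bot; unknown categories default to BLOCK."""
    return POLICY.get(category, Action.BLOCK)

print(decide(BotCategory.SCRAPER))  # Action.BLOCK
```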