If your website is slow, you’re losing money fast.

In online business, milliseconds matter. A decade ago, Amazon famously found that every 100 ms of latency cost it 1% in sales, and Google discovered that an extra half second of search result generation time caused traffic to drop by 20%.

Consumers certainly haven’t become any more patient since then, and for any modern online business, web performance equals company performance. The consequences of less-than-stellar load times can be brutal.

How slow is too slow?

According to Aberdeen Research, three seconds appears to be the breaking point for all web users: compared to a one-second load time, abandonment rates triple (see figure).

Kissmetrics found even more dramatic numbers: according to their figures, 40% of consumers abandon a website that takes more than three seconds to load.

They also found that a one-second delay in page response can result in a 7% reduction in conversions. For an e-commerce site generating $100,000 per day, that translates to roughly $2.5 million in lost sales per year.
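
The arithmetic behind that figure is easy to check for yourself. Here's a quick sketch in Python, assuming the revenue loss simply scales linearly with the conversion drop:

# Cost of a one-second delay, using the Kissmetrics figures quoted above.
daily_revenue = 100_000        # e-commerce revenue per day, in dollars
conversion_loss = 0.07         # 7% fewer conversions from a one-second delay

daily_loss = daily_revenue * conversion_loss    # $7,000 per day
annual_loss = daily_loss * 365                  # about $2.5 million per year

print(f"Daily loss:  ${daily_loss:,.0f}")
print(f"Annual loss: ${annual_loss:,.0f}")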

That’s a pretty expensive second.

Mobile makes matters worse

Mobile users are even more impatient than desktop users, and as e-commerce continues its shift to mobile, site speed becomes more important than ever.

According to Google, as much as 53% of mobile site visitors leave a page that takes longer than three seconds to load.

If your pages are even slower than that, the consequences are chilling.

Poor mobile site performance doesn't just kill your traffic through bounces; it also hurts your SEO. Google has officially announced that starting in July 2018, page speed will be a ranking factor for mobile searches.

To drive home the point, Google has introduced a nifty mobile revenue impact calculator. You can enter a few key business metrics, such as average order value and conversion rate, and use a slider to see the potential annual revenue impact of improving your site speed.
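
If you'd rather run the numbers yourself, the same idea fits in a few lines of Python. Google hasn't published the calculator's exact model, so the sketch below simply assumes the conversion rate improves linearly with each second saved (borrowing the 7%-per-second figure from above); the function name and example inputs are hypothetical:

def annual_revenue_impact(monthly_visitors, conversion_rate,
                          average_order_value, seconds_saved,
                          lift_per_second=0.07):
    # Hypothetical linear model: each second saved lifts the conversion
    # rate by lift_per_second (7% here, per the Kissmetrics figure).
    baseline = monthly_visitors * conversion_rate * average_order_value * 12
    improved_rate = conversion_rate * (1 + lift_per_second * seconds_saved)
    improved = monthly_visitors * improved_rate * average_order_value * 12
    return improved - baseline

# Example: 500,000 visits/month, 2% conversion, $80 average order,
# one second shaved off the load time.
print(f"${annual_revenue_impact(500_000, 0.02, 80.0, 1.0):,.0f}")  # $672,000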

How much more could you have made this week with a one-second improvement?

Improving web performance: the invisible factor

Enough scare stories; you probably get the point. The question now is: what can you do to fix poor site performance and boost your business?

Site speed assessment tools such as Pingdom and Google PageSpeed Insights will give you a list of improvement suggestions. Typical recommendations include leveraging browser caching, optimizing JavaScript, and optimizing image sizes.
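
To make the first of those concrete: browser caching boils down to sending a Cache-Control header with your static assets. Here's a minimal sketch for a Python/Flask app (an assumption on our part; if you serve assets through nginx, Apache or a CDN, the same header is set with a config rule instead):

from flask import Flask, send_from_directory

# Serve static files ourselves so we can control the caching headers.
app = Flask(__name__, static_folder=None)

@app.route("/static/<path:filename>")
def static_files(filename):
    response = send_from_directory("static", filename)
    # Let browsers reuse static assets for 30 days (2,592,000 seconds)
    # instead of re-downloading them on every page view.
    response.headers["Cache-Control"] = "public, max-age=2592000"
    return response

if __name__ == "__main__":
    app.run()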

However, these tools all focus on on-site measures. And while those measures are certainly important, an external factor can have a much greater impact: large volumes of invisible traffic.

Bots (automated programs) now represent more than half of the world’s total website traffic. Much of this traffic goes undetected by standard analytics tools, and site owners are often shocked to discover the true volume of hits to their website.
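
Want a rough first estimate of your own bot share? One naive approach is to count access-log requests whose user agent self-identifies as a bot. The sketch below assumes a combined-format log at access.log (a hypothetical path), with one big caveat: malicious bots spoof real browser user agents, so a check like this only catches the polite ones and will undercount.

import re

# Matches the most common self-declared crawler user agents.
BOT_PATTERN = re.compile(r"bot|crawl|spider|scrape", re.IGNORECASE)

total = bots = 0
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        total += 1
        # In the combined log format, the user agent is the last quoted field.
        quoted = re.findall(r'"([^"]*)"', line)
        if quoted and BOT_PATTERN.search(quoted[-1]):
            bots += 1

if total:
    print(f"{bots:,} of {total:,} requests ({bots / total:.1%}) "
          f"came from self-declared bots")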

For example, PriceMinister-Rakuten, France’s #2 most visited e-commerce site, discovered that a whopping 75% of its total traffic was generated by bots.

Bots tie up valuable CPU resources just like any other visitor, often causing significant load time increases and sometimes even site downtime.

This means bots don't just degrade the experience for your human visitors, with the obvious consequences for your bounce rates, conversion rates and revenue. The extra server infrastructure they require also has a direct cost.

More visitors means you need (and pay for) more servers, even when the visitors you’re serving aren’t human and will never buy from you.

Unwanted bot traffic, therefore, doubly hurts your business: by decreasing revenue and by increasing cost.

Block bad bots, improve your bottom line

DataDome’s technology protects your websites and APIs against unwanted bot traffic. It detects bots with more than 99% accuracy, and enables you to block malicious bots from accessing your websites and online apps.

LeParisien.fr, one of France’s top three general-interest news portals, was aggressively targeted by scraper bots coming for its editorial content. When the company installed the DataDome solution and started to block unwanted bots, it saw an immediate 10% drop in traffic even before all its assets were protected.

Similarly, the cycling news website DirectVelo experienced heavy scraper bot traffic. And when the bot traffic peaks coincided with important events, such as the live coverage of the French Championships, they sometimes brought the site to its knees.

By blocking unwanted bots from the site, DirectVelo was able to drastically reduce its total traffic, and thereby avoid deploying new servers and the costs that would have entailed.

Blocking unwanted bot traffic is a fast and easy way to improve website performance. And as an added benefit, it keeps your content and online data protected from scraping, hacking and fraud.

The DataDome solution is easy to install without changing your infrastructure, and it’s compatible with multi-cloud/CDN architectures.

Curious how much of your real traffic is generated by bots? Try DataDome for free for 30 days and find out!

Try DataDome free for 30 days

No credit card. No contract.
Just install the module that fits your architecture, and observe your automated traffic in real time for the next 30 days.
