
How Tripcentral Defeated Scraping Bots & Saved 70 Hours a Month

Paige Tester, Sr. Content Marketing Manager
30 Sep, 2024

Tripcentral, a major travel booking site, was facing aggressive scraping bots that were slowing down their website. This harmed the user experience, affecting both individuals and travel agencies. After trying other fixes, they chose DataDome for its robust, real-time protection against bots. DataDome significantly reduced site scraping, strengthened security against hacking attempts, and saved their team valuable time and resources. As a result, Tripcentral was able to get back up to speed and concentrate on growth without worrying about the disruption caused by bots.

“If you exist on the internet, you're going to face cyberattacks. We’ve noticed a massive increase in this traffic, and we wanted extra protection. DataDome added that layer of protection against automated attacks and put an end to scraping traffic.”
Jerry Han
Director of Technology at Tripcentral

The Challenge: Scraping Bots Overloading Infrastructure and Disrupting User Experience

It all began in 2017. Jerry Han was Director of Technology at Tripcentral, a thriving online travel agency. Business was good, and the websites were a key driver of their success, offering unique travel features and competitive pricing. But then, Jerry started noticing unusual spikes in traffic and a drastic slowdown in website performance. It wasn’t a surge of customers eager to plan their vacations, but an army of malicious bots. “We saw a definite hit in performance,” he recalls, “clients would complain that the site used to be faster, and agents would say the slowness made their jobs harder.” 

Tripcentral cares deeply about its customers' happiness and experience, and Jerry knew they needed a solution. He dug deeper and realized that bots were relentlessly harvesting content and imitating human behavior to bypass basic defenses, scraping large volumes of information. He and his team tried to take matters into their own hands by building in-house solutions.

“We tried IP filtering, regex filtering, we cobbled together scripts and manual processes to flag suspicious behaviors… But it was too time-consuming and not effective enough,” explains Jerry. With scraping bots becoming more sophisticated, Tripcentral turned to DataDome.
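To give a sense of what such in-house defenses typically look like, here is a minimal, purely illustrative sketch of rule-based filtering: a static IP blocklist, a user-agent regex, and a request-rate threshold. None of the names, rules, or thresholds below come from Tripcentral's actual scripts; they are assumptions meant only to show why this approach demands constant manual upkeep.

```python
# Illustrative sketch of a naive, rule-based bot filter (hypothetical, not Tripcentral's code).
import re
from collections import defaultdict

BLOCKED_IPS = {"203.0.113.7", "198.51.100.23"}              # manually maintained blocklist
SUSPICIOUS_UA = re.compile(r"(curl|python-requests|scrapy|headless)", re.I)
REQUESTS_PER_MINUTE_LIMIT = 120                             # arbitrary threshold

request_counts = defaultdict(int)                           # ip -> request count (a real script would reset this periodically)

def is_suspicious(ip: str, user_agent: str) -> bool:
    """Flag a request using static IP, regex, and rate rules.

    Rules like these need constant tuning: bots rotate IPs and spoof
    user agents, so blocklists and patterns go stale quickly.
    """
    if ip in BLOCKED_IPS:
        return True
    if SUSPICIOUS_UA.search(user_agent or ""):
        return True
    request_counts[ip] += 1
    return request_counts[ip] > REQUESTS_PER_MINUTE_LIMIT

# A scraper announcing itself is caught; one spoofing a browser slips through.
print(is_suspicious("192.0.2.44", "python-requests/2.31"))            # True
print(is_suspicious("192.0.2.45", "Mozilla/5.0 (Windows NT 10.0)"))   # False
```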

The Solution: Embracing Cutting-Edge Bot Protection to Put a Stop to Scraping

Tripcentral knew they needed a solution agile enough to detect even the most advanced bots. “Given the advancements in technology, we were looking for a solution at the cutting edge of bot detection, mitigation, threat research, and artificial intelligence,” says Jerry. That’s when he heard about DataDome from one of his suppliers, who had made the switch and never looked back.

What caught Jerry’s attention was the platform’s ability to handle a variety of threats—from scraping and credential stuffing to more sophisticated attacks like SQL injection and DDoS—all without slowing down the website or causing friction for legitimate users. “We had a firewall, web appliances, all that kind of stuff. But what we wanted was more depth to our security. DataDome gives us that.”

The Results: Powerful, Adaptable, & Easy-to-Manage Bot Protection

Within a month, DataDome was up and running at Tripcentral. Jerry’s team was impressed by how easily DataDome integrated with their existing infrastructure. No extended downtime, no headaches—just a seamless transition to a stronger defense.

Immediately, Tripcentral saw a massive drop in scraping traffic, freeing up a significant portion of their bandwidth and improving the speed and responsiveness of their site for genuine users.

DataDome also effectively neutralized other automated threats: “We now have a more robust defense against hacking,” says Jerry, “we see much less attack traffic in our logs, which is great because we can’t afford to have someone watching the logs all day!” This advanced level of protection has saved Jerry and his small team a lot of time. Before DataDome, monitoring and managing bot traffic was a constant, labor-intensive task that could easily consume 50-70 hours a month.

“I feel more secure knowing the DataDome team is watching over our sites and will react if anything goes wrong,” says Jerry.