Over the last few years, bot traffic has become a massive headache for organizations across all industries. While bots themselves are nothing new, bot traffic volume and sophistication have risen drastically as hackers sought ways to scale their malicious activities. In this article, we’ll explain the difference between good and bad bots, explore the risks they pose, and cover the steps you can take to protect your business website and web applications.
Bot traffic consists of any non-human interaction with your website and online applications. Bots — short for robots — are computer programs written to complete repetitive tasks automatically. Bots complete tasks much faster and more accurately than humans and can also scale their activities easily. After overtaking human traffic in 2016, bots account for more than half of all Internet traffic. While bots were initially confined to simplistic tasks, modern bots are often highly sophisticated, and some even include self-learning algorithms.
So, what exactly do bots do? Among the traffic your website experiences, you’ll find a mixture of bots programmed to carry out benign and malicious tasks.
The most common example of ‘good’ bots are the search engine ‘spiders’ used by Google and Bing to discover web pages and catalog their content. Without these bots, search engines wouldn’t be able to serve users with relevant search results, and it would be hard for website owners to attract attention. However, while some bots are essential, bad bot traffic can be highly detrimental. The most extreme example comes in the form of Distributed Denial of Service (DDoS) attacks, where a large number of compromised devices are used to flood a website or application with more traffic or requests than it can handle. DDoS attacks can be extremely damaging, particularly for organizations that rely on their web assets for financial or operational success.
Learn more about Link11 Bot Management
However big your company website is, you can expect a significant amount of bot traffic. Good bot traffic consists mainly of bots managed by online service providers like search engines and SaaS companies. On the other hand, bad bots are created by malicious actors to conduct fraud, spread spam, and carry out cyber attacks.
| Good Bots | Bad Bots |
| --- | --- |
| Search engine crawlers (e.g., Google, Bing) | Spambots |
| SEO and backlink checkers (e.g., SEMrush, Moz) | Website information scrapers |
| Aggregator feed bots (e.g., Feedly) | Fake account creators |
| Digital assistants (e.g., Siri, Alexa) | Checkout and business logic abuse bots |
Maintaining oversight of the bots that visit and interact with your website and applications is essential. While good bots are often valuable, too much bot traffic can hurt website or application performance, damaging legitimate customers’ experience. To make matters worse, sophisticated bad bots can cripple a website, either by disrupting or damaging its operations or by abusing business logic.
Hoarding bots are common on websites that sell limited, in-demand stock, such as ticketing websites for music and sporting events. These bots create fake accounts and repeatedly add stock to their baskets, preventing legitimate customers from purchasing it. This is usually done to ‘hold’ stock while it is fraudulently sold elsewhere at a higher price.
Automated ad clickers target websites that include paid advertisements and are used to ‘click’ ads hundreds or even thousands of times. This may be done to damage the success of a company’s advertisements, exhaust its advertising budget, or prompt advertising providers to ban the website from their network.
To ensure your website and applications are properly served by good bots and protected from bad bots, you first need a mechanism to detect bot traffic. Unfortunately, this is far from simple. A major complaint from webmasters is that it’s difficult to distinguish between human and bot traffic in web consoles like Google Analytics. This makes it harder to understand user behavior and track the success of pages, features, and advertisements.
However, this is merely an unfortunate byproduct of good bots. Bad bots are much harder to detect and have far more significant repercussions. Some of the indicators you can look out for to identify bad bot traffic include:

- Abnormal spikes in pageviews or traffic from a single IP address or range
- Unusually high bounce rates or abnormally short session durations
- Traffic surges from unexpected geographic regions
- Junk form submissions and fake conversions
- Sudden drops in website or application performance
Unfortunately, while these indicators can be helpful, it’s challenging to reliably identify bad bot traffic through monitoring alone. Even if these indicators are flagged automatically, only the most basic bots are likely to be detected.
To make matters worse, modern bots are extremely good at avoiding detection. Malicious actors are increasingly designing bots to mimic human behavior, and some advanced bots even include self-learning algorithms to help them stay under the radar.
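To see why naive, rule-based detection only catches the most basic bots, here is a minimal Python sketch. The user-agent hints, threshold, and sample traffic are illustrative assumptions, not a real detection ruleset; any bot that spoofs a browser user agent and paces its requests would slip straight through.

```python
# Minimal sketch of naive bot detection over one minute of traffic.
# Catches only bots that self-identify or request at an obvious rate.
from collections import Counter

# Substrings commonly found in self-identifying bot user agents (illustrative).
BOT_UA_HINTS = ("bot", "crawler", "spider", "curl", "python-requests")

# Requests per minute above which a single client looks automated (assumed).
RATE_THRESHOLD = 120

def flag_basic_bots(requests):
    """requests: list of (client_ip, user_agent) tuples.
    Returns the set of IPs flagged by these naive rules."""
    flagged = set()
    per_ip = Counter(ip for ip, _ in requests)
    for ip, ua in requests:
        if any(hint in ua.lower() for hint in BOT_UA_HINTS):
            flagged.add(ip)          # self-identifying user agent
        elif per_ip[ip] > RATE_THRESHOLD:
            flagged.add(ip)          # abnormally high request rate
    return flagged

sample = [("10.0.0.1", "Mozilla/5.0 (Windows NT 10.0)"),
          ("10.0.0.2", "Googlebot/2.1 (+http://www.google.com/bot.html)")]
print(flag_basic_bots(sample))  # only the self-identifying crawler is caught
```

A bot mimicking human behavior defeats both rules, which is exactly why the indicators above are insufficient on their own.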
While there are several things you can do to limit the damage caused by bot traffic, most are only partially effective. For instance, you can set rules for bots using a simple file called robots.txt, a standard component of any website. While this is an important step to take, it will only affect good bots like search engine crawlers. Bad bots will simply ignore your instructions.
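As an illustration, a robots.txt file placed at the root of your site might look like the following (the paths are hypothetical). Compliant crawlers will honor these rules; bad bots will simply read past them:

```
# Example robots.txt (hypothetical paths) - only well-behaved bots obey it.
User-agent: *
Disallow: /admin/
Disallow: /checkout/
Crawl-delay: 10

User-agent: Googlebot
Allow: /
```

Note that even among good bots support varies; Crawl-delay, for instance, is not honored by every major crawler.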
Other standard precautions include:

- Adding CAPTCHA challenges to forms, logins, and checkout pages
- Rate limiting requests from individual clients
- Blocking IP addresses and ranges with a known history of abuse
- Blocking outdated browser versions and suspicious user agents
To ensure websites and online applications are adequately protected against bots, organizations need the means to distinguish between good and bad bots in real-time — even when those bots have never been seen before. More specifically, organizations need a bot management solution that benefits from the same advantage as bots themselves — the ability to complete actions quickly and automatically while ‘learning’ as they go.
To manage bots effectively, you need a bot management solution that provides full control over the wide range of bots that access your website and web applications each day.
Link11’s advanced bot management solution uses proprietary AI and Machine Learning algorithms to distinguish between good and bad bots in real-time — with zero human intervention — and block bot traffic only if it poses a threat.
Learn more about Link11 Bot Management
Bots that are known to be malicious are blocked instantly, while new, unknown bots are identified and mitigated in under ten seconds on average. This is essential for full protection, as new bots are under continual development to bypass lower-quality controls.
As a result, your organization gets:
To find out how you can take control of bot traffic on your business website, visit our bot management page.