Bot traffic is online traffic generated by computer programs rather than people. Often produced by programs called spiders, this automated traffic can make up 20% or more of the traffic coming to any one site, and it is frequently malicious.

Because it can prove difficult to separate bot traffic from human traffic, most organizations estimate a set percentage of their traffic is caused by bots. That figure typically ranges from 10% to 20% but can run much higher.

What Does "Bot" Stand For?

Bot is short for robot: a software program that "crawls" the internet, visiting different sites. Some are benevolent, like the Google spider that scans and indexes content on a website's pages. An online chatroom might use bots that scan for profane language or other rule violations. Others, however, are "bad": hackers create them for all kinds of purposes, including disrupting internet traffic and stealing information.

Are Bots Legal?

Yes, bots are legal, but many states have started to regulate them. For example, both California and New York have passed laws making bots that attempt to capture event tickets illegal. At the federal level, the Better Online Ticket Sales (BOTS) Act of 2016 made using bots to scalp tickets illegal.

Who Uses Bots?

As mentioned, bots can serve good purposes. Unfortunately, they are more often associated with bad ones. In addition to the situations mentioned above, bots can be used to overwhelm a competitor's website or plant spyware on your site. Bots called "scrapers" are designed to pull information off a site and post it elsewhere; this usually hits content-heavy sites or ecommerce sites that list products.

The most common "bad bots" are spammers, which can be used to place comments on sites, send phishing emails, and otherwise undermine a legitimate website.

How Do You Protect Yourself From Bots?

Your first and best move is to hire a quality web development company that installs software to block malware and other malicious bots from your site, making a successful bot attack far less likely. Requiring users to complete a CAPTCHA to prove they are not bots also helps.
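Beyond dedicated blocking software, one common low-level layer is rejecting requests whose User-Agent matches known bad bots. The sketch below is a minimal, hypothetical illustration of that idea; the blocklist entries are placeholders, not a real threat feed.

```python
# Hypothetical blocklist check: reject requests whose User-Agent header
# contains a token associated with known bad bots. The entries below are
# illustrative placeholders, not a maintained blocklist.
BLOCKED_AGENTS = {"badbot", "evilscraper", "spamcrawler"}

def is_blocked(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a blocklisted bot token."""
    ua = user_agent.lower()
    return any(token in ua for token in BLOCKED_AGENTS)

print(is_blocked("Mozilla/5.0 (compatible; BadBot/1.0)"))      # blocked
print(is_blocked("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))  # allowed
```

In practice this kind of check runs at the web server or firewall level, and it only stops bots that announce themselves honestly; sophisticated bots spoof browser User-Agents, which is why layered defenses like CAPTCHAs still matter.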

However, you must stay vigilant. Bots can inflate the traffic numbers in your website analytics, leading you to believe something is working well when in fact bots are simply crawling the site. Signs of bot activity include sudden spikes in traffic, performance problems with the site, and unexpected traffic from regions where the language your site is written in is not spoken.
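The "sudden spike" signal above can be approximated with a simple rule: flag any day whose visit count far exceeds the trailing average. This is a minimal sketch, assuming you have daily visit counts; the window size and threshold factor are arbitrary example values, not recommendations.

```python
def flag_spikes(daily_visits, window=7, factor=3.0):
    """Flag indices of days whose visits exceed `factor` times the
    average of the preceding `window` days (a crude bot-spike signal)."""
    flagged = []
    for i in range(window, len(daily_visits)):
        trailing_avg = sum(daily_visits[i - window:i]) / window
        if daily_visits[i] > factor * trailing_avg:
            flagged.append(i)
    return flagged

# A week of normal traffic followed by a suspicious one-day surge.
visits = [100, 110, 95, 105, 98, 102, 100, 2500]
print(flag_spikes(visits))  # → [7]
```

Real analytics tools use far more sophisticated signals (session behavior, IP reputation, request timing), but a trailing-average check like this is often enough to know when a closer look is warranted.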

Many analytics programs also allow you to filter out bot traffic, although how accurately they do so is still debated.
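If you compute your own metrics from server logs, you can apply a rough version of that filtering yourself by dropping hits whose User-Agent contains common bot markers. This is a simplified sketch with assumed field names ("path", "user_agent"); real bot filtering is considerably more involved.

```python
def filter_bot_hits(log_entries, bot_markers=("bot", "spider", "crawler")):
    """Drop log entries whose User-Agent contains a common bot marker.
    `log_entries` is assumed to be a list of dicts with a "user_agent" key."""
    return [
        entry for entry in log_entries
        if not any(marker in entry["user_agent"].lower() for marker in bot_markers)
    ]

hits = [
    {"path": "/", "user_agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
    {"path": "/", "user_agent": "Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"},
]
print(len(filter_bot_hits(hits)))  # → 1 (only the browser hit remains)
```

Note this removes benevolent crawlers like Googlebot along with bad ones, which is usually what you want when measuring human engagement; it will miss any bot that disguises itself as a browser, which is one reason filtered analytics numbers remain approximate.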

The best approach to bots is to treat them like insects in your home: they can't be ignored. You're going to face this issue continuously, so find ways to protect yourself as best you can.