Bad Bots vs SEO: Where Security and Marketing Collide

There are good bots and bad bots. Good bots help improve your Search Engine Optimization (SEO) rankings, while bad bots can leave marketing teams camped out with IT and security trying to figure out what to do. For organizations that provide online services, bad bots not only hurt SEO rankings, they can also wreak havoc on productivity, brand image, and ultimately a company’s bottom line.

In conversations with many of our customers about identity-related threats and bot detection, one theme keeps surfacing: security teams now need to work with a variety of internal stakeholders to protect their online applications and the customer journey. How threats impact SEO, and with it a company’s marketing and product teams, is becoming an increasingly critical problem that security teams need to help address.

According to a recent research report, bad bots represent roughly a quarter of all internet traffic. Although bots are indiscriminate, some industries are targeted more frequently than others: the same researchers found that financial services, education, IT & services, marketplaces, and government are among the most frequently targeted.

Bad bots are designed to launch automated attacks to steal your content, serve spam, take over accounts, and engage in other automated activities, all of which can impact an organization’s SEO.

Here are four ways that bots can interfere with SEO:

  1. Web and Price Scraping: The two most common malicious web scraping use cases are content theft and price scraping. When your content is stolen and posted elsewhere, search engines may treat your page as a duplicate and associate it with a poorly ranked site. This drops your overall search engine ranking and can reduce visitors and revenue. Price scraping is the real-time theft of pricing data for products or services. Competitors can use it maliciously to undercut your organization’s prices, listing products just below yours to game search engines that display and rank lower-priced products first, directly impacting your sales and conversions.
  2. Form and Content Spamming: Form and content submissions that are filled with unwanted or harmful information can be particularly damaging. Repeated submission of forms is the work of bots trying to create fake leads. At the same time, content spam is often seen on sites where users can share and post content.

    Spam can contain false user data as well as SEO-damaging backlink injections, injected redirects that deceive users, and even severe SQL injections designed to take down your site or steal your user data. Poor-quality backlinks can get a website blacklisted and removed from search engine results pages. Cleaning up this kind of spam is time-consuming and complex.

  3. Skewed Metrics and Analytics: If a large portion of your web traffic is generated by bots, the metrics you report are very likely inaccurate.

    Unable to differentiate between bot and human traffic, organizations struggle to understand which content matters, and this lack of visibility can lead to poor decision making. For example, A/B tests could lead you to choose the wrong variants, and skewed conversion rates could cause you to shift budget from effective strategies to ineffective ones.

  4. Automated Attacks: Research shows that 73.6 percent of bad bots are classified as Advanced Persistent Bots (APBs). To evade detection, these bots mimic human behavior, cycle through random IP addresses, and connect through anonymous proxies.

    These automated threats pose severe security risks, including DDoS attacks, account takeover, fake account creation, and credential stuffing. Bots continuously requesting information from your site can also slow it down for everyone.

    Google weighs page speed and page load times heavily because of their impact on end-user experience. If your pages load slowly, your Google rankings can suffer.
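One common first line of defense against the constant automated requests described above is throttling per-client request rates. Below is a minimal sliding-window rate limiter sketch in Python; the class name, threshold, and window size are illustrative assumptions, not any specific product’s API.

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Toy sliding-window rate limiter (illustrative values only)."""

    def __init__(self, max_requests=20, window_seconds=10):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # client id -> recent request timestamps

    def allow(self, client_id, now=None):
        """Return True if this request is within the allowed rate."""
        now = time.monotonic() if now is None else now
        q = self.hits[client_id]
        # Drop timestamps that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # likely automated traffic; throttle or challenge
        q.append(now)
        return True
```

In practice you would apply this per IP or per session at the edge (reverse proxy, WAF, or middleware) and respond to rejected requests with a 429 or a bot challenge rather than a hard block.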

Beating Bad Bots to Improve Website SEO

The challenge with today’s bots is that they simulate basic human-like interactions, engaging with your website and applications much the way a human user would. For example, they mimic human mouse movements and keystrokes to slip past your organization’s detection tools, making them harder to detect and prevent.

What’s more, these bad bots can breach a user’s session by mimicking how a real user behaves throughout it, impacting your website and application both before and after login. Because they can imitate human behavior on both web and mobile, no channel of your digital business is safe. The best way to block web scraping by competitors and other bad-bot activity is to understand a customer’s identity and use user behavior analytics (UBA) to spot these kinds of attacks.
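To make the idea of behavior analytics concrete, here is a toy Python heuristic that flags sessions whose event timing is suspiciously uniform. The function name and threshold are hypothetical, and production UBA systems combine many such signals rather than relying on one.

```python
import statistics

def looks_automated(event_times, min_events=5, cv_threshold=0.15):
    """Flag sessions whose inter-event gaps are suspiciously uniform.

    Humans produce irregular timing; simple bots often fire events at
    near-constant intervals. We compute the coefficient of variation
    (stddev / mean) of the gaps and flag very low values.
    """
    if len(event_times) < min_events:
        return False  # not enough data to judge
    gaps = [b - a for a, b in zip(event_times, event_times[1:])]
    mean = statistics.mean(gaps)
    if mean <= 0:
        return True  # simultaneous or out-of-order events
    cv = statistics.stdev(gaps) / mean
    return cv < cv_threshold
```

A flagged session would not be blocked outright on this signal alone; it would typically be weighted alongside identity, device, and network signals before issuing a challenge.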

In our recent post on “Redefining Bot Detection: Why Identity Matters,” we discuss how combining traditional bot detection signals with identity-aware detection offers much richer behavioral analytics that detect and stop automated attacks such as fake account creation, in-app scraping, content spam, credit card stuffing, and account takeover. The better an organization is at stopping bots, the easier it will be to maintain and improve SEO.

If you’re interested in learning more about how Castle can protect your organization from sophisticated bot attacks as well as see a live demonstration of how a bot can compromise your customer application, watch our on-demand webinar on Beating Bad Bots.

*** This is a Security Bloggers Network syndicated blog from Blog | Castle authored by Heather Howland. Read the original post at: