What are manual actions and how to fix them?

Manual actions occur when you breach Google Guidelines. As the name suggests, they’re not algorithmic but instead are the result of a real human reviewing your site. Manual actions can be found in Google Search Console, under the Security & Manual Actions dropdown in the sidebar.

Remember that manual actions aren’t the same as algorithmic penalties. Manual actions punish a breach in the guidelines that can be explicitly traced to a specific cause. Algorithmic penalties don’t directly punish you, but they will favour competitors who better target their site to meet the algorithm.

There are several types of manual action, all with varying consequences. However, because they are issued by a human reviewer, manual actions are rarely received by accident and are most often the result of intentional attempts to manipulate search engines or harm users. Occasionally, manual actions can be the result of gross incompetence, where best practice warnings from Google have been consistently ignored.

Submitting manual action fixes

Once you have amended any manual actions to the best of your knowledge, you can submit a request for Google to review your fixes by selecting Request Review from the Manual Actions Report. The request you submit should contain details of the exact issues which caused the manual action, the steps you have taken to amend the issues and the steps taken to prevent them from happening again, as well as any observed outcome of your changes.

Types of manual actions

User-generated spam

User-generated spam is guideline-breaking content created by external people posting to your site, most often in the form of comments or user profiles. While you are not directly in control of this content, failure to properly manage and moderate it will result in a manual action.

If possible, Google will try to target the manual action at the specifically affected pages; however, if it’s a widespread issue, the entire domain could be affected and possibly even deindexed.

User-generated spam can be handled with improved moderation methods. These vary from reviewing all posts prior to publishing to preventing users from posting multiple times within a certain time frame.

It’s not uncommon for spammers to post links in comments or on their profiles. These links can be nofollowed, which is essentially a way to tell search engines you do not endorse the links or the content they lead to. This helps ensure no link equity is passed. (You’ll learn more about nofollowing user generated content in the Crawling and Directives sessions of this module.)
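
As a practical illustration, here is a minimal sketch (using Python and BeautifulSoup, neither of which is required by Google, purely as an example) of how user-submitted links could be marked up with nofollow and ugc attributes before a comment is published:

```python
# Sketch: add rel="nofollow ugc" to every link in user-submitted comment HTML.
from bs4 import BeautifulSoup

def nofollow_user_links(comment_html: str) -> str:
    """Return the comment HTML with every <a> tag marked nofollow/ugc."""
    soup = BeautifulSoup(comment_html, "html.parser")
    for link in soup.find_all("a", href=True):
        rel = set(link.get("rel") or [])
        rel.update({"nofollow", "ugc"})
        link["rel"] = " ".join(sorted(rel))  # e.g. "nofollow ugc"
    return str(soup)

# The spammy link still works for users, but no longer passes link equity.
print(nofollow_user_links('Great post! <a href="https://spam.example">cheap pills</a>'))
```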

Spammy free host

If you’re running a web hosting service, you need to be aware of the sites created on your service. You could be punished if those sites break Google guidelines with spammy content, or with even more malicious actions such as infecting pages with malware.

Again, moderation is key. Use verification like captchas to help prevent automatic account generation. This ensures users are not able to script attacks which generate multiple spammy sites at one time.

You can also monitor your log files for unexpected traffic increases or sudden increases in bot redirects, which may indicate users have implemented cloaking techniques.
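
To give a flavour of what that monitoring could look like, below is a rough Python sketch which assumes a standard combined access log format; the log path, field layout and the simple ‘Googlebot’ check are illustrative and would need adapting to your own setup:

```python
# Sketch: count 3xx responses served to Googlebot per day, so sudden
# jumps in bot redirects stand out (assumes combined log format).
import re
from collections import Counter

LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def bot_redirects_per_day(log_path: str) -> Counter:
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as handle:
        for line in handle:
            match = LOG_LINE.match(line)
            if not match:
                continue
            if match["status"].startswith("3") and "Googlebot" in match["agent"]:
                counts[match["day"]] += 1
    return counts

for day, total in sorted(bot_redirects_per_day("access.log").items()):
    print(day, total)
```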

To check for malicious content including malware, you can use the Google Safe Browsing API to regularly and safely test URLs on your service against a list of unsafe resources, compiled by Google. 
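
As an example, a minimal sketch of a lookup against the Safe Browsing API (v4) might look something like the following; the API key, client details and URL are placeholders, and quotas and threat types should be checked against Google’s Safe Browsing documentation:

```python
# Sketch: check URLs against the Safe Browsing Lookup API (v4).
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = f"https://safebrowsing.googleapis.com/v4/threatMatches:find?key={API_KEY}"

def check_urls(urls):
    payload = {
        "client": {"clientId": "example-host", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": url} for url in urls],
        },
    }
    response = requests.post(ENDPOINT, json=payload, timeout=10)
    response.raise_for_status()
    # An empty response body means no matches were found.
    return response.json().get("matches", [])

for match in check_urls(["http://example.com/suspicious-page"]):
    print(match["threatType"], match["threat"]["url"])
```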

Structured data issue

Using structured data (or schema) to manipulate Google’s rich results will incur a penalty. The majority of the time it is intentional manipulation which is penalised; however, continually failing to meet Google’s Structured Data Guidelines can also result in this penalty if warning messages are ignored.

A manual action for manipulative structured data can result in your pages being deindexed; however, it is more likely that your pages will remain indexed but be prevented from generating rich results.

The fix for guideline-breaking structured data is simply to remove or alter your structured data so guidelines are no longer broken. However, even when the manual action has been resolved it may take time to see rich results for your site again, due to Google having lost trust in your markup.
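
When reviewing your markup, it can help to pull the structured data out of a page so it can be compared against what users actually see. Here is a small sketch assuming the requests and BeautifulSoup libraries and a placeholder URL:

```python
# Sketch: extract JSON-LD blocks from a page for manual review.
import json
import requests
from bs4 import BeautifulSoup

def extract_json_ld(url: str):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    blocks = []
    for script in soup.find_all("script", type="application/ld+json"):
        try:
            blocks.append(json.loads(script.string or ""))
        except json.JSONDecodeError:
            blocks.append({"error": "invalid JSON-LD", "raw": script.string})
    return blocks

for block in extract_json_ld("https://www.example.com/product-page"):
    print(json.dumps(block, indent=2))
```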

Unnatural links to your site

Typically, unnatural links to your site come as a result of buying links from spammy domains, likely as part of a link scheme. This could be a result of your own actions, or someone acting on your behalf (like a dodgy SEO agency). It could even be a competitor attempting to cause issues for you. However your site acquired unnatural links, Google will always penalise you for them if they feel you are attempting to manipulate PageRank.

To amend any unnatural links to your site, we recommend contacting the webmasters of the domains the links exist on and requesting their removal from the site. We’d also suggest disavowing the domains these spammy links exist on (more on disavowing domains later).
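
Google’s disavow tool accepts a plain text file with one URL or domain: entry per line, and comments prefixed with #. A minimal sketch for generating such a file from a list of spammy referring domains (the domains below are made up) could look like this:

```python
# Sketch: write a disavow file in the format Google's disavow tool expects.
spammy_domains = ["spam-links.example", "cheap-seo-directory.example"]

with open("disavow.txt", "w", encoding="utf-8") as handle:
    handle.write("# Removal requested from these domains, no response received\n")
    for domain in sorted(set(spammy_domains)):
        handle.write(f"domain:{domain}\n")
```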

Unnatural links from your site

A penalty for unnatural links from your site will be incurred if Google feels the way your site links out is an attempt to manipulate rankings. A penalty is particularly likely if Google thinks you have sold the links as part of a link scheme.

We recommend identifying any links on your site which break Google guidelines. This includes links from link schemes and any links not intended to be malicious but which may be considered unnatural. 

Where links were part of manipulative link schemes, we recommend simply removing these from your site. However, where the manual action is the result of unintentional spammy linking, Google recommends ‘changing them so they no longer pass PageRank’. This can be achieved with a nofollow attribute or by making them inaccessible to bots.
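
To help identify which outbound links still pass PageRank, the sketch below (again assuming requests and BeautifulSoup, with a placeholder URL) lists the external links on a page which currently carry no nofollow, sponsored or ugc attribute, so each one can be checked manually:

```python
# Sketch: list external links on a page that have no nofollow/sponsored/ugc.
from urllib.parse import urlparse
import requests
from bs4 import BeautifulSoup

def followed_external_links(page_url: str):
    site_host = urlparse(page_url).netloc
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    for link in soup.find_all("a", href=True):
        href = link["href"]
        host = urlparse(href).netloc
        rel = set(link.get("rel") or [])
        if host and host != site_host and not rel & {"nofollow", "sponsored", "ugc"}:
            yield href

for href in followed_external_links("https://www.example.com/blog-post"):
    print(href)
```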

Thin content with little or no added value

While Google’s Panda algorithm can result in a penalty that drops the rank of thin pages, it is also possible to incur a manual action for thin content (if Google feels it’s widespread enough to devalue your whole site). Spammy practices such as scraping, or pages designed to rank for terms that draw in users and then funnel them to unrelated sections of a site (known as doorway pages), can also cause penalties.

We recommend amending any spammy practices by removing scraped content and removing doorway pages. If you’re faced with a widespread content issue, we recommend reviewing the quality of your content and updating it to be as useful to the user as possible.
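
A very rough first pass at finding thin pages is to count the visible words on each URL; the sketch below uses an arbitrary 300-word threshold and placeholder URLs, and is only intended to surface candidates for manual review:

```python
# Sketch: flag potentially thin pages by visible word count.
import requests
from bs4 import BeautifulSoup

def visible_word_count(url: str) -> int:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return len(soup.get_text(separator=" ").split())

for url in ["https://www.example.com/page-a", "https://www.example.com/page-b"]:
    words = visible_word_count(url)
    if words < 300:  # arbitrary threshold for manual review
        print(f"Possibly thin ({words} words): {url}")
```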

Cloaking and sneaky redirects

Cloaking is where a site shows different content (be it text or images) to users than it does to bots, while sneaky redirects typically take users to a page hidden from bots. Both are against Google guidelines as they are intentionally deceptive towards bots, and prevent search engines from appropriately evaluating page content.

To amend cloaking issues we recommend reviewing how your site treats bots. Redirects should be checked with multiple user agents. The Screaming Frog SEO Spider can be used to spoof Googlebot and browser extensions can be used to spoof bots and check for on-page differences. Live testing an inspected URL in Google Search Console can help you see how Google really sees the page. Cloaking issues should then be amended so that both bot and user experience are as similar as possible.
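
A lightweight way to spot-check for cloaking alongside the tools above is to fetch a page with a browser-style user agent and a Googlebot-style one and compare the visible text. The sketch below does that; note it cannot replicate genuine Googlebot requests (which come from Google’s own IP ranges), so differences are only a prompt for further investigation:

```python
# Sketch: diff the visible text a page serves to two different user agents.
import difflib
import requests
from bs4 import BeautifulSoup

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def visible_text(url: str, user_agent: str):
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return [line.strip() for line in soup.get_text("\n").splitlines() if line.strip()]

def diff_user_agents(url: str):
    browser = visible_text(url, BROWSER_UA)
    bot = visible_text(url, GOOGLEBOT_UA)
    return list(difflib.unified_diff(browser, bot, "browser", "googlebot", lineterm=""))

for line in diff_user_agents("https://www.example.com/"):
    print(line)
```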

Hidden text and keyword stuffing

Keyword stuffing is the practice of repeating keywords so often in text that it reads unnaturally and offers little to no benefit to a user. This is done to manipulate search engines and improve rankings for queries containing the keywords. One of the ways sites attempt to hide this manipulation is by setting the text colour to match the background, making it invisible at first glance. It is also possible to hide the text via CSS or by setting the font size to 0. This is known as hidden text.

Another example of hidden text is attempting to hide a link on a small character with no context, such as on punctuation. Keyword stuffing can also occur in the form of unnatural prose (which we’ll touch on in more detail later in the lesson).

Where keyword stuffing has occurred as a block of keywords, we recommend removing them entirely from the site as they serve no purpose in modern search. Where unnatural prose has occurred, we recommend reviewing and amending content to read more organically.
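
As a starting point for finding hidden text, the sketch below flags elements whose inline styles use patterns commonly associated with hiding content (font-size:0, display:none, visibility:hidden). It does not inspect external stylesheets, so it is a rough filter rather than a complete check, and the URL is a placeholder:

```python
# Sketch: flag elements with inline styles often used to hide text.
import re
import requests
from bs4 import BeautifulSoup

SUSPICIOUS_STYLE = re.compile(
    r"font-size\s*:\s*0|display\s*:\s*none|visibility\s*:\s*hidden", re.I
)

def flag_hidden_text(url: str):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for element in soup.find_all(style=SUSPICIOUS_STYLE):
        text = element.get_text(strip=True)
        if text:
            print(f"<{element.name} style='{element['style']}'>: {text[:80]}")

flag_hidden_text("https://www.example.com/")
```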

AMP content mismatch

Accelerated Mobile Pages (AMP) are pages which have been streamlined to function better on mobile. While the source code of the pages might be very different to their non-AMP counterparts (due to the difference in the way the pages are built), the content should be similar and serve the same purpose. If a site has a notable difference between the content served on non-AMP pages and their AMP counterparts, the pages may incur a penalty from Google.

If you’re penalised for an AMP content mismatch, we recommend reviewing all AMP pages and their non-AMP counterparts. AMP pages should be amended so that the content is representative of what is seen on the non-AMP pages.
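
One way to sense-check a pair of pages is to compare their visible text with a simple similarity ratio, as in the hedged sketch below; the URLs are placeholders and the 0.8 threshold is an arbitrary review trigger rather than a figure from Google:

```python
# Sketch: compare the visible text of an AMP page and its canonical page.
from difflib import SequenceMatcher
import requests
from bs4 import BeautifulSoup

def page_text(url: str) -> str:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    return " ".join(soup.get_text(" ").split())

canonical = page_text("https://www.example.com/article")
amp = page_text("https://www.example.com/article/amp")

similarity = SequenceMatcher(None, canonical, amp).ratio()
print(f"Content similarity: {similarity:.2f}")
if similarity < 0.8:  # arbitrary threshold for manual review
    print("Review this pair - the AMP content may not match the canonical page.")
```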

Sneaky mobile redirects

Mobile redirects are sometimes used to redirect a user to a more appropriate service, such as an m. subdomain. However, sometimes sneaky redirects specifically target mobile users and redirect them to unrelated and possibly dangerous content. It should be noted that these can often be caused by malicious third-party code or hacking. 

We recommend ensuring the site has not been hacked by checking the Security Issues dropdown in Search Console (located in the sidebar). If no security issues are flagged, we recommend crawling the site with the SEO Spider, with the user agent set to ‘Googlebot Smartphone’. The crawl should include all resources, and redirects should be followed. You can then use the crawl data to assess which redirects are manipulative.
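
Alongside a full crawl, individual URLs can be spot-checked by following their redirect chains with a Googlebot Smartphone-style user agent and comparing the destination with what a desktop user agent receives. The sketch below does this; the user agent strings are approximations and the URL list is a placeholder:

```python
# Sketch: compare final redirect destinations for mobile bot vs desktop user agents.
import requests

SMARTPHONE_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Mobile "
    "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)
DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def final_url(url: str, user_agent: str) -> str:
    response = requests.get(
        url, headers={"User-Agent": user_agent}, allow_redirects=True, timeout=10
    )
    return response.url

for url in ["https://www.example.com/", "https://www.example.com/offers"]:
    mobile_destination = final_url(url, SMARTPHONE_UA)
    desktop_destination = final_url(url, DESKTOP_UA)
    if mobile_destination != desktop_destination:
        print(f"{url} -> mobile: {mobile_destination} | desktop: {desktop_destination}")
```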

We also recommend manually spot checking the site by emulating a mobile device in Google Chrome DevTools. This can be done by right clicking on a page and selecting Inspect, toggling the device toolbar by clicking the smartphone icon in the top-left of the panel, and refreshing the page.

Any sneaky redirects should be removed. If third-party code is the source of the redirects, we recommend removing it entirely as it cannot be trusted.

Pure spam

Pure spam is an intentional combination of the techniques we’ve already listed, such as cloaking, scraping and auto-generated content, used with the intention of manipulating rankings. Typically, a site will receive a pure spam manual action when there is no doubt it is intentionally engaging in practices which break Google’s guidelines.

If hit with a manual action for pure spam, we recommend auditing for all of the above issues, and resolving them ASAP so Google guidelines are met.

Impact of manual actions

The impact of a manual action can differ depending on the severity of the guideline broken, as well as how widespread the breaches are.

If a guideline breach is localised to only a few pages, it’s probable that only those pages will be penalised (and possibly deindexed). However, if the guideline breach is spread across an entire site it’s possible the entire domain will be penalised and deindexed.

If a guideline breach is less significant and does not impact the user experience of a site, a manual action isn’t such a big deal. For example, schema manipulation (which should not directly impact users) will often result in rich results being hidden for the site rather than pages being deindexed.
