Google Webmasters Archives | Offshore Web Development Services India - Brain Technosys
https://www.braintechnosys.com/blog/category/google-webmasters/

SEO 101: The Importance of Adding Structured Snippets
https://www.braintechnosys.com/blog/seo-101-the-importance-of-adding-structured-snippets/ (December 2, 2014)

A rich snippet, also known as a structured snippet, is the extra text found underneath a search result that gives users additional information about the link and the content behind it. For instance, were you to search for a popular film, you would probably find IMDb (the Internet Movie Database) among the top results, and that result would likely display the film's star rating and possibly the number of votes contributing to that score.

Likewise, you will probably see a few results showing sites where you can buy the DVD, in which case the snippets might provide prices. If the film is showing in theatres, then some results might show movie times. You might also notice that some articles have dates next to them.

These are all examples of rich snippets in action, but for a bit more information and explanation on how to use them, see this advice from Google.
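To make this concrete, here is a minimal, hypothetical sketch of what review markup can look like using schema.org microdata. The film name, author and rating values are placeholders, and Google ultimately decides whether any given markup earns a rich snippet:

```html
<!-- Hypothetical example: a film review marked up with schema.org microdata -->
<div itemscope itemtype="https://schema.org/Review">
  <div itemprop="itemReviewed" itemscope itemtype="https://schema.org/Movie">
    <span itemprop="name">Example Film</span>
  </div>
  <span itemprop="author">Jane Doe</span> rated it
  <span itemprop="reviewRating" itemscope itemtype="https://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </span> stars.
</div>
```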

Why Rich Snippets Are Crucial For SEO

The first major advantage of rich snippets is the simple fact that they draw users' attention. With Google Authorship done and dusted, rich snippets are now one of the only ways to make your results stand out against the competition on the SERPs (unless you can get featured as an in-depth article or as news). In short, someone is more likely to click on a film review with the star rating showing beneath it than on one without. In fact, many companies report seeing a 20-30% rise in CTR after implementing rich snippets. That is the kind of boost any company, including a new startup, should understand and implement right from the inception of its website.

At the same time, the presence of stars also confirms that the link really is a film review or a store selling a product, which again increases click-through rates and reduces bounce rates, because users know what to expect before they land on the site.

With Google’s data highlighter (see below), it’s now possible to add a lot more structured data to a site, including information about articles such as the title, author, and publish date. This could help to increase the perceived reliability and again boost CTR – we all feel more comfortable trusting information that was written recently.

Crucially, information regarding your business can also help Google to understand the location of your business, which is very useful for local cinemas displaying film times, for instance. Local businesses are one group who should strongly consider looking at rich snippets (see more information here from Google).
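As a rough, hypothetical sketch of what such local business markup might look like (the name, address and phone number are placeholders, and the schema.org MovieTheater type is chosen here just to fit the cinema example):

```html
<!-- Hypothetical example: a local cinema marked up with schema.org microdata -->
<div itemscope itemtype="https://schema.org/MovieTheater">
  <span itemprop="name">Example Cinema</span>
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">12 High Street</span>,
    <span itemprop="addressLocality">Springfield</span>
  </div>
  Call us on <span itemprop="telephone">+1-555-0100</span>.
</div>
```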

Perhaps what’s more important though, is that by using rich snippets you are helping Google to understand the content on your website – which can only be a good thing. Google says using rich snippets won’t directly affect search rankings, but the same may not be true of all search engines and the role of rich snippets is only likely to increase going forward.

Rich Snippets: The Past and Future

Support for the microdata that makes structured snippets possible is something Google, Microsoft and Yahoo all pledged in a rare moment of collaboration. The three companies united to support rich snippets because they see them as an important step for the future of search and the web generally. They would have been unlikely to go to such lengths unless they had big plans, which should tell you that snippets will likely be an important factor going forward.

The most probable reason for this is that future products from these companies will rely on rich snippets more and more. Google is already making it a habit to display more information directly on their SERPs (thus giving users less reason to navigate quickly away), and it is very easy to see how AI tools such as Google Now, Siri, or Cortana would make use of the readily available information.

Does this actually benefit webmasters, given that future users could have the information dictated to them with no need to visit the actual website? Perhaps not, but it is certainly better to be the site being referenced than one of the countless sites left behind that will not get visited either.

And if future systems cite their sources, then such features could at least provide a boost for your brand’s visibility and your site’s authority in the eyes of visitors. Likewise, if you want the AI of the future to recommend your local business, then you should have organization information in your snippets.

In short, rich snippets are a critical tool in the construction of the semantic web, and if you want to be a part of this evolution then you need to embrace them.

There are other potential uses for rich snippets going forward too. A smart watch, for instance, could potentially display an abridged version of a website to accommodate a small display, while the ambitious and somewhat amazing VR browser Janus could also make use of the feature (Janus converts web pages into 'rooms' to be explored in virtual reality). Rich snippets are essential for future-proofing your website.

A Word of Caution

While rich snippets are important for the clicks you get from Google and for your future compatibility with the web, it is also important to exercise restraint.

Rich snippets give you the opportunity to spam. For instance, there is nothing, in theory, to stop you from giving your own website or article a five-star review. Likewise, you could fill your rich snippets with sales copy and tell Google it's a "recipe." But it is vital that you avoid these underhanded methods, because Google is onto this trick and has been issuing manual penalties for this kind of spam for a while now.

Rich snippets are not for everyone. If you run a blog dealing in fitness articles, then there may not be much need for rich snippets yet (though basic information about your articles could still be useful). On the other hand, sports scores, recipes, song lyrics, prices, dates and star ratings can all be very useful in increasing CTR. Events can also benefit from structured snippets, and local businesses should be sure to include information about their business.

Getting Started

At the same time, irrelevant information or faulty markup could hurt your site more than help it. Google has a rich snippets testing tool you can use to make sure your rich snippets are working effectively, and there are several WordPress plugins that can do the job for you.

Alternatively, you may be able to use Google's data highlighter to outline information more easily. Google's highlighter tool allows webmasters to add more versatile information to their site listings without getting down and dirty with code. The data highlighter is available through Webmaster Tools and lets users point and click to highlight the information they want to outline for Google. It is a pretty much fool-proof way of adding structured data to your site, but it will only work with certain types of data.

Structured snippets aren’t going to change your company’s fortunes overnight, so you certainly shouldn’t rely on them. Nevertheless, they are still a very important piece of the huge jigsaw puzzle that is SEO, and they deserve your attention.

Note: This article was originally published on Search Engine Journal.

It's Over: The Rise & Fall Of Google Authorship For Search Results
https://www.braintechnosys.com/blog/its-over-the-rise-fall-of-google-authorship-for-search-results/ (August 29, 2014)

Google has completely dropped all authorship functionality from the search results and webmaster tools.

After three years the great Google Authorship experiment has come to an end … at least for now.

Today John Mueller of Google Webmaster Tools announced in a Google+ post that Google will stop showing authorship results in Google Search, and will no longer be tracking data from content using rel=author markup.

This in-depth article, which I co-wrote with Mark Traphagen, will cover the announcement of the end of Authorship, the history of Authorship, a study conducted by Stone Temple Consulting that confirms one of the stated reasons for the cessation of the program, and some thoughts about the future of author authority in search.

Authorship’s Gradual Slide Toward Extinction

The cessation of the Authorship program comes after two major reductions of Authorship rich snippets over the past eight months. In December 2013 Google reduced the amount of author photo snippets shown per query, as Google’s webspam head Matt Cutts had promised would happen in his keynote at Pubcon that October. Starting in December, only some Authorship results were accompanied by an author photo, while all others had just a byline.

Then at the end of June 2014 Google removed all author photos from global search, leaving just bylines for any qualified authorship results.

At that time, John Mueller in a Google+ post stated that the photos were removed because Google was moving toward unifying the user experience between desktop and mobile search, and author photos did not work well with the limited screen space and bandwidth of mobile. He also remarked that Google was seeing no significant difference in “click behavior” between search pages with or without author photos.

A Brief History of Google Authorship

The roots of the Authorship project go back to Google’s Agent Rank patent of 2007. As explained by Bill Slawski, an expert on Google’s patents, the Agent Rank patent described a system for connecting multiple pieces of content with a digital signature representing one or more “agents” (authors).

Such identification could then be used to score the agent based on various trust and authority signals pointing at the agent’s content, and that score could be used to influence search rankings.

Agent Rank remained a theoretical idea without a practical means of application, until the adoption by Google of the schema.org standards for structured markup. In a blog post in June 2011, Google announced that it would begin to support authorship markup. The company encouraged webmasters to begin marking up content on their sites with the rel=”author” and rel=”me” tags, connecting each piece of content to an author profile.
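A minimal sketch of that (now-retired) markup is shown below; the Google+ profile URL and author name are placeholders:

```html
<!-- On an article page: link the byline to the author's Google+ profile -->
<a rel="author" href="https://plus.google.com/117000000000000000000">Jane Doe</a>

<!-- Or link to an on-site author page with rel="author", and from that page
     link out to the Google+ profile with rel="me". The profile then listed the
     site under "Contributor to" to confirm the connection from the other side. -->
<a rel="me" href="https://plus.google.com/117000000000000000000">My Google+ profile</a>
```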

The final puzzle piece for Authorship to be truly useful to Google fell into place with the unveiling of Google+ at the end of June 2011. Google+ profiles could now serve as Google’s universal identity platform for connecting authors with their content.

In a YouTube video published in August of that year, Matt Cutts and then head of the Authorship project Othar Hansson gave explicit instructions on how authors should connect their content with their Google+ profiles, noted that doing so could cause one’s profile photo to show in search results, and for the first time mentioned that — at some future time — data from Authorship could be used as a ranking factor.

Over the next three years, Authorship in search went through many changes that we won’t detail here (although Ann Smarty has compiled a complete history of those changes). On repeated occasions, though, Matt Cutts and other Google spokespeople reiterated a long-term commitment by Google to the concept of author authority.

Why Has Google Ended the Authorship Program?

Over its entire history Google has repeatedly demonstrated that nothing it creates is sacred or immortal. The list of Google products and services that were introduced only to be unceremoniously discontinued later would fill a small phone book.

The primary reason behind this shuffle of products is Google’s unswerving commitment to testing. Every product, and every change or innovation within each product, is constantly tested and evaluated. Anything that the data show as not meeting Google’s goals, not having sufficient user adoption, or not providing significant user value, will get the axe.

John Mueller told my co-author Mark that test data collected from three years of Google Authorship convinced Google that showing Authorship results in search was not returning enough value compared to the resources it took to process the data.

Mueller gave two specific areas in which the Authorship experiment fell short of expectations:

1. Low adoption rates by authors and webmasters. As our study data later in this article will confirm, participation in authorship markup was spotty at best, and almost non-existent in many verticals. Even when sites attempted to participate, they often did it incorrectly. In addition, most non-tech-savvy site owners or authors felt the markup and linking were too complex, and so were unlikely to try to implement it.

Because of these problems, beginning in early 2012, Google started attempting to auto-attribute authorship in some cases where there was no or improper markup, or no link from an author profile. In a November 2012 study of a Forbes list of 50 Most Influential Social Media Marketers, Mark found that only 30% used authorship markup on their own blogs, but of those without any markup, 34% were still getting an Authorship rich snippet in search. This is similar to data found in a study performed by Eric which is further detailed below.

However, Google’s attempts at auto-attribution of authors led to many well-publicized cases of mis-attribution, such as Truman Capote being shown as the author of a New York Times article 28 years after his death. Clearly, Google’s hopes of being able to identify the web’s authors, connect them with their content, and then evaluate their trust and authority levels as possible ranking factors was in trouble if it was going to depend on the cooperation of non-Google people.

2. Low value to searchers. In his announcement of the elimination of author photos from global search in late June of this year, John Mueller stated that Google was seeing little difference in “click behavior” on search result pages with Authorship snippets compared to those without. This came as a shock (accompanied in many cases with outright disbelief) to those who had always believed that author snippets brought higher click-through rates.

Mueller repeated in his conversation with Mark about today’s change that Google’s data showed users were not getting sufficient value from Authorship snippets. While he did not elaborate on what he meant by “value” we might speculate that this could mean that overall, in aggregate, user behavior on a search page did not seem to be affected by the presence of author snippets. Perhaps over time users had become used to seeing them and they lost their novelty.

It is interesting to note that (as of the time of this posting) author photos continue to appear for Google+ content from people a searcher has in his or her Google network (Google+ circles or Gmail contacts) when the searcher is logged in to her or his Google+ account (personalized search).

When asked, Mueller said he had no knowledge of any plans to stop showing those types of results. However, some users have reported to Mark that they are no longer seeing them. We will watch this development and update here if it looks like Google is indeed removing author photos from personalized results as well.

If Google does continue to show author photos in some personalized results, it would seem to indicate that Google data is showing that when content is from someone with whom the searcher has some personal association, a rich snippet actually does provide value to that searcher. More about this in our final section below.

Study of Rel=Author Implementations

As luck would have it, Stone Temple Consulting was in the process of wrapping up a study on rel=author markup usage. A look at the data illustrates part of the problem that Google faces with an initiative like this one. The bottom line of what we found? Adoption was weak, accurate implementation among publishers that attempted to set up rel=author was poor, and adoption by individual authors was no better. So let's look at the numbers!

Authorship Adoption

We sampled 500 authors across 150 different major media web sites. Here is a summary of what we saw for their implementation of authorship tagging in their Google+ profiles:

A whopping 70% of authors made no attempt to connect their authorship with the content they were publishing on major web sites. Of course, this has much to do with how Google attempts to promote these types of initiatives. In short, they don’t. They rely on the organic spread of information throughout the Interweb ecosystem, which is uneven at best.

Publisher Adoption

50 of the 150 sites did not have any author pages at all, and more than 3/4 of these provided no more than the author’s name for attribution. For the remaining batch, some of them would allow authors to include links with their attribution at the bottom of the article, but the great majority of these authors did not take advantage of the opportunity.

For today’s post, we also took 20 of the sites that had author pages, and analyzed in detail their success in implementing authorship:

  •     13 of the 20 sites attempted to implement authorship markup (65%)
  •     10 of these 13 attempts had errors (77%)
  •     12 of the 13 attempts received rich snippets in the Google SERPs (92%)

The implementation style for authorship was all over the map. We found malformed tags; authorship implemented on the site but with no link to the author's G+ profile; conflicting tags reporting multiple people as the author of a given article; and one situation where an article had two named authors but only the second one linked to their G+ profile, so Google gave the second author credit for that article.

  •     Seven of the 20 sites did not attempt to implement authorship markup (35%)
  •     Two of these seven received rich snippets in the Google SERPs (28%)

In the two cases where Google provided rich snippets even though there was no markup, the authors did link to the site from the "Contributor to" section of their G+ profiles.

Summarizing the Study

In short, proper adoption of rel=author markup was extremely low. Google clearly went to extreme efforts to try and make the connection between author and publisher, even in the face of many challenges. From a broader perspective, this tells us quite a bit about the difficulties of obtaining data from publishers. It’s hard, and the quality of the information you will get is quite low.

Summary

Google has stated many times over the past three years its interest in understanding author authority. It’s hard to forget executive chairman Eric Schmidt’s powerful statements on the topic:

"Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results. The true cost of remaining anonymous, then, might be irrelevance."
(Eric Schmidt in The New Digital Age)

However, this has proved to be a very tough problem to solve. The desire to get at this data is there, but the current approach simply did not work. As we noted above, this is one of the two big reasons why this initiative is being abandoned.

The other problem identified by John Mueller is equally important. The approach of including some form of rich snippet, be it a photo or a simple byline, was not providing value to end users in the SERPs. Google is always relentlessly testing search quality, and there are no sacred cows. If Google is not seeing end users value something it tries out, it will go.

We also can’t ignore the impact of the processing power used for this effort. We all like to think that Google has infinite processing power. It doesn’t. If it did have such power, it would use optical character recognition to read text in images, image processing techniques to recognize pictures, speech to text technology to transcribe every video it encounters online, and it would crawl every page on the web every day, and so forth. But it doesn’t.

What this tells us is that Google has to make conscious decisions on how it spends its processing power — it must be budgeted wisely. As of this moment, the Authorship initiative as we have known it has not been deemed worthy of the budget it was consuming.

The rise of mobile may have played a role in this outcome as well. When John Mueller says staffers don’t see a significant difference in click behavior in the SERPs as a result of Authorship rich snippets, remember that about half of Google’s traffic comes from mobile devices now. Chewing up valuable screen real estate for this type of markup on a mobile device may simply be a bad idea.

So is authorship gone forever? Our guess is that it probably is not. The concept is a good one. We buy into the notion that some people are smarter about certain topics than others. It is the current attempts at figuring this out that have failed, not the concept.

As Google moves forward in its commitment to semantic search it has to develop ways to identify entities such as authors with a high degree of confidence apart from human actions such as markup. Recent announcements about Google’s Knowledge Vault project would seem to reinforce that Google is moving steadily in that direction. So this may be how it approaches detection.

If, and when, it makes use of such data, what will it look like? Don't be surprised if the impact is too subtle to be easily noticed. We will probably not see author photos in the results ever again. Could we see some form of AuthorRank? Possibly, but it may come in a highly personalized form or get blended in with so many other factors that detecting it becomes virtually impossible.

So goodbye for now, Authorship. You were a grand and glorious experiment, and we will miss you, but we look forward to something even better for Authorship in the future.

Note: This article was originally published on Search Engine Land.

Authority vs. Popularity: Matt Cutts Teases New Google Search Result Shake-Up
https://www.braintechnosys.com/blog/authority-vs-popularity-matt-cutts-teases-new-google-search-result-shake-up/ (April 8, 2014)

Over the years, Google's algorithm has changed to take social factors into account more than it did previously. And with so much social sharing comes the issue of how to separate what is simply popular, or even a one-hit wonder in the social media world, from something with true authority. This is the latest topic from another of Matt Cutts' Google webmaster help videos.

As Google continues to add social signals to the algorithm, how do you separate simple popularity from true authority?

“We’ve actually thought about this quite a bit because from the earliest days it would get us really kind of frustrated when we’d see reporters talk about PageRank and say PageRank is a measure of the popularity of websites because that’s not true,” Cutts said. “For example, if you’re to look at sites that are popular, for example porn sites are very popular, but people tend not to link to porn sites. On the other hand, if you take something like the Wisconsin real estate board, probably not a ton of people go there, but quite a few people do link to government websites.”

This is definitely true. If PageRank were simply based on popularity, we would see a different mix of websites with high PageRank than we currently do.

"So popularity in some sense is a measure of where people go, whereas PageRank is much more a measure of reputation, it's much more a reputation of where people link, and there is a disparity there or else porn sites would have the highest PageRank and government sites would be very, very low within our ranking system, and that's not the way that things work. We tend to see more links to reputable government websites."

So if Google can separate popularity from authority, how does it then decide, based on those factors, which search results to show for a specific query?

"Well it turns out you can say, take PageRank for example: if you want to do a topical version of PageRank, you could look at the links to a page and say, 'OK, suppose it's Matt Cutts, how many of my links actually talk about Matt Cutts?'" he said. "And if there are a lot of links, or a large fraction of links, then I'm pretty topical, or maybe an authority, on the phrase Matt Cutts."

You always hear about how important authority is. Whenever you can make your site or your persona an authority in your particular market area, not only can you benefit in the search rankings, but you can also benefit simply from other members of your market perceiving you as an authority and promoting you and linking to you because of it.

“So it’s definitely the case that you can think about not only taking popularity and going to something like reputation, which is PageRank, but you can also imagine more topical or you’re an authority in the medical space, or you’re an authority in the travel space or something like that,” Cutts said. “By looking at extra signals where you can say oh you know what, as a percentage of the sort we see you are doing well for, or whatever, it turns out your links might be including more anchor text about travel or medical queries or something like that.”

Cutts also gives us a rare heads-up about some algorithmic changes that are coming to the search results. Google is going to try harder to distinguish between a site that is simply popular and a site that is a topical authority. Because he uses the example of medical queries, this could be an algorithmic change targeted at specific niche areas, such as medical queries or perhaps things like travel or legal queries.

“So it is difficult, but it is a lot of fun. We actually have some algorithmic changes that try to figure out hey this site is the better match for something like a medical query,” Cutts said. “And I’m looking forward those rolling out because a lot of people have worked hard so that you don’t just say oh this is a well-known site therefore should match for this query, it’s this is a site that actually has some evidence that it should rank for something related to medical queries, and that’s something where we can improve the quality of the algorithms even more.”

Note: This article was originally published on Search Engine Watch.

7 Things You May Not Know About Google's Disavow Tool
https://www.braintechnosys.com/blog/7-things-you-may-not-know-about-googles-disavow-tool/ (March 5, 2014)

Are you completely obsessed with understanding and getting the most benefit out of the Google Disavow Links Tool?

This tool has been a mystery to many since it was announced in October 2012, and several misconceptions surround its use.

Here are seven facts that you may not know about the disavow tool.

1. Disavowed Links are Still Seen in Webmaster Tools

I will commonly see people asking in forums why the disavow tool isn’t working for them. “I disavowed thousands of links, but I still see them in my Webmaster Tools backlinks!”

When a link is disavowed, the next time that Google crawls that link they essentially add an invisible nofollow tag to the link. There is no external evidence of this. Just as your nofollowed links are listed in WMT, so are your disavowed links.

In this webmaster central hangout Google’s John Mueller said, “Disavowed links stay in Webmaster Tools” and in this hangout he said, “When you disavow links we will still show them as inbound links in Webmaster Tools.”

2. There is a Size Limit to the Disavow File

The disavow file has a 2 megabyte size limit according to Google employee Aaseesh Marina. This is still quite large though.

Two megabytes of text is essentially the same as 1,000 full pages of text. Even my largest disavow files have come nowhere near this size limit.

3. The Webspam Team Doesn’t Read Comments in Your Disavow File

The official documentation for the disavow tool is a bit confusing when it comes to comments. The example file it gives precedes each disavowed entry with a comment explaining why those links are being disavowed.

This makes it look like we should put an explanation in our disavow file for every single link that we disavow. But really, the comments in the disavow file are meant for your own use, to make the file easier to understand if you need to edit it in the future.

Once again, here is a quote from Mueller in a hangout: “The disavow file is something that is processed completely automatically. If you put a lot of text in those comments in the disavow file, then nobody will be looking at them. Those comments are essentially for you, to help you understand the file a little bit better and those comments are not used by the webspam team”.

I tend to use comments to help me classify the different types of links in my disavow file and when they were added. Here are some examples of comments that I will use in my files:

  •     # Added Mar 1, 2014: These are domains where we tried to remove links but did not succeed.
  •     # Added Mar 1, 2014: These are sites we did not visit to evaluate because they gave a malware warning.
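Putting that together, a disavow file is just a plain-text list of URLs and domain: entries, with lines starting with # treated as comments. Here is a small, hypothetical sketch; the domains and URLs are placeholders:

```text
# Added Mar 1, 2014: domains where we tried to remove links but did not succeed
domain:spammy-directory.example
domain:paid-links.example

# Added Mar 1, 2014: individual URLs disavowed rather than the whole domain
http://blog.example.com/2013/05/post-with-a-paid-link/
http://forum.example.net/profile/12345
```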

4. You Don’t Need to Include Nofollowed Links in the Disavow File

A nofollowed link doesn’t carry PageRank and won’t affect your Google rankings. Here is more information on what Google says about whether to include nofollowed links in your disavow file.

“You don’t need to include any nofollow links…because essentially what happens with links that you submit as a disavow, when we recrawl them we treat them similarly to other nofollowed links,” Mueller said. “Including a nofollow link there wouldn’t be necessary.”
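For reference, a nofollowed link is simply an anchor tag carrying the rel="nofollow" attribute, as in this small example:

```html
<!-- The rel="nofollow" attribute tells Google not to pass PageRank through this link,
     so there is no need to also list it in a disavow file. -->
<a href="http://example.com/some-page/" rel="nofollow">Example link</a>
```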

5. Disavowed Links Can be Reavowed

If you have added a link to your disavow file in error, or if you change your mind about disavowing a particular link, you can remove the link from your file and reupload it. The next time that Google visits that particular link, they will see that it is no longer in your disavow file and will start counting that link toward your PageRank again.

If a link you reavow was indeed one that Google had considered unnatural, removing it from your disavow won’t do any good and actually could do you harm. A client of mine got penalized a second time by Google by reinstating links that they had previously disavowed. When you get penalized a second time, Google makes you work even harder to get your penalty lifted!

A good example of a situation where you might want to reavow a link would be the case where you have disavowed an entire domain, but now have a truly natural link from that domain. Let’s look at an example.

Let’s say you had an unnatural links penalty and a good portion of your unnatural links came from keyword anchored links in a widget that was embedded by a large number of sites. Perhaps a high quality site had embedded your widget and you had disavowed it at the domain level. But now, that high quality site has actually mentioned your business and linked to you. Because the entire domain is disavowed, that natural link won’t count.

What you would do in that situation is remove the domain:example.com directive from the disavow and insert the url on which your widget is listed. (This is assuming you couldn’t get the link from the widget removed.) If you do this, be careful to include every URL that could link to this widget as the link may exist on:

  •     example.com/widget_page.html
  •     example.com/category/widgets/
  •     example.com/archive/page2.html

…and so on.

The next time that Google recrawls this site, they will only disavow the specific URLs that are in your disavow file and links on other pages of this domain will be reavowed to your site.
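In disavow-file terms, that change might look something like the sketch below, using the example.com placeholder from above:

```text
# Before: the whole domain is disavowed, so even the new natural link does not count
domain:example.com

# After: only the pages carrying the widget link are disavowed; links from the rest
# of example.com are reavowed the next time Google recrawls and reprocesses them
http://example.com/widget_page.html
http://example.com/category/widgets/
http://example.com/archive/page2.html
```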

Here is Mueller explaining that links can be reavowed:

"Links are essentially only disavowed as long as they are in the disavow file. So, if you remove them after some point, then essentially when we recrawl and reprocess those URLs … then we will treat those as normal links again. If you remove them, then essentially you are returning them to their normal state. If they were problematic links in the past then they would be problematic links again."

6. A Disavow May Not Work Through a 301 Redirect

Let's say you've got bad links pointing to Site A and you disavow those links. You then institute a 301 redirect to Site B. A redirect passes close to 100 percent of the link equity associated with a link, and it will pass unnatural link signals as well.

You would think that disavowing the links pointing to Site A would essentially nofollow them and break the flow of PageRank through to Site B, but Mueller said, "Generally speaking, I'd use the same disavow file on both of the domains if you are redirecting from one domain to the other one so it's [the link] kind of taken out from both sides."

This is an iffy point. It sounds to me like with a straightforward redirect you are probably safe to just disavow the original source. In the example that Mueller was talking about, the site owner was asking about multiple redirects and canonicals and the situation was muddy. Still, if I was redirecting pages from one site to another and the original site had bad links, I would also add those bad links to the disavow file for the second site.

7. Disavow Data Isn’t Used Against the Site Being Disavowed

This is a contentious point. You will find a good number of people who thoroughly believe that Google is crowdsourcing the data obtained by the disavow tool and using it as a mass spam reporting tool.

When I write to webmasters and request link removal, I will often get replies back saying, “I removed your link. Please don’t add me to your disavow file!!!” The site owners are obviously concerned that if I disavow their link then I am reporting their domain as a spam domain to Google.

Here’s what Mueller said about this thought:

"When it comes to the disavow links tool, at the moment we are not using that data in any way against the sites that are being disavowed because there are just so many reasons why a link might be disavowed. It might be that it's a perfectly fine site but for some reason the ads on that site are passing PageRank and maybe the webmaster is not aware of that and that's not something that we would say, 'Oh, this is a spammy site', because some of these ads are passing PageRank. Or maybe they have comments on a blog or on articles that they publish and people have been spamming those comments. Just because those links are in someone's disavow file, it doesn't mean that the content on that site is necessarily bad."

I find it interesting though that Mueller said that they are not using the disavow data against other sites “at the moment”. It is perhaps possible that Google is gathering this data to help improve their algorithms in the future provided they can find ways to weed out the false positives. But, at this point it’s not like using the disavow tool against another site is the same as filing a spam report.

This video has a similar discussion about disavow data, and Mueller also said, “It’s not that we are using this [the disavow tool] as a spam report form. So, when you get a message saying, ‘You should remove this link to my website or else I’ll put you on my disavow file’, that’s not a threat for your website. It’s not a problem to have your website on a disavow file.”

Summary

There is a reason why Google tells webmasters that the disavow tool is an advanced tool and should be used with caution. Using it improperly can do real harm to your site.

Google has been very vague in its explanations to webmasters on how to use this tool. In fact, in one hangout, Mueller was asked why there is no direct link from Webmaster Tools to the disavow tool, and his response was that Google doesn't want webmasters to use the tool if they don't know what they are doing.

Hopefully these tips have helped you to better understand the disavow tool.

Note: This article was originally published on Search Engine Watch.

Google's Matt Cutts: We Dropped The 100 Links Per Page Guideline But We May Take Action If It Is Too Spammy
https://www.braintechnosys.com/blog/googles-matt-cutts-we-dropped-the-100-links-per-page-guideline-but-we-may-take-action-if-it-is-too-spammy/ (November 26, 2013)

Google’s Matt Cutts posted a video explaining why Google no longer has that 100-links-per-page Webmaster guideline.

In fact, the guideline was dropped well before 2008, but SEOs and webmasters still think having over 100 links on a page is something that may lead to a penalty.

The truth is: no, it won't. A site like Techmeme likely has thousands of links on its home page, and it is not penalized by Google.

That being said, Google said that if a site looks spammy and has far too many links on a single page, it reserves the right to take action against the site.

Matt also explained that the PageRank a page passes on is divided among the links on that page. So if page A links to pages B, C and D, the PageRank it passes is split three ways. If the page has hundreds of links, it is divided by hundreds, and so forth.
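As a back-of-the-envelope illustration of that division, using the classic simplified PageRank model (ignoring the damping factor and the many refinements Google actually applies):

```latex
% Simplified model: page A shares the PageRank it passes equally among its n outbound links
\text{PR passed per link} \approx \frac{PR(A)}{n}
% With 3 links (to B, C and D), each link carries roughly PR(A)/3;
% with 300 links on the page, each carries only about PR(A)/300.
```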

Note: This article was originally published on Search Engine Land.

Amazing Google Webmasters Tool for Security Issue
https://www.braintechnosys.com/blog/amazing-google-webmaster-tool-for-security-issue/ (November 6, 2013)

Google has introduced a new section in Webmaster Tools termed "Security Issues." This section is designed to communicate security matters such as malware and website hacks to site owners more effectively, and to offer a more comprehensive and concise means of solving the problem and submitting a review request.

In the new Security Issues section, you will be able to:

  • Get more information regarding the security issues on your website, at one place.
  • Pinpoint the issue quicker with comprehensive code snippets.
  • Request a review for all issues in one go via the new simplified procedure.

Get More Information:

You'll be able to view all kinds of security issues that might affect your website, such as malware, error template injection, code injection, content injection, SQL injection spam and a lot more.

[Screenshot: a few of these security issues as observed in Webmaster Tools]

Identify The Security Threat:

Google will then allow you to zoom in on the actual malware or hack, so that you can quickly scan your website for a match on that dubious snippet of code or content.

[Screenshot: an example of a flagged code snippet as shown in Webmaster Tools]

Quicker Review Requests:

Finally, after you resolve the issue(s), you can check the box that says "I have fixed these issues" and then click the "Request a review" option to start the review procedure. This speeds up the process of removing a hack or malware warning from your website's Google snippet and/or getting your website added back into Google's search results.
