

Understanding Big Daddy


  #1  
Old 12-14-2009, 06:25 AM
bholus10 bholus10 is offline
Award Winner
 
Join Date: Nov 2009
Posts: 10,043

Big Daddy – what an appropriate name. When foretelling the Big Daddy update, Matt Cutts said
“But this is neither an update nor a data refresh; this is new infrastructure. It should be much more subtle/gentle than an update.”
http://www.mattcutts.com/blog/bigdaddy/

Yeah, right.

Big Daddy will probably be remembered by many webmasters as being as disruptive as the worst of the past updates – right in line with the magnitude of the Florida or Austin updates. Of course, an update is considered bad only in a relative sense. Those who see a drop in rankings or a drop in indexed pages typically are not fans of the update, while those who fare well come to the defense of Google.

If you look on any webmaster forum, you will certainly be able to find a Big Daddy related post. Big Daddy has been associated with several major problems: webmasters with large websites have seen thousands of pages dropping from the index or relegated to a supplemental status, hyphenated domains for a while were not being counted at all, and 301 redirects seem to still have problems.

Unlike past updates, Google has tried to keep webmasters informed as to what we ought to do in order to preserve/increase our rankings. In this regard, Matt Cutts has been a chosen messenger. Before Big Daddy went live, Matt informed the webmaster community that it was coming. Now that it is fully deployed, Matt has spent time trying to give the webmaster community a set of rules to follow to increase the chance of being looked upon favorably by the Google 'gods'.

Much of what has been said can now be reduced to 5 commandments:

Thou Shalt Know Thy Redirects



Redirects are a tricky beast. Google has had their fair share of problems with redirects, and Big Daddy was partly rolled out to combat these problems. The result? Well, redirects appear to still be a problem – at least for now.

There are two main types of redirects available to webmasters: the 301 redirect and the 302 redirect. A 301 redirect is the most common type of redirect. It is used to tell Google that a page or a site has been permanently moved. A 302 also tells Google that a page has been moved, but only temporarily.

The rule that webmasters should follow is simple: anytime you change a page's location on your site, you must put in place a 301 redirect to tell Google where the page went. You should never make the same content available in two separate locations, and it is preferable to not simply drop one page while adding another page. Google trusts sites that show consistency in their index, and by using a 301 redirect, you show Google a level of consistency that they can trust.
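The distinction between the two status codes can be sketched in a few lines. This is a hypothetical illustration (the paths and domain are made up), not how any particular server frames its responses:

```python
# Toy illustration of the two redirect types.
# A 301 tells crawlers the move is permanent; a 302 says it is temporary.

def redirect(location, permanent=True):
    """Build a minimal HTTP redirect response as (status, headers)."""
    status = "301 Moved Permanently" if permanent else "302 Found"
    return status, [("Location", location)]

# A page that has moved for good should answer with a 301, so Google
# knows to transfer the old URL's standing to the new one.
status, headers = redirect("http://www.example.com/new-page.html", permanent=True)
print(status)                      # 301 Moved Permanently
print(dict(headers)["Location"])   # http://www.example.com/new-page.html
```

The only difference on the wire is the status line, but to Google it is the difference between "update your index" and "keep the old URL".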

You can read more on redirects at the following URLs:
The Little 301 That Could
SEO Advice: Discussing 302 Redirects
Redirecting Your Way Out of Google

A Link for a Link Will Lead to SERP Blindness



You have heard 'an eye for an eye' and 'a tooth for a tooth', but a link for a link could lead you into disappearing from the results of Google. The topic of link exchanges is one that is debated at some length by webmasters – many believe they are a penalty waiting to happen while others see plenty of examples of successful link exchanges.

There is more reason now to believe that link exchanges not only offer a marginal benefit, but possibly could even hurt your website's rankings. In the past, the argument for link exchanging was that there are plenty of successful examples of websites that are using link exchanges with very good success. Big Daddy is changing this.
“As these indexing changes have rolled out, we’ve been improving how we handle reciprocal link exchanges and link buying/selling.”
http://www.mattcutts.com/blog/indexing-timeline/

Anyone who follows Google knows that they have, time and time again, stated that they do not like link exchanges or purchased links. To them, this represents a way of falsely increasing your search engine rankings by manipulating your inbound link count. They have been combating link exchanges for some time, and slowly they seem to be making progress in weeding out link exchanges.
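How a search engine might even notice a link exchange is not mysterious at the simplest level: reciprocal pairs are easy to spot in a link graph. Here is a toy sketch with made-up domains; Google's actual signals are certainly far more sophisticated than this:

```python
# Toy sketch: find reciprocal link pairs in a link graph.
# (Hypothetical data; real link analysis weighs far more than this.)

def reciprocal_pairs(links):
    """links: set of (source, target) pairs. Returns reciprocal pairs."""
    return {tuple(sorted(pair)) for pair in links
            if (pair[1], pair[0]) in links and pair[0] != pair[1]}

graph = {
    ("siteA.com", "siteB.com"),
    ("siteB.com", "siteA.com"),   # A and B trade links
    ("siteC.com", "siteA.com"),   # one-way, editorially given
}
print(reciprocal_pairs(graph))    # {('siteA.com', 'siteB.com')}
```

The point is not that a reciprocal link is automatically bad, but that it is trivially detectable, which is why "excessive reciprocal links" make such a convenient low-trust signal.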
“The best links are those which are editorially chosen. Linked because of the site's merit. Some of the best SEOs these days are those who are really good at buzz marketing, viral marketing, and word-of-mouth marketing.

Tactics like lavishing on reciprocal links; or reciprocal links don't work as well -- let's try this fad called triangular linking; or let's try buying links; all these sorts of things. These are not the sort of links that are best for your site.

They're certainly more high risk. Buying links is extremely risky. It falls outside of our guidelines, unless you add a no-follow tag. And that's a very simple way to say, "You know what? I only wanted the traffic. I'm not concerned with search engines." ”
http://www.clickz.com/experts/search...le.php/3605961

There certainly will still be examples of sites that use link exchanges successfully – Google is not perfect. But one thing is certain: they are actively working on ways to reduce the benefit of link exchanges and purchased links. In fact, those who abuse link exchanges could actually be hurting their rankings.
“The sites that fit “no pages in Bigdaddy” criteria were sites where our algorithms had very low trust in the inlinks or the outlinks of that site. Examples that might cause that include excessive reciprocal links, linking to spammy neighborhoods on the web, or link buying/selling. The Bigdaddy update is independent of our supplemental results, so when Bigdaddy didn’t select pages from a site, that would expose more supplemental results for a site.”
http://www.mattcutts.com/blog/indexing-timeline/

If you are a webmaster who engages in heavy link exchanging, you may want to consider abandoning your link exchanges, requesting that your links be removed, and building links to your site through traditional marketing methods (you know them: public relations, active advertising, and buzz marketing).

The age of reciprocal links is dying, and dying quickly.

Thou Shalt Not Call Thyself By Any Other Name



Canonicalization is a term that is thrown around quite a bit in SEO circles. As a general rule, any word longer than 8 letters needs a little explanation.

When Google visits your website, they try to find your home page. However, there are many different ways you can access most homepages. Below are a few examples:

http://yourdomain.com
http://www.yourdomain.com
http://www.yourdomain.com/index.html
https://www.yourdomain.com (a secure URL)
https://yourdomain.com
etc
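The idea of collapsing all of those variants down to one preferred URL can be sketched in a few lines. This assumes, purely for illustration, that the www + http form is the preferred one; Google's real canonicalization logic is not public:

```python
# Toy canonicalizer: collapse the homepage variants above to one form.
# Assumes "http://www." is the preferred flavor (an arbitrary choice here).
from urllib.parse import urlsplit

def canonicalize(url):
    parts = urlsplit(url)
    host = parts.hostname or ""
    if not host.startswith("www."):
        host = "www." + host
    path = parts.path
    if path in ("", "/", "/index.html"):
        path = "/"
    return "http://" + host + path

variants = [
    "http://yourdomain.com",
    "http://www.yourdomain.com",
    "http://www.yourdomain.com/index.html",
    "https://www.yourdomain.com",
    "https://yourdomain.com",
]
print({canonicalize(u) for u in variants})  # all five collapse to one URL
```

Every variant maps to the same string, which is exactly the judgment Google has to make for you when your links are inconsistent.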

Even if you were to make all of your links back to your homepage the exact same style, outside sources may link to different flavors of your homepage. The process by which Google decides which URL to list as your homepage is referred to as canonicalization.

It is no secret that Google has been actively pursuing duplicate content, especially duplicate content within your own website. When there are multiple links that lead to the same page, with identical content, you are setting yourself up for a duplicate content penalty. The problem of canonicalization is one that ultimately rests on Google – webmasters whose homepage is accessible at both a www version and a non-www version are not spamming Google with duplicate content, but Google cannot always tell that the two versions are the same page.

Part of the Big Daddy update was to address the problem of canonicalization. As we have seen in the past, even though Google rolls out an update to fix a problem, the problem does not necessarily go away completely.
“You can make your webserver so that if someone requests http://example.com/, it does a 301 (permanent) redirect to http://www.example.com/ . That helps Google know which url you prefer to be canonical. Adding a 301 redirect can be an especially good idea if your site changes often (e.g. dynamic content, a blog, etc.). ”
http://www.mattcutts.com/blog/seo-ad...onicalization/

So just how do you do this? The solution is simple if you have mod_rewrite and access to your .htaccess file. Simply open up your .htaccess file and add the following, substituting your own domain:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [L,R=301]

(Note the escaped dot and the trailing $ in the RewriteCond – without them the pattern could match other hosts that merely contain your domain name.)


Keep in mind that canonicalization does not just apply to the root of your website. You should make sure that any directories, such as http://yourdomain.com/directory/ are linked to consistently (in other words, do not link to it as http://www.yourdomain.com/directory/ one time, then link to http://www.yourdomain.com/directory/index.html another time).
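You can sanity-check what the rewrite rule above should do by simulating it. This toy Python equivalent is no substitute for testing the live .htaccess with a browser or curl, and yourdomain.com is a placeholder just as in the snippet:

```python
# Toy simulation of the non-www -> www 301 rule shown above.
# (yourdomain.com is a placeholder, exactly as in the .htaccess snippet.)
import re

def apply_rewrite(host, path):
    """Return (status, location) if the rule fires, else None."""
    if re.match(r"^yourdomain\.com$", host, re.IGNORECASE):
        return 301, "http://www.yourdomain.com" + path
    return None  # host is already canonical; serve the page normally

print(apply_rewrite("yourdomain.com", "/directory/"))
# (301, 'http://www.yourdomain.com/directory/')
print(apply_rewrite("www.yourdomain.com", "/directory/"))
# None -- no redirect loop on the canonical host
```

The second call is the important one: the condition must not fire on the www host, or every request would redirect forever.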

You can also find other ways of establishing a redirect by referencing the article Redirecting Your Way Out of Google

Be Thou Confident in Thine Originality – Do Not Take Comfort in the Words of Others



Duplicate content is a quick way into Google's supplemental 'hell' (or purgatory, if you are Catholic like myself), or even worse, an outright ban. Historically, search spammers have made it quite difficult on webmasters by copying content en masse in an attempt to get as many pages in Google as possible. Because of this, duplicate content has become public enemy #1 to Google.

If you navigate to any webmaster forum, you will inevitably run into the question “How much duplicate content can I have before it is considered duplicate content?” So much for living in the spirit of the rule...

Unfortunately, no one can actually answer this question. In all likelihood, Google is not simply comparing two similar pages and making a percentage-duplication judgment (although this is probably done to some extent). Other factors will inevitably enter the equation, such as how well a page is trusted, how well it is linked to from other highly trusted websites, and how much original content appears throughout the site.

A general rule to follow with your content is to seek out originality at all costs. If you run a web store, make it a goal to add unique content to every page, and take the time to possibly differentiate your product descriptions as much as possible. If you have a blog, do not just post quotes from other articles, but take the time to offer your thoughts and perspectives on the content.
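One crude way to get a feel for "how duplicate" two pages are is shingle comparison: break each page into overlapping runs of words and measure the overlap. Google's duplicate detection is certainly far more sophisticated; this toy sketch only illustrates why near-copies are easy to catch:

```python
# Toy shingle-based similarity check (illustrative only; real duplicate
# detection at search-engine scale uses far more advanced techniques).

def shingles(text, k=3):
    """Break text into overlapping k-word runs ('shingles')."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b):
    """Jaccard similarity of the two shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0

original  = "acme widget with steel frame and lifetime warranty"
copied    = "acme widget with steel frame and lifetime warranty"
rewritten = "our take on the acme widget a steel framed tool we cover for life"

print(similarity(original, copied))         # 1.0 -- an exact copy
print(similarity(original, rewritten) < 0.5)  # True -- mostly original
```

A boilerplate product description pasted across a hundred pages scores like the first pair; a genuinely rewritten one scores like the second.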

Big Daddy was a change to Google's infrastructure to make crawling and indexing more efficient. You can certainly bet that low quality websites that are filled only with content that can be found elsewhere will not fare well in the long-run.

It Is Easier for a Camel to Pass Through the Eye of a Needle than a Low Trust Website to Be Included



The name of the new Google SEO game is trust. Google has become a skeptic – they are slow to trust new websites and are slowly becoming less and less likely to spend time with sites that they do not trust.

Greywolf had a great insight recently that there may be a new type of 'sandbox', this time a crawling sandbox.
“Here’s the way I see it, if your website is missing the right ‘quality indicators’ what you’ll start to see is superficial crawling and indexing of your website. Your site which may have had hundreds, thousands or even hundreds of thousands of pages will just not be as well represented in Google’s index as you would like it to be. ”
http://www.wolf-howl.com/seo/superfi...eo-strategies/

Whether or not a sandbox exists is a concept that is fairly well debated and documented. Many people have brought examples of sites that did not suffer through a 'sandboxing' period, while the majority of sites do seem to suffer through a waiting period of sorts before Google trusts the site. Whether or not an actual sandbox exists, and whether or not there is a new type of sandbox, as Greywolf suggests, ultimately is a moot point. We still know that Google does not want to rank 'shady' sites well, and they want to rank 'trustworthy' sites well.

Google trusts established domains. Google trusts domains with existing original content. Google trusts websites with a logical site architecture. Google trusts sites that have shown, over time, to be a consistent source of original content that continues to get quality links from a variety of trusted resources. Google trusts websites that do not have a lot of orphan pages, or pages that lead to dead ends.
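That last point, orphan pages, is another signal that is simple to reason about: a page no other page on your site links to looks abandoned. Here is a toy orphan finder over a hypothetical site map; a real crawler would also consider sitemaps and external links before calling anything an orphan:

```python
# Toy orphan-page finder: pages with no internal links pointing to them.
# (Hypothetical site map; real crawls also check sitemaps and externals.)

def orphan_pages(pages, internal_links):
    """internal_links: iterable of (from_page, to_page) pairs."""
    linked = {to for _, to in internal_links}
    return sorted(p for p in pages if p not in linked and p != "/")

pages = ["/", "/about.html", "/products.html", "/old-promo.html"]
links = [("/", "/about.html"), ("/", "/products.html")]

print(orphan_pages(pages, links))  # ['/old-promo.html']
```

Running a check like this against your own site map is a cheap way to find the dead ends Google is said to distrust.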
