Cracking the Google Code: Under the GoogleScope
Google's sweeping changes confirm the search giant has launched a full-out assault against artificial link inflation and declared war on search engine spam in a continuing effort to provide the best search service in the world. And if you thought you had cracked the Google code and had Google all figured out, guess again. Google has raised the bar against search engine spam and artificial link inflation to unrivaled heights with the filing of United States Patent Application 20050071741 on March 31, 2005. The filing unquestionably provides SEOs with valuable insight into Google's tightly guarded search intelligence and confirms that Google's information retrieval is based on historical data.

What exactly do these changes mean to you? Your credibility and reputation online are going under the Googlescope!

Google has defined their patent abstract as follows: "A system identifies a document and obtains one or more types of history data associated with the document. The system may generate a score for the document based, at least in part, on the one or more types of history data."

Google's patent specification reveals a significant amount of information, both old and new, about the possible ways Google can (and likely does) use your web page updates to determine the ranking of your site in the SERPs. Unfortunately, the patent filing does not prioritize or conclusively confirm any specific method one way or the other.

Here's how Google scores your web pages. In addition to evaluating and scoring web page content, the ranking of web pages is admittedly still influenced by the frequency of page or site updates. What's new and interesting is what Google takes into account in determining the freshness of a web page. For example, if a stale page continues to procure incoming links, it will still be considered fresh, even if the page header (Last-Modified: tells when the file was most recently modified) hasn't changed and the content is not updated or 'stale'.
According to their patent filing, Google records and scores the following web page changes to determine freshness:

· The frequency of all web page changes
· The actual amount of the change itself, whether it is a substantial change or a redundant/superfluous one
· Changes in keyword distribution or density
· The actual number of new web pages that link to a web page
· The change or update of anchor text (the text that is used to link to a web page)
· The number of new links to low-trust web sites (for example, a domain may be considered low trust for having too many affiliate links on one web page)

Although no specific number of links is indicated in the patent, it may be advisable to limit affiliate links on new web pages. Caution should also be used in linking to pages with multiple affiliate links.

Developing your web pages augments page freshness. Now I'm not suggesting that it's always beneficial or advisable to change the content of your web pages regularly, but it is very important to keep your pages fresh, and that may not necessarily mean a content change. Google states that decayed or stale results might be desirable for information that doesn't necessarily need updating, while fresh content is good for results that require it.

How do you unravel that statement and differentiate between the two types of content? An excellent example of this methodology is the roller coaster ride seasonal results might experience in Google's SERPs based on the actual season of the year. A page related to winter clothing may rank higher in the winter than in the summer, and the geographical area the end user is searching from will now likely be considered and factored into the search results. Likewise, specific vacation destinations might rank higher in the SERPs in certain geographic regions during specific seasons of the year. Google can monitor and score pages by recording click-through rate changes by season.
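To make the interplay of these signals concrete, here is a minimal sketch of how the history-based freshness factors above might be combined into a single score. All weights and field names here are hypothetical illustrations of my own; the patent describes the signals, not a formula.

```python
# Toy illustration of combining history-based freshness signals into one
# score. Every weight and field name is a hypothetical assumption; the
# patent filing lists the signals but publishes no scoring formula.

def freshness_score(page):
    """page: dict of observed history signals for one document, each 0..1."""
    score = 0.0
    score += 0.3 * page["update_frequency"]     # how often the page changes
    score += 0.25 * page["change_magnitude"]    # substantial vs. superficial edits
    score += 0.25 * page["new_inbound_links"]   # fresh links keep a stale page "fresh"
    score += 0.2 * page["anchor_text_changes"]  # updated anchor text on inbound links
    # Penalty: many new links to low-trust (e.g. affiliate-heavy) sites
    score -= 0.5 * page["links_to_low_trust"]
    return score

example = {
    "update_frequency": 0.6,
    "change_magnitude": 0.4,
    "new_inbound_links": 0.8,
    "anchor_text_changes": 0.2,
    "links_to_low_trust": 0.1,
}
print(round(freshness_score(example), 2))  # 0.47
```

Note how a page with no content edits at all can still score well here through the inbound-link terms, which matches the patent's point that link acquisition alone can keep a "stale" page fresh.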
Google is no stranger to fighting spam and is taking serious new measures to crack down on offenders like never before. Section 0128 of Google's patent filing claims that you shouldn't change the focus of multiple pages at once. Here's a quote from their rationale:

"A significant change over time in the set of topics associated with a document may indicate that the document has changed owners and previous document indicators, such as score, anchor text, etc., are no longer reliable. Similarly, a spike in the number of topics could indicate spam. For example, if a particular document is associated with a set of one or more topics over what may be considered a 'stable' period of time and then a (sudden) spike occurs in the number of topics associated with the document, this may be an indication that the document has been taken over as a 'doorway' document. Another indication may include the sudden disappearance of the original topics associated with the document. If one or more of these situations are detected, then [Google] may reduce the relative score of such documents and/or the links, anchor text, or other data associated with the document."

Unfortunately, this means that Google's sandbox phenomenon and/or the aging delay may apply to your web site if you change too many of your web pages at once. From the case studies I've conducted, it's more likely the rule than the exception.

What does all this mean to you? Keep your pages themed, relevant and, most importantly, consistent. You have to establish reliability! The days of spamming Google are drawing to an end. If you require multi-page content changes, implement the changes in segments over time. Continue to use your original keywords on each page you change to maintain theme consistency. You can easily make significant content changes by implementing lateral keywords to support and reinforce your vertical keyword(s) and phrases. This will also help eliminate keyword stuffing.
Make sure you determine whether the keywords you're using require static or fresh search results, and update your web site content accordingly. On this point, RSS feeds may play a more valuable and strategic role than ever before in keeping pages fresh and at the top of the SERPs. The bottom line here is that webmasters must look ahead, plan, and manage their domains more tightly than ever before or risk plummeting in the SERPs.

Does Google use your domain name to determine the ranking of your site? Google's patent references specific types of 'information relating to how a document is hosted within a computer network' that can directly influence the ranking of a specific web site. This is Google's way of determining the legitimacy of your domain name. Therefore, the credibility of your host has never been more important to ranking well in Google's SERPs.

Google states they may check the information of a name server in multiple ways. Bad name servers might host known spam sites, adult sites and/or doorway domains. If you're hosted on a known bad name server, your rankings will undoubtedly suffer, if you're not blacklisted entirely.

What I found particularly interesting is the criteria Google may consider in determining the value of a domain or identifying it as a spam domain. According to their patent, Google may now record the following information:

· The length of the domain registration: is it greater than one year or less than one year?
· The address of the web site owner, possibly for returning higher-relevancy local search results and attaching accountability to the domain.
· The admin and the technical contact info. This info is often changed several times or completely falsified on spam domains; again, this check is for consistency!
· The stability of your host and their IP range: is your IP range associated with spam?
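As a rough illustration, the domain-level checks above might be combined into a simple trust heuristic. The field names, thresholds, and scoring rule below are my own hypothetical assumptions; the patent lists the signals without saying how they are weighted.

```python
# Toy domain-trust heuristic based on the registration and hosting
# signals described above. Field names, thresholds, and the flag-count
# rule are hypothetical illustrations, not Google's actual checks.

def domain_trust(domain):
    """domain: dict of registration/hosting signals for one domain."""
    flags = 0
    if domain["registration_years"] < 1:
        flags += 1   # short registrations correlate with throwaway spam domains
    if domain["contact_info_changes"] > 2:
        flags += 1   # frequently changed or falsified admin/technical contacts
    if domain["ip_range_flagged_for_spam"]:
        flags += 2   # hosted in a known-bad IP neighborhood; weighted heavier
    return "low trust" if flags >= 2 else "ok"

print(domain_trust({
    "registration_years": 5,           # secured years in advance
    "contact_info_changes": 0,         # stable, consistent contact info
    "ip_range_flagged_for_spam": False,
}))  # ok
```

The heavier weight on the IP-range flag reflects the article's warning that a bad host can sink you regardless of your own domain hygiene.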
Google's rationale for domain registration is based on the premise that valuable domains are often secured many years in advance, while domains used for spam are rarely secured for more than a year. If in doubt about a host's integrity, I recommend checking their mail server at www.dnsstuff.com to see if they're in the spam database. Watch for red flags! If your mail server is listed, you may have a problem ranking well in Google!

Securing a reputable host can and will go a long way in promoting your web site to Google. The simplest strategy may be registering your domain several years in advance with a reputable provider, thereby demonstrating longevity and accountability to Google. Google wants to see that you're serious about your site and not a flash-in-the-pan spam shop.

Google's aging delay has teeth, and they're taking a bite out of spam! It's no big secret that Google relies heavily on links when it comes to ranking web sites. According to their patent filing, Google may record the discovery date of a link and link changes over time. In addition to the volume, quality and anchor text of links, Google's patent illustrates possible ways Google might use historical information to further determine the value of links: for example, the life span of a link and the speed at which a new web site gets links. "Burst link growth may be a strong indicator of search engine spam." This is the first concrete evidence that Google may penalize sites for rapid link acquisition. Whether the "burst growth" rule applies to high-trust/authoritative sites and directory listings remains unknown; I personally haven't experienced this phenomenon. What's certain, though, is the inevitable end of results-oriented link farming.
I would point out here that regardless of whether burst link growth will be tolerated for authoritative sites or authoritative link acquisition, webmasters will have to get smarter and work harder to secure authoritative links as their counterparts become reluctant to exchange links with low-trust sites. Now PageRank really has value! Relevant content swaps may be a nice alternative to the standard link exchange and allow you some control over the link page elements.

So what else does Google consider in determining the aging delay?

· The anchor text and the discovery date of links are recorded, thus establishing the countdown period of the aging delay.
· Links with a long-term life span may be more valuable than links with a short life span.
· The appearance and disappearance of links over time.
· Growth rates of links, as well as the link growth of independent peer pages. Again, this suggests that rapid link acquisition and the quality of peer pages are monitored.
· Anchor text over a given period of time, for keyword consistency.
· Inbound links from fresh pages might be considered more important than links from stale pages.
· Google doesn't expect new web sites to have a large number of links, so purchasing large numbers of brokered links will likely hurt you more than help you. Google indicates that it is better for link growth to remain constant and naturally paced. In addition, the anchor text should be varied as much as possible.
· New web sites should not acquire too many new links; it may be tolerated if the links are from trusted sites, but otherwise it may be considered spam.

So how do you build your link popularity / PageRank and avoid penalties? When it comes to linking, you should clearly avoid the hocus-pocus and magic-bullet linking schemes. If you participate in quick-fix link exchange scams, use automated link exchange software or buy hundreds of links at once, chances are Google will interpret your efforts as a spam attempt and act accordingly.
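The "burst link growth" idea above can be sketched as a simple detector that flags a site whose weekly new-link count spikes far above its recent baseline. The multiplier threshold is a hypothetical assumption for illustration, not anything Google has published.

```python
# Hypothetical detector for "burst" link growth: flag a site whose
# latest weekly new-link count spikes far above its recent average.
# The factor-of-3 threshold is an illustrative assumption.

def is_burst_growth(weekly_new_links, factor=3.0):
    """weekly_new_links: new inbound links per week, oldest first.
    Returns True if the latest week exceeds `factor` times the
    average of the preceding weeks."""
    if len(weekly_new_links) < 2:
        return False  # not enough history to judge
    *history, latest = weekly_new_links
    baseline = sum(history) / len(history)
    return latest > factor * max(baseline, 1.0)

# Steady, naturally paced growth -- not flagged
print(is_burst_growth([5, 6, 4, 7, 6]))    # False
# Sudden spike (e.g. hundreds of bought links at once) -- flagged
print(is_burst_growth([5, 6, 4, 7, 120]))  # True
```

This is exactly why the article recommends slow, consistent link building: a constant weekly pace never trips a detector of this shape, while a bulk purchase does.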
Don't get caught in this trap: the recovery period could be substantial, since your host and IP range are also considered! When you exchange links with other web sites, do it slowly and consistently. Develop a link management and maintenance program. Schedule regular times every week to build the links to your site, and vary the anchor text that points to your site. Obviously, the links to your site should utilize your keywords. To avoid repetition, use lateral keywords and keyword phrases in the anchor text, since Google wants to see varied anchor text!

Your site's click-through rate may now be monitored through bookmarks, cache, favorites, and temporary files. It's no big secret that Google has always been suspected of rewarding sites with higher click-through rates (very similar to what Google does with their AdWords program), so it shouldn't come as a great surprise that Google still considers site stickiness and CTR tracking in their criteria. What's interesting, though, is that Google is interested in tracking the behavior of web surfers through bookmarks, cache, favorites, and temporary files (most likely with the Google Toolbar and/or the Google Desktop Search tool). Google's patent filing indicates Google might track the following information:

· Click-through rates are monitored for changes in seasonality, fast increases, or other traffic spikes, in addition to increasing or decreasing trends.
· The volume of searches over time is recorded and monitored for increases.
· Information regarding a web page's rankings is recorded and monitored for changes.
· Click-through rates are monitored to find out whether stale or fresh web pages are preferred for a search query.
· The traffic to a web page is recorded and monitored for changes (much like Alexa).
· User behavior may be monitored through bookmarks, cache, favorites, and temporary files.
· Bookmarks and favorites could be monitored for both additions and deletions.
· Overall user behavior is monitored for trends and changes.
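The click-through monitoring described above can be sketched as a toy classifier over a CTR time series, separating sudden spikes from gradual trends. The 2x spike threshold and the 20% trend bands are hypothetical assumptions chosen for illustration.

```python
# Sketch of monitoring a click-through-rate history for spikes and
# trends, as the patent describes. All thresholds are hypothetical.

def ctr_trend(daily_ctr):
    """Classify a series of daily CTRs (oldest first) as
    'spike', 'rising', 'falling', or 'stable'."""
    if len(daily_ctr) < 3:
        return "stable"  # too little history to judge
    baseline = sum(daily_ctr[:-1]) / (len(daily_ctr) - 1)
    if daily_ctr[-1] > 2.0 * baseline:
        return "spike"   # sudden jump: seasonal surge or manipulation
    mid = len(daily_ctr) // 2
    first = sum(daily_ctr[:mid]) / mid
    second = sum(daily_ctr[mid:]) / (len(daily_ctr) - mid)
    if second > 1.2 * first:
        return "rising"
    if second < 0.8 * first:
        return "falling"
    return "stable"

print(ctr_trend([0.02, 0.021, 0.019, 0.02, 0.08]))  # spike
```

A "spike" here is ambiguous on its own, which is why the patent pairs CTR data with seasonality: a winter-clothing page spiking every December is expected, while the same spike out of season looks suspicious.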
Since Google is capable of tracking the click-through rates to your web site, you should make sure that your web pages have attractive titles and utilize calls to action so that web surfers click on them in the search results. It's also important to keep your visitors there, so make your web pages interesting enough that web surfers spend some time on your web site. It might also help if your web site visitors added your web site to their bookmarks.

As you can see, Google's new ranking criteria have evolved far beyond reliance on criteria that can be readily or easily manipulated. One thing is for certain with Google: whatever direction search innovation is going, you can trust Google to be pioneering the way and setting new standards!

Copyright 2005 Lawrence Deon

Lawrence Deon is an SEO/SEM Consultant and author of the popular search engine optimization and marketing model Ranking Your Way To The Bank. http://www.rankingyourwaytothebank.com