Search Engine Optimization
1. What Is Search Engine Optimization?
Ask ten different people what Search Engine Optimization (SEO) is and you
may receive ten different answers. For the purposes of this guide, SEO will
be defined as the process of making changes to your site to make it more
accessible to search engines and the people that arrive at your site from
the search engines.
Search engine optimization is not about tricking the search engines. It
is about understanding what elements search engines look for on a page to
help determine the relevance of the page to a search term. By understanding
what page elements the search engines take into consideration, and making
adjustments to better present your page, you can improve your rank for a
given search term.
Search engines use programs called "spiders" or "crawlers" that visit the
pages of your Web site and index the information on those pages. This
includes the content that your visitors see when viewing your page and some
of the code used to build your page.
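A toy sketch of that indexing step may make this concrete. The following Python example, using only the standard library's HTML parser, separates a page's title from its visible body text roughly the way a spider would; the sample markup and the word-splitting rule are simplified assumptions, not any engine's actual behavior.

```python
from html.parser import HTMLParser

class PageIndexer(HTMLParser):
    """Collects the title text and the visible body words of a page,
    loosely imitating what a search engine spider records."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.words = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.words.extend(data.split())

indexer = PageIndexer()
indexer.feed("<html><head><title>Blue Widgets</title></head>"
             "<body><p>We sell blue widgets.</p></body></html>")
print(indexer.title)       # the page title the spider saw
print(indexer.words)       # the visible content words
```

Real crawlers also follow links, respect robots.txt, and weigh markup such as headings; this sketch only shows the basic content/code split described above.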
Taking search engine optimization into consideration before you build
your site is the best approach. Fortunately, search engine optimization can
also be performed after a site is built, but it will require making changes
to your site.
It is important to understand that it will likely take more than a month
before you see the results of your work. This is due to the fact that it
takes time for the search engines to update their databases.
A basic overview of the content and basic code that the spiders or
crawlers will look for on a page is covered here. The information provided
is not specifically geared toward one search engine. Instead it takes a more
general approach that will work well with most, if not all, search engines.
Each section starts with a brief description of the page element covered.
You will then be provided with sources for further information including:
- Recommended Articles
These are articles that do a good job explaining the key concepts of the
topic discussed. Each article is linked, with the option to open it in a
new window.
- Check the Article Archives
Search Engine Guide provides a directory of thousands of search engine
articles from experts across the Internet. This section will provide a
link to the relevant category for the page element being covered. The
articles are arranged by date with the newest articles first. This will
allow you to get the most up to date information and then do further
research on older articles.
- Recommended Forum Threads
These are pointers to specific threads in search engine discussion forums
that cover the page element being discussed.
- Books and Reports
In some cases, there are books and reports available that cover the page
element being discussed. For those who wish to have further research
material, we provide links to sources that provide the books and reports.
2. Domain Names
Having keywords in your domain is not a major consideration. For example,
having a domain such as www.keyword-keyword-keyword.com isn’t going to do
much, if anything.
If keywords in your domain naturally make sense and would make it more
memorable to your visitors, then don’t be afraid to use keywords in your
domain name. If you already have a domain name, don’t worry about switching
just because it isn’t packed with keywords.
Don’t purchase multiple domains with the idea of having multiple copies
of your Web site to get multiple listings in the search engines. This will
likely result in a penalty from the search engines.
To learn more about Domain Names:
Attacking the Search Engines – There’s no need for an additional
domain. If you have additional products or services, simply add pages to
your existing site. Every page is a gateway to the rest, and they can all
rank highly if properly optimized.
Multiple Domain Names Pointing to One Site – I have heard through the
grapevine that if you buy many domain names and point them back to your
home page, when the search engines find this out they will shut you off.
Is that true at all?
Keyword-Rich Domain Names – The myth of using keyword-rich domains for
SEO purposes has been perpetuated for way too long, and quite frankly I’m
tired of seeing it written about as if it’s an all-important SEO factor.
Check the Article Archives:
More Articles About Domain Names – Check the Search Engine News
Archives for articles about domain names and the impact they can have on
your rankings. The articles are arranged by date with the newest articles
first. This will allow you to get the most up to date information and then
do further research on older articles.
Recommended Forum Threads:
3. The Title Tag
The title tag for each page of your site is very important. If you look
at the code of a web page, you will find it in the head section and it will
look something like the following:
<title>Your Page Title Here</title>
<meta name="Description" content="Your Description">
<meta name="Keywords" content="Your Keywords">
Your title should include your keywords or keyphrases and help describe
your site, or the specific page, in a concise manner. When people are viewing
a list of search results, they typically will scan down the list. Make sure
your title tells them what your site, or the page, is about.
For the title, your keyphrases are more important than your company name,
unless your company name is a well known brand.
Repeating keywords and keyphrases over and over is likely to be penalized
and looks terrible in the search results.
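The advice above can be turned into a quick self-check. This is a hypothetical Python audit, not a rule published by any search engine: the 65-character display limit and the "repeated more than twice" stuffing test are illustrative assumptions.

```python
def check_title(title, keyphrase, max_len=65):
    """Rough title-tag audit: is the keyphrase present, is the title
    short enough to display fully, and does it avoid obvious stuffing?
    The limits here are assumptions chosen for illustration."""
    problems = []
    if keyphrase.lower() not in title.lower():
        problems.append("keyphrase missing from title")
    if len(title) > max_len:
        problems.append("title may be truncated in results")
    # Crude stuffing check: any word repeated more than twice.
    words = title.lower().split()
    if any(words.count(w) > 2 for w in set(words)):
        problems.append("possible keyword stuffing")
    return problems

print(check_title("Blue Widgets - Handmade Widgets from Acme", "blue widgets"))
print(check_title("widgets widgets widgets widgets", "blue widgets"))
```

The first title passes; the second trips both the missing-keyphrase and stuffing checks, mirroring the "repeating keywords over and over" warning above.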
To learn more about the Title Tag:
Search Engine Optimization Basics Part 2 – Title Tags – Not only is
the structure and content of the Title tag used by the search engines when
calculating your webpage’s relevance, but it is also displayed in most
search engine results pages. It therefore needs to be carefully
constructed in such a way that it influences your website’s position in
the SERP, but is also attractive enough to encourage a surfer to click on it.
Check the Article Archives:
More Articles About The Title Tag – Check the Search Engine News
Archives for articles about the Title Tag. The articles are arranged by
date with the newest articles first. This will allow you to get the most
up to date information and then do further research on older articles.
Recommended Forum Threads:
4. Meta Description and Meta Keyword Tags
These tags are also in the head section of the code of a web page and are
not seen by the visitors of your site unless they happen to look at your
source code. If you look at the code of a web page, the Meta Tags will look
something like the following:
<title>Your Page Title Here</title>
<meta name="Description" content="Your Description">
<meta name="Keywords" content="Your Keywords">
Of the two, the Meta Description is the more important. In fact, many
experts will advise that it isn’t even necessary to bother with the Meta
Keywords tag because most search engines ignore it.
For your home page, your Meta Description Tag should briefly explain what
your site is about. Each page of your site should also have a Meta
Description Tag that explains what that specific page is about. Keep it
short and to the point.
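As a sketch of putting this into practice, the following hypothetical Python helper generates the title and Meta Description tags for a page and trims overlong descriptions. The 155-character cap is an assumption about what engines typically display, not a published limit.

```python
def head_tags(title, description):
    """Generate the head tags discussed above for one page, trimming
    the description so it stays short and to the point. The 155-char
    cap is an illustrative assumption, not an engine-published rule."""
    if len(description) > 155:
        description = description[:152].rstrip() + "..."
    return (f"<title>{title}</title>\n"
            f'<meta name="Description" content="{description}">')

print(head_tags("Acme Widgets", "Handmade widgets shipped worldwide."))
```

Calling this once per page keeps every page's description specific to that page, as recommended above.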
To learn more about Meta Description and Meta Keyword Tags:
Search Engine Optimization Basics Part 3 – Meta Tags – As part of the
continued series ‘Getting Back To Search Engine Optimization Basics’, Andy
Beal takes a closer look at Meta description and Meta keywords tags. Do
you still need them and what benefits do they bring?
What Exactly are META tags? – You probably know that META tags are
important for search engine optimization and that they need to be included
on your Web site. But what exactly do they do? What is their purpose and
how exactly do the search engines interact with them?
No Meta Keyword Tags – Unfortunately, once Meta tags became spam
magnets, we SEOs started telling people to only use words that were
already on the page and no others. From what I discovered, that advice was
actually wrong — and in fact, probably useless!
Check the Article Archives:
More Articles About Meta Description and Meta Keyword Tags – Check the
Search Engine News Archives for articles about Meta Description and Meta
Keyword Tags. The articles are arranged by date with the newest articles
first. This will allow you to get the most up to date information and then
do further research on older articles.
Recommended Forum Threads:
The Truth About Reciprocal Link Networks
Happened across a nice little discussion regarding reciprocal linking and thought we would give a snippet of it here, with a link to the author's blog if you are interested in the full story. Good stuff from WebGuerrilla...
Shortly after stage 2 of the recent Google algo update began, I received an email from a panic-stricken ex-client. He woke up on a Monday morning to find all of the recent gains we had made in Google were gone.
I was a bit shocked to hear this because this particular project was as “white hat” as they come. And from a keyword standpoint, it wasn’t even close to being a space that any competent SEO would consider competitive.
My initial reaction was to simply ignore the email. He was no longer a paying client, so why waste the time looking into it? But then curiosity got the best of me. So I fired up some tools and started to do some digging.
What I discovered was that my ex-client had been very busy building links since our contract expired. In fact, he had managed to pick up about 7,000 new links in about a month and a half. Almost all of these links were coming from a reciprocal linking network called GotLinks.
When I wrote him back and told him that developing that many links in such a short period of time probably wasn’t the smartest thing to do (especially when the top site in his space has been around since ’95 and has only managed to collect a total of 250 inbound links), he promptly cancelled his GotLinks account and wrote the owner to ask that all his links be removed.
That led to a bunch of defensive emails from the owner which basically said that all the sites in his network were kickin’ ass and I was just an idiot who didn’t know what I was talking about.
5 Ways Google Will Help You With Your Traffic
If you’ve ever had a severe drop in your Google rankings in search results, you may think of Google more as an enemy than an ally.
But if you knew what I do, you’d realize that there are tools provided by the search engine that help you learn more about your traffic, and may even help drive visitors to your site.
Here are five ways that Google provides free traffic assistance.
#1 – Google will Help Your Pages Get Discovered with Google Sitemaps
Google Sitemaps is a program that gives you the opportunity to present your site’s pages to Google in XML or text. Google will then come by and spider the pages, getting you indexed faster.
Take note that this doesn’t necessarily mean that your pages will be listed for your favorite keywords, only that discovery will take place a lot faster than with manual submission. Google Sitemaps will also give you some basic site stats if you verify your site, such as the top keywords for discovery, errors it found when crawling, and the types of documents at your site.
If you find compiling your sitemap for Google in the correct format difficult, try the SOFTplus GSiteCrawler Google Sitemap generator. It’s my favorite Sitemap generator, free and easy to use.
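If you would rather script it yourself, a minimal sitemap in the format the Sitemaps protocol describes can be produced with a few lines of standard-library Python. This sketch emits only the required <loc> element; optional fields such as <lastmod> and <priority> are omitted, and the example URLs are placeholders.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap of the kind Google Sitemaps accepts:
    a <urlset> containing one <url>/<loc> pair per page."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://www.example.com/",
                     "http://www.example.com/about.html"])
print(xml)
```

Save the output as sitemap.xml at your site root and submit it through your Google Sitemaps account.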
#2 – Google Will Talk To You or Your Webmaster with the Webmaster Section
The Google Information Page for Webmasters should be your first stop when you want to know more about anything that has to do with your site and its relationship to Google and any of its many flavors of search such as Froogle. Particularly for new site owners or operators, checking this page first has saved many from needless anxiety.
Most of the basic information is in straightforward language, with links to details for geeks like me.
Usability is already a critical component of successful online ventures, but with the advent of Google Analytics and the implementation of the Jagger algo update, user activities and behaviours are going to play an influencing role in search engine rankings. How people act when they visit a website or document is being measured and accounted for, even for sites without Google Analytics tracking codes in the source code of the document.
Google is concerned with how people find information and what they do when they access a document found in the Google index. Which document in a site they tend to land on, how long users spend on that document, and how much time, if any, a user spends exploring information in a domain are all pertinent to how Google perceives the relevance of documents listed in the index. As long-term online marketers know, this is where usability comes into the picture.
Usability, as defined by Kim Kraus Berg is, “… the ability to successfully, comfortably and confidently learn or complete a task. For the web site designer or application developer, it’s the mechanics of designing and building a web site or Internet-based application so that it can be understood and easy to accomplish any task.”
According to local (Victoria-based) website marketing expert, Michael Linehan, a focus on site usability is simply common sense marketing. Leading visitors towards goal-orientated outcomes makes as much sense for a functioning website as it does for a functional building and, to follow through on the analogy, it all starts with a smart architect.
Michael knows his stuff, so much so that StepForth considers him to be one of our marketing and site usability gurus. If our assumptions about user behaviours and the post-Jagger Google SERPs are correct, Michael’s talents will play an important role in our overall SEO techniques.
New Features in Control Center
It’s been a little while since we have spoken of new features added to the control center. There have been a few added recently and it is time to point them out and give brief explanations to each.
REPURCHASE LINKS – Advertisers
This new feature gives advertisers an easy way to repurchase links they have previously purchased but have either not renewed, cancelled, declined or expired. Rather than having to go through the buy links section, find the partner listings and take all the steps again, you can:
1. Log in to your account, go to TEXT ADS >> LINK HISTORY and select either TEXT LINKS, BILLBOARDS, PACKAGES or ROTATING ADS.
2. Review your past history; any links available for repurchase will have a checkbox located to the left of the listing.
3. Select any and all partner listings you wish to repurchase, and then click the REPURCHASE button at the bottom of the page.
4. On the summary page, set the link location where you want each link to appear, then click the REPURCHASE button once again.
5. Confirm the prices on the confirmation page and the links are repurchased.
For those of you who might miss renewals and have text ads cancelled, you can now easily get your campaigns back up quickly with this new option. It will save advertisers in this situation quite a bit of time.
LINK HISTORY – Partners
Why we did not have this before is a bit of a mystery. It has always been a feature for advertisers, but not for partners. Now partners can navigate to MANAGE LINKS >> HISTORY and select either TEXT LINKS, BILLBOARDS, PACKAGES or ROTATING ADS. This will provide a listing of all previous activity in your account. If a link appears and then disappears, you can now see its status in your history. Sometimes advertisers will send a request and cancel it before the partner ever gets to their account, which used to be a bit of a mystery. Now you have full access to see the link and its status, i.e. whether it’s cancelled, cc failed and so on. Also, if you have a question about whether a link should be published or not, you can visit your link history page and see the status of any link.
ADDITIONAL BUY OPTIONS – Advertisers
Based on several requests about adding more than one text link on a single partner site, we have implemented an options tab for advertisers when buying text links, packages, billboards and rotating ads. If the checkbox for Show Additional Options during checkout is selected (located next to the BUY LINKS and/or ADD PARTNERS buttons), the advertiser will be taken to a new page that gives the option to add multiple text ads on one given partner website. You’ll have the ability to add multiple text ads on *ALL* partner listings selected, or go through individually and select one or multiple text ads on each partner listing selected. On the new page, each partner listing will be shown with all of the text link ads you have created. Each of your text ads will have a checkbox next to it and a link location drop-down menu. Place a check next to each ad you wish to place on the given partner website, select the appropriate link location/price, and then move to the next listing until you have gone through all of them. When finished, click the CONTINUE button and it will direct you to the confirmation page. If all looks good, select the payment method and then the BUY LINKS button, and you’re done. This option may not appeal to everyone, so we made it a step you can select when buying; those uninterested can simply skip it. It definitely saves those wanting to add multiple ads on one partner site a TON of time.
SORTING OPTIONS – Advertisers & Partners
For those of you who manage your clients’ linking campaigns and have hundreds of purchased links, we have been adding sorting options to all of the pages that might contain a lot of listings. This will help you locate specific listings in your account easily and save a lot of time. Advertisers can access their APPROVED LINKS section and notice at the bottom of the page where we now show how many listings are on the page and also total up the link prices. This can help advertisers easily calculate prices paid for specific ads and/or listings.
TAX FORMS ONLINE – Partners
Partners no longer have to fax or mail their tax forms before getting paid. Due to popular demand, we have discussed the option of receiving tax info online with our tax advisors and have been given the green light. We set up a simple wizard, similar to other online tax forms you may have used with other companies that pay out money, and it’s easily completed inside your LinkWorth account. All information is received with secure encryption methods and is instant.
LINKWORTH MOVES TO NEW OFFICE!!!
We have finally gotten moved into our new office. To be honest, we didn’t think it was ever going to happen. We constantly questioned whether we should have just found an existing location to get in quicker rather than having a brand new office built for us, but now that we’re moved in, we couldn’t be happier. This new office now allows us to increase our staff. More staff means more business for everyone. We are aggressively seeking new sales/account management personnel to catch up with our demand. We have set our goals very high for 2006 and expect to reach all goals set, if not surpass them. One of the main goals is to make all of our partners and advertisers happy to be part of our success. We appreciate everyone that makes LinkWorth what it is and want to show our appreciation with success for all. If you are located in the Dallas/Ft. Worth area and are looking for employment, give us a call.
Future services to be offered by LinkWorth
Pay Per Click Management
Since PPC can be a very effective method of instant targeted traffic, we plan to incorporate PPC services within your existing LinkWorth account. We will use our proprietary software to integrate your keywords into Yahoo and Google PPC programs.
This service will be free to all managed accounts and available for a small fee to unmanaged accounts. It will allow webmasters to place a code snippet on their pages and track all visitors, with all stats detailed in your LinkWorth account. We will relate it to your LinkWorth efforts and put it in very easy to understand terms.
Even though this is an old method of marketing, many people still prefer banner ads. It is a great way to build name recognition and visibility, along with traffic generation. This will also be incorporated into your existing LinkWorth account as an option.
As always, we will continue to improve our system to your needs. We listen to each and every suggestion and/or request. If it makes sense and others want it as well, it’s on our new tasks list. Without you, LinkWorth is nothing, so we will continue to tailor to your needs.
We also want to wish everyone a very safe and happy holiday season.
Contest Money Giveaway
As most LinkWorth customers know, we are always trying to make LinkWorth a better and more user friendly atmosphere for all. Over time, we have grown to learn that what might seem trivial to us, might be confusing to others. After many discussions of how to make our control center interface easier, we have come up with a new and unique idea. We are calling it our “Facelift Contest”.
With any software program, one of the most important parts is the initial homepage. It can be constructed to give a quick insight into all major aspects of the program, much like having a bird’s eye view of an entire city. We, at LinkWorth, feel our initial login homepage could use a major facelift to better illustrate important features and sections of your account. It could provide more accurate statistical information about your account and be laid out in a much more user friendly manner. So instead of putting our own thoughts into the layout, we feel the best way to get something everyone can benefit from is to give our customers, who use their accounts on a daily basis, the power to design it.
This leads us to the actual contest. We want you to submit your best version of how the login homepage should be laid out. It can be submitted in the following formats: jpg, gif, bmp, pdf, html or psd. While the design/artwork is a plus, we are looking mostly at the information provided on the homepage. What bits of information would you like to see on the login homepage? How would you like to see it shown? And so on.
There are TWO (2) login homepages; Advertiser Home and Partner Home. There will be one winning submission for each homepage with either one or two actual winners. The contest will run until January 1, 2006. After the deadline has passed, we will select the top 5 submissions for each homepage. We will then construct demo pages of each and open the voting up to the LinkWorth customers. The voting process will run for a total of 30 days and at the end of the 30th day, we will crown a winner for the advertiser homepage and partner homepage.
Now for the part everyone is reading this for!!! The prize.
Grand Prize Advertiser Homepage – $500 credit to account (total of 1 prize)
Grand Prize Partner Homepage – $500 credit to account (total of 1 prize)
Runner Up Advertiser Homepage – $25 credit to account (total of 4 prizes)
Runner Up Partner Homepage – $25 credit to account (total of 4 prizes)
This is a view of the current advertiser homepage:
This is a view of the current partner homepage:
To submit your entry, login to your LinkWorth account and look for the submission directions on the homepage, located in the announcements/news section.
Tips and Hints to participants are:
1. Be creative and informative
2. Fast loading page (low image use)
3. Great use of statistical data
4. Detailed labeling
5. Extremely User Friendly
6. Simplified but detailed
Best of luck to all who participate! We are excited to see your ideas.
Be sure to read our rules and regulations of this contest.
More Jagge’d’ Information
This post is from Matt Cutts’ blog. Matt is a resident Google-ite who appears to be a collaborator on their search inventions (his name appears on patent applications as an inventor). Here we go:
Okay, Brett Tabke decided to call it Update Jagger. Here are the ways that I’d use to contact us if you have feedback on Google search results:
Reporting spam in Google’s index
I especially want to hear about webspam that you see in Google. The best place to do that is to go to http://www.google.com/contact/spamreport.html. In the “Additional details:” section, I would use the keyword “jagger1” (that’s “jagger” and the number one with no spaces in between).
Reporting non-spam issues or problems in Google’s index
Do the search that you’re interested in on google.com, then click the “Dissatisfied? Help us improve” link at the bottom right of the page. Again, fill in details and use the keyword jagger1 so that folks at Google can separate out feedback specifically about this update.
You think your site has been penalized
If your site is not showing up at all, and you recently had something like hidden text or hidden links on your pages, I would recommend doing a reinclusion request. I wrote up my advice on the best way to do a reinclusion request. Note that a reinclusion request won’t make much difference if our algorithms/scoring are what is affecting your site though.
You see a low-quality site that is running AdSense
If you run across a site that you consider spammy and it has AdSense on it, click on the “Ads by Goooooogle” link and click “Send Google your thoughts on the ads you just saw”. Enter the words spamreport and jagger1 in the comments field.
You want to talk about data center IP addresses amongst friends, or “update speculate”
Lots of search-related discussion goes on at WebmasterWorld, but bear in mind that you won’t be able to mention specific urls or searches on WMW. If you want to mention specifics to Google, I’d go with one of the ways above.
Hope that helps to let people know where to send their feedback based on what they see.
As Easy As Jagger 1 – 2 – 3
We wanted to give an update to those who might not have heard about this Google update. According to Matt Cutts, one of the Google inventors on the recent patent, this latest update, labeled Jagger, is a three-step program. Jagger 1 is up and live and seems to have killed a lot of webmasters’ natural listings. Jagger 2 is currently in the process of spreading like a virus throughout the web, with the final step, Jagger 3, starting next week.
Reading through the patent, it appears they are really going after spammy techniques and link purchasing employed by webmasters. What we’ve noticed is that some sites originally took big hits in the SERPs; however, Jagger 2 has begun bringing them back to the top. We expect that after Jagger 3 has completed, we will see more stable listings as previously shown. Here is the recent post from Matt Cutts’ blog...
It looks like Jagger2 is starting to be visible. GoogleGuy posted over on WebmasterWorld with what SEOs should expect:
McMohan, good eyes in spotting some changes at 126.96.36.199. I expect Jagger2 to start at 66.102.9.x. It will probably stay at 1-2 data centers for the next several days rather than spreading quickly. But that data center shows the direction that things will be moving in (bear in mind that things are fluxing, and Jagger3 will cause flux as well).
Matt Cutts posted how to send feedback on Jagger1 at http://www.mattcutts.com/blog/update-jagger-contacting-google/
If you’re looking at 66.102.9.x and have new feedback on what you see there (whether it be spam or just indexing related), please use the same mechanism as before, except use the keyword Jagger2. I believe that our webspam team has taken a first pass through the Jagger1 feedback and acted on a majority of the spam reports. The quality team may wait until Jagger3 is visible somewhere before delving into the non-spam index feedback.
If things stay on the same schedule (which I can’t promise, but I’ll keep you posted if I learn more), Jagger3 might be visible at one data center next week. Folks should have several weeks to give us feedback on Jagger3 as it gradually becomes more visible at more data centers.
Not much more I can add to that. Even on 188.8.131.52, it will still take a day or so for the changes to be fully visible at that data center. Just to re-emphasize, if you send new feedback on a data center such as 184.108.40.206, be sure to use the keyword jagger2 in spam reports or index feedback so that we can tell this is newer feedback. Jagger1, Jagger2, and Jagger3 are mostly independent changes, but they’re occurring closely enough in time (plus they interact to some degree) that it’s clearer just to act as if they were one update for feedback purposes.
I think the moral of the story is: it’s very possible your site will drop off initially but eventually rebound, IF, AND ONLY IF, your site employs whitehat techniques and is full of content related to your subject. Keep building your linking campaign with relevant partner sites and build pages and pages of relevant content. Also make your pages or site easy to bookmark, as user bookmarks are a new signal Google will watch for ranking.
Dissection of New Google Patent
While some people may take a quick peek at the new Google patent and quickly get a case of tired head, the true geeks like to dissect the crazy talk and try to make sense of it. I figured I would put down my thoughts and my perception of certain key sections of their new piece of work. Maybe it will help those who feel like their head will explode if they read any more of this new ranking system. 🙂
Let’s all do ourselves a favor and skip the “claims” and move down to the meat of the patent application, labeled Description.
 Both categories of search engines strive to provide high quality results for a search query. There are several factors that may affect the quality of the results generated by a search engine. For example, some web site producers use spamming techniques to artificially inflate their rank. Also, “stale” documents (i.e., those documents that have not been updated for a period of time and, thus, contain stale data) may be ranked higher than “fresher” documents (i.e., those documents that have been more recently updated and, thus, contain more recent data). In some particular contexts, the higher ranking stale documents degrade the search results.
They are saying here that sites can get themselves ranked high with a back link from a high ranked site that is not updated often.
SUMMARY OF THE INVENTION
 Systems and methods consistent with the principles of the invention may score documents based, at least in part, on history data associated with the documents. This scoring may be used to improve search results generated in connection with a search query.
 According to one aspect consistent with the principles of the invention, a method for scoring a document is provided. The method may include identifying a document and obtaining one or more types of history data associated with the document. The method may further include generating a score for the document based, at least in part, on the one or more types of history data.
 According to another aspect, a method for scoring documents is provided. The method may include determining an age of linkage data associated with a linked document and ranking the linked document based on a decaying function of the age of the linkage data.
In this summary, they are implying the scoring of sites will be based on historical information like the age of back links.
 History component 320 may gather history data associated with the documents in document corpus 340. In implementations consistent with the principles of the invention, the history data may include data relating to: document inception dates; document content updates/changes; query analysis; link-based criteria; anchor text (e.g., the text in which a hyperlink is embedded, typically underlined or otherwise highlighted in a document); traffic; user behavior; domain-related information; ranking history; user maintained/generated data (e.g., bookmarks); unique words, bigrams, and phrases in anchor text; linkage of independent peers; and/or document topics. These different types of history data are described in additional detail below. In other implementations, the history data may include additional or different kinds of data.
This is a good section which details the many bits of data that may be included in the scoring of websites.
 Search engine 125 may use the inception date of a document for scoring of the document. For example, it may be assumed that a document with a fairly recent inception date will not have a significant number of links from other documents (i.e., back links). For existing link-based scoring techniques that score based on the number of links to/from a document, this recent document may be scored lower than an older document that has a larger number of links (e.g., back links). When the inception date of the documents are considered, however, the scores of the documents may be modified (either positively or negatively) based on the documents’ inception dates.
 Consider the example of a document with an inception date of yesterday that is referenced by 10 back links. This document may be scored higher by search engine 125 than a document with an inception date of 10 years ago that is referenced by 100 back links because the rate of link growth for the former is relatively higher than the latter. While a spiky rate of growth in the number of back links may be a factor used by search engine 125 to score documents, it may also signal an attempt to spam search engine 125. Accordingly, in this situation, search engine 125 may actually lower the score of a document(s) to reduce the effect of spamming.
These two sections discuss scoring a site based on the age of the back links pointing to it, and on the rate at which new back links pointing to the site are found. In other words, if a brand-new site jumps from 10 links to 10,000 links in a week, it is an obvious sign of spamming techniques, and the engine may actually lower the site's score rather than raise it.
 In one implementation, search engine 125 may modify the link-based score of a document as follows:
 H=L/log(F+2),
 where H may refer to the history-adjusted link score, L may refer to the link score given to the document, which can be derived using any known link scoring technique (e.g., the scoring technique described in U.S. Pat. No. 6,285,999) that assigns a score to a document based on links to/from the document, and F may refer to elapsed time measured from the inception date associated with the document (or a window within this period).
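As a rough illustration, here is a minimal Python sketch of the history-adjusted link score, assuming the commonly cited form H = L/log(F + 2). The excerpt above does not spell out the exact functional form or the log base, so both are assumptions here:

```python
import math

def history_adjusted_score(link_score: float, age_months: float) -> float:
    # H = L / log(F + 2): an existing link score L (e.g. a PageRank-style
    # value) discounted by the elapsed time F since the document's
    # inception date. The formula and the use of the natural log are
    # assumptions based on how this patent is commonly quoted.
    return link_score / math.log(age_months + 2)

# For a fixed link score, the adjusted score shrinks as the page ages,
# so a young page needs fewer links to compete with an old one.
young = history_adjusted_score(10.0, 1)
old = history_adjusted_score(10.0, 120)
```

The point matches the patent's earlier example: raw link counts alone would bury a week-old page, so age is folded into the denominator.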
 For some queries, older documents may be more favorable than newer ones. As a result, it may be beneficial to adjust the score of a document based on the difference (in age) from the average age of the result set. In other words, search engine 125 may determine the age of each of the documents in a result set (e.g., using their inception dates), determine the average age of the documents, and modify the scores of the documents (either positively or negatively) based on a difference between the documents’ age and the average age.
This illustrates a formula used to get the score. The link score appears to be related to their PageRank, although I have not gone through the related patent since it ties into several others. (Quite a web of patents!) It appears they feed an existing score for a website into their new formula, using history and link age to produce a new score. I'm definitely not a math whiz, but I'm sure the geniuses will soon follow with their interpretations.
 In one implementation, search engine 125 may generate a content update score (U) as follows:
 U=f(UF, UA),
 where f may refer to a function, such as a sum or weighted sum, UF may refer to an update frequency score that represents how often a document (or page) is updated, and UA may refer to an update amount score that represents how much the document (or page) has changed over time. UF may be determined in a number of ways, including as an average time between updates, the number of updates in a given time period, etc.
 UA may also be determined as a function of one or more factors, such as the number of “new” or unique pages associated with a document over a period of time. Another factor might include the ratio of the number of new or unique pages associated with a document over a period of time versus the total number of pages associated with that document. Yet another factor may include the amount that the document is updated over one or more periods of time (e.g., n % of a document’s visible content may change over a period t (e.g., last m months)), which might be an average value. A further factor might include the amount that the document (or page) has changed in one or more periods of time (e.g., within the last x days).
 UF and UA may be used in other ways to influence the score assigned to a document. For example, the rate of change in a current time period can be compared to the rate of change in another (e.g., previous) time period to determine whether there is an acceleration or deceleration trend. Documents for which there is an increase in the rate of change might be scored higher than those documents for which there is a steady rate of change, even if that rate of change is relatively high. The amount of change may also be a factor in this scoring. For example, documents for which there is an increase in the rate of change when that amount of change is greater than some threshold might be scored higher than those documents for which there is a steady rate of change or an amount of change is less than the threshold.
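The content-update score U = f(UF, UA) can be sketched with f as a weighted sum, one of the forms the patent names. The weights and the concrete stand-ins chosen for UF and UA below are assumptions; the patent leaves them open:

```python
def content_update_score(updates_per_month: float,
                         fraction_changed: float,
                         w_freq: float = 0.5,
                         w_amount: float = 0.5) -> float:
    # U = f(UF, UA) with f taken to be a weighted sum. UF is
    # approximated here by the number of updates per month, and UA by
    # the fraction of visible content that changed; both stand-ins and
    # the 0.5 weights are assumptions for illustration only.
    return w_freq * updates_per_month + w_amount * fraction_changed

# A page updated 4 times a month with 20% of its content changing:
score = content_update_score(4, 0.2)
```

Holding the amount of change fixed, updating more often raises U, which is the trend-watching behavior the commentary below describes.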
This part is interesting. It says they will score the actual content found and its rate of change. There are two parts: change on a page, and change in the number of pages being added. The key here is that they watch for trends. If you change information every day and add 10 pages of content every day, that will read as normal. If another site adds information sporadically, putting a lot of information up over two days and a little over the next few, that site's score will be higher. Elsewhere, they all but disregard everything except the meat of your page. This means links placed in the sidebar or footer of your pages are probably not going to be counted much, if at all. This is a definite cry for our Billboard product! Your links are embedded into the meat of the page, which is what they do count.
 According to yet another implementation, search engine 125 may store a summary or other representation of a document and monitor this information for changes. According to a further implementation, search engine 125 may generate a similarity hash (which may be used to detect near-duplication of a document) for the document and monitor it for changes. A change in a similarity hash may be considered to indicate a relatively large change in its associated document. In other implementations, yet other techniques may be used to monitor documents for changes. In situations where adequate data storage resources exist, the full documents may be stored and used to determine changes rather than some representation of the documents.
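The patent does not describe its similarity hash, but a simhash-style fingerprint is the standard technique for this job: near-duplicate documents produce hashes that differ in only a few bit positions, so a large jump in the hash signals a large change. A minimal word-level sketch (the word features, MD5, and 64-bit width are all assumptions):

```python
import hashlib

def simhash(text: str, bits: int = 64) -> int:
    # Each word votes +1/-1 on every bit position according to its own
    # hash; the sign of the running total sets that bit of the
    # final fingerprint.
    v = [0] * bits
    for word in text.lower().split():
        h = int(hashlib.md5(word.encode()).hexdigest(), 16)
        for i in range(bits):
            v[i] += 1 if (h >> i) & 1 else -1
    return sum(1 << i for i in range(bits) if v[i] > 0)

def hamming(a: int, b: int) -> int:
    # Number of differing bits between two fingerprints.
    return bin(a ^ b).count("1")
```

A small edit moves the fingerprint only a few bits, while an unrelated document lands far away, which is what lets an engine treat a sizable hash change as a sizable content change.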
I wanted to point this one out for one reason: do not duplicate your content! They have mathematical equations that spot duplication automatically and will ding you for it. Nothing is gained by duplicating your content across several pages or different domains. 😉
 According to an implementation consistent with the principles of the invention, one or more query-based factors may be used to generate (or alter) a score associated with a document. For example, one query-based factor may relate to the extent to which a document is selected over time when the document is included in a set of search results. In this case, search engine 125 might score documents selected relatively more often/increasingly by users higher than other documents.
 Another query-based factor may relate to the occurrence of certain search terms appearing in queries over time. A particular set of search terms may increasingly appear in queries over a period of time. For example, terms relating to a “hot” topic that is gaining/has gained popularity or a breaking news event would conceivably appear frequently over a period of time. In this case, search engine 125 may score documents associated with these search terms (or queries) higher than documents not associated with these terms.
In simple terms, if people do not click your listing in results, you’ll slide down. If you are being clicked by people more often, then it will help you. And if your site deals with a hot topic then you will be scored higher than if your site is not directly related to the hot topic.
 Yet another query-based factor might relate to the “staleness” of documents returned as search results. The staleness of a document may be based on factors, such as document creation date, anchor growth, traffic, content change, forward/back link growth, etc. For some queries, recent documents are very important (e.g., if searching for Frequently Asked Questions (FAQ) files, the most recent version would be highly desirable). Search engine 125 may learn which queries recent changes are most important for by analyzing which documents in search results are selected by users. More specifically, search engine 125 may consider how often users favor a more recent document that is ranked lower than an older document in the search results. Additionally, if over time a particular document is included in mostly topical queries (e.g., “World Series Champions”) versus more specific queries (e.g., “New York Yankees”), then this query-based factor–by itself or with others mentioned herein–may be used to lower a score for a document that appears to be stale.
 In some situations, a stale document may be considered more favorable than more recent documents. As a result, search engine 125 may consider the extent to which a document is selected over time when generating a score for the document. For example, if for a given query, users over time tend to select a lower ranked, relatively stale, document over a higher ranked, relatively recent document, this may be used by search engine 125 as an indication to adjust a score of the stale document.
 Yet another query-based factor may relate to the extent to which a document appears in results for different queries. In other words, the entropy of queries for one or more documents may be monitored and used as a basis for scoring. For example, if a particular document appears as a hit for a discordant set of queries, this may (though not necessarily) be considered a signal that the document is spam, in which case search engine 125 may score the document relatively lower.
All of this is simply saying their bots will monitor the activity of users. Depending on user behavior, a site will be scored higher based on staleness or newness. In other words, if users searching a given subject consistently select older sites, older sites will be scored higher in that subject, and vice versa for newer sites.
 According to an implementation consistent with the principles of the invention, one or more link-based factors may be used to generate (or alter) a score associated with a document. In one implementation, the link-based factors may relate to the dates that new links appear to a document and that existing links disappear. The appearance date of a link may be the first date that search engine 125 finds the link or the date of the document that contains the link (e.g., the date that the document was found with the link or the date that it was last updated). The disappearance date of a link may be the first date that the document containing the link either dropped the link or disappeared itself.
 These dates may be determined by search engine 125 during a crawl or index update operation. Using this date as a reference, search engine 125 may then monitor the time-varying behavior of links to the document, such as when links appear or disappear, the rate at which links appear or disappear over time, how many links appear or disappear during a given time period, whether there is trend toward appearance of new links versus disappearance of existing links to the document, etc.
 Using the time-varying behavior of links to (and/or from) a document, search engine 125 may score the document accordingly. For example, a downward trend in the number or rate of new links (e.g., based on a comparison of the number or rate of new links in a recent time period versus an older time period) over time could signal to search engine 125 that a document is stale, in which case search engine 125 may decrease the document’s score. Conversely, an upward trend may signal a “fresh” document (e.g., a document whose content is fresh–recently created or updated) that might be considered more relevant, depending on the particular situation and implementation.
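One way to sketch this freshness signal is to compare the count of new back links in a recent window against an equally long prior window. The ratio form and the +1 smoothing constant are assumptions, not from the patent:

```python
def link_trend(new_links_recent: int, new_links_prior: int) -> float:
    # Ratio > 1 suggests an upward trend (a "fresh" document);
    # ratio < 1 suggests a downward trend (a possibly stale one).
    # The +1 smoothing avoids division by zero and is an assumption.
    return (new_links_recent + 1) / (new_links_prior + 1)
```

A site that gained 20 links this quarter after 5 last quarter trends fresh (ratio 3.5); one that dropped from 10 new links to 2 trends stale (ratio below 1).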
This says they will monitor back links for trends. If the trend begins to drop links, the site could be deemed a stale site. If the trend is increasing, it would be scored higher as a freshly updated site. In other words, if you continually build links, you will be fine. If you add a lot of links and stop, then they start dropping off, not so good.
 The dates that links appear can also be used to detect “spam,” where owners of documents or their colleagues create links to their own document for the purpose of boosting the score assigned by a search engine. A typical, “legitimate” document attracts back links slowly. A large spike in the quantity of back links may signal a topical phenomenon (e.g., the CDC web site may develop many links quickly after an outbreak, such as SARS), or signal attempts to spam a search engine (to obtain a higher ranking and, thus, better placement in search results) by exchanging links, purchasing links, or gaining links from documents without editorial discretion on making links. Examples of documents that give links without editorial discretion include guest books, referrer logs, and “free for all” pages that let anyone add a link to a document.
This is an important part stating that the natural link building process happens slowly. So building link popularity should be a steady process occurring in small chunks. And definitely don't waste your time on guestbooks/forums and FFA pages.
 According to another implementation, the analysis may depend, not only on the age of the links to a document, but also on the dynamic-ness of the links. As such, search engine 125 may weight documents that have a different featured link each day, despite having a very fresh link, differently (e.g., lower) than documents that are consistently updated and consistently link to a given target document. In one exemplary implementation, search engine 125 may generate a score for a document based on the scores of the documents with links to the document for all versions of the documents within a window of time. Another version of this may factor a discount/decay into the integration based on the major update times of the document.
This one goes out to all of you participating in programs like Digital Point's co-op network and Link Vault. Other than possibly getting traffic, which probably isn't much since most people stick these ads in hard-to-find locations, these programs look like they are not going to help much anymore. They are great free programs to help people out, but of course the big girl (Google) had to put a stop to them. I believe the constant changing of text ads and URLs with every visit is a flag to them, and the links will either be discounted or ignored altogether. Hopefully they will not penalize for them!
 Alternatively, if the content of a document changes such that it differs significantly from the anchor text associated with its back links, then the domain associated with the document may have changed significantly (completely) from a previous incarnation. This may occur when a domain expires and a different party purchases the domain. Because anchor text is often considered to be part of the document to which its associated link points, the domain may show up in search results for queries that are no longer on topic. This is an undesirable result.
 One way to address this problem is to estimate the date that a domain changed its focus. This may be done by determining a date when the text of a document changes significantly or when the text of the anchor text changes significantly. All links and/or anchor text prior to that date may then be ignored or discounted.
This one is a sharp blow to those who hawk over high-ranked older sites when they expire. Buying existing sites will do no good unless their topic is directly related to what you plan to put on them after you acquire the name. Once they realize the domain changed ownership, all previous links are discounted.
 The freshness of anchor text may also be used as a factor in scoring documents. The freshness of an anchor text may be determined, for example, by the date of appearance/change of the anchor text, the date of appearance/change of the link associated with the anchor text, and/or the date of appearance/change of the document to which the associated link points. The date of appearance/change of the document pointed to by the link may be a good indicator of the freshness of the anchor text based on the theory that good anchor text may go unchanged when a document gets updated if it is still relevant and good. In order to not update an anchor text’s freshness from a minor edit of a tiny unrelated part of a document, each updated document may be tested for significant changes (e.g., changes to a large portion of the document or changes to many different portions of the document) and an anchor text’s freshness may be updated (or not updated) accordingly.
HELLO ROTATING ADS! ! ! ! 🙂 This says the LinkWorth rotating ads are great!
 According to an implementation consistent with the principles of the invention, information relating to traffic associated with a document over time may be used to generate (or alter) a score associated with the document. For example, search engine 125 may monitor the time-varying characteristics of traffic to, or other “use” of, a document by one or more users. A large reduction in traffic may indicate that a document may be stale (e.g., no longer be updated or may be superseded by another document).
 In one implementation, search engine 125 may compare the average traffic for a document over the last j days (e.g., where j=30) to the average traffic during the month where the document received the most traffic, optionally adjusted for seasonal changes, or during the last k days (e.g., where k=365). Optionally, search engine 125 may identify repeating traffic patterns or perhaps a change in traffic patterns over time. It may be discovered that there are periods when a document is more or less popular (i.e., has more or less traffic), such as during the summer months, on weekends, or during some other seasonal time period. By identifying repeating traffic patterns or changes in traffic patterns, search engine 125 may appropriately adjust its scoring of the document during and outside of these periods.
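The traffic comparison described here can be sketched as a ratio of recent average traffic to the document's best 30-day stretch. The window sizes below and the absence of seasonal adjustment are simplifications of what the patent describes:

```python
def traffic_ratio(daily_visits, j=30):
    # Assumes at least max(j, 30) days of history. A ratio well below
    # 1.0 suggests the document's traffic has collapsed relative to its
    # peak period, hinting that the document may be stale.
    recent = sum(daily_visits[-j:]) / j
    best = max(sum(daily_visits[i:i + 30]) / 30
               for i in range(len(daily_visits) - 29))
    return recent / best
```

Flat traffic gives a ratio of 1.0; a site whose daily visits fell from 100 to 10 in its last month scores 0.1 and would look stale under this signal.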
 Additionally, or alternatively, search engine 125 may monitor time-varying characteristics relating to “advertising traffic” for a particular document. For example, search engine 125 may monitor one or a combination of the following factors: (1) the extent to and rate at which advertisements are presented or updated by a given document over time; (2) the quality of the advertisers (e.g., a document whose advertisements refer/link to documents known to search engine 125 over time to have relatively high traffic and trust, such as amazon.com, may be given relatively more weight than those documents whose advertisements refer to low traffic/untrustworthy documents, such as a pornographic site); and (3) the extent to which the advertisements generate user traffic to the documents to which they relate (e.g., their click-through rate). Search engine 125 may use these time-varying characteristics relating to advertising traffic to score the document.
They are scoring sites based on traffic. They watch for seasonal trends and will know to score sites higher during their seasonal times.
 According to an implementation consistent with the principles of the invention, information corresponding to individual or aggregate user behavior relating to a document over time may be used to generate (or alter) a score associated with the document. For example, search engine 125 may monitor the number of times that a document is selected from a set of search results and/or the amount of time one or more users spend accessing the document. Search engine 125 may then score the document based, at least in part, on this information.
 If a document is returned for a certain query and over time, or within a given time window, users spend either more or less time on average on the document given the same or similar query, then this may be used as an indication that the document is fresh or stale, respectively. For example, assume that the query “Riverview swimming schedule” returns a document with the title “Riverview Swimming Schedule.” Assume further that users used to spend 30 seconds accessing it, but now every user that selects the document only spends a few seconds accessing it. Search engine 125 may use this information to determine that the document is stale (i.e., contains an outdated swimming schedule) and score the document accordingly.
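The dwell-time signal in the swimming-schedule example could be sketched as a simple threshold test: flag the document when average time on it, for the same query, falls far below its historical average. The 50% threshold is an assumed value:

```python
def looks_stale(avg_seconds_then: float, avg_seconds_now: float,
                threshold: float = 0.5) -> bool:
    # True when current dwell time for the same query has dropped
    # below `threshold` of the historical dwell time. The threshold
    # is an assumption; the patent gives no concrete cutoff.
    return avg_seconds_now < threshold * avg_seconds_then

# The patent's example: users once spent 30 seconds on the schedule
# page but now spend only a few.
outdated = looks_stale(30, 3)
```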
Make sure you keep your users reading your info as long as possible! If people leave your site faster each time, it will count against you, since your site may be considered stale.
 According to an implementation consistent with the principles of the invention, information relating to a domain associated with a document may be used to generate (or alter) a score associated with the document. For example, search engine 125 may monitor information relating to how a document is hosted within a computer network (e.g., the Internet, an intranet or other network or database of documents) and use this information to score the document.
 Individuals who attempt to deceive (spam) search engines often use throwaway or “doorway” domains and attempt to obtain as much traffic as possible before being caught. Information regarding the legitimacy of the domains may be used by search engine 125 when scoring the documents associated with these domains.
 Certain signals may be used to distinguish between illegitimate and legitimate domains. For example, domains can be renewed up to a period of 10 years. Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain and, thus, the documents associated therewith.
 Also, or alternatively, the domain name server (DNS) record for a domain may be monitored to predict whether a domain is legitimate. The DNS record contains details of who registered the domain, administrative and technical addresses, and the addresses of name servers (i.e., servers that resolve the domain name into an IP address). By analyzing this data over time for a domain, illegitimate domains may be identified. For instance, search engine 125 may monitor whether physically correct address information exists over a period of time, whether contact information for the domain changes relatively often, whether there is a relatively high number of changes between different name servers and hosting companies, etc. In one implementation, a list of known-bad contact information, name servers, and/or IP addresses may be identified, stored, and used in predicting the legitimacy of a domain and, thus, the documents associated therewith.
 Also, or alternatively, the age, or other information, regarding a name server associated with a domain may be used to predict the legitimacy of the domain. A “good” name server may have a mix of different domains from different registrars and have a history of hosting those domains, while a “bad” name server might host mainly pornography or doorway domains, domains with commercial words (a common indicator of spam), or primarily bulk domains from a single registrar, or might be brand new. The newness of a name server might not automatically be a negative factor in determining the legitimacy of the associated domain, but in combination with other factors, such as ones described herein, it could be.
This claims they will be learning DNS information on each domain, including nameservers, ip addresses and where the domains were registered.
 In addition, or alternatively, search engine 125 may monitor the ranks of documents over time to detect sudden spikes in the ranks of the documents. A spike may indicate either a topical phenomenon (e.g., a hot topic) or an attempt to spam search engine 125 by, for example, trading or purchasing links. Search engine 125 may take measures to prevent spam attempts by, for example, employing hysteresis to allow a rank to grow at a certain rate. In another implementation, the rank for a given document may be allowed a certain maximum threshold of growth over a predefined window of time. As a further measure to differentiate a document related to a topical phenomenon from a spam document, search engine 125 may consider mentions of the document in news articles, discussion groups, etc. on the theory that spam documents will not be mentioned, for example, in the news. Any or a combination of these techniques may be used to curtail spamming attempts.
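The hysteresis idea can be sketched as a cap on per-window rank growth. The 20% ceiling below is an assumed value; the patent only says a "certain maximum threshold of growth":

```python
def capped_rank(previous_rank: float, raw_rank: float,
                max_growth: float = 0.20) -> float:
    # Let a document's rank rise by at most max_growth per time window,
    # damping the sudden spikes that link buying produces. Drops pass
    # through unchanged, since min() only clips increases.
    ceiling = previous_rank * (1 + max_growth)
    return min(raw_rank, ceiling)
```

A raw rank that quintuples in one window gets clipped to a 20% gain; a legitimate slow climb is untouched, which is exactly the spike-damping behavior described above.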
This section shows that buying links based solely on high PageRank is not the way to go. A steady mixture of link sources produces the more natural increase in ranking . . . something they want to see.
 It may be possible for search engine 125 to make exceptions for documents that are determined to be authoritative in some respect, such as government documents, web directories (e.g., Yahoo), and documents that have shown a relatively steady and high rank over time. For example, if an unusual spike in the number or rate of increase of links to an authoritative document occurs, then search engine 125 may consider such a document not to be spam and, thus, allow a relatively high or even no threshold for (growth of) its rank (over time).
This simply states acquiring links from big directories like Yahoo and so on is fine. A bit hypocritical, don’t you think?! You can pay for it with big companies, but you can’t pay for it from small companies. hmmm. . .
 According to an implementation consistent with the principles of the invention, user maintained or generated data may be used to generate (or alter) a score associated with a document. For example, search engine 125 may monitor data maintained or generated by a user, such as “bookmarks,” “favorites,” or other types of data that may provide some indication of documents favored by, or of interest to, the user. Search engine 125 may obtain this data either directly (e.g., via a browser assistant) or indirectly (e.g., via a browser). Search engine 125 may then analyze over time a number of bookmarks/favorites to which a document is associated to determine the importance of the document.
 Search engine 125 may also analyze upward and downward trends to add or remove the document (or more specifically, a path to the document) from the bookmarks/favorites lists, the rate at which the document is added to or removed from the bookmarks/favorites lists, and/or whether the document is added to, deleted from, or accessed through the bookmarks/favorites lists. If a number of users are adding a particular document to their bookmarks/favorites lists or often accessing the document through such lists over time, this may be considered an indication that the document is relatively important. On the other hand, if a number of users are decreasingly accessing a document indicated in their bookmarks/favorites list or are increasingly deleting/replacing the path to such document from their lists, this may be taken as an indication that the document is outdated, unpopular, etc. Search engine 125 may then score the documents accordingly.
 In an alternative implementation, other types of user data that may indicate an increase or decrease in user interest in a particular document over time may be used by search engine 125 to score the document. For example, the “temp” or cache files associated with users could be monitored by search engine 125 to identify whether there is an increase or decrease in a document being added over time. Similarly, cookies associated with a particular document might be monitored by search engine 125 to determine whether there is an upward or downward trend in interest in the document.
Now is a good time to add the “ADD US TO YOUR BOOKMARK” options. They will consider this as a positive to your score.
 According to an implementation consistent with the principles of the invention, information regarding unique words, bigrams, and phrases in anchor text may be used to generate (or alter) a score associated with a document. For example, search engine 125 may monitor web (or link) graphs and their behavior over time and use this information for scoring, spam detection, or other purposes. Naturally developed web graphs typically involve independent decisions. Synthetically generated web graphs, which are usually indicative of an intent to spam, are based on coordinated decisions, causing the profile of growth in anchor words/bigrams/phrases to likely be relatively spiky.
 One reason for such spikiness may be the addition of a large number of identical anchors from many documents. Another possibility may be the addition of deliberately different anchors from a lot of documents. Search engine 125 may monitor the anchors and factor them into scoring a document to which their associated links point. For example, search engine 125 may cap the impact of suspect anchors on the score of the associated document. Alternatively, search engine 125 may use a continuous scale for the likelihood of synthetic generation and derive a multiplicative factor to scale the score for the document.
 In summary, search engine 125 may generate (or alter) a score associated with a document based, at least in part, on information regarding unique words, bigrams, and phrases in anchor text associated with one or more links pointing to the document.
This just says: DON'T SPAM! It talks about machine-generated pages that really have no content, only keyword/anchor-text-stuffed pages.
 According to an implementation consistent with the principles of the invention, information regarding linkage of independent peers (e.g., unrelated documents) may be used to generate (or alter) a score associated with a document.
 A sudden growth in the number of apparently independent peers, incoming and/or outgoing, with a large number of links to individual documents may indicate a potentially synthetic web graph, which is an indicator of an attempt to spam. This indication may be strengthened if the growth corresponds to anchor text that is unusually coherent or discordant. This information can be used to demote the impact of such links, when used with a link-based scoring technique, either as a binary decision item (e.g., demote the score by a fixed amount) or a multiplicative factor.
This is telling us to stay relevant! Don’t plaster your text link ads all over non-relevant sites. There are definitely situations where a non-relevant site is actually relevant, but keep the majority relevant.
In conclusion, this new patent, which is said to have been implemented during this most recent update, definitely involves some big changes in scoring websites. The underlying goal of the patent is to combat the various techniques people use to alter their results. There is no doubt they need to address many of these forms of spamming, but they also might be taking things to the extreme in some situations. Many of their theories fit many situations; however, there are also many situations where their theories go against the grain of certain markets. Although they claim their machine will adapt and “learn” the trends and patterns to eliminate these situations, there is no doubt innocent websites will be hit, and hit hard, by these overbearing attempts.
Now, I can't fault Google for continuing to innovate and strive for the "perfect results," but it is almost as if they are focusing more on spammers than on relevancy. Suppose they successfully eliminated every spam site (which will never happen): would that automatically give them perfect natural listings? The answer, which doesn't take four hours to reach like reading this patent application does, is a simple "no."
Finally, what does this mean for LinkWorth? Nothing! LinkWorth does sell text ads, but guess who else sells text links? Google. It is a proven method of online marketing, and our advertisers are delighted with the results we deliver. LinkWorth does not use spamming techniques, since we match advertisers to relevant partners. If we focused on rankings alone, we obviously would not stay in business long amid the constant changes. Soon we will launch our additional services, which will include pay-per-click management, banner ads, and a couple of very exciting new products not yet announced to the public. We will change as quickly as, or more quickly than, the search engines themselves. Staying ahead of the curve is what makes or breaks companies, and we pride ourselves on staying ahead. Two of our products, Billboards and Rotating Ads, directly benefit from this latest patent addition, so we are just fine. Sticking to the straight and narrow path allows companies to live long and healthy lives.
Recent Google Update
For those who have read about or noticed the big changes in Google's most recent update, labeled "Jagger," the dust is beginning to settle and search results are starting to look more reasonable. Many theories have been thrown around as to what changed in Google's algorithm, including new PageRank, new algorithm calculations, discounting this, discounting that . . . but as usual, these are guesstimates, because no one really knows what is happening except the people under the Google roof. Even the update names, like "Jagger," are fabrications coined by someone on a popular forum; to the best of our knowledge, Google has no involvement with the names, at least not any given to the public.
Many people took big hits with this recent update, and while it is only natural to panic if your site drops off the face of the earth, we have been able to talk everyone off the ledge and wait for things to settle. Google is clearly revising its quest for natural search results, which is a great thing, but in the process things can definitely change. If you have time to spare and want to read some truly dry technical mumbo jumbo, read Google's recent patent submission for more information. Some say it could be smoke blown to disguise the true workings of the algorithm, but who really knows? The bottom line is that Google is constantly developing new ideas to present the best results possible.
Some may ask, "Why would you want their results to change?" That's easy: I want only reliable results. The bottom line is that if you have a great site, build it so that you like it, other people like it, and it is a constant venue for new information about your subject, you will be rewarded in the search results. LinkWorth is not here to spam or trick search engines; we are here to help companies build their visibility, name awareness, and popularity. The best formula for anyone promoting a website is a combination of text link advertising through LinkWorth and a PPC (pay-per-click) campaign through Google and/or Yahoo. LinkWorth builds your visibility through sites relevant or similar to yours, and PPC helps deliver the instant traffic you may need.
So please take the Google update as a good change for everyone. Sure, it will mean more work for some of you, but if Google made things a breeze, competing in any market would be impossible unless you had millions in the bank to outspend your competition. Google keeps the playing field fair and competitive for companies large and small. We will always be here to help, through the tough times and the easy ones.