Search Engine Optimization History

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a “spider” to “crawl” that page, extract links to other pages from it, and return information found on the page to be indexed. In this process, a search engine spider downloads a page and stores it on the search engine’s own server. A second program, known as an indexer, then extracts information about the page: the words it contains and where they are located, any weight given to specific words, and all the links the page contains. Those links are placed into a scheduler for crawling at a later date.
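To make the process concrete, here is a minimal sketch in Python of that crawl-and-index loop. It is purely illustrative, not any engine’s actual code: a spider downloads a page, an indexer records each word and its position, and the extracted links go into a scheduler queue for a later crawl.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class PageParser(HTMLParser):
    """Collects the href targets and visible text of one downloaded page."""
    def __init__(self):
        super().__init__()
        self.links, self.text = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.text.append(data)

def crawl_and_index(seed_url, max_pages=10):
    scheduler = deque([seed_url])   # links waiting to be crawled later
    index = {}                      # word -> list of (url, position)
    seen = set()

    while scheduler and len(seen) < max_pages:
        url = scheduler.popleft()
        if url in seen:
            continue
        seen.add(url)

        # The spider: download the page (a real engine stores a copy too).
        html = urlopen(url).read().decode("utf-8", errors="replace")

        # The indexer: record the words the page contains and where they occur.
        parser = PageParser()
        parser.feed(html)
        for position, word in enumerate(" ".join(parser.text).split()):
            index.setdefault(word.lower(), []).append((url, position))

        # All links found on the page go into the scheduler for later crawling.
        scheduler.extend(urljoin(url, link) for link in parser.links)

    return index
```

A production crawler would of course be distributed across many machines and would respect robots.txt, crawl-delay policies, and duplicate detection; the sketch only shows the division of labor the paragraph above describes.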
Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase “search engine optimization” probably came into use in 1997. The first documented use of the term was by John Audette and his company Multimedia Marketing Group, as documented by a page from the MMG site dated August 1997.
Early versions of search
algorithms relied on webmaster-provided information such as the keyword
meta tag, or index files in engines like ALIWEB. Meta tags provide a
guide to each page’s content. Using metadata to index pages proved less than reliable, however, because the webmaster’s choice of
keywords in the meta tag could potentially be an inaccurate
representation of the site’s actual content. Inaccurate, incomplete, and
inconsistent data in meta tags could and did cause pages to rank for
irrelevant searches. Web content providers also manipulated a number of
attributes within the HTML source of a page in an attempt to rank well
in search engines.
By relying so heavily on factors such as keyword density, which were exclusively within a webmaster’s control, early
search engines suffered from abuse and ranking manipulation. To provide
better results to their users, search engines had to adapt to ensure
their results pages showed the most relevant search results, rather than
unrelated pages stuffed with numerous keywords by unscrupulous
webmasters. Since the success and popularity of a search engine is
determined by its ability to produce the most relevant results to any
given search, poor quality or irrelevant search results could lead users
to find other search sources. Search engines responded by developing
more complex ranking algorithms, taking into account additional factors
that were more difficult for webmasters to manipulate. Graduate students
at Stanford University, Larry Page and Sergey Brin, developed
“Backrub,” a search engine that relied on a mathematical algorithm to
rate the prominence of web pages. The number calculated by the
algorithm, PageRank, is a function of the quantity and strength of
inbound links. PageRank estimates the likelihood that a given page will
be reached by a web user who randomly surfs the web and follows links
from one page to another. In effect, this means that some links are
stronger than others, as a higher PageRank page is more likely to be
reached by the random surfer.
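The random-surfer model translates directly into a short computation. The sketch below, an illustrative Python implementation rather than Google’s own code, iterates the PageRank calculation over a toy link graph using the damping factor of 0.85 from the original paper: at each step the surfer either follows a random outbound link or jumps to a random page.

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Power-iteration PageRank over a graph of {page: [outbound links]}."""
    pages = list(graph)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start the surfer uniformly

    for _ in range(iterations):
        # The (1 - damping) term is the surfer's random jump to any page.
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in graph.items():
            if not outlinks:                    # dangling page: spread evenly
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:         # each link passes equal weight
                    new_rank[target] += share
        rank = new_rank
    return rank

# A page with many strong inbound links (here "a") ends up ranked highest.
graph = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a", "b"]}
print(pagerank(graph))
```

Running it on the toy graph shows the effect described above: page “a” collects the most inbound weight and ends with the highest score, so a link from “a” passes more value to its target than a link from “c” or “d” would.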
Page and Brin founded Google in
1998. Google attracted a loyal following among the growing number of
Internet users, who liked its simple design. Off-page factors (such as PageRank and hyperlink analysis) were considered alongside on-page factors (such as keyword frequency, meta tags, headings, links, and site structure), enabling Google to avoid the kind of manipulation seen in search engines that considered only on-page factors for their rankings.
Although PageRank was more difficult to game, webmasters had already
developed link building tools and schemes to influence the Inktomi
search engine, and these methods proved similarly applicable to gaming
PageRank. Many sites focused on exchanging, buying, and selling links,
often on a massive scale. Some of these schemes, or link farms, involved
the creation of thousands of sites for the sole purpose of link
spamming.
By 2004, search engines had incorporated a wide range
of undisclosed factors in their ranking algorithms to reduce the impact
of link manipulation. In June 2007, The New York Times’ Saul Hansell
stated that Google ranks sites using more than 200 different signals. The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions. Patents related to search engines can also provide information that helps in understanding how they work.
In 2005, Google
began personalizing search results for each user. Depending on a user’s history of previous searches, Google crafted results for logged-in users. In 2008, Bruce Clay said that “ranking is dead” because of
personalized search. He opined that it would become meaningless to
discuss how a website ranked, because its rank would potentially be
different for each user and each search.
In 2007, Google announced a campaign against paid links that transfer PageRank. On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Google Bot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting. As a result of this change, using nofollow leads to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus still permit PageRank sculpting. Additionally, several solutions have been suggested that involve the use of iframes, Flash, and JavaScript.
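Ignoring the damping factor, the arithmetic behind that “evaporation” is easy to sketch. Before the 2009 change, nofollowed links were left out of the divisor, so the remaining followed links received larger shares; afterwards, every link counts toward the divisor and the nofollowed share is simply discarded rather than redistributed. The function below is a hypothetical illustration of that rule change, not Google’s actual formula.

```python
def link_share(page_rank, followed, nofollowed, post_2009=True):
    """PageRank passed per followed link, under the old and new rules.

    Before 2009, nofollowed links were excluded from the divisor, so
    sculpting concentrated PageRank onto the followed links. After the
    change, every link counts toward the divisor, and the share assigned
    to nofollowed links evaporates instead of being redistributed.
    """
    divisor = followed + nofollowed if post_2009 else followed
    return page_rank / divisor

# A page with PageRank 1.0, 5 followed links, and 5 nofollowed links:
print(link_share(1.0, 5, 5, post_2009=False))  # 0.2 -> sculpting worked
print(link_share(1.0, 5, 5, post_2009=True))   # 0.1 -> half evaporates
```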
In December 2009, Google announced it
would be using the web search history of all its users in order to
populate search results.
Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators had spent months or even years optimizing a website to increase its search rankings. With the
growth in popularity of social media sites and blogs the leading engines
made changes to their algorithms to allow fresh content to rank quickly
within the search results.
In February 2011, Google announced
the Panda update, which penalizes websites containing content duplicated
from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice; Google’s new system, however, punishes sites whose content is not unique.
In April 2012, Google launched the Google Penguin update, the goal of which was to penalize websites that used manipulative techniques to improve their rankings on the search engine.