Search engine optimization (SEO)
SEO, or Search Engine Optimization, is a set of techniques and strategies aimed at improving the visibility and ranking of a website or web page in search engine results pages (SERPs). The primary goal of SEO is to drive organic (non-paid) traffic to a website. Here are some key aspects of SEO:
- Keyword Research: The foundation of SEO. It involves researching and selecting the relevant keywords or phrases that users are likely to enter into search engines when looking for information related to your website.
- On-Page SEO: Optimizing individual web pages to make them more search-engine-friendly, including meta tags (title tags, meta descriptions), relevant keywords in content, and proper HTML markup.
- Content Quality: High-quality, relevant, and valuable content is crucial. Search engines reward websites that provide useful information to their users, and regularly updated content can also improve rankings.
- Link Building: Getting other reputable websites to link to your site. Quality backlinks from authoritative sources can improve your site's authority and ranking in search results.
- Technical SEO: The technical side of a website, such as site speed, mobile-friendliness, and proper site structure. A technically sound website is important for SEO.
- User Experience (UX): A good user experience matters to both users and search engines. Sites with a clean design, easy navigation, and fast loading times tend to rank better.
- Local SEO: For businesses with physical locations, optimizing for local search is crucial. This includes creating and optimizing Google My Business listings and obtaining local citations.
- Mobile Optimization: With the increasing use of mobile devices, a mobile-responsive website is essential. Google also considers mobile-friendliness when ranking sites.
- Analytics and Monitoring: Regularly monitoring your website's performance with tools like Google Analytics helps you understand what is working and what needs improvement.
- Algorithm Updates: Search engines like Google frequently update their algorithms. Staying informed about these updates and adapting your SEO strategy accordingly is vital.
- White Hat vs. Black Hat SEO: SEO techniques fall broadly into "white hat" (ethical) and "black hat" (unethical) categories. Following ethical practices is important to avoid penalties from search engines.
- SEO Tools: Many tools can assist with SEO, including keyword research tools, SEO auditing tools, and analytics platforms.
SEO is an ongoing process that
requires time, effort, and constant adaptation to changing search engine
algorithms and user behaviors. It's a crucial aspect of digital marketing for
businesses and website owners looking to improve their online visibility and
attract organic traffic.
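The on-page checks mentioned above (title tags and meta descriptions) can be sketched as a small audit script. The length limits below are common rules of thumb rather than official search engine specifications, and the sample page is hypothetical:

```python
import re

# Common rule-of-thumb limits, not official specs: titles around 60
# characters and meta descriptions around 155 characters tend to
# display in SERPs without truncation.
TITLE_MAX = 60
DESCRIPTION_MAX = 155

def audit_page(html: str) -> list[str]:
    """Return a list of simple on-page SEO warnings for an HTML document."""
    warnings = []
    title = re.search(r"<title>(.*?)</title>", html, re.S | re.I)
    if not title:
        warnings.append("missing <title> tag")
    elif len(title.group(1).strip()) > TITLE_MAX:
        warnings.append(f"title longer than {TITLE_MAX} characters")
    desc = re.search(
        r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
        html, re.S | re.I)
    if not desc:
        warnings.append("missing meta description")
    elif len(desc.group(1).strip()) > DESCRIPTION_MAX:
        warnings.append(f"meta description longer than {DESCRIPTION_MAX} characters")
    return warnings

page = "<html><head><title>Coffee Roasting Guide</title></head><body></body></html>"
print(audit_page(page))  # → ['missing meta description']
```

Real audits parse the DOM properly and check many more signals; this only illustrates the kind of rule an auditing tool applies.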
Search engine optimization (SEO) is the process of
improving the quality and quantity of traffic to a website or
a web page from search engines. SEO targets unpaid traffic
(known as "natural" or "organic" results) rather than direct
traffic or paid traffic. Unpaid traffic may originate from different kinds
of searches, including image search, video search, academic search, news
search, and industry-specific vertical search engines.
As an Internet marketing strategy,
SEO considers how search engines work, the computer-programmed algorithms that
dictate search engine behavior, what people search for, the actual search terms
or keywords typed into search engines, and which search engines are
preferred by their targeted audience. SEO is performed because a website will
receive more visitors from a search engine when it ranks higher on
the search engine results page (SERP). These visitors can then
potentially be converted into customers.
History
Webmasters and content providers
began optimizing websites for search engines in the mid-1990s, as the first
search engines were cataloging the early Web. Initially, webmasters
needed only to submit the address of a page, or URL, to the various
engines, which would send a web crawler to crawl that
page, extract links to other pages from it, and return information found on the
page to be indexed. The process involves a search engine spider
downloading a page and storing it on the search engine's own server. A second
program, known as an indexer, extracts information about the page, such as
the words it contains, where they are located, and any weight for specific
words, as well as all links the page contains. All of this information is then
placed into a scheduler for crawling at a later date.
Website owners recognized the value of
a high ranking and visibility in search engine results, creating
an opportunity for both white-hat and black-hat SEO
practitioners. According to industry analyst Danny Sullivan, the phrase
"search engine optimization" probably came into use in 1997. Sullivan
credits Bruce Clay as one of the first people to popularize the term.
Early versions of search algorithms relied
on webmaster-provided information such as the keyword meta tag or
index files in engines like ALIWEB. Meta tags provide a guide to each
page's content. Using metadata to index pages was found to be less than
reliable, however, because the webmaster's choice of keywords in the meta tag
could potentially be an inaccurate representation of the site's actual content.
Flawed data in meta tags, such as those that were inaccurate or incomplete,
created the potential for pages to be mischaracterized in irrelevant searches. Web
content providers also manipulated some attributes within the HTML source
of a page in an attempt to rank well in search engines. By 1997, search
engine designers recognized that webmasters were making efforts to rank well in
their search engines and that some webmasters were even manipulating their
rankings in search results by stuffing pages with excessive or irrelevant
keywords. Early search engines, such as AltaVista and Infoseek,
adjusted their algorithms to prevent webmasters from manipulating rankings.
By heavily relying on factors such
as keyword density, which were exclusively within a webmaster's control,
early search engines suffered from abuse and ranking manipulation. To provide
better results to their users, search engines had to adapt to ensure
their results pages showed the most relevant search results, rather
than unrelated pages stuffed with numerous keywords by unscrupulous webmasters.
This meant moving away from heavy reliance on term density to a more holistic
process for scoring semantic signals. Since the success and popularity of
a search engine are determined by its ability to produce the most relevant
results to any given search, poor quality or irrelevant search results could
lead users to find other search sources. Search engines responded by developing
more complex ranking algorithms, taking into account additional factors that
were more difficult for webmasters to manipulate.
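The keyword density signal that early engines over-relied on is just a term-frequency ratio, which is exactly why it was so easy to game; a minimal sketch, using a hypothetical snippet of keyword-stuffed text:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of `keyword` divided by the page's total word count."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A webmaster fully controls this number: repeating a term inflates it.
page = "cheap flights cheap hotels cheap deals book cheap flights now"
print(round(keyword_density(page, "cheap"), 2))  # → 0.4
```

Because the metric depends only on text the webmaster writes, engines moved toward signals outside the page, such as links, that are harder to manipulate.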
Companies that employ overly aggressive
techniques can get their client websites banned from the search results. In
2005, the Wall Street Journal reported on a company, Traffic
Power, which allegedly used high-risk techniques and failed to disclose those
risks to its clients. Wired magazine reported that the same
company sued blogger and SEO Aaron Wall for writing about the ban. Google's Matt
Cutts later confirmed that Google did in fact ban Traffic Power and some
of its clients.
Some search engines have also reached
out to the SEO industry and are frequent sponsors and guests at SEO
conferences, webchats, and seminars. Major search engines provide information
and guidelines to help with website optimization. Google has a Sitemaps program
to help webmasters learn if Google is having any problems indexing their
website and also provides data on Google traffic to the website. Bing
Webmaster Tools provides a way for webmasters to submit a sitemap and web
feeds, and allows them to set the "crawl rate" and track the web
pages' index status.
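A sitemap of the kind submitted through these webmaster tools follows the sitemaps.org XML protocol; here is a minimal generator using only Python's standard library, with a hypothetical URL:

```python
from xml.etree import ElementTree as ET

# Namespace defined by the sitemaps.org protocol, which Google Search
# Console and Bing Webmaster Tools both accept.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a sitemap XML string from (url, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for page_url, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page_url
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([("https://example.com/", "2024-01-15")])
print(xml)
```

The protocol also defines optional `changefreq` and `priority` elements, omitted here for brevity.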
In 2015, it was reported that Google was
developing and promoting mobile search as a key feature within future products.
In response, many brands began to take a different approach to their Internet
marketing strategies.
Relationship with Google
In 1998, two graduate students at Stanford
University, Larry Page and Sergey Brin, developed
"Backrub," a search engine that relied on a mathematical algorithm to
rate the prominence of web pages. The number calculated by the algorithm, PageRank,
is a function of the quantity and strength of inbound links. PageRank
estimates the likelihood that a given page will be reached by a web user who
randomly surfs the web and follows links from one page to another. In effect,
this means that some links are stronger than others, as a higher PageRank page
is more likely to be reached by the random web surfer.
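The random-surfer model described above can be sketched as power iteration with a damping factor (0.85 is the value commonly cited for PageRank); the four-page graph below is a toy example:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank for a dict {page: [outbound links]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Base probability of the surfer jumping to a random page.
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)
                for q in outs:
                    new[q] += damping * share
            else:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# "c" has the most inbound links, so the random surfer reaches it most often.
graph = {"a": ["b"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # → 'c'
```

Total rank mass stays at 1.0 across iterations, matching the interpretation of PageRank as a probability distribution over pages.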
Page and Brin founded Google in 1998. Google
attracted a loyal following among the growing number of Internet users,
who liked its simple design. Off-page factors (such as PageRank and
hyperlink analysis) were considered as well as on-page factors (such as keyword
frequency, meta tags, headings, links, and site structure) to enable Google
to avoid the kind of manipulation seen in search engines that only considered
on-page factors for their rankings. Although PageRank was more difficult
to game, webmasters had already developed link-building tools and schemes
to influence the Inktomi search engine, and these methods proved
similarly applicable to gaming PageRank. Many sites focus on exchanging,
buying, and selling links, often on a massive scale. Some of these schemes,
or link farms, involved the creation of thousands of sites for the sole
purpose of link spamming.
By 2004, search engines had
incorporated a wide range of undisclosed factors in their ranking algorithms to
reduce the impact of link manipulation. The leading search engines,
Google, Bing, and Yahoo, do not disclose the algorithms they use to
rank pages. Some SEO practitioners have studied different approaches to search
engine optimization and have shared their personal opinions. Patents
related to search engines can provide information to better understand search
engines. In 2005, Google began personalizing search results for each user.
Depending on their history of previous searches, Google crafted results for
logged-in users.
In 2007, Google announced a campaign
against paid links that transfer PageRank. On June 15, 2009, Google
disclosed that it had taken measures to mitigate the effects of PageRank
sculpting through the nofollow attribute on links. Matt Cutts,
a well-known software engineer at Google, announced that Googlebot would no
longer treat nofollowed links the same way, in order to prevent SEO service
providers from using nofollow for PageRank sculpting. As a result of this
change, the PageRank that would have flowed through nofollowed links simply
evaporated rather than being redistributed. To work around this, SEO
engineers developed alternative techniques that replace nofollowed links
with obfuscated JavaScript and thus permit PageRank sculpting; other
suggested workarounds include iframes, Flash, and JavaScript.
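How a crawler might honor the nofollow attribute can be sketched with Python's standard-library HTML parser; the links are hypothetical, and real crawlers are far more involved:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect hrefs from <a> tags, skipping links marked rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rel and "href" in attrs:
            self.followed.append(attrs["href"])

html = ('<a href="/about">About</a>'
        '<a rel="nofollow" href="/ads">Sponsor</a>')
parser = LinkExtractor()
parser.feed(html)
print(parser.followed)  # → ['/about']
```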
In December 2009, Google announced it
would be using the web search history of all its users in order to populate
search results. On June 8, 2010, a new web indexing system called Google
Caffeine was announced. Designed to allow users to find news results,
forum posts, and other content much sooner after publishing than before, Google
Caffeine was a change to the way Google updated its index in order to make
things show up quicker on Google than before. According to Carrie Grimes, the
software engineer who announced Caffeine for Google, "Caffeine provides 50
percent fresher results for web searches than our last index..." Google
Instant, real-time search, was introduced in late 2010 in an attempt to make
search results more timely and relevant. Historically, site administrators have
spent months or even years optimizing a website to increase search rankings.
With the growth in popularity of social media sites and blogs, the leading
engines made changes to their algorithms to allow fresh content to rank quickly
within the search results.
In February 2011, Google announced
the Panda update, which penalizes websites containing content
duplicated from other websites and sources. Historically, websites had copied
content from one another and benefited in search engine rankings by engaging in
this practice. However, Google implemented a new system that punishes sites
whose content is not unique. The 2012 Google Penguin update attempted
to penalize websites that used manipulative techniques to improve their
rankings on the search engine. Although Google Penguin has been presented
as an algorithm aimed at fighting web spam, it really focuses on spammy links by
gauging the quality of the sites the links are coming from. The 2013 Google
Hummingbird update featured an algorithm change designed to improve
Google's natural language processing and semantic understanding of web pages.
Hummingbird's language processing system falls under the newly recognized term
of "conversational search," where the system pays more attention to
each word in the query in order to better match the pages to the meaning of the
query rather than a few words. For content publishers and writers, Hummingbird
is intended to resolve these issues by getting rid of irrelevant content and
spam, allowing Google to surface high-quality content from 'trusted' authors.
In October 2019, Google announced they
would start applying BERT models for English language search queries
in the US. Bidirectional Encoder Representations from Transformers (BERT) was
another attempt by Google to improve their natural language processing, but
this time in order to better understand the search queries of their users. In
terms of search engine optimization, BERT intended to connect users more easily
to relevant content and increase the quality of traffic coming to websites that
are ranking in the Search Engine Results Page.
As marketing strategy
SEO is not an appropriate strategy for
every website, and other Internet marketing strategies can be more effective,
such as paid advertising through pay-per-click (PPC) campaigns,
depending on the site operator's goals. Search engine marketing (SEM) is
the practice of designing, running, and optimizing search engine ad campaigns.
Its difference from SEO is most simply depicted as the difference between paid
and unpaid priority ranking in search results. SEM focuses on prominence more
than relevance; website developers should regard SEM as highly important for
visibility, since most users navigate to the primary listings of their
search results. A successful Internet marketing campaign may
also depend upon building high-quality web pages to engage and persuade
Internet users, setting up analytics programs to enable site owners
to measure results, and improving a site's conversion rate. In
November 2015, Google released a full 160-page version of its Search Quality
Rating Guidelines to the public, which revealed a shift in their focus
towards "usefulness" and mobile local search. In recent years
the mobile market has exploded, overtaking the use of desktops, as shown
by StatCounter in October 2016, which analyzed 2.5 million
websites and found that 51.3% of the pages were loaded by a mobile device. Google
has capitalized on the popularity of mobile usage by encouraging websites to
use its Google Search Console and Mobile-Friendly Test, which allow
companies to check how user-friendly their websites are on mobile devices.
Placing related keywords closer together on a page may also improve
a page's ranking for those key terms.
SEO may generate an adequate return
on investment. However, search engines are not paid for organic search traffic,
their algorithms change, and there are no guarantees of continued referrals.
Due to this lack of guarantee and uncertainty, a business that relies heavily
on search engine traffic can suffer major losses if the search engines stop
sending visitors. Search engines can change their algorithms, impacting a
website's search engine ranking, and possibly resulting in a serious loss of
traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500
algorithm changes - almost 1.5 per day. It is considered a wise business
practice for website operators to liberate themselves from dependence on search
engine traffic. In addition to accessibility in terms of web crawlers
(addressed above), user web accessibility has become increasingly
important for SEO.
International markets
Optimization techniques are highly
tuned to the dominant search engines in the target market. The search engines'
market shares vary from market to market, as does competition. In 2003, Danny
Sullivan stated that Google represented about 75% of all searches. In
markets outside the United States, Google's share is often larger, and Google
remains the dominant search engine worldwide as of 2007. As of 2006,
Google had an 85-90% market share in Germany. While there were hundreds of
SEO firms in the US at that time, there were only about five in Germany. As
of June 2008, the market share of Google in the UK was close to 90% according
to Hitwise. That market share is achieved in a number of countries.
As of 2009, there are only a few large
markets where Google is not the leading search engine. In most cases, when
Google is not leading in a given market, it is lagging behind a local player.
The most notable example markets are China, Japan, South Korea, Russia, and the
Czech Republic, where Baidu, Yahoo! Japan, Naver, Yandex, and Seznam,
respectively, are the market leaders.
Successful search optimization for
international markets may require professional translation of web
pages, registration of a domain name with a top-level domain in the
target market, and web hosting that provides a local IP address.
Otherwise, the fundamental elements of search optimization are essentially the
same, regardless of language.
Legal precedents
On October 17, 2002, SearchKing filed
suit in the United States District Court, Western District of Oklahoma,
against the search engine Google. SearchKing's claim was that Google's tactics
to prevent spamdexing constituted a tortious interference with
contractual relations. On May 27, 2003, the court granted Google's motion to
dismiss the complaint because SearchKing "failed to state a claim upon
which relief may be granted."
In March 2006, KinderStart filed a
lawsuit against Google over search engine rankings. KinderStart's website was
removed from Google's index prior to the lawsuit, and the amount of traffic to
the site dropped by 70%. On March 16, 2007, the United States District
Court for the Northern District of California (San Jose Division)
dismissed KinderStart's complaint without leave to amend and partially granted
Google's motion for Rule 11 sanctions against KinderStart's attorney,
requiring him to pay part of Google's legal expenses.