Few things are more gratifying for a website manager than to type a relevant keyword or phrase into a search engine and see their site proudly listed near the top of the results page. Whether you’re running a business, promoting written or video content, or maintaining any other kind of online presence, standing out from the crowd in a vast World Wide Web is no small feat.
Practicing effective SEO can help you gain an edge over the competition. SEO comprises a variety of strategies for signaling to search engines like Google that your content is appropriate for users searching specific terms. However, SEO has changed just as rapidly as the internet itself over its nearly 30-year history.
Understanding how SEO has evolved over time reveals valuable insights regarding the best methods of optimizing your site to rank highly in your respective field. It also provides some hints as to what changes could be on the horizon. Let’s look back at the history of SEO and how its present state compares to its infancy.
The birth of SEO essentially dates back to the first search engine, which appeared in 1990 and was known as Archie (Source: Hubspot). While Archie itself predated most webpages, it represented a desire to create a searchable index of the content on the internet. Over the next several years, the web would really begin to grow, and many other search engines emerged to help users harness the boundless information and knowledge available on the platform.
If you surfed the web in the 90s, you’re probably very familiar with names like Excite, Lycos, AltaVista, and Yahoo!. Perhaps you even asked Jeeves a few questions back in the day. In this fledgling era, SEO was much like the Wild West. Anything was fair game. Unsurprisingly, an environment founded on flimsy rules led to the rise of many practices that are frowned upon today as “black hat” SEO techniques.
Back then, if you wanted to rank highly for a specific search term, you’d simply use that term as frequently as possible in your website content. This strategy is appropriately known as keyword stuffing. Cramming your chosen keyword or phrase into your site countless times is a clear example of how SEO used to be more about producing content for a machine rather than a human audience. Search engines would recognize the term used heavily on your site and assume it was, therefore, a good resource for those interested in the subject. Of course, that’s not always true. In fact, keyword stuffing more often results in thin, clumsy content rather than comprehensive information.
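To see why keyword stuffing was so easy to game, consider the crude metric early engines effectively rewarded: how often a term appears relative to the total word count. The function below is a minimal illustrative sketch (not any actual search engine's code), with made-up example text:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the share of words in `text` that match `keyword` (a naive metric)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Hypothetical examples: stuffed copy scores far "better" than useful copy.
stuffed = "best shoes best shoes buy best shoes online best shoes"
natural = "Our guide compares running shoes by comfort, price, and durability."
print(keyword_density(stuffed, "shoes"))  # 0.4 — 4 of 10 words
print(keyword_density(natural, "shoes"))  # 0.1 — 1 of 10 words
```

A metric like this says nothing about whether a page actually helps the reader, which is exactly why search engines eventually abandoned it.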
Websites also used to hoard backlinks in an effort to appear more attractive to search engines. The problem was that many of these links came from spammy, low-quality sources. In theory, many links directing to your site should indicate that you’re an authoritative destination in your field. But when those links appear on inferior sites, your page’s lofty search rank could be misleading.
The course of SEO history took a fateful turn in 1997 when a pair of Stanford University students registered the domain Google.com. Larry Page and Sergey Brin had some fresh ideas about how search engines should operate. In 1998 they published a paper titled “The Anatomy of a Large-Scale Hypertextual Web Search Engine” that would prove highly influential in the world of search.
The paper introduced the term PageRank, which became the name for part of the algorithm Google would use to evaluate and rank websites in search results. Page and Brin also established many of the core SEO principles we adhere to today. They envisioned search engines as filtering content based on quality and relevance, not merely an overabundance of keywords. SEO should be about delivering the best possible experience to the user, not checking off a mechanical list of boxes (Source: Hubspot).
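The core idea of PageRank can be sketched in a few lines: a page's score flows to the pages it links to, so a link from a well-regarded page counts for more than a link from an obscure one. The toy power-iteration below is only an illustration of that principle (the page names and the 0.85 damping factor from the original paper are the only givens; everything else is a simplified assumption):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank via power iteration.

    links: dict mapping each page to the list of pages it links to.
    Each page distributes its current score across its outlinks, so
    endorsements from high-scoring pages carry more weight.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its score evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Hypothetical three-page web: B is linked to by both A and C,
# so it ends up with the highest score.
graph = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
scores = pagerank(graph)
```

Note how the result depends on who links to a page, not on anything the page says about itself — the insight that made keyword stuffing obsolete.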
While these ideas were revolutionary, not much changed among website owners. If the status quo was still working for them, why bother doing anything differently? Yet as Google became the dominant search engine and its clout grew, it found itself in a position to alter the rules of the game.
Google’s algorithms determine how results are delivered for a search query. These algorithms receive frequent updates that adjust the way pages are evaluated and ranked. Many are very minor and little more than blips on the radar, but some fundamentally shift the SEO landscape. The “game changers,” if you will.
The search engine giant fired its first major salvo with the Florida update in November 2003. With this update in effect, sites were heavily penalized for employing the questionable SEO tactics discussed earlier. Domains that relied on keyword stuffing, low-quality backlinks, and more to rise to the top watched their names plummet down the SERP (Source: Search Engine Journal).
Unsurprisingly, the owners of these sites were incensed at the abrupt change. They felt like the rug had suddenly been pulled out from under their feet. However, the message was clear: If you wanted to play ball with the world’s most popular search engine, a black hat strategy would no longer guarantee success.
Many of Google’s updates are expressly designed to create a better search experience for users. 2005 was a landmark year in the continuation of this effort.
Personalized search made its debut. Aided by users’ search and browsing histories, Google began serving more custom-tailored results. The search engine would offer up destinations it deemed likely to be relevant based on previous searches and sites a user had visited before.
Google also launched a free, powerful tool allowing users access to valuable data and insights about their sites’ performance. Google Analytics put key metrics like pages per session and bounce rate right at users’ fingertips. Webmasters could easily see where their sites needed to improve to provide a better experience and move up search rankings. Analytics remains one of the most important resources in an SEO manager’s arsenal (Source: Squirrly SEO).
In the following years, Google rolled out a series of updates to further refine its search ranking algorithms. Several of the most prominent ones featured memorable, animal-themed nicknames. The changes implemented by these updates were significant enough that sites needed to take notice.
Panda arrived in February 2011, dubbed “Farmer” by some for the way it targeted “content farms.” These are sites that compile a large number of pages covering an array of search queries in an attempt to maximize their impact on search engines. However, the individual pages are typically very low in quality and thin on actual content.
Panda penalized sites that relied on weak or duplicate content, as well as those with more ads than real content. To rank better in search, webmasters had to create comprehensive material that was genuinely useful to readers (Source: Moz).
Google next moved its crosshairs onto dubious link-building. Beginning in April 2012, the Penguin update punished sites that had previously benefited from amassing spammy, low-quality links. Instead of rewarding sites purely for their number of incoming links, the search engine would assess the quality and authority of those links. This way, sites with lots of link spam wouldn’t appear more noteworthy than they actually were (Source: Search Engine Journal).
In 2013, Google rolled out a major update called Hummingbird that affected about 90% of all searches. Hummingbird’s main goal was to adjust to the evolving ways people interacted with search engines. Online search was becoming more like an in-person conversation, a trend still apparent today with the growth of digital voice assistants like Amazon Alexa and Google Assistant.
Hummingbird improved Google’s ability to handle nuanced, complicated search queries. By focusing on the query as a whole rather than individual terms, the search engine could more accurately determine user intent and deliver results relevant to what they wanted to know.
Unlike some of its predecessors, the update didn’t aim to outright penalize poor SEO practices. However, it was important for webmasters to think about building their content with long-tail queries in mind (Source: Search Engine Journal).
Google turned its attention to providing a better local search experience with the Pigeon update in July 2014. Like Hummingbird, its primary goal wasn’t to address black hat SEO tactics. Instead, it helped deliver more relevant results to a user based on their location. This gave local businesses the opportunity to rank higher in SERPs when users in their area searched for their type of business (Source: Search Engine Journal).
As more searches moved to mobile devices, the ability of websites to optimize for a mobile audience became essential. Google responded with the Mobile-Friendly Update (given the more dramatic “Mobilegeddon” moniker by some) in April 2015. The update rewarded sites that were easy to read and navigate on smartphones and other mobile devices (Source: Search Engine Land).
Google stepped further into the future with RankBrain, introduced in October of the same year. RankBrain is an algorithm powered by artificial intelligence that uses machine learning to filter search results. While it’s still just part of the puzzle, Google has indicated that RankBrain is now the third-most important factor in ranking web pages. With RankBrain, the search engine can more accurately interpret search queries and fetch relevant results based on the user’s intent (Source: Search Engine Land).
SEO continues to change at a frenetic pace: Google made 10 notable tweaks to its algorithms in 2017 alone, per Search Engine Journal. These updates typically arrive unannounced and with minimal supplementary information, making it difficult for those in the industry to keep up.
However, this also makes it an exciting space to monitor. While it’s hard to say with absolute certainty where the road will lead, it’s a safe bet that SEO will increasingly revolve around personalizing for users, along with optimizing for mobile and local search. With 50% of all searches projected to be voice searches by 2020, websites can’t afford to ignore conversational queries, either.
The days of going all-in on ranking for a specific keyword are far behind us. As AI and machine learning play expanded roles, search engines like Google can do a much better job of determining how useful pages actually are to searchers.
For more insights and predictions about what the future holds for SEO, be sure to download our free “Future of Search Marketing” ebook!