Part I: SEARCH ENGINE OPTIMIZATION
By Ruth Burr
Search engine optimization has been around for about as long as search engines have. It's hard to imagine SEO before Google with its current market share domination, but as early as the mid-1990s, marketers were thinking about how to make their products as findable as possible on the web.
The Birth of SEO
In the early days of search, Yahoo! and its cohorts were run like Yellow Pages services: website owners submitted their sites for indexing, and search engines did their best to match up pages with search queries. Most ranking criteria centered on keyword density—did a given keyword show up in prominent places on the page? How many times did it occur in the page's content?
It wasn't long before website owners caught on to search algorithms and began to tailor their sites to meet search engines' criteria. Search engine optimization was born! This meant that the first generation of web marketers had new tools to help get their content in front of the growing consumer base that was the Internet. It also meant that they could, through reverse-engineering the algorithm, easily create hundreds of pages that ranked for search terms without passing much value along to the searcher.
These were the first skirmishes in what would become a battle that continues to this day. Search engines try to create spam-proof algorithms that surface the best content to their users, while marketers struggle to get their sites to the top of the rankings—sometimes, by any means possible.
As weaknesses in their algorithms continued to be exploited, search engines began to look beyond individual web pages to off-page criteria like links. It was around this time that Google came on the scene, and changed everything.
Life After Google
Google's algorithm combined on-page factors with a concept called PageRank, which measured the value passed from page to page via links. With PageRank, a link to a webpage becomes a “vote” for that page. It's a form of social proof: the more people who agree that a site is worthwhile, the more credibility that site has.
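The “voting” idea can be sketched in a few lines of code. The following is a simplified illustration of the PageRank concept using power iteration, not Google's production algorithm (which weighs hundreds of additional signals); the function name, damping factor, and three-page example web are illustrative choices:

```python
# Simplified PageRank via power iteration -- an illustrative sketch,
# not Google's actual ranking system.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline of rank (the damping term)...
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if not targets:
                continue
            # ...and passes the rest along its outbound links, split evenly.
            share = damping * rank[page] / len(targets)
            for target in targets:
                new_rank[target] += share  # each link is a "vote"
        rank = new_rank
    return rank

# Three-page web: A and C both link to B, so B accumulates the most rank.
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["B"]})
```

In this toy web, page B ends up with the highest score because it receives links (“votes”) from both A and C, while A, which nothing links to, keeps only the baseline rank.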
Google quickly amassed a huge market share, edging out smaller competitors like AltaVista and toppling Yahoo!'s dominion over the search engine space. As more and more people turned to Google to shop, play, and find information, the Google SERPs (search engine results pages) became the place you had to be if you wanted to succeed online. In 2013, comScore reported that Google had 67 percent of the U.S. search engine market share, trailed by Microsoft Bing and Yahoo! with 16.5 percent and 12.1 percent, respectively (http://mz.cm/XOG96n). Eager to reach customers on the web, more and more businesses are turning to SEO as a major revenue driver.
As SEO grew as a practice, Google built Google Webmaster Central and Google Webmaster Tools. Now site owners and SEOs could hear directly from Google about new developments and get some hints about what they should or shouldn't be doing to rank well. The algorithm is still more shrouded in secrecy than not, but Google Webmaster Tools does provide some good diagnostic tools to help site owners maintain search-friendly sites.
The modern SERPs look very different than they did at Google's inception, when the top 10 pages were listed as 10 blue links on a white page. Since then, Google has integrated its News, Video, Images, Local, and other vertical searches into one SERP format called Universal Search. It's started utilizing users' search histories, IP addresses, and social media activity to tailor search results to individuals. Google is also constantly experimenting with different numbers of results, new result and ad formats, and even a Direct Answers service that displays the answer to a question like “How many tablespoons in a cup” directly on the SERP, no click needed. To be successful, it's important for today's SEOs to keep abreast of the latest changes.
Cracking Down
As Google rose to ascendancy in the search engine market, attempts to exploit the algorithm cropped up as fast as Google could squash them down. With updates to Google's algorithm coming every few months, new ways to game the system had plenty of time to take effect before the next crackdown.
That all changed in 2010 with Google Caffeine, an update that marked the beginning of more frequent updates to the algorithm. Now Google is making slight tweaks to the algorithm almost every day, with frequent larger changes as well.
It had been a long-held axiom in the search world that “Content is King.” Without at least some text content on a page, it was very difficult to show search engines what a page was about at all, let alone that it was unique enough to rank. However, this meant that hundreds of thousands of sites on the web were shoehorning small amounts of unnecessary, keyword-stuffed text onto pages that didn't really need it. Additionally, huge content sites sprang up with page upon page of content designed to rank for queries but not provide real answers, instead using content to draw users in to a page full of ads.
In 2011, Google released a major update called Panda. Google Panda targeted this “thin content,” looking for more robust signals that content was relevant, unique, and valuable to users. Google has confirmed that Panda is an ongoing algorithmic “check” that is run periodically to target new thin content.
While inbound links have remained a valuable signal for site authority, they were also one of the most frequently manipulated. In 2012, Google released the Penguin update, designed to target “unnatural” links such as links from directories and links that webmasters had surreptitiously paid for.
One major upheaval as a result of Penguin was the changing focus on link anchor text. Google had long treated keyword-rich anchor text in inbound links as an indicator of quality, but eventually found that a high percentage of inbound links with keyword-rich anchor text (as opposed to the name of the website or generic text like “click here”) was a sign of an unnatural link profile. Like Panda, Penguin is a periodic fix that Google runs to catch new offenders.
Panda and Penguin impacted countless websites. Companies that had enjoyed search engine success for years suddenly found themselves scrambling. In the wake of these updates, the SEO community has had a renewed focus on “white hat” SEO—that is, implementing solid business practices to create quality websites within search guidelines, rather than resorting to tricks or loopholes. For more on this, see Chapter 1, “White Hat SEO: It F@$#ing Works.”
How Search Engines Make Money
When learning how to rank in search engines, it's helpful to remember that search engines aren't public services; they're businesses, out there to make money. Google's market share is an asset that can be used to sell ads. Sixty-seven percent of the available eyeballs in the U.S. are looking at Google when they're searching, and that's an audience advertisers can't afford to ignore. Charging advertisers to get their ads in front of those eyeballs is what drives Google's bottom line.
What this means to search engine marketers is that Google is going to do everything it can to protect its most valuable asset: its market share. That means that Google will consistently do everything in its power to make sure that people who use Google find what they're looking for. Search engines put a great deal of time, talent, and money into discerning what users want when they search, and which pages don't fulfill those needs.
Tactics That Never Stop Working
Building an “algorithm-proof” website that won't be hurt by the likes of Panda and Penguin means adopting classic white hat techniques that never stop working. These include building an easily-crawled site; creating content meant to engage users; building relationships and communities to encourage content sharing in order to naturally accrue links; and monitoring site metrics to consistently improve performance. The benefit of this stance is that SEOs can return to focusing on the user and customer, while still showing search engines that we have ...