Hotel websites are still recovering from last year’s Penguin update, but a new update loosely called ‘Penguin 2.0’ by the industry is looming on the horizon. Now is the time hotel marketers must make changes to their website SEO to prevent any penalties that might be levied by Penguin 2.0. While no one knows for certain what will be in Google’s upcoming update, here are a few likely scenarios you can prepare for.
First, Some Context
Last year, Google released an update to their algorithm called Penguin, which severely penalized websites that practice bad SEO. Tragically, hotel websites were some of the worst hit by these penalties, as hoteliers have a history of paying unscrupulous SEO agencies to ‘do’ SEO for them. Hoteliers therefore remain vulnerable to upcoming penalties unless they change their SEO strategies and move away from vendors whose techniques are outdated.
Soon, Google will update Penguin, and eventually incorporate it into their broader algorithm. Google’s head of web spam Matt Cutts spilled the beans about the looming update when he commented on a popular SEO blog:
“…expect that the next few Penguin updates will take longer, incorporate additional signals, and as a result will have more noticeable impact.” (Emphasis mine)
In other recent interviews, Cutts said the updates will be “jarring and jolting”, and to those looking forward to the next Penguin update, “You don’t want the next Penguin update”.
@mrjamiedodd we do expect to roll out Penguin 2.0 (next generation of Penguin) sometime in the next few weeks though.
— Matt Cutts (@mattcutts) May 10, 2013
8 Steps to Prepare Your Hotel for Penguin 2.0
Note: I am indebted to many thought leaders in the SEO industry, including Search Engine Watch, QuickSprout, and State of Search for performing excellent SEO research and giving me much to think about when composing this guide.
Also note: No one truly knows what Google plans to do. This guide is a prediction of what Google will likely focus on with Penguin 2.0, based on intuition, industry discussions, and public statements from Google.
1) Manipulated Links
As Google’s technology advances, they become increasingly sophisticated at discovering manipulated backlinks. Google valued backlinks in the first place only because backlinks are harder to ‘game’ compared to other outdated SEO techniques, like meta tags. But, like Google’s use of meta keywords tags, Google will rely on backlinks less and less with each algorithm update. It’s quite possible that someday, Google won’t place any value on backlinks at all.
For Penguin 2.0, the only links Google will value are links placed with editorial intent—that is, placed with purpose because they give benefit to the reader.
Google can easily detect where links appear on a page, and they may discount the following kinds of links completely:
- Site-wide navigation links, including footer and sidebar links
- Links in author signatures
- Links in blog comments
- Links in forum signatures
- Links in author profiles
- Sponsored links
- Blogroll links
- Links from embedded media, like apps, widgets, images, infographics, and video
- Generic boiler-plate links
- Article directories
- Links from press releases
- Much more
In the image above, we see that the majority of backlinks for this website come from forum signatures (dark blue), blog comments (teal), and short text paragraphs (pink), all telltale signs of purchased links and ‘spun’ content. We also see that the few links from more trusted websites, found further out in the disc, come from public sites like YouTube and Blogspot, where links are easy to obtain, while the rest of the website’s links come from poorly trusted websites closer to the center (notice how dense the center of the disc becomes). If I can find rich data like this about a hotel website’s manipulated backlinks, how much easier can Google?
In short, if Google can detect that a link was not consciously placed by the author as an editorial comment, reference, or source in the body of the article, then Google will discount the link.
Action Item: Don’t pay for backlinks. If you have paid for backlinks in the past, verify that none of the links you received were from these low-quality websites. If you have links from these kinds of websites, you may want to have them removed.
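To make the audit concrete, here is a minimal sketch of sorting backlinks by where they appear on the page. The placement labels and sample links are hypothetical; in practice this data would come from a backlink tool’s export, not from Google itself.

```python
# Flag backlinks whose on-page placement suggests they were not
# editorially placed. Placement labels here are hypothetical.

# Placements Google may discount entirely (per the list above)
RISKY_PLACEMENTS = {
    "footer", "sidebar", "blog_comment", "forum_signature",
    "author_profile", "blogroll", "widget", "press_release",
    "article_directory",
}

def audit_links(backlinks):
    """Split backlinks into editorial vs. likely-discounted."""
    editorial, risky = [], []
    for link in backlinks:
        if link["placement"] in RISKY_PLACEMENTS:
            risky.append(link)
        else:
            editorial.append(link)
    return editorial, risky

links = [
    {"url": "http://travelblog.example/review", "placement": "article_body"},
    {"url": "http://gamerforum.example/thread", "placement": "forum_signature"},
    {"url": "http://seo-dir.example/listing", "placement": "article_directory"},
]
editorial, risky = audit_links(links)
print(f"{len(risky)} of {len(links)} links look risky")
```

Any link that lands in the risky bucket is a candidate for removal.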
2) Domain Trust
If a website repeatedly practices suspicious linking, then Google will decrease the domain authority it assigns to that website over time. Websites like this are easy to detect, as they share similar characteristics with each other: call it a low-quality fingerprint. If Google sees that the majority of your website’s backlinks come from low-quality sites like these, it may conclude you’re buying backlinks.
With Penguin 2.0, Google will likely introduce strict penalties for websites caught receiving links from ‘bad neighborhoods’, or sending links to them.
In the above example, we see that the majority of backlinks this website has come from websites with low authority. The dots on the outer edge of the spiral are websites with higher authority. Notice how few there are. The closer the dots get towards the middle of the spiral, the lower the authority of those websites. Hovering over one of the dots, we see the kind of poor-quality, unrelated websites this hotel has backlinks from: this one is a public forum for free video games. (I’ve blurred out part of the anchor text and website URL, as I’m not in the business of outing people for their shady SEO.)
Action Item: If you actively build backlinks for your hotel website, make sure you only solicit high quality websites that are thought leaders in your niche. If your agency has solicited low quality links, try to have them removed.
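The ‘low-quality fingerprint’ above can be checked roughly: if most referring domains fall below some authority threshold, the profile resembles purchased links. The authority scores and threshold below are hypothetical; real scores would come from a third-party tool such as Moz or Majestic.

```python
# Rough sketch: what share of referring domains look low-authority?
# Scores and the threshold of 20 are illustrative assumptions.

def low_authority_share(domain_scores, threshold=20):
    """Fraction of referring domains below the authority threshold."""
    low = sum(1 for score in domain_scores.values() if score < threshold)
    return low / len(domain_scores)

referring = {
    "freegamesforum.example": 5,
    "linkdirectory.example": 8,
    "spunarticles.example": 3,
    "seattletimes.example": 72,
}
share = low_authority_share(referring)
if share > 0.5:
    print(f"{share:.0%} of referring domains look low-authority -- audit them")
```

A profile dominated by low-authority domains is exactly the pattern the spiral diagram makes visible.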
3) Social Signals
Google has been using social signals to supplement backlinks as a way to judge a website’s quality for some time now. But even social signals like Facebook likes can be ‘gamed’, or purchased, and Google is getting increasingly better at detecting them.
In the example above, we see a graph of one website’s Facebook likes, broken into clusters. Each gray dot represents one person who liked the website, and the lines represent connections to other people who have liked the content.
The left side of the cluster looks normal. The page has likes from various people, who are connected to each other through long chains of interlinking friends. A healthy social graph will depict an interlinking web of likes from friends connected to other people through various degrees of removal, whom the website author has never met.
On the right, we see some suspicious activity: four big lumps of likes from people connected only to friends within small groups. These four groups connect with each other, but very few of them connect to the larger web on the left. This is highly suspicious, because a genuine community of friends rarely has no connections at all outside its clique. That is, the people who liked the page are all friends with each other only, and have few friends outside those circles. This pattern makes sense, because a real person on Facebook is unlikely to send a friend request to a fake account created by a robot to give likes to a website.
Google could use data like this to help detect spam, and penalize websites.
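As a toy illustration of that cluster analysis, we can treat likes as a friendship graph and find its connected components: small components isolated from the main web of likes are the suspicious pattern described above. The names and connections are invented.

```python
# Find connected components of an undirected friendship graph.
# Small isolated components (here, under 4 people) are flagged.
from collections import deque

def components(friends):
    """Return the connected components of a friendship graph."""
    seen, comps = set(), []
    for person in friends:
        if person in seen:
            continue
        comp, queue = set(), deque([person])
        while queue:
            p = queue.popleft()
            if p in comp:
                continue
            comp.add(p)
            queue.extend(friends.get(p, ()))
        seen |= comp
        comps.append(comp)
    return comps

# One organic web of likers plus one isolated clump of likely fakes
likes = {
    "ann": ["bob"], "bob": ["ann", "cai"], "cai": ["bob", "dee"],
    "dee": ["cai", "eli"], "eli": ["dee"],
    "fake1": ["fake2"], "fake2": ["fake1", "fake3"], "fake3": ["fake2"],
}
for comp in components(likes):
    if len(comp) < 4:
        print("suspicious isolated cluster:", sorted(comp))
```

At Google’s scale the thresholds would be far more sophisticated, but the shape of the signal is the same.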
Since Google does not have access to all the data held by third-party social networks like Facebook and Twitter, Google created its own social network, Google+. As Google continues to aggressively push adoption, it will likely place greater trust in social signals from Google+ than from any other network. Google has access to all the social data from its own users, which makes suspicious activity from fake accounts easy to detect, so it can reliably place trust and value on +1s (the Google equivalent of a Facebook ‘like’) and Google+ shares while weeding out the chaff.
Action item: Optimize your hotel website to accept Google +1s (see my guide to +1s for more info). Use the social network to build up a community of Google+ users with whom to share hotel information.
4) Verified Authorship

Google has clearly said they will give greater weight to content tied to verified online identities compared to anonymous content. Google can verify authorship through Google+ and verified Twitter profiles, but will probably rely more and more on Google+ authorship as that network matures. Hotel websites will receive a boost in the rankings if they are tied to a verified online identity. Read our guide to Google+ for hotels and our article on setting up Google+ authorship and publisher markup for more information.
Action Item: Make sure Google+ authorship is installed correctly.
5) Niche Relevance

For a long time now, Google has warned us that getting a slew of links from a random assortment of websites might not be the best idea, as they give greater weight to links from websites in your niche. With Penguin 2.0, they will probably become stricter about the authority they grant to backlinks from irrelevant websites.
In the above example, we see this hotel website gets many of its backlinks from article directories (pink), blog comments (green), and, tightly packed towards the center of the spiral, low-quality web directories (red). A healthy backlink profile would instead have the majority of its links coming from travel-related websites, personal blogs, university websites, corporate websites, news websites, and search engines.
Action Item: Discover the kinds of websites your backlinks come from. If the majority come from article directories, blog comments, or other un-related websites, consider having them removed.
6) Rich Anchor Text
Links with rich or ‘exact-match’ anchor text are links on a website made from words that exactly match the search engine query the recipient hopes to rank for. The idea goes that a hotel website will get 100 backlinks with anchor text that says ‘great Seattle hotels’. Google will index these links, see that the hotel website is relevant to ‘great Seattle hotels’, and then rank the hotel website well for that keyphrase.
This type of manipulation was all but killed with Penguin last year, and it was a hard-learned lesson. This, more than anything, caused so many hotel websites to be severely penalized or even removed from Google’s index entirely. While some rich anchor text is still good for a website, the majority of backlinks should use the hotel brand name, or generic phrases like ‘click here’, to be deemed safe and ‘natural’.
Recent reports indicate Google is reducing the percentage of rich anchor text backlinks that are acceptable. With Penguin 2.0, we could see even more websites fall out of the index or get penalized for having backlinks with rich anchor text.
Action Item: If your website has paid backlinks, or links from friends, family, or colleagues that include rich anchor text, ask them to replace the anchor text with something safer, like your hotel brand name.
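A quick way to gauge this risk is to measure what share of your backlinks use exact-match commercial phrases versus the brand name or generic text. The phrases and anchor list below are hypothetical examples.

```python
# Sketch of an anchor-text audit: share of exact-match commercial
# anchors in a backlink profile. Phrases are illustrative assumptions.

EXACT_MATCH = {"great seattle hotels", "best seattle hotel deals"}

def anchor_profile(anchors):
    """Fraction of anchors that exactly match a commercial phrase."""
    exact = sum(1 for a in anchors if a.lower() in EXACT_MATCH)
    return exact / len(anchors)

anchors = [
    "Example Hotel", "great Seattle hotels", "click here",
    "great Seattle hotels", "Example Hotel", "website",
]
share = anchor_profile(anchors)
print(f"{share:.0%} exact-match anchors")  # a high share is a red flag
```

No one outside Google knows the exact threshold, which is all the more reason to keep the exact-match share low.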
7) Traffic Metrics
Metrics that demonstrate how popular and interesting your website is have always been a part of the Google algorithm, but they were emphasized with Google Panda (not Penguin). Now, the way your viewers act on your website can determine how well it ranks.
Google will look at how quickly your website loads, how long guests stay on your website, their ‘bounce rate’, how much of your content is read, how many broken links your website has, and so on. Now more than ever, it is essential to perfect the technical side of your website’s SEO.
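Two of the engagement metrics above, bounce rate and time on site, are simple to compute from analytics session records. The session data below is invented for illustration; a real audit would pull these numbers from your analytics platform.

```python
# Sketch: bounce rate and average time on site from session records.
# A 'bounce' is a session that viewed only one page.

def engagement(sessions):
    """Return (bounce_rate, avg_seconds) for a list of sessions."""
    bounces = sum(1 for s in sessions if s["pageviews"] == 1)
    avg_secs = sum(s["seconds"] for s in sessions) / len(sessions)
    return bounces / len(sessions), avg_secs

sessions = [
    {"pageviews": 1, "seconds": 8},    # bounced
    {"pageviews": 4, "seconds": 210},
    {"pageviews": 2, "seconds": 95},
    {"pageviews": 1, "seconds": 5},    # bounced
]
bounce_rate, avg = engagement(sessions)
print(f"bounce rate {bounce_rate:.0%}, avg time on site {avg:.0f}s")
```

If these numbers are poor, fixing load speed and content quality helps both your guests and, likely, your rankings.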
Action Item: If your website is outdated or hasn’t been touched for years, it’s important to check on it—it may need to be rebuilt. Partnering with a digital marketing system is a smart move, as they will make sure your website is always up-to-date and innovative.
8) Co-Citations / Co-Occurrence
Co-citation is the theory that a website can gain authority from other websites even when they don’t link to it, based on mentions and on links among other websites in the network. This can be a tad complicated, but I’ll try to explain with diagrams.
Above, hotel A links to both B and C (blue arrow). Hotel B links back to hotel A. Even though hotel B does not link to hotel C, B gives C a co-citation (black arrow) because of its reciprocal relationship with A. That is, Google connects C to B through their mutual relationship with A.
Above, hotel A and hotel B link only to each other. However, both websites talk about hotel C, by mentioning it in the blog articles where they link to each other, for example. Hotel C gets a co-citation from each website, even though neither actually links to it.
Here, no one is actually linking to anyone. However, each website mentions one of the others. By citing each other, Google can figure out there is a connection between them all, and pass along co-citations where necessary.
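The diagrams above can be modeled as a small graph: collect each site’s outbound links and mentions, then infer a connection between any two sites cited together by the same source. The site names are hypothetical.

```python
# Toy model of co-citation: sites cited together by the same source
# are inferred to be related, even with no direct link between them.
from itertools import combinations

def co_citations(citations):
    """Given {site: set of sites it links to or mentions},
    return pairs of sites cited together by the same source."""
    pairs = set()
    for cited in citations.values():
        for a, b in combinations(sorted(cited), 2):
            pairs.add((a, b))
    return pairs

citations = {
    "hotelA.example": {"hotelB.example", "hotelC.example"},
    "hotelB.example": {"hotelA.example", "hotelC.example"},
}
# hotelB and hotelC never link to each other, but both are cited by
# hotelA, so a connection between them can be inferred.
print(co_citations(citations))
```

Whether Google weighs co-citations this literally is speculation, but the mechanism shows how authority could flow without any backlink at all.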
It’s possible that Google is resorting to alternative trust indicators like co-citations to convey website and article authority, and rank content.
Action Item: Continue to develop thought leadership in your niche using advertisements, creating deals, guest blogging, and developing relationships with leaders in your niche, in order to get people talking about you. Even if they don’t give you backlinks, your website may still receive a ranking benefit simply by being cited by other websites in your niche.
I realize that this information is pretty thick and can be overwhelming. My goal is to demonstrate how far away we are from the old days of SEO where stuffing keywords in meta tags and stuffing backlinks in website footers was considered clever SEO. We now have to work in a world where Google’s algorithm is increasingly difficult to game, and any SEO designed to manipulate the algorithm that actually works, if it even exists, is far beyond the talents of most SEO agencies.
Google Penguin has always been designed to take out sites that use manipulative techniques to improve search rankings. As long as your efforts improve content quality and give guests better information, there is little to worry about. If, however, you have used SEO agencies that proved unreliable, or that turn out to have used black-hat tactics, then you may have your work cut out for you to prepare for Penguin 2.0.
N.B.: A previous version of this article mistakenly referred to Penguin 2.0 as Panda 2.0. Panda 2.0 is impossible, because Google has woven Panda into their regular algorithm now. There are likely “Panda updates” many times a month, just as there are regular algorithm updates and experiments.