
That’s right folks, it’s time for some more PubCon South 2011 nuggets. I left Austin with notes on such a wide variety of topics that it’s hard to mold them into coherent posts, but I’ll see what I can do here.

[Image: Google's panda feasts on pennies in the SERPs... That's not too much of a stretch, right?]

In this post I’ll be covering analytics, page speed, log files, content/the “Panda” update, and the JCPenney link building fiasco. I’m only presenting the info that struck me as being particularly useful or interesting… And it’ll all flow together beautifully. I promise. Ready? Let’s do this.

Analytics and the competition

How much SEO traffic is your competitor driving? Prashant Puri suggested using Compete.com to view your competitors’ referrers. You can specify whether you’d like to view total or paid search referrals, and from there you can determine roughly how much of their traffic comes from organic search (total referrals minus paid referrals = organic search referrals). You can use another analytics data service, like Hitwise, to validate this data – the numbers should be similar.
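Just to make the math concrete, here’s a trivial sketch. The numbers below are made up for illustration – in practice you’d read them off Compete.com and sanity-check them against Hitwise:

```python
# Back-of-the-napkin estimate of a competitor's organic search traffic.
# All figures are hypothetical; pull the real ones from Compete.com.
total_search_referrals = 120_000  # hypothetical monthly total
paid_search_referrals = 35_000    # hypothetical paid portion

organic_search_referrals = total_search_referrals - paid_search_referrals
organic_share = organic_search_referrals / total_search_referrals

print(f"Estimated organic referrals: {organic_search_referrals:,}")
print(f"Organic share of search traffic: {organic_share:.0%}")
```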

Google Analytics and user privacy

On the topic of analytics – did you know that the Google Analytics TOS requires you to disclose your usage in your site’s privacy policy?

“You will have and abide by an appropriate privacy policy and will comply with all applicable laws relating to the collection of information from visitors to Your websites. You must post a privacy policy and that policy must provide notice of your use of a cookie that collects anonymous traffic data.”

Nate Griffin mentioned this briefly, and I certainly wasn’t aware of it. If you use Analytics on your site, you may want to revise your terms of service/privacy policy page. This template provides a nice starting point. Not only does this keep you compliant with the terms, but it also shows your customers that you’re transparent and that you care about them. If you want to take it a step further, you can offer them “opt out” options: a cookie or a browser plugin.

Reevaluate your approach

Nate also stressed the importance of not viewing Analytics as a “set it and forget it” service. You’re not “doing analytics” just because you bought some fancy software. Analyzing the data – and implementing changes based on what you find – should be part of your regular site maintenance. It has to be a priority of the executives at your company.

The state of SEO in 2011

Aaron Shear spoke on Advanced SEO Tactics. He stressed that SEO is no longer just “moving words around”, and that in order to stay ahead, more advanced techniques must be employed. Do you have the right staff to get the job done? An SEO role is for technical people who can interface with both developers and senior management.

Page speed is not old news

In addition to URL structure, site structure/taxonomy, and content, Aaron feels that site performance (page speed) is a major factor in Google’s ranking algorithm, and should not be overlooked. He said that a load time in excess of 4 seconds is “your death”. Simply put, your site should be pretty fast if you want to rank well in the SERPs.

He went on to list some things that can be done to decrease your load time, such as:

  • Leverage browser caching (via your server settings), with expiry times of at least one month
  • Minify CSS and JavaScript files
  • Enable GZIP compression and configure the server to use it properly
  • Utilize image sprites
  • Use a CDN (content delivery network)

He made other recommendations as well, but a lot of these will vary based on your CMS/backend and server configuration. To check the page speed of your site and see which aspects could be improved upon, I’d recommend using GTmetrix, which pulls data from both Page Speed (Google) and YSlow (Yahoo!) without the need to install any browser plugins/extensions.
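If you want a quick sanity check before reaching for the tools, you can inspect a page’s response headers yourself. Here’s a minimal Python sketch (standard library only) that checks for the caching and GZIP items from Aaron’s list – substitute your own URL, and treat it as a starting point rather than a full audit:

```python
# Spot-check two items from the list above: browser caching headers
# and GZIP compression. GTmetrix gives far more detail than this.
from urllib.request import Request, urlopen

url = "http://www.example.com/"  # substitute the page you want to test

req = Request(url, headers={"Accept-Encoding": "gzip"})
resp = urlopen(req)

cache_control = resp.headers.get("Cache-Control", "(not set)")
expires = resp.headers.get("Expires", "(not set)")
encoding = resp.headers.get("Content-Encoding", "(none)")

print(f"Cache-Control:    {cache_control}")
print(f"Expires:          {expires}")
print(f"Content-Encoding: {encoding}")

if "gzip" not in encoding:
    print("GZIP doesn't appear to be enabled for this URL.")
if "max-age" not in cache_control and expires == "(not set)":
    print("No browser caching headers found - worth fixing.")
```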

Page speed and WordPress

If you’re using WordPress as a CMS, he recommended the W3 Total Cache plugin. He said that he saw a client’s site drop from an 8-second load time to 1.3 seconds after installing it. That’s a pretty significant change, especially considering the ease of implementation.

I decided to give this bad boy a shot on my personal site, johnvantine.com. I ran a speed test using GTmetrix, and my score wasn’t all that great:

Page Speed grade 66% (D), YSlow grade 66% (D), 3.66-second load time, 1.20 MB page size, 51 total requests

I then installed W3 Total Cache. I also installed WP Minify, and disabled the minify options in W3, as I’ve heard WP Minify does a better job at these specific tasks. After very minimal tweaking (I followed the suggestions given here) I saw a definite improvement in my page speed time:

Page Speed grade 90% (A), YSlow grade 90% (A), 1.17-second load time, 757 KB page size, 18 total requests

Considering that it took me less than 15 minutes to install these plugins and tweak the settings, that’s a pretty substantial improvement. I was dangerously close to a 4-second load time, and now I’m barely over 1 second.

Getting down and dirty with log files

Ian Lurie spoke about working with server log files. He feels that most people are not taking advantage of the data contained in server log files, and that this is an area where you can get a leg up on the competition. He’s right… I’m guilty of this. I’ve always been intimidated by log files, but after seeing his presentation I’m excited to dive in when I have some time.

You may currently be using a service like Google Analytics or Coremetrics. While these tools are good, they don’t provide you with the level of detail that log files do. You can use log files to:

  • Identify image indexation issues (look for visits from search bots that resulted in 404 errors)
  • Identify “crawl budget” leaks and wasted bandwidth
  • Identify link opportunities (external sites linking to pages on your domain that no longer exist)

…and a whole lot more. Google Webmaster Tools offers some basic functionality for things like incoming links to 404 pages, but log files do a much better job.

If you want to catch details that your competitors aren’t, this is a great place to do it, as your average SEO is most likely not doing this. Working with log files to get this information can be complicated, so instead of outlining it all here, check out Ian’s post about log files for more information.
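Just to show that the parsing itself isn’t black magic, here’s a bare-bones sketch that scans an Apache-style access log for Googlebot requests that came back 404 – the first item from the list above. The log path is hypothetical and the regex assumes the standard “combined” log format, so adjust both for your own server:

```python
# Minimal sketch: find Googlebot hits that returned a 404 in an
# Apache combined-format access log. Field positions vary by server
# configuration, so adjust the regex to match your own log format.
import re
from collections import Counter

# Combined log format: IP, identd, user, [time], "request",
# status, bytes, "referer", "user-agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

bot_404s = Counter()
with open("access.log") as log:  # path is hypothetical
    for line in log:
        m = LINE_RE.search(line)
        if m and m.group("status") == "404" and "Googlebot" in m.group("agent"):
            bot_404s[m.group("path")] += 1

# The pages Googlebot keeps requesting but can't find: crawl budget
# leaks, broken inbound links, or missing redirects.
for path, hits in bot_404s.most_common(20):
    print(f"{hits:5d}  {path}")
```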

The Panda Update – what was it all about?

So what was Google’s latest major algorithm change (now commonly known as the “Panda” update) all about? The general consensus seems to be that it was targeted specifically at content farms. Eric Enge claims that it was broader than that. You’re not automatically “in the clear” if your site wasn’t specifically a content farm. He said that in light of the update, all webmasters should be asking themselves how good their content is.

He pointed out that Google has access to a lot of data regarding your site, including:

  • How users interact with your site in the SERPs
  • Personal Blocklist (Chrome extension) data – there’s a strong correlation between sites that people choose to block with this tool and sites that are seen as “low quality”. Eric thinks that the blocklist data had a strong alignment with the Panda update.
  • User bounce rate (and other information from Analytics)
  • SERP site previews – if a user clicks on the magnifying glass to see a preview of your site and then ultimately doesn’t click through, what does that say about the quality of your site?
  • Google toolbar data (how many pages on your site did they visit, and for how long?)

He also pointed out that the “time on site” data that you see in Analytics isn’t very accurate, because technically it’s a measure of the time elapsed from when the first page is loaded to when the last page is loaded. You don’t know when the user left that last page. They could have been on it for 10 seconds or half an hour, but this is not factored into that Analytics metric. Google has access to the true “time on site” data via the Google toolbar, and they can compare the time users spend on your site versus your competitors’ sites.
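A quick made-up example shows how big that gap can be:

```python
# Why the Analytics "time on site" number undercounts: it measures from
# the FIRST pageview to the LAST pageview, so time spent reading the
# final page is invisible. Timestamps are made up for illustration.
from datetime import datetime

pageviews = [
    datetime(2011, 3, 15, 9, 0, 0),   # lands on the homepage
    datetime(2011, 3, 15, 9, 2, 30),  # clicks through to an article
    datetime(2011, 3, 15, 9, 3, 10),  # opens a product page...
]                                     # ...then reads it for 25 minutes

print(pageviews[-1] - pageviews[0])   # 0:03:10 - the 25 minutes never shows up
```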

“Content is king” continues to ring true

Any combination of the above metrics can be looked at to determine whether your content is substantial, as certain user behavior patterns are indicative of content quality.

If you’re worried about the content (or lack thereof) on your site, ask yourself what makes your site unique. If you’re struggling to come up with an answer, you may have “thin content”. An example of a site with thin content would be an eCommerce store that provides little more than the manufacturer’s description on product pages.

What do you do if you have thin content? You want to either remove the offending pages (the ones that are performing poorly in Analytics) or “fix them” by adding quality content. There are a lot of different ways to obtain quality content for your pages. User generated content (UGC) can work wonders here. If you don’t have enough traffic for UGC, try running a Facebook contest. Regardless of the method, you want to replace thin content with deep (or at least original) content ASAP.

Please the panda

What else should you fix to make the Panda happy? Some other common areas for improvement:

  • A high number of spelling or grammatical errors
  • Content areas with a high ad-to-content ratio
  • Pages with notably poor performance in Analytics
  • Pages with duplicate content (content that appears on other sites)

Regarding duplicate content, the most powerful source wins. What this means is that even if the content belongs to you, if you post it somewhere else and that site is seen as more authoritative than yours, you could suffer in the rankings for it. Example: let’s say you have an eCommerce store, and you have unique product descriptions for each item that you sell. If you sell these items on eBay as well as your site, and you use those same product descriptions in the eBay auctions, you may ultimately be penalized for having these descriptions on your own site, since eBay is seen as a very authoritative site. Be careful who you syndicate your content out to!

For more information regarding the Panda Update, check out Google’s Farmer Update: 5 SEO Tips You Need Now from Wpromote’s own Mike Mothner.

Before the panda struck, JCPenney was smacked

Prior to the Panda update, everyone in the SEO community was talking about JCPenney’s very public link building fiasco. To summarize, JCPenney was purchasing thousands of links on irrelevant sites in order to rank higher in the SERPs. This New York Times article put everything in the spotlight, and Google had no choice but to take action. They made an example out of JCPenney (as well as Overstock.com) by “slapping” them, essentially lowering their pages in the SERPs.

Link building and thresholds

Kristine Schachinger talked about JCPenney’s fiasco for a bit. When running a link building campaign that includes purchased links, she feels there are thresholds that, when crossed, put you at risk of penalization in the SERPs. She has experimented with this thoroughly, and she said that she’s able to knock a website out of the index simply by purchasing a large number of irrelevant links to it.

While Wpromote certainly does not condone this type of strategy (nor does Kristine), it can happen. Just ask JCPenney – they claim to have had no knowledge of the thousands of links being built to their domain; links that ultimately resulted in their penalization. Whether or not this is true is irrelevant, because regardless of who built the links, JCPenney ultimately suffered the consequences (and presumably lost a large amount of business as a result).

Defense!

So how can you protect yourself against this? You need to be on the ball. Keep track of inbound links to your domain with MajesticSEO. Track your rankings. If you begin to rank for new terms that you haven’t optimized for, find out why. The best way to combat “JCPenney Syndrome” is to make sure that roughly 80% of your inbound links are “white hat, Matt Cutts friendly” links. And the other 20%? Kristine estimates that you can get away with roughly that proportion of irrelevant links.

If a competitor shifts your link balance from 80/20 to 50/50, you’re skating on thin ice, and you need to “re-balance your link zen”.
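If you export your inbound links (say, from MajesticSEO to a CSV) and label them yourself, the ratio check itself is simple. Here’s a toy sketch – the file layout and “label” column are hypothetical, and the relevant-vs-irrelevant judgment is the hard, human part that no script does for you:

```python
# Rough sketch of the 80/20 check. Assumes a hypothetical CSV export
# of your backlinks with a "label" column you've already filled in
# as "relevant" or "irrelevant" by hand.
import csv

relevant = 0
total = 0
with open("backlinks.csv") as f:   # file name is hypothetical
    for row in csv.DictReader(f):  # expects a "label" column
        total += 1
        if row["label"] == "relevant":
            relevant += 1

ratio = relevant / total if total else 0
print(f"{relevant}/{total} links look relevant ({ratio:.0%})")
if ratio < 0.8:
    print("Below the ~80% threshold - time to re-balance your link zen.")
```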

That’s all that I’ve got for now. I still have a good amount of PubCon notes to sift through, so a third post may very well be in the works. Stay tuned!

