This is my 2nd post about the presentations that I attended at SMX West. If you didn’t see the first one, check it out here.

There’s nothing better than getting your hands dirty with some advanced technical SEO stuff first thing in the morning, right? Those were my exact thoughts as I sat in the expo hall on Wednesday morning, bleary-eyed with a hot caffeinated beverage in hand. Without further ado…

Diagnosing Technical SEO Issues

Adam Audette (Audette Media) was the first to speak. He covered SEO site audits, and reminded us that they should be a collaborative process.

SEO Site Audits

When multiple people are involved in a site audit, it can get confusing. It’s very important that everyone involved is on the same page; otherwise, you may find that you are giving your client conflicting advice/suggestions in the site audit. For this reason, he suggested the use of a project management tool like Basecamp.

Everyone does site audits differently, but they should all include the same core elements. Adam identified these as being…

  • On-Site: Domain(s), navigation, sections and categories, pages, and media (images/video/etc)
  • Off-Site: Backlinks (quantity, quality, frequency), social media signals, cache dates/crawl frequency/indexed pages, and toolbar PageRank

After identifying the core elements of an SEO site audit, he said that the “big 4 factors” are URLs, site architecture and navigation, deep pages (PageRank dispersion), and site latency.

Gabe Gayheart (Razorfish) took the stage next, and continued the discussion on site audits.

Gabe recommended that you do a full site crawl and identify each page on the site. He also suggested doing “search engine simulation crawls”, which are individual crawls where you simulate each major search engine, and “browser crawls”, where you view how the site is being rendered in different browsers. I don’t recall Gabe mentioning this tool, but for the browser crawls, I would highly recommend using a tool like Browsershots.

When you’re finishing up a site audit, how do you prioritize the recommendations that you’re making? Gabe recommended that you prioritize in terms of impact (how will this affect the client’s business?), ease of implementation (how difficult will these recommendations be to execute?), and readiness (how quickly will the client be able to implement these changes?).

SEO Tips For Images

Brian Ussery (Search Discovery) was next. He went over some of the things that you can do to make sure that your images do well in Google Image Search:

Use good filenames. If your file is titled 120483.jpg, how is Google supposed to know that it’s a picture of a dog? Call it fluffy-white-pomeranian.jpg instead. Make sure that you’re using descriptive alt text/anchor text (when applicable).

Don’t put text in images. The search engines aren’t going to read the text in your images… I may be speaking too soon on that one, but regardless, you’d be better off placing the text right next to (or below) your image. It’s a good practice to provide information on your images anyway, because the text can be used to determine what the image might be about, and it’s more copy for your page.

Specify image width and height. This one was news to me. Browsers will display an image properly either way, but declaring the dimensions lets the browser reserve space and lay out the rest of the page before the image finishes downloading, so the page renders faster and doesn’t reflow.
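Here’s a minimal sketch that pulls these markup tips together (the filename, alt text, and dimensions are made up for illustration):

```html
<!-- A descriptive filename and alt text tell the search engines what
     the image shows; width and height let the browser lay out the
     page before the image has finished downloading. -->
<img src="/images/fluffy-white-pomeranian.jpg"
     alt="Fluffy, a white Pomeranian puppy"
     width="400" height="300" />
```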

Provide as much metadata as possible. Data such as EXIF, tags/labeling, and location info can be used to help your image perform well.

He also mentioned serving a favicon with an expiration header to avoid 404 errors. Using a favicon is a good idea for obvious reasons… But did you know that your server generates a 404 (not found) error every time a browser requests the favicon if you don’t have one?
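A minimal sketch of the fix, assuming the icon lives at the default location:

```html
<!-- Point browsers at an icon that actually exists; otherwise every
     visit logs a 404 when the browser requests /favicon.ico. Serving
     the icon with a far-future Expires header means repeat visitors
     don't re-request it at all. -->
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon" />
```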

CSS Sprites

Patrick Bennett (ModernBlue) was the last to speak. One of the things that he mentioned was CSS sprites. I had heard of these being used before, but had yet to implement them myself.

How do CSS sprites work? Instead of having 80 different images for your site, you can put all of the images into 1 “master” image, and then use CSS to display the parts of the image where necessary. That’s a very brief description; for more information on CSS sprites, check out this article from A List Apart.
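Here’s a bare-bones sketch of the idea (the image path, sizes, and coordinates are hypothetical):

```html
<style>
  /* One "master" image holds every icon; background-position
     slides the sprite so only the wanted region shows through. */
  .icon        { background: url('/images/sprites.png') no-repeat;
                 display: inline-block; width: 16px; height: 16px; }
  .icon-home   { background-position: 0 0; }      /* top-left icon */
  .icon-search { background-position: -16px 0; }  /* next icon over */
</style>
<span class="icon icon-home"></span>
<span class="icon icon-search"></span>
```

The payoff is fewer HTTP requests: one image download instead of 80.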

Domain Names, Parameters, URLs, and All That Jazz

The second presentation on my schedule was another technical one. Google’s Maile Ohye was the first one to speak, and the only one that I found myself taking notes from.

She cleared some confusion on ccTLDs (country code top level domains) when she revealed that they are automatically geotargeted by Google. So if you own a .fr domain, Google will know that your site is targeting people in France. If you don’t own a ccTLD, you can manually geotarget your site in Webmaster Tools. If you do this manually, it takes about a week to go into effect.

Industrial Strength SEO

I wasn’t sure what to expect from an “industrial strength” SEO presentation, but I liked the name. It actually ended up being about SEO for large websites.

Marshall Simmonds (Define Search Strategies) spoke about his work with The New York Times, and the challenges he faced there… 22 million documents, a subscription “wall”, rigid IT, an enormous company ego, resistance to change/getting them to “get” search, a limited CMS, massive duplicate content, and crawl barriers were just some of the obstacles that he mentioned.

Marshall talked about the window of opportunity for content creators when breaking news happens. When the US Airways flight landed in the Hudson River last year, that window of opportunity was about two hours long. In that time, you had to do keyword research, come up with good content, and do whatever you needed to do to get your content published and ready so that it could be indexed by the search engines.

The New York Times missed out on that opportunity, because they didn’t do the necessary keyword research. Their headline used the word “land”. While technically the plane did land in the Hudson, everyone was searching for things like “plane crash hudson”. This is an example of a dilemma where a company has to choose between editorial integrity and organic search traffic.

The Need For Speed: Google Says It Matters

Everyone has been talking about speed (in regards to page load time), and how it’s the next big thing on Google’s radar. Because of this, I was excited for this next presentation, especially because Google’s own Maile Ohye was the first speaker.

Speed Increases Conversions

Maile made it abundantly clear that speed increases conversions. In the sample that they looked at, an improved page load time resulted in an increase in conversions and order price. That alone should be enough to convince anyone with an ecommerce store to invest some time in improving their speed.

She also said that even delays of under half a second impact business metrics. A delay of less than half a second affects the number of searches per day that a user makes, even after the delay is removed.

Anywhere from 80-90% of the end-user response time is spent on the front end, so start there. Markup, stylesheets, and JavaScript are most likely what’s bogging down your site’s load time. Not only does fixing this have the potential to help your business, but it can also reduce your overall spend on servers.

Webmaster Tools has a “Site Performance” tool that will tell you the average load time for your pages, and how slow they are when compared to other sites. It will also give you some clues as to what you can do to make the pages load faster. Another tool to check out is Page Speed. This is a plugin for Firebug that prioritizes ways to speed up your site.

Google hasn’t come out and said that performance is a factor in organic rankings… Yet. But when a faster site results in increased conversions (and increased average order prices), pageviews, and time on site, and decreases bounce rate and operating costs, it should be on everyone’s to-do list. And Google has been putting a lot of emphasis on speed, so it seems that it is only a matter of time until performance becomes one of the ranking factors.

Patrick Bennett (BLVD Status) was next to speak. He talked about using KPIs (Key Performance Indicators) to shed light on bottlenecks.

Using KPIs to Identify Bottlenecks

When it comes to monitoring your site, KPIs can be crucial. Some of these KPIs are visits, pageviews, time on site, and bounce rate. Create a custom report in your analytics tool so that you can keep an eye on them.

Perhaps you’re skeptical, and wondering “what’s the ROI?” You should see lower bounce rates, a higher number of pageviews, an increase in time spent on the site, more user interaction (which results in more conversions), and more spider interaction (which results in higher indexability).

So where do you start? Patrick suggested using a tool like YSlow to measure metrics and create a benchmark. He suggested considering a new server, though this may not be necessary. Other things on the list were server-side caching, fewer HTTP requests, Gzip compression, image compression, and last (but certainly not least), serving JavaScript and CSS as external files.
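On that last point, a quick sketch of what the change looks like (the file names are hypothetical):

```html
<!-- Inline <style> and <script> blocks are re-downloaded with every
     page view. Moving them to external files lets the browser cache
     them once and reuse them across the whole site. -->
<link rel="stylesheet" type="text/css" href="/css/site.css" />
<script type="text/javascript" src="/js/site.js"></script>
```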

Content Delivered Via Proxy For Increased Speed

Ralf Schwoebel (TradeBit) picked up where Patrick left off by making another suggestion: using a proxy. He suggested using Squid to optimize the delivery of content. Squid caches and delivers your content, reducing the number of requests that hit your origin server and lightening its load.

Ralf suggested “going global” with this idea. If you have Squid installed on VPS servers in several countries, you will essentially have a global content distribution network. This can also be helpful for companies that have websites for multiple countries – if your VPS is in France, your IP will be located in France as well, and having a French IP for a French website only works in your favor… Think geotargeting.

Page Speed & A Visitor’s Attention Span

Brian Ussery also spoke during this presentation. He started off by stating that a load time of under 0.1 second is the limit for the user to feel that the system is reacting instantaneously, meaning that no special feedback is needed except for the page to be displayed.

1 second is roughly the time limit for the user’s train of thought to stay uninterrupted. After 1 second, the user realizes that the computer, not the user, is in control. 1 second seems like nothing, but keep this in mind the next time you’re waiting for a website to load… After a second or two, you start to wonder what’s going on.

10 seconds (roughly) is the limit for keeping a user’s attention. After 10 seconds, a lot of users may go looking for another site.

Brian also said that most consumers expect a page to load in 2 seconds or less.

Still not convinced? Amazon and Google have conducted experiments with page load time.

Amazon added 100 ms to their load times. Sales dropped 1%. Google added 500 ms to their load times, and they lost 20% of their traffic.

Brian suggested splitting your “payload” into 2 parts: the part necessary to render the page, and the part that isn’t necessary. Load the second part after a page has rendered.
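A minimal sketch of that two-part approach (the file name is hypothetical):

```html
<script type="text/javascript">
  // Part 1 (everything needed to render the page) loads normally.
  // Part 2 (analytics, widgets, etc.) is injected only after the
  // page has finished loading, so it never blocks the first render.
  window.onload = function () {
    var s = document.createElement('script');
    s.src = '/js/non-critical.js';
    document.body.appendChild(s);
  };
</script>
```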

Brian also mentioned that Gzip compression generally only applies to text (HTML, JavaScript, and CSS). He said that web proxies and security software can prevent Gzip compression (they sometimes strip the Accept-Encoding header from requests), and suggested that you check your logs to make sure compressed content is actually being served.

During the discussion after the presentation, someone mentioned that APIs shouldn’t be called server-side; call them with AJAX instead, so that waiting on the API content doesn’t slow down the initial page load.
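A rough sketch of what that looks like, assuming a hypothetical endpoint and element ID:

```html
<div id="api-content">Loading…</div>
<script type="text/javascript">
  // Fetch the API content after the page itself has been delivered,
  // so a slow API response can't hold up the initial page load.
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById('api-content').innerHTML = xhr.responseText;
    }
  };
  xhr.open('GET', '/api/widget-data', true);
  xhr.send();
</script>
```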

My third day at SMX West consisted of one more traditional presentation, followed by several open-ended “Ask the ____” panels… “Ask the Linkbuilders”, “Ask the Search Engines”, etc. While these “Ask” panels were interesting, I didn’t find myself taking down nearly as many notes. So instead of creating a third post for day 3, I’ll just summarize it here.

Link Building Fundamentals

The last traditional presentation that I checked out at SMX was “Link Building Fundamentals”. First up to speak was Rand Fishkin (SEOmoz). I was looking forward to hearing Rand speak, as I’ve been following SEOmoz for quite some time now, and always check out his “Whiteboard Friday” videos.

Create A Strategic Link Acquisition Plan

Rand talked about the importance of creating a truly strategic link acquisition plan. Over the past year or so, he seems to have moved away from active link building and become a proponent of creating high-quality content/features that people want to link to on their own, without persuasion from the content creator or a 3rd party. He gave some examples of sites that have successfully made link building a part of their product.

Twitter. It is beneficial for someone to link to their own Twitter account. It exposes their account to new potential followers, and opens up a new channel of communication. Twitter makes people want to link to them; it is ingrained into the business plan as a whole.

Urbanspoon. On Urbanspoon, food reviews done by bloggers are featured right alongside those of professional food critics. There’s a sense of pride associated with this, and naturally some bloggers are going to want to brag a little. They can do this by placing an Urbanspoon badge on their blog. The badge tells people “I’m on Urbanspoon”, and naturally, it links back to Urbanspoon.

Vimeo. People love to share video. Maybe it’s their own content, maybe not. Regardless, people will always be embedding videos – on their blogs, on social networking sites, in forums, etc. Vimeo took this into account and made sure that each embedded video would earn them a backlink. When you use the “embed” feature on Vimeo, the code snippet contains a backlink to the video. It seems so simple, but it worked out incredibly well for them.
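To illustrate the mechanism (this is a generic sketch, not Vimeo’s actual snippet): the copy-and-paste embed code bundles a plain, crawlable link back to the source alongside the player, so every embed doubles as a backlink.

```html
<!-- The player itself: -->
<iframe src="http://player.example.com/video/12345"
        width="400" height="225"></iframe>
<!-- ...and the part that does the link building, traveling with
     every single embed: -->
<p><a href="http://www.example.com/video/12345">My Video</a> on Example</p>
```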

He also mentioned Last.fm, SurveyMonkey, and Scribd. When you embed their media, they earn a backlink. Many other companies have successfully worked link building into their products as well.

He then addressed some common SEO myths.

“Great content builds great links.” He said that great links come when the linker is rewarded. The examples that he gave prior to this would seem to back that up. If there’s some sort of incentive (whether or not it’s obvious) for the linker, they are much more likely to link back to you.

“Link building tactics are homogeneous.” A link building strategy that worked incredibly well for one site could easily fail for another site. Link building solutions must fit the problem.

“Only ‘X’ types of links matter.” While some links are stronger than others, every share and citation has some sort of value.

Rand wrapped up his presentation by reiterating that link building shouldn’t just be part of the marketing plan, it should be part of the product plan. Ask yourself, “is there an audience of people with websites who directly benefit when they link to me?” Continue to productize until the answer to this question is yes.

Create Community-Engaging Content

Garrett French (Ontolo) agreed with Rand – you need to do more than just create quality content. He said that it’s important to identify the key participants in your niche/space. Who is aggregating content?

He talked about creating community-engaging, reciprocity-inducing content.

An example is to conduct a group interview. Contact those key participants and ask them questions that will elicit useful responses. The questions should produce content that appeals to other people in your space, and the participants you interviewed may very well link back to the results once you’ve published them on your site. Conducting an interview like this will also familiarize those key participants with your site/brand, and will put you one step closer to becoming a key participant yourself.

He too stressed that content won’t generate links on its own, and that you have to share it with the people it would interest.

Identifying Link Opportunities

Mike Gullaksen (Covario) changed it up a bit by talking about identifying opportunities for links.

One interesting suggestion that he made was to do a full backlink analysis of your site. Create a spreadsheet, and include each backlink – the URL, the link type, and the anchor text. Go through all of these links. Could they be better? Are they linking to your site improperly? Maybe they’re using some less-than-desirable anchor text? Contact them and see if you can get them to change it. Supply them with the code that you’d like them to use.
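For example, if a site is linking to you with a bare URL or generic “click here” anchor text, you might send along something like this (the domain and anchor text here are hypothetical):

```html
<!-- The exact markup you'd like the linking site to paste in,
     with descriptive anchor text in place of "click here": -->
<a href="http://www.example.com/">Example Widgets</a>
```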

He also recommended using Google Alerts to identify link opportunities. This may seem obvious, but I’d be willing to bet that there are many people out there who aren’t doing it.

If you work in a small niche, set up an alert for your main keyword. Set the frequency to “daily”. Set some time aside to go through these alert emails once a day, or once every few days, and contact the people (when appropriate/relevant).

Be honest about what you’re looking for. If you can, tell them why you think they should link to your site. How will they/their readers benefit from it? Make sure you add a few words in the email that make it obvious that you checked out their site, so that it doesn’t look like you’re sending out a blanket email.

This concludes my summary of SMX West presentations. I do have other notes, but they’re too scattered to translate well into a blog post.

I left SMX feeling refreshed, for lack of a better term. Not refreshed physically, because I was pretty exhausted, but refreshed in terms of my attitude towards SEO. I felt much more engaged with the SEO community as a whole, and with all of the things that are currently going on in the world of search. I definitely look forward to attending another conference like this, and would encourage you to do the same if possible.

If you weren’t able to attend SMX West this year, I hope that you’ve learned something from my notes. Be sure to check out My Notes From SMX West 2010 (Part One) if you haven’t already. Thanks for reading!
