
SMX Advanced 2008 in Seattle, Washington – KRONiS Update.

SMX Advanced - Seattle, June 2008

This year’s Seattle Search Marketing Expo (SMX Advanced) conference was great; aside from the uncomfortable red chairs, it went off without a hitch.

Space Needle Seattle

Monday night started with a Microsoft-sponsored party close to the conference at the Olympic Sculpture Park, with great catering, a DJ, and hors d’oeuvres.

They were offering to take pictures of people for fancy luggage tags (branded by Microsoft, of course), which was funny to watch as people drank from the open bar and posed for silly photos.

Olympic Sculpture Park, Seattle, WA

Many of the well-known SEO companies were present, and it was a nice start to the conference. I found some people I had met at SXSW and SMX Long Beach, and the networking had begun.

It did rain the entire time, but that’s alright, as we were inside the whole time anyway. Looking for cabs could get you wet and annoyed, but at least there were lots of ‘Vancouverish’ trees everywhere, and it was a beautiful location at the Bell Harbor Convention Center.

The three major search engines, Google, Yahoo! and MSN Live Search, finally announced some clarifications on how they treat the common REP (Robots Exclusion Protocol) directives.

Yahoo! Search Blog: see Yahoo!’s post about the REP directives.

Google Webmaster Central Blog: Google’s post re: the REP directives.

MSN Live Search Webmaster Central Blog: MSN’s post about the REP directives.
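For context, here’s a minimal robots.txt sketch (the paths and URLs are hypothetical, not from any real site) using the common directives all three engines confirmed they support:

```
# Hypothetical robots.txt using the common REP directives
User-agent: *
Disallow: /private/            # keep this directory out of the index
Allow: /private/press.html     # except this one page
Disallow: /*?sessionid=        # wildcard match: block session-ID URLs

# All three engines also support pointing to your sitemap here:
Sitemap: http://www.example.com/sitemap.xml
```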

There were some differing answers regarding how the search engines treat things such as the ‘nofollow’ attribute on links.

SMX Advanced Conference Panels

The MSN crew at first didn’t seem clear on what their own standards were; by the end, however, they did clarify that they don’t do anything different for nofollow links at this time. I’m still confused by their confusion. Good old Microsoft!

Google: were very clear that they don’t use nofollowed links for discovery (finding new content to index, or at least store somewhere).

Yahoo!: confirmed they do use nofollowed links for discovery.
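For reference, this is the attribute in question, at both the link level and the page level (the URL is hypothetical):

```
<!-- Link-level: a hint that this one link is not an editorial endorsement -->
<a href="http://www.example.com/untrusted/" rel="nofollow">some link</a>

<!-- Page-level: the robots meta tag version -->
<meta name="robots" content="noindex, nofollow" />
```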

I think the way to go here is to follow the industry leader, Google, and do what they say and suggest.

For example, Yahoo! was really pushing the use of Yahoo! Site Explorer to provide rewrites for URLs that are not SEO-friendly. This will drive tons of traffic to Yahoo! Site Explorer, but to be honest I think it’s a pain in the ass. Plus, the Yahoo! folks didn’t even know how to use the microphone or speak in a way you could hear them, at either of the panels where I saw Yahoo! speakers.

Google’s reps put on their ‘Google faces’; they had obviously had public-speaking training and were very easy to understand. Google’s Maile Ohye recommended that you take care of your own canonicalization issues by using cookies for session IDs and putting the exact canonical URL in your sitemap.
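Here is a minimal sketch of the session-ID half of that advice, as a hypothetical Python WSGI app (my illustration, not anything Google prescribed verbatim): the session ID lives in a cookie, so users and crawlers all request one clean URL instead of ?sid=… duplicates.

```
# Hypothetical WSGI app: issue the session ID in a cookie instead of the
# URL, so crawlers never see duplicate ?sid=... versions of a page.
import uuid
from http import cookies

def app(environ, start_response):
    jar = cookies.SimpleCookie(environ.get("HTTP_COOKIE", ""))
    headers = [("Content-Type", "text/html")]
    if "sid" not in jar:
        # New visitor: set the cookie rather than rewriting the URL.
        headers.append(("Set-Cookie", f"sid={uuid.uuid4().hex}; Path=/"))
    start_response("200 OK", headers)
    return [b"<p>One canonical URL for users and crawlers alike.</p>"]

if __name__ == "__main__":
    from wsgiref.simple_server import make_server
    make_server("", 8000, app).serve_forever()
```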

Friends at SMX Seattle

Pictured here is Maile from Google with Michael from Penwell, Colin and Pete from the UK, and Mike from San Diego. I rolled with this fun crew most of the time, pictured here at the SeoMOZ party. Kudos to Jane and Rand’s crew for always being easily accessible to discuss SEO and for giving out sweet shirts and hoodies.

Back to the technical stuff, because this can seem rather confusing: USE the sitemaps and tell Google which versions of pages to use. The key here is to put the canonical version IN the sitemap, NOT the duplicate versions, and to 301 the duplicate pages to the canonical, SEO-friendly URL.

For example (hypothetical URLs), http://www.example.com/blue-widgets/ is what would go in the XML sitemap.

And you would 301 redirect http://www.example.com/product.php?id=42&sid=abc123 to http://www.example.com/blue-widgets/.
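In sitemap and .htaccess terms, a sketch with those same hypothetical URLs (assuming Apache with mod_rewrite; your server setup may differ). The XML sitemap lists only the canonical URL:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/blue-widgets/</loc>
  </url>
</urlset>
```

And the rewrite rule 301s the parameterized duplicate to it; the trailing ? drops the query string from the target:

```
RewriteEngine On
# 301 product.php?id=42 (with or without a session ID) to the canonical URL
RewriteCond %{QUERY_STRING} (^|&)id=42(&|$)
RewriteRule ^product\.php$ http://www.example.com/blue-widgets/? [R=301,L]
```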


There’s a long road ahead of us, but at least the search engines are finally starting to work together to provide standards for webmasters that are the same for each engine. Google is obviously way ahead of the pack on this one.

Another interesting and completely unrelated topic came up when a presenter asked: how do you explain the word ‘spicy’ to a child who has never tasted anything spicy? It is pretty much impossible without them ever tasting something spicy. He also mentioned how the dictionary is a circular reference: a book full of words describing other words. You can point to a tree and say that the physical thing over there is a tree, but without that, the words describing it are all defined inside the ‘circular reference’ of the dictionary. I never thought of it that way. One for all the nerds out there, I guess.

BUZZWORD: Progressive Enhancement

Some of you may not know this term. The Wikipedia definition is as follows:

“a strategy for web design that emphasizes accessibility, semantic markup, and external stylesheet and scripting technologies. Progressive enhancement uses web technologies in a layered fashion that allows everyone to access the basic content and functionality of a web page, using any browser or Internet connection, while also providing those with better bandwidth or more advanced browser software an enhanced version of the page.” (Source: Wikipedia)

It is now recommended to start websites over using this methodology, making them accessible at every level, rather than trying to retrofit changes onto your existing sites.
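A minimal sketch of the layering (file names are hypothetical): the semantic HTML works for everyone on its own, while the external stylesheet and script are enhancements on top for browsers that can use them.

```
<!DOCTYPE html>
<html>
<head>
  <title>Progressive enhancement sketch</title>
  <!-- Layer 2: presentation, simply ignored by browsers without CSS -->
  <link rel="stylesheet" href="enhanced.css" />
</head>
<body>
  <!-- Layer 1: semantic markup; the content works with no CSS or JS -->
  <h1>Store locations</h1>
  <ul id="locations">
    <li><a href="/locations/seattle">Seattle</a></li>
    <li><a href="/locations/los-angeles">Los Angeles</a></li>
  </ul>
  <!-- Layer 3: behavior, added unobtrusively at the end -->
  <script src="enhanced.js"></script>
</body>
</html>
```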

Buying old Domains:

Some cool ways to find and buy old sites without losing their historical strength in Google were discussed. The goal is to use a TRUST to keep the WHOIS information the same, and also to buy the hosting account from the owner.

If you do NOT want to see the WHOIS information change, do the following three things:

  1. Get a lawyer and establish an intent to create a trust.
  2. Make certain of the property: take inventory, and it is a wise idea to include the hosting as part of what you are buying to avoid any WHOIS changes.
  3. The object (the domain) is the beneficiary.

So what does this legalese mean?

– Well, I’m not a lawyer, but the general idea here is to use legal means to ensure that the WHOIS registration does not change; it is also highly recommended to purchase the hosting account from the current owner of the domain.

If you are interested in this, I recommend looking up who presented it at SMX Advanced.

H1 tags – only one per page? That’s what all the SEOs say, but really, why? Any proof?

So I had an opportunity to discuss with Matt Cutts an issue with H1 tags that my company was having. He is very easy to talk with and obviously passionate about his work at Google. Even during one of the panels he was sitting with some of the folks I was hanging out with, and he was very helpful with his little notepad, describing how Google does specific things that would affect the sites he was asked about.

Matt Cutts with Michael K explaining duplicate content solutions.

At the end of it all we pretty much agreed, and Matt even said this: “You don’t need Matt Cutts.” The reason is that at this point it is pretty obvious when something is shady or not. If the work you are doing is for the search engines and not for the users, and it affects the user experience, then it could be risky.

Matt Cutts from Google talking with SEO Aaron Kronis

(Matt Cutts discussing the use of multiple H1s on pages with Aaron Kronis, using his diagram pad as always.)

We discussed the issue our programming team here at Wpromote is having with regard to using H1s at the beginning of sections rather than reserving them for just the main page header.

Almost every SEO I know swears by ‘one H1 tag per page, with the top keyword phrase for that site in it’, yet here Matt said it is alright to have MULTIPLE H1s on a page, as long as you don’t stuff too many keywords into the H1 tags and you design the site for the users.

My question to any SEOs out there (thanks to Merlin for pointing this out, btw): other than all the SEOs saying to have only one H1 per page, where is it proven to perform any differently than multiple H1s used at the heading of each section, the way the H1 tag was designed? The per-section approach keeps H1–H6 available where needed, which is great for automated pages and sites. Limiting the H1 to one use can change the programming and possibly cause you to run out of Hx levels if you get down to a depth of H5 or H6 after using up your H1 at the top of the page. Not super critical, but nonetheless not very well explained or documented in the SEO community. A sketch of the two structures follows.
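To make the two structures concrete (the content here is hypothetical):

```
<!-- The standard SEO advice: a single H1 for the whole page -->
<h1>Blue Widgets Superstore</h1>
<h2>Blue Widgets</h2>
<h2>Widget Accessories</h2>

<!-- What Matt said is alright: an H1 at the top of each section,
     leaving H2-H6 free for deeper nesting inside that section -->
<h1>Blue Widgets</h1>
<h2>Small widgets</h2>
<h1>Widget Accessories</h1>
<h2>Cases</h2>
<h3>Leather cases</h3>
```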

Other things to note: if you are using IP delivery (cloaking), then the content you serve the search engines MUST be the same as the media you are normally serving users; i.e. the text of the Flash had better look like the text you send Google, or you will be booted from the index. The question was asked, “Can you describe the video that is in Flash?” The answer from the search engines was “NO – you may use a static image, however.”
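Here is a rough sketch of that static-image option (file names are hypothetical): the image stands in for the Flash video as alternative content, rather than a text description of it.

```
<!-- Flash video with a static image as the alternative content -->
<object type="application/x-shockwave-flash" data="promo-video.swf"
        width="640" height="480">
  <param name="movie" value="promo-video.swf" />
  <!-- Allowed: a static image standing in for the video -->
  <img src="promo-video-still.jpg" width="640" height="480"
       alt="Opening frame of the promo video" />
  <!-- Not allowed: paragraphs of text 'describing' the video -->
</object>
```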

– So, if some of this was a little dry: there are a lot of exciting new things going on in the SEO community, and the new tools that Wpromote is developing (thanks to CP) for our SEO division will really help us get the best results for our clients. Welcome to the next generation of Internet Rockstars who know how to use Progressive Enhancement to build search-friendly and user-friendly websites.

I missed the SeoMOZ party, but I had to get home… back to LA.


Taken on the final approach to LAX, looking off the edge of the world.