
Interning in SEO: What it Means in 2015

Today is (unfortunately) the last day of my internship here at Wpromote before I return to USC for my senior year. I’ve had the opportunity to intern at a wide array of companies during my time in college, but this has been an experience like no other. I’ve never been one to accept a given role without, in some way, tweaking it until I am happy with the value I am adding. Often my attempts at changing and improving my role were met with mixed reviews from leadership. In spite of past experiences, I had every intention of continuing to push the expectations here. To my surprise, outside-the-box thinking was not just accepted at Wpromote, but encouraged.

Here, I have been given wide latitude as an intern. While I have yet to take this to its extreme (the goal is to get a full-time offer, after all), it has made this internship unlike any other. While I could elaborate for hours about the remarkable company culture – read: Mondays Suck Less – I want to focus on interning in this industry instead.

I’ve always been interested in online marketing because of its relative youthfulness. I recall a particularly inspiring quote from our COO at the internship panel where I met him. (Sorry if I butcher this, Block.) He said, “One of the reasons I love this industry so much is because of how new it is. I’m not an old guy and still, no one can say to me, ‘Hey I’m right, I’ve been doing this for longer,’ because that’s virtually impossible.”

Lo and behold, I start working for Wpromote, ready to take on the digital marketing space. Because, hey, the industry is young, the leadership is young, and I’m young.

Upon entering the workspace, I would soon learn that I am, in fact, not young, but rather an infant in this industry. SEO moves at a ludicrous speed. There are people who have been immersed in SEO since its beginning. There are people who got their start just a few (long) years ago. And then there are (sometimes) interns. We have read an outdated book, maybe taken a class, and come into our roles eager and ready to learn. We scour the Internet, read everything Rand Fishkin has to say, and yet this dynamic industry allows for constant innovation regardless of your background, title, or tenure.

The fact of the matter is, nobody knows everything. That’s the point. And that’s the game. So what makes great SEO? What have I learned during my time here? I’ll summarize in a few quick hints that are rooted in SEO, but that I feel should have a role in every marketing campaign.

1) Google is smart. Make your users’ lives easy and Google will do the same for you.

That’s not to say technical details don’t matter. In fact, they matter tremendously, but SEOs can easily get lost in the technical aspects—page speed, for example. I do not care that page speed is a “ranking factor.” That is not why page speed is important. Page speed is important because if users have to wait more than three seconds to see your webpage, they’re going back to the SERPs and clicking on your competitor. Goodbye sale. Goodbye lead. Goodbye conversion.

2) Don’t build links, earn them.

I have seen far too many clients come in with penalties because of linking tactics, yet today we still hear about agencies and in-house teams using questionable means just to fill a report with "acquired links." Sure, links factor into rankings, but remember why links were factored in originally? They were meant to be votes of confidence expressing the authority of your site. Getting your link on a cringe-worthy blog provides little value to your users and even less to Google. I think you forgot to read point number one.

3) Make great content. Don’t say, “make great content.”

If you’re reading this post, then you’ve undoubtedly heard a million things about the need for great content. Well, here’s a million and one. My biggest issue with the rush to create content is exactly that—the rush. Crafting content relevant to your demographic should be a no-brainer, but it has to go further than that. We need to make content that will blow their minds.

Don't search BuzzSumo or Google Trends for what's been popular in the past; make something new! The audience of your skateboard shop doesn't want to read "The Top Ten Tricks You Need to Know!" They know them all. They want to discover new music, find new spots to skate, or have the ability to customize a board on your site. Generic content doesn't drive traffic, but content with clear intent does.

By no means is this a comprehensive guide to SEO, but rather an effort to share some of my biggest takeaways from interning in this rapidly evolving industry. The truth is, as an intern, you never know what you’re going to get (typically, it’s coffee). But every once in a while, you land a gig where you truly feel you’ve made an impact, and I’m (hashtag) blessed to have found one.

It has been an honor and a privilege to work at Wpromote and I hope I will have the opportunity to come back. Thank you everyone!


Why is My Direct Traffic Increasing? Dark Search, Probably.

If you look at your Google Analytics metrics as often as I do, you have probably noticed a steady increase in your direct traffic over the past year or so. And if you're like me, it drives you crazy. Why the increase, you ask? Well, when traffic is reported as direct, we can't see how it's being acquired. Part of my job as an SEO Manager is to drive traffic to a client's website and tell them how it's being done. We SEOs rely on these metrics to make informed decisions about our campaign strategy, and direct traffic is the equivalent of "not provided" keywords: we can't do anything about it. Or we couldn't, until now.


Direct traffic, why!?!?!?


Marshall Simmonds of Define Media Group gave a great presentation at MozCon 2015, one of his main topics being this idea of "dark traffic." Dark traffic actually comprises three buckets: Dark Search, Dark Mobile, and Dark Social, but this post is specifically about Dark Search and what it means for us SEOs.


Let's first define what dark traffic is: it's a visit that can't be attributed because the URL doesn't pass along a referrer string. When this happens, Google Analytics doesn't know what to do with the visit and dumps it into direct traffic. When does this happen? More often than you'd think. Referrer strings are stripped when users move from a secure site to a non-secure one, when links are clicked inside apps, under specific browser conditions, and more.



You might be thinking, "OK, this is interesting, but why should I care?" Well, the traffic that's "dark" and being reported as direct could actually be organic or the result of a successful social campaign. As SEOs, our jobs depend on giving our clients results, and dark search is skimming off the top of our hard work. We need to know how to talk to our clients and bosses about these surges in direct traffic, and how they could actually be a result of our campaigns.


Through a segmentation process, Simmonds found that out of 215 million page views, about 18% of direct traffic was pointed at a deep link that users wouldn't actually reach directly.


That’s 18% of direct traffic that could be organic, social referrals, or even incorrectly tagged paid search. 18% of traffic that was “Dark Traffic.” How was Simmonds able to find this out? Let’s dig into his process.


1. Pull Your Direct Traffic – Remember, dark traffic is reported as direct. You won’t find it anywhere else.


2. Remove Your Homepage And Section Fronts



Your homepage probably gets legitimate direct traffic from people typing in your URL or bookmarking your page. The same goes for section fronts, or pages within your navigation menu. These are (hopefully) some of your more popular pages, and they should be excluded based on the likelihood of being bookmarked or having an easy-to-type URL. Use advanced filters in Google Analytics to accomplish this.


Filters are your friends!

Remember to exclude, not include, and use “Exactly Matching” to remove data from only that page. You don’t want to accidentally exclude an entire directory.


3. What’s Left Is “Dark Social”

Dark Social is one of the three major buckets of Dark Traffic. It's "direct traffic" reaching deep links that aren't commonly bookmarked or reached via an exact-match URL. Dark Social includes Dark Search traffic within it, which is why it has to be mentioned, but again, our focus here is Dark Search. To segment out the social and get to search, we need to take this a few steps further.


4. Compare Filtered Links With Social Campaign Metrics

The reason this bucket is Dark Social is that some social sites use HTTPS rather than HTTP. We have to cross-reference our social campaigns against the URLs left after our filters are implemented. Most major social media platforms have some type of referrer tag implemented, but you always want to double check.


5. Filter For New Users

Filtering for new users will remove users that may have bookmarked your site and returned recently, or memorized a URL and re-typed it in. After this last filter is done, what’s left is dark search.


6. I Have My Dark Search, Now What?

Analyze all of your leftover long-tail URLs. There's a good chance these URLs were not typed in directly and were not tied to a social campaign, so how else could a user have reached them? There are quite a few possibilities, but the point is that you've ruled out direct and social traffic, which leaves organic or some unknown referral source (we're assuming your paid search manager is properly tagging their campaigns). Let your client know about this segment of traffic, especially if you've seen major increases in direct traffic lately or during a period that correlates with a specific marketing campaign.
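To make the process concrete, below is a minimal sketch of this segmentation in Python with pandas. It assumes you've exported your direct-traffic landing pages from Google Analytics as a CSV with landing_page, sessions, and percent_new_sessions columns; the file names, column names, and section-front list are illustrative, not anything GA produces by default.

import pandas as pd

# Step 1: start from a GA export of direct traffic only
df = pd.read_csv("direct_traffic.csv")

# Step 2: drop the homepage and section fronts, which earn
# legitimate direct visits via bookmarks and typed URLs
section_fronts = ["/", "/about/", "/products/", "/blog/"]  # your own nav pages
df = df[~df["landing_page"].isin(section_fronts)]

# Steps 3-4: the remainder approximates Dark Social; remove any URLs
# you actually promoted in social campaigns (format should match the GA export)
promoted = set(pd.read_csv("social_campaign_urls.csv")["url"])
df = df[~df["landing_page"].isin(promoted)]

# Step 5: keep mostly-new users to exclude returning visitors who
# bookmarked or memorized a deep link
df = df[df["percent_new_sessions"] >= 0.9]

# Step 6: what's left is a rough "dark search" bucket
print("Estimated dark search sessions:", df["sessions"].sum())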


I ran some of our clients' data through this process and found that 11% of 33 million page views were dark traffic pointing to a deep link.


Traffic pulled from actual accounts.

I unfortunately couldn't cross-reference every account with each social campaign, but the data is still pretty significant. This 11% is traffic that our social media managers or SEO managers can speak to as being "dark" and not actually direct.

Dark Traffic can’t be prevented (yet) but there are some steps we can take as online marketers to cut this number down.


1. Tag All Of Your Marketing Campaigns Correctly – Whatever your preferred tagging method is, make sure any URLs you've built that point to your site are tracked (see the example URL after this list).

2. Look For Correlations Between Your Marketing Campaigns And Direct Traffic – You can't go back and tag old traffic, but you can analyze spikes in direct and look for possible causes of dark traffic.

3. Suggest Switching To HTTPS – Although this is a lot of work and should be thoroughly vetted before implementation, referrer strings are passed intact from one HTTPS site to another.
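To illustrate the first point, here's what a URL tagged with standard Google Analytics UTM parameters might look like; the parameter values are just an example naming scheme:

https://www.example.com/widgets/?utm_source=facebook&utm_medium=social&utm_campaign=summer-widget-sale

Traffic arriving through a link like that gets attributed to the campaign instead of falling into direct.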


There you have it! This is what's (probably) going on with your direct traffic. Always remember to double-check your metrics and look for possible bot traffic as well. Hopefully you can now speak to your clients, boss, or in-laws about Dark Traffic with confidence.


A Smattering of SEO News – ABC…Easy as 123…

Welcome to the Smattering my friends! The big, massive, huge, mondo super duper news this week — that surprised everyone — is that Google is now part of a larger conglomerate corporation called Alphabet. While Google will be more focused on search and its various properties like YouTube, Maps and so on, Alphabet will take some of the other stuff Google was doing and separate it out. Neat, huh? Check it out and more below!

Google News

  • Google Restructures Under New Alphabet Corporate Umbrella – Well, this was a huge surprise earlier this week, to be sure. Google just went through a seriously massive restructuring, which includes both a new CEO – Sundar Pichai – and a new umbrella company called Alphabet. Google, thankfully, will still have control of things like search, Gmail, YouTube and so on, while Alphabet will broaden to other ventures. This includes Wing, their drone delivery system, as well as Calico, an anti-aging company. It'll be fascinating to see what lies ahead for this new conglomerate. If you want more information, check out Google's blog post on the matter.
  • Google Local Pack Drops From Seven Listings To Three – In a move that will surely be felt by many local retailers who use Google Local to drive business, this week Google trimmed the Local Pack. Whereas it previously showed seven listings, users will now find only three. Phone numbers were removed as well, replaced with business hours. Thankfully, users can still expand the Local Pack to show up to twenty listings, but this is still going to be a blow to businesses outside that top three.
  • "Watch Time" Patented As A Ranking Factor – A patent granted to Google this past week proposes methods of ranking content – video or otherwise – via "watch time." The patent covers algorithmic systems that can adjust a site's rankings based on how long a user watches video content, and it can apparently be applied to other types of content as well. It'll be interesting to see when this goes into effect. Also interesting: Google has a document on optimizing for watch time, so this isn't exactly a new idea, but the patent itself is decidedly new.
  • Illyes: 404s Don’t Impact Panda – Less in the “news” and more in the “neat to know” column is the fact that earlier this week, Google’s Gary Illyes was asked if pages with 404 error codes have an impact on Panda. His answer? “Nope.” Seriously, that was it.

General News

  • Moz Ranking Study: Links Still Rule – Moz recently released their Search Engine Ranking Factors 2015 study, and it's their most detailed yet. Asking around 150 SEO experts to rate over ninety ranking factors, the study found the largest factor – in their opinion – to be domain-level link features such as trust, PageRank, quantity and so on. Falling just shy of that, in second place, were anchor text and referring domains. Google might try to tell us otherwise, but apparently, in the minds of many SEOs, links are still super important.

How to Implement Hreflang Correctly

Hreflang was introduced by Google in 2011 to help site owners serve the correct language or country-based language variation to users around the globe. All country-based Google search engines, as well as Yandex, support hreflang. If you’re targeting Bing, you’ll have to use language meta tags instead.
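For reference, a Bing language meta tag looks something like the line below, with the content value set to whatever language (and optional region) the page targets; es-mx here is just a placeholder:

<meta http-equiv="content-language" content="es-mx">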

Google gives us some examples where rel="alternate" hreflang="x" is recommended:

  • The entire website is fully translated into another language, e.g. one version in Spanish and one version in German
  • You have regional variations within a single language, e.g. one version in Spanish for users in Mexico and one version in Spanish for users in Spain
  • Your site is partially translated, with the main content in a single language, e.g. sites where only the site template (navigation menu/footer) is translated. This is very common on user-generated sites like forums and discussion boards.


So, how does an SEO implement hreflang correctly? Google gives us three methods:

1. HTML Link Element In Header

In the HTML <head> section, add a link element pointing to the language or language/country version of that webpage (the example.com URLs below are placeholders):

<link rel="alternate" hreflang="es" href="https://example.com/es/">

Note: If your site targets multiple languages or language/region variations, the hreflang markup should include each variation. This is crucial so that search engines understand the relationship between each URL variation and serve the right content to users.

<link rel="alternate" hreflang="es" href="https://example.com/es/">

<link rel="alternate" hreflang="de" href="https://example.com/de/">


2. HTTP Header For Non-HTML Files Like PDFs

Indicate a different language or language/country version of a URL in the HTTP header (again, placeholder URLs):

Link: <https://example.com/es/document.pdf>; rel="alternate"; hreflang="es"

Note: To specify multiple hreflang values in a Link HTTP header, separate the values with commas like so:

Link: <https://example.com/es/document.pdf>; rel="alternate"; hreflang="es", <https://example.com/de/document.pdf>; rel="alternate"; hreflang="de"


3. XML Sitemap

Instead of using markup, you can submit language or language/country version information in an XML sitemap.

The example sitemap below is based on the one in Search Console Help and describes a site with German and Swiss-German pages equivalent to the English version (www.example.com/english/page.html), targeting worldwide German speakers (www.example.com/deutsch/page.html) and German speakers in Switzerland (www.example.com/schweiz-deutsch/page.html). This approach requires one XML sitemap for the entire website.
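Here's what that looks like in the format Google documents, with one <url> entry shown; each of the three page versions gets its own entry carrying the same set of annotations:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/english/page.html</loc>
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/deutsch/page.html" />
    <xhtml:link rel="alternate" hreflang="de-ch" href="http://www.example.com/schweiz-deutsch/page.html" />
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/english/page.html" />
  </url>
</urlset>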



There are a few things you should be aware of:

  • The country is optional; the language is not. Hreflang works independently of country: you can specify just a language version of your site, or a language and country. For example, using the HTML link element method, this is a site that has a default version for Spanish-language users worldwide, but also offers a version of the URL for Spanish-language users in Mexico:


<link rel="alternate" hreflang="es" href="https://example.com/es/">

<link rel="alternate" hreflang="es-mx" href="https://example.com/es-mx/">


  • Search engines support ISO 639-1 for language codes and ISO 3166-1 Alpha 2 for the region. For language script variations, the script is derived from the country itself. For example, the language code for Portuguese (Brazil) is pt-BR and for Portuguese (Portugal) it is pt-PT, so the hreflang values would simply be pt-br and pt-pt.

Search Console Help refers you to ISO 15924 for language scripts, but the explanation is not that clear.

  • Although not listed in Search Console Help, Google also seems to support IETF language tags as hreflang values. I found this by reviewing the hreflang tags used on Google's own properties. For example, the language code for Spanish (Latin America & Caribbean) is es-419; 419 is the regional code used by the United Nations and one of the components of IETF language tags.
  • You should only use one out of the three methods to implement hreflang. If you’re using more than one method, this creates redundancy.
  • One method is not better than another, but Google does recommend the HTTP header method for non-HTML files.
  • X-default is an hreflang value you can use to identify a page that doesn't target a specific language or region. It's most commonly used on country-selector homepages.

The Android site has a version of its website that doesn't target a specific language or region and therefore utilizes x-default hreflang site-wide.
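In markup, the x-default annotation is just another link element; the URL below is a placeholder:

<link rel="alternate" hreflang="x-default" href="https://www.example.com/">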



In 2014, Google Search Console (then Google Webmaster Tools) added a feature to identify issues with the hreflang implementation on your site.

The most common mistakes site owners make when implementing language/country annotations are the following:

  • No Return Tags – If page A links to page B, page B must link back to page A. Annotations must be confirmed from the pages they are pointing to. Google gives site owners insights as to where the error was detected and where the return link should be.


  • Invalid Country or Language Codes – In the example below, the site uses JP to specify Japanese; however, the correct ISO 639-1 code for Japanese is actually JA. Google also provides examples of how to fix this in your site's Google Search Console dashboard.




So What Happens If I Implement Hreflang Incorrectly?

If you implement hreflang incorrectly, search engines will not be able to interpret the tag and will ignore it. Don’t believe me? Check out what Google Webmaster Trends Analyst Gary Illyes had to say…


So at this point you're probably thinking… "Zetus Lupetus, Mariel! My site has thousands of pages in over a dozen languages, and my URL structures vary from country to country. This seems like a lot of work. What's the big deal with hreflang anyway?"



And you're probably right. Depending on the website, hreflang can get very complicated…very fast. But keep in mind that the language of your users and their location are two of the biggest influencing factors in Google's search algorithm. And beyond delivering the best user experience to your site visitors globally, there may be a ranking upside: at SMX Advanced 2015, Cyrus Shepard, drawing on SearchMetrics data, identified a correlation between organic rankings and the use of properly implemented hreflang tags.


Below are some tools for hreflang tag implementation. Please note that these tools still require manual work. For example, the hreflang tag generator requires you to input every single URL and language/country variation. If you're going with the sitemap method, remember that you'll need one XML sitemap for the entire website. These tools may not be feasible depending on the size of your site and your resources.


Questions? Please follow me on Twitter @marieldoesseo.


A Smattering of SEO News – A Slow, Slow Panda

Interestingly, previous rollouts of Google algorithm updates felt close to instantaneous, but not this latest version of Panda. Oh, no no. With this version, Google has confirmed that the rollout will take months, which means we'll have months of agonizing over fluctuations in our rankings and traffic. Fun times ahead! All this and more in this week's Smattering!

Google News

  • Panda 4.2 Rolling Out Slowly For Technical Reasons – Not getting more specific than the words "technical reasons," Google's John Mueller said in a recent hangout that the most recent Panda update works similarly to previous updates, but due to those aforementioned technical reasons, it's rolling out a bit more slowly than before. You can read even more about this in an FAQ Search Engine Land put together on the rollout.
  • Google Refuses To Implement Right To Be Forgotten Globally – Google has politely and formally turned down a request from a French commission to implement Europe's Right to be Forgotten rules globally. Google stated that they refer users to the version of Google specific to their country, and that not every country has such laws. So, "no one country should have the authority to control what content someone in a second country can access." Well…
  • Possible Spam Algorithm Update Launched? – There seems to be a lot of chatter amongst black hat SEO types about their rankings going through severe fluctuations beginning this last weekend. No one is sure if this is related to Panda or not, but it could be its own thing. We’ll keep you updated as we find out more.
  • Google Hiring More SEO Professionals – According to a tip received by Search Engine Land's Barry Schwartz, Google recently put out ads looking for more SEO professionals, this time to help manage the online Google Store. These are 12-month contract jobs, and candidates will be responsible for marketing Google devices. Interesting stuff.
  • Search Analytics API Launched – Google recently launched its new Search Analytics API, allowing the creators of search-centric tools to use this new API to access a wide range of Google data. This includes top ten clicks by query and top ten pages.
  • YouTube Ditches "301+ Views" For More Accurate View Counting – For a long time, if you had a popular video on YouTube, you might see "301+ views" in the view counter. This was Google stalling the counter while it verified that views were legitimate rather than bot-driven. Google will now remove this limitation and show a more accurate view count in real time, which will lead to far more reliable numbers for your video viewership.

Tag, You’re It! Learning YouTube SEO (or Teaching an Old SEO Dog New Tricks)

I’ve been in SEO for over eleven years now, and while I’ve helped hundreds of websites overcome technical and usability hurdles in order to make their sites more user and search engine friendly, I’ve never been the best at promoting and optimizing myself. See, I have this very niche, very focused gaming blog that has brought me countless hours of joy. While it began as a mostly textual endeavor, it’s evolved into a primarily video-based enterprise, with the blog and its accompanying YouTube channel taking up the bulk of my creative efforts. As a result, I optimized my YouTube channel as I did most other sites, which seemed logical, but after a year or two I learned  doing that was a big, big mistake.

So you see, YouTube has its own type of SEO, and while much of it is rooted in typical organic SEO (titles and textual descriptions are insanely important), there are a couple of areas I completely mismanaged until I learned better. Because of this experience, I want to share what I’ve learned with you fine people, in the hopes that you will also find success with your video endeavors. The first area of importance that really blew my mind was one so simple, so transparent I can’t believe I missed it: tags.


Tag, You’re Doing It Wrong

For years, we used the meta keyword tag to tell search engines which keywords we hoped a site would rank for. Tagging YouTube videos is similar, but with an important difference. For example, if you have a site that sells widgets in various colors, you would have a meta keyword tag that included "blue widgets, red widgets, purple widgets, widget sale" and so on, which is fairly simple and straightforward. While the classic meta keyword tag is meant to include basic phrases, YouTube tags need to include not only these, but also full search phrases. So, in keeping with that site above that sells widgets, if they made a video on how to build widgets, their YouTube tags for that video could also include "how to build widgets, building widgets, how to build a widget, let's build a widget, build a widget episode one," and so on. Here's an example of some tags I came up with for a recent Let's Play video of mine:

[Screenshot: the tags on one of my recent Let's Play videos]

As you can see, there are not only the more generic phrases you would see in a meta keyword tag, but also the more specific phrases people are actually searching for.


It’s All About Proper Tag Research

How does one come up with these keywords? There are a few methods, the simplest being to do a search in YouTube and see what the search bar automatically fills in. For example, Minecraft is an insanely popular video game that garners a wide variety of Let’s Play videos, so if you wanted to see what people were searching for when they want to watch people play Minecraft, you could do something like this:

[Screenshot: YouTube autocomplete suggestions for Minecraft Let's Play searches]

Those numbers are episode numbers (Minecraft series can go on forever), but you can also see people are looking for videos of the game with particular mods, or for videos in a new series. There's also a tool I use to get keyword ideas called TubeBuddy, which is a browser extension and is actually pretty dang awesome. This tool allows you to see what tags other people are using, such as on this Minecraft video that focuses on dinosaurs, which one of the keywords above shows is rather popular:

[Screenshot: TubeBuddy showing the tags used on a Minecraft dinosaur video]


You can also click one of those keywords to get a breakdown of the usability and popularity of the keyword using various data sources such as YouTube and Google Trends:

[Screenshot: TubeBuddy's keyword breakdown, drawing on YouTube and Google Trends data]

It’ll also show you where your own channel shows up for this particular keyword (as you can see, my channel doesn’t show up at all because I don’t cover this game). 😉


The Results Of Proper YouTube Video Tagging

I began adding more varied and targeted tags to my own channel back in early May, and as you can see from this analytics chart, daily views definitely began to show some serious fluctuations, as well as an overall rise in traffic on a daily basis:

[Screenshot: daily views in YouTube Analytics since early May]

I was averaging less than 1,000 views a day; I'm currently averaging around 1,500 to 2,000 views a day. Refocusing my tags has also led to an increase in my subscription rate, as you can see here:

[Screenshot: daily subscriber growth in YouTube Analytics]

My daily subscriber averages have increased to the point where I'm within spitting distance of 3,000 subscribers on my channel as I write this. That might not sound like a lot, but for a very niche one-man operation like mine, it feels like a lot. This clearly shows that proper tagging — not just with generic terms but with full search phrases that people actually use — goes a long way toward making your videos, and your channel, more visible in the long run.

Proper tagging of YouTube videos is incredibly important in increasing their visibility, to the point where you’ll get not only more views on your videos and your channel, but more subscribers as well. The more subscribers you get, the better engagement your channel will see in terms of comments and shares, all of which go toward making your channel more visible and more valuable. I never would have known this unless a friend of mine drilled it into my head, and now, after several months of implementing these tagging changes, I’m seeing the results.


I plan to keep sharing what I'm learning about YouTube SEO — which is really SEO within SEO — as I learn it myself. While tagging was the first big step I learned toward making a more visible channel, the next step (thumbnails) was a bit more complicated, especially for someone with zero artistic ability such as myself. In my next installment I plan to go into detail about making clear, usable thumbnails for your videos, and how they can also increase visibility and value for your individual videos and your channel as a whole.

Thank you for taking the time to read this first installment of my series on properly optimizing YouTube channels and videos for search friendliness and maximum usability. I hope you found it useful, and I welcome any critiques or questions you might have in the comments below. See you next time!



A Smattering of SEO News – Google Shuts Down Another Useful Tool

Google is shutting down another of its fairly useful tools, the autocomplete API. While not crucial to SEO by any means, it was a neat API that enabled tools that could be helpful in a lot of situations. All this and more in this week's smattering.

Google News

  • Google to Shut Down Autocomplete API on August 10th – Google recently announced that it would be shutting down its long-running autocomplete API in a few weeks. When asked why, they cited that they never anticipated it being used outside of a search context, and feel that it’s so fully disconnected from providing users with a valuable search tool that it no longer “maintains the integrity” of their search functionality. It will be interesting to see which tools and websites this ultimately affects once shut down.
  • Google Reports 180% Increase In Hacked Sites Over Last Year – As part of the launch of their #NoHacked campaign, one of the pieces of data Google dropped on us is that they’ve seen a 180% increase in hacked sites over the last year alone. That, my friends, is simply staggering. As part of this new campaign, Google will provide insights into how to prevent being hacked on their blog every week, along with actionable tips and security-themed Hangouts with their staff. This will hopefully bring this crazy number down over the next year.


  • Panda Update Fallout Begins To Be Noticeable – The latest update to Panda has been slow, and while many webmasters aren’t yet noticing much in the way of fluctuations, some are starting to see some volatility in their search results, mostly in the amount of keywords being ranked. For example, one marketing agency noticed that, before it was officially announced yet after the update had begun, card-maker Hallmark went from ranking for around 29,000 keywords to around 17,000, which is a massive drop. It’s still relatively early days, but it’s fascinating to try and gain a handle on what’s happening after this latest update. We’ll keep you posted as we get more information.
  • Google Will Search For Large Discrepancies Between Desktop And Mobile Sites – When asked on Twitter about differences between mobile and desktop platforms, Google's Gary Illyes stated that while they mostly use the desktop site for rankings, they'll look for big differences between the desktop and mobile versions to check for foul play. While this seems like an obvious given, apparently it happens often enough that they check. So keep the content on your sites the same, I believe, is the takeaway.
  • Google Ignores Dynamic, Tab-Based Content – A little while back, Google dropped something of a bombshell, informing us that they don't value content hidden behind tabs or accordion scripts as highly as fully visible content. Now Google's John Mueller has given even more information on this phenomenon, stating on Stack Overflow that if content behind a tab is dynamic – meaning it's not rendered until the tab is clicked – then it won't be indexed at all. It makes sense, really: if a user can't see it even in the source code, then Google will ignore it as well.

Other News

  • Bing Traffic Drop Caused By HTTPS Switch – A week or so ago, many webmasters noticed that their traffic from Bing was dropping in their analytics tools. Bing has now admitted this was a result of their recent switch to encrypted search results, but hasn’t been clear on whether this is a bug or a permanent change. We’ll keep you updated as we find out more.



A Smattering of SEO News – A Fistful of Pandas

I usually write these news posts on Wednesdays, so at the time, there were murmurings of some “phantom update” at Google, as a lot of webmasters were noticing fluctuations in their rankings. Well, lo and behold, AFTER I wrote the news below, Google confirmed that Panda 4.2 has launched. WHEEE! Prepare to ride the roller coaster again my friends! That and more below!

Google News

  • Did A "Phantom Update" Happen Last Weekend? – Many webmasters, according to Search Engine Roundtable's Barry Schwartz, have been emailing him asking if Google had an update this past weekend, as there apparently was a lot of volatility in their rankings. One of his readers thinks it's similar to the odd update that happened last month, which Google first denied, then confirmed. We'll have to see if anything comes of it. Update: Since writing this, Google has confirmed that they launched Panda 4.2 this past weekend, but it'll take months to roll out, so keep an eye on those SERPs, people!
  • Bad HTML Validation Can Impact A Variety Of SEO-Related Factors – In a recent Google Hangout, Google's John Mueller explained that while badly validated HTML won't affect your rankings, it can affect aspects of your site that influence visibility, such as structured data, meta tags, or even links. Making sure your HTML is valid can really help your site achieve better SEO friendliness.
  • HTTPS Can Be A Rankings Tie-Breaker – In a tweet, Google's Gary Illyes stated that if two sites are neck-and-neck for the top spot in a given search engine result, Google will give the nod to the site with a proper HTTPS implementation. A "tie breaker," he called it, as it gives that site the edge to break a tie. Fascinating stuff.
  • Search Console Sitemap Indexation Bug Affecting Many Sites – Many webmasters have been noticing an odd drop in their indexed page amounts via XML sitemaps in the Search Console. Gary Illyes confirmed to Search Engine Land that they know about it, and it is indeed a bug, so if you’ve lost indexed pages, don’t freak out. Hopefully a fix will happen soon.
  • All Top-Level Domains Treated The Same By Google – With all the brouhaha surrounding many of the new top-level domains, such as .london, .how, .guru and others, Google took an official stance on whether these affect rankings or not. In a new FAQ, when asked how TLDs affect rankings, they state, "Overall, our systems treat new gTLDs like other gTLDs (like .com & .org). Keywords in a TLD do not give any advantage or disadvantage in search." Makes sense.

Other News

  • Yahoo Testing Google-Like SERPs – According to screenshots taken by intrepid Tweeters, Yahoo is playing with a Google-style Search Engine Results Page (SERP) that includes sitelinks and site-specific search boxes. It'll be interesting to see if Yahoo adopts this permanently.
  • Bing Analytics Suffering From Tracking Issue – Many webmasters have recently noticed that Bing's analytics were showing massive drops in traffic for their sites. Apparently it's due to a bug in which the wrong report segments are shown in their analytics platform. No word from Bing at the time of this writing on a fix.





5 Common On-Site Optimization Problems and How to Fix Them

In my eleven years within the SEO industry, I've spent the majority of my time diagnosing websites. It's actually my favorite part of the job: digging through code, testing navigational elements, finding things to break. It's so fun. However, I'm still boggled when I run into the same basic SEO flubups from over a decade ago. I mean, it's 2015; there's a plethora of information out there via books, websites, and conferences to give people a basic foundation of proper on-site SEO, and yet there are many, many websites that still fail at some of its most basic tenets.

Therefore, I’m going to give y’all a freebie. Five freebies to be exact. I’m going to list the five most common on-site SEO issues I run into, as well as a brief rundown on how to fix them. While these won’t be super-comprehensive, they will hopefully push you toward learning more about making your own site SEO-friendly by giving you a foundation to build upon. Now with that said, let’s dive in:

1. Missed Opportunities With Title Tags – This one surprises me the most. So much has been written about title tags that you would think anyone with a passing interest in ranking their website would know this, but nooooooo, apparently this one still flies by people. If you're reading this and asking, "What's a title tag?", go Google that and come back. I'll still be here. We good? Okay, now, what I mean by "missed opportunities" is basically just that. You wouldn't believe how many title tags I run into that are either empty (!!!), poorly written, under-optimized, or outright spammy. The key to a good title tag is both brevity and expressiveness. You want at least your top keyword in there along with your branding. So, if you're selling blue widgets, red widgets, and widget accessories, you can have a tag like "Red & Blue Widgets | Accessories | Brand Name". This covers three keywords and slips the brand name in at the end, all while staying under fifty-five characters. You can experiment with word and phrase combinations to get just the right mix that works for you, but don't get frustrated if it's difficult; this takes a while to grok.
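In your HTML, that example is a single line in the <head> (note the ampersand written as an entity so the tag validates):

<title>Red &amp; Blue Widgets | Accessories | Brand Name</title>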

2. Unengaging Meta Descriptions – While meta descriptions have had negligible weight as an SEO ranking factor for years now, they're still important for one key reason: the meta description is the blurb of text that shows up in your SERP (search engine results page) listing. Therefore, you need an engaging meta description that pulls people to your site with a strong call to action. So, sticking with the widget example, don't just say something like, "We sell blue and red widgets, and accessories." That's bland, man. Try something like, "Come check out our exceptional selection of low-priced blue and red widgets, along with accessories!" Make it call to people.
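In markup, that stronger blurb is again just one line in the <head>:

<meta name="description" content="Come check out our exceptional selection of low-priced blue and red widgets, along with accessories!">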

3. Poor Navigation – Usability is something of a fascination of mine, so it always shocks me when I come to a website that seems to actively work against its users in terms of navigation. You want to give users as many avenues as you can to move around your site. This includes not just a streamlined main menu, but navigational breadcrumbs, links to related stories or products, a well-curated HTML sitemap, and much more. If users can't find the information they're looking for, they'll bounce from your site to another where they can find it. Therefore, you need to make sure your site has multiple avenues of navigation, like those mentioned above, accessible from every page so that users can easily find what they need.

4. Misusing Image ALT Attributes – Many sites are getting more and more image-heavy, which cuts into the opportunities for on-page textual content (which is still the best stuff you can put on a site). With this increased reliance on images, ALT attribute usage is becoming more and more important. If you're unaware, ALT attributes are where you place text describing what an image is. They should always be used on images that aren't spacers or placeholders, to help give search engines context as to what the image is and why it's being used. This is especially important with images that have text in them: since search engines can't read that text, it's crucial to repeat it in the ALT attribute. If you have images without ALT attributes filled in, you need to fix this one.
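Here's a quick sketch of the difference; the file names and wording are illustrative:

<!-- Weak: gives search engines no context -->
<img src="img0042.jpg" alt="">

<!-- Better: describes the image, including any text baked into it -->
<img src="blue-widget-sale.jpg" alt="Blue widgets on sale - accessories included">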


5. Have The Proper Sitemaps – I mentioned the HTML sitemap above, which is crucial for user navigation, but both HTML and XML sitemaps are very important for search engine spiders, and you would not believe how many sites I run into that are missing one or both. Many of today's content management systems (CMS) include features to create sitemaps by default, but even then, their implementations can be a little problematic. Therefore, you need to make sure your sitemaps are up to snuff. For HTML sitemaps, if you have a larger site, you don't need to include every page, but maybe the top two levels of your navigation (such as the homepage and product categories); for smaller sites, listing every page is fine. For XML sitemaps, make sure you correctly use the priority attribute for each URL to denote its importance. For example, the homepage (and only the homepage) should be 1.0, while pages the next level down can be 0.75, and so on. These help search engines spider your site to the best of their ability, which helps your site in the long run.
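A minimal XML sitemap following that priority scheme might look like this, with placeholder URLs:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/widgets/</loc>
    <priority>0.75</priority>
  </url>
</urlset>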


Now sure, there are a myriad of other issues that can befall a website that can cause it to lose rankings, or rank poorly initially, or have spidering problems, or whatever else, but these are the ones I run into over and over again. Therefore, fixing these issues, if you have them — and I hope you don’t — can go a very long way toward making your site more valuable in the eyes of both users and search engines, which should be the ultimate goal of any website owner. Thanks for reading, and if you have any questions, don’t hesitate to hit us up in the comments! Have a great day, and happy SEOing!


A Smattering of SEO News – Late Penguins and Incorrect Languages

Welcome, my friends, to another Smattering of SEO News! This week it’s all Google, all the time! First we find out Penguin is still months away, then Google is looking for their own SEO manager and much more. Check it out!

Google News

  • Google Penguin Refresh Still Months Away – According to a Tweet from Google’s Gary Illyes, a refresh to the Penguin algorithm is still “months away.” It has been nine months since the last update to Penguin in October of last year, and before that it was a year between updates. Will we see another update to Penguin in October, which is indeed months away? We’ll have to wait and see, but as soon as we know, you’ll know.


  • Google Is Looking For An SEO Manager – Despite reports released over the years that Google tends to unfairly promote its own properties in search results, Google is indeed looking for an SEO manager to help improve organic traffic to its own pages and properties. They want at least 4 years of web development experience, 2 years of SEO experience, and quite a bit more. If you think you have what it takes, go check it out.
  • Google Clarifies Its “Don’t Ask For Links” Statement – Last week we brought you news that a Google Portuguese Webmaster blog had been updated with the language, “do not buy, sell, exchange or ask for links.” It was the portion that stated, “ask for links” that got folks curious. Search Engine Roundtable’s Barry Schwartz followed up with Google to try and get more information on this change, and Google clarified saying, “not buy, sell or ask for links that may violate our linking webmaster guidelines.” This makes a ton more sense.
  • Search Console Notifications Going Out For Incorrect Hreflang Implementation – In a new addition to their ever-growing list of webmaster notifications, Google is now sending out notifications which let site owners know that their Hreflang implementation is incorrect. This is excellent for those with multi-language sites, as Hreflang implementation has always been a bit tricky, to say the least.


  • Search Console To Soon Send Out Fewer Notifications – Google's Gary Illyes recently announced a change to how Search Console will send out most of its non-critical communications. Starting in a few weeks, Google will only notify direct site owners, rather than all property owners, of non-critical issues. In other words, if a site had an issue, only the owner of that exact property would get a notification, not the owners of related properties such as its subdomains or subfolders. It's an odd shift, to be sure.