Colin Cochrane

Colin Cochrane is a software developer based in Victoria, BC, specializing in C#, PowerShell, web development, and DevOps.

Please Don't Urinate In The Pool: The Social Media Backlash

The increasing interest of the search engine marketing community in social media has resulted in more and more discussion about how to get in on the "traffic gold rush".  As an SEO, I appreciate the enthusiasm for exploring new methods of maximizing exposure for a client's site, but as a social media user I find myself becoming increasingly annoyed with the number of people who are set on finding ways to game the system.

The Social Media Backlash

My focus for the purposes of this post will be StumbleUpon, which is my favourite social media community by far.  That said, most of what I say will be applicable to just about any social media community, so don't stop reading just because you're not a stumbler.  Within the StumbleUpon community there has been a surprisingly strong, and negative, reaction to those who write articles and blog posts that explore methods for leveraging StumbleUpon to drive the fabled "server crashing" levels of traffic, or that dissect the inner workings of the stumbling algorithm to figure out how to get that traffic with the least amount of effort and contribution necessary.

"What Did I Do?"

When one of these people ended up on the receiving end of the StumbleUpon community's ire they would be surprised.  Instinctively, with perfectly crafted link-bait in hand, they would chronicle how they fell victim to hordes of angry stumblers, and express their disappointment while condemning the community for being so harsh.  Then, anticipating the inevitable rush of traffic their tale would attract to their site, they would hit the "post" button and quickly submit their post to their preferred social media channels.  What they didn't realize was that they were demonstrating the very reason for the community's backlash the instant they pressed "post".

Please Don't Urinate In The Pool

To explain that reason, we need to look at why people actually use StumbleUpon.  The biggest draw is its uncanny ability to provide its users with a virtually endless supply of content that is almost perfectly targeted to them.  When this supply gets tainted, the user experience suffers, and the better the untainted experience is, the less tolerant the users will be of any tainting.

To illustrate, allow me to capitalize on the admittedly crude analogy found in the heading of this section.  Let's think of the StumbleUpon community as a group of friends at a pool party.  They are having a lot of fun, enjoying each other's company, when they discover someone has been urinating in the pool.  The cleaner the water was before, the more everyone is going to notice the unwelcome "addition" to the water.  When they find out who urinated in the pool, they are going to be understandably angry with them.  To stretch this analogy a little further, you can be damned sure they wouldn't be happy to find out that someone was teaching everyone methods for strategically urinating in certain areas of the pool in order to maximize the number of people exposed to the urine.

For anyone who was in that group of friends, and actually used and enjoyed the pool, the idea of urinating in it wouldn't even be an option.  Likewise, in the case of StumbleUpon, someone who actually participates in the community and enjoys the service wouldn't want to pollute it.

How Being An SEO Analyst Made Me A Better Web Developer

Being a successful web developer requires constant effort to refine your existing abilities while expanding your skill set to include the new technologies that are continually released, so when I first started my job as a search engine optimization analyst I fully expected my web development skills to dull.  I could not have been more wrong.

Before I get to the list of reasons why being an SEO analyst made me a better web developer, I'm going to give a quick overview of how I got into search engine optimization in the first place.  Search engine optimization first captured my interest when I wanted to start driving more search traffic to a website I developed for the BCDA (for which I continue to volunteer my services as webmaster).  Due to some dissatisfaction with our hosting provider I decided that we would switch providers as soon as our existing contract expired and go with one of my preferred hosts.  However, as a not-for-profit organization the budget for the website was limited and the new hosting was going to cost a little more, so I decided to set up an AdSense account to bring in some added income.

The expectations weren't high; I was hoping to bring in enough revenue through the website to cover hosting costs.  At that point I did not have much experience with SEO, so I started researching and looking for strategies I could use on the site.  As I read more and more articles, blogs and whitepapers I became increasingly fascinated with the industry, applying all of my newfound knowledge to the site as I went.  Soon after, I responded to a job posting at an SEO firm and was shortly thereafter starting my new career as an SEO analyst.

My first few weeks at the job were spent learning procedures, familiarizing myself with the various tools that we use, and, most importantly, honing my SEO skills.  I spent the majority of my time auditing and reporting on client sites, which exposed me to a lot of different websites, programming and scripting languages, and tens of thousands of lines of code.  During this process I realized that my web development skills weren't getting worse; they were actually getting better.  The following list examines the reasons for this improvement.

1) Coding Diversity

To properly analyze a site, identify problems, and offer the right solutions, I often have to go deeper than just the HTML of a website.  This means I have to be proficient at coding in a variety of different languages, because I don't believe in pointing out problems with a site unless I can recommend how best to fix them.  Learning the different languages came quickly from the sheer volume of code I was faced with every day, and got easier with each language I learned.

2) Web Standards and Semantic Markup

In a recent post, Reducing Code Bloat Part Two: Semantic HTML, I discussed the importance of semantic HTML to lean, tidy markup for your web documents.  While I have always been a proponent of web standards and semantic markup, my experience in SEO has served to solidify my beliefs.  After you have pored through 12,000 lines of markup that should have been 1,000, or spent two hours implementing style modifications that should have taken five minutes, you quickly come to appreciate semantic markup and web standards.
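To give a rough idea of the difference this makes, consider a contrived navigation snippet (an illustrative sketch, not taken from an actual client site):

<!-- Presentational markup: meaningless wrappers and inline styles -->
<div class="nav">
  <div class="navItem"><span style="font-weight:bold;"><a href="/">Home</a></span></div>
  <div class="navItem"><span style="font-weight:bold;"><a href="/products.aspx">Products</a></span></div>
</div>

<!-- Semantic markup: a plain list of links, with presentation left to CSS -->
<ul id="nav">
  <li><a href="/">Home</a></li>
  <li><a href="/products.aspx">Products</a></li>
</ul>

Multiply that saving across every template on a site and the 12,000-versus-1,000 figure stops sounding like an exaggeration.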

3) Usability and Accessibility

Once I've optimized a site to draw more search traffic I need to help make sure that traffic actually sticks around.  A big part of this is the usability and accessibility of a site.  There are a lot of other websites out there for people to visit, and they are not going to waste time trying to figure out how to navigate through a meandering quagmire of a design.  This aspect of my job forces me to step into the shoes of the average user, which is something that a lot of developers need to do more often.  It has also made me more mindful of accessibility when using features and technologies such as AJAX, ensuring that the site remains usable when a feature is disabled or unsupported, as in the sketch below.
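For example (a hypothetical snippet, not from a client site), a link that is enhanced with AJAX when script is available but still works as an ordinary link when it isn't:

<!-- loadProductsViaAjax() is a made-up function name used for illustration. -->
<!-- With JavaScript enabled, the onclick handler runs and cancels navigation; -->
<!-- with it disabled, the browser simply follows the href as usual. -->
<a href="/products.aspx" onclick="loadProductsViaAjax(); return false;">Products</a>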

4) The Value of Content

Before getting into SEO, I was among the many web developers guilty of thinking that a website's success could be ensured by implementing enough features, and that enough cool features could make up for a lack of simple, quality content.  Search engine optimization taught me the value of content, and that the right balance of innovative features and content will greatly enhance the effectiveness of both.

That covers some of the bigger reasons that working as an SEO analyst made me a better web developer.  Chances are I will follow up on this post in the future with more reasons that I am sure to realize as I continue my career in SEO.  In fact, one of the biggest reasons I love working in search engine optimization and marketing is that it is an industry that is constantly changing and evolving, and there is always something new to learn.

Google Cracks Down On Sites That Are Losing Importance

There has been a lot of controversy in the world of SEO as of late, mostly centering around the PageRank penalties that Google has been handing out to enforce its paid-link policies.  The members of the SEO community, who do enjoy voicing their opinions from time to time, increased the size of Google's index by about 2.3% blogging about it (just kidding! I love the SEO community).  As more updates to the Toolbar PageRank rolled out over the weeks, more disgruntled SEOs, webmasters and site owners came out of the woodwork to express their outrage at Google for decreasing the Toolbar PageRank of their sites.

This outrage was in the form of more blogging which, inevitably, a lot of people were reading.  For a while you would be hard-pressed to find someone who wasn't writing about the latest Google PageRank Massacre.  PageRank was on everyone's mind. 

Those of us who pay little attention to the little green bar, myself included, sat back and watched, interested in the developments regarding Google's policies and the paid-link debate.  We would sift through post after post of someone denouncing Google because their Toolbar PageRank dropped from a 3 to a 2, loudly proclaiming that they shouldn't be penalized for the two paid links they have when the site around the corner has nothing but paid links and cloaked text.

As more and more people shared tragic stories of their little green bar shrinking, I started to notice that something was missing in the SEO blog community.  No one was complaining about their ranking positions anymore, wondering how their site could possibly not be in the top ten when they had 900 characters in their meta keywords tag and were positioning a div full of keywords at -9999px to the left.  Suddenly it appeared that everyone had a perfectly optimized site with an abundance of high-value, relevant backlinks and content so useful and compelling that bounce rate was a thing of the past.  But these sites were losing Toolbar PageRank, and since they were shining beacons of optimized perfection the only explanation was that they were casualties of the PageRank Massacre.

Give me a break.

I'm going to kill the sarcasm and get to the point now.  Here are some things you might want to consider if you think you got slapped by the latest Google crackdown:

1) Does Your Site Actually Have Paid Links?

Surprisingly, there are people out there complaining about Google penalizing them in a PageRank Massacre when their sites don't actually have any paid links, or anything that even looks like a paid link.  Whether it's a cheap ploy to draw some worthless traffic or just a desire to be part of the group, it makes you look foolish.

2) PageRank Can In Fact Decrease

Take a deep breath on this one and repeat after me: "My PageRank is not guaranteed to go up on every update".  PageRank is a relative value, and if your site is not keeping up with the rest of the herd your PageRank will decrease.  This is becoming more pronounced every day as the level of competition increases, with more sites actually making an effort at SEO.
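For the skeptical, recall the formula from the original PageRank paper (Google's production version is unknown, and the toolbar value is widely believed to be a logarithmic bucketing of the underlying score):

PR(A) = (1 - d) + d * ( PR(T_1)/C(T_1) + ... + PR(T_n)/C(T_n) )

Here T_1 through T_n are the pages linking to A, C(T) is the number of outbound links on page T, and d is a damping factor (usually quoted as 0.85).  Your score is defined entirely in terms of the pages linking to you, so if the rest of the web accumulates strong links faster than your site does, your toolbar value can drop a notch without anything on your site changing.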

3) Stop Making Excuses

Unless you are in the minority of sites that were actually penalized without cause, step back and take a good, honest look at the state of your website.  Do you have quality content, user-friendly navigation, and optimized titles and descriptions?  Is it really so perfectly optimized that the only possible explanation is Google's ire?

My point here, if somewhat long-winded, is that it is easy to hop on the "Google screwed me" bandwagon when your site doesn't perform to your expectations.  It takes a little more honesty and willingness to improve yourself, and your site, to recognize the areas that need work and address them.  When it comes down to it, whether you got smacked with a PageRank penalty or simply fell behind the rest of the pack, the best thing you can do is roll up your sleeves and get back to work.  The time will be much better spent, and your site and its visitors will benefit because of it.

Using the ASP.NET Web.Sitemap to Manage Meta Data

In an ASP.NET application the web.sitemap is a very convenient tool for managing the site's structure, especially when used with an asp:Menu control.  One aspect of the web.sitemap that is often overlooked is the ability to add custom attributes to the <siteMapNode> elements, which provides very useful leverage for managing a site's meta-data.  I think the best way to explain is through a simple example.

Consider a small site with three pages: /default.aspx, /products.aspx, and /services.aspx. The web.sitemap for this site would look like this:

<?xml version="1.0" encoding="utf-8" ?>
<siteMap xmlns="http://schemas.microsoft.com/AspNet/SiteMap-File-1.0" >
  <siteMapNode url="~/" title="Home">
    <siteMapNode url="~/products.aspx" title="Products" />
    <siteMapNode url="~/services.aspx" title="Services" />
  </siteMapNode>
</siteMap>

Now let's add some custom attributes with which we can set the Page Title (the title attribute is where the asp:Menu control looks for the name of a menu item, so it's probably best to leave that as is), the Meta Description, and the Meta Keywords elements.

 

<?xml version="1.0" encoding="utf-8" ?>
<siteMap xmlns="http://schemas.microsoft.com/AspNet/SiteMap-File-1.0" >
  <siteMapNode url="~/" title="Home" pageTitle="Homepage" metaDescription="This is my homepage!" metaKeywords="homepage, keywords, etc">
    <siteMapNode url="~/products.aspx" title="Products" pageTitle="Our Products" metaDescription="These are our fine products" metaKeywords="products, widgets"/>
    <siteMapNode url="~/services.aspx" title="Services" pageTitle="Our Services" metaDescription="Services we offer" metaKeywords="services, widget cleaning"/>
  </siteMapNode>
</siteMap>

Now, with that in place, all we need is a way to access these new attributes and use them to set the elements on each page.  This can be accomplished by adding a module to your project; we'll call it "MetaDataFunctions" for this example.  In this module, add the following procedure.

 

Imports System.Web
Imports System.Web.UI
Imports System.Web.UI.HtmlControls

Public Module MetaDataFunctions

    Public Sub GenerateMetaTags(ByVal TargetPage As Page)
        ' Requires a <head runat="server"> in the page (or its master page).
        Dim head As HtmlHead = TargetPage.Header
        Dim meta As New HtmlMeta

        If SiteMap.CurrentNode IsNot Nothing Then
            meta.Name = "keywords"
            meta.Content = SiteMap.CurrentNode("metaKeywords")
            head.Controls.Add(meta)

            meta = New HtmlMeta
            meta.Name = "description"
            meta.Content = SiteMap.CurrentNode("metaDescription")
            head.Controls.Add(meta)

            ' Pull the page title from our custom pageTitle attribute.
            TargetPage.Title = SiteMap.CurrentNode("pageTitle")
        Else
            ' The current URL isn't in the sitemap, so fall back to defaults.
            meta.Name = "keywords"
            meta.Content = "default keywords"
            head.Controls.Add(meta)

            meta = New HtmlMeta
            meta.Name = "description"
            meta.Content = "default description"
            head.Controls.Add(meta)

            TargetPage.Title = "default page title"
        End If
    End Sub

End Module

 

Then all you have to do is call this procedure from the Page_Load event, like so...

 

Protected Sub Page_Load(ByVal sender As Object, ByVal e As EventArgs) Handles Me.Load

    GenerateMetaTags(Me)    

End Sub

 

...and you'll be up and running.  Now you have a convenient, central location where you can see and manage your site's meta-data.
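If you would rather not repeat that Page_Load handler on every page, one option (a hypothetical sketch, not part of the original example) is to move the call into a common base class and have your pages inherit from it:

Imports System
Imports System.Web.UI

' Hypothetical base class: any page inheriting from it gets its
' meta tags and title generated automatically on every load.
Public Class MetaDataPage
    Inherits Page

    Protected Overrides Sub OnLoad(ByVal e As EventArgs)
        MyBase.OnLoad(e)
        GenerateMetaTags(Me)
    End Sub
End Class

Each code-behind would then declare Inherits MetaDataPage instead of Inherits System.Web.UI.Page, and the web.sitemap remains the single place where the meta-data lives.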

Missing Link Data in Google Webmaster Tools

Late last night I noticed that the link data for my sites was missing in Google Webmaster Tools.  I assumed this was the result of some routine maintenance on Google's end and thought nothing of it.  Apparently this may not be the case, as I came in to work this morning and discovered that the link data was still missing, and was also missing for every single one of our clients' sites (both external links and internal links).

I'm not going to throw out wild theories about what Google is up to, but it will certainly be interesting to see if anything comes of this latest development.  If I had to hazard a guess, I would say the most likely explanation is an update of the link database.  But when it comes to Google, as we all know, the only people who really know what's happening are the people at Google.