Colin Cochrane

Colin Cochrane is a Software Developer based in Victoria, BC, specializing in C#, PowerShell, Web Development and DevOps.

ASP.NET Custom Errors: Preventing 302 Redirects To Custom Error Pages

 
You can download the HttpModule here.
 
Defining custom error pages is a convenient way to show users a friendly page when they encounter an HTTP error such as a 404 Not Found or a 500 Server Error. Unfortunately, ASP.NET handles custom error pages by responding with a 302 temporary redirect to the error page that was defined. For example, consider an application that has IIS configured to map all requests to it, and that has the following customErrors element defined in its web.config:
 
<customErrors mode="RemoteOnly" defaultRedirect="~/error.aspx">
  <error statusCode="404" redirect="~/404.aspx" />
</customErrors>

If a user requested a page that didn't exist, then the HTTP response would look something like:

http://www.domain.com/non-existent-page.aspx --> 302 Found
http://www.domain.com/404.aspx  --> 404 Not Found
Date: Sat, 26 Jan 2008 03:08:21 GMT
Server: Microsoft-IIS/6.0
Content-Length: 24753
Content-Type: text/html; charset=utf-8
X-Powered-By: ASP.NET
 
As you can see, there is a 302 redirect that occurs to send the user to the custom error page.  This is not ideal for two reasons:

1) It's bad for SEO

When a search engine spider crawls your site and comes across a page that doesn't exist, you want to make sure you respond with an HTTP status of 404 and send it on its way. Otherwise you may end up with duplicate content issues or indexing problems, depending on the spider and search engine.

2) It can lead to more incorrect HTTP status responses

This ties in with the first point, but can be significantly more serious. If the custom error page is not configured to respond with the correct status code, then the HTTP response could end up looking like:

http://www.domain.com/non-existent-page.aspx --> 302 Found
http://www.domain.com/404.aspx  --> 200 OK
Date: Sat, 26 Jan 2008 03:08:21 GMT
Server: Microsoft-IIS/6.0
Content-Length: 24753
Content-Type: text/html; charset=utf-8
X-Powered-By: ASP.NET
 
This would almost guarantee duplicate content issues for the site with the search engines, as the search spiders are simply going to assume that the error page is a normal page like any other. Furthermore, it will probably cause some website and server administration headaches, as HTTP errors won't be accurately logged, making them harder to track and identify.
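One partial mitigation for the second problem is to have the error page set its own status code in its code-behind. This only fixes the status that the error page reports (it does nothing about the initial 302), but it at least prevents the 200 OK response shown above. A minimal sketch, assuming a 404.aspx page with a hypothetical code-behind class, might look like:

Partial Class NotFoundPage
  Inherits System.Web.UI.Page

  Protected Sub Page_Load(ByVal sender As Object, ByVal e As EventArgs) Handles Me.Load
    ' Report the real error status instead of the default 200 OK.
    Response.StatusCode = 404
  End Sub

End Class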
I tried to find a solution to this problem, but I didn't have any luck finding anything, other than people who were also looking for a way to get around it.  So I did what I usually do, and created my own solution.
 
The solution comes in the form of a small HTTP module that hooks onto the HttpApplication.Error event. When an error occurs, the module checks whether the error is an HttpException. If it is, the following process takes place:
  1. The response headers are cleared (context.Response.ClearHeaders()).
  2. The response status code is set to match the actual HttpException.GetHttpCode() value (context.Response.StatusCode = statusCode).
  3. The customErrors section from the web.config is checked to see if the HTTP status code (HttpException.GetHttpCode()) is defined.
  4. If the status code is defined in the customErrors section, then the request is transferred, server-side, to the custom error page (context.Server.Transfer(customErrorsCollection.Get(statusCode.ToString).Redirect)).
  5. If the status code is not defined in the customErrors section, then the response is flushed, immediately sending the response to the client (context.Response.Flush()).

Here is the source code for the module.

Imports System.Web
Imports System.Web.Configuration

Public Class HttpErrorModule
  Implements IHttpModule

  Public Sub Dispose() Implements System.Web.IHttpModule.Dispose
    'Nothing to dispose.
  End Sub

  Public Sub Init(ByVal context As System.Web.HttpApplication) Implements System.Web.IHttpModule.Init
    AddHandler context.Error, New EventHandler(AddressOf Context_Error)
  End Sub

  Private Sub Context_Error(ByVal sender As Object, ByVal e As EventArgs)
    Dim context As HttpContext = CType(sender, HttpApplication).Context
    If (context.Error.GetType Is GetType(HttpException)) Then
      ' Get the Web application configuration.
      Dim configuration As System.Configuration.Configuration = WebConfigurationManager.OpenWebConfiguration("~/web.config")

      ' Get the section.
      Dim customErrorsSection As CustomErrorsSection = CType(configuration.GetSection("system.web/customErrors"), CustomErrorsSection)

      ' Get the collection.
      Dim customErrorsCollection As CustomErrorCollection = customErrorsSection.Errors

      Dim statusCode As Integer = CType(context.Error, HttpException).GetHttpCode

      'Clears existing response headers and sets the desired ones.
      context.Response.ClearHeaders()
      context.Response.StatusCode = statusCode
      If (customErrorsCollection.Item(statusCode.ToString) IsNot Nothing) Then
        context.Server.Transfer(customErrorsCollection.Get(statusCode.ToString).Redirect)
      Else
        context.Response.Flush()
      End If

    End If

  End Sub

End Class

The following element also needs to be added to the httpModules element in your web.config (replace the attribute values if you aren't using the downloaded binary):

<httpModules>
  <add name="HttpErrorModule" type="ColinCochrane.HttpErrorModule, ColinCochrane" />
</httpModules>
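
If your application runs under IIS 7's integrated pipeline rather than IIS 6, modules are registered in the system.webServer section instead; the equivalent entry would presumably look like this:

<system.webServer>
  <modules>
    <add name="HttpErrorModule" type="ColinCochrane.HttpErrorModule, ColinCochrane" />
  </modules>
</system.webServer>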

And there you go! No more 302 redirects to your custom error pages.
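
With the module registered, requesting the missing page from the earlier example should return the error status directly on the original URL, with no intermediate redirect, along the lines of:

http://www.domain.com/non-existent-page.aspx --> 404 Not Found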

Web Standards: The Ideal And The Reality

There has been a flurry of reactions to the IE8 development team's recent announcement about the new version-targeting meta declaration that will be introduced in Internet Explorer 8. In an article I posted on the Metamend SEO Blog yesterday, I looked at how this feature could bring IE8 and web standards a lot closer together and find the ideal balance between backwards-compatibility and interoperability. Many, however, did not share my optimism and saw this as another cop-out by Microsoft that would continue to hold back the web standards movement. Because this is a topic that involves both Internet Explorer/Microsoft and web standards, I naturally came across a lot of heated discussion. As I read more and more of it, I was once again reminded of how many people take an unreasonably hard stance on the issue of web standards and browser support. When it comes to a topic as complex as web standards and interoperability, it is crucial to consider all factors, both theoretical and practical; otherwise the discussion inevitably ends up taking on a "you're with us or against us" mentality that does little to benefit anyone.
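
For anyone who hasn't seen the announcement, the version-targeting declaration is a meta element placed in the page's head; as announced, the syntax looks something like this, with the content value naming the rendering engine the page is targeting:

<meta http-equiv="X-UA-Compatible" content="IE=8" />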

The Ideal

Web standards are intended to bring consistency to the Web. The ultimate ideal is a completely interoperable web, independent of platform or agent. The more realistic ideal is a set of rules for the creation of content that, if followed, would ensure consistent presentation regardless of the client's browser. This would allow web developers who followed these rules to be safe in the knowledge that their content would be presented as they intended for all visitors.

The Reality

Web standards are attempting to bring consistency to what is an enormously complex and vast collection of mostly inconsistent data. Even with more web pages being created that are built on web standards, there is still, and will always be, a subset of this collection that is non-standard. There will never be an entirely interoperable web, nor would anyone reasonably expect there to be. The reasonable expectation is that web standards are adopted by those who develop new content, or modify existing content, and that major web browsers will be truly standards-compliant in their presentation, so that web developers need not worry about cross-browser compatibility.

One aspect that is often forgotten is the average internet user. They don't care about standards, DOCTYPEs or W3C recommendations. All they care about is being able to visit a web site and have it display correctly, as they should. This is what puts the browser developers in a bind, because the browser business is competitive and it's hard to increase your user base if most pages on the web break when viewed with your product. A degree of backwards-compatibility is absolutely essential, and denying that is simply ignorant. This leads to something of a catch-22, however, because on the other side of the coin are the website owners who may not have the resources (be it time or money), or simply lack the desire, to redevelop their sites. They are unlikely to make a substantial investment to bring their sites up to code for the sole reason of standards-compliance unless there is a benefit in doing so, or a harm in not doing so. While the more vigorous supporters of web standards may wag their fingers at Microsoft for spending time worrying about backwards compatibility, you can be sure that if businesses were suddenly forced to spend tens of thousands of dollars to make their sites work in IE, Microsoft would be on the receiving end of a lot more than finger wagging.

I admit this was a minor rant. As a supporter of web standards, I get a great deal of enjoyment out of good, honest discourse regarding their development and future. This makes it all the more frustrating to read article after article and post after post that take closed-minded stances, becoming dams in the flow of discussion. The advancement of web standards is, and only can be, a collaborative effort, and this effort will be most productive when everyone enters into it with their ears open and their egos left at the door.

Please Don't Urinate In The Pool: The Social Media Backlash

The increasing interest of the search engine marketing community in social media has resulted in more and more discussion about how to get in on the "traffic goldrush". As an SEO, I appreciate the enthusiasm in exploring new methods for maximizing exposure for a client's site, but as a social media user I am finding myself becoming increasingly annoyed with the number of people who are set on finding ways to game the system.

The Social Media Backlash

My focus for the purposes of this post will be StumbleUpon, which is my favourite social media community by far. That said, most of what I say will be applicable to just about any social media community, so don't stop reading just because you're not a stumbler. Within the StumbleUpon community there has been a surprisingly strong, and negative, reaction to those who write articles or blog posts that explore methods for leveraging StumbleUpon to drive the fabled "server crashing" levels of traffic, or that dissect the inner workings of the stumbling algorithm in order to figure out how to get that traffic with the least amount of effort and contribution necessary.

"What Did I Do?"

When one of these people ended up on the receiving end of the StumbleUpon community's ire, they would be surprised. Instinctively, with perfectly crafted link-bait in hand, they would chronicle how they fell victim to hordes of angry stumblers, and express their disappointment while condemning the community for being so harsh. Then, with anticipation of the inevitable rush of traffic their tale would attract to their site, they would hit the "post" button and quickly submit their post to their preferred social media channels. What they didn't realize was that they were proving the reason for the community's backlash the instant they pressed "post".

Please Don't Urinate In The Pool

To explain that reason, we need to look at why people actually use StumbleUpon. The biggest reason is its uncanny ability to provide users with a virtually endless supply of content that is almost perfectly targeted to them. When this supply gets tainted, the user experience suffers, and the better the untainted experience is, the less tolerant the users will be of any tainting.

To illustrate, allow me to capitalize on the admittedly crude analogy found in the heading of this section. Let's think of the StumbleUpon community as a group of friends at a pool party. They are having a lot of fun, enjoying each other's company, when they discover someone has been urinating in the pool. The cleaner the water was before, the more everyone is going to notice the unwelcome "addition" to the water. When they find out who urinated in the pool, they are going to be understandably angry with them. To stretch this analogy a little further, you can be damned sure that they wouldn't be happy to find out that someone was telling everyone methods for strategically urinating in certain areas of the pool in order to maximize the number of people who would be exposed to the urine.

For anyone who was in that group of friends and actually used and enjoyed the pool, the idea of urinating in it wouldn't even be an option. Or, in the case of StumbleUpon, someone who actually participated in the community and enjoyed the service wouldn't want to pollute it.