Internet Explorer Must Be Stopped

[Chart: time breakdown of modern web design]

Casual web users probably find this chart amusing, but my fellow web professionals know how close it comes to the truth. Today, I’d like you to take note of the giant yellow section in the top right-hand corner, which indicates, “Time spent trying to get the bastard to work in Internet fucking Explorer.”

In case you don’t already know, Internet Explorer is an abysmal web browser. Here are just a few reasons why.

  • It’s insecure. Due in part to its popularity and in part to its security vulnerabilities, Internet Explorer is frequently targeted by hackers as an easy way to spread viruses. I’m not a desktop support specialist, but every time I help someone out with their computer, I advise them to download something else for their web browsing, if only out of security concerns.
  • It creates more work for web developers. CSS is the language that controls how websites are supposed to display. On almost every website I build, I have to write special CSS to deal with improper CSS implementations in the three most popular versions of Internet Explorer. Yes, that’s custom styling for each of them individually (see the sketch after this list). Internet Explorer is the only browser that needs such special handling. Multiply this by the number of websites being created by web developers every day and we’re talking about centuries of wasted man-hours.
  • It doesn’t automatically update. Google Chrome and Mozilla Firefox both update automatically. Thus, if security vulnerabilities or bugs ever crop up, they’re patched rapidly in every copy of the software connected to the internet. This is how it should work. Internet Explorer doesn’t do this; if you get an update, it only comes as part of an update to your core Windows installation. As proof, look at the large percentage of users still using Internet Explorer 6, now two full versions behind the current release. If anything has led to Internet Explorer’s pervasive security and compatibility problems, it’s this.
  • It’s always playing catch-up. Tabbed browsing, for example, really came into vogue in the past few years, but for the longest time, Internet Explorer refused to embrace it the way its competitors did. Sure, it’s got tabs now, but its implementation of tabs is terrible compared to Chrome’s or Firefox’s. One gets the sense that Microsoft wants to dictate what users want rather than listen and truly innovate.
  • It’s only popular because it is the default on computers running Windows. Period. If people were given the choice right from the start, Internet Explorer would have lost the browser battle long ago. The fact is, most people don’t even realize they have a choice.
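
To make the second point concrete, here’s a minimal sketch of the kind of per-version special handling I described above, using Microsoft’s conditional comments (the file names are hypothetical):

<link rel="stylesheet" type="text/css" href="style.css" />
<!--[if IE 6]><link rel="stylesheet" type="text/css" href="ie6.css" /><![endif]-->
<!--[if IE 7]><link rel="stylesheet" type="text/css" href="ie7.css" /><![endif]-->
<!--[if IE 8]><link rel="stylesheet" type="text/css" href="ie8.css" /><![endif]-->

Every other browser reads one stylesheet; Internet Explorer needs an extra one per version.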

If you use it, chances are good you don’t know any better; after all, it’s what came on your computer. Let this be your wake-up call. You have alternatives. Google Chrome and Mozilla Firefox are outstanding, completely free, more secure, and offer an all-around better user experience than IE.

I believe in this strongly enough that I’ve signed Ward on the Web up as a supporter of IE 6 No More. If you visit Ward on the Web in IE 6, you’ll get a special message telling you that you need to upgrade.

Personally, though, I don’t think that goes far enough. Internet Explorer has a bad enough history that the most up-to-date versions should be suspect. Thus, I proudly advocate that not only IE 6, but all versions of Internet Explorer, should be discarded in favor of better alternatives.

So, please, for your own sake and the sake of the internet, if you’re reading this in Internet Explorer, it’s time to upgrade. Download Chrome or Firefox and never look back.

Bridging the Digital Divide to Combat Poverty

[Badge: Blog Action Day 2008 (Poverty)]

What can a person do to lift himself or herself out of poverty? What can the rest of us do to help the less fortunate and combat poverty on the local, national, and even global levels?

Everyone has a different opinion. Certainly, there are many good answers to these questions, and you’ll probably hear a lot of them today. That’s because today is Blog Action Day, a day the blogosphere has singled out to discuss an important topic. This year it’s poverty. Since I’m not one to miss out on a good meme, here is my niche-appropriate take on the matter.

The one thing that separates the haves from the have-nots is most often education, not necessarily in the sense of school but in the sense of knowledge acquisition. After all, we’ve all heard of successful individuals who never so much as graduated from high school. Those with access to knowledge can develop and thrive, as individuals and as communities.

Before the modern age, knowledge was a limited resource, available only through books and teachers. Providing knowledge to the less fortunate, then, required considerable resources. That is no longer true. Compared to the significant expense of books and teachers, the cost of providing internet access is almost negligible. For developing countries, the primary expense lies in establishing the electrical and telecommunications infrastructure. Beyond that, decent computers can now be produced en masse and on the cheap.

We have the ability to give the poor access to the greatest information system the world has ever known. Imagine a world without a digital divide, where every person, regardless of their economic background or location in the world, could tap into the same global wealth of knowledge. Everyone could communicate, share, collaborate, and contribute to that knowledge equally. It would be the promise of the internet realized, knowledge leveraged for the good of all.

Granted, I’m biased in this opinion; the internet is my bread and butter. And I’m not deluded enough to think that global internet access would solve the problem of poverty on its own. Especially in developing countries, it wouldn’t be enough. Many people would have to be instructed in basic skills like reading and writing first, and access to information wouldn’t directly solve problems like hunger and healthcare.

Still, I believe it would be a great start. Knowledge is the first step to solving any problem. It’s not about elevating the poor; it’s about empowering them to elevate themselves. To paraphrase the old proverb, give a man information and he learns for a day. Teach a man to use the internet and he learns for a lifetime.

Shiver Me Timbers! jQuery Off the Starboard Bow!

Before you get confused, no, jQuery doesn’t have anything to do with pirates (at least, not that I’m aware of). In honor of International Talk Like a Pirate Day, I’ll be interspersing this post with random, distracting, inappropriate pirate jargon. Don’t blame me; blame the guys who started the meme. 😉

I’ve never been a big fan of JavaScript. Like many developers, I use it when absolutely necessary (AJAX, form validation, DOM scripting, etc.) and avoid it in favor of server-side scripting solutions the rest of the time. This saves me the hassle of writing JavaScript that is cross-browser compatible, a feat which is often more troublesome than getting CSS to behave in Internet Explorer.

That was before I got a hold of jQuery. In case you’ve never heard of it, jQuery is an open source JavaScript library that simplifies most DOM scripting tasks. I won’t go into detail on its virtues here; if you’re curious about the specifics, I encourage you to check out the jQuery website. Instead, I’ll show you just how I used it to spruce up the starboard side of the site by walking through the contents of my scripts.js file.

// Custom jQuery interaction layer for WardOnTheWeb.com by Stephen Ward
$(document).ready(function(){

...

});

Most languages have boilerplate code, and jQuery is no different. This is just the part that says, “Weigh anchor when the page is loaded and ready to manipulate.”
// Toggle search prompt
var $search_box = $('#s'); // 'var' keeps the variable out of the global scope
$search_box.focus(function(){
  if ($search_box.val() == 'search this site...')
    $search_box.val('');
});
$search_box.blur(function(){
  if ($search_box.val() == '')
    $search_box.val('search this site...');
});

Here, we see a few lines of code that toggle the “search this site…” prompt in the search box. I hate having to delete default text like this, so I made it so nobody has to. This used to be accomplished with far less code as in-line JavaScript, so, at first, it might seem like a bad example of jQuery’s seaworthiness. However, when you consider that the interaction logic is now completely removed from the semantic HTML, without any complicated selector or event-binding code, it’s pretty obvious how useful jQuery can be. It’s the holy trinity: the separation of content (HTML), presentation (CSS), and interaction (JavaScript), achieved with minimal effort.
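
For contrast, here’s roughly what the old in-line approach looked like, with the logic wedged right into the markup (a sketch from memory, not my exact old code):

<input type="text" id="s" value="search this site..."
  onfocus="if (this.value == 'search this site...') this.value = '';"
  onblur="if (this.value == '') this.value = 'search this site...';" />

Shorter, yes, but the behavior is tangled up with the content and has to be repeated on every element that needs it.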
// Set default menu states
var expanded = $('li.expanded > h2');
expanded.css('cursor', 'pointer');
jQuery.each(expanded, function(){
  $(this).text('- ' + $(this).text());
});
var collapsed = $('li.collapsed > h2');
collapsed.css('cursor', 'pointer').next('ul').css('display', 'none');
jQuery.each(collapsed, function(){
  $(this).text('+ ' + $(this).text());
});

// Switch menu states when the menu headers are clicked
$('li.toggleable > h2').click(function(){
  var menu_header = $(this);
  menu_header.next('ul').slideToggle('normal');
  if (menu_header.text().substring(0, 2) == '+ ')
    menu_header.text('- ' + menu_header.text().substring(2));
  else if (menu_header.text().substring(0, 2) == '- ')
    menu_header.text('+ ' + menu_header.text().substring(2));
});

Did I need collapsible menus for Ward on the Web? No, but I thought it would be a fun exercise to make them anyway. Here, you see the code that does it. Again, notice how easily jQuery lets me select HTML elements for manipulation. All I had to do was assign the “toggleable” class to designate a collapsible menu, along with either the “expanded” or “collapsed” class to designate its default state; a few simple jQuery selectors did the rest. This is especially handy in case I decide to add more menus later.

The visual effect is, of course, stunning. Collapsible menus come in many varieties; I programmed a very simple, in-line script for Simon over at Bloggasm that toggles the display state without animation. However, jQuery’s “slideToggle” function blows it out of the water with smooth, organic transitions. Note that most of the above code toggles the plus and minus signs in front of the menu headers; I could have accomplished the collapsing menu effect with two lines of jQuery code.
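
For the curious, a bare-bones version, assuming the same class names, would look something like this:

$('li.collapsed > h2').next('ul').hide();
$('li.toggleable > h2').click(function(){ $(this).next('ul').slideToggle('normal'); });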

Perhaps the most important part of all is what the menu looks like without the jQuery interaction layer. All of the CSS and content changes that have to do with collapsing the menus are isolated in the JavaScript. When it’s turned off, they revert to normal, non-collapsible menus. In other words, it degrades perfectly.

Overall, I’d say jQuery is a great tool that makes writing cross-browser-compliant JavaScript much simpler. Beyond that, its main benefit lies in simplifying the separation of interaction from content. No more in-line scripting; now your program logic can be just as extensible and maintainable as your CSS. Granted, I’ve still got a lot to learn about it; the above code could probably be accomplished in fewer lines by a more experienced scallywag. Still, it’s been a pleasure to discover jQuery for the first time. I only wish I had found it sooner.

Cloak and Dagger: The Many Motives for Cloaking… and their Consequences

Cloaking, the practice of showing search engine spiders different content than human visitors see, is a touchy subject in SEO circles. Typically falling into the category of “gray hat,” it can be fairly safe or incredibly risky for your site’s search engine rankings, depending on how and why it is done. Unfortunately, the growing literature around cloaking does little to dispel the confusion over acceptable use.

Oftentimes, the method doesn’t matter as much as the motive. Why are you cloaking? Is it for a legitimate reason, such as a better user experience? Or are you trying to pull a fast one on the search engines? Here are some common motives for cloaking, as well as the risks associated with each.

I’m cloaking in order to rank better.
I’ll go ahead and start with the obvious. Cloaking has a poor reputation because it can be used to manipulate search results, which of course violates Google’s webmaster guidelines. If you’re using it to rank better, you’ll get in trouble. Even if you’re using it for a legitimate reason, however, cloaking can still be construed as manipulative. Read: “Cloak at your own risk.”

I’m delivering different versions of the same content to different browsers.
Even with the best intentions, cloaking by user agent can be risky. Right out of Google’s help center, “Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.” If you can, try to find other ways (CSS hacks, for instance) to cater to different browsers. If cloaking is the only way to go, stay safe by serving up the same default version of your site to search engine spiders that you serve to most browsers.

I’m serving crawlable content to search engine spiders and dynamic content to users.
Sorry, but Google’s stance is to use alt attributes, noscript and noembed elements, and similar static-within-dynamic HTML to serve up crawlable content. Really, you should do this for the sake of all the atypical users who visit your site, not just search engine spiders. To put it simply, cloaking is not the answer to crawlability, especially when search engines are getting smart enough to crawl Flash.
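
To illustrate, a hypothetical page that builds its content with JavaScript can hand spiders (and script-less visitors) the same information with a noscript fallback; the file name and markup below are made up:

<div id="news-feed"></div>
<script type="text/javascript" src="news-feed.js"></script>
<noscript>
  <p><a href="/news">Read the latest news headlines</a> in plain HTML.</p>
</noscript>

Everyone gets equivalent content; the spider simply takes the static route.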

I’m providing restricted content to search engine spiders.
It can be tempting to give Google back-door access to all of your juicy content so it can be ranked, while at the same time password protecting it from unregistered human users. Google condones this, so it’s naturally considered permissible in SEO circles. The problem is, search engine spiders save cached copies of every page they visit. More importantly, they share these caches with users on demand. Thus, if Google can see it, any savvy user can see it. Even cookies will do little good in restricting access. Dan Goodin over at the Register shows how this exploit is being used by less scrupulous users to view restricted forum postings.

I’m delivering different content based on location.
This one has been debated and cleared as acceptable. You can perform geolocation, even to the point of blocking certain geographical regions, so long as you treat search engine spiders the same as you would any user from the same region.

My site is big and important, and I’m cloaking because I feel like it.
It’s a known fact that Google won’t penalize sites that are big and important enough, no matter what black hat tactics those sites use. Apple’s iTunes store was recently outed for cloaking. NYTimes.com, Forbes.com, CareerBuilder.com, CNet.com, Nike.com, and WSJ.com are just a few others known to cloak with impunity. The trick here is to be so big and important that users would stop searching Google if they couldn’t find your site on it. The rest of us need to keep our noses clean.

If you’ve been paying attention, there’s a common theme with cloaking. If you treat search engine spiders the same as human users, it’s probably okay. If you give them special treatment, you place your site’s rankings at risk. When in doubt, remember the first and fourth universal truths of SEO: “Don’t Game the System” and “Users First; Search Engines Second.”

Ward on the Web Entered in Blogging Idol Contest

[Badge: Blogging Idol]

After all the positive feedback I’ve received, not to mention winning that marketing contest a few weeks ago, I’ve decided to enter Ward on the Web in the Blogging Idol competition over at Daily Blog Tips. Throughout the month of July, I’ll be doing everything I can to grow my subscribership. Even short of winning, the conversational and linking benefits of participation are just too good to pass up. So, if you haven’t already subscribed, click here to have Ward on the Web delivered straight to your email or feed reader and help me become the very first Blogging Idol!