Why I Didn’t Try To Win Webperf Contest 2010
When Webperf Contest 2010 was first announced I was excited. As someone who develops and enters contests regularly, not to mention my devotion to WPO, I was instantly set on winning. The prize of an iPad would be icing on the delicious cake of first place. But that feeling faded after digging into the project. Here’s why.
As I previously discussed, the contest was to optimize a static site the organizers would provide, and provide they did. We were given a static copy of Fnac’s games and toys category page, which consisted of the main page, an iframe, some stylesheets, a handful of JS files, and about a bazillion images. I started by looking over everything, sizing it up, and deciding what to work on first. After taking it all in I realized the scope of the work was massive. The site had obviously been built years ago and maintained by different developers, with vastly different standards. What had started as a nice site had grown into a complicated, tangled mess. The initial onload was almost 12 seconds, with the page weighing in around 600 KB across 115 requests. This certainly isn’t uncommon for large companies with sizable sites like Fnac. Like I said, lots of work to do.
Getting To Work
I created a private GitHub repository, which I’ve since made public, to track my changes. Part of the contest was to create a summary of what you did and why you did it. I figured it’d be much easier to create a repository and simply use the commit history as my summary.
When it comes to websites I’m a perfectionist. It’s a blessing and a curse. I like to do the absolute best job I possibly can. If I don’t, it really bothers me, sometimes to the point of compulsively looking over code and tweaking single characters to get things just right. I’m a stickler for strict coding style and best practices. Since this site had neither, the first thing I did was try to standardize. I threw the HTML into HTML Tidy. It took a few tries to get it formatted to my liking. I also spent some significant time cleaning up and organizing the stylesheets and JS files. The combination of these tasks helped me understand much of the site’s structure, which I consider absolutely crucial to success. If you don’t understand what you’re working on, it’s going to lead to disaster.
One of the easiest things to do was remove unnecessary URL parameters. All of the links included a handful of parameters such as SID and UID. Not only did this add many kilobytes to the page size, it also posed a potential security issue: sharing a link with someone could lead to an account hijack. They should be using cookies instead of this junk. Quite frankly, no one should be using this outdated method of tracking user sessions.
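As a rough sketch of that cleanup, here’s how tracking parameters could be stripped from a link. The helper name, base URL, and example paths are mine, not from the contest code, and this uses the modern `URL` API, which wasn’t available in 2010 browsers.

```javascript
// Hedged sketch: remove session-tracking query parameters from a link.
// SID and UID are the parameter names mentioned above; everything else
// here is illustrative.
function stripTrackingParams(href, base = "https://www.fnac.com/") {
  const url = new URL(href, base);
  for (const param of ["SID", "UID"]) {
    url.searchParams.delete(param);
  }
  // Return a relative link with only the parameters worth keeping.
  return url.pathname + url.search;
}
```

For example, `stripTrackingParams("/jeux?SID=abc&UID=42&page=2")` returns `"/jeux?page=2"`, keeping the meaningful pagination parameter while dropping the session junk.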
I continued cleaning code, shrinking CSS, and crushing images. I converted all GIFs to PNGs and manually processed each one, shrinking the color palettes as much as possible while avoiding noticeable artifacts. All PNGs were processed with the imgopt script, which passes them through 3 different optimizers. The eventual goal was to merge them into a single sprite, something I never got to. I eliminated calls to document.write where possible and added asynchronous script loading with LABjs. Despite only eliminating 10 requests, I had raised my Yotta score from the starting 48 to a reasonable 56. That’s without many of the optimization techniques I planned on using, including sprites, dynamically loading images with JS (which the organizers approved), CSS & JS optimization/minification, reduction of HTML elements, and HTML minification.
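For reference, the LABjs pattern looks roughly like this. `$LAB` is the global chaining API LABjs provides: scripts download in parallel, and `.wait()` enforces execution order. The wrapper function and file names below are placeholders of mine, not Fnac’s actual assets.

```javascript
// Hedged sketch of asynchronous loading with LABjs. Assumes the LABjs
// loader has already defined the global $LAB; script paths are
// illustrative placeholders.
function loadSiteScripts() {
  return $LAB
    .script("/js/jquery.js").wait()  // framework must execute first
    .script("/js/carousel.js")       // these download in parallel...
    .script("/js/tracking.js")
    .wait(function () {
      // ...but everything has executed, in order, by this point
    });
}
```

Compared to blocking `<script>` tags and `document.write`, this lets the browser fetch everything at once while still guaranteeing dependency order.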
Is The Effort Worth It?
As you can see in the history, I did a decent amount of work before quitting. I spent over 10 hours on the site. I charge $75/h for consulting and freelance work; do the math. That’s over $750 worth of my time, had I been billing. And I wasn’t even close to being done: we’re talking at least another 15-20 hours of work. After coming to that realization I decided it wasn’t going to be worth my time. I had intended on spending another hour or two using automated tools to make the sprites and minify the CSS and JS to see what it’d do to my Yotta score, but I never got to it.
I have to commend the organizers on doing a great job running this contest. They found a massively challenging site, answered questions on Twitter, maintained a leaderboard, hosted all the contestant sites, and will handle the judging. I know how much work running a contest can be, since it’s part of my day job, and they were on top of it. I may try participating next year if they do it again, but I’ll have to accept that my entry won’t be perfect.
nice try anyway 🙂
The top contestants also spent 2 or 3 times that amount of hours; we (as organizers) didn’t expect that much!
But would a smaller page have been interesting? I mean, this page is typical of a lot of web pages today, and the fact that there is so much room for improvement allows for many different strategies, leading to a lot of lessons learned.
Also, as a judge I can assure you that we spent a lot of time trying to see what each participant did; the amount of work was really high.
I understand using a large page like this for a contest. I think for the next one you should try to find a company who needs the work done and is willing to pay the winner for their time.
I’m sure all the judges will have to put in a lot of hours reviewing the entries; with contestants investing so much time in them, reviewing them all will be a big job. Good luck!