Archive | December 2010

The Importance of an Agile Process

written by Paul on December 28th, 2010 @ 12:02 PM

Over ten years ago, at the start of my career, I was vehemently opposed to “process.” I mostly thought that a process was for those who could not work independently, and that it meant slowing everyone down to mediocrity in order to keep the whole organization synchronized.

Now, looking back, I realize that I was both right and wrong. I have learned that processes are the only way to break out of mediocrity. I would like to share what changed my outlook and discuss the pitfalls of applying processes the wrong way.

A process must…

1) be changeable by everyone who operates within it:

Everyone within the process must use their observations and perspective to “own” the portions of the process that they interact with. Nothing is more destructive to a team’s productivity or an employee’s morale than seeing opportunities to improve, proposing them, and then being told by “the powers that be” that their ideas are good, but [insert traditional management smoothing line here]. This is what caused what Frederick Taylor called “soldiering” in his paper The Principles of Scientific Management: people who are “put in their place” will do the minimum required to get paid while avoiding “punishment.”

2) be seen as a way to instill discipline within a team:

Let’s face it: we are all different, and some of us have more discipline (or different kinds of discipline) than others. Even the most well-meaning and disciplined person can quickly become derailed under pressure from investors or the excitement of a new idea. Because of this normal tendency, a process can ensure that rational thought and perspective are continuously applied.

In XP there is a notion of pair programming. This tool does much to keep two minds on the same problem: where one might slip off into an interesting but unnecessary deviation within a project, the other can pull the reins, and so the pair becomes more productive because they are guiding each other as a team.

Looking back on my life, the times when I was in the best physical shape and frequented the gym were when I had a friend to train with. The dynamics of a process or team culture can take a team very far.

3) allow and even encourage experimentation and testing:

When a potential opportunity to improve the process arises, proposals for improvement must be respected and the ideas tested. Like all empirical processes, the results should speak for themselves. In fact, there is a powerful side effect of testing that should not be ignored: the Hawthorne Effect shows that simply knowing you are being measured makes you perform at a higher level. The idea or process change itself may not directly increase productivity, but just as a pedometer can get someone to move more and lose weight, knowing that you are being measured increases your focus.

4) not be driven top-down or bottom-up, but both:

Right below being told you can’t do something (see #1 above) on the list of productivity killers is being told what to do. In a perfect world your whole company would have the same epiphany about exactly how to conduct its agile process, but in the real world we need to share our ideas, persuade, and be persuaded. Kent Beck has said, “The best architectures, requirements, and designs emerge from self-organizing teams.” When people are empowered, they care less about positional warfare, and individuals are more likely to take feedback and ideas from the “top” and vice versa. Processes that evolve on principles of efficiency and are aligned with organizational goals will always be more successful than those that are imposed.

5) increase productivity:

If any aspect of a process does not contribute to the productivity and health of an organization or team, it should be changed. One of the beauties of Scrum is the team’s and organization’s focus on “inspecting and adapting.” Process changes should always result in greater productivity!


When I was younger and more allergic to the word “process,” I was right to feel that way, because much of what was presented to me as “process” was indeed a dysfunctional application of agile (or other) “processes.”

Some companies, like 37signals, deny the existence of a “process” because of this same allergy. They often talk about not having a process, but what they don’t convey to their audience (and I think it’s unfortunate) is that they DO have a process; it just isn’t written down, because it is deep within their culture. It’s great when the process is natural and doesn’t require a wiki page to describe it, but for most teams and organizations a common vocabulary and understanding of the what, when, and how is important, because some people need to adapt to a new culture.

Now that I have experienced the benefits of having just enough process to get hyper-productive, my allergy to the word itself has ceased.


Crawlable AJAX for SPEED with Rails

Recently at work we have been focusing our efforts on increasing the overall performance of our site. Many of our pages have a lot of content on them; some might say too much. Thanks to New Relic, we identified a couple of partials that were slow (they consumed ~60% of the page time), but we could not just remove them from the page, and the long-term fix was going to be put into place over the coming weeks. Long story short, we thought it would be better to speed up the initial page load and then fetch the more expensive partials asynchronously with separate AJAX calls. That way the initial page would be faster and the slow parts would be split across requests.

The Problem: Google’s Crawler doesn’t like AJAX (normally)

Googlebot still does not load JavaScript or perform AJAX calls. Because of this, we don’t get credit from Google for the content we load over AJAX, which is a bummer and a show-stopper for us. Duplicate content is bad for SEO: Google will come to our pages and, even though the user sees the relevant content as the page loads, it will “think” that our pages are mostly the same (header and footer, etc.).

The Solution: Google allows for Crawlable AJAX

On their site, Google suggests a particular approach for making AJAX-heavy sites crawlable. I won’t go into the details of how Google supports this because it’s all stated in their paper, but I did want to focus on how I implemented the solution.

Before I continue, I want to say that I was hesitant to do this because at first glance I didn’t think it would be easy or effective. I was wrong, and I apologize to my friend Chris Sloan for doubting him when he proposed the idea. (He made me include this in the post and threatened my life if I didn’t.)

Google basically wants to be able to see the ajaxified page as one whole static page, so the crawler passes an argument with the request, and in turn we are supposed to render the whole page without needing AJAX calls to fill in portions of the content.
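The scheme itself is simple. As a sketch (the URLs are made-up examples, and `crawler_request?` is my own illustrative helper name, not part of Rails), Google rewrites hashbang URLs into a query parameter that the server can detect:

```ruby
# Google's crawlable-AJAX scheme in a nutshell:
# a URL with a hashbang fragment, e.g.
# is requested by Googlebot as
# Rails exposes that query parameter as params[:_escaped_fragment_],
# so detecting the crawler can be as simple as checking for its presence.
def crawler_request?(params)
  !params[:_escaped_fragment_].nil?
end
```

When the parameter is present, we render everything inline; otherwise we serve the fast page and let the browser fill in the rest.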

I wanted to funnel the AJAX calls for different partials through a single action within our site so I didn’t have to build custom routes and custom actions for each partial, which would be extremely messy to maintain.

The Code

So here is a simple example of the approach we took:

We created a single action; on the client side, the JavaScript looks for specific classes and then makes requests to the server, passing a couple of key parameters: /ajaxified?container=bunch_o_things&url=<%= request.request_uri %>

    module AJAXified # include this in a controller (or the app controller)

      # HOWTO
      # To add a new AJAX section, do the following:
      # 1) Write a test in crawlable_spec for the new container and url
      # 2) Add the new method/container to the ALLOWED_CALLS array
      # 3) Add the new method below so it sets required instance variables

      # Containers that may be requested through the ajaxified action
      ALLOWED_CALLS = [:bunch_o_things]

      # Googlebot requests the page with the _escaped_fragment_ query
      # parameter, per Google's crawlable-AJAX scheme
      def is_crawler?
        !params[:_escaped_fragment_].nil?
      end

      private

      # Actual Instance Setting Methods Are BELOW This Line

      # Note: each method needs to return the partial/template to render

      def bunch_o_things(options = nil)
        @thing ||= Thing.find(options[:params][:id])

        @things_for_view = @thing.expensive_call

        "things/bunch_o_things" # template for this component (example path)
      end

      # Actual Instance Setting Methods Are ABOVE This Line

      public # below is the actual main ajax action

      def ajaxified
        unless ALLOWED_CALLS.include?(params[:container].to_sym)
          raise "method \"#{params[:container]}\" is not allowed for AJAX crawlable"
        end

        raw_route = ActionController::Routing::Routes.recognize_path(
          params[:url], :method => :get
        )
        request.params[:crawlable_controller] = raw_route[:controller]
        request.params[:crawlable_action]     = raw_route[:action]

        render :template => self.send(
          params[:container].to_sym, :params => request.params
        ), :layout => false
      end
    end


I needed to keep :is_crawler? from being exposed as a routable action while still making it available within views as controller.is_crawler?:

  hide_action :is_crawler?

In the controller action where the code would have normally been executed, we need to add a check for crawler so we don’t execute code that is not needed.

def show
    @thing = Thing.find(params[:id])

    if is_crawler?
      bunch_o_things(:params => params) # sets @things_for_view
    end
end
In the view:

<article id="things" data-crawlable="<%= controller.is_crawler? ? 'false' : 'true' %>">
  <% if controller.is_crawler? or request.xhr? %>
    <% @things_for_view.each do |thing| %>
      ... potentially expensive stuff ...
    <% end %>
  <% end %>
</article>

Because I had to water the code down a bit to show how it works in general, this code is not tested, nor has it been executed as is. I actually had to add more around this for the project at work in order for it to work as we needed.

The general idea here is to centralize the partial render, reduce duplication within the controller and ensure that the code that slowed the partial down to begin with is not executed when the page is not being crawled.

In the end, we were able to reduce the initial request time for our users by 60%, and Google is able to crawl our site as it always had.


Saving time with Threads the Ruby way

I have been working on some projects that require me to make multiple serial web service calls using SOAP and HTTP clients. As you might guess, without concurrency it’s a waste waiting on the network IO, and the delays accumulate: the more service calls, the slower it gets (0.5s + 1s + 2s + 1s + 1s = 5.5 seconds). Originally I wasn’t worried, because I knew I would come back and tweak the performance with threads, and today was the day to get it going. Before I got too crazy coding, I wanted to run some basic benchmarks just to see if it would really make things faster. Here is what I did (the sleep calls stand in for the network IO):

    require 'benchmark'
 do |rep|"non-threading") {
        1.upto(100) { |count|
          amount_rest = rand(4)
          # puts "##{count}: sleeping for #{amount_rest}"
          sleep(amount_rest)
          # puts "##{count}: woke up from a #{amount_rest} second sleep"
        }
      }"threading") {
        threads = []
        1.upto(100) { |count|
          threads << {
            amount_rest = rand(4)
            # puts "##{count}: sleeping for #{amount_rest}"
            sleep(amount_rest)
            # puts "##{count}: woke up from a #{amount_rest} second sleep"
          }
        }
        while > 1
          # puts "will check back soon"
          sleep(0.5)
        end
      }
    end
    benchmark       user     system      total        real
    non-threading  0.100000   0.290000   0.390000 (142.005792)
    threading      0.010000   0.020000   0.030000 (  3.182716)

As you can see, threading in Ruby works really well as long as each thread is not doing anything CPU-intensive. Even though Ruby 1.8.7 does not have native threads, its green threads overlap the IO wait nicely, as the numbers above show. When all was said and done, I ended up more than doubling the speed of our service calls, and the approach will scale a bit better if and when we have to do more requests concurrently.

I do, however, look forward to using Ruby 1.9, but this will do the trick for now.