Friday, February 26, 2010

Why is "Continuous Deployment" challenging for most companies?

Continuous Deployment (CD) is one of the elements of the Lean Startup model, and the most controversial one. Simply put, it is the practice of rolling out code changes to live servers (a.k.a. production) frequently, as often as many times a day. It is an extension of the popular and critical Agile software development practice known as Continuous Integration (CI). In CI, a development team continuously integrates each member's work and runs a suite of automated tests with every change to the code repository (a.k.a. check-in). This allows the team to catch bad changes as soon as they are introduced so that they can be fixed quickly. Eric Ries took this practice a step further at his company IMVU. At IMVU, they push code to live servers as many as 50 times a day, a high mark set for others to follow. Pulling this off requires technical maturity and product management discipline. However, as much as I would love to implement it at my company, I see a few issues with extending CI into CD at most companies.
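
To make the CI-to-CD hand-off concrete, here is a minimal sketch in Python. This is an illustration under assumptions, not IMVU's actual pipeline; the test command and the deploy script name are hypothetical placeholders.

    #!/usr/bin/env python3
    # A minimal sketch of extending CI into CD: every check-in runs the
    # automated test suite, and a green build is pushed straight to the
    # live servers. Command names below are hypothetical placeholders.
    import subprocess
    import sys

    def run_test_suite() -> bool:
        # CI step: run the full automated test suite against the check-in.
        return subprocess.run(["pytest", "tests/"]).returncode == 0

    def deploy_to_production() -> None:
        # CD step: the controversial extra mile -- ship the green build.
        subprocess.run(["./deploy_to_production.sh"], check=True)

    if __name__ == "__main__":
        if run_test_suite():
            deploy_to_production()  # every green check-in goes live
        else:
            sys.exit("Tests failed; the change never reaches the users.")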

Review by the product owner. Unless you are a one-man team, you would want your changes reviewed by the product owner before you make them available to the users. In the Lean Startup world, the product owner role will most likely be filled by someone from the customer development team; at the very least, the product owner is not someone from the product development team. Even if the team is co-located and there is daily interaction between the development team and the product owner, the product owner should review the changes before we ship them out the door. The product owner would want to make sure the changes are what he/she asked for, and usually would have feedback for the team. So, it is not clear how bypassing this human review is going to work. The larger the team, the more important the review becomes; if the team is geographically distributed, it is even more important. And if you outsourced the development, I don't think you would feel comfortable with changes making it to the live servers without your review. It is not clear how the product owner role fits into CD at IMVU.

Risk vs. reward. Yes, we cannot eliminate all bad changes or bugs through tests. However, I am not sure what incremental value we would get from CD beyond CI. With CD, you may catch some bugs that slip through CI sooner (assuming you have a lot of users, which may not be the case for an early-stage startup), and it may be easier to track down the bad code (there are other ways to reduce the debug cycle) since you only need to work with a smaller set of changes. But at the same time, CD increases business risk (not just ours, but also the end users'), even if we can back out our changes with the flip of a switch. For most business applications, the reward would not outweigh the risk. In fact, it is a good practice to perform deployment during a "scheduled maintenance window" (even if you can do hot deployment) that the users know about, so that there are no surprises. However, I can see the reward outweighing the risk for IMVU and similar services.
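
For what it's worth, the "flip of a switch" back-out usually means a feature flag. Here is a minimal sketch; the in-memory flag store and the checkout functions are made up purely for illustration, standing in for a real shared config service:

    # Gate the freshly deployed code path behind a flag so it can be
    # disabled without another deployment.
    FLAGS = {"new_checkout_flow": True}  # flip to False to back out

    def is_enabled(flag: str) -> bool:
        return FLAGS.get(flag, False)

    def old_checkout(cart: list) -> str:
        return f"{len(cart)} items via the known-good flow"

    def new_checkout(cart: list) -> str:
        return f"{len(cart)} items via the just-deployed flow"

    def checkout(cart: list) -> str:
        if is_enabled("new_checkout_flow"):
            return new_checkout(cart)  # higher-risk, newly shipped path
        return old_checkout(cart)      # stable fallback path

    print(checkout(["book", "pen"]))   # takes whichever path the flag selects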

Non-backward-compatible database changes. Database changes are always the most difficult to deal with; they require more scrutiny and are not good candidates for CD. In other words, it is difficult to back out (undo) such changes if things go wrong. With iterative, incremental development, we encounter these changes more frequently. It is not very clear how IMVU handles this type of change. From the comments on their blog post on CD, it seems they handle DB changes out of band; that is, they are not part of CD.
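
To illustrate the difference, here is a small sketch (with made-up table and column names) contrasting an additive change, which old code tolerates, with a destructive one, which breaks any live code still using the old schema:

    import sqlite3

    # Made-up schema, for illustration only.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
    conn.execute("INSERT INTO users (username) VALUES ('alice')")

    # Additive change: code that only knows about `username` keeps working,
    # and "backing out" is simply never reading the new column.
    conn.execute("ALTER TABLE users ADD COLUMN nickname TEXT")
    print(conn.execute("SELECT username FROM users").fetchall())  # still fine

    # Destructive change: any live code still selecting `username` breaks
    # the moment this runs, and undoing it needs another migration (and
    # possibly data restoration) -- hence the out-of-band handling.
    # conn.execute("ALTER TABLE users RENAME COLUMN username TO login")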

Although I do not see any significant value in CD to live servers beyond CI, I definitely see value in CD to test/integration servers. In fact, we recently started doing this. It is not exactly CD; it is more like daily deployment to test servers for the product owner to review and provide feedback. This helps us deploy changes to the live server every iteration (every 2 weeks).

Are you doing CD? Am I missing something?

***********************************************************************
Update: I got a chance to speak with Eric at a recent DC Lean Startup event. At IMVU, they do not make any non-backward-compatible changes; all changes to the DB are additive. Also, he thinks it is still important to do continuous deployment to production (to catch issues that surface only under high load), even though you can (and should) catch most other issues through CI. I can see his point, but for a startup, having scalability issues is a good problem to have (and one that can be addressed easily). At the product/market fit stage, scalability is most probably not going to be an issue. So, we continue to agree to disagree on the benefit of continuous deployment.

Friday, February 05, 2010

Retrospective on ScrumPad Product/Market Fit

As I begin my Lean Startup journey with our ScrumPad service, I plan to share my experience along the way.

The first stop in the journey is "product/market fit." For the uninitiated (I am not an expert either, learning by doing), all it means is proving that there is a large enough market for the product or service that I am building. The goal at this stage of a startup's lifecycle is to identify/define that market in specific detail: who the users are, why they like our product, which business model would work best, and what a repeatable sales model looks like.

The best way to adopt the Lean Startup model is to first look back and understand where we are (we have been working on ScrumPad for 18 months now). During this time, we did not apply the Lean Startup model (we were not aware of it). We focused only on product development using the Agile software development model (the only good thing we did); customer development was largely absent. I have broken the retrospective into two parts: a 2008 retrospective and a 2009 retrospective. I have structured both in light of the Lean Startup for easier understanding.

Version 1- Free ScrumPad: 2008 Retrospective
What was our hypothesis?
It all started with an itch of ours. We looked for a tool that fit our needs and found two: Basecamp and Version One. But neither was what we were looking for. After trying both, we decided to build one we liked ourselves. And since we liked what we had built, we thought there must be others like us who would also like ScrumPad.

How long did we spend to validate this hypothesis?
6 months.

What did we do during this period?
  • We primarily focused on adding functionality (product development, not customer development). We thought more functionality = more satisfaction = more users.
  • We released incremental versions of ScrumPad every two weeks using the Agile/Scrum process.
  • We did not do anything on the customer development front. Did not do any marketing. Did not talk to customers for feedback (did not get out of the building). 
  • We completely deferred design, thinking that design and usability are secondary to functionality. We thought that once we had the functionality in, we could always add design and improve usability.
  • We did not charge for our service. We thought we needed to build out the full product first (we had not thought about an MVP) before we could start charging.
  • We did put some basic tracking in place: Google Analytics as well as some custom metrics (e.g., number of projects, users, and an overall project activity index that we defined; a hypothetical sketch of such an index follows this list). It helped us detect some early issues with signup and getting started with ScrumPad.
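
For illustration, a custom activity index could be as simple as a weighted count of project events. The event names and weights below are invented; they are not the actual ScrumPad definition.

    # A hypothetical project activity index: weighted counts of tracked
    # events, summed per project. Event names and weights are invented.
    WEIGHTS = {"story_created": 1, "comment_posted": 1, "sprint_closed": 5}

    def activity_index(events: list) -> int:
        """Sum the weights of all tracked events for one project."""
        return sum(WEIGHTS.get(e, 0) for e in events)

    # e.g., two new stories plus one closed sprint scores 2*1 + 5 = 7
    print(activity_index(["story_created", "story_created", "sprint_closed"]))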
What was the result?
We did get a constant stream of SEO traffic from search engines. This traffic resulted in a trickle of conversions, one or two every week. We received some unsolicited feedback, primarily from those who liked the product. However, when we started charging in early 2009, most of the users from 2008 stopped using the service.

New Visits   Bounce Rate   Avg. Time on Site   Pages per Visit   Conversions
3,521        48%           00:02:50            2.61              5%

What did we learn?
Our product is better than its open-source counterparts. However, we did not yet learn anything about whether we have a market as a paid service.

Version 2- Commercial ScrumPad: 2009 Retrospective 
What was our hypothesis?
If we shift focus from functionality to user interface and usability, there is a market for our service at $21/user/month with a single pricing model (no tiered pricing). I did a pricing analysis of our competitors (I must admit the pricing class from my MBA at UNC helped).

How long did we spend to validate this hypothesis? 
All of 2009.

What did we do during this period?
  • We changed our logo, updated the UI, and continued to tweak the interface to improve usability. 
  • However, we spent a significant amount of time on refactoring (rework) to fix the sprawling code base that resulted from the rapid piling up of functionality.
  • We started actively asking for feedback. We in fact incorporated a short exit interview during the cancellation process. 
  • We started charging for our service. 
  • We still did not do any CPC marketing; we wanted to focus on word of mouth. We added an affiliate program instead.
What was the result?
We continued to receive a constant stream of SEO traffic from search engines. Even with pricing in place, we saw significant improvements in all metrics except the trial conversion rate, which dropped by 1% (that makes sense, since the service is no longer free). With the "trial to subscription" business model, we finished the year with a few paying customers, but no new customers through our affiliate program.
                                                                                                           
New Visits   Bounce Rate   Avg. Time on Site   Pages per Visit   Trial Conversion   Trial to Paid
7,578        38%           00:04:23            3.72              4%                 4%

What did we learn?
Since we now have customers who are paying for our service, there exists a market willing to pay at the price point we are offering. However, we do not know who these customers are, how to reach them, or why they prefer our product over our competitors'. Also, the metrics suggest that we have room to optimize the conversion rates through A/B testing of the landing page and other aspects of the site and the service. Adding the affiliate program turned out to be premature.
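
As a sketch of what evaluating such an A/B test could look like (the visit and conversion counts below are invented, not our data):

    from math import sqrt

    def z_score(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
        """Two-proportion z-test comparing conversion rates of A and B."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p = (conv_a + conv_b) / (n_a + n_b)           # pooled rate
        se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error
        return (p_b - p_a) / se

    # e.g., 4% on the current landing page vs. 5% on a variant,
    # 2,000 visits each; |z| > 1.96 would be significant at 95%.
    print(f"z = {z_score(80, 2000, 100, 2000):.2f}")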

In hindsight, I believe that what we achieved in those 18 months or so, we could have learned, and more, in 6 months or less using Lean Startup techniques. More on what's next from here in an upcoming post.