It sounds like the company in this article is doing everything right - they're following Dave McClure's cheeky metrics plan, they're using Mixpanel, and their numbers are going up. Yay metrics! Unfortunately, they've made a fatal mistake: the metric they are measuring is worthless.
What if Google tracked site actions per user? What if they celebrated every time that metric went up?
The important thing to remember is that you need to measure the bottom line with every test you can. Do the people with feature A resubscribe? Do they create more referrals? Do they make you more money? Money is what matters in the end, so measure that.
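To make that concrete, here's a minimal sketch of what "measure the bottom line per test" could look like - all the cohort names and numbers below are made up for illustration, not from the article:

```python
# Hypothetical A/B cohorts: did users who saw feature A actually make more money?
# (all figures are invented for illustration)
cohort_a = {"users": 500, "resubscribed": 120, "revenue": 6000.0}  # saw feature A
cohort_b = {"users": 500, "resubscribed": 95,  "revenue": 4750.0}  # control

def bottom_line(cohort):
    """Per-user revenue and resubscription rate - the metrics that matter."""
    return {
        "revenue_per_user": cohort["revenue"] / cohort["users"],
        "resubscribe_rate": cohort["resubscribed"] / cohort["users"],
    }

print(bottom_line(cohort_a))  # {'revenue_per_user': 12.0, 'resubscribe_rate': 0.24}
print(bottom_line(cohort_b))  # {'revenue_per_user': 9.5, 'resubscribe_rate': 0.19}
```

Whatever the test, the comparison ends at revenue per user, not at "actions per user".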
Increased interaction does have value because, as mentioned in the article, it keeps users coming back to the service -> more likely to find it useful -> more likely to use the paid service when their trial ends.
The two metrics given as examples are just that - examples. Various other, more "bottom line" oriented metrics - like which user account types interact the most, and how often users return before the trial ends - are also being measured.
However, it's not all about the "bottom line". The second of the two metrics doesn't tell you anything about how much revenue is being generated - as is explained, it's there to help find out which fields users require additional advice on, so those fields can potentially be improved.
Measuring the "bottom line" is important, but it's not important for every single metric.
Actually, such "interaction" metrics are only useful if it's proven that they ultimately lead to an increase in site objectives (notice I didn't say sales). An increase in interaction is a much stronger data point if you analyze the historical data and find that the correlation between interaction per user and site goals (sales) per user is positive and high.
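That historical check is just a correlation calculation. A minimal sketch, with entirely made-up per-user numbers standing in for your historical data:

```python
import math

# Hypothetical historical data (invented numbers): for each user, how many
# interactions they logged vs. how many sales they generated.
interactions = [3, 8, 2, 12, 7, 15, 4, 10]
sales        = [0, 2, 0, 4,  1, 5,  1, 3]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(interactions, sales)
print(round(r, 2))  # high positive r => interaction is a meaningful proxy for sales
```

If r comes out near zero (or negative) on your real data, celebrating a rise in interaction is celebrating noise.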
No point enjoying the increase in pageviews if it in no way affects the bottom line. This is generally what happens with sudden bursts of traffic from Digg, Reddit, etc. While traffic increases tremendously, the conversions (not conversion rate) remain the same.
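The conversions-vs-conversion-rate distinction is easy to see with a little arithmetic - the numbers below are invented, but the shape is typical of a Digg/Reddit burst:

```python
# Hypothetical traffic spike (invented numbers): pageviews explode,
# absolute conversions barely move, so the conversion rate craters.
normal = {"visitors": 1_000, "conversions": 20}
spike  = {"visitors": 20_000, "conversions": 22}

for label, day in [("normal", normal), ("spike", spike)]:
    rate = day["conversions"] / day["visitors"]
    print(f"{label}: {day['conversions']} conversions, rate {rate:.2%}")
# normal: 20 conversions, rate 2.00%
# spike: 22 conversions, rate 0.11%
```

Twenty times the traffic, roughly the same bottom line - which is exactly why the spike isn't worth celebrating.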
Eric Ries likes to call them vanity metrics ( http://www.fourhourworkweek.com/blog/2009/05/19/vanity-metri... ). They are great for making you feel good about yourself, but they don't necessarily improve anything.