It seems to me Microsoft is taking the wrong approach to its mandatory customer satisfaction requirement for VARs. In case VARs haven't heard, Microsoft will require all who want to achieve or maintain Gold-level status to participate in its customer satisfaction index study beginning Oct. 31.
I'm not against VARs getting an understanding of customer satisfaction levels. In fact, I believe they should do more to understand customers and their perceptions. Fact is, we here at Everything Channel do lots of perception studies, so I understand their validity and benefit.
The trouble with Microsoft's approach is that it's highly unlikely
to return much usable data.
What Microsoft should be surveying for is the total customer experience through its channel partners when its products are included in a solution.
Given where the future computing model is headed (I'm talking cloud
computing), it's going to be a lot more important to understand the total customer experience in a solution sale than to understand satisfaction with the VAR alone.
Let me use myself as a far-too-simple hypothetical example, but one that illustrates my point.
I personally own a Lenovo Reserve Edition laptop. It's a wonderful machine, encased in a leather cover, with excellent support. Trouble is, the operating system is Microsoft Vista. If I were surveyed on my satisfaction with the reseller I obtained the machine from, it would score highly. Ask me about the hardware itself and I would also be raving. But survey me about the total customer experience with this product, and the result would be much lower satisfaction numbers, dragged down exclusively by Vista. I don't need to go into all the reasons Vista is an ugly dog; we all know them. We can only hope Windows 7 is dramatically better.
So my total customer experience here is abysmal because of a single element that makes up the total experience -- that's important to note.
I'm not arguing that it isn't important to understand how solution providers perform in the market. In fact, if a VAR consistently falls short, any vendor ought to think about the value of working with that partner. If all Microsoft is trying to accomplish is developing a system that will weed out partners, then perhaps it should proceed.
But if what it really wants to do is understand what is happening in the market when a Microsoft product is baked into a solution -- and how it can be sure that all the gears that must mesh to result in a great customer experience are well-oiled -- then I think it needs to look at this a bit differently.
So here are a few suggestions: First, Microsoft needs to state clearly
what it wants to accomplish over the long term. If that is determining which portions of its partner base are poor performers, then it should say so and expect to take some heat.
If what Microsoft wants to do is help partners determine what they
can do to improve their customer satisfaction levels, then it should develop an ongoing system that regularly and specifically surveys each VAR's end users about their interactions with that VAR.
But if what it really wants is the real skinny on the entire engagement and the experience a customer comes away with, then I think it needs to go after the total customer experience. That will also give it benchmark data it can use for comparison as we move further along the path to cloud computing.