February 22, 2013 | By Rey Villar
Here are two real-life scenarios from past clients: “analyze 13,000 reports slated for a migration” and “assess 3,500 mainframe jobs for a re-platform,” each followed by the same request: “tell us what we really need.”
The sheer number of inputs to a requirements gathering effort can be daunting. The solution is to use the right technique to find the pay dirt. One such approach focuses on business capabilities.
The analysis typically starts with a comprehensive list of jobs and/or reports. Inventories are an easy starting point because they can be exported en masse off a mainframe or out of a reporting tool to a spreadsheet format.
Inventories are good, because they are all-inclusive. That being said, inventories are bad, because they are all-inclusive. If it’s your job to vet this massive list, it’s easy to get lost in the weeds and discouraged.
We need to think in terms of mining the value out of these inventories some other way. The 80/20 rule can provide a lot of leverage here.* At the same time, we also need to satisfy both the business and technical sides of the house.** The solution understood by all is to drive the analysis using the business capabilities that will be supported by the technology.
Let’s work through the details.
The first step is to determine the essential business capabilities. Here are a few samples from the healthcare payer data space:
- Set up and maintain members
- Set up and maintain providers
- Pay broker commissions
- Generate invoices
Each capability has a set of enabling technologies. Identify the supporting technology, including systems that create and consume the data. An architecture diagram may be useful here. A Harvey ball diagram, mapping each system against data domains found within the system, can be helpful as well.
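The system-vs-domain mapping can be captured as a simple matrix long before anyone draws the actual Harvey ball diagram. Here is a minimal sketch in Python; the system names, data domains, and coverage levels are hypothetical placeholders, not from any real client.

```python
# Harvey-ball-style matrix: each system scored against the data domains it
# holds ("full", "partial", or "none" coverage). All names are illustrative.
coverage = {
    "Claims System":     {"Member": "partial", "Provider": "full", "Commission": "none"},
    "Enrollment System": {"Member": "full",    "Provider": "none", "Commission": "none"},
    "Billing System":    {"Member": "partial", "Provider": "none", "Commission": "full"},
}

# Invert the matrix: for each data domain, list the systems that create or
# consume it. This is the "supporting technology" view per capability area.
domains = {}
for system, row in coverage.items():
    for domain, level in row.items():
        if level != "none":
            domains.setdefault(domain, []).append(system)

print(domains)
```

Inverting the matrix this way answers the question the architecture step is really asking: for any data domain a capability touches, which systems are in play?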
Here comes the fun. Gather the SMEs who are experts in their respective areas, and with the capabilities, inventory, and architecture in hand, link each capability to the specific jobs and reports it needs. Think of it as reverse-engineering the inventory: match the capabilities to the required reports and jobs, and the obsolete, redundant, and irrelevant items will fall off the list.
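Once the SMEs have linked capabilities to inventory items, the cull itself is mechanical. Here is a minimal sketch of that step, assuming hypothetical job and report names; in practice the mapping would come out of the SME working sessions and the inventory out of the mainframe or reporting-tool export.

```python
# Capabilities mapped to the jobs/reports SMEs identified as supporting them.
# All names below are hypothetical placeholders.
capability_map = {
    "Set up and maintain members": ["MBR_LOAD_JOB", "MBR_AUDIT_RPT"],
    "Pay broker commissions":      ["COMM_CALC_JOB", "COMM_STMT_RPT"],
    "Generate invoices":           ["INV_GEN_JOB"],
}

# The full inventory, exported en masse to a spreadsheet format.
inventory = ["MBR_LOAD_JOB", "MBR_AUDIT_RPT", "COMM_CALC_JOB",
             "COMM_STMT_RPT", "INV_GEN_JOB", "LEGACY_RPT_001", "TEMP_FIX_JOB"]

# Anything linked to a capability stays; everything else falls off the list.
required = {item for items in capability_map.values() for item in items}
keep = [item for item in inventory if item in required]
cull = [item for item in inventory if item not in required]
```

The `cull` list is still worth a quick sanity pass with the business, but the point is that the capability map, not the raw inventory, drives the review.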
If you do your job right, you should end up with a manageable worksheet that can be reviewed with the business at a high level in hours, not days (or weeks).
Here’s the scorecard from the client experiences above:
- 13,000 reports culled to a top 100, reduced further to a top 20
- 3,500 mainframe jobs distilled to just under a hundred
Personally, I can’t wrap my noggin around 13,000 reports, but I can comfortably deal with 20. Using the business capabilities approach can cut the ratio of inputs to outputs by more than 100 to 1. That means we’ve leveraged Pareto-style thinking to concentrate on what is really important to the business, while reducing our scope by 99%. That’s a big win.
The culled list might not satisfy the entire crowd, but you should have enough breadth and depth to your analysis that everyone will come to the table for further discussions.
*See my blog “Scale Your Data Efforts Using the Pareto Principle”
**See my blog “Breaking the Wall Between Business and IT”