Selecting The Right Use Case For Your Big Data Implementation
Organizations have come to the realization that data is a core part of their strategy and that a scalable distributed computing platform is central to their technology investment. A challenge big data practitioners face, however, is deciding which use case to implement first on the journey toward realizing their big data strategy. In reality, multiple items need to be addressed: choosing the right technology, securing the requisite funding, and hiring the right technical talent. But identifying the right use case, with a defined success outcome, is the most crucial starting point for a big data project.
Identifying the Right Use Case
Many big data implementations fail for one reason or another, and a central part of the problem comes down to not identifying the right use case, along with clear criteria for what success looks like. This article dives into how to identify the use case that, when implemented, will drive the greatest value and create champions in your organization for further technology investment. From my experience working with Fortune 500 enterprises, here are examples to help you get a head start.
First, answer the following questions:
- Is there a business or executive leader who will be a champion of this initiative?
- Are there goals or specific outcomes expected from the initiative?
- Are all stakeholders aligned with that goal?
- Does this initiative tie to a specific business outcome with value?
Although a big data implementation can be challenging for enterprises, it also brings benefits across the organization. Here are examples, based on field experience, of the kinds of outcomes to expect from a big data implementation.
Faster Time to Insight
Reduce the time it takes for customers to get access to data for decision making. For example, a leading internet company had to wait twelve hours to process data on Hadoop and then push it to a commercial relational database. The objective for this customer was to reduce that time to less than one hour, giving users access to information more than eleven hours sooner than with the existing solution. By reducing latency and increasing the availability of the data, analysts and operations personnel had readier access to information, improving the time to insight.
Lower Cost to Scale
A leading financial institution reduced the cost footprint of its commercial analytics platform from $10k per TB down to $162 per TB on its big data analytics platform. The goal of the big data platform was to reduce total cost of ownership. The rule of thumb for technology investment is that it should deliver a business outcome and improve the ROI on previous investments.
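As a back-of-the-envelope sketch, the per-TB figures cited above translate directly into savings. The 500 TB capacity used below is an assumed figure for illustration only, not a number from the case study:

```python
# Per-TB costs cited in the case study above.
LEGACY_COST_PER_TB = 10_000   # $/TB on the commercial analytics platform
BIGDATA_COST_PER_TB = 162     # $/TB on the big data analytics platform

def platform_savings(capacity_tb: float) -> float:
    """Return the cost difference for a given platform capacity in TB."""
    return capacity_tb * (LEGACY_COST_PER_TB - BIGDATA_COST_PER_TB)

# Example: a hypothetical 500 TB platform.
print(f"${platform_savings(500):,.0f}")  # -> $4,919,000
```

Framing the outcome in dollars like this, rather than in cluster sizes or query speeds, is exactly the kind of business-value tie-in the use case should lead with.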
Increased Access to Historical Data
A leader in the retail space could access less than three months of its data, due to restrictions on the amount of data users could feed into its existing BI acceleration technology. By moving to a big data analytics platform with AtScale's adaptive cache and query cache, the company was able to access historical data going back five-plus years. This made types of time series analysis and historical trending that were previously impossible a reality. With this increased access to information, the company could also build risk models, run marketing segmentation, and perform 'period over period' analysis. There was a clear connection to a better business outcome via improved marketing segmentation and risk models.
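A minimal sketch of what period-over-period analysis looks like once multi-year history is available. The quarterly revenue figures here are made up purely for illustration:

```python
# Hypothetical quarterly revenue (in $M); values are illustrative only.
quarterly_revenue = {
    "2019-Q1": 1.20, "2020-Q1": 1.32,
    "2019-Q2": 1.10, "2020-Q2": 1.05,
}

def period_over_period(current: str, prior: str) -> float:
    """Percent change between two comparable periods."""
    cur, prev = quarterly_revenue[current], quarterly_revenue[prior]
    return (cur - prev) / prev * 100

print(f"{period_over_period('2020-Q1', '2019-Q1'):+.1f}%")  # -> +10.0%
```

With only three months of history, the prior-year period simply isn't in the data, so this comparison cannot be computed at all; five-plus years of retained history is what makes it possible.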
Establishing clear success criteria, with alignment from key stakeholders, is step one in identifying the right use case for implementing your big data strategy. An overarching theme of all these examples is that the success criteria deliver on a business outcome tied to business value. This is an important point to reiterate: the expected outcome must tie to a business outcome. We find customers struggle with this; they get caught up in the technical implementation details (important, but not the sole focus). It's worth adding a milestone to your plan specifically to identify and align on your expected outcomes.
It's also worth noting that with the first use case, the data practitioner should not take on too much in the first go. That can be a recipe for disaster, with lofty expectations set for end users and executives. Instead, proceed incrementally: show some wins, communicate them, and then pursue the grander, bolder vision that is the ultimate goal once you have some key wins in your back pocket. The old adage "don't boil the ocean" (or, more appropriately here, the data lake) applies aptly.
We invite you to learn more about AtScale today!