Everyone today is talking about Big Data. The discussion is divided straight down the line: most people pushing the case present it as the panacea for all known & unknown problems, while the opposing group, interestingly, does not deny its relevance or value but makes every effort to categorize it as no different from the investments in data handling & management made painstakingly over the years. Of course, both sides are right, and both miss the real point in the debate.
The real opportunity is untapped and often overlooked; it goes far beyond just data or its derived intelligence. The potential lies in the real-time application of that data (intelligence) to operations by developing an active, closed feedback loop. The ability to seamlessly integrate the results of analysis (post data aggregation) into active operating and business workflows changes the landscape. Information today has a shelf life of a few hours (raw data even less!), and the achievement is getting it acted on during that life so it makes an impact. This possibility is what makes the whole discussion and investment around big data so appealing. It has the potential not just to enhance the customer experience but also to have a tangible, measurable impact on network costs, support services and self-care, and even to facilitate the launch of differentiated services in new businesses.
Data has been collected, managed and applied for many years now. The shift that is emerging is driven by a complex, multi-dimensional change in underlying network technologies, the early introduction of automated workflows (e.g. policy control & enforcement, self-optimizing networks) and customer behavior fundamentals.
- The volume of data has increased many-fold. Existing systems and solutions may not be able to handle the ever-increasing scale of data (terabytes, records, transactions, tables/fields), which may require a re-look at data architectures.
- New sources and types of data are being added. These range from unstructured data from social platforms, application marketplaces and mobile devices to semi-structured data from M2M systems.
- Sources of data are spread across different organizational functions and many different systems. Data needs to be culled from large networks, internal corporate operations and users of all types.
- Data needs to be extracted in real-time from network nodes and user devices. Siloed transactional measurements are no longer sufficient; continuous management is imperative. Re-use of existing tools and systems is a must for practical implementations.
- The same data is relevant to multiple businesses and functions. Operating efficiency demands "collect once, use many times" architectures.
- There is a growing need to combine in-service and out-of-service data to develop dynamic correlations, i.e. associating real-time stream data with static data (customer data from CRM/BI systems, operations data from fault & performance tools and policy control frameworks, or network data from the HLR, EIR, etc.) to derive actual use-cases.
- Managing data goes beyond data aggregation or even an analysis of usage patterns. It requires seamless integration with existing operating & business systems. Most importantly, there is a need for tight integration with the different businesses to ensure insights are actually used! Metrics & KPIs, use-cases and business applications need to be driven & owned by the business teams for full integration into the product life-cycle and into pre-emptive & predictive operations.
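To make the stream/static correlation described above concrete, here is a minimal sketch, assuming illustrative field names and an in-memory dictionary standing in for a real CRM/BI lookup; a production system would query those systems instead:

```python
# Hypothetical example: enrich a real-time usage event with static CRM data.
# The subscriber IDs, field names, and in-memory "CRM table" are all
# illustrative stand-ins, not a real operator schema.

crm = {  # static customer data, loaded once (collect once, use many times)
    "MSISDN-1001": {"segment": "premium", "plan": "unlimited"},
    "MSISDN-1002": {"segment": "prepaid", "plan": "pay-as-you-go"},
}

def enrich(event, crm_table):
    """Associate a streaming usage event with static customer attributes."""
    profile = crm_table.get(event["subscriber"], {})
    return {**event, **profile}

stream = [  # stand-in for a real-time event stream from the network
    {"subscriber": "MSISDN-1001", "cell": "C17", "bytes": 1_200_000},
    {"subscriber": "MSISDN-1002", "cell": "C03", "bytes": 45_000},
]

enriched = [enrich(e, crm) for e in stream]
```

Each enriched event now carries both the live measurement and the static attributes, so a downstream rule can act on, say, segment-specific thresholds.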
These shifts can no longer be ignored; they are no longer a future vision but a hard reality for survival. The key point to note is that the shifts have far-reaching impact and go beyond big data to trigger a change in all operator functions (IT, Networks, Planning, Operations, Care, Marketing, etc.). This is where the challenge lies. An effective strategy is needed to define a holistic implementation approach that goes beyond organizational boundaries. We see the need to factor in four goals that cover all the critical success factors.
Goal 1: Affordable Management of data at extreme scale
- Evaluation of existing systems/tools against big data platforms on parameters such as volume, velocity, variety & variability, and support for parallel processing/distributed architectures
- Consideration of new requirements like distributed compute-first (as against storage-first) architectures, distributed file systems, parallel architectures, complex event processing, and high-performance query architectures based on in-memory processing for analytics-driven cloud operations
- Design of hybrid data architectures built on multiple data platforms and technologies to support different needs of different business applications
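As one small illustration of the complex event processing mentioned above, a sliding-window rule over a metric stream can flag conditions no single sample reveals. This is a toy sketch with hypothetical numbers and thresholds, not a real CEP engine:

```python
from collections import deque

def detect_burst(samples, window=3, threshold=5_000_000):
    """Toy complex-event-processing rule: flag each position where the
    byte count summed over the last `window` samples exceeds `threshold`.
    A real deployment would use a dedicated stream/CEP platform."""
    recent = deque(maxlen=window)  # sliding window over the stream
    alerts = []
    for i, byte_count in enumerate(samples):
        recent.append(byte_count)
        if len(recent) == window and sum(recent) > threshold:
            alerts.append(i)
    return alerts

# e.g. per-interval traffic samples from one network node (made-up values)
print(detect_burst([1_000_000, 2_000_000, 1_000_000, 4_000_000, 3_000_000]))
# -> [3, 4]
```

The point of the sketch is the shape of the computation: state held in memory, evaluated continuously as events arrive, rather than batch queries over stored data.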
Goal 2: Optimal re-use strategy without compromising on architecture expansion
- Audit of data & systems to define re-use approach
- Re-use of peripheral infrastructure (tools/systems) while evolving the core data architecture
- Definition of data sources & types to avoid duplication, and rationalization of systems & tools across functions
- Business-driven definition of KPIs, metrics and analytics
- Identification of high-impact business applications to define prioritized use-cases and accelerate analytical applications
Goal 3: Seamless Integration into Operations
- Automation & feedback cycle into operating & business systems and process flows to ensure tangible business value
- Conversion of insight into real-time, preemptive & predictive actions through alignment with the businesses
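The closed loop in Goal 3 can be sketched as a simple detect-decide-act cycle. Everything here is a hypothetical placeholder: the cell names, the threshold, and the `throttle` function stand in for real OSS/BSS and policy-control integrations:

```python
def feedback_loop(metric_stream, act, threshold=0.95):
    """Toy closed loop: watch a utilisation metric and trigger a
    preemptive action when it crosses the threshold. `act` stands in
    for a real operating/business-system integration."""
    actions = []
    for cell, utilisation in metric_stream:
        if utilisation > threshold:
            actions.append(act(cell, utilisation))
    return actions

def throttle(cell, utilisation):
    # Placeholder for e.g. a policy-enforcement API call; here it just
    # records what the action would have been.
    return f"throttle bulk traffic on {cell} (load {utilisation:.0%})"

# Made-up (cell, utilisation) readings; only the overloaded cell acts.
result = feedback_loop([("C17", 0.97), ("C03", 0.60)], throttle)
```

The design point is that the analysis output feeds an action interface directly, rather than ending in a report someone may or may not read, which is exactly what turns intelligence into intelligence-driven operations.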
Goal 4: Continuous Operation as a managed service
- Handling the complexity & diversity of multiple services and managing multiple organizational function interfaces
- Visualization through customized dashboards and reports
These goals overlap at many points, and the implementation priority and planning can be determined by the business approach and by the decision to adopt either a disruptive or a simpler adaptive strategy, driven by the initial risk & investment appetite. A disruptive approach will deliver high market impact but will also require greater organizational alignment and upfront investment; it establishes the full blueprint and implementation plan and results in end-to-end innovation through a fully integrated approach built on a big data solution and automated action loops. The alternative, adaptive approach has an edge due to lower initial risk, besides providing early feedback that can be fed back into the fully integrated approach; it develops a high-level blueprint, identifies 2-3 pilots in selected functional areas, and expands to other areas later. The cost of execution will largely depend on the approach preferred, the implementation timelines established and the business goals defined.
Most initial implementations will fall under the adaptive approach and slowly evolve to put in place an effective roadmap for Intelligence-driven Operations.
However, the next big challenge is to identify the right business use-cases, those that actually have an impact on the business and the user. The need of the day is to go beyond the standard, much-talked-of use-cases (customer segmentation, application analytics, content analytics, network optimization, performance management, predictive & preemptive handling of technical problems, etc.) and come up with something innovative (such as bridging the consumption gap, turning customer service data into a new revenue stream, or…); the possibilities are limited only by our imagination.
This article was published on Aricent Connect on 27 September 2012: http://blog.aricent.com/blog/opportunity-not-about-big-data-intelligence-driven-operations