A new National Highway: Virtual Connectivity to override Physical Infrastructure in India

 

Can Aadhaar evolve into a virtual connectivity infrastructure that drives a seamlessly connected society? What if Aadhaar gears up to be India’s answer to its painstakingly slow progress in building physical highways and infrastructure?

Aadhaar is a unique identification initiative launched by the Government of India under its Planning Commission. It is an ambitious project that uses basic IT technology (databases, computing) and connectivity (fixed or mobile) to create a dynamic online identity system. The integration of biometric technology has provided an advanced and secure authentication capability. This has been further extended by integrating payment platforms, providing a unified system for real-time identity, authorization and payment transaction support.
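As a purely illustrative sketch (this is not the actual UIDAI interface; all function names, fields and responses are hypothetical), the chain of such a unified identity-authentication-payment transaction could be pictured as follows:

```python
# Illustrative flow of an identity -> authentication -> payment transaction.
# Function names, fields and responses are hypothetical; this is not the UIDAI API.

def authenticate(aadhaar_number: str, biometric_sample: bytes) -> bool:
    """Verify the biometric sample against the identity record (stubbed here)."""
    return len(biometric_sample) > 0  # placeholder for a real biometric match

def authorize_payment(aadhaar_number: str, amount: float, scheme: str) -> dict:
    """Authorize a benefit payment once identity has been established."""
    return {"status": "approved", "beneficiary": aadhaar_number,
            "amount": amount, "scheme": scheme}

def disburse_benefit(aadhaar_number: str, biometric_sample: bytes,
                     amount: float, scheme: str) -> dict:
    """Chain identity, authorization and payment into one real-time transaction."""
    if not authenticate(aadhaar_number, biometric_sample):
        return {"status": "denied", "reason": "authentication failed"}
    return authorize_payment(aadhaar_number, amount, scheme)

if __name__ == "__main__":
    print(disburse_benefit("XXXX-XXXX-XXXX", b"fingerprint-template",
                           amount=500.0, scheme="LPG subsidy"))
```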

The vision outlined by the government lays emphasis on social & financial inclusion. As the first step, authorization and payment services are being used to drive the delivery of distribution and transaction-based services. Initial pilots have focused on social and welfare schemes such as the Public Distribution System, LPG distribution & subsidy management, old-age pension distribution, etc. In the next phase, applications could extend from authorization to access control or location/presence, driving services that range from something as simple as attendance to the more dynamic deployment of resources based on users’ current locations. The scenarios are limited only by our imagination.

However, for true momentum to build up, the initiative has to garner industry attention and evolve to provide value that encourages adoption by businesses and enterprises. This will not just lead to a massive build-up of Aadhaar-enabled services but also provide the impetus to propel it out of its current orbit to the next level of growth.

This evolution will need to be centered on three core areas: (1) extending its application beyond social welfare into businesses; (2) introducing support for analytics, which could be used to convert raw data into value-added user/service context or be applied to intelligence-driven operations; and (3) inter-linkages with other databases and systems for seamless connectivity.

Data has been touted as the new oil of the connected world. However, our experience has taught us that data has no value unless it is acted on and converted into meaningful actions. It is only when the monetization potential is realized that it will drive social change.

The question for all of us: can this infrastructure be exploited to compensate for the lag in physical infrastructure investments? A nation recognized for its extensive reserves of IT resources should not falter in playing to its strength in IT – we should be investing in creating an unprecedented scale of connected applications that cut across both social and industrial sectors, and use this virtual connectivity to open up reach as well as delivery. This could be the one area where we outpace every other nation & challenge the perceived dominance of other emerging nations like China.

Explosion of Data: Can it be monetized? (Part 1)

 

Effective data pricing is not about simply rolling out new pricing plans – it requires a re-think of strategies: implementation of new capabilities like policy control & traffic management; innovations in self-care, loyalty programs and cross-marketing; and integration of all these dimensions into real-time charging, notification & payment solutions.

The discussion on the ideal model to monetize the explosion of data is live again! The classic one-size-fits-all approach does not make sense any more.

Last year saw most of the major operators eliminating unlimited data plans and moving to tiered pricing. There is very little support for the operators from the community, as everyone sees it as an inhibitor to the connected world. Questions are afloat on whether it will impede the growth of video-centric applications (still in their infancy) – be it multi-player gaming, video calling, media streaming or the many anticipated new applications. But it is being recognized that growth in data traffic is shaped by multiple drivers – as was seen in India over the last year after the introduction of 3G services, where data growth was impeded by high tariffs, inadequate 3G coverage across regions and a lack of seamless interoperability of many services (e.g., video conferencing) across operators. It is clear that monetization of higher-bandwidth networks cannot be taken as a given – a more holistic approach is needed to facilitate the adoption of data services and thereafter manage the explosion of data.

It is a reality that carriers need to gain control of growing bandwidth consumption and make consumers pay for what they use, while provisioning for an adequate level of quality of service. While initial efforts started with ways to curtail data-hogging activities, it is slowly being recognized that there may be better alternatives that address the fundamental problem: re-defining the delivery of services, managing the way bandwidth is utilized between voice and data services as well as within multiple data sessions, and introducing new revenue streams. This also enables the telecom providers to differentiate their role in the value chain and demand a share of the revenue through premium services.

This requires a re-think of the existing architecture to target fair play, offer flexibility through tiered services, facilitate monetization through dynamic policy control & upgrade options, recognize customer loyalty, and integrate partners/sponsors into service models.

Architecture: Manage Inter-linkages

Some of the dimensions involved:

  1. Implement a data architecture that is able to distinguish & differentiate various types of data services in a granular form, so that differentiated policies can be implemented based on service types, usage and subscription profiles.
  2. Implement policy control solutions in the networks to exploit the variation in data usage and apportion network usage to maximise revenue. Data usage varies widely depending on the end-device, the end-user and other parameters – roughly half of a typical operator’s data traffic is driven by only about 5% of subscribers, and the top 20% of subscribers by usage consume 80% of the available capacity. The capacity available during off-peak hours can be monetized by deploying non-user-based services (e.g., utility metering, telematics, M2M etc.).
  3. Extend policy control to create monetization opportunities, i.e. moving beyond denial or restriction of service to real-time notifications & engagement with the user that present options to upgrade to the desired service levels. These could be offered for additional payment or linked with the operator’s other loyalty & payment modules (see the sketch after this list).
  4. The key is to provide a seamless experience to the user that integrates policy control with real-time charging, self-care, payment & notification systems.
  5. Develop network intelligence by consolidating data from multiple sources – monitoring usage patterns, behaviour and service experience; collecting data from network nodes; and integrating with operational & service assurance, CRM and care systems – for real-time analytics that develop profiles and characteristics to drive usage-based pricing strategies aligned with user behaviour.
  6. Implement an intelligent “offload mechanism” to selectively detour certain types of data traffic (e.g., bandwidth-hogging video traffic) to alternate bypass routes at the network edge, to ensure consistent quality of service.
  7. Dimension the networks (access and core) properly to support the heavy-tailed nature of data traffic.
  8. Enhance loyalty packages (discounts, loyalty rewards, bonus points, promotions etc.) based on collected intelligence to define targeted segment promotions and innovate pricing capsules. Integrate these into the payment systems to interlink with service upgrades.
  9. Integrate new channels for notification, communication and self-care.
  10. Introduce new pricing models, from bundles and service premiums to partner/B2B models. Other options such as dynamic pricing could also be added.
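As an illustration of points 2–4 above, here is a minimal sketch in Python of how a policy decision could move beyond throttling to a real-time upsell notification once a subscriber crosses a usage cap. The tier names, caps, throttle rate and messages are hypothetical, not taken from any specific operator or policy-control product.

```python
# Minimal sketch of a usage-based policy decision with a real-time upsell option.
# Tier names, caps, throttle rate and messages are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class Subscriber:
    msisdn: str
    tier: str            # e.g. "basic", "premium"
    monthly_usage_mb: int

TIER_CAPS_MB = {"basic": 2_000, "premium": 10_000}   # hypothetical monthly caps
THROTTLE_KBPS = 128                                  # hypothetical throttled rate

def policy_decision(sub: Subscriber) -> dict:
    """Return the action the policy engine would take for this subscriber."""
    cap = TIER_CAPS_MB[sub.tier]
    if sub.monthly_usage_mb < cap:
        return {"action": "allow"}
    # Cap reached: throttle instead of blocking, and engage the user with an upgrade offer
    return {
        "action": "throttle",
        "rate_kbps": THROTTLE_KBPS,
        "notify": (f"You have used your {cap} MB allowance. "
                   "Reply UPGRADE to restore full speed for the rest of the month."),
    }

if __name__ == "__main__":
    sub = Subscriber(msisdn="+91-XXXXXXXXXX", tier="basic", monthly_usage_mb=2_300)
    print(policy_decision(sub))
```

In a live network such a decision would sit in the policy control function and be enforced at the gateway, with the notification, payment and tier change handled by the charging and self-care systems it integrates with.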

Next, we will look at the opportunities created by the new architecture to introduce new services, products & offers for the changing connected society.

This article first appeared on Aricent Connect on February 14, 2012  (http://blog.aricent.com/blog/explosion-data-can-it-be-monetized)

Great User Experiences start at the Back End

 

Successful consumer experiences are as much about behind-the-scenes business operations and processes as they are about easy-to-use products and cool designs. 

There is no doubt anymore that most companies recognize the true power of experience—thanks in large part to Apple, which has successfully emphasized user experience as an important element of success. And yet, it’s amazing to see that there is no simple definition of what “experience” really means or entails. Is it the overall interaction with a cool or smart design, or is it confined to the graphical user interface (GUI)? I would say both. But I would add that the complete user experience must also entail the service flows around the various experience touch points.

The fact is, experience is the art of taking all those behind-the-scenes business processes and operating complexities (which we always tend to overlook), rationalizing them into streamlined functions, and then hiding them from the user by creating easy-to-use touch points and cool designs. So far, this is what has set Apple ahead of the competition, even as the competition floods the market with a surfeit of Apple look-alike products.

I’ve been associated with customers who go through an innovation cycle to replicate Apple’s success. Most of these initiatives ended with marginal success, and none created any industry-changing paradigms. What they all had in common was a single dimensional approach of hurrying the service functionality to the market. Existing operations and business processes were given short shrift and overlaid with ad-hoc upgrades in order to address service impact requirements. The result was that while the functionality excited technology innovators, it failed to generate any momentum because the experience wasn’t seamless enough.

These days, as the world becomes more connected than ever, dependency on the entire business eco-system has increased significantly. Touch points of customer experience now extend into multiple and diverse back end systems and processes (e.g. authentication and user identity/profile management, discovery and recommendations, download and upgrades, optimization for bandwidth and performance, campaigns and advertising, billing and revenue assurance, business analytics and service assurance, inventory and fulfillment, and third party eco-system management across CRM, billing, BI, OSS and SDP systems). Initially, it’s fairly common to overlook the complexity and impact of creating a comprehensive user experience, and when the challenge surfaces during the later phases of deployment cycles, the impending launch dates leave no room for innovation in that area.

In my experience, the most well-defined experiences get created when the impact on operations is envisioned at the same time as the product or service itself. This allows all dependencies to be built into planning upfront, while ensuring that a separate focus is created to commit to a seamless end-to-end operation. This may be possible through simple rationalization or through the evolution of existing systems, but in some cases it may require a full transformation. As the world becomes more connected and more people and products come online, the biggest challenge for service delivery models and business processes will be to sustain dynamic, ever-changing, real-time user and business parameters.

This is, in fact, the difference. Apple designs for the end-to-end service—not just for a product. What this means is that companies must plan for service integration as part of an innovation strategy. At least, they do if they want Apple-like success.

This article first appeared on Aricent Connect on June 18, 2011 (http://blog.aricent.com/blog/great-user-experiences-start-back-end)

The Opportunity is not about Big Data but Intelligence-Driven Operations

 

Everyone today is talking of Big Data. The discussion is divided straight down the line – most people pushing the case present it as the panacea for all known & unknown problems; and, interestingly, the opposing group does not deny its relevance or value but makes every effort to categorize it as no different from the investments in data handling & management made painstakingly over the years. Of course, both sides are right – and both miss the real point in the debate.

The real opportunity is untapped and often overlooked – it goes far beyond just data or its derived intelligence. The potential lies in the real-time application of that data (intelligence) to operations through an active, closed feedback loop. The ability to seamlessly integrate the results of analysis (post data aggregation) into active operating and business workflows changes the landscape. Information today has a shelf life of a few hours (data even less!), and the achievement is if it actually gets acted on during that life to make an impact. This possibility is what makes the whole discussion and investment around big data so appealing. It has the potential not just to enhance true customer experience but also to have a tangible, measurable impact on network costs, support services and self-care, and even to facilitate the launch of differentiated services in new businesses.

Data has been collected, managed and applied for many years now. The shift that is emerging is driven by a complex multi-dimensional change in underlying network technologies, early introduction of automated workflows (e.g. policy control & enforcement, self-optimizing networks, etc) and customer behavior fundamentals.

  • The volume of data has increased many-fold. Existing systems and solutions may not be able to handle the ever-increasing scale of data (terabytes, records, transactions, tables/fields), and this may require a re-look at data architectures.
  • New sources and types of data are being added, ranging from unstructured data from social platforms, application marketplaces and mobile devices to semi-structured data from M2M.
  • Sources of data are spread across different organizational functions and many different systems. Data needs to be culled from large networks, internal corporate operations and users of all types.
  • Data needs to be extracted in real-time from network nodes and user devices. Siloed transactional measurements are no longer sufficient and continuous management is imperative. Re-use of existing tools and systems is a must for practical implementations.
  • The same data is relevant for multiple businesses and functions. Operating efficiency is critical, and demands “collect once, use many times” architectures.
  • There is a growing need to combine in-service and out-of-service data to develop dynamic correlations, i.e. associating real-time & streaming data with static data – customer data from CRM/BI systems, operations data (fault & performance tools, policy control frameworks) or network data (HLR, EIR, etc.) – to derive actual use-cases (see the sketch after this list).
  • Managing data goes beyond data aggregation or even an analysis of usage patterns. It requires seamless integration with existing operating & business systems. And most importantly, there is a need for tight integration with the different businesses to ensure insights are actually used! Metrics & KPIs, use-cases and business applications need to be driven & owned by the business teams for full integration into the product life-cycle and into pre-emptive & predictive operations.
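As a minimal sketch of the correlation point above (data sources, field names and the 90% threshold are hypothetical), a real-time usage event could be enriched with a static CRM profile and turned into a candidate action:

```python
# Minimal sketch: correlating a real-time usage event with static CRM data.
# Data sources, field names and the 90% threshold are hypothetical.

STATIC_CRM = {  # would normally be pulled from CRM/BI systems
    "sub-001": {"segment": "postpaid-gold", "plan_cap_mb": 10_000},
}

def enrich_event(event: dict) -> dict:
    """Attach the static profile to a streaming usage event."""
    profile = STATIC_CRM.get(event["subscriber_id"], {})
    return {**event, **profile}

def derive_insight(enriched: dict):
    """Turn the enriched record into a candidate action for the business workflow."""
    cap = enriched.get("plan_cap_mb")
    if cap and enriched["usage_mb"] > 0.9 * cap:
        return "offer-top-up"
    return None

if __name__ == "__main__":
    event = {"subscriber_id": "sub-001", "usage_mb": 9_500, "cell_id": "C-17"}
    print(derive_insight(enrich_event(event)))   # -> "offer-top-up"
```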

These shifts can no longer be ignored; this is no longer a future vision but a hard-core reality for survival. But the key point to note is that the shifts have far-reaching impact and go beyond big data, triggering a change in all operator functions (IT, Networks, Planning, Operations, Care, Marketing, etc.). This is where the challenge lies. An effective strategy is needed to define a holistic implementation approach that goes beyond organizational boundaries. We see the need to factor in four goals to cover all the critical success factors.

Goal 1: Affordable Management of data at extreme scale

  • Selection between existing systems/tools and a big data platform, evaluated on parameters such as volume, velocity, variety & variability, and parallel processing/distributed architectures
  • Consideration of new requirements like distributed compute-first (as against storage-first) architectures, distributed file systems, parallel architectures, complex event processing, and high-performance query architectures based on in-memory designs for analytics-driven cloud operations
  • Design of hybrid data architectures built on multiple data platforms and technologies to support the different needs of different business applications (see the sketch below)
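A minimal sketch of the hybrid-architecture idea (the store names and routing rules are hypothetical): different classes of data are routed to the platform best suited to them.

```python
# Minimal sketch: routing different classes of data in a hybrid architecture.
# Store names and routing rules are hypothetical.

def route(record: dict) -> str:
    """Decide which platform a record belongs to."""
    if record["type"] == "stream":        # real-time events -> in-memory / CEP layer
        return "in-memory-store"
    if record["type"] == "unstructured":  # social / app-store feeds -> distributed file system
        return "distributed-file-system"
    return "relational-warehouse"         # structured transactional data

if __name__ == "__main__":
    for rec in ({"type": "stream"}, {"type": "unstructured"}, {"type": "cdr"}):
        print(rec["type"], "->", route(rec))
```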

Goal 2: Optimal re-use strategy without compromising on architecture expansion

  • Audit of data & systems to define re-use approach
  • Reuse peripheral infrastructure – tools/systems but evolve core data architecture
  • Define data sources & types to avoid duplication, and rationalize systems & tools across functions
  • Business-driven definition of KPIs, metrics and analytics
  • Identification of high-impact business applications to define prioritized use-cases and accelerate analytical applications

Goal 3: Seamless Integration into Operations

  • Automation & feedback cycle into operating & business systems and process flows to ensure tangible business value
  • Convert insight into real-time, pre-emptive & predictive actions through alignment with the businesses (a minimal sketch follows)
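A minimal sketch of such a closed feedback loop (cell identifiers, the utilization threshold and the action are hypothetical): an insight produced by the analytics layer is translated directly into an operational action rather than ending up in a report.

```python
# Minimal sketch of an insight-to-action feedback loop.
# Cell identifiers, the utilization threshold and the action are hypothetical.

def analyze(cell_metrics: dict):
    """Analytics layer: flag cells at risk of congestion."""
    if cell_metrics["utilization"] > 0.85:
        return {"cell_id": cell_metrics["cell_id"], "issue": "congestion-risk"}
    return None

def act(insight: dict) -> None:
    """Operations layer: apply a pre-approved action instead of raising a report."""
    print(f"Applying temporary offload policy on {insight['cell_id']} ({insight['issue']})")

def feedback_loop(metric_stream):
    """Closed loop: insights are acted on within their shelf life."""
    for metrics in metric_stream:
        insight = analyze(metrics)
        if insight:
            act(insight)

if __name__ == "__main__":
    feedback_loop([{"cell_id": "C-17", "utilization": 0.91},
                   {"cell_id": "C-18", "utilization": 0.40}])
```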

Goal 4: Continuous Operation as a managed service

  • Handling the complexity & diversity of multiple services and managing multiple organizational function interfaces
  • Visualization through customized dashboards and reports

These goals overlap at many points, and the implementation priority and planning can be determined by the business approach and by the decision to adopt a disruptive or a simpler adaptive strategy – driven by the initial risk & investment appetite. A disruptive approach will deliver high market impact but will also require greater organizational alignment and upfront investment; it establishes the full blueprint and implementation plan and results in end-to-end innovation through a fully integrated approach built over a big data solution and automated action loops. The alternative is the adaptive route, which has an edge due to lower initial risk and provides early feedback that can be fed into the fully integrated approach; it develops a high-level blueprint, identifies two or three selected pilots in identified functional areas, and expands to other areas later. The cost of execution will largely depend on the approach preferred, the implementation timelines established and the business goals defined.

Most initial implementations will fall under the adaptive approach and slowly evolve to put in place an effective roadmap for Intelligence-driven Operations.

However, the next big challenge is to identify the right business use-cases that actually have an impact on the business and the user. The need of the day is to go beyond the standard, much-talked-of use cases (customer segmentation, application analytics, content analytics, network optimization, performance, predictive & preemptive handling of technical problems etc.) and come up with something innovative (such as bridging the consumption gap, turning customer service data into a new revenue stream or…) – the possibilities are only limited by our imagination.

This article was published on Aricent Connect on 27 September 2012: http://blog.aricent.com/blog/opportunity-not-about-big-data-intelligence-driven-operations

Leading Innovation – Making Ideas Happen

 

The success of innovation is generally associated with ideas. There is little doubt that ideas are core to innovation, but we have also seen time and again that many good ideas and innovations lose their way – in fact, only a small fraction eventually end up living up to the original promise.

Innovation is the talk of the industry. It is probably the most overused term in business today – and yet everyone has their own interpretation of what innovation means and how it can be introduced. The focus is mostly on idea generation – some emphasize systemizing innovation processes while others look to inculcate innovation through specialized focus groups. Innovation is rarely measurable and, disappointingly, it seems to have become an end in itself – it is no longer about transforming businesses, changing lives or creating a new world. Innovation seems to be losing its meaning.

In reality, innovation is not just about ideas – an idea is just the beginning of the innovation process, and the real work starts after that. This is not to say that defining the idea is not important – it is at the core of the innovation process, and the relevance of innovation is tied to its response to real problems and pain points. However, while the potential of the idea needs to be deterministic and fully measurable, the path and the journey itself in most cases of true innovation will be unknown. The difference between success and failure usually lies in how this unknown and uncharted journey is undertaken – how the idea is translated into a goal and (more importantly) the commitment that is made to take it forward.

And this is where leadership comes in and leaves its mark. For innovation to drive achievement, it cannot be built simply around creativity; it has to be realized through careful planning, painstaking execution, constant vigilance, periodic adjustments and diligent pursuit. All examples of successful innovation have been driven by strong leaders who have been instrumental in shaping the impact of each action along the path to success. There is no better example than Steve Jobs – the ideas in themselves were not his own, nor were they new inventions – it was the journey that made all the difference!

There is no doubt that the basic qualities of a good leader – i.e. clarity of goal, adaptability to change, swift & decisive action, calculated risk-taking and the courage to overcome obstacles – are still important. But these form only the minimum requirements to drive innovation, and are not necessarily sufficient to ensure successful impact. Leading innovation requires additional capabilities that help traverse the lifecycle of change.

(1) Passion to make a difference – the inspiration to leave a lasting impact.  

(2) Perseverance & Courage to pursue, to keep going despite all odds & contradictions, to always look for alternatives.

(3) Conviction to challenge conventional wisdom, manage inertia and resistance to change.

(4) Focus to drive towards the goal – ignoring newer attractions and distractions, staying on course, constantly improving usability, simplifying the experience and using technology as a means and not as the goal.

(5) Pursuit of Perfection – willingness to admit mistakes, change course even late in the game and striving always for the best possible.

We just need to look around to see how many innovations have been lost along the way – some have been stopped, others modified beyond recognition, while the majority have simply been replaced by newer initiatives. In most of these instances, it has rarely been the idea at fault, seldom a scarcity of resources – and almost always a lack of conviction or a faltering commitment.

There is no doubt that success in innovation is far more influenced by leadership than any other element – even more than creative ideas, smart resources or unconventional out-of-the-box thinking!

This article was written on February 09, 2012.

Driving the Connected Society

 

Often overlooked by the end-user, the connected society has truly been built upon the advances in technology & business processes achieved by the semiconductor industry.

Everything today is being driven by, and built for, the connected society. We all talk of the impact of Apple, Facebook, Twitter and Amazon and the emergence of new eco-systems. These new faces of connected services have re-kindled the vision of connecting many more of the devices that are part of our lives, such as televisions, appliances and cars; and more recently we see renewed interest in expanding into new solutions in smart energy, mobile healthcare, etc.

Many talk of this as a new vision. And yet, it has been around for many years now. What is often overlooked is that the recent optimism is truly driven by the advances in underlying technologies over the last several years. In fact, it is not just the availability of new technology or the continuous improvements in size and processing capability, but the drastic reduction in price points achieved by the semiconductor industry that has opened up the reach for mass-scale adoption.

I have been tracking the M2M world for more than five years now, and I have seen many technology advances that have led to the current point of inflection – from the changed world of microprocessors with their ever-increasing processing power and capability, to the re-incarnation of memory with flash memory & in-memory computing; the validation of wireless connectivity and the viability of embedded sensor technology; besides the new thresholds achieved in video resolution, imaging and voice quality, and many more.

It is true that each of these technologies has opened up new possibilities. But I believe the tipping point was reached through the power of combination – harnessing several of these individual technologies together into new applications. This led to a new level of innovation delivering far-reaching impact on user experience, operational costs and new business services. It is even more fascinating that the impact spans verticals, businesses and industries.

Apple provides a good example of this seamless integration of multiple technologies – from a rich media experience to an optimized browsing experience, to new user interfaces like touch, gesture or speech recognition, to the extension of storage into seamless cloud repositories and the management of media across devices.

Apple is of course also largely responsible for relegating technology to behind the scenes – they have focused on user experience and combined smart designs with streamlined processes to create easy-to-use touch points, thus hiding the technology and its complexity from the user.

Hidden or not, these technology advances will need to continue and maybe even gather more momentum. If the prediction that connected devices will reach 15 billion over the next few years comes true, the spurt in data traffic and its implications for processing capability can barely be imagined today. We will need all the power that we can get.

And it is reassuring to hear (from industry leaders like Intel) that next-generation chip technologies will continue to maintain the pace demanded by Moore’s Law, with the number of transistors doubling every two years. It is this power that will drive the connected world – and we should give it its due credit!

This article was written on 20 October 2011.

Reverse Innovation Lessons for Developed Economies

 

Reverse innovation stories in emerging markets highlight the untapped potential of innovating operating and business practices in developed markets.

The more I read about “reverse innovation” and the opportunity this creative method of rethinking products and services has opened up for the developed world, the more I see how important operating and business processes are. What’s interesting is that in the telecom industry the best examples (and the most successful ones) are coming out of emerging economies. This shouldn’t be a surprise as innovation in the emerging world has been an outcome of the prevailing business pressures that left little option but to change conventional thinking.

We have all seen that most of the subscriber growth in the developing world has come from bottom-of-the-pyramid consumers that generate marginal ARPUs (average revenue per user). It’s no wonder then that in most instances, innovation in the emerging market has been in the form of new business models that profitably target these consumers with low disposable incomes and stimulate the usage of basic and value-added telecommunication services.

This kind of innovation started by changing very basic paradigms. Companies redefined core values and questioned competitive drivers. We also saw operators such as Bharti moving away from physically owning the entire front-and-back-end infrastructure to outsourcing many non-core activities in an attempt to turn capital expenditures into operational costs and manage cash flow better. They started with network sharing in order to optimize network op-ex and cap-ex costs (often driven by the need to extend coverage into remote areas). They then added network outsourcing and later IT outsourcing to bring down fixed costs and improve performance. These moves have also helped Bharti divert internal resources into customer management and business evolution.

These innovations are not just focused on operations. There are many examples around advanced business practices as well. Dynamic pricing has been introduced in South Africa by the African telecommunications company MTN (the plan is called MTN Zone). It uses real-time network loading to offer time-limited discounts on voice and SMS in specific geographical areas (a simple sketch of the idea follows below). This has increased usage minutes, stabilized pre-paid ARPU, and helped to smooth the traffic profile for higher network efficiency. Concepts like sachet pricing, which has been introduced in a few markets including Africa, India, and Latin America, enable customers to purchase data access in small increments (1-hour, 1-day, 3-days, etc.), thereby providing a flexible alternative to the standard flat or usage-based pricing models. Another high-profile example is the One Network launch from Zain that allows customers to roam across 16 markets using their home SIM card at local call rates.
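As a minimal sketch of the load-based dynamic pricing idea (the discount bands and load thresholds are hypothetical illustrations, not MTN Zone’s actual rules):

```python
# Minimal sketch of load-based dynamic pricing.
# Discount bands and load thresholds are hypothetical, not MTN Zone's actual rules.

def zone_discount(cell_load: float) -> int:
    """Map real-time cell load (0.0-1.0) to a percentage discount on call rates."""
    if cell_load < 0.3:
        return 50   # lightly loaded cell: deep discount to stimulate usage
    if cell_load < 0.6:
        return 25
    if cell_load < 0.8:
        return 10
    return 0        # busy cell: standard rates

if __name__ == "__main__":
    for load in (0.2, 0.5, 0.75, 0.9):
        print(f"load={load:.2f} -> {zone_discount(load)}% discount")
```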

And there are many more such examples.

What is probably not commonly known is that most emerging market mobile operators have been able to record EBITDA margins that are, on average, higher than the margins in developed economies—proof that there is value in taking a different approach to backend requirements.

With the developed world now faced with similar challenges on costs and margins, there is a case for learning from the innovation in emerging markets and building on it further to target the untapped potential of innovation in both operating and business practices.

This article first appeared on Aricent Connect on 20 August 2011 (http://blog.aricent.com/blog/reverse-innovation-lessons-developed-economies)

Managed Connectivity and mHealth

 

Bringing disparate healthcare solutions together could be the catalyst to drive adoption to a scale that can translate into large scale healthcare transformation.

Most mobile health (mHealth) initiatives typically focus on singular solutions—simple mobile apps, cloud-based data management solutions, new consumer medical devices, virtual clinics, remote management, and many more that get added every day. Very few really look at getting them all to work together. And yet, that’s what needs to happen; we must manage the connectivity. Until we do, many singular initiatives will stay at the periphery of the healthcare ecosystem.

The looming threat of unsustainable healthcare costs around the world has generated significant market attention. Some are predicting that these costs will reach 15 percent of global GDP, and in some countries even an astronomical 30 percent. The challenge to deliver effective and affordable healthcare appears to be almost an afterthought.

This is not a new challenge, but there seems to be renewed interest because of the new category of healthcare management enabled by mHealth, eHealth, or connected healthcare. More often than not these terms refer to the application of mobile technology and communication solutions to transform healthcare service delivery.

There may be a lot of hype, but the category is still in its infancy, and commercial viability is still to be tested. The ecosystem itself is complex, with multiple entities including government agencies, public and private healthcare service providers, pharma and medical device vendors, and users.

The current focus is mostly on point solutions—these include data aggregation and management (e.g. healthcare records, cloud-based storage and backup, etc.), mobile applications (e.g. monitoring of lifestyle diseases, wellness applications, health information access, etc.), remote healthcare management (e.g. pill adherence, diagnostics and patient-centric monitoring, etc.), and tele-medicine (e.g. tele-triage services, mobile healthcare vans, etc.).

Adoption of these services requires as much a change in industry outlook and social behaviours, as it does in making these services seamless and integrated. The true impact of these point services will be seen only if they are all tied together into an end-to-end solution. The question is who is best equipped to tie all the pieces together, especially with the ongoing convergence of telecom, information, and healthcare. In fact, the telcos seem to be best equipped to facilitate end-to-end solutions, but to be successful they will need to move beyond their current connectivity orientation and adapt to the speed and business models of the connected world paradigm.

What’s needed now is the ability to establish a connected managed service that provides the underlying connectivity and communication infrastructure while also having the ability to manage operations and support to facilitate the connection of the healthcare ecosystem. This would allow healthcare service providers to focus on their core strengths around healthcare, while innovating new models of service delivery. A “connected service” would automatically take care of connectivity (including 24-7 managed support), and add value through advanced connectivity management—monitoring, diagnostics, and service assurance. It could also evolve into a central entity to integrate disparate data sources, perform data analytics (including understanding user behaviour and profiles), and create service aggregation portals to create a single-point delivery for end-users. 

When that happens the disparate services would all be seamlessly integrated. Extended connectivity will maximize returns on innovation on the individual planes—medical devices, healthcare service delivery models, information access, and management—and make a true impact on healthcare service. From where we stand today, it’s quite clear that we are still a few years away from seeing this. And yet, if we can make this happen, we can touch lives as nothing else can. 

This article first appeared on Aricent Connect on 21 September 2011 (http://blog.aricent.com/blog/managed-connectivity-and-mhealth)