If only my interaction with a machine could be Collaborative…?

I realised that most Error Messages frustrate me, some even scare me, and only a very few actually guide me through the error scenario…

[Image: error scares me]

Just last week, I was exploring a new software service – and incredible as it sounds – despite being a native user (and developer) of software, and having survived generations of software applications, I still panicked at the sight of a ‘big red X’ (a typical ‘ERROR’ alert) – so much so that before I knew it, I had instinctively killed the application!

That got me thinking. A simple response by a system to one of my ‘natural’ actions managed to induce a feeling of helplessness and despair, even a degree of frustration and anger – enough for me to give up. And, if I am honest, I know that I have no intention of retrying anytime soon.

I am sure that I am not an exception – many of you will have had a similar experience to share, at some point in your interaction with one system or application or another.

I looked further – I found research showing that there is a tendency in all of us to blame ourselves for our failures with everyday objects and systems. Surprised? Well, I certainly was. Isn’t it a contradiction of our natural inclination to blame everything that goes wrong in our lives on others or on our environment?

But the underlying question that continued to bother me is simple – if I reflect on my ‘natural’ action, it was nothing but typical ‘user behaviour’. How can user behaviours be ‘Errors’? So what are we missing?

I guess it comes down to product design and user experience. I know that when I design a product, no matter how I expect users to use it, there will always be a few who find different and unexpected ways to use it. A good design can neither ignore that (and hence needs to handle the unexpected), nor be so restrictive as to force all users to comply with a single flow (which would inherently conflict with their natural behaviour).

I started to look at the error messages that we issue under different error scenarios in our own mobile video application. We had invested a lot of time in humanising all our application messages and notifications, and were even inspired by NLP (Neuro-Linguistic Programming) – so while most had a human touch (and hence were not as scary as the big X), I realised that they were still fairly limited in guiding the user through to the next stage.

I started to look at error messages from different applications under different scenarios. I noticed that even when the actual errors encountered were similar, my experience (and hence response) as a user was very different – and the difference was in the small details of how the message was communicated.

And then I remembered an old reference from the book ‘The Design of Everyday Things’, where Don Norman interestingly uses a standard interaction between two people as an example to demonstrate that effective interactions are mostly built on collaboration – one person tries to understand and respond to the other, and when something is not understood or is inappropriate, it is seamlessly questioned and clarified, and the collaboration continues naturally.

I guess, as a user, I am tuned to expect my interactions to be collaborative – and hence I struggle when my interactions are with a machine (and not another person)… of course they inevitably fall short! Any expectation that user behaviour will or should change when interacting with a machine is suspect – we all know how unadaptable we are as a species! There is no doubt that our goal – as product designers – should be to build intelligence into the machine interaction and aspire to make it collaborative: when the user does something wrong, the machine should respond by providing clarification, helping the user understand, and guiding the user through to the next stage – ensuring that the communication illustrates how to rectify the inappropriate action and recommends the next actions.
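
To make this concrete, here is a minimal sketch (in Python, with hypothetical names of my own – not any particular framework) of what a ‘collaborative’ error response might carry instead of a bare alert: the cause in plain language, a blame-free explanation, and recommended next actions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class GuidedError:
    """A 'collaborative' alternative to a bare error alert.

    Instead of just signalling failure, it explains what happened,
    why, and what the user can do next -- mirroring how a person
    would clarify a misunderstanding in conversation.
    """
    what_happened: str          # plain-language description, no error codes
    why: str                    # likely cause, phrased without blame
    next_actions: List[str] = field(default_factory=list)  # concrete ways forward

def render(error: GuidedError) -> str:
    """Render the error as a guiding message rather than an alert."""
    actions = "\n".join(f"  - {a}" for a in error.next_actions)
    return f"{error.what_happened}\n{error.why}\nYou could try:\n{actions}"

# Example: a failed upload, guided instead of 'ERROR: upload failed'
print(render(GuidedError(
    what_happened="We couldn't upload your video.",
    why="The file is larger than the 100 MB limit on this plan.",
    next_actions=["Trim the video and retry",
                  "Upgrade your plan for larger uploads"],
)))
```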

I know it sounds onerous. But it is not. Technology is a powerful tool, and we now have enough capability and building blocks to easily build simple collaborations and design good feedback and interaction models.

I am fascinated with this new challenge. Our goal will now be to eliminate all error messages and instead replace them with messages that aid and continually guide the user…

This article was first published on LinkedIn on January 23, 2016.


When I paid the price for forgetting that the first 5 minutes of the user journey is more important than all the cool features…

[Image: user journey]

I love technology and of course I thrive on building cool features – nothing compares to the excitement of implementing highly complex algorithms or finding new ways of using technology to solve a problem.

But many hard experiences have taught me that technology (alone) does not sell, and I have over the years learnt to focus first on user experience and always keep the technology hidden.

So, when I started to write down the product specifications for my new startup idea, I did everything right – there was not a single word on technology, and I captured the product definition through 5 stages of the user journey.

[5 Minutes] captured the 1st Experience: AWARENESS

[5 Hours] focused on the 1st Use: ORIENTATION

[5 Days] looked at making it work for the user and led to Participation: PERSONALISATION

[5 Weeks] looked at making it work for the larger group and added elements of Induction: INFLUENCE

[Ongoing] introduced capabilities to sustain the momentum: CULTURE

It was the right way to do it. We ran the user journeys with many clients and fed the inputs and preferences back. After 4 months of market validation we decided we were ready to start development.

And that’s when the problem inadvertently occurred (of course, I never realised it at the time). As we drew out the release plan, we ran through the normal cycle of chopping features and prioritising to define the feature-set for the first beta version. I believe that is when my latent love for technology overtook my experience: I selected the base feature set to include functions that demonstrated the algorithms (justifying it all by saying that these complex contextualization algorithms were our differentiator and critical for illustrating the value to the user). Nothing comes for free – to balance time and resources, we decided to take a shortcut with our initial sign-up and login process. At that stage it seemed the perfect thing to do – after all, it’s something a user does only once or at best a few times – and even if it’s a few extra steps or a little painful, it will still work!

Of course it worked. But only for those true early adopters who had the motivation to take that extra initiative and accept a few painful interactions. As our user base grew, many users attempted to sign up but never managed to get on board. It’s amazing how often it happened – and believe me, the interaction wasn’t anything demanding – it simply required them to copy an access code (sent separately) and input it as part of the sign-up. In those days we lost a few users, and for many others our client engagement teams had to invest time chasing them (and often hand-holding) to complete the process. If only we had stuck to our original definition of keeping the first 5 minutes simple and seamless, we would not just have got a lot more people on-boarded, but also ensured that their first touch point was fail-safe.

In hindsight, it seems a blunder – how could we have ignored the fact that users had to onboard before they could experience the cool contextual and personalisation features? There was no technical complexity to the desired sign-up process, and I do not even know if we really saved that many development hours and resources. It’s more that we were avoiding working on a task that had virtually no challenge in it for us…

What hurts me even more is that it’s something that I, as a Product Manager, always knew – and I even apportioned it the right value at the time of conceptualisation. And yet, somewhere, I still lost control on the road from concept to delivery.

This article was first published on Medium on October 27, 2015 and LinkedIn on November 21, 2015.

Reducing attention span – how I started to exploit it in today’s asynchronous world…

[Image: Less is More]

I can no longer focus on a thought for long – can you? Is a short attention span really as bad as often suggested…? I don’t think so. Maybe working professionals can adapt to exploit this emerging behaviour pattern?

My attention span has fallen over the last few years. It’s a fact. Earlier I could focus and concentrate at length; now I struggle. With so much happening in today’s multi-media, always-on world, if something fails to catch my attention in the first few seconds, I just drop it and jump to the next – and even then I end up with so much that I still can’t find time to look at it all. Some research puts the blame on our growing use of smartphones and connected networks, other research on the never-ending information overload. Whatever the cause, the effect can’t be ignored.

An article some months back created excitement by announcing that a goldfish (at 9 sec) now has a longer attention span than a human…

While I have found little medical validation to support this claim – and hence decided that it’s premature to give up on my digitally-connected lifestyle – it certainly opened my mind to accept that I am now doing many things differently. And since the jury is still out on whether that’s good or bad, my actions are contradictory. At one extreme, I am using yoga and meditation to build concentration; at the other, I am developing new operating patterns that fit better with a shorter attention span.

One important change that I am exploring is to drive a culture where information is broken into bite-sized chunks. Obvious as it may be, believe me, it is indeed a step change. We all suffer overflowing inboxes, overly descriptive documents and hours of conference calls. To break away from this overload and start communicating in bite-sized chunks makes us edgy… Will data be missed? Can all the facts be captured succinctly? Don’t more pages and slides demand more extensive preparation? And so on…

But, if we stop for a moment – we all know (deep down) that hardly anyone reads big communications diligently and much of what is said is often ignored… We also know that if the key message is clear then we can ‘capture’ it simply. It’s only when we lack clarity that we waffle… and we have all seen that structuring information into multiple, easier-to-digest pieces helps present the big picture much better.

Let’s say we succeed in getting the information broken down. Smaller pieces work well with our reduced attention spans. We don’t have to wait to free up time; instead we can pick up chunks to fill in time-windows. We can cover a lot more in a shorter time. We can start by focusing on the key highlights – and only delve into details for areas that really deserve attention. And we can always create time for that. We can become better at filtering and prioritising – and hence act more effectively.

The shift is to move from quantity to QUALITY.

The shift is to move from activity to RESULTS.

The shift is to move from management to OWNERSHIP.

Through our innovative use of structured mobile-video, we at humanLearning have pioneered this change. Our platform ‘WinSight’ enables professionals to craft clear messages – quickly & easily – in short (30-60 sec. max.), segmented, templated videos. We can now exploit small time-windows – we have captured our messages while walking out of a client meeting to the parked car, sometimes while waiting for our turn in a queue, sometimes while travelling in the train or the underground, but quite often as we walk the busy streets. All our communication has become near real-time and yet is available asynchronously… it gives us all the independence of uninterrupted work-flows and freedom from time-zones, but still keeps us connected – more than we have ever been before!

Our belief in ‘Less is More’ may seem counter-intuitive in the age of BigData but it can evolve into the natural way of future communication – a more human way to interface & interact.

It will take time to move all the interactions to this new mantra. However, we should get started now. After all, it’s not just about survival anymore but an opportunity for working professionals to simplify their work-life by creating new, easier, quicker, more effective – asynchronous – ways of working.

This article was first published on LinkedIn on October 18, 2015  [ref. https://www.linkedin.com/pulse/reducing-attention-span-how-i-started-exploit-todays-arti-khanna] and GrowthHackers on November 2 2015 [ref. https://growthhackers.com/articles/reducing-attention-span-how-i-started-to-exploit-it-in-today-s-asynchronous-world]

I recently concluded that I learn more from Success than Failure… and yet, isn’t it ironic that we are still obsessed with learning from failure?

[Image: right or wrong]

It is popular belief – especially in the startup eco-system – that failure is a stepping-stone to success. I cannot deny that this gave me a lot of confidence (and comfort) when I co-founded a technology startup, as I believed that the worst outcome (for me) would be all the great learning I would acquire, even if we faltered on the way.

Now, after many years of living the startup journey, I have lots of learning – both good and bad. But, being true to the spirit of learning from failure, I always diligently record everything that doesn’t work. I even look at it often, analyse it sometimes, and consciously try not to follow the same approach again. But then everything changed one day…

It was just one of those days when I was flustered – I was looking for answers, and I was getting irritated as I realised that for every previous effort that had failed, I only knew what did not work. I still had no clue what would work. I asked myself – how effective is that learning, if I still have to go back to the drawing board and continue the search for answers on how to make it work? I was not very upbeat: I had gone through the process once and failed to find the answer, and what was the guarantee that the second search would be any more fruitful?

In that state of exasperation, I happened to come across interesting neuroscience research suggesting that brain cells learn from experience only when we do something right, not when we fail. I was intrigued.

I wondered if I could correlate it with my own personal experience – so I tried to test the theory on the problem at hand. Our mobile-video based service for sharing experiences, stories and insights is deployed across 25+ countries in Europe. Most groups are very actively engaged, but a few still require constant nudges. All our discussion around driving adoption in the low-activity groups had always focused on what wasn’t working for those groups. That day we changed our outlook – we instead discussed everything that was working for the high-activity groups. We uncovered simple observations and found interesting patterns. We realised that we had simply never bothered to re-apply this successful learning back into the groups that required external stimuli.

That was the day I realised that my obsession with learning from failure meant that I was simply taking for granted everything that was working for us. Here was an opportunity to focus on success and build upon it – I knew what worked, and I could make it happen again, maybe even do it much better. And yet I was spending more of my time learning from failures. Why? It made no sense.

I am now a convert. I now track our successes as much as (if not more than) the failed attempts. Of course I know that I need to be cautious and ensure that I am not blinded by success. More importantly, I am cognisant that I need to continuously strive to do better than the last success. And of course it also does not mean that I overlook failures – but I now look at them in the right context.

Learn from success is my new mantra! I realise that the need is not to glorify success – but to recognize core strengths and convert them into strategic assets. Just as it is important to manage our weaknesses, we also need to diligently work on developing our strengths. And believe me – it is harder to focus on strengths; it is far easier to lapse into failures, regrets and emotions.

This article was first published on LinkedIn on September 13, 2015 [ref. https://www.linkedin.com/pulse/i-recently-concluded-learn-more-from-success-than-failure-arti-khanna]

Arti is the co-founder of humanLearning – a fast-growing UK-based technology startup – set up with an earnest desire to make the life of busy professionals simpler and more effective. humanLearning is disrupting business work-flows through WinSight – a mobile-video based platform that empowers ‘every’ professional to benefit from each other’s experiences & insights in the easiest, fastest and most impactful way.

A new National Highway: Virtual Connectivity to override Physical Infrastructure in India

 

Can Aadhaar evolve into a virtual connectivity infrastructure that drives a seamlessly connected society? What if Aadhaar gears up to be India’s answer to its painstakingly slow progress in building physical highways and infrastructure?

Aadhaar is a unique identification initiative launched by the Government of India under its Planning Commission. It is an ambitious project that uses basic IT technology (databases, computing) and connectivity (fixed or mobile) to create a dynamic online identity system. The integration of biometric technology has provided an advanced and secure capability for authentication. This has been further extended by integrating payment platforms, providing a unified system of real-time identity, authorization and payment transaction support.

The vision outlined by the government lays emphasis on social & financial inclusion. As the first step, authorization and payment services are being used to drive the delivery of distribution and transaction based services. Initial pilots have focused on social and welfare schemes such as Public Distribution Systems, LPG distribution & subsidy management, old-age pension distribution etc. In the next phase, applications could extend usage from authorization to access control or location/presence, and drive services ranging from something as simple as attendance to more dynamic deployment of resources based on the current location of users. The scenarios are limited only by our imagination.
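
As an illustration only – the functions and data below are hypothetical stand-ins of my own, not the actual Aadhaar APIs – a welfare-delivery service built on such an identity layer might chain the three capabilities like this:

```python
# Hypothetical sketch of an identity -> authorization -> payment chain.
# Toy in-memory data; the real system uses secure, audited services.

REGISTERED = {"1234-5678-9012": b"fingerprint-template"}   # toy identity store
ENTITLEMENTS = {"1234-5678-9012": {"lpg-subsidy"}}          # toy scheme registry

def verify_identity(identity_id: str, biometric_sample: bytes) -> bool:
    """Toy biometric 'yes/no' check against the enrolled template."""
    return REGISTERED.get(identity_id) == biometric_sample

def authorize(identity_id: str, scheme: str) -> bool:
    """Toy entitlement check for a welfare scheme."""
    return scheme in ENTITLEMENTS.get(identity_id, set())

def pay(identity_id: str, amount: float) -> str:
    """Toy identity-linked payment; returns a transaction reference."""
    return f"txn-{identity_id[-4:]}-{amount:.2f}"

def deliver_subsidy(identity_id: str, biometric_sample: bytes, amount: float) -> str:
    """Chain identity -> authorization -> payment, failing fast at each step."""
    if not verify_identity(identity_id, biometric_sample):
        raise PermissionError("identity could not be verified")
    if not authorize(identity_id, "lpg-subsidy"):
        raise PermissionError("not entitled to this scheme")
    return pay(identity_id, amount)

print(deliver_subsidy("1234-5678-9012", b"fingerprint-template", 450.0))
```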

However, for true momentum to build up, the initiative has to garner industry attention and evolve to provide value that encourages adoption by businesses and enterprises. This will not just lead to a massive build-up of Aadhaar-enabled services but also provide the impetus to propel it out of its current orbit to the next level of growth.

This evolution will need to be centered around 3 core areas – (1) Extending its application beyond social welfare into businesses (2) Introducing Support for Analytics – analytics could be used for converting raw data into value-added user/service context or applied to intelligence-driven operations (3) Inter-linkages with other databases and systems for seamless connectivity.

Data has been touted as the new oil of the connected world. However, our experience has taught us that data has no value unless it is acted on and converted into meaningful actions. It is only when the monetization potential is realized that it will drive social change.

The question for all of us – can this infrastructure be exploited to compensate for the lag in physical infrastructure investments? A nation that has been recognized for its extensive reserves of IT resources should not falter in playing to its strength in IT – we should be investing in creating connected applications at an unprecedented scale, cutting across both social and industrial sectors, and using the virtual connectivity to open up reach as well as delivery. This could be the one area where we outpace every other nation & challenge the perceived dominance of other emerging nations like China.

Explosion of Data: Can it be monetized? (Part 1)

 

Effective data pricing is not about simply rolling out new pricing plans – it requires a re-think of strategies: implementation of new capabilities like policy control & traffic management; innovations in self-care, loyalty programs and cross-marketing; and integration of all these dimensions into real-time charging, notification & payment solutions.

The discussion on the ideal model to monetize the explosion of data is live again! The classic one-size-fits-all approach does not make sense any more.

Last year saw most of the major operators eliminate unlimited data plans and move to tiered pricing. There is very little support for the operators from the community, as everyone sees it as an inhibitor to the connected world. Questions are afloat on whether it will impede the growth of video-centric applications (still in their infancy) – be it multi-player gaming or video calling, media streaming or the many anticipated new applications. But it is being recognized that growth in data traffic is impacted by multiple drivers – as was seen in India over the last year after the introduction of 3G services, where data growth was impeded by high tariffs, inadequate 3G coverage across the regions, and a lack of seamless interoperability of many services (e.g., video conferencing) across operators. It is clear that monetization of higher bandwidth networks cannot be taken as a given – a more holistic approach is needed to facilitate the adoption of data services and thereafter manage the explosion of data.

It is a reality that carriers need to gain control of growing bandwidth consumption and make consumers pay for what they use, while provisioning an adequate level of quality of service. While the initial efforts started with ways to curtail data-hogging activities, it is slowly being recognized that there may be better alternatives that address the fundamental problem: re-defining the delivery of services, managing the way bandwidth is allocated between voice and data services (as well as within multiple data sessions), and introducing new revenue streams. This also enables the telecom providers to differentiate their role in the value-chain and demand a share of the revenue through premium services.

This requires a re-think of the existing architecture to target fair play, offer flexibility through tiered services, facilitate monetization through dynamic policy control & upgrade options, recognize customer loyalty, and integrate partners/sponsors into service models.

Architecture: Manage Inter-linkages

Some of the dimensions involved:

  1. Implement a data architecture which is able to distinguish & differentiate the various types of data services in a granular form, so that differentiated policies can be implemented based on service types, usage and subscription profiles.
  2. Implement policy control solutions in the networks to exploit the variation in data usage and apportion the usage of the network to maximise revenue. Data usage varies widely depending on the end-device, end-user and other parameters – roughly half of a typical operator’s data traffic is driven by only about 5% of subscribers, and the top 20% of subscribers by usage consume around 80% of available capacity. The capacity available during off-peak hours can be monetized by deploying non-user-based services (e.g., utility metering, telematics, M2M etc).
  3. Extend policy control to create monetization opportunities, i.e. move beyond denial or restriction of service to real-time notifications & engagement that present the user with options to upgrade to the desired service level (a minimal sketch follows this list). These could be offered for additional payment or linked with the operator’s other loyalty & payment modules.
  4. The key is to provide a seamless experience to the user that integrates policy control with real-time charging, self-care, payment & notification systems.
  5. Develop network intelligence by consolidating data from multiple sources – monitoring usage patterns, behaviour and service experience; collecting data from network nodes; and integrating with operational & service assurance, CRM and care systems – for real-time analytics that develop the profiles and characteristics needed to drive usage-based pricing strategies aligned with user behaviour.
  6. Implement an intelligent “offload mechanism” to selectively detour certain types of data traffic (e.g., bandwidth-hogging video traffic) to alternate bypass routes at the network edge, to ensure consistent quality of service.
  7. Dimension the networks (access and core) properly to support the heavy-tailed nature of data traffic.
  8. Enhance loyalty packages (discounts, loyalty rewards, bonus points, promotions etc) based on collected intelligence to define targeted segment promotions and innovative pricing capsules. Integrate with the payment systems to interlink with service upgrades.
  9. Integrate new channels for notification, communication and self-care.
  10. Introduce new pricing models – bundles, service premiums, partner/B2B models etc. Other options such as dynamic pricing could also be added.
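
Here is the minimal sketch of point 3 above (in Python; the names, quota and loyalty threshold are invented for illustration, not any particular vendor’s policy engine): a policy decision that returns an upgrade offer instead of a flat denial.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Subscriber:
    plan_quota_gb: float
    used_gb: float
    loyalty_points: int

@dataclass
class PolicyDecision:
    allow: bool
    message: str                         # real-time notification to the user
    upgrade_offer: Optional[str] = None  # engagement option instead of a flat denial

def decide(sub: Subscriber, requested_gb: float) -> PolicyDecision:
    """Policy check that engages the user at the quota boundary."""
    if sub.used_gb + requested_gb <= sub.plan_quota_gb:
        return PolicyDecision(allow=True, message="Within plan quota.")
    # Over quota: present an upgrade path, optionally funded by loyalty points
    offer = ("Redeem 500 loyalty points for an extra 1 GB"
             if sub.loyalty_points >= 500
             else "Buy a 1 GB top-up")
    return PolicyDecision(
        allow=False,
        message="You have reached your data quota for this cycle.",
        upgrade_offer=offer,
    )

print(decide(Subscriber(plan_quota_gb=10.0, used_gb=9.8, loyalty_points=800), 0.5))
```

The design point is that the decision object carries the notification and the offer together, so the charging, self-care and payment systems can act on one artefact in real time.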

Next, we will look at the opportunities created by the new architecture to introduce new services, products & offers for the changing connected society.

This article first appeared on Aricent Connect on February 14, 2012  (http://blog.aricent.com/blog/explosion-data-can-it-be-monetized)

Great User Experiences start at the Back End

 

Successful consumer experiences are as much about behind-the-scenes business operations and processes as they are about easy-to-use products and cool designs. 

There is no doubt anymore that most companies recognize the true power of experience – thanks in large part to Apple, who has successfully emphasized user experience as an important element of success. And yet, it’s amazing to see that there is no simple definition of what “experience” really means or entails. Is it the overall interaction with a cool or smart design, or is it confined to the graphical user interface (GUI)? I would say both. But I would also add that the complete user experience must entail the service flows around the various experience touch points.

The fact is, experience is the art of taking all those behind-the-scenes business processes and operating complexities (which we always tend to overlook), rationalizing them into streamlined functions, and then hiding them from the user by creating easy-to-use touch points and cool designs. So far, this is what has set Apple ahead of the competition, even as the competition floods the market with a surfeit of Apple look-alike products.

I’ve been associated with customers who have gone through innovation cycles to replicate Apple’s success. Most of these initiatives ended with marginal success, and none created any industry-changing paradigms. What they all had in common was a single-dimensional approach of hurrying the service functionality to market. Existing operations and business processes were given short shrift and overlaid with ad-hoc upgrades in order to address service impact requirements. The result was that while the functionality excited technology innovators, it failed to generate any momentum because the experience wasn’t seamless enough.

These days, as the world becomes more connected than ever, dependency on the entire business eco-system has increased significantly. Touch points of customer experience now extend into multiple and diverse back end systems and processes (e.g. authentication and user identity/profile management, discovery and recommendations, download and upgrades, optimization for bandwidth and performance, campaigns and advertising, billing and revenue assurance, business analytics and service assurance, inventory and fulfillment, and third party eco-system management across CRM, billing, BI, OSS and SDP systems). Initially, it’s fairly common to overlook the complexity and impact of creating a comprehensive user experience, and when the challenge surfaces during the later phases of deployment cycles, the impending launch dates leave no room for innovation in that area.

In my experience, the most well-defined experiences get created when the impact on operations is envisioned at the same time as the product or service itself. This allows all dependencies to be built into planning upfront, while ensuring that a separate focus is created to commit to a seamless end-to-end operation. This may be possible through simple rationalization or through the evolution of existing systems, but in some cases it may require a full transformation. As the world becomes more connected and more people and products come online, the biggest challenge to service delivery models and business processes will be to sustain dynamic, ever-changing, real-time user and business parameters.

This is, in fact, the difference. Apple designs for the end-to-end service – not just for a product. What this means is that companies must plan for service integration as part of their innovation strategy. At least, they must if they want Apple-like success.

This article first appeared on Aricent Connect on June 18, 2011 (http://blog.aricent.com/blog/great-user-experiences-start-back-end)

The Opportunity is not about Big Data but Intelligence-Driven Operations

 

Everyone today is talking of Big Data. The discussion is divided straight down the line – most people pushing the case present it as the panacea for all known & unknown problems, while, interestingly, the opposing group does not deny its relevance or value but makes every effort to categorize it as no different from the investments in data handling & management made painstakingly over the years. Of course, both sides are right – and both miss the real point in the debate.

The real opportunity is untapped and often overlooked – it goes far beyond just data or its derived intelligence. The potential is in the real-time application of the data (intelligence) to operations by developing an active, closed feedback loop. The ability to seamlessly integrate the results of analysis (post data aggregation) into active operating and business work-flows changes the landscape. Information today has a shelf life of a few hours (data even less!), and the achievement is to act on it within that life to make an impact. This possibility is what makes the whole discussion and investment around big data so appealing. It has the potential not just to enhance true customer experience but also to have a tangible, measurable impact on network costs, support services and self-care, and even to go beyond and facilitate the launch of differentiated services in new businesses.
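
A minimal sketch of such a closed feedback loop (in Python; the metric source, the 90% utilisation threshold and the offload action are all invented for illustration): analysis output is pushed straight into an operational action instead of landing in a report.

```python
import time

def analyse(metrics: dict) -> list:
    """Toy analysis step: flag cells whose utilisation crosses a threshold."""
    return [cell for cell, load in metrics.items() if load > 0.9]

def act(congested_cells: list) -> None:
    """Toy operational action: trigger traffic offload for each flagged cell."""
    for cell in congested_cells:
        print(f"triggering traffic offload on {cell}")

def closed_loop(fetch_metrics, interval_s: float = 60.0, cycles: int = 3) -> None:
    """Insight flows into action within the data's shelf life -- no idle reports."""
    for _ in range(cycles):
        act(analyse(fetch_metrics()))
        time.sleep(interval_s)

# Example run with a canned metrics source
closed_loop(lambda: {"cell-17": 0.95, "cell-04": 0.42}, interval_s=0.1)
```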

Data has been collected, managed and applied for many years now. The shift that is emerging is driven by a complex multi-dimensional change in underlying network technologies, early introduction of automated workflows (e.g. policy control & enforcement, self-optimizing networks, etc) and customer behavior fundamentals.

  • The volume of data has increased many-fold. Existing systems and solutions may not be able to handle the ever-increasing scale of data (terabytes, records, transactions, tables/fields), and a re-look at data architectures may be required.
  • New sources and types of data are being added. These range from unstructured data from social platforms, application marketplaces and mobile devices, to semi-structured data from M2M.
  • Sources of data are spread across different organizational functions and many different systems. Data needs to be culled from large networks, internal corporate operations and users of all types.
  • Data needs to be extracted in real-time from network nodes and user devices. Siloed transactional measurements are no longer sufficient, and continuous management is imperative. Re-use of existing tools and systems is a must for practical implementations.
  • The same data is relevant for multiple businesses and functions. Operating efficiency is critical and is achieved by building ‘collect once, use many times’ architectures.
  • There is a growing need to combine in-service and out-of-service data to develop dynamic correlations, i.e. associating real-time & stream data with static data – customer data from CRM/BI systems, operations data from fault & performance tools and policy control frameworks, or network data from the HLR, EIR etc – to derive actual use-cases (a small sketch of such a join follows this list).
  • Managing data goes beyond data aggregation or even an analysis of usage patterns. It requires seamless integration with existing operating & business systems. And most importantly, there is a need for tight integration with the different businesses to ensure insights are actually used! Metrics & KPIs, use-cases and business applications need to be driven & owned by the business teams for full integration into the product life-cycle and pre-emptive & predictive operations.
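
As a small sketch of that stream-to-static correlation (hypothetical fields of my own, not a real CRM schema): enrich a real-time usage event with the subscriber’s static profile before anything acts on it.

```python
# Toy join of real-time stream events with static CRM data.
CRM = {  # static profile data, refreshed infrequently
    "sub-42": {"segment": "premium", "tenure_years": 6},
}

def enrich(event: dict) -> dict:
    """Associate a real-time usage event with the static CRM profile."""
    profile = CRM.get(event["subscriber"], {})
    return {**event, **profile}

event = {"subscriber": "sub-42", "cell": "cell-17", "mb_used": 320}
print(enrich(event))
# {'subscriber': 'sub-42', 'cell': 'cell-17', 'mb_used': 320,
#  'segment': 'premium', 'tenure_years': 6}
```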

These shifts can no longer be ignored; this is no longer a future vision but a hard-core reality for survival. But the key point to note is that the shifts have far-reaching impact and go beyond big data to also trigger a change in all operator functions (IT, Networks, Planning, Operations, Care, Marketing, etc). This is where the challenge lies. An effective strategy is needed to define a holistic implementation approach that goes beyond organizational boundaries. We see the need to factor in 4 goals to cover all the critical success factors.

Goal 1: Affordable Management of data at extreme scale

  • Evaluate existing systems/tools against big data platforms on parameters such as volume, velocity, variety & variability, and parallel processing/distributed architectures
  • Consider new requirements like distributed compute-first (as against storage-first) architectures, distributed file-systems, parallel architectures, complex event processing, and high-performance query architectures based on in-memory designs for analytics-driven cloud operations
  • Design of hybrid data architectures built on multiple data platforms and technologies to support different needs of different business applications

Goal 2: Optimal re-use strategy without compromising on architecture expansion

  • Audit data & systems to define the re-use approach
  • Reuse peripheral infrastructure (tools/systems) but evolve the core data architecture
  • Define data sources & types to avoid duplication, and rationalize systems & tools across functions
  • Business-driven definition of KPIs, metrics and analytics
  • Identify high-impact business applications to define prioritized use-cases and accelerate analytical applications

Goal 3: Seamless Integration into Operations

  • Automation & feedback cycle into operating & business systems and process flows to ensure tangible business value
  • Convert insight into real-time, preemptive & predictive actions through alignment with the businesses

Goal 4: Continuous Operation as a managed service

  • Handling the complexity & diversity of multiple services and managing multiple organizational function interfaces
  • Visualization through customized dashboards and reports

These goals overlap at many points, and the implementation priority and planning can be determined by the business approach and the decision to adopt a disruptive or a simpler adaptive strategy – driven by the initial risk & investment appetite. While a disruptive approach will deliver a high market impact, it will also require greater organizational alignment and upfront investment. It will establish the full blueprint and implementation plan and result in end-to-end innovation through a fully integrated approach built over a big data solution and automated action loops. The alternative is the adaptive approach, which has an edge due to its lower initial risk, besides providing early feedback that can be fed back into the fully integrated approach. This route would develop a high-level blueprint, identify 2-3 selected pilots in identified functional areas, and expand to other areas later. The cost of execution will largely depend on the approach preferred, the implementation timelines established and the business goals defined.

Most initial implementations will fall under the adaptive approach and slowly evolve to put in place an effective roadmap for Intelligence-driven Operations.

However, the next big challenge is to identify the right business use-cases that actually have an impact on the business and the user. The need of the day is to go beyond the standard, much-talked-of use cases (customer segmentation, application analytics, content analytics, network optimization, performance, predictive & preemptive technical problems etc.) and come up with something innovative (such as bridging the consumption gap, turning customer service data into a new revenue stream or…) – the possibilities are limited only by our imagination.

This article was published on Aricent Connect on 27 September 2012: http://blog.aricent.com/blog/opportunity-not-about-big-data-intelligence-driven-operations

Leading Innovation – Making Ideas Happen

 

Success of Innovation is generally associated with Ideas. There is little doubt that ideas are core to innovation, but we have also seen time and again that many good ideas and innovations lose their way – in fact, only a small fraction eventually end up living up to the original promise.

Innovation is the talk of the industry. It is probably the most overused term in business today – and yet everyone has their own interpretation of what innovation means and how it can be introduced. The focus is mostly on Idea Generation – some emphasize systemizing innovation processes, while others look to inculcate innovation through specialized focus groups. Innovation is rarely measurable, and disappointingly it seems to have become an end in itself – it is no longer about transforming businesses, changing lives or creating a new world. Innovation seems to be losing its meaning.

In reality, innovation is not just about ideas – an idea is just the beginning of the innovation process, and the real work starts after that. This is not to say that defining the idea is not important – it is at the core of the innovation process, and the relevance of innovation is tied to its response to real problems and pain points. However, while the potential of the idea needs to be deterministic and fully measurable, the path and the journey itself in most cases of true innovation will be unknown. The difference between success and failure usually lies in how this unknown and uncharted journey is undertaken – how the idea is translated into a goal and (more importantly) the commitment that is made to take it forward.

And this is where leadership comes in and leaves its mark. For innovation to drive achievement, it cannot simply be built around creativity; it has to be realized through careful planning, painstaking execution, constant vigilance, periodic adjustments and diligent pursuit. All examples of successful innovation have been driven by strong leaders who were instrumental in shaping the many actions taken along the path to success. There is no better example than Steve Jobs – the ideas themselves were not his own, nor were they new inventions – it was the journey that made all the difference!

There is no doubt that the basic qualities of a good leader – i.e. clarity of goal, adaptability to change, swift & decisive action, calculated risk taking and the courage to override obstacles – are still important. But these form only the minimum requirements to drive innovation, and are not necessarily sufficient to ensure successful impact. Leading innovation requires additional capabilities that help traverse the lifecycle of change.

(1) Passion to make a difference – the inspiration to leave a lasting impact.  

(2) Perseverance & Courage to pursue, to keep going despite all odds & contradictions, to always look for alternatives.

(3) Conviction to challenge conventional wisdom, manage inertia and resistance to change.

(4) Focus to drive towards the goal – ignoring newer attractions and distractions, staying on course, constantly improving usability, simplifying the experience and using technology as a means and not as the goal.

(5) Pursuit of Perfection – willingness to admit mistakes, change course even late in the game and striving always for the best possible.

We just need to look around to see how many innovations have been lost on the way – some have been stopped, others modified beyond recognition, while the majority have simply been replaced by newer initiatives. In most of these instances it has rarely been the idea at fault, seldom a scarcity of resources – and almost always a lack of conviction or a faltering commitment.

There is no doubt that success in innovation is far more influenced by leadership than any other element – even more than creative ideas, smart resources or unconventional out-of-the-box thinking!

This article was written on February 09, 2012.

Driving the Connected Society

 

Often overlooked by the end-user, the connected society has truly been built upon the advances in technology & business processes achieved by the semiconductor industry.

Everything today is being driven by, and built for, the connected society. We all talk of the impact of Apple, Facebook, Twitter and Amazon, and the emergence of new eco-systems. These new faces of connected services have re-kindled the vision of connecting many more of the devices that are part of our lives, such as televisions, appliances and cars; more recently we see renewed interest in expanding into new solutions in Smart Energy, Mobile Healthcare etc.

Many talk of this as a new vision. And yet, it has been around for many years now. What is often overlooked is that the recent optimism is truly driven by the advances in underlying technologies over the last several years. In fact, it is not just the availability of new technology or the continuous improvements in size and processing capability, but the drastic reduction in price-points achieved by the semiconductor industry that has opened up the reach for mass-scale adoption.

I have been tracking the M2M world for more than 5 years now – I have seen many technology advances that have led to the current point of inflection: from the changed world of microprocessors with their ever-increasing processing power and capability, to the re-incarnation of memory with flash memory & in-memory computing; the validation of wireless connectivity and the viability of embedded sensor technology; besides the new thresholds achieved in video resolution, imaging and voice quality, and many more.

It is true that each of these technologies has opened up new possibilities. But I believe that the tipping point was reached through the power of combination – created by harnessing several of these individual technologies together into new applications. This led to a new level of innovation delivering far-reaching impact on user experience, operational costs and new business services. It is even more fascinating that the impact spans verticals, businesses and industries.

Apple provides a good example of this seamless integration of multiple technologies – from rich media and optimized browsing experiences, to new user interfaces like touch, gesture and speech recognition, to the extension of storage into seamless cloud repositories and the management of media across devices.

Apple is of course also largely responsible for relegating technology to behind the scenes – they have focused on user experience and combined smart designs with streamlined processes to create easy-to-use touch points, thus hiding the technology and its complexity from the user.

Hidden or not, these technology advances will need to continue, and maybe even gather more momentum. If the prediction that connected devices will reach 15 billion over the next few years comes true, the spurt in data traffic and its implications for processing capability can barely be imagined today. We will need all the power that we can get.

And it is re-assuring to hear (from industry leaders like Intel) that next-generation chip technologies will continue to maintain the pace demanded by Moore’s Law, with the number of transistors doubling every 2 years. It is this power that will drive the connected world – and we should give it its due credit!
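
As a rough worked illustration of what that pace implies (my arithmetic, assuming the 2-year doubling holds):

```latex
% Transistor count under a 2-year doubling cadence:
N(t) = N_0 \cdot 2^{t/2}
% so over a decade:
N(10) = N_0 \cdot 2^{10/2} = 32\,N_0
% i.e. roughly a 32x increase in transistor counts in 10 years.
```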

This article was written on 20 October 2011.