Combining process excellence with data analytics to gain an edge
For companies embarking on business transformation, new technologies are offering endless opportunities to improve efficiency, effectiveness and more. Arjit Agarwal and Utkarsh Sharma, leaders at Nexdigm, outline how one such technology – advanced data analytics – can be combined with process excellence to gain a competitive edge.
The emergence of high-speed internet, digital technologies and real-time decision-making has irrevocably changed how business is done. These revolutionary developments have leveled the playing field in most industries, made them far more customer-centric than in past decades, and significantly narrowed the gaps that separate one company from the next.
In an era where every advantage is a significant one, business transformation has become the goal to not only keep pace with the times, but also stay ahead of the competition.
One way to stay ahead is by consistently upgrading an organization via process excellence – powered by data analytics.
When insights from data analytics are used effectively, they trigger a company’s evolution towards the pinnacle of its potential.
Process excellence in its truest form is not possible without data analytics – that is, without the ability to define a company’s strengths and weaknesses through the power of numbers: pure, raw data that surpasses the conjecture of “best guesses” and “gut instincts” and breaks down every process in every business unit so it can be reshaped for maximum efficiency.
Understanding process excellence
Process excellence is not just another buzzword that C-suite executives bandy around when writing their yearly memorandums for the company newsletter. It is the grand plan by which a company operates to become as effective and efficient as possible through dedicated operations such as testing and design.
The aim is to improve processes so that they deliver positive outcomes consistently, with a bare minimum of variation in results and of waste – whether of man-hours, resources, or anything else. When process excellence is achieved, it allows companies to make the best decisions possible about how to shape current and future plans.
A pivotal point to understand about process excellence is that it is not a one-time exercise that can be set in motion and then forgotten about for the next decade. It is a continual effort that requires regular check-ins and updates to keep everything functioning at the highest level possible. This is known as continuous improvement, and it is the last step of the cycle that comprises process excellence.
Think of it the way you would envision the development of a product. When Henry Ford designed his Model T and released it for sale in 1908, he didn’t simply wipe his hands clean and declare that the pinnacle of automobiles had been reached. Even as the car dominated the industry, he was off and running on improvements for new models within a year’s time.
What, then, are the components of process excellence?
An excellent first step is process mapping, which creates a workflow visualization of a process, detailing the steps taken to move it from beginning to end. This is essentially a flowchart that follows part of an organization’s practices from end to end, showing which business units are involved, which employees play roles, what technology is incorporated, and so on. Often, simply committing a flowchart to paper is enough to reveal spots where improvements can be made.
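A process map need not live only on paper; it can also be expressed as data, so it can be queried and analyzed programmatically. The sketch below shows one simple way of doing this in Python, using a hypothetical order-handling flow – the step names and owning units are illustrative assumptions, not a prescribed structure.

```python
# A minimal sketch of a process map expressed as data: each step records
# the business unit that owns it and the step(s) that follow it.
# All steps and units below are hypothetical illustrations.

process_map = {
    # step: (owning unit, next steps)
    "Receive order": ("Sales",       ["Check stock"]),
    "Check stock":   ("Warehouse",   ["Pick items", "Backorder"]),
    "Pick items":    ("Warehouse",   ["Ship"]),
    "Backorder":     ("Procurement", ["Pick items"]),
    "Ship":          ("Logistics",   []),
}

def walk(step, indent=0):
    """Print the flow from a given step, one branch per line."""
    unit, next_steps = process_map[step]
    print(" " * indent + f"{step}  [{unit}]")
    for nxt in next_steps:
        walk(nxt, indent + 2)

walk("Receive order")
```

Even at this toy scale, laying out the flow makes hand-offs between units visible at a glance – which is precisely where improvement opportunities tend to hide.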
Visual representation is an underrated tool. Process mapping is used in conjunction with process analysis (typically a more technical, systematic review of all the steps and procedures that comprise a given activity). The general mindset is to document how input is transformed into the desired output and to reduce the resources and redundancies used to reach that output.
In the same vein as process mapping, process analysis can sometimes reveal places where time is being lost unnecessarily or where two business units are doing the work of one. An important aspect of process analysis is value-stream mapping, which considers the current state of the business and designs a possible future state for the chain of events that will move a product from its first stages of development to its end-use stage for customers.
It is a very visual process that measures not only the steps necessary, but also the time each step takes and the amount of work involved. A value-stream map captures information as it moves from step to step as well as the materials being used.
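To illustrate the arithmetic behind a value-stream map, the sketch below totals the value-adding cycle time and non-value-adding wait time for a hypothetical order-fulfilment stream; the steps and durations are illustrative assumptions rather than figures from any real operation.

```python
# A minimal sketch of value-stream arithmetic: for each step we record
# the value-adding cycle time and the wait time that precedes it.
# All step names and durations below are hypothetical illustrations.

steps = [
    # (step name, cycle time in hours, wait time before the step in hours)
    ("Order entry",        0.5,  4.0),
    ("Credit check",       0.25, 8.0),
    ("Assembly",           6.0,  24.0),
    ("Quality inspection", 1.0,  2.0),
    ("Packing & dispatch", 1.5,  12.0),
]

cycle_time = sum(c for _, c, _ in steps)   # total value-adding time
wait_time  = sum(w for _, _, w in steps)   # total non-value-adding time
lead_time  = cycle_time + wait_time        # end-to-end time the customer waits

print(f"Lead time:         {lead_time:.1f} h")
print(f"Value-adding time: {cycle_time:.1f} h")
print(f"Value-added ratio: {cycle_time / lead_time:.1%}")
```

The value-added ratio is often the most sobering output of such an exercise, showing how little of the total lead time is actually spent creating value.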
When points are identified where things can be improved, companies start the journey with process optimization, adjusting the workflow of processes to take the straightest path from Point A to Point B without compromising quality or increasing cost drastically. Common optimization goals include forecasting changes, improving communication, streamlining workflows, and eliminating redundancies.
When successful, the optimization exercise enables businesses to reduce their risks, become more consistent, gain end-to-end visibility, and stay compliant with market requirements.
Of course, this optimization exercise is not as easy as snapping one’s fingers. It can take significant testing and usually multiple iterations to achieve, and seldom is it the result of a single small change. If the earlier steps reveal fracture lines running down to the core, process re-engineering may be necessary – rebuilding the processes from the ground up so they deliver more value to customers.
This can be a major investment in restructuring multiple business units to conform with the new approach. The concept of process redesign typically involves upgrading the physical tools – software, hardware, machinery, technology – to achieve great leaps forward in how a company measures its performance, such as via return on investment, quality of service, and cost reduction.
A mixture of these different elements will combine to allow the company to engage in continuous improvement, keeping itself on a perpetual ascent rather than a series of peaks and valleys based on outside forces.
The power of data analytics
In recent years, there has been a meteoric rise in the role of data analytics in business planning. Thanks to rapid advancements in how data is collected, sorted, and analyzed – driven largely by the development of the artificial intelligence (AI) sub-field known as machine learning (ML) – companies are able to gather enormous volumes of data (commonly known as big data) and turn them into actionable insights in a very short amount of time.
The ability to collect data from a vast array of sources has allowed companies to advance from making decisions based on previous years of business to doing so based on data collected in near real time, online and through other channels.
Fresh data analysis can lead to more relevant key performance indicators (KPIs), the first component of data analytics. KPIs formalize the goals, milestones, and insights a company puts in place to see and understand how much progress is being made towards the specific objectives that business leaders have identified as key to its success. With reliable data behind them, these KPIs can far more sharply define a company’s performance over time versus expectations.
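To make this concrete, the short sketch below shows one way KPI attainment against targets might be computed; the KPI names, targets, and actual figures are hypothetical illustrations rather than recommended metrics.

```python
# A minimal sketch of KPI tracking: comparing actuals against targets.
# All KPI names and figures below are hypothetical illustrations.

kpis = {
    # name: (target, actual)
    "On-time delivery rate": (0.95, 0.91),
    "First-pass yield":      (0.98, 0.99),
    "Cost per order (USD)":  (12.0, 13.4),  # lower is better
}

lower_is_better = {"Cost per order (USD)"}

for name, (target, actual) in kpis.items():
    # Invert the ratio for metrics where a lower value is the goal.
    attainment = target / actual if name in lower_is_better else actual / target
    status = "on track" if attainment >= 1.0 else "needs attention"
    print(f"{name}: {attainment:.0%} of target ({status})")
```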
One of the most successful applications of data analytics is the use of machine learning to parse giant swathes of data and pull out, in minutes or hours, patterns and trends that might take human data practitioners weeks or months to deduce – or quite possibly never find at all. Identifying these trends allows humans and AI alike to turn data-driven insights into strategies for future success.
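As an illustration, the sketch below uses one common machine-learning technique, anomaly detection, to flag unusual process runs. It assumes the open-source scikit-learn library is available, and the cycle-time data is synthetic rather than drawn from any real operation.

```python
# A minimal sketch of machine-learning pattern detection using anomaly
# detection, assuming scikit-learn is installed. It flags process runs
# whose cycle time and rework volume look unusual. The data is synthetic.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)
# 500 normal runs (~2 h cycle time, ~10 units of rework) plus 5 outliers.
normal   = rng.normal(loc=[2.0, 10.0], scale=[0.2, 2.0], size=(500, 2))
outliers = rng.normal(loc=[5.0, 40.0], scale=[0.5, 5.0], size=(5, 2))
runs = np.vstack([normal, outliers])

model = IsolationForest(contamination=0.01, random_state=0).fit(runs)
flags = model.predict(runs)  # -1 marks a run worth investigating

print(f"Flagged {np.sum(flags == -1)} of {len(runs)} runs for review")
```

In a real setting, the inputs would be the company’s own operational records, and the flagged runs would feed directly into the process-analysis work described above.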
Data-driven insights are the second critical component of data analytics. While they are not quite the equivalent of predicting the future, they give companies a much better shot at using what has come before to make plans for what lies ahead.
The third component of data analysis involves scenario planning – building organizational awareness of what could happen in the future. By visualizing both possible risks and potential opportunities, companies can be better prepared for what might be coming down the road and react to it more quickly than the competition, which can lead to tremendous advantages and deliver enormous returns in both the short and long terms.
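Even a very simple model can support this kind of thinking. The sketch below projects next year’s revenue under three hypothetical growth scenarios; the figures and scenario names are illustrations, not benchmarks.

```python
# A minimal sketch of scenario planning: projecting next year's revenue
# under best-, base-, and worst-case growth assumptions.
# All figures below are hypothetical illustrations.

current_revenue = 10_000_000  # USD

scenarios = {
    "Best case (new market opens)":  0.15,
    "Base case (trend continues)":   0.06,
    "Worst case (key client lost)": -0.08,
}

for name, growth in scenarios.items():
    projected = current_revenue * (1 + growth)
    print(f"{name}: ${projected:,.0f}")
```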
A fourth valuable task is forecasting – perhaps the oldest and most common use of data – which takes raw data, analyzes trends, and predicts future values based on those trends. Forecasting is used to determine everything from what the weather will look like next week to how stock prices will rise or fall tomorrow. In process excellence, forecasting can provide basic extrapolation for figures such as units sold, customer turnover, and the cost of raw materials.
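In its simplest form, that extrapolation can be a straight-line trend fitted to historical figures, as in the sketch below; the monthly sales numbers are hypothetical illustrations.

```python
# A minimal sketch of trend-based forecasting: fit a straight line to
# past monthly unit sales and extrapolate the next quarter.
# The sales figures below are hypothetical illustrations.

import numpy as np

months = np.arange(12)  # the last 12 months, numbered 0..11
units_sold = np.array([310, 325, 318, 340, 352, 349,
                       365, 378, 371, 390, 402, 398])

slope, intercept = np.polyfit(months, units_sold, deg=1)  # linear trend

for m in range(12, 15):  # the next three months
    forecast = slope * m + intercept
    print(f"Month {m + 1}: ~{forecast:.0f} units")
```

Real forecasting practice layers seasonality, external drivers, and uncertainty estimates on top of this, but the underlying idea – fit a trend, then project it forward – remains the same.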
The best of both worlds
While data analytics and process excellence can be siloed and still be successful, combining resources from these two activities can provide a firm with a sustainable competitive advantage – one that percolates across all business units and becomes part of the company’s life cycle.
The power of data has grown enormously in just the past decade, turning forecasting from a process that became stale after a few months into a vibrant, ultra-specific practice that can function in close to real time, given the right parameters. Data insights are not just for customer interaction and satisfaction: they can reveal inefficiencies inside company workflows and show how process innovations can improve the speed of work, raise the value of workers, and eliminate bottlenecks and downtime altogether.
Even the keenest human observer in the process analysis and process mapping stages can have their knowledge augmented by real-time data that shows the strengths and weaknesses of how every product or service of a business is calculated, constructed, and delivered.
Harnessing data to set future KPIs – and then collecting against them – means not having to wait for the next month or quarter to see what improvements have been realized. By using the right software to collect and analyze the right data points, organizations can put real numbers behind the concept of continuous improvement, gaining near real-time insight into what is working to improve in-house efficiencies and what needs to be further refined, re-engineered, or scrapped entirely.
The pairing of these powerful domains has virtually unlimited possibilities.