14 July 2022
Business success depends to a large extent on the speed and accuracy of decision-making. We can observe this especially nowadays. First the COVID-19 pandemic and now the war in Ukraine have forced companies' management boards to move away from the usual patterns and look for solutions to new problems. These include challenges related to supply chain continuity, workforce shortages (first due to illness and quarantine periods, later the departure of Ukrainian employees), economic instability, and restrictions on the export and import of certain goods.
In such challenging conditions, the companies that know how to make better use of the available opportunities and protect themselves more effectively against inflation and the market's vagaries will gain an advantage over the rest. The pace of change is so fast that some decisions, such as the purchase of parts for stock, need to be made within hours rather than weeks or months. For that reason, the importance of collecting, evaluating and interpreting data about your company, its business environment and global markets keeps growing. Valuable information can only be derived from reliable data. Chaos, reliance on uncertain sources and erroneous data selection can lead to serious mismanagement.
From data to information
Data is not yet information. Some data in the incoming stream clearly carries usable information (a zeroed product stock, for example), but this is just one of many possibilities. Some of the 'raw' data is irrelevant and, for practical reasons, should be filtered out as quickly as possible and as close to where it is recorded as possible. Other data can be a source of valuable information if we are able to evaluate and interpret it correctly.
How do you manage the flood?
Business and technology processes can generate very large data streams. The extreme approach is to gather everything. Such a vast collection of data can then be analyzed by looking for non-obvious links and correlations between events, environmental conditions and market situations. This approach has many advantages but is technically complex and costly. It requires investment in IT infrastructure and continuous improvement of data analysis software.
A leaner approach is to pre-select data and process only the elements that carry information about a change. If the number of stored packages of a certain item does not change, there is no need to transfer that data. What matters is the confirmation of a delivery, the issue of goods, or an inventory count.
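The idea of transferring only changes can be sketched as a simple delta filter: compare the current stock snapshot with the previous one and emit events only for items whose quantity moved. This is a minimal illustration with made-up item names and quantities, not a description of any particular warehouse system.

```python
def stock_changes(previous, current):
    """Return only the items whose stored quantity changed between two snapshots."""
    events = []
    for item, qty in current.items():
        old_qty = previous.get(item)
        if old_qty != qty:
            # Only a delivery, an issue of goods, or a stock correction
            # produces an event worth transferring or storing.
            events.append({"item": item, "from": old_qty, "to": qty})
    return events

# Hypothetical snapshots taken an hour apart:
previous = {"bolt-M8": 1200, "bearing-6204": 40, "gasket-A2": 310}
current = {"bolt-M8": 1200, "bearing-6204": 0, "gasket-A2": 290}

for event in stock_changes(previous, current):
    print(event)  # unchanged items (bolt-M8) generate no traffic at all
```

Instead of streaming the full inventory every cycle, only the two changed positions cross the wire; a zeroed stock (bearing-6204) surfaces immediately as an actionable event.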
Separate the wheat from the chaff
Sources of data and information should be assessed in terms of reliability, the risk of misinformation and the risk of deliberate misrepresentation. Every source can be classified as automatic (e.g. an RFID reading in the warehouse), internal (dependent on the competence and reliability of the company's own employees) or external (consulting companies, analysts, financial and economic forecasts). The evaluation and selection of data and information must be continuous and based on the principle of limited trust. It cannot be ruled out that some source of information will "go off," e.g. because experienced analysts are replaced by newcomers.
In a flood of data, a company can be saved from sinking by experienced analysts who constantly improve their qualifications, by efficient IT tools based on artificial intelligence and machine learning, and by management insight. This may sound a bit enigmatic, but experience and knowledge may not be enough when you have to keep playing by new and unfamiliar rules. Maybe that's why Napoleon looked for commanders who… were lucky.