Data Gateways: Modern companies rely on data analysis systems equipped with AI components for decision-making. The more these tools intervene in management processes, the more their demands on the quality of the source data resemble those of human decision-makers. Data gateways are ideally suited to assist them in preparing this information.
A Google search for “data analysis” and “assistance” quickly turns up numerous job postings. One financial service provider, for example, which supports its customers in the “analysis, strategy definition, negotiation, decision-making, communication/implementation and monitoring” of business operations, is looking for someone to handle the “systematic recording and analysis of annual financial statements, business evaluations, restructuring concepts and the preparation of balance sheet and cash flow planning” and to carry out “industry-related research.”
Analysis Requires Processing
Pay attention to the points “recording” and “research.” What is required here is the arduous task of consolidating disparate source data and gathering additional information that places the results in a meaningful comparative context. The managers at both the customer and the service provider want a basis for decision-making that they can understand, that enables them to gauge their position relative to competitors and customers, and that delivers results they can quickly and comprehensibly communicate to other stakeholders.
Artificial intelligence that supports management in business decisions places similarly high demands on its input information as the employer from the example above places on its source material and its preparation. If, in the extreme case, an AI taking on such tasks were fed nothing but numbers, nothing but trends in the form of numbers would come out at the end, and the task of concrete interpretation and communicative processing would once again fall entirely to the human decision-makers.
Science fiction fans may rightly be reminded of Douglas Adams’ novel “The Hitchhiker’s Guide to the Galaxy,” in which the largest supercomputer in the universe, after millions of years of computing, answers the great question of “life, the universe and everything” with “42” – and leaves the perplexed protagonists alone with this answer.
Brought back to today’s everyday production processes, this might look as follows: a production plant records in its log files, purely on the basis of EAN numbers, which components it has completed over a certain period and in what quantity. At the same time, a merchandise management system provides data on the raw materials purchased and the electricity consumed, and the internal accounting system supplies information on staff workload. The first two systems at least write rudimentary context information – for the raw materials, for example, the suppliers – while the third is, if anything, too informative: by default it garnishes the team-utilisation figures with personal data that, due to data protection regulations, has no place in calculations for business forecasts.
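To make this starting position concrete, the following sketch shows what records from the three systems might look like. The field names and values are purely illustrative assumptions, not taken from any real system:

```python
# Illustrative only: hypothetical records from the three source systems.

production_log = {
    "ean": "4006381333931",        # the only identifier the plant writes
    "period": "2023-Q1",
    "units_completed": 1240,
}

merchandise_mgmt = {
    "ean": "4006381333931",
    "raw_materials": ["steel sheet", "copper wire"],
    "supplier": "ACME Metals",     # rudimentary context information
    "electricity_kwh": 5830,
}

accounting = {
    "team": "assembly-3",
    "utilisation_pct": 87,
    "employee_id": "E-1042",       # personal data: must not reach the forecast
    "employee_name": "J. Doe",
}
```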
Without Context, AI Comes To Nothing
A typical requirement in this environment could be to derive from the data described whether the company’s personnel planning, purchasing quantities, supplier selection, and contracts with the energy providers are future-proof. However, for analysis software to be programmed in such a way that it delivers the desired results directly in a dashboard – quite apart from the formal normalisation of the data and the setup of the communication channels – the data must be enriched at one crucial point and thinned out at another: the EAN numbers from the production line should be supplemented with information on which raw materials are required for which component and how many screwing or welding operations are necessary for final assembly, while all information that could allow assignment to individual employees should be removed from the personnel-utilisation data (a sketch of both steps follows below).
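A minimal sketch of these two steps, assuming a hypothetical bill-of-materials table keyed by EAN number; the function and field names are invented for illustration:

```python
# Hypothetical bill-of-materials lookup, keyed by EAN number.
BOM = {
    "4006381333931": {
        "raw_materials": ["steel sheet", "copper wire"],
        "screwing_ops": 12,
        "welding_ops": 4,
    },
}

# Assumed names of the fields that could identify an employee.
PERSONAL_FIELDS = {"employee_id", "employee_name"}

def enrich(event: dict) -> dict:
    """Add the context that the production log itself does not carry."""
    context = BOM.get(event.get("ean"), {})
    return {**event, **context}

def anonymise(event: dict) -> dict:
    """Remove every field that could be traced back to an individual."""
    return {k: v for k, v in event.items() if k not in PERSONAL_FIELDS}
```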
This could be achieved by adapting the source systems themselves, but that would burden the assets concerned with additional, unfamiliar tasks. Moreover, further evaluation interests could quickly come into play, each entailing the implementation of yet another one-off interface – until, in the end, there is an unmanageable number of point-to-point relationships requiring several agents running in parallel on every device involved.
Tireless Transformation
The problem can be solved far more flexibly and efficiently if an organisation provides its highly developed analysis solutions and AI instances with dedicated, specialised assistance similar to that available to human analysts. A “data gateway” is ideally suited to this role. The data gateway from the manufacturer Cribl, for example, could cover the requirements with the following functions:
- Enrichment: During data transfers, the gateway can add exactly the information that the source system does not supply but the target system needs. Enrichment of this kind is also crucial in the security field, where a report of a possible attack on a system must be quickly linked to information on that system’s risk context and potential vulnerabilities.
- Reduction Of The Data Volume: Not only the addition but also the removal of data is possible. This reduces the volume of data transferred within the company and can increase the processing speed of all analysis projects.
- Central And Event-Based Routing: The user can set up rules according to which the data gateway dynamically changes the destinations of a data transfer depending on the content or other characteristics of the information supplied, feeding software such as Splunk, Elasticsearch, InfluxDB and security-monitoring solutions alternately or simultaneously (see the routing sketch after this list). Existing agents (Splunk Universal Forwarder, Beats agents, and so on) can continue to be used for as long as they are still needed, so the data gateway can be introduced step by step or with a limited scope.
- Encryption: A data gateway can also meet the requirement of standards such as the GDPR or PCI DSS (the security standard of the payment card industry) mentioned in the example: making personal data accessible only where a legitimate interest in it can be proven. It then encrypts the information on the fly, preserving it for any forensic investigations that may become necessary while ensuring access is granted only in suspicious cases (see the encryption sketch after this list).
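How such content-based routing rules might look is sketched below in Python. This is a generic illustration, not Cribl’s actual rule syntax; the destination names are placeholders:

```python
# Generic sketch of event-based routing; not the syntax of any real gateway.
# Each rule pairs a predicate with a list of destinations. An event may match
# several rules and is then delivered to several systems simultaneously.

ROUTES = [
    (lambda e: e.get("severity") == "critical", ["security_monitoring"]),
    (lambda e: "electricity_kwh" in e,          ["influxdb"]),
    (lambda e: True,                            ["splunk", "elasticsearch"]),
]

def route(event: dict) -> list[str]:
    """Return every destination whose rule matches the event."""
    destinations: list[str] = []
    for predicate, targets in ROUTES:
        if predicate(event):
            destinations.extend(targets)
    return destinations

print(route({"severity": "critical", "electricity_kwh": 5830}))
# -> ['security_monitoring', 'influxdb', 'splunk', 'elasticsearch']
```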
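The on-the-fly encryption of personal fields could look roughly as follows; the choice of the Python cryptography library and the field names are assumptions made for this sketch:

```python
from cryptography.fernet import Fernet

# The key is held back for authorised use only, e.g. forensic investigations;
# everyday analysis consumers never receive it.
KEY = Fernet.generate_key()
fernet = Fernet(KEY)

PERSONAL_FIELDS = {"employee_id", "employee_name"}  # assumed field names

def encrypt_personal(event: dict) -> dict:
    """Encrypt personal fields in transit: unreadable for routine analysis,
    but recoverable with the guarded key if an investigation requires it."""
    return {
        k: fernet.encrypt(str(v).encode()).decode() if k in PERSONAL_FIELDS else v
        for k, v in event.items()
    }

def reveal(token: str) -> str:
    """Forensic access: decrypt a single field value with the guarded key."""
    return fernet.decrypt(token.encode()).decode()
```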
Conclusion
AI accelerates data analysis in heavily digitised and automated business processes. With a data gateway at its side as a fast, tireless assistant, the accuracy of the analyses and the significance of the results increase – and users gain the ability to implement new analysis ideas without great effort.