Windows Shortcuts: Malware & Spam Cyberattacks Are On The Rise

Varonis Threat Labs has recently noticed an increasing number of cyberattacks via malicious Windows shortcuts. Targeted attacks by Malware-as-a-Service provider Golden Chickens (also known as Venom Spider) and malspam campaigns by Emotet have been observed.

“These campaigns show once again that cybercriminals keep using proven tactics, even if they seem to have gone out of style long ago.” Shortcut files let users create a link to any file or folder, for example to build user-friendly Windows shortcuts in the Start menu. By default, a Windows shortcut takes on the icon of its target file type, marked with a small arrow.

However, it is easy to change this icon so that the target appears to be some other, seemingly legitimate file type. The malicious shortcut therefore looks like any other shortcut file familiar to the victim and uses legitimate utilities to launch an initial stager (the LOLBins, or living-off-the-land binaries, technique). “This fairly simple social engineering technique can trick victims into viewing malicious content. It also doesn’t require complex exploits or suspicious initial payloads.”

Windows Shortcuts: Countermeasures Against Cyberattacks

Since users generally view Windows shortcuts as benign, and given the similarity of the recently observed attacks, security officers should implement the following measures to mitigate these threats:

  • Scan email attachments and quarantine or block questionable content such as compressed files containing Windows shortcuts (.lnk files); a minimal detection sketch follows this list.
  • Prevent the execution of unexpected binaries and scripts from the %TEMP% directory.
  • Restrict user access to Windows scripting engines such as PowerShell and VBScript, and require script signing via Group Policy.
  • Watch for unexpected execution of legitimate LOLBins such as ie4uinit.exe and wmic.exe by ordinary users.
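
To illustrate the first measure, here is a minimal sketch in Python of the kind of check a mail gateway could run on compressed attachments. The file name is a hypothetical example, and a production scanner would also need to handle nested archives, other archive formats, and password-protected files.

```python
# Minimal sketch: flag ZIP email attachments that contain Windows
# shortcut (.lnk) files. "suspicious_attachment.zip" is a hypothetical
# example; a real mail gateway would hook this into its pipeline.
import zipfile

def contains_windows_shortcut(archive_path: str) -> bool:
    """Return True if any entry in the ZIP archive is a .lnk file."""
    with zipfile.ZipFile(archive_path) as archive:
        return any(name.lower().endswith(".lnk") for name in archive.namelist())

if __name__ == "__main__":
    if contains_windows_shortcut("suspicious_attachment.zip"):
        print("Quarantine: archive contains a Windows shortcut (.lnk) file")
```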

Since its founding in 2005, Varonis has taken a different approach from most IT security vendors, placing the company data stored on-premises and in the cloud at the center of the security strategy. The Varonis Data Security Platform (DSP) detects insider threats and cyberattacks by analyzing data, account activity, telemetry, and user behavior.

Credential Stuffing: Companies Lose Up To Nine Percent Of Their Sales

With the help of leaked password databases, cybercriminals repeatedly succeed in taking over user accounts, using highly automated tools; a single credential stuffing attack can claim thousands of victims. For companies in the online business, account takeovers via credential stuffing have become a financial business risk that can reduce sales by up to nine percent, according to a new study carried out by the strategy consultants Aberdeen on behalf of Nevis. For the study, Aberdeen focused on ten selected B2C categories in EMEA.

The study examined commercial banks, credit unions, savings institutions, fintech, and property and casualty insurance, as well as consumer electronics, healthcare provider networks, online gambling, telecommunications, and utilities. It shows how widespread credential stuffing attacks currently are: 76 percent of those surveyed said that some of their online users had been victims of successful account takeovers in the past 12 months.

Credential Stuffing: Cyberattacks Hurt Profitability

The investigation also makes clear the dramatic extent of the resulting damage. The costs of successful cyberattacks quickly add up to significant amounts that cannot simply be dismissed as an unavoidable “cost of doing business.” Commercial banks lose 3.4 to 5.28 percent of sales to credential stuffing; in the fintech sector, it is between 5.57 and 8.96 percent. Sectors outside the financial world are affected to a comparable extent: sales losses from illegal account takeovers amount to 4.45 to 5.79 percent for healthcare providers, and even in the strictly regulated, security-conscious gambling sector, losses range between 5.02 and 8.2 percent.

How Cybercriminals Exploit Hijacked Accounts

Once a user account has been taken over, the criminals can exploit it for various purposes. According to the Aberdeen study, fraudulent transactions (39 percent), creating new accounts (34 percent), and falsely declined card payments (34 percent) are the most common. Other typical consequences of account takeovers are chargebacks (18 percent), transfers of funds or other fungible assets (11 percent), fraudulent purchases (11 percent), and theft of digital content and services (11 percent). Beyond these direct consequences, there are also indirect ones, for example a decline in active users who are deterred by increased security measures or migrate to competitors.

Aberdeen also looked into how companies are trying to protect themselves against the growing number of credential stuffing attacks. The results show a gradual move away from both the username-password model and conventional multi-factor authentication solutions: mobile apps for multi-factor authentication, for example, are currently used by 42 percent of the companies surveyed, but only 24 percent support introducing them in the future. Instead, the respondents see strong potential for innovation in passwordless approaches, which are both user-friendly and cost-efficient for the providers. So far, only 20 percent have implemented passwordless (adaptive, contextual, transparent) methods, but 46 percent plan to do so in the future.

Credential Stuffing Remains Popular With Cybercriminals

Credential stuffing is currently an attractive method for attackers for the following three reasons:

  • First, the dark web makes it easy to obtain lists of credentials that have been made public through data breaches or hacks (a defensive check against such lists is sketched after this list).
  • Second, all business relationships based on digital accounts require digital credentials. Unless additional security measures have been taken, they are therefore vulnerable to automated attacks such as credential stuffing.
  • Third, the attacks are easy to automate: the perpetrators do not need programming knowledge but can rent the required tools on the darknet following the software-as-a-service principle.
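
One defensive answer to the first point is to check passwords against known breach corpora at registration or login. The sketch below is a non-authoritative Python example using the public Pwned Passwords range API from Have I Been Pwned; thanks to its k-anonymity design, only the first five characters of the SHA-1 hash are transmitted, never the password itself.

```python
# Check whether a password appears in known breach data via the
# Have I Been Pwned "range" API. Only the first five hex characters
# of the SHA-1 hash leave the machine (k-anonymity).
import hashlib
import urllib.request

def breach_count(password: str) -> int:
    """Return how often the password appears in the Pwned Passwords corpus."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as response:
        for line in response.read().decode().splitlines():
            candidate, _, count = line.partition(":")
            if candidate == suffix:
                return int(count)
    return 0

if __name__ == "__main__":
    print(breach_count("password123"))  # a deliberately weak example
```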

This lucrative business model is only likely to disappear once most companies switch their user accounts to secure processes such as multi-factor authentication and, in particular, passwordless authentication. The Nevis solution portfolio includes password-free logins that are intuitive to use and provide optimal protection for user data. Nevis is one of the market leaders for identity and access management in Switzerland and secures over 80 percent of all e-banking transactions.

Experience Management: 5 Steps To Transforming Business Processes

The constant change in the business world puts increasing pressure on companies to optimise their processes. Classic optimization, however, is only the basis for end-to-end process improvement in terms of cost, efficiency, and agility.

To remain competitive in the modern, digitised business world, companies should rely on experience management to complete end-to-end process optimization and improve the experience of customers, employees, and suppliers. The following five steps show how to implement an experience management strategy for the holistic transformation of business processes.

Create A Target/Actual Analysis

Before companies can expand their process optimization with information from experience management, an inventory is necessary. This is the only way to find out what the current processes look like, i.e., which ones work and which do not. Once this is done, a target state should be defined: what should the process landscape ultimately look like? The target/actual analysis shows, in a data-driven manner, where the company needs to act to create the conditions for business process transformation.

Experience Management: Understand The Stakeholders

Conventional process optimization largely ignores the wishes and experiences of customers and employees. In business process transformation, on the other hand, they are essential. To involve them, companies need information, which they can extract from surveys, for example. The Net Promoter Score (NPS) should also be considered: it is an essential indicator of whether customers would recommend a company or not. It is crucial at this point that all internal departments communicate openly with each other and share their data for planning the next steps.
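
The NPS itself is simple arithmetic: respondents answer the recommendation question on a 0-10 scale, scores of 9-10 count as promoters, 0-6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch:

```python
# Net Promoter Score: share of promoters (9-10) minus share of
# detractors (0-6) on the 0-10 "would you recommend us?" scale.
def net_promoter_score(ratings: list[int]) -> float:
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

print(net_promoter_score([10, 9, 8, 7, 6, 3]))  # 2 promoters, 2 detractors -> 0.0
```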

Identify The Most Critical Touchpoints

A precise understanding of where exactly customers come into contact with internal processes is crucial; this is the only way to identify the most critical touchpoints. When buying a new smartphone with a contract, for example, the customer goes to a website and completes the order. In the backend, however, far more complex processes take place: the provider has to request the device from the manufacturer, enter the customer’s data into the system, activate the phone number, and ensure that the hardware arrives at the customer’s door.

The many processes of which the customer is unaware often intertwine. So-called customer journey maps help identify the dependencies between internal processes and the external experience. They can be targeted at all stakeholders: customers, partners, suppliers, and employees. Based on the data collected, companies can take targeted measures that add real value and do not increase efficiency at the expense of the experience, or vice versa.

Experience Management: Define And Implement Measures

Identifying the process points where action is required is followed by defining and implementing the measures. With this potentially disruptive step, companies must not forget to get all employees affected by the change on board. Carrying out process optimization without considering the employees’ experience would be grossly negligent, since resistance to the changes could form. The planning of measures must therefore be transparent, and the stakeholders should have the opportunity to participate actively in their design; after all, they have to work according to the new rules at the end of the transformation process.

Evaluate The Process Transformation

Business process transformation is never a one-off exercise, because changes to a process can, despite all planning, remain unsuccessful or even raise new problems. It is therefore essential to keep evaluating the business processes even after the transformation. The Net Promoter Score is an excellent guide here, as is the full breadth of experience management. If something is not yet running optimally, companies should initiate the necessary steps again.

Tested In Practice: How Companies Can Halve The Costs For The Public Cloud

Public cloud spending hits new record highs every year, and cloud infrastructure makes up the lion’s share of many organizations’ IT budgets. Alexander Zschaler, Sales Manager Germany at Cloudera, explains how companies can cut unnecessary expenses and halve their costs.

Public cloud spending will reach a notable $500 billion globally in 2022 for the first time, a figure that illustrates how much companies are struggling with rising costs. This prompted the CEO of the cloud specialist Cloudera to set an ambitious goal for his own company: halving its annual public cloud costs of USD 25 million. To this end, a three-point plan was drawn up:

A First Overview

The main cost driver is waste. According to a 2021 survey, 82 percent of companies significantly overspend on the cloud, and 86 percent cannot maintain an overall view of these expenditures, Cloudera itself included. A complete overview of costs, however, is essential: it is the only way to develop a suitable strategy and take the proper measures. Given the complex requirements, Cloudera decided after an initial analysis to break away from external SaaS providers and develop its own automation solution in-house, based on CDP Data Warehouse.

NimbusWatch – Record And Allocate Costs Automatically

Cloudera’s new solution, NimbusWatch, ingests data directly from the public APIs of the big three cloud providers. This saves license costs and leads to faster, more reliable, and more detailed data collection. In addition, data from human resources and financial systems can be brought in to map the organization as a whole. The individual consumption items and the costs incurred are divided into the following categories (a toy allocation sketch follows the list):

  • Cloud account (Cloudera, for example, had 200 cloud accounts, most of which can be assigned to a cost center)
  • Object owners, who can be mapped to an organizational unit and thus to a cost center
  • Tags: a company-wide tagging process allows costs to be reassigned if necessary
  • Identified waste: dedicated dashboards track patterns in data usage and provide actionable information that helps those responsible start conversations or reach out directly to the right team to make changes and eliminate waste
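
As an illustration of the allocation step, here is a toy sketch. NimbusWatch’s actual data model is not public, so the record fields (account, tags, cost) and the account-to-cost-center mapping are assumptions:

```python
# Toy cost allocation: roll raw billing line items up to cost centers,
# preferring an explicit "cost_center" tag over the account mapping.
# Field names are assumptions; real billing exports differ per provider.
from collections import defaultdict

ACCOUNT_TO_COST_CENTER = {"acct-analytics": "CC-100", "acct-web": "CC-200"}

def allocate(line_items: list[dict]) -> dict[str, float]:
    totals: dict[str, float] = defaultdict(float)
    for item in line_items:
        center = (item.get("tags", {}).get("cost_center")
                  or ACCOUNT_TO_COST_CENTER.get(item["account"], "UNALLOCATED"))
        totals[center] += item["cost"]
    return dict(totals)

bill = [
    {"account": "acct-analytics", "cost": 120.0, "tags": {}},
    {"account": "acct-web", "cost": 80.0, "tags": {"cost_center": "CC-300"}},
    {"account": "acct-unknown", "cost": 5.0, "tags": {}},
]
print(allocate(bill))  # {'CC-100': 120.0, 'CC-300': 80.0, 'UNALLOCATED': 5.0}
```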

Prepare And Forward Analyses Clearly

The next step is transforming the collected and allocated data into measurable cost savings. To do this, the analyses have to be prepared and forwarded to the right competence center; only then can effective measures be derived from the data. For this purpose, NimbusWatch automatically sends weekly reports to the technical managers via e-mail. These show the evolution of cloud spending and waste and highlight potential for improvement or savings. Such reports help managers proactively manage costs and alert teams to expenses promptly, instead of reacting at the end of each month.
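
Reduced to its core, such a weekly report is a small aggregation over two billing periods. The following sketch is a hedged illustration, not Cloudera’s implementation, and the input format is assumed:

```python
# Toy weekly report: compare this week's spend per team with last
# week's and list the largest increases first. Input format assumed.
def weekly_deltas(last_week: dict[str, float], this_week: dict[str, float]) -> list[str]:
    def delta(team: str) -> float:
        return this_week[team] - last_week.get(team, 0.0)
    return [
        f"{team}: {this_week[team]:.2f} USD ({delta(team):+.2f} vs last week)"
        for team in sorted(this_week, key=delta, reverse=True)
    ]

print("\n".join(weekly_deltas({"data-eng": 900.0}, {"data-eng": 1200.0, "ml": 300.0})))
```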

The Goal Of Halving Costs Achieved

Despite cost pressure, companies cannot do without the advantages of the cloud. With NimbusWatch, the Cloudera team developed a suitable tool for this challenge: all public cloud spending can now be proactively monitored and managed. The strength of the solution lies in processing previously complex and distributed data into precise analyses and reports, enabling users to take responsibility for their own cost management through quickly actionable insights. In this way, the project team at Cloudera not only achieved its ambitious goal of halving the costs of its public cloud infrastructure but even exceeded it.

Advanced Analytics: Companies Introduce Data-Driven Decision-Making Processes

Advanced analytics and business intelligence are drivers of change in business processes, decision-making, and work culture. A new report by Reply shows that decision-makers should therefore engage with innovations in self-service BI, cloud, automation, and AI.

Reply has created its new trend report on transforming business intelligence and advanced analytics with its proprietary AI-supported trend platform Sonar. The platform analyzes over 50 million sources from recent years, such as international specialist media, scientific publications, and patent applications, to identify future trends. According to the research, self-service BI marks the shift towards integrated self-service analytics solutions. Further development in this area will be shaped by the trends Bot & Voice-Assisted BI, Intelligent Business Alerts, and Low/No-Code Development. These enable the creation of analyses and reports without programming knowledge and thus form the basis for data democratization: cross-departmental access to information at all hierarchical levels for data-based decisions.

Advanced Analytics Through Data Processing In The Cloud

In addition, cloud processing and performance are the keys to AI-based automation of process flows, which is needed to handle exponentially growing data volumes and the complex calculations behind advanced analytics. Shared computing resources in the cloud are more efficient and meet the rapidly growing demand for on-demand computing power with flexible terms. 5G technology offers a further increase in performance: it enables distributed and parallel processing by combining edge and cloud systems.

The added value of BI and advanced analytics for companies is significant: In combination with cloud computing, automation and AI, data can be converted into relevant insights and actions. They offer a fast route to business-critical insights, support automated decision-making, make increasing complexity manageable and bring transparency to business processes. Last but not least, they help to identify and predict market dynamics so that the performance of companies can be increased in the long term.

Intelligent Decisions Based On Analysis Of Information

“As we increasingly live in a world of hyper-automation, we can overcome limitations in the processing and analysis of data that result from increasing complexity. Automated and AI-based systems feed the results into output systems and allow intelligent decisions based on the analyzed information. Such systems help to develop sustainable perspectives and competitive advantages and are an indispensable part of the further development of companies.”

The trend report on BI and advanced analytics is part of a series covering AI and CX, logistics, autonomous things, new interfaces, automotive & mobility in e-commerce, and 5G. Reply develops and implements solutions based on new communication channels and digital media. As a network of specialized companies, Reply supports telecommunications and media, industry and services, banking and insurance, and public administration in defining and developing business models made possible by the new paradigms of AI, big data, cloud computing, digital media, and the Internet of Things.

Machine Learning: The Technology Has So Much Potential In Production

AllCloud’s new study “Machine Learning in Production” examines the potential of machine learning in manufacturing and production. According to it, almost every second company plans to optimise its ML systems.

AllCloud, a provider of professional data analytics and machine learning services, has published the study “Machine Learning in Production.” It clearly shows that manufacturing companies have recognized the great potential of machine learning in manufacturing and production: almost every second company plans to optimise the ML systems already in use, and 41 percent want to expand their use to other areas. There are many reasons for this: ML models can significantly impact a company’s success and can be crucial for its strategic orientation.

Main Fields Of Application Of Machine Learning

The main application areas are quality assurance and control (at 37 percent of the companies surveyed), logistics and inventory (25 percent), optimization of the production process (24 percent), and predictive maintenance (also 24 percent). The benefits of using ML are immense. A significant one is cost savings, reported by 45 percent of those surveyed. In addition, 42 percent state that production has been optimised, 41 percent see an increase in productivity, 34 percent a process acceleration, and 32 percent a relief for employees through machine learning.

Implementation Partners For Machine Learning Are In Demand

The study also shows that companies’ ambitious plans cannot be implemented without implementation partners. Only two percent of production companies can implement their ML plans independently; accordingly, 98 percent of those surveyed stated that they depend on external service providers. This need stems from the lack of in-house expertise in ML models and tools and from the shortage of skilled workers: the companies surveyed lack experts who know how to exploit the potential of the application fields and ML systems for the company. External service providers are also indispensable for technical requirements and for developing individual ML strategies.

Artificial Intelligence: 4 Typical Hurdles In AI Projects

Artificial intelligence offers opportunities in sales and service as well as in production. But technology alone is no guarantee of success. Here is how companies can avoid the four most common pitfalls in AI projects.

There is no doubt that artificial intelligence holds enormous potential for companies. But as with many trends, the problems often begin with the term “artificial intelligence” itself. Anyone who wants to start an AI project should first clarify what artificial intelligence means within their own company, and then bring the leadership, management, and operational levels to the same level of knowledge. It is advisable to explain internally how machine learning, deep learning, artificial intelligence, neural networks, and natural language processing are related and how they differ.

The Understanding Hurdle: What AI Is All About

Based on this clarification of terms, existing wishes, ideas, and expectations can be sorted out: at which of the traditional core capabilities of an IT system should the AI start, perceiving (input), understanding (processing), or acting (output)? Where do the participants want the system to show human-like, intelligent behaviour in the definitional sense? Where do they strive for independent learning from feedback or mistakes, and where is only a thoughtful collection and enrichment of data desired? Once such fundamental questions have been clarified, it is possible to exchange ideas internally about goals, strategies, and approaches, so that they meet everyone’s needs.

The Size Hurdle: Outline The Project Precisely

It can be a mistake to set up a project too big. “Anyone who wants to achieve results with artificial intelligence instead of conducting complex and risky experiments should clearly define goals and ideally limit themselves to individual processes,” explains Carsten Hunfeld, Head of Operations for the DACH region at Augmentin. Hunfeld recommends breaking an overarching goal, such as a better operating result, into subgoals, such as concrete improvements in productivity, quality, occupational safety, or compliance, and then into milestones. Such an approach makes transparent what is to be achieved: it becomes easier to assess at which points and in which processes AI can be used sensibly, and it helps with the exact definition of goals down to KPIs. In the end, it is about achieving real added value; technology for the sake of technology is of no use to anyone.

Such a subgoal could be to train staff more quickly; another, to ensure that employees have the knowledge they need to do their jobs safely and with high quality. AI-based software platforms can drive these and many other endeavours by delivering training, instructions, and support right at the workplace via smartphone, tablet, or data glasses. With the help of algorithms, the instructions can be personalised so that each employee receives the information and guidance that helps them progress: beginners are given a hand with every detail, while trained professionals receive nothing that would slow down their work with too much information.

The Data Hurdle: A Lot Helps A Lot, But It Has To Be Clean

Another factor to check at the very beginning is the data, because without the right input, artificial intelligence, especially when based on machine learning, cannot work. The first step is to find application areas that generate enough data sets to train an algorithm so that a reliable forecast is possible. Instead of operating with “small data,” such as customer data, companies should either look for existing big data scenarios or start collecting large amounts of data. Networked employees and machines offer a rich source, because not only sensors but also personnel provide a lot of valuable input.

Examples are feedback on work steps and the confirmation of hygiene measures via mobile devices, as well as the documentation of statuses, errors, and plenty of other information about machines, systems, and tasks. However, it is not just the amount of data that counts but above all its quality. True to the motto “garbage in, garbage out,” it is otherwise all too easy for an AI-supported system to become a nail in the coffin instead of delivering the hoped-for success. Data cleansing and a corrective look from a competent human are part of the mandatory program for AI projects.
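
What such a cleansing step looks like depends entirely on the data. As a minimal, hedged illustration with invented sensor readings, the sketch below drops missing records and filters outliers with a simple interquartile-range rule:

```python
# Minimal data-cleansing sketch: drop missing values, then filter
# numeric outliers with an interquartile-range (IQR) fence. Real
# pipelines need domain-specific rules plus a human review step.
import statistics

def clean(values: list[float | None]) -> list[float]:
    present = [v for v in values if v is not None]          # drop missing
    q1, _, q3 = statistics.quantiles(present, n=4)          # quartiles
    low, high = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)  # IQR fences
    return [v for v in present if low <= v <= high]

readings = [20.1, 19.8, 20.4, 20.0, 20.2, None, 19.9, 98.7]  # 98.7: sensor glitch
print(clean(readings))  # -> [20.1, 19.8, 20.4, 20.0, 20.2, 19.9]
```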

The Staffing Hurdle: Don’t Depend On Artificial Intelligence Experts

This is precisely where another obstacle arises for many companies: they wonder whether they need to hire model developers and data scientists to start their first AI project. The fact is that such experts are rare and expensive. Cloud solutions that come with ready-made models are therefore ideal; they do not require special knowledge and are sometimes ready for use in less than a week. Companies can then directly optimise the deployment planning of employees based on their skills and experience, or start compiling the most frequently asked questions from production, together with answers from experts, in a knowledge database for an AI bot.

In addition, the wealth of data from the connected work area cannot be readily evaluated with conventional business intelligence tools; until now, it took a data scientist to turn it into valuable insights. Not so with AI-based systems, which provide intelligent analysis functions and dashboards out of the box. Their algorithms can recognize inconsistencies or outliers and find correlations even in noisy data, helping to identify the most promising improvement opportunities for continuous learning.
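
As a rough idea of what such out-of-the-box outlier detection can look like under the hood, the example below uses scikit-learn’s IsolationForest on the same kind of noisy readings; this is a generic illustration, not the method any particular vendor ships:

```python
# Generic outlier detection with an Isolation Forest; fit_predict
# returns -1 for points the model considers anomalous, 1 otherwise.
from sklearn.ensemble import IsolationForest

samples = [[20.1], [19.8], [20.4], [20.0], [20.2], [19.9], [98.7]]
labels = IsolationForest(contamination=0.15, random_state=0).fit_predict(samples)
for (value,), label in zip(samples, labels):
    print(value, "anomaly" if label == -1 else "ok")
```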

Artificial Intelligence: Use Out-Of-The-Box Solutions

Jumping onto artificial intelligence now is a good idea, but pushing for the big thing right away can be overwhelming. It is often better to gain initial experience on a small scale first. Out-of-the-box solutions for specific, clearly defined use cases, such as autonomous maintenance or predictive maintenance, provide the opportunity for this. They can often be implemented without long preparation, significant risks, or personnel changes, and are rewarded with quick results. To avoid silos, it is essential that such a cloud platform can be easily connected to existing systems, so that the data and knowledge gained can quickly be fed into downstream processes and made accessible to the entire organisation.

Cloud-Native: Why This Approach Requires New Skills

Telecom technology is currently in transition towards disaggregated, cloud-native networks, and this technology requires new skills to navigate. The telecommunications market is in upheaval: the ongoing pandemic has boosted global internet traffic by up to 60 percent, increasing demand for bandwidth and raising the pressure on operators to provide reliable, high-speed broadband connections.

This has challenged operators’ perspective on future-proof and efficient network infrastructure, prompting them to question how they build and operate their networks. While telecom technology has stagnated for decades, we are now on the cusp of the shift to cloud-native, which enables disaggregated networks. Industry organisations such as the TIP initiative are pioneers here.

Cloud-Native Leads To A Surge In New Skills

The market is now witnessing a shift in connectivity toward a cloud computing approach, away from the traditional monolithic legacy hardware that has dominated the sector since its inception. This shift is accompanied by demand for new qualifications. Just as the dot-com boom of the 2000s led to the emergence of coding boot camps and the reskilling of employees for the new age, the shift to cloud-native in the 2020s will boost new skills in the telecom industry.

These new “cloud-native engineers” need to understand software-centric, cloud-native, and disaggregated networks, from the Radio Access Network (RAN) to the edge and the 5G core. They must be able to understand and navigate the world of the cloud quickly, moving an application from a repository to a new operational environment through a continuous integration and delivery pipeline.
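
Reduced to its skeleton, such a pipeline is a fixed sequence of checkout, test, build, and deploy stages, each gating the next. The sketch below is a deliberately simplified stand-in for real CI/CD tooling; the repository URL, image name, and registry are hypothetical placeholders:

```python
# Skeleton CI/CD pipeline: each stage is a shell command that must
# succeed before the next one runs. All URLs and names are placeholders.
import subprocess

STAGES = [
    ("checkout", "git clone https://example.com/app.git app"),
    ("test",     "cd app && python -m pytest"),
    ("build",    "cd app && docker build -t registry.example.com/app:latest ."),
    ("deploy",   "docker push registry.example.com/app:latest"),
]

for name, command in STAGES:
    print(f"--- stage: {name} ---")
    subprocess.run(command, shell=True, check=True)  # abort on first failure
```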

The challenge is that there is a skills gap for both internal and external employees. There is already a shortage of technicians who can properly install fibre, power, and radio equipment at telecom sites, let alone engineers with the expertise to navigate the new cloud-native environment. So how can we build the next workforce for cloud-native technologies?

Adaptation To Cloud-Native Environments

In telecommunications, the term “cloud-native” describes network functions that were developed as software from the start and run on independent hardware. Such a cloud-native design has many advantages, because the independent microservices are provided and executed in containers. If a new feature or update is required, the software developer delivers a corresponding microservice that updates or adds the individual element within milliseconds, without interrupting the service. In this way, route editing, updating, and restarting are 20 times faster than with traditional router operating systems, provided open interfaces are also available.

However, implementing a cloud-native environment, along with the code and processes that sit on top of its governing functions and management, requires engineers with new skills. In contrast to legacy fixed networks and hardware, cloud-native engineers must understand how a container-based architecture works, allowing microservices and APIs to interact in a loosely coupled way for maximum flexibility and development agility. In addition, they must know how to run routing software that turns bare-metal switches into IP/MPLS carrier routers, often in different areas of the network such as broadband access, edge, or core. This is not easy for engineers.

New Ways To Build Expertise In Cloud-Native

Traditional routers and dynamic control systems are being challenged by new concepts such as disaggregation and distributed SDNs, which promise significantly faster implementation, automated control, and a shorter time to market. For future router designs to meet these challenges, fundamentally new hardware and software must be developed, and modern software architectures and paradigms must be introduced.

A cloud-native engineer must have software skills such as coding, testing, design, and architecture, and at the same time know how to customise applications to make the best use of cloud platform services. The best way to build this broad knowledge base is through training and hands-on experience. Training typically includes learning about Docker and Kubernetes in production use cases, writing complex cookbooks, transforming existing applications into cloud-native applications, and so on. Unfortunately, most training currently focuses on the “legacy” engineer deployed in the field to replace radios or fix newer 5G stations; not enough is being done for cloud-native skills.

Implement A Cloud-Native Approach

Most operators understand the case for a cloud-native approach, given the apparent benefits of improved deployment flexibility, on-premises service adoption, and cost savings. However, they employ thousands of operations workers trained to solve yesterday’s problems instead of looking to the future. Imagine if the electric car industry came along and said: “We made this cool electric car, but we don’t sell the motor or the batteries that power it.” This is precisely what is happening now with the cloud-native approach. Operators are not used to building networks this way, so they must employ different workers to implement their plans.

Further Training Of Existing Employees

To build talent, a company should first look within its own ranks. Some employees will indeed resist when they have to start over with a demanding qualification profile, but many young, bright, and eager-to-learn engineers would love to acquire new cloud-native skills if given the opportunity. This approach also allows for a hybrid model of expertise, which can benefit operators depending on the project.

In Europe and the UK, investment in technical skills is essential to give these markets a competitive advantage in the decades to come. The best way to achieve this is to start at an early age, in school, university, or college, as well as through on-the-job training, and to provide a practical, project-based education that allows young engineers to develop individually and operationally.

Execution Management: How Technology Is Changing The Way People Work

Data and intelligence are changing how companies will work in the future. Celonis Labs researches how process mining and execution management can support different users in process optimization, and provides impulses for the development of new technologies and trends.

Friction losses in processes, unused data, opaque information silos: most companies hold unimagined potential. Based on artificial intelligence (AI), automation, and machine learning, technologies such as process mining and execution management help organizations leverage this potential and improve their processes.

What Execution Management Can Do

Process mining works like an X-ray machine that uncovers inefficiencies in processes. As a further development of this technology, an execution management system provides recommendations for action that fit the diagnosis and implements them semi-automatically. This gives users a complete real-time overview of how processes influence each other and where delays occur. Improvements are either implemented automatically, or users receive tips on where they should intervene.
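
At its core, process mining replays an event log to see where cases spend their time. The following toy example, with an invented log format and data, computes the average duration of each step transition to surface bottlenecks:

```python
# Toy process mining: replay an event log (case id, activity, timestamp)
# and report the average time between consecutive activities per case.
# The log format and the data are invented for illustration.
from collections import defaultdict
from datetime import datetime

log = [
    ("A", "order received", "2022-05-02 09:00"),
    ("A", "credit check",   "2022-05-02 09:05"),
    ("A", "goods shipped",  "2022-05-04 16:00"),
    ("B", "order received", "2022-05-02 10:00"),
    ("B", "credit check",   "2022-05-03 17:30"),
    ("B", "goods shipped",  "2022-05-04 09:00"),
]

cases: dict[str, list] = defaultdict(list)
for case, activity, ts in log:
    cases[case].append((datetime.fromisoformat(ts), activity))

durations: dict[tuple, list] = defaultdict(list)
for events in cases.values():
    events.sort()
    for (t0, a0), (t1, a1) in zip(events, events[1:]):
        durations[(a0, a1)].append((t1 - t0).total_seconds() / 3600)

for (src, dst), hours in durations.items():
    print(f"{src} -> {dst}: avg {sum(hours) / len(hours):.1f} h")  # find the slow hop
```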

The “Democratization” Of Technology

The use of these technologies is fundamentally changing how companies work: from the management level to local managers to individual employees, everyone involved in a process works together in a result-oriented manner and can access all the relevant knowledge. This creates a democratization of technology that enables individuals to make the right decisions more easily and quickly despite greater complexity.

This, in turn, continues to drive emerging technologies such as the advanced use of AI. Hybrid intelligence, the interaction of human and machine intelligence, is gaining importance. Here it is essential to define which tasks should be taken over by algorithms and where human judgment remains indispensable. To further exploit the possibilities of execution management, the experts at the labs are also working intensively on connecting a wide variety of existing business applications directly to the Celonis software through suitable interfaces (APIs). In this way, they can improve the user experience, carry out targeted actions, and break down silos with the help of a holistic process context.

New Trends – New Experiences

Digital solutions and management systems make the working world more agile and flexible, and this development also influences how we work. The focus is increasingly on technologies that were previously more relevant to the consumer sector. For example, virtual reality applications make company software easier to use, and they offer companies and their customers in hybrid working environments exciting opportunities when implementing new solutions, support, or further training.

We are convinced that the real and virtual worlds will move closer together in the future and that “phygital” experiences will become more important for employees and customers. A perfect digital user experience is no longer enough; a holistic approach is required that creates both physical and digital points of contact.

Even the most successful technologies must not rest on the status quo. In the spirit of exponential innovation, we are building a network of innovators that includes Celonis employees, customers, partners, developers, and scientists. Everyone should have the opportunity to contribute ideas, discuss trends, and learn from successes and failures together. Genuine innovation only emerges when you are ready to broaden your perspective.

Human-Like Robotic Skin: Machines Are Becoming More And More Like Us

Researchers from Tokyo have succeeded for the first time in developing a skin imitation for robots, although the artificial robot skin cannot yet survive everyday conditions. Robot technology is developing at breathtaking speed: not long ago, machines with artificial intelligence (AI) were considered a rarity in everyday life, while today it is hard to imagine many places without them. But it will probably be a while before we encounter humanoid androids regularly.

So far, no company has brought a humanoid robot onto the market. This may also be because some human characteristics cannot be copied easily; the skin is one example. Researchers from the University of Tokyo have now made a breakthrough here.

Human Skin Grows Around Robot Fingers

In a recent experiment, human skin grew around a robotic finger. To achieve this, the researchers placed the finger in a mold and covered it with connective tissue cells and a nutrient solution. Under laboratory conditions, the cells grew and settled around the mold over time.

To achieve a replica that is as realistic as possible, the researchers then seeded horn-forming cells (keratinocytes) onto the resulting construct. Over time, this produced a robotic finger with an artificial skin whose properties are similar to those of human skin.

Robot Skin Can Heal Itself In An Emergency

The artificial skin can even heal itself: when the finger was injured, the researchers placed a piece of collagen on the spot, and the damaged area regenerated and afterwards showed similar stability as before.

However, the approach is not yet fully developed. Although the robot skin is very similar to human skin, the cells can hardly survive in dry conditions outside the nutrient liquid. So it will be a while before machines become more and more like humans.
