Staying on Top of Change


By Catherine Bolgar

 


Just as humans develop from a single cell and end up with some 13 trillion cells by the time they’re born nine months later, so too do products start with a concept or definition—essentially a single data point.

Whether that definition is expressed as a design on paper or digitally, it represents data, explains Callum Kidd, lecturer and leading configuration management researcher at the University of Manchester, U.K. The data then evolves, matures, is iterated and eventually becomes a defined configuration, which is a collection of more data. That process turns out a product whose use, maintenance, quality and lifecycle may be monitored, generating still more data.

“We have created a digital world and [have] become more and more adept at creating data. But we haven’t created awareness of managing data, from creation to disposal,” Mr. Kidd says.

Just as in the single-cell example above, “We create life in data from day one, not by adding in something along the way. True, nature has taken millions of years to perfect this, but we need to learn lessons faster if we are to manage products and systems through life effectively.”

Keeping track of this process can be mind-boggling, due to the many changes along the way—and especially when it involves many partners, suppliers and sub-tier suppliers. This is where configuration management enters the scene.

“Configuration management is how we define a configuration, which is essentially data at some level of maturity,” he says. “By evolving that data, and managing changes to it reflecting the evolution of its definition, we create physical structures, or systems. These, however, are just data represented in a physical form. Essentially, we manage [product] design and [product] definition data through life. The validated physical representation is merely proof that the data was valid.”

“Configuration management is like just-in-time [manufacturing] for data,” he adds. “It gets the right data in the right format to the right people at the right time.”

Configuration management is closely linked with product lifecycle management, or PLM, which follows a product from concept to disposal. But “that’s a one-dimensional, linear view of the world,” Mr. Kidd says. “In reality, we share information backward as well as forward.”

Taking the aerospace industry as an example, “it’s highly possible that due to complex work-share arrangements, we could be managing changes in the design, manufacture and support phases of the life cycle concurrently,” Mr. Kidd says. “This adds considerable complexity in managing the status of data at any point in time. We need to know exactly what we have if we are to manage changes to that data effectively. That is one of our greatest challenges in a modern business environment.”

In a survey of more than 500 companies last year, Aberdeen Group, a technology, analytics and research firm based in Waltham, Massachusetts, found that for many companies configuration management remains a manual, handwritten process. Aberdeen separated the companies into “leaders” and “followers,” and found that only 54% of leaders and a mere 37% of followers had automated or digital change management.

Yet, keeping track of frequent engineering changes during the development process is the top challenge, cited by 38% of companies. Among industrial equipment manufacturers, 46% named frequent engineering changes as their biggest challenge.

Changes are amplified by the increased complexity of products themselves. In another report, published in 2015, Aberdeen found a 13.4% increase in the number of mechanical components, a 19.6% climb in the number of electrical components and a 34.4% rise in lines of software code over the previous two years.

“Especially for industrial equipment manufacturers, products are getting more complex and customizable,” says Nick Castellina, vice president and research group director at Aberdeen Group. “Configuration management helps manage the flow of all that data and the lifecycle and needs of the shop floor. It centralizes all the visibility into the needs of each new product being built and how that interacts with any materials you’re trying to get at any stage.”

Visibility is important, because “sometimes it’s the minutest of things that can cause the biggest failures of all,” Mr. Kidd says. Automated configuration management not only ensures that all changes are recorded, along with the reasoning behind them, but also serves as a future record of every decision made over a configuration’s life.
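By way of illustration only (the article does not describe any particular tool’s data model), here is a minimal Python sketch of the kind of record an automated configuration management system might keep so that both a change and the reasoning behind it survive for later audit. All field names and values are illustrative assumptions, not any product’s schema.

```python
# A minimal sketch of an auditable change record; all field names are
# illustrative assumptions, not any specific configuration management
# product's schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ChangeRecord:
    change_id: str          # unique identifier for the change
    item: str               # configuration item being changed
    description: str        # what was changed
    rationale: str          # why the change was made
    approved_by: list[str]  # change-board members who approved it
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


# Example: recording a design change together with its reasoning, so the
# decision can be revisited years later.
record = ChangeRecord(
    change_id="CR-0042",
    item="bracket-assembly",
    description="Increased fastener hole diameter from 6 mm to 8 mm",
    rationale="Supplier tolerance study showed risk of misalignment at 6 mm",
    approved_by=["design lead", "quality manager"],
)
print(record)
```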

Change boards, which gather the relevant stakeholders, are the primary mechanism for approving change in configuration management. These boards are dealing with a greater volume of changes and more complex impacts. That’s why “every piece of information in that room is retained and digitized. Notes that somebody makes but doesn’t communicate may be relevant,” Mr. Kidd says. Even emails are archived.

“We live in a litigious society,” he says. “Configuration management can prove you did the right thing, even if in the future a decision is called into question. You can show you made decisions based on the best possible information, and in the knowledge that you understood the status of the configuration at that point—in short, proving that you took due diligence in the process.”

 

Catherine Bolgar is a former managing editor of The Wall Street Journal Europe, now working as a freelance writer and editor with WSJ. Custom Studios in EMEA. For more from Catherine Bolgar, along with other industry experts, join the Future Realities discussion on LinkedIn.

Photos courtesy of iStock

A New Model for Manufacturing Innovation


By Werner Krings

The Austrian economist Joseph Schumpeter argued that industries must incessantly revolutionize their economic structure from within. I interpret this statement to mean that manufacturers, especially in the High Tech industry, must continually strive to innovate with better or more effective processes in order to build new products.

Innovation is a core attribute of successful High Tech manufacturers, impacting every aspect of the business: economics, profitability, product design, technology, and engineering best practices, not to mention overall brand value.

Innovation impacts growth

Manufacturing innovation can mean the use of Lean and other cost-reduction strategies. Increasingly, it means the automation and digitization of manufacturing as we move toward the era of the Digital Factory and big data analytics. And in today’s global landscape, innovation must include the ability to easily replicate processes across sites to ensure higher global quality standards and greater control, visibility and synchronization across operations.

How do you get there?

A key requirement for global innovation is a unified production environment across facilities. High Tech manufacturers that use different processes and production systems in their various facilities will have difficulty achieving innovation, effectively blocking all of its potential benefits. When different plants use different MES systems, for example, there can be little agility, as every change becomes a custom IT project.

Improve operations processes across sites

This is why High Tech manufacturing leaders have moved toward unified and standardized systems, so that process changes and manufacturing agility can be achieved faster and more easily. In such an environment, global shop floor operations can be unified through a Center of Excellence, which can then ensure comparable and measurable manufacturing standards on a global scale. As they say, you can’t improve what you can’t measure.

Measuring Innovation

Innovation can (and should) be measured on an organizational level. The implementation of manufacturing intelligence solutions is often justified by this measurement capability alone, as part of a manufacturer’s quest for better visibility across operations. The ability to measure is greatly enhanced when it is part of an overall innovation strategy, underpinned by unified technology.

High Tech manufacturers will want to measure several aspects of innovation, such as business measures related to profitability, innovation process efficiency, or employees’ contribution and motivation. Measured values might include new product revenue, spending in R&D, time to market, quality scores for suppliers, and growth in emerging markets.

Manufacturing Innovation

What is pivotal is that innovation must align with corporate strategy and global manufacturing performance in order to ensure continuous growth and return on investment. A well-defined innovation program, combined with an IT infrastructure that supports global agility, is essential for High Tech manufacturers that want to compete and grow in a sustainable fashion, now and in the future.

Now there’s a solution for greater visibility, control, and synchronization of operations. Visit the Flexible Production solution page and read the flyer to find out what a flexible global production platform for manufacturing can do for your High Tech enterprise.

Bringing Predictive Analytics to the Shop Floor: OK, but How?

By Christian

Predictive Analytics can bring a lot of value to shop floor operations, especially to improve quality, yield or process sustainability, for example in composite manufacturing.

Machine learning algorithms make it possible to extract patterns from past production data. These patterns, which make up a model, can in turn be used to obtain predictions (“what is the risk of producing a defective part?”) or even recommendations (“what can I do to reduce that risk?”).
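As a minimal sketch of that idea (the article names no library, so scikit-learn and the synthetic process data below are assumptions), a classifier trained on past production records can return the estimated risk that a new part is defective:

```python
# A hedged sketch: learn a model from past production records, then ask it
# for the defect risk of a new part. scikit-learn and the synthetic data
# are assumptions made for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(seed=0)

# Past production data: one row per part, columns are process parameters
# (e.g. temperature, pressure, cure time -- all invented here).
X_past = rng.normal(size=(500, 3))
# Label: 1 if the part was defective. Defects are more likely when the
# first parameter is high, so there is a pattern to learn.
y_past = (X_past[:, 0] + 0.3 * rng.normal(size=500) > 1.0).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_past, y_past)

# Prediction for a new part: "what is the risk of a defective part?"
new_part = np.array([[1.2, -0.1, 0.4]])
risk = model.predict_proba(new_part)[0, 1]
print(f"Estimated defect risk: {risk:.0%}")
```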

Data Scientists wanted!

I have intentionally used the expression “machine learning algorithm” and you may think that companies that want to go in this direction need to hire a team of data scientists.

Indeed, many open-source or commercial solutions require data science skills, along with good programming skills, in order to:

  • Identify the proper algorithms to use
  • Fine-tune algorithms to get good results (see the sketch after this list)
  • Ensure scalability and performance
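As a rough illustration of that tuning step, a cross-validated grid search with scikit-learn (an assumed tool, not one the article prescribes) is the kind of work that is routine for a data scientist but opaque to most quality managers:

```python
# A sketch of algorithm tuning via cross-validated grid search.
# GradientBoostingClassifier, the parameter grid and the synthetic data
# are all illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(seed=1)
X = rng.normal(size=(400, 5))              # invented process parameters
y = (X[:, 0] - X[:, 1] > 0).astype(int)    # invented pass/fail labels

search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [2, 3, 4]},
    cv=3,
    scoring="roc_auc",
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```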

So it is no surprise that with the explosion of Big Data and Predictive Analytics, job postings in this field have skyrocketed since early 2012:

Percentage of job offers with words “Data Scientist” or “Data Science” © Indeed.com.

And, as a result, salaries have soared and positions are hard to fill, which slows down the adoption of Predictive Analytics solutions.

Empowering Quality Managers and Process Experts

In order to overcome this difficulty, the DELMIA Operations Intelligence solution for shop floor optimization (DELMIA OI) has been designed from the start for Quality Managers and Process Experts. There is no need to select or fine-tune algorithms, and “correlation” is probably the most complex word used in the User Interface. Training is achieved in a few days.

A failure analysis engineer prepares boards for corrosion testing. © Intel.

We also think that expertise is essential to obtaining reliable models in the manufacturing field. For example, a process expert may identify irrelevant parameters, add relevant durations between operations, or spot errors in the data. And, last but not least, the expert may draw inspiration from the model, which in the case of DELMIA OI comes in the form of human-readable rules.
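The article does not disclose DELMIA OI’s rule format, so purely as an illustration of what human-readable, rule-like models can look like in general, here is a small decision tree trained on synthetic process data and printed as text rules with scikit-learn (an assumed stand-in, not the product’s algorithm):

```python
# Illustrative only: a shallow decision tree printed as readable rules.
# The feature names, thresholds and data are invented; this is not
# DELMIA OI's actual rule engine.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(seed=2)
# Two invented process parameters: cure temperature and tool pressure.
X = rng.uniform(low=[150.0, 1.0], high=[220.0, 4.0], size=(300, 2))
# Invented defect pattern: hot cures at low pressure tend to fail.
y = ((X[:, 0] > 200.0) & (X[:, 1] < 2.0)).astype(int)

tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(X, y)

# Rules a process expert can read, challenge, or draw inspiration from.
print(export_text(tree, feature_names=["cure_temperature", "tool_pressure"]))
```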

Does this mean that data scientists are out of the picture? No. If you are lucky enough to have such resources, you will find that the best results are actually obtained through collaboration between all profiles: data scientists bring their experience of how to prepare and handle data, while quality managers and process experts can make informed decisions using their process knowledge.

The need for a Method

Even simple concepts and an intuitive user interface will not guarantee the best results. You need a method to avoid pitfalls when you have to deal with potentially erroneous or incomplete data and with several possible ways of addressing the problem.

Drawing on the experience of past DELMIA Operations Intelligence projects, we have been able to build such a method, which was recently shared in the DELMIA Enterprise Intelligence community.

The method consists of 8 steps:

  • Understand the process
  • Leverage curve data
  • Clean the data
  • Prepare the data
  • Define the output
  • Build the model
  • Validate the model
  • Assess the value

The method answers questions such as (a short sketch of two of them follows the list):

  • How do I leverage curve data (hint: you may need BIOVIA Pipeline Pilot)?
  • Where should I set the boundary between a good and a bad yield?
  • How can I measure the reliability of the model (its ability to predict)?
  • How can I improve my model?
  • How can I evaluate how many defective parts could be spared if DELMIA OI recommendations were applied on the shop floor?
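As the sketch promised above, here is how setting a yield boundary and measuring a model’s reliability might look in practice. The 92% threshold, the synthetic data and the scikit-learn hold-out evaluation are illustrative assumptions, not the method’s actual prescriptions:

```python
# A hedged sketch of two of the questions above: turning a continuous
# yield into good/bad labels with a chosen threshold, and measuring how
# reliably a model predicts that label on data it has never seen.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=3)
X = rng.normal(size=(600, 4))                          # invented process parameters
yield_pct = 95 - 3 * X[:, 0] + rng.normal(size=600)    # invented batch yield (%)

# Boundary between good and bad yield: here, batches under 92% count as bad.
y = (yield_pct < 92).astype(int)

# Reliability: hold some data back, train on the rest, score on the held-out part.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```

Moving the boundary changes the balance of good and bad batches and, with it, how the model’s reliability should be judged, which is presumably one reason the method treats defining the output and validating the model as separate steps.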

Discover more about how to build reliable predictive models to optimize your manufacturing operations by joining our free DELMIA Enterprise Intelligence community.

Once you are registered, it all starts with this post!


