Staying on Top of Change


By Catherine Bolgar



Just as humans develop from a single cell and end up with some 13 trillion cells by the time they’re born nine months later, so too do products start with a concept or definition—essentially a single data point.

Whether that definition is expressed as a design on paper or digitally, it represents data, explains Callum Kidd, lecturer and leading configuration management researcher at the University of Manchester, U.K. The data then evolves, matures, is iterated and eventually becomes a defined configuration, which is a collection of more data. That process turns out a product whose use, maintenance, quality and lifecycle may be monitored, generating still more data.

“We have created a digital world and [have] become more and more adept at creating data. But we haven’t created awareness of managing data, from creation to disposal,” Mr. Kidd says.

Just as in the single-cell example above, “We create life in data from day one, not by adding in something along the way. True, nature has taken millions of years to perfect this, but we need to learn lessons faster if we are to manage products and systems through life effectively.”

Keeping track of this process can be mind-boggling, due to the many changes along the way—and especially when it involves many partners, suppliers and sub-tier suppliers. This is where configuration management enters the scene.

“Configuration management is how we define a configuration, which is essentially data at some level of maturity,” he says. “By evolving that data, and managing changes to it reflecting the evolution of its definition, we create physical structures, or systems. These, however, are just data represented in a physical form. Essentially, we manage [product] design and [product] definition data through life. The validated physical representation is merely proof that the data was valid.”

“Configuration management is like just-in-time [manufacturing] for data,” he adds. “It gets the right data in the right format to the right people at the right time.”
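Mr. Kidd’s picture of a configuration as maturing data with managed, traceable changes can be sketched as a minimal data model. Everything here is illustrative: the class names, the part, and the revision scheme are assumptions, not any particular tool’s API.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Change:
    """A recorded change request: what changed, why, and who approved it."""
    description: str
    rationale: str
    approved_by: str
    approved_on: date

@dataclass
class ConfigurationItem:
    """A unit of product-definition data whose evolution is tracked."""
    name: str
    revision: str = "A"
    data: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def apply_change(self, change: Change, updates: dict) -> None:
        # Record the change *before* mutating the definition, so every
        # revision can be traced back to an approved decision.
        self.history.append((self.revision, change))
        self.data.update(updates)
        self.revision = chr(ord(self.revision) + 1)  # A -> B -> C ...

# Hypothetical usage: one approved change moves the item from rev A to rev B.
spar = ConfigurationItem("wing-spar", data={"material": "Al-7075"})
spar.apply_change(
    Change("Switch to titanium alloy", "fatigue margin", "change board", date(2016, 5, 1)),
    {"material": "Ti-6Al-4V"},
)
print(spar.revision)      # B
print(len(spar.history))  # 1
```

The point of the sketch is the ordering: the rationale and approval are captured alongside the data they changed, which is what later makes the “right data at the right time” retrievable.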

Configuration management is closely linked with product lifecycle management, or PLM, which follows a product from concept to disposal. But “that’s a one-dimensional, linear view of the world,” Mr. Kidd says. “In reality, we share information backward as well as forward.”

Taking the aerospace industry as an example, “it’s highly possible that due to complex work-share arrangements, we could be managing changes in the design, manufacture and support phases of the life cycle concurrently,” Mr. Kidd says. “This adds considerable complexity in managing the status of data at any point in time. We need to know exactly what we have if we are to manage changes to that data effectively. That is one of our greatest challenges in a modern business environment.”

In a survey of more than 500 companies last year, Aberdeen Group, a technology, analytics and research firm based in Waltham, Massachusetts, found that for many companies configuration management remains a manual, handwritten process. Aberdeen separated the companies into “leaders” and “followers,” and found that only 54% of leaders and a mere 37% of followers had automated or digital change management.

Yet, keeping track of frequent engineering changes during the development process is the top challenge, cited by 38% of companies. Among industrial equipment manufacturers, 46% named frequent engineering changes as their biggest challenge.

Changes are amplified by the increased complexity of products themselves. In another report, published in 2015, Aberdeen found a 13.4% increase in the number of mechanical components, a 19.6% climb in the number of electrical components and a 34.4% rise in lines of software code over the previous two years.

“Especially for industrial equipment manufacturers, products are getting more complex and customizable,” says Nick Castellina, vice president and research group director at Aberdeen Group. “Configuration management helps manage the flow of all that data and the lifecycle and needs of the shop floor. It centralizes all the visibility into the needs of each new product being built and how that interacts with any materials you’re trying to get at any stage.”

Visibility is important, because “sometimes it’s the minutest of things that can cause the biggest failures of all,” Mr. Kidd says. Automated configuration management not only ensures that all changes are recorded, along with the reasoning behind them, but also serves as a record in the future of every decision that was made in respect of a configuration’s life.

Change boards, which gather the relevant stakeholders, are the primary mechanism for approving change in configuration management. These boards are dealing with a greater volume of change and greater complexity of impact. That’s why “every piece of information in that room is retained and digitized. Notes that somebody makes but doesn’t communicate may be relevant,” Mr. Kidd says. Even emails are archived.

“We live in a litigious society,” he says. “Configuration management can prove you did the right thing, even if in the future a decision is called into question. You can show you made decisions based on the best possible information, and in the knowledge that you understood the status of the configuration at that point—in short, proving that you took due diligence in the process.”

 

Catherine Bolgar is a former managing editor of The Wall Street Journal Europe, now working as a freelance writer and editor with WSJ. Custom Studios in EMEA. For more from Catherine Bolgar, along with other industry experts, join the Future Realities discussion on LinkedIn.

Photos courtesy of iStock

Why Design Data Management and Analytics aren’t just ordinary Big Data

By Eric

I’m an electronics engineer who spent part of his career in the business intelligence and analytics domain. In that regard, I’m always interested in technology and business areas that have unique analytics needs. Semiconductor design closure is one such domain. With 14-nanometer geometry fabrication now coming online, the complexity of integrated circuits is taking another geometric step, as large projects can have 200+ IP blocks in their designs (see figure below).

Variability and Velocity are more critical than Volume

When you consider that millions of transistors can constitute a block, that blocks can be chosen from libraries numbering in the thousands, and that there can be multiple variations of a block, the analytics challenge approaches that of Big Data. This is not necessarily because of overall data size, though, but because of data complexity, variability and velocity.

For these large projects, then, the effort to meet timing, power, IR drop and other design parameters grows geometrically…yet again. Of course, some of this increased verification effort can be done in parallel by multiple design teams, each working on sub-sections of the chip. But ultimately the entire system design has to be simulated to assure a right-first-time design. I’m sure most would agree with me that system failure often happens at interfaces. Whether it’s an interface within a design or a responsibility interface between designers, it’s the same situation.

Why ordinary Big Data analytics won’t do the job

Effective analytics for design testing and verification provides a way to analyze interface operation from all relevant perspectives. Coming back to the topic of Big Data, my view is that commonly known Big Data analytics tools could be helpful, but are not sufficient to meet this requirement. In particular, I observe that appropriate semiconductor big data analytics must have the following capabilities:

  • Support for the hierarchical nature of chip design.
  • Ability to integrate information from multiple design tools and relate them in some way to each other to indicate relevant cause/effect relationships.
  • The ability to compare and contrast these relationships using graphical analytics, exposing the key ones quickly.
  • The ability to easily zoom, pivot, filter, sort, rank and do other kinds of analytics tasks on data to gain the right viewpoints.
  • The ability to deliver these analytics with minimal application admin or usage effort.
  • Effective visualizations for key design attributes unique to semiconductor projects.
  • The ability to process data from analog, digital and the other types of common EE design and simulation tools.
  • The ability to handle very complex, large chip design data structures so that requirement, specification and simulation consistency is maintained.
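To make the filter/sort/rank and hierarchy requirements above concrete, here is a toy illustration of the kind of query such a tool runs constantly. The block names, slack values and IR-drop figures are invented for the example, not real design data or any real tool’s output.

```python
# Toy model of hierarchical design data: each timing-path record carries
# the block hierarchy it belongs to, plus results from different tools
# (static timing slack, IR-drop analysis) related in one record.
paths = [
    {"block": "soc/cpu/core0", "slack_ns": -0.12, "ir_drop_mv": 48},
    {"block": "soc/cpu/core1", "slack_ns":  0.03, "ir_drop_mv": 51},
    {"block": "soc/ddr/phy",   "slack_ns": -0.05, "ir_drop_mv": 62},
    {"block": "soc/io/serdes", "slack_ns":  0.10, "ir_drop_mv": 30},
]

def worst_violators(paths, subtree, n=10):
    """Filter to one subtree of the hierarchy, keep failing paths
    (negative slack), and rank them worst-first."""
    hits = [p for p in paths
            if p["block"].startswith(subtree) and p["slack_ns"] < 0]
    return sorted(hits, key=lambda p: p["slack_ns"])[:n]

for p in worst_violators(paths, "soc/"):
    print(p["block"], p["slack_ns"])
# soc/cpu/core0 -0.12
# soc/ddr/phy -0.05
```

The interesting part is not the filtering itself but the join implied by the record: timing and IR-drop numbers from separate tools sit side by side, which is what lets an engineer look for cause/effect relationships across tool boundaries.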

It seems to me that semiconductor design engineers have been quietly contending with Big Data analytics challenges even though they haven’t necessarily been part of the mainstream Big Data conversations. Yet, the tools in use for chip design perhaps have some very interesting capabilities for other technical and business disciplines. My $.02.

Also, we’ll be at the Design Automation Conference in San Francisco again this year. We will have a full presentation and demo agenda, a cocktail hour and prizes. Join us!

Eric ROGGE is a member of the High-Tech Industry team. You can find him on Twitter @EricAt3DS.

Do You Comply?

By Kate


I imagine I’d go crazy if I were a manufacturer today. There are so many regulations to follow, and with the burgeoning environmental/green standards, which can differ per country, the complexity grows. Then, when I begin to think about the various substances that are regulated, like lead, hazardous chemicals, etc., coupled with the specific industry regulations, I start feeling like I need a Business Intelligence solution to understand it all. (Breathe now.) And then, I imagine how extra-complex it must be for OEMs and Tier 1 suppliers who are outsourcing their parts manufacturing around the globe, where depending on the country, the manufacturing cultures differ, and thus their awareness of, and compliance with, the complex web of… directives.

Welcome to the second post in our introductory Green PLM blog series.

We can quickly get overwhelmed when we start digging into Compliance. When it comes to Green Compliance, we’re still in the early days, i.e. there’s a lot more to come. Most of the directives bubble up from Europe, and to my knowledge, so far there are no widespread, ISO-type standards.

Mike Zepp, our in-house regulatory compliance expert, used to deal directly with the type of scenario I imagined above, and now he helps Dassault Systèmes arm companies with tools to successfully navigate through the green compliance jungle. Mike was in Paris recently and kindly agreed to let me video-interview him. Here’s what Mike has to say about Compliance and the role PLM, particularly managing product-linked data throughout the lifecycle, can play to help. The real-life example he cites in the video is particularly telling:

It seems to me that using an efficient compliance assessment and impact analysis data management tool will help put some greenbacks into your Green PLM, or at least save you some. While this is only a piece of Green PLM, it’s a major one.
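At its core, a compliance assessment like the one Mike describes boils down to screening declared product data against substance restrictions. Here is a minimal sketch: the part data is invented, and the thresholds mirror commonly cited RoHS limits (ppm by weight in a homogeneous material) — always verify against the directive text itself before relying on them.

```python
# Toy compliance screen: check each part's declared substance content
# against restriction thresholds. Part records are invented for the example.
ROHS_LIMITS_PPM = {"lead": 1000, "cadmium": 100, "mercury": 1000}

parts = [
    {"part": "solder-x1", "substances": {"lead": 40000}},  # leaded solder
    {"part": "conn-a7",   "substances": {"cadmium": 20}},
    {"part": "pcb-b2",    "substances": {}},               # no declarations
]

def noncompliant(parts, limits):
    """Return (part, substance, ppm) for every declared value over its limit."""
    hits = []
    for p in parts:
        for substance, ppm in p["substances"].items():
            # Substances with no listed limit are treated as unrestricted here;
            # a real tool would flag undeclared or unknown substances instead.
            if ppm > limits.get(substance, float("inf")):
                hits.append((p["part"], substance, ppm))
    return hits

print(noncompliant(parts, ROHS_LIMITS_PPM))
# [('solder-x1', 'lead', 40000)]
```

The impact-analysis half of the story is then the reverse query: given a newly restricted substance, find every part and product that declares it — which is exactly why the product-linked data needs to be managed through the lifecycle in the first place.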

Stay tuned for my next Green PLM post on reducing material use in product design.

Best,

Kate

P.S. Here are some Green Compliance resources:

Examples of product recycling directives:
End-of-Life Vehicle (ELV)
Waste Electrical and Electronic Equipment Directive (WEEE)

Examples of banned substances directives:
Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) Directive
Restriction of Hazardous Substances (RoHS)

Whitepaper:
The Voice of the Customer: Process Integration and Traceability Through Requirements Management