Alan Turing, Another #DDay Engineering Hero

By Aurelien

Sixty years ago, Alan Turing died, a great mind considered to be the father of computer science. On this anniversary, the University of Reading announced that a computer program had passed the famous Turing test (although, as one might expect, the claim led to much debate and controversy).

But Alan Turing is also known for playing a key role in the outcome of WWII, in particular through his contribution to the effort at Bletchley Park. There, he created an incredibly efficient signals intelligence operation that sped up the reading of messages encrypted with the German Enigma machine. The Allies needed to know whether Germany was taking the bait on the many deceptions being deployed, such as Operation Fortitude, a strategy designed to make the Germans believe that the landings would happen in the Pas-de-Calais rather than Normandy. Like the many other engineers who contributed to the incredible innovations of D-Day, Turing deserves our tribute.

Of course, Alan Turing also has a special connection with us, as a scientific software company.

Put simply, Alan Turing’s test could be summed up in this quote:

A computer would deserve to be called intelligent if it could deceive a human into believing that it was human.

What I like about this definition is how it relies on human perception to decide whether a machine is declared intelligent. In fact, it has a lot in common with 3DEXPERIENCE itself. If you transposed this definition of artificial intelligence to realistic visualization and simulation, you would end up with a definition of what a Lifelike Experience can be.


A visionary, Turing imagined a digital world where computers could not only simulate lifelike experiences but even take part in them, as he suggested in a little-known lecture broadcast on the BBC:

It might for instance be said that no machine could write good English, or that it could not be influenced by sex-appeal or smoke a pipe. I cannot offer any such comfort, for I believe that no such bounds can be set.

Check out this nice infographic about Alan Turing, courtesy of Jurys Inn Manchester Hotel:


Design Collaboration – What do we gain with integrated Design Analysis

By Eric

From the early days of chip design, the different tasks involved, from architecture and logic design to layout and verification, were for the most part accomplished as individual efforts. Considerations from the "other disciplines" were usually not part of the equation in accomplishing one's own task: "once the logic design is done, the back-end person can figure out how best to implement the layout." When chip complexity and size were not so great, we could get away with this kind of approach.

Today, with large-scale SoC designs, aggressive design targets and schedules, and sophisticated nanometer technologies, this can no longer be the norm. More and more design tasks are being parallelized to compress design schedules. Design teams are much larger and may be located in different parts of the planet. Complex silicon technologies require deeper, more time-consuming analysis of a growing list of parasitic effects, such as cross-talk, inductive and capacitive coupling, and junction leakage, to achieve functional, performance, and power targets. In addition, sophisticated design tools produce volumes of analysis data over hundreds of modes and corners for each step of the implementation flow, which engineers must evaluate to judge whether the design is converging toward its budget targets.

So how can we manage this torrential flow of data in a way that keeps us on track and meets aggressive schedules? We need the ability to collect all this data consistently from each design step of every project instance, wherever it is produced, into a centralized location. The data needs to be organized so that it can be reviewed systematically, from a project-level summary down to a detailed presentation of each issue. The hundreds of analysis corners that may be generated for each flow step, covering different process and operating conditions, should be captured and organized for quick review. Important key metrics need to be displayed and highlighted, making it possible to decide where to focus first.

As shown in Figure 1 below, the system should allow all aspects of the analysis data to be viewed in context (such as timing, layout, power, and congestion) to see how different metrics could be contributing to specific issues. Historical data collected by such a system can then be compared through various analysis capabilities (tables, plots, metric aggregation, views) to assess metric trends and determine whether the design is converging toward expected targets. The system would enhance the ability to weed out non-issues from "project-critical" issues, allowing focus on key resolutions for the next implementation pass. Finally, the system should help construct the current status and progress of the design and highlight problematic blocks that need further attention.

Figure 1
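To make the idea of a centralized, organized metric store concrete, here is a minimal Python sketch. The block names, corner labels, and metric names (`wns_ns` for worst negative slack in nanoseconds, `power_mw`) are hypothetical illustrations, not a product API; a real system would also persist the data and track run history.

```python
from collections import defaultdict

class MetricStore:
    """Collects analysis metrics per (block, flow step, corner) into one place."""

    def __init__(self, targets):
        # targets: metric name -> minimum acceptable value, e.g. {"wns_ns": 0.0}
        self.targets = targets
        self.data = defaultdict(dict)  # (block, step, corner) -> {metric: value}

    def record(self, block, step, corner, metrics):
        """Capture one tool run's metrics for a given block, flow step, and corner."""
        self.data[(block, step, corner)].update(metrics)

    def worst(self, metric):
        """Return the (block, step, corner) entry with the worst value of a metric."""
        entries = [(key, vals[metric]) for key, vals in self.data.items()
                   if metric in vals]
        return min(entries, key=lambda kv: kv[1]) if entries else None

    def violations(self, metric):
        """All entries where a metric misses its target: the 'focus here first' list."""
        target = self.targets[metric]
        return {key: vals[metric] for key, vals in self.data.items()
                if metric in vals and vals[metric] < target}

# Hypothetical usage: two corners of the same routing run for one block.
store = MetricStore(targets={"wns_ns": 0.0})
store.record("cpu_core", "route", "ss_0p81v_125c",
             {"wns_ns": -0.12, "power_mw": 430.0})
store.record("cpu_core", "route", "ff_0p99v_m40c",
             {"wns_ns": 0.05, "power_mw": 610.0})

print(store.worst("wns_ns"))       # worst setup-slack corner across all entries
print(store.violations("wns_ns"))  # entries missing the timing target
```

The key design point is that every tool run, from every site and every flow step, writes into the same keyed structure, so a project-level review and a per-corner drill-down are queries over one dataset rather than a hunt through scattered report files.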

This integrated system would be useless without the ability to share the organized database with others to collaborate on issues, resolutions, and trends as the design matures toward completion. A centralized database where all team members see the same picture of the issues allows better decisions to be made and improves communication between disciplines (e.g., front-end and back-end).

With the ability to collect data from anywhere at any stage of the flow, automatically keep track of design progress, and analyze issues from an integrated view, the prospect of meeting or even pulling in schedules for these complex SoC design projects becomes more attainable.
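The convergence check described above can also be sketched in a few lines. Assuming we keep one value of a metric per implementation pass (here, worst negative slack in nanoseconds, a hypothetical example), a trend is "converging" if each pass improves on the last and the final pass meets the target:

```python
def converging(history, target, tolerance=0.0):
    """Report whether successive implementation passes are trending to the target.

    history: metric values ordered by pass, where larger is better
             (e.g. worst negative slack over four runs).
    Hypothetical helper for illustration, not a product API.
    """
    if len(history) < 2:
        return False  # need at least two passes to call it a trend
    improving = all(later >= earlier
                    for earlier, later in zip(history, history[1:]))
    return improving and history[-1] >= target - tolerance

# Worst negative slack (ns) over four implementation passes:
wns_by_pass = [-0.35, -0.20, -0.08, 0.01]
print(converging(wns_by_pass, target=0.0))  # True: monotonic improvement, target met
```

A real system would run this kind of check per block and per metric over the historical data it has collected, flagging blocks whose trend stalls so they get attention before the next pass.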

Also, we will be at the Design Automation Conference in San Francisco again this year, with a full presentation and demo agenda, a cocktail hour, and prizes. Join us!

Introducing BIOVIA: Expanding 3DEXPERIENCE to the Virtual Biosphere

By Aurelien


Five years ago, we blogged about a project called BioIntelligence that was just starting up. At that time, we had a dream:

“Imagine a day when Life Science innovations are created in 3D virtual labs.”

That dream is about to become real. Today, we’re introducing a new Dassault Systemes brand: BIOVIA, which combines our efforts in the BioIntelligence Project, collaborative 3DEXPERIENCE technologies, and leading life and material sciences applications from the recent acquisition of Accelrys.


For some time, our mission has been to provide businesses and people with 3DEXPERIENCE universes to imagine sustainable innovations capable of harmonizing Product, Nature and Life.

Product, Nature and Life

For more than 30 years, we have been helping our customers develop their products virtually. In 2012, we expanded 3DEXPERIENCE to the virtual planet (nature) with GEOVIA, which includes Gemcom Software, Archivideo and other applications. However, a part has been missing: the ability to virtualize and simulate living cells, biology, biochemistry; in other words, life!

Today, with BIOVIA, the picture is complete. From discovering raw materials to delivering finished products, manufacturing companies will be able to predict and measure the impact of their products on the environment and on people, while optimizing their business processes to create truly sustainable innovations.

While BIOVIA is just starting out as a brand, the contribution of 3DEXPERIENCE to Life Sciences has a long track record, as demonstrated by the Living Heart project unveiled earlier this week, just one of many projects in this field.

You can learn more about BIOVIA on our website.
