Create Once, Use Many Times

Lionel Grealou · Data, Engineering, PLM


Robust information and (meta-)data management is a core requirement for Engineering and Manufacturing functions to deliver new product developments and innovations on a solid data foundation. Managing Bills of Materials (BoMs), the "lifeblood of manufacturing companies", effectively (or not) can make or break the Product Life-cycle Management (PLM) experience; where poor BoM management does not break it outright, it makes it painful and inefficient.

Robust Engineering data management requires 6 Critical Success Factors (CSFs):

  1. Business process owners and data owners are clearly identified and mapped, so that everyone understands and agrees where data 'sources' are located and how data is generated and managed across processes (data owners are likely to differ from process owners, as data flows throughout the enterprise, not just Engineering and Manufacturing).
  2. The enterprise has adopted a robust Master Data Management (MDM) framework: minimizing data duplication, applying "create once, use many times" strategies, and designing adaptable data structures that are fit for purpose and respect data-interface and integration-tool constraints. Information is stored only once, in one location / one system (ideally a "single source of truth"), with one master record consumed downstream in several relevant views (sometimes phrased as 'no more than a single row of a single table'), and with a clear set of processes to manage it (Product Development Life-cycle Efficiency).
  3. A robust Enterprise Service Bus (ESB) acts as a 'hub' to receive updates of data that has changed in another system, limiting duplication and increasing re-use, and as a 'broker' to determine which updates are to be regarded as authoritative.
  4. Data models, business objects and their interoperability are effectively governed by a corporate business data architecture committee (managed by the business, not solely by corporate IT).
  5. Flawless translation and transformation of metadata are required between business functions and between businesses, from record keeping to archival.
  6. Data quality and integrity require continuous improvement, business engagement, rigour and perseverance (especially with historical legacy data that is not clearly ‘owned‘ across the business); it is not just something to do when considering data migration from one system to another.
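
The 'hub' and 'broker' roles described in CSF 3 can be sketched in a few lines. This is a minimal illustration, not a real ESB: all class, field and system names (`ServiceBus`, `ChangeEvent`, "PLM", "ERP") are assumptions introduced for the example.

```python
from dataclasses import dataclass

@dataclass
class ChangeEvent:
    object_type: str    # e.g. "part", "bom_line"
    object_id: str
    source_system: str  # e.g. "PLM", "ERP"
    payload: dict

class ServiceBus:
    """Hub: receives change events. Broker: keeps only authoritative ones."""

    def __init__(self, authoritative_sources: dict):
        # object_type -> the single system allowed to master that object type
        self.authoritative_sources = authoritative_sources
        self.subscribers = {}

    def subscribe(self, object_type, callback):
        self.subscribers.setdefault(object_type, []).append(callback)

    def publish(self, event: ChangeEvent) -> bool:
        # Broker role: drop updates from non-authoritative systems,
        # so duplicates never propagate downstream.
        if self.authoritative_sources.get(event.object_type) != event.source_system:
            return False
        for callback in self.subscribers.get(event.object_type, []):
            callback(event)
        return True

bus = ServiceBus({"part": "PLM", "purchase_order": "ERP"})
received = []
bus.subscribe("part", received.append)

accepted = bus.publish(ChangeEvent("part", "P-100", "PLM", {"rev": "B"}))
rejected = bus.publish(ChangeEvent("part", "P-100", "ERP", {"rev": "C"}))
```

Here only the PLM update for parts reaches subscribers; the conflicting ERP update is rejected at the bus, which is the 'authoritative update' arbitration the CSF describes.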

MDM is about data structure and business-object governance, data architecture standards, hierarchical data models, interoperability, interfaces, enterprise IT data alignment, business data understanding, conscious decisions as to where the master data is stored, synchronization, integration and data re-use strategies, change management, data archiving and legacy decommissioning strategies, and long-term thinking (flexibility, scalability, relevance, fit-for-purpose, impact of change, etc.).
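
The "create once, use many times" idea above can be made concrete with a sketch: the part master is stored exactly once, and downstream functions consume projections of that single record rather than their own copies. The record and field names (`MASTER_PARTS`, `mass_kg`, etc.) are illustrative assumptions.

```python
# Single master store: each part exists once, in one place.
MASTER_PARTS = {
    "P-100": {
        "number": "P-100",
        "description": "Bracket, front subframe",
        "mass_kg": 0.42,
        "material": "AA6061",
        "supplier": "ACME",
        "unit_cost": 3.20,
    }
}

def view(part_id: str, fields: tuple) -> dict:
    """Project a downstream view from the single master record."""
    master = MASTER_PARTS[part_id]
    return {f: master[f] for f in fields}

# Engineering and procurement see different projections of the same master:
engineering_view = view("P-100", ("number", "description", "mass_kg", "material"))
procurement_view = view("P-100", ("number", "supplier", "unit_cost"))

# A change is made once, in the master; there is no second copy to re-synchronise:
MASTER_PARTS["P-100"]["mass_kg"] = 0.45
updated = view("P-100", ("number", "mass_kg"))
```

The design point is that views are derived on demand from the master, so an update made once is immediately what every consumer sees, which is the 'no more than a single row of a single table' discipline.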

Typical challenges with (meta-)data management include 'breaking' (or integrating) metadata silos with limited interoperability, metadata generation and deployment that is semi-automated at best, resource-intensive and application-specific, multiplication of local data registries, non-standardised data representations and communication protocols, and limited forward thinking in terms of archival control. One key challenge is also to adapt and interface between various PLM and ERP systems that have off-the-shelf data models which cannot be modified in non-trivial ways, or which specifically require (by design) duplication to operate (perhaps because software authors consider their solutions to be the 'source' of master data). Finding smart ways to minimize such impacts and implications is critical to lean MDM strategies that enable the "single version of truth" vision.
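
One lean way to interface with an off-the-shelf data model that cannot be modified is an explicit field-mapping layer, translating records at the boundary instead of duplicating master data in both systems. A minimal sketch, assuming hypothetical PLM field names and ERP target fields (the mapping table and the fixed unit field are illustrative, not any vendor's actual schema):

```python
# Mapping table: PLM attribute -> fixed ERP field name.
# The ERP schema cannot be changed, so the translation happens here.
PLM_TO_ERP_FIELDS = {
    "part_number": "MATNR",
    "description": "MAKTX",
    "mass_kg": "BRGEW",
}

def to_erp(plm_record: dict) -> dict:
    """Translate a PLM record into the ERP system's fixed schema."""
    erp_record = {erp: plm_record[plm] for plm, erp in PLM_TO_ERP_FIELDS.items()}
    erp_record["GEWEI"] = "KG"  # assumed: ERP requires an explicit weight unit
    return erp_record

erp = to_erp({"part_number": "P-100",
              "description": "Bracket, front subframe",
              "mass_kg": 0.42})
```

Keeping the mapping in one declarative table makes the interface auditable and keeps the PLM side as the single master, with the ERP record always derived rather than independently maintained.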

What are your thoughts?


This post was originally published on LinkedIn on 5 May 2015.