Harmonize vs Integrate

Lionel Grealou · Digital PLM · 3 minutes


There is a long-standing debate about Product Lifecycle Management (PLM), the wider enterprise IT application landscape, and how the ecosystem should be architected: either around one core ‘harmonized’ or consolidated platform used across the relevant business units with a single PLM backbone, or around an ‘integrated’ platform connecting different solutions across multiple business units or functional groups. There are fundamental differences between the two approaches, but also common considerations. To understand these characteristics, it is interesting to look back at the various approaches adopted over the past 30 years across the PLM application landscape:

1. Point-to-point architecture: various PLM or IT systems are connected through bespoke interfaces, with limited scalability and a high cost of ownership (maintenance); the number of interfaces grows rapidly with the number of systems, as the short sketch after the pros and cons below illustrates.

  • Pros: simple individual interfaces, easy to make minor changes.
  • Cons: duplication, many connections / interfaces, lack of consistency, reusability and scalability, high maintenance cost.
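To make the scalability concern concrete, here is a rough back-of-envelope sketch in Python, assuming the worst case where every system needs to exchange data with every other system (the landscape sizes are purely illustrative):

```python
def point_to_point_interfaces(n_systems: int) -> int:
    """Bespoke interfaces required when each pair of systems is wired directly."""
    return n_systems * (n_systems - 1) // 2

# Illustrative landscape sizes only; real application estates vary widely.
for n in (5, 10, 20, 40):
    print(f"{n} systems -> {point_to_point_interfaces(n)} interfaces to build and maintain")
# 5 -> 10, 10 -> 45, 20 -> 190, 40 -> 780
```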

2. Hub-and-spoke architecture: some PLM or IT applications / databases are deployed as global or local hubs (one or more) that the other local applications interface with. This approach helps reduce the cost of ownership by minimizing the number of point-to-point interfaces; it is also a valid way to work around connectivity challenges and technical limitations (see the sketch after the pros and cons below).

  • Pros: limited number of connections / interfaces, improved cost of ownership with reusable interfaces and processes, clear roles and responsibilities at hub and satellite levels, scalable solution.
  • Cons: some (controlled?) duplication, can become complex with many hubs, and effectiveness also depends on the middleware integration approach.
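As a contrast to the point-to-point sketch above, here is a minimal Python sketch of the hub-and-spoke idea: a hypothetical hub maps each spoke’s local record format onto a shared canonical form, so each new spoke adds one mapping rather than one interface per existing system (system and field names are invented for illustration):

```python
# Minimal hub-and-spoke sketch: each spoke registers one mapping to a shared
# canonical part record. Field and system names are hypothetical.

CANONICAL_FIELDS = ("part_number", "revision", "description")

class Hub:
    def __init__(self):
        self.mappings = {}  # spoke name -> {canonical field: local field}

    def register_spoke(self, name, field_map):
        self.mappings[name] = field_map

    def to_canonical(self, spoke, local_record):
        """Translate one spoke's local record into the canonical form."""
        field_map = self.mappings[spoke]
        return {f: local_record[field_map[f]] for f in CANONICAL_FIELDS}

hub = Hub()
hub.register_spoke("legacy_pdm", {"part_number": "PARTNO", "revision": "REV", "description": "DESC"})
hub.register_spoke("erp", {"part_number": "MaterialNumber", "revision": "Version", "description": "ShortText"})

record = {"PARTNO": "A-1001", "REV": "B", "DESC": "Bracket, front"}
print(hub.to_canonical("legacy_pdm", record))
# {'part_number': 'A-1001', 'revision': 'B', 'description': 'Bracket, front'}
```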

3. Single monolithic holistic / harmonized platform: a single central PLM or IT application (or, more realistically, a set of applications) that is accessible globally, with some form of remote data alignment to allow effective operations.

  • Pros: ‘one size fits all’ single solution, limited data duplication and synchronization (though some caching is required for effective file management, etc.), leveraging common / global metadata.
  • Cons: high risk in relying on a single ‘hub’ application or set of applications (and high dependency on one vendor), difficulty selecting ‘best-in-breed’ technology as solutions evolve, challenges in aligning all product projects and programs across various maturity cycles, challenges in maintaining certification requirements (especially for historical / legacy data), and a complex and (very) lengthy transition into a fully ‘harmonized’ solution (if ever feasible?).

4. Service-oriented integrated architecture: an intelligent network of applications exchanging data with some level of integration, using effective middleware or an Enterprise Service Bus (ESB); a minimal publish / subscribe sketch follows the pros and cons below.

  • Pros: effective open architecture for integration with reusable service contracts, efficient data payload optimization, a central SOA integration framework with the ability to manage different queues and shadow ESBs, scalable, limited need to re-educate users.
  • Cons: some data and process duplication across many applications, complex integration scenarios, certification can be difficult as components come from different sources, challenges in retiring legacy IT, and little incentive to change as it might seem simple enough to ‘integrate everything with everything’.
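To illustrate the ESB principle, here is a toy publish / subscribe sketch in Python; a real ESB would add service contracts, transformation, routing and queuing on top of this, and the topic and payload names are hypothetical:

```python
# Toy in-memory service bus: publishers and subscribers agree only on a topic
# and a payload contract, never on each other. Names are hypothetical.

from collections import defaultdict

class ServiceBus:
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers[topic]:
            callback(payload)

bus = ServiceBus()
# ERP and MES both consume engineering changes without knowing the PLM system.
bus.subscribe("engineering.change.released", lambda msg: print("ERP picks up", msg["change_id"]))
bus.subscribe("engineering.change.released", lambda msg: print("MES picks up", msg["change_id"]))

# The PLM system only knows the topic and the payload contract.
bus.publish("engineering.change.released", {"change_id": "ECN-0042", "parts": ["A-1001"]})
```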

Having discussed the four possible architectures and approaches above, it seems clear that options 1 (point-to-point) and 3 (single monolithic harmonized) are the least optimal due to complexity and cost. A combination of options 2 (hub-and-spoke) and 4 (integrated / ESB) appears to be the most logical solution for the future of data management, helping manufacturers with:

  • Enabling optimized data management strategies with a distributed architecture.
  • Maximizing the value of data repositories and multiple fit-for-purpose applications of varying characteristics, which can be used for different sorts of analysis. 
  • Keeping business data for longer and for a wider variety of analytics, leveraging big data and contextual access to data at the source, where it is created and mastered in the first place (sketched below).
  • Removing the need for capital-intensive data warehousing infrastructure.
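As a rough illustration of ‘contextual access to data at the source’, here is a minimal federated-query sketch in Python, assuming hypothetical per-system query functions in place of a central warehouse copy:

```python
# Federated query sketch: analytics pull data from the system where it is
# mastered rather than from a warehoused copy. Systems and fields are hypothetical.

def query_plm(part_number):
    # In practice this would call the PLM system's API.
    return {"part_number": part_number, "revision": "B", "status": "Released"}

def query_erp(part_number):
    # In practice this would call the ERP system's API.
    return {"part_number": part_number, "unit_cost": 12.40, "on_hand": 350}

def part_report(part_number):
    """Combine live answers from each master system on demand."""
    return {**query_plm(part_number), **query_erp(part_number)}

print(part_report("A-1001"))
```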

What are your thoughts?


This post was originally published on LinkedIn on 19 November 2015.

About the Author

Lionel Grealou


Lionel Grealou, a.k.a. Lio, helps original equipment manufacturers transform, develop, and implement their digital transformation strategies, driving organizational change, data continuity and process improvement, and managing the lifecycle of things across enterprise platforms, from PDM to PLM, ERP, MES, PIM, CRM, or BIM. Beyond consulting roles, Lio has held leadership positions across industries, with both established OEMs and start-ups, covering the extended innovation lifecycle, from research and development to engineering, discrete and process manufacturing, procurement, finance, supply chain, operations, program management, quality, compliance, marketing, etc.
