The Future of PLM Will Be Built by You

Lionel Grealou · Business, Digital PLM · 4 min read

Keynote by Rob McAveney at ACE 2026 (April 2026)

At ACE 2026, a single line from Rob McAveney’s keynote stood out for its apparent simplicity and underlying weight:

The future of PLM will be built by you.

Rob McAveney, CTO at Aras

Positioned against a backdrop of AI-driven transformation and Adaptive PLM narratives, the statement was less a slogan than a reframing of responsibility. It signals a shift already underway across industrial software. PLM is no longer something delivered. It is something the enterprise must continuously design, align, and sustain.

From Platform Delivery to Operating Model Ownership

Across the conference, a consistent theme emerged. PLM is evolving from a system of record into an operational capability embedded in the business.

Rob McAveney’s keynote made that transition explicit. The emphasis was not on new features or incremental improvements, but on the conditions required for PLM to remain relevant in an AI-enabled environment. The platform provides extensibility and scale, but it does not impose coherence.

That responsibility now sits with the enterprise. It aligns with the broader positioning introduced in the conference opening sessions, where PLM was framed within an AI-driven engineering context. The implication is structural. Technology enables, but outcomes reflect how data, processes, and decisions are connected.

Adaptability Without Structure Creates Variability

Adaptive PLM was positioned at ACE 2026 as a move from narrative to operating model. The concept is gaining traction. The execution remains uneven.

The flexibility of modern enterprise platforms enables organizations to continuously evolve data models, workflows, and integrations. However, without a clear architecture and decision framework, this adaptability tends to increase variability rather than alignment.

Enterprises are expanding capabilities, but often without a consistent definition of which decisions matter across the lifecycle, what constitutes decision-grade data, and how impact is assessed prior to execution. In this context, adaptability risks becoming a form of fragmentation at scale.

Decision-Centric PLM as the Missing Layer

A notable gap in many discussions is the absence of explicit decision models.

PLM environments remain largely structured around objects and processes. Yet the real value emerges at the level of decisions: how they are defined, what evidence supports them, and how they propagate across enabling enterprise systems.

Rob McAveney’s positioning implicitly points in that direction. If the enterprise is responsible for delivering PLM, it must also define the decision logic that governs it. Without such a layer, workflows automate activity but not intent, data accumulates without clear purpose, and traceability exists without interpretability.

A decision-centric approach reframes PLM as an execution infrastructure for maintaining coherence across the lifecycle.

AI Exposes the Quality of the System

The role of AI at ACE 2026 reinforced this perspective. Across sessions, AI was consistently positioned as moving from a catalyst for searchability to an accelerator of interaction and insight. At the same time, its effectiveness was shown to depend directly on the structure of the underlying data and source systems.

Fragmented data leads to partial outputs. Missing relationships limit reasoning. Speed increases, but coherence does not.

In contrast, business environments with connected data and defined semantics enable AI to support more advanced capabilities, particularly around scenario exploration and impact assessment. This is not just an IT or technical system question.

This aligns with demonstrations of agent-based approaches, where requirements can be translated into data models and workflows. The potential impact is significant. The dependency on structured context is equally clear.

Traceability as a Behavioral Discipline

One of the more subtle but important shifts at ACE 2026 was the reframing of traceability. It is often treated as a system feature. In practice, it functions as a behavioral discipline—enabled by a structured data model.

Maintaining traceability requires consistent capture of intent, decisions, and changes across teams and systems. Without that discipline, even well-designed platforms fail to preserve meaning over time.

This becomes critical in an AI-enabled environment. Traceability determines whether outputs can be trusted, interpreted, and acted upon. It is the foundation of decision continuity, not simply a compliance requirement.

Openness and the Orchestration Challenge

Platform openness was positioned as a strategic enabler throughout the conference. Extensible architectures, integration frameworks, and ecosystem connectivity are now expected capabilities.

However, the challenge has shifted. Integration is largely solvable. Coherence is not.

As systems become more connected, the risk moves from isolation to fragmentation. Without shared semantics and aligned decision models, integration propagates inconsistency rather than resolving it.

The enterprise must therefore take on a new role: orchestrating how systems, data, and processes align around common decision logic. In practice, enterprise architecture must drive internal alignment and consistency to limit unmanaged data gaps and overlaps.

Capability, Not Configuration

The implication of Rob McAveney’s statement becomes clearer in this context. “The future of PLM will be built by you” is not about increasing configuration flexibility. It is about developing the capability to design and govern coherence.

This requires convergence across disciplines that are often kept separate: domain expertise, data modeling, process design, and governance.

Where these remain disconnected, PLM reflects organizational silos. Where they converge, PLM becomes an enabling layer for aligned execution.

A More Grounded Interpretation

ACE 2026 did not present PLM as being displaced by AI. It positioned PLM as becoming more central, provided it evolves.

The trajectory is clear. From engineering control to business-aligned decision systems. From data management to decision-grade intelligence. From integration to orchestration.

Rob McAveney’s keynote captured that transition succinctly. The future is not predefined by the platform. It is shaped by how the enterprise connects data, defines decisions, and sustains coherence over time.

The Strategic Question

PLM advantage will not be determined solely by the pace of technology adoption. It will depend on whether organizations can establish clear decision ownership across the lifecycle, consistent decision-grade data, and mechanisms to maintain coherence as systems evolve.

That is what it means, in practice, for the future of PLM to be built by the enterprise. The question is no longer whether the capability exists. It is whether organizations are structured to realize it.

What are your thoughts?


Disclaimer: articles and thoughts published on v+d do not necessarily represent the views of the company, but solely the views or interpretations of the author(s); reviews, insights, and mentions of publications, products, or services constitute neither endorsement nor a recommendation for purchase or adoption.

About the Author

Lionel Grealou

Lionel Grealou, a.k.a. Lio, helps original equipment manufacturers develop and implement their digital transformation strategies—driving organizational change, data continuity, and operational efficiency and effectiveness, and managing the lifecycle of things across enterprise platforms, from PDM to PLM, ERP, MES, PIM, CRM, and BIM. Beyond consulting roles, Lio has held leadership positions across industries, with both established OEMs and start-ups, covering the extended innovation lifecycle, from research and development to engineering, discrete and process manufacturing, procurement, finance, supply chain, operations, program management, quality, compliance, and marketing.
