PLM Upgrades – Just DON’T do it

The main thing to avoid when doing a PLM upgrade is the upgrade itself—it’s a waste.

If the technology can’t be upgraded with the tap of a finger, and most enterprise applications like PLM currently can’t, then the vendor should do it. Aras does this for our subscribers, and their customizations keep working. That’s a reality.

Accelerating the speed and frequency of PLM upgrades is key to any successful PLM implementation. The more often upgrades get done, the more successful your PLM will be. That’s another reality. You want the bug fixes, the enhancements, and the new applications, but there is no value in performing the upgrade yourself.

At a high level, IT organizations supply the infrastructure, the applications, and the governance to ensure their business partners aren’t left behind. They lay the foundation that drives business velocity and growth. When it comes to PLM, this includes continually addressing the ever-evolving needs of the business, providing regular enhancements, and evaluating and deploying new disruptive technologies that have profound business impact, such as Variant Management, Supplier Access, Systems Engineering, Digital Twins, and Simulation Process and Data Management. Consequently, having the right resources, working on the right goals, in collaboration with the business, is paramount.

The opposite is also true. Anything being done that could’ve been sustainably automated is, in effect, waste. Just as applications and servers can now be run from the Cloud, PLM software upgrades can be done by a vendor. Aras has been successfully doing this for years.

The rate at which you upgrade is key, as new capabilities are rolled out several times a year. The planning that goes into constantly evolving PLM roadmaps is significant. Regression testing, user acceptance testing, integration testing, and upgrades should be seamless and a positive experience for the business. But performing the actual upgrade should be a thing of the past, keeping your IT resources focused on business-critical activities.

My experience is that many companies get stuck on old legacy PLM systems tied to even more legacy. They’re stuck because they customized on an unsustainable technology stack. After some number of years, they realize they’re stuck, but when did it begin?

Legacy typically begins with the initial deployment. It occurs because companies that manufacture complex products have to customize their PLM solutions and make changes to the underlying data model to meet business needs and integrate with other critical business systems. The problem is that this is done in ways never envisioned by the PLM provider, who didn’t design their products to handle customizations through ongoing upgrades. Companies thus lock themselves into “legacy PLM” from the start.

As the years go by, the legacy compounds, taking on more and more customizations that increase the cost to maintain, and becomes a drag on the business.

Even if it were possible to upgrade, the high costs and the risk of disrupting the business often dissuade groups from it. Too often they wait for that Titanic moment when the risk of the PLM system failing is real. No longer on a supported version, they finally begin weighing alternatives. The truth is they likely hit the iceberg during deployment and have been taking on water all along, but the old-school mentality is to keep pumping money at it and keep the business running.

This is so prevalent that I would argue the very term “PLM” gets a bad rap with the “C-suite” because, in some instances, it’s a money pit. When the CEO wants to digitally transform the business, it’s hard for me to imagine them giving the nod to the PLM team that hasn’t upgraded in years.

The reality is that no enterprise software is ever going to anticipate every need. Even the best can’t anticipate everything or accomplish everything on a company’s timetable. Best practices are a good thing, but they are still yesterday’s practices, so businesses need the ability to configure and customize, as I described in a previous blog post entitled Out-of-the-Box is a PLM Fantasy.

Digital transformations require IT organizations to deliver enhancements, customizations, and new technologies, and this can’t be done with yesterday’s technology and outdated processes. An IT organization should be rolling out new PLM capabilities every eight or so weeks, relying on their PLM vendor to upgrade them approximately every year.

At Aras, we do the upgrades, which includes a first pass of the customer’s acceptance testing along with regression testing.

This is a portion of CIMdata’s commentary on it:

“. . . it is important to note that the guaranteed upgrade of a highly customized PLM solution is not only possible, but is in fact commonplace among Aras subscribers. Many report that the upgraded database is typically returned from Aras within two weeks. Aras says the technical part of the upgrade generally takes a few hours, with the remaining time consumed with validation testing. Customers still need to validate integrations and perform their own acceptance testing.” (CIMdata, May 2018)

The following is CIMdata’s full commentary on the Aras PLM Platform: Redefining Customizations & Upgrades.

There’s no longer a reason to fall behind in keeping your business on the latest and greatest versions and taking advantage of new applications. I urge you to consider where you concentrate your IT resources and investments. I believe they are best focused on technologies that can quickly address your company’s changing market needs and enable you to roll out enhancements that drive business velocity.

I’ll end with how I started. My advice on doing upgrades is to stop doing them. www.aras.com
