Get Savvy about PLM

February 5, 2015

The mysteries of PLM in the Cloud

Filed under: Uncategorized — Laila Hirr @ 11:06 am

Over the past year the hype around “the cloud” has grown substantially, but so has the confusion. What is the cloud? Should I fear it or love it? I’ve started digging into the confusion around the cloud, and specifically the impacts it has on PLM.

First there’s the question of terminology

Private, Public, Single Tenant, Multi-Tenant, Hosted, SaaS, On-Premise, data centers, and more. At a fundamental level, these terms all refer to various forms of the cloud. Yes – even “on-premise.” But here are some clarifications that may help:

  • Private, single-tenant, and on-premise tend to refer to a single enterprise application per server per business. Your business owns or leases the equipment, the software, and the databases, whether onsite at the business, at a data center, or at some hosted location. The configuration of the software is unique to the business and may be managed by the business or by contracted IT administration services.
  • Public and multi-tenant cloud offerings are typically a single instance of the software, managed by the supplier, with multiple businesses using the system on the same version/update level of the software; the application data, however, is isolated to the security level of each client business. Configuration capabilities are more restrictive, as customizations are typically not supported. Features are added globally with any given update, and the client business usually has the choice of whether a given feature is enabled for its end users.
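The multi-tenant data isolation described above can be pictured with a minimal sketch: all client businesses share one application instance and one store, but every record carries a tenant identifier and every query is scoped to the requesting business. The names here (`TenantStore`, the tenant IDs) are purely illustrative assumptions, not any PLM vendor’s actual data model.

```python
# Illustrative sketch of multi-tenant isolation: one shared instance,
# per-tenant data partitioning. Names are hypothetical, not a vendor API.

class TenantStore:
    """One shared application instance; data partitioned per client business."""

    def __init__(self):
        self._records = []  # all tenants share one underlying store

    def add(self, tenant_id, payload):
        self._records.append({"tenant_id": tenant_id, "payload": payload})

    def query(self, tenant_id):
        # Isolation: a tenant can only ever see its own rows.
        return [r["payload"] for r in self._records
                if r["tenant_id"] == tenant_id]


store = TenantStore()
store.add("acme", "Part 100-001 rev B")
store.add("globex", "Part 200-417 rev A")

print(store.query("acme"))  # → ['Part 100-001 rev B']
```

Note that upgrades touch the single shared instance, which is why everyone runs on the same version and why deep customizations are typically off the table.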

Most of the PLM vendors have, or plan to have, cloud offerings, but a given offering can weigh heavily toward one or the other of the directions described above. The level of feature offerings may vary, and the data and security models may or may not be appropriate for your business.

I’ll be discussing the details of perceptions and issues around PLM and the Cloud during the upcoming CIMdata webinar on March 11th, so please join me for more insights on this topic.

November 7, 2014

Reminded this week why I love my work in PLM

Filed under: Uncategorized — Laila Hirr @ 5:06 pm

As many of you know, I have recently returned to the realm of vendor-neutral PLM advising. After doing a two-day PLM checkup with a fantastic customer, I was very firmly reminded of why this service is so important.

The reality is that while manufacturing companies have much in common, each has unique issues that make one solution better for one company, while another set of issues makes that same solution completely unsuited.

It was gratifying to have so much affirmation from the customer that in a mere two days we were able to help them get a handle on the challenges they were facing. They were also excited to be clearer on the path they need to follow and on which companies really made sense for them to consider.

Each company has its own unique issues; some are tied to the industry they work in, and some are related to the complexity of the products they produce. Whether the product is a piece of wood, a silicon chip, a car, a set of services, packaging, or a toy, all of these have product information to manage, and having the right tools is essential.

HOWEVER – lest anyone say the tool is the means to success – no… the tool enables – PEOPLE innovate!

October 10, 2014

Is it a new trend or an old one – spinoffs, divestitures and splits in High Tech? Are we ready?

Filed under: Uncategorized — Laila Hirr @ 10:12 am

This week both HP and Symantec announced plans to “split,” and the questions this raises for employees, operations, production, and yes… enterprise systems will be interesting.

When mergers and acquisitions take place, there is always fear, uncertainty, and doubt. I recall being involved with a company that had gone through five acquisitions in as many years, where I helped roll out CAD and PDM with each deal. I watched manufacturing operations, HR, sales, service, and engineering all wrestle with merging cultures, standardizing practices, and consolidating disparate systems. Certain departments always have redundancy in mergers, and we all know which departments those are…

But what of spin-offs? Which group owns which products, and which group needs what data? Will gaps arise because the infrastructure and the enterprise systems are too big, too enmeshed, to handle the segregation of the information? These are not trivial aspects of a spin-off, divestiture, or split. Splitting enterprise systems requires formal project execution, planning, and funding, just as merging those systems did to begin with.

Many companies, large and small, go through these splits. Like a divorce, even when amicable, every asset, every piece of data, every stick of furniture, every dollar must be evaluated and examined for end ownership.

September 26, 2014

Does Historical PDM Usage limit how we view PLM today?

Filed under: Uncategorized — Laila Hirr @ 7:56 am

Early in my career, as I was implementing 3D solid modeling across five corporate divisions, one of the manufacturing engineers, skeptical about the incoming tool set, asked if I had “ever used a drafting board.” My response was, “Not only have I used a drafting board, I have also used a slide rule and punch cards – and I don’t think any of us would argue that we should go back to those tools today.” That said, we did what every company was doing at that time: implementing CAD on high-end personal computers while mimicking the practices of the drafting board. My premise is that we are doing the same today with PLM.

Over the past ten to fifteen years, companies implemented PDM (Product Data Management) with an emphasis on part number creation, drawing management, and engineering change management workflows. The issue is that the tools were fundamentally defined by manufacturing product structures, not by the engineering innovation perspective. So in implementing PDM, companies were replicating the practices of the document control organizations of manufacturing companies rather than serving the need to innovate. Manufacturing effectively requires “control,” yet fostering innovation requires “freedom” – companies have had to balance this dichotomy for years.

Fifteen years ago I saw a chart of the fuzzy boundary between structured and unstructured data, compared against IT investment: it indicated that 80% of IT budgets were spent on managing structured data, with the balance spent supporting unstructured data – yet structured data represented only 20% of the information supported by those same departments. The point of that diagram was to stress the importance of thickening the fuzzy boundary between structured and unstructured data in a manner that is intuitive and truly supports innovation, giving end users the freedom to innovate while assuring that the corporate intellectual assets resulting from that innovation are secured.


Many PLM systems today are quite capable of managing more than a manufacturing bill of materials and can do much more than support engineering change management. Yet industry users repeatedly use PLM as an engineering document management tool rather than building a comprehensive roadmap for leveraging their enterprise solutions to truly benefit from thickening the fuzzy boundary between structured and unstructured data, and to truly enable innovation.

I would love to hear from you on this topic. Do you feel that your business has effectively implemented PLM beyond the replacement of a paper change management process? Have you leveraged PLM for the fuzzier boundaries? Does your business need assistance in getting to the heart of this problem?

For more information stay tuned to CIMdata for an upcoming webinar on this topic.

September 8, 2014

PLMSavvy Blog – back in business

Filed under: Uncategorized — Laila Hirr @ 3:40 pm

I’m very pleased to confirm today’s announcement by CIMdata that I have returned to the world of vendor-neutral analytics and consulting. Keep your eyes open for a renewed emphasis on PLM-focused topics and updates.

November 21, 2011


Filed under: Lessons Learned,Mythbusting,Organization — Laila Hirr @ 12:11 pm

I have observed some interesting dynamics around the use of consultants – what makes them valued, what makes them effective, and why people avoid them. Note that I DO make a distinction between Consultants and Contractors. Consultants should bring value, business-level expertise, leadership, and a point of view, and should be considered partners of the company engaging them. Contractors are typically brought on for task-level activities or specific technical solutions, or to address gaps in capacity to complete a given level of effort (commonly thought of as staff augmentation).

1) Team member and partner? In this economy, when engaged on a project/program, deliberate endorsement of the consultant is mandatory. On one engagement, a contractor was frequently pointed to as a shining example of success while the consultant was not – yet the consultant brought more to the table at a strategic level. So why was the customer perception reversed? First of all, the two were handled differently in how they were introduced and engaged with the program stakeholders. The contractor-level resource was escorted by the director to every program meeting, introduced, and given a vocal endorsement. The consultant, interestingly enough, was viewed as a leader and therefore was let loose to go it alone, having to forge a relationship path to every doorway. Only after several months did the customer realize the impact this had on the consultant’s ability to drive strategic results.

2) Not Invented Here – is the resistance to leveraging the knowledge of a consultant team related to NIH mentality, or is it fear of airing one’s laundry? Both are motivations based on the desire to prove one’s own value, at the cost of missing strategic opportunities. One group was so focused on doing it themselves that even when offered knowledgeable consulting support that would yield millions in ROI, the team was adamant that they could not even spare the time to use the help (even though the help was funded). Another team, realizing that they could leverage the help, made the same consulting team integral to their work. The value of a consultant is only realized if the consultant is engaged.

3) Can’t afford it – resources, time, or dollars. Repeatedly I see organizations say this, and as a result they end up requiring years instead of months to gather the knowledge, practices, and expertise required to execute major strategic initiatives that can easily be documented as having tremendous ROI and business impact. With proper identification of the business value and benefits, the cost of the consultants can be clearly offset by an accelerated path to delivering the solution that brings the value.

4) Disregarding the counsel given when it is unpopular – this one always surprises me. Why pay for a consultant who has industry knowledge, expertise, and an understanding of the myriad complexities of their area of specialization, and then minimize the importance of the issues they call out – whether risks, costs, recommendations, or opportunities?

If you are interested in further perspectives on the use of consultants, I recommend the book “Taking Advice: How Leaders Get Good Counsel and Use It Wisely” by Dan Ciampa.

May 17, 2010

Quality Management Systems (QMS), Document Management (DMS) or PLM? What are the differences?

Filed under: Mythbusting,PLM,System Selection — Laila Hirr @ 1:55 am

Copyright 2007, LRHirr

Definition of QMS/DMS/PLM

This topic has been sitting in my notes as a to-do for a very long time. From a very high-level perspective, terms like knowledge management, document management, quality management, and product management just create confusion for folks – are they all referring to the same thing?

Quality Management Systems (QMS) are focused upon a company’s quality records, often including the ability to manage basic workflows around corrective actions/preventive actions (CAPA), statistical process controls, and non-conformance reports. Most QMS offerings are industry focused and are often tied to the medical device and pharmaceutical industries. A QMS may or may not have document management system (DMS) level capabilities.

DMS systems are focused upon managing documents electronically. They are typically structured to support “file folder” based information and have workflows to support revision history and change processes around those electronic files. DMS systems typically cannot support product structures or the more complex relationships involved in tying documents to a bill of materials. Some document management systems will “build” a compiled content site for large-scale web environments that are sensitive to rollouts of new art, press releases, special offers, and such.

Product Lifecycle Management (PLM) systems may or may not have QMS-level templates; however, they support a broad range of workflows, are highly adaptive, and handle the change management of documents, folder-based file management, and product-structure-related file management. PLM systems often include capabilities to manage CAD data, bills of material, program management, requirements management, view and markup capabilities, and supplier access to key product-related data. PLM will not “build” a content site.
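The key structural difference above – a DMS organizes files into folders while PLM ties documents to positions in a product structure – can be sketched in a few lines. The classes and part numbers here are invented for illustration; no vendor’s actual schema is implied.

```python
# Illustrative sketch: in a PLM-style model, documents attach to parts in a
# bill of materials, so a drawing travels with its part rather than living
# in a folder hierarchy. All names here are hypothetical.

class Part:
    def __init__(self, number, documents=None):
        self.number = number
        self.documents = documents or []  # drawings/specs tied to this part
        self.children = []                # child parts in the product structure

    def add_child(self, part):
        self.children.append(part)

    def all_documents(self):
        """Walk the BOM and collect every attached document."""
        docs = list(self.documents)
        for child in self.children:
            docs.extend(child.all_documents())
        return docs


bike = Part("BIKE-001", ["assembly-drawing.pdf"])
frame = Part("FRM-100", ["frame-drawing.pdf", "weld-spec.doc"])
bike.add_child(frame)

# Every document reachable through the product structure, independent of
# any folder layout:
print(bike.all_documents())
```

A folder-based DMS has no equivalent of that traversal, which is why tying documents to a bill of materials is the dividing line between the two classes of system.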

November 18, 2009

Fundamentals of PLM Seminar Video

Filed under: Uncategorized — Laila Hirr @ 5:51 pm

I was recently asked to speak at Portland State University for the School of Engineering Technology Management’s Graduate Seminar. What was unexpected was that they filmed the session to post on their site, which is viewed internationally by the PICMET community. PICMET stands for the Portland International Center for Management of Engineering and Technology (an annual conference attended by academia and industry from around the world).


The seminar series is found at

The recording of my seminar is listed at



April 22, 2009

Successful Implementations – Invest in Phase 1 – Establish the Business Case for PLM

A reader recently asked me to share perspectives on successful PLM deployments – what are the common themes across successful implementations.

The thought that comes to mind first and foremost is the upfront planning.  By planning I don’t mean the development of a project schedule or of the system requirements – it’s the investment in taking the time to really make sure the business justification is well understood.

There is a process I have used repeatedly that involves managing expectations very carefully before engaging the software vendor.  Too often it seems that the PLM decision is reactive – as sold by the software vendor, not based upon a clear set of business drivers.

To be successful with these process steps, executive-level endorsement is critical. The process has a lot in common with a business process assessment in that it probes the entire organization; however, the assessment is focused upon the product record, not operations or production. The challenge for many is that having someone who knows PLM well enough to see the opportunities is often not a skill set contained within your company. The cost of consulting for this stage is often well worth it when taking into account the overall investment you may end up making in the implementation.

The single most significant thing to do is to look at the PLM project with the same kind of rigor as a product development project. Too often IT projects (coming from a non-revenue perspective) fail to use simple PMO types of processes. It all starts with the business case phase.


For successful implementations, the business case stage would include the following activities:

  1. Take your organization chart and mark every department that handles product-related information; identify key external partners that also interact with your product records (note – do not count your financial transactions here; just focus on the product information).
  2. Evaluate each department’s handling of product information or the product record; look for manual transactions, duplicated entries, printouts, markups, shadow databases, and network storage of product information. Include examining what IT infrastructure components are involved.
  3. Dollarize the impact of the “non-value-added” activities associated with these product record transactions. The caution is that executives will readily let you turn this into a blame game rather than a financial business case, so be careful how this is framed (read The Dollarization Discipline for more on this).
  4. As in product development project management practices, develop a high-level specification (similar to a Market Requirements Document) that ties business objectives to functionality requirements. This is NOT where to say it must run on xyz database or have n-tier architecture; this is a market-level specification for executive consumption.
  5. Obtain executive buy-in for the next stage; do not go too deep into specification definition until this phase is approved.
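The “dollarize” step above is simple arithmetic, but writing it down is what turns anecdotes into a business case. Every figure in this sketch is invented for illustration; plug in the numbers you gathered in steps 1 and 2.

```python
# Hypothetical dollarization example – all figures are invented, for
# illustration only: estimate the annual cost of non-value-added
# product-record handling.

hours_per_week_per_person = 4   # manual re-entry, printouts, markups, etc.
people_affected = 60            # headcount across the departments from step 1
loaded_hourly_rate = 75         # fully loaded labor cost, dollars/hour
weeks_per_year = 48

annual_cost = (hours_per_week_per_person * people_affected
               * loaded_hourly_rate * weeks_per_year)

print(f"Annual non-value-added cost: ${annual_cost:,}")
# → Annual non-value-added cost: $864,000
```

A number like this, framed as a process cost rather than a department’s fault, is what carries the business case to executives.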

January 7, 2009

Today’s economy and the enterprise: Now or Never

Filed under: Cost and Benefits,Enabling Technology — Laila Hirr @ 11:57 am

I watched Charlie Rose last night as he met with Léo Apotheker, co-CEO of SAP AG, and Andrew McAfee of Harvard Business School to discuss the state of the economy and IT. Mr. Apotheker stated at the end of the discussion that in today’s economy there are two behaviors in IT: Now or Never. Companies recognize that either they invest now in the process factories (his description of enterprise applications) for efficiencies, or they risk not surviving.

Some of the key points they discussed that I found very pertinent were:

1) The lack of integration between systems has been an Achilles’ heel for the IT industry. This was most clearly brought out in the examination of the intelligence community after 9/11 and in today’s banking situation. Too many systems, and too-isolated systems, have led to breakdowns in critical functions. The good news is that more and more enterprise applications have been making access to their data structures simpler, and thus easier to build integrations against.

2) The need for flexibility to foster use. As Andrew McAfee stated, it was once thought that to get a good outcome you had to tightly control the process, yet wikis and open source show us that organic controls are working.

3) Connectivity inside, outside, and across boundaries comes through process. Léo Apotheker gave a very visual description of this environment. To add to his words, I’d suggest you take a look at Code Swarm videos; they give a very good sense of the complexities and interactions of product development (for code). After watching one, just imagine interactions that include electrical and mechanical design, not just software, and you begin to get a sense of how crucial access, vaulting, and change management are across both the internal and external enterprise.

4) The emergence of “cloud” computing for the enterprise – the use of the internet, or the “space out there,” as the environment for enterprise operations.

In their discussion they covered other topics such as the challenges of employee retention and the viability of open source for the enterprise. I’m pleased that the Charlie Rose program posts replays on its website, since the show airs so very late at night.

