Friday, June 29, 2012

A Brief Look at MES Products From a Historical Perspective

I am working with a number of Life Science manufacturing companies that have taken a strategic approach to their manufacturing systems landscape. There is a lot of buzz on this topic in the industry, which makes it that much more interesting, but also brings some challenges. I am generally fond of taking a historical perspective, so I decided to do the same for the MES software products in the life science industry. This perspective is just mine, and I am sure there are many more that can be offered by my peers in the industry – a subtle hint.

So let’s start in the 1980’s, the decade that gave us CIM and a growing awareness of the role that computers play in manufacturing operations. The focus at that time was how computer systems, aka software, could be used to increase efficiencies and manage complexity. In fact, computer technology was gaining so much momentum that it was considered a major element in revolutionizing the manufacturing landscape, in parallel with the advent of the Lean movement.

This gave birth to quite a few Manufacturing Execution System (MES) product companies in the 1980’s. The 90’s then brought a massive development and spread of information technology, which is now at the core of everything we do today, not only in manufacturing. Manufacturing operations are becoming so dependent on information that these systems have to be considered at a strategic level. Initially this strategic focus was given to expensive business systems such as ERP; however, it is becoming evident that other systems, specifically MES, sometimes have more impact on the bottom line and should be considered equally strategic.

MES software products evolved in different industries, and their roots manifest themselves in both the functionality and the corresponding MES vendor’s organization. Companies and products that currently serve the Life Science industry generally have their roots in the semiconductor and electronics industry, and understandably also in the Life Science industry itself.

In the industries outside Pharma and Bio-Pharma, MES was introduced to deal with the inherent high level of automation and complexity of high volume manufacturing processes, where lowering cost and increasing production throughput were crucial. It was virtually impossible to manually manage the wealth and complexity of information, and MES provided a solution. These MES products were centered on a discrete workflow model that provided rich modeling capabilities while at the same time allowing customizations. In fact, early MESs were merely toolboxes with a workflow engine, rich data modeling capabilities and tools to custom build user interfaces and business logic.

In the Life Science industry the main driver for introducing MES was compliance, or the electronic batch record, and therefore the first such systems provided a “paper-on-glass” solution. The idea was to simply digitize the paper batch records, kind of like the old “overhead projectors”. These systems had simple modeling capabilities and did not allow for much customization. In many cases these “paper-on-glass” systems were supplemented with business logic built as customizations in the automation system. They were commonly implemented in pharmaceutical plants, where the focus on compliance meant low tolerance for customizations and a minimum of change after systems were commissioned. This resulted in MES functionality that was split between heavily customized automation applications and a “canned” paper-on-glass system to deal with batch records. The Weigh and Dispense feature of these systems was used mostly for traditional pre-weigh activities, where the materials are weighed and staged before the process.

In the 2000’s a consolidation started in which some of the independent MES vendors were acquired by the major automation vendors and positioned into the life sciences industry. This introduced the rich modeling capabilities that grew out of the semiconductor and electronics industry to a Life Sciences industry accustomed to “paper-on-glass” systems. This leaves us today with a wide choice of MES products that are rapidly gaining maturity and sophistication in the form of advanced functionality and interoperability. I think that this maturity is an important factor and plays nicely into the strategic nature of most Manufacturing Systems initiatives that I have been involved in. There is still a long road ahead, but I have not been this optimistic about the Manufacturing Systems domain in a long time. It certainly looks like there are some very interesting and also challenging years ahead as we work to execute on these strategic initiatives.

Friday, May 18, 2012

How hard is it to define MI requirements?

Well, let's say it's not straightforward! For quite some time I have been trying to explain the difficulties in providing a clear specification for a Manufacturing Intelligence (MI) system. In addition, the life science industry is still bound by traditional methods of analysis, requirements specs and functional specs that, simply put, will not work for MI. MI's whole purpose is to provide information to somebody to support his creative behavior as he explores root causes and solutions in the dynamic world of manufacturing operations. (This is an extract from a forthcoming article in Pharmaceutical Engineering.)

The right approach is based on the critical element of understanding how people use information to solve problems and gauge performance. There is a clear need to provide the effective and relevant information necessary to support the different roles in manufacturing operations. Identifying what needs to be measured is a fundamental principle, but it is not sufficient; the information also has to be arranged in a usable manner. Therefore it is important to study and understand information consumption patterns by role. Take for example the information consumption pattern of a supervisor in a biotech plant who is creatively analyzing a production event.


The production supervisor glances at his dashboard and observes that the Cell Density is not within acceptable limits. He immediately navigates to view the “Cell Density by Time” trend over the last 2 weeks and observes a negative trend beginning around “Mon.” that indicates something is seriously not in order.
To begin the analysis, he examines the “Media Batch Feed Schedule” to see if there is any correlation between the trend and the Media that is being fed to the bio-reactor. This action is obviously based on intuition, possibly because he has seen this before. Seeing that there is a correlation between the change to a new Media batch feed and the start of the trend, he decides to take a look at the “Exceptions by Batch” information and notices that this specific batch had an unusual number of exceptions. He then dives deeper into the data by analyzing an exception Pareto for the suspect batch. He finds a high number of operator errors, which clearly highlights the root cause of the trend. Finally, since he is accountable for operational profits, he decides to take a look at the cost impact of this event in order to understand the impact on the plant's financial performance (see graphic below). Unfortunately the cost impact is substantial, and thus he has to take action to mitigate this increased cost.


The scenario shows the power of “Actionable Intelligence”. The supervisor has all the information he needs in order to quickly and effectively analyze the situation to determine root cause, and he can take action based on the results. The path that the supervisor takes in the example above is one of several that could have been used to detect and diagnose the Cell Density performance issue. It is this type of self-guided, or self-serve, analysis that really shows how information is consumed to meet a specific goal, and it should be the common pattern for the information required by a specific person or role. These requirements have direct bearing on the information context and data structures that must be provided, and on the dimensions by which the metric is analyzed or “sliced and diced”. Although this seems trivial at first, the requirements that this analysis pattern places on the underlying information and data structures are significant and are a critical component of the system design. It is not enough just to collect the data; it has to be arranged in a manner that enables this unique type of analytic information consumption.
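To make the “slice and dice” idea a bit more concrete, here is a minimal sketch in Python of the two slices the supervisor performed above: exceptions by batch, then an exception Pareto for the suspect batch. The records and batch IDs are hypothetical, invented purely for illustration; in a real MI system the data would come from the MES event log, already tagged with the dimensions the analysis needs (batch, exception type, time).

```python
from collections import Counter

# Hypothetical exception records, each tagged with its analysis
# dimensions: (batch, exception type). In practice these would be
# pulled from the MES event log with many more dimensions attached.
exceptions = [
    ("B-101", "operator error"),
    ("B-101", "operator error"),
    ("B-101", "equipment fault"),
    ("B-101", "operator error"),
    ("B-102", "equipment fault"),
]

# "Exceptions by Batch": one slice, across the batch dimension.
by_batch = Counter(batch for batch, _ in exceptions)

# Exception Pareto for the suspect batch: slice again, this time
# by exception type, and rank by frequency.
pareto = Counter(etype for batch, etype in exceptions if batch == "B-101")
ranked = pareto.most_common()  # operator errors dominate the ranking
```

The point is not the trivial counting but the tagging: because every record carries its dimensions, each follow-up question the supervisor asks is just another slice over the same data, with no new data collection required.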

Thursday, December 29, 2011

Why Do We Still Use Spreadsheets?

Sometimes I find an unfinished article while cleaning up – something I should obviously do more regularly! So here is one of these excerpts that I found particularly relevant as I am discussing the topic of “Manufacturing Intelligence” with a number of companies.

Manufacturing systems software vendors continuously tell us that you cannot have visibility into your operations without a software application, which I have to agree is generally true. This forces us to sift through the onslaught of offerings full of buzzwords such as “metrics”, “digital dashboards”, and “business intelligence platforms”. Yet it is remarkable that one of the most commonly used tools to capture and manage information from the shop floor is The Spreadsheet – typically Microsoft’s Excel. In some cases, even with a major ERP system investment, the Spreadsheet is still the primary source of timely data collection about the manufacturing operations. In other cases, expensive solutions are put in place to capture and collect data from automation equipment but fail to provide the information in a usable context, and once again users resort to the spreadsheet.


Why is it, then, that manufacturing organizations resort to solutions that are based on a spreadsheet? It is typically not because of a lack of understanding about information systems or the skills required to use them. It is because a spreadsheet provides the flexibility and ability to manage and present shop floor information in the most usable and advantageous manner. (By the way, the common term for “manage and use” is “information consumption”.) Remember that a manufacturing manager’s main focus is productivity and quality. They use this information to obtain metrics about the value stream that they are trying to manage because they need to know how they are performing in real time. This need is similar to that of a sports team, which knows where it stands at every second of the game. You don’t have to wait until tomorrow morning’s newspaper to know who won the game. Running a manufacturing operation without real time metrics is like bowling without being able to see the pins. You can see some of the action, you know that something happened, but you don’t know what the result was.

Of course in recent years manufacturers have gained some visibility with the increased application of technology, but they are still far from what is possible. I also believe that most of the vendors are clearly aware of the needs, and I hope that we will soon start to see Manufacturing Intelligence applications with the flexibility and convenience that we really need.

Wednesday, November 2, 2011

A Foundation For Quality (and performance...)


Next week is ISPE’s annual meeting in Texas, and once again I am participating in an interesting session titled “Operational Excellence - A Foundation for Quality”. We held a similar session last year and it was a great success, with more than 90 people participating. The topic that I am covering this year is the current capabilities of automation technology and information systems that provide a vital ingredient in enabling operational excellence. I will be joined by speakers from Pfizer, Celgene, Amway and of course NNE Pharmaplan.

My co-presenters will share experiences about Operational Excellence, Quality by Design (QbD), and Process Understanding. All in all, an interesting combination of topics that may initially seem loosely related, but they are in fact deeply related. They stem from the same old notion that manufacturing performance is a holistic concept. I have mentioned in a previous post that we are “re-inventing the CIM wheel”, but at the same time we are also looking at it from a new perspective with much more modern and usable technologies. I strongly believe that before we venture into new ideas we have to look at and learn from the past; it is very likely that somebody has had a similar problem and may even have a solution. So CIM, PAT, QbD, DFM: it really is all related, related to trying to excel at what we do and to work more synergistically to accomplish our manufacturing goals.