Pharma IT Blog

Blogging about hot topics related to Pharma and IT. Please use the mail icon to subscribe to new blog posts.

Conclusions on SAP ATTP implementation

We have been through a long and demanding project and are currently in hyper care.

I have tried to summarize the key points from the project; I hope they can help if you are planning a similar project.

It’s been more than a year since my last blog post (http://www.pharmait.dk/index.php/blog/conclusions-on-proof-of-concept-sap-attp) and it’s time to give you an update on the project I have had the privilege of leading.

Last time we had just finished the POC for SAP ATTP and had found that the solution did indeed meet the customer’s requirements. So, since then we have been in implementation mode.

We are now live with the solution and have been so for a little over a month.

To summarize our key learning: start with end users early. Serialization requirements take some time to familiarize yourself with, and the sooner you get input from the users, the easier it is to incorporate in the design. A universal truth from all IT projects, which also applies to serialization projects. :-)

Project phase

The implementation of SAP ATTP is not in itself an overly complex IT project. The serialization requirements, the ones already known, are straightforward, and SAP has ensured that ATTP meets all of them. This is even part of their license model, so no big surprises on the requirements side.

Solution Design

It’s important to know your SAP ECC implementation. If you are to implement SAP ATTP successfully, you will most likely need to update your RF scanner transactions.

The only reason not to do this is if you are exclusively producing for countries which do not have Serialized & Traced requirements. Even in that case, I would strongly recommend updating your RF scanner transactions to meet these requirements later on.

In my opinion, serialization requirements must be viewed as a funnel: all markets begin with a 2D data matrix, then shortly transition to serialization, then aggregation, and in the end reporting. As you are already implementing SAP ATTP, my recommendation is to ensure that ECC is ready as well. Otherwise you will need to run another project when your first market requires aggregation.

So, with these considerations out of the way, we updated the RF transactions to support all the existing warehouse processes for Serialized & Traced products. That means that whenever we scan a pallet (or shipper box, or even bundle), ECC checks against ATTP whether the quantity in ECC matches the number of serials in ATTP.
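To illustrate the idea (this is only a sketch in Python, not the actual ABAP in our custom RF transactions, and the two lookup functions are hypothetical placeholders), the check amounts to comparing the quantity booked in ECC with the serial count aggregated in ATTP for the same handling unit:

```python
from typing import Callable

def verify_handling_unit(
    hu_id: str,
    ecc_quantity_for: Callable[[str], int],       # hypothetical lookup of the HU quantity in ECC
    attp_serial_count_for: Callable[[str], int],  # hypothetical lookup of the serial count in ATTP
) -> None:
    """Raise if the ECC quantity and the ATTP serial count for a handling unit disagree."""
    ecc_qty = ecc_quantity_for(hu_id)
    serial_count = attp_serial_count_for(hu_id)
    if ecc_qty != serial_count:
        # The RF transaction blocks the warehouse movement until the discrepancy is resolved
        raise ValueError(
            f"HU {hu_id}: ECC quantity {ecc_qty} != ATTP serial count {serial_count}"
        )
```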

This of course results in less flexibility in warehouse and production; as an example, they can no longer use MIGO for changing warehouse status. They must use our custom-built transactions.

However, it also means that we have complete control of our serialized and traced products from process order release until goods issue.

That has reduced the need for reconciliation reports between our L3 system and ECC, as we now have complete control of what is produced and transferred from L3 and what is received in SAP.

Interfaces

See the figure below for the interfaces we developed.

(Figure: Integration overview)

We developed interfaces from ECC to the L3 system for transfer of material master data and process orders. This is not related to serialization as such, since these could just as well be entered manually, but it eases operations and reduces the risk of manual errors.

We also developed a serial number request/response interface from L3 to ATTP; this is a synchronous interface. We went live with ATTP 1.0; in 2.0 this can be done asynchronously, which in most cases is the preferable solution.
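As a rough sketch of the request/response pattern (the endpoint, payload and field names below are made up for illustration; the real interface is a web service whose exact shape depends on your landscape and may well be SOAP rather than REST), the L3 side of a synchronous call could look something like this:

```python
import requests

def request_serial_numbers(gtin: str, quantity: int) -> list:
    """Synchronously request serial numbers for a GTIN (illustrative placeholder only)."""
    response = requests.post(
        "https://attp.example.com/serials/request",   # placeholder URL, not a real ATTP endpoint
        json={"gtin": gtin, "quantity": quantity},    # placeholder payload
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["serialNumbers"]           # placeholder response field
```

One common reason asynchronous provisioning is preferred is that the line can buffer serial numbers in advance instead of depending on ATTP being reachable at the exact moment they are needed.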

We of course also have a commissioning interface between L3 and ATTP. For serialized materials, commissioning happens after production of the batch has been completed; for serialized & traced materials, it happens after each pallet. This is necessary, as we need to start warehouse transactions before production of the full batch is complete.
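Commissioning messages of this kind are typically EPCIS-style object events. The sketch below builds a heavily simplified payload in Python; the exact format depends on the L3 vendor and the ATTP message mapping, so treat the structure and field names as illustrative:

```python
from datetime import datetime, timezone

def build_commissioning_event(epcs: list) -> dict:
    """Build a simplified, EPCIS-like commissioning event for a list of EPCs (illustrative only)."""
    return {
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "action": "ADD",                                     # new serials come into existence
        "bizStep": "urn:epcglobal:cbv:bizstep:commissioning",
        "disposition": "urn:epcglobal:cbv:disp:active",
        "epcList": epcs,                                     # e.g. SGTIN URNs for each serialized unit
    }

# Sent once per completed batch for serialized materials,
# and once per completed pallet for serialized & traced materials.
```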

The final interface we developed was towards our 3PL in Korea. As Korea has Serialized & Traced requirements, this interface sends the full hierarchy of serials to our partner in Korea. It is the most complex interface, as it needs to handle multiple interactions, for instance samples, scrap, and returns.
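To give an idea of what "full hierarchy" means in practice, the sketch below models a pallet/box/item hierarchy in Python and flattens it into parent/child pairs. The data structure is purely illustrative and not the actual interface format, which also has to carry the events mentioned above (samples, scrap, returns):

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class PackagingNode:
    """One node in the packaging hierarchy (pallet, shipper box, bundle, or item)."""
    epc: str                                                  # serial/EPC of this packaging level
    children: list[PackagingNode] = field(default_factory=list)

def flatten_hierarchy(node: PackagingNode, parent: str | None = None) -> list:
    """Flatten the hierarchy into (parent_epc, child_epc) pairs for transmission."""
    pairs = [(parent, node.epc)]
    for child in node.children:
        pairs.extend(flatten_hierarchy(child, node.epc))
    return pairs

# Example: a pallet holding one shipper box with two serialized items
pallet = PackagingNode("PAL-001", [
    PackagingNode("BOX-001", [PackagingNode("ITEM-001"), PackagingNode("ITEM-002")]),
])
print(flatten_hierarchy(pallet))
```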

Validation approach

IQ in SAP projects is simple: we checked the transport list in each environment we went through, as we were not changing environment configuration.

For OQ, we did a full OQ of all functional requirements; this included all the scanner transactions and the interfaces on our side. We did not OQ the interface on the 3PL side, but did a parallel OQ of the L3 system. This part was delayed and ended up postponing our PQ. The reason for the delay was that the L3 supplier’s development effort to create the interface took longer than anticipated: they only had file exchange interfaces as standard, and since we needed a synchronous interface for serial number request/response, we needed a web service.

After the OQ we conducted a full PQ; we had a production line available and tested a full process flow with non-serialized, serialized, lot-based, serialized & traced, and non-finished goods.

This revealed a challenge in setting up master data: as we had numerous production sites but only one L3 site server, we could not test materials from all the sites.

We also had issues with getting lot-based material tested and ended up descoping this from the PQ, as we currently have a manual solution in place and the serialization requirements in the USA will only be effective from November 2017.

The PQ was the first time our end users tried the system hands-on. This process showed that we needed a lot more training in serialization in general, and in SAP ATTP specifically.

It also resulted in some minor changes to the design we had made, as well as changes to local procedures.

I would strongly recommend doing an end-to-end test with a real-life line if possible. This will significantly reduce the number of issues found afterwards. Master data and authorizations in particular should be focus areas.

Cutover

We planned a technical go-live 3 weeks before the functional go-live. This consisted of installing SAP ATTP and implementing the SAP notes needed in ECC.

This was to give us time to configure ATTP outside a closing window.

On the weekend we went live, we started the installation on the 3 site servers Friday evening and began the SAP installation Saturday afternoon. This was to keep SAP open for as long as possible, and because we needed to run the L3 installation 3 times, which required more time.

We experienced a lot of issues during the cutover, primarily because of master data and authorizations. We saw issues with materials where we had open process orders; this should normally not be an issue, but if you have the option, I would ensure everything is closed before starting the go-live activities.

Hyper care

Since go-live, we have had a number of incidents in the high double digits, and about 30% are still open.

The most significant incidents we have had have been with SAP, although not related to serialization, and with the L3 system. We have had incidents where batch data was not transferred from ECC to ATTP for serialized materials. As far as we can analyze, this is an issue in ATTP, and we expect a note to fix it.

We have had a lot of issues with master data and authorizations, and some issues with the interfaces from ECC to L3.

I would strongly recommend having people on site at all production sites when you go live, as working with serialization requires significant knowledge of the changed processes.


How to bring Pharma and Regulated Data to the cloud

Many companies, including pharmaceutical manufacturers, are changing their business model to focus more on core business capabilities and, in doing so, are outsourcing more or less of their IT business and processes. As a consequence, regulated data is moved out of the direct internal control of the company.

Discussing the topic of cloud computing cannot be done without considering security, risk, and compliance. Cloud computing does pose challenges and represents a paradigm shift in how technology solutions are delivered.

The cloud may be more or less secure compared to the in-house environments and security controls of your organization depending on any number of factors, which include technological components; risk management processes; preventative, detective, and corrective controls; governance and oversight processes; resilience and continuity capabilities; defence in depth; and multifactor authentication.

Within general security frameworks, e.g. ISO 27001/02, the concepts of CONFIDENTIALITY, INTEGRITY and AVAILABILITY (CIA) are the cornerstones. They are equally important in the pharmaceutical business and should be included in a risk-based approach, meaning that IT security controls are implemented in a way that matches the risks they are mitigating.

Data integrity is critical to regulatory compliance and is the fundamental reason for 21 CFR Part 11 and EU GMP Annex 11, applying equally to manual (paper) and electronic systems throughout the data lifecycle. Data integrity enables good decision-making by the pharmaceutical business, and data integrity risk should be assessed, mitigated and communicated in accordance with the principles of quality risk management. It is a fundamental requirement of the pharmaceutical quality system. The data lifecycle refers to how data is generated, processed, reported, checked, used for decision-making, stored and finally discarded at the end of the retention period.


Pharma IT proposes a five-step approach for your company to assess how to bring pharma and regulated data to the cloud in a compliant way, securing CONFIDENTIALITY, INTEGRITY and AVAILABILITY and mitigating risk accordingly.

  1. Identify the data (e.g. through data classification)
  2. Perform a specific cloud risk assessment (incl. audit)
  3. Determine the level of confidence in the Cloud Service Provider handling CIA for the required data
  4. Identify controls covering the entire data lifecycle
  5. Provide proof of compliance for the entire data lifecycle

When the assessment of data and risk has been completed and possible mitigating activities have been described, the project can commence preparing for the qualification/validation of the cloud service. This is in principle a basic discipline for a pharmaceutical business, but dealing with a Cloud Service Provider does require a slightly different approach.

When a pharmaceutical business acquires services where critical data is processed and/or resides outside internal control, the general quality level, validation activities and IT security planning must be maintained to an agreed and desired level. The approval of the validation report should focus on conclusions that prove that the service is fit for its intended use, based on controls compared with the GxP risk.
