Digital Transformation and Penguins: How to Analyze, Process, and Store Data in 2024?

Some species of penguins fall asleep 10,000 times a day. It’s a bit like parents of young children who wake up every five minutes to check if the baby is breathing, to change a diaper, or to feed the infant. Falling asleep 10,000 times a day sounds unbelievable but also fascinating. Nature can be almost as surprising as the idea of researchers studying penguin sleep habits might seem to the average person. When I think about the amount of data that must have been collected on this phenomenon, two questions come to mind: How can something like this be studied? And how can the vast amount of data generated by this gigantic number of sleep cycles be analyzed?

1. Why does humanity collect information in the first place?

To understand what drove researchers determined enough to observe penguins’ sleeping habits, let’s go back to the beginning. Humanity has been gathering information for thousands of years. Our species intuitively knows that more information leads to better and faster decision-making, a deeper understanding of the root of complex problems, and greater safety through threat prevention. Information was first passed on orally, then through cave paintings, and later through increasingly advanced forms of writing.

The written transmission of knowledge eventually became commonplace, but in the Middle Ages the ability to read and write was reserved for the wealthiest and the clergy: the two social groups with exclusive access to information conveyed in writing. Over time, with the development of content duplication techniques like printing, the spread of information drove rapid growth in education and knowledge. The growing availability of printed materials fueled the popularization of literacy, and more literate people accelerated the development of science and industry. Faster progress in science and industry, in turn, meant that humans could allocate more resources to further scientific progress and conduct more complex research.

At a certain point, we reached a stage where processing experimental data on paper was no longer efficient, and data began to be collected electronically. The emergence of the global internet was another impulse that accelerated the amount of data being collected, processed, and disseminated.

2. The Excel Era

Let’s take a moment to jump back in time to 1985. That’s when Excel was born: a marvelous tool for collecting, processing, distributing, and visualizing data. It allowed users to create forecasts, charts, tables, complex formulas, and even macros that helped quickly process large amounts of data. The possibilities of spreadsheets were essentially limited only by the users’ imagination.

Over time, however, we began to hit the spreadsheet wall. Using spreadsheets as databases, scheduling tools, or for statistical analysis of vast amounts of data led to the creation of multi-gigabyte “monsters” that could bring any computer to a halt and could not be used on multiple devices simultaneously. Spreadsheets were also unsuitable for storing important information due to technical limitations, such as the lack of a version history for changes made to the file. It’s no surprise that this tool eventually ran into its own limits: applying Excel to tasks it wasn’t originally designed for, like database management or complex statistical analysis, led to performance and technical problems.
As a result, despite its versatility, Excel did not meet all the requirements of modern organizations, which highlighted the need for more advanced tools.

3. I Appreciate You, but I’m Leaving: Saying Goodbye to Excel

Every enterprise must mature enough to step out of the “Excel cage.” Its versatility means it’s pretty good at everything, but only pretty good. As processes become more complex, specialized tools become necessary to ensure security and fast analysis of key data and processes. To illustrate the problem, one might draw a timeline with specific points: from a finger smeared in dye, through cuneiform writing, paper, and Excel, all the way to AI. There’s no going back to cave paintings; that seems obvious to us. Yet we still have to convince others that the paper era is over, and that Excel’s time has now passed too. In times when humanity produces more and more data every day, can we afford to use paper, which is only slightly better than a clay tablet? These are, of course, rhetorical questions.

Back to the penguins: to check whether a bird was sleeping, researchers had to analyze much more than just its sleep. They examined parameters such as the electrical activity of the brain (independently for each hemisphere), body movements, neck muscle activity, and even the depths at which the birds hunted fish in the ocean. The results were surprising. It turned out that the birds didn’t sleep for long periods: an average penguin nap lasted 1–4 seconds. However, there could be several hundred such naps per hour, and when all the moments devoted to sleep were added up, the animals could sleep up to 15 hours a day. In this particular case, Excel was sufficient, because the analysis covered only 14 individuals. As you might guess, with a larger number, the performance limits of the tool would quickly become an issue.

4. How to Analyze, Process, and Store Data in 2024

The aforementioned penguins were studied for parameters that could indicate they were falling asleep. They would fall asleep for a few seconds, up to 600 times an hour, which means measurements had to be taken at least every 0.5 seconds. One parameter would occupy about 170,000 cells in a spreadsheet for a single bird per day. Over 10 days, that grows to about 1,700,000 cells. Multiplying this by 14 (the total number of studied individuals) gives nearly 24 million cells, and multiplying again by the 10 parameters that were studied yields roughly 240 million cells filled with the vital parameters of 14 penguins. Measure even a few more parameters, and we hit the wall of the spreadsheet’s cell limit.
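To see how quickly these numbers overwhelm a spreadsheet, here is a minimal back-of-the-envelope sketch in Python. The sampling figures come from the study described above; the worksheet limit is Excel’s documented maximum of 1,048,576 rows per column.

```python
# Back-of-the-envelope check: does the penguin dataset still fit in a spreadsheet?
# Assumptions (from the article): one sample every 0.5 s, 10 days of recording,
# 14 birds, 10 parameters.

SAMPLES_PER_SECOND = 2            # one measurement every 0.5 seconds
SECONDS_PER_DAY = 24 * 60 * 60

cells_per_bird_per_day = SAMPLES_PER_SECOND * SECONDS_PER_DAY   # 172,800 (~170,000)
cells_per_bird = cells_per_bird_per_day * 10                    # 10 days -> 1,728,000
cells_one_parameter = cells_per_bird * 14                       # 14 birds -> ~24 million
cells_total = cells_one_parameter * 10                          # 10 parameters -> ~240 million

EXCEL_MAX_ROWS = 1_048_576        # documented row limit of one Excel worksheet column

print(f"Rows needed for one bird, one parameter: {cells_per_bird:,}")
print(f"Excel's row limit per column:            {EXCEL_MAX_ROWS:,}")
print(f"Total cells for the whole study:         {cells_total:,}")
```

Running it shows that a single bird’s single-parameter time series (about 1.7 million samples) already exceeds what one Excel column can hold, long before the full 240-million-cell dataset comes into play.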
A similar problem occurs in any quality process we might want to run in Excel: if the process requires an audit trail (a chronological record of all operations), the spreadsheet’s size starts to grow very rapidly. Of course, Excel is a much better place to store data than the clay tablets and papyrus mentioned earlier in this article. However, it is not suitable for use as a database. That’s why dedicated tools are used for data collection. Here are a few of them:

- MES (Manufacturing Execution System): systems for supervising and controlling the production process that ensure proper parameters and efficiency and allow you to monitor and plan production processes.
- ERP (Enterprise Resource Planning): systems that help manage the entire enterprise.
- EDMS (Electronic Document Management System): systems that enable and facilitate control over the quality and availability of documents.

All the above categories (and many others) require proper infrastructure and maintenance. It’s worth mentioning that each of these systems can be supported to some extent by AI. Scoring and analyzing vast amounts of data is something AI excels at, which allows processes to be optimized in ways not available in simple tools like Excel.

In many cases, including the validation of computerized systems, determining user requirements and properly understanding user needs is crucial for the safe and efficient operation of any computer system. “Blind optimism” on the client’s side, which may arise after an excellent sales presentation of a system’s demo version, is not good practice, because it usually ignores real business needs, system capabilities, and infrastructure. Suppliers are generally interested in good cooperation with the client and in providing what the client genuinely needs, but they often can’t spend enough time with them to assess those needs. This is especially true when the tool being implemented is something new and the user isn’t fully aware of their own requirements. For example, validation and change tracking may turn out to be mandatory, and Excel does not support them to the extent needed for large amounts of data; an improperly chosen tool may then require unfeasible customizations to deliver a proper solution.

The penguin researchers, for example, started by developing a methodology: they reviewed previous publications on the subject and considered what needed to be studied to obtain specific data. In computer systems, this stage is called “defining user requirements.” To effectively determine what functionalities a new system for collecting, storing, and processing data (one meant to replace Excel) should have, it’s worth going through several key steps:

- Analysis of business requirements: gather and analyze all needs and goals related to data across the organization. It’s important to understand which processes the new system is to support and which specific problems it needs to solve.
- Engagement of key users: conduct interviews, surveys, or workshops with employees who currently use Excel. This way, you can learn about their daily needs and discover the limitations and challenges they face.
- Mapping processes and data flows: trace how data flows through different departments. Clear process mapping helps identify which functionalities are needed to automate, streamline, or integrate different stages of working with data.
- Identification of key functions and Excel’s shortcomings: analyze where Excel does not meet expectations. For example, there may be a lack of simultaneous multi-user access, advanced reporting, real-time analysis, integration with other systems, or an appropriate level of security.
- Analysis of available technologies and solutions: explore solutions available on the market that offer the functionalities required to achieve the set goals.
- Defining priorities and essential functionalities: after gathering all requirements, create a prioritized list of functionalities. Key functions may include storing large volumes of data, real-time analytics, automatic reporting, data security, or the ability to integrate with other systems.
- Testing solutions: if possible, test the selected solutions with the participation of end users. This allows the company to assess how the new functions work in practice and whether they meet employees’ needs.
- Selecting a vendor and pilot implementation: after initial tests, choose the system vendor that best meets the requirements and conduct a pilot implementation. A pilot allows the system to be adapted to the company’s specific work environment before full deployment.

By going through these steps, the company will be able to determine precisely which functionalities the new system needs and choose the solution that best fits its requirements. That’s why more and more enterprises, especially in industries where data collection is crucial, are moving away from spreadsheets in favor of more advanced tools. It is a natural consequence of every company’s growth.

5. Modern Data Management – Summary and Conclusions

I have used the example of research on penguin sleep to address the complexity of collecting, storing, and analyzing data, and to show that transitioning from Excel to dedicated systems like ERP, MES, and EDMS is necessary. Such solutions are indispensable for companies processing vast amounts of data. In the digital age, traditional spreadsheets no longer meet the needs of dynamic businesses, and optimal data management requires professional tools and an understanding of real user requirements.

Returning to the penguins themselves: imagine someone falling asleep and waking up 600 times an hour. You could say they are almost always asleep, waking only to take care of something. This resembles a user during a system implementation who hasn’t thoroughly considered their needs. It is the user and their needs that are crucial in any implementation. Often, the user cannot properly define their expectations, is unaware of the necessity of conducting validation, or, conversely, considers it essential when it isn’t, leading to unnecessary expenses. “Overquality” is a word I just invented, but it aptly describes the phenomenon of excessive attention to quality. And what about “underquality”? Here the issue is much more serious: it can disrupt key business processes, endanger patient safety and product quality, and reduce the company’s profitability.

And what does this have to do with Excel? Every enterprise should care about the safety and efficiency of its business processes. To achieve this, those processes should be supported by appropriate tools, which specialists in this field can provide. If this article interested you, or if you have questions or need support in digital transformation, leave us a message. Our team of Quality Management experts will help meet your expectations in the area of digital change.

Meet our case studies in data management and digital transformation:

- Automated Workforce Management System Case Study
- Supply Chain Management Case Study: Cost Improvement
- Case Study: Customized Finance Management System in Enterprise
- Example of How We Improved Reporting and Data Analysis Efficiency
- Consent Management Platform Integration in Pharma Case Study
- Effective Consent Lifecycle Management in Pharma Case Study
- Global Coaching Transformation at BVB with Coachbetter App
- A Pharma Platform Case Study – Implementing a Digital Health Operation
- Crocodile – building a new service center

Serialization of Medicines: An Effective Tool in the Fight Against Frauds in the Pharmaceutical Market

The entire world recognizes the risks associated with using medicines of unknown origin. Numerous studies indicate that a significant percentage of counterfeit medicines reach patients. Patients often unknowingly acquire these products on auction platforms, in various “markets,” from acquaintances, or even at pharmacies. While trade in products outside of state control can only be limited through education and a firm stance by the authorities, counterfeit medicines that reach pharmacies are harder to detect and pose a serious threat.

1. Why is medicine serialization important?

Medicine serialization is the process of assigning each package of medicine an individual serial number, which is monitored at various stages of the supply chain. Thanks to unique serial numbers, it becomes easier to identify genuine products, helping to eliminate fake medicines and protect patients from harmful or ineffective substances. The serialization process allows each package of medicine to be tracked from the moment of production to delivery to the patient, enabling a rapid response if a defective batch needs to be withdrawn from the market. It’s worth noting that in many countries, such as the EU member states (under the Falsified Medicines Directive) and the USA (under the DSCSA), serialization is mandatory in order to enhance the safety of pharmaceutical products. Moreover, this system allows for more efficient management of drug inventories and control over their distribution, reducing the risk of errors such as duplicate batches or improper handling of controlled medicines.

There have been several prominent incidents in pharmaceutical history that exposed the lack of adequate control mechanisms such as serialization:

- In the 1930s in the USA, mass poisonings with “Elixir Sulfanilamide,” which contained toxic diethylene glycol, led to the deaths of over 100 people, including children. The lack of regulation and of control mechanisms like serialization contributed to the tragedy, which led to stricter laws (the Food, Drug, and Cosmetic Act of 1938).
- Thalidomide, used in the 1950s and 1960s as an anti-nausea medication for pregnant women, caused deformities in thousands of children. The lack of appropriate monitoring and serialization systems prevented a quick response and an effective recall of the drug from circulation.
- In Nigeria, in the mid-1990s, fake antimalarial medicines flooded the market, leading to numerous deaths and health complications among people who unknowingly took ineffective or toxic substitutes. Unfortunately, this is not the only counterfeiting scandal in Africa’s pharmaceutical history: in various African countries between 2000 and 2010, counterfeit antimalarial medicines, such as artemisinin, were widely distributed, leading to thousands of deaths due to ineffective treatment. None of the packages of these medicines had a serial number.

While it may seem that such pharmaceutical crimes took place long ago or do not concern the Western world, the case of contaminated heparin in 2007 feels much closer. Heparin is a commonly used blood thinner. Batches of the drug contaminated with oversulfated chondroitin sulfate were produced in China and then imported into the USA, where they entered circulation, leading to hundreds of complications and 81 deaths. The lack of effective mechanisms for identifying individual product batches, such as serialization, made it difficult for authorities to swiftly withdraw the dangerous medicines from the market.
The problem was discovered only at the beginning of 2008, when a series of unexpected complications and deaths were reported in patients undergoing medical procedures in hospitals where heparin was administered. The heparin scandal clearly showed how important it is to monitor the medicine supply chain, especially in a global economy. Medicine serialization, later introduced as a standard in many countries, aimed to prevent similar scandals by allowing each package of medicine to be tracked from production to the patient, significantly enhancing safety. All these cases demonstrate how crucial it is to maintain thorough control over medicines and to be able to track them. The serialization of medicinal products is one of the tools that can help prevent such tragic incidents, ensuring patient safety and effective product recall from the market.

2. Verification through Unique Identifiers

To combat counterfeit drugs, the obligation to verify the authenticity of medicines through serialization has been introduced. Medicine verification through unique identifiers is a key component of the system designed to prevent the falsification of pharmaceutical products. The process involves assigning each package of medicine a unique identifier, which is tracked at various stages of distribution and typically consists of:

- Global Trade Item Number (GTIN), assigned to the pharmaceutical product,
- Serial Number (SN), unique for each package,
- Batch Number (LOT), indicating the production batch,
- Expiry Date (EXP), specifying the drug’s shelf life.

All these details for a given package are stored in a 2D barcode (e.g., DataMatrix) or a QR code placed on the medicine’s packaging. Authenticity is verified by scanning the code on the packaging and checking it against a central database accessible to manufacturers, wholesalers, pharmacies, and the relevant regulatory authorities. This system makes it possible to check the origin of medicines before they are dispensed to patients, as well as to track each package of medicine throughout the entire supply chain.
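As a rough illustration of how such an identifier is structured, below is a minimal Python sketch that parses the GS1 element string read from a DataMatrix code. The field layout follows the standard GS1 Application Identifiers listed above; the sample values are made up, and a production system would implement the full GS1 specification rather than this simplified logic.

```python
# Minimal sketch of parsing a GS1 element string from a pharma DataMatrix scan.
# (01) GTIN is a fixed 14 digits; (17) expiry date is a fixed 6 digits (YYMMDD);
# (10) batch/lot and (21) serial number are variable-length fields, terminated
# in a raw scan by the group separator character (ASCII 29).

GS = "\x1d"  # group separator that ends variable-length fields

FIXED_LENGTH_AIS = {"01": 14, "17": 6}  # AI -> fixed field length
VARIABLE_AIS = {"10", "21"}             # read until GS or end of string

def parse_gs1(data: str) -> dict:
    """Split a raw GS1 element string into its AI -> value fields."""
    fields = {}
    i = 0
    while i < len(data):
        ai = data[i:i + 2]
        i += 2
        if ai in FIXED_LENGTH_AIS:
            length = FIXED_LENGTH_AIS[ai]
            fields[ai] = data[i:i + length]
            i += length
        elif ai in VARIABLE_AIS:
            end = data.find(GS, i)
            if end == -1:               # the last field may run to the end
                end = len(data)
            fields[ai] = data[i:end]
            i = end + 1
        else:
            raise ValueError(f"Unsupported application identifier: {ai!r}")
    return fields

# Hypothetical pack: GTIN 05901234123457, expiry 2026-05-31, lot ABC123, serial XYZ0001
scan = "0105901234123457" + "17260531" + "10ABC123" + GS + "21XYZ0001"
print(parse_gs1(scan))
# -> {'01': '05901234123457', '17': '260531', '10': 'ABC123', '21': 'XYZ0001'}
```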
3. Application of Advanced Information Technologies

The online medicine verification system relies on advanced information technologies that enable effective verification and identification of medicines. The serialization process requires generating unique codes and printing them on the packaging. An important element is also print verification, i.e., checking whether the printed codes are readable and contain correct information; image recognition techniques and large-scale data processing are used for this purpose. The use of advanced IT technologies, such as blockchain, cloud computing, IoT, AI, and Big Data, significantly increases the security and transparency of medicine verification.

Blockchain is one of the newest medicine verification technologies, ensuring complete transparency and immutability of records. In a blockchain, every transaction related to a medicine, from production through distribution to sale, is recorded as a block in the chain, publicly accessible and cryptographically secured.

Cloud systems allow quick access to medicine-related data from anywhere in the world and enable the integration of global systems, facilitating connections between different pharmaceutical supervision systems and supporting international cooperation.

Big Data, or the analysis of large data sets, allows the identification of patterns, trends, and anomalies in the medicine supply chain. By analyzing huge amounts of data on distribution, sales, and market monitoring, irregularities that suggest the presence of counterfeit medicines can be detected early.

IoT technology, in turn, enables communication between devices over the Internet and can be used to monitor medicines at every stage of their journey. Drugs, especially those requiring special conditions (e.g., low temperature), are equipped with IoT sensors that monitor temperature, humidity, and other parameters during transport and storage. If storage conditions are exceeded, the system automatically alerts the relevant personnel, allowing for immediate action and the withdrawal of the affected drugs from circulation.

AI and machine learning are increasingly used to analyze drug-related data and predict risks associated with counterfeiting. AI can anticipate potential counterfeiting threats by analyzing historical data and behavior in the supply chain, and AI systems can automatically analyze serialization data and report suspicious cases or deviations from the norm.

Mobile applications allow patients and pharmacists to quickly verify the authenticity of drugs. Users can scan the 2D barcode on the medicine’s packaging with a smartphone and instantly check whether the drug is genuine, as well as obtain information about its origin, manufacturer, and expiration date. Such applications can, for example, alert patients when a counterfeit drug is detected and allow suspicious products to be reported to regulatory authorities.

RFID technology enables wireless identification of products using radio waves. Thanks to RFID chips, it is possible to monitor in real time where a particular drug is and whether it has reached the correct facility.

4. System Integration and Central Database

To enable the serialization process, drug manufacturers must integrate new systems with existing ones or develop new solutions. System integration involves connecting the various tools and technologies used by the different entities in the drug supply chain, such as manufacturers, wholesalers, pharmacies, and regulatory bodies, into a single cohesive system. The goal of integration is to ensure smooth information exchange between these entities, enabling comprehensive monitoring and verification of medicines at every stage of their journey, from production to delivery to the patient. Each packaging line in the factory must be equipped with appropriate interfaces, printers, cameras, and computers to manage the serialization process.

The central drug database is a centralized platform that collects and manages information about all medicines subject to serialization and monitoring. This database is a key element of the verification system, as it stores data on every package of medicine or medicinal product and its distribution history. An example of such a database in Europe is the European Medicines Verification System (EMVS), which allows drugs to be verified in accordance with the requirements of the Falsified Medicines Directive (FMD). In the USA, this role is defined by the Drug Supply Chain Security Act (DSCSA), which envisions the creation of a similar monitoring system. The central drug database forms the foundation of verification systems, providing full control over the authenticity and origin of drugs, preventing counterfeiting, and increasing patient safety.
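The sketch below shows, in simplified form, the verification flow this section describes: a pack is registered in a central repository when it is produced and checked again before it is dispensed. Everything here is a toy stand-in; real systems such as the EMVS expose their own certified interfaces, so all class names, statuses, and values below are hypothetical.

```python
# Toy model of register-at-production / verify-at-dispense against a central database.
from dataclasses import dataclass

@dataclass
class PackRecord:
    gtin: str
    serial: str
    status: str  # e.g., "active", "dispensed", "recalled"

class CentralRepository:
    """Simplified stand-in for a central medicines verification database."""

    def __init__(self):
        self._packs = {}  # (gtin, serial) -> PackRecord

    def register(self, record: PackRecord) -> None:
        # Called by the manufacturer when the pack leaves the packaging line.
        self._packs[(record.gtin, record.serial)] = record

    def verify(self, gtin: str, serial: str) -> str:
        # Called by the pharmacy terminal before dispensing to the patient.
        record = self._packs.get((gtin, serial))
        if record is None:
            return "ALERT: unknown pack - possible counterfeit"
        if record.status != "active":
            return f"ALERT: pack already {record.status}"
        return "OK: pack is authentic and active"

repo = CentralRepository()
repo.register(PackRecord("05901234123457", "XYZ0001", "active"))

print(repo.verify("05901234123457", "XYZ0001"))  # OK: pack is authentic and active
print(repo.verify("05901234123457", "FAKE999"))  # ALERT: unknown pack - possible counterfeit
```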
The costs associated with adding a drug to the central database are usually borne by the drug manufacturer. Through system integration and the collection and submission of codes from properly packaged products, effective verification of drug authenticity is possible at every stage of distribution.

5. Impact of Serialization on Pharmaceutical Infrastructure

The implementation of serialization has had a significant impact on pharmaceutical infrastructure, and this legal requirement has brought positive changes for the entire industry. First and foremost, every entity involved in the production and distribution of a drug must adapt its production, distribution, monitoring, and data management processes. This affects the whole supply chain, from manufacturers through wholesalers to pharmacies. Every pharmacy has been equipped with terminals for verifying packages, allowing the authenticity of drugs to be checked before they are dispensed to patients. The introduction of serialization also required coordination among manufacturers, pharmacies, and the other entities involved, which certainly increased drug safety; however, ensuring compliance with global regulations has raised production costs. In many countries serialization is required by law (in the European Union under the Falsified Medicines Directive, and in the USA under the DSCSA), which means companies must adapt their infrastructure to meet these requirements. In Poland, the National Medicines Verification Organization (KOWAL) was tasked with overseeing and coordinating the implementation of serialization.

Drug serialization forced pharmaceutical manufacturers to adapt to new regulations, which involved additional costs. Although this may seem challenging, the significant benefits of the process cannot be overlooked. One of the key advantages is the improvement of quality management, which these regulations enforce on manufacturers, ensuring a higher level of product control and increased patient safety. Serialization allows companies to better manage drug quality by monitoring the entire product lifecycle. This not only improves quality and safety but also helps optimize costs and produce with greater respect for the natural environment.

6. Benefits of Serialization for Patients and the Pharmaceutical System

The introduction of serialization for medicinal products has brought numerous benefits for patients and the entire pharmaceutical system. Most importantly, it has enabled the effective identification of genuine medicines and the detection of counterfeits, increased transparency in drug distribution (which has contributed to a better perception of the pharmaceutical system), and improved inventory management. Patients can have greater confidence that they are receiving safe and effective products rather than counterfeit drugs. Drug serialization also helps combat the gray market and the illegal trade in counterfeit medicines, contributing to the protection of public health. Additionally, improved control over drug distribution can help reduce treatment costs associated with the use of counterfeit or ineffective products. Serialization also protects patients from expired medications, as the system tracks the expiration date of medicines at every stage of distribution, so patients receive only medications that fully comply with current quality standards and are not expired.
7. Medicine Serialization – Summary

In summary, the introduction of serialization in pharmacy has been an essential tool in the fight against counterfeit medicines, while ensuring safe and authentic products for patients. The use of advanced information technologies has enabled effective drug verification and identification, as well as control over the distribution process. The benefits for patients, manufacturers, and the entire pharmaceutical system are evident, marking an important step towards ensuring the safety and authenticity of drugs on the market.

In this fight against counterfeits, we are proud to be a provider of quality management solutions for the pharmaceutical industry. As experts at TTMS in the field of drug serialization, we offer comprehensive services and consulting to help you fully implement the serialization process. Our experienced team of specialists has the knowledge and skills to support you at every stage, from planning through implementation to maintenance. If you need support in implementing systems tailored to your needs, TTMS Quality is here to help: our experts can provide not only technical solutions but also business consulting services. Contact us to learn more about our services and how we can assist you in implementing the serialization process. Together, we can ensure drug safety and authentic products for patients, and we will support you in achieving success in the pharmaceutical industry.
