Technology Without Risk

As the wheels of technology churn out exponentially complex systems designed to lighten our burden and propel society upwards and onwards, much of society has little or no say in which direction we are heading or how far we may fall. We leap without looking, and though we have escaped the naiveté of our technological childhood, we have more recently become aware that the complexity and pace of new technology have brought catastrophic disasters of increasing frequency and severity.

Consequently, while risk acceptance remains a social decision [1 45], the science of risk management and the public's perception of risk are diverging: experts weigh the actual risks, while the public's negative perceptions stem largely from a lack of faith in the political systems, whether technological or human, set up to fill the gap. This is not to say that technology and society cannot coexist without risk. It is this author's belief, and that of others, that risks can be mitigated to acceptable levels, but the social perception of those risks must also be understood by all of the players for a proper assessment to be formulated and meaningful progress to take place.

While the early successes of trial-and-error engineering [2 53] produced marvels of science from the Egyptian pyramids through the civil infrastructure of the 19th century, more recent catastrophes (plane crashes, bridge collapses, epidemics) preclude any engineering design from foregoing rigorous documentation: qualification, validation, and life-cycle assessment. Case in point: the Hubble Space Telescope Systems Engineering Case Study [3], conducted by Mattice in 2005, illustrates the level of regulatory and voluntary documentation required for any modern engineering project and the need for guiding principles that meet the needs of end users through participation and transparency.

Moreover, a persistent quest for improvement, and consequently an increasing level of skill, is the best defense against unforeseen failure, for history has taught us that unforeseen failures lie in the details [2 95]. All too often we are reminded of catastrophe caused by some small part of a larger system, like the flaw in the Hubble Space Telescope traced to the manufacturing defect of a single component, or the collapse of the Hyatt Regency elevated walkways due to an engineering blunder [2 85]. These are failures of the human mind, and the catastrophes that ensued might have been averted had the rigorous tasks been performed with focus on the details. We must keep an ever-watchful eye on technology and challenge our assumptions continuously, for each new innovation spawns unforeseen risk, which if left unchecked simply passes the buck to future users.

The Role of Economics & Policy
As a Quality Engineering Specialist at Merck with 15 years' experience in the pharmaceutical industry, I was able to witness the birth of FDA policy requiring pharmaceutical manufacturers to perform retrospective validation of existing products. In later years, the FDA's policy was expanded to include prospective validation, qualification, life-cycle assessments, and a host of other documentation standards required of manufacturers to enumerate and abate health and environmental risks. Until that point, quality had been confined to real-time control charts and monthly quality-circle meetings, but no insights were ever gained from this process. It was not until FDA policy required all manufacturers to perform these tasks, using data reaching as far back as possible, that we began to see patterns and clear trends toward unforeseen failures. To remedy these risks, Merck formed the Quality Engineering Group, in which I worked over the course of my employment, and which was widely successful in improving the safety and efficacy of our product line.
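The retrospective review described above amounts to re-running control-chart rules over historical data. A minimal sketch of the idea, assuming a simple Shewhart individuals chart with 3-sigma limits; the batch figures and names here are invented for illustration, not actual pharmaceutical records:

```python
# Retrospective control-chart review: flag historical points that fall
# outside 3-sigma limits. Illustrative only -- data are hypothetical.

def control_limits(samples):
    """Shewhart individuals chart: mean +/- 3 * estimated sigma, with
    sigma estimated from the average moving range (MR-bar / 1.128)."""
    n = len(samples)
    mean = sum(samples) / n
    moving_ranges = [abs(samples[i] - samples[i - 1]) for i in range(1, n)]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(samples):
    """Indices of samples lying outside the computed control limits."""
    lcl, ucl = control_limits(samples)
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

# Hypothetical assay results (% of label claim) for consecutive batches.
history = [99.8, 100.1, 99.9, 100.0, 100.2, 99.7, 100.1, 103.5, 100.0, 99.9]
print(out_of_control(history))  # the aberrant batch at index 7 is flagged
```

Run over years of accumulated batch records rather than a single day's chart, this kind of analysis is what turns isolated readings into the visible trends the text describes.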

All this came at great cost to the company, but quality improved tremendously, complaints were down, and quality above profits was then, and remains, Merck's mantra. As the years went by, however, the strides in quality our department could make began to diminish, and it became clear that a point of diminishing returns was approaching. We had gained extensive knowledge and control of the processes and were able to reduce the variability of critical quality-control parameters to less than 1 sigma in many cases. This information was shared across Merck's network, and with other teams from across the globe we were able to bridge cultural, political, and economic differences toward a common goal. Although continuous monitoring remains an essential task, we had effectively created a zero-defect process, and it was no longer pragmatic to continue our research.

It was not long before management realized that any future improvements would come only from serious investment, and the company began to shift globally to a one plant, one product model. In doing so, it was able to benefit from the technology we had envisioned, reduce operating costs significantly, and allow each unit to focus on the most minute details of its operation, that is, to improve quality. Thus I and many others found ourselves without a job. The final insult came more recently, when Merck announced it was closing the doors altogether on the very workshop here in Montreal that led the zero-defect revolution and was the epitome of quality within the industry during its reign; presumably for lower wages, free markets, and a chance to influence the policy of emerging nations, but I digress.
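The connection between shrinking variability and a "zero defect" process can be made concrete with a standard process-capability calculation; the specification limits and sigma values below are hypothetical, chosen only to show how the expected defect rate collapses as variability falls:

```python
# Illustration (hypothetical numbers): as process standard deviation
# shrinks relative to the specification limits, the expected fraction
# of out-of-spec product collapses -- the statistics behind "zero defect".
import math

def cpk(mean, sigma, lsl, usl):
    """Process capability index: distance from the mean to the nearer
    specification limit, in units of 3 sigma."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

def defect_rate(mean, sigma, lsl, usl):
    """Expected out-of-spec fraction, assuming a normal process."""
    phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return phi((lsl - mean) / sigma) + (1 - phi((usl - mean) / sigma))

# Hypothetical spec: 95-105% of label claim, process centered at 100.
for s in (2.0, 1.0, 0.5):
    print(f"sigma={s}: Cpk={cpk(100, s, 95, 105):.2f}, "
          f"defect rate={defect_rate(100, s, 95, 105):.2e}")
```

Halving sigma roughly doubles Cpk, and each such step drives the defect rate down by orders of magnitude, which is why the marginal gains described above eventually ceased to justify further research.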

The lesson learned is that policy can be the impetus toward risk-free technology. If the commitment remains, as in the case of Merck, there are sure to be guinea pigs along the way, but the result is better quality in the long run and a step toward a risk-free technology. As Bulte reiterates in Economic Incentives and Wildlife Conservation [4 2], policy in the form of economic incentives, regulation, or moral suasion comprises the tools strategists have at their disposal for stimulating and enforcing risk reduction. In Merck's case the proof lies in the founder's own words: "We try never to forget that medicine is for the people. It is not for the profits. The profits follow, and if we have remembered that, they have never failed to appear." –George W. Merck, 1950. The moral and economic lessons were evident early on, yet even with such a failsafe vision, it was not until regulation forced the corporate mentality into a new way of thinking that meaningful strides in quality were realized, some 40 years later.

Producers of technology like Merck now realize that a zero-defect product is achievable and that healthy profits follow in the long run. There are generally heavy upfront costs to instituting and maintaining zero-defect systems, so economic incentives are necessary where the rationale is beyond the scope of emerging technologists. Once rooted, technology is drawn toward quality in a free market and is improved upon as far as the market is willing to afford it. To these ends, economics and policy are essential elements of a risk-free technology.


References
[1] Wenk, E. Technology and Risk. University of Illinois Press, 1995.
[2] Petroski, H. Success through Failure: The Paradox of Design. Princeton University Press, 2006.
[3] Mattice, J. Hubble Space Telescope Systems Engineering Case Study. AFIT, 2005.
[4] Bulte, E. H. Economic Incentives and Wildlife Conservation. Tilburg University, 2003.
