Chapter 2

2.1-1 (a) The San Francisco Police Department (SFPD) has a total police force of 3,900, with 850 officers on patrol. The total SFPD budget in 1986 was $176 million, of which patrol coverage cost $79 million. These figures bring out the importance of the problem. Like most police departments, the SFPD operated with manually designed schedules. It was impossible to know whether the manual schedules were optimal in serving residents' needs, and it was difficult to evaluate alternative policies for scheduling and deploying officers. There were also the problems of poor response time and low productivity, and the pressure of increasing demands for service with decreasing budgets. The scheduling system faced the problem of providing the highest possible correlation between the number of officers needed and the number actually on duty during each hour. All of these problems led the Task Force to search for a new system and thus to undertake this study.

(b) After reviewing the manual system, the Task Force decided to search for a new system. The criteria it specified included the following six directives:
-- the system must use the CAD (computer-aided dispatching) system, which provides a large and rich database on resident calls for service. The CAD system was used to dispatch patrol officers to calls for service and to maintain operating statistics such as call types, waiting times, travel times, and total time consumed in servicing calls. The directive was to use these data on calls for service and consumed times to establish workload by day of week and hour of day
-- it must generate optimal and realistic integer schedules that meet management policy guidelines using a microcomputer
-- it must allow easy adjustment of optimal schedules to accommodate human considerations without sacrificing productivity
-- it must create schedules in less than 30 minutes and make changes in less than 60 seconds
-- it must be able to perform both tactical scheduling and strategic policy testing in one integrated system
-- the user interface must be flexible and easy, allowing the users (captains) to decide the sequence of functions to be executed instead of forcing them to follow a restrictive sequence.

(c) The computer-aided dispatching (CAD) system is also used to gather hourly data on historical patrol activity. For each of the 168 hours of the week, data are gathered on call types, consumed times by call type, and the percentage of each call type requiring two or more officers. Other operating statistics maintained include waiting times and travel times. When needed for forecasting purposes, the call and consumed-time data are downloaded from the mainframe to a microcomputer. The other needed data are input by the captains and stored in a file for reuse. (See Figures 1 and 2 of the article for further information.)
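To make the use of these hourly CAD statistics concrete, here is a minimal sketch (not the article's actual procedure) of how call counts, consumed times, and multi-officer percentages for a single hour could be turned into an estimate of officer-hours of workload; all figures are hypothetical:

```python
# Illustrative sketch only: converting one hour of CAD-style call statistics
# into an estimate of patrol officer-hours needed. All numbers are hypothetical.
calls_in_hour = [
    # (call type, number of calls, avg consumed minutes per call,
    #  fraction of calls needing a second officer)
    ("burglary", 4, 45.0, 0.50),
    ("traffic",  6, 20.0, 0.10),
    ("dispute",  3, 30.0, 0.70),
]

officer_minutes = 0.0
for call_type, count, minutes, frac_two_officers in calls_in_hour:
    # Each call consumes `minutes` of one officer's time, plus the same time
    # again for the fraction of calls that require a second officer.
    officer_minutes += count * minutes * (1.0 + frac_two_officers)

officers_needed = officer_minutes / 60.0   # officer-hours of workload in this hour
print(f"Estimated workload this hour: {officers_needed:.1f} officer-hours")
```

Repeating such a calculation for each of the 168 hours of the week gives the kind of workload profile the schedules are meant to track.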
(d) The various tangible and intangible benefits are as follows:
-- total savings of $11 million per year
-- the finding that the 4/10 plan (4-day, 10-hour shifts) is superior to the 5/8 plan (5-day, 8-hour shifts), which leads to a saving of $5.8 million
-- 176,000 productive hours added to the patrol staff every year, a saving of $5.2 million per year
-- response times dropped an average of 20 percent, which led to potential crime reduction
-- an approximately 50 percent reduction in shortages and surpluses of police officers
-- a 32 percent increase in traffic citations, yielding a $3 million increase in citation revenues
-- a cost-benefit analysis of the scheduling aspects shows a one-time cost of $50,000 and benefits of $5.2 million per year
-- a 36 percent decrease in days lost due to sick leave
-- a 21 percent increase in self-initiated officer activities
-- improved officer morale, revealed by a survey approval rating of 96 percent
-- it allowed management to adjust schedules easily and quickly to meet seasonal changes in patterns of crime.

2.1-2 (a) Taking all the statistics on AIDS cases into account, it was inferred that just one-third of all cases nationwide involved some aspect of injection drug use (IDU). In contrast to this national picture, over 60% of the 500 cases reported in New Haven, Connecticut were traced to drug use. Though it had been suspected previously, by 1987 it was clear that the dominant mode of HIV transmission in New Haven was the practice of needle sharing for drug injection. This was the background of the study, and in 1987 a street outreach program was implemented that included a survey of drug addicts, with the partial intent of determining why IDUs continued to share needles given the threat of HIV infection and AIDS. The survey respondents claimed that IDUs shared needles because they were scared and feared arrest for possessing a syringe without a prescription, which was forbidden by law in Connecticut. Respondents also pointed out the difficulties involved in entering drug treatment programs. The officials recognized that the logical intervention was needle exchange, whereby IDUs exchange their used needles for clean ones. This would remove infectious drug injection equipment from circulation and also ease access to clean needles. Further, contacts made as a result of needle exchange might lead some active IDUs to consider counseling or enter drug treatment. After a lot of lobbying, the bill for the first legal needle exchange program finally became effective on July 1, 1990.

(b) The design for the needle exchange program was worked out over the summer of 1990. The relevant committee decided that IDUs would be treated with respect, so no identification information was asked of program clients. The program began operating on November 13, 1990. The needle exchange operates on an outreach basis: a van donated by Yale University visits neighborhoods with high concentrations of IDUs. Outreach staff members try to educate the clients by various means, such as distributing literature documenting the risks of HIV infection and dispensing condoms, clean packets, and so on. The primary goal of needle exchange is to reduce the incidence of new HIV infections among IDUs. While studies showed consistent self-reported reductions in risky behavior among IDUs participating in needle exchange programs, those studies were not convincing. So the mechanics of needle exchange require that the behavior of the needles themselves must change: what was required was to reduce the time needles spend circulating in the population.
When needles circulate for a shorter period of time, each needle is shared by fewer people, which lowers the number of infected needles in the pool of circulating needles, which in turn lowers the chance of an IDU becoming infected by injecting with a previously infected needle. Using this theory required the invention of a new data collection system, which is as follows. Syringe tracking and testing is a system developed to "interview" the needles returned to the program. All clients participating in the needle exchange are given unique code names, and every needle distributed receives a code. Every time a client exchanges needles, an outreach worker records the date and location of the exchange. The worker also records the code name of the client receiving the needles alongside the codes of the needles. The client then places the returned needles in a canister, to which the worker attaches a label with the date and location of the exchange and the code name of the client. All returned needles are brought to a laboratory at Yale University, where a technician collates the information on the canister labels with the tracking numbers on the returned needles. For non-program or street needles returned to the needle exchange, the location, date, and client code are recorded. A sample of the returned needles is tested for HIV.

(c) The initial results from this system were both shocking and decisive. At the start of the program, the IDUs presented the needles in their possession in exchange for clean needles. These street needles are representative of the risk faced by an IDU prior to the operation of the needle exchange: 67.5 percent of them tested HIV positive. As of the middle of March 1991, 50.3 percent of the program needles tested positive. Since March 1991, additional program needles have been tested, of which 40.5 percent tested positive. This gave further support to the protection offered by the needle exchange program. Though these results are encouraging, they do not link the operation of the needle exchange to changes in the rate of new HIV infections. Achieving this required the development of a mathematical model describing HIV transmission among IDUs via needle sharing. The syringe tracking and testing system, in concert with limited observations obtained from surveying program clients, provided the data required to estimate the parameters of this model. Though the model developed was conservative, the results were interesting. It estimated that even in the absence of behavioral changes on the part of IDUs in the program, the rate of new HIV infections among needle exchange clients would drop; it estimated a 33 percent reduction in new HIV infections.

(d) Understanding the impact of this study requires both a local and a national perspective. Locally, it is possible to construct a conservative estimate of the actual number of infections averted. As many clients who joined the needle exchange apparently dropped out, the conservative impact of the program can be estimated by multiplying the cumulative number of person-years spent in the program over all clients by the incidence reduction of 2 HIV infections per 100 client-years. This assumes that all those who apparently dropped out of the program are truly recidivists, an assumption that may be patently false. Calculations have shown that between $1 million and $2 million in public health care expenditure was avoided over the first two years of the program. This only hints at the true impact of this work.
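As a rough sketch of how such a conservative estimate is assembled: only the incidence reduction of 2 infections per 100 client-years comes from the text; the cumulative client-years and the treatment cost per infection below are hypothetical placeholders, not figures from the article:

```python
# Hypothetical back-of-the-envelope version of the "infections averted" estimate.
client_years = 250.0                  # assumed cumulative person-years spent in the program
reduction_per_100_client_years = 2.0  # from the text: 2 HIV infections averted per 100 client-years
lifetime_treatment_cost = 200_000     # assumed discounted health care cost per infection ($)

infections_averted = client_years * (reduction_per_100_client_years / 100.0)
cost_avoided = infections_averted * lifetime_treatment_cost
print(f"Infections averted: {infections_averted:.1f}")
print(f"Health care expenditure avoided: ${cost_avoided:,.0f}")
```

With placeholder values of this magnitude, the avoided expenditure lands in the $1 million to $2 million range the article reports for the first two years.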
Needle exchange has been returned to the menu of legitimate AIDS interventions in major American cities, in large part due to the New Haven evaluation. Some calculations of how much public health care cost could be avoided consider only the annual reduction in HIV incidence among needle exchange program clients, as opposed to changes in the lifetime probability of acquiring HIV infection. While the decrease in lifetime risk will be less than the decrease in annual incidence, the effect of placing clients in drug treatment via needle exchange has been ignored. If this point is considered, the impact of needle exchange on clients' probability of acquiring HIV could be substantial.

2.2-1 (a) The Dutch government has been facing problems with its water management. In the past the problem was too much water; now it is the scarcity of fresh water and pollution, due to increased industrialization and a growing population with a high standard of living. Some features of the Dutch landscape exaggerate the problem. The Netherlands, one of the most densely populated countries in the world and the seventh wealthiest nation, derives a large amount of wealth from crops grown on irrigated land. Since agriculture is the largest user of fresh water in the Netherlands, water shortages can cause large economic losses. The Rhine river is the Netherlands' major source of surface water for agriculture, irrigation, and other purposes. Along with other rivers and canals, it is a major artery for the inland shipping fleet of Western Europe. Low water levels in rivers and canals can cause shipping delays and economic losses as well, because only partially laden ships can then navigate the inland waterways. Besides this, mines and industries along the Rhine discharge different types of pollutants into the water, which also contribute to the river's increasing level of salinity, which in turn damages crops and threatens the environment and personal health. Power plants on the banks can degrade water quality by discharging excess heat into streams, which may in turn endanger the ecological balance in the neighborhood. Besides salinity, the most important water quality problem is eutrophication: heavy growth of algae in the relatively stagnant water of storage reservoirs and lakes, which causes the water to smell and taste foul. Though in the mid-1970s supply met the demand for fresh surface water except in dry years, this was predicted to no longer hold by the late 1980s. Ground water sources were already facing scarcity: rapid recent increases in ground water extraction have caused its level to drop in many areas, which in turn can cause agricultural and environmental damage in areas where the water level was higher. Facing such water management problems, the Netherlands government agency responsible for water control and public works, Rijkswaterstaat, commissioned an analysis on which to base a new national water management policy, which resulted in PAWN, the Policy Analysis for the Water Management of the Netherlands, in April 1977.

(b) The purposes of the five mathematical models are as follows:
The Water Distribution Model is the heart of the analytic method. The infrastructure of the surface water system consists of rivers and canals that transport water; lakes and reservoirs that store water; and weirs, locks and lock bypasses, sluices, and pumping stations that are used to control the transport of water throughout the country. The model simulates the major components of this system in detail and contains aggregated representations of the other components.
The model provides information on the water management system, including flows, water levels, extractions, discharges, shipping depths, and concentrations of pollutants. It also provides information on various costs, including investment and operating costs for technical and managerial tactics and irrigation, as well as shortage and salinity losses for agriculture, low-water shipping losses, and shipping delay losses. This information is provided for each 10-day period and is summarized in totals or averages for the entire year.
The Industry Response Simulation Model: When water becomes more expensive or less available, firms respond by modifying their production processes to consume less water, usually with an accompanying increase in costs, part or all of which may be passed on to their customers. To determine these responses and their costs, PAWN developed and used this model. The model simulates the behavior of industrial firms in response to a charge on ground water extractions or an increase in the price of drinking water. In determining behavior, the model assumes that each firm will choose the least costly alternative available. Although the model was developed to investigate the effect of ground water charges on industry, it is also used to examine the effect of imposing quotas that restrict ground water extractions.
The Electric Power Reallocation and Cost Model: The Water Distribution Model provides an excess-temperature table that shows the rise in temperature at one node resulting from a reference heat discharge by a power plant at another node. To obtain the excess temperatures created by power plant heat discharges other than the reference, this model scales the relevant entries in the table by the ratio of the new discharge to the reference discharge. The model calculates the optimal generating schedule for two basic conditions: one in which the thermal standards are relaxed and one in which they are imposed. The difference is the cost attributable to the thermal standards, the thermal penalty cost. The model repeats this process for each 10-day period in the year and calculates the total annual thermal penalty as well as some other statistics.
The Nutrient Model: Eutrophication, a heavy growth of algae called an algae bloom, occurs in the still water of lakes and reservoirs. This model estimates the amounts of nutrients (phosphates, nitrogen, and silicon) available to algae, given the nutrient flows entering the lake. The model calculates the composition of a column of water 1 square meter in area and as deep as the lake under investigation, in contact with the air and with the bottom sediments. The important nutrient processes include the inflow and outflow of nutrient-bearing water, the flux of nutrients from the bottom, and the flux of nutrients to and from algae.
The Algae Bloom Model: PAWN used this model to analyze the effect on algae blooms of various circumstances, including the introduction of control tactics. It predicts the weekly size and species composition of the algae bloom, given the amounts of nutrients and solar energy available to the algae.

(c) PAWN compares policies in terms of their impacts. In choosing impact measures, the primary criterion was that they be sufficient to span a wide range of objectives. These include both national and general water management objectives as well as specific objectives mentioned by different interest groups. The objectives also had to reflect both equity and efficiency.
Impacts on the water management system include the investment and operating costs of technical and managerial tactics, as well as flood risk on IJssel Lake. Direct impacts on users include changes in profit, expenditure, and revenue for each user group, such as agriculture, shipping, electric power generation, industry, and drinking water supply companies. Environmental impacts include violations of water quality standards, damage to nature areas caused by the construction of new facilities, and the total amount of ground water extracted. Impacts on the entire nation include the net monetary benefit to the nation after deducting transfer payments; total economic effects -- both government revenues and charges -- on production, employment, and imports, occurring both in industries directly involved in the construction of major new facilities and in interrelated industries; and effects on public health. PAWN also pays attention to distributional effects that show the uneven distribution of monetary benefits and costs among producers, consumers, and government, and the uneven distribution of other impacts among different groups and locations.

(d) The tangible benefits are:
-- the building of the Brielse Meer pipeline, which will yield $38 million in investment savings and $15 million in annual net benefits from decreased salinity damage to agriculture
-- rejection of the plan to build a second dike to separate the Markermeer from an adjacent saline lake, saving more than $95 million in investment costs, 0.2% of Dutch domestic product
-- implementation of a new flushing policy for the Markermeer, expected to yield net benefits between $1.2 million and $5.4 million per year
-- adoption by the Dutch of a more stringent thermal standard (an allowed increase of 3 degrees Celsius for canals), since PAWN showed it was practical and not costly; this led to a decrease in locally harmful ecological effects of power plant heat discharges.
The intangible benefits are:
-- drastic changes in the Dutch approach to eutrophication, their most serious water quality problem
-- implementation of all of PAWN's recommendations would lead to an expected profit between $53 million and $128 million per year
-- to deal with the ground water extraction problem, priority had to be given to industry and drinking water companies; if practical methods to replenish the ground water cannot be devised, regulatory measures will slow the growth of ground water sprinkling
-- the comprehensive methodology developed by PAWN has been adopted by the government and other departments and laboratories, and has been used in several major studies
-- PAWN provides a method to educate decision makers and train analysts in analyses of complex natural resource and environmental questions
-- the general approach and some of the techniques have potentially wide applicability.

2.2-2 (a) The author's example of a model in the natural sciences is Newton's Law of Universal Gravitation. Though he says it is one of the most important models in physics, it does not account for all details. For example, it is only approximate if the particles are objects with non-spherical shapes, and the model ignores relativity. The model in OR identified by him is the Economic Order Quantity (EOQ) model. Like Newton's Law in physics, this model too is simple and highlights important features of the real world. It identifies some critical relationships and also shows that a single model can be used for all types of orders. This model too ignores details of the real world that might be considered important. But just like Newton's Law, the EOQ model is one of the most important ones in MS/OR.
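To make that simplicity concrete, here is a minimal sketch of the standard EOQ formula, Q* = sqrt(2DK/h), where D is annual demand, K is the fixed cost per order, and h is the annual holding cost per unit; the numerical values are hypothetical and purely illustrative:

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Classic economic order quantity: Q* = sqrt(2*D*K / h)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

# Hypothetical data: 12,000 units/year demand, $50 per order, $3/unit/year to hold.
D, K, h = 12000, 50.0, 3.0
q_star = eoq(D, K, h)
print(f"Order about {q_star:.0f} units at a time "
      f"({D / q_star:.1f} orders per year)")
```

The point mirrors the author's: a one-line formula, built on deliberately ignored detail, still captures the essential trade-off between ordering cost and holding cost.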
(b) The MS/OR profession is often compared to the natural sciences. Basic precepts of the natural sciences can be used to guide research in MS/OR. The author believes a greater understanding of these precepts can provide needed focus for the profession and help resolve some recent debates. To be useful, an MS/OR model must possess the same qualities as models in natural science. The most important of these are:
-- understandability
-- verifiability
-- reproducibility
The extent to which a model can be understood depends on the tools available for evaluation, but models have inherent value that can be interpreted on inspection. In the 1960s many MS/OR professionals tried to model the behavior of automobile traffic. The most successful models examined traffic from a macro point of view and found similarities between traffic flow and fluid flow. Less successful models examined the behavior of individual drivers with complicated queueing expressions. The latter models, though more accurate, have less value: they are too difficult to interpret. A model should be verifiable and based on observable phenomena. It must capture the essence of a problem faced by MS/OR practitioners. It must include the important parameters, decision variables, aims and objectives, and their relationships. As a criterion for publication, the phenomena underlying the model must be reproducible. To have broader appeal, the model must be sufficiently general.

2.3-1 (a) The role of evaluating a model is to extract information from it. It entails two, often simultaneous, activities: identifying alternatives and calculating objectives. The best-known technique for identifying alternatives is optimization. The process yields a single solution that maximizes or minimizes a single objective function. The most prevalent technique for identifying multiple alternatives is sensitivity analysis. That process can show how the optimum changes when model parameters change, or can provide near-optimal alternative solutions. The author's view is that optimization should not be the sole goal, not just because models are abstractions of the real world but because optimization does not provide adequate information for making decisions: its objective is to find only one solution, whereas the decision maker would probably prefer information on several alternatives. Though sensitivity analysis increases the effectiveness of optimization, it is deficient; it only yields alternative solutions near the optimum. The decision maker instead needs unique solutions that offer distinct alternatives. So the author argues that research should be devoted to identifying multiple alternatives. One may begin in the solution process itself: each solution is a feasible alternative, which the decision maker may choose over the optimum. New algorithms may be designed to identify distinct alternatives. The second step of evaluation should involve calculating a quantifiable objective for each alternative. In summary, the author's view is that although optimization has dominated research in MS/OR, it is but one technique for addressing one part of the MS/OR process. It is deficient because it does not provide adequate information for making important decisions. Complex decisions instead require information on many alternatives, as well as an understanding of basic trade-offs and principles. Optimization alone cannot provide this information.

(b) The key to MS/OR is not only possessing knowledge.
Though different practitioners take different approaches, there are three key steps -- modeling, evaluating, and deciding -- and they are all complementary. In MS/OR, systematized knowledge is reflected in better decisions. The key to good decisions is knowledge and judgment. Modeling and evaluation form a systematized way of acquiring knowledge; judgment is acquired through experience. The problems that do not require judgment are the ones that can be formulated with well-defined objective functions and solved automatically with quite efficient algorithms, an example being the shortest path algorithm. On the other hand, there are problems that are easy to formulate but difficult to solve. For example, a carpet store owner would not argue with the objective of the cutting stock problem but may not be happy with the solutions provided by available software; he would benefit from models that offer help in cutting the carpet. Combining the knowledge from modeling with the judgment of the store owner would give the best result. Generally, the important questions facing management are not as well defined as the shortest path or cutting stock problems, nor are there related well-defined problems that can be optimized -- for example, the facilities layout problem. Thus the roles are all complementary: most depend on both the judgment of the decision maker and the knowledge gained from modeling and evaluating.

2.4-1 The credibility of analyses, and therefore the probability that policies based upon them will be implemented, depends on the perceived validity of the models. The process of model validation, though a burden, helps one learn lessons that may lead not just to improvements in the model but also to changes in scientific theory and public policy. This happened in PAWN with the Nutrient Model and eutrophication. When PAWN was started, the Dutch eutrophication control strategy was to decrease phosphate discharges into surface water from point sources, mostly sewage treatment plants. To find out how effective this strategy was, the Algae Bloom Model was applied to some major Dutch lakes. It revealed that in most cases controlling blooms required enormous percentage decreases in phosphate concentrations. The next question was what had to be done to achieve a particular percentage decrease in phosphate concentration. The Dutch strategy was based on the assumption that the large amount of phosphates and other nutrients accumulated in the bottoms of the lakes was bound permanently to the bottom and hence unavailable to support algae blooms. This was contradicted in both the Nutrient Model calibration process and the validation process. The studies undertaken showed convincingly that nutrients, particularly phosphate, can be liberated from bottom sediments both in a normal steady mode and in an explosive mode. This conclusion was widely accepted in the scientific community. But it implied that using a phosphate reduction program as the only way to limit algae blooms would have hardly any immediate success. Analysis with the Algae Bloom Model, however, suggests other tactics that could be effective, and the combination of tactics should be tailored to individual lakes.

2.4-2 The author feels that observation and experimentation are not emphasized in the MS/OR literature, or in the training of its workers, as much as experience would lead one to believe. As examples, he gives some experiences with the US Air Force in the early 1950s which strengthen his belief.
He argues that observing actual operations as part of the analysis process provides a necessary base for understanding what is going on in a problem situation. Such observations can help point out the difficulties being encountered, suggest hypotheses and theories that may account for problems, and offer evidence regarding the validity of the models built as part of the problem-solving process. If a problem concerns a system that does not yet exist, or an operating system fulfills an important function that must continue so that controlled experiments with it are not possible, one can build a theory about the relevant phenomena and analyze the theory, but numerical results obtained in this way can clearly be viewed with suspicion. Alternatively, if a similar system exists, one can extrapolate from results with it to make estimates about the prospective system. In fact, administrative emergencies or an executive desire to try something new may cause the behavior of a system already in existence to change. The analyst may then be able to collect data useful for analyzing how the system would operate under changed circumstances, or for identifying problems that might crop up under different operating regimes. He gives evidence from his personal experiences to give substance to these remarks. Where data from one system are used to predict the performance of another, he believes that parameter values obtained from observing a similar system can be useful, and that incorporating such estimates in a crude study can be better than not doing a study at all. Parameter values carried from one context to another cannot be expected to support detailed findings, but even crude findings are enough to provide indispensable information on which to base policy. He has also analyzed the results of a continent-wide air defense exercise. He notes here that analysis must be carefully planned, and that planning must begin early. Early work serves to focus attention on the structure of the work and the issues to be faced, as well as other responsibilities. In a nutshell, then, the author's view is that the skills involved in observation and experimentation are numerous and should be part of the tool kit of many MS/OR analysts. He views discriminating observation and carefully planned experimentation and analysis as central to MS/OR. Observing actual operations and collecting data allow us to discern problems, develop hypotheses, and validate models, all of which require skill. Similarly, accurate and complete data are required to estimate validity. Program evaluation brings together many of the issues of observation and experimentation. Thus issues of scientific and professional craft related to observation and experimentation should occupy important places in the experience, literature, and training of MS/OR workers.

2.4-3 (a) The author's view is that analysts do not believe a model can be completely validated. He further argues that policy models can at best be invalidated. Thus the objective of validation or invalidation attempts is to increase the degree of confidence that the events obtained from the model will take place under the conditions assumed. After trying all invalidation procedures, one will have a good understanding of the strengths and weaknesses of the model and will be able to meet criticisms of omissions. Knowing the limitations of the model will enable one to express proper confidence in its results.
(b) Model validity deals with the correspondence of the model to the real world and relates to pointing out all stated and implied assumptions and to identifying and including all decision variables and hypothesized relations among variables. Different assumptions are made, and the analyst compares each assumption and hypothesis to the internal and external problem environments viewed by the decision maker and comments on the extent of divergence.
Data validity deals with raw and structured data, where structured data are manipulated raw data. Raw-data validity is concerned with measurement problems and with determining whether the data are accurate, impartial, and representative. Structured-data validity requires review of each step of the manipulation and is a part of model verification.
Logical/mathematical validity deals with translating the model form into a numerical, computer-based process that produces solutions. There is no standard method for determining this. Approaches include comparing model outcomes with expected or historical results and closely scrutinizing the model form and its numerical representation on a flowchart.
Predictive validity is the analysis of errors between actual and predicted outcomes for a model's components and relationships. Here one looks for errors and their magnitudes, why they exist, and whether and how they can be corrected.
Operational validity attempts to assess the importance of errors found under technical validity. It must determine whether use of the model is appropriate given the observed and expected errors, and it also addresses whether the model can produce unacceptable answers for proper ranges of parameter values.
Dynamic validity is concerned with determining how the model will be maintained during its life cycle so that it will continue to be an accepted representation of the real system. The two areas of interest are thus update and review.

(c) Sensitivity analysis plays an important role in testing the operational validity of a model. In it, the values of model parameters are varied over some range of interest to determine if and how the recommended solution changes (a small illustrative sketch of such a parameter sweep appears after part (d) below). If the solution is sensitive to certain parameter changes, the decision maker may want the model analysts to explore further or to justify in detail the values of these parameters. Sensitivity analysis also concerns the relationship between small changes in parameter values and the magnitude of the related changes in outputs.

(d) Validating a model tests the agreement between the behavior of the model and the real-world system being modeled. Models of a nonexistent system are the most difficult to validate. Three concepts apply here: face validity or expert opinion, variable-parameter validity and sensitivity analysis, and hypothesis validity. Though these concepts are applicable to all models, models of real systems can be subjected to further tests. Validity is measured by how well real-system data compare with model-generated data. The model is replicatively valid if it matches data already acquired from the real system. It is predictively valid if it matches data before those data are acquired from the real system. A model is structurally valid if it not only reproduces the observed real-system behavior but also reflects the way in which the real system works to produce that behavior. The author's view is that there is no validation methodology appropriate for all models. He says that a decision-aiding model can never be completely validated, as there are never real data about the alternatives not implemented. Thus, analysts must be careful in devising, implementing, interpreting, and reporting validation tests for their models.
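As a generic illustration of the parameter sweep described in part (c) -- not the author's own procedure -- the sketch below reuses the EOQ model from Section 2.2-2 as a stand-in decision model and varies one parameter, the holding cost, to see how the recommended solution moves; all values are hypothetical:

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    # Stand-in decision model; any model with a recommended solution would do.
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

annual_demand, order_cost = 12000, 50.0          # held fixed (hypothetical values)
for holding_cost in (1.0, 2.0, 3.0, 4.0, 6.0):   # range of interest for the swept parameter
    q_star = eoq(annual_demand, order_cost, holding_cost)
    print(f"holding cost = {holding_cost:4.1f}  ->  recommended order quantity = {q_star:7.0f}")
```

If the recommendation swings sharply over the plausible range of a parameter, that parameter deserves closer justification before the results are used.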
(e) Basic validation steps are cited on page 616 of the article.

2.5-1 (a) In the late 1970s, oil companies began to experience downward pressure on profitability due to rapid and continuing changes in the external environment. Partially in response to these pressures, Texaco's Computer Information Systems department developed an improved on-line, interactive gasoline blending system called OMEGA. It was first installed in 1983 and is now used in all seven Texaco US refineries and in two foreign plants.

(b) A simple interactive user interface makes OMEGA easy to use. All input data can be entered by hand, and OMEGA can also interface with the refinery data acquisition system. The user can access stock qualities, stock availabilities, blend specifications and requirements, starting values and limits, optimization options, automatic stock selection, automatic blend specification, and several other options. Several features aid the user in performing planning functions. By choosing appropriate options, the user can perform various optimizations; other options are available as well. Each refinery uses a different set of features depending on its available blending stocks, which vary with the configuration of the refinery and the particular crudes being refined. The availability and ease of use of OMEGA's features have provided engineers and blenders with a powerful yet easy tool.

(c) OMEGA is constantly being updated and extended. It had to be modified to take into account the EPA's regulation phasing down lead in regular leaded gasoline, so that OMEGA would be more accurate at these lower lead levels. OMEGA is continuously modified to reflect changes in refinery operations, and differences among refineries have required changes to the system. When Texaco began installing OMEGA in its foreign refineries, additional changes had to be made to handle the different requirements of different countries. Improvements to OMEGA are also needed to enable it to answer the new and unanticipated what-if questions often asked by refinery engineers.

(d) Each refinery uses OMEGA in varying degrees and for various purposes depending on its needs, complexity, and configuration. Typical usage of the system is as follows. On a monthly basis, refineries use OMEGA to develop a gasoline blending plan for the month. The refinery planning model's projected blending stock volumes are input to OMEGA. The blending planner calculates 3 to 8 blends in a single OMEGA run. The refinery planning model's blend compositions are input into OMEGA as initial values. Once a reasonable blend is developed, the marketing department is contacted to discuss the resulting grade splits. After the marketing department does its part, a finalized blending plan is developed for the month. The scheduler then determines when each of the grades will be blended. All of this work is done using OMEGA.

(e) OMEGA contributes to overall profitability. To measure the actual benefit, one method tried was comparing the blend compositions that blenders used with and without OMEGA. Here OMEGA achieved as much as a 30 percent increase in profit; the average increase in profit is approximately 5 percent of gross gasoline revenue. When OMEGA is used to calculate blending recipes, fewer blends fail to meet their quality specifications. OMEGA's more reliable gasoline grade-split estimates provide significant aid to those developing marketing strategies and refinery production targets.
OMEGA is used for what-if case studies, performed for example for economic analyses of refinery improvement projects and analyses of how proposed government regulations would affect Texaco. OMEGA's features have given Texaco the capacity to do things not possible with the previous blending system -- for example, to deal with mixed stocks, consider new grades of gasoline, exercise more control over inventory, and so on. OMEGA's features make it easy and quick to explore new avenues of profitability for a refinery.

2.5-2 (a) Yellow Freight System, Inc. was founded in 1926 as a regional motor carrier serving the Midwest. Today it is one of the largest motor carriers in the country. From mixed operations in the 1970s, Yellow now predominantly serves the less-than-truckload (LTL) portion of the freight market. The 1980s were a difficult decade for the motor carrier industry. Deregulation opened the way for tremendous growth opportunities but also presented management with new and difficult challenges in managing these larger operations more efficiently than before. After 1980, motor carriers were forced to compete on price, which led to strong pressure to cut costs; the result was a decrease in transportation rates. Between 1980 and 1990, transportation rates dropped 29% in real terms. In addition to real rate decreases, the shipping community, in response to intense international competition, began to raise its service expectations. For many shippers, Yellow Freight is a full partner in their total quality management programs. Another important component of the logistics system is timely delivery of freight, and service reliability is also critical. This heightened emphasis on service was a problem for some long-standing operating practices used by national LTL carriers. The effect of these pressures can be seen in the tremendous attrition the industry suffered: of the top 20 revenue-producing LTL carriers in 1979, only 6 remain today. In this period, Yellow Freight grew from 248 to 630 terminals. This growth has had the effect of creating an extremely large and complex operation, and the larger network also requires a greater degree of coordination. In 1986, Yellow initiated a project to improve its ability to manage such a complex system. Yellow was interested in using modern network methods to simulate and optimize a large network. The project had a main goal -- improved service and service reliability through better management control of the network -- supplemented by broader management objectives. There was also an expectation that improved planning would lead to higher productivity levels and lower costs. Consequently, a project team was formed.

(b) The development effort at Yellow started with an existing model as a base, which was then modified. The result of this effort was SYSNET. SYSNET is more than 80,000 lines of FORTRAN code for performing sophisticated optimizations using modern network tools. The team developed an innovative, interactive optimization technology that puts human beings in the loop, placing sophisticated, up-to-date optimization methods in their hands. These methods were required to develop a system that could handle the entire network without resorting to heuristic methods to reduce the size of the problem. As a result, the user is able to analyze the impact of changes on the whole network in a simple but interactive fashion. Projects can now be completed faster and with greater precision.
Decisions on shipment consolidations are now optimized taking into account the system effect of each decision.
Yellow uses SYSNET for two sets of applications:
-- the main use is tactical load planning, which involves monthly planning and revision of the set of instructions that govern the handling and consolidation of shipments through the network
-- the second set of applications involves longer-range planning of the network itself. These problems cover the location and sizing of new facilities and long-range decisions that govern the flow of freight between terminals.
At Yellow, SYSNET is more than just a piece of code. It embodies an entire planning methodology adopted at all levels of the company. From strategic planning studies communicated to high-level management to network routing instructions sent right to the field, SYSNET has become a comprehensive planning process that has allowed management to maintain control of a large, complex operation. In addition, Yellow uses SYSNET as the central tool in the design and evaluation of projects worth over $10 million in annual savings.

(c) The interactive aspects of the code proved important in two respects:
-- the user is needed to guide the search for changes in the network. For example, the user may know that freight levels are on the rise in the Midwest or that a particular breakbulk is facing capacity problems. In other cases, the user may know that the current solution is a local minimum and that a major change in the network is needed to achieve an overall improvement. A human being can easily spot these spatial patterns and test promising configurations.
-- the second use proved critical to the adoption of the system: the user's ability to accept or reject suggestions made by the computer. SYSNET displays suggested changes and allows the user to evaluate each one in terms of difficult-to-quantify factors. Also, local factors, such as work rules or special operating practices that are not incorporated into the model, can be accounted for by a knowledgeable user.

(d) For strategic planning, the outputs from SYSNET are a set of reports used to prepare management summaries of different options. SYSNET is also used on an operational basis to perform load planning. In this role, SYSNET is used to maintain a file that determines the actual routing of shipments through the service network. This file, which contains the load plan, is accessed directly by systems used by every terminal manager in the field. SYSNET's control of load planning, and its ability to communicate these instructions to the field, is the most important accomplishment of the project.

(e) SYSNET's effect can be seen in four areas:
-- the quality of planning practices and management culture
-- cost savings resulting directly from improvements in load planning
-- the analysis of projects
-- improved service to customers from more reliable transportation
Qualitative changes include the following:
-- management has more control over network operations. SYSNET now allows managers to exercise direct control: the new load pattern closely controls the loading of directs, and management can quickly change the load pattern in response to changing needs
-- realistic performance standards can be set. SYSNET allowed Yellow to set direct-loading standards based on anticipated freight levels, creating more realistic performance expectations
-- planners can better understand the total system now.
Yellow can now evaluate new projects and ideas based on their impact on the entire system.
-- SYSNET allows managers to analyze projects formally before making decisions
-- with SYSNET, managers can analyze new options quickly in response to changing situations
-- analysts can now try new ideas on the computer, which ultimately leads to new ideas in the field
-- because of SYSNET, Yellow is more open now to the use of new information technologies
-- the new system has reduced claims.
SYSNET has had a substantial impact on the management culture at Yellow.
Performance improvements due to better load planning include the following. A study was undertaken to estimate the savings that could be attributed to SYSNET. Total cost savings for the system were estimated at over $7.3 million annually. Savings in breakbulk handling costs also increased. Besides this, reducing the number of shipments handled may, in the long run, bring down investments in fixed facilities. SYSNET brought down the cost of routing trailers, in part by identifying directs with lower transportation costs; savings due to better routing of trailers were estimated at $1 million annually.
Ongoing projects include the following. Operations planning uses SYSNET to scrutinize a wide range of projects, from relocating breakbulks to realigning satellites with breakbulks. Using SYSNET, operations planning now completes over 200 projects per year, mostly on an informal, exploratory basis. SYSNET's speed in evaluating different ideas is critical to this process. In 1990, Yellow used SYSNET to identify over $10 million in annual savings from different projects. SYSNET improved the speed with which such analyses could be completed and expanded the scope of each project, thus allowing Yellow to study system impacts with more precision than before. SYSNET has therefore played a main role in the identification, design, and evaluation of these projects.
Improved service includes the following. The savings from SYSNET are substantial compared to the cost of its development and implementation. Following the implementation of SYSNET, management can focus better on improving service. Yellow continues to use SYSNET for a number of planning projects and to continuously monitor and improve the load planning system, which is now used directly within the linehaul operations group responsible for day-to-day management of flows through the system. In addition, Yellow is using SYSNET as a foundation for expanding the use of optimization methods to other aspects of its operations. SYSNET is now very popular within the company for its ability to carry out accurate, comprehensive network planning projects.

2.6-1 (a) Implementing this major change in operations required the involvement and support of all levels of the company. The process started with acceptance of the system by the operations planning department. Operations planning was responsible for guiding the project and, with close cooperation from the information services department, managing all aspects of the implementation. The system's acceptance was in large part due to the use of interactive optimization, which gave users the support needed to optimize such a large network while simultaneously keeping them in close control of the entire process. Users could also analyze suggested changes to the network based on changes in flows and costs, which could be compared against actual field totals. The next step was to validate the cost model. The team was able to compare both total system costs and individual subcategories against actual cost summaries for those categories.
The individual cost categories within SYSNET consistently match corporate statistics within a few percent, and total costs often match within 1 or 2 percent. The validation of the cost model, both in total and in its individual components, played a vital role in gaining upper management's acceptance. The interactive reports and features that convinced operations planning also played a strong role in winning the support of top management. The team ran sessions for upper management to demonstrate how SYSNET made suggestions and generated supporting reports to back up the numbers. They also demonstrated how standard operating practices could be detrimental and why coordinating the entire network was important. Through these efforts, they gained the confidence of upper management needed to support a field implementation.

(b) With the support of upper management, they were able to develop an implementation strategy. The controlled direct program changed operating philosophy so drastically that a single corporate-wide transition was viewed as unsafe. In implementing SYSNET, Yellow made a systematic change in the way it loaded directs: SYSNET encourages a greater proportion of directs to be loaded onto breakbulks. It was not possible to change these operating methods easily over the whole network, yet it was also difficult to do so in a piecemeal fashion. To deal with this problem, they developed a phased implementation strategy that started with the smallest breakbulks in the system and worked up to the larger ones. Careful planning ensured that no breakbulk would be over capacity during the intermediate stages of the process; the entire implementation was planned so that no breakbulk would find itself over capacity during the transition period.

(c) Communicating the new concept to terminal managers in the field involved three steps:
-- designing new support tools so that SYSNET routing instructions were easy to follow
-- training terminal managers and dock personnel to use these new systems, and, most important,
-- convincing terminal managers that the new approach was a good idea.
They developed two new support tools to assist field operations:
-- the first was a set of reports that managers or dock supervisors could access from their local computer terminals, giving them immediate access to the SYSNET load pattern
-- the second was a revised shipment movement bill, which provides a very high level of control over the routing of individual shipments.
The operations planning department handled training by organizing a series of visits to all 25 breakbulks. During each visit, the staff members explained the principles behind the controlled direct program, the new reports, and the use of the new routing directions. Follow-up was done by phone calls. The most important task was to convince terminal managers of the logic behind the new operating strategy. Terminal managers needed to understand that they had to follow the load plan because it was designed to coordinate the different parts of the system. The team used examples to illustrate the effect managers' decisions could have on other terminals. Generally, people in the field accepted the principle that their decisions should be coordinated with those in the rest of the system.

(d) Following the implementation of SYSNET, they developed a target representing the expected number of directs that each terminal should be loading based on the SYSNET plan. Yellow then measured terminal managers' performance based on how close they were to this target.
After some period, Yellow deemed compliance with the plan so good that it now measures terminal managers' performance on other activities, and it continues to monitor compliance with the load plan informally. It then contacts managers who appear not to be in compliance to determine the reasons. In short, SYSNET has changed load planning from a decentralized process that depended on local management incentives to a centralized process that relies on monitoring and enforcement.

2.6-2 (a) The information processing industry has experienced several decades of sustained profitable growth. Recently, competition has intensified, leading to rapid advances in computer technology, which in turn lead to a proliferation of products and services. These trends are especially relevant for after-sales service: maintaining a service parts logistics system to support products installed in the field is essential to competing in this industry. Growth in both sales and the scope of products offered has dramatically increased the number of spare parts that must be maintained. For IBM, the number of installed machines and the annual usage of spare parts have both increased. This growth has increased the dollar value of service inventories, which are used to maintain the very high levels of service expected by IBM's customers. IBM has developed an extensive multiple-echelon logistics structure to provide ready service for the large population of installed machines distributed throughout the United States, and it developed a large and sophisticated inventory management system to provide customers with prompt and reliable service. A fast-changing business environment and pressure to decrease investment in inventory led IBM to look for improvements in its control system. In response to these new needs, IBM initiated the development of a new planning and control system for the management of service parts. The result was the creation and implementation of a system called Optimizer.

(b) The complicating factors faced by the OR team are as follows:
-- there are more than 15 million part-location combinations
-- there are more than 50,000 product-location combinations
-- frequent (weekly) updating of the system control parameters was required, in response to changes in the service environment and the installed base
-- the success of the system is important to IBM's daily operations, and so it can have a major impact on IBM's future sales and revenues
-- employees could be expected to resist any change, since the existing control system was working and sophisticated, and the overall parts logistics problem was complex.

(c) The system developed in this phase had a minimal interface to provide data inputs and the multi-echelon algorithm without any enhancements. Most of the big changes from the original design were made in this phase. They discovered that the echelon structure was in reality more complex than the one used in the analytic model. Consequently, they had to develop extensions to the demand pass-up methodology and incorporate them into the model. The test was conducted in early 1986 and led to the finding that the value of the total inventory generated by the new system was smaller than expected. It was discovered that the problem was due to differences in the criticality of parts: the algorithm made extensive use of inexpensive, non-functional parts to meet the product-service objective.
Another problem found at this stage was churn (instability) in the recommended stock levels from week to week. Although stock levels are expected to change periodically in response to changing failure rates and changes in the installed base, it is desirable to keep the stock levels quasi-static in order to avoid logistics and supply problems. They developed control procedures and changed the model to take care of this problem.

(d) In this phase, they completed all functions required for implementation and developed a measurement system to monitor the field implementation test. After finishing the system coding for this phase, they conducted an extensive user acceptance test, with every program module tested individually and jointly. Finally, a field implementation test went live on 7 machine types in early 1987. The performance of the system fulfilled expectations. The scope of the field test was slowly expanded, and results were monitored on a weekly and then monthly basis by the measurement system.

(e) In this phase, they completed the development and installation of all the functions currently in place in Optimizer. The system was able to provide the specified service performance for all parts and locations, and further improvements were made. User acceptance testing and integration of the final system went smoothly. The project staging helped to sustain support for the project by demonstrating concrete progress throughout the implementation process. It also helped to root out problems in the formulation and algorithm, as well as programming bugs, early, so very few problems occurred when the system went live on a national basis. The final Optimizer system for national implementation consisted of four major modules:
-- a forecasting system module
-- a data delivery system module
-- a decision system that solves the multi-echelon stock control problem
-- the PIMS interface system.

(f) The implementation of Optimizer yielded a variety of benefits:
-- a decrease in inventory investment
-- improved service
-- enhanced flexibility in responding to changing service requirements
-- provision of a planning capability
-- improved understanding of the impact of parts operations
-- increased responsiveness of the control system
-- increased efficiency of NSD human resources
-- identifying the role of functional parts in providing product service is one example of the benefits derived from the implementation of Optimizer
-- the ability to run Optimizer on a weekly basis has increased the responsiveness of the entire parts inventory system
-- for machines controlled by Optimizer, inventory analysts no longer have to specify parts stocking lists for each echelon in order to ensure that service objectives are attained; they can now focus on other critical management issues.
Optimizer has thus proved to be an extremely valuable planning and operating control tool.

2.7-1 The 13 detailed phases of an OR study according to this reference are: Initiation (Embryonic), Feasibility, Formulation, Data, Design, Verification (Software Development), Validation, Training and Education, Installation, Implementation, Maintenance and Update, Evaluation and Review, and Documentation and Dissemination.
The 6 broader phases given in the chapter are listed below, with the detailed phases of the article that fall partially or primarily within each broader phase given in parentheses:
-- Define the problem of interest and gather relevant data (initiation, feasibility, data)
-- Formulate a mathematical model to represent the problem (formulation)
-- Develop a computer-based procedure for deriving solutions to the problem from the model (design, verification, i.e., software development)
-- Test the model and refine it as needed (validation)
-- Prepare for the ongoing application of the model as prescribed by management (training and education, installation)
-- Implement (implementation, maintenance and update, evaluation and review, documentation and dissemination).