300 seconds. A machine takes on average 40 seconds more than the human operator to serve a customer, but with a standard deviation of only 150 seconds. Estimate the expected waiting time in queue of a customer for each of the two servers, as well as the expected queue size. Note the effect of variability in service time. If the cost of a human operator and the cost of a machine are the same, which one would you choose?
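For an M/G/1 queue the expected waiting time follows from the Pollaczek–Khinchine formula, Wq = λE[S²]/(2(1−ρ)), and the expected queue size from Little's law, Lq = λWq. This excerpt omits the arrival rate and the human operator's mean service time, so the sketch below plugs in hypothetical values for both (λ = one arrival per 1000 s, human mean 400 s) and assumes the "300 seconds" above is the human's service-time standard deviation:

```python
def mg1_waiting_time(lam, mean_s, std_s):
    """Pollaczek-Khinchine: expected wait Wq and queue size Lq for an M/G/1 queue."""
    rho = lam * mean_s                 # server utilization (must be < 1)
    es2 = std_s ** 2 + mean_s ** 2     # second moment E[S^2] of service time
    wq = lam * es2 / (2 * (1 - rho))   # expected waiting time in queue
    lq = lam * wq                      # Little's law: expected queue size
    return wq, lq

lam = 1 / 1000.0  # ASSUMED arrival rate: one customer per 1000 s (not given in the excerpt)

# Human operator: ASSUMED mean 400 s; std dev 300 s taken from the excerpt.
wq_h, lq_h = mg1_waiting_time(lam, 400.0, 300.0)
# Machine: 40 s slower on average (440 s), but std dev of only 150 s.
wq_m, lq_m = mg1_waiting_time(lam, 440.0, 150.0)

print(f"human:   Wq={wq_h:.1f} s, Lq={lq_h:.3f}")
print(f"machine: Wq={wq_m:.1f} s, Lq={lq_m:.3f}")
```

With these assumed numbers the machine's lower variability outweighs its longer mean service time, but the comparison flips at higher utilization (e.g. λ = 1/500 s), so which server wins depends on the actual arrival rate.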
Course term: Fall '14