… 300 seconds. A machine takes on average 40 seconds more than the human operator to serve a customer, but with a standard deviation of only 150 seconds. Estimate the expected waiting time in queue of a customer for each of the operators, as well as the expected queue size. Note the effect of variability in service time. If the cost of a human operator and the cost of a machine is the same, which one would you choose?
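One common way to estimate these quantities is to model each operator as an M/G/1 queue and apply the Pollaczek–Khinchine formula for the expected waiting time, with Little's law giving the expected queue size. A minimal sketch follows; note that the arrival rate and the human operator's service-time standard deviation are not given in this preview, so the values used below (one arrival per 600 seconds, human standard deviation of 300 seconds) are hypothetical placeholders:

```python
def mg1_wait(lam, mean_service, std_service):
    """Expected waiting time in queue for an M/G/1 queue
    (Pollaczek-Khinchine): Wq = lam * E[S^2] / (2 * (1 - rho)),
    where E[S^2] = var + mean^2 and rho = lam * mean."""
    rho = lam * mean_service  # server utilization; must be < 1 for stability
    assert rho < 1, "unstable queue: utilization >= 1"
    second_moment = std_service ** 2 + mean_service ** 2
    return lam * second_moment / (2 * (1 - rho))

# Hypothetical arrival rate: one customer every 600 s (not in the problem text).
lam = 1 / 600

# Human: mean 300 s, std 300 s (std assumed); machine: mean 340 s, std 150 s.
for name, mean, std in [("human", 300, 300), ("machine", 340, 150)]:
    wq = mg1_wait(lam, mean, std)
    lq = lam * wq  # Little's law: expected queue size Lq = lam * Wq
    print(f"{name}: Wq = {wq:.1f} s, Lq = {lq:.3f} customers")
```

With these illustrative numbers the machine yields a shorter expected wait despite its longer mean service time, because the smaller standard deviation reduces the second moment of the service time; at higher arrival rates the machine's higher utilization can reverse this, which is the variability-versus-utilization tradeoff the problem asks you to observe.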