Abstract:
We consider an M^X/G/1/K queue in which the removable server operates under the following (ν, N) policy: whenever the server completes a service and finds ν customers in the system, it takes a sequence of vacations. At the end of each vacation, the server inspects the queue length; if it is at least N, the server resumes serving and continues until the number of customers in the system drops to ν. Under a classical cost structure, we characterize the optimal policy and develop an algorithm that finds the (ν, N) policy minimizing the expected cost per unit time.
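The dynamics of the (ν, N) policy can be illustrated with a small discrete-event simulation. This is only a sketch: the batch-size distribution (uniform), the service and vacation distributions (exponential), and all parameter values below are illustrative assumptions, not the model primitives or the cost structure of the paper.

```python
import random

def simulate_vN_policy(lam=1.0, batch_max=3, mean_service=0.4,
                       mean_vacation=1.0, K=20, nu=1, N=5,
                       horizon=10_000.0, seed=42):
    """Monte Carlo sketch of an M^X/G/1/K queue under a (nu, N) policy.

    Illustrative assumptions (not from the paper): batch sizes are
    uniform on {1, ..., batch_max}, service and vacation times are
    exponential, and arrivals that would exceed capacity K are lost.
    Returns the time-average number of customers in the system.
    """
    rng = random.Random(seed)
    t, n, area = 0.0, 0, 0.0
    serving = False                                     # server starts on vacation
    next_arrival = rng.expovariate(lam)
    next_phase = rng.expovariate(1.0 / mean_vacation)   # current vacation ends here

    while t < horizon:
        t_next = min(next_arrival, next_phase)
        area += n * (t_next - t)                        # integrate queue length
        t = t_next
        if t == next_arrival:
            # Batch arrival; customers beyond capacity K are lost.
            n = min(K, n + rng.randint(1, batch_max))
            next_arrival = t + rng.expovariate(lam)
        elif serving:
            # Service completion.
            n -= 1
            if n <= nu:
                # Queue dropped to nu: begin a sequence of vacations.
                serving = False
                next_phase = t + rng.expovariate(1.0 / mean_vacation)
            else:
                next_phase = t + rng.expovariate(1.0 / mean_service)
        else:
            # Vacation end: inspect the queue length.
            if n >= N:
                serving = True                          # resume service
                next_phase = t + rng.expovariate(1.0 / mean_service)
            else:
                next_phase = t + rng.expovariate(1.0 / mean_vacation)
    return area / t
```

Such a simulator is useful as a sanity check on analytic results: sweeping (ν, N) over a grid and attaching holding, switching, and operating costs to the sample paths gives a crude numerical counterpart to the optimization the abstract describes.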