We consider a batch service queue that is controlled by switching the server on and off, and by controlling the batch size and timing of services. Batch sizes cannot exceed a fixed number Q, which we call the service capacity. Costs are charged for switching the server on and off, for serving customers, and for holding them in the system. Viewing the system as a semi-Markov decision process, we show that the policies which minimize the expected continuously discounted cost and the expected cost per unit time over an infinite time horizon are of the following form: if, at a review point, the server is off, leave it off until the number of customers x reaches an optimal level M, then turn the server on and serve min(x, Q) customers; when the server is on, serve customers in batches of size min(x, Q) until the number of customers falls below an optimal level m (m ≦ M), then turn the server off. An example of computing these optimal levels is also presented.
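As a minimal sketch of the policy form described above (not the paper's method for computing the optimal levels), the decision rule at a review point can be written as follows; the function name and arguments are hypothetical, with x the queue length, Q the service capacity, and m ≦ M the switching levels.

```python
def policy_action(x, server_on, m, M, Q):
    """Return (server_on_after_review, batch_size_to_serve) at a review point.

    Hypothetical illustration of the (m, M) control policy: x is the number
    of customers present, Q is the service capacity, and m <= M are the
    optimal switching levels.
    """
    if not server_on:
        if x >= M:
            # Queue has reached M: turn the server on and serve min(x, Q) customers.
            return True, min(x, Q)
        # Otherwise leave the server off and serve no one.
        return False, 0
    if x < m:
        # Queue has fallen below m: turn the server off.
        return False, 0
    # Keep the server on and serve another batch of size min(x, Q).
    return True, min(x, Q)
```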