A usage variance is the difference between the expected number of units used in a process and the actual number used. If more units are used than expected, the difference is considered a negative (or unfavorable) variance. If fewer units are used than expected, the difference is considered a positive (or favorable) variance. For example, the standard number of ounces of titanium needed to fabricate a widget is ten. If the actual number used is eleven, there is a negative usage variance of one ounce.
A usage variance can be stated as the difference in the number of units. It can also be restated in currency by multiplying that unit difference by the standard cost per unit. To continue with the example, if one ounce of titanium costs $100, the cost of the one-ounce usage variance is $100. The calculation of this costed form of usage variance is:
(Actual usage - Expected usage) x standard cost per unit
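The formula above can be sketched as a short Python function (the function name is my own, not a standard term). Note that with this formula, a positive result corresponds to the unfavorable case described earlier, since actual usage exceeds expected usage:

```python
def costed_usage_variance(actual_usage, expected_usage, standard_cost_per_unit):
    """Return the usage variance in currency terms.

    A positive result means more units were used than expected
    (unfavorable); a negative result means fewer (favorable).
    """
    return (actual_usage - expected_usage) * standard_cost_per_unit

# Titanium example from the text: 11 oz actually used against a
# 10 oz standard, at a standard cost of $100 per ounce.
variance = costed_usage_variance(11, 10, 100)
print(variance)  # 100, i.e. a $100 unfavorable variance
```

The same function works for the labor efficiency variance, with hours in place of ounces and a standard labor rate in place of the material cost.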
The usage variance concept is most commonly applied to judge the volume of materials used in a production process, and is called the direct material usage variance. The concept is also applied to the amount of labor used; in this case, it is called the labor efficiency variance.
The usage variance can be of considerable utility from a management perspective, since it highlights areas in which there may be excessive levels of waste. These areas can then be targeted for investigation, followed by one or more improvement projects.
The usage variance concept is only used in a standard costing system, where the engineering staff creates standard usage levels that form the baseline for analyses. Standard usage amounts are stored in bills of material (for materials) or in labor routings (for labor). These standards may be adjusted from time to time, based on subsequent engineering reviews of products and processes, and on changes in the expected level of scrap derived from a process. If a standard is set incorrectly, it will trigger an essentially meaningless variance, since the basis of comparison is wrong.