The main difference is that the law of diminishing returns concerns the efficiency of adding a variable factor of production in the short run, while the law of decreasing returns to scale concerns the efficiency of increasing all factors of production in the long run.

The law of diminishing returns applies in the short run, when at least one factor (typically capital) is fixed: as more of a variable factor such as labour is added, the marginal product of that factor eventually falls, so marginal cost rises as output increases. For example, a firm may hire more workers in a factory; if there is a shortage of equipment for them to use, output cannot rise in proportion, yet the firm still incurs the cost of employing each extra worker, so efficiency falls.

Decreasing returns to scale, by contrast, is a long-run concept: it is a rise in long-run average cost resulting from an increase in the scale of production, that is, from increasing all factors of production, including capital, which is only possible in the long run when no factor is fixed. For example, if a company becomes too large it may become inflexible and slow to respond to market conditions, so long-run average cost rises as output increases.
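To make the contrast concrete, here is a minimal worked sketch, assuming (purely for illustration) a Cobb-Douglas production function; the functional form and exponents are hypothetical, not taken from the passage:

\[
Q = f(K, L) = K^{0.4} L^{0.4}
\]

Short run, with capital fixed at \(K = 1\): output is \(Q = L^{0.4}\) and the marginal product of labour is \(MP_L = \frac{\partial Q}{\partial L} = 0.4\,L^{-0.6}\), which falls as \(L\) rises, so each additional worker adds less output and marginal cost rises (diminishing returns). Long run, with all inputs scaled by \(t > 1\): \(f(tK, tL) = t^{0.8} f(K, L) < t\,f(K, L)\), so doubling every input less than doubles output and long-run average cost rises (decreasing returns to scale).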
Both laws show that as more of a factor is added, whether the increase is in a variable factor alone or in all factors, productivity may rise initially but will eventually decline. The main difference between the two is the time scale involved and therefore which factors can be increased (only the variable ones in the short run; all of them in the long run). Both laws are useful for firms' decision making, especially for firms looking to profit-maximise and operate at the most cost-efficient level: firms use the principles of the laws to identify the point at which adding more of a factor starts to reduce productivity and raise costs.
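One standard textbook way to state that decision rule, assuming a price-taking firm (the notation below is the usual one and is not taken from the passage): the firm keeps adding a variable factor such as labour only while the revenue from the extra output covers its cost, i.e. up to the point where

\[
MRP_L = P \times MP_L = w,
\]

where \(P\) is the output price, \(MP_L\) the marginal product of labour and \(w\) the wage. Once diminishing returns push \(MP_L\) low enough that \(MRP_L < w\), employing another worker would raise costs by more than it raises revenue.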