Pruning corrects a decision tree model that overfits the data. Pruning proceeds from the leaves upward, combining two leaves into one new leaf when doing so does not increase the error rate.
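The bottom-up step can be sketched as follows. This is a minimal illustration, not the decisiontree function's internals; the `Node` class, the `prune` and `errors` helpers, and the toy binary split are all hypothetical.

```python
class Node:
    """A tree node; `prediction` is the majority class at this node.
    (Hypothetical structure for illustration only.)"""
    def __init__(self, prediction, left=None, right=None):
        self.prediction = prediction
        self.left = left
        self.right = right

    def is_leaf(self):
        return self.left is None and self.right is None

def errors(node, rows):
    """Count rows misclassified by this subtree.
    Each row is (attribute, label); the toy split sends attribute 0
    left and attribute 1 right."""
    if node.is_leaf():
        return sum(1 for x, y in rows if y != node.prediction)
    left_rows = [(x, y) for x, y in rows if x == 0]
    right_rows = [(x, y) for x, y in rows if x == 1]
    return errors(node.left, left_rows) + errors(node.right, right_rows)

def prune(node, rows):
    """Prune from the leaves upward: collapse a node's children into a
    single leaf whenever doing so does not increase the error count."""
    if node.is_leaf():
        return node
    node.left = prune(node.left, [(x, y) for x, y in rows if x == 0])
    node.right = prune(node.right, [(x, y) for x, y in rows if x == 1])
    leaf = Node(node.prediction)
    if errors(leaf, rows) <= errors(node, rows):
        return leaf  # collapsing did not hurt, so keep the simpler tree
    return node
```

Because the recursion prunes both children before considering the current node, entire overfit subtrees can collapse into a single leaf in one pass.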
A model that overfits the data predicts outcomes poorly. For example, suppose the attributes contain only random data, and the class value is 'heads' 75% of the time and 'tails' 25% of the time. The resulting decision tree model performs worse than a model that always predicts 'heads', which is correct 75% of the time.
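The arithmetic behind that claim can be checked directly. This worked example assumes the overfit model ends up reproducing the class distribution (predicting 'heads' 75% of the time and 'tails' 25% of the time, independently of the true class, since the attributes are random noise):

```python
p_heads = 0.75  # true class is 'heads' 75% of the time

# Overfit model: its predictions mimic the class distribution but are
# uncorrelated with the true class, because the attributes are random.
overfit_accuracy = p_heads * 0.75 + (1 - p_heads) * 0.25

# Baseline model: always predicts the majority class, 'heads'.
baseline_accuracy = p_heads

print(overfit_accuracy)   # 0.625
print(baseline_accuracy)  # 0.75
```

The overfit model is right only 62.5% of the time, so the always-'heads' baseline beats it.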
If pruning=gainratio (the default), the decisiontree function prunes the decision tree that it creates, using the gain ratio pruning technique. Alternatively, you can specify pruning=none to disable pruning.