A decision tree is a graphical representation of a multistage decision-making system. It comprises a set of decision variables arranged in the form of a tree, with each variable representing a decision and branching out to one or more further decision variables. Decision trees are widely used in the field of Operations Research for decision analysis.
A typical decision tree uses standard flowchart symbols. As shown in the figure below, a decision tree has nodes (circular, in this case) to represent decision variables. These nodes have arcs originating from them that lead to all possible decision outcomes.
There is a starting node at the base of the tree which represents the central input, that is, the central idea or decision; this is known as the root node. We have represented it with a blue circle. In Stage 1, there are two possible decision outcomes (coloured green and blue) emanating from the root node. In the second stage of decision-making, each of these two nodes has two decision nodes emanating from it, and so on.
With respect to the starting node, the two nodes at Stage 1 are its two possible outputs. These outputs are known as chance nodes; they serve as inputs for Stage 2, and the process continues until all possible inputs and outputs have been enumerated.
Notice that several paths can be taken from the central input to reach any of the possible outcomes in the final stage. These final outcomes are known as endpoints (in our case, there are eight endpoints in Stage 3), and each such path is known as a feasible path.
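The branching described above can be sketched in code. The function below is a minimal illustration (the node labels are hypothetical, chosen only for this example): starting from a root, it branches twice at each of three stages, producing the eight endpoints and eight feasible paths of the figure.

```python
# Minimal sketch of the tree described above: two branches per decision
# at each of three stages, giving 2**3 = 8 endpoints.

def feasible_paths(node, depth, branches=("A", "B")):
    """Enumerate every feasible path from this node down to an endpoint."""
    if depth == 0:
        return [[node]]  # reached an endpoint
    paths = []
    for branch in branches:
        # Each child node is labelled by appending the branch taken.
        for tail in feasible_paths(node + branch, depth - 1, branches):
            paths.append([node] + tail)
    return paths

paths = feasible_paths("root", 3)
print(len(paths))  # 8 feasible paths, one per endpoint in Stage 3
```

Each returned path lists the four nodes visited: the root, a Stage 1 node, a Stage 2 node, and an endpoint in Stage 3.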
The primary purpose of this exercise is to find the best possible path from the root node to an endpoint; this is known as the optimal path. To obtain the optimal solution, we must evaluate the outcomes at every stage and select the best one, typically working backwards from the endpoints.
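A small sketch of finding the optimal path: the tree is represented as nested dicts mapping each decision label to a subtree, with numeric payoffs at the endpoints. The labels and payoff values below are made up purely for illustration, and we assume a simple maximise-payoff criterion.

```python
# Backward recursion: the best value at a node is the best value among
# its subtrees, and the optimal path records which branch achieved it.

def optimal_path(tree):
    """Return (best total payoff, path of decision labels) for a nested-dict tree."""
    if isinstance(tree, (int, float)):
        return tree, []  # endpoint: its payoff, with an empty remaining path
    best_value, best_path = float("-inf"), None
    for label, subtree in tree.items():
        value, path = optimal_path(subtree)
        if value > best_value:
            best_value, best_path = value, [label] + path
    return best_value, best_path

# Hypothetical two-stage example: each decision has two possible outcomes.
tree = {
    "expand": {"high demand": 90, "low demand": 20},
    "hold":   {"high demand": 50, "low demand": 40},
}
print(optimal_path(tree))  # (90, ['expand', 'high demand'])
```

In a full decision analysis the chance-node values would be probability-weighted expectations rather than raw payoffs, but the backward recursion proceeds the same way.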
Decision trees are most effective when:
- There is uncertainty in predicting outcomes for a decision.
- There is a large number of factors influencing decisions or many possible outcomes for every decision.
- There is a possibility that new outcomes might be added later.
Advantages of decision trees
- Ease of Construction and Interpretation: Decision trees are simple to construct and equally easy to understand and interpret. Conveying the idea of multistage decisions to an audience without a graphical representation is a laborious, time-consuming task; a properly structured decision tree makes it far easier.
- Dynamic Nature: Decision trees are dynamic in the sense that they permit the addition of new decision nodes and paths as circumstances change.
- Versatility: Decision trees can be used in conjunction with a wide range of other decision-making techniques and tools, such as decision matrices, T-charts and Pareto analyses. Moreover, the structure of a decision tree allows non-linear variables in its decision nodes, which most other decision-making tools cannot accommodate.
- Step-by-step Evaluation: Decision trees allow a step-by-step evaluation of scenarios, thus making it possible to evaluate the optimum values of the decision variables at each stage.
Disadvantages of decision trees
- Variance: In decision trees, the values assigned to the decision variables are based on expectations, so actual outcomes may vary from the expected values.
- Unnecessary Complexity: Over-elaborating the decision tree results in the addition of unnecessary nodes. This has an adverse impact on the accuracy of the final outcome.
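The point about variance can be made concrete with a toy chance node. The probabilities and payoffs below are hypothetical: the node is assigned its expected value, yet no individual outcome actually equals that value, and the variance measures how far outcomes can stray from it.

```python
# A chance node with two hypothetical outcomes: (probability, payoff).
outcomes = [(0.6, 100), (0.4, -50)]

# The value assigned to the node in the tree is the expected value.
expected_value = sum(p * payoff for p, payoff in outcomes)

# The variance shows how widely actual outcomes spread around it.
variance = sum(p * (payoff - expected_value) ** 2 for p, payoff in outcomes)

print(expected_value)  # 40.0 -- the value used at the chance node
print(variance)        # 5400.0 -- actual outcomes are 100 or -50, never 40
```

This is why a path that looks optimal in expectation can still perform poorly in any single realisation.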