Society of Actuaries (SOA) PA Practice Exam 2025 - Free Actuarial Practice Questions and Study Guide

Question: 1 / 400

What is a common drawback of Decision Trees?

They require normalization of data

Sensitivity to noise and overfitting

Difficulty in understanding the model's decisions

Relying heavily on linear relationships

Correct answer: Sensitivity to noise and overfitting

A key drawback of Decision Trees is their sensitivity to noise and their tendency to overfit the data. Decision Trees build a model by repeatedly splitting the data into subsets based on feature values, which can produce complex trees that capture noise rather than the underlying patterns. Making too many splits, especially with small sample sizes, yields a model that does not generalize well to new, unseen data.

Overfitting occurs when the model describes random error or noise instead of the underlying relationship, leading to poor performance on validation or test datasets. To mitigate this, techniques such as pruning, setting a maximum depth for the tree, or using ensemble methods like Random Forests can help improve the model’s robustness and ability to generalize.
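As a rough illustration of these mitigation techniques, here is a minimal Python sketch using scikit-learn with an artificial dataset; the dataset, parameter values, and variable names are illustrative assumptions, not part of the exam material. It contrasts an unrestricted tree with a depth-limited, cost-complexity-pruned tree; on most runs the unrestricted tree fits the training data almost perfectly but scores worse on the held-out data.

```python
# Minimal sketch (assumed setup, not from the exam source): comparing an
# unrestricted Decision Tree with a depth-limited, pruned one.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Small, noisy dataset: flip_y injects label noise the full tree can memorize.
X, y = make_classification(n_samples=300, n_features=10, n_informative=4,
                           flip_y=0.15, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Unrestricted tree: grows until its leaves are pure, fitting the noise.
full_tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Regularized tree: a depth cap plus cost-complexity pruning (ccp_alpha)
# trades a little training accuracy for better generalization.
pruned_tree = DecisionTreeClassifier(max_depth=4, ccp_alpha=0.01,
                                     random_state=0).fit(X_train, y_train)

for name, model in [("full", full_tree), ("pruned", pruned_tree)]:
    print(name,
          "train:", round(model.score(X_train, y_train), 3),
          "test:", round(model.score(X_test, y_test), 3))
```

The max_depth cap and ccp_alpha penalty here are the code-level counterparts of the depth limits and pruning mentioned above; an ensemble method such as a Random Forest would instead average many trees grown on resampled data.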

The other options, while relevant to other modeling techniques, do not apply to Decision Trees in the same way. Normalization is primarily a concern for algorithms that rely on distance calculations or are otherwise sensitive to feature scaling. Understanding the decisions made by a Decision Tree is generally more straightforward than with complex models such as neural networks. Lastly, Decision Trees do not rely on linear relationships; one of their strengths is the ability to capture non-linear relationships.
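The following sketch, again using scikit-learn with a made-up dataset chosen purely for illustration, shows why two of the remaining options are weak answers: a tree handles a clearly non-linear boundary, and its accuracy is unchanged when the raw features are rescaled, so no normalization is required, while a linear classifier struggles on the same data.

```python
# Minimal sketch (hypothetical data): Decision Trees capture non-linear
# patterns and are unaffected by feature scaling, unlike a linear model.
from sklearn.datasets import make_circles
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# Concentric circles: the classes cannot be separated by a straight line.
X, y = make_circles(n_samples=500, noise=0.05, factor=0.4, random_state=0)

tree = DecisionTreeClassifier(max_depth=5, random_state=0)
linear = LogisticRegression()

print("tree on raw features:   ", round(tree.fit(X, y).score(X, y), 3))
print("tree on features x 1000:", round(tree.fit(X * 1000, y).score(X * 1000, y), 3))
print("logistic regression:    ", round(linear.fit(X, y).score(X, y), 3))
```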


