What technique can be adopted when a weak learner's hypothesis accuracy is only slightly better than 50%?


A. Over-fitting
B. Activation
C. Iteration
D. Boosting


Suggested Answer: D

Explanation:

Weak Learner: Colloquially, a model that performs slightly better than a naive model.

More formally, the notion has been generalized to multi-class classification, where it means something other than simply better than 50 percent accuracy.

For binary classification, it is well known that the exact requirement for weak learners is to be better than random guess. […] Notice that requiring base learners to be better than random guess is too weak for multi-class problems, yet requiring better than 50% accuracy is too stringent.

― Page 46, Ensemble Methods, 2012.

The concept comes from formal computational learning theory, which proposes a class of learning methods that possess weak learnability, meaning that they perform better than random guessing. Weak learnability is proposed as a simplification of the more desirable strong learnability, where a learner can achieve arbitrarily good classification accuracy.

A weaker model of learnability, called weak learnability, drops the requirement that the learner be able to achieve arbitrarily high accuracy; a weak learning algorithm needs only output an hypothesis that performs slightly better (by an inverse polynomial) than random guessing.

― The Strength of Weak Learnability, 1990.
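
As a concrete sketch of the idea (assuming scikit-learn is available; the synthetic dataset and every parameter below are illustrative choices, not taken from the quoted sources), a depth-1 decision tree, the classic "decision stump", typically scores only modestly better than chance on a noisy binary problem:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A noisy synthetic binary problem that no single split can solve well.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=15,
                           flip_y=0.2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# A decision stump: a depth-1 tree, the archetypal weak learner.
stump = DecisionTreeClassifier(max_depth=1, random_state=42)
stump.fit(X_train, y_train)
print(f"Stump accuracy: {stump.score(X_test, y_test):.2f}")  # only modestly above 0.5
```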

It is a useful concept because it is often used to describe the capabilities of the contributing members of ensemble learning algorithms. For example, the members of a bootstrap aggregation (bagging) ensemble are sometimes referred to as weak learners, as opposed to strong learners, at least in the colloquial sense of the term.

More specifically, weak learners are the basis for the boosting class of ensemble learning algorithms.

The term boosting refers to a family of algorithms that are able to convert weak learners to strong learners.
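
A minimal, self-contained sketch of that conversion, again assuming scikit-learn and the same illustrative synthetic data as above, uses AdaBoostClassifier to combine 200 re-weighted stumps into a markedly stronger classifier:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Same illustrative dataset as in the stump sketch above.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=15,
                           flip_y=0.2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# AdaBoost re-weights the training data each round so that each new stump
# focuses on the examples the previous stumps got wrong; the weighted vote
# of many weak stumps forms a strong learner. The base learner is passed
# positionally, which works across scikit-learn versions (the keyword was
# renamed from base_estimator to estimator in 1.2).
booster = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                             n_estimators=200, random_state=42)
booster.fit(X_train, y_train)
print(f"Boosted accuracy: {booster.score(X_test, y_test):.2f}")
```

Comparing the two printed scores shows the weak-to-strong conversion directly: the lone stump barely beats chance, while the boosted ensemble of the same stumps does substantially better.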

https://machinelearningmastery.com/strong-learners-vs-weak-learners-for-ensemble-learning/
 
Posted: 14/11/2022 11:30 pm
