{"id":3428,"date":"2024-09-01T13:57:53","date_gmt":"2024-09-01T13:57:53","guid":{"rendered":"https:\/\/workhouse.sweetdishy.com\/?p=3428"},"modified":"2024-09-01T13:57:54","modified_gmt":"2024-09-01T13:57:54","slug":"decision-tree-supervised-learning-regression","status":"publish","type":"post","link":"https:\/\/workhouse.sweetdishy.com\/index.php\/2024\/09\/01\/decision-tree-supervised-learning-regression\/","title":{"rendered":"Decision Tree\u00a0(Supervised Learning\/Regression)"},"content":{"rendered":"\n<p id=\"Par175\">No doubt, clustering may not work on some datasets. But the good news is that there are alternatives, such as a decision tree. This approach generally works better with nonnumerical data.<\/p>\n\n\n\n<p id=\"Par176\">A decision tree starts at the root node, which is at the top of the flow chart. From this point, the tree branches into decision paths, which are called splits. At each split, an algorithm makes a decision, and a probability is computed. At the end of the tree is the leaf (the outcome).<\/p>\n\n\n\n<p>A famous example\u2014in machine learning circles\u2014is to use a decision tree to model the tragic sinking of the Titanic. The model predicts the survival of a passenger based on three features: sex, age, and the number of siblings or spouses aboard (sibsp). 
Here\u2019s how it looks, in Figure\u00a03-5.<\/p>\n\n\n\n<figure class=\"wp-block-image\" id=\"Fig5\"><img decoding=\"async\" src=\"https:\/\/learning.oreilly.com\/api\/v2\/epubs\/urn:orm:book:9781484250280\/files\/images\/480660_1_En_3_Chapter\/480660_1_En_3_Fig5_HTML.jpg\" alt=\"..\/images\/480660_1_En_3_Chapter\/480660_1_En_3_Fig5_HTML.jpg\"\/><figcaption class=\"wp-element-caption\"><strong><em>Figure 3-5.<\/em><\/strong> This is a basic decision tree algorithm for predicting the survival of Titanic passengers<\/figcaption><\/figure>\n\n\n\n<p id=\"Par178\">There are clear advantages to decision trees. They are easy to understand, work well with large datasets, and make the model transparent.<\/p>\n\n\n\n<p id=\"Par179\">However, decision trees also have drawbacks. One is error propagation. If one of the splits turns out to be wrong, the error can cascade through the rest of the model!<\/p>\n\n\n\n<p id=\"Par180\">Next, as a decision tree grows, it becomes more complex because of the sheer number of splits and rules. This can ultimately lower the model\u2019s performance.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>No doubt, clustering may not work on some datasets. But the good news is that there are alternatives, such as a decision tree. This approach generally works better with nonnumerical data. The start of a decision tree is the root node, which is at the top of the flow chart. 
From this point, there will be a tree of decision paths, [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":3326,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[441],"tags":[],"class_list":["post-3428","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-3-machine-learning"],"jetpack_featured_media_url":"https:\/\/workhouse.sweetdishy.com\/wp-content\/uploads\/2024\/08\/images-41-1.jpeg","_links":{"self":[{"href":"https:\/\/workhouse.sweetdishy.com\/index.php\/wp-json\/wp\/v2\/posts\/3428","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/workhouse.sweetdishy.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/workhouse.sweetdishy.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/workhouse.sweetdishy.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/workhouse.sweetdishy.com\/index.php\/wp-json\/wp\/v2\/comments?post=3428"}],"version-history":[{"count":1,"href":"https:\/\/workhouse.sweetdishy.com\/index.php\/wp-json\/wp\/v2\/posts\/3428\/revisions"}],"predecessor-version":[{"id":3429,"href":"https:\/\/workhouse.sweetdishy.com\/index.php\/wp-json\/wp\/v2\/posts\/3428\/revisions\/3429"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/workhouse.sweetdishy.com\/index.php\/wp-json\/wp\/v2\/media\/3326"}],"wp:attachment":[{"href":"https:\/\/workhouse.sweetdishy.com\/index.php\/wp-json\/wp\/v2\/media?parent=3428"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/workhouse.sweetdishy.com\/index.php\/wp-json\/wp\/v2\/categories?post=3428"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/workhouse.sweetdishy.com\/index.php\/wp-json\/wp\/v2\/tags?post=3428"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
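The flow described in the post (root node, splits, leaf) can be sketched in plain Python as a hand-coded version of the famous Titanic tree. The split thresholds here (age above 9.5, more than 2.5 siblings/spouses aboard) follow the commonly cited version of this example; they are illustrative assumptions, not values fitted from data.

```python
# A minimal, hand-coded sketch of the classic Titanic decision tree.
# Thresholds (age 9.5, sibsp 2.5) are the commonly cited ones and are
# assumed for illustration, not learned from data here.

def predict_survival(sex: str, age: float, sibsp: int) -> str:
    """Walk the tree from the root node down to a leaf (the outcome)."""
    # Root node: the first split is on sex.
    if sex == "female":
        return "survived"   # leaf
    # Second split: age of male passengers.
    if age > 9.5:
        return "died"       # leaf
    # Third split: number of siblings/spouses aboard (sibsp).
    if sibsp > 2.5:
        return "died"       # leaf
    return "survived"       # leaf

print(predict_survival("female", 30, 0))  # survived
print(predict_survival("male", 40, 1))    # died
print(predict_survival("male", 5, 0))     # survived
```

Each `if` statement is one split from Figure 3-5, and each `return` is a leaf; a library such as scikit-learn would learn these splits and thresholds from training data instead of hard-coding them.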