|
|
{% extends "layout.html" %} |
|
|
|
|
|
{% block content %}<!DOCTYPE html> |
|
|
<html lang="en"> |
|
|
<head> |
|
|
<meta charset="UTF-8"> |
|
|
<title>🎬 Visual Random Forest Classifier (2D)</title>
|
|
<script src="https://cdn.plot.ly/plotly-2.32.0.min.js"></script> |
|
|
|
|
|
<script src="https://cdn.tailwindcss.com"></script> |
|
|
<script src="https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js"></script> |
|
|
<style> |
|
|
|
|
|
.info-icon { |
|
|
cursor: help; |
|
|
margin-left: 5px; |
|
|
color: #6B7280; |
|
|
position: relative; |
|
|
display: inline-block; |
|
|
} |
|
|
.tooltip { |
|
|
visibility: hidden; |
|
|
width: 250px; |
|
|
background-color: #333; |
|
|
color: #fff; |
|
|
text-align: center; |
|
|
border-radius: 6px; |
|
|
padding: 8px 10px; |
|
|
position: absolute; |
|
|
z-index: 10; |
|
|
bottom: 125%; |
|
|
left: 50%; |
|
|
margin-left: -125px; |
|
|
opacity: 0; |
|
|
transition: opacity 0.3s; |
|
|
font-size: 0.85rem; |
|
|
line-height: 1.4; |
|
|
} |
|
|
.info-icon:hover .tooltip { |
|
|
visibility: visible; |
|
|
opacity: 1; |
|
|
} |
|
|
|
|
|
.tooltip::after { |
|
|
content: ""; |
|
|
position: absolute; |
|
|
top: 100%; |
|
|
left: 50%; |
|
|
margin-left: -5px; |
|
|
border-width: 5px; |
|
|
border-style: solid; |
|
|
border-color: #333 transparent transparent transparent; |
|
|
} |
|
|
|
|
|
|
|
|
.highlight-blue { color: #2563EB; font-weight: 600; } |
|
|
.highlight-red { color: #DC2626; font-weight: 600; } |
|
|
.highlight-green { color: #16A34A; font-weight: 600; } |
|
|
.highlight-bold { font-weight: 600; } |
|
|
|
|
|
|
|
|
.flow-box { |
|
|
background-color: #F3F4F6; |
|
|
border-radius: 0.5rem; |
|
|
padding: 1.5rem; |
|
|
text-align: center; |
|
|
box-shadow: 0 4px 6px -1px rgba(0, 0, 0, 0.1), 0 2px 4px -1px rgba(0, 0, 0, 0.06); |
|
|
min-height: 120px; |
|
|
display: flex; |
|
|
flex-direction: column; |
|
|
justify-content: center; |
|
|
align-items: center; |
|
|
} |
|
|
.flow-arrow { |
|
|
font-size: 2.5rem; |
|
|
color: #9CA3AF; |
|
|
margin: 0 1rem; |
|
|
display: flex; |
|
|
align-items: center; |
|
|
justify-content: center; |
|
|
} |
|
|
</style> |
|
|
</head> |
|
|
|
|
|
<body class="bg-gray-100 text-gray-900"> |
|
|
<div class="max-w-5xl mx-auto mt-10 bg-white p-8 rounded-xl shadow-lg"> |
|
|
<h1 class="text-3xl font-bold mb-4 text-center">π¬ Visual Random Forest Classifier (2D)</h1> |
|
|
<p class="mb-6 text-center text-gray-600"> |
|
|
An ensemble learning method for classification that operates by constructing a multitude of decision trees. |
|
|
</p> |
|
|
|
|
|
<div class="grid grid-cols-1 md:grid-cols-2 gap-6 mb-6"> |
|
|
<div> |
|
|
<label for="testX" class="block font-medium mb-1 flex items-center"> |
|
|
Test Point X1: |
|
|
<span class="info-icon"> |
|
|
ⓘ |
|
|
<span class="tooltip"> |
|
|
The <span class="highlight-bold">X1-coordinate</span> of the new data point (<span class="highlight-green">green 'x'</span>) that you want the Random Forest to classify. |
|
|
</span> |
|
|
</span> |
|
|
</label> |
|
|
<input type="number" id="testX" value="4" class="w-24 px-2 py-1 border rounded" onchange="predict()"> |
|
|
</div> |
|
|
|
|
|
<div> |
|
|
<label for="testY" class="block font-medium mb-1 flex items-center"> |
|
|
Test Point X2: |
|
|
<span class="info-icon"> |
|
|
ⓘ |
|
|
<span class="tooltip"> |
|
|
The <span class="highlight-bold">X2-coordinate</span> of the new data point (<span class="highlight-green">green 'x'</span>) that you want the Random Forest to classify. |
|
|
</span> |
|
|
</span> |
|
|
</label> |
|
|
<input type="number" id="testY" value="2" class="w-24 px-2 py-1 border rounded" onchange="predict()"> |
|
|
</div> |
|
|
</div> |
|
|
|
|
|
<div class="mb-4 text-center"> |
|
|
<button onclick="predict()" class="bg-blue-500 hover:bg-blue-600 text-white px-6 py-2 rounded-lg text-lg transition duration-200"> |
|
|
Run Random Forest Prediction |
|
|
</button> |
|
|
</div> |
|
|
|
|
|
<div id="plot" class="border rounded-lg shadow-inner h-[400px] md:h-[500px] w-full"></div> |
|
|
<p id="result" class="mt-4 font-bold text-lg text-center text-gray-800"></p> |
|
|
|
|
|
<div class="mt-10 p-6 bg-purple-50 rounded-xl border border-purple-200"> |
|
|
<h2 class="text-2xl font-bold mb-6 text-center text-purple-700">How Random Forest Classifies Your Data</h2> |
|
|
<div class="flex flex-wrap justify-center items-center gap-4"> |
|
|
<div class="flow-box bg-purple-100"> |
|
|
<span class="text-5xl mb-2">π³</span> |
|
|
<p class="text-lg font-semibold text-purple-800">Many Decision Trees</p> |
|
|
<p class="text-sm text-purple-600">Each trained on a random subset</p> |
|
|
</div> |
|
|
<div class="flow-arrow">→</div> |
|
|
<div class="flow-box bg-purple-100"> |
|
|
<span class="text-5xl mb-2">π</span> |
|
|
<p class="text-lg font-semibold text-purple-800">New Data Point</p> |
|
|
<p class="text-sm text-purple-600">Sent to ALL trees</p> |
|
|
</div> |
|
|
<div class="flow-arrow">→</div> |
|
|
<div class="flow-box bg-purple-100"> |
|
|
<span class="text-5xl mb-2">π</span> |
|
|
<p class="text-lg font-semibold text-purple-800">Individual Predictions</p> |
|
|
<p class="text-sm text-purple-600">Each tree "votes" on the class</p> |
|
|
</div> |
|
|
<div class="flow-arrow block md:hidden">↓</div> |
|
|
<div class="flow-arrow hidden md:block">→</div> |
|
|
<div class="flow-box bg-purple-100"> |
|
|
<span class="text-5xl mb-2">π³οΈ</span> |
|
|
<p class="text-lg font-semibold text-purple-800">Majority Vote</p> |
|
|
<p class="text-sm text-purple-600">Most common prediction wins</p> |
|
|
</div> |
|
|
<div class="flow-arrow">→</div> |
|
|
<div class="flow-box bg-purple-100"> |
|
|
<span class="text-5xl mb-2">β
</span> |
|
|
<p class="text-lg font-semibold text-purple-800">Final Classification</p> |
|
|
<p class="text-sm text-purple-600">Robust and accurate</p> |
|
|
</div> |
|
|
</div> |
|
|
|
|
|
|
|
|
<p class="mt-6 text-center text-gray-600 text-sm"> |
|
|
Random Forest combines the power of many individual decision trees to make a more robust and accurate classification, leveraging collective intelligence. |
|
|
</p> |
|
|
</div> |
|
|
|
|
|
<div class="mt-8 text-center"> |
|
|
<a href="/liar" class="bg-purple-600 text-white px-4 py-2 rounded hover:bg-purple-700"> |
|
|
🎭 Go to Liar Predictor
|
|
</a>
        </div>
|
|
|
|
|
|
|
|
|
|
|
<div class="mt-8 p-6 bg-gray-50 rounded-lg border border-gray-200"> |
|
|
<h2 class="text-2xl font-bold mb-4 text-center text-blue-700">Understanding Random Forest</h2> |
|
|
|
|
|
<p class="mb-4 text-gray-700"> |
|
|
Random Forest is an <span class="highlight-bold">ensemble learning method</span> that builds a "forest" of decision trees. For classification tasks, it outputs the class that is the mode of the classes (majority vote) of the individual trees. It's known for its high accuracy and ability to handle complex datasets. |
|
|
</p> |
|
|
|
|
|
<h3 class="text-xl font-semibold mb-2">Key Concepts:</h3> |
|
|
<ul class="list-disc list-inside text-gray-700 mb-4"> |
|
|
<li class="mb-2"> |
|
|
<span class="highlight-bold">Ensemble Learning:</span> Instead of relying on a single model, Random Forest combines predictions from multiple models (decision trees) to improve overall accuracy and robustness. |
|
|
</li> |
|
|
<li class="mb-2"> |
|
|
<span class="highlight-bold">Decision Trees:</span> Each tree in the forest makes a prediction independently. A single decision tree creates axis-parallel splits, leading to rectangular decision regions. |
|
|
</li> |
|
|
<li class="mb-2"> |
|
|
<span class="highlight-bold">Randomness:</span> Random Forest introduces randomness in two ways: |
|
|
<ol class="list-decimal list-inside ml-4"> |
|
|
<li><span class="highlight-bold">Bagging (Bootstrap Aggregating):</span> Each tree is trained on a random subset of the training data (with replacement).</li> |
|
|
<li><span class="highlight-bold">Feature Randomness:</span> When splitting a node, each tree considers only a random subset of the available features. This decorrelates the trees.</li> |
|
|
</ol> |
|
|
</li> |
|
|
<li class="mb-2"> |
|
|
<span class="highlight-bold">Decision Boundary:</span> Unlike a single decision tree's sharp, rectangular boundaries, the Random Forest's decision boundary is the aggregated result of many trees. This often results in a smoother, more complex, and often non-linear boundary, as seen in the plot. |
|
|
</li> |
|
|
</ul> |
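<p class="mb-2 text-sm text-gray-600">
    A minimal JavaScript sketch of the two sources of randomness described above. The helper names
    (<code>bootstrapSample</code>, <code>randomFeatureSubset</code>) are illustrative only and are not part of this page's code:
</p>
<pre class="bg-gray-100 text-sm p-4 rounded overflow-x-auto mb-4"><code>// Bagging: draw as many points as the training set, with replacement.
function bootstrapSample(points) {
    return points.map(() => points[Math.floor(Math.random() * points.length)]);
}

// Feature randomness: at each split, look at only k of the features
// (assumes k is at most the number of features; with X1 and X2, k is typically 1).
function randomFeatureSubset(featureIndices, k) {
    const pool = [...featureIndices];
    const chosen = [];
    while (chosen.length !== k) {
        const j = Math.floor(Math.random() * pool.length);
        chosen.push(pool.splice(j, 1)[0]);
    }
    return chosen;
}</code></pre>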
|
|
|
|
|
<h3 class="text-xl font-semibold mb-2">How this Visualization Works:</h3> |
|
|
<ul class="list-disc list-inside text-gray-700 mb-4"> |
|
|
<li class="mb-2"> |
|
|
<span class="highlight-red">Class 1 (Red Circles):</span> These are your labeled data points belonging to Class 1. |
|
|
</li> |
|
|
<li class="mb-2"> |
|
|
<span class="highlight-blue">Class 0 (Blue Circles):</span> These are your labeled data points belonging to Class 0. |
|
|
</li> |
|
|
<li class="mb-2"> |
|
|
<span class="highlight-green">Test Point (Green 'x'):</span> This is the new, unlabeled data point you want to classify. You can adjust its X1 and X2 coordinates. |
|
|
</li> |
|
|
<li class="mb-2"> |
|
|
<span class="highlight-bold">Colored Background:</span> This represents the <span class="highlight-bold">decision boundary</span> of the trained Random Forest model. |
|
|
<ul> |
|
|
<li><span class="highlight-red">Red regions</span> indicate areas where the Random Forest predicts Class 1.</li> |
|
|
<li><span class="highlight-blue">Blue regions</span> indicate areas where the Random Forest predicts Class 0.</li> |
|
|
</ul> |
|
|
The smoothness and complexity of this boundary are a result of the ensemble nature of Random Forest. |
|
|
</li> |
|
|
</ul> |
|
|
|
|
|
<p class="mt-4 text-sm text-gray-600"> |
|
|
The plot shows the decision boundary by predicting the class for a grid of points covering the entire plot area. The color of each grid point reflects the predicted class, creating the background regions.
|
|
</p> |
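<p class="mt-2 text-sm text-gray-600">
    For illustration, the decision-boundary data consumed by the contour trace might look like the
    hypothetical, tiny grid below; the real grid returned by the backend is much finer:
</p>
<pre class="bg-gray-100 text-sm p-4 rounded overflow-x-auto"><code>// x/y coords define the grid; z[row][col] is the class predicted at (x[col], y[row]).
const decision_boundary_x_coords = [0, 3, 6, 9];
const decision_boundary_y_coords = [0, 3, 6, 9];
const decision_boundary_z = [
    [0, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 1, 1, 1]
];
// The contour colorscale in the script maps 0 to the blue region and 1 to the red region.</code></pre>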
|
|
</div> |
|
|
|
|
|
|
|
<script> |
|
|
|
|
|
|
|
|
let labeledPoints = [ |
|
|
|
|
|
[1.0, 1.0, 0], [1.5, 1.8, 0], [2.0, 1.2, 0], [1.2, 2.5, 0], [2.5, 2.0, 0], |
|
|
[3.0, 1.0, 0], [3.5, 2.2, 0], [2.8, 0.5, 0], [1.8, 0.8, 0], [0.5, 2.0, 0], |
|
|
[0.8, 3.0, 0], [2.2, 3.5, 0], [3.2, 3.0, 0], [4.0, 2.5, 0], [4.5, 1.8, 0], |
|
|
|
|
|
|
|
|
[5.0, 5.0, 1], [5.5, 4.2, 1], [6.0, 5.8, 1], [4.8, 6.0, 1], [6.2, 6.5, 1], |
|
|
[7.0, 5.0, 1], [6.5, 4.0, 1], [5.8, 6.8, 1], [7.2, 6.0, 1], [7.8, 5.5, 1], |
|
|
[8.0, 7.0, 1], [7.5, 7.5, 1], [6.8, 7.2, 1], [5.2, 7.8, 1], [4.0, 6.0, 1] |
|
|
]; |
|
|
|
|
|
function getTestPoint() { |
|
|
const testXInput = document.getElementById('testX'); |
|
|
const testYInput = document.getElementById('testY'); |
|
|
|
|
|
const testX = parseFloat(testXInput.value); |
|
|
const testY = parseFloat(testYInput.value); |
|
|
|
|
|
if (isNaN(testX) || isNaN(testY)) { |
|
|
document.getElementById("result").innerText = "β Please enter valid numbers for Test Point X1 and X2."; |
|
|
testXInput.value = 4; |
|
|
testYInput.value = 2; |
|
|
return null; |
|
|
} |
|
|
return [testX, testY]; |
|
|
} |
|
|
|
|
|
async function predict() { |
|
|
const testPoint = getTestPoint(); |
|
|
if (testPoint === null) { |
|
|
return; |
|
|
} |
|
|
|
|
|
document.getElementById("result").innerText = "Calculating decision boundary and prediction..."; |
|
|
|
|
|
console.log("Predicting with test point =", testPoint); |
|
|
|
|
|
|
|
|
const res = await fetch("/rf_visual_predict", { |
|
|
method: "POST", |
|
|
headers: { 'Content-Type': 'application/json' }, |
|
|
body: JSON.stringify({ |
|
|
points: labeledPoints, |
|
|
test_point: testPoint |
|
|
|
|
|
}) |
|
|
}); |
|
|
|
|
|
if (!res.ok) { |
|
|
document.getElementById("result").innerText = `β Error from backend: ${res.statusText}. Ensure your Flask backend is running.`; |
|
|
console.error("Backend error:", res.statusText); |
|
|
return; |
|
|
} |
|
|
|
|
|
const data = await res.json(); |
|
|
console.log("Response:", data); |
|
|
|
|
|
document.getElementById("result").innerText = `β
Predicted Class for (${testPoint[0]}, ${testPoint[1]}): ${data.prediction == 0 ? 'Class 0 (Blue)' : 'Class 1 (Red)'}`; |
|
|
|
|
|
|
|
|
const class0 = labeledPoints.filter(p => p[2] == 0); |
|
|
const class1 = labeledPoints.filter(p => p[2] == 1); |
|
|
|
|
|
const trace0 = { |
|
|
x: class0.map(p => p[0]), |
|
|
y: class0.map(p => p[1]), |
|
|
mode: 'markers', |
|
|
name: 'Class 0', |
|
|
marker: { color: 'blue', size: 10, symbol: 'circle' } |
|
|
}; |
|
|
|
|
|
const trace1 = { |
|
|
x: class1.map(p => p[0]), |
|
|
y: class1.map(p => p[1]), |
|
|
mode: 'markers', |
|
|
name: 'Class 1', |
|
|
marker: { color: 'red', size: 10, symbol: 'circle' } |
|
|
}; |
|
|
|
|
|
const testTrace = { |
|
|
x: [testPoint[0]], |
|
|
y: [testPoint[1]], |
|
|
mode: 'markers', |
|
|
name: 'Test Point', |
|
|
marker: { color: 'green', size: 14, symbol: 'x' } |
|
|
}; |
|
|
|
|
|
|
|
|
|
|
|
const boundaryTrace = { |
|
|
z: data.decision_boundary_z, |
|
|
x: data.decision_boundary_x_coords, |
|
|
y: data.decision_boundary_y_coords, |
|
|
type: 'contour', |
|
|
colorscale: [ |
|
|
['0.0', 'rgba(0, 0, 255, 0.2)'], |
|
|
['1.0', 'rgba(255, 0, 0, 0.2)'] |
|
|
], |
|
|
showscale: false, |
|
|
hoverinfo: 'skip', |
|
|
opacity: 0.8 |
|
|
}; |
|
|
|
|
|
|
|
|
Plotly.newPlot('plot', [boundaryTrace, trace0, trace1, testTrace], { |
|
|
title: `Random Forest Decision Boundary`, |
|
|
xaxis: { title: 'X1', range: [0, 9] }, |
|
|
yaxis: { title: 'X2', range: [0, 9] }, |
|
|
autosize: true, |
|
|
hovermode: 'closest', |
|
|
margin: { t: 40, b: 40, l: 40, r: 10 } |
|
|
}, { responsive: true }); |
|
|
} |
|
|
|
|
|
window.onload = () => predict(); |
|
|
</script> |
|
|
|
|
|
<div class="mt-12 bg-white rounded-lg shadow-lg p-6 border border-gray-300"> |
|
|
<h2 class="text-2xl font-bold text-center text-purple-700 mb-4">π³ Single Tree vs Random Forest</h2> |
|
|
<div id="treeComparisonPlot" class="w-full h-[800px] md:h-[500px]"></div> |
|
|
</div> |
|
|
|
|
|
<script> |
|
|
|
|
|
function getTreeLayout() { |
|
|
const isMobile = window.innerWidth < 768; |
|
|
|
|
|
const singleTreeTrace = { |
|
|
type: "scatter", |
|
|
mode: "markers+lines+text", |
|
|
x: [2, 1, 3, 0.5, 1.5, 2.5, 3.5], |
|
|
y: [3, 2, 2, 1, 1, 1, 1], |
|
|
text: ["Root", "", "", "class 1", "class 1", "class 2", "class 2"], |
|
|
textposition: "top center", |
|
|
marker: { size: isMobile ? 20 : 30, color: "royalblue" }, |
|
|
line: { color: 'royalblue', width: 2 }, |
|
|
name: "Single Decision Tree", |
|
|
showlegend: false, |
|
|
xaxis: 'x1', |
|
|
yaxis: 'y1' |
|
|
}; |
|
|
|
|
|
const forestTraces = [ |
|
|
{ |
|
|
type: "scatter", |
|
|
mode: "markers+lines+text", |
|
|
x: [7, 6.5, 7.5], |
|
|
y: [3, 2, 2], |
|
|
text: ["Tree 1", "class 1", "class 1"], |
|
|
textposition: "top center", |
|
|
marker: { size: isMobile ? 18 : 28, color: "red" }, |
|
|
line: { color: 'red', width: 2 }, |
|
|
showlegend: false, |
|
|
xaxis: 'x2', |
|
|
yaxis: 'y2' |
|
|
}, |
|
|
{ |
|
|
type: "scatter", |
|
|
mode: "markers+lines+text", |
|
|
x: [9, 8.5, 9.5], |
|
|
y: [3, 2, 2], |
|
|
text: ["Tree 2", "class 2", "class 2"], |
|
|
textposition: "top center", |
|
|
marker: { size: isMobile ? 18 : 28, color: "green" }, |
|
|
line: { color: 'green', width: 2 }, |
|
|
showlegend: false, |
|
|
xaxis: 'x2', |
|
|
yaxis: 'y2' |
|
|
}, |
|
|
{ |
|
|
type: "scatter", |
|
|
mode: "markers+lines+text", |
|
|
x: [11, 10.5, 11.5], |
|
|
y: [3, 2, 2], |
|
|
text: ["Tree 3", "class 3", "class 3"], |
|
|
textposition: "top center", |
|
|
marker: { size: isMobile ? 18 : 28, color: "orange" }, |
|
|
line: { color: 'orange', width: 2 }, |
|
|
showlegend: false, |
|
|
xaxis: 'x2', |
|
|
yaxis: 'y2' |
|
|
} |
|
|
]; |
|
|
|
|
|
let layout; |
|
|
|
|
|
if (isMobile) { |
|
|
|
|
|
layout = { |
|
|
grid: { rows: 2, columns: 1, pattern: 'independent' }, |
|
|
|
|
|
xaxis: { |
|
|
title: { text: "Single Decision Tree", font: { size: 14 } }, |
|
|
showgrid: false, zeroline: false |
|
|
}, |
|
|
yaxis: { domain: [0.55, 1], showgrid: false, zeroline: false }, |
|
|
|
|
|
|
|
|
xaxis2: { |
|
|
title: { text: "Random Forest (Multiple Trees)", font: { size: 14 } }, |
|
|
showgrid: false, zeroline: false |
|
|
}, |
|
|
yaxis2: { domain: [0, 0.45], showgrid: false, zeroline: false }, |
|
|
|
|
|
showlegend: false, |
|
|
margin: { t: 40, b: 40, l: 20, r: 20 }, |
|
|
autosize: true, |
|
|
title: { text: "Visualizing Decision Tree vs Random Forest", font: { size: 16 } } |
|
|
}; |
|
|
} else { |
|
|
|
|
|
layout = { |
|
|
grid: { rows: 1, columns: 2, pattern: 'independent' }, |
|
|
|
|
|
xaxis: { |
|
|
domain: [0, 0.45], |
|
|
title: { text: "Single Decision Tree", font: { size: 14 } }, |
|
|
showgrid: false, zeroline: false |
|
|
}, |
|
|
yaxis: { domain: [0, 1], showgrid: false, zeroline: false }, |
|
|
|
|
|
|
|
|
xaxis2: { |
|
|
domain: [0.55, 1], |
|
|
title: { text: "Random Forest (Multiple Trees)", font: { size: 14 } }, |
|
|
showgrid: false, zeroline: false |
|
|
}, |
|
|
yaxis2: { domain: [0, 1], showgrid: false, zeroline: false }, |
|
|
|
|
|
showlegend: false, |
|
|
margin: { t: 50, b: 40 }, |
|
|
autosize: true, |
|
|
title: { text: "Visualizing Decision Tree vs Random Forest", font: { size: 18 } } |
|
|
}; |
|
|
} |
|
|
|
|
|
return { traces: [singleTreeTrace, ...forestTraces], layout: layout }; |
|
|
} |
|
|
|
|
|
function drawTreePlot() { |
|
|
const data = getTreeLayout(); |
|
|
Plotly.react('treeComparisonPlot', data.traces, data.layout, {responsive: true}); |
|
|
} |
|
|
|
|
|
|
|
|
drawTreePlot(); |
|
|
|
|
|
|
|
|
window.addEventListener('resize', drawTreePlot); |
|
|
</script> |
|
|
|
|
|
<div class="mt-8 text-gray-800 space-y-4"> |
|
|
<h3 class="text-xl font-bold text-blue-700">π Working of Random Forest Algorithm</h3> |
|
|
<ul class="list-disc list-inside space-y-2"> |
|
|
<li><strong>Create Many Decision Trees:</strong> The algorithm makes many decision trees using different random parts of the data.</li> |
|
|
<li><strong>Pick Random Features:</strong> Each tree picks a random subset of features to make splits. This keeps trees diverse.</li> |
|
|
<li><strong>Each Tree Makes a Prediction:</strong> Every tree gives its own output.</li> |
|
|
<li><strong>Combine the Predictions</strong> (see the sketch after this list):
|
|
<ul class="ml-6 list-disc"> |
|
|
<li><em>Classification:</em> Uses majority voting across trees.</li> |
|
|
<li><em>Regression:</em> Averages the outputs of all trees.</li> |
|
|
</ul> |
|
|
</li> |
|
|
<li><strong>Why It Works:</strong> Randomness decorrelates the trees, which reduces overfitting and improves overall prediction accuracy.</li>
|
|
</ul> |
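<p class="text-sm text-gray-600">
    A minimal JavaScript sketch of the combining step referenced above (illustrative only; on this page the
    aggregation happens in the backend, and these helper names are not part of its code):
</p>
<pre class="bg-gray-100 text-sm p-4 rounded overflow-x-auto"><code>// Classification: majority vote over the per-tree predictions.
function majorityVote(treePredictions) {
    const counts = {};
    treePredictions.forEach(label => { counts[label] = (counts[label] || 0) + 1; });
    return Object.keys(counts).reduce((a, b) => counts[a] >= counts[b] ? a : b);
}

// Regression: average of the per-tree outputs.
function averagePrediction(treeOutputs) {
    return treeOutputs.reduce((sum, v) => sum + v, 0) / treeOutputs.length;
}

// Example: three trees vote 1, 0, 1 -> majorityVote([1, 0, 1]) returns "1" (Class 1).</code></pre>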
|
|
|
|
|
<h3 class="text-xl font-bold text-blue-700 mt-6">π Key Features of Random Forest</h3> |
|
|
<ul class="list-disc list-inside space-y-2"> |
|
|
<li><strong>Handles Missing Data:</strong> Many implementations can tolerate some missing values (for example via surrogate splits or imputation).</li>
|
|
<li><strong>Shows Feature Importance:</strong> Identifies most important features for prediction.</li> |
|
|
<li><strong>Handles Complex Data:</strong> Efficient with large datasets and many features.</li> |
|
|
<li><strong>Versatile:</strong> Works for both classification and regression tasks.</li> |
|
|
</ul> |
|
|
|
|
|
<h3 class="text-xl font-bold text-blue-700 mt-6">π Assumptions of Random Forest</h3> |
|
|
<ul class="list-disc list-inside space-y-2"> |
|
|
<li>Each tree is independent and makes its own prediction.</li> |
|
|
<li>Each tree is trained on random samples and features.</li> |
|
|
<li>A large enough dataset is required for diverse learning.</li> |
|
|
<li>Combining different trees improves accuracy.</li> |
|
|
</ul> |
|
|
</div> |
|
|
</div> |
|
|
|
|
|
</body> |
|
|
</html> |
|
|
{% endblock %} |