data_mining:regression

  
==== Cost function ====

$\displaystyle\min_{\theta_0,\theta_1} \sum_{i=1}^m (h_\theta(x^{(i)})-y^{(i)})^2$
  
Simplified problem:
  
$\displaystyle\min_{\theta_0,\theta_1} \frac{1}{2m} \sum_{i=1}^m (h_\theta(x^{(i)})-y^{(i)})^2$
  
$h_\theta(x^{(i)}) = \theta_0 + \theta_1 x^{(i)}$
  
Cost function (squared error cost function) $J$:
  
$J(\theta_0,\theta_1) = \frac{1}{2m} \sum_{i=1}^m (h_\theta(x^{(i)})-y^{(i)})^2$
  
Goal: $\displaystyle\min_{\theta_0,\theta_1} J(\theta_0,\theta_1)$
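The cost $J$ above is straightforward to compute directly. A minimal sketch in Python (function and variable names are illustrative, not from the original page):

```python
def cost(theta0, theta1, x, y):
    """Squared error cost J(theta_0, theta_1) for h(x) = theta_0 + theta_1 * x."""
    m = len(x)
    # Sum of squared residuals over all m training examples, scaled by 1/(2m).
    return sum((theta0 + theta1 * xi - yi) ** 2 for xi, yi in zip(x, y)) / (2 * m)

# A hypothesis that fits the data exactly has zero cost:
x = [0, 1, 2]
y = [1, 3, 5]          # generated by h(x) = 1 + 2x
print(cost(1, 2, x, y))  # 0.0
```

The $\frac{1}{2m}$ factor does not change the minimizing $\theta_0, \theta_1$; it only simplifies the gradient, since the 2 from differentiating the square cancels.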
  
=== Functions (example with only $\theta_1$): ===
  • data_mining/regression.txt
  • Last modified: 2019/02/10 17:14
  • by phreazer