Commit 8310c074 authored by WeichenXu, committed by Sean Owen

[SPARK-16600][MLLIB] fix some latex formula syntax error

## What changes were proposed in this pull request?

`\partial\x` ==> `\partial x`
`har{x_i}` ==> `hat{x_i}`
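
Both are breakages at the LaTeX level: `\partial\x` makes TeX read `\x` as an undefined control sequence, and `har` is not a macro (the `\hat` accent was intended). A minimal standalone example, not part of the patch, just to illustrate why the old forms fail to compile:

```latex
\documentclass{article}
\begin{document}
% Broken: \frac{\partial L}{\partial\w_i}  -- "\w" is an undefined control sequence
% Broken: \har{x_i}                        -- "\har" is not a macro; "\hat" was intended
% Fixed forms, as in this patch, which compile and render as intended:
\[ \frac{\partial L}{\partial w_i}, \qquad \hat{x_i} \]
\end{document}
```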

## How was this patch tested?

N/A

Author: WeichenXu <WeichenXu123@outlook.com>

Closes #14246 from WeichenXu123/fix_formular_err.
parent 6caa2205
@@ -794,16 +794,16 @@ class LinearRegressionSummary private[regression] (
 *
 * Now, the first derivative of the objective function in scaled space is
 * {{{
-* \frac{\partial L}{\partial\w_i} = diff/N (x_i - \bar{x_i}) / \hat{x_i}
+* \frac{\partial L}{\partial w_i} = diff/N (x_i - \bar{x_i}) / \hat{x_i}
 * }}}
 * However, ($x_i - \bar{x_i}$) will densify the computation, so it's not
 * an ideal formula when the training dataset is sparse format.
 *
-* This can be addressed by adding the dense \bar{x_i} / \har{x_i} terms
+* This can be addressed by adding the dense \bar{x_i} / \hat{x_i} terms
 * in the end by keeping the sum of diff. The first derivative of total
 * objective function from all the samples is
 * {{{
-* \frac{\partial L}{\partial\w_i} =
+* \frac{\partial L}{\partial w_i} =
 *     1/N \sum_j diff_j (x_{ij} - \bar{x_i}) / \hat{x_i}
 *   = 1/N ((\sum_j diff_j x_{ij} / \hat{x_i}) - diffSum \bar{x_i} / \hat{x_i})
 *   = 1/N ((\sum_j diff_j x_{ij} / \hat{x_i}) + correction_i)
@@ -822,7 +822,7 @@ class LinearRegressionSummary private[regression] (
 * the training dataset, which can be easily computed in distributed fashion, and is
 * sparse format friendly.
 * {{{
-* \frac{\partial L}{\partial\w_i} = 1/N ((\sum_j diff_j x_{ij} / \hat{x_i})
+* \frac{\partial L}{\partial w_i} = 1/N ((\sum_j diff_j x_{ij} / \hat{x_i})
 * }}},
 *
 * @param coefficients The coefficients corresponding to the features.
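
The doc comment above derives the sparse-friendly form of the gradient: accumulate `\sum_j diff_j x_{ij}` per feature over only the nonzero entries, keep a running `diffSum`, and fold in the dense term `correction_i = -diffSum \bar{x_i} / \hat{x_i}` once at the end. A minimal standalone Scala sketch of that bookkeeping (illustrative only; `SparseRow` and `gradient` are made-up names, not Spark's actual implementation):

```scala
// Sparse row: parallel arrays of nonzero feature indices and their values.
final case class SparseRow(indices: Array[Int], values: Array[Double])

// Gradient of the scaled-space objective, computed sparsely:
//   \frac{\partial L}{\partial w_i}
//     = 1/N ((\sum_j diff_j x_{ij} / \hat{x_i}) - diffSum \bar{x_i} / \hat{x_i})
def gradient(
    rows: Seq[SparseRow],
    diffs: Seq[Double],         // diff_j = prediction_j - label_j
    mean: Array[Double],        // \bar{x_i}
    std: Array[Double]): Array[Double] = {  // \hat{x_i}
  val grad = new Array[Double](mean.length)
  var diffSum = 0.0
  // Sparse pass: touch only nonzero entries, accumulating \sum_j diff_j x_{ij}.
  rows.zip(diffs).foreach { case (row, diff) =>
    diffSum += diff
    var k = 0
    while (k < row.indices.length) {
      grad(row.indices(k)) += diff * row.values(k)
      k += 1
    }
  }
  // Dense pass over features only: divide by \hat{x_i}, add the
  // correction term -diffSum * \bar{x_i} / \hat{x_i}, and scale by 1/N.
  val n = rows.size.toDouble
  var i = 0
  while (i < grad.length) {
    grad(i) = (grad(i) / std(i) - diffSum * mean(i) / std(i)) / n
    i += 1
  }
  grad
}
```

The per-feature correction is what keeps the computation sparse: the dense `(x_{ij} - \bar{x_i})` subtraction never happens per element, only one dense pass over the feature vector at the end, which is also why the sum of `diff` is easy to aggregate in a distributed setting.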