Linear Regression in Javascript


Question


I want to do least-squares fitting in JavaScript, in a web browser.

Currently, users enter data-point information using HTML text inputs; I then grab that data with jQuery and graph it with Flot.

After the user has entered their data points, I would like to present them with a "line of best fit". I imagine I would calculate the linear, polynomial, exponential, and logarithmic fits and then choose the one with the highest R^2 value.

I can't seem to find any libraries that will help me do this, though. I stumbled upon jStat, but it is completely missing documentation (as far as I can find), and after digging through the source code it doesn't seem to have any linear regression functionality built in (I'm basing this purely on function names, however).

Does anyone know of any JavaScript libraries that offer simple regression analysis?


The hope would be that I could use the library like so...

If I had a set of scatter points in an array var points = [[3,4],[15,45],...[23,78]], I could hand it to some function like lin_reg(points) and it would return something like [7.12, 3] if the linear equation were y = 7.12x + 3.
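
For illustration, a minimal sketch of the kind of function described above (the lin_reg name and the [slope, intercept] return shape are just the question's own hypothetical interface, not an existing library) might look like this:

function lin_reg(points) {
    var n = points.length;
    var sumX = 0, sumY = 0, sumXY = 0, sumXX = 0;
    for (var i = 0; i < n; i++) {
        sumX += points[i][0];
        sumY += points[i][1];
        sumXY += points[i][0] * points[i][1];
        sumXX += points[i][0] * points[i][0];
    }
    // ordinary least squares for y = slope * x + intercept
    var slope = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX);
    var intercept = (sumY - slope * sumX) / n;
    return [slope, intercept];
}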


Answer 1:


What kind of linear regression? For something simple like least squares, I'd just program it myself:

http://mathworld.wolfram.com/LeastSquaresFitting.html

The math there is not too hard to follow. Give it a shot for an hour or so and let me know if it's too hard; I could give it a try.
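
For reference, the standard closed-form least-squares formulas for a line y = a*x + b over n points (the same result derived on that page, restated here) are:

a = (n*sum(x*y) - sum(x)*sum(y)) / (n*sum(x^2) - sum(x)^2)
b = (sum(y) - a*sum(x)) / n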

EDIT:

Found someone who has already done it:

http://dracoblue.net/dev/linear-least-squares-in-javascript/159/




Answer 2:


The simplest solution to the question at hand can be found in the following post: http://trentrichardson.com/2010/04/06/compute-linear-regressions-in-javascript/

Note that, in addition to the linear equation, it also returns the R^2 value, which can be useful.

EDIT:

Here is the actual code snippet:

// Ordinary least-squares fit of y = slope * x + intercept; also returns R^2.
function linearRegression(y, x) {
        var lr = {};
        var n = y.length;
        var sum_x = 0;
        var sum_y = 0;
        var sum_xy = 0;
        var sum_xx = 0;
        var sum_yy = 0;

        for (var i = 0; i < y.length; i++) {

            sum_x += x[i];
            sum_y += y[i];
            sum_xy += (x[i]*y[i]);
            sum_xx += (x[i]*x[i]);
            sum_yy += (y[i]*y[i]);
        } 

        lr.slope = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x * sum_x);
        lr.intercept = (sum_y - lr.slope * sum_x) / n;
        lr.r2 = Math.pow((n * sum_xy - sum_x * sum_y) / Math.sqrt((n * sum_xx - sum_x * sum_x) * (n * sum_yy - sum_y * sum_y)), 2);

        return lr;
}

To use this, you just need to pass it two arrays, known_y and known_x. For example:

var known_y = [1, 2, 3, 4];
var known_x = [5.2, 5.7, 5.0, 4.2];

var lr = linearRegression(known_y, known_x);
// now you have:
// lr.slope
// lr.intercept
// lr.r2



Answer 3:


I found this great JavaScript library.

It's very simple, and seems to work perfectly.

I also can't recommend Math.JS enough.




Answer 4:


Check out https://web.archive.org/web/20150523035452/https://cgwb.nci.nih.gov/cgwbreg.html (a JavaScript regression calculator) - pure JavaScript, no CGI calls to a server. The data and processing remain on your computer. It provides complete R-style results, R code to check the work, and a visualization of the results.

See the source code for the embedded JavaScript implementations of OLS and statistics associated with the results.

The code is my effort to port the GSL library functions to JavaScript.

The code is released under the GPL because it is basically a line-for-line port of GPL-licensed GNU Scientific Library (GSL) code.

EDIT: Paul Lutus also provides some GPL code for regression at: http://arachnoid.com/polysolve/index.html




Answer 5:


Here is simple linear regression with measures of variation (total sum of squares = regression sum of squares + error sum of squares), the standard error of estimate SEE (residual standard error), and the coefficient of determination R2 and correlation coefficient R.

const regress = (x, y) => {
    const n = y.length;
    let sx = 0;
    let sy = 0;
    let sxy = 0;
    let sxx = 0;
    let syy = 0;
    for (let i = 0; i < n; i++) {
        sx += x[i];
        sy += y[i];
        sxy += x[i] * y[i];
        sxx += x[i] * x[i];
        syy += y[i] * y[i];
    }
    const mx = sx / n;
    const my = sy / n;
    const yy = n * syy - sy * sy;
    const xx = n * sxx - sx * sx;
    const xy = n * sxy - sx * sy;
    const slope = xy / xx;
    const intercept = my - slope * mx;
    const r = xy / Math.sqrt(xx * yy);
    const r2 = Math.pow(r,2);
    let sst = 0;
    for (let i = 0; i < n; i++) {
       sst += Math.pow((y[i] - my), 2);
    }
    const sse = sst - r2 * sst;
    const see = Math.sqrt(sse / (n - 2));
    const ssr = sst - sse;
    return {slope, intercept, r, r2, sse, ssr, sst, sy, sx, see};
}
regress([1, 2, 3, 4, 5], [1, 2, 3, 4, 3]);
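
For the sample call above this gives approximately slope = 0.6, intercept = 0.8, r ≈ 0.832, r2 ≈ 0.692, sst = 5.2, ssr = 3.6, sse = 1.6 and see ≈ 0.73.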



Answer 6:


Here is a snippet that will take an array of triplets (x, y, r), where r is the weight of the (x, y) data point, and return [a, b] such that Y = a*X + b approximates the data.

// return (a, b) that minimize
// sum_i r_i * (a*x_i+b - y_i)^2
function linear_regression( xyr )
{
    var i, 
        x, y, r,
        sumx=0, sumy=0, sumx2=0, sumy2=0, sumxy=0, sumr=0,
        a, b;

    for(i=0;i<xyr.length;i++)
    {   
        // this is our data pair
        x = xyr[i][0]; y = xyr[i][1]; 

        // this is the weight for that pair
        // set to 1 (and simplify the code accordingly, i.e. sumr becomes xyr.length) if weighting is not needed
        r = xyr[i][2];  

        // consider checking for NaN in the x, y and r variables here 
        // (add a continue statement in that case)

        sumr += r;
        sumx += r*x;
        sumx2 += r*(x*x);
        sumy += r*y;
        sumy2 += r*(y*y);
        sumxy += r*(x*y);
    }

    // note: the denominator is the variance of the random variable X
    // the only case when it is 0 is the degenerate case X==constant
    b = (sumy*sumx2 - sumx*sumxy)/(sumr*sumx2-sumx*sumx);
    a = (sumr*sumxy - sumx*sumy)/(sumr*sumx2-sumx*sumx);

    return [a, b];
}
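
A quick illustrative call (with every weight set to 1) on three points lying exactly on the line y = 2x + 1:

var fit = linear_regression([[0, 1, 1], [1, 3, 1], [2, 5, 1]]);
// fit is [2, 1], i.e. a = 2 (slope) and b = 1 (intercept)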



Answer 7:


Somewhat based on Nic Mabon's answer.

function linearRegression(x, y)
{
    var xs = 0;  // sum(x)
    var ys = 0;  // sum(y)
    var xxs = 0; // sum(x*x)
    var xys = 0; // sum(x*y)
    var yys = 0; // sum(y*y)

    var n = 0;
    for (; n < x.length && n < y.length; n++)
    {
        xs += x[n];
        ys += y[n];
        xxs += x[n] * x[n];
        xys += x[n] * y[n];
        yys += y[n] * y[n];
    }

    var div = n * xxs - xs * xs;
    var gain = (n * xys - xs * ys) / div;
    var offset = (ys * xxs - xs * xys) / div;
    var correlation = Math.abs((xys * n - xs * ys) / Math.sqrt((xxs * n - xs * xs) * (yys * n - ys * ys)));

    return { gain: gain, offset: offset, correlation: correlation };
}

Then y' = x * gain + offset.
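
For example, with perfectly correlated data the fit is exact:

var lr = linearRegression([1, 2, 3], [2, 4, 6]);
// lr.gain === 2, lr.offset === 0, lr.correlation === 1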



Source: https://stackoverflow.com/questions/6195335/linear-regression-in-javascript
