Upon Dirk's advice, I am posting single examples. I hope they are not too "cute" [clever, but I don't care] or trivial for this audience.
Linear models are the bread and butter of R. When the number of independent variables is large, one has two choices. The first is to use lm.fit(), which takes the design matrix x and the response y as arguments, much as in Matlab. The drawback of this approach is that the return value is a plain list of components (fitted coefficients, residuals, etc.), not an object of class "lm", which can be nicely summarized, used for prediction, stepwise selection, and so on. (A sketch of the lm.fit() route follows the transcript below.) The second approach is to construct a formula:
> A
          X1         X2          X3         X4         y
1 0.96852363 0.33827107 0.261332257 0.62817021 1.6425326
2 0.08012755 0.69159828 0.087994158 0.93780481 0.9801304
3 0.10167545 0.38119304 0.865209832 0.16501662 0.4830873
4 0.06699458 0.41756415 0.258071616 0.34027775 0.7508766
...
> (f=paste("y ~",paste(names(A)[1:4],collapse=" + ")))
[1] "y ~ X1 + X2 + X3 + X4"
> lm(formula(f),data=A)
Call:
lm(formula = formula(f), data = A)
Coefficients:
(Intercept)        X1        X2        X3        X4
    0.78236   0.95406  -0.06738  -0.43686  -0.06644
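For completeness, here is a minimal sketch of the lm.fit() route mentioned above. The data are simulated purely for illustration (the column names and coefficients are made up); the key point is that lm.fit() wants the bare design matrix, so the intercept column has to be added by hand:

set.seed(1)
A <- data.frame(X1 = runif(20), X2 = runif(20),
                X3 = runif(20), X4 = runif(20))
A$y <- 1 + A$X1 - 0.5 * A$X3 + rnorm(20, sd = 0.2)

## lm.fit() takes the design matrix itself, so supply the intercept column
x   <- cbind("(Intercept)" = 1, as.matrix(A[, 1:4]))
fit <- lm.fit(x, A$y)

fit$coefficients   # just a list component; no summary(), predict(), step(), ...

With the formula-based approach, by contrast, the fitted object is of class "lm" and can be saved and passed straight to summary(), predict(), step(), and friends.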