Below are digital resources that complement the book Practical Machine Learning with R: Tutorials and Case Studies.
A video from the channel StatQuest with Josh Starmer explains the fundamentals of linear regression.
A blog post by Carsten Lange describes how to program an R function, using FctMSE() from the book Practical Machine Learning with R as an example. FctMSE() calculates the mean squared error for a dataset based on the provided slope and intercept parameters of a univariate prediction function. The blog post also provides the function's source code.
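A function like the one described could be sketched as follows; this is a hedged illustration, not the blog post's actual source code, and the argument names (intercept, slope, x, y) are assumptions.

```r
# Hypothetical sketch of an MSE function in the spirit of FctMSE():
# given slope and intercept of a univariate prediction line, compute
# the mean squared error on a dataset (x, y).
FctMSE <- function(intercept, slope, x, y) {
  predictions <- intercept + slope * x
  mean((y - predictions)^2)
}

# Usage: MSE of the line y = 1 + 2x on a tiny illustrative dataset
x <- c(1, 2, 3)
y <- c(3, 5, 8)
FctMSE(intercept = 1, slope = 2, x = x, y = y)
```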
A blog post by Carsten Lange describes how the optimal parameters for a univariate linear regression can be derived using the Ordinary Least Squares (OLS) method.
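The standard closed-form OLS solution for the univariate case can be sketched in a few lines of R; the data below are illustrative, and the result is cross-checked against R's built-in lm().

```r
# Univariate OLS closed-form solution (standard textbook formulas):
#   slope     = sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
#   intercept = mean(y) - slope * mean(x)
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 7.8, 10.1)

slope     <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
intercept <- mean(y) - slope * mean(x)

# Cross-check against R's built-in OLS fit
coef(lm(y ~ x))
```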
This blog post by Carsten Lange explains why OLS linear regression parameters do not need to be normalized. The article includes an example with R code showing how OLS coefficients adjust when the units (dimensions) of the data change.
This blog post by Carsten Lange shows that processing data, creating an OLS model design, and running the model require only three steps in tidymodels. The blog post also shows that adjusting the code to use a different machine learning model requires only minimal modifications.
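The three steps could look roughly like the following minimal sketch; the mtcars example and the specific recipe are assumptions, not the blog post's code.

```r
# Minimal tidymodels sketch: data processing, model design, model fit.
library(tidymodels)

rec  <- recipe(mpg ~ wt + hp, data = mtcars)           # 1. process the data
spec <- linear_reg() %>% set_engine("lm")              # 2. design the OLS model
fit  <- workflow() %>%
  add_recipe(rec) %>%
  add_model(spec) %>%
  fit(data = mtcars)                                   # 3. run the model

# Swapping in a different machine learning model changes only the spec, e.g.:
# spec <- rand_forest(mode = "regression") %>% set_engine("ranger")
```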
In this article from the blog Heartbeat, the author, Prince Grover, explains various loss functions for classification and regression.
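Two of the most common losses for regression and binary classification can be written directly in R; the formulas are standard, and the example data below are illustrative (not taken from the article).

```r
# Mean squared error: a standard loss for regression
mse_loss <- function(y, y_hat) mean((y - y_hat)^2)

# Log loss (binary cross-entropy): a standard loss for classification,
# where y is 0/1 and p is the predicted probability of class 1
log_loss <- function(y, p) -mean(y * log(p) + (1 - y) * log(1 - p))

mse_loss(c(1, 2, 3), c(1.1, 1.9, 3.2))
log_loss(c(1, 0, 1), c(0.9, 0.2, 0.8))
```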
This article by Jim Frost from the website Statistics By Jim discusses the role of the intercept in linear regression. The author explains why the intercept often cannot be interpreted as the predicted outcome when the predictor variable(s) equal zero.
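A small example makes the point concrete; the height/weight data below are an illustrative assumption, not taken from the article.

```r
# When x = 0 lies far outside the observed data, the intercept is an
# extrapolation artifact with no meaningful interpretation.
height <- c(150, 160, 170, 180, 190)   # cm; no observation near 0
weight <- c(55, 62, 70, 78, 88)        # kg

fit <- lm(weight ~ height)
coef(fit)["(Intercept)"]  # negative: no person with height 0 weighs < 0 kg
```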