Generalised Linear Mixed Models (GLMM), Nonlinear Models (NLGLM) and Generalized Additive Mixed Models (GAMM) (GNAM01) – FREE ACCOMMODATION AVAILABLE
25 May 2020 - 29 May 2020, £275.00 - £560.00
This course provides a general introduction to nonlinear regression analysis, covering major topics including, but not limited to, general and generalized linear models, generalized additive models, spline and radial basis function regression, and Gaussian process regression. We approach the general topic of nonlinear regression by showing how the powerful and flexible statistical modelling framework of general and generalized linear models, and their multilevel counterparts, can be extended to handle nonlinear relationships between predictor and outcome variables. We begin by providing a comprehensive practical and theoretical overview of regression, including multilevel regression, using general and generalized linear models. Here, we pay particular attention to the many variants of general and generalized linear models, and how these provide a very widely applicable set of tools for statistical modeling. After this introduction, we then proceed to cover practically and conceptually simple extensions to the general and generalized linear models framework using parametric nonlinear models and
polynomial regression. We will then cover more powerful and flexible extensions of this modeling framework by way of the general concept of basis functions. We'll begin our coverage of basis function regression with the major topic of spline regression, and then proceed to cover radial basis functions and the multilayer perceptron, both of which are types of artificial neural networks. We then move on to the major topic of generalized additive models (GAMs) and generalized additive mixed models (GAMMs), which can be viewed as a generalization of all the basis function regression topics, but cover a wider range of topics including nonlinear spatial and temporal models and interaction models. Finally, we will cover the powerful Bayesian nonlinear regression method of Gaussian process regression.
This course is aimed at anyone who is interested in learning and applying nonlinear regression methods. These methods have major applications throughout economics and the other social sciences, the life sciences, the physical sciences, and machine learning.
Venue – PS statistics head office, 53 Morrison Street, Glasgow, G5 8LB – Google map
Availability – 20 places
Duration – 5 days
Contact hours – Approx. 28 hours
ECTS – Equal to 3 ECTS credits
Language – English
We offer COURSE ONLY and ACCOMMODATION PACKAGES:
• COURSE ONLY – Includes lunch, refreshments, and a welcome meal on Monday evening.
• ACCOMMODATION PACKAGE (to be purchased in addition to the course only option) – Includes breakfast, lunch, refreshments and welcome dinner Monday evening. Self-catering facilities are available in the accommodation. Accommodation is multiple occupancy (max 3-4 people) single sex en-suite rooms. Arrival Sunday 24th May (between 17:00-21:00) and departure Friday 29th May (accommodation must be vacated by 09:15).
To book ‘COURSE ONLY’ with the option to add the additional ‘ACCOMMODATION PACKAGE’ please scroll to the bottom of this page.
Other payment options are available; please email email@example.com
PLEASE READ – CANCELLATION POLICY: Cancellations are accepted up to 28 days before the course start date, subject to a 25% cancellation fee. Later cancellations may be considered; contact firstname.lastname@example.org. Failure to attend will result in the full cost of the course being charged. In the unfortunate event that a course is cancelled due to unforeseen circumstances, a full refund of the course fees (and accommodation fees, if booked through PS statistics) will be credited. However, PS statistics will not be held responsible/liable for any travel fees, accommodation costs or other expenses incurred by you as a result of the cancellation. For this reason, PS statistics strongly recommends that any travel and accommodation booked by you or your institute is refundable/flexible, and that you delay booking your travel and accommodation until as close to the course start date as is economically viable.
Dr. Mark Andrews
This course will be hands-on and workshop based. Throughout each day, there will be some lecture style presentation, i.e., using slides, introducing and explaining key concepts. However, even in these cases, the topics being covered will include practical worked examples that we will work through together.
Assumed quantitative knowledge
We assume familiarity with linear regression analysis, and with the major concepts of classical inferential statistics (p-values, hypothesis testing, confidence intervals, model comparison, etc.). Some familiarity with common generalized linear models such as logistic or Poisson regression will also be assumed.
Assumed computer background
R experience is desirable but not essential. Although we will be using R extensively, all the code that we use will be made available, and so attendees will just need to make minor modifications to this code. Attendees should install R and RStudio on their own computers before the workshops, and have some minimal familiarity with the R environment.
Equipment and software requirements
A laptop computer with a working version of R or RStudio is required. R and RStudio are both available as free and open source software for PCs, Macs, and Linux computers. R may be downloaded by following the links here https://www.r-project.org/. RStudio may be downloaded by following the links here: https://www.rstudio.com/. All the R packages that we will use in this course can be downloaded and installed during the workshop itself as and when they are needed, and a full list of required packages will be made available to all attendees prior to the course. In some cases, additional open-source software will need to be installed to use certain R packages. These include Stan for probabilistic modeling; Keras for neural network modeling; and Prophet for forecasting. Directions on how to install this software will also be provided before and during the course.
IF YOU ARE UNSURE ABOUT SUITABILITY, PLEASE ASK email@example.com
Sunday 24th – Meet at 43 Cook Street, Glasgow G5 8JN between 17:00 – 21:00
Monday 25th – Classes from 09:30 to 17:30
Module 1: General and generalized linear models, including multilevel models. In order to provide a solid foundation for the remainder of the course, we begin by providing a comprehensive practical and theoretical overview of the principles of general and generalized linear models, also covering their multilevel (or hierarchical) counterparts. General and generalized linear models provide a powerful set of tools for statistical modeling, which are extremely widely used and widely applicable. Their underlying theoretical principles are quite simple and elegant, and once understood, it becomes clear how these models can be extended in many different ways to handle different statistical modeling situations.
For this module, we will use commonly used R tools such as lm, glm, lme4::lmer, lme4::glmer. In addition, we will also use the R based brms package, which uses the Stan probabilistic programming language. This package allows us to perform all the same analyses that are provided by lm, glm, lmer, glmer, etc., using an almost identical syntax, but also allows us to perform a much wider range of general and generalized linear model analyses.
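As a minimal sketch of this workflow, the following uses base R's lm and glm on the built-in mtcars data (chosen here purely for illustration; the course uses its own datasets), with the multilevel and brms equivalents indicated in comments:

```r
# General linear model: miles-per-gallon as a linear function of weight
m_lm <- lm(mpg ~ wt, data = mtcars)
summary(m_lm)

# Generalized linear model: logistic regression for transmission type (0/1)
m_glm <- glm(am ~ hp + wt, data = mtcars, family = binomial)
summary(m_glm)

# The multilevel versions follow the same formula pattern, e.g. with lme4:
#   lme4::lmer(y ~ x + (1 | group), data = d)
#   lme4::glmer(y ~ x + (1 | group), data = d, family = binomial)
# and brms uses a near-identical syntax:
#   brms::brm(y ~ x + (1 | group), data = d, family = bernoulli())
```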
Tuesday 26th – Classes from 09:30 to 17:30
Having established a solid regression modeling foundation, on the second day we cover a range of nonlinear modeling extensions to the general and generalized linear modeling framework.
Module 2: Polynomial regression. Polynomial regression is both a conceptually and practically simple extension of linear modeling. It can be easily accomplished using the poly function along with tools like lm, glm, lme4::lmer, lme4::glmer. Here, we will also cover piecewise linear and polynomial regression, using R packages such as segmented.
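A short base-R illustration of the poly approach, again using mtcars purely as example data:

```r
# Quadratic polynomial regression using the (orthogonal) poly() basis
m_quad <- lm(mpg ~ poly(hp, 2), data = mtcars)
summary(m_quad)

# Compare against the purely linear model with an F-test
m_lin <- lm(mpg ~ hp, data = mtcars)
anova(m_lin, m_quad)

# A piecewise-linear alternative via the segmented package (not base R):
#   seg <- segmented::segmented(m_lin, seg.Z = ~hp)
```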
Module 3: Parametric nonlinear regression. In some cases of nonlinear regression, a bespoke parametric function for the relationship between the predictors and outcome variable is used. These are often obtained from scientific knowledge of the problem at hand. In R, we can use the nls function (from the stats package) to perform parametric nonlinear regression.
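For example, the built-in Puromycin data follow a Michaelis-Menten curve, rate = Vm * conc / (K + conc), and nls can estimate Vm and K given rough starting values:

```r
# Parametric nonlinear regression with nls() from the stats package.
# Starting values are rough guesses; nls refines them by least squares.
m_nls <- nls(rate ~ Vm * conc / (K + conc),
             data  = subset(Puromycin, state == "treated"),
             start = list(Vm = 200, K = 0.05))
summary(m_nls)
coef(m_nls)  # estimated Vm and K
```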
Module 4: Spline regression: Nonlinear regression using splines is a powerful and flexible non-parametric or semi-parametric nonlinear regression method. It is also an example of a basis function regression method. Here, we will cover spline regression using the splines::bs and splines::ns functions that can be used with lm, glm, lme4::lmer, lme4::glmer, brms, etc.
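Because the splines package ships with R, a basic spline regression can be sketched in a few lines (mtcars again used purely for illustration):

```r
library(splines)

# B-spline and natural cubic spline bases used inside ordinary lm()
m_bs <- lm(mpg ~ bs(hp, df = 5), data = mtcars)
m_ns <- lm(mpg ~ ns(hp, df = 4), data = mtcars)

# The same basis terms drop straight into glm(), lme4::lmer(), brms::brm(), etc.
plot(mpg ~ hp, data = mtcars)
lines(sort(mtcars$hp), fitted(m_ns)[order(mtcars$hp)], col = "blue")
```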
Module 5: Radial basis functions. Regression using radial basis functions is a set of methods that are closely related to spline regression. They have a long history of usage in machine learning and can also be viewed as a type of artificial neural network model. Here, we will explore radial basis function models using the Stan programming language, which will allow us to build powerful and flexible versions of the radial basis functions.
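The course builds these models in Stan; as a simplified base-R illustration of the underlying idea, we can construct a Gaussian radial basis design matrix by hand (centres and bandwidth hand-picked here) and fit the weights by ordinary least squares:

```r
set.seed(101)
x <- seq(0, 10, length.out = 100)
y <- sin(x) + rnorm(100, sd = 0.2)        # simulated nonlinear data

centres <- seq(0, 10, length.out = 10)    # basis function centres (assumption)
width   <- 1                              # common bandwidth (assumption)

# One column per basis function: phi_k(x) = exp(-(x - c_k)^2 / (2 * width^2))
Phi <- sapply(centres, function(ck) exp(-(x - ck)^2 / (2 * width^2)))

m_rbf <- lm(y ~ Phi)                      # the model is linear in the basis
plot(x, y); lines(x, fitted(m_rbf), col = "red")
```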
Module 6: Multilayer perceptron. Closely related to radial basis functions are multilayer perceptrons. These, and their variants and extensions, are major building blocks of deep learning (machine learning) methods. We will explore multilayer perceptrons in Stan, but we will also use the powerful Keras library.
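As a hedged sketch of what the R keras interface looks like (this assumes keras and a TensorFlow backend are installed, so treat it as illustrative rather than runnable as-is):

```r
library(keras)

# A small multilayer perceptron for a single continuous input and output
model <- keras_model_sequential() %>%
  layer_dense(units = 16, activation = "relu", input_shape = c(1)) %>%
  layer_dense(units = 16, activation = "relu") %>%
  layer_dense(units = 1)

model %>% compile(optimizer = "adam", loss = "mse")
# model %>% fit(x_train, y_train, epochs = 100, verbose = 0)
```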
Wednesday 27th – Classes from 09:30 to 17:30
Module 7: Generalized additive models. We now turn to the major topic of generalized additive models (GAMs). GAMs generalize many of the concepts and modules covered so far, and represent a powerful and flexible framework for nonlinear modeling. In R, the mgcv package provides an extensive set of tools for working with GAMs. Here, we will provide in-depth coverage of mgcv, including choosing smooth terms, controlling overfitting and complexity, prediction, model evaluation, and so on.
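A minimal mgcv example (mgcv ships with R; mtcars used purely for illustration):

```r
library(mgcv)

# A GAM with a smooth term for each continuous predictor
m_gam <- gam(mpg ~ s(hp) + s(wt), data = mtcars)
summary(m_gam)       # effective degrees of freedom and significance per smooth
plot(m_gam, pages = 1)  # partial effect of each smooth term

# Basis dimension is controllable per term, e.g. s(hp, k = 5);
# gam.check(m_gam) helps diagnose whether k is large enough.
```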
Module 9: Generalized additive mixed models. GAMs can also be used in linear mixed effects models, where they are known as generalized additive mixed models (GAMMs). GAMMs can also be fitted with the mgcv package.
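In mgcv, random effects can be written either as smooths with the "re" basis or via the dedicated gamm() interface. A sketch with simulated data (purely for illustration):

```r
library(mgcv)
set.seed(1)
d <- data.frame(x = runif(200), g = factor(rep(1:10, each = 20)))
d$y <- sin(2 * pi * d$x) + rnorm(10)[d$g] + rnorm(200, sd = 0.3)

# Random intercept expressed as a smooth with the "re" basis ...
m1 <- gam(y ~ s(x) + s(g, bs = "re"), data = d)

# ... or via gamm(), which uses nlme underneath
m2 <- gamm(y ~ s(x), random = list(g = ~1), data = d)
summary(m1)
```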
Thursday 28th – Classes from 09:30 to 17:30
Module 10: Interaction nonlinear regression: A powerful feature of GAMs and GAMMs is the ability to model nonlinear interactions, whether between two continuous variables, or between one continuous and one categorical variable. Amongst other things, interactions between continuous variables allow us to do spatial and spatio-temporal modeling. Interactions between categorical and continuous variables allow us to model how nonlinear relationships between a predictor and outcome change as a function of the value of different categorical variables.
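Both kinds of interaction can be sketched in mgcv: a tensor-product smooth te() for two continuous predictors, and a "by" smooth that varies with a categorical variable (simulated data, purely for illustration):

```r
library(mgcv)
set.seed(2)
d <- data.frame(x1 = runif(300), x2 = runif(300),
                grp = factor(sample(c("a", "b"), 300, replace = TRUE)))
d$y <- sin(3 * d$x1) * cos(3 * d$x2) +
       ifelse(d$grp == "a", d$x1^2, -d$x1^2) + rnorm(300, sd = 0.2)

m_te <- gam(y ~ te(x1, x2), data = d)             # continuous-by-continuous
m_by <- gam(y ~ grp + s(x1, by = grp), data = d)  # continuous-by-categorical
summary(m_te)
```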
Module 11: Nonlinear regression for time-series and forecasting. One major application of nonlinear regression is modeling time-series and forecasting. Here, we will explore the prophet library for time-series forecasting. This library, available for both Python and R, gives us a GAM-like framework for modeling time-series and making forecasts.
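A hedged sketch of prophet's R interface (prophet requires installation, and the data here is a placeholder series, not a real dataset): prophet expects a data frame with columns ds (dates) and y (values).

```r
library(prophet)

df <- data.frame(ds = seq(as.Date("2018-01-01"), by = "day", length.out = 365),
                 y  = rnorm(365))           # placeholder series
m  <- prophet(df)                           # fit trend + seasonal components
future   <- make_future_dataframe(m, periods = 30)
forecast <- predict(m, future)
plot(m, forecast)
```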
Friday 29th – Classes from 09:30 to 16:00
Module 12: Gaussian process regression. Our final module deals with a type of Bayesian nonlinear regression known as Gaussian process regression. Gaussian process regression can be viewed as a kind of basis function regression, but with an infinite number of basis functions. In that sense, it generalizes spline regression, radial basis function regression, multilayer perceptrons, and generalized additive models, and provides a means to overcome some practically challenging problems in nonlinear regression, such as selecting the number and type of smooth functions. Here, we will explore Gaussian process regression using Stan.
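The course implements this in Stan; as a compact base-R illustration of the core computation, here is the Gaussian process posterior mean with a squared exponential kernel on simulated data (hyperparameters hand-picked, not estimated):

```r
set.seed(3)
x  <- runif(40, 0, 10)
y  <- sin(x) + rnorm(40, sd = 0.2)
xs <- seq(0, 10, length.out = 200)          # prediction grid

# Squared exponential (RBF) kernel with length-scale ell and scale sf
sq_exp <- function(a, b, ell = 1, sf = 1)
  sf^2 * exp(-outer(a, b, "-")^2 / (2 * ell^2))

K    <- sq_exp(x, x) + diag(0.2^2, length(x))  # add observation noise variance
Ks   <- sq_exp(xs, x)
mu_s <- Ks %*% solve(K, y)                  # GP posterior mean at xs

plot(x, y); lines(xs, mu_s, col = "red")
```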