BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//PS Statistics - ECPv4.6.2//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:PS Statistics
X-ORIGINAL-URL:https://www.psstatistics.com
X-WR-CALDESC:Events for PS Statistics
BEGIN:VEVENT
DTSTART;VALUE=DATE:20200525
DTEND;VALUE=DATE:20200530
DTSTAMP:20200408T190620Z
CREATED:20190424T185728Z
LAST-MODIFIED:20200327T044908Z
UID:3449-1590364800-1590796799@www.psstatistics.com
SUMMARY:ONLINE COURSE - Generalised Linear (Mixed) Models (GLMM)\, Nonlinear Models (NLGLM) and Generalised Additive (Mixed) Models (GAMM) (GNAM02) - delivered live by video link
DESCRIPTION:\nThis course will now be delivered live by video link in light of travel restrictions due to the COVID-19 (Coronavirus) outbreak.\nThis is a ‘LIVE COURSE’ – the instructor will be delivering lectures and coaching attendees through the accompanying computer practicals via video link\, so a good internet connection is essential. \nPlease feel free to email oliverhooker@psstatistics.com with any questions. Full course details are below. \nCourse Overview:\nThis course provides a general introduction to nonlinear regression analysis\, covering major topics including\, but not limited to\, general and generalized linear models\, generalized additive models\, spline and radial basis function regression\, and Gaussian process regression. We approach the general topic of nonlinear regression by showing how the powerful and flexible statistical modelling framework of general and generalized linear models\, and their multilevel counterparts\, can be extended to handle nonlinear relationships between predictor and outcome variables. We begin by providing a comprehensive practical and theoretical overview of regression\, including multilevel regression\, using general and generalized linear models. Here\, we pay particular attention to the many variants of general and generalized linear models\, and how these provide a very widely applicable set of tools for statistical modelling. After this introduction\, we proceed to cover practically and conceptually simple extensions of the general and generalized linear models framework using parametric nonlinear models and polynomial regression. We will then cover more powerful and flexible extensions of this modelling framework by way of the general concept of basis functions. We’ll begin our coverage of basis function regression with the major topic of spline regression\, and then proceed to cover radial basis functions and the multilayer perceptron\, both of which are types of artificial neural networks. 
We then move on to the major topic of generalized additive models (GAMs) and generalized additive mixed models (GAMMs)\, which can be viewed as a generalization of all the basis function regression topics\, but cover a wider range of topics including nonlinear spatial and temporal models and interaction models. Finally\, we will cover the powerful Bayesian nonlinear regression method of Gaussian process regression. \n\nIntended Audience\nThis course is aimed at anyone who is interested in learning and applying nonlinear regression methods. These methods have major applications throughout economics and the other social sciences\, life sciences\, physical sciences\, and machine learning. \nVenue – Delivered remotely \nTime zone – UK (GMT) \nAvailability – 15 places \nDuration – 5 days \nContact hours – Approx. 28 hours \nECTS – Equal to 3 ECTS \nLanguage – English \nPLEASE READ – CANCELLATION POLICY: Cancellations are accepted up to 28 days before the course start date\, subject to a 25% cancellation fee. Cancellations later than this may be considered\; contact oliverhooker@psstatistics.com. Failure to attend will result in the full cost of the course being charged. In the unfortunate event that a course is cancelled due to unforeseen circumstances\, a full refund of the course fees (and accommodation fees if booked through PS Statistics) will be credited. \n\nDr. Mark Andrews\n\nTeaching Format\nThis course will be hands-on and workshop based. Throughout each day\, there will be some lecture-style presentation\, i.e.\, using slides\, introducing and explaining key concepts. However\, even in these cases\, the topics being covered will include practical worked examples that we will work through together. \nAssumed quantitative knowledge \nWe assume familiarity with linear regression analysis\, and the major concepts of classical inferential statistics (p-values\, hypothesis testing\, confidence intervals\, model comparison\, etc.). 
Some familiarity with common generalized linear models\, such as logistic or Poisson regression\, will also be assumed. \nAssumed computer background \nR experience is desirable but not essential. Although we will be using R extensively\, all the code that we use will be made available\, and so attendees will just need to make minor modifications to this code. Attendees should install R and RStudio on their own computers before the workshops\, and have some minimal familiarity with the R environment. \nEquipment and software requirements \nA laptop computer with working versions of R and RStudio is required. R and RStudio are both available as free and open source software for PCs\, Macs\, and Linux computers. R may be downloaded by following the links here: https://www.r-project.org/. RStudio may be downloaded by following the links here: https://www.rstudio.com/. All the R packages that we will use in this course can be downloaded and installed during the workshop itself as and when they are needed\, and a full list of required packages will be made available to all attendees prior to the course. In some cases\, some additional open-source software will need to be installed to use some R packages. These include Stan for probabilistic modelling\; Keras for neural network modelling\; and Prophet for forecasting. Directions on how to install this software will also be provided before and during the course. \nIF UNSURE ABOUT SUITABILITY\, PLEASE ASK oliverhooker@psstatistics.com \n\nCourse Programme\nMonday 25th – Classes from 09:30 to 17:30 \nModule 1: General and generalized linear models\, including multilevel models. In order to provide a solid foundation for the remainder of the course\, we begin by providing a comprehensive practical and theoretical overview of the principles of general and generalized linear models\, also covering their multilevel (or hierarchical) counterparts. 
General and generalized linear models provide a powerful set of tools for statistical modelling\, which are extremely widely used and widely applicable. Their underlying theoretical principles are quite simple and elegant\, and once understood\, it becomes clear how these models can be extended in many different ways to handle different statistical modelling situations. \nFor this module\, we will use commonly used R tools such as lm\, glm\, lme4::lmer\, and lme4::glmer. In addition\, we will also use the R based brms package\, which uses the Stan probabilistic programming language. This package allows us to perform all the same analyses that are provided by lm\, glm\, lmer\, glmer\, etc.\, using an almost identical syntax\, but also allows us to perform a much wider range of general and generalized linear model analyses. \nTuesday 26th – Classes from 09:30 to 17:30 \nHaving established a solid regression modelling foundation\, on the second day we will cover a range of nonlinear modelling extensions to the general and generalized linear modelling framework. \nModule 2: Polynomial regression. Polynomial regression is both a conceptually and practically simple extension of linear modelling. It can be easily accomplished using the poly function along with tools like lm\, glm\, lme4::lmer\, and lme4::glmer. Here\, we will also cover piecewise linear and polynomial regression\, using R packages such as segmented.\n\nModule 3: Parametric nonlinear regression. In some cases of nonlinear regression\, a bespoke parametric function for the relationship between the predictors and outcome variable is used. These functions are often obtained from scientific knowledge of the problem at hand. In R\, we can use the nls function to perform parametric nonlinear regression.\n\nModule 4: Spline regression. Nonlinear regression using splines is a powerful and flexible non-parametric or semi-parametric nonlinear regression method. It is also an example of a basis function regression method. 
Here\, we will cover spline regression using the splines::bs and splines::ns functions\, which can be used with lm\, glm\, lme4::lmer\, lme4::glmer\, brms\, etc.\n\nModule 5: Radial basis functions. Regression using radial basis functions is a set of methods that are closely related to spline regression. They have a long history of use in machine learning and can also be viewed as a type of artificial neural network model. Here\, we will explore radial basis function models using the Stan programming language\, which will allow us to build powerful and flexible versions of these models.\n\nModule 6: Multilayer perceptrons. Closely related to radial basis functions are multilayer perceptrons. These and their variants and extensions are major building blocks of deep learning (machine learning) methods. We will explore multilayer perceptrons in Stan\, but we will also use the powerful Keras library. \nWednesday 27th – Classes from 09:30 to 17:30 \nModule 7: Generalized additive models. We now turn to the major module of generalized additive models (GAMs). GAMs generalize many of the concepts and modules covered so far and represent a powerful and flexible framework for nonlinear modelling. In R\, the mgcv package provides an extensive set of tools for working with GAMs. Here\, we will provide in-depth coverage of mgcv\, including choosing smooth terms\, controlling overfitting and complexity\, prediction\, model evaluation\, and so on.\n\nModule 8: Generalized additive mixed models. GAMs can also be used in linear mixed effects models\, where they are known as generalized additive mixed models (GAMMs). GAMMs can also be used with the mgcv package. \nThursday 28th – Classes from 09:30 to 17:30 \nModule 9: Interaction nonlinear regression. A powerful feature of GAMs and GAMMs is the ability to model nonlinear interactions\, whether between two continuous variables\, or between one continuous and one categorical variable. 
Amongst other things\, interactions between continuous variables allow us to do spatial and spatio-temporal modelling. Interactions between categorical and continuous variables allow us to model how nonlinear relationships between a predictor and outcome change as a function of the value of different categorical variables. \nModule 10: Nonlinear regression for time-series and forecasting. One major application of nonlinear regression is for modelling time-series and forecasting. Here\, we will explore the prophet library for time-series forecasting. This library\, available for both Python and R\, gives us a GAM-like framework for modelling time-series and making forecasts. \nFriday 29th – Classes from 09:30 to 16:00 \nModule 11: Gaussian process regression. Our final module deals with a type of Bayesian nonlinear regression known as Gaussian process regression. Gaussian process regression can be viewed as a kind of basis function regression\, but with an infinite number of basis functions. In that sense\, it generalizes spline regression\, radial basis functions\, multilayer perceptrons\, and generalized additive models\, and provides a means to overcome some practically challenging problems in nonlinear regression\, such as selecting the number and type of smooth functions. Here\, we will explore Gaussian process regression using Stan. \n
URL:https://www.psstatistics.com/course/generalised-linear-glm-nonlinear-nlglm-and-general-additive-models-gam-gnam02/
LOCATION:United Kingdom
ATTACH;FMTTYPE=image/jpeg:https://www.psstatistics.com/wp-content/uploads/2019/04/gnmr01.jpg
END:VEVENT
END:VCALENDAR