 Statistics Regression 

The Statistics Regression Module allows you to increase the accuracy of your forecasts with advanced regression procedures.


Statistics Regression allows you to predict categorical results and apply a wide range of nonlinear regression procedures. You can apply the procedures to business projects and analyses where ordinary regression techniques are limited or inappropriate: for example, to study customer buying habits and responses to treatments, or to analyse credit risks.


With Statistics Regression, you can extend the data analysis capabilities of Statistics Base throughout the analytical process:


  • Predicting categorical results with more than two categories using MLR (Multinomial Logistic Regression).

  • Easily classifying data into groups using binary logistic regression.

  • Calculating the parameters of non-linear models using NLR (Nonlinear Regression) and CNLR (Constrained Nonlinear Regression).

  • Meeting statistical assumptions using WLS (Weighted Least Squares) and 2SLS (Two-Stage Least Squares).

  • Assessing the value of stimuli using probit analysis.

 

Predicting categorical results:

 

  • MLR (Multinomial Logistic Regression) allows the regression of a categorical dependent variable with more than two categories on a set of independent variables. This procedure enables accurate prediction of group membership within key groups.

  • Using stepwise functionality, including forward entry, backward elimination, forward stepwise and backward stepwise methods, to find the best predictors.

  • If you have a large number of predictors, the Score and Wald methods provide faster results.

  • Evaluating model fit as needed using the AIC (Akaike Information Criterion) and BIC (Bayesian Information Criterion); see the sketch below.
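
As an illustration only, and not part of the module itself, the following is a minimal sketch of a multinomial logistic regression fitted with Python's statsmodels package on simulated data; the predictors, outcome and sample size are assumptions made for the example.

    # Minimal multinomial logistic regression sketch using statsmodels
    # (simulated data; predictors and outcome are assumptions for illustration).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 300
    X = rng.normal(size=(n, 2))       # two illustrative predictors
    y = rng.integers(0, 3, size=n)    # categorical outcome with three groups

    model = sm.MNLogit(y, sm.add_constant(X))
    result = model.fit(disp=False)
    print(result.summary())
    print("AIC:", result.aic, "BIC:", result.bic)   # model-fit criteria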

 

Classifying data easily:

 

  • Using binary logistic regression, it is possible to develop models in which the dependent variable is dichotomous: for example, purchase and non-purchase, payment and default, graduate and non-graduate (see the sketch after this list).

  • Predicting the probability of events such as reminder responses or programme participation.

  • Selecting variables using six types of stepwise methods, including forward (which adds the strongest variables until there are no more significant predictors in the dataset) and backward (which, at each step, removes the least significant predictor in the dataset).

  • Defining the criteria for inclusion or exclusion.
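
For orientation, here is a minimal sketch of a binary logistic regression for a dichotomous outcome (purchase vs. non-purchase), fitted with Python's statsmodels on simulated data; the predictors and effect sizes are assumptions made for the example.

    # Minimal binary logistic regression sketch (simulated purchase data).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500
    X = rng.normal(size=(n, 3))                      # illustrative predictors
    logits = 0.8 * X[:, 0] - 0.5 * X[:, 1]           # assumed true relationship
    y = rng.binomial(1, 1 / (1 + np.exp(-logits)))   # 1 = purchase, 0 = non-purchase

    result = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
    print(result.summary())
    print(result.predict(sm.add_constant(X))[:5])    # predicted purchase probabilities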

 

Estimating the parameters of non-linear models:

 

  • Estimating non-linear equations using NLR for unconstrained problems and CNLR for problems with and without constraints (see the sketch after this list).

  • NLR allows models with arbitrary relationships between independent and dependent variables to be estimated using iterative estimation algorithms.

  • CNLR allows the use of linear and non-linear constraints on any combination of parameters.

  • Estimating parameters by minimising any smooth loss function (objective function), and computing bootstrap estimates of parameter standard errors and correlations.
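
As a rough analogue of NLR and CNLR, the sketch below fits an assumed exponential-decay model with SciPy's curve_fit, first without constraints and then with simple bound constraints on the parameters; the model form and data are assumptions made for the example.

    # Nonlinear least-squares sketch with scipy (assumed exponential-decay model).
    import numpy as np
    from scipy.optimize import curve_fit

    def model(x, a, b, c):
        return a * np.exp(-b * x) + c

    rng = np.random.default_rng(2)
    x = np.linspace(0, 5, 100)
    y = model(x, 2.5, 1.3, 0.5) + rng.normal(scale=0.1, size=x.size)

    # Unconstrained fit (in the spirit of NLR)
    params, _ = curve_fit(model, x, y, p0=[1.0, 1.0, 0.1])

    # Fit with bound constraints on the parameters (in the spirit of CNLR)
    params_c, _ = curve_fit(model, x, y, p0=[1.0, 1.0, 0.1],
                            bounds=([0, 0, 0], [10, 10, 5]))
    print(params, params_c)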

 

Meeting statistical assumptions:

 

  • If the spread of the residuals is not constant, use WLS (Weighted Least Squares) to estimate the model: for example, when predicting share values, where shares with higher values fluctuate more than those with lower values (see the sketch after this list).

  • Using 2SLS (Two-Stage Least Squares) to estimate the dependent variable when the independent variables are correlated with the regression error terms. This helps to control for correlations between predictor variables and error terms.
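
As a sketch of the weighted least squares idea, the example below simulates data whose residual spread grows with the predictor and compares ordinary and weighted fits using Python's statsmodels; the weighting scheme is an assumption made for the example, and a two-stage least squares analogue would require an instrumental-variables routine not shown here.

    # Weighted least squares sketch (simulated data with non-constant residual spread).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 200
    x = rng.uniform(1, 10, size=n)
    y = 3.0 + 2.0 * x + rng.normal(scale=0.5 * x)    # error variance grows with x

    X = sm.add_constant(x)
    ols = sm.OLS(y, X).fit()
    wls = sm.WLS(y, X, weights=1.0 / x**2).fit()     # weight by inverse assumed variance
    print(ols.params, wls.params)
    print(ols.bse, wls.bse)                          # standard errors under each fit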

 

Assessing the value of stimuli:

 

  • Probit analysis is most appropriate for estimating the effects of one or more independent variables on a categorical dependent variable.

  • Evaluating the value of stimuli using a logit or probit transformation of the proportion responding (see the sketch below).
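
To make the probit idea concrete, the sketch below fits a probit model to hypothetical dose-response counts using Python's statsmodels; all stimulus levels, sample sizes and response counts are invented for the example.

    # Probit analysis sketch on hypothetical dose-response counts.
    import numpy as np
    import statsmodels.api as sm

    dose = np.array([1.0, 2.0, 4.0, 8.0, 16.0])      # stimulus levels (assumed)
    n_subjects = np.array([40, 40, 40, 40, 40])
    n_responding = np.array([4, 10, 21, 33, 38])

    # Expand the grouped counts into one binary row per subject.
    x = np.repeat(np.log(dose), n_subjects)
    y = np.concatenate([np.r_[np.ones(r), np.zeros(n - r)]
                        for r, n in zip(n_responding, n_subjects)])

    probit = sm.Probit(y, sm.add_constant(x)).fit(disp=False)
    print(probit.summary())
    # Estimated response probability at an intermediate stimulus level (dose = 6)
    print(probit.predict(np.array([[1.0, np.log(6.0)]])))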

 

Videos about some of the Statistics Regression procedures

Technical sheet: Statistics Regression
