IBM SPSS Regression enables you to predict categorical outcomes and apply a wide range of nonlinear regression procedures.
It is effective where ordinary regression techniques are limiting or inappropriate: for example, studying consumer buying habits or responses to treatments, measuring academic achievement, and analyzing credit risk.
IBM SPSS Statistics - Regression
Improve the accuracy of predictions with advanced regression procedures
IBM® SPSS® Regression software enables you to predict categorical outcomes and apply a range of nonlinear regression procedures. You can apply the procedures to business and analysis projects where ordinary regression techniques are limiting or inappropriate, such as studying consumer buying habits, measuring responses to treatments, or analyzing credit risk.
With SPSS Regression software, you can expand the capabilities of IBM SPSS Statistics Base for the data analysis stage in the analytical process.
- Predict categorical outcomes with more than two categories using multinomial logistic regression (MLR).
- Easily classify your data into groups using binary logistic regression.
- Estimate parameters of nonlinear models using nonlinear regression (NLR) and constrained nonlinear regression (CNLR).
- Meet statistical assumptions using weighted least squares and two-stage least squares.
- Evaluate the value of stimuli using probit analysis.
Desktop Systems

| | Windows® | Mac® OS X | Linux® |
| --- | --- | --- | --- |
| Further Requirements | Super VGA monitor (800x600) or higher resolution; network adapter with TCP/IP network protocol for connecting to SPSS Statistics Base Server; Internet Explorer | Super VGA monitor (800x600) or higher resolution; web browser: Mozilla Firefox | Super VGA monitor (800x600) or higher resolution; web browser: Mozilla Firefox |
| Operating System | Windows XP, Vista, 7, 8, 10 (32-/64-bit) | Mac OS X 10.7 (32-/64-bit), Mac OS X 10.8 (64-bit only) | Debian 6.0 x86-64; Red Hat Enterprise Linux (RHEL) 5 Desktop Editions; Red Hat Enterprise Linux (RHEL) Client 6 x86-64 |
| Min. CPU | Intel or AMD x86 processor, 1 GHz or better | Intel processor (32-/64-bit) | Intel or AMD x86 processor, 1 GHz or better |
| Min. RAM | 1 GB or more | 1 GB or more | 1 GB or more |
| Disk Space | Min. 800 MB | Min. 800 MB | Min. 800 MB |
Server Systems

| | SPSS Statistics Server |
| --- | --- |
| Further Requirements | For Windows and Solaris systems: network adapter with TCP/IP network protocol. For System z systems: OSA-Express3 10 Gigabit Ethernet, OSA-Express3 Gigabit Ethernet, or OSA-Express3 1000BASE-T Ethernet |
| Operating System | Windows Server 2008 or 2012 (64-bit); Red Hat Enterprise Linux 5 (32-/64-bit); SUSE Linux Enterprise Server 10 and 11 (32-/64-bit). Details can be found in the PDF document "System Requirements SPSS Statistics Server 22" |
| Min. CPU | |
| Min. RAM | 4 GB or more |
| Disk Space | Approx. 1 GB for the installation; up to double that amount may be needed |
Predict categorical outcomes
- Using MLR, regress a categorical dependent variable with more than two categories on a set of independent variables. This helps you accurately predict group membership within key groups (see the sketch after this list).
- Use stepwise functionality, including forward entry, backward elimination, forward stepwise or backward stepwise, to find the best predictor.
- For a large number of predictors, use Score and Wald methods to help you quickly reach results.
- Assess your model fit using Akaike information criterion (AIC) and Bayesian information criterion (BIC).
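In SPSS these steps run through the multinomial logistic regression dialog or syntax; as a rough illustration of the underlying technique outside SPSS, the Python sketch below fits a multinomial logit with statsmodels on simulated data. The predictors (age, income), the three-category outcome, and all effect sizes are invented for the example; only the AIC, BIC, and predicted group membership mirror the outputs described above.

```python
# Illustrative only: simulated data and made-up variable names, not SPSS syntax.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
age = rng.uniform(20, 65, n)
income = rng.normal(50, 15, n)

# Simulate a three-category outcome loosely driven by the two predictors.
scores = np.column_stack([np.zeros(n), 0.05 * age - 2.0, 0.04 * income - 1.5])
probs = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
group = np.array([rng.choice(3, p=p) for p in probs])

X = sm.add_constant(np.column_stack([age, income]))    # intercept + predictors
mlr = sm.MNLogit(group, X).fit(disp=False)             # multinomial logit (MLR)

print(mlr.summary())
print("AIC:", mlr.aic, "BIC:", mlr.bic)                # model-fit criteria
predicted_group = mlr.predict(X).argmax(axis=1)        # most likely group per case
```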
Easily classify your data
- Using binary logistic regression, build models in which the dependent variable is dichotomous; for example, buy versus not buy, pay versus default, graduate versus not graduate (a sketch follows this list).
- Predict the probability of events such as solicitation responses or program participation.
- Select variables using six types of stepwise methods. This includes forward (select the strongest variables until there are no more significant predictors in the data set) and backward (at each step, remove the least significant predictor in the data set).
- Set inclusion or exclusion criteria.
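As a minimal sketch of the same idea outside SPSS, the Python code below fits a binary logistic model with statsmodels to simulated buy/not-buy data and classifies cases at a 0.5 probability cutoff. The predictors and effect sizes are assumptions made for illustration, and the stepwise selection methods listed above are SPSS features that are not reproduced here.

```python
# Illustrative only: simulated "buy vs. not buy" data, not SPSS syntax.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
income = rng.normal(50, 15, n)
prior_purchases = rng.poisson(2, n)

# Simulate the dichotomous outcome (1 = buy, 0 = not buy).
linpred = -4.0 + 0.05 * income + 0.6 * prior_purchases
buy = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

X = sm.add_constant(np.column_stack([income, prior_purchases]))
logit_fit = sm.Logit(buy, X).fit(disp=False)   # binary logistic regression

p_buy = logit_fit.predict(X)                   # predicted probability of "buy"
classified = (p_buy >= 0.5).astype(int)        # classify cases into the two groups
print(logit_fit.summary())
```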
Estimate parameters of nonlinear models
- Estimate nonlinear equations using NLR for unconstrained problems and CNLR for both constrained and unconstrained problems (see the sketch after this list).
- Using NLR, estimate models with arbitrary relationships between independent and dependent variables using iterative estimation algorithms.
- With CNLR, use linear and nonlinear constraints on any combination of parameters.
- Estimate parameters by minimizing any smooth loss function (objective function), and compute bootstrap estimates of parameter standard errors and correlations.
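The SciPy sketch below illustrates both cases on a made-up exponential growth model: an unconstrained iterative fit as an analogue of NLR, and a fit with parameter bounds as an analogue of CNLR. The model, starting values, and bounds are assumptions chosen for the example, not anything prescribed by SPSS.

```python
# Illustrative only: a made-up exponential growth model fitted with SciPy,
# as an analogue of NLR (unconstrained) and CNLR (bounded parameters).
import numpy as np
from scipy.optimize import curve_fit

def growth(x, a, b):
    """Nonlinear mean function: y = a * (1 - exp(-b * x))."""
    return a * (1.0 - np.exp(-b * x))

rng = np.random.default_rng(2)
x = np.linspace(0.0, 10.0, 80)
y = growth(x, 5.0, 0.7) + rng.normal(0.0, 0.2, x.size)

# Unconstrained iterative estimation (NLR analogue).
params_unc, cov_unc = curve_fit(growth, x, y, p0=[1.0, 0.1])

# Constrained estimation (CNLR analogue): keep a in [0, 10] and b in [0, 1].
params_con, cov_con = curve_fit(growth, x, y, p0=[1.0, 0.1],
                                bounds=([0.0, 0.0], [10.0, 1.0]))

print("unconstrained estimates:", params_unc)
print("constrained estimates:  ", params_con)
print("approx. standard errors:", np.sqrt(np.diag(cov_con)))
```

Bootstrap standard errors, as mentioned above, could be approximated by refitting the model to resampled data, but that step is omitted here to keep the sketch short.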
Meet statistical assumptions
- If the spread of residuals is not constant, use weighted least squares to estimate the model. For example, when predicting stock values, stocks with higher share values fluctuate more than low-value shares.
- Use two-stage least squares to estimate the dependent variable when the independent variables are correlated with regression error terms. This allows you to control for correlations between predictor variables and error terms; both techniques are sketched after this list.
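Both techniques can be sketched in a few lines of Python with statsmodels, again on simulated data: a weighted least squares fit in which the error spread grows with the predictor, and two-stage least squares written out as two ordinary least squares passes. The variable names, the instrument, and the weighting scheme are hypothetical.

```python
# Illustrative only: simulated data, not SPSS syntax.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 400

# Weighted least squares: residual spread grows with the predictor,
# so weight each case by the inverse of its (assumed) error variance.
x = rng.uniform(1.0, 10.0, n)
y = 2.0 + 0.5 * x + rng.normal(0.0, 0.3 * x)           # heteroscedastic errors
wls_fit = sm.WLS(y, sm.add_constant(x), weights=1.0 / x**2).fit()

# Two-stage least squares written out as two OLS passes:
# instrument z drives the endogenous regressor w, which shares an error with y2.
z = rng.normal(size=n)                                 # instrument
e = rng.normal(size=n)
w = 1.0 + 0.8 * z + 0.5 * e                            # endogenous regressor
y2 = 3.0 + 1.5 * w + e                                 # error correlated with w
stage1 = sm.OLS(w, sm.add_constant(z)).fit()           # stage 1: w on the instrument
stage2 = sm.OLS(y2, sm.add_constant(stage1.fittedvalues)).fit()  # stage 2: y on fitted w

print("WLS coefficients: ", wls_fit.params)
print("2SLS coefficients:", stage2.params)             # point estimates only
```

Writing the second technique as two explicit OLS passes keeps the logic visible; dedicated two-stage least squares routines additionally correct the second-stage standard errors.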
Evaluate the value of stimuli
- Use probit analysis to estimate the effects of one or more independent variables on a categorical dependent variable (see the sketch after this list).
- Evaluate the value of stimuli using a logit or probit transformation of the proportion responding.
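A minimal dose-response sketch in Python follows, using statsmodels with a simulated stimulus variable; the dose range and effect sizes are invented, and a logit fit is included alongside the probit only to show the alternative transformation. The final line illustrates the kind of stimulus valuation described above.

```python
# Illustrative only: a simulated dose-response experiment, not SPSS syntax.
import numpy as np
from scipy.stats import norm
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 300
dose = rng.uniform(0.0, 5.0, n)                          # stimulus level
respond = rng.binomial(1, norm.cdf(-2.0 + 1.0 * dose))   # simulated 0/1 response

X = sm.add_constant(dose)
probit_fit = sm.Probit(respond, X).fit(disp=False)       # probit transformation
logit_fit = sm.Logit(respond, X).fit(disp=False)         # logit alternative

# Stimulus level at which half the cases are expected to respond (median effective dose).
b0, b1 = probit_fit.params
print("probit coefficients:", probit_fit.params)
print("estimated ED50:", -b0 / b1)
```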