                                  Linear Regression in SPSS

   I. SIMPLE LINEAR REGRESSION EXAMPLE
   Butler’s Trucking Company is an independent trucking company in southern
   California. A major portion of Butler’s business involves deliveries throughout its
   local area. To develop better work schedules, the managers want to estimate the total
   daily travel time for their drivers.
       Initially the managers believed that the total daily travel time would be closely
   related to the number of miles traveled in making the daily deliveries. A simple
   random sample of 10 driving assignments is provided in Table 1. Use SPSS to make
   a scatter diagram of these data (to verify that a linear relationship does exist)
   and to develop a regression equation expressing this relationship.

Table 1
           Driving Assignment     X1=Miles Traveled       Y=Travel Time (hrs.)
           1                      100                     9.3
           2                      50                      4.8
           3                      100                     8.9
           4                      100                     6.5
           5                      50                      4.2
           6                      80                      6.2
           7                      75                      7.4
           8                      65                      6.0
           9                      90                      7.6
           10                     90                      6.1

   SPSS Instructions

   1. Open the program SPSS 9.0 for Windows. When the opening dialog box appears asking
      what you would like to do, click Cancel.

   2. Enter the values for your independent variable (x) in the first column. As you
      begin entering values in this column, a heading labeled var00001 will appear above
      the column. Double click on the var00001 heading. A row will appear allowing you
      to name your variable, format the data type (e.g., as dollars or a time), declare
      the number of decimal places used to display your values, and set several other
      formatting options. For this example I named the variable x.

   3. Enter the values for your dependent variable (y) in the second column. The heading
      var00002 will appear above this column. Double click on var00002 to specify y (or
      a descriptive name for your dependent variable) as the variable name for the data
      listed in this column. Figure 1 displays the entered data.




   Figure 1. SPSS Data Editor with the entered x and y values.
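
   As an alternative to typing the values into the Data Editor (Steps 2 and 3), the same
   data set can be defined with SPSS syntax. The sketch below assumes the variable names
   x and y used above; open a syntax window (File, New, Syntax), paste the commands, and
   run them.

       * Define the two variables and read the ten observations from Table 1.
       DATA LIST FREE / x y.
       BEGIN DATA
       100 9.3
        50 4.8
       100 8.9
       100 6.5
        50 4.2
        80 6.2
        75 7.4
        65 6.0
        90 7.6
        90 6.1
       END DATA.
       * Optional: attach descriptive labels that will appear in the output.
       VARIABLE LABELS x 'Miles Traveled' y 'Travel Time (hrs.)'.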


4. To produce a scatter plot, click on Graphs on the tool bar and select Scatter.
   When the scatter plot box appears, click on Simple, followed by the Define button.
   You must specify the variables that you want plotted on the x and y axes.
   Highlight x in the window on your left. While x is highlighted, click the arrow to
   the left of the window labeled X Axis. The label x should appear in this box.
   Highlight the variable y, and click on the arrow to the left of the window labeled
   Y Axis. Click OK, and a scatter plot will appear. You may save or print the scatter
   plot, and then close the output screen by clicking File (on the tool bar) and
   selecting Close.
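
   The same scatter plot can also be produced from a syntax window. This is only a
   sketch, assuming the variable names x and y defined earlier.

       * Simple (bivariate) scatter plot of travel time (y) against miles traveled (x).
       GRAPH
         /SCATTERPLOT(BIVAR)=x WITH y.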


5. To obtain the regression equation, click on Analyze on the tool bar. Select
   Regression, and click on Linear. Inside the Linear Regression box, you need
   to specify your independent and dependent variables. Highlight x in the window
   on your left. While x is highlighted, click the arrow to the left of the window
   labeled Independent(s). The label x should appear in this box. Highlight the
   variable y, and click on the arrow to the left of the window labeled Dependent.
   The label y should now appear in this window.


6. Next, click on the Statistics button at the bottom of the Linear Regression box.
   Inside the Linear Regression: Statistics box, check the boxes next to
   Estimates, Model fit, and R squared change. Then click Continue. When
   you return to the Linear Regression box, click OK. Your results should look
   similar to the results shown below.
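
   Equivalently, Steps 5 and 6 can be carried out from a syntax window with the
   REGRESSION command. This is a sketch, assuming the variable names x and y defined
   earlier; the STATISTICS keywords correspond to the Estimates, Model fit, and
   R squared change options checked in Step 6.

       * Simple linear regression of travel time (y) on miles traveled (x).
       REGRESSION
         /STATISTICS COEFF R ANOVA CHANGE
         /DEPENDENT y
         /METHOD=ENTER x.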


 REGRESSION

                          Variables Entered/Removed(b)

 Model   Variables Entered   Variables Removed   Method
 1       X(a)                .                   Enter
   a. All requested variables entered.
   b. Dependent Variable: Y


                                 Model Summary

 Model   R         R Square   Adjusted R Square   Std. Error of the Estimate
 1       .815(a)   .664       .622                1.0018

                               Change Statistics
 Model   R Square Change   F Change   df1   df2   Sig. F Change
 1       .664              15.815     1     8     .004
   a. Predictors: (Constant), X


                                    ANOVA(b)

 Model                Sum of Squares   df   Mean Square   F        Sig.
 1    Regression      15.871            1   15.871        15.815   .004(a)
      Residual         8.029            8    1.004
      Total           23.900            9
   a. Predictors: (Constant), X
   b. Dependent Variable: Y


                                 Coefficients(a)

                      Unstandardized Coefficients   Standardized Coefficients
 Model                B            Std. Error        Beta                        t       Sig.
 1    (Constant)      1.274        1.401                                          .909   .390
      X               6.783E-02     .017              .815                       3.977   .004
   a. Dependent Variable: Y




Interpreting Results
1. In the Model Summary table (the second table in the output), you will find the
   Coefficient of Determination, R Square, and the Correlation Coefficient, R.

2. The ANOVA table gives the F statistic for testing the claim that there is no
   significant relationship between your independent and dependent variables. The
   Sig. value is your p value. Thus you should reject the claim that there is no
   significant relationship between your independent and dependent variables if
   p < α, your chosen level of significance (e.g., .05).

3. The Coefficients box gives the b0 and b1 values for the regression equation. The
   constant value is always b0. The b1 value is listed next to your independent
   variable, x. (A worked example follows this list.)

4. In the last column of the Coefficients box, the p value for the individual t test on
   your independent variable is given. Recall that this t test evaluates the claim that
   there is no relationship between the independent variable and your dependent
   variable. Thus you should reject the claim that there is no significant relationship
   between your independent variable and dependent variable if p < α.
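
   As a worked example, the Coefficients table above gives b0 = 1.274 and
   b1 = 6.783E-02 ≈ 0.0678, so the estimated regression equation is

       ŷ = 1.274 + 0.0678x.

   A 75-mile driving assignment, for instance, has an estimated travel time of about
   1.274 + 0.0678(75) ≈ 6.4 hours.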




II. MULTIPLE REGRESSION EXAMPLE
      In attempting to identify another independent variable, the managers felt that the
number of deliveries could also contribute to the total travel time. Table 2 includes the
number of deliveries for each of the randomly sampled driving assignments from Table 1.

Table 2
            Driving                  X1=Miles           X2=Number of      Y=Travel Time
            Assignment               Traveled           Deliveries        (hrs.)
            1                        100                4                 9.3
            2                        50                 3                 4.8
            3                        100                4                 8.9
            4                        100                2                 6.5
            5                        50                 2                 4.2
            6                        80                 2                 6.2
            7                        75                 3                 7.4
            8                        65                 4                 6.0
            9                        90                 3                 7.6
            10                       90                 2                 6.1

To determine the regression equation for this scenario, follow the same SPSS steps
provided for Simple Linear Regression with the following modifications (an equivalent
syntax sketch is given after the list):

   •   In Step 2, redefine x as x1, and then enter the data for x2 in another column and
       name that column x2.
   •   In Step 5, specify both x1 and x2 as independent variables (i.e., after placing
       one of the variables in the Independent(s) box, follow the same procedure to place
       the other variable in the Independent(s) box).
   •   Omit Step 4.

Your output for this multiple regression problem should be similar to the results shown
below.
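
If you prefer syntax, the multiple regression can be run with the same REGRESSION
command used earlier, listing both independent variables on the METHOD subcommand. This
sketch assumes the variables are named x1, x2, and y as described above.

    * Multiple regression of travel time (y) on miles traveled (x1) and deliveries (x2).
    REGRESSION
      /STATISTICS COEFF R ANOVA CHANGE
      /DEPENDENT y
      /METHOD=ENTER x1 x2.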

REGRESSION


                          Variables Entered/Removed(b)

   Model   Variables Entered   Variables Removed   Method
   1       X2, X1(a)           .                   Enter
     a. All requested variables entered.
     b. Dependent Variable: Y




                                 Model Summary

 Model   R         R Square   Adjusted R Square   Std. Error of the Estimate
 1       .951(a)   .904       .876                .5731

                               Change Statistics
 Model   R Square Change   F Change   df1   df2   Sig. F Change
 1       .904              32.878     2     7     .000
   a. Predictors: (Constant), X2, X1


                                    ANOVA(b)

 Model                Sum of Squares   df   Mean Square   F        Sig.
 1    Regression      21.601            2   10.800        32.878   .000(a)
      Residual         2.299            7     .328
      Total           23.900            9
    a. Predictors: (Constant), X2, X1
    b. Dependent Variable: Y


                                 Coefficients(a)

                      Unstandardized Coefficients   Standardized Coefficients
 Model                B            Std. Error        Beta                        t        Sig.
 1    (Constant)      -.869         .952                                         -.913    .392
      X1              6.113E-02     .010              .735                       6.182    .000
      X2               .923         .221              .496                       4.176    .004
    a. Dependent Variable: Y


 Interpreting Results
 1. In the Model Summary table (the second table in the output), you will find the
    Correlation Coefficient, R, the Coefficient of Determination, R Square, and the
    Adjusted Coefficient of Determination, Adjusted R Square.
 2. The ANOVA table gives the F statistic for testing the claim that there is no
    significant relationship between your dependent variable and all of your
    independent variables taken together. The Sig. value is your p value. Thus you
    should reject the claim that there is no significant relationship between your
    independent and dependent variables if p < α, your chosen level of significance
    (e.g., .05).
 3. The Coefficients box gives the b0, b1, and b2 values for the regression equation.
    The constant value is always b0. The b1 value is listed next to x1, and the b2
    value is listed next to x2. (A worked example follows this list.)
 4. In the last column of the Coefficients box, the p values for the individual t tests
    on your independent variables are given. Recall that each t test evaluates the claim
    that there is no relationship between the independent variable (in the corresponding
    row) and your dependent variable. Thus you should reject the claim that there is no
    significant relationship between that independent variable and your dependent
    variable if p < α.
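
 As a worked example, the Coefficients table above gives b0 = -.869,
 b1 = 6.113E-02 ≈ 0.0611, and b2 = .923, so the estimated regression equation is

     ŷ = -0.869 + 0.0611x1 + 0.923x2.

 A 100-mile driving assignment with 4 deliveries, for instance, has an estimated travel
 time of about -0.869 + 0.0611(100) + 0.923(4) ≈ 8.9 hours.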



