                      CARMA 2010 Summer Short Courses
                                 Program I
                Virginia Commonwealth University, Richmond, VA
                            May 17 – May 22, 2010

                               Session I: May 17 – May 19, 2010

                       “Introduction to Structural Equation Methods”
                       Dr. Larry J. Williams, Wayne State University


Dr. Larry J. Williams joined the faculty of Wayne State University as Professor of Management
in January 2010. Previously he was an Associate Professor and Jay Ross Young Faculty Scholar
at the Krannert School of Management at Purdue University (1987-1996), the Fisher
Distinguished Scholar in the Industrial/Organizational Psychology program at the University of
Tennessee (1996-1997), and a University Professor in the Department of Management at
Virginia Commonwealth University (1997-2009). Dr. Williams served as the Founding Editor of
Organizational Research Methods (ORM), a journal sponsored by the Research Methods
Division (RMD) of the Academy of Management, and he previously served as Consulting Editor
for the Research Methods and Analysis section of the Journal of Management (1993-1996). Dr.
Williams has also served as Chairperson of the Research Methods Division, and he established
and currently serves as Director of the Center for the Advancement of Research Methods and
Analysis (CARMA).

In 2004, Dr. Williams was recognized by the Southern Management Association as an author of
two of the six most highly cited articles in the 30-year history of the Journal of Management.
Also in 2004, he was elected to the Society for Organizational Behavior, an international group
of approximately 80 leading scholars in the field of organizational behavior. In 2005, he received
the Distinguished Career Contributions Award from the Academy of Management’s Research
Methods Division. In 2008, he was recognized as one of the 150 most-cited authors in the field
of management (1981-2004) in an article published in the Journal of Management. He has also
been elected a Fellow of the Society for Industrial and Organizational Psychology.

Course Summary

The Introduction to Structural Equation Methods Short Course provides (a) introductory
coverage of confirmatory latent variable techniques, including confirmatory factor analysis and
structural equation methods with latent variables, (b) discussion of special issues related to the
application of these techniques in organizational research, and (c) a comparison of these
techniques with traditional analytical approaches. This Short Course will contain a balance of
lecture and hands-on data analysis with examples and assignments, and emphasis will be placed
on the application of SEM techniques to organizational research problems.
Course Outline & Objectives

a. Participants will develop skills required to conduct confirmatory latent variable data analysis,
based on currently accepted practices, involving topics and research issues common to
organizational research.

b. Participants will learn the conceptual and statistical assumptions underlying confirmatory
latent variable analysis.

c. Participants will learn how to implement data analysis techniques using software programs for
confirmatory modeling. Special emphasis will also be placed on the generation and interpretation
of results using the contemporary software programs LISREL and Mplus.

d. Participants will learn how latent variable techniques can be applied to contemporary research
issues in organizational research.

e. Participants will learn how the application of current latent variable techniques in
organizational research differs from traditional techniques used in this literature.
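The latent variable logic underlying these techniques can be made concrete with a small sketch. The following is illustrative only (the course itself uses LISREL and Mplus, and all numbers here are made up): it shows the covariance structure implied by a one-factor confirmatory model, Lambda Lambda' + Theta, and checks it against simulated data.

```python
import numpy as np

rng = np.random.default_rng(5)

# A one-factor model with three indicators (illustrative numbers only):
# each observed score = loading * latent factor + unique error.
loadings = np.array([0.8, 0.7, 0.6])
uniques = np.array([0.36, 0.51, 0.64])       # unique (error) variances

# Model-implied covariance matrix: Lambda Lambda' + Theta (factor variance = 1).
implied = np.outer(loadings, loadings) + np.diag(uniques)

# Simulate data consistent with the model and compare the sample covariance.
n = 5000
factor = rng.normal(size=n)
errors = rng.normal(size=(n, 3)) * np.sqrt(uniques)
obs = factor[:, None] * loadings + errors

sample_cov = np.cov(obs, rowvar=False)
print(np.abs(sample_cov - implied).max())    # small for large n
```

Fitting a CFA means searching for the loadings and unique variances that bring the implied matrix as close as possible to the observed one, which is what the course software does.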

                         *“Meta-Analysis: Models & Processes”
                 Dr. Mike McDaniel, Virginia Commonwealth University
          Dr. Hannah Rothstein, Baruch College and City University of New York

Dr. Mike McDaniel Biography

Dr. McDaniel is a member of the doctoral faculty in the management program. His current
research interests include publication bias, racial differences in job performance, situational
judgment tests, the effects of applicant faking on employment decisions and issues concerning
older workers. Dr. McDaniel has also contributed to the research literature on applications of
meta-analysis to the evaluation of various personnel selection methods. Dr.
McDaniel received his Ph.D. in Industrial Organizational Psychology from the George
Washington University. Prior to joining VCU he was a tenured associate professor at the
University of Akron. Dr. McDaniel has published in Academy of Management Journal, Journal
of Applied Psychology, Personnel Psychology, the Journal of Gerontology: Psychological
Sciences, and Human Performance. Dr. McDaniel is a member of the Academy of Management,
and a Fellow of the Society for Industrial and Organizational Psychology, a Fellow of the
American Psychological Association, a Fellow of the American Psychological Society, and a
member of the Society for Human Resource Management. Dr. McDaniel has received awards in
recognition of his research from the Academy of Management, the International Personnel
Management Association Assessment Council, the Southern Management Association, and the
US Office of Personnel Management.
Dr. Hannah Rothstein Biography

Hannah R. Rothstein is Professor of Management at Baruch College and the Graduate Center of
the City University of New York, where she coordinates the doctoral specialization in
Organizational Behavior, serves as chair of the Baruch IRB, and teaches graduate courses in
research methods.

Dr. Rothstein is a co-author of widely used software for meta-analysis (Comprehensive Meta-
Analysis) and for power analysis (Power and Precision). With Alex Sutton and Michael
Borenstein, she is co-editor of Publication Bias in Meta-Analysis: Prevention, Assessment and
Adjustments (Wiley, 2005), and with Michael Borenstein, Larry Hedges and Julian Higgins, she
is co-author of Introduction to Meta-Analysis (Wiley, 2009), and Computing Effect Sizes for
Meta-Analysis (Wiley, forthcoming). With Sally Hopewell, she is co-author of a chapter on
literature searches for research syntheses and meta-analysis in the second edition of the
Handbook of Research Synthesis.

Dr. Rothstein is a fellow of the Society for Industrial and Organizational Psychology and of the
American Psychological Association, and is a founding member of the Society for Research
Synthesis Methodology. She is an associate editor of the journal Research Synthesis
Methods and currently serves on the Editorial Boards of Psychological Methods,
Organizational Research Methods, and the Journal of Experimental Criminology. She is also on
the advisory board of the International Campbell Collaboration Methods Group. Dr. Rothstein
received her Ph.D. in Industrial and Organizational Psychology from the University of Maryland.

Course Summary

The purpose of the course is to make participants aware of methods for integrating empirical
research. Participants will leave the course with sufficient understanding to conduct a meta-
analysis. The course provides hands-on experience in conducting a meta-analysis and the
participants learn the major issues in conducting and interpreting a meta-analysis. The course
covers both psychometric meta-analysis (Hunter & Schmidt) and meta-analysis in the Hedges
and Olkin tradition. Steps in conducting a meta-analysis are reviewed. Typical challenges in
conducting a meta-analysis are addressed. The most popular methods of meta-analysis are
compared and contrasted. The role of random sampling error in distorting research literatures is
emphasized. Other distorting artifacts such as measurement error and range restriction are also
reviewed. Fixed and random effect models are considered. Exercises illustrate methods of meta-
analyzing correlation coefficients and standardized mean differences as well as methods to detect
publication bias. An additional exercise concerns estimating effect sizes from various types of
statistics. Software is demonstrated in the course and participants gain experience in using the
software. The software for this course is CMA and Schmidt Data Analysis, which will be given
in the course.
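As a rough illustration of the Hedges-and-Olkin-style steps the course covers (the study results below are hypothetical, not course data, and the course software is CMA and the Schmidt programs, not Python), a random-effects meta-analysis of correlations can be sketched as:

```python
import numpy as np

# Hypothetical study results: correlation r and sample size n for five studies.
rs = np.array([0.30, 0.22, 0.41, 0.15, 0.28])
ns = np.array([120, 85, 60, 200, 150])

# Fisher z-transform; the sampling variance of z is 1/(n - 3).
z = np.arctanh(rs)
v = 1.0 / (ns - 3)

# Fixed-effect estimate: inverse-variance weighted mean.
w = 1.0 / v
z_fixed = np.sum(w * z) / np.sum(w)

# DerSimonian-Laird estimate of the between-study variance tau^2.
Q = np.sum(w * (z - z_fixed) ** 2)          # heterogeneity statistic
df = len(rs) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)

# Random-effects estimate: weights incorporate tau^2.
w_re = 1.0 / (v + tau2)
z_re = np.sum(w_re * z) / np.sum(w_re)
r_re = np.tanh(z_re)                        # back-transform to r

print(round(r_re, 3))
```

The Fisher z-transform stabilizes the sampling variance of r, and tau^2 captures real between-study variability beyond sampling error, the distinction between fixed- and random-effects models emphasized in the course.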

*Biography and course summary from 2009.
                         *“Grounded Theory Method & Analysis”
                        Dr. Karen Locke, College of William & Mary

Karen Locke, Ph. D., is W. Brooks George Professor of Business Administration at the College
of William and Mary’s school of business. She joined the faculty there in 1989 after earning her
Ph. D. in organizational behavior from Case Western Reserve University. Dr. Locke's work
focuses on developing a sociology of knowledge in organizational studies and on the use of
qualitative research for the investigation of organizational phenomena. Her work appears in
journals such as Academy of Management Journal, Organization Science, Journal of
Organizational Behavior, Journal of Management Inquiry, and Studies in Organization, Culture
and Society. She has also authored Grounded Theory in Management Research and co-authored
Composing Qualitative Research, both published by Sage. Her current work continues her
interest in the processes of qualitative researching and focuses on exploring and explicating their
creative and imaginative dimensions. Dr. Locke also serves as an associate action editor for
Organizational Research Methods and as a member of the editorial board of the Academy of
Management Journal.

Course Summary

This workshop will introduce researchers to the grounded theory approach by outlining its key
operational processes and the distinguishing characteristics of the theory these processes
generate. Workshop participants will take from it (a) a general understanding of the logic
underlying this foundational approach to qualitative research, (b) a specific understanding of and
practice with its operational procedures (e.g., theoretical sampling, forms of coding, constant
comparison, and memoing), and (c) familiarity with the grounded theory methodological
literature.
Participants are invited to bring samples of their own data to the session. No software is required
for this course.

*Biography and course summary from 2009

                            *“Introduction to Linear Regression”
                          Dr. Jose Cortina, George Mason University


Jose M. Cortina is a Professor in the I/O Psychology program at George Mason University.
Professor Cortina received his Ph.D. in 1994 from Michigan State University. His recent
research has involved topics in meta-analysis, structural equation modeling, and the use of
personality to predict job performance. His work has been published in journals such as the
Journal of Applied Psychology, Personnel Psychology, Psychological Bulletin, Organizational
Research Methods, and Psychological Methods. He currently serves on the editorial boards of
four journals and is an Associate Editor of the Journal of Applied Psychology. Dr. Cortina was
honored by SIOP with the 2001 Ernest J. McCormick Award for Distinguished Early Career
Contributions and by the Research Methods Division of the Academy of Management with the
2004 Robert O. McDonald Best Paper Award.
Course Summary

The general purpose of this course, besides torture which, sadly, has been prohibited by APA, is
to subject you to all, or at least much, that is regression analysis (How’s that for a sentence?).
Specifically, we will cover the nuts and bolts of standard linear regression (i.e., excluding things
like logistic regression and random coefficient modeling). This will allow you to address a wide
variety of research questions, to identify those questions which are not appropriately addressed
with standard regression analysis, and to isolate the problems associated with any given
regression analysis.
Only so much material can be covered in 3 days. If you expect to be a regression expert in that
time, you will be disappointed. On the other hand, if you need a foundation on which to build
expertise, then you have come to the right place. At the end of this course, you will have a basic
understanding of the questions to which regression applies, of the interpretation of regression
output, and of the most important assumptions on which OLS regression is based. The software
for this course is SPSS Version 16.
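As a taste of the kind of output interpretation covered here (simulated data; the course itself uses SPSS, not Python), a minimal OLS fit and its two headline quantities, the coefficients and R-squared, look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: an outcome predicted by two variables (hypothetical example).
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.5 * x1 + 0.3 * x2 + rng.normal(scale=1.0, size=n)

# OLS fit: solve the least-squares problem for intercept and slopes.
X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# R-squared: proportion of variance in y explained by the model.
resid = y - X @ b
r2 = 1 - resid.var() / y.var()
print(b.round(2), round(r2, 3))
```

Each slope is the expected change in y per unit change in that predictor, holding the other constant, which is the interpretation the course builds on.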

*Biography and course summary from 2009
                             Session II: May 20 – May 22, 2010

                    “Advanced Topics in Structural Equation Methods”
                      Dr. Robert Vandenberg, University of Georgia


Robert (Bob) Vandenberg is a Professor of Management in the Terry College of Business at the
University of Georgia, Athens, GA (USA). He belongs to the organizational behavior group. He
teaches in the undergraduate, MBA and Ph.D. programs including courses in organizational
behavior, leadership, change management, introductory research methods, introduction to
structural equation modeling, and advanced structural equation modeling. Bob's primary
substantive research focuses are on organizational commitment, and high involvement work
processes. His methodological research stream includes measurement invariance, latent growth
modeling, and multilevel structural equation modeling. Bob's articles on these topics have
appeared in the Journal of Applied Psychology, Journal of Management, Journal of
Organizational Behavior, Human Resource Management, Organization Science, Group and
Organization Management, Organizational Behavior and Human Decision Processes, and
Organizational Research Methods.

Bob's measurement invariance article co-authored with Charles Lance received the 2005 Robert
McDonald Award for the Best Published Article to Advance Research Methods given by the
Research Methods Division of the Academy of Management. He has served on the editorial
boards of the Journal of Applied Psychology, Journal of Management, Organizational Behavior
and Human Decision Processes, and Organizational Research Methods. Bob is currently editor-
in-chief of Organizational Research Methods. He is past division chair of the Research Methods
Division of the Academy of Management. In addition, he is a Fellow of the Society for
Industrial and Organizational Psychology, the American Psychological Association, and the
Southern Management Association. He is also a Fellow in the Center for the Advancement of
Research Methods and Analysis at Virginia Commonwealth University in which he conducts
annual workshops, and a Fellow of the Institute for Behavioral Research at the University of
Georgia. Further, he was recently accepted as a member of S.O.B. Finally, and vastly more
important, Bob is married to Carole, has three children, Drew, Kaity and Jackson, and rides his
Harley with a passion every day.

Course Summary

The focus of my workshop is on SEM topics that are outside the boundaries of introductory SEM
courses and workshops. Thus, participants should have some grounding in basic CFA and SEM
analyses. Further, I use the Mplus program in my workshop (Muthén & Muthén, 1998-2006).
However, no prior experience with the latter program is needed. Many of my examples entail
multi-group comparisons such as intervention vs. control conditions. All topics will be
approximately half lecture and half hands-on experience where you will actually conduct the
analyses. Data for the hands on portion will be provided. However, I highly encourage
participants to bring their own data and attempt the exercises using them. Among the advanced
topics to be discussed are: (a) measurement invariance; (b) latent growth modeling; (c)
multilevel SEM; (d) latent interaction terms; (e) latent class analysis; and (f) missing data
procedures in SEM. Software required for this course is Mplus.

                        “Testing Interactions with Linear Regression”
                          Dr. Herman Aguinis, Indiana University


Herman Aguinis is the Dean’s Research Professor and Professor of Organizational Behavior and
Human Resources at Indiana University’s Kelley School of Business. He has been a visiting
scholar at universities in the People’s Republic of China, Malaysia, Singapore, Argentina,
France, Spain, Australia, Puerto Rico, and South Africa. He is the author of Performance
Management (2nd edition, 2009), Applied Psychology in Human Resource Management (7th
edition, 2011, with W.F. Cascio), and Regression Analysis for Categorical Moderators (2004,
Guilford); and has edited two others, including Opening the Black Box of Editorship (2008, with
Y. Baruch, A.M. Konrad, & W.H. Starbuck). Also, he has published more than 70 articles in
Academy of Management Journal, Academy of Management Review, Journal of Applied
Psychology, Organizational Behavior and Human Decision Processes, Personnel Psychology,
and elsewhere. He is a Fellow of the Association for Psychological Science, the American
Psychological Association, and the Society for Industrial and Organizational Psychology, and is
the recipient of several recognitions and awards including the Academy of Management
Research Methods Division Robert McDonald Advancement of Organizational Research
Methodology Award (2009), the University of Colorado Denver Best Researcher of the Year
Award (2004), the Academy of Management Research Methods Division Advancement of
Organizational Research Methodology Award (2001), and the Journal of Organizational
Behavior Best Article of the Year Award (1996). He has served the Academy of Management as
Chair of the Research Methods Division, Program Chair of the Iberoamerican Academy of
Management, and elected member of the Executive Committee of the Human Resources
Division. He served as Editor-in-Chief of Organizational Research Methods (2005-2007),
currently serves as guest Co-Editor for a special issue of Journal of Management on “bridging
micro and macro research domains,” and serves or has served on the editorial board of fourteen
journals.

Course Summary

The goal of this workshop is to provide a review and update regarding the estimation of
moderating (i.e., interaction) effects using multiple regression. The workshop will include
theoretical/conceptual issues and hands-on demonstrations on the following topics: (a) a review
of multiple regression, (b) definition of a moderating effect, (c) interpretation of moderating
effects, and (d) a review of the latest research on factors known to affect the power of multiple
regression to estimate moderating effects (e.g., range restriction, heterogeneity of error variance).
The hands-on/demonstration portion of the workshop will include the following topics: (a) how
to use computer programs (with an emphasis on SPSS and EXCEL) to estimate moderating
effects with multiple regression, (b) how to use computer programs (provided at the workshop
and available on-line) to assess violation of assumptions that bias the moderator test, to obtain
alternative statistics to the F-test when assumptions are violated, and to estimate the statistical
power of a moderator test. Software required for this course is Excel, SPSS, and JAVA.
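The moderator test in the hands-on portion boils down to adding a product term and testing the increment in R-squared. A sketch with simulated data (the workshop itself uses SPSS and Excel; all numbers here are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300

# Hypothetical data: z moderates the x -> y relationship via the x*z term.
x = rng.normal(size=n)
z = rng.normal(size=n)
y = 0.4 * x + 0.2 * z + 0.3 * x * z + rng.normal(size=n)

def r_squared(X, y):
    """R^2 from an OLS fit of y on X (X includes the intercept column)."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    return 1 - resid.var() / y.var()

ones = np.ones(n)
X1 = np.column_stack([ones, x, z])            # main effects only
X2 = np.column_stack([ones, x, z, x * z])     # add the product term

r2_1, r2_2 = r_squared(X1, y), r_squared(X2, y)

# Hierarchical F-test for the R^2 increment (one added predictor, df1 = 1).
df2 = n - X2.shape[1]
F = (r2_2 - r2_1) / ((1 - r2_2) / df2)
print(round(r2_2 - r2_1, 3), round(F, 1))
```

A significant F for the product term is the standard evidence for a moderating effect; the power issues listed above concern how hard that F is to detect in field data.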

                     *“Survey Design/Data Collection Using the Internet”
                            Dr. Jeff Stanton, Syracuse University


Jeffrey M. Stanton, Ph.D. (University of Connecticut, 1997) is Associate Dean for Research and
Doctoral Programs in the School of Information Studies at Syracuse University. Dr. Stanton’s
research focuses on organizational behavior and technology, with his most recent projects
examining how behavior affects information security and privacy in organizations. He is the
author, with Dr. Kathryn Stam, of the book The Visible Employee: Using Workplace Monitoring
and Surveillance to Protect Information Assets Without Compromising Employee Privacy or
Trust (2006, Information Today, ISBN: 0910965749). Stanton has published more than 60
scholarly articles in top peer-reviewed behavioral science journals, such as the Journal of
Applied Psychology, Personnel Psychology, and Human Performance. His work also appears in
Computers and Security, Communications of the ACM, the International Journal of Human-
Computer Interaction, Information Technology and People, the Journal of Information Systems
Education, and Behaviour & Information Technology.

Dr. Stanton is an expert psychometrician with published works on the measurement of job
satisfaction and job stress, as well as research on creating abridged versions of scales and
conducting survey research on the Internet; he is on the editorial board of Organizational
Research Methods, the premier methodological journal in the field of management. Dr. Stanton
is an associate editor at the journal Human Resource Management. Dr. Stanton’s research has
been supported through more than ten different grants and awards including the National Science
Foundation’s prestigious CAREER award. Dr. Stanton’s background also includes more than a
decade of experience in business–both in established firms and start-up companies.

In 1995, Stanton worked as a human resources analyst for Applied Psychological Techniques, a
human resource consulting firm based in Darien, Connecticut. His projects at this firm included
the development, implementation, and assessment of a performance appraisal system,
development of a selection battery for customer service representatives, and the creation of a job
classification and work standards system for over 350 positions in the public utilities industry.
Dr. Stanton also worked as a human resources consultant for HRStrategies, Inc., as a statistical
consultant for the Connecticut Department of Mental Health, and in management and
engineering positions for Inpho Inc., AKG Acoustics Inc., and the Texet Corporation.

Course Summary

The Internet provides a range of powerful methods for collecting social science research data.
Thousands of researchers around the world have taken advantage of the flexibility and reach of
email and the web to deliver research materials to participants and collect their responses. Yet
there are numerous pitfalls in Internet-based research and many studies have ended up with small
samples, poor response rates, low-quality data, and research ethics disasters. This three-day Short
Course provides all of the tools, techniques, and insights you will need to conduct a worthwhile,
methodologically sound research study using the Internet. The software will be all web based
open source software.

*Biography and Course Summary from 2009

      *“Alternatives to Difference Scores: Polynomial Regression & Response Surface
                                    Methodology”
               Dr. Jeff Edwards, University of North Carolina (Chapel Hill)


Jeffrey R. Edwards is the Belk Distinguished Professor of Organizational Behavior and Strategy
at the Kenan-Flagler Business School at the University of North Carolina at Chapel Hill. He was
previously Professor of Organizational Behavior and Human Resource Management at the
University of Michigan Business School and Associate Professor of Business Administration at
the Darden Graduate School of Business at the University of Virginia. He holds a B.A. in
psychology and economics from the University of North Carolina at Chapel Hill and an M.S. and
Ph.D. in organizational psychology and theory from the Graduate School of Industrial
Administration at Carnegie Mellon University. He is past editor of Organizational Behavior
and Human Decision Processes, has served as associate editor for Organizational Behavior and
Human Decision Processes, Organizational Research Methods, the Journal of Organizational
Behavior, and Management Science, and has served on the editorial boards of the Academy of
Management Journal, the Journal of Applied Psychology, Personnel Psychology, Organizational
Research Methods, the Journal of Organizational Behavior, the Journal of Management, the
Journal of Occupational Health Psychology, and Social Indicators Research. He has been
elected to various positions in the Academy of Management, including representative at large of
the Organizational Behavior Division and representative at large, program chair, and division
chair of the Research Methods Division. He is also founder and coordinator of RMNET, the
electronic question-and-answer discussion group for members of the Research Methods
Division. He is a Fellow of the Academy of Management, the American Psychological
Association, the Society of Industrial and Organizational Psychology, and the Center for the
Advancement of Research Methods and Analysis (CARMA), and has been elected to the Society
of Organizational Behavior. He has also received the Distinguished Career Award from the
Research Methods Division of the Academy of Management.

Professor Edwards’ research and teaching focus on person-environment fit in organizations,
stress, coping, and well-being, the work-nonwork interface, and methodological issues in
organizational research. His methodological work has examined difference scores, polynomial
regression, and measurement and construct validation using structural equation modeling. His
work has been published in the Academy of Management Review, the Academy of Management
Journal, the Journal of Applied Psychology, Personnel Psychology, Organizational Behavior
and Human Decision Processes, Human Relations, the Journal of Organizational Behavior,
Psychological Methods, and Organizational Research Methods. He has taught courses in
undergraduate, MBA, doctoral, and executive education programs on topics such as
organizational behavior, individual and organizational change, stress management, employee
involvement, human resource management, and research methods. His research methods course
has won awards at the school and university levels. He has served as an instructor and consultant
for Alcoa, Burlington Industries, ExxonMobil, General Electric, General Motors,
GlaxoSmithKline, Johnson & Johnson, Kaiser Permanente, Misys Healthcare, Quintiles,
SonyEriccson, Wachovia, W.C. Bradley, Westinghouse, Whirlpool, and the U.S. Department of

Course Summary

For decades, difference scores have been used in studies of fit, similarity, and agreement in
organizational research. Despite their widespread use, difference scores have numerous
methodological problems. These problems can be overcome by using polynomial regression and
response surface methodology to test hypotheses that motivate the use of difference scores.
These methods avoid problems with difference scores, capture the effects difference scores are
intended to represent, and can examine relationships that are more complex than those implied
by difference scores.

This short course will review problems with difference scores, introduce polynomial regression
and response surface methodology, and illustrate the application of these methods using
empirical examples. Specific topics to be addressed include: (a) types of difference scores; (b)
questions that difference scores are intended to address; (c) problems with difference scores; (d)
polynomial regression as an alternative to difference scores; (e) testing constraints imposed by
difference scores; (f) analyzing quadratic regression equations using response surface
methodology; (g) difference scores as dependent variables; and (h) answers to frequently asked
questions.
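Topics (d) and (f) can be sketched in miniature with made-up data (illustrative only, not the course's empirical examples): fit the quadratic polynomial regression and read off the response surface slopes and curvatures along the congruence (X = Y) and incongruence (X = -Y) lines.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400

# Hypothetical fit data: the outcome depends on the squared difference
# (X - Y)^2, the pattern a squared difference score is meant to capture.
X = rng.normal(size=n)
Y = rng.normal(size=n)
out = -1.0 * (X - Y) ** 2 + rng.normal(scale=0.5, size=n)

# Quadratic polynomial regression: out on X, Y, X^2, X*Y, Y^2.
D = np.column_stack([np.ones(n), X, Y, X**2, X * Y, Y**2])
b, *_ = np.linalg.lstsq(D, out, rcond=None)
b0, b1, b2, b3, b4, b5 = b

# Response surface features: slope and curvature along the congruence
# line X = Y ...
slope_cong = b1 + b2
curv_cong = b3 + b4 + b5
# ... and along the incongruence line X = -Y.
slope_incong = b1 - b2
curv_incong = b3 - b4 + b5

print(round(curv_cong, 2), round(curv_incong, 2))
```

For a true squared-difference effect, the surface is flat along X = Y and curves sharply downward along X = -Y; checking whether the unconstrained coefficients actually obey such patterns is the constraint-testing idea in topic (e).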

*Biography and course summary from 2009
                        CARMA 2010 Summer Short Courses
                                    Program II
                         Wayne State University, Detroit, MI
                             June 17 – June 19, 2010

         *“Nonlinear Dynamic Models: Neural Network & Agent Based Analysis”
                       Dr. Paul Hanges, University of Maryland


Paul J. Hanges is Professor of Industrial/Organizational Psychology and is currently the
Associate Chair/Director of Graduate Studies for the University of Maryland’s (UMD)
Psychology Department. He is also an affiliate of UMD’s R. H. Smith School of Business
and the Aston Business School (Birmingham, England). He received his Ph.D. from the
University of Akron in 1987. His research focuses on testing and strategic human resource
management, diversity and organizational climate, cross-cultural leadership, and
mathematical/computational modeling. He has published over 60 articles and book chapters as
well as one book. Paul’s publications have appeared in such journals as Advances in Global
Leadership, American Psychologist, Applied Psychological Measurement, Applied Psychology:
An International Review, Journal of Applied Psychology, Journal of International Business
Studies, Psychological Bulletin, and The Leadership Quarterly. He is on the editorial board of
the Journal of Applied Psychology and The Leadership Quarterly and a fellow of the American
Psychological Association, the Association for Psychological Science, and the Society for
Industrial/Organizational Psychology.

Course Summary

The Nonlinear Dynamic Models Short Course is designed to expose researchers to two common
analysis techniques used to develop and test dynamic models. Specifically, the following topics
will be covered: (a) Introduction to dynamic models and an explanation for why they are needed
in the Organizational Sciences, (b) Explanation of neural network analysis, (c) Discussion and
interpretation of neural network analysis using SPSS 16, (d) Introduction to agent-based models
and explanation of their utility in developing dynamic hypotheses, (e) Downloading and
installing shareware agent-based modeling software, (f) Explanation of the software and
instruction on programming it, and (g) Comparison of the two techniques. This Short Course
combines lecture with hands-on experience with both techniques. Software for this course is
SPSS 16.
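To give a flavor of what an agent-based model is (the course uses SPSS and dedicated shareware ABM software; this toy Python example is only illustrative), here is a minimal opinion-averaging model whose aggregate behavior, convergence toward consensus, emerges from a simple agent-level rule:

```python
import random

random.seed(3)

# A minimal agent-based model (illustrative only): agents repeatedly
# average opinions with a randomly chosen partner, a simple local rule
# that drives the whole group toward consensus.
n_agents, n_steps = 50, 2000
opinions = [random.uniform(0, 1) for _ in range(n_agents)]

def spread(ops):
    """Range of opinions: max minus min."""
    return max(ops) - min(ops)

before = spread(opinions)
for _ in range(n_steps):
    i, j = random.sample(range(n_agents), 2)
    mean = (opinions[i] + opinions[j]) / 2
    opinions[i] = opinions[j] = mean      # both agents move to the average

after = spread(opinions)
print(before > after)   # the opinion spread shrinks over time
```

The point of the exercise is the hallmark of agent-based modeling: a system-level regularity (consensus) is generated by, not programmed into, the individual agents' rules.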

*Biography and course summary from 2009
                        “Repeated Measures/Longitudinal Research”
                      Dr. Robert Ployhart, University of South Carolina


Robert E. Ployhart is an Associate Professor of Management & Moore Research Fellow in the
Darla Moore School of Business at the University of South Carolina. He is also the Academic
Director for the Master of Human Resources program. He received his Ph.D. from Michigan
State University and M.A. from Bowling Green State University. His research focuses on
staffing, personnel selection, recruitment, staffing-related legal issues, and applied statistical
models such as structural equation modeling, multilevel modeling (HLM/RCM), and
longitudinal modeling. Rob has published numerous articles in outlets such as the Academy of
Management Journal, Journal of Applied Psychology, Journal of Management, Personnel
Psychology, Organizational Behavior and Human Decision Processes, and International Journal
of Selection and Assessment. He has also coauthored two books: Staffing Organizations with Ben
Schneider and Neal Schmitt, and Situational Judgment Tests with Jeff Weekley.

Rob is an Associate Editor for the Journal of Applied Psychology, and has previously served as
an invited Editor for Organizational Research Methods and an invited Associate Editor for
Organizational Behavior and Human Decision Processes. He has served on the editorial boards
of six scientific journals. Rob received the Best Paper Award in 2006 from the Journal of
Management, and has received other awards from the Society for Industrial and Organizational
Psychology and the Human Resource Division of the Academy of Management. His research
has been funded by both private and public organizations.

Course Summary

This Short Course will discuss theoretical, methodological, measurement, and statistical issues
unique to repeated measures and longitudinal designs. Topics include how to conceptualize
change and design studies to test different forms of change, how to evaluate the adequacy of
measures in longitudinal contexts, and how to analyze longitudinal data correctly. Students will
learn how to analyze repeated measures/longitudinal data using the repeated measures General
Linear Model, and be introduced to Random Coefficient Modeling and Structural Equation
Modeling. These discussions will include a treatment of growth modeling.

Course Outline & Objectives

Day 1
1. Review of General Linear Model (GLM)
2. Theoretical & Design Issues
3. Repeated measures GLM

Day 2
1. Trend Analysis & Growth Curve Modeling
2. Random Coefficient Growth Curve Modeling

Day 3
1. Intro to Growth Modeling in Structural Equation Models
2. Additional Analyses/Analyze Your Own Data/Open Discussion
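As a minimal illustration of the growth-modeling ideas in the outline above, the following sketch (using hypothetical data and variable names) fits a separate linear trajectory to each person's repeated measures, which is the building block that random coefficient growth models elaborate on:

```python
# Sketch: fit an individual linear growth trajectory (intercept + slope)
# to each person's repeated measures. Data and names are hypothetical.

def fit_line(times, scores):
    """Ordinary least squares for score = b0 + b1 * time."""
    n = len(times)
    mt = sum(times) / n
    ms = sum(scores) / n
    b1 = sum((t - mt) * (s - ms) for t, s in zip(times, scores)) / \
         sum((t - mt) ** 2 for t in times)
    b0 = ms - b1 * mt
    return b0, b1

# Four measurement occasions (time 0..3) for three hypothetical people.
data = {
    "p1": [2.0, 2.5, 3.1, 3.4],
    "p2": [1.0, 1.9, 2.8, 4.1],
    "p3": [3.0, 3.0, 3.2, 3.1],
}
times = [0, 1, 2, 3]

# Person-specific growth rates, then their average across people.
slopes = {pid: fit_line(times, ys)[1] for pid, ys in data.items()}
mean_slope = sum(slopes.values()) / len(slopes)
```

In a random coefficient growth model, these person-specific intercepts and slopes become random effects estimated jointly rather than in separate regressions.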

                                  “Social Network Analysis”
                            Rich DeJordy, Northeastern University


Rich DeJordy is a PhD candidate (ABD) at Boston College, where he researches network,
identity, and institutional mechanisms of social conformity. Having studied and worked with Steve
Borgatti, Rich has taught Social Network Analysis for the University of Essex Graduate
Summer School in Social Science Data Analysis since 2006, was an inaugural instructor at the
LINKS Center Summer SNA Workshop in 2008, and has provided specialized workshops in
SNA to academic, government, and corporate clients throughout the world. He has presented at
the International Network for Social Network Analysis (INSNA) Sunbelt conference and is currently
developing a methodological variation for collecting cognitive social structure (CSS) data. He
has published a number of articles in refereed management and methods journals, as well as
several book chapters.

Course Summary

This course is a beginner’s tutorial on the concepts, methods, and data analysis techniques of social
network analysis. It begins with a general introduction to the distinct goals and perspectives of
network analysis, followed by a practical discussion of network data, covering issues of
collection, validity, visualization, and mathematical/computer representation. We then take up
the methods of detection and description of structural properties such as centrality, cohesion,
subgroups, cores, roles, etc. Finally, we consider how to frame and test network hypotheses. An
important element of this workshop is that all participants learn to use the UCINET 6 and
NetDraw network analysis software packages. The software required for this course is UCINET.
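To give a flavor of the structural properties mentioned above, here is a minimal sketch, on a hypothetical friendship network, of one of the simplest such measures: normalized degree centrality (UCINET computes this and many other indices from the same kind of adjacency data):

```python
# Sketch: normalized degree centrality on a tiny hypothetical network.
# Degree centrality counts each actor's direct ties, normalized by the
# maximum possible number of ties (n - 1).

edges = [("Ann", "Bob"), ("Ann", "Cara"), ("Bob", "Cara"), ("Cara", "Dan")]

# Build an undirected adjacency list from the edge list.
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

n = len(adj)
centrality = {node: len(nbrs) / (n - 1) for node, nbrs in adj.items()}
```

Here Cara, tied to all three others, gets the maximum score of 1.0, while Dan, with a single tie, gets 1/3.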

       “Analysis of Limited Dependent Variables: Logit, Tobit and Related Models”
                     Dr. Harry Bowen, Queens University of Charlotte


Harry P. Bowen received his Ph.D. in Economics from the University of California at Los
Angeles, with specializations in international economics, econometrics and finance. He received
his undergraduate degree (summa cum laude) in Economics from the University of California-
San Diego. He is internationally recognized for his seminal work on testing international trade
theory and the determinants of trade patterns. His most recent research focuses on the strategic
scope decisions of firms within the context of international business and methods for analyzing
moderation hypotheses in nonlinear models. Fundamentally an empiricist, he is forever
interested in methodological issues and analytic techniques. He is currently Associate Professor
of Management and W.R. Holland Chair of International Business and Finance at the McColl
School of Business, Queens University of Charlotte.

Course Summary

This short course will expose researchers to the theory and application of techniques designed to
model a dependent variable that takes a limited number of (usually) discrete outcomes. Such
techniques encompass the common binary logit and probit models, as well as models for
censored dependent variables such as the Tobit, and models for truncated dependent variables
including the Heckman “self-selection” model. The emphasis of the course will be on gaining an
understanding of when and how the various models should be applied, their basic statistical
foundations, and how to analyze and interpret the results obtained from such models. The course
is aimed at students and faculty who have only a cursory exposure to limited dependent variable
models and techniques.
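As a hedged illustration of the binary logit model mentioned above, the sketch below estimates a two-parameter logit by gradient ascent on the log-likelihood, using hypothetical data; dedicated packages (e.g., Stata, R, SAS) fit the same model via maximum likelihood in a single command, so the loop only exposes the mechanics:

```python
import math

# Sketch: binary logit, P(y=1 | x) = 1 / (1 + exp(-(b0 + b1*x))),
# estimated by gradient ascent on the log-likelihood. Hypothetical data.

x = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
y = [0,   0,   1,   0,   1,   0,   1,   1]   # outcome tends to 1 as x grows

b0, b1 = 0.0, 0.0
rate = 0.05
for _ in range(10000):
    g0 = g1 = 0.0
    for xi, yi in zip(x, y):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
        g0 += yi - p          # gradient w.r.t. intercept
        g1 += (yi - p) * xi   # gradient w.r.t. slope
    b0 += rate * g0
    b1 += rate * g1

def predict(xi):
    """Predicted probability that y = 1 at a given x."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
```

A probit model differs only in replacing the logistic link with the normal CDF; Tobit and Heckman models extend the same likelihood machinery to censored and truncated outcomes.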

                                 “Multi-Level Measurement”
                           Dr. James LeBreton, Purdue University


Dr. James M. LeBreton is an associate professor of Psychological Sciences at Purdue University.
He earned his Ph.D. in industrial/organizational psychology with a minor in statistics from The
University of Tennessee. Prior to joining the faculty at Purdue, James spent 5 years at Wayne
State University. His research focuses on personality assessment and the application of
personality assessment in personnel selection and work motivation contexts. During the last 10
years he has been involved in the development and validation of several measures designed to
assess implicit personality. His current research involves the measurement of
aggressive/antisocial personality characteristics and how those characteristics can be used to
enhance our understanding of important behavior in organizations (e.g., job performance,
counterproductive behavior, decision making, test faking). James also conducts research
involving the development and application of new statistics and research methods designed to
improve the decisions made by organizational scholars. His current projects involve 1) statistical
techniques for determining the relative importance of predictors in regression models, 2) the
development and evaluation of statistics designed to measure interrater agreement and reliability,
and 3) statistical techniques for analyzing longitudinal and multilevel data.

In 2009 James was awarded the Early Career Award from the Academy of Management’s
Research Methods Division and the Center for the Advancement of Research Methods and
Analysis. James serves on the editorial boards for Journal of Applied Psychology, Journal of
Management, Organizational Research Methods, and Journal of Business and Psychology.
Beginning July 1st, James will serve a 3-year term as an Associate Editor at
Organizational Research Methods.

Course Summary

This course is aimed at faculty and students who are relatively new to multilevel theory,
measurement, and analysis. It will review basic issues associated with the development and
testing of multilevel theories. Although the focus will be on issues pertaining to multilevel
measurement (e.g., multilevel constructs, multilevel construct validation, aggregation and
composition models), we will also discuss general issues associated with multilevel theory and
multilevel analysis. Specific topics will include:

1) Multilevel Theory & Multilevel Modeling: Constructs, Inferences, and Composition Models
2) Multilevel Measurement: Aggregation, Aggregation Bias, & Cross-Level Inference
3) Multilevel Measurement: Estimating Interrater Agreement & Reliability
4) Multilevel Measurement and Multilevel Modeling: Analyzing Composite Variables in
Hierarchical Linear Models
     - Examples using HLM software
     - Examples using SPSS software
5) Time permitting: Multilevel Measurement and Multilevel Modeling: Accounting for
Measurement Error via Latent Variable Analysis
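As a minimal sketch of the interrater agreement topic in item 3 above, the widely used rwg index compares the observed variance of a group's ratings to the variance expected under a uniform "no agreement" null distribution (the ratings and scale below are hypothetical):

```python
# Sketch: rwg interrater agreement for a single item on a 5-point scale.
# rwg = 1 - (observed variance / variance of a uniform null distribution).
# Ratings below are hypothetical.

ratings = [4, 4, 5, 4, 3]      # one group's ratings of one item
A = 5                          # number of scale anchors (1..5)

mean = sum(ratings) / len(ratings)
s2 = sum((r - mean) ** 2 for r in ratings) / (len(ratings) - 1)

sigma2_eu = (A ** 2 - 1) / 12  # uniform-null variance: (A^2 - 1) / 12

rwg = 1 - s2 / sigma2_eu       # 1 = perfect agreement, 0 = none
```

Values of rwg near 1 are typically taken as support for aggregating individual ratings to a higher level of analysis, which connects this topic to the aggregation and composition models in items 1 and 2.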
