
America’s Best Colleges: The influence of the U.S. News college rankings

Robert J. Morse, Director of Data Research, U.S. News. Presented at IREG-3, Institute of Higher Education, Shanghai Jiao Tong University, Shanghai, China, October 30, 2007.

America’s Best Colleges: The Editorial Philosophy Behind the Rankings
- To help consumers, prospective students and their parents, make informed choices about:
  - an expensive investment in tuition, room & board, etc.; the cost is around $200,000 or more in some cases
  - a one-time career decision
- Provide the public and prospective students with an understanding of the latest trends in higher ed.
- Practical advice on many aspects of attending, financing and applying to college; one part of our ongoing reporting on educational issues.
- Transparency in how we do the rankings.

America’s Best Colleges 2008 Edition Controversy
- There has been a great deal of controversy and media coverage about the Best Colleges rankings over the last few months.
- As a result, hundreds of articles have been written in local and national media about the U.S. News rankings and how some college presidents and other academics view them.
- College presidents and others being against the rankings is nothing new.

America’s Best Colleges 2008 Edition Controversy
- On May 10, 2007, 12 U.S. college presidents released a letter saying that they would no longer participate in the peer assessment/reputation component of the America’s Best Colleges rankings, and they asked others to follow their lead.
- The Education Conservancy (http://www.educationconservancy.org/) and its founder Lloyd Thacker are helping push this movement forward. (The number of signers is now around 65.)
- The letter also said that these schools and others that signed it would commit to no longer promoting the U.S. News rankings on their web sites and elsewhere.

America’s Best Colleges 2008 Edition Controversy
The U.S. News response to the presidents’ letter has been to say:
- We take any criticism seriously.
- The reputation for undergraduate academic excellence helps graduates get that all-important first job.
- Where someone has gone as an undergrad does play a key role in getting into graduate school.
- The peer survey allows us to measure the “intangibles” that aren’t covered by the statistical data.

America’s Best Colleges 2008 Edition Controversy
- Those complaining about the peer survey say they don't know about their competitors. We say: if you don't know enough about the other schools, don't rate them.
- We are asking the experts in a profession to rank their competitors. In other words, we believe that presidents, provosts and deans of admissions do know a lot about their competitors, since they are leaders in higher education.

America’s Best Colleges 2008 Edition Controversy
- If enough college presidents refuse to participate, which is surely their right, we will consider surveying other experts, like high school college counselors, who have shown an interest in the past in participating.
- The rankings are not published for college presidents.
- If there were comparable outcome measures available, we would collect them and use them.
- U.S. News has no intention of ceasing to publish the Best Colleges rankings.

America’s Best Colleges 2008 Edition Controversy
- On September 7, 2007, another independent group of 19 college presidents (https://cms.amherst.edu/news/statements/node/21784/) from colleges with the top rankings in the U.S. News liberal arts colleges category (Amherst, Williams, Swarthmore, etc.) signed a letter with a much different tone. Among other things, it:
  - did not mention the peer survey
  - said it was important to make their data public and to develop new types of data
  - said they would work with U.S. News to make the rankings better

U.S. News: Impact on the Best Colleges rankings of the boycott
- Kaplan Test Prep's annual survey of admissions officers at 322 top colleges and universities, conducted in July/August 2007, gives insight into the boycott of U.S. News & World Report's annual peer assessment survey by a small number of college presidents.
- Of the 322 schools surveyed, 93 percent said they had participated in U.S. News & World Report's 2008 edition college rankings guide this year, whose results were published on usnews.com on August 17, 2007.

U.S. News: Impact on the Best Colleges rankings of the boycott
- Among schools that participated in the rankings this year, Kaplan found that 97 percent plan to participate again next year and in the foreseeable future.
- Kaplan found that 59 percent of participating schools said that college ranking reports are ultimately beneficial to students during the application process.
- 57 percent of the colleges said that the U.S. News rankings provide a fair assessment of schools.
- What does this mean for U.S. News?

U.S. Commission on the Future of Higher Education’s Findings
- The U.S. Secretary of Education’s Commission on the Future of Higher Education, in its final report in September 2006, called for a “robust culture of accountability and transparency…a consumer-friendly database, with useful, reliable information…coupled with a search engine to weigh and rank comparative institutional performance.”
- The report calls for measuring student learning outcomes and increased data collection on easily understood quality and cost indicators.

U.S. News and Commission on the Future of Higher Education’s Findings
- U.S. News has consistently been in favor of schools producing outcome and accountability indicators and developing ways to measure learning.
- So far such indicators are being produced by only a few schools and generally aren’t comparable between schools.
- It’s important that the data be comparable and produced in a way that is easily understood by consumers.
- Some are calling this the “U.S. Government rankings.” It seems that higher education in the U.S. is strongly against this idea. Why? One size fits all; measuring things that can’t be measured; data burden. The strength of U.S. higher education has been its diversity, and this plan would inhibit that.

U.S. News rankings: Impact
- The Best Colleges rankings are part of a growing accountability movement: colleges and grad schools have increasingly had to account for and/or explain actions undertaken, funds expended, and how students and graduates perform and learn.
- A factor in the spread of the assessment movement in the U.S.: the National Survey of Student Engagement (NSSE) was started to be a counterweight to U.S. News.

U.S. News rankings: Impact
- Prospective applicants and enrolled students have become active consumers and have been given much more information with which to make independent judgments.
- This has resulted in higher quality data: the Common Data Set, and IPEDS collecting more consumer data and posting it on the IPEDS College Navigator (http://nces.ed.gov/collegenavigator/). The Secretary’s Commission wants to improve and greatly expand the College Navigator so it can do semi-rankings.

- The National Association of Independent Colleges and Universities, a group of 940 private U.S. colleges, in September 2007 started a web site, the “University and College Accountability Network” (http://www.ucan-network.org/): profiles of schools, but no searching across them.
- The National Association of State Universities and Land-Grant Colleges and the American Association of State Colleges and Universities will soon start the Voluntary System of Accountability (VSA), another common data template with more promise, covering all major public universities in the U.S. (http://www.nasulgc.org).
- The Association of American Universities (the 62 largest U.S. research universities) is making progress on its own, different effort to provide consumer data.

U.S. News rankings: Impact …more data available

U.S. News rankings: Impact
- Rankings and consumer guides have resulted in colleges reporting higher quality and more consistent data to the public.
- At least in the U.S., this rising consumer empowerment has led to the “commoditization” of college data: a lot of college data is now available. The difference is to what degree individual publishers use this information to develop unique and useful information.

U.S. News rankings: Impact …more
- Created a competitive environment in higher education that didn’t exist before. Some college presidents say that this competition makes everyone better and helps students.
- An annual public benchmark for academic performance: moving up the rankings has become a goal of some college presidents, boards and deans. College presidents and boards can say, if they move up, that their policies have worked and they have made “real measurable progress.”

U.S. News rankings: Impact
- Do rankings “make” school administrators do the wrong thing? Do school administrators make policy choices for the sole purpose of doing better in the rankings, rather than for the good of students and learning? Is this good or bad?
- Rankings have filled a large void caused by greatly reduced high school college counseling resources at public schools. Parents and students are increasingly left to fend for themselves to find out about colleges and the admissions process, and in many cases have turned to U.S. News as a trusted source of advice and planning.

U.S. News rankings: Impact …more
- Created a “new class of elite schools” (Robert Samuelson).
- The new elites would be Duke (NC), MIT (MA), and Washington U. in St. Louis (MO).
- The “old elites” didn’t have enough room for all the qualified students, given growing U.S. and world populations.
- The “old elite” schools, the “Ivies” like Harvard, Yale, Princeton, and Stanford, still exist.

U.S. News Rankings Perspective and Impact
- Some call the rankings a case of extreme and unintended consequences. It’s true that there have been some unintended consequences.
- On the other hand, it can be argued that the rankings’ time has come, and they are now at the forefront of higher education discussions in the U.S. and around the world.
- The biggest issue in academia surrounding rankings is still the most basic one: can complex institutions be numerically ranked?

Future of Rankings
- Rankings are here to stay.
- Rankings are now a worldwide phenomenon.
- Rankings will continue to evolve on a country-by-country basis.

Future of Rankings
- Rankings are now being done or studied for various reasons:
  - Consumer guides
  - Public policy: benchmarking and education policy
  - Academics study them as a discipline or for education/public policy reasons
- Rankings, more and more, are becoming a positive force around the world.
- Rankings have found their place in the 21st century as a tool that can be used for consumers, assessment, accountability, peer analysis, and as a public benchmark to compare education performance and to set goals.

Morse Code: Inside the College Rankings
- On June 1, 2007, U.S. News started a blog called Morse Code: Inside the College Rankings (http://www.usnews.com/blogs/college-rankings-blog/index.html).
- Morse Code provides deeper insights into the methodologies and is a forum for commentary and analysis of college, grad and other rankings.
- Future plans for this blog.

Why are the U.S. News Rankings Helpful to Consumers?
- Based on accepted measures of academic quality.
- Provide comparable, easy-to-read and accessible statistical information on a large number of colleges.
- U.S. News’ ranking process is totally independent of information produced by a college or university through view books or other materials that students or their parents receive in the mail.
- U.S. News has become a trusted and respected unbiased source of college assessments.

Appropriate Use of Rankings
- As one tool in the college application process.
- U.S. News stresses in many of its editorial products that students and parents should also consider: cost, location, course offerings, alumni network, the state of a school’s facilities, placement success, visiting the school, faculty, input from counselors and parents, and personal fit.

Appropriate Use of Rankings
- Research and feedback suggest this is how students use the rankings.
- Feedback to U.S. News indicates readers value the directory, college comparison and searches on the usnews.com web site.
- Available higher education studies show that applicants, on average, do use rankings appropriately: the 2006 UCLA Freshman Survey.

UCLA Freshman Survey: Fall 2006
Reasons noted as “very important” in influencing students’ decision to attend this particular school (11 out of 20):
1. College has very good academic reputation: 57.4%
2. College’s graduates get good jobs: 49.3%
3. Wanted to go to a college this size: 38.9%
4. A visit to the campus: 38.2%
5. Offered financial assistance: 34.3%
6. The cost of attending this college: 32.2%
6. College has a good reputation for social activities: 32.2%
8. Grads get into good grad/professional schools: 30.2%
9. Wanted to live near home: 18.3%
10. Information from a web site: 17.0%
11. Rankings in national magazines: 16.4%
12. Relatives wanted the school: 11.6%
(2006 freshman norms based on 271,441 students at 393 baccalaureate colleges and universities)

U.S. News rankings: A few basics
- If the survey is not returned with the latest fall 2006 IPEDS enrollment included, the school receives Footnote 1: “School declined to fill out U.S. News survey.”
- Estimates aren’t printed, but published as N/A.
- The following default and estimate protocols were used for the rankings published in August 2007.
- If a school doesn’t respond to the survey and has its Common Data Set or other comparable data posted on its site, we will use the data from the school’s site and footnote this.

The Sub-factor formulas
- Z student selectivity = Z test score * (50%) + Z high school class standing * (40%) + Z acceptance rate * (10%)
- Z grad and retention = Z avg. 6-yr. grad rate * (80%) + Z avg. freshman retention rate * (20%)
- Z faculty resources = Z avg. faculty salaries * (35%) + Z % faculty with terminal degree * (15%) + Z % faculty full-time * (5%) + Z student-faculty ratio * (5%) + Z % classes < 20 * (30%) + Z % classes 50 or more * (10%)

The Overall Score formula: National Univ. and National Liberal Arts categories
- Z academic reputation * (25%) + Z alumni giving * (5%) + Z financial resources * (10%) + Z student selectivity * (15%) + Z graduation and retention * (20%) + Z faculty resources * (20%) + Z grad rate performance * (5%) = each school’s total weighted Z-score
- Z = each school’s Z-score for that variable
- Overall score for school X = school X’s total weighted Z-score / the highest weighted Z-score of a school in X’s category, rounded to the nearest whole number
- Ranking model for the rankings published 8/17/2007.
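The overall-score arithmetic above can be sketched in a few lines of Python. The weights are those stated for the National Universities and National Liberal Arts categories; the school z-scores in the test are hypothetical, and the function names are illustrative, not U.S. News code:

```python
# Published category weights from the slide above.
WEIGHTS = {
    "academic_reputation": 0.25,
    "graduation_and_retention": 0.20,
    "faculty_resources": 0.20,
    "student_selectivity": 0.15,
    "financial_resources": 0.10,
    "grad_rate_performance": 0.05,
    "alumni_giving": 0.05,
}

def total_weighted_z(z_scores):
    """Sum of each indicator's z-score times its weight."""
    return sum(WEIGHTS[name] * z for name, z in z_scores.items())

def overall_scores(schools):
    """Scale so the category leader scores 100; round to whole numbers."""
    totals = {name: total_weighted_z(z) for name, z in schools.items()}
    top = max(totals.values())
    return {name: round(100 * t / top) for name, t in totals.items()}
```

A school whose weighted z-scores are everywhere half the leader's ends up with an overall score of 50.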

U.S. News Process to Rank Colleges
- The universe of schools eligible to be ranked is 1,421 regionally accredited 4-year colleges that enroll first-time, first-year, degree-seeking undergraduate students; 80 of these were “unranked.”
- Colleges are classified into categories using the 2006 “Basic” Carnegie Classification of Institutions of Higher Education by the Carnegie Foundation.
- Data is gathered and analyzed on up to 15 indicators that measure academic quality.
- Weights are assigned to these 15 indicators.
- Schools are ranked against their peers in their category, based on their overall weighted scores.

U.S. News Ranking Factors
- Peer Assessment
- Graduation Rate Performance
- Retention
- Alumni Giving Rate
- Faculty Resources
- Student Selectivity
- Financial Resources

One Perspective on the U.S. News Indicators
Inputs: Faculty Resources, Student Selectivity, Financial Resources, Peer Assessment

Outputs: Retention, Alumni Giving Rate, Graduation Rate Performance

Weights for National Universities and Liberal Arts Colleges
- Peer Assessment: 25%
- Graduation and Retention: 20%
- Faculty Resources: 20%
- Student Selectivity: 15%
- Financial Resources: 10%
- Graduation Rate Performance: 5%
- Alumni Giving: 5%

Weights for Universities-Master’s and Baccalaureate Colleges
- Peer Assessment: 25%
- Retention: 25%
- Faculty Resources: 20%
- Student Selectivity: 15%
- Financial Resources: 10%
- Alumni Giving: 5%

Z-scores in the U.S. News Best Colleges Ranking Process
- Step one: create a z-score for each indicator in each U.S. News category using the data from the appropriate schools.
- Step two: the percentage weights that U.S. News uses are applied to the z-scores.
- The weighted z-scores for each school are summed.
- Overall score for each school = the school’s total weighted z-score / the highest total weighted z-score for a school in that U.S. News category.
- The top school’s overall score in each category = 100; other overall scores are sorted in descending order.

America’s Best Colleges: What’s a Standardized Value or “Z-score”
- Formula to standardize data before weighting.
- Z-score = (X - U)/SD
- X = each school's data point for that particular indicator, in that category only.
- U = the average of that indicator for that year of reported data in the school’s category.
- SD = the standard deviation of that indicator for that year of reported data in the school’s category.
- Re-scaling for negative values: negative z-scores on individual indicators have not been re-scaled since 2002.

America’s Best Colleges: “Z” Score Example
- 90 = a school's top-10% high school class standing
- 60 = the average top-10% among colleges reporting in that category
- 5 = the standard deviation of top-10% high school class standing in that category
- (90 - 60)/5 = 30/5 = 6, the “Z” score or standardized value used for weighting
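The standardized-value formula from the previous slide as a one-line helper (the function name is illustrative):

```python
def z_score(x: float, mean: float, sd: float) -> float:
    """Standardized value used for weighting: (X - U) / SD."""
    return (x - mean) / sd

# The worked example from the slide: a top-10% standing of 90 against a
# category mean of 60 and standard deviation of 5 standardizes to 6.
example = z_score(90, 60, 5)
```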

Peer Assessment
- Why?
  - The reputation for excellence helps graduates with jobs and with further education in graduate school.
- How?
  - Surveys of educators: the president, provost, and dean of admissions at each school. They rate only schools within their U.S. News category, such as National Universities.
  - They rate a school’s academic quality of undergraduate program on a scale from 1 (“marginal”) to 5 (“distinguished”), with a “don’t know” option.

Academic Peer score
- Average peer score = the total of the 1-to-5 ratings given for that school / the number of respondents that rated that school in that category.
- Surveys are conducted in the winter/spring prior to publication.
- Estimates or defaults: none are used for this variable.
- In the 2008 edition the response rate was 51%; the survey was conducted in spring 2007.

Student Selectivity
- Why?
  - The abilities and ambitions of students influence the academic climate of the school.
- How?
  - Test scores (average SAT/ACT) (50%)
  - High school class standing: the percent of enrolled students who graduated in the top 10% or top 25% of their high school class (40%)
  - Acceptance rate (accepted/applied) (10%)
  - Fall 2006 entering class data was used.
High School Class Standing
- From the current year’s USN survey.
- If not reported on the current year’s USN survey, use last year’s USN survey.
- If not reported for either year, then the estimate = one standard deviation less than the category’s mean.
- If the percent submitting H.S. class standing is < 20% or null, then use the estimate. A footnote appears if the percent submitting is less than 50. Note: this is a change from < 34% in the 2007 edition.

HS Class Standing
Changes were made in how we estimate HS class standing when a school has a small percent submitting high school class rank. The estimate used: if the percent submitting is below the threshold, then 75% of the actual HS class standing submitted is used in the ranking model. For example: if the HS submit rate = 18% and Top 10% = 50%, then .75 x .50 = .375 (the value of HS standing used in the ranking model).
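A minimal sketch of that adjustment, assuming the 20% submission threshold stated on the previous slide (the function name and argument handling are illustrative):

```python
def hs_standing_for_model(top10_share, pct_submitting, threshold=0.20):
    """Credit only 75% of the reported top-10% figure when too few
    students submitted a high school class rank."""
    if pct_submitting is None or pct_submitting < threshold:
        return 0.75 * top10_share
    return top10_share
```

With an 18% submit rate and a 50% top-10% figure, the model uses 0.375; a school above the threshold keeps its full reported value.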

SAT/ACT scores
- Determine which score is used in admissions from the SAT/ACT policy questions on the CDS. If both are required or either is accepted, use the percent submitting SAT and ACT to determine the most frequently used.
- Use the average of Math & Critical Reading, if reported. If the average is not reported, estimate it using the midpoint of the 25th and 75th percentile distribution. The same is done for the ACT composite score.
- Scores are converted to the SAT or ACT national percentile distribution.
- Example: 600 CR = 79th percentile and 600 Math = 74th percentile; (79 + 74)/2 = 153/2 = 76.5. That 76.5 percentile score is used in the model for the z-score calculation. The same is done for the ACT.
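The percentile conversion can be sketched as follows; the lookup tables here are stubs holding only the example's values, not the real national distributions:

```python
# Stub percentile tables with only the slide's example values.
CR_PERCENTILE = {600: 79.0}    # Critical Reading score -> national percentile
MATH_PERCENTILE = {600: 74.0}  # Math score -> national percentile

def sat_percentile(cr_score: int, math_score: int) -> float:
    """Average of the two sections' national percentiles, as used in
    the z-score calculation."""
    return (CR_PERCENTILE[cr_score] + MATH_PERCENTILE[math_score]) / 2
```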

SAT/ACT scores-continued
- If test scores are not required and the percent submitting is < 50%, then the SAT/ACT percentile value is multiplied by .9.
- Schools are asked whether scores are reported for all internationals, all minorities, all student athletes, all legacies, all students admitted under special circumstances, and all summer enrollees. If yes or N/A, the SAT/ACT percentile value is used as is; if no or null for any of them, the SAT/ACT percentile value is multiplied by .9.
- If the SAT/ACT is not reported on this year’s USN survey, use last year’s score.

SAT/ACT scores-continued
- If null for both years, then the estimate is one standard deviation less than the mean for the category.
- Note: the .9 reduction roughly reduces SAT scores by 10% of the total combined SAT percentile distribution for ranking purposes. That means 80% x .9 = 72% for purposes of the ranking model. This does not change the score published in the ranking tables.

Acceptance rate
- Acceptances/applications; then take the inverse to create the rejection rate.
- If not reported on the current year’s USN survey, data is used from last year’s USN survey.
- If both years’ data are unavailable, then use the estimate of one standard deviation less than the category’s mean.

Retention
- Why?
  - A measure of how satisfied students are with a school.
  - To assess whether a school is providing the courses and services students need for timely graduation.
- How?
  - Average freshman retention rate for classes entering fall 2002-2005 (20% of the category).
  - Average six-year graduation rate for classes starting 1997-2000 (80% of the category).

Average Freshman Retention
- Average freshman retention rate: the average of retention rates for classes entering 2002 to 2005.
- If fewer than 4 years are reported, a footnote indicates this.
- Previous years’ survey data is used for three of the years.
- If no information is reported, the estimate is 46 + .54 x the average 6-year graduation rate.
- If the average 6-year graduation rate is blank, then the estimate is one standard deviation less than the category’s mean.
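The estimate chain above can be sketched as (function and argument names are mine, not U.S. News'):

```python
def estimate_retention(avg_grad_rate_6yr, category_mean, category_sd):
    """Estimate freshman retention when a school reports none.

    Uses the slide's formula 46 + .54 * average 6-year graduation rate;
    if that rate is also blank, fall back to one standard deviation
    below the category mean."""
    if avg_grad_rate_6yr is not None:
        return 46 + 0.54 * avg_grad_rate_6yr
    return category_mean - category_sd
```

An 80% average graduation rate implies an estimated retention of 89.2%.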

Average Graduation rate
- Average 6-yr. graduation rate: the average of six-year graduation rates for the cohorts of students entering from 1997 to 2000.
- If fewer than 4 years’ rates are used for the average, a footnote indicates this.
- If a school is a non-responder and didn’t return its U.S. News statistical survey, data from previous years is footnoted.
- NCAA and IPEDS data are footnoted.
- National Universities and Liberal Arts: the most recent graduating class’s 6-yr rate is printed on the ranking tables.

Average Graduation rate
- For NCAA Div. I, II, and III schools, the NCAA-reported rate was compared with what the school reported to USN for the 1998 and 1999 cohorts; for 1997 entering cohorts with no NCAA rate, IPEDS was used. If not an EXACT match, the NCAA/IPEDS rate was substituted for the USN rate for that year.
- If no USN rate is available, then the NCAA/IPEDS rate is used.
- If no NCAA/IPEDS rate, then the previous year’s USN data is used.
- If no NCAA/IPEDS or USN rate, but freshman retention is available, an estimate is made from freshman retention: estimate = 1.3 x freshman retention - 44.2.
- If no data at all, then the estimate is one standard deviation less than the category’s mean.
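The whole default chain might look like this sketch (argument names are illustrative; None marks an unavailable source):

```python
def grad_rate_for_model(usn, ncaa_or_ipeds, prior_usn, retention,
                        category_mean, category_sd):
    """Pick the graduation rate fed to the ranking model, in the
    priority order described on the slide."""
    if usn is not None and (ncaa_or_ipeds is None or usn == ncaa_or_ipeds):
        return usn
    if ncaa_or_ipeds is not None:
        return ncaa_or_ipeds              # substituted when USN doesn't match
    if prior_usn is not None:
        return prior_usn                  # previous year's USN data
    if retention is not None:
        return 1.3 * retention - 44.2     # estimate from freshman retention
    return category_mean - category_sd    # last-resort estimate
```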

Faculty Resources
- Why?
  - To measure the nature of student-faculty interaction.
  - To assess the quality and commitment of a school’s faculty.
- How?
  - Class size: most small and fewest large classes (less than 20 students, 30%; 50 or more students, 10%)
  - Faculty salaries adjusted for cost of living (35%)
  - Proportion of faculty with top degree in field (15%)
  - Student-faculty ratio (5%)
  - Percent of faculty that is full-time (5%)
  - Data is for the 2006/2007 academic year.
Faculty Resources-class size
- % < 20 class size: the number of classes with fewer than 20 students / the total number of classes.
- % 50-or-more class size: the number of classes with 50 or more students / the total number of classes; then take the inverse to find the % that are not 50 or more.
- Defaults: if not reported in the current year, data reported last year to USN is substituted.
- If there is no data for the current or previous years, then an estimate is used.
- The estimate is one standard deviation less than the mean, or half the mean if the standard deviation is greater than the mean.

Faculty salaries
- Average faculty salary including fringe benefits: professor, associate and assistant ranks (not instructor rank), averaged over the two most recent years.
- If only one year is available, that year’s data is used.
- Cross-checked with AAUP faculty salary data; if the average didn’t match, AAUP was used. If no USN data, then AAUP is used.
- If no AAUP, then use the estimate of one standard deviation less than the category’s mean, after the cost-of-living adjustment.
- Salary is first adjusted for cost of living using the Runzheimer International 300 City/Metro Area Index (family of 4, $60,000 income level). If there is no metro area index, the statewide average is used. The index was not changed for the 2008 edition.
- 35% of faculty resources.

% Faculty Top Terminal Degree
- The number of full-time faculty with a terminal degree / the total number of full-time faculty.
- If the data is not reported on the current year’s survey, then last year’s survey data is used.
- If there is no data from last year, then an estimate is used.
- The estimate is one standard deviation less than the category’s mean.

Student-faculty ratio
- Student-faculty ratio: self-reported by the school using the Common Data Set definition.
- The value standardized is the category’s maximum minus the school’s student-faculty ratio.
- If data for the current year is not available, then last year’s student-faculty ratio is used.
- If there is no data, the estimate is one standard deviation less than the category’s mean.

Percent of faculty that is full-time
- Calculation: full-time faculty / (full-time faculty + (33.3% x part-time faculty)), from the U.S. News survey.
- If faculty data is not reported on the current year’s survey, then last year’s data is used.
- If last year’s data is unavailable, use the estimate of one standard deviation less than the category’s mean.
- If a school says it is 100% full-time, we double-check the likelihood of that claim. If the claim seems unlikely (a research or large university with no part-time faculty), then the estimate is used.
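The full-time percentage formula, as a sketch (the function name is illustrative):

```python
def pct_full_time(full_time: float, part_time: float) -> float:
    """Share of faculty that is full-time; each part-timer counts as
    33.3% of a full-time position, per the slide's formula."""
    return full_time / (full_time + 0.333 * part_time)
```

With no part-timers the share is exactly 1.0; 30 part-timers alongside 90 full-timers count as roughly 10 extra positions, pulling the share down to about 0.90.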

U.S. News & World Report’s Financial Resources Calculation
- Fiscal years 2005 and 2006 data were used in the 2008 edition.
- Private colleges and universities; IPEDS Finance Total Expenses column.
- Education expenses = ((Research + Public Service) x percent of full-time equivalent enrollment that is undergraduate) + Instruction + Academic Support + Student Services + Institutional Support.
- Education expenses per student = education expenses / total full-time equivalent enrollment.

U.S. News & World Report’s Financial Resources Calculation
- Fiscal years 2005 and 2006 data were used in the 2008 edition.
- Public colleges and universities; IPEDS Finance Total Expenses column.
- Education expenses = ((Research + Public Service) x percent of full-time equivalent enrollment that is undergraduate) + Instruction + Academic Support + Student Services + Institutional Support + Operations/Maintenance.
- Education expenses per student = education expenses / total full-time equivalent enrollment.

U.S. News & World Report’s Financial Resources Calculation
- IPEDS Finance public and private reporting rules are different: O&M, scholarships, depreciation. The new GASB rules are now required.
- Full-time equivalent enrollment = (total full-time undergrads + total full-time post-baccalaureate) + .333 x (total part-time undergrads + total part-time post-baccalaureate).
- Percent full-time equivalent undergrads = full-time equivalent undergrads / full-time equivalent enrollment.
- Education expenses per student are averaged over the two most recent years, fiscal 2005 and fiscal 2006.
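The FTE and expense arithmetic above, sketched with hypothetical figures (function and argument names are mine):

```python
def fte_enrollment(ft_ug, ft_grad, pt_ug, pt_grad):
    """Full-time equivalent enrollment: part-timers weighted at .333."""
    return (ft_ug + ft_grad) + 0.333 * (pt_ug + pt_grad)

def education_expenses(research, public_service, pct_fte_undergrad,
                       instruction, academic_support, student_services,
                       institutional_support, ops_maintenance=0.0):
    """Education expenses per the slide's formula; pass ops_maintenance
    only for public institutions."""
    return ((research + public_service) * pct_fte_undergrad
            + instruction + academic_support + student_services
            + institutional_support + ops_maintenance)
```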

Financial Resources: Calculation details
- After calculating each school’s education expenses per student (adjusted for research and public service), we applied a logarithmic transformation to the spending per full-time equivalent student. This was done for all schools.
- That transformed value was then standardized before the 10% weight for financial resources was applied.

Why use the natural log of expenses per student rather than simply expenses per student?
- A small number of schools fall outside 2 SD of the mean of this parameter. Replacing the parameter by its log doesn’t change the distance between values for the overwhelming majority of cases within 2 SD of the mean.
- The transformation does reduce the value of the few outliers beyond 2 SD of the mean, reducing their impact in the ranking model. It doesn’t change their place (they are still the leaders), but it does reduce the contribution of this one indicator to the school’s overall score.
- This corresponds to what U.S. News and many in higher education believe about the effect of spending on education quality: beyond a certain level, an increase in spending does not lead to a proportionate increase in quality.

Graduation Rate Performance
- Why?
  - An outcome measure of the school’s role in the academic success of its students.
  - Does the school over- or under-perform with respect to 6-year graduation rates?
  - Used only in the National Universities and Liberal Arts categories, not in others.
- How?
  - Measured as the difference between the expected and actual six-year graduation rate for the class starting in 2000 and graduating by 2006.

Predicted Graduation Rate Details
- Regression model used:
  - Dependent variable: 6-year graduation rate.
  - Independent variables: high school class standing, standardized test score, financial expenditures, % receiving Pell Grants, and institutional control.
  - Independent variables are taken for the corresponding cohort, with expenditures during the first 4 years of the cohort.
  - If the 6-year grad rate is unavailable, the average 6-year grad rate of the previous 3 years is used.
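The over/under-performance calculation described above amounts to a least-squares regression followed by a residual. This sketch uses numpy and fabricated data, not the actual U.S. News model or coefficients:

```python
import numpy as np

def grad_rate_performance(X: np.ndarray, actual: np.ndarray) -> np.ndarray:
    """Actual minus predicted 6-year graduation rate (the residual).

    Each row of X holds one school's predictors, e.g. high school class
    standing, test score, expenditures, % Pell, institutional control."""
    A = np.column_stack([np.ones(len(X)), X])        # add intercept column
    coef, *_ = np.linalg.lstsq(A, actual, rcond=None)
    predicted = A @ coef
    return actual - predicted                        # over/under-performance
```

A positive residual means the school graduates more students than its inputs predict; a negative one means it under-performs.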

Alumni Giving Rate
- Why?
  - A rough proxy for how satisfied graduates are with their alma mater.
- How?
  - The average percentage of alumni with undergraduate degrees who contributed in the most recent two-year period; graduate degrees are excluded.
  - The alumni giving rate is calculated separately for the two most recent years and then averaged: undergraduate alumni donors / undergraduate alumni of record.
  - Years 2004/2005 and 2005/2006.

Alumni Giving Rate
- If only one year is reported on the USN survey, then that one year’s rate is used instead of the two-year average.
- If not reported, then last year’s survey is used.
- If the data is unavailable for both years, then Council for Aid to Education data is used.
- If there is no USN data and no C.A.E. data, then one standard deviation less than the category’s mean is used as an estimate.


								