
I am curious to find out whether the six/eight journal list provided by
Dennis Galletta, President, AIS has been approved by the AIS membership
(or AIS leadership), or is it based on the views of select individuals?

What do others think of this list?

Prashant Palvia
UNC - Greensboro



It is not an "AIS" list but a list proposed by the Senior Scholars who meet annually after ICIS.



First: hello! I hope you are doing well and enjoying life! It has been
some time since we've last seen each other.

Next: Clearly, from my reading of the letter that Dennis published on
the AIS website, this list came from a rather small group of senior
folks. That is not necessarily a good or a bad thing. It just is what
it is -- the opinion of a select group of talented and experienced
folks. At least, that was my interpretation of its genesis.



Some of the SIGSAND members (including me) will be filing firm but courteous protests about
the list. I have included the note that was sent by the SIGSAND President to SIGSAND
members. My personal view is that impact factors should govern the weight of a journal, and
individual schools should come up with their own lists based on interests and situational factors.
AIS can serve to provide data such as impact factors, although these are already available. This
list will affect our group's tenure track faculty and may affect those who are applying for
promotion to Full.

You have a group with a large number of members. I guess members should be asked to send
notes to Dennis since he has invited comments.
XXX: I would ask SIGHCI to consider if they will be affected because it seems they will.
SIGHCI is a large AIS group, and they can have an impact.

Prashant, please look carefully at my note: the list came from the Senior
Scholars. The Senior Scholars' letter also provides details about whose
views are reflected by the list. And I provided a motion that was approved
by AIS Council to provide a link to the Senior Scholars statement from the
AIS web site.

You are implying that we should put the journals up for a "vote" or broader
survey. Well, you are in luck: there are already lists that reflect a large
group of individuals. Many have been published and you can find recent ones
on Carol Saunders' list on AISWorld. Based on widespread
surveys, they reflect a broad base of opinion.

However, one could argue that voters are biased and, in their responses,
favor those journals that they have heard of, or worse yet, those in which
they have published. This could account for the strange rankings that
result from time to time. For example, both ACM and AIS have found that the
more magazine-like "Communications" ranks (perhaps temporarily) more highly
than the much more heavily screened and edited "Transactions" or "Journals."
This is not to disparage the magazine format, but both societies intend for
the Journals and Transactions to be the destination of choice for the
highest quality research, while the Communications are intended for a
broader audience and more lightly edited.

Another problem with a long ranked list is that Deans/departments/schools
can choose their own cutoff point and often there are only 2-3 journals on
the "A" list.

The Senior Scholars list is offered as an alternative, and is especially
targeted to help those with a small number of journals on the "A" list. The
corollary: if your school already has 20 journals on its "A" list, as many
do, then don't use it. If your school only has 2 or 3, then it can
definitely be helpful.

One might argue that a large number of Senior Scholars represent ex-editors
in chief of MISQ and ISR, so they might unduly elevate those journals at the
expense of others. Fortunately, we find that, however tempting it might
have been to them, they have indeed NOT limited their selections to those
two journals.

To your last point, besides yours, I have received only 5 comments in
response to my posting. Thanks to Google Desktop, they were easy to
retrieve. Here is what they said, in order of their receipt (with names
removed). #5 is quite interesting, and I respond to it below:


1. Dear Dennis
First, congratulations on the initiative. I would like to add an issue
related to the rankings of IS top journals and business schools in this regard.
For top business schools, MBA rankings are extremely important, and to
create these rankings the Financial Times, WSJ, and The Economist also have
their lists of top "A" journals. And, unfortunately and again, MISQ and ISR
are the only two used to evaluate IS departments.
Thus, I also think it would be important that these rankings accept the new
list of top "A" journals, if not all of them then at least one or two.

2. Dennis,

Great work! I will make immediate use of this for an external promotion
review I am currently working on. Thank you for producing such an
actionable and credible journal rating.


3. Hi Dennis, the MISQ September 2007 editorial discusses this issue and the
reasons for having such an initiative in more detail. Cheers.

4. Dear Professor Dennis Galletta,
Your email is one of the most transcendent for our IS discipline.
While other disciplines (even the hard Computer Sciences) have around 10-15
journals rated as A level (and still identify the oldest with respect and
tradition, as MISQ could be in IS), in our discipline the highest-ranked
publishing space is limited to 40 papers per year (the capacity of MISQ and ISR).
Any Industrial Engineering researcher could critique our journal ranking
system, which negatively affects the whole IS community. It is imperative that
AIS establish the criteria to classify the journals and then link them to the
ISI Journal Citations. In particular, because you are also on this
committee, I consider that 10 journals qualify in the same cluster (with no
statistical differences, even though the average scores differ):


In particular, the IRMJ has been published since 1988, and the most important
IS researchers (who have also published in MISQ or ISR) have also published
research papers in this journal. Thanks,

5. Dear Dennis:

I read the announcement from AIS regarding the "list" of what the senior
scholars consider "A" journals and this message is in response to that list.

I laud the attempts of the AIS group to move away from specifying just ISR
and MISQ as the top journals. However, I am also very concerned that the
current list completely ignores the "Design Science" and more technical side
of MIS, which often does not have an outlet in any of the journals mentioned
in the current list of 8 journals. I believe the current list is more
focused on organizational and behavioral research than on "Design Science."
The latter is often published in the ACM or IEEE transactions journals.

I would like to see a clearer statement regarding this "bias" in the
current listing. I am afraid that otherwise lists like this can end up hurting
our young untenured faculty and may be detrimental to the field by biasing
it in the direction of purely organizational and behavioral research.

I would like to recommend several alternatives for this list -- including
(a) a clear and strong statement about how this list represents the
organizational and behavioral side of MIS
(b) creating a similar list for the "Design science" type of research in MIS
(c) creating other lists covering all other areas of MIS that are currently
represented by WITS and SIGSAND, for example.
(d) not having any lists that are endorsed by AIS at all -- I would much
rather prefer this approach so that each university and Department is free
to choose its own.

Best Regards


In response to #5, a very thoughtful mailing indeed, the Senior Scholars
chose to focus on general journals, not on specialized ones. One might
argue that a really good design science or technical paper would be accepted
in the top journals. Regarding having specialized journals, some actually
consider our field too specialized to value our journals, and some have said
that really important studies in IS would be published in Mgt Science, AMJ
or AMR. Those not important enough for those journals would end up in our
"niche" journals (i.e., our field's flagship journals). Having our own specialized
journals that are labeled "premiere" might be difficult for deans to
swallow, but I have forwarded that last note to Carol Saunders for
consideration by the Senior Scholars. Therefore, we will look into this issue.

As to option (d), without this list, we will have scores of schools counting
only two or three journals. Departments need an authoritative body for
additional strength in combating a limited list. Just try to get an article
accepted in any of the basket of 8 journals--it is not easy. We do not
believe they should count as zero in dozens of departments worldwide.


As always, you can count on this list to provide marvelous discussions!

I believe that Prashant does have a point. Dennis points out that there
are several broader surveys of journal rankings, however, the letter
from the "Senior Scholars" clearly presents this basket list of
journals as being superior to the lists produced by such research.
Thank goodness we have these "Senior Scholars" to save us from our own
biases, and lucky for us that they have no bias themselves -- as Dennis
indicated is shown by the fact that they provided a list of more than 2
journals. :-)

Seriously though, I agree with both comments #4 & #5 that Dennis
provided. The list does appear biased toward certain types of research
and it does exclude journals that are just as worthy as other journals
that made the list. Without some type of systematic research
methodology, I think the presentation of the list should definitely be
tempered with a clear disclaimer that it represents the opinions of a
very small, homogeneous group. I think it should not appear to be
endorsed by the AIS membership.



I am happy we are just embarking on a very important debate for our discipline.

1. What is the domain of MIS? Is it just the behavioral side or should it include the technical
side also -- as suggested by some friends?

2. Shouldn't the ranking be based on "areas" within MIS? For example, for those
focusing on global IT issues, two prominent journals are JGIM and JGITM. For
those focusing on electronic commerce, there are a few prominent journals like
Electronic Markets and the Journal of Electronic Commerce. For those focusing on
qualitative research based on real-world cases and applications, the good
outlets are JITCAR, MISQ Executive, and CACM.


Dennis –

First, I’d like to commend the Senior Scholars for producing the Senior
Scholar’s letter and AIS for making it available on the web site. It is
very welcome and should be helpful when IS faculty go up for promotion and
tenure, as well as for faculty serving as external reviewers. Thanks for
a job well done.

Second, I take exception to the suggestion in the fourth comment (in
your email) that IRMJ be included in a set of 10 top IS journals. IRMJ is
simply not in the same class as the other journals on the Senior Scholar’s
list, or for that matter many others that didn’t make the list. All of
the published IS journal rankings bear this out; IRMJ typically ranks in
the 30s. This is true whether the ranking study is based on perception
surveys or citation scores (as shown in the Saunders' list on AISWorld) or
from newer methods such as the author affiliation index (Ferratt, CAIS,
2007) or school lists (Templeton, in a letter in response to the Ferratt
article, CAIS, 2007).

Bruce Lewis
Wake Forest University


Dear Dennis:

Let me congratulate you on focusing on critical issues facing our discipline. It was you who
focused on the Enrollment Issue about 6-8 months back.

Ranking of journals is another critical issue.

1. Maybe I missed it, but could you please elaborate on the objective criteria utilized by the senior
scholars in choosing the 8 or 10 journals?

2. Also, it would be a good idea to disclose the names of these scholars -- in the name of
transparency on this list.

3. Without the above information, it appears to me that the list generated by Carol Saunders on
AISWorld, based on widespread surveys (with objective criteria), would be much more credible.


Shailendra Palvia

Bruce, we don't need to try to insult journal editors in public.

A list of six or eight journals obviously does not include everything.
There is no need to single out any one journal that doesn't make it.

We have a journal not on the list, but I would prefer not to be called
out as undeserving, whether I complain or not. As it happens, we did not
complain. Should our journal eventually make the list, we will certainly
not gloat over it nor call attention to journals that didn't.

On the contrary, the list solves a broader problem, namely the lack of
awareness of business school deans about the information systems
discipline and its journals. The judgment of the scholars, Whoever They
might be, should be respected for the moment.

There may be a contingent whose work belongs in IRMJ and that has a
strong audience there. We also have to think about international
opportunities for publication and particular themes that are represented
by individual journals. They are still "valuable" if not "A-list".

Scholarly standards of behavior historically have implied courtesy.
Let's not use the list to be exclusive, but rather to be an expanded
guideline for tenure decisions FOR NOW.

As Dennis pointed out, universities that already have longer lists or
have certain specializations, will already know what journals they
respect. The list is for those in doubt.



With all due respect for the scholars in their attempt to rank these
journals to help administrators in awarding tenure and promotions, I
wish to bring out a current reality. When I joined my school five years
ago, I had to publish papers right away. I did not have the luxury of
living through the long waiting period typical for the top journals. So,
I did not even bother. And, I am not the only one. There are many young
professors who are doing this. The point is that just because a paper
got published in just a so-so journal, its quality is not necessarily
poor. Quality should be measured on its own merit.

Ram Misra


Eleanor, "just the facts ma’am" was my motivation for the comments about
IRMJ. Not to insult, but to point out that the inclusion of IRMJ in a set
of top-tier IS journals (as asserted by the author of the fourth comment
to Dennis) was not supported by the stream of research on journal
rankings. We’re scholars, we should preach what we practice.

I agree with your comment that we should "not use the list to be
exclusive." However, based on the implication in the fourth comment that
the list be expanded by two, research indicates that there are many other
journals (for example Information & Management and IT & People) that
should be considered before IRMJ.




This is a good point. Many journals serve as a starting point for
younger scholars.

With regard to quality being measured on its own merits, regrettably
many of the higher institutional ranks involved in tenure decisions are
not capable of doing this, hence the search for numbers and a certified list.

I don't think the list was meant to exclude other journals from
consideration, but to give at least some indication of what journals in
this discipline constitute an "A" list for deans to at least recognize
as meritorious. I have observed that some cannot even tell if articles
are about Information Systems by the titles, as they have so little clue
about the types of research in the field.

So the list fills an abyss of unknowing and this is a good thing.



The question of justification of ranking is not new for this list.
However, I would like to raise it again in connection with Eleanor's
reply (see below).


Just my 2c on this debate.

I think this is in the back of everyone's mind, but it's probably good to
bring it into the forefront of everyone's mind. The senior scholar list
was not a research tool. It was a political tool designed to help IS
assistant professors get tenure and associate professors get full professorships.

As a political tool, it should not be evaluated using research standards.
Rather, its evaluation should be based on its effect on the research community.

Here's my perhaps overly simple analysis.

Assume an assistant/associate professor who publishes in the basket of
journals in the scholar list. This assistant/associate professor now has
justification for P&T and is better off than before.

Assume an assistant/associate professor who does not publish in the basket
of journals in the scholar list. This assistant/associate professor does
not have additional justification for P&T and is NO WORSE OFF than before.
Some folks brought up alternate P&T systems where just pub count was
important. This list doesn't help those folks either. But it doesn't
harm them.

Do there exist side effects where the scholar list hurts someone? Yes.
Editors of journals who just missed the cutoff decision of the scholars
may get ruffled feathers. However, the social, monetary and political
aspects of the damage are minimal. Within the community, we know that
they just missed the cutoff, and there's good research to show any list
will do this (in bibliometrics this is called the Matthew Effect).
Are there more damaging effects? Perhaps, but I can't think of any.

Should the list be rejected on the grounds it wasn't research? I say no,
because it is a POLITICAL tool, not a research tool. It was definitely
informed by research (I know the folks who came up with this list know the
target journal literature). Also, this list isn't about methodology. It's
a DECISION made by people. Research doesn't make policy decisions.
PEOPLE make political decisions.

So, should the IS community as a whole support this list? Should AIS
endorse this list? I say YES. It doesn't harm us, and it helps a
subsegment of us.

Folks, it is HARD to develop policy that benefits some without harming
others. You see this every day in your own countries where some idiot
politician screws you over. I think we need to ask ourselves whether this
list HURTS us. If so, then there is room for complaint.

Cecil Chua


Cecil, thanks for a beautifully-stated and pragmatic analysis that pulls it
all together very nicely. I think I would really enjoy reading your work
sometime, and would look forward to having a cup of coffee with you at ICIS!

I am happy you picked up on my arguments that if this could harm a
tenure-seeking individual, then he or she can just ignore it. It is
optional, and is meant to broaden, not limit, your choices.

The posting of the list on the AIS website is only days old, but I already
know of three cases that made writing a positive tenure letter much
easier--two of which I wrote.

As a promotion/tenure candidate, besides using your dean's list of journals,
you now have six other tools to show you are publishing meaningful research:

1. Reach journals that are ranked highly on the Saunders' compilation on AISWorld
2. Reach journals rated as excellent by the Senior Scholars
3. Reach journals with high ISI impact scores
4. Obtain many citations for each of your articles
5. Obtain praise from writers of your external letters
6. Have the promotion/tenure committee itself read your articles to see the
inherent quality

Some of these work better than others and some of these are interrelated.
For instance, #6 does not work well if your committee is composed of those
in disciplines that traditionally discount our work. Also, #5 feeds off of
#1 through #4.

Fortunately, the tools do not cancel each other out and promotion/tenure
committees do not examine the list for the most restrictive view possible.
They represent chances to "strut your stuff."

Cecil's well-written idea is a good one--this is an optional exercise that
broadens our choices. I love the "idiot politician" metaphor and will try
to make sure as few people as possible can apply it to me!

Don't forget that cup of coffee, Cecil!




In brief: FOR the list, AGAINST endorsement without explicitly asking the membership.

1) The list is just one list and those who can take advantage of it,
should do so. The list is an OFFER by the Senior Scholars (the AIS?) and
tries to be of some help. As long as it is either helpful or neutral,
one may not be against it.
2) IF the list is helpful or neutral AS IT IS, we do NOT NEED any endorsement.
3) In any case, this may be a good opportunity to clarify what an
endorsement would mean / imply. I doubt that we have a GLOBALLY common understanding.
4) In my eyes, an endorsement only makes sense if
(a) AIS-members think / hope an AIS-ENDORSED list could influence the
lists used by (some) P&T committees (otherwise we do not need an
endorsement) AND
(b) AIS-members (not just AIS Council) WANT such influence from that / a
list (otherwise we do not want an endorsement).
5) If 4a and 4b are the case, an endorsed list would be so important
that I would strongly suggest to ASK MEMBERS (have them vote or at least
put it up on a publicly visible council agenda that people can discuss)
instead of AIS / Council just endorsing THE journal list and adding a
line / stamp on the website.

Claudia Loebbecke


Dear Dennis,

Privately many individuals have told me that this list is arbitrary and
based on the opinions of only a few people. No list is perfect, but I
would rather take a list based on a much larger poll. However, these
people are afraid to speak in public as they can't really speak their minds
against the AIS President or "senior" scholars. That's probably why you
have received only a few responses. In fact, I have been warned to keep
quiet else I may burn my bridges. But I will take my chances for the sake
of an honest and spirited debate. It is my hope that others would speak
up whether they agree or disagree with me.

Sorry, but I see many issues with this list as listed below:

1. First, it includes some lesser-known and lower "perceived quality"
journals. Obviously I cannot name any. (error of inclusion)

2. Second, it excludes some longstanding high-quality journals such as
Decision Sciences (yes, it regularly publishes IS articles) and Information
& Management. (error of exclusion)

3. The list accomplishes a self-fulfilling prophecy. Journals on
the list will receive more and higher quality submissions; others will be
ignored. Thus the list interferes with the free market system, a hallmark
of democratic nations. Because of this artificial intervention, over
time, these journals will become top-tier. This is unfair to all the
other journals and their editors who have worked so hard over the years.

4. Our profession believes in results based on research rigor. Such
results based on a few individuals will never be published in any of the
IS journals, even the journals on this list. Should we not practice what
we preach? I challenge the AIS leadership to come up with a sound process
to develop such a list. This process must be sound and transparent. Such
a list, in my view, must be approved by the AIS membership.

5. You suggest that this list is just one tool among many others.
But why include a tool which is not validated? I am sure there are
hundreds of scholars who can come up with such lists. Would you post
their lists on the AIS web site?

6. Certainly, this list gives the impression of legitimacy and endorsement
by AIS the way it has been posted. Because of this, I believe many
departments will give it more credence than you yourself may have
imagined. Perhaps there should be an explicit statement that this list
is not endorsed by AIS, and it should not even be posted on the site.

This was a tough one to write. But I am genuinely concerned. I also know
many who feel the same way as I do.

If I have offended anyone, my apologies. Maybe I am completely wrong!


I personally think the list is good and serves as an excellent short
list, as I have said in earlier posts. The fact that it includes two
EU/UK journals on the more "critical" side attests to its scholastic and
paradigm diversity.

Please, let's stop arguing about this list. It is short, it is meant to
clarify the IS profession to a group of tenure decision-makers, and
there is nothing to be said against the journals on the list. They all
deserve to be there.

I have no personal interest here, since I am co-Editor-in-Chief of a
generally acknowledged high-quality journal that is not on the list. I
am not bothered by this! I can see the purpose of the list and it serves
a good purpose.

All idealism aside, the list is a valuable decision tool that helps many
tenure-seekers. The only downside is that it is US institutions that
seem to need this, but that accounts for a good many people.

Non-research institutions mostly will not care; non-US institutions will
likely have different criteria.


Eleanor Wynn
CoEditor-in Chief
Information Technology & People


A nice shot, Prashant. You are completely right.

John Wang.


I believe that the concern expressed by Ram below is a genuine and
important one. Remember that these are the junior researchers that this
senior scholar "list" is supposed to help, but at least in Ram's case,
it does not seem to be helping. Whether or not the purpose of the
list is "political", what is the point in having a list that does not
benefit a majority of its intended beneficiaries?
We all know that "lists" based on people's opinion are just that -
subjective. A test-retest reliability analysis of the lists currently
available on AISnet may just prove that! Many of our business school
deans know that, and are increasingly discounting these lists, and
focusing more on researchers' behaviors (citation patterns) rather than
their subjective opinions. Over the last year or so, my own school has
moved away from using these lists for tenure & promotion decisions to
using "research impact" (citation counts) of papers, as captured in ISI
impact scores and/or Google Scholar impact ratings. The ISI score is
tabulated annually and is time-lagged by 2-3 years, hence may not
accurately reflect the impact of more recent articles; while the Google
scores appear to be recomputed every 2-3 weeks or so. Though there tends
to be a correlation between the impact score of a journal and that of its
articles, I have noticed many instances where conference proceedings
articles or unpublished working papers have greater impact than articles
published in top journals. Though the impact score undoubtedly has its
own weaknesses (e.g., literature review and scale development articles in
"popular" research areas tend to be cited more often, and a truly
innovative article in a niche area may not be cited as much),
nevertheless, it does seem to provide a measure of a paper's "quality",
irrespective of its publication outlet, as Ram requested.
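For readers less familiar with how the ISI score mentioned above is tabulated, here is a minimal sketch of the standard two-year journal impact factor calculation (the function name and all the sample figures below are my own, purely for illustration):

```python
def two_year_impact_factor(citations_in_year, items_published, year):
    """Two-year impact factor for `year`.

    citations_in_year: dict mapping publication year -> number of citations
        received in `year` by items the journal published in that year.
    items_published: dict mapping publication year -> number of citable items.
    """
    window = (year - 1, year - 2)
    # Citations in `year` to items published in the two preceding years,
    # divided by the number of citable items published in those years.
    cites = sum(citations_in_year.get(y, 0) for y in window)
    items = sum(items_published.get(y, 0) for y in window)
    return cites / items if items else 0.0

if __name__ == "__main__":
    # Invented figures: 180 + 120 citations to 60 + 40 items -> 3.0
    cites = {2006: 180, 2005: 120}
    pubs = {2006: 60, 2005: 40}
    print(two_year_impact_factor(cites, pubs, 2007))  # 3.0
```

Note the two-to-three-year lag this implies: an article published this year contributes nothing to its journal's score until future years' citations accumulate, which is exactly why the measure may understate the impact of recent work.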

Rather than engage in a meaningless annual ritual of rating journals,
maybe we should focus our efforts on helping junior faculty (1) design
better research, (2) publish in reputed journals, and (3) improve our
review cycles. I know that MIS Quarterly (and maybe a few other journals as
well) is trying to do some of that through the "New Authors' Workshop" at ICIS,
imposing review deadlines, and increasing acceptance counts. Other
possible suggestions may include: (1) reducing the number of reviewers per
paper in top journals from 3 or 4 to 2 (to improve turnaround speed),
given that many comments tend to be similar across reviewers, (2)
negotiating with deans to increase the number of years to achieve tenure
(with the provision of allowing superlative candidates to go up earlier),
(3) being open to accepting a wider range of less-traditional papers. I
encourage the senior scholars to explore strategies that can help boost
both the quality of our research and the reward we get for conducting such research.

Thank you,
Anol Bhattacherjee
University of South Florida


Hi Dennis (and all AISer's),

Greetings from Taiwan!

I have been reading the communications regarding the IS journal list and
am delighted to see two of AIS's journals are in the list (EJIS and
JAIS). Your and the other senior scholars' good intentions and efforts are
much appreciated. I feel deeply that the list is useful to promote our
organization. Nevertheless, my college only recognizes MISQ and ISR as
the top IS journals because they are recognized by FINANCIAL TIMES
during its annual university ranking. But, this should not prevent us
from publishing good papers in the other journals.

Just like you said, this list only provides us one of the different
sources of quality evaluation. I concur with the seven possible sources
you pointed out (see below). A candidate for tenure/promotion evaluation
could select multiple sources to argue his/her case against the other
sources used by the evaluation committee.

1. Reach journals that are ranked highly on your administrator's list of journals,
2. Reach journals that are ranked highly on the Saunders' compilation on AISWorld
3. Reach journals rated as excellent by the Senior Scholars
4. Reach journals with high ISI impact scores
5. Obtain many citations for each of your articles
6. Obtain praise from writers of your external letters
7. Have the promotion/tenure committee itself read your articles to see the inherent quality

Experience tells us that without disciplined training, quality is in the
eyes of the author. I have seen many authors argue that their papers
should be rated as top-quality papers during the personnel process. This
list tells the evaluation committee, whose members might not be in the IS
discipline, that a paper published in these 6-8 journals has outstanding
quality. With the first 4 sources listed above, you did not preclude the
other journals. Perhaps, if you put a disclaimer below the journal list,
it will resolve all the raised issues. Cheers.
Best regards,
* Eldon Y. Li (李有仁), Ph.D., CPIM, CDE


Eldon, thanks for your note. I am sorry to post yet another message, but I must make a correction.

EJIS is not published by AIS. It is published by the OR Society and Palgrave MacMillan.

Here are the others:

ISJ (Information Systems Journal) is published by Blackwell
ISR (Information Systems Research) is published by Informs
JAIS (Journal of the AIS) is published by AIS
JMIS (Journal of MIS) is published by M.E. Sharpe
MISQ (MIS Quarterly) is published by The University of Minnesota
JSIS (Journal of Strategic Information Systems) is published by Elsevier
JIT (Journal of Information Technology) is published by Palgrave MacMillan and the
Association for Information Technology Trust

Some of us believe that AIS should publish more of the field's journals. We are currently
exploring ways in which we can put together a variety of models we can use to increase our
engagement with journals in the field. If and when we have announcements in this regard, I will
(bravely or foolishly) post it on AISWorld.

One more thing--I found that there is an article this year in CAIS that allows for assessment of IS
journal quality using a measure called the "author affiliation index." This could provide another
more quantitative assessment of journals. It seems to me that it avoids the bias problems
mentioned by Prashant and the respondent bias problems that I mentioned. The results are
strikingly similar to those of the Senior Scholars once you eliminate journals outside of our field
(such as Management Science and, yes, Decision Sciences) and specialized journals (such as
Decision Support Systems).

Ferratt, T. W., Gorman, M. F., Kanet, J. J. and Salisbury, W. D. (2007). IS Journal Quality
Assessment Using the Author Affiliation Index. Communications of the AIS, 19, 710-724.

Following this message, I plan to be rather mum on this subject. Sorry for all the emails,
especially the incomplete note "misfire" caused by my belt buckle clicking my Thinkpad's mouse
button while I was typing. It happened almost instantly: I once heard comedian Rich Hall (of
"Sniglets" fame) call this an "ohnosecond."




Another message in private as I don't want to clutter the listserv.

I had sent a note to Dennis describing the potential harm of a list like this
to specific candidates [he did acknowledge it].

Basically, the gist of my argument was as follows:

The argument that optional use of the list can only help and not hurt a
case is a little simplistic. It assumes that there is complete independence
between tenure cases and there is no memory among P&T committees or tenure
decision makers.

Here's how the list can hurt an individual case.

Let's say at University X we have faculty member A and faculty member B
being considered for tenure in the same year, both with tenure-worthy
records but A has published in journals on the list, and B has published in
reputed journals outside this list. The use of the list by A to supplement
his/her case has to hurt B. The same would apply if A and B were to be
considered in successive years.

One last request. Please don't post this to the listserv, but feel free to
share with individuals as you see fit.


Dennis and others have pointed out that when there is a variety of lists, a faculty member
going up for promotion or tenure can select from among the lists to bolster his or her claims. Not
necessarily. Our deans have pushed for short lists of top journals in the field. These lists are to be
considered definitive for promotion, tenure, and research support. One of our faculty members
has already proposed that we adopt the Senior Scholars list from the anonymous powers that be
(the Lords of MIS). Bless schools that allow applicants to present their best cases, as should be
the case with scholarship; but this is not always the case.


One way to bring clarity is to have a list of top IS journals and other lists of top journals in
related fields. Our department has always stressed breadth. If one of our IT faculty members
places an article in a top management science, computer science, management, marketing,
accounting, or psychology journal, we consider this a good thing. Other
departments feel differently.

It is up to us to produce lists of top IS journals. We should leave top journals in other fields off
that list, even if they do publish *some* IS articles. Deans want lists of limited size, and if some
non-IS journals are on the list, that means there are fewer publication outlets of high
standing for our faculty.

Ray Panko


I think that citation analysis is a great thing. (I also believe that citation outside of the scholarly
community is an even greater thing if you consider relevance to the working profession to be a
goal.) However, there are two problems.

The big one is that it takes years for citations to appear. Not too big a problem for promotion to
full professor, but a big one for promotion to associate and for tenure. I believe that requiring
citation analysis for these purposes would verge on the immoral, especially in today's world of
shrinking tenure deadlines. (Of course if you are junior faculty and have a good citation pattern,
go for it.)

The other one is that it is difficult to do. Our field spans several citation services, and even then
those services often miss MIS (pun intended) papers.

The biggest reason to do citation analysis is the Sally Field reason. You can say to the world,
"you really like me; you really do!"

Ray Panko


I must disagree with your assessment of one of your scenarios.

> Assume an assistant/associate professor who does not publish in the basket
> of journals in the scholar list. This assistant/associate professor does
> not have additional justification for P&T and is NO WORSE OFF than before.
> Some folks brought up alternate P&T systems where just pub count was
> important. This list doesn't help those folks either. But it doesn't
> harm them.
You describe the situation of an assistant/associate professor who
does not publish in the basket of journals in the scholars' list. What
if the Dean or the P&T committee takes the stance that the AIS has defined
the "A" list? The assistant/associate professor's
publications would then be discounted because they are not in the journals on
this list. How does this person make a convincing case? The list is
presented by the Senior Scholars as being superior to the published
rankings, making the use of previous research on journal rankings
questionable. I foresee great potential for such a list to be misused,
perhaps through ignorance, to require researchers to focus only on these
behaviorally-oriented, generalist journals. The presentation of this
list as the unbiased list of excellent journals undermines attempts to
justify the use of any other list. The letter from the Senior Scholars
even indicates that the list can be expanded from 6 to 8 without loss of
quality...implying that expanding it beyond those 8 would result in a
loss of quality. The counter argument that has been presented ("if you
don't like this list, don't use it") is of little help. It is like
saying, "your institution may not require quality research".

I know the Senior Scholars had the best of intentions in the creation of
this list. It can be of value to the people that they intended it to
help, but the way it is presented is potentially harmful to a large
group of faculty. The list can still be used as intended even if it is
presented as the opinion of a group of respected researchers, without
the added weight of an apparent endorsement by AIS or by presenting it
as superior to the journal rankings based on research.

Steven A. Morris, Ph.D.


Well stated, Prashant, and right on target. If AIS really cares about academic legitimacy, then
this has been an interesting discussion starter, but the list has no validity and is left wanting a
well-defined process of validation.
Paul Leidig


Dear colleagues,

I have been on the road since last Thursday, so I read with interest the many
postings to ISWorld about the "basket of journals" list in the last few
days. I would like to join the melee with a few observations that might
help you understand how the "basket of journals" came about.

Izak Benbasat and I were charged with chairing the Senior Scholars
Consortium in 2006. We felt that this group represented a significant,
perhaps under-used, resource for the IS discipline. Each year the
Consortium deals with a topic that the group considers to be important.
Recent Issues and Opinions articles in MIS Quarterly and Information
Systems Research reported studies suggesting that the research published
in these two journals, often the only IS journals considered to be
premier by Deans and Promotion and Tenure (P&T) Committees, appeared
paltry compared to the publications of faculty in other business
disciplines in their premier journals. We viewed this as a threat to our
discipline, and especially to untenured faculty members. We decided to
channel the group's efforts to develop a strategy to increase the number
of A+ journal articles in the Information Systems discipline. We
conducted a Delphi study using the inputs of senior scholars and a
carefully selected group of junior scholars. This study is described in
my September 2007 editorial in MIS Quarterly. The
top-ranked strategy recommended by the Senior Scholars (and the second
ranked strategy of the junior scholars) was to recognize at least three
journals as A+ outlets. In my opinion the challenge to implementing
this strategy meant that somehow a group of scholars from our community
would have to agree upon what those journals would be.

A group of esteemed members of our community rose to the challenge.
Their intent was to help the IS community by expanding the list to
include more than two journals. In particular, when they suggested the
list they stated:

"In the 2006 Senior Scholars' Forum we agreed that the AIS can play a
major part in initiating and seeing through this change that will
provide greater opportunities for colleagues to publish, provide a wider
scope of excellence in IS research papers and, in general, improve the
perception of the IS discipline generally.

We think that the view that we put forward here is consistent with the
espoused policy of the AIS. We would like to help the AIS 'make it
happen'. Can we agree to a way forward?"

Having served on many P&T Committees and as a department chair, I can
attest to how important it is to be able to point to some established,
authoritative source for designating a limited number of premier
publications in a discipline. Eldon Li has noted some of the
possibilities (e.g., journal rankings, citation counts, ISI journal
impact factors). Ideally it would be good if the members of P&T
Committees would read and could evaluate the contributions, but I have
found that often the papers aren't read, and if they are read it is
often with the intent of finding reasons why the articles are not of
high quality. Ideally it would be nice to use impact factors and
citation counts, but these are often biased because newer publications
are usually not heavily cited and many good journals are not included in
determining the citation counts (as Ray Panko noted).

Recently, for one of the external reviews that I have written since the list
was posted on the AIS website, I read and commented on the strengths of a
faculty member's publications and used the journal list to support my
assessment of the quality of the articles.
Thus, I have found the list very useful.

I think the debate about this issue on ISWorld is healthy. I think it
also highlights the difficulty of finding a perfect strategy. The
Senior Scholars proposed the list as a way of starting the ball rolling
(i.e., to encourage more than two journals as outlets for top quality
publications). My hope is that the efforts that have been initiated
with considerable work will not prove fruitless because people disagree
with the approach taken or the journals included on the list. Rather, my
hope is that those who are dissatisfied will propose and help enact a
strategy that can take advantage of the work performed to date and can
address the concerns that have been raised. Until this strategy is
implemented, I hope that we can use the current list to the advantage of
faculty in our discipline.
Carol Saunders


Dear Carol:

Thanks for apprising list members of the process -- that is, what went on behind the scenes.

In summary, it is certainly a laudable effort and hopefully a laudable outcome.

If it is possible -- could you please provide more information about all senior and junior scholars
who participated in the Delphi study? As somebody pointed out in the discussion, there may be
serious errors of inclusion as well as exclusion in the list.


Dr. Shailendra Palvia



Prashant, I've been following the email on ISWorld about the journal list. If I may, I'd like to
share my opinion. I don't think the list addresses the root of the problem, which is
that we don't have enough premier journal outlets for IS researchers. Why can't we
create more journals of the same caliber as MISQ and ISR, or add more volumes/issues to
these two journals? This requires more work, but I think it's a better way to approach the
problem. I guess I'm one of your many silent supporters. I appreciate your bringing this
issue to the IS community.


As an Assistant Professor currently up for tenure at a US institution, I thought I would offer my
perspective.

I am grateful for this list. My reality is that despite several attempts to publish interpretive
research in both ISR and MISQ, my papers were rejected. Trained in the UK, I have found it
difficult to break into these two journals, although I keep trying. My record is strong and my
research rigorous and creative (so I have been told). I am not sure that I will be successful with
tenure where the faculty want to see "proof" that my publication outlets are premier.

Personally, this list helps in that it includes two journals in which I have published. It excludes
Information & Organization, where I have another two articles. Overall I appreciate the effort of
the Senior Scholars to broaden our list of top-tier outlets - it encourages me going forward. I was
feeling disheartened by the idea of a career where I was continually having to seek acceptance in
the top two journals in order to legitimize my work. The broadening out to include European-
oriented research is a step in the right direction.

Thank you!

Erica L Wagner, PhD


Dear all,

The list should be prepared by the AIS membership itself; some broad framework may be drawn
up for this. Rather than putting forward a short list of journals for approval, let things proceed
through a formal process.
Lawrence Harold

