I recently came across a new website from Thomson Scientific called sciencewatch. It is an interesting attempt to provide researchers with lists of hot papers, topics, countries and institutions, all in terms of citations (bibliometrics), nicely self-contained within a neatly branded website.
Sciencewatch seems to have started a few months ago and is based on results from Thomson's Essential Science Indicators, which provide quantitative analysis of research performance. The website contains lists of hot researchers, topics, emerging fields of research and top trends in subject areas, and covers a range of science fields such as materials science, physics, medicine and chemistry.
I personally believe that you can infer a lot from these statistics if you look over long time periods (i.e. more than a few years; better still, about a decade). By the very nature of citations, however, I have to question the principle of sciencewatch pushing this analysis to detect 'emerging' topics or 'hot' researchers (though there may be some level of hypocrisy on my part here).
Many of the lists that define hotness on sciencewatch, such as the hottest researchers, look over a two-year time period or even less. This causes a few problems: a two-year period is probably just when your paper is starting to generate citations. To give a quite noddy explanation: Thomson may add your paper to its database around 5-6 months after publication (this is variable, of course). If someone then finds your paper by searching Thomson's database and duly cites it, it could be another six months before that citation is tagged in Thomson's database. So it could take a year at minimum after publication for your paper to start gaining citations. If a paper is generating citations much sooner than that, it is probably due to 'self-citations'.
My problem is that the rankings which try to define something as hot do not provide a very transparent method; at least, I have no idea how they defined these hot papers or topics. On the other hand, if you look at the list of top countries in physics, also provided on sciencewatch, you find a table with the total number of papers, the total number of citations, and citations per paper over the period 1997-2007 for each country in the top 20. This is very transparent: a nice long time period and a clearly defined table; indeed, you can even start to draw your own conclusions. In my view, citations per paper is probably a good indication of quality. Although the US is highest in terms of total citations, Switzerland is number one if you look at citations per paper: although it publishes only around a tenth of the papers the US does, these are, on average, cited more.
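Out of curiosity, the gap between the two rankings is easy to reproduce. Below is a minimal Python sketch with entirely made-up figures (not the actual sciencewatch numbers); only the pattern, one country with a huge output and another with a smaller but better-cited one, mirrors the table described above.

```python
# Hypothetical figures for illustration only; not the real sciencewatch data.
# Each entry: (total papers, total citations) over some fixed window.
countries = {
    "Country A": (2_000_000, 10_000_000),  # huge output
    "Country B": (200_000, 1_600_000),     # a tenth the papers, better cited
    "Country C": (500_000, 3_000_000),
}

# Ranking by total citations rewards sheer volume of output.
by_total = sorted(countries, key=lambda c: countries[c][1], reverse=True)

# Ranking by citations per paper is a rough per-paper quality proxy.
by_cpp = sorted(countries, key=lambda c: countries[c][1] / countries[c][0],
                reverse=True)

print("By total citations: ", by_total)
print("By citations/paper: ", by_cpp)
for c, (papers, cites) in countries.items():
    print(f"{c}: {cites / papers:.2f} citations per paper")
```

Country A tops the first list and Country B the second, exactly the US/Switzerland pattern.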
For example, I just came across the hottest researchers for 2006-2007; the top three are all physicists, each with 12 'hot' papers. Taking it further, the top three are all high-energy physicists. Particle physics is renowned for its large author lists; could this just be due to the authors belonging to large collaborations, with many papers and thus many self-citations?
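One crude way to make the self-citation question concrete: call a citation a self-citation if the citing and cited papers share at least one author. Here is a toy Python sketch with invented data (my own simplification, not sciencewatch's definition); with a collaboration of hundreds of authors, almost any citation from within the field would satisfy this test.

```python
# Toy data, invented for illustration: each paper maps to its author set,
# plus a list of (citing, cited) pairs.
authors = {
    "paper1": {"alice", "bob", "carol"},
    "paper2": {"bob", "dave"},
    "paper3": {"erin"},
}
citations = [("paper2", "paper1"), ("paper3", "paper1"), ("paper1", "paper2")]

def is_self_citation(citing: str, cited: str) -> bool:
    """A crude definition: the two papers share at least one author."""
    return bool(authors[citing] & authors[cited])

self_cites = sum(is_self_citation(a, b) for a, b in citations)
print(f"{self_cites}/{len(citations)} citations are self-citations")
```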
Most of the analysis of new hot researchers, such as the "rising stars," uses rolling two-month intervals to judge the output of researchers, ranking those "that have achieved the highest percentage increase in total citations" over a two-month interval. This sort of thing reminds me of fantasy football, something I used to do as a teenager (not that long ago...): you pick a team and then, week in, week out, after they have played, the players get ranked on goals scored, assists etc., and you build up points for your team. I couldn't help but notice the similarity; maybe you could start a physics dream team, put together a squad of 10 researchers and see how they compare month in, month out, though I doubt such a thing would catch on with today's youths.
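To make the "rising stars" metric concrete, here is a minimal sketch under my own assumptions: that the ranking is simply the percentage change in each researcher's cumulative citation count between two snapshots two months apart. The names and numbers are invented. It also shows why short intervals are noisy: a researcher with a small citation base leaps ahead on a handful of new citations.

```python
# Invented snapshot data: total citation counts for each researcher at the
# start and end of a two-month interval.
totals_before = {"Researcher A": 400, "Researcher B": 50, "Researcher C": 1200}
totals_after  = {"Researcher A": 440, "Researcher B": 70, "Researcher C": 1260}

def pct_increase(name: str) -> float:
    """Percentage increase in total citations over the interval."""
    before, after = totals_before[name], totals_after[name]
    return 100.0 * (after - before) / before

# Rank by percentage increase: Researcher B wins with only 20 new citations,
# because of a small starting base, while C's 60 new citations barely register.
rising = sorted(totals_before, key=pct_increase, reverse=True)
for name in rising:
    print(f"{name}: +{pct_increase(name):.1f}%")
```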
Looking at the hot papers, a new 'hot' paper listed in March (which happened to be published in 2006) is a paper in Reviews of Modern Physics on electronic structure calculations. It is well known that review articles are highly cited, but does that make them 'hot'? Review papers are supposed to consolidate an area of research rather than present new results, so I wouldn't say they contain hot research.
It is also confusing that sciencewatch has separate sections entitled "emerging research fronts," "fast moving fronts," and "hot topics," which all give different results for what is emerging, fast moving or hot in physics; I don't quite get the subtle difference between them. These are also updated on a monthly basis, and I am not sure how such things can change rapidly from one month to the next. One emerging research field in physics given in February 2008 is 'environment-induced sudden death', apparently.
A website such as this serves a useful purpose, even if at the moment it seems a little confused about what data to show and how to analyze it (definitely the trickier part). Citation analysis over short intervals is a tricky business, and comparing researchers and topics is difficult to do using only a search program run over a database.
Sunday, 23 March 2008
1 comment:
look at http://xstructure.inr.ac.ru/htopics2008.htm