Wednesday, March 02, 2011

We Got a Winner

In the competition for the place where you can publish just about anything, the contest is fierce. Energy and Environment has gone to the mat, but there are always JPANDS and the current title holder, the Journal of Scientific Exploration. So, into the innocent young Bunny's email box this flies:

Invitation to submit to The All Results Journals:Chem
To: Eli Rabett

Dear colleague,

Tired of your experiments failing? Been working for months or even years on a project that's not producing the results you expect or desire? My name is David Alcantara and, on behalf of my editorial board, I’d like to invite you to submit your articles to The All Results Journals: Chem, a new journal that focus on publishing the grey literature that have never been published. Our target is to compile and publish the experiments with negative results and their interpretations and solve the current problems that publication bias is causing.

The All Results Journals:Chem is a peer-reviewed journal dedicated to publishing articles with negative results in all areas of Chemistry (pure and applied). The Journal is Total Open Access (no fees to publish and reading) and are being indexed by main scientific databases (Web of Knowledge, Scirus, Pubmed, etc.) certifying maximum exposure of your articles. We expect to publish articles within four to six weeks of submission, and our award-winning OJS Publications Web Editions Platform
will showcase your important findings to the international scientific community.
Ain't these guys ever heard of arXiv? Even Lubos and Oliver "Iron Sun" Manuel publish there. Either this is the best spoof since Denial Depot, or there is something strange going on over there. However, Eli, thanks to Marco, has a wonderful suggestion for a submission, which started with a colloquy between John N-G and Eli:
John, when you say:
Jae- I do sometimes envy certain chemists, who can describe their experimental setup, report on the results of the experiment, and be done with it. Once the experimental setup has been ratified, it's fairly easy to crank out the papers, and there's very little for a peer reviewer to criticize.

There is always the worry that you are not measuring what you think you are, and those errors can be really spectacular. There have been a few of those recently.

[Any examples worth describing? - John N-G]

Marco won the web

John, try this one as an example of what may go wrong in chemistry.

A mislabeled bottle, and you suddenly have yourself a breakthrough!

The ideal submission to All Results Physics:

PHYSICAL REVIEW B 83, 019901(E) (2011)
Retraction: Superconductivity in the Rh-based Heusler family MRh2Sn
[Phys. Rev. B 82, 134520 (2010)]
T. Klimczuk, K. Gofryk, C. H. Wang, J. M. Allred, F. Ronning, W. Sadowski, J.-C. Griveau, E. Colineau, D. Safarik, J. D. Thompson, and R. J. Cava
(Received 2 December 2010; published 4 January 2011)
DOI: 10.1103/PhysRevB.83.019901 PACS number(s): 74.70.Ad, 71.20.Be, 74.25.Bt, 99.10.Cd
Recently we published a paper entitled “Superconductivity in the Rh-based Heusler family MRh2Sn” [Phys. Rev. B 82, 134520 (2010)]. Due to the mislabeling of a rebottled chemical starting material, the superconductors originally reported in this paper as Rh intermetallics are now known to be Pd intermetallics. The basic superconducting properties of the Pd compounds have been previously reported.
And Eli is waiting to hear from the lawyers
This e-mail is from The All Results Journals. The e-mail and any files transmitted with it are confidential and intended solely for the use of the individual or entity to whom they are addressed. Any unauthorized dissemination or copying of this e-mail or its attachments, and any use or disclosure of any information contained in them, is strictly prohibited and may be illegal. If you have received this e-mail in error, please notify or telephone and delete it from your system.
but this post surely fits under fair use.


Rattus Norvegicus said...

Me thinks that Judith needs to publish in the "All Results Journal: Climate". Either that or give up her doctorate.

Marco said...

Oh my, referenced by the Rabett, again!

Note that I don't have amazing google skills, or work in this particular area. Retractionwatch is to blame.

Current top news there: 89 papers from one single author retracted. Ouch!

On another note: the All Results Journals are real enough, no spoof. We'll have to see how it manages the onslaught of nonsense that is likely to be submitted, too.

Thomas said...

This kind of journal can be useful. Say you come up with a clever method that you think will give interesting results. You try it, fix problem after problem, but in the end you get nothing useful. Normally you'd throw this result away, maybe tell some colleague about it. The result is that two years later some other group has the same bright idea, and wastes the same amount of time finding out that it doesn't work.

Had you instead been able to publish an account of what you did and why it didn't work, a simple literature search would have saved that second group a lot of time.

Joel said...

This sort of journal is a positive thing - particularly in fields where someone might do a meta-analysis of the literature on a particular question. Say you're doing an experiment on whether a particular pharmaceutical treatment works, and get a negative result. The drug did no better than a placebo. Well, many journals wouldn't publish that result (it's boring). Noone's* going to cite your paper (after having established that the drug doesn't work, why bother doing a new study?). And publishing would require a bunch of work to write the paper and submit it to a journal, as well as money to pay the publication costs, all of which could be used to do more hopeful studies.

Because of the high degree of noise in a medical study (hey, people can get positive, statistically significant results on *homeopathy*), there will be people who get positive results. End-game: if you look at the literature, homeopathy does better than a placebo, because a positive result is more likely to get published than a negative one. This is known as "publication bias" [1], and it's a serious problem in medical research, as well as in other fields.

*: For modestly accurate values of "noone".
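Joel's publication-bias mechanism can be sketched in a few lines of Python (a toy simulation, not anything from the studies he mentions): run many trials of a treatment with zero true effect, "publish" only the positive, statistically significant ones, and the published literature shows an effect where none exists.

```python
import random
import statistics

def run_trial(n=50, true_effect=0.0, seed=None):
    """Simulate one placebo-controlled trial of a treatment.
    Returns the observed mean difference and a crude one-sided
    'positive and significant' flag."""
    rng = random.Random(seed)
    treated = [rng.gauss(true_effect, 1.0) for _ in range(n)]
    control = [rng.gauss(0.0, 1.0) for _ in range(n)]
    diff = statistics.mean(treated) - statistics.mean(control)
    # standard error of the difference of the two group means
    se = (statistics.stdev(treated)**2 / n + statistics.stdev(control)**2 / n) ** 0.5
    significant = diff / se > 1.96
    return diff, significant

# Many trials of a useless treatment; only the significant ones get "published".
all_diffs, published = [], []
for seed in range(2000):
    diff, sig = run_trial(seed=seed)
    all_diffs.append(diff)
    if sig:
        published.append(diff)

print(f"true effect:            0.0")
print(f"mean of all trials:     {statistics.mean(all_diffs):+.3f}")
print(f"mean of published only: {statistics.mean(published):+.3f}")
```

The mean over all trials hovers near zero, while the mean of the "published" subset is well above it: exactly the homeopathy-beats-placebo artifact Joel describes.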

maxwell said...

I'd say that this journal is going to be a major pain.

Knowing what someone thinks affected the negative result of an experiment might just be the most useless scientific information producible. If they couldn't get the apparatus/calculation/equipment to work properly, why should we believe that they understand what went wrong?

Horatio Algeranon said...

"Ain't these guys ever heard of arXiv? Even Lubos and Oliver 'Iron Sun' Manuel publish there."

Strictly speaking, "No results" or "goofy results" are not the same as "negative results".

Marco said...

Maxwell, I think you misunderstand the idea behind the All Results journals. Allow me to give a very recent example I experienced myself: we noted in a manuscript that certain methods could not be used for the application we had in mind. A reviewer then noted that we should add a reference to that claim. Slight problem there: while we ourselves and I am certain many others have tried and failed using those methods, no one has ever *shown* that in a publication. Why not? Simple, because you cannot get a paper published that shows that certain methods cannot be used. You might try and sneak such an example in, but last time I tried that, two reviewers 'kindly' requested me to remove the figure showing it didn't work.

The result is that others will repeat the 'useless' experiments we have done, come to the same conclusion ("can't use this"), and so on and so forth.

Another example: suppose there is a standard method to synthesize certain compounds, but that method fails for a certain class, and someone has a reasonable idea as to why. Until that has been reported, people all over the world will try out the standard method first...and fail. So, who will report the fail? It might be mentioned in the paper that actually finds an alternative method, but will people notice that one line?

There is a danger, of course, that papers get submitted where the negative result is a result of the authors not knowing what they are doing.

maxwell said...


I do understand the nature of this journal and appreciate your position on this topic.

In fact, your examples are great...assuming you actually know why things aren't working correctly. For most things that go wrong in a lab, that's a substantial assumption.

Moreover, peer review journals are not the only means of communicating results (positive and negative) to one's colleagues. Conferences, seminars, talks and other avenues are available for researchers to share with one another their feelings for the success or failure of a particular research project/experiment. Those settings also better allow for speculation, which is often necessary when one cannot produce the desired result of an experiment.

I mean, only a handful of expert researchers read the papers that have positive results to begin with. You could just email those people your negative results and the reasons why you produced them if you really wanted to.

Anonymous said...

Prof. Schooler had an opinion piece on just this issue in last week's Nature. The comments, especially the first, were also interesting.

Cymraeg llygoden

bruced said...

And if 'The All Results Journals:Chem' rejects your paper you could always try the 'Journal of Unsolved Questions', or even 'The Journal of Serendipitous and Unexpected Results'. Since there seems to be so much nonsense circulating, particularly in the blogosphere, I suppose it's only natural there would be a growth of places to put it.

David B. Benson said...

Then there is the

Journal of Irreproducible Results.

John Farley said...

In at least some areas of physics and astronomy, negative results can be published. How can you spot these papers? The title starts with "Search for..."
That spells failure: you conducted a search for some molecule using (say) a microwave telescope, found nothing, and now publish upper limits on the abundance of that molecule in that part of the sky.

Experimental high-energy particle physicists can publish "search for..." some exotic particle that wasn't detected. The experiment failed, but an upper limit has been determined for the cross section for production of that particle.

Years ago, particle theorists had the idea that the proton might decay at some very very small rate, and the quarks might escape from the proton. A number of high power theorists calculated estimates of the lifetime of the proton. Large scale experiments were conducted to detect proton decay. So far, no proton decay has ever been detected.

I recall around 1980, when these experiments were planned, one experimentalist told me "I hope it [the proton] doesn't decay, because the theorists have been so insufferable about it."

a_ray_in_dilbert_space said...

Another key term to search for negative results:

upper limits

We had "upper limits" papers on neutrino oscillations for 40 years before they were detected.

John Mashey said...

JIR is a fine journal, but it doesn't publish just anything; it publishes humor like that in "Stress Analysis of a Strapless Evening Gown", or anything written by J.B. Cadwallader-Cohen, for whom I actually used to work, albeit under his real name.

I think All Results might even be useful, see article.

I still think JSE is the champ. When All Results publishes studies showing no correlation of dog personalities with astrological signs, I may reconsider.

Horatio Algeranon said...

And a key term to search for "no results":


We had "string" papers for 40 years before they were not detected.

Douglas Watts said...

So if the editors find your experiment actually worked, you get rejected?

David B. Benson said...

John Mashey --- JSE?

John Mashey said...

JSE = Journal of Scientific Exploration, or the dog astrology journal, of which McIntyre/McKitrick and Montford are fond.

(From IE, previous one was from Safari on iPhone).

David B. Benson said...

John Mashey --- Thank you.

Any journal publishing UFOlogy can't be all bad...

Anonymous said...

"palindrom" here. Over in the wacky world of Huffington Post, a denialist troll posted a link to this odd paper:


in which Hermann Harde, a laser spectroscopist, morphs into a climate modeler, inserts some very good molecular data into what appears to be a rather crude, toy, do-it-yourself planetary atmosphere, and decides that CO2 isn't quite what it's cracked up to be as a greenhouse gas. Can someone who is actually expert at this give an authoritative review? (Dammit, Jim), I'm an astronomer, not a planetary atmospheres person, so I don't have the requisite deep knowledge of planetary atmosphere models, but the inquiring minds posting here ought to know ...

John said...

In dealing with irrational beliefs, I've always found the Skeptic's Dictionary to be very useful.

John said...

Dear Anonymous:

I skimmed the abstract by Hermann Harde, who is not a global warming denier. Harde calculates a rise in global temperature from doubling CO2 of about 0.45 C, smaller than the IPCC number (3 C) by a factor of about 7.

The IPCC result is based on: (1) a rise of 1.2 C from doubled CO2 in the atmosphere, causing an enhanced greenhouse effect, and (2) positive feedbacks (ice-albedo, rising water vapor concentrations), which raise the IPCC estimate to about 3 C ± 1.5 C.

Harde has calculated only the atmospheric effects, and has ignored feedbacks completely. So his result of 0.45 C should be compared with the IPCC "no-feedback" estimate of 1.2 C. Thus the discrepancy is more like a factor of three, not seven.

Where does it come from? It struck me that Harde does not describe what measures, if any, he took to verify that his calculations are free of numerical or coding mistakes. (Dropped a factor of pi?)
Harde should validate his computer programs by calculating something that is simple enough to have an analytical expression. For example, see Chapter 4, "The grey gas model," in Raymond Pierrehumbert's recent textbook, Principles of Planetary Climate.

Among climate scientists, there is very little controversy about the "no-feedback" result. It has been known for decades. Harde is new to the field. So if I had to bet money on one of the two possibilities (Harde overlooked something, or everybody else overlooked something for decades), I know how I'd bet.
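The like-for-like comparison above can be checked in a couple of lines (a sketch using only the numbers quoted in this comment):

```python
# Climate sensitivity figures as quoted above (degrees C per CO2 doubling)
harde_estimate = 0.45      # Harde's calculated warming
ipcc_with_feedbacks = 3.0  # IPCC central estimate, feedbacks included
ipcc_no_feedback = 1.2     # IPCC estimate before feedbacks

# Naive comparison against the full IPCC number (the "factor of 7")
print(f"vs IPCC with feedbacks: factor {ipcc_with_feedbacks / harde_estimate:.1f}")
# Fairer comparison: Harde ignored feedbacks, so use the no-feedback value
print(f"vs IPCC no-feedback:    factor {ipcc_no_feedback / harde_estimate:.1f}")
```

The first ratio comes out near 7, the second near 3, which is the whole point: most of the apparent discrepancy is an apples-to-oranges comparison.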

John Mashey said...

UFOlogy is commonplace.
JSE goes far beyond. I can imagine nowhere else that has covered dog astrology, reincarnation of WW II soldiers as Myanmar children, the hum heard around the world, and inexplicable transient weight gains found when suffocating sheep. Another author has followed the last one with further research, but it is not yet freely available.

Marion Delgado said...

Another author has followed the last one with further research, but it is not yet freely available.

Most Disturbing Footnote: "You know, people are kind of like sheep, when you think about it."

Lose weight the Thuggee way!

Rattus Norvegicus said...

Wasn't JSE the rag that published the infamous "sheep with souls" article? That one was a real gem in which the researcher strangled a bunch of sheep and weighed them before and after death to see how much their soul weighed! I'd have to say that the experimental design was, um, interesting...

Holly Stick said...

Thanks Rattus, I had the impression from John's description that the strangler was the one who lost weight and had nightmare visions of a new bestseller: Lose Weight by Suffocating Sheep.

John Mashey said...

Rattus: yes, that's what "inexplicable transient weight gains found when suffocating sheep" was about.

Really, everyone here owes themselves the amusement of:
1) Going to JSE.

2) Perusing all the titles, of which all but the last year have freely-available PDFs.

3) Selecting a few likely candidates to examine in further detail.
4) Report research on favorites. The sheep research was my favorite, but later supplanted by dog astrology.

I've looked most closely at issue 19:2, since it has the Deming article quoted by McIntyre & McKitrick, and then Andrew Montford (He Who Quotes from Dog Astrology Journal) as a key basis for the Hockey Stick Illusion, augmented by falsification to attack Overpeck; see this.

Holly Stick said...

Over at Climate Etc., Tomas Milanovic is discussing a comment that Eli had made:

John said...

Any university researcher who was proposing to suffocate sheep would have to get approval from the Institutional Review Board. Did the sheep give informed consent? (If the sheep could speak Latin, they might say, "Nos morituri te salutamus", we who are about to die salute you. That would be taken as signifying consent.)

Any university researcher would be well advised to be on the alert for the Sheep Liberation Front and other terrorist groups. Humans sympathize with sheep, dogs, cats, monkeys, and any cute and cuddly creature. It's safer to run experiments on fruitflies or E. coli, or even better on rats and roaches. Yuck!

Holly Stick said...

In the 1600s the Royal Society tried transfusing some sheep's blood into a man. Then they got a letter supposedly from the man, but written by some coffeehouse wits (predecessors of bloggers). He complained about having developed various sheepish characteristics and signed the letter "Agnus Dei".

(wv = folde)

David B. Benson said...

John Mashey --- My first reaction was that even dogs need astrologers, after all, Sirius and all that.

That was before I started reading about the sheep. So I think I'll skip JSE and try to forget about it.

EliRabett said...

Eli got a guess

Lazar said...

"Tomas Milanovic" is pulling a Spencer. As Pekka Pirila points out, the posteriors are unconcerned.

winston said...

Personally, as an ex experimental high-energy particle physicist with some crap, but peer-reviewed, papers to his name, I tend to feel we'd all be better off if most people tried to get published here: