But this Electrical Cosmos thing raises a potentially important issue:
Quote
At dinner speeches, the application of methods from other research areas is often celebrated, but in practice getting such research accepted in peer-reviewed journals can be quite a frustrating process. I once read an article by two physicists who applied a thermodynamic model to the "diffusion" of agriculture into Europe during the Stone Age. They complained about all the trouble they had getting stubborn reviewers at archaeological journals to accept their paper. Eventually the paper found its way into a physics journal, where I read it.
One of my own papers had a similar fate. I came up with an (apparently) controversial view on the co-regulation of genes. When I presented it at conferences, the reactions were always sharply divided. My (possibly biased) generalization is that researchers who had not themselves worked in the area were positive, while those who had were hostile. The published version of the paper is not something I am very proud of: I had to water it down to little more than a footnote to the established theory in order to get it published.
This could very well be my own fault. My work has rightly been criticized for misuse of terminology and all the other elementary mistakes a non-biologist (especially someone as sluggish as I am) is bound to make when writing a biological paper. But I cannot help being a little bitter. Reviewers are always recruited from among those who have published on the same topic themselves, and they are obviously not interested in debunking the paradigms they themselves relied on. This makes the process more conservative than it ideally should be.
There may not be much to be done about it. If professional journals opened up to all the weirdos who invented perpetual motion machines, squared the circle, etc., they would lose their function. But perhaps better use of information technology (Google Scholar comes to mind) will pave the way for better knowledge assessment in the future.