Monday, June 27, 2011

Another journal rejection! Boo-hoo!

A week and a half after having a paper accepted by the European Journal of Operational Research, one of my other papers was rejected by the same journal. :(

The rejected paper talks about a class of techniques for the Single Container Loading Problem (SCLP), which asks: given a bunch of boxes, how do you load them into a container so that the least space is wasted? Problems don't get much more basic than that, and as you can imagine, it's pretty well studied. In our paper, we managed to beat the best existing results on benchmark data by a significant margin, which is quite a feat.
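To make the problem concrete, here's a toy greedy "shelf" loader I'm including purely for illustration. To be clear, this is not the method from our paper, just the simplest packing rule I could sketch: real SCLP heuristics handle box rotations, proper free-space management, and much more. The function name and representation are my own invention.

```python
# Toy greedy "shelf" loader for the Single Container Loading Problem.
# A minimal sketch for illustration only: no rotations, no lookahead,
# boxes are packed row by row, layer by layer.

def shelf_load(container, boxes):
    """Place boxes greedily; return placements and the fill ratio.

    container: (L, W, H); boxes: list of (l, w, h), all axis-aligned."""
    L, W, H = container
    placements = []        # (x, y, z, box) for each box we manage to place
    x = y = z = 0          # cursor inside the container
    row_w = layer_h = 0    # extent of the current row / current layer
    # Biggest-first tends to waste less space in a single greedy pass.
    for box in sorted(boxes, key=lambda b: b[0] * b[1] * b[2], reverse=True):
        l, w, h = box
        if x + l > L:                      # current row is full: start a new row
            x, y = 0, y + row_w
            row_w = 0
        if y + w > W:                      # current layer is full: start a new layer
            x, y, z = 0, 0, z + layer_h
            row_w = layer_h = 0
        if z + h > H or x + l > L or y + w > W:
            continue                       # box doesn't fit where we're trying
        placements.append((x, y, z, box))
        x += l
        row_w = max(row_w, w)
        layer_h = max(layer_h, h)
    used = sum(l * w * h for _, _, _, (l, w, h) in placements)
    return placements, used / (L * W * H)
```

Even a rule this crude fills a 10x10x10 container completely with eight 5x5x5 boxes; the hard part of SCLP is doing well on irregular mixes of box sizes, which is where the heuristics in the literature earn their keep.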

So why was the paper rejected? Well, part of the problem is that we achieved the best results without adding any ground-breaking, earth-shaking new technique. We essentially analyzed what other people did, mixed and matched the parts that seemed effective, and tweaked them a little. That sounds simple, but in reality it takes a lot of hard work. We had to re-implement some really complicated methods and went down lots of blind alleys. Eventually, we realized that many things the existing literature assumed were effective, well, didn't work.

For example, most approaches spend a lot of effort using some fancy search technique to decide which box to place where. It turns out that these fancy search techniques don't perform better than simple ones. Unfortunately, the previous approaches were so complicated that it was hard to figure out why they worked, so many researchers thought it was the fancy search doing the work when in reality it was something else.

After a lo-o-o-t of experiments and analysis, we identified six "key elements" for this type of SCLP approach that we believe are essential. "Cool," we thought, "now we can put the research community back on track so that they know how to better design their algorithms. This is really useful!" We wrote it up into a paper, used our method as an example, and submitted it for review by the scientific community...

...and got repeatedly shot down. There were a few reviewers who loved it, but there were always a few who rejected it on the spot. The two most common criticisms were (1) the six elements are "well known", and (2) there isn't enough innovation. Yes, the six elements are well known, but they are obviously not well understood, since otherwise why would previous researchers mis-identify the effective parts of their own methods? As for a lack of innovation, um, what does that even mean? We did beat everybody else on the benchmarks, you know. No single part of our approach is very novel, but put together, they outperform everything else.

After gnashing my teeth for a while, I calmed down a bit and tried to figure out why there has been so much resistance to this paper. I think it's partly because this is very much a non-traditional paper: it doesn't describe a new technique, nor is it a survey; it's somewhere in between. Also, if you don't read it carefully and think about its implications, it's quite likely that you could finish the paper and think, "that's it?" I'd like to believe that if the roles were reversed and I were the reviewer, I'd be able to appreciate this piece of work. But if I'm honest, I'm not 100% sure I wouldn't have a similar reaction.

Anyway, this paper has been submitted to 3 different journals and been rejected each time. At this point, we're kinda sick of swimming upstream, so we're probably going to rewrite it as a "traditional" paper (i.e., "This is how we solved this problem better than anyone else. Any questions?"). There won't be as much analysis and broad insight, but at least it'll be more palatable to the average reader. It's better to tell half the story than not have it heard at all.

In case you're interested in reading the paper, the conference version can be found here. By the way, does anyone know if posting the full version of my published papers would violate copyright?
