I just ran across a paper that contradicts a whole slew of things/experiments I have seen/done with this project. I *know* my data is right (excluding that little snafu before).
Wait, let me explain. We do NOT see what they are seeing. We see the exact polar opposite. As in a yes/no. There is no gray area in this one. I would be worried if someone else in "our" lab(s) hadn't seen the same thing, so at least we are self-consistent. But WTF?
And this is not something you can hand-wave away with "it's from a different source" crap. And it doesn't make sense. So now I wonder if I can "trust" the rest of the stuff in the paper, which really sucks because, well, it's a cool paper and I thought it would be useful for the discussion. I guess not.
Damn. What a way to end the week....
PS: blog post theme for next week: just because it is published (or, in this case, written in a patent) doesn't mean it works... or, how I spent the past two months beating my head against the wall...
2 comments:
I of course know nothing about your particular system, but: it's either that (i) they're lying! or (ii) something about the methodology or system was different. Since (ii) is much, much more probable than (i), maybe there is some diff that shows up after staring at it for a bit? :/
Yeah, I am figuring that I am missing something really subtle. I hope I am missing something really subtle because, like I said, the paper really supports a lot of other things we want to say about the system. I think it is time for a complete, in-depth read of the article.