Okay, it's like this:
I knew that big review at Anandtech was in the works, so I patiently waited for it, figuring it would be enough for me to decide what to buy.
The article is published, and I come over here, and what do I see???
I see this:
This is not funny, guys.
I need to decide which burner to buy. My needs are simple: it must burn DVDs as RELIABLY as possible, with as few errors as possible, and it must have the widest possible MEDIA COMPATIBILITY. These, by the way, seem to be difficulties that CD burners overcame YEARS ago!
Now, there are lots of hardware sites with plenty of reviews of lots of burners. But of course, each site uses different benchmarks, so you end up with one set of numbers for drive A, a second set for drive B, a third set for drive C, etc. etc. ad nauseam. Anandtech does one thing, CDFreaks does another, and CDRLabs does something else! Every site seems to want to be UNIQUE AND ORIGINAL in its testing methodology, so it turns out that, in spite of all these reviews on all these sites, you cannot really compare two drives unless they were reviewed by the same site with the same test suite. This greatly reduces the utility of the reviews. Most people do not really want to know the benchmarks in isolation; they want (and need) to know the benchmarks in comparison with those of competing products. But that seems not to be available to any great degree.
And then! I read the above-mentioned thread on this forum, and it turns out that the article I was waiting for, the article I expected to be MOST HELPFUL in making a choice in an area of technology that I consider still immature... In short, the article on Anandtech, according to the opinions of some people here, is worth NOTHING due to faulty methodology.
I will not presume to judge methodology myself, so let me ask: how do the people here rate and compare drives?