code65536 wrote:And if I may be so blunt, the way people hold CATS up like some infallible holy grail is sickening, too.

Like I said, CATS is more for DVD-ROM compatibility than DVD player compatibility, and it has its pros and cons as well.
code65536 wrote:What does it matter what a CATS device tells you? The devices that people are going to use to read back these discs are not CATS machines. What CATS will tell you is very simple: it tells you how other overpriced CATS machines will read the disc. Great, so you can safely and accurately compare your CATS scan with someone else's CATS scan on the other side of the world, because all CATS scanners are nicely calibrated, etc. But what good does that do the end-user? Besides the fact that such machines are prohibitively expensive. Besides the fact that doing these "proper" tests are extremely time-consuming. The alternative is to use a heuristic. One that works fairly well, I might add.
That isn't quite accurate... CATS devices scan a disc against the standards that are often touted on "professional" forums (280 PIE and 4 POE). When those error-level limits were conceptualized, it was assumed that a calibrated drive would be used to test for them... calibrated to respond to the various error types in a certain consistent way, repeatable on any CATS device. Now, using consumer drives, we emulate this sort of test by scanning for PIE and PIF (not POE), yet we still hold the results to the standard of 280 and 4. To me that's really silly, since every drive tests differently, and no consumer drive actually tests for PIE and POE errors, only PIE and PIF errors... so WHY do we hold these scans to a standard they can't even begin to match up against? Simple: people are desperately grasping at straws.

Now, if you want to tell me that a disc is playing or readable, or not, that's fine... it's really simple to do that. But telling me that a scanned disc is "within spec" or "has a good error rate" is foolhardy. Again, we can tell how well the disc can be read, and how easily a drive can play it back, but knowing whether the error rate is within DVD Forum specifications is really beyond the capabilities of a consumer drive.
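To make the mismatch concrete, here is a minimal sketch of the "hold a consumer scan to the spec limits" heuristic I'm describing. All the numbers and the function are invented for illustration (this is not any real KProbe output); the point is only that the same naive pass/fail rule gives different verdicts for the same disc depending on which drive did the reading:

```python
# Sketch of the "compare a consumer scan to the spec limits" heuristic.
# The sample data below is invented; a real KProbe run reports PIE per
# 8 ECC blocks and PIF per 1 ECC block, as seen by whatever consumer
# drive happens to be doing the reading.

PIE_LIMIT = 280  # the PIE figure people quote from the spec
PIF_LIMIT = 4    # the "4" people quote -- but that figure was defined
                 # for a calibrated tester, not for consumer PIF scans

def naive_verdict(pie_samples, pif_samples):
    """Return the 'within spec' verdict this post argues is foolhardy:
    the same disc can pass on one drive and fail on another."""
    pie_ok = max(pie_samples) <= PIE_LIMIT
    pif_ok = max(pif_samples) <= PIF_LIMIT
    return pie_ok and pif_ok

# The same (hypothetical) disc, read back by two different drives:
liteon_scan = ([12, 40, 95, 210], [0, 1, 2, 3])
plextor_scan = ([30, 120, 310, 150], [0, 2, 5, 1])

print(naive_verdict(*liteon_scan))   # -> True  ("within spec")
print(naive_verdict(*plextor_scan))  # -> False (same disc, "out of spec")
```

Neither verdict tells you anything about actual DVD Forum compliance; it only tells you how that particular drive read the disc that day.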
code65536 wrote:The point here is that, for practical considerations, there is a heuristic that's being used. It's not BS because people are instructed to take it all with a bit of a grain of salt--comparing it with your own scans from the past, and then interpreting what the KProbes mean by cross-referencing the results you get with what kind of performance you experience when testing on other devices like standalones. And it just so happens that this heuristic is stable enough that, in a number of cases (though certainly not all), it is reasonable to compare scans from drive A to drive B. Heck, when scanning the same disc, my 832S and my 3S produce near-identical results, within 10% of each other. Direct comparison is something that, on the surface, we say, "no, this doesn't really mean much", but the heuristic is good enough that it allows people to, under what limited time and resources people have, try to do something meaningful with it.
Yes, in some cases people are reminded to take the results with a grain of salt... but do they? Realistically, no, I don't believe that they do. I think many of us even forget to do this too! I'm not saying that we can't know ANYTHING about the burned quality of the disc, I'm just saying that people are really making too much out of K-Probe results (and others). As for drive test results... even if you only see a 10% difference between your 832S and 3S drives, I think it's dangerous to assume the same would hold for other people. I know the percentage difference is much higher for Plextor/LiteON comparisons... but which drive is "right"? Well, neither of them, really.
code65536 wrote:As rd said, the alternative is to do nothing or to watch the people who could afford CATS do their hocus-pocus.
I disagree; the alternative is to be realistic about the capabilities and deficiencies of LiteON and other testing drives. There *ARE* things we can know about the burned disc... but I think people make too much of the results. What I'd like to see is people taking the time to scan discs on 2 different drives and including a transfer rate test. Personally, I'm using the PX-712A, on which I test media at 2x and 12x, and the SOHW-812S@832S, on which I test media at 8x. I usually do a transfer rate test on both drives too. It takes a long time, I won't argue that... but it gives a MUCH better look at the media itself, and at what kinds of errors affect the disc. Comparing 2x to 12x scans on the PX-712A is VERY enlightening at times, since some error types are greatly magnified at higher read speeds and others are not... cross-reference this with the 832S's results at 8x, and now you know how those errors are handled by another drive in another situation (usually a less picky drive, too). If I were doing a drive review, I would make sure to include a transfer rate test done on the drive being reviewed, since that would be the most important result to a potential buyer of that drive.
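The cross-speed comparison I'm describing can be sketched roughly like this. The sample data and the 3x cutoff are made up purely for illustration (no real scanning tool works this way); it just shows the kind of thing a 2x-vs-12x comparison surfaces that a single scan can't:

```python
# Compare the same (hypothetical) disc scanned at two read speeds, the
# way I do on the PX-712A at 2x and 12x.  Zones whose error counts
# balloon at high speed point at error types the drive handles poorly
# under stress; zones that stay flat are less of a worry.

def speed_sensitive_zones(scan_2x, scan_12x, ratio=3.0):
    """Return indices of zones where the 12x PIE count is more than
    `ratio` times the 2x count (ratio is an arbitrary illustrative
    cutoff, not anything from a spec)."""
    flagged = []
    for i, (slow, fast) in enumerate(zip(scan_2x, scan_12x)):
        if slow > 0 and fast / slow > ratio:
            flagged.append(i)
    return flagged

pie_2x  = [10, 12, 15, 11, 14]   # invented 2x PIE samples per zone
pie_12x = [14, 13, 90, 12, 70]   # invented 12x samples, same zones

print(speed_sensitive_zones(pie_2x, pie_12x))  # -> [2, 4]
```

Zones 2 and 4 look fine at 2x but blow up at 12x; that's the sort of pattern you'd then cross-reference against the 832S's 8x scan and the transfer rate test.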
code65536 wrote:PS: The whole reason I'm on an anti-CATS crusade is that after seeing so many people cling onto the precious C't results like they were the word of some deity because they were performed with... *gasp* CATS! I can't begin to count the number of people who took the results of a CATS scan from one individual variable drive unit taken from one individual variable batch of drives taken from one individual variable factory using one disc from a spindle of discs that may very well have intra-spindle variation coming from a batch that may very well vary from other batches and coming from a media factory that may very well vary from other media factories, burned under very unique conditions (system setup, drive temperatures, phase of the moon, etc.) as something that was absolutely indicative of the performance of that drive in general with that media in general in all situations (yes, it's better than looking at Joe Sixpack's KProbe and making an overbroad conclusion). The point is, things vary, and that's why everyone should be doing some testing of their own, and to then dismiss self-testing as worthless because it's not some vaunted CATS is counter-productive. So is relying on someone else's results just because that someone else used the high-and-mighty CATS.
You aren't alone on that! I'm pretty ticked off about the lack of information in those C't articles... although perhaps some of those details would be available if I went back to the original German... but since I don't have those articles yet, I can't do that. The fact is, most drive reviews are completely useless where burn quality is concerned after about 1-2 months, if not sooner, since firmware updates change how the drive performs to such a large degree! So by the time the review is out, there is usually a newer firmware available.

This isn't the reviewer's fault, of course; it's just a problem inherent in the system as it stands right now.
code65536 wrote:My apologies if I sound harsh. But over the past few months, the obsession with this non-existent holy grail of testing has put me in a sour mood.
No worries, I'm not angry with you or anything.

I'm just pretty pissed off myself over the use and abuse of K-Probe (and other) tests over the last while, and the apparent choice of convenience over professionalism in the industry.
Punch Cards -> Paper Tape -> Tape Drive -> 8" Floppy Diskette -> 5 1/4" Floppy Diskette -> 3 1/2" "Flippy" Diskette -> CD-R -> DVD±R -> BD-R
The Progression of Computer Media