Page 2 of 2

PostPosted: Wed Jul 27, 2005 5:54 am
by frank1
When some people review the LiteON SOHW-1633S drive,
you get four pages of tests (with four different burned DVDs) under this title:
"SOHW-1633S vs. SA300"
Here SA300 stands for "the well known AudioDev SA300 DVD CATS system".


And at the beginning of each page you can read this warning:
« Please note that the posted results are only valid for the specific tested LiteON SOHW-1633S drive. Using other drives,
even another SOHW-1633S, can produce totally different results.
Be aware! »



And as the conclusion on page 4 you can read this:
« Not even with this disc could the LiteON SOHW-1633S produce similar results to the SA300 system. In all the graphs above, the SOHW-1633S drive reports higher error level values than the SA300 system »

PostPosted: Wed Jul 27, 2005 2:58 pm
by Muchin
dolphinius_rex wrote:Firmware can definitely affect a drive's ability to scan a disc... but jitter is not the be-all and end-all of scanning either, unfortunately. I've already noticed differences between how a DW1620 and DW1640 scan for jitter (the DW1640 is more forgiving, while the DW1620 is less forgiving).

Sadly, for a really good view of a disc's quality, the best way is still to test it on multiple drives. That way you can analyze several perspectives on the disc, and come to a more rounded conclusion. This is the basis for my own testing.

According to my friends with more than 10 years of expertise in the optical disc industry, jitter measured with CATS is the be-all and end-all of scanning. As for consumer drives, it is unfortunate that leading jitter and trailing jitter cannot be measured separately at present. Moreover, only those machines based on Philips and ALi chipsets are capable of reporting jitter in non-arbitrary units. Let's hope that things will improve in the near future.

The differences between the performances of the BenQ 1620 and 1640 in your tests are generally not large, and may be considered variation between drives, since consumer products are not as carefully calibrated as CATS is. I shall wait until the graphs in your forthcoming review are online to see if I have more comments to make. BTW, have you compared the jitter levels given by the BenQ 1620 and 1640 with those from CATS? If so, which is closer to CATS in this respect?

I am also of the opinion that it is essential to use at least two consumer drives for reviewing purposes, even if all of them can report jitter, as they are not built with the same components or calibrated identically. But most people may not want to spend money on another drive. I like the combination you have chosen to use, especially the Plextor 712, as it gives PIE rates at 8X or 12X speed more or less the same as CATS.
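[Editor's note: the multi-drive methodology discussed above can be sketched in a few lines. This is a minimal illustration, not anyone's actual tool; the scan values, field names, and the 280 PIE / 4 PIF ceilings (the commonly cited DVD quality-scan limits) are assumptions for the example.]

```python
def summarize_scans(scans):
    """Combine per-drive scan summaries into one conservative verdict.

    Each scan is a dict with 'drive', 'pie_max' (worst PIE sum over any
    8-ECC-block window) and 'pif_max'. The combined verdict takes the
    worst reading from any drive, on the theory that a disc is only as
    good as its least forgiving reader says it is.
    """
    worst_pie = max(s["pie_max"] for s in scans)
    worst_pif = max(s["pif_max"] for s in scans)
    # 280 PIE / 4 PIF are the commonly cited DVD spec ceilings.
    ok = worst_pie <= 280 and worst_pif <= 4
    return {"pie_max": worst_pie, "pif_max": worst_pif, "within_spec": ok}

# Hypothetical results from scanning the same disc on two drives:
scans = [
    {"drive": "BenQ DW1620", "pie_max": 96, "pif_max": 2},
    {"drive": "Plextor PX-712", "pie_max": 54, "pif_max": 3},
]
print(summarize_scans(scans))
```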

PostPosted: Wed Jul 27, 2005 3:05 pm
by dolphinius_rex
Muchin wrote:
dolphinius_rex wrote:Firmware can definitely affect a drive's ability to scan a disc... but jitter is not the be-all and end-all of scanning either, unfortunately. I've already noticed differences between how a DW1620 and DW1640 scan for jitter (the DW1640 is more forgiving, while the DW1620 is less forgiving).

Sadly, for a really good view of a disc's quality, the best way is still to test it on multiple drives. That way you can analyze several perspectives on the disc, and come to a more rounded conclusion. This is the basis for my own testing.

According to my friends with more than 10 years of expertise in the optical disc industry, jitter measured with CATS is the be-all and end-all of scanning. As for consumer drives, it is unfortunate that leading jitter and trailing jitter cannot be measured separately at present. Moreover, only those machines based on Philips and ALi chipsets are capable of reporting jitter in non-arbitrary units. Let's hope that things will improve in the near future.

The differences between the performances of the BenQ 1620 and 1640 in your tests are generally not large, and may be considered variation between drives, since consumer products are not as carefully calibrated as CATS is. I shall wait until the graphs in your forthcoming review are online to see if I have more comments to make. BTW, have you compared the jitter levels given by the BenQ 1620 and 1640 with those from CATS? If so, which is closer to CATS in this respect?

I am also of the opinion that it is essential to use at least two consumer drives (I like the combination you have chosen to use) for reviewing purposes, even if all of them can report jitter, as they are not built with the same components or calibrated identically. But most people may not want to spend money on another drive.


I'll have a DW1640 vs. CATS section... although it might only be based on 1 disc (most likely!). I haven't run it yet, but when I ran the DW1620 vs. CATS test, the DW1620 had higher jitter levels than the CATS did... but consistently higher. I suspect the DW1640 will be closer to CATS.

PostPosted: Wed Jul 27, 2005 3:23 pm
by RJW
dolphinius_rex wrote:

I'll have a DW1640 vs. CATS section... although it might only be based on 1 disc (most likely!). I haven't run it yet, but when I ran the DW1620 vs. CATS test, the DW1620 had higher jitter levels than the CATS did... but consistently higher. I suspect the DW1640 will be closer to CATS.


Keep in mind that consumer drives are uncalibrated and much less well built than CATS, so variation between drives might give you different results; it would be worth also checking the difference between the 1620 and the 1640. There's a reason why CATS is so expensive, and why the ALMEDIO AEC1000 tester is so expensive even though, as some sources suggest, it seems to use a normal consumer drive. Most probably that drive is hand-picked for performance and consistency.

Oh, and the reason why AudioDev, Datarius and some other folks in the industry say jitter is the holy grail is simple: how jitter should be measured is 100% specified. Also, jitter is absolute, unlike error counts, so a tester can be calibrated to report what it should report.

Also, why do you think jitter counts so heavily in c't tests?
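[Editor's note: the point that jitter is absolute means a consumer drive's readings can in principle be mapped onto a reference tester's scale. A minimal sketch of such a calibration, assuming you have paired jitter readings (in %) from a drive and from CATS at the same disc positions; the numbers are invented for illustration and the fit is a plain least-squares line.]

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b, in pure Python."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical paired jitter readings (%) at the same disc positions:
drive_jitter = [8.0, 8.5, 9.0, 9.5, 10.0]   # consumer drive
cats_jitter  = [7.0, 7.5, 8.0, 8.5, 9.0]    # reference tester

a, b = fit_line(drive_jitter, cats_jitter)

def calibrate(j):
    """Map a drive jitter reading onto the reference scale."""
    return a * j + b

print(round(calibrate(9.2), 2))
```

With these made-up numbers the drive reads a constant 1% high, so the fit recovers slope 1 and offset -1; a real drive/CATS pair would of course give its own coefficients.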

PostPosted: Wed Jul 27, 2005 3:24 pm
by Muchin
dolphinius_rex wrote:I'll have a DW1640 vs. CATS section... although it might only be based on 1 disc (most likely!). I haven't run it yet, but when I ran the DW1620 vs. CATS test, the DW1620 had higher jitter levels than the CATS did... but consistently higher. I suspect the DW1640 will be closer to CATS.

Very interesting data. I am looking forward to reading the results.

PostPosted: Wed Jul 27, 2005 4:03 pm
by dolphinius_rex
RJW wrote:Keep in mind that consumer drives are uncalibrated and much less well built than CATS, so variation between drives might give you different results; it would be worth also checking the difference between the 1620 and the 1640.


If I get the opportunity, I'll cross-reference all my scores against other drives of the same model. Keep in mind I *do* have 2 DW1620's, and 2 DW1640's :wink:

PostPosted: Wed Jul 27, 2005 4:17 pm
by Muchin
RJW wrote:If I'm right there is a standard. The ECMA standards clearly specify how the device is built, what it should report, and what the maximum values should be. What most folks did was just use the max values and almost completely ignore the rest of the ECMA standards.

People familiar with the optical storage industry told me that even within the industry very few people completely understand the CD and DVD specifications. It is understandable that ordinary people would not bother reading the specifications, because they have good reason to expect the major manufacturers to sell only decent products. Many firmware programmers seem to have ignored the specifications too, perhaps because the way to determine PI errors is not specified, though the allowable values are set down and how to measure jitter is described. The specifications also do not state how the drives should be built, if I understand correctly.

RJW wrote:About LiteOn and Sony, don't forget that LiteOn drives have variations themselves. It's not the first time I have seen different results from two LiteOn drives of the same model with the same firmware.
So it might not be firmware alone, though firmware may also have a significant influence.

Maybe I should be more cautious in analyzing those observations, but I have not seen any data proving that CDRInfo's results are due to poor quality control by the manufacturer, though it is not impossible. As I do not have a LiteOn/Sony writer for testing, I hope that someone will provide a definitive answer by flashing a LiteOn 1633 with firmware for the Sony 710 or vice versa and comparing the scans of some poorly written discs.
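[Editor's note: the PI-error limits mentioned above are windowed, which is easy to miss. A minimal sketch of the check, assuming you already have a per-ECC-block PI error count from a scan; the block values are invented, and the "no more than 280 PI errors in any 8 consecutive ECC blocks" ceiling is the commonly cited DVD figure.]

```python
def pie_windows(pi_per_block, window=8):
    """Sum PI errors over every run of `window` consecutive ECC blocks."""
    return [sum(pi_per_block[i:i + window])
            for i in range(len(pi_per_block) - window + 1)]

def within_spec(pi_per_block, limit=280, window=8):
    """True if no window of consecutive ECC blocks exceeds the PIE limit."""
    return max(pie_windows(pi_per_block, window)) <= limit

# Hypothetical per-ECC-block PI error counts from a scan:
blocks = [10, 12, 9, 40, 35, 11, 8, 10, 200, 15]
print(max(pie_windows(blocks)))   # worst 8-block window
print(within_spec(blocks))
```

Note how a single bad block (the 200 here) can push several overlapping windows past the limit even when every individual block looks survivable, which is exactly why per-block graphs and windowed limits can tell different stories.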

PostPosted: Thu Jul 28, 2005 1:01 am
by vinnie97
dolphinius_rex wrote:
vinnie97 wrote:
Sadly, for a really good view of a disc's quality, the best way is still to test it on multiple drives. That way you can analyze several perspectives on the disc, and come to a more rounded conclusion. This is the basis for my own testing.

so bring on the NEC error-reporting firmware. ;) Talk about coming full circle. :lol:


Perhaps I should say:

Sadly, for a really good view of a disc's quality, the best way is still to test it on multiple drives (that offer repeatable and regularly consistent scan data). That way you can analyze several perspectives on the disc, and come to a more rounded conclusion. This is the basis for my own testing.

How's that?


Fair play. :D

PostPosted: Thu Jul 28, 2005 3:37 pm
by Muchin
frank1 wrote:And as conclusion on page 4 you can read this:
« Not even with this disc could the LiteON SOHW-1633S produce similar results to the SA300 system. In all the graphs above, the SOHW-1633S drive reports higher error level values than the SA300 system »

The test disc on that page is double-layered. I have surveyed the comparisons carried out with other consumer drives again, and observed that all of them gave higher PIE counts than the CATS SA300 for that disc. Since that is the only double-layered disc tested, it is difficult to explain why, IMO.