
Breaking the DVD Testing "code"

PostPosted: Fri Oct 06, 2006 12:28 pm
by dolphinius_rex
So we hear all the time about these "rules" for testing DVD media... things like PI Sum 8 errors can be no greater than 280, and PI failures (Sum 1) no greater than 4. To get these test results we turn to our trusty consumer line DVD burners, often from LiteON and BenQ, which we trust to give us our answers. But what if those drives were providing skewed data?

What happens if our DVD Testing code is broken?
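
For reference, here is a minimal sketch of how those commonly quoted limits could be checked against exported scan numbers. The sample layout is an assumption on my part; KProbe and CDSpeed exports differ, so the parsing step is left out.

```python
# Toy check of a scan against the commonly quoted limits.
# Assumes a list of (pi_sum8, pif_sum1) samples already parsed from a scan
# export -- the exact export format varies by tool, so parsing is omitted.

PI_SUM8_LIMIT = 280   # max PI errors summed over 8 ECC blocks
PIF_SUM1_LIMIT = 4    # max PI failures per single ECC block

def passes_limits(samples):
    """Return True if no sample exceeds either limit."""
    worst_pi = max(pi for pi, _ in samples)
    worst_pif = max(pif for _, pif in samples)
    return worst_pi <= PI_SUM8_LIMIT and worst_pif <= PIF_SUM1_LIMIT

if __name__ == "__main__":
    demo = [(120, 2), (95, 0), (310, 3)]   # made-up numbers
    print(passes_limits(demo))             # False: 310 > 280
```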

PostPosted: Fri Oct 06, 2006 6:36 pm
by MediumRare
Here's the KProbe data plotted with some additional information that lets you recognize PIF clusters more easily (jumps in the running totals).

As you can see, most of the PIFs (40% in each case) occur in the 2 clusters near the end of the disc, close to where CATS finds the maximum (57.08 mm).

However, the PI distribution is very uniform, as can be seen by the almost constant slope of the running total. There's a slight cluster at 4.1 GiB (probably from the PIF cluster) but it's nowhere near the location or magnitude of the CATS maximum (38.42 mm), if I interpret the results correctly.

G
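
For anyone who wants to reproduce the running-total view MediumRare describes, here is a minimal sketch: cumulative PIF counts, with clusters flagged wherever the total jumps sharply over a short window. The window size and jump threshold are arbitrary illustration values, not anything taken from KProbe.

```python
from itertools import accumulate

def find_pif_clusters(pif_counts, window=50, jump_threshold=20):
    """Flag regions where the cumulative PIF total jumps sharply.

    pif_counts: per-sample PIF counts in disc order.
    A 'cluster' is reported wherever the running total rises by more than
    jump_threshold within `window` consecutive samples (illustrative values).
    """
    totals = list(accumulate(pif_counts))
    clusters = []
    for i in range(window, len(totals)):
        if totals[i] - totals[i - window] > jump_threshold:
            clusters.append(i)
    return totals, clusters
```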

PostPosted: Fri Oct 06, 2006 8:56 pm
by bicomputational
CDSpeed, KProbe and Plextools may not give the same results as more expensive test equipment, but that doesn't mean the data they provide is "skewed". The purpose of the testing is to get an idea of quality that can be used to judge whether or not the burn is one that will play on other equipment and will have longevity. The tools we all use can and do do that well. Test equipment is only "skewed" if it does not give consistent results with the same input - if it scans the same disc differently at different times. We don't need absolute results, just consistent results that we can recognize as meaning the burn is of high quality. If you've ever read CDRinfo test results, using tools like Clover Systems for CD tests, you'll see that more exact measurement can actually give less information. That test fails almost every burn made on any media with any burner - how is that info useful to me as to whether or not the drive will give me good burns?
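
bicomputational's consistency criterion could be checked directly by rescanning the same disc a few times and looking at the spread of the summary numbers. A rough sketch, with made-up figures:

```python
from statistics import mean, stdev

def consistency(scan_totals):
    """Coefficient of variation of repeated scans of the same disc.

    scan_totals: total PI (or PIF) counts from several scans of one disc
    on one drive. A small value suggests the drive reports consistently,
    which is what matters for relative quality comparisons.
    """
    return stdev(scan_totals) / mean(scan_totals)

# Made-up example: five rescans of the same disc
print(f"{consistency([41200, 40850, 41500, 41050, 41300]):.3%}")
```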

PostPosted: Sat Oct 07, 2006 8:16 am
by RJW
bicomputational wrote: The purpose of the testing is to get an idea of quality that can be used to judge whether or not the burn is one that will play on other equipment

On one side there's the problem: CATS is a de facto standard. Its setup is as specified in the ECMA standards, so it's in line with them.

Now, the board's standards are questionable. What folks did was just take the ECMA standards and slap them onto home testing results. However, it's questionable whether these home testers are built the way ECMA specifies, and therefore whether they can apply the standard the way it is currently being done.

So should we trash our home testing results?

Well, the fact is that today's drives have improved a lot and are much better readers than the standards specify, which causes them, when used as scanners, to show results that look too good to say anything about compatibility with older players. For the drive itself the disc will hold up, and it might also hold up on some other modern drives.
However, keep in mind that a disc which looks fine on your LiteON can still be out of spec and for that reason problematic on older (cheap) players.

However, for the current situation it might be good to have another look at these home testing standards and see where there's room for improvement, to get a better picture.

Thanks to these screwed-up consumer standards, companies are also slowly adopting them and giving less good support. So in the end things will be worse when it comes to playback on older generations, and quality levels will drop in the case of some products! :evil:

bicomputational wrote: and will have longevity.

Initial burn quality says zilch about longevity, but if you mean you can check and recheck so you can study the degradation process (as Dolphinius Rex is doing for his aging study), then I agree.
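
A small sketch of that check-and-recheck idea: append a dated summary of each rescan to a log so the degradation trend can be examined later. The file name, disc ID and summary fields are placeholders, not part of any tool's output format.

```python
import csv
import datetime

def log_scan(path, disc_id, max_pi_sum8, max_pif_sum1):
    """Append one dated scan summary so degradation can be tracked over time.

    path, disc_id and the two summary figures are placeholders; in practice
    they would come from whatever the scanning tool reports.
    """
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.date.today().isoformat(), disc_id, max_pi_sum8, max_pif_sum1]
        )

# e.g. log_scan("scan_history.csv", "TYG02-batch1-disc07", 38, 2)
```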


bicomputational wrote: We don't need absolute results, just consistent results that we can recognize as meaning the burn is of high quality.

No measurement is absolute.
And I think what I said above says enough.

However, about "high quality" I have mixed feelings.
You can't judge if it's really high quality. You can judge whether it should be fine if it's up to modern standards, but "high quality" is questionable, because you're no longer measuring according to the standards, so how can you judge?

bicomputational wrote: That test fails almost every burn made on any media with any burner - how is that info useful to me as to whether or not the drive will give me good burns?

It means that the burns are not good according to the current official / de facto standards. Still, these discs might present no problems with current drives.

PostPosted: Sat Oct 07, 2006 11:10 am
by bicomputational
I think that expecting a $35 consumer level device to meet strict industry standards is unreasonable - not that it wouldn't be nice. I define high quality as I said in my last post - will the burn be useful to me now and in the future. I have found that if my discs test at a certain quality level I can expect them to perform well for me for an extended period of time and over a wide variety of drives. I can't produce professional quality results at home so why should I care how my discs test on professional grade equipment?
Just give me a tool that will give me consistent results that I can use as a guide - this is more useful than knowing my burns fail every pro test.
Regarding longevity - the test results I have seen show a gradual increase in error rates over time. If my initial burn has very few errors, it makes sense to me to assume that it will take longer to degrade than a marginal quality burn with a higher initial error rate. This also corresponds to my actual real world experience.
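
That intuition can be turned into a very rough toy model: if the worst PI Sum 8 reading rises roughly linearly between rescans, a cleaner initial burn simply has more headroom before it reaches the 280 limit. The linear assumption is only for illustration; real discs can degrade quite differently.

```python
def months_until_limit(initial_max_pi, pi_per_month, limit=280):
    """Toy linear extrapolation of headroom before the PI Sum 8 limit.

    initial_max_pi: worst PI Sum 8 reading on the fresh burn.
    pi_per_month:   observed rise per month from rescans (assumed constant,
                    which real discs need not obey).
    """
    if pi_per_month <= 0:
        return float("inf")
    return (limit - initial_max_pi) / pi_per_month

# A clean burn (max 30) vs. a marginal one (max 200), both rising ~5/month:
print(months_until_limit(30, 5))    # 50.0 months of headroom
print(months_until_limit(200, 5))   # 16.0 months
```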