
DVDInfoPro PIE value concerns

PostPosted: Mon Dec 12, 2005 4:10 pm
by dolphinius_rex
Since the comments about DVDInfoPro were becoming a bit overwhelming in the Pioneer DVR-R100 review thread, I have made this post in order to move that discussion here, where it will be more relevant and easier to follow.

PostPosted: Mon Dec 12, 2005 4:34 pm
by zebraxor
Looking further into it, at the request of all users...

PostPosted: Mon Dec 12, 2005 4:39 pm
by dodecahedron
as MediumRare said in the "original" topic
MediumRare wrote:I don't think it's very helpful to compel people to sign up on a website just to see what known bugs there are.

just another reason for me not to try out DVDInfoPro.
i read your explanations, zebraxor, but am not convinced.

just for an example (with which I'm familiar), Spybot - Search & Destroy has its forums open, including the discussions about bugs. This only benefits the program.

PostPosted: Mon Dec 12, 2005 5:05 pm
by dolphinius_rex
Geez, 3 posts in and the tension is so thick you could cut it with a knife! :o

We're all geeks here right?

Zebra, I think calling the issue resolved may be premature. I'm not able to get to my computer at home right now, because I'm at work, but I've been told that the 2x thing doesn't solve the whole problem. However, I'm not making assumptions either way until I can sink myself into the testing for a few hours.

Dodecahedron, I agree the linking to stuff 99% of people can't see is annoying... maybe next time Zebra or Nic would include a short synopsis of what they're linking to at least? That would be a little easier on people probably, right?

PostPosted: Mon Dec 12, 2005 6:26 pm
by MediumRare
zebraxor wrote:The problem was resolved as of last night:

I don't think so- I haven't signed up at your bug-track site, but the following description doesn't fit:
nicw wrote:The discrepancy is simply that the scanning speed option in DVDInfoPro is broken and its scanning speed is fixed at 2x.

if you re-run the tests with cdspeed fixed @ 2x you will see the values are equal. The error rate skyrockets at higher scan speeds.

Sorry, I have to disagree with that on both counts, at least for my drive and the version of DIP I'm using (4.5.0.3):

1. I did 2 further scans of the same disc with CD-Speed: one @2x (really 2.5x) and one @1x (all with the same scale). I'll also repeat the 4x scan, since this is posted in a different thread.

[scan images: the 2x, 1x, and repeated 4x CD-Speed scans]

The PI Sum 8 distribution doesn't differ significantly from the 4x scan, nor does the total count. The PIF count actually increases with lower scan speed. I've seen this before, though I don't usually do low-speed scans (they just take too long). A bigger difference in the pattern of PI values usually comes when going from 4x to 6x (CLV to CAV).

2. I reran the DIP scan and timed it. Apart from the 4x CLV trace in the diagram, the elapsed time (13:50) also speaks for a 4x scan, not 2x.

So the similarity of the PI Sum 1 KProbe scan and the DIP results (as shown in the original post) still indicate that DIP did a 1 ECC, 4x scan.

Now I want to elaborate on the other point I mentioned: the total and average values. Dolphinius_rex sent me some scans from a LiteOn 1635S in the meantime and they show exactly the same behaviour (his BenQ scans look consistent between CD-Speed and DIP, so it looks like a DIP/LiteOn problem). His remarks indicate that he didn't understand the point I was making, so I want to explain it a bit more.

The basis is this remark by Erik Deppe on how CD Speed calculates averages:
Only the maximum value of a certain range is plotted on the graph. The average is calculated from these values. So it's more the average of the values which you can see on the graph.


The problem with the averages is that it depends on what you're averaging.

CD-Speed uses a display ca. 420 pixels wide to show the PI/PIF counts. There are more than 140,000 ECC blocks (@ 16 sectors each) on a disc. For PIF there are ideally that many samples (in practice fewer, because not all ECC blocks are counted); for PI Sum 8, 1/8 of that value (17,640), which is still much larger than 420. Now obviously you can't display all these numbers, so you group your internal counting bins into 420 display bins and determine a representative value to display. Each displayed value stands for over 333 (PIF) or ca. 42 (PI Sum 8) basic values.

One procedure, which most tools seem to use, is to display the maximum count in the group.
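That procedure can be sketched in a few lines of Python (the data and bin width here are hypothetical; the real tools read the counts from the drive):

```python
# Sketch: reduce a long list of per-sample error counts to a fixed
# number of display bins by keeping the maximum of each group.
def bin_counts(samples, display_bins=420):
    """Group samples into display_bins groups, keeping each group's maximum."""
    group = max(1, len(samples) // display_bins)
    return [max(samples[i:i + group]) for i in range(0, len(samples), group)]

# 17,640 PI Sum 8 samples collapse to 420 displayed values (42 per bin)
displayed = bin_counts([1] * 17640)
print(len(displayed))  # 420
```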

Now obviously, if you only show the maximum value in a display bin, the counts look more dramatic than if you show the average. It's a choice an author has to make, and personally I think showing the max is OK if you don't want to go the extra mile that alexnoe did. Alexnoe uses a modified approach in his tool to show more information: he varies the intensity of the line in accordance with the peaking factor (maximum/average) of each bin. As a result, the PxScan displays look finer and more delicate than the simple line or bar graphs used elsewhere, and consequently represent the available information better.
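A toy illustration of that peaking-factor idea (my own simplified rendering, not alexnoe's actual code):

```python
# Toy version of an intensity-weighted display bin: plot the maximum,
# but fade the line when the bin's peaking factor (max/average) is large,
# i.e. when the maximum is an isolated spike rather than a plateau.
def bin_display(samples):
    peak = max(samples)
    avg = sum(samples) / len(samples)
    intensity = avg / peak if peak else 1.0  # in (0, 1]; 1.0 = flat bin
    return peak, intensity

print(bin_display([1] * 42))          # flat bin: full intensity
print(bin_display([100] + [1] * 41))  # isolated spike: faint line
```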

I consider it problematic when you use the displayed values to determine the average or, worse yet, the total counts. CD-Speed does the first; DIP seems to do both for LiteOn drives (see e.g. the PIF count I mentioned in the first post).

Consider an extreme case: of the 42 PI Sum 8 values represented by 1 line, one is, say, 100 and the others are 1. And assume that this is the case for every display bin. The displayed maximum is then 100 for each display bin, and the average of these displayed values is also 100. The total as displayed would be 100*420 = 42,000. But the other 41 samples per displayed value are ignored. The true total is 59,220 = 141*420 and the true average is 3.36 = 59,220/(42*420). This sampling will always overstate the average and understate the totals. Note that this example is extreme w.r.t. the average. If the other 41 samples were 99, the true sum would be ca. 40x greater than the displayed one!
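The arithmetic in this extreme case can be verified directly:

```python
# 420 display bins, each covering 42 PI Sum 8 samples:
# one sample of 100 and 41 samples of 1 per bin.
bins = [[100] + [1] * 41 for _ in range(420)]

displayed = [max(b) for b in bins]                # what the graph shows
displayed_total = sum(displayed)                  # 100 * 420 = 42,000
displayed_avg = displayed_total / len(displayed)  # 100.0

true_total = sum(sum(b) for b in bins)            # 141 * 420 = 59,220
true_avg = true_total / (42 * 420)                # ~3.36

print(displayed_total, displayed_avg)  # 42000 100.0
print(true_total, round(true_avg, 2))  # 59220 3.36
```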

This effect is apparent when comparing CD-Speed with KProbe results: the PI totals are comparable, but the average CD-Speed shows is significantly higher (8.3 vs. 5.8).

And the DIP scan in the original post definitely does this for PIF (you can count the values shown) and also for PI: the display is 450 pixels wide, the average = 4 and the sum is ca. 1800 = 450*4.

G
(who is off to bed).
edit: corrected overstated maximum to average.

PostPosted: Mon Dec 12, 2005 6:34 pm
by zebraxor
Thank you again Medium Rare, for all your observations! We are looking into this further. :)

PostPosted: Mon Dec 12, 2005 7:03 pm
by nicw
Thanks for your dedication to this, MediumRare, and for your feedback. I have instructed Z to re-open this issue on the Mantis tracker, and I will look for all further correspondence on this issue there.

Due to the time of year and work deadlines, I will look at this issue more deeply when I can give it the time it needs to get resolved.

We acknowledge an issue and a resolution will be implemented in due course.

thanks
NicW

PostPosted: Tue Dec 13, 2005 10:54 am
by dolphinius_rex
NicW was kind enough to pass on a beta version of his attempt to fix the problem with DVDInfoPro. Here's some of the testing I've done with it, and comparison scans with other software.

As you can see from the pictures, all 3 scans are pretty similar.

I've done many comparison scans using all three pieces of software, and the following points have come out:

Nero CD Speed Range of variation:
Max PIE Reported: 156 to 208
Average PIE Reported: 43.21 to 57.53

K-Probe Range of Variation:
Max PIE Reported: 186 to 205
Average PIE Reported: 4.50 to 54.97

DVDInfoPro Range of Variation:
Max PIE Reported: 169 to 188
Average PIE Reported: 65 to 75

Have to run to work now, more comments later :wink:

PostPosted: Tue Dec 13, 2005 12:41 pm
by dolphinius_rex
(at work now)

Based on the recorded variations of the maximum and average values, from all 3 pieces of software, I believe that you're going to get roughly the same results no matter which one you choose.... with a few exceptions:

Exception #1: K-Probe is calculating Averages REALLY weirdly... it's pretty obvious looking at the variations between averages. Out of 3 scans, two of them had averages under 10 (they were 4.x if I recall correctly). Not knowing exactly how the program is programmed, I'm not going to say this is "wrong" or "inaccurate"... especially considering the nature of the beast itself, but I *WILL* say it's different and unexpected.

Exception #2: The total error counts are quite different across all 3 pieces of software. K-Probe has the lead for total errors on the scan I posted, which would also lend credibility to the likelihood that it tested more raw data than DVDInfoPro and CDSpeed. CDSpeed had the next highest volume of total PIE errors, and this was consistent when compared to DVDInfoPro. DVDInfoPro obviously had the lowest total error count. Personally... I'd PREFER to have these totals all be roughly the same, or at least hear a reason as to why they differ so.

But other than those 2 exceptions, which I classify as minor, I'm pretty satisfied with the results.

What do other people think?

PostPosted: Tue Dec 13, 2005 6:06 pm
by MediumRare
Since this is Tuesday, I don't have much time for optical media toys. So it's short shrift time. :wink:

The shape of the curves is very similar and the order of magnitude of the averages and totals are the same. The DIP values are now reasonable so it looks like the problem is solved.

One thing I'd look into, though, is the ratio of total/average, which should be the no. of samples. It isn't for CD-Speed for reasons that Erik Deppe explained and I elaborated on in my previous post. The value differs by a factor of 2 for KProbe and DIP, so perhaps DIP is using only half the samples. CD-Speed gives the average no. of ECC-blocks per sample in the text summary, KProbe shows the information required to determine this in the screen shot (scanned range and sampling count). I don't know if DIP offers this information.
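As a sketch of that consistency check (the numbers below are made up for illustration, not taken from any of the posted scans):

```python
# If a tool computes average = total / n_samples, then total / average
# recovers the sample count it actually used. Comparing this number
# across tools shows whether they averaged over the same samples.
def implied_sample_count(total, average):
    return total / average

# Made-up totals/averages for two tools scanning the same disc:
tool_a = implied_sample_count(840000, 6.0)   # 140000.0 samples
tool_b = implied_sample_count(840000, 12.0)  # 70000.0 samples
print(tool_a / tool_b)  # 2.0 -> the second tool used half as many samples
```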

As to the low KProbe averages, I'd have to see the scans to comment on them. I've seen some tremendous variations in PI count from scan to scan (factor of 5 or more), so it might be something like that. You can send them to me if you like. :D

If you want to investigate KProbe's averages further, save the RAW data (caution: several hundred thousand pieces of information). This allows you to perform your own analysis and check how averages etc. are determined.

G

PostPosted: Tue Dec 13, 2005 6:47 pm
by nicw
Thanks to all involved in this issue. Special thanks to MediumRare for pointing this out in such detail. Big thanks to Dolphy for spending this time confirming, testing and for looking at the beta version fix. Thanks to Zebra for prodding me on this to bump its priority.

I am sorry I could not work on this issue or the code myself, I simply do not have the time right now. The programmer working on it though is very capable and a big thanks goes out to her for working on it with such dedication and haste. ;)

We will leave the mantis entry for this issue hanging around for a while if anyone wants to add comments, further issues, or add wish lists or any other way to improve this.

cheers
NicW

PostPosted: Tue Dec 13, 2005 7:22 pm
by zebraxor
Hi guys,

For your interest, I recommend you read the following post:

http://www.cdrinfo.com/forum/tm.asp?m=1 ... =1&#113007

Take careful note of the large-ish post by Halcyon.

PostPosted: Sat Dec 17, 2005 3:36 pm
by Muchin
dolphinius_rex wrote:K-Probe is calculating Averages REALLY weirdly... it's pretty obvious looking at the variations between averages. Out of 3 scans, two of them had averages under 10 (they were 4.x if I recall correctly).

Did you also get large variations in total values of errors with KProbe? Total number is usually more reliable than the average calculated by the program.

PostPosted: Sat Dec 17, 2005 4:48 pm
by dolphinius_rex
Muchin wrote:
dolphinius_rex wrote:K-Probe is calculating Averages REALLY weirdly... it's pretty obvious looking at the variations between averages. Out of 3 scans, two of them had averages under 10 (they were 4.x if I recall correctly).

Did you also get large variations in total values of errors with KProbe? Total number is usually more reliable than the average calculated by the program.


Here are some of my scans done in K-Probe

PostPosted: Sat Dec 17, 2005 4:49 pm
by dolphinius_rex
Here are the last 3 scans I did with K-Probe (there are some more between this post and the last one, but I think 6 scans should be sufficient)

Of these 6 posted scans the PIE total fluctuates from 786690 to 982169, that's roughly a 25% increase from lowest total to highest.

Of these 6 posted scans the PIF total fluctuates from 125 to 173, that's roughly a 40% increase from lowest total to highest. (In all fairness, the lowest test was a bit extreme for whatever reason. Removing the lowest total value changes the percentage difference to ~16%).
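The percentage spreads quoted above follow from the totals (a quick check; the helper function is just for illustration):

```python
# Percentage increase from the lowest to the highest total over a set of scans.
def spread_pct(lowest, highest):
    return (highest - lowest) / lowest * 100

print(round(spread_pct(786690, 982169), 1))  # PIE totals: ~24.8%
print(round(spread_pct(125, 173), 1))        # PIF totals: ~38.4%
```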

PostPosted: Sat Dec 17, 2005 6:53 pm
by MediumRare
I think the variation of those scans is pretty normal. This could be a thermal effect or something else. I've seen a variation in the PI total of up to a factor of 5 when doing repeated scans.
dolphinius_rex wrote:Exception #1: K-Probe is calculating Averages REALLY weirdly... it's pretty obvious looking at the variations between averages. Out of 3 scans, two of them had averages under 10 (they were 4.x if I recall correctly). Not knowing exactly how the program is programmed, I'm not going to say this is "wrong" or "inaccurate"... especially considering the nature of the beast itself, but I *WILL* say it's different and unexpected.

Thank you for sending me your scans. :D As a result, I'm able to clear up that mystery: the scans with the low average were the ones you did with PI at 1ECC. So the factor of 8 in the average is pretty well what you'd expect (if the total is comparable- see the "normal variation" above).

I still don't understand the difference in the number of samples (total/average) between KProbe and DIP (only 1/2 as many in DIP).

G

PostPosted: Sun Dec 18, 2005 4:28 am
by dolphinius_rex
Heat definitely plays a role in LiteON-based testing. The 1635S in particular seems especially prone to overheating. I've had to power my whole system down for over an hour to get it to read my TYG03 disc without massive CRC errors! :o (After cooling down, you get scans like the K-Probe images shown above.)

It's just one of those things that makes me think LiteONs are the worst thing to happen to DVD media testing.

MediumRare, I'm curious about your opinion of the TYG01 media I tested? I found the results were MUCH more consistent once the media being tested wasn't of extremely low burn quality.