

X-bit Labs Tests Lots of DVD Media

DVD-R/W, DVD+R/RW, DVD-RAM


Post by 21st Hermit on Mon Apr 18, 2005 7:44 am

X-bit Labs tested some two dozen flavors of DVD media. Sorry, no executive summary; too head-throbbing this early. [-o<
21st Hermit
CD-RW Thug
 
Posts: 79
Joined: Thu Jun 17, 2004 12:35 pm
Location: Colorado

Post by Halc on Mon Apr 18, 2005 9:00 am

Lots of comments in their comment section already.

I commend them for the huge amount of testing they've done, but with the very basic misassumptions, misunderstandings, and even some factual errors, well... it's hard to agree with their "ultimate" label :)

I'd say it's indicative at best and even then possibly outright misleading.
Halc
CD-RW Player
 
Posts: 599
Joined: Mon Feb 04, 2002 9:13 am

Post by dolphinius_rex on Mon Apr 18, 2005 10:04 am

oh..... my...... gosh.....

I've seen some poorly planned, poorly understood.... and poorly concluded testing done before, but this has to be the most craptacular I've seen to date.

There are so many problems with it, I couldn't even dream of listing them all! But I'll start with some of the most painful ones.

#1: They burned a single disc with a single drive only
#2: The single drive used is a REALLY crappy drive (LiteON!!!)
#3: They tested the media in only one drive, which happens to be really bad at it (LiteON!!!)
#4: They used 2 pieces of software which do EXACTLY the same thing to test their data collection, which only amounts to scanning with one piece of software twice.
#5: They make painful claims about data retention based on initial burn quality only (which I have been known to do as well on my own webpage in my earlier years... :o ).

They need to seriously re-think their methods, and spend a month or two learning more about media testing and how it really works.
Punch Cards -> Paper Tape -> Tape Drive -> 8" Floppy Diskette -> 5 1/4" Floppy Diskette -> 3 1/2" "Flippy" Diskette -> CD-R -> DVD±R -> BD-R

The Progression of Computer Media
dolphinius_rex
CD-RW Player
 
Posts: 6923
Joined: Fri Jan 31, 2003 6:14 pm
Location: Vancouver B.C. Canada

Post by Justin42 on Mon Apr 18, 2005 10:17 am

I'm glad it wasn't just me. I looked at that, thinking it could be really neat but was shocked at how bad the methodology was.

Articles like that do more harm than good... as this'll become one of those articles people point to for YEARS that have to be debunked.
Justin42
CD-RW Player
 
Posts: 723
Joined: Sat Jun 29, 2002 10:30 pm

Post by 21st Hermit on Mon Apr 18, 2005 11:50 am

dolphinius_rex wrote:oh..... my...... gosh.....

I've seen some poorly planned, poorly understood.... and poorly concluded testing done before, but this has to be the most craptacular I've seen to date.

There are so many problems with it, I couldn't even dream of listing them all! But I'll start with some of the most painful ones.

#1: They burned a single disc with a single drive only
#2: The single drive used is a REALLY crappy drive (LiteON!!!)
#3: They tested the media in only one drive, which happens to be really bad at it (LiteON!!!)
#4: They used 2 pieces of software which do EXACTLY the same thing to test their data collection, which only amounts to scanning with one piece of software twice.
#5: They make painful claims about data retention based on initial burn quality only (which I have been known to do as well on my own webpage in my earlier years... :o ).

They need to seriously re-think their methods, and spend a month or two learning more about media testing and how it really works.


Would it be fair to conclude you weren't impressed? #-o

BTW, thanks for reading the test. I saw 24+ pages and my head started hurting. :)
21st Hermit
CD-RW Thug
 
Posts: 79
Joined: Thu Jun 17, 2004 12:35 pm
Location: Colorado

Post by Ian on Mon Apr 18, 2005 12:07 pm

I think their testing methods have been ripped on enough, so I won't go there. Personally, I would have liked to see some tests with more 16x +R and -R media. A lot of the discs there have been tested to death. It might have been a different story if the Lite-On could write to this media at 16x... but as we all know, that isn't the case.
"Blu-ray is just a bag of hurt." - Steve Jobs
Ian
Grand Poobah
 
Posts: 15130
Joined: Sun Apr 08, 2001 2:34 pm
Location: Madison, WI

Post by hoxlund on Mon Apr 18, 2005 1:50 pm

haha not even TYG02 tested
MSI MEG Prospect 700R Case
be quiet! Straight Power 1200 Platinum PSU
ASRock X870E Taichi Mobo
AMD Ryzen 7 9800x3D CPU
Corsair Dominator Titanium 2x32GB 6000MHz
MSI RTX 4090 Liquid Suprim X
hoxlund
CD-RW Player
 
Posts: 3709
Joined: Mon May 27, 2002 12:55 am
Location: Idaho

Post by Wesociety on Mon Apr 18, 2005 2:34 pm

I read the article before reading the reactions here.
I agree with most of Dolphin's points. ;)

Another thing to point out is that they mistakenly concluded that the Digitrex 8x DVD-R was manufactured by Taiyo Yuden. (understandable since they just looked at DVDIdentifier for this info)
Looking at the brand and results, the disc is obviously not an authentic Taiyo Yuden made disc. Just a "fake" disc using the TY MID.

And of course we would disagree with many of the "Ultimate" disc quality conclusions, such as this statement about MCC003:
X-bit wrote:The scanning of the operational layer of the medium betrays its low quality. The maximums of PI errors and failures by far exceed the norm. The quality of the recording layer is not stable, judging by the appearance of the diagrams. It is worse at the beginning and the outermost half of the disc. We don’t think this disc is going to be a good purchase for people who care about the safety of the recorded information.
Wesociety
CD-RW Player
 
Posts: 1234
Joined: Tue Mar 09, 2004 11:33 am
Location: Phoenix, AZ

Post by RJW on Mon Apr 18, 2005 3:57 pm

Wesociety wrote:I read the article before reading the reactions here.
I agree with most of Dolphin's points. ;)

Another thing to point out is that they mistakenly concluded that the Digitrex 8x DVD-R was manufactured by Taiyo Yuden. (understandable since they just looked at DVDIdentifier for this info)
Looking at the brand and results, the disc is obviously not an authentic Taiyo Yuden made disc. Just a "fake" disc using the TY MID.

Just a fake disc. Well, it's probably Optodisc, judging by some other test data.
Also, don't blame DVDIdentifier. If he really wanted to use just one tool, he made the right choice in using this one. I only wish he had done a little more background research, but then again we should know by now that 90% of today's journalists don't seem to be able to do that.
Also, about DVDIdentifier: we're planning to put TY's serial coding system in, plus a warning, because I get sick of explaining that fake TY exists and that no tool that identifies by MID can tell fake TYs from real ones.
Also, it seems that MCC is now having the same problem. (Thanks to Infosmart, who now abuses the MCC code. Not a smart move; maybe it's time to remove the "smart" part of the name.)

About the article:
I thought it was OK, taking into account that this is a generic hardware testing site. (Yeah, I didn't have high expectations.)

The only really bad thing was the choice of title, because it gave some people too-high expectations. They would probably have expected something like at least 10 different burners, multiple discs, two analyzers (one each from AudioDev and DaTARIUS), some advanced ageing tests (including humidity, corrosive components, and UV), and of course rewrites on rewritables.
Yes, that would have been something worthy of the name.
But oh well. :D

As Halcyon commented on the site: nice try.
RJW
CD-RW Player
 
Posts: 1379
Joined: Sun Oct 28, 2001 8:00 pm
Location: The netherlands

Post by RJW on Mon Apr 18, 2005 4:06 pm

dolphinius_rex wrote:oh..... my...... gosh.....

#2: The single drive used is a REALLY crappy drive (LiteON!!!)
#3: They tested the media in only one drive, which happens to be really bad at it (LiteON!!!)


One good point here: things can only get better. :D
But those two are the same point. However, it wouldn't have been that hard to find a fifth bad point. Really.


#5: They make painful claims about data retention based on initial burn quality only (which I have been known to do as well on my own webpage in my earlier years... :o ).

So do 85% of the magazines. Hey, they have to start somewhere.

They need to seriously re-think their methods, and spend a month or two learning more about media testing and how it really works.

Problem is that these people probably do have a life, unlike someone who tests the performance of a hundred discs on a single drive and just calls it a drive review. :D (Why, you could have called it the even more ultimate...)

Oh well, the only reason I'm not that hard on these guys is that, the Friday before, I probably read the joke PC CONSUMENT published.
Okay, it was 16x media and it had lots of dual-layer media (mostly Ritek).
Now, I only took a quick look at that one and was already too scared to read what the text said, because they based the whole ranking on CD Speed's quality score and a single test drive; I forget which one, but I think an AOpen. (Multiple drives for burning: that was something they did right.)
Looking only at the quality score means only PIF reports were taken into account. :)
RJW
CD-RW Player
 
Posts: 1379
Joined: Sun Oct 28, 2001 8:00 pm
Location: The netherlands

Post by Wesociety on Mon Apr 18, 2005 4:28 pm

RJW wrote:Also, it seems that MCC is now having the same problem. (Thanks to Infosmart, who now abuses the MCC code. Not a smart move; maybe it's time to remove the "smart" part of the name.)

LOL! Infostupid? Or maybe Infoscam?
Wesociety
CD-RW Player
 
Posts: 1234
Joined: Tue Mar 09, 2004 11:33 am
Location: Phoenix, AZ

Post by Wesociety on Mon Apr 18, 2005 4:29 pm

RJW wrote:Oh well, the only reason I'm not that hard on these guys is that, the Friday before, I probably read the joke PC CONSUMENT published.
Okay, it was 16x media and it had lots of dual-layer media (mostly Ritek).
Now, I only took a quick look at that one and was already too scared to read what the text said, because they based the whole ranking on CD Speed's quality score and a single test drive; I forget which one, but I think an AOpen. (Multiple drives for burning: that was something they did right.)
Looking only at the quality score means only PIF reports were taken into account. :)

Eek! They based the whole disc ranking on the CD DVD Speed quality score?? :o
How about a link to that article?
Wesociety
CD-RW Player
 
Posts: 1234
Joined: Tue Mar 09, 2004 11:33 am
Location: Phoenix, AZ

Post by dolphinius_rex on Mon Apr 18, 2005 5:39 pm

Wesociety wrote:
RJW wrote:Also, it seems that MCC is now having the same problem. (Thanks to Infosmart, who now abuses the MCC code. Not a smart move; maybe it's time to remove the "smart" part of the name.)

LOL! Infostupid? Or maybe Infoscam?


How about Infosmack?
Punch Cards -> Paper Tape -> Tape Drive -> 8" Floppy Diskette -> 5 1/4" Floppy Diskette -> 3 1/2" "Flippy" Diskette -> CD-R -> DVD±R -> BD-R

The Progression of Computer Media
dolphinius_rex
CD-RW Player
 
Posts: 6923
Joined: Fri Jan 31, 2003 6:14 pm
Location: Vancouver B.C. Canada

Post by dodecahedron on Mon Apr 18, 2005 11:01 pm

Wesociety wrote:Eek! They based the whole disc ranking on the CD DVD Speed quality score?? :o
How about a link to that article?

in the first post of this topic.
One Ring to rule them all, One Ring to find them,
One Ring to bring them all and in the darkness bind them
In the land of Mordor, where the Shadows lie
-- JRRT
M.C. Escher - Reptilien
dodecahedron
DVD Polygon
 
Posts: 6865
Joined: Sat Mar 09, 2002 12:04 am
Location: Israel

Post by Wesociety on Tue Apr 19, 2005 2:36 am

dodecahedron wrote:
Wesociety wrote:Eek! They based the whole disc ranking on the CD DVD Speed quality score?? :o
How about a link to that article?

in the first post of this topic.

I'm referring to a totally different article.
RJW mentioned an article published by "PC CONSUMENT".
Wesociety
CD-RW Player
 
Posts: 1234
Joined: Tue Mar 09, 2004 11:33 am
Location: Phoenix, AZ

Post by Halc on Tue Apr 19, 2005 3:36 am

What pains me is this:

They did a lot of work.

I mean, of course there are some no-lifer enthusiasts with no full time work/family/obligations (who, me?), who could do even more work.

Regardless, they did a great deal of work.

But had they spent 20% of that time asking around and reading here, CDRInfo, and CDFreaks, they would have used the remaining time much more effectively.

I'm not saying that we have all the information, or that the testing that, for example, I do myself is faultless. Of course it's not. What I've learned during the past 3-4 years is that this is constant learning, and one must remain humble.

But at least many (if not most) of the commonly agreed mistakes can be weeded out by asking around, participating in discussions, and reading others' test methodologies (and the criticism of them).

If one does not do it, there can be several reasons: language barrier, hubris and/or not knowing of the possible sources (even if trying to find out). Language barrier and not finding out even if trying are understandable. Hubris is not a good reason, in my book anyway :)

Real-life constraints don't immediately strike me as the most obvious reason if one ends up doing a huge amount of testing anyway (thus spending a lot of time on the project as a whole).

It boils down to balancing preliminary research and planning with the actual unavoidable grunt work (burn, test, take pictures, compile, analyze, draw conclusions, write article).

So, based on all this, I recommend we not lambaste the people who fall short of our often very unrealistic test criteria, but give them constructive criticism. After all, they did try, they did do a lot of work, and they can still learn more.

And there aren't enough knowledgeable testers in the world; if there were, we wouldn't need discussions like these, we could just read the results of others. So let's try to educate them. :)


regards,
halcyon

PS I have a similar problem myself in terms of testing. I'm going to do a test of 5 drives (limited on purpose to 5). Now, I could of course burn many discs with each drive and scan each of them once. Those of you who know my current stance on this know that I don't think much of such tests in terms of statistical reliability.

My other choice is to burn a very small number of discs (the most used), but to scan all the burns on various readers (excluding LiteOn completely). Now, this is far from optimal too, because I can then use a max of 6 different brands and the amount of testing will still be huge. And the readers will cry "why didn't you test X, why didn't you test Y, etc."

Of course, there is the third option: burning a huge number of discs and scanning with a large number of drives. While this is theoretically the best option, I've noticed it's just not doable (at least I can't do it). It takes such a long time that, even doing it for a couple of hours each day, the results become outdated before one can publish them :)
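
To make the trade-off concrete, here's a back-of-the-envelope sketch (hypothetical Python; every count is invented for illustration) of how the scan workload multiplies under the three designs:

[code]
# Rough scan-count arithmetic for the three test designs above.
# Every quantity here is invented purely for illustration.

drives = 5                       # burners under test

# Design 1: many media, each burn scanned once on a single reader.
media, burns_per_medium = 20, 3
design1 = drives * media * burns_per_medium * 1

# Design 2: few media ("the most used"), every burn scanned on several readers.
media2, readers = 6, 4
design2 = drives * media2 * burns_per_medium * readers

# Design 3: many media AND many readers -- theoretically best, rarely doable.
design3 = drives * media * burns_per_medium * readers

print(design1, design2, design3)   # 300 360 1200
[/code]

The exact totals don't matter; the point is that the third design multiplies in every dimension at once.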

So, whatever one does, one will get huge amounts of criticism (some deserved, some completely ignorant).

So, from this point of view, it is (IMHO) useful to do the test Right (TM) with reliable - even if limited in scope - test results, because you will get scorned anyway.

It's better to have a little bit of reliable information than a huge amount of unreliable information. Imho.
Halc
CD-RW Player
 
Posts: 599
Joined: Mon Feb 04, 2002 9:13 am

Post by RJW on Tue Apr 19, 2005 3:49 am

Wesociety wrote:
RJW wrote:Oh well, the only reason I'm not that hard on these guys is that, the Friday before, I probably read the joke PC CONSUMENT published.
Okay, it was 16x media and it had lots of dual-layer media (mostly Ritek).
Now, I only took a quick look at that one and was already too scared to read what the text said, because they based the whole ranking on CD Speed's quality score and a single test drive; I forget which one, but I think an AOpen. (Multiple drives for burning: that was something they did right.)
Looking only at the quality score means only PIF reports were taken into account. :)

Eek! They based the whole disc ranking on the CD DVD Speed quality score?? :o
How about a link to that article?

Like I said, it was a magazine, and it was so craptacular that I won't buy and scan it just so you can see how big a joke it was. Yeah, the whole ranking was based on the CD DVD Speed quality score.
RJW
CD-RW Player
 
Posts: 1379
Joined: Sun Oct 28, 2001 8:00 pm
Location: The netherlands

Post by Halc on Tue Apr 19, 2005 3:56 am

Getting a little bit off-topic from the original article discussion, but relevant still (I hope).

BTW, I've asked Erik Deppe for more details about how each drive is calibrated for the Quality Score in CD DVD Speed, and whether the real units that each chipset/model/firmware measures can be published. Erik has talked about some of this in the long CD DVD Speed thread at CDFreaks, but much still remains unanswered.

About the quality score, can we trust it?

Well, there's no absolute reason that we couldn't, if we knew how it was constructed and how each drive was "calibrated" to give a scale of results from 0 to 100.

However, as long as this "calibration" remains a mystery, I think it boils down to whether we trust Erik D. and the people who send him test results for each drive that is supported in CD DVD Speed.

Now, I'd personally like to see the following type of tool:

1) Report raw data (no manipulation through software) that the drive/chip reports

2) Report statistically corrected (proper statistical methodology, not just averaging!!) 8ECC PIE and 1ECC PIF estimated/interpolated results from each drive (see the sketch after this list).

3) Make a quality assessment algorithm, which is known and open to criticism, that weighs the quality of the reading results by the drive doing the reading, using raw data alone.

4) Rate readers for read ability (not readability, but the ability to read). Consider grading the results of a scan based on which type of reading drive did the error reading. For me, LiteOn 16x 3S-series drives would get a relatively low reliability ranking, because they are such good readers compared to almost anything else.
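
Here is the sketch promised in point 2: a minimal, hypothetical example assuming the drive hands us raw PI error counts per ECC block. The data format and function names are my own invention, not anything an existing tool exposes; PIF would work the same way with a 1-ECC-block window.

[code]
# Hypothetical post-processing of raw drive data: per-ECC-block PI error
# counts in, spec-style PIE statistics out. Format and names are assumed.
from statistics import median

def pie_sums(pi_errors_per_block, window=8):
    # PIE is defined over 8 consecutive ECC blocks, so slide that
    # window across the raw per-block counts.
    return [sum(pi_errors_per_block[i:i + window])
            for i in range(len(pi_errors_per_block) - window + 1)]

def summarize(samples):
    # Report max, median, and 95th percentile instead of a bare mean,
    # so a single spike or a long plateau can't hide in the average.
    ordered = sorted(samples)
    return {
        "max": ordered[-1],     # the spec ceiling is 280 PIE per 8 ECC blocks
        "median": median(ordered),
        "p95": ordered[int(0.95 * (len(ordered) - 1))],
    }

raw = [3, 5, 2, 8, 40, 6, 4, 3, 2, 7, 5, 1]   # made-up per-block counts
print(summarize(pie_sums(raw)))
[/code]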

Of course, not being a paying customer, I can only propose ideas and not really demand anything.

I just hope we can get some sense into this testing already... we've come a long way from the days of the first kProbe, but we're still not *there* yet, imho.

regards,
halc
Halc
CD-RW Player
 
Posts: 599
Joined: Mon Feb 04, 2002 9:13 am

Post by RJW on Tue Apr 19, 2005 4:00 am

Halc wrote:
PS I have a similar problem myself in terms of testing. I'm going to do a test of 5 drives (limited on purpose to 5). Now, I could of course burn many discs with each drive and scan each of them once. Those of you who know my current stance on this know that I don't think much of such tests in terms of statistical reliability.

My other choice is to burn a very small number of discs (the most used), but to scan all the burns on various readers (excluding LiteOn completely). Now, this is far from optimal too, because I can then use a max of 6 different brands and the amount of testing will still be huge. And the readers will cry "why didn't you test X, why didn't you test Y, etc."

Of course, there is the third option: burning a huge number of discs and scanning with a large number of drives. While this is theoretically the best option, I've noticed it's just not doable (at least I can't do it). It takes such a long time that, even doing it for a couple of hours each day, the results become outdated before one can publish them :)

So, whatever one does, one will get huge amounts of criticism (some deserved, some completely ignorant).

So, from this point of view, it is (IMHO) useful to do the test Right (TM) with reliable - even if limited in scope - test results, because you will get scorned anyway.

It's better to have a little bit of reliable information than a huge amount of unreliable information. Imho.


A better tester means faster results. What I'm saying is that if we had access to a tester which could give us info on other parameters, we wouldn't need multiple test runs. The only things that would then still need to be taken into account are multiple batches and multiple drives.
However, with today's drives this is still not possible.

Problem is, you always have to sacrifice something, unless you have too much time on your hands or access to the professional analyzers; and even then it won't be easy.
So the point is, it's not the results alone; it's also the story that has to fill the gaps which were intentionally left open for lack of time.
RJW
CD-RW Player
 
Posts: 1379
Joined: Sun Oct 28, 2001 8:00 pm
Location: The netherlands

Post by Halc on Tue Apr 19, 2005 10:00 am

I'm getting a lot off-topic here... I hope you bear with me...

Good points RJW.

I still don't know what the ideal (or better) test reader would be though.

Of course calibrated, reliable from scan to scan and everything else that Pulstec drives are supposed to be.

Still, even with all that, they only represent one implementation of offset & slicer values.

As we know, this implementation varies from drive to drive and there is no one right way to do it. Well, the ideal is to attain minimum jitter, but drives do it differently and reach different minima.

As such, imho, the only way to do the testing is to scan with multiple different types of implementations (drives): the kinds of drives that are actually out there, which vary in their reading capability.

Some of these drives might fail on discs where the laser power was miscalibrated, while others will not.

Other drives might fail on discs where the laser exposure was suboptimal, while some drives will not.

Again, it would theoretically be possible to implement a drive with a very adaptive reading capability.

It would try various offset/slicer values to cope with discs/burns of different types.

However, this kind of drive would surpass even LiteOns in its capability to read successfully (i.e. with low DC jitter, hence a low number of read errors) even discs burned on camel dung.

And if not everybody in the world has a drive like that, then it's just misleading to use such a drive for testing.

Hence the problem of having to use various different types of drives (with their different failings in reading) to gauge the overall readability/compatibility (thus 'quality') of a certain burn.

That at least is how I understand it.
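
To illustrate with a toy example (my own construction, not an established metric; the spec ceilings are the real DVD limits, but all the scan numbers are made up):

[code]
# Toy "compatibility" gauge: instead of trusting one (possibly very
# forgiving) reader, check how many readers in a mixed pool keep the
# same burn within the DVD spec limits. Scan numbers are invented.

SPEC_PIE_MAX = 280    # max PIE per 8 ECC blocks
SPEC_PIF_MAX = 4      # max PIF per ECC block

scans = {
    "forgiving reader": {"pie_max": 35,  "pif_max": 2},
    "average reader":   {"pie_max": 180, "pif_max": 3},
    "picky reader":     {"pie_max": 410, "pif_max": 6},   # chokes on this burn
}

def within_spec(s):
    return s["pie_max"] <= SPEC_PIE_MAX and s["pif_max"] <= SPEC_PIF_MAX

passing = [name for name, s in scans.items() if within_spec(s)]
print(f"{len(passing)}/{len(scans)} readers keep this burn in spec: {passing}")
[/code]

A burn that only a super-reader can handle scores low on such a gauge, which is exactly the point.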

I wish there was a simpler, more elegant and more trustworthy way.

But at least I haven't been able to figure one out (unless one has access to very low-level data about the burn, something which even CATS fails to give for all the required variables).

Maybe sometime in the future the capability of drives to read dvd discs will converge near some practical maximal performance.

However, given that even CD-RW drives have not managed that in the past 15 years (and for them there is a theoretical benchmark, namely Rodan), I see a very slim chance of it happening on the DVD writer front.

cheers,
halc

PS Back to topic:

What would you have done differently in the X-bit Labs test? What would you not have done? What would you have added? Imagine using a similar amount of time for the modified test scenario as they used, not more and not less.
Halc
CD-RW Player
 
Posts: 599
Joined: Mon Feb 04, 2002 9:13 am

