
PostPosted: Tue May 27, 2003 8:52 pm
by rdgrimes
Q-Check has a maximum testing speed of 24x

Grrrrrr :evil:

PostPosted: Tue May 27, 2003 9:11 pm
by cfitz
dodecahedron wrote:the argument that "more people have Lite-On drives" - i still don't buy that as a valid reason to choose a Lite-On drive for testing over any other drive.

the graphic output of PlexTools Pro does seem more "professional" and pretty. in Ian's words, more polished. IMO, this is definitely a valid consideration when choosing which util to use for testing, as this output is going to be displayed in the reviews. (and this, in my opinion, is definitely a much more important consideration than how many readers have a Lite-On or Plextor drive).

I guess we will just have to disagree on those two points. I think that the greater ownership of LiteOn drives is a very valid reason to choose a LiteOn drive for CDRLabs' official test bed. It shouldn't be the only reason, certainly, but since the point of a site like this is to provide usable information to its community, providing information that is most usable to most of that community is of very high importance.

As for the presentation of the results, I don't think that the subjective prettiness of the charts is important at all. As long as the charts are readable, the most important thing is the quality and quantity of the information contained therein. When I am reading a review I value solid objective information, not fluffy presentation. I prefer substance to style. And to me things like media manufacturer, testing speed, drive model, firmware version, testing extent and testing date are not at all extraneous information. Even the "KProbe" label isn't completely extraneous, because it contains KProbe version information that may be important in interpreting results.

Also, for what it is worth (nothing to me), I happen to think that KProbe's charts look more professional than Q-Check's. The only aspect of KProbe's charts that I don't like is the labeling of the time axis. The labels are hard to read since they aren't divided into even numbers.

Everyone has different GUI preferences, and no one's preference can be said to be any more valid than anyone else's. Therefore, I don't think that visual appeal is a very good discriminant for deciding between competing test suites. If all else is equal, then sure, pick the one that is prettier according to the tastes of the final arbiter. But it would be a real shame if prettiness were a primary consideration and style won out over substance. Let's choose the test suite that provides the most accurate, useful and comprehensive information, whichever that may be.

cfitz

PostPosted: Tue May 27, 2003 9:20 pm
by cfitz
Ian wrote:Not sure if it tests every sector. As far as how it defines a C1/C2/CU error, here's essentially what the manual says:

C1 - error correction for the block error rate (BLER)
C2 - like KCK pointed out, this seems to be E22 only
CU - the errors that couldn't be corrected after C2

btw.. Q-Check has a maximum testing speed of 24x.

Thanks Ian. The testing of every sector would be a plus. Any chance you can get a definitive answer from Plextor on that, and also an explanation for why they count only E22 for C2?
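
Just to keep the competing conventions straight, here is a rough sketch of what the different readings would amount to. This is purely illustrative Python of my own making - the FrameErrors fields and the summarize() helper are invented names, and which convention Plextor actually follows is exactly the open question:

# Purely illustrative: roll up per-second frame error counts into C1/C2/CU
# totals under two possible conventions. None of these names come from
# PlexTools or KProbe; they are just labels for the discussion above.

from dataclasses import dataclass

@dataclass
class FrameErrors:
    e11: int  # 1 bad symbol, corrected at the C1 stage
    e21: int  # 2 bad symbols, corrected at the C1 stage
    e31: int  # 3+ bad symbols, passed on to the C2 stage
    e12: int  # 1 bad symbol, corrected at the C2 stage
    e22: int  # 2 bad symbols, corrected at the C2 stage
    e32: int  # 3+ bad symbols, uncorrectable (CU)

def summarize(samples: list[FrameErrors], c2_is_e22_only: bool = True) -> dict:
    """Total C1/C2/CU over a scan.

    With c2_is_e22_only=True this follows the reading of the PlexTools manual
    quoted above (C1 = BLER, C2 = E22 only, CU = uncorrectable after C2);
    with False it counts E12+E22 as C2, which is another common convention.
    """
    c1 = sum(s.e11 + s.e21 + s.e31 for s in samples)  # BLER = E11+E21+E31
    c2 = sum(s.e22 if c2_is_e22_only else s.e12 + s.e22 for s in samples)
    cu = sum(s.e32 for s in samples)
    return {"C1": c1, "C2": c2, "CU": cu}

# Example: one second of data with a couple of C2-stage corrections
print(summarize([FrameErrors(5, 2, 1, 1, 1, 0)]))
# {'C1': 8, 'C2': 1, 'CU': 0}

Depending on which branch Plextor takes, the same disc could show noticeably different C2 totals than KProbe reports, which is why a definitive answer from them matters.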

I have to agree with rdgrimes about the 24x speed limitation. There is something to be said for testing at both high and low read speeds, so the inability to test at high speeds is a definite minus.

I suspect I know why they limited the testing speed to 24x: they probably don't want people getting over-excited and complaining about errors, and thus limit the testing speed to produce lower error levels. As rdgrimes said, Grrrr... :evil: But, I still have to give them credit for putting the tool into the hands of end users, even with a 24x limitation.

cfitz

PostPosted: Tue May 27, 2003 9:34 pm
by Ian
Yeah, I'll email one of my contacts at Plextor tomorrow and see if I can get more info for you guys.

PostPosted: Tue May 27, 2003 10:42 pm
by KCK
cfitz:

Apparently even owners of PlexWriter Premium aren't sure what their C1/C2 counts mean:

http://club.cdfreaks.com/showthread.php ... adid=69593

Note that Frederick claims C1=E11+E21 and CU=E32 without giving a reference.

Well, a "natural" conclusion from this discussion is that we should run KProbe at most at 24x. 8) :P

It seems that dodecahedron may not be aware of the size of the graphic files produced with Q-Check, although he suffers from a slow connection. It will be interesting to hear his reaction after perusing these links:

http://homepage2.nifty.com/yss/qcheck/qcheck01.htm

http://homepage2.nifty.com/yss/qcheck/qcheck03.htm

which were extracted from this thread on CDFreaks:

http://club.cdfreaks.com/showthread.php ... adid=68709

PostPosted: Tue May 27, 2003 11:09 pm
by dodecahedron
cfitz:
yeah, we can agree to disagree.

on the issue of the wide popularity of Lite-On drives - i am not opposed to this being a consideration, just think that it is of the lowest priority (in my opinion of course).

as for the graphics etc: i do think that the "subjective" prettiness of the graphics is important when writing up a review to be published on the web. as for the KProbe info, i am differentiating between the needs of the hobbyist such as you and me, and the needs of (say) Ian who will use these for a review published on the web, and aimed at a rather more general readership than just us regulars of the forum, the CDRW die-hards.

etc. i think you get my point.

KCK:
i didn't get what you meant.
i visited those 2 links, lots of PlexTools screenshots, they are all PNG, 10K each. do you consider this large?

PostPosted: Tue May 27, 2003 11:14 pm
by cfitz
KCK wrote:Apparently even owners of PlexWriter Premium aren't sure what their C1/C2 counts mean:

That's okay. To be honest we LiteOn owners aren't exactly sure what our C1/C2 counts mean either.

KCK wrote:Well, a "natural" conclusion from this discussion is that we should run KProbe at most at 24x. 8) :P

It gets into the old debate as to whether one should test at lower speeds to deemphasize (but not eliminate) the effects of the reader compared to the writer/media or test at higher speeds to reflect expected usage. Both approaches are valid.

KCK wrote:It seems that dodecahedron may not be aware of the size of graphic files produced with Q-Check, although he suffers from a slow connection. It will be interesting to hear his reaction after perusing these links

I'm sorry but I think I missed what you are trying to say. Q-Check makes bigger or smaller graphics files? I don't think it makes any, and those are all screen shots. Am I mistaken? By the way, none of those screen shots seem outrageously large. In fact, even with the full Q-Check window, menu items and all, included in the pictures, they are still smaller than Ian's cropped jpeg that shows just the graph. One of these days we will have to convince Ian to switch to png for screen shots - clearer and smaller. :wink: 8)

cfitz

PostPosted: Tue May 27, 2003 11:18 pm
by cfitz
dodecahedron wrote:cfitz:
yeah, we can agree to disagree.

Agreed. :) 8) Although I will object to calling prettiness "objective". A consideration? Yes. An important consideration? Not in my opinion, but it is certainly valid for it to be an important consideration to you - that's where we agree to disagree. But "objective"? Never! Beauty is in the eye of the beholder! :wink:

cfitz

P.S. Since prettiness of the review published on the web is important to you, can you convince Ian to switch to png for screen shots?

PostPosted: Tue May 27, 2003 11:27 pm
by dodecahedron
cfitz wrote:I will object to calling prettiness "objective".

oops my bad, meant to say "subjective" (i was paraphrasing you).

i edited my post above.

PostPosted: Wed May 28, 2003 12:21 am
by KCK
cfitz and dodecahedron:

Sorry for the confusion about the size of pictures produced for Q-Check. The two links I listed worked quite slowly for me, so I assumed this was due to the size of the graphic files, and hence I hoped that they would exceed dodecahedron's patience! :oops: :P

Still, note that for combined plots, KProbe can display a lot of info in just one picture, whereas Q-Check needs a second picture for maximum and average C1/C2/CU counts.

And, of course, Karr Wang will hopefully provide jitter testing as well.

Further, Q-Check aborts after the first reading/slipping error, whereas Karr Wang is still struggling to make KProbe go on.

PostPosted: Wed May 28, 2003 12:28 am
by CDRecorder
My personal opinion is that testing for C1 and C2 errors with the Lite-On is best, because it will mirror the results that those of us who actually use Lite-On drives will see. However, I can see the value of using the Plextor drive, too.

PostPosted: Wed May 28, 2003 1:06 am
by cfitz
KCK wrote:Still, note that for combined plots, KProbe can display a lot of info in just one picture, whereas Q-Check needs a second picture for maximum and average C1/C2/CU counts.

Agreed. I like the convenience of having everything included in one graphics file. Whether or not that is important to Ian, I don't know. Maybe Ian doesn't mind taking multiple screen shots, cropping out the menu items from PlexTools (which truly are extraneous) to focus just on the Q-Check output, and then combining the results to get a final chart with complete information. I can say that the less work it is to collect and display all the information, the more likely it will be done, and that in the one example we do have, Ian didn't bother to include the extra chart of maximum and average values. Just another consideration...

KCK wrote:And, of course, Karr Wang will hopefully provide jitter testing as well.

Hopefully. He hasn't been around lately. :(

KCK wrote:Further, Q-Check aborts after the first reading/slipping error, whereas Karr Wang is still struggling to make KProbe go on.

True, although this may be less important to people who consider any such error reason enough to condemn the disc to the trash heap, so they don't really care what happens after the error. Personally I like Karr's approach, but I recognize it might not have value to many.

cfitz

PostPosted: Wed May 28, 2003 1:22 am
by eliminator
I still wonder if it's worth twice the price of the new Lite-On (52x32x52) ...?

Bezel logo

PostPosted: Wed May 28, 2003 1:28 am
by Clint
Yeah, Ian, one unit was manufactured in August or September (TLA#0000) and the other in December (TLA#0103). Both made in Japan.

Yeah, it is cheaper for them to do it in China - but I hope you guys don't suffer in quality (since Asia-Pacific units are for some reason specifically made in Japan). Just something I noticed in the review, which was, by the way, good 8)

PostPosted: Wed May 28, 2003 1:30 am
by CDRecorder
Is the new 52x32x52 Lite-On out yet? It still says "coming soon" on Lite-On's web site.

Edit: Just saw the thread about the LTR-52327S preview. :oops:

PostPosted: Wed May 28, 2003 3:00 am
by Halc
Thanks for the great review Ian!

I've always wanted to ask this, but now finally remembered to do it:

When you test using CD Speed (ScanDisc), what do the numbers actually mean?

If a drive reports a sector damaged, but NOT unreadable, this could be due to two things:

1) The disc really is damaged and the drive is accurate in reporting it back (a lousy drive might even miss some sectors that really are damaged on the disc)

2) The disc is not necessarily damaged, but the drive tries to read (and re-read) that part at XX speed and fails (even though ScanDisc tries to use the set-speed command), reporting false errors (a statistically too-high number of them).

I think it is difficult to draw any hard conclusions from the ScanDisc test, unless I know what it means.

Also, would it be possible to get the "time taken to complete ScanDisc test" values along with those percentage figures? This would help to further determine the value of those percentages in the ScanDisc test.

Thanks again for another great review!

regards,
Halc

PS OffTopic: The new LiteOn LTR-52327S is hopefully about to ship. A user on the CDFreaks forum has received a test unit. And please, let's not start another "Plex is expensive, Plex sucks!" thread :)

PostPosted: Wed May 28, 2003 7:42 am
by Ian
LiteOnGuy wrote:Is the new 52x32x52 Lite-On out yet? It still says "coming soon" on Lite-On's web site.



Image

PostPosted: Wed May 28, 2003 8:26 am
by Scour
Hello!

In Germany it's not available.

PostPosted: Wed May 28, 2003 10:32 am
by cfitz
Halc, have you read this thread?

http://www.cdrlabs.com/phpBB/viewtopic.php?t=9237

cfitz

PostPosted: Wed May 28, 2003 12:07 pm
by Halc
cfitz,

yes, thanks. Unfortunately, I'm either thick or it doesn't explain my dilemma one way or another, but rather muddles it further :)

BTW, did that "new version of" CD Speed with C1/C2 testing get released already? Or is it upcoming?

Regards,
Halc

PostPosted: Wed May 28, 2003 4:15 pm
by Ian
Halc wrote:BTW, did that "new version of" CD Speed with C1/C2 testing get released already? Or is it upcoming?


No, it's not out yet. Erik is still working on it. He's sent me a few CD Speed betas that had some new features, and that was one of them.

PostPosted: Thu May 29, 2003 3:54 pm
by justinhamill
Ian - having just (brainlessly) emailed you, I have for the first time browsed these forums and seen how involved you are with them. So, my apologies, and my message again...

Ultima (Artec) are selling a 52X32X52 rewriter which they describe as RW burning at 32X CLV. It would be great to see this stacked up against the 32X rewriters from MSI, Mitsumi, and Plextor, particularly as they use P-CAV, I believe.

Finally, and with a feeling of some belated satisfaction, I have discovered that CDRLabs reviews are the ones for me.

PostPosted: Fri May 30, 2003 12:57 pm
by Matt
cfitz wrote:In this situation there is no official standard test drive that can be used for C1/C2 testing (certainly not one that CDRLabs can afford). Lacking a readily available official standard, the next best thing is a de facto standard. Among the limited choices for test drives that can do C1/C2 testing, LiteOn drives are far and away closer to being a de facto standard, due to their overwhelming sales advantage.


We've also done all of our previous testing on Lite-On drives, and if Plextor isn't using the same definitions of what C1, C2, etc. are, our data will be misaligned.

cfitz wrote:KProbe allows the user to choose graph colors, select linear or logarithmic scales, combine or separate C1/C2 graphs, scale charts automatically or manually, label the graphs, save directly to png, bmp or jpg output files, etc., etc. I don't know if Q-Check allows any of that, but based on the screen shots Ian posted, it appears to be missing at least some of these features.


For the flexibility and thoughtfulness that went into this part of KProbe's code alone, it should be used. :)

cfitz wrote:Finally, don't forget that the author of KProbe, Karr Wang, has shown a willingness to work with average users to improve KProbe. I don't know if Ian has tried contacting Karr, but I imagine there is a reasonable chance that Karr would be even more willing to work with Ian directly since Ian runs CDRLabs and has the status of being an authority in the field of CD-RW drive testing and reviewing.


I'm sure he wouldn't mind if we started using his tool as the de facto test tool for measuring errors either. Who knows, some unlikely employers might even take notice. What I like about using KProbe as the error test tool is that we may(?) have easy access to the developer when odd scenarios arise, and he may be able to clarify whether it's a software bug, something that can be tweaked, or a problem that is for sure in the hardware. Good luck talking to Plextor about minor issues with their software that have significant meaning in reviews...

cfitz wrote:Ian, is it really just the presentation of Q-Check that you prefer?


I know for a fact that Ian likes eye candy... he was impressed with WorldClient webmail when we first rolled it out on the mail server. (Even though he doesn't use it at all... Socheat is the lone user for the most part :))

PostPosted: Fri May 30, 2003 2:54 pm
by Ian
Eye candy good!!

Webmail bad!!

PostPosted: Wed Jun 04, 2003 11:42 am
by robertb
Hi
Thanks for the detailed review of the PlexWriter 52x32x52.
I bought one two days ago from the Overclockers website for approx £100 plus delivery.
This package contains no IDE cables, no sound cable and no instruction manual - just Nero 5.5 and PlexTools v2.02 on two separate CDs.
Currently I am still waiting for a reply from Plextor as to why the PlexTools program refuses to work at all and throws out the following Windows error message:
PLEXTOOL caused a general protection fault
in module MCILAU.DLL at 0002:00007f40.
Registers:
EAX=00020000 CS=4baf EIP=00007f40 EFLGS=00000202
EBX=4baf37b4 SS=5527 ESP=00008824 EBP=00008830
ECX=0002124d DS=5387 ESI=00000000 FS=0f27
EDX=81a63c67 ES=0867 EDI=00000dc0 GS=0000
Bytes at CS:EIP:
8e 46 fe 66 26 83 bc 1e 03 00 74 49 66 26 ff b4
Stack dump:
00010dc0 4baf8223 3c670dc0 822a884a 0dc04baf 08020001 0dc00000 01ecffff 000051d6 886a0000 4baf29f2 51d601ec 00000000 00010802
001b16bf 00000000

Apparently the mcilau.dll file is something to do with the Netscape media player, which as far as I know I have never used, and uninstalling the Netscape media player does not solve the problem.
I already posted this on the forum, but thought I would post it in this section as well.
I am running Windows 98 v4.1, and Netscape came on this Packard Bell computer together with a few other utilities.
After many hours I got PlexTools working by deleting the mcilau.dll file, although I have no idea if this dll file is required by something else.
I am waiting for a reply from Plextor with some explanation/solution.
regards
robertb