
In House Review - Plextor PlexWriter Premium-U 52/32/52

PostPosted: Tue Dec 16, 2003 1:56 pm
by Ian
Yes, it's that time again. Time for another review. Today CDRLabs has taken a look at Plextor's newest external CD-RW drive, the PlexWriter Premium-U. Based on the award-winning Premium, the Premium-U reads and writes at 52x, rewrites at 32x, and has a huge 8MB buffer. The drive also has many of the features we've come to expect from Plextor, along with a number of new ones like SecuRec, Q-Check and GigaRec.

Can the Premium-U deliver the type of performance Plextor is known for? Is it the fastest external writer around? You'll have to read the review to find out.

[url=http://www.cdrlabs.com/reviews/index.php?reviewid=211]Plextor PlexWriter Premium-U 52/32/52 USB 2.0 CD-RW[/url]

If you have any comments or questions about this review or the Plextor Premium-U, please post them in the forum using the link provided below.

Plextor Premium-U review

PostPosted: Thu Jan 08, 2004 4:05 am
by theTAO
Okay, I have a question. :) I plan to buy the internal version of this drive (originally the 48x24x48, until I saw how the price had dropped) very shortly and have read several reviews. This one seems more critical than most, which is a nice departure.

But reading sections of the "Performance" page, specifically the parts on DAE and C1/C2 errors, I was struck by how the criticisms seem aimed more at the software than at the drive itself. For example, high-speed DAE with Nero seems problematic, but no similar issues are raised with CD DAE. For Ian, or anybody who owns a Premium and has tried these tests themselves: is this the right interpretation? Would it be worth asking Ahead or the CD DAE people to explain this? Do you ever do this, or is it normally outside the scope of your reviews?

There's a similar issue where the Plextools and KProbe programs differ in the number of C1/C2 errors. Obviously, they can't both be right. In case there's a bug in Plextools, would it be worth bringing this issue to Plextor's attention? I also wonder which program WSES (?) would side with.

While I'm already convinced the Premium is a solid drive, knowing that even the software can have such a wild effect on results from the same drive is important to me... since I don't use Windows. Perhaps in the future you can try to resolve these issues a bit more, so that it's only the hardware being rated?

Thanks,

Todd

PostPosted: Thu Jan 08, 2004 10:11 am
by Ian
I'm sorry, but I really don't get where you're going here. DAE problems? Are you talking about the slowdowns? I talked to Plextor about this when we reviewed the internal Premium. They really didn't have anything to say... and really didn't take my comments seriously, even though the problem occurred with both the PX-W5224A and the Premium-U. These problems occurred with CD DAE and other software as well. Definitely a hardware thing.
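
Since you're not on Windows anyway, you can see the slowdowns for yourself with something as crude as the sketch below. It just reads the disc in big chunks and prints the throughput for each one, so a mid-disc dip shows up right away. Treat it as a rough probe, not a DAE test (it reads raw data, not audio samples), and /dev/cdrom is an assumption; substitute whatever device node your drive actually uses.

[code]#!/usr/bin/env python3
# Rough read-speed probe: pull the disc in fixed-size chunks and log MB/s
# per chunk. A mid-disc slowdown shows up as a dip in the numbers.
# NOTE: /dev/cdrom is an assumption; use your drive's actual device node.
import time

CHUNK = 4 * 1024 * 1024  # 4 MB per sample

with open("/dev/cdrom", "rb", buffering=0) as disc:
    pos = 0
    while True:
        start = time.monotonic()
        data = disc.read(CHUNK)
        if not data:
            break
        elapsed = time.monotonic() - start
        pos += len(data)
        print(f"{pos / 1048576:7.1f} MB  {len(data) / 1048576 / elapsed:6.2f} MB/s")[/code]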

The whole KProbe vs. Plextools thing is something that has been discussed for ages. You're going to get different results depending on a) the drive you use and b) the software you use. IMO, neither KProbe nor Plextools (nor WSES, for that matter) is 100% accurate. However, the results can give you a good idea of the writing quality and are good for comparisons.
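
To give you an idea of what I mean by "good for comparisons": if you export the per-interval counts from either program, you can boil each scan down to a total, a peak, and an average, and compare burns scanned with the same tool on the same drive. The numbers below are made up just to show the idea:

[code]# Minimal sketch: compare two C1 scans on relative terms.
# The per-second counts here are invented; real ones would come out of
# PlexTools or KProbe. Since no tool's absolute counts are gospel, only
# compare scans taken with the SAME program on the SAME drive.

def summarize(counts):
    """Return (total, peak, average) for a list of per-second error counts."""
    return sum(counts), max(counts), sum(counts) / len(counts)

burn_a = [0, 1, 0, 3, 2, 0, 1]   # hypothetical C1/s for disc A
burn_b = [4, 6, 2, 9, 5, 3, 7]   # hypothetical C1/s for disc B

for name, scan in (("disc A", burn_a), ("disc B", burn_b)):
    total, peak, avg = summarize(scan)
    print(f"{name}: total={total}  peak={peak}  avg={avg:.2f}/s")[/code]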

As far as the "hardware being rated" goes, that's the whole point of the performance tests. We stick to a standard set of tests, using a standard set of software, so that we can get an idea of how the hardware performs. If we used a different software suite for each review (like the big magazines do), it would be impossible to compare performance across drives.

PostPosted: Thu Jan 08, 2004 6:12 pm
by theTAO
DAE problems? Are you talking about the slowdowns?
[...]
These problems occurred with CD DAE and other software as well. Definitely a hardware thing.


Yes, the slowdowns. I've never done DAE that fast, haven't had problems with my current drive, and don't have first-hand experience with the Premium, so I wasn't 100% sure how to interpret those comments.
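
For my own sanity, I also worked out roughly what "52x" means for a CAV drive like this one. The radii below are approximate, so treat it as back-of-the-envelope math rather than a spec:

[code]# Back-of-the-envelope: what "52x" DAE means on a CAV drive.
# 1x audio = 176,400 bytes/s (44,100 samples/s * 2 channels * 2 bytes).
# On CAV, linear speed scales with radius, so inner tracks read far
# slower than the advertised outer-edge maximum. Radii are approximate.

BASE = 176_400                  # bytes per second at 1x (audio)
R_INNER, R_OUTER = 25.0, 58.0   # mm, rough program-area radii
MAX_X = 52

inner_x = MAX_X * R_INNER / R_OUTER
print(f"outer edge: {MAX_X}x = {MAX_X * BASE / 1e6:.1f} MB/s")
print(f"inner edge: ~{inner_x:.0f}x = {inner_x * BASE / 1e6:.1f} MB/s")[/code]

So even the start of the disc should rip at over 20x in theory, which is faster than anything I've tried.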

The whole KProbe vs. Plextools thing is something that has been discussed for ages.


Well, your regular readers and anybody who's compared those programs on their own know that, but the article seems to leave the question hanging. Thanks for the clarification.

As far as the "hardware being rated" goes, that's the whole point of the performance tests. We stick to a standard set of tests, using a standard set of software, so that we can get an idea of how the hardware performs. If we used a different software suite for each review (like the big magazines do), it would be impossible to compare performance across drives.


Which makes perfectly good sense. I was reacting to some of the comments seeming "unresolved", which really isn't the case. At the time, alternative software seemed like a good way to help resolve them.

Thanks for the heads-up. It was a good review. :)