

How accurate is the Liteon 167T DVD-ROM for Kprobe results?


Postby Gooberslot on Mon Jul 26, 2004 8:41 am

Because if it's accurate, I have a problem. I scanned some of the discs I made on my LG 4082B with one and it reported tens of thousands of PIF errors with both Maxell and Ritek media.

EDIT: corrected spelling mistake.
Last edited by Gooberslot on Mon Jul 26, 2004 12:35 pm, edited 1 time in total.
Gooberslot
Buffer Underrun
 
Posts: 48
Joined: Thu Jan 31, 2002 2:45 am

Postby Kennyshin on Mon Jul 26, 2004 9:27 am

Can the GSA-4082B read the discs back? Or another DVD-ROM drive?

I no longer have any Lite-On DVD-ROM drive, or any dedicated DVD-ROM drive for that matter. I only own some DVD writers and three combo drives for reading DVDs.
Kennyshin
CD-RW Player
 
Posts: 1173
Joined: Tue May 14, 2002 12:56 am

Postby Halc on Mon Jul 26, 2004 10:03 am

According to people at the CDFreaks forums, Lite-On DVD-ROM drives are nowhere near "reliable" for KProbe scans. Not even to the extent that Lite-On DVD burners are.

That is, the results should not be trusted (nor should compatibility be generalized based on such results).

Then again, even if you have a DVD burner to measure with, it doesn't mean the results are accurate for anything except the particular drive you did the scanning on.
Halc
CD-RW Player
 
Posts: 599
Joined: Mon Feb 04, 2002 9:13 am

Postby dolphinius_rex on Mon Jul 26, 2004 10:39 am

CDRinfo.com showed that in some cases, with a certain LiteON 167T, generally accurate results (compared to CATS scans) were possible... but LiteON drives all give different results, so no two LiteOn tests can be cross-compared, and DVD-ROMs are even MORE variable in this respect.

Actually, there are *NO* consumer drives that can offer accurate media testing, and no two drives' scans should ever be cross-compared unless it is the same disc being tested.

Using multiple drives to test the same disc is the best way to gain any kind of real opinion of the disc's quality and compatibility. Sadly, this is very time consuming!
Punch Cards -> Paper Tape -> Tape Drive -> 8" Floppy Diskette -> 5 1/4" Floppy Diskette -> 3 1/2" "Flippy" Diskette -> CD-R -> DVD±R -> BD-R

The Progression of Computer Media
dolphinius_rex
CD-RW Player
 
Posts: 6923
Joined: Fri Jan 31, 2003 6:14 pm
Location: Vancouver B.C. Canada

Postby Gooberslot on Mon Jul 26, 2004 12:25 pm

I read that thread on CDRInfo and decided to run a test using CDSpeed like they did.

Here are the results for the Ritek G04 disc.

[screenshot: CD Speed scan of the Ritek G04 disc]

Not too good at all. Much worse than what Kprobe2 reported.

Here's the Maxell disc.

[screenshot: CD Speed scan of the Maxell disc]

A lot better than the Ritek disc but still not that great.
Gooberslot
Buffer Underrun
 
Posts: 48
Joined: Thu Jan 31, 2002 2:45 am

Postby thegdog on Mon Jul 26, 2004 2:05 pm

Gooberslot wrote:I read that thread on CDRInfo and decided to run a test using CDSpeed like they did. Not too good at all. Much worse than what Kprobe2 reported.

You're running the test at Maximum read speed, though. That can also introduce more errors (or make the drive less able to correct the errors it finds).

You should run the test at no more than 4x. There is some software available that you can use to slow the drive down for testing. CDBremse, I think it's called. Run a search for it; it's been discussed before.
thegdog
CD-RW Player
 
Posts: 365
Joined: Thu Mar 28, 2002 1:38 am

Postby Gooberslot on Mon Jul 26, 2004 3:48 pm

CDRInfo ran the tests at maximum too.
Gooberslot
Buffer Underrun
 
Posts: 48
Joined: Thu Jan 31, 2002 2:45 am

Postby rdgrimes on Mon Jul 26, 2004 4:23 pm

There's no such thing as "accurate" or "inaccurate" error reporting. It just is what it is. We don't really know what the ROM drives are reporting as PI/PIF, but we know it's not the same thing as what the burners report. Scanning at max speed is just one of the issues with ROM drives.
Some people report they are unable to limit the read speed on the 167 with any of the known programs. But at any speed, you will still get CAV, which should not be compared to scans done on burners.

You will get more useful info from a ROM drive by running the CDSpeed transfer rate test and ScanDisc test.
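To put rough numbers on the CAV point: at constant angular velocity the effective read speed grows in proportion to the radius, so the outer edge of a disc is probed much faster than the area near the hub. This little sketch assumes a 12x maximum speed and the usual 24-58 mm DVD data-area radii purely for illustration, not as the spec of any particular drive:

```python
# Illustrative CAV read-speed profile: with constant angular velocity,
# the linear read speed scales linearly with the radius being read.
def cav_read_speed(radius_mm, max_speed_x=12.0, outer_mm=58.0):
    """Approximate read speed (in 'x' units) at a given radius during
    a CAV scan that reaches max_speed_x at the disc's outer edge."""
    return max_speed_x * radius_mm / outer_mm

inner = cav_read_speed(24.0)   # start of the data area: ~5x
outer = cav_read_speed(58.0)   # outer edge: 12x
```

So a "max speed" CAV scan on a ROM drive reads the start of the disc at roughly 5x but the end at 12x, which is one more reason such scans shouldn't be lined up against constant-speed scans from a burner.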
rdgrimes
CD-RW Player
 
Posts: 963
Joined: Fri Dec 20, 2002 10:27 pm
Location: New Mexico, USA

Postby Kennyshin on Mon Jul 26, 2004 5:26 pm

dolphinius_rex wrote:...

Using multiple drives to test the same disc, is the best way to gain any kind of a real opinion of the disc's quality and compatability. Sadly this is very time consuming!


Scanning once takes too much time already. :(
Kennyshin
CD-RW Player
 
Posts: 1173
Joined: Tue May 14, 2002 12:56 am

Postby dolphinius_rex on Mon Jul 26, 2004 7:06 pm

I am getting so sick of all the BS out there regarding DVD testing. Honestly! There is *NO* accurate way of testing DVDRs with consumer drives, and that's all there really is to it. You can have approximations of accuracy when you have drives cross-referenced with real testing equipment, like what CDRinfo did. If people want to be serious about testing, then it's REALLY simple: make arrangements with a DVD pressing plant that has access to real testing equipment, and stop going about it in a half-assed way. And this whole "use 4x for LiteON testing" is the worst crap of it all! People started testing at 4x back when it was believed that 1x was the only way to get accurate results... so basically they were already compromising accuracy and quality for convenience. It was later shown that neither 1x nor 4x gives any real kind of accuracy. Not a single consumer drive can spot all the problems in a disc: not a LiteON, not the Plextor 712A, and certainly not the Optorite, Nutech or BenQ!! The only reason these lame rumours have continued to spread is desperation from people deemed "professionals" who are too uninterested in, or incapable of, doing it right. The *LEAST* they could do is cross-reference the results with a few different drives, to see how multiple read heads handle various error types... but even that is too much for many sites.

All I can really say is, if *I*, a one-man show with limited resources and contacts, can gain access to a Datarius DVD analyzer (I'm in negotiations currently), then any "professional" review site should be able to do it too!

I'm sorry for this rant, but after looking into this matter for some time now, I have just become so sickened by the constant compromise of quality for convenience that is becoming so prevalent in our community!
Punch Cards -> Paper Tape -> Tape Drive -> 8" Floppy Diskette -> 5 1/4" Floppy Diskette -> 3 1/2" "Flippy" Diskette -> CD-R -> DVD±R -> BD-R

The Progression of Computer Media
dolphinius_rex
CD-RW Player
 
Posts: 6923
Joined: Fri Jan 31, 2003 6:14 pm
Location: Vancouver B.C. Canada

Postby rdgrimes on Mon Jul 26, 2004 7:39 pm

There is *NO* accurate way of testing DVDRs with consumer drives

That's just plain silly. It also demonstrates a lack of understanding of what is actually being measured and reported. The results of testing on reference equipment are interesting, and not without value, but offer little useful information to the end user of these drives.
As to scanning speed, it makes no difference what speed is used, as long as scans done at different speeds are not compared with each other.

The point of PI/PIF scanning is not to determine absolute "quality" of a burned disc, but simply to compare it to a "benchmark" of previous scans on the same drive (under the same conditions) to see if it's better or worse than another disc. It's really very simple, but it seems that some people want to complicate it.
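That "benchmark" approach can be sketched in a few lines: keep summary numbers from past scans on ONE drive, and judge each new scan only against that history. Everything here (the toy PIF counts, the median rule, the function names) is just an illustration of the idea, not any tool's actual method:

```python
# Sketch of a per-drive "benchmark": summarize past scans from one
# drive and judge a new scan only relative to that drive's history.
def scan_summary(pif_counts):
    """Reduce a list of per-sample PIF counts to summary stats."""
    return {"total": sum(pif_counts), "max": max(pif_counts)}

def better_than_benchmark(new_scan, history):
    """True if the new scan's PIF total beats the median total of
    past scans done on the same drive under the same conditions."""
    totals = sorted(scan_summary(s)["total"] for s in history)
    median = totals[len(totals) // 2]
    return scan_summary(new_scan)["total"] < median

history = [[1, 0, 3, 2], [0, 5, 1, 1], [2, 2, 2, 2]]  # past scans
new = [0, 1, 0, 1]  # totals 2, vs. history totals 6, 7, 8
```

The comparison is deliberately relative: the numbers only mean "better or worse than what this drive usually reports", never an absolute quality grade.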

How a particular LiteOn drive compares to some reference equipment is totally irrelevant for most purposes. It couldn't matter less. What matters is that end users have SOME way of comparing the relative readability of their burns, so they don't lose data or create discs that won't play. To that end, the tools at hand are totally adequate. To suggest that they not waste their time scanning is, well, just irresponsible.

How a particular media may measure up on reference equipment is good to know, but it is in no way a measure of how it will perform in the user's burner and readers.

So there's plenty of room for all these different testing procedures in their appropriate place. What we don't need is ill-informed declarations that only one way of testing is valuable.
rdgrimes
CD-RW Player
 
Posts: 963
Joined: Fri Dec 20, 2002 10:27 pm
Location: New Mexico, USA

Postby dolphinius_rex on Mon Jul 26, 2004 8:19 pm

rdgrimes wrote:The point of PI/PIF scanning is not to determine absolute "quality" of a burned disc, but simply to compare it to a "benchmark" of previous scans on the same drive (under the same conditions) to see if it's better or worse than another disc. It's really very simple, but it seems that some people want to complicate it.


You can't have a benchmark like you suggest, especially with LiteON drives. No two drives test media the same way, and heat build-up can also noticeably affect test results. If one person wants to do all the testing with one drive, then it will tell you if that ONE drive can read it, nothing more... but you might as well just do a transfer rate test, since PI/PO testing will only give you an illusion of quality testing which is not actually present. There are error types that go completely unnoticed by some drives, while other drives can't handle them at all. This complete lack of standard control among DVD-ROMs, DVD-RWs, and DVD players makes it impossible to judge how a disc will perform based on one scan alone.


rdgrimes wrote:How a particular LiteOn drive compares to some reference equipment is totally irrelevant for most purposes. It couldn't matter less. What matters is that end users have SOME way of comparing the relative readability of their burns, so they don't lose data or create discs that won't play. To that end, the tools at hand are totally adequate. To suggest that they not waste their time scanning is, well, just irresponsible.


What you are basically saying here is that using K-Probe and a LiteON DVDRW to do error testing is more useful for quality assessment than getting proper testing done on professional equipment. I think that is absurd! I think all that is really happening now is that people are playing to the crowd and telling them what they want to hear. If you tell people they can know the "relative readability" of their burned media by doing K-Probe scans at any speed they like (since "it makes no difference what speed is used, as long as scans are not compared that were done at different speeds"), then people who test at 1x will get one set of results, usually way below a reasonable assessment of the disc, and people who scan at 4x will get equally skewed results when certain error types are more prevalent than others.

Don't get me wrong, I'm not saying you can't test whether the disc will read on a certain drive or not... I'm just saying you can't know the disc's quality or compatibility from a single scan on a single drive.

For myself, I do multiple scans at multiple speeds on multiple drives, and then do playback tests on multiple DVD players. I do this to form an opinion of a disc's compatibility, since the absolute values of the scans are not a sure thing.
Punch Cards -> Paper Tape -> Tape Drive -> 8" Floppy Diskette -> 5 1/4" Floppy Diskette -> 3 1/2" "Flippy" Diskette -> CD-R -> DVD±R -> BD-R

The Progression of Computer Media
dolphinius_rex
CD-RW Player
 
Posts: 6923
Joined: Fri Jan 31, 2003 6:14 pm
Location: Vancouver B.C. Canada

Postby rdgrimes on Mon Jul 26, 2004 8:59 pm

I guess I need to define "benchmark". It's how MY drive behaves with media. Not yours or anyone else's. The notion of a "standard" benchmark is a dream. It's of no use to me to know how a given media performs on other drives, or even on calibrated test equipment. There's no such thing as a media that performs the same on all drives, or even on different drives of the same model. Case in point is MCC 8x +R: either the media is totally variable, or no two drives react the same way to it. Yet it may very well perform flawlessly on test equipment, but how does that help me if my burner hates it?

So each user must establish his own benchmark for what's good media in his burner and what is not. That's what "benchmark" means: you put a mark on your bench and measure everything else against it. You don't put your mark on someone else's bench.

I am 100% certain, based on scans I have done on about 500 DVDs, that PI/PIF scanning is a totally reliable way to judge how well a media is performing in my burners, and to predict whether it will play on my movie machines. I can demonstrate it and repeat it over and over again. And yes, I can do it with one scan.

It makes no difference to me whatsoever whether that same media that performs well here will also perform well in another burner. In fact, I wouldn't expect it to in every case. The notion that you can PREDICT how a media type will perform in my drive based on standardized tests is what is absurd, because you can't and never will. Burners are all different, even "identical" drives of the same model. Media is all different, even "identical" media.

There will never be any such thing as "absolute value", nor would a reasonable and informed person expect there to be. Nor would an informed person expect to compare scans done at different speeds or from different drives. That includes standardized test equipment too; the results attained on such equipment will in no way predict what consumer drives will do.
rdgrimes
CD-RW Player
 
Posts: 963
Joined: Fri Dec 20, 2002 10:27 pm
Location: New Mexico, USA

Postby dolphinius_rex on Mon Jul 26, 2004 10:33 pm

So, if I'm reading you correctly, you're saying that all the tests done only say how brand X disc burns on your specific drive, and that burning brand X disc on another person's drive, even if it is exactly the same make and model, will not give results comparable with yours. If this is true, then no one should post ANY K-Probe or other type of scan, because all it would do is confuse the newbies into thinking the results are reproducible on their drives. And hey, maybe that is actually true; maybe that is what happened here:
http://www.cdrlabs.com/phpBB/viewtopic.php?t=17146

And actually, that's really the problem. I'm not concerned with how you interpret your media tests, and if you screw up at it, I know you aren't going to come whining about it either. The people I *AM* concerned about are the newbies who look at you and other "professional" media "experts" as examples, and get the wrong impression.

Yes, Mitsubishi 8x DVDRs are an excellent example! Some drives burn them well, some drives don't... some drives can read them well, and some drives can't even read properly burned discs... add the fact that there are actually two versions of this disc, and you've got massive confusion written all over it! These are the worst possible discs to be doing K-Probe (or other testing software) scans on!

Again, I don't care if you can test your own media. It's everyone else... you know, the other 99.99999999% of the population, the ones who don't have a lot of personal experience in testing media, that I'm concerned about.

It makes no difference to me whatsoever whether that same media that performs well here will also perform well in another burner. In fact, I wouldn't expect it to in every case. The notion that you can PREDICT how a media type will perform in my drive based on standardized tests is what is absurd, because you can't and never will. Burners are all different, even "identical" drives of the same model. Media is all different, even "identical" media.


The above paragraph ALONE does an excellent job of explaining exactly why this has to end. If nothing is reproducible, then all these "tests" are doing is feeding ignorance and confusion. People put their trust in people like us to help them choose the right media for the right circumstances. I for one take this job very seriously, and that is why I am so ticked off with this overconfidence and misinformation campaign revolving around K-Probe.

There will never be any such thing as "absolute value", nor would a reasonable and informed person expect there to be. Nor would an informed person expect to compare scans done at different speeds or from different drives. That includes standardized test equipment too; the results attained on such equipment will in no way predict what consumer drives will do.


I never said that there could be an absolute value... in fact, I don't believe in one either. But I do believe that, to a point, you can predict how certain media types will burn in various burners. Otherwise there would be no use for burner reviews! And for the record, standardized testing equipment is for compatibility tests against ROM drives, not so much DVD players, which don't follow any real standards (or not too well, anyway).
Punch Cards -> Paper Tape -> Tape Drive -> 8" Floppy Diskette -> 5 1/4" Floppy Diskette -> 3 1/2" "Flippy" Diskette -> CD-R -> DVD±R -> BD-R

The Progression of Computer Media
dolphinius_rex
CD-RW Player
 
Posts: 6923
Joined: Fri Jan 31, 2003 6:14 pm
Location: Vancouver B.C. Canada

Postby rdgrimes on Tue Jul 27, 2004 12:38 am

The alternative is everyone blindly burning coasters and not knowing why.

I also have concerns about the number of people misinterpreting scans. But that's not the fault of the tools. If people want to learn about what they are doing, they will. It's always better to have as many tools available as possible. That's why we have forums: to educate.

And making blanket statements that the scans are worthless is not serving that purpose; it's contrary to it. There's a much better chance that people will be able to improve their burn quality if they use the tools than if they don't.
rdgrimes
CD-RW Player
 
Posts: 963
Joined: Fri Dec 20, 2002 10:27 pm
Location: New Mexico, USA

Postby code65536 on Tue Jul 27, 2004 1:08 am

dolphinius_rex wrote:I am getting so sick of all the BS out there regarding DVD testing. Honestly! There is *NO* accurate way of testing DVDRs with consumer drives, and that's all there really is to it. You can have approximations of accuracy when you have drives cross referenced with real testing equipment, like what CDRinfo did.


And if I may be so blunt, the way people hold CATS up like some infallible holy grail is sickening, too. :roll: :P

What does it matter what a CATS device tells you? The devices that people are going to use to read back these discs are not CATS machines. What CATS will tell you is very simple: it tells you how other overpriced CATS machines will read the disc. Great, so you can safely and accurately compare your CATS scan with someone else's CATS scan on the other side of the world, because all CATS scanners are nicely calibrated, etc. But what good does that do the end user? Besides the fact that such machines are prohibitively expensive. Besides the fact that doing these "proper" tests is extremely time-consuming. The alternative is to use a heuristic. One that works fairly well, I might add.

The point here is that, for practical considerations, there is a heuristic being used. It's not BS, because people are instructed to take it all with a grain of salt--comparing it with your own scans from the past, and then interpreting what the KProbes mean by cross-referencing the results you get with the kind of performance you experience when testing on other devices like standalones. And it just so happens that this heuristic is stable enough that, in a number of cases (though certainly not all), it is reasonable to compare scans from drive A to drive B. Heck, when scanning the same disc, my 832S and my 3S produce nearly identical results, within 10% of each other. Direct comparison is something that, on the surface, we say "no, this doesn't really mean much" about, but the heuristic is good enough that it allows people, with what limited time and resources they have, to try to do something meaningful with it.
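That kind of rough cross-drive consistency check can be written down explicitly. The 10% tolerance and the totals below are just illustrative numbers, not any published standard:

```python
# Crude consistency check between two scans of the SAME disc on two
# different drives: do the totals agree within a relative tolerance?
def scans_agree(total_a, total_b, rel_tol=0.10):
    """True if two scan totals agree within rel_tol of the larger
    total. A heuristic sanity check, not a spec compliance test."""
    if total_a == total_b:
        return True  # also covers the 0-vs-0 case
    return abs(total_a - total_b) / max(total_a, total_b) <= rel_tol

# e.g. one drive reporting 950 total PIF vs. another reporting 1000:
# 5% apart -> roughly consistent; 500 vs. 1000 -> not comparable.
```

Of course, as the rest of the thread argues, passing such a check on one pair of drives says nothing about any other pair.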

As rd said, the alternative is to do nothing, or to watch the people who can afford CATS do their hocus-pocus.


PS: The whole reason I'm on an anti-CATS crusade is that I've seen so many people cling to the precious c't results like they were the word of some deity because they were performed with... *gasp* CATS! I can't begin to count the number of people who took the results of a CATS scan from one individual variable drive unit, taken from one individual variable batch of drives, from one individual variable factory, using one disc from a spindle of discs that may very well have intra-spindle variation, coming from a batch that may very well vary from other batches, and from a media factory that may very well vary from other media factories, burned under very unique conditions (system setup, drive temperatures, phase of the moon, etc.), as something that was absolutely indicative of the performance of that drive in general with that media in general in all situations (yes, it's better than looking at Joe Sixpack's KProbe and making an overbroad conclusion). The point is, things vary, and that's why everyone should be doing some testing of their own; dismissing self-testing as worthless because it's not some vaunted CATS is counter-productive. So is relying on someone else's results just because that someone else used the high-and-mighty CATS.

My apologies if I sound harsh. But over the past few months, the obsession with this non-existent holy grail of testing has put me in a sour mood.
code65536
CD-RW Player
 
Posts: 371
Joined: Wed Apr 14, 2004 10:18 pm
Location: .us

Postby dolphinius_rex on Tue Jul 27, 2004 2:04 pm

code65536 wrote:And if I may be so blunt, the way people hold CATS up like some infallible holy grail is sickening, too. :roll: :P


Like I said, CATS is more for DVD-ROM compatibility than DVD player compatibility, and it has its pros and cons as well.

code65536 wrote:What does it matter what a CATS device tells you? The devices that people are going to use to read back these discs are not CATS machines. What CATS will tell you is very simple: it tells you how other overpriced CATS machines will read the disc. Great, so you can safely and accurately compare your CATS scan with someone else's CATS scan on the other side of the world, because all CATS scanners are nicely calibrated, etc. But what good does that do the end user? Besides the fact that such machines are prohibitively expensive. Besides the fact that doing these "proper" tests is extremely time-consuming. The alternative is to use a heuristic. One that works fairly well, I might add.


That isn't quite accurate... CATS devices scan a disc against the standards which are often touted by "professional" forums (280 PIE and 4 POE). When the error level limits were conceptualized, it was under the assumption that a calibrated drive was being used to test for them: calibrated to read the disc against various error types in a certain consistent way, which would be repeatable on any CATS device. Now, using consumer drives, we emulate this sort of test by scanning for PIE and PIF (not POE), yet we still stick to the standard of 280 and 4. This to me is really silly, since every drive tests differently, and no consumer drive actually tests for PIE and POE errors, only PIE and PIF errors... so WHY do we hold the scans to a system they can't even begin to match up against? Simple: people are desperately grasping at straws. Now, if you want to tell me that a disc is playing or readable, or not, that's fine... it's really simple to do that. But telling me that a scanned disc is "within spec" or "has a good error rate" is foolhardy. Again, we can tell how well the disc can be read, and how easily the drive can play the disc back, but knowing whether the error rate is within DVD Forum specifications is really beyond the capabilities of a consumer drive.
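For reference, the limits being argued about are usually applied as a windowed check: PIE summed over any 8 consecutive ECC blocks must stay at or below 280, and the per-block failure count at or below 4. Here is a sketch of that check; the sample counts are made up, and whether a consumer drive's numbers are even comparable to these limits is exactly the point under dispute:

```python
# Windowed check against the commonly quoted limits:
#   - PIE summed over 8 consecutive ECC blocks <= 280
#   - per-ECC-block failure count <= 4
def within_limits(pie_per_block, pif_per_block,
                  pie_limit=280, pie_window=8, pif_limit=4):
    """Check per-ECC-block error counts against the quoted limits."""
    # Slide an 8-block window over the PIE counts.
    for i in range(max(1, len(pie_per_block) - pie_window + 1)):
        if sum(pie_per_block[i:i + pie_window]) > pie_limit:
            return False
    # Every individual block must stay under the per-block limit.
    return all(p <= pif_limit for p in pif_per_block)

pie = [30] * 16        # every 8-block window sums to 240 <= 280
pif = [0, 1, 2, 0, 4]  # all blocks <= 4
```

The mechanics are trivial; the contested part is whether counts reported by an uncalibrated consumer drive mean anything when fed into it.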

code65536 wrote:The point here is that, for practical considerations, there is a heuristic being used. It's not BS, because people are instructed to take it all with a grain of salt--comparing it with your own scans from the past, and then interpreting what the KProbes mean by cross-referencing the results you get with the kind of performance you experience when testing on other devices like standalones. And it just so happens that this heuristic is stable enough that, in a number of cases (though certainly not all), it is reasonable to compare scans from drive A to drive B. Heck, when scanning the same disc, my 832S and my 3S produce nearly identical results, within 10% of each other. Direct comparison is something that, on the surface, we say "no, this doesn't really mean much" about, but the heuristic is good enough that it allows people, with what limited time and resources they have, to try to do something meaningful with it.


Yes, in some cases people are reminded to take the results with a grain of salt... but do they? Realistically, no, I don't believe they do. I think many of us even forget to do this too! I'm not saying that we can't know ANYTHING about the burned quality of the disc, I'm just saying that people are really making too much out of K-Probe results (and others). As for drive test results... even if you get a difference of only 10% between your 832S and 3S model drives, I think it is dangerous to assume that the same would be true for other people. I know that the % difference is much higher for Plextor/LiteON comparisons... but which drive is "right"?? Well, neither of them, really.

code65536 wrote:As rd said, the alternative is to do nothing or to watch the people who could afford CATS do their hocus-pocus.


I disagree; the alternative is to be realistic about the capabilities and deficiencies of LiteON and other testing drives. There *ARE* things we can know about the burned disc... but I think people make too much out of the results. What I'd like to see is people taking the time to scan discs on two different drives and including a transfer rate test. Personally, I'm using the PX-712A, which I test media at 2x and 12x on, and the SOHW-812S@832S, which I test media at 8x on. I usually do a transfer rate test on both drives too. It takes a long time, I won't argue that... but it gives a MUCH better look at the media itself, and at what kinds of errors affect the disc. Comparing 2x to 12x scans on the PX-712A is VERY enlightening at times, since some error types are greatly magnified at higher read speeds and others are not... cross-reference this with the 832S's results at 8x, and now you know how those errors are handled by another drive in another situation (usually a less picky drive, too). If I were doing a drive review, I would make sure to include a transfer rate test done on the drive being reviewed, since that would be the most important result to a potential buyer of that drive.

code65536 wrote:PS: The whole reason I'm on an anti-CATS crusade is that I've seen so many people cling to the precious c't results like they were the word of some deity because they were performed with... *gasp* CATS! I can't begin to count the number of people who took the results of a CATS scan from one individual variable drive unit, taken from one individual variable batch of drives, from one individual variable factory, using one disc from a spindle of discs that may very well have intra-spindle variation, coming from a batch that may very well vary from other batches, and from a media factory that may very well vary from other media factories, burned under very unique conditions (system setup, drive temperatures, phase of the moon, etc.), as something that was absolutely indicative of the performance of that drive in general with that media in general in all situations (yes, it's better than looking at Joe Sixpack's KProbe and making an overbroad conclusion). The point is, things vary, and that's why everyone should be doing some testing of their own; dismissing self-testing as worthless because it's not some vaunted CATS is counter-productive. So is relying on someone else's results just because that someone else used the high-and-mighty CATS.


You aren't alone on that! I'm pretty ticked off about the lack of information in those c't articles... although perhaps some of those details would be available if I went back to the original German... but since I don't have those articles yet, I can't do that. The fact is, most drive reviews are completely useless where burn quality issues are concerned after about 1-2 months, if not sooner, since firmware updates change how the drive performs to such a large degree! So by the time the review is out, there is usually newer firmware available :-? This isn't the reviewer's fault of course, just a problem inherent in the system as it is right now.

code65536 wrote:My apologies if I sound harsh. But over the past few months, the obsession with this non-existant holy grail of testing has put me in a sour mood.


No worries, I'm not angry with you or anything :wink: I'm just pretty pissed off myself over the use and abuse of K-Probe (and other) tests lately, and the apparent choice of convenience over professionalism in the industry.
Punch Cards -> Paper Tape -> Tape Drive -> 8" Floppy Diskette -> 5 1/4" Floppy Diskette -> 3 1/2" "Flippy" Diskette -> CD-R -> DVD±R -> BD-R

The Progression of Computer Media
dolphinius_rex
CD-RW Player
 
Posts: 6923
Joined: Fri Jan 31, 2003 6:14 pm
Location: Vancouver B.C. Canada

