The test is not invalid, but testing isn't so simple that it can be called "valid" either. What the drive is reporting is only THAT DRIVE'S perspective of the disc. One drive might read a disc fine, while another has a terrible time trying to read it. The easy way to put it would be: "every drive's scans are subjective".
Sadly, since K-Probe/LiteON drives are quite limited in their reporting abilities, you can't really compare two scans very well. For example, the first disc you burned might have had extremely high jitter, but since K-Probe/LiteON drives cannot detect jitter, you would never know. On the other hand, the second disc might have excellently low jitter but very high beta levels, and again, you wouldn't know...
To me, those two sets of test results looked mighty similar to each other. You cannot go by the absolute values of the numbers, but rather by the trends of the graph. You might see 5%-10% fluctuations in maximum levels between scans, and average error levels can be even more erratic, depending on which samples are taken from the disc being scanned.
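Just to illustrate the "trends, not absolute values" idea: here's a quick, completely hypothetical sketch in Python. The C1 error counts below are made-up numbers, and the helper function `trend_similarity` is my own invention, not part of any scanning tool. It simply computes the correlation between two samplings, so two scans whose absolute counts differ by ~10% can still show a near-identical shape across the disc.

```python
# Hypothetical illustration: compare two error scans by their trend
# (Pearson correlation) rather than by absolute counts.

def trend_similarity(scan_a, scan_b):
    """Pearson correlation of two equal-length error samplings."""
    n = len(scan_a)
    mean_a = sum(scan_a) / n
    mean_b = sum(scan_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(scan_a, scan_b))
    std_a = sum((a - mean_a) ** 2 for a in scan_a) ** 0.5
    std_b = sum((b - mean_b) ** 2 for b in scan_b) ** 0.5
    return cov / (std_a * std_b)

# Two made-up scans of the same disc: the absolute C1 counts differ
# by roughly 10%, but the shape across the disc is nearly identical.
scan1 = [2, 3, 5, 9, 14, 9, 4, 2]
scan2 = [2, 4, 6, 10, 15, 10, 5, 3]

print(round(trend_similarity(scan1, scan2), 3))  # → 0.997
```

So even though neither scan's raw numbers "match", the trend says they describe the same disc behavior, which is the kind of comparison that actually means something.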
I suppose the short answer to this would be: "Yes, you may be looking a bit too hard in this case."
Media testing is VERY confusing, and even those of us who spend 8+ hours daily running test after test after test on various drives with different parameters have a hard time understanding everything!! And there is always more to learn, and more mistakes to be made (I'm pretty good at that second one myself!).
Punch Cards -> Paper Tape -> Tape Drive -> 8" Floppy Diskette -> 5 1/4" Floppy Diskette -> 3 1/2" "Flippy" Diskette -> CD-R -> DVD±R -> BD-R
The Progression of Computer Media