For some time now the tests in llvm-test/SingleSource/Regression and
llvm-test/SingleSource/UnitTests have been producing no meaningful
performance data other than "working". I think one of two things should
be done, on a case-by-case basis:

1. Tests that are clearly simple, quick-running tests should be moved
into the llvm/test/Regression directory and made part of the DejaGNU
suite, since they are supposed to test functionality and their
performance isn't really significant.

2. Tests that are a little more complex and *could* produce meaningful
performance data should have their "main" functions put in a loop so
that these programs run sufficiently long to produce meaningful
performance numbers. Currently their runtimes are so small that we just
get "-" in the nightly report.

Is there a reason we haven't done this yet?

Reid
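P.S. To make #2 concrete, here is a minimal sketch of the kind of change
I have in mind, written against a made-up test: test_body() and
ITERATIONS are illustrative stand-ins, not names from llvm-test. The
idea is just to repeat the work the original main() did enough times to
get a measurable runtime, while keeping the program's output
deterministic so the nightly tester can still diff it against the
expected output:

#include <stdio.h>

#define ITERATIONS 100000  /* tune so the total runtime is a few seconds */

/* Hypothetical stand-in for whatever the original main() computed. */
static unsigned test_body(unsigned seed)
{
    unsigned acc = seed, i;
    for (i = 0; i < 1000; ++i)
        acc = acc * 31u + i;
    return acc;
}

int main(void)
{
    volatile unsigned sink = 0;  /* keep the compiler from deleting the loop */
    unsigned i;
    for (i = 0; i < ITERATIONS; ++i)
        sink += test_body(i);
    /* Print one deterministic value so the reference output still matches. */
    printf("checksum: %u\n", (unsigned)sink);
    return 0;
}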
Okay, so how about #2? Shouldn't the SingleSource/Regression tests be
modified to increase their execution time (by, say, executing the whole
test multiple times)? This would put them on the radar from a
performance perspective.

Reid.

On Thu, 2005-06-16 at 10:17 -0500, Chris Lattner wrote:
> On Thu, 16 Jun 2005, Reid Spencer wrote:
> > For some time now the tests in llvm-test/SingleSource/Regression and
> > llvm-test/SingleSource/UnitTests have been producing no meaningful
> > performance data other than "working". I think one of two things
> > should be done, on a case-by-case basis:
> >
> > 1. Tests that are clearly simple, quick-running tests should be moved
> > into the llvm/test/Regression directory and made part of the DejaGNU
> > suite, since they are supposed to test functionality and their
> > performance isn't really significant.
> >
> > 2. Tests that are a little more complex and *could* produce
> > meaningful performance data should have their "main" functions put in
> > a loop so that these programs run sufficiently long to produce
> > meaningful performance numbers. Currently their runtimes are so small
> > that we just get "-" in the nightly report.
> >
> > Is there a reason we haven't done this yet?
>
> Because llvm-test is a coverage testsuite that we happen to get perf
> data from, not a performance suite. There is no way to run these tests
> in the llvm/test/Regression directory. All of these tests require
> running to show that they work.
>
> OTOH, I certainly wouldn't be opposed to merging
> SingleSource/Regression with SingleSource/UnitTests. There is no useful
> distinction there.
>
> -Chris
On Thu, 16 Jun 2005, Reid Spencer wrote:
> For some time now the tests in llvm-test/SingleSource/Regression and
> llvm-test/SingleSource/UnitTests have been producing no meaningful
> performance data other than "working". I think one of two things should
> be done, on a case-by-case basis:
>
> 1. Tests that are clearly simple, quick-running tests should be moved
> into the llvm/test/Regression directory and made part of the DejaGNU
> suite, since they are supposed to test functionality and their
> performance isn't really significant.
>
> 2. Tests that are a little more complex and *could* produce meaningful
> performance data should have their "main" functions put in a loop so
> that these programs run sufficiently long to produce meaningful
> performance numbers. Currently their runtimes are so small that we just
> get "-" in the nightly report.
>
> Is there a reason we haven't done this yet?

Because llvm-test is a coverage testsuite that we happen to get perf data
from, not a performance suite. There is no way to run these tests in the
llvm/test/Regression directory. All of these tests require running to
show that they work.

OTOH, I certainly wouldn't be opposed to merging SingleSource/Regression
with SingleSource/UnitTests. There is no useful distinction there.

-Chris

--
http://nondot.org/sabre/ http://llvm.cs.uiuc.edu/