RE: [cdt-dev] junit tests
As another troll lurking under the bridge, I just wanted to clarify one thing:
> Frankly, these tests seem like a quick copy-paste job. They don't test
> that the resulting AST is actually correct. It's possible that some bugs
> would have been detected sooner if these tests had been stronger.
Practically speaking, these tests _were_ a copy/paste job. They were put
together when the parser was still not particularly close to being finished,
and their primary aim was making sure the parser could at least manage not to
go into infinite loops or throw exceptions.
For testing the actual content of the AST we wrote more detailed tests, where
you would expect broken things to show up if the AST was wrong.
As an example, compare AST2CPPSpecTest.test14_7_3s5
and AST2TemplateTests.test14_7_3s5_SpecializationMemberDefinition. There
are likely many such pairs, though most won't use the same naming style.
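To make the contrast concrete, here is a rough sketch of the two styles of
test. The parse/findName/assertNoProblemNodes helpers are hypothetical
stand-ins for whatever harness the real tests use; this is an illustration,
not code from the CDT tree:

import junit.framework.TestCase;
import org.eclipse.cdt.core.dom.ast.IASTName;
import org.eclipse.cdt.core.dom.ast.IASTTranslationUnit;
import org.eclipse.cdt.core.dom.ast.IBinding;
import org.eclipse.cdt.core.dom.ast.cpp.ICPPMethod;

public abstract class AstContentTestSketch extends TestCase {

    // Hypothetical helpers -- a real harness would supply these.
    protected abstract IASTTranslationUnit parse(String code);
    protected abstract IASTName findName(IASTTranslationUnit tu, String name, int occurrence);
    protected abstract void assertNoProblemNodes(IASTTranslationUnit tu);

    // "Copy/paste" style: passes as long as nothing throws and no problem
    // nodes show up; it never looks at what the AST actually contains.
    public void testJustParses() {
        IASTTranslationUnit tu =
                parse("template<class T> struct A { void f(); };");
        assertNoProblemNodes(tu);
    }

    // Content-checking style: resolve a name and assert that the binding
    // is the kind of thing the language rules require.
    public void testMemberCallBindsToMethod() {
        IASTTranslationUnit tu = parse(
                "template<class T> struct A { void f(); };\n"
              + "void g() { A<int> a; a.f(); }");
        IBinding b = findName(tu, "f", 2).resolveBinding();
        assertTrue(b instanceof ICPPMethod);
    }
}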
-Andrew
Mike Kucera/Toronto/IBM@IBMCA
Sent by: cdt-dev-bounces@xxxxxxxxxxx
04/24/2008 05:46 PM
Please respond to "CDT General developers list." <cdt-dev@xxxxxxxxxxx>
To: "CDT General developers list." <cdt-dev@xxxxxxxxxxx>
cc: "CDT General developers list." <cdt-dev@xxxxxxxxxxx>, cdt-dev-bounces@xxxxxxxxxxx
Subject: RE: [cdt-dev] junit tests
How would anyone reliably find problems like this? If a test passed but the
AST was actually slightly wrong, there would be nothing to alert anyone to
the problem. I guess in practice it hasn't been an issue though.
But I've actually noticed it a couple of times. An example is AST2CPPSpecTest.test10_4s2b(),
which I have since commented out. In the spec this is actually an example
of ill-formed code: a pure virtual function that has a body, which isn't
allowed. But the test passes against the DOM parser; it parses as an
initializer expression. It's not a serious problem by any means, but
it does show that the spec tests aren't completely thorough.
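For anyone who hasn't looked at 10.4/2, the ill-formed code in question is
along these lines (reconstructed from the standard's example, not copied
from the test source):

public class Spec10_4s2Example {
    // Reconstruction of the standard's 10.4/2 example: a pure-specifier
    // combined with a function body is ill-formed, so a strict front end
    // should reject it instead of quietly accepting it.
    static final String ILL_FORMED =
            "struct C {\n"
          + "    virtual void f() = 0 { };  // ill-formed: pure virtual with a body\n"
          + "};\n";
}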
Frankly, these tests seem like a quick copy-paste job. They don't test
that the resulting AST is actually correct. It's possible that some bugs
would have been detected sooner if these tests had been stronger.
Mike Kucera
Software Developer
IBM CDT Team, Toronto
mkucera@xxxxxxxxxx
John Camelon/Ottawa/IBM@IBMCA
Sent by: cdt-dev-bounces@xxxxxxxxxxx
04/24/2008 04:16 PM
Please respond to "CDT General developers list." <cdt-dev@xxxxxxxxxxx>
I am curious how many regressions you found where the syntax and semantics
of the test worked out without issue but the AST was incorrect.
--
JohnC
Mike Kucera/Toronto/IBM@IBMCA
Sent by: cdt-dev-bounces@xxxxxxxxxxx
24/04/2008 02:52 PM
Please respond to "CDT General developers list." <cdt-dev@xxxxxxxxxxx>
Many of the older parser tests also have the unfortunate property of not
actually testing whether things worked correctly. There are a bunch of tests
that parse a string and then just check for syntax errors and problem bindings
in the resulting AST. So the parser could return complete nonsense, but
as long as there are no problem nodes the test will pass. In fact, you could
write a parser that trivially passes all of these tests just by having it
always return an empty AST.
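To spell out why: the whole check boils down to something like the following
(an approximation of the pattern using CDT's ASTVisitor; the helper itself is
illustrative, not the exact code in those tests), and an empty translation
unit satisfies it trivially:

import org.eclipse.cdt.core.dom.ast.ASTVisitor;
import org.eclipse.cdt.core.dom.ast.IASTProblem;
import org.eclipse.cdt.core.dom.ast.IASTTranslationUnit;

public class WeakCheckSketch {
    // Walk the AST and fail only if a problem node turns up.
    // An AST with no nodes at all "passes" this check.
    static boolean hasNoProblems(IASTTranslationUnit tu) {
        final int[] count = { 0 };
        tu.accept(new ASTVisitor() {
            { shouldVisitProblems = true; }
            @Override
            public int visit(IASTProblem problem) {
                count[0]++;
                return PROCESS_CONTINUE;
            }
        });
        return count[0] == 0;
    }
}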
Mike Kucera
Software Developer
IBM CDT Team, Toronto
mkucera@xxxxxxxxxx
Chris Recoskie/Toronto/IBM@IBMCA
Sent by: cdt-dev-bounces@xxxxxxxxxxx
04/24/2008 12:02 PM
Please respond to "CDT General developers list." <cdt-dev@xxxxxxxxxxx>
Related to this, and also as a part of the greater discussion of "what
to do about the build system"...
The managed build core tests are extremely brittle and a nightmare to
maintain. Essentially what they do is create a project, build it, and
compare the resulting generated makefiles to a set of benchmark makefiles
contained within the test.
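Roughly, each of those tests boils down to something like this (a sketch with
made-up project and helper names, not the real MBS test code):

import junit.framework.TestCase;
import org.eclipse.core.resources.IFile;
import org.eclipse.core.resources.IProject;

public abstract class MakefileBenchmarkSketch extends TestCase {

    // Hypothetical helpers -- stand-ins for the real test utilities.
    protected abstract IProject createAndBuildManagedProject(String zipName);
    protected abstract String readFile(IFile file);
    protected abstract String readBenchmark(String path);

    public void testGeneratedMakefileMatchesBenchmark() throws Exception {
        IProject project = createAndBuildManagedProject("example_30");
        String generated = readFile(project.getFile("Debug/makefile"));
        String benchmark = readBenchmark("benchmarks/example_30/makefile");
        // Any change to makefile generation, however harmless, breaks this
        // assertion -- and a makefile that matches but doesn't build still passes.
        assertEquals(benchmark, generated);
    }
}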
This doesn't work well for the following reasons:
1. As soon as you change *anything* that causes the generated makefiles
to be different, no matter how slight, you have to go back and manually
update thirty-some-odd tests by extracting the sample projects, building
them, and putting the updated benchmark files into the appropriate zip
file. This is a very time-consuming process, and in the past it has acted
as a barrier to making changes to MBS, because either you spend days updating
the test cases or the tests stay broken for an extended period of time.
2. The tests don't actually check whether the build does what it's supposed
to do; they only care whether the makefiles match. You might actually end up
with a build that goes haywire and doesn't build anything, but as long
as the makefile comes out as expected, the test passes.
3. The test projects are geared to particular toolchains. If we ever want
to start checking whether the build actually builds anything, we will
have to make the tests flexible enough to build with the right toolchain
for the platform on which the test is running.
4. There are no tests for the internal builder, which is the default now.
What is more, you can't test the internal builder via the above method
because it doesn't generate any makefiles to benchmark against. This means
our primary use case for build isn't being exercised at all.
On top of all this, the tests behave inconsistently due to the concurrency
issues in the build system. Whether any particular test passes is pretty
much random, and I find I have to rerun the tests at least once (if not
several times) to either get a clean run or at least see different sets of
failures in each run, which suggests that in general the tests pass.
I am curious as to the opinions of the other committers, but personally
I think that because of all the flaws above, these tests are not very useful
anymore. Maybe we should stop running them as part of the build.
===========================
Chris Recoskie
Team Lead, IBM CDT Team
IBM Toronto
http://www.eclipse.org/cdt
"Schaefer,
Doug" <Doug.Schaefer@xxxxxxxxxxxxx>
"Schaefer, Doug" <Doug.Schaefer@xxxxxxxxxxxxx>
Sent by: cdt-dev-bounces@xxxxxxxxxxx
04/24/2008 11:28 AM
Please respond to "CDT General developers list." <cdt-dev@xxxxxxxxxxx>
The only JUnits we run are the ones for the nightly builds. Everything
starts at the test.xml file in the org.eclipse.cdt.testing plug-in,
which lists the test suites that get run. Adding JUnits is as simple as
adding tests to those suites or adding new suites.
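For anyone who hasn't done it before, a new suite is just a JUnit 3 style
suite() method; the class names below are examples, not actual CDT suites:

import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;

public class MyNewTestSuite {
    // Example TestCase; a real suite would add the actual test classes.
    public static class MyNewParserTest extends TestCase {
        public void testSomething() {
            assertTrue(true);
        }
    }

    public static Test suite() {
        TestSuite suite = new TestSuite(MyNewTestSuite.class.getName());
        suite.addTestSuite(MyNewParserTest.class);
        return suite;
    }
}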
Never thought of doing a latest link; might be a good idea. But
generally, I'm always looking to see when the build was done, so I
wouldn't use a latest link.
Feel free to update the wiki with what you find. The people who set up
the test suite are all long gone (originally done by the testers at QNX
years ago).
BTW, we have a lot of test failures, especially in managedbuild. We
should be looking at why those are failing.
Cheers,
Doug.
-----Original Message-----
From: cdt-dev-bounces@xxxxxxxxxxx [mailto:cdt-dev-bounces@xxxxxxxxxxx]
On Behalf Of Elena Laskavaia
Sent: Thursday, April 24, 2008 10:00 AM
To: CDT General developers list.
Subject: [cdt-dev] junit tests
I was searching the wiki for any documentation regarding our JUnits, but I
could not find anything...
I am looking for a description of the test environment, the schedule of
automated tests, and the latest test results, along with documentation on
creating new tests, adding test packages to the build, etc.
Do we have anything like this?
I can sort of find some results by following the CDT nightly builds, but there
is no link to "latest" so I cannot even bookmark it.
_______________________________________________
cdt-dev mailing list
cdt-dev@xxxxxxxxxxx
https://dev.eclipse.org/mailman/listinfo/cdt-dev