Hi,
Why is it that, when it has already been built with that target on Jenkins many times, it can suddenly fail just because one of the sites has a hiccup?
Rebuilding from scratch is a way to ensure build portability and reproducibility in any environment. That's something that's usually desired.
For me, the target file has exact version numbers.
If I include e.g. the Git features, I include them with an exact version like "2020127-1300",
and that will not change.
I think it is the same for all the things we have in our target file (see the example below).
If I download that target once (with that content),
it will never change, and it is not supposed to change!
Because that would mean that between builds I suddenly have new stuff inside?
Why would that be a desired implementation?
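To make that concrete, here is a minimal sketch of the kind of .target file I mean; the repository URL, feature IDs and versions are placeholders, not our real ones:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- minimal sketch of a fully pinned target; IDs, versions and the URL are placeholders -->
<target name="our-product">
  <locations>
    <location includeMode="planner" type="InstallableUnit">
      <repository location="https://example.org/p2/our-site"/>
      <!-- every unit is pinned to an exact version, never 0.0.0 -->
      <unit id="org.eclipse.egit.feature.group" version="5.10.0.2020127-1300"/>
      <unit id="com.example.our.feature.feature.group" version="1.4.2.2020127-1300"/>
    </location>
  </locations>
</target>
```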
That would be horrible. I am in the process of making a final release: first a few RCs and then the final in a couple of weeks' time.
Then suddenly, without me knowing, Tycho downloads all kinds of different features, and between the latest RC and the final build I suddenly have newer/other plugins?
Why does it want to continuously check and download the stuff?
It checks the content of the remote repo for potential updates and so on, but as long as you don't clean your Maven repository, it will reuse the local artifacts that were already downloaded and not redownload them.
Right, but it bombs out when it can't, and it takes quite some time to check all our stuff all the time.
I would rather have an option right there:
<fixedTarget>true</fixedTarget>
and based on that, Tycho never checks again after the initial download for that target file (keyed on its contents, not its version) whether it needs to do anything.
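Something along these lines in the POM, where <fixedTarget> is purely hypothetical (it does not exist in Tycho today) and the target artifact coordinates are placeholders:

```xml
<!-- sketch only: <fixedTarget> is a proposed, non-existing option -->
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>target-platform-configuration</artifactId>
  <version>${tycho-version}</version>
  <configuration>
    <target>
      <artifact>
        <groupId>com.example</groupId>
        <artifactId>our-product-target</artifactId>
        <version>1.0.0-SNAPSHOT</version>
      </artifact>
    </target>
    <!-- proposed: resolve the .target once, then trust the cached resolution
         for as long as the .target file content is unchanged -->
    <fixedTarget>true</fixedTarget>
  </configuration>
</plugin>
```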
Why can't Tycho make a local site right from my target?
How you deal with your target and its caching seems to be more your specific build logic than something really standard. Tycho doesn't necessarily need or download the whole content of a .target file description; only the necessary dependencies (according to dependency resolution) will be fetched. I don't think such a capability to make a local p2 repo from a .target should be part of the default Tycho workflow. However, Tycho can for sure provide some utilities to improve that.
No,
I don't want this,
because I use build servers and so on;
then I would constantly have to generate this and upload it somewhere.
And that is still remote then. I want it purely local, without me having to know the filesystem of my Jenkins
(because on our Jenkins we continuously build 3 branches of our product, and all of them use a different target file).
So how would you do this?
How would you, if you don't want to do anything on that server (it's just in the cloud, we only have a Jenkins frontend), commit something for a specific branch that generates a local target specific to that branch, which is then used in that branch?
Without SSH: anybody should be able to just check a file in, and Jenkins should then handle it.
For example, generate a SHA hash from the contents of the target file
and cache the fully resolved site on local disk, based on that hash
(or it fills that directory with the things it needs as it encounters what is not there yet).
Tycho already has a cache in the Maven repo.
Right, so why does it bomb out when it can't find:
(which happens now and then for that)
So when a full build is done, everything is in that directory, keyed on the SHA hash of the contents of that target file;
if that file changes, it just generates a new one (I guess we need to be able to clean up old ones later).
But as long as the SHA hash doesn't change, Tycho doesn't try to contact any site; it always gets everything locally (a sketch of the idea is below).
This should speed up the builds, and we shouldn't suddenly have breaking builds because of a network problem with one of the sites inside the target.
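A minimal sketch of what I mean, assuming a made-up cache location under the local Maven repository; nothing here is existing Tycho behaviour:

```java
import java.nio.file.*;
import java.security.MessageDigest;
import java.util.HexFormat;

// Sketch of the proposed idea: key a local cache directory on the
// SHA-256 of the .target file contents. Paths and layout are made up
// for illustration; this is not what Tycho does today.
public class TargetCacheKey {
    public static Path cacheDirFor(Path targetFile, Path cacheRoot) throws Exception {
        byte[] content = Files.readAllBytes(targetFile);
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(content);
        String hash = HexFormat.of().formatHex(digest);
        // e.g. ~/.m2/repository/.cache/target-mirrors/<sha256-of-target>
        return cacheRoot.resolve(hash);
    }

    public static void main(String[] args) throws Exception {
        Path dir = cacheDirFor(Path.of("our-product.target"),
                               Path.of(System.getProperty("user.home"),
                                       ".m2", "repository", ".cache", "target-mirrors"));
        System.out.println("resolved site would be cached under: " + dir);
        // same .target contents -> same hash -> reuse, never touch the network;
        // changed contents -> new hash -> resolve and mirror again
    }
}
```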
There is nothing that guarantees the content of a remote p2 repository doesn't change; so there is nothing that guarantees that a .target file will resolve to the same artifacts in 2 distinct executions. Caching the results and assuming the remote repos are stable would break many use-cases of Tycho, where projects explicitly rely on mutable remote repositories to build against latest dependencies.
Right, but let me decide that then.
I know it is fully stable. I never want a target file that stays the same, with all kinds of hard versions in it (see above),
to suddenly start downloading different stuff.
If that happens, it should fail;
I want to know that,
because it means our product suddenly builds against a different set of JARs that I don't expect,
and I want to be fully in control.
I am the person who says: Tycho, you can download new stuff, I checked it and I updated the versions in the target file.
But I still don't see how this could happen, because a target file (at least ours) has hard-coded versions in all the includes; it can't just suddenly take another one.
This is kind of how we currently work with target files in our Eclipse environments.
In the Eclipse IDE, you're supposed to hit "reload" from time to time in case your .target has references to repos with mutable content. It's the same as what Tycho does (reload on every build).
Nope,
see my points above:
as long as I don't change our target file to have different versions of all the features or plugins,
the target file output is set in stone.
It will not change.
Where is an example that does suddenly change?
Maybe Tycho only needs to cache the content.jar/artifacts.jar of the sites, because the plugins/features are already cached in the Maven repo? So what we cache based on a SHA hash is very little.
IIRC, it does already cache metadata from remote repos, but we'd still need to query the HTTP HEAD of the repo metadata to know whether something actually changed since the last fetch, and invalidate the cache accordingly. Maybe Tycho doesn't optimize the usage of such a cache. It would be interesting to dig in that direction.
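For illustration only, that kind of check could look roughly like the sketch below; the URL is a placeholder and this is not how Tycho actually implements its cache:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.Optional;

// Sketch of the "has the repo metadata changed?" check being discussed:
// send an HTTP HEAD for the p2 metadata and compare the ETag against the
// value stored alongside the local cache. Placeholder URL, not Tycho code.
public class RepoMetadataCheck {
    public static boolean changedSince(String metadataUrl, String cachedEtag) throws Exception {
        HttpClient client = HttpClient.newBuilder()
                .connectTimeout(Duration.ofSeconds(10))
                .build();
        HttpRequest head = HttpRequest.newBuilder(URI.create(metadataUrl))
                .method("HEAD", HttpRequest.BodyPublishers.noBody())
                .build();
        HttpResponse<Void> response = client.send(head, HttpResponse.BodyHandlers.discarding());
        Optional<String> etag = response.headers().firstValue("ETag");
        // if the server gives no ETag we can't tell -> assume changed (be safe)
        return etag.map(e -> !e.equals(cachedEtag)).orElse(true);
    }

    public static void main(String[] args) throws Exception {
        boolean changed = changedSince("https://example.org/p2/our-site/content.jar", "\"abc123\"");
        System.out.println("metadata changed since last build: " + changed);
    }
}
```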
Right, that's kind of what I want; I don't mind if this is an option that I need to set, like I said above.
I know that my target file, as long as its content doesn't change, is set in stone; the download will always be exactly the same, and it does not need to check anything.
johan