
Opened 2 years ago

Last modified 2 years ago

#18312 new enhancement

Update site on Gitlab.com for external (git-based) plugins

Reported by: floscher Owned by: team
Priority: normal Milestone:
Component: Plugin Version:
Keywords: git Cc: don-vip, stoecker, Gubaer, qeef, iandees, Polyglot, simon04

Description

I just wanted to show you what I recently built for git-hosted JOSM plugins: https://josm.gitlab.io (a list of the latest plugin releases) and https://josm.gitlab.io/plugin-update-site (an update site that can be added in the JOSM preferences).

These are essentially doing the same thing as https://josm.openstreetmap.de/wiki/Plugins and https://josm.openstreetmap.de/pluginicons , but for JOSM plugins that are developed in a Git repository instead of SVN.

So why create this, if there are existing pages that do essentially the same thing (list JOSM plugins in a human-readable and also in a machine-readable form)?
For example:

  • No need to manually update https://josm.openstreetmap.de/wiki/PluginsSource for each release. Each time a git tag is pushed to a plugin in gitlab.com/JOSM/plugin , a release is published and the pages update automatically to show the latest version.
  • I'm also planning to automatically generate the download links for older versions in the MANIFEST.MF. That's currently done manually, which isn't reliable, because updates are often forgotten and it's easy to enter wrong numbers.
  • The build is entirely Gradle/Kotlin based and openly available at https://gitlab.com/JOSM/josm.gitlab.io for review and improvement. That means it can interoperate with other Java/Groovy/Kotlin code, e.g. it reuses code that I wrote for the gradle-josm-plugin to generate the update site for JOSM, or you could write unit tests for the build script with JUnit, ….
  • The CI build of each plugin is separate, so it's not a major problem if individual plugins have a failing build. For publishing a new release, the plugin build would need to succeed again though.
  • This setup should be fairly easy to migrate to a self-hosted instance of Gitlab if/when needed.
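
The second bullet can be illustrated with a small sketch. As far as I understand the JOSM manifest convention, older releases are advertised via `<josm-version>_Plugin-Url: <plugin-version>;<url>` main attributes; the class name, version numbers and URLs below are made up for illustration:

```java
// Sketch: generating versioned download links as manifest attributes.
// ASSUMPTION: the "<josm-version>_Plugin-Url: <plugin-version>;<url>" format
// follows the JOSM plugin manifest convention; versions/URLs here are invented.
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ManifestLinks {

    /** One released plugin version plus the minimum JOSM version it requires. */
    record Release(int josmVersion, String pluginVersion, String downloadUrl) {}

    /** Builds the "<josm-version>_Plugin-Url" attributes for a release history. */
    static Map<String, String> versionedLinks(List<Release> history) {
        Map<String, String> attrs = new LinkedHashMap<>();
        for (Release r : history) {
            attrs.put(r.josmVersion() + "_Plugin-Url",
                      r.pluginVersion() + ";" + r.downloadUrl());
        }
        return attrs;
    }

    public static void main(String[] args) {
        versionedLinks(List.of(
            new Release(14140, "v1.2.0", "https://example.org/dist/v1.2.0/plugin.jar"),
            new Release(13265, "v1.1.0", "https://example.org/dist/v1.1.0/plugin.jar")
        )).forEach((k, v) -> System.out.println(k + ": " + v));
    }
}
```

Generating these attributes from the pushed git tags would remove exactly the manual bookkeeping step described above.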

Currently it's a work in progress, so only three of the 11 plugins available on https://gitlab.com/JOSM/plugin show up in the list without errors. But as soon as the others are also set up correctly, they will appear normally in the list as well.

For the plugins that don't have their main development repository in https://gitlab.com/JOSM/plugin (which is nearly all of them) I set up mirroring, so the repo on gitlab.com is updated about every 10 minutes. This way the plugin could be developed in some git repository on Github or elsewhere, while there will still always be an up-to-date copy of it in https://gitlab.com/JOSM/plugin


TL;DR

I think this could be used as a replacement for https://josm.openstreetmap.de/wiki/PluginsSource : in the end https://josm.openstreetmap.de/pluginicons would serve the SVN-based plugins, while https://josm.gitlab.io/plugin-update-site would serve the git-based plugins.

What do you think about this? Feel free to share your feedback here, so I can further improve this.

Attachments (0)

Change History (13)

comment:1 Changed 2 years ago by floscher

See e.g. https://gitlab.com/JOSM/josm.gitlab.io/pipelines/95014548 for a CI pipeline that was triggered by new releases of the pt_assistant that were pushed to https://gitlab.com/JOSM/plugin/pt_assistant .
This pipeline regenerated the update site.
When you click on the leftmost item in the pipeline, the pipeline that published this release of pt_assistant opens; that pipeline then triggered this update.

comment:2 Changed 2 years ago by Don-vip

This is great work! Maintaining JOSM plugins on a single SVN repo is already a nightmare, it would be even worse after the Git migration if we don't do such tooling.

Does the page need to be hosted on GitLab? It's ok right now for you to build it, but in the long term (once we move to Git...) I would like to maintain this tooling in a repo of our own, and display the page on the JOSM website directly.

comment:3 Changed 2 years ago by Don-vip

Keywords: git added

comment:4 Changed 2 years ago by qeef

It's nice, but I am lost in it. For me (not a Java developer) it's too complicated (I think). I didn't find out what it is and how to contribute.

Please, don't get me wrong. I do not understand the current JOSM architecture, therefore some things look strange to me. But I am interested.

I don't think that centralizing and rebuilding plugins is the right way. Centralizing plugins information is a must, however.



No need to manually update ...

I don't do that either.

comment:5 in reply to:  2 Changed 2 years ago by floscher

Replying to Don-vip:

This is great work! Maintaining JOSM plugins on a single SVN repo is already a nightmare, it would be even worse after the Git migration if we don't do such tooling.

Does the page need to be hosted on GitLab? It's ok right now for you to build it, but in the long term (once we move to Git...) I would like to maintain this tooling in a repo of our own, and display the page on the JOSM website directly.

The tool that creates the pages is completely independent of being hosted on GitLab. I can run it on my local machine or any server with a JDK, and it will generate the pages. In the end it just generates an HTML file and the plugin list file for JOSM.

The parts where this tool relies on GitLab are:

  • it queries the GitLab API for the names of the subprojects of the https://gitlab.com/JOSM/plugin group
  • it is part of a cross-project CI pipeline, so the CI pipeline runs in the plugin repository and upon successful completion it continues to rebuild the plugin list.

So the tool itself could be hosted anywhere, but it currently relies on the plugin repos being mirrored to one group on GitLab, and we would have to find another way a rebuild of the list could be triggered.
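
A minimal sketch of that GitLab lookup: the group path is URL-encoded into the `/groups/:id/projects` REST endpoint, and the project names are pulled out of the JSON response. The regex-based parsing and the JSON sample in `main()` are simplifications for illustration; a real run would fetch and page through the live API.

```java
// Sketch: listing the subprojects of the gitlab.com/JOSM/plugin group.
// The JSON handling is a deliberate simplification (a real tool would use a
// JSON library and handle API pagination).
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GitlabGroup {

    /** Builds the API URL listing all projects of a (sub)group, e.g. "JOSM/plugin". */
    static String projectsUrl(String groupPath) {
        return "https://gitlab.com/api/v4/groups/"
                + URLEncoder.encode(groupPath, StandardCharsets.UTF_8)
                + "/projects";
    }

    /** Crude extraction of the "path" fields from the JSON array the API returns. */
    static List<String> projectPaths(String json) {
        List<String> names = new ArrayList<>();
        Matcher m = Pattern.compile("\"path\"\\s*:\\s*\"([^\"]+)\"").matcher(json);
        while (m.find()) {
            names.add(m.group(1));
        }
        return names;
    }

    public static void main(String[] args) {
        System.out.println(projectsUrl("JOSM/plugin"));
        // Illustrative response fragment, not real API output:
        System.out.println(projectPaths("[{\"path\":\"pt_assistant\"},{\"path\":\"wikipedia\"}]"));
    }
}
```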

comment:6 Changed 2 years ago by stoecker

There is already a Perl script doing that work of gathering the information for PluginSource and SVN (together with lots of other things). If we do extend this to Git, I'd use that script instead of something new. So your main focus should be to find a way to access the relevant data with as few remote accesses as possible and to describe this process (i.e. what's necessary to get the current download URL and older versions for each plugin release). It could then be reimplemented in the existing Perl script easily. From Perl we can do direct HTTP/HTTPS accesses as well as call tools like git or download utilities - what's important is that we try to minimize remote calls.

Currently we're downloading each external plugin every 10 minutes with an "only send data if changed" header.
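
The polling described here amounts to a conditional HTTP GET (e.g. with an If-Modified-Since or If-None-Match header): on a 304 response the cached jar is kept, on 200 the fresh body replaces it. The helper name and the reduced status handling below are my simplifications, not the actual Perl script's logic:

```java
// Sketch: the cache decision behind a conditional "only send if changed" poll.
// Status handling is reduced to the two interesting cases for illustration.
public class ConditionalFetch {
    static final int OK = 200;
    static final int NOT_MODIFIED = 304;

    /** Returns the bytes to keep after a poll: the cache on 304, the fresh body on 200. */
    static byte[] afterPoll(int statusCode, byte[] cached, byte[] body) {
        if (statusCode == NOT_MODIFIED) {
            return cached;  // server says nothing changed, reuse the cached jar
        }
        if (statusCode == OK) {
            return body;    // plugin was updated, replace the cached jar
        }
        throw new IllegalStateException("unexpected status " + statusCode);
    }

    public static void main(String[] args) {
        byte[] cached = {1};
        byte[] fresh = {2};
        System.out.println(afterPoll(NOT_MODIFIED, cached, fresh)[0]); // prints 1
        System.out.println(afterPoll(OK, cached, fresh)[0]);           // prints 2
    }
}
```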

As the GitLab approach is not yet final, this probably means we need to access the existing JOSM group on GitHub.

This should only be done for plugins under JOSM control, i.e. not for each and every one, but only for those in the GitHub JOSM group or whatever other infrastructure is decided official.

comment:7 in reply to:  4 Changed 2 years ago by floscher

Replying to qeef:

It's nice, but I am lost in it. For me (not a Java developer) it's too complicated (I think). I didn't find out what it is and how to contribute.

Please, don't get me wrong. I do not understand the current JOSM architecture, therefore some things look strange to me. But I am interested.

It really lacks documentation and an explanation of what is going on; thanks for the feedback, that should definitely get better over time.

I don't think that centralizing and rebuilding plugins is the right way. Centralizing plugins information is a must, however.

Just to be clear, I wouldn't want to force all plugins to have their main repositories in the https://gitlab.com/JOSM/plugin group. These repositories are just meant as mirrors/copies of the real repositories. When people try to create an issue in the mirror repository, they'll even get redirected to the main issue tracker of that plugin, be it on GitHub, on JOSM's trac or in another GitLab repository.

However, the main reason I chose to make a mirror repository for each plugin is that the plugins could then be published in the GitLab Maven repository of that plugin, and you can access all plugins through the single Maven endpoint gitlab.com/api/v4/groups/JOSM/-/packages/maven, which would allow me to solve https://gitlab.com/floscher/gradle-josm-plugin/issues/1 .

I think your criticism of rebuilding plugins is valid; there should be a way for plugins to supply a URL when they have their own build infrastructure. For the plugins that I have already enabled this for, it wasn't really an issue, because those use the CI pipeline in https://gitlab.com/JOSM/plugin as their main build method, so for them it wouldn't be a rebuild, but their main build.

No need to manually update ...

I don't do that either.

Me neither ;), but since most "external" plugins do it, I thought I could bring that up as a point.

comment:8 in reply to:  6 ; Changed 2 years ago by floscher

Replying to stoecker:

There is already a Perl script doing that work of gathering the information for PluginSource and SVN (together with lots of other things). If we do extend this to Git, I'd use that script instead of something new. So your main focus should be to find a way to access the relevant data with as few remote accesses as possible and to describe this process (i.e. what's necessary to get the current download URL and older versions for each plugin release). It could then be reimplemented in the existing Perl script easily. From Perl we can do direct HTTP/HTTPS accesses as well as call tools like git or download utilities - what's important is that we try to minimize remote calls.

Currently we're downloading each external plugin every 10 minutes with an "only send data if changed" header.

As the GitLab approach is not yet final, this probably means we need to access the existing JOSM group on GitHub.

This should only be done for plugins under JOSM control, i.e. not for each and every one, but only for those in the GitHub JOSM group or whatever other infrastructure is decided official.

Part of my effort here is to standardize how these git-hosted plugins could be published, so we can collect them more easily for creating the plugin list, instead of each plugin doing its own release and publishing process. At the moment there is no such standard URL for every plugin that your Perl script could query.

For pt_assistant, Mapillary and wikipedia, the URL for the latest plugin is josm.gitlab.io/plugin/‹plugin-name›/dist/latest/‹plugin-name›.jar. Older versions can be found by replacing latest with the git tag which triggered the release.
But I'm still trying to figure out how to solve that best for as many of the git-hosted plugins out there as possible.
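
That URL scheme can be captured in two small helpers; the class name and the example tag below are made up, only the path layout comes from the description above:

```java
// Sketch of the josm.gitlab.io download URL scheme described above.
// The jar for the latest release and for a specific git tag differ
// only in one path segment ("latest" vs. the tag name).
public class PluginUrls {
    static final String BASE = "https://josm.gitlab.io/plugin";

    /** URL of the most recent release of a plugin, e.g. "pt_assistant". */
    static String latest(String plugin) {
        return BASE + "/" + plugin + "/dist/latest/" + plugin + ".jar";
    }

    /** URL of the release that was triggered by the given git tag. */
    static String forTag(String plugin, String tag) {
        return BASE + "/" + plugin + "/dist/" + tag + "/" + plugin + ".jar";
    }

    public static void main(String[] args) {
        System.out.println(latest("pt_assistant"));
        System.out.println(forTag("pt_assistant", "v2.1.3")); // hypothetical tag
    }
}
```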

I (think I) don't have access to that Perl script, nor to the information about what is currently done to publish the plugin list, so I figured I could simply try to build a similar tool from scratch to test how we could collect a plugin list for the git-hosted plugins. So I'll use my tool as a proof of concept for how the collection of plugin *.jar files could work with git-hosted plugins.
And when it is ready to be used in production, I'd be happy to explain what I ended up with, and if you'd like to implement the approach in Perl, feel free to do so.

If the final goal is to develop all JOSM plugins in git, then I'd like to raise the question of whether it makes sense to extend the tooling used for SVN with git capabilities, or to create new tooling just for git repositories (I don't claim to know the answer; I don't have the necessary insight to decide). But if not, it definitely makes sense to keep the current tool around and extend it to also handle git repositories.

comment:9 in reply to:  8 Changed 2 years ago by stoecker

Replying to floscher:

Replying to stoecker:

There is already a Perl script doing that work of gathering the information for PluginSource and SVN (together with lots of other things). If we do extend this to Git, I'd use that script instead of something new. So your main focus should be to find a way to access the relevant data with as few remote accesses as possible and to describe this process (i.e. what's necessary to get the current download URL and older versions for each plugin release). It could then be reimplemented in the existing Perl script easily. From Perl we can do direct HTTP/HTTPS accesses as well as call tools like git or download utilities - what's important is that we try to minimize remote calls.

Currently we're downloading each external plugin every 10 minutes with an "only send data if changed" header.

As the GitLab approach is not yet final, this probably means we need to access the existing JOSM group on GitHub.

This should only be done for plugins under JOSM control, i.e. not for each and every one, but only for those in the GitHub JOSM group or whatever other infrastructure is decided official.

Part of my effort here is to standardize how these git-hosted plugins could be published, so we can collect them more easily for creating the plugin list, instead of each plugin doing its own release and publishing process. At the moment there is no such standard URL for every plugin that your Perl script could query.

For pt_assistant, Mapillary and wikipedia, the URL for the latest plugin is josm.gitlab.io/plugin/‹plugin-name›/dist/latest/‹plugin-name›.jar. Older versions can be found by replacing latest with the git tag which triggered the release.
But I'm still trying to figure out how to solve that best for as many of the git-hosted plugins out there as possible.

I don't think I want that. External should remain external and be harder for the suppliers. Automation should only work for plugins where the JOSM admins have a chance to do bug fixes and other necessary cleanup (i.e. what's currently the JOSM group on GitHub). To explain the concept: JOSM should be open and I want to allow different methods of usage, but that does not mean that for the JOSM admins each variant has the same value. The less control we have, the more work it is for us, so essentially each plugin developer should be forced to decide whether he weighs his freedom higher or whether he wants the automatic goodies like translation, auto-list-updates and so on :-)

So you have a limited scope: only those Git plugins which follow certain rules.

I (think I) don't have access to that perl script and the information what is currently done to publish the plugin list, so I figured I

No, you don't.

could simply try to build a similar tool from scratch to test how we could collect a plugin list for the git-hosted plugins. So I'll use my tool as a proof of concept on how the collection of plugin *.jar-files could work with git-hosted plugins.
And when it is ready to be used in production, I'd be happy to explain to you what I ended up with and if you'd like to implement the approach in perl, feel free to then do so.

That's fine. If your approach produces a working concept, re-implementing it in the Perl script is the smallest part of the work.

As I understand it, what you have now would be fine for a GitLab-based future system, but is suboptimal for the current GitHub-based one. Could that be changed? Ideally to support both, so a switch is easier. There is no need for CI integration, as the start time of the Perl script is fixed due to other reasons.

Essentially what would be required is gathering the necessary information without requiring another special setup:

  • Start whatever script for the JOSM-managed git plugins:
    • download the list of available plugins and their newest release
    • compare with saved state
    • download the newest plugin and create history when necessary and possible (requires e.g. different links for each version)
    • do the above with the least amount of necessary network calls :-)
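
The compare-with-saved-state step above could look roughly like this (all names are illustrative, and the saved state is reduced to a plugin-to-version map):

```java
// Sketch: only plugins whose newest release differs from the remembered
// state need a download, which keeps the number of network calls minimal.
import java.util.LinkedHashMap;
import java.util.Map;

public class StateDiff {

    /** Returns the plugin -> version entries that changed since the saved state. */
    static Map<String, String> changed(Map<String, String> saved, Map<String, String> current) {
        Map<String, String> out = new LinkedHashMap<>();
        current.forEach((plugin, version) -> {
            if (!version.equals(saved.get(plugin))) {
                out.put(plugin, version);  // new plugin or new release -> re-download
            }
        });
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> saved = Map.of("pt_assistant", "v2.1.0", "wikipedia", "v1.0.1");
        Map<String, String> now = Map.of("pt_assistant", "v2.1.3", "wikipedia", "v1.0.1");
        System.out.println(changed(saved, now)); // only pt_assistant needs a fetch
    }
}
```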

What we have already in existing script:

  • extract manifest from jar
  • plugin download with an "only send data if changed" header
  • storage of historic states
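
The manifest-extraction step is available directly in `java.util.jar`; this sketch builds a throwaway jar in a temp file and reads an attribute back (the `Plugin-Version` value is made up):

```java
// Sketch: reading the main manifest attributes out of a plugin jar,
// demonstrated on a tiny jar written to a temp file.
import java.io.File;
import java.io.FileOutputStream;
import java.util.jar.Attributes;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;
import java.util.jar.Manifest;

public class ManifestExtract {

    /** Reads the main manifest attributes of a plugin jar. */
    static Attributes mainAttributes(File jar) throws Exception {
        try (JarFile jf = new JarFile(jar)) {
            return jf.getManifest().getMainAttributes();
        }
    }

    public static void main(String[] args) throws Exception {
        // Build a throwaway jar whose manifest carries a made-up plugin version.
        Manifest mf = new Manifest();
        mf.getMainAttributes().put(Attributes.Name.MANIFEST_VERSION, "1.0");
        mf.getMainAttributes().putValue("Plugin-Version", "v1.2.3");
        File jar = File.createTempFile("plugin", ".jar");
        jar.deleteOnExit();
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar), mf)) {
            // no entries needed; the manifest alone is enough for this demo
        }
        System.out.println(mainAttributes(jar).getValue("Plugin-Version")); // prints v1.2.3
    }
}
```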

If the final goal is to develop all JOSM plugins in git, then I'd like to raise the question of whether it makes sense to extend the tooling used for SVN with git capabilities, or to create new tooling just for git repositories (I don't claim to know the answer; I don't have the necessary insight to decide). But if not, it definitely makes sense to keep the current tool around and extend it to also handle git repositories.

That script does lots more stuff than only getting the plugin list (like handling styles, presets, ...), extracting help topics and keyboard shortcuts, and a lot of background tasks, so replacing it is out of the question :-)

comment:10 Changed 2 years ago by qeef

So the goal is an automatically updatable list of .jar URLs? Or centralized store of .jars? Just trying to get the essence.

comment:11 in reply to:  10 Changed 2 years ago by stoecker

Replying to qeef:

So the goal is an automatically updatable list of .jar URLs? Or centralized store of .jars? Just trying to get the essence.

Essentially I think it would be best if we do not add additional restrictions or requirements (or at least only a minimal number) to the plugin process and simply automatically extract the info about what is released in the JOSM Git repos (like it is done for SVN already). For SVN it's easier, as there is only one place to look at :-)

In principle, if it works with JOSM group plugins it would also work with other groups, but we won't call it for these ;-)

My approach, without looking into the details, would be to read https://github.com/JOSM to find all plugins, then parse the repos to find the status, then find the download URL, and then go on as currently based on PluginSource processing [history is created simply by remembering previous data :-]. Very likely there is a better way (this way requires a minimum of 1 + number-of-plugins HTTPS requests, and website scraping tends to break). From what I understand from this ticket, floscher implemented that process already for a GitLab variant of the JOSM plugins (which would be the future goal for JOSM git, but is not yet there). Correct?

Last edited 2 years ago by stoecker (previous) (diff)

comment:12 in reply to:  10 ; Changed 2 years ago by floscher

Replying to qeef:

So the goal is an automatically updatable list of .jar URLs? Or centralized store of .jars? Just trying to get the essence.

I see both as the goal: having a centralized store of *.jar files (as a Maven repository or something similar), so plugins that depend on other plugins can be developed more easily. And also a list of all plugins with their download URLs that is updated automatically.

Thank you stoecker for the detailed feedback! I think I'll have to think a bit about all that, and how this could be solved best. When there are news on this, I'll report back.

To your question: Yes, the tool currently collects the released *.jar files for the Gitlab group, extracts the MANIFEST.MF from them and publishes a plugin list to https://josm.gitlab.io .

comment:13 in reply to:  12 Changed 2 years ago by stoecker

Replying to floscher:

I see both as the goal: Having a centralized store of *.jar files (as a Maven repository or something similar), so plugins that depend on other plugins can be developed easier.

As a byproduct we already have all plugins on the server. We would only need to provide them in any useful way for such a purpose. Only the most recent one.
