SongKong and Jaikoz Music Tagger Community Forum

Albunack has been updated with latest data

Albunack has been updated with the latest data. There were some issues with the dataset provided by Discogs that delayed this, but they are now resolved.

It contains all Discogs releases up to and including https://www.discogs.com/release/11352702 and all MusicBrainz releases up to a few days ago.

When will Albunack next be updated?
I have considerable difficulty synchronizing the multi-CD release Claude King - More Than Climbing That Mountain, because it was compiled from several CDs that have since been merged in MusicBrainz. And this was not the only one.

Should be one next week.

For me this is a problem. Every week I buy new music, add it to MusicBrainz and compile all the relationships, and the releases already present in MusicBrainz are also updated and modified by other users.
This delay in updating Albunack creates several problems for me, because the wait to synchronize the new data is long. I remember being told that it would be updated every month, but this deadline was rarely met. I hope this situation changes.

Updating every month, i.e. twelve times a year, makes a lot of sense, I think, and should be given more consideration.

Since Jaikoz version 10, Match to Album:Match to Specified Album also looks in the live MusicBrainz database if no match can be found in Albunack, so that is a solution for newly added MusicBrainz albums.

Most SongKong/Jaikoz customers are not regular MusicBrainz contributors, so for them this is not such a problem, but the intention is always to update once a month, coinciding with the monthly Discogs dumps.
What has made this difficult is that the initial Discogs import script is rather slow, taking about three days (the MusicBrainz import takes only a few hours), so the whole process takes about four days, and although it is mostly automated there are a few steps along the way. The Discogs import was not written by me, and I need to rewrite it.

Also, I am still making improvements to Albunack, so sometimes I delay the import because I am about to make a coding change, and then the change does not happen because something else takes priority. This is what has currently happened.

I use Match to Album:Match to Specified MusicBrainz Album all the time when I distribute a new release. But in this case, Claude King - More Than Climbing That Mountain, it does not bring the hoped-for success, because in Albunack the MusicBrainz release ID points to a different release, usually to one CD of what is now a multi-disc release.
And every Jaikoz user who tries to match their music library with MusicBrainz and happens to try one of these releases will fail!
In my case the solution would be: when using Match to Album:Match to Specified MusicBrainz Album, use only MusicBrainz and never Albunack.

Okay, I understand your specific issue now, but it is very specific to you and I don't think it will really affect many users. In cases like these you could always use MusicBrainz Picard for this kind of problem until Albunack is updated.

Quick update: I am working on Albunack now, but I am fixing a few bugs with Albunack first, so I'm sorry to say the new data version will not be available until next week.

An update: I am now using a new Discogs import process that takes only a few hours, which is going to make it much easier for me to get the monthly updates done. The only problem is that the new database it creates is more different than I thought, so I have to make some more changes to the code that creates the search index from the database. For this reason we are now looking at next week before Albunack is updated.

(Apart from support requests, the majority of my time is focused on completing this, so I really do expect it to be ready next week.)


Okay, we are nearly there.

The Discogs import process, which imports data from the XML data dumps into a PostgreSQL database, was missing some data compared to the old method and required more work. But this is now done and tested, and the import time has been reduced from three days to three hours.
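For anyone curious how an XML-dump-to-database import like this typically works, here is a minimal sketch of the general technique (streaming parse with batched inserts). The schema, field names, and sample data are all hypothetical, and I use an in-memory SQLite database to keep the example self-contained; the real pipeline targets PostgreSQL and a much richer schema.

```python
import sqlite3
import xml.etree.ElementTree as ET
from io import BytesIO

# Hypothetical miniature of a Discogs-style release dump; the real
# dumps are multi-gigabyte XML files with many more fields.
SAMPLE_DUMP = b"""<releases>
  <release id="11352702"><title>Example Title</title><artist>Example Artist</artist></release>
  <release id="11352703"><title>Another Title</title><artist>Another Artist</artist></release>
</releases>"""

def import_releases(xml_stream, conn, batch_size=1000):
    """Stream-parse the dump and insert releases in batches.

    iterparse() keeps memory flat regardless of dump size, and
    clearing each element after use plus batching the inserts is
    what turns a multi-day naive import into a few hours.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS release "
        "(id INTEGER PRIMARY KEY, title TEXT, artist TEXT)"
    )
    batch = []
    for _, elem in ET.iterparse(xml_stream):  # default: 'end' events
        if elem.tag == "release":
            batch.append((int(elem.get("id")),
                          elem.findtext("title"),
                          elem.findtext("artist")))
            elem.clear()  # free the already-processed subtree
            if len(batch) >= batch_size:
                conn.executemany("INSERT INTO release VALUES (?, ?, ?)", batch)
                batch.clear()
    if batch:
        conn.executemany("INSERT INTO release VALUES (?, ?, ?)", batch)
    conn.commit()

conn = sqlite3.connect(":memory:")
import_releases(BytesIO(SAMPLE_DUMP), conn)
print(conn.execute("SELECT COUNT(*) FROM release").fetchone()[0])  # 2
```

With PostgreSQL, bulk loading via `COPY` instead of row inserts is usually the biggest additional speedup.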

The next stage is the code that reads the combined MusicBrainz/Discogs data from the database and creates the search index. Some additional changes were required here as well, and it seemed to be working, but when applied to the whole dataset a few changes from the first stage broke things. I believe this is now fixed, but a full run takes 24 hours, so I shall find out tomorrow whether it has fully worked.
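The index-building stage works on the same general principle as any inverted index: read records from both sources and map search terms back to release identifiers. Albunack's actual index format isn't public, so the sketch below is just the bare technique with hypothetical data, not its real implementation.

```python
from collections import defaultdict

# Hypothetical combined rows as the index builder might read them from
# the database: (source, release_id, title, artist).
rows = [
    ("musicbrainz", "mb-1", "More Than Climbing That Mountain", "Claude King"),
    ("discogs", "dg-11352702", "Example Title", "Example Artist"),
]

def build_index(rows):
    """Build a simple inverted index: lowercase token -> set of record keys."""
    index = defaultdict(set)
    for source, rel_id, title, artist in rows:
        key = (source, rel_id)
        for token in (title + " " + artist).lower().split():
            index[token].add(key)
    return index

def search(index, query):
    """AND-match every query token; return the set of matching records."""
    results = None
    for token in query.lower().split():
        hits = index.get(token, set())
        results = hits if results is None else results & hits
    return results or set()

index = build_index(rows)
print(search(index, "claude mountain"))  # {('musicbrainz', 'mb-1')}
```

Rebuilding an index like this from scratch over the full dataset is what makes the 24-hour run time plausible: every record from both sources has to be tokenized and written out again.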

@Dani82 and @Alfg

Okay, it has now been updated, up to 7th March 2020.
Let me know if you encounter any problems.
