
SongKong and Jaikoz Music Tagger Community Forum

SK crashes. Memory issue?

OK, clear!

Can I pause the fix job, and run a dupe removal from time to time in the processed folder?

Or run this from a secondary docker container?

No you can’t run a second task from the same instance of SongKong whilst another task is already running even if it is paused.

Perhaps you could set up a new SongKong instance in a separate container and point it at the processed files; as long as this second instance has a different location for storing the SongKong database and logs, I think it should work.
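For illustration, a second instance might look something like this. The image name, host paths and port below are my assumptions based on a typical unraid SongKong setup, not taken from this thread; adjust them to yours. The key point is that the second container mounts a different /songkong volume, so it keeps its own database and logs, and publishes a different host port to avoid a clash with the first instance.

```shell
# Sketch of a second SongKong instance (image name, paths and ports
# are assumptions; check the official SongKong docker docs).
# A separate /songkong volume keeps its own database and logs.
docker run -d --name songkong-dupes \
  -v /mnt/user/appdata/songkong-dupes:/songkong \
  -v /mnt/user/music/processed:/music \
  -p 4568:4567 \
  songkong/songkong
```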

So, here I am again.

I’ve allowed 500GB to my docker containers.

Ran SK for a few days and it crashed again.

I monitored its storage usage, and I was still under 100GB for the songkong folder.

I really wonder why it keeps crashing. And I believe that, because of the crash, it will now start over with 3,320,000 remaining tracks to process. So it’s a never-ending story.

Could I eventually run it some other way than dockerized on unraid? As it’s the docker container that crashes, I guess I need to figure out how to run it differently. A VM?

PS: I just sent the support files again :slight_smile:

You can set For songs already fully matched to Ignore and then those songs already matched will just be skipped over, and you can also enable Ignore songs previously checked that could not be matched to skip over songs that SongKong tried to match but failed.

It runs on Linux, Windows and Mac, so couldn’t you just point your computer at your unraid config?

Yes, the next try will be to run it from a VM.

I just installed one and noticed that SK, by default, has 870MB of RAM allocated.

As I noticed that the behavior was exactly the same (it gets slower, then finally crashes), I think I should try allocating more RAM to this process!

I will give the docker one last try before I try the VM solution (I like docker better…).

I can’t easily locate the SongKong64.ini file though; where is it located in the docker container?

I plan to allocate 32GB of RAM to it…

I am not suggesting a VM, I always find VMs quite flaky, but why can’t you just install SongKong directly on a Windows/Linux/Mac computer and access your unraid disks as a remote drive? I expect it failed on docker because of something docker-related.

SongKong64.ini is only for Windows; memory configuration is different for every platform.

OK, so where can I edit the allocated RAM in the docker container?

I could find this in the songkong.sh params:

-XX:MaxRAMPercentage=60
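For context (my own explanation, not from songkong.sh): -XX:MaxRAMPercentage sets the JVM’s maximum heap as a percentage of the memory it thinks is available. Inside a container with a memory limit, a modern JVM uses that limit as the base; with no limit set, it sees the whole host. A small sketch, assuming Linux with cgroup v2 (cgroup v1 uses a different path), of what that base works out to:

```shell
# Sketch (my assumption, not SongKong's own tooling): estimate the max
# heap that -XX:MaxRAMPercentage would yield. Uses the cgroup v2 memory
# limit if one is set, otherwise falls back to total host RAM.
max_heap_mib() {
    pct=$1
    limit=/sys/fs/cgroup/memory.max
    if [ -r "$limit" ] && [ "$(cat "$limit")" != "max" ]; then
        bytes=$(cat "$limit")
    else
        kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
        bytes=$((kb * 1024))
    fi
    echo $((bytes * pct / 100 / 1024 / 1024))
}

# With no container limit on a 64GB host, 60% works out to roughly 38GB.
max_heap_mib 60
```

Note the heap is only part of the JVM’s footprint, so the container can still be killed even when the heap stays under this figure.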

But I can tell you it never used 60% of my 64GB of RAM :smiley:

Well there you are then, the RAM setting is not the problem.

So why can’t you just run SongKong directly on a computer?

Well, because I have no dedicated computer except my unraid server.

And a pair of MacBook Pros that I use and can’t leave running 24/7 for the several weeks it would take to get my files fixed :frowning:

But you could run a MacBook Pro for a couple of days to see if it runs without crashing; that would then show the issue is docker rather than a particular problem with your data/files.

The thing is you are trying to fix a lot of songs, and docker is inherently less reliable than installing the software directly on the computer’s OS.

Although, what memory limit did you have the docker container itself configured with? I think it will not automatically make all the physical memory available.

This is a very good point. I did not add any extra params when running the docker. I will try increasing the docker allowed memory before I start playing with a VM.

I’m quite sure it hits some limit at some point, crashing the docker container.
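On unraid, an explicit memory limit can be passed via the container template’s “Extra Parameters” field. A sketch of the equivalent plain docker invocation; the image name and volume paths are assumptions carried over from a typical setup, and the 32g figure just mirrors the amount mentioned above:

```shell
# Sketch: give the container an explicit memory ceiling so the JVM's
# -XX:MaxRAMPercentage has a sane base and the container is less likely
# to be killed mid-run. Image name and paths are assumptions.
docker run -d --name songkong \
  --memory=32g \
  --memory-swap=32g \
  -v /mnt/user/appdata/songkong:/songkong \
  -v /mnt/user/music:/music \
  -p 4567:4567 \
  songkong/songkong
```

Setting --memory-swap equal to --memory disables swap for the container, so a hard limit is hit cleanly rather than after heavy swapping.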

Before I do this, I want to start from a clean state again. So, now that I have properly renamed files in music/processed, I started a duplicate find job on that folder.

It’s pretty damn slow; am I missing something? Shouldn’t it be super fast, since the files that were copied there have already been moved, and therefore identified by SongKong?

Look, I started it two hours ago and it has only loaded 40k tracks, out of a total of… 3.3 million. This is even scarier, as I don’t believe all 3.3M of my tracks were moved in there!

05/10/2022 13.51.40:CEST:INFO: Start
05/10/2022 13.51.41:CEST:INFO: Start:/music/processed
05/10/2022 14.01.22:CEST:WARNING: Ignoring /music/processed/Merzbow/Recycled because looks like a recyclebin folder
05/10/2022 14.02.12:CEST:WARNING: Ignoring /music/processed/Pain Jerk/Recycled because looks like a recyclebin folder
05/10/2022 14.04.36:CEST:WARNING: --Music File Count:456263
05/10/2022 14.07.22:CEST:SEVERE: —Shutdown:com.jthink.songkong.analyse.duplicates.DeleteDuplicatesLoadFolderWorker
05/10/2022 15.13.49:CEST:WARNING: --Music File Count:3323338

I upload the support files again.

Here is a little snippet of the debug log; look how slow it is loading the files:

Finally, I can confirm SK counts more tracks than what my processed folder actually contains!

While SK counted 3,323,338 tracks, the reality is that I have 699,356 tracks in that folder.

It is really difficult to help customers with one problem when they then start doing something else and ask about a different issue, so let’s leave Delete Duplicates alone for now. My recommendation is to try running on your Mac and see how you get on.

Alternatively, if you really don’t want to do that, then continue with Docker and set For songs already fully matched to Ignore so those songs already matched will just be skipped over, and use Ignore songs previously checked that could not be matched to skip over songs that SongKong tried to match but failed; this should allow you to make progress.

Another thing you could try is enabling Preview Only to see if the problem is connected to file saving or file renaming.

I strongly recommend not trying a VM; that is not a supported platform and it is unlikely to work better than Docker.

The 3,323,338 figure is how many songs SongKong has counted in the folder, not how many it has actually loaded; it had loaded 96,224 so far. Maybe there is an issue with counting symbolic/absolute links if there are only 699,356 songs in that folder.

2 things :slight_smile:

  1. yep, I will stick to fixing the tracks to make your life easier. I agree :wink:
  2. I can confirm that SongKong keeps seeing 3,323,338 tracks whatever folder I select. I believe this is the total number of tracks /music contains. So it seems SongKong keeps listing 3 million tracks, even when I point it at the /music/processed/ subfolder. Could that be a bug?

PS: I am trying to figure out how to run ncdu (counting only the flac and mp3 files) so I can confirm what I’m stating above.
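Rather than ncdu (which counts disk usage, not file types), plain find can do both checks: count the audio files SongKong should actually see, and look for symlinks that could inflate its count. A sketch with two small helper functions (the function names are mine, just for illustration):

```shell
# Count the flac/mp3 files under a folder, case-insensitively.
# find does not follow symlinks by default, so this is the "real" count.
count_audio() {
    find "$1" -type f \( -iname '*.flac' -o -iname '*.mp3' \) | wc -l
}

# Count symlinks under a folder -- a possible source of double counting.
count_links() {
    find "$1" -type l | wc -l
}
```

Usage would be `count_audio /music/processed` and `count_links /music/processed`; if the link count is non-zero, that could explain a scanner seeing far more tracks than the folder really holds.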

Possibly, what would be useful is if you could try and run SongKong against a single album folder underneath /music/processed/ and see if:

  • It still reports the large figure
  • It finishes after matching the one album, or continues to run, loading files you would not expect it to load

Good idea, indeed. I will do this as soon as I can and will keep you posted.

OK, so I made a mistake and ran a “dupe finder” instead of a “fix” job, but the results are, let’s say, still strange.

I pointed SongKong to the following folder: /music/processed/Andrew Liles/

This folder contains 560 folders and 5,869 files (and a LOT of dupes, see below screenshot(s)):

The job starts, and it lists the following loaded tracks (???):

I let it finish, and here is the result of the dupe job:

So, SongKong searched for dupes in totally different folders.

I opened the first artist folder just to check if the dupes were removed. It removed a few dupes, but most of the dupe folders and files are still in there:

So it looks like SongKong scans whatever it wants, and deletes dupes just as arbitrarily.

And now that it has finished processing these random files, I could eventually start another task, but:

As you can see, java (SK) keeps running something, but I have no clue what it is doing.

So, here is what I’d like to achieve from now on:

  1. Make sure all the already renamed and moved albums have been analysed for dupes, and ALL dupes removed.
  2. Make a new, fresh install of the SK docker.
  3. Allow it the maximum possible memory.
  4. Start from scratch for all the remaining files.

Can we focus on making sure my processed folder gets cleaned of its dupe albums? For now, as you can see, it’s simply going nuts. :smiley:

This doesn’t make any sense to me, but if you could try the Fix Songs test on a single album folder that would really be a more useful starting point, and resend the support files.

Indeed, it doesn’t make any sense to me either.

So here we go, a single album, fix tracks!

I selected a 2-track album; it ran, listed 2 tracks, identified the album, and fixed it.

And it did not go further! So it was successful this time. But on a single folder.