Heavy CPU, enormous logs.db

I’m running the latest version (3.3.7.40937) on Linux Mint Tara. I’m seeing 40% to 45% CPU utilization continuously, day after day, even with no file activity. My logs.db file also reached 4 GB in size before I deleted it.

Within a few minutes of restarting, the logs.db file is 11 MB and growing quickly. CPU is steady at about 40%. Insync shows ‘Scanning’ with no file activity, though it did find and sync the last couple of new files.

Out.txt shows only one clue: js: ResizeObserver loop limit exceeded

Update: after several hours, logs.db was approaching 4 GB again. The ‘js: ResizeObserver loop limit exceeded’ error message had been repeated many times in out.txt. I executed ‘insync quit’, which took a long time to take effect and generated several other errors in out.txt:

(insync:17297): GLib-GIO-CRITICAL **: 21:03:02.128: _g_file_info_get_attribute_value: assertion 'G_IS_FILE_INFO (info)' failed

a few times, followed by

Exception in thread LogsPruner:
Traceback (most recent call last):
  File "threading.py", line 917, in _bootstrap_inner
  File "threading.py", line 865, in run
  File "ideskcore/mainlogs.py", line 224, in _prune_logs
  File "ideskcore/logsdb.py", line 102, in prune_logs
  File "ideskcore/logsdb.py", line 122, in _prune_logs
  File "ideskdb/clientdb.py", line 493, in select_one
  File "ideskdb/clientdb.py", line 485, in select_first
  File "ideskdb/clientdb.py", line 475, in select_all
  File "ideskdb/clientdb.py", line 409, in wrapper
  File "ideskdb/clientdb.py", line 258, in get_all
sqlite3.DatabaseError: database disk image is malformed

/usr/lib/insync/asyncio/base_events.py:609: RuntimeWarning: coroutine 'Queue.get' was never awaited
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
Exception in thread Sync loop:
Traceback (most recent call last):
  File "threading.py", line 917, in _bootstrap_inner
  File "threading.py", line 865, in run
  File "ideskheadless/fswatcher.py", line 215, in sync
  File "ideskasync/coreloop.py", line 110, in wrapper
  File "idesksync/syncfs.py", line 109, in on_event
  File "ideskasync/coreloop.py", line 59, in call
  File "ideskasync/coreloop.py", line 45, in run
  File "asyncio/tasks.py", line 813, in run_coroutine_threadsafe
AttributeError: 'NoneType' object has no attribute 'call_soon_threadsafe'

/usr/lib/insync/threading.py:951: RuntimeWarning: coroutine 'call..runner' was never awaited
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
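
For anyone hitting the same ‘database disk image is malformed’ error, here is a minimal sketch for confirming whether logs.db is really corrupt. It assumes logs.db lives at ~/.config/Insync/logs.db (the path may differ on your install) and simply runs SQLite’s built-in integrity check:

# Minimal sketch: run SQLite's built-in integrity check on logs.db.
# The path below is an assumption; adjust it to your Insync config directory.
import os
import sqlite3

db_path = os.path.expanduser("~/.config/Insync/logs.db")
conn = sqlite3.connect(db_path)
try:
    result = conn.execute("PRAGMA integrity_check;").fetchone()[0]
    print(result)  # "ok" for a healthy database, a problem report otherwise
except sqlite3.DatabaseError as exc:
    # A badly damaged file can fail before the pragma even finishes.
    print(f"database error: {exc}")
finally:
    conn.close()

A healthy database prints ok; a corrupt one prints a list of problems or raises DatabaseError, which matches the exception in the LogsPruner traceback above.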

Hi @Bill_Kuhns,

Could you email a copy of your logs.db to support@insynchq.com with a link to this post?

I don’t know if this can help, but it reminds me of a problem I had with Dropbox sync.
See here for details…

The problem could be in the inotify service, which must be restarted with a higher file watch limit.

Try:

echo fs.inotify.max_user_watches=100000 | sudo tee -a /etc/sysctl.conf; sudo sysctl -p

and then restart the service.
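
To gauge whether the inotify limit is actually the bottleneck, here is a rough sketch that compares the configured maximum with the number of watches currently in use. It only reads the standard /proc interfaces on Linux; running it as root gives the most complete count:

# Rough sketch: compare inotify watches in use against the configured limit.
# Reads standard Linux /proc files; run with sudo to count all processes.
import glob

def count_inotify_watches():
    total = 0
    for fdinfo in glob.glob("/proc/[0-9]*/fdinfo/*"):
        try:
            with open(fdinfo) as f:
                # Each "inotify wd:..." line describes one watch held by that fd.
                total += sum(1 for line in f if line.startswith("inotify "))
        except OSError:
            continue  # process or fd disappeared, or permission denied
    return total

with open("/proc/sys/fs/inotify/max_user_watches") as f:
    limit = int(f.read())

print(f"inotify watches in use: {count_inotify_watches()} / limit: {limit}")

If the in-use count is close to the limit, raising fs.inotify.max_user_watches as above should help; if it is nowhere near the limit, the CPU usage and logs.db growth likely have another cause.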

Thank you. It appears that the version of Insync I have already does that, perhaps every time it hits the limit. It had already added a line to sysctl.conf setting max_user_watches to a bit over 1,000,000; I increased that to 1,500,000. I’m not optimistic this will solve the CPU and logs.db size problems, but it may get rid of the error messages.

No, I can’t send it as an email attachment. It’s 1.5 GB at the moment. I can delete it and restart Insync, then stop Insync before it gets so large, if that would work. This time, out.txt is also huge at 1 GB, with many error messages like the following:
INFO 2021-03-29 18:51:04,748 [sync:is_stale:89] #649: file is not stale. Reason: cl_mtime.
INFO 2021-03-29 18:51:12,989 [syncwork:_is_stale_dl_sub:2756] #649: checking dl_sub file .~1T5J5zCDHDuBHUDdEBSxB_YoUKquOUch93T__y8-qi7w-67.868.insyncdl for staleness.

That would be great, @Bill_Kuhns! Once it hits a few hundred MB, please upload it to Drive and send me a download link. I’ll let you know via email once I’ve received and made a copy of both the logs.db and out.txt files so you can clear your cloud storage.

By the way, are you seeing a lot of ~insyncdl files in your sync folder too?

I sent the files a while back. No ~insyncdl files anywhere that I can see.

Alright, thanks for confirming. We will look into this further, @Bill_Kuhns!

FYI, SQLite database files compress very well. Next time, if you send it as a .zip or .gz, it will be a significantly smaller upload.
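
For example, a copy of logs.db can be gzipped with a few lines of Python before uploading (the file names below are just placeholders):

# Minimal sketch: gzip-compress a copy of logs.db before uploading it.
import gzip
import shutil

src = "logs.db"     # path to the database copy (placeholder)
dst = "logs.db.gz"  # compressed output to upload (placeholder)

with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
    shutil.copyfileobj(f_in, f_out)
print(f"wrote {dst}")

SQLite files contain a lot of repetitive page data, so the compressed copy is typically a fraction of the original size.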


Thank you for the tip @Drakinite :wink: