Are you using stock Docker compose setup?
Yes.


Describe the problem you’re having:

In the event log I get this error every 10 minutes since the latest update:
E_ERROR (1) classes/urlhelper.php:369 Allowed memory size of 268435456 bytes exhausted (tried to allocate 132124672 bytes).
I tried updating again and rebooting: the error persists. Running in Docker via docker-compose on a Raspberry Pi with Ubuntu 20.04 and 8 GB RAM (3.4 GB free).

Include steps to reproduce the problem:

It happens on every scheduled feed update, so: always.

tt-rss version (including git commit id):

v21.11-a109e8998

Have you changed (i.e. set) TTRSS_MAX_DOWNLOAD_FILE_SIZE (default 16MiB / 16777216 bytes) or TTRSS_MAX_CACHE_FILE_SIZE (default 64MiB / 67108864 bytes)?

The error message seems to indicate it’s trying to allocate ~126MiB for a single request/download.
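For reference, the byte counts in that message convert to MiB like this (plain arithmetic, not tt-rss code):

```python
# Convert the byte counts from the PHP error message into MiB.
MIB = 1024 * 1024

memory_limit = 268435456   # PHP memory_limit the container sets
failed_alloc = 132124672   # the single allocation that pushed it over

print(memory_limit // MIB)         # 256
print(round(failed_alloc / MIB))   # 126
```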

https://git.tt-rss.org/fox/ttrss-docker-compose/src/commit/c91a00451c0f689ddae10c42bed529517eade349/app/Dockerfile#L12 is where the 256MiB limit comes from.

No, I have not changed those settings. But this error message clearly comes from that limit. I've now changed the limit locally to 256/512 MiB and will see what happens…

unfortunately it’s unlikely to help because you’re running out of ram.

this ram limit is set for php within the container and can’t currently be adjusted.

How? I don’t see any issue on the system (yes, I’m a noob…)

MiB Mem :   7811.3 total,   5160.1 free,    980.9 used,   1670.3 buff/cache
MiB Swap:      0.0 total,      0.0 free,      0.0 used.   6571.7 avail Mem

and you’re right, it didn’t change anything except the error message, which now reads:
Allowed memory size of 134217728 bytes exhausted (tried to allocate 65015808 bytes)

oh sorry, i was talking about ttrss_download_… variables, not memory_limit.

this limit is set so tt-rss wouldn’t eat all your ram by doing something stupid, i.e. what it’s doing now: downloading a huge file which is linked in the feed somewhere, probably because of caching being turned on or a plugin.

normally plugins that deal with this stuff have safeguards so this wouldn’t happen repeatedly, not sure about generic caching. maybe this should be implemented.

you should have a feed with repeated update errors, try to find it and check what this is about (feed debugger, compose logs, etc).

we probably need a generic mechanism in urlhelper which would specifically forbid fetching stuff if a tt-rss background process crashed once while dealing with it.

Thanks for the hint! I found the article which is causing the issue:
https://www.reddit.com/r/dotnet/comments/r0dijt/net_70100_alpha_now_available_for_download/
In combination with af_redditimgur.
In the reddit post a zip file is linked, and it looks like this is what’s getting downloaded. The zip is about 250 MB, so the fetch crashes.

Since when is af_redditimgur trying to download linked zip files?

yep. it should probably check mime type first or something.
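such a check could be as simple as this (a sketch, not the actual plugin code; in practice the header would come from a HEAD request, with a fallback since some servers don’t support HEAD):

```python
def looks_like_image(content_type) -> bool:
    """Decide from a Content-Type header value whether a URL is
    worth downloading as a potential image."""
    if not content_type:
        return False
    # Drop parameters such as "; charset=..." and normalize case.
    mime = content_type.split(";")[0].strip().lower()
    return mime.startswith("image/")
```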

[05:06:19/30282] looking for meta og:image
[05:06:19/30282] UrlHelper::fetch: https://dotnetcli.azureedge.net/dotnet/Sdk/7.0.100-alpha.1.21478.48/dotnet-sdk-7.0.100-alpha.1.21478.48-win-x64.zip
[05:06:29/30282] curl: reached max size of 16777216 bytes requesting https://dotnetcli.azureedge.net/dotnet/Sdk/7.0.100-alpha.1.21478.48/dotnet-sdk-7.0.100-alpha.1.21478.48-win-x64.zip, aborting.

well, downloading zip files aside, i couldn’t make it OOM, it just aborts after 16 megabytes, as it should.
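that abort is the download-size cap doing its job; the idea is roughly this (a sketch of the mechanism, not the actual UrlHelper code):

```python
import io

MAX_DOWNLOAD_SIZE = 16 * 1024 * 1024  # mirrors the 16 MiB default cap


class MaxSizeExceeded(Exception):
    pass


def read_capped(stream, limit=MAX_DOWNLOAD_SIZE, chunk_size=8192):
    """Read a response stream in chunks, aborting once the cap is crossed."""
    data = bytearray()
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            return bytes(data)
        data.extend(chunk)
        if len(data) > limit:
            # analogous to curl's "reached max size of N bytes ... aborting."
            raise MaxSizeExceeded(f"reached max size of {limit} bytes")


# A fake 5 KB "download" against a 2 KB cap aborts instead of OOMing:
try:
    read_capped(io.BytesIO(b"x" * 5000), limit=2000, chunk_size=512)
except MaxSizeExceeded as e:
    print(e)  # reached max size of 2000 bytes
```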

test feed - .NET

https://git.tt-rss.org/fox/tt-rss/commit/831648e3c84ce50645f496e4628ed2a72bdccf10

this should prevent some unnecessary downloads caused by af_redditimgur.

Thank you very much! This prevents those errors!