sGTM Container Capacity Maxed Out (200KB)

After setting everything up in server Google Tag Manager, I’ve realized that I’ve maxed out the container capacity, which is 200KB. I am not sure why so little space is available for server GTM. I understand that restriction for web GTM, where the browser has to download the container, but not for the server side.

This is what’s taking up the space:

Traffic Source Tags: TOTAL = 130.25KB
Other Tags: TOTAL = 22.7KB
Variables: TOTAL = 118KB

So if you sum it all up, it goes beyond 200KB (270.95KB in total).

I am not sure how much space the native variables take, but I’m pretty sure it’s something around that size. And before you ask: yes, all these tags and variables are needed for the system to work. It is a complex setup with multi-domain tracking, click ID mapping, object and array mapping, webhooks, Firestore and so on.

Taking into account that it is unlikely Google will increase the server GTM limit in the near future (although I have already submitted feedback), the question is: what should I do in this situation?

Should I create a “main container” with some Traffic Source Tags and then a “secondary container” with the rest? That would imply having two different server GTM URLs to send webhook data to, and maintaining two containers with a lot of duplicated variables.
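For illustration, the webhook side of that split could be as simple as routing each event to the container that owns its tags. A minimal sketch, where both URLs and the event grouping are placeholders, not my actual setup:

```javascript
// Purely hypothetical sketch of the "two containers" option: each webhook
// is sent to the server GTM URL of the container that owns its tags.
const CONTAINER_URLS = {
  main: 'https://sgtm-main.example.com/data',           // main Traffic Source Tags
  secondary: 'https://sgtm-secondary.example.com/data'  // everything else
};

function containerUrlFor(eventName) {
  // Placeholder grouping: which events belong to the main container.
  const mainEvents = ['purchase', 'generate_lead'];
  return mainEvents.includes(eventName)
    ? CONTAINER_URLS.main
    : CONTAINER_URLS.secondary;
}
```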

You don’t have much choice: either you split it all into multiple containers like you outlined, or you tailor the webhook payload so it contains the things you need, in the format you need, eliminating the need for that many variables.
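For example, the reshaping could happen at the webhook sender before the request ever reaches sGTM. A rough sketch in Node.js (18+), where the endpoint URL and all field names are assumptions, not your actual schema:

```javascript
// Trim and flatten the webhook payload before it reaches sGTM,
// so the container needs far fewer mapping variables.
const SGTM_WEBHOOK_URL = 'https://sgtm.example.com/webhook'; // placeholder

async function forwardWebhook(rawPayload) {
  // Do the object/array mapping here instead of in sGTM variables.
  const body = {
    event_name: 'purchase',
    transaction_id: rawPayload.order.id,
    value: rawPayload.order.total,
    currency: rawPayload.order.currency,
    gclid: rawPayload.attribution ? rawPayload.attribution.gclid : undefined
  };

  await fetch(SGTM_WEBHOOK_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body)
  });
}
```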

So weird.

I don’t think the Event Data (ED) variables and constants take up much space.

I agree with @Dan, maybe adjust the payload of the webhook so there’s less heavy lifting in the container.

I would probably look at the variables:
You have a lot of lookup tables, so it might be worth writing a custom template to process them instead (see the sketch below).
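As a sketch, a single server-side custom variable template (sandboxed JavaScript) can stand in for a whole stack of lookup-table variables; the event key and the mapping values below are hypothetical:

```javascript
// Server-side custom variable template (sandboxed JS).
// One map object replaces several lookup-table variables.
const getEventData = require('getEventData');

// Hypothetical mapping: traffic source -> click ID parameter name.
const clickIdParamBySource = {
  google: 'gclid',
  facebook: 'fbclid',
  tiktok: 'ttclid',
  microsoft: 'msclkid'
};

const source = getEventData('traffic_source'); // hypothetical event key
// Fall back to a default when the source isn't mapped.
return clickIdParamBySource[source] || 'unknown';
```

The idea being that one template counts against the 200KB limit once, rather than once per lookup table.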