One of the main reasons I switched to server-side tracking was the promise of more effective retargeting for people who visit my website. However, I noticed a big discrepancy between the number of users who visited our website in the past 30 days (according to GA4) and the size of my Facebook website traffic audience.
According to Google Analytics, my website has had 33k unique users in the past 30 days.
The size of my Facebook website traffic audience over the same 30-day period is only 4,800-5,700 (Parameters: All website traffic visitors in the past 30 days using the ‘Pageview’ event).
That means that, of the 33k who visited my website in the past 30 days (or rather, that GA4 reports visited my website in the past 30 days), Facebook was only able to match roughly 15%, creating a smaller-than-expected retargeting audience of 4,800-5,700. So my questions are:
Shouldn’t the match rate be higher? My expectation with server-side tracking was a match rate closer to 80% since IP Address and User Agent alone are usually enough for effective matching within Facebook’s ecosystem. Were my expectations wrong?
Is there something going on I’m not aware of? Am I interpreting these numbers incorrectly?
Some additional notes on my setup that may be helpful:
I use server-side GTM and Facebook CAPI. I’ve tested and debugged my server-side setup many times, and according to Facebook, my ‘Pageview’ events are arriving with all the correct parameters at 100% coverage (mainly IP and User Agent), and my Event Match Quality score is 4.2/10 (which is standard for the ‘Pageview’ event, according to Stape blog posts).
I use Stape’s custom loader to use my own subdomain for the web GTM.
I do not use the Facebook pixel at all. I only use Facebook CAPI for processing all of my Facebook events. (My web GTM fires a GA4 ‘pageview’ event that gets sent to the GA4 client on my server GTM; the GA4 client then sends that event to Facebook via a server GTM Facebook CAPI tag. A rough sketch of the resulting request is below these notes.)
I switched to a server-side setup in late March/early April of 2023.
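For context, here’s roughly what I understand the resulting CAPI request to look like for a bare ‘Pageview’ event. This is a sketch, not my actual tag config; the pixel ID, token, URL, and values are all placeholders:

```typescript
// Sketch of the request the server GTM Facebook CAPI tag effectively makes
// for a bare PageView event. All IDs and values below are placeholders.
const PIXEL_ID = "YOUR_PIXEL_ID";
const ACCESS_TOKEN = "YOUR_CAPI_ACCESS_TOKEN";

const event = {
  event_name: "PageView",
  event_time: Math.floor(Date.now() / 1000),
  action_source: "website",
  event_source_url: "https://example.com/some-page",
  user_data: {
    // With no pixel and no logged-in user, these are the only match signals:
    client_ip_address: "203.0.113.42",      // taken from the incoming request
    client_user_agent: "Mozilla/5.0 (...)", // taken from the incoming request
    fbp: "fb.1.1690000000000.1234567890",   // the _fbp cookie, if present
  },
};

await fetch(
  `https://graph.facebook.com/v18.0/${PIXEL_ID}/events?access_token=${ACCESS_TOKEN}`,
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ data: [event] }),
  }
);
```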
My current hypothesis is that my number of ‘users’ in GA4 is inflated because Safari’s ITP expires script-set cookies after 7 days. That produces a large number of ‘New Users’ who are actually returning users but get counted as new/unique because the GA4 Client ID cookie expires quickly. (I currently don’t use Stape’s Cookie Keeper tool because I don’t know how to make my own cookie; the general idea is sketched below.)
However, even if that hypothesis proved true, that still wouldn’t fully account for the discrepancy between the number of unique users reported by GA4 in the past 30 days and the much smaller size of my Facebook website traffic retargeting audience.
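For anyone wondering what ‘making your own cookie’ would involve: as I understand it, a cookie keeper just re-issues the GA client ID cookie from the server with a long expiry, since cookies set via an HTTP response header aren’t subject to Safari’s 7-day cap on script-set cookies. A minimal sketch, assuming an Express-style endpoint on a first-party subdomain (the route name and setup are made up):

```typescript
// Rough sketch of a "cookie keeper": re-set the GA4 client ID cookie from
// the server so Safari's 7-day cap on script-set cookies doesn't apply.
// Route name and setup are hypothetical.
import express from "express";

const app = express();

app.get("/keep-cookie", (req, res) => {
  // Read the existing _ga cookie from the incoming request, if any.
  const gaCookie = req.headers.cookie
    ?.split("; ")
    .find((c) => c.startsWith("_ga="))
    ?.slice("_ga=".length);

  if (gaCookie) {
    // Re-issue the same value as an HTTP cookie with a ~2-year expiry.
    res.setHeader(
      "Set-Cookie",
      `_ga=${gaCookie}; Max-Age=${60 * 60 * 24 * 730}; Path=/; Secure; SameSite=Lax`
    );
  }
  res.sendStatus(204); // nothing to return; the Set-Cookie header is the point
});

app.listen(8080);
```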
Would love to hear any thoughts/ideas, and thanks in advance!
First of all, FB recommends using a hybrid tracking method (web events + server events with deduplication). Web events also collect browser microdata, which can affect audience collection as well. I would recommend trying the tracking setup FB recommends and seeing how it affects your problem.
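To make the deduplication part concrete, here is a minimal sketch (the IDs are hypothetical, and the browser and server parts would of course live in different places): the pixel event and the server event share one event_id, so FB keeps a single copy but gets the union of browser and server signals.

```typescript
// Browser side: fire the pixel event with an explicit eventID.
declare const fbq: (...args: unknown[]) => void; // provided by the FB pixel snippet
const eventId = crypto.randomUUID(); // one ID per event
fbq("track", "PageView", {}, { eventID: eventId });

// Server side: the CAPI event for the same page view carries the same ID,
// so Facebook deduplicates on event_name + event_id.
const serverEvent = {
  event_name: "PageView",
  event_id: eventId, // in practice forwarded from the browser to the server
  event_time: Math.floor(Date.now() / 1000),
  action_source: "website",
  user_data: {
    client_ip_address: "203.0.113.42",      // placeholder
    client_user_agent: "Mozilla/5.0 (...)", // placeholder
  },
};
```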
Second, to increase Match Quality you need to send user data wherever it is available to you. This directly affects the effectiveness of audience collection. IP address, User Agent, and fbp are baseline parameters, so if you want to collect audiences more efficiently, you need to add whatever additional user data you have available.
Here you can find all the possible user parameters you can send and their formats: Parameters - Conversions API - Documentation - Meta for Developers
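For example, a richer event with more of those match keys might look something like this (all values are placeholders; PII fields like em/fn/ln must be SHA-256 hashes of normalized values, while IP, User Agent, fbp, and fbc are sent in plain text):

```typescript
// Hypothetical CAPI event with additional user_data match keys.
const richEvent = {
  event_name: "Purchase",
  event_time: Math.floor(Date.now() / 1000),
  action_source: "website",
  user_data: {
    client_ip_address: "203.0.113.42",
    client_user_agent: "Mozilla/5.0 (...)",
    fbp: "fb.1.1690000000000.1234567890",
    fbc: "fb.1.1690000000000.IwAR2...", // built from the fbclid, if the click came from an FB ad
    em: "<sha256 of normalized email>",
    fn: "<sha256 of normalized first name>",
    ln: "<sha256 of normalized last name>",
    external_id: "<your own user ID>",  // hashing recommended
  },
  custom_data: { currency: "USD", value: 25.0 },
};
```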
Thanks for the help. I went ahead and set up hybrid tracking and event deduplication for ‘pageview’ events. I’ll let that run for about a month to see how it affects my Facebook retargeting audience size and report back my findings here!
For ‘pageview’ events, I don’t have additional user data I can send since I’m only tracking people who visit my site for that event, but I am sending user data (mostly first name, last name, and email) from the server for other events like email sign-ups and purchases.
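(In case it helps anyone else reading: those name/email fields have to be normalized and SHA-256 hashed before they go into user_data. Roughly what that looks like, using Node’s built-in crypto; the example values are made up:)

```typescript
// Normalize (trim + lowercase) and SHA-256 hash a PII value for CAPI user_data.
import { createHash } from "node:crypto";

function hashForCapi(value: string): string {
  const normalized = value.trim().toLowerCase();
  return createHash("sha256").update(normalized).digest("hex");
}

const userData = {
  em: hashForCapi("Jane.Doe@Example.com"), // email
  fn: hashForCapi("Jane"),                 // first name
  ln: hashForCapi("Doe"),                  // last name
};
```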
If your compliance situation allows you to set such a cookie, FB tags will use it to enhance events, effectively making it so that data you sent on a purchase will be ‘available’ for subsequent events the user generates, including those page_views.
It does not necessarily address your initial query, but it’s very related and might end up improving your match score on ‘lesser’ events.
Thanks, @Dan. I noticed this last week when changing things to a hybrid tracking setup, so I updated my tags and turned that feature on. I’m already starting to see those extra parameters from purchases and email subscriptions showing up on a small percentage of pageview events.
Wanted to report back with my findings in this thread now that a month has gone by. It appears the hybrid tracking setup helped a lot, @Alex…
At the end of August, my estimated Facebook audience size (for people visiting my website in the past 30 days) was 4,800-5,700, representing about 15% of my total website traffic. Since I switched to hybrid tracking roughly 30 days ago, my estimated audience size is now 9,800-11,500, which represents about 25%-30% of my total website traffic. So, the hybrid setup increased Facebook’s ability to match website visitors with users in its ecosystem by 10-15 percentage points.
While overall I’m happy with this solution, I’m still somewhat disappointed. My expectation with a server-side setup was that I’d be able to match closer to 80% of my website traffic, but that just doesn’t seem to be the case, especially since I can’t send additional user data parameters for most of that traffic: most people who visit our site just view our content/resources without giving us their email, first name, or last name. (I help manage a donor-funded nonprofit, and we give away all of our content/digital resources for free.)
Still: a 25%-30% Facebook match rate is better than 15%, so I’ll take it. If anyone has ideas on other things I can do or implement to improve that match rate, I’m all ears!