One big publication or many small ones?

What would be the most resource-efficient way to handle publications?

Say I have a gigantic Items collection. Initially, the user sees only 30 items, filtered by his cityId; the items come from all Categories and are sorted by date.
He can switch to another city and see a completely different set of 30 items.
He can also choose a Category and see 30 items for that category, which may or may not overlap with the items he saw initially.
And of course there is pagination to load the next 30 items.
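Just to make the shape concrete, the page I'm describing boils down to a query like this (the collection and field names are the ones from this post; the helper itself is only a sketch):

```js
import { Mongo } from 'meteor/mongo';

const Items = new Mongo.Collection('items');
const PAGE_SIZE = 30;

// One "page" as described above: filter by city, optionally by category,
// sort by date, and paginate with skip/limit.
function itemsPageCursor({ cityId, categoryId, page = 0 }) {
  const selector = { cityId };
  if (categoryId) {
    selector.categoryId = categoryId;
  }
  return Items.find(selector, {
    sort: { date: -1 },
    limit: PAGE_SIZE,
    skip: page * PAGE_SIZE,
  });
}
```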

I’ve read somewhere that a ‘detailed’ publication is more expensive than a ‘general’ one, because when 10k users subscribe, the server has to serve each of them a different set of data, query the database separately, and so on…

Thinking about this more, I’ve realized that I can’t actually subscribe to a ‘part’ of a big publication (for some reason I thought I could); I would receive the entire huge dataset on the client :(
So I guess the only way here is to filter the publication granularly via subscription arguments… or is there another method?
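For the record, the ‘arguments’ version I have in mind would look roughly like this (the publication name and arguments are placeholders, not from a real app):

```js
import { Meteor } from 'meteor/meteor';
import { Mongo } from 'meteor/mongo';
import { check, Match } from 'meteor/check';

const Items = new Mongo.Collection('items');

// Server: publish only the requested 30-item slice, driven by the arguments.
Meteor.publish('items.page', function ({ cityId, categoryId, page = 0 }) {
  check(cityId, String);
  check(categoryId, Match.Maybe(String));
  check(page, Match.Integer);

  const selector = { cityId };
  if (categoryId) {
    selector.categoryId = categoryId;
  }
  return Items.find(selector, {
    sort: { date: -1 },
    limit: 30,
    skip: page * 30,
  });
});

// Client: resubscribe whenever the city, category, or page changes.
Meteor.subscribe('items.page', { cityId: 'nyc', page: 0 });
```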

Cursor cache reuse only happens when the parameters are exactly the same.
As soon as you add something like a userId to the cursor, two different users can’t share the same cursor.
The same goes for filters: if you have the same query for a city, everyone querying that city will use that one cursor, and it will be “shared”.
This works at the cursor level, not the publication level. So if you have multiple publications and some of them end up with the same arguments, the cursors should be reused automatically.
That’s if my information is correct; check the Kadira article about it :smiley:
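If I’ve got that right, the difference looks roughly like this; both publication names and the ownerId field are made up just for the example:

```js
import { Meteor } from 'meteor/meteor';
import { Mongo } from 'meteor/mongo';
import { check } from 'meteor/check';

const Items = new Mongo.Collection('items');

// Shared: every subscriber passing the same cityId produces an identical
// selector and identical options, so the server can reuse one observer.
Meteor.publish('items.byCity', function (cityId) {
  check(cityId, String);
  return Items.find({ cityId }, { sort: { date: -1 }, limit: 30 });
});

// Not shared: this.userId ends up in the selector, so two users asking for
// the same city still get two different cursors and two observers.
Meteor.publish('items.myItemsInCity', function (cityId) {
  check(cityId, String);
  return Items.find(
    { cityId, ownerId: this.userId },
    { sort: { date: -1 }, limit: 30 }
  );
});
```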

I’ve decided to go with methods, to keep the livedata overhead to a minimum.
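Roughly, a method-based fetch could look like the sketch below (the method name and fields are placeholders). The method returns a plain array instead of a reactive cursor, so the server doesn’t keep an observer open for every client.

```js
import { Meteor } from 'meteor/meteor';
import { Mongo } from 'meteor/mongo';
import { check, Match } from 'meteor/check';

const Items = new Mongo.Collection('items');

Meteor.methods({
  // Returns a one-off array, so there is no livedata observer to maintain
  // per connected client.
  'items.fetchPage'({ cityId, categoryId, page = 0 }) {
    check(cityId, String);
    check(categoryId, Match.Maybe(String));
    check(page, Match.Integer);

    const selector = { cityId };
    if (categoryId) {
      selector.categoryId = categoryId;
    }
    return Items.find(selector, {
      sort: { date: -1 },
      limit: 30,
      skip: page * 30,
    }).fetch();
  },
});

// Client: fetch a page, and call again whenever city, category, or page changes.
Meteor.call('items.fetchPage', { cityId: 'nyc', page: 0 }, (err, items) => {
  if (!err) {
    // render `items`
  }
});
```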