Repeated _id using Random.id()?

I am using Random.id() on the client to generate the document's _id before it is inserted into the DB, so I can benefit from the optimistic UI.
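
Here's a minimal sketch of what I mean, in case it helps (the Tasks collection and the field names are just placeholders):

// Generate the _id on the client with Random.id() and pass it in the insert,
// so the optimistic (latency-compensated) document and the server document
// share the same _id.
import { Mongo } from 'meteor/mongo';
import { Random } from 'meteor/random';

const Tasks = new Mongo.Collection('tasks'); // hypothetical collection

const _id = Random.id(); // known before the insert completes
Tasks.insert({ _id, text: 'hello' }, (err) => {
  if (err) console.log('insert failed:', err);
});
// _id can be used right away, e.g. to route to the new document
// before the server has confirmed the write.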

I am wondering: what would happen if Random.id() generates an _id that is already in use in the DB?

The insert will fail, since there's a unique constraint on the _id field. You could solve this by trying again with a new Random.id() (the chances of generating an _id that already exists in the database are really low).

Hence one more case to consider in the logic. :sweat_smile:

Well, a simple, friendly error along the lines of "Your operation failed / Server busy, try again in a few seconds", combined with generating a new id in the background for the retry, could be enough, so you don't have to increase the complexity of the logic. It's not a very common case anyway. It also depends on the size of your user base/database, but I guess that if you don't hit a very large number of documents this can be "ignored".
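
Something like this rough sketch, for example (the 'tasks.insert' method name and the showToast helper are made up for illustration, and the retry is deliberately naive):

// Show a friendly message and retry once in the background with a fresh _id.
import { Meteor } from 'meteor/meteor';
import { Random } from 'meteor/random';

const doc = { _id: Random.id(), text: 'hello' };

Meteor.call('tasks.insert', doc, function (err) {
  if (err) {
    showToast('Your operation failed, trying again in a few seconds'); // hypothetical UI helper
    doc._id = Random.id(); // new id for the retry, same document
    Meteor.call('tasks.insert', doc);
  }
});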


@tcastelli do you know, when doing a Collection.insert(...), if the generated id turns out to be a duplicate, whether Meteor or MongoDB will try to generate a new id and redo the insert?

From my own tests I would say they don't: if a document with that _id already exists in the DB, it raises a Mongo "duplicate _id key" error (something like that).
What Meteor does automatically is generate an _id field if it's not defined in the object (using Random.id(), so the client and server optimistic inserts match).

gbInsertDefensively = (Collection, doc) ->
	try Collection.insert(doc)
	catch e
		# Detect the duplicate-key error by its Mongo error code (11000) when
		# available, with the message text as a fallback, instead of matching
		# the full stack string (which can change between Mongo versions).
		isDuplicateId = e.code is 11000 or (e.message and e.message.indexOf('E11000 duplicate key error') isnt -1)
		if isDuplicateId # the _id already exists in the collection
			doc._id = Random.id()
			gbInsertDefensively(Collection, doc)
		else
			gbThrowMethodErr("gbInsertDefensively: unknown error when trying to insert document into #{Collection._name} collection")
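
And the automatic _id generation mentioned above roughly amounts to this (a simplified sketch of the behaviour, not the actual collection.js code):

import { Random } from 'meteor/random';

// If the inserted document has no _id, Meteor fills one in using the
// collection's id generator, which by default is Random.id()
// (idGeneration: 'STRING'). The client stub and the server share a random
// seed per method call, so both ends generate the same _id.
function ensureId(doc) {
  if (!('_id' in doc)) {
    doc._id = Random.id();
  }
  return doc;
}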

The _ids generated by Random.id() (e.g. in Meteor.users) consist of 17 characters drawn from an alphabet of 55 "unmistakable" characters (a-z, A-Z and 0-9, minus a few easily confused ones). That gives you 55^17 ≈ 3.9e+29 combinations. If you had a billion (10^9) documents in the collection, the chance of a newly generated _id colliding with an existing one would be roughly 10^9 / 3.9e+29, i.e. about 2.6e-19 %.

I wouldn't worry about that, but maybe you have an enormous number of documents?
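
If you want to play with the numbers yourself, here is a quick back-of-the-envelope version (assuming Random.id()'s 17 characters drawn from its 55-character "unmistakable" alphabet):

const combinations = Math.pow(55, 17);               // ≈ 3.9e29 possible ids
const existingDocs = 1e9;                            // a billion documents
const pNextIdCollides = existingDocs / combinations; // ≈ 2.6e-21, i.e. about 2.6e-19 %
console.log(combinations, pNextIdCollides);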


Not the case, but I was curious to understand how the system works. :wink:

Yep, I thought so as well, until I stumbled upon the same id being generated a couple of times :smile:

Really? Tell me more. What were the circumstances?

Well, I simply decided to test the Random package, and it turned out that out of several thousand ids some were identical.
I didn't feel like digging into Random's internals, so I simply wrote a defensive function.

@avalanche1 it actually depends on where you create the id. The "randomness" is different on the client and the server, and it is definitely much more dependable on the latter, provided you have enough entropy. Check out the comments in the source code for further discussion, links and pointers:


Why do you provide the _id with Random.id()? Doesn't Meteor generate an id automatically on the client for that very reason? I mean, you also get optimistic UI if you don't provide an id, or am I wrong?

Yeah, I had exactly the same issue (the app crashed while throwing an error that the id already exists). My method was defined on both the server and the client side; after moving it to server only the error stopped, but I lost the optimistic UI. At the end of the day we moved it back and added a try/catch around the insert in the method.
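
Roughly this shape, in case anyone hits the same thing (a sketch with placeholder names; Tasks is the same hypothetical collection as in the first sketch):

import { Meteor } from 'meteor/meteor';

Meteor.methods({
  'tasks.insert'(doc) {
    try {
      // Runs in the client stub (optimistic UI) and for real on the server.
      return Tasks.insert(doc);
    } catch (err) {
      // Surface a clean error instead of letting the method crash;
      // the caller can then retry with a fresh _id.
      throw new Meteor.Error('insert-failed', err.message);
    }
  }
});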

I love the simplicity of the _id as a String (compared to Mongo.ObjectID), but I do feel insecure knowing that in Collection.insert (https://github.com/meteor/meteor/blob/master/packages/mongo/collection.js#L516) there is no check to catch the specific error of a repeated _id (even if the chance is astronomically small, even with a billion documents).

It feels like there is a hidden flaw in the system. Maybe this should be fixed? For example, by checking whether the Mongo error was caused by a duplicated _id and retrying with another random _id?
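
Something along these lines is what I have in mind (just a sketch of the idea, not a patch for the actual collection.js):

import { Random } from 'meteor/random';

// Detect the duplicate-key error by its Mongo error code (11000), with the
// message text as a fallback, and retry a couple of times with a fresh _id.
function insertWithRetry(collection, doc, attempts = 3) {
  try {
    return collection.insert(doc);
  } catch (err) {
    const isDuplicateId = err && (err.code === 11000 ||
      (err.message && err.message.indexOf('E11000') !== -1));
    if (isDuplicateId && attempts > 1) {
      doc._id = Random.id();
      return insertWithRetry(collection, doc, attempts - 1);
    }
    throw err;
  }
}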