Duplicate key error index while inserting a record in Mongo

Hello everyone,
I am getting the following error (on the client side):

insert failed: MongoError: insertDocument :: caused by :: 11000 E11000 duplicate key error index: meteor.tags.$_id_  dup key: { : "KbNnewPfXkkXEGdMB" }

but I am inserting a record with _id = "KbNnewPfXkkXEGdMB" for the first time.

Here is my sample code snippet (on the server side):

if (Tags.find().count() === 0) {
	Tags.insert({
		"_id" : "KbNnewPfXkkXEGdMB",
		"title" : "test"
	});
}

Any help/suggestions for this issue would be appreciated. Thanks.

You add it on the server side but get the error on the client side? It seems like you are inserting it on both sides. Normally you would only see an error message in the server console. Where did you save the file that's inserting the tag?

Try not inserting "_id"; Mongo will create the id on document creation.
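
For reference, a minimal sketch of that approach using the same Tags collection, assuming the insert stays in server-only code; Meteor's insert() returns the generated _id:

if (Tags.find().count() === 0) {
	// let Meteor/Mongo generate the _id; insert() returns it
	var newId = Tags.insert({
		"title" : "test"
	});
	console.log("Inserted tag with _id:", newId);
}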

I am able to duplicate this by performing the insert on the client and server "simultaneously" (for example by using shared code). If you specify the _id, you should ensure the insert is done in one place only (if you don't specify the _id and do it on both the client and the server, you will get two documents inserted).
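
For example, here's a minimal sketch of guarding the insert in shared code so it only ever runs on the server (assuming the snippet lives in a file that is loaded on both the client and the server):

if (Meteor.isServer) {
	Meteor.startup(function () {
		if (Tags.find().count() === 0) {
			Tags.insert({
				"_id" : "KbNnewPfXkkXEGdMB",
				"title" : "test"
			});
		}
	});
}

Moving the file into the server/ directory achieves the same thing without the guard.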

Thanks @robfallows, it works for me.


Chiming in with another instance where you might encounter this issue.

If you're using Random.id() to generate client-side _ids (probably anywhere, but in my specific case as part of a JSON browser cache setup), be sure to strip these IDs before committing the data to MongoDB, as it's nowhere near random enough.

We started encountering duplicate ID issues after only a couple of years of operation (and around 2000 data entries). It took me a few hours to figure out why we were getting duplicate ID errors. And of course, because it was cached form data, once a user generated an _id for submission, even refreshing the page wouldn't change that ID, so it was impossible to get around it from a user's perspective.

Protip: if you're using Meteor (you're here, I guess, but you never know), the underscore library bundled with Meteor has a lovely omit function that's cleaner than the native delete data._id option. Usage: _.omit(data, '_id') returns a JSON object minus the specified key, in this instance '_id', without modifying the source object. This is very handy for avoiding orphaning data in the cache, so it can be properly removed upon a successful insert.
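
A rough sketch of how that might look when committing cached form data (cachedDoc and removeFromCache are hypothetical names for this example):

// cachedDoc came out of the browser cache and still carries the
// client-generated _id; strip it so the insert gets a fresh one,
// leaving the cached copy untouched for cleanup afterwards.
var doc = _.omit(cachedDoc, '_id');
Tags.insert(doc, function (error, newId) {
	if (!error) {
		// safe to clear the cache entry now, keyed by the old _id
		removeFromCache(cachedDoc._id);
	}
});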

Edit: Oh yeah, fun fact. The duplicate IDs that were generated conflicted with the same user's other entries, exclusively so far as I can tell, and not recent entries either. Says something about how nonrandom Random is.