Fix: MongoDB Schema Validation Error — Document Failed Validation
Quick Answer
How to fix MongoDB schema validation errors — $jsonSchema rules, required fields, type mismatches, enum constraints, bypassing validation for migrations, and Mongoose schema conflicts.
The Error
MongoDB rejects a document insert or update with:
```
MongoServerError: Document failed validation
Additional information: {
  failingDocumentId: ObjectId('...'),
  details: {
    operatorName: '$jsonSchema',
    schemaRulesNotSatisfied: [
      {
        operatorName: 'required',
        specifiedAs: { required: [ 'email', 'createdAt' ] },
        missingProperties: [ 'createdAt' ]
      }
    ]
  }
}
```

Or a type mismatch triggers the error:
```
schemaRulesNotSatisfied: [
  {
    operatorName: 'properties',
    propertiesNotSatisfied: [
      {
        propertyName: 'age',
        description: 'item did not match any allowed types',
        details: [
          {
            operatorName: 'bsonType',
            specifiedAs: 'int',
            reason: 'type did not match',
            consideredValue: '25',
            consideredType: 'string'
          }
        ]
      }
    ]
  }
]
```

Or validation works for new documents but fails during a migration or bulk update:
```
BulkWriteError: Document failed validation
writeErrors: [ { index: 14, code: 121, errmsg: 'Document failed validation' } ]
```

Why This Happens
MongoDB 3.6+ supports collection-level schema validation using JSON Schema (`$jsonSchema`). When enabled, every insert and every update that affects a document is checked against the schema. Common causes of failure:

- **Missing required fields** — the schema's `required` array lists fields that must be present in every document.
- **Wrong BSON type** — `bsonType: 'int'` rejects a JavaScript `Number` that MongoDB stores as a double, or a string `"25"` instead of an integer `25`.
- **Enum constraint violated** — a `status` field set to `"archived"` when the schema only allows `["active", "inactive"]`.
- **Existing documents out of sync** — adding validation to a collection that already contains non-conforming documents doesn't fail immediately, but those documents fail on their next update.
- **Mongoose schema vs. MongoDB validation** — Mongoose validates at the ODM layer before sending to MongoDB. MongoDB-level validation adds a second layer that Mongoose doesn't control, causing confusing double-validation failures.
- **`$set` updates on partial documents** — MongoDB validates the entire document after a `$set` update, not just the updated fields.
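Two of these causes, missing required fields and enum violations, can be caught client-side before a write ever reaches MongoDB. A minimal sketch; the `preflightCheck` helper, field names, and enum values here are illustrative, not part of any driver API, and MongoDB's own `$jsonSchema` validation remains authoritative:

```javascript
// Illustrative pre-flight check for required fields and enum constraints.
// Returns a list of human-readable problems; empty array means "looks OK".
function preflightCheck(doc, { required = [], enums = {} } = {}) {
  const problems = [];
  for (const field of required) {
    if (doc[field] === undefined) {
      problems.push(`missing required field: ${field}`);
    }
  }
  for (const [field, allowed] of Object.entries(enums)) {
    if (field in doc && !allowed.includes(doc[field])) {
      problems.push(`enum violation on ${field}: ${doc[field]}`);
    }
  }
  return problems;
}

// Example: a document missing createdAt and using a disallowed status
const problems = preflightCheck(
  { email: 'a@b.com', status: 'archived' },
  { required: ['email', 'createdAt'], enums: { status: ['active', 'inactive'] } }
);
// problems → ['missing required field: createdAt', 'enum violation on status: archived']
```

Running such a check before `insertOne` turns an opaque code-121 server error into a readable message during development.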
Fix 1: Read the Validation Error Details
MongoDB’s error message includes exactly which rules failed. Expand the details object to find the specific constraint:
```javascript
// Node.js — catch and log the full error details
try {
  await db.collection('users').insertOne(doc);
} catch (err) {
  if (err.code === 121) { // 121 = "Document failed validation"
    console.error('Validation failure details:');
    console.error(JSON.stringify(err.errInfo?.details, null, 2));
    // errInfo.details.schemaRulesNotSatisfied shows exactly what failed
  }
  throw err;
}
```

Inspect the current schema on a collection:
```javascript
// Get the current validation rules
const collectionInfo = await db.listCollections({ name: 'users' }).toArray();
const validator = collectionInfo[0]?.options?.validator;
console.log(JSON.stringify(validator, null, 2));

// Or in the MongoDB shell:
// db.getCollectionInfos({ name: 'users' })[0].options.validator
```

Check which existing documents violate a schema before applying it:
```javascript
// Query documents that would fail validation before adding the schema
const schema = {
  $jsonSchema: {
    required: ['email', 'createdAt'],
    properties: {
      age: { bsonType: 'int' }
    }
  }
};

// Find documents that DON'T match the schema (they will fail after it is applied)
const violations = await db.collection('users').find({
  $nor: [schema]
}).toArray();
console.log(`${violations.length} documents would fail validation`);
```

Fix 2: Fix Type Mismatches
BSON types are stricter than JavaScript types. The most common mismatch is between `int`, `double`, and `long`:
```javascript
// WRONG — JavaScript numbers are floating point by default.
// If the schema requires bsonType: 'int', this can fail:
// mongosh stores plain numeric literals as BSON doubles.
await db.collection('products').insertOne({
  name: 'Widget',
  quantity: 5,  // May be serialized as a double — fails strict int validation
  price: 9.99   // Fine as a double
});

// CORRECT — use Int32 to make the integer type explicit
const { Int32 } = require('mongodb');

await db.collection('products').insertOne({
  name: 'Widget',
  quantity: new Int32(5), // Explicit BSON int32
  price: 9.99             // double — fine for decimal values
});

// Or relax the schema to accept more than one numeric type:
const validator = {
  $jsonSchema: {
    properties: {
      quantity: { bsonType: ['int', 'double'] }, // Accept both
      // or use 'number' to accept any numeric type:
      price: { bsonType: 'number' }
    }
  }
};
```

Common BSON type names:
| JavaScript value | BSON type string |
|---|---|
| Integer (whole number) | `"int"` or `"long"` |
| Decimal / float | `"double"` or `"decimal"` |
| `"text"` | `"string"` |
| `true` / `false` | `"bool"` |
| `new Date()` | `"date"` |
| `null` | `"null"` |
| `[]` | `"array"` |
| `{}` | `"object"` |
| `ObjectId` | `"objectId"` |
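If you're unsure which type string a value will need, a small lookup helper can map JavaScript values to candidate `bsonType` strings. This is a sketch only: plain numbers can serialize as either `int` or `double` depending on the driver or shell, so the helper reports both possibilities rather than guessing:

```javascript
// Map a JavaScript value to the $jsonSchema bsonType string(s) it may satisfy.
// Sketch only — integral numbers may end up as int or double depending on the driver.
function bsonTypeOf(value) {
  if (value === null) return 'null';          // typeof null is 'object', check first
  if (Array.isArray(value)) return 'array';
  if (value instanceof Date) return 'date';
  switch (typeof value) {
    case 'string':  return 'string';
    case 'boolean': return 'bool';
    case 'number':  return Number.isInteger(value) ? 'int or double' : 'double';
    case 'object':  return 'object';
    default:        return 'unknown';
  }
}

bsonTypeOf('text');     // → 'string'
bsonTypeOf(25);         // → 'int or double'
bsonTypeOf(9.99);       // → 'double'
bsonTypeOf(new Date()); // → 'date'
```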
Allow null or missing optional fields:
```javascript
// Schema that allows phone to be either a string or null (optional field)
const validator = {
  $jsonSchema: {
    properties: {
      phone: {
        bsonType: ['string', 'null'], // Can be a string or null
        description: 'Phone number, optional'
      }
    }
  }
};
```

Fix 3: Add Required Fields to Existing Documents
When adding a required constraint to a field that some existing documents don’t have, fix the existing documents first:
```javascript
// Step 1: Find documents missing the required field
const missing = await db.collection('users').find({
  createdAt: { $exists: false }
}).toArray();
console.log(`${missing.length} users missing createdAt`);

// Step 2: Backfill the field before adding validation
if (missing.length > 0) {
  await db.collection('users').updateMany(
    { createdAt: { $exists: false } },
    { $set: { createdAt: new Date('2024-01-01') } } // Reasonable default
  );
}

// Step 3: Now it is safe to add the validation schema
await db.command({
  collMod: 'users',
  validator: {
    $jsonSchema: {
      bsonType: 'object',
      required: ['email', 'createdAt'],
      properties: {
        email: { bsonType: 'string' },
        createdAt: { bsonType: 'date' }
      }
    }
  },
  validationLevel: 'strict',  // Enforce on all inserts and updates
  validationAction: 'error'   // Reject violating documents (default)
});
```

Fix 4: Use validationLevel for Migrations
During migrations, temporarily relax validation to avoid blocking updates:
```javascript
// Set validation to 'moderate' — only validate NEW documents and documents
// that already pass the current schema. Existing non-conforming documents
// can still be updated without validation.
await db.command({
  collMod: 'users',
  validationLevel: 'moderate'
});

// Perform the migration
await db.collection('users').updateMany({}, {
  $set: { status: 'active' }
});

// Re-enable strict validation after the migration
await db.command({
  collMod: 'users',
  validationLevel: 'strict'
});
```

Validation levels:
- `"strict"` (default) — validate all inserts and updates
- `"moderate"` — validate inserts, and updates to documents that already pass the current schema; existing non-conforming documents can be updated without validation
- `"off"` — no validation (use only for emergency recovery)
Validation actions:
- `"error"` (default) — reject documents that fail validation
- `"warn"` — allow the write but log a warning (useful for auditing before enforcing)
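The moderate-then-strict migration sequence is easy to get wrong when a migration throws midway. It can be wrapped in a small helper that always restores strict validation. This is a sketch, assuming a driver `db` object that exposes `command()`; `withModerateValidation` is a hypothetical name, not a driver API:

```javascript
// Relax validation to 'moderate' around a migration, restoring 'strict'
// afterwards even if the migration body throws.
// Hypothetical helper — `db` is assumed to be a MongoDB driver Db instance.
async function withModerateValidation(db, collection, migrate) {
  await db.command({ collMod: collection, validationLevel: 'moderate' });
  try {
    return await migrate();
  } finally {
    // Runs on success AND on error, so strict validation is always restored
    await db.command({ collMod: collection, validationLevel: 'strict' });
  }
}

// Usage sketch:
// await withModerateValidation(db, 'users', () =>
//   db.collection('users').updateMany({}, { $set: { status: 'active' } })
// );
```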
Set `validationAction: "warn"` when first adding a schema, to audit violations without breaking the application:
```javascript
await db.command({
  collMod: 'users',
  validator: { $jsonSchema: { ... } }, // your schema here
  validationAction: 'warn' // Log but don't reject — check logs for violations
});
```

Fix 5: Align Mongoose Schema with MongoDB Validation
Mongoose and MongoDB-level validation operate independently. A Mongoose schema doesn’t automatically create MongoDB collection validators:
```javascript
// Mongoose schema — validates BEFORE sending to MongoDB
const userSchema = new mongoose.Schema({
  email: { type: String, required: true },
  age: { type: Number, min: 0 }
});
// This does NOT create a MongoDB $jsonSchema validator.
// MongoDB-level validation must be created separately.

// Option 1: Use ONLY Mongoose validation (no MongoDB-level schema).
// This is the simpler approach for most applications.

// Option 2: Add MongoDB validation explicitly (for data integrity beyond Mongoose)
const validationSchema = {
  $jsonSchema: {
    bsonType: 'object',
    required: ['email'],
    properties: {
      email: { bsonType: 'string' },
      age: { bsonType: ['int', 'double', 'null'] }
    }
  }
};

// Apply during application startup
await mongoose.connection.db.command({
  collMod: 'users',
  validator: validationSchema
});
```

When both Mongoose and MongoDB validation are active, MongoDB validation fires after Mongoose sends the document. A document that passes Mongoose validation can still fail MongoDB validation if the types differ (e.g., Mongoose coerces strings to numbers, so MongoDB receives a number, which may not match a strict `bsonType: 'string'` rule).
Common Mistake: Using `mongoose.Schema.Types.ObjectId` for a reference field in Mongoose, then specifying `bsonType: 'string'` in the MongoDB schema. Mongoose stores ObjectId references as BSON ObjectIds, not strings.
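A matching pair for that case might look like the following sketch; the `authorId` field and `Author` model are illustrative names, not from any particular codebase:

```javascript
// Mongoose side: an ObjectId reference (shown as a comment for context)
// authorId: { type: mongoose.Schema.Types.ObjectId, ref: 'Author' }

// Matching MongoDB-side rule — bsonType must be 'objectId', NOT 'string',
// because that is what Mongoose actually stores for a ref field.
const refValidator = {
  $jsonSchema: {
    properties: {
      authorId: { bsonType: 'objectId' }
    }
  }
};
```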
Fix 6: Handle Enum Constraints
When a field has an enum constraint, every insert and update must use one of the allowed values:
```javascript
// Schema with an enum constraint
const validator = {
  $jsonSchema: {
    properties: {
      status: {
        bsonType: 'string',
        enum: ['pending', 'active', 'inactive', 'deleted'],
        description: 'Must be one of the allowed status values'
      }
    }
  }
};

// WRONG — 'archived' is not in the enum list
await db.collection('users').updateOne(
  { _id: userId },
  { $set: { status: 'archived' } } // Fails: 'archived' not allowed
);

// CORRECT — use an allowed value, or update the schema to include 'archived'
await db.collection('users').updateOne(
  { _id: userId },
  { $set: { status: 'inactive' } } // 'inactive' is in the enum list
);

// To allow 'archived', update the schema:
await db.command({
  collMod: 'users',
  validator: {
    $jsonSchema: {
      properties: {
        status: {
          bsonType: 'string',
          enum: ['pending', 'active', 'inactive', 'deleted', 'archived'] // Added
        }
      }
    }
  }
});
```

Fix 7: Bypass Validation for Emergency Writes
In emergencies (data recovery, critical hotfix), you can bypass validation with the `bypassDocumentValidation` option:
```javascript
// Insert bypassing validation — use sparingly
await db.collection('users').insertOne(
  { _id: new ObjectId(), partialData: true }, // Would fail validation
  { bypassDocumentValidation: true }
);

// In a transaction
const session = client.startSession();
await session.withTransaction(async () => {
  await db.collection('users').updateMany(
    { status: { $exists: false } },
    { $set: { status: 'pending' } },
    { session, bypassDocumentValidation: true }
  );
});
```

Warning: `bypassDocumentValidation` requires the `bypassDocumentValidation` privilege in MongoDB's access control. Don't use this as a workaround for everyday writes — fix the root cause instead.
Still Not Working?
Validation on nested fields — `$jsonSchema` supports nested objects with `properties` inside `properties`. If a nested document fails validation, the error message shows the full property path.
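A nested-rules sketch (the `address`, `city`, and `zip` field names are illustrative):

```javascript
// Validate an 'address' sub-document; a failure on e.g. zip is reported
// with the full property path in schemaRulesNotSatisfied.
const nestedValidator = {
  $jsonSchema: {
    properties: {
      address: {
        bsonType: 'object',
        required: ['city'],          // required applies within the sub-document
        properties: {
          city: { bsonType: 'string' },
          zip:  { bsonType: 'string' }
        }
      }
    }
  }
};
```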
Array item validation — to validate each item in an array, use items:
```javascript
{
  $jsonSchema: {
    properties: {
      tags: {
        bsonType: 'array',
        items: { bsonType: 'string' }, // Each tag must be a string
        maxItems: 10
      }
    }
  }
}
```

`$set` and full-document validation — `$set` only touches the specified fields, but validation runs on the full document: if your document contains other fields that violate the schema, a `$set` targeting only valid fields still triggers a validation failure on the entire document.
MongoDB Atlas vs. self-hosted differences — on Atlas, your database user may lack the `bypassDocumentValidation` privilege, so the bypass option can fail with an authorization error. Check your Atlas database user's roles if bypass doesn't work.
Schema changes in migration tools (e.g., Mongock) — if you use a migration tool, validator changes must account for the migration's own write operations. Apply `validationLevel: 'moderate'` before the migration step, then restore `'strict'` after.
For related MongoDB issues, see Fix: MongoDB Connection Timeout and Fix: Prisma Unique Constraint Failed.