Add mongo-schema core package for native MongoDB schema validation #14279
mvogttech wants to merge 6 commits into meteor:devel from
Conversation
✅ Deploy Preview for v3-migration-docs canceled.
✅ Deploy Preview for v3-meteor-api-docs canceled.
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the settings.
Use the checkboxes below for quick actions:
📝 Walkthrough
Adds a new Meteor package, mongo-schema.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ 3 passed
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 7
🧹 Nitpick comments (9)
packages/mongo-schema/schema_context.js (1)
31-34: Consider returning a copy of the errors array to prevent external mutation.
`validationErrors()` returns the internal `_errors` array directly. If a caller mutates this array, it would corrupt the context's state without triggering reactivity updates. Returning a shallow copy (`[...this._errors]`) would be safer.
🛡️ Proposed defensive copy

```diff
 validationErrors() {
   if (this._dep) this._dep.depend();
-  return this._errors;
+  return [...this._errors];
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/schema_context.js` around lines 31 - 34, The validationErrors method currently returns the internal _errors array directly, risking external mutation; change validationErrors (and keep the existing _dep.depend() call) to return a shallow copy of the array (e.g., using spread or slice) so callers get a defensive copy of this._errors and external mutations won't corrupt the context or bypass reactivity.
packages/mongo-schema/tests/schema_errors_tests.js (1)
5-23: Consider adding coverage for `DENY_INSERT` and `DENY_UPDATE` error types.
The test verifies most `ErrorTypes` constants but omits `DENY_INSERT` and `DENY_UPDATE`, which are defined in `schema_errors.js`. These may be covered by integration tests, but adding them here would ensure complete coverage of the constants.
💡 Add missing assertions

```diff
 test.equal(MongoSchema.ErrorTypes.FAILED_REGULAR_EXPRESSION, 'regEx');
 test.equal(MongoSchema.ErrorTypes.KEY_NOT_IN_SCHEMA, 'keyNotInSchema');
+test.equal(MongoSchema.ErrorTypes.DENY_INSERT, 'denyInsert');
+test.equal(MongoSchema.ErrorTypes.DENY_UPDATE, 'denyUpdate');
 });
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/tests/schema_errors_tests.js` around lines 5 - 23, The test is missing assertions for the DENY_INSERT and DENY_UPDATE error type constants; update the Tinytest 'errors - ErrorTypes constants exist' to include test.equal(MongoSchema.ErrorTypes.DENY_INSERT, 'denyInsert') and test.equal(MongoSchema.ErrorTypes.DENY_UPDATE, 'denyUpdate') so the constants defined in schema_errors.js (DENY_INSERT and DENY_UPDATE) are asserted like the other ErrorTypes.
packages/mongo-schema/tests/collection_integration_tests.js (1)
103-109: Add an explicit client/server split assertion for `bypassSchema`.
This test validates the server happy path, but it doesn't lock in the "server-only" contract. Add a client-side assertion (or guard) so regressions don't accidentally allow bypass behavior in client contexts.
As per coding guidelines, packages/**: "Focus on API backwards compatibility, DDP/reactivity correctness, and client/server split."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/tests/collection_integration_tests.js` around lines 103 - 109, This test currently exercises the server-only bypassSchema behavior; add an explicit client/server split by asserting Meteor.isServer at the top when running the happy-path (so the existing server behavior is validated only on server), and add a client-side branch that verifies bypassSchema does not allow schema bypass on the client (e.g. in the Meteor.isClient branch call col.insertAsync({anything:'goes'},{bypassSchema:true}) and assert the operation either throws or that the resulting document (via col.findOneAsync or the insert result) does not contain the extra field), referencing Tinytest.addAsync, bypassSchema, Mongo.Collection, attachSchema, MongoSchema, insertAsync and findOneAsync to locate where to change the test.
packages/mongo-schema/schema_jsonschema.js (1)
26-32: `oneOf` produces empty schema `{}` for types without `bsonType`.
When compiling `oneOf` types, if a resolved type has no `bsonType` (like `MongoSchema.Any` or custom constructors), line 29 returns `{}`. While this is valid JSON Schema (allows any value), it effectively makes the entire `oneOf` constraint meaningless, since `{}` matches anything.
Consider logging a warning in development, or explicitly handling the `Any` case:
♻️ Handle Any type explicitly

```diff
 if (typeName === 'oneOf') {
-  result.oneOf = desc.resolvedType.resolvedTypes.map(rt => {
-    if (rt.bsonType) return { bsonType: rt.bsonType };
-    return {};
-  });
+  const oneOfSchemas = desc.resolvedType.resolvedTypes
+    .filter(rt => rt.bsonType) // Skip types without bsonType
+    .map(rt => ({ bsonType: rt.bsonType }));
+  if (oneOfSchemas.length > 0) {
+    result.oneOf = oneOfSchemas;
+  }
+  // If all types lack bsonType, result has no type constraint
   return result;
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/schema_jsonschema.js` around lines 26 - 32, The oneOf branch currently maps resolved types to {} when rt.bsonType is missing, which makes the oneOf meaningless; update the logic in the oneOf handling (look for the typeName === 'oneOf' block and desc.resolvedType.resolvedTypes) to explicitly detect and handle MongoSchema.Any or other non-bsonType constructors instead of returning plain {} — for example produce a schema that signals "any" via an explicit marker (or skip/include a more specific discriminator), and/or emit a dev-only warning via your logging facility when a resolved type lacks bsonType so reviewers know oneOf has a catch-all entry; ensure you reference rt constructors (e.g., MongoSchema.Any) when deciding behavior so the oneOf constraint remains meaningful.
packages/mongo-schema/collection_integration.js (1)
130-132: Consider removing the `async` keyword from this fire-and-forget function.
`applyDatabaseEnforcement` is declared `async` but returns immediately after scheduling a `Meteor.startup` callback. The actual async work happens inside the callback. This is harmless but slightly misleading, since the returned Promise resolves before any database work occurs.
♻️ Optional refactor

```diff
-async function applyDatabaseEnforcement(collection, options) {
+function applyDatabaseEnforcement(collection, options) {
   // Server-only, deferred to after startup
   if (typeof Meteor === 'undefined' || !Meteor.isServer) return;
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/collection_integration.js` around lines 130 - 132, The function applyDatabaseEnforcement is declared async but returns immediately (work is done inside a Meteor.startup callback), so remove the misleading async keyword and make it a regular function; update its declaration (applyDatabaseEnforcement) and any internal use that assumes a returned Promise (if any callers await it, stop awaiting or change them to rely on startup behavior), and ensure the Meteor.startup callback still schedules the async work (leave the inner async/await logic unchanged).
packages/mongo-schema/schema_validate.js (1)
243-249: Custom validators in modifier mode have limited field access.
When validating modifier fields, `rootDoc` is set to `modifier[op]` (e.g., the `$set` object). This means custom validators using `this.field('otherField')` can only see fields within the same operator, not fields being set via different operators like `$inc` or `$setOnInsert`.
This may be intentional for SimpleSchema compatibility, but consider documenting this limitation or enhancing the context to search across all operator payloads (similar to `applyAutoValuesModifier`'s `field()` implementation in `schema_clean.js`).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/schema_validate.js` around lines 243 - 249, Custom validators running in modifier mode are given rootDoc = modifier[op], so when validateField processes modifier operators (in schema_validate.js looping over modifier[op] and calling validateField) a custom validator's this.field('otherField') can only see fields inside the same operator; update validateField's modifier-mode context so field resolution searches across all operator payloads (not just modifier[op]) similar to applyAutoValuesModifier's field() behavior in schema_clean.js, or at minimum document this limitation in the API; specifically, adjust the logic that builds the rootDoc/context passed into validateField (and any helper that implements this.field) to merge or lookup values across all modifier operators (e.g., iterate modifier keys and fallback lookups) so validations can access fields set by other operators like $inc or $setOnInsert.
packages/mongo-schema/schema.js (1)
140-145: `labels()` mutates the IR, which may affect extended schemas.
The `labels()` method directly mutates `desc.label` in the IR. If schemas share IR references (e.g., through `extend()` without deep cloning), label changes on one schema could affect another.
Looking at `extend()` (lines 36-39), it creates a new `MongoSchema`, which calls `parseDefinition` and creates a fresh IR, so this appears safe. However, if future changes introduce IR sharing for performance, this mutation could become problematic.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/schema.js` around lines 140 - 145, labels() currently mutates IR entries (desc.label) which can leak to other schemas if IRs are shared; change it to avoid in-place mutation by creating a new descriptor with the updated label and replacing the IR entry (e.g., read desc via this._ir.get(key), create a shallow clone with the new label, then this._ir.set(key, clonedDesc)) or maintain a separate label-overlay map; ensure the change uses the existing parseDefinition/MongoSchema construction semantics so extend() still produces independent schemas.
packages/mongo-schema/schema_clean.js (2)
23-25: Inconsistent mutation pattern with `removeNullsFromArrays`.
Unlike other cleaning steps that reassign `result`, `removeNullsFromArrays` only mutates in place and returns `undefined`. This works because `result` is already mutable at this point, but the inconsistency could cause confusion if the order of operations changes or if `mutate: false` semantics need adjustment.
♻️ Optional: Return result for consistency

```diff
-  if (options.removeNullsFromArrays) {
-    removeNullsFromArrays(ir, result);
-  }
+  if (options.removeNullsFromArrays) {
+    result = removeNullsFromArrays(ir, result);
+  }
```

And update the function:

```diff
 function removeNullsFromArrays(ir, doc) {
   for (const [key, value] of Object.entries(doc)) {
     if (Array.isArray(value)) {
       doc[key] = value.filter(item => item !== null);
     }
     if (typeof value === 'object' && value !== null && !Array.isArray(value) && !(value instanceof Date)) {
       removeNullsFromArrays(ir, value);
     }
   }
+  return doc;
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/schema_clean.js` around lines 23 - 25, The call to removeNullsFromArrays mutates result in place but doesn't return it, causing an inconsistent mutation pattern; update the removeNullsFromArrays function to return the cleaned object (return result) and change its invocation in the options.removeNullsFromArrays branch to reassign result (e.g., result = removeNullsFromArrays(ir, result)) so the flow matches other cleaning steps and supports future non-mutating semantics.
266-274: `autoConvertTypes` only processes top-level keys.
Similar to `removeEmptyStrings`, this function doesn't recurse into nested objects. Nested values like `{ profile: { age: "25" } }` won't be auto-converted to `{ profile: { age: 25 } }`.
This appears consistent across the flat cleaning functions, but differs from the recursive approach used by `trimStrings` and `applyDefaultValues`.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/schema_clean.js` around lines 266 - 274, autoConvertTypes currently only converts top-level fields; update it to recurse into nested objects and arrays so nested values (e.g., profile.age) are converted too: for each entry use ir.get(key) as now, but if value is a plain object call autoConvertTypes recursively with that nested object and the appropriate nested schema/IR (derived from desc.resolvedType or desc.schema/inner IR on desc), and if value is an array map over elements (recursing for object elements and using convertValue for primitives). Keep using convertValue(desc.resolvedType.name) for primitive conversions, and preserve the existing checks for null/undefined; update autoConvertTypes, its recursive calls, and handling of arrays accordingly.
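For illustration, the recursion asked for above can be sketched standalone; the plain `spec` object and the `autoConvert` name are stand-ins for the package's schema IR and real function, not its actual API:

```javascript
// Minimal sketch of recursive auto-conversion. `spec` maps keys to either a
// primitive type name ('number' | 'string') or a nested spec object.
function autoConvert(spec, doc) {
  for (const [key, value] of Object.entries(doc)) {
    const expected = spec[key];
    if (expected === undefined || value === null || value === undefined) continue;
    if (typeof expected === 'object') {
      // Nested spec: recurse so values like { profile: { age: "25" } } convert too.
      if (typeof value === 'object' && !Array.isArray(value) && !(value instanceof Date)) {
        autoConvert(expected, value);
      }
    } else if (expected === 'number' && typeof value === 'string'
        && value.trim() !== '' && !Number.isNaN(Number(value))) {
      doc[key] = Number(value);
    } else if (expected === 'string' && typeof value === 'number') {
      doc[key] = String(value);
    }
  }
  return doc;
}

const doc = { age: '25', profile: { age: '30' }, name: 7 };
autoConvert({ age: 'number', profile: { age: 'number' }, name: 'string' }, doc);
// doc is now { age: 25, profile: { age: 30 }, name: '7' }
```

The same walk could instead be driven by the IR's resolved types, as the prompt suggests; this sketch only shows the recursion shape.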
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/mongo-schema/collection_integration.js`:
- Around line 142-165: The code attempts to run db.command({ collMod: ... })
using collectionName = collection._name, which will be null for anonymous/local
collections and cause collMod to fail; update the Meteor.startup block to check
collection._name (or collectionName) before any DB operations and skip the
schema-enforcement path (including the _meteor_schema_versions check and the
db.command collMod call) for anonymous collections, returning early or logging
an informational message so anonymous collections are not processed by the
schema enforcement logic in this file.
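The early-return guard described above can be sketched with stand-in objects; `applyEnforcement` and `runCollMod` are hypothetical names, and the real code would run inside `Meteor.startup` with a real db handle:

```javascript
// Sketch: skip schema enforcement for anonymous (unnamed/local) collections,
// since collMod needs a named server-side collection.
function applyEnforcement(collection, runCollMod) {
  const collectionName = collection && collection._name;
  if (!collectionName) {
    return false; // anonymous/local collection: nothing to collMod
  }
  runCollMod(collectionName);
  return true;
}

const modded = [];
applyEnforcement({ _name: null }, name => modded.push(name));    // skipped
applyEnforcement({ _name: 'posts' }, name => modded.push(name)); // runs
```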
In `@packages/mongo-schema/schema_clean.js`:
- Around line 126-132: The field() helper currently only checks operators
['$set', '$setOnInsert', '$unset', '$inc'] and therefore misses array-modifying
operators; update field() to include '$push' and '$addToSet' so its operator
list matches the main lookup that checks six operators (e.g., ensure field()
iterates over ['$set', '$setOnInsert', '$unset', '$inc', '$push', '$addToSet']
and returns isSet/value/operator when one of those exists).
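A standalone sketch of a `field()` helper covering all six operators; the return shape follows the comment, but this is an illustration, not the package's actual implementation:

```javascript
// The six update operators the main lookup checks, per the review comment.
const MODIFIER_OPS = ['$set', '$setOnInsert', '$unset', '$inc', '$push', '$addToSet'];

// Look a field up across every operator payload in a modifier.
function field(modifier, name) {
  for (const op of MODIFIER_OPS) {
    const payload = modifier[op];
    if (payload && Object.prototype.hasOwnProperty.call(payload, name)) {
      return { isSet: true, value: payload[name], operator: op };
    }
  }
  return { isSet: false, value: undefined, operator: null };
}

const modifier = { $set: { title: 'Hi' }, $push: { tags: 'new' } };
field(modifier, 'tags');    // { isSet: true, value: 'new', operator: '$push' }
field(modifier, 'missing'); // { isSet: false, value: undefined, operator: null }
```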
In `@packages/mongo-schema/schema_jsonschema.js`:
- Around line 77-83: The conversion currently drops RegExp flags when setting
result.pattern (see desc.regEx and result.pattern); update the code that handles
desc.regEx so that when re is a RegExp and re.flags is non-empty you emit a
clear development-time warning (e.g., logger.warn/console.warn) stating that
MongoDB $jsonSchema does not support regex flags and that the flags (list the
actual re.flags) will be ignored for the pattern being set (include the
pattern/re.source in the message for context); keep the existing behavior of
using Array.isArray(desc.regEx) ? desc.regEx[0] : desc.regEx and still set
result.pattern = re.source, but add the warning when flags exist or
alternatively document this limitation if a runtime logger is unavailable.
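A minimal sketch of that flag-warning conversion; `toPattern` is a hypothetical helper name, and the injectable `warn` parameter is only there to make the sketch testable:

```javascript
// Convert a schema regEx entry to a $jsonSchema `pattern` string, warning
// when RegExp flags would be silently dropped (pattern has no flag support).
function toPattern(regEx, warn = console.warn) {
  const re = Array.isArray(regEx) ? regEx[0] : regEx;
  if (re.flags) {
    warn(
      `mongo-schema: MongoDB $jsonSchema does not support regex flags; ` +
      `ignoring "${re.flags}" on /${re.source}/`
    );
  }
  return re.source; // existing behavior: keep only the source
}

const warnings = [];
toPattern(/^foo$/i, msg => warnings.push(msg)); // returns '^foo$', warns about "i"
toPattern([/^bar$/], msg => warnings.push(msg)); // returns '^bar$', no warning
```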
In `@packages/mongo-schema/schema_validate.js`:
- Around line 266-278: The code in the $push/$addToSet loop currently sets
pushValue to null when value has $each, which causes validation to be skipped;
instead detect when value.$each exists and iterate its array elements, calling
validateField(ir, itemKey, itemDesc, modifier[op], eachItem, errors, options)
for each element; keep the existing branch for the non-$each case (using
pushValue as before) and ensure you use the same itemKey and itemDesc lookups so
array item schema validation runs for every element in value.$each as well as
for single push values.
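The `$each` handling above can be sketched as a small standalone helper; `collectPushValues` and `validateItem` are illustrative stand-ins for the real `validateField` plumbing:

```javascript
// Normalize a $push/$addToSet payload into the list of items to validate:
// either every element of a $each array, or the single pushed value.
function collectPushValues(value) {
  if (value && typeof value === 'object' && !Array.isArray(value) && '$each' in value) {
    return Array.isArray(value.$each) ? value.$each : [];
  }
  return [value];
}

function validatePush(value, validateItem) {
  for (const item of collectPushValues(value)) {
    validateItem(item); // stand-in for validateField(ir, itemKey, itemDesc, ...)
  }
}

const seen = [];
validatePush({ $each: [1, 2, 3] }, v => seen.push(v)); // validates 1, 2, 3
validatePush(4, v => seen.push(v));                    // validates 4
```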
- Around line 143-148: The denyInsert/denyUpdate checks in validateConstraints
are never triggered because schema.validate calls from collection_integration.js
omit isInsert/isUpdate; update the schema.validate invocations inside
collection_integration.js (the calls that currently pass { modifier: true } or {
modifier: true, upsert: true }) to include the correct operation flags (e.g.,
pass isInsert: true for insert flows, isUpdate: true for update flows, and
include upsert when appropriate) so validateConstraints can enforce
desc.denyInsert and desc.denyUpdate; ensure these flags align with how
buildCleanOptions/extendAutoValueContext set them during cleaning.
In `@packages/mongo-schema/tests/migration_compat_tests.js`:
- Around line 62-73: The test Tinytest.add 'migration - validate throws
ValidationError' currently calls MongoSchema.prototype.validate via
schema.validate({}) and relies on the catch block to assert the ValidationError,
but if validate() does not throw the test silently passes; update the test to
explicitly fail when no exception is thrown by adding a test.fail(...)
immediately after calling schema.validate({}) (or wrap in a try {
schema.validate({}); test.fail("expected ValidationError"); } catch (e) { ... })
so that the test fails when validate() does not throw; reference the test
function name Tinytest.add and the schema.validate call to locate and modify the
test.
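The fail-if-no-throw pattern recommended above generalizes to a small helper; this sketch uses plain exceptions where the real test would use Tinytest's `test.fail`/`test.instanceOf`:

```javascript
// Assert that fn throws; optionally inspect the thrown error. If fn returns
// normally, fail explicitly instead of passing silently.
function expectThrows(fn, check) {
  let threw = false;
  try {
    fn();
  } catch (e) {
    threw = true;
    if (check) check(e);
  }
  if (!threw) {
    throw new Error('expected function to throw, but it returned normally');
  }
}

expectThrows(
  () => { throw new TypeError('boom'); },
  e => { if (!(e instanceof TypeError)) throw new Error('wrong error type'); }
);
```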
In `@packages/mongo-schema/types.js`:
- Around line 41-44: The Object type predicate in TYPE_MAP.set(Object) and the
related schema type predicate must be tightened to only accept plain objects;
locate the OBJECT entry created via TYPE_MAP.set(Object, { name: 'object',
check: ... }) and the schema predicate around the second occurrence (lines
referenced in the review) and replace the current check (v !== null && typeof v
=== 'object' && !Array.isArray(v) && !(v instanceof Date)) with a call to an
isPlainObject utility (or implement isPlainObject that returns true for object
literals/Object.create() and false for RegExp, Error, Map, Set, Date, etc.).
Ensure both places use the same isPlainObject helper so nested schema validation
will only permit plain objects.
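A standalone sketch of such an `isPlainObject` helper; the prototype-based check is one common approach, not necessarily the exact predicate the package should adopt:

```javascript
// True for object literals and Object.create(null); false for arrays, Dates,
// RegExps, Errors, Maps, Sets, and class instances.
function isPlainObject(v) {
  if (v === null || typeof v !== 'object') return false;
  const proto = Object.getPrototypeOf(v);
  return proto === Object.prototype || proto === null;
}

isPlainObject({ a: 1 });            // true
isPlainObject(Object.create(null)); // true
isPlainObject(new Date());          // false
isPlainObject(/x/);                 // false
isPlainObject(new Map());           // false
isPlainObject([1, 2]);              // false
```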
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: ec0ab768-a3bd-44d1-8e68-6f241ab0211c
📒 Files selected for processing (21)
packages/mongo-schema/README.md
packages/mongo-schema/collection_integration.js
packages/mongo-schema/index.js
packages/mongo-schema/package.js
packages/mongo-schema/schema.js
packages/mongo-schema/schema_clean.js
packages/mongo-schema/schema_context.js
packages/mongo-schema/schema_definition.js
packages/mongo-schema/schema_errors.js
packages/mongo-schema/schema_jsonschema.js
packages/mongo-schema/schema_validate.js
packages/mongo-schema/tests/collection_integration_tests.js
packages/mongo-schema/tests/migration_compat_tests.js
packages/mongo-schema/tests/schema_clean_tests.js
packages/mongo-schema/tests/schema_context_tests.js
packages/mongo-schema/tests/schema_definition_tests.js
packages/mongo-schema/tests/schema_errors_tests.js
packages/mongo-schema/tests/schema_jsonschema_tests.js
packages/mongo-schema/tests/schema_validate_tests.js
packages/mongo-schema/tests/types_tests.js
packages/mongo-schema/types.js
Actionable comments posted: 2
🧹 Nitpick comments (3)
packages/mongo-schema/schema_definition.js (1)
100-109: Edge case: empty array `[]` throws an unclear error.
The array shorthand check at line 101 requires exactly one element (`def.length === 1`). An empty array `[]` falls through to line 108 and throws with JSON output `"[]"`, which may be confusing. Consider adding a specific error message for empty arrays.
Proposed improvement for clearer error message

```diff
 // Shorthand: Array of type (e.g., { tags: [String] })
 if (Array.isArray(def) && def.length === 1) {
   return { type: Array, _arrayItemType: def[0] };
 }
+// Invalid: empty array
+if (Array.isArray(def) && def.length === 0) {
+  throw new Error('MongoSchema: Array shorthand requires exactly one element, e.g., [String]');
+}
 // Full form object
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/schema_definition.js` around lines 100 - 109, The code falls through and throws an unclear error when an empty array shorthand (def === []) is provided; add an explicit branch before the existing Array.isArray(def) && def.length === 1 check to detect Array.isArray(def) && def.length === 0 and throw a specific Error (e.g., "MongoSchema: Empty array shorthand [] is not allowed; expected one type like [String]") so callers see a clear message; keep the existing behavior for def.length === 1 (return { type: Array, _arrayItemType: def[0] }) and for full-form objects (the typeof def === 'object' && def.type !== undefined branch).
packages/mongo-schema/schema_validate.js (1)
234-239: Minor inconsistency: `denyUpdate` triggers regardless of value presence.
At line 234, `denyInsert` checks `value !== undefined`, but at line 237, `denyUpdate` does not. In modifier validation this is fine, since only touched fields are validated. However, in document validation mode with `isUpdate: true`, this could trigger for fields that aren't actually present.
Consider adding the same guard for consistency:
Proposed fix

```diff
 if (desc.denyInsert && options.isInsert && value !== undefined) {
   errors.push({ name: key, type: ErrorTypes.DENY_INSERT, value, message: `${label} cannot be set during insert` });
 }
-if (desc.denyUpdate && options.isUpdate) {
+if (desc.denyUpdate && options.isUpdate && value !== undefined) {
   errors.push({ name: key, type: ErrorTypes.DENY_UPDATE, value, message: `${label} cannot be set during update` });
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/schema_validate.js` around lines 234 - 239, The denyUpdate branch is missing the same presence guard used by denyInsert, causing DENY_UPDATE errors during document validation when the field isn't present; update the condition around desc.denyUpdate in schema_validate.js (the block that calls errors.push with type ErrorTypes.DENY_UPDATE) to only run when options.isUpdate is true AND value !== undefined, mirroring the denyInsert check so only present/touched fields trigger the error.
packages/mongo-schema/types.js (1)
164-172: Consider documenting that `oneOf` descriptors lack a top-level `bsonType`.
The `oneOf` type descriptor omits `bsonType`, which is intentional since JSON Schema compilation handles it via the `resolvedTypes` array. However, documenting this explicitly in the return object or JSDoc would help maintainability.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/types.js` around lines 164 - 172, The oneOf branch (when typeDef._isOneOf) returns an object without a top-level bsonType; update the code or JSDoc to explicitly document that behavior so future maintainers know JSON Schema compilation uses resolvedTypes instead. Add a short inline comment or JSDoc on the returned object from the oneOf block (or add an explicit field like bsonType: undefined/null with a clarifying comment) and mention that resolvedTypes is used for JSON Schema compilation; reference the oneOf branch, the typeDef._isOneOf check, and the resolvedTypes variable when adding the documentation.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/mongo-schema/schema_jsonschema.js`:
- Around line 130-142: The condition that compiles nested children only checks
for typeName === 'object', so nested MongoSchema instances with typeName ===
'schema' are skipped; update the branch in schema_jsonschema.js to treat both
object and schema the same (i.e., use typeName === 'object' || typeName ===
'schema') so that when desc.children exist and !desc.blackbox the loop that
calls compileField(ir, childKey, childDesc) still runs and populates
result.properties and result.required as before.
In `@packages/mongo-schema/schema_validate.js`:
- Around line 200-207: The recursive validation only runs for
desc.resolvedType.name === 'object' so nested MongoSchema subschemas (which have
resolvedType.name === 'schema') are skipped; update the condition in
validateField to also treat 'schema' the same as 'object' (e.g., change the if
that checks desc.resolvedType.name to include 'schema') while keeping the checks
for desc.children and !desc.blackbox, then continue to call ir.get(childKey) and
recurse via validateField as currently implemented.
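The widened recursion condition asked for in both comments can be sketched as a predicate; the descriptor shape (`resolvedType.name`, `children`, `blackbox`) is assumed from the review text, and `shouldRecurse` is a hypothetical name:

```javascript
// Decide whether to recurse into a field's children: plain objects and
// nested MongoSchema instances ('schema') are treated the same.
function shouldRecurse(desc) {
  const typeName = desc.resolvedType && desc.resolvedType.name;
  return (typeName === 'object' || typeName === 'schema')
    && !!desc.children
    && !desc.blackbox;
}

shouldRecurse({ resolvedType: { name: 'schema' }, children: {} }); // true
shouldRecurse({ resolvedType: { name: 'object' }, children: {} }); // true
shouldRecurse({ resolvedType: { name: 'string' } });               // false
shouldRecurse({ resolvedType: { name: 'object' }, children: {}, blackbox: true }); // false
```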
---
Nitpick comments:
In `@packages/mongo-schema/schema_definition.js`:
- Around line 100-109: The code falls through and throws an unclear error when
an empty array shorthand (def === []) is provided; add an explicit branch before
the existing Array.isArray(def) && def.length === 1 check to detect
Array.isArray(def) && def.length === 0 and throw a specific Error (e.g.,
"MongoSchema: Empty array shorthand [] is not allowed; expected one type like
[String]") so callers see a clear message; keep the existing behavior for
def.length === 1 (return { type: Array, _arrayItemType: def[0] }) and for
full-form objects (the typeof def === 'object' && def.type !== undefined
branch).
In `@packages/mongo-schema/schema_validate.js`:
- Around line 234-239: The denyUpdate branch is missing the same presence guard
used by denyInsert, causing DENY_UPDATE errors during document validation when
the field isn't present; update the condition around desc.denyUpdate in
schema_validate.js (the block that calls errors.push with type
ErrorTypes.DENY_UPDATE) to only run when options.isUpdate is true AND value !==
undefined, mirroring the denyInsert check so only present/touched fields trigger
the error.
In `@packages/mongo-schema/types.js`:
- Around line 164-172: The oneOf branch (when typeDef._isOneOf) returns an
object without a top-level bsonType; update the code or JSDoc to explicitly
document that behavior so future maintainers know JSON Schema compilation uses
resolvedTypes instead. Add a short inline comment or JSDoc on the returned
object from the oneOf block (or add an explicit field like bsonType:
undefined/null with a clarifying comment) and mention that resolvedTypes is used
for JSON Schema compilation; reference the oneOf branch, the typeDef._isOneOf
check, and the resolvedTypes variable when adding the documentation.
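The presence guard described in the `denyUpdate` nitpick above can be sketched like this (function and error-type names are illustrative, not the package's actual internals):

```javascript
// Illustrative sketch of the guarded constraint checks: an error is only
// pushed when the matching operation flag is set AND the field is present.
function checkDenyConstraints(desc, value, options, errors, key) {
  if (desc.denyInsert && options.isInsert && value !== undefined) {
    errors.push({ name: key, type: 'DENY_INSERT', value });
  }
  if (desc.denyUpdate && options.isUpdate && value !== undefined) {
    errors.push({ name: key, type: 'DENY_UPDATE', value });
  }
}
```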
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 7d0f43b6-781a-44a7-91fd-cbb70f336655
📒 Files selected for processing (17)
- packages/mongo-schema/collection_integration.js
- packages/mongo-schema/index.js
- packages/mongo-schema/schema.js
- packages/mongo-schema/schema_clean.js
- packages/mongo-schema/schema_context.js
- packages/mongo-schema/schema_definition.js
- packages/mongo-schema/schema_errors.js
- packages/mongo-schema/schema_jsonschema.js
- packages/mongo-schema/schema_validate.js
- packages/mongo-schema/tests/collection_integration_tests.js
- packages/mongo-schema/tests/migration_compat_tests.js
- packages/mongo-schema/tests/schema_clean_tests.js
- packages/mongo-schema/tests/schema_context_tests.js
- packages/mongo-schema/tests/schema_definition_tests.js
- packages/mongo-schema/tests/schema_errors_tests.js
- packages/mongo-schema/tests/schema_validate_tests.js
- packages/mongo-schema/types.js
🚧 Files skipped from review as they are similar to previous changes (2)
- packages/mongo-schema/index.js
- packages/mongo-schema/schema_context.js
Actionable comments posted: 1
♻️ Duplicate comments (1)
packages/mongo-schema/schema_validate.js (1)
234-239: ⚠️ Potential issue | 🟠 Major

`denyInsert`/`denyUpdate` constraints require `isInsert`/`isUpdate` flags from callers. The constraint checks here are correctly implemented, but as previously noted, `collection_integration.js` must pass the `isInsert`/`isUpdate` flags to `schema.validate()` for these constraints to be enforced. Without those flags, `denyInsert` and `denyUpdate` restrictions will silently pass.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/schema_validate.js` around lines 234 - 239, The denyInsert/denyUpdate checks in schema_validate.js rely on callers to set options.isInsert/options.isUpdate, so update collection_integration.js to pass these flags into schema.validate during insert and update flows: when performing an insert call schema.validate(document, { ..., isInsert: true, ... }) and for updates call schema.validate(updateDoc, { ..., isUpdate: true, ... }); ensure any existing options object passed into schema.validate is preserved/merged rather than overwritten so other validation options remain intact.
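The merge-don't-overwrite pattern this prompt asks for can be sketched as follows (the wrapper names are illustrative, not the package's actual internals):

```javascript
// Illustrative: wrap schema.validate() so isInsert/isUpdate are merged into
// any caller-supplied options instead of replacing them.
function validateForInsert(schema, doc, options = {}) {
  return schema.validate(doc, { ...options, isInsert: true });
}

function validateForUpdate(schema, modifier, options = {}) {
  return schema.validate(modifier, { ...options, isUpdate: true });
}
```

Because the spread happens before the flag, caller-supplied options survive while `isInsert`/`isUpdate` always win.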
🧹 Nitpick comments (1)
packages/mongo-schema/schema_validate.js (1)
449-453: Minor: `siblingField()` in modifier validators has same issue. The `siblingField` implementation here (line 452) delegates to `this.field(name)` similar to `runCustomValidator`. However, since modifier fields are typically flat dot-paths in `mergedFields`, the impact is less severe than in plain document validation. Consider applying the same fix for consistency.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/mongo-schema/schema_validate.js` around lines 449 - 453, siblingField currently delegates to this.field(name) which may not correctly resolve sibling values for modifier validation; update siblingField in the modifier validator to mirror the same resolution logic used for field (and as fixed in runCustomValidator) so it looks up mergedFields using the flat dot-path (mergedFields[name]) and returns { isSet: mergedFields[name] !== undefined, value: mergedFields[name], operator: null } (keeping parentField and field behavior unchanged) to ensure consistent sibling lookup for modifier validations.
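The flat-dot-path lookup the nitpick suggests can be sketched as a standalone helper (illustrative, not the package's code):

```javascript
// Illustrative: build a siblingField() that resolves names directly against
// the flat mergedFields map used during modifier validation.
function makeSiblingField(mergedFields) {
  return function siblingField(name) {
    return {
      isSet: mergedFields[name] !== undefined,
      value: mergedFields[name],
      operator: null,
    };
  };
}
```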
📒 Files selected for processing (3)
- packages/mongo-schema/schema_clean.js
- packages/mongo-schema/schema_jsonschema.js
- packages/mongo-schema/schema_validate.js
```js
function runCustomValidator(ir, key, desc, rootDoc, value, errors, options) {
  const context = {
    key,
    genericKey: key.replace(/\.\d+\./g, '.$.').replace(/\.\d+$/, '.$'),
    value,
    isSet: value !== undefined,
    operator: null,
    definition: desc,
    field(name) {
      const val = getNestedValue(rootDoc, name);
      return { isSet: val !== undefined, value: val, operator: null };
    },
    siblingField(name) { return this.field(name); },
    parentField() { return { isSet: true, value: rootDoc, operator: null }; },
    ...(options.extendedCustomContext || {}),
  };

  const result = desc.custom.call(context);
  if (typeof result === 'string') {
    errors.push({ name: key, type: result, value, message: `${desc.label || key} failed validation: ${result}` });
  }
}
```
siblingField() and parentField() are incorrect for nested fields in custom validators.
The siblingField implementation (line 345) simply delegates to this.field(name), which performs an absolute path lookup. For a nested field like address.city, calling siblingField('country') should return address.country, but this implementation returns the value at root-level country.
Similarly, parentField() (line 346) returns rootDoc instead of the actual parent object.
Compare to the correct implementation in schema_clean.js lines 584-606 which computes parentPath and constructs proper sibling paths.
🐛 Proposed fix
```diff
 function runCustomValidator(ir, key, desc, rootDoc, value, errors, options) {
+  const parentPath = key.includes('.') ? key.slice(0, key.lastIndexOf('.')) : '';
+
   const context = {
     key,
     genericKey: key.replace(/\.\d+\./g, '.$.').replace(/\.\d+$/, '.$'),
     value,
     isSet: value !== undefined,
     operator: null,
     definition: desc,
     field(name) {
       const val = getNestedValue(rootDoc, name);
       return { isSet: val !== undefined, value: val, operator: null };
     },
-    siblingField(name) { return this.field(name); },
-    parentField() { return { isSet: true, value: rootDoc, operator: null }; },
+    siblingField(name) {
+      const siblingPath = parentPath ? `${parentPath}.${name}` : name;
+      const val = getNestedValue(rootDoc, siblingPath);
+      return { isSet: val !== undefined, value: val, operator: null };
+    },
+    parentField() {
+      if (!parentPath) return { isSet: true, value: rootDoc, operator: null };
+      const val = getNestedValue(rootDoc, parentPath);
+      return { isSet: val !== undefined, value: val, operator: null };
+    },
     ...(options.extendedCustomContext || {}),
   };
```
🧰 Tools
🪛 ESLint
[error] prettier/prettier formatting violations reported at lines 345, 346, and 352 (the single-line `siblingField`/`parentField` bodies and the `errors.push()` call should be wrapped across multiple lines).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/mongo-schema/schema_validate.js` around lines 333 - 354, The custom
validator context in runCustomValidator constructs siblingField and parentField
incorrectly by using absolute lookups; change siblingField and parentField to
compute the current field's parent path (derived from the provided
key/genericKey) and perform nested lookups relative to that parent: implement
siblingField(name) to build a siblingPath = parentPath ? `${parentPath}.${name}`
: name, call getNestedValue(rootDoc, siblingPath) and return { isSet: val !==
undefined, value: val, operator: null }, and implement parentField() to return
the parent object via getNestedValue(rootDoc, parentPath) (or { isSet: true,
value: rootDoc } if no parent), keeping other context properties and merging
options.extendedCustomContext as before; use runCustomValidator, siblingField,
parentField, getNestedValue and options.extendedCustomContext to locate and
update the code similar to the logic used in schema_clean.js.
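The proposed fix leans on a dot-path lookup helper; a minimal sketch of what `getNestedValue` is assumed to do (the real helper lives in the package):

```javascript
// Minimal dot-path lookup: walks the object one segment at a time and
// returns undefined as soon as any segment is missing.
function getNestedValue(obj, path) {
  return path.split('.').reduce(
    (acc, part) => (acc == null ? undefined : acc[part]),
    obj
  );
}
```

With a lookup like this, a field key such as `address.city` gives `parentPath = 'address'`, so `siblingField('country')` resolves `address.country` as the fix intends.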
Why
Meteor has always embraced MongoDB as its primary database layer, but we've fallen short of integrating features that MongoDB has evolved over the years. Schema validation is one of the biggest gaps.
The vast majority of the Meteor ecosystem relies on `simpl-schema` + `aldeed:collection2` for schema validation, which works well for single-app setups but creates real problems at scale — particularly for teams running microservices or shared databases, where application-layer-only validation means each service has to independently enforce the same rules (or doesn't). MongoDB introduced native `$jsonSchema` validation years ago, and none of the existing community packages support it.

Having a standardized, core-supported schema validation package is long overdue. This is especially important now as teams migrate from 2.x to 3.x — knowing that schema validation is part of the platform (and will stay maintained) removes a major source of uncertainty in those migrations.
What
New core package: `mongo-schema`

Designed as a drop-in replacement for `simpl-schema` + `collection2` — same definition syntax, same `attachSchema()` API, same `autoValue` context (`this.isInsert`, `this.userId`, `this.field()`, etc.) — so the migration path is changing an import line, not rewriting schemas.

The key addition over existing packages is that schemas compile to MongoDB's `$jsonSchema` format, enabling opt-in database-level enforcement via `collMod`. This means validation can happen at both the application layer (with full `autoValue`/`defaultValue`/custom validator support) and the database layer (as a safety net that works regardless of which service writes to the collection).

Package structure
- `types.js`
- `schema_errors.js`: `ValidationError` (extends `Meteor.Error`)
- `schema_definition.js`
- `schema_clean.js`
- `schema_validate.js`
- `schema_context.js`: `ValidationContext` with reactive `Tracker` support for forms
- `schema_jsonschema.js`: `$jsonSchema` (Draft 4)
- `collection_integration.js`: `attachSchema()`, mutation hooks, database enforcement
- `schema.js`: `MongoSchema` class — public API surface

Features
- `simpl-schema` definition syntax (shorthand types, dot-notation nesting, `[String]` arrays)
- Validation of update modifiers (`$set`, `$unset`, `$inc`, `$push`, `$addToSet`)
- `autoValue` with full Collection2-compatible context (`isInsert`, `isUpdate`, `isUpsert`, `userId`, `field()`, `siblingField()`, `unset()`)
- `ValidationContext` with reactive methods for client-side forms
- `extend()`, `pick()`, `omit()`, `getObjectSchema()`
- `$jsonSchema` compilation with opt-in database enforcement via `enforceOnDatabase: true`
- Hash tracking (`_meteor_schema_versions` collection) to avoid redundant `collMod` calls
- `bypassSchema` option (server-only, ignored on client for security)
- `denyInsert`/`denyUpdate` field restrictions

Example
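As a sketch of the syntax listed above, a definition object could look like this (field names are illustrative; the shorthand forms mirror simpl-schema as the description claims):

```javascript
// Illustrative definition object using the simpl-schema-style shorthands:
// bare constructors, [Type] arrays, and dot-notation nesting.
const postDefinition = {
  title: { type: String, max: 120 },
  tags: [String],                       // array shorthand
  'author.name': String,                // dot-notation nesting
  createdAt: {
    type: Date,
    autoValue() {
      if (this.isInsert) return new Date(); // Collection2-style context
    },
  },
};
```

Under the drop-in-replacement claim above, an object like this would be passed to `new MongoSchema(...)` and attached via `attachSchema(...)`, though exact call sites depend on your app.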
Database enforcement and `$jsonSchema` syncing

The `enforceOnDatabase` option is what separates this from every existing schema package in the Meteor ecosystem. When enabled, your schema definition is compiled into a MongoDB `$jsonSchema` validator and applied to the collection using the `collMod` command. This means MongoDB itself enforces the schema — not just your Meteor app.

Why this matters: In a single-app setup, application-layer validation is usually enough. But the moment you have multiple services writing to the same database (microservices, background workers, admin tools, migration scripts), app-layer validation breaks down. Each service needs to independently implement and maintain the same rules. Database-level enforcement guarantees that nothing writes invalid data, regardless of where the write comes from.
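Concretely, database enforcement boils down to a `collMod` command of roughly this shape (collection name and schema contents are illustrative; `collMod`, `validator`, `validationLevel`, and `validationAction` are MongoDB's documented fields):

```javascript
// Illustrative collMod command: MongoDB attaches the $jsonSchema validator
// to the collection; behavior is tuned by validationLevel/validationAction.
const collModCommand = {
  collMod: 'posts',
  validator: {
    $jsonSchema: {
      bsonType: 'object',
      required: ['title'],
      properties: {
        title: { bsonType: 'string', maxLength: 120 },
      },
    },
  },
  validationLevel: 'moderate', // 'strict' | 'moderate' | 'off'
  validationAction: 'error',   // 'error' | 'warn'
};
```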
How the syncing works:
1. When `attachSchema(schema, { enforceOnDatabase: true })` is called, the schema's internal representation is compiled to a `$jsonSchema` object via `schema.toJsonSchema()`
2. A hash of the compiled schema is checked against the one stored in the `_meteor_schema_versions` collection
3. If the schema changed, the package runs the `collMod` command to apply the new `$jsonSchema` validator with the configured `validationLevel` and `validationAction`
4. If the hash is unchanged, the package skips `collMod`

This all happens inside
`Meteor.startup()` so it doesn't block app initialization, and the hash tracking means you're not running `collMod` on every restart — only when the schema actually changes.

`validationLevel` controls which documents are checked:
- `'strict'` — all inserts and updates are validated
- `'moderate'` (default) — existing documents that already violate the schema aren't flagged, but new writes must conform
- `'off'` — disables validation (useful for temporarily relaxing constraints during migrations)

`validationAction` controls what happens on violation:
- `'error'` (default) — MongoDB rejects the write entirely
- `'warn'` — MongoDB allows the write but logs a warning (useful for rolling out schemas gradually)

What compiles and what doesn't: The
`$jsonSchema` compiler handles types, required fields, nested objects, arrays, `allowedValues` (enum), `min`/`max` (for numbers, string lengths), `regEx` (pattern), and `minCount`/`maxCount` (array items). Features that are purely application-layer — `autoValue`, `defaultValue`, `custom` validators, `denyInsert`/`denyUpdate` — are not included in the `$jsonSchema` since they have no database-level equivalent. The app-layer clean/validate pipeline handles those before the document reaches MongoDB.

Migration from SimpleSchema
Schema definitions are identical. The only breaking changes are:
- `e.details` array instead of `e.invalidKeys()`
- `SimpleSchema.RegEx.*` patterns, `SimpleSchema.debug`, `selector` option, `transform` option, `aldeed:schema-index`
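The error-shape change can be sketched like this (the error object is hypothetical, for illustration only):

```javascript
// Illustrative: simpl-schema exposed failures via err.invalidKeys();
// per the description above, mongo-schema exposes an err.details array.
function firstInvalidKey(err) {
  return err.details && err.details.length > 0 ? err.details[0].name : null;
}

const err = {
  error: 'validation-error',
  details: [{ name: 'title', type: 'required', message: 'Title is required' }],
};
```

Code that previously iterated `e.invalidKeys()` needs the equivalent rewrite against `e.details`.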