
Add mongo-schema core package for native MongoDB schema validation#14279

Open
mvogttech wants to merge 6 commits into meteor:devel from mvogttech:feature/mongo-schema

Conversation

@mvogttech
Contributor

@mvogttech mvogttech commented Mar 28, 2026

Why

Meteor has always embraced MongoDB as its primary database layer, but we've fallen short of integrating features that MongoDB has evolved over the years. Schema validation is one of the biggest gaps.

The vast majority of the Meteor ecosystem relies on simpl-schema + aldeed:collection2 for schema validation, which works well for single-app setups but creates real problems at scale — particularly for teams running microservices or shared databases where application-layer-only validation means each service has to independently enforce the same rules (or doesn't). MongoDB introduced native $jsonSchema validation years ago, and none of the existing community packages support it.

Having a standardized, core-supported schema validation package is long overdue. This is especially important now as teams migrate from 2.x to 3.x — knowing that schema validation is part of the platform (and will stay maintained) removes a major source of uncertainty in those migrations.

What

New core package: mongo-schema

Designed as a drop-in replacement for simpl-schema + collection2 — same definition syntax, same attachSchema() API, same autoValue context (this.isInsert, this.userId, this.field(), etc.) — so the migration path is changing an import line, not rewriting schemas.

The key addition over existing packages is that schemas compile to MongoDB's $jsonSchema format, enabling opt-in database-level enforcement via collMod. This means validation can happen at both the application layer (with full autoValue/defaultValue/custom validator support) and the database layer (as a safety net that works regardless of which service writes to the collection).

Package structure

  • types.js: Type registry (String, Number, Boolean, Date, Object, Array, Integer, Any, oneOf)
  • schema_errors.js: Error type constants and ValidationError (extends Meteor.Error)
  • schema_definition.js: Definition parser that normalizes shorthand/full-form definitions into a flat IR
  • schema_clean.js: Clean pipeline: filter, autoConvert, trim, removeEmptyStrings, defaults, autoValues
  • schema_validate.js: Validation engine: type checks, constraints, custom validators, modifier support
  • schema_context.js: ValidationContext with reactive Tracker support for forms
  • schema_jsonschema.js: Compiles the schema IR to MongoDB $jsonSchema (Draft 4)
  • collection_integration.js: attachSchema(), mutation hooks, database enforcement
  • schema.js: MongoSchema class, the public API surface

Features

  • Full simpl-schema definition syntax (shorthand types, dot-notation nesting, [String] arrays)
  • Clean pipeline: filter unknown keys, auto-convert types, trim strings, remove empty strings, apply defaults and autoValues
  • Modifier-aware clean and validation ($set, $unset, $inc, $push, $addToSet)
  • autoValue with full Collection2-compatible context (isInsert, isUpdate, isUpsert, userId, field(), siblingField(), unset())
  • ValidationContext with reactive methods for client-side forms
  • Schema composition: extend(), pick(), omit(), getObjectSchema()
  • $jsonSchema compilation with opt-in database enforcement via enforceOnDatabase: true
  • Schema hash tracking (_meteor_schema_versions collection) to avoid redundant collMod calls
  • bypassSchema option (server-only, ignored on client for security)
  • denyInsert / denyUpdate field restrictions
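The clean pipeline above can be illustrated in plain JavaScript. This is a simplified sketch of the idea, not the package's actual schema_clean.js; the `clean` function and its behavior here are assumptions for illustration only:

```javascript
// Minimal sketch of the clean pipeline: trim strings, drop empty strings,
// then apply defaults. The real package layers autoConvert, autoValue,
// and modifier awareness on top of steps like these.
function clean(doc, schema) {
  const out = {};
  for (const [key, value] of Object.entries(doc)) {
    let v = value;
    if (typeof v === 'string') v = v.trim(); // trim strings
    if (v === '') continue;                  // removeEmptyStrings
    out[key] = v;
  }
  for (const [key, def] of Object.entries(schema)) {
    if (out[key] === undefined && 'defaultValue' in def) {
      out[key] = def.defaultValue;           // apply defaults
    }
  }
  return out;
}

const cleaned = clean(
  { name: '  Ada  ', bio: '', role: undefined },
  { name: { type: String }, role: { type: String, defaultValue: 'user' } }
);
// cleaned → { name: 'Ada', role: 'user' } (bio removed as an empty string)
```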

Example

import { MongoSchema } from 'meteor/mongo-schema';

const UserSchema = new MongoSchema({
  name: { type: String, min: 1, max: 100 },
  email: { type: String, regEx: /^[^@]+@[^@]+$/ },
  role: { type: String, allowedValues: ['admin', 'user', 'moderator'], defaultValue: 'user' },
  tags: { type: [String], optional: true },
  createdAt: {
    type: Date,
    autoValue() {
      if (this.isInsert) return new Date();
    },
  },
  updatedAt: {
    type: Date,
    autoValue() {
      return new Date();
    },
  },
});

// Application-layer validation (works like simpl-schema + collection2)
Users.attachSchema(UserSchema);

// Database-level enforcement — compiles to MongoDB $jsonSchema via collMod
Users.attachSchema(UserSchema, {
  enforceOnDatabase: true,
  validationLevel: 'moderate',  // 'off' | 'moderate' | 'strict'
  validationAction: 'error',    // 'error' | 'warn'
});

// Both layers work together:
// - App layer: runs clean pipeline (defaults, autoValues, type coercion, trimming) then validates
// - DB layer: MongoDB itself rejects documents that don't match the $jsonSchema
// - Schema hash tracking avoids redundant collMod calls on restart

Database enforcement and $jsonSchema syncing

The enforceOnDatabase option is what separates this from every existing schema package in the Meteor ecosystem. When enabled, your schema definition is compiled into a MongoDB $jsonSchema validator and applied to the collection using the collMod command. This means MongoDB itself enforces the schema — not just your Meteor app.

Why this matters: In a single-app setup, application-layer validation is usually enough. But the moment you have multiple services writing to the same database (microservices, background workers, admin tools, migration scripts), app-layer validation breaks down. Each service needs to independently implement and maintain the same rules. Database-level enforcement guarantees that nothing writes invalid data, regardless of where the write comes from.

How the syncing works:

  1. When attachSchema(schema, { enforceOnDatabase: true }) is called, the schema's internal representation is compiled to a $jsonSchema object via schema.toJsonSchema()
  2. The JSON schema is hashed (SHA-256) and compared against the last-applied hash stored in a _meteor_schema_versions collection
  3. If the hash matches, nothing happens — the database already has the current schema
  4. If the hash is different (or no hash exists), the package runs a collMod command to apply the new $jsonSchema validator with the configured validationLevel and validationAction
  5. The new hash is stored so subsequent restarts skip the collMod

This all happens inside Meteor.startup() so it doesn't block app initialization, and the hash tracking means you're not running collMod on every restart — only when the schema actually changes.

validationLevel controls which documents are checked:

  • 'strict' — all inserts and updates are validated
  • 'moderate' (default) — existing documents that already violate the schema aren't flagged, but new writes must conform
  • 'off' — disables validation (useful for temporarily relaxing constraints during migrations)

validationAction controls what happens on violation:

  • 'error' (default) — MongoDB rejects the write entirely
  • 'warn' — MongoDB allows the write but logs a warning (useful for rolling out schemas gradually)
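Together, the two options map directly onto parameters of MongoDB's collMod command. For reference, the command the package would issue looks roughly like this (the collection name and $jsonSchema body are examples, not output from the package):

```javascript
// Approximate shape of the collMod command, per MongoDB's schema
// validation documentation. 'users' and the schema body are illustrative.
const collModCommand = {
  collMod: 'users',
  validator: {
    $jsonSchema: {
      bsonType: 'object',
      required: ['name', 'role'],
      properties: {
        name: { bsonType: 'string', minLength: 1, maxLength: 100 },
        role: { enum: ['admin', 'user', 'moderator'] },
      },
    },
  },
  validationLevel: 'moderate', // 'off' | 'moderate' | 'strict'
  validationAction: 'error',   // 'error' | 'warn'
};
```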

What compiles and what doesn't: The $jsonSchema compiler handles types, required fields, nested objects, arrays, allowedValues (enum), min/max (for numbers, string lengths), regEx (pattern), and minCount/maxCount (array items). Features that are purely application-layer — autoValue, defaultValue, custom validators, denyInsert/denyUpdate — are not included in the $jsonSchema since they have no database-level equivalent. The app-layer clean/validate pipeline handles those before the document reaches MongoDB.
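To illustrate that split, here is a toy version of a few of the mappings listed above. This is a deliberately simplified sketch, not the package's schema_jsonschema.js; `compileField` and its handling are assumptions for illustration:

```javascript
// Toy IR -> $jsonSchema field mapping. Application-layer-only keys
// (autoValue, defaultValue, custom validators) are deliberately skipped,
// since they have no database-level equivalent.
function compileField(def) {
  const out = {};
  if (def.type === String) out.bsonType = 'string';
  if (def.type === Number) out.bsonType = 'number';
  if (def.allowedValues) out.enum = def.allowedValues;
  if (def.regEx) out.pattern = def.regEx.source; // regex flags are dropped
  if (def.type === String && def.min != null) out.minLength = def.min;
  if (def.type === String && def.max != null) out.maxLength = def.max;
  return out;
}

const compiled = compileField({
  type: String,
  min: 1,
  max: 100,
  regEx: /^[a-z]+$/,
  defaultValue: 'user', // ignored: application layer only
  autoValue() {},       // ignored: application layer only
});
// compiled → { bsonType: 'string', pattern: '^[a-z]+$', minLength: 1, maxLength: 100 }
```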

Migration from SimpleSchema

- import SimpleSchema from 'simpl-schema';
+ import { MongoSchema } from 'meteor/mongo-schema';

- const Schema = new SimpleSchema({ ... });
+ const Schema = new MongoSchema({ ... });

Schema definitions are identical. The only breaking changes are:

  • Error format: e.details array instead of e.invalidKeys()
  • Removed: SimpleSchema.RegEx.* patterns, SimpleSchema.debug
  • Not yet supported: selector option, transform option, aldeed:schema-index
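The error-format change is the one most likely to require code edits. A standalone sketch of the new handling (the `details` array follows the PR's description; the exact fields on each detail entry, `name` and `type` here, are assumptions modeled on Meteor's ValidationError convention, and this stand-in class exists only so the sketch runs outside Meteor):

```javascript
// Stand-in for the package's ValidationError (which extends Meteor.Error
// in the PR); a plain Error subclass here so the sketch is self-contained.
class ValidationError extends Error {
  constructor(details) {
    super('validation-error');
    this.error = 'validation-error';
    this.details = details; // array replacing simpl-schema's ctx.invalidKeys()
  }
}

function collectProblems(fn) {
  try {
    fn();
    return [];
  } catch (e) {
    if (e instanceof ValidationError) {
      // Before: const problems = ctx.invalidKeys()
      // After:  read e.details directly
      return e.details.map(d => `${d.name}: ${d.type}`);
    }
    throw e;
  }
}

const messages = collectProblems(() => {
  throw new ValidationError([{ name: 'name', type: 'minString' }]);
});
// messages → ['name: minString']
```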

Summary by CodeRabbit

  • New Features

    • Released mongo-schema v1.0.0 — complete schema system: rich definition/composition (extend/pick/omit), cleaning (coercion, trimming, defaults, autoValue), validation (field/doc-level, modifier-aware), collection attachment with optional DB-level enforcement, and JSON Schema export.
  • Documentation

    • Added comprehensive README with quick-start, API reference, examples, options, and migration guide.
  • Tests

    • Added extensive unit and integration tests covering types, cleaning, validation, contexts, JSON Schema, collection integration, and migration compatibility.

@netlify

netlify bot commented Mar 28, 2026

Deploy Preview for v3-migration-docs canceled.

Latest commit: 8d0ea49
Latest deploy log: https://app.netlify.com/projects/v3-migration-docs/deploys/69c8a7ee6df6b30008441987

@netlify

netlify bot commented Mar 28, 2026

Deploy Preview for v3-meteor-api-docs canceled.

Latest commit: 8d0ea49
Latest deploy log: https://app.netlify.com/projects/v3-meteor-api-docs/deploys/69c8a7ee7b71b900083a2954

@mvogttech mvogttech marked this pull request as ready for review March 28, 2026 22:21
@coderabbitai

coderabbitai bot commented Mar 28, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.


Walkthrough

Adds a new Meteor package mongo-schema providing schema definition parsing, cleaning, validation, JSON Schema compilation, Mongo.Collection integration (including server-side $jsonSchema enforcement), types/utilities, error models, reactive validation contexts, and comprehensive tests.

Changes

Cohort / File(s) Summary
Package manifest & entry
packages/mongo-schema/package.js, packages/mongo-schema/index.js, packages/mongo-schema/README.md
New package descriptor, entrypoint re-exports (MongoSchema, ValidationError), initialization that wires collection integration, and comprehensive README documentation.
Core schema class & public API
packages/mongo-schema/schema.js, packages/mongo-schema/types.js, packages/mongo-schema/schema_errors.js
Implements MongoSchema class (construction, composition APIs, accessors, toJsonSchema, validator registries), type markers (Integer, Any, oneOf), type resolution, and ValidationError/ErrorTypes constants.
Definition parsing & IR
packages/mongo-schema/schema_definition.js
Parses/normalizes user schema definitions into a flat intermediate representation (IR), expands array .$ descriptors, resolves types, computes required/labels, and attaches child relationships.
Cleaning pipeline
packages/mongo-schema/schema_clean.js
Schema-driven cleaning for documents and modifiers: unknown-key filtering, cloning/mutation control, type coercion, trimming/remove-empty/null options, defaultValue and autoValue execution (including modifier operator merging and unset semantics).
Validation engine
packages/mongo-schema/schema_validate.js, packages/mongo-schema/schema_context.js
Field-level and document-level validation (plain and modifier modes), constraint enforcement (min/max/regex/enum/count, denyInsert/denyUpdate), custom/instance/doc validators, error collection, and a reactive ValidationContext API.
JSON Schema compilation
packages/mongo-schema/schema_jsonschema.js
Compiles IR to MongoDB $jsonSchema validator documents, mapping types, enums, patterns, min/max, item constraints, oneOf, nested properties, and required lists.
Mongo.Collection integration
packages/mongo-schema/collection_integration.js
Adds attachSchema(schema, options), schema(), schemaEnforcedOnDatabase() to Mongo.Collection.prototype; wraps async mutators to run clean/validate and applies server-side collMod with schema-version tracking.
Tests
packages/mongo-schema/tests/*
packages/mongo-schema/tests/types_tests.js, .../schema_validate_tests.js, .../schema_clean_tests.js, .../schema_definition_tests.js, .../schema_jsonschema_tests.js, .../schema_context_tests.js, .../schema_errors_tests.js, .../collection_integration_tests.js, .../migration_compat_tests.js
Comprehensive Tinytest suites covering types, errors, definition parsing/composition, cleaning (doc and modifier), validation (including modifier-mode and custom validators), JSON Schema output, reactive contexts, collection integration, and migration compatibility.

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Pre-merge checks: 3 passed

  • Description check: Passed (check skipped; CodeRabbit's high-level summary is enabled)
  • Title check: Passed. The title clearly and accurately summarizes the main change: adding a new core package for MongoDB schema validation, which is the primary objective of this PR.
  • Docstring coverage: Passed. Docstring coverage is 100.00%, which meets the required threshold of 80.00%.



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 7

🧹 Nitpick comments (9)
packages/mongo-schema/schema_context.js (1)

31-34: Consider returning a copy of the errors array to prevent external mutation.

validationErrors() returns the internal _errors array directly. If a caller mutates this array, it would corrupt the context's state without triggering reactivity updates. Returning a shallow copy ([...this._errors]) would be safer.

🛡️ Proposed defensive copy
   validationErrors() {
     if (this._dep) this._dep.depend();
-    return this._errors;
+    return [...this._errors];
   }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/schema_context.js` around lines 31 - 34, The
validationErrors method currently returns the internal _errors array directly,
risking external mutation; change validationErrors (and keep the existing
_dep.depend() call) to return a shallow copy of the array (e.g., using spread or
slice) so callers get a defensive copy of this._errors and external mutations
won't corrupt the context or bypass reactivity.
packages/mongo-schema/tests/schema_errors_tests.js (1)

5-23: Consider adding coverage for DENY_INSERT and DENY_UPDATE error types.

The test verifies most ErrorTypes constants but omits DENY_INSERT and DENY_UPDATE which are defined in schema_errors.js. These may be covered by integration tests, but adding them here would ensure complete coverage of the constants.

💡 Add missing assertions
   test.equal(MongoSchema.ErrorTypes.FAILED_REGULAR_EXPRESSION, 'regEx');
   test.equal(MongoSchema.ErrorTypes.KEY_NOT_IN_SCHEMA, 'keyNotInSchema');
+  test.equal(MongoSchema.ErrorTypes.DENY_INSERT, 'denyInsert');
+  test.equal(MongoSchema.ErrorTypes.DENY_UPDATE, 'denyUpdate');
 });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/tests/schema_errors_tests.js` around lines 5 - 23, The
test is missing assertions for the DENY_INSERT and DENY_UPDATE error type
constants; update the Tinytest 'errors - ErrorTypes constants exist' to include
test.equal(MongoSchema.ErrorTypes.DENY_INSERT, 'denyInsert') and
test.equal(MongoSchema.ErrorTypes.DENY_UPDATE, 'denyUpdate') so the constants
defined in schema_errors.js (DENY_INSERT and DENY_UPDATE) are asserted like the
other ErrorTypes.
packages/mongo-schema/tests/collection_integration_tests.js (1)

103-109: Add an explicit client/server split assertion for bypassSchema.

This test validates the server happy path, but it doesn’t lock in the “server-only” contract. Add a client-side assertion (or guard) so regressions don’t accidentally allow bypass behavior in client contexts.

As per coding guidelines, packages/**: “Focus on API backwards compatibility, DDP/reactivity correctness, and client/server split.”

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/tests/collection_integration_tests.js` around lines 103
- 109, This test currently exercises the server-only bypassSchema behavior; add
an explicit client/server split by asserting Meteor.isServer at the top when
running the happy-path (so the existing server behavior is validated only on
server), and add a client-side branch that verifies bypassSchema does not allow
schema bypass on the client (e.g. in the Meteor.isClient branch call
col.insertAsync({anything:'goes'},{bypassSchema:true}) and assert the operation
either throws or that the resulting document (via col.findOneAsync or the insert
result) does not contain the extra field), referencing Tinytest.addAsync,
bypassSchema, Mongo.Collection, attachSchema, MongoSchema, insertAsync and
findOneAsync to locate where to change the test.
packages/mongo-schema/schema_jsonschema.js (1)

26-32: oneOf produces empty schema {} for types without bsonType.

When compiling oneOf types, if a resolved type has no bsonType (like MongoSchema.Any or custom constructors), line 29 returns {}. While this is valid JSON Schema (allows any value), it effectively makes the entire oneOf constraint meaningless since {} matches anything.

Consider logging a warning in development, or explicitly handling the Any case:

♻️ Handle Any type explicitly
   if (typeName === 'oneOf') {
-    result.oneOf = desc.resolvedType.resolvedTypes.map(rt => {
-      if (rt.bsonType) return { bsonType: rt.bsonType };
-      return {};
-    });
+    const oneOfSchemas = desc.resolvedType.resolvedTypes
+      .filter(rt => rt.bsonType) // Skip types without bsonType
+      .map(rt => ({ bsonType: rt.bsonType }));
+    if (oneOfSchemas.length > 0) {
+      result.oneOf = oneOfSchemas;
+    }
+    // If all types lack bsonType, result has no type constraint
     return result;
   }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/schema_jsonschema.js` around lines 26 - 32, The oneOf
branch currently maps resolved types to {} when rt.bsonType is missing, which
makes the oneOf meaningless; update the logic in the oneOf handling (look for
the typeName === 'oneOf' block and desc.resolvedType.resolvedTypes) to
explicitly detect and handle MongoSchema.Any or other non-bsonType constructors
instead of returning plain {} — for example produce a schema that signals "any"
via an explicit marker (or skip/include a more specific discriminator), and/or
emit a dev-only warning via your logging facility when a resolved type lacks
bsonType so reviewers know oneOf has a catch-all entry; ensure you reference rt
constructors (e.g., MongoSchema.Any) when deciding behavior so the oneOf
constraint remains meaningful.
packages/mongo-schema/collection_integration.js (1)

130-132: Consider removing async keyword from fire-and-forget function.

applyDatabaseEnforcement is declared async but returns immediately after scheduling a Meteor.startup callback. The actual async work happens inside the callback. This is harmless but slightly misleading since the returned Promise resolves before any database work occurs.

♻️ Optional refactor
-async function applyDatabaseEnforcement(collection, options) {
+function applyDatabaseEnforcement(collection, options) {
   // Server-only, deferred to after startup
   if (typeof Meteor === 'undefined' || !Meteor.isServer) return;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/collection_integration.js` around lines 130 - 132, The
function applyDatabaseEnforcement is declared async but returns immediately
(work is done inside a Meteor.startup callback), so remove the misleading async
keyword and make it a regular function; update its declaration
(applyDatabaseEnforcement) and any internal use that assumes a returned Promise
(if any callers await it, stop awaiting or change them to rely on startup
behavior), and ensure the Meteor.startup callback still schedules the async work
(leave the inner async/await logic unchanged).
packages/mongo-schema/schema_validate.js (1)

243-249: Custom validators in modifier mode have limited field access.

When validating modifier fields, rootDoc is set to modifier[op] (e.g., the $set object). This means custom validators using this.field('otherField') can only see fields within the same operator, not fields being set via different operators like $inc or $setOnInsert.

This may be intentional for SimpleSchema compatibility, but consider documenting this limitation or enhancing the context to search across all operator payloads (similar to applyAutoValuesModifier's field() implementation in schema_clean.js).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/schema_validate.js` around lines 243 - 249, Custom
validators running in modifier mode are given rootDoc = modifier[op], so when
validateField processes modifier operators (in schema_validate.js looping over
modifier[op] and calling validateField) a custom validator's
this.field('otherField') can only see fields inside the same operator; update
validateField's modifier-mode context so field resolution searches across all
operator payloads (not just modifier[op]) similar to applyAutoValuesModifier's
field() behavior in schema_clean.js, or at minimum document this limitation in
the API; specifically, adjust the logic that builds the rootDoc/context passed
into validateField (and any helper that implements this.field) to merge or
lookup values across all modifier operators (e.g., iterate modifier keys and
fallback lookups) so validations can access fields set by other operators like
$inc or $setOnInsert.
packages/mongo-schema/schema.js (1)

140-145: labels() mutates the IR which may affect extended schemas.

The labels() method directly mutates desc.label in the IR. If schemas share IR references (e.g., through extend() without deep cloning), label changes on one schema could affect another.

Looking at extend() (lines 36-39), it creates a new MongoSchema which calls parseDefinition and creates a fresh IR, so this appears safe. However, if future changes introduce IR sharing for performance, this mutation could become problematic.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/schema.js` around lines 140 - 145, labels() currently
mutates IR entries (desc.label) which can leak to other schemas if IRs are
shared; change it to avoid in-place mutation by creating a new descriptor with
the updated label and replacing the IR entry (e.g., read desc via
this._ir.get(key), create a shallow clone with the new label, then
this._ir.set(key, clonedDesc)) or maintain a separate label-overlay map; ensure
the change uses the existing parseDefinition/MongoSchema construction semantics
so extend() still produces independent schemas.
packages/mongo-schema/schema_clean.js (2)

23-25: Inconsistent mutation pattern with removeNullsFromArrays.

Unlike other cleaning steps that reassign result, removeNullsFromArrays only mutates in place and returns undefined. This works because result is already mutable at this point, but the inconsistency could cause confusion if the order of operations changes or if mutate: false semantics need adjustment.

♻️ Optional: Return result for consistency
-  if (options.removeNullsFromArrays) {
-    removeNullsFromArrays(ir, result);
-  }
+  if (options.removeNullsFromArrays) {
+    result = removeNullsFromArrays(ir, result);
+  }

And update the function:

 function removeNullsFromArrays(ir, doc) {
   for (const [key, value] of Object.entries(doc)) {
     if (Array.isArray(value)) {
       doc[key] = value.filter(item => item !== null);
     }
     if (typeof value === 'object' && value !== null && !Array.isArray(value) && !(value instanceof Date)) {
       removeNullsFromArrays(ir, value);
     }
   }
+  return doc;
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/schema_clean.js` around lines 23 - 25, The call to
removeNullsFromArrays mutates result in place but doesn't return it, causing an
inconsistent mutation pattern; update the removeNullsFromArrays function to
return the cleaned object (return result) and change its invocation in the
options.removeNullsFromArrays branch to reassign result (e.g., result =
removeNullsFromArrays(ir, result)) so the flow matches other cleaning steps and
supports future non-mutating semantics.

266-274: autoConvertTypes only processes top-level keys.

Similar to removeEmptyStrings, this function doesn't recurse into nested objects. Nested values like { profile: { age: "25" } } won't be auto-converted to { profile: { age: 25 } }.

This appears consistent across the flat cleaning functions, but differs from the recursive approach used by trimStrings and applyDefaultValues.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/schema_clean.js` around lines 266 - 274,
autoConvertTypes currently only converts top-level fields; update it to recurse
into nested objects and arrays so nested values (e.g., profile.age) are
converted too: for each entry use ir.get(key) as now, but if value is a plain
object call autoConvertTypes recursively with that nested object and the
appropriate nested schema/IR (derived from desc.resolvedType or
desc.schema/inner IR on desc), and if value is an array map over elements
(recursing for object elements and using convertValue for primitives). Keep
using convertValue(desc.resolvedType.name) for primitive conversions, and
preserve the existing checks for null/undefined; update autoConvertTypes, its
recursive calls, and handling of arrays accordingly.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@packages/mongo-schema/collection_integration.js`:
- Around line 142-165: The code attempts to run db.command({ collMod: ... })
using collectionName = collection._name, which will be null for anonymous/local
collections and cause collMod to fail; update the Meteor.startup block to check
collection._name (or collectionName) before any DB operations and skip the
schema-enforcement path (including the _meteor_schema_versions check and the
db.command collMod call) for anonymous collections, returning early or logging
an informational message so anonymous collections are not processed by the
schema enforcement logic in this file.

In `@packages/mongo-schema/schema_clean.js`:
- Around line 126-132: The field() helper currently only checks operators
['$set', '$setOnInsert', '$unset', '$inc'] and therefore misses array-modifying
operators; update field() to include '$push' and '$addToSet' so its operator
list matches the main lookup that checks six operators (e.g., ensure field()
iterates over ['$set', '$setOnInsert', '$unset', '$inc', '$push', '$addToSet']
and returns isSet/value/operator when one of those exists).

In `@packages/mongo-schema/schema_jsonschema.js`:
- Around line 77-83: The conversion currently drops RegExp flags when setting
result.pattern (see desc.regEx and result.pattern); update the code that handles
desc.regEx so that when re is a RegExp and re.flags is non-empty you emit a
clear development-time warning (e.g., logger.warn/console.warn) stating that
MongoDB $jsonSchema does not support regex flags and that the flags (list the
actual re.flags) will be ignored for the pattern being set (include the
pattern/re.source in the message for context); keep the existing behavior of
using Array.isArray(desc.regEx) ? desc.regEx[0] : desc.regEx and still set
result.pattern = re.source, but add the warning when flags exist or
alternatively document this limitation if a runtime logger is unavailable.

In `@packages/mongo-schema/schema_validate.js`:
- Around line 266-278: The code in the $push/$addToSet loop currently sets
pushValue to null when value has $each, which causes validation to be skipped;
instead detect when value.$each exists and iterate its array elements, calling
validateField(ir, itemKey, itemDesc, modifier[op], eachItem, errors, options)
for each element; keep the existing branch for the non-$each case (using
pushValue as before) and ensure you use the same itemKey and itemDesc lookups so
array item schema validation runs for every element in value.$each as well as
for single push values.
- Around line 143-148: The denyInsert/denyUpdate checks in validateConstraints
are never triggered because schema.validate calls from collection_integration.js
omit isInsert/isUpdate; update the schema.validate invocations inside
collection_integration.js (the calls that currently pass { modifier: true } or {
modifier: true, upsert: true }) to include the correct operation flags (e.g.,
pass isInsert: true for insert flows, isUpdate: true for update flows, and
include upsert when appropriate) so validateConstraints can enforce
desc.denyInsert and desc.denyUpdate; ensure these flags align with how
buildCleanOptions/extendAutoValueContext set them during cleaning.

In `@packages/mongo-schema/tests/migration_compat_tests.js`:
- Around line 62-73: The test Tinytest.add 'migration - validate throws
ValidationError' currently calls MongoSchema.prototype.validate via
schema.validate({}) and relies on the catch block to assert the ValidationError,
but if validate() does not throw the test silently passes; update the test to
explicitly fail when no exception is thrown by adding a test.fail(...)
immediately after calling schema.validate({}) (or wrap in a try {
schema.validate({}); test.fail("expected ValidationError"); } catch (e) { ... })
so that the test fails when validate() does not throw; reference the test
function name Tinytest.add and the schema.validate call to locate and modify the
test.

In `@packages/mongo-schema/types.js`:
- Around line 41-44: The Object type predicate in TYPE_MAP.set(Object) and the
related schema type predicate must be tightened to only accept plain objects;
locate the OBJECT entry created via TYPE_MAP.set(Object, { name: 'object',
check: ... }) and the schema predicate around the second occurrence (lines
referenced in the review) and replace the current check (v !== null && typeof v
=== 'object' && !Array.isArray(v) && !(v instanceof Date)) with a call to an
isPlainObject utility (or implement isPlainObject that returns true for object
literals/Object.create() and false for RegExp, Error, Map, Set, Date, etc.).
Ensure both places use the same isPlainObject helper so nested schema validation
will only permit plain objects.

---

Nitpick comments:
In `@packages/mongo-schema/collection_integration.js`:
- Around line 130-132: The function applyDatabaseEnforcement is declared async
but returns immediately (work is done inside a Meteor.startup callback), so
remove the misleading async keyword and make it a regular function; update its
declaration (applyDatabaseEnforcement) and any internal use that assumes a
returned Promise (if any callers await it, stop awaiting or change them to rely
on startup behavior), and ensure the Meteor.startup callback still schedules the
async work (leave the inner async/await logic unchanged).

In `@packages/mongo-schema/schema_clean.js`:
- Around line 23-25: The call to removeNullsFromArrays mutates result in place
but doesn't return it, causing an inconsistent mutation pattern; update the
removeNullsFromArrays function to return the cleaned object (return result) and
change its invocation in the options.removeNullsFromArrays branch to reassign
result (e.g., result = removeNullsFromArrays(ir, result)) so the flow matches
other cleaning steps and supports future non-mutating semantics.
- Around line 266-274: autoConvertTypes currently only converts top-level
fields; update it to recurse into nested objects and arrays so nested values
(e.g., profile.age) are converted too: for each entry use ir.get(key) as now,
but if value is a plain object call autoConvertTypes recursively with that
nested object and the appropriate nested schema/IR (derived from
desc.resolvedType or desc.schema/inner IR on desc), and if value is an array map
over elements (recursing for object elements and using convertValue for
primitives). Keep using convertValue(desc.resolvedType.name) for primitive
conversions, and preserve the existing checks for null/undefined; update
autoConvertTypes, its recursive calls, and handling of arrays accordingly.
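A hedged sketch combining both cleaning suggestions above: the helper returns the cleaned document (so callers can reassign `result`) and recurses into nested objects so nested arrays are cleaned too. The IR parameter is omitted and names are illustrative, not the PR's actual signatures:

```javascript
// Strips null/undefined entries from every array in the document, at any depth,
// and returns the document to support a non-mutating call style.
function removeNullsFromArrays(doc) {
  for (const key of Object.keys(doc)) {
    const value = doc[key];
    if (Array.isArray(value)) {
      doc[key] = value.filter((el) => el !== null && el !== undefined);
    } else if (value && typeof value === 'object' && !(value instanceof Date)) {
      removeNullsFromArrays(value); // recurse so nested fields like profile.tags are cleaned
    }
  }
  return doc; // enables: result = removeNullsFromArrays(result)
}
```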

In `@packages/mongo-schema/schema_context.js`:
- Around line 31-34: The validationErrors method currently returns the internal
_errors array directly, risking external mutation; change validationErrors (and
keep the existing _dep.depend() call) to return a shallow copy of the array
(e.g., using spread or slice) so callers get a defensive copy of this._errors
and external mutations won't corrupt the context or bypass reactivity.
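The defensive-copy pattern can be sketched as below. The `_dep.depend()` reactivity hook is stubbed because Tracker isn't available outside Meteor; only the copy-on-read behavior is the point:

```javascript
class ValidationContext {
  constructor() {
    this._errors = [];
    this._dep = { depend() {} }; // stand-in for a Tracker.Dependency
  }
  addError(err) {
    this._errors.push(err);
  }
  validationErrors() {
    this._dep.depend(); // keep the reactive dependency registration
    return [...this._errors]; // shallow copy: callers cannot mutate internal state
  }
}
```

Note this is a shallow copy: callers can still mutate individual error objects, which is usually acceptable but worth keeping in mind.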

In `@packages/mongo-schema/schema_jsonschema.js`:
- Around line 26-32: The oneOf branch currently maps resolved types to {} when
rt.bsonType is missing, which makes the oneOf meaningless; update the logic in
the oneOf handling (look for the typeName === 'oneOf' block and
desc.resolvedType.resolvedTypes) to explicitly detect and handle MongoSchema.Any
or other non-bsonType constructors instead of returning plain {} — for example
produce a schema that signals "any" via an explicit marker (or skip/include a
more specific discriminator), and/or emit a dev-only warning via your logging
facility when a resolved type lacks bsonType so reviewers know oneOf has a
catch-all entry; ensure you reference rt constructors (e.g., MongoSchema.Any)
when deciding behavior so the oneOf constraint remains meaningful.

In `@packages/mongo-schema/schema_validate.js`:
- Around line 243-249: Custom validators running in modifier mode are given
rootDoc = modifier[op], so when validateField processes modifier operators (in
schema_validate.js looping over modifier[op] and calling validateField) a custom
validator's this.field('otherField') can only see fields inside the same
operator; update validateField's modifier-mode context so field resolution
searches across all operator payloads (not just modifier[op]) similar to
applyAutoValuesModifier's field() behavior in schema_clean.js, or at minimum
document this limitation in the API; specifically, adjust the logic that builds
the rootDoc/context passed into validateField (and any helper that implements
this.field) to merge or lookup values across all modifier operators (e.g.,
iterate modifier keys and fallback lookups) so validations can access fields set
by other operators like $inc or $setOnInsert.
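The cross-operator lookup described above might start from a merge helper like this (name and shape are assumptions, shown only to illustrate the technique):

```javascript
// Flattens all operator payloads into one lookup map so a custom validator's
// this.field('otherField') can see values set by $set, $inc, $setOnInsert, etc.
function mergeModifierFields(modifier) {
  const merged = {};
  for (const op of Object.keys(modifier)) {
    if (op.startsWith('$') && modifier[op] && typeof modifier[op] === 'object') {
      Object.assign(merged, modifier[op]); // later operators win on key collisions
    }
  }
  return merged;
}
```

The merged map would then back the `field()` implementation passed into `validateField`, rather than a single `modifier[op]` payload.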

In `@packages/mongo-schema/schema.js`:
- Around line 140-145: labels() currently mutates IR entries (desc.label) which
can leak to other schemas if IRs are shared; change it to avoid in-place
mutation by creating a new descriptor with the updated label and replacing the
IR entry (e.g., read desc via this._ir.get(key), create a shallow clone with the
new label, then this._ir.set(key, clonedDesc)) or maintain a separate
label-overlay map; ensure the change uses the existing
parseDefinition/MongoSchema construction semantics so extend() still produces
independent schemas.
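The copy-on-write approach suggested above could look like this over a Map-based IR (the descriptor shape is an assumption):

```javascript
// Replaces each IR entry with a shallow clone carrying the new label,
// so a descriptor shared with another schema is never mutated in place.
function applyLabels(ir, overrides) {
  for (const [key, label] of Object.entries(overrides)) {
    const desc = ir.get(key);
    if (desc) ir.set(key, { ...desc, label });
  }
}
```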

In `@packages/mongo-schema/tests/collection_integration_tests.js`:
- Around line 103-109: This test currently exercises the server-only
bypassSchema behavior; add an explicit client/server split by asserting
Meteor.isServer at the top when running the happy-path (so the existing server
behavior is validated only on server), and add a client-side branch that
verifies bypassSchema does not allow schema bypass on the client (e.g. in the
Meteor.isClient branch call
col.insertAsync({anything:'goes'},{bypassSchema:true}) and assert the operation
either throws or that the resulting document (via col.findOneAsync or the insert
result) does not contain the extra field), referencing Tinytest.addAsync,
bypassSchema, Mongo.Collection, attachSchema, MongoSchema, insertAsync and
findOneAsync to locate where to change the test.

In `@packages/mongo-schema/tests/schema_errors_tests.js`:
- Around line 5-23: The test is missing assertions for the DENY_INSERT and
DENY_UPDATE error type constants; update the Tinytest 'errors - ErrorTypes
constants exist' to include test.equal(MongoSchema.ErrorTypes.DENY_INSERT,
'denyInsert') and test.equal(MongoSchema.ErrorTypes.DENY_UPDATE, 'denyUpdate')
so the constants defined in schema_errors.js (DENY_INSERT and DENY_UPDATE) are
asserted like the other ErrorTypes.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: ec0ab768-a3bd-44d1-8e68-6f241ab0211c

📥 Commits

Reviewing files that changed from the base of the PR and between d8e169e and 1cae4a9.

📒 Files selected for processing (21)
  • packages/mongo-schema/README.md
  • packages/mongo-schema/collection_integration.js
  • packages/mongo-schema/index.js
  • packages/mongo-schema/package.js
  • packages/mongo-schema/schema.js
  • packages/mongo-schema/schema_clean.js
  • packages/mongo-schema/schema_context.js
  • packages/mongo-schema/schema_definition.js
  • packages/mongo-schema/schema_errors.js
  • packages/mongo-schema/schema_jsonschema.js
  • packages/mongo-schema/schema_validate.js
  • packages/mongo-schema/tests/collection_integration_tests.js
  • packages/mongo-schema/tests/migration_compat_tests.js
  • packages/mongo-schema/tests/schema_clean_tests.js
  • packages/mongo-schema/tests/schema_context_tests.js
  • packages/mongo-schema/tests/schema_definition_tests.js
  • packages/mongo-schema/tests/schema_errors_tests.js
  • packages/mongo-schema/tests/schema_jsonschema_tests.js
  • packages/mongo-schema/tests/schema_validate_tests.js
  • packages/mongo-schema/tests/types_tests.js
  • packages/mongo-schema/types.js


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (3)
packages/mongo-schema/schema_definition.js (1)

100-109: Edge case: empty array [] throws an unclear error.

The array shorthand check at Line 101 requires exactly one element (def.length === 1). An empty array [] falls through to Line 108 and throws with JSON output "[]", which may be confusing. Consider adding a specific error message for empty arrays.

Proposed improvement for clearer error message
   // Shorthand: Array of type (e.g., { tags: [String] })
   if (Array.isArray(def) && def.length === 1) {
     return { type: Array, _arrayItemType: def[0] };
   }
+  // Invalid: empty array
+  if (Array.isArray(def) && def.length === 0) {
+    throw new Error('MongoSchema: Array shorthand requires exactly one element, e.g., [String]');
+  }
   // Full form object
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/schema_definition.js` around lines 100 - 109, The code
falls through and throws an unclear error when an empty array shorthand (def ===
[]) is provided; add an explicit branch before the existing Array.isArray(def)
&& def.length === 1 check to detect Array.isArray(def) && def.length === 0 and
throw a specific Error (e.g., "MongoSchema: Empty array shorthand [] is not
allowed; expected one type like [String]") so callers see a clear message; keep
the existing behavior for def.length === 1 (return { type: Array,
_arrayItemType: def[0] }) and for full-form objects (the typeof def === 'object'
&& def.type !== undefined branch).
packages/mongo-schema/schema_validate.js (1)

234-239: Minor inconsistency: denyUpdate triggers regardless of value presence.

At line 234, denyInsert checks value !== undefined, but at line 237, denyUpdate does not. In modifier validation, this is fine since only touched fields are validated. However, in document validation mode with isUpdate: true, this could trigger for fields that aren't actually present.

Consider adding the same guard for consistency:

Proposed fix
   if (desc.denyInsert && options.isInsert && value !== undefined) {
     errors.push({ name: key, type: ErrorTypes.DENY_INSERT, value, message: `${label} cannot be set during insert` });
   }
-  if (desc.denyUpdate && options.isUpdate) {
+  if (desc.denyUpdate && options.isUpdate && value !== undefined) {
     errors.push({ name: key, type: ErrorTypes.DENY_UPDATE, value, message: `${label} cannot be set during update` });
   }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/schema_validate.js` around lines 234 - 239, The
denyUpdate branch is missing the same presence guard used by denyInsert, causing
DENY_UPDATE errors during document validation when the field isn't present;
update the condition around desc.denyUpdate in schema_validate.js (the block
that calls errors.push with type ErrorTypes.DENY_UPDATE) to only run when
options.isUpdate is true AND value !== undefined, mirroring the denyInsert check
so only present/touched fields trigger the error.
packages/mongo-schema/types.js (1)

164-172: Consider documenting that oneOf descriptors lack a top-level bsonType.

The oneOf type descriptor omits bsonType, which is intentional since JSON Schema compilation handles it via the resolvedTypes array. However, documenting this explicitly in the return object or JSDoc would help maintainability.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/types.js` around lines 164 - 172, The oneOf branch
(when typeDef._isOneOf) returns an object without a top-level bsonType; update
the code or JSDoc to explicitly document that behavior so future maintainers
know JSON Schema compilation uses resolvedTypes instead. Add a short inline
comment or JSDoc on the returned object from the oneOf block (or add an explicit
field like bsonType: undefined/null with a clarifying comment) and mention that
resolvedTypes is used for JSON Schema compilation; reference the oneOf branch,
the typeDef._isOneOf check, and the resolvedTypes variable when adding the
documentation.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@packages/mongo-schema/schema_jsonschema.js`:
- Around line 130-142: The condition that compiles nested children only checks
for typeName === 'object', so nested MongoSchema instances with typeName ===
'schema' are skipped; update the branch in schema_jsonschema.js to treat both
object and schema the same (i.e., use typeName === 'object' || typeName ===
'schema') so that when desc.children exist and !desc.blackbox the loop that
calls compileField(ir, childKey, childDesc) still runs and populates
result.properties and result.required as before.

In `@packages/mongo-schema/schema_validate.js`:
- Around line 200-207: The recursive validation only runs for
desc.resolvedType.name === 'object' so nested MongoSchema subschemas (which have
resolvedType.name === 'schema') are skipped; update the condition in
validateField to also treat 'schema' the same as 'object' (e.g., change the if
that checks desc.resolvedType.name to include 'schema') while keeping the checks
for desc.children and !desc.blackbox, then continue to call ir.get(childKey) and
recurse via validateField as currently implemented.

---

Nitpick comments:
In `@packages/mongo-schema/schema_definition.js`:
- Around line 100-109: The code falls through and throws an unclear error when
an empty array shorthand (def === []) is provided; add an explicit branch before
the existing Array.isArray(def) && def.length === 1 check to detect
Array.isArray(def) && def.length === 0 and throw a specific Error (e.g.,
"MongoSchema: Empty array shorthand [] is not allowed; expected one type like
[String]") so callers see a clear message; keep the existing behavior for
def.length === 1 (return { type: Array, _arrayItemType: def[0] }) and for
full-form objects (the typeof def === 'object' && def.type !== undefined
branch).

In `@packages/mongo-schema/schema_validate.js`:
- Around line 234-239: The denyUpdate branch is missing the same presence guard
used by denyInsert, causing DENY_UPDATE errors during document validation when
the field isn't present; update the condition around desc.denyUpdate in
schema_validate.js (the block that calls errors.push with type
ErrorTypes.DENY_UPDATE) to only run when options.isUpdate is true AND value !==
undefined, mirroring the denyInsert check so only present/touched fields trigger
the error.

In `@packages/mongo-schema/types.js`:
- Around line 164-172: The oneOf branch (when typeDef._isOneOf) returns an
object without a top-level bsonType; update the code or JSDoc to explicitly
document that behavior so future maintainers know JSON Schema compilation uses
resolvedTypes instead. Add a short inline comment or JSDoc on the returned
object from the oneOf block (or add an explicit field like bsonType:
undefined/null with a clarifying comment) and mention that resolvedTypes is used
for JSON Schema compilation; reference the oneOf branch, the typeDef._isOneOf
check, and the resolvedTypes variable when adding the documentation.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 7d0f43b6-781a-44a7-91fd-cbb70f336655

📥 Commits

Reviewing files that changed from the base of the PR and between 1cae4a9 and e852421.

📒 Files selected for processing (17)
  • packages/mongo-schema/collection_integration.js
  • packages/mongo-schema/index.js
  • packages/mongo-schema/schema.js
  • packages/mongo-schema/schema_clean.js
  • packages/mongo-schema/schema_context.js
  • packages/mongo-schema/schema_definition.js
  • packages/mongo-schema/schema_errors.js
  • packages/mongo-schema/schema_jsonschema.js
  • packages/mongo-schema/schema_validate.js
  • packages/mongo-schema/tests/collection_integration_tests.js
  • packages/mongo-schema/tests/migration_compat_tests.js
  • packages/mongo-schema/tests/schema_clean_tests.js
  • packages/mongo-schema/tests/schema_context_tests.js
  • packages/mongo-schema/tests/schema_definition_tests.js
  • packages/mongo-schema/tests/schema_errors_tests.js
  • packages/mongo-schema/tests/schema_validate_tests.js
  • packages/mongo-schema/types.js
🚧 Files skipped from review as they are similar to previous changes (2)
  • packages/mongo-schema/index.js
  • packages/mongo-schema/schema_context.js


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (1)
packages/mongo-schema/schema_validate.js (1)

234-239: ⚠️ Potential issue | 🟠 Major

denyInsert/denyUpdate constraints require isInsert/isUpdate flags from callers.

The constraint checks here are correctly implemented, but as previously noted, collection_integration.js must pass the isInsert/isUpdate flags to schema.validate() for these constraints to be enforced. Without those flags, denyInsert and denyUpdate restrictions will silently pass.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/schema_validate.js` around lines 234 - 239, The
denyInsert/denyUpdate checks in schema_validate.js rely on callers to set
options.isInsert/options.isUpdate, so update collection_integration.js to pass
these flags into schema.validate during insert and update flows: when performing
an insert call schema.validate(document, { ..., isInsert: true, ... }) and for
updates call schema.validate(updateDoc, { ..., isUpdate: true, ... }); ensure
any existing options object passed into schema.validate is preserved/merged
rather than overwritten so other validation options remain intact.
🧹 Nitpick comments (1)
packages/mongo-schema/schema_validate.js (1)

449-453: Minor: siblingField() in modifier validators has same issue.

The siblingField implementation here (line 452) delegates to this.field(name) similar to runCustomValidator. However, since modifier fields are typically flat dot-paths in mergedFields, the impact is less severe than in plain document validation. Consider applying the same fix for consistency.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/schema_validate.js` around lines 449 - 453,
siblingField currently delegates to this.field(name) which may not correctly
resolve sibling values for modifier validation; update siblingField in the
modifier validator to mirror the same resolution logic used for field (and as
fixed in runCustomValidator) so it looks up mergedFields using the flat dot-path
(mergedFields[name]) and returns { isSet: mergedFields[name] !== undefined,
value: mergedFields[name], operator: null } (keeping parentField and field
behavior unchanged) to ensure consistent sibling lookup for modifier
validations.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@packages/mongo-schema/schema_validate.js`:
- Around line 333-354: The custom validator context in runCustomValidator
constructs siblingField and parentField incorrectly by using absolute lookups;
change siblingField and parentField to compute the current field's parent path
(derived from the provided key/genericKey) and perform nested lookups relative
to that parent: implement siblingField(name) to build a siblingPath = parentPath
? `${parentPath}.${name}` : name, call getNestedValue(rootDoc, siblingPath) and
return { isSet: val !== undefined, value: val, operator: null }, and implement
parentField() to return the parent object via getNestedValue(rootDoc,
parentPath) (or { isSet: true, value: rootDoc } if no parent), keeping other
context properties and merging options.extendedCustomContext as before; use
runCustomValidator, siblingField, parentField, getNestedValue and
options.extendedCustomContext to locate and update the code similar to the logic
used in schema_clean.js.

---

Duplicate comments:
In `@packages/mongo-schema/schema_validate.js`:
- Around line 234-239: The denyInsert/denyUpdate checks in schema_validate.js
rely on callers to set options.isInsert/options.isUpdate, so update
collection_integration.js to pass these flags into schema.validate during insert
and update flows: when performing an insert call schema.validate(document, {
..., isInsert: true, ... }) and for updates call schema.validate(updateDoc, {
..., isUpdate: true, ... }); ensure any existing options object passed into
schema.validate is preserved/merged rather than overwritten so other validation
options remain intact.

---

Nitpick comments:
In `@packages/mongo-schema/schema_validate.js`:
- Around line 449-453: siblingField currently delegates to this.field(name)
which may not correctly resolve sibling values for modifier validation; update
siblingField in the modifier validator to mirror the same resolution logic used
for field (and as fixed in runCustomValidator) so it looks up mergedFields using
the flat dot-path (mergedFields[name]) and returns { isSet: mergedFields[name]
!== undefined, value: mergedFields[name], operator: null } (keeping parentField
and field behavior unchanged) to ensure consistent sibling lookup for modifier
validations.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 2bd5c9a7-1122-463f-84db-e8f8b66c66ed

📥 Commits

Reviewing files that changed from the base of the PR and between e852421 and 8d0ea49.

📒 Files selected for processing (3)
  • packages/mongo-schema/schema_clean.js
  • packages/mongo-schema/schema_jsonschema.js
  • packages/mongo-schema/schema_validate.js

Comment on lines +333 to +354
function runCustomValidator(ir, key, desc, rootDoc, value, errors, options) {
  const context = {
    key,
    genericKey: key.replace(/\.\d+\./g, '.$.').replace(/\.\d+$/, '.$'),
    value,
    isSet: value !== undefined,
    operator: null,
    definition: desc,
    field(name) {
      const val = getNestedValue(rootDoc, name);
      return { isSet: val !== undefined, value: val, operator: null };
    },
    siblingField(name) { return this.field(name); },
    parentField() { return { isSet: true, value: rootDoc, operator: null }; },
    ...(options.extendedCustomContext || {}),
  };

  const result = desc.custom.call(context);
  if (typeof result === 'string') {
    errors.push({ name: key, type: result, value, message: `${desc.label || key} failed validation: ${result}` });
  }
}


⚠️ Potential issue | 🟠 Major

siblingField() and parentField() are incorrect for nested fields in custom validators.

The siblingField implementation (line 345) simply delegates to this.field(name), which performs an absolute path lookup. For a nested field like address.city, calling siblingField('country') should return address.country, but this implementation returns the value at root-level country.

Similarly, parentField() (line 346) returns rootDoc instead of the actual parent object.

Compare to the correct implementation in schema_clean.js lines 584-606 which computes parentPath and constructs proper sibling paths.

🐛 Proposed fix
 function runCustomValidator(ir, key, desc, rootDoc, value, errors, options) {
+  const parentPath = key.includes('.') ? key.slice(0, key.lastIndexOf('.')) : '';
+
   const context = {
     key,
     genericKey: key.replace(/\.\d+\./g, '.$.').replace(/\.\d+$/, '.$'),
     value,
     isSet: value !== undefined,
     operator: null,
     definition: desc,
     field(name) {
       const val = getNestedValue(rootDoc, name);
       return { isSet: val !== undefined, value: val, operator: null };
     },
-    siblingField(name) { return this.field(name); },
-    parentField() { return { isSet: true, value: rootDoc, operator: null }; },
+    siblingField(name) {
+      const siblingPath = parentPath ? `${parentPath}.${name}` : name;
+      const val = getNestedValue(rootDoc, siblingPath);
+      return { isSet: val !== undefined, value: val, operator: null };
+    },
+    parentField() {
+      if (!parentPath) return { isSet: true, value: rootDoc, operator: null };
+      const val = getNestedValue(rootDoc, parentPath);
+      return { isSet: val !== undefined, value: val, operator: null };
+    },
     ...(options.extendedCustomContext || {}),
   };
🧰 Tools
🪛 ESLint

[error] 345-345: Replace ·return·this.field(name); with ⏎······return·this.field(name);⏎···

(prettier/prettier)


[error] 346-346: Replace ·return·{·isSet:·true,·value:·rootDoc,·operator:·null·}; with ⏎······return·{·isSet:·true,·value:·rootDoc,·operator:·null·};⏎···

(prettier/prettier)


[error] 352-352: Replace ·name:·key,·type:·result,·value,·message:·${desc.label·||·key}·failed·validation:·${result}`` with ⏎······name:·key,⏎······type:·result,⏎······value,⏎······message:·${desc.label·||·key}·failed·validation:·${result}`,⏎···`

(prettier/prettier)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/mongo-schema/schema_validate.js` around lines 333 - 354, The custom
validator context in runCustomValidator constructs siblingField and parentField
incorrectly by using absolute lookups; change siblingField and parentField to
compute the current field's parent path (derived from the provided
key/genericKey) and perform nested lookups relative to that parent: implement
siblingField(name) to build a siblingPath = parentPath ? `${parentPath}.${name}`
: name, call getNestedValue(rootDoc, siblingPath) and return { isSet: val !==
undefined, value: val, operator: null }, and implement parentField() to return
the parent object via getNestedValue(rootDoc, parentPath) (or { isSet: true,
value: rootDoc } if no parent), keeping other context properties and merging
options.extendedCustomContext as before; use runCustomValidator, siblingField,
parentField, getNestedValue and options.extendedCustomContext to locate and
update the code similar to the logic used in schema_clean.js.
