diff --git a/docs/configuration/cli.md b/docs/configuration/cli.md
index 9393c17cca..89ccd3ca42 100644
--- a/docs/configuration/cli.md
+++ b/docs/configuration/cli.md
@@ -29,7 +29,7 @@ scenarios.
To use the expanded command line options in a .NET application, add the line of code shown below to your `Program.cs`:
-
+
```cs
var builder = WebApplication.CreateBuilder(args);
@@ -37,20 +37,20 @@ var builder = WebApplication.CreateBuilder(args);
// Must be done before calling builder.Build() at least
builder.Host.ApplyJasperFxExtensions();
```
-snippet source | anchor
+snippet source | anchor
And finally, use JasperFx as the command line parser and executor by replacing `App.Run()` as the last line of code in your
`Program.cs` file:
-
+
```cs
// Instead of App.Run(), use the app.RunJasperFxCommands(args)
// as the last line of your Program.cs file
return await app.RunJasperFxCommands(args);
```
-snippet source | anchor
+snippet source | anchor
From the command line in the project directory, you can run:
diff --git a/docs/configuration/hostbuilder.md b/docs/configuration/hostbuilder.md
index 35f8227c45..57661ab6c8 100644
--- a/docs/configuration/hostbuilder.md
+++ b/docs/configuration/hostbuilder.md
@@ -3,7 +3,7 @@
As briefly shown in the [getting started](/) page, Marten comes with the `AddMarten()` extension method for the .NET `IServiceCollection` to quickly add Marten to any ASP.NET Core or Worker Service application:
-
+
```cs
// This is the absolute, simplest way to integrate Marten into your
// .NET application with Marten's default configuration
@@ -26,7 +26,7 @@ builder.Services.AddMarten(options =>
// string to Marten
.UseNpgsqlDataSource();
```
-snippet source | anchor
+snippet source | anchor
The `AddMarten()` method will add these service registrations to your application:
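Among those registrations are scoped `IDocumentSession` and `IQuerySession` services, so you can inject a session straight into your own classes. A minimal sketch (this `UserProfileService` is hypothetical, not part of the Marten docs):

```cs
public class UserProfileService
{
    private readonly IDocumentSession _session;

    // The scoped session registration lets the container hand you a session per request
    public UserProfileService(IDocumentSession session) => _session = session;

    public async Task RenameAsync(Guid userId, string firstName)
    {
        var user = await _session.LoadAsync<User>(userId);
        if (user is null) return;

        user.FirstName = firstName;
        _session.Store(user);
        await _session.SaveChangesAsync();
    }
}
```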
@@ -61,20 +61,20 @@ All the examples in this page are assuming the usage of the default IoC containe
First, if you are using Marten completely out of the box with no customizations (besides attributes on your documents), you can just supply a connection string to the underlying Postgresql database like this:
-
+
```cs
var connectionString = Configuration.GetConnectionString("postgres");
// By only the connection string
services.AddMarten(connectionString);
```
-snippet source | anchor
+snippet source | anchor
The second option is to supply a [nested closure](https://martinfowler.com/dslCatalog/nestedClosure.html) to configure Marten inline like so:
-
+
```cs
var connectionString = Configuration.GetConnectionString("postgres");
@@ -91,13 +91,13 @@ services.CritterStackDefaults(x =>
x.Production.ResourceAutoCreate = AutoCreate.None;
});
```
-snippet source | anchor
+snippet source | anchor
Lastly, if you prefer, you can pass a Marten `StoreOptions` object to `AddMarten()` like this example:
-
+
```cs
var connectionString = Configuration.GetConnectionString("postgres");
@@ -113,7 +113,7 @@ services.CritterStackDefaults(x =>
x.Production.ResourceAutoCreate = AutoCreate.None;
});
```
-snippet source | anchor
+snippet source | anchor
## Using NpgsqlDataSource
@@ -130,7 +130,7 @@ You can also use the [NpgsqlDataSource](https://www.npgsql.org/doc/basic-usage.h
You can use the `AddNpgsqlDataSource` method from the [Npgsql.DependencyInjection package](https://www.nuget.org/packages/Npgsql.DependencyInjection) to perform the setup, then call `UseNpgsqlDataSource()`:
-
+
```cs
services.AddNpgsqlDataSource(ConnectionSource.ConnectionString);
@@ -138,13 +138,13 @@ services.AddMarten()
.UseLightweightSessions()
.UseNpgsqlDataSource();
```
-snippet source | anchor
+snippet source | anchor
If you're on .NET 8 (and above), you can also use a dedicated [keyed registration](https://learn.microsoft.com/en-us/dotnet/core/whats-new/dotnet-8#keyed-di-services). This can be useful for scenarios where you need more than one data source registered:
-
+
```cs
const string dataSourceKey = "marten_data_source";
@@ -154,7 +154,7 @@ services.AddMarten()
.UseLightweightSessions()
.UseNpgsqlDataSource(dataSourceKey);
```
-snippet source | anchor
+snippet source | anchor
## Using a Multi-Host Data Source
@@ -164,7 +164,7 @@ Marten includes support for `NpgsqlMultiHostDataSource`, allowing you to spread
Configuring `NpgsqlMultiHostDataSource` is very similar to configuring a normal data source; simply swap in `AddMultiHostNpgsqlDataSource`. Marten will always use the primary node for queries with a `NpgsqlMultiHostDataSource` unless you explicitly opt to use the standby nodes. You can adjust what type of node Marten uses for querying via the `MultiHostSettings` store options:
-
+
```cs
services.AddMultiHostNpgsqlDataSource(ConnectionSource.ConnectionString);
@@ -176,7 +176,7 @@ services.AddMarten(x =>
.UseLightweightSessions()
.UseNpgsqlDataSource();
```
-snippet source | anchor
+snippet source | anchor
::: warning
@@ -197,7 +197,7 @@ The `AddMarten()` mechanism assumes that you are expressing all of the Marten co
Fear not, Marten V5.0 introduced a new way to add or modify the Marten configuration from `AddMarten()`. Let's assume that we're building a system that has a subsystem related to *users* and want to segregate all the service registrations and Marten configuration related to *users* into a single place like this extension method:
-
+
```cs
public static IServiceCollection AddUserModule(this IServiceCollection services)
{
@@ -214,7 +214,7 @@ public static IServiceCollection AddUserModule(this IServiceCollection services)
return services;
}
```
-snippet source | anchor
+snippet source | anchor
And next, let's put that into context with its usage inside your application's bootstrapping:
@@ -242,7 +242,7 @@ The `ConfigureMarten()` method is the interesting part of the code samples above
service that implements the `IConfigureMarten` interface into the underlying IoC container:
-
+
```cs
/// <summary>
/// Mechanism to register additional Marten configuration that is applied after AddMarten()
@@ -253,13 +253,13 @@ public interface IConfigureMarten
void Configure(IServiceProvider services, StoreOptions options);
}
```
-snippet source | anchor
+snippet source | anchor
You could alternatively implement a custom `IConfigureMarten` (or `IConfigureMarten<T> where T : IDocumentStore` if you're working with multiple databases) class like so:
-
+
```cs
internal class UserMartenConfiguration: IConfigureMarten
{
@@ -270,13 +270,13 @@ internal class UserMartenConfiguration: IConfigureMarten
}
}
```
-snippet source | anchor
+snippet source | anchor
and then register it in your IoC container with something like this:
-
+
```cs
public static IServiceCollection AddUserModule2(this IServiceCollection services)
{
@@ -293,7 +293,7 @@ public static IServiceCollection AddUserModule2(this IServiceCollection services
return services;
}
```
-snippet source | anchor
+snippet source | anchor
### Using IoC Services for Configuring Marten
@@ -305,7 +305,7 @@ be used to selectively configure Marten using potentially asynchronous methods a
That interface signature is:
-
+
```cs
/// <summary>
/// Mechanism to register additional Marten configuration that is applied after AddMarten()
@@ -317,13 +317,13 @@ public interface IAsyncConfigureMarten
ValueTask Configure(StoreOptions options, CancellationToken cancellationToken);
}
```
-snippet source | anchor
+snippet source | anchor
As an example from the tests, here's a custom version that uses the Feature Management service:
-
+
```cs
public class FeatureManagementUsingExtension: IAsyncConfigureMarten
{
@@ -343,7 +343,7 @@ public class FeatureManagementUsingExtension: IAsyncConfigureMarten
}
}
```
-snippet source | anchor
+snippet source | anchor
And lastly, these extensions can be registered directly against `IServiceCollection` like so:
@@ -369,7 +369,7 @@ compatibility with early Marten and RavenDb behavior before that. To opt into us
without the identity map behavior, use this syntax:
-
+
```cs
var connectionString = Configuration.GetConnectionString("postgres");
@@ -382,7 +382,7 @@ services.AddMarten(opts =>
// session factory behavior
.UseLightweightSessions();
```
-snippet source | anchor
+snippet source | anchor
## Customizing Session Creation Globally
@@ -391,7 +391,7 @@ By default, Marten will create a document session with the basic identity map en
as shown in this example:
-
+
```cs
public class CustomSessionFactory: ISessionFactory
{
@@ -419,13 +419,13 @@ public class CustomSessionFactory: ISessionFactory
}
}
```
-snippet source | anchor
+snippet source | anchor
To register the custom session factory, use the `BuildSessionsWith()` method as shown in this example:
-
+
```cs
var connectionString = Configuration.GetConnectionString("postgres");
@@ -446,7 +446,7 @@ services.CritterStackDefaults(x =>
x.Production.ResourceAutoCreate = AutoCreate.None;
});
```
-snippet source | anchor
+snippet source | anchor
The session factories can also be used to build out and attach custom `IDocumentSessionListener` objects or replace the logging as we'll see in the next section.
@@ -461,20 +461,20 @@ session identification in your application? That's now possible by using a custo
Taking the example of an ASP.NET Core application, let's say that you have a small service scoped to an HTTP request that tracks a correlation identifier for the request like this:
-
+
```cs
public interface ISession
{
Guid CorrelationId { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
And a custom Marten session logger to add the correlation identifier to the log output like this:
-
+
```cs
public class CorrelatedMartenLogger: IMartenSessionLogger
{
@@ -528,13 +528,13 @@ public class CorrelatedMartenLogger: IMartenSessionLogger
}
}
```
-snippet source | anchor
+snippet source | anchor
Now, let's move on to building out a custom session factory that will attach our correlated Marten logger to sessions being resolved from the IoC container:
-
+
```cs
public class ScopedSessionFactory: ISessionFactory
{
@@ -568,13 +568,13 @@ public class ScopedSessionFactory: ISessionFactory
}
}
```
-snippet source | anchor
+snippet source | anchor
Lastly, let's register our new session factory, but this time we need to take care to register the session factory as `Scoped` in the underlying container so we're using the correct `ISession` at runtime:
-
+
```cs
var connectionString = Configuration.GetConnectionString("postgres");
@@ -594,7 +594,7 @@ services.CritterStackDefaults(x =>
x.Production.ResourceAutoCreate = AutoCreate.None;
});
```
-snippet source | anchor
+snippet source | anchor
::: tip
@@ -621,7 +621,7 @@ To utilize the type system and your application's underlying IoC container, the
below targeting a separate "invoicing" database:
-
+
```cs
// These marker interfaces *must* be public
public interface IInvoicingStore : IDocumentStore
@@ -629,7 +629,7 @@ public interface IInvoicingStore : IDocumentStore
}
```
-snippet source | anchor
+snippet source | anchor
A couple notes on the interface:
@@ -640,7 +640,7 @@ A couple notes on the interface:
And now to bootstrap that separate store in our system:
-
+
```cs
using var host = Host.CreateDefaultBuilder()
.ConfigureServices(services =>
@@ -675,14 +675,14 @@ using var host = Host.CreateDefaultBuilder()
});
}).StartAsync();
```
-snippet source | anchor
+snippet source | anchor
At runtime we can inject an instance of our new `IInvoicingStore` and work with it like any other
Marten `IDocumentStore` as shown below in an internal `InvoicingService`:
-
+
```cs
public class InvoicingService
{
@@ -705,5 +705,5 @@ public class InvoicingService
}
}
```
-snippet source | anchor
+snippet source | anchor
diff --git a/docs/configuration/ioc.md b/docs/configuration/ioc.md
index 6ff101b0e7..4b1992ca48 100644
--- a/docs/configuration/ioc.md
+++ b/docs/configuration/ioc.md
@@ -20,7 +20,7 @@ use the `AddMarten()` method directly with Lamar as well.
Using [Lamar](https://jasperfx.github.io/lamar) as the example container, we recommend registering Marten something like this:
-
+
```cs
public class MartenServices : ServiceRegistry
{
@@ -49,7 +49,7 @@ public class MartenServices : ServiceRegistry
}
}
```
-snippet source | anchor
+snippet source | anchor
There are really only two key points here:
diff --git a/docs/configuration/json.md b/docs/configuration/json.md
index d008dabce2..5862b162d4 100644
--- a/docs/configuration/json.md
+++ b/docs/configuration/json.md
@@ -232,7 +232,7 @@ Please talk to the Marten team before you undergo any significant effort to supp
Internally, Marten uses an adapter interface for JSON serialization:
-
+
```cs
/// <summary>
/// When selecting data through Linq Select() transforms,
@@ -338,7 +338,7 @@ public interface ISerializer
string ToJsonWithTypes(object document);
}
```
-snippet source | anchor
+snippet source | anchor
To support a new serialization library or customize the JSON serialization options, you can write a new version of `ISerializer` and plug it
diff --git a/docs/configuration/multitenancy.md b/docs/configuration/multitenancy.md
index 1bcfbdd2d9..6d03a3bb65 100644
--- a/docs/configuration/multitenancy.md
+++ b/docs/configuration/multitenancy.md
@@ -267,7 +267,7 @@ It is strongly recommended that you first refer to the existing Marten options f
The multi-tenancy strategy is pluggable. Start by implementing the `Marten.Storage.ITenancy` interface:
-
+
```cs
/// <summary>
/// Pluggable interface for Marten multi-tenancy by database
@@ -314,19 +314,19 @@ public interface ITenancy: IDatabaseSource, IDisposable, IDatabaseUser
bool IsTenantStoredInCurrentDatabase(IMartenDatabase database, string tenantId);
}
```
-snippet source | anchor
+snippet source | anchor
Assuming that we have a custom `ITenancy` model:
-
+
```cs
// Make sure you implement the Dispose() method and
// dispose all MartenDatabase objects
public class MySpecialTenancy: ITenancy
```
-snippet source | anchor
+snippet source | anchor
We can utilize that by applying that model at configuration time:
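A rough sketch of that wiring, assuming the `StoreOptions.Tenancy` property is assignable in your Marten version:

```cs
var store = DocumentStore.For(opts =>
{
    opts.Connection("some connection string");

    // Plug in the custom ITenancy implementation from above
    opts.Tenancy = new MySpecialTenancy();
});
```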
diff --git a/docs/configuration/retries.md b/docs/configuration/retries.md
index 0e85f71660..5263667eda 100644
--- a/docs/configuration/retries.md
+++ b/docs/configuration/retries.md
@@ -7,7 +7,7 @@ Marten's previous, homegrown `IRetryPolicy` mechanism was completely replaced by
Out of the box, Marten is using [Polly.Core](https://www.pollydocs.org/) for resiliency on most operations with this setup:
-
+
```cs
// default Marten policies
return builder
@@ -22,7 +22,7 @@ return builder
BackoffType = DelayBackoffType.Exponential
});
```
-snippet source | anchor
+snippet source | anchor
The general idea is to have _some_ level of retry with an exponential backoff on typical transient errors encountered
diff --git a/docs/configuration/storeoptions.md b/docs/configuration/storeoptions.md
index ac9ede632e..67781d8282 100644
--- a/docs/configuration/storeoptions.md
+++ b/docs/configuration/storeoptions.md
@@ -5,7 +5,7 @@ The static builder methods like `DocumentStore.For(configuration)` or `IServiceC
syntactic sugar around building up a `StoreOptions` object and passing that to the constructor function of a `DocumentStore`:
-
+
```cs
public static DocumentStore For(Action<StoreOptions> configure)
{
@@ -15,7 +15,7 @@ public static DocumentStore For(Action configure)
return new DocumentStore(options);
}
```
-snippet source | anchor
+snippet source | anchor
The major parts of `StoreOptions` are shown in the class diagram below:
@@ -83,7 +83,7 @@ to compose your document type configuration in additional `MartenRegistry` objec
To do so, create your own subclass of `MartenRegistry` and place the declarations in its constructor function, as in this example:
-
+
```cs
public class OrganizationRegistry: MartenRegistry
{
@@ -94,13 +94,13 @@ public class OrganizationRegistry: MartenRegistry
}
}
```
-snippet source | anchor
+snippet source | anchor
To apply your new `MartenRegistry`, just include it when you bootstrap the `IDocumentStore` as in this example:
-
+
```cs
var store = DocumentStore.For(opts =>
{
@@ -109,7 +109,7 @@ var store = DocumentStore.For(opts =>
opts.Connection(ConnectionSource.ConnectionString);
});
```
-snippet source | anchor
+snippet source | anchor
Do note that you could happily use multiple `MartenRegistry` classes in larger applications if that is advantageous.
@@ -140,7 +140,7 @@ If there's some kind of customization you'd like to use attributes for that isn'
you're still in luck. If you write a subclass of the `MartenAttribute` shown below:
-
+
```cs
public abstract class MartenAttribute: Attribute
{
@@ -167,7 +167,7 @@ public abstract class MartenAttribute: Attribute
public virtual void Register(Type discoveredType, StoreOptions options){}
}
```
-snippet source | anchor
+snippet source | anchor
And when you decorate either classes or individual fields or properties on a document type, your custom attribute will be
@@ -178,7 +178,7 @@ As an example, an attribute to add a gin index to the JSONB storage for more eff
would look like this:
-
+
```cs
[AttributeUsage(AttributeTargets.Class)]
public class GinIndexedAttribute: MartenAttribute
@@ -189,7 +189,7 @@ public class GinIndexedAttribute: MartenAttribute
}
}
```
-snippet source | anchor
+snippet source | anchor
## Embedding Configuration in Document Types
@@ -199,7 +199,7 @@ and invoke that to let the document type make its own customizations for its sto
the unit tests:
-
+
```cs
public class ConfiguresItself
{
@@ -211,7 +211,7 @@ public class ConfiguresItself
}
}
```
-snippet source | anchor
+snippet source | anchor
The `DocumentMapping` type is the core configuration class representing how a document type is persisted or
@@ -222,7 +222,7 @@ You can optionally take in the more specific `DocumentMapping` for your docum
some convenience methods for indexing or duplicating fields that depend on .NET `Expression`s:
-
+
```cs
public class ConfiguresItselfSpecifically
{
@@ -235,7 +235,7 @@ public class ConfiguresItselfSpecifically
}
}
```
-snippet source | anchor
+snippet source | anchor
## Document Policies
diff --git a/docs/diagnostics.md b/docs/diagnostics.md
index a181ee05e3..21c8f515a9 100644
--- a/docs/diagnostics.md
+++ b/docs/diagnostics.md
@@ -44,7 +44,7 @@ All of the functionality in this section was added as part of Marten v0.8
Marten has a facility for listening to and even intercepting document persistence events with the `IDocumentSessionListener` interface:
-
+
```cs
public interface IChangeListener
{
@@ -106,7 +106,7 @@ public interface IDocumentSessionListener
void DocumentAddedForStorage(object id, object document);
}
```
-snippet source | anchor
+snippet source | anchor
You can build and inject your own listeners by adding them to the `StoreOptions` object you use to configure a `DocumentStore`:
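A minimal sketch of that registration (assuming a `MyCustomListener` of your own that implements `IDocumentSessionListener`):

```cs
var store = DocumentStore.For(opts =>
{
    opts.Connection("some connection string");

    // Applied to every IDocumentSession created by this store
    opts.Listeners.Add(new MyCustomListener());
});
```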
@@ -212,7 +212,7 @@ Listeners will never get activated during projection rebuilds to safe guard agai
A sample listener:
-
+
```cs
public class FakeListener: IChangeListener
{
@@ -237,12 +237,12 @@ public class FakeListener: IChangeListener
}
}
```
-snippet source | anchor
+snippet source | anchor
Wiring an Async Daemon listener:
-
+
```cs
var listener = new FakeListener();
StoreOptions(x =>
@@ -251,7 +251,7 @@ StoreOptions(x =>
x.Projections.AsyncListeners.Add(listener);
});
```
-snippet source | anchor
+snippet source | anchor
## Custom Logging
@@ -259,7 +259,7 @@ StoreOptions(x =>
Marten v0.8 comes with a new mechanism to plug in custom logging to the `IDocumentStore`, `IQuerySession`, and `IDocumentSession` activity:
-
+
```cs
/// <summary>
/// Records command usage, schema changes, and sessions within Marten
@@ -342,7 +342,7 @@ public interface IMartenSessionLogger
public void OnBeforeExecute(NpgsqlBatch batch);
}
```
-snippet source | anchor
+snippet source | anchor
To apply these logging abstractions, you can either plug your own `IMartenLogger` into the `StoreOptions` object and allow that default logger to create the individual session loggers:
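A rough sketch of that store-level registration, reusing the `ConsoleMartenLogger` shown further below:

```cs
var store = DocumentStore.For(opts =>
{
    opts.Connection("some connection string");

    // This logger also produces the default IMartenSessionLogger for each new session
    opts.Logger(new ConsoleMartenLogger());
});
```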
@@ -373,7 +373,7 @@ session.Logger = new RecordingLogger();
The session logging is a different abstraction specifically so that you _could_ track database commands issued per session. In effect, my own shop is going to use this capability to understand what HTTP endpoints or service bus message handlers are being unnecessarily chatty in their database interactions. We also hope that the contextual logging of commands per document session makes it easier to understand how our systems behave.
-
+
```cs
public class ConsoleMartenLogger: IMartenLogger, IMartenSessionLogger
{
@@ -473,7 +473,7 @@ public class ConsoleMartenLogger: IMartenLogger, IMartenSessionLogger
}
}
```
-snippet source | anchor
+snippet source | anchor
## Accessing Diagnostics
diff --git a/docs/documents/aspnetcore.md b/docs/documents/aspnetcore.md
index ac6f899098..9620044b12 100644
--- a/docs/documents/aspnetcore.md
+++ b/docs/documents/aspnetcore.md
@@ -97,7 +97,7 @@ that allow you to use Linq queries without the runtime overhead of continuously
Going back to the sample endpoint above where we write out an array of all the open issues, we can express the same query as a simple compiled query like this:
-
+
```cs
public class OpenIssues: ICompiledListQuery<Issue>
{
@@ -107,7 +107,7 @@ public class OpenIssues: ICompiledListQuery
}
}
```
-snippet source | anchor
+snippet source | anchor
And use that in an MVC Controller method like this:
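For instance, a controller action along these lines (the route and controller are illustrative):

```cs
public class OpenIssuesController: ControllerBase
{
    [HttpGet("/issues/open")]
    public async Task<IActionResult> GetOpenIssues([FromServices] IQuerySession session)
    {
        // Reuses the pre-compiled query plan instead of parsing the Linq expression on every call
        var issues = await session.QueryAsync(new OpenIssues());
        return Ok(issues);
    }
}
```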
@@ -131,7 +131,7 @@ sample, here's an example compiled query that reads a single `Issue` document by
id:
-
+
```cs
public class IssueById: ICompiledQuery<Issue>
{
@@ -143,7 +143,7 @@ public class IssueById: ICompiledQuery
public Guid Id { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
And the usage of that to write JSON directly to the `HttpContext` in a controller method:
diff --git a/docs/documents/concurrency.md b/docs/documents/concurrency.md
index e25ce1bb06..f5bbd753b7 100644
--- a/docs/documents/concurrency.md
+++ b/docs/documents/concurrency.md
@@ -26,7 +26,7 @@ as being revisioned
In Marten's case, you have to explicitly opt into optimistic versioning for each document type. You can do that with either an attribute on your document type like so:
-
+
```cs
[UseOptimisticConcurrency]
public class CoffeeShop: Shop
@@ -37,7 +37,7 @@ public class CoffeeShop: Shop
public ICollection<Guid> Employees { get; set; } = new List<Guid>();
}
```
-snippet source | anchor
+snippet source | anchor
Or by using Marten's configuration API to do it programmatically:
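A minimal sketch of the programmatic form:

```cs
var store = DocumentStore.For(opts =>
{
    opts.Connection("some connection string");

    // Same effect as the [UseOptimisticConcurrency] attribute above
    opts.Schema.For<CoffeeShop>().UseOptimisticConcurrency(true);
});
```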
@@ -112,7 +112,7 @@ Marten is throwing an `AggregateException` for the entire batch of changes.
A new feature in Marten V4 is the `IVersioned` marker interface. If your document type implements this interface as shown below:
-
+
```cs
public class MyVersionedDoc: IVersioned
{
@@ -120,7 +120,7 @@ public class MyVersionedDoc: IVersioned
public Guid Version { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
Your document type will have the optimistic concurrency checks applied to updates _when_ the current version is given to Marten. Moreover, the current version
@@ -143,7 +143,7 @@ You can opt into this behavior on a document by document basis by using the flue
like this:
-
+
```cs
using var store = DocumentStore.For(opts =>
{
@@ -154,7 +154,7 @@ using var store = DocumentStore.For(opts =>
opts.Schema.For().UseNumericRevisions(true);
});
```
-snippet source | anchor
+snippet source | anchor
or by implementing the `IRevisioned` interface in a document type:
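A minimal example of a document opting in through the interface (the `Reservation` type here is illustrative; `IRevisioned` exposes an integer `Version` member):

```cs
public class Reservation: IRevisioned
{
    public Guid Id { get; set; }

    // Marten maps the numeric revision to this member and checks it on updates
    public int Version { get; set; }
}
```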
diff --git a/docs/documents/deletes.md b/docs/documents/deletes.md
index 7e4078ed3f..b8e455fd1f 100644
--- a/docs/documents/deletes.md
+++ b/docs/documents/deletes.md
@@ -50,13 +50,13 @@ public Task DeleteByDocument(IDocumentSession session, User user)
Marten also provides the ability to delete any documents of a certain type meeting a Linq expression using the `IDocumentSession.DeleteWhere()` method:
-
+
```cs
theSession.DeleteWhere<Target>(x => x.Double == 578);
await theSession.SaveChangesAsync();
```
-snippet source | anchor
+snippet source | anchor
A couple things to note:
@@ -70,7 +70,7 @@ A couple things to note:
Documents of mixed or varying types can be deleted using the `IDocumentSession.DeleteObjects(IEnumerable documents)` method.
-
+
```cs
// Store a mix of different document types
var user1 = new User { FirstName = "Jamie", LastName = "Vaughan" };
@@ -89,7 +89,7 @@ using (var documentSession = theStore.LightweightSession())
await documentSession.SaveChangesAsync();
}
```
-snippet source | anchor
+snippet source | anchor
## Soft Deletes
@@ -104,7 +104,7 @@ documents marked as _deleted_ unless you explicitly state otherwise in the Linq
You can direct Marten to make a document type soft deleted by either marking the class with an attribute:
-
+
```cs
[SoftDeleted]
public class SoftDeletedDoc
@@ -112,7 +112,7 @@ public class SoftDeletedDoc
public Guid Id;
}
```
-snippet source | anchor
+snippet source | anchor
Or by using the fluent interface off of `StoreOptions`:
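A minimal sketch of the fluent registration:

```cs
var store = DocumentStore.For(opts =>
{
    opts.Connection("some connection string");

    // Same effect as the [SoftDeleted] attribute shown above
    opts.Schema.For<SoftDeletedDoc>().SoftDeleted();
});
```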
@@ -132,7 +132,7 @@ With Marten v4.0, you can also opt into soft-deleted mechanics by having your do
interface as shown below:
-
+
```cs
public class MySoftDeletedDoc: ISoftDeleted
{
@@ -146,7 +146,7 @@ public class MySoftDeletedDoc: ISoftDeleted
public DateTimeOffset? DeletedAt { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
More on `ISoftDeleted` in a later section on exposing soft-deleted metadata directly
@@ -156,7 +156,7 @@ Also starting in Marten v4.0, you can also say globally that you want all docume
to be soft-deleted unless explicitly configured otherwise like this:
-
+
```cs
internal void AllDocumentTypesShouldBeSoftDeleted()
{
@@ -167,7 +167,7 @@ internal void AllDocumentTypesShouldBeSoftDeleted()
});
}
```
-snippet source | anchor
+snippet source | anchor
### Querying a "Soft Deleted" Document Type
@@ -485,7 +485,7 @@ public async Task query_is_soft_deleted_since_docs()
_Neither `DeletedSince` nor `DeletedBefore` is an inclusive search, as shown below:_
-
+
```cs
internal void AllDocumentTypesShouldBeSoftDeleted()
{
@@ -496,7 +496,7 @@ internal void AllDocumentTypesShouldBeSoftDeleted()
});
}
```
-snippet source | anchor
+snippet source | anchor
### Undoing Soft-Deleted Documents
@@ -505,7 +505,7 @@ New in Marten v4.0 is a mechanism to mark any soft-deleted documents matching a
as not being deleted. The only usage so far is using a Linq expression as shown below:
-
+
```cs
internal Task UndoDeletion(IDocumentSession session, Guid userId)
{
@@ -516,7 +516,7 @@ internal Task UndoDeletion(IDocumentSession session, Guid userId)
return session.SaveChangesAsync();
}
```
-snippet source | anchor
+snippet source | anchor
### Explicit Hard Deletes
@@ -525,7 +525,7 @@ New in v4.0 is the ability to force Marten to perform hard deletes even on docum
that are normally soft-deleted:
-
+
```cs
internal void ExplicitlyHardDelete(IDocumentSession session, User document)
{
@@ -542,7 +542,7 @@ internal void ExplicitlyHardDelete(IDocumentSession session, User document)
// to actually perform the operations
}
```
-snippet source | anchor
+snippet source | anchor
### Deletion Metadata on Documents
@@ -552,7 +552,7 @@ and when it was deleted is to implement the `ISoftDeleted` interface as shown
in this sample document:
-
+
```cs
public class MySoftDeletedDoc: ISoftDeleted
{
@@ -566,7 +566,7 @@ public class MySoftDeletedDoc: ISoftDeleted
public DateTimeOffset? DeletedAt { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
Implementing `ISoftDeleted` on your document means that:
@@ -583,7 +583,7 @@ Now, if you don't want to couple your document types to Marten by implementing t
you're still in business. Let's say you have this document type:
-
+
```cs
public class ASoftDeletedDoc
{
@@ -595,7 +595,7 @@ public class ASoftDeletedDoc
public DateTimeOffset? DeletedWhen { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
You can manually -- and independently -- map the `IsDeleted` and `DeletedWhen` properties
diff --git a/docs/documents/execute-custom-sql.md b/docs/documents/execute-custom-sql.md
index ec0589e479..c522120e47 100644
--- a/docs/documents/execute-custom-sql.md
+++ b/docs/documents/execute-custom-sql.md
@@ -5,7 +5,7 @@ Use `QueueSqlCommand(string sql, params object[] parameterValues)` method to reg
`?` placeholders can be used to denote parameter values. Postgres [type casts `::`](https://www.postgresql.org/docs/15/sql-expressions.html#SQL-SYNTAX-TYPE-CASTS) can be applied to the parameter if needed. If the `?` character is not suitable as a placeholder because you need to use `?` in your sql query, you can change the placeholder by providing an alternative. Pass this in before the sql argument.
-
+
```cs
theSession.QueueSqlCommand("insert into names (name) values ('Jeremy')");
theSession.QueueSqlCommand("insert into names (name) values ('Babu')");
@@ -17,5 +17,5 @@ theSession.QueueSqlCommand("insert into data (raw_value) values (?::jsonb)", jso
// Use ^ as the parameter placeholder
theSession.QueueSqlCommand('^', "insert into data (raw_value) values (^::jsonb)", json);
```
-snippet source | anchor
+snippet source | anchor
diff --git a/docs/documents/full-text.md b/docs/documents/full-text.md
index f34873333b..b78ac1c836 100644
--- a/docs/documents/full-text.md
+++ b/docs/documents/full-text.md
@@ -296,13 +296,13 @@ var posts = session.Query()
They also allow you to specify the language (regConfig) of the text search query (`english` is used by default):
-
+
```cs
var posts = session.Query()
.Where(x => x.PhraseSearch("somefilter", "italian"))
.ToList();
```
-snippet source | anchor
+snippet source | anchor
## Partial text search in a multi-word text (NGram search)
diff --git a/docs/documents/hierarchies.md b/docs/documents/hierarchies.md
index 4c356c2e63..d514bb2eeb 100644
--- a/docs/documents/hierarchies.md
+++ b/docs/documents/hierarchies.md
@@ -80,21 +80,24 @@ public class Smurf: ISmurf
public interface IPapaSmurf: ISmurf
{
+ bool IsVillageLeader { get; set; }
}
public class PapaSmurf: Smurf, IPapaSmurf
{
+ public bool IsVillageLeader { get; set; }
+
+ public bool IsPapa { get; set; } = true;
}
public class PapySmurf: Smurf, IPapaSmurf
{
+ public bool IsVillageLeader { get; set; }
}
-public class BrainySmurf: PapaSmurf
-{
-}
+public class BrainySmurf: PapaSmurf;
```
-snippet source | anchor
+snippet source | anchor
If you wish to query over one of the hierarchy classes and be able to get all of its documents as well as those of its subclasses,
@@ -125,7 +128,7 @@ public query_with_inheritance(ITestOutputHelper output)
});
}
```
-snippet source | anchor
+snippet source | anchor
Note that if you wish to use aliases on certain subclasses, you could pass a `MappedType`, which contains the type to map
@@ -142,9 +145,10 @@ _.Schema.For()
typeof(PapySmurf),
typeof(IPapaSmurf),
typeof(BrainySmurf)
- );
+ )
+ .Duplicate(x => x.IsVillageLeader); // Put a duplicated index on subclass property;
```
-snippet source | anchor
+snippet source | anchor
Now you can query the "complex" hierarchy in the following ways:
@@ -239,6 +243,37 @@ public async Task get_all_subclasses_of_an_interface()
theSession.Query().Count().ShouldBe(3);
}
+
+[Fact]
+public async Task search_on_property_of_subclass()
+{
+ var smurf = new Smurf {Ability = "Follow the herd"};
+ var papa = new PapaSmurf {Ability = "Lead", IsVillageLeader = true };
+ var papy = new PapySmurf {Ability = "Lead"};
+ var brainy = new BrainySmurf {Ability = "Invent"};
+ theSession.Store(smurf, papa, brainy, papy);
+
+ await theSession.SaveChangesAsync();
+
+ (await theSession.Query().WhereSub(x => x.IsVillageLeader).CountAsync()).ShouldBe(1);
+}
+
+[Fact]
+public async Task search_on_property_of_subclass_and_parent()
+{
+ var smurf = new Smurf {Ability = "Follow the herd"};
+ var papa = new PapaSmurf {Ability = "Lead" };
+ var papy = new PapySmurf {Ability = "Lead"};
+ var brainy = new BrainySmurf {Ability = "Invent"};
+ theSession.Store(smurf, papa, brainy, papy);
+
+ await theSession.SaveChangesAsync();
+
+ (await theSession.Query()
+ .WhereSub(x => x.IsPapa)
+ .Where(x => x.Ability == "Invent")
+ .CountAsync()).ShouldBe(1);
+}
```
-snippet source | anchor
+snippet source | anchor
diff --git a/docs/documents/identity.md b/docs/documents/identity.md
index 7e06312976..d1635c20b6 100644
--- a/docs/documents/identity.md
+++ b/docs/documents/identity.md
@@ -55,7 +55,7 @@ the `[Identity]` attribute to force Marten to use a property or field as the ide
the "id" or "Id" or "ID" convention:
-
+
```cs
public class NonStandardDoc
{
@@ -63,7 +63,7 @@ public class NonStandardDoc
public string Name;
}
```
-snippet source | anchor
+snippet source | anchor
The identity property or field can also be configured through `StoreOptions` by using the `Schema` to obtain a document mapping:
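A rough sketch of that configuration (the `OverriddenIdDoc` type and its `SpecialKey` member are illustrative):

```cs
var store = DocumentStore.For(opts =>
{
    opts.Connection("some connection string");

    // Use SpecialKey as the identity member instead of the id/Id/ID convention
    opts.Schema.For<OverriddenIdDoc>().Identity(x => x.SpecialKey);
});
```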
@@ -183,7 +183,7 @@ var store = DocumentStore.For(_ =>
Marten 1.2 adds a convenience method to reset the "floor" of the Hilo sequence for a single document type:
-
+
```cs
var store = DocumentStore.For(opts =>
{
@@ -195,7 +195,7 @@ var store = DocumentStore.For(opts =>
// type to 2500
await store.Tenancy.Default.Database.ResetHiloSequenceFloor(2500);
```
-snippet source | anchor
+snippet source | anchor
This functionality was added specifically to aid in importing data from an existing data source. Do note that this functionality simply guarantees
@@ -218,14 +218,14 @@ so you will not be able to use any kind of punctuation characters or spaces.
Let's say you have a document type with a `string` for the identity member like this one:
-
+
```cs
public class DocumentWithStringId
{
public string Id { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
You can use the "identity key" option for identity generation that would create string values of the pattern `[type alias]/[sequence]` where the type alias is typically the document class name in all lower case and the sequence is a _HiLo_ sequence number.
@@ -233,7 +233,7 @@ You can use the "identity key" option for identity generation that would create
You can opt into the _identity key_ strategy for identity and even override the document alias name with this syntax:
-
+
```cs
var store = DocumentStore.For(opts =>
{
@@ -243,7 +243,7 @@ var store = DocumentStore.For(opts =>
.DocumentAlias("doc");
});
```
-snippet source | anchor
+snippet source | anchor
## Custom Identity Strategies
@@ -413,7 +413,7 @@ As you might infer -- or not -- there's a couple rules and internal behavior:
For another example, here's a usage of an `int` wrapped identifier:
-
+
```cs
[StronglyTypedId(Template.Int)]
public readonly partial struct Order2Id;
@@ -424,7 +424,7 @@ public class Order2
public string Name { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
::: warning
diff --git a/docs/documents/indexing/duplicated-fields.md b/docs/documents/indexing/duplicated-fields.md
index 2883b0b20e..4f1e4e372e 100644
--- a/docs/documents/indexing/duplicated-fields.md
+++ b/docs/documents/indexing/duplicated-fields.md
@@ -35,7 +35,7 @@ public class Employee
Or by using the fluent interface off of `StoreOptions`:
-
+
```cs
var store = DocumentStore.For(options =>
{
@@ -73,7 +73,7 @@ var store = DocumentStore.For(options =>
});
});
```
-snippet source | anchor
+snippet source | anchor
In the case above, Marten would add extra columns named `first_name` and `department` to the generated `mt_doc_user` table. Some users find duplicated fields to be useful for user-supplied SQL queries.
diff --git a/docs/documents/indexing/foreign-keys.md b/docs/documents/indexing/foreign-keys.md
index 7e4dd7959b..7cf33e70bc 100644
--- a/docs/documents/indexing/foreign-keys.md
+++ b/docs/documents/indexing/foreign-keys.md
@@ -8,7 +8,7 @@ One of our sample document types in Marten is the `Issue` class that has
a couple of properties that link to the ids of related `User` documents:
-
+
```cs
public class Issue
{
@@ -33,7 +33,7 @@ public class Issue
public string Status { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
If I want to enforce referential integrity between the `Issue` document and the `User` documents,
diff --git a/docs/documents/indexing/gin-gist-indexes.md b/docs/documents/indexing/gin-gist-indexes.md
index e2bda22945..314d2a4c4f 100644
--- a/docs/documents/indexing/gin-gist-indexes.md
+++ b/docs/documents/indexing/gin-gist-indexes.md
@@ -6,7 +6,7 @@ To optimize a wider range of ad-hoc queries against the document JSONB, you can
the JSON field in the database:
-
+
```cs
var store = DocumentStore.For(options =>
{
@@ -44,7 +44,7 @@ var store = DocumentStore.For(options =>
});
});
```
-snippet source | anchor
+snippet source | anchor
**Marten may be changed to make the GIN index on the data field automatic in the future.**
diff --git a/docs/documents/indexing/ignore-indexes.md b/docs/documents/indexing/ignore-indexes.md
index 1464a1ae7a..c43e93c2ba 100644
--- a/docs/documents/indexing/ignore-indexes.md
+++ b/docs/documents/indexing/ignore-indexes.md
@@ -3,7 +3,7 @@
Any custom index on a Marten-defined document table added outside of Marten can potentially cause issues with Marten's schema migration detection and delta computation. Marten provides a mechanism to ignore those indexes using `IgnoreIndex(string indexName)`.
-
+
```cs
var store = DocumentStore.For(opts =>
{
@@ -11,5 +11,5 @@ var store = DocumentStore.For(opts =>
opts.Schema.For().IgnoreIndex("foo");
});
```
-snippet source | anchor
+snippet source | anchor
diff --git a/docs/documents/indexing/metadata-indexes.md b/docs/documents/indexing/metadata-indexes.md
index 8223effab5..e7afcc8869 100644
--- a/docs/documents/indexing/metadata-indexes.md
+++ b/docs/documents/indexing/metadata-indexes.md
@@ -52,7 +52,7 @@ public class TenantIdIndexCustomer
Or by using the fluent interface:
-
+
```cs
DocumentStore.For(_ =>
{
@@ -60,7 +60,7 @@ DocumentStore.For(_ =>
_.Schema.For().IndexTenantId();
});
```
-snippet source | anchor
+snippet source | anchor
## Soft Delete
@@ -69,7 +69,7 @@ If using the [soft deletes](/documents/deletes) functionality you can ask Marten
to create a partial index on the deleted documents either using `SoftDeletedAttribute`:
-
+
```cs
[SoftDeleted(Indexed = true)]
public class IndexedSoftDeletedDoc
@@ -77,7 +77,7 @@ public class IndexedSoftDeletedDoc
public Guid Id;
}
```
-snippet source | anchor
+snippet source | anchor
Or by using the fluent interface:
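A minimal sketch of the fluent form:

```cs
var store = DocumentStore.For(opts =>
{
    opts.Connection("some connection string");

    // Equivalent to decorating the document with [SoftDeleted(Indexed = true)]
    opts.Schema.For<IndexedSoftDeletedDoc>().SoftDeletedWithIndex();
});
```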
diff --git a/docs/documents/indexing/unique.md b/docs/documents/indexing/unique.md
index d76478eccd..99dd19bc82 100644
--- a/docs/documents/indexing/unique.md
+++ b/docs/documents/indexing/unique.md
@@ -231,7 +231,7 @@ var store = DocumentStore.For(_ =>
The same can be configured for a Duplicated Field:
-
+
```cs
var store = DocumentStore.For(options =>
{
@@ -269,7 +269,7 @@ var store = DocumentStore.For(options =>
});
});
```
-snippet source | anchor
+snippet source | anchor
## Unique Index per Tenant
diff --git a/docs/documents/initial-data.md b/docs/documents/initial-data.md
index 3b17881179..ec0c30f595 100644
--- a/docs/documents/initial-data.md
+++ b/docs/documents/initial-data.md
@@ -77,7 +77,7 @@ We think it's common that you'll use the `IInitialData` mechanism strictly for t
a set of baseline data for testing that lives in your test project:
-
+
```cs
public class MyTestingData: IInitialData
{
@@ -88,7 +88,7 @@ public class MyTestingData: IInitialData
}
}
```
-snippet source | anchor
+snippet source | anchor
Now, you'd like to use your exact application Marten configuration, but only for testing, add the `MyTestingData` initial data
@@ -96,7 +96,7 @@ set to the application's Marten configuration. You can do that as of Marten v5.1
methods as shown in a sample below for a testing project:
-
+
```cs
// Use the configured host builder for your application
// by calling the Program.CreateHostBuilder() method from
@@ -126,5 +126,5 @@ var store = host.Services.GetRequiredService();
// MyTestingData:
await store.Advanced.ResetAllData();
```
-snippet source | anchor
+snippet source | anchor
diff --git a/docs/documents/metadata.md b/docs/documents/metadata.md
index 5ece1b4fe9..9e40810ebd 100644
--- a/docs/documents/metadata.md
+++ b/docs/documents/metadata.md
@@ -103,7 +103,7 @@ type to a metadata value individually. Let's say that you have a document type l
this where you want to track metadata:
-
+
```cs
public class DocWithMetadata
{
@@ -116,7 +116,7 @@ public class DocWithMetadata
public bool IsDeleted { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
To enable the Marten mapping to metadata values, use this syntax:
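A rough sketch of that configuration, assuming the metadata fluent interface off of `Schema.For<T>()`:

```cs
var store = DocumentStore.For(opts =>
{
    opts.Connection("some connection string");

    opts.Schema.For<DocWithMetadata>().Metadata(metadata =>
    {
        // Copy the soft-delete flag into the document's own IsDeleted property
        metadata.IsSoftDeleted.MapTo(x => x.IsDeleted);
    });
});
```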
@@ -149,7 +149,7 @@ For correlation, causation, and last modified tracking, an easy way to do this i
just implement the Marten `ITracked` interface as shown below:
-
+
```cs
public class MyTrackedDoc: ITracked
{
@@ -159,7 +159,7 @@ public class MyTrackedDoc: ITracked
public string LastModifiedBy { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
If your document type implements this interface, Marten will automatically enable the correlation and causation tracking, and set values for correlation, causation, and the last modified data on documents anytime they are loaded or persisted by Marten.
@@ -168,7 +168,7 @@ Likewise, version tracking directly on the document is probably easiest with the
interface as shown below:
-
+
```cs
public class MyVersionedDoc: IVersioned
{
@@ -176,7 +176,7 @@ public class MyVersionedDoc: IVersioned
public Guid Version { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
Implementing `IVersioned` will automatically opt your document type into optimistic concurrency
@@ -187,7 +187,7 @@ checking with mapping of the current version to the `IVersioned.Version` propert
If you want Marten to run lean, you can omit all metadata fields from Marten with this configuration:
-
+
```cs
var store = DocumentStore.For(opts =>
{
@@ -198,7 +198,7 @@ var store = DocumentStore.For(opts =>
opts.Policies.DisableInformationalFields();
});
```
-snippet source | anchor
+snippet source | anchor
## Querying by Last Modified
diff --git a/docs/documents/multi-tenancy.md b/docs/documents/multi-tenancy.md
index 723c618df6..1b29741349 100644
--- a/docs/documents/multi-tenancy.md
+++ b/docs/documents/multi-tenancy.md
@@ -484,7 +484,7 @@ To exempt document types from having partitioned tables, such as for tables you
even harm by partitioning, you can use either an attribute on the document type:
-
+
```cs
[DoNotPartition]
public class DocThatShouldBeExempted1
@@ -492,7 +492,7 @@ public class DocThatShouldBeExempted1
public Guid Id { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
or exempt a single document type through the fluent interface:
diff --git a/docs/documents/querying/batched-queries.md b/docs/documents/querying/batched-queries.md
index 40d3133b2d..c23763d406 100644
--- a/docs/documents/querying/batched-queries.md
+++ b/docs/documents/querying/batched-queries.md
@@ -62,7 +62,7 @@ As of v0.8.10, Marten allows you to incorporate [compiled queries](/documents/qu
Say you have a compiled query that finds the first user with a given first name:
-
+
```cs
public class FindByFirstName: ICompiledQuery<User, User>
{
@@ -74,7 +74,7 @@ public class FindByFirstName: ICompiledQuery
}
}
```
-snippet source | anchor
+snippet source | anchor
To use that compiled query class in a batch query, you simply use the `IBatchedQuery.Query(ICompiledQuery)` syntax shown below:
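A rough sketch of that usage (the user documents and result handling are illustrative):

```cs
var batch = theSession.CreateBatchQuery();

// Both compiled queries ride along in the same database round trip
var justin = batch.Query(new FindByFirstName { FirstName = "Justin" });
var jeremy = batch.Query(new FindByFirstName { FirstName = "Jeremy" });

await batch.Execute();

var justinUser = await justin;
var jeremyUser = await jeremy;
```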
diff --git a/docs/documents/querying/compiled-queries.md b/docs/documents/querying/compiled-queries.md
index dd87e9e1a2..bed2eb8a37 100644
--- a/docs/documents/querying/compiled-queries.md
+++ b/docs/documents/querying/compiled-queries.md
@@ -27,20 +27,20 @@ Fortunately, Marten supports the concept of a _Compiled Query_ that you can use
All compiled queries are classes that implement the `ICompiledQuery` interface shown below:
-
+
```cs
public interface ICompiledQuery<TDoc, TOut> : ICompiledQueryMarker where TDoc: notnull
{
Expression<Func<IMartenQueryable<TDoc>, TOut>> QueryIs();
}
```
-snippet source | anchor
+snippet source | anchor
In its simplest usage, let's say that we want to find the first user document with a certain first name. That class would look like this:
-
+
```cs
public class FindByFirstName: ICompiledQuery<User, User>
{
@@ -52,7 +52,7 @@ public class FindByFirstName: ICompiledQuery
}
}
```
-snippet source | anchor
+snippet source | anchor
::: tip
@@ -173,19 +173,19 @@ To query for multiple results, you need to just return the raw `IQueryable` a
If you are selecting the whole document without any kind of `Select()` transform, you can use this interface:
-
+
```cs
public interface ICompiledListQuery<TDoc>: ICompiledListQuery<TDoc, TDoc> where TDoc : notnull
{
}
```
-snippet source | anchor
+snippet source | anchor
A sample usage of this type of query is shown below:
-
+
```cs
public class UsersByFirstName: ICompiledListQuery<User>
{
@@ -198,25 +198,25 @@ public class UsersByFirstName: ICompiledListQuery
}
}
```
-snippet source | anchor
+snippet source | anchor
If you do want to use a `Select()` transform, use this interface:
-
+
```cs
public interface ICompiledListQuery<TDoc, TOut>: ICompiledQuery<TDoc, IEnumerable<TOut>> where TDoc : notnull
{
}
```
-snippet source | anchor
+snippet source | anchor
A sample usage of this type of query is shown below:
-
+
```cs
public class UserNamesForFirstName: ICompiledListQuery<User, string>
{
@@ -230,7 +230,7 @@ public class UserNamesForFirstName: ICompiledListQuery
public string FirstName { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
## Querying for Related Documents with Include()
@@ -426,19 +426,19 @@ we handle Include queries.
If you are querying for a single document with no transformation, you can use this interface as a convenience:
-
+
```cs
public interface ICompiledQuery<TDoc>: ICompiledQuery<TDoc, TDoc> where TDoc : notnull
{
}
```
-snippet source | anchor
+snippet source | anchor
And an example:
-
+
```cs
public class FindUserByAllTheThings: ICompiledQuery<User>
{
@@ -455,7 +455,7 @@ public class FindUserByAllTheThings: ICompiledQuery
}
}
```
-snippet source | anchor
+snippet source | anchor
## Querying for Multiple Results as JSON
@@ -463,7 +463,7 @@ public class FindUserByAllTheThings: ICompiledQuery
To query for multiple results and have them returned as a Json string, you may run any query on your `IQueryable` (be it ordering or filtering) and then simply finalize the query with `ToJsonArray();` like so:
-
+
```cs
public class FindJsonOrderedUsersByUsername: ICompiledListQuery
{
@@ -477,7 +477,7 @@ public class FindJsonOrderedUsersByUsername: ICompiledListQuery
}
}
```
-snippet source | anchor
+snippet source | anchor
If you wish to do it asynchronously, you can use the `ToJsonArrayAsync()` method.
@@ -485,7 +485,7 @@ If you wish to do it asynchronously, you can use the `ToJsonArrayAsync()` method
A sample usage of this type of query is shown below:
-
+
```cs
public class FindJsonOrderedUsersByUsername: ICompiledListQuery
{
@@ -499,7 +499,7 @@ public class FindJsonOrderedUsersByUsername: ICompiledListQuery
}
}
```
-snippet source | anchor
+snippet source | anchor
Note that the result has the documents comma-separated and wrapped in square brackets (as per JSON array notation).
@@ -509,7 +509,7 @@ Note that the result has the documents comma separated and wrapped in angle brac
Finally, if you are querying for a single document as json, you will need to prepend your call to `Single()`, `First()` and so on with a call to `AsJson()`:
-
+
```cs
public class FindJsonUserByUsername: ICompiledQuery
{
@@ -522,13 +522,13 @@ public class FindJsonUserByUsername: ICompiledQuery
}
}
```
-snippet source | anchor
+snippet source | anchor
And an example:
-
+
```cs
public class FindJsonUserByUsername: ICompiledQuery
{
@@ -541,7 +541,7 @@ public class FindJsonUserByUsername: ICompiledQuery
}
}
```
-snippet source | anchor
+snippet source | anchor
(our `ToJson()` method simply returns a string representation of the `User` instance in Json notation)
@@ -554,7 +554,7 @@ object to collect the total number of rows in the database when the query is exe
from the Marten tests:
-
+
```cs
public class TargetsInOrder: ICompiledListQuery<Target>
{
@@ -572,13 +572,13 @@ public class TargetsInOrder: ICompiledListQuery
}
}
```
-snippet source | anchor
+snippet source | anchor
And when used in the actual test:
-
+
```cs
[Fact]
public async Task use_compiled_query_with_statistics()
@@ -596,7 +596,7 @@ public async Task use_compiled_query_with_statistics()
query.Statistics.TotalResults.ShouldBe(100);
}
```
-snippet source | anchor
+snippet source | anchor
## Query Plans
@@ -686,7 +686,7 @@ public static async Task use_query_plan(IQuerySession session, CancellationToken
There is also a similar interface for usage with [batch querying](/documents/querying/batched-queries):
-
+
```cs
/// <summary>
/// Marten's concept of the "Specification" pattern for reusable
@@ -698,7 +698,7 @@ public interface IBatchQueryPlan
Task<T> Fetch(IBatchedQuery query);
}
```
-snippet source | anchor
+snippet source | anchor
And because we expect this to be very common, there is convenience base class named `QueryListPlan` for querying lists of `T` data that can be used for both querying directly against an `IQuerySession` and for batch querying. The usage within a batched query is shown below from the Marten tests:
diff --git a/docs/documents/querying/linq/extending.md b/docs/documents/querying/linq/extending.md
index c35b7b470b..87585c6b63 100644
--- a/docs/documents/querying/linq/extending.md
+++ b/docs/documents/querying/linq/extending.md
@@ -9,7 +9,7 @@ Using the (admittedly contrived) example from Marten's tests, say that you want
different queries for "IsBlue()." First, write the method you want to be recognized by Marten's Linq support:
-
+
```cs
public class IsBlue: IMethodCallParser
{
@@ -29,7 +29,7 @@ public class IsBlue: IMethodCallParser
}
}
```
-snippet source | anchor
+snippet source | anchor
Note a couple things here:
diff --git a/docs/documents/querying/linq/sql.md b/docs/documents/querying/linq/sql.md
index 25b4fde6f1..edc1802753 100644
--- a/docs/documents/querying/linq/sql.md
+++ b/docs/documents/querying/linq/sql.md
@@ -26,7 +26,7 @@ public async Task query_with_matches_sql()
Older version of Marten also offer the `MatchesJsonPath()` method which uses the `^` character as a placeholder. This will continue to be supported.
-
+
```cs
var results2 = await theSession
.Query().Where(x => x.MatchesSql('^', "d.data @? '$ ? (@.Children[*] == null || @.Children[*].size() == 0)'"))
@@ -37,5 +37,5 @@ var results3 = await theSession
.Query().Where(x => x.MatchesJsonPath("d.data @? '$ ? (@.Children[*] == null || @.Children[*].size() == 0)'"))
.ToListAsync();
```
-snippet source | anchor
+snippet source | anchor
diff --git a/docs/documents/querying/linq/strings.md b/docs/documents/querying/linq/strings.md
index 2720900aed..767d62b23e 100644
--- a/docs/documents/querying/linq/strings.md
+++ b/docs/documents/querying/linq/strings.md
@@ -40,12 +40,12 @@ public void case_insensitive_string_fields(IDocumentSession session)
A shorthand for case-insensitive string matching is provided through `EqualsIgnoreCase` (string extension method in *Baseline*):
-
+
```cs
query.Query().Single(x => x.UserName.EqualsIgnoreCase("abc")).Id.ShouldBe(user1.Id);
query.Query().Single(x => x.UserName.EqualsIgnoreCase("aBc")).Id.ShouldBe(user1.Id);
```
-snippet source | anchor
+snippet source | anchor
This defaults to `String.Equals` with `StringComparison.CurrentCultureIgnoreCase` as comparison type.
diff --git a/docs/documents/querying/query-json.md b/docs/documents/querying/query-json.md
index eef831d3a1..f676e0c20d 100644
--- a/docs/documents/querying/query-json.md
+++ b/docs/documents/querying/query-json.md
@@ -93,7 +93,7 @@ public async Task when_get_json_then_raw_json_should_be_returned_async()
Marten has the ability to apply the `AsJson()` mechanics to the result of a `Select()` transform:
-
+
```cs
var json = await theSession
.Query()
@@ -105,13 +105,13 @@ var json = await theSession
json.ShouldBe("{\"Name\": \"Bill\"}");
```
-snippet source | anchor
+snippet source | anchor
And another example, but this time transforming to an anonymous type:
-
+
```cs
(await theSession
.Query()
@@ -124,5 +124,5 @@ And another example, but this time transforming to an anonymous type:
.ToJsonFirstOrDefault())
.ShouldBe("{\"Name\": \"Bill\"}");
```
-snippet source | anchor
+snippet source | anchor
diff --git a/docs/documents/sessions.md b/docs/documents/sessions.md
index 44d0ed8719..51bcea7419 100644
--- a/docs/documents/sessions.md
+++ b/docs/documents/sessions.md
@@ -58,7 +58,7 @@ for reading. The `IServiceCollection.AddMarten()` configuration will set up a DI
`IQuerySession`, so you can inject that into classes like this sample MVC controller:
-
+
```cs
public class GetIssueController: ControllerBase
{
@@ -83,7 +83,7 @@ public class GetIssueController: ControllerBase
}
```
-snippet source | anchor
+snippet source | anchor
If you have an `IDocumentStore` object though, you can open a query session like this:
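A minimal sketch of that usage (the `userId` value is assumed to be a known identity):

```cs
// Open a read-only query session straight off the store
await using var session = store.QuerySession();

var user = await session.LoadAsync<User>(userId);
```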
@@ -316,7 +316,7 @@ By default, Marten just uses the underlying timeout configuration from the [Npgs
You can, though, opt to set a different command timeout per session with this syntax:
-
+
```cs
public void ConfigureCommandTimeout(IDocumentStore store)
{
@@ -327,7 +327,7 @@ public void ConfigureCommandTimeout(IDocumentStore store)
}
}
```
-snippet source | anchor
+snippet source | anchor
## Unit of Work Mechanics
diff --git a/docs/documents/storage.md b/docs/documents/storage.md
index 2e712f146a..3603140c54 100644
--- a/docs/documents/storage.md
+++ b/docs/documents/storage.md
@@ -47,7 +47,7 @@ var store = DocumentStore.For(opts =>
Or by using an attribute on your document type:
-
+
```cs
[DatabaseSchemaName("organization")]
public class Customer
@@ -55,7 +55,7 @@ public class Customer
[Identity] public string Name { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
## Type Aliases
diff --git a/docs/documents/storing.md b/docs/documents/storing.md
index 77b69f51bd..6dcecda39e 100644
--- a/docs/documents/storing.md
+++ b/docs/documents/storing.md
@@ -12,7 +12,7 @@ a previously persisted document with the same identity. Here's that method in ac
with a sample that shows storing both a brand new document and a modified document:
-
+
```cs
using var store = DocumentStore.For("some connection string");
@@ -35,7 +35,7 @@ session.Store(newUser, existingUser);
await session.SaveChangesAsync();
```
-snippet source | anchor
+snippet source | anchor
The `Store()` method can happily take a mixed bag of document types at one time, but you'll need to tell Marten to use `Store<object>()` instead of letting it infer the document type, as shown below:
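A rough illustration with the sample `User` and `Issue` documents (the property values are made up):

```cs
await using var session = store.LightweightSession();

// Force the object overload so Marten doesn't try to infer a single document type
session.Store<object>(new User { FirstName = "Hawkeye" }, new Issue { Title = "Lost arrow" });

await session.SaveChangesAsync();
```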
@@ -132,7 +132,7 @@ theSession.Query().Count().ShouldBe(data.Length);
By default, bulk insert will fail if there are any duplicate ids between the documents being inserted and the existing database data. You can alter this behavior through the `BulkInsertMode` enumeration as shown below:
-
+
```cs
// Just say we have an array of documents we want to bulk insert
var data = Target.GenerateRandomData(100).ToArray();
@@ -151,7 +151,7 @@ await store.BulkInsertDocumentsAsync(data, BulkInsertMode.InsertsOnly);
// being loaded
await store.BulkInsertDocumentsAsync(data, BulkInsertMode.OverwriteExisting);
```
-snippet source | anchor
+snippet source | anchor
The bulk insert feature can also be used with multi-tenanted documents, but in that
@@ -159,7 +159,7 @@ case you are limited to only loading documents to a single tenant at a time as
shown below:
-
+
```cs
// Just say we have an array of documents we want to bulk insert
var data = Target.GenerateRandomData(100).ToArray();
@@ -173,5 +173,5 @@ using var store = DocumentStore.For(opts =>
// If multi-tenanted
await store.BulkInsertDocumentsAsync("a tenant id", data);
```
-snippet source | anchor
+snippet source | anchor
diff --git a/docs/events/appending.md b/docs/events/appending.md
index 019c0bc6af..5e30fcc369 100644
--- a/docs/events/appending.md
+++ b/docs/events/appending.md
@@ -159,7 +159,7 @@ is present in the database:
To make the stream type markers mandatory, you can use this flag in the configuration:
-
+
```cs
var builder = Host.CreateApplicationBuilder();
builder.Services.AddMarten(opts =>
@@ -171,7 +171,7 @@ builder.Services.AddMarten(opts =>
opts.Events.UseMandatoryStreamTypeDeclaration = true;
});
```
-snippet source | anchor
+snippet source | anchor
This causes a couple side effects that **force stricter usage of Marten**:
@@ -250,7 +250,7 @@ perfectly safe to delete tombstone events from your database:
value from the `mt_event_progression` table or through this API call:
-
+
```cs
public static async Task ShowDaemonDiagnostics(IDocumentStore store)
{
@@ -269,5 +269,5 @@ public static async Task ShowDaemonDiagnostics(IDocumentStore store)
Console.WriteLine($"The daemon high water sequence mark is {daemonHighWaterMark}");
}
```
-snippet source | anchor
+snippet source | anchor
diff --git a/docs/events/archiving.md b/docs/events/archiving.md
index fc28338368..b92534beef 100644
--- a/docs/events/archiving.md
+++ b/docs/events/archiving.md
@@ -137,7 +137,7 @@ Let's try to make this concrete by building a simple order processing system tha
aggregate:
-
+
```cs
public class Item
{
@@ -175,14 +175,14 @@ public class Order
}
}
```
-snippet source | anchor
+snippet source | anchor
Next, let's say we're having the `Order` aggregate snapshotted so that it's updated every time new events
are captured like so:
-
+
```cs
var builder = Host.CreateApplicationBuilder();
builder.Services.AddMarten(opts =>
@@ -203,7 +203,7 @@ builder.Services.AddMarten(opts =>
// need that tracking at runtime
.UseLightweightSessions();
```
-snippet source | anchor
+snippet source | anchor
Now, let's say as a way to keep our application performing as well as possible, we'd like to be aggressive about archiving
diff --git a/docs/events/compacting.md b/docs/events/compacting.md
index 79b8b9d281..36be8abf82 100644
--- a/docs/events/compacting.md
+++ b/docs/events/compacting.md
@@ -59,7 +59,7 @@ There's not yet any default archiver, but we're open to suggestions about what t
an implementation of this interface:
-
+
```cs
///
/// Callback interface for executing event archiving
@@ -70,7 +70,7 @@ public interface IEventsArchiver
CancellationToken cancellation);
}
```
-snippet source | anchor
+snippet source | anchor
By default, Marten is *not* archiving events in this operation.
diff --git a/docs/events/metadata.md b/docs/events/metadata.md
index 9bed4dca12..399c63baa4 100644
--- a/docs/events/metadata.md
+++ b/docs/events/metadata.md
@@ -8,7 +8,7 @@ for causation, correlation, user names, and key/value headers with this syntax a
Marten:
-
+
```cs
var store = DocumentStore.For(opts =>
{
@@ -22,7 +22,7 @@ var store = DocumentStore.For(opts =>
opts.Events.MetadataConfig.UserNameEnabled = true;
});
```
-snippet source | anchor
+snippet source | anchor
By default, Marten runs "lean" by omitting the extra metadata storage on events shown above. Causation, correlation, user name (last modified by), and header fields must be individually enabled.
diff --git a/docs/events/multitenancy.md b/docs/events/multitenancy.md
index b0956d1836..61fbc53f80 100644
--- a/docs/events/multitenancy.md
+++ b/docs/events/multitenancy.md
@@ -29,7 +29,7 @@ be global within your system.
Let's start with a possible implementation of a single stream projection:
-
+
```cs
public class SpecialCounterProjection: SingleStreamProjection
{
@@ -40,13 +40,13 @@ public class SpecialCounterProjection: SingleStreamProjection
-snippet source | anchor
+snippet source | anchor
Or this equivalent, but see how I'm explicitly registering event types, because that's going to be important:
-
+
```cs
public class SpecialCounterProjection2: SingleStreamProjection
{
@@ -90,7 +90,7 @@ public class SpecialCounterProjection2: SingleStreamProjection
-snippet source | anchor
+snippet source | anchor
And finally, let's register our projection within our application's bootstrapping:
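A sketch of what that registration could look like, assuming the projection runs with the `Inline` lifecycle (the actual sample may choose a different lifecycle):

```cs
// Sketch only: the connection string name and the Inline lifecycle are assumptions
var builder = Host.CreateApplicationBuilder();
builder.Services.AddMarten(opts =>
{
    opts.Connection(builder.Configuration.GetConnectionString("marten"));

    // Register the single stream projection shown above
    opts.Projections.Add<SpecialCounterProjection>(ProjectionLifecycle.Inline);
});

using var host = builder.Build();
await host.StartAsync();
```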
diff --git a/docs/events/projections/aggregate-projections.md b/docs/events/projections/aggregate-projections.md
index 1f5f7519d9..12f627e1dc 100644
--- a/docs/events/projections/aggregate-projections.md
+++ b/docs/events/projections/aggregate-projections.md
@@ -13,7 +13,7 @@ aggregated document representing the state of those events. To jump into a simpl
view called `QuestParty` that creates an aggregated view of `MembersJoined`, `MembersDeparted`, and `QuestStarted` events related to a group of heroes traveling on a quest in your favorite fantasy novel:
-
+
```cs
public sealed record QuestParty(Guid Id, List<string> Members)
{
@@ -38,7 +38,7 @@ public sealed record QuestParty(Guid Id, List Members)
};
}
```
-snippet source | anchor
+snippet source | anchor
Once again, here's the class diagram of the key projection types inside of Marten, but please note the `SingleStreamProjection`:
@@ -77,7 +77,7 @@ The easiest type of aggregate to create is a document that rolls up the state of
document that directly mutates itself through method conventions or by sub-classing the `SingleStreamProjection` class like this sample for a fictional `Trip` aggregate document:
-
+
```cs
public class TripProjection: SingleStreamProjection
{
@@ -113,7 +113,7 @@ public class TripProjection: SingleStreamProjection
}
}
```
-snippet source | anchor
+snippet source | anchor
And register that projection like this:
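A sketch of registering it against a `DocumentStore`, here assuming the `Inline` lifecycle:

```cs
// Sketch only; the connection string and the lifecycle choice are assumptions
var store = DocumentStore.For(opts =>
{
    opts.Connection("some connection string");

    // Run the TripProjection within the same transaction as event capture
    opts.Projections.Add<TripProjection>(ProjectionLifecycle.Inline);
});
```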
@@ -218,7 +218,7 @@ document type -- which doesn't have to be public by the way.
You can also use a constructor that takes an event type as shown in this sample of a `Trip` stream aggregation:
-
+
```cs
public class Trip
{
@@ -268,13 +268,13 @@ public class Trip
internal bool ShouldDelete(VacationOver e) => Traveled > 1000;
}
```
-snippet source | anchor
+snippet source | anchor
Or finally, you can use a method named `Create()` on a projection type as shown in this sample:
-
+
```cs
public class TripProjection: SingleStreamProjection
{
@@ -310,7 +310,7 @@ public class TripProjection: SingleStreamProjection
}
}
```
-snippet source | anchor
+snippet source | anchor
The `Create()` method has to return either the aggregate document type or `Task<T>` where `T` is the aggregate document type. There must be an argument for the specific event type or `IEvent<T>` where `T` is the event type if you need access to event metadata. You can also take in an `IQuerySession` if you need to look up additional data as part of the transformation or `IEvent<T>` in addition to the exact event type just to get at event metadata.
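As a quick illustration of those `Create()` signatures, here is a hedged sketch using entirely made-up types (`Booking`, `BookingStarted`, `BookingImported`, and `ImportSource` are not part of the real samples):

```cs
// Sketch only: all of the domain types below are hypothetical
public class BookingProjection: SingleStreamProjection<Booking>
{
    // Wrap the event in IEvent<T> to reach metadata such as the stream id and timestamp
    public Booking Create(IEvent<BookingStarted> e)
    {
        return new Booking { Id = e.StreamId, StartedAt = e.Timestamp };
    }

    // Return Task<Booking> and take in IQuerySession to look up additional data
    public async Task<Booking> Create(BookingImported imported, IQuerySession session)
    {
        var source = await session.LoadAsync<ImportSource>(imported.SourceId);
        return new Booking { Id = imported.BookingId, Source = source?.Name };
    }
}
```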
@@ -325,7 +325,7 @@ Marten will apply all those event types that can be cast to the interface or abs
To make changes to an existing aggregate, you can either use inline Lambda functions per event type with one of the overloads of `ProjectEvent()`:
-
+
```cs
public class TripProjection: SingleStreamProjection
{
@@ -350,13 +350,13 @@ public class TripProjection: SingleStreamProjection
}
}
```
-snippet source | anchor
+snippet source | anchor
I'm not personally that wild about using lots of inline Lambdas like the example above, and to that end, Marten now supports the `Apply()` method convention. Here's the same `TripProjection`, but this time using methods to mutate the `Trip` document:
-
+
```cs
public class TripProjection: SingleStreamProjection
{
@@ -392,7 +392,7 @@ public class TripProjection: SingleStreamProjection
}
}
```
-snippet source | anchor
+snippet source | anchor
The `Apply()` methods can accept any combination of these arguments:
@@ -511,7 +511,7 @@ Additionally, `ShouldDelete()` methods should return either a `Boolean` or `Task
You can use the `SingleStreamProjection` method conventions for stream aggregations, by which we just mean an aggregate document type that implements its own `Apply()` or `ShouldDelete()` methods to mutate itself. Using that concept, let's take the `TripProjection` we have been using and apply that instead to a `Trip` type:
-
+
```cs
public class Trip
{
@@ -561,7 +561,7 @@ public class Trip
internal bool ShouldDelete(VacationOver e) => Traveled > 1000;
}
```
-snippet source | anchor
+snippet source | anchor
Here's an example of using the various ways of doing `Trip` stream aggregation:
@@ -608,7 +608,7 @@ in order to opt into the optimistic concurrency check.
To start with, let's say we have an `OrderAggregate` defined like this:
-
+
```cs
public class OrderAggregate
{
@@ -623,7 +623,7 @@ public class OrderAggregate
public bool HasShipped { get; private set; }
}
```
-snippet source | anchor
+snippet source | anchor
Notice the `Version` property of that document above. Using a naming convention (we'll talk about how to get around the convention in just a second),
@@ -742,7 +742,7 @@ your aggregate in any way you wish.
Here's an example of using a custom header value of the events captured to update an aggregate based on the last event encountered:
-
+
```cs
public class Item
{
@@ -794,7 +794,7 @@ public class ItemProjection: SingleStreamProjection-
}
}
```
-snippet source | anchor
+snippet source | anchor
And the same projection in use within a unit test to see how it's all put together:
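A rough sketch of what such a test could look like, assuming a test store with `ItemProjection` registered inline, a made-up `ItemStarted` event, and a `LastUpdatedBy` member on `Item` that the projection copies from the header:

```cs
[Fact]
public async Task applies_the_last_header_value_to_the_aggregate()
{
    // theStore is assumed to be configured with ItemProjection running Inline
    await using var session = theStore.LightweightSession();

    // The header key must match whatever the projection reads from event metadata
    session.SetHeader("last-updated-by", "somebody@somewhere.com");

    var id = session.Events.StartStream<Item>(new ItemStarted("shelf")).Id;
    await session.SaveChangesAsync();

    var item = await session.LoadAsync<Item>(id);
    item.LastUpdatedBy.ShouldBe("somebody@somewhere.com");
}
```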
@@ -944,7 +944,7 @@ By default, Marten will only process projection "side effects" during continuous
wish to use projection side effects while running projections with an `Inline` lifecycle, you can do that with this setting:
-
+
```cs
var builder = Host.CreateApplicationBuilder();
builder.Services.AddMarten(opts =>
@@ -956,7 +956,7 @@ builder.Services.AddMarten(opts =>
opts.Events.EnableSideEffectsOnInlineProjections = true;
});
```
-snippet source | anchor
+snippet source | anchor
This functionality was originally written as a way of sending external messages to a separate system carrying the new state of a single stream projection
diff --git a/docs/events/projections/async-daemon.md b/docs/events/projections/async-daemon.md
index b1b2413cab..e66f370eff 100644
--- a/docs/events/projections/async-daemon.md
+++ b/docs/events/projections/async-daemon.md
@@ -204,7 +204,7 @@ You can see the usage below from one of the Marten tests where we use that metho
daemon has caught up:
-
+
```cs
[Fact]
public async Task run_simultaneously()
@@ -225,7 +225,7 @@ public async Task run_simultaneously()
await CheckExpectedResults();
}
```
-snippet source | anchor
+snippet source | anchor
The basic idea in your tests is to:
@@ -273,7 +273,7 @@ public async Task run_simultaneously()
The following code shows the diagnostics support for the async daemon as it is today:
-
+
```cs
public static async Task ShowDaemonDiagnostics(IDocumentStore store)
{
@@ -292,7 +292,7 @@ public static async Task ShowDaemonDiagnostics(IDocumentStore store)
Console.WriteLine($"The daemon high water sequence mark is {daemonHighWaterMark}");
}
```
-snippet source | anchor
+snippet source | anchor
## Command Line Support
@@ -413,7 +413,7 @@ from systems using Marten.
If your system is configured to export metrics and Open Telemetry data from Marten like this:
-
+
```cs
// This is passed in by Project Aspire. The exporter usage is a little
// different for other tools like Prometheus or SigNoz
@@ -432,7 +432,7 @@ builder.Services.AddOpenTelemetry()
metrics.AddMeter("Marten");
});
```
-snippet source | anchor
+snippet source | anchor
*And* you are running the async daemon in your system, you should potentially see activities for each running projection
@@ -496,7 +496,19 @@ use this information to "know" what streams and projections may be impacted by a
The flag for this is shown below:
-snippet: sample_enabling_advanced_tracking
+
+
+```cs
+var builder = Host.CreateApplicationBuilder();
+builder.Services.AddMarten(opts =>
+{
+ opts.Connection(builder.Configuration.GetConnectionString("marten"));
+
+ opts.Events.EnableAdvancedAsyncTracking = true;
+});
+```
+snippet source | anchor
+
## Querying for Non Stale Data
diff --git a/docs/events/projections/custom-aggregates.md b/docs/events/projections/custom-aggregates.md
index 4359b59115..406670409e 100644
--- a/docs/events/projections/custom-aggregates.md
+++ b/docs/events/projections/custom-aggregates.md
@@ -39,7 +39,7 @@ public class Increment
And a simple aggregate document type like this:
-
+
```cs
public class StartAndStopAggregate: ISoftDeleted
{
@@ -57,7 +57,7 @@ public class StartAndStopAggregate: ISoftDeleted
}
}
```
-snippet source | anchor
+snippet source | anchor
As you can see, `StartAndStopAggregate` has a `Guid` as its identity and is also [soft-deleted](/documents/deletes.html#soft-deletes) when stored by
diff --git a/docs/events/projections/custom.md b/docs/events/projections/custom.md
index 22c68402f0..a21031dbb3 100644
--- a/docs/events/projections/custom.md
+++ b/docs/events/projections/custom.md
@@ -3,7 +3,7 @@
To build your own Marten projection, you just need a class that implements the `Marten.Events.Projections.IProjection` interface shown below:
-
+
```cs
///
/// Interface for all event projections
@@ -12,7 +12,7 @@ To build your own Marten projection, you just need a class that implements the `
///
public interface IProjection: IJasperFxProjection, IMartenRegistrable
```
-snippet source | anchor
+snippet source | anchor
The `StreamAction` aggregates outstanding events by the event stream, which is how Marten tracks events inside of an `IDocumentSession` that has
@@ -20,7 +20,7 @@ yet to be committed. The `IDocumentOperations` interface will give you access to
or deletions. Here's a sample custom projection from our tests:
-
+
```cs
public class QuestPatchTestProjection: IProjection
{
@@ -47,7 +47,7 @@ public class QuestPatchTestProjection: IProjection
}
}
```
-snippet source | anchor
+snippet source | anchor
And the custom projection can be registered in your Marten `DocumentStore` like this:
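A sketch of that registration, assuming the `Inline` lifecycle:

```cs
// Sketch only: the connection string and the lifecycle are assumptions
var store = DocumentStore.For(opts =>
{
    opts.Connection("some connection string");

    // Register the custom IProjection implementation shown above
    opts.Projections.Add(new QuestPatchTestProjection(), ProjectionLifecycle.Inline);
});
```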
diff --git a/docs/events/projections/event-projections.md b/docs/events/projections/event-projections.md
index efcd49ec1e..f0307b8d6f 100644
--- a/docs/events/projections/event-projections.md
+++ b/docs/events/projections/event-projections.md
@@ -6,7 +6,7 @@ on individual events. In essence, the `EventProjection` recipe does pattern matc
To show off what `EventProjection` does, here's a sample that uses most features that `EventProjection` supports:
-
+
```cs
public class SampleEventProjection : EventProjection
{
@@ -62,7 +62,7 @@ public class SampleEventProjection : EventProjection
}
}
```
-snippet source | anchor
+snippet source | anchor
Do note that at any point you can access event metadata by accepting `IEvent<T>` where `T` is the event type instead of just the event type. You can also take in an additional variable for `IEvent` to just
diff --git a/docs/events/projections/index.md b/docs/events/projections/index.md
index d8356da89f..3d3574fa2d 100644
--- a/docs/events/projections/index.md
+++ b/docs/events/projections/index.md
@@ -38,7 +38,7 @@ The out-of-the box convention is to expose `public Apply()` methods o
Sticking with the fantasy theme, the `QuestParty` class shown below could be used to aggregate streams of quest data:
-
+
```cs
public sealed record QuestParty(Guid Id, List<string> Members)
{
@@ -63,7 +63,7 @@ public sealed record QuestParty(Guid Id, List Members)
};
}
```
-snippet source | anchor
+snippet source | anchor
## Live Aggregation via .Net
diff --git a/docs/events/projections/inline.md b/docs/events/projections/inline.md
index 4f346c1c32..2ab656a7a7 100644
--- a/docs/events/projections/inline.md
+++ b/docs/events/projections/inline.md
@@ -4,7 +4,7 @@ An "inline" projection just means that Marten will process the projection agains
to the event store at the time that `IDocumentSession.SaveChanges()` is called to commit a unit of work. Here's a small example projection:
-
+
```cs
public class MonsterDefeatedTransform: EventProjection
{
@@ -20,7 +20,7 @@ public class MonsterDefeated
public string Monster { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
Note that the inline projection is able to use the [event metadata](/events/metadata) at the time the inline projection is executed. That was previously a limitation of Marten that was fixed in Marten V4.
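A sketch of registering that projection with the `Inline` lifecycle so it runs in the same unit of work as the event capture:

```cs
// Sketch only: the connection string name is a placeholder
var builder = Host.CreateApplicationBuilder();
builder.Services.AddMarten(opts =>
{
    opts.Connection(builder.Configuration.GetConnectionString("marten"));

    opts.Projections.Add<MonsterDefeatedTransform>(ProjectionLifecycle.Inline);
});
```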
diff --git a/docs/events/projections/ioc.md b/docs/events/projections/ioc.md
index 883ef0b63a..c01e34ee10 100644
--- a/docs/events/projections/ioc.md
+++ b/docs/events/projections/ioc.md
@@ -14,7 +14,7 @@ Let's say you have a custom aggregation projection like this one below that need
`IPriceLookup` at runtime:
-
+
```cs
public class ProductProjection: SingleStreamProjection
{
@@ -42,7 +42,7 @@ public class ProductProjection: SingleStreamProjection
}
}
```
-snippet source | anchor
+snippet source | anchor
Now, we *want* to use this projection at runtime within Marten, and need to register the projection
@@ -71,7 +71,7 @@ using var host = await Host.CreateDefaultBuilder()
})
.StartAsync();
```
-snippet source | anchor
+snippet source | anchor
Note that we're having to explicitly specify the projection lifecycle for the projection used within
diff --git a/docs/events/projections/multi-stream-projections.md b/docs/events/projections/multi-stream-projections.md
index 1922a0a316..8bea28f9c3 100644
--- a/docs/events/projections/multi-stream-projections.md
+++ b/docs/events/projections/multi-stream-projections.md
@@ -508,12 +508,12 @@ The `ViewProjection` also provides the ability to "fan out" child events from a
create an aggregated view. As an example, a `Travel` event we use in Marten testing contains a list of `Movement` objects:
-
+
```cs
public IList<Movement> Movements { get; set; } = new List<Movement>();
public List<Stop> Stops { get; set; } = new();
```
-snippet source | anchor
+snippet source | anchor
In a sample `ViewProjection`, we do a "fan out" of the `Travel.Movements` members into separate events being processed through the projection:
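Here is a hedged sketch of that fan-out registration. The `Day` document, the `Travel.Day` grouping member, and `Movement.Distance` are assumptions standing in for the real sample:

```cs
// Sketch only: Day, Travel.Day and Movement.Distance are assumed members here
public class DayProjection: MultiStreamProjection<Day, int>
{
    public DayProjection()
    {
        // Group incoming Travel events by the day they occurred on
        Identity<Travel>(x => x.Day);

        // Fan out each child Movement of a Travel event as if it were its own event
        FanOut<Travel, Movement>(x => x.Movements);
    }

    public void Apply(Day day, Movement movement)
    {
        day.TotalDistance += movement.Distance;
    }
}
```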
diff --git a/docs/events/projections/testing.md b/docs/events/projections/testing.md
index faa9438da4..2ebc02db3f 100644
--- a/docs/events/projections/testing.md
+++ b/docs/events/projections/testing.md
@@ -308,7 +308,7 @@ See Andrew Lock's blog post [Avoiding flaky tests with TimeProvider and ITimer](
In the example projection, I've been capturing the timestamp in the `Invoice` document from the Marten event metadata:
-
+
```cs
public static Invoice Create(IEvent created)
{
@@ -324,7 +324,7 @@ public static Invoice Create(IEvent created)
};
}
```
-snippet source | anchor
+snippet source | anchor
But of course, if that timestamp has some meaning later on and you have any kind of business rules that may need to key
diff --git a/docs/events/quickstart.md b/docs/events/quickstart.md
index abc132d825..1b47b690a0 100644
--- a/docs/events/quickstart.md
+++ b/docs/events/quickstart.md
@@ -55,7 +55,7 @@ await session.SaveChangesAsync();
At some point we would like to know what members are currently part of the quest party. To keep things simple, we're going to use Marten's _live_ stream aggregation feature to model a `QuestParty` that updates itself based on our events:
-
+
```cs
public sealed record QuestParty(Guid Id, List<string> Members)
{
@@ -80,7 +80,7 @@ public sealed record QuestParty(Guid Id, List Members)
};
}
```
-snippet source | anchor
+snippet source | anchor
Next, we'll use the live projection to aggregate the quest stream for a single quest party like this:
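A hedged sketch of that live aggregation, assuming `questId` holds the stream id captured earlier:

```cs
await using var session = store.QuerySession();

// Fetch the events for the stream and run the QuestParty aggregation in memory
var party = await session.Events.AggregateStreamAsync<QuestParty>(questId);

Console.WriteLine($"Current members: {string.Join(", ", party!.Members)}");
```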
@@ -106,7 +106,7 @@ Simple, right? The above code will load the events from the database and run the
What about the quest itself? On top of seeing our in-progress quest, we also want the ability to query our entire history of past quests. For this, we'll create an _inline_ `SingleStreamProjection` that persists our Quest state to the database as the events are being written:
-
+
```cs
public sealed record Quest(Guid Id, List<string> Members, List<string> Slayed, string Name, bool isFinished);
@@ -136,7 +136,7 @@ public sealed class QuestProjection: SingleStreamProjection
}
```
-snippet source | anchor
+snippet source | anchor
::: tip INFO
diff --git a/docs/events/skipping.md b/docs/events/skipping.md
index 2e9331c2cf..4250394278 100644
--- a/docs/events/skipping.md
+++ b/docs/events/skipping.md
@@ -20,7 +20,22 @@ Definitely check out [Rebuilding a Single Stream](/events/projections/rebuilding
To get started, you will first have to enable potential event skipping like this:
-snippet: sample_enabling_event_skipping
+
+
+```cs
+var builder = Host.CreateApplicationBuilder();
+builder.Services.AddMarten(opts =>
+{
+ opts.Connection(builder.Configuration.GetConnectionString("marten"));
+
+ // This is false by default for backwards compatibility,
+ // turning this on will add an extra column and filtering during
+ // various event store operations
+ opts.Events.EnableEventSkippingInProjectionsOrSubscriptions = true;
+});
+```
+snippet source | anchor
+
That flag just enables the ability to mark events as _skipped_. As you'd imagine, that
flag alters Marten behavior by:
diff --git a/docs/events/subscriptions.md b/docs/events/subscriptions.md
index 96801ef5ce..d53802fc2e 100644
--- a/docs/events/subscriptions.md
+++ b/docs/events/subscriptions.md
@@ -26,7 +26,7 @@ events to the Marten event storage.**
Subscriptions will always be an implementation of the `ISubscription` interface shown below:
-
+
```cs
///
/// Basic abstraction for custom subscriptions to Marten events through the async daemon. Use this in
@@ -47,7 +47,7 @@ public interface ISubscription
CancellationToken cancellationToken);
}
```
-snippet source | anchor
+snippet source | anchor
So far, the subscription model gives you these abilities:
@@ -72,7 +72,7 @@ To make this concrete, here's the simplest possible subscription you can make to
for every event:
-
+
```cs
public class ConsoleSubscription: ISubscription
{
@@ -95,13 +95,13 @@ public class ConsoleSubscription: ISubscription
}
}
```
-snippet source | anchor
+snippet source | anchor
And to register that with our Marten store:
-
+
```cs
var builder = Host.CreateApplicationBuilder();
builder.Services.AddMarten(opts =>
@@ -133,13 +133,13 @@ builder.Services.AddMarten(opts =>
using var host = builder.Build();
await host.StartAsync();
```
-snippet source | anchor
+snippet source | anchor
Here's a slightly more complicated sample that publishes events to a configured Kafka topic:
-
+
```cs
public class KafkaSubscription: SubscriptionBase
{
@@ -199,14 +199,14 @@ public class KafkaProducerConfig
public string? Topic { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
This time, it's requiring IoC services injected through its constructor, so we're going to use this mechanism
to add it to Marten:
-
+
```cs
var builder = Host.CreateApplicationBuilder();
builder.Services.AddMarten(opts =>
@@ -230,7 +230,7 @@ builder.Services.AddMarten(opts =>
using var host = builder.Build();
await host.StartAsync();
```
-snippet source | anchor
+snippet source | anchor
## Registering Subscriptions
@@ -251,7 +251,7 @@ is a great tool for this.
Stateless subscriptions can simply be registered like this:
-
+
```cs
var builder = Host.CreateApplicationBuilder();
builder.Services.AddMarten(opts =>
@@ -283,14 +283,14 @@ builder.Services.AddMarten(opts =>
using var host = builder.Build();
await host.StartAsync();
```
-snippet source | anchor
+snippet source | anchor
But, if you need to utilize services from your IoC container within your subscription -- and you very likely do --
you can utilize the `AddSubscriptionWithServices()` mechanisms:
-
+
```cs
var builder = Host.CreateApplicationBuilder();
builder.Services.AddMarten(opts =>
@@ -314,7 +314,7 @@ builder.Services.AddMarten(opts =>
using var host = builder.Build();
await host.StartAsync();
```
-snippet source | anchor
+snippet source | anchor
## Starting Position of Subscriptions
@@ -412,7 +412,7 @@ the various configuration options for that subscription right into the subscript
base class is shown below:
-
+
```cs
public class KafkaSubscription: SubscriptionBase
{
@@ -472,7 +472,7 @@ public class KafkaProducerConfig
public string? Topic { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
## Rewinding or Replaying Subscriptions
@@ -526,7 +526,7 @@ that the controller is told.
The following is an example of using these facilities for error handling:
-
+
```cs
public class ErrorHandlingSubscription: SubscriptionBase
{
@@ -602,5 +602,5 @@ public class ReallyBadException: Exception
}
}
```
-snippet source | anchor
+snippet source | anchor
diff --git a/docs/getting-started.md b/docs/getting-started.md
index df3dafb261..a58396f236 100644
--- a/docs/getting-started.md
+++ b/docs/getting-started.md
@@ -26,7 +26,7 @@ dotnet paket add Marten
In the startup of your .NET application, make a call to `AddMarten()` to register Marten services like so:
-
+
```cs
// This is the absolute, simplest way to integrate Marten into your
// .NET application with Marten's default configuration
@@ -49,7 +49,7 @@ builder.Services.AddMarten(options =>
// string to Marten
.UseNpgsqlDataSource();
```
-snippet source | anchor
+snippet source | anchor
See [Bootstrapping with HostBuilder](/configuration/hostbuilder) for more information and options about this integration.
@@ -70,7 +70,7 @@ Marten uses the [Npgsql](http://www.npgsql.org) library to access PostgreSQL fro
Now, for your first document type, we'll represent the users in our system:
-
+
```cs
public class User
{
@@ -81,7 +81,7 @@ public class User
public bool Internal { get; set; }
}
```
-snippet source | anchor
+snippet source | anchor
*For more information on document identity, see [identity](/documents/identity).*
@@ -95,7 +95,7 @@ you'll rarely need to interact with that service.
From here, an instance of `IDocumentStore` or a type of `IDocumentSession` can be injected into the class/controller/endpoint of your choice and we can start persisting and loading user documents:
-
+
```cs
// You can inject the IDocumentStore and open sessions yourself
app.MapPost("/user",
@@ -131,7 +131,7 @@ app.MapGet("/user/{id:guid}",
return await session.LoadAsync(id, ct);
});
```
-snippet source | anchor
+snippet source | anchor
::: tip INFO
diff --git a/docs/migration-guide.md b/docs/migration-guide.md
index 512339a52e..9f6e4a4460 100644
--- a/docs/migration-guide.md
+++ b/docs/migration-guide.md
@@ -60,7 +60,7 @@ On the bright side, we believe that the "event slicing" usage in Marten 8 is sig
The existing "Optimized Artifacts Workflow" was completely removed in V8. Instead though, there is a new option shown below:
-
+
```cs
var connectionString = Configuration.GetConnectionString("postgres");
@@ -81,7 +81,7 @@ services.CritterStackDefaults(x =>
x.Production.ResourceAutoCreate = AutoCreate.None;
});
```
-snippet source | anchor
+snippet source | anchor
Note the usage of `CritterStackDefaults()` above. This will allow you to specify separate behavior for `Development` time vs
diff --git a/docs/otel.md b/docs/otel.md
index 21137a0dd9..34573b2a10 100644
--- a/docs/otel.md
+++ b/docs/otel.md
@@ -15,7 +15,7 @@ said, here's a sample of configuring the exporting -- this case just exporting i
a Project Aspire dashboard in the end:
-
+
```cs
// This is passed in by Project Aspire. The exporter usage is a little
// different for other tools like Prometheus or SigNoz
@@ -34,7 +34,7 @@ builder.Services.AddOpenTelemetry()
metrics.AddMeter("Marten");
});
```
-snippet source | anchor
+snippet source | anchor
Note, you'll need a reference to the `OpenTelemetry.Extensions.Hosting` Nuget for that
diff --git a/docs/scenarios/command_handler_workflow.md b/docs/scenarios/command_handler_workflow.md
index 8f8ee871ba..1bf479d1f2 100644
--- a/docs/scenarios/command_handler_workflow.md
+++ b/docs/scenarios/command_handler_workflow.md
@@ -38,7 +38,7 @@ To that end, Marten has the `FetchForWriting()` operation for optimized command
Let's say that you are building an order fulfillment system, so we're naturally going to model our domain as an `Order` aggregate:
-
+
```cs
public class Item
{
@@ -76,13 +76,13 @@ public class Order
}
}
```
-snippet source | anchor
+snippet source | anchor
And with some events like these:
-
+
```cs
public record OrderShipped;
public record OrderCreated(Item[] Items);
@@ -90,7 +90,7 @@ public record OrderReady;
public record ItemReady(string Name);
```
-snippet source | anchor
+snippet source | anchor
Let's jump right into the first sample with simple concurrency handling:
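A hedged sketch of that first handler. The `MarkItemReady` command shape and the `Order.Items`/`Item.Ready` members are assumptions based on the surrounding samples:

```cs
// Sketch only: command and aggregate members are assumptions
public async Task Handle1(MarkItemReady command, IDocumentSession session)
{
    // Fetch the current state of the Order stream for writing
    var stream = await session.Events.FetchForWriting<Order>(command.OrderId);

    var order = stream.Aggregate;

    if (order.Items.TryGetValue(command.ItemName, out var item))
    {
        // Mark the item as ready and append the corresponding event to the stream
        item.Ready = true;
        stream.AppendOne(new ItemReady(command.ItemName));
    }

    await session.SaveChangesAsync();
}
```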
@@ -276,7 +276,7 @@ Lastly, there are several overloads of a method called `IEventStore.WriteToAggre
over the top of `FetchForWriting()` to simplify the entire workflow. Using that method, our handler versions above become:
-
+
```cs
public Task Handle4(MarkItemReady command, IDocumentSession session)
{
@@ -303,7 +303,7 @@ public Task Handle4(MarkItemReady command, IDocumentSession session)
});
}
```
-snippet source | anchor
+snippet source | anchor
## Optimizing FetchForWriting with Inline Aggregates
@@ -311,7 +311,7 @@ public Task Handle4(MarkItemReady command, IDocumentSession session)
If you are utilizing `FetchForWriting()` for your command handlers -- and you really, really should! -- and at least some of your aggregates are updated `Inline` as shown below:
-
+
```cs
var builder = Host.CreateApplicationBuilder();
builder.Services.AddMarten(opts =>
@@ -332,7 +332,7 @@ builder.Services.AddMarten(opts =>
// need that tracking at runtime
.UseLightweightSessions();
```
-snippet source | anchor
+snippet source | anchor
You can potentially gain some significant performance optimization by using the `UseIdentityMapForInlineAggregates` flag shown above. To be clear, this optimization mostly helps when you have the combination in a command handler that:
diff --git a/docs/schema/extensions.md b/docs/schema/extensions.md
index 229b5de806..f36b7060e0 100644
--- a/docs/schema/extensions.md
+++ b/docs/schema/extensions.md
@@ -68,7 +68,7 @@ But it **won't apply them** for multi-tenancy per database with **unknown
Postgresql tables can be modeled with the `Table` class from `Weasel.Postgresql.Tables` as shown in this example below:
-
+
```cs
StoreOptions(opts =>
{
@@ -82,7 +82,7 @@ StoreOptions(opts =>
await theStore.Storage.ApplyAllConfiguredChangesToDatabaseAsync();
```
-snippet source | anchor
+snippet source | anchor
## Function
@@ -90,7 +90,7 @@ await theStore.Storage.ApplyAllConfiguredChangesToDatabaseAsync();
Postgresql functions can be managed by creating a function using `Weasel.Postgresql.Functions.Function` as below:
-
+
```cs
StoreOptions(opts =>
{
@@ -112,7 +112,7 @@ $f$ language sql immutable;
await theStore.Storage.ApplyAllConfiguredChangesToDatabaseAsync();
```
-snippet source | anchor
+snippet source | anchor
## Sequence
@@ -120,7 +120,7 @@ await theStore.Storage.ApplyAllConfiguredChangesToDatabaseAsync();
[Postgresql sequences](https://www.postgresql.org/docs/10/static/sql-createsequence.html) can be created using `Weasel.Postgresql.Sequence` as below:
-
+
```cs
StoreOptions(opts =>
{
@@ -134,7 +134,7 @@ StoreOptions(opts =>
await theStore.Storage.ApplyAllConfiguredChangesToDatabaseAsync();
```
-snippet source | anchor
+snippet source | anchor
## Extension
@@ -142,7 +142,7 @@ await theStore.Storage.ApplyAllConfiguredChangesToDatabaseAsync();
Postgresql extensions can be enabled using `Weasel.Postgresql.Extension` as below:
-
+
```cs
StoreOptions(opts =>
{
@@ -157,5 +157,5 @@ StoreOptions(opts =>
await theStore.Storage.ApplyAllConfiguredChangesToDatabaseAsync();
```
-snippet source | anchor
+snippet source | anchor
diff --git a/docs/schema/index.md b/docs/schema/index.md
index 9e23f765a4..0090f48ddf 100644
--- a/docs/schema/index.md
+++ b/docs/schema/index.md
@@ -8,7 +8,7 @@ As of Marten v0.8, you have much finer grained ability to control the automatic
`StoreOptions.AutoCreateSchemaObjects` like so:
-
+
```cs
var store = DocumentStore.For(opts =>
{
@@ -32,7 +32,7 @@ var store = DocumentStore.For(opts =>
opts.AutoCreateSchemaObjects = AutoCreate.None;
});
```
-snippet source | anchor
+snippet source | anchor
To prevent unnecessary loss of data, even in development, on the first usage of a document type, Marten will:
diff --git a/docs/schema/migrations.md b/docs/schema/migrations.md
index 8c8c73b1cf..430d4d5bd2 100644
--- a/docs/schema/migrations.md
+++ b/docs/schema/migrations.md
@@ -19,7 +19,7 @@ As long as you have rights to alter your Postgresql database, you can happily se
modes and not worry about schema changes at all as you happily code new features and change existing document types:
-
+
```cs
var store = DocumentStore.For(opts =>
{
@@ -43,7 +43,7 @@ var store = DocumentStore.For(opts =>
opts.AutoCreateSchemaObjects = AutoCreate.None;
});
```
-snippet source | anchor
+snippet source | anchor
As long as you're using a permissive auto creation mode (i.e., not _None_), you should be able to code in your application model
@@ -95,12 +95,12 @@ If you'd rather write a database SQL migration file with your own code, bootstra
want to update, and use:
-
+
```cs
// All migration code is async now!
await store.Storage.Database.WriteMigrationFileAsync("1.initial.sql");
```
-snippet source | anchor
+snippet source | anchor
The command above will generate a file called "1.initial.sql" to update the schema, and a second file called
@@ -122,11 +122,11 @@ While there are many options to include these exported scripts in your ci/cd pip
To programmatically apply all detectable schema changes upfront, you can use this mechanism:
-
+
```cs
await store.Storage.ApplyAllConfiguredChangesToDatabaseAsync();
```
-snippet source | anchor
+snippet source | anchor
With the [command line tooling](/configuration/cli), it's:
@@ -144,7 +144,7 @@ dotnet run -- marten-apply
Lastly, Marten V5 adds a new option to have the latest database changes detected and applied on application startup with
-
+
```cs
// The normal Marten configuration
services.AddMarten(opts =>
@@ -162,7 +162,7 @@ services.AddMarten(opts =>
// database changes on application startup
.ApplyAllDatabaseChangesOnStartup();
```
-snippet source | anchor
+snippet source | anchor
In the option above, Marten is calling the same functionality within an `IHostedService` background task.
@@ -173,11 +173,11 @@ As a possible [environment test](http://codebetter.com/jeremymiller/2006/04/06/e
by throwing an exception:
-
+
```cs
await store.Storage.Database.AssertDatabaseMatchesConfigurationAsync();
```
-snippet source | anchor
+snippet source | anchor
The exception will list out all the DDL changes that are missing.
diff --git a/src/Directory.Packages.props b/src/Directory.Packages.props
new file mode 100644
index 0000000000..d23a291eae
--- /dev/null
+++ b/src/Directory.Packages.props
@@ -0,0 +1,5 @@
+
+
+ false
+
+
diff --git a/src/DocumentDbTests/Internal/Generated/DocumentStorage/RevisionedDocProvider1212098993.cs b/src/DocumentDbTests/Internal/Generated/DocumentStorage/RevisionedDocProvider1212098993.cs
index d9e30158cc..f86a808b18 100644
--- a/src/DocumentDbTests/Internal/Generated/DocumentStorage/RevisionedDocProvider1212098993.cs
+++ b/src/DocumentDbTests/Internal/Generated/DocumentStorage/RevisionedDocProvider1212098993.cs
@@ -73,7 +73,7 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
// .Net Class Type
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- var parameter2 = parameterBuilder.AppendParameter(document.Id);
+ var parameter2 = parameterBuilder.AppendParameter((document is DocumentDbTests.Concurrency.RevisionedDoc ? ((DocumentDbTests.Concurrency.RevisionedDoc)document).Id : default(System.Guid)));
setCurrentRevisionParameter(parameterBuilder);
builder.Append(')');
}
@@ -137,7 +137,7 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
// .Net Class Type
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- var parameter2 = parameterBuilder.AppendParameter(document.Id);
+ var parameter2 = parameterBuilder.AppendParameter((document is DocumentDbTests.Concurrency.RevisionedDoc ? ((DocumentDbTests.Concurrency.RevisionedDoc)document).Id : default(System.Guid)));
setCurrentRevisionParameter(parameterBuilder);
builder.Append(')');
}
@@ -207,7 +207,7 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
// .Net Class Type
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- var parameter2 = parameterBuilder.AppendParameter(document.Id);
+ var parameter2 = parameterBuilder.AppendParameter((document is DocumentDbTests.Concurrency.RevisionedDoc ? ((DocumentDbTests.Concurrency.RevisionedDoc)document).Id : default(System.Guid)));
setCurrentRevisionParameter(parameterBuilder);
builder.Append(')');
}
@@ -463,7 +463,7 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
// .Net Class Type
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- var parameter2 = parameterBuilder.AppendParameter(document.Id);
+ var parameter2 = parameterBuilder.AppendParameter((document is DocumentDbTests.Concurrency.RevisionedDoc ? ((DocumentDbTests.Concurrency.RevisionedDoc)document).Id : default(System.Guid)));
setCurrentRevisionParameter(parameterBuilder);
builder.Append(')');
}
@@ -1077,7 +1077,7 @@ public override string OverwriteDuplicatesFromTempTable()
public override async System.Threading.Tasks.Task LoadRowAsync(Npgsql.NpgsqlBinaryImporter writer, DocumentDbTests.Concurrency.RevisionedDoc document, Marten.Storage.Tenant tenant, Marten.ISerializer serializer, System.Threading.CancellationToken cancellation)
{
await writer.WriteAsync(document.GetType().FullName, NpgsqlTypes.NpgsqlDbType.Varchar, cancellation);
- await writer.WriteAsync(document.Id, NpgsqlTypes.NpgsqlDbType.Uuid, cancellation);
+ await writer.WriteAsync(((DocumentDbTests.Concurrency.RevisionedDoc)document).Id, NpgsqlTypes.NpgsqlDbType.Uuid, cancellation);
await writer.WriteAsync(1, NpgsqlTypes.NpgsqlDbType.Integer, cancellation);
await writer.WriteAsync(serializer.ToJson(document), NpgsqlTypes.NpgsqlDbType.Jsonb, cancellation);
}
diff --git a/src/LinqTests/Acceptance/query_with_inheritance.cs b/src/LinqTests/Acceptance/query_with_inheritance.cs
index 2283eefbe7..a35c12d79c 100644
--- a/src/LinqTests/Acceptance/query_with_inheritance.cs
+++ b/src/LinqTests/Acceptance/query_with_inheritance.cs
@@ -43,19 +43,14 @@ public class PapySmurf: Smurf, IPapaSmurf
public bool IsVillageLeader { get; set; }
}
-public class BrainySmurf: PapaSmurf
-{
-}
+public class BrainySmurf: PapaSmurf;
#endregion
public class sub_class_hierarchies: OneOffConfigurationsContext
{
- private readonly ITestOutputHelper _output;
-
- public sub_class_hierarchies(ITestOutputHelper output)
+ public sub_class_hierarchies()
{
- _output = output;
StoreOptions(_ =>
{
#region sample_add-subclass-hierarchy-with-aliases
@@ -67,14 +62,16 @@ public sub_class_hierarchies(ITestOutputHelper output)
typeof(PapySmurf),
typeof(IPapaSmurf),
typeof(BrainySmurf)
- );
+ )
+ .Duplicate(x => x.IsVillageLeader); // Put a duplicated index on subclass property;
#endregion
_.Connection(ConnectionSource.ConnectionString);
_.AutoCreateSchemaObjects = AutoCreate.All;
- _.Schema.For().GinIndexJsonData();
+ _.Schema.For()
+ .GinIndexJsonData();
});
}
@@ -96,7 +93,7 @@ public class query_with_inheritance: OneOffConfigurationsContext
{
private readonly ITestOutputHelper _output;
- #region sample_add-subclass-hierarchy
+ #region sample_add-subclass-hierarchy
public query_with_inheritance(ITestOutputHelper output)
{
diff --git a/src/Marten/Linq/IMartenQueryable.cs b/src/Marten/Linq/IMartenQueryable.cs
index 713ec5d93d..15e80553c5 100644
--- a/src/Marten/Linq/IMartenQueryable.cs
+++ b/src/Marten/Linq/IMartenQueryable.cs
@@ -144,5 +144,23 @@ IMartenQueryableIncludeBuilder Include(
IMartenQueryableIncludeBuilder Include(
IDictionary> dictionary) where TInclude : notnull where TKey : notnull;
+    /// <summary>
+    /// Filters the query for documents of a given subclass type within a document hierarchy
+    /// and applies the specified predicate to those subclass documents.
+    /// </summary>
+    /// <typeparam name="TSub">
+    /// The subclass type to filter on. Must inherit from <typeparamref name="T"/>.
+    /// </typeparam>
+    /// <param name="predicate">
+    /// A lambda expression representing the filter to apply to documents of type <typeparamref name="TSub"/>.
+    /// </param>
+    /// <returns>
+    /// An <see cref="IMartenQueryable{T}"/> of the base type with the filter applied
+    /// only to matching subclass documents.
+    /// </returns>
+    /// <remarks>
+    /// This method is useful for querying on subclass-specific properties while still returning
+    /// results as the base type, preserving the hierarchy in the query.
+    /// </remarks>
    IMartenQueryable<T> WhereSub<TSub>(Expression<Func<TSub, bool>> predicate) where TSub : T;
}
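Based on the signature above, a hedged usage sketch against the smurf hierarchy from the tests (assuming `IsVillageLeader` lives on the `PapySmurf` subclass as shown later in this diff):

```cs
// Sketch only: filters on a subclass-specific member while still querying
// and returning the Smurf base type
var leaders = await session.Query<Smurf>()
    .WhereSub<PapySmurf>(x => x.IsVillageLeader)
    .ToListAsync();
```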
diff --git a/src/Marten/MartenRegistry.cs b/src/Marten/MartenRegistry.cs
index 714bc9bfd9..0af078de58 100644
--- a/src/Marten/MartenRegistry.cs
+++ b/src/Marten/MartenRegistry.cs
@@ -195,6 +195,41 @@ public DocumentMappingExpression Duplicate(Expression> expre
return this;
}
+ ///
+ /// Marks a member on a subclass of to be duplicated
+ /// into its own column, enabling fast filtering, sorting and indexing.
+ ///
+ ///
+ /// The subclass of that owns the member to duplicate.
+ ///
+ ///
+ /// Member selector on , for example x => x.Status .
+ ///
+ ///
+ /// Optional PostgreSQL column type override, for example "varchar(100)" .
+ ///
+ ///
+ /// Optional Npgsql database type override used for parameters.
+ ///
+ ///
+ /// Optional callback to configure the created index on the duplicated column.
+ ///
+ ///
+ /// When true, the duplicated column is created with NOT NULL .
+ ///
+ ///
+ /// The current mapping expression for chaining.
+ ///
+ public DocumentMappingExpression Duplicate(Expression> expression, string? pgType = null,
+ NpgsqlDbType? dbType = null, Action? configure = null, bool notNull = false) where TSub : T
+ {
+ _builder.Alter = mapping =>
+ {
+ mapping.Duplicate(expression, pgType, dbType, configure, notNull);
+ };
+ return this;
+ }
+
///
/// Creates a computed index on this data member within the JSON data storage
///
diff --git a/src/Marten/Schema/Arguments/UpsertArgument.cs b/src/Marten/Schema/Arguments/UpsertArgument.cs
index 2901fd5186..835905dc79 100644
--- a/src/Marten/Schema/Arguments/UpsertArgument.cs
+++ b/src/Marten/Schema/Arguments/UpsertArgument.cs
@@ -58,7 +58,9 @@ public MemberInfo[] Members
if (_members.Length == 1)
{
- DotNetType = _members.Last().GetRawMemberType()!;
+ var lastMember = _members.Last();
+ DotNetType = lastMember.GetRawMemberType()!;
+ DeclaringType = lastMember.DeclaringType;
}
else
{
@@ -79,6 +81,7 @@ public MemberInfo[] Members
}
public string ParameterValue { get; set; }
+ public Type? DeclaringType { get; set; }
public Type DotNetType { get; private set; }
@@ -97,8 +100,7 @@ public virtual void GenerateCodeToModifyDocument(GeneratedMethod method, Generat
}
public virtual void GenerateCodeToSetDbParameterValue(GeneratedMethod method, GeneratedType type, int i,
- Argument parameters,
- DocumentMapping mapping, StoreOptions options)
+ Argument parameters, DocumentMapping mapping, StoreOptions options)
{
var memberPath = _members.Select(x => x.Name).Join("?.");
@@ -114,11 +116,18 @@ public virtual void GenerateCodeToSetDbParameterValue(GeneratedMethod method, Ge
? $"{Constant.ForEnum(NpgsqlDbType.Array).Usage} | {Constant.ForEnum(PostgresqlProvider.Instance.ToParameterType(rawMemberType.GetElementType()!)).Usage}"
: Constant.ForEnum(DbType).Usage;
+ var accessorString = AccessorString(type);
+ var requiresCast = DeclaringType is { } dt && dt != type.BaseType;
+
if (rawMemberType.IsClass || rawMemberType.IsNullable() || _members.Length > 1)
{
+ var hasValueGuard = requiresCast
+ ? $"(document is {DeclaringType!.FullNameInCode()} && {accessorString} != null)"
+ : $"{accessorString} != null";
+
method.Frames.Code($@"
-BLOCK:if (document.{memberPath} != null)
-var parameter{i} = {{0}}.{nameof(IGroupedParameterBuilder.AppendParameter)}(document.{ParameterValue});
+BLOCK:if ({hasValueGuard})
+var parameter{i} = {{0}}.{nameof(IGroupedParameterBuilder.AppendParameter)}({accessorString});
parameter{i}.{nameof(NpgsqlParameter.NpgsqlDbType)} = {dbTypeString};
END
BLOCK:else
@@ -128,7 +137,18 @@ public virtual void GenerateCodeToSetDbParameterValue(GeneratedMethod method, Ge
}
else
{
- method.Frames.Code($"var parameter{i} = {{0}}.{nameof(IGroupedParameterBuilder.AppendParameter)}(document.{ParameterValue});", Use.Type());
+ var underlying = rawMemberType;
+ var valueProp = rawMemberType.GetProperty("Value");
+ if (valueProp != null)
+ underlying = valueProp.PropertyType;
+
+ var guarded = requiresCast
+ ? $"(document is {DeclaringType!.FullNameInCode()} ? {accessorString} : default({underlying.FullNameInCode()}))"
+ : accessorString;
+
+ method.Frames.Code(
+ $"var parameter{i} = {{0}}.{nameof(IGroupedParameterBuilder.AppendParameter)}<{underlying.FullNameInCode()}>({guarded});",
+ Use.Type());
}
}
}
@@ -231,8 +251,15 @@ public virtual void GenerateBulkWriterCodeAsync(GeneratedType type, GeneratedMet
}
else
{
- load.Frames.CodeAsync($"await writer.WriteAsync(document.{ParameterValue}, {dbTypeString}, {{0}});",
+ var accessor = AccessorString(type);
+
+ load.Frames.CodeAsync($"await writer.WriteAsync({accessor}, {dbTypeString}, {{0}});",
Use.Type());
}
}
+
+ private string AccessorString(GeneratedType type) =>
+ DeclaringType is { } dt2 && dt2 != type.BaseType
+ ? $"(({DeclaringType.FullNameInCode()})document).{ParameterValue}"
+ : $"document.{ParameterValue}";
}
diff --git a/src/Marten/Schema/DocumentMapping.cs b/src/Marten/Schema/DocumentMapping.cs
index 90dcb48bee..6ad9f68ffd 100644
--- a/src/Marten/Schema/DocumentMapping.cs
+++ b/src/Marten/Schema/DocumentMapping.cs
@@ -846,6 +846,32 @@ public DocumentMapping(StoreOptions storeOptions) : base(typeof(T), storeOptions
///
public void Duplicate(Expression> expression, string? pgType = null, NpgsqlDbType? dbType = null,
Action? configure = null, bool notNull = false)
+ => Duplicate(expression, pgType, dbType, configure, notNull);
+
+ ///
+ /// Duplicates a member from a subclass of into a
+ /// dedicated table column, and optionally creates an index on that column.
+ ///
+ ///
+ /// The subclass of that defines the selected member.
+ ///
+ ///
+ /// Member selector on , for example x => x.Code .
+ ///
+ ///
+ /// Optional PostgreSQL column type override, for example "timestamp without time zone" .
+ ///
+ ///
+ /// Optional Npgsql database type override used for parameters.
+ ///
+ ///
+ /// Optional callback to configure the index created for the duplicated column.
+ ///
+ ///
+ /// When true, the duplicated column is created with NOT NULL .
+ ///
+ public void Duplicate(Expression> expression, string? pgType = null, NpgsqlDbType? dbType = null,
+ Action? configure = null, bool notNull = false) where TSub : T
{
var visitor = new FindMembers();
visitor.Visit(expression);
@@ -861,7 +887,6 @@ public void Duplicate(Expression> expression, string? pgType =
configure?.Invoke(indexDefinition);
}
-
///
/// Adds a computed index
///
diff --git a/src/ValueTypeTests/Internal/Generated/DocumentStorage/GradeProvider907442297.cs b/src/ValueTypeTests/Internal/Generated/DocumentStorage/GradeProvider907442297.cs
index 424d82899d..ff6cd5a36b 100644
--- a/src/ValueTypeTests/Internal/Generated/DocumentStorage/GradeProvider907442297.cs
+++ b/src/ValueTypeTests/Internal/Generated/DocumentStorage/GradeProvider907442297.cs
@@ -66,9 +66,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.Grade && ((ValueTypeTests.Grade)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.Grade)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -140,9 +140,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.Grade && ((ValueTypeTests.Grade)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.Grade)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -214,9 +214,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.Grade && ((ValueTypeTests.Grade)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.Grade)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
diff --git a/src/ValueTypeTests/Internal/Generated/DocumentStorage/Invoice2Provider479877131.cs b/src/ValueTypeTests/Internal/Generated/DocumentStorage/Invoice2Provider479877131.cs
index 9c620804c5..51541290ca 100644
--- a/src/ValueTypeTests/Internal/Generated/DocumentStorage/Invoice2Provider479877131.cs
+++ b/src/ValueTypeTests/Internal/Generated/DocumentStorage/Invoice2Provider479877131.cs
@@ -66,9 +66,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.StrongTypedId.Invoice2 && ((ValueTypeTests.StrongTypedId.Invoice2)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.StrongTypedId.Invoice2)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -140,9 +140,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.StrongTypedId.Invoice2 && ((ValueTypeTests.StrongTypedId.Invoice2)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.StrongTypedId.Invoice2)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -214,9 +214,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.StrongTypedId.Invoice2 && ((ValueTypeTests.StrongTypedId.Invoice2)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.StrongTypedId.Invoice2)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
diff --git a/src/ValueTypeTests/Internal/Generated/DocumentStorage/Invoice3Provider479877132.cs b/src/ValueTypeTests/Internal/Generated/DocumentStorage/Invoice3Provider479877132.cs
index d493e9a8b4..1e05922a67 100644
--- a/src/ValueTypeTests/Internal/Generated/DocumentStorage/Invoice3Provider479877132.cs
+++ b/src/ValueTypeTests/Internal/Generated/DocumentStorage/Invoice3Provider479877132.cs
@@ -65,7 +65,7 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
// .Net Class Type
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value);
+ var parameter2 = parameterBuilder.AppendParameter((document is ValueTypeTests.StrongTypedId.Invoice3 ? ((ValueTypeTests.StrongTypedId.Invoice3)document).Id.Value : default(System.Guid)));
setVersionParameter(parameterBuilder);
builder.Append(')');
}
@@ -128,7 +128,7 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
// .Net Class Type
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value);
+ var parameter2 = parameterBuilder.AppendParameter((document is ValueTypeTests.StrongTypedId.Invoice3 ? ((ValueTypeTests.StrongTypedId.Invoice3)document).Id.Value : default(System.Guid)));
setVersionParameter(parameterBuilder);
builder.Append(')');
}
@@ -191,7 +191,7 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
// .Net Class Type
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value);
+ var parameter2 = parameterBuilder.AppendParameter((document is ValueTypeTests.StrongTypedId.Invoice3 ? ((ValueTypeTests.StrongTypedId.Invoice3)document).Id.Value : default(System.Guid)));
setVersionParameter(parameterBuilder);
builder.Append(')');
}
diff --git a/src/ValueTypeTests/Internal/Generated/DocumentStorage/InvoiceProvider1724721064.cs b/src/ValueTypeTests/Internal/Generated/DocumentStorage/InvoiceProvider1724721064.cs
index e345b78e53..e3f56e85ea 100644
--- a/src/ValueTypeTests/Internal/Generated/DocumentStorage/InvoiceProvider1724721064.cs
+++ b/src/ValueTypeTests/Internal/Generated/DocumentStorage/InvoiceProvider1724721064.cs
@@ -66,9 +66,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.VogenIds.Invoice && ((ValueTypeTests.VogenIds.Invoice)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.VogenIds.Invoice)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -140,9 +140,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.VogenIds.Invoice && ((ValueTypeTests.VogenIds.Invoice)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.VogenIds.Invoice)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -214,9 +214,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.VogenIds.Invoice && ((ValueTypeTests.VogenIds.Invoice)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.VogenIds.Invoice)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
diff --git a/src/ValueTypeTests/Internal/Generated/DocumentStorage/OrderProvider347010495.cs b/src/ValueTypeTests/Internal/Generated/DocumentStorage/OrderProvider347010495.cs
index 73269cb598..211dd9685f 100644
--- a/src/ValueTypeTests/Internal/Generated/DocumentStorage/OrderProvider347010495.cs
+++ b/src/ValueTypeTests/Internal/Generated/DocumentStorage/OrderProvider347010495.cs
@@ -65,9 +65,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is FSharpTypes.Order && ((FSharpTypes.Order)document).Id.Item != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Item);
+ var parameter2 = parameterBuilder.AppendParameter(((FSharpTypes.Order)document).Id.Item);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -139,9 +139,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is FSharpTypes.Order && ((FSharpTypes.Order)document).Id.Item != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Item);
+ var parameter2 = parameterBuilder.AppendParameter(((FSharpTypes.Order)document).Id.Item);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -213,9 +213,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is FSharpTypes.Order && ((FSharpTypes.Order)document).Id.Item != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Item);
+ var parameter2 = parameterBuilder.AppendParameter(((FSharpTypes.Order)document).Id.Item);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -815,7 +815,7 @@ public override string OverwriteDuplicatesFromTempTable()
public override async System.Threading.Tasks.Task LoadRowAsync(Npgsql.NpgsqlBinaryImporter writer, FSharpTypes.Order document, Marten.Storage.Tenant tenant, Marten.ISerializer serializer, System.Threading.CancellationToken cancellation)
{
await writer.WriteAsync(document.GetType().FullName, NpgsqlTypes.NpgsqlDbType.Varchar, cancellation);
- await writer.WriteAsync(document.Id.Item, NpgsqlTypes.NpgsqlDbType.Uuid, cancellation);
+ await writer.WriteAsync(((FSharpTypes.Order)document).Id.Item, NpgsqlTypes.NpgsqlDbType.Uuid, cancellation);
await writer.WriteAsync(JasperFx.Core.CombGuidIdGeneration.NewGuid(), NpgsqlTypes.NpgsqlDbType.Uuid, cancellation);
await writer.WriteAsync(serializer.ToJson(document), NpgsqlTypes.NpgsqlDbType.Jsonb, cancellation);
}
diff --git a/src/ValueTypeTests/Internal/Generated/DocumentStorage/ReferenceTypeOrderProvider1892982689.cs b/src/ValueTypeTests/Internal/Generated/DocumentStorage/ReferenceTypeOrderProvider1892982689.cs
index 98c75e9647..84101a303f 100644
--- a/src/ValueTypeTests/Internal/Generated/DocumentStorage/ReferenceTypeOrderProvider1892982689.cs
+++ b/src/ValueTypeTests/Internal/Generated/DocumentStorage/ReferenceTypeOrderProvider1892982689.cs
@@ -66,9 +66,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.StrongTypedId.ReferenceTypeOrder && ((ValueTypeTests.StrongTypedId.ReferenceTypeOrder)document).Id.Item != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Item);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.StrongTypedId.ReferenceTypeOrder)document).Id.Item);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -140,9 +140,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.StrongTypedId.ReferenceTypeOrder && ((ValueTypeTests.StrongTypedId.ReferenceTypeOrder)document).Id.Item != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Item);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.StrongTypedId.ReferenceTypeOrder)document).Id.Item);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -214,9 +214,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.StrongTypedId.ReferenceTypeOrder && ((ValueTypeTests.StrongTypedId.ReferenceTypeOrder)document).Id.Item != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Item);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.StrongTypedId.ReferenceTypeOrder)document).Id.Item);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -816,7 +816,7 @@ public override string OverwriteDuplicatesFromTempTable()
public override async System.Threading.Tasks.Task LoadRowAsync(Npgsql.NpgsqlBinaryImporter writer, ValueTypeTests.StrongTypedId.ReferenceTypeOrder document, Marten.Storage.Tenant tenant, Marten.ISerializer serializer, System.Threading.CancellationToken cancellation)
{
await writer.WriteAsync(document.GetType().FullName, NpgsqlTypes.NpgsqlDbType.Varchar, cancellation);
- await writer.WriteAsync(document.Id.Item, NpgsqlTypes.NpgsqlDbType.Uuid, cancellation);
+ await writer.WriteAsync(((ValueTypeTests.StrongTypedId.ReferenceTypeOrder)document).Id.Item, NpgsqlTypes.NpgsqlDbType.Uuid, cancellation);
await writer.WriteAsync(JasperFx.Core.CombGuidIdGeneration.NewGuid(), NpgsqlTypes.NpgsqlDbType.Uuid, cancellation);
await writer.WriteAsync(serializer.ToJson(document), NpgsqlTypes.NpgsqlDbType.Jsonb, cancellation);
}
diff --git a/src/ValueTypeTests/Internal/Generated/DocumentStorage/TeacherProvider944571072.cs b/src/ValueTypeTests/Internal/Generated/DocumentStorage/TeacherProvider944571072.cs
index 22b00630de..cae8ccc47a 100644
--- a/src/ValueTypeTests/Internal/Generated/DocumentStorage/TeacherProvider944571072.cs
+++ b/src/ValueTypeTests/Internal/Generated/DocumentStorage/TeacherProvider944571072.cs
@@ -66,9 +66,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.Teacher && ((ValueTypeTests.Teacher)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.Teacher)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -140,9 +140,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.Teacher && ((ValueTypeTests.Teacher)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.Teacher)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
@@ -214,9 +214,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.Teacher && ((ValueTypeTests.Teacher)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.Teacher)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Uuid;
}
diff --git a/src/ValueTypeTests/Internal/Generated/DocumentStorage/Team2Provider1170066519.cs b/src/ValueTypeTests/Internal/Generated/DocumentStorage/Team2Provider1170066519.cs
index 570d19ec9c..3f279b0529 100644
--- a/src/ValueTypeTests/Internal/Generated/DocumentStorage/Team2Provider1170066519.cs
+++ b/src/ValueTypeTests/Internal/Generated/DocumentStorage/Team2Provider1170066519.cs
@@ -66,9 +66,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.StrongTypedId.Team2 && ((ValueTypeTests.StrongTypedId.Team2)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.StrongTypedId.Team2)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Text;
}
@@ -140,9 +140,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.StrongTypedId.Team2 && ((ValueTypeTests.StrongTypedId.Team2)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.StrongTypedId.Team2)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Text;
}
@@ -214,9 +214,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.StrongTypedId.Team2 && ((ValueTypeTests.StrongTypedId.Team2)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.StrongTypedId.Team2)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Text;
}
diff --git a/src/ValueTypeTests/Internal/Generated/DocumentStorage/Team3Provider396017422.cs b/src/ValueTypeTests/Internal/Generated/DocumentStorage/Team3Provider396017422.cs
index 89826e04d0..91e955999c 100644
--- a/src/ValueTypeTests/Internal/Generated/DocumentStorage/Team3Provider396017422.cs
+++ b/src/ValueTypeTests/Internal/Generated/DocumentStorage/Team3Provider396017422.cs
@@ -65,7 +65,7 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
// .Net Class Type
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value);
+ var parameter2 = parameterBuilder.AppendParameter((document is ValueTypeTests.StrongTypedId.Team3 ? ((ValueTypeTests.StrongTypedId.Team3)document).Id.Value : default(string)));
setVersionParameter(parameterBuilder);
builder.Append(')');
}
@@ -128,7 +128,7 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
// .Net Class Type
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value);
+ var parameter2 = parameterBuilder.AppendParameter((document is ValueTypeTests.StrongTypedId.Team3 ? ((ValueTypeTests.StrongTypedId.Team3)document).Id.Value : default(string)));
setVersionParameter(parameterBuilder);
builder.Append(')');
}
@@ -191,7 +191,7 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
// .Net Class Type
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value);
+ var parameter2 = parameterBuilder.AppendParameter((document is ValueTypeTests.StrongTypedId.Team3 ? ((ValueTypeTests.StrongTypedId.Team3)document).Id.Value : default(string)));
setVersionParameter(parameterBuilder);
builder.Append(')');
}
diff --git a/src/ValueTypeTests/Internal/Generated/DocumentStorage/TeamProvider700129768.cs b/src/ValueTypeTests/Internal/Generated/DocumentStorage/TeamProvider700129768.cs
index 0a549f2599..a4846e4cb3 100644
--- a/src/ValueTypeTests/Internal/Generated/DocumentStorage/TeamProvider700129768.cs
+++ b/src/ValueTypeTests/Internal/Generated/DocumentStorage/TeamProvider700129768.cs
@@ -66,9 +66,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.VogenIds.Team && ((ValueTypeTests.VogenIds.Team)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.VogenIds.Team)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Text;
}
@@ -140,9 +140,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.VogenIds.Team && ((ValueTypeTests.VogenIds.Team)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.VogenIds.Team)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Text;
}
@@ -214,9 +214,9 @@ public override void ConfigureParameters(Weasel.Postgresql.IGroupedParameterBuil
var parameter1 = parameterBuilder.AppendParameter(_document.GetType().FullName);
parameter1.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Varchar;
- if (document.Id != null)
+ if ((document is ValueTypeTests.VogenIds.Team && ((ValueTypeTests.VogenIds.Team)document).Id.Value.Value != null))
{
- var parameter2 = parameterBuilder.AppendParameter(document.Id.Value.Value);
+ var parameter2 = parameterBuilder.AppendParameter(((ValueTypeTests.VogenIds.Team)document).Id.Value.Value);
parameter2.NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Text;
}