From d176d3cb6391975426c213dea1ac150972938a2b Mon Sep 17 00:00:00 2001
From: reeb
Date: Sun, 10 May 2026 11:15:14 +0300
Subject: [PATCH 1/4] RDoc-1655 - add `GoogleCloud` to versioned docs allowed
 backup destinations (backup documentation in current version - 7.2 - is
 already updated with it)

---
 .../version-6.2/server/configuration/backup-configuration.mdx | 1 +
 .../version-7.0/server/configuration/backup-configuration.mdx | 1 +
 .../version-7.1/server/configuration/backup-configuration.mdx | 1 +
 3 files changed, 3 insertions(+)

diff --git a/versioned_docs/version-6.2/server/configuration/backup-configuration.mdx b/versioned_docs/version-6.2/server/configuration/backup-configuration.mdx
index 69161427c4..0b8c9a8c37 100644
--- a/versioned_docs/version-6.2/server/configuration/backup-configuration.mdx
+++ b/versioned_docs/version-6.2/server/configuration/backup-configuration.mdx
@@ -65,6 +65,7 @@ Possible values:
 - `Azure`
 - `AmazonGlacier`
 - `AmazonS3`
+- `GoogleCloud`
 - `FTP`

diff --git a/versioned_docs/version-7.0/server/configuration/backup-configuration.mdx b/versioned_docs/version-7.0/server/configuration/backup-configuration.mdx
index 69161427c4..0b8c9a8c37 100644
--- a/versioned_docs/version-7.0/server/configuration/backup-configuration.mdx
+++ b/versioned_docs/version-7.0/server/configuration/backup-configuration.mdx
@@ -65,6 +65,7 @@ Possible values:
 - `Azure`
 - `AmazonGlacier`
 - `AmazonS3`
+- `GoogleCloud`
 - `FTP`

diff --git a/versioned_docs/version-7.1/server/configuration/backup-configuration.mdx b/versioned_docs/version-7.1/server/configuration/backup-configuration.mdx
index 69161427c4..0b8c9a8c37 100644
--- a/versioned_docs/version-7.1/server/configuration/backup-configuration.mdx
+++ b/versioned_docs/version-7.1/server/configuration/backup-configuration.mdx
@@ -65,6 +65,7 @@ Possible values:
 - `Azure`
 - `AmazonGlacier`
 - `AmazonS3`
+- `GoogleCloud`
 - `FTP`

From a58361339f933d76ef5370b34e6861e17ce1482f Mon Sep 17 00:00:00 2001
From: reeb
Date: Sun, 10 May 2026 11:39:09 +0300
Subject: [PATCH 2/4] RDoc-2310 - change the wording in references to Studio
 page from "cluster certificate" to "server certificate"

---
 .../certificate-renewal-and-rotation.mdx | 10 +++++-----
 .../certificate-renewal-and-rotation.mdx | 10 +++++-----
 .../certificate-renewal-and-rotation.mdx | 10 +++++-----
 .../certificate-renewal-and-rotation.mdx | 10 +++++-----
 4 files changed, 20 insertions(+), 20 deletions(-)

diff --git a/docs/server/security/authentication/certificate-renewal-and-rotation.mdx b/docs/server/security/authentication/certificate-renewal-and-rotation.mdx
index f6efcce391..21a88da883 100644
--- a/docs/server/security/authentication/certificate-renewal-and-rotation.mdx
+++ b/docs/server/security/authentication/certificate-renewal-and-rotation.mdx
@@ -26,13 +26,13 @@ You can also ignore these limits and replace the certificates immediately but be
 To manually replace the server certificate you can either edit [settings.json](../../configuration/configuration-options.mdx#json) with a new certificate path and restart the server or you can overwrite the existing certificate file and the server will pick it up within one hour without requiring a restart.
 
-
+
 The new certificate must contain all of the cluster domain names in the CN or ASN properties of the certificate.
 Otherwise you will get an authentication error because SSL/TLS requires the domain in the certificate to match with the actual domain being used.
 
-## Replace the Cluster Certificate Using the Studio
+## Replace the Server Certificate Using Studio
 
-Access the certificate view, click on `Cluster certificate` -> `Replace cluster certificate` and upload the new certificate PFX file.
+Access the [certificate view](../../../studio/server/certificates/server-management-certificates-view.mdx), click on `Server certificate` -> `Replace server certificate` and upload the new certificate PFX file.
 This will start the certificate replacement process.
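The Studio replacement flow changed by the hunk above also has a client-API counterpart. As a hedged sketch only (it assumes the RavenDB client's `ReplaceClusterCertificateOperation`, which is not shown in this patch, and the certificate path is hypothetical):

```csharp
using System.IO;
using Raven.Client.Documents;
using Raven.Client.ServerWide.Operations.Certificates;

// Hypothetical path to the new certificate PFX file
var certBytes = File.ReadAllBytes(@"C:\secrets\new-server-cert.pfx");

// replaceImmediately: false lets the cluster wait until all nodes
// confirm receipt of the new certificate before switching over,
// mirroring the default behavior of the Studio flow
store.Maintenance.Server.Send(
    new ReplaceClusterCertificateOperation(certBytes, replaceImmediately: false));
```

Verify the operation name and signature against your client version before relying on this.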
@@ -48,9 +48,9 @@ If a node is not responding during the replacement, the operation will not compl * `Replace immediately` is chosen. In this case, the cluster will complete the operation without the node which is down. When bringing that node up, the certificate must be replaced manually. -During the process you will receive alerts in the studio and in the logs indicating the status of the operation and any errors if they occur. The alerts are displayed for each node independently. +During the process you will receive alerts in Studio and in the logs indicating the status of the operation and any errors if they occur. The alerts are displayed for each node independently. -## Replace the Cluster Certificate Using Powershell +## Replace the Server Certificate Using Powershell Here is a little example of using the REST API directly with powershell to replace the cluster certificate: diff --git a/versioned_docs/version-6.2/server/security/authentication/certificate-renewal-and-rotation.mdx b/versioned_docs/version-6.2/server/security/authentication/certificate-renewal-and-rotation.mdx index 0521c22160..bb77764d24 100644 --- a/versioned_docs/version-6.2/server/security/authentication/certificate-renewal-and-rotation.mdx +++ b/versioned_docs/version-6.2/server/security/authentication/certificate-renewal-and-rotation.mdx @@ -25,13 +25,13 @@ You can also ignore these limits and replace the certificates immediately but be To manually replace the server certificate you can either edit [settings.json](../../configuration/configuration-options.mdx#json) with a new certificate path and restart the server or you can overwrite the existing certificate file and the server will pick it up within one hour without requiring a restart. - + The new certificate must contain all of the cluster domain names in the CN or ASN properties of the certificate. 
Otherwise you will get an authentication error because SSL/TLS requires the domain in the certificate to match with the actual domain being used. -## Replace the Cluster Certificate Using the Studio +## Replace the Server Certificate Using Studio -Access the certificate view, click on `Cluster certificate` -> `Replace cluster certificate` and upload the new certificate PFX file. +Access the [certificate view](../../../studio/server/certificates/server-management-certificates-view.mdx), click on `Server certificate` -> `Replace server certificate` and upload the new certificate PFX file. This will start the certificate replacement process. @@ -47,9 +47,9 @@ If a node is not responding during the replacement, the operation will not compl * `Replace immediately` is chosen. In this case, the cluster will complete the operation without the node which is down. When bringing that node up, the certificate must be replaced manually. -During the process you will receive alerts in the studio and in the logs indicating the status of the operation and any errors if they occur. The alerts are displayed for each node independently. +During the process you will receive alerts in Studio and in the logs indicating the status of the operation and any errors if they occur. The alerts are displayed for each node independently. 
-## Replace the Cluster Certificate Using Powershell +## Replace the Server Certificate Using Powershell Here is a little example of using the REST API directly with powershell to replace the cluster certificate: diff --git a/versioned_docs/version-7.0/server/security/authentication/certificate-renewal-and-rotation.mdx b/versioned_docs/version-7.0/server/security/authentication/certificate-renewal-and-rotation.mdx index 0521c22160..bb77764d24 100644 --- a/versioned_docs/version-7.0/server/security/authentication/certificate-renewal-and-rotation.mdx +++ b/versioned_docs/version-7.0/server/security/authentication/certificate-renewal-and-rotation.mdx @@ -25,13 +25,13 @@ You can also ignore these limits and replace the certificates immediately but be To manually replace the server certificate you can either edit [settings.json](../../configuration/configuration-options.mdx#json) with a new certificate path and restart the server or you can overwrite the existing certificate file and the server will pick it up within one hour without requiring a restart. - + The new certificate must contain all of the cluster domain names in the CN or ASN properties of the certificate. Otherwise you will get an authentication error because SSL/TLS requires the domain in the certificate to match with the actual domain being used. -## Replace the Cluster Certificate Using the Studio +## Replace the Server Certificate Using Studio -Access the certificate view, click on `Cluster certificate` -> `Replace cluster certificate` and upload the new certificate PFX file. +Access the [certificate view](../../../studio/server/certificates/server-management-certificates-view.mdx), click on `Server certificate` -> `Replace server certificate` and upload the new certificate PFX file. This will start the certificate replacement process. @@ -47,9 +47,9 @@ If a node is not responding during the replacement, the operation will not compl * `Replace immediately` is chosen. 
In this case, the cluster will complete the operation without the node which is down. When bringing that node up, the certificate must be replaced manually. -During the process you will receive alerts in the studio and in the logs indicating the status of the operation and any errors if they occur. The alerts are displayed for each node independently. +During the process you will receive alerts in Studio and in the logs indicating the status of the operation and any errors if they occur. The alerts are displayed for each node independently. -## Replace the Cluster Certificate Using Powershell +## Replace the Server Certificate Using Powershell Here is a little example of using the REST API directly with powershell to replace the cluster certificate: diff --git a/versioned_docs/version-7.1/server/security/authentication/certificate-renewal-and-rotation.mdx b/versioned_docs/version-7.1/server/security/authentication/certificate-renewal-and-rotation.mdx index 0521c22160..bb77764d24 100644 --- a/versioned_docs/version-7.1/server/security/authentication/certificate-renewal-and-rotation.mdx +++ b/versioned_docs/version-7.1/server/security/authentication/certificate-renewal-and-rotation.mdx @@ -25,13 +25,13 @@ You can also ignore these limits and replace the certificates immediately but be To manually replace the server certificate you can either edit [settings.json](../../configuration/configuration-options.mdx#json) with a new certificate path and restart the server or you can overwrite the existing certificate file and the server will pick it up within one hour without requiring a restart. - + The new certificate must contain all of the cluster domain names in the CN or ASN properties of the certificate. Otherwise you will get an authentication error because SSL/TLS requires the domain in the certificate to match with the actual domain being used. 
-## Replace the Cluster Certificate Using the Studio +## Replace the Server Certificate Using Studio -Access the certificate view, click on `Cluster certificate` -> `Replace cluster certificate` and upload the new certificate PFX file. +Access the [certificate view](../../../studio/server/certificates/server-management-certificates-view.mdx), click on `Server certificate` -> `Replace server certificate` and upload the new certificate PFX file. This will start the certificate replacement process. @@ -47,9 +47,9 @@ If a node is not responding during the replacement, the operation will not compl * `Replace immediately` is chosen. In this case, the cluster will complete the operation without the node which is down. When bringing that node up, the certificate must be replaced manually. -During the process you will receive alerts in the studio and in the logs indicating the status of the operation and any errors if they occur. The alerts are displayed for each node independently. +During the process you will receive alerts in Studio and in the logs indicating the status of the operation and any errors if they occur. The alerts are displayed for each node independently. -## Replace the Cluster Certificate Using Powershell +## Replace the Server Certificate Using Powershell Here is a little example of using the REST API directly with powershell to replace the cluster certificate: From b304464af0c5f8a0bf77288b2d4a0ca0eeb129c6 Mon Sep 17 00:00:00 2001 From: reeb Date: Sun, 10 May 2026 13:20:10 +0300 Subject: [PATCH 3/4] RDoc-2387 - document Smuggler.ImportIncrementalAsync in the smuggler page and in the backup\restore page. 
--- docs/backup/restore.mdx | 14 + .../content/_what-is-smuggler-csharp.mdx | 268 ++++++++++++------ .../content/_what-is-smuggler-csharp.mdx | 268 ++++++++++++------ .../content/_what-is-smuggler-csharp.mdx | 268 ++++++++++++------ .../content/_what-is-smuggler-csharp.mdx | 268 ++++++++++++------ 5 files changed, 718 insertions(+), 368 deletions(-) diff --git a/docs/backup/restore.mdx b/docs/backup/restore.mdx index 66a26e5331..5e82a68416 100644 --- a/docs/backup/restore.mdx +++ b/docs/backup/restore.mdx @@ -23,6 +23,8 @@ import ContentFrame from "@site/src/components/ContentFrame"; * If your backup consists of full and incremental backup files, you can restore the database up to a specific point in time by selecting the last incremental backup file to restore. +* To import backup files into an **existing** database instead of creating a new one, see [Importing into an existing database](../backup/restore#importing-into-an-existing-database). + * On a [sharded](../sharding/overview.mdx) database, restore is performed per shard, using the backups created by the shards. [Learn to restore a sharded database.](../sharding/backup-and-restore/restore.mdx) @@ -40,6 +42,7 @@ import ContentFrame from "@site/src/components/ContentFrame"; * [Restoring to a single node and replicating to additional nodes](../backup/restore#restoring-to-a-single-node-and-replicating-to-additional-nodes) * [Restoring to multiple nodes simultaneously](../backup/restore#restoring-to-multiple-nodes-simultaneously) * [Restoring from server-wide backups](../backup/restore#restoring-from-server-wide-backups) + * [Importing into an existing database](../backup/restore#importing-into-an-existing-database) @@ -877,4 +880,15 @@ Restoring databases backed up by a server-wide backup task **is identical** to r - Note that each backup is stored in the backup location within its own folder, named after the database it belongs to. When restoring, make sure the correct backup folder is selected. 
+ + + + +The restore operation described on this page always creates a **new** database. +If you need to import backup files from a backup folder into an **existing** database instead, use `store.Smuggler.ImportIncrementalAsync`. + +This method accepts the path to a backup folder, finds all backup files it contains, orders them chronologically, and imports them in sequence — without creating a new database. + +[Learn about `ImportIncrementalAsync` in the Smuggler documentation.](../client-api/smuggler/what-is-smuggler.mdx#importincrementalasync) + \ No newline at end of file diff --git a/docs/client-api/smuggler/content/_what-is-smuggler-csharp.mdx b/docs/client-api/smuggler/content/_what-is-smuggler-csharp.mdx index 4bdf46a8fc..d8cdf830e2 100644 --- a/docs/client-api/smuggler/content/_what-is-smuggler-csharp.mdx +++ b/docs/client-api/smuggler/content/_what-is-smuggler-csharp.mdx @@ -2,34 +2,44 @@ import Admonition from '@theme/Admonition'; import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; import CodeBlock from '@theme/CodeBlock'; +import Panel from '@site/src/components/Panel'; +import ContentFrame from '@site/src/components/ContentFrame'; + + Smuggler gives you the ability to export or import data from or to a database using JSON format. It is exposed via the `DocumentStore.Smuggler` property. 
-## ForDatabase +* In this page: + * [ForDatabase](../../../client-api/smuggler/what-is-smuggler.mdx#fordatabase) + * [Export](../../../client-api/smuggler/what-is-smuggler.mdx#export) + * [DatabaseSmugglerExportOptions](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerexportoptions) + * [Import](../../../client-api/smuggler/what-is-smuggler.mdx#import) + * [DatabaseSmugglerImportOptions](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerimportoptions) + * [ImportIncrementalAsync](../../../client-api/smuggler/what-is-smuggler.mdx#importincrementalasync) + * [TransformScript](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript) -By default, the `DocumentStore.Smuggler` works on the default document store database from the `DocumentStore.Database` property. + -In order to switch it to a different database use the `.ForDatabase` method. + - - -{`var northwindSmuggler = store - .Smuggler - .ForDatabase("Northwind"); -`} - - +By default, `DocumentStore.Smuggler` works against the default database from the `DocumentStore.Database` property. +To target a different database, use `ForDatabase`: + +```csharp +var northwindSmuggler = store.Smuggler.ForDatabase("Northwind"); +``` + + -## Export + ### Syntax - - -{`Task ExportAsync( +```csharp +Task ExportAsync( DatabaseSmugglerExportOptions options, DatabaseSmuggler toDatabase, CancellationToken token = default(CancellationToken)); @@ -38,65 +48,77 @@ Task ExportAsync( DatabaseSmugglerExportOptions options, string toFile, CancellationToken token = default(CancellationToken)); -`} - - +``` + +
-| Parameters | | | -| ------------- | ------------- | ----- | +| Parameter | Type | Description | +| --- | --- | --- | | **options** | `DatabaseSmugglerExportOptions` | Options that will be used during the export. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerexportoptions). | -| **toDatabase** | `DatabaseSmuggler` | `DatabaseSmuggler` instance used as a destination | -| **toFile** | `string` | Path to a file where exported data will be written | -| **token** | `CancellationToken` | Token used to cancel the operation | +| **toDatabase** | `DatabaseSmuggler` | `DatabaseSmuggler` instance used as a destination. | +| **toFile** | `string` | Path to a file where exported data will be written. | +| **token** | `CancellationToken` | Token used to cancel the operation. | + +| Return value | | +| --- | --- | +| `Task` | Instance of Operation class which gives you an ability to wait for the operation to complete and subscribe to operation progress events. | -| Return Value | | -| ------------- | ----- | -| `Operation` | Instance of Operation class which gives you an ability to wait for the operation to complete and subscribe to operation progress events | +
+ +--- + + ### DatabaseSmugglerExportOptions -| Parameters | | | -| ------------- | ------------- | ----- | -| **Collections** | `List` | List of specific collections to export. If empty, then all collections will be exported.
Default: `empty` | -| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be exported.
Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | -| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be exported from database record.
Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | -| **IncludeExpired** | `bool` | Should expired documents be exported.
Default: `true` | -| **IncludeArtificial** | `bool` | Should artificial documents be exported.
Default: `false` | -| **IncludeArchived** | `bool` | Should archived documents be exported.
Default: `true` | -| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes.
Default: `false` | +| Property | Type | Description | +| --- | --- | --- | +| **Collections** | `List` | List of specific collections to export. If empty, then all collections will be exported. Default: `empty` | +| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be exported. Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | +| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be exported from database record. Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | +| **IncludeExpired** | `bool` | Should expired documents be exported. Default: `true` | +| **IncludeArtificial** | `bool` | Should artificial documents be exported. Default: `false` | +| **IncludeArchived** | `bool` | Should archived documents be exported. Default: `true` | +| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes. Default: `false` | | **TransformScript** | `string` | JavaScript-based script applied to every exported document. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript). | -| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing.
Default: **10000** | +| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing. Default: `10000` | + +
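Several of the options tabulated above can be combined in a single export call. A minimal sketch, using only properties and the `ExportAsync(options, toFile)` overload documented on this page (the collection names and file path are hypothetical):

```csharp
using System.Collections.Generic;
using Raven.Client.Documents.Smuggler;

var exportOptions = new DatabaseSmugglerExportOptions
{
    // Export only these collections instead of the whole database
    Collections = new List<string> { "Orders", "Employees" },
    // Limit the exported item types to documents and their attachments
    OperateOnTypes = DatabaseItemType.Documents | DatabaseItemType.Attachments,
    // Skip documents that have already expired
    IncludeExpired = false
};

var exportOperation = await store
    .Smuggler
    .ExportAsync(exportOptions, @"C:\ravendb-exports\Orders.ravendbdump");

await exportOperation.WaitForCompletionAsync();
```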
+ +--- + + ### Example - - -{`// export only Indexes and Documents to a given file +```csharp +// Export only Indexes and Documents to a given file var exportOperation = await store .Smuggler .ExportAsync( new DatabaseSmugglerExportOptions - \{ + { OperateOnTypes = DatabaseItemType.Indexes | DatabaseItemType.Documents - \}, - @"C:\\ravendb-exports\\Northwind.ravendbdump", + }, + @"C:\ravendb-exports\Northwind.ravendbdump", token); await exportOperation.WaitForCompletionAsync(); -`} - - +``` + +
-## Import + + + ### Syntax - - -{`Task ImportAsync( +```csharp +Task ImportAsync( DatabaseSmugglerImportOptions options, Stream stream, CancellationToken token = default(CancellationToken)); @@ -105,75 +127,137 @@ Task ImportAsync( DatabaseSmugglerImportOptions options, string fromFile, CancellationToken token = default(CancellationToken)); -`} - - +``` + +
-| Parameters | | | -| ------------- | ------------- | ----- | +| Parameter | Type | Description | +| --- | --- | --- | | **options** | `DatabaseSmugglerImportOptions` | Options that will be used during the import. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerimportoptions). | -| **stream** | `Stream` | Stream with data to import | -| **fromFile** | `string` | Path to a file from which data will be imported | -| **token** | `CancellationToken` | Token used to cancel the operation | +| **stream** | `Stream` | Stream with data to import. | +| **fromFile** | `string` | Path to a file from which data will be imported. | +| **token** | `CancellationToken` | Token used to cancel the operation. | + +| Return value | | +| --- | --- | +| `Task` | Instance of Operation-class which gives you an ability to wait for the operation to complete and subscribe to operation progress events. | + +
-| Return Value | | -| ------------- | ----- | -| `Operation` | Instance of Operation-class which gives you an ability to wait for the operation to complete and subscribe to operation progress events | +--- + + ### DatabaseSmugglerImportOptions -| Parameters | | | -| - | - | - | -| **Collections** | `List` | List of specific collections to import. If empty, then all collections will be imported.
Default: `empty` | -| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be imported.
Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | -| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be imported from database record.
Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | -| **IncludeExpired** | `bool` | Should expired documents be imported.
Default: `true` | -| **IncludeArtificial** | `bool` | Should artificial documents be imported.
Default: `false` | -| **IncludeArchived** | `bool` | Should archived documents be imported.
Default: `true` | -| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes.
Default: `false` | +| Property | Type | Description | +| --- | --- | --- | +| **Collections** | `List` | List of specific collections to import. If empty, then all collections will be imported. Default: `empty` | +| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be imported. Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | +| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be imported from database record. Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | +| **IncludeExpired** | `bool` | Should expired documents be imported. Default: `true` | +| **IncludeArtificial** | `bool` | Should artificial documents be imported. Default: `false` | +| **IncludeArchived** | `bool` | Should archived documents be imported. Default: `true` | +| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes. Default: `false` | | **TransformScript** | `string` | JavaScript-based script applied to every imported document. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript). | -| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing.
Default: **10000** | +| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing. Default: `10000` | + +
+ +--- + + ### Example - - -{`// import only Documents from a given file +```csharp +// Import only Documents from a given file var importOperation = await store .Smuggler .ImportAsync( new DatabaseSmugglerImportOptions - \{ + { OperateOnTypes = DatabaseItemType.Documents - \}, - // import the .ravendbdump file that you exported (i.e. in the export example above) - @"C:\\ravendb-exports\\Northwind.ravendbdump", + }, + @"C:\ravendb-exports\Northwind.ravendbdump", token); await importOperation.WaitForCompletionAsync(); -`} - - +``` + +
-## TransformScript + -`TransformScript` exposes the ability to modify or even filter-out the document during the import and export process using the provided JavaScript. +A RavenDB backup folder contains one full backup file and any incremental backup files accumulated since the last full backup. +Use `ImportIncrementalAsync` to import all backup files from such a folder into an existing database in a single call, +instead of importing each file individually. -Underneath the JavaScript engine is exactly the same as used for [patching operations](../../../client-api/operations/patching/single-document.mdx) giving you identical syntax and capabilities with additional **ability to filter out documents by throwing a 'skip' exception**. +The method finds all backup files in the specified directory, orders them chronologically, and imports them in sequence. +Indexes and Subscriptions are excluded from all files except the last, ensuring only their most recent state is applied. - - -{`var id = this['@metadata']['@id']; -if (id === 'orders/999-A') - throw 'skip'; // filter-out + +`ImportIncrementalAsync` imports into an **existing** database. +To create a **new** database from backup files, use [`RestoreBackupOperation`](../../../backup/restore.mdx). + -this.Freight = 15.3; -`} - - + + +### Syntax + +```csharp +Task ImportIncrementalAsync( + DatabaseSmugglerImportOptions options, + string fromDirectory, + CancellationToken cancellationToken = default); +``` + +
+ +| Parameter | Type | Description | +| --- | --- | --- | +| **options** | `DatabaseSmugglerImportOptions` | Options applied to each file imported from the directory. `Tombstones` and `CompareExchangeTombstones` are added automatically. | +| **fromDirectory** | `string` | Path to the backup folder containing the backup files to import. | +| **cancellationToken** | `CancellationToken` | Token used to cancel the operation. | + +| Return value | | +| --- | --- | +| `Task` | Completes when all backup files in the folder have been imported. | + +
+ +--- + +### Example + +```csharp +// Import all backup files from a backup folder into an existing database +await store.Smuggler.ImportIncrementalAsync( + new DatabaseSmugglerImportOptions(), + @"C:\RavenDB\Backups\Northwind.2024-01-15T12-00-00"); +``` + + + +
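Because `ImportIncrementalAsync` takes the same `DatabaseSmugglerImportOptions` as `ImportAsync`, the incremental import can be narrowed and cancelled in the same way. A hedged variation of the example above (the backup folder path is hypothetical):

```csharp
using System;
using System.Threading;
using Raven.Client.Documents.Smuggler;

// Give up if the import has not finished within 30 minutes
using var cts = new CancellationTokenSource(TimeSpan.FromMinutes(30));

// Import only documents and attachments from every backup file in the folder;
// per the table above, Tombstones and CompareExchangeTombstones are added
// to OperateOnTypes automatically by the method itself
await store.Smuggler.ImportIncrementalAsync(
    new DatabaseSmugglerImportOptions
    {
        OperateOnTypes = DatabaseItemType.Documents | DatabaseItemType.Attachments
    },
    @"C:\RavenDB\Backups\Northwind.2024-01-15T12-00-00", // hypothetical folder
    cts.Token);
```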
+ + +`TransformScript` exposes the ability to modify or filter documents during the import and export process using JavaScript. + +The JavaScript engine is the same one used for [patching operations](../../../client-api/operations/patching/single-document.mdx), +with the same syntax and capabilities, plus the ability to filter out documents by throwing a `'skip'` exception. + +```javascript +var id = this['@metadata']['@id']; +if (id === 'orders/999-A') + throw 'skip'; // filter out this document + +this.Freight = 15.3; +``` + diff --git a/versioned_docs/version-6.2/client-api/smuggler/content/_what-is-smuggler-csharp.mdx b/versioned_docs/version-6.2/client-api/smuggler/content/_what-is-smuggler-csharp.mdx index 4bdf46a8fc..d8cdf830e2 100644 --- a/versioned_docs/version-6.2/client-api/smuggler/content/_what-is-smuggler-csharp.mdx +++ b/versioned_docs/version-6.2/client-api/smuggler/content/_what-is-smuggler-csharp.mdx @@ -2,34 +2,44 @@ import Admonition from '@theme/Admonition'; import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; import CodeBlock from '@theme/CodeBlock'; +import Panel from '@site/src/components/Panel'; +import ContentFrame from '@site/src/components/ContentFrame'; + + Smuggler gives you the ability to export or import data from or to a database using JSON format. It is exposed via the `DocumentStore.Smuggler` property. 
-## ForDatabase +* In this page: + * [ForDatabase](../../../client-api/smuggler/what-is-smuggler.mdx#fordatabase) + * [Export](../../../client-api/smuggler/what-is-smuggler.mdx#export) + * [DatabaseSmugglerExportOptions](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerexportoptions) + * [Import](../../../client-api/smuggler/what-is-smuggler.mdx#import) + * [DatabaseSmugglerImportOptions](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerimportoptions) + * [ImportIncrementalAsync](../../../client-api/smuggler/what-is-smuggler.mdx#importincrementalasync) + * [TransformScript](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript) -By default, the `DocumentStore.Smuggler` works on the default document store database from the `DocumentStore.Database` property. + -In order to switch it to a different database use the `.ForDatabase` method. + - - -{`var northwindSmuggler = store - .Smuggler - .ForDatabase("Northwind"); -`} - - +By default, `DocumentStore.Smuggler` works against the default database from the `DocumentStore.Database` property. +To target a different database, use `ForDatabase`: + +```csharp +var northwindSmuggler = store.Smuggler.ForDatabase("Northwind"); +``` + + -## Export + ### Syntax - - -{`Task ExportAsync( +```csharp +Task ExportAsync( DatabaseSmugglerExportOptions options, DatabaseSmuggler toDatabase, CancellationToken token = default(CancellationToken)); @@ -38,65 +48,77 @@ Task ExportAsync( DatabaseSmugglerExportOptions options, string toFile, CancellationToken token = default(CancellationToken)); -`} - - +``` + +
-| Parameters | | | -| ------------- | ------------- | ----- | +| Parameter | Type | Description | +| --- | --- | --- | | **options** | `DatabaseSmugglerExportOptions` | Options that will be used during the export. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerexportoptions). | -| **toDatabase** | `DatabaseSmuggler` | `DatabaseSmuggler` instance used as a destination | -| **toFile** | `string` | Path to a file where exported data will be written | -| **token** | `CancellationToken` | Token used to cancel the operation | +| **toDatabase** | `DatabaseSmuggler` | `DatabaseSmuggler` instance used as a destination. | +| **toFile** | `string` | Path to a file where exported data will be written. | +| **token** | `CancellationToken` | Token used to cancel the operation. | + +| Return value | | +| --- | --- | +| `Task` | An `Operation` instance that lets you wait for the operation to complete and subscribe to its progress events. | -| Return Value | | -| ------------- | ----- | -| `Operation` | Instance of Operation class which gives you an ability to wait for the operation to complete and subscribe to operation progress events | +
+ +--- + + ### DatabaseSmugglerExportOptions -| Parameters | | | -| ------------- | ------------- | ----- | -| **Collections** | `List` | List of specific collections to export. If empty, then all collections will be exported.
Default: `empty` | -| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be exported.
Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | -| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be exported from database record.
Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | -| **IncludeExpired** | `bool` | Should expired documents be exported.
Default: `true` | -| **IncludeArtificial** | `bool` | Should artificial documents be exported.
Default: `false` | -| **IncludeArchived** | `bool` | Should archived documents be exported.
Default: `true` | -| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes.
Default: `false` | +| Property | Type | Description | +| --- | --- | --- | +| **Collections** | `List` | List of specific collections to export. If empty, then all collections will be exported. Default: `empty` | +| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be exported. Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | +| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be exported from database record. Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | +| **IncludeExpired** | `bool` | Should expired documents be exported. Default: `true` | +| **IncludeArtificial** | `bool` | Should artificial documents be exported. Default: `false` | +| **IncludeArchived** | `bool` | Should archived documents be exported. Default: `true` | +| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes. Default: `false` | | **TransformScript** | `string` | JavaScript-based script applied to every exported document. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript). | -| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing.
Default: **10000** | +| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing. Default: `10000` | + +
+ +--- + + ### Example - - -{`// export only Indexes and Documents to a given file +```csharp +// Export only Indexes and Documents to a given file var exportOperation = await store .Smuggler .ExportAsync( new DatabaseSmugglerExportOptions - \{ + { OperateOnTypes = DatabaseItemType.Indexes | DatabaseItemType.Documents - \}, - @"C:\\ravendb-exports\\Northwind.ravendbdump", + }, + @"C:\ravendb-exports\Northwind.ravendbdump", token); await exportOperation.WaitForCompletionAsync(); -`} - - +``` + +
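+
+The `toDatabase` overload shown in the syntax above can be sketched in the same way. This is an illustrative sketch, not part of the original example: the `targetStore` document store and the `NorthwindCopy` database name are assumptions.
+
+```csharp
+// Export only Documents directly into another live database,
+// using a DatabaseSmuggler instance as the destination
+var exportToDbOperation = await store
+    .Smuggler
+    .ExportAsync(
+        new DatabaseSmugglerExportOptions
+        {
+            OperateOnTypes = DatabaseItemType.Documents
+        },
+        targetStore.Smuggler.ForDatabase("NorthwindCopy"),
+        token);
+
+await exportToDbOperation.WaitForCompletionAsync();
+```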
-## Import + + + ### Syntax - - -{`Task ImportAsync( +```csharp +Task ImportAsync( DatabaseSmugglerImportOptions options, Stream stream, CancellationToken token = default(CancellationToken)); @@ -105,75 +127,137 @@ Task ImportAsync( DatabaseSmugglerImportOptions options, string fromFile, CancellationToken token = default(CancellationToken)); -`} - - +``` + +
-| Parameters | | | -| ------------- | ------------- | ----- | +| Parameter | Type | Description | +| --- | --- | --- | | **options** | `DatabaseSmugglerImportOptions` | Options that will be used during the import. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerimportoptions). | -| **stream** | `Stream` | Stream with data to import | -| **fromFile** | `string` | Path to a file from which data will be imported | -| **token** | `CancellationToken` | Token used to cancel the operation | +| **stream** | `Stream` | Stream with data to import. | +| **fromFile** | `string` | Path to a file from which data will be imported. | +| **token** | `CancellationToken` | Token used to cancel the operation. | + +| Return value | | +| --- | --- | +| `Task` | An `Operation` instance that lets you wait for the operation to complete and subscribe to its progress events. | + +
-| Return Value | | -| ------------- | ----- | -| `Operation` | Instance of Operation-class which gives you an ability to wait for the operation to complete and subscribe to operation progress events | +--- + + ### DatabaseSmugglerImportOptions -| Parameters | | | -| - | - | - | -| **Collections** | `List` | List of specific collections to import. If empty, then all collections will be imported.
Default: `empty` | -| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be imported.
Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | -| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be imported from database record.
Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | -| **IncludeExpired** | `bool` | Should expired documents be imported.
Default: `true` | -| **IncludeArtificial** | `bool` | Should artificial documents be imported.
Default: `false` | -| **IncludeArchived** | `bool` | Should archived documents be imported.
Default: `true` | -| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes.
Default: `false` | +| Property | Type | Description | +| --- | --- | --- | +| **Collections** | `List` | List of specific collections to import. If empty, then all collections will be imported. Default: `empty` | +| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be imported. Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | +| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be imported from database record. Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | +| **IncludeExpired** | `bool` | Should expired documents be imported. Default: `true` | +| **IncludeArtificial** | `bool` | Should artificial documents be imported. Default: `false` | +| **IncludeArchived** | `bool` | Should archived documents be imported. Default: `true` | +| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes. Default: `false` | | **TransformScript** | `string` | JavaScript-based script applied to every imported document. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript). | -| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing.
Default: **10000** | +| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing. Default: `10000` | + +
+ +--- + + ### Example - - -{`// import only Documents from a given file +```csharp +// Import only Documents from a given file var importOperation = await store .Smuggler .ImportAsync( new DatabaseSmugglerImportOptions - \{ + { OperateOnTypes = DatabaseItemType.Documents - \}, - // import the .ravendbdump file that you exported (i.e. in the export example above) - @"C:\\ravendb-exports\\Northwind.ravendbdump", + }, + @"C:\ravendb-exports\Northwind.ravendbdump", token); await importOperation.WaitForCompletionAsync(); -`} - - +``` + +
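+
+The `Stream` overload documented above can be sketched as follows. This is an illustrative sketch: the file path is reused from the example above for convenience, but any readable stream containing exported data will work (requires `System.IO`).
+
+```csharp
+// Import from an open stream rather than a file path
+using (var stream = File.OpenRead(@"C:\ravendb-exports\Northwind.ravendbdump"))
+{
+    var importFromStreamOperation = await store
+        .Smuggler
+        .ImportAsync(
+            new DatabaseSmugglerImportOptions(),
+            stream,
+            token);
+
+    await importFromStreamOperation.WaitForCompletionAsync();
+}
+```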
-## TransformScript + -`TransformScript` exposes the ability to modify or even filter-out the document during the import and export process using the provided JavaScript. +A RavenDB backup folder contains one full backup file and any incremental backup files accumulated since the last full backup. +Use `ImportIncrementalAsync` to import all backup files from such a folder into an existing database in a single call, +instead of importing each file individually. -Underneath the JavaScript engine is exactly the same as used for [patching operations](../../../client-api/operations/patching/single-document.mdx) giving you identical syntax and capabilities with additional **ability to filter out documents by throwing a 'skip' exception**. +The method finds all backup files in the specified directory, orders them chronologically, and imports them in sequence. +Indexes and Subscriptions are excluded from all files except the last, ensuring only their most recent state is applied. - - -{`var id = this['@metadata']['@id']; -if (id === 'orders/999-A') - throw 'skip'; // filter-out + +`ImportIncrementalAsync` imports into an **existing** database. +To create a **new** database from backup files, use [`RestoreBackupOperation`](../../../backup/restore.mdx). + -this.Freight = 15.3; -`} - - + + +### Syntax + +```csharp +Task ImportIncrementalAsync( + DatabaseSmugglerImportOptions options, + string fromDirectory, + CancellationToken cancellationToken = default); +``` + +
+ +| Parameter | Type | Description | +| --- | --- | --- | +| **options** | `DatabaseSmugglerImportOptions` | Options applied to each file imported from the directory. `Tombstones` and `CompareExchangeTombstones` are added automatically. | +| **fromDirectory** | `string` | Path to the backup folder containing the backup files to import. | +| **cancellationToken** | `CancellationToken` | Token used to cancel the operation. | + +| Return value | | +| --- | --- | +| `Task` | Completes when all backup files in the folder have been imported. | + +
+ +--- + +### Example + +```csharp +// Import all backup files from a backup folder into an existing database +await store.Smuggler.ImportIncrementalAsync( + new DatabaseSmugglerImportOptions(), + @"C:\RavenDB\Backups\Northwind.2024-01-15T12-00-00"); +``` + + + +
+ + +`TransformScript` exposes the ability to modify or filter documents during the import and export process using JavaScript. + +The JavaScript engine is the same one used for [patching operations](../../../client-api/operations/patching/single-document.mdx), +with the same syntax and capabilities, plus the ability to filter out documents by throwing a `'skip'` exception. + +```javascript +var id = this['@metadata']['@id']; +if (id === 'orders/999-A') + throw 'skip'; // filter out this document + +this.Freight = 15.3; +``` + diff --git a/versioned_docs/version-7.0/client-api/smuggler/content/_what-is-smuggler-csharp.mdx b/versioned_docs/version-7.0/client-api/smuggler/content/_what-is-smuggler-csharp.mdx index 4bdf46a8fc..d8cdf830e2 100644 --- a/versioned_docs/version-7.0/client-api/smuggler/content/_what-is-smuggler-csharp.mdx +++ b/versioned_docs/version-7.0/client-api/smuggler/content/_what-is-smuggler-csharp.mdx @@ -2,34 +2,44 @@ import Admonition from '@theme/Admonition'; import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; import CodeBlock from '@theme/CodeBlock'; +import Panel from '@site/src/components/Panel'; +import ContentFrame from '@site/src/components/ContentFrame'; + + Smuggler gives you the ability to export or import data from or to a database using JSON format. It is exposed via the `DocumentStore.Smuggler` property. 
-## ForDatabase +* In this page: + * [ForDatabase](../../../client-api/smuggler/what-is-smuggler.mdx#fordatabase) + * [Export](../../../client-api/smuggler/what-is-smuggler.mdx#export) + * [DatabaseSmugglerExportOptions](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerexportoptions) + * [Import](../../../client-api/smuggler/what-is-smuggler.mdx#import) + * [DatabaseSmugglerImportOptions](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerimportoptions) + * [ImportIncrementalAsync](../../../client-api/smuggler/what-is-smuggler.mdx#importincrementalasync) + * [TransformScript](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript) -By default, the `DocumentStore.Smuggler` works on the default document store database from the `DocumentStore.Database` property. + -In order to switch it to a different database use the `.ForDatabase` method. + - - -{`var northwindSmuggler = store - .Smuggler - .ForDatabase("Northwind"); -`} - - +By default, `DocumentStore.Smuggler` works against the default database from the `DocumentStore.Database` property. +To target a different database, use `ForDatabase`: + +```csharp +var northwindSmuggler = store.Smuggler.ForDatabase("Northwind"); +``` + + -## Export + ### Syntax - - -{`Task ExportAsync( +```csharp +Task ExportAsync( DatabaseSmugglerExportOptions options, DatabaseSmuggler toDatabase, CancellationToken token = default(CancellationToken)); @@ -38,65 +48,77 @@ Task ExportAsync( DatabaseSmugglerExportOptions options, string toFile, CancellationToken token = default(CancellationToken)); -`} - - +``` + +
-| Parameters | | | -| ------------- | ------------- | ----- | +| Parameter | Type | Description | +| --- | --- | --- | | **options** | `DatabaseSmugglerExportOptions` | Options that will be used during the export. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerexportoptions). | -| **toDatabase** | `DatabaseSmuggler` | `DatabaseSmuggler` instance used as a destination | -| **toFile** | `string` | Path to a file where exported data will be written | -| **token** | `CancellationToken` | Token used to cancel the operation | +| **toDatabase** | `DatabaseSmuggler` | `DatabaseSmuggler` instance used as a destination. | +| **toFile** | `string` | Path to a file where exported data will be written. | +| **token** | `CancellationToken` | Token used to cancel the operation. | + +| Return value | | +| --- | --- | +| `Task` | An `Operation` instance that lets you wait for the operation to complete and subscribe to its progress events. | -| Return Value | | -| ------------- | ----- | -| `Operation` | Instance of Operation class which gives you an ability to wait for the operation to complete and subscribe to operation progress events | +
+ +--- + + ### DatabaseSmugglerExportOptions -| Parameters | | | -| ------------- | ------------- | ----- | -| **Collections** | `List` | List of specific collections to export. If empty, then all collections will be exported.
Default: `empty` | -| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be exported.
Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | -| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be exported from database record.
Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | -| **IncludeExpired** | `bool` | Should expired documents be exported.
Default: `true` | -| **IncludeArtificial** | `bool` | Should artificial documents be exported.
Default: `false` | -| **IncludeArchived** | `bool` | Should archived documents be exported.
Default: `true` | -| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes.
Default: `false` | +| Property | Type | Description | +| --- | --- | --- | +| **Collections** | `List` | List of specific collections to export. If empty, then all collections will be exported. Default: `empty` | +| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be exported. Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | +| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be exported from database record. Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | +| **IncludeExpired** | `bool` | Should expired documents be exported. Default: `true` | +| **IncludeArtificial** | `bool` | Should artificial documents be exported. Default: `false` | +| **IncludeArchived** | `bool` | Should archived documents be exported. Default: `true` | +| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes. Default: `false` | | **TransformScript** | `string` | JavaScript-based script applied to every exported document. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript). | -| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing.
Default: **10000** | +| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing. Default: `10000` | + +
+ +--- + + ### Example - - -{`// export only Indexes and Documents to a given file +```csharp +// Export only Indexes and Documents to a given file var exportOperation = await store .Smuggler .ExportAsync( new DatabaseSmugglerExportOptions - \{ + { OperateOnTypes = DatabaseItemType.Indexes | DatabaseItemType.Documents - \}, - @"C:\\ravendb-exports\\Northwind.ravendbdump", + }, + @"C:\ravendb-exports\Northwind.ravendbdump", token); await exportOperation.WaitForCompletionAsync(); -`} - - +``` + +
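+
+The `toDatabase` overload shown in the syntax above can be sketched in the same way. This is an illustrative sketch, not part of the original example: the `targetStore` document store and the `NorthwindCopy` database name are assumptions.
+
+```csharp
+// Export only Documents directly into another live database,
+// using a DatabaseSmuggler instance as the destination
+var exportToDbOperation = await store
+    .Smuggler
+    .ExportAsync(
+        new DatabaseSmugglerExportOptions
+        {
+            OperateOnTypes = DatabaseItemType.Documents
+        },
+        targetStore.Smuggler.ForDatabase("NorthwindCopy"),
+        token);
+
+await exportToDbOperation.WaitForCompletionAsync();
+```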
-## Import + + + ### Syntax - - -{`Task ImportAsync( +```csharp +Task ImportAsync( DatabaseSmugglerImportOptions options, Stream stream, CancellationToken token = default(CancellationToken)); @@ -105,75 +127,137 @@ Task ImportAsync( DatabaseSmugglerImportOptions options, string fromFile, CancellationToken token = default(CancellationToken)); -`} - - +``` + +
-| Parameters | | | -| ------------- | ------------- | ----- | +| Parameter | Type | Description | +| --- | --- | --- | | **options** | `DatabaseSmugglerImportOptions` | Options that will be used during the import. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerimportoptions). | -| **stream** | `Stream` | Stream with data to import | -| **fromFile** | `string` | Path to a file from which data will be imported | -| **token** | `CancellationToken` | Token used to cancel the operation | +| **stream** | `Stream` | Stream with data to import. | +| **fromFile** | `string` | Path to a file from which data will be imported. | +| **token** | `CancellationToken` | Token used to cancel the operation. | + +| Return value | | +| --- | --- | +| `Task` | An `Operation` instance that lets you wait for the operation to complete and subscribe to its progress events. | + +
-| Return Value | | -| ------------- | ----- | -| `Operation` | Instance of Operation-class which gives you an ability to wait for the operation to complete and subscribe to operation progress events | +--- + + ### DatabaseSmugglerImportOptions -| Parameters | | | -| - | - | - | -| **Collections** | `List` | List of specific collections to import. If empty, then all collections will be imported.
Default: `empty` | -| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be imported.
Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | -| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be imported from database record.
Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | -| **IncludeExpired** | `bool` | Should expired documents be imported.
Default: `true` | -| **IncludeArtificial** | `bool` | Should artificial documents be imported.
Default: `false` | -| **IncludeArchived** | `bool` | Should archived documents be imported.
Default: `true` | -| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes.
Default: `false` | +| Property | Type | Description | +| --- | --- | --- | +| **Collections** | `List` | List of specific collections to import. If empty, then all collections will be imported. Default: `empty` | +| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be imported. Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | +| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be imported from database record. Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | +| **IncludeExpired** | `bool` | Should expired documents be imported. Default: `true` | +| **IncludeArtificial** | `bool` | Should artificial documents be imported. Default: `false` | +| **IncludeArchived** | `bool` | Should archived documents be imported. Default: `true` | +| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes. Default: `false` | | **TransformScript** | `string` | JavaScript-based script applied to every imported document. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript). | -| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing.
Default: **10000** | +| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing. Default: `10000` | + +
+ +--- + + ### Example - - -{`// import only Documents from a given file +```csharp +// Import only Documents from a given file var importOperation = await store .Smuggler .ImportAsync( new DatabaseSmugglerImportOptions - \{ + { OperateOnTypes = DatabaseItemType.Documents - \}, - // import the .ravendbdump file that you exported (i.e. in the export example above) - @"C:\\ravendb-exports\\Northwind.ravendbdump", + }, + @"C:\ravendb-exports\Northwind.ravendbdump", token); await importOperation.WaitForCompletionAsync(); -`} - - +``` + +
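+
+The `Stream` overload documented above can be sketched as follows. This is an illustrative sketch: the file path is reused from the example above for convenience, but any readable stream containing exported data will work (requires `System.IO`).
+
+```csharp
+// Import from an open stream rather than a file path
+using (var stream = File.OpenRead(@"C:\ravendb-exports\Northwind.ravendbdump"))
+{
+    var importFromStreamOperation = await store
+        .Smuggler
+        .ImportAsync(
+            new DatabaseSmugglerImportOptions(),
+            stream,
+            token);
+
+    await importFromStreamOperation.WaitForCompletionAsync();
+}
+```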
-## TransformScript + -`TransformScript` exposes the ability to modify or even filter-out the document during the import and export process using the provided JavaScript. +A RavenDB backup folder contains one full backup file and any incremental backup files accumulated since the last full backup. +Use `ImportIncrementalAsync` to import all backup files from such a folder into an existing database in a single call, +instead of importing each file individually. -Underneath the JavaScript engine is exactly the same as used for [patching operations](../../../client-api/operations/patching/single-document.mdx) giving you identical syntax and capabilities with additional **ability to filter out documents by throwing a 'skip' exception**. +The method finds all backup files in the specified directory, orders them chronologically, and imports them in sequence. +Indexes and Subscriptions are excluded from all files except the last, ensuring only their most recent state is applied. - - -{`var id = this['@metadata']['@id']; -if (id === 'orders/999-A') - throw 'skip'; // filter-out + +`ImportIncrementalAsync` imports into an **existing** database. +To create a **new** database from backup files, use [`RestoreBackupOperation`](../../../backup/restore.mdx). + -this.Freight = 15.3; -`} - - + + +### Syntax + +```csharp +Task ImportIncrementalAsync( + DatabaseSmugglerImportOptions options, + string fromDirectory, + CancellationToken cancellationToken = default); +``` + +
+ +| Parameter | Type | Description | +| --- | --- | --- | +| **options** | `DatabaseSmugglerImportOptions` | Options applied to each file imported from the directory. `Tombstones` and `CompareExchangeTombstones` are added automatically. | +| **fromDirectory** | `string` | Path to the backup folder containing the backup files to import. | +| **cancellationToken** | `CancellationToken` | Token used to cancel the operation. | + +| Return value | | +| --- | --- | +| `Task` | Completes when all backup files in the folder have been imported. | + +
+ +--- + +### Example + +```csharp +// Import all backup files from a backup folder into an existing database +await store.Smuggler.ImportIncrementalAsync( + new DatabaseSmugglerImportOptions(), + @"C:\RavenDB\Backups\Northwind.2024-01-15T12-00-00"); +``` + + + +
+ + +`TransformScript` exposes the ability to modify or filter documents during the import and export process using JavaScript. + +The JavaScript engine is the same one used for [patching operations](../../../client-api/operations/patching/single-document.mdx), +with the same syntax and capabilities, plus the ability to filter out documents by throwing a `'skip'` exception. + +```javascript +var id = this['@metadata']['@id']; +if (id === 'orders/999-A') + throw 'skip'; // filter out this document + +this.Freight = 15.3; +``` + diff --git a/versioned_docs/version-7.1/client-api/smuggler/content/_what-is-smuggler-csharp.mdx b/versioned_docs/version-7.1/client-api/smuggler/content/_what-is-smuggler-csharp.mdx index 4bdf46a8fc..d8cdf830e2 100644 --- a/versioned_docs/version-7.1/client-api/smuggler/content/_what-is-smuggler-csharp.mdx +++ b/versioned_docs/version-7.1/client-api/smuggler/content/_what-is-smuggler-csharp.mdx @@ -2,34 +2,44 @@ import Admonition from '@theme/Admonition'; import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; import CodeBlock from '@theme/CodeBlock'; +import Panel from '@site/src/components/Panel'; +import ContentFrame from '@site/src/components/ContentFrame'; + + Smuggler gives you the ability to export or import data from or to a database using JSON format. It is exposed via the `DocumentStore.Smuggler` property. 
-## ForDatabase +* In this page: + * [ForDatabase](../../../client-api/smuggler/what-is-smuggler.mdx#fordatabase) + * [Export](../../../client-api/smuggler/what-is-smuggler.mdx#export) + * [DatabaseSmugglerExportOptions](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerexportoptions) + * [Import](../../../client-api/smuggler/what-is-smuggler.mdx#import) + * [DatabaseSmugglerImportOptions](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerimportoptions) + * [ImportIncrementalAsync](../../../client-api/smuggler/what-is-smuggler.mdx#importincrementalasync) + * [TransformScript](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript) -By default, the `DocumentStore.Smuggler` works on the default document store database from the `DocumentStore.Database` property. + -In order to switch it to a different database use the `.ForDatabase` method. + - - -{`var northwindSmuggler = store - .Smuggler - .ForDatabase("Northwind"); -`} - - +By default, `DocumentStore.Smuggler` works against the default database from the `DocumentStore.Database` property. +To target a different database, use `ForDatabase`: + +```csharp +var northwindSmuggler = store.Smuggler.ForDatabase("Northwind"); +``` + + -## Export + ### Syntax - - -{`Task ExportAsync( +```csharp +Task ExportAsync( DatabaseSmugglerExportOptions options, DatabaseSmuggler toDatabase, CancellationToken token = default(CancellationToken)); @@ -38,65 +48,77 @@ Task ExportAsync( DatabaseSmugglerExportOptions options, string toFile, CancellationToken token = default(CancellationToken)); -`} - - +``` + +
-| Parameters | | |
-| ------------- | ------------- | ----- |
+| Parameter | Type | Description |
+| --- | --- | --- |
 | **options** | `DatabaseSmugglerExportOptions` | Options that will be used during the export. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerexportoptions). |
-| **toDatabase** | `DatabaseSmuggler` | `DatabaseSmuggler` instance used as a destination |
-| **toFile** | `string` | Path to a file where exported data will be written |
-| **token** | `CancellationToken` | Token used to cancel the operation |
+| **toDatabase** | `DatabaseSmuggler` | `DatabaseSmuggler` instance used as a destination. |
+| **toFile** | `string` | Path to a file where exported data will be written. |
+| **token** | `CancellationToken` | Token used to cancel the operation. |
+
+| Return value | |
+| --- | --- |
+| `Task` | A task that resolves to an `Operation` instance, which lets you wait for the operation to complete and subscribe to its progress events. |
-| Return Value | |
-| ------------- | ----- |
-| `Operation` | Instance of Operation class which gives you an ability to wait for the operation to complete and subscribe to operation progress events |
+
+ +--- + + ### DatabaseSmugglerExportOptions -| Parameters | | | -| ------------- | ------------- | ----- | -| **Collections** | `List` | List of specific collections to export. If empty, then all collections will be exported.
Default: `empty` | -| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be exported.
Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | -| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be exported from database record.
Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | -| **IncludeExpired** | `bool` | Should expired documents be exported.
Default: `true` | -| **IncludeArtificial** | `bool` | Should artificial documents be exported.
Default: `false` | -| **IncludeArchived** | `bool` | Should archived documents be exported.
Default: `true` | -| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes.
Default: `false` | +| Property | Type | Description | +| --- | --- | --- | +| **Collections** | `List` | List of specific collections to export. If empty, then all collections will be exported. Default: `empty` | +| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be exported. Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | +| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be exported from database record. Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | +| **IncludeExpired** | `bool` | Should expired documents be exported. Default: `true` | +| **IncludeArtificial** | `bool` | Should artificial documents be exported. Default: `false` | +| **IncludeArchived** | `bool` | Should archived documents be exported. Default: `true` | +| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes. Default: `false` | | **TransformScript** | `string` | JavaScript-based script applied to every exported document. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript). | -| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing.
Default: **10000** | +| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing. Default: `10000` | + +
+ +--- + + ### Example - - -{`// export only Indexes and Documents to a given file +```csharp +// Export only Indexes and Documents to a given file var exportOperation = await store .Smuggler .ExportAsync( new DatabaseSmugglerExportOptions - \{ + { OperateOnTypes = DatabaseItemType.Indexes | DatabaseItemType.Documents - \}, - @"C:\\ravendb-exports\\Northwind.ravendbdump", + }, + @"C:\ravendb-exports\Northwind.ravendbdump", token); await exportOperation.WaitForCompletionAsync(); -`} - - +``` + +
-## Import + + + ### Syntax - - -{`Task ImportAsync( +```csharp +Task ImportAsync( DatabaseSmugglerImportOptions options, Stream stream, CancellationToken token = default(CancellationToken)); @@ -105,75 +127,137 @@ Task ImportAsync( DatabaseSmugglerImportOptions options, string fromFile, CancellationToken token = default(CancellationToken)); -`} - - +``` + +
-| Parameters | | |
-| ------------- | ------------- | ----- |
+| Parameter | Type | Description |
+| --- | --- | --- |
 | **options** | `DatabaseSmugglerImportOptions` | Options that will be used during the import. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#databasesmugglerimportoptions). |
-| **stream** | `Stream` | Stream with data to import |
-| **fromFile** | `string` | Path to a file from which data will be imported |
-| **token** | `CancellationToken` | Token used to cancel the operation |
+| **stream** | `Stream` | Stream with data to import. |
+| **fromFile** | `string` | Path to a file from which data will be imported. |
+| **token** | `CancellationToken` | Token used to cancel the operation. |
+
+| Return value | |
+| --- | --- |
+| `Task` | A task that resolves to an `Operation` instance, which lets you wait for the operation to complete and subscribe to its progress events. |
+
+
-| Return Value | | -| ------------- | ----- | -| `Operation` | Instance of Operation-class which gives you an ability to wait for the operation to complete and subscribe to operation progress events | +--- + + ### DatabaseSmugglerImportOptions -| Parameters | | | -| - | - | - | -| **Collections** | `List` | List of specific collections to import. If empty, then all collections will be imported.
Default: `empty` | -| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be imported.
Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | -| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be imported from database record.
Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | -| **IncludeExpired** | `bool` | Should expired documents be imported.
Default: `true` | -| **IncludeArtificial** | `bool` | Should artificial documents be imported.
Default: `false` | -| **IncludeArchived** | `bool` | Should archived documents be imported.
Default: `true` | -| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes.
Default: `false` | +| Property | Type | Description | +| --- | --- | --- | +| **Collections** | `List` | List of specific collections to import. If empty, then all collections will be imported. Default: `empty` | +| **OperateOnTypes** | `DatabaseItemType` | Indicates what should be imported. Default: `Indexes`, `Documents`, `RevisionDocuments`, `Conflicts`, `DatabaseRecord`, `ReplicationHubCertificates`, `Identities`, `CompareExchange`, `Attachments`, `CounterGroups`, `Subscriptions`, `TimeSeries` | +| **OperateOnDatabaseRecordTypes** | `DatabaseRecordItemType` | Indicates what should be imported from database record. Default: `Client`, `ConflictSolverConfig`, `Expiration`, `ExternalReplications`, `PeriodicBackups`, `RavenConnectionStrings`, `RavenEtls`, `Revisions`, `Settings`, `SqlConnectionStrings`, `Sorters`, `SqlEtls`, `HubPullReplications`, `SinkPullReplications`, `TimeSeries`, `DocumentsCompression`, `Analyzers`, `LockMode`, `OlapConnectionStrings`, `OlapEtls`, `ElasticSearchConnectionStrings`, `ElasticSearchEtls`, `PostgreSQLIntegration`, `QueueConnectionStrings`, `QueueEtls`, `IndexesHistory`, `Refresh`, `DataArchival` | +| **IncludeExpired** | `bool` | Should expired documents be imported. Default: `true` | +| **IncludeArtificial** | `bool` | Should artificial documents be imported. Default: `false` | +| **IncludeArchived** | `bool` | Should archived documents be imported. Default: `true` | +| **RemoveAnalyzers** | `bool` | Should analyzers be removed from Indexes. Default: `false` | | **TransformScript** | `string` | JavaScript-based script applied to every imported document. Read more [here](../../../client-api/smuggler/what-is-smuggler.mdx#transformscript). | -| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing.
Default: **10000** | +| **MaxStepsForTransformScript** | `int` | Maximum number of steps that the transform script can process before failing. Default: `10000` | + +
+ +--- + + ### Example - - -{`// import only Documents from a given file +```csharp +// Import only Documents from a given file var importOperation = await store .Smuggler .ImportAsync( new DatabaseSmugglerImportOptions - \{ + { OperateOnTypes = DatabaseItemType.Documents - \}, - // import the .ravendbdump file that you exported (i.e. in the export example above) - @"C:\\ravendb-exports\\Northwind.ravendbdump", + }, + @"C:\ravendb-exports\Northwind.ravendbdump", token); await importOperation.WaitForCompletionAsync(); -`} - - +``` + +
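The `TransformScript` listed in both options tables runs once per document and drops a document when the script throws `'skip'`. As a rough local sketch of that contract (a hypothetical harness in plain JavaScript — not the RavenDB engine, shown only to illustrate the skip semantics):

```javascript
// Hypothetical stand-in for the smuggler's per-document script invocation:
// the script runs with the document bound to `this`; throwing 'skip'
// removes the document, any other throw is a real error.
function applyTransform(docs, script) {
  const kept = [];
  for (const doc of docs) {
    try {
      script.call(doc); // document is `this` inside the script
      kept.push(doc);
    } catch (e) {
      if (e !== 'skip') throw e; // only 'skip' filters the document out
    }
  }
  return kept;
}

const docs = [
  { '@metadata': { '@id': 'orders/999-A' }, Freight: 0 },
  { '@metadata': { '@id': 'orders/1-A' }, Freight: 0 },
];

const kept = applyTransform(docs, function () {
  if (this['@metadata']['@id'] === 'orders/999-A')
    throw 'skip';
  this.Freight = 15.3;
});
// kept: only orders/1-A remains, with Freight set to 15.3
```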
-## TransformScript + -`TransformScript` exposes the ability to modify or even filter-out the document during the import and export process using the provided JavaScript. +A RavenDB backup folder contains one full backup file and any incremental backup files accumulated since the last full backup. +Use `ImportIncrementalAsync` to import all backup files from such a folder into an existing database in a single call, +instead of importing each file individually. -Underneath the JavaScript engine is exactly the same as used for [patching operations](../../../client-api/operations/patching/single-document.mdx) giving you identical syntax and capabilities with additional **ability to filter out documents by throwing a 'skip' exception**. +The method finds all backup files in the specified directory, orders them chronologically, and imports them in sequence. +Indexes and Subscriptions are excluded from all files except the last, ensuring only their most recent state is applied. - - -{`var id = this['@metadata']['@id']; -if (id === 'orders/999-A') - throw 'skip'; // filter-out + +`ImportIncrementalAsync` imports into an **existing** database. +To create a **new** database from backup files, use [`RestoreBackupOperation`](../../../backup/restore.mdx). + -this.Freight = 15.3; -`} - - + + +### Syntax + +```csharp +Task ImportIncrementalAsync( + DatabaseSmugglerImportOptions options, + string fromDirectory, + CancellationToken cancellationToken = default); +``` + +
+ +| Parameter | Type | Description | +| --- | --- | --- | +| **options** | `DatabaseSmugglerImportOptions` | Options applied to each file imported from the directory. `Tombstones` and `CompareExchangeTombstones` are added automatically. | +| **fromDirectory** | `string` | Path to the backup folder containing the backup files to import. | +| **cancellationToken** | `CancellationToken` | Token used to cancel the operation. | + +| Return value | | +| --- | --- | +| `Task` | Completes when all backup files in the folder have been imported. | + +
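To make the sequencing described above concrete, here is a rough sketch (plain JavaScript; the helper and file names are hypothetical, not the client's implementation) of how the folder's files are ordered and which items each file contributes:

```javascript
// Hypothetical sketch of the incremental import plan described above.
function planIncrementalImport(fileNames) {
  // Backup file names embed their creation time, so a lexicographic
  // sort yields chronological order.
  const ordered = [...fileNames].sort();
  return ordered.map((file, i) => ({
    file,
    // Indexes and Subscriptions are taken only from the last file,
    // so only their most recent state is applied.
    includeIndexes: i === ordered.length - 1,
    includeSubscriptions: i === ordered.length - 1,
  }));
}

const plan = planIncrementalImport([
  '2024-01-15-02-00.ravendb-incremental-backup',
  '2024-01-15-00-00.ravendb-full-backup',
]);
// plan imports the full backup first, then the incremental;
// only the last file contributes Indexes and Subscriptions
```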
+ +--- + +### Example + +```csharp +// Import all backup files from a backup folder into an existing database +await store.Smuggler.ImportIncrementalAsync( + new DatabaseSmugglerImportOptions(), + @"C:\RavenDB\Backups\Northwind.2024-01-15T12-00-00"); +``` + + + +
+ + +`TransformScript` exposes the ability to modify or filter documents during the import and export process using JavaScript. + +The JavaScript engine is the same one used for [patching operations](../../../client-api/operations/patching/single-document.mdx), +with the same syntax and capabilities, plus the ability to filter out documents by throwing a `'skip'` exception. + +```javascript +var id = this['@metadata']['@id']; +if (id === 'orders/999-A') + throw 'skip'; // filter out this document + +this.Freight = 15.3; +``` + From 08eb22eb42eeb1def397c188bc92bb354200f8b3 Mon Sep 17 00:00:00 2001 From: reeb Date: Sun, 10 May 2026 15:19:20 +0300 Subject: [PATCH 4/4] RDoc-3344 - update download link for Visual C++ Redistributable Package, remove stale image --- docs/start/getting-started.mdx | 8 ++++++-- .../start/installation/setup-examples/aws-windows-vm.mdx | 5 +++-- versioned_docs/version-7.0/start/getting-started.mdx | 9 ++++++--- .../start/installation/setup-examples/aws-windows-vm.mdx | 5 +++-- versioned_docs/version-7.1/start/getting-started.mdx | 8 ++++++-- .../start/installation/setup-examples/aws-windows-vm.mdx | 5 +++-- 6 files changed, 27 insertions(+), 13 deletions(-) diff --git a/docs/start/getting-started.mdx b/docs/start/getting-started.mdx index 7dbf350fd8..b6a486164f 100644 --- a/docs/start/getting-started.mdx +++ b/docs/start/getting-started.mdx @@ -53,10 +53,14 @@ RavenDB is written in `.NET` so it requires the same set of prerequisites as `.N -Please install the [Visual C++ Redistributable Package](https://learn.microsoft.com/en-us/cpp/windows/latest-supported-vc-redist) before launching the RavenDB server. +Please install the **Microsoft Visual C++ Redistributable Package 2019 or later** before launching the RavenDB server on Windows. + +Download the latest supported Microsoft Visual C++ v14 Redistributable package for your Windows architecture: +[x64](https://aka.ms/vc14/vc_redist.x64.exe) | [x86](https://aka.ms/vc14/vc_redist.x86.exe). 
This package should be the sole requirement for the 'Windows' platforms. + If you're experiencing difficulties, please check the -prerequisites for .NET on Windows in this [Microsoft article](https://learn.microsoft.com/en-us/dotnet/core/install/windows?tabs=net70#dependencies). +prerequisites for .NET on Windows in this [Microsoft article](https://learn.microsoft.com/en-us/dotnet/core/install/windows?tabs=net70#dependencies). diff --git a/docs/start/installation/setup-examples/aws-windows-vm.mdx b/docs/start/installation/setup-examples/aws-windows-vm.mdx index e7fe6fe0c0..5048ccba19 100644 --- a/docs/start/installation/setup-examples/aws-windows-vm.mdx +++ b/docs/start/installation/setup-examples/aws-windows-vm.mdx @@ -95,9 +95,10 @@ Dowload Chrome. You will need to allow it in the Internet Explorer firewall. ![20](./assets/20.png) ![21](./assets/21.png) -Install the [Visual C++ Redistributable Package](https://learn.microsoft.com/en-us/cpp/windows/latest-supported-vc-redist). +--- -![19](./assets/19.png) +Install the **Microsoft Visual C++ Redistributable Package 2019 or later**. +For the Windows Server x64 VM used in this walkthrough, use the [x64 installer](https://aka.ms/vc14/vc_redist.x64.exe). If you are running a 32-bit Windows installation, use the [x86 installer](https://aka.ms/vc14/vc_redist.x86.exe). ## Run the RavenDB Setup Wizard diff --git a/versioned_docs/version-7.0/start/getting-started.mdx b/versioned_docs/version-7.0/start/getting-started.mdx index 3ddb3fdbf7..7cc2a6a306 100644 --- a/versioned_docs/version-7.0/start/getting-started.mdx +++ b/versioned_docs/version-7.0/start/getting-started.mdx @@ -52,11 +52,14 @@ RavenDB is written in `.NET` so it requires the same set of prerequisites as `.N -Please install [Visual C++ 2015 Redistributable Package](https://support.microsoft.com/en-us/help/2977003/the-latest-supported-visual-c-downloads) -(or newer) before launching the RavenDB server. 
+Please install the **Microsoft Visual C++ Redistributable Package 2019 or later** before launching the RavenDB server on Windows. + +Download the latest supported Microsoft Visual C++ v14 Redistributable package for your Windows architecture: +[x64](https://aka.ms/vc14/vc_redist.x64.exe) | [x86](https://aka.ms/vc14/vc_redist.x86.exe). This package should be the sole requirement for the 'Windows' platforms. + If you're experiencing difficulties, please check the -prerequisites for .NET on Windows in this [Microsoft article](https://learn.microsoft.com/en-us/dotnet/core/install/windows?tabs=net70#dependencies). +prerequisites for .NET on Windows in this [Microsoft article](https://learn.microsoft.com/en-us/dotnet/core/install/windows?tabs=net70#dependencies). diff --git a/versioned_docs/version-7.0/start/installation/setup-examples/aws-windows-vm.mdx b/versioned_docs/version-7.0/start/installation/setup-examples/aws-windows-vm.mdx index 897924c7fd..665fbdbabf 100644 --- a/versioned_docs/version-7.0/start/installation/setup-examples/aws-windows-vm.mdx +++ b/versioned_docs/version-7.0/start/installation/setup-examples/aws-windows-vm.mdx @@ -94,9 +94,10 @@ Dowload Chrome. You will need to allow it in the Internet Explorer firewall. ![20](./assets/20.png) ![21](./assets/21.png) -Install the [Visual C++ 2015 Redistributable Package](https://support.microsoft.com/en-us/help/2977003/the-latest-supported-visual-c-downloads) (or newer). +--- -![19](./assets/19.png) +Install the **Microsoft Visual C++ Redistributable Package 2019 or later**. +For the Windows Server x64 VM used in this walkthrough, use the [x64 installer](https://aka.ms/vc14/vc_redist.x64.exe). If you are running a 32-bit Windows installation, use the [x86 installer](https://aka.ms/vc14/vc_redist.x86.exe). 
## Run the RavenDB Setup Wizard diff --git a/versioned_docs/version-7.1/start/getting-started.mdx b/versioned_docs/version-7.1/start/getting-started.mdx index c817ed298a..7cc2a6a306 100644 --- a/versioned_docs/version-7.1/start/getting-started.mdx +++ b/versioned_docs/version-7.1/start/getting-started.mdx @@ -52,10 +52,14 @@ RavenDB is written in `.NET` so it requires the same set of prerequisites as `.N -Please install the [Visual C++ Redistributable Package](https://learn.microsoft.com/en-us/cpp/windows/latest-supported-vc-redist) before launching the RavenDB server. +Please install the **Microsoft Visual C++ Redistributable Package 2019 or later** before launching the RavenDB server on Windows. + +Download the latest supported Microsoft Visual C++ v14 Redistributable package for your Windows architecture: +[x64](https://aka.ms/vc14/vc_redist.x64.exe) | [x86](https://aka.ms/vc14/vc_redist.x86.exe). This package should be the sole requirement for the 'Windows' platforms. + If you're experiencing difficulties, please check the -prerequisites for .NET on Windows in this [Microsoft article](https://learn.microsoft.com/en-us/dotnet/core/install/windows?tabs=net70#dependencies). +prerequisites for .NET on Windows in this [Microsoft article](https://learn.microsoft.com/en-us/dotnet/core/install/windows?tabs=net70#dependencies). diff --git a/versioned_docs/version-7.1/start/installation/setup-examples/aws-windows-vm.mdx b/versioned_docs/version-7.1/start/installation/setup-examples/aws-windows-vm.mdx index 57e06d8969..665fbdbabf 100644 --- a/versioned_docs/version-7.1/start/installation/setup-examples/aws-windows-vm.mdx +++ b/versioned_docs/version-7.1/start/installation/setup-examples/aws-windows-vm.mdx @@ -94,9 +94,10 @@ Dowload Chrome. You will need to allow it in the Internet Explorer firewall. ![20](./assets/20.png) ![21](./assets/21.png) -Install the [Visual C++ Redistributable Package](https://learn.microsoft.com/en-us/cpp/windows/latest-supported-vc-redist). 
+--- -![19](./assets/19.png) +Install the **Microsoft Visual C++ Redistributable Package 2019 or later**. +For the Windows Server x64 VM used in this walkthrough, use the [x64 installer](https://aka.ms/vc14/vc_redist.x64.exe). If you are running a 32-bit Windows installation, use the [x86 installer](https://aka.ms/vc14/vc_redist.x86.exe). ## Run the RavenDB Setup Wizard