How to copy CosmosDb docs to Blob storage (each doc in single json file) with Azure Data Factory

Asked by 清歌不尽 on 2021-01-21 05:12 · 4 answers · 1490 views

I'm trying to back up my Cosmos DB storage using Azure Data Factory (v2). In general, it's doing its job, but I want each doc in the Cosmos collection to correspond to a new JSON file.

4 Answers
  •  情深已故
    2021-01-21 05:42

    Since your Cosmos DB documents contain arrays, and ADF doesn't support serializing arrays for Cosmos DB, this is the workaround I can provide.

    First, export all your documents to JSON files using "export JSON as-is" (to Blob, ADLS, or a file system; any file storage will do). I think you already know how to do this. This way, each collection ends up in a single JSON file.

    Second, process each JSON file, extracting each row in the file into its own file.

    I only provide the pipeline for step 2. You could use an Execute Pipeline activity to chain steps 1 and 2, and you could even handle all the collections in step 2 with a ForEach activity.

    Pipeline JSON:

    {
    "name": "pipeline27",
    "properties": {
        "activities": [
            {
                "name": "Lookup1",
                "type": "Lookup",
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false
                },
                "typeProperties": {
                    "source": {
                        "type": "BlobSource",
                        "recursive": true
                    },
                    "dataset": {
                        "referenceName": "AzureBlob7",
                        "type": "DatasetReference"
                    },
                    "firstRowOnly": false
                }
            },
            {
                "name": "ForEach1",
                "type": "ForEach",
                "dependsOn": [
                    {
                        "activity": "Lookup1",
                        "dependencyConditions": [
                            "Succeeded"
                        ]
                    }
                ],
                "typeProperties": {
                    "items": {
                        "value": "@activity('Lookup1').output.value",
                        "type": "Expression"
                    },
                    "activities": [
                        {
                            "name": "Copy1",
                            "type": "Copy",
                            "policy": {
                                "timeout": "7.00:00:00",
                                "retry": 0,
                                "retryIntervalInSeconds": 30,
                                "secureOutput": false
                            },
                            "typeProperties": {
                                "source": {
                                    "type": "DocumentDbCollectionSource",
                                    "query": {
                                        "value": "select @{item()}",
                                        "type": "Expression"
                                    },
                                    "nestingSeparator": "."
                                },
                                "sink": {
                                    "type": "BlobSink"
                                },
                                "enableStaging": false,
                                "cloudDataMovementUnits": 0
                            },
                            "inputs": [
                                {
                                    "referenceName": "DocumentDbCollection1",
                                    "type": "DatasetReference"
                                }
                            ],
                            "outputs": [
                                {
                                    "referenceName": "AzureBlob6",
                                    "type": "DatasetReference",
                                    "parameters": {
                                        "id": {
                                            "value": "@item().id",
                                            "type": "Expression"
                                        },
                                        "PartitionKey": {
                                            "value": "@item().PartitionKey",
                                            "type": "Expression"
                                        }
                                    }
                                }
                            ]
                        }
                    ]
                }
            }
        ]
    },
    "type": "Microsoft.DataFactory/factories/pipelines"
    }

    Dataset JSON for the lookup:

       {
    "name": "AzureBlob7",
    "properties": {
        "linkedServiceName": {
            "referenceName": "bloblinkedservice",
            "type": "LinkedServiceReference"
        },
        "type": "AzureBlob",
        "typeProperties": {
            "format": {
                "type": "JsonFormat",
                "filePattern": "arrayOfObjects"
            },
            "fileName": "cosmos.json",
            "folderPath": "aaa"
        }
    },
    "type": "Microsoft.DataFactory/factories/datasets"
    }

    Source dataset for the Copy activity. Actually, this dataset isn't really used; it only exists to host the query (select @{item()}).

    {
    "name": "DocumentDbCollection1",
    "properties": {
        "linkedServiceName": {
            "referenceName": "CosmosDB-r8c",
            "type": "LinkedServiceReference"
        },
        "type": "DocumentDbCollection",
        "typeProperties": {
            "collectionName": "test"
        }
    },
    "type": "Microsoft.DataFactory/factories/datasets"
    }
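To see why hosting the query in this dataset works, note that `@{item()}` string-interpolates the entire Lookup item into the query text, so the Copy source effectively runs a constant projection of that document. A rough illustration of the interpolation (plain Python, not ADF; the document fields are hypothetical, and the exact whitespace ADF produces may differ):

```python
import json

# One row from the Lookup activity's output array (hypothetical document).
item = {"id": "1", "PartitionKey": "pk1", "tags": ["a", "b"]}

# @{item()} stringifies the object into the query, so the Copy source
# effectively executes a constant object projection like this:
query = "select " + json.dumps(item)
print(query)  # select {"id": "1", "PartitionKey": "pk1", "tags": ["a", "b"]}
```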

    Destination dataset. With its two parameters, it also addresses your file-name requirement.

    {
    "name": "AzureBlob6",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureStorage-eastus",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "id": {
                "type": "String"
            },
            "PartitionKey": {
                "type": "String"
            }
        },
        "type": "AzureBlob",
        "typeProperties": {
            "format": {
                "type": "JsonFormat",
                "filePattern": "setOfObjects"
            },
            "fileName": {
                "value": "@{dataset().PartitionKey}-@{dataset().id}.json",
                "type": "Expression"
            },
            "folderPath": "aaacosmos"
        }
    },
    "type": "Microsoft.DataFactory/factories/datasets"
    }
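Taken together, the Lookup → ForEach → parameterized sink is a per-document fan-out: every item becomes its own `<PartitionKey>-<id>.json` blob. As a hedged local equivalent (plain Python, not ADF; file and field names are assumptions matching the sink expression above), the same split could be sketched as:

```python
import json
from pathlib import Path

def split_export(export_file: str, out_dir: str) -> list[str]:
    """Write each document in an array-of-objects export to its own
    <PartitionKey>-<id>.json file, mirroring the sink's fileName expression."""
    docs = json.loads(Path(export_file).read_text())
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    for doc in docs:
        # Same naming scheme as @{dataset().PartitionKey}-@{dataset().id}.json
        name = f"{doc['PartitionKey']}-{doc['id']}.json"
        (out / name).write_text(json.dumps(doc, indent=2))
        written.append(name)
    return written
```

This assumes every exported document carries `id` and `PartitionKey` properties, just as the pipeline's dataset parameters do.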

    Please also note the limitations of the Lookup activity: only certain data sources are supported for lookup, the activity can return at most 5000 rows and up to 2 MB of data, and its maximum duration before timeout is currently one hour.
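If a collection exports more than 5000 documents, one workaround (my own suggestion, not part of the original answer) is to pre-split the exported file into chunks that stay under the Lookup limits and run the pipeline once per chunk. A minimal sketch:

```python
import json
from pathlib import Path

def chunk_export(export_file: str, out_dir: str, max_rows: int = 5000) -> list[str]:
    """Split an array-of-objects export into files of at most max_rows items,
    so each file stays within the Lookup activity's 5000-row cap."""
    docs = json.loads(Path(export_file).read_text())
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    chunk_files = []
    for i in range(0, len(docs), max_rows):
        path = out / f"cosmos-chunk-{i // max_rows}.json"
        path.write_text(json.dumps(docs[i:i + max_rows]))
        chunk_files.append(path.name)
    return chunk_files
```

Each chunk file can then serve as the Lookup dataset's fileName, e.g. via a pipeline parameter.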
