Imported Azure SQL Database always sizes at 250GB

走远了吗. Submitted on 2021-01-07 00:57:28

Question


I have the issue that a newly imported SQL DB (bacpac import, S3 Standard) always ends up with a max size of 250 GB - just as if the parameter 'DatabaseMaxSizeBytes' were omitted. Here is the code:

$sdvobjDbImport = New-AzureRmSqlDatabaseImport `
                    -ResourceGroupName $sdvstrSqlsResourceGroupName `
                    -ServerName $sdvstrSqlsName `
                    -AdministratorLogin $sdvstrAdminLI `
                    -AdministratorLoginPassword $sdvsstrAdminPW `
                    -DatabaseName $sdvstrDatabaseName `
                    -Edition $sdvstrDbEditionAtImport `
                    -ServiceObjectiveName $sdvstrServiceObjectiveAtImport `
                    -DatabaseMaxSizeBytes 262144000 `
                    -StorageKey $sdvstrStorageKey `
                    -StorageKeyType 'StorageAccessKey' `
                    -StorageUri $sdvstrStorageUri `
                    -EA Stop

It should be 250 MB, not 250 GB. I don't need such a monster, and scaling down afterwards (from 250 GB to 250 MB) causes problems with long operation times on the DB. Any idea what is wrong in my code? Google does not turn up an answer either.


Answer 1:


I tested and have the same problem: the parameter -DatabaseMaxSizeBytes doesn't work. No matter which value we pass, the database is always created with a max size of 250 GB (DTU Standard S2).

The Azure documentation uses -DatabaseMaxSizeBytes 5000000; I tested that value and it doesn't work either.

Solution:

After the import completes, we must set the database size manually. Here's a sample command:

# 104857600 bytes = 100 MB
Set-AzSqlDatabase -DatabaseName "TestDB" -ServerName "sqlservername" -ResourceGroupName "resourcegroup" -MaxSizeBytes "104857600"

Note:

The value of -MaxSizeBytes must be one of: 100MB, 500MB, 1GB, 2GB, 5GB, 10GB, 20GB, 30GB, 40GB, 50GB, 100GB, 150GB, 200GB, 250GB.
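These values are plain byte counts, and PowerShell's built-in binary size suffixes (KB, MB, GB) expand to the exact numbers at parse time, so you don't have to type them out by hand. A small sketch (the cmdlet call is illustrative; substitute your own database, server, and resource group):

```powershell
# PowerShell size literals expand to byte counts at parse time:
#   100MB -> 104857600
#   250MB -> 262144000
#   250GB -> 268435456000
$maxSize = 250MB

# Illustrative call; replace the names with your own resources.
Set-AzSqlDatabase -DatabaseName "TestDB" `
                  -ServerName "sqlservername" `
                  -ResourceGroupName "resourcegroup" `
                  -MaxSizeBytes $maxSize
```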

You can also look up the valid values for your service tier in the Azure portal.

I am using the new Azure PowerShell Az module.

Hope this helps.




Answer 2:


Az is replacing AzureRM, so this bug probably won't be fixed in the old module. The solution is to use New-AzSqlDatabaseImport instead of New-AzureRmSqlDatabaseImport.

Here is an example of how to use it.

$importRequest = New-AzSqlDatabaseImport `
   -ResourceGroupName "<your_resource_group>" `
   -ServerName "<your_server>" `
   -DatabaseName "<your_database>" `
   -DatabaseMaxSizeBytes "<database_size_in_bytes>" `
   -StorageKeyType "StorageAccessKey" `
   -StorageKey $(Get-AzStorageAccountKey -ResourceGroupName "<your_resource_group>" -StorageAccountName "<your_storage_account>").Value[0] `
   -StorageUri "https://myStorageAccount.blob.core.windows.net/importsample/sample.bacpac" `
   -Edition "Standard" `
   -ServiceObjectiveName "S3" `
   -AdministratorLogin "<your_server_admin_account_user_id>" `
   -AdministratorLoginPassword $(ConvertTo-SecureString -String "<your_server_admin_account_password>" -AsPlainText -Force)
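The import runs asynchronously, and the returned request object can be polled for completion before resizing or using the database. A minimal sketch, assuming the $importRequest variable from the snippet above:

```powershell
# Poll the asynchronous import until it leaves the "InProgress" state.
$status = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
while ($status.Status -eq "InProgress") {
    Start-Sleep -Seconds 10
    $status = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $importRequest.OperationStatusLink
}
$status.StatusMessage
```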

As you can see in the official documentation, Microsoft favors the use of New-AzSqlDatabaseImport.



Source: https://stackoverflow.com/questions/58712699/imported-azure-sql-database-always-sizes-at-250gb
