azure-databricks

Create external table in Azure Databricks

Submitted by 萝らか妹 on 2019-12-06 04:46:05
I am new to Azure Databricks and am trying to create an external table pointing to an Azure Data Lake Storage (ADLS) Gen-2 location. From a Databricks notebook I have tried to set the Spark configuration for ADLS access, but I am still unable to execute the DDL I created. Note: one solution that works for me is mounting the ADLS account to the cluster and then using the mount location in the external table's DDL, but I need to check whether it is possible to create an external table DDL with the ADLS path directly, without a mount location. # Using Principal credentials spark.conf.set("dfs.azure.account.auth.type", "OAuth") spark.conf
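One commonly documented pattern (a sketch, not the asker's exact setup) is to set the ABFS OAuth keys on the notebook session and then issue the DDL against an abfss:// path. The key names below follow the Azure ADLS Gen-2 driver documentation; the service-principal values, container, storage account, database and table names are placeholders.

    # Assumed key names from the ABFS driver docs; all <...> values are placeholders.
    # The account-qualified variant "fs.azure.account.auth.type.<account>.dfs.core.windows.net"
    # is also documented and scopes the setting to a single storage account.
    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": "<app-id>",
        "fs.azure.account.oauth2.client.secret": "<secret>",
        "fs.azure.account.oauth2.client.endpoint":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }
    for key, value in configs.items():
        spark.conf.set(key, value)   # spark is predefined in a Databricks notebook

    # External-table DDL pointing straight at the ADLS Gen-2 path, no mount involved.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS my_db.my_external_table (id INT, name STRING)
        USING PARQUET
        LOCATION 'abfss://<container>@<account>.dfs.core.windows.net/path/to/data'
    """)

If the session-level configuration is not picked up by the DDL, another option worth checking is putting the same keys into the cluster's Spark config, which removes the dependency on the notebook session.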

Spark 2.4.0 - unable to parse ISO8601 string into TimestampType preserving ms

Submitted by 一个人想着一个人 on 2019-12-06 00:52:27
When trying to convert ISO8601 strings with time zone information into a TimestampType using cast(TimestampType), only strings using the time zone format +01:00 are accepted. If the time zone is written in the equally legal ISO8601 form +0100 (without the colon), the parse fails and returns null. I need to convert the string to a TimestampType while preserving the ms part. 2019-02-05T14:06:31.556+0100 returns null. 2019-02-05T14:06:31.556+01:00 returns a correctly parsed TimestampType. I have tried to
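One workaround (a minimal sketch, assuming PySpark 2.4 and that the offending offset always sits at the end of the string) is to rewrite +0100 into the cast-friendly +01:00 form with a regex before casting; the plain string-to-timestamp cast keeps the fractional seconds.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, regexp_replace

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([("2019-02-05T14:06:31.556+0100",)], ["ts_string"])

    # Insert the colon the cast expects (+0100 -> +01:00), then cast to timestamp;
    # the default string-to-timestamp cast preserves the millisecond part.
    parsed = df.withColumn(
        "ts",
        regexp_replace(col("ts_string"), r"([+-]\d{2})(\d{2})$", "$1:$2").cast("timestamp"),
    )
    parsed.show(truncate=False)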

Generate Azure Databricks token using PowerShell script

Submitted by 牧云@^-^@ on 2019-12-04 11:21:57
I need to generate an Azure Databricks token using a PowerShell script. I have finished creating the Azure Databricks workspace from an ARM template, and now I am looking to generate a Databricks token with PowerShell. Kindly let me know how to create a Databricks token using a PowerShell script. The only way to generate a new token is via the API, which requires you to have a token in the first place, or to use the web UI manually. There are no official PowerShell commands for Databricks; there are some unofficial ones, but they still require you to generate a token manually first. https://github.com/DataThirstLtd
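Since the route is the REST API, a minimal sketch of the token-create call might look like the following (the same POST can be issued from PowerShell with Invoke-RestMethod); the workspace URL and the existing credential are placeholders, and it assumes you already hold a personal access token or Azure AD token to authenticate the request.

    import requests

    DATABRICKS_HOST = "https://<region>.azuredatabricks.net"   # placeholder workspace URL
    EXISTING_TOKEN = "<existing-pat-or-aad-token>"             # placeholder credential

    # POST /api/2.0/token/create returns the new token in the "token_value" field.
    response = requests.post(
        f"{DATABRICKS_HOST}/api/2.0/token/create",
        headers={"Authorization": f"Bearer {EXISTING_TOKEN}"},
        json={"lifetime_seconds": 3600, "comment": "created from automation"},
    )
    response.raise_for_status()
    print(response.json()["token_value"])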