Using pyspark to connect to PostgreSQL

逝去的感伤 2020-12-01 04:50

I am trying to connect to a PostgreSQL database with PySpark, using the following code:

sqlctx = SQLContext(sc)
df = sqlctx.load(
    url = "jdbc:postgresql
10 Answers
  •  忘掉有多难
    2020-12-01 05:26

    Download the PostgreSQL JDBC Driver from https://jdbc.postgresql.org/download.html

    Then replace the database configuration values with your own.

    from pyspark.sql import SparkSession
    
    spark = SparkSession \
        .builder \
        .appName("Python Spark SQL basic example") \
        .config("spark.jars", "/path_to_postgresDriver/postgresql-42.2.5.jar") \
        .getOrCreate()
    
    df = spark.read \
        .format("jdbc") \
        .option("url", "jdbc:postgresql://localhost:5432/databasename") \
        .option("dbtable", "tablename") \
        .option("user", "username") \
        .option("password", "password") \
        .option("driver", "org.postgresql.Driver") \
        .load()
    
    df.printSchema()
    

    More info: https://spark.apache.org/docs/latest/sql-data-sources-jdbc.html
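
    As an alternative to downloading the jar and pointing `spark.jars` at a local path, Spark can fetch the driver from Maven Central at launch time via the `--packages` flag. This is a minimal sketch; the version coordinate matches the jar used above, and `my_script.py` is a hypothetical script name.

    ```shell
    # Let Spark resolve the PostgreSQL JDBC driver from Maven Central
    # (same version as the postgresql-42.2.5.jar used in the answer).
    pyspark --packages org.postgresql:postgresql:42.2.5

    # For a standalone job, the same flag works with spark-submit
    # (my_script.py is a placeholder for your own script):
    spark-submit --packages org.postgresql:postgresql:42.2.5 my_script.py
    ```

    With `--packages`, the driver jar is downloaded and placed on both the driver and executor classpaths automatically, so no manual download step is needed.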
