How to work efficiently with SBT, Spark and “provided” dependencies?

野性不改 · 2021-01-31 02:47

I'm building an Apache Spark application in Scala and I'm using SBT to build it. Here is the thing:

  1. when I'm developing under IntelliJ IDEA, I want the Spark dependencies to be included in the classpath (I'm launching a regular app with a main class);
  2. when I package the application (with sbt-assembly), I want the Spark dependencies to be marked "provided" so they stay out of the fat JAR.
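
For reference, a minimal build.sbt sketch of the setup described above (the Spark and Scala versions here are assumptions taken from the answer below, adjust them to your project):

```scala
// build.sbt -- minimal sketch, not the asker's actual build file.

name := "my-spark-app"

scalaVersion := "2.10.6" // assumption, matching the spark-core_2.10 artifact mentioned below

// "provided" keeps these out of the sbt-assembly fat JAR, but it also
// removes them from the classpath IntelliJ uses when running a main class,
// which is exactly the tension the question is about.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "1.6.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.6.1" % "provided"
)
```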
8 Answers
  •  无人共我 · 2021-01-31 03:27

    Why not bypass sbt and manually add spark-core and spark-streaming as libraries to your module dependencies?

    • Open the Project Structure dialog (e.g. ⌘;).
    • In the left-hand pane of the dialog, select Modules.
    • In the pane to the right, select the module of interest.
    • In the right-hand part of the dialog, on the Module page, select the Dependencies tab.
    • On the Dependencies tab, click the add button (+) and select Library....
    • In the Choose Libraries dialog, click New Library and choose From Maven....
    • Search for spark-core, e.g. org.apache.spark:spark-core_2.10:1.6.1.
    • Profit

    https://www.jetbrains.com/help/idea/2016.1/configuring-module-dependencies-and-libraries.html?origin=old_help#add_existing_lib
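
    Once spark-core is attached as a module library this way, a plain main class can be launched straight from the IDE. A minimal, illustrative driver (the object name and the local[*] master are my assumptions, not part of the answer):

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    // Runnable from IntelliJ once spark-core is on the module classpath.
    object LocalSparkApp {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("local-dev")
          .setMaster("local[*]") // run inside the IDE, no cluster needed

        val sc = new SparkContext(conf)
        try {
          // trivial job just to confirm the classpath is wired up
          val sum = sc.parallelize(1 to 10).reduce(_ + _)
          println(s"sum = $sum")
        } finally {
          sc.stop()
        }
      }
    }
    ```

    The trade-off of this approach is that the IntelliJ library list and the sbt build are maintained separately: build.sbt can keep Spark strictly "provided" for packaging, but you have to keep the manually added library versions in sync by hand.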
