“error: type mismatch” in Spark with same found and required datatypes

鱼传尺愫 2020-12-07 02:14

I am using spark-shell to run my code. In my code, I have defined a function, and I call that function with its parameters.

The problem is that I get a type mismatch error in which the found and required datatypes are identical.

1 Answer
  • 2020-12-07 02:50

    Here is the trick. Let's open the REPL and define a class:

    scala> case class Foo(i: Int)
    defined class Foo
    

    and a simple function which operates on this class:

    scala> def fooToInt(foo: Foo) = foo.i
    fooToInt: (foo: Foo)Int
    
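    As a quick sanity check, calling the function at this point works as expected (the res number is illustrative and will differ in your session):

    scala> fooToInt(Foo(1))
    res0: Int = 1
    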

    redefine the class:

    scala> case class Foo(i: Int)
    defined class Foo
    

    and create an instance:

    scala> val foo = Foo(1)
    foo: Foo = Foo(1)
    

    All that's left is to call fooToInt:

    scala> fooToInt(foo)
    <console>:34: error: type mismatch;
     found   : Foo(in class $iwC)(in class $iwC)(in class $iwC)(in class $iwC)
     required: Foo(in class $iwC)(in class $iwC)(in class $iwC)(in class $iwC)
              fooToInt(foo)
    

    Does it look familiar? Here is yet another trick to get a better idea of what is going on:

    scala> case class Foo(i: Int)
    defined class Foo
    
    scala> val foo = Foo(1)
    foo: Foo = Foo(1)
    
    scala> case class Foo(i: Int)
    defined class Foo
    
    scala> def fooToInt(foo: Foo) = foo.i
    <console>:31: error: reference to Foo is ambiguous;
    it is imported twice in the same scope by
    import INSTANCE.Foo
    and import INSTANCE.Foo
             def fooToInt(foo: Foo) = foo.i
    

    So, long story short, this is expected, although slightly confusing, behavior. Every line you enter in the REPL is wrapped in its own synthetic object (the $iwC wrappers visible in the error message), so redefining Foo creates a brand-new type while the old fooToInt still refers to the previous one; the two definitions then coexist ambiguously in the same scope.
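
    You can see those wrappers directly by inspecting the runtime class name of an instance. This is a minimal sketch; the exact name varies by Scala and Spark version, so treat the printed value as illustrative:

    scala> foo.getClass.getName
    res1: String = $line19.$read$$iwC$$iwC$Foo
    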

    Unless you want to periodically :reset the REPL state, keep track of the entities you create; whenever a type definition changes, make sure no stale or ambiguous definitions persist (redefine dependent values and functions if needed) before you proceed, as sketched below.
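
    For instance, continuing the first transcript above, redefining fooToInt against the latest Foo makes the call compile again (a minimal sketch; res numbers will differ in your session):

    scala> def fooToInt(foo: Foo) = foo.i   // now bound to the newest Foo
    fooToInt: (foo: Foo)Int

    scala> fooToInt(foo)
    res2: Int = 1
    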
