I am using spark-shell to run my code. In my code, I have defined a function and I call that function with its parameters. The problem is that I get the error below.
Here is the trick. Let's open the REPL and define a class:
scala> case class Foo(i: Int)
defined class Foo
and a simple function which operates on this class:
scala> def fooToInt(foo: Foo) = foo.i
fooToInt: (foo: Foo)Int
redefine the class:
scala> case class Foo(i: Int)
defined class Foo
and create an instance:
scala> val foo = Foo(1)
foo: Foo = Foo(1)
All that's left is to call fooToInt:
scala> fooToInt(foo)
<console>:34: error: type mismatch;
found : Foo(in class $iwC)(in class $iwC)(in class $iwC)(in class $iwC)
required: Foo(in class $iwC)(in class $iwC)(in class $iwC)(in class $iwC)
fooToInt(foo)
Does it look familiar? Here is yet another trick to get a better idea of what is going on:
scala> case class Foo(i: Int)
defined class Foo
scala> val foo = Foo(1)
foo: Foo = Foo(1)
scala> case class Foo(i: Int)
defined class Foo
scala> def fooToInt(foo: Foo) = foo.i
<console>:31: error: reference to Foo is ambiguous;
it is imported twice in the same scope by
import INSTANCE.Foo
and import INSTANCE.Foo
def fooToInt(foo: Foo) = foo.i
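To make the mechanics concrete, here is a plain-Scala sketch (my own illustration, not code the REPL literally generates) of what those wrappers roughly amount to. The REPL compiles every line inside its own wrapper and pulls earlier results into scope with imports, so two definitions of Foo are structurally identical but nominally distinct classes:

object Line1 { case class Foo(i: Int) }  // first definition of Foo
object Line2 { case class Foo(i: Int) }  // redefinition: a distinct class with the same name

object Line3 {
  import Line1.Foo
  def fooToInt(foo: Foo) = foo.i         // compiled against Line1.Foo
}

object Line4 {
  import Line2.Foo
  val foo = Foo(1)                       // instance of Line2.Foo
  // Line3.fooToInt(foo)                 // type mismatch, just like in the REPL
}

object Line5 {
  import Line1.Foo
  import Line2.Foo
  // def fooToInt(foo: Foo) = foo.i     // reference to Foo is ambiguous, as above
}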
So long story short, this is expected, although slightly confusing, behavior that arises from ambiguous definitions existing in the same scope.
Unless you want to periodically :reset the REPL state, you should keep track of the entities you create, and if type definitions change, make sure that no ambiguous definitions persist (overwrite things if needed) before you proceed.
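For example, a minimal sketch of that workaround: after redefining the case class, redefine the function (and any stale values) as well, so everything refers to the newest Foo:

scala> case class Foo(i: Int)
defined class Foo

scala> def fooToInt(foo: Foo) = foo.i
fooToInt: (foo: Foo)Int

scala> fooToInt(Foo(1))
res0: Int = 1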