scala-collections

How can I ensure that the dynamic type of my custom Scala collection is preserved during a map()?

为君一笑 submitted on 2019-12-01 21:49:33
Question: I read the very interesting article on the architecture of the Scala 2.8 collections and I've been experimenting with it a little bit. For a start, I simply copied the final code for the nice RNA example. Here it is for reference:

    abstract class Base
    case object A extends Base
    case object T extends Base
    case object G extends Base
    case object U extends Base

    object Base {
      val fromInt: Int => Base = Array(A, T, G, U)
      val toInt: Base => Int = Map(A -> 0, T -> 1, G -> 2, U -> 3)
    }

    final class RNA
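The part of that example that actually makes map return an RNA (rather than a plain IndexedSeq[Base]) is the companion object with its newBuilder and implicit CanBuildFrom. A condensed sketch of that final version, assuming pre-2.13 collections (where CanBuildFrom still exists) and relying on the Base definitions quoted above:

    import scala.collection.IndexedSeqLike
    import scala.collection.mutable.{ArrayBuffer, Builder}
    import scala.collection.generic.CanBuildFrom

    final class RNA private (val groups: Array[Int], val length: Int)
        extends IndexedSeq[Base] with IndexedSeqLike[Base, RNA] {
      import RNA._

      // Transformations that keep the element type (filter, take, ...) rebuild
      // through this builder, so they already return an RNA.
      override protected[this] def newBuilder: Builder[Base, RNA] = RNA.newBuilder

      def apply(idx: Int): Base = {
        if (idx < 0 || length <= idx) throw new IndexOutOfBoundsException
        Base.fromInt(groups(idx / N) >> (idx % N * S) & M)
      }
    }

    object RNA {
      private val S = 2            // bits needed per base
      private val N = 32 / S       // bases packed into one Int
      private val M = (1 << S) - 1 // mask to isolate one base

      def fromSeq(buf: Seq[Base]): RNA = {
        val groups = new Array[Int]((buf.length + N - 1) / N)
        for (i <- 0 until buf.length)
          groups(i / N) |= Base.toInt(buf(i)) << (i % N * S)
        new RNA(groups, buf.length)
      }

      def apply(bases: Base*): RNA = fromSeq(bases)

      def newBuilder: Builder[Base, RNA] =
        new ArrayBuffer[Base] mapResult fromSeq

      // This is what lets `rna map f` keep the RNA type when f returns a Base.
      implicit def canBuildFrom: CanBuildFrom[RNA, Base, RNA] =
        new CanBuildFrom[RNA, Base, RNA] {
          def apply(): Builder[Base, RNA] = newBuilder
          def apply(from: RNA): Builder[Base, RNA] = newBuilder
        }
    }

With this companion in place, RNA(A, U, G, U) map { case U => A; case b => b } comes back as an RNA, while mapping to a non-Base element type quietly falls back to a general IndexedSeq.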

Extending Scala collections

眉间皱痕 submitted on 2019-12-01 21:12:54
I would like to derive a version of a Scala built-in collection that expands on the functionality for a particular generic type, e.g.:

    import scala.collection.immutable._

    class Tuple2Set[T1, T2] extends HashSet[Tuple2[T1, T2]] {
      def left  = map ( _._1 )
      def right = map ( _._2 )
    }

However, when I try to use it with the following test

    new Tuple2Set[String,String]() + (("x","y")) left

I get the following compile error:

    error: value left is not a member of scala.collection.immutable.HashSet[(String, String)]

How can I change the class so that this works? As Kevin Wright said, the + operation will return
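One workaround (a sketch of an assumed alternative, not the answer the excerpt cuts off) is to stop subclassing HashSet and instead enrich any set of pairs with left/right via an implicit class, so it no longer matters that + returns a plain HashSet:

    object Tuple2SetOps {
      // Hypothetical helper: adds left/right to every Set[(T1, T2)].
      implicit class PairSetOps[T1, T2](val underlying: Set[(T1, T2)]) extends AnyVal {
        def left: Set[T1]  = underlying.map(_._1)
        def right: Set[T2] = underlying.map(_._2)
      }
    }

    // usage
    // import Tuple2SetOps._
    // (Set.empty[(String, String)] + (("x", "y"))).left   // Set("x")

Because the operations live outside the collection hierarchy, every standard Set operation keeps working and still gets the extra methods afterwards.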

Underscores and string concatenation in List.map with Scala [duplicate]

有些话、适合烂在心里 submitted on 2019-12-01 12:43:30
This question already has an answer here: Scala foreach strange behaviour (5 answers)

Scala lets you use an underscore to do a simple map. So, for example, instead of writing:

    def roleCall(people: String*){
      people.toList.map(x => println(x))
    }

...I can instead write:

    def roleCall(people: String*){
      people.toList.map(println(_))
    }

However, for some reason I can't write:

    def greet(people: String*){
      // This won't compile!
      people.toList.map(println("Hello " + _))
    }

Instead I have to write:

    def greet(people: String*){
      people.toList.map(x => println("Hello " + x))
    }

Can anyone explain why? Basically, the
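The short version of the usual answer is that the placeholder expands to the smallest enclosing expression, so the underscore ends up inside the call to println rather than inside the argument to map. A small sketch of what compiles and what the failing form actually means (the object name is just for the demo):

    object GreetDemo extends App {
      def greet(people: String*): Unit =
        // The placeholder expands to the smallest enclosing expression, so
        //   people.toList.map(println("Hello " + _))
        // is read as people.toList.map(println(x => "Hello " + x)): a function
        // value is handed to println, not to map, and compilation fails.
        // Either name the parameter explicitly, or make the whole map argument
        // be the function, as below:
        people.toList.map("Hello " + _).foreach(println)

      greet("Alice", "Bob")   // prints "Hello Alice" and "Hello Bob"
    }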

Missing par method from Scala collections

被刻印的时光 ゝ submitted on 2019-12-01 10:34:03
Question: I tried to convert a sequential list to a parallel one in IntelliJ, but I get the error "Cannot resolve symbol par" on the .par method call:

    import scala.collection.parallel.immutable._
    ...
    val parList = List(1,2,3).par

According to https://docs.scala-lang.org/overviews/parallel-collections/overview.html, one must simply invoke the par method on the sequential collection. After that, one can use a parallel collection in the same way one would normally use a sequential collection. What
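A likely cause, if this is Scala 2.13 or later, is that the parallel collections were moved out of the standard library into the separate scala-parallel-collections module, so .par only becomes available after adding that dependency and importing its converters. A sketch (the version number is an assumption; use the current release):

    // build.sbt
    // libraryDependencies += "org.scala-lang.modules" %% "scala-parallel-collections" % "1.0.4"

    import scala.collection.parallel.CollectionConverters._  // restores .par in 2.13+

    object ParDemo extends App {
      val parList = List(1, 2, 3).par     // a ParSeq[Int]
      println(parList.map(_ * 2))         // a parallel sequence, e.g. ParVector(2, 4, 6)
    }

On Scala 2.12 and earlier the method is in the standard library, so a "Cannot resolve symbol" there usually points at the IDE setup rather than the code.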

Using a Scala map in Java

点点圈 submitted on 2019-12-01 07:41:49
I have two files. One is Scala and the other is Java. The Scala file has a function which returns a Scala immutable map. The Java file wants to use that map as a dictionary. I am a newbie to Scala and Java. How can I convert that Scala map to a Java dictionary?

Use the static forwarders from Java. In Scala 2.13 the API is simplified, with an overloaded asJava for conversion to Java:

    $ javap -cp ~/scala-2.13.0/lib/scala-library.jar scala.jdk.javaapi.CollectionConverters
    Compiled from "CollectionConverters.scala"
    public final class scala.jdk.javaapi.CollectionConverters {
      public static scala.collection.mutable.Map
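An alternative, if the Scala side can be changed, is to convert before the map crosses the language boundary, using scala.jdk.CollectionConverters (Scala 2.13), so the Java caller just receives a java.util.Map. A sketch with assumed names:

    import scala.jdk.CollectionConverters._

    object MapBridge {
      private def settings: Map[String, Int] = Map("a" -> 1, "b" -> 2)

      // Java code can call MapBridge.settingsForJava() and gets a java.util.Map.
      def settingsForJava: java.util.Map[String, Int] = settings.asJava
    }

From the Java side, the equivalent is the static forwarder shown by javap above: scala.jdk.javaapi.CollectionConverters.asJava(scalaMap).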

Method taking implicit CanBuildFrom does not work with eta-expansion?

六眼飞鱼酱① submitted on 2019-12-01 06:21:45
I have the following method:

    def firstAndLast[CC, A, That](seq: CC)(implicit asSeq: CC => Seq[A],
                                           cbf: CanBuildFrom[CC, A, That]): That = {
      val b = cbf(seq)
      b += seq.head
      b += seq.last
      b.result
    }

See: Method taking Seq[T] to return String rather than Seq[Char] for the rationale. It works like a charm in the first case but fails to compile in the second:

    List("abc", "def") map { firstAndLast(_) }
    List("abc", "def") map firstAndLast

Giving:

    error: No implicit view available from CC => Seq[A].
           List("abc", "def") map firstAndLast

Any idea how to improve this declaration to avoid the extra wrapping? Seems like
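For reference, here is the method as a runnable snippet with the import it needs (pre-2.13 collections, where CanBuildFrom still exists), plus comments on why only the lambda form compiles; the object name is just for the demo:

    import scala.collection.generic.CanBuildFrom

    object FirstAndLastDemo extends App {
      def firstAndLast[CC, A, That](seq: CC)(implicit asSeq: CC => Seq[A],
                                             cbf: CanBuildFrom[CC, A, That]): That = {
        val b = cbf(seq)
        b += seq.head
        b += seq.last
        b.result()
      }

      // The lambda fixes CC = String at each call site, so the implicit view
      // String => Seq[Char] and a CanBuildFrom[String, Char, String] can be found:
      println(List("abc", "def").map(s => firstAndLast(s)))   // List(ac, df)

      // `map firstAndLast` eta-expands the method up front, before any argument
      // type is known, so CC stays abstract and the implicit search fails.
    }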

Should Scala's map() behave differently when mapping to the same type?

狂风中的少年 submitted on 2019-12-01 06:00:23
In the Scala Collections framework, I think there are some behaviors that are counterintuitive when using map(). We can distinguish two kinds of transformations on (immutable) collections: those whose implementation calls newBuilder to recreate the resulting collection, and those that go through an implicit CanBuildFrom to obtain the builder. The first category contains all transformations where the type of the contained elements does not change. They are, for example, filter, partition, drop, take, span, etc. These transformations are free to call newBuilder and to recreate the same
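The distinction is easiest to see with a collection that constrains its element type, such as BitSet; a quick sketch (the result types in the comments may differ slightly between pre-2.13 Scala versions):

    import scala.collection.immutable.BitSet

    object MapTypeDemo extends App {
      val bits = BitSet(1, 2, 3)

      // filter keeps the element type, so it can rebuild via newBuilder:
      println(bits.filter(_ > 1))          // BitSet(2, 3)

      // map to Int: the implicit CanBuildFrom[BitSet, Int, BitSet] is found,
      // so the dynamic type is preserved:
      println(bits.map(_ + 1))             // BitSet(2, 3, 4)

      // map to a non-Int type: no BitSet can hold Strings, so CanBuildFrom
      // falls back to a more general set:
      println(bits.map(_.toString + "!"))  // e.g. Set(1!, 2!, 3!) or a sorted set,
                                           // depending on the Scala version
    }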

Extending a Scala collection

对着背影说爱祢 submitted on 2019-12-01 03:54:02
Question: I want a Map that throws on an attempt to overwrite the value for an existing key. I tried:

    trait Unoverwriteable[A, B] extends scala.collection.Map[A, B] {
      case class KeyAlreadyExistsException(e: String) extends Exception(e)

      abstract override def + [B1 >: B] (kv: (A, B1)): Unoverwriteable[A, B1] = {
        if (this contains (kv _1)) throw new KeyAlreadyExistsException(
          "key already exists in WritableOnce map: %s".format((kv _1) toString)
        )
        super.+(kv)
      }

      abstract override def get(key: A): Option[B] = super
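A simpler route (a sketch of an assumed alternative, not what the excerpt above goes on to say) is to wrap an ordinary immutable Map instead of stacking an abstract override trait onto scala.collection.Map:

    final case class KeyAlreadyExistsException(msg: String) extends Exception(msg)

    // Hypothetical wrapper: delegates to an underlying Map and refuses overwrites.
    final class UnoverwriteableMap[A, B] private (underlying: Map[A, B]) {
      def +(kv: (A, B)): UnoverwriteableMap[A, B] = {
        if (underlying.contains(kv._1))
          throw KeyAlreadyExistsException(s"key already exists: ${kv._1}")
        new UnoverwriteableMap(underlying + kv)
      }
      def get(key: A): Option[B] = underlying.get(key)
      def toMap: Map[A, B]       = underlying
    }

    object UnoverwriteableMap {
      def empty[A, B]: UnoverwriteableMap[A, B] = new UnoverwriteableMap(Map.empty)
    }

    // usage
    // val m = UnoverwriteableMap.empty[String, Int] + ("a" -> 1)
    // m + ("a" -> 2)   // throws KeyAlreadyExistsException

The trade-off is that the wrapper is not itself a scala.collection.Map, so only the operations it exposes are available unless you convert back with toMap.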

Scala simple histogram

戏子无情 submitted on 2019-12-01 03:03:10
For a given Array[Double], for instance

    val a = Array.tabulate(100){ _ => Random.nextDouble * 10 }

what is a simple approach to calculating a histogram with n bins?

A very similar preparation of values as in @om-nom-nom's answer, yet the histogram method is quite small by using partition:

    case class Distribution(nBins: Int, data: List[Double]) {
      require(data.length > nBins)
      val Epsilon = 0.000001
      val (max, min) = (data.max, data.min)
      val binWidth = (max - min) / nBins + Epsilon
      val bounds = (1 to nBins).map { x => min + binWidth * x }.toList

      def histo(bounds: List[Double], data: List[Double]):
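The excerpt stops mid-signature; one plausible completion of that partition-based histo (a reconstruction under the assumptions above, not necessarily the original answer) recursively splits the data at each upper bound:

    import scala.util.Random

    case class Distribution(nBins: Int, data: List[Double]) {
      require(data.length > nBins)
      val Epsilon    = 0.000001
      val (max, min) = (data.max, data.min)
      val binWidth   = (max - min) / nBins + Epsilon
      val bounds     = (1 to nBins).map { x => min + binWidth * x }.toList

      // Split off everything below the first upper bound as one bin,
      // then recurse on the remaining bounds and data.
      def histo(bounds: List[Double], data: List[Double]): List[List[Double]] =
        bounds match {
          case Nil => Nil
          case h :: t =>
            val (bin, rest) = data.partition(_ < h)
            bin :: histo(t, rest)
        }

      def histogram: List[List[Double]] = histo(bounds, data)
    }

    object HistoDemo extends App {
      val a = Array.tabulate(100){ _ => Random.nextDouble * 10 }
      println(Distribution(5, a.toList).histogram.map(_.size))   // bin counts, e.g. List(23, 18, 21, 19, 19)
    }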