scala-2.9

Add tools.jar in the classpath of sbt project

拟墨画扇 submitted on 2020-04-08 00:51:12
Question: The ':javap' command in the Scala 2.9.1 console needs tools.jar (from JDK 6) on the classpath. From the command line this can be done with the '-cp' argument or the CLASSPATH environment variable. How can I do the same for the Scala console launched from SBT with the 'console' and 'console-quick' commands?

Answer 1: A long answer that might help you elsewhere as well. If I want to know about something in SBT, I inspect it:
> inspect console
[info] Task: Unit
[info] Description:
[info] Starts the Scala interpreter with …
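
For reference, a common way to put tools.jar on the classpath used by sbt's console task is to add it to the project's unmanaged jars. The snippet below is a sketch for build.sbt, not taken from the answer above, and it assumes sbt is running on a JDK whose java.home points at the JDK's jre directory (so tools.jar sits one level up, in lib):

    // build.sbt (sketch): put the JDK's tools.jar on the compile/console classpath.
    // Assumes java.home is <jdk>/jre, so tools.jar is found in <jdk>/lib.
    unmanagedJars in Compile += Attributed.blank(
      file(System.getProperty("java.home")).getParentFile / "lib" / "tools.jar"
    )

After a reload, inspecting unmanagedJars (with the same 'inspect' approach shown in the answer) should confirm tools.jar is on the list.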

What is the difference between different function creations

丶灬走出姿态 submitted on 2020-01-17 02:58:05
Question: This question is heavily related to my other question (and may lead to me solving that one) but is definitely different: how to allow passing in a => AnyRef function and call that function. I have been playing with different function creations and I am frankly having trouble creating an anonymous function of type => AnyRef or => String. I can create a function of type () => AnyRef and () => String, I think. Example 1: I have the following code: def debugLazyTest2(msg: => String): Unit = { …
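
The distinction the question runs into is that => String is a by-name parameter, not a standalone value type, whereas () => String is an ordinary function value. A small sketch (the names below are made up for illustration, not from the original post):

    object LazyDebug {
      val debugEnabled = true

      // By-name parameter: the argument expression is evaluated each time msg is
      // referenced, and not at all if the body never touches it.
      def debugLazy(msg: => String): Unit =
        if (debugEnabled) println(msg)

      // Function value: the caller must wrap the expression in () => ... and we
      // invoke it explicitly with ().
      def debugFn(msg: () => String): Unit =
        if (debugEnabled) println(msg())

      def main(args: Array[String]): Unit = {
        debugLazy("built " + "lazily")       // any String expression can be passed
        debugFn(() => "built " + "lazily")   // must be an explicit function literal
      }
    }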

Why does Scala define a “+=” operator for Short and Byte types?

独自空忆成欢 submitted on 2020-01-02 04:09:07
Question: Given the following Scala code:
var short: Short = 0
short += 1        // error: type mismatch
short += short    // error: type mismatch
short += 1.toByte // error: type mismatch
I'm not questioning the underlying typing - it's clear that "Short + value == Int". My questions are: 1. Is there any way at all that the operator can be used? 2. If not, then why is the operator available for use on Short and Byte? [And by extension *=, |=, &=, etc.]

Answer 1: The problem seems to be that "+(Short)" on the Short class is …
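
For context, a sketch of why the statement is rejected and the usual workaround: short += 1 desugars to short = short + 1, the addition yields an Int, and an Int cannot be assigned back to a Short without an explicit narrowing conversion:

    object ShortPlusEquals {
      def main(args: Array[String]): Unit = {
        var short: Short = 0
        // short += 1                 // desugars to short = short + 1, which is an Int
        short = (short + 1).toShort   // explicit narrowing conversion compiles
        println(short)                // 1
      }
    }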

How does one implement a Hadoop Mapper in Scala 2.9.0?

时光总嘲笑我的痴心妄想 submitted on 2019-12-30 06:29:46
Question: When I migrated to Scala 2.9.0 from 2.8.1, all of the code was functional except for the Hadoop mappers. Because I had some wrapper objects in the way, I distilled it down to the following example:
import org.apache.hadoop.mapreduce.{Mapper, Job}
object MyJob { def main(args: Array[String]) { val job = new Job(new Configuration()); job.setMapperClass(classOf[MyMapper]) } }
class MyMapper extends Mapper[LongWritable, Text, Text, Text] { override def map(key: LongWritable, value: Text, context: Mapper …
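
The part that usually trips this up is the type of the context parameter in the override. One commonly used way to write it in Scala is to refer to Hadoop's Java inner class Mapper.Context through a type projection; the sketch below assumes the standard org.apache.hadoop.io writable types:

    import org.apache.hadoop.io.{LongWritable, Text}
    import org.apache.hadoop.mapreduce.Mapper

    class MyMapper extends Mapper[LongWritable, Text, Text, Text] {
      // Mapper.Context is a Java inner class, referenced from Scala as a type projection.
      override def map(key: LongWritable, value: Text,
                       context: Mapper[LongWritable, Text, Text, Text]#Context): Unit = {
        context.write(new Text(key.toString), value)
      }
    }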

How can I use Scala's MurmurHash implementation: scala.util.MurmurHash3?

你说的曾经没有我的故事 submitted on 2019-12-22 08:46:50
Question: I'm writing a BloomFilter and wanted to use Scala's default MurmurHash3 implementation, scala.util.MurmurHash3. My compile is failing, however, with the following error:
[error] /mnt/hgfs/dr/sandbox/dr-commons/src/main/scala/dr/commons/collection/BloomFilter.scala:214: MurmurHash3 is not a member of scala.util
[error] import scala.util.{MurmurHash3 => MH}
I'm using Scala 2.9.1 and sbt 0.11.2. Is the MurmurHash3 class not in the 2.9.1 library by default? I assume it is, since it's used a …
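
For comparison, a sketch of what the import and a couple of calls look like on Scala 2.10 and later, where the object is available as scala.util.hashing.MurmurHash3 (whether the 2.9.1 library ships an equivalent is exactly what the question asks):

    // Scala 2.10+ sketch; the object lives in scala.util.hashing there.
    import scala.util.hashing.{MurmurHash3 => MH}

    object HashSketch {
      def main(args: Array[String]): Unit = {
        val h1 = MH.stringHash("bloom-filter-key")      // default seed
        val h2 = MH.stringHash("bloom-filter-key", 42)  // explicit seed, e.g. for a second hash function
        println(h1 + " " + h2)
      }
    }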

Dynamic Proxy using Scala's new Dynamic Type

岁酱吖の submitted on 2019-12-21 12:31:54
Question: Is it possible to create an AOP-like interceptor using Scala's new Dynamic type feature? For example, would it be possible to create a generic stopwatch interceptor that could be mixed in with arbitrary types to profile my code? Or would I still have to use AspectJ?

Answer 1: I'm pretty sure Dynamic is only used when the object you're selecting on doesn't already have what you're selecting. From the nightly scaladoc: instances x of this trait allow calls x.meth(args) for arbitrary method names meth …
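
A small sketch of the behaviour the answer describes: applyDynamic only fires for calls the static type does not already define, which makes Dynamic closer to a "method missing" hook than to an around-advice interceptor (on Scala 2.10+ the language import below is required; on 2.9 the trait was still experimental):

    import scala.language.dynamics

    class Recorder extends Dynamic {
      // Any call r.whatever(args...) that Recorder does not define statically lands here.
      def applyDynamic(name: String)(args: Any*): Unit =
        println("called " + name + "(" + args.mkString(", ") + ")")
    }

    object DynamicSketch {
      def main(args: Array[String]): Unit = {
        val r = new Recorder
        r.save("doc", 42)   // rewritten by the compiler to r.applyDynamic("save")("doc", 42)
      }
    }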

Why do case classes extend only Product and not Product1, Product2, …, ProductN?

不羁的心 submitted on 2019-12-21 07:19:06
Question: After I learned that case classes extend Product, I wondered why they do not extend ProductN. E.g., given code like case class Foo(a: Int), I'd expect Foo(1).asInstanceOf[Product1[Int]] to work, but it does not (checked with Scala 2.9.1, and confirmed by other sources and by the Product documentation). I was interested in this because I wanted to declare classes such as: abstract class UnaryOp[T1 <: Exp[_], R](t1: T1) extends Exp[R] { this: Product1[T1] => } This way, a node for a unary …
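
One workaround consistent with what the question observes is to declare the ProductN parent by hand and supply the _1 member it requires; the sketch below is illustrative, not from the original post, and uses a simplified Foo rather than the Exp hierarchy:

    // Sketch: Foo does not get Product1 for free, but it can extend it explicitly.
    case class Foo(a: Int) extends Product1[Int] {
      def _1: Int = a
    }

    object ProductSketch {
      def main(args: Array[String]): Unit = {
        val p: Product1[Int] = Foo(1)   // now legal without asInstanceOf
        println(p._1)                   // prints 1
      }
    }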