When I use Spark Streaming and Kafka integration with Kafka broker version 0.10.1
KafkaUtils.createDirectStream returns an org.apache.spark.streaming.dstream.DStream (an InputDStream of Kafka ConsumerRecords), not an RDD. Spark Streaming creates RDDs internally as it runs, one per batch interval. To work with them, call stream.foreachRDD() to get each RDD, then rdd.foreach() to get each object in that RDD. Those objects are Kafka ConsumerRecords, whose value() method returns the message read from the Kafka topic:
stream.foreachRDD { rdd =>
  rdd.foreach { record =>
    // record is a Kafka ConsumerRecord; value() returns the message payload
    val value = record.value()
    println(value)
  }
}
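For context, here is a minimal, self-contained sketch of how such a stream is typically created with the spark-streaming-kafka-0-10 integration. The broker address, group id, topic name, and application name below are placeholders, not values from the question; adjust them to your setup:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object KafkaDirectStreamExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaDirectStreamExample")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Placeholder broker address, group id, and topic name
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "example-group",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )
    val topics = Array("example-topic")

    // createDirectStream returns a DStream of ConsumerRecord[K, V], not an RDD
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](topics, kafkaParams)
    )

    // Each batch interval yields one RDD; foreachRDD exposes it,
    // and record.value() gives the message payload
    stream.foreachRDD { rdd =>
      rdd.foreach { record =>
        println(record.value())
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}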