I am new to NoSQL solutions and want to play with Hive. But installing HDFS/Hadoop takes a lot of resources and time (maybe because I lack experience, but I have no time for this).
Update: this answer is out of date. With Hive on Spark, HDFS support is no longer necessary.
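As a minimal sketch of what that looks like: Hive's execution engine is controlled by the `hive.execution.engine` property (the default is `mr`). Switching it to `spark` requires a Hive build with Spark support (Hive 1.1+) and a configured `spark.master`; the `local[*]` master shown here is an assumption for a single-machine experiment:

```sh
# Start the Hive CLI with Spark as the execution engine instead of MapReduce.
# Assumes Hive was built with Spark support and spark.master is set
# (e.g. local[*] to run Spark in-process on one machine).
hive --hiveconf hive.execution.engine=spark
```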
Hive requires HDFS and MapReduce, so you will need them. The other answer has some merit in recommending a simple, pre-configured means of getting all of the components set up for you.
But the gist of it is: Hive needs Hadoop and MapReduce, so to some degree you will have to deal with them. A lightweight way to do that is sketched below.
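If you just want to experiment on one machine, you can avoid running any HDFS or MapReduce daemons at all: with no cluster configuration, Hadoop defaults to standalone (local) mode, where the filesystem is `file:///` and jobs run in-process. A rough sketch under those assumptions (version numbers are placeholders for whatever tarballs you download; `schematool` ships with Hive 2.x and later, while older releases initialize the embedded Derby metastore on first use):

```sh
# Unpack Apache Hadoop and Hive tarballs; x.y.z are placeholder versions.
tar xzf hadoop-x.y.z.tar.gz && export HADOOP_HOME=$PWD/hadoop-x.y.z
tar xzf apache-hive-x.y.z-bin.tar.gz && export HIVE_HOME=$PWD/apache-hive-x.y.z-bin
export PATH=$HIVE_HOME/bin:$HADOOP_HOME/bin:$PATH

# No daemons are started: unconfigured Hadoop runs in standalone mode,
# reading from the local filesystem and running jobs in a local runner.

# Initialize the embedded Derby metastore (Hive 2.x+), then start the CLI.
schematool -dbType derby -initSchema
hive
```

This gives you a working Hive prompt for learning the query language, without the resource cost of a real HDFS/MapReduce cluster; anything you later build this way will carry over to a full installation.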