Hadoop on Windows Building/Installation Error

隐瞒了意图╮ asked 2020-12-05 16:21 · 4 answers · 937 views

I am trying to install Apache Hadoop 2.7.1 on Windows x64 (8.1 and Server 2012 R2), but the build fails at the following goal:

    [INFO] Apache Hadoop Common ..............
4 Answers
  •  离开以前
    2020-12-05 17:01

I've managed to build it with Visual Studio 2015 Community Edition. Here's how I built it:

    My environment

    Here's my shopping list:

    • Windows 10
    • JDK 1.8.0_51
    • Maven 3.3.3
    • Findbugs 1.3.9 (I haven't used this)
    • ProtocolBuffer 2.5.0 (I didn't pick the latest and greatest here - it has to be 2.5.0)
    • CMake 3.3.0
    • Visual Studio 2015 Community Edition
• GnuWin32 0.6.3 - a bit painful to install, but so is Cygwin
    • zlib 1.2.8
    • internet connection

    Windows System Environment variables

• JAVA_HOME = C:\Program Files\Java\jdk1.8.0_51
• MAVEN_HOME = C:\apache-maven-3.3.3

(Make sure the above point to your own JDK and Maven installations.)

I appended the following to my Windows system Path variable:

    ;%MAVEN_HOME%\bin;C:\Windows\Microsoft.NET\Framework64\v4.0.30319;c:\zlib

    The weird "C:\Windows\Microsoft.NET\Framework64\v4.0.30319" path is the location of MSBuild.exe, which is required during the build process.
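The variable setup above can also be done from a command prompt with setx, which persists the values for new console sessions. The paths below are the ones from my list - substitute your own install locations (and note that setx may truncate a very long Path):

```bat
:: Persist JAVA_HOME and MAVEN_HOME for the current user.
:: Paths are examples from this walkthrough - adjust to your installs.
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_51"
setx MAVEN_HOME "C:\apache-maven-3.3.3"

:: Append Maven, MSBuild and zlib to the user Path.
setx Path "%Path%;%MAVEN_HOME%\bin;C:\Windows\Microsoft.NET\Framework64\v4.0.30319;C:\zlib"
```

Open a fresh command prompt afterwards so the new values are picked up.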

Protocol Buffers 2.5.0

Oh no, another Unix/Linux-only build? I downloaded the Google package named protoc-2.5.0-win32.zip, then extracted the binary (protoc.exe) to c:\windows\system32 - just a lazy way to put it on the PATH.

    I'm not 100% sure of the effect of having a win32 component for this win64 build. But: "Hadoop 0.23+ requires the protocol buffers JAR (protobufs.jar) to be on the classpath of both clients and servers; the native binaries are required to compile this and later versions of Hadoop." - http://wiki.apache.org/hadoop/ProtocolBuffers.

So I understand the win32 executable is used only during the build process (the JAR equivalent should be packaged into the build).

If it is used in any way to compile native code, we may be left with some pointers out of order. I'll come back to this when I can.
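Before starting the build, it's worth a quick sanity check that the right protoc is the one on the PATH, since hadoop-common's code generation fails on any version other than 2.5.0:

```bat
:: Should print exactly: libprotoc 2.5.0
:: A different version here will break the hadoop-common build.
protoc --version
```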

    Tweaking the Hadoop sources

Well, this was necessary to allow the build to execute. It shouldn't affect the quality of the build itself, but keep in mind the result is an unofficial, unsupported, use-at-your-own-risk Hadoop, intended for a development environment.

    Migrating VS projects

The following files need to be opened with Visual Studio 2015:

\hadoop-common-project\hadoop-common\src\main\winutils\winutils.vcxproj
\hadoop-common-project\hadoop-common\src\main\native\native.vcxproj

Visual Studio will complain that they are in an old format. All you have to do is save all and close.
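If you'd rather not click through the IDE, the same one-way upgrade can, as far as I know, be scripted with devenv from the VS2015 command prompt - treat this as a sketch rather than a tested recipe:

```bat
:: Run from the Hadoop source root; devenv rewrites each project
:: file in place in the VS2015 format.
devenv /upgrade hadoop-common-project\hadoop-common\src\main\winutils\winutils.vcxproj
devenv /upgrade hadoop-common-project\hadoop-common\src\main\native\native.vcxproj
```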

    Enabling cmake VS 2015 project generation for hdfs

On line 441 of \hadoop-hdfs-project\hadoop-hdfs\pom.xml, edit the <else> value so that CMake generates a Visual Studio 2015 project for x64.

(If you are building for win32, use the 32-bit generator name instead.)
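For reference, here is a sketch of what the edited <else> branch might look like. The generator string is an assumption based on CMake's naming convention for VS2015 ("Visual Studio 14 2015", with a "Win64" suffix for x64) - verify it against your own pom.xml:

```xml
<!-- Inside the generator condition around line 441 of hadoop-hdfs's pom.xml. -->
<!-- "Visual Studio 14 2015 Win64" is CMake's generator name for VS2015 x64;  -->
<!-- drop the " Win64" suffix for a win32 build. (Assumed value - verify.)    -->
<else>-G "Visual Studio 14 2015 Win64"</else>
```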

    Building it

Find the "Developer Command Prompt for VS2015" on Windows. I'm still wondering what is so special about it, but the fact is that the build will only work from that prompt.

More environment variables

Set these in the command prompt:

set Platform=x64

set ZLIB_HOME=C:\zlib\include

(Unlike the official instructions, ZLIB_HOME should point to the include folder.)

    Finally building it

    Go to the hadoop source folder and issue:

    mvn package -Pdist,native-win -DskipTests -Dtar
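Putting the last steps together, the whole build session from the VS2015 developer prompt looks roughly like this (the source path is an assumption - use wherever you unpacked the sources):

```bat
:: All of this runs inside the Developer Command Prompt for VS2015.
set Platform=x64
set ZLIB_HOME=C:\zlib\include

:: Assumed source location - adjust to your checkout.
cd C:\hadoop-2.7.1-src

:: native-win builds winutils.exe and hadoop.dll; -Dtar produces the
:: distribution tarball under hadoop-dist\target.
mvn package -Pdist,native-win -DskipTests -Dtar
```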

    What next?

    Follow the official docs to get your hadoop instance configured and up and running.

I'll try to keep a link to the binaries on my blog:

    http://kplitzkahran.blogspot.co.uk/2015/08/hadoop-271-for-windows-10-binary-build.html
