Pentaho MongoDBInput Java integration


Question


I have a simple Pentaho transformation with a MongoDBInput step connected to a JSON Output step. I can fetch JSON when I preview the transformation in Pentaho Design Studio, but when I integrate with Java and run the transformation, it throws this error:

Exception in thread "main" org.pentaho.di.core.exception.KettleXMLException: Error reading object from XML file

Unable to load step info from XML step nodeorg.pentaho.di.core.exception.KettleStepLoaderException: Unable to load class for step/plugin with id [MongoDbInput]. Check if the plugin is available in the plugins subdirectory of the Kettle distribution.

Unable to load class for step/plugin with id [MongoDbInput]. Check if the plugin is available in the plugins subdirectory of the Kettle distribution.

My code is:

import java.io.IOException;
import java.util.List;

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.core.RowMetaAndData;
import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.core.util.EnvUtil;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class KettleConnector {

    public static void main(String[] args) throws KettleException, IOException {
        // Initialize the Kettle environment (false = no repository).
        KettleEnvironment.init(false);
        EnvUtil.environmentInit();

        // Load the transformation definition from the .ktr file and run it.
        TransMeta transMeta = new TransMeta("D:\\mangoes.ktr");
        Trans trans = new Trans(transMeta);
        trans.execute(null); // You can pass arguments instead of null.
        trans.waitUntilFinished();

        // Collect the result rows produced by the transformation.
        Result r = trans.getResult();
        List<RowMetaAndData> rowsResult = r.getRows();
        System.out.println(trans.getTransMeta());

        if (trans.getErrors() > 0) {
            throw new RuntimeException("Transformation finished with errors");
        }
    }
}

It works fine for MySQL transformations.

I have included mongo-2.4.jar and mongo-java-driver-2.7.2.jar, but I am still facing this error.


Answer 1:


Please check whether you are adding the following VM argument when running:

-DKETTLE_PLUGIN_BASE_FOLDERS=D:/LOCATION/data-integration/plugins
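If passing the VM argument at launch is inconvenient, the same property can, as a sketch, be set programmatically before `KettleEnvironment.init()` runs. The path below is a placeholder taken from the answer above; adjust it to your own PDI install:

```java
public class PluginPathSetup {

    public static void main(String[] args) {
        // KETTLE_PLUGIN_BASE_FOLDERS must be set before KettleEnvironment.init()
        // is called, otherwise Kettle will not scan this folder for plugins.
        // The path is a placeholder; point it at your PDI plugins directory.
        System.setProperty("KETTLE_PLUGIN_BASE_FOLDERS",
                "D:/LOCATION/data-integration/plugins");

        // Confirm the property is visible to the JVM.
        System.out.println(System.getProperty("KETTLE_PLUGIN_BASE_FOLDERS"));

        // In a real application you would now call KettleEnvironment.init(false)
        // and build your TransMeta/Trans as in the question's code.
    }
}
```

Setting the property in code has the same effect as the `-D` flag, since both end up as JVM system properties; the `-D` form is just easier to change without recompiling.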



Answer 2:


The MongoDB steps are Pentaho Data Integration (PDI) plugins. What version of PDI are you using? If you are using 4.4.0, then you will need the Big Data plugin, which contains the MongoDB steps. From where you launch your application, you'll need a plugins/steps folder containing the contents of the Big Data plugin (the pentaho-big-data-plugin folder). Then your call to KettleEnvironment.init() will load the plugins and you will have access to the MongoDB steps.
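For reference, a sketch of what the application's working directory might look like under that setup (the folder names are assumptions based on this answer, not verified against a specific PDI release):

```
my-app/
├── my-app.jar
└── plugins/
    └── steps/
        └── pentaho-big-data-plugin/   (contents of the Big Data plugin,
            ...                         including its jar and lib/ folder)
```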




Answer 3:


Matt, I am using Pentaho Kettle 5.1. Which jars from the plugins folder need to be included for the MongoDB steps to work? I have included everything from plugins/pentaho-mongodb-plugin.

I still get the same error.




Answer 4:


Well, I figured it out with Matt's help. Though this approach doesn't fit well with how Java WAR files are designed, you essentially need the entire plugins folder passed as a `-D` argument to your app server. Here's an example:

-DKETTLE_PLUGIN_BASE_FOLDERS=/Users/user1/Documents/pdi-ce-5.1.0/data-integration/plugins



Source: https://stackoverflow.com/questions/17788800/pentaho-mongodbinput-java-integration
