pentaho

Postgres to JSON. Pentaho 7.0 (Data Integration)

Submitted by 孤街浪徒 on 2019-12-23 04:52:26
Question: I query a Postgres database and retrieve two fields, "USER" and "CREATED" (a date). I extract the year from the creation date, then traverse the records and, for each year and user, build a new JSON object. I would like to generate JSON with the following structure:

[
  { "year": 2015,
    "users": [
      { "user": "Ana",   "created": 4 },
      { "user": "Pedro", "created": 7 }
    ]
  },
  { "year": 2016,
    "users": [
      { "user": "Ana",   "created": 4 },
      { "user": "Pedro", "created": 7 }
    ]
  }
]

I create a modification with
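A minimal sketch of the grouping step in plain Python (the sample rows and field names are assumptions; in PDI this logic would typically live in a scripting step or a Group By followed by a JSON Output step):

```python
import json
from collections import defaultdict

# Rows as they might come back from Postgres: (user, year extracted from CREATED)
rows = [("Ana", 2015), ("Pedro", 2015), ("Ana", 2016),
        ("Pedro", 2016), ("Ana", 2015)]

def group_by_year(rows):
    """Count rows per (year, user) and nest the users under their year."""
    counts = defaultdict(lambda: defaultdict(int))
    for user, year in rows:
        counts[year][user] += 1
    return [
        {"year": year,
         "users": [{"user": u, "created": n} for u, n in users.items()]}
        for year, users in sorted(counts.items())
    ]

print(json.dumps(group_by_year(rows), indent=2))
```

The counts here reflect the sample rows, not the 4/7 figures from the question.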

Pentaho Mondrian Schema: left join of fact table with dimension table

Submitted by 生来就可爱ヽ(ⅴ<●) on 2019-12-23 01:07:28
Question: I have integrated Pentaho 5 EE with Impala. In my schema, dimension values are not gathered from the fact table, as it is a huge table and it takes too long to compute them. Since dimension values come from dimension tables, Mondrian compiles a query that joins the dimension table with the fact table in that order (i.e. dimension table on the left). The query this way is slow, and I read on the Cloudera website that when you do a join in Impala, the bigger table (the fact table) has to be on the right.

Parametrized transformation from Pentaho DI server console

Submitted by 一笑奈何 on 2019-12-22 12:27:50
Question: I can execute an independent scheduled transformation from the Pentaho DI server console, but I have an issue running a parametrized scheduled transformation from the Pentaho DI server console. How can I pass a parameter value at run time? In the Pentaho BI server, to execute a parametrized report we used to pass the variable value in the URL. I tried the same in the Pentaho DI server as below, but it didn't work: http:// * * /pentaho-di/kettle/transStatus?name=UI_parameter&Values=Testvalue Source: https://stackoverflow.com/questions
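Note that transStatus only reports status; a sketch of building a Carte-style executeTrans URL instead, where extra query-string entries are handed to the transformation as named parameters (the host, port, transformation path, and parameter name below are placeholders):

```python
from urllib.parse import urlencode

def execute_trans_url(base, trans_path, **params):
    """Build a Carte executeTrans URL that passes named parameters.

    Any keyword argument beyond the reserved 'trans' entry is appended
    to the query string, which Carte forwards to the transformation.
    """
    query = {"trans": trans_path, **params}
    return f"{base}/kettle/executeTrans/?{urlencode(query)}"

url = execute_trans_url("http://server:9080", "/home/admin/my_trans.ktr",
                        UI_parameter="Testvalue")
print(url)
```

The resulting URL can then be fetched with any HTTP client (with the Carte credentials the DI server requires).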

Integrating Pentaho Reporting web frontend with custom Java/JSF application

Submitted by ◇◆丶佛笑我妖孽 on 2019-12-22 11:37:07
Question: I have the following situation: an OLTP database schema with data. Database procedures pump the data into a denormalized star schema with defined dimensions and fact tables. The goal is to build a web application that can do summary and drill-down on those defined data structures. I could build a custom web interface, but I would prefer to use existing tools for the reporting part. The resulting application must be written in Java and integrated with an existing solution based on JSF, and Pentaho looks like

Adding a jdbc driver to pentaho design studio and configuring the datasource

Submitted by 我的梦境 on 2019-12-22 11:04:04
Question: I've just integrated Pentaho's Design Studio into the BI server. Does anyone know how to add the MySQL JDBC drivers? I need to connect in order to define the relational action process. In my research I found http://wiki.bizcubed.com.au/xwiki/bin/view/Pentaho%20Tutorial/Install%20Pentaho%20Design%20Studio#Comments, which specifies selecting JDBC Driver, Edit, Extra Class Path from Preferences, but no such preference exists, and http://forums.pentaho.com/showthread.php?85148-Design-Studio-xaction

Limiting the number of rows in MongoDB input

Submitted by …衆ロ難τιáo~ on 2019-12-22 08:19:21
Question: How can I limit the number of rows retrieved by the MongoDB input step used in Kettle? I tried the MongoDB input query with the queries below, but none of them work: {"$query" : {"$limit" : 10}} or {"$limit" : 10}. Please let me know where I am going wrong. Thanks, Deepthi Answer 1: There are several query modification operators you can use. Their names are not totally intuitive and don't match the names of the functions you would use in the Mongo shell, but they do the same sorts of things. In your
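A sketch of the legacy query-modifier envelope the answer alludes to (field names are hypothetical). The key point is that $limit is not a query modifier, which is why placing it inside the query document has no effect; row limits are normally applied by the driver or step, e.g. cursor.limit(10) in pymongo:

```python
def with_modifiers(filter_doc, orderby=None):
    """Wrap a plain filter in MongoDB's legacy query-modifier envelope.

    Modifiers such as $orderby ride alongside the filter under $query.
    $limit is NOT such a modifier: limits are set on the cursor by the
    client, not inside the query document itself.
    """
    doc = {"$query": filter_doc}
    if orderby is not None:
        doc["$orderby"] = orderby
    return doc

print(with_modifiers({"status": "active"}, orderby={"created": -1}))
```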

using variable names for a database connection in Pentaho Kettle

Submitted by 好久不见. on 2019-12-22 00:09:29
Question: I am working with PDI Kettle. Can we define a variable and use it in a database connection, so that if I ever need to change the connections in multiple transformations I can just change the variable value in the kettle properties file? Answer 1: Just use variables in the Database Connection, for instance ${DB_HostName}, ${DB_Name}, etc. Then put them in your kettle.properties: DB_HostName=localhost. You can see which fields support variables by the S in the blue diamond. Source:
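A fuller sketch of how this could look in kettle.properties (the variable names here are examples, matching whatever ${...} placeholders you type into the connection dialog):

```properties
# ~/.kettle/kettle.properties -- variable names are examples
DB_HostName=localhost
DB_Port=5432
DB_Name=sales_dw
DB_UserName=etl_user
```

In the Database Connection dialog, the corresponding fields would then read ${DB_HostName}, ${DB_Port}, ${DB_Name}, and ${DB_UserName}.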

How to change Pentaho after login page

Submitted by 空扰寡人 on 2019-12-21 22:07:58
Question: After logging in to the Pentaho BI server as a user, Pentaho shows a default page. I need to redesign that page to match the look and feel of a company website. What files do I need to edit, and how do I add links from that page to dashboards in Pentaho? Answer 1: Instead of redesigning the Pentaho screens, create a "central dashboard" with links to all of your "sub-dashboards". This central place can be created as another Pentaho CDE dashboard, and you can make it look and feel like the company website, because

Unable to connect to HDFS using PDI step

Submitted by 自古美人都是妖i on 2019-12-21 21:22:39
Question: I have successfully configured Hadoop 2.4 in an Ubuntu 14.04 VM from a Windows 8 system. The Hadoop installation is working absolutely fine, and I am also able to view the NameNode from my Windows browser. Attached image below. So my host name is ubuntu and the HDFS port is 9000 (correct me if I am wrong). core-site.xml:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://ubuntu:9000</value>
</property>

The issue is while connecting to HDFS from my Pentaho Data Integration tool. Attached image below
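A quick way to rule out basic networking before debugging the PDI step is to check that the NameNode RPC port is reachable from the machine running Spoon. A minimal sketch (the host name ubuntu and port 9000 are taken from the question; adjust for your VM's network setup):

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# From the PDI machine, the NameNode RPC port should be reachable:
# print(port_open("ubuntu", 9000))
```

If this returns False, the problem is VM networking or /etc/hosts resolution rather than the PDI Hadoop configuration.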

Pentaho BI Installation

Submitted by 瘦欲@ on 2019-12-20 16:30:28
# Editor's Note
Someone needed to run statistics over a pile of data. Having previously tried Baidu BI, and since BI tools are broadly similar in model but Baidu BI did not match the requirements, I turned to the largest open-source BI project, Pentaho BI.

# Pentaho BI
## Official sites
Commercial site, community site, documentation link

## Download
At the very bottom of the community site there is a Download section; under MAIN DOWNLOADS, download the Business Analytics Platform. The file is 900 MB+, so mind your network connection.

## Environment
Install a Java environment first; Pentaho BI automatically checks whether Java is set up correctly:

DEBUG: Using JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
DEBUG: _PENTAHO_JAVA=/usr/lib/jvm/java-7-openjdk-amd64/bin/java
--------------------------------------------------------------------------------------------
The Pentaho BI Platform now contains a version checker that will