data-science-experience

Spark-cloudant package 1.6.4 loaded by %AddJar does not get used by notebook

Submitted by て烟熏妆下的殇ゞ on 2019-12-11 06:12:06

Question: I'm trying to use the latest spark-cloudant package with a notebook:

    %AddJar -f https://github.com/cloudant-labs/spark-cloudant/releases/download/v1.6.4/cloudant-spark-v1.6.4-167.jar

Which outputs:

    Starting download from https://github.com/cloudant-labs/spark-cloudant/releases/download/v1.6.4/cloudant-spark-v1.6.4-167.jar
    Finished download of cloudant-spark-v1.6.4-167.jar

Followed by:

    val dfReader = sqlContext.read.format("com.cloudant.spark")
    dfReader.option("cloudant.host", sourceDB.host)
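For reference, the reader the question is building can be sketched in PySpark terms. This is a minimal sketch, assuming a Spark 1.6 `sqlContext` is available in the notebook; the `cloudant_options` helper and the credential values are hypothetical, not part of the original post:

```python
# Hypothetical helper: collect the cloudant.* options that the
# com.cloudant.spark data source reads (option names taken from the question).
def cloudant_options(host, username, password):
    return {
        "cloudant.host": host,
        "cloudant.username": username,
        "cloudant.password": password,
    }

# Usage sketch (requires a running Spark 1.6 SQLContext as `sqlContext`):
# df = (sqlContext.read
#           .format("com.cloudant.spark")
#           .options(**cloudant_options("ACCOUNT.cloudant.com", "USER", "PASS"))
#           .load("mydatabase"))
```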

No FileSystem for scheme: cos

Submitted by 左心房为你撑大大i on 2019-12-11 03:35:15

Question: I'm trying to connect to IBM Cloud Object Storage from IBM Data Science Experience:

    access_key = 'XXX'
    secret_key = 'XXX'
    bucket = 'mybucket'
    host = 'lon.ibmselect.objstor.com'
    service = 'mycos'

    sqlCxt = SQLContext(sc)
    hconf = sc._jsc.hadoopConfiguration()
    hconf.set('fs.cos.myCos.access.key', access_key)
    hconf.set('fs.cos.myCos.endpoint', 'http://' + host)
    hconf.set('fs.cose.myCos.secret.key', secret_key)
    hconf.set('fs.cos.service.v2.signer.type', 'false')

    obj = 'mydata.tsv.gz'
    rdd = sc
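Note that these Hadoop keys follow the pattern `fs.cos.<service>.<property>`, so every key must use one consistent service name. As an illustration (the helper name is hypothetical, not part of the connector's API), a sketch that builds a consistent key set:

```python
# Hypothetical helper: build Hadoop configuration keys for one COS "service".
# Every key must share the same service name (e.g. all "fs.cos.myCos.*");
# a stray variant such as "fs.cose.myCos.secret.key" would not be read.
def cos_config(service, access_key, secret_key, endpoint):
    prefix = f"fs.cos.{service}."
    return {
        prefix + "access.key": access_key,
        prefix + "secret.key": secret_key,
        prefix + "endpoint": endpoint,
    }

# Usage sketch in a PySpark notebook:
# conf = cos_config("myCos", access_key, secret_key, "http://" + host)
# for k, v in conf.items():
#     hconf.set(k, v)   # hconf = sc._jsc.hadoopConfiguration()
```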

Bluemix Analytics for Apache Spark log file information required

Submitted by 懵懂的女人 on 2019-12-10 17:56:51

Question: I would like more information when debugging my Spark notebook. I have found some log files:

    !ls $HOME/notebook/logs/

The files are:

    bootstrap-nnnnnnnn_nnnnnn.log
    jupyter-nnnnnnnn_nnnnnn.log
    kernel-pyspark-nnnnnnnn_nnnnnn.log
    kernel-scala-nnnnnnnn_nnnnnn.log
    logs-nnnnnnnn.tgz
    monitor-nnnnnnnn_nnnnnn.log
    spark160master-ego.log

Which applications log to these files, and what information is written to each of them?

Answer 1: When debugging notebooks, the kernel-*-*.log files are the ones you're
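Following the answer's pointer, the kernel logs can be picked out of that directory listing programmatically. A small illustrative sketch (the helper name is hypothetical):

```python
import fnmatch

def kernel_logs(filenames):
    """Select the kernel-*.log files (the per-kernel output the answer
    recommends checking first) from a directory listing."""
    return [name for name in filenames
            if fnmatch.fnmatch(name, "kernel-*.log")]

# e.g. kernel_logs(["bootstrap-1.log", "kernel-pyspark-1.log"])
# keeps only the kernel log.
```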

IPython Notebook: Why don't the widgets appear after correctly installing ipywidgets in DSX?

Submitted by 眉间皱痕 on 2019-12-08 01:50:46

Question: After installing ipywidgets in a Jupyter notebook on DSX (the IBM Data Science Experience platform), the widget doesn't render; I just get static output such as "A Jupyter Widget". Example:

    import ipywidgets as widgets
    widgets.Dropdown(
        options={'One': 1, 'Two': 2, 'Three': 3},
        value=2,
        description='Number:',
    )

Result: A Jupyter Widget

I have tried several variants of

    !jupyter nbextension enable --py widgetsnbextension --sys-prefix

based on http://ipywidgets.readthedocs.io/en/latest/user_install.html, but I still get the same error message:

    PermissionError: [Errno 13] Permission denied: '/usr/local/src
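The PermissionError comes from `--sys-prefix` trying to write under the shared Python prefix, which a hosted platform denies. As an illustration only (not an official DSX procedure), a sketch of choosing a writable target for the enable command:

```python
import os
import sys

def nbextension_flag():
    """Pick where `jupyter nbextension enable` should write its config:
    --sys-prefix needs write access under sys.prefix, which a shared
    install like DSX denies; --user writes under the home directory."""
    return "--sys-prefix" if os.access(sys.prefix, os.W_OK) else "--user"

# e.g. in a notebook cell:
# !jupyter nbextension enable --py widgetsnbextension {nbextension_flag()}
```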

TensorFrames not working with Tensorflow on Data Science Experience

Submitted by 萝らか妹 on 2019-12-02 21:56:01

Question: This is a follow-up from this question. I've imported the following jars into my notebook:

    pixiedust.installPackage("http://central.maven.org/maven2/com/typesafe/scala-logging/scala-logging-slf4j_2.10/2.1.2/scala-logging-slf4j_2.10-2.1.2.jar")
    pixiedust.installPackage("http://central.maven.org/maven2/com/typesafe/scala-logging/scala-logging-api_2.10/2.1.2/scala-logging-api_2.10-2.1.2.jar")

But when I run an extremely basic command using tensorframes, I get the following error:

    import
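The jar URLs passed to `pixiedust.installPackage` follow the standard Maven Central layout, which makes it easy to add further dependencies consistently. A small illustrative sketch (the helper is hypothetical, not part of pixiedust):

```python
# Hypothetical helper: build a Maven Central jar URL from coordinates.
# The pattern matches the URLs given to pixiedust.installPackage above.
def maven_central_url(group, artifact, version):
    path = group.replace(".", "/")
    return (f"http://central.maven.org/maven2/{path}/"
            f"{artifact}/{version}/{artifact}-{version}.jar")

# Usage sketch:
# pixiedust.installPackage(
#     maven_central_url("com.typesafe.scala-logging",
#                       "scala-logging-slf4j_2.10", "2.1.2"))
```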

install.packages("tm") -> "dependency 'slam' is not available"

Submitted by 北城余情 on 2019-12-02 18:22:36

Question: I'm trying to install the tm package on IBM's Data Science Experience (DSX):

    install.packages("tm")

However, I'm hitting this issue:

    "dependency 'slam' is not available"

This post suggests that R version 3.3.1 will resolve the issue; however, the R version on DSX is:

    R version 3.3.0 (2016-05-03)

How can I resolve this issue on IBM DSX? Note that you don't have root access on DSX. I've seen similar questions on Stack Overflow, but none are asking how to fix the issue on IBM DSX, e.g. dependency 'slam' is not available when installing TM package

Update:

    install.packages("slam")

Returns: Installing
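One common workaround pattern in this situation, offered here as an assumption rather than a verified fix for this exact case, is to install an older slam release from the CRAN source archive, since archived tarball URLs follow a fixed pattern. A sketch of building that URL:

```python
# Hypothetical helper: CRAN keeps superseded source tarballs under
# /src/contrib/Archive/<pkg>/<pkg>_<version>.tar.gz.
def cran_archive_url(package, version):
    return ("https://cran.r-project.org/src/contrib/Archive/"
            f"{package}/{package}_{version}.tar.gz")

# In R one would then run something like (the version is illustrative):
# install.packages("<that URL>", repos = NULL, type = "source")
```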

Enable nbextension on IBM Data Science Experience

Submitted by ≯℡__Kan透↙ on 2019-11-29 18:01:11

I have to enable the File Upload widget in Jupyter on IBM Data Science Experience. It requires the following set of commands:

    pip install fileupload
    jupyter nbextension install --py fileupload
    jupyter nbextension enable --py fileupload

It looks like we need sudo permission on the platform to execute the 2nd and 3rd commands, which IBM Data Science Experience does not allow. How can I install the file upload widget on the platform?

Users cannot enable notebook extensions on DSX Jupyter notebooks. The configuration of the notebook server, including the set of enabled notebook

How to troubleshoot a DSX scheduled notebook?

Submitted by 社会主义新天地 on 2019-11-28 13:05:26

I have a DSX notebook that I can run manually using the DSX user interface, and it populates some data in a Cloudant database. I have scheduled the notebook to run hourly. Overnight I would have expected the job to have run many times, but the Cloudant database has not been updated. How can I debug the scheduled job? Are there any logs that I can check to verify that the notebook has actually been executed? Is the output from my notebook saved to log files? Where can I find these files?

One possibility is to look into the kernel logs of your notebook kernel. For that you need to use a Python
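Building on that pointer, a small sketch (the log path and helper name are illustrative assumptions based on the log directory mentioned in an earlier question) that locates the newest pyspark kernel log to inspect:

```python
import glob
import os

def latest_kernel_log(log_dir=os.path.expanduser("~/notebook/logs")):
    """Return the most recently modified pyspark kernel log, or None.
    Scheduled runs write to these logs, so a fresh timestamp is one
    way to confirm the notebook actually executed."""
    logs = glob.glob(os.path.join(log_dir, "kernel-pyspark-*.log"))
    return max(logs, key=os.path.getmtime) if logs else None

# Usage sketch in a notebook cell:
# print(open(latest_kernel_log()).read())
```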