How to run a Python Jupyter notebook daily, automatically

Asked 2020-12-14 08:35

I have code in a Python Jupyter notebook, but I need to run it every day, so I would like to know if there's a way to set this up. I'd really appreciate it.

7 Answers
  • 2020-12-14 08:56

    You may want to use the Google AI Platform Notebooks Scheduler service, currently in EAP (Early Access Program).

  • 2020-12-14 08:57

    Update

    Recently I came across papermill, a tool for executing and parameterizing notebooks.

    https://github.com/nteract/papermill

    papermill local/input.ipynb s3://bkt/output.ipynb -p alpha 0.6 -p l1_ratio 0.1
    

    This seems better than nbconvert because you can pass parameters. You still have to trigger the command with a scheduler; below is an example using cron on Ubuntu.
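    The papermill invocation above can also be assembled programmatically and handed to a scheduler of your choice. A minimal stdlib-only sketch (the paths and parameter names are just the example values from this answer, not anything papermill requires):

```python
# Build the papermill CLI command shown above as an argument list,
# ready to hand to subprocess.run.
import subprocess


def papermill_cmd(input_nb, output_nb, params):
    """Return a papermill invocation with a -p flag per parameter."""
    cmd = ["papermill", input_nb, output_nb]
    for name, value in params.items():
        cmd += ["-p", name, str(value)]
    return cmd


cmd = papermill_cmd("local/input.ipynb", "s3://bkt/output.ipynb",
                    {"alpha": 0.6, "l1_ratio": 0.1})
# subprocess.run(cmd, check=True)  # uncomment where papermill is installed
```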


    Old Answer

    jupyter nbconvert --execute
    

    can execute a Jupyter notebook; embedded in a cron job, this will do what you want.

    Example setup on Ubuntu:

    Create yourscript.sh with the following content:

    #!/bin/bash
    /opt/anaconda/envs/yourenv/bin/jupyter nbconvert \
                          --execute \
                          --to notebook /path/to/yournotebook.ipynb \
                          --output /path/to/yournotebook-output.ipynb

    Make the script executable with chmod +x yourscript.sh.
    

    There are other output formats besides --to notebook. I like this option because afterwards you have a fully executed notebook as a "log" file.

    I recommend running your notebook in a virtual environment so that future updates don't break your script. Do not forget to install nbconvert into that environment.

    Now create a cron job that runs every day, e.g. at 5:10 AM: type crontab -e in your terminal and add this line:

    10 5 * * * /path/to/yourscript.sh
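    Where cron is not available, the same daily trigger can be approximated with a plain Python loop. A stdlib-only sketch, reusing the placeholder notebook paths from above (unlike cron, this stops as soon as the process exits, so cron is still preferable where you have it):

```python
# Stdlib-only sketch of a daily 05:10 trigger, as an alternative to cron.
import datetime
import subprocess
import time


def seconds_until(hour, minute):
    """Seconds from now until the next occurrence of hour:minute."""
    now = datetime.datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += datetime.timedelta(days=1)  # that time already passed today
    return (target - now).total_seconds()


def run_daily():
    """Sleep until 05:10, execute the notebook, repeat."""
    while True:
        time.sleep(seconds_until(5, 10))
        subprocess.run(["jupyter", "nbconvert", "--execute",
                        "--to", "notebook", "/path/to/yournotebook.ipynb",
                        "--output", "/path/to/yournotebook-output.ipynb"])


# run_daily()  # uncomment to start the loop
```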
    
  • 2020-12-14 09:01

    Try the SeekWell Chrome Extension. It lets you schedule notebooks to run weekly, daily, hourly or every 5 minutes, right from Jupyter Notebooks. You can also send DataFrames directly to Sheets or Slack if you like.

    Here's a demo video, and there is more info in the Chrome Web Store link above as well.

    *Disclosure: I'm a SeekWell co-founder.*

  • 2020-12-14 09:04

    As others have mentioned, papermill is the way to go. Papermill is just nbconvert with a few extra features.

    If you want to handle a workflow of multiple notebooks that depend on one another, you can try Airflow's integration with papermill. If you are looking for something simpler that does not need a scheduler to run, you can try ploomber which also integrates with papermill (Disclaimer: I'm the author).

  • 2020-12-14 09:05

    Executing Jupyter notebooks with parameters is conveniently done with Papermill. I also find it convenient to share/version-control the notebook as either a Markdown file or a Python script with Jupytext. Then I convert the notebook to an HTML file with nbconvert. Typically my workflow looks like this:

    cat world_facts.md \
    | jupytext --from md --to ipynb --set-kernel - \
    | papermill -p year 2017 \
    | jupyter nbconvert --no-input --stdin --output world_facts_2017_report.html
    

    To learn more about the above, including how to specify the Python environment in which the notebook is expected to run and how to use continuous integration on notebooks, have a look at my article Automated reports with Jupyter Notebooks (using Jupytext and Papermill), which you can read on Medium, GitHub, or Binder. Use the Binder link if you want to try the commands in the article interactively.

  • 2020-12-14 09:09

    It's better to combine this with Airflow if you want higher quality. I packaged them in a Docker image: https://github.com/michaelchanwahyan/datalab.

    It works by modifying the open source package nbparameterize and integrating the passing of arguments such as execution_date. Graphs can be generated on the fly, and the output can be updated and saved inside the notebook.

    When it is executed:

    • the notebook is read and the parameters are injected
    • the notebook is executed and the output overwrites the original path

    Besides, the image also installs and configures common tools such as Spark, Keras, TensorFlow, etc.
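    The parameter injection described above can be illustrated with plain JSON manipulation, since a .ipynb file is just JSON. A rough stdlib-only sketch of the idea (not the actual nbparameterize or papermill API): rewrite the source of the cell tagged "parameters" before executing the notebook.

```python
# Rough sketch of notebook parameter injection: locate the cell tagged
# "parameters" in the notebook JSON and replace its source lines.
import json


def inject_parameters(nb_json, params):
    """Return notebook JSON with the 'parameters' cell source replaced."""
    nb = json.loads(nb_json)
    lines = [f"{name} = {value!r}\n" for name, value in params.items()]
    for cell in nb["cells"]:
        if "parameters" in cell.get("metadata", {}).get("tags", []):
            cell["source"] = lines
    return json.dumps(nb)


# Tiny hypothetical notebook with one tagged parameters cell.
nb = json.dumps({"cells": [{"cell_type": "code",
                            "metadata": {"tags": ["parameters"]},
                            "source": ["execution_date = None\n"]}]})
patched = inject_parameters(nb, {"execution_date": "2020-12-14"})
```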
