distribute

Distribute a Python script as precompiled bytecode + all necessary libraries

半城伤御伤魂 submitted on 2020-07-19 16:58:49
Question: I made a (one-file) script in Python for my client. The program is a success, and now it needs to be distributed to 12 of my client's employees. The script uses a lot of libraries (imports), some of them not popular at all, so here goes the question: is there a way to distribute my program already compiled to bytecode, so the users can run it by simply doing "python myProgram.pyc", or just "myProgram.pyc" (if it has the +x property)? I know this is entirely possible in Java by compiling …
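
One piece of this is straightforward with the standard library: py_compile turns a script into a .pyc. A minimal sketch, assuming the script is called myProgram.py (file names here are illustrative); note that a .pyc only runs on the same major.minor interpreter version it was compiled with, and this does not bundle any third-party libraries:

```python
# Sketch: compile a single script to bytecode using only the standard library.
# Assumes myProgram.py is in the current directory; names are illustrative.
import py_compile

# cfile controls where the compiled bytecode is written.
py_compile.compile("myProgram.py", cfile="myProgram.pyc")
```

For shipping the dependencies as well, a self-contained archive built with a tool such as zipapp or PyInstaller is the more common route.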

Hadoop DistributedCache functionality in Spark

别等时光非礼了梦想. submitted on 2020-04-06 05:16:05
Question: I am looking for functionality similar to Hadoop's distributed cache in Spark. I need a relatively small data file (with some index values) to be present on all nodes in order to make some calculations. Is there any approach that makes this possible in Spark? My workaround so far consists of distributing and reducing the index file as a normal processing step, which takes around 10 seconds in my application. After that, I persist the file, marking it as a broadcast variable, as follows: …
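
The broadcast-variable route the poster mentions looks roughly like this in PySpark; a minimal sketch, with the file path and lookup structure assumed for illustration:

```python
# Sketch: ship a small index file to every executor as a broadcast variable.
# File path and lookup structure are illustrative, not from the original post.
from pyspark import SparkContext

sc = SparkContext(appName="broadcast-example")

# Read the small index file once, on the driver only.
# Assumes a two-column CSV: key,value.
with open("/local/path/to/index.csv") as f:
    index = dict(line.strip().split(",") for line in f)

# Each executor receives one read-only copy of the dict.
bc_index = sc.broadcast(index)

# Tasks read the cached copy through .value instead of re-reading the file.
hits = sc.parallelize(["a", "b", "c"]).map(lambda k: bc_index.value.get(k)).collect()
```

Broadcasting avoids re-shipping the data with every task, which is the closest Spark analogue to Hadoop's DistributedCache for read-only side data.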

How do you correct "Module already loaded" UserWarnings in Python?

余生颓废 submitted on 2020-01-10 17:29:10
Question: I get the following kind of warning when running most Python scripts at the command line: /Library/Python/2.6/site-packages/virtualenvwrapper/hook_loader.py:16: UserWarning: Module pkg_resources was already imported from /System/Library/Frameworks/Python.framework/Versions/2.6/Extras/lib/python/pkg_resources.pyc, but /Library/Python/2.6/site-packages is being added to sys.path import pkg_resources /Library/Python/2.6/site-packages/virtualenvwrapper/hook_loader.py:16: UserWarning: Module …
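
The warning itself points at the root cause: pkg_resources gets imported from the system framework before /Library/Python/2.6/site-packages is added to sys.path, so two copies shadow each other. If the path conflict can't be fixed, the symptom can at least be silenced with the standard-library warnings filter; a minimal sketch (the message fragment is taken from the warning above):

```python
# Sketch: suppress the duplicate-import UserWarning at startup.
# This hides the symptom only; the real fix is removing the
# conflicting pkg_resources copy from sys.path.
import warnings

warnings.filterwarnings(
    "ignore",
    message="Module pkg_resources was already imported",
    category=UserWarning,
)
```

The same effect is available from the command line with python -W "ignore::UserWarning" script.py.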

Can I distribute my iPhone app to only certain people?

拟墨画扇 submitted on 2020-01-04 04:14:08
Question: I want to develop a specific application that only my own clients would be able to use. How can I limit the app to be downloaded only by people whom I approve? Thanks! Answer 1: Developer and Ad Hoc provisioning profiles expire, so if you use that method, your clients would have to be willing to continually renew their app certificates. Each client could apply to the developer program, which would allow up to 100 devices per client. But the only non-expiring method for non-enterprise-size …

Installing my sdist from PyPI puts the files in unexpected places

[亡魂溺海] submitted on 2020-01-01 09:29:09
Question: My problem is that when I upload my Python package to PyPI and then install it from there using pip, my app breaks because pip installs my files into completely different locations than when I simply install the exact same package from a local sdist. Installing from the local sdist puts files on my system like this:

    /Python27/
        Lib/
            site-packages/
                gloopy-0.1.alpha-py2.7.egg/
                    (egg and install info files)
                    data/      (images and shader source)
                    doc/       (html)
                    examples/  (.py scripts that use the library)
…
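
One common way to make the locations predictable is to keep the data inside the package itself and declare it with package_data, so pip and a local sdist install it to the same place relative to the code. A minimal sketch, with the project name taken from the listing above and everything else assumed:

```python
# Sketch of a setup.py that ships data files inside the package directory,
# so they land next to the code regardless of how the package is installed.
from setuptools import setup, find_packages

setup(
    name="gloopy",
    version="0.1.alpha",
    packages=find_packages(),
    # Paths are relative to the package directory; patterns are illustrative.
    package_data={"gloopy": ["data/*", "doc/*", "examples/*"]},
)
```

At runtime the files can then be located relative to the package's __file__ instead of a hard-coded install prefix.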

Does pip handle extras_require from setuptools/distribute-based sources?

左心房为你撑大大i submitted on 2019-12-29 18:44:14
Question: I have a package "A" with a setup.py and an extras_require line like: extras_require = { 'ssh': ['paramiko'], }, and a package "B" that depends on "A": install_requires = ['A[ssh]']. If I run python setup.py install on package B, which uses setuptools.command.easy_install under the hood, the extras_require is correctly resolved, and paramiko is installed. However, if I run pip install /path/to/B or pip install http://.../b-version.tar.gz, package A is installed, but paramiko is not, because pip "installs …
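
For reference, the moving parts as two minimal setup.py sketches (the package names follow the post; versions and everything else are illustrative):

```python
# Sketch: package A declares an optional "ssh" extra.
from setuptools import setup

setup(
    name="A",
    version="1.0",
    extras_require={
        "ssh": ["paramiko"],  # pulled in only when A[ssh] is requested
    },
)
```

and package B asks for that extra through its dependency:

```python
# Sketch: package B depends on A together with its "ssh" extra.
from setuptools import setup

setup(
    name="B",
    version="1.0",
    install_requires=["A[ssh]"],
)
```

Modern pip does resolve extras named this way in install_requires; the behaviour described in the question was a limitation of pip versions of that era.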

Distribute/distutils specify Python version

元气小坏坏 submitted on 2019-12-28 18:08:34
Question: Kind of a follow-up to this… :) My project is Python 3-only, and my question is basically: how do I tell distutils/distribute/whoever that this package is Python 3-only? Answer 1: Not sure if there's some special setting, but this at the beginning of setup.py might help:

    import sys
    if sys.version_info.major < 3:
        print("I'm only for 3, please upgrade")
        sys.exit(1)

Source: https://stackoverflow.com/questions/13385337/distribute-distutils-specify-python-version
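
The answer above predates the declarative option that setuptools and pip later gained: python_requires, which lets pip refuse the installation before any code runs. A minimal sketch (the python_requires argument is real; the package name is illustrative):

```python
# Sketch: declare a Python-3-only package; pip >= 9 reads this metadata
# and refuses to install the package on Python 2 at install time.
from setuptools import setup

setup(
    name="mypackage",        # illustrative
    version="1.0",
    python_requires=">=3",
)
```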

Why are “sc.addFile” and “spark-submit --files” not distributing a local file to all workers?

最后都变了- submitted on 2019-12-23 10:09:08
Question: I have a CSV file "test.csv" that I'm trying to have copied to all nodes in the cluster. I have a 4-node apache-spark 1.5.2 standalone cluster. There are 4 workers, where one node also acts as master/driver as well as a worker. If I run: $SPARK_HOME/bin/pyspark --files=./test.csv or, from within the REPL interface, execute sc.addFile('file://' + '/local/path/to/test.csv'), I see Spark log the following: 16/05/05 15:26:08 INFO Utils: Copying /local/path/to/test.csv to /tmp/spark-5dd7fc83-a3ef…
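
For context, the intended pattern is that sc.addFile stages the file and each task then resolves its local, per-node copy via SparkFiles.get, rather than opening the driver's original path; a minimal sketch using the paths from the post:

```python
# Sketch: distribute a file with addFile and read the per-node copy
# inside tasks via SparkFiles.get (not the driver-side path).
from pyspark import SparkContext, SparkFiles

sc = SparkContext(appName="addfile-example")
sc.addFile("file:///local/path/to/test.csv")

def first_line(_):
    # Resolves to this executor's local copy of the staged file.
    with open(SparkFiles.get("test.csv")) as f:
        return f.readline().strip()

# Run a few tasks across the cluster to confirm every node can see the file.
print(sc.parallelize(range(4), 4).map(first_line).collect())
```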