setuptools

Changing console_script entry point interpreter for packaging

て烟熏妆下的殇ゞ submitted on 2019-11-28 07:43:19
I'm packaging some Python packages using a well-known third-party packaging system, and I'm encountering an issue with the way entry points are created. When I install an entry point on my machine, the generated script contains a shebang pointing at whichever Python interpreter ran the install, like so: in /home/me/development/test/setup.py: from setuptools import setup; setup(entry_points={"console_scripts": ['some-entry-point = test:main']}). The installed /home/me/.virtualenvs/test/bin/some-entry-point then starts with: #!/home/me/.virtualenvs/test/bin/python # EASY-INSTALL-ENTRY-SCRIPT: 'test==1.0.0','console_scripts','some-entry…
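For reference, the setup() call from the question laid out as it would appear in setup.py (the name and version are inferred from the EASY-INSTALL header quoted above); the shebang of the generated wrapper script simply points at whichever interpreter performed the install:

```python
# /home/me/development/test/setup.py -- as described in the question;
# name/version inferred from the 'test==1.0.0' wrapper header.
from setuptools import setup

setup(
    name='test',
    version='1.0.0',
    entry_points={
        "console_scripts": [
            'some-entry-point = test:main',
        ]
    },
)
```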

How can I use setuptools to generate a console_scripts entry point which calls `python -m mypackage`?

邮差的信 submitted on 2019-11-28 07:35:02
I am trying to be a good Pythonista and follow PEP 338 for a package I plan to deploy. I also want to generate my executable scripts at install time using the setuptools entry_points={'console_scripts': ...} option. How can I use entry_points to generate a script that behaves like python -m mypackage (and passes along *args, **kwargs)? Here are a few attempts I have made with no success: (1) entry_points={'console_scripts': ['mypkg=mypkg.__main__']}, (2) entry_points={'console_scripts': ['mypkg=mypkg.main']}, (3) entry_points={'console_scripts': ['mypkg=python…
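A console_scripts entry point must reference an importable callable in module:function form, not a command line, so setuptools cannot emit a literal python -m invocation. A minimal sketch of the usual workaround, assuming the package exposes a main() function in mypkg/__main__.py (names are illustrative):

```python
# mypkg/__main__.py -- hypothetical layout; one callable serves both
# "python -m mypkg" and the generated console script.
import sys

def main(args=None):
    if args is None:
        args = sys.argv[1:]
    print("running mypkg with", args)

if __name__ == '__main__':
    main()
```

```python
# setup.py -- point the script at the callable, not at "python -m".
from setuptools import setup, find_packages

setup(
    name='mypkg',
    packages=find_packages(),
    entry_points={
        'console_scripts': ['mypkg = mypkg.__main__:main'],
    },
)
```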

Python Packaging: Data files are put properly in tar.gz file but are not installed to virtual environment

馋奶兔 submitted on 2019-11-28 04:20:20
I can't properly install the project package_fiddler into my virtual environment. I have figured out that MANIFEST.in is responsible for putting the non-.py files into the Package_fiddler-0.0.0.tar.gz that is generated by python setup.py sdist. Then I ran: (virt_envir)$ pip install dist/Package_fiddler-0.0.0.tar.gz. But this installed neither the data files nor the package to /home/username/.virtualenvs/virt_envir/local/lib/python2.7/site-packages. I have tried many combinations of the setup arguments package_data, include_package_data and data_files, but I seem to have used the wrong…
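A sketch of the combination that usually makes sdist data files land in site-packages, assuming the data files live inside the package_fiddler package directory (paths here are illustrative): include_package_data=True tells the install step to copy whatever MANIFEST.in pulled into the sdist, as long as those files sit inside a package. This would be paired with a MANIFEST.in line such as `recursive-include package_fiddler/data *`.

```python
# setup.py -- minimal sketch; relies on MANIFEST.in to list the data
# files and on include_package_data to install them with the package.
from setuptools import setup, find_packages

setup(
    name='Package_fiddler',
    version='0.0.0',
    packages=find_packages(),
    include_package_data=True,
)
```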

How do I use data in package_data from source code?

瘦欲@ submitted on 2019-11-28 04:08:35
In setup.py I have specified package_data like this: packages=['hermes'], package_dir={'hermes': 'hermes'}, package_data={'hermes': ['templates/*.tpl']}. My directory structure is roughly a top-level hermes/ directory containing docs/, the hermes/ package itself (__init__.py, code.py, and a templates/ directory holding python.tpl), plus README and setup.py. The problem is that I need to use files from the templates directory in my source code so I can write out Python code (this project is a parser generator). I can't seem to figure out how to properly include and use these files from my code. Any ideas? The standard pkgutil module's get_data()…
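A minimal sketch of reading a bundled template at runtime with the standard pkgutil.get_data(), which resolves the resource relative to the installed hermes package (the template name is taken from the layout above):

```python
import pkgutil

# Returns the file contents as bytes, regardless of whether the package
# is installed as a plain directory or inside a zip/egg.
template_bytes = pkgutil.get_data('hermes', 'templates/python.tpl')
template = template_bytes.decode('utf-8')
```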

How do I install an old version of Django on virtualenv?

眉间皱痕 submitted on 2019-11-28 04:02:33
This may sound like a stupid question, since the very purpose of virtualenv is exactly this: installing a specific version of a package (in this case Django) inside the virtual environment. But that is exactly what I want to do, and I can't figure it out. I'm on Windows XP; I created the virtual environment successfully and I'm able to run it, but how am I supposed to install the Django version I want into it? I know to use the newly created easy_install script, but how do I make it install Django 1.0.7? If I do easy_install django, it installs the latest version. I tried…
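Version pinning is done with a requirement specifier rather than a separate tool. A sketch, using the version string from the question (which may not exist on PyPI) and assuming the virtualenv's own easy_install/pip is the one being run so the package lands inside the environment:

```sh
# Run from inside the activated virtualenv; the version shown is the
# one the asker mentions and is only illustrative.
easy_install "django==1.0.7"
# or, with pip:
pip install "Django==1.0.7"
```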

What is the correct way to share package version with setup.py and the package?

余生颓废 submitted on 2019-11-28 02:50:56
With distutils, setuptools, etc. a package version is specified in setup.py: setup(name='foobar', version='1.0.0', ...). I would like to be able to access the same version number from within the package: >>> import foobar; >>> foobar.__version__ gives '1.0.0'. I could add __version__ = '1.0.0' to my package's __init__.py, but I would also like to include additional imports in that __init__.py to create a simplified interface to the package: # file: __init__.py: from foobar import foo; from foobar.bar import Bar; __version__ = '1.0.0', and # file: setup.py: from foobar…
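One common pattern is to keep __version__ in foobar/__init__.py as the single source of truth and have setup.py read it without importing the package, so the extra imports in __init__.py never have to be importable at build time. A rough sketch:

```python
# setup.py -- extract __version__ from foobar/__init__.py with a regex
# instead of importing foobar.
import os
import re
from setuptools import setup

def read_version():
    here = os.path.dirname(os.path.abspath(__file__))
    with open(os.path.join(here, 'foobar', '__init__.py')) as f:
        match = re.search(r"^__version__\s*=\s*['\"]([^'\"]+)['\"]",
                          f.read(), re.M)
    if not match:
        raise RuntimeError("Unable to find __version__ in foobar/__init__.py")
    return match.group(1)

setup(
    name='foobar',
    version=read_version(),
)
```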

Copy configuration file on installation

北城以北 submitted on 2019-11-28 02:16:21
I am trying to package my Python project, which comes with a configuration dotfile that I want copied into the user's home directory on installation. The quick guide to packaging says this can be done using the data_files argument to setuptools.setup. So this is what I have: data_files=[(os.path.expanduser("~"), [".my_config"])]. This appears to work fine if I use python setup.py install, but when I upload my package to PyPI and install it with pip install, the dotfile isn't copied. FWIW, I've added the dotfile to MANIFEST.in and also tried the package_data argument to setup. None of…
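For reference, the declaration from the question as it would sit in setup.py (name and version are placeholders). One commonly cited explanation for the behaviour described is that data_files entries with absolute targets such as the home directory are honoured by setup.py install but not by wheel-based pip installs:

```python
# setup.py -- sketch of the question's data_files usage; expanduser("~")
# resolves to the home directory of whoever runs the install.
import os
from setuptools import setup

setup(
    name='myproject',   # hypothetical name
    version='0.1',      # hypothetical version
    data_files=[(os.path.expanduser("~"), [".my_config"])],
)
```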

How to force a python wheel to be platform specific when building it?

我怕爱的太早我们不能终老 submitted on 2019-11-28 01:11:44
I am working on a Python 2 package whose setup.py contains some custom install commands. These commands build some Rust code and output .dylib files that are moved into the Python package. An important point is that the Rust code lives outside the Python package. setuptools is supposed to detect automatically whether a package is pure Python or platform specific (if it contains C extensions, for instance). In my case, when I run python setup.py bdist_wheel, the generated wheel is tagged as a pure Python wheel: <package_name>-<version>-py2-none-any.whl. This is…
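A commonly cited workaround is to override the bdist_wheel command and mark the distribution as non-pure, so the wheel gets a platform tag even though setuptools sees no ext_modules. A sketch, assuming the wheel package is installed (the package name and version are placeholders):

```python
# setup.py -- force a platform-specific wheel; the Rust artifacts are
# produced by custom commands, so setuptools would otherwise tag the
# wheel as pure Python.
from setuptools import setup

try:
    from wheel.bdist_wheel import bdist_wheel as _bdist_wheel

    class bdist_wheel(_bdist_wheel):
        def finalize_options(self):
            _bdist_wheel.finalize_options(self)
            self.root_is_pure = False  # emit a platform tag instead of "any"
except ImportError:
    bdist_wheel = None  # wheel not installed; bdist_wheel unavailable anyway

setup(
    name='package_name',   # placeholder
    version='0.1',         # placeholder
    cmdclass={'bdist_wheel': bdist_wheel} if bdist_wheel else {},
)
```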

Setuptools unable to use link from dependency_links

≯℡__Kan透↙ submitted on 2019-11-27 23:45:57
I've been trying to install a package with the following setup configured: setup( packages=find_packages(), include_package_data=True, install_requires=[ 'Django==1.5.1', 'xhtml2pdf', ], dependency_links=[ 'https://github.com/chrisglass/xhtml2pdf/zipball/28d12fcaafc4c47b13f1f6f42c2bfb73f90cc947#egg=xhtml2pdf', ], ). However, it installs the xhtml2pdf package from PyPI instead of using the specified link. According to the output (I ran the install with pip install -vvv package.tar.gz), it either could not parse the version from the link (at // 1 in code), or I've not specified the correct…
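pip's complaint about not being able to parse a version suggests the #egg fragment needs an explicit version that matches the pin in install_requires. A sketch of that shape (the xhtml2pdf version number here is purely illustrative, and older pip releases also need --process-dependency-links for dependency_links to be honoured at all):

```python
# setup.py -- dependency_links entry carrying a version in the egg
# fragment, with a matching pin in install_requires.
from setuptools import setup, find_packages

setup(
    packages=find_packages(),
    include_package_data=True,
    install_requires=[
        'Django==1.5.1',
        'xhtml2pdf==0.1a1',   # hypothetical version; must match the fragment below
    ],
    dependency_links=[
        'https://github.com/chrisglass/xhtml2pdf/zipball/'
        '28d12fcaafc4c47b13f1f6f42c2bfb73f90cc947#egg=xhtml2pdf-0.1a1',
    ],
)
```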

Cross Compiling Python Extensions

不问归期 submitted on 2019-11-27 23:40:19
I have a problem cross-compiling the netifaces extension under the Buildroot Linux distro for ARM (Python 2.7.2). Following this blog post, http://whatschrisdoing.com/blog/2009/10/16/cross-compiling-python-extensions/, I've defined the CC, LDSHARED, etc. environment variables, but distutils/setuptools doesn't take CC into account, so all the configure checks fail: running build Setting prefix Setting prefix running build_ext checking for getifaddrs... not found. (cached) checking for getnameinfo... not found. (cached)…
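The blog post's approach boils down to exporting the cross toolchain through the environment variables distutils consults before invoking the build. A sketch with placeholder toolchain names; the exact compiler prefix and sysroot depend on the Buildroot toolchain in use:

```sh
# Hypothetical ARM toolchain prefix and sysroot; adjust to the actual
# Buildroot output directory.
export CC="arm-linux-gnueabi-gcc"
export LDSHARED="arm-linux-gnueabi-gcc -shared"
export CFLAGS="--sysroot=/path/to/buildroot/staging"

python setup.py build_ext
```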