conda fails to create environment from yml


Question


I’m trying to run the command below to create a Python virtual environment from a YAML file, at the command line on an Ubuntu server. The environment is named py36. When I run the command, I get the error shown under “Command line” below, and the environment is not created. Is the problem caused by the fact that several of the packages had to be installed with pip instead of conda? Does anyone know how to solve this?

I created the YAML file following the example from:

https://datascience.stackexchange.com/questions/24093/how-to-clone-python-working-environment-on-another-machine

Code:

conda env create -f py36.yml

py36.yml

name: py36
channels:
  - anaconda
  - cvxgrp
  - conda-forge
  - defaults
dependencies:
  - beautifulsoup4=4.6.3=py36_0
  - patsy=0.5.1=py36_0
  - sqlite=3.25.3=ha441bb4_0
  - tk=8.6.8=ha441bb4_0
  - asn1crypto=0.24.0=py36_1003
  - ca-certificates=2018.11.29=ha4d7672_0
  - certifi=2018.11.29=py36_1000
  - cffi=1.11.5=py36h5e8e0c9_1
  - clangdev=4.0.0=default_0
  - cryptography=2.3.1=py36hdbc3d79_1000
  - cryptography-vectors=2.3.1=py36_1000
  - cycler=0.10.0=py_1
  - fftw=3.3.8=h470a237_0
  - freetype=2.9.1=h6debe1e_4
  - glpk=4.65=h16a7912_1
  - gmp=6.1.2=hfc679d8_0
  - icu=58.2=h0a44026_1000
  - idna=2.8=py36_1000
  - kiwisolver=1.0.1=py36h2d50403_2
  - lapack=3.6.1=1
  - libiconv=1.15=h1de35cc_1004
  - libpng=1.6.35=ha92aebf_2
  - libxml2=2.9.8=hf14e9c8_1005
  - lightgbm=2.2.1=py36hfc679d8_0
  - llvmdev=4.0.0=default_0
  - matplotlib=2.2.3=py36h0e0179f_0
  - metis=5.1.0=3
  - mkl_fft=1.0.6=py36_0
  - mkl_random=1.0.1=py36_0
  - mlxtend=0.13.0=py_1
  - openblas=0.2.20=8
  - openmp=4.0.0=1
  - openssl=1.0.2p=h1de35cc_1002
  - pandas=0.23.4=py36hf8a1672_0
  - pycparser=2.19=py_0
  - pyopenssl=18.0.0=py36_1000
  - pyparsing=2.2.0=py_1
  - pysocks=1.6.8=py36_1002
  - python=3.6.6=h4a56312_1003
  - pytz=2018.5=py_0
  - selenium=3.141.0=py36h470a237_0
  - tbb=2018_20171205=0
  - urllib3=1.24.1=py36_1000
  - cvxcanon=0.1.1=py36_0
  - cvxpy=1.0.6=py36_0
  - ecos=2.0.5=py36hf9b3073_0
  - multiprocess=0.70.4=py36_0
  - scs=1.2.6=py36_0
  - appnope=0.1.0=py36hf537a9a_0
  - backcall=0.1.0=py36_0
  - blas=1.0=mkl
  - cvxopt=1.2.0=py36hb579ef3_0
  - decorator=4.3.0=py36_0
  - dill=0.2.8.2=py36_0
  - dsdp=5.8=hb579ef3_0
  - fastcache=1.0.2=py36h1de35cc_2
  - gsl=2.4=h1de35cc_4
  - intel-openmp=2019.0=117
  - ipykernel=4.8.2=py36_0
  - ipython=6.4.0=py36_0
  - ipython_genutils=0.2.0=py36h241746c_0
  - jedi=0.12.0=py36_1
  - jupyter_client=5.2.3=py36_0
  - jupyter_core=4.4.0=py36h79cf704_0
  - libcxx=4.0.1=h579ed51_0
  - libcxxabi=4.0.1=hebd6815_0
  - libedit=3.1.20170329=hb402a30_2
  - libffi=3.2.1=h475c297_4
  - libgcc=4.8.5=hdbeacc1_10
  - libgfortran=3.0.1=h93005f0_2
  - libopenblas=0.3.3=hdc02c5d_2
  - libsodium=1.0.16=h3efe00b_0
  - mkl=2018.0.3=1
  - ncurses=6.1=h0a44026_0
  - numpy=1.15.4=py36h6a91979_0
  - numpy-base=1.15.4=py36h8a80b8c_0
  - parso=0.2.1=py36_0
  - pexpect=4.6.0=py36_0
  - pickleshare=0.7.4=py36hf512f8e_0
  - pip=10.0.1=py36_0
  - prompt_toolkit=1.0.15=py36haeda067_0
  - ptyprocess=0.5.2=py36he6521c3_0
  - pygments=2.2.0=py36h240cd3f_0
  - python-dateutil=2.7.3=py36_0
  - pyzmq=17.0.0=py36h1de35cc_1
  - readline=7.0=hc1231fa_4
  - scikit-learn=0.20.1=py36h4f467ca_0
  - scipy=1.1.0=py36h28f7352_1
  - setuptools=39.2.0=py36_0
  - simplegeneric=0.8.1=py36_2
  - six=1.11.0=py36h0e22d5e_1
  - suitesparse=5.2.0=he235d88_0
  - toolz=0.9.0=py36_0
  - tornado=5.0.2=py36_0
  - traitlets=4.3.2=py36h65bd3ce_0
  - wcwidth=0.1.7=py36h8c6ec74_0
  - wheel=0.31.1=py36_0
  - xz=5.2.4=h1de35cc_4
  - zeromq=4.2.5=h378b8a2_0
  - zlib=1.2.11=hf3cbc9b_2
  - pip:
    - absl-py==0.2.2
    - astor==0.6.2
    - bleach==1.5.0
    - cython==0.28.3
    - gast==0.2.0
    - grpcio==1.12.1
    - h5py==2.8.0
    - html5lib==0.9999999
    - keras==2.2.0
    - keras-applications==1.0.2
    - keras-preprocessing==1.0.1
    - markdown==2.6.11
    - pillow==5.1.0
    - protobuf==3.5.2.post1
    - pyramid-arima==0.6.5
    - pyyaml==3.12
    - sklearn==0.0
    - statsmodels==0.9.0
    - tensorboard==1.8.0
    - tensorflow==1.8.0
    - termcolor==1.1.0
    - tqdm==4.23.4
    - werkzeug==0.14.1
    - xlrd==1.1.0
prefix: /Users/username/anaconda2/envs/py36

Command line

conda env create -f py36.yml
Collecting package metadata: done
Solving environment: failed

ResolvePackageNotFound: 
  - libgfortran==3.0.1=h93005f0_2
  - pyzmq==17.0.0=py36h1de35cc_1
  - python==3.6.6=h4a56312_1003
  - prompt_toolkit==1.0.15=py36haeda067_0
  - libiconv==1.15=h1de35cc_1004
  - sqlite==3.25.3=ha441bb4_0
  - six==1.11.0=py36h0e22d5e_1
  - cryptography==2.3.1=py36hdbc3d79_1000
  - openssl==1.0.2p=h1de35cc_1002
  - libxml2==2.9.8=hf14e9c8_1005
  - libcxxabi==4.0.1=hebd6815_0
  - matplotlib==2.2.3=py36h0e0179f_0
  - ptyprocess==0.5.2=py36he6521c3_0
  - readline==7.0=hc1231fa_4
  - libedit==3.1.20170329=hb402a30_2
  - libgcc==4.8.5=hdbeacc1_10
  - xz==5.2.4=h1de35cc_4
  - pickleshare==0.7.4=py36hf512f8e_0
  - appnope==0.1.0=py36hf537a9a_0
  - scipy==1.1.0=py36h28f7352_1
  - cvxopt==1.2.0=py36hb579ef3_0
  - jupyter_core==4.4.0=py36h79cf704_0
  - dsdp==5.8=hb579ef3_0
  - ncurses==6.1=h0a44026_0
  - tk==8.6.8=ha441bb4_0
  - ecos==2.0.5=py36hf9b3073_0
  - wcwidth==0.1.7=py36h8c6ec74_0
  - scikit-learn==0.20.1=py36h4f467ca_0
  - libopenblas==0.3.3=hdc02c5d_2
  - traitlets==4.3.2=py36h65bd3ce_0
  - libsodium==1.0.16=h3efe00b_0
  - ipython_genutils==0.2.0=py36h241746c_0
  - fastcache==1.0.2=py36h1de35cc_2
  - numpy==1.15.4=py36h6a91979_0
  - numpy-base==1.15.4=py36h8a80b8c_0
  - zlib==1.2.11=hf3cbc9b_2
  - libffi==3.2.1=h475c297_4
  - pygments==2.2.0=py36h240cd3f_0
  - icu==58.2=h0a44026_1000
  - gsl==2.4=h1de35cc_4
  - libcxx==4.0.1=h579ed51_0
  - suitesparse==5.2.0=he235d88_0
  - zeromq==4.2.5=h378b8a2_0

Answer 1:


No, PyPI is not the issue. Instead, the create fails because the YAML includes platform-specific build constraints, but you are transferring it across platforms. Specifically, examining the build strings on the failed packages (e.g., six=1.11.0=py36h0e22d5e_1), I can see that they correspond to packages built for the osx-64 platform, while you are trying to install on linux-64, hence the build constraints are unresolvable.
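If you want to confirm this yourself, conda search accepts the full name=version=build spec plus a --subdir flag (a sketch, assuming a reasonably recent conda; the -c conda-forge channel here is a guess based on the build string style):

conda search -c conda-forge --subdir osx-64 'six=1.11.0=py36h0e22d5e_1'
conda search -c conda-forge --subdir linux-64 'six=1.11.0=py36h0e22d5e_1'

The first search should find the package and the second should come back empty, which is exactly the mismatch the solver is reporting.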

The simplest solution to this is to exclude the build info from the environment definition export.

conda env export -n py36 -f py36.yml --no-builds
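With --no-builds, each conda dependency is exported as name=version only, so the resulting file looks like this (abridged, using packages from the list above):

dependencies:
  - beautifulsoup4=4.6.3
  - matplotlib=2.2.3
  - numpy=1.15.4

This leaves the solver free to pick whichever linux-64 builds satisfy those versions.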

There can still be issues if some of the packages are not available on linux-64 through Conda. If this is the case, you may need to find other channels (or check PyPI), switch versions, or remove the dependency altogether. Most of the packages look standard though.
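If one of them does turn out to be unavailable for linux-64 through Conda but exists on PyPI, one workaround (a sketch; somepackage is a hypothetical placeholder, not one of the packages above) is to move it under the pip: section of the YAML:

dependencies:
  - python=3.6.6
  - pip=10.0.1
  - pip:
    - somepackage==1.2.3  # hypothetical: not available via conda on linux-64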

Not so important, but you can safely remove cvxgrp from your channels. That channel only serves an outdated version of cvxopt and only for osx-64.




Answer 2:


Indeed, exported environments keep platform-specific build details in the conda-installed (i.e., dependencies:) section. From the OP's sample:

  - zlib=1.2.11=hf3cbc9b_2

Here, hf3cbc9b_2 is a platform-specific build tag, and it has to be removed.

If you switch between platforms often (OSX <-> Linux, for example), follow the answer from @merv above; --no-builds is the right thing to use for your future env exports.

If, like me, you just want it fixed for the time being, you can do it manually or run a sed over the file:

sed 's/\(.*[[:alnum:]]\)=[[:alnum:]][[:alnum:].-_]*/\1/' environment.yml > env.yml

That strips the platform-specific build tags without touching the pip: section of the file.
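For example, the pattern turns the zlib line from the question's file

  - zlib=1.2.11=hf3cbc9b_2

into

  - zlib=1.2.11

while a pip line such as - absl-py==0.2.2 never matches, because the pattern requires an alphanumeric character on both sides of the = it strips, and the double == breaks that requirement.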

Then you can try again with env.yml:

conda env create -f env.yml

Note that packages specific to the original platform may remain. If Conda still complains after the build tags are removed, you'll have to clean those packages out manually. For example, I was bringing an environment.yml from Linux to Mac, where the packages libgcc-ng=9.1.0, libstdcxx-ng=9.1.0, and libgfortran-ng=7.3.0 are not available; I removed them by hand.
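As a sketch of that manual cleanup (the package names are from my Linux-to-Mac example; substitute whatever ResolvePackageNotFound lists for you), the leftovers can be filtered out with grep:

grep -v -e 'libgcc-ng' -e 'libstdcxx-ng' -e 'libgfortran-ng' env.yml > env-clean.yml
conda env create -f env-clean.yml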

Once such cleaning was done, my conda env create -f env.yml worked like a charm.
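Assuming your conda is new enough for conda activate (4.4+), you can then sanity-check the result, e.g.:

conda activate py36
python -c "import numpy, pandas; print(numpy.__version__, pandas.__version__)"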



Source: https://stackoverflow.com/questions/55554431/conda-fails-to-create-environment-from-yml
