Python unicode write to file crashes in command line but not in IDE


As Fenikso said, you should encode a string before writing it to a file. The reason file.write() doesn't do this itself is that you need to specify which encoding (UTF-8, UTF-16, etc.) to use. There's a Python module, codecs, which lets you create stream objects that know which encoding to use and apply it automatically. That's what Fenikso is using in his second example.
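
For illustration, here is a minimal sketch of that idea (the file name out.txt and the sample string are just placeholders): codecs.open gives you a file-like object that encodes every unicode string you write to it.

# -*- coding: utf-8 -*-
import codecs

text = u"Žluťoučký kůň"                     # unicode string containing non-ASCII characters
f = codecs.open("out.txt", "w", encoding="utf-8")
f.write(text)                               # encoded to UTF-8 automatically by the stream
f.close()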

As to why your code works in the IDE but not the command line, my guess is that your IDE is setting the "default encoding" to some non-default value. Try running this in both the IDE and the command line and see if it differs:

>>> import sys
>>> print sys.getdefaultencoding()

Here's some related information: http://blog.ianbicking.org/illusive-setdefaultencoding.html
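
To make the difference concrete, here is a small sketch (the file name and sample string are placeholders): in Python 2, writing a unicode string to a plain file implicitly encodes it with sys.getdefaultencoding(), so the same write can succeed in an IDE that sets a UTF-8 default and crash on the command line where the default is ASCII.

# -*- coding: utf-8 -*-
text = u"abcč"                              # non-ASCII unicode string

f = open("plain.txt", "w")
try:
    f.write(text)                           # implicit encode with the default codec; fails if it is ASCII
except UnicodeEncodeError:
    f.write(text.encode("utf-8"))           # explicit encoding sidesteps the default entirely
f.close()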

You should explicitly encode your string in the desired encoding before writing it to the file:

f.write(text.encode("cp1250", "replace")) # Czech Windows encoding, use your own

or

f.write(text.encode("utf-8", "replace")) # UTF-8

You can also explicitly open the file with a specific encoding:

# -*- coding: utf-8 -*-
from __future__ import unicode_literals
import codecs

x = "abcč"
f = codecs.open("test.txt", "w", "utf-8", "replace")  # file object that encodes to UTF-8 on every write
f.write(x)
f.close()
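
Note that a file opened with codecs.open expects unicode strings; if you pass it an already-encoded byte string, Python 2 first decodes it with ASCII, which brings back the same error for non-ASCII data.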

This is how I do it whenever I need to work with a specific encoding:

#!/usr/bin/env python
# -*- coding: UTF-8 -*-
import codecs
import sys

out = codecs.getwriter('utf-8')(sys.stdout)  # wrap stdout so unicode is encoded as UTF-8
out.write(u'some åäö-string')
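
Wrapping sys.stdout like this is mainly useful when output is piped or redirected, because Python 2 then cannot detect a terminal encoding and falls back to ASCII when printing unicode strings.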