MySQL 5.7.12 import cannot create a JSON value from a string with CHARACTER SET 'binary'


Question


I exported my database with JSON columns in it. After I migrated to a new server, my import crashed every time with an error like:

cannot create a JSON value from a string with CHARACTER SET 'binary'

On Stack Overflow, I found this post, but it didn't work for me: mysqlimport issues "set @@character_set_database=binary" which prevents loading json values

The file is 2GB, so it isn't possible to open it in an editor.

Does anyone have an idea how to import my database file?


Answer 1:


You can apply a regex to the exported SQL text to convert the binary strings into an insertable format. This was my quick-and-dirty fix when I faced this issue:

Search: (X'[^,\)]*')
Replace: CONVERT($1 using utf8mb4)

Applying this regex means

INSERT INTO json_table (json_column) VALUES (X'7B22666F6F223A2022626172227D');

will now become

INSERT INTO json_table (json_column) VALUES (CONVERT(X'7B22666F6F223A2022626172227D' using utf8mb4));
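Since the dump is 2GB and too large to open in an editor, the substitution can also be applied as a stream. Here is a minimal Python sketch of that approach; the file names dump.sql and dump_fixed.sql are placeholders, and it assumes each X'...' literal is contained within a single line of the dump:

import re

# Matches a hex literal such as X'7B22...7D' up to the next comma or
# closing parenthesis, mirroring the regex above.
pattern = re.compile(r"(X'[^,\)]*')")

with open('dump.sql', 'r', encoding='utf-8', errors='replace') as src, \
     open('dump_fixed.sql', 'w', encoding='utf-8') as dst:
    for line in src:
        # Wrap each literal so MySQL decodes it as utf8mb4 text on insert.
        dst.write(pattern.sub(r"CONVERT(\1 using utf8mb4)", line))

Note that mysqldump often emits one very long line per extended INSERT, so a single "line" here may still be large, but this avoids loading the whole file at once.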



Answer 2:


I had this problem dealing with exports made by Sequel Pro. I unchecked the "Output BLOB fields as hex" option and the problem went away. Visually inspecting the export showed legible JSON instead of binary.




Answer 3:


I faced the same issue today. Below are the findings for my case.

I asked one of my friends to generate an SQL dump for me to import. He used Sequel Pro to generate the dump (export database). When I ran the import, it threw the error

Cannot create a JSON value from a string with CHARACTER SET 'binary'

So, there was an issue with the generated dump: all the JSON fields were converted to a raw format, i.e. instead of the value being

"{'key1':'value1', 'key2':'value2'}"

it was,

X'nfdsklsdsklnfjkbvkjsdbvkjhdfsbvkjdsbnvljkdsbvkjhdfbvkjdfbvjkdfb'

So, when importing the dump, i.e. running the insert statements, MySQL could not process the data, as it was not of JSON type.
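Such a literal is just the JSON text hex-encoded as bytes. You can confirm this yourself; for example, decoding the literal from the first answer in Python:

# Strip the X'...' wrapper and decode the hex payload as UTF-8.
raw = bytes.fromhex('7B22666F6F223A2022626172227D')
print(raw.decode('utf-8'))  # prints: {"foo": "bar"}

MySQL refuses to build a JSON value from those raw binary bytes, hence the error.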

Here is a link to the reported bug:
https://github.com/sequelpro/sequelpro/issues/2397

You need to uncheck the "Output BLOB fields as hex" option.




Answer 4:


This worked for me (I had control of the export to the SQL file as well). There are lots of caveats; e.g., I knew that the fields would never be bigger than 1000 characters and wouldn't contain any non-ASCII characters. Please do comment and tell me all the reasons this is so bad, though :)

Before export

alter table <table> modify <json_column> varchar(1000);

Then after import

alter table <table> modify <json_column> json;
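For illustration, the two ALTER steps could be scripted like this; a rough sketch using the mysql-connector-python package, where the table name orders, the column attrs, and the connection details are all hypothetical:

import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(host='localhost', user='root',
                               password='secret', database='mydb')
cur = conn.cursor()

# Step 1, before the export: downgrade the JSON column to plain text.
cur.execute("ALTER TABLE orders MODIFY attrs VARCHAR(1000)")

# ... export here, import on the new server, then reconnect there ...

# Step 2, after the import: restore the JSON type. MySQL validates
# every stored value as JSON while running this statement.
cur.execute("ALTER TABLE orders MODIFY attrs JSON")

conn.close()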



Answer 5:


For those using Sequel Pro around June 2019: in addition to unchecking the "Output BLOB fields as hex" option (as mentioned above), you also need to use the nightly build, which added support for JSON types two years ago. This support has not yet made it into the official release.




Answer 6:


I had this problem with a dump. I was able to fix it by changing this line in the dump file from:

/*!40101 SET NAMES binary*/;

to

/*!40101 SET NAMES utf8mb4*/;
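A dump this size won't open in a normal editor, so the directive can be patched with a small streaming script instead. A minimal Python sketch, with placeholder file names:

# Copy the dump line by line, rewriting the SET NAMES directive.
# (A plain replace also catches any other occurrence of this exact text.)
with open('dump.sql', 'r', encoding='utf-8', errors='replace') as src, \
     open('dump_utf8mb4.sql', 'w', encoding='utf-8') as dst:
    for line in src:
        dst.write(line.replace('SET NAMES binary', 'SET NAMES utf8mb4'))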



Answer 7:


A vim version of Lorcan O'Neill's answer (the regex fix in Answer 1):

vi xxxx.sql
:%s/\(X'[^,\)]*'\)/CONVERT(\1 using utf8mb4)/g



Answer 8:


Changing the collation to utf8_general_ci worked for me.




Answer 9:


For those who, like me, arrived here using Symfony 4 / Doctrine: for some reason the same entity can be resolved to either a longtext MySQL type storing JSON, or a json MySQL type storing JSON. Manually setting the longtext MySQL type resolved the problem in my particular case.




Answer 10:


This odd issue was occurring when running a simple UPDATE query:

update some_table set json_attr = '{"test":168}' where id = 123456;

Restarting MySQL fixed it. I was not able to pinpoint the cause.

Edit: We are using Aurora. It looks like it was related to a weird configuration in which the same instance handled both master and slave/reader connections.




Answer 11:


All MySQL JSON data must be in the utf8mb4 character set, not binary.



Source: https://stackoverflow.com/questions/38078119/mysql-5-7-12-import-cannot-create-a-json-value-from-a-string-with-character-set
