Question:
I'm loving the new JSON functions in MySQL 5.7, but running into a block trying to merge values from JSON into a normal table structure.
Grabbing JSON, manipulating and extracting arrays from it etc. is simple. JSON_EXTRACT all the way. But what about the inverse, going from a JSON array to rows? Perhaps I am dense on the existing MySQL JSON functionality, but I haven't been able to figure that one out.
For example, say I have a JSON array and want to insert a row for each element in the array with its value? The only way I have found is to write a bunch of JSON_EXTRACT(..., '$[0]'), JSON_EXTRACT(..., '$[1]'), etc., and union them together.
Or, say I have a JSON array and want to GROUP_CONCAT() it to a single comma separated string?
In other words, I know I can do this:
SET @j = '[1, 2, 3]';

SELECT GROUP_CONCAT(JSON_EXTRACT(@j, CONCAT('$[', x.n, ']'))) AS val
FROM (
  SELECT 0 AS n UNION SELECT 1 AS n UNION SELECT 2 AS n UNION
  SELECT 3 AS n UNION SELECT 4 AS n UNION SELECT 5 AS n
) x
WHERE x.n < JSON_LENGTH(@j);
But that hurts my eyes. And my heart.
How can I do something like:
SET @j = '[1, 2, 3]';

SELECT GROUP_CONCAT(JSON_EXTRACT(@j, '$[*]'));
... and have it concatenate together the values in the array vs. the JSON array itself?
I guess what I'm looking for here is some sort of JSON_SPLIT along the lines of:
SET @j = '[1, 2, 3]';

SELECT GROUP_CONCAT(val)
FROM JSON_SPLIT(JSON_EXTRACT(@j, '$[*]'), '$');
If MySQL had a proper STRING_SPLIT(val, 'separator') table returning function, I could hack it (escaping be damned), but that's not available either.
Answer 1:
It's true that it's not a good idea to denormalize into JSON, but sometimes you need to deal with JSON data, and there's a way to extract a JSON array into rows in a query.
The trick is to perform a join on a temporary or inline table of indexes, which gives you a row for each non-null value in a JSON array. I.e., if you have a table with values 0, 1, and 2 that you join to a JSON array “fish” with two entries, then fish[0] matches 0, resulting in one row, and fish[1] matches 1, resulting in a second row, but fish[2] is null so it doesn't match the 2 and doesn't produce a row in the join. You need as many numbers in the index table as the max length of any array in your JSON data. It's a bit of a hack, and it's about as painful as the OP's example, but it's very handy.
Example (requires MySQL 5.7.8 or later):
CREATE TABLE t1 (rec_num INT, jdoc JSON);
INSERT INTO t1 VALUES
  (1, '{"fish": ["red", "blue"]}'),
  (2, '{"fish": ["one", "two", "three"]}');

SELECT rec_num, idx, JSON_EXTRACT(jdoc, CONCAT('$.fish[', idx, ']')) AS fishes
FROM t1
-- Inline table of sequential values to index into JSON array
JOIN (
  SELECT 0 AS idx UNION SELECT 1 AS idx UNION SELECT 2 AS idx UNION
  -- ... continue as needed to max length of JSON array
  SELECT 3
) AS indexes
WHERE JSON_EXTRACT(jdoc, CONCAT('$.fish[', idx, ']')) IS NOT NULL
ORDER BY rec_num, idx;
The result is:
+---------+-----+---------+
| rec_num | idx | fishes  |
+---------+-----+---------+
|       1 |   0 | "red"   |
|       1 |   1 | "blue"  |
|       2 |   0 | "one"   |
|       2 |   1 | "two"   |
|       2 |   2 | "three" |
+---------+-----+---------+
It looks like the MySQL team may add a JSON_TABLE function in MySQL 8 to make all this easier. (http://mysqlserverteam.com/mysql-8-0-labs-json-aggregation-functions/)
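For reference, once JSON_TABLE is available (MySQL 8.0+), the index-table trick becomes unnecessary. Here is a minimal sketch against the same t1 table, assuming VARCHAR(50) is wide enough for the array elements:

-- MySQL 8.0+ only: JSON_TABLE turns each array element into its own row
SELECT t1.rec_num,
       jt.ord - 1 AS idx,         -- FOR ORDINALITY is 1-based; subtract 1 to match the output above
       jt.fish    AS fishes
FROM t1,
     JSON_TABLE(
       t1.jdoc,
       '$.fish[*]' COLUMNS (
         ord  FOR ORDINALITY,
         fish VARCHAR(50) PATH '$'
       )
     ) AS jt
ORDER BY t1.rec_num, jt.ord;

The OP's GROUP_CONCAT case then reduces to a plain aggregate over jt.fish.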
Answer 2:
In my case the JSON functions were not available, so I used a hack. As mentioned by Chris, MySQL does not have STRING_SPLIT, but it does have SUBSTRING_INDEX.
For the input:
{ "requestId":"BARBH17319901529", "van":"0xxxxx91317508", "source":"AxxxS", "txnTime":"15-11-2017 14:08:22" }
You can use:
TRIM(
  REPLACE(
    SUBSTRING_INDEX(
      SUBSTRING(input, LOCATE('requestId', input) + LENGTH('requestId') + 2),
      ',', 1),
    '"', '')
) AS RequestId
The output will be:
BARBH17319901529
You can modify according to your requirement.
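For instance, to pull the van field out of the same payload, only the key name changes. A sketch assuming the JSON text lives in a column named input of a hypothetical table called requests:

-- Hypothetical table `requests` holding the JSON text in a column named `input`
SELECT TRIM(
         REPLACE(
           SUBSTRING_INDEX(
             SUBSTRING(input, LOCATE('van', input) + LENGTH('van') + 2),
             ',', 1),
           '"', '')
       ) AS van
FROM requests;

Note that this string hack assumes the key name does not appear earlier in the text and that there is no whitespace around the colon.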
Answer 3:
I was working on a report where one column held a big JSON array. I modified the data model to store the relationship as 1 to * instead of keeping everything in a single column. To do this, I had to use a WHILE loop in a stored procedure, since I do not know the maximum array size:
DROP PROCEDURE IF EXISTS `test`;
DELIMITER #
CREATE PROCEDURE `test`()
PROC_MAIN:BEGIN
  DECLARE numNotes int;
  DECLARE c int;
  DECLARE pos varchar(10);
  SET c = 0;
  -- Estimate the largest array size by counting "Id" occurrences in the longest document
  SET numNotes = (SELECT ROUND((LENGTH(debtor_master_notes) - LENGTH(REPLACE(debtor_master_notes, "Id", ""))) / LENGTH("Id")) AS countt
                  FROM debtor_master
                  ORDER BY countt DESC
                  LIMIT 1);
  DROP TEMPORARY TABLE IF EXISTS debtorTable;
  CREATE TEMPORARY TABLE debtorTable(debtor_master_id int(11), json longtext, note int);
  -- Insert one row per array position; rows whose array is shorter are skipped by the NOT NULL check
  WHILE (c < numNotes) DO
    SET pos = CONCAT('$[', c, ']');
    INSERT INTO debtorTable(debtor_master_id, json, note)
    SELECT debtor_master_id, JSON_EXTRACT(debtor_master_notes, pos), c + 1
    FROM debtor_master
    WHERE debtor_master_notes IS NOT NULL
      AND debtor_master_notes LIKE '%[%'
      AND JSON_EXTRACT(debtor_master_notes, pos) IS NOT NULL;
    SET c = c + 1;
  END WHILE;
  SELECT * FROM debtorTable;
END proc_main #
DELIMITER ;
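Once the procedure exists, running the unnesting is a single call; the rows come back as the procedure's final result set:

CALL test();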