data-migration

Problem with parsing data via PHP and storing it in a MySQL database

徘徊边缘 submitted on 2020-01-16 10:36:29
Question: Sorry for duplicating this question, but here I have tried to explain it in more detail. I need to parse the data from a certain file and store it in a MySQL database. This is how the data is laid out in the file:

    戚谊
    戚誼
    [m1][b]qīyì[/b][/m]
    [m2]translation 1[/m]
    [m1][b]qīyi[b][/m]
    [m2]translation 2[/m]
    三州府
    [m1][b]sānzhōufǔ[/b][/m]
    [m2]translation of other character[/m]
    etc.

The first and the second line represent the same character, but the first line is a simplified and the second line is a…
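A minimal sketch of a MySQL schema that could hold the parsed entries, assuming each record consists of a simplified headword, an optional traditional variant, a pinyin reading, and a translation (the table and column names here are hypothetical):

    CREATE TABLE entry (
        id INT AUTO_INCREMENT PRIMARY KEY,
        simplified VARCHAR(32) NOT NULL,   -- headword, e.g. 戚谊
        traditional VARCHAR(32),           -- variant form, e.g. 戚誼; NULL if none
        pinyin VARCHAR(64),                -- reading from the [b]...[/b] block
        translation TEXT                   -- text from the [m2]...[/m] block
    ) CHARACTER SET utf8mb4;

The utf8mb4 character set is used so the full CJK range stores correctly.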

Avoiding code changes with Microsoft SQL Server and Unicode

こ雲淡風輕ζ submitted on 2020-01-16 03:48:06
Question: How can you get MS SQL Server to accept Unicode data by default into a VARCHAR or NVARCHAR column? I know that you can do it by placing an N in front of the string to be placed in the field, but to be quite honest this seems a bit archaic in 2008, particularly when using SQL Server 2005.

Answer 1: The N syntax is how you specify a Unicode string literal in SQL Server:

    N'Unicode string'
    'ANSI string'

SQL Server will auto-convert between the two when possible, using either a column's collation or…
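To make the difference concrete, here is a minimal sketch; the demo table is hypothetical, and the behavior shown assumes a server whose default collation cannot represent CJK characters:

    -- Hypothetical demo table.
    CREATE TABLE demo (unicode_col NVARCHAR(50));

    -- Plain literal: interpreted in the database's code page first, so
    -- characters it cannot represent arrive as '?' even in an NVARCHAR column.
    INSERT INTO demo (unicode_col) VALUES ('漢字');

    -- N literal: stays Unicode end to end.
    INSERT INTO demo (unicode_col) VALUES (N'漢字');

    SELECT unicode_col FROM demo;  -- typically returns '??' then '漢字'

From client code, parameterized queries that bind strings as NVARCHAR avoid the prefix entirely, which is usually a cleaner fix than rewriting literals.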

Return data from subselect used in INSERT in a Common Table Expression

扶醉桌前 submitted on 2020-01-15 12:44:52
Question: I am trying to move bytea data from one table to another, updating references in one query. Therefore I would like the insert to return data from its source query that is not itself part of the insert:

    INSERT INTO file_data (data)
    SELECT image FROM task_log WHERE image IS NOT NULL
    RETURNING id AS file_data_id, task_log.id AS task_log_id

But I get an error for that query:

    [42P01] ERROR: missing FROM-clause entry for table "task_log"

I want to do something like:

    WITH inserted AS (
        INSERT INTO file…
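RETURNING can only reference the rows that were actually inserted, not the table they were selected from, which is why the query above fails. One workaround is to capture the source rows in a CTE first and join the inserted rows back to them. A sketch of that idea, assuming PostgreSQL 9.1 or later (for data-modifying CTEs) and that the image values are distinct enough to join on:

    WITH source_rows AS (
        -- Capture the source ids alongside the data being moved.
        SELECT id AS task_log_id, image
        FROM task_log
        WHERE image IS NOT NULL
    ), inserted AS (
        INSERT INTO file_data (data)
        SELECT image FROM source_rows
        RETURNING id AS file_data_id, data
    )
    -- Pair each new file_data row with its originating task_log row.
    SELECT i.file_data_id, s.task_log_id
    FROM inserted i
    JOIN source_rows s ON s.image = i.data;

If duplicate images are possible, the join becomes ambiguous and an explicit mapping column would be needed instead.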

Kafka Message migration

[亡魂溺海] submitted on 2020-01-15 09:20:29
Question: We are currently operating on Apache Kafka 0.10.1.1 and are migrating to Confluent Platform 5.x. The new cluster is set up on a completely different set of physical nodes. While we are already working on upgrading the APIs (our application uses Spring Boot), we are trying to figure out how to migrate the messages. I need to maintain the same ordering of messages in the target cluster. Can I simply copy the messages? Do I need to republish the messages to the target cluster for successful…

Magento 2 data migration: "Unable to unserialize value" when editing a product

烈酒焚心 submitted on 2020-01-07 04:36:14
Question: I migrated a Magento site from 1.7.0.2 to a clean 2.2 install. The migration completed without issues. However, if I attempt to edit a product or add a new product, Magento displays an error: "Unable to unserialize value." Does anyone know what I'm doing wrong here? Customers, categories and custom attributes all migrated without issue.

Answer 1: The problem is in /vendor/magento/framework/Serialize/Serializer/Json.php. There is a function unserialize($string) which gives you a syntax error if a string is…

What is the best practice for data conversion between applications [closed]

限于喜欢 submitted on 2020-01-07 02:16:07
Question: As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance. Closed 7 years ago. I wonder if this might be too subjective a question for Stack Overflow, but I'll give it a go anyway. Is there a common/best practice for…

Why is the sqoop job not creating a dynamic date-wise sub-directory?

喜你入骨 submitted on 2020-01-06 08:36:06
Question: I am using sqoop to import Oracle data into an HDFS directory. I have created a sqoop job for this, using the following command:

    sqoop job --create TABLE_NAME -- import \
        --connect jdbc:oracle:thin:/system@HOST_NAME:PORT:SERVICE \
        --username USERNAME \
        --password-file /MYPASSWORD.txt \
        --fields-terminated-by ',' \
        --enclosed-by '"' \
        --table USERNAME.TABLE_NAME \
        --target-dir /TABLE_NAME/$(date --date "-1 days" +%F)/ \
        -m 1 \
        --incremental append \
        --check-column DATE_COLUMN \
        --last-value…

How to insert multiple JSON files into a PostgreSQL table at once?

蹲街弑〆低调 submitted on 2020-01-06 02:45:08
Question: I have multiple JSON files; they all have the same format, but the values differ per transaction. I want to migrate this data to a PostgreSQL table. What is the best way to proceed? Right now, I am using the following query:

    CREATE TABLE TEST (MULTIPROCESS VARCHAR(20), HTTP_REFERER VARCHAR(50));

    INSERT INTO TEST
    SELECT MULTIPROCESS, HTTP_REFERER
    FROM json_populate_record(NULL::test,
        '{"multiprocess": true, "http_referer": "http://localhost:9000/"}');

But, once the number…
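One way to load many objects in a single statement is json_populate_recordset, the set-returning counterpart of json_populate_record, which takes a JSON array. A sketch, assuming the individual files have been combined into one array (the second array element is invented for illustration):

    INSERT INTO test (multiprocess, http_referer)
    SELECT multiprocess, http_referer
    FROM json_populate_recordset(NULL::test,
        '[{"multiprocess": true,  "http_referer": "http://localhost:9000/"},
          {"multiprocess": false, "http_referer": "http://localhost:9001/"}]');

For very large volumes, staging the raw JSON with COPY and populating the target table from the staging table is usually faster than running one INSERT per file.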

Joining two tables in a complex query (not uniform data)

吃可爱长大的小学妹 submitted on 2020-01-05 10:11:38
Question: I need to connect two tables in a query that I will use to insert data into a third table (used in the future to join the two). I will mention only the relevant columns in these tables. PostgreSQL version: 9.0.5.

Table 1: data_table. Migrated data, ca 10k rows; relevant columns: id (primary key), address (the beginning of an address, a string of varying length that I need to match against the second table).

Table 2: dictionary. A dictionary, ca 9 million rows; relevant columns: id (primary key), address…
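A sketch of how the prefix match could be expressed, assuming data_table.address is a prefix of dictionary.address, the third table is called address_link (a hypothetical name), and the address values contain no LIKE wildcard characters:

    -- Pair each migrated row with every dictionary row it prefixes.
    INSERT INTO address_link (data_table_id, dictionary_id)
    SELECT dt.id, d.id
    FROM data_table dt
    JOIN dictionary d
      ON d.address LIKE dt.address || '%';

Note that LIKE with a non-constant pattern is hard for the planner to index, so with 9 million dictionary rows it is worth checking the plan with EXPLAIN before running the full insert.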

Liferay: migrate data from HSQL to MySQL

走远了吗. submitted on 2020-01-05 07:39:18
Question: I have Liferay 6.2. It has some data in HSQL, and I need to import this data into a MySQL database. How can I import all the data from HSQL to MySQL? I have the following files in the data/hsql folder:

    lportal.lck (lock file)
    lportal (seems to be a database dump)
    lportal.properties
    lportal.script

Is there any way to import the database using some script or by changing some configuration files?

Answer 1: One way is to try the data migration tool Liferay provides; check out the Liferay User Guide and scroll to…