disk-io

disk I/O error with SQLite3 in Python 3 when writing to a database

…衆ロ難τιáo~ · submitted on 2020-08-22 11:52:25

Question: I am a student just starting out with Python, and I was tasked with creating a relational database management system. I think I came pretty far, but I seem to have hit a wall. This is my code:

    import csv
    import sqlite3

    conn = sqlite3.connect('unfccc.db')
    c = conn.cursor()

    c.execute('''CREATE TABLE unfccc (
        Country TEXT,
        CodeCountryFormat TEXT,
        NamePollutant TEXT,
        NameYearSector TEXT,
        NameParent TEXT,
        Sector TEXT,
        CodeSector TEXT,
        CNUEDSPD TEXT
    )''')

    def insert_row(Country, CodeCountryFormat,
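A minimal sketch, not the asker's code, of how the truncated insert path is commonly completed: parameterized INSERTs fed from the CSV reader, followed by a commit and close. The CSV filename and the assumption that its columns match the CREATE TABLE above are placeholders. (The "disk I/O error" itself usually comes from the environment around the .db file, e.g. a locked, read-only, or network-mounted path, rather than from the SQL.)

    # Sketch only: 'unfccc.csv' is a placeholder and the column order is assumed
    # to match the CREATE TABLE statement above.
    import csv
    import sqlite3

    conn = sqlite3.connect('unfccc.db')
    c = conn.cursor()
    c.execute('''CREATE TABLE IF NOT EXISTS unfccc (
        Country TEXT, CodeCountryFormat TEXT, NamePollutant TEXT, NameYearSector TEXT,
        NameParent TEXT, Sector TEXT, CodeSector TEXT, CNUEDSPD TEXT)''')

    def insert_row(row):
        # One "?" placeholder per column keeps the values properly escaped.
        c.execute('INSERT INTO unfccc VALUES (?, ?, ?, ?, ?, ?, ?, ?)', row)

    with open('unfccc.csv', newline='') as f:   # hypothetical input file
        for row in csv.reader(f):
            insert_row(row)

    conn.commit()   # without a commit the data never reaches the .db file
    conn.close()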

Performance comparison between MyBatis batch ExecutorType and foreach XML

人盡茶涼 · submitted on 2020-06-01 07:22:46

Question: I have a list of records to be inserted into the DB using MyBatis. Previously, my code was something like:

    for (Item item : items) {
        sqlSession.insert("insert", item);
    }

This works, but I see steadily increasing disk I/O on the MySQL server as the number of items grows. Since I have little access to the MySQL configuration and hope to resolve this high disk I/O issue, I found some possible solutions: using ExecutorType.BATCH for sqlSession; insert multiple values within a single insert
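The question's code is Java + MyBatis, so the sketch below is only a language-agnostic illustration of the second idea it lists (many rows per statement and per transaction, instead of one INSERT and one flush per row). It uses Python's sqlite3 purely because that needs no server; the table, columns, and data are made up.

    # Sketch of batched inserts: chunks of rows per statement, one transaction overall.
    import sqlite3

    items = [("item-%d" % i, i) for i in range(10_000)]   # hypothetical rows

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE item (name TEXT, qty INTEGER)")

    CHUNK = 500
    with conn:                                            # one transaction instead of one commit per row
        for start in range(0, len(items), CHUNK):
            conn.executemany("INSERT INTO item VALUES (?, ?)",
                             items[start:start + CHUNK])  # many rows per statement
    print(conn.execute("SELECT COUNT(*) FROM item").fetchone()[0])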

How to Safely and Transactionally Replace a File on Linux?

好久不见. · submitted on 2019-12-23 15:16:48

Question: The most naive, worst way I can think of to replace the contents of a file is:

    f = open('file.txt', 'w')
    f.write('stuff')
    f.close()

Obviously, if that operation fails at some point before closing, you lose the contents of the original file while not necessarily finishing the new content. So, what is the completely proper way to do this (if there is one)? I imagine it's something like:

    f = open('file.txt.tmp', 'w')
    f.write('stuff')
    f.close()
    move('file.txt.tmp', 'file.txt')  # dangerous line?
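A minimal sketch of the usual pattern (write a temporary file in the same directory, flush and fsync it, then atomically rename it over the target); the helper name, temp-file prefix, and error handling are illustrative, not from the question.

    import os
    import tempfile

    def replace_file(path, data):
        dirname = os.path.dirname(os.path.abspath(path))
        # Temp file in the same directory, so the final rename stays on one filesystem.
        # (mkstemp creates it with 0600 permissions; copy the original mode if that matters.)
        fd, tmp_path = tempfile.mkstemp(dir=dirname, prefix=".tmp-")
        try:
            with os.fdopen(fd, "w") as f:
                f.write(data)
                f.flush()
                os.fsync(f.fileno())        # make sure the new contents hit the disk
            os.replace(tmp_path, path)      # atomic: readers see old or new, never half
        except BaseException:
            os.unlink(tmp_path)
            raise

    # replace_file('file.txt', 'stuff')

os.replace() maps to rename(2), so on the same filesystem readers always see either the old file or the complete new one; for full durability the containing directory can be fsynced as well.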

How to measure file read speed without caching?

 ̄綄美尐妖づ · submitted on 2019-12-17 22:53:30

Question: My Java program spends most of its time reading some files, and I want to optimize it, e.g. by using concurrency, prefetching, memory-mapped files, or whatever. Optimizing without benchmarking is nonsense, so I benchmark. However, during the benchmark the whole file content gets cached in RAM, unlike in a real run. Thus the run times of the benchmark are much smaller and most probably unrelated to reality. I'd need to somehow tell the OS (Linux) not to cache the file content, or better
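The question is about Java, but the OS-level trick is the same in any language: ask Linux to evict the file's pages from the page cache before each timed run. Below is a minimal sketch in Python using os.posix_fadvise with POSIX_FADV_DONTNEED; the file path and buffer size are placeholders. From Java the same call can be reached via JNA/JNI, or the cache can be bypassed entirely by reading with O_DIRECT.

    # Linux/Unix only: os.posix_fadvise is not available on Windows.
    import os
    import time

    def drop_cache(path):
        fd = os.open(path, os.O_RDONLY)
        try:
            # offset 0, length 0 = advise for the whole file
            os.posix_fadvise(fd, 0, 0, os.POSIX_FADV_DONTNEED)
        finally:
            os.close(fd)

    def timed_read(path, bufsize=1 << 20):
        drop_cache(path)                     # cold-cache measurement
        start = time.perf_counter()
        with open(path, 'rb') as f:
            while f.read(bufsize):
                pass
        return time.perf_counter() - start

    # print(timed_read('/tmp/bigfile.bin'))  # hypothetical file path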

UNC path does not work with .NET?

荒凉一梦 · submitted on 2019-12-05 09:12:06

Question: I am running a very simple program which tries to list the files in a folder on the same machine, specified in UNC format (as described in http://msdn.microsoft.com/en-us/library/windows/desktop/aa365247%28v=vs.85%29.aspx):

    static string rootDir = @"\\?\d:\share\input";

    static void Main(string[] args)
    {
        char[] invalidChars = Path.GetInvalidPathChars();
        foreach (char invChar in invalidChars)
        {
            if (rootDir.Contains(invChar.ToString()))
            {
                Console.WriteLine("InvChar - {0}", invChar);
            }
        }
        string[] matchFiles = Directory.GetFiles(rootDir);
    }

However, Directory.GetFiles() does not work

Improve performance of first query

自闭症网瘾萝莉.ら · submitted on 2019-12-05 08:12:36

Question: If the following database (Postgres) queries are executed, the second call is much faster. I guess the first query is slow because the operating system (Linux) needs to fetch the data from disk, while the second benefits from caching at the filesystem level and inside Postgres. Is there a way to optimize the database so that the results come back fast on the first call?

First call (slow):

    foo3_bar_p@BAR-FOO3-Test:~$ psql
    foo3_bar_p=# explain analyze SELECT "foo3_beleg"."id", ... FROM "foo3_beleg" WHERE
    foo3_bar_p-# (("foo3_beleg"."id" IN (SELECT beleg_id FROM foo3_text where
    foo3_bar_p(# content @@ 'footown'::tsquery
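One common approach is to warm the relevant tables into cache right after a restart instead of waiting for the first query to do it. Below is a minimal sketch using the pg_prewarm contrib extension, driven from Python with psycopg2; neither is mentioned in the question, the connection parameters are placeholders, and CREATE EXTENSION needs sufficient privileges on the server.

    import psycopg2

    conn = psycopg2.connect(dbname="foo3_bar_p", user="foo3_bar_p",
                            host="localhost", password="...")   # placeholder credentials
    conn.autocommit = True
    with conn.cursor() as cur:
        cur.execute("CREATE EXTENSION IF NOT EXISTS pg_prewarm")
        # Load the queried table and its full-text side table into shared buffers.
        cur.execute("SELECT pg_prewarm('foo3_beleg')")
        cur.execute("SELECT pg_prewarm('foo3_text')")
    conn.close()

Raising shared_buffers so that more of the working set survives in Postgres's own cache is the other common lever for the same symptom.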

Parallel Disk I/O

僤鯓⒐⒋嵵緔 · submitted on 2019-12-05 02:03:47

Question: I have several logfiles that I would like to read. Without loss of generality, let's say the logfile processing is done as follows:

    def process(infilepath):
        answer = 0
        with open(infilepath) as infile:
            for line in infile:
                if line.startswith(someStr):
                    answer += 1
        return answer

Since I have a lot of logfiles, I wanted to throw multiprocessing at this problem (my first mistake: I should probably have used multithreading; someone please tell me why). While doing so, it occurred to me that any
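A minimal sketch of the multiprocessing fan-out the question describes: a Pool maps process() over the file paths, one worker per CPU by default. The someStr prefix is a made-up placeholder; the paths come from the command line. Whether processes or threads win here depends on whether the job is dominated by waiting on the disk (threads are fine despite the GIL) or by CPU-bound parsing of the lines.

    import sys
    from multiprocessing import Pool

    someStr = "ERROR"   # hypothetical prefix to count

    def process(infilepath):
        answer = 0
        with open(infilepath) as infile:
            for line in infile:
                if line.startswith(someStr):
                    answer += 1
        return answer

    if __name__ == "__main__":
        paths = sys.argv[1:]            # logfiles passed as command-line arguments
        with Pool() as pool:            # one worker process per CPU by default
            counts = pool.map(process, paths)
        print(dict(zip(paths, counts)))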