data-loss

Intermittent data loss with background fetch - could NSKeyedUnarchiver return nil from the documents directory?

╄→尐↘猪︶ㄣ submitted on 2019-12-06 13:51:36
Question: I have a simple app that stores an array of my custom type (instances of a class called Drug) using NSCoding in the app's documents folder. The loading and saving code is an extension to my main view controller, which always exists once it is loaded. Initialisation of the array: var drugs = [Drug]() This array is then appended with the result of the loadDrugs() method below. func saveDrugs() { // Save to app container let isSuccessfulSave = NSKeyedArchiver.archiveRootObject(drugs, toFile: Drug…
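The Swift archiving APIs can't run outside iOS/macOS, but the defensive pattern the question needs (unarchiveObject(withFile:) returns nil when no archive exists yet, so loadDrugs() should fall back to an empty array) can be sketched with Python's pickle as a stand-in; the file name and helper below are illustrative only:

```python
import os, pickle, tempfile

# Python stand-in for a defensive loadDrugs(): a missing (or corrupt)
# archive yields an empty list instead of None or a crash.
def load_drugs(path):
    if not os.path.exists(path):
        return []
    try:
        with open(path, "rb") as f:
            return pickle.load(f)
    except (pickle.UnpicklingError, EOFError):
        return []

path = os.path.join(tempfile.mkdtemp(), "drugs.archive")
assert load_drugs(path) == []        # first launch: nothing saved yet

with open(path, "wb") as f:          # saveDrugs() equivalent
    pickle.dump(["aspirin"], f)
assert load_drugs(path) == ["aspirin"]
```

The key point carries over directly: treat "no archive on disk yet" as a normal first-launch state, not an error.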

Cassandra is configured to lose 10 seconds of data by default?

一曲冷凌霜 submitted on 2019-12-05 08:59:48
The data in the commitlog is flushed to disk periodically, every 10 seconds (controlled by commitlog_sync_period_in_ms), so if all replicas crash within those 10 seconds, will I lose that data? Does it mean that, theoretically, a Cassandra cluster can lose data? If a node crashes right before syncing the commitlog to disk, then yes, you could lose up to ten seconds of data. If you keep multiple replicas, by using a replication factor higher than 1 or by having multiple data centers, then much of the lost data would be on other nodes and would be recovered on the crashed node when it…
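The settings in question live in cassandra.yaml; the values below are the stock defaults, shown for reference:

```yaml
# cassandra.yaml (stock defaults shown)
commitlog_sync: periodic
commitlog_sync_period_in_ms: 10000   # fsync the commitlog every 10 s

# Stricter alternative: block each write until its commitlog entry
# has been fsynced, trading write latency for durability
# commitlog_sync: batch
# commitlog_sync_batch_window_in_ms: 2
```

Switching to batch mode narrows the loss window to the batch window at the cost of slower writes; replication remains the main defence.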

Strange git case - git stash followed by git stash apply lost uncommitted data?

梦想的初衷 submitted on 2019-12-05 07:55:54
I have a file, let's say file.txt. I did git mv file.txt file1.txt, then created a new file called file.txt and worked on it. Unfortunately I hadn't added that file to git yet. Anyway, the problem is that I did git stash, then git stash apply, but the new file.txt disappeared... any way to get it back? The problem here is mostly a misunderstanding of what git stash save does: it saves only changes to tracked files; untracked files are not saved by git stash. When you moved file.txt to file1.txt, the new file.txt was an untracked file and so was not saved by git stash. This isn't a bug,…
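The standard way to include untracked files is git stash's -u / --include-untracked flag. A runnable sketch of that workflow (it assumes a git binary on PATH; file names and contents mirror the question but are otherwise illustrative):

```python
import pathlib, subprocess, tempfile

d = pathlib.Path(tempfile.mkdtemp())

def git(*args):
    # Small helper: run git in the demo repo, fail loudly on errors
    subprocess.run(["git", *args], cwd=d, check=True, capture_output=True)

git("init")
git("config", "user.email", "demo@example.com")
git("config", "user.name", "demo")

(d / "file1.txt").write_text("renamed content\n")   # the tracked file
git("add", "file1.txt")
git("commit", "-m", "rename file.txt to file1.txt")

(d / "file1.txt").write_text("tracked change\n")    # tracked modification
(d / "file.txt").write_text("new work\n")           # NEW, untracked file

# Plain `git stash` would leave file.txt behind (and a later apply can
# clobber it); -u / --include-untracked stashes it as well
git("stash", "-u")
assert not (d / "file.txt").exists()

git("stash", "pop")
assert (d / "file.txt").read_text() == "new work\n"
```

With -u, both the tracked modification and the brand-new file survive the stash/pop round trip.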

C# serial port data loss

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-04 18:36:51
I have written a C# program to receive data on the COM2 port. The baud rate is set to 115200 and the sender is sending data at 115200 bps. My program occasionally loses a few bytes. I am calling the ReadByte method in a while(true) loop to read data from the COM port. I have a few questions: When the baud rate is set high, should I expect to lose data? If yes, why? I am setting the read buffer size to 100*1024*1024; does this set the serial driver's buffer size to 100*1024*1024? Any thoughts on how to debug this problem? A receive buffer size of 100*1024*1024 is HUGE! I would seriously doubt you…
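At 115200 bps a byte arrives roughly every 87 µs, so any per-byte overhead in a ReadByte loop risks overflowing the UART's small hardware FIFO (often 16 bytes) before the driver drains it; the giant software buffer behind the driver never sees the lost bytes. A toy Python simulation of that producer/consumer imbalance (buffer size and rates are made up, purely to show the mechanism):

```python
from queue import Queue, Full

# Toy model: a 16-byte UART FIFO filled by the sender while the
# receiver drains only one byte for every two that arrive.
buf = Queue(maxsize=16)
dropped = 0
for n in range(1000):          # 1000 incoming bytes
    try:
        buf.put_nowait(n)      # hardware shifts a byte into the FIFO
    except Full:
        dropped += 1           # FIFO overrun: the byte is silently lost
    if n % 2:                  # receiver keeps up at only half the rate
        buf.get_nowait()

# dropped > 0: a reader that is too slow per byte loses data no matter
# how large the software buffer behind the driver is
```

The practical fix is to read in chunks (drain everything available per call) rather than one byte per call, and to enable hardware flow control if the sender supports it.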

losing data while writing through asynchronousFileChannel in java

老子叫甜甜 submitted on 2019-12-03 21:25:10
I am trying to use AsynchronousFileChannel to write data into a text file. I built the program into 3 jar files and ran all 3 jars simultaneously from the command prompt, each reading a different text file and writing to one common temporary file. Each of my 3 test files has 2000 records to be read, but some records are missing from the common temporary file: the output should have 6000 records, but it shows only 5366 or 5666, and sometimes fewer than that. I am not able to figure out why some data is lost, as it is the functionality of a…

Using named pipes with bash - Problem with data loss

会有一股神秘感。 submitted on 2019-11-30 06:37:09
Question: Did some searching online and found simple 'tutorials' on using named pipes. However, when I do anything with background jobs I seem to lose a lot of data. [[Edit: found a much simpler solution; see reply to post. So the question I put forward is now academic, in case one might want a job server]] Using Ubuntu 10.04 with Linux 2.6.32-25-generic #45-Ubuntu SMP Sat Oct 16 19:52:42 UTC 2010 x86_64 GNU/Linux, GNU bash, version 4.1.5(1)-release (x86_64-pc-linux-gnu). My bash function is: function jqs {…
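In bash loops like this, data typically disappears when the reader reopens the FIFO between messages (writers hitting a momentarily reader-less pipe block or fail) or when concurrent writers interleave writes larger than PIPE_BUF. A Python analogue of the safe pattern: one reader holds the FIFO open for its whole lifetime, and each message is a single short write (the path name below is illustrative):

```python
import os, tempfile, threading

# Writes of less than PIPE_BUF bytes (at least 512 on POSIX) are
# atomic, so whole lines cannot interleave between writers.
fifo = os.path.join(tempfile.mkdtemp(), "job_control")
os.mkfifo(fifo)
received = []

def reader():
    with open(fifo) as f:          # blocks until a writer connects
        for line in f:             # EOF only when all writers close
            received.append(line.strip())

t = threading.Thread(target=reader)
t.start()

fd = os.open(fifo, os.O_WRONLY)    # blocks until the reader connects
for i in range(100):
    os.write(fd, f"job-{i}\n".encode())
os.close(fd)
t.join()
```

In the bash version the same idea means opening the pipe once on a dedicated file descriptor (e.g. `exec 3<"$pipe"`) instead of reopening it inside the loop.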

Android SQLite Upgrade without losing data

百般思念 submitted on 2019-11-30 04:13:17
I have created an SQLite database successfully and it works fine. However, when the onUpgrade method is called, I'd like to upgrade without losing data. The app I'm developing is a quiz app. Simply put, when the onCreate method is called I create and prepopulate a database with questions, answers, etc. The last column records whether the user has marked the question as a favourite. What I would like to do when onUpgrade is called is temporarily save that one column, drop the whole database, recreate it with any edits I've made to the old questions, add any new questions, then re…
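The migration pattern described (park the favourites column, rebuild, restore it) can be sketched with Python's sqlite3 module; table and column names are illustrative, and on Android the same SQL statements would run inside onUpgrade():

```python
import sqlite3

# Illustrative schema: id, question text, and the user's favourite flag
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE questions (id INTEGER PRIMARY KEY, text TEXT, favourite INTEGER DEFAULT 0)")
con.executemany("INSERT INTO questions VALUES (?, ?, ?)",
                [(1, "old wording", 1), (2, "q2", 0)])

# --- the onUpgrade()-style migration ---
# 1. park the one column worth keeping
con.execute("CREATE TEMPORARY TABLE favs AS SELECT id, favourite FROM questions")
# 2. rebuild the table with edited and brand-new questions
con.execute("DROP TABLE questions")
con.execute("CREATE TABLE questions (id INTEGER PRIMARY KEY, text TEXT, favourite INTEGER DEFAULT 0)")
con.executemany("INSERT INTO questions (id, text) VALUES (?, ?)",
                [(1, "new wording"), (2, "q2"), (3, "brand new question")])
# 3. restore the saved favourites by id
con.execute("""UPDATE questions
               SET favourite = (SELECT favourite FROM favs WHERE favs.id = questions.id)
               WHERE id IN (SELECT id FROM favs)""")

favourites = dict(con.execute("SELECT id, favourite FROM questions"))
```

Old favourites survive the rebuild, edited questions pick up their new text, and new questions default to not-favourite. This only works if question ids are stable across versions, which is an assumption the app has to guarantee.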

GAE Go - “This request caused a new process to be started for your application…”

耗尽温柔 submitted on 2019-11-29 02:59:28
Question: I've encountered this problem for a second time now, and I'm wondering if there is any solution to it. I'm running an application on Google App Engine that relies on frequent communication with a website through HTTP JSON-RPC. GAE has a tendency to randomly display a message like this in the logs: "This request caused a new process to be started for your application, and thus caused your application code to be loaded for the first time. This request may thus take longer and…"
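One mitigation is to let App Engine warm new instances up before routing live traffic to them. In app.yaml that is a one-line change (the app must then serve the /_ah/warmup path, doing its expensive initialisation there):

```yaml
# app.yaml - ask App Engine to send a warm-up request (/_ah/warmup)
# to a new instance before routing user traffic to it, hiding most
# of the loading-request latency described above
inbound_services:
- warmup
```

Warm-up requests reduce, but do not eliminate, loading requests; instances can still be started cold under sudden load spikes.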

How to recover deleted rows from SQL server table?

◇◆丶佛笑我妖孽 submitted on 2019-11-27 14:20:11
I accidentally ran a DELETE command against a table with a wrong WHERE clause. I am using SQL Server 2005. Is there any way to recover the lost data? It is possible using the Apex Recovery Tool; I have successfully recovered table rows which I accidentally deleted. If you download the trial version, it will recover only every 10th row; check here: http://www.apexsql.com/sql_tools_log.aspx You have full data + transaction log backups, right? You can restore them to another database and then sync the deleted rows back. Lots of work though... (Have you looked at Redgate's SQL Log Rescue…
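If full and transaction-log backups exist, a point-in-time restore to a side database just before the DELETE is the supported route; database, logical file, and path names below are hypothetical:

```sql
-- Restore a copy of the database to just before the accidental DELETE,
-- then copy the missing rows back into the live table.
RESTORE DATABASE MyDb_Recovery
    FROM DISK = N'C:\Backups\MyDb_full.bak'
    WITH NORECOVERY,
         MOVE N'MyDb'     TO N'C:\Data\MyDb_Recovery.mdf',
         MOVE N'MyDb_log' TO N'C:\Data\MyDb_Recovery.ldf';

RESTORE LOG MyDb_Recovery
    FROM DISK = N'C:\Backups\MyDb_log.trn'
    WITH STOPAT = N'2019-11-27 14:00:00',  -- moment before the DELETE
         RECOVERY;
```

After the restore, an INSERT ... SELECT from MyDb_Recovery back into the live table (matched on the primary key) reinstates the deleted rows. This requires the FULL recovery model and an unbroken log-backup chain covering the deletion.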