backup

addSkipBackupAttributeToItemAtURL -> NSString parameter?

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-23 05:17:21
Question: In order to follow the Data Storage Guidelines, I must use the method below to add a flag that tells iOS not to back the file up to iCloud. However, the parameter here is an NSURL, and what I have is an NSString, produced by a line like:

    return [[self offlineQueuePath] stringByAppendingPathComponent:@"SHKOfflineQueue.plist"];

Here is the method that takes in a URL:

    - (BOOL)addSkipBackupAttributeToItemAtURL:(NSURL *)URL {
        if (&NSURLIsExcludedFromBackupKey == nil) { // iOS <= 5.0.1
            const char* filePath = [
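The question is really only about bridging the two types: an NSString path can be wrapped in a file NSURL before calling the method. A minimal sketch (the offlineQueuePath helper and plist name are taken from the question; the rest is standard Foundation API):

    NSString *path = [[self offlineQueuePath] stringByAppendingPathComponent:@"SHKOfflineQueue.plist"];
    NSURL *fileURL = [NSURL fileURLWithPath:path];   // wrap the NSString path in a file URL
    [self addSkipBackupAttributeToItemAtURL:fileURL];

On iOS 5.1 and later the method body typically just sets NSURLIsExcludedFromBackupKey via setResourceValue:forKey:error: on that URL; the &NSURLIsExcludedFromBackupKey == nil check in the truncated snippet above is the fallback path for 5.0.1 and earlier, where setxattr is used instead.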

How do I script making a backup copy of a spreadsheet, with its values rather than its formulas, to an archive folder?

Submitted by 为君一笑 on 2019-12-23 04:55:25
Question: I am new to Google Apps Script and I have been looking for a way to back up a sheet. I am currently using:

    DriveApp.getFileById("146qFnrQoNPBcDhV6QB0bscHFp8TquXJoAC1qg_esy4E").makeCopy("DailyArchive" + Date() + " backup");

The problem is that it makes a daily backup whose copies keep updating just like the original, and I just want a backup of the values so I have an archive. In my sheet I am importing data from a jail roster: http://www.kitsapgov.com/sheriff/incustody/jailwebname
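A common way to get a values-only archive is to copy the file and then overwrite every sheet in the copy with its evaluated values, so the import formulas stop refreshing. A sketch (not from the question: the archive folder ID is a placeholder; the spreadsheet ID is the one shown above):

    function archiveValuesOnly() {
      var sourceId = "146qFnrQoNPBcDhV6QB0bscHFp8TquXJoAC1qg_esy4E";       // ID from the question
      var archiveFolder = DriveApp.getFolderById("YOUR_ARCHIVE_FOLDER_ID"); // placeholder folder ID
      var stamp = Utilities.formatDate(new Date(), Session.getScriptTimeZone(), "yyyy-MM-dd");

      // Copy the spreadsheet straight into the archive folder.
      var copyFile = DriveApp.getFileById(sourceId).makeCopy("DailyArchive " + stamp, archiveFolder);

      // Replace formulas with their current values in every sheet of the copy.
      var copy = SpreadsheetApp.openById(copyFile.getId());
      copy.getSheets().forEach(function (sheet) {
        var range = sheet.getDataRange();
        range.setValues(range.getValues());   // getValues() returns evaluated results, not formulas
      });
    }

Run on a daily time-driven trigger, this produces one frozen snapshot per day instead of copies that keep following the live import.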

Optimal combination of files into blocks of 4.8GB

Submitted by 北慕城南 on 2019-12-23 03:00:46
Question: My drive has DMG files. The sum of their sizes is strictly below 47GB. I have 11 DVDs, each 4.7GB in size. I want to use as few DVDs as possible, without compression (the constraint may be superfluous, since the problem is about the most optimal combinations of the DMG files; you can think of it in terms of compressed files if you want). As you can see, the DMG files have arbitrary sizes, so many combinations are possible. I list their sizes with:

    find . -iname "*.dmg" -exec du '{}' \; 3&> /dev/null
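This is a bin-packing problem, so an exact optimum is expensive to compute, but the first-fit-decreasing heuristic usually lands within one DVD of it. A sketch (not from the question) in Python, fed with (path, size) pairs such as the ones produced from the du listing:

    # First-fit decreasing: sort files largest-first, put each on the first DVD it fits on.
    DVD_CAPACITY = 4700000000   # ~4.7 GB in bytes; adjust to the unit your sizes use

    def pack(files):
        """files: iterable of (path, size) pairs; returns a list of DVDs, each a list of paths."""
        dvds = []                                  # each entry: [free_space, [paths]]
        for path, size in sorted(files, key=lambda f: f[1], reverse=True):
            for dvd in dvds:
                if size <= dvd[0]:                 # first DVD with enough room
                    dvd[0] -= size
                    dvd[1].append(path)
                    break
            else:                                  # no existing DVD fits: start a new one
                dvds.append([DVD_CAPACITY - size, [path]])
        return [paths for _, paths in dvds]

    print(pack([("a.dmg", 3000000000), ("b.dmg", 2000000000), ("c.dmg", 1500000000)]))

First-fit decreasing is not guaranteed optimal; with only 11 DVDs and a handful of files, brute-forcing the combinations is also feasible if the last DVD matters.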

Backup a database mdf & Entity Framework

Submitted by 谁都会走 on 2019-12-23 02:43:15
Question: I have a database (an .mdf file) which I access through Entity Framework. Is it possible to make a backup of the MDF file? I already tried SMO, but the problem is that because I'm using an .mdf file the database name is empty; I've read that it's auto-generated. A piece of my backup code:

    String destinationPath = "C:\\";
    Backup sqlBackup = new Backup();
    sqlBackup.Action = BackupActionType.Database;
    sqlBackup.BackupSetDescription = "ArchiveDataBase:" + DateTime.Now.ToShortDateString();
    sqlBackup
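One way around the missing name is to let the open connection report the auto-generated database name and then issue a plain BACKUP DATABASE statement, skipping SMO entirely. A sketch, not from the question: the connection string and backup path are placeholders, and the assumption is that the .mdf is attached on the fly so DB_NAME() resolves to the auto-generated name.

    using System;
    using System.Data.SqlClient;

    class MdfBackup
    {
        static void Main()
        {
            // Same connection string the Entity Framework context uses (placeholder values).
            var connectionString = @"Data Source=.\SQLEXPRESS;AttachDbFilename=C:\Data\MyDb.mdf;Integrated Security=True;User Instance=True";

            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();

                // For an attached .mdf the current database name is the auto-generated one
                // (usually the full path of the file), so ask the server for it.
                string dbName;
                using (var nameCmd = new SqlCommand("SELECT DB_NAME()", conn))
                    dbName = (string)nameCmd.ExecuteScalar();

                string sql = "BACKUP DATABASE [" + dbName + "] TO DISK = @path WITH DESCRIPTION = @desc";
                using (var cmd = new SqlCommand(sql, conn))
                {
                    cmd.Parameters.AddWithValue("@path", @"C:\ArchiveDataBase.bak");
                    cmd.Parameters.AddWithValue("@desc", "ArchiveDataBase:" + DateTime.Now.ToShortDateString());
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }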

Reverse changes from transaction log in SQL Server 2008 R2?

Submitted by ∥☆過路亽.° on 2019-12-23 00:52:08
Question: We have a SQL Server 2008 R2 database whose transaction logs are backed up every now and then. Today a big error occurred in the database at around 12:00... I have transaction logs up to 8:00 and then from 12:00 to 16:00, etc. My question is: can I somehow reverse-merge those transaction logs into the database, so that I return to the database state at 8:00? Or is my best option to restore an older full backup and then apply all transaction logs up to 8:00? The first option is preferable since full backup
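Transaction logs can only be rolled forward, not applied in reverse, so the usual route is the second option: restore the last full backup WITH NORECOVERY and replay the log backups with STOPAT set just before the error. A sketch (database name, paths and timestamp are placeholders, not from the question):

    -- Point-in-time restore: full backup first, then the log chain, stopping before the error.
    RESTORE DATABASE MyDb
        FROM DISK = N'D:\Backups\MyDb_full.bak'
        WITH NORECOVERY, REPLACE;

    -- Repeat RESTORE LOG for each log backup in sequence; RECOVERY goes on the last one.
    RESTORE LOG MyDb
        FROM DISK = N'D:\Backups\MyDb_log.trn'
        WITH STOPAT = N'2019-12-23T08:00:00', RECOVERY;

If overwriting the current database is a concern, the same sequence can target a side-by-side restore (a different database name and file locations) and the 8:00 state can then be copied back selectively.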

How to zip specified folders from the command line

Submitted by 余生长醉 on 2019-12-22 18:56:09
Question: Could you tell me how to zip specified files into the same zip file? Let me explain how my folders are filled: a scheduled task backs up my databases and saves them daily. It creates 4 database backups per day, which means there will be 4 more files every day. So I need to zip the newly created backups into the same zip file (it differs from the previous day's zip file, of course; a zip file is created for each day's newly created backup files), and I need to do it automatically. Well I
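Since a scheduled task is already involved (which suggests Windows Task Scheduler), a small PowerShell step can be appended to it to collect that day's backup files into a dated archive. A sketch; the paths and the .bak extension are placeholders, not from the question:

    # Zip today's backup files into a per-day archive; adjust the paths and extension to taste.
    $stamp = Get-Date -Format 'yyyy-MM-dd'
    $today = Get-ChildItem 'C:\Backups\*.bak' |
             Where-Object { $_.LastWriteTime -ge (Get-Date).Date }    # files created today

    if ($today) {
        Compress-Archive -Path $today.FullName `
                         -DestinationPath "C:\Archives\backup-$stamp.zip" -Update
    }

Scheduling this right after the backup task (or as a second action of the same task) keeps the whole thing automatic; -Update lets it append if the same day's archive already exists.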

Reading appengine backup_info file gives EOFError

Submitted by 隐身守侯 on 2019-12-22 18:28:26
Question: I'm trying to inspect my App Engine backup files to work out when a data corruption occurred. I used gsutil to locate and download the file:

    gsutil ls -l gs://my_backup/ > my_backup.txt
    gsutil cp gs://my_backup/LongAlphaString.Mymodel.backup_info file://1.backup_info

I then created a small Python program, attempting to read the file and parse it using the App Engine libraries:

    #!/usr/bin/python
    APPENGINE_PATH='/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default
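For reference, one way to poke at such a file is the SDK's records reader. This is a sketch under two assumptions that should be stated plainly: that the .backup_info file is stored in the record format read by google.appengine.api.files.records, and that the local copy downloaded completely (a truncated download is itself a classic cause of EOFError). It is Python 2, as the old SDK requires, and the SDK path is a placeholder:

    #!/usr/bin/python
    import os
    import sys

    APPENGINE_PATH = '/path/to/google_appengine'   # placeholder; same idea as in the question
    sys.path.insert(0, APPENGINE_PATH)

    from google.appengine.api.files import records

    path = '1.backup_info'
    print 'local size: %d bytes' % os.path.getsize(path)   # compare against `gsutil ls -l`

    with open(path, 'rb') as f:
        reader = records.RecordsReader(f)
        count = 0
        while True:
            try:
                record = reader.read()      # raises EOFError when the records are exhausted
            except EOFError:
                break                       # expected at end of file; earlier means truncation
            count += 1
            print 'record %d: %d bytes' % (count, len(record))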

How to import an Oracle 11g RMAN backupset on a new database server?

Submitted by 旧时模样 on 2019-12-22 17:39:54
Question: I have a backup set of an Oracle 11g database which was created with RMAN. Now I want to import/restore the backup set onto a new and empty database server. I know that the command used to create the backup set was:

    run {
      backup as compressed backupset database tag "FULLBACKUP" format "/orabackup/rman/backup/FULL_%d_%T_%U";
      backup as compressed backupset archivelog all tag "ARCHIVELOGS" format "/orabackup/rman/backup/ARCH_%d_%T_%U" delete all input;
    }

but I cannot find out how to make the files
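The usual shape of a restore onto a fresh host is: copy the backup pieces over, start a dummy instance, get a controlfile back, catalog the pieces, then restore and recover. A hedged RMAN sketch; the DBID, the controlfile-autobackup assumption and the parameter file are not in the question and have to come from the source system:

    rman target /

    RMAN> set dbid 1234567890;                   # DBID of the source database (placeholder)
    RMAN> startup nomount;                       # needs an init.ora/spfile for the new instance
    RMAN> restore controlfile from autobackup;   # assumes controlfile autobackup was enabled
    RMAN> alter database mount;
    RMAN> catalog start with '/orabackup/rman/backup/';   # register the copied backup pieces
    RMAN> restore database;
    RMAN> recover database;
    RMAN> alter database open resetlogs;

If the datafile locations differ on the new server, SET NEWNAME before the restore and SWITCH DATAFILE ALL afterwards handle the relocation.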

Detect if something is modified in a directory, and if so, back up; otherwise do nothing

Submitted by 跟風遠走 on 2019-12-22 12:39:31
Question: I have a "Data" directory that I rsync to a remote NAS periodically via a shell script. However, I'd like to make this more efficient by detecting whether something has changed in "Data" before running rsync, so that I don't wake up the drives on the NAS unnecessarily. I was thinking of modifying the shell script to get the latest modification time of the files in Data (using a recursive find) and write it to a file every time Data is rsynced. Before every sync, the shell script
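A timestamp file can stand in for the stored modification time: touch it after each successful sync and let find report whether anything is newer. A sketch of that idea (the data path and NAS target are placeholders, not from the question; -newer is POSIX, -print -quit is a GNU/BSD extension that stops at the first hit):

    #!/bin/sh
    DATA=/path/to/Data                 # placeholder
    STAMP="$HOME/.data_last_sync"      # records when the last sync happened

    # First run: create the stamp dated far in the past so everything counts as new.
    [ -f "$STAMP" ] || touch -t 197001010000 "$STAMP"

    # -newer matches files modified more recently than the stamp file.
    if [ -n "$(find "$DATA" -newer "$STAMP" -print -quit)" ]; then
        rsync -a "$DATA/" nas:/backups/Data/ && touch "$STAMP"
    fi

Updating the stamp only when rsync succeeds means a failed sync is retried on the next run, and an unchanged Data directory never touches the NAS at all.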