backup

Is it possible to make a Batch program that backs up a folder's files from my external HDD to my PC's HDD any time I add or change a file?

Submitted by 我与影子孤独终老i on 2019-12-12 00:16:21

Question: I'm a graphic designer and I frequently use my external HDD to move PSD/JPEG/PNG/TIFF files between my laptop at home and my work PC. As a precaution, before modifying or adding a file I copy it to a folder. Is it possible to do this automatically? And is it easy to do in Batch? Thanks for the help!

Answer 1: Robocopy is your friend; google it and have a look. It's a very powerful Microsoft command-line tool that ships with Windows. Or if you want a nice GUI to go with it, use SyncToy (https://www…
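The copy-on-change behaviour the question asks for can be sketched in Python as a one-way mirror pass, run on a schedule or before each work session (the paths and function name below are illustrative; robocopy's /MIR switch or SyncToy do this more robustly):

```python
import os
import shutil

def mirror_new_and_changed(src, dst):
    """Copy files from src to dst when they are new or have a newer
    modification time than the existing backup copy."""
    copied = []
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = dst if rel == "." else os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            # copy only if the backup is missing or stale
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                shutil.copy2(s, d)  # copy2 preserves timestamps
                copied.append(name if rel == "." else os.path.join(rel, name))
    return copied
```

Because copy2 preserves modification times, a second pass over an unchanged folder copies nothing.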

PowerShell: keep last 31 files

Submitted by 泪湿孤枕 on 2019-12-11 23:53:25

Question: I'm using this PowerShell command to delete backup files that are more than 31 days old:

Get-ChildItem -Path "d:\Backup\hl" -Recurse | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-31) } | Remove-Item -Force -Recurse

My question: if the daily backup were to fail and I only checked the backup folder after, say, a month, the script would delete all or most of the backups, because they are older than 31 days. Is it possible to change the PowerShell command to keep the last 31 files depending…
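The safer, count-based retention the asker wants can be sketched as a small Python function (the folder path and keep count are parameters, not anything from the question):

```python
import os

def prune_keep_newest(folder, keep=31):
    """Delete files in `folder`, keeping only the `keep` most recently
    modified ones. Count-based, so a stalled backup job cannot cause
    every remaining backup to be deleted."""
    files = [os.path.join(folder, f) for f in os.listdir(folder)]
    files = [f for f in files if os.path.isfile(f)]
    files.sort(key=os.path.getmtime, reverse=True)  # newest first
    doomed = files[keep:]  # everything beyond the newest `keep`
    for path in doomed:
        os.remove(path)
    return doomed
```

In PowerShell itself the same idea would sort by LastWriteTime descending and pipe through Select-Object -Skip 31 into Remove-Item, so the newest 31 files always survive regardless of their age.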

Apply NSURLIsExcludedFromBackupKey to Server URL or URL of Directory on iPhone?

Submitted by 女生的网名这么多〃 on 2019-12-11 20:32:16

Question: I am battling an app rejection due to how I am saving some downloaded files on the iPhone. I have implemented NSURLIsExcludedFromBackupKey, but I am not clear whether I am doing it correctly. Should I be applying NSURLIsExcludedFromBackupKey to the NSURL that represents the file on my server, or do I somehow apply it to the directory on the iPhone where I am saving the file? Here is what I have now, after the app was rejected:

// Get / Create the File Directory on the iPhone
NSArray…

How to schedule backups for development, testing, and production environments?

Submitted by 余生颓废 on 2019-12-11 20:14:34

Question: I was educating myself about website launches and reading this document to prepare an implementation checklist for my future launches. On page 11, it says: "Schedule backups for development, testing, and production environments." I can imagine backing up a database and website code, but how can someone automate and schedule a backup of an entire environment? If this is too broad a question, I don't need a step-by-step document. Maybe some…
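For the scheduling half of that checklist item, the classic Unix answer is cron; a hypothetical crontab sketch (the script paths and times below are invented placeholders — each script would dump the database and archive the code and configuration for its environment):

```
# m h dom mon dow  command
0 2 * * *  /opt/backup/backup_production.sh     # nightly at 02:00
0 3 * * 6  /opt/backup/backup_dev_and_test.sh   # Saturdays at 03:00
```

The automation half is whatever those scripts contain: database dump, tar of the web root, snapshot of server configuration, shipped off-site.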

Where is the list of recent files stored?

Submitted by 北城以北 on 2019-12-11 19:04:08

Question: I would like to synchronize certain Sublime Text 2 settings between multiple computers. One thing I don't want to sync is the recent-file list. Where on the local file system does Sublime Text 2 store its list of recently opened files?

Answer 1: On Mac OS X this list is stored in a file called Session.sublime_session under ~/Library/Application Support/Sublime Text 2/Settings, so I'd expect it to be in the same file in Sublime's Settings folder on other operating systems as well.

Answer 2: On Linux it is…

Google Apps Script: Save Spreadsheet as ODS for Local Backup

Submitted by 妖精的绣舞 on 2019-12-11 18:58:39

Question: I could use a hand. My company uses Google Sheets extensively, and we need a way to access files when we lose our Internet connection. I could not get any of the examples found on this site to work for creating XLS or ODS files from Google Sheets via script. I did script a way to create CSV backups, accessible from a local Google Drive folder. When run with an hourly trigger, this script creates CSV files of every sheet of any spreadsheet modified in the last hour, puts them in a folder, and zips…
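The csv-and-zip step the asker describes can be sketched outside Apps Script with plain Python (the sheet name and rows below are invented for illustration; in the real workflow the rows would come from the Sheets export):

```python
import csv
import io
import zipfile

def zip_sheets_as_csv(sheets, zip_path):
    """Write each sheet (a mapping of name -> list of rows) as a CSV
    member of a single zip archive and return the archive path."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, rows in sheets.items():
            buf = io.StringIO()
            csv.writer(buf).writerows(rows)  # one CSV per sheet
            zf.writestr(name + ".csv", buf.getvalue())
    return zip_path
```

Writing into an in-memory StringIO and then zf.writestr avoids leaving loose temporary CSV files on disk.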

Tiny CLion projects slow to build a second time

Submitted by 冷暖自知 on 2019-12-11 18:27:09

Question: I'm a community-college instructor grading student C++ coding assignments. I've been doing the same task all semester. Suddenly, this morning, CLion is building extremely slowly, perhaps even hanging, the second time I build/run a project. WTF? The projects are very small: one source file, one header, no libraries. What changed? And why would a second build be the problem? It's usually first builds that are slow.

Answer 1: What changed? My hard-drive backup software. I told my auto-backup to take 2…

Python REST / TeamCity backup

Submitted by 妖精的绣舞 on 2019-12-11 17:57:18

Question: I am trying to develop a Python script to run the REST backup procedure shown at http://confluence.jetbrains.com/display/TW/REST+API+Plugin#RESTAPIPlugin-DataBackup. Here is my code:

#!/usr/bin/env python
# -*- coding: utf-8 -*-
import urllib
import urllib2

"""
Data Backup
+++++++++++
Start backup:
POST http://teamcity:8111/httpAuth/app/rest/server/backup?includeConfigs=true&includeDatabase=true&includeBuildLogs=true&fileName=<fileName>
where <fileName> is the prefix of the file to save…
"""
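A Python 3 sketch of that call, assuming the endpoint quoted in the docstring above; the host name, credentials, and file-name prefix are placeholders, not values from the question:

```python
import urllib.parse
import urllib.request

def build_backup_url(host, file_name):
    """Build the TeamCity REST data-backup URL (the POST target)."""
    params = urllib.parse.urlencode({
        "includeConfigs": "true",
        "includeDatabase": "true",
        "includeBuildLogs": "true",
        "fileName": file_name,
    })
    return "http://%s/httpAuth/app/rest/server/backup?%s" % (host, params)

def start_backup(host, user, password, file_name="TeamCity_Backup"):
    """POST to the backup endpoint using HTTP basic auth
    (the httpAuth path segment expects it)."""
    url = build_backup_url(host, file_name)
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, url, user, password)
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
    # empty body; all parameters travel in the query string
    return opener.open(urllib.request.Request(url, data=b"", method="POST"))
```

The original used the Python 2 urllib/urllib2 pair; the sketch above is the urllib.request equivalent.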

Building a high-performance, automatically backed-up queue

Submitted by 时间秒杀一切 on 2019-12-11 17:12:47

Question: Please give me some hints on my problem. I'm building a queue data structure that:
1. has a real-time backup on the hard disk and can restore from that backup
2. can respond to massive enqueue/dequeue request rates
Thank you!

Answer 1: Is this an exercise you're doing? If not, you should probably look at some of the production message-queueing technologies (e.g. MSMQ for Windows), which support persisting the queues to disk rather than just storing them in memory. In terms of your requirements: 1. has a backup on hard…
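One way to sketch requirement 1 — a disk journal that can rebuild the queue after a restart — in Python (the class name and the JSON-lines journal format are my own choices for illustration, not from the question):

```python
import collections
import json
import os

class JournaledQueue:
    """In-memory deque with an append-only journal on disk, so the
    queue can be restored after a crash. One JSON line per operation;
    journal compaction is left out for brevity."""

    def __init__(self, journal_path):
        self.items = collections.deque()
        if os.path.exists(journal_path):
            # replay the journal to rebuild in-memory state
            with open(journal_path) as f:
                for line in f:
                    op, value = json.loads(line)
                    if op == "enq":
                        self.items.append(value)
                    else:  # "deq"
                        self.items.popleft()
        self._log = open(journal_path, "a")

    def _write(self, op, value=None):
        self._log.write(json.dumps([op, value]) + "\n")
        self._log.flush()
        os.fsync(self._log.fileno())  # durable before acknowledging

    def enqueue(self, value):
        self._write("enq", value)
        self.items.append(value)

    def dequeue(self):
        value = self.items.popleft()
        self._write("deq")
        return value
```

The per-operation fsync is what makes the backup real-time, but it caps throughput; to meet requirement 2, production queues typically batch or group-commit journal writes instead of syncing every operation.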

FTP - Only want to keep latest 10 files - delete LRU

Submitted by 我怕爱的太早我们不能终老 on 2019-12-11 17:11:48

Question: I have created a shell script to back up my web files plus a database dump, put them into a tar archive, and FTP it off-site. I'd like to run it X times per week, but I only want to keep the latest 10 backups on the FTP site. How can I best do this? Should I be doing this work on the shell-script side, or is there an FTP command to check last-modified times and administer things that way? Any advice would be appreciated. Thanks!

Answer 1: One way to do something like this would be to use the day of the week in the…
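The retention decision itself can live on the script side as a pure function; a Python sketch (the names and timestamps below are illustrative — in practice the listing could come from ftplib's MLSD command and the deletions go through FTP.delete):

```python
def backups_to_delete(entries, keep=10):
    """Given (name, modified_timestamp) pairs from a remote listing,
    return the names of all but the `keep` most recent backups,
    oldest-surplus last."""
    ordered = sorted(entries, key=lambda e: e[1], reverse=True)  # newest first
    return [name for name, _ts in ordered[keep:]]
```

Keeping the logic as a pure function over (name, timestamp) pairs means the same code works whether the listing comes from FTP, SFTP, or a local directory.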