Another option is to archive the operational data on a regular basis (daily, hourly, or whatever interval fits your needs). Most database engines support extracting their data into an archive.
Basically, the idea is to create a scheduled Windows or CRON job that
- determines the current tables in the operational database
- exports all data from every table into a CSV or XML file
- compresses the exported files into a ZIP archive, preferably with the generation timestamp in the file name for easier archiving
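The three steps above can be sketched in a few lines of Python. This is a minimal illustration using the stdlib `sqlite3` driver so it runs anywhere; for MySQL you would swap in a real driver, and the function and path names here are just placeholders:

```python
import csv
import io
import sqlite3
import zipfile
from datetime import datetime

def archive_database(db_path: str, out_dir: str) -> str:
    """Export every table to CSV and bundle the files into a timestamped ZIP."""
    conn = sqlite3.connect(db_path)
    try:
        # Step 1: determine the current tables in the operational database.
        tables = [row[0] for row in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'")]

        # Step 3 (name): timestamp in the file name for easier archiving.
        stamp = datetime.now().strftime("%Y-%m-%d")
        zip_path = f"{out_dir}/archive-{stamp}.zip"

        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
            for table in tables:
                # Step 2: select all data from every table into a CSV file.
                cur = conn.execute(f'SELECT * FROM "{table}"')
                buf = io.StringIO()
                writer = csv.writer(buf)
                writer.writerow([col[0] for col in cur.description])
                writer.writerows(cur)
                # Step 3: compress the exported data into the ZIP archive.
                zf.writestr(f"{table}.csv", buf.getvalue())
        return zip_path
    finally:
        conn.close()
```

The same structure works for any engine whose driver exposes a table listing and plain `SELECT` queries.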
Many SQL database engines ship with a tool for exactly this purpose. For example, with MySQL on Linux, the following command can be used in a CRON job to perform the extraction:

```shell
mysqldump --all-databases --xml --lock-tables=false -ppassword | gzip -c > /media/bak/servername-$(date +%Y-%m-%d)-mysql.xml.gz
```
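To actually schedule this, the command goes into a crontab entry. The schedule and paths below are illustrative, and two details matter: `%` has a special meaning in crontab lines and must be escaped as `\%`, and `mysqldump` can read the password from an option file such as `~/.my.cnf` instead of the command line, which keeps it out of the process list:

```
# m  h  dom mon dow  command — run the dump every night at 02:30
30 2 * * * mysqldump --all-databases --xml --lock-tables=false | gzip -c > /media/bak/servername-$(date +\%Y-\%m-\%d)-mysql.xml.gz
```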