Hi all,
I've inherited a backup regime that backs up nightly to tape using BE 2010 R2. Our weekly full backup is around 1.15 TB of data, yet our nightly incrementals are anywhere between 600-700 GB.
I cannot fathom how we are generating that much changed data; it implies over half our dataset is changing on a daily basis.
Any tips on how to diagnose this?
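One idea I've had is to script a quick scan of the backed-up volumes to see which top-level folders account for the churn. A rough Python sketch along these lines (the D:\Data path is just a placeholder for our share root, and it keys off modified time, which is only an approximation since BE incrementals can use either the archive bit or modified time depending on the job's settings):

```python
import os
import sys
import time
from collections import defaultdict

CUTOFF_HOURS = 24  # look back one backup cycle

def scan_changed(root):
    """Sum the size of files modified within the last CUTOFF_HOURS,
    grouped by top-level subdirectory of root."""
    cutoff = time.time() - CUTOFF_HOURS * 3600
    totals = defaultdict(int)
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip locked/inaccessible files
            if st.st_mtime >= cutoff:
                rel = os.path.relpath(dirpath, root)
                top = rel.split(os.sep)[0] if rel != '.' else '.'
                totals[top] += st.st_size
    return totals

if __name__ == '__main__':
    root = sys.argv[1] if len(sys.argv) > 1 else r'D:\Data'  # placeholder root
    for top, size in sorted(scan_changed(root).items(),
                            key=lambda kv: kv[1], reverse=True):
        print('%10.2f GB  %s' % (size / 1024**3, top))
```

If the totals from something like this come nowhere near 600 GB, that would point at the job selections or the change-detection method rather than genuine data churn. Is that a sensible approach, or is there a better way?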
We are looking at moving our core ops to a datacentre, with our current HQ server room as a DR hot site, but there's no way we can shuffle that quantity of data across an internet link nightly.
Thoughts?
GD