ONTAP Discussions

QUESTION - A source to dual-destination mirror question

emanuel

Hello all

I have a design question based on a use case:

CASE:

We have an ongoing storage controller farm of volumes that are created by the month (ex: /vol/vol2010_8, /vol/vol2010_9, /vol/vol2010_10, etc.), and they are required to be mirrored to two sites in different geos.  For the last few months the mirror schedule is aggressive, but older volumes are not mirrored so aggressively; as these months/volumes come and go, the schedule needs to be modified to replicate less.  Currently all of this is managed locally on the storage controllers (using snapmirror.conf files).
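For context, here is roughly what the hand-maintained entries look like today. Each destination controller carries lines like these in its /etc/snapmirror.conf (hostnames are made up; the four trailing fields are the standard minute / hour / day-of-month / day-of-week schedule):

    # on site1filer - aggressive schedule for a current month, every 2 hours
    srcfiler:vol2010_10  site1filer:vol2010_10  -  0 0,2,4,6,8,10,12,14,16,18,20,22 * *
    # on site1filer - relaxed schedule for an older month, three times a day
    srcfiler:vol2010_1   site1filer:vol2010_1   -  0 0,8,16 * *

The same pattern repeats on the second destination, so every month that ages means editing the conf files on both remote controllers by hand - this is what we want Protection Manager to take over.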

We would like to consider this for Protection Manager (we have DFM 4.0 currently).

We have a working design and will test it out before deploying it more widely:

SETUP:

1.     Create a new policy in PM that mirrors the source to both primary and secondary destinations ( mirror to two destinations ).

2.     Create a couple of schedules to reflect the aggressive and non-aggressive cadences.

3.     Create a couple of Resource Pools that contain the aggregates/controllers for each target GEO ( see the sketch after this list ):

>>>> Destination Site 1 = RSP1

>>>> Destination Site 2 = RSP2
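For illustration, the two pools could be seeded from the DFM CLI along these lines (controller and aggregate names are placeholders, and the exact dfpm respool argument forms should be verified against the DFM 4.0 CLI help):

    dfpm respool create RSP1
    dfpm respool add RSP1 site1filer:aggr1 site1filer:aggr2
    dfpm respool create RSP2
    dfpm respool add RSP2 site2filer:aggr1 site2filer:aggr2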

DESIGN:

1.     Create a Data Set and add source volumes

2.     Assign Policy to Data Set

>>>> Choose "assign resource pool"

>>>> for Destination Site 1 assign RSP1

>>>> for Destination Site 2 assign RSP2

This should start the mirror process using the aggressive schedule ( one "set" of source volumes mirrored to "two" GEOs ).
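As a minimal CLI sketch of the DESIGN steps above - assuming the dual-mirror policy and the resource pools from SETUP already exist; the flag spellings are from memory and worth confirming with the dfpm help, and per-node resource pool assignment may be easier to do in the NetApp Management Console:

    dfpm dataset create AUG-SEP-OCT2010
    dfpm dataset add AUG-SEP-OCT2010 srcfiler:/vol2010_8 srcfiler:/vol2010_9 srcfiler:/vol2010_10
    # attach the dual-mirror policy; RSP1/RSP2 then get attached to the two destination nodes
    dfpm dataset modify -p Dual-Mirror-Replication AUG-SEP-OCT2010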

What I am struggling with is that as this data starts to age, there is no need to mirror every few hours and we can back off to a lesser schedule.  So how do we modify our setup to reflect this?

>     If I change the schedule in the Data Set, then the Data Set stays the same and there should be no interruption in mirroring activities.

>     If I remove the source volume from the data set and re-add it to another dataset, am I at risk of a re-baseline?

If you can imagine a year's worth of volumes: if I just modify the schedule, I will have a lot of datasets in PM (which may not be a bad thing) -- OR -- if I can withdraw a volume from a dataset and place it into an "older" dataset, then I can maintain two datasets with two schedules, but the "older" dataset would have an increasing number of source volumes.

We will be spacing out schedules so we do not conflict with snapmirror working threads.

If this model is good, then we would consider using it to import existing mirror relationships.

Thank you for your time.

5 REPLIES

rshiva

Hi Emanuel,

I'd prefer sticking to the same dataset and modifying the schedules as time goes by. That's the best way, because if we keep moving volumes between datasets it can get a bit messy (for example, the backup versions of the volumes would still remain in the old dataset, and so on).

Don't worry about the number of datasets. We've tested up to 500 datasets in Protection Manager 4.0, which is a pretty decent number. However, the question is how you plan to configure the datasets. Creating a dataset every month and assigning all the volumes created in that month does sound fairly organised (but make sure that you don't end up having hundreds of volumes within the same dataset). Try to limit it to about 20 volumes per dataset - but that's the bigger picture and totally depends on the client's infrastructure.

Hope that helps.

Thanks and regards

Shiva Raja

emanuel

Shiva, all

We will create a dataset for the first few current months. For example: we are still in October, so starting today we will create a dataset that includes the AUG, SEP, and OCT volumes and replicates to the two remote sites every 2 hours.

As time moves on, we will change the schedule so that it replicates less.

There is no termination plan for this sort of replication, although they are reviewing exactly how long old volumes need to replicate.

I guess we need to come up with a scheme for creating datasets, perhaps based on a quarterly (QTR) calendar, and then as they become older, just modify the schedule.

I just found out that these "monthly" source volumes come two apiece; each month has two volumes, so each QTR will have six ... so our PM dataset will look like:

DATASETNAME:     AUG-SEP-OCT2010

Policy:                    Dual-Mirror-Replication

Schedule:               High Replication Schedule - bi-hourly

# of Source volumes:     6 ( 2 per month )

Resource Pools:     Two separate resource pools, one per GEO

Once this becomes "old", change the schedule to the "Low Replication Schedule" - every 8 hours.
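Just to pin the two cadences down in the familiar cron-style notation from snapmirror.conf (minute / hour / day-of-month / day-of-week; in PM these would simply be two named schedules attached to the policy's mirror connections):

    # High Replication Schedule - bi-hourly
    0 0,2,4,6,8,10,12,14,16,18,20,22 * *
    # Low Replication Schedule - every 8 hours
    0 0,8,16 * *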

As time moves on, we could have tens of these quarterly datasets in PM.

adaikkap

Hi Emanuel,

Some answers and my thoughts on your questions below.

>     If I change the schedule in the Data Set, then the Data Set stays the same and there should be no interruption in mirroring activities.

Yes, changing the schedule does not affect an in-flight mirror job. It takes effect once that job completes. Once a job is fired, it's on its own until it completes or the user decides to kill it.

>     If I remove the source volume from the data set and re-add it to another dataset, am I at risk of a re-baseline?

You mean remove the source and destination volumes and import them back into another dataset? If so, no rebaseline.

But do keep in mind that the old backup versions of this volume will still stay in the old dataset and expire on their own.
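For what it's worth, the move itself would look something like this from the CLI - the command forms are approximations, OLD-VOLUMES is a hypothetical target dataset, and the re-import of the existing relationship (which is what avoids the rebaseline) may need to be done via the external relationships view in the Management Console:

    dfpm dataset remove AUG-SEP-OCT2010 srcfiler:/vol2010_8
    dfpm dataset add OLD-VOLUMES srcfiler:/vol2010_8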

Also keep in mind that only one schedule can be associated with a connection of a dataset.

So if you create a mirror dataset with 20 volumes, you can have only one schedule, not a different one for each volume.

In fact, this is the paradigm of the dataset: you group objects of a similar type that require the same treatment and manage them as one single entity (the dataset), as opposed to 20 or 200 individual volumes.

regards

adai

smoot

Hi Emmanuel --

I like the idea of creating a dataset per month.

The only clarification I'd make is that you want two protection policies: one for aggressive mirroring and one for less frequent mirroring. As the datasets age, you should change the policy on the dataset rather than fool around with schedules.
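With two policies in place, aging a dataset becomes a single change, something like the line below (the -p flag and the Mirror-LowFrequency policy name are assumptions on my part; confirm against the dfpm dataset modify usage in DFM 4.0):

    dfpm dataset modify -p Mirror-LowFrequency AUG-SEP-OCT2010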

-- Pete

emanuel

Okay ... let me look into the design for this ... it's coming together slowly.

We will probably start with importing existing relationships first, based on resource availability.
