2018-01-30 01:53 PM
I recently ran into a scenario where a large amount of data (lots of files across lots of sub-folders) had to be moved from a regular folder into a Qtree (in the same volume) for quota purposes.
The challenge was how to get the data moved between the two folders as quickly as possible with as little impact to snapshot space consumption as possible. We tried many approaches (File Explorer, Robocopy, fancy multi-threaded / multi-job code), and eventually found that the fastest way to move the data was with a single-line PowerShell command:
Get-ChildItem -Path <path_to_folder_1> -Recurse | Move-Item -Destination <path_to_folder_2>
That's it. Moving hundreds of thousands of files across a very complicated nested sub-folder structure took only a few seconds, with about 2% impact to snapshot space.
Thought I would share in case you find yourself in a similar situation.
2018-01-31 05:12 AM
Maybe so, but Robocopy doesn't really move files at the metadata level. It copies and deletes, meaning the snapshot space will explode.
Hope you guys are surviving the cold weather...
2018-01-31 05:20 AM
Remember, the use case is moving files between two folders (one regular and one qtree) in the same volume. The PowerShell command does not copy any data; it only updates the file metadata. Robocopy copies the entire data and then deletes the source.