BlueXP Services
Is there any way to optimize storage by automatically deleting duplicate files? Any recommendations would be appreciated.
CIFS environment, Classification, BlueXP on-premises, Cloud Data Sense (not the Legacy version).
Hello @sprmachtmann ,
As mentioned by @chamfer, can you verify whether deduplication is enabled on the volumes in your AFF cluster? Here is the documentation on enabling deduplication:
http://docs.netapp.com/us-en/ontap/volumes/enable-deduplication-volume-task.html
You can also validate if the storage efficiency is enabled for a particular volume by running the command:
::> volume efficiency show <vol_name> -instance
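If deduplication turns out to be disabled, it can be enabled and then run against the data already on the volume from the same CLI. A minimal sketch follows; the vserver and volume names (vs1, vol1) are placeholders for your own:

```
::> volume efficiency on -vserver vs1 -volume vol1
::> volume efficiency start -vserver vs1 -volume vol1 -scan-old-data true
::> volume efficiency show -vserver vs1 -volume vol1 -fields state,progress
```

The -scan-old-data option tells the efficiency scanner to process existing data rather than only new writes, which matters on a volume that already holds duplicates.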
Hi @sprmachtmann,
Can you provide more information? Your question is generic and adding more information would assist the community with providing you with some recommendations.
I would start by describing your storage platform and ONTAP version, and whether there are any restrictions on the software or scripts you can use. That is a start, and more detail will help those trying to help you. I hope this helps!
Hi @chamfer ,
The environment runs on an AFF system. Our ideal end state is that every duplicate file is reduced to a single copy.
There are no restrictions on the software or scripts we can use.