That's a very relative question. Why not ask how long is a piece of string while you're at it.
But seriously, it needs to be put in context: 15% of what size of dataset?
15% of 5 TB is certainly more significant than 15% of a few gigabytes.
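To make that concrete, here's a minimal sketch (the function name and figures are purely illustrative) showing how the same de-dupe percentage translates into very different absolute savings:

```python
def dedupe_savings_gb(dataset_gb: float, savings_pct: float) -> float:
    """Return the absolute space saved, in GB, for a given savings percentage."""
    return dataset_gb * savings_pct / 100.0

# The same 15% ratio on two very different dataset sizes:
print(dedupe_savings_gb(5120, 15))  # 5 TB dataset -> 768.0 GB saved
print(dedupe_savings_gb(10, 15))    # 10 GB dataset -> 1.5 GB saved
```

Same ratio, but one result is worth re-architecting your storage for and the other clearly isn't.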
There's no denying that while de-dupe reduces storage consumption, it adds a level of complexity in terms of capacity management.
It's a trade-off: the amount saved has to outweigh the perceived "cost" of implementing the technology, weighed against the criticality of the use case. In my experience, 25% savings has been enough for the optimistic crowd to have a go, while 40% or more is what it takes for the more conservative folks.
Of course, it also depends on whether the economy is in the tank and whether you actually have a budget left for next year at all.
Richard (all the happier now that OM 3.8 can manage de-dupe centrally)