The Flash Cache or PAM cards are *NOT* used when writing to the NetApp disks (that is, when restoring from tape), and they're also *NOT* used for large sequential reads (which is what happens when you back up a single large file, like a LUN, to tape).
The reason the Flash Cache disregards writes to disk is that these writes are already 100 % cached by the NVRAM and then flushed to the disks at the next consistency point. The Flash Cache may still get populated with the written data, so it can serve as a read cache for future reads of that data.
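If it helps to picture it, here's a tiny Python sketch of that write path. It's purely illustrative (the class and method names are made up, this is not ONTAP code): the point is that the write is acknowledged at the NVRAM/journal stage, and the Flash Cache only comes into play for later reads.

```python
# Toy model of the write path described above -- not ONTAP internals, just an
# illustration of why Flash Cache adds nothing on writes: the write is
# acknowledged as soon as it is logged in NVRAM, and the data goes to disk
# at the next consistency point regardless of any read cache.

class ToyFiler:
    def __init__(self):
        self.nvram_log = []      # journal of incoming writes
        self.disk = {}           # blocks committed at consistency points
        self.flash_cache = {}    # read cache; never on the write path

    def write(self, block_id, data):
        # The write is acknowledged to the client once it is safely logged.
        self.nvram_log.append((block_id, data))
        return "ack"

    def consistency_point(self):
        # Flush everything logged so far to disk, then clear the journal.
        for block_id, data in self.nvram_log:
            self.disk[block_id] = data
            # The written blocks *may* be copied into the read cache so a
            # later read of the same data is fast -- a read-side optimisation
            # that does not speed up the write itself.
            self.flash_cache[block_id] = data
        self.nvram_log.clear()

    def read(self, block_id):
        # Reads are served from the read cache when possible, else from disk.
        return self.flash_cache.get(block_id, self.disk.get(block_id))
```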
The large sequential reads (which, I think, is what you were asking about) are considered to 'pollute' the cache: on a backup you typically read the data only once, so storing it in the Flash Cache would simply evict other data (potentially valuable from a performance point of view) and occupy that space with backed-up data that most probably won't be read again in the near future.
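To make the 'pollution' argument concrete, here's a very simplified sketch (nothing to do with ONTAP's actual caching algorithm) of a read cache with an admission policy that refuses blocks coming from a sequential scan:

```python
from collections import OrderedDict

# Simplified illustration of cache pollution: admitting blocks from a large
# sequential backup read would evict hot random-read blocks that are far
# more likely to be re-read, so the cache simply skips them.

class ReadCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()   # block_id -> data, in LRU order

    def get(self, block_id):
        if block_id in self.blocks:
            self.blocks.move_to_end(block_id)   # mark as recently used
            return self.blocks[block_id]
        return None                              # cache miss

    def admit(self, block_id, data, sequential=False):
        if sequential:
            # Blocks from a large sequential scan (e.g. a backup) are read
            # once, so caching them would only displace valuable data.
            return
        self.blocks[block_id] = data
        self.blocks.move_to_end(block_id)
        while len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)      # evict least recently used
```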
This may also explain why you found such 'good' performance on the volume with the many small files.
From my experience this doesn't explain why you get such 'bad' performance (like the 10 or even 3 MB/s), since Data ONTAP will still read ahead into its RAM, even if it's not going to store that data in the Flash Cache. Is the system doing a lot of other work while you're running these tests? That could 'flush' the read-ahead data out of RAM. I believe a 3240 has 8 GB of RAM per head, so that's roughly the amount of caching available for read-aheads (shared by all operations).
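Just as a rough sanity check (with made-up numbers, so treat it as a sketch only), you can estimate how quickly other traffic would cycle through that RAM and push the read-ahead data out:

```python
# Back-of-the-envelope estimate with hypothetical numbers: how fast would a
# competing workload churn through the RAM that read-ahead data lives in?

ram_cache_gb = 8            # roughly what a 3240 head has available
competing_mb_s = 200        # assumed throughput of the other workload

seconds_to_churn = ram_cache_gb * 1024 / competing_mb_s
print(f"~{seconds_to_churn:.0f} s for other I/O to cycle through the RAM cache")
# With ~200 MB/s of competing traffic the whole 8 GB is recycled in well
# under a minute, so read-ahead data that isn't consumed quickly by the
# backup stream can easily be evicted before it's ever used.
```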