Question
Given a data structure (e.g. a hash of hashes), what's the clean/recommended way to make a deep copy for immediate use? Assume reasonable cases, where the data's not particularly large, no complicated cycles exist, and readability/maintainability/etc. are more important than speed at all costs.
I know that I can use Storable, Clone, Clone::More, Clone::Fast, Data::Dumper, etc. What's the current best practice?
Answer 1:
Clone is much faster than Storable::dclone, but the latter supports more data types.
Clone::Fast and Clone::More are pretty much equivalent if memory serves, but less feature-complete than even Clone; Scalar::Util::Clone supports even less, though IIRC it is the fastest of them all for some structures.
With respect to readability these should all read the same; they are virtually interchangeable.
If you have no specific performance needs I would just use Storable's dclone.
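For example, a minimal sketch of the dclone approach on a small hash of hashes (the sample data is made up for illustration):

    use strict;
    use warnings;
    use Storable qw(dclone);

    my %config = (
        db  => { host => 'localhost', port => 5432 },
        app => { name => 'demo', debug => 1 },
    );

    # dclone takes a reference and returns a reference to a deep copy.
    my $copy = dclone(\%config);

    # Mutating the copy leaves the original untouched.
    $copy->{db}{host} = 'example.com';
    print $config{db}{host}, "\n";   # still "localhost"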
I wouldn't use Data::Dumper for this simply because it's so cumbersome and roundabout. It's probably going to be very slow too.
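For contrast, the Data::Dumper route amounts to dumping the structure to Perl source and eval-ing it back in, roughly like this sketch (the data and variable names are illustrative):

    use strict;
    use warnings;
    use Data::Dumper;

    my $original = { a => [1, 2, 3], b => { x => 'y' } };

    my $copy = do {
        local $Data::Dumper::Purity = 1;   # make the dump round-trip-safe
        our $VAR1;                         # Dumper names its result $VAR1
        eval Dumper($original);
        die $@ if $@;
        $VAR1;
    };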
For what it's worth, if you ever want customizable cloning, Data::Visitor provides hooking capabilities, and fairly feature-complete deep cloning is its default behavior.
Answer 2:
My impression is that Storable::dclone() is somewhat canonical.
Answer 3:
Clone is probably what you want for that. At least, that's what all the code I've seen uses.
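A minimal sketch of that usage, with a made-up structure for illustration:

    use strict;
    use warnings;
    use Clone qw(clone);

    my $original = { user => { name => 'alice', roles => ['admin', 'dev'] } };

    # clone() recursively copies nested hashes and arrays.
    my $copy = clone($original);

    push @{ $copy->{user}{roles} }, 'qa';
    # $original->{user}{roles} is still ['admin', 'dev'].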
Answer 4:
Try fclone from Panda::Lib, which seems to be the fastest (it is written in XS).
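Assuming Panda::Lib exports fclone as described here (I have not verified the exact interface), usage would presumably look like this sketch:

    use strict;
    use warnings;
    use Panda::Lib qw(fclone);   # assumes fclone is exported, per the answer above

    my $original = { matrix => [[1, 2], [3, 4]] };

    # fclone is an XS deep-copy routine; it is presumed here to take a
    # reference and return a reference to an independent deep copy.
    my $copy = fclone($original);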
Source: https://stackoverflow.com/questions/388187/whats-the-best-way-to-make-a-deep-copy-of-a-data-structure-in-perl