On Apr 21, 4:51 pm, sas_9264 <Shiping9...@gmail.com> wrote:
> Hi, sometimes I have a problem using unix commands to copy, mv, or soft
> link a large SAS data set (over 4-5 GB). After I do that, I can't open
> the data anymore. SAS complains: ERROR: The open failed because
> library member TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have
> similar experience?
>
> Thanks,
>
> Shiping
There was a post on this newsgroup (list) about a year ago describing a
huge dataset that ended up with a few corrupted records after a copy.
Unfortunately, you should expect this. After copying a huge dataset --
and after somehow making sure the data has actually been flushed from
the cache to disk -- use a utility to do a byte-for-byte comparison, or
better, use proc compare, to confirm you made a good copy. Shuffle a
few million chunks of data around and one or two transfers may well
fail silently. It's one of the reasons I maintain that a "validated"
SAS reporting system can never be truly validated: there is too much
I/O going on, and sometimes some of it fails. Because SAS is by design
a procedural language that creates many intermediate datasets in the
course of running a complex job, the process is inherently vulnerable
to exactly this kind of corruption.
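To make the "compare after copy" advice concrete, here is a minimal
shell sketch of the unix-side check (the proc compare route would be
the SAS-side equivalent). The file paths are placeholders, not anything
from the original thread; the demo creates its own source file so it
can run anywhere.

```shell
#!/bin/sh
# Sketch: copy a (large) file, flush caches, then verify byte-for-byte.
# Paths are hypothetical stand-ins for a real dataset like
# temp/xxxxxx_xx041608.sas7bdat.
src=${1:-/tmp/demo_src.sas7bdat}
dst=${2:-/tmp/demo_dst.sas7bdat}

# Demo setup: fabricate a 1 MB source file if none was supplied.
[ -f "$src" ] || head -c 1048576 /dev/urandom > "$src"

cp "$src" "$dst"
sync    # ask the OS to flush pending writes before trusting the copy

# cmp -s is silent and exits non-zero at the first differing byte.
if cmp -s "$src" "$dst"; then
    echo "copy verified: $dst"
else
    echo "copy DIFFERS: $dst" >&2
    exit 1
fi
```

For copies across machines, comparing checksums (e.g. `md5sum` or
`cksum` on each side) serves the same purpose without shipping the
file back.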