Problem with large SAS data sets

Hi, sometimes I have a problem when I use unix commands to copy, mv, or
soft-link a large SAS data set (over 4-5 GB). Afterwards I can't open
the data set anymore. SAS complains: ERROR: The open failed because
library member TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have
similar experience?

Thanks,

Shiping
4/21/2008 2:51:22 PM
comp.soft-sys.sas


On Apr 21, 4:51 pm, sas_9264 <Shiping9...@gmail.com> wrote:
> [...]

There was a post on this newsgroup (list) about a year ago describing a
huge dataset that ended up with a few corrupted records after a copy.
Unfortunately, you should expect this occasionally. After copying a
huge dataset, and once you are somehow sure it has been flushed from
the cache, use a utility to do a comparison, or better, use PROC
COMPARE, to make sure you made a good copy. Shuffle a few million
chunks of data around and one or two transfers might well go wrong.
It's one of the reasons I maintain that a "validated" SAS reporting
system can never be truly validated: there are too many I/Os going on,
and sometimes one of them fails. Because SAS is a procedural language
that creates many intermediate datasets in the course of running a
complex job, the overall process is inherently untrustworthy.
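The verify-after-copy step can be sketched in shell. This is a minimal illustration, not the poster's exact procedure: the file names are hypothetical, a 1 MB random file stands in for a multi-gigabyte data set, and cmp stands in for PROC COMPARE (which would additionally check the copy at the SAS level).

```shell
# Hypothetical demo files; on a real system $src would be your .sas7bdat.
src=/tmp/verify_copy_demo_src.sas7bdat
dst=/tmp/verify_copy_demo_dst.sas7bdat

# Stand-in for a large data set: 1 MB of random bytes.
head -c 1048576 /dev/urandom > "$src"

cp "$src" "$dst"
sync                     # ask the OS to flush dirty pages to disk

# Byte-for-byte comparison; checksums (md5sum, sha256sum) work too.
if cmp -s "$src" "$dst"; then
    echo "copy verified"
else
    echo "copy differs" >&2
fi
```

Note the caveat from the post: even after sync, a read-back may be served from the page cache rather than the disk, so this catches copy errors but not every media fault. A PROC COMPARE of the copied library member against the original is a stronger check at the SAS level.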
rolandberry
4/22/2008 7:21:30 AM