Re: problem with large sas data sets #4

I don't accept that it's reasonable to expect data corruption to occur
without cause, even if the datasets are many or large. At the very least
your offsite backup should always be secure; your overnight and intraday
backups should cover short-term faults, but there should always be a known
reason why dataset x has gone kablooey.
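
If you do suspect a bad copy, the cheapest sanity check is the one RolandRB
suggests below: checksum both files (cksum or md5sum), or run proc compare
between the source member and the copied member before deleting the
original. A minimal sketch along those lines; the libnames and paths are
only placeholders, and the member name is the one from the error message in
the thread:

libname src  '/data/original';   /* placeholder: library the dataset was copied from */
libname dest '/data/copy';       /* placeholder: library it was copied to            */

/* compare the copy against the original, variable by variable */
proc compare base=src.xxxxxx_xx041608
             compare=dest.xxxxxx_xx041608
             method=exact;
run;

/* proc compare sets &sysinfo; 0 means the two members matched exactly */
%put NOTE: compare return code = &sysinfo;

A non-zero return code at least tells you it is the copy, not your
original, that should be treated as suspect.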

Rgds.

On Tue, 22 Apr 2008 00:21:30 -0700, RolandRB <rolandberry@HOTMAIL.COM>
wrote:

>On Apr 21, 4:51 pm, sas_9264 <Shiping9...@gmail.com> wrote:
>> Hi, sometimes I have a problem using Unix commands to copy, mv or soft
>> link a large SAS data set (over 4-5 GB). After I do that, I can't open
>> the data any more. SAS complains: ERROR: The open failed because
>> library member TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have
>> similar experience?
>>
>> Thanks,
>>
>> Shiping
>
>There was a post on this newsgroup (list) about a year ago about a huge
>dataset having a few corrupted records after a copy. Unfortunately, you
>should expect this. After copying a huge dataset, and somehow making sure
>it has been flushed from the cache, you should use a utility to do a
>comparison or, better, use proc compare to make sure you made a good copy.
>Do a few million shuffles of chunks of data and one or two might well not
>work. It's one of the reasons I state that a "validated" SAS reporting
>system can never be truly validated. There are too many I/Os going on and
>sometimes these might fail. Because it is a procedural language that
>creates many datasets in the course of running a complex job, it is
>inherently untrustworthy.
Posted by ben.powell1 to comp.soft-sys.sas, 4/22/2008 10:48:57 AM
