Re: problem with large sas data sets #6

Dear SAS-L-ers,

Shiping posted the following:

> Hi, sometimes I have a problem using UNIX commands to copy,
> mv, or soft-link a large SAS data set (over 4-5 GB). After I do
> that, I can't open the data anymore. SAS complains:
> ERROR: The open failed because library member
> TEMP.XXXXXX_XX041608.DATA is damaged.
> Does anyone have similar experience?
>
Shiping, yes, indeed, I have had this error on UNIX and Linux servers
in the past when copying very large SAS data sets between directories on
the same server.  It was always due to not having enough disk space in
the target directory, so the resulting SAS data set was lopped off at
the end.  You know, you can't put 100 pounds of feed into a 75-pound
feedbag and expect to be able to carry _ALL_ of your feed to your
livestock.  (<--Note the dime-store pseudo-intellectual analogy!)

So, if I were in your sandals, I would check the disk space in my target
directory (e.g., df -k) and double-verify that I had enough space in it
before trying to copy my HUMONGO SAS data set to it.
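That pre-flight check is easy to script.  Here is a minimal sketch; it
uses throwaway temp files as stand-ins for the real .sas7bdat file and
target directory (substitute your own paths), and only copies when the
destination has room:

```shell
# Stand-ins for the real data set and target directory; replace these
# with your own paths (e.g., a real .sas7bdat file and its destination).
src=$(mktemp)
dest_dir=$(mktemp -d)
dd if=/dev/zero of="$src" bs=1024 count=100 2>/dev/null  # 100 KB demo file

need_kb=$(du -k "$src" | awk '{print $1}')               # size of the data set
free_kb=$(df -k "$dest_dir" | awk 'NR==2 {print $4}')    # free KB in target

if [ "$free_kb" -gt "$need_kb" ]; then
    cp "$src" "$dest_dir"/
    echo "copied ($need_kb KB needed, $free_kb KB free)"
else
    echo "not enough space: need $need_kb KB, only $free_kb KB free" >&2
fi
```

The same size comparison applies to mv across file systems.  A soft
link (ln -s) consumes essentially no space in the target directory,
but it is only as good as the file it points to.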

Shiping, best of luck in shipping your SAS data sets between directories
on your UNIX server!

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Michael A. Raithel
"The man who wrote the book on performance"
E-mail: MichaelRaithel@westat.com

Author: Tuning SAS Applications in the MVS Environment

Author: Tuning SAS Applications in the OS/390 and z/OS Environments,
Second Edition
http://www.sas.com/apps/pubscat/bookdetails.jsp?catid=1&pc=58172

Author: The Complete Guide to SAS Indexes
http://www.sas.com/apps/pubscat/bookdetails.jsp?catid=1&pc=60409

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Anything that has real and lasting value is always a gift from within. -
Franz Kafka
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
comp.soft-sys.sas, 4/24/2008 12:37:37 PM

On Apr 24, 2:37 pm, michaelrait...@WESTAT.COM (Michael Raithel) wrote:
> [...]
> Shiping, yes, indeed, I have had this error on UNIX and Linux servers
> in the past when copying very large SAS data sets between directories on
> the same server.  It was always due to not having enough disk space in
> the target directory, so the resulting SAS data set was lopped off at
> the end. [...]
>
> So, if I were in your sandals, I would check the disk space in my target
> directory (e.g., df -k) and double-verify that I had enough space in it
> before trying to copy my HUMONGO SAS data set to it.

Those may be just the errors you know about. Run a PROC COMPARE after
copying one of these humongous data sets (after somehow making sure the
copy is no longer sitting in the disk cache), and you might find more
errors.

But maybe it is best not to look too hard.
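For those who do want to look, a minimal sketch of that PROC COMPARE
check follows.  The librefs, directory paths, and data set name are all
hypothetical, borrowed from the original post for illustration:

```sas
/* Hypothetical librefs; point them at the source and target directories. */
libname orig '/data/temp';
libname copy '/data/archive';

/* Compare the copy against the original, variable by variable and
   observation by observation; a truncated copy shows up as observations
   present in the BASE= data set but missing from the COMPARE= one.     */
proc compare base=orig.xxxxxx_xx041608
             compare=copy.xxxxxx_xx041608;
run;

/* SYSINFO is 0 only when the two data sets match exactly. */
%put NOTE: PROC COMPARE return code = &sysinfo;
```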

rolandberry, 4/24/2008 4:26:56 PM