Re: problem with large sas data sets #8

Roland:

You said:

    Those may be just the errors you know about. Do a "proc compare"
    after copying one of these humungous datasets, and somehow making sure
    it is no longer sitting in the disk cache, and you might find more
    errors.


This comment prompts me to repeat (and reword) my unanswered question to
you from earlier in this thread.

Why would you do a PROC COMPARE after a system copy, instead of a system
compare utility? The latter is presumably faster (i.e. no need to
deconstruct blocks into records, etc.) and just as good (maybe better)
at telling you if there is a difference in a single bit.

Regards,
Mark
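
(A minimal sketch of the kind of system-level check Mark is describing,
with hypothetical paths and file names: cmp, or a checksum utility, works
on the .sas7bdat file byte for byte, with no need to deconstruct blocks
into records.)

    # Hypothetical paths: verify a copy of a large SAS data set at the
    # file level, without starting a SAS session.

    # cmp exits 0 only if the two files are identical byte for byte;
    # it stops at the first difference it finds.
    cmp /data/source/bigtable.sas7bdat /data/copy/bigtable.sas7bdat \
        && echo "copy is identical" \
        || echo "copy differs or is missing"

    # Checksums are handy when the source and the copy sit on different hosts.
    md5sum /data/source/bigtable.sas7bdat /data/copy/bigtable.sas7bdat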

-----Original Message-----
From: SAS(r) Discussion [mailto:SAS-L@LISTSERV.UGA.EDU] On Behalf Of
RolandRB
Sent: Thursday, April 24, 2008 12:27 PM
To: SAS-L@LISTSERV.UGA.EDU
Subject: Re: problem with large sas data sets

On Apr 24, 2:37 pm, michaelrait...@WESTAT.COM (Michael Raithel) wrote:
> Dear SAS-L-ers,
>
> Shiping posted the following:
>
> > Hi, sometimes I have a problem when I use a unix command to copy,
> > mv, or soft link a large sas data set (over 4-5 GB). After I do
> > that, I can't open that data anymore. SAS complains: ERROR: The
> > open failed because library member TEMP.XXXXXX_XX041608.DATA is
> > damaged. Does anyone have similar experience?
>
> Shiping, yes, indeed, I have had this error on UNIX and on Linux servers
> in the past when copying very large SAS data sets between directories on
> the same server.  It was always due to not having enough disk space in
> the target directory, so the resultant SAS data set was kind of lopped
> off at the end.  You know; you can't put 100 pounds of feed into a
> 75-pound feedbag and expect to be able to carry _ALL_ of your feed to
> your livestock.  (<--Note the dime-store pseudo-intellectual analogy)!
>
> So, if I were in your sandals, I would check the disk space in my target
> directory (e.g., df -k) and double-verify that I had enough space in it
> before trying to copy my HUMONGO SAS data set to it.
>
> Shiping, best of luck in shipping your SAS data sets between directories
> on your UNIX server!

Those may be just the errors you know about. Do a "proc compare"
after copying one of these humungous datasets, and somehow making sure
it is no longer sitting in the disk cache, and you might find more
errors.

But maybe it is best not to look too hard.
mkeintz
4/24/2008 8:00:26 PM

On Apr 24, 10:00 pm, mkei...@WHARTON.UPENN.EDU ("Keintz, H. Mark")
wrote:
> Roland:
>
> You said:
>
>     Those may be just the errors you know about. Do a "proc compare"
>     after copying one of these humungous datasets, and somehow making sure
>     it is no longer sitting in the disk cache, and you might find more
>     errors.
>
> This comment prompts me to repeat (and reword) my unanswered question to
> you from earlier in this thread.
>
> Why would you do a PROC COMPARE after a system copy, instead of a system
> compare utility? The latter is presumably faster (i.e. no need to
> deconstruct blocks into records, etc.) and just as good (maybe better)
> at telling you if there is a difference in a single bit.
>
> Regards,
> Mark
>
> -----Original Message-----
> From: SAS(r) Discussion [mailto:SA...@LISTSERV.UGA.EDU] On Behalf Of
> RolandRB
> Sent: Thursday, April 24, 2008 12:27 PM
> To: SA...@LISTSERV.UGA.EDU
> Subject: Re: problem with large sas data sets
>
> On Apr 24, 2:37 pm, michaelrait...@WESTAT.COM (Michael Raithel) wrote:
> > Dear SAS-L-ers,
> >
> > Shiping posted the following:
> >
> > > Hi, sometimes I have a problem when I use a unix command to copy,
> > > mv, or soft link a large sas data set (over 4-5 GB). After I do
> > > that, I can't open that data anymore. SAS complains: ERROR: The
> > > open failed because library member TEMP.XXXXXX_XX041608.DATA is
> > > damaged. Does anyone have similar experience?
> >
> > Shiping, yes, indeed, I have had this error on UNIX and on Linux servers
> > in the past when copying very large SAS data sets between directories on
> > the same server.  It was always due to not having enough disk space in
> > the target directory, so the resultant SAS data set was kind of lopped
> > off at the end.  You know; you can't put 100 pounds of feed into a
> > 75-pound feedbag and expect to be able to carry _ALL_ of your feed to
> > your livestock.  (<--Note the dime-store pseudo-intellectual analogy)!
> >
> > So, if I were in your sandals, I would check the disk space in my target
> > directory (e.g., df -k) and double-verify that I had enough space in it
> > before trying to copy my HUMONGO SAS data set to it.
> >
> > Shiping, best of luck in shipping your SAS data sets between directories
> > on your UNIX server!
>
> Those may be just the errors you know about. Do a "proc compare"
> after copying one of these humungous datasets, and somehow making sure
> it is no longer sitting in the disk cache, and you might find more
> errors.
>
> But maybe it is best not to look too hard.

cmp should work if you are using *nix, but again, you somehow have to
make sure the copy has been flushed from the disk cache before you run
the comparison. Don't use diff.
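
(A rough sketch of what that could look like on Linux, with hypothetical
paths. Dropping the page cache requires root and is Linux-specific;
reading the copy back with direct I/O is an alternative that does not.
cmp compares byte by byte and stops at the first difference, which is why
it copes with multi-gigabyte binary files far better than the
line-oriented diff.)

    # Flush dirty pages and drop the page cache so the compare reads the
    # copy from disk rather than from memory (Linux only, run as root;
    # echo 1 drops the page cache only, 3 also drops dentries/inodes).
    sync
    echo 3 > /proc/sys/vm/drop_caches

    # Byte-for-byte compare; cmp is silent and exits 0 when identical.
    cmp /data/source/bigtable.sas7bdat /data/copy/bigtable.sas7bdat
    echo "cmp exit status: $?"    # 0 = identical, 1 = different, 2 = error

    # Without root, reading the copy with direct I/O bypasses the cache
    # for that file:
    dd if=/data/copy/bigtable.sas7bdat iflag=direct bs=1M 2>/dev/null | \
        cmp - /data/source/bigtable.sas7bdat && echo "copy is identical"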
rolandberry
4/25/2008 7:58:47 AM
