Re: How to filter sas data sets into separate sas data sets #3
Lizette,
Instead of trying to merge the two data sets, I would probably try to create
a SAS format from the values of VAR1 in data set 1. Then, NODE1, NODE2 and
NODE3 could be compared against the format for a match.
The example below is a simplified version of what you could do and shows a
printout of how it works. It has 5 observations in data set 1 and only 3
variables in data set 2, but I think the logic should hold for the example
you gave. After the example is code that could be used to actually split
the data as you had requested. Hope this helps.
* create sas data set 1 ;
data ...
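A sketch of that format-based lookup, since the example code was truncated here. Data set and variable names (ONE, TWO, VAR1, NODE1-NODE3, NEWLIST) are assumptions taken from the thread, not the poster's actual code:

```sas
* distinct VAR1 values become the format's start values;
proc sort data=one (keep=var1) out=vals nodupkey;
   by var1;
run;

* build a character format, $match., from the lookup values;
data fmt;
   retain fmtname '$match';
   set vals (rename=(var1=start)) end=last;
   label = 'Y';
   output;
   if last then do;        * catch-all range for non-matching values;
      hlo = 'O';
      label = 'N';
      output;
   end;
run;

proc format cntlin=fmt;
run;

* keep an observation when any of the three NODE variables matches;
data newlist;
   set two;
   if put(node1, $match.) = 'Y'
      or put(node2, $match.) = 'Y'
      or put(node3, $match.) = 'Y';
run;
```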
Re: How to filter sas data sets into separate sas data sets #4
Something like this is the old way. You could use PROC SQL if you
have a new enough version. Increase your buffer size, and if you have
enough memory you may be able to get it into a hash routine.
DATA WORK.NEW;
   MERGE small (IN=A OBS=500) big;
   BY ID_FIELD;
   IF A=1;
RUN;
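The hash routine Rich alludes to might be sketched like this (data set and variable names follow his merge example; this assumes SMALL fits in memory and contains ID_FIELD):

```sas
* load the small data set into a hash table, then keep only the
  observations of BIG whose key is found in it;
data work.new;
   if _n_ = 1 then do;
      declare hash h(dataset: 'small');
      h.defineKey('id_field');
      h.defineDone();
   end;
   set big;
   if h.find() = 0;   * rc 0 means ID_FIELD was found in SMALL;
run;
```

Unlike the MERGE, this needs no prior sort of BIG, which is often the point when one side is very large.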
RICH
-----Original Message-----
From: SAS(r) Discussion [mailto:SAS-L@listserv.vt.edu] On Behalf Of
Lizette Koehler
Sent: Monday, April 02, 2007 10:53 AM
To: SAS-L@LISTSERV.VT.EDU
Subject: How to filter sas data sets into separate sas data sets
Listers,
This is my failing point in coding SAS. T...
Re: problem with large sas data sets
hhmm.. are you crossing operating systems, like a Windows or VMS
in between the Unix hosts? Is there FTP involved anywhere, and is it
sending binary as ASCII? Is the network link bad?
sas_9264 wrote:
> Hi, sometimes I have a problem using unix commands to copy, mv, or
> soft-link large SAS data sets (over 4-5 GB). After I do that, I can't
> open the data anymore. SAS complains: ERROR: The open failed because
> library member TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have
> similar experience?
>
> Thanks,
>
> Shiping
>
--
"Games? Solitaire? I have a 2-node VAXcluster, 3 Windows 2000 servers, 2
Windows 2003 servers, 1 MySQL Database Server, 1 Postgres Database
Server, 1 Linux server, several Ubuntus and a direct satellite feed to
my windows desktop background, who needs toys???" - Jim
...
Re: Reading & Writing SAS data sets without SAS #3
Chang,
You're correct that a number of companies have done it. I believe SPSS
can do it; WPS, Stat/Transfer, DBMS/Copy, and perhaps others have also done
it. But what I think is unique here is that Alan is talking about
offering plug-ins so you can roll your own, so to speak. How cool would it be
to have some type of driver/plugin for R?
Phil
Philip Rack
MineQuest, LLC
SAS & WPS Consulting and WPS Reseller
Tel: (614) 457-3714
Web: www.MineQuest.com
Blog: www.MineQuest.com/WordPress
-----Original Message-----
From: Chang Chung [mailto:chang_y_chung@HOTMAIL.COM]
Sent: Monday...
Re: Exporting a SAS data set to Text file on SAS unix #3
hi ... actually, what I posted earlier was too much code (sorry)
this is enough (a bit more succinct)
* variable names into a macro variable (tab separated);
proc sql noprint;
select name into :vars separated by '09'x
from dictionary.columns
where libname eq 'SASHELP' and memname eq 'CLASS'
order by varnum;
quit;
data _null_;
file 'z:\class.txt' dsd dlm='09'x ;
if _n_ eq 1 then put "&vars";
set sashelp.class;
put (_all_) (:);
run;
--
Mike Zdeb
U@Albany School of Public Health
One University Place
Rensselaer, New York 12144-3456
P/518-402...
Re: problem with large sas data sets #4
I don't accept that it is reasonable to expect data corruption to occur
without cause, even if the datasets are many or large. At least your
offsite backup should always be secure; your overnight and intraday
backups should resolve short-term faults, but there should always be a
known reason why dataset x has gone kablooey.
Rgds.
On Tue, 22 Apr 2008 00:21:30 -0700, RolandRB <rolandberry@HOTMAIL.COM>
wrote:
>On Apr 21, 4:51 pm, sas_9264 <Shiping9...@gmail.com> wrote:
>> Hi, sometimes I have a problem using unix commands to copy, mv, or
>> soft-link large SAS data sets (over 4-5 GB). After I do that, I can't
>> open the data anymore. SAS complains: ERROR: The open failed because
>> library member TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have
>> similar experience?
>>
>> Thanks,
>>
>> Shiping
>
>There was a post on this newsgroup (list) about a year ago of a huge
>dataset having a few corrupted records after doing a copy.
>Unfortunately, you should expect this. After copying a huge dataset,
>and somehow being sure it has flushed from the cache, you should use a
>utility to do a comparison or better, use proc compare to make sure
>you made a good copy. Do a few million shuffles of chunks of data and
>one or two might well not work. It's one of the reasons I state that a
>"validated" sas reporting system can never be truly validated. There
>are too man...
Re: problem with large sas data sets #6
Dear SAS-L-ers,
Shiping posted the following:
> Hi, sometimes I have a problem using unix commands to copy,
> mv, or soft-link large SAS data sets (over 4-5 GB). After I do
> that, I can't open the data anymore. SAS complains:
> ERROR: The open failed because library member
> TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have similar
> experience?
>
Shiping, yes, indeed, I have had this error on UNIX and on Linux servers
in the past when copying very large SAS data sets between directories on
the same server. It was always due to not having enough disk space in
the target directory, so the resultant SAS data set was kind of lopped
off at the end. You know; you can't put 100 pounds of feed into a
75-pound feedbag and expect to be able to carry _ALL_ of your feed to
your livestock. (<--Note the dime-store pseudo-intellectual analogy)!
So, if I were in your sandals, I would check the disk space in my target
directory (e.g., df -k) and double-verify that I had enough space in it
before trying to copy my HUMONGO SAS data set to it.
Shiping, best of luck in shipping your SAS data sets between directories
on your UNIX server!
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Michael A. Raithel
"The man who wrote the book on performance"
E-mail: MichaelRaithel@westat.com
Author: Tuning SAS Applications in the MVS Environment
Author: Tuning SAS Applications in the OS/390 and z/OS Environments,
Second Edition
http://www.sa...
Re: problem with large sas data sets #8
Roland:
You said:
Those are maybe just the errors you know about. Do a "proc compare"
after copying one of these humungous datasets, and somehow making sure
it is no longer sitting in the disk cache, and you might find more
errors.
This comment causes me to repeat (and reword) my unanswered question to
you earlier in this thread.
Why would you do a PROC COMPARE after a system copy, instead of a system
compare utility? The latter is presumably faster (i.e. no need to
deconstruct blocks into records, etc.) and just as good (maybe better)
at telling you if there is a difference in a single bit.
Regards,
Mark
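For concreteness, the PROC COMPARE check being debated might be as simple as the following. Librefs, paths, and the data set name are assumptions, not anything from the thread:

```sas
* point at the original and the copy (paths are hypothetical);
libname orig '/data/original';
libname copy '/data/backup';

* exact, bit-for-bit value comparison rather than fuzzy numeric matching;
proc compare base=orig.big compare=copy.big method=exact;
run;

%put COMPARE return code: &sysinfo;  * 0 means no differences were found;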
-----Original Message-----
From: SAS(r) Discussion [mailto:SAS-L@LISTSERV.UGA.EDU] On Behalf Of
RolandRB
Sent: Thursday, April 24, 2008 12:27 PM
To: SAS-L@LISTSERV.UGA.EDU
Subject: Re: problem with large sas data sets
On Apr 24, 2:37 pm, michaelrait...@WESTAT.COM (Michael Raithel) wrote:
> Dear SAS-L-ers,
>
> Shiping posted the following:
>
> > Hi, sometimes I have a problem using unix commands to copy,
> > mv, or soft-link large SAS data sets (over 4-5 GB). After I do
> > that, I can't open the data anymore. SAS complains:
> > ERROR: The open failed because library member
> > TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have similar
> > experience?
>
> Shiping, yes, indeed, I have had this error on UNIX and on Linux servers
> in the past when copying very large SAS data sets...
Re: problem with large sas data sets #7
Not a thing wrong with that analogy... ;-)
jim
Yes, I grew up on a farm, and do have manners.. I think... ;-)
Michael Raithel wrote:
> Dear SAS-L-ers,
>
> Shiping posted the following:
>
>> Hi, sometimes I have a problem using unix commands to copy,
>> mv, or soft-link large SAS data sets (over 4-5 GB). After I do
>> that, I can't open the data anymore. SAS complains:
>> ERROR: The open failed because library member
>> TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have similar
>> experience?
>>
> Shiping, yes, indeed, I have had this error on UNIX and on Linux servers
> in the past when copying very large SAS data sets between directories on
> the same server. It was always due to not having enough disk space in
> the target directory, so the resultant SAS data set was kind of lopped
> off at the end. You know; you can't put 100 pounds of feed into a
> 75-pound feedbag and expect to be able to carry _ALL_ of your feed to
> your livestock. (<--Note the dime-store pseudo-intellectual analogy)!
>
> So, if I were in your sandals, I would check the disk space in my target
> directory (e.g., df -k) and double-verify that I had enough space in it
> before trying to copy my HUMONGO SAS data set to it.
>
> Shiping, best of luck in shipping your SAS data sets between directories
> on your UNIX server!
>
> +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
...
Re: problem with large sas data sets #2
But, is there a "different" system between the two Unix systems? Or is the
SAS dataset being copied to another drive within the same system?
j.
Shiping Wang wrote:
> I use Sas under unix system.
>
> On 4/21/08, *Jim Agnew* <agnew@vcu.edu <mailto:agnew@vcu.edu>> wrote:
>
> hhmm.. are you crossing operating systems, like a windows or vms
> inbetween unix hosts? is there FTP involved anywhere and it's
> sending binary as ascii? is the network link bad?
>
> sas_9264 wrote:
>
> Hi, sometimes I have a problem using unix commands to copy, mv,
> or soft-link large SAS data sets (over 4-5 GB). After I do that, I can't
> open the data anymore. SAS complains: ERROR: The open failed because
> library member TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have
> similar experience?
>
> Thanks,
>
> Shiping
>
>
Re: problem with large sas data sets #5
If I/O as a source of corruption is the criterion, then practically all
applications are "inherently untrustworthy". True, such risk would have
to increase as the amount of I/O increases, but there are means of reducing
that risk - i.e., contemporary analogs of the parity track on 9-track
tapes.
It seems to me that the issue is one of managing risk. For example, if
a binary compare utility, as suggested by RolandRB, declares two files
equivalent, I'd like to know how running a PROC COMPARE further reduces
the risk.
Regards,
Mark
-----Original Message-----
From: SAS(r) Discussion [mailto:SAS-L@LISTSERV.UGA.EDU] On Behalf Of
RolandRB
Sent: Tuesday, April 22, 2008 3:22 AM
To: SAS-L@LISTSERV.UGA.EDU
Subject: Re: problem with large sas data sets
On Apr 21, 4:51 pm, sas_9264 <Shiping9...@gmail.com> wrote:
> Hi, sometimes I have a problem using unix commands to copy, mv, or
> soft-link large SAS data sets (over 4-5 GB). After I do that, I can't
> open the data anymore. SAS complains: ERROR: The open failed because
> library member TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have
> similar experience?
>
> Thanks,
>
> Shiping
There was a post on this newsgroup (list) about a year ago of a huge
dataset having a few corrupted records after doing a copy.
Unfortunately, you should expect this. After copying a huge dataset,
and somehow being sure it has flushed from the cache, you should use a
utility to do a comparison or better, use proc ...
Re: How to filter sas data sets into separate sas data sets
I think that both Ron's (as he mentioned) and Richard's solutions require
that VAR1 be in both datasets.
But from the original post, it seemed to me that VAR1 is only in data set 1,
and it must be matched to one of three variables in data set 2 (NODE1, NODE2,
or NODE3) to be output to the NEWLIST data set. For this reason, I think a
format is one possible approach.
Maybe the original poster can clarify this point. Thanks.
Jack Clark
Research Analyst
Center for Health Program Development and Management
University of Maryland, Baltimore County
-----Original Message-----
From: SAS(r) Discussio...
Re: Deleting SAS Data from a SAS DATASET #3
The disadvantages of using SQL delete are (1) NOBS= is no longer
accurate, and (2) POINT= may behave unexpectedly.
If those are not issues, then the delete would be faster, with or
without an index, than recreating the data set (especially if the data
set has indexes that would need to be rebuilt).
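For the original 24-month question, an in-place SQL delete might look like the following. The data set name comes from the thread; the date variable and the cutoff expression are assumptions:

```sas
* delete in place every row whose snapshot date is more than
  24 months before today (cutoff logic is an assumption);
proc sql;
   delete from prod.master_date
      where snap_dt < intnx('month', today(), -24);
quit;
```

With an index on SNAP_DT, SAS can locate the affected rows without a full scan, which is the speedup Mary describes.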
--
Jack Hamilton
jfh@alumni.stanford.org
On Aug 15, 2008, at 11:17 am, Mary wrote:
> One thing you might do is to add an index on the snap_dt to the
> dataset; if that's there then you should be able to delete the
> records in place:
>
> proc sql;
>   delete from prod.master_date
>     where snap_dt = "&end_dt"d;
> quit;
>
> In both the ways you are trying now you are creating new data sets
> rather than deleting records from the current data set; it would
> seem to me that a SQL delete statement would be faster than creating
> new datasets even if there isn't an index on the date.
>
> -Mary
> ----- Original Message -----
> From: SUBSCRIBE SAS-L Chandra Gadde
> To: SAS-L@LISTSERV.UGA.EDU
> Sent: Friday, August 15, 2008 12:14 PM
> Subject: Deleting SAS Data from a SAS DATASET
>
>
> Hi All
>
> I have several SAS datasets that are very, very big (50 GB in size).
> Every month, data is appended to these datasets. I need to delete the
> data that is more than 24 months old. What is the best method to do
> this? Please help me.
>
...
Re: search SAS data set from SAS code
> From: Rose
> Hi All,
> Suppose I have a permanent SAS data set which was created
> a while ago. I know the library path, but I can't remember which SAS
> program created it. How can I search through so many SAS program files in
> different folders and find it?
a problem familiar to all of us delayed-housekeeping folks.
Libname Libref '<directory-specification>';
DATA LibRef.DataSetName;
use your system utilities to search for the dir-spec
of your libref.
search: *.sas
containing text: <dir-spec>
once you have found the libname...
problem with large sas data sets
Hi, sometimes I have a problem using unix commands to copy, mv, or
soft-link large SAS data sets (over 4-5 GB). After I do that, I can't
open the data anymore. SAS complains: ERROR: The open failed because
library member TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have
similar experience?
Thanks,
Shiping
On Apr 21, 4:51 pm, sas_9264 <Shiping9...@gmail.com> wrote:
> Hi, sometimes I have a problem using unix commands to copy, mv, or
> soft-link large SAS data sets (over 4-5 GB). After I do that, I can't
> open the data anymore. SAS complains: ERROR: The open failed because
> library member TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have
> similar experience?
>
> Thanks,
>
> Shiping
There was a post on this newsgroup (list) about a year ago of a huge
dataset having a few corrupted records after doing a copy.
Unfortunately, you should expect this. After copying a huge dataset,
and somehow being sure it has flushed from the cache, you should use a
utility to do a comparison or better, use proc compare to make sure
you made a good copy. Do a few million shuffles of chunks of data and
one or two might well not work. It's one of the reasons I state that a
"validated" sas reporting system can never be truly validated. There
are too many I/Os going on and sometimes these might fail. By its
design of being a procedural language and creating many datasets in
the process of running a complex job, it is inherently ...
Re: search SAS data set from SAS code #2
Rose,
The answer to your question depends on your operating system. In Windows,
there's the Search tool. In Unix/Linux, you can use grep.
Bob Abelson
HGSI
240 314 4400 x1374
bob_abelson@hgsi.com
Rose <myr_rose@YAHOO.COM>
Sent by: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
04/19/2005 11:13 AM
Please respond to myr_rose
To: SAS-L@LISTSERV.UGA.EDU
cc:
Subject: search SAS data set from SAS code
Hi All,
Suppose I have a sas permanent data set which was created early, I know
the library path but I couldn't remember in which s...
Re: search SAS data set from SAS code #5
Rose,
You have some good advice on search techniques, but they may be inadequate.
I hope your LIBNAME wasn't something like
libname lib "&dir" ;
Perhaps you should also search for ".member", but that could have the same
problem. You might also look for key variable names or values, or for
procedures that you know created the data. The date from a PROC CONTENTS
might provide useful information, or an old report created by the same
program with a footnote, "Source code: ...".
Maybe
data lib.w ( label="created by ..." ) ;
would be a good habit to ...
Re: sas data xport problem #3
Hi Everyone,
Thanks so much for the help! Using your suggestions, my code is fixed now
and works fine. Here is the corrected code:
libname in 'C:\data';
libname out xport 'C:/data/new.xpt';
options validvarname=v7;
data work1;
  set in.work (rename=(long_name=long));
run;
data out.new;
  set work1;
run;
Thanks again!
Orange
On Wed, Jan 27, 2010 at 3:58 AM, Andre Wielki <wielki@ined.fr> wrote:
> from documentation
> The UPLOAD and DOWNLOAD procedures in SAS/CONNECT and PROC COPY with the
> XPORT engine are the only strategies available for...
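The PROC COPY route from that documentation excerpt would look roughly like this (paths and the member name are assumptions echoing the poster's example):

```sas
* source library and a transport-format destination file;
libname src 'C:\data';
libname xp  xport 'C:\data\new.xpt';

* copy one member into the transport file;
proc copy in=src out=xp memtype=data;
   select work;   * WORK here is the member name from the post,
                    not the WORK library;
run;
```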
Re: sas data sets to excel #3
Maybe talk your client out of this? Or just do "passive-aggressive" and
send it on two sheets? It is a really, really bad idea to put two
different data sets on the same sheet.
-Mary
----- Original Message -----
From: Qiang Fu
To: Mary
Sent: Wednesday, October 15, 2008 10:08 AM
Subject: Re: sas data sets to excel
Hi,
Thanks for your quick reply. I think, however, I didn't express myself
clearly in the MSG.
What I want is to export two different SAS data sets into ONE
worksheet with one tab name, instead of two worksheets with two different
tab names. The example you gave will export the data sets into two
different tabs with the names "sheet1" and "sheet2" respectively, even
though they are in the same workbook. I know this task sounds insane, but
the client needs it. Any idea? Thanks a lot.
Qiang
On Wed, Oct 15, 2008 at 10:24 AM, Mary <mlhoward@avalon.net> wrote:
Hi,
I like to use Excel Tagsets for this; it will actually write an XML
file, but you can open an XML file in either Excel 2003 or 2007. Setting
the sheet name switches sheets.
Excel Tagsets give you a lot of formatting control; here is an example:
ods listing close;
ods tagsets.excelxp file="c:\temp\file1.xml"
    style=analysis
    options(absolute_column_width='10,30,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8,8'
    sheet_label=' '...
Re: problem when installing SAS. windows x64 does not support SAS? #3
What about having a Virtual Machine with XP, for example, and SAS inside it on your x64?
Arthur Tabachneck <art297@NETSCAPE.NET> escribió: Elodie,
Take a look at:
http://support.sas.com/kb/16/568.html
HTH,
Art
---------
On Wed, 14 May 2008 07:54:57 -0700, elodie.gillain@GMAIL.COM wrote:
>Hi everyone
>
>I am trying to install SAS on a Vista machine.
>
>The setup wizard says that Windows x64 does not support the SAS
>version I am trying to install.
>
>What can I do?
>
>I greatly appreciate your help.
...
Re: Reading SAS data sets on UNIX by non-SAS apps #2
John:
Following on Richard's thoughtful suggestions, the Affinium system would
likely capture data from csv files. SAS PROC EXPORT produces them quickly,
and loading them into external systems works faster for relatively basic
data structures and data formats, in my experience, than xml parsing.
Sig
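The CSV route Sig describes could be sketched as follows; the libref, data set name, and output path are assumptions for illustration:

```sas
* dump a SAS data set to a csv file that the external system can load;
proc export data=mylib.customers
   outfile='/export/customers.csv'
   dbms=csv replace;
run;
```

The flat file sidesteps the missing ODBC driver entirely, at the cost of losing formats, labels, and other metadata.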
-----Original Message-----
From: SAS(r) Discussion [mailto:SAS-L@LISTSERV.UGA.EDU] On Behalf Of John
Bentley
Sent: Monday, October 18, 2004 10:10 AM
To: SAS-L@LISTSERV.UGA.EDU
Subject: Reading SAS data sets on UNIX by non-SAS apps
I have SAS data sets on AIX that we need to read with Unica's Affinium
campaign management software, also on a UNIX box. (Let's not get into why
we didn't go with the SAS Solution.) SAS Institute doesn't supply an ODBC
driver for the UNIX environment, and the Affinium implementors don't want to
use the SAS SQL Library for C and or deal with APIs. Other that dumping the
SAS data sets as flat files, can anyone suggest a solution?
Thanks in advance for the help.
...
Re: What r the data types in SAS? in Base SAS , and SAS SQL
> From: Amar Mundankar
> Sent: Tuesday, July 21, 2009 8:10 AM
> To: sas-l@uga.edu
> Subject: What r the data types in SAS? in Base SAS , and SAS SQL
>
> Hi all,
> What are the different data types in Base SAS and SAS SQL??
character: lengths from 1 to 32,767 bytes
numeric: lengths from 3 to 8 bytes (2 to 8 on z/OS)
dates are a subtype of numeric
and are identified by their date, datetime, or time formats
the new proc TSPL (Table Server Processing Language)
supports ANSI data types: bigint, tinyint, etc.
http://support.sas.com/documentation/cdl/en/tsag/30878/HTML/default/a003065339.htm
http://s...
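A small illustration of the two underlying types, with a date stored as a formatted numeric (the values are made up):

```sas
* character and numeric are the only two base types;
data _null_;
   length name $ 20 x 8;   * $20 character; 8-byte numeric;
   name = 'Amar';
   x = 3.14;
   d = '21JUL2009'd;       * a date literal: days since 01JAN1960;
   put name= x=;
   put d= date9.;          * same number, displayed as a date;
   put 'stored internally as: ' d;
run;
```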
Re: SAS Data Set in Edit Mode #3
Anil,
Assuming you are trying to edit the SAS table in ViewTable, click on the
column header for the date column you want to edit with your right mouse
button and choose Column Attributes from the resulting drop down list. You
can then change the INFORMAT to one that will accept dates in the form you
want to use.
Joe
On 10/13/07, Peter Crawford <peter.crawford@blueyonder.co.uk> wrote:
>
> On Sat, 13 Oct 2007 11:33:15 -0400, Anil Anand <anilkranand@GMAIL.COM>
> wrote:
>
> >Hello All,
> >
> >I am trying to edit a SAS dataset in Edit mode on Linux. W...
Re: SAS data-set index size #3
auto208611@HUSHMAIL.COM wrote:
>
>Is it a telling story if the size of the SAS index file (*.sas7bdnx) is
>75% of the size of the SAS data set itself?
>
>For instance, we have a data set that is 533 MB and the index file is
>400 MB.
>
>Is this an indication of a poor data-set structure?
No, it is not an indication of a poor data structure. (You may *have*
a poor data structure, but the relative size of the index file is not
indicative.)
Since Mister Index himself has already chipped in, I'll just add a couple
other points.
Think conceptually of the i...