


Data, data, data,

I have been tasked with dumping around 400GB of data in physical files into
flat files and then transferring them from an iSeries 810 at V5R3 to a
Windows server.  I was going to CPYTOIMPF to a stream file and then
copy it over to the Windows server from the IFS.  Anyone have a better
idea or any suggestions?

Thanks,
Thad Rizzi

thad_rizzi (231)
7/20/2007 9:42:21 PM
comp.sys.ibm.as400.misc

Hi Thad

If you have loads of free disk space on your 810 then your method is
fine, but if not you might consider simply FTPing the files, or using
iSeries Access file transfer, to move them directly from the iSeries
files to Windows text files.  You can get either FTP or iSeries Access
to carry out the EBCDIC to ASCII translation on the fly.
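The on-the-fly translation is just a code page conversion. A minimal sketch in Python, assuming the common US English EBCDIC code page (CCSID 37, which Python exposes as cp037):

```python
# A fragment of a record as it would arrive untranslated from the
# iSeries, encoded in EBCDIC code page 37 (CCSID 37, US English).
ebcdic_record = b"\xc8\x85\x93\x93\x96"  # the text "Hello" in EBCDIC

# Translating to a Windows-friendly encoding is a decode/encode pair.
ascii_record = ebcdic_record.decode("cp037").encode("ascii")

print(ascii_record)  # b'Hello'
```

FTP's ASCII transfer mode and iSeries Access do this for you per record; the point is only that no separate conversion pass over the 400GB is needed.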

Here are my thoughts for and against.

Things in favour of Direct Transfer:
1) You do not need any extra disk space on the iSeries.
2) It will take less time to transfer, as this is a single-pass
process; with the CPY command to the IFS you still have to port the
large text files to the other server afterwards.
3) FTP does not handle packed numeric fields very well (iSeries Access
does).
4) iSeries Access file transfer will allow basic field formatting and
record selection to assist with the migration.
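On the packed numeric point: packed decimal stores two digits per byte with the sign in the final nibble, which is why a byte-for-byte FTP of a physical file mangles those fields. A hedged sketch of the decoding in Python, assuming the standard signed packed format (0xC or 0xF positive, 0xD negative):

```python
def unpack_decimal(data: bytes, scale: int = 0):
    """Decode an IBM signed packed-decimal field.

    Two digits per byte; the low nibble of the last byte is the
    sign code (0xD = negative, 0xC or 0xF = positive).
    """
    nibbles = []
    for byte in data:
        nibbles.append(byte >> 4)
        nibbles.append(byte & 0x0F)
    sign = nibbles.pop()            # final nibble is the sign code
    value = 0
    for digit in nibbles:
        value = value * 10 + digit
    if sign == 0x0D:
        value = -value
    return value / 10 ** scale if scale else value

# A PACKED(5,2) field holding -123.45 occupies just three bytes:
print(unpack_decimal(b"\x12\x34\x5D", scale=2))  # -123.45
```

iSeries Access and CPYTOIMPF expand these fields to readable digits for you; a raw binary FTP leaves them as the bytes above.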


Things in favour of CPYTOIMPF / CPYTOSTMF:
1) If the iSeries data is still being used, this method allows you to
snapshot all the data at a known point, with all files in sync.
2) This method makes it simpler to copy the data out to tape for
restore on a non-iSeries platform.

You could also consider an ODBC-style link, but this is generally
better suited to smaller, more complex data requests than to simple
but large-scale data exports.

It is also feasible to copy the data directly from the iSeries DB2
files to tape for reading on a Windows platform, but by the time
you've figured that out you may as well have transferred it over the
network.


Regards Brad

smjbradshaw (182)
7/21/2007 10:02:37 AM
Brad wrote:

> 2) Will take less time to transfer as this is a single pass process,
> with CPY command to IFS you still have to port the large text files to
> other Server

Why? If /QNTC or an NFS mount was the target, the transfer would happen 
during the copy. Are those not valid targets?

Personally, I'd probably use the Qshell db2 utility instead of 
CPYTOIMPF. There's far more control over the format of each column 
including column delimiters and quoting.

The file field descriptions could be read first in order to construct 
the desired db2 SELECT statement. The SELECT could be built as part of 
the command string for STRQSH or written into a qsh script. It wouldn't 
be a particularly complicated program: accept a file name as a parm, 
read the field descriptions, create the appropriate db2 SELECT, and 
then execute it. Output would be redirected by the shell to the target 
streamfile.
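A sketch of that program in Python rather than CL, with a hypothetical field list standing in for what DSPFFD *OUTFILE or an API would return; numeric columns are wrapped in CHAR() so the db2 utility emits plain text:

```python
# Hypothetical field descriptions as (name, type) pairs; on the system
# itself these would be read from the file's field descriptions.
fields = [("CUSTNO",  "P"),   # packed decimal
          ("NAME",    "A"),   # character
          ("BALANCE", "P")]   # packed decimal

def build_select(file_name, library, fields):
    """Build a db2 SELECT that renders numeric columns as text."""
    cols = []
    for name, ftype in fields:
        if ftype in ("P", "S"):      # packed or zoned numeric
            cols.append(f"CHAR({name})")
        else:
            cols.append(name)
    return f"SELECT {', '.join(cols)} FROM {library}/{file_name}"

print(build_select("CUSTMAST", "MYLIB", fields))
# SELECT CHAR(CUSTNO), NAME, CHAR(BALANCE) FROM MYLIB/CUSTMAST
```

In a qsh script the result would then be redirected to the stream file, along the lines of db2 "$SQL" > /outbound/custmast.txt (file, library, and path names here are illustrative; per-column delimiters and quoting would be added the same way).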

-- 
Tom Liotta
http://zap.to/tl400
thomas8368 (114)
7/21/2007 11:36:14 AM