How can I use Automator to extract specific data from a text file?
I have several hundred text files that each contain a bunch of information. I only need six values from each file, and ideally I need them as columns in an Excel file.
How can I use Automator to extract specific data from the text files and create either a new text file or an Excel file with the info? I have looked all over but can't find a solution. If anyone could help I would be eternally grateful! If there is another, better solution than Automator, please let me know!
Example of File Contents:
Link Time =
DD/MMM/YYYY
Random
Text
161 179
bytes of CODE memory (+ 68 range fill )
16 789
bytes of DATA memory (+ 59 absolute )
1 875
bytes of XDATA memory (+ 1 855 absolute )
90 783
bytes of FARCODE memory
What I would like to have as a final file (an Excel sheet with one row per input file):
Column1      Column2    Column3  Column4  Column5  Column6
MM/DD/YYYY   filename1  161179   16789    1875     90783
MM/DD/YYYY   filename2  xxxxxx   xxxxx    xxxx     xxxxx
MM/DD/YYYY   filename3  xxxxxx   xxxxx    xxxx     xxxxx
Is this possible? I can't imagine having to go through each and every file one by one. Please help!!!
Hello
You may try the following AppleScript script. It will ask you to choose a root folder in which to search for *.map files, and will then create a CSV file named "out.csv" on the desktop, which you can import into Excel.
set f to (choose folder with prompt "Choose the root folder to start searching")'s POSIX path
if f ends with "/" then set f to f's text 1 thru -2
do shell script "/usr/bin/perl -CSDA -w <<'EOF' - " & f's quoted form & " > ~/Desktop/out.csv
use strict;
use open IN => ':crlf';
chdir $ARGV[0] or die qq($!);
local $/ = qq(\\0);
my @ff = map {chomp; $_} qx(find . -type f -iname '*.map' -print0);
local $/ = qq(\\n);
# CSV spec
# - record separator is CRLF
# - field separator is comma
# - every field is quoted
# - text encoding is UTF-8
local $\\ = qq(\\015\\012); # CRLF
local $, = qq(,); # COMMA
# print column header row
my @dd = ('column 1', 'column 2', 'column 3', 'column 4', 'column 5', 'column 6');
print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
# print data row per each file
while (@ff) {
    my $f = shift @ff; # file path
    if ( ! open(IN, '<', $f) ) {
        warn qq(Failed to open $f: $!);
        next;
    }
    $f =~ s%^.*/%%og; # file name
    @dd = ('', $f, '', '', '', '');
    while (<IN>) {
        chomp;
        $dd[0] = \"$2/$1/$3\" if m%Link Time\\s+=\\s+([0-9]{2})/([0-9]{2})/([0-9]{4})%o;
        ($dd[2] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of CODE\\s/o;
        ($dd[3] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of DATA\\s/o;
        ($dd[4] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of XDATA\\s/o;
        ($dd[5] = $1) =~ s/ //g if m/([0-9 ]+)\\s+bytes of FARCODE\\s/o;
        last unless grep { /^$/ } @dd;
    }
    close IN;
    print map { s/\"/\"\"/og; qq(\").$_.qq(\"); } @dd;
}
EOF
Hope this may help,
H
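Since the original poster also asked for alternatives to Automator: the same extraction can be sketched in plain Python. This is only a sketch that mirrors the Perl above, and it makes the same assumptions the Perl does: each byte count and its "bytes of ... memory" label sit on one line, and the Link Time date is numeric DD/MM/YYYY.

```python
import csv
import re
from pathlib import Path

# Patterns mirroring the Perl script above: a numeric "Link Time" date,
# and four byte counts whose digits may contain embedded spaces ("161 179").
DATE_RE = re.compile(r"Link Time\s*=\s*(\d{2})/(\d{2})/(\d{4})")
SIZE_RES = {
    "code": re.compile(r"([0-9 ]+)\s+bytes of CODE\b"),
    "data": re.compile(r"([0-9 ]+)\s+bytes of DATA\b"),
    "xdata": re.compile(r"([0-9 ]+)\s+bytes of XDATA\b"),
    "farcode": re.compile(r"([0-9 ]+)\s+bytes of FARCODE\b"),
}

def extract(text, name):
    """Return one row: [date, filename, code, data, xdata, farcode]."""
    row = {"date": "", "name": name,
           "code": "", "data": "", "xdata": "", "farcode": ""}
    for line in text.splitlines():
        m = DATE_RE.search(line)
        if m:
            day, month, year = m.groups()
            row["date"] = f"{month}/{day}/{year}"  # swap to MM/DD/YYYY
        for key, rx in SIZE_RES.items():
            m = rx.search(line)
            if m:
                row[key] = m.group(1).replace(" ", "")  # drop digit grouping
    return [row[k] for k in ("date", "name", "code", "data", "xdata", "farcode")]

def build_csv(root, out_path):
    """Walk `root` for *.map files and write one CSV row per file."""
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow([f"column {i}" for i in range(1, 7)])
        for path in sorted(Path(root).rglob("*.map")):
            writer.writerow(extract(path.read_text(errors="replace"), path.name))
```

You would run it as, e.g., `build_csv("/path/to/maps", "out.csv")` and open the resulting CSV in Excel.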
Similar Messages
-
Extracting specific data from multiple text files to single CSV
Hello,
Unfortunately my background is not scripting, so I am struggling to piece together a PowerShell script to achieve the below. Hoping an experienced PowerShell scripter can provide the answer. Thanks in advance.
I have a folder containing approx. 2000 label type files that I need to extract certain information from to index a product catalog. Steps to be performed within the script as I see are:
1. Search folder for *.job file types
2. Search the files for certain criteria and where matched return into single CSV file
3. End result should be a single CSV with column headings:
a) DESCRIPTION
b) MODEL
c) BARCODE
Try:
# Script to extract data from .job files and report it in CSV
# Sam Boutros - 8/24/2014
# http://superwidgets.wordpress.com/category/powershell/
$CSV = ".\myfile.csv" # Change this filename\path as needed
$Folders = "d:\sandbox" # You can add multiple search folders as "c:\folder1","\\server\share\folder2"
# End Data entry section
if (-not (Test-Path -Path $CSV)) {
    Write-Output """Description"",""Model"",""Barcode""" | Out-File -FilePath $CSV -Encoding ascii
}
$Files = Get-ChildItem -Path $Folders -Include *.job -Force -Recurse
foreach ($File in $Files) {
    $FileContent = Get-Content -Path $File
    $Keyword = "viewkind4"
    if ($FileContent -match $Keyword) {
        for ($i=0; $i -lt $FileContent.Count; $i++) {
            if ($FileContent[$i] -match $Keyword) {
                $Description = $FileContent[$i].Split("\")[$FileContent[$i].Split("\").Count-1]
            }
        }
    } else {
        Write-Host "Keyword $Keyword not found in file $File" -ForegroundColor Yellow
    }
    $Keyword = "Code:"
    if ($FileContent -match $Keyword) {
        for ($i=0; $i -lt $FileContent.Count; $i++) {
            if ($FileContent[$i] -match $Keyword) {
                $Parts = $FileContent[$i].Split(" ")
                for ($j=0; $j -lt $Parts.Count; $j++) {
                    if ($Parts[$j] -match $Keyword) {
                        $Model = $Parts[$j+1].Trim()
                        $Model = $Model.Split("\")[$Model.Split("\").Count-1]
                    }
                }
            }
        }
    } else {
        Write-Host "Keyword $Keyword not found in file $File" -ForegroundColor Yellow
    }
    $Keyword = "9313"
    if ($FileContent -match $Keyword) {
        for ($i=0; $i -lt $FileContent.Count; $i++) {
            if ($FileContent[$i] -match "9313") {
                $Index = $FileContent[$i].IndexOf("9313")
                $Barcode = $null
                for ($j=0; $j -lt 12; $j++) {
                    $Barcode += $FileContent[$i][($Index+$j)]
                }
            }
        }
    } else {
        Write-Host "Keyword $Keyword not found in file $File" -ForegroundColor Yellow
    }
    Write-Output "File: '$File', Description: '$Description', Model: '$Model', Barcode: '$Barcode'"
    Write-Output """$Description"",""$Model"",""$Barcode""" | Out-File -FilePath $CSV -Append -Encoding ascii
}
Sam Boutros, Senior Consultant, Software Logic, KOP, PA http://superwidgets.wordpress.com (Please take a moment to Vote as Helpful and/or Mark as Answer, where applicable) -
I'm assuming that Automator is the way to go; I've never used it before, but from my research it seems to be. Also, I'm not good at programming at all.
Here's a typical email...
From: [email protected]
Subject: Enquiry for Property ID: 408777039, 2 Grey Avenue, Manningham, SA 5086, Listing Agent
Date: 18 September 2013 8:33:51 PM ACST
To: Joe Jope
Reply-To: [email protected]
You have received a new lead from realestate.com.au for
Property id: 408587036
Property address: 2 Grey Avenue, Manningham, SA 5086
Property URL: www.realestate.com.au/404387039
User Details:
Name: John Bon Jovi
Email: [email protected]
Phone: 0422645633
I would like to: buy this house
Comments: Please give me a call.
I have several hundred of these emails and I want to extract the information and then save it into my CRM, which is Highrise (https://highrisehq.com). If it's too difficult to get it directly into Highrise then I'm also happy to extract the info into Excel and then import it into Highrise. However, I do not want to have to run through that procedure for every single email.
I'd really like some help on how to get Automator to do this for me. Or if not Automator.. any other suggestions?
Thanks.
John
Hello
You may try the AppleScript script given in the first answer above; it asks you to choose a root folder in which to search for *.map files and creates a CSV file named "out.csv" on the desktop, which you can import into Excel.
Hope this may help,
H -
How to retrieve specific data from a text file
Hi, everyone
For my project it is required that a parameter file is read at the beginning, in order for certain variables to be initialized with specific values that change with the user.
At the moment, the way it is done is the following: The values at a specified sequence in a text file are read and saved in an array and the elements of the array are retrieved according to their index.
The problem with this implementation is, that if for some reason the format of the file changes, e.g. we want to use a parameter file from a previous version of the program that has the values for the same variables but in a different order, the only way to have the right values for the parameters is to change everything accordingly, which is really time wasting.
Could someone suggest a different implementation that would make the reading of the different values independent from their order in the file, e.g. by scanning the file for specific strings and reading the value after the string?
Thank you very much.
P.S. I have attached a screenshot of the routine I am using now.
Solved!
Go to Solution.
Attachments:
read parameter file.JPG 180 KB
Hi panagiov,
Find attached Folders.
Method 1: with this you can search for each variable separately. You can use the Config File VIs to get all keys (variables) at once and then use a For Loop to get their values, or you can access individual values as shown in this code.
Method 2: here you get all the data at once, as a 2D array of variables and values; you then search for the variables as and when required.
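Outside LabVIEW, the order-independent idea Gaurav describes (look values up by name instead of by position) can be sketched in a few lines of Python; the `name = value` line format is an assumption here, not panagiov's actual file format:

```python
def read_params(text):
    """Parse 'name = value' lines into a dict; file order is irrelevant."""
    params = {}
    for line in text.splitlines():
        if "=" not in line:
            continue  # skip blank lines and anything that is not a pair
        name, _, value = line.partition("=")
        params[name.strip()] = value.strip()
    return params

# Reordering the lines in the file can no longer break the program,
# because every value is fetched by its name.
cfg = read_params("gain = 2.5\noffset = -1\nmode = fast\n")
gain = float(cfg["gain"])
```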
I hope you will understand these methods.
Best of luck
Gaurav k
CLD Certified !!!!!
Do not forget to Mark solution and to give Kudo if problem is solved.
Attachments:
Method 1.zip 7 KB
Method 2.zip 9 KB -
How to extract different date formats from a text file
Well, I am new to using the regex API (regular expressions). In my project I want to extract dates in different formats from a text file... the date format will be 3rd june 2004, 03-06-2004, june 3rd and so on....
Can anybody give me a regular expression to extract the dates from the text file?
I will be very grateful.
kareem
"date format will be 3rd june 2004, 03-06-2004, june 3rd and so on...." The only way to do this (without implementing a "mind reader") is to determine in advance all the possible date formats that can occur in your input, and try to interpret each date using each of those formats until one of them passes.
It's easy enough to handle june 3rd vs 3 june vs 3rd june, but 6/3 vs 3/6, of course, is ambiguous. -
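The try-each-format approach described above is easy to sketch; Python shown for brevity (the format list is illustrative and would need extending for kareem's real inputs, and the ambiguous 6/3 vs 3/6 case remains unsolvable either way):

```python
import re
from datetime import datetime

# Candidate formats, tried in order; the first successful parse wins.
FORMATS = ["%d-%m-%Y", "%d %B %Y", "%B %d %Y"]

def parse_date(text):
    """Try each known format until one parses; return None if all fail."""
    # Strip ordinal suffixes such as "3rd" -> "3" before matching.
    cleaned = re.sub(r"(\d+)(st|nd|rd|th)", r"\1", text.strip())
    for fmt in FORMATS:
        try:
            return datetime.strptime(cleaned, fmt)
        except ValueError:
            continue  # this format did not match; try the next one
    return None
```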
How can I use apps that I have purchased from the App Store for Mac on my iPad 2?
How can I use apps that I have purchased from the App Store for Mac on my iPad 2?
Thanks, varjak, it is too easy to fall into use of jargon assuming everyone understands what it means.
One additional point in all of this, not all "iDevice" apps run on every "iDevice". Some are customized for a specific device, or class of devices such as the iPod/iPhone with their smaller screen. Others for just the iPads for their larger screens. So the user needs to be sure they are getting an app that will work on their specific device.
And to Phoneboone, you do need to purchase the app again if it is for a different class of equipment. When you are looking at apps such as iWorks for the Mac and iWorks components for the iPad, the iPad versions are very inexpensive. Pages is only $10 (US) which is really cheap for such a good word processor that reads and saves in the MS Word format.
Anyway, search for what you need and just enjoy the convenience of so many options for apps. -
How can I call Ajax and extract the data?
Hi all,
I don't know much about Ajax and I want to learn it, so my question is: how can I call Ajax and extract the data? For example, I have this JSON file:
{
  "mobile": [
    { "Name": "Micromax", "Model": "A310" },
    { "Name": "samsung", "Model": "grand 2" }
  ]
}
For the above example, how can I call Ajax? Please explain with a small example, because it will help me a lot.
Thanks & Regards,
Palsaran
Hi Palsaran,
As I understood it, your requirement is to POST data from the UI to the backend.
The code above is not correct if you want to use a model-based implementation.
In UI5 you don't need to make an explicit Ajax call; you can use models such as OData or JSON instead.
For your JSON example, you can use something like:
var oModel = new sap.ui.model.json.JSONModel();
var oData = "YOUR JSON DATA";
oModel.setData(oData);
oModel.loadData(yourURL/Entityset, "POST");
For ref, you can use the below link
JsDoc Report - SAP UI development Toolkit for HTML5 - API Reference - sap.ui.model.json.JSONModel
Thanks
Naveenraj -
How can I read a specific character from a text file?
Hi All!
I would like to read a specific character from a text file, e.g. the 2012th character in a text file with 7034 characters.
How can I do this?
Thanks
Johannes
Just use the skip(long) method of the input stream that reads the text file to skip over the desired number of bytes.
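In Python terms the same approach looks like the sketch below. One caveat worth hedging: skipping a byte stream skips bytes, so for multi-byte encodings you want to skip characters instead, which this sketch does by opening the file in text mode:

```python
def char_at(path, index):
    """Return the character at 0-based `index` in a text file."""
    with open(path, "r", encoding="utf-8") as f:
        f.read(index)     # skip the first `index` characters
        ch = f.read(1)    # the one character we want
    if ch == "":
        raise IndexError(f"file has fewer than {index + 1} characters")
    return ch

# The 2012th character is at 0-based index 2011:
# char_at("some.txt", 2011)
```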
-
I would like to plot data from a text file in the same way a media player does from a video file. I'm not sure how to create the pointer slide function. The VI could look something like the attached JPG.
Please, can some one help me?
Martin
Attachments:
Plotting from a text file like a media player example.jpg 61 KB
Hi Martin,
I am not really sure what you want!
I think you want to display only a part of the values you read.
So what you can do:
Write all the values into an array. The size of the array is the maximum value of the slide bar.
Now you can select a part of the array (e.g. values from 100 to 200) and display it with a graph.
The other option is to use the history function of the graphs.
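The array-plus-slider idea is language-independent; a minimal Python sketch of just the windowing logic (all names here are illustrative):

```python
def window(values, start, width):
    """Return the slice of `values` that a slider at `start` should show."""
    start = max(0, min(start, len(values) - width))  # clamp to a valid range
    return values[start:start + width]

samples = list(range(1000))        # stand-in for the data read from the file
view = window(samples, 100, 100)   # values 100..199, as in the example above
```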
regards
timo -
Can anyone suggest how I can use a relative path inside an SSIS package to access the config file?
Can anyone suggest how I can use a relative path inside an SSIS package to access the config file? Please help me as it's urgent. Thanks for your help in advance.
Hi Jay,
SSIS can only recognize the absolute path of an XML configuration file; relative paths are not supported. Furthermore, if the XML configuration file has already been generated, we can use the Environment Variable package configuration type instead, so that the SSIS runtime automatically looks for the configuration file at the path defined in the environment variable. This is convenient when we need to deploy a package to a different environment: we only need to define the environment variable for package configurations once on each server, and then the variable can be used by all the packages on that server.
Regards,
Mike Yin
TechNet Community Support -
How can I use the "Correct camera distortion" filter and process multiple files in PSE 11?
How can I use the "Correct camera distortion" filter and process multiple files in PSE 11?
Did you check the help pages for Correct Camera Distortion and Process Multiple Files?
Correct Camera Distortion: http://helpx.adobe.com/photoshop-elements/using/retouching-correcting.html#main-pars_heading_5
Process multiple files: http://help.adobe.com/en_US/photoshopelements/using/WS287f927bd30d4b1f89cffc612e28adab65-7fff.html#WS287f927bd30d4b1f89cffc612e28adab65-7ff6 -
How can I use my apple ID to buy from different app stores (different countries)?
You can only use a country's store if you are in that country and have a billing address in that country on your account. Are you in the country that you want to buy content from? If you are, then you can update the billing address on your account to be your address there (if using a credit card, it will need to have been issued by a bank in that country) via the Store > View Account menu option in your computer's iTunes.
-
Use LINQ to extract the data from a file...
Hi,
I have created a subprocedure CreateEventList which populates an EventsComboBox with the current day's events (if any).
I need to store the events in a generic List communityEvents, which is a collection of CommunityEvent objects. This List needs to be created and assigned to the instance variable communityEvents.
This method should call the helper method ExtractData, which will use LINQ to extract the data from my file. The specified day is the date selected on the calendar control. This method will be called from CreateEventList.
This method should clear all data from List communityEvents.
A LINQ query that creates CommunityEvent objects should select the events scheduled for the selected day from the file. The selected events should be added to List communityEvents.
See code below.
Thanks,
public class CommunityEvent
{
    private int day;
    public int Day
    {
        get { return day; }
        set { day = value; }
    }
    private string time;
    public string Time
    {
        get { return time; }
        set { time = value; }
    }
    private decimal price;
    public decimal Price
    {
        get { return price; }
        set { price = value; }
    }
    private string name;
    public string Name
    {
        get { return name; }
        set { name = value; }
    }
    private string description;
    public string Description
    {
        get { return description; }
        set { description = value; }
    }
}
private void eventComboBox_SelectedIndexChanged(object sender, EventArgs e)
{
    if (eventComboBox.SelectedIndex == 0)
        descriptionTextBox.Text = "2.30PM. Price 12.50. Take part in creating various types of Arts & Crafts at this fair.";
    if (eventComboBox.SelectedIndex == 1)
        descriptionTextBox.Text = "4.30PM. Price 00.00. Take part in cleaning the local Park.";
    if (eventComboBox.SelectedIndex == 2)
        descriptionTextBox.Text = "1.30PM. Price 10.00. Take part in selling goods.";
    if (eventComboBox.SelectedIndex == 3)
        descriptionTextBox.Text = "12.30PM. Price 10.00. Take part in a game of rounders in the local Park.";
    if (eventComboBox.SelectedIndex == 4)
        descriptionTextBox.Text = "11.30PM. Price 15.00. Take part in an Egg & Spoon Race in the local Park";
    if (eventComboBox.SelectedIndex == 5)
        descriptionTextBox.Text = "No Events today.";
}
Any help here would be great.
Look, you have to make the file an XML file type -- Somefilename.xml.
http://www.xmlfiles.com/xml/xml_intro.asp
You can use XML Notepad to make the XML and save the file.
http://support.microsoft.com/kb/296560
Or you can just use Notepad (standard) if you know the basics of how to create XML, which is just text data that can be created and saved in a text file and that represents data.
http://www.codeproject.com/Tips/522456/Reading-XML-using-LINQ
You can do a (select new CommunityEvent) just like the example does a (select new FileToWatch), and load the XML data into the CommunityEvent properties.
So you need to learn how to make a manual XML text file with XML data in it, and you need to learn how to use LINQ to read the XML. LINQ is not going to work against some flat text file you created. There are plenty of examples out on Bing and Google on how to use LINQ-to-XML.
http://en.wikipedia.org/wiki/Language_Integrated_Query
<copied>
LINQ extends the language by the addition of query expressions, which are akin to SQL statements, and can be used to conveniently extract and process data from arrays, enumerable classes, XML documents, relational databases, and third-party data sources. Other uses, which utilize query expressions as a general framework for readably composing arbitrary computations, include the construction of event handlers or monadic parsers.
<end>
-
How can I return a big amount of data from a stored procedure?
How can I return a big amount of data from a stored procedure in an efficient way?
For example, not by using a cursor to go through all the rows and then assigning the values to variables.
Thanks in advance!
Let's see if I'm able to explain myself...
I have a SQL query that returns about 190,000 rows (in the best of the options).
I have something like this in my stored procedure:
FOR REC IN trn_adj_no
LOOP
REC_PIPE.INSTANCE_ID:=REC.INSTANCE_ID;
REC_PIPE.PROCESS_ID:=REC.PROCESS_ID;
REC_PIPE.ENTITY_TYPE:='TRANSACTION';
REC_PIPE.CRES_FILENAME:=REC.CRES_FILENAME;
REC_PIPE.CHUNK_REVISION_SK:=P_CHUNK_REVISION_SK;
REC_PIPE.CHUNK_ID:=trim(p_chunk_id);
REC_PIPE.FCL_SK:=REC.FCL_SK;
REC_PIPE.FCL_TRADE_SK:=REC.FCL_TRADE_SK;
REC_PIPE.FCL_VALUATION_SK:=REC.FCL_VALUATION_SK;
REC_PIPE.FCL_BUSINESS_GROUP:=REC.FCL_BUSINESS_GROUP;
REC_PIPE.ABS_TRN_CD:=REC.ABS_TRN_CD;
REC_PIPE.ACC_INTEREST_IMP_LOAN_CCY_IFRS:=REC.ACC_INTEREST_IMP_LOAN_CCY_IFRS;
REC_PIPE.ACC_INTEREST_IMP_LOAN_IFRS:=REC.ACC_INTEREST_IMP_LOAN_IFRS;
REC_PIPE.ACCRUED_INT_AMT:=REC.ACCRUED_INT_AMT;
REC_PIPE.ACCRUED_INT_CCY_CD:=REC.ACCRUED_INT_CCY_CD;
REC_PIPE.AMORT_ID:=REC.AMORT_ID;
REC_PIPE.AMORT_IND:=REC.AMORT_IND;
REC_PIPE.ATI:=REC.ATI;
REC_PIPE.ATI_2:=REC.ATI_2;
REC_PIPE.ATI_2_AMT:=REC.ATI_2_AMT;
REC_PIPE.ATI_2_CCY_CD:=REC.ATI_2_CCY_CD;
REC_PIPE.ATI_AMT:=REC.ATI_AMT;
REC_PIPE.ATI_CCY_CD:=REC.ATI_CCY_CD;
REC_PIPE.AVG_MTH_DUE_AMT:=REC.AVG_MTH_DUE_AMT;
REC_PIPE.AVG_MTH_DUE_CCY_CD:=REC.AVG_MTH_DUE_CCY_CD;
REC_PIPE.AVG_YTD_DUE_AMT:=REC.AVG_YTD_DUE_AMT;
REC_PIPE.AVG_YTD_DUE_CCY_CD:=REC.AVG_YTD_DUE_CCY_CD;
REC_PIPE.BAL_SHEET_EXP_AMT:=REC.BAL_SHEET_EXP_AMT;
REC_PIPE.BAL_SHEET_EXP_AMT_IFRS:=REC.BAL_SHEET_EXP_AMT_IFRS;
REC_PIPE.BAL_SHEET_EXP_CCY_CD:=REC.BAL_SHEET_EXP_CCY_CD;
REC_PIPE.BAL_SHEET_EXP_CCY_CD_IFRS:=REC.BAL_SHEET_EXP_CCY_CD_IFRS;
REC_PIPE.BAL_SHEET_EXP_EUR_AMT:=REC.BAL_SHEET_EXP_EUR_AMT;
REC_PIPE.BAL_SHEET_EXP_EUR_AMT_IFRS:=REC.BAL_SHEET_EXP_EUR_AMT_IFRS;
REC_PIPE.BEN_CCDB_ID:=REC.BEN_CCDB_ID;
REC_PIPE.BEN_CD:=REC.BEN_CD;
REC_PIPE.BEN_SRC_CD:=REC.BEN_SRC_CD;
REC_PIPE.BOOK_CD:=REC.BOOK_CD;
REC_PIPE.BOOK_TYPE_CD:=REC.BOOK_TYPE_CD;
REC_PIPE.BOOK_VAL_AMT:=REC.BOOK_VAL_AMT;
REC_PIPE.BOOK_VAL_AMT_IFRS:=REC.BOOK_VAL_AMT_IFRS;
REC_PIPE.BOOK_VAL_CCY_CD:=REC.BOOK_VAL_CCY_CD;
REC_PIPE.BOOK_VAL_CCY_CD_IFRS:=REC.BOOK_VAL_CCY_CD_IFRS;
REC_PIPE.BOOK_VAL_US_GAAP_AMT:=REC.BOOK_VAL_US_GAAP_AMT;
REC_PIPE.BRANCH_ID:=REC.BRANCH_ID;
REC_PIPE.BRANCH_LOC_CD:=REC.BRANCH_LOC_CD;
REC_PIPE.BS_COMPEN_CD:=REC.BS_COMPEN_CD;
REC_PIPE.BUS_AREA_UBR_ID:=REC.BUS_AREA_UBR_ID;
REC_PIPE.CA_CD:=REC.CA_CD;
REC_PIPE.CAD_RISK_WEIGHT_PCT:=REC.CAD_RISK_WEIGHT_PCT;
REC_PIPE.CASH_SETTLE_IND:=REC.CASH_SETTLE_IND;
REC_PIPE.CLS_FLG:=REC.CLS_FLG;
REC_PIPE.COB_DT:=REC.COB_DT;
REC_PIPE.COL_AGMENT_SRC_CD:=REC.COL_AGMENT_SRC_CD;
REC_PIPE.CONSOLIDATION_PCT:=REC.CONSOLIDATION_PCT;
REC_PIPE.CONTRACT_AMT:=REC.CONTRACT_AMT;
REC_PIPE.CONTRACT_COUNT:=REC.CONTRACT_COUNT;
REC_PIPE.COST_UNSEC_WORKOUT_AMT:=REC.COST_UNSEC_WORKOUT_AMT;
REC_PIPE.CPTY_CCDB_ID:=REC.CPTY_CCDB_ID;
REC_PIPE.CPTY_CD:=REC.CPTY_CD;
REC_PIPE.CPTY_LONG_NAME:=REC.CPTY_LONG_NAME;
REC_PIPE.CPTY_SRC_PROXY_UBR_ID:=REC.CPTY_SRC_PROXY_UBR_ID;
REC_PIPE.CPTY_TYPE_CD:=REC.CPTY_TYPE_CD;
REC_PIPE.CPTY_UBR_ID:=REC.CPTY_UBR_ID;
REC_PIPE.CREDIT_LINE_NET_IND:=REC.CREDIT_LINE_NET_IND;
REC_PIPE.CRES_SYS_ID:=REC.CRES_SYS_ID;
REC_PIPE.CTRY_RISK_PROVISION_BAL_AMT:=REC.CTRY_RISK_PROVISION_BAL_AMT;
REC_PIPE.CUR_NOTIONAL_AMT:=REC.CUR_NOTIONAL_AMT;
REC_PIPE.CUR_NOTIONAL_CCY_CD:=REC.CUR_NOTIONAL_CCY_CD;
REC_PIPE.CUR_NOTIONAL_CCY_CD_IFRS:=REC.CUR_NOTIONAL_CCY_CD_IFRS;
REC_PIPE.CUR_NOTIONAL_AMT_IFRS:=REC.CUR_NOTIONAL_AMT_IFRS;
REC_PIPE.DB_ENTITY_TRANSFER_ASSETS_CD:=REC.DB_ENTITY_TRANSFER_ASSETS_CD;
REC_PIPE.DEAL_TYPE_CD:=REC.DEAL_TYPE_CD;
REC_PIPE.DECOMPOSITION_IDENTIFIER:=REC.DECOMPOSITION_IDENTIFIER;
REC_PIPE.DEF_LOAN_FEE_IFRS:=REC.DEF_LOAN_FEE_IFRS;
REC_PIPE.DEF_LOAN_FEE_USGAAP:=REC.DEF_LOAN_FEE_USGAAP;
REC_PIPE.DELIVERY_DT:=REC.DELIVERY_DT;
REC_PIPE.DERIV_NOT_DT:=REC.DERIV_NOT_DT;
REC_PIPE.DERIV_NOT_IND:=REC.DERIV_NOT_IND;
REC_PIPE.DESK:=REC.DESK;
REC_PIPE.DIVISION_UBR_ID:=REC.DIVISION_UBR_ID;
REC_PIPE.ENTITY_CCDB_ID:=REC.ENTITY_CCDB_ID;
REC_PIPE.FAC_CD:=REC.FAC_CD;
REC_PIPE.FAC_LIMIT_CD:=REC.FAC_LIMIT_CD;
REC_PIPE.FAC_LIMIT_SRC_CD:=REC.FAC_LIMIT_SRC_CD;
REC_PIPE.FAC_SRC_CD:=REC.FAC_SRC_CD;
REC_PIPE.FAIR_VAL_CD_IFRS:=REC.FAIR_VAL_CD_IFRS;
REC_PIPE.FED_CLASS_CD:=REC.FED_CLASS_CD;
REC_PIPE.FIRST_RISK_PROVISION_DT:=REC.FIRST_RISK_PROVISION_DT;
REC_PIPE.FV_IMP_LOSS_CRED_CCY_IFRS:=REC.FV_IMP_LOSS_CRED_CCY_IFRS;
REC_PIPE.FV_IMP_LOSS_CRED_IFRS:=REC.FV_IMP_LOSS_CRED_IFRS;
REC_PIPE.FV_IMP_LOSS_CRED_YTD_CCY_IFRS:=REC.FV_IMP_LOSS_CRED_YTD_CCY_IFRS;
REC_PIPE.FV_IMP_LOSS_CRED_YTD_IFRS:=REC.FV_IMP_LOSS_CRED_YTD_IFRS;
REC_PIPE.GEN_RISK_PROVISION_BAL_AMT:=REC.GEN_RISK_PROVISION_BAL_AMT;
REC_PIPE.GL_PROFIT_CENTRE:=REC.GL_PROFIT_CENTRE;
REC_PIPE.GLOBAL_GL_CD:=REC.GLOBAL_GL_CD;
REC_PIPE.IFRS_POSITION:=REC.IFRS_POSITION;
REC_PIPE.IFRS_POSITION_AGGR:=REC.IFRS_POSITION_AGGR;
REC_PIPE.IMP_DT:=REC.IMP_DT;
REC_PIPE.IMP_METHOD_CD:=REC.IMP_METHOD_CD;
REC_PIPE.INT_COMPEN_CD:=REC.INT_COMPEN_CD;
REC_PIPE.INTER_IND:=REC.INTER_IND;
REC_PIPE.INTERCO_IND:=REC.INTERCO_IND;
REC_PIPE.INTERCO_IND_IFRS:=REC.INTERCO_IND_IFRS;
REC_PIPE.LOAN_PUR_FAIR_VAL_ADJ_AMT:=REC.LOAN_PUR_FAIR_VAL_ADJ_AMT;
REC_PIPE.LOCAL_GL_CD:=REC.LOCAL_GL_CD;
REC_PIPE.LOWEST_LEVEL_UBR_ID:=REC.LOWEST_LEVEL_UBR_ID;
REC_PIPE.LTD_FEES_TO_PRINCIPAL_AMT:=REC.LTD_FEES_TO_PRINCIPAL_AMT;
REC_PIPE.MA_CD:=REC.MA_CD;
REC_PIPE.MA_SPL_NOTE:=REC.MA_SPL_NOTE;
REC_PIPE.MA_SRC_CD:=REC.MA_SRC_CD;
REC_PIPE.MA_TYPE_CD:=REC.MA_TYPE_CD;
REC_PIPE.MAN_LINK_ID:=REC.MAN_LINK_ID;
REC_PIPE.MASTER_MA_CD:=REC.MASTER_MA_CD;
REC_PIPE.MATURITY_DT:=REC.MATURITY_DT;
REC_PIPE.MAX_OUT_AMT:=REC.MAX_OUT_AMT;
REC_PIPE.MAX_OUT_CCY_CD:=REC.MAX_OUT_CCY_CD;
REC_PIPE.MTM_AMT:=REC.MTM_AMT;
REC_PIPE.MTM_AMT_IFRS:=REC.MTM_AMT_IFRS;
REC_PIPE.MTM_CCY_CD:=REC.MTM_CCY_CD;
REC_PIPE.MTM_CCY_CD_IFRS:=REC.MTM_CCY_CD_IFRS;
REC_PIPE.NEW_RISK_PROVISION_AMT:=REC.NEW_RISK_PROVISION_AMT;
REC_PIPE.NON_PERF_IND:=REC.NON_PERF_IND;
REC_PIPE.NON_PERF_RCV_INT_AMT:=REC.NON_PERF_RCV_INT_AMT;
REC_PIPE.OPT_TYPE_CD:=REC.OPT_TYPE_CD;
REC_PIPE.ORIG_NOTIONAL_AMT:=REC.ORIG_NOTIONAL_AMT;
REC_PIPE.ORIG_NOTIONAL_CCY_CD:=REC.ORIG_NOTIONAL_CCY_CD;
REC_PIPE.ORIG_TRN_CD:=REC.ORIG_TRN_CD;
REC_PIPE.ORIG_TRN_SRC_CD:=REC.ORIG_TRN_SRC_CD;
REC_PIPE.OTHER_CUR_NOTIONAL_AMT:=REC.OTHER_CUR_NOTIONAL_AMT;
REC_PIPE.OTHER_CUR_NOTIONAL_CCY_CD:=REC.OTHER_CUR_NOTIONAL_CCY_CD;
REC_PIPE.OTHER_ORIG_NOTIONAL_AMT:=REC.OTHER_ORIG_NOTIONAL_AMT;
REC_PIPE.OTHER_ORIG_NOTIONAL_CCY_CD:=REC.OTHER_ORIG_NOTIONAL_CCY_CD;
REC_PIPE.OVERDUE_AMT:=REC.OVERDUE_AMT;
REC_PIPE.OVERDUE_CCY_CD:=REC.OVERDUE_CCY_CD;
REC_PIPE.OVERDUE_DAY_COUNT:=REC.OVERDUE_DAY_COUNT;
REC_PIPE.PAY_CONTRACT_AMT:=REC.PAY_CONTRACT_AMT;
REC_PIPE.PAY_CONTRACT_CCY_CD:=REC.PAY_CONTRACT_CCY_CD;
REC_PIPE.PAY_FWD_AMT:=REC.PAY_FWD_AMT;
REC_PIPE.PAY_FWD_CCY_CD:=REC.PAY_FWD_CCY_CD;
REC_PIPE.PAY_INT_INTERVAL:=REC.PAY_INT_INTERVAL;
REC_PIPE.PAY_INT_RATE:=REC.PAY_INT_RATE;
REC_PIPE.PAY_INT_RATE_TYPE_CD:=REC.PAY_INT_RATE_TYPE_CD;
REC_PIPE.PAY_NEXT_FIX_CPN_DT:=REC.PAY_NEXT_FIX_CPN_DT;
REC_PIPE.PAY_UL_SEC_TYPE_CD:=REC.PAY_UL_SEC_TYPE_CD;
REC_PIPE.PAY_UL_ISS_CCDB_ID:=REC.PAY_UL_ISS_CCDB_ID;
REC_PIPE.PAY_UL_MTM_AMT:=REC.PAY_UL_MTM_AMT;
REC_PIPE.PAY_UL_MTM_CCY_CD:=REC.PAY_UL_MTM_CCY_CD;
REC_PIPE.PAY_UL_SEC_CCY_CD:=REC.PAY_UL_SEC_CCY_CD;
REC_PIPE.PAY_UL_SEC_CD:=REC.PAY_UL_SEC_CD;
REC_PIPE.PCP_TRANSACTION_SK:=REC.PCP_TRANSACTION_SK;
REC_PIPE.PROD_AREA_UBR_ID:=REC.PROD_AREA_UBR_ID;
REC_PIPE.QUOTA_PART_AMT:=REC.QUOTA_PART_AMT;
REC_PIPE.RCV_CONTRACT_AMT:=REC.RCV_CONTRACT_AMT;
REC_PIPE.RCV_CONTRACT_CCY_CD:=REC.RCV_CONTRACT_CCY_CD;
REC_PIPE.RCV_FWD_AMT:=REC.RCV_FWD_AMT;
REC_PIPE.RCV_FWD_CCY_CD:=REC.RCV_FWD_CCY_CD;
REC_PIPE.RCV_INT_RATE:=REC.RCV_INT_RATE;
REC_PIPE.RCV_INT_RATE_TYPE_CD:=REC.RCV_INT_RATE_TYPE_CD;
REC_PIPE.RCV_INT_INTERVAL:=REC.RCV_INT_INTERVAL;
REC_PIPE.RCV_NEXT_FIX_CPN_DT:=REC.RCV_NEXT_FIX_CPN_DT;
REC_PIPE.RCV_UL_ISS_CCDB_ID:=REC.RCV_UL_ISS_CCDB_ID;
REC_PIPE.RCV_UL_MTM_AMT:=REC.RCV_UL_MTM_AMT;
REC_PIPE.RCV_UL_MTM_CCY_CD:=REC.RCV_UL_MTM_CCY_CD;
REC_PIPE.RCV_UL_SEC_CCY_CD:=REC.RCV_UL_SEC_CCY_CD;
REC_PIPE.RCV_UL_SEC_CD:=REC.RCV_UL_SEC_CD;
REC_PIPE.RCV_UL_SEC_TYPE_CD:=REC.RCV_UL_SEC_TYPE_CD;
REC_PIPE.REC_SEC_AMT:=REC.REC_SEC_AMT;
REC_PIPE.REC_SEC_CCY_CD:=REC.REC_SEC_CCY_CD;
REC_PIPE.REC_UNSEC_AMT:=REC.REC_UNSEC_AMT;
REC_PIPE.REC_UNSEC_CCY_CD:=REC.REC_UNSEC_CCY_CD;
REC_PIPE.RECON_CD:=REC.RECON_CD;
REC_PIPE.RECON_SRC_CD:=REC.RECON_SRC_CD;
REC_PIPE.RECOV_AMT:=REC.RECOV_AMT;
REC_PIPE.RECOVERY_FLAG:=REC.RECOVERY_FLAG;
REC_PIPE.REGULATORY_NET_IND:=REC.REGULATORY_NET_IND;
REC_PIPE.REL_RISK_PROVISION_AMT:=REC.REL_RISK_PROVISION_AMT;
REC_PIPE.RESPONSIBLE_BRANCH_CCDB_ID:=REC.RESPONSIBLE_BRANCH_CCDB_ID;
REC_PIPE.RESPONSIBLE_BRANCH_ID:=REC.RESPONSIBLE_BRANCH_ID;
REC_PIPE.RESPONSIBLE_BRANCH_LOC_CD:=REC.RESPONSIBLE_BRANCH_LOC_CD;
REC_PIPE.RESTRUCT_IND:=REC.RESTRUCT_IND;
REC_PIPE.RISK_END_DT:=REC.RISK_END_DT;
REC_PIPE.RISK_ENGINES_METHOD_USED:=REC.RISK_ENGINES_METHOD_USED;
REC_PIPE.RISK_MITIGATING_PCT:=REC.RISK_MITIGATING_PCT;
REC_PIPE.RISK_PROVISION_BAL_AMT:=REC.RISK_PROVISION_BAL_AMT;
REC_PIPE.RISK_PROVISION_CCY_CD:=REC.RISK_PROVISION_CCY_CD;
REC_PIPE.RISK_WRITE_OFF_AMT:=REC.RISK_WRITE_OFF_AMT;
REC_PIPE.RISK_WRITE_OFF_LIFE_DT_AMT:=REC.RISK_WRITE_OFF_LIFE_DT_AMT;
REC_PIPE.RT_BUS_AREA_UBR_ID:=REC.RT_BUS_AREA_UBR_ID;
REC_PIPE.RT_CB_CCDB_ID:=REC.RT_CB_CCDB_ID;
REC_PIPE.RT_COV_CD:=REC.RT_COV_CD;
REC_PIPE.RT_COV_SRC_CD:=REC.RT_COV_SRC_CD;
REC_PIPE.RT_COV_TYPE_CD:=REC.RT_COV_TYPE_CD;
REC_PIPE.RT_COV_TYPE_CD_IFRS:=REC.RT_COV_TYPE_CD_IFRS;
REC_PIPE.RT_TYPE_CD:=REC.RT_TYPE_CD;
REC_PIPE.RT_TYPE_CD_IFRS:=REC.RT_TYPE_CD_IFRS;
REC_PIPE.SENIORITY:=REC.SENIORITY;
REC_PIPE.SETTLE_STATUS_ID:=REC.SETTLE_STATUS_ID;
REC_PIPE.SETTLEMENT_CD:=REC.SETTLEMENT_CD;
REC_PIPE.SETTLEMENT_DT:=REC.SETTLEMENT_DT;
REC_PIPE.SOX_REF:=REC.SOX_REF;
REC_PIPE.SPE_IND:=REC.SPE_IND;
REC_PIPE.SPL_TREAT_IND_IFRS:=REC.SPL_TREAT_IND_IFRS;
REC_PIPE.SRC_PROD_TYPE_CD:=REC.SRC_PROD_TYPE_CD;
REC_PIPE.SRC_PROD_TYPE_CD_IFRS:=REC.SRC_PROD_TYPE_CD_IFRS;
REC_PIPE.SRC_PROXY_UBR_CD:=REC.SRC_PROXY_UBR_CD;
REC_PIPE.START_DT:=REC.START_DT;
REC_PIPE.STRATEGIC_LEND_IND:=REC.STRATEGIC_LEND_IND;
REC_PIPE.STRIKE_FWD_FUT_PRICE_AMT:=REC.STRIKE_FWD_FUT_PRICE_AMT;
REC_PIPE.STRIKE_FWD_FUT_PRICE_CCY_CD:=REC.STRIKE_FWD_FUT_PRICE_CCY_CD;
REC_PIPE.TERMINATION_LOAN_DT:=REC.TERMINATION_LOAN_DT;
REC_PIPE.TRAD_ASSET_CD_IFRS:=REC.TRAD_ASSET_CD_IFRS;
REC_PIPE.TRADE_DT:=REC.TRADE_DT;
REC_PIPE.TRADE_ENTITY_CCDB_ID:=REC.TRADE_ENTITY_CCDB_ID;
REC_PIPE.TRADE_ENTITY_LOC_CD:=REC.TRADE_ENTITY_LOC_CD;
REC_PIPE.TRADE_RELATED_IND:=REC.TRADE_RELATED_IND;
REC_PIPE.TRADE_STATUS:=REC.TRADE_STATUS;
REC_PIPE.TRADING_ENTITY_LOC_CD:=REC.TRADING_ENTITY_LOC_CD;
REC_PIPE.TRAN_MATCH_CD:=REC.TRAN_MATCH_CD;
REC_PIPE.TRN_CD:=REC.TRN_CD;
REC_PIPE.TRN_LINK_CD:=REC.TRN_LINK_CD;
REC_PIPE.TRN_SRC_CD:=REC.TRN_SRC_CD;
REC_PIPE.TRN_TYPE_CD:=REC.TRN_TYPE_CD;
REC_PIPE.TRN_VERSION:=REC.TRN_VERSION;
REC_PIPE.UL_DELIVERY_DT:=REC.UL_DELIVERY_DT;
REC_PIPE.UL_MATURITY_DT:=REC.UL_MATURITY_DT;
REC_PIPE.UNDLY_ISS_NAME:=REC.UNDLY_ISS_NAME;
REC_PIPE.UNEARNED_INCOME_CCY_CD:=REC.UNEARNED_INCOME_CCY_CD;
REC_PIPE.UNEARNED_INCOME_DIS_LOAN_AMT:=REC.UNEARNED_INCOME_DIS_LOAN_AMT;
REC_PIPE.US_GAAP_POSITION_AGGR:=REC.US_GAAP_POSITION_AGGR;
REC_PIPE.VALUATION_DT:=REC.VALUATION_DT;
REC_PIPE.XCHG_CTRY_CD:=REC.XCHG_CTRY_CD;
REC_PIPE.YEAR01_NOTIONAL_AMT:=REC.YEAR01_NOTIONAL_AMT;
REC_PIPE.YEAR02_NOTIONAL_AMT:=REC.YEAR02_NOTIONAL_AMT;
REC_PIPE.YEAR03_NOTIONAL_AMT:=REC.YEAR03_NOTIONAL_AMT;
REC_PIPE.YEAR04_NOTIONAL_AMT:=REC.YEAR04_NOTIONAL_AMT;
REC_PIPE.YEAR05_NOTIONAL_AMT:=REC.YEAR05_NOTIONAL_AMT;
REC_PIPE.YEAR06_NOTIONAL_AMT:=REC.YEAR06_NOTIONAL_AMT;
REC_PIPE.YEAR07_NOTIONAL_AMT:=REC.YEAR07_NOTIONAL_AMT;
REC_PIPE.YEAR08_NOTIONAL_AMT:=REC.YEAR08_NOTIONAL_AMT;
REC_PIPE.YEAR09_NOTIONAL_AMT:=REC.YEAR09_NOTIONAL_AMT;
REC_PIPE.YEAR10_NOTIONAL_AMT:=REC.YEAR10_NOTIONAL_AMT;
REC_PIPE.YTD_PL_TRADE_AMT:=REC.YTD_PL_TRADE_AMT;
REC_PIPE.FCL_VALIDATED_IND:=REC.FCL_VALIDATED_IND;
REC_PIPE.FCL_EXCLUDED_IND:=REC.FCL_EXCLUDED_IND;
REC_PIPE.FCL_SOURCE_SYSTEM:=REC.FCL_SOURCE_SYSTEM;
REC_PIPE.FCL_CREATE_TIMESTAMP:=REC.FCL_CREATE_TIMESTAMP;
REC_PIPE.FCL_UPDATE_TIMESTAMP:=REC.FCL_UPDATE_TIMESTAMP;
REC_PIPE.FCL_TRADE_LOAD_DATE:=REC.FCL_TRADE_LOAD_DATE;
REC_PIPE.FCL_VALUATION_LOAD_DATE:=REC.FCL_VALUATION_LOAD_DATE;
REC_PIPE.FCL_MATRIX_TRADE_TYPE:=REC.FCL_MATRIX_TRADE_TYPE;
REC_PIPE.PCP_GCDS_REVISION:=REC.PCP_GCDS_REVISION;
REC_PIPE.FCL_UTP_BOOK_CD:=REC.FCL_UTP_BOOK_CD;
REC_PIPE.FCL_MIS_RISK_BOOK_CD:=REC.FCL_MIS_RISK_BOOK_CD;
REC_PIPE.ALD_STATUS:=REC.ALD_STATUS;
REC_PIPE.FCAT_OPERATION:=REC.FCAT_OPERATION;
REC_PIPE.INTERNAL_INTRA_IND_IFRS:=REC.INTERNAL_INTRA_IND_IFRS;
REC_PIPE.FCL_USAGE_IND:=REC.FCL_USAGE_IND;
REC_PIPE.QUICK:=REC.QUICK;
REC_PIPE.SEDOL:=REC.SEDOL;
REC_PIPE.CUSIP:=REC.CUSIP;
REC_PIPE.ISIN:=REC.ISIN;
REC_PIPE.QUANTITY:=REC.QUANTITY;
REC_PIPE.RIC:=REC.RIC;
REC_PIPE.DESCRIPTION:=REC.DESCRIPTION;
REC_PIPE.RESET_DATE:=REC.RESET_DATE;
REC_PIPE.PRICE:=REC.PRICE;
REC_PIPE.EXCHANGERATE:=REC.EXCHANGERATE;
REC_PIPE.PARA_PROD_TYPE_ID:=REC.PARA_PROD_TYPE_ID;
REC_PIPE.EFF_INT_RATE:=REC.EFF_INT_RATE;
REC_PIPE.ORIG_FEE_UNREALISED_AMT:=REC.ORIG_FEE_UNREALISED_AMT;
REC_PIPE.ASSET_TYPE:=REC.ASSET_TYPE;
REC_PIPE.IMP_IND_IFRS:=REC.IMP_IND_IFRS;
REC_PIPE.NOT_AMT_REDEMPTION_TO_1YR:=REC.NOT_AMT_REDEMPTION_TO_1YR;
REC_PIPE.NOT_AMT_REDEMPTION_TO_5YR:=REC.NOT_AMT_REDEMPTION_TO_5YR;
REC_PIPE.NOT_AMT_REDEMPTION_OVER_5YR:=REC.NOT_AMT_REDEMPTION_OVER_5YR;
REC_PIPE.CASH_LTD:=REC.CASH_LTD;
REC_PIPE.CASH_LTD_CCY:=REC.CASH_LTD_CCY;
REC_PIPE.FCL_FACILITY_SK:=REC.FCL_FACILITY_SK;
REC_PIPE.DILUTION_RISK_CRITERIA:=REC.DILUTION_RISK_CRITERIA;
REC_PIPE.FCL_RMS_RUNID:=REC.FCL_RMS_RUNID;
REC_PIPE.BSTYPE:=REC.BSTYPE;
REC_PIPE.YTD_ACCR_DIV:=REC.YTD_ACCR_DIV;
REC_PIPE.YTD_DIV:=REC.YTD_DIV;
REC_PIPE.ACC_ADJ_YTD:=REC.ACC_ADJ_YTD;
REC_PIPE.FV_RLZD_PL:=REC.FV_RLZD_PL;
REC_PIPE.ITD_PL:=REC.ITD_PL;
REC_PIPE.YTD_RLZD_PL:=REC.YTD_RLZD_PL;
REC_PIPE.YTD_UNRLZD_PL:=REC.YTD_UNRLZD_PL;
REC_PIPE.TRN_BAS_LGD:=REC.TRN_BAS_LGD;
REC_PIPE.TRAD_YTD_RLZD_PL:=REC.TRAD_YTD_RLZD_PL;
REC_PIPE.TRAD_YTD_UNRLZD_PL:=REC.TRAD_YTD_UNRLZD_PL;
REC_PIPE.UNADJUSTED_PV:=REC.UNADJUSTED_PV;
REC_PIPE.UNADJUSTED_REALISED_PL:=REC.UNADJUSTED_REALISED_PL;
REC_PIPE.UNADJUSTED_UNREALISED_PL:=REC.UNADJUSTED_UNREALISED_PL;
REC_PIPE.GRC_PROD_TYPE_ID:=REC.GRC_PROD_TYPE_ID;
REC_PIPE.GRC_PROD_TYPE_ID_IFRS:=REC.GRC_PROD_TYPE_ID_IFRS;
REC_PIPE.CUR_FEE:=REC.CUR_FEE;
REC_PIPE.SEC_IND:=REC.SEC_IND;
REC_PIPE.INVEST_ASSETS_PORT:=REC.INVEST_ASSETS_PORT;
REC_PIPE.RISK_PROVISION_START_BAL_AMT:=REC.RISK_PROVISION_START_BAL_AMT;
REC_PIPE.GEN_RISK_PRV_START_BAL_AMT:=REC.GEN_RISK_PRV_START_BAL_AMT;
REC_PIPE.GEN_RISK_PROVISION_NEW:=REC.GEN_RISK_PROVISION_NEW;
REC_PIPE.GEN_RISK_PROVISION_REL:=REC.GEN_RISK_PROVISION_REL;
REC_PIPE.ACQUIRED_IND:=REC.ACQUIRED_IND;
REC_PIPE.CRED_NET_MNA:=REC.CRED_NET_MNA;
REC_PIPE.SEC_IFRS_VUE_ADJ:=REC.SEC_IFRS_VUE_ADJ;
REC_PIPE.YEAR00_NOTIONAL_AMT:=REC.YEAR00_NOTIONAL_AMT;
REC_PIPE.MAX_OVERDUE_DAYS:=REC.MAX_OVERDUE_DAYS;
REC_PIPE.MIS_CD:=REC.MIS_CD;
REC_PIPE.MULTINAME_POOL_SK:=REC.MULTINAME_POOL_SK;
REC_PIPE.MULTINAME_CHUNK_REVISION_SK:=REC.MULTINAME_CHUNK_REVISION_SK;
REC_PIPE.MTRX_CALC_CNTRL_EPE:=REC.MTRX_CALC_CNTRL_EPE;
REC_PIPE.MTRX_CALC_CNTRL_PFE:=REC.MTRX_CALC_CNTRL_PFE;
REC_PIPE.FCL_TE_VALIDATED_IND:=REC.FCL_TE_VALIDATED_IND;
REC_PIPE.PCP_TE_DYNAMIC_KEY:=REC.PCP_TE_DYNAMIC_KEY;
REC_PIPE.AVG_COST:=REC.AVG_COST;
REC_PIPE.SEC_LONG_NAME:=REC.SEC_LONG_NAME;
REC_PIPE.ISS_CTRY_CD:=REC.ISS_CTRY_CD;
REC_PIPE.RISK_REPORTING_UBR:=REC.RISK_REPORTING_UBR;
REC_PIPE.RISK_COVERING_BRANCH_CCDB:=REC.RISK_COVERING_BRANCH_CCDB;
REC_PIPE.CLEARING_STATUS:=REC.CLEARING_STATUS;
REC_PIPE.FCL_CPTY_SK:=REC.FCL_CPTY_SK;
REC_PIPE.STRGRP:=REC.STRGRP;
REC_PIPE.OPEN_ENDED_FLAG:=REC.OPEN_ENDED_FLAG;
REC_PIPE.AVAILABILITY_IND:=REC.AVAILABILITY_IND;
REC_PIPE.ESCRW_FLG:=REC.ESCRW_FLG;
REC_PIPE.ESTABLISHED_RELP_FLG:=REC.ESTABLISHED_RELP_FLG;
REC_PIPE.GOV_GTY_AMT:=REC.GOV_GTY_AMT;
REC_PIPE.BUBA_CTRY_ID:=REC.BUBA_CTRY_ID;
REC_PIPE.GOV_GTY_CTRY_CD:=REC.GOV_GTY_CTRY_CD;
REC_PIPE.INTERNET_DPST_FLG:=REC.INTERNET_DPST_FLG;
REC_PIPE.LAST_ACTIVITY_DT:=REC.LAST_ACTIVITY_DT;
REC_PIPE.NOTICE_PERIOD_QTY:=REC.NOTICE_PERIOD_QTY;
REC_PIPE.OPR_RELP_FLG:=REC.OPR_RELP_FLG;
REC_PIPE.SIG_WD_PENALTY_FLG:=REC.SIG_WD_PENALTY_FLG;
REC_PIPE.TRN_ACT_FLG:=REC.TRN_ACT_FLG;
PIPE ROW(REC_PIPE);
END LOOP;
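For what it's worth, if the cursor rows already match the pipelined element type, the hundreds of field-by-field assignments above could be avoided entirely by fetching straight into a record of that type. A minimal sketch (the package, type, table, and column names here are hypothetical, not taken from the code above):

```sql
CREATE OR REPLACE PACKAGE trade_pkg AS
  TYPE trade_rec IS RECORD (trade_dt DATE, trn_cd VARCHAR2(20), price NUMBER);
  TYPE trade_tab IS TABLE OF trade_rec;
  FUNCTION get_trades RETURN trade_tab PIPELINED;
END trade_pkg;
/
CREATE OR REPLACE PACKAGE BODY trade_pkg AS
  FUNCTION get_trades RETURN trade_tab PIPELINED IS
    CURSOR c IS SELECT trade_dt, trn_cd, price FROM trades;
    rec trade_rec;
  BEGIN
    OPEN c;
    LOOP
      FETCH c INTO rec;      -- fetch straight into the element-type record
      EXIT WHEN c%NOTFOUND;
      PIPE ROW(rec);         -- no per-field copy needed
    END LOOP;
    CLOSE c;
    RETURN;
  END get_trades;
END trade_pkg;
/
```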
The stored procedure returns REC_PIPE.
I think this might not be efficient enough...
What do you think? -
How to read tab separated data from a text file using utl_file
Hi,
How to read tab separated data from a text file using utl_file?
I know if we use UTL_FILE.get_line we can read the whole line... but I need to read each tab separated value separately.
Thanks in advance...
Naveen
Naveen Nishad wrote:
How to read tab separated data from a text file using utl_file...
I know if we use UTL_FILE.get_line we can read the whole line... but I need to read each tab separated value separately.
If it's a text file, then UTL_FILE will only allow you to read it a line at a time. It is then up to you to split that string into its individual components (search for "split string" on this forum for methods).
If the text file contains a standard structure on each line, i.e. it is a fixed delimited structure, then you could use external tables to read the data instead.
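Putting the two together, a minimal sketch of reading a file line by line with UTL_FILE and splitting each line on tabs with INSTR/SUBSTR (the directory object DATA_DIR and the file name data.txt are hypothetical; substitute your own):

```sql
DECLARE
  f       UTL_FILE.FILE_TYPE;
  v_line  VARCHAR2(32767);
  v_field VARCHAR2(4000);
  v_pos   PLS_INTEGER;
  v_start PLS_INTEGER;
BEGIN
  f := UTL_FILE.FOPEN('DATA_DIR', 'data.txt', 'R');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(f, v_line);    -- read one whole line
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;    -- end of file
    END;
    -- walk the line, extracting each tab-separated field
    v_start := 1;
    LOOP
      v_pos := INSTR(v_line, CHR(9), v_start);   -- CHR(9) is the TAB character
      IF v_pos = 0 THEN
        v_field := SUBSTR(v_line, v_start);      -- last field on the line
      ELSE
        v_field := SUBSTR(v_line, v_start, v_pos - v_start);
      END IF;
      DBMS_OUTPUT.PUT_LINE(v_field);
      EXIT WHEN v_pos = 0;
      v_start := v_pos + 1;
    END LOOP;
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/
```

If the file has a fixed delimited structure on every line, the external-table route mentioned above avoids the manual parsing altogether (use `FIELDS TERMINATED BY X'09'` in the access parameters).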