Cash Management BAI data file record query

Hi All,
I need to upload and reconcile a BAI data file which consists of wire transactions and check payments.
For wire transactions, the client has mentioned that he will populate the transaction number (e.g. TRN#0357238745932) in the type 88 record, which is the -1 field of the type 16 record. He is also populating the check number in the last field of the type 16 record itself, with the type 88 record as its continuation record; i.e., the check number is in the -2 field of the type 16 record.
Now, if I need to reconcile these two kinds of transactions for a particular bank account number, I have to map BANK_TRX_NUMBER to position -1 for wire transactions and to position -2 for check payments, which I think is not possible at the same time (mapping two positions to a single column).
How can I achieve this?

Without customization it is not possible to map two fields to a single column. As an alternative, I have suggested writing a trigger to achieve this; it will not impact the standard Oracle process, because you are only updating interface data according to your requirement.
Once the data is loaded into the interface table, the same data is also available in the CE_STMT_INT_TMP table. You can query the records from the tmp table and use that data to update the interface table if it is not loading into the interface table correctly (or you can find the complete text in TRX_TEXT in the interface table).
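To make this concrete, here is a rough, untested sketch of such a trigger on the bank statement lines interface table. The transaction codes ('195' for incoming wires, '475' for paid checks) and the parsing of TRX_TEXT are assumptions; adjust them to the codes and layout your bank actually sends.

CREATE OR REPLACE TRIGGER xx_ce_stmt_lines_int_bi
BEFORE INSERT ON ce_statement_lines_interface
FOR EACH ROW
BEGIN
  IF :NEW.trx_code = '195' THEN
    -- Wire: assumed to carry the TRN after the literal 'TRN#' in the text
    :NEW.bank_trx_number :=
      REGEXP_SUBSTR(:NEW.trx_text, 'TRN#(\S+)', 1, 1, NULL, 1);
  ELSIF :NEW.trx_code = '475' THEN
    -- Check: assumed to be the second-to-last blank-delimited token (the -2 field)
    :NEW.bank_trx_number :=
      REGEXP_SUBSTR(:NEW.trx_text, '(\S+)\s+\S+\s*$', 1, 1, NULL, 1);
  END IF;
END;
/

With this in place, the mapping template can stay pointed at a single position (or at TRX_TEXT) and the trigger normalizes BANK_TRX_NUMBER per transaction type before AutoReconciliation runs. Note that a direct-path load would bypass row triggers; in that case, do the same fix-up as an UPDATE on the interface table after the load instead.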

Similar Messages

  • Cash Management BAI data file

    (Same question as above.)

    You may get a quicker response in the financials forum
    HTH
    Srini Chavali

  • SQL*Loader Sequential Data File Record Processing?

    If I use the conventional path, will SQL*Loader process a data file sequentially from top to bottom? I have a file comprised of header and detail records, with no value in the detail records that can be used to relate them to their header records. The only option is to derive a header value via a sequence (NEXTVAL) and then populate the detail records with the same value pulled from the same sequence (CURRVAL). But for this to work, SQL*Loader must process the file in exactly the same order in which the data was written to it. I've read through the 11g Oracle Database Utilities SQL*Loader sections looking for confirmation that this is what happens, but haven't found it, and I don't want to assume that SQL*Loader will always process the data file records sequentially.
    Thank you

    Oracle Support responded with the following statement.
    "Yes, SQL*LOADER process data file from top to bottom.
    This was touched in the note below:
    SQL*Loader - How to Load a Single Logical Record from Physical Records which Include Linefeeds (Doc ID 160093.1)"
    Jason
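    For reference, this header/detail pattern can be written directly in a conventional-path control file. A minimal, untested sketch, assuming a record-type flag in column 1 ('H' for header, 'D' for detail), a sequence named HDR_SEQ, and made-up table and column names:
    LOAD DATA
    INFILE 'mixed.dat'
    APPEND
    INTO TABLE header_stage
    WHEN (1:1) = 'H'
    (
      rec_type   FILLER POSITION(1:1) CHAR,
      header_id  EXPRESSION "hdr_seq.nextval",   -- new id for each header record
      hdr_text   POSITION(2:80) CHAR
    )
    INTO TABLE detail_stage
    WHEN (1:1) = 'D'
    (
      rec_type   FILLER POSITION(1:1) CHAR,
      header_id  EXPRESSION "hdr_seq.currval",   -- id of the most recent header
      dtl_text   POSITION(2:80) CHAR
    )
    This relies on the top-to-bottom processing confirmed above, and on the first record in the file being a header (CURRVAL is undefined until NEXTVAL has been referenced in the session).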

  • Cash management-Memo records

    hi
    Please clarify what a memo record and a payment advice are in Cash Management,
    and what their usage is for planning purposes. I have not understood this from the documentation/help given in SAP.
    What is planning in CM?
    Reward Points will be given for suitable answer.
    thanks
    s ap

    Hi,
    Memo records are used for planning purposes, typically for exceptional cases. Suppose the company has to pay an amount of Rs. 50 lakhs towards taxes next month; this can be entered as a memo record for next month. The memo record created is then reflected in the Cash Management reports (FF7A and FF7B), so when we look at next month's figures it will show in the cash position and liquidity planning.
    A memo record can be created using transaction code FF63.
    The Cash Management report can itself be regarded as a planning and forecasting report. Much of the detail in the Cash Management report flows from actual data such as purchase orders, invoices, payments and down payments; some of the data flows from planning via memo records.
    Hope it is useful
    Cheers
    V.Krishnan
    (Assign points if useful)

  • Memo records (Cash Management)

    Hello,
    What are memo records in Cash Management? Are they always posted manually, or can they be posted automatically?

    Hi,
    Memo records are information that helps you make your investment/borrowing decisions. You need memo records because this information is not yet available as a transaction in SAP. For example, you use SAP as your ERP but payroll is done in some other system; SAP will be updated only after the actual payroll file from the other system is uploaded, which normally happens only after the actual payment. But before that, you need this information in your Cash Position and Liquidity Forecast reports (FF7A or FF7B). To have it there, you can enter the payroll amount as a memo record.
    Memo records do not update your GL. They are informational only, and they should expire as soon as the corresponding transaction is entered in SAP.
    Payroll is just an example; this can be used in many different ways, e.g. for quarterly tax payments, purchase orders (if they are not maintained in SAP), etc. You have to be very careful in using memo records, because they may lead to double counting if you do not expire/archive them before the actual transaction is entered.
    Kalyan

  • Error ORA-20003: Cannot read file/...  in Cash Management?

    Dear All.
    I am getting this error in the Cash Management module during the bank statement load process:
    Error ORA-20003: Cannot read file /software/d01/oracle/interface/finance/inbound/email.9Nemail.13111507.3445704.cnv.
    How do I resolve this error?
    Thank you.

    Hi Lucy,
    The error is probably due to one of these reasons:
    1. There is no file at the location /software/d01/oracle/interface/finance/inbound/email.9Nemail.13111507.3445704.cnv.
    2. There is no read permission on the respective file.
    Please get the assistance from the DBA to check on this.
    Thanks &
    Best Regards,

  • How to load Unicode data files with fixed record lengths?

    Hi!
    To load Unicode data files with fixed record lengths (in terms of characters, not bytes!) using SQL*Loader manually, I found two ways:
    Alternative 1: one record per row
    SQL*Loader control file example (without POSITION, since POSITION always refers to bytes!):
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode.dat
    INTO TABLE STG_UNICODE
    TRUNCATE
    (
    A CHAR(2) ,
    B CHAR(6) ,
    C CHAR(2) ,
    D CHAR(1) ,
    E CHAR(4)
    )
    Datafile:
    001111112234444
    01NormalDExZWEI
    02ÄÜÖßêÊûÛxöööö
    03ÄÜÖßêÊûÛxöööö
    04üüüüüüÖÄxµôÔµ
    Alternative 2: variable-length records
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode_var.dat "VAR 4"
    INTO TABLE STG_UNICODE
    TRUNCATE
    (
    A CHAR(2) ,
    B CHAR(6) ,
    C CHAR(2) ,
    D CHAR(1) ,
    E CHAR(4)
    )
    Datafile:
    001501NormalDExZWEI002702ÄÜÖßêÊûÛxöööö002604üuüüüüÖÄxµôÔµ
    Problems:
    Implementing these two alternatives in OWB, I encounter the following problems:
    * How do I specify LENGTH SEMANTICS CHAR?
    * How do I suppress the POSITION definition?
    * How do I define a flat file with variable-length records, and how do I specify the number of bytes containing the length definition?
    Or is there another way that can be implemented using OWB?
    Any help is appreciated!
    Thanks,
    Carsten.

    Hi Carsten
    If you need to support the LENGTH SEMANTICS CHAR clause in an external table, one option is to use an unbound external table and capture the access parameters manually. To create an unbound external table, skip the selection of a base file in the external table wizard; then, when you edit the external table, you will get an Access Parameters tab where you can define the parameters. In 11gR2 the File-to-Oracle external table can also add this clause via an option.
    Cheers
    David
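    For illustration, the manually captured access parameters for the variable-length case might look something like this (an untested sketch; DATA_DIR is an assumed directory object, RECORDS VARIABLE 4 mirrors the "VAR 4" clause, and STRING SIZES ARE IN CHARACTERS is the external-table counterpart of LENGTH SEMANTICS CHAR; verify against your version's Utilities guide):
    CREATE TABLE stg_unicode_ext (
      a VARCHAR2(2 CHAR),
      b VARCHAR2(6 CHAR),
      c VARCHAR2(2 CHAR),
      d VARCHAR2(1 CHAR),
      e VARCHAR2(4 CHAR)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS VARIABLE 4
        CHARACTERSET UTF8
        STRING SIZES ARE IN CHARACTERS
        FIELDS (
          a CHAR(2),
          b CHAR(6),
          c CHAR(2),
          d CHAR(1),
          e CHAR(4)
        )
      )
      LOCATION ('unicode_var.dat')
    );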

  • How do I skip footer records in a data file through the SQL*Loader control file?

    hi,
    I am using SQL*Loader to load data from a data file, and I have written a control file for it.
    1) How do I skip the last 5 records (the footer records) of the data file? For the first 5 records we can use SKIP, but how do I achieve this for the last 5?
    2) Can I mention two data files in one control file? If so, what is the syntax? (Like we give INFILE with the path of the data file; can I mention two data files in the same control file?)
    3) If I have a data file with variable-length records (i.e. the 1st record with 200 characters, the 2nd with 150 characters and the 3rd with 180 characters), how do I load the data into the table? I.e., what would the syntax be in the control file?
    4) If I want to insert SYSDATE into the table through the control file, how do I do it?
    5) If I have variable-length records in the data file, with a first name, then whitespace, then a last name, how do I insert this value (first name and last name together) into a single column of the table? (I.e., how do you handle the whitespace between first name and last name in the data file?)
    Thanks in advance
    ram

    You should read the documentation about SQL*Loader.
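    That said, the Utilities guide covers all five points. As a quick, untested sketch of points 2, 4 and 5 (table and column names are made up): multiple INFILE clauses are allowed, SYSDATE is a keyword in the field list, and a positional field keeps embedded whitespace intact. For point 1 there is no built-in way to skip trailing records; the usual workarounds are a WHEN clause that filters out identifiable footer records, or preprocessing the file.
    LOAD DATA
    INFILE 'file1.dat'
    INFILE 'file2.dat'
    APPEND
    INTO TABLE name_stage
    (
      full_name  POSITION(1:60) CHAR,   -- keeps 'first last' together in one column
      load_date  SYSDATE                -- filled in by SQL*Loader at load time
    )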

  • Forms pulling Multiple Records from an XML Schema and XML data files - Adobe LiveCycle Designer

    I built a form in Adobe LiveCycle Designer with an XML schema and data file. The problem is with how the form renders the XML data file.
    I have a statement element that consists of about six fields (statementID, statementName, statementAddress, statementCountry, statementZip, statementDate, etc.) in the schema, and it allows for multiple iterations, so one XML data file can contain multiple statements. These fields allow for null values.
    But here's the problem: when any of the statements (say statement 2 of 6) has a null value in one of the fields, and the XML data file doesn't have a placeholder for that field (for example: <statementName type="String"/>), my form pulls the field value from the NEXT statement.
    This corrupts all the rest of the statement records, as this field is shifted up for all of them.
    I know that in the past I haven't needed a placeholder when a field was null. But I'm wondering whether, when the data allows for multiple records, the XML data file needs to generate the placeholder. And where is the problem? In the schema? The XML data file? My form? And the 64-thousand-dollar question: how do I fix it?

    If your <statement> element is the one that repeats, it should be bound to a subform with a binding string something like $.statement[*]. Then in that subform should be your fields, and they should have bindings of $.statementID, $.statementName, $.statementAddress, etc.
    Kyle
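    For what it's worth, the data file does seem to need the empty elements to keep the fields aligned. A sketch of one statement, with the field names taken from the question (the wrapper element name is an assumption):
    <statements>
      <statement>
        <statementID>1001</statementID>
        <!-- empty placeholder keeps the later fields from shifting up -->
        <statementName/>
        <statementAddress>12 Main St</statementAddress>
        <statementCountry>US</statementCountry>
        <statementZip>98101</statementZip>
        <statementDate>2014-01-31</statementDate>
      </statement>
    </statements>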

  • Data File Management on SAN & RAID

    Hi everyone,
    this is more a question looking for some general feedback than about a particular problem. I'm advising on a system which runs 10g, archiving, Flashback and a standby database on a really fast machine. The whole database, which is currently some 5GB, runs pretty much out of a 40GB SGA :)
    When I started working with Oracle we managed data files in great detail and paid much attention to disk placement; that was back at 8i. Today we have NAS, SAN and RAID systems, which make I/O tracking a lot harder. This particular system runs on an HP storage system with virtual RAIDs, and everything appears under just one mount point to the OS and to the DB.
    I'm aware of the standard rules, e.g. logs on RAID 10 etc., but I'm just wondering how all of you out there setting up production systems usually deal with the placement of DB files, log files, archive files and flashback files on today's low-end or enterprise-class storage systems.
    If you need a particular problem to answer: the issue here is that IT says it's not the storage system, but out of 13h of database time I have over 3h of log file sync and write wait events.
    Thanks.

    Well, the first thing I'd do with a 5GB database is not run a 40GB SGA.
    But to your question... I like to place my files so that, hypothetically, I can lose a whole disk shelf, not just a disk, and in doing so, if I lose storage, I can lose the database but not its associated backups, archived redo logs and redo logs; or, if I lose the backups, I still have a running database.
    And one LUN is almost never the right answer to any question if performance is an issue.
    Also, familiarize yourself with Clos networks and create multiple paths from everywhere to everywhere. This means that blades and 1U servers are almost never the right answer to a database question.

  • How to manage a data file using Business Contact Manager database tool

    I set up BCM on my desktop and now want to share it with the rest of the office. I downloaded the BCM database tool on the C drive of the server, and all was well until I tried to move through the wizard. The only option that gets me to a data file is creating a database. However, I receive the error message "Cannot create a new database. The operation has been rolled back. Please make sure the database service is running". How can I make this work? Truthfully, I don't want to create a new database, as I have already set one up; Restore only gives me the database server instance but no choices in the database name list. I have installed BCM on another co-worker's desktop and allowed the database to be shared with her. What more am I missing? Why can't I get the server to see the data file?

    Hi,
    If the Database has not been shared, you can't connect to it.
    Please also confirm you have been granted permission to access the shared Business Contact Manager database.
    You can refer to the article below; there are some factors you should consider:
    Business Contact Manager cannot connect to the shared database
    http://office.microsoft.com/en-us/outlook-help/business-contact-manager-cannot-connect-to-the-shared-database-HA010262548.aspx
    I hope it can be helpful.
    Regards,
    Melon Chen
    TechNet Community Support

  • What is the smallest data structure record in a .TXT file record to be recognized as an Apple Address Book "Data Card"?

    Hello! This is my first time in this discussion group. The question posed is the subject line itself:
    What is the smallest data structure record in a .TXT file record to be recognized as an Apple Address Book "Data Card"?
    I'm lazy! As a math instructor with 40+ students per class per semester (pCpS), I would rather not have to create 40 data cards pCpS by hand, only to expunge that info at semester's end. My college's IS department can easily supply me with first name, last name, and email address info, along with a myriad of other fields. I can manipulate those data on my end to create the necessary .TXT file, but I don't know the essential structure of that file.
    Can you help me?
    Thank you in advance.
    Bill

    Hello Bill, & welcome aboard!
    No idea what pCpS is, sorry.
    To import a text file into Address Book, it needs to be a comma-delimited .csv file, like...
    Customer Name,Company,Address1,Address2,City,State,Zip
    Customer 1,Company 1,2233 W Seventh Street,Unit 543,Seattle,WA,99099
    Customer 2,Company 2,1 Park Avenue,,New York,NY,10001
    Customer 3,Company 3,65 Loma Linda Parkway,,San Jose,CA,94321
    Customer 4,Company 4,89988 E 23rd Street,B720,Oakland,CA,99899
    Customer 5,Company 5,432 1st Avenue,,Seattle,WA,99876
    Customer 6,Company 6,76765 NE 92nd Street,,Seattle,WA,98009
    Customer 7,Company 7,8976 Poplar Street,,Coupeville,WA,98976
    Customer 8,Company 8,7677 4th Ave North,,Seattle,WA ,89876
    Customer 9,Company 9,4556 Fauntleroy Avenue,,West Seattle,WA,98987
    Customer 10,Company 10,4 Bell Street,,Cincinnati,OH,89987
    Customer 11,Company 11,4001 Beacon Ave North,,Seattle,WA,90887
    Customer 12,Company 12,63 Dehli Street,,Noida,India,898877-8879
    Customer 13,Company 13,63 Dehli Street,,Noida,India,898877-8879
    Customer 14,Company 14,63 Dehli Street,,Noida,India,898877-8879
    Customer 15,Company 15,4847 Spirit Lake Drive,,Bellevue,WA,98006
    Customer 16,Company 16,444 Clark Avenue,,West Seattle,WA,88989
    Customer 17,Company 17,6601 E Stallion,,Scottsdale,AZ,85254
    Customer 18,Company 18,801 N 34th Street,,Seattle,WA,98103
    Customer 19,Company 19,15925 SE 92nd,,Newcastle,WA,99898
    Customer 20,Company 20,3335 NW 220th,2nd Floor,Edmonds,WA,99890
    Customer 21,Company 21,444 E Greenway,,Scottsdale,AZ,85654
    Customer 22,Company 22,4 Railroad Drive,,Moclips,WA,98988
    Customer 23,Company 23,89887 E 64th,,Scottsdale,AZ,87877
    Customer 24,Company 24,15620 SE 43rd Street,,Bellevue,WA,98006
    Customer 25,Company 25,123 Smalltown,,Redmond,WA,98998
    Try Address Book Importer...
    http://www.sillybit.com/abee/

  • SQL*Loader-510: Physical record in data file (clob_table.ldr) is long

    If I generate a loader / insert script from Raptor, it's not working for CLOB columns.
    I am getting the error:
    SQL*Loader-510: Physical record in data file (clob_table.ldr) is longer than the maximum (1048576)
    What's the solution?
    Regards,

    Hi,
    Has the file somehow been changed by copying it between Windows and Unix? Was the file transfer done as binary or as ASCII? The most common cause of your problem is that the end-of-line carriage return characters have been changed so they are no longer \r\n; could this have happened? Can you open the file in a good editor, or do an od command in Unix, to see what is actually present?
    Regards,
    Harry
    http://dbaharrison.blogspot.co.uk/
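    If the line endings are fine and the physical records genuinely exceed the limit, a common workaround is to keep each CLOB in its own secondary file and load it via a LOBFILE, so that no physical record in the main data file is oversized. A rough, untested sketch (table, column and file names invented):
    LOAD DATA
    INFILE 'clob_master.dat'
    INTO TABLE clob_table
    FIELDS TERMINATED BY ','
    (
      id        INTEGER EXTERNAL,
      doc_name  FILLER CHAR(255),               -- per-row path to the CLOB's own file
      doc       LOBFILE(doc_name) TERMINATED BY EOF
    )
    Here clob_master.dat contains one line per row, e.g. 1,/tmp/doc1.txt, and each referenced file is loaded whole into the CLOB column.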

  • How to get master data records that do not have transaction data in a query

    Hi,
    How can I get master data records that do not have transaction data into a query output? Can we create a query, or is there any other way, to get the master data records that have no transaction data?

    Hi,
    Create a MultiProvider which includes the transactional data target and the master data InfoObject. Make sure that the identification for this master data InfoObject is ticked on both providers.
    Create a report on this MultiProvider and keep the master data InfoObject in the rows; you should now be able to see all the values in the master data InfoObject, irrespective of whether a transaction happened or not.
    Next, you may create a condition showing only zero key figure values, i.e. master data without any transactions.
    Hope that helps.
    Regards
    Mr Kapadia

  • W520 - Lenovo Password Manager - Can .CSV or .DAT files be imported from other Password Managers?

    The title says it all!
    I'm currently using Norton Identity Safe's standalone password manager, but I would like to start using Lenovo's instead. I attempted to import .DAT and/or .CSV files from Norton, but no option for these file types appeared in Lenovo's Password Manager.
    If importing directly is not an option, is it possible to add one of the files to the location where Lenovo stores the password manager entries?

    Hello,
    No, you can't import .CSV or .DAT files into Password Manager. It uses a .PWM file format and 256-bit Advanced Encryption Standard (AES) keys with the Microsoft CryptoAPI. You can export your info to be used on another machine with the ThinkVantage Password Manager installed, but not from another password manager.
    More info here:
    Password Manager 4 Deployment Guide
    Cheers!
    ThinkPad W540 (20BG) - i7-4800MQ/24GB // ThinkPad T440s (20AQ) - i7-4600U/12GB
    ThinkPad T440p (20AW) - i7-4800MQ/16GB // ThinkPad Helix (3698-6EU) - i5-3337U/4GB
    ThinkPad W520 (4282-W4Q) - i7-2720QM/32GB // ThinkPad T400 (2767-W1C) - P9500/8GB
    ThinkPad T61 (7665-CTO) - T7700/4GB // ThinkPad T60p (8741-C2G) - T7400/4GB
