MAXL to IMPORT DATA

Hi All,
Can you please help me write a MaxL script for the following scenario?
I have an ASO cube; I need to export level 0 data from it and then load it back into the cube.
In between, I will have to update the dimensions. I know the MaxL scripting for loading data and dimensions, and for exporting and importing, but
when I export level 0 data to a location, it may write multiple files [2 GB each]. How can I write MaxL to import data from these multiple files? (The catch: I don't know how many files it will export.)
Please help..!!
Thanks.

You could simply restructure 'in place' - you don't necessarily have to export / clear / restructure / reload.
But if you do go the reload route, look at this thread for some tips from Robert and Glenn on handling an unknown number of export files - Importing spanned export files in MaxL
In BSO the 'import data' statement will now handle wildcards, e.g. 'export*.txt' but this functionality hasn't been added to the ASO version of the command (at least per the documentation; I haven't actually tested it).
Glenn, yeah - lack of column export can be a major PITA. I've wasted more than a few hours of my life working through renamed members one-by-one in dozens of export files...
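Coming back to the original question - since you can't know in advance how many files the level 0 export will span - one workable pattern is to have a small shell wrapper generate the MaxL import statements for whatever files actually exist, then run the generated script. A sketch only (untested; the app/db name, export-file prefix and error-file name are placeholders to replace with your own):

#!/bin/sh
# Build one MaxL import statement per exported level 0 file.
# Adjust the glob to match however your Essbase version names the spanned files.
exportDir=/path/to/exports        # placeholder: wherever the export was written
maxlFile=import_all.msh
cd "$exportDir" || exit 1
: > "$maxlFile"                   # start with an empty script
for f in lev0*.txt ; do
    printf "import database MyApp.MyDb data from data_file '%s' on error write to 'import.err';\n" "$f" >> "$maxlFile"
done
# then execute it with the MaxL shell, e.g.:  essmsh import_all.msh

The same wrapper works unchanged whether the export produced one file or ten, which is the point - the file list is discovered at run time rather than hard-coded.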

Similar Messages

  • MAXL Import data statement error

    Can someone assist me with the MaxL import data statement?
    The import data statement: import database App.DB data from data_file "\\servername\\folder1\\folder1\\data.txt";
    The error comes when I try to specify the path to the server. Does someone have an example of import statement syntax that references a data file on a server?
    Thanks

    Have you tried something like :-
    import database App.Db data from text data_file "\\\\servername\\sharename\\directory\\datafile.txt" using server rules_file "dataload" on error write to "dataerrors.err";
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Import Data Maxl Linux path specification

    Hi,
    Import data 'ap'.'db' data from data_file production://Oracle//Middleware//user_projects//epmsystem//EssbaseServer//essbaseserver1//app//filename.txt  on error write to 'err.log';
    The above statement errors out with "syntax error at 'production'".
    How does MaxL recognize a path on the Linux production machine?
    Thanks.

    Before running the maxl, did you check whether you can get to the file using the path that you specified?
    You can mount that shared drive to a local drive and then access the file as local, or create a soft link and do the same (see the sketch below).
    Regards
    Celvin
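    A minimal sketch of Celvin's soft-link suggestion (the paths here are placeholders for your environment, and the MaxL just mirrors your statement above with a plain local path):

    # make the production file reachable through a short local path
    ln -s /path/on/production/app/filename.txt /data/loads/filename.txt

    and then, in the MaxL script:

    import database 'ap'.'db' data from data_file '/data/loads/filename.txt' on error write to 'err.log';

    The "syntax error at 'production'" comes from the unquoted path; MaxL wants the file name in quotes, and an ordinary absolute path with single forward slashes is enough.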

  • Automate script for import data

    I want to automate a script for importing data into the cube from txt files.
    Ex: - I have below 2 files for import
    aaaa.txt
    bbbb.txt
    For the above, I can simply write an ESSCMD or MaxL script to import the files.
    Esscmd :
    IMPORT 3 "\aaaa.txt" 4 "N";
    IMPORT 3 "\bbbb.txt" 4 "N"
    my question , suppose if some more files added to the existing files, for example cccc.txt got added ,how can i write a code for this?

    Well I guess this will do
    dataDir=""                       # give the data directory path here
    maxlFile="maxl1.msh"
    cd "$dataDir" || exit 1
    for file in * ; do
        # one import statement per data file, appended to the MaxL script
        maxlStmt="import database sample.basic data from data_file '$file' using rules_file 'xxx.rul' on error write to 'xxx.log';"
        printf '%s\n' "$maxlStmt" >> "$maxlFile"
    done
    The * glob picks up every file in the directory. If you want to be more specific - say the data folder may contain other files - use *.txt instead, assuming the files to load have a .txt extension.
    I haven't tested this as I am on vacation, so treat it as a starting point and adapt it.
    This only covers the third point you mentioned; a sketch of actually running the generated file follows below.
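    To actually run the generated file, one option (a sketch; essmsh, the MaxL shell, must be on the PATH, and the user/password/server values are placeholders) is to prepend a login statement and hand the whole thing to essmsh:

    # add a login at the top, then execute the generated statements
    printf "login 'admin' 'password' on 'essbase-server';\n" | cat - "$maxlFile" > run_all.msh
    essmsh run_all.msh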

  • How to update link and import data of relocated incx file into inca file?

    Subject: how to update the link and import data of a relocated incx file into an inca file?
    The incx file was originally part of the inca file and it has been relocated.
    -------------------
    Hello All,
    I am working on InDesign CS2 and InCopy CS2.
    From InDesign I am creating an assignment file as well as InCopy files (.inca and .incx files created through exporting).
    InDesign hardcodes the path of the incx files in the inca file. So if I put the incx files in a different folder, then after opening the inca file in InCopy I get an alert stating that "The document doesn't consists of any incopy story" and all the linked stories are flagged with a red question mark icon.
    So I tried to recreate and update the links.
    Below is my code for that:
    //code start*****************************
    //creating kDataLinkHelperBoss
    InterfacePtr<IDataLinkHelper> dataLinkHelper(static_cast<IDataLinkHelper*>
    (CreateObject2<IDataLinkHelper>(kDataLinkHelperBoss)));
    /**
    The newFileToBeLinkedPath is the path of the incx file which has been relocated,
    and which was previously part of the inca file.
    e.g. earlier it was c:\test.incx, now it is d:\test.incx
    */
    IDFile newIDFileToBeLinked(newFileToBeLinkedPath);
    //create the datalink
    IDataLink * dlk = dataLinkHelper->CreateDataLink(newIDFileToBeLinked);
    NameInfo name;
    PMString type;
    uint32 fileType;
    dlk->GetNameInfo(&name,&type,&fileType);
    //relink the story
    InterfacePtr<ICommand> relinkCmd(CmdUtils::CreateCommand(kRestoreLinkCmdBoss));
    InterfacePtr<IRestoreLinkCmdData> relinkCmdData(relinkCmd, IID_IRESTORELINKCMDDATA);
    relinkCmdData->Set(database, dataLinkUID, &name, &type, fileType, IDataLink::kLinkNormal);
    ErrorCode err = CmdUtils::ProcessCommand(relinkCmd);
    //Update the link now
    InterfacePtr<IUpdateLink> updateLink(dataLinkHelper, UseDefaultIID());
    UID newLinkUID;
    err = updateLink->DoUpdateLink(dlk, &newLinkUID, kFullUI);
    //code end*********************
    I am able to create the proper link, but the data that is in the incx file is not getting imported into the linked story. And if I modify the newly linked story from the inca file, the incx file does get updated (all of its previous content is deleted).
    I tried using Utils<IInCopyWorkflow>()->ImportStory(), but it imports the incx file in xml format.
    What is the solution then?
    Kindly help me as I have been terribly stuck for the last few days.
    Thanks and Regards,
    Yopangjo


  • Object reference not set to an instance of an object error with Import data

    Hi Experts,
    We are using BPC 7.5M with SQL Server 2008 in a multi-server environment. I am getting the error "Object reference not set to an instance of an object." while running the Import data package. Earlier we used to get this error occasionally (about once a month), but it would go away after rebooting the application server; this time I have rebooted the application server multiple times and am still getting the same error.
    Please Advice.
    Thanks & Regards,
    Rohit

    Hi Rohit,
    please see SAP note 1615837; maybe this helps you.
    Best regards
    Roberto Vidotti

  • How to import data from a text file into a table

    Hello,
    I need help with importing data from a .csv file with comma delimiter into a table.
    I've been struggling to figure out how to use the "Import from Files" wizard in Oracle 10g web-based Enterprise Manager.
    I have not been able to find a simple instruction on how to use the Wizard.
    I have looked at the Oracle Database Utilities - Overview of Oracle Data Pump and the Help on the "Import: Files" page.
    Neither one gave me enough instruction to be able to do the import successfully.
    Using the "Import from file" wizard, I created a Directory Object using the Create Directory Object button. I Copied the file from which i needed to import the data into the Operating System Directory i had defined in the Create Directory Object page. I chose "Entire files" for the Import type.
    Step 1 of 4 is the "Import: Re-Mapping" page; I have no idea what I need to do on this page. All I know is that I am not trying to import data from one schema into a different schema, I am not importing data from one tablespace into a different tablespace, and I am not re-mapping datafiles either. I am importing data from a csv file.
    For step 2 of 4, the "Import: Options" page, I selected the same directory object I had created.
    For step 3 of 4, I entered a job name and a description and selected the Start Immediately option.
    What I noticed going through the wizard is that it never asked which table I want to import the data into.
    I submitted the job and I got an ORA-31619 invalid dump file error.
    I was sure the wizard was going to fail when it never asked me which table I wanted to import the data into.
    I tried to use the "imp" utility in a command-line window.
    After I entered imp, I was prompted for the username, the password, and then the buffer size; as soon as I entered the minimum buffer size, I got the following error and the import was terminated:
    C:\>imp
    Import: Release 10.1.0.2.0 - Production on Fri Jul 9 12:56:11 2004
    Copyright (c) 1982, 2004, Oracle. All rights reserved.
    Username: user1
    Password:
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Produc
    tion
    With the Partitioning, OLAP and Data Mining options
    Import file: EXPDAT.DMP > c:\securParms\securParms.csv
    Enter insert buffer size (minimum is 8192) 30720> 8192
    IMP-00037: Character set marker unknown
    IMP-00000: Import terminated unsuccessfully
    Please show me the easiest way to import a text file into a table. How complex could it be to do a simple import into a table using a text file?
    We are testing our application against both an Oracle database and a MSSQLServer 2000 database.
    I was able to import the data into a table in MSSQLServer database and I can say that anybody with no experience could easily do an export/import in MSSQLServer 2000.
    I would appreciate it if someone could show me how to import from a file into a table!
    Thanks,
    Mitra

    >
    I can say that anybody with
    no experience could easily do an export/import in
    MSSQLServer 2000.
    Anybody with no experience should not mess up my Oracle Databases !
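    For what it's worth, the usual command-line route for loading a CSV into an existing table is SQL*Loader rather than imp (imp only reads dump files produced by exp, which is why it rejected the .csv). A sketch only - the table and column names below are invented and must be replaced with your real ones:

    # create a minimal SQL*Loader control file, then load the CSV
    printf '%s\n' \
        "LOAD DATA" \
        "INFILE 'securParms.csv'" \
        "APPEND INTO TABLE secur_parms" \
        "FIELDS TERMINATED BY ','" \
        "(parm_name, parm_value)" > load_parms.ctl
    sqlldr user1/password control=load_parms.ctl log=load_parms.log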

  • Performance issue with FDM when importing data

    In the FDM Web console, a performance issue has been detected when importing data (.txt)
    In less than 10 seconds the ".txt" and ".log" files are created in the INBOX folder (the ".txt" file) and in OUTBOX\Logs (the ".log" file).
    At that moment, the system shows the message "Processing, please wait" for 10 minutes. Eventually the information is displayed; however, if we want to see the second page, we have to wait more than 20 seconds.
    It seems to be a performance issue when the system tries to show the imported data on the web page.
    It has also been noted that when a user tries to import a txt file directly by clicking on the tab "Select File From Inbox", the user also has to wait another 10 minutes before the information is displayed on the web page.
    Thx in advance!
    Cheers
    Matteo

    Hi Matteo
    How much data is being imported / displayed when users are interacting with the system?
    There is a report that may help you to analyse this but unfortunately I cannot remember what it is called and don't have access to a system to check. I do remember that it breaks down the import process into stages showing how long it takes to process each mapping step and the overall time.
    I suspect that what you are seeing is normal behaviour but that isn't to say that performance improvements are not possible.
    The copying of files is the first part of the import process, before FDM actually starts the import, so that will be quick. The processing time is then the time taken to import the records, process the mappings and write to the tables. If users are clicking 'Select file from Inbox' then they are re-importing, so it will take just as long as it would for you to import it; they are not just retrieving previously imported data.
    Hope this helps
    Stuart

  • Error when running import data package, BPC 7.5 Microsoft

    When I try to import data into one of my applications, the import fails. This happens with external data, but also with data I extracted from the application itself (input via schedule and then exported). I get the error below:
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    TOTAL STEPS  2
    1. Convert Data:         completed  in 0 sec.
    2. Load and Process:     Failed  in 1 sec.
    3. Import:               completed  in 1 sec.
    [Selection]
    FILE=\GRP_CONSOL\ZHYP_DATA\DataManager\DataFiles
    2011_test_schedule.txt
    TRANSFORMATION=\GRP_CONSOL\ZHYP_DATA\DATAMANAGER\TRANSFORMATIONFILES\SYSTEM FILES\IMPORT.XLS
    CLEARDATA= Yes
    RUNLOGIC= Yes
    CHECKLCK= Yes
    [Messages]
    Convert Data
    Success
    Record Count : 12
    Accept Count : 12
    Reject Count : 0
    Skip Count   : 0
    Incorrect syntax near '.'.
    ++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    What can cause this error?
    Marco

    Hi Marco,
    See note [https://service.sap.com/sap/support/notes/1328907].
    This issue is related to the Replace and Clear option and not having Work Status set up correctly.
    When using the 'Replace & Clear' option with Import packages, the Work Status setting should not only be enabled, but the three dimension types 'C' (Category), 'E' (Entity) and 'T' (Time) should also be set to 'Yes' or 'Owner'.
    Thanks,
    John

  • Beginner trying to import data from GL_interface to GL

    Hello, I'm a beginner with Oracle GL and I'm not able to do an import from GL_Interface. I have put my data into GL_Interface, but I think something is wrong with it, because when I try to import the data
    with Journals -> Import -> Run, Oracle tells me that GL_Interface is empty!
    I think that maybe my insert is not correct and that's why Oracle doesn't want to use it... can someone help me?
    ----> I have put the data in this way:
    insert into gl_interface (status,set_of_books_id,accounting_date, currency_code,date_created,
    created_by, actual_flag, user_je_category_name, user_je_source_name, segment1,segment2,segment3,
    entered_dr, entered_cr,transaction_date,reference1 )
    values ('NEW', '1609', sysdate, 'FRF', sysdate,1008009, 'A', 'xx jab payroll', 'xx jab payroll', '01','002','4004',111.11,0,
    sysdate,'xx_finance_jab_demo');
    insert into gl_interface (status,set_of_books_id,accounting_date, currency_code,date_created,
    created_by, actual_flag, user_je_category_name, user_je_source_name, segment1,segment2,segment3,
    entered_dr, entered_cr,transaction_date,reference1 )
    values ('NEW', '1609', sysdate, 'FRF', sysdate,1008009, 'A', 'xx jab payroll', 'xx jab payroll', '01','002','1005',0,111.11,
    sysdate,'xx_finance_jab_demo');
    ------------> Oracle sent me this message:
    General Ledger: Version : 11.5.0 - Development
    Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
    GLLEZL module: Journal Import
    Current system time is 14-MAR-2007 15:39:25
    Running in Debug Mode
    gllsob() 14-MAR-2007 15:39:25sob_id = 124
    sob_name = Vision France
    coa_id = 50569
    num_segments = 6
    delim = '.'
    segments =
    SEGMENT1
    SEGMENT2
    SEGMENT3
    SEGMENT4
    SEGMENT5
    SEGMENT6
    index segment is SEGMENT2
    balancing segment is SEGMENT1
    currency = EUR
    sus_flag = Y
    ic_flag = Y
    latest_opened_encumbrance_year = 2006
    pd_type = Month
    << gllsob() 14-MAR-2007 15:39:25
    gllsys() 14-MAR-2007 15:39:25fnd_user_id = 1008009
    fnd_user_name = JAB-DEVELOPPEUR
    fnd_login_id = 2675718
    con_request_id = 2918896
    sus_on = 0
    from_date =
    to_date =
    create_summary = 0
    archive = 0
    num_rec = 1000
    num_flex = 2500
    run_id = 55578
    << gllsys() 14-MAR-2007 15:39:25
    SHRD0108: Retrieved 51 records from fnd_currencies
    gllcsa() 14-MAR-2007 15:39:25<< gllcsa() 14-MAR-2007 15:39:25
    gllcnt() 14-MAR-2007 15:39:25SHRD0118: Updated 1 record(s) in table: gl_interface_control
    source name = xx jab payroll
    group id = -1
    LEZL0001: Found 1 sources to process.
    glluch() 14-MAR-2007 15:39:25<< glluch() 14-MAR-2007 15:39:25
    gl_import_hook_pkg.pre_module_hook() 14-MAR-2007 15:39:26<< gl_import_hook_pkg.pre_module_hook() 14-MAR-2007 15:39:26
    glusbe() 14-MAR-2007 15:39:26<< glusbe() 14-MAR-2007 15:39:26
    << gllcnt() 14-MAR-2007 15:39:26
    gllpst() 14-MAR-2007 15:39:26SHRD0108: Retrieved 110 records from gl_period_statuses
    << gllpst() 14-MAR-2007 15:39:27
    glldat() 14-MAR-2007 15:39:27Successfully built decode fragment for period_name and period_year
    gllbud() 14-MAR-2007 15:39:27SHRD0108: Retrieved 10 records from the budget tables
    << gllbud() 14-MAR-2007 15:39:27
    gllenc() 14-MAR-2007 15:39:27SHRD0108: Retrieved 15 records from gl_encumbrance_types
    << gllenc() 14-MAR-2007 15:39:27
    glldlc() 14-MAR-2007 15:39:27<< glldlc() 14-MAR-2007 15:39:27
    gllcvr() 14-MAR-2007 15:39:27SHRD0108: Retrieved 6 records from gl_daily_conversion_types
    << gllcvr() 14-MAR-2007 15:39:27
    gllfss() 14-MAR-2007 15:39:27LEZL0005: Successfully finished building dynamic SQL statement.
    << gllfss() 14-MAR-2007 15:39:27
    gllcje() 14-MAR-2007 15:39:27main_stmt:
    select int.rowid
    decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
    , '', replace(ccid_cc.SEGMENT2,'.','
    ') || '.' || replace(ccid_cc.SEGMENT1,'.','
    ') || '.' || replace(ccid_cc.SEGMENT3,'.','
    ') || '.' || replace(ccid_cc.SEGMENT4,'.','
    ') || '.' || replace(ccid_cc.SEGMENT5,'.','
    ') || '.' || replace(ccid_cc.SEGMENT6,'.','
    , replace(int.SEGMENT2,'.','
    ') || '.' || replace(int.SEGMENT1,'.','
    ') || '.' || replace(int.SEGMENT3,'.','
    ') || '.' || replace(int.SEGMENT4,'.','
    ') || '.' || replace(int.SEGMENT5,'.','
    ') || '.' || replace(int.SEGMENT6,'.','
    ') ) flexfield , nvl(flex_cc.code_combination_id,
    nvl(int.code_combination_id, -4))
    , decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
    , '', decode(ccid_cc.code_combination_id,
    null, decode(int.code_combination_id, null, -4, -5),
    decode(sign(nvl(ccid_cc.start_date_active, int.accounting_date-1)
    - int.accounting_date),
    1, -1,
    decode(sign(nvl(ccid_cc.end_date_active, int.accounting_date +1)
    - int.accounting_date),
    -1, -1, 0)) +
    decode(ccid_cc.enabled_flag,
    'N', -10, 0) +
    decode(ccid_cc.summary_flag, 'Y', -100,
    decode(int.actual_flag,
    'B', decode(ccid_cc.detail_budgeting_allowed_flag,
    'N', -100, 0),
    decode(ccid_cc.detail_posting_allowed_flag,
    'N', -100, 0)))),
    decode(flex_cc.code_combination_id,
    null, -4,
    decode(sign(nvl(flex_cc.start_date_active, int.accounting_date-1)
    - int.accounting_date),
    1, -1,
    decode(sign(nvl(flex_cc.end_date_active, int.accounting_date +1)
    - int.accounting_date),
    -1, -1, 0)) +
    decode(flex_cc.enabled_flag,
    'N', -10, 0) +
    decode(flex_cc.summary_flag, 'Y', -100,
    decode(int.actual_flag,
    'B', decode(flex_cc.detail_budgeting_allowed_flag,
    'N', -100, 0),
    decode(flex_cc.detail_posting_allowed_flag,
    'N', -100, 0)))))
    , int.user_je_category_name
    , int.user_je_category_name
    , 'UNKNOWN' period_name
    , decode(actual_flag, 'B'
         , decode(period_name, NULL, '-1' ,period_name), nvl(period_name, '0')) period_name2
    , currency_code
    , decode(actual_flag
         , 'A', actual_flag
         , 'B', decode(budget_version_id
         , 1210, actual_flag
         , 1211, actual_flag
         , 1212, actual_flag
         , 1331, actual_flag
         , 1657, actual_flag
         , 1658, actual_flag
         , NULL, '1', '6')
         , 'E', decode(encumbrance_type_id
         , 1000, actual_flag
         , 1001, actual_flag
         , 1022, actual_flag
         , 1023, actual_flag
         , 1024, actual_flag
         , 1048, actual_flag
         , 1049, actual_flag
         , 1050, actual_flag
         , 1025, actual_flag
         , 999, actual_flag
         , 1045, actual_flag
         , 1046, actual_flag
         , 1047, actual_flag
         , 1068, actual_flag
         , 1088, actual_flag
         , NULL, '3', '4'), '5') actual_flag
    , '0' exception_rate
    , decode(currency_code
         , 'EUR', 1
         , 'STAT', 1
         , decode(actual_flag, 'E', -8, 'B', 1
         , decode(user_currency_conversion_type
         , 'User', decode(currency_conversion_rate, NULL, -1, currency_conversion_rate)
         ,'Corporate',decode(currency_conversion_date,NULL,-2,-6)
         ,'Spot',decode(currency_conversion_date,NULL,-2,-6)
         ,'Reporting',decode(currency_conversion_date,NULL,-2,-6)
         ,'HRUK',decode(currency_conversion_date,NULL,-2,-6)
         ,'DALY',decode(currency_conversion_date,NULL,-2,-6)
         ,'HLI',decode(currency_conversion_date,NULL,-2,-6)
         , NULL, decode(currency_conversion_rate,NULL,
         decode(decode(nvl(to_char(entered_dr),'X'),'X',1,2),decode(nvl(to_char(accounted_dr),'X'),'X',1,2),
         decode(decode(nvl(to_char(entered_cr),'X'),'X',1,2),decode(nvl(to_char(accounted_cr),'X'),'X',1,2),-20,-3),-3),-9),-9))) currency_conversion_rate
    , to_number(to_char(nvl(int.currency_conversion_date, int.accounting_date), 'J'))
    , decode(int.actual_flag
         , 'A', decode(int.currency_code
              , 'EUR', 'User'
         , 'STAT', 'User'
              , nvl(int.user_currency_conversion_type, 'User'))
         , 'B', 'User', 'E', 'User'
         , nvl(int.user_currency_conversion_type, 'User')) user_currency_conversion_type
    , ltrim(rtrim(substrb(rtrim(substrb(int.reference1, 1, 50)) || ' ' || int.user_je_source_name || ' 2918896: ' || int.actual_flag || ' ' || int.group_id, 1, 100)))
    , rtrim(substrb(nvl(rtrim(int.reference2), 'Journal Import ' || int.user_je_source_name || ' 2918896:'), 1, 240))
    , ltrim(rtrim(substrb(rtrim(rtrim(substrb(int.reference4, 1, 25)) || ' ' || int.user_je_category_name || ' ' || int.currency_code || decode(int.actual_flag, 'E', ' ' || int.encumbrance_type_id, 'B', ' ' || int.budget_version_id, '') || ' ' || int.user_currency_conversion_type || ' ' || decode(int.user_currency_conversion_type, NULL, '', 'User', to_char(int.currency_conversion_rate), to_char(int.currency_conversion_date))) || ' ' || substrb(int.reference8, 1, 15) || int.originating_bal_seg_value, 1, 100)))
    , rtrim(nvl(rtrim(int.reference5), 'Journal Import 2918896:'))
    , rtrim(substrb(nvl(rtrim(int.reference6), 'Journal Import Created'), 1, 80))
    , rtrim(decode(upper(substrb(nvl(rtrim(int.reference7), 'N'), 1, 1)),'Y','Y', 'N'))
    , decode(upper(substrb(int.reference7, 1, 1)), 'Y', decode(rtrim(reference8), NULL, '-1', rtrim(substrb(reference8, 1, 15))), NULL)
    , rtrim(upper(substrb(int.reference9, 1, 1)))
    , rtrim(nvl(rtrim(int.reference10), nvl(to_char(int.subledger_doc_sequence_value), 'Journal Import Created')))
    , int.entered_dr
    , int.entered_cr
    , to_number(to_char(int.accounting_date,'J'))
    , to_char(int.accounting_date, 'YYYY/MM/DD')
    , int.user_je_source_name
    , nvl(int.encumbrance_type_id, -1)
    , nvl(int.budget_version_id, -1)
    , NULL
    , int.stat_amount
    , decode(int.actual_flag
    , 'E', decode(int.currency_code, 'STAT', '1', '0'), '0')
    , decode(int.actual_flag
    , 'A', decode(int.budget_version_id
    , NULL, decode(int.encumbrance_type_id, NULL, '0', '1')
    , decode(int.encumbrance_type_id, NULL, '2', '3'))
    , 'B', decode(int.encumbrance_type_id
    , NULL, '0', '4')
    , 'E', decode(int.budget_version_id
    , NULL, '0', '5'), '0')
    , int.accounted_dr
    , int.accounted_cr
    , nvl(int.group_id, -1)
    , nvl(int.average_journal_flag, 'N')
    , int.originating_bal_seg_value
    from GL_INTERFACE int,
    gl_code_combinations flex_cc,
    gl_code_combinations ccid_cc
    where int.set_of_books_id = 124
    and int.status != 'PROCESSED'
    and (int.user_je_source_name,nvl(int.group_id,-1)) in (('xx jab payroll', -1))
    and flex_cc.SEGMENT1(+) = int.SEGMENT1
    and flex_cc.SEGMENT2(+) = int.SEGMENT2
    and flex_cc.SEGMENT3(+) = int.SEGMENT3
    and flex_cc.SEGMENT4(+) = int.SEGMENT4
    and flex_cc.SEGMENT5(+) = int.SEGMENT5
    and flex_cc.SEGMENT6(+) = int.SEGMENT6
    and flex_cc.chart_of_accounts_id(+) = 50569
    and flex_cc.template_id(+) is NULL
    and ccid_cc.code_combination_id(+) = int.code_combination_id
    and ccid_cc.chart_of_accounts_id(+) = 50569
    and ccid_cc.template_id(+) is NULL
    order by decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
    , rpad(ccid_cc.SEGMENT2,30) || '.' || rpad(ccid_cc.SEGMENT1,30) || '.' || rpad(ccid_cc.SEGMENT3,30) || '.' || rpad(ccid_cc.SEGMENT4,30) || '.' || rpad(ccid_cc.SEGMENT5,30) || '.' || rpad(ccid_cc.SEGMENT6,30)
    , rpad(int.SEGMENT2,30) || '.' || rpad(int.SEGMENT1,30) || '.' || rpad(int.SEGMENT3,30) || '.' || rpad(int.SEGMENT4,30) || '.' || rpad(int.SEGMENT5,30) || '.' || rpad(int.SEGMENT6,30)
    ) , int.entered_dr, int.accounted_dr, int.entered_cr, int.accounted_cr, int.accounting_date
    control->len_mainsql = 16402
    length of main_stmt = 7428
    upd_stmt.arr:
    update GL_INTERFACE
    set status = :status
    , status_description = :description
    , je_batch_id = :batch_id
    , je_header_id = :header_id
    , je_line_num = :line_num
    , code_combination_id = decode(:ccid, '-1', code_combination_id, :ccid)
    , accounted_dr = :acc_dr
    , accounted_cr = :acc_cr
    , descr_flex_error_message = :descr_description
    , request_id = to_number(:req_id)
    where rowid = :row_id
    upd_stmt.len: 394
    ins_stmt.arr:
    insert into gl_je_lines
    ( je_header_id, je_line_num, last_update_date, creation_date, last_updated_by, created_by , set_of_books_id, code_combination_id ,period_name, effective_date , status , entered_dr , entered_cr , accounted_dr , accounted_cr , reference_1 , reference_2
    , reference_3 , reference_4 , reference_5 , reference_6 , reference_7 , reference_8 , reference_9 , reference_10 , description
    , stat_amount , attribute1 , attribute2 , attribute3 , attribute4 , attribute5 , attribute6 ,attribute7 , attribute8
    , attribute9 , attribute10 , attribute11 , attribute12 , attribute13 , attribute14, attribute15, attribute16, attribute17
    , attribute18 , attribute19 , attribute20 , context , context2 , context3 , invoice_amount , invoice_date , invoice_identifier
    , tax_code , no1 , ussgl_transaction_code , gl_sl_link_id , gl_sl_link_table , subledger_doc_sequence_id , subledger_doc_sequence_value
    , jgzz_recon_ref , ignore_rate_flag)
    SELECT
    :je_header_id , :je_line_num , sysdate , sysdate , 1008009 , 1008009 , 124 , :ccid , :period_name
    , decode(substr(:account_date, 1, 1), '-', trunc(sysdate), to_date(:account_date, 'YYYY/MM/DD'))
    , 'U' , :entered_dr , :entered_cr , :accounted_dr , :accounted_cr
    , reference21, reference22, reference23, reference24, reference25, reference26, reference27, reference28, reference29
    , reference30, :description, :stat_amt, '' , '', '', '', '', '', '', '', '' , '', '', '', '', '', '', '', '', '', '', ''
    , '', '', '', '', '', '', '', '', '', gl_sl_link_id
    , gl_sl_link_table
    , subledger_doc_sequence_id
    , subledger_doc_sequence_value
    , jgzz_recon_ref
    , null
    FROM GL_INTERFACE
    where rowid = :row_id
    ins_stmt.len: 1818
    glluch() 14-MAR-2007 15:39:27<< glluch() 14-MAR-2007 15:39:27
    LEZL0008: Found no interface records to process.
    LEZL0009: Check SET_OF_BOOKS_ID, GROUP_ID, and USER_JE_SOURCE_NAME of interface records.
    If no GROUP_ID is specified, then only data with no GROUP_ID will be retrieved. Note that most data
    from the Oracle subledgers has a GROUP_ID, and will not be retrieved if no GROUP_ID is specified.
    SHRD0119: Deleted 1 record(s) from gl_interface_control.
    Start of log messages from FND_FILE
    End of log messages from FND_FILE
    Executing request completion options...
    Finished executing request completion options.
    No data was found in the GL_INTERFACE table.
    Concurrent request completed
    Current system time is 14-MAR-2007 15:39:27
    ---------------------------------------------------------------------------

    As the error message says, you need to specify a group id.
    As per the documentation:
    GROUP_ID: Enter a unique group number to distinguish import data within a
    source. You can run Journal Import in parallel for the same source if you specify a
    unique group number for each request.
    For example, if you put in data for both payables and receivables, you need to use different group ids to separate the payables and receivables data.
    HTH

  • How to import data into a Z table from excel file?

    hi,
    I have a custom-created Z table into which I want to import some data from an Excel file. Which function can I use to do so? SE16 allows me to insert data only one record at a time via a data entry screen, which is time consuming. I want to import the data from an Excel file so that it's faster.

    Hi,
    This program uploads data from Excel and modifies a Z table (inserts records into the Z table); check it and modify it according to your requirement.
    The program uploads material numbers from an Excel sheet, makes modifications to the material numbers if required by the user, and updates the table zmatnr with the new material against the old material number.
    REPORT  zmat_no message-id zebg.
    TYPE-POOLS  truxs.
    TABLES:zmatnr.
    DATA : itab LIKE alsmex_tabline OCCURS 0 WITH HEADER LINE.
    DATA row LIKE alsmex_tabline-row.
    data : g_matnr like mara-matnr.
    data : count type i.
    data : itab_count type i.
    data : gi_final like zmatnr occurs 0 with header line.
    *data : begin of gi_final occurs 0,
    *      mat_old like mara-matnr,
    *      mat_new like mara-matnr,
    *      end of gi_final.
    ***********************Selection Screen*************************
    SELECTION-SCREEN BEGIN OF BLOCK b1 WITH FRAME TITLE text-001.
    PARAMETER : pfname LIKE rlgrap-filename OBLIGATORY.
    select-options : records for count.
    SELECTION-SCREEN END OF BLOCK b1.
    *********************At Selection Screen*************************
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR pfname.
      PERFORM search.
    START-OF-SELECTION.
    perform process.
    form process.
      CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
        EXPORTING
          filename                = pfname
          i_begin_col             = 1
          i_begin_row             = 2
          i_end_col               = 12
          i_end_row               = 65000
        TABLES
          intern                  = itab
        EXCEPTIONS
          inconsistent_parameters = 1
          upload_ole              = 2
          OTHERS                  = 3.
      IF sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
      describe table itab lines itab_count.
       row = 1.
      loop at itab.
        if itab-row <> row.
          append gi_final.
          clear gi_final.
        endif.
        case itab-col.
          when '1'.
          CLEAR G_MATNR.
          gi_final-OLD_MATNR = itab-value.
          CONCATENATE 'NEW' gi_final-old_matnr INTO itab-value.
            gi_final-new_MATNR = itab-value.
          endcase.
        row = itab-row.
      append gi_final.
      clear gi_final.
      endloop.
      CALL FUNCTION 'PROGRESS_INDICATOR'
      EXPORTING
        I_TEXT  = 'File Has Been Successfully Uploaded from Workstation ' .
      if not gi_final[] is initial.
        if not records-low is initial .
          if not records-high is initial.
            records-high = records-high + 1.
            DESCRIBE TABLE gi_final LINES count.
            IF records-high < count.
              DELETE gi_final FROM records-high TO count.
            ENDIF.
            IF records-low <> 1.
              IF records-low <> 0.
                DELETE gi_final FROM 1 TO records-low.
              ENDIF.
            ENDIF.
          endif.
        endif.
      endif.
      IF NOT GI_FINAL[] IS INITIAL.
        CALL FUNCTION 'PROGRESS_INDICATOR'
         EXPORTING
           I_TEXT  = 'Processing zmatnr table'
           I_OUTPUT_IMMEDIATELY = 'X'.
          if itab_count <> count.
             message i000 with 'records are not matching'.
             exit.
          else.
             modify zmatnr from table gi_final.
             message i000 with 'data base table modified successfully'.
          endif.
      endif.
    endform.
    *&      Form  search
    *       text
    *  -->  p1        text
    *  <--  p2        text
    FORM search .
      CALL FUNCTION 'KD_GET_FILENAME_ON_F4'
        EXPORTING
          static    = 'X'
        CHANGING
          file_name = pfname.
    ENDFORM.                    " search
    regards
    siva

  • Address Book field names changing for "imported" data

    I'm migrating from a thousand year old Palm Pilot that has served me so well to an iPhone. So it's time to get my contacts into OSX Address Book.
    I have tried moving the data a couple of ways - exporting vCards and text files from the Palm Desktop software.
    I did extensive editing of the text file in MS Excel to get the data into good order.
    I have set up a template in the Address Book preferences to create the fields that I want/need. Some are generic default field names, some have custom names, e.g. I want to see a field called "Bus. Phone", not one called "work". Call me picky.
    I get the same basic result whether I import the vCard file or a text file - the names of the fields just show up with very generic names in the imported data. I should note that creating a NEW record seems to use the field names that I have specified in the preferences.
    Also, when importing the text file, Address Book asks me to assign the field names for certain fields (while it seems smart enough to know what labels to use for other fields all on its own). However, the field names available in the drop-down menu ARE NOT the ones that I have set up in the prefs. I seem limited to the generic default field names only.
    Can anybody tell me how to get the field names to be what I want them to be for address data that is being imported like this?
    Thanks for any suggestions - this has been driving me crazy as it would seem to me to be a pretty basic process. (I mean, the Address Book has this import functionality and custom field name functionality built in so it should work right?!)
    Allen

    No sooner do I ask than I notice that other people have asked the same question. In one of the replies I saw mention of a product called "Abee".
    I tracked it down and it did the job for me! Custom Field names live on!
    It's at:
    http://www.sillybit.com/abee/

  • How can I import data in to the digital word generator in Multisim?

    How can I import data in to the digital word generator in Multisim?
    I just received this comment from a friend, a RADAR engineer, who has just downloaded Multisim. He has been using HP/Agilent software. He has a workaround using a piecewise-linear voltage waveform with data imported from Excel, but this is not really a good solution. It would also be helpful to import data from Mathcad or equivalent.
    "I thought I was about to be impressed with MultiSim but it ended only in disappointment. There is a word generator in the simulation instrument panel which can drive the DAC with a waveform and it can have thousands of lines of values. I opened Excel, wrote the formula to generate the time and voltage points for a chirp, converted to DAC values in Hex and then went back to the word generator in MultiSim to load the values only to find that you have to enter each value manually. It doesn’t even allow you to paste in a list of values from a text file. I’m not going to type 5000 values by hand. If you get the chance to give feedback to National Instruments please ask them if the paste option can be added to the word generator. MultiSim is useful in many regards, but in this case, it left me with the impression that it is considerably limited in capability compared to what I’m used to."

    Hi,
    You can load your data automatically in the Multisim word generator. Follow these steps:
    - Save your data file (in Excel .xlsx or .csv format) on your computer
    - Change the extension of the file to ".dp"
    - Double-click the word generator in Multisim and click on Set...
    - In the Settings dialog box, click on Load and then Accept
    - This will prompt you to select the .dp file you have on your computer, select it and you're good to go
    However, in Multisim you have the option of creating your own custom simulation analysis and instrument.
    I will try creating the instrument and send it back to you but it might take some time.
    Multisim and LabVIEW are very powerful in test automation; with the custom instruments you create for Multisim, you don't need to export your data file into Excel from LabVIEW (or MathCAD or other tools) and then reload it into Multisim. The test procedure is automated instead.
    Please check this reference design about automated simulation
    http://zone.ni.com/devzone/cda/tut/p/id/7825
    Here is how you can create your own custom measurement tool in Multisim and LabVIEW, but as I mentioned, I will create the word generator and come back to you anyways
    http://zone.ni.com/devzone/cda/tut/p/id/5635
    Let me know if you have any questions.
    Mahmoud W
    National Instruments

  • HT4967 Is there a way to import data to an iCloud calendar if you don't have a Mac?

    What options exist for moving data INTO an iCloud calendar?  The only options I can find for importing data involve using iCal calendars on the Mac.
    I have an iPhone and an iPad and would like to manage my calendars via iCloud.  I have event data (mostly sports team schedules and the like for my kids) which I would like to "load" rather than type in.  In the past, I always used CSV files (most of the data is distributed in Excel), ICS files, etc. and imported them into Outlook.  Please don't tell me to sync with Outlook, as I am trying to eliminate Outlook from the equation.
    I just don't see any way to get data into an iCloud calendar other than typing it in or sync'ing with another calendar program.  Am I missing something? 

    Typing it in or Synchronizing are the only ways, but if you don't want to use Outlook (a stance I agree with) try eMClient. It's free for one user and 2 machines, Vista and Win7 are supported.

  • How can I import data in an interactive PDF?

    Hello everyone,
    I'm trying to import data (in this case, price lists created in Excel) into a PDF that needs to be interactive. The goal is to build a document (in InDesign) using form fields and then have that data imported into the form fields, with the generated PDF remaining interactive. What I'm trying to achieve is that my sales people have a PDF where they can generate a price estimate based on the number of products they want to order. Is this achievable at all?

    Yes, very much so. The form fields can be added in InDesign or in Acrobat (I prefer the latter), and the data can be imported in Acrobat as well. The Excel file needs to have the following structure for it to work: field names in the first row and their data in the second. Then you save it as a tab-delimited text file and import it in Acrobat via Tools - Forms - More Form Options - Import Data.
