Load bulk suppliers

hi all,
I want to load suppliers in bulk, so I made a staging table and loaded all the data there. After that I wrote a script which pulls all the data from the staging table and puts it into AP_SUPPLIERS_INT and AP_SUPPLIER_SITES_INT. The data has been uploaded into the interface tables, but when I use the concurrent programs to load the data into the application (Supplier Open Interface Import and Supplier Sites Open Interface Import), the requests fail with status ERROR. I am copying the log file contents here:
+-----------------------------
| Starting concurrent program execution...
+-----------------------------
Arguments
P_WHAT_TO_IMPORT='ALL'
P_COMMIT_SIZE='1000'
P_PRINT_EXCEPTIONS='Y'
P_DEBUG_SWITCH='Y'
P_TRACE_SWITCH='Y'
Current NLS_LANG and NLS_NUMERIC_CHARACTERS Environment Variables are :
American_America.US7ASCII
LOG :
Report: d:\oracle\prodappl\ap\11.5.0\reports\US\APXSSIMP.rdf
Logged onto server:
Username:
LOG :
Logged onto server:
Username: APPS
MSG MSG-00001: After SRWINIT
MSG MSG-00002: After Get_Company_Name
MSG MSG-00003: After Get_NLS_Strings
ERR REP-1419: 'beforereport': PL/SQL program aborted.
Also, nothing has been updated in the REJECT_CODE and STATUS fields of either table.
Please help me out and suggest how to get past this problem.
regards
anwer

Make sure of the following:
1) The link between the supplier header table and the supplier site table must be there.
2) Check how the Payables options have been set.
3) All mandatory columns must be populated.
These are three common reasons the import throws an exception.
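A rough diagnostic sketch for checks like these, assuming the standard supplier open interface tables and VENDOR_INTERFACE_ID as the header-to-site link (verify the column names against your own instance):

```sql
-- Site rows that do not link back to a header row
-- (VENDOR_INTERFACE_ID is assumed to be the linking column).
SELECT s.vendor_interface_id
  FROM ap_supplier_sites_int s
 WHERE NOT EXISTS (
         SELECT 1
           FROM ap_suppliers_int h
          WHERE h.vendor_interface_id = s.vendor_interface_id);

-- Distribution of STATUS / REJECT_CODE after an import run.
SELECT status, reject_code, COUNT(*) AS cnt
  FROM ap_suppliers_int
 GROUP BY status, reject_code;
```

If the first query returns rows, those sites can never be matched to a header and will be rejected.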

Similar Messages

  • Bad load arguments supplied when starting up the Reports Server

    Dear All,
    In report60\server\<filename>.ora
    Following were the contents :
    maxconnect=20
    cachedir="d:\Oracle\806\REPORT60\server\cache"
    cachesize=50
    minengine=0
    initengine=0
    maxengine=1
    maxidle=30
    security=1
    englife=50
    We changed it to
    maxconnect=20
    cachedir="d:\Oracle\806\REPORT60\server\cache"
    cachesize=50
    minengine=1
    initengine=0
    maxengine=5
    maxidle=30
    security=1
    englife=50
    We immediately got results; we checked it and put it on the live application server.
    Everything was smooth, but the next day we got calls/emails from users saying reports were slow. When we checked the Windows processes on the application server, rwmts60 was occupying 50% of memory, so we killed that process. We restarted the report service but were unable to restart it. Then we saw the report log as follows:
    *** 2008-07-29 17:36:51 -- Server started up an engine. (Rep60_WEBDB-01)
    *** 2008-07-29 17:36:53 -- Server engine crashed. (Rep60_WEBDB-01)
    *** 2008-07-29 17:38:18 -- Server started up an engine. (Rep60_WEBDB-01)
    *** 2008-07-29 18:39:44 -- Server shutting down an engine. (Rep60_WEBDB-01)
    *** 2008-07-30 09:13:19 -- Server engine cannot be started. (Rep60_WEBDB-01)
    *** 2008-07-30 09:15:24 -- Server engine cannot be started. (Rep60_WEBDB-01)
    *** 2008-07-30 09:17:29 -- Server engine cannot be started. (Rep60_WEBDB-01)
    *** 2008-07-30 09:19:34 -- Server engine cannot be started. (Rep60_WEBDB-01)
    *** Bad load arguments supplied when starting up the Reports Server
    *** Bad load arguments supplied when starting up the Reports Server
    *** Bad load arguments supplied when starting up the Reports Server
    *** Bad load arguments supplied when starting up the Reports Server
    then we changed back our ora file to
    maxconnect=20
    cachedir="d:\Oracle\806\REPORT60\server\cache"
    cachesize=50
    minengine=0
    maxengine=1
    initengine=0
    maxidle=30
    security=1
    englife=50
    And we restarted the service, which restarted successfully.
    My question is: I had changed maxengine=5 and minengine=1 on 20-JUL-2008 around 16:00. Did this happen because of these two parameters?
    Kindly advise.
    Thanking you in anticipation.
    Best Regards,
    Devendra

    Hello Mahendraji,
    Thanks for your reply.
    I may ask some basic questions; please accommodate me.
    <<$ORACLE_HOME\reports\conf\apple_report.conf>>
    I have Application Server 9i version 1.0.2.0.0 (iSuite).
    I have installed it on the D drive.
    On this drive a folder d:\oracle is created.
    Inside it I don't have any folder called "reports";
    I have a folder called "806":
    \\webdb-01\D-Drive\Oracle\806
    Inside of which I have the folder "report60":
    \\webdb-01\D-Drive\Oracle\806\REPORT60
    Inside of which I have the folder "SERVER":
    \\webdb-01\D-Drive\Oracle\806\REPORT60\SERVER
    In it I found "Rep60_WEBDB-01.ora", "Rep60_WEBDB-01.log" and "cgicmd.dat",
    and 2 folders, "CACHE" and "security".
    I am confused: you said that the file name should be Rep60_WEBDB-01.CONF, but I have Rep60_WEBDB-01.ora. Are they one and the same? If yes, is the extension different because I have the older 9i version, i.e. 1.0.2.0.0 (iSuite)?
    If no, where and how can I search for it?
    This Rep60_WEBDB-01.ora file does not have the parameter "engineResponseTimeOut"; if I add it, will it work?
    Your explanation regarding "engineResponseTimeOut" is clear.
    The maximum time taken by our reports is 20 minutes. But sometimes, when a report hangs or the SQL takes more time than normal due to the environment, the maximum might stretch to 25 or 30 minutes. In such a case, should we keep it at 45 minutes to avoid any unnecessary termination?
    <<Note:
    It is always better to run batch reports on a separate server with different engineResponseTimeOut values. Do not submit interactive and batch reports to same server.>>
    If I create a separate application server, how can I send reports requested by users to batch on this separate server?
    What should be the criteria for a report to be redirected to the separate server?
    What do you mean by "submit interactive" and "submit batch"?
    My interpretation is that when we click the Submit Query button it is "submit interactive"; is that correct?
    I am not clear about "submit batch".
    I know these might be silly questions, but please do accommodate me.
    Kindly advise.
    Thanks.
    Regards,
    Devendra Shelke

  • Loading bulk data in to oracle

    What is the procedure to load a bulk amount of data into an Oracle database?

    Where is your incoming data? Is it in a table, in an external file, or in another database?
    You might consider reading this document...
    http://www.dbspecialists.com/presentations/load_faster.html
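    Two common patterns for bulk loads in Oracle are SQL*Loader direct-path loads and external tables combined with a direct-path insert. A minimal external-table sketch (the directory object, file name and columns are hypothetical):

    ```sql
    -- Assumes a directory object pointing at the folder holding emp.csv:
    --   CREATE DIRECTORY data_dir AS '/data/incoming';
    CREATE TABLE emp_ext (
      empno NUMBER,
      ename VARCHAR2(30)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('emp.csv')
    );

    -- APPEND requests a direct-path insert, bypassing the conventional
    -- row-by-row path for large volumes.
    INSERT /*+ APPEND */ INTO emp SELECT * FROM emp_ext;
    COMMIT;
    ```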

  • Load Bulk shared members in EPMA

    Hi,
    We need to develop a process to be able to load members in bulk, since manual addition will not help. I specifically want to (1) load existing dimension members whose parent has changed, and ensure these members are MOVED and not duplicated as shared, and (2) load shared members.
    Any guidance in this regard will be helpful.

    Hi John
    I can't map IsPrimary to any property in EPMA.
    I added the column in my flat file, but in my import profile I can't map it, so the load gives: 'The 'IsPrimary' column is not present in the import source for dimension 'Entity'. Creating all imported members as primary instances.'
    Is the property only applicable to the Account dimension? I'm trying to load shared members in the Entity dimension.
    Cheers,
    Omar

  • Load bulk of files from folder

    Hi,
    Is there a way to create a program that loads CSV files from a shared folder, but loads all the files? I.e., if there are 5 files in the folder, load them one after another; the order is not important.
    Assume that I don't know the file names.
    Regards
    James

    Hi
    Yes it's possible to do it
    - Frontend: you can use the method CL_GUI_FRONTEND_SERVICES=>DIRECTORY_LIST_FILES to get all the files stored in a directory, and then CL_GUI_FRONTEND_SERVICES=>GUI_UPLOAD to read them (file by file).
    - Application server: you can use a function module like SUBST_GET_FILE_LIST to get all the files stored in a directory, and then OPEN DATASET to read them (file by file).
    Max

  • Tools to Auto load bulk data from SAP BW into Essbase?

    Does anyone know the most efficient way to retrieve a large amount of data from SAP BW into Essbase on a regular basis? We currently export data from BW into several Excel files, which has the limitation of 64k rows per batch; we then load the Excel files into Essbase using rules files. And it's a manual process. Do you know a better tool/approach we can use to retrieve the data and load it into Essbase in one batch, without the need of human interference? Thanks for your good ideas!

    You can use HAL to do the extraction and loading of data from SAP. Did you try this?

  • Loading bulk data into an ODS

    Hi,
    I have to extract a big volume of data (millions of records) into an ODS. But the problem is that I cannot give selections in the InfoPackage, as there are no fields in the Data Selection tab.
    If I extract all the data, then the ODS activation would definitely fail. Is there any way of doing this?
    Many Thx.

    I think that with Siggi's post you have another version of the same suggestion, and maybe you understand it better now.
    Anyway, if you want to split your load into several steps (in order to avoid a single request with a lot of records), go to RSA6, search for your DataSource, check the 'selection' column for the required field, activate, replicate the source system, and then you will find your field available for selection in the InfoPackage!
    But I also suggested checking why you have an activation error, since it's not a normal situation: you can easily load one million records into the active table of your ODS. Investigate the reason and try to remove the cause!
    Hope it is clearer now...
    Bye,
    Roberto

  • Problem in flat file loading (bulk data)

    Hi,
    I am populating a flat file with 3 lakh (300,000) records. I am getting an error due to a buffer space problem:
    "Error during target file write"
    Around 2 lakh records get inserted before the above error appears.
    For now, as a temporary workaround, I am populating 1.5 lakh records at a time, in two runs.
    Could anyone please tell me how to solve this issue?
    Thanks in Advance.

    I am using OWB version 10.2.
    In the map, the source is an Oracle table and the target is a flat file.
    The source table contains 3 lakh records.
    I executed the map. During execution it throws the following error:
    "Error during target file write"
    But it has inserted 2 lakh records into the target file.

  • Unable to load suppliers using interface.

    Hi,
    I was able to load the supplier records into the table AP_SUPPLIERS_INT. No errors were encountered and no .BAD files exist. When I run Supplier Open Interface Import in apps, the screen shows the following:
    1. Total Suppliers Imported: 0
    2. Total Suppliers Rejected: 0
    3. *** No Data Exists for this Report ***
    The version of apps is 12.1.3 and the database is 11.1.
    Please help. Thanks.
    Regards,
    peopsquik08

    I would check these Oracle docs :
    Note: 444736.1 - Supplier Open Interface Does Not Import NUM_1099 and VAT_REGISTRATION
    Note: 605302.1 - R12: How to Import VAT Registration Number at Supplier Site Level
    These may also help :
    http://oracleappss.blogspot.com/2008/07/payables-data-migration.html
    download.oracle.com/docs/cd/B40089_10/current/acrobat/120aprg.pdf
    The forum also has its own area for this and a search of it may help :
    Financials
    Best Regards
    mseberg

  • Reg "Allow Bulk Data Load"

    Hi all,
    GoodMorning,.
    What exactly does the "Allow Bulk Data Load" option on the Company Profile page do? It is clear in the doc that it allows CRM On Demand consultants to load bulk data, but I am not clear on how they load it. Do they use any tools other than those the admin uses for data uploading?
    Any real-time implementation example using this option would be appreciated.
    Regards,
    Sreekanth.

    The Bulk Data Load utility is similar to the Import utility, and On Demand Professional Services can use it for imports. It is accessed from a separate URL, and once a company has allowed bulk data load, Professional Services can use the utility to import the company's data.
    The Bulk Data Load utility uses a method similar to the Import utility, the differences being that the number of records per import is higher and you can queue multiple import jobs.

  • Supply load through thyristor

    Hello
    I'm writing to this forum for the first time. I apologize in advance for my poor English, but I hope you will understand what I would like to ask.
    I'm new to LabVIEW. I have built a few simple applications, but I have problems with this one. I have LabVIEW 8.0 Full Development, an NI USB-6251 and an SCC-68.
    I need to build a system which runs on the AC line voltage, 50 Hz. The load must be supplied according to the following demand:
    The load is supplied for 5 s in the following procedure:
    12x (10 ms ON – 30 ms OFF) - number of cycles (loops) could vary
    24x (10 ms ON – 50 ms OFF) - number of cycles (loops) could vary
    39x (10 ms ON – 70 ms OFF) - number of cycles (loops) could vary
    Picture 1 shows the current through the load (just the first three cycles).
    To make it easier I will use a thyristor to supply the load, as shown in the schematic (picture 2).
    With LabVIEW I plan to find the zero crossing, and based on it I will trigger the gate of the thyristor. Because of that design I will have ~5 ms of reserve for triggering (I can trigger ~2.5 ms before the predicted/calculated positive zero crossing and wait 2.5 ms after it). In my opinion this should be enough, because I do not have a real-time operating system. I think that one zero-crossing reading should be enough for the next 5 s of operation (the line frequency is very stable). I built a VI for testing the zero crossing; that works fine, but I have a problem because the times vary, and I do not manage to predict the second zero crossing with ±2.5 ms accuracy.
    I'm afraid that this is not the right way. Could you suggest how to do it?

    Hello BR,
    I'm not exactly sure what you're trying to do, but let’s see what we can figure out.  It sounds like you're trying to control the current throughput to your load using a thyristor and control the thyristor with the USB-6251.
    It sounds like you want to turn the thyristor on for the specified pattern:
    "12x (10 ms ON – 30 ms OFF) - number of cycles (loops) could vary
    24x (10 ms ON – 50 ms OFF) - number of cycles (loops) could vary
    39x (10 ms ON – 70 ms OFF) - number of cycles (loops) could vary"
    What signal are you using to control the thyristor?  Are you using a digital line or something else?
    Also, I'm unsure what you mean by "predicting-calculating". Are you saying that you're trying to predict when you need to trigger the thyristor in order to turn it on for the 10 ms intervals mentioned above, or are you referring to something else? If that is what you're trying to do, I would recommend using correlated digital output to generate the desired waveform with hardware timing. As you've mentioned, a Windows operating system does not use deterministic timing. While allowing 2.5 ms for this trigger to occur is a good idea, Windows may take longer or shorter depending on what else it is doing. That is likely the time variation you're seeing.
    If you can define your desired input and output signals a little more it would be easier to decide on the best solution.
    Cheers,
    Brooks

  • Suppliers API's in 11i

    Hello,
    I am creating an application similar to iSupplier, where I need to create Suppliers, Supplier Sites and Contacts. If the supplier clicks Submit, the record should be created right away. There would be a delay if I had to use the supplier interface tables and the import. Is there a way I can load them using APIs, without using the concurrent program import process? In simple words, it should work like iSupplier.
    Thanks,
    Sharad

    Please check the following; you will get your answer:
    Is There A Bulk Supplier / Supplier User Registration API Available? [ID 256790.1]
    Does the Supplier Open Interface API Support Updates to Supplier Information? [ID 795270.1]

  • Handling rejections in the Payable's Supplier Open Interface Import Process

    I'm using the suppliers API to mass load the suppliers. I'm loading the tables AP_SUPPLIERS_INT, AP_SUPPLIER_SITES_INT and AP_SUP_SITE_CONTACT_INT in parts: first AP_SUPPLIERS_INT and then the other two tables. Due to various errors I get some rejections on the first table. If I want to correct the data in the interface tables, what should I do with the rejected records I want to correct? (a) Should I correct the data, leave the STATUS and REJECTION_CODE as the API left them, and re-run the Open Interface Import process? (b) Should I delete the contents of those fields? (c) Should I delete the entire table?
    I tried option (a), but the process seemed to take forever compared with the first time I ran it, and I cancelled the request.
    Thanks in advance.

    Hi,
    Unhide the Debug (Debug Switch) parameter of the Report, Supplier Open Interface Import and run the program with Debug flag as Yes.
    Please post the log to help us understand the issue.
    Regards,
    Sridhar
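    If the underlying data has been corrected, one approach is to clear the status columns on the rejected rows so the next import run picks them up again. A hedged sketch (column names from the standard interface tables; confirm the exact STATUS values used in your release before updating):

    ```sql
    -- Reset rejected header rows so Supplier Open Interface Import
    -- re-evaluates them ('REJECTED' is the assumed status value).
    UPDATE ap_suppliers_int
       SET status      = NULL,
           reject_code = NULL
     WHERE status = 'REJECTED';
    COMMIT;
    ```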

  • Supplier Catalog upload w/o Product ID in SRM MDM Catalog

    Hi experts,
    The supplier catalog loaded by the supplier will not have a Product ID. I believe that even though we don't have a product ID, we can create a shopping cart (SC) in SRM.
    Will there be any issue if we are not getting the product ID in the catalog upload file from the supplier?
    I think this SC will be created as free text in SAP SRM. Is the supplier part number equivalent to the Product ID?
    Please help.
    Thanks,
    SK

    Q: The supplier catalog loaded by the supplier will not have a Product ID. I believe that even though we don't have a product ID, we can create an SC in SRM.
    A: Yes, you can. The SC is created based on the product category.
    Q: Will there be any issue if we are not getting the product ID in the catalog upload file from the supplier? I think this SC will be created as free text in SAP SRM.
    A: Yes.
    Q: Is the supplier part number equivalent to the Product ID?
    A: The product ID and the supplier part number are totally different.
    For example, the OCI fields are: product number: NEW_ITEM-MATNR[n]; supplier part number: NEW_ITEM-VENDORMAT[n].
    br
    muthu

  • SQL Loader Exception while loading Partitioned table

    Hi,
    I have a table EMP, and it has year-wise partitions created based on the CREATION_DATE column.
    Now I am using SQL*Loader to load bulk data from a Java program, but I am getting a SQL*Loader exception. When I drop the partitioning on the table, the same code works fine.
    Do I need to do anything extra for the partitioned table?
    Please help me.
    Thanks

    SQL*Loader should produce a log file with an error code (or codes) in it. Check for that.
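    One cause worth ruling out is a row whose partition key does not map to any partition (ORA-14400), which only appears once the table is partitioned. A sketch of a year-wise range partition with a catch-all partition (table and column names assumed from the post):

    ```sql
    CREATE TABLE emp (
      empno         NUMBER,
      creation_date DATE
    )
    PARTITION BY RANGE (creation_date) (
      PARTITION p2007 VALUES LESS THAN (TO_DATE('01-01-2008', 'DD-MM-YYYY')),
      PARTITION p2008 VALUES LESS THAN (TO_DATE('01-01-2009', 'DD-MM-YYYY')),
      -- Catch-all partition prevents ORA-14400 on out-of-range dates.
      PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );
    ```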
