Loadjava / soap.jar / errors during loading into 9.2 dbms

Hello!
I followed the description at http://otn.oracle.com/sample_code/tech/java/jsp/loadjars.html and tried to test a web service call from a Java stored procedure in an Oracle 9.2 database, but I get errors like the following while running loadjava:
referenced object oracle/dms/http/HttpBasicAuthorizationException could not be resolved (and similar messages, e.g. for OracleSOAPHTTPConnection).
Any idea what I should do instead?
regards
Harald.

This note details the errors I typically get from loading, as well as the extra commands I need to run to get a call out to a particular endpoint working (as extra information on top of the writeup you already used): Can 8i be a Web Service Consumer?
The other thing to make sure of is that you use the soap.jar that ships with Oracle9iAS, not the one that ships with the database, as the database soap.jar is from a much older version of Apache SOAP than the one that ships with 9iAS. I believe the above writeup details this too.
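For example, once soap.jar is copied over from the 9iAS installation, the load looks something like this (user, password and connect string are placeholders; -resolve makes loadjava report anything it cannot resolve):
loadjava -user scott/tiger@mydb -resolve -verbose soap.jar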
Mike.

Similar Messages

  • ODI : how to raise cross reference error before loading into Essbase?

    Hi John, if you read my post I want to say that you impress me! Really, thanks for your blog.
    Today, my problem is :
    - I received a bad quality data file from ERP extract
    - I have cross reference table (Source ==> Target)
    - >> How can I raise the error before loading into Essbase?
    My idea is the following (first of all, I'm not sure if it is a good one, and I am also having trouble doing it in ODI!):
    - Step 1 : make a JOIN between data.txt and the cross-reference tables ==> create a table DATA_STEP1 in the ODISTAGING schema (the columns of DATA_STEP1 are the columns of data.txt plus those of the cross-reference tables; there are more than 20 columns in my case)
    - Step 2 : check that there is no NULL value in the target columns (NULL means that the data.txt file contains values that are not defined in my cross-reference tables) by using a filter (Filter = Target_Account IS NULL or Target_Entity IS NULL or ...)
    The result of this interface is sent to a reject.txt file - if reject.txt is not empty, then a mail is sent to the administrator
    - Step 3 : do the opposite: Filter = NOT (Target_Account IS NULL or Target_Entity IS NULL ...) ==> the result is sent to the DATA_STEP3 table
    - Step 4 : run the mapping properly: source DATA_STEP3 (the clean and verified data!) with the cross-reference tables, and send the data into Essbase - normally, there are no rejected records!
    My main problem is: what is the right IKM to send data into the DATA_STEP1 or DATA_STEP3 tables, which are Oracle tables in my ODISTAGING schema? I tried IKM Oracle Incremental Update but I get an error, and actually I don't need an update (which is time-consuming), I just need an INSERT!
    I'm just looking for an "IKM SQL to Oracle" ....
    regards
    xavier

    Thanks John, very speedy!
    I understood better now which IKM is useful.
    I found more information about error follow-up with ODI: http://blogs.oracle.com/dataintegration/2009/10/did_you_know_that_odi_generate.html
    and I decided to activate integrity control in ODI:
    I load :
    - data.txt in ODITEMP.T_DATA
    - transco_account.csv in ODITEMP.T_TRANSCO_ACCOUNT
    - transco_entity.csv in ODITEMP.T_TRANSCO_ENTITY
    - and so on ...
    - Moreover, I created integrity constraints between T_DATA and T_TRANSCO_ACCOUNT, T_TRANSCO_ENTITY, etc., so I expected that ODI would raise the bad records for me in E$_DATA (the error table)!
    However, I have one issue when loading data.txt into T_DATA, because I have no ID or primary key ... I read in a training book that I could use a SEQUENCE ... I tried but without success ... :-(
    Is there another simple way to create a primary key automatically (T_DATA is in an Oracle schema, of course)? Thanks in advance.
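    PS: in plain SQL, what I am trying to reproduce with ODI would be something like this (ID is just a hypothetical name for the new surrogate-key column):
    ALTER TABLE ODITEMP.T_DATA ADD (ID NUMBER);
    CREATE SEQUENCE ODITEMP.T_DATA_SEQ;
    CREATE OR REPLACE TRIGGER ODITEMP.T_DATA_BI
      BEFORE INSERT ON ODITEMP.T_DATA
      FOR EACH ROW
    BEGIN
      -- fill the hypothetical ID column from the sequence on every insert
      SELECT ODITEMP.T_DATA_SEQ.NEXTVAL INTO :NEW.ID FROM DUAL;
    END;
    /
    ALTER TABLE ODITEMP.T_DATA ADD CONSTRAINT T_DATA_PK PRIMARY KEY (ID);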

  • 1 duplicate key error during insert into table

    Dear Gurus,
    While applying an SAP_HR patch, I am facing this issue.
    Current level is SAPKE60455
    Trying to apply patch - SAPKE60456
    This is seen in the import log
    Start import R3TRVDATV_512W_O ...
    client 000:   1
    client 001:   1
    client 070:   1
    client 080:   1
    client 100:   1
    client 320:   1
    1 duplicate key error during insert into table T512T occured
    End import R3TRVDATV_512W_O (with warnings)
    Please help, thanks in advance.
    R3trans is at version 6.19
    Regards,
    Omkar

    Hello,
    The issue was resolved after upgrading R3trans.
    Regards,
    Omkar

  • XI SOAP adapter : Error during parsing the SOAP part

    Hello experts,
    We have several scenarios SOAP to IDocs.
    We are on XI 3.0 SP19, and since 9/6/09 around 4:00 PM GMT, the SOAP channels we use to communicate with a specific third party have all stopped accepting incoming SOAP requests. All the messages are stuck on the third-party side, and we get the following error in the default trace and application logs:
    #1.5#001125A5BCD00066000076850013109E00046BF6A963E479#1244607054865#/Applications/ExchangeInfrastructure/AdapterFramework/SAPLibraries/SAPXDK#sap.com/com.sap.aii.af.soapadapter#com.sap.aii.messaging.mo.Message.reparseRootDocument()#PIMSOAP#21217##pxaci_PXA_6443451#PIMSOAP#a71ebfa0556d11decc24001125a5bcd0#SAPEngine_Application_Thread[impl:3]_50##0#0#Error#1#com.sap.aii.messaging.mo.Message#Plain###Error during parsing the SOAP part --- java.lang.NullPointerException#
    It happened at the same time in all environments, for every interface using SOAP messages from this sender to XI. That's why we think it is a global issue, but we cannot find anything about this error. Has anybody encountered it, or have any clue where it comes from?
    Many thanks,
    Best regards,
    Guislain

    Hello Michal,
    Thanks a lot for your help. Actually, we tested with SoapUI yesterday, and this is what came out of the tests:
    test 1 : we called the XI service with the WSDL and put dummy data inside --> test OK
    test 2 : we loaded one of the files from our partner into SoapUI and called the service --> test NOT OK
    test 3 : we realized that there were some blank lines at the beginning of the file provided by the partner; we removed these blank lines --> test OK
    So we first thought that the format of the file sent by the partner was wrong because of these blank lines, and that this was why we were getting the parsing error. The thing is, these blank lines have always been there! We checked old successful messages: they also contain these lines, and if we test them in SoapUI we get an error as well. So it does not seem to come from this.
    Moreover, as I said in my previous message, it happened in all environments at approximately the same time, so we suspect a global problem - but which one? That's the million-dollar question.
    Would you have any other ideas?
    Thanks a lot,
    Best regards,
    Guislain

  • Errors during Loading TD from Flat file to PSA

    Hi Guys,
    I got some errors while loading flat file transaction data into the PSA using an InfoPackage.
    Could you guys help me solve the errors below? It would also be appreciated if you could point me to material that explains these types of errors in more detail.
    Points will be awarded.
    Field /BIC/ZIO_PRIC ( Position 4 ): External length specification will be ignored
    Message no. RSDS101
    Diagnosis
    The value 23 is specified as output length in Field /BIC/ZIO_PRIC ( Position 4 ). It is also specified that the data is shipped in internal format.
    System Response
    The length specification for output length is ignored. If the data in the source actually have the specified lengths, this results in conversion errors.
    Procedure
    Select the external format if required.
    Field /BIC/ZIO_PRIC ( Position 4 ): Amount field in internal format; check settings
    Message no. RSDS099
    Diagnosis
    Setting made for Field /BIC/ZIO_PRIC ( Position 4 ) states that the data from the source is available in the internal format.
    This is not usually the case for this source system type.
    System Response
    If you do not change this setting, the data has to be available in the internal SAP format.
    Example:
    For the Japanese Yen this means: 1000 JPY has to be delivered from the source as 10.00 JPY.
    Procedure
    Check the data format of the source and where necessary, change the setting to External.
    Field /BIC/ZIO_PRIC ( Position 4 ): Missing reference to currency field/unit field
    Message no. RSDS151
    Diagnosis
    If an amount or quantity field is not delivered with values in internal format, an associated currency field or unit field must be specified so that the currency or units can be converted during data import.
    System Response
    The DataSource can not be activated.
    Procedure
    Specify a reference field if the data is delivered in external format.
    Specify a reference field even if the currency or unit is constant. Fields with fixed values are not part of the data source; they are filled by the system when loaded.

    Hi JUergen.
    The order of the fields is the same as in the DataSource. I didn't create a currency field for the amount in the flat file, since the key figure was defined with fixed currency USD. I don't think we have to create such a field unless we use 0CURRENCY as the unit for price instead of a fixed currency.
    Despite the above error, the DataSource can still be activated, but 10.00 USD shows up as 1000 USD.
    Cheers,
    Shrinu

  • Error While loading into BW using infoPackage

    Hi All,
    While following the directions posted at the following wiki page https://wiki.sdn.sap.com/wiki/display/BOBJ/ConfiguretheLoadJobinBW%28DS+3.2%29,
    we get an error while loading data into BW using an InfoPackage via Data Services 3.2.
    On the SAP BI side, the error says: "Error while executing the following command:  -sJO return code:220"
    On the DS side, in the RFC server log, we get the error: "Error while executing the following command -sJOB_SQL_TO_DBI -Ubbu -P;(encrypted password) -A (encrypted information) -VMODE=F"
    The issue is still open with support, as we still cannot execute the job from the InfoPackage; it is successfully executed from Designer, the admin console, and a batch file.
    Any insights/help would be appreciated.
    Thanks.
    Edited by: rana_accn on Jan 26, 2010 6:48 PM

    Sure I will.
    One thing we have noticed in server_eventlog: for the log line "Starting job with command line parameters", the parameters are different.
    When the job is executed from the DS side, the parameters -Ct -Cm -Ca -Cj -Cp -S -N -Q -U -P -G -r -T -Locale are used.
    When the job is executed from the BW side, the parameters -Ct -Cm -Ca -Cj -Cp -s -N -Q are used.
    Also, when the job is executed from BW, log files start getting created, but then the trace and error files show an error that the database cannot be opened - and that points to the DSCentral repository. This repository is not even mapped to the job server, and the job was exported from the correct repository, as command-line execution of the job works perfectly fine.
    Thnx
    Just an update: we were never able to figure out why this happened in the existing environment. One explanation offered was that the commands being executed from the BW side were somehow getting corrupted. We had to build a new environment, and this is no longer an issue there, as all jobs can now be executed from BW. The only difference between the two environments is that we are now on 12.2.1.3, while earlier we were on 12.2.0.0.
    Edited by: rana_accn on Mar 23, 2010 10:21 PM
    Edited by: rana_accn on Apr 29, 2010 9:46 PM

  • Error while loading into DTP

    Hi,
    I am loading data into a DSO from 0FI_AP_30. I can load the PSA successfully, but while loading the PSA data with the DTP I get the following error message:
    @5D@     Request REQU_D44GG0YIPPH7NBQ5F1VOCB890 not extracted; request is in obsolete version of DataSource     @35@
    and the help shows this response:
    Diagnosis
    The request REQU_D44GG0YIPPH7NBQ5F1VOCB890 was loaded into the PSA table when the DataSource had a different structure to the current one. Incompatible changes have been made to the DataSource since then and the request cannot be extracted with the DTP anymore.
    System Response
    The request is in PSA /BIC/B0001232, version 001. The current PSA version is 002. An extraction would lead to a short dump because of field incompatibilities.
    Procedure
    If you no longer require the data, you do not need to take any further action.
    If you do still require the data, you can just extract it by creating a generic DataSource in the Myself source system in PSA /BIC/B0001232, version 001. You can find the technical name of the PSA table in the table RSTSODS with the specified PSA name and version.  Convert the data using a separate transformation into the new target format.
    Please let me know what can be done about this.
    Thanks & Regards,
    Vijaya

    Hi Vijaya,
    I am also facing the same error, for DataSource 2LIS_03_BF; I am unable to load it into one DSO. Can you elaborate on how you solved your issue?
    I can't just rerun the InfoPackage, because this DataSource requires downtime in ECC and we have already done that.
    Thanks & Regards
    Manna Das

  • Error during loading of transaction data: An RFC call-up triggered the ...

    Dear Sirs,
    During loading of master data from a 4.6C SAP system to BI 7.0 SP9, I get the following error during processing (data packet):
    "An RFC call-up triggered the exception (ID: RSAR NO: 503)"
    The strange thing about this error is that it emerged today, after we successfully loaded master data and a subset of transaction data from the same source system yesterday.
    Has anyone else come across this error, and can you tell us where we should look for a possible solution (or a better error explanation)?
    The RFC connection works fine when tested in SM59, and it was working yesterday. There are no relevant dumps in either system.

    Hi folks,
    I am also interested in this error.
    Thanks.

  • Error during loading and deletion of write-optimized DSO

    Hey guys,
    I am using a write optimized DSO ZMYDSO to store data from several sources (two datasources and one DSO).
    I have disabled the check of uniqueness in the DSO, but I defined a semantic key for the fields ZCLIENT, ZGUID, ZSOURCE, ZPOSID which are used in a non-unique index.
    In the given case, I want to delete existing rows in the DSO. I execute these steps in the end routine. Here is the abstract coding:
    LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
      " some other logic [...]
      DELETE FROM /BIC/AZMYDSO00
        WHERE /BIC/ZCLIENT = <RESULT_FIELDS>-/BIC/ZCLIENT
          AND /BIC/ZGUID   = <RESULT_FIELDS>-/BIC/ZGUID
          AND /BIC/ZSOURCE = <RESULT_FIELDS>-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = <RESULT_FIELDS>-/BIC/ZPOSID.
    ENDLOOP.
    COMMIT WORK AND WAIT.
    During the loading (after the transformation step, in the update step), I get the following messages (not every time):
    1.     Error while writing the data. (RSAODS131)
    2.     Could not Save DataPackage xy in DataStore ZMYDSO (RSODSO_UPDATE027).
    Diagnosis: DataPackage XY could not be saved. Reasons therefore could be violation of key uniqueness (duplicate data) or general database error.
    3.     Error in the substep of updating DataStore.
    I have checked the system log (SM21) and the system dumps (ST22) but I could not find an exact error description.
    I guess I am creating some inconsistencies or locks (I also checked SM12), so that the load process gets interrupted. I also tried serial updating within the DTP (I reduced the number of batch processes to 1). No success.
    Perhaps the loading of one specific package could take longer, so that the following package overtakes its predecessor. Could that be a problem? Do you generally advise against deleting rows within the end routine?
    Regards,
    Philipp

    Hi,
    is ZMYDSO the name of the DSO?
    And is this the end routine of the transformation that loads this same DSO?
    If so, we never do such a thing.
    You are comparing the DSO with the data that is flowing in, and then deleting the data from the DSO.
    That doesn't actually make any sense, because while loading data into a DSO (or a cube, or any table), the DSO (or cube) is locked exclusively against modifications; you can only read data from it.
    If your requirement is that existing duplicate records should not arrive in the DSO, you can delete them from SOURCE_PACKAGE in the start routine, like below:
    SELECT FIELDS FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE WHERE <CONDITION>.
    LOOP AT INTERNAL_TABLE.
      DELETE SOURCE_PACKAGE
        WHERE /BIC/ZCLIENT = INTERNAL_TABLE-/BIC/ZCLIENT
          AND /BIC/ZGUID   = INTERNAL_TABLE-/BIC/ZGUID
          AND /BIC/ZSOURCE = INTERNAL_TABLE-/BIC/ZSOURCE
          AND /BIC/ZPOSID  = INTERNAL_TABLE-/BIC/ZPOSID.
    ENDLOOP.
    Or, if your requirement is to delete the old data from the DSO for the same keys that are newly arriving, in order to load the new data into the DSO, you could do something like this in the start routine:
    SELECT FIELDS FROM /BIC/AZMYDSO00 INTO TABLE INTERNAL_TABLE
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE /BIC/ZCLIENT = SOURCE_PACKAGE-/BIC/ZCLIENT
        AND /BIC/ZGUID   = SOURCE_PACKAGE-/BIC/ZGUID
        AND /BIC/ZSOURCE = SOURCE_PACKAGE-/BIC/ZSOURCE
        AND /BIC/ZPOSID  = SOURCE_PACKAGE-/BIC/ZPOSID.
    " now update the new values you want to write in the loop
    LOOP AT INTERNAL_TABLE INTO WORK_AREA.
      " code for manipulation of WORK_AREA
      " write a MODIFY statement to update RESULT_PACKAGE
      MODIFY RESULT_PACKAGE FROM WORK_AREA TRANSPORTING FIELDS.
    ENDLOOP.
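    One caveat with the FOR ALL ENTRIES variant: if SOURCE_PACKAGE is empty, the FOR ALL ENTRIES conditions are ignored and the SELECT would read the whole active table, so it is safer to run it only inside an IF SOURCE_PACKAGE IS NOT INITIAL check.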
    hope it helps,
    Regards,
    Joe

  • ERROR DURING LOADING DATA FROM PSA TO DSO IN "PRUEFMBKT"

    Respected all,
    I am trying to load data from an R/3 field (PRUEFMBKT, CHAR, length 40) into an InfoObject. The data arrives correctly in the PSA, but when I load it into the DSO or InfoCube, the request turns red with the following error message:
    Value '010/12.05-112.15/hold on not operate' (hex.
    '3031302F31322E30352D3131322E31352F686F6C64206F6E20') of characteristic
    PRU_LSIL contains invalid characters
    Message no. BRAIN060
    Diagnosis
        Only the following standard characters are valid in characteristic
        values by default:
        !"%&''()*,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ.+
        Furthermore, characteristic values that only consist of the character #
        or that begin with ! are not valid.
        You are trying to load the invalid characteristic value 1. (hexidecimal
        representation 3031302F31322E30352D3131322E31352F686F6C64206F6E20).
    Procedure for System Administration
        Try to change the invalid characters to valid characters.
        If you want values that contain invalid characters to be admitted into
        the system, make the appropriate setting in BW Customizing. Refer to the
        documentation describing the requirements for special characters and the
        possible consequences.
        For more information on the problems involved with valid and invalid
        characters, click here.
    Kindly give me a solution for how to deal with this problem.
    thanks
    abhay

    Hi Abhay,
    please check whether the InfoObject corresponding to field PRU_LSIL in the BW system has the lowercase checkbox enabled in the InfoObject properties, as the value '010/12.05-112.15/hold on not operate' coming from R/3 contains lowercase letters. If other special characters are involved, the permitted characters can also be maintained in BW Customizing (transaction RSKC).
    Hope this helps.
    Regards,
    Umesh.

  • Weblogic 7.0 MIB - syntax error when loading into Openview NNM

    I tried to load the WebLogic Server 7.0 MIB supplied into OpenView NNM, but I received the following message:
    Error detected while loading MIB File BEA-WEBLOGIC-MIB.asn1. This MIB cannot be loaded until the following problem is corrected:
    Line 222: Error in registration of 'CacheMonitorRuntimeAccessCount': Subid '25' already defined by 'applicationRuntimeApplicationName' .
    Where can I get fixed MIB?
    thank you
    Stanislav

    I do not see this element in my MIB from the 7.0 GA version.
    I am attaching my MIB here. Can you please check with it?
    thanks,
    Mihir
    "Stanislav Paltis" <[email protected]> wrote in message
    news:3d2b25f9$[email protected]..
    I tried to load Weblogic Server 7.0 MIB supplied into Openview NNM, but Ireceived the following message:
    Error detected while loading MIB File BEA-WEBLOGIC-MIB.asn1. This MIBcannot be loaded until the following problem is corrected:
    >
    Line 222: Error in registration of 'CacheMonitorRuntimeAccessCount': Subid'25' already defined by 'applicationRuntimeApplicationName' .
    >
    Where can I get fixed MIB?
    thank you
    Stanislav[BEA-WEBLOGIC-MIB.asn1]

  • SLG1 errors during load of DNL_CUST_CNDALL

    Hi,
    In our system, when loading DNL_CUST_CNDALL, SLG1 shows the following errors:
    1. There are no entries for certain tables (around 30 different tables), so these tables cannot be deleted.
    2. Some fields of a table are not referenced in access CRM PR UTXC.
    3. Creating names of generated objects failed.
    4. Customizing error: Field PRED_HEADER_GUID is not defined in application CRM.
    I have checked the long texts and the notes wherever possible, but they were not much help. Most of these are /SAPCND/GENERATION errors.
    Has anyone faced these kinds of problems? Any input on how to solve them?
    Thanks,
    Rashmi.

    Hi
    I also had a lot of problems when downloading DNL_CUST_CNDALL to CRM. Hopefully this will help anyone else having similar issues.
    For the PRED_HEADER_GUID issue, try note 766711.
    In CRM 5.0 some conditions are already created in CRM as a local source, so the download from R/3 fails. Please follow the instructions in note 1003793 to delete the necessary conditions in CRM prior to the download.
    Cheers
    Rhianna

  • Error during Load

    Gurus,
    I am trying to load data from BI Content DataSource 0HR_PY_1 to cube 0PY_C02, but when the InfoPackage runs, it throws an error and no records get transferred:
    Error while reading the payroll area information D2 , table T549A     HR_BW_PY99
    I have made sure the transfer rules are activated, and I have redone the process several times. What am I doing wrong? Please help.

    Hi,
    First make sure that you can see the data in RSA3; this at least confirms that data can come from R/3. Is your DataSource active? If not, go to RSA6, activate it, and then try again.
    Also, check this for information:
    http://help.sap.com/saphelp_nw04/helpdata/en/a5/94e8288eec11d4b2fb00010220c65f/content.htm
    Also, check the table from which the DataSource is extracting data:
    http://help.sap.com/saphelp_nw04/helpdata/en/22/7883373c1af326e10000009b38f8cf/content.htm
    Hope this helps.
    PB

  • Error During Load Infocube from ODS

    Hi guys, I have an InfoCube and an ODS. I load data into the ODS directly from SAP and it works, but I get an error when I try to load data from the ODS into the InfoCube. I've created an update rule from the ODS to the InfoCube, but the load request fails with "Load Request is invalid". I have tried loading the InfoCube from the ODS in many ways (full, init, delta) and I get the same error in all of them. Do I need to do something beforehand, or change the data targets in some way?
    I hope somebody can help me with this.

    HI,
    Activate the data in your ODS. Then right-click your ODS and choose "Create export DataSource". Now right-click your ODS, go to "Manage", and make sure that the QM status is green and the data mart status is checked with a green mark. Then carry on with the load from the ODS to the InfoCube.
    Hope this helps. Let me know.
    Regards,
    R.Ravi

  • Error During Loading Master Data

    Hi Gurus,
    I'm loading master data from R/3 to BW 3.5, but I'm getting the following error for all the InfoSources:
    "Request IDoc : Application document not posted"
    Can anyone help me out with this?
    Thanking you in advance,
    R.Saravanan

    Hi Saravanan,
    check these previous postings on the same issue:
    InfoPackage loading giving IDOC Error: Status 51
    Application document not posted
    Cheers,
    Aravindhan
