Issues while generating Schema DAT files

We are facing two types of issues when generating schema ".dat" files from an Informix database on Solaris OS using the
"IDS9_DSML_SCRIPT.sh" file.
We are executing the command at the Solaris prompt as follows:
"IDS9_DSML_SCRIPT.sh <DBName> <DB Server Name>"
1. The first issue is that after the command is executed, while generating the ".dat" files, the following error occurs. This error occurs for many tables:
19834: Error in unload due to invalid data : row number 1.
Error in line 1
Near character position 54
Database closed.
This happens randomly for some schemas, so we move the script to a different folder in Unix and execute it again.
Can we get a solution for avoiding this error?
2. The second issue is as follows:
When the ".dat" files are generated without any errors using the script, these .dat files are provided to the OMWB tool to load the Source Model.
The issue here is that sometimes OMWB is not able to complete the process of creating the Source Model from the .dat files and gets stuck.
Sometimes the tables are loaded, but with wrong names.
For example, the .dat file has the table name s/ysmenus for the sysmenus table,
and when loaded into Oracle the table is created with the name s_ysmenus.
Based on our analysis and understanding, this error occurs because of the "Delimiter".
For example, this is a snippet from a .dat file generated by the IDS9_DSML_SCRIPT.sh script; the table name sysprocauthy is generated as s\ysprocauthy.
In Oracle this table is created with the name s_ysprocauthy.
s\ysprocauthy║yinformixy║y4194387y║y19y║y69y║y4y║y2y║y0y║y2005-03-31y║y65537y║yT
y║yRy║yy║y16y║y16y║y0y║yy║yy║y╤y
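
As an illustration only (not a confirmed fix), a rough post-processing sketch that strips such a stray backslash from the leading table-name field before the .dat files are handed to OMWB could look like the following; it assumes the backslash is an escape artifact of the unload and that the first field on each line is the table name:

    import java.io.BufferedReader;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.nio.charset.StandardCharsets;

    // Illustrative workaround sketch (not part of IDS9_DSML_SCRIPT.sh): strip a
    // backslash that splits the leading identifier, e.g. "s\ysprocauthy..." ->
    // "sysprocauthy...", and write the cleaned rows to a new .dat file.
    public class FixDatTableNames {
        public static void main(String[] args) throws IOException {
            File in = new File(args[0]);                  // original .dat file
            File out = new File(args[0] + ".fixed");      // cleaned copy for OMWB
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                     new FileInputStream(in), StandardCharsets.ISO_8859_1));
                 PrintWriter writer = new PrintWriter(out, "ISO-8859-1")) {
                String line;
                while ((line = reader.readLine()) != null) {
                    writer.println(line.replaceFirst("^([A-Za-z])\\\\", "$1"));
                }
            }
        }
    }

This only masks the symptom; the question of why the unload emits the escape in the first place still stands.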
Thanks & Regards
Ramanathan KrishnaMurthy

Hello Rajesh,
Thanks for your prompt reply. Please find my findings below:
*) Have there been any changes in the extractor logic causing it to fail before the write-out to file, since the last time you executed it successfully? - I am executing only the standard extractors from the extractor kit, so presumably this shouldn't be an issue.
*) Can this be an issue with changed authorizations? - I will check this today, but again this does not seem likely, because the same object executed fine and produced a file for a different test project I created.
*) Has the export folder been locked or write-protected at the OS level? Have the network settings (if it is a virtual directory) changed? - It does not seem so, for the same reason as above.
I will do some analysis today and get back to you for your help.
Regards
Gundeep

Similar Messages

  • Issue while generating a CSV file through Oracle.

    Hi,
    I am generating a CSV file using an Oracle stored procedure.
    In the procedure I need to create a CSV file that contains the data of a select query and then e-mail this file. The problem is that one of the fields contains a comma, e.g. 'ABC,DE'. In the CSV file 'ABC' then appears in one cell and 'DE' is shifted to the adjacent cell, which is not what is required.
    Thanks.

    Hi, welcome.
    If your data contains commas, then make ~ your delimiter,
    or replace all the commas in your data:
    select replace(column_name, ',', '') from table_name; /* before writing to the file */
    Regards,
    Bhushan
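
    An alternative that keeps the commas (a hedged sketch, not from the thread above; the quoting follows the common RFC 4180 convention) is to wrap any field that contains the delimiter in double quotes:

    // Hedged sketch: quote CSV fields that contain commas instead of removing them.
    // The field values below are illustrative, not from the original procedure.
    public class CsvQuoting {
        static String csvField(String value) {
            if (value == null) return "";
            // Quote when the value contains a comma, a quote or a line break,
            // and double any embedded quotes (RFC 4180 convention).
            if (value.contains(",") || value.contains("\"") || value.contains("\n")) {
                return "\"" + value.replace("\"", "\"\"") + "\"";
            }
            return value;
        }

        public static void main(String[] args) {
            System.out.println(csvField("ABC,DE") + "," + csvField("XYZ"));
            // prints: "ABC,DE",XYZ
        }
    }

    Most spreadsheet tools and CSV parsers honour the quotes, so 'ABC,DE' stays in a single cell without touching the data.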

  • Performance issue while generating Query

    Hi BI Gurus,
    I am facing a performance issue while generating a query on 0IC_C03.
    It has a (from & to) variable for generating the report for a particular time duration.
    If the (from & to) variable fields are filled, the query runs for a long time and then shows a run-time error.
    If the query is executed without filling the variable (which is optional), the data is extracted from the beginning until today, and the same query takes less time to execute.
    After that the period has to be selected manually with the "keep filter value" option. Please suggest how I can solve this error.
    Regards
    Ritika

    Hi Ritika,
    Welcome to SDN.
    You have to check the following run-time segments using the ST03N tcode:
    High Database Runtime
    High OLAP Runtime
    High Frontend Runtime
    If it is high Database Runtime:
    - check the aggregates or create aggregates on the cube; this will help you.
    If it is high OLAP Runtime:
    - check the user exits, if any.
    - check whether hierarchies are used and are being read at a deep level.
    If it is high Frontend Runtime:
    - check whether a very high number of cells and formattings are transferred to the frontend (use "All data" to get the value "No. of Cells"), which causes high network and frontend (processing) runtime.
    For the from and to date variables, create one more set, use it and try.
    Regards,
    VACHAN

  • SWF error- while generating the flash files.

    Post Author: [email protected]
    CA Forum: Xcelsius and Live Office
    Hi, in the old forum I came across the following topic: SWF error while generating the flash files.
    As I have the same problem, I would like to know the solution / answer / meaning, please. What do I have to change in order to make it function properly again?
    Thanks anyone.
    Femke
    (The post is from 3/6/2007)
    The following error is generated upon generating the flash files. Any insight?
    SWF error while generating the flash files:
    Error: SWF Function Overflow. The generated SWF will not function correctly because data dependencies go beyond SWF limits.

    Post Author: Andres
    CA Forum: Xcelsius and Live Office
    Hello.
    I'm having the same problem. It appeared after I made use of many cells (a table of 10x200 cells) with formulas inside them.
    Before that, while I was doing my tests and only used a few cells and the .XLF file was 4 MB, I had no problem. Now that the .XLF file is 8 MB I see this error appearing (in Spanish):
    "Error
    Desbordamiento de la función SWF. El archivo SWF generado no funcionará correctamente ya que las dependencias de datos superan los límites de SWF."
    (SWF function overflow: the generated SWF will not work correctly because the data dependencies exceed the SWF limits.)
    Any idea how this could be resolved (apart from using less data)?
    Thanks in advance.

  • Issue while Processing the Huge File in BPEL

    Hi,
    We are facing an issue while processing a huge file in a BPEL process (more than a 1 MB file). When I test with files of more than 1500 transactions (more than 1 MB), the BPEL process automatically goes to OFF mode or the messages go to the manual recovery queue.
    We are facing this issue in production as well, so we are using a UNIX script to split the file before placing it in the BPEL input directory. Any pointers to resolve this issue would be helpful.
    Thanks,
    Saravana
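
    The UNIX split script itself is not shown in the thread; a rough, illustrative equivalent of that pre-processing step (the chunk size, the paths and the assumption that one line equals one transaction are all hypothetical) could look like this:

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.List;

    // Illustrative sketch only: split a large inbound file into chunks of
    // LINES_PER_CHUNK lines before they are dropped into the BPEL input directory.
    public class SplitInboundFile {
        static final int LINES_PER_CHUNK = 500;    // hypothetical batch size

        public static void main(String[] args) throws IOException {
            Path source = Paths.get(args[0]);       // large source file
            Path inputDir = Paths.get(args[1]);     // BPEL input directory
            List<String> lines = Files.readAllLines(source, StandardCharsets.US_ASCII);
            int part = 0;
            for (int i = 0; i < lines.size(); i += LINES_PER_CHUNK) {
                List<String> chunk = lines.subList(i, Math.min(i + LINES_PER_CHUNK, lines.size()));
                Path target = inputDir.resolve(source.getFileName() + ".part" + (++part));
                Files.write(target, chunk, StandardCharsets.US_ASCII);
            }
        }
    }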

    Hi,
    Please find the answers.
    1. Currently we are using SOA 10.1.2 and JDeveloper 10g.
    2. We are using the File Adapter.
    3. Yes, we used debatching.
    4. Yes, I am able to recover from the manual recovery queue.
    5. Please find the error message:
    <2009-05-21 04:32:38,461> <DEBUG> <ESIBT.collaxa.cube.engine.dispatch> <Dispatcher::adjustThreadPool> Allocating 1 thread(s); pending threads: 1, active threads: 0, total: 83
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> File : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B is ready to be processed.
    <2009-05-21 04:32:44,077> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Processing file : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> onBatchBegin: Batch 'bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000' (/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B) starting...
    <2009-05-21 04:32:44,077> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> Inside TranslatorFactory
    <2009-05-21 04:32:44,078> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> using version attribute = NXSD
    <2009-05-21 04:32:44,078> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> loading xlator class...oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl
    <2009-05-21 04:32:44,081> <DEBUG> <ESIBT.collaxa.cube.translation> <TranslatorFactory::log> class loaded
    <2009-05-21 04:32:44,081> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Created translator : oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl@46908ae8
    <2009-05-21 04:32:44,098> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting up Control dir for debatching error recovery
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Control dir for debatching error recovery : /opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/j2ee/home/fileftp/controlFiles/localhost_ESIBT_BPELProcess_810~1.0_/inbound
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Invoking inbound translation for : Input5162009.B2B
    <2009-05-21 04:32:44,121> <DEBUG> <ESIBT.collaxa.cube.translation> <NXSDTranslatorImpl::log> Starting translateFromNative
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.translation> <NXSDTranslatorImpl::log> Done with translateFromNative
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Completed inbound translation for : Input5162009.B2B
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> isTextFile : true
    <2009-05-21 04:32:44,139> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Translated inbound batch index 1 of file {Input5162009.B2B} with corrupted message count = 1
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Error Reader created using charset :ASCII
    <2009-05-21 04:32:44,139> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Sending message to Adapter Framework for rejection to user-configured rejection handlers : {
    fileName=/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B, startLine=1, startColumn=1, endLine=-1, endCol=-1, Exception=ORABPEL-11167
    Error while reading native data.
    [Line=1, Col=70] Expected "\t" at the specified position in the native data, while trying to read the data for "element with name HDR_STORE_NUM", using "style" as "terminated" and "terminatedBy" as "\t", but not found.
    Ensure that "\t", exists at the specified position in the native data.
    <2009-05-21 04:32:44,139> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting batchId in NativeRecord to bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000
    <2009-05-21 04:32:44,139> <WARN> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> [Read_ptt::Read(Object)] - onReject: The resource adapter 'File Adapter' requested handling of a malformed inbound message. However, the following bpel.xml activation property has not been defined: 'rejectedMessageHandlers'. Please define it and redeploy the business process. Will use the default Rejection Directory file:///opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages for now.
    <2009-05-21 04:32:44,140> <WARN> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> [Read_ptt::Read(Object)] - onReject: Sending invalid inbound message to Exception Handler:
    <2009-05-21 04:32:44,140> <INFO> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> Handing rejected message to DEFAULT rejection handler: file:///opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages since none of the configured rejection handlers [] succeeded.
    <2009-05-21 04:32:44,140> <DEBUG> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> Finished persisting rejected message to file system under the name: /opt01/app/ESIBT/oracle/esibt10gR2iAS/BPEL10gR2/iAS/integration/orabpel/domains/ESIBT/archive/jca/BPELProcess_810/rejectedMessages/INVALID_MSG_BPELProcess_810_Read_20090521_043244_0140.dat
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Setting last error record to : -1
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Translator has failed to translate any message from batch number: 1
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Message not published as translation failed: {
    File=/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B, batchIndex=1, PublishSize=1
    <2009-05-21 04:32:44,141> <ERROR> <ESIBT.collaxa.cube.activation> <AdapterFramework::Inbound> onBatchFailure: Batch 'bpel://localhost/ESIBT/BPELProcess_810~1.0//Input5162009.B2B_1242894594000' (/harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B) has failed due to: ORABPEL-11167
    Error while reading native data.
    [Line=1, Col=70] Expected "\t" at the specified position in the native data, while trying to read the data for "element with name HDR_STORE_NUM", using "style" as "terminated" and "terminatedBy" as "\t", but not found.
    Ensure that "\t", exists at the specified position in the native data.
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleting file : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B after processing.
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleting file : Input5162009.B2B
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Deleted file : true
    <2009-05-21 04:32:44,141> <DEBUG> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Removing file /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B from files to be processed Map.
    <2009-05-21 04:32:44,141> <INFO> <ESIBT.collaxa.cube.activation> <File Adapter::Inbound> Done processing File : /harbinger/prod/xlate/Outbound_Input_LDS_810/Input5162009.B2B
    <2009-05-21 04:33:09,698> <DEBUG> <ESIBT.collaxa.cube.engine.data> <ConnectionFactory::getConnection> GOT CONNECTION 4 Autocommit = false
    This error message suggests the file is not being picked up because an expected \t is missing, but I am facing the same issue for all files where the load is huge.
    Thanks,
    Saravana

  • Deadlock with thread issues while generating reports with Crystal Report XI

    We are facing a deadlock with thread issues while generating reports with Crystal Reports XI.
    The version number is 11.0 and the database used is Oracle.
    In the log file, at line number 74350, at 2008/12/16 13:35:54, there is a deadlock: Thread '4' is waiting to acquire the lock for 'com.crystaldecisions.reports.queryengine.av@15214b9', which is held by Thread '0'.
    And a deadlock: Thread '0' is waiting to acquire the lock for 'com.crystaldecisions.reports.queryengine.av@15214b9', which is held by Thread '4'.
    Exactly 10 minutes later, at 2008/12/16 13:45:54, we can see that threads 4 and 0 are declared as STUCK.
    Is this an existing issue with Crystal Reports?
    Is there some solution for this problem?
    The log file information is given below:
    [deadlocked thread] [ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)':
    Thread '[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'' is waiting to acquire lock 'com.crystaldecisions.reports.queryengine.av@15214b9' that is held by thread '[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)''
    Stack trace:
         com.crystaldecisions.reports.queryengine.av.V(Unknown Source)
         com.crystaldecisions.reports.queryengine.av.do(Unknown Source)
         com.crystaldecisions.reports.queryengine.as.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.c(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.a(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.a(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.cy.b(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.cy.long(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.o(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.a(Unknown Source)
         com.crystaldecisions.reports.common.ab.a(Unknown Source)
         com.crystaldecisions.reports.common.ab.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.o(Unknown Source)
         com.crystaldecisions.reports.reportengineinterface.a.a(Unknown Source)
         com.crystaldecisions.reports.reportengineinterface.a.a.b.a(Unknown Source)
         com.crystaldecisions.reports.sdk.ReportClientDocument.open(Unknown Source)
         com.sysarris.aris.crystalreports.RepServlet.generateReport(RepServlet.java:65)
         com.sysarris.aris.crystalreports.RepServlet.doPost(RepServlet.java:40)
         javax.servlet.http.HttpServlet.service(HttpServlet.java:763)
         javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
         weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:225)
         weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:127)
         weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:272)
         weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:165)
         weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3153)
         weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
         weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:1973)
         weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:1880)
         weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1310)
         weblogic.work.ExecuteThread.execute(ExecuteThread.java:207)
         weblogic.work.ExecuteThread.run(ExecuteThread.java:179)
    [deadlocked thread] [ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)':
    Thread '[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'' is waiting to acquire lock 'com.crystaldecisions.reports.queryengine.av@12e0415' that is held by thread '[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)''
    Stack trace:
         com.crystaldecisions.reports.queryengine.av.V(Unknown Source)
         com.crystaldecisions.reports.queryengine.av.do(Unknown Source)
         com.crystaldecisions.reports.queryengine.as.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.c(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.a(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.datainterface.j.a(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.cy.b(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.cy.long(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.o(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.a(Unknown Source)
         com.crystaldecisions.reports.common.ab.a(Unknown Source)
         com.crystaldecisions.reports.common.ab.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.if(Unknown Source)
         com.crystaldecisions.reports.reportdefinition.a1.o(Unknown Source)
         com.crystaldecisions.reports.reportengineinterface.a.a(Unknown Source)
         com.crystaldecisions.reports.reportengineinterface.a.a.b.a(Unknown Source)
         com.crystaldecisions.reports.sdk.ReportClientDocument.open(Unknown Source)
         com.sysarris.aris.crystalreports.RepServlet.generateReport(RepServlet.java:65)
         com.sysarris.aris.crystalreports.RepServlet.doPost(RepServlet.java:40)
         javax.servlet.http.HttpServlet.service(HttpServlet.java:763)
         javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
         weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:225)
         weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:127)
         weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:272)
         weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:165)
         weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3153)
         weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
         weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:1973)
         weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:1880)
         weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1310)
         weblogic.work.ExecuteThread.execute(ExecuteThread.java:207)
         weblogic.work.ExecuteThread.run(ExecuteThread.java:179)
    Can you please suggest any workaround for this?

    I'm not referring to Servlet threading issues.
    I'll clarify.
    You have two threads, both entering ReportClientDocument.open(...) method.
    Thread 4 is waiting to acquire 'com.crystaldecisions.reports.queryengine.av@15214b9'.
    Thread 0 is waiting to acquire 'com.crystaldecisions.reports.queryengine.av@12e0415'.
    So I'm thinking: are they the same objects?
    My specific question concerning the ReportClientDocument is that both are calling open - i.e., trying to open a new report. You wouldn't be trying to open different reports using the same ReportClientDocument - so I was wondering if you've cached the RCD and are trying to open two different reports at the same time on the same instance via different threads.
    You'd normally tie a ReportClientDocument instance to a HTTP Session, to ensure each user gets their own copy.
    Sincerely,
    Ted Ueda
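
    A minimal sketch of the per-session pattern Ted describes (the session attribute name, report path and open options below are placeholders, not values taken from this thread):

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpSession;
    import com.crystaldecisions.reports.sdk.ReportClientDocument;
    import com.crystaldecisions.sdk.occa.report.lib.ReportSDKException;

    // Sketch: give each HTTP session its own ReportClientDocument instead of
    // sharing one cached instance across servlet threads.
    public class ReportDocumentHolder {
        static ReportClientDocument getDocument(HttpServletRequest request, String reportPath)
                throws ReportSDKException {
            HttpSession session = request.getSession(true);
            ReportClientDocument doc =
                (ReportClientDocument) session.getAttribute("reportClientDocument");
            if (doc == null) {
                doc = new ReportClientDocument();
                doc.open(reportPath, 0);           // 0 = default open options (placeholder)
                session.setAttribute("reportClientDocument", doc);
            }
            return doc;
        }
    }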

  • Getting Duplicate Object existing issue while deploying the BIAR file

    Hi All,
    We are trying to deploy a BIAR file with the XI R2 command-line tool InstallEntSdkWrapper, but we are getting a duplicate-object-exists issue while deploying the BIAR file.
    Error Message:
    [report] [InstallEntSdkWrapper.main] Connecting to CMS plmdevapp31:6400 as administrator
       [report] [InstallEntSdkWrapper.CmsImportFile] Exception: An error occurred at the server :
       [report] Failed to commit objects to server : Duplicate object name in the same folder.
       [report]
       [report] [InstallEntSdkWrapper.main] BIAR File could not be imported
    If we do any promotion with the Import Wizard, we have the "Overwrite object contents" option to overwrite existing objects. It would be very helpful if anyone could suggest how we can achieve this through InstallEntSdkWrapper.
    Unfortunately there is no documentation available on InstallEntSdkWrapper.
    Cheers!

    That's a limitation with the XI Release 2 InstallEntSdkWrapper.jar tool.
    Sincerely,
    Ted Ueda

  • Issue while archiving the processed file in sender communication channel using SFTP adapter

    Hi All,
    In one of my scenarios (File to IDoc), we are using an SFTP sender communication channel.
    We are facing an issue while archiving the processed file. Sometimes PI processes the file successfully but is unable to archive it, and in the next poll PI processes and archives the same file again, which creates duplicate orders in ECC.
    Please let us know how to resolve this issue.

    Hi Anil,
    Refer to the archiving concepts in the links below:
    http://help.sap.com/saphelp_nw73/helpdata/en/44/682bcd7f2a6d12e10000000a1553f6/content.htm?frameset=/en/44/6830e67f2a6d12e10000000a1553f6/frameset.htm
    http://scn.sap.com/docs/DOC-35572
    Warm Regards,
    DNK Siddhardha.

  • 7303030 patch - an error occurred while generating forms library files

    Hi,
    While applying the 12.1.1 upgrade patch 7303030, I got an error in the A1000 phase:
    "An error occurred while generating forms library files" - continue as if it were successful?
    As we can generate forms after completion of the patch, I selected Yes.
    Later, I got "An error occurred while generating Oracle Forms files"; here also I selected to continue.
    Please tell me, is this the right way to continue the patch?
    I didn't find any errors in the worker log files; the error was reported after all workers had quit.
    After completion of the patch, what necessary steps do I need to perform?
    Thanks,

    Hi,
    The errors are due to the missing prerequisite patch 6400501:
    Forms 10.1.2.3 compilation against an 11.2 DB fails with PL/SQL ERROR 801 ... internal error [60603] [ID 1065020.1]
    After applying the patch, the forms generated successfully.
    I applied patch 6400501 but still get the internal error [60603]; is there something more to be done?
    Thank you.

  • Issues while generating the output(using VF02- header- output)

    Hi Group,
    I have an issue while generating output using tcode VF02.
    The issue is that I have an output type which is configured for the medium "Special Function", and the output program generates a PDF depending on certain criteria.
    Now, when I use the "repeat output" option to generate the output again, I get errors. It sometimes works by choosing the option "Successfully processed", then selecting the record and saving it. If I do the same thing again, the output may run into an error or succeed; it varies every time.
    I am not able to understand why the process is not working consistently: when using the "Repeat output" option on a "Successfully processed" record, the system sometimes gives an error instead of a success message. It succeeds once and sometimes not.
    Kindly let me know your inputs if you have encountered the same kind of issue.
    Regards,
    Vishnu.

    hi Group,
    When I checked the configuration of the output type I am currently processing, I could see that in the customizing of this output type there is a checkbox for Multiple Issuing, which is not checked.
    When it is not checked, it should not be possible to issue the output multiple times, but when I tried to generate the output again and again, in a few attempts I could generate the PDFs and in some cases not. I don't know why this Multiple Issuing checkbox seems to have no effect.
    Could you please let me know why this issuing of output works on some occasions and not on others, even though the configuration has the Multiple Issuing checkbox unchecked?
    thanks for your help in advance.
    Regards,
    Vishnu.

  • Problem generating XML data file through XSD

    Hello there,
    I'm trying to generate an XML data file as per the steps below:
    Making a normal schema in the query transform.
    Mapping the desired source columns into the query schema.
    Right-clicking on the query transform and generating an XML schema (XSD) on my local system.
    In the object library, adding the XSD as an XML file format with the schema as the root element.
    Creating a target instance from the object library XML file.
    In the target file, specifying the file creation path on my local system.
    After performing the above steps and executing the job, I get the error below.
    Any help on this will be much appreciated.
    Regards,
    Jagari

    I found some code while looking into a similar issue - I don't know how to recover the other data (I guess the rest of the HTML) in the code. I need to find a better reference with details and examples.
    I fixed it by changing:
        ssDebug.trace(moreStuff.xliff.file.body.trans-unit); // - error
    to:
        ssDebug.trace(moreStuff.file.body['trans-unit']); // - no error
    (the hyphen in 'trans-unit' is not valid in dot notation, so the element has to be addressed with bracket notation)
    with the expected output (no error):
    <trans-unit id="001" resname="IDS_ZXP7_JAM_01">
      <source>If, while you are printing, your printer stops, ...</source>
    </trans-unit>
    <trans-unit id="002" resname="IDS_ZXP7_JAM_02">
      <source>look at the Operator Control Panel (OCP) for the fault description.</source>
    </trans-unit>
    <trans-unit id="003" resname="IDS_ZXP7_JAM_03">
      <source>If the fault is a card jam, open and close the Print Cover (or Options Cover).</source>
    </trans-unit>
    <trans-unit id="004" resname="IDS_ZXP7_JAM_04">
      <source>The printer will initialize and move the jammed card to the Reject Bin.</source>
    </trans-unit>
    <trans-unit id="005" resname="IDS_ZXP7_JAM_TITLE">
      <source>Card Jam</source>
    </trans-unit>

  • Issue while generating report using web.show_document with https

    Hi All,
    I am facing an issue while viewing the report using web.show_document, as shown below:
    https://ucrmskr.apac.nsroot.net:10301/forms/html/001725032_gca.rtf_
    In this case the report opens directly, without asking me for the save, open or cancel option,
    whereas if I hit
    http://scrmskr.apac.nsroot.net:7801/forms/html/001725032_gca.rtf_
    it asks for the save, open or cancel option,
    so that I can save the report to my machine and open it in WordPad format.
    The report generated in the first case does not come out in the proper format.
    Below are my forms.conf mappings:
    # Name
    # forms.conf - Forms component Apache directives configuration file.
    # Purpose
    # It should include the weblogic managed server (routing) directives for
    # the servers where Forms applications are deployed and other miscellaneous
    # Forms component OHS directives.
    # Remarks
    # This file is included with the OHS configuration under
    # $OI/config/OHS/<OHS Node Name>/moduleconf sub-directory.
    # virtual mapping for the /forms/html mapping.
    RewriteEngine on
    RewriteRule ^/forms/html/(..*) /workaroundhtml/$1 [PT]
    AliasMatch ^/workaroundhtml/(..*) "/ucrmap1/weblogic/bea/ucrms/config/FormsComponent/forms/html/$1"
    RewriteRule ^/ucrms/icons/(..*) "/workaroundicons/$1" [PT]
    AliasMatch ^/workaroundicons/(..*) "/ucrmap1/weblogic/bea/ORA_PFRD/forms/java/$1"
    RewriteRule ^/forms/help/(..*) "/workaroundhelp/$1" [PT]
    AliasMatch ^/workaroundhelp/(..*) "/ucrmap1/ucrrgbg2/help/$1"
    <Location /forms>
    SetHandler weblogic-handler
    WebLogicCluster kauh0079:9001
    DynamicServerList OFF
    </Location>
    Please let me know what needs to be done additionally when we hit https, because in the second case we were hitting http with a similar mapping in a different environment and it was generating the report successfully.
    Regards,
    Harish

    Thanks for answering,
    I changed the URL from
    http://nbotlaguduru.dms.local/export/FMSLaborChargesalcs20060829132645.pdf
    to
    http://nbotlaguduru.dms.local:8889/export/FMSLaborChargesalcs20060829132645.pdf
    and the same problem occurred.
    the file is located on my local C drive in:
    C:\lcs\export
    It seems as though I am missing something else as well.
    Any ideas?

  • SOA Manager issue while generating the URL

    Hi All,
    We are working on a "Third party system <--> BizTalk <--> PI (7.31 dual stack) <--> SAP [ECC]" interface using PI.
    We are using XSDs; we did not generate the WSDL files.
    When we try to generate the URL using SOAMANAGER in the quality system, it shows the production URL.
    In the SAP SOAMANAGER web service administration it shows the wrong binding WSDL URL (the production URL); it is supposed to be the quality system binding URL. I have deleted the web service and created it again, but it still shows the same. The system is supposed to generate the binding URL for quality.
    In SXMB_MONI the error is
        <SAP:Category>XIServer</SAP:Category>
       <SAP:Code area="INTERNAL">WS_ADAPTER_SYS_ERROR</SAP:Code>
    <SAP:P1 />
    <SAP:P2 />
    <SAP:P3 />
    <SAP:P4 />
    <SAP:AdditionalText />
    <SAP:Stack>System error while calling Web service adapter: Error when initializing SOAP client application: 'Error when initializing SOAP client application: "SRT: Unexpected failure in SOAP processing occurre"'</SAP:Stack>
    I referred to the notes and links below:
    1142454 - Error when initializing the SOAP application
    1142596 - Logical port was not yet created
    Please help me to resolve the issue..
    Thanks
    Bhargava Krishna

    Hi,
    Could you please check your pricing procedure to see on which value the tax is getting calculated,
    i.e. at which step you are saving the base value.
    The same step number should be given in the From column.
    Please check that.
    Regards,
    Nagesh

  • Issues while generating

    Hello,
    While generating the BRFplus function (using program FDT_GENERATE_MANY_FUNCTIONS), an error is issued.
    We're using the action Call Function Module (customer specific).
    This is how we're mapping the tables required by the FM.
    All three tables used for the mapping are created as data objects referring to a similar DDIC table type (customer specific).
    While generating the function, this is what we get in method PROCESS_PURE.
    As you can see, the DATA definition for the third table is generated as TYPE, not as TYPE TABLE (as it should be, and as happens for the other two).
    Due to this we're getting the following syntax error.
    A number of OSS notes have been applied but the issue persists.
    Any advice on how to solve the issue?
    Thank you in advance
    Carlos

    You should create an OSS ticket for this issue and let the support team identify or create a note for it.

  • Issue while sending a zipped file from ABAP to JAVA layer

    Hi All,
    I have a requirement wherein I have to zip an XML file in ABAP, convert it to a base64-encoded string and send it to the JAVA layer.
    I'm using the class CL_ABAP_ZIP for zipping the XML string and the FM SCMS_BASE64_ENCODE_STR to convert the zipped data to a base64-encoded string.
    But on the JAVA layer we get an exception while unzipping this data.
    Has anybody come across a similar situation?
    Please help.
    Regards,
    Ankit Agrawal
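
    For reference, a minimal sketch of the JAVA-layer counterpart (decode the base64 string, then read the ZIP archive written by CL_ABAP_ZIP); the UTF-8 charset of the XML payload is an assumption:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.util.Base64;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipInputStream;

    // Hedged sketch: base64-decode the string received from ABAP and read the
    // first entry of the ZIP archive produced by CL_ABAP_ZIP.
    public class AbapZipReader {
        static String readFirstEntry(String base64Zip) throws IOException {
            byte[] zipBytes = Base64.getDecoder().decode(base64Zip);
            try (ZipInputStream zis = new ZipInputStream(new ByteArrayInputStream(zipBytes))) {
                ZipEntry entry = zis.getNextEntry();   // entry added via CL_ABAP_ZIP->ADD
                if (entry == null) {
                    throw new IOException("archive contains no entries");
                }
                ByteArrayOutputStream xml = new ByteArrayOutputStream();
                byte[] buffer = new byte[4096];
                int read;
                while ((read = zis.read(buffer)) != -1) {
                    xml.write(buffer, 0, read);
                }
                return xml.toString("UTF-8");          // assumed charset of the XML
            }
        }
    }

    If ZipInputStream reports a corrupt archive with data produced this way, it is worth checking that the base64 string is not being re-encoded, character-converted or truncated on its way to the JAVA layer.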

    Example
    REPORT  Z_PAP_UP_ZIP_DL.
    DATA: L_ZIPPER TYPE REF TO cl_abap_zip.
    DATA: FILEX type XSTRING.
    DATA: FILENAME type string.
    DATA: PATH type string.
    DATA: zip type xstring.
    DATA: FILE_N_TAB type FILETABLE.
    DATA: FULL_PATH type string.
    DATA: FILE_LENGTH type i.
    DATA: FILE_TAB type w3mimetabtype.
    DATA: WA_INT type int4.
    DATA: PATH_TABLE TYPE TABLE of char1024.
    "    Load the file
    "Which file to upload
    CALL METHOD CL_GUI_FRONTEND_SERVICES=>FILE_OPEN_DIALOG
      CHANGING
        FILE_TABLE              = FILE_N_TAB
        RC                      = WA_INT.
    "load the (first) file from the frontend the user has selected.
    LOOP at FILE_N_TAB into FULL_PATH.
    "get the file
      CALL METHOD CL_GUI_FRONTEND_SERVICES=>GUI_UPLOAD
         EXPORTING
           FILENAME                = FULL_PATH
           FILETYPE                = 'BIN'
        IMPORTING
          FILELENGTH              = FILE_LENGTH
        CHANGING
          DATA_TAB                = file_tab.
        exit.
    endloop.
    "create xstring from table
    CALL FUNCTION 'SCMS_BINARY_TO_XSTRING'
      EXPORTING
        INPUT_LENGTH       = FILE_LENGTH
      IMPORTING
        BUFFER             = FILEX
      TABLES
        BINARY_TAB         = file_tab.
    "get the name of the file. we take entry after the last '\' ...windows.
    SPLIT FULL_PATH AT '\' INTO TABLE PATH_TABLE.
    DESCRIBE TABLE PATH_TABLE LINES WA_INT.
    READ TABLE PATH_TABLE INTO FILENAME INDEX WA_INT.
    "zip the file content and get the archive as an xstring
    CREATE OBJECT L_ZIPPER.
    L_ZIPPER->ADD( NAME = FILENAME CONTENT = FILEX ).
    ZIP = L_ZIPPER->SAVE( ).
    "ZIP can now be passed to SCMS_BASE64_ENCODE_STR to obtain the base64 string
