Problem with EXPORT IMPORT PROCESS in ApEx 3.1

Hi all:
I'm having a problem with the EXPORT/IMPORT process in ApEx 3.1.
When I export an application and then try to import it again, I get this error message:
ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful. ORA-06550: line 16, column 28: PLS-00103: Encountered the symbol "牃慥整㈰㈯⼴〲㐰〠㨷㐵㈺′䵐" when expecting one of the following: ( - + case mod new not null <an identifier> <a double-quoted delimited-identifier> <a bind variable> avg count current exists max min prior sql stddev sum variance execute forall merge time timestamp in
As a workaround, I checked the exported file and found this:
wwv_flow_api.create_flow
p_documentation_banner=> '牃慥整⠤㈰㈯⼴〲㠰〠㨷㠵㈺′äµ
When I replace it with this:
p_documentation_banner=> ' ',
I can import the application without the error.
Does anybody know why I have to do this?
Thank you all.
Nicolas.

Hi,
This issue seems to have been around for a while:
Re: Error importing file
I've had similar issues and made manual changes to the file to get it to install correctly. In my case, I got:
ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful.
ORA-02047: cannot join the distributed transaction in progress
begin execute immediate 'alter session set nls_numeric_characters='''||wwv_flow_api.g_nls_numeric_chars||'''';end;
There are several suggestions in that thread about character sets and about reviewing some of the line breaks within the PL/SQL code in your processes. Not sure what would work for you.

Similar Messages

  • Problem with export / import

    Hi everybody, I am having a problem with an import. I am doing an export with a value in a user exit:
    EXPORT V_NLQNR = V_NLQNR TO MEMORY ID 'QUANTL'.
    And afterwards I call import in another program:
    IMPORT V_NLQNR = V_NLQNR FROM MEMORY ID 'QUANTL'.
    But when I check sy-subrc I am getting 4, so I don't get anything.
    Does anybody know why? Is there a problem if I call the export from a user exit?
    Thanks in advance.

    Hello,
    I think you have the right idea.
    As a suggestion I would name my variables to make it clear which data is being
    exported/imported. I would also use different names on the left and right side of the = sign. 
    Here is a working example from programs that I use:
    In the first program
       EXPORT intercodata FROM g_data_exp TO MEMORY ID 'INTERCOWOS'.
         where g_data_exp is declared as a global variable
    In the second program
       IMPORT  intercodata TO g_data_imp FROM MEMORY ID 'INTERCOWOS'.
         where g_data_imp is declared as a global variable
    The syntax that you use ( p1 = dobj1 ) should work as well; just make sure that the variable V_NLQNR to the right of the equals sign has a value before the export (see the sketch below).
    Regards
    Greg Kern
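    As a quick illustration of the p1 = dobj1 form, here is a minimal sketch (variable names, values and the memory ID are invented for the example; remember that ABAP memory is only shared within one internal session, so the importing program has to be started from the same session, e.g. via SUBMIT ... AND RETURN):
    In the first program
       DATA gv_qty TYPE i VALUE 42.                 " source variable must have a value
       EXPORT quantity = gv_qty TO MEMORY ID 'QUANTL'.
    In the second program
       DATA gv_qty_imp TYPE i.
       IMPORT quantity = gv_qty_imp FROM MEMORY ID 'QUANTL'.
       IF sy-subrc <> 0.                            " 4 means nothing was found under the ID
         WRITE: / 'Nothing found under memory ID QUANTL'.
       ENDIF.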

  • Problem with export/import in background

    Hi Experts,
    I have a requirement in which I export an internal table to memory from one program and import it in another program. In the foreground it works fine, but in the background it does not. Is there any workaround?
    Here is sample code.
    Program1:
    types:
      begin of tab_type,
        para type string,
        dobj type string,
      end of tab_type.
    data:
      id    type c length 10 value 'TEXTS',
      text1 type string value `IKE`,
      text2 type string value `TINA`,
      line  type tab_type,
      itab  type standard table of tab_type,
      itab1 type standard table of tab_type.
    line-para = 'P1'.
    line-dobj = 'TEXT1'.
    append line to itab.
    line-para = 'P2'.
    line-dobj = 'TEXT2'.
    append line to itab.
    free memory id 'TD'.
    export itab to memory id 'TD'.
    Program2:
    types:
      begin of tab_type,
        para type string,
        dobj type string,
      end of tab_type.
    data:
      id    type c length 10 value 'TEXTS',
      text1 type string value `IKE`,
      text2 type string value `TINA`,
      line  type tab_type,
      itab  type standard table of tab_type,
      itab1 type standard table of tab_type.
    refresh itab.
    import itab from memory id 'TD'.
    free memory id 'TD'.
    clear line.
    loop at itab into line.
      write: / line-para, line-dobj.
      clear line.
    endloop.
    Thanks,
    Jyothi

    Thanks for your links, Venkat.
    My problem was solved by using the SHARED BUFFER.
    Here is the code I used in the first program:
            DATA: wa_invdata(10) TYPE c VALUE 'INVDATA'.
            DATA: wa_month(10) TYPE c VALUE 'MONTH'.
            DATA: wa_process(10) TYPE c VALUE 'PROCESS'.
            EXPORT wa_month_end TO MEMORY ID 'MONTHEND'.
            EXPORT e_process = it_inv_data[]
            TO SHARED BUFFER indx(st) ID  wa_process.
            EXPORT e_month = wa_month_end
            TO SHARED BUFFER indx(st) ID  wa_month.
            EXPORT e_invdata = it_tab[]
                TO SHARED BUFFER indx(st) ID wa_invdata.
         EXPORT it_inv_data TO MEMORY ID 'PROCESS'.
            DATA: number           TYPE tbtcjob-jobcount,
                  name             TYPE tbtcjob-jobname VALUE 'CRDR_ ',
                  print_parameters TYPE pri_params.
            CONCATENATE name
                    sy-datum
                    INTO name .
    *CALL FUNCTION 'GET_PRINT_PARAMETERS'
    *  EXPORTING
    *     no_dialog              = 'X'
    *     archive_mode           = '3'
    *  IMPORTING
    *     out_parameters         = print_parameters
    *     out_archive_parameters = archi_parameters
    *     valid                  = valid_flag
    *  EXCEPTIONS
    *     invalid_print_params   = 2
    *     OTHERS                 = 4.
            CALL FUNCTION 'JOB_OPEN'
              EXPORTING
                jobname          = name
              IMPORTING
                jobcount         = number
              EXCEPTIONS
                cant_create_job  = 1
                invalid_job_data = 2
                jobname_missing  = 3
                OTHERS           = 4.
            IF sy-subrc = 0.
              print_parameters-pdest = 'locl'.
              SUBMIT zsd_crdr_monthend_back_process
                      TO SAP-SPOOL
                      SPOOL PARAMETERS print_parameters
                      WITHOUT SPOOL DYNPRO
                      VIA JOB name NUMBER number
                    AND RETURN.
              IF sy-subrc = 0.
                CALL FUNCTION 'JOB_CLOSE'
                  EXPORTING
                    jobcount             = number
                    jobname              = name
                    strtimmed            = 'X'
                  EXCEPTIONS
                    cant_start_immediate = 1
                    invalid_startdate    = 2
                    jobname_missing      = 3
                    job_close_failed     = 4
                    job_nosteps          = 5
                    job_notex            = 6
                    lock_failed          = 7
                    OTHERS               = 8.
    Second Program
    Program zsd_crdr_monthend_back_process.
      DATA: wa_invdata(10) TYPE c VALUE 'INVDATA'.
      DATA: wa_month(10) TYPE c VALUE 'MONTH'.
      DATA: wa_process(10) TYPE c VALUE 'PROCESS'.
      IMPORT e_invdata = it_tab[]
      FROM SHARED BUFFER indx(st) ID  wa_invdata.
      IF sy-subrc NE 0 OR it_tab[] IS INITIAL.
    *   No data found for the month-end processing
        MESSAGE s398(00)
           WITH 'No Data Found'.
        STOP.
      ELSE.
    *   Found the data
        IMPORT e_process = it_inv_data[]
        FROM SHARED BUFFER indx(st) ID  wa_process.
        IMPORT e_month = wa_month_end
        FROM SHARED BUFFER indx(st) ID  wa_month.
    Thanks for your help.

  • Problems with export/import

    Hi People,
    I'm fairly new to the world of Oracle; I wonder if you could help me.
    * Imported a dump file into Oracle 8.1.7 R3
    ( Came from Oracle 8.1.7 running on HP/UX, importing onto another HP/UX
    system. Dump file is from one user and contains approx 7GB of data )
    Once imported, this leaves me with approx 10% of my disk left (it's a 9GB
    partition).
    * Ran a purge script against the user's tables that removes transactional
    data over 60 days old. This should get rid of 50-60% of the data.
    * I then exported the table data out to a dmp file
    ( Compressing Extents )
    * removed original user
    * removed rollback segments
    * removed tablespace
    * recreated tablespace
    * recreated rollback segment
    * created user
    * ran imp with the newly created dmp file
    The problem I've got is that once I import the newly created dump file,
    it fills the disk up completely and the import never finishes.
    Please forgive me, as I'm sure I'm doing something fundamentally wrong.
    Can someone explain what I'm doing wrong and how to rectify it?
    regards
    /Lars

    It seems the import is failing because the export to memory fails also.
    Can you explain?
    Did you check the extended debugger facilities like memory areas?
    What is the structure/amount of data for ex- and import, same name, same data object?
    Just reveal what you are really doing, post a few code lines.
    Or not.
    Regards,
    Clemens

  • Problem with Export/Import in background

    Hi Experts,
    I have a requirement in which I export a flag to memory from one program and import it in another program. In the foreground it works fine, but in the background it does not. Is there any workaround?
    Report 1:
    DATA:l_flag type c value 'X'.
    free memory id 'ZFLAG'.
    EXPORT L_FLAG TO MEMORY ID 'ZFLAG'.
    LEAVE PROGRAM.
    Report 2:
    DATA:L_FLAG TYPE C.
      import l_FLAG from memory id 'ZFLAG'.
      free memory id 'ZFLAG'.
      IF l_FLAG = 'X'.
        LEAVE PROGRAM.
      ENDIF.

    Hi,
    You can export the flag to the database; it does not need to be an internal table. You can refer to this link, where it is explained with an example:
    http://help.sap.com/saphelp_45b/helpdata/en/34/8e73a36df74873e10000009b38f9b8/content.htm
    Here obj1 can be a single variable:
    EXPORT obj1 ... objn TO DATABASE dbtab(ar) ID key
    This will work in the background, unlike the EXPORT TO MEMORY statement.
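    For what it's worth, a minimal sketch of that approach might look like this (the area 'zf' and the ID 'ZFLAG' are only illustrative; any INDX-like cluster table and key will do):
    Report 1:
       DATA l_flag TYPE c LENGTH 1 VALUE 'X'.
       " The database cluster is visible across sessions, so this also works
       " when report 2 runs as a background job.
       EXPORT flag = l_flag TO DATABASE indx(zf) ID 'ZFLAG'.
    Report 2:
       DATA l_flag TYPE c LENGTH 1.
       IMPORT flag = l_flag FROM DATABASE indx(zf) ID 'ZFLAG'.
       IF sy-subrc = 0 AND l_flag = 'X'.
         DELETE FROM DATABASE indx(zf) ID 'ZFLAG'.   " clean up the cluster record
         LEAVE PROGRAM.
       ENDIF.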

  • Have a problem with exporting & importing project.

    Hi all,
    I have created my Web Dynpro project on my system. I tried to export it as a zip file and import it onto my laptop. Unfortunately, when it asks where to import the zip file, I give it my workspace on my laptop, but it doesn't accept it. Can anyone help me step by step? Forget about what I have done; please just give me your suggestions.
    thank you
    Regards
    Ravi

    Hi,
    Right-click on your project, select Export as --> zip file, select the location, and click Finish.
    To import it:
    Create a new project with the same name (the project's old name), then choose File --> Import --> zip file, select your zip file, select the new project, and click Finish.
    regards,
    naga

  • Issues with Export/Import using Database & Shared buffer

    Hi All,
    I have a method that calls a program via a job, and I am having issues passing data between the two.
    Note that two different users process the method and the program (via the job), respectively. This is how I am calling the second program:
    SUBMIT ZPROGRAM
                 VIA JOB     l_jobname
                     NUMBER  l_jobcount
                     USER    i_user
                 AND RETURN.
    I need to pass data from the method to the second program and vice versa; the method then continues its processing with the data acquired from the second program.
    I have tried IMPORT/EXPORT using the database and also the shared buffer. What I have found is that most of the time I am able to pass data from the method to the program. However, the job takes a couple of minutes to execute, and I think the data is not making it back to the method in time.
    I have looked at some useful forum links:
    Problem with export/import in background
    Re: EXPORT/IMPORT to MEMORY
    but I haven't been able to find an answer yet. Any solution? Thanks in advance for your help!
    Liz

    Hi Suhas, Subhankar,
    I have already tested the scenario without the job, and it works. That's why I am trying it with the job now, as my requirement is that I need to change the user while executing the second report.
    Here is an example of my import/export; I am passing the return value from the second report to the first.
    Code in second report-
    DATA: INDXKEY LIKE INDX-SRTFD VALUE 'RET1'.
    INDX-AEDAT = SY-DATUM.
    INDX-USERA = SY-UNAME.
    EXPORT RETURN1 TO SHARED BUFFER INDX(ST) ID INDXKEY.
    Code in first report -
    SUBMIT ZPROGRAM
                     VIA JOB     l_jobname
                         NUMBER  l_jobcount
                         USER    i_user
                     AND RETURN.
    Once the JOB_CLOSE FM has executed successfully, I import the values as follows:
    IMPORT RETURN1 TO RETURN1 FROM SHARED BUFFER INDX(ST) ID INDXKEY3.
    INDXKEY has the value RET1.
    However, RETURN1 does not have any values in the first report. It does have a value when executed without the job.
    Please note that I have tried EXPORT/IMPORT with the database too, and I am getting the same results.
    Thanks for your suggestions.
    Regards, Liz
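    Just to illustrate the mechanism being discussed, a minimal matching pair over the shared buffer is sketched below (names and the ID literal are only illustrative). The ID has to be identical on both sides, the import only finds data once the job that does the export has actually finished, and the shared buffer can be displaced under memory pressure, so EXPORT TO DATABASE is the more robust choice across a job boundary:
    In the submitted report (running under the other user):
       DATA lv_return TYPE c LENGTH 1 VALUE 'X'.
       EXPORT return1 = lv_return TO SHARED BUFFER indx(st) ID 'RET1'.
    In the calling program, only after JOB_CLOSE and after the job has finished:
       DATA lv_result TYPE c LENGTH 1.
       IMPORT return1 = lv_result FROM SHARED BUFFER indx(st) ID 'RET1'.
       IF sy-subrc <> 0.
         " Nothing found yet - the job has probably not written the buffer.
       ENDIF.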

  • Export/Import Process in the UI for Variations Content Translation is Generating CMP Files with No XML

    We have a SharePoint 2010 Publishing Website that uses variations to deliver content in multiple languages. We are using a third-party translation company to translate publishing pages. The pages are
    exported using the export/import UI process described here: "http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx".
    Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites. 
    For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file
    to CAB and extracting it, the only files it contains are DAT files.
    The only difference I can see between the sub-sites that generate CMP files with no XML files and those that do not is size. For example, one site that is 114 MB produces a CMP file with no XML files. Small
    sites do not have this problem. If size were the problem, I would expect the process to generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the Export/Import process in the UI is doing.
    This leads to two questions:
    1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
    2. If exporting using the UI will not work, can I use PowerShell? I have tried Export-SPWeb, but the Manifest.XML does not contain translatable
    content. I have not found any parameters that I can use with Export-SPWeb to cause the exported CMP to be in the same format as the one produced by the Export/Import process in the UI.
    As a next step, we could try developing custom code using the Publishing Service, but before doing this, I would like to understand why the Export/Import process in the UI generates a CMP that
    contains no XML files.
    If no one can answer this question, I would appreciate just some general help on understanding exactly what is happening with the Export/Import Process -- that is, the one that runs when you select
    the export or import option in the Site Manager drop down. Understanding what it is actually doing will help us troubleshoot why there are no XML files in certain export CMPs and assist with determining an alternate approach.
    Thanks in advance
    Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com

    I wanted to bump this post to see about getting some more responses to your problem. I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two subsites with the most content take a
    while to generate the .cmp file (one to two minutes of the browser loading bar spinning waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking like you that it must be a size issue. Not sure though. Did you
    ever happen to find a solution to this problem?

  • EE to XE issues exp/imp

    Summary: Problem with export (maybe...terminated successfully with warnings) and import (IMP-00022: failed to process parameters)
    I used PL/SQL developer to make a parameter file (initially I used it to export, then I just stole the file to try again). It contains 100 tables from a single schema.
    Export from prod DB using exp parfile=test.par: Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    server uses ZHS16GBK character set (possible charset conversion)
    About to export specified tables via Conventional Path ...
    . . exporting table     CONTRACT_INFO         12 rows exported
    EXP-00091: Exporting questionable statistics.
    'Export terminated successfully with warnings.'
    I want to import into the cs2_user schema (same privileges as on the production DB), trying:
    impdp 'sys as sysdba'@danieldb schemas=cs2_user dumpfile=test01tables3.dmp
    That gets the error: UDI-00014: invalid value for parameter, 'attach'
    I then thought maybe you have to use imp/exp or impdp/expdp instead of a combination (exp + impdp), so I tried:
    imp 'sys/admin as sysdba'@danieldb touser=cs2_party_owner file=test01tables4.dmp
    but then I just get: IMP-00022: failed to process parameters
    Anyone see why it's all failing? :s
    Mike

    You must put the SID name in single quotes, like:
    C:\Tools\Oracle\product\10.2.0\db\NETWORK\ADMIN>expdp 'sys/manager@stb as sysdba'
    Export: Release 11.2.0.1.0 - Production on Thu Feb 9 13:46:16 2012
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    UDE-06550: operation generated ORACLE error 6550
    ORA-06550: line 1, column 11:
    PLS-00201: identifier 'SYS.DBMS_UTILITY' must be declared
    ORA-06550: line 1, column 11:
    PL/SQL: Statement ignored

  • I'm having problem with exporting video from premiere pro cc 2014

    I'm using Adobe Premiere Pro CC 2014, and I'm having a problem with exporting video.
    When I click Export > Media, the export window appears and it starts exporting the video.
    But after a few minutes, it looks like it has completed exporting the video, and then it says there is an error in the export.
    It literally says it has a problem with compiling, but I don't know what the real problem is.
    I tried many things like restarting the computer and Premiere Pro, but it's no use.
    However, it works when I use Adobe Media Encoder CC 2014 to export the video.
    If anyone knows why this problem might happen, please let me know.

    I'm having a similar issue.  When I go to export my project, I add it to the render queue, set it up with the h.264 settings that I've used with countless past versions of the program, and then hit render.  Randomly, at some point in the process, the render just stops and the ETA just keeps climbing.  I actually had to save an XMF file of the CC 2014 project and bring it into the previous CC version of Premiere just to finish it.  Even a clean re-install of CC 2014 doesn't work.

  • Problem with exporting devices to non-global zone

    Hi,
    I have a problem with exporting devices to my Solaris zones (I'm trying to add support for mounting /dev/lofi/* in my non-global zone).
    I created a config for my zone.
    Here it is:
    $ zonecfg -z sapdev info
    zonename: sapdev
    zonepath: /export/home/zones/sapdev
    brand: native
    autoboot: true
    bootargs:
    pool:
    limitpriv: default,sys_time
    scheduling-class:
    ip-type: shared
    fs:
    dir: /sap
    special: /dev/dsk/c1t44d0s0
    raw: /dev/rdsk/c1t44d0s0
    type: ufs
    options: []
    net:
    address: 194.29.128.45
    physical: ce0
    device
    match: /dev/lofi/1
    device
    match: /dev/rlofi/1
    device
    match: /dev/lofi/2
    device
    match: /dev/rlofi/2
    attr:
    name: comment
    type: string
    value: "This is SAP developement zone"
    global# lofiadm
    Block Device File
    /dev/lofi/1 /root/SAP_DB2_9_LUW.iso
    /dev/lofi/2 /usr/tmp/fsfile
    I rebooted the non-global zone, and even rebooted the global zone, but after that there are no /dev/*lofi/* files in the sapdev zone.
    What am I doing wrong? Maybe I reduced my Sol 10 u4 SPARC installation too much.
    Can anybody help me?
    Thanks for help,
    Marek

    I experienced the same problem on my system Sol 10 08/07.
    Normally, when the zone enters the READY state during boot, its zoneadmd will run devfsadm -z <zone>. In my understanding this is to create the necessary device files in ZONEPATH/dev.
    This worked well until recently. Now only the directories are created.
    It seems as if devfsadm -z is broken. Somebody should open a call with Sun.
    As a workaround you can easily copy the device files into the zone. It is important not to copy the symbolic link but the target.
    # cp /dev/lofi/1 ZONEPATH/dev/lofi
    Hope this helps,
    Konstantin Gremliza

  • URGENT : Problems with exporting a translated application

    Hi! I have a problem with exporting a translated application. I exported both applications (the normal and the translated version). I now want to apply the XLIFF translation, and I want to know if it is possible to apply my old XLIFF file. The problem is that I can't select the same ID for my application.
    I really have problems!
    Anyone can help me ?

    When I import the translated version, the translation works a little bit: all my reports appear, but none of the items appear.

  • Problem in export/import work repo !!!

    Hi All,
    Good Morning.
    I have a problem with exporting and importing repositories. The description of the problem is given below.
    I have a remote machine on which Sunopsis is installed, and all development, implementation, and testing happen there.
    Now I need to replicate the same environment on my local desktop (ODI is installed and the master and work repositories are created).
    When I export the work repo from my remote machine and try to import it on my local machine (from the Designer-level Import -> Work Repository), after a while it gives an "snp_lschema does not exist" error.
    Does anyone have an idea why this is happening?
    Thanks,
    Guru

    Hi Julien,
    Thanks for your input. It is really helpful for me.
    I need some more clarification.
    I exported my master and work repos from my remote machine (as a zip file) and saved them on my local drive (say D:/downloads/deploy.zip...).
    When I try to import the master repo from Topology (browsing to the zip file and clicking OK), it doesn't do anything; I mean nothing happens after I click OK.
    Should I copy this master repo zip into the Sunopsis installation folder (in the IMPEXP directory) and then import it? Am I doing this right?
    Please advise.
    Thanks,
    Guru

  • Problem with export files

    Hello,
    I have got the version 5.3, original.
    I have a problem with exporting files. After working with a lot of pictures in DNG format, I try to export all the pictures in JPG format, but the number of pictures exported is less than the original number.
    Why?
    Thank you in advance

    There is nothing you are doing wrong; the web people need to check their Flash stuff and, most importantly, really include all the files you provide them with. There's a reason why it spits out 4 files...
    Mylenium

  • Problem with Exporter for MS Access 3.2 in SQL Developer

    Hi,
    I have a problem with exporting tables and data from MS Access to XML with the Exporter for MS Access 2000.
    This error occurs: 'Error #5 - XML Exporter'
    When I use the Exporter for MS Access 2002, this error occurs: 'Error #3478 - XML Exporter'
    Any leads on how to solve this problem?

    Thread moved to Forum Home » Database » SQL Developer.
    Please stay tuned there.
    Nicolas.
