Import/Export Analyzer views using API

Hi, I need a solution for exporting/importing views using the Analyzer API.

Got the answer from helpdesk:
apiApplet.run("ImportReportGroup", "", "", "", "");
apiApplet.run("ExportReportGroup", "", "", "", "");

Similar Messages

  • Import/Export publishable packages using WLPI APIs

    Hi,
    I have been trying to create a utility to perform import/export of publishable
    packages. Here is the problem that I encountered:
    When creating the PackageEntry, I need to have a map of all references that a
    publishable object has; however, the method to retrieve the referenced publishables
    is no longer in the Publishable interface:
    public java.util.List getReferencedPublishables( java.util.Map publishables)
    If I pass null for the referenced publishables, the export works, but when I try
    to import the package, I get the following error. Is there any workaround to the
    problem, or is there something that I missed?
    Here is the error:
    The server was unable to complete your request.
    null
    null
    Start server side stack trace:
    Unknown error: com.bea.wlpi.common.WorkflowException: The server was unable to
    complete your request.
    null
         at com.bea.wlpi.server.admin.AdminBean.importPackage(AdminBean.java:1168)
         at com.bea.wlpi.server.admin.AdminBean_11ksof_EOImpl.importPackage(AdminBean_11ksof_EOImpl.java:301)
         at com.bea.wlpi.server.admin.AdminBean_11ksof_EOImpl_WLSkel.invoke(Unknown Source)
         at weblogic.rmi.internal.BasicServerRef.invoke(BasicServerRef.java:298)
         at weblogic.rmi.internal.BasicServerRef.handleRequest(BasicServerRef.java:267)
         at weblogic.rmi.internal.BasicExecuteRequest.execute(BasicExecuteRequest.java:22)
         at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:139)
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:120)
    Nested exception is: java.lang.NullPointerException
         at com.bea.wlpi.server.admin.ImportManager.resolveTDReferences(ImportManager.java:787)
         at com.bea.wlpi.server.admin.ImportManager.importTemplateDefinition(ImportManager.java:659)
         at com.bea.wlpi.server.admin.ImportManager.importPackage(ImportManager.java:293)
         at com.bea.wlpi.server.admin.AdminBean.importPackage(AdminBean.java:1164)
         at com.bea.wlpi.server.admin.AdminBean_11ksof_EOImpl.importPackage(AdminBean_11ksof_EOImpl.java:301)
         at com.bea.wlpi.server.admin.AdminBean_11ksof_EOImpl_WLSkel.invoke(Unknown Source)
         at weblogic.rmi.internal.BasicServerRef.invoke(BasicServerRef.java:298)
         at weblogic.rmi.internal.BasicServerRef.handleRequest(BasicServerRef.java:267)
         at weblogic.rmi.internal.BasicExecuteRequest.execute(BasicExecuteRequest.java:22)
         at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:139)
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:120)
    End server side stack trace
    Start server side stack trace:
    java.lang.NullPointerException
         at com.bea.wlpi.server.admin.ImportManager.resolveTDReferences(ImportManager.java:787)
         at com.bea.wlpi.server.admin.ImportManager.importTemplateDefinition(ImportManager.java:659)
         at com.bea.wlpi.server.admin.ImportManager.importPackage(ImportManager.java:293)
         at com.bea.wlpi.server.admin.AdminBean.importPackage(AdminBean.java:1164)
         at com.bea.wlpi.server.admin.AdminBean_11ksof_EOImpl.importPackage(AdminBean_11ksof_EOImpl.java:301)
         at com.bea.wlpi.server.admin.AdminBean_11ksof_EOImpl_WLSkel.invoke(Unknown Source)
         at weblogic.rmi.internal.BasicServerRef.invoke(BasicServerRef.java:298)
         at weblogic.rmi.internal.BasicServerRef.handleRequest(BasicServerRef.java:267)
         at weblogic.rmi.internal.BasicExecuteRequest.execute(BasicExecuteRequest.java:22)
         at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:139)
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:120)
    End server side stack trace
    Unknown error: com.bea.wlpi.common.WorkflowException: The server was unable to
    complete your request.
    Start server side stack trace:
    java.lang.NullPointerException
         at com.bea.wlpi.server.admin.ImportManager.resolveTDReferences(ImportManager.java:787)
         at com.bea.wlpi.server.admin.ImportManager.importTemplateDefinition(ImportManager.java:659)
         at com.bea.wlpi.server.admin.ImportManager.importPackage(ImportManager.java:293)
         at com.bea.wlpi.server.admin.AdminBean.importPackage(AdminBean.java:1164)
         at com.bea.wlpi.server.admin.AdminBean_11ksof_EOImpl.importPackage(AdminBean_11ksof_EOImpl.java:301)
         at com.bea.wlpi.server.admin.AdminBean_11ksof_EOImpl_WLSkel.invoke(Unknown Source)
         at weblogic.rmi.internal.BasicServerRef.invoke(BasicServerRef.java:298)
         at weblogic.rmi.internal.BasicServerRef.handleRequest(BasicServerRef.java:267)
         at weblogic.rmi.internal.BasicExecuteRequest.execute(BasicExecuteRequest.java:22)
         at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:139)
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:120)
    End server side stack trace
         at weblogic.rmi.internal.BasicOutboundRequest.sendReceive(BasicOutboundRequest.java:85)
         at weblogic.rmi.internal.BasicRemoteRef.invoke(BasicRemoteRef.java:135)
         at weblogic.rmi.internal.ProxyStub.invoke(ProxyStub.java:35)
         at $Proxy4.importPackage(Unknown Source)
         at com.worldchain.wlpiAdmin.WlpiAdmin.handleImport(WlpiAdmin.java:201)
         at com.worldchain.wlpiAdmin.WlpiAdmin.main(WlpiAdmin.java:83)
    Nested exception is: java.lang.NullPointerException
         at com.bea.wlpi.server.admin.ImportManager.resolveTDReferences(ImportManager.java:787)
         at com.bea.wlpi.server.admin.ImportManager.importTemplateDefinition(ImportManager.java:659)
         at com.bea.wlpi.server.admin.ImportManager.importPackage(ImportManager.java:293)
         at com.bea.wlpi.server.admin.AdminBean.importPackage(AdminBean.java:1164)
         at com.bea.wlpi.server.admin.AdminBean_11ksof_EOImpl.importPackage(AdminBean_11ksof_EOImpl.java:301)
         at com.bea.wlpi.server.admin.AdminBean_11ksof_EOImpl_WLSkel.invoke(Unknown Source)
         at weblogic.rmi.internal.BasicServerRef.invoke(BasicServerRef.java:298)
         at weblogic.rmi.internal.BasicServerRef.handleRequest(BasicServerRef.java:267)
         at weblogic.rmi.internal.BasicExecuteRequest.execute(BasicExecuteRequest.java:22)
         at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:139)
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:120)

    There may be options for your process, but as I have mentioned in other threads, the journals in the BPC for Microsoft version are sequenced.  That means that there is a system-generated sequence ID for each journal, so any individual selection of journals could cause an issue for the sequence. I have not seen any past work in SSIS to detach the records for export or import.  It may be possible, but I would assume it will take some core SQL coding.
    Regarding your error, you may need to verify that you have the security set correctly to use Journals and the Data Manager capabilities.  Typically, when security tasks are not correct, you will get an error similar to your HRESULT... error.
    An option for loading the details is to build a worksheet in EVDRE, aggregate the data and send it to the cube at a level that makes it easy to save the file and submit consolidated results.  Just make sure you send all "like" records, aggregated, to a base member, and send them to a datasrc that identifies the extra details. Then store the Excel file with the input values.
    Hope this helps.

  • Help needed with Export Data Pump using API

    Hi All,
    I am trying to do a Data Pump export using the API.
    While the export as well as the import work fine from the command line, they fail with the API.
    This is the command line program:
    expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
    Could you help me with how to achieve the same as above using the Oracle Data Pump API?
    DECLARE
    h1 NUMBER;
    BEGIN
    h1 := dbms_datapump.open('EXPORT','TABLE',NULL,'DP_EXAMPLE10','LATEST');
    dbms_datapump.add_file(h1,'example3.dmp','DATA_PUMP_TEST',NULL,1);
    dbms_datapump.add_file(h1,'example3_dump.log','DATA_PUMP_TEST',NULL,3);
    dbms_datapump.metadata_filter(h1,'NAME_LIST','(''DEV_POOL_DATA'')');
    END;
    Also, in the API I want to know how to export and import multiple tables (selective tables only) using one single criterion like WHERE TIME_NUM > 1204884480100.

    Yes, I have read the Oracle doc.
    I was able to proceed as below, but it gives an error.
    ============================================================
    SQL> SET SERVEROUTPUT ON SIZE 1000000
    SQL> DECLARE
    2 l_dp_handle NUMBER;
    3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    4 l_job_state VARCHAR2(30) := 'UNDEFINED';
    5 l_sts KU$_STATUS;
    6 BEGIN
    7 l_dp_handle := DBMS_DATAPUMP.open(
    8 operation => 'EXPORT',
    9 job_mode => 'TABLE',
    10 remote_link => NULL,
    11 job_name => '1835_XP_EXPORT',
    12 version => 'LATEST');
    13
    14 DBMS_DATAPUMP.add_file(
    15 handle => l_dp_handle,
    16 filename => 'x1835_XP_EXPORT.dmp',
    17 directory => 'DATA_PUMP_DIR');
    18
    19 DBMS_DATAPUMP.add_file(
    20 handle => l_dp_handle,
    21 filename => 'x1835_XP_EXPORT.log',
    22 directory => 'DATA_PUMP_DIR',
    23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    24
    25 DBMS_DATAPUMP.data_filter(
    26 handle => l_dp_handle,
    27 name => 'SUBQUERY',
    28 value => '(where "XP_TIME_NUM > 1204884480100")',
    29 table_name => 'ldev_perf_data',
    30 schema_name => 'XPSLPERF'
    31 );
    32
    33 DBMS_DATAPUMP.start_job(l_dp_handle);
    34
    35 DBMS_DATAPUMP.detach(l_dp_handle);
    36 END;
    37 /
    DECLARE
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
    ORA-06512: at line 25
    ============================================================
    I have a table called LDEV_PERF_DATA and it is in schema XPSLPERF.
    value => '(where "XP_TIME_NUM > 1204884480100")' above is the condition on which I want to filter the data.
    However, the below snippet works fine.
    ============================================================
    SET SERVEROUTPUT ON SIZE 1000000
    DECLARE
    l_dp_handle NUMBER;
    l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    l_job_state VARCHAR2(30) := 'UNDEFINED';
    l_sts KU$_STATUS;
    BEGIN
    l_dp_handle := DBMS_DATAPUMP.open(
    operation => 'EXPORT',
    job_mode => 'SCHEMA',
    remote_link => NULL,
    job_name => 'ldev_may20',
    version => 'LATEST');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.dmp',
    directory => 'DATA_PUMP_DIR');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.log',
    directory => 'DATA_PUMP_DIR',
    filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    DBMS_DATAPUMP.start_job(l_dp_handle);
    DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    ============================================================
    I don't want to export all contents as above; I want to export data based on some conditions and only from selected tables.
    Any help is highly appreciated.
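    Not an authoritative fix, but going by the DBMS_DATAPUMP documentation, a TABLE-mode job along the following lines may do what you describe. The SUBQUERY filter text is just the WHERE clause (no extra parentheses and no double quotes around the whole predicate), and leaving table_name/schema_name NULL in data_filter applies the same clause to every table selected by the NAME_EXPR filter (which assumes the filter column exists in each of those tables; otherwise call data_filter once per table with table_name set). Schema, table, column, file and directory names are placeholders taken from this thread.
    DECLARE
      h1 NUMBER;
    BEGIN
      h1 := DBMS_DATAPUMP.open(operation => 'EXPORT',
                               job_mode  => 'TABLE',
                               job_name  => 'SELECTIVE_EXPORT');
      DBMS_DATAPUMP.add_file(h1, 'selective_export.dmp', 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.add_file(h1, 'selective_export.log', 'DATA_PUMP_DIR',
                             filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      -- pick the schema and the tables that should be part of the job
      DBMS_DATAPUMP.metadata_filter(h1, 'SCHEMA_EXPR', 'IN (''XPSLPERF'')');
      DBMS_DATAPUMP.metadata_filter(h1, 'NAME_EXPR',
                                    'IN (''LDEV_PERF_DATA'', ''DEV_POOL_DATA'')');
      -- one row filter applied to every table in the job (table_name/schema_name left NULL)
      DBMS_DATAPUMP.data_filter(h1, name  => 'SUBQUERY',
                                    value => 'WHERE time_num > 1204884480100');
      DBMS_DATAPUMP.start_job(h1);
      DBMS_DATAPUMP.detach(h1);
    END;
    /
    On the import side, the same metadata filters (and, if needed, the same data_filter) should work against the dump file to bring in only the selected tables and rows.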

  • IMPORT OF INSTALLATION DETAILS USING API

    We have orders in the existing 11.5.10 system; now we want to import the systems from our legacy system and link them to the orders. I know we can create the systems using CSI_SYSTEM_PUB, but how do we link them to order lines so that they are visible in the order line installation details? Is there an API to do this? Please help.
    Thank you,
    Nag.

    Nag
    There is no public API but a group API is available. Assuming that you have already created transaction lines (using CSI_T_TXN_DETAILS_GRP) you can use CSI_T_TXN_SYSTEMS_GRP to assign systems. So first you create transaction lines and then systems.
    Thanks
    Nagamohan

  • Import / Export Script Presets problem

    Hi all,
    I have an issue when trying to import/export custom scripts using the Import/Export Presets command in CS6.
    I like to have custom scripts accessible under the File > Scripts menu, so I keep them in the Application/Presets folder. I've found that if I store them in the User/Presets folder they don't appear under the File > Scripts menu.
    If I use the Export Presets command, I can't export custom scripts stored in the application folder, only the ones stored in the User/Presets directory.
    If I go to another machine and use the Import Presets option, I can get it to import the custom scripts from a backup folder, but they get installed in the User/Presets directory, so I end up having to move them manually afterwards.
    Is there a way around this? It's great being able to package the presets and quickly install them elsewhere, but at the moment I'm still having to move the scripts by hand.
    Mac 10.6.8 PS 13.01

    Hi,
    I've no idea why you get that error, but there's a simpler way to transport your bookmarks to another machine.
    /Users/YourUsername/Library/Safari/Bookmarks.plist
    Copy that file to the equivalent location on the other machine and that's it.

  • CMSDK import/export utility

    Can the 9i CMSDK import/export utility be used to export the content of an iFS 1.1.6 schema so that it can subsequently be imported into a 9i database? Once it is imported into the 9i database, can the 9i iFS schema then be migrated/upgraded to CMSDK 9 for testing purposes?

    Thank you for your advice Luis.
    What I really need to be able to do is keep my iFS 1.1.6 / RDBMS 8.1.7.4 development environment intact while creating a CMSDK/Oracle9i environment for regression testing. I have multiple iFS schemas (6) in our 8.1.7.4 dev database (source). What I would like to do is copy one of the iFS environments (including metadata) out of the source and move it into the CMSDK/Oracle9i environment (target). Both environments need to remain accessible, and the source environment needs to remain at version iFS 1.1.6 / 8.1.7.4 for regression testing.
    Here's a list of actions that I have taken:
    1. install oracle9i rdbms software.
    2. create 9i database from scratch.
    3. install 9iAS software
    4. install CMSDK software in 9iAS root.
    5. export one iFS schema out of the dev source database. (Using the plain RDBMS export utility)
    6. import this schema into the newly created Oracle9i database.
    7. Run CMSDK configuration tool to upgrade the imported iFS 1.1.6 schema. <== FAILED HERE!!!!
    I run into a problem at the start of the upgrade, when using the configuration tool. It complains that the credential manager username IFSSYS$CM does not exist and fails. So, I simply created a user called IFSSYS$CM in the 9i target database to see if this would allow the configuration tool upgrade process to progress, and the result was another error of "Insufficient Privileges". How can I manually set up the credential manager so that the imported iFS schema can be upgraded successfully?
    I realize that by doing an in-place upgrade of my entire Oracle 8i source database, I will be able to migrate both the RDBMS and iFS using the installation/configuration tools.
    If there is a limitation that prevents copying an iFS 1.1.6 schema (including the credential manager metadata) out of the source and into a target database and then upgrading/migrating the schema to CMSDK/Oracle9i, then I may be forced into cloning my entire development instance (giving it a different name) and then doing an in-place migration of the entire cloned database (all 6 iFS schemas). Our source dev database is 18G and includes much more (probably garbage data) than will be needed for our regression testing. My hope was to take only a subset of the 18G (1 of the 6 iFS schemas) and move it into CMSDK/Oracle9i to perform the testing. We are very low on available disk space, and taking only a subset would be a more efficient use.
    I have even attempted to use the CMSDK export utility to export the iFS 1.1.6 schema. I was hoping this would export the iFS 1.1.6 schema, metadata, contents, and credential manager. This was not successful due to Java packages that are not installed in the Oracle 8i (8.1.7.4) database.
    These are all the details of my dilemma. Can you help guide me in the right direction? I've spent a lot of time already fumbling through and have had a number of false starts with my proposed approach. I need someone to tell me the recommended way of approaching my issue. Thanks again.

  • Import/Exporting iVRF routes in IPsec iVRF/FVRF environment

    Hi,
    I am currently terminating a number of IPsec VPNs into customers' 'inside' VRFs (iVRFs) with the 'classic' crypto map applied in a separate front-door VRF (FVRF) on an ASR1k. I now want to export a VPN route from one iVRF into another VRF using MP-BGP. This works as expected insofar as the VPN prefix makes it into the BGP table, but not into the RIB - it would appear that this may be by design and a route with a next hop in the FVRF (i.e. the VPN RRI route) cannot be exported from the VRF and imported into another VRF. Is there any workaround for this? The only solution which looks like it might work is to import/export these routes using another VRF and back-to-back VASI interfaces, using ordinary BGP to leak routes. Another possible solution is to use sVTIs instead of classic crypto (thus avoiding the RRI route), but this doesn't address the need to support classic crypto.
    Cheers,
    Matt

  • Import/Export in back stage view

    What is the use of Import/Export in the Backstage view for Customize Ribbon and the Quick Access Toolbar?

    Shiv --
    Pardon me for bumping into this thread.  Here is a scenario that might answer your question:
    You create a customized ribbon and Quick Access Toolbar.  Your ribbon has a new tab called Initiate which contains the commands you use to initiate a new project, such as setting the Start date of the project, setting the Project and Nonworking Time calendars, etc.  Your customized Quick Access Toolbar contains the buttons you most frequently use, such as Zoom In, Zoom Out, Scroll to Task, etc.
    A fellow PM sees how you have customized the ribbon and Quick Access Toolbar in your copy of Microsoft Project and asks if you would give him/her the customizations.
    You navigate to the Backstage and export your ribbon and Quick Access Toolbar customizations to an *.exportedUI file and you e-mail the customization file to your fellow PM.
    Your fellow PM launches Microsoft Project, navigates to the Backstage, walks through the import process, and imports the ribbon and Quick Access Toolbar customizations from the customization file you sent him/her.
    As Guillaume correctly points out, this is how to transfer ribbon and Quick Access Toolbar customizations from one computer to another, or from one user to another.  One of the things I have done since Microsoft released the 2010 version of Microsoft Project is to give my students a customization file with customizations they can use in the real world with their ribbon and Quick Access Toolbar.  The export/import process is very easy, and very useful.  Hope this extra detail helps.
    Dale A. Howard [MVP]

  • SAP HANA: IMPORT/EXPORT Views

    We can import/export tables using the hdbsql CLI.
    But is there any way by which we can export/import views (attribute, analytic, calculation) using the hdbsql CLI?
    I raised the above question some days back:
    SAP HANA: IMPORT/EXPORT Views | SCN
    Also, while checking on the internet I found another link for the same issue:
    http://scn.sap.com/thread/2141404
    From the experts' answers, I came to the conclusion that importing/exporting views (attribute/analytic/calculation views) in SAP HANA is not possible through the hdbsql CLI / SQL commands. Now, is there a way I can confirm this with the SAP HANA program management team?

    Lars,
    Thanks for the explanation.
    What I am trying to achieve is to create an automated script which exports all tables and views for a schema and then imports those tables and views into a different schema.
    The table part is done, but I am stuck on the view part.
    I tried the below command and was able to export the calculation view which I created in Content and which exists as
    _SYS_BIC --> Package (same name as the schema where all the tables are stored) --> View Name.
    Command used was:
    EXPORT  "<Schema_Name>"."<Package_Name>/<View_Name>"  INTO '<path>'
    Now I have 2 questions:
    1) I am not sure how to import this exported view. I tried the below command but it failed:
    IMPORT  "<Schema_Name>"."<Package_Name>/<View_Name>"  FROM '<path>'
    2) I am not sure how to export and import all the views.
    I did an IMPORT SCAN of the path as well and found the below entries for the exported views in the mentioned path:
    SCHEMA_NAME, OBJECT_NAME, OBJECT_TYPE, EXISTS
    "_SYS_BIC", "<Packagename>/CalculationViewName>", "VIEW", 1
    "_SYS_BIC", "<Packagename>/AnalyticViewName>", "VIEW", 1
    "_SYS_BIC", "<Packagename>/CalculationViewName>/olap", "VIEW", 1
    "_SYS_BIC", "vwTile1ALLData", "VIEW", 1
    Any Inputs would definitely help. Thanks.
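    In case it helps with question 2, the catalog-level wildcard syntax that could be tried from hdbsql is sketched below; the package name, schema and path are placeholders, and the exact options (AS BINARY, THREADS, REPLACE) may vary by revision. As noted in the threads above, this only moves the generated column views under _SYS_BIC, not the design-time content objects, so the import may still fail or remain unusable on a target system where the underlying repository objects are not present and activated.
    -- export every generated view of one package
    EXPORT "_SYS_BIC"."<Package_Name>/*" AS BINARY INTO '<path>' WITH THREADS 4;
    -- import them again on the target system
    IMPORT "_SYS_BIC"."<Package_Name>/*" FROM '<path>' WITH REPLACE;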

  • HT5537 How do you import/export GarageBand projects using the latest version of iTunes when there is no Apps section in the left column to select GarageBand?

    I am perfectly aware of how to do this using the old version of iTunes.
    However, I just got a new MacBook Pro Retina... and it came with the latest version of iTunes.
    I was trying to import/export GarageBand files between my iPad and Mac via iTunes,
    and the option for Apps, which is usually on the left side where you can then select GarageBand in iTunes, isn't there in the new version of iTunes.
    Has anyone got a workaround for this?
    Thanks.
    Nic

    Nic, for me the iTunes window looks like this, when I connect my iPad 3:
    I select the iPad in the "devices" section of the Sidebar (use: "View > Show Sidebar" if the sidebar is hidden).
    Click the "Apps" tab in the "Devices" pane.
    Scroll all the way down in the Devices pane to the "File Sharing" > "Apps" section.
    Then I click "GarageBand" to select the documents in the right panel.
    Which part is different for you? Perhaps you could post a screenshot?
    Regards
    Léonie

  • Need syntax for remap tablespaces for  Import datapump using API

    Good day all,
    I am not able to find an example of remap_tablespace for an import Data Pump job using the API. I am not able to perform the import, as it complains it doesn't have a grant on the exported tables' tablespace. I created a new user with new tablespaces associated with it to import the tables and other objects.
    I have remap_schema in the API procedure but not remap_tablespace. Could you please tell me how to accomplish it?
    Thanks a lot.
    Maggie

    Hi Maggie,
    The other parameter is 'OBJECT_TYPE'
    so you could have:
    DBMS_DATAPUMP.METADATA_REMAP(handle => k1, name => 'REMAP_TABLESPACE', old_value => 'ORIGINAL_TABLESPACE_NAME', value => 'NEW_TABLESPACE_NAME', object_type => 'TABLE');
    The full list of OBJECT_TYPE options from the Oracle docs is supposedly this:
    TABLE, INDEX, ROLLBACK_SEGMENT, MATERIALIZED_VIEW, MATERIALIZED_VIEW_LOG, TABLE_SPACE
    ref http://docs.oracle.com/cd/B28359_01/appdev.111/b28419/d_datpmp.htm#i1007115
    As for an example with a table - here is one I did which also includes remap_data (you can ignore that part); the rest of the code shows you how to do a table-level export based on a list of tables returned from a SQL query.
    http://dbaharrison.blogspot.de/2013/04/dbmsdatapump-with-remapdata.html
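    And, for completeness, a minimal sketch of an import job that combines REMAP_SCHEMA and REMAP_TABLESPACE might look like this (the dump/log file, directory, user and tablespace names are placeholders; leaving out the object_type parameter makes each remap apply to all applicable object types in the job):
    DECLARE
      h1 NUMBER;
    BEGIN
      h1 := DBMS_DATAPUMP.open(operation => 'IMPORT',
                               job_mode  => 'FULL',
                               job_name  => 'REMAP_IMPORT');
      DBMS_DATAPUMP.add_file(h1, 'exp_dev.dmp', 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.add_file(h1, 'imp_dev.log', 'DATA_PUMP_DIR',
                             filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      -- move objects to the new owner and the new tablespace
      DBMS_DATAPUMP.metadata_remap(h1, 'REMAP_SCHEMA',     'OLD_USER', 'NEW_USER');
      DBMS_DATAPUMP.metadata_remap(h1, 'REMAP_TABLESPACE', 'OLD_TBS',  'NEW_TBS');
      DBMS_DATAPUMP.start_job(h1);
      DBMS_DATAPUMP.detach(h1);
    END;
    /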
    Cheers,
    Harry

  • Report Title and Text Area issue when exported to pdf using Viewer

    Hi there,
    We are using OracleBI Discoverer Version 10.1.2.55.26
    We have reports that display a report title containing the following:
    - Report Title
    - Run Date and Time
    - Page No
    And text area which displays 'Last Page'
    Following properties are set at the worksheet level using page setup
    Report Title --> 'Print on every Page'
    Text Area --> 'Print on last page'
    The report, when exported to PDF using Discoverer Plus, works fine and displays the report title and text area as defined.
    But when we try to export the same report to PDF from Discoverer Viewer, it displays
    - the report title on the first page only
    - the text area on all pages
    All our users access reports using Discoverer Viewer, so we cannot open Discoverer Plus to them.
    Is there a solution which will enable us to export the report to PDF using Discoverer Viewer and display the same output as Discoverer Plus?
    Please let me know... If you have any questions then please feel free to ask.
    Thanks in advance for your help...
    Manish

    but when opened on other OS X machines some text is colored differently than it should be
    Well, if typographic objects are colour managed, the colour appearance is dependent on the source ICC profile that describes the colourants in the typographic objects and the destination ICC profile that describes the colours the display is able to form and the RGB colourant combinations that will form those colours.
    In general, typographic objects should have special treatment, since the expectation is not that typographic objects should be colour managed, but that typographic objects should simply be set to the maximum density of dark. On a display, that is R=0 G=0 B=0, and on a separations device (a lithographic press) that is C=0 M=0 Y=0 K=100.
    If for some reason typographic objects are colour managed, and if the ICC profiles for the displays are off by half a mile or more in relation to the actual state of the display system, then the colours will not be the same. On the other hand, if those displays are calibrated and characterized, then the colourants will be converted to form the same colours on the displays.
    /hh

  • Export/Import Parameters disappear when using user-exit

    I am using some import/export parameters in a dynamic action when I create a new record (infotype). I am also using a user exit to avoid modifying BEGDA and ENDDA when I modify the record (IPSYST = 'MOD'). When using this user exit, the parameters disappear from memory, so the dynamic action does not execute well. What can I do to use the user exit? Is there anything to add?

    In the dynamic actions, when I create, I delimit records on the infotype with export/import parameters defined in the infotype module pool. When I delete, I avoid deleting the record if it is not the last one. With the user exit, the modification of BEGDA/ENDDA in the infotype is not allowed. If I use the user exit, the dynamic actions which use the export/import parameters don't work.
    I have tried to do in the module pool what I do in the user exit, but it is not easy because I haven't got in PSAVE what I want.

  • Using XML Import/Export in Transport Connection

    Hi, I was investigating the feasibility of copying a query definition from one BW system to another using the XML Import/Export functionality in Transport Connection.  This is not to replace our current transport process across the landscape (Dev -> QA -> Prod); I am merely looking at possibilities for end users who develop queries in QA and then recreate them in Prod.
    My question is: has anyone worked with the XML Export/Import for types Queries & Query Elements?  I was trying this out between a Dev and Sandbox system and only had limited success.  I was able to take a small query and perform the Export to a local .xml file without much difficulty.  But when I try to do the Import, my query never shows up.  The Import function shows a green light, but I get a couple of error messages on the Import such as the following:
    ==========================================================
    SAP object 3WROG4HZ3NKP1PIYHCR1H24CQ () cannot be imported
    Diagnosis
    You attempted to import SAP object 3WROG4HZ3NKP1PIYHCR1H24CQ of type into the system via the XMI interface. However, you are not allowed to import SAP objects.
    System response
    The object was ignored during the import.
    Note: It can be that the import is incomplete when the required SAP objects are still not active in the system.
    Procedure
    Install the specified objects from Business Content. Only import the metadata afterwards.
    =========================================================
    The portion returned for Saving and Activating returns green lights.  The two sample objects mentioned appear to just be custom InfoObjects used in my query & don't look to be any different than others that work OK. 
    Just curious if anyone else has worked with this and could help advise. 
    Thanks...  Jody

    This is an old subject, but I think XML import only works for data modeling objects like InfoObjects, InfoCubes, and ODS objects. If you try to export a query and open the XML file in a text editor, there is not enough information (in my opinion) to build the query. As an example, you don't see how key figures are to be displayed, or the filter or free characteristics.
    However, when an InfoObject is collected, you get all the attributes and texts in all languages.
    If anyone has more insight, please enlighten us.
    -John

  • Use of IMPORT/EXPORT in methods

    Hi,
    Is it possible to use IMPORT/EXPORT statements in methods which are part of BAdIs?
    Thanks
    Rajavardhana reddy

    Hi,
    Import example:
    TYPES: BEGIN OF OBJ_LINE,
            CLUSTERNAME(30),
            PROGRAMNAME(10),
          END OF OBJ_LINE,
          BEGIN OF B_LINE,
            FIELD_1    TYPE I,
            FIELD_2(1) TYPE N,
          END OF B_LINE.
    DATA: OBJ_TAB TYPE STANDARD TABLE OF OBJ_LINE,
          OBJ_WA  TYPE OBJ_LINE,
          B_PROG  TYPE STANDARD TABLE OF B_LINE,
          B_WA    TYPE B_LINE,
          A(10),
          C_PROG LIKE SYST.
    MOVE:  'A'    TO OBJ_WA-CLUSTERNAME.
    APPEND OBJ_WA TO OBJ_TAB. CLEAR OBJ_WA.
    MOVE:  'B'      TO OBJ_WA-CLUSTERNAME,
           'B_PROG' TO OBJ_WA-PROGRAMNAME.
    APPEND OBJ_WA TO OBJ_TAB. CLEAR OBJ_WA.
    MOVE:  'C'      TO OBJ_WA-CLUSTERNAME,
           'C_PROG' TO OBJ_WA-PROGRAMNAME.
    APPEND OBJ_WA TO OBJ_TAB. CLEAR OBJ_WA.
    " Dynamic variant: the parameter list is taken from the internal table OBJ_TAB at runtime
    IMPORT (OBJ_TAB) FROM MEMORY ID 'ABCD'.
    Export example:
    TYPES: BEGIN OF OBJ_LINE,
             CLUSTERNAME(30),
             PROGRAMNAME(10),
           END OF OBJ_LINE.
    DATA: OBJ_TAB TYPE STANDARD TABLE OF OBJ_LINE,
          OBJ_WA  TYPE OBJ_LINE.
    TYPES: BEGIN OF B_LINE,
             FIELD_1    TYPE I,
             FIELD_2(1) TYPE N,
           END OF B_LINE.
    DATA: B_PROG TYPE STANDARD TABLE OF B_LINE.
    DATA: A(10),
          C_PROG LIKE SYST.
    MOVE:  'A'      TO OBJ_WA-CLUSTERNAME.
    APPEND OBJ_WA TO OBJ_TAB. CLEAR OBJ_WA.
    MOVE:  'B'      TO OBJ_WA-CLUSTERNAME,
           'B_PROG' TO OBJ_WA-PROGRAMNAME.
    APPEND OBJ_WA TO OBJ_TAB. CLEAR OBJ_WA.
    MOVE:  'C'      TO OBJ_WA-CLUSTERNAME,
           'C_PROG' TO OBJ_WA-PROGRAMNAME.
    APPEND OBJ_WA TO OBJ_TAB. CLEAR OBJ_WA.
    " Dynamic variant: the parameter list is taken from the internal table OBJ_TAB at runtime
    EXPORT (OBJ_TAB) TO MEMORY ID 'ABCD'.
    Regards,
    Laxmi.
