Issues while manually exporting from Data Manager

Hi Gurus,
When we try to export data to Excel, Text, or Access and tick Qualifiers for export, "Split qualified lookup fields into multiple rows" gets ticked automatically and greyed out, so we cannot even untick it.
This causes issues, as we want all the data of a record to come in a single row and not in multiple rows.
Any idea/setting which can fix this?
Regards
Vikrant M Kelkar

Hi Vikrant,
This is the default behaviour. You can verify it at the link below:
http://help.sap.com/saphelp_mdm550/helpdata/en/43/e0677c82b40a2ee10000000a11466f/frameset.htm
Exporting Records -> Exporting Table Records -> Qualifier and Qualified lookup field export
Regards,
Jitesh Talreja

Similar Messages

  • Cannot Delete Articles from Data Manager

    Hi Gurus,
    While deleting records from Data Manager I am getting the error
    "Insufficient disk space available on DBMS", and as a result I cannot delete records.
    What could be the issue?
    Please let me know if I can take the help of our Basis admin team, or is there anything I can work out in MDM?
    Regards,
    Vikrant M Kelkar

    Hi everyone,
    Got it. The issue was that the tablespace was full, which did not allow me to delete any further articles.
    Increased the tablespace and it's all good now.
    Thanks all (whoever reads this thread)
    Regards
    Vikrant M Kelkar
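    For anyone hitting the same error: assuming an Oracle back end (MDM supports several DBMSs), a Basis/DBA colleague would typically check and extend the tablespace along these lines (a minimal sketch; the tablespace and datafile names are placeholders):
    -- Check free space per tablespace
    SELECT tablespace_name, SUM(bytes)/1024/1024 AS free_mb
    FROM dba_free_space
    GROUP BY tablespace_name;
    -- Extend an existing datafile
    ALTER DATABASE DATAFILE '/oracle/data/mdm_data01.dbf' RESIZE 4096M;
    -- Or add a new datafile to the full tablespace
    ALTER TABLESPACE mdm_data ADD DATAFILE '/oracle/data/mdm_data02.dbf' SIZE 1024M;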

  • Unable to access the data from Data Management Gateway: Query timeout expired

    Hi,
    For the last 2-3 days the data refresh has been failing on our PowerBI site. I checked the following:
    1. The gateway is in running status.
    2. Data source is also in ready status and test connection worked fine too.
    3. Below is the error in System Health:
       Failed to refresh the data source. An internal service error has occurred. Retry the operation at a later time. If the problem persists, contact Microsoft support for further assistance.
       Error code: 4025
    4. Below is the error in Event Viewer:
       Unable to access the data from Data Management Gateway: Query timeout expired. Please check 1) whether the data source is available 2) whether the gateway on-premises service is running using Windows Event Logs.
    5. The correlation ID for the latest refresh failure is f9030dd8-af4c-4225-8674-50ce85a770d0.
    6. The Refresh History error is:
       Errors in the high-level relational engine. The following exception occurred while the managed IDataReader interface was being used: The operation has timed out. Errors in the high-level relational engine. The following exception occurred while the managed IDataReader interface was being used: Query timeout expired.
    Any idea what could have gone wrong suddenly? Everything was working fine for the last month.
    Thanks,
    Richa

    Never mind, figured out there was a lock on a SQL table which caused all the problems. Once I released the lock, the PowerPivot refresh started working fine.
    Thanks.
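    For reference, a sketch of how the blocking session could have been located on the SQL Server side, using standard dynamic management views (the session ID in the KILL example is a placeholder):
    -- List requests that are blocked and the session blocking them
    SELECT r.session_id,
           r.blocking_session_id,
           r.wait_type,
           t.text AS running_sql
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE r.blocking_session_id <> 0;
    -- After confirming the blocking session is safe to terminate:
    -- KILL 53;  -- replace 53 with the blocking_session_id found above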

  • Error: 404 Not Found while opening CiscoView from Campus Manager with LMS 3.2

    Hi Friends,
    While opening CiscoView from the Campus Manager topology, I am getting a "404 Not Found" error from the Apache web server. Please guide me to resolve this issue if anyone has faced the same situation before.
    Thanks

    The name of the Enterprise Manager service is em; you can find it under Deployments in the WebLogic console. Its status should be active and OK.
    Have you tried this
    http://technasir.blogspot.com/2013/04/enterprise-manger-is-not-working.html
    regards
    Nasir

  • Watched folder error - "Exception while getting principal from Directory manager"

    Hi! I've been having problems with a LiveCycle ES installation.
    I've configured a watched folder that starts a process with an Office document, converts it to PDF, and applies a Rights Management policy.
    It runs on behalf of an Active Directory user.
    Sometimes it works flawlessly, but most of the time it fails, producing a ridiculously long failure log with the longest stack traces I've ever seen in my life... summarized:
    =======================
    ALC-DSC-600-000: com.adobe.idp.dsc.provider.service.scheduler.impl.SchedulerRuntimeException : Failure to invoke the job [watched_folder_endpoint_name]
    Caused by: ALC-FEP-011-000: com.adobe.idp.dsc.service.file.impl.FileProviderRuntimeException: Failed to get the context on behalf of user [username], domain [company_domain] for watch folder [watched_folder_endpoint_name]
    Caused by: | [com.adobe.idp.um.api.impl.AuthenticationManagerImpl] errorCode:16386 errorCodeHEX:0x4002 message:Exception while getting principal from Directory manager| [IDPLoggedException] errorCode:12801 errorCodeHEX:0x3201 message:Exception while getting principal from Directory manager
    chainedException:javax.ejb.TransactionRolledbackLocalException: null;
    CausedByException is:
    nullchainedExceptionMessage:null; CausedByException is:
    null chainedException
    trace:javax.ejb.TransactionRolledbackLocalException: null;
    CausedByException is:
    null
    =======================
    I vaguely suspect the server's clock is going out of sync with the domain controller's clock, but I have tried everything I know about that with no consistent results.
    It's LiveCycle 8.0.1 SP2 installed on a Windows 2003 Server.
    Manual install, JBoss clustered configuration (the second node is actually turned off for the time being).
    SQL Server 2005 as the backend database.
    Users are in Active Directory on a Windows 2003 Server, which is a "copy" of the main domain controller.
    The watched folder is on a mounted share of another Windows 2003 Server acting as a file server.
    Any clue will be greatly appreciated!!

    Hello,
    sorry in advance for my English. I am a beginner with Adobe LiveCycle and I think that you can help me: I want to configure a watched folder to automatically convert files to PDF and apply a rights management policy. Can you tell me how you configured it? Many thanks in advance. Regards

  • Issue in moving files from data folder to processed folder in background

    Hi All,
    I am facing an issue in moving files from the data folder to the processed folder when executing in the background.
    When I execute the program in the foreground, I can move the file from the data folder to the processed folder. I am using the SXPG_COMMAND_EXECUTE FM to move the file, and I can see the file in the processed folder once the program has run.
    But when executing the same program in the background, it gives me the error "Failed to move the file to processed folder" in the spool of SM37, and I can see the file still lying in the data folder.
    I checked other programs which access the same folder as the above program, and they are able to move the file to the processed folder successfully in both foreground and background mode.
    Please help me in resolving this issue.
    Thanks,
    Deepa

    Hi Sanu,
    Please use the following code to move the file from the source folder to the target folder.
    This code shows how to build and execute the UNIX move/copy command from ABAP:
    PARAMETERS:
    * Input file path
      p_input TYPE localfile,
    * Processed file path
      p_proc  TYPE localfile.
    * Type for capturing the command output
    TYPES: BEGIN OF l_x_output,
             sys(200) TYPE c,
           END OF l_x_output.
    * Internal table to store the command output
    DATA: l_i_output TYPE STANDARD TABLE OF l_x_output WITH HEADER LINE.
    * Variable for the UNIX command
    DATA: l_v_unix_comm(255) TYPE c.
    * Build the UNIX move command, for example:
    *   mv '/data/interfaces/input/input_file' '/data/interfaces/processed/processed_file'
    * (mv moves/renames a file; cp copies a file)
    CONCATENATE 'mv' p_input p_proc
      INTO l_v_unix_comm SEPARATED BY space.
    * Execute the UNIX command; the command output is returned in l_i_output
    CALL 'SYSTEM' ID 'COMMAND' FIELD l_v_unix_comm
                  ID 'TAB'     FIELD l_i_output-sys.
    IF sy-subrc EQ 0.
      WRITE: 'File moved successfully using UNIX command in ABAP'.
    ENDIF.

  • Manual export from 8.1.6 to 9.2.0

    Hello,
    I'm trying to make a manual export from an 8.1.6 database to a 9.2.0 one.
    When I send my command exp user/passwd .... I have this error:
    Connected to Oracle 8 Release 8.0.5.0.0
    EXP-00037: Export views not compatible with database version
    If someone has an idea...
    Thanks in advance

    I'm trying to make a manual export from an 8.1.6 database to a 9.2.0 one.
    When I send my command exp user/passwd .... I have this error:
    Connected to Oracle 8 Release 8.0.5.0.0
    EXP-00037: Export views not compatible with database version
    It looks like you're trying to use the 8.0.5.0.0 export utility against an 8.1.6 database. You need to use the 8.1.6 export utility.

  • Problems While Extracting Hours From Date Field

    Hi Guys,
    Hope you are doing well.
    I am facing some problems while extracting hours from a date field. Below is an example of my orders table:
    select * from orders;
    Order_NO    Arrival Time            Product Name
    1           20-NOV-10 10:10:00      Desktop
    2           21-NOV-10 17:26:34      Laptop
    3           22-JAN-11 08:10:00      Printer
    Earlier there was a requirement to report how many orders take place daily in the orders table. For that I used to write a query with
    arrival_time >= trunc((sysdate-1),'DD')
    and arrival_time < trunc((sysdate),'DD')
    The above query gives me how many orders took place yesterday.
    Now I have a new requirement to report every 4 hours on how many orders took place. For example, if the current time is 8.00 AM IST, then the query should fetch how many orders took place from 4.00 AM till 8.00 AM. The report will run next at 12.00 PM IST, which will give me the orders that took place from 8.00 AM till 12.00 PM.
    The report will run every 4 hours a day and generate a report of the orders taken in the last 4 hours. I have a scheduler which will run this query every 4 hours, but how do I make the query fetch only the order details which arrived in the last 4 hours? I am not able to achieve this using trunc.
    Can you please assist me with how to make this happen? I have checked "Extract" also, but I am not satisfied.
    Please help.
    Thanks In Advance
    Arijit

    you may try something like
    with testdata as (
      select sysdate - level/24 t from dual
      connect by level < 11
    )
    select
      to_char(sysdate, 'DD-MM-YYYY HH24:MI:SS') s
    , to_char(t, 'DD-MM-YYYY HH24:MI:SS') t
    from testdata
    where
    t >= trunc(sysdate, 'HH') - numtodsinterval(4, 'HOUR')

    S                        T
    19-06-2012 16:08:21      19-06-2012 15:08:21
    19-06-2012 16:08:21      19-06-2012 14:08:21
    19-06-2012 16:08:21      19-06-2012 13:08:21
    19-06-2012 16:08:21      19-06-2012 12:08:21

    trunc(<date>, 'HH') truncates the minutes and seconds from the date.
    Extract hour works only on timestamps.
    Regards
    Edited by: chris227 on 19.06.2012 14:13
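    Applying that pattern to the orders table from the question (a sketch assuming the displayed "Arrival Time" column is actually named arrival_time), the 4-hourly report query could look like:
    -- Orders placed in the last completed 4-hour window,
    -- anchored to the top of the current hour
    select count(*) AS orders_last_4_hours
    from orders
    where arrival_time >= trunc(sysdate, 'HH') - numtodsinterval(4, 'HOUR')
    and arrival_time < trunc(sysdate, 'HH');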

  • Date issue while transferring pics from MacBook to external drive

    Hello
    I am transferring pictures from my MacBook to an external hard drive, and my issue is that the date of the picture (date created/modified) is being replaced by the date of the transfer. So, let's say all my 2013 pictures now have today's date, 19th June 2014. I tried dragging the pics as well as exporting them, but I get the same issue. I do transfers regularly and this is the first time I have encountered this issue. Funnily enough, it works when I drag one picture at a time, but not for more than 3 pictures, which is very weird. Your help will be much appreciated!

    There are two kinds of metadata involved when you consider a JPEG or other image file.
    One is the file metadata. This is what the Finder shows. It tells you nothing about the contents of the file, just the file itself.
    The problem with file metadata is that it can easily change as the file is moved from place to place or exported, e-mailed, uploaded, etc.
    Photographs also have both Exif and IPTC metadata. The date and time that your camera snapped the photograph are recorded in the Exif metadata. Regardless of what the file date says, this is the actual time recorded by the camera.
    Photo applications like iPhoto, Aperture, Lightroom, Picasa, Photoshop, etc. get their date and time from the Exif metadata.
    When you export from iPhoto to the Finder, a new file is created containing your photo (and its Exif data). The file date is, quite accurately, reported as the date of export.
    However, the photo date doesn't change.
    The problem is that the Finder doesn't work with Exif.
    So your photo has the correct date, and so does the file, but they are different things. To sort on the photo date you'll need to use a photo app.

  • Issue while inserting and updating data in DB tables

    Hello all,
    I am having an issue while inserting data into a DB table.
    My scenario is DB1 to DB2. I have a sender channel with a select query which fetches data from DB1 and inserts it into DB2.
    The select query fetches the records that were INSERTED into DB1 and the records that were UPDATED in DB1, and these need to be inserted/updated in DB2.
    Now the issue is that I am able to insert the records, but not able to update the records in the DB2 table, due to a primary key issue.
    In message mapping,
    sender message type is as follows:
    <src_message1>
    ----<row>
    -------<fieldA>
-------<fieldB>
-------<fieldC>
    Receiver message type as follows:
    <trgt_message1>
    ----<STATEMENT_1>
    ----<TABLE_NAME>
    ----<ACTION> INSERT
    ----<TABLE>
    ----<ACCESS>
    ----<field1> primary key
    ----<field2>
    ----<field3>
    ----<field4>
    ----<KEY>
    ----<field1>
    ----<field2>
    ----<field3>
    ----<field4>
    My query in the sender channel is: select fieldA, fieldB, fieldC from test_table where createdate = sysdate or updatedate = sysdate
    So it fetches the data from DB1 and inserts into DB2, but does not update the records in DB2 due to the primary key issue.
    Please suggest how to solve this. Will it be solved by using UPDATE_INSERT for the action?
    Best Regards, SARAN

    Hi Nagarjuna,
    I have made the following changes to the target mapping structure:
    1. action as UPDATE_INSERT
    2. in the access tab, I mapped fieldDate to field4.
    3. in the Key tab, I assigned sysdate to field4.
    But the issue still exists. Could you please check whether my changes above are correct? If wrong, please provide the details of what needs to be done.
    Thanks in advance.
    I'm providing the error details again:
    My query in the sender channel is: select fieldA, fieldB, fieldC, fieldDate from TEST_TABLE where fieldDate = sysdate or updatedate = sysdate
    It returns 4 records as follows:
    fieldA----fieldB----fieldC----fieldDate
    1001------EU--------1---------2011-11-10
    1002------CN--------0---------2011-11-10
    1003------AP--------1---------2008-03-15 (already exists in DB2)
    1004------JP--------1---------2007-04-12 (already exists in DB2)
    The first two records were created today, and for the remaining 2 records fieldC was updated from 0 to 1 (in DB1).
    While inserting these 4 records into DB2, we get the following error: "java.sql.SQLException: ORA-00001: unique constraint (data.TEST_TABLE_PK) violated".
    Best Regards, SARAN
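    For reference, UPDATE_INSERT corresponds to an upsert on the database side. A minimal Oracle MERGE sketch of the same semantics, assuming fieldA is the primary key column and with bind variables standing in for the mapped values:
    MERGE INTO test_table tgt
    USING (SELECT :fieldA AS fieldA, :fieldB AS fieldB, :fieldC AS fieldC
           FROM dual) src
    ON (tgt.fieldA = src.fieldA)           -- match on the primary key
    WHEN MATCHED THEN
      UPDATE SET tgt.fieldB = src.fieldB,  -- update rows that already exist
                 tgt.fieldC = src.fieldC
    WHEN NOT MATCHED THEN
      INSERT (fieldA, fieldB, fieldC)      -- insert new rows
      VALUES (src.fieldA, src.fieldB, src.fieldC);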

  • Issue in PDF export from InDesign CS4

    Hi,
    We are facing an issue while creating a PDF file from an InDesign document. The document has 2-column text created in the InDesign CS4 application.
    The issue is: when we export the PDF from the InDesign document, open the PDF in Acrobat, and select the text (using the selection tool), the second-column text is selected first and then the first-column text. So while copying content from the PDF, it comes out in reverse order (that is, second-column text, then first-column text).
    Could anyone suggest how to solve this issue? Is this a bug in ID CS4?
    Thanks,
    Gopal

    Hi Uwe,
    Below is the answer to your question.
    Question: Did you set the "TouchUp Reading Order" in the preferences of your Acrobat other than "Left to Right, Top to Bottom"?
    Answer: No, we did not change the TouchUp Reading Order. It is set to "Left to Right, Top to Bottom".
    Question: What version of Acrobat Pro are you using? Or is it just Adobe Reader?
    Answer: We are using Adobe Acrobat 9 Pro. We also tested in Adobe Reader X (ver 10.1.3), but the same problem exists.
    Question: What kind of tool are you using? Is it the "TouchUp Text Tool"? Or is it the "Select Tool"?
    Answer: We used both the Select Tool and the TouchUp Text Tool. Both select the text in reverse order (that is, right column to left column).
    Please help me on this issue.
    Thanks,
    Gopal

  • ADF Library Issue - Application Module disappears from data control

    Hi All,
    I am facing an issue when adding an application module to an ADF project as an ADF library.
    I have two applications: ADF Application1 with a Model1 project containing application module AM1, and ADF Application2 with a Model2 project containing application module AM2.
    I created an ADF library JAR file of the Model1 project.
    When I add it to the Model2 project, the application module AM2 of Model2 disappears and the application module AM1 from the library appears in the data control.
    Please suggest what could be wrong.
    Regards,
    Rekha

    Hi,
    verify that both application modules don't share the same ID in their DataBindings.cpx files. Which release version of JDeveloper 11g are you on?
    Frank

  • Variables export from Calc Manager

    Hi,
    I have a list of runtime prompts which I need to export from the Dev server to the Test server. When I take the export, I cannot see the option for rtp set to TRUE in the exported .xml file.
    For example, here I defined an RTP with both the RTP and the Use Last Value checkboxes ticked. The output came out as below.
    <variable id="4" name="testvar2" product="Planning" type="member" usage="const">
    <property name="application">PLab13</property>
    <property name="plantype">Plan1</property>
    <property name="useLastValue">false</property>
    <property name="group"></property>
    <property name="dimension">Period</property>
    <property name="prompt_text">Please enter</property>
    <value>"Jan"</value>
    <limits type="list">
    <property name="value">"YearTotal"</property>
    </limits>
    </variable>
    After that, when I try to import the same file into a different server, the RTP is not showing as checked. Therefore, I cannot use it in rules unless I manually tick the RTP checkbox.
    Please suggest what needs to be done in this case.
    Thanks.

    Thanks a million for guiding me towards the right path. Just tested the export-import data template, and it worked like a charm!

  • IMPORT/EXPORT FROM DATA BUFFER

    Hi,
    I have some code like this:
    DATA: b TYPE xstring.
    DATA: a TYPE xstring.
    EXPORT '1234' TO DATA BUFFER b.
    IMPORT a FROM DATA BUFFER b.
    For some reason a does not get set to the value that I assigned to b.
    However, this code works:
    DATA: b TYPE xstring.
    DATA: a TYPE xstring.
    EXPORT some_internal_table TO DATA BUFFER b.
    IMPORT a FROM DATA BUFFER b.
    Does anybody know why internal tables work but other data types do not?
    Thank you.

    EXPORT '1234' TO DATA BUFFER b is not working because you did not assign a variable name under which '1234' should be stored in data buffer b.
    Change the code to:
    DATA: b TYPE xstring.
    DATA: a(4) TYPE c.
    EXPORT a = '1234' TO DATA BUFFER b.
    IMPORT a FROM DATA BUFFER b.
    It works in the second case because some_internal_table is the name of the variable in data buffer b, and you import into that same variable name in the IMPORT statement.
    Regards
    Sridhar

  • Run SSIS Package (SQL Database on Different Server) from Data Manager.

    Hi, how can I run an SSIS package from BPC Data Manager? This package connects to another SQL Server database and creates a text file. This text file is the source for the BPC custom tasks CONVERTTASK and DUMPLOAD to load the data into BPC.
    Here is the flow of the complete package:
    Dataflow (create the text file from a SQL database)
    CONVERT TASK (convert the file to BPC format)
    DUMPLOAD TASK (load the converted file into BPC)
    Any pointer will be a great help.

    Hello Pam,
    When you run an SSIS package with BPC Data Manager, it runs on the application server. You don't really have to run a package on a different server in order to get data from a remote database and dump it to a file; that task can be done within your SSIS package using various data sources/destinations, if that's what you are trying to do. The only thing is, your BPC admin user (the one you used to install BPC) has to have appropriate privileges on the remote server.
    Hope that helps.
    Regards,
    Akim
