Loading data via FTP

Hi,
From my reading of the XML DB papers, it seems that it is possible to load XML data files via FTP.
Please could someone give me an example of how this would be done, as I cannot find a concrete example in the documentation.
Thanks for your attention.
Pete

C:\>ftp
ftp> open localhost 2100
Connected to mdrake-lap.
220 mdrake-lap FTP Server (Oracle XML DB/Oracle9i Enterprise Edition Release 9.2
.0.1.0 - Production) ready.
User (mdrake-lap:(none)): scott
331 pass required for SCOTT
Password:
230 SCOTT logged in
ftp> cd public
250 CWD Command successful
ftp> mkdir test
257 MKD command successful
ftp> cd test
250 CWD Command successful
ftp> pwd
257 "/public/test" is current directory.
ftp> put c:\temp.txt temp.txt
200 PORT Command successful
150 ASCII Data Connection
226 ASCII Transfer Complete
ftp: 305 bytes sent in 0.00Seconds 305000.00Kbytes/sec.
ftp> get temp.txt -
200 PORT Command successful
150 ASCII Data Connection
Hello
This is a simple text file.....
It is stored in the Resource View, If it were a schema based XML file, and the
Schema had been registered with XML DB, then the Resource View would contain a
reference to a row stored in the default table identified by the XML Schema.
Does this help...
226 ASCII Transfer Complete
ftp: 305 bytes received in 0.03Seconds 10.17Kbytes/sec.
ftp> quit
221 QUIT Goodbye.
C:\>sqlplus scott/tiger
SQL*Plus: Release 9.2.0.1.0 - Production on Mon Jun 24 12:54:27 2002
Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
Connected to:
Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
SQL> set long 10000
SQL> select xdburitype('/public/test/temp.txt').getCLob() from dual;
XDBURITYPE('/PUBLIC/TEST/TEMP.TXT').GETCLOB()
Hello
This is a simple text file.....
It is stored in the Resource View, If it were a schema based XML file, and the
Schema had been registered with XML DB, then the Resource View would contain a
reference to a row stored in the default table identified by the XML Schema.
Does this help...
SQL> exit
Disconnected from Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
C:\>
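The interactive session above can also be scripted. Below is a minimal sketch using Python's standard ftplib; the host, port, user, and file names are simply the example values from the transcript, not defaults, so adjust them for your own instance:

```python
from ftplib import FTP


def repo_path(*parts):
    """Join parts into an XML DB repository path like /public/test/temp.txt."""
    return "/" + "/".join(p.strip("/") for p in parts)


def upload_to_xmldb(host, port, user, password, local_file, remote_path):
    """Upload a local file into the XML DB repository over FTP.

    XML DB's FTP server listens on its own port (2100 in the transcript
    above), not the operating system's FTP port 21.
    """
    ftp = FTP()
    ftp.connect(host, port)
    ftp.login(user, password)
    with open(local_file, "rb") as f:
        ftp.storbinary("STOR " + remote_path, f)
    ftp.quit()


if __name__ == "__main__":
    # Values mirror the transcript; change them for your environment.
    upload_to_xmldb("localhost", 2100, "scott", "tiger",
                    "temp.txt", repo_path("public", "test", "temp.txt"))
```

Once uploaded, the file is visible in RESOURCE_VIEW and can be read back with XDBURIType exactly as the SQL*Plus session shows.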

Similar Messages

  • Chart dataTipFunction and loading data via HTTPService

    Hi everyone,
    I have a problem and I hope someone is able to give me a hint.
    I am using a chart with data points. At the moment I am using a dataTipFunction to give each point a "tooltip".
    Now I need the ability to load some data via HTTPService and display it as the tooltip, instead of the original value, while hovering over a point.
    This is needed because there is a lot of data and its values change continuously in small ways; I don't want to reload every possible value, only the datatips actually examined should be updated.
    mouse over datapoint --> tooltip: "please wait, while updating" --> httpservice finished --> tooltip: "new data xyz"
    The datatip function only returns a string, which is displayed as the tooltip. If the datatip function calls an HTTPService, how can I update that tooltip text afterwards?
    Any ideas?

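    Independent of the charting framework, the usual pattern is: the datatip function reads from a cache and returns a placeholder on a miss, the miss kicks off the HTTP request, and the request's result handler stores the value and forces the datatip to redisplay. A framework-neutral sketch of the cache half (class and method names here are illustrative, not Flex APIs):

```python
class TooltipCache:
    """Return cached tooltip text, or a placeholder while data is fetched."""

    PLACEHOLDER = "please wait, while updating"

    def __init__(self, fetch):
        # fetch(point_id) should start an async request; its result
        # handler must call self.store(point_id, text) and then re-show
        # the datatip so the new text appears.
        self._fetch = fetch
        self._cache = {}

    def tooltip_for(self, point_id):
        if point_id in self._cache:
            return self._cache[point_id]
        self._fetch(point_id)      # fire the HTTPService call once
        return self.PLACEHOLDER    # shown until the result arrives

    def store(self, point_id, text):
        self._cache[point_id] = text


if __name__ == "__main__":
    requested = []
    cache = TooltipCache(requested.append)
    print(cache.tooltip_for(1))        # placeholder on first hover
    cache.store(1, "new data xyz")     # result handler runs
    print(cache.tooltip_for(1))        # updated tooltip
```

    The key point is that the datatip function itself stays synchronous; only the cache contents change between the two calls.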

  • Error while Loading data Via Integrator

    Hi,
    I'm currently following the Getting Started screencast series 3.0 on YouTube.
    In 'loadData.grf', while trying to load data into the data domain after configuring the 'Reformat' and 'ExHashJoin' components, I get the error below; can anyone kindly help me?
    INFO  [main] - ***  CloverETL framework/transformation graph, (c) 2002-2013 Javlin a.s, released under GNU Lesser General Public License  ***
    INFO  [main] - Running with CloverETL library version 3.3.0 build#036 compiled 12/02/2013 18:45:45
    INFO  [main] - Running on 4 CPU(s), OS Windows 7, architecture amd64, Java version 1.7.0_07, max available memory for JVM 668160 KB
    INFO  [main] - Loading default properties from: defaultProperties
    INFO  [main] - Graph definition file: graph/Copy of LoadData1.grf
    INFO  [main] - Graph revision: 1.17 Modified by: MANJU Modified: Tue Jul 23 08:08:54 IST 2013
    INFO  [main] - Checking graph configuration...
    ERROR [main] - Graph configuration is invalid.
    WARN  [main] - [Join Dim on Facts:JOIN_DIM_ON_FACTS] - Input port 4 not defined or mapping has too many elements (number of input ports: 4)
    ERROR [main] - [ResponseSurvey:RESPONSE_SURVEY] - At least 1 output port must be defined!
    ERROR [main] - Error during graph initialization !
    Element [1371098357416:LoadData]-Graph configuration is invalid.
      at org.jetel.graph.runtime.EngineInitializer.initGraph(EngineInitializer.java:263)
      at org.jetel.graph.runtime.EngineInitializer.initGraph(EngineInitializer.java:239)
      at org.jetel.main.runGraph.runGraph(runGraph.java:377)
      at org.jetel.main.runGraph.main(runGraph.java:341)
    Caused by: org.jetel.exception.ConfigurationException: [ResponseSurvey:RESPONSE_SURVEY] - At least 1 output port must be defined!
      at org.jetel.exception.ConfigurationProblem.toException(ConfigurationProblem.java:156)
      at org.jetel.exception.ConfigurationStatus.toException(ConfigurationStatus.java:106)

    Hi,
    I've managed to recreate the graph, but I'm stuck with another error in the "Getting Started" application.
    This time I'm adding "Geography Dim" and "Reseller Dim" through an ExHash join to "Join Dims to Fact".
    I get the error below:
    INFO  [main] - ***  CloverETL framework/transformation graph, (c) 2002-2013 Javlin a.s, released under GNU Lesser General Public License  ***
    INFO  [main] - Running with CloverETL library version 3.3.0 build#036 compiled 12/02/2013 18:45:45
    INFO  [main] - Running on 4 CPU(s), OS Windows 7, architecture amd64, Java version 1.7.0_07, max available memory for JVM 668160 KB
    INFO  [main] - Loading default properties from: defaultProperties
    INFO  [main] - Graph definition file: graph/LoadData.grf
    INFO  [main] - Graph revision: 1.26 Modified by: MANJU Modified: Thu Jul 25 23:54:02 IST 2013
    INFO  [main] - Checking graph configuration...
    INFO  [main] - Graph configuration is valid.
    INFO  [main] - Graph initialization (LoadData)
    INFO  [main] - [Clover] Initializing phase: 0
    INFO  [main] - [Clover] phase: 0 initialized successfully.
    INFO  [WatchDog] - Starting up all nodes in phase [0]
    INFO  [WatchDog] - Successfully started all nodes in phase!
    ERROR [WatchDog] - Graph execution finished with error
    ERROR [WatchDog] - Node GEOGRAPHY_DIM finished with status: ERROR caused by: Component pre-execute initialization failed.
    ERROR [WatchDog] - Node GEOGRAPHY_DIM error details:
    Element [GEOGRAPHY_DIM:Geography Dim]-Component pre-execute initialization failed.
      at org.jetel.graph.Node.run(Node.java:458)
      at java.lang.Thread.run(Thread.java:722)
    Caused by: java.lang.ArrayIndexOutOfBoundsException: 10
      at org.jetel.data.parser.AhoCorasick.isPattern(AhoCorasick.java:182)
      at org.jetel.data.parser.CharByteDataParser$ByteRecordSkipper.skipInput(CharByteDataParser.java:1594)
      at org.jetel.data.parser.CharByteDataParser.skip(CharByteDataParser.java:264)
      at org.jetel.util.MultiFileReader.skip(MultiFileReader.java:341)
      at org.jetel.util.MultiFileReader.nextSource(MultiFileReader.java:286)
      at org.jetel.util.MultiFileReader.preExecute(MultiFileReader.java:498)
      at org.jetel.component.DataReader.preExecute(DataReader.java:237)
      at org.jetel.graph.Node.run(Node.java:456)
      ... 1 more
    INFO  [WatchDog] - [Clover] Post-execute phase finalization: 0
    INFO  [WatchDog] - [Clover] phase: 0 post-execute finalization successfully.
    INFO  [WatchDog] - Execution of phase [0] finished with error - elapsed time(sec): 0
    ERROR [WatchDog] - !!! Phase finished with error - stopping graph run !!!
    INFO  [WatchDog] - -----------------------** Summary of Phases execution **---------------------
    INFO  [WatchDog] - Phase#            Finished Status         RunTime(sec)    MemoryAllocation(KB)
    INFO  [WatchDog] - 0                 ERROR                              0             27979
    INFO  [WatchDog] - ------------------------------** End of Summary **---------------------------
    INFO  [WatchDog] - WatchDog thread finished - total execution time: 5 (sec)
    INFO  [main] - Freeing graph resources.
    ERROR [main] - Execution of graph failed !
    kindly help.

  • Loading data via Collaborative Planning

    Hi
    Is it possible to load data into the MRP Forecast tables using Collaborative Planning? If so, it would be very useful for me.
    Thanks in advance,
    Ricardo Cabral

    Thank you Eddy, but it seems it does not work.
    The error message received from DTW is:
    - <BOM>
    - <BOM>
    - <BO>
    - <AdmInfo>
      <Object>66</Object>
      <Version>2</Version>
      </AdmInfo>
    - <ProductTrees>
    - <row>
      <TreeCode>A00001</TreeCode>
      <Quantity>1</Quantity>
      <TreeType>iProductionTree</TreeType>
      </row>
      </ProductTrees>
    - <ProductTrees_Lines>
    - <row>
      <Currency>USD</Currency>
      <IssueMethod>im_Backflush</IssueMethod>
      <ItemCode>A00006</ItemCode>
      <Price>160</Price>
      <PriceList>1</PriceList>
      <Quantity>3</Quantity>
      <Warehouse>01</Warehouse>
      </row>
    - <row>
      <Currency>USD</Currency>
      <IssueMethod>im_Backflush</IssueMethod>
      <ItemCode>S10002</ItemCode>
      <Price>160</Price>
      <PriceList>1</PriceList>
      <Quantity>3</Quantity>
      <Warehouse>01</Warehouse>
      </row>
      </ProductTrees_Lines>
      </BO>
      </BOM>
      <ErrorMessage>A00001 - Application-defined or object-defined error</ErrorMessage>
      </BOM>

  • Column Mapping while loading data via tab delimited text file

    While attempting to load data I get an error message at the "Define Column Mapping" step. It reads as follows:
    1 error has occurred
    There are NOT NULL columns in SYSTEM.BASICINFO. Select to upload the data without an error
    The drop-down box has the names of the columns, but I don't know how to map them to the existing field names.
    By the way, where can I learn what goes in "Format" at this same step?
    Thanks!

    You can use Insert Into Array and wire the column index input instead of the row index, as shown in the attached picture:
    Just be sure that the number of elements in Array2 is the same as the number of rows in Array1.
    Message Edited by tbob on 03-07-2006 11:32 AM
    - tbob
    Inventor of the WORM Global
    Attachments:
    Add Column.png (2 KB)

  • HFM - log that shows if a user has loaded data via web form or excel load.

    I can see data loads that come from FDM, but is there a log that shows data entered into HFM via web forms or submitted through an Excel load? Any input is appreciated.
    Thanks

    You could enable Data Audit to capture data changes made by users, though this will not capture which method users chose to change the data. That is, HFM can show that data changed, and who changed it, but cannot tell whether the data was changed through a form, grid, smart view, or FDM. If you want to prevent users from changing data through forms, grids, or smart view, you can secure those input methods, but you cannot capture which one is used.
    --Chris

  • Load data from FTP server into Oracle

    I have a zipped file on an FTP server. I need to unzip the file and SQL-load it into an Oracle table.
    Does Oracle have any utilities with which I can directly do a "get" from the FTP server?
    Does Oracle have any zip/unzip features?
    I am using 10g.

    user6794035 wrote:
    I have a zipped file on an FTP server. I need to unzip the file and SQL-load it into an Oracle table.
    Does Oracle have any utilities with which I can directly do a "get" from the FTP server?
    Does Oracle have any zip/unzip features?
    I am using 10g.
    Oracle does not support these features directly. You'll have to write an OS script to perform the tasks and insert the data.
    Depending on your output format you can use UTL_FILE, SQL*Loader, or external tables to load the data.

  • Short dump:ASSIGN_TYPE_CONFLICT- While loading data through DTP

    Dear all,
    We currently work with BI NW2004s SP10. We created a transformation mapping an InfoSource to an InfoCube, based on a 3.x transfer rule. For example, we used cube 0PUR_C04 and DataSource 2LIS_02_ITM_CP, and the transformation is "TRCS Z2LIS_02_ITM_CP -> CUBE 0PUR_C04". Every time we try to load data via DTP, a runtime short dump occurs: ASSIGN_TYPE_CONFLICT
    Error analysis:
      You attempted to assign a field to a typed field symbol, but the field does not have the required type.
    We went back and forth activating the transformation and the DTP, but the same error still occurred.
    Any ideas, please?
    BR
    SzuFen

    Hi Pavel:
    Please refer to the following information-
    User and Transaction
        Client.............. 888
        User................ "TW_S
        Language key........ "E"
        Transaction......... " "
        Program............. "GPD0
        Screen.............. "SAPM
        Screen line......... 6
    ===========================================================
    Information on where terminated
        Termination occurred in the ABAP program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL" - in
         "EXECUTE".
        The main program was "RSBATCH_EXECUTE_PROZESS ".
        In the source code you have the termination point in line 704
        of the (Include) program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL".
        The program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL" was started as a background job.
        Job Name....... "BIDTPR_284_1"
        Job Initiator.. "TW_SZU"
        Job Number..... 16454800
    ===========================================================
    Short text
        Type conflict with ASSIGN in program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL".
    ===========================================================
    Error analysis
        You attempted to assign a field to a typed field symbol,
        but the field does not have the required type.
    ===========================================================
    Information on where terminated
        Termination occurred in the ABAP program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL" - in
         "EXECUTE".
        The main program was "RSBATCH_EXECUTE_PROZESS ".
        In the source code you have the termination point in line 704
        of the (Include) program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL".
        The program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL" was started as a background job.
        Job Name....... "BIDTPR_284_1"
        Job Initiator.. "TW_SZU"
        Job Number..... 16454800
    ===========================================================
    Line  SourceCde
      674         ELSE.
      675           ASSIGN rdsTG_1->*          to <_ys_TG_1>.
      676           CLEAR <_ys_TG_1>.
      677           MOVE-CORRESPONDING G1 TO <_ys_TG_1>.
      678           <_ys_TG_1>-requid    = l_requid.
      679           l_recno_TG_1          = l_recno_TG_1 + 1.
      680           ls_cross-insegid      = 1.
      681           ls_cross-inrecord     = l_recno_SC_1.
      682           ls_cross-outsegid     = 1.
      683           ls_cross-outrecord    = l_recno_TG_1.
      684
      685           CALL METHOD i_r_log->add_cross_tab
      686             EXPORTING
      687               I_S_CROSSTAB = ls_cross.
      688
      689 **     Record# in target = sy-tabix - if sorting of table won't be changed
      690           <_ys_TG_1>-record     = l_recno_TG_1.
      691           INSERT <_ys_TG_1> INTO TABLE <_yth_TG_1>.
      692           IF sy-subrc <> 0.
      693             CALL METHOD cl_rsbm_log_step=>raise_step_failed_callstack.
      694           ENDIF.
      695
      696         ENDIF.      "Read table
      697 *
      698       ENDIF.
      699       CLEAR skipseg_all.
      700     ENDLOOP.
      701 * - insert table into outbound segment -
      702
      703     <_yt_TG_1>[] = <_yth_TG_1>[].
    >>>>>
      705     rTG_1->insert_table( rdtTG_1_dp ).
      706   ENDMETHOD.                 "execute
      707
      708
      709
      710 endclass.                    "lcl_transform IMPLEMENTATION
      711
      712 &----
      713 *&      Form  get_runtime_ref
      714 &----
      715 *       text
      716 ----
      717 *      -->C_R_EXE    text
      718 ----
      719 form get_runtime_ref
      720 changing c_r_exe  type ref to object.
      721
      722   data: l_r_exe type ref to lcl_transform.
      723   create object l_r_exe.
    ===========================================================
    Contents of system fields
    Name     Val.
    SY-SUBRC 0
    SY-INDEX 3
    SY-TABIX 0
    SY-DBCNT 1
    SY-FDPOS 0
    SY-LSIND 0
    SY-PAGNO 0
    SY-LINNO 1
    SY-COLNO 1
    SY-PFKEY
    SY-UCOMM
    SY-TITLE Execute Batch Process
    SY-MSGTY E
    SY-MSGID R7
    SY-MSGNO 057
    SY-MSGV1 0TOTDELTIME
    SY-MSGV2 A
    SY-MSGV3
    SY-MSGV4
    SY-MODNO 0
    SY-DATUM 20070420
    SY-UZEIT 164557
    SY-XPROG SAPCNVE
    SY-XFORM CONVERSION_EXIT
    ===========================================================
    Active Calls/Events
    No.   Ty.          Program                             Include
          Name
        6 METHOD       GPD0QBVJ2WFQZZXBD0IJ1DSZAEL         GPD0QBVJ2WFQZZXBD0IJ1DSZAEL
          LCL_TRANSFORM=>EXECUTE
        5 METHOD       CL_RSTRAN_TRFN_CMD============CP    CL_RSTRAN_TRFN_CMD============CM005
          CL_RSTRAN_TRFN_CMD=>IF_RSBK_CMD_T~TRANSFORM
        4 METHOD       CL_RSBK_PROCESS===============CP    CL_RSBK_PROCESS===============CM00Q
          CL_RSBK_PROCESS=>PROCESS_REQUEST
        3 METHOD       CL_RSBK_PROCESS===============CP    CL_RSBK_PROCESS===============CM002
          CL_RSBK_PROCESS=>IF_RSBATCH_EXECUTE~EXECUTE
        2 FUNCTION     SAPLRSBATCH                         LRSBATCHU13
          RSBATCH_EXECUTE_PROCESS
        1 EVENT        RSBATCH_EXECUTE_PROZESS             RSBATCH_EXECUTE_PROZESS
          START-OF-SELECTION
    ===========================================================
    Thank you and BR
    SF

  • Problem while loading data from ODS to infoobject

    Hi guys,
    I am facing a problem while loading data from <b>ODS to InfoObject</b>.
    If I load the data via the PSA it works fine, but
    if I load the data without the PSA it gives a "duplicate records" error.
    Do you have any idea why that is?
    Thanks in advance
    savio

    Hi,
    When you load the data via the PSA, what did you select: serial or parallel?
    If you selected serial, most likely you don't have duplicates within the same datapackage and your load can go through.
    Loading directly into the InfoObject, the "duplicate records" error will therefore be raised if you have the same key in two different packages; you can perhaps flag your InfoPackage with the option "ignore duplicate records", as suggested...
    hope this helps...
    Olivier.
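    The effect Olivier describes can be sketched abstractly: records may be clean within each datapackage, but the same key arriving in two packages trips the duplicate check unless the InfoPackage is flagged to ignore duplicates, in which case the last record wins. This is illustrative Python, not SAP code:

```python
def merge_packages(packages, ignore_duplicates=False):
    """Merge (key, value) records arriving in several datapackages.

    Raises on a key seen more than once unless ignore_duplicates is
    set, which silently keeps the last occurrence (the master-data
    'overwrite' behaviour).
    """
    merged = {}
    for package in packages:
        for key, value in package:
            if key in merged and not ignore_duplicates:
                raise ValueError("duplicate record for key %r" % (key,))
            merged[key] = value
    return merged
```

    For example, merge_packages([[("A", 1)], [("A", 2)]], ignore_duplicates=True) keeps ("A", 2), while the same call without the flag fails, which is the situation the poster is hitting when bypassing the PSA.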

  • Shared Svcs Security to "Submit" data via Excel

    Hello all,
    What security provisions are necessary for a user to use the Submit Data function through Smart View in Excel? I've set up "Enable write back to Excel", but that didn't seem to do the trick. I'm just looking for the bare minimum.
    Thanks!

    You really must consider roles and security classes together: unless the user has the appropriate security class assigned, with write access to it, the role will not do much. Assuming you have no security classes and no Process Management, the bare minimum should be the Default role plus Load Excel Data; the remaining roles are not related to data loading through Excel (they relate to consolidation, JEs, I/C, and workspace). If the user is using a web form to load data via Excel, he or she will also need Data Form Write Back from Excel; but if the user is using formulas, only the two roles above and the appropriate security levels should do the trick.

  • Load data from File on Client side (via Sqlplus)

    Server OS: RedHat, Oracle 10g Release 2.
    I am trying to load data from OS .txt files into a CLOB column.
    I am able to do this successfully using:
    Oracle DIRECTORY
    BFILE
    DBMS_LOB.loadclobfromfile package
    The issue is: this only works if my files and the DIRECTORY object are on the database server.
    Is it not possible to load a CLOB from a file on the client side, with the file residing on the client, executing the command via SQL*Plus? Is there any other option to load data from the client side?
    Thanks for any help.

    Options:
    1) Search for OraDAV
    2) Learn about Application Express.

  • Loading data from infopackage via application server

    Hi Gurus,
    I have a requirement where I need to load the data in an internal table to a CSV file on the application server (AL11) via OPEN DATASET, then read the file from the application server via an InfoPackage routine and load it into the PSA.
    I have created a custom program to load data to the AL11 application server, using the code below.
    DATA : BEGIN OF XX,
      NODE_ID     TYPE N LENGTH 8,
      INFOOBJECT  TYPE C LENGTH 30,
      NODENAME  TYPE C LENGTH 60,
      PARENT_ID  TYPE N LENGTH 8,
      END OF XX.
    DATA : I_TAB LIKE STANDARD TABLE OF XX.
    DATA: FILE_NAME TYPE RLGRAP-FILENAME.
    FILE_NAME = './SIMMA2.CSV'.
    OPEN DATASET FILE_NAME FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    XX-NODE_ID = '5'.
    XX-INFOOBJECT = 'ZEMP_H'.
    XX-NODENAME = '5'.
    XX-PARENT_ID  = '1'.
    APPEND XX TO I_TAB.
    XX-NODE_ID = '6'.
    XX-INFOOBJECT = 'ZEMP_H'.
    XX-NODENAME = '6'.
    XX-PARENT_ID  = '1'.
    APPEND XX TO I_TAB.
    LOOP AT I_TAB INTO XX.
      TRANSFER XX TO FILE_NAME.
    ENDLOOP.
    now i can see the data in the application server AL11.
    Then in my infopackage i have the following code,
    form compute_flat_file_filename
         using p_infopackage type rslogdpid
      changing p_filename    like rsldpsel-filename
               p_subrc       like sy-subrc.
          Insert source code to current selection field
    $$ begin of routine - insert your code only below this line        -
      P_FILENAME = './SIMMA2.CSV'.
      DATA : BEGIN OF XX,
      NODE_ID     TYPE N LENGTH 8,
    INFOOBJECT  TYPE C LENGTH 30,
      NODENAME  TYPE C LENGTH 60,
      PARENT_ID  TYPE N LENGTH 8,
      END OF XX.
      DATA : I_TAB LIKE STANDARD TABLE OF XX.
      OPEN DATASET P_FILENAME FOR INPUT IN TEXT MODE ENCODING DEFAULT.
      IF SY-SUBRC = 0.
        DO.
          READ DATASET P_FILENAME INTO XX.
          IF SY-SUBRC <> 0.
            EXIT.
          ELSE.
            APPEND XX TO I_TAB.
          ENDIF.
        ENDDO.
      ENDIF.
    CLOSE DATASET P_FILENAME.
      P_SUBRC = 0.
    I have the following doubts:
    While loading the data from the internal table to the application server, do I need to add any "data separator" or "escape sign" characters?
    Also, at the InfoPackage level I will select the file type "CSV file"; what characters do I need to give in the "data separator" and "escape sign" boxes? Please point me to a clear tutorial for this. Also, can we use a process chain to load data for an InfoPackage using a file from the application server? This is a 3.x DataSource, and we are loading a hierarchy via a flat file on the application server.
    Edited by: Raghavendraprasad.N on Sep 6, 2011 4:24 PM
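    On the separator question: the ABAP above TRANSFERs each structure as fixed-width text with no delimiter, so a "CSV file" InfoPackage would expect a separator written between the fields. Comma and double quote are typical choices, but whatever you write must match what you enter in the "data separator" and "escape sign" boxes. The convention, shown here in Python rather than ABAP using the sample records from the program:

```python
import csv
import io

rows = [
    ["5", "ZEMP_H", "5", "1"],
    ["6", "ZEMP_H", "6", "1"],
]

buf = io.StringIO()
# delimiter plays the role of the "data separator" box;
# quotechar plays the role of the "escape sign" box, and is only
# emitted when a field actually contains the separator.
writer = csv.writer(buf, delimiter=",", quotechar='"',
                    quoting=csv.QUOTE_MINIMAL, lineterminator="\n")
writer.writerows(rows)
print(buf.getvalue())
# 5,ZEMP_H,5,1
# 6,ZEMP_H,6,1
```

    In the ABAP, that means CONCATENATEing the fields with the chosen separator before the TRANSFER, instead of transferring the flat structure.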

    Hi,
    Correct me if my understanding is wrong: I think you are trying to load data into the initial ODS, and from that ODS the data goes to two targets (a cube and an ODS) through the PSA.
    I think you are working on version 3.x right now; make sure your process chain has the following steps:
    Start process
    Load to initial ODS
    Activation of the initial ODS
    Further update through the PSA (which will update both the ODS and the cube)
    Make sure that you have proper update rules and an init for both targets from the lower ODS, and then load the data.
    Thanks

  • Loading attribute data via data source and file

    Hi gurus,
    My issue is that when I load an Excel file (two columns: MATNR and WERKS) into master data via an InfoPackage and DTP, after the successful upload all the attribute values previously retrieved from 0MATERIAL_ATTR are gone, apparently deleted somehow.
    Can you suggest a method for loading some columns via the SAP extractor and one line via Excel upload?
    Thanks.
    Eddy.

    Hello Eddy,
    I believe the reason your old data is being removed might be that you are doing a full upload via Excel, and that latest Excel file does not contain the old records.
    Check the files and, as suggested by Raman, if you have changes in your master data it is better to make the DataSource delta-relevant.
    Hope this helps !
    Regards
    YN

  • How to Import data via SQL Loader with characterset  UTF16 little endian?

    Hello,
    I'm importing data from text file into one of my table which contains blob column.
    I've specified following in my control file.
    -----Control file-------
    LOAD DATA
    CHARACTERSET UTF16
    BYTEORDER LITTLE
    INFILE './DataFiles/Data.txt'
    BADFILE './Logs/Data.bad'
    INTO TABLE temp_blob truncate
    FIELDS TERMINATED BY "     "
    TRAILING NULLCOLS
    (GROUP_BLOB,CODE)
    Problem:
    SQL*Loader always imports the data as big endian. Is there any method available by which we can convert these data to little endian?
    Thanks

    A new preference has been added to customize the import delimiter in the main code line. This should be available as part of a future release.
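    Whatever the loader does, the endianness it must be told about can be verified against the file's own bytes: UTF-16LE stores the low byte first and starts with BOM FF FE, UTF-16BE the opposite. A quick Python illustration of what `CHARACTERSET UTF16` with `BYTEORDER LITTLE` has to match:

```python
import codecs

text = "AB"
little = text.encode("utf-16-le")   # low byte first
big = text.encode("utf-16-be")      # high byte first

print(little)                # b'A\x00B\x00'
print(big)                   # b'\x00A\x00B'
# If the data file begins with one of these BOMs, that tells you
# which byte order the file was actually written in:
print(codecs.BOM_UTF16_LE)   # b'\xff\xfe'
print(codecs.BOM_UTF16_BE)   # b'\xfe\xff'
```

    If the file turns out to be big endian, re-encoding it to little endian (or declaring `BYTEORDER BIG` instead) is the fix; the control file declaration and the file's real byte order simply have to agree.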

  • Loading master data via process chain

    hi experts,
    I have to load data into an InfoObject via a process chain; my InfoPackage's update mode is Full Update. My question is: do I have to delete the previously loaded data from the InfoObject before running this full-update InfoPackage in the process chain, or will the system fetch only the newly added records from the source?
    regards,
    ray

    You cannot delete a value from master data once transactional data has been posted against it. Attribute values can be deleted, but you cannot delete the value of the main characteristic from the master data table.
    Moreover, you don't need to delete the data when loading master data: master data is always overwritten by the new records.
    As described, first run the full load, and then on a daily basis you can load the delta.
    DO NOT FORGET TO SCHEDULE AN ATTRIBUTE CHANGE RUN to activate the newly loaded data into master data.
    Make sure that your master data source is not enhanced; otherwise, a change in an enhanced field will not generate a delta record. If that is the case, load in full mode every day.
    - Danny
