Hierarchy loading from DB Connect

Hi,
Can we load hierarchies from DB Connect or other source systems?
Thanks,

https://forums.sdn.sap.com/click.jspa?searchID=12191305&messageID=5118729
Hope it Helps
Chetan
@CP..

Similar Messages

  • Error in data loading from 3rd party source system with DBConnect

    Hi,
    We have just finished an upgrade of SAP BW 3.10 to SAP NW 7.0 EHP1.
    After the upgrade, we are facing a problem with data loads from a third party Oracle source system using DBConnect.
    The connection is working OK and we can see the tables in the source system. But we cannot load the data.
    The error in the monitor is as follows:
    'Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.'
    But, unfortunately, the error message has no further information.
    If we look at the job log in sm37, the job finished with the following log -                                                                               
    27.10.2009 12:14:19 Job started                                                                                00           516          S 
    27.10.2009 12:14:19 Step 001 started (program RSBATCH1, variant &0000000000119, user ID RXSAHA)                    00           550          S 
    27.10.2009 12:14:23 Start InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG                                              RSM1          797          S 
    27.10.2009 12:14:24 Element NOAUTHORITYCHECK is not available in the container                                     OL           356          S 
    27.10.2009 12:14:24 InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG created request REQU_4FMXSQ6TLSK5CYLXPBOGKF31G     RSM1          796          S 
    27.10.2009 12:14:24 Job finished                                                                                00           517          S 
    In a BW 3.10 system, there is no message related to the element NOAUTHORITYCHECK, so I am wondering if this is something new in NW 7.0.
    Thanks in advance,
    Rajib

    There are a few common causes of errors like this:
    1. The RFC connection failed.
    2. A problem in the source system.
    3. The Oracle consultants may be filling the tables while you load; check with them and, if so, ask them to stop.
    4. IDoc processing problems.
    5. Memory issues.
    6. Check the DataSource first: change it, then activate it and run the load again.
    Also check the RFC connection in SM59. If it is OK, then
    check SAP Note 692195 for the authorization issue.
    Santosh

  • Chinese characters scrambled when loading from DS to BW

    Hi, I've been pulling my hair out with this issue.
    I have a flat file containing Chinese text. When I load this in BW using 'FLATFILE' as a source system, it works fine. BW shows the correct Chinese characters.
    When I do the same load using BODI, I get funny characters.
    When I use BODI to load from one flat file into another flat file, the Chinese characters remain correct.
    What do I need to do to make sure I get the right Chinese characters in BW when loading from BODI?
    BODI is installed on Unix on Oracle 10.
    I run the jobs as batch processes.
    The dsconfig.txt has got:
    AL_Engine=<default>_<default>.<default>
    There are no locale settings in al_env.sh
    BW target is UTF-8 codepage.
    File codepage is BIG5-HKSCS
    BODI is set up as a Unicode system in SAP BW.
    When loading flat file to flat file, I get a message:
    DATAFLOW: The specified locale <eng_gb.iso-8859-1> has been coerced to <Unicode (UTF-16)
    because the datastore <TWIN_FF_CUSTOMER_LOCAL> obtains data in <BIG5-HKSCS> codepage.
    JOB: Initializing transcoder for datastore <TWIN_FF_CUSTOMER_LOCAL> to transcode between
    engine codepage<Unicode (UTF-16)>  and datastore codepage <BIG5-HKSCS>
    When loading to BW the messages are almost the same, but now the last step is UTF-16 to UTF-8.
    I read the wiki post which definitely helped me to understand the rationale behind code page, but now I ran out of ideas what else to check ( http://wiki.sdn.sap.com/wiki/display/BOBJ/Multiple+Codepages )
    Any help would be greatly appreciated.
    Jan.

    Hi all. Thanks for the inputs. This is what I got when I clicked on the Details tab of the monitor:
    Error when transferring data; communication error when analyzing
    Diagnosis
    Data packages or InfoPackages are missing in BI but there were no apparent processing errors in the source system. It is therefore probable that there was an error in the data transfer.
    The analysis tried to read the ALE outbox of the source system. This lead to error .
    It is possible that there is no connection to the source system.
    Procedure
    Check the TRFC overview in the source system.
    Check the connection to the source system for errors and check the authorizations and profiles of the remote user in both the BI and source systems.
    Check the ALE outbox of the source system for IDocs that have not been updated.

  • Full load from a DSO to a cube processes less records than available in DSO

    We have a scenario where every Sunday I have to make a full load from a DSO with on-hand stock information to a cube, where I register a counter at material and store level if there is stock available.
    The DTP has no filters at all and has a semantic group on 0MATERIAL and 0PLANT.
    The key in the DSO is:
    0MATERIAL
    0PLANT
    0STOCKTYPE
    0STOR_LOC
    0BOM
    of which only 0MATERIAL, 0PLANT and 0STOR_LOC are later used in the transformation.
    As we had a growing number of records, we decided to delete, in the start routine, all records where the inventory is not greater than zero, thus eliminating zero and negative inventory records (a minimal start-routine sketch follows at the end of this thread).
    Now comes the funny part of the story:
    Prior to these changes I would [in a test system, just copied from PROD] read some 33 million records and write out the same number. Of course, after the change we expected to write out fewer. To my total surprise, I was now reading 45 million records with the same unchanged DTP, and writing out the expected smaller number.
    When checking the number of records in the DSO I found the 45 million, but I cannot explain why the earlier loads only retrieved some 33 million from the same unchanged set of records.
    When checking in PROD, the result is the same: we have some 45 million records in the DSO, but when we do the full load from the DSO to the cube, the DTP only processes some 33 million.
    What am I missing? Is there compression going on? Why would the number of records in a DSO differ from the number of records processed in the DataPackages when I am making a FULL load without any filter restrictions and only a semantic grouping in place on part of the DSO key?
    ANY idea, thought is appreciated.

    Thanks Gaurav.
    I did check whether there were any further loads done in between; there were none in the test system. As I mentioned, it was a new copy from PROD to TEST, and I compared the number of entries in the DSO; that seems to match between TEST and PROD (OK, a few more in PROD, but they can be accounted for). In TEST I loaded the day before the changes were imported to have a comparison, and between that load and the one after the changes were imported, nothing in the DSO was changed.
    Both DTPs in TEST and PW2 load from active DSO data [without archive]. The DTPs were not changed in quite a while, so I ruled that out. Same with activation of data in the DSO: this DSO gets loaded and activated in PROD daily via a process chain, and we load daily deltas into the cube in question. Only on Sundays, for the beginning of the new week/fiscal period, do we need to make a full load to capture all materials per site with inventory. The deltas loaded during the week are less than 1 million, but the difference between the number of records in the DSO and the number processed in the DataPackages is more than 10 million per full load, even in PROD.
    I really appreciate the knowledgeable answer; I just wish it had pointed out something that I missed.
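    For reference, a minimal sketch of the kind of start-routine deletion described above, assuming a BI 7 transformation (where the start routine works on the internal table SOURCE_PACKAGE) and a hypothetical stock key figure field /BIC/ZSTOCKQTY:

        " Start routine body - drop records whose on-hand stock is zero or
        " negative before they are passed on to the cube.
        " /BIC/ZSTOCKQTY is a placeholder; use the DSO's real key figure field.
        DELETE source_package WHERE /bic/zstockqty <= 0.

    Note that such a deletion only changes what is written to the target; it does not affect how many records are read from the DSO in the first place.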

  • Delta records are not loading from DSO to info cube

    My query is about delta loading from a DSO to an InfoCube (a filter is used in the selection).
    Delta records are not loading from the DSO to the InfoCube. I have tried all the options available in the DTP, but no luck.
    I selected "Change log" and "Get one request only" and ran the DTP, but 0 records were updated in the InfoCube.
    I selected "Change log" and "Get all new data request by request", but again 0 records were updated.
    I selected "Change log" and "Only get the delta once"; in that case all delta records were loaded to the InfoCube as they were in the DSO, but it gave the error message "Lock Table Overflow".
    When I run a full load using the same filter, data loads from the DSO to the InfoCube.
    Can anyone please help me to get the delta records from the DSO to the InfoCube?
    Thanks,
    Shamma

    Data loads in the case of a full load with the same filter, so I don't think the filter is the issue.
    When I follow the sequence below, I get the lock table overflow error:
    1. Full load from the active table, with or without archive.
    2. Then, with the same setting, if I run an init, the final status remains yellow, and when I change the status to green manually, it gives the lock table overflow error.
    When I change the settings of the DTP to an init run:
    1. Select change log and get only one request, and run the init; it completes successfully with green status.
    2. But when I run the same DTP for delta records, it does not load any data.
    Please help me to resolve this issue.

  • Data load from DSO to cube

    Hi gurus
    We have a typical problem: we have to combine 2 records into one when they reach the DSO.
    One source is a flat file and the other is R3, so I am getting a few fields from the flat file and a few from R3. They create one record when they are loaded to the DSO, and I now get one record in the active data table. But when I load the data from that DSO to the cube (the data goes from the change log to the cube), I get 2 separate records in the cube (one loaded from the flat file and one loaded from R3), which I don't want. I want only one record in the cube, just like I have in the active data table of the DSO.
    I can't get the data from the active data table because I need a delta load.
    Could you please advise what I can do to get that one record in the cube?
    Please help.

    Ravi
    I am sending the data through a DTP only, but is there any solution to get one record? In another scenario I am getting data from 2 different ERP sources and getting one record in the DSO and in the cube as well,
    but that is not happening for this second scenario, where I am getting data from a flat file and ERP and trying to create one record.

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads have failed last night and when I try to dig inside the process chains I find out that the cube which should be loaded from DSO is not getting loaded.
    The DSO has been loaded without errors from ECC.
    The error message says: "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
    I looked in the PSA; it has about 50,000 records,
    and all data packages have a green light and all amounts have 0CURRENCY assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'No update, no reporting' to 'Valid records update, no reporting (request red)' and executed; the error stack then showed 101 records.
    The ZARAMT field has 0CURRENCY blank for all of these records.
    I tried to assign USD to them and the changes were saved. I tried to execute again, but then the message says that the request ID should be repaired or deleted before the execution. I tried to repair it; it says it cannot be repaired, so I deleted it and executed. It fails and the error stack still shows 101 records. When I look at the records, the changes I made do not exist anymore.
    If I delete the request ID before changing and try to save the changes, then they don't get saved.
    What should I do to resolve the issue?
    thanks
    Prasad

    Hi Prasad,
    The error stack is request-specific. Once you delete the request from the target, the data in the error stack will also get deleted.
    In this case, what you are supposed to do is:
    1) Change the error handling option to 'Valid records update, no reporting (request red)' (as you have already done) and execute the DTP; all the erroneous records will accumulate in the error stack.
    2) Then correct the erroneous records in the error stack.
    3) Then in the DTP, on the Update tab, you will find the option 'Error DTP'. If it has not yet been created, you will see the option 'Create Error DTP'; click there and execute the error DTP. The error DTP will fetch the records from the error stack and create a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come again as part of a delta or full load, your load will fail again. Check the source system and fix it for a permanent solution (a field-routine sketch for defaulting the currency follows this thread).
    Regards,
    Debjani
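    For the permanent fix mentioned above, one option (an assumption, not something confirmed in this thread) is to derive a default currency in the transformation rule for ZARAMT, so that records with a blank source currency pass the unit check. A minimal BI 7 field-routine sketch; the field name SOURCE_FIELDS-currency and the 'USD' default are placeholders:

        " Rule routine sketch for the unit/currency of the key figure ZARAMT.
        " SOURCE_FIELDS-currency and the USD default are assumptions; adapt
        " them to the real source field and the actual business rule.
        IF source_fields-currency IS INITIAL.
          result = 'USD'.   " supply a default so the unit check no longer fails on 'space'
        ELSE.
          result = source_fields-currency.
        ENDIF.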

  • Data loading from DSO to Cube

    Hi,
    I have a question,
    In the TBW10 book I read about the data load from DSO to InfoCube:
    "We feed the change log data to the InfoCube; 10, -10, and 30 add to the correct 30 value."
    My question is: the cube already has the value 10, so if we are sending 10, -10 and 30 as the delta values, shouldn't the total be 40 instead of 30?
    Could someone please explain this to me?
    Thanks

    No, it will not be 40.
    It will be 30 only.
    Since the cube already has 10, the before image will nullify it by sending -10, and then the correct value of 30 will be added in the after image.
    So it will be: 10 - 10 + 30 = 30.
    Thank-You.
    Regards,
    Vinod

  • Data load from DSO to Cube in BI7?

    Hi All,
    We just migrated a dataflow from 3.5 to 7 in development and moved it to production. So until now in production, the data loads happened using the InfoPackages:
    a. Infopackage1 from datasource to ODS and
    b. Infopackage2 from ODS to the CUBE.
    Now after we transported the migrated dataflow to production, to load the same infoproviders I use
    1. Infopackage to load PSA.
    2. DTP1 to load from PSA to DSO.
    3. DTP2 to load from DSO to CUBE.
    Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. However, when I tried step b (above), it loads the cube fine using the InfoPackage. So I am unable to understand why the DTP fails while the InfoPackage load is successful. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
    Please note that the DSO already has data loaded using an InfoPackage. (Is this causing the problem?)
    Thanks,
    Sirish.

    Hi Naveen,
    Thanks for the reply. The creation of a DTP is not possible without a transformation.
    The transformation has been moved to production successfully.

  • Invoking SQL*Loader from a stored procedure

    I am trying to invoke SQL*Loader from within a database package by using an external C procedure (the procedure calls the system() C function), but the loader generates the following error in its log file:
    SQL*Loader -523: error -2 writing to file (STDERR)
    and no data is uploaded.
    I have tried using system() from within database procedures to execute OS commands, and it works. Does anyone know what the problem is with using system() to execute "sqlldr <parameters>"? Is there some other way to call the loader from within a stored PL/SQL procedure?
    Thank you very much for your help.
    Aneta Valova

    Hi
    What is your task, and why are you trying to invoke SQL*Loader from a stored procedure or package? Maybe redirecting stderr will resolve your problem, but think about whether it is the best way to do your job.
    I am not sure that invoking other executables from the Oracle instance is a good idea.
    Regards

  • Partial data loading from DSO to cube

    Dear All,
    I am loading the data from the DSO to the cube through a DTP. The problem is that some records are getting added to the cube through data packet 1 (around 50 crore records), while through data packet 2 no records are getting added.
    It is a full load from the DSO to the cube.
    I have tried deleting the request and executing the DTP again, but the same number of records gets added to the cube through data packet 1, and after that no records are added to the cube; the request remains in yellow status only.
    Please suggest.

    Nidhuk,
    The data load transfers package by package. Your story sounds like it got stuck in the second package or something. I suggest you check the package size and try increasing it to a higher number to see if anything changes. 50 records per package is kind of low; your load should not spread out into too many packages.
    Regards, Jen
    Edited by: Jen Yakimoto on Oct 8, 2010 1:45 AM

  • Problem during Data Warehouse Loading (from staging table to Cube)

    Hi All,
    I have created a staging module in OWB to load my flat files into my staging tables. I have created a warehouse module to load my staging tables into the dimension and cube that I have created.
    My scenario:
    I have a temp_table_transaction into which my flat files have been loaded. This table was loaded with 168,271,269 records from these flat files.
    I have created a mapping in OWB which loads my temp_table_transaction, joined with other tables and with some expression and convert functions, into a new table called stg_tbl_transaction in my staging module. Running this mapping takes 3 hours and 45 minutes with this configuration of my mapping:
    Default operating mode in the runtime parameters of the mapping configuration = Set based
    My dimensions filled correctly, but I have two problems when I want to transfer my staging table to my cube:
    Problem #1:
    I have created a cube called transaction_cube with OWB, and it generated and deployed correctly.
    I have created a map to fill my cube with the 168,271,268 records in the staging table called stg_tbl_transaction and deployed it to the server (my cube map operating mode is set based).
    But this map had not completed after 9 hours of running, and I was forced to cancel it by killing its sessions. I want to know whether this loading time for this volume of data is acceptable, or whether for this volume we should expect to spend more time. Please let me know if anybody has seen this issue.
    Problem #2:
    To test my map, I created a map configured as set based in the operating modes, selected my stg_tbl_transaction (with 168,271,268 records in it) as the source, and created another table to transfer and load my data into. I wanted to test the time we should spend on this simple map, but after 5 hours my data had not loaded into the new table. I want to know where my problem is. Should I have set something in the configuration of the map, or something else? Please guide me on these problems.
    Configuration of my server:
    I run OWB on a two-socket Xeon 5500 series machine with 192 GB RAM and disks in a RAID 10 array.
    Regards,
    Sahar

    For all of you:
    It is possible to load from an InfoSet to a cube; we did it, and it was OK.
    Data really are loaded from the InfoSet (cube + master data) to the cube.
    When you create a transformation under a cube, the InfoSet is proposed, and it works fine.
    Now the process is no longer operational and I don't understand why.
    Loading from an InfoSet to a cube is possible; I can send you screenshots if you want.
    Christophe

  • Jasper report on HTML when one image loaded from database and for the other

    How do I generate a Jasper report in HTML when one image is loaded from the database and for the other we give an image path?
    My code:
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    exporter = new JRHtmlExporter();
    exporter.setParameter(JRExporterParameter.JASPER_PRINT, print);
    exporter.setParameter(JRExporterParameter.OUTPUT_STREAM, baos);
    exporter.setParameter(JRHtmlExporterParameter.IMAGES_URI, strImageInputDirectory);
    exporter.setParameter(JRHtmlExporterParameter.IMAGES_DIR_NAME, strImageOutputPath == null ? "." : strImageOutputPath);
    exporter.setParameter(JRHtmlExporterParameter.IS_OUTPUT_IMAGES_TO_DIR, Boolean.TRUE);
    exporter.setParameter(JRHtmlExporterParameter.IS_USING_IMAGES_TO_ALIGN, Boolean.FALSE);
    exporter.setParameter(JRHtmlExporterParameter.IS_WHITE_PAGE_BACKGROUND, Boolean.FALSE);
    exporter.exportReport();
    byte[] bdata = baos.toByteArray();
    Can anyone help, please?
    Message was edited by:
    ameet.au

    Hey, sorry for posting it in this forum.
    But do you have sample code for making it work? I am able to do it in PDF format (one image from the database and another stored in the web server) using:
    byte[] image = (byte[]) outData.get("image");
    ByteArrayInputStream img = new ByteArrayInputStream(image);
    hmimg.put("P_PARAMV3", img);
    print = JasperFillManager.fillReport(reportFileName, hmimg, jrxmlds);
    bdata = JasperExportManager.exportReportToPdf(print);

  • My LV program can't find the iak file when the program is loaded from a different PC.

    Hello all,
    My LV VI with FieldPoint I/O and its IAK file are stored on the company network. If I load and run the LV VI from the PC it was developed on, everything is fine. If I load and run on a different PC (that has an identical install of LV), the LV VI can't find the IAK file (all of the I/O pointers to the file are grayed out).
    Also, if I make an EXE out of the vi on the machine that can find the IAK, it won't find it if it's run from the other machine.
    It is not related to whether one machine has different "rights", as they are identical.  It's not related to different installs of LV, as they are identical.   
    I have found that this is related to the network login name/password, and to the machine's network name.  If I login on the second machine and try to run the program, it won't find the IAK.  If I point it to the IAK it will run.  If I save it and log out, and then login as a different user with the same rights on the second machine, it won't find the IAK again, until I re-point it there when I re-run it.  This is true for both the vi and the EXE.
    This problem makes deploying the EXE difficult to different machines, as the first time it's run, the EXE has to be pointed to the IAK.  Pointing the EXE to the IAK is difficult as these installations have no keyboard or mouse.
    Does anyone have any ideas on how to fix this so the LV VI or its EXE finds the IAK no matter where it's loaded from?
    Thanks,
    Mike

    Laura,
    I figured I'd continue this thread because it's along the same line.
    I have two Fieldpoint systems. They are identical, except for one thing...one has an AI-110 in position 7...the other has nothing. Both have the same IP address, etc and I swap the ethernet cable between them when I want to select which one I use.  I use a dedicated ethernet card to the FP processor...and it's not the company net.
    There are two IAK files...one for the system with the AI-110, one for the system without the AI-110. The two IAKs are in different paths.
    Before I had the system and the IAK without the AI-110, I made an EXE of the code that worked with the AI-110, and it worked fine.
    But now if I run the EXE after I've used MAX and opened the IAK without the AI-110, the EXE won't read the AI-110!  I have to go back into MAX and open the IAK that has the AI-110 in it, close MAX and restart the EXE for the EXE to see the AI-110.  (Max is NOT SET to "Download Items on Save")
    It looks like the EXE is looking at MAX to get the IAK or a pointer to the IAK.
    Did I build my EXE incorrectly?  Shouldn't an EXE completely ignore MAX once it's made?
    Thanks,
    Mike

  • Error in delta loading from a planning cube

    hi all,
    I am using a delta load from one planning cube to another. When the changed records are on the order of 100, it is successful, but when the records are on the order of 100,000, it stays yellow with no data records sent and eventually fails. The packet size is 20,000.
    any help is appreciated.

    Hello Ajay,
    Have you checked that your request is closed in the planning cube? The function module RSAPO_CLOSE_TRANS_REQUEST will do it (see the calling sketch after this thread).
    It is a customizing issue to tell the infopackage to end "green" when it has no data.
    Transaction SPRO ==> SAP NetWeaver ==> SAP Business Information Warehouse ==> Automated Processes ==> Monitor Settings ==> Set Traffic Light Color. There you have to set the traffic light to green if no data is available in the system.
    Hope that helps
    best regards
    Yannick
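    A minimal sketch of calling the function module Yannick mentions before the delta extraction (for example from a small ABAP report or a process chain step); the parameter name I_INFOCUBE and the cube name ZPLANCUBE are assumptions, not taken from this thread:

        " Close the open ('yellow') planning request of the source cube so
        " that the delta load can pick its data up.
        " I_INFOCUBE and the cube name are assumptions - check the FM interface.
        CALL FUNCTION 'RSAPO_CLOSE_TRANS_REQUEST'
          EXPORTING
            i_infocube = 'ZPLANCUBE'.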

Maybe you are looking for

  • Creation of Outbound delivery with respect to sales order

    Hi Gurus, I am an Abaper. I have a requirement to create outbound delivery with respect to sales order. Currently I have the purchase order details.   My coordinator has given me a logic. From the Purchase order, we have to get the Purchase requisiti

  • Converting UTC time stamp to local time (CET)

    Is there a smooth way to convert a time stamp from UTC time into the local time (e.g. CET)? CONVERT TIME STAMP.... just converts the timestamp from the local time zone to UTC; is there another command to perform the opposite conversion? TIA! /Armin
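    For reference, a minimal ABAP sketch of the direction asked about here, turning a UTC time stamp into a local date and time for a given time zone (CET used as the example target):

        DATA: lv_ts   TYPE timestamp,
              lv_date TYPE d,
              lv_time TYPE t.
        GET TIME STAMP FIELD lv_ts.                " current time stamp in UTC
        CONVERT TIME STAMP lv_ts TIME ZONE 'CET'
                INTO DATE lv_date TIME lv_time.    " local date/time in CET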

  • How can I connect a second Bluetooth track pad?

    I bought two trackpads to use with Lion OS on my MBP, one for home and one for the office. The first one I set up was flawless. When I tried to repeat this with the second one, it would not connect. I tried to rename the first and the new name was

  • Is it possible to add a user to a role at run-time?

    basically I need to be able to add a user to a role programmatically before the role-based content is displayed to the user. Example: I have a role called 'Manager' created in the portal. When a user logs on, I detect that the user has the attribute

  • RE: Network Connect of a WMP100N Wireless card Thru a WRT54G Router

    Well, I disabled the Zonealarm personal firewall. No change in connectivity. The card acquires the router, has super signal strength, and no internet. I've even tried re-doing the connection profile. I have noticed a minor difference in security sett