BW loading from mainframe

Hello Gurus,
My client wants me to load data from a mainframe system into BW in near real time. They want me to use an ETL tool such as DataStage or Informatica for this. Since I will have to use the staging BAPI for extraction, I will have to handle delta at the ODS object level (if I am not wrong, only the Service API supports queued delta extraction). But is delta handling at the ODS object level effective in a near-real-time scenario?
I do not have any prior experience with the staging BAPI.
If I use the staging BAPI for data loading, will I have to write a function module for each InfoObject that I define, or are they already built into the staging BAPI?
My client wants to avoid using SAP XI as far as possible for this.
Any help will be appreciated.
Thanks
Anand

Really,
Does your MF provide the info in real time or through batch processing? If it is through batch, the easiest way is loading through tables...
Remember, however, that the BW model is really a pull model.
If you want to go to a push model, you may have to create a staging table that is filled by the push process, from which the BW system will pull....
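As a rough illustration of that intermediate staging table (a sketch only - all names and columns are hypothetical, and the pull side would typically be something like a BW DB Connect DataSource reading the table):

-- The ETL tool pushes mainframe records into this table; BW pulls from it
-- on its own schedule (all names here are made up for illustration).
CREATE TABLE stg_mf_transactions (
  record_id  NUMBER       NOT NULL,             -- surrogate key, useful for delta handling
  load_ts    TIMESTAMP    DEFAULT SYSTIMESTAMP, -- when the ETL tool wrote the row
  material   VARCHAR2(18),
  plant      VARCHAR2(4),
  quantity   NUMBER(17,3),
  processed  CHAR(1)      DEFAULT 'N'           -- flag rows BW has already pulled
);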
Enjoy

Similar Messages

  • Date columns in OWB from mainframe

    I get data in a flat file from the mainframe. On the mainframe, if a date column is a high date, it is assigned the default '9999-12-31'. Now when I load into OWB it converts to 31-dec-99 with the mask 'yyyy-mm-dd', so I can't tell that it was a high date.
    I want to insert a space instead, or is there any high-date concept in Oracle? How do I set a date column in an Oracle table to no value, like a high date or low date, when I don't have a valid date at that point in time?
    thanks

    Hi,
    The mapping through which you are loading can be made to check for a high date and, if one is found in the source data, make it NULL before loading into the target table field. You need to make the target date field NULLABLE. This NULL can then serve as your high date.
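    As a rough illustration (a sketch only - the table and column names are hypothetical), the high-date check inside the mapping expression could look like:

    SELECT CASE
             WHEN src.trans_date = DATE '9999-12-31' THEN NULL
             ELSE src.trans_date
           END AS trans_date
    FROM   temp_src_table src;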
    Regards
    -AP

  • Load from EBCDIC file to Oracle 9i tables using UTL_FILE

    Hello, I have a requirement to load an EBCDIC file from the mainframe into Oracle 9i tables, do some transformation, and then create an EBCDIC file again from the database tables. I'm not sure if this is possible using UTL_FILE, though I have seen people load such files using SQL*Loader. If possible, can you please give some sample code for this? I would appreciate your help.
    Thanks
    Karuna

    Hi,
    I'm reading data from an EBCDIC file in Oracle PL/SQL using UTL_FILE. I wasn't able to read the BINARY data type from EBCDIC into Oracle. Initially I thought the problem was due to the reasons discussed in this article:
    http://support.sas.com/techsup/technote/ts642.html
    <quote>
    Solutions
    The only way to overcome the problem of non-standard numeric data being corrupted by the FTP is to move the data without translating it. This will necessitate making some significant changes in your program. It may also require preprocessing the data file on the mainframe. The sections below list the different types of files and situations, a recommended approach to read in the file, and a sample program to accomplish the task.
    </quote>
    But we have confirmed that the contents of the EBCDIC file are fine by inspecting the file with a tool that converts EBCDIC to ASCII. The contents are absolutely OK.
    So how do I read the binary data from the EBCDIC file?
    My code is like this (the directory, file name, and EBCDIC character set below are placeholders):
    DECLARE
      file_handler UTL_FILE.FILE_TYPE;
      string       VARCHAR2(32767);
    BEGIN
      file_handler := UTL_FILE.FOPEN('MY_DIR', 'data.ebc', 'r');
      UTL_FILE.GET_LINE(file_handler, string, 32767);
      -- convert from the EBCDIC character set to US7ASCII before printing
      DBMS_OUTPUT.PUT_LINE(SUBSTR(CONVERT(string, 'US7ASCII', 'WE8EBCDIC500'), 1, 4));
      -- this prints "&" where the actual data is "005": the field is declared
      -- as binary in the EBCDIC file, so I'm unable to read and print it;
      -- the same happens with the other binary datatypes, while the other
      -- datatypes read clearly
      UTL_FILE.FCLOSE(file_handler);
    END;
    /
    How do I resolve this? I would appreciate your help; this is critical and an immediate requirement for us.
    Thanks
    Karuna
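    One possible direction, sketched under the assumption that a release with UTL_FILE byte mode and GET_RAW is available (the directory, file name, and record length are hypothetical): binary fields such as packed decimal cannot pass through CONVERT, so read the raw bytes and decode the binary fields yourself.

    DECLARE
      fh  UTL_FILE.FILE_TYPE;
      buf RAW(32767);
    BEGIN
      -- 'rb' opens the file in byte mode, so no character-set translation occurs
      fh := UTL_FILE.FOPEN('MY_DIR', 'data.ebc', 'rb');
      -- read one fixed-length 100-byte record into a RAW buffer
      UTL_FILE.GET_RAW(fh, buf, 100);
      -- character fields: slice with UTL_RAW.SUBSTR, then UTL_RAW.CAST_TO_VARCHAR2
      -- plus CONVERT; binary/packed fields: decode the nibbles by hand
      UTL_FILE.FCLOSE(fh);
    END;
    /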

  • Chinese characters scrambled when loading from DS to BW

    Hi, I've been pulling my hair out with this issue.
    I have a flat file containing Chinese text. When I load this in BW using 'FLATFILE' as a source system, it works fine. BW shows the correct Chinese characters.
    When I do the same load using BODI, I get funny characters.
    When I use BODI to load from one flat file into another flat file, the Chinese characters remain correct.
    What do I need to do to make sure I get the right Chinese characters in BW when loading from BODI?
    BODI is installed on Unix on Oracle 10.
    I run the jobs as batch processes.
    The dsconfig.txt has got:
    AL_Engine=<default>_<default>.<default>
    There are no locale settings in al_env.sh
    BW target is UTF-8 codepage.
    File codepage is BIG5-HKSCS
    BODI is set up as a Unicode system in SAP BW.
    When loading flat file to flat file, I get a message:
    DATAFLOW: The specified locale <eng_gb.iso-8859-1> has been coerced to <Unicode (UTF-16)
    because the datastore <TWIN_FF_CUSTOMER_LOCAL> obtains data in <BIG5-HKSCS> codepage.
    JOB: Initializing transcoder for datastore <TWIN_FF_CUSTOMER_LOCAL> to transcode between
    engine codepage<Unicode (UTF-16)>  and datastore codepage <BIG5-HKSCS>
    When loading to BW the messages are almost the same, but now the last step is UTF-16 to UTF-8.
    I read the wiki post which definitely helped me to understand the rationale behind code page, but now I ran out of ideas what else to check ( http://wiki.sdn.sap.com/wiki/display/BOBJ/Multiple+Codepages )
    Any help would be greatly appreciated.
    Jan.
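    One thing that may be worth trying (an assumption based only on the messages above, not a confirmed fix): set the engine locale in dsconfig.txt explicitly to a Unicode codepage instead of the defaults, so the engine does not coerce through a lossy locale, e.g.

    AL_Engine=eng_gb.utf-8

    where the exact codepage token is a guess; check which locale values your installation's locale selector actually offers.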

    Hi all. Thanks for the inputs. This is what I got when I clicked on the Details tab of the Monitor:
    Error when transferring data; communication error when analyzing
    Diagnosis
    Data packages or InfoPackages are missing in BI but there were no apparent processing errors in the source system. It is therefore probable that there was an error in the data transfer.
    The analysis tried to read the ALE outbox of the source system. This led to an error.
    It is possible that there is no connection to the source system.
    Procedure
    Check the TRFC overview in the source system.
    Check the connection to the source system for errors and check the authorizations and profiles of the remote user in both the BI and source systems.
    Check the ALE outbox of the source system for IDocs that have not been updated.

  • Full load from a DSO to a cube processes less records than available in DSO

    We have a scenario where every Sunday I have to make a full load from a DSO with on-hand stock information to a cube, in which I register a counter at material and store level if stock is available.
    The DTP has no filters at all and has a semantic group on 0MATERIAL and 0PLANT.
    The key in the DSO is:
    0MATERIAL
    0PLANT
    0STOCKTYPE
    0STOR_LOC
    0BOM
    of which only 0MATERIAL, 0PLANT, and 0STOR_LOC are later used in the transformation.
    As we had a growing number of records, we decided to delete in the start routine all records where the inventory is not greater than zero, thus eliminating zero and negative inventory records.
    Now comes the funny part of the story:
    Prior to these changes I would [in a test system, just copied from PROD] read some 33 million records and write out the same number. Of course, after the change we expected to write out fewer. To my total surprise, I was now reading 45 million records with the same unchanged DTP, while writing out the expected smaller number.
    When checking the number of records in the DSO I found the 45 million, but I cannot explain why the earlier loads retrieved only some 33 million from the same unchanged set of records.
    When checking in PROD - same result: we have some 45 million records in the DSO, but when we do the full load from the DSO to the cube, the DTP only processes some 33 million.
    What am I missing - is there some compression going on? Why would the number of records in a DSO differ from the number of records processed in the data packages when I am making a FULL load without any filter restrictions and only a semantic grouping in place on part of the DSO key?
    ANY idea or thought is appreciated.

    Thanks Gaurav.
    I did check if there were more loads, or any loads, done in between - there were none in the test system. As I mentioned, it was a new copy from PROD to TEST, and I compared the number of entries in the DSO; that seems to match between TEST and PROD - OK, a few more in PROD, but they can be accounted for. In TEST I loaded the day before the changes were imported so as to have a comparison, and between that load and the one after the changes were imported, nothing in the DSO was changed.
    Both DTPs, in TEST and PW2, load from the activated DSO [without archive]. The DTPs have not been changed in quite a while - so I ruled that out. Same with activation of data in the DSO - this DSO gets loaded and activated in PROD daily via a process chain, and we load daily deltas into the cube in question. Only on Sundays, for the beginning of the new week/fiscal period, do we need to make a full load to capture all materials per site with inventory. The deltas loaded during the week are less than 1 million records, but the difference between the number of records in the DSO and the number processed in the data packages is more than 10 million per full load, even in PROD.
    I really appreciate the knowledgeable answer; I just wish you had pointed out something that I had missed.

  • Delta records are not loading from DSO to InfoCube

    My query is about delta loading from a DSO to an InfoCube (a filter is used in the selection).
    Delta records are not loading from the DSO to the InfoCube. I have tried all the options available in the DTP, but no luck.
    With "Change log" and "Get one request only" selected, I ran the DTP, but 0 records were updated in the InfoCube.
    With "Change log" and "Get all new data request by request", again 0 records were updated.
    With "Change log" and "Only get the delta once", all the delta records were loaded to the InfoCube as they were in the DSO, but it gave the error message "Lock Table Overflow".
    When I run a full load using the same filter, data loads from the DSO to the InfoCube.
    Can anyone please help me get the delta records from the DSO to the InfoCube?
    Thanks,
    Shamma

    Data is loading in the case of a full load with the same filter, so I don't think the filter is the issue.
    When I follow the sequence below, I get the lock table overflow error:
    1. Full load from the active table, with or without archive.
    2. Then, with the same settings, if I run an init, the final status remains yellow, and when I change the status to green manually, it gives the lock table overflow error.
    When I change the settings of the DTP to an init run:
    1. Select change log and get only one request, and run the init; it completes successfully with green status.
    2. But when I run the same DTP for delta records, it does not load any data.
    Please help me to resolve this issue.

  • Data load from DSO to cube

    Hi gurus
    We have a typical problem: we have to combine 2 records into one when they reach the DSO.
    One source is a flat file and the other is R3, so I am getting a few fields from the flat file and a few from R3. They create one record when they are loaded to the DSO. So I am getting one record in the active data table, but when I load the data from that DSO to the cube (the data goes from the change log to the cube), I am getting 2 separate records in the cube (one loaded from the flat file and one loaded from R3), which I don't want. I want only one record in the cube, just like I have in the active data table of the DSO.
    I can't get the data from the active data table because I need delta loads.
    Would you please advise what I can do to get that one record in the cube?
    Pl help

    Ravi
    I am sending the data through a DTP only, but is there any solution to get one record? Because in another scenario I am getting data from 2 different ERP sources and getting one record in the DSO, and in the cube as well.
    But that is not happening in this second scenario, where I am getting data from the flat file and the ERP and trying to create one record.

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads failed last night, and when I dug into the process chains I found that the cube which should be loaded from the DSO is not getting loaded.
    The DSO has been loaded without errors from ECC.
    The error message says "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source keyfigure.ZARAMT'".
    I looked in the PSA, which has about 50,000 records,
    and all data packages have a green light and all amounts have 0CURRENCY assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'no update, no reporting' to 'valid records update, no reporting (request red)' and executed, and the error stack then showed 101 records.
    The ZARAMT field has 0CURRENCY blank for all of these records.
    I tried to assign USD to them, and the changes were saved. I tried to execute again, but then a message said that the request ID should be repaired or deleted before execution. I tried to repair it; it said it cannot be repaired, so I deleted it and executed. It fails, and the error stack still shows 101 records. When I look at the records, the changes I made no longer exist.
    If I delete the request ID before making the changes and then try to save them, they don't get saved.
    What should I do to resolve this issue?
    thanks
    Prasad

    Hi Prasad....
    The error stack is request-specific. Once you delete the request from the target, the data in the error stack will also get deleted.
    Actually, in this case what you are supposed to do is:
    1) Change the error handling option to 'valid records update, no reporting (request red)' (as you have already done) and execute the DTP. All the erroneous records will accumulate in the error stack.
    2) Then correct the erroneous records in the error stack.
    3) Then in the DTP, in the "Update" tab, you will get the option "Error DTP". If it is not already created, you will get the option "Create Error DTP". Click there and execute the Error DTP. The Error DTP will fetch the records from the error stack and create a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come again as part of a delta or full load, your load will fail again. Check in the source system and fix it there for a permanent solution.
    Regards,
    Debjani......

  • Data loading from DSO to Cube

    Hi,
    I have a question.
    In the book TBW10 I read about the data load from the DSO to the InfoCube:
    "We feed the change log data to the InfoCube; 10, -10, and 30 add up to the correct value of 30."
    My question is: the cube already has the value 10, so if we are sending 10, -10, and 30 (the delta), shouldn't the total be 40 instead of 30?
    Please can someone explain this to me.
    Thanks

    No, it will not be 40.
    It will be 30 only.
    Since the cube already has 10, the before image will nullify it by sending -10, and then the correct value of 30 will be added as the after image.
    So it will be like this: 10 - 10 + 30 = 30.
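    To illustrate with the example values (the before and after images come from the change log):

    Load                  Record sent    Cube total
    Initial load          +10            10
    Before image          -10             0
    After image (new)     +30            30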
    Thank-You.
    Regards,
    Vinod

  • Data load from DSO to Cube in BI7?

    Hi All,
    We just migrated a dataflow from 3.5 to 7 in development and moved it to production. So until now in production, the data loads happened using InfoPackages:
    a. InfoPackage1 from the DataSource to the ODS, and
    b. InfoPackage2 from the ODS to the CUBE.
    Now, after we transported the migrated dataflow to production, to load the same InfoProviders I use:
    1. An InfoPackage to load the PSA.
    2. DTP1 to load from the PSA to the DSO.
    3. DTP2 to load from the DSO to the CUBE.
    Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. Yet when I try step b (above), the InfoPackage loads the CUBE fine. So I am unable to understand why the DTP fails while the InfoPackage load is successful. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
    Please note that the DSO already has data loaded using an InfoPackage. (Is this causing the problem?)
    Thanks,
    Sirish.

    Hi Naveen,
    Thanks for the reply. Creating a DTP is not possible without a transformation.
    The transformation has been moved to production successfully.

  • Invoking SQL*Loader from a stored procedure

    I am trying to invoke SQL*Loader from within a database package by using an external C procedure (the procedure calls the C system() function), but the loader generates the following error in its log file:
    SQL*Loader-523: error -2 writing to file (STDERR)
    and no data is uploaded.
    I have used system() from within database procedures to execute OS commands, and it works. Does anyone know what the problem is with using system() to execute "sqlldr <parameters>"? Is there some other way to call the loader from within a stored PL/SQL procedure?
    Thank you very much for your help.
    Aneta Valova

    Hi
    What is your task, and why are you trying to invoke SQL*Loader from a stored procedure or package? Maybe redirecting stderr will resolve your problem, but think about whether this is the best way to do the job.
    I am not sure that invoking other executables from the Oracle instance is a good idea.
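    For example, the command string passed to system() could redirect both output streams (paths and parameters here are hypothetical):

    sqlldr userid=scott/tiger control=load.ctl log=load.log > /dev/null 2>&1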
    Regards

  • Partial data loading from DSO to cube

    Dear All,
    I am loading the data from the DSO to the cube through a DTP. The problem is that some records (around 50 crore) get added to the cube through data packet 1, while through data packet 2 records are not getting added.
    It is a full load from the DSO to the cube.
    I have tried deleting the request and executing the DTP again, but the same number of records gets added to the cube through data packet 1, and after that records are not added; the request remains in yellow state only.
    Please suggest.

    Nidhuk,
    Data loads transfer package by package. Your story sounds like it got stuck in the second package or something. I suggest you check the package size and try increasing it to a higher number to see if anything changes. 50 records per package is kind of low; your load should not spread out into too many packages.
    Regards. Jen

  • Problem during Data Warehouse Loading (from staging table to Cube)

    Hi All,
    I have created a staging module in OWB to load my flat files into my staging tables. I have created a warehouse module to load my staging tables into the dimension and cube that I have created.
    My scenario:
    I have a temp_table_transaction which has been loaded from the flat files. This table was loaded with 168,271,269 records from the flat file.
    I have created a mapping in OWB which takes temp_table_transaction, joins it with other tables, applies some expressions and conversion functions, and fills the results into a new table called stg_tbl_transaction in my staging module. Running this mapping takes 3 hours and 45 minutes with this configuration:
    Default operating mode in the mapping configuration's runtime parameters = Set based
    My dimension filled correctly, but I have two problems when I want to transfer my staging table to my cube:
    #1 Problem:
    I have created a cube called transaction_cube with OWB, and it generated and deployed correctly.
    I have created a map to fill my cube with the 168,271,268 records in the staging table stg_tbl_transaction and deployed it to the server (my cube map's operating mode is set based).
    But this map had not completed after 9 hours of running, and I was forced to cancel it by killing its sessions. I want to know whether this load time for this volume of data is acceptable, or whether for this volume we should expect to spend more time. Please let me know if anybody has seen this issue.
    #2 Problem:
    To test my map, I created a map configured as set based in its operating mode, selected stg_tbl_transaction (168,271,268 records) as the source, and created another table to load the data into. I wanted to measure the time this simple map should take, but after 5 hours my data had not loaded into the new table. I want to know where my problem is. Should I have set something in the map configuration, or something else? Please guide me on these problems.
    CONFIGURATION OF MY SERVER:
    I run OWB on a two-socket Xeon 5500 series machine with 192 GB RAM and disks in a RAID 10 array.
    Regards,
    Sahar

    For all of you
    It is possible to load from an InfoSet to a cube; we did it, and it was OK.
    Data really is loaded from the InfoSet (cube + master data) to the cube.
    When you create a transformation under a cube, the InfoSet is proposed, and it works fine.
    Now the process is no longer operational and I don't understand why.
    Loading from an InfoSet to a cube is possible; I can send you screenshots if you want.
    Christophe

  • Jasper report on HTML when one image loaded from database and for the other

    How do I generate a Jasper report as HTML when one image is loaded from the database and for the other we give an image path?
    My code
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    exporter = new JRHtmlExporter();
    exporter.setParameter(JRExporterParameter.JASPER_PRINT, print);
    exporter.setParameter(JRExporterParameter.OUTPUT_STREAM, baos);
    exporter.setParameter(JRHtmlExporterParameter.IMAGES_URI, strImageInputDirectory);
    exporter.setParameter(JRHtmlExporterParameter.IMAGES_DIR_NAME, strImageOutputPath == null ? "." : strImageOutputPath);
    exporter.setParameter(JRHtmlExporterParameter.IS_OUTPUT_IMAGES_TO_DIR, Boolean.TRUE);
    exporter.setParameter(JRHtmlExporterParameter.IS_USING_IMAGES_TO_ALIGN, Boolean.FALSE);
    exporter.setParameter(JRHtmlExporterParameter.IS_WHITE_PAGE_BACKGROUND, Boolean.FALSE);
    exporter.exportReport();
    byte[] bdata = baos.toByteArray();
    Can anyone help, please?

    Hey, sorry for posting it in this forum.
    But do you have sample code to make it work? I am able to do it in PDF format (one image from the database and another stored on the web server) using:
    byte[] image = (byte[]) outData.get("image");
    ByteArrayInputStream img = new ByteArrayInputStream(image);
    hmimg.put("P_PARAMV3", img);
    print = JasperFillManager.fillReport(reportFileName, hmimg, jrxmlds);
    bdata = JasperExportManager.exportReportToPdf(print);

  • My LV program can't find the IAK file when the program is loaded from a different PC.

    Hello all,
    My LV VI with FieldPoint I/O and its IAK file are stored on the company network. If I load and run the LV VI from the PC it was developed on, everything is fine. If I load and run on a different PC (that has an identical install of LV), the LV VI can't find the IAK file (all of the I/O pointers to the file are grayed out).
    Also, if I make an EXE out of the VI on the machine that can find the IAK, it won't find it if it's run from the other machine.
    It is not related to whether one machine has different "rights", as they are identical. It's not related to different installs of LV, as they are identical.
    I have found that this is related to the network login name/password and to the machine's network name. If I log in on the second machine and try to run the program, it won't find the IAK. If I point it to the IAK, it will run. If I save it and log out, and then log in as a different user with the same rights on the second machine, it won't find the IAK again until I re-point it there when I re-run it. This is true for both the VI and the EXE.
    This problem makes deploying the EXE to different machines difficult, as the first time it's run, the EXE has to be pointed to the IAK. Pointing the EXE to the IAK is difficult, as these installations have no keyboard or mouse.
    Does anyone have any ideas on how to fix this so the LV VI or its EXE finds the IAK no matter where it's loaded from?
    Thanks,
    Mike

    Laura,
    I figured I'd continue this thread because it's along the same lines.
    I have two FieldPoint systems. They are identical except for one thing: one has an AI-110 in position 7, the other has nothing. Both have the same IP address, etc., and I swap the Ethernet cable between them when I want to select which one I use. I use a dedicated Ethernet card for the FP processor; it's not on the company net.
    There are two IAK files: one for the system with the AI-110, one for the system without it. The two IAKs are in different paths.
    Before I had the system and the IAK without the AI-110, I made an EXE of the code that worked with the AI-110, and it worked fine.
    But now, if I run the EXE after I've used MAX and opened the IAK without the AI-110, the EXE won't read the AI-110! I have to go back into MAX, open the IAK that has the AI-110 in it, close MAX, and restart the EXE for the EXE to see the AI-110. (MAX is NOT set to "Download Items on Save".)
    It looks like the EXE is looking at MAX to get the IAK or a pointer to the IAK.
    Did I build my EXE incorrectly?  Shouldn't an EXE completely ignore MAX once it's made?
    Thanks,
    Mike
