SOLAR01 data extraction into BI

Hi,
This question was asked a couple of months ago but does not seem to have been answered. Is there a way to extract data from the SOLAR01 transaction into BI? There is reporting in SolMan, but it would be nicer to use BI functionality.
If there is no actual extract, could someone tell me the tables and, possibly, the function modules used to read them? Can you even connect SolMan as a source system?
Thanks much,
Matt

Similar Messages

  • Need help for SRM-6 Data Extraction into BI-7

    Hi Experts,
    I am looking for help with SRM DataSource extraction to BW. I am working on a BW project and need to extract data from the SRM DataSources 0BBP_TD_CONTR_2 and 0SRM_TD_PRO. I need to understand the extraction process from SRM: is there a tool like the LO Cockpit available in SRM for extraction, and how do we manage the delta in it? Or can I use transaction RSA5 and follow the normal process? If I could get some documentation that gives me an idea of data extraction and delta management in SRM, that would be appreciated. Any kind of help is welcome.
    Thanks in Advance.

    Hi,
    These may help you:
    How to extract data from SRM into a BW system
    SRM Audit Management data extraction into BW
    Procedure for standard DataSource extraction from SRM to BW 3.5
    Regards,
    Suman

  • Need help for SRM Data Extraction into BI-7

    Hi Experts,
    I am looking for help with SRM DataSource extraction to BW. I am working on a BW project and need to extract data from the SRM DataSources 0BBP_TD_CONTR_2 and 0SRM_TD_PRO. I need to understand the extraction process from SRM: is there a tool like the LO Cockpit available in SRM for extraction, and how do we manage the delta in it? Or can I use transaction RSA5 and follow the normal process? If I could get some documentation that gives me an idea of data extraction and delta management in SRM, that would be appreciated. Any kind of help is welcome.
    Thanks in Advance.

    SRM doesn't have an LO Cockpit-like tool as ECC does.
    Overall picture:
    http://help.sap.com/saphelp_srm40/helpdata/en/b3/cb7b401c976d1de10000000a1550b0/content.htm
    If you set up the data flows according to the Business Content of SRM:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/3a/7aeb3cad744026e10000000a11405a/frameset.htm
    Then just perform an init load and schedule the deltas.

  • Problem in Data extraction into BI

    Hi Guys,
    I am having a strange problem and am unable to identify the root cause. I am hoping that someone can help me with this issue.
    System Details:
    BI 7.0
    SAP ECC 5.0
    Current Design:
    We have created a DSO in the BI system and a custom extractor to pull the data from SAP ECC 5.0 into the DSO.
    The custom extractor uses the option "Extraction from SAP Query" with a custom InfoSet. It is a transaction-type DataSource.
    Problem Statement:
    When I run transaction RSA3 for this extractor, it returns 1870 records. However, when I run the InfoPackage in the BI system to bring the records into the DSO --> PSA, I get only 823 records.
    Question:
    Why am I seeing this difference between the record counts in RSA3 and in the BI system? I am NOT using any selection or filter conditions in the InfoPackage, and I am only loading the data as far as the PSA. So I was expecting to see the same 1870 records in the PSA as in RSA3. Any idea what is missing?
    Thanks in advance !!

    Hi,
    The InfoPackage ran as a full load, not a delta load.
    I looked at RSMO and at first glance everything looks fine. But when I look at the Details tab, I can see that the overall status shows RED and it says:
    Overall Status: Errors occurred: or: Missing Messages
    I am not clear what this means. I don't see any other nodes in RED; every node is green except the topmost node, which is RED and shows the above message.
    Any idea what the problem could be?

  • Data extraction into sap table  from legacy oracle database

    Hello All,
    I have a scenario with two different software systems (SAP and a xyz system), where an intermediate table, shared between the two systems, will be created. Data will be written by the xyz system into this shared table. Now, my questions regarding this shared table:
    1) Can we write a program or something to get the data from the shared table into SAP?
    2) If possible, send me your suggestions.
    3) Please also send me sample code to read the data from the shared table.
    Thanks in advance,
      SDN powered

    The shared table's fields should be compatible with the SAP fields: write code to fetch the data from this table, assign it to the appropriate fields in SAP, and insert it into SAP. There are three common approaches:
    1. Push mechanism
    Write an RFC on the SAP side that inserts entries into the table.
    Call the RFC from the xyz application, passing the data you want to insert.
    2. Pull mechanism
    Write an ABAP program that reads the data of the xyz application and inserts it into the table.
    3. Flat file
    Dump the data from the xyz application into a file.
    Write an ABAP program to read the file and update the table.
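    The pull mechanism (option 2) can be sketched as follows; this is a minimal Python illustration against a stand-in SQLite database, where "shared_table", "sap_stage", and the column names are all hypothetical. In a real landscape the SAP side would be an ABAP program or RFC rather than Python.

```python
import sqlite3

def pull_shared_table(conn):
    """Pull mechanism: read unprocessed rows from the shared table,
    map them to the target fields, insert them, and mark them processed."""
    cur = conn.cursor()
    rows = cur.execute(
        "SELECT id, material, qty FROM shared_table WHERE processed = 0"
    ).fetchall()
    for row_id, material, qty in rows:
        # Field mapping / compatibility checks would go here.
        cur.execute("INSERT INTO sap_stage (material, qty) VALUES (?, ?)",
                    (material, qty))
        cur.execute("UPDATE shared_table SET processed = 1 WHERE id = ?",
                    (row_id,))
    conn.commit()
    return len(rows)
```

    Marking rows as processed is what makes the pull repeatable: running it again picks up only rows written since the last pull.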

  • Master data extraction into FTP

    Hi all,
    Please help me download material master data to an FTP site.

    Hi,
    To connect to an FTP server, you need to provide a username and password in your program. The following steps will solve your problem:
    1. Call function HTTP_SCRAMBLE: export the FTP server password, the length of the password, and a key; import the scrambled password (v_pass1).
    2. Call function FTP_CONNECT: export the username, v_pass1 as the password parameter, the host, and the destination; import a handle.
    3. Call function FTP_COMMAND: pass the FTP commands and the handle, along with the table that contains your data.
    Reward points if the answer is helpful.
    Regards,
    Mukul
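    As a rough, non-SAP illustration of the connect-then-transfer sequence above, here is a Python sketch using the standard ftplib; the host, credentials, file name, and the field layout in rows_to_lines are all hypothetical, and unlike the ABAP sequence there is no separate password-scrambling step (HTTP_SCRAMBLE), since ftplib takes the password directly.

```python
from ftplib import FTP
import io

def rows_to_lines(rows):
    """Flatten (material, description, qty) tuples into tab-separated lines.
    The real layout depends on which material master fields you select."""
    return ["\t".join(str(field) for field in row) for row in rows]

def upload_to_ftp(host, user, password, filename, rows):
    """Analogue of FTP_CONNECT followed by FTP_COMMAND: open a session
    with credentials, 'put' the data as a text file, and disconnect."""
    payload = io.BytesIO("\n".join(rows_to_lines(rows)).encode("utf-8"))
    ftp = FTP(host)                             # cf. FTP_CONNECT
    ftp.login(user=user, passwd=password)
    ftp.storlines("STOR " + filename, payload)  # cf. FTP_COMMAND 'put'
    ftp.quit()                                  # close the session
```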

  • Custom Document & Custom Doc Legal Control Extraction into SAP BI

    Dear All,
    After all the business content for GTS was activated in GTS and SAP BI, the following configuration steps were taken in GTS to extract this data into BI for
    reporting:
    [1] Integration with Other mySAP.com Components > Data Transfer to the SAP Business Information Warehouse > General Settings > Maintain Control Parameters for Data Transfer [Used ECC as guide]
    [2] Global Trade Services > General Settings > Organizational Structure > Control Settings at FTO Level for SAP NW Business Intelligence (BI) 
    Inserted new records that were missing when compared to the "Assgmt of Feeder System Grouping Org.Units to LLS Org.Units" table [/SAPSLL/TCOOGS].
    Turned on the BI Active flag for all records.
    [3] Global Trade Services > General Settings > Document Structure > Define Document Types for Application Areas
    For each document type in all the folders, turned on ‘transfer to SAP Netweaver Business Intelligence Active’ flag where it was available to be turned on.
    [4] Global Trade Services > General Settings > Document Structure > Define Item Categories for Application Areas
    For each item category in all the folders, turned on the ‘transfer to SAP Netweaver Business Intelligence Active’ flag where it was available to be turned on.
    Focusing on the custom document and custom document legal control extractors: the data comes from /SAPSLL/CUIT, which contains 113K records in development; however, the amount of data extracted into BI is minimal. So I have a few questions:
    1. Did I miss any additional configuration switches needed to extract into BI?
    2. SAP documentation says: "You have entered a unit of weight in the table for GTS: Control Settings at Foreign Trade Organization Level. This unit of weight must be the common unit of weight that you want to be used in strategic reporting in BI. The values in the fields for gross weight (WEIGR), net weight (WEINE) and unit of weight (WEIDI) depend on the unit of weight you maintain at FTO level in the IMG for converting weights for SAP BI." Does each FTO need to be inserted with a different unit of measure? I.e. FTO123 UoM = EA; FTO123 UoM = CS; FTO123 UoM = KG?
    3. Should all the data records from /SAPSLL/CUIT be extracted into the Custom Document DSO [0SLL_DS01]? If no, why? If yes, refer to point 1: what am I missing?
    4. SAP documentation says: "Only data that belongs to the legal controls for the following legal regulations is transferred to BI: German Foreign Trade Laws, Prohibitions and Restrictions, Other." What is the significance of these legal controls, such that SAP decided to transfer only these types into BI? If the client wants additional legal controls, would it be possible to extract them into BI?
    If anyone is currently using the std content for BI reporting, any assistance with my questions would be appreciated.
    Sincerely,
    Claire

    Hi Claire,
    Your customizing looks correct. But the amount of data extracted depends on the conditions you have defined for the extraction (e.g. creation date).
    Also, in the coding, only /SAPSLL/CUIT~ITSTA = space (Standard Item) and A (Item is allocated) are considered in 0SLL_CD_1.
    You can find the exact reason by debugging:
    T-code RSA2, enter DataSource 0SLL_CD_1, then click Display.
    Click Extract check (Shift + F8) and set a breakpoint in the coding of FM /SAPSLL/BW_GET_CD_1.
    Best regards,
    Vincent

  • Extracting into Flat Files Using OCI or Pro*C Programs

    Data extraction into flat files from a database using OCI or Pro*C programs - please provide me with sample code. It is urgent. Thank you in advance.

    This problem is very simple to solve. Simply use Pro*C, issue an SQL SELECT into a host variable, then use the Unix "printf" to output the result. An alternative is to use the provided SQL*Plus utility to grab the data in a script, disabling headers, etc.
    Sadly, this area is a huge, basic hole in the Oracle product offering. While Oracle has an import utility, there is no export utility, except for one that produces binary files unusable outside Oracle. Every other RDBMS I've seen has this. In Informix, you can say something like "export to <filename> select * from <table>", but that syntax is not offered by Oracle Corporation.
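    The select-and-print approach can be shown with a generic Python DB-API sketch (SQLite stands in for Oracle here, since Pro*C/OCI are Oracle-specific); the table and query are hypothetical.

```python
import csv
import io
import sqlite3

def export_query_to_csv(conn, sql, out):
    """Generic 'export to flat file': run a SELECT, then write a header
    row (column names) plus all data rows as CSV -- the same idea as a
    Pro*C fetch loop with printf, or a SQL*Plus spool with headings off."""
    cur = conn.cursor()
    cur.execute(sql)
    writer = csv.writer(out)
    writer.writerow([col[0] for col in cur.description])  # column names
    writer.writerows(cur)                                 # data rows
    return out
```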

  • Data Extraction from Multiple data sources into a single Infoprovider

    Hi Experts,
    Can anyone send me links or examples on how to extract data from multiple data sources into 1 Cube/DSO.
    Can anyone send me example scenarios for extracting data from 2 data sources into a single Cube/DSO.
    Thanks
    Kumar

    Hi Ashok,
    Check the following link from SAP Help. This is probably what you are looking for.
    [ Multiple data sources into single infoprovider|http://help.sap.com/saphelp_nw70/helpdata/en/e3/e60138fede083de10000009b38f8cf/frameset.htm]
    Data from multiple DataSources which are logically related and technically have the same key can be combined into a single record.
    For example, suppose you have sales order item and sales order item status.
    Both have the same key - sales order and item - and are logically related.
    The sales order item provides information on the item: material, price, quantity, plant, etc.
    The item status provides information on delivery status, billing status, rejection status, and so on.
    These are two different DataSources: 2LIS_11_VAITM and 2LIS_11_VASTI.
    In case you have a few master data attributes coming from different systems, i.e. two DataSources in different systems that together completely define your master data, you could use a DSO or InfoObject to combine the records.
    If you want to see aggregated data, you use a cube.
    Say you want to analyze customer revenue for a particular material and a particular project.
    The project details would come from Project System and the sales details from the sales flow.
    The data is then combined in the DSO with, say, customer and sales area as the key.
    Then it is aggregated at cube level: by not using sales order in the cube model, all sales orders of the same customer are added up while loading the cube, giving direct customer spend values for your sales area at an aggregated level.
    Hope this helps,
    Best regards,
    Sunmit.
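    The key-based combination described above can be sketched in miniature; the record layouts here are hypothetical stand-ins for 2LIS_11_VAITM and 2LIS_11_VASTI, sharing the key (sales order, item).

```python
def merge_by_key(*sources):
    """DSO-style combination: records from each source are merged by key,
    so fields from logically related sources end up in a single record."""
    dso = {}
    for source in sources:
        for record in source:
            key = (record["order"], record["item"])
            dso.setdefault(key, {}).update(record)
    return dso

# Item data and item status share the key (order, item):
items  = [{"order": "4711", "item": 10, "material": "M-01", "qty": 5}]
status = [{"order": "4711", "item": 10, "delivery_status": "A"}]
merged = merge_by_key(items, status)
```

    The merged record for (4711, 10) now carries both the item fields and the status fields, which is exactly what the shared key buys you.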

  • [sap-bw] SAP HR data extraction errors....into BI 7.0 ...from ECC 6.0..

    Hi Gurus,
    We are doing HR data extraction with the following details:
    Cube: 0PT_C01 - Time and Labor
    data source: 0HR_PT_1 - Times from personal work schedule
    Records in ECC: 15mill +.
    The "init load with data" is being loaded in packets of 40K records (approx. 400 packets).
    Each time we load (we tried reloading after deleting the data and re-initializing), it loads the same records, but gives the following error. Surprisingly, this error comes up for different packets in different loads, e.g. the 1st time packet 6, the 2nd time packet 58, the 3rd time packet 122, etc. It keeps changing:
    Processing end : Errors occurred
    Error when generating the update program
    Diagnosis
    An error occurred during program generation for InfoSource 0HR_PT_1 and
    InfoProvider 0PT_C01 . This may be extrapolated to incorrect update
    rules.
    Procedure
    Check and correct your update rules and then generate the update program
    again. You can find help on the error in the error log.
    We reactivated and verified the update rules, but to no avail.
    Appreciate any kind of help.
    thanks in advance.
    J.

    Oscar,
    Thanks for the help.
    We tried the following one by one.
    1. Delete and recreate the update rules. To be frank, there is no coding or complication in there.
        It did not work.
    2. Tried loading only to the PSA. It loaded the 15 million records perfectly. But when we tried updating the target cube,
        same story, with many more errors: earlier we had 10 packages fail out of 387;
        this time almost every alternate package is marked red with the same error, "error when generating the update program".
    Hope somebody will come up with a solution.

  • BODS 3.1 : SAP R/3 data extraction -What is the difference in 2 dataflows?

    Hi.
    Can anyone advise what the difference is between the two data flows for extracting data from SAP R/3?
    1) DF1 >> SAP R/3 (R/3 table - query transformation - .dat file) >> query transformation >> target
    This ABAP data flow generates an ABAP program and a .dat file.
    We can also upload this program and run jobs with the "Execute preloaded" option on the datastore.
    This works fine.
    2) We can also pull the SAP R/3 table directly.
    DF2 >> SAP R/3 table (this has a red arrow, as in OHD) >> query transformation >> target
    This also works fine, and we are able to see the data directly in Oracle.
    It can also be scheduled as a job.
    But I am unable to understand the purpose of the different types of data extraction flows:
    when to use which type of flow for data extraction,
    and the advantages/disadvantages of the two data flows.
    What we are not understanding is this:
    if we can pull data from the R/3 table directly through a query transformation into the target table,
    why use the flow of creating an R/3 data flow,
    then do a query transformation again,
    and then populate the target database?
    There might be practical reasons for using these 2 different types of flows for data extraction, which I would like to understand. Can anyone advise, please?
    Many thanks
    indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 3:25 PM

    Hi Jeff.
    Greetings. And many thanks for your response.
    Generally we pull the entire SAP R/3 table through a query transformation into Oracle.
    For this we use an R/3 data flow and the ABAP program, which we upload on the R/3 system
    so as to be able to use the "Execute preloaded" option and run the jobs.
    Since we do not have any control over our R/3 servers, nor anyone for ABAP programming,
    we do not do anything at the SAP R/3 level.
    I was doing trial-and-error testing on our workflows for our new requirement:
    WF 1, which has some 15 R/3 tables.
    For each table we have created a separate data flow.
    And in some of the data flows, for the SAP tables that had a lot of rows, I decided to pull the data directly,
    bypassing the ABAP flow.
    And the entire workflow and data extraction still happens OK.
    In fact, I tried creating a new sample data flow and tested it,
    using direct download and also "Execute preloaded".
    I did not see any major difference in the time taken for data extraction,
    because in any case we pull the entire table, then choose whatever we want to bring into Oracle through a view for our BO reporting, or aggregate it and bring the data in as a table for universe consumption.
    Actually, I was looking at other options to avoid this ABAP generation and the R/3 data flow, because we are having problems in our dev and QA environments, which give delimiter errors, whereas in production it works fine. The production environment is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and they are the ones with this delimiter error.
    I did not understand how to resolve it as per this post: https://cw.sdn.sap.com/cw/ideas/2596
    Trying to resolve this problem, I ended up with the option of pulling the R/3 table directly, without using the ABAP workflow, just by trial and error of each and every drag-and-drop option, because we urgently had to do a POC and deliver the data for the entire e-recruiting module of SAP.
    I don't know whether I can do this direct pulling of data for the new job I have created,
    which has 2 workflows with 15 data flows in each workflow,
    and push this job into production.
    Nor whether I can bypass the ABAP flow and pull R/3 data directly in all the data flows in future, for ANY of our SAP R/3 data extraction requirements. The technical difference between the 2 flows is not clear to us, and being new to the whole of ETL, I just wanted to know the pros and cons of this particular kind of data extraction.
    As advised I shall check the schedules for a week, and then we shall move it probably into production.
    Thanks again.
    Kind Regards
    Indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 7:02 PM

  • Data Extraction and ODS/Cube loading: New date key field added

    Good morning.
    Your expert advise is required with the following:
    1. A data extract was previously done from a source with a full upload to the ODS and cube. An event is triggered from the source when data is available, and the process chain will first clear all the data in the ODS and cube and then reload, activate, etc.
    2. In the ODS, the 'forecast period' field has now been moved from the data fields to the key fields, as the user would like to report per period in future. The source will in future only provide the data for a specific period, not all the data as before.
    3. Data must be appended in future.
    4. The current InfoPackage in the ODS is a full upload.
    5. The 'old' data in the ODS and cube must not be deleted, as the source cannot provide it again. They will report on the data per forecast period key in future.
    I am not sure what to do in BW as far as the InfoPackages are concerned, loading the data and updating the cube.
    My questions are:
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    Your assistance will be highly appreciated. Thanks
    Cornelius Faurie

    Hi Cornelius,
    Q1) How will I ensure that BW will append the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    -->> Try to load the data into the ODS in overwrite mode with a full update, as before (this adds new records and overwrites previous records with the latest). Push the delta from this ODS to the cube.
    If the existing ODS loads in addition mode, introduce one more ODS with the same granularity as the source, load it in overwrite mode (delta if possible, otherwise full), and push only the delta downstream.
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    --> Yes, that is correct. Otherwise you would lose the historic data.
    Hope it Helps
    Srini
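    The overwrite-versus-additive distinction this answer relies on can be sketched as follows; the key fields and key figure names are hypothetical.

```python
def load_overwrite(dso, records, key_fields):
    """ODS/DSO overwrite mode: the latest record for a key replaces
    the previous one, so full reloads do not double values."""
    for rec in records:
        dso[tuple(rec[k] for k in key_fields)] = rec

def load_additive(cube, records, key_fields, key_figure="amount"):
    """Cube-style additive update: key figures are summed per key,
    which is why loading the same full data twice doubles the values."""
    for rec in records:
        key = tuple(rec[k] for k in key_fields)
        cube[key] = cube.get(key, 0) + rec[key_figure]
```

    Loading the same full upload twice is harmless in overwrite mode but doubles the key figures in additive mode, which is why the reply recommends overwrite in the ODS and pushing only the delta to the cube.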

  • FI data extraction help

    HI All,
    I have gone through the SDN link for FI extraction and found it very useful.
    Still, I have some doubts...
    For line item data extraction: do we need to extract data from 0FI_GL_4, 0FI_AP_4 and 0FI_AR_4 into ODS 0FIGL_O02 (General Ledger: Line Items)? If so, do we need to maintain transformations between the ODS and all three DataSources?
    Also, please educate me on the 0FI_AP_3 and 0FI_AP_4 DataSources.

    >
    Jacob Jansen wrote:
    > Hi Raj.
    >
    > Yes, you should run GL_4 first. If not, AP_4 and AR_4 will be lagging behind. You can see in R/3 in table BWOM_TIMEST how the deltas are "behaving" with respect to the date selection.
    >
    > br
    > jacob
    Not necessarily, for systems above Plug-In 2002.
    As of Plug-In 2002.2, it is no longer necessary to have DataSources linked. This means that you can now load 0FI_GL_4, 0FI_AR_4, 0FI_AP_4, and 0FI_TX_4 in any order. You also have the option of using DataSources 0FI_AR_4, 0FI_AP_4 and 0FI_TX_4 separately without 0FI_GL_4. The DataSources can then be used independently of one another (see note 551044).
    Source:http://help.sap.com/saphelp_nw04s/helpdata/EN/af/16533bbb15b762e10000000a114084/frameset.htm

  • Steps for Data extraction from SAP r/3

    Dear all,
    I am New to SAP Bw.
    I have done data extraction from Excel into the SAP BW system,
    along these lines:
    Create InfoObjects > InfoArea > Catalog
                                                --> Characteristic catalog
                                                --> Key figure catalog
    Create InfoSource
    Upload data
    Create InfoCube
    I need the equivalent steps for data extraction from SAP R/3:
    1. when the data is in Z-tables (using views/InfoSets/function modules, etc.)
    2. when the data is in standard SAP, using Business Content.
    Thanks and Regards,
    Gaurav Sood

    Hi,
    check these links:
    Generic Extraction
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
    CO-PA
    http://help.sap.com/saphelp_46c/helpdata/en/7a/4c37ef4a0111d1894c0000e829fbbd/content.htm
    CO-PC
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/fb07ab90-0201-0010-c489-d527d39cc0c6
    Inventory
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
    Extractions in BI
    https://www.sdn.sap.com/irj/sdn/wiki
    LO Extraction:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    Remya

  • 'Error 8 occurred when starting the data extraction program'

    Hello Experts,
    I am trying to pull master data (full upload) for an attribute. I am getting an error on the BW side: 'The error occurred in Service API.'
    So I checked the source system and found that an IDoc processing failure had occurred. The failure shows 'Error 8 occurred when starting the data extraction program'.
    But when I check the extractor through RSA3, it looks fine.
    Can someone tell me what might be the reason for the IDoc processing failure and how it can be avoided in future? The same problem kept occurring later as well.
    Thanks
    Regards,
    KP

    Hi,
    Check whether the IDocs from SM58 of the source system are being processed into your target (BI) system.
    In SM58, enter * for all entries and your BI system as the target destination, execute, and check whether any entries are pending there.
    rgds,
    Edited by: Krishna Rao on May 6, 2009 3:22 PM
