BW Extractors for a Non-BW Data Warehouse

Hi,
I am working with a client who wants to use a custom-developed data warehouse. Is there any way to use SAP's standard extractors to extract data into such a non-SAP DW system (or to download the data in flat-file format after an extractor run)? We are mainly looking at the LO Cockpit and Finance extractors.
Best regards,
Nikhil

One option I know of: using Informatica, you can extract data from SAP R/3 and load a non-BW data warehouse via the SAP BCI adapters. The BCI adapter makes use of the delivered extractors.
Check out the WebEx replay from the Informatica World conference:
http://www11.informatica.com/replays/IW06_BOsession_Kato.wrf

Similar Messages

  • Using OBIEE for a custom Data Warehouse

    Hi Everyone,
    I am very new to OBIEE and I have a few questions about this product family.
    1. I have an existing custom-built data warehouse, and I would like to know: is it possible to build reports on this data warehouse?
    2. I understand that OBIEE comes with pre-built ETL jobs in Informatica; what kind of license is that? Is it possible to modify them, or even build new jobs that load into a non-OBIEE data warehouse?
    Your answer will be greatly appreciated.
    Jeffrey
    Edited by: user3265404 on Oct 13, 2009 12:50 PM

    It's the same Informatica and can do everything a stand-alone Informatica installation can. Additionally, it has prebuilt adapters for source systems like Siebel, APPL, PSFT, JDE and SAP, plus some universal adapters, so the license includes these as well, which is going to cost more than getting an Informatica licence from Informatica Corp. Moreover, OBI Apps 7.9.6 comes with Informatica 8.6, which is a slightly older version of the tool; Informatica is going to release version 9 in a couple of weeks.
    I see that you already have a data warehouse, so why do you need an ETL tool again?
    OBIEE can report directly out of a data warehouse, and also out of transactional systems, as long as the metadata layer is built.
    PS: Am I clear?

  • Native data warehouse products  vs  non-native data warehouse products

    Hi Experts!
    Can anyone help me on this topic if you have any idea about native data warehouse products?
    1) Discuss native data warehouse products.
    2) Discuss the system's ability to interface with a non-native data warehouse.
    Discuss the architecture in both cases.
    Describe or illustrate how data in the data warehouse can be utilized for reporting together with the data in the ERP system.
    Your help will be appreciated.
    Thanks in advance,
    vikram.c


  • Mapping deploy for Non-Oracle Data Source hangs

    Hi All,
    I am trying to deploy a mapping for a non-Oracle data source and it hangs.
    The Oracle version is 10.2.0.3 and the OWB version is 10.2.0.1.3.1.
    It would be really appreciated if you could help.
    Thanks!
    PS.

    That helps quite a bit. I still can't get the app to retrieve data, but I am getting a more useful message in the log:
    [Error in allocating a connection. Cause: Connection could not be allocated because: ORA-01017: invalid username/password; logon denied]
    As you suggested, I removed the <default-resource-principal> stuff from sun-web.xml and modified it to match your example. Additionally, I changed the <res-ref-name> in web.xml from "jdbc/jdbc-simple" to "jdbc/oracle-dev".
    The Connection Pool "Ping" from the Admin Console is successful with the user and password I have set in the parameters (it fails if I change them, so I am pretty sure that part is set up correctly). Is there another place I should check for user/password information? Do I need to do anything to the samples/database.properties file?
    By the way, this is the 4th quarter 2004 release of the app server. Would it be beneficial to move to the Q1 2005 beta?
    Many thanks for your help so far...

  • How to convert number datatype to raw datatype for use in data warehouse?

    I am picking up the work of another grad student who assembled the initial data for a data warehouse, mapped out a dimensional DW and then created the initial fact and dimension tables. I am using Oracle Enterprise 11gR2. The student was new to Oracle and used a datatype of NUMBER (without a length, defaulting to NUMBER(38)) for the dimension keys. The DW has 1 fact table and about 20 dimension tables at this point.
    Before refining the DW further, I have to translate all these dimension tables and convert all columns of NUMBER and NUMBER(n) (where n = 1-38) to the RAW datatype with a length. The goal is to compact the size of the DW database significantly. With only a few exceptions, every NUMBER column is a dimension key or attribute.
    The entire DW database is now sitting in a Data Pump dump file. This has to be imported into the DB instance and then somehow converted so that all occurrences of the NUMBER datatype become RAW datatypes. BTW, there are other datatypes present, such as VARCHAR2 and DATE.
    I discovered that Data Pump cannot convert NUMBER to RAW during an import or export, so the instance tables, once loaded using impdp, will be the starting point.
    I found there is a UTL_RAW package delivered with Oracle to facilitate using the RAW datatype. This has a number-to-raw function. I have never used it and am unsure how to incorporate it into the table conversions. I also hope to use OWB capabilities at some point, but I have never used it and only know that it has a lot of analytical capabilities. As a preliminary step, I have done partial imports and determined the max length of every NUMBER column, so I can alter the present schema's NUMBER columns to an appropriate max length for each column in each table.
    Right now I am not sure what the next step is. Any suggestions for the data conversion steps would be appreciated.

    Hi there,
    The post about "Convert Numbers" might help in your case. You might also be interested in "Anydata cast" or transformations.
    Thanks,

  • Syntax for WriterLoginName in Data Warehouse DB

    Hello
    I'm having a few issues with our management servers writing to the Data Warehouse DB. I've checked the 'Management Group' table and can see that WriterLoginName is set to
    DOMAIN\sv-scom-dw - however, I'm wondering whether that field should instead read
    sv-scom-dw
    The account is in fact a domain account. It's listed as the 'Data Warehouse SQL Account' and the 'Data Warehouse Action Account' (under Administration > Run As Configuration > Accounts).
    We have two entries in the database security (rights over OperationsManagerDW): one as DOMAIN\sv-scom-dw and a local SQL login called sv-scom-dw. Both accounts have the following permissions: apm_datareader, apm_datawriter, db_datareader, db_owner, OpsMgrReader, OpsMgrWriter, public.
    We're a SCOM 2012 R2 environment. All servers are 2012 R2, and SQL Server is also 2012 Standard.
    Anyone faced a similar issue before? I'm seeing a lot of alerts in the Monitoring section for the Data Warehouse. One in particular:
    Data Warehouse failed to discover performance standard data set. Failed to enumerate (discover) Data Warehouse objects and relationships among them. The operation will be retried.
    Exception 'SqlException': Management Group with id ''5F201AB2-4B10-7FCC-C716-B2361102248D'' is not allowed to access Data Warehouse under login ''sv-scom-dw''
    One or more workflows were affected by this.
    Workflow name: Microsoft.SystemCenter.DataWarehouse.Discovery.StandardDataSet
    Instance name: Performance data set
    Instance ID: {B81C47FB-A80D-0FE5-A8DB-DC4544FC8DA6}
    Management group: ******
    As you can see from the alert, the account referenced is 'sv-scom-dw' and not 'DOMAIN\sv-scom-dw', which is why I originally asked whether the field in the Management Group table should be updated.
    Thanks, David.

    Hi guys.
    Thanks for the responses, I shall provide an event ID shortly. In response to Mai, I've followed the link you posted and I'm now checking the 'data source and related settings', so I've gone to http://localhost/reports on the warehouse server (which also hosts the reporting) and I've got the following error:
    The report server cannot decrypt the symmetric key that is used to access sensitive or encrypted data in a report server database. You must either restore a backup key or delete all encrypted content. (rsReportServerDisabled)
    Keyset does not exist (Exception from HRESULT: 0x80090016)
    Have you come across this before?
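    As an aside on the original question: it can help to compare the login the alert complains about with what the warehouse itself has recorded. A minimal check, assuming you can query the OperationsManagerDW database directly (the table name below is an assumption based on the 'Management Group' table mentioned above):
    -- what the warehouse expects the writer to log in as; compare this with the login
    -- named in the 'not allowed to access Data Warehouse under login' alert
    SELECT WriterLoginName
    FROM   dbo.ManagementGroup;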

  • Extractor for payroll reads data from cluster PCL2?

    Hello all,
    This is the line from help.sap in payroll extractor.
    "The extractor for payroll results reads data from payroll cluster PCL2, not from standard tables."
    Can someone explain what we should understand from that, and where in R/3 we should go to check the data?
    Thanks in advance.

    Sometimes the latest information does not come through in the delta upload from payroll to BW, because payroll has a closing process that has to be executed first (or something like that - sorry, I don't have enough functional HR knowledge; it has happened to me occasionally).
    But most likely the user you are using for the extraction does not have enough rights or authorizations.
    Look at these notes
    397208 **** very important!!!
    672514
    964569
    329961
    585682

  • Need Help for Non Transactional Data

    Hi,
    I need your help with getting non-transactional data. I am looking for a solution where the report shows the employee list, covering employees both with and without transactions, along with the measure.
    Thanks in advance.
    Phani.

    Looks like you either want a procedure with OUT parameters, or to return a record (which you'd need to declare somewhere that your function and calling procedure can both see)
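    For illustration only (the package, type and column names below are made up), this is the kind of record declaration the reply above refers to, placed in a package spec so that both the function and its callers can see it; the package body is omitted:
    create or replace package emp_api as
       -- record type visible to both the function and its callers
       type emp_rec is record (
          empname  varchar2(30),
          measure  number
       );
       function get_emp (p_paycode in number) return emp_rec;
    end emp_api;
    /
    A caller then declares a variable of type emp_api.emp_rec and assigns the function result to it.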

  • Testing for non-Numeric Data in a varchar2

    Hello -
    What is the easiest way to test whether there is non-numeric data in a VARCHAR2 column? The column holds SSN values, but I am unable to convert these values to numeric data because somewhere the column is storing non-numeric data.
    Thanks in advance.

    Maybe something like this ?
    create or replace function test_num (var1 in varchar2) return varchar2
    is
       num   number;
    begin
       num := to_number (var1);
       return ('Number');
    exception
       when others then
          return ('Character');
    end;
    /
    select test_num ('111') from dual;   -- returns: Number
    select test_num ('aaa') from dual;   -- returns: Character
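    On 10g and later, a set-based check without a helper function is also possible. A minimal sketch, assuming the column is called ssn in a hypothetical table employees:
    -- rows whose ssn contains anything other than the digits 0-9
    select ssn
      from employees
     where regexp_like(ssn, '[^0-9]');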

  • Data syncup issue - Custom extractor for revenue recognition data from ECC into BW

    Hi there,
    We have created a custom extractor on top of a Function Module (FM) in ECC that reads data primarily from VBREVE table. Currently, we have close to 10 million records and full load isn't a preferred option so we built in a delta mechanism using "Created On" (ERDAT) and "Posting Date" (BUDAT). The idea is to do an initial full load and then switch over to delta loads on a nightly basis for data where:
    "Created On" OR "Posting Date" is within last 6 months.
    This will ensure that if any updates are made to existing VBREVE entries, the change is reflected in the BW DSO, provided the "Created On" or "Posting Date" falls within the last 6 months. The issue comes up when a billing plan in ECC is cut down to a shorter term, say from 3 years to 2 years: the entries for the removed year are deleted from the VBREVE table. How can I pick up this change in the BW DSO, given that this data has already been loaded in the past? Additions of entries are fine, but I need to address deletions of entries from the VBREVE table so that they are reflected in the BW DSO. Any ideas how I can accomplish this? In the example in the screenshots, BW still shows the before image, and I need to be able to get it to sync up with ECC as per the after image.
    -Rohit

    Dear Rohit,
    The case is complicated; there can be a workaround using the 'changed on' date from the VBAK table: if the billing plan is changed in the sales order, VBAK will be updated with a new 'changed on' date.
    1) If the billing plan deletion is a very specific case, then use the change log tables to find out the sales orders for which there has been a billing plan change. Store these sales order details, day-wise, in a DSO; call it DSO1.
    2) Create a self-transformation on the DSO (DSO2) that you are currently loading from the table VBREVE (DSO2 to DSO2, i.e. loading data from the same DSO). The transformation should supply the reverse record mode as a constant.
    3) Load DSO2 from DSO2, filling the sales order details from DSO1 into the DTP filter.
    (This will reverse the complete entries in DSO2 for the sales orders whose billing plan has been changed.)
    4) Now load these sales orders afresh from the source system (table VBREVE) into DSO2.
    You can also use VBAK changedon date to trigger this load.
    Hope I have not confused you
    Regards
    Gajesh

  • CRM Extractor for service management - date type

    Hi,
    I want to extract data for service management from the CRM system. I can find the data with CRM_ORDER_READ in the table ET_APPOINTMENT. I need the field APPT_TYPE together with the time stamps.
    Is there a Business Content extractor available? I cannot find one for this.
    In which table is the data from ET_APPOINTMENT actually stored?
    Thanks in advance.
    Best regards
    Nils

    Hi Sunita,
    I think you want to calculate the time duration of the service requests. You can refer to the tables
    CRMD_ORDERADM_H,
    CRMD_SRVPLAN_I (Service Plan Item - Service Cycle Interval).
    Or you can refer to FM CRM_ORDER_READ and export parameter ET_APPOINTMENT.
    Regards,
    Karthik.

  • Extractor for Non Leading Ledger

    Hi,
    I am trying to create a DS for a non-leading ledger based on table FAGLFLEXT (totals). I am using transaction FAGLBW03 for that. When I try this, I get the error message:
    No extract structure exists for ledger table ECMCT
    Message no. GQPI053
    I am able to create a DS for the Leading Ledger with the same table in the same TCODE.
    Any suggestions please?
    Cheers
    Anand

    Anand,
    You can go to transaction BW01 in ECC and execute it for summary table ECMCT. You would need a developer key, and then you can activate the extract structure and try to create the DataSource again. But why don't you just go via FAGLBW03 and generate it - is there any specific reason? Do you want to create a generic DataSource?
    How are you going to handle the delta?
    Thanks!
    ~AK

  • Defining Working Times - Incorrect Scheduling Result for non-Working Date

    Hi All,
    We are on SAP R/3 46C and in the process of implementing 'Route Schedules' for our outbound delivery processing.
    One of the pre-requisite customising requirements for 'Route Schedules' is to maintain 'Working Times' for the Shipping Point for precise scheduling.
    We have defined a 24 hour shift between Monday and Friday in the 'Working Times' config (IMG -> Logistics Execution -> Shipping -> Basic Shipping Functions -> Scheduling -> Delivery Scheduling and Transportation Scheduling -> Maintain Working Hours).  
    The 24 hour shift is defined as follows Start: 00:00:00 with a Finish at 24:00:00 (We initially had 23:59:59, however, encountered problems with this config).  We have also maintained a 24 hour Pick/Pack time in Customising, and we do not use Loading Times.
    We have encountered a problem when we key a Sales Order Item where the ATP confirmed qty Delivery Date results in a Material Availability Date on a 'Friday', in this scenario, the system determines a one day 'Loading Time' which falls on a Saturday at 00:00:00.  This outcome is incorrect on two levels - 1) we do not use 'Loading Times' in our system, 2) a non-working day (Saturday) has been selected.  An important exception to this outcome - if a Route is not selected for the Sales Order item, the outcome described above does not eventuate.
    Any assistance in explaining possible causes of this unusual result would be greatly appreciated.
    Thanks in advance,
    Ravelle
    When we display our time streams via report 'ZZTSTRDISP' (OSS Note 169885),
    the system shows that our 24:00:00 finish time is converted to 00:00:00, which incorrectly rolls into the next day - on a Friday this means our shift finishes on a Saturday at 00:00:00:
    No.   Start date     Start time          End date       End time
    25    06.02.2008     00:00:00            07.02.2008     00:00:00
    26    07.02.2008     00:00:00            08.02.2008     00:00:00
    27    08.02.2008     00:00:00            09.02.2008     00:00:00 <-- Saturday
    28    11.02.2008     00:00:00            12.02.2008     00:00:00
    29    12.02.2008     00:00:00            13.02.2008     00:00:00
    30    13.02.2008     00:00:00            14.02.2008     00:00:00
    31    14.02.2008     00:00:00            15.02.2008     00:00:00
    32    15.02.2008     00:00:00            16.02.2008     00:00:00 <-- Saturday
    33    18.02.2008     00:00:00            19.02.2008     00:00:00

    SAP created OSS Note 1144784 to resolve this issue for us.

  • Inserting the default value for non existing date

    hello,
    I am designing a month-wise matrix payslip report in Oracle Reports 6i.
    In it I want to display the working hours for each day of the month.
    I am using the following query to achieve this:
    select a.paycode, b.empname, c.departmentname,
           to_char(a.dateoffice,'DD') dateoffice,
           a.shiftattended,
           round(decode(a.hw,0,a.mannual_hours,a.hw),0) hw,
           a.ismannual, a.status
      from tbltimeregister1 a,
           tblemployee      b,
           tbldepartment    c
     where a.paycode = b.paycode
       and (a.hw > 0 or a.mannual_hours is not null)
       and a.departmentcode = 'D05'
       and a.departmentcode = c.departmentcode
       and a.dateoffice between to_date('01/12/2012','dd/mm/yyyy')
                            and to_date('31/12/2012','dd/mm/yyyy')
     order by a.paycode, a.dateoffice;
    It displays the hours for the dates within the range that exist in the master table.
    My problem is that if any date of the month is not in the master data, the report should still display that date in DD format with 0 hours, an 'A' (absent) status, and a null shift by default.
    Please tell me how to modify my query to get the desired result.
    Thanking You
    Regards
    Vishal Agrawal

    965354 wrote:
    "... my problem is that if any date of a month is not in the master data, it should display the given date in DD format with 0 hours, an A (absent) status, and a null shift by default."
    Your problem isn't exactly clear to me. How do you pass the dates to be compared to the dateoffice column? It would be better to ensure the master table contains data for each date (with some default values); then you would not have to tweak your SQL.
    Below is a possible solution:
    select *
      from (
            select a.paycode, b.empname, c.departmentname,
                   to_char(a.dateoffice,'DD') dateoffice,
                   a.shiftattended,
                   round(decode(a.hw,0,a.mannual_hours,a.hw),0) hw,
                   a.ismannual, a.status
              from tbltimeregister1 a,
                   tblemployee      b,
                   tbldepartment    c
             where a.paycode = b.paycode
               and (a.hw > 0 or a.mannual_hours is not null)
               and a.departmentcode = 'D05'
               and a.departmentcode = c.departmentcode
               and a.dateoffice between to_date('01/12/2012','dd/mm/yyyy')
                                    and to_date('31/12/2012','dd/mm/yyyy')
            union
            select a.paycode, b.empname, c.departmentname,
                   to_char(&date_variable,'DD') dateoffice,
                   null shiftattended,
                   round(decode(a.hw,0,a.mannual_hours,a.hw),0) hw,
                   a.ismannual, 'A' status
              from tbltimeregister1 a,
                   tblemployee      b,
                   tbldepartment    c
             where a.paycode = b.paycode
               and (a.hw > 0 or a.mannual_hours is not null)
               and a.departmentcode = 'D05'
               and a.departmentcode = c.departmentcode
           )
     order by paycode, dateoffice;
    If this is not what you are looking for, then:
    1. Post a script that creates the tables used in your query.
    2. Post some sample data, including data for the master table that has missing dates (4-5 rows should suffice).
    3. Post the expected results for the sample data you posted.
    Please do not forget to mention your version:
    select * from v$version;
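    Another option, rather than padding the master table with default rows, is to generate the month's calendar on the fly and outer-join the recorded hours to it. A minimal sketch along those lines, reusing the poster's table and column names, assuming Oracle 10g or later (CONNECT BY LEVEL row generator and ANSI joins) and leaving out the department join for brevity:
    -- one row per day of December 2012, paired with every employee;
    -- days with no timesheet row come back with 0 hours, a null shift and an 'A' status
    select e.paycode,
           e.empname,
           to_char(cal.d,'DD') dateoffice,
           t.shiftattended,
           nvl(round(decode(t.hw,0,t.mannual_hours,t.hw),0),0) hw,
           nvl(t.status,'A') status
      from (select to_date('01/12/2012','dd/mm/yyyy') + level - 1 d
              from dual
            connect by level <= 31) cal
     cross join tblemployee e
      left join tbltimeregister1 t
             on t.paycode    = e.paycode
            and t.dateoffice = cal.d
     order by e.paycode, cal.d;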

  • Advise for a pseudo-data warehouse?

    Hello:
    We are looking into setting up a database that is a mirror of our production environment, for querying only. Running queries in our production environment has proven to be too much of a strain on resources. I have been researching setting up this mirrored machine as a managed standby database; it could then be queried when opened in read-only mode. It seems like it would offer a lot of advantages. We are already generating redo logs on the production machine, so I don't think there would be much overhead added there. One concern I have is the robustness of the Net8 transfer of the redo logs from the production machine to the standby machine: the redo logs would go over a WAN and thus might be susceptible to blips in the connection. Is the transfer of the redo logs over Net8 robust enough to recover from this, or would the transfer just fail and not restart?
    Also, an additional requirement is for data from another database (Microsoft SQL Server) to be available in this standby database. Is this possible? It seems that when a database is set up as a standby, it is basically a slave to the primary database and no operations can be performed on it. Is that the case?
    I know these are kind of vague questions. I can provide more info on our requirements if anybody is curious...
    Thanks!

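    This does not address the SQL Server part of the question, but on the worry that the redo transfer might fail over a WAN blip and never restart: the archive destination can be told to retry on its own. A minimal sketch on the primary, assuming 9i or later with an spfile and a hypothetical TNS alias standby_db for the standby:
    -- ship archived redo to the standby; if the destination fails (e.g. a network
    -- blip), retry it every 60 seconds instead of leaving it permanently failed
    alter system set log_archive_dest_2 = 'SERVICE=standby_db REOPEN=60' scope=both;
    alter system set log_archive_dest_state_2 = 'ENABLE' scope=both;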
