SQL Override in Lookup

Hi Gurus,
Why would you give a SQL query in the Lookup SQL Override option of a Lookup transformation? What is the use of that option?

Hi
If you need to filter the data beyond the join conditions, you can do so with a SQL override. That way the lookup cache holds only the filtered data rather than the entire table.
Hope this helps
Regards
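A minimal sketch of such an override (table and column names here are made up for illustration):

```sql
-- Default lookup query generated by Informatica would read the whole table:
--   SELECT CUST_ID, CUST_NAME, STATUS FROM CUSTOMERS ORDER BY CUST_ID
-- Overridden so the lookup cache holds only active customers:
SELECT CUST_ID, CUST_NAME, STATUS
FROM CUSTOMERS
WHERE STATUS = 'ACTIVE'
ORDER BY CUST_ID --
```

The trailing `--` is the usual way to suppress the ORDER BY clause that Informatica appends to the override; the selected columns should match the lookup ports in order and datatype.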

Similar Messages

  • SQL Override. How to use custom SQL in ODI?

    I am new to ODI and have an Informatica background. I am looking for a SQL override capability in ODI. I have an existing SQL query with complex joins, in-line queries, filters, unions and many other complexities. The query reads many existing tables and outputs 5 fields, which I need to populate in the target table. In Informatica, I would simply create a target table of these 5 fields, import the same table as both source and target, and then override the source qualifier with the existing SQL. Does ODI have a similar feature?
    I know the same can be achieved using core ODI features, but I am trying to get it done quickly without reinventing the wheel.
    Thanks,
    Dinesh.

    Dinesh
    You cannot use the SQL override feature that Informatica has. I am an Informatica developer myself, so I can feel your pain; you are going to miss a lot of the features Informatica has. What you are looking for cannot be done with ODI Interfaces, but it can be done with ODI Procedures. When you create a step in a Procedure, you specify the source and target schema information (context/logical schema), which should do the work for you. If your source and target are on different servers, create a DB link between the two. Then, when you put your SELECT query into the step, wrap it in an INSERT statement.
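    As a sketch, the procedure step would wrap the existing SELECT inside an INSERT; all names below are placeholders for the real query:

    ```sql
    INSERT INTO tgt_schema.target_table (f1, f2, f3, f4, f5)
    SELECT a.f1, a.f2, b.f3, b.f4, b.f5
    FROM src_schema.table_a a
    JOIN src_schema.table_b b ON b.id = a.id  -- the existing complex joins/filters/unions go here
    WHERE a.active_flag = 'Y'
    ```

    With source and target on different servers, the SELECT part would read over the DB link instead (e.g. `FROM table_a@src_link a`).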

  • SQL Table Lookup during WhilePrintingRecords phase

    Post Author: BrHunt
    CA Forum: Data Connectivity and SQL
    Is it possible in Crystal XI to perform a "real-time" SQL table lookup while processing records for printing? Possibly in a user-defined function?
    Thanks.
    Bryan Hunt

    Hi Jonathan,
    Below is an extract from SAP Note 490095 - DB6: Additional information on upgrade
    XIV/ Converting the table DBTABLOG
    The table 'DBTABLOG' (among others) is converted for all source releases. Since this table may contain a lot of entries, this conversion may take a very long time.
    When you use the upgrade strategy 'resource-minimized', the complete conversion takes place in downtime.
    When you use the upgrade strategy 'downtime-minimized', the system proposes the table DBTABLOG for the incremental conversion with ICNV when it reaches a certain size.
    The part of the data that is not yet converted (by ICNV) at the beginning of downtime is converted in the PARCONV_UPG phase in downtime.
    Note that the maximum size of the log files on the database must at least be as large as the portion of data for the table 'DBTABLOG', which is converted in the PARCONV_UPG phase.
    Before the upgrade, find out how large the table 'DBTABLOG' in your system is and check whether the existing entries are also still required after the upgrade. You can delete the entries no longer required in accordance with the instructions contained in Note 41300.
    There is also some useful information in SAP Note 41300.
    Regards,
    Deepak Kori

  • SQL Override In OWB

    Hello
    I am new to OWB and was wondering how I can view and edit operators using SQL. For example, I would like to write a source query that joins multiple tables in my source system and then inserts into my target object/operator.
    Thanks
    Michelle

    Hi Michelle
    There is a way to inline the SQL using a view in OWB:
    https://blogs.oracle.com/warehousebuilder/entry/owb_11gr2_mappings_and_inline_sql
    You can also cover a huge amount of the SQL dialect with operators. The blog linked below illustrates an expert I created to generate mappings from SQL, which is useful for getting the idea behind some of the operators:
    https://blogs.oracle.com/warehousebuilder/entry/sql_and_owb_accelerated_map_co
    Cheers
    David

  • How to implement SQL Override on Source Tables in OWB?

    How can we achieve a SQL override feature in OWB similar to the one available in Informatica?
    We are thinking of using a view to filter the required data, and defining the view as the source in the mapping.
    Is this the best approach, or are there other thoughts/suggestions?

    Informatica is a pipelined client/server ETL tool, which means that if you read from a table and then apply a filter, it will read all of the data onto the client (i.e. the application server, not a user machine) and then filter it. So it is not unusual for developers to want to push a lot of the logic into the SQL to reduce the number of records (which somewhat defeats the purpose of using the tool in the first place).
    OWB is more of an ELT/code generator, so if you create a similar mapping with a table and a filter, it will generate the SQL with the filter included. To be honest, I'm not 100% sure about this part, but I believe that if you have a source via DB link (or even ODBC), the full generated SQL should be sent to the source. In other words, although some people might be more comfortable with custom SQL, I don't think it is actually necessary in most if not all cases.
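    A sketch of the view approach the original poster mentioned (all names are hypothetical):

    ```sql
    CREATE OR REPLACE VIEW v_src_orders_filtered AS
    SELECT o.order_id,
           o.order_date,
           c.customer_name
    FROM   orders o
    JOIN   customers c ON c.customer_id = o.customer_id
    WHERE  o.order_date >= DATE '2020-01-01';
    ```

    The view is then imported into OWB and used as the source operator in the mapping, so the custom SQL lives in the database rather than in the mapping.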

  • Essbase Studio SQL Query Doubt

    Hi All,
    Can someone explain what the statements below mean, related to Data Load SQL Override editing?
    "If a member is prefixed with previous members of its dimension (for example, parent or all ancestors), more columns are returned.
    ● If some columns in the data load SQL statements are NULL, you can add SQL statements to load the data at the next level in the outline. This is known as NULL Promotion."
    Thanks,
    SatyaB

    Can you explain what this "where name < 'S'" means?
    Thanks in advance,
    Sajith

    It will display the names in alphabetical order from A to R; it won't display names starting with S, T, U, V, W, X, Y or Z.
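    A quick illustration of that comparison (hypothetical table and data):

    ```sql
    -- Suppose EMPLOYEES contains: Adams, Baker, Ramirez, Smith, Taylor
    SELECT name
    FROM   employees
    WHERE  name < 'S';
    -- String comparison keeps Adams, Baker and Ramirez;
    -- Smith and Taylor sort at or after 'S' and are excluded.
    ```

    Note that the filter itself does not sort anything; add an ORDER BY if the output must actually be alphabetical.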

  • Insert SQL server: Conversion failed. On A environment, but not on Test.

    We probably found the cause of this problem. Not that long ago we migrated the databases from SQL Server 2005 to SQL Server 2012. All databases are migrated now, but only the acceptance one is giving us trouble. After an extensive search we found that the connection to the acceptance databases uses another (older?) version of the ODBC driver. All other environments had the same, newer connection driver, and they work fine. It seems that the combination of SQL Server 2012 and the old ODBC driver is causing our problem. (We saw something strange which seems to be related during another run: a SQL override in a session was giving us a 'database fetch optimize' error... but not on our acceptance environment :-) )

    Hi, I have a mapping which inserts (meta)data into a SQL Server (2012) database. One field is 'proces_start_date', of the date/time datatype in PowerCenter; in the SQL Server database this field is of the datetime datatype. This mapping runs at the start of the workflow to insert an initial record into a metadata table for each session in the workflow. Therefore 'proces_start_date' is initially set to 01-01-1900 with the following expression: TO_DATE('01011900','DDMMYYYY').
    The weird thing is that when I run this mapping on our development or test environment, the insert is performed without a problem. But when I run the same mapping on our acceptance environment, the session fails with this error: [Informatica][ODBC SQL Server Legacy Driver][SQL Server]Conversion failed when converting date and/or time from character string. The session log says it fails on this value of 'proces_start_date': 01/01/1900 00:00:00.000000000. Because the mapping inserts initial values, the field 'proces_start_date' will always have this value, but it still doesn't work on our A environment.
    I've checked the workflows/sessions/mappings/mapplets/targets etc. on T and A and there are no differences. Our DBA can't find anything different between the databases on T and A either. Also, when I try to manually convert/insert this value of 'proces_start_date' in the database using Management Studio, I get the same conversion error on BOTH databases! We are a bit lost here and have no idea whether this is a PowerCenter issue or a database-related issue. Hopefully someone can help us find a solution.

  • Delta load from netezza to sql server

    I have 20 million rows in one table and 100 million rows in another table in Netezza.
    The very first time, I loaded all the rows from Netezza to SQL Server.
    Say in a week we have new rows and updated rows in those tables in Netezza;
    how would I go about doing the delta load to SQL Server?
    Should I use
    1) SCD
    2) a MERGE statement in T-SQL
    3) Lookup
    I am not able to decide, and we also need to think about performance.
    Can somebody point me to the right resource?
    Thanks

    To make the incremental logic work on your side, you need two tables:
    a) Load_Log: here you capture the load start time, end time, and max date (modified date) from the Netezza tables.
    b) As ione suggested, load the data into a staging table (this table is used for comparison against the main table). This step is required because an SSIS Merge wouldn't be suited for millions of records, so the comparison is best done in a stored procedure via a MERGE or a classic UPSERT.
    The next time the load runs, fetch the max date from the Load_Log table and pass it as a parameter to the source query, something like: WHERE modified_date > param.
    If it's a weekly load, you might want to run it daily on the source, or modify the source query to use "modified_date BETWEEN max date AND max date + 1" over several iterations; the best option would be to run it on a daily basis, since a million records per run is fine to process.
    Abhinav http://bishtabhinav.wordpress.com/
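    The steps above can be sketched in T-SQL roughly as follows (all table and column names are assumptions):

    ```sql
    -- 1) Fetch the watermark recorded by the previous run
    DECLARE @last_load DATETIME;
    SELECT @last_load = MAX(max_modified_date) FROM dbo.Load_Log;

    -- 2) The source query sent to Netezza would then be parameterized as:
    --      SELECT * FROM src_table WHERE modified_date > <@last_load>

    -- 3) Compare the staging table against the main table in a stored procedure
    MERGE dbo.main_table AS tgt
    USING dbo.staging_table AS src
          ON tgt.business_key = src.business_key
    WHEN MATCHED THEN
         UPDATE SET tgt.col1 = src.col1,
                    tgt.modified_date = src.modified_date
    WHEN NOT MATCHED THEN
         INSERT (business_key, col1, modified_date)
         VALUES (src.business_key, src.col1, src.modified_date);

    -- 4) Record the new watermark for the next run
    INSERT INTO dbo.Load_Log (load_start_time, load_end_time, max_modified_date)
    SELECT GETDATE(), GETDATE(), MAX(modified_date) FROM dbo.staging_table;
    ```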

  • Standard Data Load SQL is blank

    Good day guys,
    I'm trying to perform a data load in IES. I noticed that the "Standard Data Load SQL" field in the SQL Override window is blank. May I know what causes this and how I can fix it?
    Thanks in advance.
    Edited by: 26FEB1986 on Dec 14, 2010 9:15 PM


  • Update override in Target Transformation

    This is a long-known issue; to my knowledge, multiple change requests have been filed for it. I doubt that this (small but nasty) nuisance will ever be remedied, but that's only my personal suspicion. In general, when you log on to the My Support portal, you have two buttons, Enter Online Support and Track Change Requests, near the upper right corner of the home page. One of these two buttons will allow you to file change requests and look at their current status, if I recall correctly. Regards, Nico

    Hi,
    I have a question about the target update override: if I give only one condition in the WHERE clause, my mapping validates correctly, as in the query below
    A)UPDATE D_THIRD_PARTY SET THIRD_PARTY_ID = :TU.THIRD_PARTY_ID, THIRD_PARTY_NAME = :TU.THIRD_PARTY_NAME, THIRD_PARTY_TYPE = :TU.THIRD_PARTY_TYPE, INDUSTRY_CODE_STANDARD = :TU.INDUSTRY_CODE_STANDARD, ADDRESS_LINE_1 = :TU.ADDRESS_LINE_1, ADDRESS_LINE_2 = :TU.ADDRESS_LINE_2, ADDRESS_LINE_3 = :TU.ADDRESS_LINE_3, ADDRESS_LINE_4 = :TU.ADDRESS_LINE_4, ADDRESS_LINE_5 = :TU.ADDRESS_LINE_5, COUNTY = :TU.COUNTY, REGION = :TU.REGION, COUNTRY_CODE = :TU.COUNTRY_CODE, POST_CODE = :TU.POST_CODE, CUSTOMER_RATING_GE = :TU.CUSTOMER_RATING_GE, CURRENCY_CODE = :TU.CURRENCY_CODE, SOURCE_PLATFORM_ID = :TU.SOURCE_PLATFORM_ID, SOURCE_SYSTEM_ID = :TU.SOURCE_SYSTEM_ID, H_RECORD_CREATE_DATE = :TU.H_RECORD_CREATE_DATE, H_RECORD_VERSION_NO = :TU.H_RECORD_VERSION_NO, H_CURRENT_RECORD_FLAG = :TU.H_CURRENT_RECORD_FLAG
    WHERE
    THIRD_PARTY_TYPE = :TU.THIRD_PARTY_TYPE
    but the moment I change the WHERE clause to
    WHERE
    THIRD_PARTY_TYPE = :TU.THIRD_PARTY_TYPE
    and
    SOURCE_SYSTEM_ID = :TU.SOURCE_SYSTEM_ID
    and
    SOURCE_PLATFORM_ID = :TU.SOURCE_PLATFORM_ID
    and
    THIRD_PARTY_ID = :TU.THIRD_PARTY_ID
    in the above query 'A',
    it gives an error, 'unknown field <field name> in sql override', and asks whether I want to continue (Y/N); if I say Yes, then validating the mapping gives an error.
    All the conditions in the WHERE clause are on a unique constraint in the table, and the primary key is something else, generated by a Sequence Generator.
    So, to overcome the problem of the dynamic cache, I am overriding the update SQL in the target.
    My question is: is there any specific rule for the update override? If there is, please let me know.
    Thanks in advance
    Regard
    Vaibhav
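    For reference, the usual shape of a target update override with several conditions is a single UPDATE in which every `:TU.<port>` refers to a port that exists on the target definition and is connected in the mapping (columns abbreviated here from the question):

    ```sql
    UPDATE D_THIRD_PARTY
    SET    THIRD_PARTY_NAME    = :TU.THIRD_PARTY_NAME,
           H_RECORD_VERSION_NO = :TU.H_RECORD_VERSION_NO
    WHERE  THIRD_PARTY_TYPE    = :TU.THIRD_PARTY_TYPE
      AND  SOURCE_SYSTEM_ID    = :TU.SOURCE_SYSTEM_ID
      AND  SOURCE_PLATFORM_ID  = :TU.SOURCE_PLATFORM_ID
      AND  THIRD_PARTY_ID      = :TU.THIRD_PARTY_ID
    ```

    The 'unknown field' warning typically points at a `:TU` reference the Designer cannot match to a target port, so checking that each WHERE column is spelled exactly as the target port is usually the first thing to try.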
      

  • Bulk Override to enable "Generate Alert" parameter for monitors/rules

    Hi Experts,
    I am working on fine-tuning the Exchange 2010 management pack per the team's monitoring requirements. Most of the monitors have their workflow enabled but have the Generate Alert parameter disabled.
    I was using Override Creator by Borris for enabling and disabling the workflows, but it doesn't help in creating overrides for the Generate Alert parameter.
    I have almost 700+ monitors or rules to be enabled for generating alerts. Is there any way I can do this using a script or any other tool?
    Regards,
    Prajul Nambiar

    Hi,
    GenerateAlert is an override property, the same as Enable. You can apply the override using the script below and enable all monitors to generate alerts.
    Import-Module OperationsManager
    New-SCOMManagementGroupConnection
    $mps = Get-SCOMManagementPack | ?{ $_.Name -match "sql" }
    $overrideMp = Get-SCOMManagementPack -DisplayName "sql.Override"
    $Monitors = $mps | Get-SCOMMonitor | ?{ $_.XmlTag -eq "UnitMonitor" }
    foreach ($Monitor in $Monitors) {
        # Only touch monitors that do not already generate alerts
        if ($Monitor.AlertSettings.AlertOnState -eq $null) {
            $Target = Get-SCOMClass -Id $Monitor.Target.Id
            $overrideName = $Monitor.Name + ".Override"
            $override = New-Object Microsoft.EnterpriseManagement.Configuration.ManagementPackMonitorPropertyOverride($overrideMp, $overrideName)
            $override.Monitor = $Monitor
            $override.Property = 'GenerateAlert'
            $override.Value = 'true'
            $override.Context = $Target
            $override.DisplayName = $overrideName
        }
    }
    $overrideMp.Verify()
    $overrideMp.AcceptChanges()
    Note: you need to have an override MP with the name sql.Override
    Regards
    sridhar v

  • Where have the SQL and PL/SQL reference manuals gone?

    Hello,
    I cannot find the SQL and PL/SQL reference manuals on OTN. In previous versions of this web site there were direct links available, but they seem to have disappeared. As a SQL developer, I frequently have to search in these manuals:
    - Oracle8i SQL Reference Release 8.1.5 A67779-01
    - Oracle8i SQL Reference Release 2 (8.1.6) A76989-01
    - Oracle8i SQL Reference Release 3 (8.1.7) A85397-01
    - Oracle9i SQL Reference Release 1 (9.0.1) A90125-01
    - Oracle9i SQL Reference Release 2 (9.2) A96540-01
    - Oracle8i PL/SQL User's Guide and Reference Release 8.1.5
    - Oracle8i PL/SQL User's Guide and Reference Release 8.1.6
    - Oracle8i PL/SQL User's Guide and Reference Release 8.1.7
    - Oracle9i PL/SQL User's Guide and Reference Release 9.0.1
    - Oracle9i PL/SQL User's Guide and Reference Release 9.2
    I found links in this forum to the Oracle9i SQL Reference
    Release 2 (9.2) and the PL/SQL User's Guide and Reference
    Release 2 (9.2), but I would appreciate having links on the OTN page. Why did you change this? I must now use Google to search for the information that I need.
    Regards

    FYI, all the Oracle manuals are online at http://tahiti.oracle.com. There is also a SQL & PL/SQL statement lookup feature that strikes me as a bit easier than using the manual.
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Problem with BO Data Services Function lookup_ext

    HI All,
    I have to be able to look up values in a code table based on three input parameters, returning two description columns. Using the wizard and then adding comments, I get the following:
    lookup_ext(
    Lookup Table Specifications
    [DS_DB2_RISM.RIMS.RIT_CODE_TABLE,'NO_CACHE','MAX'],
    Return Column List Specification
    [T_DESCRIPTION,T_LONG_DESC],
    Default Value List
    [NULL,NULL],
    Condition List
    [C_STATUS,'=',$IV_STATUS,C_TYPE,'=',$IV_TYPE,C_VALUE,'=',$IV_VALUE],
    Order By List
    Output Variable List
    [$OV_DESCRITION,$OV_LONG_DESC],
    SQL Override
    SET ("run_as_separate_process"='no',
          "output_cols_info"='<?xml version="1.0" encoding="UTF-8"?>
          <output_cols_info>,
          <col index="1" expression="no"/>
          <col index="2" expression="no"/>
          </output_cols_info>)
    This produces the following errors
    [Function:CF_DB2_RIMS_CODE] (Ln21): Syntax error : found <[end of text]> expecting <IF, a decimal>
    [Function:CF_DB2_RIMS_CODE] The function <CF_DB2_RIMS_CODE> contains an invalid expression. additional information: syntax error>. (BODI-1111182)
    I have tried
    1) including the single quote after > on the line with the xml version
    2) including the single quote before ) on the last line
    3) getting rid of the SET statement altogether
    Any suggestions are welcome.

    David,
    What you described should work. Personally, I'd write the output of the first dataflow to a table, just to ensure your output is what you think it is. Then test the second against the data in the table you created in the first workflow. Continue the process until each independent job works as expected. Then combine them into the single workflow.

  • OWB bugs, missing functionality and the future of OWB

    I've been working with OWB for some time now and there are a lot of rough edges to discover. Functionality and stability leave a lot to be desired. Here's a small and incomplete list of things that annoy me:
    Some annoying OWB bugs (OWB 10g 10.1.0.2.0):
    - The debugger doesn't display the output parameters of procedures called in pre-mapping processes (displays nothing, treats values as NULL). The mapping itself works fine though.
    - When calling self-made functions within an expression, OWB precedes the function call with a constant "Functions.", which prevents the function from being executed and results in an error message
    - Occasionally OWB cannot open mappings and displays an error message (null pointer exception). In this case the mapping cannot be opened anymore.
    - Occasionally when executing mappings OWB doesn't remember changes in mappings even when the changes were committed and deployed
    - When using aggregators in mappings OWB scrambles the order of the output attributes
    - The deployment of mappings sometimes doesn't work. After n retries it works without having changed anything in the mapping
    - When recreating an external table directly after dropping the table OWB recreates the external table but always displays both an error message and a success message.
    - In Key Lookups the screen always gets garbled when selecting an attribute as a join condition
    - Usage of constants results in aborts in the debugger
    - When you reconcile a table used in a key lookup the lookup condition sometimes changes. OWB seems to remember only the position of the lookup condition attribute but not the name.
    - In the process of validating a mapping often changes in the mapping get lost and errors occur like 'Internal Errors' or 'Null Pointer Exceptions'.
    - When you save the definition of external tables OWB always adds 2 whitespace columns to the beginning of all the lines following 'ORGANISATION EXTERNAL'. If you save a lot of external table definitions you get files with hundreds of leading whitespaces.
    Poor or missing functionality:
    - No logging on the level of single records possible. I'd like the possibility to see the status of each single record in each operator like using 'verbose data' in PowerCenter
    - The order of the attributes cannot be changed. This really pisses me off, especially if operators like the aggregator scramble the order of attributes.
    - No variables in expressions possible
    - Almost unusable lookup functionality (no cascading lookups, no lookup overrides, no unconnected lookups, only equal condition in key lookups)
    - No SQL overrides on sources possible
    - No mapplets, shared containers or any kind of reusable transformations
    - No overview functionality for mappings. Often it's very hard to find a leftover operator in a big mapping.
    - No copy function for attributes
    - Printing functionality is completely useless
    - No documentation functionality for mappings (reports)
    - Debugger itself needs debugging
    - It's very difficult to mark connections between attributes of different operations. It's almost impossible to mark a group of connections without marking connections you don't want to mark.
    I really wonder which of the above bugs and missing functionality 'Paris' will address. From what I read about 'Paris', not many, if any at all. If Oracle really wants to be a competitor (with regard to functionality) to Informatica, IBM/Ascential etc., they have a whole lot of work to do, or they should purchase Informatica or another of the leading ETL tool vendors.
    What do you think about OWB? Will it be a competitor for the leading ETL tools, or just a cheap database add-on that becomes widely used, like SAP BW, not for reasons of technology or functionality but because it's cheap?
    Looking forward to your opinions.
    Jörg Menker

    Thanks to you two for entertaining my thoughts so far. Let me respond to you latest comments.
    "Okay, let's not argue which one is better... when a tool is there, then there are some reasons for it to be there... But the points raised by Jorg and me are really very annoying."
    Overall I agree with both your and Jorg's points (and I did not think it was an argument; merely sharing our observations with each other (;^)
    The OWB tool is not as mature as Informatica. However, Informatica has no foothold in the database engine itself and as I mentioned earlier, is still "on the outside looking in..." The efficiency and power of set-based activity versus row-based activity is substantial.
    Looking at it from another way lets take a look at Microstrategy as a way of observing a technical strategy for product development. Microstrategy focused on the internals (the engine) and developed it into the "heavy-lifting" tool in the industry. It did this primarily by leveraging the power of the backend...the database and the hosting server. For sheer brute force, it was champion of the day. It was less concerned with the pretty presentation and more concerned with getting the data out of the back-end so the user didn't have to sit there for a day and wait. Now they have begun to focus on the presentation part.
    Likewise this seems to be the strategy that Oracle has used for OWB. It is designed around the database engine and leverages the power of the database to do its work. Informatica (probably because it needs to be all things to all people) has tended to view the technical offerings of the database engine as a secondary consideration in its architectural approach and has probably been forced to do so more now that Oracle has put themselves in direct competition with Informatica. To do otherwise would make their product too complex to maintain and more vendor-specific.
    "I am into my third data warehousing/data migration project, and my previous two have been on Informatica (3 years on it)."
    I respect your experience and your opinions; you are not a first-timer. The tasks we have both had to solve, and how we solved them with these tools, are not necessarily the same. They could be similar in instances, or quite different.
    "So the general tendency is to evaluate the tool and try to see how things that needed to be done in my previous projects can be done with this tool. I am afraid to say I am still not sure how these can be implemented in OWB. The points raised by us are probably the fallout of this deficiency."
    One observation I would make is that in my experience, calls to the procedural language in the database engine have tended to perform very poorly with Informatica, and Informatica's scripting language is weak. Therefore, if you do not have direct usability of a good, strong procedural language to tackle complicated tasks, you will be in a pickle when the solution is not well suited to a relational-based approach. Informatica wants you to do most things outside of the database (in the map, primarily); that is how you implement the transformation logic. OWB is built entirely around the relational, procedural, and ETL components in the Oracle database engine. That is what the tool is all about.
    "If cost is the major factor for deciding on a tool, then OWB stands far ahead..."
    That depends entirely on the client and the situation. I have implemented solutions for large companies and small companies. I don't use a table saw to cut cake, and I don't use a pen knife to fell trees. Right tool for the right job.
    "...that's what most managers do, without even looking at how, by selecting such a tool, they make life tough for the developers."
    Been there many times. Few non-technical managers understand the process of tool evaluation and selection and the value a good process adds to the project. Nor do they understand the implications of making a bad choice (cost, productivity, maintainability).
    "The functionality of OWB stands way below Informatica."
    If you are primarily a GUI-based implementer, that is true. However, I have often found, when brought in to fix performance problems with Informatica implementations, that the primary problem is usually the way the developer implemented it. Too often the developer understands how to implement logic in the GUI component (the Designer/Maps and Sessions) with a complete lack of understanding of how all this activity will impact load performance (they don't understand how the database engine works). For example, a strong feature in Informatica is the ability to override the default SQL statement generated by Informatica; this was a smart design decision on Informatica's part. I have frequently had to go into the "code" and fix bad joins, split up complex operations, and rip out convoluted logic to get the maps to perform within a reasonable load window. Too often these developers are only viewing the problem through the "window" of the tool; they are not stepping back and looking at the problem in the context of the overall architecture. In part, Informatica forces them to do this. Another possible factor is that they probably don't know better.
    "One tool...one solution"
    Microstrategy until recently had been suffering from that same condition (not allowing the developer to create the actual query). OWB engineers need to rethink their strategy on overriding the SQL.
    "The functionality of OWB stands way below Informatica."
    In some ways, yes. If you do a head-to-head comparison of the GUI, then yes. In other ways OWB is better (Informatica does not measure up when you compare it with all of the architectural features that the Oracle database engine offers). They need to fix the bugs and annoyances, though.
    "...but even the GUI of Informatica is better than OWB and gives the developer some satisfaction in working with it."
    Believe me, I feel your pain. On the other hand, I have suffered from Informatica bugs. Ever do a port from one database engine to another, just to have it convert everything into multi-byte? Ever have it re-define your maps to parallel processing threads when you didn't ask it to?
    "Looking at the technical side of things, I can give you one fine example: there is no function in Oracle doing to_integer (to_number is there), but Informatica has that..."
    Hmm... sorry, I don't get the point.
    "The style of the ETL approach of Informatica is far more appealing."
    I find it unnecessarily over-engineered.
    "OWB has two advantages: it is basically free of cost, and it has a big brother in Oracle."
    When you are another "Microsoft", you can throw your weight around. The message for Informatica is "don't bite the hand that feeds you." Bad decisions at the top.
    Regards,
    Dan Phillips

  • Business Logic in ETL process of Oracle_BI_DW_Base.rep

    Hi,
    Are there any docs or web pages that convey the business logic used in the Informatica mappings for Order Management?
    Ex: SIL_SalesOrderLinesFact
    Do we need to open every transformation in them and look at the queries or SQL overrides to work out the business logic, or does Oracle provide any standard docs on this?
    Thanks,

    Short answer: no. There is no Oracle document that gives the "business logic" for all the ETL mappings; it would be wonderful if that existed. The best approach is to dig into the mappings and try to understand the flow. Using the DAC and Informatica, along with the DMR (data model reference), you can work out the general strategy that Oracle uses. Some of the mappings/mapplets/transformations also have comments. What you will see is that Oracle uses similar strategies across the stack: for instance, the method to do SCD Type 2 changes, to do aggregation, or to look up the proper WIDs is similar across all the OBIA apps.
