Tools for data modeling

Hi,
Can anyone tell me whether there are any specific tools used for logical data modeling in SAP BW. I have used Erwin as a data modeling tool for the implementation of a data warehouse, and I would like to know whether there are any specific tools for SAP BW implementations.
Response appreciated
tanu

Another related discussion: Looking for EDW modelling tool recommendations for BW

Similar Messages

  • Using CVS in SQL Developer for Data Modeler changes.

    Hi,
    I am fairly new to SQL Developer Data Modeler and associated version control mechanisms.
    I am prototyping the storage of database designs and version control for the same, using the Data Modeler within SQL Developer. I have SQL Developer version 3.1.07.42 and I have also installed the CVS extension.
    I can connect to our CVS server through the sspi protocol and an external CVS executable, and I am able to check out modules.
    Below is the scenario where I am facing an issue:
    I open the design from the checked-out module, make changes and save it. In the File navigator, I look for the files that have been modified or newly added.
    This behaves rather inconsistently, in the sense that even after clicking the refresh button it sometimes does not get refreshed. Next I look for the changes in the Pending Changes (CVS) window. According to other posts, I am supposed to look at the View - Data Modeler - Pending Changes window for Data Modeler changes, but that always shows up empty (I am not sure if it is only tied to Subversion). I do, however, see the modified files / files to be added to CVS under the Versioning - CVS - Pending Changes window. The issue is that when I click the refresh button in that window, all the files just vanish and all the counts show 0. Strangely, if I go to Tools - Preferences - Versioning - CVS and just click OK, the pending changes window gets populated again (the counts are inconsistent at times).
    I believe this issue was fixed and should work correctly in 3.1.07.42, but that does not seem to be the case.
    Also, I am not sure whether I can use this CVS functionality available in SQL Developer for the Data Modeler, or whether I should be using an external client such as WinCvs for check-in/check-out.
    Please help.
    Thanks

    Hi Joop,
    I think you will find that in Data Modeler's Physical Model tree the same icons are used for temporary Tables and Materialized Views as in SQL Developer.
    David

  • Data Warehouse and ETL tools for data verification ?

    How do we do data verification using an ETL tool, and how does this relate to the data warehouse?
    Thanks in Advance

    Hi Shyamal Kumar,
    1)  BW itself facilitates the ETL (Extraction, Transformation, Loading) steps:
         example:
                     Extraction - from SAP or other databases
                     Transformation - using transfer rules and update rules
                     Loading - loading into an ODS, a cube, or master data
    2) Typical ETL tools used in the industry are:
         a)   DataStage from Ascential (now owned by IBM)
         b)   Informatica
         c)   Mercator
    Regards, BB

  • Just wondering, what kinda tools do you prefer for data modeling

    I've been reading a lot about data modeling, from E-R diagrams to UML, and about database designs from the conceptual through the logical to the physical model. Which tool do you prefer the most, and why? What is a good tool for such things, and does Oracle supply such tools? Just curious as to what the professionals use.
    Pete

    ERwin is your tool for physical/logical database modeling.

  • Sharepoint 2013 Reporting Services & OLAP Cubes for Data Modeling.

    I've been using PowerPivot & PowerView in Excel 2013 Pro for some time now, so I am now eager to get set up with SharePoint 2013 Reporting Services.
    Before setting up Reporting Services I have just one question to resolve.
    What are the benefits/differences of using a normal flat-table set-up compared to an OLAP cube?
    Should I base my Data Model on an OLAP cube or just connect to tables in my SQL 2012 database?
    I realize that OLAP cubes aggregate data, making it faster to return results, but I am unclear whether this is needed with Data Modeling for SharePoint 2013.
    Many thanks,
    Mike

    So yes, PV is an in-memory cube. When data is loaded from the data source, it's cached in memory and stored (compressed) in the Excel file (also, same concept for SSAS Tabular mode... loads from source, cached in memory, but also stored (compressed) in data files, in the event that the server reboots or something similar).
    As far as performance, tabular uses memory but has a shorter load process (no ETL, no cube processing)... OLAP/MDX uses less memory, at the cost of requiring ETL and cube processing... technically tabular uses column compression, so the memory consumption will be based on the type of data (numeric data is GREAT, text not as much)... but the decision to use OLAP (MDX) / Tabular (DAX) is just dependent on the type of load and your needs... both platforms CAN do realtime queries (ROLAP in multidimensional, or DirectQuery for tabular), or can use their processed/in-memory cache (MOLAP in multidimensional, xVelocity for tabular) to process queries.
    If you have a cube, there's no need to reinvent the wheel (especially since there's no way to convert/import the BIDS/SSDT project from MDX to DAX). If you have SSAS 2012 SP1 CU4 or later, you can connect PV (from Excel OR from within SP) directly to the MDX cube.
    Generally, the benefit of PP is for the power users who can build models quickly and easily (without needing to talk to the BI dept)... SharePoint lets those people share the reports with a team... if it's worthy of including in an enterprise warehouse, it gets handed off to the BI folks who vet the process and calculations... but by that time, the business has received value from the self-service (Excel) and team (SharePoint) analytics... and the BI team has less effort since the PP model includes data sources and calculations - aside from verifying the sources and calculations, BI can just port the effort into the existing enterprise ETL / warehouse / cubes / reports... shorter dev cycle.
    I'll be speaking on this very topic (done so several times already) this weekend in Chicago at SharePoint Saturday!
    http://www.spschicagosuburbs.com/Pages/Sessions.aspx
    Scott Brickey
    MCTS, MCPD, MCITP
    www.sbrickey.com
    Strategic Data Systems - for all your SharePoint needs

  • Will there be generators for Data Modeler

    Are there plans to add any generators to SQL Data Modeler? This would be similar to what we had in Oracle Designer. One of the nicest things with Oracle Designer was that we could interview the End Users, build the ERDs and FHDs based on the interviews, push them down to tables, modules and menus, then generate those with Forms and Reports and show the End User the prototype. This provided feedback very early in the design process on whether or not we were on the right track. Since we had the data model transformers, we could modify the logical model and then re-push the changes.
    With DM 3.0 and all that it provides on the modeling side, will some form generators for a user interface be provided?
    Thanks in advance,
    Scott

    That last statement/question should read:
    With DM 3.0 and all that it provides on the modeling side, will some form of generators for a user interface be provided?
    Edited by: ScottK on Feb 23, 2011 10:23 AM

  • Best approach for Data Modelling.

    Hello Experts
    I am building a Customer Scorecard involving SD and Marketing in BI 7.0.
    There are a couple of existing DSOs, some pushing the data into InfoCubes and some not. All the reporting is happening from a MultiProvider sitting on top of these Data Targets.
    The team has a primitive design which says that additional DSOs be created to extract data from the above-mentioned couple of DSOs, based only on the objects that are needed for Customer Scorecard reporting.
    This means I am creating a couple of DSOs as per the current design which is in place.
    When I suggested only creating a Customer Scorecard MultiProvider on top of the already existing couple of Data Targets (avoiding recreating additional DSOs and the hassle of loading and activating them and then loading the data into InfoCubes) and then creating the BEx Queries on top of it, the Lead expressed his concerns about the impact it could have on the existing Data Model and on subsequent transports once the Model is complete.
    What is the best practice to handle a situation like this? I see there are 3 ways to go ahead with this:
    1. Do as the Lead said, which means creating additional DSOs (extracting data from a couple of the required existing DSOs), pushing this data into one InfoCube and then creating a MultiProvider on top of this (be aware that there is another similar data model that I need to create, which will also be embedded into this MultiProvider), and creating BEx Reports from there.
    2. Create only the InfoCubes, which will extract data from the already existing DSOs (avoiding the creation of additional DSOs), and then create a MultiProvider from which BEx Reports are created.
    3. Only create a MultiProvider on all the required and already existing DSOs and InfoCubes, checking whether reporting needs aggregated data or not, and then create BEx Reports from there (avoiding the creation of additional DSOs and InfoCubes).
    Note: We use Rev-Track to do the Transports.
    Which one do you think would be the best way to go and what could be the implications? Eventually, the reporting is done in WAD.
    Thanks for your time in advance.
    Cheers,
    Chandu

    Hi,
    Cases 1 and 2 have similarities, but it purely depends on the user's needs.
    I think you may already know the difference between a DSO and a cube:
    DSO - holds detailed-level data
    Cube - holds aggregated data
    As per your needs, use only one target; there is no need to use a DSO --> cube flow for the existing flows.
    You can decide which one you want to use: DSO or cube only.
    Case 3: if your requirement can be met with the existing DSOs and you can manage to get the required output at the reporting level, then you can go with it. But my guess is that with the existing targets your requirement may not be met.
    About transports:
    You can create one Rev-Track request and assign multiple transports to it.
    You can add and release the transports one by one rather than all at a time.
    If you release them all at a time, you may get some inconsistency issues and the TRs won't be released.
    Thanks

  • ISO:  person or forum for data modeling guidance

    hello -
    Is there anyone here who is willing/able to advise me on the data model that my husband and I have been working on? Alternatively, can anyone suggest an alternative forum, or where I need to go to have this done professionally?
    Our background is that we're both self-taught; he has a functioning database (in Approach) that we want to re-build and expand upon. I have a set of related tables in Filemaker that are semi-functional.
    We're not sure if his original model is the most stable way to have built it, nor are we sure that we're expanding in the right way to encompass my tables....
    I really do NOT want to pay someone to write this whole application for us - beyond the expense, we like/want to be in control.
    Thank you -
    Marion in Rochester

    Marion
    My e-mail address is in my profile; drop me a line and I'll be happy to have a look.
    A few things first, though. Download and install Oracle XE if you have not done so already. Let me know what SQL knowledge you have, specifically Oracle DDL (Data Definition Language, used for creating and altering database objects) and DML (Data Manipulation Language, used for inserting, updating and deleting rows in a table).
    Next, send me the data model: the tables (and their columns), primary keys, foreign keys, constraints, and indexes (if any yet) that you currently have. I will also need to know roughly how many rows you currently have in each table. Send this in any format you want (even just a text file), but not as a database.
    Cheers
    Ben

  • Alternative tools for Data Migration

    Hi,
    I am looking for the latest tools developed by SAP for data migration activities, other than LSMW.
    The requirements in my project include:
    Data Migration / Data Conversion covering :
                   Data cleansing / data quality check
                   Extraction, Transformation, Load
                   Reconciliation
    Information Lifecycle Management covering :
                   Data prevention
                   Data aggregation
                   Data deletion
                   Data archiving
    BAU data maintenance covering :
                   Master data maintenance
                   Test data maintenance
                   Data quality
                   Interface data journey
    Can someone let me know if there is any SAP product catering to the above?
    Thanks

    Did you find anything on this?

  • Searching for Data Model

    Hello,
    I need some further information about the data model of SBO. An ERD would be great, but as far as I know there is no such diagram available.
    Searching in this forum gave me the information that there are two useful files:
    - SBOObjectsTables 6.5.chm
    - DB_Help.chm
    I don't have access to the marketplace; that's why it would be great if someone could mail one of these files to me at [email protected]
    Thanks!

    Done

  • Are there any tools for data encryption and decryption ?

    Hi,
    I am using Oracle 9i R2 and I want to encrypt my data. Are there any tools available on the market?
    Please let me know the ways to do data encryption and decryption.
    Thanks in advance
    Prasuna.

    970489 wrote:
    Using DBMS_OBFUSCATION_TOOLKIT.Encrypt / DESEncrypt we can't secure our password... so I am looking for another alternative.
    As Blue Shadow said, what are you really trying to achieve?
    Encrypting a password is itself not secure. Anything that can be encrypted can be decrypted. That is why Oracle itself DOES NOT encrypt passwords.
    Surprised??
    Here's what Oracle does with passwords, and what others should be doing if they have to store them.
    When the password is created, the presented password - clear text - is concatenated with the username. The resulting character string is then passed through a one-way hashing function. It is that hashed value that is stored. Then when a user presents his credentials to log on to the system, the presented credentials are combined and hashed in the same manner as when the password was created, and the resulting hash value compared to the stored value.
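    A rough sketch of that store-and-compare pattern (DBMS_CRYPTO is only available from 10g onward, so on 9i R2 you would have to fall back on DBMS_OBFUSCATION_TOOLKIT's MD5 routines; the table, column and function names below are made up for illustration):

        -- Store only a one-way hash of username || password, never the password.
        CREATE TABLE app_users (
          username VARCHAR2(30) PRIMARY KEY,
          pwd_hash RAW(20) NOT NULL          -- SHA-1 produces 20 bytes
        );

        CREATE OR REPLACE FUNCTION hash_credentials (
          p_username IN VARCHAR2,
          p_password IN VARCHAR2
        ) RETURN RAW IS
        BEGIN
          RETURN DBMS_CRYPTO.HASH(
                   UTL_RAW.CAST_TO_RAW(UPPER(p_username) || p_password),
                   DBMS_CRYPTO.HASH_SH1);
        END;
        /

        -- At logon, hash the presented credentials the same way and compare.
        SELECT CASE
                 WHEN pwd_hash = hash_credentials('SCOTT', 'tiger') THEN 'ok'
                 ELSE 'rejected'
               END AS logon_check
          FROM app_users
         WHERE username = 'SCOTT';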

  • Tools for planning / modeling / designing a RIA

    I'm about to embark on a large (to me) web project. I have little formal training but have been able to do well enough with my projects so far.
    I know that planning for a large project is very important. I have been reading stuff all over the net about what needs to be in a plan and the things that need to be considered.
    What I am not finding is a tool or set of tools to use during planning. Sure, I could write everything in a word processor, but that doesn't seem very productive. I'm looking for something that I can use to define all the elements of my application (objects / classes / data / relationships / views / logic). I would like it to be intelligent enough to know that I am designing a web application. So, for example, if I create a data table object it knows it will have columns with data types, or if I am creating a class it may have properties and methods and events etc.
    Do such tools exist?
    What tools do you use when planning/documenting an application?
    I came here to the Flex forum because I am thinking I may be using Flex and PHP and Flash.
    Thanks

    I'm looking at Enterprise Architect by Sparx Systems now. Seems like a good product. Anyone have any comments?

  • Tools for Dimension Modelling and building Aggregates

    Hello everybody,
    I want to analyze and improve the dimensions of a cube and create aggregates. Please let me know if there are any standard tools/programs (in BW 3.5, but also in 7.0 if there are any improvements) to analyze the relations between characteristics in a cube. For example, I have, among others, these 3 characteristics: Document Number, Customer and Date. Let's assume there is almost only one document (number) per day. This could lead to the decision of having these 2 characteristics in one dimension, which you wouldn't have expected at the beginning.
    Besides that, an aggregate where I throw out Document Number would not make any sense, because Date has the same granularity. Normally I would work that out by doing some manual analysis and trial and error, but my current data set is too large for that.
    I know RSRV and the aggregation proposals, but are there any tools that can display the dependencies between the characteristics in a cube?
    Thanks for your help,
    best regards,
    Michael

    Hi Michael,
    But that's it for the design of the cube/dimensions. Additionally, you can use RSRT to run a query in debug mode to get the required information about aggregates.
    regards
    Siggi

  • SDO_PC, multiple SRIDs - best practise for data model?

    Hi,
    im using UTM and I am getting data covering two zones.
    all my existing data is from zone A.
    tables:
    pointcloud
    pointcloud_blk
    now im getting data with very few points from zone A and most points from zone B. It was agreed that the data delivery will be in SRID for zone B.
    so I tested whether this would work. I had two pointclouds. One with SRID A, another with SRID B. As soon as I put SRID B pointcloud inside, I could NO LONGER QUERY pointcloud with SRID A.
    So it seems to be necessary to use at least another pointcloud_blk, f.e. pointcloud_blk_[srid].
    Question: does another pointcloud_blk for each SRID suffice or do i also need a pointcloud table per SRID. the pointcloud table seems only interesting due to its EXTENT column. But on the other hand this could be queried by "function", since there are only 10 or so records (pointclouds) inside.
    PLZ share your best practises. What does work, what not.

    It is necessary to have one pointcloud_blk table for each SRID, since there is a spatial index on that table.
    As for the pointcloud table itself, it is up to you. You can have point clouds with different SRIDs in that table.
    But if you want to create a spatial index on it, you have to use some function-based index so that the index sees one SRID for the table.
    Since this table usually does not have many rows, this should work fine with one table for different SRIDs.
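    Roughly, the function-based index idea looks like this, assuming the pointcloud table has an SDO_PC column called PC (the schema name, target SRID and dimension bounds are just examples; check the Oracle Spatial documentation for the exact metadata requirements on your release):

        -- Deterministic wrapper that presents every extent in one common SRID.
        CREATE OR REPLACE FUNCTION pc_extent_common_srid (p_pc SDO_PC)
          RETURN SDO_GEOMETRY DETERMINISTIC IS
        BEGIN
          RETURN SDO_CS.TRANSFORM(p_pc.pc_extent, 25832);  -- example target SRID
        END;
        /

        -- Register spatial metadata against the function expression.
        INSERT INTO user_sdo_geom_metadata (table_name, column_name, diminfo, srid)
        VALUES ('POINTCLOUD', 'SCOTT.PC_EXTENT_COMMON_SRID(PC)',
                SDO_DIM_ARRAY(SDO_DIM_ELEMENT('X',  200000,  800000, 0.005),
                              SDO_DIM_ELEMENT('Y', 5000000, 6000000, 0.005)),
                25832);

        -- Function-based spatial index: the index only ever sees one SRID.
        CREATE INDEX pointcloud_ext_sidx
          ON pointcloud (pc_extent_common_srid(pc))
          INDEXTYPE IS MDSYS.SPATIAL_INDEX;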
    siva

  • Data Modeling Tool

    Does Oracle 9i provide a database modeling tool? If not, which tool on the market would you recommend? Thanks in advance!

    Oracle provides the Designer tool for data modelling.
    However, ERwin and ER/Studio can also be used, as both of them are very good modelling tools.
    Regards
