Best Course for Data Warehousing

Hi,
I am planning to join a data warehousing course. I heard there are a lot of courses in data warehousing:
Data warehousing with ETL tools, or
Data warehousing with Crystal Reports, or
Data warehousing with Business Objects, or
Data warehousing with Informatica, or
Data warehousing with BO WebI, or
Data warehousing with Cognos, or
Data warehousing with DataStage, or
Data warehousing with MSTR, or
Data warehousing with Erwin, or
Data warehousing with Oracle.
Please suggest which is the best to choose and which has more scope, because I don't know the ABC of data warehousing, but I have some experience in Oracle.
Is it a must that I have work experience in data warehousing before I can get a job? Please also tell me the best book on data warehousing, one that starts from scratch. Please give your suggestions on my queries.
Thanks & Regards,
Raji

Hi,
Basically, a data warehouse is a concept. To develop a DW, we mainly need two kinds of tools: an ETL tool and a reporting tool.
A few famous ETL tools are
Informatica
Data Stage
A few famous reporting tools are
Crystal Reports
Cognos
Business Objects
As a DW developer you should know at least one ETL tool and at least one reporting tool. The combination is your choice; it is better to find out the best combination in terms of the job market, and then learn them.
Erwin is a data modeling tool; it can also be used in a DW implementation. You already have experience with Oracle, so my advice is to go for data warehousing with Oracle or data warehousing with Informatica, and learn one reporting tool as well. I do not know whether there is any reporting tool available from Oracle.
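Just to make the ETL idea concrete, here is a minimal hand-coded sketch in Java (a sketch only, assuming JDBC access to an Oracle source and a warehouse schema; the connection strings, tables and columns are invented for illustration). A tool like Informatica or DataStage replaces exactly this kind of code with visual mappings:

import java.sql.*;

// Minimal ETL sketch: Extract rows from a source table, Transform them,
// and Load them into a warehouse fact table in one batch.
public class MiniEtl {
    public static void main(String[] args) throws SQLException {
        try (Connection src = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//src-host:1521/SRC", "src_user", "secret");
             Connection dwh = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//dwh-host:1521/DWH", "dwh_user", "secret")) {
            dwh.setAutoCommit(false);  // load as one transaction
            try (Statement extract = src.createStatement();
                 ResultSet rs = extract.executeQuery(
                     "SELECT order_id, customer_name, amount FROM orders");
                 PreparedStatement load = dwh.prepareStatement(
                     "INSERT INTO fact_orders (order_id, customer, amount) VALUES (?, ?, ?)")) {
                while (rs.next()) {
                    load.setLong(1, rs.getLong("order_id"));
                    // Transform step: normalize the customer name before loading.
                    load.setString(2, rs.getString("customer_name").trim().toUpperCase());
                    load.setBigDecimal(3, rs.getBigDecimal("amount"));
                    load.addBatch();
                }
                load.executeBatch();
                dwh.commit();
            }
        }
    }
}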
My suggestion on books.
Data Warehousing Fundamentals by Paulraj Ponniah, and
The Data Warehouse Toolkit by Ralph Kimball.
http://www.inmoncif.com/about.html is one of the best sites for data warehousing.
With rgds,
Anil Kumar Sharma .P
Assigning points is the way to say thanks on the SDN site.

Similar Messages

  • Scope for data warehousing ETL Tool

    Hi all,
    Can anybody explain the scope of data warehousing ETL tools for an Oracle developer? Is this a good direction for the future, or..?
    regards

    What exactly is your question?
    The scope of using an ETL tool would be setting up and maintaining datawarehouses and building ETL processes to populate these datawarehouses.
    A tool is generally preferred over hand coding because tools allow better maintenance, shorter development cycles etc.
    Oracle has a pretty good ETL tool called Oracle Warehouse Builder. It's not the best tool available, but if you compare price/functionality I would say in most cases it will do.
    If your question is whether it's a wise pick to master OWB or any ETL tool, my answer would be a clear YES! Data warehouses and BI are becoming more important every day, and their use gets broader every day.
    If your question is whether an investment in OWB is a wise investment for the future, I would answer a clear YES again. It's incredible to see what progress Oracle has made with the tool, coming from a 'laughing stock' position, regarded as a completely immature, good-for-nothing tool, to where they are now with 10.2: regarded as one of the leaders by Gartner.
    Oracle has recognized (a long while ago) that ETL is the bread and butter of the future and invests in a good quality tool for accomplishing this.
    I hope this answers your question; if not, please try to specify more clearly.
    Regards,
    Toin.

  • Best practice for data migration install v1.40 - Error 2732 Directory Manager

    Hi
    I'm attempting to install SAP Best Practices for Data Migration 1.40 on Windows Server 2008 R2 (64-bit).
    Prerequisite error
    Installation program stops with missing file error
    The following file was not found
    ... \migration\InstallationWizard\BusinessObjects Data Services\setup.exe
    The file is necessary for successful installation. Please connect to internet or refer to Quick Guide (available on SAP note 1527151) for information regarding the above file.
    Windows installer log displays
    Error 2732 Directory Manager not initialized
    SAP note 1527151 does not exist or is internal.
    Any help is appreciated on what the root cause of the error is, as the file does not exist in that folder in the installation zip file.
    The other prerequisite, .NET 3.5.1, is already met.
    The patch has been available since 20.11.2011, so I presume it is a good installation set.
    Thanks,
    Alan

    Hi Alan,
    There are details on Data Migration v1.4 installations on the SAP website and marketplace. The link below should guide you to the right place. It has a PowerPoint presentation and other useful links as well.
    http://help.sap.com/saap/sap_bp/DMS_V140/DMS_US/html/index.htm
    Arun

  • Webcast: Sun Oracle Database Machine for Data Warehousing - Sep 30 noon ET

    Sun Oracle Database Machine for Data Warehousing
    Jean Pierre Dijcks - Data Warehousing Product Mgmt, Oracle
    https://conference.oracle.com/imtapp/app/cmn_jm_hub.uix?mID=158101510
    On September 15 Oracle announced the second generation of its Database Machine, making an already strong data warehousing product significantly stronger. The new version runs on Sun hardware and offers important new features. Available in full rack, half rack, quarter rack, and basic unit configurations, the Sun Oracle Database Machine can add value at many data warehouse size levels.
    The Sun Oracle Database Machine runs on Oracle Database 11g Release 2 and has new features such as:
    Smart Flash Cache memory for ultra-fast IO - Reaches 50GB/second on a full rack system (not even counting gains from compression)
    Exadata Hybrid Columnar Compression - Maximizes data capacity and reduces scan times: think 500GB/second IO
    Offloaded Data Mining Scoring - Moves CPU-intensive operations from database servers to Exadata storage servers
    In-Memory Parallel Execution - Caches full tables in memory across nodes: foundation of new TPC-H world record
    There is plenty more we have not listed above, so come to this TechCast and learn about this major new product!
    Audio Dial-In: 888 967 2253 Audio Meeting ID: 572994 Audio Meeting Passcode: 334451
    Web Conference: https://conference.oracle.com/imtapp/app/cmn_jm_hub.uix?mID=158101510
    Compatibility Check: If you have not used Oracle's web conference system before, please ensure your system compatibility by going to https://conference.oracle.com/imtapp/app/nuf_sys.uix

    Is there any way one could get this webcast to watch it offline?
    regards

  • SAP Best Practices for Data Migration :repositories only on MS SQL Server ?

    Hi,
    I'm implementing the "SAP Best Practices for Data Migration" (see https://websmp109.sap-ag.de/bp-datamigration).
    As part of the installation you have to install MS SQL Server Express Edition. The installation guide contains detailed steps to do this. All repositories for Data Services should be running on SQL Server, according to the installation guide.
    The customer I'm working for now does not want to use SQL Server, but DB2, as company standard.
    So I use DB2 for the local and profiler repositories.
    I notice however that the web application http://localhost:8080/MigrationServices does not support DB2. The only database type you can select in the configuration area is MS SQL Server.
    Is this a limitation, or by design?

    Hans,
    The current release of SAP Best Practices for Data Migration, v1.32, supports only MS SQL Server. The intent when developing the DM content was to quickly set up a temporary, standardized data migration environment, using tools that are available to everyone. SQL Server Express was chosen to host the repositories, because it is easy to set up and can be downloaded for free. Some users have successfully deployed the content on Oracle XE, but as you have found, the MigrationServices web application works only with SQL Server.
    The next release, including the web app, will support SQL Server and Oracle, but not DB2.
    Paul

  • Best practices for data migration

    Hi All,
    This thread is for those who want to share ideas and knowledge on making the best use of data migration with Business Objects Data Services.

  • Table Owners and Users Best Practice for Data Marts

    2 Questions:
    (1) We are developing multiple data marts that share the same instance. We want to deny access to the users while tables are being updated. We have one generic user (BI_USER) with read access through one of the popular BI tools. For the current (first) data mart we denied access by revoking the privilege from BI_USER; however, going forward the other data marts' tables will be updated on different schedules, and we do not want to deny access to all the data marts at once. What is the best approach?
    (2) What is the best methodology for ownership of tables in different data marts that share tables across marts? Can we create one generic ETL_USER to update tables with different owners?
    Thanx,
    Jim Masterson

    If you have to go with generic logins, I would at least have separate generic logins for each data mart.
    Ideally, data loads should be transactional (or nearly transactional), so you don't have to revoke access ever. One of the easier tricks to accomplish this is to load data into a shadow table and then rename the existing table and the shadow table. If you can move the data from the shadow table to the real table in a single transaction, though, that's even better from an availability standpoint.
    If you do have to revoke table access, you would generally want to revoke SELECT access to the particular object from a role while the object is being modified. If this role is then assigned to all the Oracle user accounts, everyone will be prevented from viewing the table. Of course, in this scenario, you would have to teach your users that "table not found" means that the table is being refreshed, which is why the zero downtime approach makes sense.
    You can have generic users that have UPDATE access on a large variety of tables. I would suggest, though, that you have individual user logins to the database and use roles to grant whatever ad-hoc privileges users need. I would then create one account per data mart, with perhaps one additional account for the truly generic tables, to own each data mart's objects. Those users would then grant different roles different database privileges, and you would then grant those different roles to different users. That way, Sue in accounting can have SELECT access to portions of one data mart and UPDATE access to another data mart without granting her every privilege under the sun. My hunch is that most users should not be logging in to, let alone modifying, all the data marts, so their privileges should reflect that.
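    For illustration, here is a minimal sketch of the shadow-table swap described above (a sketch only, assuming Oracle accessed via JDBC; the table names are hypothetical):

    import java.sql.*;

    // Load fresh data into a shadow table, then swap it with the live table
    // by renaming. The renames are DDL and commit implicitly, so there is a
    // brief window between them; that is why moving the rows inside a single
    // transaction is even better when data volumes allow it.
    public class ShadowSwap {
        public static void refresh(Connection conn) throws SQLException {
            try (Statement stmt = conn.createStatement()) {
                stmt.executeUpdate("TRUNCATE TABLE sales_fact_shadow");
                stmt.executeUpdate(
                    "INSERT INTO sales_fact_shadow SELECT * FROM staging_sales");
                stmt.executeUpdate("ALTER TABLE sales_fact RENAME TO sales_fact_old");
                stmt.executeUpdate("ALTER TABLE sales_fact_shadow RENAME TO sales_fact");
                stmt.executeUpdate("ALTER TABLE sales_fact_old RENAME TO sales_fact_shadow");
                // Dependent views, synonyms and grants may need attention after a rename.
            }
        }
    }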
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Best strategy for data definition handling while using OCCI

    Hi, subject says all.
    Assuming OCCI is used for data manipulation (e.g. object navigation), what's the best strategy to handle data definitions in the same context?
    I mean primarily dynamic creation of tables and columns.
    Apparent choices are:
    - SQL from OCCI.
    - use OCI in parallel.
    Did I miss anything? Thanks for any suggestions.

    Agreeing with Kappy that your "secondary" backups should be made with a different app. You can use Disk Utility as he details, or a "cloning" app such as CarbonCopyCloner or SuperDuper that can update the clone periodically, rather than erasing and re-copying from scratch.
    [CarbonCopyCloner|http://www.bombich.com] is donationware; [SuperDuper|http://www.shirt-pocket.com/SuperDuper/SuperDuperDescription.html] has a free version, but you need the paid one (about $30) to do updates instead of full replacements, or scheduling.
    If you do decide to make duplicate TM backups, you can do that. Just tell TM when you want to change disks (thus it's a good idea to give them different names). But there's no reason to erase and do full backups; after the first full backup to each drive, Time Machine will back up only the changes made since the last backup to that disk. Each is completely independent.

  • Best Practice for Data Refresh & Hierarchy

    Hi,
    During a recent discussion with one of our BI user groups, questions were raised as to what the best practices are for handling the following two issues.
    Issue 1:
    If entries are posted to prior periods in SAP R/3 (outside of the daily auto-refresh range), the current process is that the user group asks us to conduct a manual refresh in BI for the prior periods that are affected.
    Question: Is it possible to set up a trigger in the system, so that BI knows which periods are changed and automatically refreshes data for those periods?
    Issue 2:
    If a hierarchy used in the reports is modified, there might be an adverse impact on the financial data the user group reports on. The current process we have in place is to run a group of BI reports for both the current year and the prior year to make sure nothing is impacted, but there are limitations to this process. What if there is no impact on the current or prior year, but on years before that?
    Question: What other global companies do to minimize such reporting impact, especially when they have hundreds of complex reports?
    If anyone has any info on this, please share it.
    Thanks all for your support.
    Regards,
    Murali

    Hi Srini,
    1. SAP suggests implementing a data archiving strategy as early as possible to manage database growth.
    However, people usually think of archiving only when they run into problems like large data volumes, slow system response times, performance issues, etc.
    2. There is a proper way to implement data archiving. The database has to be analyzed first to identify the top DB tables and archiving objects.
    3. Based on the DB analysis, a data archiving plan has to be implemented according to the data management guide.
    4. There is a minimum period, known as residence time, that has to be completed before any data can be archived. Once a document is business-complete and has served its minimum required period in the database, it can be archived.
    5. Before going for data archiving there are many steps to be followed, like analysis, configuration, etc., which you can see in detail at the link below:
    http://help.sap.com/saphelp_47x200/helpdata/en/2e/9396345788c131e10000009b38f83b/frameset.htm
    Let me know if this helps you.
    -Supriya

  • The best practice for data archiving

    Hi
    My client has been using OnDemand for almost 2 years. There are around 2M records in the system (Activities), so I just want to know the best practice for data archiving; we don't care much about data older than 6 months.

    Hi Erik,
    Archival is nothing but deletion.
    Create a backup cube in BW. Copy the data from your planning cube to the backup cube, and then delete that data region from your planning cube.
    Archival will definitely improve the performance of your templates, scripts, etc., since the system will now search a smaller dataset.
    Hope this helps.

  • Best Practices for Data Access

    Good morning!
    I was wondering if someone might give me some advice on some best practices for retrieving data from a SQL server in the cloud via a desktop application?
    I'm curious, if I embed the server address (IP, or domain, or whatever) into my desktop application and allow the users to provide their own usernames and passwords when using the application, is there anything "wrong" with that? Wherein my application collects the username and password from the user, connects to the server with that username and password, retrieves the data and uses it in-app.
    I'm petrified of security issues and I would hate to start using a SQL database with this setup only to find out that anyone could download x, y or z and connect to the database and see everything.
    Assuming I secure all of the users with limited permissions, is there anything wrong with exposing a SQL server to the web for my application to use? If so, what and what would be a reasonable alternative?
    I really appreciate any help and feedback!

    There are two options, none of them very palatable:
    1) One is to create a domain, and add the VM and your local box to it.
    2) Stick to a workgroup, but have the same user name and password on both machines.
    In practice, a better option is to create an SQL login that is a member of sysadmin - or that has rights to impersonate an account that is a member of sysadmin. For that matter, you could use the built-in sa account - but rename it to something else.
    The other day I was looking at the error log from a server that apparently had been exposed on the net. The log was full of failed login attempts for sa, with occasional attempts for names like usera and so on. The server is in Sweden - the IP addresses for the login attempts were in China.
    Just so you know what you can expect.
    Erland Sommarskog, SQL Server MVP, [email protected]
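    For illustration, a minimal sketch of the connection pattern the original poster describes (a sketch only, assuming Microsoft's JDBC driver and a dedicated least-privilege SQL login; the host, database and credentials are placeholders):

    import java.sql.*;

    // Connect to an internet-facing SQL Server with a user-supplied login.
    // The login should hold only the minimum permissions the app needs,
    // and encrypt=true protects credentials and data on the wire.
    public class CloudDb {
        public static Connection connect(String user, String password) throws SQLException {
            String url = "jdbc:sqlserver://db.example.com:1433;"
                       + "databaseName=AppDb;encrypt=true;trustServerCertificate=false";
            return DriverManager.getConnection(url, user, password);
        }

        public static void main(String[] args) throws SQLException {
            try (Connection c = connect("app_reader", "s3cret");
                 Statement s = c.createStatement();
                 ResultSet rs = s.executeQuery("SELECT name FROM sys.tables")) {
                while (rs.next()) System.out.println(rs.getString(1));
            }
        }
    }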

  • ASM BEST PRACTICES FOR 'DATA' DISKGROUP(S)

    In our quest to reduce operating costs we are consolidating databases and eliminating RAC in favor of standalone servers. This is a business decision that is a certainty.  Our SAN has been upgraded, and the new database servers are newer, faster, etc.
    Our database version is 11.2.0.4 with Grid Infrastructure 12.1.0.1. Our data diskgroup is RAID-5 and our fra is RAID-1+0.  ASM has external redundancy.  All disks are of equal size with equal storage performance and availability.
    Previously our databases were on separate clusters by function: OLTP, REPORTING and ENTERPRISE CONTENT MANAGEMENT. Development/Acceptance shared a cluster, while production was separate.
    The new architecture combines different functions onto one server for dev/acc, and another for production.  This means they will all be using the same ASM instance.  Typically we followed Oracle’s recommendation to have two disk groups, one for data and the other for FRA.  That worked well when a single database was the only one using the data diskgroup.  Now that we are combining databases, is the best practice still to have one data diskgroup and one FRA diskgroup?  For example, production will house 3 databases.  OLTP is 500 GB, Reporting is 1.3 TB, and Enterprise Content Management is 6 TB and growing.
    My concern is that if all 3 databases access the same data diskgroup, the smaller OLTP must traverse the 6 TB of content management data. Or is this thinking flawed?
    Does this warrant separate diskgroups?  Are there pros and cons to this?
    Any insights are appreciated.
    Best Regards,
    Sherrie

    I have many issues to deal with in this 'consolidation', but budget reduction is happening in state and regional government.  Our SAN storage is for our enterprise infrastructure and not part of my money-savings directive.  We are also migrating to UCS blades for the infrastructure, also not part of my budget reduction contribution. Oracle licensing is our biggest software cost, and this is where my directive lies.  We've always been conservative and done more with less; now we will do with less, but differently, because the storage and hardware are awesome.
    We've been consolidating databases onto RAC clusters and standalones since we started doing Oracle.  For the last 7 years we've supported ASM, 6 databases and 2 passive standby instances (with Data Guard) on a 2-node cluster totalling 64 GB of memory.  The new UCS blades have 256 GB of memory.  I get that each database must support its background processes.  If I add up the SGA, the PGA allocated and the background processes, they take up about 130 GB of memory, but also consider that there is an overhead to RAC.  In all the years we've had Oracle, most of our failures, outages or downtime were because of RAC.  On the plus side, the seamless failover saved us most times (not all times), but required administrative time for troubleshooting.
    I would love to go to Oracle 12c and use its multitenant architecture, but I have 3rd party applications that don't yet support it.  11.2 might be our last release unless I can reduce costs.  Consolidation is real and much needed, which I believe is why Oracle responded to the market with multitenancy.
    But back to my first question about how many diskgroups should service a group of databases.  What I'm hearing, and think I agree with, is that one data diskgroup will suffice, because the ASM instance knows where to retrieve the data, and both waste and management overhead will be reduced.
    I still need to do some ciphering and by no means have a final plan, but thank you all for your insights and contributions.

  • RFP for Data Warehousing Project

    My manager contracted a consultant to come up with an RFP for us. But we are running way behind schedule :(
    Does anyone have a sample RFP for a Data Warehousing project for me to have a reference on what is required?
    Thanks,
    Tamara

    Tamara,
    I assume this question is not for members of the Oracle Warehouse Builder team. Otherwise it'd be a bit awkward - you are asking us for a sample of what to request from us.
    Nikolai

  • How to extract data from RR cluster and B2 cluster for data warehousing

    Hi,
    Our company wants to make a backup of RR cluster and B2 cluster data for all employees. It's basically moving all the employee master data to a data warehouse.
    Your suggestions are appreciated.
    Cheers,
    Senthil

  • JPA - Best Practice For Data Transfer?

    I've been considering an alternative method for data transfer between applications: serializing or encoding JPA entities to file (either binary or XML).
    I know this procedure may have several drawbacks compared to traditional exported SQL queries or data manipulation statements; however, I want to know if anyone has considered or used this process for data transfer.
    The process would be to
    - query the database and load the JPA Entities
    - Serialize or Encode them to file
    - zip up the entire folder with the JPA entities
    - transfer the data to destination machine
    - extract the data to a temp directory
    - reload the JPA entities by de-serializing and persisting them to the database
    The reason I'm considering this process is basically that I have a desktop application (managing member records, names, dates, contributions, etc.) used by different organisations in different locations (which are not related except by purpose, i.e. clubs or churches), and I would like a simple way of transporting all data associated with a single profile (information about a member of the organisation) from one location to another: users interact only with the application, without the need for any database management tool.
    I'm also considering this because it is not easy to generate an SQL script file without using a dedicated database management tool, which I do not want the application users to have to learn how to use.
    I would appreciate ANY suggestions and probable alternative solutions for this problem. FYI: I'm using a Java DB database.
    ICE

    jschell wrote:
    "In summary you are moving data from one database to another."
    True.
    "You only discussed flow one way."
    Well, the process is meant to be bi-directional. Basically what I envision would be something like:
    - the user goes to File -> Export Profile...
    - then selects one or more profiles to export
    - the export process completes and the zip archive is created and transferred (copied, mailed, etc.) to the destination location
    Then on the destination PC:
    - the user goes to File -> Import Profile
    - selects the profile package
    - the app extracts, processes and imports the data (JPA serialized, for example)
    "Flow both ways is significantly more complicated in general."
    Well, if done well it shouldn't be.
    "And what does this have to do with users creating anything?"
    Well, as shown above, the user would be generating the zip archive (assuming that is the final format).
    Does this make the problem clearer?
    ICE
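    For illustration, here is a minimal sketch of the export step described in this thread (a sketch only; Profile stands in for the real Serializable JPA entity, and the import side would read the entry back with ObjectInputStream and persist each entity with EntityManager.merge):

    import java.io.*;
    import java.util.List;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    // Hypothetical stand-in for the real JPA-annotated entity class.
    class Profile implements Serializable {
        String name;
        Profile(String name) { this.name = name; }
    }

    // Serialize detached JPA entities into a single zip archive for transfer.
    // Lazy associations must be initialized before export, or serialization
    // will miss (or fail on) the unloaded parts of the object graph.
    public class ProfileExporter {
        public static void export(List<Profile> profiles, File target) throws IOException {
            try (ZipOutputStream zip = new ZipOutputStream(new FileOutputStream(target))) {
                zip.putNextEntry(new ZipEntry("profiles.ser"));
                ObjectOutputStream out = new ObjectOutputStream(zip);
                out.writeObject(profiles);  // whole object graph in one entry
                out.flush();                // flush, but let the zip stream close itself
                zip.closeEntry();
            }
        }
    }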
