Data warehouse

Hello all,
I am looking to extract data from a Kalido data warehouse for reporting in BI 7.0.
Could anyone help me with quick setup tools and steps for UD Connect, as well as any other method you could suggest specifically for Kalido?
Points will be awarded on merit.
Thanks
Message was edited by:
        Shail

Make sure there is a JDBC driver available for Kalido. If drivers are available, you can configure it like other UD Connect data sources.

Similar Messages

  • Install Data Warehouse on SUSE Linux 9.3 with Oracle 9i Release 2 (9.2.0.4)

    Dear All,
    One of my customers needs a data warehouse on SUSE Linux, and I am a core DBA who has worked on production and development servers up to now. I want to know the necessary things to keep in mind before configuring a data warehouse for the customer. Can you please suggest the things I need to take care of?
    Any URLs or PDFs?
    Thanks in advance,
    Ravi

    But when I don't give the oracle user all rights, it isn't possible to proceed with the installation.
    But if you give those rights, then it's a security hole. From your words I guess you have environment settings similar to:
    ORACLE_BASE=/
    ORACLE_HOME=/<directory_name>
    Why not install in a deeper directory such as /opt, or some directory of your own? For example:
    ORACLE_BASE=/myoracledir
    ORACLE_HOME=$ORACLE_BASE/<directory_name>
    Then chown -R oracle:dba /myoracledir.
    Then oracle will be the owner of just the /myoracledir directory and all its subdirectories.
    I could just look at the error details, but they didn't describe the error anyway.
    That's not quite true. You can find the error log in /tmp/OraInstallYYYY-MM-DD_HH_MI_SS.

  • Data Warehouse Slowdown

    Hi,
    We have some mappings running on our data warehousing system. When we shut down that database, it takes 2-3 hours or more to finish.
    My thinking is that when we shut down the database, all SQL needs to be compiled again. I don't know whether I am right or wrong.
    Can anyone shed some light on this? It would be great if someone could suggest a possible solution.
    Platform: Oracle 8i on HP-UX
    Thanks,
    Chetan

    Which shutdown option are you using?
    SHUTDOWN NORMAL is a system quiesce that waits for all sessions to log off - this can take a LONG time.
    SHUTDOWN TRANSACTIONAL waits for all transactions to complete. If you are doing a large load transaction, this can take a long time.
    SHUTDOWN IMMEDIATE terminates all user sessions, rolls back any uncommitted transactions, and shuts down. If there is a long-running transaction that needs to be rolled back, this can take a long time.
    SHUTDOWN ABORT crashes the instance, requiring instance recovery upon startup. This will shut the database down quickly.
    NORMAL and TRANSACTIONAL are the cleanest options, but may take too long.
    If your shutdown is taking too long, you can connect via another session, do a SHUTDOWN ABORT, and then let the transactions roll back during the instance recovery on startup; see the sketch below.
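    A minimal sketch of that last approach in SQL*Plus, run as SYSDBA. The query for sizing the outstanding rollback work is my own assumption, not from the original post:
        -- Estimate outstanding rollback work before choosing a shutdown mode
        SELECT s.sid, s.username, t.used_ublk AS undo_blocks
        FROM   v$transaction t
        JOIN   v$session s ON s.taddr = t.addr;
        -- Terminate quickly; instance recovery rolls transactions back at startup
        SHUTDOWN ABORT
        STARTUP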

  • Data warehouse question

    Hello Everyone,
    In data warehousing, which needs to be higher: the SGA or the PGA? I have read that it is supposed to be the PGA. Can anyone clarify this for me?
    Thanks in advance

    For any database, the amount of RAM allocated to SGA and to PGA will depend heavily on the database workload and the application(s) connecting to the database. There is no rule that states that the SGA should always be bigger than the PGA or vice versa.
    In general, data warehouses allocate a greater percentage of the available memory to PGA than OLTP databases would, because data warehouses generally support queries that involve relatively large sorts, hash joins, etc., for relatively few concurrent users, and those sorts of operations require more PGA. Since data warehouses commonly do table scans, and blocks read by one query aren't commonly re-read by other queries, the benefit of a large SGA tends to be less than for an OLTP system.
    Whether these generalities mean that a particular warehouse allocates more RAM to PGA than to SGA in absolute terms rather than merely in relative terms compared to an OLTP database on the same hardware, though, depends entirely on the specifics of the system.
    Justin
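    A minimal sketch of how you might inspect and adjust the split, assuming 10g+ with manually managed sga_target and pga_aggregate_target; the 8G figure is purely illustrative:
        -- Current targets (SQL*Plus)
        SHOW PARAMETER sga_target
        SHOW PARAMETER pga_aggregate_target
        -- Compare actual PGA use against the target
        SELECT name, value
        FROM   v$pgastat
        WHERE  name IN ('aggregate PGA target parameter', 'total PGA allocated');
        -- Shift memory toward PGA for a warehouse workload (illustrative value)
        ALTER SYSTEM SET pga_aggregate_target = 8G SCOPE = BOTH;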

  • Data Warehouse Daily Maintenance + Up and Running 24/7

    Hello fellows,
    I have started working in a data warehouse environment and need suggestions regarding daily maintenance and keeping the DB up and running 24/7. What should the steps be to accomplish this? Or please forward me good notes.

    You'll need to manage loads into the warehouse, so I would suggest some metric-capturing tables to assist there. Backups, and monitoring of the backups, will need to be in place. Statistics should be gathered on a predetermined schedule. Aggregations can be calculated via materialized views. Add general monitoring of the environment (disk space, CPU, memory). A sketch of the statistics and refresh pieces follows below.
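    A minimal sketch of the statistics-gathering and aggregation-refresh pieces, assuming Oracle 10g+; the DWH schema name and the SALES_MV materialized view are illustrative assumptions:
        -- Gather schema statistics on a predetermined schedule
        BEGIN
          DBMS_STATS.GATHER_SCHEMA_STATS(
            ownname => 'DWH',
            degree  => 4,      -- gather in parallel
            cascade => TRUE);  -- include indexes
        END;
        /
        -- Refresh an aggregation after the nightly load ('C' = complete refresh)
        EXEC DBMS_MVIEW.REFRESH('SALES_MV', 'C');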

  • Data Warehouse Database Optimization Parameters

    Hi,
    Does anyone have standard, optimized Oracle parameter values for a small data warehouse environment?
    Any suggestions are welcome.
    thanks

    As with any tuning problem, there is no "one size fits all" approach. The standard tuning methodology applies here as it does anywhere:
    - Figure out how quickly something needs to run.
    - If it isn't running quickly enough, figure out what is taking so much time.
    - Once you know what is taking so much time, figure out how to reduce the time required. That may involve a global configuration change, it may involve tuning SQL, etc.
    In addition, specifying at least the Oracle version would be critical - there's a world of difference between an 8.1.7 database and an 11.1 database. If you are managing SGA and PGA separately, data warehouses generally allocate a larger fraction of the RAM to PGA than their OLTP cousins. They generally make greater use of parallelism. They more commonly use compression, bitmap indexes, and partitioning (see the sketch below).
    Justin
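    A minimal sketch of parameters warehouses commonly adjust, assuming 10g+; every value shown is an illustrative assumption, not a recommendation - test against your own workload:
        -- Favor PGA for large sorts and hash joins
        ALTER SYSTEM SET pga_aggregate_target = 4G SCOPE = SPFILE;
        -- Allow more parallel query slaves
        ALTER SYSTEM SET parallel_max_servers = 64 SCOPE = SPFILE;
        -- Larger multiblock reads help full table scans
        ALTER SYSTEM SET db_file_multiblock_read_count = 128 SCOPE = SPFILE;
        -- Enable star transformation for dimensional queries
        ALTER SYSTEM SET star_transformation_enabled = TRUE SCOPE = SPFILE;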

  • Views or MVs in 11g for a data warehouse database

    Hello All,
    I would like to know which is the better option to go with in a DWH: views or materialized views. I have read that views should not be used. I need to know the pros and cons of views before I propose scrapping them to my team. Which is better in terms of performance?
    Thanks

    Views are not designed to improve query performance: http://download.oracle.com/docs/cd/E11882_01/server.112/e16508/schemaob.htm#i20690.
    Materialized views are designed to improve query performance: http://download.oracle.com/docs/cd/E11882_01/server.112/e16508/schemaob.htm#CFAIGHFC
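    A minimal sketch of an aggregate materialized view that speeds up queries via query rewrite; the sales table, its columns, and the MV name are illustrative assumptions:
        -- A materialized view log enables fast (incremental) refresh
        CREATE MATERIALIZED VIEW LOG ON sales
          WITH SEQUENCE, ROWID (prod_id, amount) INCLUDING NEW VALUES;
        CREATE MATERIALIZED VIEW sales_sum_mv
          BUILD IMMEDIATE
          REFRESH FAST ON COMMIT
          ENABLE QUERY REWRITE   -- lets the optimizer substitute the MV transparently
        AS
        SELECT prod_id,
               COUNT(*)      AS cnt,        -- counts are required for fast refresh
               COUNT(amount) AS cnt_amount,
               SUM(amount)   AS total_amount
        FROM   sales
        GROUP  BY prod_id;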

  • Adding new columns & new data warehouse tables

    Hello Gurus,
    We are using OBIEE 7.9.6 with Oracle EBS OLTP. I would like to show a new column in an existing dashboard report, which requires a new column in a data warehouse table and in a staging table. I also have a requirement to add a new custom data warehouse table to create a new report. Does anyone know the step-by-step approach for this, with details such as the BI tools that are required?
    Any pointers on this are greatly appreciated!
    Thanks,
    Chandra

    Please read this:
    Oracle Business Intelligence
    Data Warehouse Administration Console User's Guide
    Version 10.1.3.4
    E12652-01
    Figure 8-1, "Process Flow to Add New Object to Data Warehouse"
    Page 59
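    On the database side alone, the new columns amount to DDL like the sketch below; the table and column names are illustrative assumptions, and the DAC/Informatica registration steps from the guide above are still required:
        -- Add the column to both the staging table and the target warehouse table
        ALTER TABLE w_sample_fs ADD (x_new_attr VARCHAR2(80));
        ALTER TABLE w_sample_f  ADD (x_new_attr VARCHAR2(80));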

  • Data warehouse with Oracle or distributed computing

    Hi All,
    When I talked with some people at big companies about DW (ETL especially), all of them said they much prefer distributed computing with Hadoop or Hive over Oracle.
    When they have huge amounts of data to process per day (say n TB), Oracle or a relational database can't work very well.
    I have experience with just one DW project, which was implemented purely with PL/SQL and shell scripts and works well, at least from my point of view.
    What's your opinion on this?
    Thank you very much,
    Leon

    Hi Leon,
    look at this page (it contains links to two publications comparing Hadoop against two RDBMSs):
    http://database.cs.brown.edu/projects/mapreduce-vs-dbms/
    It seems Hadoop currently has no chance against an RDBMS (in the DWH area)...
    In my opinion, Hadoop/Hive is a technology and not a solution; to solve a problem with Hadoop/Hive you will need to do a lot of work.
    Regards,
    Oleg

  • How to identify the Data Warehouse Write account

    Hi Guys,
    Please help me identify the DW Writer account. I am able to get the details of all the other accounts except the DW Write account; I am not able to find it in the account details from the console view. As my customer does not have details on the accounts used, I am facing this challenge.
    Thank you!
    - Thanks, Sai

    Hi,
    Check the DB role on the DW database called "OpsMgrWriter" via SQL Server Management Studio and see which account has been assigned to it, for example; see the sketch below.
    Cheers,
    Christoph
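    A minimal sketch of that check in T-SQL, assuming the default data warehouse database name OperationsManagerDW:
        -- List the members of the OpsMgrWriter database role
        USE OperationsManagerDW;
        SELECT dp.name AS role_name, mp.name AS member_name
        FROM   sys.database_role_members rm
        JOIN   sys.database_principals dp ON dp.principal_id = rm.role_principal_id
        JOIN   sys.database_principals mp ON mp.principal_id = rm.member_principal_id
        WHERE  dp.name = 'OpsMgrWriter';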
    Blog: http://blog.cmaresch.at/

  • Support: Rebuild SCOM 2012 R2 Data Warehouse Database

    Hi Folks,
    We have 2 management servers, 1 OpsManager DB, 1 data warehouse DB + reporting server, and 1 ACS server. Unfortunately our data warehouse DB server took a hit and the database got corrupted. To add insult to injury, we don't have a valid database backup. We have tried to repair the database, and it repairs all the errors except catalog errors, due to which the database keeps going into suspect mode. I would like to know if there is a way to rebuild the data warehouse database component of the SCOM 2012 R2 environment. Any assistance in this matter will be very helpful.
    Regards,
    Navdeep [a.k.a. v-2nas]

    Hi there,
    1. Uninstall the Report Server role.
    2. Blow away the ReportServer and ReportServerTempDB databases and the Reporting Services website (or do an SSRS reset).
    3. Uninstall the Data Warehouse component (i.e., delete the Data Warehouse database).
    4. Install the Data Warehouse (i.e., create the Data Warehouse database).
    5. Ensure SSRS is working in its default state (you can get to http://localhost/reports without error). You'll need to use the SSRS Configuration Tool.
    6. Install the Report Server role.
    Refer to: https://social.technet.microsoft.com/Forums/systemcenter/en-US/ca03b455-8c13-42a7-a810-8a63c913b527/scom-data-warehouse-database-uninstall-and-reinstall-procedures-in-production?forum=operationsmanagerreporting
    Gautam.75801

  • Exchange partition on large partitioned tables in a data warehouse

    Hi all,
    Oracle 10.2.0.4 (64-bit) and OS 5.3 (64-bit).
    We have large tables in our DWH, sized in terabytes, holding data for 13 months.
    Now our management wants to split these tables in two:
    current tables containing data for the last 3 months plus the current month, and history tables containing the 9 months of historical data.
    We have no space on the mount point for export/import.
    Will exchange partition work for this? If yes, please provide steps/demos/examples.
    Some partitions are larger than 300 GB.

    user610482 wrote:
    Hi Oracle gurus,
    I need a dynamic script to add a MAXVALUE partition to all 100 tables in a schema; each table has a different partition key and a different tablespace.
    AVB1_NOTIFICATIONSL has 2 columns in its partition key.
    For example: alter table AVB1_NOTIFICATIONSL add partition DMAX VALUES LESS THAN (MAXVALUE, MAXVALUE) TABLESPACE LARGE_D
    Is the SQL above valid?
    Does it do what you require?
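    As for the original question: EXCHANGE PARTITION can split the table without export/import space, because it only swaps segment metadata rather than copying data. A minimal sketch; the table, partition, and column names are illustrative assumptions:
        -- 1. Create an empty table with the same structure as the partitioned table
        CREATE TABLE sales_hist AS SELECT * FROM sales WHERE 1 = 0;
        -- 2. Swap the old partition's segment into the history table (metadata only)
        ALTER TABLE sales EXCHANGE PARTITION p_2009_01 WITH TABLE sales_hist
          WITHOUT VALIDATION;
        -- 3. Drop the now-empty partition from the current table
        ALTER TABLE sales DROP PARTITION p_2009_01;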

  • Need information/advice on data warehouse compression on 11g

    We have an initiative to compress our ever-growing data warehouse.
    The OS is AIX, the DB version is 11gR1, and we are going to move to 11gR2.
    We would like to know where we can find a link about what is best to apply regarding compression in a data warehouse DB.
    I tried the Oracle documentation and am not sure where to look.
    Also, is compression a licensed product? Are there any performance concerns? Any practical advice is highly appreciated.

    Table compression does not require any additional licenses; the catch is that compression is only applied via direct-path inserts, which is a pretty good fit for a data warehouse. To get and maintain compression for conventional inserts and for updates, an Advanced Compression license is required.
    Direct-path inserts do not appear to take any large hit to load performance; however, updates and conventional inserts do take a bit of a hit when using compression. Queries that do significant IO appear to benefit the most from compression, and the larger the number of IO operations, the greater the benefit tends to be. Keep in mind that compression does take a CPU hit for the compression and decompression operations; while this does not appear to be very large, it is a consideration if your system already has CPU resource issues.
    I have found that partitioning and compression used together offer the best performance when accessing via the partition key, and partitioning offers management benefits as well; I found range/interval partitioning offered the best management benefits for a data warehouse.
    Some references besides the documentation, which is a decent place to start; Google searches on Oracle Advanced Compression also return tons of results:
    http://www.techrepublic.com/whitepapers/oracle-database-11gr2-reduces-storage-costs-and-improves-performance/1728273
    Looks at table compression and 11g Advanced Compression
    http://myeverydayoracle.blogspot.com/2010/11/oracle-10g-compression-vs-11gs-advanced.html
    Basic explanations and examples for compression
    http://practical-tech.blogspot.com/2012/01/oracle-11gr2-table-level-compression.html
    Partitioning:
    http://www.oracle.com/technetwork/database/bi-datawarehousing/twp-partitioning-11gr2-2010-10-189137.pdf
    http://www.oracle.com/technetwork/database/options/partitioning/ds-partitioning-11gr2-2009-09-134551.pdf
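    A minimal sketch of basic (no extra license) table compression via direct-path operations; the table names are illustrative assumptions, and COMPRESS BASIC is the 11gR2 keyword (11gR1 uses COMPRESS FOR DIRECT_LOAD OPERATIONS):
        -- Create a compressed copy; CTAS uses direct path, so the data lands compressed
        CREATE TABLE sales_hist COMPRESS BASIC
        AS SELECT * FROM sales;
        -- Later direct-path loads stay compressed; conventional inserts would not
        INSERT /*+ APPEND */ INTO sales_hist
        SELECT * FROM sales_stage;
        COMMIT;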

  • MDM, Data Warehousing, and XI

    Hi,
    Could somebody tell me the connection between, or the difference among, these 3 technologies? I appreciate all your inputs.
    S.

    Hi,
    "difference between these 3 technologies"
    MDM --> Master Data Management. It is a key enabler of consistent master data within a heterogeneous system landscape. It ensures cross-system data consistency through streamlined data import, consolidation, and distribution mechanisms.
    Data Warehousing --> It is the BW concept of storing historical data - data that is not frequently used but is sometimes required for operational purposes. It maintains master data as well as transactional data, and data warehousing has its own ways of preserving this data; for example, transactional data is stored in InfoCubes.
    XI --> It is an integration technology and platform for SAP and non-SAP applications and for cross-component business process management. Take an example where the sender system sends an IDoc and the receiver system demands this information in file format: here XI manages the mapping and configuration between these 2 systems.
    "connection between these technologies"
    Imagine a scenario wherein R/3 sends an IDoc containing master data that needs to be updated in an MDM repository. In this case, R/3 sends the IDoc to XI, XI converts it to an XML file, and this XML file is picked up by MDM's Import Manager; the data is thus saved to the repository for further use, thereby ensuring consistency if remote systems use this data.
    Hope this satisfies your curiosity.

  • Data warehousing

    How do we perform performance tuning in a Prompt?
    What is the difference between OLTP and OLAP?
    What is the difference between after-aggregation and before-aggregation?

    How do I create a multidimensional view database? I don't have any experience there, but I have learned the conceptual idea; I need to know how to implement a basic one. Does MySQL support OLAP concepts and data warehousing? Please give me some ideas.
    This is a Java forum, and your question is not Java related.
    Kaj
