Join large external tables (best practice?)

Hi there,
I have three external tables in a master-detail relation: table A (10,000,000 rows) is the master of table B (20,000,000 rows), and table B is the master of table C (100,000,000 rows). Can you tell me the best way:
- directly join the external tables, or
- copy the external tables into database tables, create indexes, and join those?
Which is more efficient, and why?
Thanks for your help and ideas.
Gerhard.

In general, if the joins you are doing can benefit from indexes, you will want to copy the data to database tables. If the joins will end up doing full table scans anyway, it will matter far less.
For data integrity, you will likely also want to be able to enforce foreign key constraints.
Justin
Distributed Database Consulting, Inc.
http://www.ddbcinc.com/askDDBC
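
To make the second option concrete, here is a minimal sketch of the copy-then-index approach (the table names ext_a/ext_b/ext_c and the key columns are hypothetical placeholders, not taken from the original post):

-- Materialize the external tables as regular database tables
CREATE TABLE a AS SELECT * FROM ext_a;
CREATE TABLE b AS SELECT * FROM ext_b;
CREATE TABLE c AS SELECT * FROM ext_c;

-- Index the join keys and, for data integrity, enforce the master-detail links
ALTER TABLE a ADD CONSTRAINT a_pk PRIMARY KEY (a_id);
ALTER TABLE b ADD CONSTRAINT b_pk PRIMARY KEY (b_id);
ALTER TABLE b ADD CONSTRAINT b_a_fk FOREIGN KEY (a_id) REFERENCES a (a_id);
ALTER TABLE c ADD CONSTRAINT c_b_fk FOREIGN KEY (b_id) REFERENCES b (b_id);
CREATE INDEX b_a_id_ix ON b (a_id);
CREATE INDEX c_b_id_ix ON c (b_id);

-- The join itself is then written exactly as it would be against the external tables
SELECT a.a_id, b.b_id, c.c_id
FROM   a
JOIN   b ON b.a_id = a.a_id
JOIN   c ON c.b_id = b.b_id;

Whether this pays off depends on the queries: selective, index-driven joins benefit from the copy; joins that full-scan all three tables anyway will not, as noted above.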

Similar Messages

  • Transfer iPhoto library to external hard drive. Best practice?

    Need to transfer iPhoto library to external hard drive due to space issues. Best practice?

    Moving the iPhoto library is safe and simple: quit iPhoto and drag the iPhoto Library, intact as a single entity, to the external drive. Then hold down the Option key and launch iPhoto, using the "Select Library" option to point to the new location on the external drive. Fully test it, and then trash the old library on the internal drive (test one more time prior to emptying the Trash).
    Also be sure that the external drive is formatted Mac OS Extended (Journaled) (iPhoto does not work with drives in other formats) and that it is always available prior to launching iPhoto.
    And back up soon and often: having your iPhoto library on an external drive is not a backup, and if you are using Time Machine you need to check and be sure that TM is backing up your external drive.
    LN

  • Logical level in Fact tables - best practice

    Hi all,
    I am currently working on a complex OBIEE project/solution where I am going straight to the production tables, so the fact (and dimension) tables are pretty complex, since I am using multiple sources in the logical tables to increase performance. Anyway, what I often struggle with is the Logical Levels (in the Content tab), where the level of each dimension is set. In a star schema (one-to-many) this is pretty straightforward and easy to set up, but when the Business Model (and physical model) gets more complex I sometimes struggle with the aggregates - to get them to work/appear with different dimensions. (Using the menu "More" - "Get Levels" does not always give the best solution... far from it.) I have some combinations of left and right outer joins as well, making it even more complicated for the BI Server.
    For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table, either on Detail or Total level? I can see the use of the logical levels when using aggregate fact tables (on quarter, month, etc.), but is it better just to skip the logical level setup when no aggregate tables are used? Sometimes it seems like that is the easiest approach...
    Does anyone have a best practice concerning this issue? I have googled for this but I haven't found anything good yet. Any ideas/articles are highly appreciated.

    Hi User,
    For instance - I have about 10-12 different dimensions - should all of them always be connected to each fact table? Either on Detail or Total level.
    It is not necessary to connect all of them; it depends on the report that you are creating. But as a best practice you should maintain them all at the Detail level when you specify join conditions in the physical layer.
    For example, for the Sales fact table, if you want to report at the ProductDimension.Productname level then you should use the Detail level; otherwise use the Total level (at the Product or Employee level).
    Get Levels (available only for fact tables) changes the aggregation content. If joins do not exist between fact table sources and dimension table sources (for example, if the same physical table is in both sources), the aggregation content determined by the Administration Tool will not include the aggregation content of this dimension.
    Source: Admin Guide (Get Levels definition)
    thanks,
    Saichand.v

  • High-demand architecture, large reports, distributed printing, best practice

    Hi,
    I have a client with a high demand for printing reports in different regions of the country, and large reports (some of them 10,000 pages). Today they are doing this with Oracle Reports, but the server regularly gets overloaded and some reports never complete. They use reports in PDF format.
    Can you please suggest an appropriate architecture for this scenario, or point me to some documentation on best practices for high-demand and distributed reporting?
    Case description
    They have a centralized Reports Server and reports are used in different remote regions of the country.
    Some reports have 6 pages and others may be more than 10,000 pages.
    The Reports Server has communication with some remote facilities at 10 Mbps and some regional offices at 10 Mbps, but most offices connect at 512 or 1,024 bps.
    The idea would be to have "client reports" in the offices that request only data from the server, so they can reduce network traffic. They believe most problems come from the fact that the Reports Server sends heavy PDFs over the network, causing lags and failures; some reports never complete and have to be resent.
    Some other ideas include: reducing the business logic placed into reports, using regional servers, or even replacing Reports with another tool.
    Any comment or suggestion would be appreciated.
    Thanks in advance for your attention.

    I think a better way is to keep the business logic in the database, generate your reports using Oracle Forms-based parameter forms, and save the results into temporary tables.
    Show some progress bar status on your forms so users can easily see how long they have to wait for the report to generate.
    In the reports there should not be anything more than select * from mytemptables.
    Baig,
    [My Oracle Blog|http://baigsorcl.blogspot.com/]
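
    As a rough illustration of the temp-table pattern described above (a hedged sketch only; the table, procedure, and column names and the run_id mechanism are hypothetical), the database does the heavy lifting while the report query stays trivial:

    -- A plain staging table that the report reads from
    CREATE TABLE my_temp_report
    ( run_id    NUMBER
    , region    VARCHAR2(30)
    , order_id  NUMBER
    , amount    NUMBER
    );

    -- Called from the Forms parameter form before the report is launched
    CREATE OR REPLACE PROCEDURE fill_temp_report
      (p_run_id IN NUMBER, p_region IN VARCHAR2) AS
    BEGIN
      DELETE FROM my_temp_report WHERE run_id = p_run_id;   -- clear any previous run
      INSERT INTO my_temp_report (run_id, region, order_id, amount)
        SELECT p_run_id, o.region, o.order_id, o.amount     -- business logic lives here
        FROM   orders o
        WHERE  o.region = p_region;
      COMMIT;
    END;
    /

    -- The report itself then needs little more than:
    SELECT * FROM my_temp_report WHERE run_id = :p_run_id;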

  • Temp Tables - Best Practice

    Hello,
    I have a customer who uses temp tables all over their application.
    This customer is a novice and the app has its roots in VB6. We are converting it to .NET.
    I would really like to know the best practice for using temp tables.
    I have seen code like this in the app.
    CR2.Database.Tables.Item(1).Location = "tempdb.dbo.[##Scott_xwPaySheetDtlForN]"
    That seems to work, though I do not know why the full tempdb.dbo.[##...] prefix is required.
    However, when I use this in the new report I am working on, I get runtime errors.
    I also tried this:
    CR2.Database.Tables.Item(1).Location = "##Scott_xwPaySheetDtlForN"
    I did not get errors, but I was returned data I did not expect.
    Before I delve into different ways to do this, I could use some help with a good pattern to use.
    thanks

    Hi Scott,
    Are you using the RDC still? It's not clear, but it looks like it.
    We had an API that could piggyback the HDBC handle in the RDC (craxdrt.dll), but that API is no longer available in .NET. Also, the RDC is not supported in .NET, since .NET uses the framework and the RDC is COM.
    The workaround is to copy the temp data into a DataSet and then set the table location to the DataSet. There is no way that I know of to get to tempdb from .NET. The reason is that there is no CR API to set the owner of the table to the user; MS SQL Server locks tempdb so that the user has exclusive rights on it.
    Thank you
    Don
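
    On the original question of why the full tempdb.dbo.[##...] location works: global temporary tables (names beginning with ##) are physically created in tempdb under the dbo schema, so the fully qualified name and the short name refer to the same object. A minimal T-SQL illustration (the table name here is a hypothetical stand-in):

    -- A global temp table lives in tempdb.dbo and is visible to all sessions
    CREATE TABLE ##Scott_Demo (PaySheetId INT, Amount DECIMAL(10, 2));
    INSERT INTO ##Scott_Demo VALUES (1, 100.00);

    -- Both references resolve to the same object while the table exists
    SELECT * FROM ##Scott_Demo;
    SELECT * FROM tempdb.dbo.[##Scott_Demo];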

  • Internal vs. external directory services best practices

    Hello everyone,
    We have two distinct directory services here where I work, one that supports 'internal' needs, and one that is used for external clients, the people who use our web-facing applications. We are limited by the separation of the directory services. E.g., our internal users cannot use the external directory service to look up email addresses.
    I have been asked to look into design options and best practices. Is it common to have distinct services like this? Or are those external users usually part of the same service as the internal users? Is my online banking account information in the same directory service (assuming it is in a directory service at all) as the employees at my bank? Does it make sense to run separate services like this? What are some alternatives?
    Part of the integration problem is AD vs. Sun Directory Server. The external service is in Sun Directory Server and predates AD. The AD service is obviously here for the Windows environment. Some organizations I have worked with in the past used Sun LDAP as the authoritative source of data, and synced in one way or another into AD.
    Any feedback is appreciated,
    Mark

    No, what I am looking for is architectural input regarding the use of AD and a separate LDAP server. In my case I am talking about AD and the SJS Directory Server, but this would apply to any environment that has AD plus some other LDAP server.
    I need to be able to reasonably answer the general question: Why should we keep the SJS Directory Server, when we could just put all our LDAP data into AD?
    I also need to answer the more specific question: Given our LDAP data is external users only (customer, partners), does it make sense to keep them there? Again, why not just put these "external" entities into AD?
    I'm not trying to figure out how to get AD and LDAP to work together. I'm trying to figure out why I have two directories, and why I should or should not keep two directories. I've found nothing online dealing with what should be a very common scenario.
    Mark

  • Premiere Pro + External Hard Drive Best Practices

    For best performance when editing in Premiere, is it recommended to keep raw files and project files on separate drives? Does this make workflow and response time quicker? What are the most efficient and safest options? Thanks

    I see this repeated over and over. I have not found it to be true. I have found over and over again that a single dedicated 7200 rpm internal drive for media will work just fine. This is real-time editing employing mainly DV, HDV, Canon XF and Sony XDCAM up to 50 Mbps. This includes multicamera projects with up to four cameras.
    Since 2002 I have set up at least 10 separate Premiere-based NLE PCs and have witnessed hundreds of projects across these many systems edited with Premiere 6.0, 6.5, Pro 1.5, Pro 2.0, CS4, CS5.x, and CS6. The only time hard disks became an issue was a misadventure using USB external drives.
    I experimented, early in the DV editing days, with advising editors to save their graphics, music, and project files on a separate internal media drive. However, I could never get consistent compliance with that, so for file management's sake I began having users keep all scratch disk settings set to "Same as Project". That way they just needed to be certain to create their project folder on a media drive and save their project into that folder.
    Things have been smooth with that. I think that over the years I've had enough testing to say with confidence that anyone will be fine with it for general use with codecs up to 50Mbps per stream.
    The only place I've found some benefit is in targeting a separate internal drive for exports. This can speed up exports a bit, but I haven't found wild differences.
    So a typical system I might set up or use would map like this:
         C: System (usually a raid0)
         D: internal 7200 rpm Media 1 (Active Premiere Projects)
         G: USB/FireWire external drive storage (inactive Premiere projects and offline file storage)
          Additional video edit space set up as:
         E: internal 7200 rpm Media 2 (Active Premiere Projects)
    Again, lots of experience with this set up and no drive performance issues.

  • Setup internal and external DNS namespaces best practice

    Are an external namespace (e.g. companydomain.com) and an internal namespace (e.g. corp.companydomain.com or companydomain.local) able to run on the same DNS server (using Microsoft Windows DNS servers)?
    MS said it is highly recommended to use a subdomain to handle the internal namespace - say corp.companydomain.com if the external namespace is companydomain.com. How should this be set up? Shall I create my AD DS domain as corp.companydomain.com directly,
    or as companydomain.com and then create a subdomain corp?
    Thanks in advance.
    William Lee
    Hong Kong

    Are an external namespace (e.g. companydomain.com) and an internal namespace (e.g. corp.companydomain.com or companydomain.local)
    able to run on the same DNS server (using Microsoft Windows DNS servers)?
    Yes, it is technically feasible. You can have both of them running on the same DNS server(s); just make sure only your public DNS zone is published for external resolution.
    MS said it is highly recommended to use a subdomain to handle the internal namespace - say corp.companydomain.com
    if the external namespace is companydomain.com. How should this be set up? Shall I create my AD DS domain as corp.companydomain.com directly, or as companydomain.com and then create a subdomain corp?
    What is recommended is to avoid a split-DNS setup (where your internal and external DNS names are the same), because it introduces extra complexity and confusion when managing it.
    My own recommendation is to use .local for the internal zone and .com for the external one.
    This posting is provided "AS IS" with no warranties or guarantees , and confers no rights.

  • External Portal - Security Best Practice

    We will be initiating an external portal for ESS access. For those using ESS from home, what type of additional security access is anyone using if the person happens to lock themselves out of their ESS account? Do you have a security question built into ESS? Are you using a security grid to reset their password? I'm looking to see what other alternatives people are using.
    Thanks
    Pam Major

    Hi Tim: Here's my basic approach for this -- I create either a portal dynamic page or a stored procedure that renders an HTML parameter form. You can connect to the database and render whatever sort of drop-downs, check boxes, etc. you desire. To tie everything together, just make sure when you create the form that the names of the fields match those of the page parameters created on the page. This way, when the form posts to the same page, it appends the values for the page parameters to the URL.
    By coding the entire form yourself, you avoid the inherent limitations of the simple parameter form. You can also use advanced JavaScript to dynamically update the drop downs based on the values selected or can cause the form to be submitted and update the other drop downs from the database if desired.
    Unfortunately, it is beyond the scope of this forum to give you full technical details, but that is the approach I have used on a number of portal sites. Hope it helps!
    Rgds/Mark M.

  • Large amount of data best practices.

    Hello Experts,
    I have a scenario where I have to extract a large volume of data from an SAP system to an external database using SAP PI. The process has to extract about 400,000 rows from SAP and send them to this external database. I guess the best way to insert the data into the database is using the JDBC adapter, but I'm wondering what's the best adapter I can use to communicate between SAP R/3 and SAP PI. What's the best way to send a message of 400,000 rows to SAP PI - files, IDocs, proxies? Could you please tell me if there's any documentation on the topic?
    Thank you in advance.

    Hi,
    In your case, a client proxy to JDBC scenario is the best for performance.
    Please see the link below; it explains the scenario (proxy to JDBC) in detail.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e0ac1a33-debf-2c10-45bf-fb19f6e15649?quicklink=index&overridelayout=true
    Regards,
    Rajesh

  • Building tables - Best practices?

    Hi all,
    I hope this is a good place to ask a general question like this. I'm trying to improve myself as a DB designer/programmer and am wondering what the current practices are for deploying a database that must be kept running at the highest possible performance (as far as selecting data and keeping the database clean goes).
    Basically, here are the specific topics of concern for me:
    - table sizing
    - index sizing
    - oracle parameter tuning
    - maintenance work required to be done on tables/indexes
    The things I've studied were all based on Oracle 8i, and I'm wondering if much has changed for 9i and/or 10g.
    Thanks.
    Peter

    Actually, I'm not very new to database work now, but I still consider myself not quite proficient in certain aspects of a typical DBA's role. For that reason, I'm trying to keep my questions very general, as though I'm learning them afresh.
    It does seem that I'm trying to ask something that is too broad to bring up in forum discussions... I'll go back and do some independent studies then come back to the forum with better questions. :)
    When looking through the 10g bug reports on Metalink, I became uncomfortable about some issues people have been running into (it has been a while since I did the initial evaluation, and I forget which specific issues I looked at). I realized that Oracle 10g provides a lot of conveniences with the new web-based EM and EM Server (I am especially interested in the new reports and built-in automations Oracle provides), and also with grid deployments for high-availability systems, but we have been held back by many reasons from going forward with 10g at this time. Having said that, moving to 10g is still planned for the future, so I am continuing the evaluation in several aspects specific to our design, to determine what we can use and/or abandon in our existing deployment processes.
    Thanks for everyone's time, best wishes.
    Peter
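
    On the "maintenance work required to be done on tables/indexes" point from the original list, a minimal sketch of routine Oracle maintenance might look like the following (the APP schema, ORDERS table, and index names are hypothetical examples, not recommendations for any specific system):

    -- Refresh optimizer statistics so the CBO sees current row counts and histograms
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'APP', tabname => 'ORDERS', cascade => TRUE);
    END;
    /

    -- Rebuild an index that has become inefficient (for example after heavy deletes)
    ALTER INDEX app.orders_pk REBUILD;

    -- Review segment growth as input to table/index sizing decisions
    SELECT segment_name, segment_type, ROUND(bytes / 1024 / 1024) AS size_mb
    FROM   dba_segments
    WHERE  owner = 'APP'
    ORDER  BY bytes DESC;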

  • Best Practices for Creating eLearning Content With Adobe

    As agencies are faced with limited resources and travel restrictions, initiatives for eLearning are becoming more popular. Come join us as we discuss best practices and tips for groups new to eLearning content creation, and the best ways to avoid complications as you grow your eLearning library.
    In this webinar, we will take on common challenges that we have seen in eLearning deployments, and provide simple methods to avoid and overcome them. With a little training and some practice, even beginners can create engaging and effective eLearning content using Adobe Captivate and Adobe Presenter. You can even deploy content to your learners with a few clicks using the Adobe Connect Training Platform!
    Sign up today to learn how to:
    -Deliver self-paced training content that won't conflict with operational demands and unpredictable schedules
    -Create engaging and effective training material optimized for knowledge retention
    -Build curriculum featuring rich content such as quizzes, videos, and interactivity
    -Track program certifications required by Federal and State mandates
    Come join us Wednesday May 23rd at 2P ET (11A PT): http://events.carahsoft.com/event-detail/1506/realeyes/
    Jorma_at_RealEyes
    RealEyes Connect

    You can make it happen by creating a private connection for the 40 users via a CAPI script and, when creating the portlet, selecting the 2nd option in the "Users Logged In" section. With this, the portlet uses their own private connection every time a user logs in.
    That way it won't ask for a password.
    Another thing: there is an option for entering the password or not in ASC, in the Discoverer section, if your version is 10.1.2.2. Let me know if you need more information.
    thanks
    kiran

  • Best practices for data representation

    I'm curious about the best data representation for a constant or variable when there is an obvious choice of two.
    For example, take the Timeout terminal of the Event structure. This terminal takes a Long (I32) data type, but I'm wiring a constant value of 100 to it and therefore could use an Unsigned Byte (U8). Setting the constant to I32 prevents an automatic conversion step from happening, but setting it to U8 saves a little bit of unnecessarily allocated space.
    Which is better?

    Practically speaking, it more than likely will not matter until the data sets get large; however, as best practices go, it is best to keep the data consistent and in the type that the control, property node, etc. expects. Directly from the NI user manual (LV 7.1):
    "Coercion dots appear on block diagram nodes to alert you that you wired two different numeric data types together. The dot means that LabVIEW converted the value passed into the node to a different representation. Coercion dots can cause a VI to use more memory and increase its run time. Try to keep data types consistent in VIs."
    Cheers,
    --Russ

  • Best practice for extracting data to feed external DW

    We are having a healthy debate with our EDW team about extracting data from SAP.  They want to go directly against ECC tables using Informatica and my SAP team is saying this is not a best practice and could potentially be a performance drain.  We are recommending going against BW at the ODS level.  Does anyone have any recommendations or thoughts on this?

    Hi,
    As you asked for best practice, here it is in an SAP landscape.
    1. Full load or delta load data from SAP ECC to SAP BI (BW): SAP BI understands the data element structures of SAP ECC, and the delta mechanism is the continuous process of loading data from SAP ECC (the transactional system) into BI (the analytic system).
    2. You can store transaction data in DSOs (at a granular level) and in InfoCubes (at a summarized level) within SAP BI. You can have master data from SAP ECC coming into SAP BI separately.
    3. Within SAP BI, you SHOULD use the Open Hub service to provide SAP BI data to other external systems. You must not connect an external extractor to fetch data from DSOs and InfoCubes into the target system. The Open Hub service is the tool that facilitates feeding data to external systems. You can have Informatica take data from the Open Hub destinations of SAP BI.
    Hope I have explained this to your satisfaction.
    Thanks,
    S

  • Best practice for external but secure access to internal data?

    We need external customers/vendors/partners to access some of our company data (view/add/edit). It's not so easy to segment those databases/tables/records out from the other existing ones (and put separate database(s) in the DMZ where our web server is). Our current solution is to have a port 1433 hole from the web server into our database server. The user credentials are not in any sort of web.config but rather compiled into our DLLs, and that SQL login has read/write access to a very limited number of databases.
    Our security group says this is still not secure, but how else are we to do it? Even with a web service, there still has to be a hole somewhere. Any standard best practice for this?
    Thanks.

    Security is mainly about mitigation rather than being 100% secure; "we have unknown unknowns". The component needs to talk to SQL Server. You could continue to use HTTP to talk to SQL Server, perhaps even get SOAP transactions working, but personally I'd have more worries about using such a less-trodden path, since that is exactly the area where more security problems are discovered. I don't know your specific design issues, so there might be even more ways to mitigate the risk, but in general you're using a DMZ as a decent way to mitigate risk. I would recommend asking your security team what they'd deem acceptable.
    http://pauliom.wordpress.com
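
    One concrete mitigation already hinted at in the question, keeping the compiled-in SQL login tightly scoped, looks roughly like this in T-SQL (the login, database, and table names are hypothetical):

    -- A dedicated login for the web component, with no server-level roles
    CREATE LOGIN portal_app WITH PASSWORD = 'use-a-strong-generated-secret';

    -- Map it into only the one database it needs
    USE CustomerPortalDb;
    CREATE USER portal_app FOR LOGIN portal_app;

    -- Grant only the specific access required, nothing database-wide
    GRANT SELECT, INSERT, UPDATE ON dbo.CustomerRequests TO portal_app;

    This does not remove the need for the port 1433 hole, but it limits what an attacker can reach through it, which is the kind of risk mitigation the reply above describes.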
