Slow due to huge number of tables

Hi,
unfortunately we have a really huge number of tables in the (Advantage Server) database.
About 18,000+ tables.
Firing the ActiveX preview through the RDC, or just running a preview in the designer, slows down to a crawl.
Any hints? (Besides getting rid of that many tables.)
Thanks
Oskar

Hi Oskar
The performance of a report is related to:
External factors:
1. The amount of time the database server takes to process the SQL query.
    ( Crystal Reports sends the SQL query to the database; the database processes it and returns the data set to Crystal Reports. )
2. Network traffic.
3. Local computer processor speed.
    ( When Crystal Reports receives the data set, it generates a temp file to further filter the data when necessary, as well as to group, sort, process formulas, and so on. )
4. The number of records returned.
    ( If a SQL query returns a large number of records, it will take longer to format and display than if it were returning a smaller data set. )
Report design:
1. Where the Record Selection is evaluated.
    Ensure your Record Selection Formula can be translated into SQL, so the data can be filtered down on the server; otherwise the filtering will be done in a temp file on the local machine, which will be much slower.
Many functions cannot be translated into SQL because there is no standard SQL equivalent for them.
For example, a control structure like IF THEN ELSE cannot be translated into SQL; it will always be evaluated in Crystal Reports. If you use an IF THEN ELSE on a parameter, the result of the condition can be converted to SQL, but as soon as the condition uses database fields it will not be translated into SQL.
2. How many subreports the report contains and in which sections they are located.
Minimise the number of subreports used, or avoid subreports entirely if possible:
a subreport is a report within a report, and if you place a subreport in a details section and the report returns 100 records, the subreport will be evaluated 100 times, so it will query the database 100 times. This is often the biggest reason a report takes a long time to preview.
3. How many records will be returned to the report.
   A large number of records will slow down the preview of the report. Ensure you return only the necessary data, by creating a Record Selection Formula, or by basing your report
on a Stored Procedure or a Command Object that returns only the desired data set.
4. Whether you use the special field "Page N of M" or "TotalPageCount".
   When the special field "Page N of M" or "TotalPageCount" is used on a report, every page of the report must be generated before the first page can be displayed, therefore it will take more time to display the first page.
    If you want to improve the speed of a report, remove the special field "Page N of M" or "TotalPageCount", or any formula that uses the function "TotalPageCount". When those aren't used, viewing a report only formats the page requested; it won't format the whole report.
5. Link tables on indexed fields whenever possible.
6. Remove unused tables, unused formulas, unused running totals from the report.
7. Suppress unnecessary sections.
8. For summaries, use conditional formulas instead of running totals when possible.
9. Whenever possible, limit records through selection, not suppression.
10. Use SQL Expressions to convert fields used in record selection instead of using formula functions.
For example, if you need to concatenate two fields, instead of doing it in a formula you can create a SQL Expression Field. The fields are then concatenated on the database server instead of in Crystal Reports.
SQL Expression Fields are added to the SELECT clause of the SQL query sent to the database (see the sketch after this list).
11. Using a single Command Object as the data source can be faster, provided the SQL query you write returns only the desired data set.
12. Perform grouping on the server.
   This is only relevant if you need to return only the summary to your report, not the details. It will be faster because less data is returned to the report.
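To illustrate point 10, here is a minimal sketch of the query Crystal might send when a SQL Expression Field does the concatenation. The table and field names are hypothetical, and the quoting and concatenation operator depend on your database and driver:
    -- Hypothetical example: a SQL Expression Field defined as
    -- "Customer"."FirstName" || ' ' || "Customer"."LastName"
    -- is added to the SELECT clause of the query Crystal sends,
    -- so the concatenation and the filter both run on the server:
    SELECT "Customer"."CustomerID",
           "Customer"."FirstName" || ' ' || "Customer"."LastName"
    FROM   "Customer"
    WHERE  "Customer"."FirstName" || ' ' || "Customer"."LastName" = 'Jane Doe'
Doing the same concatenation in a Crystal formula instead forces every row back to the client before the filter can be applied.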
Regards
Girish Bhosale

Similar Messages

  • Local client copy error - Table "BKPF" not edited due to excessive number

    Hi,
    I am performing a local client copy on our ECC 6.0 system. The client copy has taken more than 12 hours and it's still in progress, with 7 tables still to be copied. Logs from SCC3 -
    ===
    2 ETA163 Table "BKPF" not edited due to excessive number of errors
    2 ETA057 WARNING: Cancelled several times
    2 ETA163 Table "COEP" not edited due to excessive number of errors
    3 ETA311 Process "00002" started on server "sfrndevsap15"
    (time: "17:20:51")
    2 ETA297 Error: Table "ONROV" error in DDIC - Check table with SE14
    2 ETA057 WARNING: Cancelled several times
    2 ETA163 Table "FAGLFLEXA" not edited due to excessive number of errors
    3 ETA311 Process "00003" started on server "sfrndevsap15"
    (time: "17:20:52")
    4 ETA346 "FAGLFLEXC :"" 0 0 0 DEL.""
    0 0 17:20:52"
    2 ETA057 WARNING: Cancelled several times
    2 ETA163 Table "HRP1001" not edited due to excessive number of errors
    3 ETA311 Process "00004" started on server "sfrndevsap15"
    (time: "17:20:52")
    4 ETA346 "VSAFVU_CN :"" 0 86123 0 DEL.""
    0 29 17:21:21"
    2 ETA057 WARNING: Cancelled several times
    2 ETA163 Table "PCL4" not edited due to excessive number of errors
    ===
    Please help ASAP, as this is affecting our testing team. I have also tried copying individual tables using a transport request number and SCC1 to copy them to the target client. This is also taking quite a long time.
    Thanks in advance,
    Abdul

    Hi Abdul,
    What does your system log say ?
    Check this SAP Note
    Note 579783 - CC ERROR: Loss of data - table not copied
    Run the report RSCC_VERIFY to check whether the table is consistent within the source client.
    SAP will only provide support for RSCC_VERIFY messages if there is a specific problem in an application in the target client that can be assigned immediately to the messages.
    RSCC_VERIFY is delivered with SAPKB46B49, SAPKB46C35 and SAPKB46D24. In older releases, you can create the report in the ABAP editor (transaction SE38) in accordance with the attached advance correction. Use the development class or the STRM package for this.
    Regards,
    Siddhesh

  • Huge number of unprocessed logging table records found

    Hello Experts,
    I am facing an issue where a huge number of unprocessed logging table records were found in the SLT system for one table. I have checked all settings and error logs but found no evidence of what is causing the unprocessed records. In the HANA system the table also shows as being in replicated status. Could you please suggest something other than replicating the same table again, as that option is not possible at the moment.

    Hi Nilesh,
    What are the performance impacts on the SAP ECC system when multiple large SAP tables like BSEG are replicated at the same time? Is there a guideline for a specific volume or kind of tables?
    There is no explicit guideline, since aspects such as server performance and the change rate of the tables are also relevant. As a rule of thumb, one dedicated replication job per large table is recommended.
    (from the SLT FAQ)
    See also: "How to enable parallel replication before DMIS 2011 SP6" (do not ignore it just because it says SP06 - go through it) and "How to improve the initial load".
    Regards,
    V Srinivasan

  • Implementing PaaS (CloudFoundry/BOSH) feeds a huge number of (unwanted) ProtectionServers into DPM

    Hi
    We have a Hyper-V cluster with VMs on Cluster Shared Volumes. We are using System Center (2012 R2) VMM and DPM for backing up the core infrastructure and a selection of VMs from some of our VMM clouds.
    Ongoing work is to implement a PaaS with Hyper-V/System Center, using CloudFoundry/BOSH. Due to a lot of unit tests, many short-lived VMs are created that we never want to back up with DPM. It looks like the DPM Agent on the Hyper-V hosts in the
    cluster feeds all VMs that it sees into DPM as a ProtectionServer - even though these VMs do not have any DPM Agent installed.
    Running this SQL against the DPM database:
    SELECT count(*) FROM [DPMDB_MyDPM].[dbo].[vw_DPM_Server]
    gives 5531 entries. A huge number of these entries are from the PaaS CloudFoundry/BOSH-created VMs in the IaaS. These are VMs that have never been backed up by DPM (or had any DPM agent installed).
    We see the same using PowerShell:
    PS C:\> $ps = Get-ProductionServer | Where-Object {$_.Name -like "*bosh_vm_being_created*"}
    PS C:\> $ps.length
    5294
    PS C:\>
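    For reference, the count above can be narrowed to just the unwanted entries with a hedged SQL variant; the column name below is an assumption based on the PowerShell Name property, so check the view definition first:
    -- Hypothetical column name; verify against vw_DPM_Server before relying on it
    SELECT COUNT(*)
    FROM [DPMDB_MyDPM].[dbo].[vw_DPM_Server]
    WHERE ServerName LIKE '%bosh_vm_being_created%'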
    * The huge number of (unwanted) Production Servers in our DPM causes our DPM to get slower and slower, and we see that our DPM SQL Database is working more and more. Today we are only taking backups of around 20 VMs, the System Center MSSQL, and a few
    hosts with the DPM agent.
    Question 1 - How can we remove these unwanted ("bosh_vm_being_created") ProductionServers from our DPM? They do not have any DPM agent installed, they have no recovery point in DPM. But still they are listed in DPM as a ProductionServer. Why?
    Question 2 - How can we configure DPM to filter out these PaaS/CloudFoundry/Bosh VM's so that they do not reach the DPM system?
    Br. Rune

    Hi
    Unfortunately I have no solution in my case yet. The number of ProductionServers on our DPM server keeps growing, and our DPM keeps getting slower.
    PS C:\> $ps = Get-DPMProductionServer -DPMServerName <DPMhostname>
    PS C:\> $ps.length
    8525
    PS C:\>
    It must be a purge job that is not cleaning old VMs (objects) out of the DPM server, I guess. In our VMM we only have around 200 VMs, so most of the ProductionServers in our DPM are old VMs that no longer exist.
    When we try to use the
    Remove-ProductionServer.ps1 PowerShell script to remove one of these ProductionServers, we get an error because the VM no longer exists (and the VM does not have any agent installed).
    Do anyone have any experience with this?
    Br. Rune

  • Sql query slowness due to rank and columns with null values:

        
    I have the following table in the database, with around 10 million records:
    Declaration:
    create table PropertyOwners (
    [Key] int not null primary key,
    PropertyKey int not null,
    BoughtDate DateTime,
    OwnerKey int null,
    GroupKey int null
    )
    go
    [Key] is primary key and combination of PropertyKey, BoughtDate, OwnerKey and GroupKey is unique.
    With the following index:
    CREATE NONCLUSTERED INDEX [IX_PropertyOwners] ON [dbo].[PropertyOwners]
    (
    [PropertyKey] ASC,
    [BoughtDate] DESC,
    [OwnerKey] DESC,
    [GroupKey] DESC
    )
    go
    Description of the case:
    For a single BoughtDate, one property can belong to multiple owners or a single group; for a single record there can be either OwnerKey or GroupKey but not both, so one of them will be null for each record. I am trying to retrieve the data from the table using the
    following query for the OwnerKey. If there are rows for the same property for both owners and a group at the same time, then the rows having OwnerKey are to be preferred; that is why I am using "OwnerKey desc" in the RANK function.
    declare @ownerKey int = 40000   
    select PropertyKey, BoughtDate, OwnerKey, GroupKey   
    from (    
    select PropertyKey, BoughtDate, OwnerKey, GroupKey,       
    RANK() over (partition by PropertyKey order by BoughtDate desc, OwnerKey desc, GroupKey desc) as [Rank]   
    from PropertyOwners   
    ) as result   
    where result.[Rank]=1 and result.[OwnerKey]=@ownerKey
    It is taking 2-3 seconds to get the records, which is too slow, and it takes a similar time when I get the records using the GroupKey. But when I tried to get the records for the PropertyKey with the same query, it executed in 10 milliseconds.
    Maybe the slowness is because OwnerKey/GroupKey in the table can be null and SQL Server is unable to use the index for them. I have also tried to use an indexed view to pre-rank them, but I can't use it in my query, as the RANK function is not supported in indexed
    views.
    Please note this table is updated once a day, and we are using SQL Server 2008 R2. Any help will be greatly appreciated.

    create table #result (PropertyKey int not null, BoughtDate datetime, OwnerKey int null, GroupKey int null, [Rank] int not null)
    Create index idx ON #result (OwnerKey, [Rank])
    insert into #result(PropertyKey, BoughtDate, OwnerKey, GroupKey, [Rank])
    select PropertyKey, BoughtDate, OwnerKey, GroupKey,
    RANK() over (partition by PropertyKey order by BoughtDate desc, OwnerKey desc, GroupKey desc) as [Rank]
    from PropertyOwners
    go
    declare @ownerKey int = 1
    select PropertyKey, BoughtDate, OwnerKey, GroupKey
    from #result as result
    where result.[Rank]=1
    and result.[OwnerKey]=@ownerKey
    go
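    Another hedged option, as an untested sketch (filtered indexes exist from SQL Server 2008 onward): since OwnerKey and GroupKey are often NULL, a filtered index that excludes the NULL rows may let the optimizer seek on OwnerKey directly:
    -- Sketch only: a filtered index per nullable key column;
    -- the trailing columns mirror the RANK() partition/order columns
    CREATE NONCLUSTERED INDEX IX_PropertyOwners_Owner
    ON dbo.PropertyOwners (OwnerKey, PropertyKey, BoughtDate DESC)
    WHERE OwnerKey IS NOT NULL
    go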
    Best Regards, Uri Dimant (SQL Server MVP)
    http://sqlblog.com/blogs/uri_dimant/

  • Query using system parameter LEVEL returns incorrect huge number of records

    We migrated our database from Oracle 9.2.0.6 to 11.2.0.1.
    The query below throws "ORA-01788: CONNECT BY clause required in this query block".
    select * from (
    select a.BOARD_ID, code, description, is_displayable, order_seq, board_parent_id, short_description, IS_SUB_BOARD_DISPLAYABLE, LEVEL child_level, sp_board.get_parent_id(a.board_id) top_parent_id, is_top_selected isTopSelected
    from boards a, ALERT_MESSAGE_BOARD_TARGETS b
    where a.board_id = b.board_id and is_displayable = 'Y' and alert_message_id = 5202) temp
    start with board_parent_id = 0
    connect by prior board_id = board_parent_id
    ORDER SIBLINGS BY order_seq;
    Based on online resources, we modified the hidden parameter _allow_level_without_connect_by by executing the statement:
    alter system set "_allow_level_without_connect_by"=true scope=spfile;
    After performing the above, ORA-01788 is resolved.
    The new issue is that the same query above returns 9,015,853 records in 11g, while in 9i it returns 64 records. 9i returns the correct number of records. The cause of 11g returning the greater number of records is the pseudocolumn LEVEL used in the query.
    Why is 11g returning an incorrect, huge number of records?
    Any assistance to address this is greatly appreciated. Thanks!

    The problem lies in the query.
    Oracle's LEVEL should not be used inside a subquery. After LEVEL is moved to the main query, the number of returned records is the same as in 9i.
    select c.BOARD_ID, c.code, c.description, c.is_displayable, c.order_seq, c.board_parent_id, c.short_description, c.IS_SUB_BOARD_DISPLAYABLE, LEVEL child_level, c.top_parent_id, c.isTopSelected
    from (
    select a.BOARD_ID, code, description, is_displayable, order_seq, board_parent_id, short_description, IS_SUB_BOARD_DISPLAYABLE, sp_board.get_parent_id(a.board_id) top_parent_id, is_top_selected isTopSelected
    from boards a, ALERT_MESSAGE_BOARD_TARGETS b
    where a.board_id = b.board_id and is_displayable = 'Y' and alert_message_id = 5202
    ) c
    start with c.board_parent_id = 0
    connect by prior c.board_id = c.board_parent_id
    ORDER SIBLINGS BY c.order_seq

  • Huge number of Managed Properties

    I've recently started at a new company and inherited the existing SharePoint farm. I've been looking at search, as it seems quite slow crawling content. One thing I have noticed is that there is a huge number of Managed Properties, >5000. There are
    pages and pages of them.
    There are only ~1800 Crawled Properties, so I'm not really sure why there are so many Managed Properties.
    I have noticed that the SharePoint and Office categories have the 'Automatically generate a new managed property' option enabled. The farm uses a number of 3rd-party add-ons, and I'm not sure at this point if they are responsible, or if they require the
    automatically generated properties option.
    Just wondering if anyone had seen this or may have an idea?
    Cheers

    Hey Scott, thanks for replying.
    The Managed Property mapping looks normal: each is mapped to a single Crawled Property.
    Although having >4000 Managed Properties each mapped to a single Crawled Property is weird.
    I don't know where the Crawled Property is coming from, though. This is 2010, so I can't use the
    SiteCollection property of the Get-SPEnterpriseSearchMetadataCrawledProperty command to filter. I'm not sure there is another way of figuring that out.
    I'll probably end up trying to delete all of these mapped properties, or just create a new Search Service Application and start from scratch.

  • Number of tables in from clause

    Hi,
    Can anyone tell me how many tables we can have in the FROM clause in Oracle?

    Over 800 so far without any error. :)
    But boy does it run slow :(
    select 1 from
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,
    dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual,dual

  • Improve InDesign performance with a huge number of links?

    Hi all,
    I am working on a poster infographic with a huge number of links, specifically around 4500. I am looking to use InDesign over Illustrator for its object styles (vs. graphic styles in Illustrator) and for its interactive capabilities.
    The issue I am having is InDesign's performance with this many links. My computer is not maxed out on resources when InDesign is going full power, but InDesign is still very slow.
    So far, here are the things I have tried:
    Setting display performance to Fast
    Switching from linked AI files to SVGs
    Turning off preflight
    Turning off live-draw
    Turning off save preview
    Please let me know if you have any suggestions on how to speed up InDesign! System specs below:
    Lenovo w520
    8GB DDR3 @1333mhz
    nVidia 2000M, 2GB GDDR
    Intel Core i7 2760QM @ 2.4GHz
    240GB Samsung 840 SSD
    Adobe CS6
    Windows 8
    The only other thing I can think to try is to break up the poster into multiple pages/docs and then combine it later, but this is not ideal. Thank you all for your time.
    Cheers,
    Dylan Halpern

    I am not a systems expert, but I wonder if you were to hide the links and keep InDesign from accessing them, it might help. Truly just guessing.
    Package the file so all the graphics are in a single folder. Then set the File Handling preferences so InDesign doesn't yell at you when it can't find the links.
    Then quit InDesign and move the folder to a new place. Then reopen InDesign. The preview of the graphics will suck, but it might help. And one more thing: close the Links panel.

  • Crystal Reports - Connecting to databases - Limit to Number of Tables?

    I am using CR 2008 and attempting to build a report. The database I connect to contains over 5000 tables in the db schema. The list of tables presented in the GUI to build the report is just a portion of the total number of tables ('A' to 'O', displayed alphabetically). Is there a limit to the number of tables that can be displayed to build the contents of a report? If so, what is the limit? Is there a workaround?

    Hello,
    This is by design and a limit of how CR works, due to limitations in your PC's resources: CR loads all of that info into memory. If there are too many tables to list (typically more common when using Oracle, given its ability to have thousands of tables), right-click on the connection in the Database window and select Options; you can add filtering to limit what you see.
    That is the only option you have to get to all the tables required for your report. Don't add tables if they are not required or if you can't link them. Workarounds are to create a collection of Stored Procedures or Views so you see just what you need, as in the sketch below.
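    For instance, a minimal sketch of the view approach (all object names here are hypothetical):
    -- A narrow reporting view: the report connects to one object
    -- instead of browsing thousands of base tables
    CREATE VIEW rpt_OrderSummary AS
    SELECT o.OrderID, o.OrderDate, c.CustomerName
    FROM Orders o
    JOIN Customers c ON c.CustomerID = o.CustomerID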
    Thank you
    Don

  • Huge number of page faults .. almost 5 million in 4 hrs operation .. running XP on HP Pavilion laptop

    I prefer Firefox to IE for a number of reasons.
    Lately I am experiencing slow operations and noticed a huge number of page faults for firefox.exe: 4,710,333 in 4 hrs of operation, per Task Manager.
    Running XP on an HP Pavilion laptop with 2 GB memory and Avast virus protection.
    3 recent scans show no viruses. Cleared all cookies etc. and .tmp files.
    Yahoo Messenger 11 and svchost 1560 also have high page faults: 1,971,264 and 1,193,792 respectively.

    Re: excessive page faults. Closing unneeded apps is not helpful. The excessive page faults (27 million after running Firefox for about 3 hours) happen even if FF is the only thing running apart from WinXP files and minor antivirus software.
    This only started when I downloaded FF 8 in Nov 2011. I am really fed up because eventually things slow down and webpages won't open. I would really like to go back to an earlier version. So, does anyone else have a solution to this?

  • Limitation on the number of tables in a Database Schema!

    Hi All,
    Is there a limitation on the number of tables present in a schema that can be fetched in Designer while creating a universe?
    The customer is using an Oracle schema which contains 22,000 tables, and while trying to insert tables in Designer XIR2 (or trying to open the table browser), Designer hangs.
    In BO 6.5 there are no issues retrieving the tables from the schema in Designer.
    Is there a way to retrieve only a certain number of tables in Designer XIR2?
    Thanks for your help!

    Hi Ganesh,
    Following are the answers regarding your queries:
    Query 1:
    There is no limitation on the number of components (objects, classes, tables, joins, hierarchies, LOVs, etc.) in a universe. But of course as the number of components increases, you could run into performance problems.
    This depends on available RAM and processing speed.
    Query 2:
    There is no such option to select the number of tables to be automatically inserted into the universe. Suppose you have 22,000 tables in the DB and you want only 1,000 of them; if you entered 1,000 as the value, how would Designer know which tables you want from the schema to build the universe?
    It all depends on the DBA and the universe designer which tables are important for the organization's reporting needs.
    When you create a connection to the DB, the connection will fetch all tables from the database, and we can't limit DB data retrieval.
    I hope this helps...
    Thanks...
    Pratik

  • How can I organize a huge number of events?

    I just began with Mac and iPhoto. I imported a lot of photos from Windows (Google Picasa) into iPhoto (>17,000). Now I have a huge number of events, which I would like to sort by, e.g., years and months. Within the years and months, the events should be placed.
    Is this possible or what would you suggest to handle a lot of photos?
    Thank you in advance for any helpful answer.
    Alumsch

    alumsch wrote: (question quoted above)
    Assuming that you have events sorted by date (Events menu ==> Sort Events), your events will be sorted by date & time. You cannot create a substructure within events; an event is a large flat set of photos.
    If you want a hierarchical structure, use albums and folders: albums hold photos, and folders hold albums or other folders. You can also use smart albums to instantly find all photos from a date or a date range, or the search window in the lower left.
    Events are a very basic, inflexible, and pretty much automatic organization - just a starting point to hold photos.
    I generally merge trips into a single event and leave the others time-based; others merge even more, having events like "1st quarter 2010", etc.
    LN

  • TS4020 I had to buy a new iPhone, how do I retrieve my stuff that was backed up in my iCloud account? The guy at the apple store said it could take a few days iCloud was running real slow due to so many new accounts, is that true?

    I had to buy a new iPhone. How do I retrieve my stuff that was backed up in my iCloud account? The guy at the Apple Store said it could take a few days, as iCloud was running really slow due to so many new accounts. Is that true?

    No problems backing up iCloud here.
    Tap Settings > iCloud > Storage & Backup
    Switch iCloud Backup On

  • Lightroom or Photoshop Elements for administering a huge number of photos?

    Dear photo experts,
    at home, we have a huge number of photos we took over the years. We are looking for software able to organize all of them. We currently have:
    Adobe Photoshop Elements 10 (Mac)
    Adobe Lightroom 3 (Mac)
    However, for organizing photos we so far use neither of them but a third program (and we are not happy with how it handles our huge catalog of photos).
    Our photos are stored on a server (network attached storage), organized by date and event.
    I read that Adobe Photoshop Elements cannot organize photos stored on a network drive.
    Now my questions:
    - Can Adobe Lightroom organize photos stored on a network drive?
    - Is Adobe Lightroom capable to organize a huge amount of photos in one single catalog (separating them via tags)?
    What are your experiences?
    Thanks a lot!
    JMickey

    I read that Adobe Photoshop Elements cannot organize photos stored on a network drive.
    I thought the opposite was true
    Can Adobe Lightroom organize photos stored on a network drive?
    As we say here in Rochester, NY, YES it can
    Is Adobe Lightroom capable to organize a huge amount of photos in one single catalog (separating them via tags)?
    Yes, this is one of Lightroom's strengths
    What are your experiences?
    I use Lightroom for all of my photo management (yes, I said ALL). I never use the operating system for photo management. Lightroom works great. People here in this forum who have much larger catalogs than I do (over 1/4 million photos) also use Lightroom to manage their photos.
