Large Dimension Update

A mapping that uses MERGE to refresh a dimension works fine, but it updates every record whether or not anything has changed, so an update on a large dimension becomes inefficient/impractical.
I guess if the staging table/cursor only contains the changed records to start with, then only the changed records are updated, so it's feasible. My problem is that the source records do not have a "last updated" date - a change data capture (CDC) issue.
Now that I've typed that, I'm thinking of a filter/join between staging and the dimension that returns only the non-matching rows, with the result used to update the dimension (I suppose I should be doing that anyway to ensure new descriptions aren't null, etc.) - see the sketch below.
Any other suggestions?
Thanks!
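
For what it's worth, the filter/join idea might look roughly like this - a minimal sketch only, assuming hypothetical STG_CUSTOMER and DIM_CUSTOMER tables that share a natural key CUSTOMER_NK and a couple of type 1 attributes (surrogate key handling left out for brevity):

-- Only rows that are new or whose attributes actually differ reach the MERGE,
-- so unchanged dimension rows are never rewritten.
MERGE INTO dim_customer d
USING (
    SELECT s.customer_nk, s.cust_name, s.cust_region
    FROM   stg_customer s
    LEFT OUTER JOIN dim_customer t
           ON t.customer_nk = s.customer_nk
    WHERE  t.customer_nk IS NULL                                -- new member
       OR  NVL(t.cust_name,   '~') <> NVL(s.cust_name,   '~')   -- changed attribute
       OR  NVL(t.cust_region, '~') <> NVL(s.cust_region, '~')
) chg
ON (d.customer_nk = chg.customer_nk)
WHEN MATCHED THEN UPDATE SET
     d.cust_name   = chg.cust_name,
     d.cust_region = chg.cust_region
WHEN NOT MATCHED THEN INSERT
     (d.customer_nk, d.cust_name, d.cust_region)
     VALUES (chg.customer_nk, chg.cust_name, chg.cust_region);

(The '~' sentinel is just a NULL-safe shortcut for text columns; a MINUS between the staging and dimension attribute sets would do the same filtering without it.)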

Then again, it's a completely impractical approach if the dimension has a lot of changeable attributes (as large dimensions probably would!). So, back to the root of the problem - if changes are not recorded at source, what are the options?
1. Get changes recorded at source (but maybe tracking them across 20 tables for one dimension creates more inefficiency than a full refresh?)
2. A full refresh of the dimension is possibly workable for low data volumes, but I still need to reduce the updates while keeping the keys. So either use the method above, or alternatively create a new table using the existing keys but with attributes from source where available, then use a post-mapping process to drop the existing dimension and rename the new one - a full refresh/update in a fraction of the time? (See the sketch below.)
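
For the drop-and-rename variant in option 2, the post-mapping step might look something like this - again only a sketch, with the same hypothetical DIM_CUSTOMER / STG_CUSTOMER names and an assumed DIM_KEY surrogate key column; indexes, constraints and grants would all need recreating, and brand-new members would still need surrogate keys assigned:

-- Build the replacement dimension: keep the existing surrogate keys,
-- take the current attribute values from staging.
CREATE TABLE dim_customer_new AS
SELECT d.dim_key,
       d.customer_nk,
       s.cust_name,
       s.cust_region
FROM   dim_customer d
JOIN   stg_customer s
       ON s.customer_nk = d.customer_nk;

-- Swap the tables as a post-mapping process.
DROP TABLE dim_customer;
ALTER TABLE dim_customer_new RENAME TO dim_customer;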
Then again... "fitness for purpose" - just create materialized views/tables with the current view of the data & ignore a lot of DW fundamentals - make reporting easier/safer than using the transactional system. What does all the extra work give us? Better user interface, historical comparisons, better maintainability? Does the customer actually require it?

Similar Messages

  • Rapidly changing very large dimensions  in OWB

    Hello,
    How are rapidly changing, very large dimensions supported in OWB?
    Is it supported directly by the tool, or do we have to depend on PL/SQL?
    If supported, is it supported by all versions of OWB from 9.0.3 through 10g?
    TIA

    Hi
    Use MERGE (insert/update or update/insert) if you have to both update and insert.
    Ott Karesz
    http://www.trendo-kft.hu

  • Profit center dimension update (urgent)

    Hi Friends,
    When I post a document in FI, it is updated in FI and in the consolidation report ZD (Company Dimension), but it is not updated in ZP (Profit Center Dimension).
    Please help me with this - how will it be updated? Is there a process for Profit Center dimension updates?
    K.Satish
    09892421477

    In 2007A it is possible to select the Profit Center dimension on the Outgoing Payment.

  • Are large dimensions a problem?

    Hello, I am looking to possibly purchase Premiere for some video editing.  I have been using Camtasia as a hack method, but I've learned that Camtasia does not deal well with large dimensions.  I'm looking to request authorization to purchase Premiere, but I want to ensure that Premiere is the tool I need for what I'm doing.
    In short, I am taking rather large screen motion captures for instructional purposes.  I need to blur out confidential information and silence portions of the audio.  The dimensions of these captures are 1920 x 1200.  The durations of these videos range from about 40 seconds to 14 minutes (the latter being about 1.45 GB in size).
    Has anyone worked with movies of these dimensions in Premiere?  I'm hoping to find some anecdotes that I can bring to my manager before making this request.  I'd appreciate any input folks have on this.
    Kevin

    Kevin,
    If you are looking to purchase, I would assume that you are looking at PrPro CS4. Is that correct? Unfortunately, you have posted to the Premiere (precursor to PrPro) forum. Maybe one of our tireless MODs will move the post out to the PrPro forum, where you will get a lot more traffic.
    As to the dimensions, yes, PrPro can handle those easily. Now, your satisfaction will be tied to two things: your computer and the full specs of your source footage. With a good, stout editing machine and appropriate source footage, you will have no problems.
    In the Hardware sub-forum, Harm Millaard has done several worthwhile articles on building/buying an editing rig. In the PrPro forum, there is much discussion on cameras and their footage, and which work best in PrPro.
    When this post gets moved, you will receive a lot of worthwhile comments that will steer you in the right direction.
    Good luck, and do not be surprised when Curt or Jeff moves the post.
    Hunt

  • Best practice when doing large cascading updates

    Hello all
    I am looking for some help with tackling a fairly large cascading update.
    I have an object tree that needs to be merged using JPA and Toplink.
    Each update consists of 5-10,000 objects with a decent depth as well.
    Can anyone give me some pointers/hints towards a best practice for doing this? Looping through each object with JPA's merge takes minutes to complete, so I would rather not do that.
    I have never actually used TopLink's own API before, so I am especially interested in whether TopLink has an effective way of handling this, preferably with a link to some related reading material.
    Note that I have posted a somewhat duplicate question at the link below (noting this for good forum practice):
    http://stackoverflow.com/questions/14235577/how-to-execute-a-cascading-jpa-toplink-batch-update

    Not certain what you think you can't do. Take a long clip and open it in the Viewer. Set In and Out points. Drop that into the Timeline. Now you can move along in the Viewer clip, set new Ins and Outs, and drop those into the Timeline. Clips in the Timeline are created from the Ins and Outs you set in the Viewer.
    Is that what you want to do? If it is, I don't see where making copies of the clip would help you.
    Later, if you want to match up a clip in the Timeline to that master clip, just use Match Clip (find) in the Timeline to find where it correlates to your main clip.
    You can have FCE automatically create subclips at camera cut points by using DV Start/Stop Detect, if that is what you're looking for.

  • Java support for file of large dimension

    Hi friends,
    Do you know if there is support in Java for managing very large files (about 2 GB)? Could BerkeleyDB be a good option?
    thanks

    As Kaj mentioned before, using an NTFS partition, you should be able to write files of extreme sizes (for instance, an NTFS partition that is 200GB in size may store a single 200GB file).
    Of course, using FileChannel objects, you shouldn't actually try and write the full 2+ GB of data at once, because that will effectively require over 2GB RAM for the data of your application alone. Instead, append your datafile one segment at a time (the size of the segment, in bytes, is roughly the amount of RAM you need for your data). Perhaps you can directly write the bytes into FileChannel (ie: when your measurements take place), which does not require a byte buffer.
    Neither FileChannel nor File impose size restrictions beyond what the underlying operating system and type of partition impose. For example: a FAT32 partition can't deal with files over 4GB, so no language (Java, C++, whatever) will make it possible to store 4+ GB files on a FAT32 partition.
    If you rewrite a portion of my sample from the thread "storing in Java" (http://forum.java.sun.com/thread.jsp?forum=31&thread=562837&start=15&range=15&hilite=false#2770450), you should be able to write a 4GB file on your NTFS partition. Change the size of the byte buffer to 64MB and the amount of cycles to 64 and you're there.
    I'm not sure how long it will take to write 64 chunks of 64MB, but don't expect it to finish in a few seconds. My regular IDE drive takes 2.25 seconds for a 60MB file (yet, my Serial-ATA drive does it in 1078ms, so don't forget that hardware has a significant impact on I/O performance as well).

  • How to export to very large dimensions?

    I need to export a file to the specs below; however, every time I try I get an 'Error compiling movie - Unknown error'. 
    Resolution 2732 pixels (w) x 768 pixels (h).
    32:9 Landscape format.
    File format is MPEG-4/H.264 AVC at 20Mbps.
    Frame rate is 25 as per PAL standard.
    This error has been replicated across a number of fairly high spec computers so I don't think it is an issue with disk space etc.  We're working off the assumption that it's the large dimensions causing the problem.
    The only solution we can come up with is to export to smaller dimensions (which we have done successfully) and then upscale but even that is proving challenging!
    Any suggestions or ideas are very welcome!

    Hi,
    I was also unable to reproduce the error using your export specs.
    Could you provide answers to these:
    Are you able to reproduce this issue with multiple projects?
    Did you try increasing the "level" to 5.1 under video settings?
    What are the specs of the original media used in the project?
    Regards
    Vipul

  • Disable Spotlight before installing a large software update?

    Is it a good idea to "disable" both Time Machine and especially Spotlight before downloading any large software update?  And then enable again after the install?
    This question is based on what I have read on various threads.  Would like to know both the "pros" and "cons".
    Thanks

    Not sure where you read that, but it is false. There is no need to disable anything before doing any updates.
    It is always a good idea to do a Time Machine backup before you do any major updates, and especially before upgrades, so you can roll back to that TM backup if the upgrade fails or causes problems.

  • What are the steps to rendering large dimensions in AI?

    Hi,
    What is the ideal way to render a file as big as, say, 3500x7000 px in AI? The exported image will be used for print.
    I was thinking of opening a document that size and working on it, but AI gets really choppy and the export takes forever.
    What's the best way to export it at 3500x7000?
    Thanks
    (NOTE: the print shop doesn't want to scale it; they want the final file to be at the size it will be printed.)

    Yes, of course - both in preferences and in scaling. I can get an inner glow effect if scaled to 3500x7000, only it is just barely visible, while at a smaller dimension the effects are more prominent.
    I've tried increasing the dpi to 300 for a 350x800 px document and then scaling it to 3500x7000 px, and still nothing; worse, AI errors out. If I increase it to more than 300, it still errors out.
    I've been reading a few things online about scaling strokes and effects, and people say that since it's a raster effect it doesn't scale well and has a limit on scaling. Is this true? If so, what can I do to use these effects and still scale? Someone mentioned using InDesign to scale it so it retains the effects and strokes even at large dimensions. I'm completely confused now. Do I need to use InDesign to get my effects scaled, or can Illustrator do this on its own?

  • HT5678 Why don't they show how large the update is

    WHAT AN ANNOYING WAY TO DO BUSINESS IN ASKING A QUESTION.
    Just upgraded from Snow Leopard to Mountain Lion and am checking on updates. So they have this update for Safari, but NOWHERE does it show how large the update is, like they USED to do in Snow Leopard. WHY?

    We're all end-users like you, so you will never see an official Apple answer here.
    You can, however, use this feedback link:
    http://www.apple.com/feedback/macosx.html
    to voice your concern directly to Apple.

  • Maintaining large dimensions

    I'm currently testing whether using an OLAP cube is a viable solution in our case. Here's an overview about the data for the first test setup:
    3 dimensions:
    - one time dimension
    - one small dimension (4 levels)
    - one large dimension (5 levels)
    Items in each level of the large dimension:
    1st level: 27
    2nd level: 246
    3rd level: 1,889
    4th level: 383,434
    5th level: 1,348,869
    While the small dimension can be easily handled by the OLAP cube, loading the large dimension data takes far too long (it has already been running for over 4 hours). Since the dimension data changes regularly and will be even larger in the real setup, this is not acceptable.
    Is there a way to significantly increase the dimension data load speed, or is an OLAP cube not the right solution for this problem? Or is there anything else I'm doing wrong?
    What during the loading process requires such a lot of processing power?
    Any suggestions or further reading material are welcome.
    Thanks,
    Karl

    Hi,
    I'm not sure if this is your problem or not, but I want to share my experience.
    I've had slow-building dimensions as well, though not as big as yours; I think the lowest level had 500,000 rows. The build would often keep running beyond any reasonable time. What I discovered was that the source data wasn't completely hierarchical. As far as I know, your dimension data needs to have a strict parent-child relationship, not a relationship that would leave any soap opera on TV jealous.
    I'll try to explain better.
    Say that you have a product dimension with 4 levels with the following hierarchy:
    Total - being the top level and only just one member
    Product_category - What kind of product it is
    Color - The color of the product
    Product - Lowest level.
    Looks like an easy and basic dimension. Now let's say that we want to load data into this dimension.
    Every dimension also has the total level, but I'll leave it out for this example since it would always be the same.
    say you load the following products:
    Product_category - Color - Product
    Clothes - Blue - Sweater
    Clothes - Black - Socks
    Furniture - Black - Chair
    Furniture - White - Lawnchair
    Toys - Black - R/C Helicopter
    Now, apart from being a weird mix of products, this seems like a nice dataset to load. In my experience, this is where things start to get messed up.
    If you load this data into your dimension and start to drill down, you will start at the total level on top. From there you go to product_category, and if you then choose Clothes you should see the colors blue and black. But if you drill into black, you will suddenly see the R/C Helicopter and the chair as well as the socks you expected.
    The reason is that the lowest-level members have different grandparents when they should have the same one. Black has ended up with three different parents: toys, furniture and clothes, and this is not good.
    If this is the case for your dimension, I would expect things to act weird.
    Now back to the example: if this is the hierarchy you want, you need to do some SQL on your source data so that every member gets just one parent.
    In my very simplified source, the data could look something like this:
    Clothes - Blue clothes - Sweater
    Clothes - Black clothes- Socks
    Furniture - Black furniture- Chair
    Furniture - White furniture- Lawnchair
    Toys - Black toys- R/C Helicopter
    Now every member has just one parent and everything is fine. You only have to change the member fields, not the description fields - though you may also want to adjust the descriptions so that the end users won't get totally confused.
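    With the simplified source above, deriving the qualified member could be something like this - purely illustrative SQL, assuming a hypothetical SRC_PRODUCTS table holding the three raw columns:

    -- Qualify each colour with its parent category so that, for example,
    -- 'Black' under Clothes and 'Black' under Furniture become distinct members.
    SELECT product_category,
           color || ' ' || LOWER(product_category) AS color_member,  -- unique per parent
           color                                   AS color_desc,    -- readable description
           product
    FROM   src_products;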
    Like I said, I'm not sure if this is what messes things up for you, it might be something completely different. I just wanted to mention it as a thing you might want to check out/keep in mind.
    Good luck!
    Ragnar

  • Beach Ball of Death on OSX while trying to upload large files (updated)

    Hi,
    I did a search on this and while I got lots of hits none on this issue.
    We have a web portal that allows customers to upload files to a DAM server; we control the server.
    The issue is not the back end: the back end is PHP and all set up correctly - we have 2 GB of RAM assigned to PHP, upload_max_filesize is set to 1.8 GB, and the max post size is set accordingly.
    The Flex app loads, the user selects a large file, say 1.6 GB (we do a file size check to make sure it's below the limit), clicks the upload button, the beach ball of death shows, and a little while later the file uploads.
    Great so far. Now the user tries another file of the same size: beach ball of death, and the script times out (we capture this and end gracefully). If the user restarts Safari, then the file will upload just fine.
    It seems you can upload one large file the first time and small files (below 1.2 GB) subsequently, but you cannot upload multiple large files.
    It seems like some sort of memory leak. I was looking at converting this upload app to send files via FTP, but if the Flash player needs to process/load the files first then I don't think this will work; I'm looking to increase the file size ceiling to 4 GB via FTP.
    The code is a bit involved, but in the end it just calls FileReference.upload(), and then the beach ball appears.
    Any ideas? player version 10.0.32.18 debug
    UPDATED 09_17_09
    It appears to be a memory leak in the player when running in Safari: the file is read, but the memory is not freed up after completion. Firefox frees the used memory after the upload and you can continue. Not sure if it is an Apple or Adobe issue.
    However, why bother reading the file first? Could we have HTTP stream file upload in the web Flash player like AIR has? That way the player would not care about the file size, and the limitation would reside on the server, which we can deal with.
    Message was edited by: flashharry!

  • NOTE to Developers - Enough with the large CC updates.

    These large updates take lots of time to download and install, especially when one does not have access to a super fast connection. Give me a break. Updates of 800 MB (the recent Photoshop CC update) should not be necessary if it truly is an "update" and if the app was well constructed. Think twice before pushing everything out if it's not really necessary for the update. Also, why is the CC Cloud app being updated at all? Why not load a simple data file to update the options, list the apps, etc.?
    Thanks,
    PM

    Yes, these huge downloads are a genuine pain in the butt. I just spent most of yesterday getting Lightroom, Bridge and Photoshop updated - during which time I effectively had no access to internet for other things, and of course couldn't get any work done in the applications themselves.
    So what I did was to sit staring at a progress bar that mostly didn't move, for most of the day. And I still have huge Id and Pr updates waiting.
    Please, at least provide offline installers that can be downloaded elsewhere and installed at convenience!

  • How to tune a large dimension query in Analyzer 6.2?

    Dear expert: I need to run a report in Analyzer 6.2. How can I tune the performance of a query on a large sparse dimension in Analyzer 6.2 and Essbase 6.5.1?
    The report looks like this - page: order_date, sub_code, item, pur_date, currency, vendor, measure.
    Dimensions (member count and type):
    Order_Date (304) dense
    measure (6) dense
    currency (6) sparse
    pur_date (138) sparse
    vendor (135) sparse
    item (253) sparse
    sub_code (151) sparse
    Please help solve this problem, otherwise I will be killed by the customer. Thanks very much.
    Phoenix

    Hi All,
    We had another idea: create a new template and use it as the "Current Default Workbook".
    It then shows the latest date, as we changed one of the text elements from "Display Status of Data" to "Display Status of Data To".
    But this change shows up only for my user ID, not for the other users.
    We are selecting the tick mark for "Global Default Workbook", but the tick mark goes away after each refresh. I think if this tick mark were held permanently, my problem would be solved.
    Please let me know if you have any ideas to resolve this issue.

  • Large dimension tables

    I am trying to get a grip on how a dimension table can have more records than the fact table, since the key for the dimension table is the DIMID. Can someone give me a practical example? Thanks

    Thanks for this; let's work with the Doc No example.
    1) I have a cube with 4 dimensions - the fourth dimension contains material no, doc no and plant. The key for this dimension table is the DIMID (system generated??), and each record in the dimension contains the corresponding SID values for mat no, doc no and plant.
    2) My first load of data into the cube is 1000 records.
    3) For each of these 1000 records a DIMID is generated in dimension 4, and the corresponding values of the 3 SIDs are retained as part of these records.
    4) My next load is a delta of 200 lines - 150 new records and 50 updated records. In this case 150 new DIMIDs are generated and the 3 SIDs are retained, whereas the 50 updated records do not have new DIMIDs generated.
    Is the scenario as I described it correct?
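    In other words, I picture dimension 4 as something like this (illustrative DDL only, not the tables BW actually generates):

    -- Simplified picture of dimension 4: a system-generated DIMID key
    -- plus the SIDs of the characteristics assigned to that dimension.
    CREATE TABLE dim4_example (
        dimid      NUMBER PRIMARY KEY,  -- generated dimension key, stored in the fact table
        sid_matno  NUMBER NOT NULL,     -- SID of material number
        sid_docno  NUMBER NOT NULL,     -- SID of document number
        sid_plant  NUMBER NOT NULL      -- SID of plant
    );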
