Question regarding DocumentDB RU consumption when inserting documents & write performance

Hi guys,
I have some questions regarding the DocumentDB Public Preview capacity and performance quotas:
My use case is the following:
I need to store about 200,000,000 documents per day, with a peak of about 5,000 inserts per second. Each document is about 200 bytes.
According to the documentation (http://azure.microsoft.com/en-us/documentation/articles/documentdb-manage/), I understand that I should be able to store about 500 documents per second with single inserts, and about 1,000 per second with batch inserts through a stored procedure. That would mean at least 5 CUs just to handle the inserts.
Since one CU consists of 2,000 RUs, I would expect the RU usage to be about 4 RUs per single document insert, or 100 RUs for a single SP execution inserting 50 documents.
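To make the expected budget explicit, here is the arithmetic as a small self-contained snippet (only the documented per-CU rates go in; nothing SDK-specific is assumed):

```csharp
using System;

class RuBudget
{
    static void Main()
    {
        const double rusPerCu = 2000;            // one CU = 2,000 RUs
        const double singleInsertsPerCu = 500;   // documented single-insert rate per second
        const double batchedInsertsPerCu = 1000; // documented rate via SP batch inserts

        double ruPerSingleInsert = rusPerCu / singleInsertsPerCu;    // expected: 4 RUs
        double ruPerBatchOf50 = 50 * rusPerCu / batchedInsertsPerCu; // expected: 100 RUs

        Console.WriteLine("Expected RUs per single insert: {0}", ruPerSingleInsert);
        Console.WriteLine("Expected RUs per 50-document batch: {0}", ruPerBatchOf50);
    }
}
```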
When I look at the actual RU consumption, I get values I don't really understand:
Batch insert of 50 documents: about 770 RUs
Single insert: about 17 RUs
Example document:
{"id":"5ac00fa102634297ac7ae897207980ce","Type":0,"h":"13F40E809EF7E64A8B7A164E67657C1940464723","aid":4655,"pid":203506,"sf":202641580,"sfx":5662192,"t":"2014-10-22T02:10:34+02:00","qg":3}
The consistency level is set to “Session”.
I am using the SP from the example C# project for batch inserts and the following code snippet for single inserts:
await client.CreateDocumentAsync(documentCollection.DocumentsLink, record);
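One way to check the actual per-request cost is to read the charge off the response (a sketch against the preview .NET SDK, using the same `client`, `documentCollection` and `record` as above; `ResourceResponse.RequestCharge` carries the RUs billed for that call):

```csharp
// The response reports the RUs actually consumed by this insert,
// so the assumed ~4 RUs per document can be checked empirically.
ResourceResponse<Document> response =
    await client.CreateDocumentAsync(documentCollection.DocumentsLink, record);
Console.WriteLine("Consumed RUs for this insert: {0}", response.RequestCharge);
```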
Is there a flaw in my assumption (ok…obviously there is) regarding the throughput calculation, or could you give me some advice on how to achieve the throughput stated in the documentation?
At the current RU cost I would need to buy at least 40 CUs, which wouldn't be an option at all.
I have another question regarding document retention:
Since I would need to store a lot of data per day, I would also need to delete as much data per day as I insert.
The data is valid for at least 7 days (it should actually be 30 days, depending on my options with DocumentDB).
I guess there is nothing like a retention policy for documents ("this document is valid for X days and will automatically be deleted after that period")?
Since deleting data on a single-document basis is no option at all, I would like to create one document collection per day and delete each collection after the specified retention period.
Those historic collections would never change; they would only receive queries. The only problem I see with creating collections per day is the missing throughput:
As I understand it, the throughput is split equally across the available collections, which would result in "missing" throughput on the actual hot collection (hot meaning the only collection I actually insert documents into).
Is there any (better) way to handle this use case than buying enough CUs so that the hot collection gets the needed throughput?
Example: 
1 CU -> 2,000 RUs
7 collections -> 2,000 / 7 ≈ 286 RUs per collection (per CU)
Needed throughput for the hot collection (values from the documentation): 20,000 RUs
=> 70 CUs (20,000 / 286)
vs. 10 CUs when using one collection and batch inserts, or 20 CUs when using one collection and single inserts.
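If the collection-per-day scheme were used, the daily rollover itself would be simple (a sketch against the preview .NET SDK; the `events-yyyyMMdd` naming and the 7-day window are just illustrative):

```csharp
// Create today's hot collection and drop the one that fell out of retention.
string today = "events-" + DateTime.UtcNow.ToString("yyyyMMdd");
await client.CreateDocumentCollectionAsync(
    database.SelfLink, new DocumentCollection { Id = today });

string expiredId = "events-" + DateTime.UtcNow.AddDays(-7).ToString("yyyyMMdd");
DocumentCollection expired = client.CreateDocumentCollectionQuery(database.SelfLink)
    .Where(c => c.Id == expiredId)
    .AsEnumerable()
    .FirstOrDefault();
if (expired != null)
    await client.DeleteDocumentCollectionAsync(expired.SelfLink);
```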
I know that DocumentDB is currently in preview and that this use case cannot be handled as-is because of the current limit of 10 GB per collection. I am just trying to do a POC so I can switch to DocumentDB when it is publicly available.
Could you give me any advice on whether this kind of use case can, or should, be handled with DocumentDB? I currently use Table Storage for this case (currently with a maximum of about 2,500 inserts per second) but would like to switch to DocumentDB, since with Table Storage I had to optimize for writes per second, and I get horrible query execution times because of full table scans.
Once again my desired setup:
200,000,000 inserts per day / maximum of 5,000 writes per second
Collection 1.2 -> Hot Collection: All writes (max 5000 p/s) will go to this collection. Will also be queried.
Collection 2.2 -> Historic data, will only be queried; no inserts
Collection 3.2 -> Historic data, will only be queried; no inserts
Collection 4.2 -> Historic data, will only be queried; no inserts
Collection 5.2 -> Historic data, will only be queried; no inserts
Collection 6.2 -> Historic data, will only be queried; no inserts
Collection 7.2 -> Historic data, will only be queried; no inserts
Collection 1.1 -> Old, so delete whole collection
As a matter of fact, the perfect setup would be to have only one (huge) collection with automatic document retention…but I guess this won't be an option at all?
I hope you understand my problem and can give me some advice on whether this is, or will be, possible with DocumentDB.
Best regards and thanks for your help

Hi Aravind,
First of all, thanks for your reply regarding my questions.
I sent you a mail a few days ago, but since I did not receive a response I am not sure it got through.
My main concern is still the actual RU usage when inserting documents, since I cannot insert nearly as many documents per second and CU as expected.
According to the documentation (http://azure.microsoft.com/en-us/documentation/articles/documentdb-manage/), I understand that I should be able to store about 500 documents per second with single inserts, and about 1,000 per second with batch inserts through a stored procedure (20 batches per second containing 50 documents each).
As described in my post, the actual usage is multiple (actually 6-7) times higher than expected…even when running the C# examples provided at:
https://code.msdn.microsoft.com/windowsazure/Azure-DocumentDB-NET-Code-6b3da8af/view/SourceCode
I tried all the ideas Steve posted (manual indexing & lazy indexing mode) but was not able to reduce RU consumption to a point where 500 inserts per second were nearly possible.
Here again are my findings regarding RU consumption for batch inserts:
Automatic indexing on: 777 RUs for 50 documents
Automatic indexing off & mandatory path only: 655 RUs for 50 documents
Automatic indexing off & IndexingMode Lazy & mandatory path only: 645 RUs for 50 documents
Expected result: approximately 100 RUs (2,000 RUs => 20x batch insert of 50 => 100 RUs per batch)
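One way to capture the per-batch charge is to read it off the stored-procedure response (a sketch; `bulkImportSproc` and `docs` stand for the sample SP and the 50-document batch from the linked project):

```csharp
// StoredProcedureResponse exposes the RUs billed for the whole SP execution,
// i.e. the per-batch cost listed above.
StoredProcedureResponse<int> sprocResponse =
    await client.ExecuteStoredProcedureAsync<int>(bulkImportSproc.SelfLink, docs);
Console.WriteLine("Documents accepted: {0}, RUs charged: {1}",
    sprocResponse.Response, sprocResponse.RequestCharge);
```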
Since DocumentDB is still in preview, I understand that it is not yet capable of handling my use case with regard to throughput, collection size, number of collections and available CUs, and I am fine with that.
If I can (at least nearly) reach the stated performance of 500 inserts per second per CU, I am totally fine for now. If not, I have to move on and look for other options…which would also be "fine". ;-)
Is there actually any working example code that manages 500 single inserts per second with one CU's 2,000 RUs, or is this a purely theoretical value? Or is it just a preview limitation, and the stated values are planned to work later?
Regarding your feedback:
...another thing to consider is if you can amortize the request rate over the average of 200 M requests/day = 2000 requests/second, then you'll need to provision 16 capacity units instead of 40 capacity units. You can do this by catching "RequestRateTooLargeExceptions" and retrying after the server specified retry interval…
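For completeness, the suggested pattern would look roughly like this (a sketch; assuming the preview SDK reports the throttling status via `DocumentClientException.StatusCode` and the server-specified delay via `RetryAfter`):

```csharp
// Retry a single insert when the request rate is too large (HTTP 429),
// waiting for the server-specified interval before trying again.
while (true)
{
    try
    {
        await client.CreateDocumentAsync(documentCollection.DocumentsLink, record);
        break;
    }
    catch (DocumentClientException e)
    {
        if ((int?)e.StatusCode != 429)
            throw; // not a throttling error; bubble it up
        await Task.Delay(e.RetryAfter);
    }
}
```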
Sadly this is not possible for me, because I have to query the data in near real time for my use case…so queuing is not an option.
We don't support a way to distribute throughput differently across hot and cold collections. We are evaluating a few solutions to enable this scenario, so please do propose as a feature at http://feedback.azure.com/forums/263030-documentdb as this helps us prioritize feature work. Currently, the best way to achieve this is to create multiple collections for hot data, and shard across them, so that you get more proportionate throughput allocated to it.
I guess I could circumvent this by clustering not into "hot" and "cold" collections but into "hot" and "cold" databases, each with one or multiple collections (if 10 GB remains the limit per collection), if there were a way to (automatically?) scale the CUs via an API. Otherwise I would have to manually scale down the databases holding historic data. I also added a feature request as proposed by you.
Sorry for the long post, but I am planning the future architecture for one of our core systems and want to be sure I am on the right track.
So if you could answer just one question, it would be this:
How do I achieve the stated throughput of 500 single inserts per second with one CU's 2,000 RUs in reality? ;-)
Best regards and thanks again
