Types of ODS in BI 7.0

Hi friends,
   I'm new to BI 7.0. I want to know how many types of ODS objects there are in BI 7.0 and which tables each type contains (for example, the Standard ODS is one type, and it has three tables: the new data table, the active data table and the change log table). Can you explain the ODS types and the tables that exist for each one?
Thank you,
@jay

Hi Ajay,
Standard DSO consists of three tables: the activation queue, the active data table, and the change log.
DSO for direct update consists of the active data table only.
Write-optimized DSO consists of the active data table only.
Check this help.sap.com link for more details:
http://help.sap.com/saphelp_nw04s/helpdata/en/f9/45503c242b4a67e10000000a114084/content.htm
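If you want to look at these tables in the system, here is a small, hedged sketch. It assumes the usual naming convention for the generated tables of a standard DSO (active data table /BIC/A<name>00, activation queue /BIC/A<name>40; the change log is stored as a generated PSA-style table with a /BIC/B... name), and the DSO name ZSALES01 is only an example:

  REPORT z_dso_table_check.

  "Count the rows in the generated tables behind a standard DSO (example name).
  DATA: lv_tab    TYPE tabname,
        lv_active TYPE i,
        lv_queue  TYPE i.

  lv_tab = '/BIC/AZSALES0100'.                       "active data table
  SELECT COUNT( * ) FROM (lv_tab) INTO lv_active.

  lv_tab = '/BIC/AZSALES0140'.                       "activation queue
  SELECT COUNT( * ) FROM (lv_tab) INTO lv_queue.

  WRITE: / 'Active data rows:      ', lv_active,
         / 'Activation queue rows: ', lv_queue.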
Regards,
R.Ravi
*Assigning points is the way of saying Thanks in SDN

Similar Messages

  • Upload file of type Spreadsheet.ods  to internal table

    Hi Experts,
    Can anybody give me a function module to upload a file of type Spreadsheet (.ods) into an internal table?
    Thanks in Advance
    Narendra

    Hi Renu,
    These function modules do not support it. We are using OpenOffice, and the file is just like an Excel sheet, but even the function module ALSM_EXCEL_TO_INTERNAL_TABLE does not support it.
    Narendra
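    A common hedged workaround (there is no standard .ods reader here): save the OpenOffice sheet as a CSV file and read that with GUI_UPLOAD, then split each line yourself. The file path, separator and target structure below are made-up examples.

      REPORT z_upload_ods_as_csv.

      TYPES: BEGIN OF ty_row,
               matnr TYPE c LENGTH 18,
               menge TYPE c LENGTH 17,
             END OF ty_row.

      DATA: lv_file TYPE string,
            lt_raw  TYPE TABLE OF string,
            ls_raw  TYPE string,
            lt_rows TYPE TABLE OF ty_row,
            ls_row  TYPE ty_row.

      lv_file = 'C:\temp\sheet.csv'.                "exported from the .ods file

      CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename = lv_file
          filetype = 'ASC'
        TABLES
          data_tab = lt_raw
        EXCEPTIONS
          OTHERS   = 1.
      IF sy-subrc <> 0.
        WRITE: / 'Upload failed'.
        EXIT.
      ENDIF.

      LOOP AT lt_raw INTO ls_raw.
        SPLIT ls_raw AT ';' INTO ls_row-matnr ls_row-menge.  "separator used when saving
        APPEND ls_row TO lt_rows.
      ENDLOOP.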

  • UPDATE  TYPE  IN  ODS...

    Dear all experts,
    I know the update types available for an ODS are Addition, Minimum, Maximum, and Overwrite.
    Can anybody please tell me where this setting for the update type in an ODS is made?
    As far as I know, for a cube the update is maintained per key figure, and the respective key figure is updated accordingly.
    Your help will surely be rewarded.
    Waiting,
    Regards
    Vinay.

    Hi Vinay,
    Creating update rules: right-click on the ODS -> Create Update Rules.
    Select the InfoSource radio button (you can also use data targets). On the next screen, place the cursor on any data field and click the Details button just above that table; there you can specify which update type you want.
    http://help.sap.com/saphelp_nw04/helpdata/en/80/1a64c6e07211d2acb80000e829fbfe/content.htm
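    For completeness: on that same Details screen you can also pick "Routine" instead of a fixed update type. Below is a heavily simplified, hedged placeholder for such a key-figure routine; the real generated frame uses the communication structure /BIC/CS... of your InfoSource and has additional parameters, and all names here are made up.

      REPORT z_update_routine_sketch.

      TYPES ty_costs TYPE p LENGTH 13 DECIMALS 2.
      TYPES: BEGIN OF ty_comm,                    "stands in for COMM_STRUCTURE
               costs TYPE ty_costs,
             END OF ty_comm.

      DATA: ls_comm   TYPE ty_comm,
            lv_result TYPE ty_costs,
            lv_ret    LIKE sy-subrc,
            lv_abort  LIKE sy-subrc.

      START-OF-SELECTION.
        ls_comm-costs = '100.00'.
        PERFORM compute_key_figure USING ls_comm
                                   CHANGING lv_result lv_ret lv_abort.
        WRITE: / 'Result:', lv_result.

      "The update type chosen next to the routine (Overwrite, Addition, ...)
      "decides how RESULT is combined with the value already in the ODS.
      FORM compute_key_figure
        USING    p_comm    TYPE ty_comm
        CHANGING p_result  TYPE ty_costs
                 p_retcode LIKE sy-subrc          "<> 0: skip this record
                 p_abort   LIKE sy-subrc.         "<> 0: cancel the data package
        p_result  = p_comm-costs.
        p_retcode = 0.
        p_abort   = 0.
      ENDFORM.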
    Regards,
    ARK

  • Addition and Overwrite update types for ODS

    Hello BW Experts ,
    I have an issue.
    For a particular order and cost element on the R/3 side I have four cost figures.
    When I do a full upload from the related standard DataSource to the ODS with update type Overwrite, I get only the last cost figure for that order and cost element.
    I am loading data from this ODS further into an InfoCube.
    The problem is that I want the sum of all the cost figures for the same order and the same cost element.
    But because the last cost figure overwrites the others in the ODS, I do not get the correct sum.
    So I changed the update type from Overwrite to Addition.
    Now I get the sum of all cost figures correctly.
    However, I am worried that if I load further data into the ODS with a full upload, all cost figures will be doubled.
    Please explain what to do in such a case.
    Thanks in Advance,
    Amol .

    Hello Amol,
    Check whether the DataSource supports delta. You can see this in the RSA6 DataSource display (look for the delta checkbox); it can also be found in the ROOSOURCE table.
    If it does, switch from full loads to an init followed by delta loads.
    Only the init may take some time; the daily deltas should not take long.
    One more thing to add to the earlier responses: you can automate the deletion of similar requests in the InfoPackage settings, so that you do not have to delete the full-upload request manually every day (if you are working with daily full uploads).
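    To make the doubling risk concrete, here is a worked example with hypothetical figures: say the four R/3 cost figures for one order / cost element are 10, 20, 30 and 40. With update type Overwrite the ODS ends up holding only the last value, 40. With Addition it holds 10 + 20 + 30 + 40 = 100, which is the sum you want, but a second full upload of the same four records would add another 100 and leave 200 in the ODS. That is why the advice above is to switch to an init/delta load, or to delete the previous full-upload request before every new full load.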
    hope it helps..
    regards,

  • Process Chains and Process Types for ODS Change logs

    We have created and used process chains to manage a large amount of our batch processing.   I am looking to convert our batch process of deleting data from ODS change logs into a process chain.   I have been unable to find a process type that will allow us to do this.
    Any thoughts?

    Hi Lisa,
    You can use the process type "Deleting Requests from the PSA" for deleting PSA and change log data.
    Since the change log is also stored as a PSA table, you can use this function to delete change log records as well. You can find additional information under "Deleting from the Change Log".
    More info:
    Deleting Requests from the PSA
    http://help.sap.com/saphelp_nw04/helpdata/en/b0/078f3b0e8d4762e10000000a11402f/content.htm
    Deleting from the Change Log
    http://help.sap.com/saphelp_nw04/helpdata/en/6d/1fd53be617d524e10000000a11402f/content.htm
    Hope it Helps
    Srini

  • Split a value into two based on VERSION type in ODS

    Hello Gurus,
    I have a field in the cube called AMOUNT. It is a plan or actual value based on a field called VERSION in the ODS.
    Can you please tell me how to display two columns in the report (an Actual column and a Plan column) from the single AMOUNT field in the cube, based on the VERSION field value in the ODS?
    Your responses are highly appreciated.
    Thanks,
    Regards,
    aarthi
    [email protected]

    Hi Diego,
    I don't have the VERSION field in the cube. It is available only in the ODS. So how do I filter based on VERSION?
    Hi Oscar,
    Is there any way I can access the VERSION value from the ODS in the report?
    Thanks
    Regards,
    aarthi
    [email protected]

  • Cube and ODS update types

    Hi all. Please tell me briefly about the update types for ODS and InfoCube. I know there are Addition, No Update and Overwrite, but I do not know how they work.
    regards
    rajesh

    Hi Rajesh,
      No Update means no data is updated into the target (cube or ODS).
      Overwrite is only possible for an ODS, not for a cube; it overwrites the existing entry in the ODS with the new values.
      Additive means the records are summed up. A cube can only work in Additive or No Update mode.
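    As a rough analogy in plain ABAP (this is not the actual ODS activation code, just an illustration of the two behaviours), COLLECT acts like Additive and MODIFY TABLE like Overwrite:

      REPORT z_update_type_analogy.

      TYPES: BEGIN OF ty_rec,
               aufnr  TYPE c LENGTH 12,             "key, e.g. an order number
               amount TYPE p LENGTH 13 DECIMALS 2,
             END OF ty_rec.

      DATA: lt_add TYPE SORTED TABLE OF ty_rec WITH UNIQUE KEY aufnr,
            lt_ovr TYPE SORTED TABLE OF ty_rec WITH UNIQUE KEY aufnr,
            ls_rec TYPE ty_rec.

      "Additive behaviour: same key -> amounts are summed (100 + 50 = 150)
      ls_rec-aufnr  = '4711'.
      ls_rec-amount = 100.
      COLLECT ls_rec INTO lt_add.
      ls_rec-amount = 50.
      COLLECT ls_rec INTO lt_add.

      "Overwrite behaviour: same key -> the new value replaces the old (50)
      ls_rec-amount = 100.
      INSERT ls_rec INTO TABLE lt_ovr.
      ls_rec-amount = 50.
      MODIFY TABLE lt_ovr FROM ls_rec.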
    Hope this helps.
    best regards,
    Kazmi

  • Some records not transfer from ODS to infocube

    Hello BW folks ,
    We have an ODS which stores the various sales doc. types.
    We are transferring all the data from this ODS to an InfoCube. We do not have any routines or filter conditions while loading data from the ODS to the InfoCube.
    In the update rules of the InfoCube no routine is written.
    There is also no start routine for the InfoCube.
    The data is loaded successfully from the ODS to the InfoCube.
    But still a particular 'sales doc type' is not transferred from the ODS to the InfoCube,
    even though this 'sales doc type' is present in the ODS.
    The sales doc type is maintained with respect to the order number. If I check the order number of that sales doc type in the ODS, the same order number is not present in the InfoCube.
    This means that some order numbers are also getting lost while transferring data from the ODS to the InfoCube.
    None of the objects in the update rules is filled as a 'master data attribute of' another object.
    Please suggest what to do in this case.
    Amol.

    Hello VC,
    I have checked that in the ODS update rules the key figures have update type Addition, and the load from the ODS to the InfoCube is an init followed by deltas.
    Is this the reason that the particular sales doc type is being overwritten?
    I think we should do only full uploads from the ODS to the InfoCube in this case.
    Regards,
    Amol.

  • Process Chain problem related to Update ODS Object Data (Further Update)

    In the process chain, the data is loaded to an ODS first and then fed to an InfoCube via a data mart load. But when checking the process chain, we get a yellow warning message:
    A type "Update ODS Object Data (Further Update)" process has to follow process "Activate ODS Object Data" var.our_infopackage_name in the chain
    Message no. RSMPC016
    If we add the variant "Update ODS Object Data (Further Update)" after the activation of the ODS and before the InfoPackage that feeds data to the cube, then another yellow warning shows up for that InfoPackage, saying that the update variant cannot come before the load to this cube, and so on.
    What should we do to resolve this?
    Thanks

    hey Dinesh,
    How to have a wait time for the load to the cube?
    I've given you "Very helpful" rewarding points and after you give the answer to the above, I will give you the "Solve problem" rewarding points.
    Thanks

  • Loading from ODS to Cube

    I have a process chain for GL.
    Data was loaded first to the ODS and then to the cube.
    Now no update to the cube is scheduled, so I removed the 'Further Update' process from the process chain.
    During loading I get the following warning message:
    There must be a type "Update ODS Object Data (Further Update)" process behind process "Activate ODS Object Data" var. ACTIVATE_ODS
    So please suggest a way to remove this warning message.
    Thanx,
    Vishal

    Hi Vishal,
    Since you removed the 'Further Update' process from the process chain, this message appears. I assume that instead of this process you are loading data from the ODS to the cube using an InfoPackage in the process chain.
    In the ODS settings, uncheck the 'Further update to data targets' setting (assuming you load data from the ODS to the cube with an InfoPackage in the process chain).
    Best Regards.

  • Difference between an ODS and DSO

    What is the difference between an ODS (operational data store) and a DSO?

    Hi
    There is no real difference between ODS and DSO. In BW 3.5 the object is called ODS; in the 7.0 version it is known as DSO.
    To enhance the functionality, there are three types of DSO:
    1. Standard DSO
    2. Write-optimized DSO
    3. DSO for direct update
    whereas there was only one type of ODS. That is the only difference.
    - Jaimin

  • Process chain questions

    Hi All,
               I designed a process chain with data flow like this:
    R/3>ODS1>ODS2>Cube.
    The process chain sequence is:
    Start>>Load data>>activate ODS1 data>>load data to ODS2>>activate ODS2 data>>delete index>>load data to cube.
               My process chain is working and the data loads are good. But I am getting warning messages for all three variants "activate ODS1 data>>load data to ODS2>>activate ODS2 data>>"
    The message for the two activation variants is as follows:
    A type "Update ODS Object Data (Further Update)" process has to follow process "Activate ODS Object Data" var.ACTIVATEZMONDET in the chain
    Message no. RSMPC016
    The message for the loading variant is as follows:
    A type "Activate ODS Object Data" process cannot precede process "Execute InfoPackage" var. ZPAK_3YF4Q8R0RSHTDJH74WZOAXBC6 in thechain
    Message no. RSMPC013
    The version I am working on is BW 3.5. My question is: is it OK to have these warning messages, or do I need to do anything else?
    Thank you.

    Hi Visu,
    If your data loads are running fine, you can ignore the warning messages. The system tries to warn you when it suspects that a process is missing from the required chain of events, but there is more than one way to achieve a load. If the data volume to be loaded to the cube is not very high, for example just a daily delta, you can even go ahead and remove the index deletion process.
    Hope this helps...

  • Interview Questions

    Hi Gurus,
    I am new to BW.
    Can you please let me know the answers to the following questions?
    1.     What is a Data warehouse? List Few Properties of a Data warehouse.
    2. What are the major challenges of any Data Warehouse design?
    3. Data loading issues, Reporting, Production support issues?
    4.     Data Modeling, Info cube, ODS obj creation?
    5.     SAP Batch process in BW side?
    6.     How to schedule the Background jobs in ABAP, & BW?
    7.     Variables in Query? User exit variables? How do you create a user exit variable? And which ones have you used?
    8.     Structures in Query Reporting? What are structures?
    9.     Transportation steps?
    10.     How to compress the Info cube? What happens when you compress the info cube?
    11.     Performance issues in Reporting? How can you improve the performance of the query?
    12.     Where the ABAP routines are used in BW side?
    13.     How to create Primary index & secondary indexes on ODS?
    14.     How to prevent the duplicate records at the data target level?
    15.     Issues regarding double records?
    16.     Info sets and multiprovider differences?
    17.     Issues faced in Delta loading? Why it happened?
    Enhancements to the Data sources?
    18.     Issues in loading data from flat files? Delta load issues? Which would you suggest for flat file loading: delta or full load?
    19.     How to prevent the Replication Errors? What happens when you replicate the Data Sources?
    20.     Process chain steps? Which process type is used to delete the last request when loading the data?
    21.     What is virtual cube? Characteristics? Its significance?
    22.     Diff. methods for Generic Data Sources?
    23.     What is Extract structure? Where it is used?
    24.     Data Modeling – design issues, Tech specifications, Modeling , Reporting, Testing, Transportation
    25.     Extraction steps on the R/3 side? LO extraction?
    26.     How do you setup LIS?
    27.     SAP batch process? Where do you set up the batch process?
    28.     Reconstruction tab in Info cube? Why it is used?
    29.     Suppose InfoCube (A) has 10 records. We want to take some of the records, say 4-7, from InfoCube (A) to InfoCube (B). How do you handle this situation?
    30.     Suppose ODS contains 5 records and the Info cube shows 12 records. How to solve it?
    31.     Landscape
    32.     Tell me typical BW Team & How u work?
    33.     How to Maintain the Master data at Client –Server architecture?
    34.     Query performance problems? How do you improve the performance?
    35.     How do you improve the InfoCube design performance?
    36.     How do you improve the Dimension & fact table performance?
    37.     How to push data from PSA to ODS?
    38.     How to postpone daily load?
    39.     The functions of Administrator Workbench are…
    40.     What is RECORD MODE
    41.     What is partition? How to partition the Info cube & PSA. ?..
    42.     How to filter single records when uploading into an ods object?
    43.     How can you connect a new data target to an existing data flow?
    44.     When it is advantageous to create secondary indexes for ODS field?
    45.     Purpose of setup tables?
    46.     What is Delta mechanism?
    47.     Will you create an InfoSource for every DataSource? How many InfoSources can we connect to a DataSource?
    48.     What is a Star Schema? What is the Differences between a Classic Star Schema & Extended Star schema.
    49.     What is an Attribute? Difference between Display Attribute & Navigational Attribute?
    50.     What is Transfer Rule? List the methods used in Transfer Rules.
    51.     Why do we need ODS? List a few of the technical settings that can be defined when building/modifying an ODS object.
    52.     What is ODS? What are the three tables associated with an ODS object? What are the two types of ODS.
    53.     Name the two tables that provide detailed information about data sources.
    54.     What are two data transfer methods? Which is the preferred method and why?
    55.     Where will the development take place, and who will do the development testing?
    56.     Who will be responsible for long-term support?
    57.     What is a Slowly Changing Dimension?
    58.     What is namespace for SAP BW?
    59.     What are the nine decision points of a Data Warehouse?
    60.     How do you install BW Statistics? How do you enable monitoring using BW Statistics?
    61.     How do you rate yourself in Modeling, Reporting & Extraction?
    62.     What are the advantages with LO Extraction.
    63.     What are the steps in General to enhance data?
    64.     What are Phases in ASAP Methodology?
    65.     What is the 90 Day rule?
    66.     What is the demo content use?
    67.     What is the use of RSRAJ Transaction?
    68.     What is the use of RSSU53 Transaction?
    69.     Can you repeat the Master Data Source?
    70.     What is the difference between DW and BW?
    71.     Will you use IDoc methods in BW?
    72.     What does the number in the 'Total' column in Transaction RSA7 mean?
    73.       The extract structure was changed when the Delta Queue was empty. Afterwards new delta records were written to the Delta Queue. When loading the delta into the PSA, it shows that some fields were moved. The same result occurs when the contents of the Delta Queue are listed via the detail display. Why are the data displayed differently? What can be done?
    74.     How and where can you control whether a repeat delta is requested?
    75.     Can I use several clients within SAP BW?
    Thanks in Advance
    murali

    Hi
    Here is a set of questions and answers you can check; it should be helpful.
    Questions and answers:
    Question 1:
    What does the number in the 'Total' column in Transaction RSA7 mean?
    Answer:
    The 'Total' column displays the number of LUWs that were written in the delta queue and that have not yet been confirmed. The number includes the LUWs of the last delta request (for repeating a delta request) and the LUWs for the next delta request. An LUW only disappears from the RSA7 display when it has been transferred to the BW System and a new delta request has been received from the BW System.
    Question 2:
    What is an LUW in the delta queue?
    Answer:
    An LUW from the point of view of the delta queue can be an individual document, a group of documents from a collective run or a whole data packet from an application extractor.
    Question 3:
    Why does the number in the 'Total' column, in the overview screen of Transaction RSA7, differ from the number of data records that are displayed when you call up the detail view?
    Answer:
    The number on the overview screen corresponds to the total number of LUWs (see also question 1) that were written to the qRFC queue and that have not yet been confirmed. The detail screen displays the records contained in the LUWs. Both the records belonging to the previous delta request and the records that do not meet the selection conditions of the preceding delta init requests are filtered out. This means that only the records that are ready for the next delta request are displayed on the detail screen. The detail screen of Transaction RSA7 does not take into account a possibly existing customer exit.
    Question 4:
    Why does Transaction RSA7 still display LUWs on the overview screen after successful delta loading?
    Answer:
    Only when a new delta has been requested does the source system learn that the previous delta was successfully loaded into the BW System. The LUWs of the previous delta may then be confirmed (and also deleted). In the meantime, the LUWs must be kept for a possible delta request repetition. In particular, the number on the overview screen does not change if the first delta is loaded into the BW System.
    Question 5:
    Why are selections not taken into account when the delta queue is filled?
    Answer:
    Filtering according to selections takes place when the system reads from the delta queue. This is necessary for performance reasons.
    Question 6:
    Why is there a DataSource with '0' records in RSA7 if delta exists and has been loaded successfully?
    Answer:
    It is most likely that this is a DataSource that does not send delta data to the BW System via the delta queue but directly via the extractor. You can display the current delta data for these DataSources using Transaction RSA3 (update mode = 'D').
    Question 7:
    Do the entries in Table ROIDOCPRMS have an impact on the performance of the loading procedure from the delta queue?
    Answer:
    The impact is limited. If performance problems are related to the loading process from the delta queue, then refer to the application-specific notes (for example in the CO-PA area, in the logistics cockpit area, and so on).
    Caution: As of PlugIn 2000.2 patch 3, the entries in Table ROIDOCPRMS are as effective for the delta queue as for a full update. Note, however, that LUWs are not split during data loading for consistency reasons. This means that when very large LUWs are written to the delta queue, the actual package size may differ considerably from the MAXSIZE and MAXLINES parameters.
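    If you want to check these settings in the source system, here is a hedged sketch (ROIDOCPRMS holds one entry per receiving logical system and is normally maintained via the control-parameters step in SBIW rather than edited directly):

      REPORT z_show_roidocprms.

      "List the data-transfer control parameters, including the MAXSIZE and
      "MAXLINES values discussed above.
      DATA: lt_prms TYPE TABLE OF roidocprms,
            ls_prms TYPE roidocprms.

      SELECT * FROM roidocprms INTO TABLE lt_prms.
      LOOP AT lt_prms INTO ls_prms.
        WRITE: / ls_prms-maxsize, ls_prms-maxlines.
      ENDLOOP.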
    Question 8:
    Why does it take so long to display the data in the delta queue (for example approximately 2 hours)?
    Answer:
    With PlugIn 2001.1 the display was changed: you are now able to define the amount of data to be displayed, to restrict it, to selectively choose the number of a data record, to make a distinction between the 'actual' delta data and the data intended for repetition, and so on.
    Question 9:
    What is the purpose of the function 'Delete Data and Meta Data in a Queue' in RSA7? What exactly is deleted?
    Answer:
    You should act with extreme caution when you use the delete function in the delta queue. It is comparable to deleting an InitDelta in the BW System and should preferably be executed there. Not only do you delete all data of this DataSource for the affected BW System, but you also lose all the information concerning the delta initialization. Then you can only request new deltas after another delta initialization.
    When you delete the data, this confirms the LUWs kept in the qRFC queue for the corresponding target system. Physical deletion only takes place in the qRFC outbound queue if there are no more references to the LUWs.
    The delete function is intended for example, for cases where the BW System, from which the delta initialization was originally executed, no longer exists or can no longer be accessed.
    Question 10:
    Why does it take so long to delete from the delta queue (for example half a day)?
    Answer:
    Import PlugIn 2000.2 patch 3. With this patch the performance during deletion improves considerably.
    Question 11:
    Why is the delta queue not updated when you start the V3 update in the logistics cockpit area?
    Answer:
    It is most likely that either a delta initialization has not yet run or that the delta initialization was not successful. A successful delta initialization (the corresponding request must have QM status 'green' in the BW System) is a prerequisite for the application data to be written to the delta queue.
    Question 12:
    What is the relationship between RSA7 and the qRFC monitor (Transaction SMQ1)?
    Answer:
    The qRFC monitor basically displays the same data as RSA7. The internal queue name must be used for selection on the initial screen of the qRFC monitor. This is made up of the prefix 'BW', the client and the short name of the DataSource. For DataSources whose name is shorter than 20 characters, the short name corresponds to the name of the DataSource. For DataSources whose name is longer than 19 characters (for delta-capable DataSources only possible as of PlugIn 2001.1) the short name is assigned in Table ROOSSHORTN.
    In the qRFC monitor you cannot distinguish between repeatable and new LUWs. Moreover, the data of a LUW is displayed in an unstructured manner there.
    Question 13:
    Why is there data in the delta queue although the V3 update has not yet been started?
    Answer:
    You posted data in the background. This means that the records are updated directly in the delta queue (RSA7). This happens in particular during automatic goods receipt posting (MRRS). There is no duplicate transfer of records to the BW system. See Note 417189.
    Question 14:
    Why does the 'Repeatable' button on the RSA7 data details screen not only show data loaded into BW during the last delta but also newly-added data, in other words, 'pure' delta records?
    Answer:
    It was programmed so that the request in repeat mode fetches both actually repeatable (old) data and new data from the source system.
    Question 15:
    I loaded several delta inits with various selections. For which one is the delta loaded?
    Answer:
    For delta, all selections made via delta inits are summed up. This means a delta for the 'total' of all delta initializations is loaded.
    Question 16:
    How many selections for delta inits are possible in the system?
    Answer:
    With simple selections (intervals without complicated join conditions or single values), you can make up to about 100 delta inits. It should not be more.
    With complicated selection conditions, it should be only up to 10-20 delta inits.
    Reason: With many selection conditions that are joined in a complicated way, too many 'where' lines are generated in the generated ABAP source code which may exceed the memory limit.
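    To illustrate the point about the generated WHERE lines, here is a made-up sketch (it selects from T001 only so that it compiles; the real clause is generated inside the extractor, with one additional OR block per delta init):

      REPORT z_delta_init_where_sketch.

      DATA lt_t001 TYPE TABLE OF t001.

      SELECT * FROM t001 INTO TABLE lt_t001
        WHERE ( bukrs BETWEEN '1000' AND '1999' )    "selection of delta init 1
           OR ( bukrs BETWEEN '3000' AND '3999' )    "selection of delta init 2
           OR ( bukrs = '5000' ).                    "selection of delta init 3, ...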
    Question 17:
    I intend to copy the source system, i.e. make a client copy. What will happen with my delta? Should I initialize again after that?
    Answer:
    Before you copy a source client or source system, make sure that your deltas have been fetched from the delta queue into BW and that no delta is pending. After the client copy, an inconsistency might occur between BW delta tables and the OLTP delta tables as described in Note 405943. After the client copy, Table ROOSPRMSC will probably be empty in the OLTP since this table is client-independent. After the system copy, the table will contain the entries with the old logical system name which are no longer useful for further delta loading from the new logical system. The delta must be initialized in any case since delta depends on both the BW system and the source system. Even if no dump 'MESSAGE_TYPE_X' occurs in BW when editing or creating an InfoPackage, you should expect that the delta has to be initialized after the copy.
    Question 18.
    Am I permitted to use the functions in Transaction SMQ1 to manually control processes?
    Answer:
    Use SMQ1 as an instrument for diagnosis and control only. Make changes to BW queues only after informing BW Support or only if this is explicitly requested in a note for Component 'BC-BW' or 'BW-WHM-SAPI'.
    Question 19.
    Despite the delta request only being started after completion of the collective run (V3 update), it does not contain all documents. Only another delta request loads the missing documents into BW. What is the cause for this "splitting"?
    Answer:
    The collective run submits the open V2 documents to the task handler for processing. The task handler processes them in one or several parallel update processes in an asynchronous way. For this reason, plan a sufficiently large "safety time window" between the end of the collective run in the source system and the start of the delta request in BW. An alternative solution where this problem does not occur is described in Note 505700.
    Question 20.
    Despite deleting the delta init, LUWs are still written into the DeltaQueue
    Answer:
    In general, delta initializations and deletions of delta inits should always be carried out at a time when no posting takes place. Otherwise, buffer problems may occur: If you started the internal mode at a time when the delta initialization was still active, you post data into the queue even though the initialization had been deleted in the meantime. This is the case in your system.
    Question 21.
    In SMQ1 (qRFC Monitor) I have status 'NOSEND'. In the Table TRFCQOUT, some entries have the status 'READY', others 'RECORDED'. ARFCSSTATE is 'READ'. What do these statuses mean? Which values in the field 'Status' mean what and which values are correct and which are alarming? Are the statuses BW-specific or generally valid in qRFC?
    Answer:
    Table TRFCQOUT and ARFCSSTATE: Status READ means that the record was read once, either in a delta request or in a repetition of the delta request. However, this still does not mean that the record has successfully reached the BW. The status READY in the TRFCQOUT and RECORDED in the ARFCSSTATE means that the record has been written into the delta queue and will be loaded into the BW with the next delta request or a repetition of a delta. In any case, only the statuses READ, READY and RECORDED in both tables are considered to be valid.
    The status EXECUTED in TRFCQOUT can occur temporarily. It is set before starting a delta extraction for all records with status READ present at that time. The records with status EXECUTED are usually deleted from the queue in packages within a delta request directly after setting the status, before extracting a new delta. If you see such records, it means that either a process which confirms and deletes records loaded into the BW is successfully running at the moment, or, if the records remain in the table for a longer period of time with status EXECUTED, it is likely that there are problems with deleting the records which have already been loaded successfully into the BW. In this state, no more deltas are loaded into the BW.
    Every other status indicates an error or an inconsistency. NOSEND in SMQ1 means nothing (see Note 378903). However, the value 'U' in field 'NOSEND' of table TRFCQOUT is of concern.
    Question 22.
    The extract structure was changed when the delta queue was empty. Afterwards new delta records were written to the delta queue. When loading the delta into the PSA, it shows that some fields were moved. The same result occurs when the contents of the delta queue are listed via the detail display. Why is the data displayed differently? What can be done?
    Answer:
    Make sure that the change of the extract structure is also reflected in the database and that all servers are synchronized. We recommend resetting the buffers using Transaction $SYNC. If the extract structure change is not communicated synchronously to the server where delta records are being created, the records are written with the old structure until the new structure has been generated. This may have disastrous consequences for the delta. When the problem occurs, the delta needs to be re-initialized.
    Question 23. How and where can I control whether a repeat delta is requested?
    Answer:
    Via the status of the last delta in the BW Request Monitor. If the request is RED, the next load will be of type 'Repeat'. If you need to repeat the last load for any reason, manually set the request in the monitor to red. For the contents of the repeat, see Question 14. Delta requests set to red when data is already updated lead to duplicate records in a subsequent repeat, if they have not already been deleted from the data targets concerned.
    Question 24.
    As of PI 2003.1, the Logistic Cockpit offers various types of update methods. Which update method is recommended in logistics? According to which criteria should the decision be made? How can I choose an update method in logistics?
    Answer:
    See the recommendation in Note 505700.
    Question 25.
    Are there particular recommendations regarding the maximum data volume of the delta queue to avoid danger of a read failure due to memory problems?
    Answer:
    There is no strict limit (except for the restricted number area of the 24-digit QCOUNT counter in the LUW management table - which is of no practical importance, however - or the restrictions regarding the volume and number of records in a database table).
    When estimating "soft" limits, both the number of LUWs and the average data volume per LUW are important. As a rule, we recommend bundling data (usually documents) as soon as you write to the delta queue to keep number of LUWs low (this can partly be set in the applications, for example in the Logistics Cockpit). The data volume of a single LUW should not be much larger than 10% of the memory available to the work process for data extraction (in a 32-bit architecture with a memory volume of about 1 GByte per work process, 100 MByte per LUW should not be exceeded). This limit is of rather small practical importance as well since a comparable limit already applies when writing to the delta queue. If the limit is observed, correct reading is guaranteed in most cases.
    If the number of LUWs cannot be reduced by bundling application transactions, you should at least make sure that the data is fetched from all connected BWs as quickly as possible. But for other, BW-specific, reasons, the frequency should not exceed one delta request per hour.
    To avoid memory problems, a program-internal limit ensures that no more than 1 million LUWs are ever read and fetched from the database per delta request. If this limit is reached within a request, the delta queue must be emptied by several successive delta requests. We recommend, however, to try not to reach that limit but trigger the fetching of data from the connected BWs as soon as the number of LUWs reaches a 5-digit value.
      THANKS =POINTS in SDN
    SANJEEV

  • "Error While creating Process Chain"

    Hi all,
    I am creating a process chain in which I have to load from an ODS to a cube.
    If I include the process types "Activate ODS" and "Update ODS" in the process chain, it loads a delta from the ODS to the cube, but my requirement is a FULL load from the ODS to the cube. If I select the full-load InfoPackage from 8ZODS and place it in the chain after "Activate ODS", I get these errors:
    Errors:
    InfoPackage is generated; NOT able to be used as loading variant
    A type "Activate ODS Object Data" process cannot precede process "Execute InfoPackage" var. ZPAK_42622G5QGKNFH2LSX5PN6DAG6 in the chain
    Please guide me on what to do, and briefly describe the sequence to create a process chain (loading full and delta from ODS to cube).
    Thanks in Advance,
    Sai

    Hi Sai Ram,
    Create a FULL InfoPackage by copying the full IP that was generated at 8ZODS.
    The sequence for the full load is:
    1) Start process
    2) Load IP
    3) Activate ODS
    4) Execute IP (the full IP that was copied)
    For delta:
    1) Start process
    2) Load IP
    3) Activate ODS
    4) Further Update process type (give the name of the ODS from which you want to load to the cube)
    Add the index processes wherever necessary.
    Hope it helps.
    Regards,
    AK
    Thanks=points
    Message was edited by:
            A K

  • I am getting an error in process chain in bw 3.5?

    Hi all,
    I am getting an error in process chain in bw 3.5?
    A type "Activate ODS Object Data" process cannot precede process "Execute InfoPackage" var. ZPAK_4GT51KCFLGM9VPY80NI7UPCFJ in the chain
    I was just executing an initial ODS load: for the first ODS, "activate ODS" as well as "further update ODS" gets generated, and then the same happens for the second ODS: "activate ODS" -> "further update ODS".
    Why does "further update ODS" get selected automatically?
    Thanks
    pooja

    Hi Pooja
    Are you talking about "Update ODS Object Data"?
    Please check this http://help.sap.com/saphelp_nw70/helpdata/en/12/43074208ae2a38e10000000a1550b0/content.htm
    Edited by: Chandamita Sarmah on Feb 11, 2010 4:43 PM
    Edited by: Chandamita Sarmah on Feb 11, 2010 4:46 PM
