ODS fields

Hi friends, I am a bit confused.
I have a requirement to add a field to an existing ODS object that is loaded by calling function modules in R/3. I have to write routines for that, but I don't know how to solve this problem. Can anyone please give me the proper steps to implement this?
Thanks in advance,
habeeb

Hi,
so you only need to add an InfoObject to an ODS?
1. Go to RSA1 - InfoProviders.
2. Look for your ODS and double-click it.
3. Add the required InfoObject.
4. Activate. Go back.
5. Go to the source system and look for the modified DataSource (the one that uses the modified function module).
6. Replicate the DataSource.
7. Press the + sign.
8. In DataSource/Transfer Structure, look in the right column for your new field and move it to the left column.
9. In the transfer rules, assign the corresponding InfoObject to this new field.
10. Add this InfoObject to the communication structure above.
11. Activate.
12. Go to InfoProviders and look for your ODS. Under it you will find the update rules. Edit them and assign the new InfoObject correctly.
13. Activate.
That's all.
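If the new field has to be filled by logic in the transfer rules (step 9) rather than by a direct assignment, that is done in a transfer routine. A minimal sketch of the routine body, assuming BW 3.x and an invented field name ZZNEWFLD (BW generates the surrounding FORM frame; your field and InfoObject names will differ):
* Move the source field to the target InfoObject, normalising case.
  RESULT = TRAN_STRUCTURE-zznewfld.
  TRANSLATE RESULT TO UPPER CASE.
* RETURNCODE 0 = record is OK; ABORT <> 0 would cancel the packet.
  RETURNCODE = 0.
  ABORT = 0.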
Hope this helps.
Regards,
Diego

Similar Messages

  • Characteristics not getting displayed from ODS in MultiProvider

    I have joined an ODS and a remote cube in a MultiProvider, and data from the ODS is not getting displayed: neither characteristics nor key figures from the ODS show up in the MultiProvider. I have checked the identification, and the characteristic is present in both the cube and the ODS; since the characteristic has data in the ODS and not in the cube, I ticked the characteristic from the ODS in the identification. I am still not getting data from the ODS.
    Can anyone help me with it?

    How many requests do you have in the ODS?
    If you have only one, then just right-click the DSO and choose delete data; that will be faster than deleting request by request.
    What message are you getting when you try to delete the request?
    Khaja

  • Error in updating data from ODS to CUBE.

    Hi,
    I am trying to load data manually from ODS to cube in NW2004s.
    This is a flat-file load from the DataSource to the ODS and then from the ODS to the cube.
    In the cube, I am trying to populate fields using the ODS fields.
    For example:
    In the ODS, a CHAR InfoObject holds data in a timestamp format (i.e. mm/dd/yyyy hh:mm). I need to split this data and assign it to two individual DATE and TIME InfoObjects in the cube.
    For this, I have done the coding in the transfer structure, in the rule group.
    The time field is getting populated, but the date field is not.
    I get an error such as:
    Value '04052007' for CHAR 0DATE is not plausible
    Because of this, the corresponding records are not getting displayed.
    Also, in the records where the time is displayed, the date is not displayed despite the date being correct.
    Please help me with a solution for this.
    REMOVED
    Thanks in advance.
    Hitesh Shetty

    Hello Hitesh,
    SAP accepts the date format YYYYMMDD, so in the routine where you concatenate the day, month and year, just do it in reverse order: year, then month, then day.
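    For example, a minimal sketch of the routine body, assuming the source CHAR field holds mm/dd/yyyy hh:mm and an invented field name ZZTSTAMP:
    * lv_ts e.g. '04/05/2007 14:30' (mm/dd/yyyy hh:mm)
    DATA lv_ts TYPE c LENGTH 16.
    lv_ts = COMM_STRUCTURE-/bic/zztstamp.   " hypothetical source field
    * 0DATE expects YYYYMMDD: year first, then month, then day
    CONCATENATE lv_ts+6(4) lv_ts+0(2) lv_ts+3(2) INTO RESULT.
    * the TIME rule would build HHMMSS the same way:
    * CONCATENATE lv_ts+11(2) lv_ts+14(2) '00' INTO RESULT.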
    Thanks
    Tripple k

  • Transformation Rule: Error while loading from PSA to ODS using DTP

    Hi Experts,
    I am trying to load data from PSA to ODS using DTP. For about 101 records I get the following error:
    "Runtime error while executing rule -> see long text     RSTRAN     301"
    On further looking at the long text:
    Diagnosis
        An error occurred while executing a transformation rule:
        The exact error message is:
        Overflow converting from ''
        The error was triggered at the following point in the program:
        GP4808B5A4QZRB6KTPVU57SZ98Z 3542
    System Response
        Processing the data record has been terminated.
    Procedure
          The following additional information is included in the higher-level
         node of the monitor:
         o   Transformation ID
         o   Data record number of the source record
         o   Number and name of the rule which produced the error
    Procedure for System Administration
    When looking at the detail:
    Error Location: Object Type    TRFN
    Error Location: Object Name    06BOK6W69BGQJR41BXXPE8EMPP00G6HF
    Error Location: Operation Type DIRECT
    Error Location: Operation Name
    Error Location: Operation ID   00177 0000
    Error Severity                 100
    Original Record: Segment       0001
    Original Record: Number        2
    Please, can anyone help in tracing this error to the exact spot in the transformation rule?
    Thanks & Regards,
    Raj

    Jerome,
    I have the same issue.
    Here are some fields whose lengths differ between the ODS and the DataSource when mapped in the transformation rules:
    ODS field                   | DataSource field
    PROD_CATEG   CHAR 32        | CATEGORY_GUID   RAW 16
    CRM_QTYEXP   INT4           | EXPONENT        INT2
    CRM_EXCRAT   FLTP 16        | EXCHG_RATE      DEC 9
    CRM_GWEIGH   QUAN 17,3      | GROSS_WEIGHT    QUAN 15
    NWEIGH       QUAN 17,3      | NET_WEIGHT      QUAN 15
    CRMLREQDAT   DATS 8         | REQ_DLV_DATE    DEC 15
    The difference is always one of these: a DATS field mapped to a decimal field, a CHAR 32 field mapped to RAW 16, or 0CALWEEK/0CALMONTH mapped to 0CALDAY. In almost every case the ODS field is larger than the source field.
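    Since the runtime error above is an overflow converting from '' (an empty source value), it can help to guard the rule routines against initial input. A minimal sketch for the EXPONENT mapping, assuming a BI 7 rule routine (names from the list above):
    * Guard against empty source values before the type conversion.
    IF SOURCE_FIELDS-exponent IS INITIAL.
      RESULT = 0.                        " or skip the record, as required
    ELSE.
      RESULT = SOURCE_FIELDS-exponent.   " INT2 -> INT4 widens safely
    ENDIF.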
    Thanks
    Raj

  • Quantity conversion: how to map 0MAT_UNIT_ATTR to the UOM0MATE ODS

    Hi dear BWers,
    Can any of you help with mapping the 0MAT_UNIT_ATTR fields to the UOM0MATE ODS fields in conversion?
    (UOM0MATE is my generated ODS for conversion factors.)
    I've mapped them as below, but I am not sure which DataSource field to map to 0BASE_UOM:
    Material (0MATERIAL) ==> DataSource: MATNR Material
    Quantity - Counter (0UOMZ1D) ==> DataSource: UMREZ Counter
    Quantity - Denominator (0UOMN1D) ==> DataSource: UMREN Denominator
    Base Unit of Measure (0BASE_UOM) ==> DataSource: ???
    Unit of Measure (0UNIT) ==> DataSource: MEINH Alternative Unit
    Regards,
    Rozz

    Solved on my own, as below:
    0BASE_UOM exists in the master data table of InfoObject 0MATERIAL.
    When creating the transformation rule for the ODS, 0BASE_UOM should be mapped to the master data attribute of 0MATERIAL.
    So every item's base unit is loaded not from 0MAT_UNIT_ATTR or a similar DataSource, but from 0MATERIAL master data.
    (Of course, the 0MATERIAL master data should have been loaded beforehand.)

  • ODS data Load from multiple sources.... Help

    Hello everyone,
    I have a situation:
    I have an ODS where, out of 150 fields, 100 come from flat files, 30 from other ODSs and 20 from master data tables (some are attributes of master data InfoObjects).
    I am thinking I can do the following. Please advise whether this is fine or whether there is a better way, e.g. for performance:
    1. Load the flat-file fields via a flat-file DataSource/InfoSource/update rules.
    2. Load the other ODS fields/data in the start routine of the above update rules???
    3. Load the master data objects through master data InfoSources.
    Please confirm whether I can do this. Also, is there any order that I have to maintain for the same? And finally:
    When I declare the master data object (say 0MATERIAL) as one of my data objects in the ODS, and some of the fields as attributes of this 0MATERIAL, and someone is already loading data into 0MATERIAL, should I load it again or not? I remember master data is stored in separate tables. Please remove my confusion :)
    All useful answers will be given full points.
    Thanks a lot...

    Hi:
    1. Load the flat-file fields via a flat-file DataSource/InfoSource/update rules.
    2. Load the other ODS fields/data in the start routine of the above update rules???
    3. Load the master data objects through master data InfoSources.
    The third one is only possible if those InfoObjects are InfoProviders, because if the InfoObjects are direct update, you cannot use their InfoSources for any loads other than the InfoObjects themselves.
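    For the second point, the start routine would read the other ODS's active table once per packet and fill the fields record by record. A minimal sketch, assuming BW 3.x update rules and invented names (active table /BIC/AZODS200, key DOC_NUMBER, looked-up field ZZFIELD):
    * Look up ZZFIELD from another ODS by document number.
    TYPES: BEGIN OF ty_lookup,
             doc_number TYPE c LENGTH 10,
             zzfield    TYPE c LENGTH 20,
           END OF ty_lookup.
    DATA: lt_lookup TYPE HASHED TABLE OF ty_lookup
                    WITH UNIQUE KEY doc_number,
          ls_lookup TYPE ty_lookup.

    IF NOT data_package[] IS INITIAL.
      SELECT doc_number zzfield
        FROM /bic/azods200 INTO TABLE lt_lookup
        FOR ALL ENTRIES IN data_package
        WHERE doc_number = data_package-doc_number.
    ENDIF.

    LOOP AT data_package.
      READ TABLE lt_lookup INTO ls_lookup
           WITH TABLE KEY doc_number = data_package-doc_number.
      IF sy-subrc = 0.
        data_package-/bic/zzfield = ls_lookup-zzfield.
        MODIFY data_package.
      ENDIF.
    ENDLOOP.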
    Please confirm whether I can do this. Also, is there any order that I have to maintain for the same?
    Order is needed only if the key figures are set to overwrite and the business requires a particular order.
    And finally: when I declare the master data object (say 0MATERIAL) as one of my data objects in the ODS, and some of the fields as attributes of this 0MATERIAL, and someone is already loading data into 0MATERIAL, should I load it again or not?
    Didn't understand this part clearly.
    Do you have the attributes of 0MATERIAL in the ODS, and are you populating them in the ODS?
    Ram C.

  • Delta Update for ODS

    Hi,
    We have created a view on the R/3 side based on MARA and MARC. Using this view we created a DataSource on the R/3 side with generic delta on the MATNR field as a numeric pointer and a safety interval lower limit of 10.
    On the BW side we have replicated the DataSource and, in the transfer structure, mapped all the DataSource fields to the ODS fields. But 0RECORDMODE is not mapped; I am not sure what needs to be mapped to this field.
    When I execute the delta InfoPackage, the changes are not reflected in the ODS. Can anyone help me with this?
    Regards,
    Subbu

    Hi,
    If you want information from MARA and MARC, you don't have to create a view on these two tables. 0MATERIAL gets its data from 0MATERIAL_ATTR and 0MATERIAL_TEXT, which read MARA and MAKT respectively; for MARC you have 0MAT_PLANT in BW, which gets its data from 0MAT_PLANT_ATTR and 0MAT_PLANT_TEXT in R/3. So try to use these two InfoObjects delivered by SAP, and if you want to report on both of them you could go for an InfoSet in BW. In any case, a view on these two tables will return only the materials common to both, effectively just the materials from MARC, since MARA contains all of them and possibly a few more. If instead you use an InfoSet in BW and join them with a left outer join, you will get all the material numbers from MARA.
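    To illustrate the join semantics in ABAP (the database view behaves like the inner join; MARA and MARC are the standard tables, the rest is a sketch):
    * Inner join: only materials that exist in both MARA and MARC.
    * Left outer join: all MARA materials, MARC fields initial if absent.
    TYPES: BEGIN OF ty_row,
             matnr TYPE mara-matnr,
             werks TYPE marc-werks,
           END OF ty_row.
    DATA: lt_inner TYPE TABLE OF ty_row,
          lt_outer TYPE TABLE OF ty_row.

    SELECT a~matnr b~werks
      FROM mara AS a INNER JOIN marc AS b ON a~matnr = b~matnr
      INTO TABLE lt_inner.

    SELECT a~matnr b~werks
      FROM mara AS a LEFT OUTER JOIN marc AS b ON a~matnr = b~matnr
      INTO TABLE lt_outer.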
    Hope it helps...

  • Interview Questions

    Hi Gurus,
    I am new to BW.
    Can you please let me know the answers to the following questions?
    1.     What is a Data warehouse? List Few Properties of a Data warehouse.
    2. What are the major challenges of any Data Warehouse design?
    3. Data loading issues, Reporting, Production support issues?
    4.     Data Modeling, Info cube, ODS obj creation?
    5.     SAP batch processes on the BW side?
    6.     How do you schedule background jobs in ABAP and in BW?
    7.     Variables in a query? User exit variables? How do you create a user exit variable, and which have you used?
    8.     Structures in Query Reporting? What are structures?
    9.     Transportation steps?
    10.     How to compress the Info cube? What happens when you compress the info cube?
    11.     Performance issues in Reporting? How can you improve the performance of the query?
    12.     Where are ABAP routines used on the BW side?
    13.     How to create Primary index & secondary indexes on ODS?
    14.     How to prevent the duplicate records at the data target level?
    15.     Issues regarding double records?
    16.     Info sets and multiprovider differences?
    17.     Issues faced in Delta loading? Why it happened?
    Enhancements to the Data sources?
    18.     Issues in loading data from flat files? Delta load issues? Which would you suggest for flat-file loading: delta or full load?
    19.     How to prevent the Replication Errors? What happens when you replicate the Data Sources?
    20.     Process chain steps? Which process type is used to delete the last request when loading the data?
    21.     What is virtual cube? Characteristics? Its significance?
    22.     Diff. methods for Generic Data Sources?
    23.     What is Extract structure? Where it is used?
    24.     Data Modeling – design issues, Tech specifications, Modeling , Reporting, Testing, Transportation
    25.     Extraction steps on the R/3 side? LO's?
    26.     How do you set up LIS?
    27.     SAP batch process? Where do you set up the batch process?
    28.     Reconstruction tab in Info cube? Why it is used?
    29.     Suppose InfoCube (A) has 10 records and we want to take some records, say 4-7, from InfoCube (A) -> InfoCube (B). How do you handle the situation?
    30.     Suppose the ODS contains 5 records and the InfoCube shows 12 records. How do you solve it?
    31.     Landscape
    32.     Tell me about a typical BW team and how you work.
    33.     How do you maintain master data in a client-server architecture?
    34.     Query performance problems? How do you improve query performance?
    35.     How do you improve InfoCube design performance?
    36.     How do you improve dimension and fact table performance?
    37.     How to push data from PSA to ODS?
    38.     How to postpone daily load?
    39.     The functions of Administrator Workbench are…
    40.     What is RECORD MODE
    41.     What is partitioning? How do you partition the InfoCube and the PSA?
    42.     How do you filter single records when uploading into an ODS object?
    43.     How can you connect a new data target to an existing data flow?
    44.     When is it advantageous to create secondary indexes for ODS fields?
    45.     Purpose of setup tables?
    46.     What is Delta mechanism?
    47.     Will you create an InfoSource for every DataSource? How many InfoSources can we connect to a DataSource?
    48.     What is a Star Schema? What is the Differences between a Classic Star Schema & Extended Star schema.
    49.     What is an Attribute? Difference between Display Attribute & Navigational Attribute?
    50.     What is Transfer Rule? List the methods used in Transfer Rules.
    51.     Why do we need ODS? List a few of the technical settings that can be defined when building/modifying an ODS object.
    52.     What is ODS? What are the three tables associated with an ODS object? What are the two types of ODS.
    53.     Name the two tables that provide detailed information about data sources.
    54.     What are two data transfer methods? Which is the preferred method and why?
    55.     Where will the development take place?&  Who will do the development testing.
    56.     Who will be responsible for long-term support?
    57.     What is a Slowly Changing Dimension?
    58.     What is namespace for SAP BW?
    59.     What are the nine decision points of a data warehouse?
    60.     How do you install BW Statistics? How do you enable monitoring using BW Statistics?
    61.     How do you rate yourself in modeling, reporting and extraction?
    62.     What are the advantages with LO Extraction.
    63.     What are the steps in General to enhance data?
    64.     What are Phases in ASAP Methodology?
    65.     What is the 90 Day rule?
    66.     What is the demo content use?
    67.     What is the use of RSRAJ Transaction?
    68.     What is the use of RSSU53 Transaction?
    69.     Can you repeat the master data source?
    70.     What is the difference between DW and BW?
    71.     Will you use IDoc methods in BW?
    72.     What does the number in the 'Total' column in Transaction RSA7 mean?
    73.     The extract structure was changed when the Delta Queue was empty. Afterwards new delta records were written to the Delta Queue. When loading the delta into the PSA, it shows that some fields were moved. The same result occurs when the contents of the Delta Queue are listed via the detail display. Why are the data displayed differently? What can be done?
    74.     How and where can you control whether a repeat delta is requested?
    75.     Can I use several clients within SAP BW?
    Thanks in Advance
    murali

    Hi,
    I have a few questions and answers here which you can check out; they should be helpful.
    Questions and answers:
    Question 1:
    What does the number in the 'Total' column in Transaction RSA7 mean?
    Answer:
    The 'Total' column displays the number of LUWs that were written in the delta queue and that have not yet been confirmed. The number includes the LUWs of the last delta request (for repeating a delta request) and the LUWs for the next delta request. An LUW only disappears from the RSA7 display when it has been transferred to the BW System and a new delta request has been received from the BW System.
    Question 2:
    What is an LUW in the delta queue?
    Answer:
    An LUW from the point of view of the delta queue can be an individual document, a group of documents from a collective run or a whole data packet from an application extractor.
    Question 3:
    Why does the number in the 'Total' column, in the overview screen of Transaction RSA7, differ from the number of data records that are displayed when you call up the detail view?
    Answer:
    The number on the overview screen corresponds to the total number of LUWs (see also question 1) that were written to the qRFC queue and that have not yet been confirmed. The detail screen displays the records contained in the LUWs. Both the records belonging to the previous delta request and the records that do not meet the selection conditions of the preceding delta init requests are filtered out. This means that only the records that are ready for the next delta request are displayed on the detail screen. The detail screen of Transaction RSA7 does not take into account a possibly existing customer exit.
    Question 4:
    Why does Transaction RSA7 still display LUWs on the overview screen after successful delta loading?
    Answer:
    Only when a new delta has been requested does the source system learn that the previous delta was successfully loaded into the BW System. The LUWs of the previous delta may then be confirmed (and also deleted). In the meantime, the LUWs must be kept for a possible delta request repetition. In particular, the number on the overview screen does not change if the first delta is loaded into the BW System.
    Question 5:
    Why are selections not taken into account when the delta queue is filled?
    Answer:
    Filtering according to selections takes place when the system reads from the delta queue. This is necessary for performance reasons.
    Question 6:
    Why is there a DataSource with '0' records in RSA7 if delta exists and has been loaded successfully?
    Answer:
    It is most likely that this is a DataSource that does not send delta data to the BW System via the delta queue but directly via the extractor . You can display the current delta data for these DataSources using TA RSA3 (update mode ='D')
    Question 7:
    Do the entries in Table ROIDOCPRMS have an impact on the performance of the loading procedure from the delta queue?
    Answer:
    The impact is limited. If performance problems are related to the loading process from the delta queue, then refer to the application-specific notes (for example in the CO-PA area, in the logistics cockpit area, and so on).
    Caution: As of PlugIn 2000.2 patch 3, the entries in Table ROIDOCPRMS are as effective for the delta queue as for a full update. Note, however, that LUWs are not split during data loading for consistency reasons. This means that when very large LUWs are written to the delta queue, the actual package size may differ considerably from the MAXSIZE and MAXLINES parameters.
    Question 8:
    Why does it take so long to display the data in the delta queue (for example approximately 2 hours)?
    Answer:
    With PlugIn 2001.1 the display was changed: you are now able to define the amount of data to be displayed, to restrict it, to selectively choose the number of a data record, to make a distinction between the 'actual' delta data and the data intended for repetition, and so on.
    Question 9:
    What is the purpose of the function 'Delete Data and Meta Data in a Queue' in RSA7? What exactly is deleted?
    Answer:
    You should act with extreme caution when you use the delete function in the delta queue. It is comparable to deleting an InitDelta in the BW System and should preferably be executed there. Not only do you delete all data of this DataSource for the affected BW System, but you also lose all the information concerning the delta initialization. Then you can only request new deltas after another delta initialization.
    When you delete the data, this confirms the LUWs kept in the qRFC queue for the corresponding target system. Physical deletion only takes place in the qRFC outbound queue if there are no more references to the LUWs.
    The delete function is intended for example, for cases where the BW System, from which the delta initialization was originally executed, no longer exists or can no longer be accessed.
    Question 10:
    Why does it take so long to delete from the delta queue (for example half a day)?
    Answer:
    Import PlugIn 2000.2 patch 3. With this patch the performance during deletion improves considerably.
    Question 11:
    Why is the delta queue not updated when you start the V3 update in the logistics cockpit area?
    Answer:
    It is most likely that a delta initialization has not yet run or that the delta initialization was not successful. A successful delta initialization (the corresponding request must have QM status 'green' in the BW System) is a prerequisite for the application data to be written to the delta queue.
    Question 12:
    What is the relationship between RSA7 and the qRFC monitor (Transaction SMQ1)?
    Answer:
    The qRFC monitor basically displays the same data as RSA7. The internal queue name must be used for selection on the initial screen of the qRFC monitor. It is made up of the prefix 'BW', the client and the short name of the DataSource. For DataSources whose name is shorter than 20 characters, the short name corresponds to the name of the DataSource. For DataSources whose name is longer than 19 characters (for delta-capable DataSources this is only possible as of PlugIn 2001.1), the short name is assigned in Table ROOSSHORTN.
    In the qRFC monitor you cannot distinguish between repeatable and new LUWs. Moreover, the data of a LUW is displayed in an unstructured manner there.
    Question 13:
    Why is there data in the delta queue although the V3 update has not yet been started?
    Answer:
    You posted data in the background. This means that the records are updated directly in the delta queue (RSA7). This happens in particular during automatic goods receipt posting (MRRS). There is no duplicate transfer of records to the BW system. See Note 417189.
    Question 14:
    Why does the 'Repeatable' button on the RSA7 data details screen not only show data loaded into BW during the last delta but also newly-added data, in other words, 'pure' delta records?
    Answer:
    It was programmed so that the request in repeat mode fetches both actually repeatable (old) data and new data from the source system.
    Question 15:
    I loaded several delta inits with various selections. For which one is the delta loaded?
    Answer:
    For delta, all selections made via delta inits are summed up. This means a delta for the 'total' of all delta initializations is loaded.
    Question 16:
    How many selections for delta inits are possible in the system?
    Answer:
    With simple selections (intervals without complicated join conditions or single values), you can make up to about 100 delta inits. It should not be more.
    With complicated selection conditions, it should be only up to 10-20 delta inits.
    Reason: With many selection conditions that are joined in a complicated way, too many 'where' lines are generated in the generated ABAP source code which may exceed the memory limit.
    Question 17:
    I intend to copy the source system, i.e. make a client copy. What will happen with my delta? Should I initialize again after that?
    Answer:
    Before you copy a source client or source system, make sure that your deltas have been fetched from the delta queue into BW and that no delta is pending. After the client copy, an inconsistency might occur between BW delta tables and the OLTP delta tables as described in Note 405943. After the client copy, Table ROOSPRMSC will probably be empty in the OLTP since this table is client-independent. After the system copy, the table will contain the entries with the old logical system name which are no longer useful for further delta loading from the new logical system. The delta must be initialized in any case since delta depends on both the BW system and the source system. Even if no dump 'MESSAGE_TYPE_X' occurs in BW when editing or creating an InfoPackage, you should expect that the delta has to be initialized after the copy.
    Question 18.
    Am I permitted to use the functions in Transaction SMQ1 to manually control processes?
    Answer:
    Use SMQ1 as an instrument for diagnosis and control only. Make changes to BW queues only after informing BW Support or only if this is explicitly requested in a note for Component 'BC-BW' or 'BW-WHM-SAPI'.
    Question 19.
    Despite the delta request only being started after completion of the collective run (V3 update), it does not contain all documents. Only another delta request loads the missing documents into BW. What is the cause for this "splitting"?
    Answer:
    The collective run submits the open V2 documents to the task handler for processing. The task handler processes them in one or several parallel update processes in an asynchronous way. For this reason, plan a sufficiently large "safety time window" between the end of the collective run in the source system and the start of the delta request in BW. An alternative solution where this problem does not occur is described in Note 505700.
    Question 20.
    Despite deleting the delta init, LUWs are still written into the DeltaQueue
    Answer:
    In general, delta initializations and deletions of delta inits should always be carried out at a time when no posting takes place. Otherwise, buffer problems may occur: If you started the internal mode at a time when the delta initialization was still active, you post data into the queue even though the initialization had been deleted in the meantime. This is the case in your system.
    Question 21.
    In SMQ1 (qRFC Monitor) I have status 'NOSEND'. In the Table TRFCQOUT, some entries have the status 'READY', others 'RECORDED'. ARFCSSTATE is 'READ'. What do these statuses mean? Which values in the field 'Status' mean what and which values are correct and which are alarming? Are the statuses BW-specific or generally valid in qRFC?
    Answer:
    Table TRFCQOUT and ARFCSSTATE: Status READ means that the record was read once either in a delta request or in a repetition of the delta request. However, this still does not mean that the record has successfully reached the BW. The status READY in the TRFCQOUT and RECORDED in the ARFCSSTATE means that the record has been written into the delta queue and will be loaded into the BW with the next delta request or a repetition of a delta. In any case only the statuses READ, READY and RECORDED in both tables are considered to be valid. The status EXECUTED in TRFCQOUT can occur temporarily. It is set before starting a delta extraction for all records with status READ present at that time. The records with status EXECUTED are usually deleted from the queue in packages within a delta request directly after setting the status before extracting a new delta. If you see such records, it means that either a process which confirms and deletes records loaded into the BW is successfully running at the moment, or, if the records remain in the table for a longer period of time with status EXECUTED, it is likely that there are problems with deleting the records which have already been successfully been loaded into the BW. In this state, no more deltas are loaded into the BW. Every other status indicates an error or an inconsistency. NOSEND in SMQ1 means nothing (see note 378903). However the value 'U' in field 'NOSEND' of table TRFCQOUT is of concern.
    Question 22.
    The extract structure was changed when the delta queue was empty. Afterwards new delta records were written to the delta queue. When loading the delta into the PSA, it shows that some fields were moved. The same result occurs when the contents of the delta queue are listed via the detail display. Why is the data displayed differently? What can be done?
    Answer:
    Make sure that the change of the extract structure is also reflected in the database and that all servers are synchronized. We recommend resetting the buffers using Transaction $SYNC. If the extract structure change is not communicated synchronously to the server where delta records are being created, the records are written with the old structure until the new structure has been generated. This may have disastrous consequences for the delta. When the problem occurs, the delta needs to be re-initialized.
    Question 23. How and where can I control whether a repeat delta is requested?
    Answer:
    Via the status of the last delta in the BW Request Monitor. If the request is RED, the next load will be of type 'Repeat'. If you need to repeat the last load for any reason, manually set the request in the monitor to red. For the contents of the repeat, see Question 14. Delta requests set to red when data is already updated lead to duplicate records in a subsequent repeat, if they have not already been deleted from the data targets concerned.
    Question 24.
    As of PI 2003.1, the Logistic Cockpit offers various types of update methods. Which update method is recommended in logistics? According to which criteria should the decision be made? How can I choose an update method in logistics?
    Answer:
    See the recommendation in Note 505700.
    Question 25.
    Are there particular recommendations regarding the maximum data volume of the delta queue to avoid danger of a read failure due to memory problems?
    Answer:
    There is no strict limit (except for the restricted number area of the 24-digit QCOUNT counter in the LUW management table - which is of no practical importance, however - or the restrictions regarding the volume and number of records in a database table).
    When estimating "soft" limits, both the number of LUWs and the average data volume per LUW are important. As a rule, we recommend bundling data (usually documents) as soon as you write to the delta queue to keep number of LUWs low (this can partly be set in the applications, for example in the Logistics Cockpit). The data volume of a single LUW should not be much larger than 10% of the memory available to the work process for data extraction (in a 32-bit architecture with a memory volume of about 1 GByte per work process, 100 MByte per LUW should not be exceeded). This limit is of rather small practical importance as well since a comparable limit already applies when writing to the delta queue. If the limit is observed, correct reading is guaranteed in most cases.
    If the number of LUWs cannot be reduced by bundling application transactions, you should at least make sure that the data is fetched from all connected BWs as quickly as possible. But for other, BW-specific, reasons, the frequency should not exceed one delta request per hour.
    To avoid memory problems, a program-internal limit ensures that no more than 1 million LUWs are ever read and fetched from the database per delta request. If this limit is reached within a request, the delta queue must be emptied by several successive delta requests. We recommend, however, to try not to reach that limit but trigger the fetching of data from the connected BWs as soon as the number of LUWs reaches a 5-digit value.
    SANJEEV

  • Update ORDImage with in-memory BLOB?

    According to the interMedia docs, one of your examples is:
    DECLARE
      Image ORDSYS.ORDImage;
    BEGIN
      INSERT INTO emp VALUES (
        'John Doe', 24000, 'Technical Writer', 123,
        ORDSYS.ORDImage(ORDSYS.ORDSource(empty_blob(), NULL, NULL, NULL, SYSDATE, 1),
                        NULL, NULL, NULL, NULL, NULL, NULL, NULL));
      -- select the newly inserted row for update
      SELECT photo INTO Image FROM emp
        WHERE ename = 'John Doe' FOR UPDATE;
      -- use the getContent method to get the LOB locator, then
      -- populate the data with DBMS_LOB calls or an OCI program
      -- that fills in the image BLOB.
      -- set property attributes for the image data
      Image.setProperties;
      UPDATE emp SET photo = Image WHERE ename = 'John Doe';
      -- continue processing
    END;
    However, the part I'm stuck on is populating the data (using OCI or the LOB packages). For a regular BLOB field (or interMedia text), I can use the procedure outlined below:
    sSQL = "SELECT blob FROM store WHERE id = " & ID
    Set oDS = setDS(sSQL)
    oDS.Edit
    Set oBLOB = oDS.Fields("blob").Value
    sRet = oBLOB.Write(oFile.Binary, sSize)
    oDS.Update
    I retrieve the BLOB field (which was inserted with empty_blob()) into a dynaset using OO4O, edit the current row and write the new BLOB with the file and size parameters (which I get from the rest of my code). This does not work for a field type of ORDImage (the Write method is not available). Is there any sample code that will update the BLOB field with binary in-memory data as opposed to from an external file? Thanks,
    Kevin

    The interMedia object method getContent() returns a BLOB, which can then be read and written using the BLOB interface.
    So in your example, your SQL would be:
    sSQL = "SELECT s.image.getContent() FROM store s WHERE s.id = " & ID
    Then you can use the Fields(...).Value method to get the BLOB handle.

  • InfoObject* is inconsistent?

    Hi all,
    We got an error, InfoObject* is inconsistent, after we created the cube and finished the init load and delta load with the process chain created (not yet scheduled).
    Do we need to fix it, and how? If not, will it cause any problems for the data load?
    Thank you

    Hi John,
    Hi John,
    1. What problem will it cause if we don't repair it?
    There might be load issues and the data might not be displayed in the proper format.
    2. If we uploaded data from ODS A to the cube, do we need to use ODS A as the template in the cube creation?
    By template you mean having all the ODS fields in the cube as well? That depends upon your reporting requirement. You can have fewer, more or an equal number of fields in the cube compared to the ODS.
    Bye
    Dinesh.

  • Data type mismatch while writing code in a BAdI

    Hi Experts,
    While writing code in a BAdI, I am facing a data type mismatch.
    Scenario:
    I have created an InfoSpoke based on one ODS, and inside the BAdI I am looking up another ODS's fields.
    The two ODSs have 4 common key fields, but one key field has a data type mismatch.
    When selecting the data from the other ODS table, the WHERE condition gives a data type mismatch error.
    Could you please advise on this?
    Thanks.
    Gana.

    Any update please?
    Thanks in advance.

  • Mysterious InfoObject

    Hi Experts,
    Can you please solve the mystery below?
    I checked the metadata repository for an InfoObject. The metadata repository says it is used in an ODS, but when I look at the Contents tab in Manage Data Targets for that ODS, I can't find it. Moreover, when I run a query on this ODS I am able to see it in the query, and it even has some data in the query. Can anyone help me with this? Thank you.
    Dave

    Hi Dave,
    You can check from the InfoObject maintenance (RSD1) using the where-used list.
    It is also possible that the InfoObject is a reference object of another InfoObject.
    If you are getting values in the query, check the technical name of the field and validate it against the ODS fields by selecting all characteristics and key figures to display; sometimes you may not be choosing all of them.
    Hope this helps.
    Best Regards,
    DMK

  • Imp-code help

    Hi all,
    I am loading data from ODS1 to ODS2. The data in ODS1 is as follows:
    billno  item  type  sign  value
    1       1     X     P     7
    1       1     X     N     3
    1       1     X     N     3
    1       1     Y     P     2
    2       1     X     P     2
    2       1     X     N     2
    2       1     X     P     9
    For each combination of billno and item, considering only records where type = X, I have to add all values where sign equals P to populate ODS2 field 1, and all values where sign equals N to populate ODS2 field 2.
    I.e. my output should be:
    billno  item  kf1p  kf2n
    1       1     7     6
    2       1     11    2

    Hi,
    Create ODS2 with bill number and item as key fields. Create two key figures and, in the update rules, set their update mode to "Addition". In the start routine write:
    * keep only records of type X
    DELETE DATA_PACKAGE WHERE type <> 'X'.
    In the update routine of the 1st key figure write:
    * RESULT stays initial (0) for other signs, adding nothing
    IF comm_structure-sign = 'P'.
      RESULT = comm_structure-value.
    ENDIF.
    In the update routine of the 2nd key figure write:
    IF comm_structure-sign = 'N'.
      RESULT = comm_structure-value.
    ENDIF.
    Hope this helps.
    PB

  • Data target

    Hi all,
    1) How do you filter single records when uploading into an ODS object?
    2) How can you connect a new data target to an existing data flow?
    3) When is it advantageous to create secondary indexes for ODS fields?
    Thanxs,
    Rekha.

    Hi Rekha,
    For the second question you asked: you connect the data target to the relevant data source.
    Just go to the context menu of your data target and choose the update rules. On the update rules screen you can select the option for where the data comes from; for example, if you want to load data from one cube to another, you can specify the source cube name there. Upon activating the rules, the data target is connected to that data source.
    Hope this clears your doubt.
    Let me know.
    Cheers,
    ravi

  • Adding new fields to a BW 3.5 ODS and cube, and transporting to PRD

    Hi,
    We have a scenario on 3.5 where there is an enhancement to an ODS and a cube (a few new fields are added); this ODS also feeds data to the cube. Since we had no data in the quality system, we had no problem adding fields to the ODS and cube, but now we need to transport these changes to production, where the ODS and cube hold a lot of data. We have a few doubts:
    1. Do we need to delete data from the ODS and cube before transporting the request to the production server?
    2. Is it OK to move the transport request without deleting data in the ODS and the subsequent cube in the production system?
    Guys and gals,
    what is your suggestion on this one? We are on BW 3.5 only, no BI7.
    Please revert back.

    Hi
    you can transport that directly to production.
    The image will overwrite the existing one, and for the newly added object new table space will be created.
    It will not affect the old data.
    For the cube, even if data is there, there is also a concept called remodeling:
    http://help.sap.com/saphelp_nw70/helpdata/en/58/85e5414f070640e10000000a1550b0/content.htm
    hope this helps
    santosh
