Data archiving questionnaire required

Dear All,
We have been approached by one of our clients for DATA ARCHIVING from an R/3 system.
Management has put my name forward for this.
The requirement is to prepare a DATA ARCHIVING QUESTIONNAIRE template.
Can anybody please help me out in this regard from an MM point of view?
Thanking you in advance.
Regards
Nasir Chapparband.

Hi,
Refer to the following link:
[http://itmanagement.earthweb.com/datbus/article.php/3109221]
SAP Data Archiving
1.0 Introduction to Enterprise Data Archiving
Currently, a large number of enterprises use SAP R/3 as a platform for integrating business processes. Continuous use of SAP generates huge amounts of enterprise data, all of which is stored in SAP R/3. As time passes, new and updated data is entered into the system while the old data still resides there.
Since some of the old data is critical, it cannot simply be deleted. The difficulty is keeping the data you want and deleting the data you do not. As a result, an SAP database keeps expanding rapidly, and enterprise systems, which can efficiently retain only a few years of data, suffer from problems such as data overflow, longer transaction processing times, and performance degradation.
The solution to this problem is the concept of Data Archiving in SAP. Data Archiving removes from the SAP database out-of-date data that the R/3 system does not need online but that can be retrieved at a later date if required. This data is known as archived data and is stored at an offline location. Data Archiving not only removes data from the database consistently but also ensures its availability for future business requirements.
One rule of thumb is that in a typical SAP enterprise system, the ratio of data required to be online and instantly accessible to old data that could be archived and stored offline is 1:6. For example, if an enterprise has a 2100 GB SAP database, the online data that SAP users access frequently will be about 300 GB, and the rest (1800 GB) will be scarcely used and can therefore be archived.
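To make the mechanics concrete, here is a minimal, generic sketch of the archive-then-delete pattern described above. This is not SAP's Archive Development Kit; the table, column, and file names are invented, and sqlite3 stands in for the enterprise database:

import csv
import sqlite3

RETENTION_CUTOFF = "2005-01-01"  # rows older than this go offline

conn = sqlite3.connect("enterprise.db")
conn.execute("CREATE TABLE IF NOT EXISTS documents "
             "(doc_id INTEGER, doc_date TEXT, payload TEXT)")

old_rows = conn.execute(
    "SELECT doc_id, doc_date, payload FROM documents WHERE doc_date < ?",
    (RETENTION_CUTOFF,),
).fetchall()

# Step 1: write the old data to an offline archive file...
with open("archive_pre2005.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["doc_id", "doc_date", "payload"])
    writer.writerows(old_rows)

# Step 2: ...and only then delete it from the online database, so the data
# stays retrievable later while the database stops growing.
conn.execute("DELETE FROM documents WHERE doc_date < ?", (RETENTION_CUTOFF,))
conn.commit()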
1.1 Data Archiving – Features
Data archiving provides a protective layer for the SAP database and resolves performance problems caused by huge volumes of data. SAP users should keep only the minimal data needed to work efficiently with the database and servers. Data archiving ensures that the SAP database contains only relevant, up-to-date data that meets your requirements.
Data archiving uses hardware components such as hard disks and memory; efficient archiving keeps the number of disks and the disk space required to a minimum.
It also reduces the system maintenance costs associated with the SAP database, since procedures such as data backup, data recovery, and data upgrades run faster against a smaller database.
SAP data archiving complies with statutory data retention rules, using common and well-proven techniques.
SAP data archiving can be implemented in two ways; both options are discussed in detail in the next section.
Also refer to the following link:
[SAP Data Archiving Tutorial|http://www.thespot4sap.com/articles/SAP_Data_Archiving_Overview.asp]

Similar Messages

  • Field Bline date is a required field for G/L account 1200 122400

    While posting a goods receipt document using MIGO for a PO, I am receiving the following error:
    Field Bline date is a required field for G/L account 1200 122400
    This happens even though both the document date and posting date fields are filled with appropriate dates.
    The detailed message reads as follows; my question now is how to change the baseline date for company code 1200, G/L account 122400.
    Field Bline date is a required field for G/L account 1200 122400
    Message no. F5 808
    Diagnosis
    The value for field "Bline date" in the interface to Financial Accounting is an initial value but you are required to make an entry in the field selection for G/L account "122400" in company code "1200" linked to the field selection for posting key "81".
    System Response
    Error
    Procedure
    It might be an error in the configuration of the G/L account field selection. The initial application, used to call up the interface must otherwise define a value for field "Bline date". If this is the case, contact the consultant responsible for the application used to call up the interface or get in contact with SAP directly.

    Hello Ravinagh,
    Here's what I did:
    OB41 > double-clicked posting key 81 > clicked Field status > Payment transactions > both Due date and Value date have already been set to optional.
    For OB14 it asks me to input a field status variant.
    There are three of them. Now here is the tricky part: how do I find out which field status variant has been assigned?
    0001     Field status for 0001
    1000     Field status for CoA 1000
    3000     Field status for CoC 3000
    I went further and checked all three variants one by one,
    and found that in Field status > Payment transactions, for the following entry in the field status group
    G045     Goods/invoice received clearing accounts
    the DUE DATE and VALUE DATE are suppressed. What do I need to do here?
    Next I checked OBB8 and found the following three options. I don't know which of them is applicable to G/L account 122400 in company code 1200. Where can I find the assignment? How can I find which payment term is being used?
    PayT   Day limit   Explanations
    0001       0       Payable immediately, due net
    0002       0       Within 14 days 3% cash discount
                       Within 30 days 2% cash discount
                       Within 45 days due net
    0003      15       Within 14 days 2% cash discount
                       Within 30 days 1.5% cash discount
                       Within 45 days due net
                       Baseline date on the 30th of the month
    (Entry 1 of 40)
    Well I went ahead and clicked on each of them
    For 0001 the Default for Baseline date is set to Posting date (Do I need to change anything here?)
    For 0002 the Default for Baseline date is set to Document date (Do I need to change anything here?)
    For 0003 the Default for Baseline date is set to Posting date (Do I need to change anything here?)
    Please guide me here. If you need screenshots I would be glad to mail it to you.
    I hope you understood the scenario.
    Thanks
    Ron

  • BI 7.0 Data Source Question

    I have a requirement in a functional spec based on tables EKKO and EKPO. The DataSources 2LIS_02_ITM, 2LIS_02_SCL, etc. carry these tables' data. But in BI 7.0 there is a DataSource called 2LIS_06_INV which provides data from all these DataSources, i.e.:
    The new DataStore object of Invoice Verification (0LIV_DS01) is supplied with data by the new DataSource 2LIS_06_INV, whereas previously the relevant invoice verification data was only supplied by the three DataSources of Purchasing 2LIS_02_HDR (purchasing data (header level)), 2LIS_02_ITM (purchasing data (item level)), and 2LIS_02_SCL (purchasing data (delivery schedule line level)).
    I am already using 2LIS_06_INV in another data model for a different requirement. So my question is: instead of spending time building this with 2LIS_02_ITM/HDR/SCL, can I just use the new DataSource that I am already using for the other requirement?
    Thanks

    Hi Harish,
    You can use the same DataSource to load data into different InfoProviders. Just create new transformation rules and a DTP for the new InfoProvider, assigning your DataSource as the source. In the InfoPackage, schedule the data transfer to the PSA, and then execute the DTP to load the data into the required InfoProvider.
    Hope this will help resolve your query.
    Regards
    Vaibhav

  • Dear Customer,  It hase come to our attention that your account billing Information records are out of date. That requires you to update your Billing Information. Failure to update your records will result in account termination records...  Click on

    Dear Customer,
    It hase come to our attention that your account billing Information records are out of date. That requires you to update your Billing Information. Failure to update your records will result in account termination records...
    Click on the reference link below and enter your login Information on the following page to confirm your Billing Information records..
    Verify Now >
    Wondering why you got this email?
    It's sent when someone adds or changes a contact email address for an Apple ID account. If you didn't do this, don't worry. Your email address cannot be used as a contact address for an Apple ID without your verification.
    For more information, see our frequently asked questions.
    Thanks,
    Apple Customer Support
    TM and copyright © 2014 Apple Inc. Apple Sales International, Hollyhill Industrial Estate, Cork, Ireland. Company Registration number: 15719. VAT number: IE6554690W.
    All Rights Reserved / Keep Informed / Privacy Policy / My Apple ID
    Anyone else get this today?
    And yep, I know this message doesn't belong here, but the ability to post in the general discussion area is lost to me because of the ridiculous way Apple set up their support.

    It's phishing. Forward it to [email protected] and then delete it.

  • HT5815 The most important piece of info about an update I need is how much data does it require. I am travelling and using prepaid data. I cannot even consider an update until I know how much data it will use. Please provide this information.

    The most important piece of info about an update I need is how much data does it require. I am travelling and using prepaid data. I cannot even consider an update until I know how much data it will use. Please provide this information.

    http://www.apple.com/feedback/macosx.html

  • The DATA CAP MEGA THREAD....post data cap questions or comments here.

    In the interest of keeping things orderly....This is the DATA CAP MEGA THREAD....post data cap questions or comments here.
    Please keep it civil.
    Comcast is testing usage plans (AKA "data caps") in certain markets.
    The markets that are currently testing usage plans are:
    Nashville, Tennessee market: 300 GB per month and additional gigabytes in increments/blocks ( e.g., $10.00 per 50 GB ). 
    Tucson, Arizona market: Economy Plus through Performance tiers receive 300 GB. Those customers subscribed to the Blast! Internet tier receive 350 GB; Extreme 50 customers receive 450 GB; Extreme 105 customers receive 600 GB. Additional gigabytes in increments/blocks of 50 GB for $10.00 each in the event the customer exceeds their included data amount. 
    Huntsville and Mobile, Alabama; Atlanta, Augusta and Savannah, Georgia; Central Kentucky; Maine; Jackson, Mississippi; Knoxville and Memphis, Tennessee and Charleston, South Carolina: 300 GB per month and additional gigabytes in increments/blocks ( e.g., $10.00 per 50 GB ) Economy Plus customers have the option of enrolling in the Flexible-Data plan.
    Fresno, California, Economy Plus customers also have the option of enrolling in the Flexible-Data plan.
    - If you live outside of these markets you ARE NOT currently subject to a data plan.
    - Comcast DOES NOT THROTTLE your speed if you exceed your usage limits.
    - You can check out the Data Usage Plan FAQ for more information.
     

    I just got a call today that I reached my 300 GB limit for the month. I called and got a pretty rude response from the security and data usage department. The guy told me in so many words that if I do not like or agree with the policy, I should feel free to find another service provider!!! I tried to explain that we watch Netflix and XFINITY On Demand a lot, and I was told that that cannot be anywhere close to the data usage. I checked my router, and watching a "Super HD, Dolby 5.1" TV show on Netflix averages about 5-6 GB per hour (1.6 MB/s)... so this means that I can watch no more than 1-2 Super HD TV shows a day via Netflix before I run out of my data allowance. This seems a bit ridiculous, doesn't it? Maybe the TV ads about higher speed than the competition should be accompanied by "as long as you don't use it too often". Not a good experience...
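    For what it's worth, the poster's arithmetic checks out; a quick sanity check (the figures are the poster's observations, not Comcast's numbers):

    rate_mb_s = 1.6                        # observed Netflix Super HD rate, MB/s
    gb_per_hour = rate_mb_s * 3600 / 1000  # ~5.8 GB/hour, matching the 5-6 GB claim
    cap_gb = 300
    hours_per_month = cap_gb / gb_per_hour # ~52 hours of Super HD per month
    print(round(gb_per_hour, 1), "GB/h;", round(hours_per_month), "h/month;",
          round(hours_per_month / 30, 1), "h/day")  # ~1.7 h/day, i.e. 1-2 shows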

  • SAP Data archiving

    Hi Experts,
    Can anyone please help me with SAP data archiving material? I would like material to look into data archiving. Your help will be appreciated.
    Venkat.

    Data which needs to be archived includes Customer Master, Material Master, Bills of Material, Master Recipe, Process Order, Vendor Master, Purchasing Info Record, Sales Pricing Conditions, and Purchasing Pricing Conditions.
        To archive vendor master data, we first need to archive its dependent objects.
        The dependent objects are:
    1)     MM_EINA
    2)     FI_DOCUMNT           
    3)     FI_MONTHLY
    4)     MM_EKKO

  • I have a MacBook running Leopard software, 10.5.8. How can I get my computer up to date, as Lion requires a minimum of 10.6.6?

    I have a MacBook running Leopard software, 10.5.8. How can I get my computer up to date, as Lion requires a minimum of 10.6.6?

    You can upgrade from 10.5 to 10.6 with no problems. Any program that runs under 10.5 should run under 10.6. See this list for compatibility with 10.6: http://snowleopard.wikidot.com/  You might have to upgrade some drivers for printers, etc. And you will have to install Rosetta if you have any Power PC applications: http://www.macobserver.com/tmo/article/snow_leopard_installing_rosetta/
    You may also want to upgrade your iLife and iWork. If you only want iPhoto or other single apps from iLife '11, you can get them from the App Store after you've upgraded to 10.6.6: iPhoto, iMovie or GarageBand for $15 each, and Pages, Keynote or Numbers from iWork '09 for $20 each.
    You can order a Snow Leopard 10.6 install disk for $29 as long as you have at least 1gb of RAM and 5gb of free space on your hard drive. http://store.apple.com/us/product/MC573Z/A?mco=MTY3ODQ5OTY
    Once you are at 10.6.8 you can buy Lion for $29 from the App Store if you have at least a model 2,1 MacBook. Lion will require at least 2gb of RAM but really needs 4gb to run smoothly. As for programs see this list for compatibility with 10.7 http://roaringapps.com/apps:table Also Lion doesn't run any Power PC programs. To see if you have any Power PC programs go to the Apple in the upper left corner and select About This Mac, then click on More Info. When System Profiler comes up select Applications under Software. Then look under Kind to see if any of your applications are listed as Power PC. Universal and Intel will run under Lion.
    Before Mac switched to Intel processors they used Power PC processors from 1994 to 2005. Power PC 601 through 604, G3, G4 and G5. Applications written for the Power PC processors need the application called Rosetta to run on Intel processors. This was part of the Operating System in 10.4 and 10.5 but was an optional install in 10.6. With 10.7 Lion Apple dropped all support for Power PC applications.

  • "Invalid License Data. Reinstall is required" error when sequencing SQL Server 2012 in App-V

    Hi ,
    I am trying to sequence SQL 2012 with the 4.6 SP1 Sequencer, with the following components selected in SQL setup:
    1. Client Tools Backward Compatibility
    2. Documentation Components
    3. Management Tools - Basic
    4. Management Tools - Complete
    The sequencing part goes fine, but on the client, when I launch Management Studio, it says "Invalid License Data. Reinstall is required".
    Please help us resolve this issue!
    Thanks -
    Aditya Halder

    There's a Visual Studio runtime which needs to be installed locally.
    Launch the install for SQL 2012; when it extracts the install media, it drops the files into a randomly generated folder under C:\. Grab the vssetup from there and ensure it's installed locally on your machine.
    That was enough to get it to work for me, BUT I believe something changed with an update after I sequenced, and now there are some other steps required. Peter S. shared this with me. Try these steps:
    Under HKEY_CLASSES_ROOT there's a Licenses registry key. If there's a value, remove the value; do not delete the key, leave it there; if it doesn't exist, create it. Set the virtualization level for it as Overwrite.
    Create a PRE-LAUNCH script which runs “VFS\ProgramFilesX86\Microsoft Visual Studio 10.0\Common7\IDE\DDConfigCA.exe” inside the package*. This creates the machine-related license key inside the App-V environment.
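    If you want to script the registry step above, something like this sketch might do (my own illustration, not part of Peter's steps; it assumes Windows, Python, and sufficient rights):

    import winreg

    def reset_licenses_key():
        # CreateKeyEx opens HKEY_CLASSES_ROOT\Licenses if it exists and
        # creates it otherwise, matching "if it doesn't exist create it".
        key = winreg.CreateKeyEx(winreg.HKEY_CLASSES_ROOT, "Licenses",
                                 0, winreg.KEY_ALL_ACCESS)
        try:
            # Remove every value, but leave the key itself in place.
            while True:
                name, _, _ = winreg.EnumValue(key, 0)
                winreg.DeleteValue(key, name)
        except OSError:
            pass  # no more values to enumerate
        finally:
            winreg.CloseKey(key)

    reset_licenses_key()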
    PLEASE MARK ANY ANSWERS TO HELP OTHERS Blog:
    rorymon.com Twitter: @Rorymon

  • I have a 2006 imac (intel) w osx 10.4.11 and need to update for new ipod touch? software updates says up to date, but itunes requires 10.5 or higher? any solutions?

    i have a 2006 imac (intel) w osx 10.4.11 and need to update for new ipod touch? software updates says up to date, but itunes requires 10.5 or higher? any solutions?

    i have a 2006 imac (intel) w osx 10.4.11  <  Tiger
    It's not just iTunes you need to update...
    You need to upgrade to Leopard v10.5.8 at the very least in order to sync, restore, and backup an iPod touch.
    See iPod touch system requirements here.
    Leopard is available from Amazon here.

  • [CS5.5/6] - XML / Data Merge questions & Best practice.

    Fellow Countrymen (and women),
    I work as a graphic designer for a large outlet chain retailer which is constantly growing our base of centers.  This growth has turned a workload that used to be manageable with but two people into a never-ending sprint with five.  Much of what we do is print, which is not my forte, but it is also generally a disorganized, ad-hoc affair into which I am wading to try to help reduce overall strain.
    Upon picking up InDesign I noted the power of the simple Data Merge function and have added it to our repertoire for mass-merging data sources.  There are some critical failures I see in this as a tool going forward for our purposes, however:
    1) Data Merge cannot handle information stored and categorized in a singular column well.  As an example, we have centers in many cities, and each center has its own list of specific stores.  Data Merge cannot handle a single-column, or even multiple-column, list of these stores very easily and has forced us into some manual operations to concatenate the data into one cell and then, using delimiter characters, find and replace hard returns to separate them (see the sketch after this list).
    2) Data Merge offers no method of alternate alignment of data, or selection by ranges.  That is to say:  I cannot tell Data merge to start at Cell1 in one column, and in another column select say... Cell 42 as the starting point.
    3) Data merge only accepts data organized in a very specific, and generally inflexible pattern.
    These are just a few limitations.
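    For point 1, here is a rough preprocessing sketch of that concatenation step. It assumes a hypothetical stores.csv with Center and Store columns; the file names and delimiter are made up, not part of any existing workflow:

    import csv
    from collections import defaultdict

    # Collapse a one-store-per-row export into one row per center, with the
    # stores joined by a delimiter that can later be find/replaced with
    # forced line breaks in InDesign.
    stores_by_center = defaultdict(list)
    with open("stores.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # expects "Center" and "Store" columns
            stores_by_center[row["Center"]].append(row["Store"])

    with open("merge_source.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["Center", "Stores"])
        for center, stores in stores_by_center.items():
            writer.writerow([center, "|".join(stores)])  # "|" -> line break later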
    ON TO MY ACTUAL DILEMMA aka Convert to XML or not?
    Recently my coworker has suggested we move toward using XML as a repository / delivery system that helps us quickly get data from our SQL database into a usable form in InDesign. 
    I've watched some tutorials on Lynda.com and haven't yet seen a clear answer to a very simple question:
    "Can XML help to 'merge' large, dynamic, data sets like a list of 200 stores per center over 40 centers based off of a single template file?"
    What I've seen is that I would need to manually duplicate pages, linking the correct XML entry as I go rather than the program generating a set of merged pages like that from Data Merge with very little effort on my part.  Perhaps setting up a master page would allow for easy drag and drop fields for my XML data?
    I'm not an idiot, I'm simply green with this -- and it's kind of scary because I genuinely want us to proceed forward with the most flexible, reliable, trainable and sustainable solution.  A tall order, I know.  Correct me if I'm wrong, but XML is that beast, no?
    Formatting the XML
    Currently I'm afraid our XML feed for our centers isn't formatted correctly; the current format looks like this:
    <BRANDS>
         <BRAND>
              • BrandID = xxxx
              [Brand Name]
              [Description]
              [WebMoniker]
              <CATEGORIES>
                   <CATEGORY>
                        • xmlns = URL
                        • WebMoniker = category_type
              <STORES>
                   <STORE>
                        • StoreID = ID#
                        • CenterID = ID#
    I don't think this is currently usable because if I wanted to create a list of stores from a particular center, that information is stored as an attribute of the <STORE> tag, buried deep within the data, making it impossible to 'drag-n-drop'.
    Not to mention much of the important data is held in attributes rather than in text fields which are children of the tag.
    I'm thinking of proposing the following organizational layout:
    <CENTERS>
         <CENTER>
         [Center_name]
         [Center_location]
              <CATEGORIES>
                   <CATEGORY>
                        [Category_Type]
                        <BRANDS>
                             <BRAND>
                                  [Brand_name]
    My thought is that if I have the <CENTER> tag then I can simply drag that into a frame and it will auto populate all of the brands by Category (as organized in the XML) for that center into the frame.
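    If the feed were reorganized that way, pulling every brand per center really would be a simple traversal. A minimal sketch with Python's ElementTree, using the hypothetical element names from the layout above (Center_name, Category_Type, Brand_name):

    import xml.etree.ElementTree as ET

    # Sample data shaped like the proposed layout above.
    sample = """
    <CENTERS>
      <CENTER>
        <Center_name>Example Center</Center_name>
        <Center_location>Example City</Center_location>
        <CATEGORIES>
          <CATEGORY>
            <Category_Type>Apparel</Category_Type>
            <BRANDS>
              <BRAND><Brand_name>Example Brand</Brand_name></BRAND>
            </BRANDS>
          </CATEGORY>
        </CATEGORIES>
      </CENTER>
    </CENTERS>
    """

    root = ET.fromstring(sample)
    for center in root.iter("CENTER"):
        print(center.findtext("Center_name"))
        for category in center.iter("CATEGORY"):
            print("  " + category.findtext("Category_Type"))
            for brand in category.iter("BRAND"):
                print("    - " + brand.findtext("Brand_name"))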
    Why is this important?
    This is used on multiple documents in different layout styles, and since our store list is ever-changing as leases end or begin, over 40 centers this becomes a big hairy monster.  We want this to be as automated as possible, but I'd settle for a significant amount of dragging and dropping as long as it is simple and straightforward.  I have a high tolerance for drudging through code and creating workarounds, but my co-workers do not.  This needs to be a system that is repeatable and understandable, and it needs to be able to function whether I'm here or not -- mainly because I would like to step away from the responsibility of setting it up every time.
    I'd love to hear your raw, unadulterated thoughts on the subject of Data merge and XML usage to accomplish these sorts of tasks.  What are your best practices and how would you / do you accomplish these operations?
    Regards-
    Robert

    From what I've gleaned through watching Lynda tutorials on the subject, what I'm hoping to do is indeed possible.
    Peter, I don't disagree with you that there is a steep learning curve for me as the instigator / designer of this method for our team, but in terms of my teammates and end-users that will be softened considerably.  Even so, I'm used to steep learning curves and the associated frustrations -- I cope well with new learning and am self-taught in many tools and programs.
    Flow based XML structures:
    It seems as though as long as the initial page is set up correctly using imported XML, individual data records that cascade in a logical fashion can be flowed automatically into new pages.  Basically what you do is to create an XML based layout with the dynamic portion you wish to flow in a single frame, apply paragraph styles to the different tags appropriately and then after deleting unused records, reimport the XML with some specific boxes checked (depending on how you wish to proceed).
    From there, simply dragging the data root into the frame will cause overset text as it imports all the XML information into the frame.  Assuming that everything is cascaded correctly, using auto-flow will cause new pages to be generated automatically with the tags correctly placed, in a similar fashion to Data Merge -- but far more powerful and flexible.
    The issue then again comes down to data organization in the XML file.  In order to use this method the data must be organized in the same order in which it will be displayed.  For example if I had a Lastname field, and a Firstname field in that order, I could not call the Firstname first without faulting the document using the flow method.  I could, however, still drag and drop content from each tag into the frame and it would populate correctly regardless of the order of appearance in the XML.
    Honestly either method would be fantastic for our current set of projects, however the flow method may be particularly useful in jobs that would require more than 40 spreads or simple layouts with huge amounts of data to be merged.

  • Aqualogic Data Services Question

    Hello,
    I have recently downloaded BEA's Aqualogic Data Services 3.2 suite. We have a requirement to effect CRUD operations on one table from another. Data sources have been created in WebLogic Server for this purpose. Within the Data Services suite, the necessary physical data services have also been set up - one for the source table and another for the target table. I am able to view the data within these tables by using the "Test" facilities, as provided under the "Test" tab relevant to each of the physical data services. Both tables have the same exact number of attributes, and the types of the attributes are exactly the same (i.e.: attribute 1: varchar2(20), attribute 2: varchar2(150)), so no complexity there.
    I would now like to poll the source table (physical data source) and effect the required Create/Update/Delete operation in the target table (physical data source).
    How can this be done?
    Any useful answer to my question, or reference to a thorough document describing an answer would be greatly appreciated.
    I thank you in advance.

    Currently there is no DB adapter for Oracle Service Bus (AquaLogic). This should be available in early 2009. There may be some support for this in the 100-day release due this month; I forget what has been promised in which release.
    Therefore you would need some external app to do the polling and then invoke AquaLogic. If you have Oracle SOA Suite, you could use ESB or BPEL to do this and then invoke AquaLogic.
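    As a rough illustration of that external polling app, here is a bare-bones sketch. sqlite3 stands in for the real data sources so it runs standalone; the table and column names are invented, and a real setup would invoke the AquaLogic data service rather than write to the target table directly:

    import sqlite3
    import time

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE source (id INTEGER PRIMARY KEY, payload TEXT,
                             processed INTEGER DEFAULT 0);
        CREATE TABLE target (id INTEGER PRIMARY KEY, payload TEXT);
        INSERT INTO source (payload) VALUES ('row A'), ('row B');
    """)

    def poll_once(conn):
        # Pick up unprocessed source rows, apply them to the target, then
        # mark them processed so they are not replayed on the next poll.
        for row_id, payload in conn.execute(
            "SELECT id, payload FROM source WHERE processed = 0"
        ).fetchall():
            conn.execute(
                "INSERT OR REPLACE INTO target (id, payload) VALUES (?, ?)",
                (row_id, payload),
            )
            conn.execute("UPDATE source SET processed = 1 WHERE id = ?",
                         (row_id,))
        conn.commit()

    for _ in range(3):      # a real poller would loop until stopped
        poll_once(conn)
        time.sleep(1)       # poll interval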
    There is this link that creates a transport for Oracle PL/SQL which may be of some use.
    https://codesamples.projects.dev2dev.bea.com/servlets/tracking/remcurreport/true/template/ViewIssue.vm/id/S334/nbrresults/40
    cheers
    James

  • Data Guard questions

    Hello all,
    I am a newbie to data guard.
    1) Unless using the real-time apply feature, standby redo logs must be archived before the data can be applied to the standby database. Am I correct in my understanding?
    2) Can we keep the standby database on a higher version (11g) and the primary database on 10g?
    3) Do I need to leave the LOG_ARCHIVE_DEST_2 parameter blank in the standby database's init parameters?
    4) Can we face any problem if we have different LOG_ARCHIVE_FORMAT settings on the primary and standby sites?
    Waiting for your valuable reply.
    With regards

    994269 wrote:
    Hello all,
    I am a newbie to data guard.
    Welcome to Data Guard!
    1) Unless using the real-time apply feature, standby redo logs must be archived before the data can be applied to the standby database. Am I correct in my understanding?
    I do not understand your question.
    2) Can we keep the standby database on a higher version (11g) and the primary database on 10g?
    No. Both versions should be the same.
    3) Do I need to leave the LOG_ARCHIVE_DEST_2 parameter blank in the standby database's init parameters?
    Yes, unless you need to send archived logs from the standby database to another database. If you have a cascaded setup, you would need to set the parameter accordingly.
    4) Can we face any problem if we have different LOG_ARCHIVE_FORMAT settings on the primary and standby sites?
    Yes. Keep the default log format, or use the same format on both sites.

  • Dimension table design Data modeling question

    Hi Experts,
    Sorry if I am putting my question in the wrong forum; if so, please suggest an appropriate one.
    I need your opinion on the existing design of our 10-year-old data warehouse.
    There is one dimension table with structure like following
    Dimension Table
    Dimension Key Number (THIS IS NOT A PRIMARY KEY)
    Natural key (from source) Number
    source name character
    current record indicator char(1)
    form_date date
    to_date date
    many other columns which, if changed, cause a new current record to be created while the previous one is marked as H (historical)
    Data is stored in the dimension table like this:
    Dimension_key  Natural key  Source name  Current record ind  From_date    To_date
    1              10001        Source1      H                   1-Jan-2005   31-May-2005
    1              10001        Source1      H                   1-Jun-2005   12-Dec-2011
    1              10001        Source1      C                   13-Dec-2011  NULL
    2              20002        Source1      H                   1-Jun-2001   12-Dec-2011
    2              20002        Source1      C                   13-Dec-2011  NULL
    The problem I see in this design is that there is no surrogate key: if any attribute changes, the new record is inserted by first looking up the dimension key based on (natural_key, source_name, current_record_ind).
    Shouldn't it be stored like the following, based on data-warehousing principles?
    Dimension_key  Natural key  Source name  Current record ind  From_date    To_date
    1              10001        Source1      H                   1-Jan-2005   31-May-2005
    2              10001        Source1      H                   1-Jun-2005   12-Dec-2011
    3              10001        Source1      C                   13-Dec-2011  NULL
    4              20002        Source1      H                   1-Jun-2001   12-Dec-2011
    5              20002        Source1      C                   13-Dec-2011  NULL
    Please let me know the pros and cons of the current design.
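    For comparison, here is a minimal sketch of how the second layout behaves as a classic Type 2 slowly changing dimension with a true surrogate key (sqlite3 only so it runs standalone; the table name dim_example and the helper are illustrative, not our warehouse code):

    import sqlite3
    from datetime import date

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE dim_example (
            dimension_key INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
            natural_key   INTEGER,
            source_name   TEXT,
            current_ind   TEXT,  -- 'C' current, 'H' historical
            from_date     TEXT,
            to_date       TEXT
        )
    """)

    def close_and_insert(conn, natural_key, source_name, effective):
        # Close the current row for this natural key, if any...
        conn.execute(
            "UPDATE dim_example SET current_ind = 'H', to_date = ? "
            "WHERE natural_key = ? AND source_name = ? AND current_ind = 'C'",
            (effective, natural_key, source_name),
        )
        # ...then insert a fresh current row; a new surrogate key is generated
        # every time, so no lookup on (natural_key, source_name,
        # current_record_ind) is needed to build the key itself.
        conn.execute(
            "INSERT INTO dim_example (natural_key, source_name, current_ind, "
            "from_date, to_date) VALUES (?, ?, 'C', ?, NULL)",
            (natural_key, source_name, effective),
        )
        conn.commit()

    close_and_insert(conn, 10001, "Source1", str(date(2011, 12, 13)))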

    And what if you have both features, something like this:
    Lineno  Dimension_key  Natural key  Source Name  Current record ind  From_date    To_date
    1       1              10001        Source1      H                   1-Jan-2005   31-May-2005
    2       1              10001        Source1      H                   1-Jun-2005   12-Dec-2011
    3       1              10001        Source1      C                   13-Dec-2011  NULL
    I mean, just add a new column and populate it in the required ORDER BY order. Because what I guess is that in the second example you just added a new column, which is something like a line number.
    Regards
    Girish Sharma

  • BI data collection question

    Dear colleagues,
    I have one big question regarding the BI data collection functionality (within the solution reporting tab of T-cd: DSWP).
    To whom it may concern,
    According to the SAP tutor located on the learning map,
    it looks like the SolMan system requires XI to be installed in order to use BI data collection. Is this correct? If so, is there any step-by-step configuration guide that covers everything from installing XI to enabling the BI data collection function?
    Thank you guys in advance.
    rgds,
    Yoshi

    Hi,
    But we want to maintain Item Master data in ITEM02 and Item Branch data in ITEM03 as separate InfoObjects, because the descriptions are always changing; that is the reason I created different master data InfoObjects.
    Let me give some sample data:
    Record1: IT001 LOT1 BRAN1 CODE1 CODE2 DS01 DS02 in 2006
    Record2: IT001 LOT1 BRAN1 CODE1 CODE2 DS05 DS04 in 2007
    If you look at the above data, in 2006 the item descriptions were DS01 and DS02, and in 2007 they changed from DS01, DS02 to DS05, DS04 for the same item number.
    We want the item descriptions in 2006 and 2007 to remain the same if we change them in 2007. How can this be done?
    If I go your way, how can we achieve this?
    Do we need delta loads?
    Also, do I need to create an InfoCube? Can I use the ITEMNO InfoObject as a data target?
    Please let me know your thoughts.
    Regards,
    ALEX

Maybe you are looking for

  • Text entry not recording

    Help please. Using Captivate 4 to capture an internal system.  Using training mode.  Whilst capturing the text entries are clicking ok but on Preview the text entry box isn't there. Checked settings and it works in all other applications I capture bu

  • Can't Sync Applications After HD Reformat/Reinstall

    A couple of days ago, I experienced some hard drive corruption and had to reformat my hard drive and reinstall the OS, via archive and install. I accidentally started up with a new user folder, but I fixed that by copying my old one over from the bac

  • How to "straighten" photos in iWeb?

    Hi All, I'm just starting to use iWeb and trying to learn the ins and outs. One thing I can't seem to find is a "straighten" feature for photos once they are placed in a page. For example, when selecting the "adjustments" one is presented with the us

  • Standard Queries

    Hi All, How to find the standard queries in SAP BI related to specific module like FI. How to install that queries? If you have SAP FI standard queries, Please let me know. thanks, KN

  • Scope of the BPM Process variables

    Hi All, What is the scope of the process variables in a BPM Process? I had defined couple of process variables in my BPM process and observed that whenever i go into a Human Task activity and come back to the BPM process all my process variables valu