SAP data collection framework - owner change

Hi,
in the data collection framework there is an owner assigned to all data collectors. In general, the owner is the SID system itself.
If another system (e.g. SolMan) is connected to the system, the owner of the data collectors changes to that system.
The data collection then stops working and produces a lot of errors.
I can change the owner back, but it immediately changes to the connected system again.
Why do the data collectors not work with the "foreign" owner? And why does the owner change at all?
Best regards
Arne

Dear Mr. Knoeller,
Your SAP_BASIS SP and release would be helpful, both for SolMan and the managed system.
Until then, could you please check the following notes and KBAs, in this order:
1927012 - SYB: DBACockpit shows warnings about data collectors not being properly set up or having a different version
1972114 - SYB: Error in DCF ownership transfer - should solve your issue in my opinion
1712802 - SYB: Change ownership of Data Collection Framework / ATM
1623182 - SYB: Authorization issues in DBACockpit
If this is a test system and the mentioned notes do not offer you a proper resolution, I would also drop all the collectors (the framework collector last, of course) and reassign them to Solution Manager. This also depends on your NW release, SP and DBA Cockpit release.
Relevant notes regarding DBA Cockpit version known by me:
1757928 - SYB: DBA Cockpit Release Notes 7.02 SP11, 7.30 SP6, 7.31 SP2
1758182 - SYB: DBA Cockpit Release Notes 7.02 SP12, 7.30 SP8, 7.31 SP5
1758496 - SYB: DBA Cockpit Release Notes 7.02 SP13, 7.30 SP9, 7.31 SP7
1814258 - SYB: DBA Cockpit Release Notes 7.02 SP14, 7.30 SP10, 7.31 SP9, 7.40 SP4
Kind Regards,
Ionut

Similar Messages

  • SAP Data Migration Framework

    Dear All, I read from TechEd that the SAP data migration framework involves using BusinessObjects Data Services. SAP has provided pre-built jobs with IDoc as the target.
    All is fine, but I came to know that we can use Data Services as a staging area and modify our data in Data Services before loading it into the SAP ECC system.
    I would like to know more about using Data Services as a staging area. Does that mean that when we run jobs the data is stored in an IDoc (XML file) on DS, or does it mean we can run DS in a demo mode where we don't load data but keep it within the memory of DS? This is different from profiling and storing execution/DQ statistics. This is storing real data in DS, then modifying/analyzing it, and then loading it into SAP ECC.
    Can someone explain if we can run DS in demo mode without loading data?
    Thanks,

    Hi - you can get more information on using Data Services for data migration at http://service.sap.com/bp-datamigration. Also, check out the blogs:
    /people/ginger.gatling/blog/2010/08/27/data-migration-installing-data-services-and-best-practice-content
    /people/ginger.gatling/blog/2011/03/31/data-migration-in-5-minutes

  • SAP ME Data Collection setup

    Hello Experts!
    We are trying to set up data collection for following scenario:
    Typically our order quantity can vary from 1000 to 5000 pcs and we want to create only one SFC per shop order. We want to collect data after a fixed number of pcs is reported, for example after every 100 pcs. So in the above example the number of iterations for data collection could vary from 10 to 50.
    We see that there are two fields, "Required Data Entries" and "Optional Data Entries", in the Data Collection parameter detail tab, but it looks like those take static values, whereas we want this to change based on the order quantity.
    We also noticed another issue with these fields for our scenario: if we use the "Required Data Entries" field, the user has to collect all the iterations together, but that is not possible since we are collecting after reporting a certain qty. If we use "Optional Data Entries", the system allows the user to collect multiple times, but the pop-up does not indicate how many iterations have already been collected, which is confusing for the users.
    Has anyone else had any experience with a similar scenario?
    Thanks in advance!
    -Venkat

    Hello Venkat,
    To collect data against the same SFC several times, you should enable the corresponding system rule 'Allow Multiple Data Collection'. The "Optional Data Entries" field rather controls the number of optional entries that you can enter (e.g. you measure the temperature of an SFC and need to enter several measurements for a single data collection).
    As long as you enable the system rule, it will be up to you when to collect. But there is no standard functionality to force it after a certain qty of the SFC has been processed. You'll need a customization for it.
    Br, Alex.

  • SAP Personnel Cost Planning - Data Collection Employee not providing hours

    Hello,
    I am implementing SAP PCP and have created a 'Salaries' cost item. It has both salary (1003) wage types and an hour amount (1305 Normal Hours) wage type attached to it. When I run data collection, I only get the $ amounts for infotype 0666. Is there a setting I have missed in order to bring the hour amounts into the 'Salaries' cost item in infotype 0666?
    Conversely, I checked the SAP help and it looks like I can also bring in hours via another cost item and attach that to a statistical key figure. However, I can't see where I can attach a wage type (hours) to the statistical key figure.
    Can anyone help me out?  I appreciate any help you can give.
    Thanks
    Alex

    1) Create a new cost item of the Statistical Key Figure Type
    2) Create a symbolic account
    3) Assign the Hours wage type to the symbolic account
    4) Assign the cost item to the symbolic account
    That way the Hour wage type will then be related to the Cost Item. The issue you may have is if the Hour wage type is already assigned to post to an existing symbolic account.
    Hope this helps.

  • Change product in Sap data administration

    Hi
    In SAP data administration our GXP system is registered as product MDM. This is wrong; GXP is actually an XI 3.0 system. Is there a way to change that?
    Best regards
    Nils-Olov Granstedt
    edb

    Hi Michael,
    I am facing a problem while updating employee details using a web proxy.
    PI is sending data to ECC via the web proxy to update employee details. The data is received (visible in SXMB_MONI) and the ECC system is updated successfully, but change pointers are not triggered (the BDCP2 table is not updated).
    When I then execute program "RBDMIDOC" with message type "HRMD_ABA", I get "No data selected for distribution".
    Note: all the configuration is correct.
    *****Successful manual process, change pointers are generated****************
    Manual process: Step 1: Change employee details through PA30, e.g. pernr 70821, infotype "Communication", subtype "0010".
    Step 2: The BDCP2 table has entries.
    Step 3: Execute program "RBDMIDOC" with message type "HRMD_ABA".
    Please note: after analysing the PA30 transaction I found that change pointers are set when we change any employee details through PA30. "Change pointers set" means the BDCP2 table has entries with processing indicator = BLANK. We then process these change pointers using the standard program "RBDMIDOC"; after executing this program an IDoc is generated containing the updated details of the employee. In the CRM system we can import this IDoc via a standard report, and hence all the updates happen in the CRM system.
    Please assist me: how can I trigger change pointers / update the BDCP2 table while updating employee details using the web proxy?
    Regards
    Ashish

  • Reporting on SR Owner Change, Activity Due Date Change

    We have a few report requirements around the audit trail. We are looking for reports where we could track Service Request owner changes, SR status changes, activity due date changes, etc. As we know, in CRMOD Analytics we cannot report on the audit trail or use the PRE() function. Can anybody suggest any other way of reporting on the changes, where we could show each and every change made to these fields and not only the previous values?

    Let's say you want to track status changes in an SR. There are two ways of doing it.
    Option 1:
    Create a long text field on the SR page.
    Name it "Track Status Change". Do not display the field to the users.
    Default the field to Initial Status+","+Now().
    When the status changes, append "Current Status"+","+Now()+";" to the field.
    The semicolon is a delimiter showing that this is the new status; it would be used in the report to separate the statuses. You can achieve this using the reporting functionalities.
    The downside is that it can affect report performance, since a lot of string functions would be used for the tracking.
    Option 2
    Whenever the status changes, create a completed task. Populate the Subject of the task with the Status. Put a workflow/field validation so that no one edits the subject when the task is created automatically.
    You can run a report on the Task Created Date and Subject to get the Audit Trail.
    Let me know if this helps.
    Paul Swarnapandian.

  • Bug in Data Collection Changes By Row for Dynamic domains?

    Hello,
    I was using the "Data Collection Changes By Row?" option on one of my dynamic domains.
    This resulted in a NullPointerException for selected items on the jspx page.
    After some searching I found this was because the appmodule was null,
    and this was caused by the generated domains beans.xml.
    The generated code was:
    <property-name>applicationModule</property-name>
    <value>#{data.<appmodule datacontrol name>.dataProvider}</value>
    While it should have been:
    <property-name>applicationModule</property-name>
    <value>#{data.<appmodulename>DataControl.dataProvider}</value>
    I think this is related to me setting the data control name on the appmodule under custom properties. But I think JHS should be able to detect that?
    Anton

    Anton,
    Currently JHeadstart has no support for customized data control names. However, I have added it to our list of enhancement requests.
    As a workaround, you can customize the Generator Template for the domains bean.
    kind regards,
    Sandra Muller
    JHeadstart Team
    Oracle Consulting

  • SAP-date change...

    Hi,
    When I log on to SAP Home, a window message says "License expiration date.....", and I know that the system date has to be changed back to the installation date to keep it alive.
    Here I want to know:
    Is there any particular procedure to be followed?
    E.g. stop the Oracle services, stop the Java services, etc.

    Pankaj
    Do you mean SAP Home as in the SAP Portal, or the SAP system?
    Regards
    Anwer Waseem
    SAP BASIS

  • Redesigning the Collections Framework

    Hi!
    I'm sort of an experienced Java programmer, in the sense that I program regularly in Java. However, I am not experienced enough to understand the small design specifics of the Collections Framework (or other parts of Java's standard library).
    There have been a number of minor things that bugged me about the framework, and all these minor things added up to a big annoyance, after which I decided to (try to) design and implement my own framework.
    The thing is, however, that since I don't understand many design specifics of the Collections Framework and the individual collection implementations, I risk coming up short with my own.
    (And if you are still reading this, then I thank you for your time, because already now I know that this entry is going to be long. : ) )
    Since I'm doing my collection framework nearly from scratch, I don't have to worry too much about backwards compatibility (although I should consider making some parts similar to the Collections Framework as it is today, and provide a wrapper that implements the original collection interfaces).
    I also have certain options for optimizing several of the collections, but then again, there may be very specific design issues concerning performance and usability that the developers of the framework (or other more experienced Java programmers) knew about that I don't.
    So I'm going to share all of my thoughts here. I hope this will start an interesting discussion : )
    (I'm also not going to make a fuss about the source code of my progress. I will happily share it with anyone who is interested. It is probably even necessary in order for others to understand how I've intended to solve my annoyances (or understand what these annoyances were in the first place).)
    (I've read the "Java Collections API Design FAQ", btw.)
    Below, I'm going to go through all of the things that I've thought about, and what I've decided to do.
    1.
    The Collections class is a class that consists only of static utility methods.
    Several of them return wrapper classes. However, the majority of them work on collections implementing the List interface.
    So why weren't they built into the List interface (the same goes for methods working only with the Collection interface, etc.)? Several of them could even be implemented more efficiently, for example calling rotate on a LinkedList.
    If the LinkedList is circular, using a sentinel node connecting the head and tail, rotate is done simply by relocating the sentinel node (rotating by one element would require a single operation). The Collections class makes several calls to the reverse method instead (because it lacks access to the internal workings of a LinkedList).
    If it were done this way, the Collections class would be much smaller, and contain mostly methods that return wrapper classes.
    After thinking about it a while, I think I can answer this question myself: the List interface would become rather bloated, and would force implementations of methods that the user may not need.
    At any rate, I intend to try some sort of clean-up. Exactly how is something I'm still thinking about. Maybe two versions of the List interface (one "light" and the other advanced), or maybe making the internal fields package-private and generally more accessible to other package members, so that methods in other classes can do some optimizations with the additional information.
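    The circular-list idea above can be sketched as follows; all class and method names here are illustrative, not from any library. Because a sentinel node joins head and tail, rotate only unlinks and re-inserts the sentinel, so the splice itself is a constant number of pointer updates.

    ```java
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    // Sketch: a circular doubly-linked list whose sentinel joins head and
    // tail. rotate() moves only the sentinel; no element node is copied.
    class CircularList<E> {
        private static final class Node<E> {
            E value; Node<E> prev, next;
            Node(E value) { this.value = value; }
        }

        private final Node<E> sentinel = new Node<>(null);
        private int size;

        CircularList() { sentinel.prev = sentinel.next = sentinel; }

        void add(E value) {                       // append just before the sentinel
            Node<E> n = new Node<>(value);
            n.prev = sentinel.prev; n.next = sentinel;
            sentinel.prev.next = n; sentinel.prev = n;
            size++;
        }

        // Rotate right by `distance` (same direction as Collections.rotate):
        // walk the sentinel backwards, then splice it out and back in. The
        // splice is O(1); rotating by one element is a handful of pointer
        // updates instead of the reversals Collections.rotate performs.
        void rotate(int distance) {
            if (size == 0) return;
            int d = ((distance % size) + size) % size;
            if (d == 0) return;
            Node<E> target = sentinel;
            for (int i = 0; i < d; i++) target = target.prev;
            sentinel.prev.next = sentinel.next;   // unlink the sentinel
            sentinel.next.prev = sentinel.prev;
            sentinel.prev = target.prev;          // re-insert before `target`
            sentinel.next = target;
            target.prev.next = sentinel;
            target.prev = sentinel;
        }

        List<E> toList() {
            List<E> out = new ArrayList<>();
            for (Node<E> n = sentinel.next; n != sentinel; n = n.next) out.add(n.value);
            return out;
        }
    }
    ```

    Rotating [1, 2, 3, 4] by one yields [4, 1, 2, 3], matching Collections.rotate, without touching any element node.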
    2.
    At one point, I realized that PriorityQueue doesn't support increase/decrease-key operations. Of course, elements would need to know where in the backing data structure they are located, and this is implementation-specific. However, I was rather disappointed that this wasn't supported somehow, so I figured out a way to support it anyway, and implemented it.
    Basically, I've wrapped the elements in a class that contains this info; if an element wants to increase its key, it calls a method on the wrapping class it is contained in. It works fine.
    It may cause some overhead, but at least I don't have to re-implement such a data structure and fiddle around so much with the element classes just because I want to try something with a PriorityQueue.
    I can do the same thing to implement a reusable BinomialHeap, FibonacciHeap, and other data structures that usually require the elements to contain some implementation-specific fields and methods.
    And this would all become part of the framework.
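    A minimal sketch of this handle idea, with hypothetical names: each inserted element gets a wrapper (the handle) that records its slot in the backing array, so decreaseKey can float the entry up in O(log n) instead of the linear search that java.util.PriorityQueue.remove would need. This is a min-heap; the symmetric increase-key on a max-heap works the same way.

    ```java
    import java.util.ArrayList;
    import java.util.List;

    // Sketch: a binary min-heap whose entries know their own position,
    // so decreaseKey runs in O(log n). Names are illustrative.
    class IndexedMinHeap<E> {
        // Handle returned to callers: the element, its key, and its slot.
        static final class Entry<E> {
            E element; long key; int index;
            Entry(E element, long key) { this.element = element; this.key = key; }
        }

        private final List<Entry<E>> heap = new ArrayList<>();

        Entry<E> insert(E element, long key) {
            Entry<E> e = new Entry<>(element, key);
            e.index = heap.size();
            heap.add(e);
            siftUp(e.index);
            return e;
        }

        E extractMin() {
            Entry<E> min = heap.get(0);
            Entry<E> last = heap.remove(heap.size() - 1);
            if (!heap.isEmpty()) { heap.set(0, last); last.index = 0; siftDown(0); }
            return min.element;
        }

        // The operation java.util.PriorityQueue lacks: the entry already
        // knows its index, so we only float it up the heap.
        void decreaseKey(Entry<E> e, long newKey) {
            if (newKey > e.key) throw new IllegalArgumentException("key must not increase");
            e.key = newKey;
            siftUp(e.index);
        }

        private void siftUp(int i) {
            while (i > 0 && heap.get((i - 1) / 2).key > heap.get(i).key) {
                swap(i, (i - 1) / 2);
                i = (i - 1) / 2;
            }
        }

        private void siftDown(int i) {
            for (;;) {
                int s = i, l = 2 * i + 1, r = 2 * i + 2;
                if (l < heap.size() && heap.get(l).key < heap.get(s).key) s = l;
                if (r < heap.size() && heap.get(r).key < heap.get(s).key) s = r;
                if (s == i) return;
                swap(i, s);
                i = s;
            }
        }

        private void swap(int i, int j) {   // keep each entry's index in sync
            Entry<E> a = heap.get(i), b = heap.get(j);
            heap.set(i, b); b.index = i;
            heap.set(j, a); a.index = j;
        }
    }
    ```

    The price is exactly the overhead described above: every element is boxed in an Entry, but no element class needs an intrusive index field of its own.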
    3.
    This one is difficult to explain.
    It basically revolves around the first question in the "Java Collections API Design FAQ".
    It has always bothered me that the Collection interface contained methods that "maybe" would be implemented.
    To me it didn't make sense. The Collection interface should only contain methods that are general to all collections, such as the contains method. All the methods that manipulate the collection belonged in other interfaces.
    However, I also realized that the whole smart thing about the current Collection interface is that you can transfer elements from one collection to another without needing to know what type of collection you are transferring from/to.
    But I still felt it violated "something", even if it was in the name of convenience. If this convenience was to be provided, it should be done by making a separate wrapper interface with the purpose of grouping the various collection types together.
    If you don't know what I'm trying to say, then you might have to see the interfaces I've made.
    And while I was at it, I also fiddled with the various method names.
    For example, add(int index, E element): I felt it should be insert(int index, E element). This type of minor thing caused a lot of confusion for me back then, so I cared enough to change it to something I thought more appropriate. But I have no idea how appropriate my approach may seem to others. : )
    4.
    I see an iterator as something that iterates through a collection, and nothing else.
    Therefore, it bothered me that the Iterator interface has an optional remove method.
    I myself have never needed it, so maybe I just don't know how to appreciate it. How much is it used? If it's heavily used, I guess I'm going to have to include it somehow.
    5.
    A LinkedList doesn't support random access, but true random access means accessing elements at arbitrary positions relative to the first index.
    Iterating from the first element to the last with a for statement isn't really random access, yet it still causes bad performance with the current LinkedList implementation; one would have to use the ListIterator to avoid this.
    But even then, if you want a ListIterator that starts in the middle of the list, you still need to traverse the list to reach that point.
    So I've come up with a LinkedList that remembers the element last accessed through the basic methods (get, set, remove, etc.) and can use it to access other elements relative to that position.
    Basically, there is now a special internal "ListIterator" that is used to access elements when the basic methods are used. This opens the way for several improvements (although that may depend on how you look at it).
    It introduces some overhead in the form of if-else statements, but otherwise I'm hoping that it will generally outperform the current LinkedList class (when using lists with a large number of elements).
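    The cursor idea can be sketched like this (names are mine, and only add/get are shown): a doubly-linked list whose get caches the last node it returned, so the next access starts from the head, the tail, or the cached cursor, whichever is closest.

    ```java
    import java.util.NoSuchElementException;

    // Sketch: a doubly-linked list that remembers the last accessed node,
    // so sequential or nearby get() calls avoid re-traversing from an end.
    class CursorLinkedList<E> {
        private static final class Node<E> {
            E value; Node<E> prev, next;
            Node(E value) { this.value = value; }
        }

        private Node<E> head, tail, cursor;
        private int size, cursorIndex;

        void add(E value) {                   // append at the tail
            Node<E> n = new Node<>(value);
            if (head == null) { head = tail = n; }
            else { tail.next = n; n.prev = tail; tail = n; }
            size++;
        }

        E get(int index) {
            if (index < 0 || index >= size) throw new NoSuchElementException();
            // Choose the cheapest starting point: head, tail, or cursor.
            int fromHead = index;
            int fromTail = size - 1 - index;
            int fromCursor = (cursor == null) ? Integer.MAX_VALUE
                                              : Math.abs(index - cursorIndex);
            Node<E> n; int i;
            if (fromCursor <= fromHead && fromCursor <= fromTail) { n = cursor; i = cursorIndex; }
            else if (fromHead <= fromTail)                        { n = head;   i = 0; }
            else                                                  { n = tail;   i = size - 1; }
            while (i < index) { n = n.next; i++; }
            while (i > index) { n = n.prev; i--; }
            cursor = n; cursorIndex = index;  // remember for the next access
            return n.value;
        }
    }
    ```

    A full version would also have to invalidate or adjust the cursor in set/remove, which is where the if-else overhead mentioned above comes from.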
    6.
    I've also played around with the ArrayList class.
    I've implemented it in a way that is something like a random-access deque. This has made it possible to improve certain methods, like inserting an element/collection at some index.
    Instead of always shifting subsequent elements to the right, elements can be shifted left as well. That means that inserting at index 0 only requires a single operation, instead of work proportional to the length of the list.
    Again, this introduces some overhead with if-else statements, but it is still better in many cases (again, the list must be large for this to pay off).
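    This two-sided shifting can be sketched as follows (illustrative, not java.util.ArrayList): the backing array is treated as a ring with a head offset, and an insert shifts whichever side has fewer elements, so inserting at index 0 moves nothing but the head pointer.

    ```java
    // Sketch: an array list backed by a circular buffer. The capacity is
    // kept a power of two so slot() can use a bit mask.
    class CircularArrayList<E> {
        private Object[] buf = new Object[8];
        private int head, size;

        private int slot(int i) { return (head + i) & (buf.length - 1); }

        void add(E value) {                   // append at the logical end
            ensureCapacity();
            buf[slot(size)] = value;
            size++;
        }

        void add(int index, E value) {
            ensureCapacity();
            if (index < size / 2) {
                // Shift the short prefix one slot left; at index 0 this
                // shifts nothing, only the head pointer moves.
                head = (head - 1) & (buf.length - 1);
                for (int i = 0; i < index; i++) buf[slot(i)] = buf[slot(i + 1)];
            } else {
                // Otherwise shift the suffix right, as ArrayList always does.
                for (int i = size; i > index; i--) buf[slot(i)] = buf[slot(i - 1)];
            }
            buf[slot(index)] = value;
            size++;
        }

        @SuppressWarnings("unchecked")
        E get(int index) { return (E) buf[slot(index)]; }

        private void ensureCapacity() {
            if (size < buf.length) return;
            Object[] bigger = new Object[buf.length * 2];
            for (int i = 0; i < size; i++) bigger[i] = buf[slot(i)];
            buf = bigger;
            head = 0;
        }
    }
    ```

    java.util.ArrayDeque uses the same ring-buffer layout, which is why the text above describes the result as a "random-access deque".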
    7.
    I'm also trying to make a hybrid between an ArrayList and a LinkedList, hopefully allowing mostly constant-time insertion, but constant-time true random access as well. It requires more than twice the memory, since it is backed by both an ArrayList and a LinkedList.
    The overhead introduced, and the fact that worst-case random access is no better than that of a pure LinkedList (which occurs when elements are inserted at the same index many times and you then try to access these elements), may make this class infeasible.
    It was mostly the first three points that pushed me over the edge and made me throw myself at this project.
    You're free to comment as much as you like.
    If no real discussion starts, that's OK.
    It's not like I'm not having fun with this thing on my own : )
    I've started from scratch several times because of design problems discovered too late, so if you request to see some of the source code, it is still in the works and I would have to scurry off and add a lot of Java comments as well, to explain the code.
    Great. Thanks!

    This sort of reply has great value : )
    Some of them show me that I need to take other things into consideration. Some of them, however, aren't resolved yet, partly because I'm probably misunderstanding some of your arguments.
    Here goes:
    1.
    a)
    Are you saying that they were made static, and therefore were implemented in a utility class? Isn't it the other way around? Suppose that I did put them into the List interface; that would mean they don't need to be static anymore, right?
    b)
    A bloated List interface is a problem. Many of the methods will, however, have a default not-always-so-efficient implementation in an abstract base class.
    Many of the list algorithms dump sequential lists into an array, execute the algorithm, and dump the finished list back into a sequential list.
    I believe that there are several of them where one of the "dumps" can be avoided.
    And even if other algorithms would effectively just be moved around, it wouldn't necessarily be in the name of performance (some of them cannot really be done better), but in the name of consistency/convenience.
    Regarding convenience, I'm getting the message that some may find it more convenient to have these "extra" methods grouped in a utility class. That can be arranged.
    But when it comes to consistency of method names (which concerns usability as well), I feel it is something else entirely.
    For example, take the two methods fill and replaceAll in the Collections class. They both set specific elements (or all of them) to some value. So they're both related to the set method, but use method names that are very different. To me it makes sense to have a method called setAll(...) and overload it. And since the List interface has a set method, I would very much like to group all these related methods together.
    Can you follow my idea?
    And well, the Collections class would become smaller. If you ask me, it's rather bloated right now, and supports a huge mixed bag of related and unrelated utility methods. If we took this to the extreme, the Collections class and the Arrays class would be merged.
    No, right? That would be hell : )
    2.
    At first glance, your route might do the trick. But there are several things here that aren't right.
    a)
    In order to delete an object, you need to know where it is. The only remove method supported by PriorityQueue actually does a linear search. Increase and decrease operations are supposed to be O(log n); doing a linear search would ruin that.
    You need a method something like removeAt(int i), where i would be the index in the backing array (assuming you're using an array). The element itself would need to know that int, meaning that it needs an internal int field, even though this field is only required due to the internal workings of PriorityQueue. Every time you want to insert some element, you need to add a field that really has nothing to do with that element from an object-oriented view.
    b)
    Even if you had such a remove method, using it to increase/decrease a key would use up to twice the operations necessary.
    Increasing a key, for example, only requires you to float the element up the heap. You don't need to remove it first, which would cost an additional O(log n) operations.
    3.
    I've read the link before, and I agree with them. But I feel that there are other ways to avoid an insane number of interfaces. I also think I know why I arrive at other design choices.
    The Collection interface as it is now is smart because it covers a wide range of collection types with add and remove methods. This is useful because you can exchange elements between collections without knowing the type of the collection.
    The drawback is of course that not all collections are, e.g., modifiable.
    What I think the problem is, is that the Collection interface is trying to be two things at once.
    On one side, it wants to be the base interface. But on the other side, it wants to cast a wide net over all the collection types.
    So what I've done is make a Collection interface that is in fact a true base interface, only supporting methods that all collection types have in common.
    Then I have a separate interface that tries to support methods for exchanging elements between collections of unknown type.
    There isn't necessarily any performance benefit (actually, it may even introduce some overhead), but it certainly is easier to grasp, for me at least, since it is more logically grouped.
    I know that I'm basically challenging the design choices of Java programmers who have much more experience than me. Hell, they have probably even considered and rejected what I'm considering now. In that case, I defend myself by mentioning that it isn't described as a possibility in the FAQ : )
    4.
    This point is actually related to point 3, because if I want the Collection interface to only support common methods, then I can't have an Iterator with a remove method.
    But okay... I need to support it somehow. No way around it.
    5. 6. & 7.
    The message I'm getting here is that if I implement these changes to LinkedList and ArrayList, then they aren't really LinkedList and ArrayList anymore.
    And finally, why do that, when I'm going to write a class that (hopefully) can simulate both anyway?
    I hadn't thought of the names as being the problem.
    My line of thought was that, okay, you have this ArrayList that performs lousy insertion and removal, and so you avoid it.
    But occasionally you need it (don't ask me how often this type of situation arises; rarely?), and so you would appreciate it if it performed "OK". It would still be linear, but would often perform much better (in extreme cases it would be constant time).
    But these improvements would almost certainly change the way one would use LinkedList and ArrayList, and I guess that requires different names for them.
    Great input that I wouldn't have thought of. Thanks.
    There are however some comments I should comment on:
    "And what happens if something is subsequently inserted or removed between that element and the one you want?"
    Then it would perform just like one would expect from a LinkedList. However, if that index were closer to the last indexed position, it would be faster. As it is now, LinkedList only chooses either the first index or the last to start the traversal from.
    If you're working with a small number of elements, then this is definitely not worth it.
    "It sounds to me like this (the hybrid list) is what you really want and indeed all you really need."
    You may be right. I felt that since the hybrid list would use twice as much memory, it would not always be the best choice.
    I'm going to think about that one. Thanks.

  • Pkg com.sap.aii.proxy.framework.core don't exist even added in the JRE path

    Hi All,
    I am facing an error while activating an activity. I checked in the activity but get the error while trying to activate it. There are two DCs, and in one of them I am getting this error. I did not notice it during check-in.
    After check-in, I do not get any option to revert or delete the activity.
    The activity is already closed. What can I do with it? After this error I am not able to activate any activity; I am getting the below error in all activities.
    Package com.sap.aii.proxy.framework.core does not exist, even though I have added it to the JRE path in NWDS. Do I need to import the same version of the file somewhere into the J2EE server, or somewhere else?
    Regards,
    Narpal

    Hi,
    javac ERROR: class file has wrong version 49.0, should be 48.0
    This error means that the class file in question was built with JDK 1.5 (49), whereas it should have been built with JDK 1.4.2 (48).
    In other words, the build expects a class file built with 1.4.2, but in your track JDK 1.5 is configured for the build tool.
    You therefore need to:
    A) review the CBS service settings, as most likely the parameters BUILD_TOOL_JDK_HOME and JDK_HOME_PATHS are set up improperly, and
    B) correct the build variant, which is likely wrong in the track in question.
    I'll refer below to the corresponding points with the letters A and B.
    Some more explanation on the error:
    If you check any class file with a hex editor, you'll see at the beginning of it this:
    example: CA FE BA BE | 00 00 | 00 31 ...
    JDK 1.6 = 50 (0x32 hex)
    JDK 1.5 = 49 (0x31 hex = 3*16 + 1 = 49)
    JDK 1.4 = 48 (0x30 hex)
    JDK 1.3 = 47 (0x2F hex)
    JDK 1.2 = 46 (0x2E hex)
    JDK 1.1 = 45 (0x2D hex)
    The first 4 bytes are a magic number (CAFEBABE) which marks the file as a Java class file; the next 2+2 bytes hold the minor and the major version (in this order). In the above example, 00 00 | 00 31 is displayed as major.minor, i.e. the 49.0 you see in error messages. See also the attachment minormajor.JPG I've put in this thread.
    More information on the class file structure:
    http://java.sun.com/docs/books/jvms/second_edition/html/ClassFile.doc.html
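    The layout described above can be checked programmatically. This sketch reads the header in exactly the order given (4 bytes magic, 2 bytes minor, 2 bytes major); the class name is mine:

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.DataInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Sketch: parse the first 8 bytes of a class file -- the CAFEBABE
    // magic number, then the minor and major version words.
    class ClassFileVersion {
        // Returns {major, minor}, e.g. {49, 0} for a JDK 1.5 class file.
        static int[] read(InputStream in) throws IOException {
            DataInputStream data = new DataInputStream(in);
            if (data.readInt() != 0xCAFEBABE) {
                throw new IOException("not a Java class file");
            }
            int minor = data.readUnsignedShort();
            int major = data.readUnsignedShort();
            return new int[] { major, minor };
        }
    }
    ```

    Feeding it the example bytes CA FE BA BE 00 00 00 31 yields major 49 and minor 0, i.e. the "49.0" from the error message.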
    For point A)
    See the guide for double-checking the CBS parameters:
    help.sap.com - CBS Service Properties
    http://help.sap.com/saphelp_nw70/helpdata/EN/53/75b3407e73c57fe10000000a1550b0/frameset.htm
    Further explanation of the parameters BUILD_TOOL_JDK_HOME and JDK_HOME_PATHS:
    BUILD_TOOL_JDK_HOME = <path to highest JDK>
    JDK_HOME_PATHS = JDK1.3.1_HOME=<path of jdk131>;JDK1.4.2_HOME=<path of jdk142>;JDK1.5.0_HOME=<path of jdk150>;JDK1.6.0_HOME=<path of jdk160>;default=<path of the JDK used as default>
    Some simple rules, with examples:
    - for BUILD_TOOL_JDK_HOME you simply enter the path to your JDK, e.g. /opt/IBMJava2-amd64-142
    - for JDK_HOME_PATHS you have to follow the scheme "key=value", e.g. JDK1.4.2_HOME=/opt/IBMJava2-amd64-142
    - for BUILD_TOOL_JDK_HOME you always specify the highest JDK,
    - for JDK_HOME_PATHS you list the available JDKs,
    - a JRE is not allowed; always specify a JDK!
    For point B)
    Regarding the build variant in your track: check page 13 for the parameter com.sap.jdk.home_path_key in the guide:
    How To... Setup an NWDI Track for Composition Environment Developments
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/7014086d-3fd9-2910-80bd-be3417810c6f
    Summary:
    A. The first part of the solution is to double-check the CBS service parameters. The CBS service must be restarted after changes.
    B. The second part is to double-check the build variant in the track (see the Build Variants tab on the Track Data tab in the CMS web UI - Landscape Configurator).
    I recommend you set:
    A) BUILD_TOOL_JDK_HOME = <highest available JDK; I recommend setting the path to JDK 1.5>
    JDK_HOME_PATHS = JDK1.4.2_HOME=<path of jdk142>;JDK1.5.0_HOME=<path of jdk150>;default=<path of the JDK used as default, set it to the same path as for JDK1.4.2_HOME>
    B) set com.sap.jdk.home_path_key in the build variant to JDK1.4.2_HOME, or leave it at the default. Do not forget to set this build variant explicitly to JDK1.5.0_HOME for tracks >= 7.1.
    I hope this helps.
    Best Regards,
    Ervin

  • BI data collection question

    Dear colleagues,
    I have one big question regarding the BI data collection functionality (within the solution reporting tab of transaction DSWP).
    To whom it may concern:
    According to the SAP tutor located in the learning map,
    it looks like the SolMan system requires XI to be installed in order to use BI data collection. Is this correct? If so, is there any step-by-step configuration guide that covers everything from installing XI to enabling the BI data collection function?
    Thank you guys in advance.
    rgds,
    Yoshi

    Hi,
    But we want to maintain Item Master (ITEM02) and Item Branch (ITEM03) data as separate InfoObjects, because the descriptions are always changing; that is the reason I created different master data InfoObjects.
    Let me give some sample data:
    Record 1: IT001 LOT1 BRAN1 CODE1 CODE2 DS01 DS02 in 2006
    Record 2: IT001 LOT1 BRAN1 CODE1 CODE2 DS05 DS04 in 2007
    If you look at the above data, in 2006 the item descriptions were DS01, DS02 and in 2007 they changed to DS05, DS04 with the same item number.
    We want the item descriptions to be the same in 2006 and 2007 if we change them in 2007. How can that be done?
    If I go your way, how can we achieve this task?
    Do we need delta loads?
    Also, do I need to create an InfoCube? Can I use the ITEMNO InfoObject as a data target?
    Please let me know your thoughts.
    Regards,
    ALEX

  • How to enforce data collection

    Hi everybody,
    how can I enforce data collection at a specific operation? I am using the integrated POD showing the data to be collected in one part of the window. The data collection itself works fine, but it is merely optional. I can complete the operation without a warning.
    So, where is the hook or rule that I need to set up in order to check whether data collection has been done before completion of the operation?
    Georg

    Hello,
Please check the link below:
    http://help.sap.com/saphelp_me52/helpdata/EN/d7/f7f0be3fec4a31bec083a035eb2423/content.htm
This explains activity CT500.
This activity deals with checking components, making them mandatory for assembly, etc.
You can change the settings for the CT500 and CT500_RICH activities in Activity Maintenance.
On the POD you have to use these activities.
Note: The system executes all code associated with a hook point in the same database transaction. For hook points within POD pushbutton activities, the transaction includes a single pushbutton activity, such as Start (PR500). If the hook point activity fails, the system rolls back, or cancels, the entire transaction. For example, in the figure in Site Level Hook Points, if you associate Check Configuration (CT520) with the POST_START hook point and the components have not been assembled, the system rolls back the Start as well. This is true for all hook points.
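The all-or-nothing behaviour described in the note can be sketched roughly as follows. This is a simplified illustration of the transaction semantics only, not SAP ME code; all names here are hypothetical:

```python
class HookFailure(Exception):
    """Raised when a hook-point activity (e.g. a CT520 check) fails."""

def run_pushbutton_activity(db, activity, post_hooks):
    """Run one POD pushbutton activity (e.g. Start/PR500) together with
    its POST hook activities in a single database transaction."""
    db.begin()
    try:
        activity(db)                  # the pushbutton activity itself
        for hook in post_hooks:       # e.g. Check Configuration at POST_START
            hook(db)
        db.commit()                   # only commits if every hook succeeded
    except HookFailure:
        db.rollback()                 # a failed hook rolls back the Start too
        raise
```

The point of the sketch: because the hook runs inside the same transaction as the pushbutton activity, a failing check undoes the Start as well, which is exactly why a mandatory data-collection check placed at the right hook point can prevent completion.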
    br,
    Pushkar

  • Unable to Data Collection Methods in CCMS/RZ20

    Hi gurus,
There are a lot of Monitoring Objects that have no data collection methods assigned yet, which I am trying to do.
I am able to assign methods only for data analysis and auto-reaction, but not for "data collection", which remains in display-only mode even after using the edit option.
    Kindly let me know how to proceed.
Regards,
    Sandeep.

    Hi,
in the past there was an option to change the data supplier. After several customers accidentally entered analysis or auto-reaction methods there, we changed that field to read-only, because it is completely preconfigured; there is no need to change it. And you cannot know the name of the data supplier that would have to be entered.
NO METHOD does not mean there is no data supplier. A lot of MTEs are refreshed with data although NO METHOD is entered. These MTEs are refreshed by so-called active data suppliers; they run independently of the CCMS monitoring infrastructure.
Example: Syslog. Data collection method: NO METHOD. Real data supplier: the SAP kernel, which reports into the syslog (when there is something to report) and into CCMS in parallel. Any change to that MTE would cause a problem in data supplying.
So please feel free to open an SAP call whenever an MTE in CCMS is white (no data) and you are interested in that information. Sometimes we disable data supplying by default; in that case we can tell you how to enable it.
    Best regards
    Christoph Nake
    PM CCMS

  • RZ70 - Data Collection

    Hi,
I am trying to configure the SLD of my SneakPreview WAS Java system on my local PC against an existing R/3 test system. The existing R/3 test system is already configured to talk to the test Portal system.
Now, when I go to transaction RZ70 in the R/3 test system and do a 'Start Data Collection' with the host name 'localhost':
1) Will it create a technical ABAP system in my SneakPreview WAS server?
2) Will it affect the existing connection that is already established with the test Portal system?
    Rgrds,
    Gandolf S

    Hi Gandolf,
Have you given the Fully Qualified Domain Name (FQDN) of the gateway server both in the SLD settings and while executing the RZ70 transaction?
Make sure there are no port issues between the J2EE server and the R/3 server. By default, the gateway port of the R/3 system is 3300, provided the instance number of the R/3 installation is 00; the gateway port changes with the instance number, the convention being 33<instance number of the installation>. Make sure the servers are in the same domain; if not, open the appropriate ports between the two systems.
Make sure you have added an entry for the R/3 system in the <b>hosts</b> file of your J2EE system. Right now you are using the gateway server name instead of the R/3 server's IP address, and in that case this entry is mandatory
(the IP address to host name mapping must be in the hosts file in order to resolve the host name). Check this point as well.
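The 33<instance number> port convention mentioned above can be expressed in a few lines (a small helper for illustration; the function name is my own):

```python
def gateway_port(instance_number: int) -> int:
    """Gateway port of an SAP instance: 33 followed by the two-digit
    instance number, e.g. instance 00 -> 3300, instance 12 -> 3312."""
    if not 0 <= instance_number <= 99:
        raise ValueError("instance number must be between 00 and 99")
    return int("33%02d" % instance_number)
```

So for the default instance number 00 you would open port 3300 between the two systems.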
Also check the two destinations SLD_UC and SLD_NUC:
Log on to the backend system and execute SM59. Under TCP/IP Connections, double-click SLD_NUC and check the Program ID entered against the registered server program; by default it is SLD_NUC.
Then run a connection test for this destination and check the status message (Successful or Error).
Do the same for the SLD_UC destination and note the result of the connection test.
If you have already started the SLD data supplier bridge with the same gateway server and service name that you specify while executing the RZ70 transaction, the result will be successful.
It works like this:
When you start the SLD data supplier bridge after filling in the server and service details, two program IDs, SLD_UC and SLD_NUC, are registered at the gateway server. When you execute the RZ70 transaction, the relevant data is collected by the data collector programs and sent to the two registered server programs. If these program IDs are not yet registered with the gateway server, you will get an exception like 'Program ID not registered'.
(This is similar to an outbound JCo connection used for accessing J2EE applications
from the SAP system.)
Check these things.
    Regards,
    Kishor Gopinathan

  • SAP data files

    Hi
    My SAP version is ECC 5.0 with oracle 9.2.0.7 on SOLARIS 10.
My current DB size is 300 GB, with 23 data files under /oracle/<SID>/sapdata1...4.
I have implemented the MM, FICO, PP and QM modules. My monthly DB growth rate is 30-35 GB.
My questions:
01. Is this growth rate acceptable? If not, how can I find the cause?
02. The PSAP<SID> tablespace is where my data is written. I have to keep monitoring the space in this tablespace and extend it manually once it is full. Why do I need to do this when I have enabled AUTOEXTEND ON, and why does it not extend automatically?
03. Table reorganization/index rebuilding is not giving me any gain. Why?
Please give me your feedback.
At the current growth rate I can go on for another 1.5 years, but what should I do after that?
    Roshantha

    Hi,
You can also use BRTools for this.
>> We are using Oracle 10g on RHEL 5.4 64-bit. I want to move all my SAP data files along with TEMP. Are there any specific notes I need to follow apart from 868750? There is no mention of the TEMP file in that note.
    You can move all the datafiles including TEMP tablespace, by the statement, below;
    ALTER DATABASE RENAME FILE '<old_full_path>' TO '<new_full_path>';
>> Apart from this, I want to know what changes need to be made in the control file for the new file system, so that the database can be started using the changed control file containing the information of the renamed and moved files.
You do not need to change anything in the control file manually; it is updated automatically after you execute the statement.
>> Are there any Oracle parameters or profile parameters that need to be changed, and how much downtime is required?
No, you don't need to change any database profile parameter after you move the datafile(s). If you prepare a script and execute it, it will be done in a few seconds, because only the configuration is updated; the datafiles are not created from scratch.
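For a whole file-system move, a small helper script along these lines could generate one rename statement per datafile (a sketch only; the directory and file names are hypothetical, and the files must of course be moved at OS level, with the database in the appropriate state, before the statements are executed):

```python
def rename_statements(old_dir: str, new_dir: str, datafiles: list) -> list:
    """Build one ALTER DATABASE RENAME FILE statement per datafile
    that is moved from old_dir to new_dir."""
    stmts = []
    for name in datafiles:
        old_path = "%s/%s" % (old_dir.rstrip("/"), name)
        new_path = "%s/%s" % (new_dir.rstrip("/"), name)
        stmts.append(
            "ALTER DATABASE RENAME FILE '%s' TO '%s';" % (old_path, new_path)
        )
    return stmts

# Example (hypothetical SID and file names):
for stmt in rename_statements("/oracle/C11/sapdata1",
                              "/oracle/C11/sapdata5",
                              ["data1.dbf", "data2.dbf"]):
    print(stmt)
```

This keeps the rename step scriptable, which is why the actual downtime is only the few seconds it takes to run the statements.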
    Best regards,
    Orkun Gedik
