RZ70 - Data Collection

Hi,
I am trying to configure the SLD of my SneakPreview WAS Java system on my local PC to work with an existing R/3 test system. The existing R/3 test system is already configured to talk to the Test Portal system.
Now, if I go to transaction RZ70 in the R/3 test system and run 'Start Data Collection' with the host name 'localhost':
1) Will it create a technical ABAP system in my SneakPreview WAS server?
2) Will it affect the existing connection that has already been established with the Test Portal system?
Regards,
Gandolf S

Hi Gandolf,
Did you enter the fully qualified domain name (FQDN) of the gateway server, both for the gateway server name in the SLD settings and when executing transaction RZ70?
Make sure there are no port issues between the J2EE server and the R/3 server. By default, the gateway port of the R/3 system is 3300 (provided the instance number of the installation is 00); the gateway port changes with the instance number, following the convention 33<instance number>, so for example instance 03 listens on 3303. Make sure the servers are in the same domain; if not, open the appropriate ports between the two systems.
Also make sure that you have added an entry for the R/3 system in the hosts file of your J2EE system. Right now you are using the gateway server name instead of the R/3 server IP address, and in that case this entry is mandatory: the IP address to host name mapping must be present in the hosts file so that the name can be resolved. Check this point as well; a small connectivity check is sketched below.
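If it helps, here is a minimal sketch in Python that you can run on the J2EE host to verify both points, name resolution and gateway port reachability. The host name and instance number below are placeholders, not values from your system, so adjust them to your landscape.

import socket

r3_host = "r3testhost.domain.com"   # placeholder: FQDN of the R/3 gateway host
instance_no = "00"                  # placeholder: instance number of the R/3 system
gw_port = int("33" + instance_no)   # gateway port convention: 33<instance number>

# Name resolution fails if the entry is missing from the hosts file and DNS
ip_address = socket.gethostbyname(r3_host)
print("Resolved", r3_host, "to", ip_address)

# The connection attempt fails if the gateway port is blocked between the systems
with socket.create_connection((r3_host, gw_port), timeout=5):
    print("Gateway port", gw_port, "is reachable")

If either call raises an exception, fix the hosts file entry or open the port before retrying RZ70.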
Also check the two RFC destinations SLD_UC and SLD_NUC.
Log on to the backend system and execute SM59. Under TCP/IP Connections, double-click SLD_NUC and check the program ID entered for the registered server program; by default it is SLD_NUC.
Then run a connection test for this destination and check whether the status message reports success or an error.
Do the same for the SLD_UC destination and note the result of its connection test.
If you have already started the SLD data supplier bridge with the same gateway server and service name that you specify when executing RZ70, the test will be successful.
It works like this:
When you start the SLD data supplier bridge after filling in the server and service details, the two program IDs SLD_UC and SLD_NUC are registered at the gateway server. When you execute RZ70, the data collector programs collect the relevant data and send it to these two registered server programs. If the program IDs are not yet registered at the gateway server, you get an exception such as 'Program ID not registered'.
(This is similar to an outbound JCo connection used to access J2EE applications from the SAP system.)
Check these things.
Regards,
Kishor Gopinathan

Similar Messages

  • Problem setting data collection for SLD

    Good day
    I have the following situation.
    A Solution Manager 7.1 installation, recently patched to SP11. Previously I had the SLD on this system, but a colleague pointed out that this could become a problem in the future, because my ECC systems are on Basis 731, which is a higher release than the 702 on the Solution Manager system. So I installed a separate Java stack system on the same server as Solution Manager.
    The problem I am facing now is that I can't get the data collection (RZ70) to connect to the new Java system that I have installed. When I go into SLD > Administration, I see under Details > Data Suppliers that the server is specified as the gateway host, with a gateway port of 3303. On the Solution Manager system, the gateway host was specified as "localhost" and the gateway port as sapgw00.
    So, from RZ70 I specify the gateway service as sapgw03 and then I get the result:
    0: DSIMSID001_SID_00 : Collection of SLD data finished
    0: DSIMSID001_SID_00 : Data collected successfully
    0: DSIMSID001_SID_00 : RFC data prepared
    0: DSIMSID001_SID_00 : Used RFC destination: SLD_NUC
    0: DSIMSID001_SID_00 : RFC call failed: Error when opening an RFC connection (CPIC-CALL: 'ThSAPOCMINIT' : cmRc=17 thRc=2
    0: DSIMSID001_SID_00 : Batch job not scheduled
    And from the SLD_NUC entry in SM59 I get
    Logon    Connection Error
    Error Details    Error when opening an RFC connection (CPIC-CALL: 'ThSAPOCMINIT' : cmRc=17 thRc=2
    Error Details    ERROR: SAP gateway communication error (Is SAP gateway closed?)
    Error Details    LOCATION: SAP-Server DSIMSID001_SID_00 on host DSIMSID001 (wp 1)
    Error Details    COMPONENT: CPIC
    Error Details    COUNTER: 47
    Error Details    MODULE:
    Error Details    LINE:
    Error Details    RETURN CODE: 239
    Error Details    SUBRC: 0
    Error Details    RELEASE: 721
    Error Details    TIME: Mon Mar 16 14:04:05 2015
    Error Details    VERSION:
    Can anyone give me a few ideas what may be wrong here please?
    Regards
    Ray Phillips

    Hi Ray,
    Please check whether the SLD configuration below has been done. You can use either of the two links below, depending on how you want to configure your SLD.
    Configure SLD for JCo and Creation of JCo Destinations - Portal - SCN Wiki
    or
    How to Create & Configure Local SLD on SAP NetWeaver 7.3 AS Java after Installation
    If your SLD server is not a local SLD, make sure that the required host and service entries exist in /etc/hosts and /etc/services on both the SLD system and the remote system; a quick check is sketched below.
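    As a quick check, here is a minimal sketch in Python that confirms those entries resolve; sapgw03 is the gateway service mentioned above, and the host name is a placeholder taken from the error text, so adjust both to the system you run it on.

    import socket

    gw_service = "sapgw03"      # gateway service name used in RZ70; must exist in /etc/services
    remote_host = "DSIMSID001"  # placeholder host name; must resolve via /etc/hosts or DNS

    # Raises OSError if the service entry is missing from /etc/services
    print(gw_service, "->", socket.getservbyname(gw_service, "tcp"))

    # Raises an error if the host name cannot be resolved
    print(remote_host, "->", socket.gethostbyname(remote_host))

    If the service lookup succeeds, it should print 3303, matching the gateway port shown under Data Suppliers.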

  • Can not start data Collection service

    I have installed Fabric Manager version 3.3(4). In the Performance Web Client I cannot start the Data Collection service; it stays in 'starting' forever.
    I also could not stop it. I rebooted the server, and the status of this service is still 'starting'.
    Thanks,

    Please provide more details like: Server OS, Java version, and what type of database you are using for the FM Server.  Have you considered using a newer version of FM Server?

  • Customer Name on Sales Order not in Data Collections

    Hi,
    I am looking for some help. My company just did an upgrade from 11i to R12 in the middle of May. As part of the upgrade we installed ASCP. We have noticed that certain customers are not being collected when running Standard Data Collections. We have open sales orders in the system for these customers, and the sales orders are being collected properly and appear in the plan. However, when we show the Customer field in the 'Supply/Demand Screen' in ASCP, the field is blank for select customers. Is there anything on the customer master or sales order line that would prevent the customer name from being collected? Any insight would be helpful.
    Thanks
    Rich

    Go to the Order Organizer screen and select the order number that you cannot see in the ASCP workbench. Go to the line items and, using the Show Field option, display the 'Demand Visible' field on the screen. Its default value should be 'Y'; if it is blank, that could be one of the reasons.

  • Data Collection -- Planning Data Pull process failed

    Hello Experts,
    I am having a problem with data collection: the Planning Data Pull completed with an error. It seems the problem is in the Planning Data Pull Worker.
    I am using 2 workers to run the Planning Data Pull, and both ended in errors. Below is part of the error log:
    16-MAY 19:11:34 : Procedure MSC_CL_SETUP_PULL.LOAD_CALENDAR_DATE started.
    Into populate_rsrc_cal
    16-MAY-2011 19:11:40
    APS string is Invalid, check for Error condition
    16-MAY 19:11:40 : Error in Routine GMP_CALENDAR_PKG.POPULATE_RSRC_CAL.
    16-MAY 19:11:40 : User-Defined Exception
    16-MAY 19:11:40 : User-Defined Exception
    16-MAY 19:11:40 : Error_Stack...
    16-MAY 19:11:40 : ORA-06510: PL/SQL: unhandled user-defined exception
    16-MAY 19:11:40 : Error_Backtrace...
    16-MAY 19:11:40 : ORA-06512: at "APPS.MSC_CL_PULL", line 6218
    ORA-06512: at "APPS.MSC_CL_PULL", line 1583
    Help me please.
    Thanks & Regards,
    Andi

    This error shows up when it cannot find any organizations.
    Use the SQL statements below to identify this:
    1. select MSC_CL_PULL.get_org_str(&instance_id) from dual;
    2. select instance_code, instance_id from msc.msc_apps_instances where instance_code = '&instance_code';
    3. select * from apps.msc_instance_orgs where sr_instance_id = &instance_id;
    SQL 1 should in principle return the orgs available; if it returns null or -9999, then it's an orgs problem.
    SQL 2 gives you the instance_id to substitute into SQL 1 and SQL 3, and SQL 3 will show the available orgs.
    If the organizations are enabled for planning in the ASCP Instances form, then run a targeted refresh for trading partners (suppliers/customers/orgs). If this works with no errors, then look at the table msc_trading_partners: it should contain records for the orgs with partner_type = 3.

  • How to perform Data Collection on single SFC with QTY = 1 with material lot size 1?

    Dear experts,
    We are working with SFC qty > 1 on a relaxed routing. At a given operation we want to collect data on a single quantity, i.e. the SFC qty at the operation where the collection happens will be 1. The corresponding material lot size is, for example, 10. The operator must be able to collect data on the SFC with qty = 1 multiple times until the quantities are consumed. He must also be able to collect other values on the remaining quantities at the same operation, with the same DC group or other DC groups. How many times the data must be collected depends on the shop order build quantity; the data may be collected several times, but not more than the build qty. In other words, some specific data will be collected on part of the product quantity while other data will be collected against the remaining quantity. The data collection must also be done in a serialized manner.
    Here's what we have set up so far:
    1) 3 DC groups, each DC group has 3 data fields.
    2) Each data field has the following restrictions:  Required Data Entries = 0 and Optional Data Entries = 1
    3) All DC groups are attached on the same combination of operation\material\routing
    4) we are using relaxed routing
    Process description:
    The operator must be able to collect any data field on a single product. To do that, he enters the operation to which the data collection is attached, enters the SFC with qty = 1, and runs the data collection after selecting the appropriate DC group and entering the needed information. The operator then completes the SFC with qty = 1.
    The operator then picks the next product, selects the same SFC with qty = 1, and collects another value against this product.
    The problem is:
    Once the first collection is done on a given SFC with qty = 1, the system does not allow the operator to do further collections on the same SFC, whether with qty = 1 or any other quantity. He cannot select any DC group from the DC group list. We also tried the table selection menu on the DC group list, but nothing can be selected.
    So we tried to play around with the DC group definitions as follows:
    A) We set Required Data Entries = 0 and Optional Data Entries = 10. The operator was still not able to select any DC group after collecting data the first time. We tried to reopen the POD and list again, but we got the same blocking behavior.
    B) We set Required Data Entries = 10 and Optional Data Entries = 1. The operator was able to select the DC group after collecting data the first time, BUT the operator must then enter the data fields 10 times for one SFC quantity, which is not what we want. Besides, again he cannot collect other information on the remaining quantities at the same operation.
    C) There is an option to serialize the SFC before it reaches the operation where the collection happens and to merge it after completion. Automation would be needed here, hence customization. We are strongly trying to avoid customization, since we expect data collection to work on single quantities even when the main SFC has qty > 1.
    Questions:
    1) Are we missing any further configuration or setup?
    2) Or does the current system design not allow collecting data on single quantities of an SFC whose main quantity is greater than 1?
    3) Looking at this link, Approaches to Collection of Data - SAP Manufacturing Execution (SAP ME) - SAP Library, there is nothing mentioned about the same SFC number with multiple quantities.
    We are using SAP ME 15.0.3.0.
    Thanks in advance for your help
    Ali

    Ali
    To collect data for the same SFC multiple times, the system rule "Allow Multiple Data Collection" needs to be set to true for the site.
    Stuart

  • What is the significance of data collection nad search help exit field ?

    Dear Gurus
    I know I am asking a very basic ABAP question, but SDN is the only source I have for learning SAP. I want to thank you all for your kind support.
    I have read most of the posts related to search helps and am trying to create one,
    an elementary search help:
    SE11 -> SEARCH HELP -> ELEMENTARY SEARCH HELP
    I have doubts regarding the fields "DATA COLLECTION" and "SEARCH HELP EXIT".
    According to a tutorial, the data collection is a maintenance view; do I have to create a maintenance view first?
    The other field is SEARCH HELP EXIT; what is this?
    Please help me.
    Thanks in advance.
    Chitta Ranjan mahato.
    Edited by: chitto123 on Oct 8, 2010 5:59 AM

    Howdy,
    DATA COLLECTION - refers to a database table or view.  This is the data that the search help will search through and display based on the parameters provided, so you can create your own view for the search help if you want the search to cover multiple tables.
    SEARCH HELP EXIT - You can create a function module to be able to alter the Search Help's selection and results at various events throughout the search help.  An example of this function module is provided with some documentation in function module F4IF_SHLP_EXIT_EXAMPLE.
    Cheers
    Alex

  • Errors in weekly data collection

    Hello,
    I am running ECC 6.0 SP15 on AIX 5.3 with Oracle 10.2.0.2.
    When I go to transaction ST03 -> Workload -> <instance_name> -> week directory and try to open one of the logs, I get the following error message at the bottom of the screen: "unknown period type (W)".
    When I try to view a daily log, it is displayed without a problem.
    Moreover, when I press the user mode button (expert mode or service engineer options), switch to service engineer and try to calculate, both in the background and in dialog, I get the same error message: unknown period type (W). When I press the user mode button again, I get a dump: DYNPRO_NOT_FOUND - dynpro does not exist, the current ABAP program SAPWL_ST03N had to be terminated.
    Saposcol is running, and the background job SAP_COLLECTOR_FOR_PERFMONITOR is running and working (daily data is collected and displayed correctly).
    Each week I get an EarlyWatch Alert report, but the data collection section contains errors with the following exception: SAPWL_WORKLOAD_GET_SUMMERY_I_W - NO_DATA_FOUND.
    Please advise.
    Regards,
    Moshe

    Done - I had to import SAP Note 1014116.

  • Is there a way to export the data collected with PDF forms, and saved via FormsCentral, into a MS Excel or Access file format?

    Or is there another place to save data collected by PDF forms other than FormsCentral, e.g. another database on the web server that presents the form?

    Hi,
    Yes, you can export user responses to an Excel file. To do that, open the Responses tab, select Files -> Export Responses, and select Microsoft Excel as the format to export the response data into.
    Thanks,
    Wenlan

  • Table in which system updates data collected through KKRV

    Hi,
    Can anyone tell me the table in which the system stores the data collected through KKRV,
    and the correct procedure for collecting the data for product drilldown?
    Can I run KKRC along with KKRV to get summarized reports?
    Thanks in advance
    Regards,

    You can probably trace them in the COSP or COSS tables, where the object starts with VD (summarization object).
    The summarization transactions are better run separately: first KKRV to summarize plant-level information, and then KKRC for the hierarchy summarization.

  • Performance Report Manager Data Collection Issues

    I have an agent running and collecting data in a circular file, history.log.
    Data collection failed approximately 6 days ago without warning. I have tried to find the cause and have not had any success.
    If I open the Data Collection Window in PRM I see a Yellow triangle beside the hostname:port.
    If you can provide any information as to what to look for, please reply.

    history.log may be corrupted. If you run it through ccat to flatten it, does it show any recent (i.e. newer than 6 days) entries?
    If ccat gives errors, just move it out of the way (i.e. rename it to history.log.busted) and restart your agent. A fresh log will be created. Give it a couple of hours to make a couple of flushes of data to your SunMC server and see if PRM works.
    Regards,
    Mike

  • BCS - Data collection issue

    Hi,
    I'm using BCS 4.0. I'm now in final testing and I have some questions regarding the data collection process using the load-from-data-stream method. I ran the task in the consolidation monitor for period 007.2007 and for all companies without any errors or warnings, but we have differences in the financial information for that period.
    I reviewed the content of my BCS cube (RSA1) and we don't have any data for those accounts; the only thing I found was that all documents were created on the same date.
    I deleted the request ID in RSA1 in my BCS cube and executed the task again in the consolidation monitor, but the result was the same.
    Looking at the log / source data, the rows listed are not taken from the InfoProvider.
    Any idea what the problem could be?
    Thanks
    Nayeli

    Hi Nayeli,
    I had to do this kind of job (reconciling data between the source basis cube and the totals cube) many times during final testing, together with an accountant.
    The only way to get a first clue about what is happening is to compare every particular amount in both cubes, comparing and trying to figure out any dependencies in the data.
    The difference might arise for ANY reason. Only you can analyze the data and decide what to do.
    AFAIU, you compared only reported data and did no currency translation and eliminations?
    Then I'm afraid you've made a very big mistake by deleting the request from the totals cube: you have broken the consistency between the totals and document data. For a transactional cube, a request stays yellow until the number of records in it reaches 50,000; it is then closed and becomes green. As you may understand, those 50,000 records may contain data for different posting periods, for example reported data for the previous closed period, while the documents still contain the eliminations for that period - an inconsistency.

  • Data collection table in database

    Hi,
    In our SAP ME WIP database we are not able to see the parameter values for data collection as entered by the operator.
    We searched in the WIP database because the system rule 'Store Data Collection Results in ODS' is set to false.
    Are there any other settings, apart from those mentioned in the how-to guide for data collection, that control where the data collection results are stored?
    Or are we going wrong somewhere?
    Attached is the recommended landscape that we have used in our project.
    Regards
    Mansi
    P.S.

    I think this post is in the wrong forum, but what table are you looking in for your data? 
    Did you figure out the answer to this problem?

  • Too many BPM data collection jobs on backend system

    Hi all,
    We find about 40,000 data collection jobs running on our ECC6 system, far too many.
    We run about 12 solutions, all linked to the same backend ECC6 system. Most probably this is part of the problem. We plan to scale down to 1 solution rather than the country-based approach.
    But here we are now, and I have these questions.
    1. How can I relate a BPM_DATA_COLLECTION job on ECC6 back to a particular solution? The job log gives me a monitor ID, but I can't relate that back to a solution.
    2. If I deactivate a solution in the solution overview, does that immediately cancel the data collection for that solution?
    3. In the monitoring schedule on a business process step we sometimes have intervals defined as 5 minutes, sometimes 60. The strange thing is that the drop-down for that field does not always give us the same list of values. Even within a solution, in one step I have the choice of a long list of intervals, while in the next step of the same business process I can only choose between blank and 5 minutes.
    How is this defined ?
    Thanks in advance,
    Rad.

    Hi,
    How did you manage to get rid of this issue? I am facing the same problem.
    Thanks,
    Manan

  • IDoc Monitoring: Data collection not getting completed

    Hi,
    I have set up IDoc monitoring in Solution Manager 7.1 SP08 for a managed system which is on ST-A/PI 01Q. Many IDocs are being generated in the managed system, but I am not getting any alerts for them.
    In the alert inbox I get the message that data collection has not yet taken place since the last activation, and the last data collection time is shown as a date in 2009.
    Also, in the managed system the BPMon background job BPM_DATA_COLLECTION_2 gets cancelled with the runtime error TSV_TNEW_BLOCKS_NO_ROLL_MEMORY.
    What can be the issue?
    Regards
    Vishal

    Hi Vishal,
    Please check and implement the notes below to fix the first issue:
    1570282 - Advance Corr. BPMon ST-A/PI 01N
    1946940 - ST-A/PI 01Q SP2: Advance Correction BPMon (Infrastructure and EBO)
    1869678 - Advanced Corrections for BPmon/Analytics Delivered with 01Q SP1 (framework)
    Thanks
    Vikram
