Find activities which are related to an opportunity

Hello Experts
I can relate an activity to an opportunity.
I want to select this information from the system. In which tables do I find the information?
How can I find the activities which are related to an opportunity?
Regards,
Sven

Hello Vellory,
You can query the table CRMD_LINK to retrieve the follow-up documents, but there you will run into a little confusion about which entries to select.
A much better and cleaner approach is to use the function module CRM_ORDER_READ. Using this FM you can very easily retrieve all the following and preceding documents in the export table parameter ET_DOC_FLOW. From there you can call CRM_ORDER_READ again to get the additional details for each order document.
You can even execute the report CRM_ORDER_READ in transaction SE38 to get a feel for the actual FM.
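The call described above can be sketched roughly as follows (a sketch only: the parameter names follow the standard CRM_ORDER_READ interface, but the exact table types and exception list may differ in your release, so check them in SE37; lv_opp_guid is assumed to already hold the GUID of your opportunity):

```abap
DATA: lt_header_guid TYPE crmt_object_guid_tab,
      lt_doc_flow    TYPE crmt_doc_flow_wrkt,
      lv_opp_guid    TYPE crmt_object_guid. " GUID of the opportunity (assumed known)

APPEND lv_opp_guid TO lt_header_guid.

CALL FUNCTION 'CRM_ORDER_READ'
  EXPORTING
    it_header_guid     = lt_header_guid
  IMPORTING
    et_doc_flow        = lt_doc_flow
  EXCEPTIONS
    document_not_found = 1
    error_occurred     = 2
    OTHERS             = 3.

IF sy-subrc = 0.
  " Each lt_doc_flow entry links the opportunity to a preceding or
  " following document (e.g. an activity); feed those GUIDs into a
  " second CRM_ORDER_READ call to read the activity details.
ENDIF.
```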
Hope this will help.
Thanks,
Samantak.

Similar Messages

  • Migrating the SAP Scripts to Smartforms which are related to Finance (FI)

    Hi,
    Can we migrate the SAP Scripts related to Finance (FI) to Smartforms, given that those SAP Scripts are called by a standard print program (RFFOUS_C)?
    An urgent answer is needed, please!
    Thanks,

    Hello,
    Yes, you can migrate a script to a Smartform; there are different ways to do the migration.
    Use program SF_MIGRATE to migrate from scripts to Smartforms.
    Reward if helpful.
    Viswam.

  • Function modules which are related to transaction codes (XD01 & VL01N)

    Hi experts,
    Can anybody give me the function modules which are related to transaction codes XD01 & VL01N? Please help me.

    Hi Santosh,
    FMs related to XD01:
    QPL1_INSPECTION_LOT_CREATE - for creating inspection lots
    BAPI_INSPECTIONPLAN_CREATE - for creating inspection plans
    FMs related to VL01N (used for creating deliveries):
    WS_DELIVERY_UPDATE
    BAPI_OUTB_DELIVERY_CREATE
    BAPI_OUTB_DELIVERY_SAVEREPLICA
    BAPI_OUTB_DELIVERY_CONFIRM_DEC
    BAPI_OUTB_DELIVERY_SPLIT_DEC
    SHP_VL10_DELIVERY_CREATE
    Regards,
      Sunil

  • Where can I find out which areas in London Actuall...

    Where can I find out which areas in London actually have the fibre optics promised last year? I have tried with BT and on the net, but I can't actually find anything that would tell me where the elusive fibre optics are. I am being asked all the time for a postcode, and if I knew one, I would be living there.

    The problem you're going to have is not only finding where it's available, but also whether there are any connections still available in the local cabinet. I'm in London, in a slow-spot (500 kbps) for normal broadband, and when the engineer did my Infinity install last Friday he told me that I got one of the last connections available (they only put in 100, based apparently on take-up in other areas; never mind that, given the speed round here over the copper wire, a lot of people are going to want Infinity...)

  • How to find documents which are not linked to a project?

    Hi all,
    I'm just looking for a way to search for documents which are not linked to a project within transaction SOLAR_EVAL.
    I need a way to report how many documents are not linked to a project and are just stored in the KW.
    Can anyone give a hint?
    Thanks a lot!
    Jan

    Hi Jan,
    The report SOLMAN_UNUSED_DOCUMENTS will help you identify the documents which are not linked to the project.
    Alternatively, use transaction SI80 to find any document in the SAP Solution Manager KW.
    Regards,
    Sanjai

  • Find columns which are null

    Hello all, I am currently developing a data warehouse that holds time-series values. In the 5-min interval fact table, there is a column for every 5-minute segment of a 12 hour period. So I have a few foreign keys which make up a composite primary key in the fact table, followed by generally-named columns m0,m5,m10...m1145,m1150,m1155 (m0 represents 00:00 [12:00] and m1155 represents 11:55). There is a flag in the row to denote AM or PM. This database collects various metrics from a Coherence environment for a live monitoring solution that also has a Historian to track trends, get a graphical historical overlay against the current values coming into the live environment, etc.
    So I have this table with aggregation at a 5 minute level and will need to roll this up to hourly and daily aggregation levels during an ETL process. I'd like to be able to pinpoint which data, if any, has holes in it IE the collectors hiccupped somewhere for whatever reason and did not report a 5 minute segment to me.
    Is there any way to find rows which have null values and to report back which column names contain the null values? A use case: I have a row in the 5-min fact table that contains a null value for column m455, which means there is a gap in my time series at 4:55 for that object. During the ETL roll-ups, I'd like to be able to record which time slots have "holes" (null values) in them, so that we can pinpoint why they occur and prevent them from occurring again during the development of this solution.
    Thanks for anyone's help!!
    Regards,
    TimS

    TimS wrote:
    create table f_metric_5_min (
         metric_key number(10) not null,
         object_key number(19) not null,
         date_key number(5) not null,
         segment_id number(1) not null,
         m0 number(19,4),
         m5 number(19,4),
         m10 number(19,4),
         m15 number(19,4),
         m20 number(19,4),
         m25 number(19,4),
         m30 number(19,4),
         m35 number(19,4),
         m40 number(19,4),
         m45 number(19,4),
         m50 number(19,4),
         m55 number(19,4),
         m100 number(19,4),
         m105 number(19,4),
         m110 number(19,4),
         m115 number(19,4),
         m120 number(19,4),
         m125 number(19,4),
         m130 number(19,4),
         m135 number(19,4),
         m140 number(19,4),
         m145 number(19,4),
         m150 number(19,4),
         m155 number(19,4),
         m200 number(19,4),
         m205 number(19,4),
         m210 number(19,4),
         m215 number(19,4),
         m220 number(19,4),
         m225 number(19,4),
         m230 number(19,4),
         m235 number(19,4),
         m240 number(19,4),
         m245 number(19,4),
         m250 number(19,4),
         m255 number(19,4),
         m300 number(19,4),
         m305 number(19,4),
         m310 number(19,4),
         m315 number(19,4),
         m320 number(19,4),
         m325 number(19,4),
         m330 number(19,4),
         m335 number(19,4),
         m340 number(19,4),
         m345 number(19,4),
         m350 number(19,4),
         m355 number(19,4),
         m400 number(19,4),
         m405 number(19,4),
         m410 number(19,4),
         m415 number(19,4),
         m420 number(19,4),
         m425 number(19,4),
         m430 number(19,4),
         m435 number(19,4),
         m440 number(19,4),
         m445 number(19,4),
         m450 number(19,4),
         m455 number(19,4),
         m500 number(19,4),
         m505 number(19,4),
         m510 number(19,4),
         m515 number(19,4),
         m520 number(19,4),
         m525 number(19,4),
         m530 number(19,4),
         m535 number(19,4),
         m540 number(19,4),
         m545 number(19,4),
         m550 number(19,4),
         m555 number(19,4),
         m600 number(19,4),
         m605 number(19,4),
         m610 number(19,4),
         m615 number(19,4),
         m620 number(19,4),
         m625 number(19,4),
         m630 number(19,4),
         m635 number(19,4),
         m640 number(19,4),
         m645 number(19,4),
         m650 number(19,4),
         m655 number(19,4),
         m700 number(19,4),
         m705 number(19,4),
         m710 number(19,4),
         m715 number(19,4),
         m720 number(19,4),
         m725 number(19,4),
         m730 number(19,4),
         m735 number(19,4),
         m740 number(19,4),
         m745 number(19,4),
         m750 number(19,4),
         m755 number(19,4),
         m800 number(19,4),
         m805 number(19,4),
         m810 number(19,4),
         m815 number(19,4),
         m820 number(19,4),
         m825 number(19,4),
         m830 number(19,4),
         m835 number(19,4),
         m840 number(19,4),
         m845 number(19,4),
         m850 number(19,4),
         m855 number(19,4),
         m900 number(19,4),
         m905 number(19,4),
         m910 number(19,4),
         m915 number(19,4),
         m920 number(19,4),
         m925 number(19,4),
         m930 number(19,4),
         m935 number(19,4),
         m940 number(19,4),
         m945 number(19,4),
         m950 number(19,4),
         m955 number(19,4),
         m1000 number(19,4),
         m1005 number(19,4),
         m1010 number(19,4),
         m1015 number(19,4),
         m1020 number(19,4),
         m1025 number(19,4),
         m1030 number(19,4),
         m1035 number(19,4),
         m1040 number(19,4),
         m1045 number(19,4),
         m1050 number(19,4),
         m1055 number(19,4),
         m1100 number(19,4),
         m1105 number(19,4),
         m1110 number(19,4),
         m1115 number(19,4),
         m1120 number(19,4),
         m1125 number(19,4),
         m1130 number(19,4),
         m1135 number(19,4),
         m1140 number(19,4),
         m1145 number(19,4),
         m1150 number(19,4),
         m1155 number(19,4),
         constraint f_metric_5_min_pk primary key (metric_key, object_key, date_key, segment_id),
         constraint f_metric_5min_fk_metric_key foreign key (metric_key) references d_metric (metric_key),
         constraint f_metric_5min_fk_object_key foreign key (object_key) references d_object (object_key),
         constraint f_metric_5min_fk_date_key foreign key (date_key) references d_date (date_key)
    ) partition by range(date_key) (
         PARTITION def values less than (2558)
     ) tablespace FACT;
    The reason to define separate columns for time intervals is that we need to see the data coming from the Coherence environment in 5-minute intervals; creating a row for each 5-minute interval for each object (thousands of them) would be too time-consuming. Using MERGE is much faster for updating columns in the same row. And secondly, because my job tells me to go with this logical model :).
    I can understand the "job-tells-me" reason, but I am not sure I am convinced by the "time-consuming" issue,
    especially as it is a data warehouse. But I assume you must have verified the same. :)
    The NULL-finding exercise will be done after the data is loaded into the 5 minute metric table and before the hourly roll-up ETL.
    I'm not sure what you mean by "in what format". I plan to build a list and place it into a form of auditing table within the database. That table will be queried in the future to determine where the gaps are and for which objects, so that we can trace them back to a certain JVM and Coherence collector and stop the issue from occurring again; hence the point of a monitoring tool.
    Storing time in a table can take up a lot of space and is harder to compute procedurally. In this model the database does not care about the time, only the data; the date and time is not a focal point for this database, the database is abstract, with the Java behind it controlling the purpose of its data. The reporting engine after-the-fact cares about dates and such which can be easily determined by referencing to dimension tables to determine which column represents which time in the 12 hour sequence.
    Going back to the "null-finder": do you think there is any way to do this? Thanks for your help!
    Regards,
    TimS
    Do you mean that you want to get only those columns where there is a NULL value, for one record at a time?
    In that case, the only option I can think of is building dynamic SQL based on checking whether each column has a NULL value or not, although I hope somebody can come up with a better way to achieve the same.
    (Else, the "time-consuming" argument for rejecting one row per interval loses much of its advantage, doesn't it?) :)
    (Of course, only if the "null-finding" is a MUST exercise...)
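    Since the thread leaves the dynamic-SQL idea abstract, here is a minimal sketch of one way to build it, assuming Oracle and the f_metric_5_min DDL above. Note that with all 144 metric columns the generated string would exceed LISTAGG's 4000-byte limit, so for the real table you would assemble the statement in PL/SQL instead:

    ```sql
    -- Generate a query that, for each row, lists the names of the metric
    -- columns that are NULL (i.e. gaps in the 5-minute series).
    SELECT 'SELECT metric_key, object_key, date_key, segment_id, '
           || LISTAGG('CASE WHEN ' || column_name || ' IS NULL THEN '''
                      || column_name || ';'' END', ' || ')
              WITHIN GROUP (ORDER BY column_id)
           || ' AS null_cols FROM f_metric_5_min' AS generated_sql
    FROM   user_tab_columns
    WHERE  table_name  = 'F_METRIC_5_MIN'
    AND    column_name LIKE 'M%';
    ```

    Running the generated statement returns one row per fact row; any non-empty null_cols value names the time slots with holes, ready to be written into the auditing table.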

  • How to find aggregates which are not used.

    Hi ,
    There are a lot of aggregates in my system, so I should deactivate the aggregates which have not been used for a long time. How do I find all of those? Can you please guide me?

    You can also check the usage column of the aggregates. If the usage value is high, then the aggregate is used very frequently.
    If there is no value under "last used", then the aggregate is not used at all.
    Hope it helps
    Regards
    Sadeesh

  • Find systems which are updating automatically from the internet

    Hi team,
    Can anyone help me with this? In my environment I have around 1000 servers, and we are using WSUS for Windows Update,
    but some servers are missing: they are updating from the internet, i.e. by automatic update.
    Can I find the names of the servers which are taking updates from the internet? I don't want to check all the servers manually.
    Regards, Triyambak

    Hi Triyambak,
    I'm writing just to check in and see whether the suggestions were helpful. If
    you need further help, please feel free to reply to this post directly so we will be notified and can follow it up.
    If you have any feedback on our support, please click here.
    Best Regards,
    Anna
    TechNet Community Support

  • How to find reports which are using sales tables

    Hi Guys,
    We are using OBIEE 10g. I need to identify the reports which are using sales tables.
    The table names are given, but how do I find which reports are using these tables? Is there a method for this,
    or do we have to check all reports manually?
    Could anyone please advise?
    Regards,
    sk
    Edited by: 912736 on Jun 8, 2012 1:24 PM

    Hi SK,
    You can run a report from Catalog Manager that will give you all Answers requests and the subject area columns in use; you can map these back to the sales tables either manually or by linking (VLOOKUP) to an RPD report that you can run from the Admin tool.
    The Usage Tracking method is pretty good, but you will have to match up the reports using the logical SQL.
    I'd do both methods and cross-reference your results to ensure nothing slips through the net.
    Regards
    Alastair

  • DBIF_RSQL_SQL_ERROR which are related to BW ods Activation

    Hi friends,
    I don't know whether this is the right thread to post in, but technical people can definitely help me.
    Recently we upgraded from SP 22 to SP 25 for BI 7.0. Since then I have been facing all sorts of issues.
    Here is the impact.
    1. Data loads are not triggering for process chains as per their schedules; we are running emergency chains manually. We have raised an issue with SAP and are waiting for a reply.
    Short Dumps:
    b)
    DBIF_REPO_SQL_ERROR
    Short text
        SQL error 3114 occurred when accessing program "SAPLRS_EXCEPTION " part "LOAD".
    What happened?
        The system is no longer linked to an ORACLE instance.
        No further operations can be performed on the database.
    Error analysis
        The system attempted to access an ORACLE instance to which
        it is no longer linked.
        This situation may arise because the ORACLE instance
        has been stopped due to an error or an external operation.
    How to correct the error
        Database error text........: "ORA-03114: not connected to ORACLE"
        Triggering SQL statement...: "SAPLRS_EXCEPTION "
        The current status of the ORACLE instance cannot be determined.
        It may be still inactive or the database administrator may have
        restarted it.
        In any case, all systems that have accessed the ORACLE instance
        must be restarted after the ORACLE instance has been restarted.
        If you were working on a central system, inform
        your system administrator.
        If you were working on an external local system, inform
        the person responsible for this system.
        If you were working on your own local system, restart
        the system yourself.
        If the error occurs in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
        "DBIF_REPO_SQL_ERROR" " "
        "SAPLRRMS" or "LRRMSU14"
        "RRMS_EXCEPTION_HANDLING"
    c)
    DBIF_RSQL_SQL_ERROR
    CX_SY_OPEN_SQL_DB
    How to correct the error
        Database error text........: "ORA-03135: connection lost contact"
        Internal call code.........: "[RSQL/RDUP/TADIR ]"
        Please check the entries in the system log (Transaction SM21).
        If the error occurs in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
        "DBIF_RSQL_SQL_ERROR" "CX_SY_OPEN_SQL_DB"
        "SAPLSDI0" or "LSDI0U01"
        "TRINT_TADIR_UPDATE"
    2. ODS activation (through process chains) in particular is failing in every process chain; if you run the activation manually, it is OK.
    All batch jobs get stuck when accessing table TADIR. Dump c) mentioned above has a very high impact on the system.
    Does anyone know how to resolve the above dumps? Please help me.
    If you want to analyze at a more detailed level, please let me know.
    Thanks in advance.
    Venkat

    Hi Mark,
    Please gothrough below details.
    Filename......: sqlnet.ora
    Created.......: created by SAP AG, R/3 Rel. >= 6.10 # Name..........:
    Date..........:
    @(#) $Id: //bc/640-2/src/ins/SAPINST/impl/tpls/ora/ind/SQLNET.ORA#4 $ ################
    AUTOMATIC_IPC = ON
    TRACE_LEVEL_CLIENT = OFF
    NAMES.DEFAULT_DOMAIN = WORLD
    NAME.DEFAULT_ZONE = WORLD
    SQLNET.EXPIRE_TIME = 0
    TCP.NODELAY=YES
    The TADIR update got stuck during this statement:
    Last SQL statement
    SELECT
    /*+
      FIRST_ROWS (1)
    */
    FROM
      "TADIR"
    WHERE
      "PGMID" = :A0 AND "OBJECT" = :A1 AND "OBJ_NAME" = :A2
    FOR UPDATE
    Database            Number      Time (usec)   Recs.
    Direct Read         1,017       172,733       232
    Sequential Read     1,815       471,729       10,454
    Insert              20          31,223        306
    Update              25          9,621         17
    Delete              25          115,827       291
    Sources             3,069,427 (Bytes)
    RSQL                2,265,032 (Bytes)
    Commit              0
    DB Procedure Calls  0           0
    Thanks and regards
    Venkat

  • How to find topics which are not assigned to a map ID

    Perhaps I am missing something very obvious in the Edit Map IDs window, but I can't find a way to easily identify topics which have not been given a Map ID.
    Initially, I selected all topics and used the auto map option for the project map file, but I am constantly adding topics and don't always remember to go straight in and assign an ID at the time of creation.
    Can somebody point me in the right direction?
    Thanks!
    Chloe

    Hi Chloe. You can't get that from the Map IDs dialog, but you can by running the Topic Properties report with the map ID option turned on.
    Read the RoboColum(n).

  • Where can I find apps which are compatible with original iphone?

    All the apps on the App Store, even the Twitter and Facebook apps, say they require iOS 4. Could someone tell me where I can find apps for the original iPhone?

    See the response by PC1234 at https://discussions.apple.com/message/19298842#19298842 for some techniques for doing this with a Google search. You would need to replace the 3.1 with the version of iOS that your original iPhone is using. Your version number can be found under Settings > General > About > Version on your iPhone.

  • How to find out the domains related to only Transaction Tables....?

    Hi All,
    I have to find all the domains which are related only to transaction tables; those domains should not be used in or related to master tables. Please let me know whether there is any way to find this out.
    Akshitha.

    Step 1:
    Select TABNAME from DD09L where TABART = 'APPL1' (transaction tables).
    Select DOMNAME from DD03L for each TABNAME obtained from the statement above.
    This gives you all the domains used in transaction tables.
    Step 2:
    Select TABNAME from DD09L where TABART = 'APPL0' (master tables).
    Select DOMNAME from DD03L for each TABNAME obtained from the statement above.
    This gives you all the domains used in master tables.
    Step 3:
    Display all the domains obtained in step 1 that are not in step 2.
    Hopefully this will fulfill your requirement.
    Please reward if useful..
    -Tushar
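    The three steps could be sketched in ABAP roughly as below (untested; it assumes the standard DD09L/DD03L dictionary tables, with TABART 'APPL1' = transaction data and 'APPL0' = master data, as in Tushar's answer):

    ```abap
    DATA: lt_trans  TYPE TABLE OF domname,
          lt_master TYPE TABLE OF domname,
          lv_dom    TYPE domname.

    " Step 1: domains used in transaction tables (TABART = 'APPL1')
    SELECT d3~domname INTO TABLE lt_trans
      FROM dd09l AS d9 INNER JOIN dd03l AS d3
           ON d3~tabname = d9~tabname
      WHERE d9~tabart = 'APPL1'
        AND d3~domname <> space.

    " Step 2: domains used in master tables (TABART = 'APPL0')
    SELECT d3~domname INTO TABLE lt_master
      FROM dd09l AS d9 INNER JOIN dd03l AS d3
           ON d3~tabname = d9~tabname
      WHERE d9~tabart = 'APPL0'
        AND d3~domname <> space.

    SORT: lt_trans, lt_master.
    DELETE ADJACENT DUPLICATES FROM lt_trans.
    DELETE ADJACENT DUPLICATES FROM lt_master.

    " Step 3: keep only the domains that never occur in a master table
    LOOP AT lt_trans INTO lv_dom.
      READ TABLE lt_master WITH KEY table_line = lv_dom
           TRANSPORTING NO FIELDS BINARY SEARCH.
      IF sy-subrc <> 0.
        WRITE: / lv_dom.
      ENDIF.
    ENDLOOP.
    ```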

  • Which are the most popular reports in SAP BI for Purchasing and Sales

    Hi Experts,
    I'm looking to work on one complete BI content flow.
    I want to work on one report from each module and activate the relevant objects.
    Can anyone please let me know which reports in Purchasing and Sales are frequently used in SAP BI.
    Thanks

    Hi,
    Please check table RSRREPDIR.
    Report names which are related to Purchasing look like 0PUR_C0N_Q00Y, where N and Y stand for numbers;
    the same pattern applies to Sales report names: 0SD_C0N_Q00Y.
    An example query technical name is 0PUR_C01_Q0021.
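    A quick way to pull those names out of the table (a sketch; it assumes RSRREPDIR's COMPID field holds the query technical name, as in standard BI 7.0):

    ```abap
    DATA lt_queries TYPE STANDARD TABLE OF rsrrepdir.

    " List all Purchasing and Sales queries by their technical names
    SELECT * FROM rsrrepdir INTO TABLE lt_queries
      WHERE compid LIKE '0PUR_C%'
         OR compid LIKE '0SD_C%'.
    ```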
    Thanks

  • How to find out which user exits have been activated for a T-Code

    Hi Everybody,
    --> I want to know which user exits have been activated or used for a particular t-code.
    --> I will give a small example to make the question clear.
    --> For my client, a lot of user exits and screen exits are used in the sales order screen. How do I find them? Is there any easy way to find out which user exits have been activated for a particular transaction?
    Regards,
    Madhan

    If you mean how to see a list of users with the T-Codes they are using,
    then you can just use T-Code SM04.
    It will give you all the information about those users.
    Or if you mean user exits,
    then please use this link:
    http://www.sap-img.com/abap/a-short-tutorial-on-user-exits.htm
    Thanks,
    Sugauli
    Edited by: Sugauli on Dec 11, 2008 9:20 PM
