Extract CRM Master Data - Ad Hoc Query

Hi,
Does anyone know of a tool in CRM like Ad Hoc Query in HR?
Basically I'm looking for an end-user tool that allows users to pull out master data. Effectively like a report, but they can pick and choose the data they want it to search on.
Any help with this would be greatly appreciated.
Points awarded for helpful responses.
Regards,
   Philip Johannesen

Hello,
The query /SAPQUERY/HR_ADM belongs to the client-independent (global) query area. You can find more information here: http://help.sap.com/saphelp_erp2005vp/helpdata/en/d2/cb3f6f455611d189710000e8322d00/content.htm
I think your copy of the InfoSet will also belong to the global area.
Koen

Similar Messages

  • Master Data Ad-hoc Query

    When we try to save changes to existing queries, we get a message stating that changes to the repository cannot be made. A few months ago I copied the InfoSet /SAPQUERY/HR_ADM so that I could remove fields not being used, etc. Ever since then, I cannot save changes to existing queries in either InfoSet. If anyone knows of something I may have changed by mistake to cause this, please let me know.
    Thanks,
    Karen

    Hello,
    The query /SAPQUERY/HR_ADM belongs to the client-independent (global) query area. You can find more information here: http://help.sap.com/saphelp_erp2005vp/helpdata/en/d2/cb3f6f455611d189710000e8322d00/content.htm
    I think your copy of the InfoSet will also belong to the global area.
    Koen

  • CRM Master Data Extraction

    Dear all,
    Can anyone point me to a how-to paper about CRM master data extraction? I need to know how to extract the Business Partner and its attributes (relationships, addresses and others), and also how to extract CRM marketing attributes for the Business Partner.
    Thanks for your help.
    Ricky

    Hello Ricky,
    I haven't found any conclusive document about loading Business Partners from CRM 4.0 to BW 3.5. If anyone out there knows of such documentation, I would really appreciate it.
    Here is what we did regarding BP relations, marketing attributes and addresses:
    1. Marketing attributes:
    This data type is difficult, because you have to build a special extractor for each marketing attribute group. And because marketing attributes cannot be transported, you cannot transport the extractor either, which poses a big problem in many projects. We therefore solved it by creating view extractors on the relevant CRM tables: AUSP, CABN, KSML, KLAH (a rough sketch of the underlying lookup follows after this list). I can give you some more details if this helps.
    2. BP relations:
    We built a view extractor for this too, because it was much easier to extract the data from BUT050 and BUT051 than to figure out how the Business Content extractors do their work (see the second sketch after this list). Using this extractor, we built some special ODS objects and InfoCubes for the relations we were interested in.
    There is Business Content for BP relations, but we stuck to the generic extractors because they were easier to verify.
    3. BP addresses:
    We use 0CRM_BPDEFADDR_ATTR to get the primary address.
    We use 0CRM_BPART_ATTR to get BP master data
    We use 0CRM_BPART_TEXT to get BP text.
    We have only about 300,000 BPs in CRM, so we do a full update every day.
    I am not sure whether these three DataSources are part of the current BW 3.5 Business Content, because we started with BW 3.0 and did not check for BC updates.
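    Below are two heavily hedged sketches of the kind of lookups behind the view extractors described in points 1 and 2. They are illustrations only: the class type '011' for marketing attributes, the chosen fields and the exact table field names are assumptions that should be verified in SE11/KLAH before building anything on them.
      " Sketch 1: marketing attribute values from the classification tables
      TYPES: BEGIN OF ty_mkt_attr,
               objek TYPE ausp-objek,   "business partner (classification object)
               atnam TYPE cabn-atnam,   "marketing attribute name
               atwrt TYPE ausp-atwrt,   "character value
               atflv TYPE ausp-atflv,   "numeric value
             END OF ty_mkt_attr.
      DATA lt_mkt_attr TYPE STANDARD TABLE OF ty_mkt_attr.

      SELECT a~objek c~atnam a~atwrt a~atflv
        FROM ausp AS a
        INNER JOIN cabn AS c ON c~atinn = a~atinn
        INTO CORRESPONDING FIELDS OF TABLE lt_mkt_attr
        WHERE a~klart = '011'.         "class type assumed for BP marketing attributes

      " Sketch 2: BP relationships from BUT050 (BUT051 holds the contact-person
      " details and can be joined in the same way)
      TYPES: BEGIN OF ty_rel,
               partner1  TYPE but050-partner1,
               partner2  TYPE but050-partner2,
               reltyp    TYPE but050-reltyp,     "relationship category
               date_from TYPE but050-date_from,
               date_to   TYPE but050-date_to,
             END OF ty_rel.
      DATA lt_rel TYPE STANDARD TABLE OF ty_rel.

      SELECT partner1 partner2 reltyp date_from date_to
        FROM but050
        INTO TABLE lt_rel
        WHERE date_to >= sy-datum.     "only relationships still valid today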
    Kind regards,
    Jürgen

  • Unable to extract Customer Master data from MDC

    Hello,
    We are creating an MDM-XI-ERP 2005 Customer scenario. We want to work with all business scenarios (not the customizing scenario).
    First I'm trying to extract customer master data from ERP 2005 using MDM_CLNT_EXTR.
    I have created a new variant with the following properties:
    Variant: CUSTOMER_TOTAL
    Description: Send all customer data to MDM
    Extraction Object: CUSTOMER_EXTRACT.
    Target System: PI7.
    However, the job display shows the following error: "Could not determine recipients for message type MDMRECEIPT. Job cancelled after system exception."
    We need to narrow down the problem. We have done the following ALE configuration:
    SALE transaction:
    In Basic Settings -> Logical Systems -> Define Logical System we have added the following logical systems:
                  EUS100
                  MDM: Master Data Management
                  PI7: Process Integration 7.0
    WE21:
    Transactional RFC -> PI7, IDOC record types SAP Release 4.x
    RFC destination: SAPSLDAPI
    BD64
    New Model -> PFMC
    Add message type: Model iview: PFMC
                                 Sender: EUS100
                                  Receiver: PI7
                                 Message Type: DEBMDM
    WE20 transaction shows:
    Partner profiles -> Partner Type LS (logical system)
                        PI7: Outbound parmtrs. DEBMDM, SYNCH.
    Any idea will be very helpful,
    Thanks in advance
    Marta.

    Sorry, I made a mistake in my previous message. ALE seems to be configured correctly. Let me describe my current situation properly:
    In WE02 I can see all the IDocs that I'm trying to send to XI. All of them have IDoc status 03 (Data passed to port OK) instead of 12 (Dispatch OK). So it seems to be a tRFC error.
    SM58 gives me more clues: "Basic type 'DEBMDM06' is unknown"
    Before that I had done the following steps in SAP ERP:
        1. WE21: I have created a PI7 port with 700PI70CLNT RFC.
        2. SM59: I have checked successfully 700PI70CLNT RFC connection.
        3. WE20: I have created a PI7 Partner Type LS with the following inbound and outbound parameters: DEBMDM and MDMRECEIPT.
        4. BD64: I have created a Distribution Model with DEBMDM and MDMRECEIPT message types.
    and at SAP PI:
         5.IDX: Port: SAPEUS. Client: 111. RFC Destination: PI7.
         6. Technical and business systems for the sender SAP system (System Landscape Directory)...
    - Maybe one of the previous steps is not correct...
    - Is the RFC destination PI7 correct in step 5?
    - Is it also necessary to configure more things in the PI system, such as RFC destinations or a transactional port?
    - I have read that it is necessary to check the option "Transfer IDoc immediately" in WE20. How?
    Any idea will be very helpful.
    Thank you!
    Best regards,
    Marta.

  • Master data values in Query

    Hello,
    I want to restrict Division to a certain value in a query, but I cannot see one of those values in the list of available values in the restriction window.
    I checked the master data records for Division, and they contain this value; the InfoProvider may not contain it.
    In the 0DIVISION InfoObject, the settings on the Business Explorer tab are:
    Query Def. Filter Value Selection: M Values in master data table.
    Query Execution Filter Val. Selectn: M Values in master data table.
    Why can't I see the value for restriction in the query?
    thanks

    Hi Aby,
    Have you tried clicking the "Display Other Values" button just before the find text box in the restriction window (the one with the small yellow arrow)? If not, click that button and enter a restriction rule to cut down the number of entries displayed. This should show the value you want.
    In addition to that, uncheck "Only values from InfoProvider".
    regards

  • Why do we call Master Data CRM master data

    Hi
    this is shankar
    can you please clarify my doubts
    1. Why do we call master data "CRM master data"?
    2. What is the difference between an opportunity and opportunity management?
    3. What is the relation between an opportunity and a subordinate opportunity?
    Thanks in advance.
    Regards
    Shankar

    Hi Shankar
    1) We call it CRM master data, to differentiate between R/3 master data and CRM master data. This is very useful when discussing master data that is exchanged between the two systems.
    2) An opportunity is a transaction type (sales document type), whereas opportunity management refers to the general business process of working with opportunities (to optimise the effort of the sales and marketing people).
    3) An opportunity can have several subordinate opportunities, while a subordinate opportunity can only have one "master" opportunity. The purpose could be this: your sales rep registers an opportunity "New IT systems". This opportunity could comprise two subordinate opportunities, such as "Hardware" and "Software", each representing a value.
    This is used to break up large opportunities into smaller instances. It can also be relevant if different sales people are dealing with the customer for different products.
    Hope this helps. Please reward points if useful.
    Regards,
    Claus Møldrup

  • Extraction of master data from R3 into BW.

    Hello.
    This is my new SAP BI assignment:
    I'm going to use standard SAP InfoObjects (from Business Content) to build a new InfoCube (Logistics - MM/SD) with, for example, standard characteristics like 0MATERIAL, 0MATL_GROUP, 0SHIP_TO, 0SOLD_TO, 0SALESORG, 0PROD_HIER, 0CUST_GROUP, 0COMP_CODE... (Note: NO generic extraction involved.)
    My question is: do I need an extra process chain (transaction RSPC) to bring all these characteristics and their associated master data (texts, attributes and hierarchies) from the R/3 source system into BW, or will these data come into SAP BW with a standard extractor? In other words, is any extra work needed at all to bring the master data from R/3 into BW?
    Another question: is there any way to know which characteristics need an extra process chain (transaction RSPC) to bring their associated master data (texts, attributes and hierarchies) from the R/3 source system into BW?
    Regards
    ASantos

    Hi ,
    1. In the SAP R/3 system, go to transaction RSA6, find the application component, then select the DataSource, view its structure and select all the required fields.
    2. As you said, no generic DataSource is involved; otherwise you would create one using transaction SBIW. Generic DataSources offer several creation options, and a view is the simplest: you can create a view with all required fields on top of the relevant source tables, as per your requirement.
    3. Now go to transaction RSA3, the extractor checker, enter the name of your extractor and check this DataSource.
    4. Go to the BW system, transaction RSA1, find the source system and replicate the metadata into the corresponding application component. After replication only the structure of the DataSource is copied, not the actual data.
    5. Create an InfoPackage so that the actual data is loaded into the PSA table.
    6. Create a transformation between the DataSource and the corresponding data target (InfoProvider: InfoObject, cube or DSO). This is done by mapping the source fields to the target fields using the appropriate rule type. If there are update rules in the R/3 system, you may need to write some field-level routines, depending on your requirement (a small sketch of such a routine follows after this list).
    7. Create a DTP between the source and the target and execute it; it will load the actual data into the target.
    8. Further transformations and DTPs can then be created and executed according to the requirements, for example from a DSO to a cube. The data is then available in the BW data targets, on which queries can be built for reporting.
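    The sketch below illustrates the kind of field-level routine mentioned in step 6. It is hedged and illustrative only: it assumes the target field is 0MATL_GROUP, that 0MATL_GROUP is an active attribute of 0MATERIAL in your BW system (so the table /BI0/PMATERIAL has a MATL_GROUP column), and that the generated parameter names of the BW 7.x transformation routine editor (SOURCE_FIELDS, RESULT) apply; in 3.x update rules the equivalent of SOURCE_FIELDS is COMM_STRUCTURE.
      " If the material group arrives empty from the source, read it from
      " the 0MATERIAL master data table in BW; otherwise pass it through.
      IF source_fields-matl_group IS INITIAL.
        SELECT SINGLE matl_group
          FROM /bi0/pmaterial
          INTO result
          WHERE material = source_fields-material
            AND objvers  = 'A'.              "active master data version
        IF sy-subrc <> 0.
          CLEAR result.                      "no master record found
        ENDIF.
      ELSE.
        result = source_fields-matl_group.
      ENDIF.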
    I don't think you need a process chain for this; it is a manual process, but the loading (DTPs and so on) can be automated by creating a process chain.
    For more guidance you can refer to this document:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/40394d9b-93b9-2d10-9f9a-f13118a4776d?quicklink=index&overridelayout=true
    Hope this will be helpful.
    Regards,
    Jaya

  • Force read/refresh of master data attributes on query

    Hi there,
    We're having trouble with an input-ready query that changes an attribute value (a key figure attribute) of a characteristic. That part works fine, and we can save the changed data to the InfoProvider via DTP. The problem is that after the data is saved we need to refresh the query, because we have both values on screen (the original value as a characteristic attribute in the rows, and the new value as the input-ready key figure), and after saving we would like to see that both values are the same.
    Is there any way to force the query not to read from the cache, given that we are changing master data attributes? I read something about the IF_RSMD_RS_ACCESS class that can be implemented for master data access to force this there, but it sounds really hard, so if this is the way, could you give us some help?
    I hope I am making myself clear in this explanation...
    Thanks in advance,
    Regards
    Carlos

    Dear All,
    I recently tried changing master data attributes through BPS and it didn't work out. The primary reason was that some of the attributes I needed to change were not present in the transactional or planning cubes, and characteristics that are not part of the cube on which the planning area is based cannot be changed this way.
    This is my understanding; please correct me if I am wrong.
    Besides that, I was also wondering whether we can do the same through the portal, i.e. retrieve the master data InfoObject (based on the value selected for that InfoObject by the user) and its attributes in the portal, edit them and save them back, so that the updated values go back to the BW master data InfoObject database tables and update the values there.
    E.g. I have a Natural Account master data InfoObject in BW with the attributes Functional Area and Expense Center. Based on the user's selection of a value for Natural Account, say 01110, the portal should display the corresponding attribute values of Functional Area and Expense Center; let's take these values as 10 and 20 respectively. What I want to do now is change these attribute values to 30 and 40 and save them back, so that natural account 01110 ends up with the new attribute values 30 and 40 for Functional Area and Expense Center.
    Is this possible through the portal and BW?
    Any idea on this would be appreciated.
    Regards,
    Ankit

  • Japense language getting displayed in master data not in query

    Hi all,
    I am trying to load the asset master from ECC into BW; both systems are Unicode. The master data loads perfectly, with Japanese in the master data:
       asset   language   description
    1. 300001   en           加硫
    2. 300002   ja           加硫
    I can see that some data is loaded with EN and some with JA; both are loaded into the master data perfectly.
    But in the query, if I want to display 300002, it is not displayed.
    Can anyone help me with this?

    Hi,
    Which language are you selecting when logging on to the Query Designer?
    Select the language as JA and then check if you can get the required result.
    Regards,
    Durgesh.

  • Question about Extraction of master data from R/3 into BI 3.5

    Hi there,
    I want to extract master data, let's say customers or materials, from an R/3 system into BI 3.5.
    Now I know that SAP delivers ready-to-go DataSources for certain SAP applications that can be used in this type of situation. These DataSources map some R/3 table fields, for example, to the corresponding Business Content objects in BI.
    My question is:
    Is there a ready-to-go DataSource for the master data I want to extract? Let's say customer data, or "business partners", which I think is the more appropriate term.
    When I use transaction SBIW to activate the DataSource I need for the extraction, I have some trouble finding the right one. But for such simple master data I assume there must be a standard DataSource that I can use without bothering to change anything.
    Or am I thinking the wrong way and it is easier than I think?
    Cheers,
    Stefan

    Hi Stefan,
    Some DataSources are 0MATERIAL_ATTR - 0ARTICLE_ATTR - 0MAT_PLANT_ATTR - 0ART_PLANT_ATTR - 0MAT_SALES_ATTR...
    Consider that in some cases (Retail, for example) you can find master data with 50 million records; in that case you absolutely need the delta upload method offered by these extractors.
    And above all, you can always enhance these extractors.
    You can also build your own custom extractors on MARA, MARC, MBEW, ..., but you will not have delta capabilities or SAP support for maintenance (a small illustrative sketch follows below).
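    Purely as an illustration of that last point, this is roughly what a simple custom full-upload read on MARA looks like; MATNR, MTART, MATKL and MEINS are standard MARA fields, but the structure and the idea of wrapping it in a Z view or generic DataSource are assumptions for the sketch. Every run reads everything, which is exactly why you lose the delta capability of 0MATERIAL_ATTR.
      TYPES: BEGIN OF ty_mat,
               matnr TYPE mara-matnr,   "material number
               mtart TYPE mara-mtart,   "material type
               matkl TYPE mara-matkl,   "material group
               meins TYPE mara-meins,   "base unit of measure
             END OF ty_mat.
      DATA lt_material TYPE STANDARD TABLE OF ty_mat.

      " Full read of all materials - no delta mechanism
      SELECT matnr mtart matkl meins
        FROM mara
        INTO TABLE lt_material.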
    Ciao.
    Riccardo.

  • How to extract HRM master data from R/3 into LDIF file?

    Recently I have been asked to provide an extract from our R/3 system
    with some Human Resource master data. The extract has to be in the LDIF
    format (LDAP data interchange format). It is needed to import into a
    DirX metahub solution from Siemens.
    How can this be done most easily?
    (Does SAP provide tools? Can XI do this?) Or do we have to write a
    custom ABAP program to do this?
    Thanks in advance
    Kind regards
    Alex Veen

    Hi Satish,
    As per the SAP standard, the best way is to delete all the data from the cube and then load the data from the setup tables again, since you have enhanced the DataSource.
    After the DataSource enhancement, loading normally is supported, but you will not get any historical data for that field.
    The best way is to take downtime from the users; normally we do this at weekends or during non-business hours.
    Then fill the setup tables; if the data volume is huge you can adopt a parallel mechanism, for example:
    1. Fill the setup tables year by year as a background job.
    2. Fill the setup tables year by year, with posting periods from Jan 1st to Dec 31st of each year, as a background job.
    This makes the setup table fill easier and faster. After filling the setup tables you can unlock all users, as there is no longer any worry about postings.
    After that you can load all the data into BI, first into the PSA and then into the cube.
    Regards,
    Ravi Kanth.

  • Open hub Services - How to extract the master data related to a object ?

    Hi Gurus,
    I am implementing Open Hub services for our project; it is on BW 3.5. I have the list of required fields with which I am creating an InfoSpoke. Now I am stuck on some InfoObjects which have master data associated with them.
    Example: Business Partner (BP) data. When I map the 0BP InfoObjects in the InfoSpoke, it extracts the BP ID (e.g. CT065316, CT068638) into the flat file, but I also want the BP name, address and telephone number, which come from the master data table. However, I am only able to map the 0BP InfoObjects that are part of the ODS/cube.
    Can anyone tell me how to get the associated master data extracted into the flat file along with the InfoObject?
    Answers will be highly appreciated.
    Regards,
    Kironmoy Banerjee

    Hi Kironmoy,
    Please follow the procedure below to create a transformation. This is applicable for BW 3.5 as well.
    - Enter your InfoSpoke in edit mode.
    - On the Transformation tab, set the indicator for "InfoSpoke with transformation using BAdI" so that the InfoSpoke is activated.
    - This takes you to the Add-In implementation / BAdI Builder.
    - Enter the short text/description for the implementation. The implementation name is always the same as the technical name of the InfoSpoke.
    - The implementation of the BAdI is always filter-dependent.
    - On the properties tab of the InfoSpoke, enter your InfoSpoke under the filter specifications.
    If you do not specify an InfoSpoke under the filter specifications, the implementation is valid for all InfoSpokes; this means it is called for every InfoSpoke during extraction.
    - Activate your class.
    - On the Interface tab page, double-click the TRANSFORM method and you will arrive in the Class Builder.
    - Here you can enter the code.
    - To look up the master data you have to write code similar to the example below. This is just an example for looking up the material master.
    IF flt_val = 'Your InfoSpoke'.            "filter value = InfoSpoke name
      t_data_in[] = i_t_data_in[].

      " Look up the standard cost (a custom attribute here) from the
      " material master data table for all records in this data package.
      SELECT material zstd_cost FROM /bi0/pmaterial
        INTO TABLE t_return
        FOR ALL ENTRIES IN t_data_in
        WHERE material = t_data_in-material.

      " ... continue with your code: merge the values from t_return into
      " the records and append the result to the output table e_t_data_out.
    ENDIF.
    - Activate your method. Return to the BAdI builder. Return to your InfoSpoke.
    I hope this helps.
    Thanks.

  • Characteristic "Master Data" setting in Query definition not working?

    Hi,
    I am having a problem with a query definition (BI 7.0).
    The query comprises the following definition, which relates to displaying milestone dates for the respective projects in a time series, so that the dates are populated in the correct column of the time series:
    Filter
    Project Definition = Fixed Single Project Definition
    Fiscal Year Variant = K4
    Project Profit Centre = Hierarchy Node
    Project Plant = Fixed Single Value
    Free Chars
    Network
    Network Activity
    Activity Element
    Rows
    Project Plant
    Project Profit Centre (With hierarchy)
    Project Definition
    Milestone Type  (Setting in characteristic to pick up Master Data)
    CSR Relevant    (Setting in characteristic to pick up Master Data)
          Structure
               Actual
               Scheduled
        (Actual:  Value type = Actual,  Origin = Manual ,Event = Start)
        (Scheduled:  Value type = Actual,  Origin = Scheduled ,Event = Start)
    Columns
    Key Figure structure of 12 key figures in a time series
    Selection
    Key Figure = Date
    Fiscal Year/Period = Current Fiscal Year / Period (SAP EXIT) (With Offset 1 to 12)
    FYI
    When I run the above, it comes back correctly. However, when I add "MILESTONE" (characteristic setting: Posted Values) to the query definition, immediately after "Milestone Type" and before "CSR Relevant", it does not work; it just hangs for a considerable time.
    I am not using "Master Data" as the "Access Type for Result Values" for the additional characteristic "MILESTONE", just "Posted Values".
    I was expecting that it would not show all milestone types as per master data, but just the milestones that are posted for this result.
    I would appreciate a solution to this issue and an explanation of why it hangs when I add the "MILESTONE" characteristic to the definition as described above.
    Thanks in advance..
    Stevo

    Hello there,
    The query is not hanging while you put MILESTONE into the rows in the query definition (i.e. while creating and saving the query), right?
    It gets stuck while executing the query, doesn't it?
    That's because you have too many records to be disaggregated by the MILESTONE characteristic, and this is independent of the "Posted Values" setting in the characteristic settings.
    Putting the MILESTONE characteristic in the free characteristics will allow you to run the query, but disaggregating it (adding it to the rows) will be too much information to read from your DataProvider (this is just a guess).
    If it gets stuck during query execution while MILESTONE is in the rows, try this, for example:
    Add MILESTONE to the free characteristics and execute the query. Filter the MILESTONE characteristic while it is still in the free characteristics and choose a value that you are pretty sure has the fewest records in the DataProvider (for example the value #). Then drill MILESTONE down into the rows. You'll see that it can be executed without any problem.
    Diogo.

  • Incorrect result between maintain master data and bex query, how can i fix?

    Hi ALL,
    I get messages from the users that there are incorrect results between SAP R/3 and a report in BW. I checked the monitor and saw that there was a job for 0CUSTOMER_ATTRIBUTE that finished correctly, but the processing went only as far as the PSA. I immediately started a full update from the PSA into the data targets and it finished correctly. Afterwards, when I check the content of 0CUSTOMER (right-click, Maintain Master Data), I get the correct attribute values matching the data in SAP R/3, but when I execute a BEx query on this master data it does not return the same attribute data.
    Can somebody help, please?
    Bilal

    hi,
    For any master data attributes loaded, you have to run an "Attribute Change Run" afterwards; execute it for the master data of 0CUSTOMER.
    It is available in RSA1 -> Tools (top menu) -> Apply Hierarchy/Attribute Change Run.
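    As a hedged illustration (assuming 0CUSTOMER master data is active in your system, so the table /BI0/PCUSTOMER exists): queries read only the active version of master data (OBJVERS = 'A'), while freshly loaded attribute records sit in the modified version ('M') until the change run activates them. That is why Maintain Master Data can already show values that the query does not return.
      DATA: lv_active   TYPE i,
            lv_modified TYPE i.

      " Count active vs. not-yet-activated 0CUSTOMER master data records
      SELECT COUNT( * ) FROM /bi0/pcustomer INTO lv_active
        WHERE objvers = 'A'.
      SELECT COUNT( * ) FROM /bi0/pcustomer INTO lv_modified
        WHERE objvers = 'M'.

      WRITE: / 'Active records:  ', lv_active,
             / 'Modified records:', lv_modified.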
    hope it helps,
    regards,
    Parth.

  • How to extract Hierarchical Master data into flatfile

    Hello Experts,
    I have a requirement to extract the master data (texts, attributes, hierarchies) of all InfoObjects pertaining to a given InfoCube (say Box A).
    I am aware that I can manually dump the data or use an InfoSpoke to get the data into a flat file without hassle in the case of texts and attributes.
    (1) But I doubt it for hierarchical data. In the case of hierarchical data I have to create ABAP code to extract the master data.
    Below is the link I found when searching SDN:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/0403a990-0201-0010-38b3-e1fc442848cb?quicklink=index&overridelayout=true
    But I didn't understand the mechanism behind the extraction.
    Can anyone please give me a detailed explanation of how to do it, or point me to some other reference documents?
    Note: I am going to use that master data and import it again into a different box (say Box B), and there is no connection between the two systems (Box A and Box B).
    Thanks.

    Hi,
    You can use the program Z_SAP_HIERARCHY_DOWNLOAD to download the hierarchy as a flat file, and then upload this flat file into the other system.
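    For orientation only, here is a heavily hedged sketch of what such a download essentially does: read the active hierarchy node table of an InfoObject (0COSTCENTER and its table /BI0/HCOSTCENTER are used as an example; adjust to your InfoObject) and dump it to a tab-separated file with GUI_DOWNLOAD. The field list and the file path are assumptions; check the H table in SE11 first.
      TYPES: BEGIN OF ty_node,
               hieid    TYPE /bi0/hcostcenter-hieid,     "hierarchy ID
               nodeid   TYPE /bi0/hcostcenter-nodeid,    "node number
               iobjnm   TYPE /bi0/hcostcenter-iobjnm,    "InfoObject of the node
               nodename TYPE /bi0/hcostcenter-nodename,  "node value
               tlevel   TYPE /bi0/hcostcenter-tlevel,    "hierarchy level
               parentid TYPE /bi0/hcostcenter-parentid,  "parent node number
             END OF ty_node.
      DATA lt_nodes TYPE STANDARD TABLE OF ty_node.

      SELECT hieid nodeid iobjnm nodename tlevel parentid
        FROM /bi0/hcostcenter
        INTO TABLE lt_nodes
        WHERE objvers = 'A'.                             "active version only

      CALL FUNCTION 'GUI_DOWNLOAD'
        EXPORTING
          filename = 'C:\temp\costcenter_hierarchy.txt'  "hypothetical target path
          filetype = 'DAT'                               "tab-separated
        TABLES
          data_tab = lt_nodes
        EXCEPTIONS
          OTHERS   = 1.
      IF sy-subrc <> 0.
        WRITE: / 'Download failed.'.
      ENDIF.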
    Regards
    Arnab
