Data reconciliation using MDM

Hi, can anyone tell me whether MDM plays any role in data reconciliation across the various R/3 modules?

Hi Gopesh,
maybe you want to read the thread right before this one... there I posted an answer that covers your topic.
This part might be relevant to you:
Concerning 2: Actually, with MDM 3.0 you can consolidate any data. The standard delivers MD objects such as Product or Business Partner, but if you want to harmonize e.g. FI/CO data, you can first create an MD object type in MDM using the MD Framework - this framework allows you to create any data type you like (see the e-learning sessions on the MDM page in SDN - https://www.sdn.sap.com/sdn/developerareas/mdm.sdn?node=linkDnode8 ).
Regards, Matthias

Similar Messages

  • Data reconciliation using OBIEE/Answers

    Hi
    I am new to OBIEE. I am trying to use OBIEE and Answers to reconcile accounting entries with transaction data. This leads to two facts: transaction lines and accounting lines. I could also model the transaction lines as a dimension, but in that case can I compare the transaction amounts with the accounting amounts, something like
    Transaction Total – Accounting Total = Difference?
    Thanks a lot in advance.

    Hi
    I am not yet done with the complete data model, but I can explain. We have customer, transaction type, date, and currency as dimensions. These dimensions are linked to the transaction fact, which holds the transaction amounts such as the invoice amount and the open balance for each transaction. We also have a child table of the transaction table called the accounting distribution table. I can build two different stars, one with the transactions as the fact and another with the accounting distributions as the fact. In that case, do we join both facts in the presentation layer to compare the transaction amount with the accounting amount? Another question: can we have both facts in the same schema, or is there a better approach?
    Thanks a lot in advance
    raj

  • Key benefits of using MDM

    Hi,
        I am trying to build a business case for why a business should go for an MDM implementation. Can anyone provide insight into the key business benefits of implementing MDM?
    Thanks

    Hi GS,
    In addition to the above posts:
    An MDM implementation is basically about managing and maintaining the master data of a firm/organization.
    Disparate and bad data lies throughout your IT landscape; to get a "single version of truth" or a "360-degree view of your data" you need to implement MDM.
    Many MDM technologies are on the market, but as far as SAP MDM is concerned, it is the most versatile platform of them all,
    as it gives you:
    Master Data Integration:
    Use MDM remote system extractor to extract customizing data in the initial setup of your MDM system
    Use MDM remote system extractor to automatically extract master data
    Use MDM import mechanisms to load master data from various sources
    Use MDM business content to integrate data
    Use MDM Syndicator to distribute master data to various targets
    Use MDM APIs and Web services to integrate data
    Some of the operations:
    Data Modeling
    Role-Based Data Access
    Search
    Workflow-enabled Data Creation/Maintenance
    Data Publishing
    Master Data Quality
    Use validations and assignments to check the accuracy and consistency of your master data
    Create more complete and meaningful master data using the MDM Enrichment Architecture
    Create non-redundant and consistent master data using matching and merging capabilities
    I hope this makes things clearer as far as an MDM implementation is concerned.
    Hope it helps.
    Deep

  • Creating Master Data Centrally Using Guided Procedures

    Is it possible to create master data centrally using GP, for example material master data, without using MDM? Does a standard out-of-the-box delivered scenario exist without MDM? I did see the article "Create Master Data Centrally Using Guided Procedures", but it uses MDM.
    Any help appreciated.

    Wow, that is the quickest reply I have ever got on SDN. Thanks for the excellent link, Jitesh. However, where will I get the deployable files that are selected in the Import GP/CAF contents step of the demo? Those are the Web Dynpro component files - one is an SCA and the other an SDA. Where will I get them?
    Kindly post the link to those files if you have them, or else send them across.
    Regards,
    Ameya
    Edited by: Ameya Pimpalgaonkar on Sep 27, 2008 11:53 AM

  • Help Required -- Can we use SQL Query to READ data from SAP MDM Tables

    Hi All,
    Please help.........
    Can we use an SQL query to READ (no creation/update/deletion, just read) data from SAP MDM tables directly, without using the MDM Syndicator?
    Or is direct SQL access to SAP MDM tables not possible, so that data can only be exported through the MDM Syndicator?
    Thanks in Advance
    Regards

    All the tables you create in the repository come under A2i_CM_Tables, in the database named after your repository; so the table names are fields of the table A2i_CM_Tables. I tried it but could not make it work.
    I don't think it is possible to extract all the fields of the tables and their values using a SELECT query; maybe a pure SQL person could manage it, maybe not.
    In any case, there is no dependency between data extraction and the Syndicator. Data is viewed in the Data Manager, and you can also save data to a file from the Data Manager.
    BR,
    Alok
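    For completeness: if someone still wants to peek at the repository database directly, the read itself is ordinary SQL over JDBC against whatever database hosts the repository. The sketch below is a hedged illustration only - direct access to the MDM database is unsupported, the physical schema is undocumented and release-dependent, and the host, port, database name and credentials are made up. For supported extraction, use the Syndicator or the MDM APIs.

    // Read-only, UNSUPPORTED illustration: the MDM repository's physical schema is internal
    // and release-dependent. Host, port, database name and credentials below are assumptions.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;
    import java.sql.Statement;

    public class MdmCatalogPeek {
        public static void main(String[] args) throws Exception {
            // Assumes the repository lives on SQL Server and a read-only DB user exists.
            String url = "jdbc:sqlserver://mdmdbhost:1433;databaseName=MY_REPOSITORY";
            String sql = "SELECT * FROM A2i_CM_Tables";   // catalog table mentioned above
            try (Connection con = DriverManager.getConnection(url, "readonly_user", "secret");
                 Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery(sql)) {
                ResultSetMetaData md = rs.getMetaData();
                while (rs.next()) {
                    StringBuilder row = new StringBuilder();
                    for (int i = 1; i <= md.getColumnCount(); i++) {
                        row.append(md.getColumnName(i)).append('=').append(rs.getString(i)).append("  ");
                    }
                    System.out.println(row);
                }
            }
        }
    }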

  • Identifying duplicate master data records using the MDM Import Manager

    hi all
    I read the topic "How to identify duplicate master data records using the MDM Import Manager".
    I tried to create import maps and to set rules, but when I import, it creates a new vendor record for each rule with the rest of the fields blank.
    When I import the vendor data, all three fields, i.e. Match Rate, Match Type and Match Group, are blank.
    My Question is :
    I am getting vendor data from SAP R/3.
    In which source (the lookup XML file or the data XML file) do I have to include the above three fields, and how will all the rules be reflected in the repository?

    Hi Sheetal
    When you import any data (vendor master), please follow these steps:
    1. First of all apply the map to the source data
    2. In the Match Records tab there are three possibilities:
       a. [Remote Key]: checks the current source record against the
          repository using all the fields - this is the default.
       b. Remove [Remote Key] by double-clicking it and choose any
          single field such as Vendor Number or Name - the current
          record is then matched against the repository based on
          that field.
       c. Instead of a single field you can also choose a combination
          of fields.
    3. Based on the match results, the match class is set
       automatically:
       a. None
       b. Single
       c. Multiple
    4. Then the match type (these rules are also sketched as code after this reply):
        a.Exact-All the individual value matches are Equal.
        b.Partial-At least one value match is Equal and at least one Undefined; no value matches are Not Equal.
        c.Conflict-At least one value match is Equal and at least one value match is Not Equal.
    5. Then check the import status and execute the import.
    Hope this helps you.
    cheers
    Alexander
    Note: Please don't forget to reward points.
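    The match-type rules in step 4 amount to a small classification over the per-field comparison results. Purely as an illustration of those rules (this is not MDM Import Manager code), a sketch in Java might look like this:

    // Illustration of the Import Manager match-type rules described in step 4 above.
    // Each per-field comparison is EQUAL, NOT_EQUAL, or UNDEFINED (e.g. a missing value).
    enum ValueMatch { EQUAL, NOT_EQUAL, UNDEFINED }

    class MatchTypeRule {
        static String classify(ValueMatch[] fieldMatches) {
            boolean anyEqual = false, anyNotEqual = false, anyUndefined = false;
            for (ValueMatch m : fieldMatches) {
                if (m == ValueMatch.EQUAL) anyEqual = true;
                else if (m == ValueMatch.NOT_EQUAL) anyNotEqual = true;
                else anyUndefined = true;
            }
            if (anyEqual && anyNotEqual)  return "Conflict"; // at least one Equal and one Not Equal
            if (anyEqual && anyUndefined) return "Partial";  // Equal + Undefined, no Not Equal
            if (anyEqual)                 return "Exact";    // every value match is Equal
            return "(no Equal value match)";                 // outside the three types above
        }

        public static void main(String[] args) {
            ValueMatch[] comparison = { ValueMatch.EQUAL, ValueMatch.UNDEFINED };
            System.out.println(classify(comparison)); // prints "Partial"
        }
    }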

  • MDM Material Master data creation using Guided Procedures

    Well, I searched a lot on the forums but didn't find anything.
    My requirement is to use the MDM Java APIs to create material master data using Guided Procedures. I found one nice document, "[How to create master data centrally using guided procedure|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/92811997-0d01-0010-9584-f7d535177831]". This document discusses business partner creation and also has a link to a downloadable file. But when I click on this download link now, it says the download has expired!
    Can anyone please post a valid link to these deployable files?
    Or could anyone post some useful material or links on the above requirement?
    Kindly reply, any help will be highly appreciated.
    Regards,
    Ameya
    Edited by: Ameya Pimpalgaonkar on Sep 27, 2008 11:29 AM

    Wow, that is the quickest reply I have ever got on SDN. Thanks for the excellent link, Jitesh. However, where will I get the deployable files that are selected in the Import GP/CAF contents step of the demo? Those are the Web Dynpro component files - one is an SCA and the other an SDA. Where will I get them?
    Kindly post the link to those files if you have them, or else send them across.
    Regards,
    Ameya
    Edited by: Ameya Pimpalgaonkar on Sep 27, 2008 11:53 AM

  • Read data from MDM For Lookup and Flat table using MDM ABAP API

    Hi,
    I have a requirement to read data from MDM, from a flat and a lookup table, using the MDM ABAP API. My design is like this:
    I have one ITEMS main table in MDM and inside it one lookup flat table ITEM_TYPE; my requirement is to read the item number and its related item type
    from ABAP.
    Please help if anybody has any idea.
    Regards,
    Shyam

    Hi guys,
    I found the solution myself. Here it is; I hope it helps others:
    Retrieve data from MDM using the MDM ABAP API.
    Step 1: Create a structure in SAP with the same name as the MDM field code of the MDM main table.
    Step 2: Create another structure in SAP containing all lookup fields of MDM; the field names in ECC must be the same as the MDM field codes.
    Step 3: Create a structure in SAP for each individual lookup field (single field only), with the same name as the MDM field code.
    Step 4: The report itself (placeholders in angle brackets are to be replaced with your own table/field names).
    DATA: IT_QUERY            TYPE STANDARD TABLE OF MDM_QUERY,  "MDM_QUERY_TABLE
          WA_QUERY            TYPE  MDM_QUERY,
          WA_CDT_TEXT         TYPE  MDM_CDT_TEXT,
          IT_RESULT_SET_KEY   TYPE  MDM_SEARCH_RESULT_TABLE,
          WA_RESULT_SET_KEY   TYPE  MDM_SEARCH_RESULT,
          WA_STRING           TYPE  STRING.
    DATA: <Internal table 1>  TYPE STANDARD TABLE OF <SAP Structure having all Lookup fields>,
          <Internal table 2>  TYPE STANDARD TABLE OF <SAP Structure with one Lookup field>,
          <Work area>         LIKE LINE OF <Internal table 2>.
    * Further declarations were omitted in the original post; roughly (type names may differ
    * by release): LR_API TYPE REF TO CL_MDM_GENERIC_API, V_LOG_OBJECT_NAME and WA_LANGUAGE
    * (logon language structure), WA_KEYS (table of record IDs), IT_RESULT_SET_DEFINITION /
    * WA_RESULT_SET_DEFINITION (result-set field list) and IT_RESULT_SET / WA_RESULT_SET.
    *PASS LOGICAL OBJECT NAME.
    V_LOG_OBJECT_NAME = 'Logical object name defined in Customization'.
    * Define logon language, country & region for the server.
    WA_LANGUAGE-LANGUAGE = 'eng'.
    WA_LANGUAGE-COUNTRY = 'US'.
    WA_LANGUAGE-REGION = 'USA'.
    TRY.
        CREATE OBJECT LR_API
          EXPORTING
            IV_LOG_OBJECT_NAME = V_LOG_OBJECT_NAME.
    ENDTRY.
    * Connect to the repository, applying the logon language info.
    CALL METHOD LR_API->MO_ACCESSOR->CONNECT
      EXPORTING
        IS_REPOSITORY_LANGUAGE = WA_LANGUAGE.
    *NOW PASS ITEM NO AND GET KEY FROM MDM.
    CLEAR WA_QUERY.
    WA_QUERY-PARAMETER_CODE  = <MDM FIELD CODE>. "Field code
    WA_QUERY-OPERATOR        = 'EQ'. "Contains
    WA_QUERY-DIMENSION_TYPE  = 1. "Field search
    WA_QUERY-CONSTRAINT_TYPE = 8. "Text search
    WA_STRING                = <Field Value>.
    GET REFERENCE OF WA_STRING INTO WA_QUERY-VALUE_LOW.
    APPEND WA_QUERY TO IT_QUERY.
    CLEAR WA_QUERY.
    *PASS ITEM NUMBER AND GET RELATED KEY FROM MDM.
    TRY.
        CALL METHOD LR_API->MO_CORE_SERVICE->QUERY
          EXPORTING
            IV_OBJECT_TYPE_CODE = <MDM Main Table>
            IT_QUERY            = IT_QUERY
          IMPORTING
            ET_RESULT_SET       = IT_RESULT_SET_KEY.
      CATCH CX_MDM_COMMUNICATION_FAILURE .
      CATCH CX_MDM_KERNEL .
      CATCH CX_MDM_NOT_SUPPORTED .
      CATCH CX_MDM_USAGE_ERROR .
      CATCH CX_MDM_PROVIDER .
      CATCH CX_MDM_SERVER_RC_CODE .
    ENDTRY.
    * Pass the record IDs into the keys table.
    LOOP AT IT_RESULT_SET_KEY INTO WA_RESULT_SET_KEY.
      WA_KEYS = WA_RESULT_SET_KEY-RECORD_IDS.
    ENDLOOP.
    WA_RESULT_SET_DEFINITION-FIELD_NAME = <Look field name>.
    APPEND WA_RESULT_SET_DEFINITION TO IT_RESULT_SET_DEFINITION.
    CALL METHOD LR_API->MO_CORE_SERVICE->RETRIEVE
      EXPORTING
        IV_OBJECT_TYPE_CODE      = <MDM Main Table>
        IT_RESULT_SET_DEFINITION = IT_RESULT_SET_DEFINITION
        IT_KEYS                  = WA_KEYS
      IMPORTING
        ET_RESULT_SET            = IT_RESULT_SET.
    LOOP AT IT_RESULT_SET INTO
            WA_RESULT_SET.
    * Pass the keys of the main table to get the structure with the flat/lookup field values.
      TRY.
          CALL METHOD LR_API->MO_CORE_SERVICE->RETRIEVE_SIMPLE
            EXPORTING
              IV_OBJECT_TYPE_CODE = <MDM Main Table>
              IT_KEYS             = WA_KEYS
            IMPORTING
              ET_DDIC_STRUCTURE   = <SAP Structure having all Lookup fields of MDM>.
      ENDTRY.
      LOOP AT <SAP Structure having all Lookup fields of MDM> INTO <Work area>.
        CLEAR WA_KEYS.
        APPEND <Work area>-<field name> TO WA_KEYS.
        CALL METHOD LR_API->MO_CORE_SERVICE->RETRIEVE_SIMPLE
          EXPORTING
            IV_OBJECT_TYPE_CODE = <MDM Lookup table name>
            IT_KEYS             = WA_KEYS
          IMPORTING
            ET_DDIC_STRUCTURE   = <Single Structure in SAP For Lookup field>.
        READ TABLE <Single Structure in SAP For Lookup field> INTO <Work Area> INDEX 1.
    * Here you get the values of the related lookup fields associated with the main table data.
      ENDLOOP.
    ENDLOOP.
    LR_API->MO_ACCESSOR->DISCONNECT( ).
    Edited by: Shyam Babu Sah on Nov 24, 2009 4:52 AM

  • How to retrieve the data from MDM hierarchy table using MDM Java API

    Hi,
    I have a hierarchy table in MDM. This table has a column, say X. I want to retrieve the values of this column X and show them in a drop-down using the MDM Java API.
    Can anyone help me to solve this?
    Regards
    Vallabhaneni

    Hi,
    Here is your code...
    // Build supporting result definitions for every lookup field of the hierarchy table,
    // so that lookup values are resolved when the tree is retrieved.
    // (repository_schema, conn_acc and Auth_User_session_cmd are assumed to exist already:
    //  the repository schema, the connection accessor and the authenticated session command.)
    TableId Hier_TId = repository_schema.getTableId("<hierarchy table code>");
    java.util.List list = new java.util.ArrayList();
    FieldProperties[] Hier_Field_props = repository_schema.getTableSchema(Hier_TId).getFields();
    for (int i = 0, j = Hier_Field_props.length; i < j; i++) {
        if (Hier_Field_props[i].isLookup()) {
            LookupFieldProperties lookup_field = (LookupFieldProperties) Hier_Field_props[i];
            TableSchema lookupTableSchema = repository_schema.getTableSchema(lookup_field.getLookupTableId());
            FieldId[] lookupFieldIDs = lookupTableSchema.getFieldIds();
            ResultDefinition Supporting_result_dfn = new ResultDefinition(lookup_field.getLookupTableId());
            Supporting_result_dfn.setSelectFields(lookupFieldIDs);
            list.add(Supporting_result_dfn);
        }
    }
    // Search over the hierarchy table and select its display fields.
    com.sap.mdm.search.Search hier_search = new com.sap.mdm.search.Search(Hier_TId);
    ResultDefinition Hier_Resultdfn = new ResultDefinition(Hier_TId);
    Hier_Resultdfn.setSelectFields(repository_schema.getTableSchema(Hier_TId).getDisplayFieldIds());
    ResultDefinition[] supportingResultDefinitions =
        (ResultDefinition[]) list.toArray(new ResultDefinition[list.size()]);
    // Retrieve the hierarchy tree.
    RetrieveLimitedHierTreeCommand retrieve_Hier_tree_cmd = new RetrieveLimitedHierTreeCommand(conn_acc);
    retrieve_Hier_tree_cmd.setResultDefinition(Hier_Resultdfn);
    retrieve_Hier_tree_cmd.setSession(Auth_User_session_cmd.getSession());
    retrieve_Hier_tree_cmd.setSearch(hier_search);
    retrieve_Hier_tree_cmd.setSupportingResultDefinitions(supportingResultDefinitions);
    try {
        retrieve_Hier_tree_cmd.execute();
    } catch (CommandException e5) {
        e5.printStackTrace();
    }
    HierNode Hier_Node = retrieve_Hier_tree_cmd.getTree();
    print(Hier_Node, 1);

    // Walk the tree and print each node's display value, indented by level.
    static private void print(HierNode node, int level) {
        if (!node.isRoot()) {
            for (int i = 0, j = level; i < j; i++) {
                System.out.print("\t");
            }
            System.out.println(node.getDisplayValue());
        }
        HierNode[] children = node.getChildren();
        if (children != null) {
            level++;
            for (int i = 0, j = children.length; i < j; i++) {
                print(children[i], level);
            }
        }
    }
    Best regards,
    Arun prabhu S
    Edited by: Arun Prabhu Sivakumar on Jul 7, 2008 12:19 PM

  • Data Reconciliation Data Sources in Business Content

    Can you tell me where we can find these DataSources and explain how to use them? Do we need to define an InfoCube/ODS or anything like that to load the data, and then use a report to see the results?
    Please explain with one complete scenario.
    Thanks.

    Data reconciliation for DataSources allows you to ensure the consistency of data that has been loaded into BI and is available and used productively there.
    It is based on a comparison of the data loaded into BI with the application data in the source system. You can access the data in the source system directly to perform this comparison.
    The term data reconciliation DataSource is used for DataSources that serve as a reference for accessing the application data in the source directly and therefore allow you to draw a comparison with the source data.
    This allows you to check the integrity of the loaded data, for example by comparing the total of a key figure in the DataStore object with the corresponding total that the VirtualProvider accesses directly in the source system.
    Hope it helps.

  • Data reconciliation

    BI Experts - I am using BI as my data repository to move data from R/3 to BPC. Any suggestions on what I need to look out for? Thanks.

    Purpose
    An important aspect in ensuring the quality of data in BI is the consistency of the data.  As a data warehouse, BI integrates and transforms data and stores it so that it is made available for analysis and interpretation. The consistency of the data between the various process steps has to be ensured. Data reconciliation for DataSources allows you to ensure the consistency of data that has been loaded into BI and is available and used productively there. You use the scenarios that are described below to validate the loaded data. Data reconciliation is based on a comparison of the data loaded into BI and the application data in the source system. You can access the data in the source system directly to perform this comparison.
    The term productive DataSource is used for DataSources that are used for data transfer in the productive operation of BI. The term data reconciliation DataSource is used for DataSources that are used as a reference for accessing the application data in the source directly and therefore allow you to draw comparisons to the source data. 
    You can use the process for transaction data. Limitations apply when you use the process for master data because, in this case, you cannot total key figures, for example.
    Model
    The data model for reconciling application data and loaded data (shown as a graphic in the original documentation) is the following:
    The productive DataSource uses data transfer to deliver the data that is to be validated to BI. The transformation connects the DataSource fields with the InfoObject of a DataStore object that has been created for data reconciliation, by means of a direct assignment.  The data reconciliation DataSource allows a VirtualProvider direct access to the application data.  In a MultiProvider, the data from the DataStore object is combined with the data that has been read directly. In a query that is defined on the basis of a MultiProvider, the loaded data can be compared with the application data in the source system.
    In order to automate data reconciliation, we recommend that you define exceptions in the query that proactively signal that differences exist between the productive data in BI and the reconciliation data in the source. You can use information broadcasting to distribute the results of data reconciliation by email, for example.
    Edited by: prem casanova on Oct 21, 2008 11:15 AM
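    In essence, the query on the MultiProvider compares, per characteristic combination, the key-figure total loaded into BI with the total read directly from the source via the VirtualProvider, and the exception flags any difference. Detached from the BI objects, that check is just the following (a sketch with invented keys and amounts, not BW code):

    // Sketch of the reconciliation-query logic: flag every characteristic combination whose
    // total loaded into BI differs from the total read directly in the source. Values invented.
    import java.util.Map;
    import java.util.TreeMap;

    public class ReconciliationCheck {
        public static void main(String[] args) {
            Map<String, Double> loadedInBi   = new TreeMap<>(); // totals from the DataStore object
            Map<String, Double> sourceDirect = new TreeMap<>(); // totals via the VirtualProvider
            loadedInBi.put("2008/10|US01", 125000.0);  sourceDirect.put("2008/10|US01", 125000.0);
            loadedInBi.put("2008/10|DE01",  98000.0);  sourceDirect.put("2008/10|DE01",  97500.0);

            sourceDirect.forEach((key, sourceTotal) -> {
                double loaded = loadedInBi.getOrDefault(key, 0.0);
                if (Double.compare(loaded, sourceTotal) != 0) {
                    // This is the condition an exception in the query would highlight
                    // (and that information broadcasting could then distribute by email).
                    System.out.println("Difference at " + key + ": loaded=" + loaded
                            + ", source=" + sourceTotal + ", delta=" + (loaded - sourceTotal));
                }
            });
        }
    }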

  • Data Reconciliation ... troubleshooting

    Hi All Experts,
    Can anyone provide me with some assistance on the following, plus some solutions?
    a) Data validation between R/3 and BW reports.
    b) If that is not possible, use SE16 - but explain how to determine the table and the fields to be used;
    check whether the definitions are similar between the reports;
    check whether there are any exclusions in the report and whether the data inconsistency is caused by master data loads (unassigned values, etc.).
    c) Timing of the data loads - have there been postings since the data was loaded into BW?
    d) If the report definitions are the same between R/3 and BW, check the data in the cube using LISTCUBE - explain how to use LISTCUBE.
    e) Determine whether the error is caused by a particular package or by an error in an update rule / transfer rule - explain how to determine this (include the package ID and possible errors for full and delta loads).
    Include pointers on how to fix the error - asking a BW consultant to fix the error is a good idea in this case; they should not be fixing it themselves (unless it is a query issue).
    Also, cover data reconciliation in general and how to fix the errors, and suggest corrections, improvements and fixes wherever possible.
    TQ
    BR
    Kumar

    a) Data validation between R/3 and BW reports can be done by getting the reports from the R/3 consultant, or, if you know how to get them, you can do it yourself, and then checking them against the reports in BW (the selections should be the same).
    b) Use SE16. For this you have to know from which tables the data is coming into BW. Take the case of the material master, for example: the data for this master data comes from the table MARA.
    So go to SE16 in R/3 and check the data in this table; proceed similarly for all other DataSources.
    c) You can check this in the delta queue of the source system.
    d) LISTCUBE is a transaction in BW where you can check the data.
    You get a screen with selections; if there are too many selection fields the system will not allow you to continue, so deselect all the characteristics and select only the few that are important, or those for which you want to check the data.
    e) To determine whether the error is caused by a particular package or something else, look in the RSMO transaction, select the load that failed, and you will find all the information on the Details tab.
    To see whether it is caused in the update rules or transfer rules, simulate the update with debugging switched on, check the update rules, and you will find out.
    Data reconciliation is basically your first question; as for fixing errors, you have to start from the specific error, so if you have one, just post it in the forum and your friends here will help you.
    Hope this helps
    Regards
    Majeed

  • Data reconciliation for Open Orders in SD

    Hi,
          Can anyone give me a step-by-step approach for doing the data reconciliation for open orders in the sales InfoCube?
    Regds
    tanu

    Hi Tanu,
    Reconciliation:
    Reconciliation is the process of comparing the data, after it has been transferred to the BW system, with the source system. The procedure is: you can check the data via SE16 if the data comes from one particular table only; if the DataSource is a standard DataSource, the data comes from many tables, and in that scenario what I used to do is ask the R/3 consultant to report on the same selections, get the data in an Excel sheet, and then reconcile it with the data in BW. If you are familiar with the R/3 reports, you are good to go, meaning you need not depend on the R/3 consultant (it is better to know which reports to run to check the data).
    I will give you a scenario to help you understand it better. Let's say BW extracts FI data from R/3. To make sure that all records have been extracted from R/3, we can create a report in R/3 which shows the year-to-date balance of all the documents posted, and we can create a BEx query on the BW cube which also displays the trial balance. Any difference between the two balances identifies the records missing from R/3.
    Similarly you can model other scenarios as per your requirement. If you are extracting from two or more different sources in R/3, create a MultiCube on top of the individual cubes and produce the report there. You need to create a similar report in R/3 as well.
    check this How to Doc:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/7a5ee147-0501-0010-0a9d-f7abcba36b14
    Hope this helps.
    Regards,
    Ravikanth

  • Data reconciliation for Open Orders

    Hi,
         Can anyone give me a step-by-step approach for data reconciliation for open orders in the sales InfoCube? I understand that open orders are dynamic key figures, so data reconciliation in this case is not straightforward using generic view DataSources, and we need to program our own generic DataSource with a function module. I would like to know if anyone has dealt with this specific case and can give me the procedure for it.
    Regds
    tanu

    Hi
    Refer this
    Data Reconciliation
    You can compare periodic R/3 reports relevant to open orders with the values in the cube and its reports.
    Regards
    N Ganesh

  • Data Reconciliation in BI

    Hi All,
    There is a requirement to do a data reconciliation within the BI system.
    Requirement: PSA data from the DataSource needs to be reconciled with the data target (DSO or cube).
    Approach: We are planning to build a DataSource on the PSA using a function-module-based DataSource and fetch that data into a new write-optimized DSO. Then a MultiProvider can be built which contains the new DSO and the data target to be compared; we would build a query on top of this MultiProvider and compare the values.
    Please suggest some other strategies for achieving this.
    Regards,
    Mayank
    Assigning points is the best way of saying thanks.

    Hi Mayank,
    Your approach is correct for reconciliation. I would also suggest you include a key figure in each of the DSOs for the count of records, defaulted to '1' for every entry in the DSO.
    In the query, you can have all the fields of the DSO as characteristics and, under key figures, the key figure defined on the row count.
    In the report, a '1' from both sources for the same characteristic combination indicates a perfect match; a single '1' indicates an occurrence only in the DSO or only in the PSA, respectively.
    Please ensure the structure of both InfoProviders is exactly the same.
    Cheers,
    VA
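    To make the row-count idea concrete: each provider contributes a constant 1 per record, and the reconciliation query simply checks that every characteristic combination is counted once on both sides. Outside BW, the check boils down to this (the document keys are invented for the illustration):

    // Sketch of the row-count reconciliation: a record key missing from either side is flagged.
    // The keys stand for the characteristic combinations of the two InfoProviders; invented data.
    import java.util.Set;
    import java.util.TreeSet;

    public class RowCountReconciliation {
        public static void main(String[] args) {
            Set<String> psaKeys = new TreeSet<>(Set.of("DOC001", "DOC002", "DOC003"));
            Set<String> dsoKeys = new TreeSet<>(Set.of("DOC001", "DOC003"));

            Set<String> allKeys = new TreeSet<>(psaKeys);
            allKeys.addAll(dsoKeys);

            for (String key : allKeys) {
                int psaCount = psaKeys.contains(key) ? 1 : 0;  // the defaulted '1' key figure
                int dsoCount = dsoKeys.contains(key) ? 1 : 0;
                String status = (psaCount == 1 && dsoCount == 1)
                        ? "match"
                        : "only in " + (psaCount == 1 ? "PSA" : "DSO");
                System.out.println(key + "  PSA=" + psaCount + "  DSO=" + dsoCount + "  " + status);
            }
        }
    }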
