Data Warehouse Cubes Not Processing

Working with a customer who is having data warehouse problems. The main issues are:
- ETL jobs run fine (except MPSync, which finishes with only 179/180 jobs complete)
- Cube processing is stuck in a RUNNING loop; the jobs never complete or fail out, and all show a last run time of 1/10 and a next run time of 1/11
We have scoured the internet for a solution and come across various blogs addressing the issue. We have tried manually disabling the jobs and then manually processing the cubes, restarting the DW server, restarting SSAS, etc., to no avail.
The latest solution we tried was to unregister the DW and re-register it; however, when we went to unregister the DW we received the following error:
"Failed to unregister from the data warehouse"
Title: Error
Text: System.ArgumentException: SM data source not found.
at
Microsoft.EnterpriseManagement.ServiceManager.UI.Administration.DWRegistration.Unregistration.DWUnregistrationHelper.AcceptChanges(WizardMode wizardMode)
So our next step was to unregister the data sources and then re-register them individually. Of the two data sources, we were able to unregister and re-register the DW data source (DW_COMPANY01), but when we tried to unregister the Operational data source (COMPANY01) we got the following error:
Title:  An error was encountered while running the task.
Text:  Exception has been thrown by the target of an invocation.
Based on the two errors shown above, I assume we cannot unregister this data source, or the DW as a whole, because it cannot find the operational data source. A couple of things to point out about this data source and the environment to shed some light on the situation:
Customer upgraded all servers to UR4 around the January time frame
Customer promoted a new primary MS around January 20th
Currently, when reviewing the details of the data source, the SDK Server field references the computer name of the OLD primary management server
Looking at the event logs on the DW MS (with SSRS/SSAS), an error shows that the MPSync job failed to finish and references the OLD primary management server
Looking for any guidance on this issue. We have taken many steps to troubleshoot the problem, to no avail. Please let me know if you have any questions or need any more information.

Hi!
It was probably a problem with the primary keys: when the MPSync job does not complete, something will fail in the subsequent ETL jobs.
The solution would have been here: http://technet.microsoft.com/en-us/library/jj614520.aspx
At this stage, frankly speaking, if you still have all the data in your CMDB and nothing has been lost to grooming, uninstall all DWH components (DW management server, databases) and reinstall the DW completely.
R.

Similar Messages

  • OLAP vs OBIEE Cubes vs Data Warehouse Cubes

    Good afternoon,
    Could someone please tell me the difference between the cubes created using Oracle's Analytic Workspace Manager, the cubes created in OBIEE, and the ones created in Oracle Warehouse Builder? Do these all use Oracle's OLAP functionality, or are they different things?
    Thank you.

    I think by Datawarehouse cubes, you mean the traditional relational ROLAP cubes.
    It has been a long time now since the OLAP engine was merged into the Oracle database (starting with version 9.2), so multidimensional MOLAP cubes can be created inside an Oracle-based DW as part of an aggregation strategy (in addition to MVs or instead of MVs) to improve query performance and simplify calculations.
    Some other points.
    (1). OWB can create both ROLAP cubes and MOLAP cubes. Even ODI has knowledge modules to create MOLAP (i.e., Oracle-OLAP) cubes.
    (2). OBIEE cannot create Oracle-OLAP cubes in the database. I think there is some new functionality to create Essbase cubes through OBIEE, but there is no out-of-the-box functionality to create Oracle-OLAP cubes through OBIEE.
    (3). The process to create Oracle-OLAP cubes using AWM and then import them into the RPD is very simple. Starting with OBIEE 11.1.1.5, OBIEE understands Oracle OLAP metadata in the standard Oracle database dictionary. So when Oracle-OLAP cubes and dimensions are queried, OBIEE generates physical queries using the OLAP_TABLE function, and that is how data is retrieved from the OLAP engine into the relational engine and then into the BI server.
    (4). Oracle OLAP cubes are always created in Analytical workspace, which is a table prefixed by AW$. So one quick way is to check the tables in your schema and see if there is any table with AW$ prefix.
    You can also query Oracle-OLAP metadata to see what (if any) analytic workspaces, MOLAP cubes and MOLAP dimensions exist in your database. Refer to Oracle OLAP Dictionary Views at http://docs.oracle.com/cd/E11882_01/olap.112/e17123/admin.htm#i1006325
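    The AW$ check in point (4) is easy to script once you have the schema's table names; a minimal sketch (the table names below are invented, and in practice the list would come from a query such as SELECT table_name FROM user_tables):

    ```python
    # Hypothetical table names, e.g. fetched from the user_tables dictionary view
    tables = ["SALES_FACT", "AW$SALESAW", "TIME_DIM", "AW$FINANCEAW"]

    # Analytic workspaces live in tables prefixed AW$, so filtering on the
    # prefix shows whether any MOLAP cubes exist in the schema
    workspaces = [t for t in tables if t.startswith("AW$")]
    print(workspaces)  # ['AW$SALESAW', 'AW$FINANCEAW']
    ```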

  • Attribute key cannot be found : Data present but not processed

    Hi there,
    I know this question has been asked several times and I have gone through a lot of the proposed solutions, but none were successful in my case. My cube processed successfully for months, but now I get the following error:
    Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'EventMember', Column: 'Id', Value: '22560966'. The attribute is 'Id'.
    To explain my case here what I've got.
    A dimension [Member Unique] where there is a bigint Id and some attributes. The only defined attribute in the dimension is Id.
    When I fully process this dimension, the query issued by SSAS to SQL (captured with the profiler) is the following:
    SELECT
    DISTINCT
    [dbo_Member].[Id] AS [dbo_MemberId0_0]
    FROM [dbo].[Member] AS [dbo_Member]
    I verified that the Id 22560966 is present in the table and in the resulting set and that is the case.
    Then I have the measure group EventMember based on a named query doing the following :
    SELECT Id
    FROM dbo.Member AS m
    WHERE EXISTS (
        SELECT 1 AS Expr1
        FROM dbo.Event AS g
        INNER JOIN dbo.MemberHistory AS h ON h.Id = g.MemberId
        INNER JOIN dbo.MemberVersion AS v ON v.HistoryId = h.Id
        WHERE v.AntvoiceMemberId = m.Id
    )
    This query returns also the row 22560966.
    When I fully process the measure group EventMember, the query issued by SSAS is:
    SELECT [EventMember].[EventMember0_0] AS [EventMember0_0],
           [EventMember].[EventMemberId0_1] AS [EventMemberId0_1]
    FROM (
        SELECT 1 AS [EventMember0_0], [Id] AS [EventMemberId0_1]
        FROM (
            SELECT Id
            FROM dbo.Member AS m
            WHERE EXISTS (
                SELECT 1 AS Expr1
                FROM dbo.Event AS g
                INNER JOIN dbo.MemberHistory AS h ON h.Id = g.MemberId
                INNER JOIN dbo.MemberVersion AS v ON v.HistoryId = h.Id
                WHERE v.AntvoiceMemberId = m.Id
            )
        ) AS [EventMember]
    ) AS [EventMember]
    And the process crashes, giving me the above error.
    I did try allowing the missing-key-attribute error in the process options, and the processing ran successfully. That allowed me to issue an MDX statement against my cube to check whether the dimension member was present, and it was not, hence the error, I think. The MDX statement was:
    SELECT [Member Unique].[Id].&[22560966] on 0
    FROM MyCube
    So the final question is: why is this member, which is returned by the SQL query, not present in my dimension?
    I tried deleting my cube; it did not change a thing. One noticeable thing is that the faulty attribute keys are the same every time.

    Hi RodolpheAV,
    The error means SSAS could not find a record in the dimension table [table name] where column [column name] contained value [value]. For more information, you can look into the following articles:
    Error messages when you try to process a database or a cube in SQL Server 2005 Analysis Services: "The attribute key cannot be found" and "The record was skipped because the attribute key was not found:
    http://support.microsoft.com/kb/922673
    SSAS Quick Reference: Attribute Key Cannot Be Found:
    http://www.ssas-info.com/analysis-services-articles/66-mgmt/1963-ssas-quick-reference-attribute-key-cannot-be-found
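    The mismatch the error describes is the fact source referencing a key the processed dimension never produced. A toy comparison of the two key sets shows the idea (the key values are invented except for 22560966, which is taken from the error message):

    ```python
    # Stand-ins for the dimension's SELECT DISTINCT result and the keys
    # referenced by the measure group's named query
    dimension_keys = {22560965, 22560967}
    fact_keys = {22560965, 22560966, 22560967}

    # Any key present in the fact source but absent from the processed
    # dimension triggers "The attribute key cannot be found" during processing
    missing = sorted(fact_keys - dimension_keys)
    print(missing)  # [22560966]
    ```

    In practice, running the dimension's SELECT DISTINCT and the measure group's named query side by side and diffing the key columns pinpoints the orphan rows the same way.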
    TechNet Subscriber Support
    If you are a TechNet Subscription user and have any feedback on our support quality, please send your feedback here.
    Regards,
    Bin Long
    TechNet Community Support

  • Installed Data Warehouse Builder. Not knowing how to proceed; initial problems

    Hello friends,
    I installed Oracle Warehouse Builder 10.2.0.1.0, and when I click the Repository Assistant it asks me for a "basic or advanced install".
    I clicked on basic install, after which it asks me to fill in many things like repository name, password, SYS name and password, an Oracle service name, etc.
    Tell me how I can fill in those details. When I fill them in and click Next, it shows me a SQL exception.
    I have looked through many books but am not getting any proper idea.
    I have already installed Oracle 10g, and even the Oracle backup software.
    Please help me with the passwords and with filling in those details.
    Is this related to Oracle 10g in any way?
    I hope you will reply; I am waiting for a good answer.
    Sindhu

    Contact your local Oracle University or Oracle learning centre and get some training on OWB. But before you undertake any training you might want to be clear on the specifics, such as:
    Operating system.
    Versions of the DB and client.
    Is there a specific reason, such as data migration from a legacy system to OWB, or development of a new DW environment?
    What BI tool are you planning to use, e.g. Cognos, OBI EE, Hyperion, BO, etc.?
    What is typically your source system, and how is the data going to be interfaced (frequency, data rules, etc.)?
    What is the schema architecture, star or snowflake?
    What DW methodology do you want to use, Kimball or Inmon?
    How much data are you going to work with in the DW environment?
    Is it going to be on RAC?
    Once you get your specifics right and know what you want, you can get the best out of the training; otherwise it will be a routine exercise of chasing something without knowing what you want to achieve.
    Trust this helps.

  • Event data collection process unable to write data to the Data Warehouse

    Alert Description:
    Event data collection process unable to write data to the Data Warehouse. Failed to store data in the Data Warehouse. The operation will be retried.
    Exception 'InvalidOperationException': The given value of type Int32 from the data source cannot be converted to type tinyint of the specified target column.
    Running SCOM 2007 R2 on Server 2008 R2 with SQL Server 2008 R2. I can only find a single reference to this exact error on the Internet. It started occurring on a weekend; no changes were made to the SCOM server directly before this occurred. Does anyone know what the error means and/or how to fix it?
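    The exception itself is a plain range problem: SQL Server's tinyint holds 0..255, so any Int32 value outside that range fails the conversion during the warehouse write. A quick sketch of the check (the sample values are hypothetical):

    ```python
    # tinyint in SQL Server stores 0..255; an Int32 outside that range
    # fails conversion with exactly this InvalidOperationException
    TINYINT_MIN, TINYINT_MAX = 0, 255

    incoming = [0, 17, 255, 256, -3]  # hypothetical Int32 values from the source
    offending = [v for v in incoming if not TINYINT_MIN <= v <= TINYINT_MAX]
    print(offending)  # [256, -3]
    ```

    Running an equivalent range filter over the staged column would identify which incoming rows the target column cannot accept.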

    Hello,
    I would suggest the following threads for your reference:
    Troubles with DataWarehouse database
    http://social.technet.microsoft.com/Forums/en-US/operationsmanagergeneral/thread/5e7005ae-d5d8-4b5c-a51c-740634e3da4e
    Data Warehouse configuration synchronization process failed to read state
    http://social.technet.microsoft.com/Forums/en-US/systemcenter/thread/8ea1f4b9-115b-43cd-b66f-617533703047
    Thanks,
    Yog Li
    TechNet Community Support

  • Performance issues with data warehouse loads

    We have performance issues with our data warehouse ETL load process. I have run analyze and dbms_stats and checked the database environment. What other things can I do to optimize performance? I cannot use Statspack since we are running Oracle 8i. Thanks,
    Scott

    Hi,
    You should analyze the DB after you have loaded the tables.
    Do you use sequences to generate PKs? Do you have a lot of indexes and/or triggers on the tables?
    If yes:
    Make sure your sequence caches (ALTER SEQUENCE s CACHE 10000).
    Drop all unneeded indexes while loading, and disable triggers if possible.
    How big is your redo log buffer? When loading a large amount of data, it may be an option to enlarge this buffer.
    Do you have more than one DBWR process? Writing in parallel can speed things up when a checkpoint is needed.
    Is it possible to use a direct-path load? Or do you already load direct?
    Dim
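    The drop-indexes-while-loading advice can be demonstrated with a small, self-contained sketch; SQLite is used here purely as a stand-in for Oracle, and the table and index names are made up. The two variants load the same rows, first with the index maintained row by row, then with a single index build after the bulk insert:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE fact (id INTEGER, val INTEGER)")
    rows = [(i, i % 100) for i in range(100_000)]

    # Variant 1: load with the index already in place (index maintained per row)
    cur.execute("CREATE INDEX ix_val ON fact (val)")
    cur.executemany("INSERT INTO fact VALUES (?, ?)", rows)

    # Variant 2: drop the index, bulk load, then rebuild the index once
    cur.execute("DELETE FROM fact")
    cur.execute("DROP INDEX ix_val")
    cur.executemany("INSERT INTO fact VALUES (?, ?)", rows)
    cur.execute("CREATE INDEX ix_val ON fact (val)")

    print(cur.execute("SELECT COUNT(*) FROM fact").fetchone()[0])  # 100000
    ```

    On a real warehouse load the second pattern usually wins, because the index is built once over the finished data instead of being maintained for every inserted row.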

  • Service Manager 2012 R2 - Data warehouse Issue

    I have an issue with a customer's data warehouse server. Whenever we generate a report using Service Manager, we are not seeing the data in the report. For example, we only see 4 incidents on reports when we generate them, and these are many-months-old records. Within the database there are 1000+ incidents created, yet generating a report shows us only 4 incidents. I am trying to figure out why it shows only a few records when it should show all of them. I have this issue now with two customers.
    I can see that the data warehouse jobs are running without issues; they are not failing. Please let me know how I can get this issue fixed.

    Open up SQL Management Studio, connect to the server that hosts the data warehouse database, and run a query against the following view.
    Incident
    SELECT * FROM [DWDataMart].[dbo].[IncidentDimvw]
    If the incident query returns only 4 incidents, like your report, then the sync to the data warehouse is not working correctly. I would recommend running Travis' ETL script to run all the data warehouse jobs in the correct order. You can find it here: https://gallery.technet.microsoft.com/PowerShell-Script-to-Run-a4a2081c
    And if that still does not help, there are a few more blog posts on troubleshooting the data warehouse, but let's try this first and go from there.
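    The gist of the linked script is simply to start each DW job and wait for it to finish before starting the next, never in parallel. A minimal sketch of that loop (the job names and the run_job body are placeholders; in SCSM you would call Start-SCDWJob and poll Get-SCDWJob instead):

    ```python
    # Placeholder job sequence; the linked script encodes the authoritative order
    jobs = ["MPSyncJob", "Extract", "Transform", "Load"]

    def run_job(name):
        # Stand-in for starting the job and polling until it leaves 'Running'
        return f"{name}: completed"

    # Strictly sequential: each job must finish before the next one starts
    results = [run_job(job) for job in jobs]
    print(results[-1])  # Load: completed
    ```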
    Cheers,
    Thomas Strömberg
    System Center Specialist
    Please remember to 'Propose as answer' if you find a reply helpful

  • Problem during  Data Warehouse Loading (from staging table to Cube)

    Hi All,
    I have created a staging module in OWB to load my flat files into my staging tables, and a warehouse module to load my staging tables into the dimension and cube that I have created.
    My scenario:
    I have a temp_table_transaction which was loaded from my flat files. This table was loaded with 168,271,269 records from the flat file.
    I created a mapping in OWB which takes temp_table_transaction, joins it with other tables, and applies some expressions and convert functions so that these numbers fill a new table called stg_tbl_transaction in my staging module. Running this mapping takes 3 hours and 45 minutes with this mapping configuration:
    Default operating mode in the mapping's runtime parameters = Set based
    My dimension filled correctly, but I have two problems when I want to transfer my staging table to my cube:
    Problem #1:
    I created a cube called transaction_cube with OWB, and it generated and deployed correctly.
    I created a map to fill my cube with the 168,271,268 records in the staging table stg_tbl_transaction and deployed it to the server (my cube map's operating mode is set based).
    But after running this map, it had not completed after 9 hours, and I was forced to cancel the running map by killing its sessions. I want to know whether this load time for this volume of data is acceptable, or whether we should expect to spend more time on this much data. Please let me know if anybody has any insight.
    Problem #2:
    To test my map, I created a map configured set based in its operating mode, selected stg_tbl_transaction (168,271,268 records) as the source, and created another table to transfer and load my data into. I wanted to test the time we should expect for this simple map, but after 5 hours my data had not loaded into the new table. I want to know where my problem is: should I have set something in the map configuration, or is it something else? Please guide me on these problems.
    CONFIGURATION OF MY SERVER:
    I run OWB on two-socket Xeon 5500-series servers with 192 GB RAM and disks in a RAID 10 array.
    Regards,
    Sahar

    For all of you:
    It is possible to load from an InfoSet into a cube; we did it, and it was OK.
    Data really are loaded from the InfoSet (cube + master data) into the cube.
    When you create a transformation under a cube, the InfoSet is proposed, and it works fine.
    Now the process is no longer operational and I don't understand why.
    Loading from an InfoSet into a cube is possible; I can send you screenshots if you want.
    Christophe

  • Sales order not appearing in the Sales Data Warehouse sales order list

    Hi Gurus,
    Can you help me in this regard: my sales order has been invoiced, but it is not appearing in the Sales Data Warehouse sales order list.

    Hi Poonam,
    I don't follow completely, but I assume you mean your sales order doesn't exist in your cube or ODS in BW.
    If so, check whether the delta load has been done or not.
    Before that, check in tables VBAK (header data) and VBAP (item data) whether your sales order exists.
    Hope this helps.
    Regards,
    Reddy

  • Availability data not visible in data warehouse

    I'm having a problem with our data warehouse. I can't run, or even find, availability reports for some of the objects that are visible and clearly monitored in our SCOM. For example, I created a web transaction monitor with the wizard, but when I try to run an availability report from it, there is no object for it, so I cannot even run the report. I know about the 500-object limit, and I have set the registry key to see more objects. We use SCOM 2012 R2 UR2.
    Is there anything else that I should check? Can I somehow run a SQL query against my data warehouse to see if there is any availability data?

    Hello SamiKoskivaara,
    Could you please check whether event ID 31553 is being logged on one of your SCOM management servers?
    Event ID 31553:
    "Data was written to the Data Warehouse staging area but processing failed on one of the subsequent operations. Exception 'SqlException': Sql execution failed. Error 2627, Level
    14, State 1, Procedure ManagedEntityChange, Line 368, Message: Violation of UNIQUE KEY constraint 'UN_ManagedEntityProperty_ManagedEntityRowIdFromDAteTime'. Cannot insert duplicate key in object 'dbo.ManagedEntityProperty'. The duplicate key value is (184,
    Mar 1 2013 9:42AM). One or more workflows were affected by this...
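    That 31553 pattern means the staging data contains two rows with the same (ManagedEntityRowId, FromDateTime) key pair, which the UNIQUE KEY constraint rejects. A toy sketch of spotting the duplicate before the insert (the rows are invented around the key values quoted in the event text):

    ```python
    # Staging rows keyed by (ManagedEntityRowId, FromDateTime); the UNIQUE KEY
    # constraint rejects any second row repeating an existing key pair
    rows = [
        (184, "Mar 1 2013 9:42AM", "propA"),
        (184, "Mar 1 2013 9:42AM", "propB"),   # duplicate key -> error 2627
        (185, "Mar 1 2013 9:43AM", "propC"),
    ]

    seen, unique_rows, duplicates = set(), [], []
    for row in rows:
        key = row[:2]                          # the two key columns
        (duplicates if key in seen else unique_rows).append(row)
        seen.add(key)

    print(len(unique_rows), len(duplicates))  # 2 1
    ```

    An equivalent GROUP BY ... HAVING COUNT(*) > 1 over the staging table would surface the same offending key pairs on the server side.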

  • Data warehouse Loader did not write the data

    Hi,
    I need to know which products are the most searched. I know the tables responsible for storing this information are ARF_QUERY and ARF_QUESTION. I already have the Data Warehouse Loader module running; if anyone knows why the data warehouse loader did not write the data to the database, I would be grateful.
    Thanks.

    I have configured the Data Warehouse Loader and its components, and I have enabled the logging mechanism.
    I can manually pass the log files into the queue and then populate the data into the Data Warehouse database through scheduling.
    The log file data is populated into this queue through JMS message processing, and this should be automated; I am unable to configure it.
    Which method is responsible for adding the log file data to the loader queue, and how do I automate this?

  • Data warehouse monitor initial state data synchronization process failed to write state.

    Data Warehouse monitor initial state data synchronization process failed to write state to the Data Warehouse database. Failed to store synchronization process state information in the Data Warehouse database. The operation will be retried.
    Exception 'SqlException': Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
    One or more workflows were affected by this. 
    Workflow name: Microsoft.SystemCenter.DataWarehouse.Synchronization.MonitorInitialState
    Instance name: Data Warehouse Synchronization Service
    Instance ID: {0FFB4A13-67B7-244A-4396-B1E6F3EB96E5}
    Management group: SCOM2012R2BIZ
    Could you please help me out of the issue?

    Hi,
    It seems that you are encountering event 31552; you may check the Operations Manager event logs for more information regarding this issue.
    There can be many causes of the 31552 event, such as:
    A sudden flood (or excessive sustained amounts) of data to the warehouse that is causing aggregations to fail moving forward.
    The Exchange 2010 MP is imported into an environment with lots of state changes happening.
    Excessively large ManagedEntityProperty tables causing maintenance to fail because they cannot be parsed quickly enough in the time allotted.
    Too much data in the warehouse staging tables which was not processed due to an issue and is now too much to be processed at one time.
    Please go through the links below to get more information about troubleshooting this issue:
    The 31552 event, or “why is my data warehouse server consuming so much CPU?”
    http://blogs.technet.com/b/kevinholman/archive/2010/08/30/the-31552-event-or-why-is-my-data-warehouse-server-consuming-so-much-cpu.aspx
    FIX: Failed to store data in the Data Warehouse due to a Exception ‘SqlException': Timeout expired.
    Regards,
    Yan Li
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact [email protected]

  • Alert data collection process unable to write data to the Data Warehouse

    Alert data collection process unable to write data to the Data Warehouse. Failed to store data in the Data Warehouse. The operation will be retried.
    Exception 'InvalidOperationException': The given value of type String from the data source cannot be converted to type nvarchar of the specified target column.
    One or more workflows were affected by this.
    Workflow name: Microsoft.SystemCenter.DataWarehouse.CollectAlertData
    Instance name: Data Warehouse Synchronization Service
    Instance ID: {9A0B3744-A559-3080-EA82-D22638DAC93D}
    Management group: SCOMMG
    Can anybody help?

    About 24 hours ago, one of my four management servers began generating this error every 10 minutes; we only upgraded to SCOM 2012 R2 a couple weeks ago, I have NOT installed UR1.  No new management packs or database changes have been made within the
    last week; KB945946 is not related to this.  An Event ID 11411 warning started occurring around the same time as this started and repeats every 10 minutes, too:
    Alert subscription data source module encountered alert subscriptions that were waiting for a long time to receive an acknowledgement.
     Alert subscription ruleid, Alert subscription query low watermark, Alert subscription query high watermark:
    5fcdbf15-4f5b-29db-ffdc-f2088a0f33b7,03/27/2014 00:01:39, 03/27/2014 20:30:00
    Performance on the Data Warehouse database server seems fine; CPU, memory and disk I/O are good.
    How can we identify where the problem is?

  • Data in the Cube not getting aggregated

    Hi Friends
    We have Cube 1 and Cube 2.
    The data flow is represented below:
    R/3 DataSource > Cube 1 > Cube 2
    In Cube 1, data is stored by calendar day.
    Cube 2 has calendar week.
    In the transformations between Cube 1 and Cube 2, CalDay of Cube 1 is mapped to CalWeek of Cube 2.
    In Cube 2, when I upload data from Cube 1, the key figure values are not getting summed.
    EXAMPLE: Data in Cube 1:
    MatNo  CustNo  Qty  CalDay
    10001  xyz     100  01.01.2010
    10001  xyz     100  02.01.2010
    10001  xyz     100  03.01.2010
    10001  xyz     100  04.01.2010
    10001  xyz     100  05.01.2010
    10001  xyz     100  06.01.2010
    10001  xyz     100  07.01.2010
    Data in Cube 2:
    MatNo  CustNo  Qty  CalWeek
    10001  xyz     100  01.2010
    10001  xyz     100  01.2010
    10001  xyz     100  01.2010
    10001  xyz     100  01.2010
    10001  xyz     100  01.2010
    10001  xyz     100  01.2010
    10001  xyz     100  01.2010
    But the expected output should be:
    MatNo  CustNo  Qty  CalWeek
    10001  xyz     700  01.2010
    How do I achieve this?
    I checked in the transformations: all key figures are maintained with aggregation 'Summation'.
    regards
    Preetam
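    The roll-up described above can be sketched in a few lines; a toy version of the day-to-week mapping (the week value is fixed to 01.2010 here to match the sample, whereas a real transformation derives it from the calendar):

    ```python
    from collections import defaultdict

    # The seven daily rows from Cube 1: (MatNo, CustNo, qty, calday)
    daily = [("10001", "xyz", 100, f"{d:02d}.01.2010") for d in range(1, 8)]

    # Summation aggregation: group by (MatNo, CustNo, calweek) and add the qty
    weekly = defaultdict(int)
    for mat, cust, qty, _calday in daily:
        weekly[(mat, cust, "01.2010")] += qty   # calweek fixed for this sample

    print(weekly[("10001", "xyz", "01.2010")])  # 700
    ```

    If the loaded cube instead shows seven separate 100 rows, the records are not being grouped on the week key, which points at the transformation or at an extra characteristic keeping the rows distinct.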

    Just now I performed a consistency check for the cube, and I am getting the following warnings:
    Time characteristic 0CALWEEK value 200915 does not fit with time char 0CALMONTH val 0
    Consistency of time dimension of InfoCube &1
    Description
    This test checks whether or not the time characteristics of the InfoCube used in the time dimension are consistent. The consistency of time characteristics is extremely important for non-cumulative Cubes and partitioned InfoCubes.
    Values that do not fit together in the time dimension of an InfoCube result in incorrect results for non-cumulative cubes and InfoCubes that are partitioned according to time characteristics.
    For InfoCubes that have been partitioned according to time characteristics, conditions for the partitioning characteristic are derived from restrictions for the time characteristic.
    Errors
    When an error arises, the InfoCube is marked as a cube with an inconsistent time dimension. This has the following consequences:
    The derivation of conditions for partitioning criteria is deactivated on account of the non-fitting time characteristics. This usually has a negative effect on performance.
    When the InfoCube contains non-cumulatives, the system generates a warning for each query indicating that the displayed data may be incorrect.
    Repair Options
    Caution
    No action is required if the InfoCube does not contain non-cumulatives or is not partitioned.
    If the Infocube is partitioned, an action is only required if the read performance has gotten worse.
    You cannot automatically repair the entries of the time dimension table. However, you are able to delete entries that are no longer in use from the time dimension table.
    The system displays whether the incorrect dimension entries are still being used in the fact table.
    If these entries are no longer being used, you can carry out an automatic repair. In this case, all time dimension entries not being used in the fact table are removed.
    After the repair, the system checks whether or not the dimension is correct. If the time dimension is correct again, the InfoCube is marked as an InfoCube with a correct time dimension once again.
    If the entries are still being used, use transaction Listcube to check which data packages are affected.  You may be able to delete the data packages and then use the repair to remove the time dimension entries no longer being used. You can then reload the deleted data packages. Otherwise the InfoCube has to be built again.
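    The kind of inconsistency this check reports can be illustrated by re-deriving the week and month from the day and comparing them with the stored values; a sketch (ISO weeks are used here as a stand-in for SAP's calendar derivation, and the rows are invented around the 200915 value from the warning):

    ```python
    import datetime

    def derived_week(d):                 # YYYYWW derived from the date (ISO rules)
        year, week, _ = d.isocalendar()
        return f"{year}{week:02d}"

    def derived_month(d):                # YYYYMM derived from the date
        return f"{d.year}{d.month:02d}"

    # (calday, stored 0CALWEEK, stored 0CALMONTH) entries of the time dimension;
    # the second row mimics "0CALWEEK value 200915 ... 0CALMONTH val 0"
    rows = [
        (datetime.date(2009, 4, 6), "200915", "200904"),   # consistent
        (datetime.date(2009, 4, 6), "200915", "000000"),   # month does not fit
    ]

    bad = [r for r in rows if (derived_week(r[0]), derived_month(r[0])) != r[1:]]
    print(len(bad))  # 1
    ```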

  • Data mart cube to cube copy records are not matching in target cube

    Hi experts,
    I need help on the questions below for a data mart cube-to-cube copy (8M*).
    It is a BW 3.5 system.
    We have two financial cubes.
    Cube A1 is sourced from the R/3 system (delta update), and Cube B1 is sourced from the A1 cube (full update). These two cubes are connected through update rules with one-to-one mapping, without any routines. Basis did a copy of the back-end R/3 system from the Production to the Quality server; this happened approximately 2 months back.
    Cube A1, which extracts the delta load from R/3, is loading fine, but for the second cube (extraction from the previous cube A1) I am not getting the full volume of data; instead I am getting a meagre amount, although the load shows successful status in the monitor.
    We tried giving conditions in my InfoPackage (as was done in the previous year's loads), but it still fetches the same meagre volume of data.
    To check whether this happens only for that particular cube, we tried another cube sourced through the Myself system, and that also gets meagre data rather than the full data.
    For example: for an employee, if the data available is 1000 records, the system randomly extracts some 200 records.
    Any quick reply will be most helpful. Thanks

    Hi Venkat,
    Did you do any selective deletions in Cube A1?
    First reconcile the data between Cube 1 and Cube 2:
    match the totals of Cube 1 with Cube 2.
    Thanks,
    Vijay.
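    The reconciliation step suggested above amounts to a per-key comparison of the two cubes' totals; a toy example (employee IDs and values invented to mirror the 1000-vs-200 symptom):

    ```python
    # Totals per employee in the source cube A1 and the target cube B1
    cube_a1 = {"EMP1": 1000, "EMP2": 650}
    cube_b1 = {"EMP1": 200,  "EMP2": 650}   # EMP1 arrived only partially

    # Keys whose totals disagree point at the records lost in the cube-to-cube load
    mismatches = {k: (cube_a1[k], cube_b1.get(k, 0))
                  for k in cube_a1 if cube_a1[k] != cube_b1.get(k, 0)}
    print(mismatches)  # {'EMP1': (1000, 200)}
    ```

    The same comparison can be done in BW with two queries grouped on the shared characteristic, which also reveals whether whole keys or only partial volumes are missing.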
