Filtering multi-source data in a dashboard

Hello,
I want to create a dashboard with multi-source data (tables / XML files).
Filtering data from one source is easy, but what I want is to apply the same criteria to filter the rest.
Each table / XML file will be stored in a separate Excel tab.
Example:
Imagine we have to select:
1- the business unit
2- the country
3- the project
And then display charts with data from 5 different tables (XML files).
I wanted to use macros -> impossible.
Please help.
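
Since the macro route fell through, here is one way to picture the mechanics outside Excel: read every tab, then apply one shared filter to all of the tables. This is only a minimal Python/pandas sketch; the workbook name, sheet layout and column names (BusinessUnit, Country, Project) are assumptions for illustration, not the poster's actual model.
Example:
# Minimal sketch: apply the same three criteria to every tab of a workbook.
# All file, sheet and column names here are assumptions.
import pandas as pd

bu, country, project = "Retail", "DE", "Alpha"   # the three selections

# sheet_name=None reads every tab into a dict of DataFrames.
sheets = pd.read_excel("dashboard_sources.xlsx", sheet_name=None)

# One shared filter for all tables; the charts would be fed from these.
filtered = {
    name: df[(df["BusinessUnit"] == bu)
             & (df["Country"] == country)
             & (df["Project"] == project)]
    for name, df in sheets.items()
}

for name, df in filtered.items():
    print(f"{name}: {len(df)} rows after filtering")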

Similar Messages

  • Passing multi-value parameter from BIEE dashboard to BIP report

    Is it possible to pass a multi-value parameter from a BIEE dashboard prompt to an integrated BI Publisher report? (The BIP report has a DB data source, not an Answers request.)
    Thank you
    R.

    Hi Rajkm,
    To pass a multi-value parameter through the Reporting Services web services, you need to define as many ParameterValue objects as there are values in the multi-value parameter being passed into the report. The Name property of each of these ParameterValue objects must be set to the parameter's name.
    I found a good FAQ article for this scenario:
    How do I pass a multi-value parameter into a report with Reporting Services Web service API?:
    http://blogs.msdn.com/b/sqlforum/archive/2010/12/21/faq-how-do-i-pass-a-multi-value-parameter-into-a-report-with-sql-server-reporting-services-ssrs-web-services-api.aspx
    Hope this helps.
    Elvis Long
    TechNet Community Support
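
    To make that shape concrete, here is a hedged Python sketch using the zeep SOAP library against the SSRS ReportExecution2005 endpoint. The server URL, parameter name and values are assumptions, and the LoadReport/ExecutionHeader session plumbing that a real call needs is omitted.
    Example:
    # Sketch only: one ParameterValue per value, all sharing the same Name.
    # Endpoint, parameter name and values are assumptions.
    from zeep import Client

    client = Client("http://reportserver/ReportExecution2005.asmx?wsdl")
    ParameterValue = client.get_type("ns0:ParameterValue")

    values = ["East", "West", "North"]          # the multi-value selections
    params = [ParameterValue(Name="Region", Value=v) for v in values]

    # A real call must first LoadReport and attach its ExecutionHeader;
    # this line only shows where the ParameterValue array is passed.
    client.service.SetExecutionParameters(Parameters={"ParameterValue": params},
                                          ParameterLanguage="en-us")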

  • Error - Generating WebI report with Multi-Source universe (.unx)

    Hello,
    While creating a WebI report from a multi-source universe (.unx), built using a relational connection with the SAP BW JCo connector, the following JCo exception crops up:
    Database error: [Data Federator Driver] [Server] [Connector 'Test_RCN_01'] SAP NetWeaver BW has reported an exception: com.sap.conn.jco.JCoException: (101) JCO_ERROR_CONFIGURATION: Server configuration for DF_SERV-bitest-3300-NOSNC is already used for a running server. (IES 10901) (WIS 10901)
    Steps followed for multi-source universe creation:
    1. Creating the relational connection (passed)
    2. Creating a data foundation using this connection (passed)
    3. Mapping two parameters (programIDMapping, hostname) to the RFC destination and BW FQDN (passed)
    4. Creating a business layer for this data foundation (passed)
    5. Opening Web Intelligence with SSO; while creating the report using the .unx, the above error appears
    SAP note 1638647 has already been implemented.
    System used:
    SAP BW 7.1 with EHP 1 and SP 10
    SAP BI 4.0 SP2
    Data federation service admin tool version 14.0.2
    thanks,
    Rajib

    Hi,
    try this:
    Starting the Gateway Service on the Message Server has resolved this issue.
    There have also been reports of this issue when name resolution problems occur between the client and the Message Server machine. As a test, you can add the Message Server name and IP address to the Windows\System32\Drivers\etc\hosts file.
    Or check SAP note 1590518.
    Thanks,
    Amit

  • Not able to access the multi-source universe in WebI

    Hi
    I am not able to access the multi-source universe in WebI; I am getting the error message below.
    [Data Federator Driver] Unexpected exception: com.crystaldecisions.thirdparty.org.omg.CORBA.UNKNOWN: null | [Data Federator Driver] Failed to connect to any of the provided 'Central Management Server' hosts.
    I am also not able to do anything towards designing the multi-source universe in the business layer.
    The universe back end is Oracle 11g and SQL Server 2008.
    IDT version: 4.1 Support Pack 2
    SAP BusinessObjects BI Platform 4.1 Support Pack 2

    Hi Sreeni,
    You can create a new APS in the CMC containing the Data Federation Service, with -Xmx set between 2g and 8g (the suggested range).
    Make sure you remove this service from the existing APS and then create the new one.
    You can refer to SAP KBA 1694041 - BI 4.x Consulting: How to size the Adaptive Processing Server (APS), which will assist you in sizing the APS.
    Regards,
    Manpreet

  • How to handle duplicate Primary Key entries in the Source data

    This is my first experience with ODI.
    I receive Source data from the customer that includes a one letter designation, ACTION_CODE, in each record of data as to the disposition of the record:
    ‘R’ represents Re-issue in which case I’m to modify the corresponding Target record based on the Primary Key.
    ‘N’ represents an Insert in which case I’m to insert a new record into the Target.
    ‘D’ represents a delete in which case I’m to delete the record with the corresponding Primary Key from the Target.
    The Source data comes in an XML file and the Target is an Oracle DB.
    I have chosen the IKM Oracle Incremental Update (MERGE) Knowledge Module.
    I filter ACTION_CODE to collect just the records that are ‘N’ or ‘R’, and I exclude ACTION_CODE from the mapping, but since the same source set may contain both an ‘N’ and an ‘R’ with the same primary key, I receive primary key errors.
    Should I alter the CKM to not check for duplicates in the source?
    Is there a better way?
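
    One way to sidestep those collisions, sketched here in Python rather than ODI: collapse the source to a single surviving action per primary key before the merge, so an ‘N’ followed by an ‘R’ for the same key yields one row. The column names and the last-action-wins rule are assumptions for illustration.
    Example:
    # Collapse duplicate actions per primary key before merging (sketch only;
    # column names and the last-action-wins rule are assumptions).
    import pandas as pd

    src = pd.DataFrame({
        "ORDER_ID":    [1, 1, 2, 3],
        "ACTION_CODE": ["N", "R", "N", "D"],
        "AMOUNT":      [100, 120, 50, 75],
    })

    merge_input = (
        src[src["ACTION_CODE"].isin(["N", "R"])]      # deletes handled separately
           .drop_duplicates(subset="ORDER_ID", keep="last")
           .drop(columns="ACTION_CODE")               # excluded from the mapping
    )
    print(merge_input)                                # one row per ORDER_ID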

    Ganesh,
    Identifying duplicates is a logical activity; more or less it needs manual intervention to judge whether two records are the same. A few unique parameters such as telephone, pincode, SSN, passport number, etc. can be used as search filters for the records. Currently there is no automatic method to identify duplicates. MDM 5.5 SP04, which is the next release, will include an automatic de-duplication facility based on thresholds and matching criteria that you set up.
    I hope I have answered your query transparently. If you have any further queries, you can reply here.
    Regards
    Veera

  • Multi Source execution plan build issue

    Hi,
    I am trying to create/build a multi-source (homogeneous) execution plan in DAC from 2 containers for 2 of the same subject areas (Financial Payables) from 2 different EBS sources. I successfully built each individual subject area in each container and ran an execution plan for each individually, and it works fine. Now I am trying to create 1 execution plan based on the DAC steps. So far I have done the following:
    - Configured all items for both subject areas in the DAC (tables, tasks, task groups, indexes, connections, Informatica logical and physical folders, etc.)
    - Assigned EBS system A a different priority than EBS system B under Setup -> Physical Data Sources
    - I noticed that I have to change the physical folder priorities for each Informatica folder (SDE for container A versus SDE for container B); I assigned system A a higher priority
    - Built the execution plan
    I am able to build the execution plan successfully, but I have the following issues/questions:
    1) I assumed that by doing the steps above it would ONLY execute the extract (SDE) for BOTH system containers but have only ONE load (SIL and PLP). I do see that the SDEs are OK, but I see the SIL for Insert Row in Run Table running for BOTH. Why is this? When I run the EP, I get a unique constraint index error, since it is entering two records for each system. Isn't the DAC supposed to include only one instance of this task?
    2) When I build the execution plan, it is including the SILOS and PLP tasks from BOTH source system containers (SILOS and PLP folders exist in both containers). Why is this? I thought that there is only one set of LOAD tasks and only the SDEs are run for each (as this is a homogeneous load).
    3) What exactly does the physical folder priority do? How is this different from the source system priority? When I have a multi-source execution plan, do I need to assign physical folder priorities to just the SDE folders?
    4) When we run a multi-source execution plan, after the first full load, can we somehow allow incremental loads only from the container A subject area? Basically, I don't want to load incrementally from source system A after the first full load.
    5) Do I have to set a DELAY? In my case, my systems are both in the same time zone, so I assume I can leave this DELAY option blank. Is that correct?
    Thanks in advance
    Thanks in advance
    Edited by: 848613 on May 26, 2011 7:32 AM
    Edited by: 848613 on May 26, 2011 12:24 PM

    Hi
    you have 2 sources (Ora11510 and OraR1211), so you will have 2 DAC containers.
    You need the mandatory changes below for your issue:
    +++++++++++++++++++++++++++++++++
    Message: Database errors occurred:
    ORA-00001: unique constraint (XXDBD_OBAW.W_ETL_RUN_S_U2) violated while inserting into W_ETL_RUN_S
    You need to Inactivate 2 tasks in R12 container.
    #1 Load Row into Run Table
    #2 Update Row into Run Table
    +++++++++++++++++++++++++++++++++
    There are other tasks that have to be executed only once
    (i.e., inactivate the ones below in one of the containers):
    SIL_TimeOfDayDimension
    SIL_DayDimension_GenerateSeed
    SIL_DayDimension_CleanSeed
    SIL_TimeOfDayDimension
    SIL_CurrencyTypes
    SIL_Stage_GroupAccountNumberDimension_FinStatementItem
    SIL_ListOfValuesGeneral_Unspecified
    PLP_StatusDimension_Load_StaticValues
    SIL_TimeDimension_CalConfig
    SIL_GlobalCurrencyGeneral_Update <don't inactivate this> <check for any issues while running>
    Update Parameters <don't inactivate this> <check for any issues while running>
    +++++++++++++++++++++++++++++++++++
    Task: SDE_ORA_EmployeeDimension_Addresses
    Unique index failure on "W_EMP_D_ADR_TMP_U1"
    As you are loading from 11.5.10 & R12, for certain data which is common across the systems the ETL index creation fails.
    Customize the index creation in DAC with another unique column (data_source_num_id).
    ++++++++++++++++++++++++++++++++++++
    Task: SDE_ORA_GeoCountryDimension
    Unique index failure on "W_GEO_COUNTRY_DS_P1". As you are loading from 11.5.10 & R12, for certain data which is common across the systems the ETL index creation fails.
    Option 1) Customize the index creation in DAC with another unique column (data_source_num_id).
    ++++++++++++++++++++++++++++++++++
    These changes are mandatory.
    Regards,
    Kumar

  • Using SQL Server credentials with Secure Store Target Application for Data Connection in Dashboard Designer

    [Using SharePoint 2013 Enterprise SP1]
    I would like to use SQL Server credentials in a Secure Store Target Application, and
    this page makes it look like it's possible but when I attempt to use the new Target Application ID as authentication for a Data Connection in Dashboard Designer, I get a generic "Unable to access data source" with no error logged in SQL Server
    logs.
    I am able to use a Target Application with AD credentials to access the SQL db without a problem. Suggestions?

    Hi,
    1. Make sure that the credentials are set for the Secure Store Target Application: navigate to Central Administration, click Application Management, then Manage Service Applications, then the Secure Store Service application. Select the application ID and, from the ECB menu, click Set Credentials. Enter the Credential Owner, the Windows User Name and the Windows Password.
    2. Make sure that in Dashboard Designer "Use a stored account" is selected under "Authentication" and the proper application ID is specified.
    Please refer to the link below for more information:
    http://www.c-sharpcorner.com/Blogs/14527/unable-to-access-data-source-the-secure-store-target-applic.aspx
    Regards,
    Rebecca Tu
    TechNet Community Support

  • I need a sample excel data for practicing Dashboards

    Hi Experts,
    I am new to SAP BO Dashboards. I need sample Excel data for practicing and designing dashboards. Kindly help me get sample Excel files from anywhere; please suggest where I can find them.
    Regards
    Abhi

    Hi,
    Open the samples in the dashboard tool, which come with Excel source data; or search on Google and you will find dashboard files with sample Excel data; or create your own sample data and use it in a dashboard for learning.
    Amit
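
    Along the lines of that last suggestion, here is a small Python sketch that generates practice data; the schema and value lists are invented, and it assumes pandas plus openpyxl are installed for the Excel write.
    Example:
    # Generate a small practice dataset: one row per business unit / country /
    # project combination, with a random revenue figure. The schema is invented.
    import itertools
    import random

    import pandas as pd

    units = ["Retail", "Wholesale"]
    countries = ["US", "DE", "IN"]
    projects = ["Alpha", "Beta"]

    rows = [
        {"BusinessUnit": u, "Country": c, "Project": p,
         "Revenue": random.randint(10_000, 99_000)}
        for u, c, p in itertools.product(units, countries, projects)
    ]

    pd.DataFrame(rows).to_excel("sample_dashboard_data.xlsx", index=False)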

  • When to upgrade DAC to 10.1.3.4 during new multi-source OBIA implementation

    I want to take advantage of the multi-source capabilities of DAC 10.1.3.4 on OBIA 7.9.5 as I need to bring in data from several production EBS instances. Do I need to create the Data Warehouse tables via DAC 7.9.5 or upgrade the Windows DAC client and Linux DAC server right after installation and proceed with the rest of the OBIA Installation and Configuration steps from Chapter 4? Also, given a Linux Informatica 8.1.1 Server, should the DAC client's INFA_DOMAINS_FILE point to the Linux domains.infa file instead of the dummy Windows one installed to create the pmcmd.exe executable?
    Thanks.

    You can only use DAC 10.1.3.4 against an instance that is fully configured and running. This install is primarily intended to improve the ease with which you stand up additional Admin users. You must complete all of the steps first before you can install the standalone client (1. install on Windows; 2. migrate to Linux; 3. run the Warehouse tool from the original Windows install, which is the preferred method).
    INFA_DOMAINS_FILE is only needed on the server instance. The client tools do not need it to communicate with the Services.
    Just to let you know you will not be able to connect to the Informatica 8.1.1 SP4 Server if it was not configured with a Fully Qualified Domain Name.
    The following note from My.Informatica.com (ID: 18672) provides the resolution. Disregard the 8.1.0 specifics; it will work for 8.1.1.
    Good luck,
    Damon
    Unable to use fully qualified server name to connect to domain using the PowerCenter 8.1 client
    Problem Description
    During the configuration of the domain in the PowerCenter 8.1 clients (Designer, Workflow Manager, etc.), the value for the gateway host is automatically reverted to the simple hostname. This can cause problems in a networked environment, like a WAN, where the fully qualified server name is needed (for example, server123 is used instead of server123.europe.informatica.com).
    The following error may also occur:
    Unable to get repositories for domain xxxxx
    Error: [PCSF_46008] cannot connect to domain [xxxxx] to look up service [coreservice][DomainConfigurationService]
    Solution
    If there is only one client, changing the hosts file on the client machine is sufficient. On the client machine, add the required line to the file C:\WINNT\system32\drivers\etc\hosts.
    Example:
    10.0.0.0 server123.europe.informatica.com
    To make a system-wide change, you need to modify the domain information in the following way:
    1. Go to \informatica\pc810\server.
    2. Run the infasetup backupdomain command.
    Example:
    infasetup backupdomain -da host:port -du user -dp password -dt Oracle -ds SID -bf domain.xml -dn domain
    Successfully backed up domain [domain] to file [domain.xml]
    3. Edit the XML file and replace the existing <host> tag.
    Example:
    Change <host>server123</host> to <host>server123.europe.informatica.com</host>.
    4. Save the XML file and restore it into the domain with the command "infasetup restoredomain".
    Example:
    infasetup restoredomain -da host:port -du user -dp password -dt Oracle -ds SID -bf domain.xml -force
    Successfully restored domain [domain] from file [domain.xml].
    5. Verify that the fully qualified hostname is listed in the file nodemeta.xml in \pc810\server\config.
    6. Restart the Informatica services with the new server name.
    Applies To
    PowerCenter 8.1
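
    If several domain backups need the same fix, step 3 above can be scripted. A hedged Python sketch, using the example file and host names from the note rather than real settings:
    Example:
    # Rewrite the <host> tag in the domain.xml produced by "infasetup
    # backupdomain" (step 3 above), then restore it with step 4.
    import re
    from pathlib import Path

    path = Path("domain.xml")
    xml = path.read_text(encoding="utf-8")

    xml = re.sub(r"<host>server123</host>",
                 "<host>server123.europe.informatica.com</host>",
                 xml)

    path.write_text(xml, encoding="utf-8")
    print("Updated <host>; now run infasetup restoredomain.")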

  • Accordion Source Data Limit

    I have built a dashboard, which is based on 50 sites, with about 40 lines per site (all up about 2000 lines), and I use an accordion menu as my selector.
    Everything works great until the accordion's source data range goes over 960 lines. When this happens, I cannot select a component and view its properties. The only component whose properties I can see is the canvas. If I right-click on a component, all the options are greyed out.
    I am using Xcelsius 2008.

    Have you installed Xcelsius SP1?
    Find it here:
    Announcement - Xcelsius 2008 SP1 NOW AVAILABLE!
    Good luck

  • Creation of DME medium FZ205 There is no source data found

    We are executing payment runs using F110 and then creating data medium - a file to send to the bank.
    In the variant for the program I am putting C:\; however, when several users execute payment runs at the same time, the data medium is not created and I get the error message that the source data cannot be found.
    Can anyone help me with this issue? Should I leave the file name blank?
    Thanks
    Liz

    Hello,
    In order to avoid FZ205 please review your selection parameters and F1 help for the print program when creating the file:
    1. If you are taking the Output to file system:
    If required, the file can be written to the file system. The created file can be copied to a PC using data medium exchange management. You should be looking for downloaded files here, since the data carrier is not managed within the SAP system, but is already stored in the file system by the payment medium program. The file name should be defined by the user. You should make sure that existing files with the same name have already been processed, because they will be overwritten.
    Note: if a file cannot be found using the data medium exchange management, the reason could be that the directory that was written to at the start of the payment medium program (in background processing, for example) cannot be read online.
    You should then select a directory which can be recorded and read by several different computers. Due to the problems described above and the resulting lack of data security, we advise against writing to the file system. This method is only beneficial if the data carrier file is taken from the file system by an external program, to be transferred to the bank.
    2. If you are taking Output into TemSe:
    If required, the file created can be stored within the SAP System(store in the TemSe and not in the file system),thus protecting it from unauthorized external access. You can download the file into the user's file system via the DME manager. The name of the file to be created during the download can be determined when running the payment medium program: the contents of the
    file name parameter are stored in the management data and defaulted when running the download.
    Please check the corresponding files in the DME administration and check whether the output medium 'file system' has been chosen, that is, output medium '0'. In order to use the TemSe you have to use output medium '1'. Furthermore, check whether PC file paths, like c:\filename.DAT, were used instead of application file names; FDTA has difficulties finding these files, especially when 2 application servers are in use.
    To avoid problems with the files, SAP recommends using the TemSe with output medium '1', or the file system with output medium '0'. TemSe is always the better option.
    I hope this helps.
    Best regards,
    Suresh Jayanthi.

  • [APEX 3] Requested source data of the report has been modified

    Hello APEX-Friends,
    I have a common problem, but the situation is a bit different here. Many of you might know the "invalid set of rows requested, the source data of the report has been modified" problem. Often it occurs on submit. That means you have a report, you select rows, you do things, you submit the page and everything blows up.
    This is because you enter some values into fields the report depends on, and so you modify your report parameters and the source data changes.
    But:
    In my case I have a dynamically created report that blows up before any submit occurs or any value changes.
    My query is a union of two selects. Both query different views. Those views use a date field as parameter and some compare functions.
    I read the field with a V-function I wrapped around the APEX V function, declared as deterministic. My date compare function is also declared deterministic (I doubt this makes any difference, as it might only be important for the optimizer, but as long as I don't know exactly what APEX evaluates, I play it safe).
    I ensured, that the date field is set by default with the current date (and that works, because my interactive report initially displays correct data from the current date).
    So everything is deterministic and the query must return the same results on subsequent calls, but APEX still throws this "source data has changed" error, and I am 99.99% sure that this cannot be true.
    And now the awesome thing about this:
    If I change the value of the date field, a JavaScript performs a submit. The page is reloaded (without resetting pagination!) and everything works fine. I can leave the page, re-enter, do things; everything works well.
    But if I log into the application, move directly to the corrupted report and try to use the pagination without editing fields or submitting the page, the error occurs.
    Do you have any idea what's happening there? I could try to work around this by submitting the page the first time it is entered, to trigger this "mystery submit" that gets everything working. But I would like to understand this issue and have a clean solution.
    Thanks in advance,
    Mike aka UniversE

    Okay, I found a solution, but I do not understand it; it might be a design flaw in APEX.
    I mentioned the date field that is used in the query. I also mentioned that it is set with the current date by default. I did not mention how.
    There are some possibilities in APEX to do so.
    1. Default-Setting in the element properties
    2. Static assignment if no value is in session cache
    3. Computation before header
    I did the first and second.
    BUT:
    An interactive report seems to work as follows. A query is executed to get all rows of the report. Then a second query is executed to get the rows that shall be displayed. And the order is screwed up, I think.
    1. The first report query to get all rows
    2. The elements are loaded and set to default values
    3. The second report query to get the display rows
    And that's the reason why nothing worked. Since I added a computation before header, the date field is set before the report queries are executed, and everything works fine now.
    But I think it's a design flaw. Either both queries should be executed before regions or afterwards, but not split, as field values might change when elements are loaded.
    Greetings,
    UniversE

  • The quest for a good multi-source video workflow

    I frequently have to mix together several sources, and the Multi-Camera Source Sequence in Premiere Pro should in theory be the best way to go.
    However, it's seriously underdeveloped and for a single camera with separate audio, I end up doing a sequence instead.
    Mix and scrub audio in Audition, tag and comment source video in Prelude, import both to Premiere Pro and put them together on a sequence so I can use that sequence as a source for other sequences.
    Why don't I use a multi-camera sequence? Because you have to tag synchronization points before you put everything together, because audio and video are often recorded out of sync, with several video recordings per audio recording, etcetera. It ends up a mess. We don't live in an ideal world, and not wanting to waste time (and make the talent lose focus) organizing things, recording happens pretty much on the fly, with all that entails in differing methods, ideals and so on. We need to know how to cut resources (and this includes not using the most expensive editing tools), but there's always room for improvement.
    I like to slap everything loosely together post-production and fine-sync on audio waves on the run, but this means being able to adjust the multi-camera source sequence after you have put it together.
    This isn't possible in Premiere Pro, at least not without putting it into a sequence first, and that's the next step you want to take after all the synchronizing is done, not something you want to do while cutting up the sources in a sequence.
    I believe most of what I need for this is already programmed; it's mostly a matter of politics. Just please understand, synchronizing and preparing sources is different from working with a sequence, but it can be a substantial piece of work on its own, and it's not enough to just put it together in a menu and think that's all fine. (And not by far at that.)
    What I need is to be able to edit a multi-camera source sequence on a timeline like I would edit any other sequence. And call it a multi-source sequence; it isn't just video.
    Then audio from different sources can be mixed together on the timeline like you do in a sequence, and exported to Audition as a multi-track session for more advanced audio processing. (And no, single tracks aren't enough; I often need the real-time effects in Audition, not just the processing of whole tracks.) And I mix together several audio sources, both mono and stereo tracks, some from cameras but mostly from a separate 24-bit audio recorder with inputs from hand or tie mics, environment, and perhaps a gun with a boom. (I'm the audio guy, with perhaps a camera on the side; others handle the main camera(s).)
    And video from several cameras can be put together, one track per camera on a timeline, and not just one clip per track: one camera can have several clips along a timeline.
    Then, when all this is put together it can be used as a source, like we use a multi-camera sequence today.
    I believe it's possible, and it would increase the value of your offering. If this isn't done, I may end up looking for other software.

  • User View is not reflecting the source data - Transparent Partition

    We have transparent partition cubes. We recently added new fiscal year details to the cubes (the user view as well as the source data cube). We loaded the data into the source data cube. From the user view, we tried to retrieve data and it shows up as 0's, but the data is available in the source data cube. Could anyone please provide information on what the issue might be?
    Thanks!

    Hi,
    If you haven't added the new member in the partition area, then Madhvaneni's advice is the one you should follow, because if you haven't added the member, the target can't read the source.
    If you have already added the new member in the partition area and the data still won't show up, it is sometimes worth re-saving the partition and seeing the outcome.
    -Will

  • How to deal with such Unicode source data in BI 7.0?

    I encountered an error when activating DSO data. It turned out that the source data is Unicode in the HTML representation style. For example, the source character string is:
    ABCDEFG& #65288;XYZ (I added a space between & and # so that it won't be interpreted as Unicode on SDN by the web browser)
    After some analysis, I see it's actually the Unicode string
    ABCDEFG&#65288;XYZ
    Please notice the wide left parenthesis. It's the actual character from the HTML &#xxx; style above. To compare, here is the Unicode parenthesis '&#65288;' and here is the ASCII one '('. You see they are different.
    My question is: as I have trouble loading the &#... string, I think I should translate the string to the actual Unicode character (like '&#65288;' in this case). But how can I achieve this?
    Thanks!
    Message was edited by:
            Tom Jerry

    I found out this is called a "numeric character reference", or NCR, in HTML terms. So the question is how to convert a string in NCR form back to Unicode. Thanks.
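
    For what it's worth, in Python this conversion is a standard-library one-liner; the snippet below only illustrates the NCR-to-Unicode step (the BW load itself would still need an equivalent routine on its side).
    Example:
    # Convert HTML numeric character references (NCRs) back to the real
    # Unicode characters, e.g. the wide left parenthesis U+FF08.
    import html

    s = "ABCDEFG&#65288;XYZ"
    print(html.unescape(s))   # prints ABCDEFG（XYZ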
