'ALL' variable value not bringing data for All Geo Units

Hi everyone!
In BEx 7.0 we can't use * as we did in 3.x to select all variable values.
When running a query in BEx 7.0 and selecting a variable value for Geographic Unit, I see the value 'ALL' in the single-value list. I select this value, but I get no data when I use it.
What checks can I do to confirm this is working properly?
Is this expected to work the same way the * did in 3.x?
Any help provided will be greatly appreciated.
Thanks!

Hi
Have you checked the master data for the geo objects?
Have you checked the options at cube/MultiProvider level for reading data from master data vs. transactional data?
Hope this helps.
Regards
Daya Sagar

Similar Messages

  • Dbx collector not collecting data for all threads

    Hi, I am having a problem with the dbx collector when I collect data from a multithreaded application. Data seems to be collected only for the first few threads. Sun Studio 11
    (dbx) attach -p 5501 my_proc
    (dbx) threads
    > t@2 a l@2 SI_SigThr() running in ___sigtimedwait()
    t@3 a l@3 SF_autoReInit() running in ___nanosleep()
    t@4 b l@4 SF_sendHeartBeats() running in ___nanosleep()
    t@5 a l@5 SF_Main() running in ___nanosleep()
    t@6 a l@6 MSI_TimerThread() running in ___nanosleep()
    t@7 b l@7 HbtMonitor() running in ___nanosleep()
    t@20 b l@20 umem_update_thread() sleep on 0xfe803ea8 in __lwp_park()
    t@22 a l@22 FXI_SorterSpooler() sleep on 0xbd418 in __lwp_park()
    t@23 a l@23 XCI_ReaderThread() running in __pollsys()
    t@24 a l@24 MSI_ListenerThread() running in soaccept()
    t@25 a l@25 MSI_ListenerThread() running in soaccept()
    ... and so on.... (several hundred threads)
    t@33 a l@33 ReconnectingThread() sleep on 0xc55f8 in __lwp_park()
    ... and so on.... (several hundred threads)
    (dbx) collector limit unlimited
    (dbx) collector enable
    (dbx) cont
    ^C
    (dbx) collector show
    collector enable
    collector profile timer 10.007
    collector profile on
    collector synctrace threshold calibrate
    collector synctrace off
    collector hwprofile counter insts
    collector hwprofile off
    collector heaptrace off
    collector mpitrace off
    collector store directory "."
    collector store experiment "test.3.er"
    collector sample periodic
    collector dbxsample on
    collector sample period 1
    collector limit none
    collector archive on
    (dbx) collector disable
    When viewing the experiment, only t@2, t@3, t@4 and t@33 are seen!? What is going on?

    I had a few other questions in my earlier post.
    During the time that data collection was running, how active (w.r.t. CPU time) should those threads be? When you run collect (instead of dbx) and look at the timeline, how much activity do you see on the 'missing' threads during the similar segment (and loading) of the program's operation? (I believe the collector won't record data if a thread does not get sufficient CPU time during the data collection period.)
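    For comparison, a minimal sketch of collecting the same run directly with collect rather than attaching through dbx (the experiment name and program arguments are placeholders, and the flags assume a recent Sun Studio collect):
    $ collect -p on -o my_proc_test.er ./my_proc <args>    # clock profiling on, named experiment
    $ er_print -functions my_proc_test.er                  # quick text summary of where the time went
    If the 'missing' threads show up there, the issue would seem specific to attaching via dbx rather than to the collector itself.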
    Also, please post which OS version you are using and which version of collect & dbx. Thanks!

  • BEx not refreshing data on all embedded queries: one variable, one variant

    Good day
    I have the following challenge. I have created a BEx workbook and inserted one query multiple times into this workbook. I have moved the result area of each query to a separate sheet to enable the use of different filter values and to apply some Excel formatting to change fonts, colours etc.
    I have created a 0CALDAY variable selection for this query. The data must be displayed for the last day of each month, for a period of 12 months from a certain 0CALDAY. I created and saved a variant for this, adding the last 0CALDAY of each month from 2009/05/31 to 2010/04/30 in the selection option of the variable, so it looks like this in the variable selection of single values:
    2009/05/31
    2009/06/30
    2009/07/31
    Etc
    Etc
    I have ticked the option "refresh queries when opening workbook" in the properties of the query and applied it to all. When the users open the workbook, they must select the variant on the variable selection screen and refresh all the queries.
    Now, my challenge is that only one query is refreshed with the required dates. All the other results cannot be found, and they display the last date in the variant (i.e. 2009/04/30). You need to go and refresh each query separately, using the same variant. This is time consuming, as some workbooks have > 50 queries.
    Thus, one query embedded multiple times, one variable, one variant...
    Why does the variant not apply to all the embedded queries when data is refreshed?
    Your expert assistance will be appreciated.
    Have a nice day
    Cj

    Hi,
    When you have multiple queries with variables in the workbook, refreshing the workbook will prompt you with the variable screen, where you can enter the values that take effect in all the queries.
    Go to Workbook Settings -> General tab -> check "Refresh Workbook on Open" -> go to the Variables tab -> check "Process Variables on Refresh" and "Display Duplicate Variables Only Once". Save and reopen the workbook.
    Now it will prompt with the variable screen, where you can enter the variable values that are common to all the queries.
    The data will be shown in all the analysis items based on the variables entered for the queries.
    Rgds,
    Murali
    Edited by: MuraliKrishnan E on Sep 10, 2009 2:21 PM

  • Background job not updating data for some steps of program

    We have created a custom program that submits the standard program RPRCCC_READ_KR1025,
    which is used for conversion of AMEX credit card files.
    Since this program can only update data for one company ID at a time, we have created a background job
    for that program with 10 steps and 10 variants of data related to 10 different company codes.
    When this job is executed, it does not update data for all 10 files; sometimes it updates only 5, 7, etc.
    If we manually execute the files that did not update in the job, they get updated.
    All the files executed with that program can be seen in PRTA transaction.
    Regards,
    Gautham.
    Edit: Solved.
    Since the standard program updates the database each time with COMMIT WORK,
    there should be some time before the next database update, so we can use the WAIT command
    after the SUBMIT statement, which does the trick (see the sketch below).
    Edited by: Gautham Vangaveti on Aug 5, 2010 7:20 PM
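    A minimal sketch of that idea in ABAP (the variant names and the 5-second pause are assumptions; the real job would use one variant per company code):
    TYPES ty_variant TYPE c LENGTH 14.
    DATA: lt_variants TYPE STANDARD TABLE OF ty_variant,
          lv_variant  TYPE ty_variant.
    " hypothetical variant names, one per company code
    APPEND 'COMP_0001' TO lt_variants.
    APPEND 'COMP_0002' TO lt_variants.
    LOOP AT lt_variants INTO lv_variant.
      " run the standard conversion program with this variant and come back
      SUBMIT rprccc_read_kr1025 USING SELECTION-SET lv_variant AND RETURN.
      " give the COMMIT WORK in the submitted program time to complete
      WAIT UP TO 5 SECONDS.
    ENDLOOP.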

    Hi,
    I mean B (Sub Assembly)
    Item      Rollup Flag
    A         No
    --B       Yes
    ---B1     No
    ---B2     No
    ---B3     No
    --C       Yes
    In this case, when we run a cost rollup for item A, Oracle will consider the structure below.
    A
    --C
    Regards,
    Sarvottam.

  • Why are files in Lightroom mobile and files in the Creative Cloud not the same? Idea: one place (stored data) for all applications?

    Why are files in Lightroom mobile and files in the Creative Cloud not the same? Idea: one place (stored data) for all applications?

    Lightroom Mobile is not a cloud storage service. You shouldn't treat it as a way of backing up your files. You are merely storing Smart Previews of your files in the cloud space, high-quality JPEGs of your files regardless of their original format on your desktop. The point is that they are there so you can continue editing them in a Lightroom-like environment even while away from your desktop/laptop computer. The files in Lightroom Mobile can only be used in Lightroom Mobile.
    This is very different from what is offered by the Creative Cloud storage, which can be used to synchronize your files between any device (that can support the individual files).
    The vast difference in how each service works and its intended use is why they are separate.

  • Difference between Due Date for All Tasks and Duration per task in an approval workflow

    Hello,
    I'm starting to read up and learn about approval workflows in Designer 2010. I am confused by the fields "Due Date for all Tasks" and "Duration per task". I understand that the Due Date for all Tasks is the date by which all
    of the tasks in the workflow must be completed. And I understand that the duration means each task has x number of days to complete.
    But I don't know why we need them both. If I have the task duration set to two days and there are 3 tasks involved, then wouldn't the due date be 6 days from the start? Why would I have a due date for all tasks that is completely different from
    the sum of the durations of the individual tasks?
    Please help shed some light!

    Hi Michelle,
    Due date for all tasks: this setting specifies the date by which all tasks are due. It is used for both serial and parallel
    tasks.
    Duration per task: this setting specifies the date by which an individual
    task is due; it is used for serial workflow participants.
    Note: when you create an approval workflow and select the task process participants (the approvers in the out-of-the-box approval workflow), you can also
    choose one at a time (serial) or all at once (parallel).
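    As a worked example (the numbers are hypothetical): with three approvers and Duration per task set to 2 days, a serial route needs roughly 3 x 2 = 6 days, while a parallel route needs only about 2 days. An independent Due date for all tasks therefore lets you cap the whole process (say at 5 days) regardless of which routing is chosen.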
    For more information, see
    http://office.microsoft.com/en-us/sharepoint-server-help/use-an-approval-workflow-HA101793831.aspx
    Best Regards.
    Kelly Chen
    TechNet Community Support

  • In MDX how to get the max date for all employees - is it possible? Shall we use GROUP BY in MDX?

    In MDX, how do I get the max date for all employees? Is it possible? Should we use GROUP BY in MDX?
    Example:
    empno  ename  date
    1      hari   12-01-1982
    1      hari   13-06-2000
    Using the above data, I want to get the max date.

    Hi Hari3109,
    According to your description, you want to get the max date for the employees, right?
    In your scenario, do you want to get the max date for all the employees or for each employee? In MDX, we have the Max function to achieve your requirement. You can refer to Naveen's link or the link below to see the details.
    http://www.sqldbpros.com/2013/08/get-the-max-date-from-a-cube-using-mdx/
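    For illustration only, a minimal sketch of that approach (the cube, dimension and measure names here are made up, and a numeric [Measures].[Date Key] holding the date, e.g. as YYYYMMDD, is assumed):
    WITH MEMBER [Measures].[Max Date Key] AS
        Max( [Date].[Date].[Date].Members, [Measures].[Date Key] )
    SELECT { [Measures].[Max Date Key] } ON COLUMNS,
           [Employee].[Employee].[Employee].Members ON ROWS
    FROM [MyCube]
    For each employee row, Max evaluates the expression over the date members and ignores empty cells, so it returns the latest date key for which that employee has data.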
    If this is not what you want, please provide us more information about the structure of your cube, so that we can make further analysis.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Save persistent data for all users

    Hi
    The persistent data is saved for one specific user, the user logged in to the system (e.g. Windows).
    Is it possible to save persistent data for ALL users?
    If yes how?
    Thanks
    Hans

    Hi Hans
    IMO it might be a bit tricky. As you know, persistence objects are stored in databases. I presume you're not asking about the document database. The second database is stored in the SavedData file, which is located in the per-user data directory, i.e. on Mac in ~/Library/Caches/Adobe Indesign/Version 6.0/en_US/Indesign SavedData
    You probably need to create your own implementation of persistence, or take a look at the relationship between the file system and the IDataBase interface.
    Regards
    Bartek

  • I am trying to confirm the time at which a particular backup occurred using an external hard drive that I have since misplaced. Although time machine lists the backup times for some backups, it does not list them for all. Can someone help?

    I am trying to confirm the time at which a particular backup occurred using an external hard drive that I have since misplaced. Although time machine lists the backup times for some backups, it does not list them for all, including the one in question. Can someone help?

    Okay, I've sussed it.  I saw a link to another discussion that hadn't turned up in any of my searches before I posted the question above.
    My Tosh HDD was formatted to FAT 32 and therefore would not be backed up by Time Machine. I removed all data from it - a nightmare, as I hadn't enough space to put it on the MacBook HDD, but, after trawling through numerous photos and deleting rubbish for hours, I finally made enough space and copied all the data across. Then, after clearing the data from the Tosh HDD, I used Disk Utility to reformat it to Mac OS Extended. I chose the option without journalling because I'm not backing up onto this disk and so don't need it - according to some other discussions I found.
    When it was done I checked Time Machine in System Preferences and, once I selected options, I was able to remove it from the excluded-from-backups list. Hey presto. Now I'm moving my photo library and my music library onto the Tosh HDD and my MacBook should start to breathe again. Hurray.

  • Tlanes not being created for all locations for one product

    Hello Experts - I need some advice on the following situation:
    We have a set of products where tlanes are only being created for 3 locations out of many. What would cause the tlanes not to be generated for all locations if all master data elements are present? We have a valid info record, MRP views, etc. I can't see anything different about the 3 locations that are working correctly.
    Any help would be much appreciated.
    Thanks!

    Thank you for your response.
    To answer your questions, yes, we have done all of these things and the tlanes are still not being created. We have been using SPP for 6 months and this is the first time we've seen this problem. Do you have any other suggestions, or is there any more information I can provide that would be helpful in solving this?
    Thanks in advance for your advice.
    AAF1999

  • HOW TO MIGRATE DUNNING LEVEL & LAST DUNNED DATE FOR ALL OPEN LINE ITEMS

    Hi,
    We need to upload the dunning level and last dunned date for all open items (to be done separately, i.e. after all open items are migrated). The values are different for each line item, so if we use a mass update (FBL5N) we would have to process each open line item individually, which is not feasible since there are thousands of open line items. I also tried LSMW; the problem is that for a batch input recording we need to run one complete cycle in SM30, but there is no maintenance dialog for the BSEG and BSID tables. Please share your valuable input on how we can proceed.
    Thanks in advance

    Hi Mark,
    Thanks for your reply.
    It is completely a Z-copy of a standard one, but we are finding that one or two records out of hundreds of customers in the worklist are not updating correctly.

  • Can we load data for all levels in ASO?

    Hi All,
    I'm creating a cube in ASO.
    Can I load data for all levels in ASO?
    We can load data for all levels in BSO, but for ASO I need confirmation.
    And one more question:
    What is the "consider all levels" option in ASO used for? What is its purpose?
    Can anyone help? It would be appreciated.
    Thanks

    In an ASO cube you can only load to level zero.
    The "consider all levels" option is used for aggregation hints. It allows you to tell the aggregation optimizer to look at all levels when deciding whether aggregations need to be built on the dimension.

  • Can we load data for all levels in ASO cube

    Hi All,
    Can we load data for all levels of members in an ASO cube in 9.3.1?
    Regards

    Yes, you can load data for all levels in an ASO cube in any version. HOWEVER, none of the upper-level data will be there when you look for it. You will get a warning message in the load, because ASO cubes don't store data at upper levels. It is the same as loading data into dynamic calc members in a BSO cube: it will do the load without complaints, but there will be no data there (at least you get the warning in ASO).

  • ORA-01008 All variables are not bound

    Hi, I am running this query and getting the exception ORA-01008: not all variables bound.
    Could anyone please provide some insight?
    SELECT EQMT_INGT_LOG_ID, EQMT_ID,
    XMLSerialize(DOCUMENT XMLType(ingLog.BUCK_SLIP_XML) AS CLOB) BUCK_SLIP_XML
    FROM TOS_EQMT_INGT_LOG ingLog
    where BUCK_SLIP_XML is not null and ingt_date between to_date(:fromDate, 'MM/DD/YYYY HH24:MI')
    and to_date(:toDate,'MM/DD/YYYY HH24:MI' )
    and eqmt_id in (select eqmt_id from tos_eqmt
    where eqmt_init = :eqmtInit and eqmt_nbr = :eqmtNbr
    and orig_loca_id in (select loca_id from tos_loca where altn_src_sys_stn_id = :circ7 ))
    and SCAC = :scac
    and STCC = :stcc
    and SHPR_NAME = :shprName
    and CNSE_NAME = :cnseName
    and driv_id in (select driv_id from tos_driv where lcns = :lcns )
    and driv_id in (select driv_id from tos_driv where sabv = :sabv )
    and ingt_stat_ind = :ingtStatInd
    and BUCK_SLIP_XML like :inspectedBy
    ORDER BY INGT_DATE
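    ORA-01008 usually means that at least one :name placeholder in the statement was never given a value by the calling code. As a rough sketch of the idea, assuming the query is run from Python with cx_Oracle (the connection string and bind values are placeholders), every placeholder must appear in the bind dictionary, even if its value is None:
    import cx_Oracle
    conn = cx_Oracle.connect("user/password@host:1521/service")  # placeholder credentials
    cur = conn.cursor()
    sql = "..."  # the SELECT statement shown above
    binds = {
        "fromDate": "01/01/2010 00:00", "toDate": "01/31/2010 23:59",
        "eqmtInit": "ABCD", "eqmtNbr": "1234", "circ7": "XYZ7",
        "scac": "SCAC", "stcc": "0000000",
        "shprName": "SHIPPER", "cnseName": "CONSIGNEE",
        "lcns": "LIC123", "sabv": "ST",
        "ingtStatInd": "A", "inspectedBy": "%INSPECTED BY%",
    }
    # every :name in the SQL must have a matching key here;
    # a missing key is exactly what raises ORA-01008
    cur.execute(sql, binds)
    Whatever client or framework actually runs the query, the check is the same: count the distinct :name placeholders and make sure each one is bound.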

    Slightly off-topic, but what do you think this does:
    XMLSerialize(DOCUMENT XMLType(ingLog.BUCK_SLIP_XML) AS CLOB) BUCK_SLIP_XML?

  • How can I keep Lion from generating .DS_Store files on Windows network partitions, but not disable it for all network partitions?

    How can I keep Lion from generating .DS_Store files on Windows network partitions, but not disable it for all network partitions? I am familiar with changing the setting for all network partitions (defaults write com.apple.desktopservices DSDontWriteNetworkStores true), but that is undesirable when I connect my laptop to my home network. A preferable solution would be one where I could control the writing of these files based on disk format (NTFS vs HFS+).

    Go to MacUpdate or CNET Downloads and search for ds_store. There are numerous utilities for preventing them from being transferred to Windows systems.
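    Failing a per-filesystem switch, a hedged workaround is to toggle the global setting already mentioned in the question: run the defaults write command before connecting to the Windows network, and remove the key again at home so Finder behaves normally there:
    # disable .DS_Store on all network volumes (the command from the question above)
    defaults write com.apple.desktopservices DSDontWriteNetworkStores true
    # later, at home, restore the default behaviour
    defaults delete com.apple.desktopservices DSDontWriteNetworkStores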

Maybe you are looking for

  • Lines on screen, can't reset w home and sleep button at same time

    Lines on screen, can't reset with the home and sleep buttons at the same time. Any advice? 4th gen. 32 GB touch

  • Junk Mail - Weird!

    Hi, I hope that someone can help. I am using mail 2.1, and have set it up with my .mac account and yahoo account, both POP. Emails are coming in, and being sent out fine, however, something odd has happened. On the junk folder, on the drop down where

  • How long to repair an Iphone 5 ?

    Hello, I live in France and, for a battery issue, I called AppleCare to repair my iPhone 5. (The 24th of December, and UPS took my iPhone on the 26th.) But they did not tell me how long the repair would take (including the time to arrive and to return of the

  • BASIC/LDAP Realm Authentication

    I am trying to protect access to my Web Application using BASIC Authentication based on an LDAPRealm that I have configured. I want all users that try to access anything in my Web App to have to log in first, based on their information in the LDAP se

  • Price in PR

    Dear Experts, why do we have to enter the price in the PR, and why is there no free goods indicator in the PR? Thanks for the help.