SSAS Tabular - Why is it wrong to classify it as MOLAP, ROLAP or HOLAP?

Hi There,
I am working on an assignment (for a masters degree) in which I am evaluating different OLAP tools, and one of the tools assigned to me is Microsoft's BI stack.
One of the classifications we are working on is whether a tool is MOLAP, ROLAP or HOLAP. When reading about SSAS Tabular, however, every single thing I read points out that SSAS escapes this traditional classification.
I am trying to understand why it would be wrong to classify a Tabular instance of SSAS in this way. Why not say that SSAS Tabular is MOLAP, that when using DirectQuery it is ROLAP, and that when mixing both it is HOLAP?
What is so fundamentally wrong about it?
EDIT: Ah, and how does the Microsoft BI Semantic Model fit into the picture? I assume it has something to do with the Tabular model being outside the traditional MOLAP/ROLAP/HOLAP classification?
Thanks in advance for your help!
Regards,
P.

Hi Pmdci,
According to your description, you are looking for the reason why it is wrong to classify SQL Server Analysis Services Tabular as MOLAP, right? As you know, in MOLAP, data is stored in a multidimensional cube. The storage is not in the relational database,
but in proprietary formats. And in ROLAP, data is stored in the relational database to give the appearance of traditional OLAP's slicing and dicing functionality. However, tabular solutions use relational modeling constructs such as tables and relationships for
modeling data, and the xVelocity in-memory analytics engine for storing and calculating data. So as per my understanding, we cannot classify Tabular as MOLAP.
Reference:
MOLAP, ROLAP, And HOLAP
Comparing Tabular and Multidimensional Solutions (SSAS)
Regards,
Charlie Liao
TechNet Community Support

Similar Messages

  • How does the SSAS storage mode (MOLAP/ROLAP/HOLAP) fit into the BI Semantic Model?

    Hi there,
    I've been reading some posts and whitepapers about the BI Semantic Model (BISM) but there is something that is a bit unclear to me.
    If we imagine Venn diagrams, how does the concept of BISM fit alongside the storage modes, such as MOLAP, HOLAP and ROLAP? How do these two concepts work together?
    Are they completely independent of each other, or do they overlap somehow? Can I have
    Regards,
    P.

    BISM is just a marketing term referring to any type of Analysis Services model. What is more important is that Multidimensional models can have three storage modes: MOLAP, ROLAP, or HOLAP, while Tabular models can have two storage modes: In-Memory or DirectQuery.
    http://artisconsulting.com/Blogs/GregGalloway

  • Excel SSAS Tabular error: An error occurred during an attempt to establish a connection to the external data source

    Hello there,
    I have an Excel report I created which works perfectly fine on my dev environment, but fails on my test environment when I try to do a data refresh.
    The key difference between both dev and test environments is that in dev, everything is installed in one server:
    SharePoint 2013
    SQL 2012: Database Instance, SSAS Instance, SSRS for SharePoint, SSAS POWERPIVOT instance (Powerpivot for SharePoint).
    In my test and production environments, the architecture is different:
    SQL DB Servers in High Availability (irrelevant for this report since it is connecting to the tabular model, just FYI)
    SQL SSAS Tabular server (contains a tabular model that processes data from the SQL DBs).
    2x SharePoint Application Servers (we installed both SSRS and PowerPivot for SharePoint on these servers)
    2x SharePoint FrontEnd Servers (contain the SSRS and PowerPivot add-ins).
    Now in dev, test and production, I can run PowerPivot reports that have been created in SharePoint without any issues. Those reports can access the SSAS Tabular model without any issues, and perform data refresh and OLAP functions (slicing, dicing, etc).
    The problem is with Excel reports (i.e. .xlsx files) uploaded to SharePoint. While I can open them, I am having a hard time performing a data refresh. The error I get is:
    "An error occurred during an attempt to establish a connection to the external data source [...]"
    I ran SQL Profiler on the SSAS server where the Tabular instance is, and I noticed that every time I try to perform a data refresh, two entries appear under the user name ANONYMOUS LOGON.
    Since things work without any issues on my single-server dev environment, I ran SQL Server Profiler there as well to see what I get.
    In the dev environment the query runs without any issues and the user name logged is in fact my username from the dev environment domain. I also have a separate user for the test domain, and another for the production domain.
    Now upon some preliminary investigation I believe this has something to do with the data connection settings in Excel and the usage (or no usage) of secure store. This is what I can vouch for so far:
    Library containing reports is configured as trusted in SharePoint Central Admin.
    Library containing data connections is configured as trusted in SharePoint Central Admin.
    The Data Provider referenced in the Excel report (MSOLAP.5) is configured as trusted in SharePoint Central Admin.
    In the Excel report, the Excel Services authentication setting is set to "use authenticated user's account". This works fine in the DEV environment.
    Concerning Secure Store, the PowerPivot Configurator has configured the PowerPivotUnnattendedAccount application ID in all the environments. There is NO configuration of an Application ID for Excel Services in any of the environments (dev, test or production). Although I reckon this is where the solution lies, I am not 100% sure as to why it fails in test and prod. But as I read what I am writing, I reckon this is because of the authentication "hops" through servers. Am I right in my assumption?
    Could someone please advise what I am doing wrong in this case? If it is the fact that I am missing a Secure Store entry for Excel Services, I am wondering if someone could advise me on how to set it up? My confusion is around the "Target Application Type" setting.
    Thank you for your time.
    Regards,
    P.

    Hi Rameshwar,
    PowerPivot workbooks contain embedded data connections. To support workbook interaction through slicers and filters, Excel Services must be configured to allow external data access through embedded connection information. External data access is required
    for retrieving PowerPivot data that is loaded on PowerPivot servers in the farm. Please refer to the steps below to solve this issue:
    1. In Central Administration, in Application Management, click Manage service applications.
    2. Click Excel Services Application.
    3. Click Trusted File Location.
    4. Click http:// or the location you want to configure.
    5. In External Data, in Allow External Data, click Trusted data connection libraries and embedded.
    6. Click OK.
    For more information, please see:
    Create a trusted location for PowerPivot sites in Central Administration:
    http://msdn.microsoft.com/en-us/library/ee637428.aspx
    Another possible reason is that Excel Services returns this error when you query PowerPivot data in an Excel workbook that is published to SharePoint, and the SharePoint environment does not have a PowerPivot for SharePoint server, or the SQL Server Analysis
    Services (PowerPivot) service is stopped. Please check this document:
    http://technet.microsoft.com/en-us/library/ff487858(v=sql.110).aspx
    Finally, here is a good article regarding how to troubleshoot PowerPivot data refresh for your reference. Please see:
    Troubleshooting PowerPivot Data Refresh:
    http://social.technet.microsoft.com/wiki/contents/articles/3870.troubleshooting-powerpivot-data-refresh.aspx
    Hope this helps.
    Elvis Long
    TechNet Community Support

  • SSRS Parameters using SSAS Tabular model get cleared

    I have an SSRS report that uses data from an SSAS Tabular model. In the query designer, from the calendar dimension I choose a "Date Inclusive" filter and make it a parameter. I also choose to add another filter using an Organisation Unit
    dimension and also make this a parameter. The report is written and deployed to a SharePoint 2013 library.
    Most of the time, the report runs with the parameters cascading off each other as expected. However, occasionally, parameters get cleared (either after changing a single value such as the Org Unit selection, or sometimes whilst the report
    is being rendered). Sometimes you cannot select a value from the available values - you need to navigate somewhere else and then start over.
    I changed the data source for the parameters to use SQL queries that return the same values as the MDX queries, and the problem seems to have gone (time will tell).
    This report has a child (detail) report that has one extra parameter. This parameter happens to have over 1,000 values. With the change to the parent report, you are now able to get to the child report. However, the child report seems
    to exhibit the same problem with the parameters being cleared - and with a much higher frequency.
    So, that leaves me wondering whether:
    anyone else has experienced this?
    this is an issue with SSRS 2012 and SSAS Tabular models (I have not seen this behaviour before, and I have been using SSRS since version 1, and SSAS Multidimensional from when it was called "OLAP Services")?

    We applied SQL Server 2012 Service Pack 2 to the SharePoint farm (the SP admin needed to re-create the service applications) and the problem is fixed.

  • Designing database structure and SSAS Tabular Model cubes

    Hi.
    I need to design a database and SSAS Tabular models for my clients, but I am confused about which way I should implement it.
    Data for all the clients is stored in a single database with a unique ClientId for each client; I have 15 such tables under this single database, storing information about all the clients.
    The task is to create a SharePoint site collection for each client which will display a Power View dashboard taking data from the above database.
    Till now I have created an SSAS Tabular Model for each client (XClientModel, YModelClient) using BIDS, using SQL queries to extract data for the respective clients (select * from Table1 where ClientID="X"), and using a Power View external connection
    to this model, have created Dashboard and other SharePoint information.
    I am not sure if creating different Model is suitable or I should first separate data for each client into separate database and then create Model based on respective client's database.
    Can someone highlight the pros and cons of using a
    SINGLE database with multiple Tabular Models (one-to-many) versus a separate database per client with its own Model (one-to-one)?
    This is understandable, but just putting it here anyway. Important note: data for client X shouldn't be visible to client Y on SharePoint.
    Please let me know if further information is required.

    Hi Sgms,
    In your description, you said that all the clients' information is stored in a single database, and now you want to know which method is better: a single database with multiple Tabular Models, or a separate database with its own Model?
    In your scenario, all the information is stored in a single database, so why do you want to separate it or create multiple Tabular Models? If you create multiple models, then you need to change the data source to create the Power View dashboard for each client. As
    per my understanding, you just need to create one Tabular model to load all the information, and then use this model to create the Power View dashboard, using filters to display the information for each client.
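    On the point that client X's data must not be visible to client Y, a single model would typically rely on row-level security in a role rather than on report-side filters alone. The lines below are only a sketch of such a DAX row filter; UserClientMap is a hypothetical table mapping Windows logins to ClientId values, and Table1 stands for each client-specific table:
    // DAX row filter defined on Table1 inside a Tabular role (sketch, hypothetical names)
    = Table1[ClientId]
        = LOOKUPVALUE (
            UserClientMap[ClientId],
            UserClientMap[UserName], USERNAME ()
          )
    The same filter expression would be repeated (or adapted) on each of the 15 tables that carry a ClientId column.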
    Reference:
    Lesson 1: Create a New Tabular Model Project
    Filtering, Highlighting, and Slicers in Power View
    Regards,
    Charlie Liao
    TechNet Community Support

  • SSAS tabular mode Perspective display in Excel 2010

    Hello All,
    I have an SSAS tabular model cube. I created one perspective based on a user requirement. I am using Excel 2010 to open the perspective. When I check the PivotTable Field List (the window on the right), I can see all the tables, columns and measures that I selected
    for my perspective by scrolling in the Field List. However, when I click the drop-down that says (Show fields related to:), I can see only 3 to 4 dimension tables and some of the measures that I selected. I don't see all the tables in that perspective. When
    I open the perspective which contains the entire cube, I can see all tables, columns and measures in the drop-down list; with the perspective, however, it only shows certain tables.
    Does anyone know why Excel behaves this way? I tried opening it in Excel 2013 but got the same result.
    Any help in solving this matter would be appreciated.
    Thanks
    Deepak Gada

    Hi,
    In tabular projects there is no property like "NameColumn" to change the display labels. What I would suggest is to create a calculated column in the tabular project by using the CONCATENATE() function available in DAX. As an example, a calculated column named Account Names could be defined as:
    = CONCATENATE([Account Name], CONCATENATE(" - ", [Account ID]))
    Following is the link to CONCATENATE DAX function.
    http://msdn.microsoft.com/en-us/library/ee634811.aspx 
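    As a side note, since CONCATENATE only accepts two arguments, the & operator is often the simpler way to build the same label (same column names as above, so treat this as a sketch):
    = [Account Name] & " - " & [Account ID]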
    Best regards...

  • SSAS Tabular: Show balance on latest dimension attribute

    Hi,
    I have a fact with transactions over time eg.
    20140101, 1000
    20140105,-400
    In SSAS Tabular, I want to add a balance (saldo) measure, that shows the balance on any given date from my date dimension
    Balance 20140106: 600
    I can do this by using SUMX (or SUMMARIZE):
    Saldo:=SUMX(
        VALUES('Date'[Date])
        ,CALCULATE(
            SUM(Fact[Amount])
            ,DATESBETWEEN('Date'[Date],BLANK(),LASTDATE('Date'[Date]))
            ,ALL('Date')
        )
    )
    The issue arises when I want to show the balance for an attribute from a dimension related to the latest fact entry. I can calculate this on dates that have transactions like this:
    Saldo_MaxFact:=MAXX(
        VALUES('Fact'[FactId])
        ,CALCULATE(
            SUM(Fact[Amount])
            ,DATESBETWEEN('Date'[Date],BLANK(),LASTDATE('Date'[Date]))
            ,ALL('Date')
            ,ALL('Fact'[FactId])
            ,ALL('Dimension')
        )
    )
    But on dates with no transactions, this measure is empty (which makes sense, since there is no FactId to roll-up the sum to).
    How would I go about creating a measure that rolls up to any given date AND the attributes on the latest fact entry?
    I have created a sample snapshot: http://1drv.ms/1ly4o6a
    Sample Excel Power Pivot model: http://1drv.ms/1jy2nkX
    Any help would be much appreciated!

    Hi Greg,
    Finally I found the reason why the query goes out of memory in tabular mode. I guess this information will be helpful for others, so I am posting my findings.
    Some of the non-key attribute columns in the tabular model tables (mainly the tables which form dimensions) do not contain pretty names. So for the non-key attribute columns which I need to provide pretty names I renamed the columns to something else.
    For an example, in my date dimension there is a non-key attribute named “DateAltKey”. This is the date column which I am using. As this is not pretty to the client tools I renamed this column as “Date” inside the designer (Dimension
    design screen). I deployed the cube, processed the cube and no problem.
    Now here comes the fun part. For every table, inside the Tables node (Tabular SSAS Database > Tables) you can view the partition details. You have a single partition per dimension table if you do not create extra partitions. I opened the partitions screen,
    clicked on the "Edit" icon and performed a Syntax Check. Surprisingly, it failed: it complained that the renamed column "Date" cannot be found in the source. So I realized that I cannot simply rename the columns like that.
    After that I created calculated columns (with a pretty name) for all the columns which complained, and all the source columns behind the calculated columns were hidden from the client tools. I deployed the cube, processed the cube and performed a
    syntax check. No errors, and everything was perfect.
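    For illustration, the workaround boils down to something like this sketch (using the DateAltKey column from this example): add a calculated column named Date to the date table,
    // calculated column "Date" on the date table; the original DateAltKey column is then hidden from client tools
    = 'Date'[DateAltKey]
    and hide DateAltKey itself, so client tools only ever see the pretty name.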
    I ran the query which gave me trouble and guess what... it executed within 5 seconds. My problem is solved. I do not really know why this improved the performance, but the trick worked for me.
    Thanks a lot for your support.
    Chandima

  • SSAS Tabular - placing single measure in Excel is fast, multiple from same table is slow?

    With SSAS Tabular using Excel:
    If I place a single measure MyMeasure:=SUM([ColumnNameOnFactTable])
    it happens very quickly.
    I have 3 other dimensions from 3 other dimension tables on Excel with this "MyMeasure" as the value.
    YearMonth in the columns and say Department ID, Account ID, and Call Center (just all made up for this example).
    Now, when I place a second measure from the same table as "MyMeasure", call it SecondMeasure:=SUM([AnotherColumnNameOnFactTable]), the OLAP query in Excel spins, and sometimes even throws an out-of-memory error.
    The server has 24 GB of RAM, and the model is only a few hundred megs.
    I assume something must be off here? 
    Either I've done something foolish with the model or I'm missing something?
    EDIT:
    It SEEMS to work better if I place all my measures on the Excel grid first, then go and add my "dimensions"; adding the measures after the dimensions appears to incur a rather steep penalty?
    Number of rows:
    Largest table (the account ID lookup) has 180,000
    Fact table has 7,000
    The others are 1,000 or less...

    Hi,
    Thank you for your question. 
    I am trying to involve someone more familiar with this topic for a further look at this issue. Some delay might be expected due to the job transfer. Your patience is greatly appreciated.
    Thank you for your understanding and support.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Need help ASAP - SSAS Tabular error has me stumped and others' solutions do not apply in this case.

    Hello everyone,
    Here's hoping no one will consider this to be the wrong forum. The data source is an Access database, but the issue is an SSAS Tabular issue. Kind of desperate here... the customer needs this ASAP, and I've been trying to resolve this for a week
    already! Any help would be much appreciated!
    A customer of mine (another employee of the ComIT department) is receiving an error trying to open an Access database via SQL Data Tools.
    The error is "OLE DB or ODBC error: The Microsoft Access database engine cannot open or write to the file '\\[our_domain]\[the_path_to_the_file]\[database_name].accdb'. It is already opened exclusively by another user, or you need permission
    to view and write its data.; 3051."
    What perplexes me is that I do not receive this error, my coworkers on the DBA team do not receive this error, and our boss does not receive this error--even on the customer's machine. But the customer gets this error every single time.
    The only difference is the credentials provided on the "Impersonation Information" window.
    Here is our process:
    We open SQL Data Tools, we create a new Analysis Services Tabular Project, we select the workspace server (our Analysis Services server... db5079\tabular) and test the connection ("Test connection succeeded"), and click OK.
    Then we click "Import From Data Source," select "Microsoft Access," enter the full network path to the database and test connection ("Test connection succeeded")
    The next window says "Impersonation Information - Specify the credentials used by the Analysis Services server to connect to the data source when importing and processing data"
    The credentials provided at this point are the only difference between me/my team/our boss and the customer. If we use our own credentials at this point--even on the customer's machine--we receive no error and everything is fine.
    If we use the customer's credentials, we get the error above at the end of the next step (i.e., after we choose the data to import and then click Finish).
    So that's it. On the same machine, the customer's credentials produce this error, and our credentials do not.
    I have already added the customer as a server administrator to the Analysis Services server (db5079\tabular).
    Copying the data source to another folder on the network or to his local machine produces the same results: his credentials produce the error, my credentials/my boss's credentials/etc. do not.
    All of our machines are 64 bit, and the Analysis Services server is 2012 64-bit.
    Please help!

    UPDATE: As it turns out, all those who were able to import the data were local administrators on the Analysis Server (i.e., on the OS), and all those who were unable to import data were not.
    When we added someone who couldn't import the data to the local Administrators group on the Analysis Server, they were able to import the data.
    However, we can't give them local admin on the Analysis Server, and we are unable to determine what combination of user rights and folder permissions we can grant the user as an individual that will allow them to import the data.
    Just as a test, we tried giving the user the same user rights that the admin group has, and the same permissions on all the drive as the admin group has, but that didn't work.
    If it had worked, we could have started reducing the rights until we found the minimum necessary, but it didn't and once again we're stuck.
    Please help!

  • Data loaded to Power Pivot via Power Query is not yet supported in SSAS Tabular Cube

    Hello, I'm trying to create an SSAS Tabular cube from data loaded into Power Pivot via Power Query (SAP BOBJ connector), but it looks like this is not yet supported.
    Has anyone tried this before? Any workaround that makes sense?
    The final goal is to pull data from SAP BW and a BO Universe (using Power Query) and be able to create an SSAS Tabular cube.
    Thanks in advance
    Sebastian

    Sebastian, 
    Depending on the size of the data from Analysis Services, one workaround could be to import the data into Excel, make an Excel table, and then use the Excel table as a data source.
    Reeves
    Denver, CO

  • Filter Date Table (SSAS Tabular)

    Hi Guys,
    I'll try to define my issue as clearly as possible.
    I am creating a model for SSAS Tabular, but I have the following problem.
    I have a Date table that I want to filter depending on what the user selects from another table with values like:
    Today, Yesterday, This Week, Last Week, This Month, Last Month, .....
    What I have tried:
    (1) My Date table contains flags (columns) for each of these values. Filtering on these values is no problem, but then I have a long list of different possible flags.
    (2) I created a copy of my Date table but with the flags unpivoted (DateId, FlagName, Value (0 or 1)).
    I was hoping to be able to somehow create a relationship between my Date table and this table, but had no success.
    This way I would also have a field that users can filter on (to choose what period they want to see).
    But sadly enough (2) did not work, and I cannot find any other way to solve it.
    Any help would be great.
    Regards,
    Sammy

    In your date table, you can create a different attribute for each of the conditions below:
    1. Today, 2. Yesterday, 3. This Week, 4. Last Week, 5. This Month, 6. Last Month
    So, considering the above, there will be 6 different attributes, each carrying a "Yes/No", "True/False" or "1/0" flag.
    I would suggest you calculate these conditions in the SQL query, which will be faster compared to calculated columns in the model.
    Unfortunately, the 2012 tabular model does not support dynamic sets, so you cannot build a single attribute with these values on the server side.
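    For illustration, if you did keep the flags in the model, the calculated columns could look like the DAX sketch below (it assumes a 'Date'[Date] column; the SQL equivalents follow the same logic and, as said above, are usually the faster option). Note that calculated columns are only evaluated at processing time, so the model would need to be reprocessed daily to keep them current.
    // calculated columns on the Date table, one flag per condition (sketch)
    IsToday     = IF ( 'Date'[Date] = TODAY (), 1, 0 )
    IsYesterday = IF ( 'Date'[Date] = TODAY () - 1, 1, 0 )
    IsThisMonth = IF ( YEAR ( 'Date'[Date] ) = YEAR ( TODAY () )
                       && MONTH ( 'Date'[Date] ) = MONTH ( TODAY () ), 1, 0 )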
    I had a similar question some time back; check whether you find something useful in this thread:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/302dd796-2677-44df-a76e-b053dcd14117/ssas-tabular-model-dynamic-fiscal-period?forum=sqlanalysisservices
    If this post answers your query, please click "Mark As Answer" or "Vote as Helpful".

  • SSAS Tabular - Adding Column to a table gives error "Object reference not set to instance of object"

    If I make changes to a table in SSAS Tabular in Visual Studio, the newly added column gives the error "Object reference not set to instance of object".

    Hi VikasJain13,
    According to your description, you get the "Object reference not set to instance of object" error when adding columns in Tabular. Right?
    Generally, this error is thrown when internal code accesses a property of an empty object. As you mentioned it happens when you make changes to a table, it mostly means that the table is already an empty object. Please re-process your tabular model to see
    if the table still exists.
    If you have any question, please feel free to ask.
    Simon Hou
    TechNet Community Support

  • SSAS Tabular in DirectQuery - What are the workarounds for formula limitations?

    Hello,
    I need to create an SSAS Tabular model against the database of a live, real-time, line of business transactional system (i.e. a CRM).
    The business requirement behind it is that we need to create some complex reports against live data, and our DW is only updated daily.
    This live model will however be partitioned with a time-variance limitation (e.g. only records which are XX old can be returned).
    Now here is the challenge. Since I am querying live data, I believe the model must be configured in DirectQuery mode. Am I right?
    The issue is that DirectQuery mode is full of formula limitations. So my concern is: if I need a calculated column or measure that I cannot make work due to DirectQuery limitations, what are the alternatives?
    Remember that the data source is a live system, so it is not as if I can create columns and measures in the underlying relational database.
    Please advise.
    Regards,
    P.

    Hi pmdci,
    According to your description, you want to use some functions in calculated measures which are not supported in DirectQuery mode. Right?
    In Analysis Services Tabular, since DirectQuery offers real-time access and scalability, it comes at the price of restrictions on a number of DAX functions and a missing calculated column feature. Generally, the workaround for these scenarios
    is replacing those functions with other functions which are supported in DirectQuery mode, or creating the columns in the data source. However, as you said, in your environment it is not possible to create columns in the database, and a lot of those restricted
    functions are not replaceable, like the time intelligence functions. So actually, there is no really effective workaround currently.
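    To make the "replace the function" idea concrete, here is a sketch (with hypothetical Fact and 'Date' table names) of rewriting a year-to-date measure without time-intelligence functions, using only constructs that are typically allowed in DirectQuery measures; the exact list of supported functions depends on the SQL Server version, so treat it as an illustration rather than a guarantee:
    // year-to-date without TOTALYTD/DATESYTD (sketch)
    Sales YTD :=
    CALCULATE (
        SUM ( Fact[Amount] ),
        FILTER (
            ALL ( 'Date' ),
            'Date'[Year] = MAX ( 'Date'[Year] )
                && 'Date'[Date] <= MAX ( 'Date'[Date] )
        )
    )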
    For your requirement, I suggest you submit a feature request to Microsoft
    at https://connect.microsoft.com/SQLServer
    so that we can try to modify and expand the product features based on your needs.
    Best Regards,  
    Simon Hou
    TechNet Community Support
