Sales hierarchy: role-based reports and data access - best practice and solution

Dear all,
Currently I am working on a solution for Sales. There is a geography and sales-position hierarchy,
e.g. (USA, EU, ASIA) --> EU head --> UK head --> Northern UK head --> London head --> East London --> postal codes in East London.
So altogether there are 7 roles and 7 positions, and each of these users can see data at their own level and below.
Also, the summary reports / prompts / subject-area columns for each of these users must be at their corresponding level and just one level below.
e.g. the EU head by default must see the EU column and the country UK; the UK head will see reports only for UK and the Northern UK region; and so on.
After a lot of thought, I am planning to take the approach of creating 6 sets of similar dashboards with different prompts / global filters / reports.
Data-level access is not an issue at all; I am easily able to manage it.
I request the experts' guidance: is there any better approach to take for the objects in the catalog?
Many thanks in advance.
Regards,
Yogen
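To make the requirement concrete, the visibility rule in the question ("each role sees its own level plus one level below") can be sketched in code. The level names below are hypothetical, chosen only to mirror the example hierarchy; this is an illustration of the rule, not OBIEE-specific API:

```java
import java.util.Arrays;
import java.util.List;

public class LevelVisibilitySketch {

    // Hypothetical names for the 7 hierarchy levels in the example.
    enum Level { GEOGRAPHY, EU, COUNTRY, REGION, CITY, DISTRICT, POSTAL_CODE }

    // A user sees the report/prompt columns for their own level
    // and just one level below it.
    static List<Level> visibleColumns(Level own) {
        Level[] all = Level.values();
        int from = own.ordinal();
        int to = Math.min(from + 2, all.length); // own level + one below
        return Arrays.asList(all).subList(from, to);
    }

    public static void main(String[] args) {
        System.out.println(visibleColumns(Level.EU));          // [EU, COUNTRY]
        System.out.println(visibleColumns(Level.POSTAL_CODE)); // [POSTAL_CODE]
    }
}
```

If catalog objects can read the user's level from a session variable, the 6 near-identical dashboard sets could in principle collapse into one parameterized set driven by a rule like this.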

Dear Srini,
Data-level security is not an issue at all for me. I have already implemented it, and so far not a single bug has been caught in testing.
It's about object-level security, and that too for 6 different types of users demanding different reports, i.e. the columns and detailed drill-downs are different.
Again, these 6 types of users can be read-only users or power users (who can do ad hoc analysis), maybe BICONSUMER and BIAUTHOR.
So I need help regarding that, as we have to take a decision soon.
Thanks,
Yogen

Similar Messages

  • Data access best practice

    Oracle's web site has an article about 9iAS best practices. Predefining column types in the select statement is one of the topics. The details follow.
    3.5.5 Defining Column Types
    Defining column types provides the following benefits:
    (1) Saves a roundtrip to the database server.
    (2) Defines the datatype for every column of the expected result set.
    (3) For VARCHAR, VARCHAR2, CHAR and CHAR2, specifies their maximum length.
    The following example illustrates the use of this feature. It assumes you have
    imported the oracle.jdbc.* and java.sql.* interfaces and classes.
    // ds is a DataSource object
    Connection conn = ds.getConnection();
    PreparedStatement pstmt = conn.prepareStatement("select empno, ename, hiredate from emp");
    // Describe the columns up front to avoid a describe roundtrip to the database
    ((OraclePreparedStatement) pstmt).defineColumnType(1, Types.INTEGER);
    // Column #2 is a VARCHAR, so we also specify its maximum length
    ((OraclePreparedStatement) pstmt).defineColumnType(2, Types.VARCHAR, 12);
    ((OraclePreparedStatement) pstmt).defineColumnType(3, Types.DATE);
    ResultSet rset = pstmt.executeQuery();
    while (rset.next())
        System.out.println(rset.getInt(1) + "," + rset.getString(2) + "," + rset.getDate(3));
    rset.close();
    pstmt.close();
    Since I'm new to 9iAS, I'm not sure whether it's true that 9iAS really does an extra roundtrip to the database just for the column datatypes and then another roundtrip to get the data. Can anyone confirm this? Besides, the above example uses Oracle proprietary interfaces.
    Is there any way to trace the DB activity on the application-server side without using an enterprise monitoring tool? WebLogic can dump all DB activity to a log file so that it can be reviewed.
    thanks!

  • OBIEE SSO enabling and role based reporting

    Hi,
    I have already installed SOA 10.1.3.1.0 and OBIEE 10.1.3.4.0 on my Windows machine. I understand that I need to install the 10.1.4 infrastructure to enable SSO in OBIEE; can you please tell me what the 10.1.4 infrastructure is? Is it equivalent to Oracle Identity Management Infrastructure and Oracle Identity Federation 10.1.4? I have been trying to download it from OTN since last night, but the page is always inaccessible. Where can I download the 10.1.4 infrastructure other than from OTN?
    I have another question regarding role-based reporting with SSO. We want users to see different reports based on their roles once they log in. What options do we have to implement this? From my understanding, we need to maintain a user-role mapping table in our database, create groups in OBIEE, and map the user roles to the groups in OBIEE - is that right? Are there other options? Is there an existing product we can use to implement this?
    Thanks,
    Meng

    Have a look at page 137 and onwards: http://download.oracle.com/docs/cd/E10415_01/doc/bi.1013/b31770.pdf

  • Managed Beans and Data Access Object

    I have a question / need help understanding how to configure backing beans and model objects so that memory use and object creation/deletion are handled as efficiently as possible.
    1. I have a .jsf page with a form and a commandButton that submits the form inputs to a backing bean (enrollispbean is the backing bean):
    <h:commandButton value="Enter" action="#{enrollispbean.insert}"/>
    2. The backing bean is used for form handling - the insert() method reads the data fields from the form and builds a SQL string that is submitted to a model object, DbInsert, which serves as a generic data access object that connects to the database and executes the SQL string:
    public class EnrollIspBean {
        private String beanvar1 = "";
        private String beanvar2 = "";
        // DbInsert is the data access object
        private DbInsert dbinsert = new DbInsert();

        public String insert() throws SQLException {
            // Build the statement from the bean's current field values.
            // (Concatenating values like this is open to SQL injection;
            // binding them as PreparedStatement parameters would be safer.)
            String sqlstmt = "INSERT INTO ispmain VALUES('" + beanvar1 + "', '" + beanvar2 + "')";
            dbinsert.insert(sqlstmt);
            return "success";
        }
    }
    3. DbInsert is the data access object containing a method, insert(), that accepts a SQL string to insert into the database. This method contains the code to obtain a connection from the database connection pool and then execute the SQL statement (note: error-checking code not shown):
    public class DbInsert {
        public void insert(String sqlstmt) throws SQLException {
            GetDBConnection getdbconnection = new GetDBConnection();
            // Obtain a connection from the pool
            Connection conn = getdbconnection.getdbconn();
            PreparedStatement stmt = conn.prepareStatement(sqlstmt);
            try {
                stmt.executeUpdate();
            } finally {
                // Always release the statement and connection,
                // even if executeUpdate() throws
                stmt.close();
                conn.close();
            }
        }
    }
    Where I need help is in understanding how to set up the scope for the managed beans and the data access object. Currently, I have the backing bean in session scope (via the faces-config.xml file). My main question is how to set up the scope for the data access object - currently I do not have it as a managed bean in faces-config.xml. Instead I am creating a new instance within the backing bean:
    private DbInsert dbinsert = new DbInsert();
    Is this the best way to do this? Will the DbInsert object now be tied to the session scope of the backing bean (i.e., when the backing bean is deleted, will the DbInsert object be deleted from session scope as well)?
    Ideally I would like the data access object to be available as a shared object throughout the life of the application. With a servlet approach, I would have created a servlet that loads on startup. Now that I'm using JavaServer Faces, I'm confused about scope and how to efficiently set up a data access object that is available to all backing beans in the application.
    Thanks for any help understanding this.
    Tom

    I was thinking about setting the data access object to application scope so that it can be used by any backing bean to execute SQL statements.
    If I do set it to application scope, do I still need to declare a new instance of the object within each bean that uses it?
    For example, do I need to declare a new instance of the data access object within the bean, or should I assume that there is always an instance available in application scope - and if so, how do I reference it from within the bean?
    Bean Code:
    public class EnrollIspBean {
    // DbInsert is data access object
    private DbInsert dbinsert = new DbInsert();
    Finally, I understand performance may be an issue if I have only one instance of the data access object in application scope - is there a way to make multiple instances available in application scope?
    thanks
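    One common answer to the scope question above: if the DAO keeps no per-user state (each call obtains a fresh pooled connection and closes it), a single shared instance is safe at application scope. A minimal sketch of the idea; the class names here are hypothetical, and in JSF you would normally just register the bean in faces-config.xml with application scope instead of hand-rolling a holder:

```java
public class DbInsertHolder {

    // One stateless DAO shared by the whole application.
    // Sharing is safe because the DAO holds no per-request or
    // per-user fields; each insert() call would obtain its own
    // pooled Connection and close it before returning.
    private static final DbInsert INSTANCE = new DbInsert();

    public static DbInsert instance() {
        return INSTANCE;
    }

    // Stand-in for the JDBC DAO from the question.
    static class DbInsert {
        public void insert(String sqlstmt) {
            // obtain pooled connection, execute, close ...
        }
    }
}
```

    Multiple instances are usually unnecessary for throughput: the expensive shared resource is the connection pool, and it already hands each caller its own connection.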

  • What is the best practice and Microsoft best recommended procedure of placing "FSMO Roles on Primary Domain Controller (PDC) and Additional Domain Controller (ADC)"??

    Hi,
    I have Windows Server 2008 Enterprise  and have
    2 Domain Controllers in my Company:
    Primary Domain Controller (PDC)
    Additional Domain Controller (ADC)
    My PDC went down due to a hardware failure, but somehow I got a chance to bring it up, and I transferred the (5) FSMO roles from the PDC to the ADC.
    Now my PDC is rectified and up with the same configuration and settings. (I did not install a new OS or domain controller on the existing PDC server.)
    Finally, I want to move the FSMO roles back from the ADC to the PDC, to get my PDC up and operational as primary.
    (Before the disaster my PDC held all 5 FSMO roles.)
    Here I want to know the best practice and Microsoft's recommended procedure for the placement of FSMO roles on the PDC and ADC.
    In case the primary DC fails, the additional DC should take over automatically without any problem in a live environment.
    For example, the FSMO role distribution between both servers should be.......???
    Primary Domain Controller (PDC) Should contains:????
    Schema Master
    Domain Naming Master
    Additional Domain Controller (ADC) Should contains:????
    RID
    PDC Emulator
    Infrastructure Master
    Please let me know the best practice and Microsoft's recommended procedure for the placement of FSMO roles.
    I will be waiting for your valuable comments.
    Regards,
    Muhammad Daud

    Here I want to know the best practice and Microsoft's recommended procedure for the placement of FSMO roles on the PDC and ADC.
    There is a good article I would like to share with you: http://oreilly.com/pub/a/windows/2004/06/15/fsmo.html
    For me, I do not really see a need to have FSMO roles on multiple servers in your case. I would recommend keeping it simple and having a single DC hold all the FSMO roles.
    In case the primary DC fails, the additional DC should take over automatically without any problem in a live environment.
    No, this is not true. Each FSMO role is unique, and if a DC fails, its FSMO roles will not be transferred automatically.
    There are two approaches that can be followed when an FSMO role holder is down:
    (1) If the DC can be recovered quickly, then I would recommend taking no action.
    (2) If the DC will be down for a long time or cannot be recovered, then I would recommend that you seize the FSMO roles and do a metadata cleanup.
    Attention! For (2), the old FSMO holder should never be brought up and online again once the FSMO roles have been seized. Otherwise, your AD may face huge impacts and side effects.

  • Quick question regarding best practice and dedicating NIC's for traffic seperation.

    Hi all,
    I have a quick question regarding best practice and dedicating NICs for traffic separation for FT, NFS, iSCSI, VM traffic, etc. I get that it's best practice to try to separate traffic where you can, especially for things like FT, but I wondered if there is a preferred method of achieving this. What I mean is...
    -     Is it OK to have everything on one switch but set each respective port group to have a primary and failover NIC, i.e. FT, iSCSI and all the others fail over (this would sort of give you a backup in situations where you have limited physical NICs)?
    -    Or should I always aim to separate things entirely, with their own respective NICs and their own respective switches?
    During the VCAP exam, for example (not knowing in advance how many physical NICs will be available to me), how would I know which traffic I should segregate on its own separate switch? Is there some sort of ranking order of priority/importance? FT, for example, I would rather not put on its own dedicated switch if I could only afford to give it a single NIC, since that seems like a failover risk to me.

    I know the answer to this probably depends on how many physical NICs you have at your disposal, but I wondered if there are any golden, 100% rules - for example, that FT must absolutely be on its own switch with its own NICs, even at the expense of reduced resiliency should the absolute worst happen? Obviously I know it's also best practice to separate NICs by vendor and hosts by chassis and switch, etc.

  • Best Practices and Usage of Streamwork?

    Hi All -
    Is this forum a good place to inquire about best practices and use of StreamWork? I am not a developer working with the APIs; rather, I have set up a StreamWork activity for my team to collaborate on our work.
    We are thinking about creating a sort of FAQ in our team activity, and I was considering either a table or a collection for this. I want it to be easy for team members to enter the question and the answer (our team gets a lot of questions from many groups, and over time I would like to build up a sort of knowledge base).
    Does anyone have suggestions for such a concept in StreamWork? Has anyone done something like this who can share their experience?
    Please let me know if I should post this question in another place.
    Thanks and regards,
    Rob Stevenson

    Activities have a limit of 200 items that can be included.  If this is the venue you wish to use,  it might be better to use a table rather than individual notes/discussions.

  • Coherence Best Practices and Performance

    I'm starting to use Coherence, and I'd like to know if someone could point me to some docs on best practices and performance optimization when using it.
    BTW, I haven't had the time to go through the entire Oracle documentation.
    Regards

    Hi
    If you are new to Coherence (or even if you are not that new), one of the best things you can do is read this book: http://www.packtpub.com/oracle-coherence-35/book - I know it says Coherence 3.5 and we are currently on 3.7, but it is still very relevant.
    You don't need to go through all the documentation, but at least read the introductions and try out some of the examples. You need to know the basics; otherwise it is harder for people either to understand what you want or to give you detailed enough answers to your questions.
    Performance optimization depends a lot on your use cases and what you are doing; there are a number of things you can do with Coherence to help performance, but as with anything there are trade-offs. Coherence on the server side is a Java process, and when tuning and sorting out performance issues I spend a lot of time with the usual Java tools: VisualVM (or JConsole), GC tuning, and looking at thread dumps and stack traces.
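    As an aside on the thread-dump point: besides external tools, a dump can also be captured from inside the JVM with the standard java.lang.management API (a generic sketch, nothing Coherence-specific):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class ThreadDumpSketch {
    public static void main(String[] args) {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        // false, false: skip locked-monitor and synchronizer details,
        // which keeps the dump cheap enough to take on a running server
        ThreadInfo[] threads = mx.dumpAllThreads(false, false);
        for (ThreadInfo t : threads) {
            System.out.println(t.getThreadName() + " : " + t.getThreadState());
        }
    }
}
```

    This is handy for logging periodic dumps from a long-running cache server without attaching a profiler.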
    Finally, there are plenty of people on these forums happy to answer your questions in return for a few forum points, so just ask.
    JK

  • Solaris 8 hangs and data access error on reboot

    Hi
    I am using Solaris 5.8 on an UltraSPARC-IIi at 360 MHz.
    I know it is old hardware, but similar hardware is running fine. Here is the issue:
    the system booted and worked okay for some time, then the display hung, so the system was rebooted; it then gave a data access error. It was rebooted again and came up fine.
    I have pasted /var/adm/messages here:
    http://pastebin.com/m5d2d1d53
    Is it a CPU problem, a hard disk or memory issue, or something else?
    What could be the solution? Is a kernel patch an option?
    uname -a
    SunOS sparc5 5.8 Generic_108528-07 sun4u sparc SUNW,Ultra-5_10
    test-all at the OBP did not show any errors; all passed.
    Thanks

    I can't see the /var/adm/messages output. Since you have already performed a full diagnostic check, boot Solaris from CD and then perform a full fsck on the root file systems and see whether any errors get reported. Keep running fsck 2-3 times until all the errors are clean.
    You should also check for the latest patches available for this system architecture.

  • Report for date variances between delivery and goods receipt date

    Dear guru,
    I am searching for a standard report on date variances between the delivery date in the purchase order and the goods receipt date.
    Can you help me?
    Thanks

    Hi,
    Use report ME80FN, and check the delivery schedule option in the output screen.

  • LabView DSC, RSLinx, SLC505 and Data transfer (Best way for large blocks of data?)

    I am currently programming in LabVIEW 6.1 with the Datalogging and Supervisory Control module to transfer 720 floating-point numbers to an SLC 505 Allen-Bradley PLC (with Ethernet), using RSLinx as an OPC server. I have found that using Datasocket Write takes about 30-40 seconds, and making the tags and writing to them takes about the same amount of time. Is it possible to transfer this data faster?
    Thanks,
    Michael Thomson (Surcher)
    Thanks,
    Michael Thomson (Surcher)

    Cyril,
    I was just experimenting with different ways to transfer the data from the computer to the PLC. In the past I have built large tag databases with specific tag names. This made the code rather cumbersome when you wanted to write to a large group of tags with descriptive names. I was just using Datasocket Write as a way to transfer the data to the PLC, using code to build the URL and without having the DSC engine running. I have found that importing the tags right from the tag configuration editor, with the names being simply the PLC addresses, and then accessing them with Tag Write, is considerably faster (under 5 seconds). I can then build the names in an embedded for/next loop and change them to a tag name before I write to each one. The application is a user interface that allows the machine operator to pick what kind of arch to put on a cabinet door part. With the selections chosen, I calculate the servo moves and download the data to the PLC.
    Thanks for the link!
    Michael Thomson
    "Cyril" wrote in message
    news:[email protected]..
    > Michael,
    >
    > I am a little bit confused about the configuration here and the
    > programming here: why are you using Datasocket Write if you are using
    > tags? Are the 720 floating numbers written to 720 different I/O
    > Points(registers). If so this shouldn't be that slow especially with
    > DSC
    > I would strongly encourage you contact the support at National
    > Instruments for LabVIEW DSC, either by phone or e-mail and give a
    > detailed description of the issue:
    > www.ni.com/ask, and communication method phone NI or e-mail NI.

  • OBIEE Report and Dashboard development best practice

    Hi All,
    Is there any best practice guidance available for OBIEE report and dashboard development? Any help would be appreciated.
    Thanks,
    RK

    http://forums.oracle.com/forums/thread.jspa?messageID=2718365
    This might help you.
    Thanks

  • JSP Best Practices and Oracle Report

    Hello,
    I am writing an application that obtains information from the user via a JSP/HTML form and then submits it to a database. The JSP page is set up following JSP best practices, with the SQL statements, database connectivity information, and most of the Java source code in a Java bean/class. I want to use Oracle Reports to call this bean and generate a JSP page displaying the information the user requested from the database. Would you please offer me guidance on setting this up?
    Thank you,
    Michelle

    JSP Best Practices.
    More JSP Best Practices
    But the most important best practice has already been given in this thread: use JSP pages for presentation only.

  • Functional and data differences between W_GL_BALANCE_F and W_GL_OTHER_F

    Hi:
    Can someone explain what the functional and data-source differences are between W_GL_BALANCE_F and W_GL_OTHER_F? Both seem to group by GROUP_ACCOUNT_NUM.
    Thanks.

    That is not possible; all transactions in GL Other will end up in GL Balance.
    The two tables essentially hold the same data, but at different grain. Two main differences:
    - GL Other holds individual journal transactions; GL Balance has them summarized to the account level.
    - GL Other is truly additive, since it is just individual journal transactions, i.e. you can sum() any number of transactions and never double-count a trx; GL Balance is a monthly snapshot table, i.e. it provides the account balance for every account at each month end, so you can never add two snapshots.
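    The grain difference can be shown with made-up numbers (purely illustrative, not real GL data):

```java
public class GrainSketch {
    public static void main(String[] args) {
        // GL Other grain: individual journal transactions for one account.
        // Transactions are additive: summing them counts each amount once.
        double[] journalTrx = {100.0, -40.0, 60.0};
        double balanceFromTrx = 0.0;
        for (double t : journalTrx) {
            balanceFromTrx += t;
        }
        System.out.println(balanceFromTrx); // 120.0

        // GL Balance grain: month-end snapshots of that SAME account.
        // Snapshots are not additive: Jan + Feb double-counts the balance.
        double[] monthEndSnapshots = {120.0, 120.0};
        double meaninglessSum = monthEndSnapshots[0] + monthEndSnapshots[1];
        System.out.println(meaninglessSum); // 240.0 - not a balance of anything
    }
}
```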

  • Understanding that the iPhone 5 won't support simultaneous voice and data transmissions over Verizon and Sprint LTE networks will it ever be able to support VoLTE (Voice over LTE)?

    Understanding that the iPhone 5 won't support simultaneous voice and data transmissions over the Verizon LTE network, will it ever be able to support VoLTE (Voice over LTE)? Could Apple add VoLTE support later on? It seems pretty clear that the MDM9615 chip supports VoLTE but that Apple's baseband software does not. Could Apple add it to the baseband later on? When will Verizon turn on/enable VoLTE?
    Referral Link: https://discussions.apple.com/message/19614226#19614226

    Well, in my research Verizon is already testing Voice over LTE in select markets, per a few news releases, and rumor has it that a few smartphones are already VoLTE-ready. http://news.verizonwireless.com/news/2011/02/pr2011-02-09f.html
