Data structures

What is the best data structure to use for keeping four different databases interconnected and updated about changes in one another at all times, while not using an existing Java data structure but implementing one of my own?
Thanks.

Data structure? What you are describing is a coherency protocol!
Whatever the case, if you want to implement your own, the best thing to do is to get going ...
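
A minimal sketch of the kind of hand-rolled change-propagation ("coherency") mechanism the question seems to be after: each of the four stores registers the others as peers and pushes every local write to them. All names here (SyncedStore, onPeerChange, CoherencyDemo) are illustrative, not an existing API, and a real protocol would also need conflict resolution, ordering, and failure handling.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // One of the four interconnected stores, reduced to a key/value map.
    class SyncedStore {
        private final String name;
        private final Map<String, String> data = new HashMap<String, String>();
        private final List<SyncedStore> peers = new ArrayList<SyncedStore>();

        SyncedStore(String name) { this.name = name; }

        void addPeer(SyncedStore peer) { peers.add(peer); }

        // A local write: apply it, then propagate it to every peer.
        void put(String key, String value) {
            data.put(key, value);
            for (SyncedStore peer : peers) {
                peer.onPeerChange(this, key, value);
            }
        }

        // A change pushed from a peer: apply it without re-propagating,
        // which avoids endless notification loops.
        void onPeerChange(SyncedStore origin, String key, String value) {
            data.put(key, value);
            System.out.println(name + " updated '" + key + "' from " + origin.name);
        }

        String get(String key) { return data.get(key); }
    }

    public class CoherencyDemo {
        public static void main(String[] args) {
            SyncedStore[] stores = { new SyncedStore("db1"), new SyncedStore("db2"),
                                     new SyncedStore("db3"), new SyncedStore("db4") };
            // Fully interconnect the four stores.
            for (SyncedStore a : stores)
                for (SyncedStore b : stores)
                    if (a != b) a.addPeer(b);
            stores[0].put("customer:42", "Alice");
            System.out.println(stores[3].get("customer:42"));   // prints Alice
        }
    }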

Similar Messages

  • Simple Transformation to deserialize an XML file into ABAP data structures?

    I'm attempting to write my first simple transformation to deserialize
    an XML file into ABAP data structures and I have a few questions.
    My simple transformation contains code like the following
    <tt:transform xmlns:tt="http://www.sap.com/transformation-templates"
                  xmlns:pp="http://www.sap.com/abapxml/types/defined" >
    <tt:type name="REPORT" line-type="?">
      <tt:node name="COMPANY_ID" type="C" length="10" />
      <tt:node name="JOB_ID" type="C" length="20" />
      <tt:node name="TYPE_CSV" type="C" length="1" />
      <tt:node name="TYPE_XLS" type="C" length="1" />
      <tt:node name="TYPE_PDF" type="C" length="1" />
      <tt:node name="IS_NEW" type="C" length="1" />
    </tt:type>
    <tt:root name="ROOT2" type="pp:REPORT" />
        <QueryResponse>
        <tt:loop ref="ROOT2" name="line">
          <QueryResponseRow>
            <CompanyID>
              <tt:value ref="$line.COMPANY_ID" />
            </CompanyID>
            <JobID>
              <tt:value ref="$line.JOB_ID" />
            </JobID>
            <ExportTypes>
              <tt:loop>
                <ExportType>
                   <!-- I don't know what to do here (see item 3, below) -->
                </ExportType>
              </tt:loop>
            </ExportTypes>
            <IsNew>
              <tt:value ref="$line.IS_NEW"
              map="val(' ') = xml('false'), val('X') = xml('true')" />
            </IsNew>
          </QueryResponseRow>
          </tt:loop>
        </QueryResponse>
        </tt:loop>
    1. In a DTD, an element can be designated as occurring zero or one
    time, zero or more times, or one or more times. How do I write the
    simple transformation to accommodate these possibilities?
    2. In trying to accommodate the "zero or more times" case, I am trying
    to use the <tt:loop> instruction. It occurs several layers deep in the
    XML hierarchy, but at the top level of the ABAP table. The internal
    table has a structure defined in the ABAP program, not in the data
    dictionary. In the simple transformation, I used <tt:type> and
    <tt:node> to define the structure of the internal table and then
    tried to use <tt:loop ref="ROOT2" name="line"> around the subtree that
    can occur zero or more times. But every variation I try seems to get
    different errors. Can anyone supply a working example of this?
    3. Among the fields in the internal table, I've defined three
    one-character fields named TYPE_CSV, TYPE_XLS, and TYPE_PDF. In the
    XML file, I expect zero to three elements of the form
    <ExportType exporttype='csv' />
    <ExportType exporttype='xls' />
    <ExportType exporttype='pdf' />
    I want to set field TYPE_CSV = 'X' if I find an ExportType element
    with its exporttype attribute set to 'csv'. I want to set field
    TYPE_XLS = 'X' if I find an ExportType element with its exporttype
    attribute set to 'xls'. I want to set field TYPE_PDF = 'X' if I find
    an ExportType element with its exporttype attribute set to 'pdf'. How
    can I do that?
    4. For an element that has a value like
    <ErrorCode>123</ErrorCode>
    in the simple transformation, the sequence
    <ErrorCode>  <tt:value ref="ROOT1.CODE" />  </ErrorCode>
    seems to work just fine.
    I have other situations where the XML reads
    <IsNew value='true' />
    I wanted to write
    <IsNew>
            <tt:value ref="$line.IS_NEW"
            map="val(' ') = xml('false'), val('X') = xml('true')" />
           </IsNew>
    but I'm afraid that the <tt:value> fails to deal with the fact that in
    the XML file the value is being passed as the value of an attribute
    (named "value"), rather than the value of the element itself. How do
    you handle this?

    Try this code below:
    data  l_xml_table2  type table of xml_line with header line.
    * w_filename - this is a path.
      if w_filename(02) = '
        " Path points to the application server: write the dataset there
        open dataset w_filename for output in binary mode.
        if sy-subrc = 0.
          l_xml_table2[] = l_xml_table[].
          loop at l_xml_table2.
            transfer l_xml_table2 to w_filename.
          endloop.
        endif.
        close dataset w_filename.
      else.
        " Otherwise download the file to the frontend PC
        call method cl_gui_frontend_services=>gui_download
          exporting
            bin_filesize = l_xml_size
            filename     = w_filename
            filetype     = 'BIN'
          changing
            data_tab     = l_xml_table
          exceptions
            others       = 24.
        if sy-subrc <> 0.
          message id sy-msgid type sy-msgty number sy-msgno
                     with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        endif.
      endif.

  • OC4J: marshalling does not recreate the same data structure onthe client

    Hi guys,
    I am trying to use OC4J as an EJB container and have come across the following problem, which looks like a bug.
    I have a value object method that returns an instance of ArrayList with references to other value objects of the same class. The value objects have references to other value objects. When this structure is marshalled across the network, we expect it to be recreated as is but that does not happen and instead objects get duplicated.
    Suppose we have 2 value objects: ValueObject1 and ValueObject2. ValueObject1 references ValueObject2 via its private field, and ValueObject2 references ValueObject1. Both value objects are returned by our method in an ArrayList structure. Here is what it looks like (the number after @ represents an address in memory):
    Object[0] = com.cramer.test.SomeVO@1
    Object[0].getValueObject[0] = com.cramer.test.SomeVO@2
    Object[1] = com.cramer.test.SomeVO@2
    Object[1].getValueObject[0] = com.cramer.test.SomeVO@1
    We would expect to see the same (except exact addresses) after marshalling. Here is what we get instead:
    Object[0] = com.cramer.test.SomeVO@1
    Object[0].getValueObject[0] = com.cramer.test.SomeVO@2
    Object[1] = com.cramer.test.SomeVO@3
    Object[1].getValueObject[0] = com.cramer.test.SomeVO@4
    It can be seen that objects get unnecessarily duplicated – the instance of the ValueObject1 referenced by the ValueObject2 is not the same now as the instance that is referenced by the ArrayList instance.
    This does not only break referential integrity, structure and consistency of the data but dramatically increases the amount of information sent across the network. The problem was discovered when we found that a relatively small but complicated structure that gets serialized into a 142kb file requires about 20Mb of network communication. All this extra info is duplicated object instances.
    I have created a small test case to demonstrate the problem and let you reproduce it.
    Here is RMITestBean.java:
    package com.cramer.test;
    import javax.ejb.EJBObject;
    import java.util.*;
    public interface RMITestBean extends EJBObject
    {
        public ArrayList getSomeData(int testSize) throws java.rmi.RemoteException;
        public byte[] getSomeDataInBytes(int testSize) throws java.rmi.RemoteException;
    }
    Here is RMITestBeanBean.java:
    package com.cramer.test;
    import javax.ejb.SessionBean;
    import javax.ejb.SessionContext;
    import java.util.*;
    public class RMITestBeanBean implements SessionBean
    {
        private SessionContext context;
        SomeVO someVO;

        public void ejbCreate()
        {
            someVO = new SomeVO(0);
        }

        public void ejbActivate() {}
        public void ejbPassivate() {}
        public void ejbRemove() {}

        public void setSessionContext(SessionContext ctx)
        {
            this.context = ctx;
        }

        public byte[] getSomeDataInBytes(int testSize)
        {
            ArrayList someData = getSomeData(testSize);
            try {
                java.io.ByteArrayOutputStream byteOutputStream = new java.io.ByteArrayOutputStream();
                java.io.ObjectOutputStream objectOutputStream = new java.io.ObjectOutputStream(byteOutputStream);
                objectOutputStream.writeObject(someData);
                objectOutputStream.flush();
                System.out.println(" serialised output size: "+byteOutputStream.size());
                byte[] bytes = byteOutputStream.toByteArray();
                objectOutputStream.close();
                byteOutputStream.close();
                return bytes;
            } catch (Exception e) {
                System.out.println("Serialisation failed: "+e.getMessage());
                return null;
            }
        }

        public ArrayList getSomeData(int testSize)
        {
            // Create array of objects
            ArrayList someData = new ArrayList();
            for (int i=0; i<testSize; i++)
                someData.add(new SomeVO(i));
            // Interlink all the objects
            for (int i=0; i<someData.size()-1; i++)
                for (int j=i+1; j<someData.size(); j++)
                {
                    ((SomeVO)someData.get(i)).addValueObject((SomeVO)someData.get(j));
                    ((SomeVO)someData.get(j)).addValueObject((SomeVO)someData.get(i));
                }
            // print out the data structure
            System.out.println("Data:");
            for (int i = 0; i<someData.size(); i++)
            {
                SomeVO tmp = (SomeVO)someData.get(i);
                System.out.println("Object["+Integer.toString(i)+"] = "+tmp);
                System.out.println("Object["+Integer.toString(i)+"]'s some number = "+tmp.getSomeNumber());
                for (int j = 0; j<tmp.getValueObjectCount(); j++)
                {
                    SomeVO tmp2 = tmp.getValueObject(j);
                    System.out.println(" getValueObject["+Integer.toString(j)+"] = "+tmp2);
                    System.out.println(" getValueObject["+Integer.toString(j)+"]'s some number = "+tmp2.getSomeNumber());
                }
            }
            // Check the serialised size of the structure
            try {
                java.io.ByteArrayOutputStream byteOutputStream = new java.io.ByteArrayOutputStream();
                java.io.ObjectOutputStream objectOutputStream = new java.io.ObjectOutputStream(byteOutputStream);
                objectOutputStream.writeObject(someData);
                objectOutputStream.flush();
                System.out.println("Serialised output size: "+byteOutputStream.size());
                objectOutputStream.close();
                byteOutputStream.close();
            } catch (Exception e) {
                System.out.println("Serialisation failed: "+e.getMessage());
            }
            return someData;
        }
    }
    Here is RMITestBeanHome:
    package com.cramer.test;
    import javax.ejb.EJBHome;
    import java.rmi.RemoteException;
    import javax.ejb.CreateException;
    public interface RMITestBeanHome extends EJBHome
    {
        RMITestBean create() throws RemoteException, CreateException;
    }
    Here is ejb-jar.xml:
    <?xml version = '1.0' encoding = 'windows-1252'?>
    <!DOCTYPE ejb-jar PUBLIC "-//Sun Microsystems, Inc.//DTD Enterprise JavaBeans 2.0//EN" "http://java.sun.com/dtd/ejb-jar_2_0.dtd">
    <ejb-jar>
    <enterprise-beans>
    <session>
    <description>Session Bean ( Stateful )</description>
    <display-name>RMITestBean</display-name>
    <ejb-name>RMITestBean</ejb-name>
    <home>com.cramer.test.RMITestBeanHome</home>
    <remote>com.cramer.test.RMITestBean</remote>
    <ejb-class>com.cramer.test.RMITestBeanBean</ejb-class>
    <session-type>Stateful</session-type>
    <transaction-type>Container</transaction-type>
    </session>
    </enterprise-beans>
    </ejb-jar>
    And finally the application that tests the bean:
    package com.cramer.test;
    import java.util.*;
    import javax.rmi.*;
    import javax.naming.*;
    public class RMITestApplication
    {
        final static boolean HARDCODE_SERIALISATION = false;
        final static int TEST_SIZE = 2;

        public static void main(String[] args)
        {
            Hashtable props = new Hashtable();
            props.put(Context.INITIAL_CONTEXT_FACTORY, "com.evermind.server.rmi.RMIInitialContextFactory");
            props.put(Context.PROVIDER_URL, "ormi://lil8m:23792/alexei");
            props.put(Context.SECURITY_PRINCIPAL, "admin");
            props.put(Context.SECURITY_CREDENTIALS, "admin");
            try {
                // Get the JNDI initial context
                InitialContext ctx = new InitialContext(props);
                NamingEnumeration list = ctx.list("comp/env/ejb");
                // Get a reference to the Home Object which we use to create the EJB Object
                Object objJNDI = ctx.lookup("comp/env/ejb/RMITestBean");
                // Now cast it to an InventoryHome object
                RMITestBeanHome testBeanHome = (RMITestBeanHome)PortableRemoteObject.narrow(objJNDI,RMITestBeanHome.class);
                // Create the Inventory remote interface
                RMITestBean testBean = testBeanHome.create();
                ArrayList someData = null;
                if (!HARDCODE_SERIALISATION)
                {
                    // ############################### Alternative 1 ##############################
                    // ## This relies on marshalling serialisation ##
                    someData = testBean.getSomeData(TEST_SIZE);
                    // ############################ End of Alternative 1 ##########################
                } else {
                    // ############################### Alternative 2 ##############################
                    // ## This gets a serialised byte stream and de-serialises it ##
                    byte[] bytes = testBean.getSomeDataInBytes(TEST_SIZE);
                    try {
                        java.io.ByteArrayInputStream byteInputStream = new java.io.ByteArrayInputStream(bytes);
                        java.io.ObjectInputStream objectInputStream = new java.io.ObjectInputStream(byteInputStream);
                        someData = (ArrayList)objectInputStream.readObject();
                        objectInputStream.close();
                        byteInputStream.close();
                    } catch (Exception e) {
                        System.out.println("Serialisation failed: "+e.getMessage());
                    }
                    // ############################ End of Alternative 2 ##########################
                }
                // Print out the data structure
                System.out.println("Data:");
                for (int i = 0; i<someData.size(); i++)
                {
                    SomeVO tmp = (SomeVO)someData.get(i);
                    System.out.println("Object["+Integer.toString(i)+"] = "+tmp);
                    System.out.println("Object["+Integer.toString(i)+"]'s some number = "+tmp.getSomeNumber());
                    for (int j = 0; j<tmp.getValueObjectCount(); j++)
                    {
                        SomeVO tmp2 = tmp.getValueObject(j);
                        System.out.println(" getValueObject["+Integer.toString(j)+"] = "+tmp2);
                        System.out.println(" getValueObject["+Integer.toString(j)+"]'s some number = "+tmp2.getSomeNumber());
                    }
                }
                // Print out the size of the serialised structure
                try {
                    java.io.ByteArrayOutputStream byteOutputStream = new java.io.ByteArrayOutputStream();
                    java.io.ObjectOutputStream objectOutputStream = new java.io.ObjectOutputStream(byteOutputStream);
                    objectOutputStream.writeObject(someData);
                    objectOutputStream.flush();
                    System.out.println("Serialised output size: "+byteOutputStream.size());
                    objectOutputStream.close();
                    byteOutputStream.close();
                } catch (Exception e) {
                    System.out.println("Serialisation failed: "+e.getMessage());
                }
            } catch(Exception ex) {
                ex.printStackTrace(System.out);
            }
        }
    }
    The parameters you might be interested in playing with are HARDCODE_SERIALISATION and TEST_SIZE defined at the beginning of RMITestApplication.java. The HARDCODE_SERIALISATION is a flag that specifies whether Java serialisation should be used to pass the data across or we should rely on OC4J marshalling. TEST_SIZE defines the size of the object graph and the ArrayList structure. The bigger this size is the more dramatic effect you get from data duplication.
    The test case outputs the structure both on the server and on the client and prints out the size of the serialised structure. That gives us sufficient comparison, as both structure and its size should be the same on the client and on the server.
    The test case also demonstrates that the problem is specific to OC4J. The standard Java serialisation does not suffer the same flaw. However using the standard serialisation the way I did in the test case code is generally unacceptable as it breaks the transparency benefit and complicates interfaces.
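    As an aside on that last point, a small self-contained sketch (separate from the EJB test case above, class names illustrative only) shows the behaviour plain Java serialisation is expected to have: a shared object is written once per stream and comes back as a single instance, so back-references are not duplicated.

        import java.io.*;
        import java.util.ArrayList;

        public class SharedReferenceDemo {
            // A simple serialisable node that can point at other nodes.
            static class Node implements Serializable {
                ArrayList<Node> refs = new ArrayList<Node>();
            }

            public static void main(String[] args) throws Exception {
                Node a = new Node();
                Node b = new Node();
                a.refs.add(b);
                b.refs.add(a);                              // cyclic back-reference
                ArrayList<Node> list = new ArrayList<Node>();
                list.add(a);
                list.add(b);

                // Serialise and deserialise the whole graph in one stream.
                ByteArrayOutputStream bos = new ByteArrayOutputStream();
                ObjectOutputStream oos = new ObjectOutputStream(bos);
                oos.writeObject(list);
                oos.close();
                ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
                ArrayList<Node> copy = (ArrayList<Node>) ois.readObject();
                ois.close();

                // Shared references are preserved: the node referenced from copy.get(0)
                // is the very same instance as copy.get(1), so nothing is duplicated.
                System.out.println(copy.get(0).refs.get(0) == copy.get(1));   // true
                System.out.println(copy.get(1).refs.get(0) == copy.get(0));   // true
            }
        }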
    To run the test case:
    1) Modify provider URL parameter value on line 15 of the RMITestApplication.java for your environment.
    2) Deploy the bean to the server.
    3) Run RMITestApplication on a client PC.
    4) Compare the outputs on the server and on the client.
    I hope someone can reproduce the problem and give their opinion, and possibly point to the solution if there is one at the moment.
    Cheers,
    Alexei

    Hi,
    Eugene, that is the wrong kind of end-user recovery.  Alexey is referring to client desktop end-user recovery, which is entirely different.
    Alexey - As noted in the previous post:
    http://social.technet.microsoft.com/Forums/en-US/bc67c597-4379-4a8d-a5e0-cd4b26c85d91/dpm-2012-still-requires-put-end-users-into-local-admin-groups-for-the-purpose-of-end-user-data?forum=dataprotectionmanager
    Each recovery point has user permissions tied to it, so it's not possible to retroactively give the users permissions.  Implement the below and going forward all users can restore their own files.
    This is a hands off solution to allow all users that use a machine to be able to restore their own files.
     1) Make these two cmd files and save them in c:\temp
     2) Using windows scheduler – schedule addperms.cmd to run daily – any new users that log onto the machine will automatically be able to restore their own files.
    <addperms.cmd>
     Cmd.exe /v /c c:\temp\addreg.cmd
    <addreg.cmd>
     set users=
     echo Windows Registry Editor Version 5.00>c:\temp\perms.reg
     echo [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft Data Protection Manager\Agent\ClientProtection]>>c:\temp\perms.reg
     FOR /F "Tokens=*" %%n IN ('dir c:\users\*. /b') do set users=!users!%Userdomain%\\%%n,
     echo "ClientOwners"=^"%users%%Userdomain%\\bogususer^">>c:\temp\perms.reg
     REG IMPORT c:\temp\perms.reg
     Del c:\temp\perms.reg
    Please remember to click “Mark as Answer” on the post that helps you, and to click “Unmark as Answer” if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread. Regards, Mike J. [MSFT]
    This posting is provided "AS IS" with no warranties, and confers no rights.

  • Error saving data structure CE11000 (please read log) message number KX 655

    While activating the data structure in the operating concern of CO-PA, SAP gives the following errors:
    1. Error saving data structure CE11000 (please read log)
    Message no. KX655
    2. Error saving table CE01000
    Message no. KX593
    3. In the log: Reference field CE31000-REC_WAERS for CE31000-VVQ10001 has incorrect type.
    Please suggest.

    Hey,
    Below tables are related to application logs
    BAL_AMODAL  :                   Application Log: INDX table for amodal communication
    BALC        :                   Application Log: Log or message context            
    BALDAT      :                   Application Log: Log data                          
    BALHANDLE   :                   Application Log: Lock object dummy table           
    BALHDR      :                   Application log: log header                        
    BALHDRP     :                   Application log: log parameter                     
    BAL_INDX    :                   Application Log: INDX tables                       
    BALM        :                   Application log: log message                       
    BALMP       :                   Application log: message parameter                 
    BALOBJ      :                   Application log: objects                           
    BALOBJT     :                   Application log: object texts                      
    BALSUB      :                   Application log: sub-objects                       
    BALSUBT     :                   Application log: Sub-object texts                  
    -Kiran
    *Please mark useful answers

  • Can I automate the creation of a cluster in LabView using the data structure created in an autogenerated .CSV, C header, or XML file?

    Can I automate the creation of a cluster in LabView using the data structure created in an auto generated .CSV, C header, or XML file?  I'm trying to take the data structure defined in one or more of those files listed and have LabView automatically create a cluster with identical structure and data types.  (Ideally, I would like to do this with a C header file only.)  Basically, I'm trying to avoid having to create the cluster by hand, as the number of cluster elements could be very large. I've looked into EasyXML and contacted the rep for the add-on.  Unfortunately, this capability has not been created yet.  Has anyone done something like this before? Thanks in advance for the help.  
    Message Edited by PhilipJoeP on 04-29-2009 04:54 PM

    smercurio_fc wrote:
    Is this something you're trying to do at runtime? Clusters are fixed data structures so you can't change them programmatically. Or, are you just trying to create some typedef cluster controls so that you can use them for coding? What would your clusters basically look like? Perhaps another way of holding the information like an array of variants?
    You can try LabVIEW scripting, though be aware that this is not supported by NI. 
    Wow!  Thanks for the quick response!  We would use this cluster as a fixed data structure.  No need to change the structure during runtime.  The cluster would be a cluster of clusters with multiple levels.  There would be no pattern as to how deep these levels would go, or how many elements would be in each.  Here is the application.  I would like to be able to autocode a Simulink model file into a DLL.  The model DLL would accept a Simulink bus object of a certain data structure (bus of buses), pick out which elements of the bus are needed for the model calculation, and then pass the bus object on.  I then will take the DLL file and use the DLL VI block to pass a cluster into the DLL block (with identical structure as the bus in Simulink).  To save time, I would like to auto-generate the C header file using Simulink to define the bus structure and then have LabView read that header file and create the cluster automatically.  Right now I can do everything but the auto creation of the cluster.  I can manually build the cluster to match the Simulink model bus structure and it runs fine.  But this is only for an example model with a small structure.  I need to make the cluster creation automated so it can handle large structures with minimal brute force. Thanks!

  • What is the best data structure for loading an enterprise Power BI site?

    Hi folks, I'd sure appreciate some help here!
    I'm a kinda old-fashioned gal and a bit of a traditionalist, building enterprise data warehouses out of Analysis Service hypercubes with a whole raft of MDX for analytics.  Those puppies would sit up and beg when you asked them to deliver up goodies
    to SSRS or PowerView.
    But Power BI is a whole new game for me.  
    Should I be exposing each dimension and fact table in the relational data warehouse as a single Odata feed?  
    Should I be running Data Management Gateway and exposing each table in my RDW individually?
    Should I be flattening my stars and snowflakes and creating a very wide First Normal Form dataset with everything relating to each fact? 
    I guess my real question, folks, is what's the optimum way of exposing data to the Power BI cloud?  
    And my subsidiary question is this:  am I right in saying that all the data management, validation, cleansing, and regular ETTL processes are still required
    before the data is suitable to expose to Power BI?  
    Or, to put it another way, is it not the case that you need to have a clean and properly structured data warehouse
    before the data is ready to be massaged and presented by Power BI? 
    I'd sure value your thoughts and opinions,
    Cheers, Donna
    Donna Kelly

    Dear All,
    My original question was: 
    what's the optimum way of exposing data to the Power BI cloud?
    Having spent the last month faffing about with Power BI – and reading about many people’s experiences using it – I think I can offer a few preliminary conclusions.
    Before I do that, though, let me summarise a few points:
    Melissa said “My initial thoughts:  I would expose each dim & fact as a separate OData feed” and went on to say “one of the hardest things . . . is
    the data modeling piece . . . I think we should try to expose the data in a way that'll help usability . . . which wouldn't be a wide, flat table ”.
    Greg said “data modeling is not a good thing to expose end users to . . . we've had better luck with is building out the data model, and teaching the users
    how to combine pre-built elements”
    I had commented “. . . end users and data modelling don't mix . . . self-service so
    far has been mostly a bust”.
    Here at Redwing, we give out a short White Paper on Business Intelligence Reporting.  It goes to clients and anyone else who wants one.  The heart
    of the Paper is the Reporting Pyramid, which states:  Business intelligence is all about the creation and delivery of actionable intelligence to the right audience at the right time
    For most of the audience, that means Corporate BI: pre-built reports delivered on a schedule.
    For most of the remaining audience, that means parameterised, drillable, and sliceable reporting available via the web, running the gamut from the dashboard to the details, available on
    demand.
    For the relatively few business analysts, that means the ability for business users to create their own semi-customised visual reports when required, to serve
    their audiences.
    For the very few high-power users, that means the ability to interrogate the data warehouse directly, extract the required data, and construct data mining models, spreadsheets and other
    intricate analyses as needed.
    On the subject of self-service, the Redwing view says:  Although many vendors want to sell self-service reporting tools to the enterprise, the facts of the matter are these:
    • 80%+ of all enterprise reporting requirements are satisfied by corporate BI . . . if it’s done right.
    • Very few staff members have the time, skills, or inclination to learn and employ self-service business intelligence in the course of their activities.
    I cannot just expose raw data and tell everyone to get on with it.  That way lies madness!
    I think that clean and well-structured data is a prerequisite for delivering business intelligence. 
    Assuming that data is properly integrated, historically accurate and non-volatile as well, then I've just described
    a data warehouse, which is the physical expression of the dimensional model.
    Therefore, exposing the presentation layer of the data warehouse is – in my opinion – the appropriate interface for self-service business intelligence.
    Of course, we can choose to expose perspectives as well, which is functionally identical to building and exposing subject data marts.
    That way, all calculations, KPIs, definitions, and even field names are consistent, because they all come from the single source of truth and not from spreadmart hell.
    So my conclusion is that exposing the presentation layer of the properly modelled data warehouse is – in general - the way to expose data for self-service.
    That’s fine for the general case, but what about Power BI?  Well, it’s important to distinguish between new capabilities in Excel, and the ones in Office 365.
    I think that to all intents and purposes, we’re talking about exposing data through the Data Management Gateway and reading it via Power Query.
    The question boils down to what data structures should go down that pipe. 
    According to
    Create a Data Source and Enable OData Feed in Power BI Admin Center, the possibilities are tables and views.  I guess I could have repeating data in there, so it could be a flattened structure of the kind Melissa doesn’t like (and neither do I). 
    I could expose all the dims and all the facts . . . but that would mean essentially re-building the DW in the PowerPivot DM, and that would be just plain stoopid.  I mean, not a toy system, but a real one with scores of facts and maybe hundreds of dimensions?
    Fact is, I cannot for the life of me see what advantages DMG/PQ
    has over just telling corporate users to go directly to the Cube Perspective they want, that has already all the right calcs, KPIs, security, analytics, field names . . . and most importantly, is already modelled correctly!
    If I’m a real Power User, then I can use PQ on my desktop to pull mashup data from the world, along with all my on-prem data through my exposed Cube presentation layer, and PowerPivot the
    heck out of that to produce all the reporting I’d ever want.  It'd be a zillion times faster reading the data directly from the Cube instead of via the DMG, as well (I think Power BI performance sucks, actually).
    Of course, your enterprise might not
    have a DW, just a heterogeneous mass of dirty unstructured data.  If that’s the case,
    choosing Power BI data structures is the least of your problems!  :-)
    Cheers, Donna
    Donna Kelly

  • Lease Out Contract - Master Data Structural Relations

    Dear Gurus,
    I am new to FX RE and if my question is too simple, please excuse me!
    I am researching the possibilities for the master data structure of lease-in contracts and have some serious confusion.
    The documentation says that objects can be assigned to a contract individually, or many objects can be related to one contract. At the same time it is said that rental units are assigned only to lease-out contracts.
    How should I proceed if my company has many lease-in contracts for different objects? Is it mandatory for a contract to be assigned to an object, or can it be managed just as a contract? I saw that you can create a contract from the RE Navigator; does that mean that in FX RE the contract itself is considered a master data record?
    Any opinions will be highly appreciated.
    Many thanks in advance!

    Dear Kumar,
    Sorry if I've been confusing. I will try to make it clearer.
    I am working for a telco and I am researching the possibility of implementing FX-RE internally. We will not need all the functionalities of the module, only the ones that allow you to manage your lease-in contracts. We've got lots of base station contracts (as you can imagine) where our company is a tenant in a lease contract.
    I did some research into the master data structure and can't work out how our master data should be structured. The business requirements are: handling advance payments, accruals and postings, giving and receiving notices, adjustments of rent, etc. Is it possible to create the contract within the system and set all the conditions directly in it, or do we need another master record other than the contract itself?
    My confusion comes from the fact that the standard lease-out solution suggests creating a rental unit first and then assigning the contract to it. But since we will be tenants we don't need more detailed information on these objects; we just need the conditions in the contract that will let us handle all the activities that arise in connection with it.
    I hope that I have put the question more clearly this time.
    Can you also tell me if it is possible to create a template and, after entering the conditions for a particular contract, generate a hard copy of it from the system?
    Many thanks, and sorry if I can't make myself clearer!

  • Urgent! Need help in deciding data structure to use

    Hi all,
    I need to implement a restaurant system with which a customer can make a reservation.
    I was wondering whether a Vector would be able to store such objects. The thing is, I don't want a complex data structure, just something simple, because I hardly have any time left before my submission.
    I need to be able to search based on 2 properties of an object. E.g. I need to search for a reservation based on the customer name and the date the customer reserved a table.
    But I am totally clueless how to search through a Vector based on 2 properties of the object... I would really appreciate some help, like an example of how to do so based on my program. This is all I have so far:
    import java.io.*;   // for BufferedReader, InputStreamReader, IOException

    class AddReservation
    {
        static BufferedReader stdin = new BufferedReader(new InputStreamReader(System.in));

        //Main Method
        public static void main(String[] args) throws IOException
        {
            String custName, comments;
            int covers, date, startTime, endTime;
            int count = 0;
            //User can only add one reservation at a time
            do
            {
                //Create a new reservation
                Reservation oneReservation = new Reservation();
                System.out.println("Please enter customer's name:");
                System.out.flush();
                custName = stdin.readLine();
                oneReservation.setcustName(custName);
                System.out.println("Please enter number of covers:");
                System.out.flush();
                covers = Integer.parseInt(stdin.readLine());
                oneReservation.setCovers(covers);
                System.out.println("Please enter date:");
                System.out.flush();
                date = Integer.parseInt(stdin.readLine());
                oneReservation.setDate(date);
                System.out.println("Please enter start time:");
                System.out.flush();
                startTime = Integer.parseInt(stdin.readLine());
                oneReservation.setstartTime(startTime);
                System.out.println("Please enter end time:");
                System.out.flush();
                endTime = Integer.parseInt(stdin.readLine());
                oneReservation.setendTime(endTime);
                System.out.println("Please enter comments, if any:");
                System.out.flush();
                comments = stdin.readLine();
                oneReservation.setComments(comments);
                count++;
            } while (count < 1);
        }
    }

    class Reservation
    {
        private String custName, comments;
        private int covers, startTime, endTime, date;

        //Default constructor
        public Reservation() { }

        public Reservation(String custName, int covers, int date, int startTime, int endTime, String comments)
        {
            this.custName = custName;
            this.covers = covers;
            this.date = date;
            this.startTime = startTime;
            this.endTime = endTime;
            this.comments = comments;
        }

        //Setter methods
        public void setcustName(String custName) { this.custName = custName; }
        public void setCovers(int covers) { this.covers = covers; }
        public void setDate(int date) { this.date = date; }
        public void setstartTime(int startTime) { this.startTime = startTime; }
        public void setendTime(int endTime) { this.endTime = endTime; }
        public void setComments(String comments) { this.comments = comments; }

        //Getter methods
        public String getcustName() { return custName; }
        public int getCovers() { return covers; }
        public int getDate() { return date; }
        public int getstartTime() { return startTime; }
        public int getendTime() { return endTime; }
        public String getComments() { return comments; }
    }
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    import java.io.*;   // for BufferedReader, InputStreamReader, IOException

    class searchBooking
    {
        static BufferedReader stdin = new BufferedReader(new InputStreamReader(System.in));

        public static void main(String[] args) throws IOException
        {
            int choice, date, startTime;
            String custName;
            //Search Menu
            System.out.println("Search By: ");
            System.out.println("1. Date");
            System.out.println("2. Name of Customer");
            System.out.println("3. Date & Name of Customer");
            System.out.println("4. Date & Start time of reservation");
            System.out.println("5. Date, Name of customer & Start time of reservation");
            System.out.println("Please make a selection: ");
            //User keys in choice
            System.out.flush();
            choice = Integer.parseInt(stdin.readLine());
            if (choice == 1)
            {
                System.out.println("Please key in Date (DDMMYY):");
                System.out.flush();
                date = Integer.parseInt(stdin.readLine());
            }
            else if (choice == 2)
            {
                System.out.println("Please key in Name of Customer:");
                System.out.flush();
                custName = stdin.readLine();
            }
            else if (choice == 3)
            {
                System.out.println("Please key in Date (DDMMYY):");
                System.out.flush();
                date = Integer.parseInt(stdin.readLine());
                System.out.println("Please key in Name of Customer:");
                System.out.flush();
                custName = stdin.readLine();
            }
            else if (choice == 4)
            {
                System.out.println("Please key in Date (DDMMYY):");
                System.out.flush();
                date = Integer.parseInt(stdin.readLine());
                System.out.println("Please key in Start time:");
                System.out.flush();
                startTime = Integer.parseInt(stdin.readLine());
            }
            else if (choice == 5)
            {
                System.out.println("Please key in Date (DDMMYY):");
                System.out.flush();
                date = Integer.parseInt(stdin.readLine());
                System.out.println("Please key in Name of Customer:");
                System.out.flush();
                custName = stdin.readLine();
                System.out.println("Please key in Start time:");
                System.out.flush();
                startTime = Integer.parseInt(stdin.readLine());
            }
        }
    }
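    A minimal sketch of the kind of two-property search being asked about, assuming the reservations are collected in a single Vector<Reservation> built from the Reservation class above (the helper class and method names here are illustrative only):

        import java.util.Vector;

        class ReservationSearch
        {
            // Linear scan of the vector, keeping entries that match both
            // the customer name and the reservation date.
            static Vector<Reservation> findByNameAndDate(Vector<Reservation> all, String custName, int date)
            {
                Vector<Reservation> matches = new Vector<Reservation>();
                for (int i = 0; i < all.size(); i++)
                {
                    Reservation r = all.get(i);
                    if (r.getcustName().equals(custName) && r.getDate() == date)
                    {
                        matches.add(r);
                    }
                }
                return matches;
            }
        }

    The same pattern extends to the other menu choices by changing which getters are compared.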

    Please stop calling your questions urgent. Everybody's question is urgent to them. Nobody's are urgent to the people who are going to answer them. Calling your questions urgent suggests that you think they are more important than others' (They're not.) and will only serve to irritate those who would help you. It won't get your questions answered any sooner.

  • What's the easiest way to move app data and data structures to a server?

    Hi guys,
    I've been developing my app locally with Apex 4.2 and Oracle 11g XE on Windows 7. It's getting close to the time to move the app to an Oracle Apex server. I imagine Export/Import is the way to move the app. But what about the app tables and data (those tables/data like "customer" and "account" created specifically for the app)? I've been using a data modeling tool, so I can run a DDL script to create the data structures on the server. What is the easiest way to move the app data to the server? Is there a way to move both structures and data in one process?
    Thanks,
    Kim

    There's probably another way to get here, but, in SQL Developer, on the tree navigation, expand the objects down to your table, right click, then click EXPORT.. there you will see all the options. This is a tedious process and it sucks IMO, but yeah, it works. It sucks mostly because 1) it's one table at a time, 2) if your data model is robust and has constraints, and sequences and triggers, then you'll have to disable them all for the insert, and hope that you can re-enable constraints, etc without a glitch (good luck, unless you have only a handful of tables)
    I prefer using the oracle command line EXP to export an entire schema, then on the target server I use IMP to import the schema. That way, it's near exact. This makes life messy if you develop more than one application in a single schema, and I've felt that pain -- however -- it's a whole lot easier to drop tables and other objects than it is to create them! (thus, even if the process of EXP/IMP moved more than you wanted to "move".. just blow away what you don't want on the target after the fact..)
    You could use oracle's datapump method too.
    Alternatively, what can be done, IF you have access to both servers from your SQL developer instance (or if you can tnsping them both already from the command line, you can use SQL*PLUS), is run a script that will identify your apex apps' objects (usually by prefix to object names, like EBA_PROJ_%, etc) and do all the manual work for you. I've created a script that does exactly this so that I can move data from dev to prod servers over a dblink. It's tricky because of the order that must be executed to disable constraints and then re-enable them, and of course, trickier if you don't consistently prefix ALL of your "application objects"... (tables, views, triggers, sequences, functions, procs, indexes, etc)

  • Dictionary Structure---data structure of Dictionary

    I don't know how to build a data structure for a dictionary...
    Somebody said it was built with a binary tree, but I don't know how.
    Can somebody help me?
    Thanks for reading my topic.

    A dictionary is not a tree/hierarchical structure. Have you ever opened a dictionary before?! If you have even once, you already know the structure of a dictionary. Now, how to build that in Java is another question. I'm not sure of your requirements, but you could start by creating an object that represents a single term, its definitions, usages, etc. Then you can create a List of them (and sort them alphabetically). Simple enough?
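    A minimal sketch along those lines, with a small term class and an alphabetically sorted list (all names here are illustrative only):

        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.List;

        // One dictionary term with its definition.
        class DictionaryEntry implements Comparable<DictionaryEntry>
        {
            final String term;
            final String definition;

            DictionaryEntry(String term, String definition)
            {
                this.term = term;
                this.definition = definition;
            }

            // Alphabetical ordering by term, ignoring case.
            public int compareTo(DictionaryEntry other)
            {
                return term.compareToIgnoreCase(other.term);
            }

            public String toString() { return term + ": " + definition; }
        }

        public class SimpleDictionary
        {
            public static void main(String[] args)
            {
                List<DictionaryEntry> entries = new ArrayList<DictionaryEntry>();
                entries.add(new DictionaryEntry("tree", "a woody perennial plant"));
                entries.add(new DictionaryEntry("list", "an ordered collection of items"));
                Collections.sort(entries);   // keep the entries alphabetical
                for (DictionaryEntry e : entries)
                {
                    System.out.println(e);
                }
            }
        }

    If lookup speed matters more than printing in order, a TreeMap<String, String> keyed by term gives the same sorted behaviour with faster searches.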

  • SAP ABAP Proxy - recursive data structure problem

    Hi,
    For our customer we are trying to bind SAP to GW Calendar using GW Web Services. We tried to generate an ABAP Proxy from the groupwise.wsdl file but there are problems: GW uses recursive data structures, which ABAP Proxy cannot use. Is there any simple solution to this problem?
    Best regards
    Pawel

    At least I don't have a clue about ABAP Proxy.
    You are pretty much on your own unless someone
    else has tried it.
    Preston
    >>> On Tuesday, August 03, 2010 at 8:26 AM,
    pawelnowicki<[email protected]> wrote:
    > Hi,
    >
    > For our customer we try to bind SAP with GW Calendar using GW Web
    > Services. We tried to generate ABAP Proxy from groupwise.wsdl file but
    > there are problems: GW uses recursive data structures what ABAP Proxy
    > can not use. Is there any simple solution to this problem?
    >
    > Best regards
    > Pawel

  • ABAP proxy class - data structure

    I generated an ABAP proxy class, and the data structure I want to use is put automatically under an item structure which has the occurrence 0...unbounded.
    1. How can I get rid of this item structure, as it creates another unnecessary level for my mapping?
    2. If my source structure has only 3 levels, and the target structure has more than 3 (including item), how do I map it?
    e.g.
    Source structure: Level 1 (occurrence 1) --> Level 2 (1) --> Level 3 (0..1)
    Target structure: Level 1 (1) --> Level 2 (0..1) --> item (0..unbounded) --> Level 4 (0..1)
    I need to map level 3 from my source to level 4 in the target, but it didn't seem to work.
    Thanks.

    --->1. How can I get rid of this item structure as it will create another unnecessary level for my mapping
    You can delete the proxy at Application Server.....make necessary changes at XI Message Interface and again generate the proxy...
    -->Source structure: Level 1(occurrence 1) > Level 2(1)> Level 3(0..1)
    Target structure: Level 1(1])--> Level 2(0...1) --> item (0..unbounded) ---> Level 4(0..1)
    For this you need to make use of context change features of XI Mapping.
    Regards,

  • PCI_MSI_CAPABILITY data structure is not defined in the wdm.h?

    Hi
    I cannot find the PCI_MSI_CAPABILITY data structure definition in the wdm.h. I found
    PCI_CAPABILITIES_HEADER and
    PCI_PM_CAPABILITY
    etc. are defined in the wdm.h. Am I missing something?
    Thank you,
    Tiger

    The webpage
    https://msdn.microsoft.com/en-us/library/windows/hardware/ff537579%28v=vs.85%29.aspx?f=255&MSPPError=-2147217396 states it is not included.  Why do you think you need to reference it?
    Don Burn Windows Driver Consulting Website: http://www.windrvr.com

  • How will be outbound data structure in sap xi after executing the stored pr

    Hi All
    Can anyone please tell me what the outbound data structure in SAP XI will look like after executing a stored procedure via the sender JDBC adapter?
    Thanks in advance
    regards
    Rams

    Hi..
    My stored procedure is a SELECT and it will be OUTBOUND in PI.
    Will it be the same as the following?
    <resultset>
    <row>
    <column-name1>column-value</column-name1>
    <column-name2></column-name2>
    <column-name3>column-value</column-name3>
    <column-name4></column-name4>
    </row>
    <row>
    <column-name1>column-value</column-name1>
    <column-name2></column-name2>
    </row>
    </resultset>
    Regards
    Rams
    Edited by: Rameshkumar Varanganti on Oct 15, 2008 12:04 PM

  • A robust data structure for a histogram of events over time.

    Hi all,
    I've been thinking for days, I am still lost on how to solve the following:
    Let's say that we have a list of events that occur over a certain time interval. Each event has a start timestamp and a finish timestamp. The events occur in no particular order and span periods of various lengths.
    With respect to the histogram, we mark the time period in which an event occurs with a frequency of one. When events overlap over a time period, we mark that time period with a frequency equal to the number of overlapping events.
    Given this raw data, we want to examine the data's frequency at various granularities such as seconds, minutes, hours, days, weeks, months and years, for ranges of 60 sec, 60 min, 24 hrs, 7 days and 4 weeks respectively.
    So, what is a good intermediate (general) data structure that can store all the various events such that they can easily be transformed to a specific granularity? My goal here is to read the event list once, and then the histogram of event data with respect to the various timeframes can be calculated.
    Does a timeline-like data structure that stores aggregated frequencies solve my problem? With a fine granularity, say seconds, and a long time range, the data structure is bound to overflow.
    Any help is greatly appreciated!
    Brian

    Your data structure would be in terms of classes. You need a class such as:
    class AnEvent {
        String eventName;
        Date startTime;
        Date endTime;
    }
    In your main program have a Vector object, which can store an unlimited number of objects and which will not overflow. Store your event objects in this Vector as they happen. After your time interval of events is completed and you have the final Vector, you can either pass the Vector object to various methods in your main program, to keep it simple, or pass it to specialized graph-drawing classes you would have to write to draw the various histograms.
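    A minimal sketch of how such a list of events could then be bucketed into a histogram for one chosen granularity (the class and method names are illustrative only, and timestamps are taken as milliseconds since the epoch):

        import java.util.HashMap;
        import java.util.Map;

        public class EventHistogram
        {
            // An event reduced to a start and finish timestamp in milliseconds.
            static class Event
            {
                final long start, end;
                Event(long start, long end) { this.start = start; this.end = end; }
            }

            // Count, for each bucket of the given width, how many events overlap it.
            static Map<Long, Integer> histogram(Event[] events, long bucketMillis)
            {
                Map<Long, Integer> counts = new HashMap<Long, Integer>();
                for (Event e : events)
                {
                    long first = e.start / bucketMillis;
                    long last = e.end / bucketMillis;
                    for (long b = first; b <= last; b++)
                    {
                        Integer c = counts.get(b);
                        counts.put(b, c == null ? 1 : c + 1);
                    }
                }
                return counts;
            }

            public static void main(String[] args)
            {
                Event[] events = {
                    new Event(0, 5000),        // 0 s - 5 s
                    new Event(3000, 12000)     // 3 s - 12 s, overlaps the first event
                };
                // Per-second buckets: seconds 3-5 get a count of 2, the rest a count of 1.
                System.out.println(histogram(events, 1000));
            }
        }

    Re-running the same pass with a different bucket width gives the other granularities, so the raw event list only needs to be scanned once per granularity rather than kept in a second-by-second array.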

  • How to create a reference instead of complex type for record in data structure type in BizTalk schema

    I have created the record in a BizTalk schema, but I want to set its Data Structure Type as a reference, e.g. Recordname (Reference), instead of recordname (Complex Type).
    Please help me to sort this out.
     

    You can add a reference to an existing type only.
    You might have to manually edit the XSD file to change the xmlns definition to xmlns:tns. Then, after you define the record type, set the data type to tns:.... - this creates the XML type. Later in your schema you can instantiate this with a name and then set its type to tns:.... (Reference).
    Regards.
