Simple queries

Hiya,
I'm trying to run simple queries against my database but get the following error:
Error: Could not fetch DOM element for doc id: 9167 in ..../mydbxml/lib/i386-linux-thread-multi/Sleepycat/DbXml.pm, line 497
Some simplified test code that still generates this error is:
     $fullQuery = 'collection("/local/scratch/ar283/anthology/dbxml/anthology.dbxml")//PAPER';
     $txn = $mgr->createTransaction();
     $context = $mgr->createQueryContext();
     $expression = $mgr->prepare($txn, $fullQuery, $context);
     $results = $expression->execute($txn, $context);
     my $value;
     while ( $results->next($value) ) {
         print "$value\n";
     }
     print $results->size() . " objects returned for expression '$fullQuery'\n";
     $txn->commit();
I've also tried variations on the $results = $expression->execute($txn,$context); line, e.g.
     $results = $mgr->query($txn,$fullQuery,$context);
     $results = $mgr->query($txn,$fullQuery);
But the same error occurs. Can anyone tell me what I'm doing wrong?
Thanks,
Anna

> Have you removed any environment files by accident?

Not that I'm aware of. What files? From within my code? How could I be doing this?

> Once you are in this situation, it won't be fixed other than by recreating the data.

Okay, thanks.

> How far do you get in your application before you hit this problem? I.e. what is working?

Well, in the case of the test query at the end of the building script, I successfully create the database as follows:
     my $theContainer = "/local/scratch/ar283/anthology/dbxml/anthology.dbxml";
     my $env = new DbEnv(0);
     $env->set_cachesize(0, 64 * 1024, 1);
     $env->open("/local/scratch/ar283/anthology/dbxml/",
         Db::DB_INIT_MPOOL|Db::DB_CREATE|Db::DB_INIT_LOCK|Db::DB_INIT_LOG|Db::DB_INIT_TXN|Db::DB_RECOVER, 0);
     my $mgr = new XmlManager($env);
     my $txn = $mgr->createTransaction();
     if ($mgr->existsContainer($theContainer)) {
         $mgr->removeContainer($txn, $theContainer);
         $txn->commit();
         $txn = $mgr->createTransaction();
     }
     my $container = $mgr->openContainer($txn, $theContainer, Db::DB_CREATE);
     $txn->commit();
Then I populate it in a for loop of transactions like:
     my $xmlDoc = $mgr->createDocument();
     $xmlDoc->setContent( $recordString );
     $xmlDoc->setName( $theName );
     $container->putDocument($txn, $xmlDoc);
     $txn->commit();
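To be concrete, one complete pass of that loop looks roughly like this; the loop itself and where $recordString and $theName come from were not shown above, so the @records array here is only a placeholder for illustration:

     # Sketch only: @records is a placeholder for however the per-record name and
     # XML string are actually produced; $mgr and $container come from the setup above.
     foreach my $rec (@records) {
         my ($theName, $recordString) = @$rec;
         my $txn = $mgr->createTransaction();      # one transaction per document
         my $xmlDoc = $mgr->createDocument();
         $xmlDoc->setContent($recordString);
         $xmlDoc->setName($theName);
         $container->putDocument($txn, $xmlDoc);
         $txn->commit();
     }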
That all seems to work fine. When I tag on the following code for querying the container, I get the error message:
     $fullQuery = 'collection("/local/scratch/ar283/anthology/dbxml/anthology.dbxml")//PAPER';
     $txn = $mgr->createTransaction();
     $context = $mgr->createQueryContext();
     $expression = $mgr->prepare($txn, $fullQuery, $context);
     if ($debug) {
         print "$0: full query is $fullQuery, expression is $expression\n";
     }
     $results = $mgr->query($txn, $fullQuery);
     $txn->commit();
I get my debug output of "full query is collection("/local/scratch/ar283/anthology/dbxml/anthology.dbxml")//PAPER, expression is XmlQueryExpression=ARRAY(0x87a7770)", then the next line causes the error.

Similar Messages

  • Simple Queries on Reports

    Hi, I have some simple queries on reports, which are the following. Any help is appreciated!!
    Q1. Where does the following information go?
    a. The heading that I want to print only on the first page of the report, like the report name and
    company name - should it go in the Margin, Header, or Body?
    b. Where do the page numbers go?
    c. Where do field titles/labels go that should print at the beginning of every page of the report?
    Q2. How can I restrict the printing of a column to once, when the same value repeats more than once?
    This is the first column of the report, so if its value is the same in the next rows, I need to print it only once.
    Should I use a format trigger, or can that simply be done by setting some properties? If a format trigger,
    then how?
    Q3. Is it possible to have two parameters in my Parameter Form and, after accepting the first parameter
    value, show the list of values for the second parameter based on the first value chosen by the user?
    e.g. If somebody chooses dept# 10, then the second field displays all the employees of dept# 10.
    Q4. In my report, I always want a last page that displays data from an altogether different query which is in no
    way related to my other main query. This query is pretty simple and I just want to display the
    data always on the last page of the report. How would I handle the layout of this page?
    Thanks!!
    Shalu

    Hi Shalu,
    Q1-
    a) If you want to print the heading on the first page ONLY, then it should be
    in the MAIN SECTION/BODY only (the field's Print Object On property should be
    First Page).
    b) Bottom-right of the MARGIN section.
    c) If the report heading is in the Main Section, then you can keep the field
    titles/labels inside the layout's Main Frame.
    Q2- Go to the Report Wizard and try these two report styles (Group Above and Group Left);
    I think this will solve your problem.
    Q3- I think this is not possible.
    Q4- You can use the Trailer Section of the report.
    Regards,
    Ravi

  • Simple queries in portal

    I am having some difficulty writing what I think should be simple queries for the portal. I cannot find any documentation about the tables and how they are linked. For instance, I need a report that gives information about items such as what folder they are in and what content area. I cannot determine how to link the content area into the query.

    Hi,
    The general structure to display an iView is: Role -> Workset -> Page -> iView.
    If you want to create any of these, go to Content Administration. In the list you will find Content Provided by SAP; right-click on it and select Create iView (or Page, or Workset, or Role).
    Select Create Role. You will see a new page; give the name of the role and the role ID, then click Next and Finish.
    If you want to edit the object, just click on Object Editing. On the right side you will get a page where the Entry Point option should be enabled. Your role is now created.
    Now you have to assign this role to the particular user. Go to User Administration, give the user name and search; you will get your user. Select it, then search for available roles: type your role ID there and click Search. Your role is displayed; click Add and save it.
    That's it, your role is added to the required user.
    Now log off from the portal and log in again, and you will see your role in the top-level navigation.
    If you want, I will send it to you with screenshots.
    Regards,
    venkat

  • Simple queries, but vital!!

    I have 3 very simple queries which I cannot figure out. I have little experience on Dreamweaver MX,
    and am in a position of having to edit and launch an existing site...
    1) I am trying to add a basic border around images and tables - just a thin black line to make them
    look neater - how do I go about this?
    2) To prepare for publishing, I assume I need to save all my pages and images in one place, and
    assign each link/image an address from my C-drive. At the moment everything is stored in, for
    instance, C:\Documents and Settings\Adam Lucas\My Documents\website updates\images etc - is this
    OK or do I need a more simple address to prepare for publishing?
    3) How do I go about publishing my website once all the finalised elements exist on my C-drive?
    Any advice massively appreciated...

    > I have little experience on Dreamweaver MX,
    1. What is your knowledge level of HTML and CSS?
    2. Which Dreamweaver - is it 6.0 or 6.1? If it's the former, please update it to the latter from the
    updater on the Adobe site - there were quite a few quirks in the former that will make your life miserable.
    > 1) I am trying to add a basic border around images and tables - just a thin black line to make
    > them look neater - how do I go about this?
    Andy has your answer - use CSS.
    > 2) to prepare for publishing, I assume I need to save all my pages and images in one place, and
    > assign each link/image an address from my C-drive. At the moment everything is stored in, for
    > instance, C:\Documents and Settings\Adam Lucas\My Documents\website updates\images etc - is this
    > OK or do I need a more simple address to prepare for publishing?
    You *MUST* start with a local defined site. Do you have one? Have you read DW's F1 help about how
    to define a local and a remote site, and how to PUT and GET files?
    Murray --- ICQ 71997575
    Adobe Community Expert
    (If you *MUST* email me, don't LAUGH when you do so!)
    ==================
    http://www.dreamweavermx-templates.com - Template Triage!
    http://www.projectseven.com/go - DW FAQs, Tutorials & Resources
    http://www.dwfaq.com - DW FAQs, Tutorials & Resources
    http://www.macromedia.com/support/search/ - Macromedia (MM) Technotes
    ==================
    "marcoppizio" <[email protected]> wrote in
    message
    news:[email protected]...
    >I have 3 very simple queries which I cannot figure out. I
    have little
    > experience on Dreamweaver MX, and am in a position of
    having to edit and
    > launch
    > an existing site...
    >
    > 1) I am trying to add a basic border around images and
    tables - just a
    > thin
    > black line to make them look neater - how do I go about
    this?
    >
    > 2) to prepare for publishing, I assume I need to save
    all my pages and
    > images
    > in one place, and assign each link/image an address from
    my C-drive. At
    > the
    > moment everything is stored in, for instance,
    C:\Documents and
    > Settings\Adam
    > Lucas\My Documents\website updates\images etc - is this
    OK or do I need a
    > more
    > simple address to prepare for publishing?
    >
    > 3) how do I go about publishing my website once all the
    finalised elements
    > exist on my C-drive?
    >
    > any advice massively appreciated...
    >

  • Extensive IO and CPU for simple queries

    Hi,
    I have a machine using Oracle 9.2.0 and Solaris 10.
    Even for very simple queries, there is very heavy IO and a lot of CPU usage. This only happens on one particular machine (we have the same version of the database and Solaris on another machine and it works fine).
    One example query is when I use Enterprise Manager to get the "configuration" information of the instance; it uses 50% IO. I got the trace file and ran tkprof, with the following results:
    SELECT UNIQUE sp.name, sp.sid, DECODE(p.type, 1, 'Boolean', 2, 'String', 3,'Integer', 4, 'Filename', ' '), sp.value, p.issys_modifiable, p.description FROM v$spparameter sp, v$parameter p WHERE sp.name = p.name ORDER BY sp.name,sp.sid
    call     count    cpu  elapsed   disk  query  current   rows
    Parse        4   0.02     0.01      0      0        0      0
    Execute      4   0.00     0.00      0      0        0      0
    Fetch        9   4.36    34.12   7980      0        0    783
    total       17   4.38    34.13   7980      0        0    783
    Misses in library cache during parse: 1
    Optimizer mode: ALL_ROWS
    Parsing user id: 5 (SYSTEM)
    Rows Row Source Operation
    261 SORT UNIQUE (cr=0 pr=0 pw=0 time=1214116 us)
    261 HASH JOIN (cr=0 pr=0 pw=0 time=1221296 us)
    361485 MERGE JOIN CARTESIAN (cr=0 pr=0 pw=0 time=370609 us)
    261 FIXED TABLE FULL X$KSPSPFILE (cr=0 pr=0 pw=0 time=19777 us)
    361485 BUFFER SORT (cr=0 pr=0 pw=0 time=6413 us)
    1385 FIXED TABLE FULL X$KSPPCV (cr=0 pr=0 pw=0 time=4180 us)
    1379 FIXED TABLE FULL X$KSPPI (cr=0 pr=0 pw=0 time=7001 us)
    It seems Oracle is doing full table scans on X$KSPPCV and X$KSPPI.
    Can anybody give me some suggestions to improve the performance?
    thanks.

    Is there a difference in the query plans on the two machines?
    Did you analyze the SYS and SYSTEM schemas on one system and not the other?
    Are there different initialization parameters on the two machines?
    What do you mean by "it use 50% IO"? I'm not sure what that means and I'm not sure how you're measuring that.
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Simple Queries in PLM

    Hello,
    I have a few simple doubts; can you make them clear?
    Is PLM configuration done in R/3, in SPRO, or is it done on a different server with integration between that server and R/3?
    Where is the configuration of EH&S (Environment, Health and Safety) and Recipe Management done?
    Please tell me, because if these are done in R/3 I will be relieved, as I can go through the tables and do some R&D.
    Also, please say something more about CAD and DMS and how the documents are stored, and what configuration is done on the Basis side and on the functional and technical side. Also specify where this configuration is done, i.e. in R/3 or on other servers.
    It would be very helpful if you could specify some of the TCodes.
    Please give answers for these in your own words, so that I can also start going through some documentation which I can get from SDN later.

    Hi,
    I have tried to answer some of your questions.
    1. Is PLM configuration done in R/3, in SPRO, or is it done on a different server with integration between that server and R/3?
    Mostly, as Chris said, PLM config is done in SPRO.
    2. Where is the configuration of EH&S (Environment, Health and Safety) and Recipe Management done?
    In SPRO: Logistics - General->Product Lifecycle Management (PLM)->Recipe Management and Logistics - General->Product Lifecycle Management (PLM)->Environment, Health and Safety
    3. Also, please say something more about CAD and DMS and how the documents are stored, and what configuration is done on the Basis side and on the functional and technical side. Also specify where this configuration is done, i.e. in R/3 or on other servers.
    You can go through help.sap.com or the wiki: http://wiki.sdn.sap.com/wiki/display/PLM/SAPPLM7.0includingSAPEasyDocumentManagement7.0
    4. It would be very helpful if you could specify some of the TCodes.
    Which tcodes are you looking for?

  • Simple queries on ALUI

    Hi All,
    I am a newbie to AquaLogic Portal and have some queries.
    1- Do we need to worry about session synchronization of any kind between the portal and Tomcat?
    2- Can we validate that we do not receive any requests from users who are signed off or whose session has timed out on the portal server?
    Looking forward to your support.
    Thanks


  • Help Required::::Simple queries from a beginner's perspective for certi

    Hi All,
    I was preparing for the HFM certification.
    While going through the bootcamp exercises, I have come across some problems, which I have formulated as questions.
    Please find below my queries for HFM.
    · There is a property called "Consolidation Rules" in AppSettings. It has the values "Y", "N", and "R". What is the significance of each of these? We have a Sub Consolidate subroutine in the rules file; if we choose "Y" in Consolidation Rules, does HFM pick up the subroutine we wrote in the rules file, or does it perform its own default consolidation?
    · The meaning of ICP Entities Aggregation Weight and its usage.
    · Node Security is an option at the AppSettings level; how is it used to define access at the user/group level?
    · The concept of the "IsConsolidated" property on Accounts.
    · In Scenario, the property called "Zero View for Adj/Non Adj".
    Thanks,
    ColDFirE

    hi
    You are right about Consolidation Rules. This option specifies whether consolidation rules are supported. Specify one of these values:
    Y to use the rules written in the Consolidate() routine in a user-defined rule.
    R to derive the proportional value in the Value dimension. Note that the proportional data is not stored.
    N to use the default consolidation and eliminations.
    IsConsolidated
    Determines if the current Account dimension member or a specified account member is a
    consolidated account. This function can be used in these types of rules:
    ● Calculation
    ● Translation
    ● Consolidation
    ● Allocation
    Ex:HS.Account.IsConsolidated("Account")
    ZeroViewForAdj
    Specifies how to interpret missing, adjusted data values for the period. This attribute is required. Specify YTD or Periodic.
    I guess the admin guide can cover all your questions.
    read http://download.oracle.com/docs/cd/E12825_01/epm.111/hfm_admin.pdf
    regards
    alexander

  • High TimeDataRetrieval in SSRS, for simple queries

    Hi there
    We have a SharePoint 2013 instance running PowerView, that is hitting an SSAS Tabular Model instance in the back-end.
    We are seeing very slow response times in PowerView. I dug into things and noted that if I use a Report Data Source connection that impersonates the current user, it runs very slowly; whereas if I set the Report Data Source in SharePoint to
    "Use the following credentials" and specify the SAME account I am logged in with for the slow case, it all works lightning fast.
    SSAS doesn't seem to be the issue (it has all data in an 'InMemory' SSAS Tabular Model implementation, CPU and RAM usage are very low throughout, and in reviewing Profiler results the query responses seem to be the same on both the SLOW and FAST runs).
    I checked Fiddler, which pointed at the SSRS calls being slow, and on reviewing the SSRS logs I see the following (the same 3 queries were run in each case, returning the same number of records each time, roughly 51; results here are in ms and are for
    the "TimeDataRetrieval" value):
    QUERIES WITHOUT EMBEDDED CREDENTIALS:
    Query 1 - 3074
    Query 2 - 3085
    Query 3 - 84
    QUERIES WITH EMBEDDED CREDENTIALS:
    Query 1 - 76
    Query 2 - 61
    Query 3 - 9
    I also noted that if I run the FAST connection query, then close the IE window, open a new IE window, and then use the OTHER, SLOW connection, it works fast for every request I make for the time being - as if that connection is cached
    somewhere for this user?
    Any thoughts would be greatly appreciated.
    Thanks
    David

    Hi Jude_44,
    Thank you for your question.
    I am trying to involve someone more familiar with this topic to take a further look at this issue. Some delay might be expected while the job is transferred. Your patience is greatly appreciated.
    Thank you for your understanding and support.
    Thanks,
    Wendy Fu
    TechNet Community Support

  • Enormous response times to simple queries

    Dear Forum
    I am using TopLink via JPA in a Spring (v2) environment, using "org.springframework.orm.jpa.LocalEntityManagerFactoryBean" for tests.
    There are 2700 Location objects stored in memory which have one many-to-one relationship and a lot of one-to-many relationships, defined as lazy, to objects that do not yet exist.
    I defined a query "findLocationByCode" ("SELECT DISTINCT l FROM Location l WHERE l.code = :code"), which normally takes about 30 ms!
    When I use it while creating a very complicated object with about 1000 dependent objects, that query may take up to 30 seconds(!). Inspecting the database, there is no relevant activity, and it seems that the time is mainly spent within TopLink (cache?).
    I use Oracle 10g 10.2.0 Enterprise Edition, and I have a 2 GB notebook (Intel Pentium 1.7 GHz) with Windows XP Professional, SP2.
    TopLink: 2.0-b41-beta2 (03/30/2007)
    In the following I append a snapshot of the persistence.xml and the annotations of Location and LocClass classes.
    Y.s.
    Wolfgang
    TopLink settings in persistence.xml
    <property name="toplink.sessions-xml"
         value="META-INF/sessions.xml" />
    <property name="toplink.session-name" value="solver" />
    <property name="toplink.cache.type.default" value="HardWeak" />
    <property name="toplink.cache.size.default" value="15000"/>
    <property name="toplink.cache.type" value="NONE" />
    <property name="toplink.refresh" value="true"/>
    <property name="oracle.orm.throw.exceptions" value="false" />
    <property name="toplink.weaving" value="true" />
    <property name="toplink.jdbc.bind-parameters" value="true" />
    <property name="toplink.jdbc.native-sql" value="true" />
    <property name="toplink.jdbc.batch-writing"
         value="BUFFERED" />
    <property name="toplink.jdbc.cache-statements.size"
         value="100" />
    <property name="toplink.jdbc.read-connections.max"
         value="10" />
    <property name="toplink.jdbc.read-connections.min"
         value="2" />
    <property name="toplink.jdbc.read-connections.shared"
         value="true" />
    <property name="toplink.jdbc.write-connections.max"
         value="4" />
    <property name="toplink.jdbc.write-connections.min"
         value="4" />
    <property name="toplink.logging.level" value="SEVERE" />
    <property name="toplink.logging.timestamp" value="true" />
    <property name="toplink.logging.thread" value="true" />
    <property name="toplink.logging.session" value="true" />
    <property name="toplink.logging.exceptions" value="true" />
    <property name="toplink.logging.file"
    Location Class
    @Entity
    @Table(name = "LOCATION")
    @NamedQuery(
    name="findLocationByCode",
    query="SELECT DISTINCT l FROM Location l WHERE l.code = :code ")
    public class Location implements ILocation {
         // persistent part
         // primary key field
         @Id
         @GeneratedValue
         @Column(name = "LOC_ID")
         transient
         private Long id;
         @Id
         @Column(name = "LOC_CODE", nullable = false)
         private String code;
         @Column(name = "LOC_NAME", nullable = true)
         private String name;
         @ManyToOne(cascade = CascadeType.ALL)
         @JoinColumn(name = "LOC_CLASS_ID")
         private LocClass locClass;
         @Column(name = "LOC_GPSX", nullable = true)
         private Float gpsX;
         @Column(name = "LOC_GPSY", nullable = true)
         private Float gpsY;
         @Column(name = "LOC_GRX", nullable = true)
         private Float grX;
         @Column(name = "LOC_GRY", nullable = true)
         private Float grY;
         @Column(name = "LOC_LOCAL_RADIO", nullable = true)
         private String localRadio;
         @OneToMany(mappedBy = "fromLoc", fetch = FetchType.LAZY, cascade = CascadeType.ALL)
         private Set<Connection> fromConnections = new HashSet();
         @OneToMany(mappedBy = "toLoc", fetch = FetchType.LAZY, cascade = CascadeType.ALL)
         private Set<Connection> toConnections = new HashSet();
         @OneToMany(mappedBy = "location", fetch = FetchType.LAZY, cascade = CascadeType.ALL)
         private Set<ViewLoc> viewLocs = new HashSet();
         @OneToMany(mappedBy = "location", fetch = FetchType.LAZY, cascade = CascadeType.ALL)
         private Set<Entry> entries = new HashSet();
    LocClass Class
    @Entity
    @Table(name = "LOCCLASS")
    @NamedQuery(
    name="findLocClassByCode",
    query="SELECT DISTINCT lc FROM LocClass lc WHERE lc.code = :code")
    public class LocClass implements ILocClass {
    //persistent part
    // primary key field
    @Id
    @GeneratedValue
    @Column(name = "LC_ID")
    transient     
    private Long id;
    @Id
    @Column(name = "LC_CODE", nullable=false)
    public String code;
    @Column(name = "LC_NAME", nullable=false)
    public String name;
    @Column(name = "LC_DIS_NAME", nullable=false)
    public Boolean displayLocName;
    @Column(name = "LC_D_LINE_MODE", nullable=false)
    public Integer displayLineMode;
    @Column(name = "LC_D_RUNTIME")
    public Boolean displayRuntime;
    @Column(name = "LC_SWITCH_POSS")
    public Boolean switchPossibility;

    Hello Wolfgang,
    You mention your notebook has 2 GB, but not how much memory or heap space the JVM is configured to use (or how much it is actually using). If you haven't tuned the JVM, this could be one cause of the slowdown.
    I do not see anything wrong with your entities - assuming the LOC_CLASS_ID field is a string type anyway. One thing to try is to mark the many-to-one relationship as lazy, or bring it in using a fetch join with the location query, i.e.:
    SELECT DISTINCT l FROM Location l join fetch l.locClass WHERE l.code = :code
    Best Regards,
    Chris
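    For illustration only (this sketch is not part of the original reply): the Location mapping from the post, trimmed to the relevant fields, with the many-to-one marked lazy and the named query rewritten as the fetch join suggested above. Lazy many-to-one fetching in TopLink generally relies on weaving, which the persistence.xml above already enables (toplink.weaving = true).

    import javax.persistence.*;

    // Sketch only: trimmed version of the posted Location entity, not the original code.
    @Entity
    @Table(name = "LOCATION")
    @NamedQuery(
        name = "findLocationByCode",
        query = "SELECT DISTINCT l FROM Location l JOIN FETCH l.locClass WHERE l.code = :code")
    public class Location {
        @Id
        @GeneratedValue
        @Column(name = "LOC_ID")
        private Long id;

        @Column(name = "LOC_CODE", nullable = false)
        private String code;

        // Lazy: LocClass is loaded only when accessed, or pre-fetched by the join fetch above.
        @ManyToOne(fetch = FetchType.LAZY, cascade = CascadeType.ALL)
        @JoinColumn(name = "LOC_CLASS_ID")
        private LocClass locClass;
    }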

  • Some simple queries on TimesTen

    Hi, I am completely new to TimesTen; I was just wondering how it can interact with standard Oracle databases.
    Q1.What is the connection method to TimesTen ? i.e. is it Sql*Net ?
    Q2. What is used to backup TimesTen databases ? Are they basically a backup of the transaction logs, therefore any file based backup method ? eg standard backup facility of the OS on which TimesTen is deployed ?
    Q3. I presume tools like Data Pump, export/import etc do not work with TimesTen ?
    Q4. Can TimesTen be used as a Data Source for Oracle BI Server ( i.e. OBIEE ) or alternatively to hold the BI Repository ?
    Q5. Are all the basic DML and DDL commands for Sql, available to be used with TimesTen ?
    thanks,
    Jim

    Hello Jim. With regards to your questions:
    Q1.What is the connection method to TimesTen ? i.e. is it Sql*Net ?
    A1: When TimesTen communicates with Oracle it uses regular SQL*Net mechanisms. To Oracle DB it appears as just another client.
    Q2. What is used to backup TimesTen databases ? Are they basically a backup of the transaction logs, therefore any file based backup method ? eg standard backup facility of the OS on which TimesTen is deployed ?
    A2: TimesTen provides its own backup and restore utilities (ttBackup/ttRestore) which allow you to create online, transactionally consistent backups, both full and incremental. These tools are the only supported way to back up a TimesTen database. You should not use OS-level file backup tools to back up an active database, as the resulting 'backup' will not be consistent and will most likely not be usable.
    Q3. I presume tools like Data Pump, export/import etc do not work with TimesTen ?
    A3: Correct, however TimesTen does have its own set of tools that provide many capabilities.
    Q4. Can TimesTen be used as a Data Source for Oracle BI Server ( i.e. OBIEE ) or alternatively to hold the BI Repository ?
    A4: Yes, TimesTen is supported as a data source for OBIEE. In fact TimesTen is one of the key technologies within the OBIEE stack in the Exalytics BI engineered system. It is not currently supported to store the BI Repository in TimesTen.
    Q5. Are all the basic DML and DDL commands for Sql, available to be used with TimesTen ?
    A5: Yes, though there are differences in the syntax and features supported by TimesTen compared to Oracle DB. But TimesTen supports most of the usual SQL.
    You can find out a lot more detail about TimesTen by skimming through the presentations, whitepapers and documentation available here:
    http://www.oracle.com/technetwork/database/database-technologies/timesten/overview/index.html
    Chris

  • Using simple queries

    This question was posted in response to the following article: http://help.adobe.com/en_US/ColdFusion/9.0/Developing/WSc3ff6d0ea77859461172e0811cbec0f375-7fdb.html

    I think your example is wrong. Based on the example search results, shouldn't all the examples state "film", not "filming"? (The last 3 show "filming".)

  • OPA simple queries

    I am very new to OPA and have some questions.
    1) What is an interview?
    2) What is Web Determinations?
    Can anyone provide me with a link to documentation on OPA basics?

    *1)* It is possible to use the screenflow functionality in OPA without having defined any interview screens. It means, of course, that you'd be limited to only putting Decision points in your screenflow, which would kinda defeat the purpose of designing a screen flow. You couldn't put any interview screens in your screenflow, since interview screens need to exist first before they can be added to a flow. So, while it's technically possible to use the screenflow functionality without having created interview screens, it would be highly unusual and not best practice.
    OPM Help section on Interviews and Flows: http://www.oracle.com/technology/products/applications/policy-automation/help/opm10_1/Content/Interviews%20and%20flows/Interviews_and_flows.htm
    OPM Help article on defining screenflows: http://www.oracle.com/technology/products/applications/policy-automation/help/opm10_1/Content/Interviews%20and%20flows/Define_interview_screenflow.htm
    *2)* Not really sure what you're asking here?
    You don't have to have a screenflow at all in a rule project. There are many other ways to impact how the interview flows, e.g. backward chaining in the engine, and also the Screen Order functionality. The majority of the rulebases I build do not use the screenflow functionality, but instead use the Screen Order functionality and the backward chaining in the engine.
    OPM Help article on Screen Order: http://www.oracle.com/technology/products/applications/policy-automation/help/opm10_1/Content/Interviews%20and%20flows/Define_interview_screen_order.htm
    *3)* If you have screens which may not be relevant (depending on what the user has previously answered earlier in the interview), then it would probably be easier to use the Screen Order functionality rather than the Screenflow functionality.
    With Screen Order, the screens will not automatically always be displayed. They will be displayed if they contain a question which is relevant to the goal you are evaluating. Screenflow can be designed to do this as well, but it's more effort, so why not just use Screen Order.
    If you have data you always want to collect and you just want to walk users through the same screens in the same sequence, then Screenflow could be useful. Having said this, you still don't necessarily need Screenflow for this as it can also be done through procedural rules.
    There are some notes about procedural rules in this OPM Help article: http://www.oracle.com/technology/products/applications/policy-automation/help/opm10_1/Content/Designing%20rule%20documents/Identify_what_rules_are_needed.htm
    Cheers,
    Jasmine

  • XMB2DB_XML queries with join

    Using XMB2DB_XML I have done some simple queries like select, insert, update in SAP XI 2.0.
    Now I want to do queries involving joins...
    Any idea about the XML document format for the mode XMB2DB_XML?
    Abhijeet

    Hi,
    You will probably need to create a join on the database side and use the SELECT within XMB2DB_XML.
    XMB2DB_XML does not provide joins of tables.
    Regards,
    Bill

  • OBIEE generates complex queries

    Hi,
    Currently we are experiencing performance issues, i.e. some queries are running for a very long time. I examined these queries and it struck me that this could be caused by the complexity of the generated queries. For example, a query accessing only one fact table accessed that fact table 5 times (the queries are combined through a WITH statement)! I wonder why this is necessary. Business Objects XI does not generate such complex queries; BO XI would access that fact table only once.
    Basically my question is: which settings affect the queries generated by OBIEE? Isn't it possible to generate different, simpler queries?
    TIA,
    EriK

    Thanks guys. Bad design could be a reason... but we're using Oracle's own product called BI-APPS and I expect that to be perfect... :-)
    Here is an example of a query that is imho way too complex:
    WITH
    SAWITH0 AS (select sum(T258218.LINE_AMT * T258218.GLOBAL1_EXCHANGE_RATE) as c1,
    T551303.PROJECT_NUMBER as c2,
    T258218.PURCH_ORDER_NUM as c3
    from
    W_PROJECT_D T551303 /* Dim_W_PROJECT_D_Project */ ,
    W_PURCH_COST_F T258218 /* Fact_W_PURCH_COST_F */
    where ( T258218.DELETE_FLG = 'N' and T258218.PROJECT_WID = T551303.ROW_WID and T551303.PROJECT_NUMBER = '101491' )
    group by T258218.PURCH_ORDER_NUM, T551303.PROJECT_NUMBER),
    SAWITH1 AS (select distinct SAWITH0.c2 as c1,
    SAWITH0.c3 as c2,
    SAWITH0.c1 as c3,
    SAWITH0.c1 as c4,
    SAWITH0.c1 as c5
    from
    SAWITH0),
    SAWITH2 AS (select distinct SAWITH1.c1 as c1,
    SAWITH1.c2 as c2,
    SAWITH1.c3 as c3
    from
    SAWITH1),
    SAWITH3 AS (select sum(T258218.LINE_AMT * T258218.GLOBAL1_EXCHANGE_RATE) as c1,
    T551303.PROJECT_NUMBER as c2,
    T258218.PURCH_ORDER_NUM as c3
    from
    W_PROJECT_D T551303 /* Dim_W_PROJECT_D_Project */ ,
    W_PURCH_COST_F T258218 /* Fact_W_PURCH_COST_F */
    where ( T258218.DELETE_FLG = 'N' and T258218.PROJECT_WID = T551303.ROW_WID and T551303.PROJECT_NUMBER = '101491' )
    group by T258218.PURCH_ORDER_NUM, T551303.PROJECT_NUMBER),
    SAWITH4 AS (select distinct SAWITH3.c2 as c1,
    SAWITH3.c3 as c2,
    SAWITH3.c1 as c3,
    SAWITH3.c1 as c4,
    SAWITH3.c1 as c5
    from
    SAWITH3),
    SAWITH5 AS (select sum(SAWITH4.c4) as c4,
    SAWITH4.c1 as c5
    from
    SAWITH4
    group by SAWITH4.c1),
    SAWITH6 AS (select sum(T258218.LINE_AMT * T258218.GLOBAL1_EXCHANGE_RATE) as c1,
    T551303.PROJECT_NUMBER as c2,
    T258218.PURCH_ORDER_NUM as c3
    from
    W_PROJECT_D T551303 /* Dim_W_PROJECT_D_Project */ ,
    W_PURCH_COST_F T258218 /* Fact_W_PURCH_COST_F */
    where ( T258218.DELETE_FLG = 'N' and T258218.PROJECT_WID = T551303.ROW_WID and T551303.PROJECT_NUMBER = '101491' )
    group by T258218.PURCH_ORDER_NUM, T551303.PROJECT_NUMBER),
    SAWITH7 AS (select distinct SAWITH6.c2 as c1,
    SAWITH6.c3 as c2,
    SAWITH6.c1 as c3,
    SAWITH6.c1 as c4,
    SAWITH6.c1 as c5
    from
    SAWITH6),
    SAWITH8 AS (select sum(SAWITH7.c5) as c3
    from
    SAWITH7)
    select SAWITH2.c1 as c1,
    SAWITH2.c2 as c2,
    SAWITH2.c3 as c3,
    SAWITH5.c4 as c4,
    SAWITH8.c3 as c5
    from
    SAWITH2,
    SAWITH5,
    SAWITH8
    where ( nvl(SAWITH2.c1 , 'q') = nvl(SAWITH5.c5 , 'q') and nvl(SAWITH2.c1 , 'z') = nvl(SAWITH5.c5 , 'z') )
    Basically the query is:
    select sum(T258218.LINE_AMT * T258218.GLOBAL1_EXCHANGE_RATE) as c1,
    T551303.PROJECT_NUMBER as c2,
    T258218.PURCH_ORDER_NUM as c3
    from
    W_PROJECT_D T551303 /* Dim_W_PROJECT_D_Project */ ,
    W_PURCH_COST_F T258218 /* Fact_W_PURCH_COST_F */
    where ( T258218.DELETE_FLG = 'N' and T258218.PROJECT_WID = T551303.ROW_WID and T551303.PROJECT_NUMBER = '101491' )
    group by T258218.PURCH_ORDER_NUM, T551303.PROJECT_NUMBER
    but this query is repeated multiple times. It looks like a self join - but why is OBIEE generating such a complex query? And can it be affected?
