OBI EE running against XE

Can Oracle BI EE run against Oracle XE?
Thanks

Hi
You can check this blog (I pasted the link to part 3; in that post you can find links to the previous posts on the topic):
http://oraclebi.blogspot.com/2005/11/oracle-xe-database-and-discoverer-3.html
Regards, Peter

Similar Messages

  • Imported mapping in template mapping module still runs against old location

    Hello,
    I imported the mapping into a new workspace (located on another server) using the object-names matching strategy, and I reconfigured the source location (it's a generic one) that it uses.
    Looking at the mapping's Configure menu / Table operators / Location, I see the correct location I want to use.
    The template mapping module has only one data location, Default Agent.
    I have synchronized the inbound table information from the repository several times and redeployed the mapping, but it still goes to the old location, which has been unregistered/deleted in the current repository.
    I don't know how to make the mapping run against the location I want. Please help me.
    Thank you

    Hi Alexander
    After copying a table to a different module, edit the module to set its data location, configure the module to define the location, and synchronize the mapping to use this table.
    In OMB, use the OMBSYNCHRONIZE command to synchronize inbound and outbound:
    Inbound:
    OMBSYNCHRONIZE TABLE '$OMB_CURRENT_PROJECT/SALES/TABY' TO MAPPING 'M_W_PARAMS' OPERATOR 'TABY' USE (RECONCILE_STRATEGY 'REPLACE', MATCHING_STRATEGY 'MATCH_BY_OBJECT_NAME')
    Outbound:
    OMBSYNCHRONIZE MAPPING 'M_W_PARAMS' OPERATOR 'TABX' TO TABLE '$OMB_CURRENT_PROJECT/SALES/TABX' USE (RECONCILE_STRATEGY 'REPLACE', MATCHING_STRATEGY 'MATCH_BY_OBJECT_NAME')
    Cheers
    David

  • Get estimated time for SQL without running against the DB - possible?

    hello,
    Is there any way to get a query's execution time without running it against the database? Even an estimate would be good... I am on 10.2.0.3. It might be impossible, but is there any way to get an estimate besides setting autotrace? As I said, I only want a rough estimate of how long the query will take to finish. I know we can use the long-ops view, but, as I said, I want this without running the query against the DB. Is that possible? The explain plan I get shows a Time column, but is that how long it will take to finish?
    select * from table(dbms_xplan.display);
    -----------------------------------------------------------------------------
    | Id  | Operation            | Name | Rows  | Bytes | Cost (%CPU)| Time     |
    -----------------------------------------------------------------------------
    |   0 | SELECT STATEMENT     |      |    56 |  3192 |     9   (0)| 00:00:01 |
    |   1 |  MERGE JOIN CARTESIAN|      |    56 |  3192 |     9   (0)| 00:00:01 |
    |   2 |   TABLE ACCESS FULL  | DEPT |     4 |    80 |     3   (0)| 00:00:01 |
    |   3 |   BUFFER SORT        |      |    14 |   518 |     6   (0)| 00:00:01 |
    |   4 |    TABLE ACCESS FULL | EMP  |    14 |   518 |     2   (0)| 00:00:01 |
    -----------------------------------------------------------------------------
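    For context, the Rows / Cost / Time figures above are optimizer estimates produced without executing the statement at all; a minimal sketch of how such a plan is obtained (the exact SELECT is an assumption here, based on the DEPT/EMP cartesian join shown in the plan):
    -- EXPLAIN PLAN parses and optimizes the statement but never runs it,
    -- so the Time column is derived from the cost estimate, not measured.
    EXPLAIN PLAN FOR
      SELECT * FROM dept, emp;
    -- Show the estimated plan, as in the output above.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);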

    user11168115 wrote:
    hello,
    is there anyways to get a query execution time without running that against the database ?? <snip>
    How long is a string?
    How deep is a hole?
    How long does it take to read an unknown number of blocks containing an unknown number of rows from an unknown disk system with an unknown contention load, pulling it across a network of unknown bandwidth and unknown load ....
    And I'm not saying that if you somehow provided all of this, the answer could be known. I'm saying that too much of the information that would be needed to make such an estimate is unknowable. Much of it will vary widely from one execution of the query to the next execution of the exact same query. Which leaves you with exactly what to base your estimate on?
    Without even seeing your query, I can tell you with absolute certainty that it will take between 1 nano-second and 100 years to execute.

  • To retrieve number of reports run against a database

    Hi, I'm trying to get details of the reports that run against a target database... is there any way of getting this detail? Appreciate the help.

    If the target database is the only selection criterion, you will probably get the best results using the database's own logging and tracing facilities, as in BusinessObjects there isn't any pre-defined function or Activity detail to report on this.
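    As a minimal sketch of the database-side approach (the PROGRAM/MODULE filters below are assumptions and need to be adjusted to whatever your BusinessObjects report servers actually register in V$SESSION):
    -- Sessions opened by the reporting tool against the target database.
    SELECT username, program, module, logon_time
      FROM v$session
     WHERE LOWER(program) LIKE '%busobj%'
        OR LOWER(module)  LIKE '%webi%';
    From there you can count or trace the statements those sessions run, for example with database auditing or SQL trace.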
    Caroline

  • Compiling against developer database, running against production database??

    Hello :-)
    In our company we have a developer- , a test- and a production database, like most of us, I think... :-)
    In former times it was usual to compile forms against the database they would run against. But nowadays, because of new security rules, this is very difficult. Developers have all rights on the developer database, a few on the test database, and nearly none on the production database, and even our DBA is (officially) not allowed to use or know the sys password.
    The developer and the production databases are very similar (both 10g Enterprise Edition Release 10.2.0.3.0 - 64bit, Linux), but not totally equal. Of course there are differences in database objects, because it is the developer database; there could be differences in installed patches; and the production database consists of 4 clusters, while the developer database does not.
    Is it enough to compile the forms (6i and also 10g) against the developer database and run the compiled forms against the production database, or should we compile the forms that go into production strictly against the production database, which is a bit difficult because of our security rules?
    Regards,
    Udo

    Hello, Francois!
    Thanks for your answer, but that was not exactly what I meant...
    What you describe is how it should be :-)
    We develop, deliver the forms and scripts, and "someone" compiles them against the production database.
    But our problem is that it would be a lot of paperwork each time to get the authorization for the rights required on the production database to compile all the objects.
    We have no "production team" that has all these rights, and we are not allowed to do this ourselves...
    So I just want to hear about your experiences: is it enough to compile against the developer database?
    In my experience, probably 98-99% works fine, but sometimes strange things happen (like variables not being passed from one form to another when a form is compiled against the developer database and the attached library, for example, is compiled against the production database)...
    Regards,
    Udo

  • Script to run against ALL AD users in a loop

    I am going to do a SharePoint upgrade this weekend from 2010 to 2013.
    I need this script to run against every Active Directory user automatically, not just one at a time. How do I get this script to do that? I figure I create a pipeline, I just don't know where.
    Here is the script:
    Param(
        [string] $account = $(Read-Host -Prompt "UserAccount")
    )
    Add-PSSnapin Microsoft.SharePoint.PowerShell
    foreach ($wa in Get-SPWebApplication) {
        Write-Host "$($wa.Name) | $($wa.UseClaimsAuthentication)"
        # http://technet.microsoft.com/en-us/library/gg251985.aspx
        $wa.UseClaimsAuthentication = $true
        $wa.Update()
        # Encode the account as a claims identity without overwriting the original value
        $claimsAccount = (New-SPClaimsPrincipal -Identity $account -IdentityType 1).ToEncodedString()
        # Grant the account Full Control through the default zone policy
        $zp = $wa.ZonePolicies("Default")
        $p = $zp.Add($claimsAccount, "PSPolicy")
        $fc = $wa.PolicyRoles.GetSpecialRole("FullControl")
        $p.PolicyRoleBindings.Add($fc)
        $wa.Update()
        $wa.MigrateUsers($true)
        $wa.ProvisionGlobally()
    }
    Please help me! Thank you!

    Hi,
    You need to do something like this:
    # Requires the ActiveDirectory module for Get-ADUser
    Import-Module ActiveDirectory
    $Users = Get-ADUser -Filter *
    foreach ($User in $Users) {
        # YOUR SCRIPT, passing the current user, e.g.
        # ... -Identity $User.SamAccountName ...
        # YOUR SCRIPT
    }
    Seidl Michael | http://www.techguy.at |
    twitter.com/techguyat | facebook.com/techguyat

  • Multiple OWSM Gateway's running against 1 Policy Server

    Is it a supported configuration to have multiple OWSM Gateways running against one policy server?
    Thanks,
    Michael

    It won't trash the applications (see http://java.sun.com/j2se/1.4.2/compatibility.html for 100% proof). If you install a new version, it will be used by default (if you just use "java"). Of course you can still use other JREs/SDKs when you explicitly reference a specific version (e.g. "c:\Programme\j2sdk1.4.2_04\bin\java").

  • HTML DB running against SAP (on Oracle) - any experiences ? known clients ?

    Hi folks,
    does anybody have knowledge of HTML DB running against SAP/Oracle (as some sort of reporting frontend for e.g. controlling depts.) ?
    Any Experiences on that ?
    Any info would be appreciated.
    brgds
    Bernhard

  • NoSuitableDriver exception running against a jtds driver.

    I developed my BC using the SQL flavor and now I am trying to run against SQL Server using the jTDS driver.
    I created my connection and tested it: it works fine.
    I created a configuration using that connection, but when I try to run the tester against the DB I get a NoSuitableDriver error. (No suitable driver found for jdbc:jtds:sqlserver://faac/MSSQLSERVER;domain=faac;)
    Should I register the driver somewhere in my classes?
    What did I miss?
    Tks
    Tullio

    I'm guessing that the embedded WLS server can't find the JDBC library you are using - you can try adding it to the lib directory of the embedded WLS
    Somewhere like: C:\Users\youruser\AppData\Roaming\JDeveloper\system11.1.1.5.37.60.13\DefaultDomain\lib
    or directly to the classpath in the startup script:
    C:\Users\sshmeltz\AppData\Roaming\JDeveloper\system11.1.1.5.37.60.13\DefaultDomain\bin
    or try marking the library containing the JDBC jar to be deployed by default in JDeveloper.

  • HFM rule won't run against ECA

    I have a very simple rule in HFM v11.1.2 that runs against ECA but won't execute during a normal consolidation; it will only execute by using a force calculate command.  Any suggestions to allow this to run during a normal consolidation?  The data in the 1750_3P account is being populated by rules, not by a posted journal.
    Here is the rule:
    If pov_scenario = "ACT" and is_base = True and pov_value = "<Entity Curr Adjs>" Then
            HS.Clear "A#1750_3P"
    End If 'pov_scenario
    Thank you,

    Hi HTM_Ptc,
    What is is_base referring to? This rule should run during calculation. If it is still unsuccessful, would you try using this:
    IF strYEar = " " AND strPeriod = " " AND strScenario = " " Then
    Select Case Hs.Value.Member
    Case "<Entity Curr Adjs>"
    ...[your clear statement]
    End Select
    End IF
    Thanks,
    Anna

  • Find the current sql which is running against database

    Hi,
    How do I find out the current SQL which is running against the database?

    Hi,
    You can use V$SESSION_LONGOPS and V$SQL to get the long-running (6 seconds or above) queries. You need to join these two views to get the SQL text.
    V$SESSION_LONGOPS : will give you the long-running SQL IDs.
    V$SQL : will give you the text of the query for a given SQL ID.
    Refer to the links below for more details:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/dynviews_2092.htm#REFRN30227
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14237/dynviews_2113.htm#REFRN30246
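    As an illustration, a minimal sketch of such a join (the join on SQL_ADDRESS/SQL_HASH_VALUE works on 10g; on later releases joining on SQL_ID is simpler):
    -- Long-running operations joined to the text of the statement that caused them.
    SELECT lo.sid,
           lo.username,
           lo.time_remaining,
           lo.message,
           s.sql_text
      FROM v$session_longops lo
      JOIN v$sql s
        ON s.address    = lo.sql_address
       AND s.hash_value = lo.sql_hash_value
     WHERE lo.time_remaining > 0;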
    Cheers,

  • Reports fail when run against a different data source

    Hello,
    We have a VB.NET 2008 WinForms application running on Microsoft .NET 3.5. We are using Crystal Reports 2008 runtime, service pack 3 -- using the CrystalDecisions.Windows.Forms.CrystalReportViewer in the app to view reports. In the GAC on all our client computers, we have versions 12.0.1100.0 and 12.0.2000.0 of CrystalDecisions.CrystalReports.Engine, CrystalDecisions.Shared, and CrystalDecisions.Windows.Forms.
    Please refer to another one of our posted forum issues, "Critical issue since upgrading from CR9 to CR2008", as these issues seem to be related:
    Critical issue since upgrading from CR9 to CR2008
    We were concerned with report display slow down, and we seem to have solved this by using the Oracle Server driver (instead of either Microsoft's or Oracle's OLEDB driver).  But now we must find a resolution to another piece of the puzzle, which is:  why does a report break if the data source embedded in the .rpt file is different from the one you are trying to run the report against in the .NET viewer?
    Problem:
    If you have a production database name (e.g. "ProdDB") embedded in your .rpt file that you built your report from and try to run that report against a development database (e.g. "DevDB") (OR VICE VERSA -- it is the switch that is the important concept here), the report fails with a list of messages such as this:
        Failed to retrieve data from the database
        Details:  [Database vendor code: 6550 ]
    This only seems to happen if the source of the report data (i.e. the underlying query) is an Oracle stored procedure or a Crystal Reports SQL Command -- the reports run fine against all data sources if the source is a table or a view.  In trying different things to troubleshoot this, including adding a ReportDocument.VerifyDatabase() call after setting the connection information, the Crystal Reports viewer will spit out other nonsensical errors about being unable to find certain fields (e.g. "The field name is not known") or not being able to find the table (even though the source data should be coming from an Oracle stored procedure, not a table).
    When the reports are run in the Crystal Reports Designer, they run fine no matter what database is being used; but the problem only happens while being run in the .NET viewer.  It's almost as if something internally isn't getting fully "set" to the new data source, or something -- we're really grasping at straws here.
    For the sake of completeness of information, here is how we're setting the connection information
            '-- Set database connection info for the main report
            For Each oConnectionInfo In oCrystalReport.DataSourceConnections
                oConnectionInfo.SetConnection(gsDBDataSource, "", gsDBUserID, gsDBPassword)
            Next oConnectionInfo
            '-- Set database connection info for each subreport
            For Each oSubreport In oCrystalReport.Subreports
                For Each oConnectionInfo In oSubreport.DataSourceConnections
                    oConnectionInfo.SetConnection(gsDBDataSource, "", gsDBUserID, gsDBPassword)
                Next oConnectionInfo
            Next oSubreport
    ... but in troubleshooting, we've even tried an "overkill" approach and added this code as well:
            '-- Set database connection info for each table in the main report
            For Each oTable In oCrystalReport.Database.Tables
                With oTable.LogOnInfo.ConnectionInfo
                    .ServerName = gsDBDataSource
                    .UserID = gsDBUserID
                    .Password = gsDBPassword
                    For Each oPair In .LogonProperties
                        If UCase(CStr(oPair.Name)) = "DATA SOURCE" Then
                            oPair.Value = gsDBDataSource
                            Exit For
                        End If
                    Next oPair
                End With
                oTable.ApplyLogOnInfo(oTable.LogOnInfo)
            Next oTable
            '-- Set database connection info for each table in each subreport
            For Each oSubreport In oCrystalReport.Subreports
                For Each oTable In oSubreport.Database.Tables
                    With oTable.LogOnInfo.ConnectionInfo
                        .ServerName = gsDBDataSource
                        .UserID = gsDBUserID
                        .Password = gsDBPassword
                        For Each oPair In .LogonProperties
                            If UCase(CStr(oPair.Name)) = "DATA SOURCE" Then
                                oPair.Value = gsDBDataSource
                                Exit For
                            End If
                        Next oPair
                    End With
                    oTable.ApplyLogOnInfo(oTable.LogOnInfo)
                Next oTable
            Next oSubreport
    ... alas, it makes no difference.  If we run the report against a database that is different from the one specified with "Set Datasource Location" in Crystal, it fails with nonsense errors.

    Thanks for the reply, Ludek.  We have made some breakthroughs, uncovered some Crystal bugs and workarounds, and we're probably 90% there I hope.
    For your first point, unfortunately the information on the Oracle 6550 error was generic, and not much help in our case.  And for your second point, the errors didn't have anything to do with subreports at that time -- the error would manifest itself even in a simple, one-level report.
    However, your third point (pointing us to KB 1553921) helped move us forward quite a bit more.  For the benefit of all, here is a link to that KB article:
    Link: [KB 1553921|http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes_boj/sdn_oss_boj_bi/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/scn_bosap/notes%7B6163636573733d36393736354636443646363436353344333933393338323636393736354637333631373036453646373436353733354636453735364436323635373233443330333033303331333533353333333933323331%7D.do]
    We downloaded the tool referenced there and pointed it at a couple of our reports.  The bottom line is that the code it generated uses a completely new area of the Crystal Reports .NET API which we had not used before -- in the CrystalDecisions.ReportAppServer namespace.  Using code based on what that RasConnectionInfo tool generated, we were able to gain greater visibility into some of the objects in the API and to uncover what I think qualifies as a genuine bug in Crystal Reports.
    The CrystalDecisions.ReportAppServer.DataDefModel.ISCRTable class exposes a property called QualifiedName, something that isn't exposed by the more commonly-used CrystalDecisions.CrystalReports.Engine.Table class.  When changing the data source with our old code referenced above (CrystalDecisions.Shared.ConnectionInfo.SetConnection), I saw that Crystal would actually change the Table.QualifiedName from something like "SCHEMAOWNER.PACKAGENAME.PROCNAME" to just "PROCNAME" (essentially stripping off the schema and package name).  Bad, Crystal...  VERY BAD!  IMHO, Crystal potentially deserves to be swatted on the a** with the proverbial rolled-up newspaper.
    I believe this explains why we were also able to generate errors indicating that field names or tables were not found -- because Crystal had gone and changed the QualifiedName to remove some key info identifying the database object!  So, knowing this and using the code generated by the RasConnectionInfo tool, we were able to work around this bug with code that worked for most of our reports ("most" is the key word here -- more on that in a bit).
    So, first of all, I'll post our new code.  Here is the main area where we loop through all of the tables in the report and subreports:
    '-- Replace each table in the main report with new connection info
    For Each oTable In oCrystalReport.ReportClientDocument.DatabaseController.Database.Tables
        oNewTable = oTable.Clone()
        oNewTable.ConnectionInfo = GetNewConnectionInfo(oTable)
        oCrystalReport.ReportClientDocument.DatabaseController.SetTableLocation(oTable, oNewTable)
    Next oTable
    '-- Replace each table in any subreports with new connection info
    For iLoop = 0 To oCrystalReport.Subreports.Count - 1
        sSubreportName = oCrystalReport.Subreports(iLoop).Name
        For Each oTable In oCrystalReport.ReportClientDocument.SubreportController.GetSubreportDatabase(sSubreportName).Tables
            oNewTable = oTable.Clone()
            oNewTable.ConnectionInfo = GetNewConnectionInfo(oTable)
            oCrystalReport.ReportClientDocument.SubreportController.SetTableLocation(sSubreportName, oTable, oNewTable)
        Next oTable
    Next iLoop
    '-- Call VerifyDatabase() to ensure that the tables update properly
    oCrystalReport.VerifyDatabase()
    (Thanks to Colin Stynes for his post in the following thread, which describes how to handle the subreports):
    Setting subreport connection info at runtime
    There seems to be a limitation on the number of characters in a post on this forum (before all formatting gets lost), so please see my next post for the rest....

  • Issue with brspace running against EDI40

    I have 6.40 SAP system and I am running BRTools version 6.40 with patch level 40.
    The database version is 9.2.0.6, and when I run brspace against the EDI40 table to show table info, I receive the following output:
    ethp5000:ad4254 21> g11brspace -f dbshow -c tbinfo -o "sapr3" -t "edi40"
    BR1001I BRSPACE 6.40 (40)
    BR1002I Start of BRSPACE processing: sdvmyyjb.dbw 2007-06-18 16.50.19
    BR0280I BRSPACE time stamp: 2007-06-18 16.50.28
    BR1009I Name of database instance: G11
    BR1010I BRSPACE action ID: sdvmyyjb
    BR1011I BRSPACE function ID: dbw
    BR1012I BRSPACE function: dbshow
    BR1036I Class of information to be shown: tbinfo
    BR0280I BRSPACE time stamp: 2007-06-18 16.50.40
    BR0692I Display menu 270 # no input possible
    Information about table SAPR3.EDI40
    1 - Partitioned table (partitioned) ..... NO
    2 - Number of partitions (partitions) ...
    3 - Monitoring attribut (monitoring) .... NO
    4 - Parallel degree (degree) ............ 1
    5 - Number of indexes (indexes) ......... 1
    6 - Tablespace name (tablespace) ........ PSAPCLU2D
    7 - Last analyzed (analyzed) ............
    8 - Sample size (sample) ................
    9 - Number of rows (rows) ............... -1
    10 - Allocated space in KB (space) ....... -1
    11 - Used space in KB / % (used) ......... -1 / 0.00
    12 - Pure data in KB / % (data) .......... -1 / 0.00
    13 - Number of chained rows (chained) .... -1
    14 - Next extend size in KB (next) ....... 512000
    15 - Maximum number of extents (maxexts) . 900
    Standard keys: c - cont, b - back, s - stop, r - refr, h - help
    I assume it is related to the fact that in DB02, in the detail analysis for table EDI40, I see that no SAPDBA storage analysis has been performed:
    .  .         :  :00       SAPDBA Storage analysis                          
    Data from DBSTATTORA                                                                               
    No data from SAPDBA Storage analysis availible   
    How do I perform the SAPDBA storage analysis, and is it something that is recommended for the EDI40 table (this is a very big table, 200 GB, and it has a LONG RAW column)?

    Hi Andrija,
    The brspace output makes me think the table EDI40 does not have any statistics in the database. You can double-check by querying directly in SQL*Plus:
    % sqlplus "/ as sysdba"
    SQL> select last_analyzed from dba_tables where table_name = 'EDI40';
    The result of that query should be null.
    Regarding statistics creation, I recommend that you read SAP notes 588668 and 122718.
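    As a minimal sketch only (in an SAP system statistics are normally created with brconnect/SAPDBA as described in those notes, so treat this purely as an illustration of the underlying call), gathering statistics boils down to something like:
    -- Gather optimizer statistics for SAPR3.EDI40 so LAST_ANALYZED and the
    -- row/space figures shown by brspace get populated.
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname          => 'SAPR3',
        tabname          => 'EDI40',
        estimate_percent => 10,      -- sample instead of computing exactly on a 200 GB table
        cascade          => TRUE);   -- include the table's index
    END;
    /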
    Good luck.
    Alfonso

  • MRP Run Against Sales order

    Can you run MRP against a sales order? If so, which tcode can I use to accomplish this?
    Thank you

    Did you try MD50?
    Regards,
    Samson

  • WebI error when running against SAP BW

    Hi all,
    I run a WebI query against BW 7 (XI 3.0) and it displays the following error:
    MDDataSetBW.GetCellData   No more storage space available for extending an internal table.  (WIS 10901)
    To me it seems like a database resources issue.  So how can we further troubleshoot this?
    Or is there any performance tuning can be made on BO side?
    Regards,
    Derek

    Hi Derek,
    If you include all the items in the Web Intelligence query, then they will all retrieve data, and based on the error message it is pretty clear that you are asking for too much data and most likely running out of memory in the backend.
    Try to have fewer dimensions in the query panel and you will not get the error. Remember this is not an OLAP tool: WebI converts the query to MDX and retrieves all the information into WebI to build the microcube. If you have fewer dimensions, the microcube is smaller and the data you need to retrieve from BW is smaller too.
    Best regards,
