Ajax LOVs with a SQL source
Hi,
I have two pages where I'm using Ajax to populate cascading LOVs (town and suburb). One page creates a record and the other edits the same record. I have 2 LOVs. On the 'create' page the Ajax works perfectly and lets me choose a suburb (LOV2) based on the town selected in LOV1. On the edit page, however, the saved suburb does not show in LOV2 even though I am passing it in the source (SQL query). I want LOV2 to show the selected suburb and then, if you click on the dropdown, show the other suburbs available under the town selected in LOV1.
Any ideas? Or is there a more elegant way to handle this?
Thanks
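The edit-page symptom usually comes down to ordering: the suburb list has to be repopulated for the saved town before the saved suburb can be re-selected, otherwise the select falls back to empty. A minimal sketch of that ordering (Python, all data and names hypothetical; this only models the parent/child refresh order, not APEX itself):

```python
# Hypothetical town -> suburb data; stands in for the cascading LOV source.
SUBURBS = {
    "Auckland": ["Ponsonby", "Parnell"],
    "Wellington": ["Te Aro", "Thorndon"],
}

def populate_child(town):
    """What the Ajax call does: rebuild the child list for the chosen town."""
    return SUBURBS.get(town, [])

def select_saved_value(options, saved):
    """A select list can only show a value that exists among its options."""
    return saved if saved in options else None

# Create page: the town is picked first, so the child list is populated
# before a suburb is chosen -- this path works.
assert select_saved_value(populate_child("Auckland"), "Parnell") == "Parnell"

# Edit page: if the saved suburb is assigned before the child list has been
# repopulated for the saved town, the select falls back to empty.
assert select_saved_value([], "Parnell") is None

# The fix: on page load, repopulate the child list for the saved town first,
# then re-select the saved suburb.
```

In APEX terms that suggests firing the same cascade refresh on page load (before the saved value is assigned to LOV2), rather than relying on the LOV source query alone.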
Hi
Here it is:
0.00: S H O W: application="173" page="10" workspace="" request="" session="6993176175380586"
0.01: Language derived from: FLOW_PRIMARY_LANGUAGE, current browser language: en-us
0.01: alter session set nls_language="AMERICAN"
0.01: alter session set nls_territory="AMERICA"
0.01: NLS: CSV charset=WE8MSWIN1252
0.01: ...NLS: Set Decimal separator="."
0.01: ...NLS: Set NLS Group separator=","
0.01: ...NLS: Set date format="DD-MON-RR"
0.01: ...Setting session time_zone to +00:00
0.01: Setting NLS_DATE_FORMAT to application date format: DD/MM/YYYY
0.01: ...NLS: Set date format="DD/MM/YYYY"
0.01: NLS: Language=en-us
0.01: Application 173, Authentication: CUSTOM2, Page Template: 55669127687495456
0.01: ...Determine if user "WEYKU6B3" workspace "31732801716199613" can develop application "173" in workspace "31732801716199613"
0.01: ...ok to reuse builder session for user:ANONYMOUS
0.01: ...Application session: 6993176175380586, user=ANONYMOUS
0.01: ...Determine if user "WEYKU6B3" workspace "31732801716199613" can develop application "173" in workspace "31732801716199613"
0.01: ...Check for session expiration:
0.01: Session: Fetch session header information
0.01: Saving g_arg_names=P10_E_ID and g_arg_values=314
0.02: Fetch session state from database
0.02: ...Session State: Save Item "P10_E_ID" newValue="314" "escape_on_input="N"
0.02: Saving g_arg_names=P10_PR_ID and g_arg_values=
0.02: ...Session State: Save Item "P10_PR_ID" newValue="" "escape_on_input="N"
0.02: ...Metadata: Fetch page attributes for application 173, page 10
0.02: Branch point: BEFORE_HEADER
0.02: Fetch application meta data
0.03: Setting NLS_DATE_FORMAT to application date format: DD/MM/YYYY
0.03: ...NLS: Set date format="DD/MM/YYYY"
0.03: Computation point: BEFORE_HEADER
0.03: Processing point: BEFORE_HEADER
0.03: ...Process "GET_NEXT_E_ID": PLSQL (BEFORE_HEADER) DECLARE v_e_count NUMBER; BEGIN select count(1) into v_e_count from ENTITY where E_ID > :P10_E_ID; IF v_e_count > 1 THEN SELECT MIN(E_ID) INTO :P10_NEXT_E_ID FROM ENTITY WHERE E_ID > :P10_E_ID; END IF; END;
0.03: ...Session State: Save Item "P10_NEXT_E_ID" newValue="334" "escape_on_input="N"
0.03: ...Process "GET_PREV_ID": PLSQL (BEFORE_HEADER) DECLARE v_e_count NUMBER; BEGIN select count(1) into v_e_count from ENTITY where E_ID < :P10_E_ID; IF v_e_count > 1 THEN SELECT MAX(E_ID) INTO :P10_PREV_E_ID FROM ENTITY WHERE E_ID < :P10_E_ID; END IF; END;
0.04: Show page template header
0.04: Computation point: AFTER_HEADER
0.04: Processing point: AFTER_HEADER
0.04: Computation point: BEFORE_BOX_BODY
0.04: Processing point: BEFORE_BOX_BODY
0.04: Region: Edit Buyer
Similar Messages
-
LOV with PL/SQL in APEX 4.0
I want to create a LOV in APEX 4.0 with the help of a PL/SQL block, but when I paste my PL/SQL block code into the List of Values area and try to run it, it shows me the error "Rendering Page ITEM F300_P1510_PROJECT_TASK raised the following error: ORA-20001: Query must begin with SELECT or WITH", where F300_P1510_PROJECT_TASK is my LOV name.
Please help me fix this.
Hi,
The way to do this that comes to mind (and I know it works because I have done it) is rather long and uses Ajax:
1. Create an On Demand Application Process that returns this as a string
<optgroup label="Swedish Cars">
<option value="volvo">Volvo</option>
<option value="saab">Saab</option>
</optgroup>
<optgroup label="German Cars">
<option value="mercedes">Mercedes</option>
<option value="audi">Audi</option>
</optgroup>
2. Write a JavaScript function (HTML Header) that will
a. Make an Ajax call to the Application Process in step 1
b. replace the options in the select tag with the result of the ajax call
3. Call the js function in step 2 using onload from HTML Body Attribute.
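The string the on-demand process in step 1 returns is just the `<optgroup>`/`<option>` markup assembled from (group, value, label) rows. A minimal sketch of that assembly (Python purely for illustration; the real on-demand process would build the same string in PL/SQL, and the row data here is hypothetical):

```python
def build_optgroups(rows):
    """rows: iterable of (group_label, value, label), ordered by group_label."""
    out, current = [], None
    for group, value, label in rows:
        if group != current:
            if current is not None:
                out.append("</optgroup>")  # close the previous group
            out.append('<optgroup label="%s">' % group)
            current = group
        out.append('<option value="%s">%s</option>' % (value, label))
    if current is not None:
        out.append("</optgroup>")  # close the last group
    return "".join(out)

rows = [
    ("Swedish Cars", "volvo", "Volvo"),
    ("Swedish Cars", "saab", "Saab"),
    ("German Cars", "mercedes", "Mercedes"),
    ("German Cars", "audi", "Audi"),
]
html = build_optgroups(rows)
assert '<optgroup label="Swedish Cars">' in html
assert '<option value="audi">Audi</option>' in html
```

The JavaScript in step 2b then only has to assign this string as the innerHTML of the select element.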
The pure SQL/PL-SQL way would be as follows:
a. Create a PIPELINED function that returns the (display, return) pairs of values
b. Use the above function in the SELECT of the LOV
Maybe there is a simpler way, but it eludes me.
Regards,
Edited by: Prabodh on Aug 20, 2010 4:23 PM -
Hi everyone
I would like to have an LOV that returns the list based on a dynamic SQL statement based on page items.
Example:
:P1_ID_COLNAME := 'empno';
:P1_VALUE_COLNAME := 'name';
'SELECT ' || :P1_VALUE_COLNAME || ', ' || :P1_ID_COLNAME || ' FROM emp'
resulting in
SELECT name, empno FROM emp
My PL/SQL skills are fairly limited. However, my assumption would be to use EXECUTE IMMEDIATE within a function and return a REF CURSOR containing the dynamic query.
Do you agree with that? Or have you even got other ideas?
Appreciate your thoughts.
Regards,
Michael
I've found the solution myself.
It's even easier than I thought:
APEX allows a LOV source to be a VARCHAR2 return value of a function:
LOV source
RETURN f_get_query(:P1_VALUE_COLNAME, :P1_ID_COLNAME);
function on DB
FUNCTION f_get_query(
in_value_colname VARCHAR2,
in_id_colname VARCHAR2
) RETURN VARCHAR2
IS
BEGIN
RETURN 'SELECT ' || in_value_colname || ', ' || in_id_colname || ' FROM emp';
END;
Michael -
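One caution on this approach: the function concatenates the page-item values straight into the query, so whatever is in those items becomes SQL. The usual guard is to validate the column names against a whitelist before building the string. A minimal sketch (Python for illustration; the column list is hypothetical, and in PL/SQL the same check would precede the RETURN):

```python
# Hypothetical whitelist of columns the LOV may select from EMP.
ALLOWED = {"empno", "ename", "name", "sal"}

def get_query(value_col, id_col):
    """Build the LOV query only from pre-approved column names."""
    if value_col not in ALLOWED or id_col not in ALLOWED:
        raise ValueError("column not allowed: %s, %s" % (value_col, id_col))
    return "SELECT %s, %s FROM emp" % (value_col, id_col)

print(get_query("name", "empno"))  # SELECT name, empno FROM emp
```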
Interactive Report with PL/SQL Function Source
Is it possible to create an interactive report with a PL/SQL function source returning a query? If not, has anyone done any work to simulate the interactive reporting features for a normal report using the API?
I haven't tried that before but you could:
1. create a collection from your result set returned by a dynamic query,
2. create a view on that collection,
3. use the view in your interactive report.
The usability of this proposal depends on how "dynamic" your query is - does it always have the same number of columns or not?
Denes Kubicek
http://deneskubicek.blogspot.com/
http://www.opal-consulting.de/training
http://apex.oracle.com/pls/otn/f?p=31517:1
------------------------------------------------------------------- -
SQL Developer 3 EA2 - Problem with formatting PL/SQL source code
I'm using SQL Developer 3 EA2 and have played with source code formatting.
I have 2 issues with formatting in the source code editor.
I have a PL/SQL string like the following, extending over several lines.
s VARCHAR2(2000) := 'SELECT
col1, col2, col3
FROM table
WHERE col4 = :var';
The result after applying "format" to my source code is:
s VARCHAR2(2000) := 'SELECT
col1, col2, col3
FROM table
WHERE col4 = :var';
The second is that SQL Developer camelizes the previous line if I type a whitespace in PL/SQL source code.
How to switch that off??
The last issue is really annoying!
Christian
I am having exactly the same problem. Every time you use Format Ctrl/F7 it adds new line feeds. Code starts off as:
command := '
select account_status, default_tablespace, temporary_tablespace,
to_char(created,"YYYY-MON-DD HH24:MI:SS"), profile
from Dba_Users@@
where username=:1' ;
After the first Format Ctrl/F7 I get an extra blank line:
command := '
select account_status, default_tablespace, temporary_tablespace,
to_char(created,"YYYY-MON-DD HH24:MI:SS"), profile
from Dba_Users@@
where username=:1' ;
Then after the second Format Ctrl/F7 I get THREE extra lines! It goes exponential!
command := '
select account_status, default_tablespace, temporary_tablespace,
to_char(created,"YYYY-MON-DD HH24:MI:SS"), profile
from Dba_Users@@
where username=:1' ;
So far I've only really encountered the problem with dynamic SQL, which ignores the extra line feeds.
I am pretty sure this is a long-standing SQL Developer format problem, going back to V2. -
I need to host a website with a SQL database - Azure pricing details are too confusing
Hello,
I need to give a potential client a hosting price for a somewhat simple web application they want me to build. I told them it shouldn't be a problem. After gathering the requirements, I figured I would use the following technology to build and host it:
ASP.NET 4.5
MVC 5
1 SQL Database ~ 25GB with options to expand and also with a backup
SSL certificate needed
Hosting would be on Azure because I have some experience using Visual Studio 2012 and integrating the Visual Studio Online (TFS) source code and scrum web applications. I've never actually spun up a website with a SQL database on Azure before, but I imagined it wouldn't be too difficult to find a general hosting plan to support the above requirements.
The use of the website will be very simple and limited to basic CRUD operations. It will support forms authentication using the Identity 2.0 framework. The web application's main purpose is to fill out a form for new accounts, with a search page for those accounts and a page to view a created account and add notes to it. So performance-wise it isn't asking for much; I just want it to be fast and secure.
So I start looking at Azure's pricing landing page (can't put links in here, but search Azure pricing on Bing) and I see the Pricing Calculator, so I click it.
First thing I notice is that the Websites tab doesn't mention SQL Database - in fact, Data Management is a separate tab from Websites. And if I make my selections on the Websites tab, the estimated monthly price doesn't carry over when I go to the Data Management tab - so I get the impression I have to make two separate purchases.
I'm not exactly sure if the Pay as You Go billing would be okay, because it's a bit scary to leave every monthly payment somewhat up to chance. I would love to know if there are other payment options for what I described above.
I want to use Azure to host my asp.net website - it makes sense and the integration with Visual Studio is amazing. I love the publish feature for both MVC 5 Projects and SQL Database Projects.
Thanks in advance for the help!
Hello jdevanderson,
I suggest that you start by looking at the pricing tiers for Azure Websites. This link will give you clarity on the different service tiers that are available:
http://azure.microsoft.com/en-in/pricing/details/websites/
You can gauge your requirement and choose the service tier accordingly.
And regarding the database, you are right about it: you will be charged separately for the database. You can refer to this link, which will give you clarity on SQL Database pricing:
http://azure.microsoft.com/en-in/pricing/details/sql-database/
Refer to this link for more information on 'How pricing works':
http://azure.microsoft.com/en-in/pricing/
Use the full calculator to add your website and the database to get an estimated cost:
http://azure.microsoft.com/en-in/pricing/calculator/?scenario=full
Thanks,
Syed Irfan Hussain -
11i EBS XML Publisher Report with Multiple Data Source
I need to create an XML Publisher report in 11i EBS pulling data from another 10.7 EBS instance as well as 11i EBS in a single report.
I am not allowed to create extract or use db links.
My problem is how to create Data Source Connection using Java Concurrent Program.
The approach I am trying is:
1. Create a Java concurrent program to establish the connection to the 10.7 instance.
2. Write the SQL queries in the data template with two data sources: 1 for 11i EBS and 2 for 10.7 EBS.
3. The template will show the data from both queries in one report.
Is there any other way to proceed using the data source API?
thanks
option 1:
The query should be the same at detail level; only the template has to be different for summary and detail.
At runtime, the user can choose to see the detail or the summary.
Disadvantage: if the data is huge, performance suffers.
Advantage: only one report.
option 2:
Create two separate reports, summary and detail,
with different data and different layouts, kept as separate reports.
Advantage: the query runs according to which report the user chooses (summary/detail), so you can write an efficient query.
Disadvantage: two reports' queries/templates to be maintained. -
Building portlets with PL/SQl vs java
Hi
We are planning to use the Oracle 9iAS application server (Enterprise Edition) to build a portal application.
Can someone suggest which one I should use: building portlets with J2EE or building portlets with PL/SQL?
What are the advantages of web providers over database providers? Which is the best way of building portlets?
Hello
I've been using Portal for years now, and I'm still developing in PL/SQL. It's very simple and quick to develop. I'm not even using any database providers; I'm just invoking my procedures via their URLs with some Ajax hidden components, and I could develop screens like employee vacation management, a staff photo board, portal statistics, etc.
I learnt at Oracle how to develop some "true" portlets with DB providers, but it's not useful for me at this time as I don't need portlet customization, etc.
BUT
if I had to develop a really big project with several developers, I would use DB providers.
And maybe I would use Java, but it takes a rather long time to become efficient with that language, and it would need to be a really big project before I started using it. And as explained above, Java offers more compatibility with 3rd-party products.
And last but not least, one has to know that Oracle is dealing more and more with Java; the new 11g version that has just been released uses Java much more than 10g does, and that's true for every Oracle product.
So it's just a matter of skill and time.
A. -
FDMEE BefImport script for SQL source
Hi Experts ,
I am new to FDMEE and trying to implement it for our process, which should pull the data from a SQL view and load it into Essbase. I am trying to use the open interface table and hence building a BefImport event script with the help of the admin guide example. I know I am missing many things in the following script, but I'm not sure what; can anyone please help and guide me to build a correct BefImport event script?
Below is the script:
# Sample to import data from a custom SQL source and upload into FDMEE
# open interface table. This script should be called in BefImport Event.
# This is alternate to the FDMEE integration import script.
import java.sql as sql
batchName = "Batch_" + str(fdmContext["LOC_ACT"])
insertStmt = """
INSERT INTO AIF_OPEN_INTERFACE (
BATCH_NAME
,COL01
,COL02
,COL03
,COL04
,COL05
,COL06
,COL07
,COL08
,AMOUNT
) VALUES (?,?,?,?,?,?,?,?,?,?)
"""
# JDBC URL format below assumes the Microsoft SQL Server driver.
sourceConn = sql.DriverManager.getConnection("jdbc:sqlserver://hyperion-financial-dev;databaseName=Custom_hfm_dev1", "testuser", "password")
# Limiting number of rows to 5 during the test runs.
selectStmt = "select top 5 * from dbo.ACTUAL_ALL_USD where FYear = '2014' and period = '1'"
stmt = sourceConn.prepareStatement(selectStmt)
stmtRS = stmt.executeQuery()
while stmtRS.next():
    params = [batchName, stmtRS.getString("account"), stmtRS.getString("entity"),
              stmtRS.getString("Dim1"), stmtRS.getString("Dm2"), stmtRS.getString("Dim3"),
              stmtRS.getString("Dim4"), stmtRS.getString("Dim5"), stmtRS.getString("Dim6"),
              stmtRS.getBigDecimal("Data")]
    fdmAPI.executeDML(insertStmt, params, False)
fdmAPI.commitTransaction()
stmtRS.close()
stmt.close()
sourceConn.close()
Thanks
VJ
Here is the log:
5,489 INFO [AIF]: -------START IMPORT STEP-------
2015-04-16 11:37:35,630 DEBUG [AIF]: CommData.preImportData - START
2015-04-16 11:37:35,630 DEBUG [AIF]: CommData.getRuleInfo - START
2015-04-16 11:37:35,630 DEBUG [AIF]:
SELECT brl.RULE_ID, br.RULE_NAME, brl.PARTITIONKEY, brl.CATKEY, part.PARTVALGROUP, br.SOURCE_SYSTEM_ID, ss.SOURCE_SYSTEM_TYPE
,CASE
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE = 'FILE' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE = 'EPM' THEN 'N'
ELSE 'Y'
END SOURCE_ADAPTER_FLAG
,app.APPLICATION_ID, app.TARGET_APPLICATION_NAME, app.TARGET_APPLICATION_TYPE, app.DATA_LOAD_METHOD, brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1 WHEN 'PLAN2' THEN 2 WHEN 'PLAN3' THEN 3 WHEN 'PLAN4' THEN 4 WHEN 'PLAN5' THEN 5 WHEN 'PLAN6' THEN 6 ELSE 0
END PLAN_NUMBER
,br.INCL_ZERO_BALANCE_FLAG, br.PERIOD_MAPPING_TYPE, br.INCLUDE_ADJ_PERIODS_FLAG, br.BALANCE_TYPE ACTUAL_FLAG
,br.AMOUNT_TYPE, br.BALANCE_SELECTION, br.BALANCE_METHOD_CODE
,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
,br.CURRENCY_CODE, br.BAL_SEG_VALUE_OPTION_CODE, brl.EXECUTION_MODE
,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
,COALESCE(brl.CHECK_FLAG, 'N') CHECK_FLAG
,CASE
WHEN ss.SOURCE_SYSTEM_TYPE = 'EPM' THEN 'NONE'
WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
ELSE 'NONE'
END LEDGER_GROUP_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
,br.LEDGER_GROUP
,(SELECT brd.DETAIL_CODE FROM AIF_BAL_RULE_DETAILS brd WHERE brd.RULE_ID = br.RULE_ID AND brd.DETAIL_TYPE = 'LEDGER') PS_LEDGER
,CASE lg.LEDGER_TEMPLATE WHEN 'COMMITMENT' THEN 'Y' ELSE 'N' END KK_FLAG
,p.LAST_UPDATED_BY, p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL, p.EPM_ORACLE_INSTANCE
,brl.JOURNAL_FLAG, br.MULTI_PERIOD_FILE_FLAG, br.IMPGROUPKEY, imp.IMPSOURCELEDGERID
,imp.IMPGROUPFILETYPE, imp.IMPTARGETSOURCESYSTEMID, imp.IMPSOURCECOAID, part.PARTTARGETAPPLICATIONID
FROM AIF_PROCESSES p
INNER JOIN AIF_BAL_RULE_LOADS brl
ON brl.LOADID = p.PROCESS_ID
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = brl.APPLICATION_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = br.PARTITIONKEY
INNER JOIN TBHVIMPGROUP imp
ON imp.IMPGROUPKEY = part.PARTIMPGROUP
LEFT OUTER JOIN AIF_COA_LEDGERS l
ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
AND scr.RECNAME = 'LED_GRP_TBL'
LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
AND lg.SETID = scr.SETID
AND lg.LEDGER_GROUP = br.LEDGER_GROUP
WHERE p.PROCESS_ID = 309
2015-04-16 11:37:35,645 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME FROM AIF_COA_SEGMENTS cs WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME FROM AIF_COA_SEGMENTS cs WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME FROM AIF_COA_SEGMENTS cs WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME FROM AIF_COA_SEGMENTS cs WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME FROM AIF_COA_SEGMENTS cs WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5) COA_SEGMENT_NAME5
,(SELECT DISTINCT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 28
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 13
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
AND dim.TARGET_DIMENSION_CLASS_NAME <> 'ICPTRANS'
AND (adim.VALID_FOR_PLAN1 = 1 OR dim.TARGET_DIMENSION_CLASS_NAME = 'LOOKUP')
ORDER BY adim.BALANCE_COLUMN_NAME
2015-04-16 11:37:35,661 DEBUG [AIF]: {'APPLICATION_ID': 28L, 'IMPORT_FROM_SOURCE_FLAG': u'Y', 'PLAN_TYPE': u'PLAN1', 'RULE_NAME': u'HFMAct', 'ACTUAL_FLAG': None, 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'D:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 2L, 'BAL_SEG_VALUE_OPTION_CODE': None, 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'OTHERS', 'CHECK_FLAG': u'N', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'Corp', 'RECALCULATE_FLAG': u'Y', 'SOURCE_SYSTEM_ID': 16L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'IMPGROUPKEY': None, 'AMOUNT_TYPE': None, 'DATA_TABLE_NAME': 'TDATASEG', 'EXPORT_TO_TARGET_FLAG': u'N', 'JOURNAL_FLAG': None, 'SOURCE_APPLICATION_ID': None, 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'UD1', u'UD2', u'UD3', u'UD4', u'UD5', u'UD6'], 'FCI_FLAG': 'N', 'IMPSOURCECOAID': 0L, 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD6': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD6', 'DIMENSION_ID': 33L}, u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 30L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 26L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 
'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 23L}, u'UD5': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD5', 'DIMENSION_ID': 32L}, u'UD4': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 31L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 27L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Version', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 29L}}, 'TARGET_APPLICATION_TYPE': u'HPL', 'PARTITIONKEY': 13L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'NONE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': None, 'PLAN_NUMBER': 1L, 'MULTI_PERIOD_FILE_FLAG': None, 'PS_LEDGER': None, 'BALANCE_SELECTION': None, 'IMPGROUPFILETYPE': u'ODI', 'BALANCE_AMOUNT_IS': u'PERIODIC', 'RULE_ID': 31L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'Y', 'BALANCE_METHOD_CODE': None, 'SIGNAGE_METHOD': u'ABSOLUTE', 'WEB_SERVICE_URL': u'http://STAAP1655D.R02.XLGS.LOCAL:6550/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI', 'PARTTARGETAPPLICATIONID': 28L, 'IMPTARGETSOURCESYSTEMID': 0L}
2015-04-16 11:37:35,661 DEBUG [AIF]: CommData.getRuleInfo - END
2015-04-16 11:37:35,692 DEBUG [AIF]: CommData.insertPeriods - START
2015-04-16 11:37:35,692 DEBUG [AIF]: CommData.getLedgerListAndMap - START
2015-04-16 11:37:35,692 DEBUG [AIF]: CommData.getLedgerSQL - START
2015-04-16 11:37:35,692 DEBUG [AIF]: CommData.getLedgerSQL - END
2015-04-16 11:37:35,692 DEBUG [AIF]:
SELECT COALESCE(br.SOURCE_LEDGER_ID,0) SOURCE_LEDGER_ID
,NULL SOURCE_LEDGER_NAME
,NULL SOURCE_COA_ID
,br.CALENDAR_ID
,NULL SETID
,NULL PERIOD_TYPE
,NULL LEDGER_TABLE_NAME
FROM AIF_BALANCE_RULES br
WHERE br.RULE_ID = 31
2015-04-16 11:37:35,692 DEBUG [AIF]: CommData.getLedgerListAndMap - END
2015-04-16 11:37:35,692 DEBUG [AIF]: doAppPeriodMappingsExist - Corp: Y
2015-04-16 11:37:35,692 DEBUG [AIF]: Period mapping section: ERPI/EXPLICIT/BUDGET/APFY
2015-04-16 11:37:35,692 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_PERIODS (
PROCESS_ID
,PERIODKEY
,PERIOD_ID
,ADJUSTMENT_PERIOD_FLAG
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,GL_PERIOD_NAME
,GL_PERIOD_CODE
,GL_EFFECTIVE_PERIOD_NUM
,YEARTARGET
,PERIODTARGET
,IMP_ENTITY_TYPE
,IMP_ENTITY_ID
,IMP_ENTITY_NAME
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_ID
,TRANS_ENTITY_NAME
,PRIOR_PERIOD_FLAG
,SOURCE_LEDGER_ID
SELECT DISTINCT brl.LOADID PROCESS_ID
,pp.PERIODKEY PERIODKEY
,prd.PERIOD_ID
,COALESCE(prd.ADJUSTMENT_PERIOD_FLAG, 'N') ADJUSTMENT_PERIOD_FLAG
,COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) GL_PERIOD_YEAR
,COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) GL_PERIOD_NUM
,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') GL_PERIOD_NAME
,COALESCE(prd.PERIOD_CODE, CAST(COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) AS VARCHAR(38)),'0') GL_PERIOD_CODE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) GL_EFFECTIVE_PERIOD_NUM
,pp.YEARTARGET YEARTARGET
,pp.PERIODTARGET PERIODTARGET
,'PROCESS_BAL_IMP' IMP_ENTITY_TYPE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) IMP_ENTITY_ID
,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0')+' ('+pp.PERIODDESC+')' IMP_ENTITY_NAME
,'PROCESS_BAL_TRANS' TRANS_ENTITY_TYPE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) TRANS_ENTITY_ID
,pp.PERIODDESC TRANS_ENTITY_NAME
,'N' PRIOR_PERIOD_FLAG
,0 SOURCE_LEDGER_ID
FROM (
AIF_BAL_RULE_LOADS brl
INNER JOIN TPOVCATEGORY pc
ON pc.CATKEY = brl.CATKEY
INNER JOIN TPOVPERIODADAPTOR_FLAT_V pp
ON pp.PERIODFREQ = pc.CATFREQ
AND pp.PERIODKEY >= brl.START_PERIODKEY
AND pp.PERIODKEY <= brl.END_PERIODKEY
AND pp.INTSYSTEMKEY = 'Corp'
INNER JOIN TPOVPERIODSOURCE ppsrc
ON ppsrc.PERIODKEY = pp.PERIODKEY
AND ppsrc.MAPPING_TYPE = 'EXPLICIT'
AND ppsrc.SOURCE_SYSTEM_ID = 16
AND ppsrc.CALENDAR_ID IN ('Jan')
LEFT OUTER JOIN AIF_GL_PERIODS_STG prd
ON prd.PERIOD_ID = ppsrc.PERIOD_ID
AND prd.SOURCE_SYSTEM_ID = ppsrc.SOURCE_SYSTEM_ID
AND prd.CALENDAR_ID = ppsrc.CALENDAR_ID
AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
WHERE brl.LOADID = 309
ORDER BY pp.PERIODKEY
,GL_EFFECTIVE_PERIOD_NUM
2015-04-16 11:37:35,692 DEBUG [AIF]: periodSQL - periodParams: ['N', 'PROCESS_BAL_IMP', 'PROCESS_BAL_TRANS', 0L, u'Corp', u'EXPLICIT', 16L, u'Jan', 'N', 309]
2015-04-16 11:37:35,708 DEBUG [AIF]: CommData.insertPeriods - END
2015-04-16 11:37:35,708 DEBUG [AIF]: CommData.getPovList - START
2015-04-16 11:37:35,708 DEBUG [AIF]:
SELECT DISTINCT brl.PARTITIONKEY, part.PARTNAME, brl.CATKEY, cat.CATNAME, pprd.PERIODKEY
,COALESCE(pp.PERIODDESC, CONVERT(VARCHAR,pprd.PERIODKEY,120)) PERIODDESC
,brl.RULE_ID, br.RULE_NAME, CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
FROM AIF_BAL_RULE_LOADS brl
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = brl.PARTITIONKEY
INNER JOIN TPOVCATEGORY cat
ON cat.CATKEY = brl.CATKEY
INNER JOIN AIF_PROCESS_PERIODS pprd
ON pprd.PROCESS_ID = brl.LOADID
LEFT OUTER JOIN TPOVPERIODADAPTOR pp
ON pp.PERIODKEY = pprd.PERIODKEY
AND pp.INTSYSTEMKEY = 'Corp'
LEFT OUTER JOIN TLOGPROCESS tlp
ON tlp.PARTITIONKEY = brl.PARTITIONKEY
AND tlp.CATKEY = brl.CATKEY
AND tlp.PERIODKEY = pprd.PERIODKEY
AND tlp.RULE_ID = brl.RULE_ID
WHERE brl.LOADID = 309
ORDER BY brl.PARTITIONKEY, brl.CATKEY, pprd.PERIODKEY, brl.RULE_ID
2015-04-16 11:37:35,708 DEBUG [AIF]: CommData.getPovList - END
2015-04-16 11:37:35,708 DEBUG [AIF]: CommData.insertImportProcessDetails - START
2015-04-16 11:37:35,708 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
SELECT PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,'TDATASEG' TARGET_TABLE_NAME
,CURRENT_TIMESTAMP EXECUTION_START_TIME
,NULL EXECUTION_END_TIME
,0 RECORDS_PROCESSED
,'PENDING' STATUS
,'admin' LAST_UPDATED_BY
,CURRENT_TIMESTAMP LAST_UPDATE_DATE
FROM (
SELECT DISTINCT PROCESS_ID
,IMP_ENTITY_TYPE ENTITY_TYPE
,IMP_ENTITY_ID ENTITY_ID
,IMP_ENTITY_NAME ENTITY_NAME
,(COALESCE(SOURCE_LEDGER_ID,0) * 100000000 + GL_EFFECTIVE_PERIOD_NUM) ENTITY_NAME_ORDER
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 309
) q
ORDER BY ENTITY_NAME_ORDER
2015-04-16 11:37:35,724 DEBUG [AIF]: CommData.insertImportProcessDetails - END
2015-04-16 11:37:35,724 DEBUG [AIF]: Comm.doScriptInit - START
2015-04-16 11:37:35,833 DEBUG [AIF]: fdmContext: {BATCHSCRIPTDIR=D:\Oracle\Middleware\user_projects\epmsystem1\FinancialDataQuality, INBOXDIR=D:\AppFDM\FDMEE\inbox, LOCNAME=HFM_ACT, SOURCENAME=HFMSQL, APPID=28, SOURCEID=16, APPROOTDIR=D:\AppFDM\FDMEE, IMPORTFORMAT=HFMACTUALSQL, SCRIPTSDIR=D:\AppFDM\FDMEE\data\scripts, EPMORACLEHOME=D:\Oracle\Middleware\EPMSystem11R1, TARGETAPPTYPE=HPL, RULEID=31, CATNAME=Actual, EPMORACLEINSTANCEHOME=D:\Oracle\Middleware\user_projects\epmsystem1, LOADID=309, PERIODNAME=Jan-15, IMPORTMODE=null, SOURCETYPE=OTHERS, PERIODKEY=2015-01-31, EXPORTFLAG=N, TARGETAPPDB=PLAN1, TARGETAPPNAME=Corp, LOCKEY=13, RULENAME=HFMAct, OUTBOXDIR=D:\AppFDM\FDMEE\outbox, MULTIPERIODLOAD=N, EXPORTMODE=null, CATKEY=2, USERNAME=admin, FILEDIR=null, IMPORTFLAG=Y, USERLOCALE=null}
2015-04-16 11:37:35,849 DEBUG [AIF]: The executeEventScript is set to: YES
2015-04-16 11:37:35,849 DEBUG [AIF]: The AppRootFolder is set to: D:\AppFDM\FDMEE
2015-04-16 11:37:35,849 DEBUG [AIF]: The JavaHome is set to: %EPM_ORACLE_HOME%/../jdk160_35
2015-04-16 11:37:35,849 DEBUG [AIF]: The OleDatabaseProvider is set to: SQLNCLI
2015-04-16 11:37:35,849 DEBUG [AIF]: Comm.doScriptInit - END
2015-04-16 11:37:35,849 DEBUG [AIF]: Comm.executeScript - START
2015-04-16 11:37:35,849 INFO [AIF]: Executing the following script: D:\AppFDM\FDMEE/data/scripts/event/BefImport.py
2015-04-16 11:37:35,849 ERROR [AIF]: The script has failed to execute:
2015-04-16 11:37:35,895 DEBUG [AIF]: Comm.finalizeProcess - START
2015-04-16 11:37:35,895 DEBUG [AIF]: CommData.updateRuleStatus - START
2015-04-16 11:37:35,895 DEBUG [AIF]:
UPDATE AIF_BALANCE_RULES
SET STATUS = CASE 'FAILED'
WHEN 'SUCCESS' THEN
CASE (
SELECT COUNT(*)
FROM AIF_PROCESS_DETAILS pd
WHERE pd.PROCESS_ID = 309
AND pd.STATUS IN ('FAILED','WARNING')
WHEN 0 THEN 'SUCCESS'
ELSE (
SELECT MIN(pd.STATUS)
FROM AIF_PROCESS_DETAILS pd
WHERE pd.PROCESS_ID = 309
AND pd.STATUS IN ('FAILED','WARNING')
END
ELSE 'FAILED'
END
WHERE RULE_ID = 31
2015-04-16 11:37:35,911 DEBUG [AIF]: CommData.updateRuleStatus - END
2015-04-16 11:37:35,911 FATAL [AIF]: Error in COMM Pre Import Data
2015-04-16 11:37:35,911 DEBUG [AIF]: Comm.updateProcess - START
2015-04-16 11:37:35,911 DEBUG [AIF]: Comm.updateProcess - END
2015-04-16 11:37:35,911 DEBUG [AIF]: The fdmAPI connection has been closed.
2015-04-16 11:37:35,911 INFO [AIF]: FDMEE Process End, Process ID: 309 -
Create a report with PL/SQL
Hi,
I have two pages: the first page contains two text fields and a submit button. In the first text field you can enter a name, and in the second a number, so you can search for a record by name or by number.
On the second page, the report is generated depending on which text field was used on the first page.
I tried to define a region source with PL/SQL for the report, but nothing appears on the report page, although the record I was looking for exists in the database.
begin
if :ENTERNAME IS NOT NULL then
FOR item IN (select "TB_PERSON_INSTITUTION"."PI_ID" as "PI_ID",
"TB_PERSON_INSTITUTION"."PI_NAME" as "PI_NAME",
"TB_PERSON_INSTITUTION"."PI_VORNAME" as "PI_VORNAME"
from "TB_PERSON_INSTITUTION" "TB_PERSON_INSTITUTION"
where upper("TB_PERSON_INSTITUTION"."PI_NAME") like upper(:ENTERNAME||'%'))
loop
DBMS_OUTPUT.PUT_LINE('Last name = ' || item.PI_NAME ||
', First name = ' || item.PI_VORNAME);
end loop;
end if;
end;
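For reference, a report region never renders DBMS_OUTPUT; the usual approach is a region of type "SQL Query (PL/SQL function body returning SQL query)" that returns the query text and lets the region render the rows. A minimal sketch using the table and item names from the post (the NULL check keeps the region valid when nothing has been entered):

```sql
-- Sketch for a report region of type
-- "SQL Query (PL/SQL function body returning SQL query)":
-- return the query text; the bind variable does the filtering.
begin
  return 'select PI_ID, PI_NAME, PI_VORNAME
            from TB_PERSON_INSTITUTION
           where :ENTERNAME is null
              or upper(PI_NAME) like upper(:ENTERNAME) || ''%''';
end;
```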
Regards,
Mark

Hi,
OK, thanks. I tried a report of type "SQL Query (PL/SQL function body returning SQL query)" and made a few changes to the SQL statement so that a second table is also included:
declare
My_select varchar2(4000);
begin
if :TEXTEINGABENAME IS NOT NULL then
My_select:='SELECT
"TB_ADRESSE"."A_PLZ" "A_PLZ",
"TB_ADRESSE"."A_ORT" "A_ORT",
"TB_ADRESSE"."A_ID" "A_ID",
"TB_PERSON_INSTITUTION"."PI_MITGLIEDSNUMMER" "PI_MITGLIEDSNUMMER",
"TB_PERSON_INSTITUTION"."PI_NAME" "PI_NAME",
"TB_PERSON_INSTITUTION"."PI_VORNAME" "PI_VORNAME",
"TB_PERSON_INSTITUTION"."PI_ERGAENZUNG" "PI_ERGAENZUNG",
"TB_PERSON_INSTITUTION"."PI_ERGAENZUNG1" "PI_ERGAENZUNG1",
"TB_PERSON_INSTITUTION"."PI_ID" "PI_ID"
FROM
"TB_ADRESSE" "TB_ADRESSE",
"TB_PERSON_INSTITUTION" "TB_PERSON_INSTITUTION"
WHERE "TB_PERSON_INSTITUTION"."PI_ID" = "TB_ADRESSE"."A_F_PERSON_INSTITUTION"
AND upper("TB_PERSON_INSTITUTION"."PI_NAME") like upper(:TEXTEINGABENAME)||''%''';
else
if :TEXTMITGLIEDSNUMMER is not null then
My_select:='SELECT
"TB_ADRESSE"."A_PLZ" "A_PLZ",
"TB_ADRESSE"."A_ORT" "A_ORT",
"TB_ADRESSE"."A_ID" "A_ID",
"TB_PERSON_INSTITUTION"."PI_MITGLIEDSNUMMER" "PI_MITGLIEDSNUMMER",
"TB_PERSON_INSTITUTION"."PI_NAME" "PI_NAME",
"TB_PERSON_INSTITUTION"."PI_VORNAME" "PI_VORNAME",
"TB_PERSON_INSTITUTION"."PI_ERGAENZUNG" "PI_ERGAENZUNG",
"TB_PERSON_INSTITUTION"."PI_ERGAENZUNG1" "PI_ERGAENZUNG1",
"TB_PERSON_INSTITUTION"."PI_ID" "PI_ID"
FROM
"TB_ADRESSE" "TB_ADRESSE",
"TB_PERSON_INSTITUTION" "TB_PERSON_INSTITUTION"
WHERE "TB_PERSON_INSTITUTION"."PI_ID" = "TB_ADRESSE"."A_F_PERSON_INSTITUTION"
AND upper("TB_PERSON_INSTITUTION"."PI_MITGLIEDSNUMMER") like upper(:TEXTMITGLIEDSNUMMER)||''%''';
end if;
end if;
return My_select;
end;
When I try to apply changes an error message occurs:
"Query cannot be parsed within the Builder. If you believe your query is syntactically correct, check the ''generic columns'' checkbox below the region source to proceed without parsing. The query can not be parsed, the cursor is not yet open or a function returning a SQL query returned without a value."
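Two things commonly trigger this message with code like the above: a `"%"` inside the string literal becomes an invalid double-quoted identifier in the generated SQL, and when neither page item has a value the function falls through and returns NULL ("returned without a value"). A hedged sketch of a function body that always returns a query and doubles the single quotes around the wildcard (aliases shortened for readability):

```sql
-- Always return a query (a "no rows" fallback when neither item is set),
-- and build the LIKE pattern with doubled single quotes, not "%".
declare
  my_select varchar2(4000);
begin
  my_select := 'select a.A_PLZ, a.A_ORT, a.A_ID,
                       p.PI_MITGLIEDSNUMMER, p.PI_NAME, p.PI_VORNAME, p.PI_ID
                  from TB_ADRESSE a, TB_PERSON_INSTITUTION p
                 where p.PI_ID = a.A_F_PERSON_INSTITUTION';
  if :TEXTEINGABENAME is not null then
    my_select := my_select ||
      ' and upper(p.PI_NAME) like upper(:TEXTEINGABENAME) || ''%''';
  elsif :TEXTMITGLIEDSNUMMER is not null then
    my_select := my_select ||
      ' and upper(p.PI_MITGLIEDSNUMMER) like upper(:TEXTMITGLIEDSNUMMER) || ''%''';
  else
    my_select := my_select || ' and 1 = 0';  -- nothing entered: return no rows
  end if;
  return my_select;
end;
```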
Regards,
Mark -
Hi.
I installed the full suite of tools, as I am involved with a company that intends to become a partner and reseller, so we are trying to get everything working.
SAP Crystal Server 2013 SP1
BusinessObjects Business Intelligence platform
SAP Crystal Reports for Enterprise
SAP HANA
SAP Crystal Reports 2013 SP1
SAP BusinessObjects Explorer
SAP Crystal Dashboard Design 2013 SP1
Crystal Server 2013 SP1 Client Tools
SAP BusinessObjects Live Office 4.1 SP1
.NET SDK
I found out that the BI platform was needed only after I installed all the others, but I installed it without problem.
My issue: the Information Design Tool (IDT) creates my universes successfully, and even lets me open and see the data with no problem, using a 32-bit data source (ODBC, connected to SQL Server 2008).
However, I am unable to load the .UNX in Crystal Reports (CR) 2011, so I used CR Enterprise (CRE) to connect to my universe. Now when I try to open the universe I get the following error:
[Microsoft][ODBC Driver Manager] The specified DSN contains an architecture mismatch between the Driver and Application
When I search online I get very generic information about setting up data sources. While I believe the problem is indeed with the data source, I don't believe it is set up wrong.
When I access the universe using the IDT (which uses the 32-bit version of the data source to build its connection; it won't let me use a 64-bit one), I can load a table under "result objects for query #1", press refresh, and get a list of data.
When I access the same universe using CRE (which seems to use a 64-bit data source, though I am not sure) and follow the same process, I get the above error.
If I cancel the process, when CRE lists the fields for the report I can later manually map those fields to an ODBC connection. But if I have to do that, what is the point of using universes at all, since the end user then has full access to all the various DBs available in the data source?
Thanks in advance for any help you can provide.

On the server where Crystal Reports Server is installed, create a 64-bit ODBC connection with the same name as the 32-bit connection. CRS will then automatically use the 64-bit version of the connection.
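A note that may help when checking this: on 64-bit Windows there are two separate ODBC Administrators with two separate DSN lists, and the architecture-mismatch error usually means the DSN exists only in the 32-bit list. The standard Windows locations are:

```bat
:: 64-bit ODBC Administrator - DSNs visible to 64-bit applications (e.g. CRS)
C:\Windows\System32\odbcad32.exe

:: 32-bit ODBC Administrator - DSNs visible to 32-bit tools (e.g. the IDT)
C:\Windows\SysWOW64\odbcad32.exe
```

Counterintuitively, System32 hosts the 64-bit binaries and SysWOW64 the 32-bit ones.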
-Dell -
Error while opening SQL source for a Data Load Rules File
Hi, I have created a data load rules file. When I try to open a SQL source for this rules file (File -> Open SQL), I get an error saying "Your server does not have a SQL connection option, please check with your system administrator." Then I get the message "There are no data sources defined. Please create one to continue." I have created the DSN on my Essbase server. What is the problem, and what needs to be done to open SQL sources? Thanks.
I have Essbase 7.1. I believe that for version 7.1 the SQL Interface option is installed with the Analytic Server itself; am I right? I have set up the DSN as well. Please help resolve this issue. Thanks.
-
Hello,
Queries
1. Is it possible to use Batch Reading in conjunction with Custom Stored Procs/ SQL?
2. Is it possible to map an attribute to a SQL expression (like in Hibernate, where we have formula columns mapped using the formula property)?
Background
1. We use Toplink 11g (11.1.1.0.1) (not EclipseLink) in our application and are controlling mapping using XML files (not annotations).
2. We are migrating a legacy application, with most of its data retrieval logic in stored procedures, to Java.
3. I am effectively a newbie to Toplink.
Scenario
1. We have a deep class hierarchy, with ClassA at the top having a one-to-many relation with ClassB, ClassB having a one-to-many relation with ClassC, and so on.
2. For each of these classes, the data retrieval logic is in stored procedures (coming from the legacy application) containing fairly complex queries.
3. There are also quite a few attributes that actually represent computed values (computed and returned by the stored procedure), and the logic for computing them is not simple either.
4. So to make things easy, we configured TopLink to use the stored procedures to retrieve data for objects of ClassA, ClassB and ClassC.
5. But since the class hierarchy is deep, we ended up firing too many stored procedure calls to the database.
6. We thought the Batch Reading feature could help with this, but I have come across documentation that says it won't work if you override TopLink's queries with stored procedures.
7. I wrote some sample code to verify this: for the hierarchy above it uses the specified custom procedure (I also tried replacing the stored procs with custom SQL, but the behavior is the same) for ClassA and ClassB, but for ClassC and below it resorts to its own generated SQL.
8. This is a problem because the generated SQL contains the names of the computed columns, which are not present in the underlying source tables.
Thanks
Arvind

Batch reading is not supported with custom SQL or stored procedures.
Join fetching is though, so you may wish to investigate that (you need to ensure you return the correct data from the stored procedure).
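Join fetching is configured per query; a sketch of what that might look like in code (the class name `ClassA`, the attribute name `classBs`, and an available `session` are assumptions, and the TopLink/EclipseLink jar is required, so this is illustrative rather than runnable as-is):

```java
// Sketch: join-fetch the one-to-many "classBs" attribute so rows for ClassA
// and ClassB come back from one SQL statement instead of one query per parent.
// Names are illustrative; requires TopLink/EclipseLink classes on the classpath.
ReadAllQuery query = new ReadAllQuery(ClassA.class);
query.addJoinedAttribute(query.getExpressionBuilder().anyOf("classBs"));
List<?> result = (List<?>) session.executeQuery(query);
```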
James : http://www.eclipselink.org -
Issue with Pro Res sources when encoding in Media Encoder.
There seems to be a big issue with ProRes sources in Media Encoder. I've noticed that when exporting in 'software only' mode my graphics and titles look horrible: they are pixelated around the edges and the compression looks bad. This only happens when the export is made from a ProRes source; if I make the exact same file Uncompressed, the issue is resolved. If I use the 'CUDA' option (which I already know is the better option), the issue is also resolved. The thing is, in a work environment not all of our systems are CUDA enabled, and I would like to use Media Encoder as an exporting option overall. I love Media Encoder, it's fast and easy to use, but this ProRes issue is huge because the majority of the time we are working in ProRes. I also did a test out of Avid Media Composer to Media Encoder: I sent a reference file referencing the Avid MXF material and the issue was gone, so to my knowledge this is a ProRes-only issue. The settings I am exporting to are 960 x 540 H.264 and also .mp4, coming from a 1080p source, and yes, I do have 'maximum render quality' checked for best scaling. I understand that software-only vs. CUDA and OpenCL use different scaling algorithms, but it seems crazy to me that it would look this much worse.
Anyway, if somebody could please look into this, that would be great; as it stands I can't continue to use Media Encoder. Making my source files Uncompressed every time I do an export is just not a workflow I want.
On a side note, I've also recently noticed, on a clip with text over a grey background, lines running evenly down the screen even when exporting to a non-scaled 1080p .mp4. Once again, the issue goes away with any format but ProRes. The weird part is that this happens even with CUDA enabled, which is why I think Media Encoder has some sort of issue with the ProRes codec.
I am on a current new 27'' fully loaded iMac with the latest Adobe CC updates.
Has anyone else experienced this?
thank you

No, that is why advanced users do not use the wizard.
Have a look at OpenOffice.org and its form creation. Once you add your fields in OpenOffice.org, just export to PDF; the form fields will be named just as you named them in OpenOffice.org, and the drop-down values carry over. -
Hi,
I know that there are much threads for PL/SQL, but I nowhere found an explanation for my problem ... sorry
I have a LOV with years (2002, 2003, 2004, ...). The display value is a number; the return value is a varchar2. If nothing is selected, '0' is returned.
Then I have a report that should be filtered depending on the selection in this LOV.
So my code looks like this:
DECLARE
str_express varchar2(2000);
BEGIN
IF :P2_YEAR != '0' THEN
str_express := 'select ... from ... where year = :P2_YEAR';
ELSE
str_express := 'select ... from ... ';
END IF;
return str_express;
END;
But it never filters; I always get all rows. If I delete the IF ... THEN ... ELSE ... END IF; then everything is OK. What have I done wrong? I've also tried
IF :P2_YEAR <> '0' ...
IF :P2_YEAR <> 0 ...
IF :P2_YEAR != 0 ...
but nothing works. If I look at session state, the value of this field is the selected one, and it also works in the select statement without the IF.
Thanks for help.
chrissy

I think the problem is that you are trying to compare a date value to a string in your PL/SQL; try something like this:
if to_char(:P_Year,'YYYY') !='2004'
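As an alternative to branching in PL/SQL at all, the '0' sentinel from the LOV can be handled inside a single static query; a sketch (the select list and table are elided as in the original post):

```sql
-- One static query: when the LOV returns '0' (nothing selected),
-- the first condition is true for every row, so no filter is applied.
select ... from ...
 where :P2_YEAR = '0'
    or year = :P2_YEAR
```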