Problem with the data source
Hi Gurus,
I have written an extractor to pull data from R/3 to BI.
When I try to pull the data through the extractor, it does not pull any records.
At the same time, when I run the same logic directly through the function module, it works fine.
What could the problem be? Some are saying it is a performance issue,
but my function module is picking up nearly 94,000 records within 2 minutes.
Please advise.
Hi Vinay,
Could you paste your FM code here? One thing I can immediately see from your post is that the FM may have been written incorrectly: function modules for extractors have to have a specific interface, i.e. I_T_DATA (or a similar table) must be defined in the interface, and that table is what needs to be populated in order to pass data from your generic extractor to BW.
Hope this helps.
Best regards,
Kazmi
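To illustrate the fixed interface Kazmi mentions: SAP ships a template function module, RSAX_BIW_GET_DATA_SIMPLE, in which the output table is actually named E_T_DATA. The skeleton below follows that template; the FM name, the structure ZMY_EXTRACT_STRUCT, and the table ZMY_SOURCE_TABLE are hypothetical, and the selection logic is a deliberately simplified sketch, not a complete extractor:

```abap
FUNCTION z_biw_get_data.
*"  IMPORTING
*"     VALUE(I_REQUNR)   TYPE SRSC_S_IF_SIMPLE-REQUNR
*"     VALUE(I_DSOURCE)  TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
*"     VALUE(I_MAXSIZE)  TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
*"     VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
*"  TABLES
*"     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
*"     I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
*"     E_T_DATA   STRUCTURE ZMY_EXTRACT_STRUCT OPTIONAL
*"  EXCEPTIONS
*"     NO_MORE_DATA
*"     ERROR_PASSED_TO_MESS_HANDLER

* The extraction service calls this FM repeatedly. Each call after
* initialization must APPEND the next package of rows to E_T_DATA
* and raise NO_MORE_DATA once everything has been delivered. A FM
* that fills its own internal table but never E_T_DATA will "work"
* when tested directly yet deliver zero records to BW.
  IF g_done = 'X'.
    RAISE no_more_data.
  ENDIF.

* Simplified: a real extractor keeps an open cursor and fetches
* up to I_MAXSIZE rows per call instead of reading everything at once.
  SELECT * FROM zmy_source_table INTO TABLE e_t_data.
  g_done = 'X'.

ENDFUNCTION.
```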
Similar Messages
-
Problem with the data source and web.xml
I have an issue where JSC is removing my resource reference:
<resource-ref>
<description>Creator generated DataSource Reference</description>
<res-ref-name>jdbc/localOracleDatabase</res-ref-name>
<res-type>javax.sql.DataSource</res-type>
<res-auth>Container</res-auth>
</resource-ref>
from the web.xml and sun-web.xml.
The application has been working great in the IDE for months, then wham, no more data source definition. I try to add the reference manually and the IDE takes it out. I am NOT adding it to the .xml's in the build area. Why is JSC removing the data source entry?

This continues to be a problem. The only way I can get around it is to drag a table from the data source onto the design palette, after which the data source is added back to the web.xml. I can then run fine for 10 or 15 runs before the entry is once again removed from the web.xml.
Help please! -
Hi all.
I think the problem I want to discuss is well known, but I still have not found an answer, whatever I tried ...
I installed the BIEE on Linux (32 bit, OEL 5 - to be more precise), the complete installation was not a big deal. After that I installed the Administration tool on my laptop and created the repository. So... my tnsnames.ora on the laptop looks like this:
TESTDB =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.5)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = testdb)
)
)
And the tnsnames.ora on the server, in its turn, looks like this:
TESTDB =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = localhost.localdomain)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = testdb.localdomain)
)
)
The database worked normally, and I created the repository, transferred it to the server, and started it up.
It started without any errors, but when I tried to fetch data via Presentation Services I got the error:
Odbc driver returned an error (SQLExecDirectW).
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred.
[nQSError: 16023] The ODBC function has returned an error. The database may not be available, or the network may be down. (HY000)
I discovered that the ODBC entry on my laptop was not named correctly (it should have been identical to the tnsnames entry), so I corrected it, saved and replaced the repository on the server, and restarted it... and still got the same error.
Apparently, something is wrong with the data source. So let me put here some more information...
My user.sh looks like this:
ORACLE_HOME=/u01/app/ora/product/11.2.0/dbhome_1
export ORACLE_HOME
TNS_ADMIN=$ORACLE_HOME/network/admin
export TNS_ADMIN
PATH=$ORACLE_HOME/bin:/opt/bin:$PATH
export PATH
LD_LIBRARY_PATH=$ORACLE_HOME/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH
and my odbc.ini looks like this:
[ODBC]
Trace=0
TraceFile=odbctrace.out
TraceDll=/u01/OracleBI/odbc/lib/odbctrac.so
InstallDir=/u01/OracleBI/odbc
UseCursorLib=0
IANAAppCodePage=4
[ODBC Data Sources]
AnalyticsWeb=Oracle BI Server
Cluster=Oracle BI Server
SSL_Sample=Oracle BI Server
TESTDB=Oracle BI Server
[TESTDB]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=local
Repository=SH
Catalog=
UID=
PWD=
Port=9703
[AnalyticsWeb]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=local
Repository=
Catalog=
UID=
PWD=
Port=9703
[Cluster]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=local
Repository=
FinalTimeOutForContactingCCS=60
InitialTimeOutForContactingPrimaryCCS=5
IsClusteredDSN=Yes
Catalog=SnowFlakeSales
UID=Administrator
PWD=
Port=9703
PrimaryCCS=
PrimaryCCSPort=9706
SecondaryCCS=
SecondaryCCSPort=9706
Regional=No
[SSL_Sample]
Driver=/u01/OracleBI/server/Bin/libnqsodbc.so
Description=Oracle BI Server
ServerMachine=localhost
Repository=
Catalog=SnowflakeSales
UID=
PWD=
Port=9703
SSL=Yes
SSLCertificateFile=/path/to/ssl/certificate.pem
SSLPrivateKeyFile=/path/to/ssl/privatekey.pem
SSLPassphraseFile=/path/to/ssl/passphrase.txt
SSLCipherList=
SSLVerifyPeer=No
SSLCACertificateDir=/path/to/ca/certificate/dir
SSLCACertificateFile=/path/to/ca/certificate/file.pem
SSLTrustedPeerDNs=
SSLCertVerificationDepth=9
Can anybody point a finger at where the error is? According to the documentation it should work fine. Maybe the driver name is wrong? What driver do I need then?
Because I can't find it.
I'm really sorry to bother you, guys :) Let me know if you get any ideas about it (Metalink didn't help).

OK, several things are wrong here. First, the odbc.ini is not meant to be used for Oracle databases; that's not supported on Linux. On Linux you should use OCI (the Oracle native drivers), and nothing should be added to odbc.ini. Second, your user.sh seems to be pointing to your DB installation path. This is not correct: it should point to your Oracle client installation, so you need to install the full Oracle client somewhere. Typically this is done with the same OS account as the one used for OBIEE, whereas the DB normally runs under the oracle account. Once you have the client installed, test it under the OBIEE account by doing tnsping and sqlplus to your DB. Also, LD_LIBRARY_PATH should point to $ORACLE_HOME/lib32, not lib, as the lib directory is the 64-bit one and OBIEE uses the 32-bit libraries even on 64-bit OSes. Finally, change your RPD connection to use OCI. Make all those changes and you should be good.
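A corrected user.sh along those lines might look like the sketch below; the client path /u01/app/oracle/client is hypothetical and should be whatever full-client home you actually install for the OBIEE OS account:

```sh
# Point OBIEE at the Oracle *client* installation, not the DB home
ORACLE_HOME=/u01/app/oracle/client
export ORACLE_HOME

TNS_ADMIN=$ORACLE_HOME/network/admin
export TNS_ADMIN

PATH=$ORACLE_HOME/bin:/opt/bin:$PATH
export PATH

# OBIEE loads the 32-bit client libraries even on a 64-bit OS,
# hence lib32 rather than lib (per the advice above)
LD_LIBRARY_PATH=$ORACLE_HOME/lib32:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH
```

After sourcing this under the OBIEE account, `tnsping TESTDB` and a `sqlplus` login should both succeed before you retry the BI Server connection with an OCI call interface in the RPD.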
-
Hi.
I installed the full suite of tools as I am involved with a company intended to become a partner and reseller so we are trying to get everything working.
SAP Crystal Server 2013 SP1
BusinessObjects Business Intelligence platform
SAP Crystal Reports for Enterprise
SAP HANA
SAP Crystal Reports 2013 SP1
SAP BusinessObjects Explorer
SAP Crystal Dashboard Design 2013 SP1
Crystal Server 2013 SP1 Client Tools
SAP BusinessObjects Live Office 4.1 SP1
.NET SDK
I found out that the BI platform was needed only after I installed all the others, but I installed it without problem.
My issue is with the Information Design Tool (IDT), which creates my universes successfully and even lets me open and see the data, no problem. It is using a 32-bit data source (ODBC connected to SQL Server 2008).
However, I am unable to load the .UNX in crystal reports (CR) 2011, so I used CR Enterprise (CRE) to connect to my universe. Now when I try to open the universe I start getting the following error:
[Microsoft][ODBC Driver Manager] The specified DSN contains an architecture mismatch between the Driver and Application
When I do searches online I get very generic information relating to setting up data sources. While I believe the problem is indeed with the data source, I don't believe it is set up wrong.
When I access the universe using the IDT (which uses the 32-bit version of the data source to build its connection; it won't let me use a 64-bit one), I can load a table under "result objects for query #1", press refresh, and get a list of data.
When I access the same universe using CRE (which seems to use a 64-bit data source, but I am not sure) and follow the same process as above, I get the above error.
If I cancel the process, when CRE lists the fields for the report I can later manually map those fields to an ODBC connection. But if I have to do this, what is the point of using universes at all, since the end user then has full access to all the various DBs available in the data source?
Thanks in advance for any help you can provide.

On the server where Crystal Reports Server is installed, create a 64-bit ODBC connection with the same name as the 32-bit connection. CRS will then automatically use the 64-bit version of the connection.
-Dell -
When I deploy the cube which is sitting on my PC (local) the following 4 errors come up:
Error 1 The data source 'AdventureWorksDW' contains an ImpersonationMode that is not supported for processing operations. 0 0
Error 2 Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW', Name of 'AdventureWorksDW'. 0 0
Error 3 Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Customer', Name of 'Customer' was being processed. 0 0
Error 4 Errors in the OLAP storage engine: An error occurred while the 'Customer Alternate Key' attribute of the 'Customer' dimension from the 'Analysis Services Tutorial' database was being processed. 0 0

Sorry, hit the wrong button there. That is not the entire solution: setting it to default would work when using a single box, but not in a distributed application solution. If you are creating the analysis database manually or using the wizard, then you can set the impersonation to your heart's content, as long as the right permission has been set on the analysis server.
In my case I was using MS Project Server 2010 to create the database in the OLAP configuration section. The situation is that the underlying build script has been configured to use the default setting, which is the SQL Service account, and this account does not have permission in Project Server, I believe.
Changing the account to match the Project service account allowed for a successful build/creation of the database. My verdict is that this is a bug in Project Server, because it needs to include the option to choose impersonation when creating the database; that way it would not use the default, which led to my error in the first place. I do not think there is a one-size-fits-all fix for this problem; it is an environment-by-environment issue and should be resolved as such. But the idea behind fixing it is: if you are using the SQL Analysis Services service account as the account creating the database and cubes, then default or service account is fine. If you are using a custom account, then set that custom account in the impersonation details after you have granted it the SQL Analysis Services administrator role. You can remove that role after the DB is created, and harden it by creating a role with administrative permissions.
Hope this helps. -
Web Analysis : populate the same table with multiple data sources
Hi folks,
I would like to know if it is possible to populate a table with multiple data sources.
For instance, I'd like to create a table with 3 columns : Entity, Customer and AvgCostPerCust.
Entity and Customer come from one Essbase, AvgCostPerCust comes from HFM.
The objective is to get a calculated member which is Customer * AvgCostPerCust.
Any ideas?
Once again, thanks for your help.

I would like to have the following output:
File 1 - Store 2 - Query A + Store 2 - Query B
File 2 - Store 4 - Query A + Store 4 - Query B
File 3 - Store 5 - Query A + Store 5 - Query B
The bursting level should be given at
File 1 - Store 2 - Query A + Store 2 - Query B
so the tag in the XML has to be split on something common to these rows.
Since the data is coming from different queries, it is not going to be under a single tag,
so you cannot burst it using a concatenated data source.
But you can do it using a data template: link the queries and get the data for each file under a single parent query,
select distinct store_name from all_stores
select * from query1 where store_name = :store_name   -- 1st query
select * from query2 where store_name = :store_name   -- 2nd query
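In a BI Publisher data template, that parent/child linkage could be sketched roughly as below; the template, query, and group names (StoreBurst, Q_STORES, Q1, Q2, etc.) are hypothetical, and the exact attributes should be checked against the data template syntax for your release:

```xml
<dataTemplate name="StoreBurst" dataSourceRef="demo">
  <dataQuery>
    <!-- parent query: one row per store drives one output file each -->
    <sqlStatement name="Q_STORES">
      select distinct store_name from all_stores
    </sqlStatement>
    <!-- child queries bound to the parent's store_name -->
    <sqlStatement name="Q1">
      select * from query1 where store_name = :store_name
    </sqlStatement>
    <sqlStatement name="Q2">
      select * from query2 where store_name = :store_name
    </sqlStatement>
  </dataQuery>
  <dataStructure>
    <group name="STORE" source="Q_STORES">
      <element name="store_name" value="store_name"/>
      <group name="QUERY_A" source="Q1"/>
      <group name="QUERY_B" source="Q2"/>
    </group>
  </dataStructure>
</dataTemplate>
```

Because both child queries now nest under the one STORE group, the generated XML has a single per-store tag to burst on.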
define the datastructure the way you wanted,
the xml will contain something like this
<stores>
<store> </store> - for store 2
<store> </store> - for store 3
<store> </store> - for store 4
<store> </store> - for store 5
</stores>
now you can burst it at store level. -
Unable to Connect to the Data Source with Report Builder when using a closed Excel file
I am using Report Builder 3.0, and to get started I am using an Excel spreadsheet as my data source. I set this up just fine, and I can connect successfully if the file is open in Excel. But if I close the file, I get the error "Unable to Connect to the data source".
This is my connection string:
Dsn=Licenses;dbq=C:\USERS\AMEADE\DOCUMENTS\licenses.xlsx;defaultdir=C:\USERS\AMEADE\DOCUMENTS;driverid=790;fil=excel 8.0;maxbuffersize=2048;pagetimeout=5
It works fine while the file is open but stops working when the file is closed. Any idea what I can do?

Hi Alice,
Based on my understanding, when you keep the Excel file open, you can connect to the data source correctly. But when you close the Excel file, the error “Unable to Connect to the data source” is thrown out.
In your scenario, I would like to know if only this Excel file runs into the issue. Have you experienced the same issue when you use other Excel files as a data source? As we tested in our environment, we can connect to the data source whether or not the Excel file is open. You can refer to this article to create the data source again, then check if you can connect to the data source when the Excel file is closed.
If the issue persists, I would suggest you use Process Monitor to capture the processes during the connection to the data source, then check the result of each process to find the exact reason.
If you have any question, please feel free to ask.
Best regards,
Qiuyun Yu
TechNet Community Support -
Unable to capture the Data Source into a Transport Request
Hi All,
We have a product hierarchy and we are using the data source :4R_PRODH_D_LGEN_HIER for the hierarchy.
Now we need to transport this structure to the quality environment but we were not able to capture the datasource:4R_PRODH_D_LGEN_HIER into a transport request.
Whenever we activate the data source 4R_PRODH_D_LGEN_HIER, it asks for the Package and the Transport Request Number. If we give these details and save, the data source is not captured in the request; only the "Object Directory Entry" is captured.
Can someone please guide me on how to capture the datasource under "Data Sources in BW" in a transport request.
Regards,
Sachin Dehey.

Hi Sachin,
Hierarchy datasources are not captured the way Attributes and Text datasources are, so what you have done is correct.
Whatever is captured in the Object Directory Entry is correct. So go ahead with your transports; once the transport is done, check the Hierarchy InfoPackage with Available OLTP hierarchies and load the data.
Most importantly, first see that all Master & Transactional datasources are transported from R/3 Dev to QA to PRD.
In BW, datasources are not transported; only their replica is transported. Transportation of the datasource is done in R/3.
So whatever you have done till now is correct. Go ahead.
When attaching a Hierarchy datasource, it is captured only in the "Object Directory Entry".
Regards,
Vishnu. -
Refreshing the Data Source View in Analysis Services
I have added columns to the SQL Database table that is used as a dimension in an Analysis Services Cube. The new columns will be used as additional Property Fields for the dimension. When I attempted to refresh the Data Source view so that the additional columns are present, I am given the following error:
System.Data
Property not accessible because 'Parent Columns and Child Columns don't have type-matching columns'
I have done nothing to the columns used for the parent or child, and the error message provides nothing to go on. Does anyone have any ideas?
Gary

Olga,
Thanks for your response. I will try to answer your questions:
1) I have not tried removing the columns yet. I will try that this afternoon, but have limited hope. The two columns I added are simple text columns that will be used as attributes in the dimension. I have made no change to the parent or child columns.
2) The table I modified is the source table for a parent-child dimension.
3) The reference to the "check list" does not take me to any kind of check list.
4) The parent-child dimensions I am trying to modify have been in use for months, and the parent and child columns do have the same data types.
5) I have also checked the data types between the dimension table and the fact table. They use the same data types (smallint).
6) I have not made a collection for the parent key; it is a single column. The remainder of your last paragraph is not clear to me. Can you give me an example?
I am fairly inexperienced with Analysis Services, so please talk slow and use small words :-)
Thanks again for your help!
Gary -
Unable to get the data in RSA3 for the data source 2LIS_03_BX.
Hi Xperts,
we are loading the data source (2LIS_03_BX) by using the MCNB transaction but are not getting any data in it.
Can anyone please tell me how to load the data into it, with a step-by-step process?
regards,
satish

Hi,
To see the data in RSA3 you need to do the setup of data for the particular data source through the SBIW transaction.
The RSA3 transaction displays the data present in the setup tables of the particular data source. -
Fields missing while creating the data source
Hi All
I am using a third-party system to pull data into BI. There was one table where, from the other end, they had recently added 2 fields. When I checked in development I was able to create the data source for the table along with the new fields, but when I check in production those newly added 2 fields are missing. Though I am able to see the old fields in the data source, the newly added fields are missing. So is there a problem with the refresh of the table, or should we restart the connection?
Can any one please let me know a solution for this...
Regards
Shilpa

Hi,
Do you have connectivity like this:
BI Dev
BI QA ---> all three connected to the same 3rd-party system
BI Prod
If yes: if it is R/3, replicating the datasource will help.
In the case of a third-party system, you need to regenerate the datasource.
If it is DB Connect, go to the source system from the BI side and connect to the database table; at the top you will have
"generate datasource". Use that option and you will be able to solve it.
Else:
BI Dev - 3rd party dev
BI prod - 3rd party prod
Try to add the 2 new fields in the 3rd-party system and generate the datasource as described above.
Thanks
Arun
Edited by: Arunkumar Ramasamy on Sep 14, 2009 9:12 AM -
We have a SharePoint Server 2010 with SP1 environment on which custom SP2010 Designer pages were working as expected before the
August 13, 2013 CU was installed. But we are getting the below exception while trying to add a new item after the CU was installed.
Error while executing web part: System.NullReferenceException: Object reference not set to an instance of an object. at Microsoft.SharePoint.WebControls.SPDataSourceView.ExecuteInsert(IDictionary values) at
System.Web.UI.DataSourceView.Insert(IDictionary values, DataSourceViewOperationCallback callback) 3b64c3a0-48f3-4d4a-af54-d0a2fc4553cc
06/19/2014 16:49:37.65 w3wp.exe (0x1240) 0x1300 SharePoint Foundation
Runtime tkau Unexpected Microsoft.SharePoint.WebPartPages.DataFormWebPartException: The data source control
failed to execute the insert command. 3b64c3a0-48f3-4d4a-af54-d0a2fc4553cc at Microsoft.SharePoint.WebPartPages.DataFormWebPart.InsertCallback(Int32 affectedRecords, Exception ex) at System.Web.UI.DataSourceView.Insert(IDictionary
values, DataSourceViewOperationCallback callback) at Microsoft.SharePoint.WebPartPages.DataFormWebPart.FlatCommit() at Microsoft.SharePoint.WebPartPages.DataFormWebPart.HandleOnSave(Object sender, EventArgs e)
at Microsoft.SharePoint.WebPartPages.DataFormWebPart.RaisePostBackEvent(String eventArgument) at System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) at System.Web.UI.Page.ProcessRequestMain(Boolean
inclu... 3b64c3a0-48f3-4d4a-af54-d0a2fc4553cc
06/19/2014 16:49:37.65* w3wp.exe (0x1240) 0x1300 SharePoint Foundation
Runtime tkau Unexpected ...deStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) 3b64c3a0-48f3-4d4a-af54-d0a2fc4553cc
I have tried changing the "DataSourceMode" as below; now the insert command is working, but the update command is not working.
<SharePoint:SPDataSource runat="server" DataSourceMode="ListItem" />
Also, the lookup dropdown fields are displaying the value as "<a href="Daughterhttp://cpsp10/sites/Employees/_layouts/listform.aspx?PageType=4&ListId={8F62F444-FB6A-4F03-9522-C4696B45DCD1}&ID=10&RootFolder=*">Daughter</a>"
instead of only "Daughter".
Please provide the solution to get rid of this issue.
Thanks
Ramasubbu

Try below:
http://social.technet.microsoft.com/Forums/en-US/ae910269-3a0c-4506-844b-e8bc89d95b71/data-source-control-failed-to-execute-the-insert-command
http://blog.jussipalo.com/2012/01/sharepoint-2010-data-source-control.html
While there can be many causes for this generic error message, in my case the first parameter of the ddwrt:DataBind function inside the SharePoint:FormFields element was 'i' and I was working with an Edit Form. Changing it to 'u', as it was in every other FormField, fixed the issue.
<SharePoint:FormField runat="server" id="ff1{$Pos}" ControlMode="Edit" FieldName="Esittaja" __designer:bind="{ddwrt:DataBind('u',concat('ff1',$Pos),'Value','ValueChanged','ID',ddwrt:EscapeDelims(string(@ID)),'@Esittaja')}"
/>
Explanation:
DataBind operation type parameters (the first parameter) are listed below:
'i' stands for INSERT,
'u' stands for UPDATE,
'd' stands for DELETE.
http://webcache.googleusercontent.com/search?q=cache:d9HHY4I7omgJ:thearkfloats.blogspot.com/2014/03/sharepoint-2010-data-source-control.html+&cd=4&hl=en&ct=clnk&gl=in
If this helped you resolve your issue, please mark it Answered -
Unable to connect with new Data Source
Hi everybody,
After much development, I finally got my class to do what we need with Crystal and WebI reports using the ReBean and RAS SDKs.
But today we found a special case that I thought I could solve by following examples, but the solutions given by the examples do not work.
Here is the case :
We have Crystal reports in BO using a DSN file located on the C:\ disk of the server. So the information for such a report, found in the IConnectionInfo object, is:
ConnectionInfo Kind = CRQE
QE_DatabaseType = >ODBC (RDO)< (java.lang.String)
SSO Enabled = >false< (java.lang.Boolean)
Database DLL = >crdb_odbc.dll< (java.lang.String)
QE_DatabaseName = >< (java.lang.String)
QE_LogonProperties = >{FILEDSN=C:\path to DSN, UseDSNProperties=false, Trusted_Connection=false}< (com.crystaldecisions.sdk.occa.report.lib.PropertyBag)
QE_SQLDB = >true< (java.lang.Boolean)
QE_ServerDescription = >C:\path to DSN< (java.lang.String)
Today we had to import an old Crystal report into BO, and for that report the Data Source gives a DSN file on another disk, R:\...
So for the reports where I find in an IConnectionInfo object a DSN file beginning with R:\, I have to replace it with a new one pointing to the right DSN file on the C:\ disk. (The DSN file on C:\ works perfectly, because we can use it with other reports.)
I follow this example : http://www.sdn.sap.com/irj/boc/index?rid=/library/uuid/e068eba5-1f57-2c10-10a0-91e111c98bc7
And so I get the following code, where I put all the parameters needed for the correct DSN file on C:\ :
ConnectionInfos ci = (ConnectionInfos)dbcontroller.getConnectionInfos(null).clone();
IConnectionInfo ici = ci.getConnectionInfo(0);
// Create a new connection and start setting new properties
IConnectionInfo newConnectionInfo1 = new ConnectionInfo();
newConnectionInfo1.setKind(ConnectionInfoKind.CRQE);
newConnectionInfo1.setUserName("momtest");
newConnectionInfo1.setPassword("testmom");
PropertyBag newpb = new PropertyBag();
newpb.put(PropertyBagHelper.CONNINFO_CRQE_DATABASETYPE,"ODBC (RDO)");
newpb.put(PropertyBagHelper.CONNINFO_SSO_ENABLED,false);
newpb.put(PropertyBagHelper.CONNINFO_DATABASE_DLL,"crdb_odbc.dll");
newpb.put(PropertyBagHelper.CONNINFO_CRQE_DATABASENAME,"");
newpb.put(PropertyBagHelper.CONNINFO_CRQE_SQLDB,true);
newpb.put(PropertyBagHelper.CONNINFO_CRQE_SERVERDESCRIPTION,"path to DSN");
PropertyBag LogonProperties = new PropertyBag();
LogonProperties.put("FILEDSN","path to DSN");
LogonProperties.put("UseDSNProperties", false);
LogonProperties.put("Trusted_Connection","false");
newpb.put(PropertyBagHelper.CONNINFO_CRQE_LOGONPROPERTIES,LogonProperties);
newConnectionInfo1.setAttributes(newpb);
try {
    dbcontroller.replaceConnection(ici, newConnectionInfo1, null, DBOptions._useDefault);
} catch (ReportSDKServerException error) {
    System.out.println(error.getServerError());
}
System.out.println("--->" + dbcontroller.getServerNames());
System.out.println("Refreshing document...");
clientDoc.refreshReportDocument();
System.out.println("Document refreshed");
ByteArrayInputStream byteIS = (ByteArrayInputStream) clientDoc.getPrintOutputController().export(ReportExportFormat.PDF);
When we arrive at the line "ByteArrayInputStream byteIS = (ByteArrayInputStream) clientDoc.getPrintOutputController().export(ReportExportFormat.PDF);", we get a message that the logon failed.
Does someone have an idea?
Edited by: jerome.vervier on Dec 29, 2011 4:23 PM

Thanks Adam,
It assures me that there is no unusual error in my code, and it confirms what I thought.
Thanks for the help! -
11i EBS XML Publisher Report with Multiple Data Source
I need to create an XML Publisher report in 11i EBS, pulling data from another 10.7 EBS instance as well as 11i EBS in a single report.
I am not allowed to create extract or use db links.
My problem is how to create the Data Source Connection using a Java Concurrent Program.
The approach I am trying is:
1. Create a Java concurrent program to establish a connection to the 10.7 instance.
2. Write the SQL queries in a Data Template with 2 data sources: 1 for 11i EBS and 2 for 10.7 EBS.
3. The template will show the data from both queries in 1 report.
Is there any other way to proceed using the data source API?
thanks

Option 1:
The query should be the same at detail level; only the template has to be different for summary and detail.
At runtime, the user can choose to see the detail or the summary.
Disadvantage: slow if the data is huge. Advantage: only one report.
Option 2:
Create two separate reports, summary and detail,
with different data and different layouts, and keep them as different reports.
Advantage: the query runs based on which report the user chooses (summary or detail), so you can write an efficient query for each.
Disadvantage: two reports' queries/templates to be maintained. -
Greetings,
I have an InfoPath 2013 form that uses an external data connection. The data connection became corrupt (somehow; no one knows who changed what). A user went into InfoPath Designer, created a new data connection, and changed all references
to use that new data connection. Now the form cannot be published at all, with the error
"The data source "GetUserProfileByName" referenced in the form template is not valid or cannot be found".
The new data source they created is GetUserProfileByName2, and they changed all references.
Of course, since it is broken, they asked me to see if I can find the issue. I went through the form looking to see if they missed any references to the old data connection and can't find anything.
Where is InfoPath storing the old data connection information, and where can I remove it? I looked in the manifest and don't see it there either.
Any thoughts?
Thank you!

Hi Bob,
There are many XML schema files for the data connection. I recommend checking whether the references to the XML files have been changed to the new schema files in the manifest file.
For example, if we create a data connection called GetUserProfileByName, there will be one XML file and three XML schema files for that data connection.
Please make sure that all the references are updated in the manifest file.
Thanks,
Victoria
Forum Support
Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
[email protected]
Victoria Xia
TechNet Community Support