Data source for fixed assets
Hello All,
I would like to know the new version of the data source for fixed assets. I believe it is 0FI_AA_20, but this data source has fewer fields than the old data source, 0FI_AA_11.
Which data source is currently in wider use: 0FI_AA_20 or 0FI_AA_11? And for what reason?
If you are using the old data source 0FI_AA_11, how did you justify to the client that 0FI_AA_11 is the appropriate data source?
Regards.
Shanker Ramabhotla
Hi Lokesh,
Thanks for your reply.
We have already implemented the notes in our development system, but it did not help. It looks like we have a problem with the extractor checker too: when I ran it today, the total record count exceeded the number of records shown there.
If anyone has faced a similar situation with this particular data source, please share your inputs.
Any help would be greatly appreciated.
Thanks,
Vinay
Similar Messages
-
Year end closing for fixed asset
Hi,
Currently we run AJAB to close the fiscal year for fixed assets at the end of the year, and we can have two years open at a time. So in September 07 I would have closed 2006, then run AJRW to open 2008.
My understanding is that we can only have two years open, and in the past we have kept the current year and one year back. Do you know why we do this, rather than keeping the current year and one year forward?
There has been a request to close 2007 for fixed assets. If I do this, will it cause any problems?
With regards,
Sree

If you close 2007 there is no problem. Usually everybody keeps one previous year and one current year open; the basic reason is that if any capitalization has to be posted with a back date, you are still able to do it. If that does not apply in your case, you can close 2007, but note that once you open a future year you cannot reset/reopen 2007.
There is also no real use in opening 2009, because nobody capitalizes with a future date.
The only advantage of opening 2009 is that you can see the depreciation simulation period-wise.
The disadvantage of opening 2009 is that you cannot reopen 2007 to post any back-dated entries.
Assign points if this is helpful. -
Calculate Depreciation Forecast for Fixed asset
Dear all expert,
I'm new to ASO and would like to know if there's a way to do a depreciation forecast for fixed assets in Essbase.
1. Database type is ASO
2. Assuming the user will prepare a forecast capitalization of fixed assets and then upload it to Essbase
3. An attribute dimension "Useful life" has been defined and attached to specific members of the "Accounts" dimension for the calculation
4. Need to calculate the forecast and spread the depreciation over future months
E.g. Input 32000 USD for a fixed asset cost with 32 months useful life
-> should calculate 1000 each month for the coming 32 months
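For clarity, the spread described in the example is plain straight-line depreciation. A minimal Python sketch of the arithmetic (a hypothetical helper, not ASO member-formula syntax):

```python
def straight_line_monthly(cost, useful_life_months):
    """Spread an asset's cost evenly over its useful life in months."""
    monthly = cost / useful_life_months
    return [monthly] * useful_life_months

# 32000 USD over 32 months -> 1000 per month
schedule = straight_line_monthly(32000, 32)
```

In the cube itself the same division would have to be expressed in the member formula, using the "Useful life" attribute as the divisor.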
Can any expert suggest how to do this in ASO? Meaning the calculation can only be done in a member formula itself, not in a calculation script.
Thanks for your help in advance,
Billy

I haven't used the feature, but the reference to 'local script_file' in the MaxL railroad diagram for the 'execute calculation' command suggests the location is relative to the client, if the semantics are consistent with the 'export data' command (the options for report scripts there are 'server' or 'local', although according to the railroad diagram there is no non-local alternative in this case!).
There is a worked example in a Rittman Mead blog post about this that might help: http://www.rittmanmead.com/2010/04/oracle-epm-11-1-2-essbase-calculations-on-aso-cubes-part-2-maxl-scripts-and-eas/
The DBAG also says that the script files should have suffix .csc, while the Tech Ref example has suffix .txt - helpful!! If they were actually meant to sit in the server's db directory, I would be surprised to find the .csc suffix used, since that is also the suffix for aggregation scripts. Although me being surprised doesn't mean much. -
Data source for this data connection isn't registered for Power BI
Hi, I am getting this error when I set Schedule Data Refresh to refresh data from the database. How do I register my connection to the data source? Is this a fix on the SQL Server side or in Power BI?
FAILURE INFORMATION
Failure
Correlation ID: c5132b7a-3c54-4f12-a048-3ddcf0b95f26
Data connection Status
SqlServer twcperfsql1 OfficePerf OnPremise error: Sorry, the data source for this data connection isn't registered for Power BI. Tell your Power BI admin the admin needs to register this connection to this on-premises data source.
Thanks for your help.

I'm running into the same issue. I configured the Gateway and then the data source in Power BI.
When I go to the Power BI Admin Center and click "Test Connection" it seems to work, but when I refresh the data from Power BI it doesn't. If I edit the spreadsheet in Excel and refresh the data in PowerPivot, then it works. So I'm not sure why the solution doesn't refresh from the automated service in Power BI without this unhelpful error.
Thanks
Fabian
In case it helps this is the actual error:
Failure Correlation ID: ecc73a16-7264-45b2-9baf-e3448f007211
Power Query - dbo_TableAOnPremise error: Sorry, the data source for this data connection isn't registered for Power BI. Ask your Power BI admin to register the data source in the Power BI admin center.
Also, further information I have found so far: the Gateway is receiving the requests for the refresh and shows the message below, so the data source is found and communication seems to be happening from server to client and back. I am just not sure why the server doesn't seem to like the data the Gateway is sending back.
The feed 'dbo_TableA' was successfully accessed at 7/15/2014 4:23:26 PM.
The request took 00:00:00.0000105 seconds.
Activity ID: e8464e5d-3f0a-49c2-b10b-450fec3e5940 -
Hi All,
I am uploading data for fixed assets using LSMW with bapi -
business object - BUS1022
Method - FIXEDASSET_CREATEINCLVALUES
Message Type - FIXEDASSET_CREATEINCLVALUES02
According to the project settings, group asset and group asset subnumber are mandatory fields for depreciation area 15. When I run the LSMW I get the error:
"DEPRECIATIONAREAS-GRP_ASSET ("Group asset") is a required entry field in depreciation area 15. The field was not supplied with data when the BAPI was called."
Please help me solve this problem.
I copied an LSMW in which group asset and group asset subnumber were not included and added those two fields, and now I am not able to upload the data because of this error. Previously these two fields were not mandatory and the LSMW worked fine.
Thanks,
Govinda

Hi Govinda,
Please try the following code; if it is useful, please allocate points to me.
*& Report ZASSET_CREATE
REPORT zasset_create.
DATA:
* input parameters to the BAPI
  input_key             LIKE bapi1022_key,
  gen_data              LIKE bapi1022_feglg001,
  gen_datax             LIKE bapi1022_feglg001x,
  posting_info          LIKE bapi1022_feglg002,
  posting_infox         LIKE bapi1022_feglg002x,
  posting_data          LIKE bapifapo_gen_info,
  acquis_data           LIKE bapifapo_acq,
  time_dep_data         LIKE bapi1022_feglg003,
  time_dep_datax        LIKE bapi1022_feglg003x,
  real_estate           LIKE bapi1022_feglg007,
  real_estatex          LIKE bapi1022_feglg007x,
  allocations           LIKE bapi1022_feglg004,
  allocationsx          LIKE bapi1022_feglg004x,
  extensionin           TYPE TABLE OF bapiparex,
  origindata            LIKE bapi1022_feglg009,
  origindatax           LIKE bapi1022_feglg009x,
  it_depreciationareas  TYPE TABLE OF bapi1022_dep_areas,
  wa_depreciationareas  TYPE bapi1022_dep_areas,
  it_depreciationareasx TYPE TABLE OF bapi1022_dep_areasx,
  wa_depreciationareasx TYPE bapi1022_dep_areasx,
  it_investment_support TYPE TABLE OF bapi1022_inv_support,
* output parameters from the BAPI
  asset_number          LIKE bapi1022_1-assetmaino,
  sub_number            LIKE bapi1022_1-assetsubno,
  out_return            LIKE bapiret2,
  out_return2           LIKE bapiret2.
input_key-companycode = 'CAT1'.
gen_data-assetclass = '00003000'.
gen_datax-assetclass = 'X'.
gen_data-descript = 'Testing BAPI'.
gen_datax-descript = 'X'.
gen_data-serial_no = '1111'.
gen_datax-serial_no = 'X'.
gen_data-base_uom = 'KG'.
gen_datax-base_uom = 'X'.
*Append the depreciation key value LINA to the internal table.
wa_depreciationareas-area = '01'.
wa_depreciationareas-descript = 'Book deprec.'.
wa_depreciationareas-dep_key = 'LINA'.
wa_depreciationareas-ulife_yrs = '2'.
wa_depreciationareas-ulife_prds = '3'.
wa_depreciationareas-exp_ulife_yrs = '5'.
wa_depreciationareas-exp_ulife_prds = '3'.
*wa_depreciationareas-EXP_ULIFE_SDEP_YRS
*wa_depreciationareas-EXP_ULIFE_SDEP_PRDS
*wa_depreciationareas-ORIG_ULIFE_YRS
*wa_depreciationareas-ORIG_ULIFE_PRDS
wa_depreciationareas-change_yr = '3'.
*wa_depreciationareas-dep_units = '3'.
*wa_depreciationareas-odep_start_date = '01012005'.
*wa_depreciationareas-sdep_start_date = '01012005'.
*wa_depreciationareas-INTEREST_START_DATE
*wa_depreciationareas-READINESS
*wa_depreciationareas-INDEX
*wa_depreciationareas-AGE_INDEX
wa_depreciationareas-var_dep_portion = '200'.
wa_depreciationareas-scrapvalue = '20'.
*wa_depreciationareas-currency = 'USD'.
*wa_depreciationareas-currency_iso = 'USD'.
*wa_depreciationareas-NEG_VALUES
*wa_depreciationareas-GRP_ASSET
*wa_depreciationareas-GRP_ASSET_SUBNO
*wa_depreciationareas-ACQ_YR
*wa_depreciationareas-ACQ_PRD
*wa_depreciationareas-SCRAPVALUE_PRCTG
APPEND wa_depreciationareas TO it_depreciationareas.
CLEAR: wa_depreciationareas.
wa_depreciationareasx-area = '01'.
*wa_depreciationareasX-DESCRIPT = 'X'.
*wa_depreciationareasX-DEACTIVATE = 'X'.
wa_depreciationareasx-dep_key = 'X'.
wa_depreciationareasx-ulife_yrs = 'X'.
wa_depreciationareasx-ulife_prds = 'X'.
*wa_depreciationareasX-EXP_ULIFE_YRS = 'X'.
*wa_depreciationareasX-EXP_ULIFE_PRDS = 'X'.
*wa_depreciationareasX-EXP_ULIFE_SDEP_YRS = 'X'.
*wa_depreciationareasX-EXP_ULIFE_SDEP_PRDS = 'X'.
*wa_depreciationareasX-ORIG_ULIFE_YRS = 'X'.
*wa_depreciationareasX-ORIG_ULIFE_PRDS = 'X'.
wa_depreciationareasx-change_yr = 'X'.
*wa_depreciationareasx-dep_units = 'X'.
wa_depreciationareasx-odep_start_date = 'X'.
wa_depreciationareasx-sdep_start_date = 'X'.
*wa_depreciationareasX-INTEREST_START_DATE = 'X'.
*wa_depreciationareasX-READINESS = 'X'.
*wa_depreciationareasX-INDEX = 'X'.
*wa_depreciationareasX-AGE_INDEX = 'X'.
wa_depreciationareasx-var_dep_portion = 'X'.
wa_depreciationareasx-scrapvalue = 'X'.
*wa_depreciationareasx-currency = 'X'.
*wa_depreciationareasx-currency_iso = 'X'.
*wa_depreciationareasX-NEG_VALUES = 'X'.
*wa_depreciationareasX-GRP_ASSET = 'X'.
*wa_depreciationareasX-GRP_ASSET_SUBNO = 'X'.
*wa_depreciationareasX-ACQ_YR = 'X'.
*wa_depreciationareasX-ACQ_PRD = 'X'.
*wa_depreciationareasX-SCRAPVALUE_PRCTG = 'X'.
APPEND wa_depreciationareasx TO it_depreciationareasx.
CLEAR wa_depreciationareasx.
*Append the depreciation key value LINB to the internal table.
*break mukherar.
CALL FUNCTION 'BAPI_FIXEDASSET_CREATE1'
  EXPORTING
    key                  = input_key
*   reference            =
*   createsubnumber      =
*   postcap              =
*   creategroupasset     =
*   testrun              =
    generaldata          = gen_data
    generaldatax         = gen_datax
*   inventory            =
*   inventoryx           =
*   postinginformation   =
*   postinginformationx  =
*   timedependentdata    =
*   timedependentdatax   =
*   allocations          =
*   allocationsx         =
*   origin               =
*   originx              =
*   investacctassignmnt  =
*   investacctassignmntx =
*   networthvaluation    =
*   networthvaluationx   =
*   realestate           =
*   realestatex          =
*   insurance            =
*   insurancex           =
*   leasing              =
*   leasingx             =
  IMPORTING
*   companycode          =
    asset                = asset_number
    subnumber            = sub_number
*   assetcreated         =
    return               = out_return
  TABLES
    depreciationareas    = it_depreciationareas
    depreciationareasx   = it_depreciationareasx.
*   investment_support   =
*   extensionin          =

CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING
    wait = 'X'.

WRITE: / 'Asset Number:', asset_number. -
Hello,
I need to create three reports for the Fixed Assets module.
Can I use XL Reporter, or what should I use?
Thank you,
Irina

Hi Irina,
I've never seen documentation on the table structure for Fixed Assets so you might have to do your own detective work to establish the tables you need. The tables will all have the same prefix. You should see the title of each table listed under Tools--User Tables which will give you an idea of what type of data is stored in each table. After that, it's going to be a case of using System Information, Enterprise Manager and SQL Profiler, where appropriate, to understand how the data is stored.
Kind Regards,
Owen -
ODS as data source for the next ODS - can't see the name in the DS list
Hello, experts!
Need your help. I'm trying to use a staging ODS for the transaction data to populate the next level ODS, a standard ODS. The problem is that the staging ODS does not appear in the data source list for the standard ODS when I assign data sources (and it will not let me paste the staging ODS data source name either). I replicated the data source for the staging ODS and generated the export data source, but that did not fix the problem.
Any suggestions?
Thank you.
Alina

OK, you're talking about the selection screen "Available datasources" showing only 50 entries, right?
In my system the screen can show more than 900, so it must be one of the user settings in the GUI.
Just under the "restrictions" tab there is a line with an arrow pointing down; click this line to expand the restrictions.
In the DataSource field enter 8* or 8<ODS_TECH_NAME> in order to see your DataSource.
Otherwise, in RSA13 in your DataSources overview, right-click it and choose "assign infosource"; that will also do it.
Hope this helps.
Olivier.
Edited by: Olivier Cora on Jul 30, 2008 6:31 PM -
Unable to generate "Export data source" for Master data infoobject
Hi Gurus,
I'm in BW 3.5
I need to generate "Export data source" for a master data info object.
I followed below steps:
I ticked the checkbox "Characterist. is export data source" in the master data Info object maintenance screen.
In the InfoSource area, for the desired InfoObject --> right-click --> Generate export datasource --> a dialog box appears saying "The generation of DataSource was successful".
Problem:
Even after refreshing, etc., I couldn't find the generated datasource 8**** for the above master data. I also tried to replicate the datasource from the MYSELF source system, but couldn't find it there either. I don't know where the generated export data source is hiding.
Could anyone please help ????
Thanks,
Venkat

Hi Venkat,
After creating an export datasource for an infoobject, the DS should appear in RSA1 - Source systems - MYSELF - datasource overview (with a name starting with 8). Since it's a data mart, it should also appear as a system-generated infosource, and in that case you would be able to assign the appropriate infosource in the update rules of the data target (make sure that in the InfoProviders tab of RSA1 you have inserted your infoobject as a data target).
However, there are some bugs related to export data sources. Check whether you see your export data source in RSA6.
If you don't see it, consider applying OSS Note #816892 - "30BSP26: Export DS for master data is not generated".
Note #876845 - "30BSP29: InfoObject change: Export DataSource not adjusted" may also be useful.
-Vikram -
Standard BI Data Source for FMGLFLEXT
Hello Friends,
The standard data source for New General Ledger is 0FI_GL_10 (General Ledger: Leading Ledger Balances), which fetches data from the FAGLFLEXT table using the extract structure FAGL_EXTSTRUCT_LEAD. However, there has been a migration from the FAGL table to the FMGL table, and FAGLFLEXT is no longer populated with any FI postings data; the table now being updated is FMGLFLEXT. GL migration from FAGL to FMGL is a standard procedure, but I cannot find any standard data source that fetches data from the FMGLFLEXT table.
Can anybody let me know if there is a standard datasource for this, or should we create a custom data source to fetch the data from FMGLFLEXT using the FMGL_EXTSTRUCT_LEAD extract structure?
Appreciate help in this regard.
VB

Hello,
The requirement is resolved by the following:
If you want to use your own totals table (in this scenario FMGLFLEXT) as your basis, you first have to create the corresponding extraction structure using transaction FAGLBW01; in this way, all fields of the totals table are transferred into the extraction structure. Then, in the OLTP system, call up transaction FAGLBW03, configure the DataSource, and save your settings.
http://help.sap.com/saphelp_nw70/helpdata/en/be/928f40f5767d17e10000000a1550b0/frameset.htm
Regards,
VB -
Can you use SQL as a data source for a project in the same way you can in Excel?
Excel allows you to create a data source that executes a SQL stored procedure, display that data as a table in a spreadsheet, and have that data automatically refresh each time you open the spreadsheet. Is it possible to do the same thing in MS Project, displaying the data from the stored procedure as a series of tasks?
Here's what I'm trying to do: I have a stored procedure that pulls task data meeting specific criteria from all projects in Project Server. We're currently displaying this data as an Excel report. However, the data includes start dates and durations, so it would be nice to display it as a Gantt chart. I've played around with creating a Gantt chart in Excel and have managed a very basic one, but it doesn't quite fit our needs.

No, you cannot use SQL as a data source for a project.
You have 3 options to achieve it:
1. You can create a SharePoint list with the desired columns, fill it with the desired data, and then create an MS Project plan from the SharePoint list.
2. You can create an SSRS report in which you display a Gantt chart (Joe has given you that link).
3. You can write a macro in MPP which takes the data from your Excel file; in Excel you fetch the data from your stored procedure. Either schedule it to run every day to update your data, or create an Excel report which updates automatically and write a macro in MPP which fetches the data, then publish it so that it is available to team members.
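As a rough sketch of the data step behind option 3, the rows coming back from such a stored procedure (task name, start date, duration) can be rendered as a crude text Gantt. This is an illustrative Python sketch with made-up task data, not the MPP macro itself:

```python
from datetime import date

# Hypothetical rows as they might come back from the stored procedure:
# (task name, start date, duration in days).
tasks = [
    ("Design", date(2014, 7, 1), 5),
    ("Build",  date(2014, 7, 4), 7),
]

def gantt_rows(tasks, origin):
    """Render each task as a padded name, offset spaces, and '#' bars (one column per day)."""
    rows = []
    for name, start, days in tasks:
        offset = (start - origin).days
        rows.append(f"{name:<10}{' ' * offset}{'#' * days}")
    return rows

for row in gantt_rows(tasks, date(2014, 7, 1)):
    print(row)
```

The same offset-and-width arithmetic is what the Excel stacked-bar Gantt workaround does with an invisible leading bar segment.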
kirtesh -
Hi,
I am getting the error below in the SharePoint PPS 2013 Dashboard Designer while creating the Analysis Service data source using "PROVIDER="MSOLAP";DATA SOURCE="http://testpivot2013:9090/Source%20Documents/TestSSource.xlsx":
"An error occurred connecting to this data source. Please check the data source for any unsaved changes and click on Test Data Source button to confirm connection to the data source."
I have checked all the sites and done all the steps, but I am still getting the error. It is frustrating: everything is configured correctly, yet the error persists.
Thanks in advance.
Poomani Sankaran

Hi Poomani,
Thanks for posting your issue.
You have to install SQL Server 2012 ADOMD.NET on your machine; then browse the URL below, which shows how to create a SharePoint dashboard with an Analysis Services data source step by step.
http://www.c-sharpcorner.com/UploadFile/a9d961/create-an-analysis-service-data-source-connection-using-shar/
I hope this is helpful to you; if so, mark it as Helpful.
If this works, please mark it as Answered.
Regards,
Dharmendra Singh (MCPD-EA | MCTS)
Blog : http://sharepoint-community.net/profile/DharmendraSingh -
Trying to change the data source for a Crystal Report.
<p>The method below represents my best attempt to programmatically change the data source of a Crystal Report. The goal is to have a routine that will update the data source for reports after they have been distributed to production servers. So far I have not been successful in saving the report back to the CMS. No exceptions are thrown, but when I view the Database Configuration of the report in the CMC nothing has changed.
</p>
<p>
Am I missing a step, or is there another way to accomplish this?
</p>
<p>
Thank you.
</p>
<hr />
<pre>
private void test(String reportName)
    throws SDKException, ReportSDKException, java.io.IOException
{
    IInfoObjects newInfoObjects;
    IInfoObject reportObj;
    ReportClientDocument clientDoc = new ReportClientDocument();
    DatabaseController dc;
    PropertyBag pBag;
    PropertyBag logonProps;
    ConnectionInfo newConInfo;
    ConnectionInfo oldConInfo;
    ConnectionInfos conInfos;
    int connOptions = DBOptions._ignoreCurrentTableQualifiers + DBOptions._doNotVerifyDB; //0;
    Fields connFields = null;
    String queryStr = "Select * From CI_INFOOBJECTS " +
        "Where SI_NAME='wfr.rpt' AND SI_KIND='CrystalReport' AND SI_INSTANCE=0";

    newInfoObjects = getCms().executeQuery(queryStr);
    if (newInfoObjects.size() > 0)
    {
        reportObj = (IInfoObject)newInfoObjects.get(0);
        try
        {
            clientDoc = getCms().getReportAppFactory().openDocument(
                reportObj
                , OpenReportOptions._refreshRepositoryObjects
                , java.util.Locale.US);
            dc = clientDoc.getDatabaseController();
            conInfos = dc.getConnectionInfos(null);
            for (int i = 0; i < conInfos.size(); ++i)
            {
                oldConInfo = (ConnectionInfo)conInfos.getConnectionInfo(i);
                newConInfo = (ConnectionInfo)oldConInfo.clone(true);
                pBag = newConInfo.getAttributes();
                pBag.putStringValue("QE_ServerDescription", "alio");
                logonProps = new PropertyBag();
                logonProps.putStringValue("Trusted_Connection", "false");
                logonProps.putStringValue("Server", "alio");
                pBag.put("QE_LogonProperties", logonProps);
                newConInfo.setUserName("admin");
                newConInfo.setPassword("password");
                dc.replaceConnection(
                    oldConInfo
                    , newConInfo
                    , connFields
                    , connOptions);
            }
        }
        catch (ReportSDKServerException Ex)
        {
            String msg = "A server error occurred while processing the " + reportObj.getKind()
                + " object, " + reportObj.getTitle() + " (" + reportObj.getCUID() + "), from the CMS.";
            Utility.errorOut(msg, Ex);
        }
        catch (Exception Ex)
        {
            String msg = "An error occurred while processing the " + reportObj.getKind()
                + " object, " + reportObj.getTitle() + " (" + reportObj.getCUID() + "), from the CMS.";
            Utility.errorOut(msg, Ex);
        }
        finally
        {
            clientDoc.save();
            getCms().commitToInfoStore(newInfoObjects);
            clientDoc.close();
        }
    }
}
Edited by: Mark Young on Sep 10, 2009 2:16 PM<style type="text/css">
/*<![CDATA[*/
body {font-size: 1.125em; font-family: helvetica,arial,"sans-serif";}
.code {font-family: "courier new",courier,mono,monospace}
.bi {font-style: italic; font-weight: bold;}
/*]]>*/
</style>
<p>Justin,</p>
<p>
Thank you for the reply. Time constraints have not allowed me to post back to this thread
till now. I will try your suggestion. My assumption is that <i>Save the report back to the
info store</i> refers to <span class="code">IInfoStore.commit(IInfoObjects)</span>.
</p>
<p>
I'm afraid that I do not understand why I don't want to change the report client document,
or why <i>successfully exporting the report with the new login/password</i> is not what I
want to do. Any explanation on that statement would be appreciated.
</p>
<p>
I did find a way to accomplish my goal. It involved adding the SSOKEY property to the
logon property bag. Below you'll see my revised code which modifies the report logon and
server. I have no idea what
this does, and SAP support has not been able to tell me why it works. However, what I
discovered is that if I changed the report option, <b>Database Configuration -> When
viewing report:</b>, in the CMS to <span class="bi">Use same database logon as when report
is run</span> from <span class="bi">Prompt the user for database logon</span>, then the
SSOKEY property had been added to the logon property bag having an empty string as its
value. This allowed me to successfully update and save the modified logon back to the CMS.
</p>
<p>
So I took a chance and added code to always add the SSOKEY property with an empty string
as its value, and I could then successfully modify and save the report's logon info
and server. Again, I don't know what this means, but it has worked so far. If anyone has
some insight or comments, either are welcome. Thank you in advance.
</p>
<br />
<hr />
<pre>
private void changeDataSourceOfAWFCrystalReports()
    throws SDKException, ReportSDKException, java.io.IOException
{
    IInfoObjects newInfoObjects = null;
    IInfoObject reportObj = null;
    IReport curReport = null;
    ReportClientDocument clientDoc = new ReportClientDocument();
    DatabaseController dbController;
    PropertyBag pBag;
    PropertyBag logonProps;
    ConnectionInfo newConInfo;
    ConnectionInfo oldConInfo;
    ConnectionInfos conInfos;
    int connOptions = DBOptions._ignoreCurrentTableQualifiers + DBOptions._doNotVerifyDB;
    Fields connFields = null;
    String outputStr;
    int numOfReports;
    int numOfQueryPages;
    double progressIncrementPerPage = 30;
    int progressIncrementPerReport = 0;
    // Path query to reports is in a .properties file.
    String queryStr = getAppSettingsFile().getWscAwfCrystalReportPathQuery();
    try
    {
        // Executes IInfoStore.getPageingQuery() and generates a list of queries.
        getCms().setPathQueryQueries(queryStr, 100);
        numOfQueryPages = 0;
        // Gets a List<String> of the IPageResult returned from IInfoStore.getPageingQuery().
        if (getCms().getPathQueryQueries() != null)
            numOfQueryPages = getCms().getPathQueryQueries().size();
        if (numOfQueryPages > 0)
            // Use 30% of progress bar for the following loop.
            progressIncrementPerPage = Math.floor(30.0/(double)numOfQueryPages);
        for (int queryPageIndex = 0; queryPageIndex < numOfQueryPages; ++queryPageIndex)
        {
            // Gets the IInfoObjects returned from the current page query
            newInfoObjects = getCms().getPathQueryResultSetPage(queryPageIndex);
            numOfReports = newInfoObjects.size();
            if (newInfoObjects != null && numOfReports > 0)
            {
                progressIncrementPerReport =
                    Math.round((float)Math.floor(progressIncrementPerPage/(double)numOfReports));
                for (int reportIndex = 0; reportIndex < numOfReports; ++reportIndex)
                {
                    reportObj = (IInfoObject)newInfoObjects.get(reportIndex);
                    curReport = (IReport)reportObj;
                    clientDoc = getCms().getReportAppFactory().openDocument(
                        reportObj
                        , OpenReportOptions._refreshRepositoryObjects
                        , java.util.Locale.US);
                    dbController = clientDoc.getDatabaseController();
                    conInfos = dbController.getConnectionInfos(null);
                    for (int conInfosIndex = 0; conInfosIndex < conInfos.size(); ++conInfosIndex)
                    {
                        oldConInfo = (ConnectionInfo)conInfos.getConnectionInfo(conInfosIndex);
                        newConInfo = (ConnectionInfo)oldConInfo.clone(true);
                        pBag = newConInfo.getAttributes();
                        pBag.putStringValue(
                            "QE_ServerDescription"
                            ,getConfigFile().getDBDataSourceConnections());
                        logonProps = new PropertyBag();
                        logonProps.putStringValue("Trusted_Connection", "false");
                        <b>logonProps.putStringValue("SSOKEY", "");</b>
                        logonProps.putStringValue(
                            "Server"
                            ,getConfigFile().getDBDataSourceConnections());
                        pBag.put("QE_LogonProperties", logonProps);
                        newConInfo.setUserName(getConfigFile().getUNVConnectionUserName());
                        newConInfo.setPassword(getConfigFile().getUNVConnectionPasswordDecrypted());
                        dbController.replaceConnection(
                            oldConInfo
                            , newConInfo
                            , connFields
                            , connOptions);
                        newConInfo = (ConnectionInfo)conInfos.getConnectionInfo(conInfosIndex);
                    } // end for on conInfosIndex
                    clientDoc.save();
                } // end for on reportIndex
            } // end if on newInfoObjects
        } // end for on queryPageIndex
    } // end try
    catch (ReportSDKServerException Ex)
    {
        // handle...
    }
    catch (Exception Ex)
    {
        // handle...
    }
    finally
    {
        getCms().commitToInfoStore(newInfoObjects);
        if (clientDoc != null)
            clientDoc.close();
    }
}
</pre> -
Error while creating Data Source for master data attributes
Hi BI Experts,
Well its been some time for me that I have been part of Extraction in BI.I primarily handled reporting in my last assignments.
I was trying extraction with flat files in SAP BI 7 (new to SAP BI 7, but very familiar with BW 3.5) but failed during the master data attributes and text upload for an InfoObject (say IOSP_Mat).
Here is the procedure I followed after creating the characteristic IOSP_Mat: I created a source system for the flat file, followed by a data source for master data attributes, and selected all the parameters correctly (CSV file format, data separator ",", and the other settings). Now when I try to look at the proposed data in the next tab using "Load example data", it does not show the desired result. The columns maintained in the flat file are MAT_NUMBER and MAT_NAME (with, say, 100 records in the file).
The result is the same when I try to load the text data; the columns maintained there are LANGUAGE, MAT_NUMBER, Short Description (the same 100 records).
I then used transaction RSA1OLD to upload the file the 3.5 way: I created an InfoSource for master data/texts/hierarchies for IOSP_Mat.
Now when I upload using InfoPackages for master and text data, I observe that the data is not maintained in the characteristic IOSP_Mat.
When I monitored the load, I found the data had not even been uploaded to the PSA level.
Can you BI experts tell me the reason for this?
Thanks,
Srijith

Apologies to all of you for the late response; I was busy with some other activities.
I don't remember the exact message, but I do remember the data was not loaded even to the PSA level. I will try it again and post the exact message.
Thanks again for your quick responses, and once again sorry for my late reply.
Thanks,
Sri -
OIM 9.1.0.2 - Weblogic JDBC Multi Data Sources for Oracle RAC
Does OIM 9.1.0.2 BP07 support WebLogic JDBC Multi Data Sources (Services > JDBC > Multi Data Sources) for Oracle RAC, instead of putting the Oracle RAC JDBC URL directly on the JDBC data sources for xlDS and xlXADS (Services > JDBC > Data Sources > xlDS|xlXADS > Connection Pool > URL)?
If yes, are there any other modifications that need to be made in OIM, or just the data source change?

Yes, it's supported. You install against one instance of the RAC server directly; then you update the config.xml file and the JDBC resource in your WebLogic server with the full RAC address. It is documented for installation against RAC: http://docs.oracle.com/cd/E14049_01/doc.9101/e14047/database.htm#insertedID2
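For reference, a "full RAC address" of the kind mentioned above is a thin-driver URL listing all nodes; hostnames, port, and service name below are placeholders, not values from this thread:

```
jdbc:oracle:thin:@(DESCRIPTION=
  (ADDRESS_LIST=
    (ADDRESS=(PROTOCOL=TCP)(HOST=rac-node1)(PORT=1521))
    (ADDRESS=(PROTOCOL=TCP)(HOST=rac-node2)(PORT=1521))
    (LOAD_BALANCE=ON))
  (CONNECT_DATA=(SERVICE_NAME=oimdb)))
```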
-Kevin