Calculated members in Data Manager when run package
Hi All
Do you know why calculated members in the C_ACCT dimension (which have children and are calculated by hierarchy) are not available in Data Manager when I want to run a package? I want to copy these calculated members to another account using script logic and run it via Data Manager.
Below is my script
*XDIM_MEMBERSET C_ACCT = IC_BSAS05
*WHEN ENTITY
*IS 1013
*REC(FACTOR=2,C_ACCT="ICX_BSCS02")
*ENDWHEN
*COMMIT
IC_BSAS05 is calculated by hierarchy.
It works for non-calculated members; please tell me whether it is also possible to run the script for calculated members.
Regards!
Hi Justyna,
When you say that IC_BSAS05 is "calculated by hierarchy", do you mean that IC_BSAS05 is a hierarchy node?
If so, try replacing the first line of your script with
*XDIM_MEMBERSET C_ACCT = BAS(IC_BSAS05)
Regards,
Gersh
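For reference, if IC_BSAS05 is indeed a hierarchy node, the full script would then read as below. This is just the original script with Gersh's suggested first line; BAS() expands the node to its base (stored) members, which is what the WHEN/REC loop needs to operate on:

```
*XDIM_MEMBERSET C_ACCT = BAS(IC_BSAS05)
*WHEN ENTITY
*IS 1013
*REC(FACTOR=2,C_ACCT="ICX_BSCS02")
*ENDWHEN
*COMMIT
```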
Similar Messages
-
Error in process chain when running package (data manager)
Hello experts,
I tried to run the following how-to: "move data between applications using Data Manager ..."
You can find the PDF here: http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/b0480970-894f-2d10-f9a5-d4b1160be203
I'm looking for someone who has used it successfully. On my side, after implementing all the steps and running the package, I get the message "IMMEDIATE RUN: Request to run the package on the server was successful The package is now running"
When I look the log for the process chain, overall status of the process chain is yellow, and steps "modify dynamically" and "clear BPC Tables" are in error.
I can't get any useful error messages.
If someone has an idea how to get details about these errors, it would be appreciated.
Points will be awarded, thanks in advance.
Guillaume P.
Hello Experts,
I am also facing the same issue with PC failing at Modify_Dynamically step....
This process type is working fine when used in standard delivered process chain to load transaction data.
I tried using this variant in my custom chain, but it fails.
Please suggest.....
Thx....D -
Receiving error in BPC Data Manager while running Export package.
We are in a multi-server environment and receive the error message below while running the Export package.
Has anyone seen this error message in BPC 5.1?
An error occurred while executing a package.
Package Error Events:
ErrorCode = -1073668060
Source = Dump Data
SubComponent=
Description = The task "Dump Data" cannot run on this edition of Integration Services. It requires a higher level edition.
IDOfInterfaceWithError= {8BDFE889-E9D8-4D23-9739-DA807BCDC2AC}
Thanks
Sridhar
Below are our BPC versions:
BPC on Server Manager: 5.0.486
Data Manager from eData: 5.0.484
BPC from eTool: 5.0.486
Below is the full error message, following your suggestion to run the Export package (even with the service account we used to install the software, we get the same error message).
TOTAL STEPS 2
1. Dump Data: Failed in 0 sec.
[Selection]
FILE=\ApShell_SK\FINANCE\DataManager\DataFiles\SKTEST.TXT
TRANSFORMATION=\ApShell_SK\FINANCE\DataManager\TransformationFiles\System Files\Export.xls
MEASURENAME=PERIODIC
(Member Selection)
Category: ACTUAL
Time: 2006.JAN
Entity:
Account:
DataSrc:
IntCo:
RptCurrency:
[Messages]
An error occurred while executing a package.
Package Error Events:
ErrorCode = -1073668060
Source = Dump Data
SubComponent=
Description = The task "Dump Data" cannot run on this edition of Integration Services. It requires a higher level edition.
IDOfInterfaceWithError= {8BDFE889-E9D8-4D23-9739-DA807BCDC2AC} -
Data Manager not running scripts
Dear Experts,
I've followed one of the "How to pass parameters to BPC scripts" guides and created a process chain to run scripts. The whole thing didn't go as smoothly as expected, and I'm not sure which part went wrong.
As I'm having trouble validating the complex version of the code, I wrote something very simple instead, like:
*XDIM_MEMBERSET AcctDetail=Non_AcctDetailInput
*WHEN ACCOUNTPLAN
*IS "SSVV060"
*REC(EXPRESSION=%VALUE%,P_DataSrc="INPUT")
*ENDWHEN
*COMMIT
I ran the package, which should trigger the process chain, but it never showed up in the package status, as if it had never run at all.
I'm posting my script problem on another thread.
anyone has an idea? Thanks a million...
Jim Hsu
Please check your task profile.
Make sure you have all task profiles related to Data Manager. Do let us know if you still face this problem. -
Parent members not being updated when running AGG
I have a scenario where, even though the bottom-level members change, the parent members are not updated when we run an agg.
For example, if the bottom-level member was 150 and we run the aggregation, the numbers are fine. But if the bottom-level member is changed to #Missing, the parent members are not updated when we run the agg.
In the script below we use CALC DIM(CostCenters,SpaceType).
My concern is the SET FRMLBOTTOMUP ON setting I use, because it looks like it's skipping that block. How can I work around this issue without affecting performance too much? For example, using SET CREATEONMISSINGBLK ON might help, but it will have a performance impact.
Here are the details of the script
SET MSG SUMMARY;
SET FRMLBOTTOMUP ON;
SET CALCPARALLEL 4;
FIX("Budget","Version1", "FY2011", "RSF","No_Period","M3","M6","M9","M12",@IDESCENDANTS("$1"),@LEVMBRS
(SpaceType,0),@LEVMBRS(CostCenters,0))
CALC DIM (ManagedBU,AllocatedBU);
ENDFIX
FIX("Budget", "Version1", "FY2011", "RSF","No_Period","M3","M6","M9","M12",AllocatedBU,ManagedBU,
@LEVMBRS(SpaceType,0),@LEVMBRS(CostCenters,0))
"$1";
ENDFIX
/* Aggregating numbers for the report so they can be viewed at a top cost center level */
FIX("Budget", "Version1", "FY2011", "RSF","No_Period","M3","M6","M9","M12",@IDESCENDANTS(AllocatedBU),@IDESCENDANTS(ManagedBU),@IDESCENDANTS("$1"))
CALC DIM(CostCenters,SpaceType);
ENDFIX
/* Aggregating numbers for the report so they can be viewed at a top regional level */
FIX("Budget", "Version1", "FY2011", "RSF","No_Period","M3","M6","M9","M12",@IDESCENDANTS(ManagedBU),@IDESCENDANTS(AllocatedBU),@IDESCENDANTS(SpaceType),CostCenters)
@IANCESTORS("$1");
ENDFIX
Yes, AGGMISSG will work with FRMLBOTTOMUP. As for performance, believe it or not, having AGGMISSG on is quicker than off. That is because when it is off, Essbase has to look at the child blocks to determine whether data values are present that need to overwrite the parent value, whereas when it is on, it does not look; it just overwrites.
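The behavior described in the reply above can be condensed into a few lines. This is only an illustrative model of the SET AGGMISSG semantics, not Essbase code, and the function name is made up: with AGGMISSG ON the aggregation always overwrites the parent (it becomes #Missing, modeled here as None, when every child is missing), while with AGGMISSG OFF an all-missing set of children leaves the stale parent value in place, which is exactly the symptom reported.

```python
def aggregate(parent, children, agg_missing):
    """Model of an Essbase parent aggregation.

    parent      -- current stored parent value (None means #Missing)
    children    -- list of child values (None means #Missing)
    agg_missing -- True models SET AGGMISSG ON, False models OFF
    """
    present = [v for v in children if v is not None]
    if agg_missing:
        # AGGMISSG ON: the aggregation always overwrites the parent,
        # even when every child is #Missing.
        return sum(present) if present else None
    # AGGMISSG OFF: when all children are #Missing, the old parent
    # value survives -- the "parent not updated" symptom above.
    return sum(present) if present else parent

# Bottom-level member was 150, then cleared to #Missing:
print(aggregate(150, [None, None], agg_missing=False))  # 150 (stale)
print(aggregate(150, [None, None], agg_missing=True))   # None (#Missing)
```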
-
Freezing when "Running Package Scripts"
Hello,
Recently, I've tried running two different installations. One was a Microsoft Office update and the other was installing some design software. Both had the same problem: the Installer freezes once it gets to the "Running package scripts" part of the installation, with the install time remaining stuck at "Less than a minute." When this first happened with Microsoft Office, I ended the program. Then, when I was shutting down, it wouldn't let me shut down the computer until the installer ended, so I let it sit for a few hours and it finally worked. Is there any way to fix this issue?
Hi there,
This can occasionally occur if your disk permissions are incorrect. I would recommend taking a look at the article below and repairing the disk permissions on your drive.
About Disk Utility's Repair Disk Permissions feature
http://support.apple.com/kb/ht1452
Hope that helps,
Griff W. -
How data flow when SSIS packages are run on a different server than the DB server
The scenario is that I have a dedicated SQL Server 2014 SSIS machine that executes the packages.
The database server is a separate machine with SQL Server 2008 R2.
1) Running SSIS packages that transfer data within SQL Server 2008R2 (same machine)
2) Running SSIS packages that transfer data between 2 separate SQL Server servers.
How does the data flow in these two cases, and which resources (CPU, disk, RAM, network) are used where?
Elias
When you have a dedicated SSIS server, all the data read flows to that server, is processed using the resources of that ETL server, and is then sent back over the network to the destination server.
It doesn't matter if source and destination are the same server. If you use a data flow, all data flows over the network twice.
The only exception is when you don't use a data flow, but only SQL statements. In that case, data flows only between source and destination.
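The rule in this reply can be condensed into a toy model. This is only an illustration of the traffic pattern described, not anything SSIS-specific, and the function name is made up:

```python
def network_bytes(data_size, uses_data_flow, same_server):
    """Bytes crossing the network to move `data_size` bytes,
    per the rule described in the reply above."""
    if uses_data_flow:
        # A data flow always routes through the SSIS server:
        # source -> ETL server -> destination = two network trips,
        # even when source and destination are the same machine.
        return 2 * data_size
    # Pure SQL statements run on the database engine; data moves
    # directly between source and destination (not at all when
    # they are the same server).
    return 0 if same_server else data_size

print(network_bytes(100, uses_data_flow=True, same_server=True))   # 200
print(network_bytes(100, uses_data_flow=False, same_server=True))  # 0
```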
MCSE SQL Server 2012 - Please mark posts as answered where appropriate. -
Error when running package to load data from InfoProvider
I encounter the following error when using the standard package to load data from an InfoProvider into BPC. I tried loading the data from both a DSO and an InfoCube, but I get the same error.
The log message where the error occurs is as follows:
Task name CLEAR CUBE DATA:
Could not perform write (INHERITED ERROR)
The "convert" step of the process chain completed successfully. This error is happening when the "clear cube according data" step of the process chain gets executed.
Any ideas on why this is happening?
Thanks.
Hi Laeral:
You should be able to just open one VISA session then write and read both commands then close the session at the end. This sounds to me like an error that comes from trying to open two VISA sessions to the instrument at the same time. I have attached some very basic LabVIEW VISA code that writes and reads two different commands. Hopefully it will get you started.
Regards,
Emilie S.
Applications Engineer
National Instruments
Attachments:
Basic VISA Example Two Commands.vi 39 KB -
Hello,
When I move the package from development to production, the package fails due to the Excel connection manager.
Error message---
Message
Executed as user: Answers\Administrator. ...lt: 0x80040154 Description: "Class not registered". End Error Error: 2014-06-27 05:10:26.50 Code: 0xC0202009 Source: CPRO_Prod_to_XLS_Worksheet_Basic_EX
Connection manager "Excel Connection Manager" Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040154. An OLE DB record is available. Source: "Microsoft OLE
DB Service Components" Hresult: 0x80040154 Description: "Class not registered". End Error Error: 2014-06-27 05:10:26.50 Code: 0xC0202009 Source: CPRO_Prod_to_XLS_Worksheet_Basic_EX
Connection manager "Excel Connection Manager" Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040154. An OLE DB record is available. Source: "Microsoft OLE
DB Service Components" Hresult: 0x80040154 Description: "Class not registered". End Error Error: 2014-06-27 05:10:26.50 Code: 0xC0202009 Source... The package execution
fa... The step failed.
Any help would be appreciated.
Hi Vinay,
As Visakh mentioned, the issue usually occurs because the package runs in the 64-bit runtime while the Excel driver is 32-bit. According to your other thread, it seems that you use SSIS 2005, right? In SSIS 2005 there is no "use 32 bit runtime" option for the package execution of an SSIS package job step. In this situation, you have two choices:
Install the 64-bit Microsoft ACE OLE DB 12.0 Provider on the production server. Here is the download link (AccessDatabaseEngine_x64.exe):
http://www.microsoft.com/en-in/download/details.aspx?id=13255
Create a CmdExec type job step instead of an SSIS Package job step, so that we can call the 32-bit DTExec.exe to execute the package from the command line. For more information, please see:
http://support.microsoft.com/kb/934653/en-us
Regards,
Mike Yin
TechNet Community Support -
No data loaded when running publish/subscribe sample on Tomcat
I have Tomcat 6.0 running as a .jsp handler with Apache 2.2 using the mod_jk connector. I've configured the samples/ context to be handled via Tomcat. Everything seems to be working fine -- I can access the files and launch .jsp files via my web browser.
But when I try to fire up the DataPush example, I get no results from BlazeDS. Here's the procedure I am using:
1. Start the feed by running http://localhost:8080/samples/testdrive-datapush/startfeed.jsp
2. Run the Flex side: http://localhost:8080/samples/testdrive-datapush/index.html.
3. Press the "Subscribe to 'feed' destination."
Using Firebug within Firefox, I can tell that it's attempting to access a streaming channel. There are two requests:
1) http://localhost:8080/samples/messagebroker/streamingamf
2) http://localhost:8080/samples/messagebroker/streamingamf?command=open&version=1
Request #1 comes back with a plain-text "Loading..." response.
Request #2 returns an error message: The request sent by the client was syntactically incorrect ().
In case it matters, I have been playing around with the config files in previous tests. For this test, I deleted samples.war and the samples/ directory from Tomcat. Then I copied the original samples.war back in and restarted Tomcat to redeploy. I would think this erases all of my configuration settings, but it does not seem to be doing that (the Data Push/Pull does NOT default to streaming in the Turnkey package, yet when I hit the service again, it always tries streaming first, as I had told it to do in a previous configuration).
So I have two questions:
1. Why doesn't the example work using streaming?
2. Why aren't my configuration files being reloaded when I restart Tomcat?
TIA
Jonathon
Hi Alex,
Thanks for your advice, but it does not work.
With Apache fronting Tomcat, the following config works:
channel 1 :
Hi. Here are answers to your questions.
>
>>>1. Why doesn't the example work using streaming?
>Some proxies buffer the response that is sent back to the client. This causes streaming connections to not work. This is a known issue with the mod_jk connector. Take a look at the following bug for more info.
>http://bugs.adobe.com/jira/browse/BLZ-84
>
>>>2. Why aren't my configuration files being reloaded when I restart Tomcat?
>Did you recompile the sample application after changing the configuration? The configuration information is used by both the server and the client and is compiled into the client. You will need to recompile the client application using the new configuration files if you want your changes to be picked up. Note that it is possible to load the configuration information into the client application at runtime, but that's not how the sample applications were built. If you are interested in loading the configuration information at runtime, here is a blog posting that gives an example.
>http://coenraets.org/blog/2009/03/externalizing-service-configuration-using-blazeds-and-lcds/
>Hope that helps.
>-Alex
> -
How can I get the size of the fetched data when running a SQL statement?
Hi,
When I run a SQL statement, how can I get the size of the fetched data?
regards,
We can get it using some calculations:
1. The SQL%ROWCOUNT attribute gives you the number of rows returned by a SQL statement.
Problem --> now you need the size of one row.
Solution -->
Step 1. Get the size of the table from the DBA_SEGMENTS data dictionary view:
Select bytes from dba_segments where segment_name like '%YOURTABLE%';
Step 2. Get the count of rows in your table:
Select count(*) from YOUR_TABLE;
Step 3. Get the size of one row --> divide the result of step 1 by the result of step 2.
Required result --> multiply the result of step 3 by SQL%ROWCOUNT.
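The arithmetic in the steps above can be sketched as follows. This is an illustrative calculation only; the segment size and row count would come from the DBA_SEGMENTS query and the SELECT COUNT(*) described above, and the figures used here are made up:

```python
def estimate_fetched_bytes(segment_bytes, total_rows, fetched_rows):
    """Estimate the size of a fetched result set.

    segment_bytes -- table size from DBA_SEGMENTS (step 1)
    total_rows    -- SELECT COUNT(*) on the table (step 2)
    fetched_rows  -- SQL%ROWCOUNT of the statement
    """
    avg_row_bytes = segment_bytes / total_rows   # step 3
    return fetched_rows * avg_row_bytes          # required result

# An 8 MB segment holding 10,000 rows; the query fetched 250 rows:
print(estimate_fetched_bytes(8 * 1024 * 1024, 10_000, 250))  # 209715.2
```

Note this is only an average: segment size includes free space and block overhead, so the per-row figure overstates the pure data size.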
I hope this is what you want. -
Issue with date formula when running a report on the 1st day of the month
We have a formula that compares last month's data against the same month last year. The report runs on the 1st of every month, but it errors when it runs in January: since January is the first month of the new year, it looks for a month beginning with zero instead of looking back at December, which would be 12.
Does anyone any solutions for this issue?
Thanks,
Chris
Examples of some formulas we have tried, but none seem to work:
if {@MonthName} = "December" then year({CDCCHD.CDOPDT}) in year(currentdate)-1 to year(currentdate)-2 else year({CDCCHD.CDOPDT}) in year(currentdate)
OR
(if {@Last Full Month Name}= "December" then year({CDCCHD.CDOPDT})= year(currentdate)-2 or
{CDCCHD.CDOPDT} in lastyearytd else {CDCCHD.CDOPDT} in lastyearytd or
{CDCCHD.CDOPDT} in yeartodate)
OR
({CDCCHD.CDOPDT} in dateserial(year(currentdate)-1,month(currentdate),01)-1 to dateserial(year(currentdate)-1,month(currentdate)-1,01) or
{CDCCHD.CDOPDT} in lastfullmonth)
OR
if {@monthname} = "December" then {CDCCHD.CDOPDT} in dateadd("m", -28, (currentdate)) to (currentdate)-14 else {CDCCHD.CDOPDT} in dateadd("m", -14, (currentdate)) to (currentdate)
Chris,
Give these formulas a try...
Beg of Last Month
DateAdd("m", DateDiff("m",#1/1/1900#, CurrentDate) - 1, #1/1/1900#)
End of Last Month
DateAdd("m", DateDiff("m",#1/1/1900#, CurrentDate), #1/1/1900#)
Beg December of Last Year
IF DatePart("m",CurrentDate) = 1
THEN DateSerial(DatePart("yyyy",CurrentDate) -2, 12,1)
ELSE DateSerial(DatePart("yyyy",CurrentDate) -1, 12,1)
End December of Last Year
IF DatePart("m",CurrentDate) = 1
THEN DateSerial(DatePart("yyyy",CurrentDate) -1, 1,1)
ELSE DateSerial(DatePart("yyyy",CurrentDate), 1,1)
Note that the "End" formulas produce a value that is midnight of the day after the one you would normally expect to see... For example, if you are expecting to see 12/31/2009, it will show 1/1/2010.
The reason is that the formulas come out to midnight of the day shown... So a range of 12/1/2009 - 1/1/2010 will include 12/31/2009's data but none of 1/1/2010 (12/1/2009 - 12/31/2009 would actually cut off 12/31/2009).
If for some reason your data is stored without time values, then an adjustment would need to be made.
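The January edge case that broke the original formulas (month(currentdate) - 1 evaluating to 0) is easy to verify with plain date arithmetic. Below is a sketch of the same logic in Python, with hypothetical helper names; the exclusive upper bound mirrors the midnight behavior noted above:

```python
from datetime import date

def beg_of_last_month(today):
    """First day of the previous month; rolls back to December
    of the prior year correctly when `today` is in January."""
    if today.month == 1:
        return date(today.year - 1, 12, 1)
    return date(today.year, today.month - 1, 1)

def end_of_last_month_exclusive(today):
    """First day of the current month: the exclusive upper bound,
    matching the 'midnight of the following day' note above."""
    return date(today.year, today.month, 1)

# On 1 Jan 2010 the range correctly covers December 2009:
print(beg_of_last_month(date(2010, 1, 1)))            # 2009-12-01
print(end_of_last_month_exclusive(date(2010, 1, 1)))  # 2010-01-01
```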
HTH,
Jason -
NO DATA error when running init load with 2LIS_11_VAHDR
hi experts,
I am trying to load initial data with 2LIS_11_VAHDR.
the steps I did:
1. 2LIS_11_VAHDR is activated in R/3 Log.Cockpit LBWE
2. 2LIS_11_VAHDR is replicated in BW
3. Setup tables are filled for SD Sales Orders in R/3
4. testing 2LIS_11_VAHDR with RSA7 in R/3 delivers 981 records
When I start the InfoPackage with "Initialize delta process with Data Transfer", the error "0 Records - NO DATA" occurs.
Any Idea what I'm doing wrong?
Best Regards
Neven
Hi,
Go to transaction BD87 and check for any yellow requests. If there are, select them and process them manually.
or
In the monitor: Environment -> Transactional RFC -> In the source system.
Then give the user ID and password for the source system and try to execute; see if any LUWs are pending and process them.
Date format when running in Mexico vs US
When using RS 2005: I have a report that I developed in the US and that runs on a server in the US. I can create a subscription with no problems.
When our region in Mexico tries to create a subscription, it gives an error on the date selections because it wants dd/mm/yyyy instead of mm/dd/yyyy.
Is there a way around this problem?
The solution will require custom code, since SSRS does not support localization. CodeProject has a couple of solutions:
http://www.codeproject.com/Articles/294636/Localizing-SQL-Server-Reporting-Services-Reports
http://www.codeproject.com/Articles/33355/Localization-of-SSRS-Reports
"You will find a fortune, though it will not be the one you seek." -
Blind Seer, O Brother Where Art Thou
Please Mark posts as answers or helpful so that others may find the fortune they seek. -
Weird data obtained when running Task: AD Group Lookup Recon
Hi,
I'm running the scheduled task named AD Group Lookup Recon.
It works and populates the lookup named Lookup.ADReconciliation.GroupLookup,
but when looking in the Design Console, the Code Key and Decode values contain weird data, i.e.:
code key: 2~CN=TelnetClients,CN=Users,DC=adtest,DC=com
Decode: ADITResource~CN=TelnetClients,CN=Users,DC=adtest,DC=com
in the Code Key there is an extra *2~*
in the Decode there is an extra ADITResource~
I think it may be some kind of encoding for connector commands used in provisioning tasks. When I try to provision an OIM user to Active Directory, I get this data (in the Organization Lookup field):
this is just one line:
Value: 2~CN={6AC1786C-016F-11D2-945F-00C04fB984F9},CN=Policies,CN=System,DC=adtest,DC=com
Description: ADITResource~CN={6AC1786C-016F-11D2-945F-00C04fB984F9},CN=Policies,CN=System,DC=adtest,DC=com
Any Ideas?
Thank You.
Yes, you are right: the Code Key and Decode values look like this because of coding in the connector to distinguish lookup values coming from multiple IT resources.
If you want to get rid of this [IT Resource~] prefix, you will have to modify the connector.
One more thing: it looks like the base DN you specified for lookup reconciliation is DC=adtest,DC=com with a generic filter; that's why you are getting entries like 2~CN={6AC1786C-016F-11D2-945F-00C04fB984F9},CN=Policies,CN=System,DC=adtest,DC=com, which may not be groups you want.
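The tilde format called out above can be illustrated with a small parser. This is only a sketch of the pattern described in this thread (an IT resource key or name, a tilde, then the actual value); the function name is hypothetical:

```python
def split_lookup_entry(code_key, decode):
    """Split connector lookup entries of the form '<prefix>~<value>'.

    The Code Key prefix is the IT resource's numeric key (e.g. '2');
    the Decode prefix is the IT resource's name (e.g. 'ADITResource').
    """
    res_key, value = code_key.split("~", 1)
    res_name, _ = decode.split("~", 1)
    return res_key, res_name, value

print(split_lookup_entry(
    "2~CN=TelnetClients,CN=Users,DC=adtest,DC=com",
    "ADITResource~CN=TelnetClients,CN=Users,DC=adtest,DC=com"))
# ('2', 'ADITResource', 'CN=TelnetClients,CN=Users,DC=adtest,DC=com')
```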
Hope this helps,
Sagar