Transformation File - Usage of Len function in Mapping-
Hi,
I am trying to load data from a flat file into BPC 7.
I have maintained a transformation file; under the mapping options I am using the following condition:
Mapping:
Account = if (LEN(Col(7)) = 4 then Col(7) +str(.) +*Col(5) ; *Col(7) = *Str ( ) then *Col(5) + *Str(.) + *Col(17); *Col(5) + *Col(4))
In the first condition I am trying to check whether column 7 has four characters, but it does not work and action 1 is never reached. I am not sure whether a length function can be used in BPC. There are no errors during validation, but when I run this package the transformation for the Account field does not happen.
It gives the message "Argument Length must be greater or equal to zero". I have tried many ways, but the same error message is displayed.
Is the above mapping correct, and is there any other way to check the length of a field and make a comparison?
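For reference, the branching I am after can be sketched in plain Python (the function name and sample values are hypothetical; this only illustrates the intended logic, not BPC syntax):

```python
def map_account(col4, col5, col7, col17):
    """Illustrative only: the three branches the *IF mapping is meant to express."""
    if len(col7) == 4:        # action 1: column 7 has exactly four characters
        return col7 + "." + col5
    if col7 == "":            # action 2: column 7 is empty
        return col5 + "." + col17
    return col5 + col4        # fallback: concatenate columns 5 and 4

print(map_account("40", "1000", "2100", "X1"))    # -> 2100.1000
print(map_account("40", "1000", "", "X1"))        # -> 1000.X1
print(map_account("40", "1000", "210000", "X1"))  # -> 100040
```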
Any quick help is highly appreciated.
Thanks in advance!
Edited by: N. Thanveer Ahmed on Aug 22, 2009 7:02 PM
Similar Messages
-
Script Logic VS Data Transformation File
Hi all,
I'm new to SAP BPC. I have knowledge of SAP BW.
I can see the conversion file, which we refer to in the data transformation file and which we can use for mapping and converting external data into internal data.
How is a data transformation file different from script logic? Do we refer to script logic in the data transformation file for each required dimension?
Can any of you clarify how script logic and the data transformation file fit into the BPC data management flow?
I will really appreciate all your help!
Thanks
Ben.
Nilanjan,
I have a another quick question...
Suppose my BPC application has 5 dimensions. For 4 of the 5 dimensions I get data directly from SAP BW; the remaining dimension I need to extract by doing a lookup against a different table, which also resides in BW.
How do I populate data for DIM 5?
I got your point that the transformation file is purely for field mapping. If I want to populate DIM5 from script logic, what do I need to map in the transformation file? I hope you got my point.
My question is how to populate a dimension in BPC using a lookup approach.
Thanks,
Ben. -
Mapping in Transformation file for loading infoprovider
Mapping in transformation file for load from infoprovider:
The requirement is : if Account of BW starts with 70XXXXXXX then use char1 if Account of BW starts with 12XXXXXXX then use char2 in BPC dimension 2.
So, in the transformation file for a load from an infoprovider we want for a dimension to use the data from a certain BW characteristic based on the characteristic Account.
For example, if the account starts with 70 then for a certain BPC dimension "detail" the characteristic 0COUNTRY should be used; if the account starts with 12, characteristic X should be used, etc.
The following in the transformation file works, but the issue is that we have to specify all the accounts individually (100+ accounts in the statement, which is not feasible):
BPC_detail = *IF (BWACCOUNT = str(70000010) then 0COUNTRY;str(NO_DETAIL))
Where BPC_detail is the dimension in BPC and BWACCOUNT is the characteristic in BW.
The following statement does not work, and there is also no documentation available on how to do this:
BPC_detail = *IF (BWACCOUNT(1:2) = str(70) then 0COUNTRY;str(NO_DETAIL))
Is there a solution/statement that fulfills this requirement for the load of an infoprovider?
( so similar to what you can do with the load of a flat file like for example: Entity=IF(col(1,1:1)=U then SEntity;*col(1,1:1)=Z then *col(1,3:6); *STR(ERR)) )
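For comparison, the prefix rule itself can be sketched in Python (the function name and the placeholder for the second characteristic are made up; only the 70/12 prefixes come from the requirement above):

```python
def detail_for_account(account, country_value):
    """Route a BPC 'detail' value based on the first two characters of the BW account."""
    prefix = account[:2]
    if prefix == "70":
        return country_value   # use the 0COUNTRY characteristic's value
    if prefix == "12":
        return "CHAR2"         # placeholder for the second characteristic
    return "NO_DETAIL"

print(detail_for_account("70000010", "DE"))  # -> DE
print(detail_for_account("12345678", "DE"))  # -> CHAR2
print(detail_for_account("99999999", "DE"))  # -> NO_DETAIL
```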
Rgds
Hi,
Install process chain /CPMB/LOAD_INFOPROV_UI from BI Content as follows:
1.Enter Tcode RSA1
2. In the left navigation bar, click 'BI content'
3. Select process chain and double click "Select Objects".
4. Select the process chain /CPMB/LOAD_INFOPROV_UI.
5. Click 'Transfer Selections' button.
6. On the right pane, install objects from BI Content.
7. Enter Tcode SE38.
8. Enter the program name UJS_ACTIVATE_CONTENT and execute it.
9. Select only the option 'Update DM Default Instructions'.
10. Execute program.
Hope it helps..
Regards,
Raju -
Conditional mapping in the transformation file
Hi
Can anyone tell me if it is possible to perform more complex mapping when loading transactional data? I have 2 requirements:
1. When the Profit Centre is invalid, set it to a default value of "9999" (an invalid Profit Centre being one that does not exist as a dimension member).
2. When the Profit Centre is null, then set it to a default value based on the Entity value.
For example:
When the Profit Centre is null and the Entity = 100, set the profit centre to 1000.
When the Profit Centre is null and the Entity = 200, set the profit centre to 2038. etc
Ideally I would like to define the mapping rules in either transformation or conversion files.
Regards, Sharon
What I can think of right now is:
Case 1:
If you're on BPC 7.0 NW, there is no option to accept a non-existent member and store it with a default value (correct me if I'm wrong).
But you can enable VALIDATERECORDS=YES in the transformation file to output the rejected records with wrong dimension members into a file. Since we need all the rejected records while keeping the current import running, set MAXREJECTCOUNT=-1.
Next, add a task to your import package to rename the Profit Center of all rejected records to the default value (9999) and re-import them.
Limitation: this works only if you need to fix a single dimension, in this case the Profit Center. If you need to fix multiple dimensions, it would not work, because the rejected records could be caused by any dimension. So you're lucky in your case.
If you're on BPC 7.5 NW, use a DM BADI to do what you want.
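The rename-and-reimport step for the 7.0 approach above can be scripted outside BPC; a minimal sketch, assuming the rejected rows are parsed into lists and the Profit Center is the third column (the column position and row layout are assumptions):

```python
def fix_rejects(rows, pc_col=2, default="9999"):
    """Replace the Profit Center in each rejected row with the default member."""
    fixed = []
    for row in rows:
        row = list(row)          # copy so the input rows stay untouched
        row[pc_col] = default
        fixed.append(row)
    return fixed

rejected = [
    ["ACTUAL", "2010.JAN", "BAD_PC", "100"],
    ["ACTUAL", "2010.FEB", "XX_PC", "200"],
]
print(fix_rejects(rejected))  # every row now carries Profit Center 9999
```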
Case 2:
You can use IF in your mapping section, for example:
ProfitCenter=*IF(ProfitCenter = *str(),*IF(Entity=100,*str(1000),...),ProfitCenter)
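In plain terms, that nested *IF is a lookup with a fallback; a Python sketch of the same idea (the function and dict names are made up; the entity defaults and the 9999 catch-all come from the question):

```python
ENTITY_DEFAULT_PC = {"100": "1000", "200": "2038"}  # defaults from the requirement

def resolve_profit_center(profit_center, entity):
    """Blank Profit Center falls back to an entity-specific default, else 9999."""
    if profit_center:
        return profit_center
    return ENTITY_DEFAULT_PC.get(entity, "9999")

print(resolve_profit_center("", "100"))      # -> 1000
print(resolve_profit_center("", "200"))      # -> 2038
print(resolve_profit_center("2500", "100"))  # -> 2500
```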
Halomoan -
Mapping navigation attributes on transformation files
All,
I have one navigational attribute I would like to map to a dimension in my transformation file. I've tried:
INTCO=0COMP_CODE__0COMPANY
and
INTCO=0COMPANY
Either way I get an error message saying it is expecting an * after the equals sign. How can I map / use navigational attributes here?
Thanks
Paulo
Hi Tony,
*SELECTION is one of the parameters allowed in the *OPTIONS section of a transformation file. You can see the syntax in the transformation documentation at [http://help.sap.com/saphelp_bpc75_nw/helpdata/en/5d/9a3fba600e4de29e2d165644d67bd1/frameset.htm]. It is only allowed in transformation files used to load from InfoProviders, not from flat-files.
Ethan -
How to load several transformation files with a single action
Hi everybody,
We are loading data from BI cube into BPC cube. We are working on SAP BPC 7.0 version and we have designed several transformation files in order to load each key figure we need.
Now we want to load all the transformation files by executing only one action. What is the best way to do this?
We thought it would be possible to build a single process chain that references the target cube and all the transformation files. That way, the administrator only has to execute a single package once to run the process chain. We don't want the administrator to execute a package several times, selecting the different transformation files each time.
How can we do it? Is there any example or document related to it?
Any idea out there?
Kind regards
Albert Mas
Hi Scott,
I am facing a problem when I run 2 rounds in one transformation file.
I need to distribute a source field into BPC by making 2 conversion files. Following is the data:
Transformation file
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ,
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=YES
CREDITPOSITIVE=YES
MAXREJECTCOUNT=
ROUNDAMOUNT=
CONVERTAMOUNTWDIM=ZOUTPUT
*MAPPING
CATEGORY=*NEWCOL(ACT)
PAO=0COSTCENTER
TIME=0FISCYEAR
ZOUTPUT=0FUNDS_CTR
SIGNEDDATA=0DEB_CRE_LC
*CONVERSION
PAO=PAO_CONVER.XLS
ZOUTPUT=ZOUTPUT_CONVER.xls
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ,
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=YES
CREDITPOSITIVE=YES
MAXREJECTCOUNT=
ROUNDAMOUNT=
CONVERTAMOUNTWDIM=ZOUTPUT
*MAPPING
CATEGORY=*NEWCOL(ACT)
PAO=0COSTCENTER
TIME=0FISCYEAR
ZOUTPUT=0FUNDS_CTR
SIGNEDDATA=0DEB_CRE_LC
*CONVERSION
PAO=PAO_CONVER.XLS
ZOUTPUT=AMOUNT_CONVER.XLS
Conversion file 1 (PAO=PAO_CONVER.XLS)
EXTERNAL    INTERNAL
ID0001      F08001
ID0002      F08001
ID0003      F08001
DG0001      F08001
DG0002      F08001
Conversion file 2 (ZOUTPUT=ZOUTPUT_CONVER.xls)
EXTERNAL    INTERNAL    FORMULA
ID0001      FX01        VALUE*1
ID0002      FX01        VALUE*1
ID0003      FX01        VALUE*.40
DG0001      FX02        VALUE*1
DG0002      FX02        VALUE*1
Conversion file 3 (ZOUTPUT=AMOUNT_CONVER.XLS)
EXTERNAL    INTERNAL    FORMULA
ID0003      FX02        VALUE*.60
I am getting the following error
[Start validating transformation file]
Validating transformation file format
Start validation transformation 1/2
Validating options...
Validation of options was successful.
Validating mappings...
Validation of mappings was successful.
Validating conversions...
Validation of the conversion was successful
Start validation transformation 2/2
Validating options...
Validation of options was successful.
Validating mappings...
Validation of mappings was successful.
Validating conversions...
Validation of the conversion was successful
Creating the transformation xml file. Please wait...
Transformation xml file has been saved successfully.
Begin validate transformation file with data file...
[Start test transformation file]
Validate has successfully completed
ValidateRecords = YES
Reject count: 0
Record count: 6
Skip count: 0
Accept count: 6
0COSTCENTER is not a valid command or column 0COSTCENTER does not exist in source
Validation with data file failed -
How to execute several ROUNDS in a single transformation file?
Hi everybody,
I've put several transformation files into one transformation file, but I have the following problems:
- when I look at the log after validating the transformation file, it only shows values for the last combination of OPTIONS / MAPPING / CONVERSION
- when I execute the package for loading data into the cube, it ONLY loads data for the last combination of OPTIONS / MAPPING / CONVERSION that appears in the transformation file
The 'Submit record count' is always the same as the number of accepted records of the last round. And if I create a view on the target cube, it only has data for the last combination of OPTIONS / MAPPING / CONVERSION.
What is happening? Do I have to change any parameter?
It's critical for the project, as we are loading a lot of key figures and must simplify the administration of the load process.
Thanks a lot in advance for your support,
Albert Mas
Hi,
Please try (for Windows):
Double click printer icon on desktop,
Select Scan a Document or Photo,
Put the first page on the glass (face down),
Check options (size, dpi ...), and select Scan document to file,
Click Scan - machine will scan the first page
Remove the first page on the glass, put the second page,
Click + (plus sign) It sits on the left hand side of a red x
Machine will scan the second page, put 3rd page on the glass and click + again ..... to the end then click Save
Click Done after Save
Regards.
BH
-
Hello All,
Below is the server configuration,
OS: Windows 2008 R2 Enterprise 64 Bit
Version: 6.1.7601 Service Pack 1 Build 7601
CPU: 4 (@ 2.93 GHz, 1 core)
Memory: 12 GB
Page file: 12 GB
1. The utilization of real memory, whether sampled over 15 minutes, hourly, or weekly, has never crossed 20%, and page file usage is at 0.1%. For some reason, the Pages/Sec>Limit% counter continuously reports 100% regardless of the sampling interval. On further observation, the Page Reads/Sec value is somewhere between 150~450 and Page Input/Sec is somewhere between 800~8000. Does this indicate a performance bottleneck? (In the interim I've asked the users and application owners whether they notice any performance degradation and am awaiting a response.) If this does indicate a performance issue, could someone help me track down which process or memory-mapped file is causing it, and what I should do to fix it?
P.S. Initially the Security logs were full on this server, and since the page file is tied to the Application, Security and System logs, these were freed up to see if that was causing the high page reads, but it wasn't.
2. If the above does not necessarily indicate a performance problem, can someone reference a few KB articles that confirm this? Also, in that case, would there be any adverse effects from attempting to fine-tune a server that is already running fine, assuming the application owners confirm there is no performance degradation?
Thanks in advance.
Hi,
Based on the description, you can try downloading Server Performance Advisor (SPA) to further analyze the performance of the server. SPA generates comprehensive diagnostic reports and charts and provides recommendations to help you quickly analyze issues and develop corrective actions.
Regarding this tool, the following articles can be referred to for more information.
Microsoft Server Performance Advisor
https://msdn.microsoft.com/en-us/library/windows/hardware/dn481522.aspx
Server Performance Advisor (SPA) 3.0
http://blogs.technet.com/b/windowsserver/archive/2013/03/11/server-performance-advisor-spa-3-0.aspx
Best regards,
Frank Shen
-
Creation of Transformation File
Dear Friends,
How do I map the dimension with the flat file in the transformation file? Can anyone please guide me?
I have pulled the company code master data from BW and uploaded the flat file to the BPC application server; now I want to create a transformation file to update the same in BPC.
I have created a dimension called Entity, in which my ID is Company and the property IntCo is the intercompany reference. Neither has a company code property.
But the flat file does have the company information.
Kindly guide me on how to map the flat file fields to the dimension Entity.
Thanks in advance.
Regards, MD.
Hi,
The transformation file should contain the information about which property should be mapped with which source column of the flat file. For example, if the master data is in the column heading COMPANY, then you should have:
ID = COMPANY
Please refer to the transformation example in the below link from help.sap:
http://help.sap.com/saphelp_bpc75_nw/helpdata/en/e0/532af2d0804218a59157136bb63a98/content.htm
Hope this helps. -
Problem Validating & Processing Transformation file in NW 7.0 version
I am trying to validate and process a transformation file against my data file, and I am getting the error message shown below. When I validate the conversion files, I see the corresponding .CDM files being created. I even deleted the .CDM files and recreated them.
So my question is: why is it giving the "Sheet does not exist (CONVERSION)" warning, for which it is rejecting the records inside the data file?
DATA FILE:
C_Category,Time,R_ACCT,R_Entity,InputCurrency,Amount
ACTUAL,2007.DEC,AVG,GLOBAL,USD,1
ACTUAL,2007.DEC,END,GLOBAL,USD,1
ACTUAL,2007.DEC,AVG,GLOBAL,JPY,110
ACTUAL,2007.DEC,END,GLOBAL,JPY,110
ACTUAL,2007.DEC,HIST,GLOBAL,JPY,110
ACTUAL,2007.DEC,HIST,GLOBAL,USD,1
ACTUAL,2008.MAR,AVG,GLOBAL,USD,1
ACTUAL,2008.MAR,END,GLOBAL,USD,1
ACTUAL,2008.MAR,AVG,GLOBAL,JPY,107.5
ACTUAL,2008.MAR,END,GLOBAL,JPY,105
ACTUAL,2008.MAR,HIST,GLOBAL,JPY,110
ACTUAL,2008.MAR,HIST,GLOBAL,USD,1
TRANSFORMATION FILE:
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ,
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=YES
CREDITPOSITIVE=YES
MAXREJECTCOUNT=
ROUNDAMOUNT=
SPECIFICMAPPING=YES
*MAPPING
C_Category=*col(1)
Time=*col(2)
R_ACCT =*col(3)
R_Entity=*col(4)
InputCurrency=*col(5)
Amount=*col(6)
*CONVERSION
C_Category=[COMPANY]C_Category.xls!CONVERSION
Time=[COMPANY]Time.xls!CONVERSION
R_ACCT=[COMPANY]R_ACCT.xls!CONVERSION
R_Entity=[COMPANY]R_Entity.xls!CONVERSION
InputCurrency=[COMPANY]InputCurrency.xls!CONVERSION
ERROR
[Start validating transformation file]
Validating transformation file format
Validating options...
Validation on options was successful
Validating mappings...
Validation on mappings was successful
Validating conversions...
Sheet does not exist (CONVERSION)
Sheet does not exist (CONVERSION)
Sheet does not exist (CONVERSION)
Sheet does not exist (CONVERSION)
Sheet does not exist (CONVERSION)
Validation on conversions was successful
Creating the transformation xml file; wait a moment
Transformation xml file saved successfully
Connecting to server...
Begin validate transformation file with data file...
[Start test transformation file]
Validate has successfully completed
ValidateRecords = YES
Task name CONVERT:
No 1 Round:
Record count: 12
Accept count: 0
Reject count: 12
Skip count: 0
Error: All records are rejected
*CONVERSION
C_Category=C_Category.xls
Time=Time.xls
R_ACCT=R_ACCT.xls
R_Entity=R_Entity.xls
InputCurrency=InputCurrency.xls
On validating with the above format, I still get the same error:
[Start validating transformation file]
Validating transformation file format
Validating options...
Validation on options was successful
Validating mappings...
Validation on mappings was successful
Validating conversions...
Sheet does not exist (CONVERSION)
Sheet does not exist (CONVERSION)
Sheet does not exist (CONVERSION)
Sheet does not exist (CONVERSION)
Sheet does not exist (CONVERSION)
Validation on conversions was successful
Creating the transformation xml file; wait a moment
Transformation xml file saved successfully
Connecting to server...
Begin validate transformation file with data file...
[Start test transformation file]
Validate has successfully completed
ValidateRecords = YES
Task name CONVERT:
No 1 Round:
Record count: 12
Accept count: 0
Reject count: 12
Skip count: 0
Error: All records are rejected -
Issue with the /CPMB/EXPORT_TD_TO_FILE transformation file.
Good evening to all,
We have to export data from a cube to a text file. To do this, in BPC (NW 7.5 SP 09) we created a package with the process chain /CPMB/EXPORT_TD_TO_FILE, but when we execute the package we get an error. We think the error is generated by the transformation file. In the chain's log we can see that the process didn't finish successfully; the Appl Source and Clear BPC Tables variants are in red, and we get the error "Job or process BPCAPPS waiting for event".
Can anybody help us?
our transformation file is like this:
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ,
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=YES
CREDITPOSITIVE=YES
MAXREJECTCOUNT=
ROUNDAMOUNT=
*MAPPING
DIMENSION_1=/CPMB/NAMEDIMENSION_1
DIMENSION_2=/CPMB/NAMEDIMENSION_2
DIMENSION_3=/CPMB/NAMEDIMENSION_3
DIMENSION_4=/CPMB/NAMEDIMENSION_4
DIMENSION_5=/CPMB/NAMEDIMENSION_5
Amount=/CPMB/SDATA
*CONVERSION
Is this the proper use of the TF in this case?
The complete error log:
/CPMB/MODIFY completado en 0 segundos
/CPMB/APPL_TD_SOURCE completado en 2 segundos
/CPMB/CLEAR completado en 0 segundos
MEASURES=PERIODIC
TRANSFORMATION= DATAMANAGERTRANSFORMATIONFILESTF_EXTRACCION_PLANO.xls
FILE= DATAMANAGERDATAFILESprueba1.txt
ADDITIONINFO= Sí
(Selección de miembros)
CONCEPTO_FI:
FUENTE:
GRUPOCAME:
MONEDA:
SOCIEDAD:
TIEMPO: 2011.ENE
VERSION:
Nom.tarea TRANSACTION DATA SOURCE
Error de sentencia MDX: Valor CONTCORMOL /CPMB/T5DDIW
Aplicación: FINANZAS status de paquete: ERROR
Thanks in advance,
Best regards
Hi. Thanks for the reply.
We are still working on the issue. As a test we ran the package without the TF, but we got the same error. We found that the variant BPC Appl Source (APPL_TD_SOURCE) is the one generating the issue.
When we ran the package, it didn't finish and got stuck in status Active; when we ran the package in another application, it ended with an error.
The variant error log was:
RSPROCESS 4ENFI7BF1OI7M7E5GKHXIT5IL is unknown
Here is the log in english.
/CPMB/MODIFY complete in 0 seconds
/CPMB/APPL_TD_SOURCE complete in 5 seconds
/CPMB/CLEAR complete in 0 seconds
[Selection]
MEASURES=PERIODIC
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\TF_EXTRACCION_PLANO2.xls
FILE= DATAMANAGER\DATAFILES\EXAMPLES\prueba_ETD.txt
ADDITIONINFO= Yes
(Members selection)
CONCEPTO_FI:
SOURCE:
GRUPOCAME:
CURRENCY:
SOCIETY:
TIME: 2011.JAN,2011.FEB,2011.MAR,2011.APR,2011.MAY,2011.JUN,2011.JUL,2011.AGO,2011.SEP,2011.OCT,2011.NOV,2011.DEC
VERSION:
[Messages]
Task name TRANSACTION DATA SOURCE
Sentence error MDX: Value CONTCORMOL /CPMB/T5DDIW
Application: FINANZAS Package status: ERROR
Thanks for your help, best regards. -
File Adapter's Write Functionality Error
I am using BPEL PM10.1.3.1
and am running into a problem with the OrderBooking tutorial's File Adapter Write functionality.
BUILD FAILED
D:\JDeveloper\jdev\mywork\OrderBookworkspace\POAcknowledge\build.xml:79: A problem occured while connecting to server "localhost" using port "9700": bpel_POAcknowledge_1.0.jar failed to deploy. Exception message is: ORABPEL-05215
Error while loading process.
The process domain encountered the following errors while loading the process "POAcknowledge" (revision "1.0"): BPEL validation failed.
BPEL source validation failed, the errors are:
[Error ORABPEL-10902]: compilation failed
[Description]: in "bpel.xml", XML parsing failed because "undefined part element.
In WSDL at "file:/D:/product/10.1.3.1/OraBPEL_1/bpel/domains/default/tmp/.bpel_POAcknowledge_1.0_9cb86657d04e99ccb06091467c82f390.tmp/FileWrite.wsdl", message part element "{http://www.thiscompany.com/ns/sales}POAcknowledge" is not defined in any of the schemas.
Please make sure the spelling of the element QName is correct and the WSDL import is complete.
[Potential fix]: n/a.
If you have installed a patch to the server, please check that the bpelcClasspath domain property includes the patch classes.
at com.collaxa.cube.engine.deployment.CubeProcessHolder.bind(CubeProcessHolder.java:285)
at com.collaxa.cube.engine.deployment.DeploymentManager.deployProcess(DeploymentManager.java:804)
at com.collaxa.cube.engine.deployment.DeploymentManager.deploySuitcase(DeploymentManager.java:670)
at com.collaxa.cube.ejb.impl.BPELDomainManagerBean.deploySuitcase(BPELDomainManagerBean.java:445)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:585)
at com.evermind.server.ejb.interceptor.joinpoint.EJBJoinPointImpl.invoke(EJBJoinPointImpl.java:35)
at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
at com.evermind.server.ejb.interceptor.system.DMSInterceptor.invoke(DMSInterceptor.java:52)
at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
at com.evermind.server.ejb.interceptor.system.JAASInterceptor$1.run(JAASInterceptor.java:31)
at com.evermind.server.ThreadState.runAs(ThreadState.java:620)
at com.evermind.server.ejb.interceptor.system.JAASInterceptor.invoke(JAASInterceptor.java:34)
at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
at com.evermind.server.ejb.interceptor.system.TxRequiredInterceptor.invoke(TxRequiredInterceptor.java:50)
at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
at com.evermind.server.ejb.interceptor.system.DMSInterceptor.invoke(DMSInterceptor.java:52)
at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:119)
at com.evermind.server.ejb.InvocationContextPool.invoke(InvocationContextPool.java:55)
at com.evermind.server.ejb.StatelessSessionEJBObject.OC4J_invokeMethod(StatelessSessionEJBObject.java:87)
at DomainManagerBean_RemoteProxy_4bin6i8.deploySuitcase(Unknown Source)
at com.oracle.bpel.client.BPELDomainHandle.deploySuitcase(BPELDomainHandle.java:317)
at com.oracle.bpel.client.BPELDomainHandle.deployProcess(BPELDomainHandle.java:339)
at deployHttpClientProcess.jspService(_deployHttpClientProcess.java:376)
at com.orionserver.http.OrionHttpJspPage.service(OrionHttpJspPage.java:59)
at oracle.jsp.runtimev2.JspPageTable.service(JspPageTable.java:453)
at oracle.jsp.runtimev2.JspServlet.internalService(JspServlet.java:591)
at oracle.jsp.runtimev2.JspServlet.service(JspServlet.java:515)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
at com.evermind.server.http.ResourceFilterChain.doFilter(ResourceFilterChain.java:64)
at oracle.security.jazn.oc4j.JAZNFilter$1.run(JAZNFilter.java:396)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAsPrivileged(Subject.java:517)
at oracle.security.jazn.oc4j.JAZNFilter.doFilter(JAZNFilter.java:410)
at com.evermind.server.http.ServletRequestDispatcher.invoke(ServletRequestDispatcher.java:621)
at com.evermind.server.http.ServletRequestDispatcher.forwardInternal(ServletRequestDispatcher.java:368)
at com.evermind.server.http.HttpRequestHandler.doProcessRequest(HttpRequestHandler.java:866)
at com.evermind.server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:448)
at com.evermind.server.http.HttpRequestHandler.serveOneRequest(HttpRequestHandler.java:216)
at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:117)
at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:110)
at oracle.oc4j.network.ServerSocketReadHandler$SafeRunnable.run(ServerSocketReadHandler.java:260)
at oracle.oc4j.network.ServerSocketAcceptHandler.procClientSocket(ServerSocketAcceptHandler.java:239)
at oracle.oc4j.network.ServerSocketAcceptHandler.access$700(ServerSocketAcceptHandler.java:34)
at oracle.oc4j.network.ServerSocketAcceptHandler$AcceptHandlerHorse.run(ServerSocketAcceptHandler.java:880)
at com.evermind.util.ReleasableResourcePooledExecutor$MyWorker.run(ReleasableResourcePooledExecutor.java:303)
at java.lang.Thread.run(Thread.java:595)
I am not able to trace the error.
Please suggest where I am making a mistake, or whether anything else (e.g. a patch) is required to overcome this issue.
Avinash
Hey, I found a solution to this problem after some R&D on the JDeveloper BPEL file structure.
We just have to copy OrderBookingPO.xsd and POAcknowledge.xsd from the Oracle_Home\integration\orabpel\samples\tutorials\127.OrderBookingTutorial\PracticeFiles directory to the Oracle_Home\integration\jdev\jdev\mywork\OrderBookworkspace\POAcknowledge\bpel directory,
not to the Oracle_Home\integration\jdev\jdev\mywork\OrderBookworkspace\POAcknowledge directory as mentioned in the Order Booking tutorial in the section "Adding Transformation Logic". Just ignore that instruction and follow the steps above instead. Everything will be OK.
Thanks
Avinash Kumar Pandey
How to use the transformation file
Hello there!
This message is to ask a question that may be familiar to you. I want to load data using a .txt file and a transformation file via the Import package. Suppose the structure of my txt file is:
Entity,Category,Time,Account1,Account2,Account3
Ent1,Actual,2010.JAN,289.23,32.43,123.34
Ent1,Actual,2010.JAN,289.23,32.43,123.34
Ent1,Actual,2010.JAN,289.23,32.43,123.34
How can I load these values using a transformation? I know there is an option named *MVAL(), but if I use the following syntax in the mapping section it doesn't work:
Account=*MVAL(4:6)
I read something about ACCOUNTVAL; does somebody know how to use it?
Thanks in advance
regards
Hi,
If you are using an MVAL statement, then you also need to maintain a conversion file; it's compulsory. Please make sure that you have maintained a conversion file.
If you are on 7.5 MS, then the statement will work. Please refer to the MVAL statement in the mapping section of the below link
http://help.sap.com/saphelp_bpc75/helpdata/en/a2/e722bc58404335ada8592cdc8feaca/content.htm
However, in 7.0 MS, MVAL works only if the multiple key figures correspond to the time dimension. In your case, they correspond to the account dimension. You need to create a separate transformation file for each key figure and then upload the flat file separately with each one.
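As a rough illustration only (untested; the column layout matches the sample flat file above, while the conversion file name is invented), a 7.5 MS mapping section using MVAL might look like:

```
*MAPPING
Entity=*col(1)
Category=*col(2)
Time=*col(3)
Account=*MVAL(4:6)

*CONVERSION
Account=Account_conversion.xls
```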
Hope this helps. -
Hi,
It seems that the usage of BODS functions like NVL and DECODE is leading to a partial push-down to the database.
Is there any way to achieve a full push-down (INSERT INTO DBTABLENAME)?
Regards,
Sudhakar
Hello
The optimizer determines whether the whole dataflow can be pushed down. NVL and DECODE can be pushed down to most databases, so it's something to do with how you've built the dataflow.
Things that stop full push-down:
1 - custom functions
2 - complex statements
3 - datatype conversions
4 - complex dataflows with many transforms
5 - some specific transforms (DQ, pivot, etc.)
With a bit of trial and error, you should be able to figure out which is stopping yours.
Michael -
How can i fix a member for a dimension in the transformation file?
Hello everybody,
Does anyone know how I can fix a dimension member using the *MAPPING section of the transformation file during a data upload? I'm trying to fix the member for the Category dimension. I've tried an instruction like:
*MAPPING
*CATEGORY=ACTUAL
but it doesn't work. Any idea?
Thanks!
The easiest way is to use the following in the transformation file:
Category = *NEWCOL(Actual)
Hope this helps.