Custom code for Target Source Reconciliation from a flat file
Hi Experts,
I need help writing custom code for Target Source Reconciliation from a flat file into OIM. The flat file will contain account details for different application instances. I am working on 11gR2.
Thanks,
Subin
All right, all right, not so quickly.
I am at the stage of trying to read the values into a one-dimensional
array, but I am stuck in one place. This is the program:
import java.io.*;

public class FromFile {
    public static void main(String[] args) throws IOException {
        File inputFile = new File("mac.txt");
        FileReader in = new FileReader(inputFile);
        int c;
        for (int i = 0; i < 10; i++) {
            c = in.read();   // returns the character code, or -1 once the stream is exhausted
            System.out.println(c);
        }
        in.close();
    }
}
and I try to read: 1 2 3 4 from the text file.
This is the result so far...
49
32
50
32
51
32
52
-1
-1
-1
Well,
I think I know what's wrong: I must change the ASCII codes into
ints, but I don't know how to do it. A good book or
tutorial on streams would come in handy. Could you correct
it?
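The diagnosis above is right: read() returns character codes (49 is '1', 32 is a space, -1 is end of stream). A minimal corrected sketch, assuming the file holds single digits separated by whitespace as in the sample, is to loop until -1 and subtract '0' from each digit's code:

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.ArrayList;
import java.util.List;

public class FromFileFixed {
    // Reads single-digit numbers from a character stream. read() returns the
    // character code (49 for '1', 32 for a space) or -1 at end of stream, so
    // the loop stops on -1 and subtracts '0' to turn a code into its value.
    static List<Integer> readDigits(Reader in) throws IOException {
        List<Integer> digits = new ArrayList<>();
        int c;
        while ((c = in.read()) != -1) {
            if (Character.isDigit(c)) {
                digits.add(c - '0');   // 49 -> 1, 50 -> 2, ...
            }                          // spaces and newlines are skipped
        }
        return digits;
    }

    // Convenience wrapper for in-memory input.
    static List<Integer> readDigitsFrom(String s) {
        try {
            return readDigits(new StringReader(s));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        // For the real file: readDigits(new FileReader("mac.txt")).
        // A StringReader stands in here so the example is self-contained.
        System.out.println(readDigitsFrom("1 2 3 4"));   // [1, 2, 3, 4]
    }
}
```

For multi-digit numbers, a java.util.Scanner with nextInt() would be the more idiomatic choice.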
Similar Messages
-
Custom code for Flat file reconciliation on LDAP
Hello,
I have to write a custom code for flat file reconciliation on LDAP as the GTC connector wasn't working entirely.
Could someone help me out with this.. How do i do this ??
Thanks

What do you mean by flat file reconciliation on LDAP?
If you want to create a flat file connector, then search Google for reading a flat file using Java.
Define the RO fields and do the mapping in the Process Definition. You can use the Xellerate User RO for trusted recon.
Build a map from the CSV columns to the recon fields.
Call the Reconciliation API. -
I'm getting an INSUFFICIENT_ACCESS_ON_CROSS_REFERENCE_ENTITY error when running a mapping that attempts to upsert data from a flat file to SFDC Account and Contacts. I also tried to split
The error message for Account:
WRITER_2_*_1> WRT_8164 [2015-07-26 08:49:57.707] Error loading into target [Account] : Error received from salesforce.com. Fields []. Status code [INSUFFICIENT_ACCESS_ON_CROSS_REFERENCE_ENTITY]. Message [Please enter a value for Country].
WRITER_2_*_1> CMN_1053 [2015-07-26 08:49:57.707] : Rowdata: ( RowType=1(update) Src Rowid=1 Targ Rowid=1
Upsert is based on the External ID: Account_External_ID__c
The error message for Contact:
WRITER_1_*_1> WRT_8164 [2015-07-26 08:49:55.305] Error loading into target [Contact] : Error received from salesforce.com. Fields []. Status code [INSUFFICIENT_ACCESS_ON_CROSS_REFERENCE_ENTITY]. Message [Please enter a value for Country].
WRITER_1_*_1> CMN_1053 [2015-07-26 08:49:55.305] : Rowdata: ( RowType=0(insert) Src Rowid=1 Targ Rowid=1
Upsert is based on the External ID: Contact_External_ID__c
Any ideas on how to proceed?
Hello,
Can you please help me understand the limitations of the free data loader? In this link - http://www.informaticacloud.com/editions-integration.html# - I see the below features listed:
- No-code, wizard driven cloud integration
- Multi-tenant SaaS solution
- Database and file connectivity
- Flexible scheduling
- Bulk API support (for Salesforce.com)
- Unlimited rows/day
- 24 jobs/day
- 1 Secure Agent
- Limited to 1 user
- Community support
- Cloud Data Masking
Questions:
1. When I view licenses in my free data loader, under Feature Licenses, it shows the license type for Salesforce Connectivity/Bulk API as "Trial". Can't I create a scheduled Data Synch task to upsert records in Salesforce using Bulk API mode?
2. Is the email notification option (for success, warning and failure of a data synch task) available on the free version (and not as a trial)?
3. I understand there is a limit of 24 jobs/day. But is there a limit on the number of scheduled data synch tasks that can be created?
4. Data Masking is listed as a feature above for the free edition. However, when I view the licenses in my free data loader, Data Masking is shown as "Trial". Can you please clarify this?
5. Is there a limit on the number of Connections that can be created?
Thanks
Sanjay
-
Error while creating GTC for trusted source reconciliation in OIM11g
Hi,
I got an exception while trying to create GTC for Trusted source Reconciliation in OIM11g
Class/Method: CreateGenConnectorAction/imageScreen encounter some problems: Provider Exception[[
java.lang.Exception: Provider Exception
at com.thortech.xl.webclient.actions.CreateConnectorAction.getGenericAdapter(CreateConnectorAction.java:2265)
at com.thortech.xl.webclient.actions.CreateConnectorAction.imageScreen(CreateConnectorAction.java:1196)
at com.thortech.xl.webclient.actions.CreateConnectorAction.goNext(CreateConnectorAction.java:521)
at sun.reflect.GeneratedMethodAccessor4673.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:600)
at org.apache.struts.actions.DispatchAction.dispatchMethod(DispatchAction.java:269)
at com.thortech.xl.webclient.actions.tcLookupDispatchAction.execute(tcLookupDispatchAction.java:133)
at com.thortech.xl.webclient.actions.tcActionBase.execute(tcActionBase.java:894)
at com.thortech.xl.webclient.actions.tcAction.execute(tcAction.java:213)
at com.thortech.xl.webclient.actions.CreateConnectorAction.execute(CreateConnectorAction.java:135)
at org.apache.struts.chain.commands.servlet.ExecuteAction.execute(ExecuteAction.java:58)
at org.apache.struts.chain.commands.AbstractExecuteAction.execute(AbstractExecuteAction.java:67)
at org.apache.struts.chain.commands.ActionCommandBase.execute(ActionCommandBase.java:51)
at org.apache.commons.chain.impl.ChainBase.execute(ChainBase.java:191)
at org.apache.commons.chain.generic.LookupCommand.execute(LookupCommand.java:305)
at org.apache.commons.chain.impl.ChainBase.execute(ChainBase.java:191)
at org.apache.struts.chain.ComposableRequestProcessor.process(ComposableRequestProcessor.java:283)
at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1913)
at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:462)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at com.thortech.xl.webclient.security.XSSFilter.doFilter(XSSFilter.java:103)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at com.thortech.xl.webclient.security.CSRFFilter.doFilter(CSRFFilter.java:61)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.iam.platform.auth.web.PwdMgmtNavigationFilter.doFilter(PwdMgmtNavigationFilter.java:115)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.iam.platform.auth.web.OIMAuthContextFilter.doFilter(OIMAuthContextFilter.java:100)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.dms.wls.DMSServletFilter.doFilter(DMSServletFilter.java:330)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.doIt(WebAppServletContext.java:3684)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3650)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2268)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2174)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1446)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:600)
at com.thortech.xl.gc.util.ProviderFacade.getProvider(ProviderFacade.java:344)
at com.thortech.xl.webclient.actions.CreateConnectorAction.getGenericAdapter(CreateConnectorAction.java:2201)
... 47 more
Caused by: java.lang.NullPointerException
at com.thortech.util.logging.Logger.isDebugEnabled(Logger.java:599)
at com.thortech.xl.gc.impl.recon.SharedDriveReconTransportProvider.initialize(SharedDriveReconTransportProvider.java:106)
... 53 more
Thanks & Regards,
Prasad

Most likely you are hitting the below bug:
Bug 14271576 - OIM BETA : CONNECTOR LOGS ARE NOT GETTING UPDATED IN 11G R2 [preferrred fix ...]
or
Bug 13605443 - NULL POINTER EXCEPTIONS IN OIM SERVER DURING RECONCILIATION USING GTC CONNECTOR
Thanks Deepak -
Code for reading particular fields from the file placed in application
hi,
I need code for reading particular fields from a file placed on the application server into an internal table.

Hi,
Use the GUI_UPLOAD FM to upload the file into your internal table (note that GUI_UPLOAD reads from the presentation server; for a file on the application server itself you would use OPEN DATASET instead).
DATA : FILE_TABLE TYPE FILE_TABLE OCCURS 0,
fwa TYPE FILE_TABLE,
FILENAME TYPE STRING,
RC TYPE I.
CALL METHOD CL_GUI_FRONTEND_SERVICES=>FILE_OPEN_DIALOG
  EXPORTING
    WINDOW_TITLE            = 'Open File'
*   DEFAULT_EXTENSION       =
*   DEFAULT_FILENAME        =
*   FILE_FILTER             =
*   INITIAL_DIRECTORY       =
*   MULTISELECTION          =
*   WITH_ENCODING           =
  CHANGING
    FILE_TABLE              = FILE_TABLE
    RC                      = RC
*   USER_ACTION             =
*   FILE_ENCODING           =
  EXCEPTIONS
    FILE_OPEN_DIALOG_FAILED = 1
    CNTL_ERROR              = 2
    ERROR_NO_GUI            = 3
    NOT_SUPPORTED_BY_GUI    = 4
    OTHERS                  = 5.
IF SY-SUBRC <> 0.
  MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
          WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
READ TABLE FILE_TABLE INDEX 1 into fwa.
FILENAME = fwa-FILENAME.
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename                = filename
    filetype                = 'DAT'
* IMPORTING
*   filelength              =
  TABLES
    data_tab                = itab
  EXCEPTIONS
    file_open_error         = 1
    file_read_error         = 2
    no_batch                = 3
    gui_refuse_filetransfer = 4
    invalid_type            = 5
    OTHERS                  = 6.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
Regards,
Balakumar.G
Reward Points if helpful. -
Hi,
How/where can I define the custom code for scrambling?
The SAP scrambling guide does not give any details on where I can define these; it only gives code examples.
I did find transaction codes CNVMBTTWB and CNV_MBT_RULES, but these are not working (kicking me out of the transaction as soon as I try to create something). And these are TDMS 3.0 related.
Does anyone have an idea how to do this? Is there a guide for TDMS 4 somewhere? I haven't found one so far.
Kind regards,
Johnny

Hi all,
I still got issues with the custom code for scrambling.
Has anyone a good document about this and please do not refer to the SAP ones since for me they are not helpful in this case.
I have created the custom program but when I test it via the scrambling rule, I receive a 'NULL' error (which is strange)
Also, if I execute my scrambling flow, it gets stuck in phase 'Preparations for data scrambling' with error FORM xxxxx does not exist.
The form was created in the Control system.
What am I missing here?
Cheers,
Johnny -
How to MODIFY A CUSTOM TABLE FROM A FLAT FILE
Dear Friends,
I have a requirement where I have to upload data from an Excel file to my custom table, so I have used the FM
'TEXT_CONVERT_XLS_TO_SAP' and collected the data into an internal table. Up to here I am able to get the data correctly; now I have to upload this data into the custom table.
The flat file has 6 fields and the custom table has
8 fields. While uploading the data into this custom table from the internal table where I collected it above, I am getting a problem. I am using a MODIFY statement to update the custom table.
The flat file which I have collected into the internal table is as below:
IDNo. Name Date Location Designation Dept
101 Raja 4/12/2007 Delhi Manager HR
102 James 4/12/2007 Delhi Clerk HR
Custom table is having the below fields
IDNO. Name Date Location Designation Dept Manager
101 Raja
Now when I run the program, the problem I am getting with the MODIFY statement is that ID no. 101 already has
a record with IDNO = 101 and Manager = Raja, with the other fields (name, date, location, designation and dept) blank.
If I fill these fields from my flat file, the MODIFY statement
fills all the fields for IDNO = 101, and the Manager field, which already contains Raja, is overwritten by space,
because this field is not in the flat file.
The code I am using is as follows.
The flat file has the below structure:
TYPES: BEGIN OF t_emp_data,
IDNO(11) TYPE c,
Name(13) TYPE c,
Date(20) TYPE c,
Location(40) TYPE c,
Designation(40) TYPE c,
Dept(40) TYPE c,
end of t_emp_data.
The Custom Table(ZEMP_DATA) is having with the below structure
TYPES: BEGIN OF t_emp_data_table,
IDNO(11) TYPE c,
Name(13) TYPE c,
Date(20) TYPE c,
Location(40) TYPE c,
Designation(40) TYPE c,
Dept(40) TYPE c,
Manager(20) TYPE c, " this is the extra field in the table
end of t_emp_data_table.
data :
it_empdata TYPE STANDARD TABLE OF t_emp_data,
it_empdata_tmp TYPE STANDARD TABLE OF t_emp_data_table,
wa_empdata_tmp TYPE t_emp_data_table,
wa_empdata type t_emp_data.
loop at it_empdata into wa_empdata.
move-corresponding wa_empdata to wa_empdata_tmp.
modify ZEMP_DATA from wa_empdata_tmp .
endloop.
Could anyone please let me know what I have to do in order to keep the Manager field from being overwritten by the MODIFY statement for IDNO 101? The data which is already there (Manager = Raja) should not be overwritten with space.
Please help me in this regard.
Regards
Madhuri.

Hi,
use a SELECT statement before
"move-corresponding wa_empdata to wa_empdata_tmp."
SELECT SINGLE manager
  FROM zemp_data
  INTO wa_empdata_tmp-manager
  WHERE idno = wa_empdata-idno.
regards,
lavanya -
Cube like aggregation behavior from a flat file data source
Is the following workflow possible in Xcelsius, and if so, how would you organize your logic to achieve this? Any examples would be greatly appreciated.
My data source is XML, therefore a flat file.
The flat file contains multiple dimensions which the customer wants to aggregate by, based on the selection, and also an "ALL" option for every dimension/selector.
Essentially, the file is structured like this:
i.e.: Dim1 Dim2 Dim3 Dim4 Dim5 Metric1 Metric2
a c e j l 5 5
a c e j m 3 2
a c e j n 4 8
a c e k o 3 4
a c e k n 8 2
...etc
We need drop down selectors for Dim1-Dim4 with the "ALL" option. I would have my tables and charts etc use the Dim5, Metric1 and Metric2.
I've used the "Lookup" function to create one level of aggregation in a flat file structure with one dimension: the "Lookup" would collect all the Dim1 entries and insert them into a group of referenced cells, and then I would use the "Sum" function from Excel.
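That one-dimension Lookup-plus-Sum pattern generalizes: treat "ALL" as a wildcard when filtering rows before summing. A language-neutral sketch of that rule (Java here; the dimension/metric names and the sample data come straight from the table in the post):

```java
import java.util.Arrays;
import java.util.List;

public class AllFilterAggregate {
    // One row of the flat file: four selector dimensions, Dim5, two metrics.
    record Row(String d1, String d2, String d3, String d4, String d5,
               double metric1, double metric2) {}

    // Sums metric1 over rows matching the four selections, where a selection
    // of "ALL" matches any row -- the wildcard behaviour the drop-down
    // selectors need.
    static double sumMetric1(List<Row> rows, String s1, String s2, String s3, String s4) {
        return rows.stream()
                .filter(r -> matches(s1, r.d1()) && matches(s2, r.d2())
                          && matches(s3, r.d3()) && matches(s4, r.d4()))
                .mapToDouble(Row::metric1)
                .sum();
    }

    static boolean matches(String selection, String value) {
        return "ALL".equals(selection) || selection.equals(value);
    }

    // The sample data from the post.
    static List<Row> sampleRows() {
        return Arrays.asList(
                new Row("a", "c", "e", "j", "l", 5, 5),
                new Row("a", "c", "e", "j", "m", 3, 2),
                new Row("a", "c", "e", "j", "n", 4, 8),
                new Row("a", "c", "e", "k", "o", 3, 4),
                new Row("a", "c", "e", "k", "n", 8, 2));
    }

    public static void main(String[] args) {
        System.out.println(sumMetric1(sampleRows(), "a", "c", "e", "j"));    // 12.0
        System.out.println(sumMetric1(sampleRows(), "a", "c", "e", "ALL"));  // 23.0
    }
}
```

In Xcelsius terms, each selector would feed a cell, and the filter above is what the lookup formulas have to reproduce per combination of selections.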
But is this possible with multiple dimensions, including the "ALL" option? If so, how? The workflow defined is ideal for a cube architecture, but for many different political reasons we need to be able to address this with Xcelsius sourced from a flat file, i.e. XML. Any insight would be greatly appreciated.

hi there,
if your backend is SAP BW connected via an OLAP universe, you could use L0 dimensions for the "ALL" aggregation in the WebI layer. (Check the blogs of Ingo Hilgefort for details.)
Best Regards
Ulrich
http://www.xcelsius-insight.com -
What is SHA1 code for Windows 8.1 Enterprise Evaluation ISO files
Since I downloaded the files from TechNet, no sha1 code information is given in my download webpage, could someone kindly provide the SHA1 code for Windows 8.1 Enterprise Evaluation ISO files named
9600.16384.WINBLUE_RTM.130821-1623_X64FRE_ENTERPRISE_EVAL_EN-US-IRM_CENA_X64FREE_EN-US_DV5.ISO
(my sha1 code is 73321fa912305e5a16096ef62380a91ee1f112da)
and
9600.16384.WINBLUE_RTM.130821-1623_X64FRE_ENTERPRISE_EVAL_ZH-CN-IRM_CENA_X64FREE_ZH-CN_DV5.ISO
(my sha1 code is 2fc5246dd9d02d185e92283324d9b81822827f19)

Hi,
This version is Windows 8.1 Enterprise Evaluation; I suggest you check the SHA1 in the TechNet subscription download center.
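Independent of the download center, the SHA-1 of a downloaded ISO can be recomputed locally and compared against the values quoted above, e.g. with "certutil -hashfile file.iso SHA1" on Windows, or with a short Java sketch (the file name below is just a placeholder):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Sha1File {
    // Streams the input through MessageDigest so arbitrarily large ISOs can
    // be hashed without loading them into memory; returns lowercase hex.
    static String sha1Hex(InputStream in) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-1");
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            md.update(buf, 0, n);
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        // Placeholder path -- point this at the downloaded ISO.
        try (InputStream in = Files.newInputStream(Path.of("download.iso"))) {
            System.out.println(sha1Hex(in));
        }
    }
}
```

If the locally computed digest matches the published one, the download is intact.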
Welcome to Subscriber Downloads
https://technet.microsoft.com/en-US/subscriptions/securedownloads/hh442904#searchTerm=Windows%208.1%20Enterprise&ProductFamilyId=0&Languages=en&PageSize=100&PageIndex=0&FileId=0
Regards,
Yolanda
-
BAPI_PO_CREATE1 not able to create PO's for multiple rows from the flat fil
Hi
I am uploading POs from a flat file into SAP using BAPI_PO_CREATE1. Everything works fine if the flat file has only one record.
If the flat file has more than one record, then while loading the second record the BAPI returns an error message. I am calling the BAPI in a loop.
The strange thing is that if I load the second record individually, the program is able to create the PO. So only when there are multiple records in the flat file am I unable to load the POs into SAP. I debugged and checked all the internal tables passed to the BAPI. All seem to have the data correctly, but still the BAPI fails.
Any idea where I am going wrong?
The code looks something like this:
LOOP AT HEADER_ITAB.
PERFORM FILL_HEADER_RECORDS.
LOOP AT ITEM_ITAB WHERE EBELN eq HEADER_ITAB-EBELN.
PERFORM FILL_ITEM_RECORDS.
ENDLOOP.
PERFORM CREATE_PO_VIA_BAPI.
ENDLOOP.

What is the error message? Are you trying something like this:
LOOP AT T_DATA1.
AT NEW LIFNR.
READ TABLE T_DATA1 INDEX SY-TABIX.
PERFORM INIT_TABLES.
PERFORM FILL_DATA.
--Call the BAPI to create PO
PERFORM CREATE_PO.
ENDAT.
ENDLOOP.
FORM CREATE_PO .
CALL FUNCTION 'BAPI_PO_CREATE1'
  EXPORTING
    POHEADER               = POHEADER
    POHEADERX              = POHEADERX
*   POADDRVENDOR           =
*   TESTRUN                =
*   MEMORY_UNCOMPLETE      =
*   MEMORY_COMPLETE        =
*   POEXPIMPHEADER         =
*   POEXPIMPHEADERX        =
*   VERSIONS               =
*   NO_MESSAGING           =
*   NO_MESSAGE_REQ         =
*   NO_AUTHORITY           =
*   NO_PRICE_FROM_PO       =
  IMPORTING
    EXPPURCHASEORDER       = EXPPURCHASEORDER
    EXPHEADER              = EXPHEADER
    EXPPOEXPIMPHEADER      = EXPPOEXPIMPHEADER
  TABLES
    RETURN                 = RETURN
    POITEM                 = POITEM
    POITEMX                = POITEMX
*   POADDRDELIVERY         =
    POSCHEDULE             = POSCHEDULE
    POSCHEDULEX            = POSCHEDULEX
    POACCOUNT              = POACCOUNT
*   POACCOUNTPROFITSEGMENT =
    POACCOUNTX             = POACCOUNTX
*   POCONDHEADER           =
*   POCONDHEADERX          =
    POCOND                 = POCOND
    POCONDX                = POCONDX
*   POLIMITS               =
*   POCONTRACTLIMITS       =
*   POSERVICES             =
*   POSRVACCESSVALUES      =
*   POSERVICESTEXT         =
*   EXTENSIONIN            =
*   EXTENSIONOUT           =
*   POEXPIMPITEM           =
*   POEXPIMPITEMX          =
*   POTEXTHEADER           =
    POTEXTITEM             = POTEXTITEM
*   ALLVERSIONS            =
    POPARTNER              = POPARTNER.
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
EXPORTING
WAIT = 'X'
IMPORTING
RETURN = RETURN1.
DATA: L_NAME TYPE LFA1-NAME1.
CLEAR L_NAME.
SELECT SINGLE NAME1
FROM LFA1
INTO L_NAME
WHERE LIFNR = POHEADER-VENDOR.
LOOP AT RETURN.
WRITE : / RETURN-TYPE,
RETURN-ID,
RETURN-MESSAGE.
WRITE : '--> For vendor:',
POHEADER-VENDOR,
L_NAME.
ENDLOOP.
ENDFORM. " CREATE_PO -
Creating reconciliation events from a flat file--a design question
Hello,
I am currently evaluating an existing OIM implementation, to rebuild it using OIM 11g, and have a question regarding the ideal method to create reconciliation events from a flat file.
The current implementation uses a web service call to process a flat file and create the reconciliation events. This runs every hour.
Although this looks cool, I thought there was no need to go to that extent.
If OIM cannot consume the flat file directly, meaning if it needs some data massaging, I can always load the data from the flat file into an external table, write a PL/SQL procedure to transform the data and put it into a global temporary table, and create reconciliation events from that.
What would be the ideal method to load data from a flat file into OIM?
THanks
Khanh

If it's a flat file, then have you looked at the GTC option? And why any staging in between? OIM can read flat files just fine, either through a GTC or by writing your own recon code.
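The "write your own recon code" route usually boils down to parsing each flat-file line into a field map and handing that map to the reconciliation API. The parsing half can be sketched generically; the header names and delimiter below are illustrative, not from the post:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FlatFileReconParser {
    // Turns one delimited data line into a reconciliation-field map keyed by
    // the header names -- the shape that event-creation APIs such as
    // ReconOperationsService.createReconciliationEvent(...) expect for the
    // event data (that call itself needs the OIM client jars, so it is only
    // referenced in this comment).
    static Map<String, String> toReconData(String header, String line, String delimiter) {
        String[] names = header.split(delimiter, -1);
        String[] values = line.split(delimiter, -1);
        if (names.length != values.length) {
            throw new IllegalArgumentException("field count mismatch: " + line);
        }
        Map<String, String> data = new LinkedHashMap<>();
        for (int i = 0; i < names.length; i++) {
            data.put(names[i].trim(), values[i].trim());
        }
        return data;
    }

    public static void main(String[] args) {
        // Illustrative header and row; real files would come from the hourly drop.
        List<String> lines = Arrays.asList("UserID|FirstName|Status",
                                           "jdoe|John|Active");
        Map<String, String> data = toReconData(lines.get(0), lines.get(1), "\\|");
        System.out.println(data); // {UserID=jdoe, FirstName=John, Status=Active}
    }
}
```

A scheduled task would loop this over every line of the file and raise one reconciliation event per resulting map.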
-Bikash -
Looking for a program to delete CVCs from a flat file, infocube, ztable.
I am looking for a program that will delete CVCs from a flat file, infocube, or ztable. Based on the research I have done, I would imagine the core of such a program would be the use of function module /sapapo/ts_plob_delete.
If anyone has such a program and would be willing to share it that would be great.
Shane

Hi,
Yes, you can use this, but I think /sapapo/ts_plob_delete will only delete the data from the Master Planning Object Structure (MPOS). First you have to deactivate the planning area, then use this program, which will delete everything, and then activate the planning area again.
If you want to delete the data from flat files or an InfoCube, you need to delete the data directly from the InfoCubes and add this job to the process chain to delete everything.
I hope this information helps you.
Thanks
Amol -
Step by Step details on how to load data from a flat file
hi, can anyone explain how to load data from a flat file? Please give me step-by-step details. Thanks.
hi sonam,
it is very easy to load data from a flat file when compared with other extraction methods. Here are the steps to load transaction data from a flat file:
Step 1: Create the flat file.
Step 2: Log on to SAP BW (transaction RSA1 or RSA13) and note the flat file source system icon, i.e. the PC icon.
Step 3: Create the required InfoObjects.
3.1: Create an InfoArea (Modeling > InfoObjects (root node) > context menu > Create InfoArea).
3.2: Create a characteristic/key figure InfoObject catalog (select the InfoArea > context menu > Create InfoObject Catalog).
3.3: Create characteristic and key figure InfoObjects according to your requirement and activate them.
Step 4: Create an InfoSource for transaction data, create the transfer structure, and maintain the communication structure.
4.1: First create an application component (Modeling > InfoSources (root node) > context menu > Create Application Component).
4.2: Create an InfoSource for transaction data (select the application component > context menu > Create InfoSource), choose "Flexible Update", give the InfoSource a name, and continue.
4.4 (important): Assign the DataSource (expand the application component > expand your InfoSource > context menu > Assign DataSource). Under Source System, browse and choose your flat file source system (the PC icon), then continue.
4.5: Define the DataSource/transfer structure for the InfoSource: select the Transfer Structure tab and fill in the InfoObjects in the same order as the flat file structure.
4.6: Assign transfer rules: select the Transfer Rules tab and choose "Propose Transfer Rules" (the spindle-like icon). If the data target is an ODS, include 0RECORDMODE in the communication structure. Then activate.
Step 5: Create the data target (InfoCube/ODS object).
5.1: Create the InfoCube (select your InfoArea > context menu > Create InfoCube).
5.2: Create the InfoCube structure: fill the structure pane with the required InfoObjects (select the InfoSource icon, find your InfoSource, double-click, and choose "Yes" for InfoObject assignment). Maintain at least one time characteristic (e.g. 0CALDAY).
5.3: Define and assign dimensions for your characteristics, then activate.
Step 6: Create update rules for the InfoCube using the InfoSource (select your InfoCube > context menu > Create Update Rules, enter your InfoSource as the DataSource, press Enter, and activate your update rules). Now you can see the data flow.
Step 7: Schedule/load the data.
7.1: Create an InfoPackage (Modeling > InfoSources > expand your application component > expand your InfoSource > select the DataSource assignment icon > context menu > Create InfoPackage). Give the InfoPackage a description, select your DataSource, and continue.
- On the External Data tab, choose Client Workstation or Application Server, and give the file name, file type, and data separator.
- On the Processing tab, choose "PSA and then into Data Targets".
- On the Data Targets tab, select the data targets and tick the "Update Data Target" check box.
- On the Update tab, choose Full Update.
- On the Schedule tab, choose "Start data load immediately" and press the Start button.
Step 8: Monitor the data: check the data in the PSA and in the data targets.
I hope this will help you.
What is the best way to load and convert data from a flat file?
Hi,
I want to load data from a flat file, convert dates, numbers and some fields with custom logic (e.g. 0,1 into N,Y) to the correct format.
The rows where all to_number, to_date and custom conversions succeed should go into table STG_OK. If some conversion fails (due to an illegal format in the flat file), those rows (where the conversion raises some exception) should go into table STG_ERR.
What is the best and easiest way to achieve this?
Thanks,
Carsten.

Hi,
thanks for your answers so far!
I gave them a thought and came up with two different alternatives:
Alternative 1
I load the data from the flat file into a staging table using sqlldr. I convert the data to the target format using sqlldr expressions.
The columns of the staging table have the target format (date, number).
The rows that cannot be loaded go into a bad file. I manually load the data from the bad file (without any conversion) into the error table.
Alternative 2
The columns of the staging table are all of type varchar2 regardless of the target format.
I define data rules for all columns that require a later conversion.
I load the data from the flat file into the staging table using external table or sqlldr without any data conversion.
The rows that cannot be loaded go automatically into the error table.
When I read the data from the staging table, I can safely convert it since it is already checked by the rules.
What I dislike in alternative 1 is that I manually have to create a second file and a second mapping (ok, I can automate this using OMB*Plus).
Further, I would prefer using expressions in the mapping for converting the data.
What I dislike in alternative 2 is that I have to create a data rule and a conversion expression and then keep the data rule and the conversion expression in sync (in case of changes of the file format).
I also would prefer to have the data in the staging table in the target format. Well, I might load it into a second staging table with columns having the target format. But that's another mapping and a lot of i/o.
As far as I know I need the data quality option for using data rules, is that true?
Is there another alternative without any of these drawbacks?
Otherwise I think I will go for alternative 1.
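Either alternative ultimately implements the same row-routing rule: attempt every conversion, and only if all of them succeed does the row go to STG_OK; any failure sends the raw row to STG_ERR. A minimal sketch of that rule (Java; the two lists stand in for the two tables, and the three conversions mirror the to_date, to_number and 0/1-to-N/Y examples from the post):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.List;

public class StageRouter {
    record OkRow(LocalDate date, double amount, String flag) {}

    final List<OkRow> stgOk = new ArrayList<>();   // rows where every conversion succeeded
    final List<String> stgErr = new ArrayList<>(); // raw rows where any conversion failed

    static final DateTimeFormatter FMT = DateTimeFormatter.ofPattern("yyyy-MM-dd");

    // Applies all conversions; any failure routes the untouched raw line to
    // the error side, matching the all-or-nothing rule described above.
    void route(String rawLine) {
        try {
            String[] f = rawLine.split(",", -1);
            LocalDate date = LocalDate.parse(f[0], FMT);      // to_date
            double amount = Double.parseDouble(f[1]);         // to_number
            String flag = switch (f[2]) {                     // custom 0/1 -> N/Y
                case "0" -> "N";
                case "1" -> "Y";
                default -> throw new IllegalArgumentException("bad flag: " + f[2]);
            };
            stgOk.add(new OkRow(date, amount, flag));
        } catch (RuntimeException e) {                        // parse and format errors
            stgErr.add(rawLine);
        }
    }

    public static void main(String[] args) {
        StageRouter r = new StageRouter();
        r.route("2024-01-31,12.5,1");   // all conversions succeed
        r.route("31/01/2024,12.5,1");   // illegal date format -> error side
        System.out.println(r.stgOk.size() + " ok, " + r.stgErr.size() + " err"); // 1 ok, 1 err
    }
}
```

In alternative 1 this logic lives in the sqlldr expressions plus the bad file; in alternative 2 it lives in the data rules. The sketch just makes the shared contract explicit.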
Thanks,
Carsten. -
Multiple idocs from single flat file
Hi All
I want to send data from a flat file to SAP (file to IDoc).
My flat file structure is
id,name,number,city
2,R1,234,SD
2,R2,457,MD
3,R4,789,HG
3,R6,235,HG
The field 'id' will change; after every change in 'id', a separate IDoc should be created.
I have checked the following thread.
Re: Content conversion for seperate idoc
In the above thread, it is suggested to map v.no with removeContext and use SplitByValue on value change, then do the mapping accordingly; you can create 3 IDocs for the same.
I'm confused about how to do these mappings.
Please explain the mapping in detail.
Please help
Regards
Reemaif your source data type is like
MT_Source
Record 0-unbounded
id ----1
name -----1
number -----1
city -------1
then in the sender file communication channel you have to specify the file content conversion parameters as:
Document Name = MT_Source
Recordset Structure = Record,*
choose + to add more parameters:
Record.fieldSeparator = ,
Record.fieldNames = id,name,number,city
Record.endSeparator = 'nl'
then do the mapping as:
id ----> removeContext ----> SplitByValue (value change) ----> Target IDoc
map according to your requirement for the other fields.
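The removeContext/SplitByValue (value change) step amounts to: walk the rows in order and start a new group whenever the id differs from the previous row's id. Outside the mapping tool, that grouping looks like this (Java sketch; one inner list per IDoc):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SplitByValueChange {
    // Groups consecutive CSV rows into one bucket per run of equal ids --
    // the same behaviour as SplitByValue (value change) after removeContext:
    // each bucket would become one IDoc.
    static List<List<String>> groupByIdChange(List<String> rows) {
        List<List<String>> groups = new ArrayList<>();
        String previousId = null;
        for (String row : rows) {
            String id = row.split(",", -1)[0];   // first field is 'id'
            if (!id.equals(previousId)) {        // value change -> start a new group
                groups.add(new ArrayList<>());
                previousId = id;
            }
            groups.get(groups.size() - 1).add(row);
        }
        return groups;
    }

    public static void main(String[] args) {
        List<String> rows = Arrays.asList(       // the sample data from the post
                "2,R1,234,SD", "2,R2,457,MD", "3,R4,789,HG", "3,R6,235,HG");
        System.out.println(groupByIdChange(rows).size() + " idocs"); // 2 idocs
    }
}
```

Note that, like SplitByValue, this splits on each change in consecutive rows; if equal ids can appear non-adjacently and should still share an IDoc, the rows must be sorted by id first.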