Flat file requirement

I have received a flat file that contains data for new products which currently do not have an SAP material # assigned to them.
The format is :
Material#       qty
D1          10
D2          20
D3          30
Later, when one of these new products gets created in SAP, the user wants to provide me with a new file which contains only the changed records:
Material #     Qty
1234          10
How can I achieve this in BW?
Thanks

To clarify:
You want to load a flat file into BW for records that may not have an SAP material # assigned.  You want to use a placeholder value to be stored within material # (0MATERIAL) such as 'D1'.  Later, when the product is created within SAP, you want to load a new file, which will contain only the changed records.  Is this correct?
Using your example below:  The initial file contains a record with material D1 and a quantity of 10.
The new file contains SAP material #'1234' and a quantity of 10.
Can I assume this record should be interpreted as material 'D1' now being '1234'?
If so, please confirm.
Otherwise, based on the information supplied, I do not think this is a feasible approach.  How can you link the original D1 record to the new '1234' record?  (Surely not by the quantity key figure... what happens if multiple records have 10 as a quantity?)
Ideally, you need to develop a data model where you store the values for "new" products (with no SAP material number) and "existing" products (with an SAP material number) in separate InfoObjects.  Please keep in mind that if you create a report using 0MATERIAL, records for which 0MATERIAL is blank (i.e. "new" products) will be excluded from the report.
Please do not consider the above recommendation as a finished solution.  I offer it merely as a launchpad for further design possibilities for this scenario.  I'm sure there are other questions which need to be addressed which may influence the direction.
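To make that a little more concrete: one possible sketch (all object names below are made up for illustration) is to keep a small mapping table that records the SAP material number once it has been assigned to a placeholder, and to look it up in a field routine of the transformation for 0MATERIAL. The "changed records" file then only has to maintain that mapping; the quantities are never used as a key.

*   Sketch of a field routine (BW 7.x transformation, rule for 0MATERIAL).
*   ZMAT_MAP is a hypothetical custom table with fields LEGACY_MAT
*   (the placeholder from the file, e.g. 'D1') and SAP_MAT (the SAP
*   material number). SOURCE_FIELDS-MATERIAL stands for whatever the
*   material field is called in your DataSource.
    DATA l_sap_mat TYPE /bi0/oimaterial.

    SELECT SINGLE sap_mat FROM zmat_map INTO l_sap_mat
      WHERE legacy_mat = source_fields-material.
    IF sy-subrc = 0.
*     An SAP number has since been assigned: use it.
      result = l_sap_mat.
    ELSE.
*     No mapping yet: keep the placeholder value from the file.
      result = source_fields-material.
    ENDIF.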
Hope this helps.

Similar Messages

  • Flat files required for Universal Adapters HR Payroll

    Hi Folks
We are implementing the new HR Analytics using the Universal Adapters HR Payroll subject area.
Where can I get the flat files which are designed and delivered for the Universal Adapters and which contain the field names etc.?
I searched the web and found no clue of it, nor on Metalink.
Please help.
    thank you
    kumr

Data lineage for 7.9.5: search for Doc ID 735950.1, or
https://metalink3.oracle.com/od/faces/secure/km/DocumentDisplay.jspx?id=735950.1&h=Y
For 7.9.6:
The Oracle® Business Intelligence Applications ETL Data Lineage Guide Release 7.9.6 (Doc ID 829385.1) is now available on Metalink3. Below is the URL:
https://metalink3.oracle.com/od/faces/secure/km/DocumentDisplay.jspx?id=829385.1
    rm

  • Is a database table required for temporary interfaces with flat file data set source ?

    Folks,  this is the situation I have in ODI 11.1.1.7
I have a temporary interface (yellow), called MJ_TEMP_INT, that pulls data from TWO data sets in the source into a temporary target (TEMP_TARG). The catch is that one data set pulls from a table whereas the other data set pulls from a flat file.  A union is done on the data sets.
    I then create another interface, called MJ_INT, that uses the MJ_TEMP_INT as a source and the target is a real db. table called "REAL_TARGET"
    Two questions:
When I execute my second interface (MJ_INT), I get the message "ORA-00942: table or view does not exist" because it is looking for a real db table TEMP_TARG. Why must I have one? Is it because I am pulling from a flat file?
On my second interface (MJ_INT), when I look at the property sheet of my source interface MJ_TEMP_INT (yellow), the checkbox next to "Use temporary interface as Derived table" is DISABLED.  Why? Is it also because my temporary interface is pulling from a flat file?
    I have attached a file that shows a screen shot of my ODI studio.
By the way, IF my temporary interface source has only one data set, pulling from a db table into a temporary target table (say MJ_TEMP2_TARG), and I then use this temporary interface as a source to another real db target table (REAL2_TARGET), THEN everything works. ODI does not require me to have a real db table MJ_TEMP2_TARG, the checkbox for "Use temporary interface as Derived table" is NOT DISABLED, and my REAL2_TARGET table gets populated.
    Thank you in advance.
    M. Jamal.

    Thanks SH. I thought so. 
I understand the reason to materialize the file in a staging area, but that almost defeats the purpose of having a temporary interface in this case, if we have to save the data in a permanent db table first.  I assume the db table sticks around and is not automatically dropped once the interface execution ends.  If the db table sticks around, then I also must truncate it before executing the temporary interface each time. Right?

  • Flat File to IDOC Mapping requirement to generate Multiple Segments

    Hi Experts,
I got a requirement where I have 2 records in a file and I need to generate 2 IDOCs with multiple segments in them.
    FILE :
    10/01/2010     101  KRNA     ic_quantity          30-0257     3526     1     1     ea     110000     10
    10/01/2010     101     KRNA     ic_quantity          90-0005     3526     1     2     ea     110000     10
The mapping should generate 2 IDOCs with multiple segments as shown below.
I have imported the IDOC and changed the occurrence to "unbounded".
    The Basic  IDOC Type :  WMMBID02
    I need to generate Multiple segments of  E1MBXY1
i.e., the first IDOC should contain two E1MBXY1 segments
and the second IDOC should contain four E1MBXY1 segments.
    IDOC1 :   WMMBID02
    Segment :   E1MBXY1( 2 segments)                                                       
    10/01/2010     101     KRNA     ic_quantity          30-0257     3526     1     1     ea     110000     10
    10/01/2010     101     KRNA     ic_quantity          90-0005     3526     1     2     ea     110000     10
    IDOC2 : WMMBID02
    Segment :  E1MBXY1 ( 4 segments)                                                  
    10/01/2010     101     KRNA     ic_quantity          30-0257     3526     1     1     ea     110000     10
    10/01/2010     101     KRNA     ic_quantity          30-0257     3521     1     1     ea     110000     10
    10/01/2010     101     KRNA     ic_quantity          90-0005     3526     1     2     ea     110000     10
    10/01/2010     101     KRNA      ic_quantity          90-0005     3521     1     2     ea     110000     10
Can anyone suggest how to generate IDOCs with multiple segments?
What are the different ways of generating them?
Can it be achieved using multi-mapping, or do I need to go for a UDF?
If anyone has done this type of requirement, please share some pointers.
    Thanks
    Sai

Basically you need to generate one IDOC per record in the flat file. During FCC you convert the flat file to an XML structure on the sender side. In the mapping, use the XML file structure as the source and the IDOC as the receiver structure. You just export the IDOC and change the occurrence of the IDOC node from 1 to unbounded.  Please follow Michael's blog on file-to-IDOC multi-mapping without BPM. Yes, it is possible without BPM.
    see this link... This will answer your requirement.
    https://wiki.sdn.sap.com/wiki/display/XI/File%20to%20Multiple%20IDOC%20Splitting%20without%20BPM
    >Whether it can be achieved using Multi-mapping or I need go for UDF
You don't need a UDF for this.

• Steps required to extract data from a flat file to BI 7.0

Hi all,
I need the steps to extract the data from a flat file to BI 7.0.
Please provide me the steps rather than providing me the complete link to SAP Help.
I need the steps only.
Thanks,
Gautam

    BW 7.0
    Uploading of master data
    Log on to your SAP
Transaction code RSA1 leads you to Modelling.
    1.     Creation of Info Objects
    •     In left panel select info object
    •     Create info area
    •     Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    •     Create new characteristics and key figures under respective catalogs according to the project requirement
    •     Create required info objects and Activate.
    2.     Creation of Data Source
    •     In the left panel select data sources
    •     Create application component(AC)
    •     Right click  AC and create datasource
    •     Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
    •     In general tab give short, medium, and long description.
•     In the Extraction tab specify the file path, the number of header rows to be ignored, the data format (CSV) and the data separator (,). A sample file layout is shown after these steps.
    •     In proposal tab load example data and verify it.
•     In the Fields tab you can give the technical names of the InfoObjects in the template, so that you do not have to map them during the transformation; the system will map them automatically. If you do not map them in the Fields tab, you have to map them manually in the transformation in the InfoProvider.
    •     Activate data source and read preview data under preview tab.
•     Create an InfoPackage by right clicking the DataSource, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
    3.     Creation of data targets
    •     In left panel select info provider
    •     Select created info area and right click to select Insert Characteristics as info provider
    •     Select required info object ( Ex : Employee ID)
    •     Under that info object select attributes
    •     Right click on attributes and select create transformation.
•     In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
    •     Activate created transformation
    •     Create Data transfer process (DTP) by right clicking the master data attributes
    •     In extraction tab specify extraction mode ( full)
    •     In update tab specify error handling ( request green)
    •     Activate DTP and in execute tab click execute button to load data in data targets.
    4.     Monitor
Right click the data target, select Manage, and in the Contents tab choose Contents to view the loaded data. Alternatively, the monitor icon can be used.
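For illustration only, a master data flat file matching the settings above (one header row to ignore, CSV format, comma separator) could look like the lines below; the field names are invented for the example and must match your own DataSource fields:
    EMPLOYEE_ID,FIRSTNAME,LASTNAME
    1001,John,Smith
    1002,Mary,Jones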
    BW 7.0
    Uploading of Transaction data
    Log on to your SAP
Transaction code RSA1 leads you to Modelling.
    5.     Creation of Info Objects
    •     In left panel select info object
    •     Create info area
    •     Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    •     Create new characteristics and key figures under respective catalogs according to the project requirement
    •     Create required info objects and Activate.
    6.     Creation of Data Source
    •     In the left panel select data sources
    •     Create application component(AC)
    •     Right click  AC and create datasource
    •     Specify data source name, source system, and data type ( Transaction data )
    •     In general tab give short, medium, and long description.
•     In the Extraction tab specify the file path, the number of header rows to be ignored, the data format (CSV) and the data separator (,). A sample transaction data file is shown after these steps.
    •     In proposal tab load example data and verify it.
•     In the Fields tab you can give the technical names of the InfoObjects in the template, so that you do not have to map them during the transformation; the system will map them automatically. If you do not map them in the Fields tab, you have to map them manually in the transformation in the InfoProvider.
    •     Activate data source and read preview data under preview tab.
•     Create an InfoPackage by right clicking the DataSource, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
    7.     Creation of data targets
    •     In left panel select info provider
•     Select the created InfoArea and right click to create a DataStore object (ODS) or a cube.
•     Specify a name for the ODS or cube and click Create.
    •     From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
    •     Click Activate.
    •     Right click on ODS or Cube and select create transformation.
•     In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
    •     Activate created transformation
•     Create a data transfer process (DTP) by right clicking the ODS or cube.
    •     In extraction tab specify extraction mode ( full)
    •     In update tab specify error handling ( request green)
    •     Activate DTP and in execute tab click execute button to load data in data targets.
    8.     Monitor
Right click the data target, select Manage, and in the Contents tab choose Contents to view the loaded data. The ODS has two tables, the new data table and the active data table; to move data from the new table to the active table you have to activate the request after loading. Alternatively, the monitor icon can be used.
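Again for illustration only, a transaction data file for the same settings (one header row, CSV, comma separator) might look like the lines below, with the characteristics first and the key figure last; the field names are invented:
    EMPLOYEE_ID,CALDAY,HOURS_WORKED
    1001,20240101,8
    1002,20240101,6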
    cheers
    sunil

• I have one requirement, which is that a JMS XML file should be converted to a flat file

    HI Gurus,
    My Scenario is
Sender SAP ---> receiver MF (mainframe).
I have one requirement: I will get an IDOC from the SAP sender and pass it through PI to the mainframe, where the receiver's JMS XML file should be converted to a flat file.
Please guide me with any related suggestions and links on how to achieve this.
    Thanks in advance..

    >
    > My Scenario is
    > sender SAP ---> receiver  MF( Mainframe) ..
    >
    > I have one requirement i will get IDOC from SAP sender pass throw PI to Mainframe...  which is recevier's JMS XML file should convert to Flat file...
    >
Where does MQ come into the picture here? You are receiving data from SAP and sending it to a mainframe system, as per your post.
So the flow is SAP -> PI -> mainframe. Most of the time we send data to a mainframe system in the form of text files, so you can use the receiver file adapter and use File Content Conversion.
That's it.
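For illustration, the Content Conversion parameters of the receiver file channel could look roughly like this; the recordset name "Row" and the field lengths are placeholders and must match the element names of your target message structure (mainframe files are usually fixed length, hence fieldFixedLengths; for a delimited file you would use fieldSeparator instead):
    Message Protocol:      File Content Conversion
    Recordset Structure:   Row
    Row.fieldFixedLengths: 10,3,4,11
    Row.endSeparator:      'nl'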
    Regards,
    Raj

  • How to extract required data from a column to a flat file

My SSIS package is working OK. However, I want to refine one of the column extractions.
When data is extracted to the flat file, I just want the initials, first name and last name; e.g. the column contains
FZ = Ben Smith, Add1, add1, etc
and the only bit that I want is "Ben Smith".
How can I state in the package to just give me the name and exclude the rest?
    sukai

Add a Derived Column transformation to extract the name part alone and give an expression as below:
    LEFT([ColumnName],FINDSTRING([ColumnName],",",1)-1)
If you are on a version before SSIS 2012 (where LEFT is not available), use SUBSTRING:
    SUBSTRING([ColumnName],1,FINDSTRING([ColumnName],",",1)-1)
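Note that both expressions return everything before the first comma, which in the sample data still includes the "FZ = " prefix. If the name is always between the "=" and the first comma (an assumption based on the sample row), a variant like the following trims the prefix and surrounding spaces as well:
LTRIM(RTRIM(SUBSTRING([ColumnName],FINDSTRING([ColumnName],"=",1) + 1,FINDSTRING([ColumnName],",",1) - FINDSTRING([ColumnName],"=",1) - 1)))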
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • Help required in OIM-OID LDap Synch and GTC flat file connector

    Hi Experts,
    I am using OIM 11.1.1.5 with OID LDap Synch enabled. I have OIM protected with OAM 11.1.1.5.0 and almost all normal things are working.
When I am doing a trusted flat file GTC recon into OIM, the users are getting created in OIM without any password, and due to that my users are not getting created in OID (LDAP sync is enabled).
    The following exception is getting thrown:
    <Nov 13, 2011 9:48:21 AM CET> <Warning> <XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT> <BEA-000000> <FILE SUCCESSFULLY ARCHIVED : /home/oracle/OAM_ProtoTyping/TestCSV/Scheduled.csv>
    <Nov 13, 2011 9:48:21 AM CET> <Warning> <oracle.iam.callbacks.common> <IAM-2030146> <[CALLBACKMSG] Are applicable policies present for this async eventhandler ? : false>
    <Nov 13, 2011 9:48:22 AM CET> <Error> <oracle.iam.ldapsync.impl.eventhandlers.user> <IAM-3010021> <An error occurred while creating the user in LDAP.
    oracle.iam.platform.entitymgr.MissingRequiredAttributeException: [usr_password]
    at oracle.iam.platform.entitymgr.impl.EntityManagerImpl.checkRequired(EntityManagerImpl.java:1450)
    at oracle.iam.platform.entitymgr.impl.EntityManagerImpl.createEntity(EntityManagerImpl.java:263)
    at oracle.iam.ldapsync.impl.eventhandlers.user.UserCreateLDAPPostProcessHandler.createUser(UserCreateLDAPPostProcessHandler.java:261)
    at oracle.iam.ldapsync.impl.eventhandlers.user.UserCreateLDAPHandler.execute(UserCreateLDAPHandler.java:123)
    at oracle.iam.platform.kernel.impl.OrchProcessData.runPostProcessEvents(OrchProcessData.java:1166)
    at oracle.iam.platform.kernel.impl.OrchProcessData.runEvents(OrchProcessData.java:710)
    at oracle.iam.platform.kernel.impl.OrchProcessData.executeEvents(OrchProcessData.java:227)
    at oracle.iam.platform.kernel.impl.OrchestrationEngineImpl.resumeProcess(OrchestrationEngineImpl.java:675)
    at oracle.iam.platform.kernel.impl.OrchestrationEngineImpl.resumeProcess(OrchestrationEngineImpl.java:705)
    at oracle.iam.platform.kernel.impl.OrhestrationAsyncTask.execute(OrhestrationAsyncTask.java:108)
    at oracle.iam.platform.async.impl.TaskExecutor.executeUnmanagedTask(TaskExecutor.java:100)
    at oracle.iam.platform.async.impl.TaskExecutor.execute(TaskExecutor.java:70)
    at oracle.iam.platform.async.messaging.MessageReceiver.onMessage(MessageReceiver.java:68)
    at sun.reflect.GeneratedMethodAccessor1821.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at com.bea.core.repackaged.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:310)
    at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
    at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
    at com.bea.core.repackaged.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:89)
    at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
    at com.bea.core.repackaged.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:131)
    at com.bea.core.repackaged.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:119)
    at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
    at com.bea.core.repackaged.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
    at $Proxy335.onMessage(Unknown Source)
    at weblogic.ejb.container.internal.MDListener.execute(MDListener.java:574)
    at weblogic.ejb.container.internal.MDListener.transactionalOnMessage(MDListener.java:477)
    at weblogic.ejb.container.internal.MDListener.onMessage(MDListener.java:380)
    at weblogic.jms.client.JMSSession.onMessage(JMSSession.java:4659)
    at weblogic.jms.client.JMSSession.execute(JMSSession.java:4345)
    at weblogic.jms.client.JMSSession.executeMessage(JMSSession.java:3822)
    at weblogic.jms.client.JMSSession.access$000(JMSSession.java:115)
    at weblogic.jms.client.JMSSession$UseForRunnable.run(JMSSession.java:5170)
    at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:528)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    >
Has anybody faced a similar kind of issue?
I tried to use a post-process event handler on create, but while updating the password it says the user state is not in sync with OID.
So I am unable to use post-process event handlers as well.
    Regards,
    J

    Thanks Sunny,
    But the post process event handler with reset/update password is not working on CREATE;
    the following error message is being thrown:
    oracle.iam.platform.kernel.EventFailedException: Password reset failed because user JSMITH151 is not synchronized to the LDAP directory.
    at oracle.iam.ldapsync.impl.eventhandlers.user.util.LDAPUserHandlerUtil.resetPassword(LDAPUserHandlerUtil.java:203)
    at oracle.iam.ldapsync.impl.eventhandlers.user.UserResetPasswordLDAPHandler.execute(UserResetPasswordLDAPHandler.java:167)
    at oracle.iam.platform.kernel.impl.OrchProcessData.runPreProcessEvents(OrchProcessData.java:898)
    at oracle.iam.platform.kernel.impl.OrchProcessData.runEvents(OrchProcessData.java:634)
    at oracle.iam.platform.kernel.impl.OrchProcessData.executeEvents(OrchProcessData.java:227)
    at oracle.iam.platform.kernel.impl.OrchestrationEngineImpl.resumeProcess(OrchestrationEngineImpl.java:665)
In OIM 11.1.1.3 I found that the password was available for mapping in the GTC connector, but in OIM 11.1.1.5 Oracle has removed the password mapping attribute.
Can you please suggest?
I checked with Oracle Support; they are saying that in OIM 11.1.1.5 they have introduced a new post-process event handler which should generate the password on every trusted reconciliation event.
But in my environment it is not behaving like that.
    Regards,
    J

• Help required regarding: Validation on data loading from flat file

    Hi Experts,
I need your help with the following issue.
I need to validate the transactional data being loaded into the GL cube from a flat file:
1) The transactional data is to be loaded into the cube only if a master data record exists for the 0GL_ACCOUNT InfoObject.
2) If the master data record does not exist, the record needs to be skipped from the load, and after the load the system should issue a message saying how many records have been skipped (if there are any skipped records).
I would really appreciate your help and suggestions on solving this issue.
    Regds
    Hari

Hi, write a start routine in the transfer rules like this:
  DATA: l_s_datapak_line TYPE transfer_structure,
        l_s_errorlog     TYPE rssm_s_errorlog_int,
        l_s_glaccount    TYPE /bi0/pglaccount,
        new_datapak      TYPE tab_transtru.

  REFRESH new_datapak.
  LOOP AT datapak INTO l_s_datapak_line.
    SELECT SINGLE * FROM /bi0/pglaccount INTO l_s_glaccount
      WHERE chrt_accts = l_s_datapak_line-<field for CHRT_ACCTS in your transfer structure/DataSource>
        AND gl_account = l_s_datapak_line-<field for GL_ACCOUNT in your transfer structure/DataSource>
        AND objvers    = 'A'.
    IF sy-subrc EQ 0.
      APPEND l_s_datapak_line TO new_datapak.
    ENDIF.
  ENDLOOP.
  datapak = new_datapak.

  IF datapak[] IS INITIAL.
*   abort <> 0 means skip the whole data package
    abort = 4.
  ELSE.
    abort = 0.
  ENDIF.
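If you also need the message with the number of skipped records (your point 2), the same loop can count them; below is a rough sketch of only the changed part. How you surface the count (for example via the routine's error log table, as declared with l_s_errorlog above, or via the monitor) depends on your release, so check the structure fields in SE11 before using it.
  DATA: l_skipped TYPE i.
  LOOP AT datapak INTO l_s_datapak_line.
    "same SELECT SINGLE check as above
    IF sy-subrc EQ 0.
      APPEND l_s_datapak_line TO new_datapak.
    ELSE.
*     Master data record missing: skip the record and count it.
      l_skipped = l_skipped + 1.
    ENDIF.
  ENDLOOP.
* After the loop, report l_skipped as an information message.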
I have already made some modifications, but you can slightly change it to suit your needs.
    regards
    Emil

• Require script to load fixed-length flat file

    Hi
Can anyone provide me with a script to load a flat file into the database? A sample script would be OK.
    thanks in advance
    Siva

    You can use SQL*Loader to do this, or external tables in 9i or higher. These methods are well-documented in the Utilities Guide.
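For example (a sketch only; the table name, file name and column positions are made up), a SQL*Loader control file for a fixed-length file could look like this:
    LOAD DATA
    INFILE 'employees.dat'
    INTO TABLE employees
    (
      emp_id    POSITION(1:6)   INTEGER EXTERNAL,
      emp_name  POSITION(7:36)  CHAR,
      hire_date POSITION(37:46) DATE "YYYY-MM-DD"
    )
You would run it with something like: sqlldr userid=scott/tiger control=employees.ctl log=employees.log (and add APPEND or TRUNCATE before the INTO TABLE clause depending on how you want to handle existing rows).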
    Cheers, APC

  • Parallel loading of flat files into BPC 7.0 NW - Help required

    Hi All
Can I load multiple flat files in parallel into BPC using the IMPORT process chain?
If yes, how?
    Regards
    AK

Please import SAP Note 1507226 (https://service.sap.com/sap/support/notes/1507226), titled "FileService should support running DM package simultaneously", which is contained in BPC 7.0 NW SP10 and BPC 7.5 NW SP05.
    Best regards,
Jeffrey Holdeman (http://wiki.sdn.sap.com/wiki/display/profile/Jeffrey+Holdeman)
    SAP Labs, LLC
    BusinessObjects Division
    Americas Applications Regional Implementation Group (RIG)

  • Positional flat file schemas for input and output files to be generated with the required usecases

    Hello all,
I need some help regarding a positional flat file schema which contains multiple headers, bodies and trailers.
I have attached the sample input file below. This is a batched input, and we have to generate the output which I have given below.
We are unable to create a flat file schema which replicates the input file below. Please suggest a better approach to achieve this.
    Sample Input FIle:
    010320140211ABC XYZ XYZ ABCD201402110 FSPAFSPA005560080441.02000006557.FSPA.IN AB-CD ABCD/AB-CD BETALKORT
    1020140210AN194107123459.FSPA.IN
    [email protected]
    1020140210AN196202123453.FSPA.IN
    [email protected]
    1020140210AN198103123435.FSPA.IN
    [email protected]
    1020140210AN195907123496.FSPA.IN
    [email protected]
    1020140210AN195903123437.FSPA.IN
    [email protected]
    1020140210AN193909123434.FSPA.IN
    [email protected]
    1020140210AN195607123413.FSPA.IN
    [email protected]
    1020140210AN199205123408.FSPA.IN
    [email protected]
    1020140210AN196206123499.FSPA.IN
    [email protected]
    1020140210AN196709123410.FSPA.IN
    [email protected]
    1020140210AN194708123455.FSPA.IN
    [email protected]
    1020140210AN195710123443.FSPA.IN
    [email protected]
    1020140210AN198311123436.FSPA.IN
    [email protected]
    1020140210AN196712123471.FSPA.IN
    [email protected]
    1020140210AV197005123403.FSPA.IN
    98000000000000014000000000000001000000000000015
    010320140211ABC XYZ XYZ PEDB201402111 FSPAICA 005560080441.02000006557.FSPA.IN AB-CDABCD/ABCDBETALKORT
    1020140210AN195111123491.ICA.IN
    [email protected]
    1020140210AV195309123434.ICA.IN
    98000000000000001000000000000001000000000000002
    Output FIle:
    1020140210AN195607123413.FSPA.IN
    [email protected]
    1020140210AN199205123408.FSPA.IN
    [email protected]
    1020140210AN196206123499.FSPA.IN
    [email protected]
    1020140210AN196709123410.FSPA.IN
    [email protected]
    1020140210AN194708123455.FSPA.IN
    [email protected]
    1020140210AN195710123443.FSPA.IN
    [email protected]
    1020140210AN198311123436.FSPA.IN
    [email protected]
    1020140210AN196712123471.FSPA.IN
    [email protected]
    1020140210AN195111123491.ICA.IN
    110019EPS [email protected]
    1020140210AV197005123403.FSPA.IN
    1020140210AV195309123434.ICA.IN
    98000000000000001000000000000001000000000000002
    99000000000000001
Note: the header is a single line up to BETALKORT, and there is also a space before the email ID; that is not getting pasted properly in the files.
    Thanks and Regards,
    Veena Handadi

Hi all,
The header was missing from the output file in the post above.
Please find the output file:
    010320140211ABC XYZ XYZ ABCD201402110 FSPAFSPA005560080441.02000006557.FSPA.IN AB-CD ABCD/AB-CD BETALKORT
    1020140210AN195607123413.FSPA.IN
    [email protected]
    1020140210AN199205123408.FSPA.IN
    [email protected]
    1020140210AN196206123499.FSPA.IN
    [email protected]
    1020140210AN196709123410.FSPA.IN
    [email protected]
    1020140210AN194708123455.FSPA.IN
    [email protected]
    1020140210AN195710123443.FSPA.IN
    [email protected]
    1020140210AN198311123436.FSPA.IN
    [email protected]
    1020140210AN196712123471.FSPA.IN
    [email protected]
    1020140210AN195111123491.ICA.IN
    110019EPS [email protected]
    1020140210AV197005123403.FSPA.IN
    1020140210AV195309123434.ICA.IN
    98000000000000001000000000000001000000000000002
    99000000000000001

• Flat file hierarchies

Hi,
I have created a flat file hierarchy in BI. Now I want to create a BEx query on this. I would like to know how to load transactional data into this hierarchy, so that I can view the data in my query. Kindly revert.
Mahi

We can't load transaction data into a hierarchy; it consists of master data.
Can you be more clear on your requirement? Load the transaction data into the DSO/InfoCube where this hierarchy's InfoObject is used,
and while creating a query you can make use of this hierarchy in your report.
    Regards
    KP

  • Transfer structure sequence for Flat File

I am wondering whether it is really important to maintain a particular sequence in the DataSource/transfer structure for InfoObjects which are set up to get a constant value in the transfer rules. I am having trouble loading a flat file. The transfer structure has some compounded characteristics, like 0Fisvar for 0Fiscper and 0Co_area for 0profit_ctr, both getting a fixed value in the transfer rules. How should I maintain the flat file and transfer structure sequence?

    Hi Vishan,
    This is what I used to do.
If there is a compounded InfoObject or a key figure that requires a unit (even though in your case the compounded InfoObject and the unit are always the same), you "need" to add them to the transfer structure.
After adding them, this is what I would do:
Move them around to match the flat file structure.
Now, the fields that are always constants and do not come from the flat file: move them to the end.
They will be ignored; your transfer rules will populate them.
If you get an error now, that means your flat file is not formatted correctly; apart from that, you are fine.
    Ram Chamarthy

  • One Time Flat File load in a Cube that extracts from ECC

    Hi,
I have a cube that is already extracting data from ECC on a daily basis. I have been asked to load some historical data into the cube from their legacy system. My question is: how do I prepare that file from the legacy system to match the fields in the cube? What are the technicalities that I need to look at?
    For Example:
    0MATERIAL in the Cube:  Material Number in the Flat File
    ZAMOUNT in the Cube: Cost of Product in the Flat File
    How do I arrange to match these?
    Thanks

    Hi
Create a DataSource and enter the required objects.
Preview the data.
Create transformations (map the DataSource fields to the target objects - direct assignment).
Create an InfoPackage and extract the data up to the PSA.
Create a DTP and run it - it will extract the data from the PSA to the target.
No. of header rows to be ignored: 1
File type: CSV
Data separator: ;
Escape sign: "
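For illustration (using just the two fields from the question; the values are examples only), the legacy file would then look like the lines below, with the columns in the same order as the DataSource fields and matching the settings above (one header row, semicolon separator):
    Material Number;Cost of Product
    4711;120.50
    4712;89.90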
    Thanks,
