File data transformation

Hello,
Looking for some advice...
We have a text file as a source, and one of its columns, "Period", holds data in the format Q1FY10. In the ODI interface this source column needs to be parsed and mapped to two columns in the target (Time & Year).
Ex: Q1 -> Time
FY10 -> Year
How do we parse/transform the source column "Period" to map it to two different columns in the target?
If the source were a relational table, we could have used substr(Period, 1, 2) -> Time and substr(Period, 3) -> Year.
Thanks in advance for your help.

user4585856 wrote:
The target is an Essbase cube (non-relational). So, can I still do that?

Yes you can: add the functions in the target and execute them in your staging area.
Cheers
John
http://john-goodwin.blogspot.com/
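To illustrate John's suggestion concretely, here is a minimal sketch of the two target mapping expressions, assuming an Oracle (or memory-engine) staging area and using SRC as an illustrative alias for the file datastore:

-- mapping expression for the target column Time
SUBSTR(SRC.PERIOD, 1, 2)

-- mapping expression for the target column Year
SUBSTR(SRC.PERIOD, 3)

Set both expressions to execute on the staging area rather than on the source, since a flat-file data server cannot evaluate functions itself.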

Similar Messages

  • Script Logic VS Data Transformation File

    Hi all,
    I'm new to SAP BPC, but I have knowledge of SAP BW.
    I can see the conversion file, which is referenced in the data transformation file and which we can use for mapping and converting external data into internal data.
    How is the data transformation file different from script logic? Do we refer to script logic in the data transformation file for each required dimension?
    Can any of you give me clarity on where script logic and the data transformation file fit in the BPC data management flow?
    I will really appreciate all your help!
    Thanks
    Ben.

    Nilanjan,
    I have another quick question...
    Suppose my BPC application has 5 dimensions. For 4 of those dimensions I get the data directly from SAP BW; for the remaining dimension, assume I need to extract the data by doing a lookup against a different table, which also resides in BW.
    How do I populate the data for DIM 5?
    I got your point that the data transformation file is purely for field mapping. If I want to populate DIM5 from script logic, what do I need to map in the transformation file? I hope you got my point.
    My question is how to populate a dimension in BPC using a lookup approach.
    Thanks,
    Ben.

  • Data Transformation File

    Hi,
    Can any of you explain the purpose of the data transformation file?
    How do we use script logic based on a BADI in SAP BPC? Do we need to call it in Default.lgf?
    If I want to apply some custom logic while populating data using a BADI, do I need to call that BADI in the data transformation file?
    Thanks,
    Ben.

    Hi Ben,
    There is no possibility of calling user-defined scripts or a BADI from transformation files. The main use of the transformation file is to map data from source to destination.
    If you require a data load based on some filter criteria, there are certain options available when creating the transformation file, which you can find in the link below.
    http://help.sap.com/saphelp_bpc70sp02/helpdata/en/66/ac5f7e0e174c848b0ecffe5a1d7730/frameset.htm
    Hope this helps,
    Regards,
    G.Vijaya Kumar

  • Parameters in Transformation file "data export"

    Hi,
    I'm trying to export data from SAP BPC to a txt file using the standard package (Export Transactional Data).
    I have several issues. Firstly, it seems like the data export generates a lot of lines containing all the "PARENT" calculations, so my file is perhaps only 30 lines at base-member level, but the final flat file is perhaps 3,000 lines due to all the calculated lines at parent level.
    Is there any way to "disable" all these parent calculated lines in the export?
    Also, I've tried to find some examples of transformation files for data export but cannot find any.
    Does anyone know which parameters I can use in the *OPTIONS section of the transformation file?
    Thank you,
    Joergen Dalby

    If you can access the other machine as a shared folder, then provide that path in the physical schema, e.g. \\my_other_pc_on_shared\new_folder
    If you cannot access it like that, then create an agent on that machine that can access the path, and execute your project using that agent.
    Thanks
    Bhabani
    http://bhabaniranjan.com/

  • BPM was Picking two files but it was not processing 2nd file data in target

    Hi all,
    I have designed a scenario for picking up two files; I want to merge these files into a single structure and send it to the target. When I test it, the BPM picks up both files, but only the first file's data reaches the target system. The structures look like this:
    source1: root, with fields a, b, c
    source2: the same structure, with fields a, b, c
    target:  header, then item (0..unbounded) with fields a, b, c, plus a duplicated item node with fields a, b, c
    I receive two source files with the same structure, but I have to map them to one target structure, with the fields of both sources mapped under the item node. So I duplicated the item node and mapped the a, b, c fields from both source structures. While testing, the BPM picks up the two files, but in the target I only get the first file's data under the item node; whatever I mapped from the second file to the duplicated structure never reaches the target file. (The target is a standard proxy structure, so I can't create a new node; the item node's occurrence is 0..unbounded, which is why I duplicated it. When I test the interface mapping on its own it works fine.) Can anyone suggest a solution for this problem?
    Thanks,
    Seshagiri.

    Hi,
    In BPM, follow the steps below and hopefully your problem will be solved.
    1. Configure one sender CC to get the files from your application server. Once the files hit the BPM, you need to configure your BPM accordingly.
    2. Use the Receive step in BPM to receive the messages. Use a correlation to get the correct files.
    3. Use a container and append the files, so both files are appended; this means the message content will have the 1st header, body, footer, then the 2nd header, body, footer.
    4. If you want to split the message to different receivers, use the Fork step; otherwise leave it out.
    5. Use the Transformation step; the transformation step will call the message mapping. Provide the correct message mapping.
    6. Now use the Send step to send the message to the receiver (target) system. In this step use "split value for each" if you want to generate two different files.
    7. Configure one receiver CC to generate both files.
    Hope this helps.
    Cheers,
    Jay

  • Data transformation tool in WLI 8.1

    hi,
    We are trying to explore the data transformation tool of WLI 8.1.
    We have COBOL copybooks as input and we need to end up populating Oracle tables.
    These are the steps we are following:
    1. Use Format Builder to convert the COBOL copybooks into .mfl files.
    2. Import them into the existing schema.
    3. Create a schema (.xsd) that would take the output, which could then be transferred into the Java variables while creating the .jcx file.
    4. Create a .dtf file that takes the .mfl as input and produces the .xsd created in step 3 as output.
    5. Create a .jpd file that is initiated by the client request, uses the .dtf as an input in the control send node, and returns the .xsd created in step 3.
    The xsd file we have created is:
    <?xml version="1.0"?>
    <xs:schema
        xmlns:xs="http://www.w3.org/2001/XMLSchema"
        xmlns:tns="http://www.bea.com/TransformationWeb/part1xform.xsd"
        targetNamespace="http://www.bea.com/TransformationWeb/part1xform.xsd">
      <xs:element name="part1xform" type="tns:part1xform"/>
      <xs:complexType name="part1xform">
        <xs:sequence>
          <xs:element name="part1xform-rct_nbr" type="xs:integer"/>
          <xs:element name="part1xform-fac_nbr" type="xs:integer"/>
          <xs:element name="part1xform-rct_suff" type="xs:string"/>
        </xs:sequence>
      </xs:complexType>
    </xs:schema>
    We have not as yet gone to the stage where we can create a .jcx file using the
    output of the first stage. The error we get while trying to run the .jpd file
    in the last step is:
    An unexpected exception occurred while attempting to locate the run-time information
    for this Web Service. Error: java.lang.NoClassDefFoundError:com/bea/transformationWeb/part1Xform/Part1XformDocument
    Any help you could provide with this would be really appreciated.
    However, suggestions on some other method to follow for data transformation from COBOL copybooks to Oracle tables are also welcome.
    Divya Ravishankar
    Advance Computer Services Ltd.
    Millennium Business Park, Mhape,
    Navi Mumbai-400703. Phone: 27782805/6/7.

    Can you please attach that COBOL copybook so that I can test it?
    "divya" <[email protected]> wrote:
    >
    hi,
    We are trying to explore the data transformation tool of the wil8.1.
    We have cobol copy books as input and we need to end up populating oracle
    tables.
    These are the steps we are following:
    1. Use format builder to change cobol copy books into .mfl files
    2. Import them into the existing schema.
    3.. Create a schema(.xsd) which would take the output which then could
    be transferd
    into the java variables while creating the .jcx file.
    3. Create a .dtf file that would take in the .mfl as the input and the
    output
    would be the .xsd file created in step 3.
    4. Create a .jpd file that would be initiated with the client request
    and would
    use the .dtf as an input in the control send node and will return the
    .xsd file
    creted in the step3.
    The xsd file we have created is:
    <?xml version="1.0"?>
    <xs:schema
         xmlns:xs="http://www.w3.org/2001/XMLSchema"
         xmlns:tns="http://www.bea.com/TransformationWeb/part1xform.xsd"
         targetNamespace="http://www.bea.com/TransformationWeb/part1xform.xsd">
    <xs:element name="part1xform"
    type="tns:part1xform"/>
    <xs:complexType name="part1xform">
    <xs:sequence>
    <xs:element name="part1xform-rct_nbr" type="xs:integer"/>
    <xs:element name="part1xform-fac_nbr" type="xs:integer"/>
    <xs:element name="part1xform-rct_suff" type="xs:string"/>
    </xs:sequence>
    </xs:complexType>
         </xs:schema>
    We have not as yet gone to the stage where we can create a .jcx file
    using the
    output of the first stage. The error we get while trying to run the .jpd
    file
    in the last step is:
    An unexpected exception occurred while attempting to locate the run-time
    information
    for this Web Service. Error: java.lang.NoClassDefFoundError:com/bea/transformationWeb/part1Xform/Part1XformDocument
    Any help you coulde provide in this would be really appriciated.
    However, your suggestions on some other method to be followed for data
    tranformation
    from cobol copybooks to oracle tables are also welcome.
    Divya Ravishankar
    Advance Computer Services Ltd.
    Millennium Business Park, Mhape,
    Navi Mumbai-400703. Phone: 27782805/6/7.

  • Oracle Database Inserts Via Microsoft Data Transformation Services (DTS)

    This question involves a SQL Server database and an Oracle database. The databases reside on different servers. One of our developers periodically uses Microsoft DTS (Data Transformation Services) to read data from a SQL Server database and insert it into an Oracle database. Normally the job runs once a day and reportedly inserts about 20,000 rows. The job usually runs fine.
    About a month ago execution of the daily job was suspended. Two days ago the developer ran a job to select and insert nine days of information. He estimated that 80,000 rows would be inserted. The job cancelled after twenty-three minutes when it filled up the 512 MB UNDO tablespace. (FYI, we use automatic UNDO management.) At the point of failure the number of active sessions spiked sharply in the Oracle database because of system I/O waits (log file parallel write, db file parallel write, and control file parallel write). The number of active sessions also spiked sharply in three other Oracle databases whose files reside on the same array of disk drives. Most of those sessions were waiting on commits (log file sync). The spikes lasted for one minute or less.
    Grid Control’s performance monitor shows that sqlservr.exe is the module being executed when the UNDO tablespace fills up. We ran the job a second time and closely monitored it, watching the amount of UNDO space grow until it used all 512 MB available. The symptoms described above for the first cancellation were repeated in the second cancellation.
    We reran the job processing a single day’s worth of information and that ran fine. Then we ran it for two days of information, then for six days. Everything ran fine. During those tests no more than 70 MB of UNDO space was used.
    Our developer reported that last week he ran the job for nine days of information, the same amount as the job that cancelled twice today. He estimates that it ran for about 80 minutes and went to a normal end-of-job.
    Can anyone here offer an explanation of why we seem to be getting these varied demands for space in the UNDO tablespace? Do you know if Microsoft DTS issues a commit after each insert or only a single commit at the end-of-job?
    Thank you,
    Bill
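    One way to answer the commit-frequency question directly is to watch the undo consumed by the open transaction from the DTS session while the job runs. A minimal sketch, assuming you can query the V$ views during the load (the column aliases are illustrative):

    -- Undo consumed by each open transaction, with the owning session.
    -- If DTS commits after every insert, USED_UBLK/USED_UREC stay small and
    -- START_TIME keeps resetting; if it commits only at end-of-job, the
    -- figures grow steadily for the whole run.
    SELECT s.sid,
           s.serial#,
           s.program,
           t.start_time,
           t.used_ublk AS undo_blocks,
           t.used_urec AS undo_records
    FROM   v$transaction t
           JOIN v$session s ON s.taddr = t.addr
    ORDER  BY t.used_ublk DESC;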


  • Upload flat file data

    Hi,
    I am loading flat file data through the Data Manager.
    Where can I select the master data or transactional data option?
    While running the package I am getting the following error: Conversion result file is empty; either the source data file was empty or all records were rejected during conversion.
    Please let me know your inputs.
    PSR

    Hi,
    To load transaction data - run the "Data Management > Import" package. Before this, make sure to upload the file to the server using "eData > Data Upload". You also need to create a transformation file to define the mappings between the data in the flat file and the dimensions in the application.
    To load master data - run the "System Administrative > ImportMasterData" package.
    Hope this helps.
    Regards,
    Shoba

  • Data Transformation Services Execution Utility stopped working and was closed

    Hi,
    I had the SSIS packages (ETL job) working fine for a long time; then we moved the server/machine and the ETL job keeps failing.
    The ETL job runs a number of packages. The job fails on the first run; I set retries and it works on the 2nd or 3rd try, but it fails daily on the first run, where DTExec.exe crashes (log pasted below).
    Server: Windows 2008 R2
    SQL Server:  SQL Server 2008 R2
    RAM 64 GB
    CPU cores 8
    We had similar specs on the previous machine and the ETL job worked fine. The only difference is that we now have 2 SQL Server instances on the same machine.
    I suspect it's an issue with the ETL packages, but then why did it work fine for a long time before the machine move and keep failing after it?
    Any thoughts?
    Version=1
    EventType=APPCRASH
    EventTime=130432662277734721
    ReportType=2
    Consent=1
    ReportIdentifier=f2ff88b6-cfc3-11e3-babd-005056997b14
    IntegratorReportIdentifier=f2ff88b5-cfc3-11e3-babd-005056997b14
    Response.type=4
    Sig[0].Name=Application Name
    Sig[0].Value=DTExec.exe
    Sig[1].Name=Application Version
    Sig[1].Value=2009.100.4263.0
    Sig[2].Name=Application Timestamp
    Sig[2].Value=5036ba73
    Sig[3].Name=Fault Module Name
    Sig[3].Value=DTSPipeline.dll
    Sig[4].Name=Fault Module Version
    Sig[4].Value=2009.100.4263.0
    Sig[5].Name=Fault Module Timestamp
    Sig[5].Value=5036ba53
    Sig[6].Name=Exception Code
    Sig[6].Value=40000015
    Sig[7].Name=Exception Offset
    Sig[7].Value=00000000000a33c5
    DynamicSig[1].Name=OS Version
    DynamicSig[1].Value=6.1.7601.2.1.0.274.10
    DynamicSig[2].Name=Locale ID
    DynamicSig[2].Value=3081
    DynamicSig[22].Name=Additional Information 1
    DynamicSig[22].Value=7c29
    DynamicSig[23].Name=Additional Information 2
    DynamicSig[23].Value=7c290b53eb7940378e43b699d1de1f07
    DynamicSig[24].Name=Additional Information 3
    DynamicSig[24].Value=199e
    DynamicSig[25].Name=Additional Information 4
    DynamicSig[25].Value=199e6470d3145d6303fbe30033cf7038
    UI[2]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTExec.exe
    UI[5]=Check online for a solution (recommended)
    UI[6]=Check for a solution later (recommended)
    UI[7]=Close
    UI[8]=Data Transformation Services Execution Utility stopped working and was closed
    UI[9]=A problem caused the application to stop working correctly. Windows will notify you if a solution is available.
    UI[10]=&Close
    LoadedModule[0]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTExec.exe
    LoadedModule[1]=C:\Windows\SYSTEM32\ntdll.dll
    LoadedModule[2]=C:\Windows\system32\kernel32.dll
    LoadedModule[3]=C:\Windows\system32\KERNELBASE.dll
    LoadedModule[4]=C:\Windows\system32\ADVAPI32.dll
    LoadedModule[5]=C:\Windows\system32\msvcrt.dll
    LoadedModule[6]=C:\Windows\SYSTEM32\sechost.dll
    LoadedModule[7]=C:\Windows\system32\RPCRT4.dll
    LoadedModule[8]=C:\Windows\WinSxS\amd64_microsoft.vc80.crt_1fc8b3b9a1e18e3b_8.0.50727.6195_none_88e41e092fab0294\MSVCR80.dll
    LoadedModule[9]=C:\Windows\system32\USER32.dll
    LoadedModule[10]=C:\Windows\system32\GDI32.dll
    LoadedModule[11]=C:\Windows\system32\LPK.dll
    LoadedModule[12]=C:\Windows\system32\USP10.dll
    LoadedModule[13]=C:\Windows\system32\OLEAUT32.dll
    LoadedModule[14]=C:\Windows\system32\ole32.dll
    LoadedModule[15]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTEParse.dll
    LoadedModule[16]=C:\Windows\WinSxS\amd64_microsoft.vc80.crt_1fc8b3b9a1e18e3b_8.0.50727.6195_none_88e41e092fab0294\MSVCP80.dll
    LoadedModule[17]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTEPkg.dll
    LoadedModule[18]=C:\Windows\WinSxS\amd64_microsoft.vc80.atl_1fc8b3b9a1e18e3b_8.0.50727.6195_none_8a1dd9552ed7f8d8\ATL80.DLL
    LoadedModule[19]=C:\Windows\system32\PSAPI.DLL
    LoadedModule[20]=C:\Windows\system32\VERSION.dll
    LoadedModule[21]=C:\Windows\system32\IMM32.DLL
    LoadedModule[22]=C:\Windows\system32\MSCTF.dll
    LoadedModule[23]=C:\Windows\system32\CRYPTBASE.dll
    LoadedModule[24]=C:\Windows\system32\CLBCatQ.DLL
    LoadedModule[25]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTS.dll
    LoadedModule[26]=C:\Windows\system32\SHLWAPI.dll
    LoadedModule[27]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\dtsmsg100.dll
    LoadedModule[28]=C:\Windows\system32\CRYPT32.dll
    LoadedModule[29]=C:\Windows\system32\MSASN1.dll
    LoadedModule[30]=C:\Windows\system32\SXS.DLL
    LoadedModule[31]=C:\Windows\System32\msxml6.dll
    LoadedModule[32]=C:\Windows\system32\CRYPTSP.dll
    LoadedModule[33]=C:\Windows\system32\rsaenh.dll
    LoadedModule[34]=C:\Windows\system32\RpcRtRemote.dll
    LoadedModule[35]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DtsConn.dll
    LoadedModule[36]=C:\Windows\system32\SHELL32.dll
    LoadedModule[37]=C:\Windows\system32\mscoree.dll
    LoadedModule[38]=C:\Windows\Microsoft.NET\Framework64\v4.0.30319\mscoreei.dll
    LoadedModule[39]=C:\Windows\Microsoft.NET\Framework64\v2.0.50727\mscorwks.dll
    LoadedModule[40]=C:\Windows\system32\profapi.dll
    LoadedModule[41]=C:\Windows\assembly\NativeImages_v2.0.50727_64\mscorlib\88744044294787b99dd4a8704ab75a79\mscorlib.ni.dll
    LoadedModule[42]=C:\Windows\assembly\NativeImages_v2.0.50727_64\System\af0a0b96a02f9925eb84392ee65a5cfa\System.ni.dll
    LoadedModule[43]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\40937910f60adedfb214d89b989f929d\Microsoft.SqlServer.ManagedDTS.ni.dll
    LoadedModule[44]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\f82fe091649105fb4de1195cb9da7438\Microsoft.SqlServer.DTSRuntimeWrap.ni.dll
    LoadedModule[45]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\77e0f983055ea695867ca69afaecda79\Microsoft.SqlServer.SQLTask.ni.dll
    LoadedModule[46]=C:\Windows\assembly\NativeImages_v2.0.50727_64\System.Xml\3975acf49313ceea1280da91f0383480\System.Xml.ni.dll
    LoadedModule[47]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\80c6801f36d7c3e891323f6b53dccbcc\Microsoft.SqlServer.Msxml6_interop.ni.dll
    LoadedModule[48]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\ExecPackageTask.dll
    LoadedModule[49]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\Resources\1033\dtsmsg100.rll
    LoadedModule[50]=C:\Windows\system32\urlmon.dll
    LoadedModule[51]=C:\Windows\system32\api-ms-win-downlevel-ole32-l1-1-0.dll
    LoadedModule[52]=C:\Windows\system32\api-ms-win-downlevel-shlwapi-l1-1-0.dll
    LoadedModule[53]=C:\Windows\system32\api-ms-win-downlevel-advapi32-l1-1-0.dll
    LoadedModule[54]=C:\Windows\system32\api-ms-win-downlevel-user32-l1-1-0.dll
    LoadedModule[55]=C:\Windows\system32\api-ms-win-downlevel-version-l1-1-0.dll
    LoadedModule[56]=C:\Windows\system32\api-ms-win-downlevel-normaliz-l1-1-0.dll
    LoadedModule[57]=C:\Windows\system32\normaliz.DLL
    LoadedModule[58]=C:\Windows\system32\iertutil.dll
    LoadedModule[59]=C:\Windows\system32\WININET.dll
    LoadedModule[60]=C:\Windows\system32\api-ms-win-downlevel-shlwapi-l2-1-0.dll
    LoadedModule[61]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTSComExprEval.dll
    LoadedModule[62]=C:\Program Files\Microsoft SQL Server\100\Shared\instapi10.dll
    LoadedModule[63]=C:\Program Files\Microsoft SQL Server\100\Shared\SQLBoot.dll
    LoadedModule[64]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\c9743c9d2ac43664c43d9a0b3fbb5548\Microsoft.SqlServer.DtsMsg.ni.dll
    LoadedModule[65]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\5ef353725aeb5af35fb49c3f300c38b3\Microsoft.SqlServer.SQLTaskConnectionsWrap.ni.dll
    LoadedModule[66]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\52f53ccfcc9570f1fd13f63d0a640e81\Microsoft.SqlServer.BatchParser.ni.dll
    LoadedModule[67]=C:\Windows\assembly\GAC_64\Microsoft.SqlServer.BatchParser\10.0.0.0__89845dcd8080cc91\Microsoft.SqlServer.BatchParser.dll
    LoadedModule[68]=C:\Windows\WinSxS\amd64_microsoft.vc80.crt_1fc8b3b9a1e18e3b_8.0.50727.6195_none_88e41e092fab0294\msvcm80.dll
    LoadedModule[69]=C:\Program Files\Common Files\System\Ole DB\oledb32.dll
    LoadedModule[70]=C:\Windows\system32\MSDART.DLL
    LoadedModule[71]=C:\Windows\system32\bcrypt.dll
    LoadedModule[72]=C:\Windows\WinSxS\amd64_microsoft.windows.common-controls_6595b64144ccf1df_6.0.7601.17514_none_fa396087175ac9ac\Comctl32.dll
    LoadedModule[73]=C:\Program Files\Common Files\System\Ole DB\OLEDB32R.DLL
    LoadedModule[74]=C:\Windows\system32\comsvcs.dll
    LoadedModule[75]=C:\Windows\system32\bcryptprimitives.dll
    LoadedModule[76]=C:\Windows\system32\sqlncli10.dll
    LoadedModule[77]=C:\Windows\WinSxS\amd64_microsoft.windows.common-controls_6595b64144ccf1df_5.82.7601.18201_none_a4d3b9377117c3df\COMCTL32.dll
    LoadedModule[78]=C:\Windows\system32\COMDLG32.dll
    LoadedModule[79]=C:\Windows\system32\NETAPI32.dll
    LoadedModule[80]=C:\Windows\system32\netutils.dll
    LoadedModule[81]=C:\Windows\system32\srvcli.dll
    LoadedModule[82]=C:\Windows\system32\wkscli.dll
    LoadedModule[83]=C:\Windows\system32\WS2_32.dll
    LoadedModule[84]=C:\Windows\system32\NSI.dll
    LoadedModule[85]=C:\Windows\system32\1033\SQLNCLIR10.RLL
    LoadedModule[86]=C:\Windows\system32\secur32.dll
    LoadedModule[87]=C:\Windows\system32\SSPICLI.DLL
    LoadedModule[88]=C:\Windows\system32\credssp.dll
    LoadedModule[89]=C:\Windows\system32\kerberos.DLL
    LoadedModule[90]=C:\Windows\system32\cryptdll.dll
    LoadedModule[91]=C:\Windows\system32\msv1_0.DLL
    LoadedModule[92]=C:\Windows\system32\ntdsapi.dll
    LoadedModule[93]=C:\Windows\system32\LOGONCLI.DLL
    LoadedModule[94]=C:\Windows\system32\security.dll
    LoadedModule[95]=C:\Windows\system32\schannel.DLL
    LoadedModule[96]=C:\Program Files\Microsoft SQL Server\90\Shared\instapi.dll
    LoadedModule[97]=C:\Windows\system32\mswsock.dll
    LoadedModule[98]=C:\Windows\System32\wshtcpip.dll
    LoadedModule[99]=C:\Windows\System32\wship6.dll
    LoadedModule[100]=C:\Windows\system32\DNSAPI.dll
    LoadedModule[101]=C:\Windows\system32\IPHLPAPI.DLL
    LoadedModule[102]=C:\Windows\system32\WINNSI.DLL
    LoadedModule[103]=C:\Windows\system32\rasadhlp.dll
    LoadedModule[104]=C:\Windows\System32\fwpuclnt.dll
    LoadedModule[105]=C:\Windows\system32\ncrypt.dll
    LoadedModule[106]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\SQLTaskConnections.dll
    LoadedModule[107]=C:\Windows\system32\ODBC32.dll
    LoadedModule[108]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\BatchParser.dll
    LoadedModule[109]=C:\Windows\system32\odbcint.dll
    LoadedModule[110]=C:\Windows\Microsoft.NET\Framework64\v2.0.50727\mscorjit.dll
    LoadedModule[111]=C:\Windows\assembly\NativeImages_v2.0.50727_64\System.Data\5e957216f11830cbc49b4b30314e0e10\System.Data.ni.dll
    LoadedModule[112]=C:\Windows\assembly\GAC_64\System.Data\2.0.0.0__b77a5c561934e089\System.Data.dll
    LoadedModule[113]=C:\Windows\assembly\NativeImages_v2.0.50727_64\System.AddIn\b3b4d44d80055c9e96909f153ff92fbc\System.AddIn.ni.dll
    LoadedModule[114]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\5388ebc8e600ad2542d8afbef9559944\Microsoft.SqlServer.ScriptTask.ni.dll
    LoadedModule[115]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\492ba64d2cb55aa910114c85c2327821\Microsoft.SqlServer.VSTAScriptingLib.ni.dll
    LoadedModule[116]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.VisualStu#\f3ab3bf00723d757396110ac3dc57a6d\Microsoft.VisualStudio.Tools.Applications.DesignTime.v9.0.ni.dll
    LoadedModule[117]=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTSPipeline.dll
    LoadedModule[118]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\06aaaec444077ab8248b28b0bdf2c3b2\Microsoft.SqlServer.PipelineXML.ni.dll
    LoadedModule[119]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\3dce1dfa4ea9d165471de3b83b82893c\Microsoft.SqlServer.SqlTDiagM.ni.dll
    LoadedModule[120]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\4f58550979ad960c616b557266fbc5d0\Microsoft.SqlServer.Diagnostics.STrace.ni.dll
    LoadedModule[121]=C:\Windows\assembly\NativeImages_v2.0.50727_64\System.Configuration\fed86e49fe95761085bf287f901f5b53\System.Configuration.ni.dll
    LoadedModule[122]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\6f12b249b9bd278ddc1a5ea48fb1221e\Microsoft.SqlServer.DTSPipelineWrap.ni.dll
    LoadedModule[123]=C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\OleDbSrc.dll
    LoadedModule[124]=C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\OleDbDest.dll
    LoadedModule[125]=C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TxDerived.dll
    LoadedModule[126]=C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TxRowCount.dll
    LoadedModule[127]=C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\CommandDest.dll
    LoadedModule[128]=C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TxSplit.dll
    LoadedModule[129]=C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TxMergeJoin.dll
    LoadedModule[130]=C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TxSort.dll
    LoadedModule[131]=C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TxDataConvert.dll
    LoadedModule[132]=C:\Windows\assembly\NativeImages_v2.0.50727_64\Microsoft.SqlServer#\78f11d2ba751edf529c28ca8b012d697\Microsoft.SqlServer.PipelineHost.ni.dll
    LoadedModule[133]=C:\Windows\assembly\GAC_MSIL\KimballMethodSCD100\1.0.0.0__8b0551303405e96c\KimballMethodSCD100.dll
    LoadedModule[134]=C:\Windows\assembly\GAC_MSIL\SSISComponentUtilities\1.0.0.0__8b0551303405e96c\SSISComponentUtilities.dll
    LoadedModule[135]=C:\Windows\assembly\NativeImages_v2.0.50727_64\System.Drawing\868d117286ad259249f31d3fe813d39a\System.Drawing.ni.dll
    LoadedModule[136]=C:\Windows\assembly\NativeImages_v2.0.50727_64\CustomMarshalers\98e9b163a01ce659f1bb3d7ee15be7bf\CustomMarshalers.ni.dll
    LoadedModule[137]=C:\Windows\assembly\GAC_64\CustomMarshalers\2.0.0.0__b03f5f7f11d50a3a\CustomMarshalers.dll
    LoadedModule[138]=C:\Windows\system32\apphelp.dll
    LoadedModule[139]=C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TxLookup.dll
    LoadedModule[140]=C:\Program Files\Microsoft SQL Server\100\DTS\PipelineComponents\TxUnionAll.dll
    LoadedModule[141]=C:\Windows\assembly\NativeImages_v2.0.50727_64\EnvDTE\4a689bff8b507e736eea8e5b2b21d42a\EnvDTE.ni.dll
    FriendlyEventName=Stopped working
    ConsentKey=APPCRASH
    AppName=Data Transformation Services Execution Utility
    AppPath=C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTExec.exe

    Hi Arthur,
    Yes, both instances are the same.
    Microsoft SQL Server 2008 R2 (SP2) - 10.50.4263.0 (X64)   Aug 23 2012 15:56:56   Copyright (c) Microsoft Corporation  Enterprise Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1) (Hypervisor)
    I have run the main package using SQL Agent; the main package calls the child packages.
    The error message shown on SQL agent job is:
    R6025  - pure virtual function call.  The return value was unknown.  The process exit code was 255.
    or sometimes 
    The step did not generate any output.  The return value was unknown.  The process exit code was -532459699.
    In the event log it says:
    Error Level:
    Event ID 1000
    Faulting application name: DTExec.exe, version: 2009.100.4263.0, time stamp: 0x5036ba73
    Faulting module name: DTSPipeline.dll, version: 2009.100.4263.0, time stamp: 0x5036ba53
    Exception code: 0x40000015
    Fault offset: 0x00000000000a33c5
    Faulting process id: 0x98c
    Faulting application start time: 0x01cf64ba9b72b27c
    Faulting application path: C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTExec.exe
    Faulting module path: C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTSPipeline.dll
    Report Id: e8eb9b4f-d0ad-11e3-babd-005056997b14
    Information Level:
    Windows error reporting  Event ID 1001
    Fault bucket , type 0
    Event Name: APPCRASH
    Response: Not available
    Cab Id: 0
    Problem signature:
    P1: DTExec.exe
    P2: 2009.100.4263.0
    P3: 5036ba73
    P4: DTSPipeline.dll
    P5: 2009.100.4263.0
    P6: 5036ba53
    P7: 40000015
    P8: 00000000000a33c5
    P9:
    P10:
    Attached files:
    These files may be available here:
    C:\ProgramData\Microsoft\Windows\WER\ReportQueue\AppCrash_DTExec.exe_ccc7a4e176faafbea69955957371ea96e175b_44c41e3e
    Analysis symbol:
    Rechecking for solution: 0
    Report Id: e8eb9b4f-d0ad-11e3-babd-005056997b14
    Report Status: 4

  • XML file data load to NW2004s-BI

    Hi,
    I am trying to load XML file data into NW2004s BI.
    I have created the file DataSource, InfoSource and DataStore object. I have maintained the transformation rules between the DataSource and the InfoSource, and also between the InfoSource and the DataStore object.
    But I am not able to create the XML DataSource (BW DataSource with SOAP connection).
    Could any one please help me
    Thanks in advance.
    Amit

    Amit 
    Welcome to SDN.
    I don't know whether you checked this or not:
    Select your InfoSource > right-click > Change > go to Extras --> and select "Create BW DataSource with SOAP Connection". It will create the DataSource.
    Hope this helps
    Thanks
    Sat
    PS: Don't forget to assign points if the answer is useful. This is the way of telling thanks in SDN

  • Data transformation Service

    I need to add data to an Oracle data mart from a variety of databases, or from files created from these databases. Microsoft has the DTS product, which aids in performing data mappings and scripting data transformations from the source to the destination database. Is there an Oracle equivalent of this product? SQL*Loader does not measure up to what is needed.

    On Wikipedia, I see that DTS is a SQL Server component, so I don't think it's possible to use it outside SQL Server.
    But you can always tweak columns through normal queries, and even create a new table based on that, like
    CREATE TABLE Table_New AS
    SELECT First_Name || Last_Name AS Full_Name -- an expression in a CTAS needs a column alias
    FROM Table_Old;
    Hope that helps,
    K.
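    As a possible alternative when the source is another database rather than a file: if a database link can be created from the data mart to the source (via a heterogeneous gateway for non-Oracle sources), the load can also be written as a plain INSERT ... SELECT. A rough sketch, in which the link name src_db and all table and column names are placeholders only:

    -- Pull new rows across a database link into the data mart.
    INSERT INTO sales_fact (sale_id, full_name, amount)
    SELECT s.sale_id,
           s.first_name || ' ' || s.last_name,
           s.amount
    FROM   sales_stage@src_db s
    WHERE  s.sale_id NOT IN (SELECT f.sale_id FROM sales_fact f);

    COMMIT;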

  • Does Acrobat Pro read the content of a PDF file and transform it?

    Does Acrobat Pro read the content of a PDF file and transform it to an XLS file without the need for many changes or much manual work?

    Acrobat X (Standard and Pro) will save tabular data to XLS or XLSX format, provided it can recognize the table as being a table. If the PDF has missing or incorrect structure tags, Acrobat will try to guess the table layout by the position of text and lines on the page - this works well for basic formatting but if the table has complex styling, spanned cells etc. it can lead to problems.
    Acrobat X will even attempt to export a table within a scanned document, by applying OCR during the export stage - though again this relies on the table being visually identified.
    See http://www.adobe.com/products/acrobatpro/pdf-to-word-excel-converter.html and this article on how to extract one table from a larger document.

  • File to file, no transformation, no mapping, just picking and dropping.

    Hi Experts,
    I have a scenario where I have to pick up a .txt file from a Cisco directory and send it via XI, using a new 3rd-party SFTP adapter I have installed in XI, to a vendor as a .txt file. So XI basically needs to send the same file without any transformation.
    So in the IR I don't have data types, message types, a message mapping or an interface mapping to define; I am just picking up a file in XI and dropping it on a server using the 3rd-party SFTP adapter.
    Please guide me on the basic steps I need to create in the IR and ID.

    Hi Sunita,
    You have to follow the steps described in William's Blog:
    "How to send any data (even binary) through XI, without using the Integration Repository"
    /people/william.li/blog/2006/09/08/how-to-send-any-data-even-binary-through-xi-without-using-the-integration-repository
    regards,
    Juan.

  • List View: How to force an update of *actual* file dates when sorting by date?

    List View: How do I force an update of the actual file dates when sorting by date?
    When I go in, I often see the sort order and dates from 12-15 hours ago!
    Not good.

    Hi, did you ever get that Windows® Sharing thing worked out?
    On this problem: if it's just that you need the Finder to wake up to the fact that it needs to update the window, give Refresh Finder 1.3 a try...
    http://www.versiontracker.com/dyn/moreinfo/macosx/33066

  • What are the different ways to upload file data to SAP? Please help

    Hi Experts,
    I have to transfer huge file data (a few lakh records) into an SAP business transaction. What are the different ways to do this? I have heard of BDC and LSMW, but are there any more options?
    Which option is best suited for huge file data?
    Is LSMW an old technology that SAP will no longer support?
    Kindly answer my queries at the earliest.
    I will be grateful to you if you can help me.
    Thanks
    Gopal

    For uploading data from non-SAP sources we have 2 methods:
    1) if you know the BAPI, use LSMW;
    2) BDC.
    But you mentioned there are so many records, so the best thing is to upload all the records to AL11 using an XI interface and then write a BDC/LSMW program.
    It is better to go for LSMW; before that, try to find a BAPI. If you are unable to find a BAPI, you have to create one and use it in LSMW.
    After that, schedule the LSMW program in the background: create a job for it, release it from SM37, and then monitor it through BD87.
    If you want to go through it, I will help you.
    If it is useful to you, please give points.
    Saimedha
