Loading issue in Target file

Hi All,
My source is Oracle 10g and my target is a flat file.
When I try to execute the interface, I get the below error in the Execution tab:
com.sunopsis.sql.SnpsMissingParametersException: Missing parameter
I have selected IKM SQL to SQL Append.
Regards,
Rama

Hi, can anyone help me? I am stuck on the ODI connection to Planning.

Similar Messages

  • Issue with Target File structure

    Hi Experts,
    Mine is a proxy-to-file scenario. Please advise on the requirement below. My target file structure should be as follows:
    HDR:PARTNER,CHDAT,NEW_TEL,OLD_TEL,PREF_CONTACT_NO,TEL,EMAIL_ID
    1029382,28.02.2011,0782191829,049829329,Y,3,0
    1029382,28.02.2011,0783484311,077383738,N,2,0
    1029382,28.02.2011,01972383934,0113938393,N,,0
    1039385,28.02.2011,0782133829,079829310,Y,3,0
    10245748,28.02.2011,N,,,,DAVID.WHITAKER@YAHOO.CO.UK
    112928393,28.02.2011,01183393843,01123839388,N,,Tom.Hanks@SKY.COM
    FTR:UPDATE_VENDOR_CONTACTS_SRM_TO_EIS,01.03.2011 02.18.29,6
    This structure has 3 parts: header, body and trailer.
    I am planning to create a target structure. Is it possible to get the field names of the structure in it?
    Header
         HeaderInfo :  HDR:PARTNER,CHDAT,NEW_TEL,OLD_TEL,PREF_CONTACT_NO,TEL,EMAIL_ID
    Body
    1029382,28.02.2011,0782191829,049829329,Y,3,0
    1029382,28.02.2011,0783484311,077383738,N,2,0
    1029382,28.02.2011,01972383934,0113938393,N,,0    
    Footer
         Footer Info :FTR:UPDATE_VENDOR_CONTACTS_SRM_TO_EIS,01.03.2011 02.18.29,6
    I will take care of the commas, the single line for header/footer, and the multiple lines for the body in content conversion.
    In the footer I need the date and timestamp along with the total number of rows in the body.
    How do I get these?
    Please suggest.
    Arnab.

    Hi,
    Yes, you can get the timestamp and the count of records in the footer.
    Please define your target structure like below:
    MT_
    Records
      Header (0 to 1)
        Headerdata
      Items (0 to unbounded)
        field1
        field2
      Footer
        Footerdata
    In the mapping, get the current date using the date function.
    Then pass the Items node to the count function.
    Concat both of them along with the constants you need and assign the result to Footerdata.
    Thanks
    Suma
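
    For reference, a minimal XSLT 2.0 sketch of the footer logic described above. Note that Suma actually describes graphical message mapping, not XSLT; the structure names follow her outline, while the root element name MT_Output, the job-name constant (taken from the sample file), and the date picture are assumptions:

    <xsl:stylesheet version="2.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <!-- MT_Output is a placeholder for the actual message type name -->
      <xsl:template match="/MT_Output">
        <MT_Output>
          <Records>
            <!-- pass the header and the detail items through unchanged -->
            <xsl:copy-of select="Records/Header"/>
            <xsl:copy-of select="Records/Items"/>
            <Footer>
              <!-- footer = constant + current timestamp + count of Items -->
              <Footerdata>
                <xsl:value-of select="concat(
                    'FTR:UPDATE_VENDOR_CONTACTS_SRM_TO_EIS,',
                    format-dateTime(current-dateTime(),
                        '[D01].[M01].[Y0001] [H01].[m01].[s01]'),
                    ',', count(Records/Items))"/>
              </Footerdata>
            </Footer>
          </Records>
        </MT_Output>
      </xsl:template>
    </xsl:stylesheet>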

  • XSLT issue: mapping happens even though the target file is missing

    Hi,
    In one of my composites, a provider ABCS transforms source to target; in the transformation file, source and target are referenced as usual, both pointing to MDS locations.
    The source file is physically present in MDS, but the target file is not, and still the transformation happens.
    The flow is like below:
    ProviderABCS -> adapter service which writes the file.
    Sometimes the file is created with data and sometimes it is created without data.
    My question is: how come no exception is caught? If I keep the target schema or WSDL file in MDS, will this issue get resolved?
    Thanks,
    Vijay

    When I try to open the XSLT file in the designer view, it complains that the target file is missing; but when I deploy, the deployment succeeds and some instances run.
    Still investigating; will update you soon. In the meantime, if anybody has any other idea, please update the thread.
    Thanks,
    Vijay

  • Urgent: Flat file load issue

    Hi Guru's,
    We are loading into an ODS data target via flat file. The problem is that the flat file shows the date field values correctly with 8 characters, but the preview shows only 7 characters and the load does not go through.
    Does anyone know where the problem is, i.e. why the preview screen shows 7 characters for the date when the flat file has 8?
    Thanks
    MK

    Hi Bhanu,
    How do I check whether a conversion is specified? Also, it is not just one field: we have 6 date fields, and all of them show 7 characters in the PSA after loading, whereas the flat file has 8 characters in all 6 date fields.
    In the PSA I checked the errors; there are 2 error messages.
    First error message:
    The value '1# ' from field /BIC/ZACTDAYS is not convertible into the DDIC data type QUAN of the InfoObject in data record 7. The field content could not be transferred into the communication structure format.
    System Response
    The data to be loaded has a data error, or field /BIC/ZACTDAYS in the transfer structure is not mapped to a suitable InfoObject.
    The conversion of the transfer structure to the communication structure was terminated. The processing of data records with errors was continued in accordance with the settings in error handling for the InfoPackage (tab page: Update Parameters).
    Procedure
    Check the data conformity of your data source in the source system.
    On the 'Assignment IOBJ - Field' tab page in the transfer rules, check the InfoObject-field assignment for transfer-structure field /BIC/ZACTDAYS.
    If the data was temporarily saved in the PSA, you can also check the received data for consistency in the PSA.
    If the data is available and is correct in the source system and in the PSA, you can activate debugging in the transfer rules by using update simulation on the Detail tab page in the monitor. This enables you to perform an error analysis for the transfer rules. You need experience of ABAP to do this.
    Second error message:
    Diagnosis
    The transfer program attempted to execute an invalid arithmetic operation.
    System Response
    Processing was terminated, and indicated as incorrect. You can carry out the update again any time you choose.
    Procedure
    1. Check that the data is accurate.
       - In particular, check that the data types and lengths of the transferred data agree with the definitions of the InfoSource, and that fixed values and routines defined in the transfer rules are correct.
       - The error may have arisen because you didn't specify the existing headers.
    2. If necessary, correct the data error and start processing again.
    Thanks
    MK

  • Issue with flat file load

    Hello,
    I am trying to load data via flat file. Entries in a field that start with 0s have them skipped when we load in CSV format:
    e.g. an entry 00657 is being loaded as 657.
    How can I prevent this?
    Thanks.
    Praveen

    Hi Praveen,
    The field is of length 5, but when I upload the CSV file the leading 00 is not kept, and I want those zeros. I am loading a hierarchy from a flat file.
    Can we also load from a text file? Please suggest.
    Thanks.
    Praveen
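
    If the zeros are already lost in the file (Excel drops them when it saves a CSV), one common remedy is the standard ALPHA conversion in a transfer/update routine, which re-pads numeric strings to the field length. A minimal sketch, assuming a 5-character InfoObject (the variable name here is made up):

    * Hedged sketch: restore lost leading zeros with the standard
    * ALPHA conversion exit
    DATA lv_node TYPE c LENGTH 5.

    lv_node = '657'.                   " value as it arrived from the CSV
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = lv_node
      IMPORTING
        output = lv_node.
    * lv_node now holds '00657'

    As for loading from text: a plain .txt file edited in a text editor avoids the problem entirely, since it is usually Excel, not the loader, that strips the zeros.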

  • Csv file loading issue

    I am trying to load data from a CSV file into an Oracle table.
    The interface executes successfully, but the problem is:
    there are 501 rows in the original CSV file,
    yet when I load it into the file model it shows only 260 rows.
    What is the problem? Why are not all the rows loaded?

    Just forget about the interface.
    I am creating a new datastore of file type.
    In the resource name, I am giving my file's path.
    When I reverse-engineer it and check the data, it shows only 260 rows.
    But there are 501 records in my CSV file.

  • Error while loading table from flat file (.csv)

    I have a flat file which I am loading into a target table in Oracle Warehouse Builder. OWB uses SQL*Loader internally to load the data from the flat file, and I am facing an issue. Please find the following error (an extract from the generated error log):
    SQL*Loader-500: Unable to open file (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv)
    SQL*Loader-552: insufficient privilege to open file
    SQL*Loader-509: System error: The data is invalid.
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    I believe this is a SQL*Loader error.
    Actually, the flat file resides on my machine (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv), and I am connecting to an Oracle server.
    Please suggest:
    is it required that I place the flat file on the Oracle server system?
    Regards,
    Ashoka BL

    Hi
    I am getting an error as well which is similar to that described above except that I get
    SQL*Loader-500: Unable to open file (/u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv)
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: The system cannot find the file specified.
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    The difference is that Ashoka was getting
    SQL*Loader-552: insufficient privilege to open file
    and I get
    SQL*Loader-553: file not found
    The initial thought is that the file does not exist in the directory specified or that I have spelt the filename incorrectly, but this has been checked and double-checked. The Unix directory also has read and write permissions.
    The error message also shows:
    Control File: C:\u21\oracle\owb_staging\WHITEST\source_depot\INV_LOAD_LABEL_INVENTORY.ctl
    Character Set WE8MSWIN1252 specified for all input.
    Data File: /u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv
    Bad File: C:\u21\oracle\owb_staging\WHITEST\source_depot\Durham_Inventory_Labels.bad
    As can be seen above, it seems to be trying to create the .ctl and .bad files on my C: drive instead of on the server in the same directory as the .csv file. The location is registered to the server directory /u21/oracle/owb_staging/WHITEST/source_depot.
    I am at a loss, as this works fine in development and I have just promoted all the development work to a systest environment using OMBPlus.
    The directory structure in development is the same as in systest, except that the data file is /u21/oracle/owb_staging/WHITED/source_depot/Durham_Inventory_Labels.csv, and everything works fine: .ctl and .bad are created in the same directory and the data successfully loads into an Oracle table.
    Have I missed a setting in OWB during the promotion to systest or is there something wrong in the way the repository in the systest database is setup?
    The systest and development databases are on the same box.
    Any help would be much appreciated
    Thanks
    Edwin
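
    For what it's worth, the BADFILE location can be pinned explicitly in the control file rather than letting it default to the control file's (client-side) directory; a hedged sketch using the paths from the log above (the table and column names are invented):

    LOAD DATA
    INFILE  '/u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv'
    BADFILE '/u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.bad'
    APPEND
    INTO TABLE label_inventory
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (label_id, item_code, qty)

    Note that SQL*Loader resolves these paths on the machine where sqlldr itself runs, which also bears on Ashoka's earlier question: the data file must be readable from wherever SQL*Loader executes, not necessarily from your PC.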

  • SPM Master Data Load Issue

    We are on SPM 3.0 SP3 (BW on HANA) and get the following error:
    "Data target EXPORT data cluster too large for application buffer () inactive; processing stopped. Activate data target"
    when we load a master data file with more than 4000 rows. Some of our files have 300,000 rows, and splitting the files into smaller chunks is not an option. We have tried the option provided by an SAP note to adjust the parameters of the InfoPackage. We have also tried tweaking the packet size in the RSXAADMIN table.
    Any help in resolving this issue is appreciated.
    Thanks,
    Mani

    Hi Mani,
    Select the InfoPackage and reduce "Maximum size of a data packet in rows" to 1000; its default value is 20000.
    If that still doesn't work, reduce it further, e.g. to 800 or less.
    Let me know if you need further help.
    Assign points if this helped.
    Regards,
    Aditya Panwar

  • SAP Mobile Sales 2.0 delta load issue for Sales Orders

    Hello,
    we have used Mobile Sales 2.0 with a Windows app for a while now. Our current issue is that sales reps won't see any historical sales order data on their devices.
    Background
    Due to customer requirements, we needed to make small changes to customer master data attributes and reload all customers from ERP to CRM. Then we ran delta loads (MAS_PARTNER followed by all other objects) to DOE, in which virtually all 5000+ customer accounts were compared. The delta load ran for about 3 days (some performance bottleneck we haven't located yet).
    During the delta load, data on devices was inconsistent. Accounts were missing and all transaction data disappeared. After the delta loads, all accounts and contacts are OK, save for a few. Data from activities (appointments, tasks) have reappeared, as they should. Only sales orders won't reappear. The sales orders exist in the backend and belong to active accounts and sales reps.
    Settings and troubleshooting so far
    We don't have any limitations for sales orders in CRM Sales Mobile configuration.
    We've run delta loads for all objects in transaction SDOE_LOAD.
    MAS_CUSTOMIZATION etc seem fine.
    We've re-run initial load for sales orders from CRM.
    In the test system, we've even reinitialized the whole CDS database on DOE and on the devices, then re-ran the loads.
    Checked steps suggested in discussion
    SAP CRM 2.0 initial load issue
    Historical sales orders (those created before the master data reload) exist in the backend, but don't show up on the device.
    If I change one of those historical sales orders in the backend, it gets sent to the device.
    If I create a new sales order in the backend or on the device, it is saved and replicated just fine.
    To sum it up, it seems DOE is unable to identify the sales orders relevant for replication.

    The first doubt I clarified by myself: we can go with the Unwired Runtime option.
    But I still have a doubt about this:
    2. How can I modify the main menu for iOS?
    I am able to customize it for Windows using the SybaseCRM.Configuration.xml file.
    How can I do the same for iPhone/iPad?

  • Data load issue with export data source - BW 3.5

    Hi,
    We are facing issues loading data with the help of an export data source.
    We have created an export data source for the 0PCA_C01 cube. With the help of this export data source, we load data into another custom cube. The scenario works fine in the development server.
    But after we transported the objects to the quality server, data is not getting loaded into the custom target cube.
    It extracts zero records. All transports are OK, and we generated the export data source in quality before the transports. We also regenerated the export data source after transport and activated the InfoSource and update rules via the RS* programs. Every object is active, but no data is extracted.
    RSA3 for the 80PCA_C01 data source does not extract any records in quality, while records are extracted in development. We are on BW 3.5, patch level 19.
    Please guide us to resolve the issue.
    Thanks,
    Aditya

    Hi
    Make sure that you have the relevant roles and authorizations at Quality/PRS.
    You have to transport the source cube first and then generate the export data source in QAS. Then replicate the data sources for the BW QAS source system, and make sure the data source is replicated in QAS. Only then can you transport the new update rules for the second cube.
    Hope this helps.

  • [Load data from excel file [1]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009. There may be error messa

    Error
    [Load data from excel file [1]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
    I am using BIDS in Microsoft Visual Studio 2008 and running the package to load the data from Excel.
    My machine has 32-bit Excel, hence I have set the Run64BitRuntime property to False.
    But the error still occurs.
    I checked on Google; many have set the DelayValidation property to True at the data flow task level, but even setting it at both the Excel connection manager and the DFT level doesn't work.
    Mudassar

    That's my connection string:
    Provider=Microsoft.ACE.OLEDB.12.0;Data Source=E:\SrcData\Feeds\Utilization.xlsx;Extended Properties="Excel 12.0;HDR=NO";
    Excel 2010 is installed and it is the 32-bit edition.
    Are you referring to installing this component, AccessDatabaseEngine_x64.exe?
    http://www.microsoft.com/en-us/download/details.aspx?id=13255
    Mudassar
    You can try an OLEDB provider in that case; see
    http://dataintegrity.wordpress.com/2009/10/16/xlsx/
    You might need to download and install the MS Access redistributable:
    http://www.microsoft.com/en-in/download/details.aspx?id=13255
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • Issue in moving files from data folder to processed folder in background

    Hi All,
    I am facing an issue moving files from the data folder to the processed folder in the case of background execution.
    When I execute the program in the foreground, the file moves from the data folder to the processed folder. I am using the SXPG_COMMAND_EXECUTE function module to move the file, and I can see the file in the processed folder once the program has run.
    But when the same program runs in the background, it gives the error "Failed to move the file to processed folder" in the spool in SM37, and I can see the file still lying in the data folder.
    I checked other programs that access the same folder; they are able to move the file to the processed folder successfully in both foreground and background mode.
    Please help me in resolving this issue.
    Thanks,
    Deepa

    Hi Sanu,
    Please use the following code to move the file from the source folder to the target folder.
    This code shows how to build and execute a UNIX command (mv/cp) from ABAP.
    * Input and processed file paths
    PARAMETERS:
      p_input TYPE localfile,
      p_proc  TYPE localfile.

    * Type for capturing the output of the command
    TYPES: BEGIN OF l_x_output,
             sys(200) TYPE c,
           END OF l_x_output.

    * Internal table to store the command output
    DATA l_i_output TYPE STANDARD TABLE OF l_x_output WITH HEADER LINE.

    * Variable for the UNIX command
    DATA: l_v_unix_comm(255) TYPE c.

    * Build the UNIX move command, e.g.
    *   mv /data/interfaces/input/input_file /data/interfaces/processed/processed_file
    * (mv moves/renames a file; cp would copy it instead)
    CONCATENATE 'mv' p_input p_proc
      INTO l_v_unix_comm SEPARATED BY space.

    * Execute the UNIX command; this moves the file from the input
    * path to the processed path
    CALL 'SYSTEM' ID 'COMMAND' FIELD l_v_unix_comm
                  ID 'TAB'     FIELD l_i_output-sys.

    IF sy-subrc EQ 0.
      WRITE: 'File moved successfully using a UNIX command in ABAP'.
    ENDIF.

  • Initial load of small target tables via flashback query?

    A simple question.
    Scenario: I’m currently building a near-real-time warehouse, streaming some basic fact and dimension tables between two databases. I’m considering building a simple package to "reset" or reinitialize the dimensions as an all-round fix for a variety of problem scenarios (since they are really small, around 15,000 rows each). The first time I loaded the target tables I used Data Pump with good success; however, since Streams transforms the data on the way, a complete reload is somewhat more complex.
    Considered solution: I’ll just write a nice flashback query over a db-link, fetching data as of a specific (recent) SCN, and then reinstantiate the table at that SCN in Streams.
    Is this a good idea? Or is there something obvious, like a green and yellow elephant in the gift shop, that I overlooked? The reason I’m worried at all is that the manuals do not mention this solution among the supported ways to do the initial load of a target table, and I’m thinking there may be a reason for that.

    I have a series of streams with some transformations feeding rather small dimension tables. I want to make this solution easy to manage even when operations encounter difficult replication issues, so I'm developing a PL/SQL package that will:
    1) Stop all streams
    2) Clear all errors
    3) Truncate the target tables
    4) Reload them, including the transformation (using a SELECT AS OF "ANY RECENT SCN" from target to source over a dblink)
    5) Using this recent SCN, re-instantiate the tables
    6) Start all streams
    As you can see, Data Pump, even when it works, is rather difficult to use when you transform data from A to B. Using AS OF I not only get a consistent snapshot from the source, I also get the exact SCN for it.
    What do you think? Can I safely use SELECT AS OF SCN instead of Data Pump with an SCN and still get a consistent solution?
    For the bigger fact tables I'm thinking about using the same SELECT AS OF SCN, but with only particular recent partitions as targets, and thus not having to reload the whole table.
    Anyway, this package would assure operations that they can recover from any kind of disaster or incomplete recovery on both source and target databases, and just re-instantiate the warehouse within minutes.
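
    A minimal SQL sketch of steps 3) to 5) for one small dimension (the table, column, and database link names are invented, and flashback queries across a database link carry version-specific restrictions, so this needs testing on your release):

    -- pick a recent SCN on the source
    SELECT current_scn FROM v$database@src_link;

    -- rebuild the dimension as of that SCN, transforming on the way
    TRUNCATE TABLE dim_customer;
    INSERT INTO dim_customer (cust_id, cust_name)
      SELECT cust_id, UPPER(cust_name)
      FROM   customers@src_link AS OF SCN :v_scn;

    -- re-instantiate so the apply process skips changes at or below that SCN
    BEGIN
      DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN(
        source_object_name   => 'SRC.CUSTOMERS',
        source_database_name => 'SRCDB.EXAMPLE.COM',
        instantiation_scn    => :v_scn);
    END;
    /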

  • How to create variable record length target file in SAP BODS

    Hi All
    I have a requirement to create a target file with various record layouts, i.e. different record lengths (similar to a COBOL file format), but for the target side. Please let me know the best practice and how to approach this requirement.
    Thanks
    Ash

    Hi Shiva,
    Thanks for your feedback. My issue is that I have 10 different detail record types (each record type is fixed length).
    For each customer account, I have to write the header record to the file, then the detail records in the exact order, then continue with the next account, and so on, and finally write the trailer record. A sample layout is given below. The record identifier in this example is the SAnn code (SA00, SA10, SA14, SA15, SA99), and the leading digits (e.g. 220700000010) are the account numbers. Fields are fixed length, right-padded with spaces or zeros.
    220700000000SA00    Wednesday     2014-12-12  ASA00034 334 000   ---> (this is header)
    220700000010SA10 AAb   00000+000000+ Akab xxxx bb   0000000000943 3433 --> (detail rec)
    220700000010SA14 AAA  00034354 DDD 000000000+    --> (detail rec)
    220700000010SA15 888e a88 00000000+            --> (detail rec)
    . . . . . remaining detail records
    220700000012SA10 AAb   00000+000000+ Akab xxxx bb   0000000000943 3433 --> (detail rec)
    220700000012SA14 AAA  00034354 DDD 000000000+   --> (detail rec)
    220700000012SA15 888e a88 00000000+           -->  (detail rec)
    . . . . . remaining detail records
    220700000000SA99    Wednesday     2014-12-12  d334 000   --> (this is the trailer)

  • Override One Property in a .targets file

    Hello,
    I have a question about MSBuild (asked on Stack Overflow) that seems to have stumped the community. I was wondering if someone here might be able to give some advice. Thank you!
    The question is as follows:
    Background: I have several solutions with roughly 300 C++ projects across them, most of them shared across solutions. We are using Visual Studio 2013 and have a build script that compiles all of the projects in the correct order, ensuring dependencies are resolved ahead of time. Our development/engineering team builds all of the code through the build script and then attempts to test/debug using Visual Studio 2013.
    Issue: The "build then debug" process results in Visual Studio telling us that the projects are out of date. This stems from the ProjectEvaluationFingerprint property (line 39 of Microsoft.CppBuild.targets) including $(SolutionDir) in the output file. The recommended fix from Microsoft suggests removing $(SolutionDir) from the file. As our developers tend to transition back and forth between projects, I do not want to manually change this .targets file on every developer's machine (and remember to change it back when they leave the project). I would like to override the property in the .vcxproj by using a .targets file explicitly for this.
    The property in Microsoft.CppBuild.targets looks like:
    <!-- Global up-to-date check support -->
    <PropertyGroup>
    <ProjectEvaluationFingerprint>$(Configuration)|$(Platform)|$(SolutionDir)|$(ProjectEvaluationFingerprint)</ProjectEvaluationFingerprint>
    </PropertyGroup>
    Generally, I have been following Microsoft's How to: Use the Same Target in Multiple Project Files. I have created a .targets file (test.targets) that contains the following code (note: the TEST text was to test evaluation of the property both in the build script and when building the project in Visual Studio):
    <Project ToolsVersion="12.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <PropertyGroup>
    <ProjectEvaluationFingerprint>$(Configuration)|$(Platform)|TEST|$(ProjectEvaluationFingerprint)</ProjectEvaluationFingerprint>
    </PropertyGroup>
    </Project>
    I then import it using the following line in the .vcxproj
    <Import Project="..\..\Config\VSPropertySheets\test.targets" />
    The project.lastbuildstate file now reads:
    #TargetFrameworkVersion=v4.0:PlatformToolSet=v120_xp:EnableManagedIncrementalBuild=false:VCToolArchitecture=Native32Bit
    Debug|Win32|D:\views\devbranch\Products\SLN\|Debug|Win32|TEST|
    It is appending the new ProjectEvaluationFingerprint to the existing one, so it is not overriding it (I can understand this to a degree, but I'm no MSBuild expert).
    Question: How can I override this one property using a .targets file? Do I need to use a replaceregexp task, or do I have an easier option?

    Hello K.Kennedy,
    The replaceregexp task is not an MSBuild task, so I'm afraid you cannot use it in your MSBuild project.
    I know you may have already used the solution listed on MSDN, but I'm not sure of your steps; please make sure you've done everything mentioned in the following blog:
    http://blogs.msdn.com/b/jomo_fisher/archive/2004/09/16/230631.aspx
    The key point here seems to be that "MSBuild chooses the last target seen", which may help with your issue.
    Best regards,
    Barry
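
    Following the "last definition wins" point above: the doubled value in .lastbuildstate suggests the $(ProjectEvaluationFingerprint) self-reference in test.targets picks up the value already set by Microsoft.CppBuild.targets and appends to it. A hedged variant that replaces the value instead, assuming the <Import> can be placed after the Microsoft.CppBuild.targets import in the .vcxproj:

    <Project ToolsVersion="12.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
        <!-- No $(ProjectEvaluationFingerprint) self-reference here, so this
             definition overrides the one from Microsoft.CppBuild.targets
             instead of appending to it. -->
        <ProjectEvaluationFingerprint>$(Configuration)|$(Platform)|TEST</ProjectEvaluationFingerprint>
      </PropertyGroup>
    </Project>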
