Is it necessary to run the entire flat file?

In a BDC upload, the flat file has 10,000 records, but the system crashed partway through, after about 5,000 records had already been updated. Is it necessary to run the entire flat file again or not?
Can anybody explain both methods in detail?

Hi,
you can do this in two ways:
1) Delete the first 5,000 records from your flat file (this doesn't require any code changes), or
2) After GUI_UPLOAD you loop over the internal table to apply your mapping logic; at that point start the loop from index 5001, so only the records that were not yet posted are processed (see the sketch below).
Preferably use (1) rather than (2), because it doesn't require any code changes.
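For option (2), here is a minimal sketch of the restart logic, assuming the file has already been read into an internal table by GUI_UPLOAD and that exactly 5,000 records were posted before the crash; the report, table and variable names are illustrative, not taken from the original program:

REPORT zbdc_restart_sketch.

CONSTANTS lc_done TYPE i VALUE 5000.          " records already posted before the crash

DATA: lt_data  TYPE TABLE OF string,          " file contents, filled earlier by GUI_UPLOAD
      lv_line  TYPE string,
      lv_start TYPE i.

lv_start = lc_done + 1.

LOOP AT lt_data INTO lv_line FROM lv_start.
  " existing mapping logic / BDC processing goes here,
  " so only records 5001..10000 are processed again
ENDLOOP.
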
cheers,
Harish

Similar Messages

  • Java API for running entire ".sql" files on a remote DB (MySQL or Oracle)?

    Hi,
    Would anyone happen to know if there's a Java API for executing entire ".sql" files (containing several different SQL commands) on a remote database server?
    It's enough if the API works with MySQL and/or Oracle.
    Just to demonstrate what I'm looking for:
    Suppose you've created sql file "c:/test.sql" with several script lines:
    -- test.sql:
    insert into TABLE1 values(3,3);
    insert into TABLE1 values(5,5);
    create table TABLE2 (name VARCHAR(50)) ENGINE=InnoDB; -- MySQL specific
    Then the java API should look something like:
    // Dummy java code:
    String driver="com.mysql.jdbc.Driver";
    String url= "jdbc:mysql://localhost:3306/myDb";
    SomeAPI.executeScriptFile( "c:/test.sql", driver, url);
    Thanks.

    There is no such API, but it's easy to parse all the SQL statements in a file and then run those commands.
    For instance:
    import java.sql.*;
    import java.util.Properties;

    /* A demo showing how to load and run the SQL statements in a file. */
    public class testSQL {

        private final static Object[] getSQLStatements(java.util.Vector v) {
            Object[] statements = new Object[v.size()];
            Object temp;
            for (int i = 0; i < v.size(); i++) {
                temp = v.elementAt(i);
                if (temp instanceof java.util.Vector)
                    statements[i] = getSQLStatements((java.util.Vector) temp);
                else
                    statements[i] = temp;
            }
            return statements;
        }

        public final static Object[] getSQLStatements(String sqlFile) throws java.io.IOException {
            java.util.Vector v = new java.util.Vector(1000);
            try {
                java.io.BufferedReader br = new java.io.BufferedReader(new java.io.FileReader(sqlFile));
                java.util.Vector batchs = new java.util.Vector(10);
                String temp;
                while ((temp = br.readLine()) != null) {
                    temp = temp.trim();
                    if (temp.length() == 0)
                        continue;
                    switch (temp.charAt(0)) {
                        case '*':
                        case '"':
                        case '\'':
                            // System.out.println(temp);
                            break; // Ignore any line which begins with the above characters
                        case '#': // Used to begin a new sql statement
                            if (batchs.size() > 0) {
                                v.addElement(getSQLStatements(batchs));
                                batchs.removeAllElements();
                            }
                            break;
                        case 'S':
                        case 's':
                        case '?':
                            if (batchs.size() > 0) {
                                v.addElement(getSQLStatements(batchs));
                                batchs.removeAllElements();
                            }
                            v.addElement(temp);
                            break;
                        case '!': // Use it to get a large number of simple update statements
                            if (batchs.size() > 0) {
                                v.addElement(getSQLStatements(batchs));
                                batchs.removeAllElements();
                            }
                            String part1 = temp.substring(1);
                            String part2 = br.readLine();
                            for (int i = -2890; i < 1388; i += 39)
                                batchs.addElement(part1 + i + part2);
                            for (int i = 1890; i < 2388; i += 53) {
                                batchs.addElement(part1 + i + part2);
                                batchs.addElement(part1 + i + part2);
                            }
                            for (int i = 4320; i > 4268; i--) {
                                batchs.addElement(part1 + i + part2);
                                batchs.addElement(part1 + i + part2);
                            }
                            for (int i = 9389; i > 7388; i -= 83)
                                batchs.addElement(part1 + i + part2);
                            v.addElement(getSQLStatements(batchs));
                            batchs.removeAllElements();
                            break;
                        default:
                            batchs.addElement(temp);
                            break;
                    }
                }
                if (batchs.size() > 0) {
                    v.addElement(getSQLStatements(batchs));
                    batchs.removeAllElements();
                }
                br.close();
                br = null;
            } catch (java.io.FileNotFoundException fnfe) {
                v.addElement(sqlFile); // sqlFile is a sql command, not a file name
            }
            Object[] statements = new Object[v.size()];
            for (int i = 0; i < v.size(); i++)
                statements[i] = v.elementAt(i);
            return statements;
        }

        public static void main(String argv[]) {
            try {
                String url;
                Object[] statements;
                switch (argv.length) {
                    case 0: // Use it for the simplest test
                    case 1:
                        url = "jdbc:dbf:/.";
                        if (argv.length == 0) {
                            statements = new String[1];
                            statements[0] = "select * from test";
                        } else {
                            statements = argv;
                        }
                        break;
                    case 2:
                        url = argv[0];
                        statements = getSQLStatements(argv[1]);
                        break;
                    default:
                        throw new Exception("Syntax Error: java testSQL url sqlfile");
                }
                Class.forName("com.hxtt.sql.dbf.DBFDriver").newInstance();
                // Please see the Connecting to the Database section of Chapter 2. Installation in the Development Document
                Properties properties = new Properties();
                Connection con = DriverManager.getConnection(url, properties);
                Statement stmt = con.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE,
                                                     ResultSet.CONCUR_READ_ONLY);
                // Statement stmt = con.createStatement(ResultSet.TYPE_SCROLL_SENSITIVE, ResultSet.CONCUR_UPDATABLE);
                // stmt.setMaxRows(0);
                stmt.setFetchSize(10);
                final boolean serializeFlag = false; // A test switch to serialize/deserialize the resultSet
                ResultSet rs;
                for (int i = 0; i < statements.length; i++) {
                    if (statements[i] instanceof java.lang.String) {
                        String temp = (java.lang.String) statements[i];
                        switch (temp.charAt(0)) {
                            case 'S':
                            case 's':
                            case '?':
                                System.out.println(temp);
                                rs = stmt.executeQuery(temp);
                                if (serializeFlag) {
                                    // serialize the resultSet
                                    try {
                                        java.io.FileOutputStream fileOutputStream =
                                            new java.io.FileOutputStream("testrs.tmp");
                                        java.io.ObjectOutputStream objectOutputStream =
                                            new java.io.ObjectOutputStream(fileOutputStream);
                                        objectOutputStream.writeObject(rs);
                                        objectOutputStream.flush();
                                        objectOutputStream.close();
                                        fileOutputStream.close();
                                    } catch (Exception e) {
                                        System.out.println(e);
                                        e.printStackTrace();
                                        System.exit(1);
                                    }
                                    rs.close(); // Let the CONCUR_UPDATABLE resultSet release its open files at once.
                                    rs = null;
                                    // deserialize the resultSet
                                    try {
                                        java.io.FileInputStream fileInputStream =
                                            new java.io.FileInputStream("testrs.tmp");
                                        java.io.ObjectInputStream objectInputStream =
                                            new java.io.ObjectInputStream(fileInputStream);
                                        rs = (ResultSet) objectInputStream.readObject();
                                        objectInputStream.close();
                                        fileInputStream.close();
                                    } catch (Exception e) {
                                        System.out.println(e);
                                        e.printStackTrace();
                                        System.exit(1);
                                    }
                                }
                                ResultSetMetaData resultSetMetaData = rs.getMetaData();
                                int iNumCols = resultSetMetaData.getColumnCount();
                                for (int j = 1; j <= iNumCols; j++) {
                                    // System.out.println(resultSetMetaData.getColumnName(j));
                                    /* System.out.println(resultSetMetaData.getColumnType(j));
                                       System.out.println(resultSetMetaData.getColumnDisplaySize(j));
                                       System.out.println(resultSetMetaData.getPrecision(j));
                                       System.out.println(resultSetMetaData.getScale(j)); */
                                    System.out.println(resultSetMetaData.getColumnLabel(j)
                                        + " " + resultSetMetaData.getColumnTypeName(j));
                                }
                                Object colval;
                                rs.beforeFirst();
                                long ncount = 0;
                                while (rs.next()) {
                                    // System.out.print(rs.rowDeleted() + " ");
                                    ncount++;
                                    for (int j = 1; j <= iNumCols; j++) {
                                        colval = rs.getObject(j);
                                        System.out.print(colval + " ");
                                    }
                                    System.out.println();
                                }
                                rs.close(); // Let the resultSet release its open tables at once.
                                rs = null;
                                System.out.println("The total row number of resultset: " + ncount);
                                System.out.println();
                                break;
                            default:
                                int updateCount = stmt.executeUpdate(temp);
                                System.out.println(temp + " : " + updateCount);
                                System.out.println();
                        }
                    } else if (statements[i] instanceof java.lang.Object[]) {
                        int[] updateCounts;
                        Object[] temp = (java.lang.Object[]) statements[i];
                        try {
                            for (int j = 0; j < temp.length; j++) {
                                System.out.println(temp[j]);
                                stmt.addBatch((java.lang.String) temp[j]);
                            }
                            updateCounts = stmt.executeBatch();
                            for (int j = 0; j < temp.length; j++)
                                System.out.println((j + 1) + ":" + temp[j]);
                            for (int j = 0; j < updateCounts.length; j++)
                                System.out.println((j + 1) + ":" + updateCounts[j]);
                        } catch (java.sql.BatchUpdateException e) {
                            updateCounts = e.getUpdateCounts();
                            for (int j = 0; j < updateCounts.length; j++)
                                System.out.println((j + 1) + ":" + updateCounts[j]);
                            java.sql.SQLException sqle = e;
                            do {
                                System.out.println(sqle.getMessage());
                                System.out.println("Error Code:" + sqle.getErrorCode());
                                System.out.println("SQL State:" + sqle.getSQLState());
                                sqle.printStackTrace();
                            } while ((sqle = sqle.getNextException()) != null);
                        } catch (java.sql.SQLException sqle) {
                            do {
                                System.out.println(sqle.getMessage());
                                System.out.println("Error Code:" + sqle.getErrorCode());
                                System.out.println("SQL State:" + sqle.getSQLState());
                                sqle.printStackTrace();
                            } while ((sqle = sqle.getNextException()) != null);
                        }
                        stmt.clearBatch();
                        System.out.println();
                    }
                }
                stmt.close();
                con.close();
            } catch (SQLException sqle) {
                do {
                    System.out.println(sqle.getMessage());
                    System.out.println("Error Code:" + sqle.getErrorCode());
                    System.out.println("SQL State:" + sqle.getSQLState());
                    sqle.printStackTrace();
                } while ((sqle = sqle.getNextException()) != null);
            } catch (Exception e) {
                System.out.println(e.getMessage());
                e.printStackTrace();
            }
        }
    }

  • How to upload schedule lines from flat files to SAP

    Dear all,
    I want to upload schedule lines from flat files into SAP schedule lines.
    The flat files have 15 schedule lines each and the data is organized by date, so how do I upload that? Also, the fields available in the flat files are more than what the SAP screen offers.
    We have more than 6 items and 15 schedule lines, which is about 90 data records to upload for one customer every 15 days.
    So how can this be done? Is there any direct way on the functional side, without the help of any ABAP? My user will do it himself, so he needs a permanent solution.
    With regards
    Subrat

    Hi Subrat,
    You can upload the data (either master or transaction data) with the help of LSMW. All you need to do is work through the LSMW steps; there you can use batch input recording, a BAPI or an IDoc as the import method. I am sending you the LSMW notes; go through them and do the work.
    Once you create the LSMW project, you can either ask the user for the data or explain the program to the user, who can then run the flat file to upload the data.
    If you require the LSMW material, just send me a blank mail; my mail id is [email protected].
    Regards,
    Praveen Kumar.D

  • Running reports on Flat file schemas when OBIEE server is on Unix

    Hello
    We would like to know how OBIEE on Unix works with flat files. Currently, our development and production OBIEE environments run on Unix AIX machines, while our local OBIEE environment is on Windows.
    As our development and configuration progressed, we encountered situations where flat file schemas that were working well on the Windows OBIEE server would not run on Unix. The flat files are located in a Windows file directory and could be successfully imported into OBIEE through the Admin Tool.
    However, when updating row counts or running reports in Answers on the Unix OBIEE, it returns errors such as:
    State: HY000. Code: 472983136. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 16023] The ODBC function has returned an error. The database may not be available, or the network may be down. (HY000)
    The same activity works fine on our local machine where OBIEE is on Windows.
    We would like to get some help on how to work with flat files when OBIEE is running on Unix.

    Is this the same question as [this one|http://forums.oracle.com/forums/message.jspa?messageID=4018049#4018049]?

  • Segment_Unknown error encountered while running flat file recon

    When we try to run the 'SAP HRMS User Recon' scheduled task using a flat file generated from the SAP HR system, we get the error 'com.sap.conn.jco.AbapException: (126) SEGMENT_UNKNOWN: SEGMENT_UNKNOWN'.
    The complete stack trace is below:
    [2013-08-21T05:34:16.480+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] ====================================================
    [2013-08-21T05:34:16.485+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] oracle.iam.connectors.sap.common.parser.HRMDAParser : getSchema() : SEGMENT_UNKNOWN
    [2013-08-21T05:34:16.485+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] ====================================================[[
    [2013-08-21T05:34:16.485+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] ================= Start Stack Trace =======================
    [2013-08-21T05:34:16.485+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] oracle.iam.connectors.sap.common.parser.HRMDAParser : getSchema()
    [2013-08-21T05:34:16.485+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] SEGMENT_UNKNOWN
    [2013-08-21T05:34:16.485+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] Description : SEGMENT_UNKNOWN
    [2013-08-21T05:34:16.485+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] com.sap.conn.jco.AbapException: (126) SEGMENT_UNKNOWN: SEGMENT_UNKNOWN Message 257 of class EA type E, Par[1]: Z1P9200, Par[2]: 731[[
    at com.sap.conn.jco.rt.MiddlewareJavaRfc$JavaRfcClient.execute(MiddlewareJavaRfc.java:1807)
    at com.sap.conn.jco.rt.ClientConnection.execute(ClientConnection.java:1120)
    at com.sap.conn.jco.rt.ClientConnection.execute(ClientConnection.java:953)
    at com.sap.conn.jco.rt.RfcDestination.execute(RfcDestination.java:1191)
    at com.sap.conn.jco.rt.RfcDestination.execute(RfcDestination.java:1162)
    at com.sap.conn.jco.rt.AbapFunction.execute(AbapFunction.java:302)
    at oracle.iam.connectors.sap.common.parser.HRMDAParser.getSchema(Unknown Source)
    at oracle.iam.connectors.sap.hrms.tasks.SAPHRMSUserRecon.execute(Unknown Source)
    at com.thortech.xl.scheduler.tasks.SchedulerBaseTask.execute(SchedulerBaseTask.java:384)
    at oracle.iam.scheduler.vo.TaskSupport.executeJob(TaskSupport.java:145)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:611)
    at oracle.iam.scheduler.impl.quartz.QuartzJob.execute(QuartzJob.java:196)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
    at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:529)
    [2013-08-21T05:34:16.485+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] ================= End Stack Trace =======================
    [2013-08-21T05:34:16.488+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] ====================================================
    [2013-08-21T05:34:16.488+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] oracle.iam.connectors.sap.hrms.tasks.SAPHRMSUserRecon : execute() :
    [2013-08-21T05:34:16.488+02:00] [oim] [ERROR] [] [OIMCP.SAPH] [tid: OIMQuartzScheduler_Worker-5] [userId: oiminternal] [ecid: 0000K0GBkw2EWN05zzP5iW1Hvuam000002,0] [APP: oim#11.1.1.3.0] ====================================================[[


  • 'File is not open' error when running DTP for flat file upload in BI

    Dear Sir,
    We are trying to upload master data from a flat file, for which we have created the respective DTPs. When we run the DTPs we get an error saying 'The file '&&filepath&&' is not open'. We checked that the file is not open anywhere, but we still get this error. Could anyone please suggest what we have to do to solve this problem?
    Many Thanks
    Narendra

    Hi,
    I am currently facing the same error: "Error in Source system", "The file 'path' is not open".
    Could you please let me know how you resolved this issue?
    Many thanks.
    Regards,
    Madhusudhan

  • N1 Segment in ACH CTX Flat File Payment Run

    Hi All,
    I'm trying to create a flat file for ACH CTX using the PMW from a payment run.
    I copied the standard function modules and modified them as per our needs.
    My only question is that I don't see some segments, such as the N1 segment.
    I believe it contains the payor company, payee company and payee address. If it is missing, does that mean I have to add or activate it, or is the information simply missing in the transactional data (since there is no info in the transaction data, it is not displayed in the flat file)? Also, is the N1 segment a mandatory segment or not?
    Thanks in advance

    Venki,
    Are you using DMEE ?
    K.Kiran.

  • Does anyone know how to convert a Filemaker flat file to some other program that runs on Lion?

    I have a flat file of 200 recipes in Filemaker. It is searchable and prints the recipes on 4x6 cards.
    How can I get the same functionality without getting an expensive new version of FM that runs on Lion?

    In order to password protect files or folders, you need to create an encrypted disk image. Then whatever is copied to the disk image will be protected.
    The directions to do this are here
    http://support.apple.com/kb/ht1578
    Allan

  • Run BDC program when a flat file arrives

    Hi friends
    I want to schedule my BDC program so that it runs automatically as soon as the flat file arrives on the application server.
    Please help me.
    Cheers
    Vamshi

    Hi,
    Follow this procedure; it might help you.
    In the program, use the following logic:
    DATA lv_file TYPE string.             " expected application server path of the incoming file

    IF sy-batch = 'X'.                    " check whether the program runs as a background job
      OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
      IF sy-subrc = 0.
        " the file has arrived: process the data here
        CLOSE DATASET lv_file.
      ELSE.
        " file not there yet: do nothing, the next scheduled run will check again
      ENDIF.
    ENDIF.
    If you write this logic and schedule the background job to repeat at a certain interval, the flat file will be checked automatically; as soon as the file is found (SY-SUBRC = 0), processing starts, otherwise nothing is done.
    Hope this solves the problem.
    Thanks and Regards.
    Edited by: Ammavajjala Narayana on May 29, 2008 11:36 AM

  • How to remove the date extensions from a filename in SSIS Flat File Connection Manager dynamically at run time

    Hello,
    I have to load data from a CSV file into a SQL database. The file is placed into a directory by another program; the file name is always the same, but it has a different date/time extension depending on the time of day it is placed in the directory. Since I know the base file name ahead of time, I want to strip off the date/time extension so that I can load the file using the Flat File Connection Manager. I am trying to use a variable and the expression editor so that I can specify the file name dynamically, but I don't know how to write it correctly. I have a variable 'FileLocation' that holds the folder location where the file will be placed. The file, for example: MyFileName201410231230 (MyFileName is always the same, but the date/time will be different).
    Thanks,
    jkrish

    I don't want to use ForEach Loop because the files are placed by a FTP process 3 times a day at a specific time, for ex. at 10 AM, 12 PM and 3 PM. These times are pretty much fixed. The file name is same but the extension will have this day time stamp. I
    have to load each file only once for a particular reason, meaning I don't want to load the ones I already loaded. I am planning on setting up the SSIS process to load at 10:05, 12:05 and 3:05 daily. The files will be piling up in the folder. As it comes,
    I load them. At some point in time, I can remove the old ones so that they won't take up space in the server.  In fact, I don't have to keep the old ones at all since they are saved in a different folder anyways. I can ask the FTP process to
    remove the previous one when the new one arrives. So, at any point in time, there will be one file, but that file will have different extensions every time.
    I am thinking of removing the extensions before I load every time. If the file name is 'MyFileNamexxxxxxx.csv', then I want to change it to 'MyFileName.csv' and load it.
    Thanks,
    jkrish
    You WILL need to use it eventually because you need to iterate over each file.
    Renaming is unnecessary as one way or another you will need to put a processed file away.
    And having the file with the original extension intact will also help you troubleshoot.
    Arthur
    MyBlog
    Twitter

  • Problem while uploading data from flat file

    Hi friends,
    Suppose there are 100 records in the flat file: 20 records are uploaded without any problem, and then an error occurs while uploading the remaining data. Is it necessary to upload the entire data again, or should only the remaining data be uploaded?
    I have used the call transaction method for a purchase order application.
    Please give me a reply soon, it's urgent.
    Thanks & regards
    Priya

    Hi Hari,
    You only have to upload the remaining data.
    Since you have used the call transaction method, do one thing: trap the erroneous records at runtime and write them to another flat file. Then correct the data in the new flat file and run the BDC program again with this new flat file, along the lines sketched below.
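    A minimal sketch of that error-trapping idea, assuming the records are already in an internal table and the BDC data for each record is built by your existing mapping logic; the record structure, transaction code and error-file path below are illustrative assumptions, not taken from the original program:

    TYPES: BEGIN OF ty_record,                 " illustrative flat-file record structure
             raw TYPE string,
           END OF ty_record.

    DATA: lt_records TYPE TABLE OF ty_record,  " filled earlier, e.g. by GUI_UPLOAD
          ls_record  TYPE ty_record,
          lt_bdcdata TYPE TABLE OF bdcdata,
          lt_msgs    TYPE TABLE OF bdcmsgcoll,
          lv_errfile TYPE string VALUE '/tmp/po_upload_errors.txt'.   " illustrative path

    OPEN DATASET lv_errfile FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

    LOOP AT lt_records INTO ls_record.
      CLEAR: lt_bdcdata, lt_msgs.
      " ... build lt_bdcdata for this record with the existing mapping logic ...

      CALL TRANSACTION 'ME21' USING lt_bdcdata   " ME21 used only as an example
                              MODE 'N'
                              UPDATE 'S'
                              MESSAGES INTO lt_msgs.

      READ TABLE lt_msgs TRANSPORTING NO FIELDS WITH KEY msgtyp = 'E'.
      IF sy-subrc = 0.
        " this record failed: save it for correction and a later re-run
        TRANSFER ls_record-raw TO lv_errfile.
      ENDIF.
    ENDLOOP.

    CLOSE DATASET lv_errfile.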
    Regards
    ANUPAM

  • Unable to process flat-files without delimiters in ODI 11.1.1.7

    Hi,
    We have a requirement to process a flat file using ODI 11.1.1.7 (installed on Windows and running against an Oracle 11g database). The flat file is a normal ASCII file coming from a mainframe machine and has no delimiters (neither column nor row delimiters). The first 48 characters are ROW 1, characters 49 to 96 are ROW 2, and so on. I am unable to create a DataStore for this file in ODI. If I reverse engineer it using the file RKM, it creates a datastore with over 2000 columns, which is not the case (the actual file layout is given below). The absence of a row delimiter makes ODI think that the entire file has only one single row.
    I am also unable to create a DataStore manually: if I do not provide any value for the "row delimiter", it throws an error. However, I am able to process this file using SQL*Loader (there is a FIX option available where we can specify the length of a row).
    Is there a way to do it from within ODI? I tried to modify a KM to create a control file for SQL*Loader and execute it, but the control file is not getting generated as expected.
    File Layout:
    Column 1 - Warrant Number (Position 1 - 9) - Number
    Column 2 - Type (Position 10-10) - Number
    Column 3 - Warrant Amount (Position 11-18) - Packed Decimal Signed
    Column 4 - Issue Fund (Position 19-22) - String
    Column 5 - Issue Sub Fund (Position 23-24) - String
    Column 6 - Filler (Position 25-48) - String
    Thanks
    Srivatsan P

    Hi LuizFilipe,
    I also tried your method, but I am getting the error below when I try to view the data.
    See com.borland.dx.dataset.DataSetException error code:  BASE+62
    com.borland.dx.dataset.DataSetException: Execution of query failed.
    Chained exception:
    java.sql.SQLException: ODI-40439: Could not read heading rows from file
      at com.sunopsis.jdbc.driver.file.FileResultSet.<init>(FileResultSet.java:164)
      at com.sunopsis.jdbc.driver.file.impl.commands.CommandSelect.execute(CommandSelect.java:57)
      at com.sunopsis.jdbc.driver.file.CommandExecutor.executeCommand(CommandExecutor.java:33)
      at com.sunopsis.jdbc.driver.file.FilePreparedStatement.executeQuery(FilePreparedStatement.java:135)
      at com.borland.dx.sql.dataset.o.f(Unknown Source)
      at com.borland.dx.sql.dataset.QueryProvider.e(Unknown Source)
      at com.borland.dx.sql.dataset.JdbcProvider.provideData(Unknown Source)
      at com.borland.dx.dataset.StorageDataSet.refresh(Unknown Source)
      at com.borland.dx.sql.dataset.QueryDataSet.refresh(Unknown Source)
      at com.sunopsis.graphical.frame.edit.AbstractEditFrameGridBorland.initialize(AbstractEditFrameGridBorland.java:628)
      at com.sunopsis.graphical.frame.edit.AbstractEditFrameGridBorland.<init>(AbstractEditFrameGridBorland.java:869)
      at com.sunopsis.graphical.frame.edit.EditFrameTableData.<init>(EditFrameTableData.java:50)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
      at oracle.odi.ui.editor.AbstractOdiEditor$1.run(AbstractOdiEditor.java:176)
      at oracle.ide.dialogs.ProgressBar.run(ProgressBar.java:656)
      at java.lang.Thread.run(Thread.java:662)
    I created the file datastore with the following settings:
    File format: Fixed
    Heading (number of lines): 0
    Field separator: null
    Record separator: default (MS-DOS)
    Text delimiter: (blank)
    Decimal separator: (blank)
    Then, when I did the reverse engineering, I was able to see the row of data, and I then split it into C1, C2, C3, C4.
    Please let me know if I missed something.
    Thanks
    Himanshu

  • Loading multiple flat files at a time

    Hi experts,
    I have 15 flat files with the same data structure, so how do I load all the files without creating 15 InfoPackages?
    (Say all the files are on the desktop.)
    I have seen the option that ABAP code can be written in the InfoPackage on the extraction tab.
    Can anyone share the ABAP code?
    Regards
    Laxman.

    You can write dynamic code using ABAP if the number of flat files is not fixed; but if you always have exactly 15 flat files, just create multiple InfoPackages.
    For the dynamic approach, you have to store the file names (or a name pattern) in some table and read them at run time. You can use the function modules starting with BAPI_IPAK* to create/change/start a template InfoPackage, for example BAPI_IPAK_START. You can set InfoPackage parameters at run time using FM RSBATCH_MAINTAIN_PAR_SETTINGS.
    Kamaljeet

  • Can we load data in variable screen from flat file?

    Hello all,
    One of my users once asked me how she could run a report for some employees; I said she could just type them into the employee variable input box.
    She said she needs to run the report for 120 employees, and she has the employee numbers in an Excel sheet.
    What is the best way of doing this, if any?
    Thanks

    This can be done from a flat file which has the list of all values.
    If you are using BEx 3.5, there is an icon at the bottom right of the variable screen to upload values, where you can specify the path of the file.
    Naveen.a

  • HUGE amount of data in flat file every day to external system

    Hello,
    I have to develop several programs to export all table data into a flat file for an external system (e.g. the web).
    I have some concerns, for example whether it is possible in SAP to export all KNA1 data, which is a lot of data, into a flat file using an extraction like:
    SELECT * FROM KNA1 INTO TABLE TB_KNA1.
    I need some advice about these kinds of huge extractions.
    I also have to extract, from some tables, only the data changes: new records and deleted records. To do this I thought of developing a program that every day extracts all data from MARA and saves the extraction in a custom table like ZMARA; the next day, when the program runs, it compares the new extraction with the old one in the ZMARA table to identify the data changes, new records and deleted records. Is this the right approach? Could it have performance problems? Do you know of other methods?
    Thanks a lot!
    Bye

    You should not have a problem with this simple approach, transferring each row to the output file rather than reading all the data into an internal table first:

    DATA: lv_file TYPE string,               " target file on the application server
          wa_kna1 TYPE kna1.

    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    SELECT * FROM kna1 INTO wa_kna1.
      " note: in Unicode systems TEXT MODE needs a character-like work area;
      " convert any non-character fields to a character line first if required
      TRANSFER wa_kna1 TO lv_file.
    ENDSELECT.
    CLOSE DATASET lv_file.

    Thomas
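    For the delta-detection part of the question (comparing today's extract with yesterday's snapshot), a minimal sketch, assuming ZMARA is a custom table with the same structure as MARA; the table and the comparison key are assumptions for illustration, and note that a full extract of MARA still has to fit into memory with this approach:

    DATA: lt_new TYPE SORTED TABLE OF mara WITH UNIQUE KEY matnr,
          lt_old TYPE SORTED TABLE OF mara WITH UNIQUE KEY matnr,
          ls_new TYPE mara,
          ls_old TYPE mara.

    SELECT * FROM mara  INTO TABLE lt_new.   " today's data
    SELECT * FROM zmara INTO TABLE lt_old.   " yesterday's snapshot (custom table, assumed)

    " new or changed records
    LOOP AT lt_new INTO ls_new.
      READ TABLE lt_old INTO ls_old WITH TABLE KEY matnr = ls_new-matnr.
      IF sy-subrc <> 0.
        " new record: write it to the flat file as an insert
      ELSEIF ls_old <> ls_new.
        " changed record: write it to the flat file as an update
      ENDIF.
    ENDLOOP.

    " deleted records
    LOOP AT lt_old INTO ls_old.
      READ TABLE lt_new TRANSPORTING NO FIELDS WITH TABLE KEY matnr = ls_old-matnr.
      IF sy-subrc <> 0.
        " deleted record: write it to the flat file as a deletion
      ENDIF.
    ENDLOOP.

    " replace the snapshot so tomorrow's run compares against today's data
    DELETE FROM zmara.
    INSERT zmara FROM TABLE lt_new.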
