Problem loading data from the PSA to the InfoCube

Hello experts.
I'm having a problem loading data from the PSA to the InfoCube.
I'm using a DTP for this process, but the following error occurs:
"Diagnosis
      An error occurred while executing the transformation rule:
      The exact error message is:
      Overflow converting from''
      The error was triggered at the point in the Following Program:
      GP4KMDU7EAUOSBIZVE233WNLPIG 718
  System Response
      Processing the record date has Been terminated.
Procedure
      The Following is additional information included in the higher-level
     node of the monitor:
     Transformation ID
     Data record number of the source record
     Number and the name of the rule Which produced the error
Procedure for System Administration
I have already created new DTPs and deactivated and reactivated the InfoCube and the transformation, but that solves nothing.
Does anyone have any idea what to do?
Thank you.

Hi,
Is it a flat file load, or are you loading from another DataSource?
Try to execute the program GP4KMDU7EAUOSBIZVE233WNLPIG in SE38 and check that it is active and has no syntax errors (the 718 in the message is the line number within that program).
Check the mapping of the fields in the transformation: whether
some data fields are mapped to decimal targets, a CHAR 32 field is mapped to a RAW 16 field,
or CALWEEK/CALMONTH is mapped to CALDAY, etc.
Check in ST22 whether there are any short dumps.
Regards
KP
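For context, "Overflow converting" means a source value did not fit the type or length of the target field in the generated transformation program. A plain-Java illustration of the same class of failure (this is not the BW code itself, just the pattern):

import java.math.BigDecimal;

public class OverflowDemo {
    public static void main(String[] args) {
        // A source value wider than the target type can hold, analogous to
        // mapping a long source field onto a narrow numeric target field.
        BigDecimal source = new BigDecimal("123456789012345678901234567890");
        try {
            int target = source.intValueExact(); // throws ArithmeticException on overflow
            System.out.println(target);
        } catch (ArithmeticException e) {
            System.out.println("Overflow converting from '" + source + "'");
        }
    }
}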

Similar Messages

  • Problem accessing data from Pointbase embedded in the application server

    When I try to access any information from a table in the PointBase database through my JSP file, it gives the error: "Invalid table name 'Table-name' specified at position 'position'".
    I am using the Sun application server installed on a Solaris machine, and I created a new table in the sample database's PBPUBLIC schema (a default schema that comes with the embedded PointBase database). When I try to get data from the table I created, I get the error above. The connection code I use is:
    String l_driver = "com.pointbase.jdbc.jdbcUniversalDriver";
    Class.forName(l_driver).newInstance();
    // The URL for the sample PointBase database
    String l_URL = "jdbc:pointbase:embedded:sample";
    // Database UserID
    String l_UID = "pbpublic";
    // Database Password
    String l_PWD = "pbpublic";
    // Establish connection with the database and return a Connection object
    Connection con = DriverManager.getConnection(l_URL, l_UID, l_PWD);
    It connects me to the database but does not fetch results for the query I execute.
    Can anyone suggest what I am supposed to do to resolve this error?
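    For reference, a minimal query against that same connection looks like the sketch below. The table name MY_TABLE is only a placeholder for the table created above, and schema-qualifying it as PBPUBLIC.MY_TABLE is one thing worth trying, since an unqualified name is resolved against the session's current schema:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class PointBaseQueryTest {
        public static void main(String[] args) throws Exception {
            Class.forName("com.pointbase.jdbc.jdbcUniversalDriver").newInstance();
            Connection con = DriverManager.getConnection(
                    "jdbc:pointbase:embedded:sample", "pbpublic", "pbpublic");
            // MY_TABLE is a placeholder -- replace it with the table you created.
            Statement st = con.createStatement();
            ResultSet rs = st.executeQuery("SELECT * FROM PBPUBLIC.MY_TABLE");
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
            rs.close();
            st.close();
            con.close();
        }
    }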

    I've never used an embedded db. My connection settings are:
    <jdbc-connection-pool connection-validation-method="auto-commit" datasource-classname="com.pointbase.jdbc.jdbcDataSource" fail-all-connections="false" idle-timeout-in-seconds="300" is-connection-validation-required="false" is-isolation-level-guaranteed="true" max-pool-size="32" max-wait-time-in-millis="60000" name="jdbc-pointbase-pool" pool-resize-quantity="2" steady-pool-size="8">
    <property name="DatabaseName" value="jdbc:pointbase:server://localhost:9092/sample"/>
    <property name="Password" value="pbpublic"/>
    <property name="User" value="pbpublic"/>
    </jdbc-connection-pool>
    -- markus.
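    A pool defined like that is then consumed through a JNDI lookup rather than DriverManager. A minimal sketch, assuming a jdbc-resource with the JNDI name jdbc/pointbase has been mapped to the pool above (that resource name is an assumption, not part of the config shown):

    import java.sql.Connection;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class PooledLookup {
        public static Connection open() throws Exception {
            // jdbc/pointbase is assumed to be bound to the pool defined above.
            InitialContext ctx = new InitialContext();
            DataSource ds = (DataSource) ctx.lookup("jdbc/pointbase");
            return ds.getConnection(); // credentials come from the pool properties
        }
    }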

  • Problem loading data from a text file

    Hi,
    I'm having a problem loading my CSV file into the database. I'm using Oracle Database 10g for Linux. I'm on p. 228 of the book. This is what my CSV file looks like:
    db_name     db_version     host_id
    db10     9.2.0.7     1
    db11     10.2.0.1     1
    db12     10.2.0.1     1
    db13     9.2.0.7     1
    db14     10.2.0.1     1
    db15     9.2.0.7     1
    I loaded this data into an existing table called DATABASES, with the separator set to tab and the file character set to Unicode UTF-8. Then I browsed to the CSV file to be uploaded. The upload form looked like this:
    File Name: F23757437/db2.csv
    Separator:
    Optionally Enclosed By:
    First row contains column names.
    File Character Set:
    I clicked Next, and this is what it looked like:
    Schema:      HANDSONXE06
    Table Name:      DATABASES
    Define Column Mapping      
    Column Names     %
    Format     
    Upload     yes
    Row 1     "db10" "9.2.0.7" 1
    Row 2     "db11" "10.2.0.1" 1
    Row 3     "db12" "10.2.0.1" 1
    Row 4     "db13" "9.2.0.7" 1
    Row 5     "db14" "10.2.0.1" 1
    Row 6     "db15" "9.2.0.7" 1
    I clicked Load, and this was the result:
    * There are NOT NULL columns in HANDSONXE06.DATABASES. Select to upload the data without an error.
    The wizard then returned to the Column Mapping step (Schema > Table Name > File Details > Column Mapping > Load Data), showing the same schema, table, and six rows again.
    I was really wondering what was wrong. The error message said there are NOT NULL columns in HANDSONXE06.DATABASES, and I didn't know how to fix it. What do I need to change to load the data without an error? This really confused me. Please help me; I cannot go forward because of this.
    Thanks,
    Jocelyn

    I'm not certain which utility you are using to load the data; however, I completed the following test using SQL*Loader to insert the data into my table. Your process should work similarly if the trigger and sequence are created for the table you are loading.
    SQL> create table load_tbl
      2  (db_id number(3) not null,
      3   db_name varchar2(100) not null,
      4   db_version varchar2(25),
      5   host_id number(3) not null)
      6  /
    Table created.
    SQL> desc load_tbl
    Name                                      Null?    Type
    DB_ID                                     NOT NULL NUMBER(3)
    DB_NAME                                   NOT NULL VARCHAR2(100)
    DB_VERSION                                         VARCHAR2(25)
    HOST_ID                                   NOT NULL NUMBER(3)
    SQL> create sequence db_id_seq;
    Sequence created.
    SQL> create or replace trigger db_id_trig
      2  before insert on load_tbl
      3  for each row
      4  when (new.db_id is null)
      5  begin
      6    select db_id_seq.nextval into :new.db_id from dual;
      7  end;
      8  /
    Trigger created.
    The contents of the data file, control file and log file are below for the load into load_tbl.
    C:\>sqlldr userid=username/password@db control=db_id_load.ctl log=db_id_load.log
    SQL*Loader: Release 9.2.0.6.0 - Production on Thu Jan 18 17:21:47 2007
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Commit point reached - logical record count 6
    C:\>
    SQL> select * from load_tbl
      2  /
         DB_ID DB_NAME              DB_VERSION                   HOST_ID
             1 db10                 9.2.0.7                            1
             2 db11                 10.2.0.1                           1
             3 db12                 10.2.0.1                           1
             4 db13                 9.2.0.7                            1
             5 db14                 10.2.0.1                           1
             6 db15                 9.2.0.7                            1
    6 rows selected.
    SQL>
    Data File"db10" "9.2.0.7" 1
    "db11" "10.2.0.1" 1
    "db12" "10.2.0.1" 1
    "db13" "9.2.0.7" 1
    "db14" "10.2.0.1" 1
    "db15" "9.2.0.7" 1
    Control file:
    LOAD DATA
    INFILE "C:\db_id_load.dat"
    APPEND INTO TABLE load_tbl
    FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (db_name CHAR,
    db_version CHAR,
    host_id "TO_NUMBER(:host_id,'99999999999')")
    Log file:
    SQL*Loader: Release 9.2.0.6.0 - Production on Thu Jan 18 17:21:47 2007
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Control File:   db_id_load.ctl
    Data File:      C:\db_id_load.dat
      Bad File:     db_id_load.bad
      Discard File:  none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table LOAD_TBL, loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
       Column Name                  Position   Len  Term Encl Datatype
    DB_NAME                             FIRST     *  WHT O(") CHARACTER           
    DB_VERSION                           NEXT     *  WHT O(") CHARACTER           
    HOST_ID                              NEXT     *  WHT O(") CHARACTER           
        SQL string for column : "TO_NUMBER(:host_id,'99999999999')"
    Table LOAD_TBL:
      6 Rows successfully loaded.
      0 Rows not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.
    Space allocated for bind array:                  49536 bytes(64 rows)
    Read   buffer bytes: 1048576
    Total logical records skipped:          0
    Total logical records read:             6
    Total logical records rejected:         0
    Total logical records discarded:        0
    Run began on Thu Jan 18 17:21:47 2007
    Run ended on Thu Jan 18 17:21:47 2007
    Elapsed time was:     00:00:00.39
    CPU time was:         00:00:00.13

  • Problem loading data from jena

    Hi, we have two issues when loading data into Oracle from a Jena model:
    1. Incremental and batch load both work well, except when we add a triple with a literal typed as double:
    triple = new Triple(dirNode.asNode(), Node.createURI("http://www.w3.org/2003/01/geo/wgs84_pos#long"), Node.createLiteral(geopos.getLongitude().toString(), null, (RDFDatatype) XSDDatatype.XSDdouble));
    graph.add(triple);
    We get the error:
    SEVERE: Could not add triple
    java.sql.BatchUpdateException: ORA-55303: SDO_RDF_TRIPLE_S constructor failed: Simple case: SQLERRM=ORA-55328: failed when attempting to insert the literal value "-5.9278863"^^<http://www.w3.org/2001/XMLSchema#double>
    ORA-06512: at "MDSYS.MD", line 1723
    ORA-06512: at "MDSYS.MDERR", line 17
    ORA-06512: at "MDSYS.SDO_RDF_TRIPLE_S", line 211
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1335)
         at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3449)
         at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:3530)
         at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeUpdate(OraclePreparedStatementWrapper.java:1062)
    2. The bulk load simply does not work:
    ((OracleBulkUpdateHandler) graph.getBulkUpdateHandler()).addInBulk(GraphUtil.findAll(model.getGraph()), "sem_ts");
    We get:
    01-Oct-2009 13:11:39 oracle.spatial.rdf.client.jena.SimpleLog warn
    WARNING: addInBulk: [92 ] sqle
    java.sql.SQLException: ORA-44004: invalid qualified SQL name
    ORA-06512: at "SYS.DBMS_ASSERT", line 188
    ORA-06512: at "MDSYS.SDO_RDF", line 242
    ORA-06512: at "MDSYS.RDF_APIS", line 693
    ORA-06512: at line 1
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:439)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:395)
         at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:802)
    In both cases our connection string is something like:
    public static String conexion = "jdbc:oracle:thin:user/pass@ourserver:1521:ourdb";
    Any idea? Thanks

    Hi Wu, we have included your code in a Java test and got the same problem.
    Our installation is Oracle Database 11.2.0.1.0. We then added the 'Semantic patch' 11_2_sem and followed the instructions to create a tablespace and the RDF/SEM network. Finally we created a model as [explained here|http://download.oracle.com/docs/cd/E11882_01/appdev.112/e11828/sdo_rdf_concepts.htm#CHDEDFFA].
    The exception text (originally in Spanish) basically says that inserting the literal value failed. The rest of the data in the app has been inserted correctly.
    This is the Java test:
    public class PruebaOracleTest extends TestCase {
        String jdbcUrl = "jdbc:oracle:thin:user/pass@server:1521:bd";

        public void testInsertData() throws Exception {
            Oracle oracle = new Oracle(jdbcUrl, null, null);
            GraphOracleSem graph = new GraphOracleSem(oracle, "ARTICLES");
            ModelOracleSem model = new ModelOracleSem(graph);
            Model inMemoryJenaModel = ModelFactory.createDefaultModel();
            long lStartTime = System.currentTimeMillis();
            System.out.println("testCustomerMisc: start");
            Triple t = new Triple(Node.createURI("http://sub"),
                    Node.createURI("http://www.w3.org/2003/01/geo/wgs84_pos#long"),
                    Node.createLiteral("-5.9278863", null,
                            (RDFDatatype) XSDDatatype.XSDdouble));
            graph.add(t);
            graph.flushAdd();
            String queryString = "SELECT * WHERE { ?subject ?predicate ?object . }";
            Query query = QueryFactory.create(queryString);
            QueryExecution qexec = QueryExecutionFactory.create(query, model);
            ResultSet results = qexec.execSelect();
            ResultSetFormatter.out(System.out, results, query);
        }

        public void testListTriples() throws Exception {
            Oracle oracle = new Oracle(jdbcUrl, null, null);
            GraphOracleSem graph = new GraphOracleSem(oracle, "ARTICLES");
            int cont = 0;
            ExtendedIterator it = graph.find(Triple.ANY);
            while (it.hasNext() && cont < 100) {
                Triple t = (Triple) it.next();
                System.out.println(t.toString());
                cont++;
            }
            graph.close();
            oracle.dispose();
        }

        public void testCleanModel() throws Exception {
            Oracle oracle = new Oracle(jdbcUrl, null, null);
            GraphOracleSem graph = new GraphOracleSem(oracle, "ARTICLES");
            ModelOracleSem model = new ModelOracleSem(graph);
            model.removeAll();
            graph.close();
            oracle.dispose();
        }
    }
    And this is the exception we get:
    java.sql.SQLException: ORA-55303: SDO_RDF_TRIPLE_S constructor failed: Simple case: SQLERRM=ORA-55328: failed when attempting to insert the literal value "-5.9278863"^^<http://www.w3.org/2001/XMLSchema#double>
    ORA-06512: at "MDSYS.MD", line 1723
    ORA-06512: at "MDSYS.MDERR", line 17
    ORA-06512: at "MDSYS.SDO_RDF_TRIPLE_S", line 211
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:439)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:395)
         at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:802)
         at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:436)
         at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:186)
         at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:521)
         at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:205)
         at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:1008)
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1307)
         at oracle.jdbc.driver.OraclePreparedStatement.sendBatch(OraclePreparedStatement.java:3753)
         at oracle.jdbc.driver.OraclePreparedStatementWrapper.sendBatch(OraclePreparedStatementWrapper.java:1140)
         at oracle.spatial.rdf.client.jena.GraphOracleSem.flushAdd(GraphOracleSem.java:1219)
         at org.fundacionctic.ogd.data.support.PruebaOracleTest.testInsertData(PruebaOracleTest.java:42)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at junit.framework.TestCase.runTest(TestCase.java:154)
         at junit.framework.TestCase.runBare(TestCase.java:127)
         at junit.framework.TestResult$1.protect(TestResult.java:106)
         at junit.framework.TestResult.runProtected(TestResult.java:124)
         at junit.framework.TestResult.run(TestResult.java:109)
         at junit.framework.TestCase.run(TestCase.java:118)
         at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
         at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
         at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
         at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
         at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
         at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
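    As a cross-check, the failing literal can be validated and round-tripped through a plain in-memory Jena model first, which separates "the literal is malformed" from "the Oracle side rejects it". A minimal sketch using only the standard Jena 2.x API (nothing Oracle-specific):

    import com.hp.hpl.jena.datatypes.xsd.XSDDatatype;
    import com.hp.hpl.jena.rdf.model.Literal;
    import com.hp.hpl.jena.rdf.model.Model;
    import com.hp.hpl.jena.rdf.model.ModelFactory;
    import com.hp.hpl.jena.rdf.model.Property;
    import com.hp.hpl.jena.rdf.model.Resource;

    public class DoubleLiteralCheck {
        public static void main(String[] args) {
            String lex = "-5.9278863"; // the lexical form from the failing insert
            System.out.println("valid xsd:double? " + XSDDatatype.XSDdouble.isValid(lex));

            // Round-trip the same statement through an in-memory model.
            Model m = ModelFactory.createDefaultModel();
            Resource s = m.createResource("http://sub");
            Property p = m.createProperty("http://www.w3.org/2003/01/geo/wgs84_pos#long");
            Literal o = m.createTypedLiteral(lex, XSDDatatype.XSDdouble);
            m.add(s, p, o);
            m.write(System.out, "N-TRIPLE");
        }
    }

    If this prints the triple cleanly, the literal itself is well formed and the failure is on the database side (for example, the patch or network setup).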

  • Loading Data from an Infoset to an Infocube

    I have been searching this forum but am still confused by the different answers I have found.
    I want to know if it is possible to create a DTP and transformations from an InfoSet to an InfoCube. I know the InfoSet itself does not contain any data; it is just a join between different InfoProviders. I just want to know if it is possible.
    Edited by: Wasipati on Jan 25, 2011 9:54 PM

    Hi,
    Yes, an InfoSet can be used as the source of a transformation.
    The source can be an InfoSet, an InfoObject, a DataStore object, an InfoCube, a DataSource, or an InfoSource.
    The target can be an InfoObject, a DataStore object, an InfoCube, an open hub destination, or an InfoSource.
    Since an InfoSet does not store any data physically, it fetches the data from the underlying InfoProviders (such as DSOs) through joins.
    Hope it helps!
    Regards
    Lekshmi

  • Loading Data from R/3 to 3.5 AND 7.0

    Hi,
    We have a BW 3.5 system that has been extracting sales data from R/3 for the past few years. We now need to load the same data into a new BI 7.0 system. I saw in note 775568 that we can load data from one R/3 system into multiple BW systems. But the question is: how do we run an init for the new BI system when delta extraction has already been running for a long time? The other option would be to move the historical data from 3.5 to 7.0, but is there any way to load the data from R/3 into the new BI?
    Thanks in advance.
    Cheers
    AN

    Hi,
    It is possible to have multiple BW/BI systems for one R/3 (ECC) system. In order to load data into both the 3.5 and 7.0 systems, you need to define a BI 7.0 logical system in R/3, just like your BW 3.5 logical system. In R/3, the outbound queue (SMQ1) differentiates the deltas based on the logical system names you have defined (3.5 and 7.0). So in RSA7 you will see the same delta queue name twice, differentiated by logical system name, and you will not have problems loading deltas. Hope this sheds some light.
    Regards,
    Rao.

  • Error while loading data from application server

    Hi all,
    I am facing a problem while loading data from the application server.
    The error I get is:
    "The argument 'Rental/Lease' cannot be interpreted as a number while assigning character to application structure."
    'Rental/Lease' is a value for a character InfoObject of length 30. I checked the sequence of fields in the DataSource against the sequence of values I am receiving from the application server, and they match.
    When I copy these values into a CSV file on the desktop and load it, the load is successful.
    Please let me know your views.
    Thanks&Regards,
    Praveen

    It looks like the system is trying to convert 'Rental/Lease' to a number format. Is the InfoObject type CHAR or NUMC? I would look there.
    Also, make sure '/' is an allowed character in RSKC.
    Brian

  • Loading data to Headcount and Personnel Actions InfoCube (0PA_C01) from R/3

    Hi Gurus,
    I'm loading data from R/3 into my InfoCube 0PA_C01, from two InfoSources: 0HR_PA_0 (DataSource 0HR_PA_0) and 0HR_PA_1. The InfoSource 0HR_PA_1 in turn has three DataSources: 0EMPLOYEE_ATTR, 0PERSON_ATTR, and 0HR_PA_1. So on the whole I have four DataSources and therefore four InfoPackages.
    Now the problem is that when I load an employee from R/3, the data gets loaded into the cube as four lines, one per InfoPackage. But I want to see all the employee details in one line. Any ideas on how to load the data into one line?
    Regards,
    Ramesh

    Why are you loading master data into the cube?
    Load the master data separately, and whatever attributes of the master data you want, make them master data attributes; then load the transaction data using 0HR_PA_0 and 0HR_PA_1.
    Also, there could be four records in the master data; that may be why you are getting four records, depending on the time-dependent InfoObjects.
    You can check in 0EMPLOYEE and 0PERSON which attributes are time-dependent; that determines the records you are getting.
    Because of BEGDA and ENDDA, the records change in R/3, so you will have the same set of records in BW when you fetch that data.
    Hope this helps.

  • Loading Data From File Through EM on Mac OS X

    Has anyone been able to load data from a file through the Oracle Enterprise Manager on Mac OS X?
    Specifically, I'm trying to follow the example in the 2-day DBA manual (Chapter 8, "Managing Schema Objects" --> "Loading Data Into Tables").
    When I enter the appropriate user name and password entries into the fields under the "Host Credentials" section, and I click the "Next" button, I run into a "Validation Error." The rest of the error message reads:
    "Examine and corrent the following errors, then retry the operation: Error - RemoteOperationException: ERROR: command process interrupted before exit"
    Has anyone else encountered the same error? If so, would you please share the solution? I'm also curious to know how to diagnose the problem (i.e., which logs can I reference to trace what's happening in the background?).
    Thank you.

    I used to get an error message stating that I had entered the wrong password, whether or not I actually entered it correctly. Then I modified the NetInfo entries for the local user account I was using: I changed the "authentication_authority" property to ";basic;" and changed the "passwd" property to a traditional Unix hash password (using the 'openssl passwd <password>' command). All changes were done using the NetInfo Manager.
    After I made the change, I got a little further, but only as far as the problem that I described at the start of this message thread.
    Likewise, I hope someone from Oracle can actually provide some insight into this.

  • Problem loading data with DTP

    Hi everyone,
    I'm trying to load data from a DSO to an InfoCube. The problem is as follows: normally a DTP processes packages in the request, but in this case the DTP doesn't process any packages, yet finishes with status green.
    When I manage the target, I see that the request finishes in red.
    The symptom is that the DTP doesn't process any request, and I don't know why.
    Status: the DSO has active data, and the InfoCube is empty.
    I don't know what is happening, but the data is not carried from the DSO to the target.
    I hope you can help me.
    Regards.
    Jose

    Answering your questions:
    Are you seeing the request red in Manage and green in Monitor Details?
    Yes, in Manage I see the requests in red, but in the monitor details everything is green, although no processed requests appear in the monitor details.
    Are the requests green and active in the source DSO?
    This DSO is for direct update and is filled by an APD process, so it doesn't have an active request; it just has the data in the active data table. I don't think that matters, because I'm trying to do the same thing between two InfoCubes and it doesn't work either.
    It looks like the problem is general across the whole warehouse.
    Additionally, this DTP worked correctly last week; it began to fail this week. Could it be a system problem? What can I check?
    Also check if this is an authorization issue - SU53.
    I checked SU53, but everything is OK.
    In Monitor Header - Selections - you should see the requests which have been loaded from the source.
    I don't see anything in this field.
    Thanks
    Jose

  • Error while loading data from DSO to Infocube

    Hi all,
    I'm loading data from a flat file to the cube via a DSO, and I get an error in the DTP.
    The data loads into the DSO; I'm able to see the contents in the active data table.
    The transformation from the DSO to the cube is activated without any errors. It's a one-to-one mapping; no routines are used.
    When I execute the DTP (I use the full load option, as it is a one-time load), I get the following errors in the DTP, all of which are red:
    Process Request
    Data Package 1: Error during processing
    Extract from DSO XXXXX
    Processing Terminated
    The long text of the error gives the message: Data package 1: Processed with errors RSBK257
    Thanks

    Hi,
    You can check the forum thread below.
    DTP with error RSBK257
    - Jaimin

  • Load data from file and send to background

    Hello, is it possible to load data from a file on the presentation server and then create a batch input to call a transaction with those values, but in the background, without using a dataset?
    Thanks in advance.

    You can design the process in various ways; one of them is as below:
    Upload ABAP:
    1. It reads the file from the presentation server and EXPORTs it to the INDX database table under a specific filename.
    2. It updates a Z-table with user / upload date and time / filename / status = U.
    Batch ABAP:
    1. It reads the Z-table for all files with status = U.
    2. Using the filename, it IMPORTs the data from the INDX database table.
    3. It creates the batch input to call the transaction.
    4. It sets the status to processed, along with the process date and time.
    With the above, you can have multiple users uploading data and a batch job picking up each file and doing the needful.
    Regards
    Anurag

  • SQL*Loader: loading data from legacy to Oracle

    Hi
    We have a requirement to import data from a legacy system into Oracle AR. We know that we need to put the data file in the BIN folder, but I want to know about the data file source: if we place it in a Windows folder instead of the BIN directory, will SQL*Loader pick up the file and load the data?
    Thanks
    Y

    Yes.
    Refer to this:
    http://www.oracle.com/technology/products/database/utilities/htdocs/sql_loader_overview.html
    * Load data across a network. This means that a SQL*Loader client can be run on a different system from the one that is running the SQL*Loader server.
    * Load data from multiple datafiles during the same load session
    * Load data into multiple tables during the same load session
    * Specify the character set of the data
    * Selectively load data
    * Load data from disk, tape, or named pipe
    * Generate sophisticated error reports, which greatly aid troubleshooting
    * Load arbitrarily complex object-relational data
    * Use either conventional or direct path loading.
    -Arun
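    On the original question: the INFILE path in a SQL*Loader control file can point at any location the sqlldr process can read; the data file does not have to sit in the BIN directory. A minimal sketch of driving sqlldr from Java (assuming sqlldr is on the PATH; all file and directory names here are placeholders):

    import java.io.File;

    public class RunSqlLoader {
        public static void main(String[] args) throws Exception {
            // load.ctl's INFILE clause may reference any readable path,
            // e.g. INFILE 'C:\ar_data\legacy.dat' -- it need not be in BIN.
            ProcessBuilder pb = new ProcessBuilder(
                    "sqlldr", "userid=username/password@db",
                    "control=load.ctl", "log=load.log");
            pb.directory(new File("C:\\ar_data")); // placeholder working directory
            pb.inheritIO(); // surface SQL*Loader's console output
            int exit = pb.start().waitFor();
            System.out.println("sqlldr exit code: " + exit);
        }
    }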

  • SAP Add On: How to load data from database to a matrix

    I am making a payroll application add-on for SAP Business One. I have made a form using Screen Painter and want to know how to load data from a database into the matrix columns.
    My matrix has 6 columns, and I have named the items as follows:
    public void DeclareColumsInMonthlyMatrix()
    {
        SAPbouiCOM.Item oItem = null;
        // Adding the Monthlymatrix elements
        oItem = _form.Items.Item("matMonthly");
        oMatrix = (SAPbouiCOM.Matrix)oItem.Specific;
        oColumns = oMatrix.Columns;
        oColumn = oColumns.Item("mPayYear");
        oColumn = oColumns.Item("mMonth");
        oColumn = oColumns.Item("mStartDate");
        oColumn = oColumns.Item("mEndDate");
        oColumn = oColumns.Item("mPayStatus");
        oColumn = oColumns.Item("mTaxMethod");
    }
    I have retrieved the appropriate data from the database using LINQ to SQL:
    // Populate the Monthly Period Data Grid View           
        var monthlyPeriods = Program.Kernel.Get<IMonthlyPeriodService>().GetAllMonthlyPeriods();
        monthlyPeriods = monthlyPeriods.OrderBy(x => Enum.Parse(typeof(MonthsOfAYear), x.Code, true));
    The corresponding field names in the database for the 6 columns are:
    U_Payroll_Year,
        U_Month,
        U_Starting_date,
        U_Ending_date,
        U_Pay_Process_Status,
        U_Tax_Method
    I was previously using C# .NET WinForms with a data grid and a BindingSource, which was easy using this code:
    // Populate the Monthly Period Data Grid View           
        var monthlyPeriods = Program.Kernel.Get<IMonthlyPeriodService>().GetAllMonthlyPeriods();
        monthlyPeriods = monthlyPeriods.OrderBy(x => Enum.Parse(typeof(MonthsOfAYear), x.Code, true));
        monthlyPeriodBindingSource.DataSource = monthlyPeriods.ToList();
    How do I achieve the same in SAP? How do I map the results returned in monthlyPeriods onto the appropriate columns in my matrix?

    Hi Nor,
    you could build a function which generates a list of coordinates from your geometry.
    The file generated will be a character-separated list.
    This list can be generated with a simple SELECT statement like this:
    select obj_id, mysdo_koo2list(geometry) from my_geotable where ... ;
    The function mysdo_koo2list(..) has to be built by you first:
    <em>create function mysdo_koo2list ( gc sdo_geometry) return varchar2 as
    line varchar2(4000);
    n number;
    ordinate number;
    begin
    line:= ''; n := 0;
    for ordinate in gc.sdo_ordinates.FIRST .. geom.sdo_ordinates.LAST
    loop
    line := line||to_char(geom.sdo_ordinates(ordinate), '9999999D999');
    if ( mod(n,2) = 1 ) then
    line := line||chr(10);
    else
    line := line||',';
    end if;
    n := n +1 ;
    -- exit when n >330;
    end loop;
    return (line);
    end;
    </em>

  • Automatically trigger the event to load data from Planning cube to Standard Cube

    Hello,
    We have the setup below in our system:
    1. A planning BEx query with which the user makes entries and writes data back to the planning cube.
    2. An actual reporting cube which gets data from the planning cube above.
    Now, what we want to do is automate the data load from the planning cube to the reporting cube.
    This involves two things:
    1. Change the setting "Change real-time load behaviour" of the planning cube to Planning.
    2. Trigger the DTP which loads data from the planning cube to the reporting cube.
    We want to automate the above two steps.
    I have tried a few things to achieve this:
    1. Created an event in SM64.
    2. In the planning cube's "Manage" screen, clicked on "Subsequent Processing" and provided the event details (not sure if that is the correct place to provide the event details).
    3. Wrote an ABAP program which changes the setting of the planning cube ("Change real-time load behaviour" to Loading).
    4. Created a process chain with the event as the start variant, the ABAP program as the next step, and the DTP run as the last step.
    This, I hoped, would trigger the event as soon as a new request arrives in the planning cube, which in turn would start the process chain that loads the data from the planning cube to the reporting cube.
    This is not working. I don't think the event is being triggered, and even if it is, I am not sure whether it will start the process chain automatically. Any ideas, please?

    hi,
    try to do the transformation directly in the input cube by using a characteristic relationship (CR) of type exit; more details:
    http://help.sap.com/saphelp_nw70ehp2/helpdata/en/43/1c3d0f31b70701e10000000a422035/content.htm
    hope it helps.
