Problem loading DAT file when using a 3G network modem

Hello,
I'm having a strange problem when loading my game over a 3G network modem: it hangs at the point where the DAT file containing the map object is read. The issue started when I migrated my applet from one host to another. Everything loads fine up to user authentication. After authentication I receive a message with the name of the map I should load and show to the user. I do receive that message, but as soon as the map-loading process begins (reading a DAT file of only 15 KB), the application blocks (the browser and the JVM block as well).
I thought it might be slow network communication, but that is not the reason: I tested with bandwidth-limiter software at 5 KB/s and the application loads and runs fine.
Any clue why this can happen? If code is needed, I'll post the pieces related to the map-creation process.
URL of application: http://mimosa.dei.uc.pt/serhiy/demo/hoonline.html
Accounts: test01/test01 through test05/test05 (i.e. test0n/test0n, where n is a number from 1 to 5)
Thanks in advance!

You have to upload it with FileReference.upload() to a PHP
(or other server-side) script that saves it to a folder on the
server. Once the DataEvent.UPLOAD_COMPLETE_DATA event has been
dispatched, you can use FileReference.name to load the file from
the server just like any other image.
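
Returning to the original question: one common cause of a browser/JVM freeze like this is a blocking read with no timeout performed on the applet's UI thread, which can stall indefinitely over a high-latency 3G link. Below is a minimal, hypothetical sketch (the URL, class and field names are assumptions, not the poster's actual code) of fetching the 15 KB map file with explicit connect/read timeouts on a background thread:

// Hypothetical sketch: fetch the map DAT file off the UI thread, with timeouts.
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class MapFetcher {

    // Assumed location; the real applet would build this from the map name
    // received after authentication.
    private static final String MAP_URL = "http://example.com/maps/level1.dat";

    public static void main(String[] args) {
        new Thread(new Runnable() {
            public void run() {
                try {
                    byte[] data = fetch(MAP_URL);
                    System.out.println("Map loaded: " + data.length + " bytes");
                    // ... parse the map objects and hand them to the UI thread ...
                } catch (Exception e) {
                    // A stalled 3G connection now fails visibly instead of hanging.
                    e.printStackTrace();
                }
            }
        }).start();
    }

    static byte[] fetch(String address) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(address).openConnection();
        conn.setConnectTimeout(10000); // give up if the connection cannot be established
        conn.setReadTimeout(15000);    // give up if the stream stalls mid-read
        InputStream in = conn.getInputStream();
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        } finally {
            in.close();
            conn.disconnect();
        }
    }
}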

Similar Messages

  • Problem specifying SQL Loader Log file destination using EM

    Good evening,
    I am following the example given in the 2 Day DBA document chapter 8 section 16.
    In step 5 of 7, EM does not allow me to specify the destination of the SQL Loader log file to be on a mapped network drive.
    The question: does SQL*Loader have a limitation I am not aware of that prevents placing the log file on a network share, or am I getting this error because of something else I am inadvertently doing wrong?
    Note: I have placed the DDL, load file data and the steps I follow in EM at the bottom of this post to facilitate reproducing the problem (drive Z is a mapped drive).
    Thank you for your help,
    John.
    DDL (generated using SQL developer, you may want to change the space allocated to be less)
    CREATE TABLE "NICK"."PURCHASE_ORDERS"
        "PO_NUMBER"      NUMBER NOT NULL ENABLE,
        "PO_DESCRIPTION" VARCHAR2(200 BYTE),
        "PO_DATE" DATE NOT NULL ENABLE,
        "PO_VENDOR" NUMBER NOT NULL ENABLE,
        "PO_DATE_RECEIVED" DATE,
        PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      SEGMENT CREATION DEFERRED PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE
        INITIAL 67108864
      TABLESPACE "USERS" ;
    Load.dat file contents
    1, Office Equipment, 25-MAY-2006, 1201, 13-JUN-2006
    2, Computer System, 18-JUN-2006, 1201, 27-JUN-2006
    3, Travel Expense, 26-JUN-2006, 1340, 11-JUL-2006
    Steps I am carrying out in EM
    log in, select data movement -> Load Data from User Files
    Automatically generate control file
    (enter host credentials that work on your machine)
    continue
    Step 1 of 7 ->
      Data file is located on your browser machine
      "Z:\Documentation\Oracle\2DayDBA\Scripts\Load.dat"
       click next
    step 2 of 7 ->
      Table Name
      nick.purchase_orders
      click next
    step 3 of 7 ->
      click next
    step 4 of 7 ->
      click next
    step 5 of 7 ->
      Generate log file where logging information is to be stored
      Z:\Documentation\Oracle\2DayDBA\Scripts\Load.LOG
      Validation Error
      Examine and correct the following errors, then retry the operation:
      LogFile - The directory does not exist.

    Hi John,
    But I didn't get any error when I did the same thing you did.
    My Oracle version is 10.2.0.1 on Windows XP. Here is what I did, and it worked:
    1. I created one table in the scott schema:
    SCOTT@orcl> CREATE TABLE "PURCHASE_ORDERS"
      2  (
      3      "PO_NUMBER"      NUMBER NOT NULL ENABLE,
      4      "PO_DESCRIPTION" VARCHAR2(200 BYTE),
      5      "PO_DATE" DATE NOT NULL ENABLE,
      6      "PO_VENDOR" NUMBER NOT NULL ENABLE,
      7      "PO_DATE_RECEIVED" DATE,
      8      PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      9  )
    10  TABLESPACE "USERS";
    Table created.
    I logged into EM: Maintenance --> Data Movement --> Load Data from User Files --> My Host Credentials.
    Here there are a total of 3 text boxes:
    1. Server Data File: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\USERS01.DBF
    2. Data File is Located on Your Browser Machine: z:\load.dat <-- here z:\ is another machine's shared documents folder; I selected this option (radio button) and created the same load.dat you mentioned.
    3. Temporary File Location: C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\ <-- I didn't enter anything here.
    Step 2 of 7 Table Name : scott.PURCHASE_ORDERS
    Step 3 of 7 I just clicked Next
    Step 4 of 7 I just clicked Next
    Step 5 of 7 I just clicked Next
    Step 6 of 7 I just clicked Next
    Step 7 of 7: Here are the control file contents:
    LOAD DATA
    APPEND
    INTO TABLE scott.PURCHASE_ORDERS
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (
      PO_NUMBER INTEGER EXTERNAL,
      PO_DESCRIPTION CHAR,
      PO_DATE DATE,
      PO_VENDOR INTEGER EXTERNAL,
      PO_DATE_RECEIVED DATE
    )
    And I just clicked on Submit Job.
    Now I got all 3 rows in purchase_orders:
    SCOTT@orcl> select count(*) from purchase_orders;
      COUNT(*)
             3
    So there is no bug; it worked. Please retry, and post again if you still get an error or issue.
    HTH
    Girish Sharma

  • Weird problem with loading data from an XML using a for loop

    Hi,
    I have a strange problem. I have run into this many times but still don't know the proper workaround for it.
    I am trying to load an SWF file, a video file, or an image. They can be on the local system or on a remote server. The entries for the files to be loaded are listed in an XML file, and I traverse the XML nodes using a for loop. On the COMPLETE event of the loader's contentLoaderInfo, for example:
    loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onComplete);
    I fill a container with the loaded data.
    My problem is that when I use the for loop it doesn't work properly, but if I use something like this:
    private function someFunc():void {
         if (i < arr.length()) {
              // ... do something ...
              loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onComplete);
         }
    }
    private function onComplete(e:Event):void {
         // ... do something ...
         i++;
         someFunc(); // start loading the next item
    }
    then all files are loaded properly.
    I think this is because the for loop runs very fast while the content takes time to load, which ultimately leads to some weird results.
    Please let me know how this can be done correctly using a for loop as well.

    You don't want to use a for loop to load several items.  The approach you almost have already is the proper one... load a file and use the completion of its loading to trigger loading the next file.
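
    As an illustration of that pattern outside ActionScript, here is a rough Java sketch (the asset names and the load placeholder are hypothetical, not from this thread): each item is loaded asynchronously, and only its completion callback starts the next one, which is what the Event.COMPLETE handler does in the Flash version:

    import java.util.Arrays;
    import java.util.Iterator;
    import java.util.List;

    public class SequentialLoader {

        // Stand-in for Event.COMPLETE: fired when one item has finished loading.
        interface CompleteListener {
            void onComplete();
        }

        // Hypothetical asset list; in the forum thread these entries come from an XML file.
        private static final List<String> ASSETS = Arrays.asList("a.swf", "b.flv", "c.png");

        public static void main(String[] args) {
            loadNext(ASSETS.iterator());
        }

        // Load one item; only its completion callback starts the next load.
        static void loadNext(final Iterator<String> items) {
            if (!items.hasNext()) {
                System.out.println("All files loaded");
                return;
            }
            final String asset = items.next();
            loadAsync(asset, new CompleteListener() {
                public void onComplete() {
                    loadNext(items); // completion triggers the next load
                }
            });
        }

        // Placeholder asynchronous load; real code would download and parse the file.
        static void loadAsync(final String asset, final CompleteListener listener) {
            new Thread(new Runnable() {
                public void run() {
                    System.out.println("Loading " + asset);
                    listener.onComplete();
                }
            }).start();
        }
    }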

  • Problem in Loading data for clob column using sql ldr

    Hi,
    I am having a problem loading data into tables that have CLOB columns.
    Could anyone help me correct the control file script below so that it loads data that is in the format shown?
    Any help really appreciated.
    Table Script
    Create table samp
    (
      no   number,
      col1 clob,
      col2 clob
    )
    Ctrl File
    options (skip=1)
    load data
    infile 'c:\1.csv'
    Replace into table samp
    fields terminated by ","
    trailing nullcols
    (
      no,
      col1 Char(100000000),
      col2 Char(100000000) enclosed by '"' and '"'
    )
    Data File(1.csv)
    1,asdf,"assasadsdsdsd""sfasdfadf""sdsdsa,ssfsf"
    2,sfjass,"dksadk,kd,ss""dfdfjkdjfdk""sasfjaslaljs"
    Error Encountered
    ORA-01461: can bind a LONG value only for insert into a LONG column
    Table samp
    Thanks in advance

    I can't reproduce it on my 10.2.0.4.0. CTL file:
    load data
    INFILE *
    Replace into table samp
    fields terminated by ","
    trailing nullcols
    (
      no,
      col1 Char(100000000),
      col2 Char(100000000) enclosed by '"' and '"'
    )
    BEGINDATA
    1,asdf,"assasadsdsdsd""sfasdfadf""sdsdsa,ssfsf"
    2,sfjass,"dksadk,kd,ss""dfdfjkdjfdk""sasfjaslaljs"
    Loading:
    SQL> Create table samp
      2  (
      3  no number,
      4  col1 clob,
      5  col2 clob
      6  );
    Table created.
    SQL> host sqlldr scott/tiger control=c:\temp\samp.ctl log=c:\temp\samp.log
    SQL> select * from samp
      2  /
            NO
    COL1
    COL2
             1
    asdf
    assasadsdsdsd"sfasdfadf"sdsdsa,ssfsf
             2
    sfjass
    dksadk,kd,ss"dfdfjkdjfdk"sasfjaslaljs
            NO
    COL1
    COL2
    SQL>
    SY.

  • Problem with loading data to Essbase

    Hi All,
    I have a problem with loading data into Essbase. I've prepared a MaxL script to load the data, calling a rule file. The source table is located in an Oracle RDBMS. The script works correctly, i.e. it generally loads data into Essbase.
    But the problem is that after deleting the data from Essbase, when I try to load it again from the source table I get the message: WARNING - 1003035 - No data values modified by load of this data file - although there is no data in Essbase... I've also tried changing the load mode from 'overwrite' to 'add to existing values' (in the rule file), but it doesn't help... Any ideas what I can do?

    Below are a few lines from EPM_ORACLE_INSTANCE/diagnostics/logs/essbase/dataload_ODL.err:
    [2013-09-24T12:01:40.480-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received Validate Login Session request
    [2013-09-24T12:01:40.482-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received client request: Get App and Database Status (from user [admin@Native Directory])
    [2013-09-24T12:01:54.488-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: MaxL: Execute (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Received client request: MaxL: Describe (from user [admin@Native Directory])
    [2013-09-24T12:01:54.492-10:01] [ESSBASE0] [MLEXEC-2] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Output columns described
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Define (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.494-10:01] [ESSBASE0] [MLEXEC-3] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Record(s) fetched
    [2013-09-24T12:01:54.496-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received client request: MaxL: Fetch (from user [admin@Native Directory])
    [2013-09-24T12:01:54.498-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received Validate Login Session request
    [2013-09-24T12:01:54.499-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: Get Application State (from user [admin@Native Directory])

  • Is there any way to force the applet to load the file without using cache?

    Hi,
    I have an applet that renders some data from a file specified as a parameter. The problem is that the user can do something that changes the input file and then reloads the page, but the applet renders the old data (most probably from the browser cache).
    Is there any way to force the applet to load the file without using the cache?
    Regards,
    Zdenek

    The initial view (IV) settings within a PDF file are static tags; they can't be made to adapt dynamically to the window dimensions. It's the renderer (Acrobat, Reader, or whatever else is opening the file) that decides if and how it will follow the IV requested by the file header.
    It would be possible to use a Page Open action on the first page of the file, which does some nasty math with the various doc.*WindowRect objects to work out how much "wasted" space there is, and then sets the doc.layout and doc.zoomType properties - but page actions are a different concept from IV, as the zoom will reset itself every time that page is viewed. Users don't like an application apparently fiddling with the zoom level without being told to!
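
    Back to the applet-cache question itself: one common workaround (a hedged sketch, not something stated in this thread) is to open the data file's connection with caching disabled, and/or to append a throwaway query parameter so that no cached copy ever matches the request:

    import java.io.InputStream;
    import java.net.URL;
    import java.net.URLConnection;

    public class FreshLoader {

        // Hypothetical helper: "dataFile" would come from the applet's parameter.
        static InputStream openWithoutCache(URL codeBase, String dataFile) throws Exception {
            // A unique query string defeats browser/plugin caches as well.
            URL url = new URL(codeBase, dataFile + "?nocache=" + System.currentTimeMillis());
            URLConnection conn = url.openConnection();
            conn.setUseCaches(false); // ask the connection layer not to serve cached content
            return conn.getInputStream();
        }
    }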

  • Is there a capability to save/export the Time Capsule settings file when using the iPhone/iPad AirPort Utility? The "File" button does not exist on the latest AirPort Utility app

    Is there a capability to save/export the new AirPort 2TB Time Capsule settings file when using the iPhone/iPad AirPort Utility? Set-up wasn't a problem, but the "File" button does not exist on the latest AirPort Utility app v6.3 to save the configuration file.

    the "file" button does not exist on the latest airport utility app v6.3 to save the configuration file.
    Sounds like you are a bit confused with version numbers.
    Latest AirPort Utility version for the iPhone / iPad is 1.3.3.  There is no option or capability to export/import settings on the iOS version(s) of AirPort Utility.....although you could take a series of screen shots and save them for future reference.
    AirPort Utility 6.3.x is found on a Mac.....not on iPhone / iPad. Export and Import options are found under the File menu in 6.3.x.

  • Error while loading Data into Essbase using ODI

    Hi,
    I am very new to ODI. I have installed ODI and am working in the Demo environment only. I haven't done any configuration. I am using the Essbase technology that comes by default.
    I have created a sample outline in Essbase and a text file to load data into Essbase using ODI.
    Following is my text file:
    Time     Market     Product     Scenario     Measures     Data
    Jan     USA     Pepsi     Actual     Sales     222
    I am getting an error. I have checked in Operator; it fails at step 6, i.e. 'Integration SampleLoad data into essbase'.
    Here is the description:
    from com.hyperion.odi.common import ODIConstants
    from com.hyperion.odi.connection import HypAppConnectionFactory
    from java.lang import Class
    from java.lang import Boolean
    from java.sql import *
    from java.util import HashMap
    # Get the select statement on the staging area:
    sql= """select C3_C1 ""Time"",C5_C2 ""Market"",C2_C3 ""product"",C6_C4 ""Scenario"",C1_C5 ""Measures"",C4_C6 ""Data"" from "C$_0Demo_Demo_genData" where      (1=1) """
    srcCx = odiRef.getJDBCConnection("SRC")
    stmt = srcCx.createStatement()
    srcFetchSize=30
    stmt.setFetchSize(srcFetchSize)
    rs = stmt.executeQuery(sql)
    #load the data
    stats = pWriter.loadData(rs)
    #close the database result set, connection
    rs.close()
    stmt.close()
    Please help me to proceed further...

    Hi John,
    Here is the error message in execution tab....
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 20, in ?
    java.sql.SQLException: Unexpected token: TIME in statement [select   C3_C1    ""Time]
         at org.hsqldb.jdbc.jdbcUtil.sqlException(jdbcUtil.java:67)
         at org.hsqldb.jdbc.jdbcStatement.fetchResult(jdbcStatement.java:1598)
         at org.hsqldb.jdbc.jdbcStatement.executeQuery(jdbcStatement.java:194)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
         at java.lang.reflect.Method.invoke(Unknown Source)
         at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
         at org.python.core.PyMethod.__call__(PyMethod.java)
         at org.python.core.PyObject.__call__(PyObject.java)
         at org.python.core.PyInstance.invoke(PyInstance.java)
         at org.python.pycode._pyx4.f$0(<string>:20)
         at org.python.pycode._pyx4.call_function(<string>)
         at org.python.core.PyTableCode.call(PyTableCode.java)
         at org.python.core.PyCode.call(PyCode.java)
         at org.python.core.Py.runCode(Py.java)
         at org.python.core.Py.exec(Py.java)
         at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    java.sql.SQLException: java.sql.SQLException: Unexpected token: TIME in statement [select   C3_C1    ""Time]
         at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
         at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)

  • Can we set the dynamic data source when using getReportParameters() ?

    Hello!
    I have a report where one of its parameters refers to a list of values (LOVs). This list of values is an SQL Query type. When the data source used in the report is defined in the BI Publisher server, I'm able to get the report parameters using the getReportParameters() function in my application. However, if the data source is not defined the function throws an exception, which is understandable.
    I decided to dynamically set the data source so that even if the data source used by the report is not defined in the BI Publisher server, it still will be able to get the LOVs for the parameter. I tried setting the JDBCDataSource of the dynamicDataSource for the ReportRequest object that I passed to the getReportParameters() function. Please see the sample code below:
    reportRequest.dynamicDataSource = new BIP10.BIPDataSource();
    reportRequest.dynamicDataSource.JDBCDataSource = new BIP10.JDBCDataSource();
    setReportDataSource(reportRequest.dynamicDataSource.JDBCDataSource, connectstr, jdbc, dc); //function to set the values for JDBCDataSource object
    reportParams = webrs.getReportParameters(reportRequest, uid, pwd); //call the getReportParameters
    I was expecting this to work, as this is what I did to dynamically set the data source before calling the runReport function. So my question is: can we set the dynamic data source when using getReportParameters()? I tried this in both versions 10g and 11g; it does not seem to work in either.
    Regards,
    Stephanie

    report_id column of apex_application_page_ir_rpt can help us uniquely identify each saved report.
    We can assign this report_id value to a page item and this page item can be put in the Report ID Item text box of the Advanced section of the Report Attributes page of the IR.
    This should load the saved report identified by report_id, and you can get rid of the JavaScript.
    Regards,
    Vishal
    http://obiee-oracledb.blogspot.com
    http://www.packtpub.com/oracle-apex-4-2-reporting/book
    Kindly mark the reply as helpful/correct if it solves your problem

  • Create sql loader data file dynamically

    Hi,
    I want a sample program/approach for creating a SQL*Loader data file.
    The program will read a table name as input and will use
    a select statement with the column list derived from user_tab_columns in the data dictionary,
    assuming multiple CLOB columns in the column list.
    Thanks
    Manoj

    I'm writing CLOB and other columns to a SQL*Loader dat file.
    The sample code below for writing a CLOB column is giving a file write error.
    How can I write multiple CLOBs to the dat file so that the control file will handle them correctly?
    -- Note: utl_file lines are limited by the max_linesize given to utl_file.fopen
    -- (at most 32767 bytes), so writing a longer CLOB as a single physical line will fail.
    offset     NUMBER := 1;
    chunk      VARCHAR2(32000);
    chunk_size NUMBER := 32000;
    WHILE offset < dbms_lob.getlength(l_rec_type.narrative)
    LOOP
      chunk := dbms_lob.substr(l_rec_type.narrative, chunk_size, offset);
      utl_file.put(l_file_handle, chunk);
      utl_file.fflush(l_file_handle);
      offset := offset + chunk_size;
    END LOOP;
    utl_file.new_line(l_file_handle);

  • Adobe Premiere CC 2014.2: losing rendered files when using warp stabilizer

    Hi,
    I am constantly losing rendered files when using the warp stabilizer. So far I have tried about every hint I could find on the web, such as cleaning the cache, rebuilding the rendered files, creating additional sequences, etc.
    Honestly, I am getting tired of using a product that isn't cheap to rent in the first place, and where a bug like this apparently persists over several product versions without being fully fixed (I have had this problem throughout 2014, but according to forum postings others seem to have had problems with much earlier versions as well).
    I would be really grateful if somebody has any suggestion for how this can be addressed.
    I am also happy to help testing fixes - if there are any fixes available.
    Thanks a lot and Happy New Year!
    Martin

    Hi Catherine,
    Welcome to the Adobe forums.
    Please try the steps mentioned below and check if it works for you.
    1. Launch Premiere Pro and create a project, go to File menu > Project Settings > Renderer and change the Renderer to Software Only mode, delete previews if you get a prompt, and then try to import the clip.
    2. If step 1 fails or the Renderer is already in Software Only mode, go to the Start menu and search for Device Manager. Go into Display Adapters, right-click the graphics card and select "Update driver software"; on the next screen choose "Browse my computer for driver software", then "Let me pick from a list...", and from the list select "Standard VGA Graphics Adapter". You might need to change your screen resolution, and once done, restart the machine again.
    Launch Premiere Pro and import the clip to check.
    Regards,
    Vinay

  • Problem while loading data from ODS to infoobject

    Hi guys,
    I am facing a problem while loading data from an ODS to an InfoObject.
    If I load the data via the PSA it works fine, but
    if I load the data without the PSA it gives a "Duplicate records" error.
    Do you have any idea why this is so?
    Thanks in advance
    savio

    Hi,
    When you load the data via the PSA, what did you select? Serial or parallel?
    If you select serial, you most likely don't have duplicates within the same data package, and your load can go through.
    Loading directly into the InfoObject will therefore fail if you have the same key in two different packages; the "duplicate records" error will be raised. You can perhaps flag your InfoPackage with the option "ignore duplicate records", as suggested...
    Hope this helps...
    Olivier.

  • How to totally turn off data traffic when using ma...

    Nokia N8-00:
    I only use the integrated GPS and have turned off assisted GPS, network-based GPS and Wi-Fi/Network positioning, but when accessing Maps it still generates data traffic.
    Normally this isn't a problem, but when traveling abroad data traffic costs a small fortune.
    On my old N95, when I downloaded offline maps and turned everything but the integrated GPS off, there was no data traffic at all; but even after the latest Maps updates (i.e. yesterday) it appears that the maps still need downloading when accessed...
    Any suggestions on how to completely avoid data traffic when using Maps?

    Set all destinations to "always ask" and select No whenever the phone gives you a request, set Maps to offline, and set positioning to use only the integrated GPS. By the way, I've just been to Portugal using Vodafone and data charges were only about £5 for the week, and they've just improved allowances; it may be worth checking your service provider's web site to see whether it's necessary to bother about data roaming at all. They all tend to follow each other, and data roaming charges in Europe seem to be coming down!
    If I have helped at all, a click on the White Star is always appreciated;
    you can also help others by marking "Accept as Solution".

  • How to load .DAT file

    Hi,
    I want to load a .DAT file from AL11 into one of my ODS objects.
    Can anyone tell me how to load it (directly into the ODS from AL11) and what settings I need to use in the InfoPackage?
    Thanks.

    Hi,
    Loading a .DAT file from AL11 is nothing but loading from the application server. You need the following settings while loading the file:
    In the InfoPackage, under the Extraction tab:
    Adapter: Load Text-Type File from Application Server
    Header rows to be ignored: blank (nil)
    Data format: Separated with Separator (CSV)
    Data separator: ,
    Escape sign: "
    Give the path on the application server where you are keeping the file.
    Thanks
    Mayank

  • Why select data from Function directly (SE37) got data , but when use funct

    Why does selecting data via the function directly (SE37) return data, but when the function is used in a program no data is found?
    I use function
    CS_BOM_EXPL_MAT_V2
    When I run the function directly in SE37,
    I find data,
    but when I use the same function in a program,
    the system finds nothing.
    Please see my attachment.
    Please help me.
    http://www.quickfilepost.com/download.do?get=c974356a498b3a4d369aa0c50622e50b

    I know why you get empty data.
    In your program you should follow these rules:
    You had better declare a variable typed with the function parameter's type.
    For example:
    in your program the parameter stlal = '1'.
    You had better do the following:
    DATA l_stlal TYPE stko-stlal.
    l_stlal = '01'. " Attention: '1' <> '01'.
    stlal  = l_stlal.
    This way you will make fewer mistakes.
    As for the parameter MTNRV:
    you should use the material conversion exit "CONVERSION_EXIT_MATN1_INPUT" to convert the material number into its internal format before passing it to the function's parameter.
    Why does SE37 have no problem? Because in SE37 the data you filled in is converted before use.
