More than 512 rows - Xcelsius cannot export a PDF

I have a table of 2,000 rows that Xcelsius needs to work on. The combobox component needs to select the data related to a vendor and put it in another sheet for display. My Xcelsius is not connected to SAP, so the filtering of data is done through the combobox.
I go into Preferences >> Excel and change the maximum number of rows to 2,500, but after that Xcelsius won't generate the PDF file; it just says "error generating file" and nothing more. It works only when I reduce the row range for the combobox to 500.
any ideas?

Post Author: amr_foci
CA Forum: .NET
I don't think Excel will be able to handle all those records. You can divide your report into several tabs based on a specific condition for each tab, so when you export it to Excel you will get as many tabs as existed in the Business Objects report.
good luck

Similar Messages

  • Cannot create graph with more than 4000 rows or columns

    Need some help.
    I'm trying to plot a graph using the Report Generation Toolkit, and I get an error saying the graph cannot contain more than 4000 rows or columns.
    Is there a limit? If there is, how can I increase it?
    I'm trying to plot a graph using my data from Excel, which has over 10,000 rows.

    I use this VI to read the Excel file and to write a graph report.
    In my Excel file, which I have attached, there are about 11,000 rows of data.
    I want to use it to plot a graph, and I'm stuck at this 4000-row limit. I also can't seem to get the header I linked in the block diagram through to the graph.
    Attachments:
    excel.xls 837 KB
    Read_excel_G_word_report.vi 34 KB
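
    A common workaround for this kind of per-series cap is to decimate the data before handing it to the report: keep every n-th sample so the series stays under the limit while the shape of the curve is preserved. The original is a LabVIEW VI, so this is only a language-neutral sketch of the idea in Java; the method name and array are hypothetical:

    // Keep every n-th point so at most maxPoints samples are plotted.
    // Hypothetical helper, not part of the Report Generation Toolkit.
    static double[] decimate(double[] data, int maxPoints) {
        if (data.length <= maxPoints) {
            return data;                 // already small enough, plot as-is
        }
        int stride = (int) Math.ceil((double) data.length / maxPoints);
        double[] out = new double[(data.length + stride - 1) / stride];
        for (int i = 0, j = 0; i < data.length; i += stride, j++) {
            out[j] = data[i];
        }
        return out;
    }

    With 11,000 rows and a 4,000-point limit this keeps every third sample, i.e. about 3,667 points.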

  • Export a BO report to Excel having more than 65500 rows

    Post Author: noble
    CA Forum: .NET
    Please let me know the different options available to export a BO report having more than 65,500 rows to Excel 2003.

    Post Author: amr_foci
    CA Forum: .NET
    I don't think Excel will be able to handle all those records. You can divide your report into several tabs based on a specific condition for each tab, so when you export it to Excel you will get as many tabs as existed in the Business Objects report.
    good luck

  • ADF view, export excel does not export more than 65536 rows

    This was a limitation of MS Excel 2000, but we are not sure whether it still applies to the ADF view export-to-Excel feature.
    Our table has more than 80k rows; when we click on export to Excel, it only gets 65,536 rows and the remaining rows are not exported.
    Please let us know if this is still an open issue or fixed, and if it is fixed, in which version.

    Hi,
    You have to use a custom library, e.g. Apache POI.
    See
    Re: ADF table data export to Excel
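
    The linked thread comes down to the same idea as the answer above: chunk the result set across several sheets. Below is a minimal sketch using Apache POI's HSSF model, where each .xls sheet holds at most 65,536 rows; the data array and file name are hypothetical stand-ins for the real ADF table rows:

    import java.io.FileOutputStream;
    import org.apache.poi.hssf.usermodel.HSSFWorkbook;
    import org.apache.poi.ss.usermodel.Row;
    import org.apache.poi.ss.usermodel.Sheet;
    import org.apache.poi.ss.usermodel.Workbook;

    public class SplitExport {
        private static final int MAX_ROWS = 65536; // hard per-sheet limit of the .xls format

        public static void export(String[][] rows) throws Exception {
            Workbook wb = new HSSFWorkbook();
            Sheet sheet = null;
            int rowInSheet = 0;
            for (int i = 0; i < rows.length; i++) {
                if (i % MAX_ROWS == 0) {
                    // start a new sheet every 65,536 rows
                    sheet = wb.createSheet("Export_" + (i / MAX_ROWS + 1));
                    rowInSheet = 0;
                }
                Row r = sheet.createRow(rowInSheet++);
                for (int c = 0; c < rows[i].length; c++) {
                    r.createCell(c).setCellValue(rows[i][c]);
                }
            }
            try (FileOutputStream out = new FileOutputStream("export.xls")) {
                wb.write(out);
            }
        }
    }

    Writing to .xlsx (XSSFWorkbook) instead raises the per-sheet cap to 1,048,576 rows, which sidesteps the problem entirely when the consumer has Excel 2007 or later.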

  • Issue on Dashboard 4.1 Export as .SWF file while retrieving more than 5000 rows.

    Hi Experts,
    We have a BO Dashboard (version 4.1) that displays treasury information on cash flows across various countries and regions.
    The dashboard is built on a Universe and has a large number of components (such as filters, aggregations, and glow regions), along with a third-party Centigon add-on for Google Maps (the GMaps Plugin for SAP BusinessObjects Dashboards, which provides an interactive map design solution for enterprise dashboards).
    When we add more source content to the existing dashboard (exceeding 5,000 rows) and try to export the dashboard as a .SWF file, we get a run-time error as below.
    But when we manually reduce the number of rows in the data mapping (<= 4,500 rows), we are able to export the .SWF file.
    We seriously couldn't arrive at any conclusion; perhaps it is a tool limitation.
    Please help us with this issue.
    Regards,
    Prasana Ravichandran

    Justin,
    I partially agree with you. When I say condition, I am talking about the LOAD WHEN clause in the ACCESS PARAMETERS, not the WHERE condition of the SELECT clause.
    When you use a load condition with a byte range, it is checked before the row is parsed, so to skip a row you can use something like this (see the example below):
    CREATE TABLE ext_employees
      (employee_id     NUMBER(4),
       first_name      VARCHAR2(20),
       last_name       VARCHAR2(25),
       job_id          VARCHAR2(10),
       manager_id      NUMBER(4),
       salary          NUMBER(8,2),
       commission_pct  NUMBER(2,2),
       department_id   NUMBER(4),
       email           VARCHAR2(25))
    ORGANIZATION EXTERNAL
      (TYPE ORACLE_LOADER
       DEFAULT DIRECTORY ext_dir1
       ACCESS PARAMETERS
         (RECORDS DELIMITED BY NEWLINE
          LOAD WHEN (1:5) != '*****'
          BADFILE ext_bad_dir:'empxt%a_%p.bad'
          LOGFILE ext_log_dir:'empxt%a_%p.log'
          FIELDS TERMINATED BY ','
          MISSING FIELD VALUES ARE NULL
          (employee_id, first_name, last_name, job_id, manager_id,
           salary, commission_pct, department_id, email))
       LOCATION ('empext.dat'))
    REJECT LIMIT UNLIMITED;
    This may solve his problem
    Santhosh
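
    For instance, given the definition above, a data file like the following (hypothetical contents) loads the first and last rows and skips the starred one, because bytes 1:5 equal '*****' and the LOAD WHEN test fails before the row is ever parsed:

    7369,John,Smith,SA_REP,7902,3000,.1,30,jsmith
    *****,header or junk line that should not be loaded
    7499,Jane,Doe,SA_REP,7902,3100,.1,30,jdoe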

  • Export more than 70000 rows to Excel in OBIEE 10G

    Hi,
    We are using Office 2003 and need to export more than 70,000 rows from OBIEE to Excel. Is there any way?

    Hi,
    Refer the below link.
    Re: Downloading / Exporting more than 65k records in Excel
    Thanks,
    Satya
    Edited by: Satya Ranki Reddy on Apr 18, 2013 2:41 PM

  • Export from Discoverer to Excel more than 65K rows

    Hi
    I need to export data from Discoverer to Excel; in some cases it is more than 65K rows. Does anybody know if the file can be split into multiple files or worksheets in Excel?
    Any help will be appreciated.
    Thanks
    Vladimir

    Russ,
    The short answer is I didn't know: in any instance where my end users have wanted to extract large amounts of data, they have done it from one sheet. I expected that Discoverer would just create the extra sheets in both cases, but I wanted to know for sure, so I did a test.
    I created a Disco workbook with two worksheets in it; the first one I called Monthly and it had 119,409 rows, the second I called Weekly and it had 299,775.
    The first time I did the export by hitting the Excel icon on the toolbar, so it only extracted the first worksheet (Monthly). When I opened the resulting Excel file it had two worksheets in it, one called Monthly, the other called Monthly_1, and as expected, between the two sheets they held exactly the same number of rows as the original Disco sheet.
    I then redid the extract going through the menu and choosing to extract the entire workbook, and Discoverer re-extracted the sheets. This time the Excel file had the same two monthly sheets as before, plus five weekly sheets. So I ended up with an Excel file with seven sheets containing all of the original data.
    Cheers
    Melanie

  • HOW TO IMPORT DATA MORE THAN ONCE FROM THE SAME EXPORT DUMP FILE?

    Before asking my question, I'd like to mention that I'm a French speaker, so my English is a little bit bad; sorry for that.
    My problem is: IMPORT.
    How do I import data a SECOND TIME from an export dump file within Oracle?
    My export dump file was made successfully (Full Export), and then I tried to import the data for the first time.
    I got the following message in my logfile (I ADDED SOME COMMENTS):
    Warning: the objects were exported by L1, not by you
    . importing SYSTEM's objects into SYSTEM
    REM ************** CREATING TABLESPACES *****
    REM *********************************************
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLESPACE "USER_DATA" DATAFILE 'E:\ORANT\DATABASE\USR1ORCL.ORA' SI"
    "ZE 3145728 DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS 1 MAX"
    "EXTENTS 121 PCTINCREASE 50) ONLINE PERMANENT"
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLESPACE "ROLLBACK_DATA" DATAFILE 'E:\ORANT\DATABASE\RBS1ORCL.ORA"
    "' SIZE 10485760 DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS "
    "1 MAXEXTENTS 121 PCTINCREASE 50) ONLINE PERMANENT"
    etc........
    IMP-00017: following statement failed with ORACLE error 1119:
    "CREATE TABLESPACE "L1" DATAFILE 'E:\ORADATA\L1.DBF' SIZE 1048576000 "
    "DEFAULT STORAGE (INITIAL 10240 NEXT 10240 MINEXTENTS 1 MAXEXTENTS 121 PCTIN"
    "CREASE 50) ONLINE PERMANENT"
    IMP-00003: ORACLE error 1119 encountered
    ORA-01119: error in creating database file 'E:\ORADATA\L1.DBF'
    ORA-09200: sfccf: error creating file
    OSD-04002: unable to open file
    O/S-Error: (OS 3) The system cannot find the path specified
    --->etc..........
    the drive E: with the folder E:\ORADATA didn't exist, but after
    all that i created it.
    see below, before my IMPORT statement
    REM ********************* CREATING USER *********
    REM ********************************************
    IMP-00017: following statement failed with ORACLE error 959:
    "CREATE USER "L1" IDENTIFIED BY VALUES 'A6E0DAA6865E7627' DEFAULT TABLESPACE"
    " "L1" TEMPORARY TABLESPACE "TEMPORARY_DATA""
    IMP-00003: ORACLE error 959 encountered
    ORA-00959: tablespace 'L1' does not exist
    IMP-00017: following statement failed with ORACLE error 959:
    "CREATE USER "MLCO" IDENTIFIED BY VALUES '56AC6447B7D50467' DEFAULT TABLESPA"
    "CE "MLCO" TEMPORARY TABLESPACE "TEMPORARY_DATA""
    IMP-00003: ORACLE error 959 encountered
    ORA-00959: tablespace 'MLCO' does not exist
    ETC.......
    REM ********************* GRANTING ROLES ***********
    REM ************************************************
    IMP-00017: following statement failed with ORACLE error 1917:
    "GRANT ALTER ANY TABLE to "L1" "
    IMP-00003: ORACLE error 1917 encountered
    ORA-01917: user or role 'L1' does not exist
    ETC.........
    IMP-00017: following statement failed with ORACLE error 1918:
    "ALTER USER "L1" DEFAULT ROLE ALL"
    IMP-00003: ORACLE error 1918 encountered
    ORA-01918: user 'L1' does not exist
    -- that is normal, since the creation of the
    tablespace failed !!
    REM******************************
    IMP-00015: following statement failed because the object already exists:
    "CREATE ROLLBACK SEGMENT RB_TEMP STORAGE (INITIAL 10240 NEXT 10240 MINEXTENT"
    "S 2 MAXEXTENTS 121) TABLESPACE "SYSTEM""
    IMP-00015: following statement failed because
    . importing SCOTT's objects into SCOTT
    IMP-00015: following statement failed because the object already exists:
    "CREATE SEQUENCE "EVT_PROFILE_SEQ" MINVALUE 1 MAXVALUE 999999999999999999999"
    "999999 INCREMENT BY 1 START WITH 21 CACHE 20 NOORDER NOCYCLE"
    ETC............
    importing L1's objects into L1
    IMP-00017: following statement failed with ORACLE error 1435:
    "ALTER SCHEMA = "L1""
    IMP-00003: ORACLE error 1435 encountered
    ORA-01435: user does not exist
    REM *************** IMPORTING TABLES *******************
    REM ****************************************************
    . importing SYSTEM's objects into SYSTEM
    . . importing table "AN1999_BDAT" 243 rows imported
    . . importing table "BOPD" 112 rows imported
    . . importing table "BOINFO_AP" 49
    ETC................
    . . importing table "BO_WHF" 2 rows imported
    IMP-00015: following statement failed because the object already exists:
    "CREATE TABLE "DEF$_CALL" ("DEFERRED_TRAN_DB" VARCHAR2(128),
    IMP-00015: following statement failed because the object already exists:
    "CREATE SYNONYM "DBA_ROLES" FOR "SYS"."DBA_ROLES""
    IMP-00015: following statement failed because the object already exists:
    "CREATE SYNONYM "DBA_ERRORS" FOR "SYS"."DBA_ERRORS""
    IMP-00008: unrecognized statement in the export file:
    . importing L1's objects into L1
    IMP-00017: following statement failed with ORACLE error 1435:
    "ALTER SCHEMA = "L1""
    IMP-00008: unrecognized statement in the export file:
    J
    Import terminated successfully with warnings.
    -------------------------------------
    So after analysing this log file, I created the appropriate drives and folders (E:\ORADATA, G:\ORDATA, etc.), since the import statement doesn't create them.
    And I started to IMPORT ONE MORE TIME, with:
    $ IMP73 sys/pssw Full=Y FILE=c:\temp\FOLD_1\data_1.dmp BUFFER=64000
    COMMIT=Y INDEXFILE=c:\temp\FOLD_1\BOO_idx.sql
    LOG=c:\temp\FOLD_1\BOO_log.LOG DESTROY=Y IGNORE=Y;
    After that, I could not see the users or the tables created, and the following message appeared in the log file:
    Warning: the objects were exported by L1, not by you
    . . skipping table "AN1999_BDAT"
    . . skipping table "ANPK"
    . . skipping table "BOAP"
    . . skipping table "BOO_D"
    ETC.....skipping all the tables
    . . skipping table "THIN_PER0"
    . . skipping table "UPDATE_TEMP"
    Import terminated successfully without warnings.
    Only 2 new tablespaces (originally 3) were created, without any data in them (I checked that in the Oracle Storage Manager: the tablespaces exist with 0.002 used space; originally 60 MB each!!).
    So, how do I import data (with the full import option) successfully MORE THAN ONE TIME from an exported dump file, even if we have to overwrite tablespaces, tables and users?
    Thank you very much.

    The Member Feedback forum is for suggestions and feedback for OTN Developer Services. This forum is not monitored by Oracle support or product teams and so Oracle product and technology related questions will not be answered. We recommend that you post this thread to the appropriate Database forum.
    The main URL is:
    http://forums.oracle.com/forums/index.jsp?cat=18

  • Can't insert more than 8000 rows to Excel via job

    Hello,
    I have an SSIS package with two tasks.
    A File System task copies the file (with header names) to the destination folder.
    A Data Flow task exports data from a table to the copied Excel file. The select is a simple "select * from table" without any filters.
    There are about 100,000 rows in the table.
    When I run this package through Visual Studio, everything works fine and the data is exported to the Excel file.
    When I run this package as a job (Job Activity Monitor), the file is created but 0 rows are exported.
    One interesting thing: when I use "select top 7000 * from table", 7,000 rows are inserted, but when I try to export more than 8,000 rows, 0 rows are exported. The job generates no errors, and it does seem to try the export, because the Excel file's "Date modified" keeps changing.
    It seems like the data is exported but the final "commit" isn't done. Does anybody know where the problem could be?

    Out of curiosity, I have had a go at reading 70K lines of data from a table in a SQL Server database and exporting them to an Excel 2007 file:
    SSIS 2012 project deployment
    2 project connections: a SQL database connection for the source and an Excel 2007 connection for the destination:
    Provider=Microsoft.ACE.OLEDB.12.0;Data Source=F:\Data\Outbox\ExcelOut.xlsx;Extended Properties="Excel 12.0 XML;HDR=YES";
    I have an empty Excel 2007 file as a template, with header rows in the first sheet.
    I tested the project in VS 2012; it ran without problems.
    I set DelayValidation to true at the package level and the data flow level.
    In the data flow, on the Excel destination, I set ValidateExternalMetadata to False.
    I deployed the project with the package to an Integration Services catalog on a SQL 2012 server (not my development machine).
    I copied an empty Excel 2007 file (with the headers in the first row) to the correct file location.
    I executed the package from SSMS in the Integration Services catalog, with 32 bit checked.
    The package ran without errors, and I can open the Excel file and see the 70K records.
    Last thought: I have no NULL values in my data. Do you have null values after the first 7,000 records? Or is one of the Excel columns expecting numerical data where it suddenly finds alphanumeric data after 7,000 rows?
    Another suggestion: if you run the package with SSMS from the catalog, you can turn on the verbose reporting mode; does that tell you anything more?
    Jan D'Hondt - SQL server BI development

  • How can I display more than 15 rows in a stacked 3D horizontal flash chart

    It seems I cannot display more than 15 rows at a time in a stacked 3D horizontal flash chart.
    Any ideas?
    Thanks, Pierre

    The default history length of a chart is 1024. Making it longer is a right-click option.

  • [VIA] Via K7-266 Pro2 - System crashes if using more than 512 MB of memory

    Cannot use more than 512 MB of memory or the system crashes. After running without a problem for over 2 years with 1 GB of memory installed, the system started crashing intermittently, generally after 5 to 30 minutes of use. I worked with the system manufacturer's tech support and they determined that it must be a motherboard problem. I am hoping to find another fix rather than replacing the motherboard. System details:
    MB: Via K7-266 Pro2 (MS 6380)
    OS: MS Windows XP Update 1
    Bios: V3.6
    I've used Norton SystemWorks and Norton Anti-Virus to keep the system clean, and SpyBot and Adaware to clean adware.
    During troubleshooting, we found that it will run with 512 MB of memory without problems, but not with the 1 GB I had installed. My memory was 2 x 256 MB installed with the machine and a 512 MB stick I purchased from Crucial and installed. The computer ran for over 2 years with all memory installed without a problem. After the problem started, we found during troubleshooting that any combination totalling 512 MB worked fine (either the 2 x 256 MB or the one 512 MB stick), apparently in any arrangement of slots. But as soon as I went to more memory, I was subject to intermittent crashes.
    I could not identify the source of the crash. It sometimes happened when I was working with a single program open (various programs, from IE to Excel to playing at GameSpy), when I had multiple programs open in the background, or even when the computer was sitting idle with only the startup programs active.
    During troubleshooting we also did a complete re-install of Windows XP, updated the BIOS to V3.6, and checked the BIOS settings, but the problem persists.
    What else could be the problem?

    I am beginning to suspect a PSU problem. I neglected to mention in my first post that when the system crashes, it does not freeze but reboots intermittently.
    I ran the memtest86+ test, first on the two old sticks of memory and then on just the one new 512 MB stick; both passed. I also ran the Prime95 stress test on both sets, and both ran without incident.
    I used PC Alert to report CPU and case temperatures during the stress tests. The results maxed at 54 and 41 degrees respectively during the first stress test and 46 and 39 degrees during the second test (I think part of the difference was due to a lower ambient room temperature during the second test). The voltages reported during both tests were similar: Vcore1: 1.71v; Vcore2: 1.26v; +5v: 4.76v; 3.3v: 3.31v; +12v: 11.60v; and -12v: 12.60v. I am not sure if that information is meaningful; PC Alert showed all measurements as within the required range.
    I periodically clean the dust out of my computer, so I don't think that is the problem.
    I used 2 power calculators, with major differences in the reported status of the system. The test from pcpowercooling reported minimum and maximum requirements for my system (with the full 1 GB of memory installed) of 300w and 510w. The alternative I found at jscustompcs (from a recent article about power supplies at ExtremeTech) reported a minimum of 256w (no maximum is calculated with their tool). The two tools had significant differences for memory, CPU, and PCI cards. Of particular relevance to my problem was the difference in memory requirements: pcpowercooling calculated memory at 8w per 128 MB while jscustompcs used 12.5w per stick, resulting in different totals of 64w vs. 21w. Which is right?
    One item that makes me think it is not the PSU, however, is that the problem would occur with the 1 GB of memory while the system was idle, when I was not using any of the hardware devices. And I never experienced the problem while using a device (e.g. DVD, CDRW, or tape drive), either with 512 MB or 1 GB of memory.
    My power unit is a Power Man #FSP 300-60BTV with 300w output, including +3.3v: 28 amps; +5v: 30.0 amps; +12v: 15 amps; +5vsb: 20 amps; -5v: 0.3 amps; -12v: 0.8 amps. It also notes a combined max output for +3.3v and +5v of 200w.
    My system also includes a Leadtek nVidia GeForce 3 Ti400 graphics card, DVD, CDRW, HP 14 GB tape drive, 1 case fan, and 4 PCI cards (modem, network, USB 2.0, and Audigy 1). The USB card is connected only to a bus in my Dell monitor and I have not used other USB devices.
    Any thoughts or suggestions? Do I need to replace the PSU?

  • Hacking application id equals() to allow more than one row per "primary key"

    I have a read-only entity whose primary key is not the real primary key on the underlying table. The result is that I get more than one row per "primary key". This is what I want. However, KODO will not allow me to do it, because I am storing the collection in a HashSet and the equals() method on the application id object ensures that this set contains no duplicates (as defined by the application id). At least that's how it seems to behave.
    So, I have hacked the equals() method to do this:
    public boolean equals (Object ob)
    {
        if (this == ob)
            return true;
        // Doing this because we expect more than one row from
        // REF_CODES for the same domain/shortCode combination.
        // This is ok to do (I guess?) as long as we are only
        // doing selects using this class.
        return false;
    }
    Will this hack have any side effects? Is there another option, like using a list collection? If so, which collections are supported?
    Thanks,
    Mike.

    The "primary key" I am using it already a compound key of two columns.
    The real key on the underlying table is a three column key. But the
    problem is that the table represents two different application level
    entities. I could deal with it when I was hand-writing SQL (I could do
    a distinct for example) but now I am relying on foreign key
    relationships since I moved to JDO. Not sure what to do. Best solution
    is to rework the table, but there is a lot of legacy code (that other
    teams use and maintain) relying on this table. I guess I'll use the
    weekend for inspiration ;-)
    Steve Kim wrote:
    This sounds like a dangerous operation. Even if this works now, I cannot promise future compatibility, and it may in fact result in bad data (for example in caching, both at the PM and PMF level). Is there no other field that you can reference as part of the primary key? Primary keys can be multi-columned (e.g. last_name, soc_sec_number).
    Mike Hogan wrote:
    I have a read-only entity whose primary key is not the real primary key on the underlying table. The result is that I get more than one row per "primary key". This is what I want. However, KODO will not allow me to do it, because I am storing the collection in a HashSet and the equals() method on the application id object ensures that this set contains no duplicates (as defined by the application id). At least that's how it seems to behave.
    So, I have hacked the equals() method to do this:
    public boolean equals (Object ob)
    {
        if (this == ob)
            return true;
        // Doing this because we expect more than one row from
        // REF_CODES for the same domain/shortCode combination.
        // This is ok to do (I guess?) as long as we are only
        // doing selects using this class.
        return false;
    }
    Will this hack have any side effects? Is there another option, like using a list collection? If so, which collections are supported?
    Thanks,
    Mike.
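
    If the underlying table really has a three-column key, Steve's suggestion amounts to including that third column in the application identity class, so equals() and hashCode() keep their contract and each physical row stays distinct in the HashSet. A hedged sketch only; the class and field names are hypothetical, not from the original mapping:

    import java.io.Serializable;

    public class RefCodeId implements Serializable {
        public String domain;
        public String shortCode;
        public int seq; // hypothetical third key column that distinguishes physical rows

        public boolean equals(Object ob) {
            if (this == ob) return true;
            if (!(ob instanceof RefCodeId)) return false;
            RefCodeId o = (RefCodeId) ob;
            return domain.equals(o.domain)
                && shortCode.equals(o.shortCode)
                && seq == o.seq;
        }

        public int hashCode() {
            return domain.hashCode() ^ shortCode.hashCode() ^ seq;
        }
    }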

  • Copy and Paste Large (more than 999 rows) Spreadsheet

    I have a large spreadsheet that has a number of hidden columns. It is about 1,135 rows long. I have no trouble copying and pasting the spreadsheet into Pages (both are iWork '09) as long as I only try to copy and paste 999 or fewer rows. But when I try to copy and paste more than 999 rows, I only get the cell contents pasted into Pages; the cell borders and formatting all get removed.
    I have tried different groupings of 999 and always get the same results, whether I try the range 1-999 or 100-1099: 999 cells can be copied and pasted, but 1,000 or more get stripped of any formatting during the paste.
    Any suggestions? I've tried rebooting, running Disk Utility, and renaming the Numbers document. I've tried copying and pasting the contents of the Numbers document into a new Numbers document, but I get the same results. I've exported the Numbers document as an Excel document and reopened it using Numbers, but I still get the same results.
    I'm running OS X 10.6.8 and everything is current.

    Hello,
    To be complete on the subject, I tested the number of columns allowed by Pages in a table.
    The document format, portrait or landscape, doesn't matter: Pages accepts only 65 columns.
    This limit may have a simple explanation: in binary, 64 is represented by 1000000, and since we count from 0, the values 0 through 64 give 65 values.
    For the 999-row limit I have no idea. The binary representation of 998 is 1111100110.
    I would be surprised if Apple used the decimal system to count the rows, and if it did, the logical limit would be not 999 but 1000, which is the count of the integers from 0 through 999.
    At this time the largest table I use regularly is a Numbers one with 5,296 rows, in which I store references to Apple technotes. It receives about 20 new rows every week.
    I split the original 6,942-row table in two: one with the 'living' technotes and a 1,646-row one storing the now obsolete ones.
    15 days ago it was an AW6 database, but I decided that it was time to switch it to today's application.
    Yvan KOENIG (VALLAURIS, France) 07/11/2011 21:38:53
    iMac 21.5", i7, 2.8 GHz, 4 GB, 1 TB, Mac OS X 10.6.8 and 10.7.2
    My iDisk is: <http://public.me.com/koenigyvan>
    Please: search for questions similar to your own before submitting them to the community.

  • Not uploading more than 150 rows from Excel file to Internal table.

    Hi All,
    We have a Z program to upload initial stock from an Excel file to SAP using a BAPI. The problem is that we have defined the last row as '65536', but it is not uploading more than 150 rows at a time. The piece of code is given below.
    DATA : it_excel LIKE alsmex_tabline OCCURS 0 WITH HEADER LINE.
    DATA: xcel TYPE TABLE OF alsmex_tabline WITH HEADER LINE.
    DATA : gd_scol   TYPE i VALUE '1',
           gd_srow   TYPE i VALUE '3',
           gd_ecol   TYPE i VALUE '256',
           gd_erow   TYPE i VALUE '65536'.
    PERFORM upload_excel_file TABLES   gt_out
                                  USING   p_file
                                          gd_scol
                                          gd_srow
                                          gd_ecol
                                          gd_erow.
    FORM upload_excel_file  TABLES   gt_out
                                       "Insert correct name for <...>
                            USING    p_p_file
                                     p_gd_scol
                                     p_gd_srow
                                     p_gd_ecol
                                     p_gd_erow.
      DATA : lt_intern TYPE  kcde_cells OCCURS 0 WITH HEADER LINE.
      DATA : ld_index TYPE i.
      FIELD-SYMBOLS: <fs> TYPE ANY.
      CALL FUNCTION 'KCD_EXCEL_OLE_TO_INT_CONVERT'
        EXPORTING
          filename    = p_p_file
          i_begin_col = p_gd_scol
          i_begin_row = p_gd_srow
          i_end_col   = p_gd_ecol
          i_end_row   = p_gd_erow
        TABLES
          intern      = lt_intern[].
      IF sy-subrc <> 0.
    * MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    *         WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
      IF lt_intern[] IS INITIAL.
        FORMAT COLOR COL_BACKGROUND INTENSIFIED.
        WRITE:/ 'No Data Uploaded'.
        EXIT.
      ELSE.
        SORT lt_intern BY row col.
        " Move each cell value into the matching component of the output
        " work area; append one output line whenever a row's last cell is reached.
        LOOP AT lt_intern.
          MOVE lt_intern-col TO ld_index.
          ASSIGN COMPONENT ld_index OF STRUCTURE gt_out TO <fs>.
          MOVE lt_intern-value TO <fs>.
          AT END OF row.
            APPEND gt_out.
            CLEAR gt_out.
          ENDAT.
        ENDLOOP.
      ENDIF.
    ENDFORM.           
    Please tell me what the problem is here; I want to upload all the rows at once.
    Thanks,
    With regards,
    Rosaline.

    Hi Rob, I confirm, you're nitpicking! :-D
    For information, I've noted a few SAP comments about releasing and released objects, for people who like reading:
    [SAP Library - Documenting and Releasing a Function Module|http://help.sap.com/saphelp_nw70/helpdata/en/d1/801f50454211d189710000e8322d00/frameset.htm]
    [Note 109533 - Use of SAP function modules|https://service.sap.com/sap/support/notes/109533]
    [Note 415983 - Modification/customer developments of SAP function modules|https://service.sap.com/sap/support/notes/415983]
    Example of an unreleased BAPI: [Note 577453 - Using BAPI BAPI_DELIVERYPROCESSING_EXEC|https://service.sap.com/sap/support/notes/577453]
    Sandra

  • ORA-01427: single-row subquery returns more than one row

    I have a problem something like this:
    table A
    id col1 after_update
    1 ? 11
    2 ? 22
    3 ? 33
    3 ? 44
    3 ? 55
    4 ? 66
    5 ? 88
    5 ? ? /* won't be updated */
    6 ? 99
    7 ? null /* updated to null */
    9 ? 20
    table B
    id col2
    1 11
    2 22
    3 33
    3 44
    3 55
    4 66
    4 77
    5 88
    6 99
    8 10
    9 20
    When I try the query:
    update table_a
    set col1 = (select distinct col2 from table_b);
    it gives me the error ORA-01427: single-row subquery returns more than one row.
    When I try the query:
    update (select x.col1 x_col1, y.col2 y_col2
            from (select id, col1,
                         row_number() over (partition by id order by null /*col1*/) rn
                  from table_a) x,
                 (select id, col2,
                         row_number() over (partition by id order by col2) rn
                  from table_b) y
            where x.id = y.id
              and x.rn = y.rn)
    set x_col1 = y_col2;
    it gives me an error saying ORA-01779: cannot modify a column which maps to a non-key-preserved table. It shows the same even when I add a key to the tables.
    My Database version is Oracle Database 11g Enterprise Edition Release 11.2.0.1.0
    Kindly help.

    Another way, not waterproof either since it relies on rowid, but in your case it could work. You have to verify it by yourself:
    update test_a aa
    set col1 =
        (select b.col2
         from (select id, col2,
                      row_number() over (partition by id order by col2) rn
               from table_b) b,
              (select rowid rid, id, col1,
                      row_number() over (partition by id order by col1) sort_order
               from test_a) a
         where a.id = b.id
           and sort_order = rn
           and aa.rowid = a.rid);
    Edited by: chris227 on 03.04.2013 06:42
    Table name corrected

Maybe you are looking for

  • Reading an PDF Attachment  From Sender Mail Adapter

    Hi All,                i am able to get a mail  from my Mail Server, But my aim is need to Read an PDF attachment from the mail, in SXMB Moni i am getting the Payload with attachment, I have a created a module to convert that PDF to XML(Module is wor

  • MENU PROBLEM WITH 6110 NAVIGATOR

    Hi, I have a Nokia Navigator 6110. Can anyone help me with these problems? Problem 1: when I press the menu button it goes to a single folder on the screen; the name of the folder is &qtn_apps_... and most of my other folders are in there. Problem 2: WHE

  • Adobe Acrobat 6.0 Professional in Spanish

    Hello, I'm new to this and I don't know much English. Does anyone know if there is an update or patch to get Adobe Acrobat 6.0 Professional in Spanish? THANK YOU.

  • Need help in OID user Trusted recon

    Hi all, I am using OIM 9.1.0.1 and OID 9.0.4.7. When I run the OID user trusted recon, it brings in users based on the page size. The problem is that if I set the page size to 100, it brought in 98 users, whereas I have 30,000 users in my OID. When I set the page si

  • Confirmation about status of a file right after closing the file

    I would like to know the status of a file, such as a screenshot on the Desktop, right after I open and then close it. The file is not released, which means I cannot edit the name or delete the file immediately. Please tell me the status of the file right after closing it. The file is af