Regarding the amount of data downloaded through Export to PDF in WAD

Hi, in our WAD we are using the Export to PDF option to download the data. Since our report has more than 300 pages, the data is not getting downloaded. If we restrict it to a subset (say, some 30 pages), then it works fine.
Is there any setting to overcome this problem?
I need to download all the data, no matter how many pages it is.
Does anyone have ideas regarding this?

Hi Dolly,
Export to PDF is assigned to a Web Item such as Analysis or Chart.
Go to the Analysis properties; there you can adjust the number of rows and columns per page.
Hope this helps.
Best Regards,
VVenkat..

Similar Messages

  • Why does a standalone program created in LabVIEW 8.5 try connecting to the internet when the program only reads data through the serial port? Firewalls object to programs that contact the internet without permission.

    The created program is not performing any command I have written when it tries to connect to the internet, so it must be LabVIEW that is doing it. How do I stop this from happening?
    Any help would be very appreciated.

    It looks that way..
    "When LabVIEW starts, it contacts the service locator to remove all services for itself. This request is triggering the firewall. This is done in case there were services that were not unregistered the last time LabVIEW executed, for example from VIs that didn't clean up after themselves."
    This is not yet fixed in LV2009.
    Message Edited by Ray.R on 11-04-2009 12:25 PM

  • Photoshop cannot print: the maximum amount of data that can be spooled to a PostScript printer is 2GB

    Hi all,
    This is the first time I've worked with the .psb (large format) size. I have gone to print a PDF of the file and I get the message:
    "Photoshop cannot print this document because the document is too large. The maximum amount of data that can be spooled to a PostScript printer is 2GB."
    The file itself is a 2700x1570mm 300dpi flattened .psb image that is 500MB in size.
    Not sure how I can get around this; where do I see the size of the image data that is being spooled?
    Any help would be great

    There's no easy way to see the size of the image that's being spooled, but the 2GB limit applies to uncompressed data: 4 bytes per pixel (either RGBX, where X is a padding byte, or CMYK, depending on the image/printer), times an expansion factor of about 1.6, because the image data is sent in ASCII85 format rather than binary so that it will make it across any connection.
    With your image at over 100 inches wide at 300 dpi, you're also perilously close to the limit of 32,767 pixels in any one dimension (31,900 by my calculations) that PostScript can handle.
    Do you really need a 10-foot image printed at 300 dpi? If not, changing down to 200 dpi will probably let this image print.
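    The arithmetic in that reply can be checked with a short sketch. The constants (4 bytes per pixel, ~1.6x ASCII85 expansion, the 2GB spool limit, the 32,767-pixel dimension limit) come from the answer itself; the class and method names are made up for illustration.

```java
// Rough estimate of PostScript spool size for an image, using the figures
// quoted in the reply above. Not Photoshop's actual internals.
public class SpoolSizeEstimator {
    static final double ASCII85_EXPANSION = 1.6;             // approximate, per the reply
    static final long SPOOL_LIMIT = 2L * 1024 * 1024 * 1024; // 2 GB spool limit
    static final int DIMENSION_LIMIT = 32767;                // per-dimension pixel limit

    // Convert a physical size in millimetres to pixels at a given resolution.
    static long pixels(double mm, int dpi) {
        return Math.round(mm / 25.4 * dpi);
    }

    // Estimated bytes spooled: pixels * 4 bytes/pixel * ASCII85 expansion.
    static long spoolBytes(double widthMm, double heightMm, int dpi) {
        return Math.round(pixels(widthMm, dpi) * (double) pixels(heightMm, dpi)
                          * 4 * ASCII85_EXPANSION);
    }

    public static void main(String[] args) {
        // The 2700 x 1570 mm image at 300 dpi from the question:
        System.out.println("width px @300dpi: " + pixels(2700, 300));
        System.out.println("spool MB @300dpi: " + spoolBytes(2700, 1570, 300) / (1024 * 1024));
        System.out.println("spool MB @200dpi: " + spoolBytes(2700, 1570, 200) / (1024 * 1024));
    }
}
```

    Running the numbers this way shows the 300 dpi version spooling well past 2GB, while the same image at 200 dpi comes in under the limit, which is why dropping the resolution should let it print.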

  • Issue with Export to PDF in WAD

    Hi gurus,
    I have an issue with taking an Export to PDF in WAD reporting.
    In my WAD, I have the below in order:
    1) company logo,
    2) an info field showing fiscal period and date,
    3) Report Designer, to fetch data,
    4) an info field to show the user ID.
    The WAD output is coming properly, but when I take a PDF output I get three pages: 1) and 2) are on the first page, 3) is on the second page, and 4) is on the third page, which is a single line.
    I want everything on one page. Please help me with how to achieve this.
    Thanks
    Regards
    Pradeep

    Hi Pradeep,
    I don't think this can be done. There are various issues with exporting web items to PDF, one of them being formatting. We had a similar issue and opened an OSS message with SAP, and they stated this was how it was designed and that SAP does not provide a better way of formatting PDFs.
    Thanks,
    Nick.

  • How-to list the contents of a Data Pump Export file?

    How can I list the contents of a 10gR2 Data Pump Export file? I'm looking at the Syntax Diagram for Data Pump Import and can't see a list-only option.
    Regards,
    Al Malin

    Use the SQLFILE parameter of impdp, which writes all the SQL DDL to the specified file.
    http://download-west.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm
    SQLFILE
    Default: none
    Purpose
    Specifies a file into which all of the SQL DDL that Import would have executed, based on other parameters, is written.
    Syntax and Description
    SQLFILE=[directory_object:]file_name
    The file_name specifies where the import job will write the DDL that would be executed during the job. The SQL is not actually executed, and the target system remains unchanged. The file is written to the directory object specified in the DIRECTORY parameter, unless another directory_object is explicitly specified here. Any existing file that has a name matching the one specified with this parameter is overwritten.
    Note that passwords are not included in the SQL file. For example, if a CONNECT statement is part of the DDL that was executed, it will be replaced by a comment with only the schema name shown. In the following example, the dashes indicate that a comment follows, and the hr schema name is shown, but not the password.
    -- CONNECT hr
    Therefore, before you can execute the SQL file, you must edit it by removing the dashes indicating a comment and adding the password for the hr schema (in this case, the password is also hr), as follows:
    CONNECT hr/hr
    For Streams and other Oracle database options, anonymous PL/SQL blocks may appear within the SQLFILE output. They should not be executed directly.
    Example
    The following is an example of using the SQLFILE parameter. You can create the expfull.dmp dump file used in this example by running the example provided for the Export FULL parameter. See FULL.
    impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp SQLFILE=dpump_dir2:expfull.sql
    A SQL file named expfull.sql is written to dpump_dir2.
    Message was edited by:
    Ranga

  • What is the MAX amount of DATA for JDBC?

    Hello,
    Did anybody have any experience with how JDBC works with a large amount of data, say > 1/2 GB or more?
    Any help appreciated.
    Thanks.
    Paul

    Always depends on the driver implementation. Each driver/database combination deals with massive amounts of data in a single transaction differently. Can you be more specific? In general, most drivers of commercial-grade quality and/or shipped with the top 5 RDBMS have pretty solid implementations, limited solely by the network bandwidth to carry this info. A good developer, however, will not rely on the driver to manage this much info and will develop a scheme to minimize the impact of such large data transfers on an unsuspecting database or client.
    This is what I thought. I just need more information. If, for example, I have a Java process which feeds data from an RDBMS and the amount of data the process receives is, say, 2 GB, my guess is this all gets into a "ResultSet" object and thus all feeds into the VM (memory), so I have to design my process such that it takes only a small sample of data and iterates until I get all the information from my query. E.g., for
    select * from test where data between 1 and 2000000;
    I have to break this down by "days", so my code will be something like
    for (int i = 0; i < 2000000; i++) {
        String select = "select * from test where data = " + i;
        // do something else
    }
    Am I close or far off? Or is there another way, where maybe the JDBC driver handles this behind the scenes?
    Well, personally I think you should be structuring your SQL better than that. Making 2,000,000 calls over a network in a for loop is suicidal.. and let's not forget how something like that will block your application.
    First, you can do something like "select * from test where data <= 2000000" in one shot. Now you get the whole set and you can scroll through and do your operation (of course this is slow too, but it's much better than running the query 2,000,000 times).
    The other thing you have to consider is: is it really necessary to block during an operation like this? Can it be done in another thread? A daemon process even? Some service which can be polled for results when needed?
    This is a tricky example and I'd be interested to know more about what you are trying to do. When you were talking about large data I was under the impression of something like a blob, a clob, or any sort of single query with 2GB of data in the result.
    Can you break up your query? Doing things in batches in an alternative thread can yield to other processes, perhaps?
    Oh, and also: can the database itself take care of this business? Stored procedures and internal functions are all built to handle massive amounts of data with minimal impact. Can you leverage the technology within the database to accomplish your goal? Sometimes a well-written procedure can save you tons of work in your business logic tier.
    The right tools for the right job..
    L
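    The batching idea from the exchange above can be sketched as follows. This is a minimal illustration, not anyone's actual code from the thread; the table and column names ("test", "data") come from the posted query, and the chunk size is an arbitrary choice.

```java
import java.util.ArrayList;
import java.util.List;

// Instead of one query per row (2,000,000 round trips) or one giant 2 GB
// ResultSet, split the key range into chunks and fetch one chunk at a time.
public class RangeBatcher {

    // Split [from, to] inclusive into consecutive sub-ranges of at most chunkSize keys.
    static List<long[]> ranges(long from, long to, long chunkSize) {
        List<long[]> out = new ArrayList<>();
        for (long lo = from; lo <= to; lo += chunkSize) {
            out.add(new long[] { lo, Math.min(lo + chunkSize - 1, to) });
        }
        return out;
    }

    public static void main(String[] args) {
        for (long[] r : ranges(1, 2_000_000, 500_000)) {
            // One query per chunk: 4 round trips instead of 2,000,000.
            String sql = "select * from test where data between " + r[0] + " and " + r[1];
            System.out.println(sql);
        }
    }
}
```

    With a real JDBC Connection you would bind the bounds through a PreparedStatement with `?` placeholders and call setFetchSize on the statement, so the driver streams each chunk's rows instead of buffering them all in memory.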

  • Can we download aggregate's data through open hub?

    Hi,
    Can we download cube aggregate data to a file or InfoSpoke? If yes, please send me detailed steps.
    Ritika

    Hi Ritika,
    There is no straightforward way of extracting data from an aggregate's table, as an OHD can be built on the following object types: DataSource, InfoSource, DSO, InfoObject and InfoCube.
    So you can give it a try by creating a generic DataSource on the desired aggregate table, and use it in an Open Hub destination.
    Hope this helps,
    Regards,
    Umesh

  • What is associated with the value of the HR2000+'s Spectral Data through the VISA driver

    Hey all,
    I've got the HR2000+ES here and I've successfully got it talking with LabVIEW, and I can now change integration times and pull spectrum data. I'm aware that I will have to take into account the CCD's lack of efficiency at the higher and lower wavelengths; however, I don't believe I'm at that step yet. In my dark spectra, I receive values of roughly 8700 across all pixels. I haven't a clue what exactly that corresponds to. Does this correlate to a saturation value of each pixel? I know I'm getting close to having this operational, but it seems this one obstacle stands in my way.
    Thank you for any help or ideas anyone can offer,
    -John

    Hi John,
    Thanks for your post. It sounds like you're acquiring your data from your instrument through VISA correctly. I'm a little bit unsure of where exactly you need assistance. Is your question regarding getting your data into LabVIEW, or is it about interpreting the data that you do receive? The manufacturer of your device would be able to help explain what the values you are receiving correspond to in terms of the pixels (saturation value or otherwise). However, if you're wondering how to manipulate your data in LabVIEW or need help with specifications for VISA, etc then that's something that we can certainly help you out with from our side!
    Best,
    Courtney L.
    Applications Engineer
    National Instruments

  • Need help regarding the maximum limit for a data type

    Hi,
    This is Sravan. For my application I am inserting bulk data in XML format into a column of a database table.
    The inserted XML data will be in a format like:
    '<ACC count = "10">
    <Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
    <Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
    <Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
    <Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
    <Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
    <Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
    <Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
    <Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
    <Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
    <Acc ac_no = "1111111111" cn_nr = "US" eflag = "Y" />
    </ACC>'
    This XML data can have more than 1000 accounts.
    Now I need to take a parameter value from an XML node and write it into a file. For this I have written a procedure that loops over the nodes of the XML data and builds a dynamic query like:
    if nvl(v_int, 0) > 0 then
        v_sql := '';
        for v_count in 1 .. v_int loop
            if v_sql is null then
                v_sql := 'select extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@ac_no'')'
                      || ' || extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@cn_nr'')'
                      || ' || extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@eflag'') string from sample1';
            else
                v_sql := v_sql || ' union all select extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@ac_no'')'
                      || ' || extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@cn_nr'')'
                      || ' || extractvalue(empdetails, ''/ACC/Acc' || v_count || '/@eflag'') string from sample1';
            end if;
        end loop;
    end if;
    I get the variable "v_int" from the <ACC count = "10"> count attribute; here 10 is the value for v_int, but in real time the value can be more than 1000.
    When building the dynamic query I use the variable v_sql, which is of data type LONG. This variable stores the dynamic query built while the procedure executes, but v_sql is throwing the exception
    "numeric or value error: character string buffer too small"
    even though the LONG data type should store up to 2GB. The variable is not able to hold this much dynamically built data.
    Will you please help me resolve this issue, or suggest another method to pick the data from the XML column?
    Thanks,
    Sravan.

    user11176969 wrote:
    I changed the code, and now it's working fine. Directly assigning the dynamic query to a CLOB variable raised an error, so for the dynamic query building I used another variable and assigned its value to the actual query variable.
    Nice!

  • Every time I try to download a PDF program, an AppleScript comes up and the download does not go through. What is the problem, and why can I not download?

    I try downloading games and things, but every time I do, an AppleScript comes up with an encrypted file. It has done this since I bought this laptop. Please help me.

    Okay, can you reply letting me know if the computer was purchased used? This will help, as it would identify the possibility that the prior owner did indeed create some sort of block.
    Next, about the blocks you mention in regards to downloading 'PDF software': is it PDF 'software' or PDF files that are blocked? It appears that maybe it's the PDF files which are blocked, in which case it may indeed be programmed into the system.
    Please answer these and we can further assist you in reaching an answer. As for the list you have provided, I would need more than simple bullets; if you can provide the location of these 'bullets' and the file extension, that would get the ball rolling.
    Thanks
    Thanks

  • Trying to download the first 4.5GB CS6 Design Web Premium package several times, but the download cuts out midway through...

    I'm using Windows 8.1 (fully updated) and Chrome.
    This issue is really eating my bandwidth, since I have to delete the file and download it again. The final file ends up with different sizes, from approximately 0.5GB to 2.0GB.

    I've moved to a different house/network and plugged in through wired Ethernet.
    I tried two different download managers, and also tried Internet Explorer and Firefox as well... I'm currently using the computer while it's downloading, so it's not going to sleep. I've also turned off my firewalls and anti-virus, and it has "completed" the download four additional times, all of which stopped at exactly 480,704 KB.
    I'm out of ideas... Are there any Adobe download clients that will download each program within the CS6 package and install individually?

  • Regarding the Status of Loaded Data in Report.

    Hi All,
    My client wants to view the status of the data that is represented in the reports. For example, we have data till today in R/3 but have only loaded data till yesterday, and he wants that represented in the report, like "Data as on May 8th". Is there any script that can be written to get this information dynamically in the report header?

    Hi YJ,
    On the web you can use the web item 'text element' and choose element type 'general text' with element ID ROLLUPTIME:
    <object>
        <param name="OWNER" value="SAP_BW">
        <param name="CMD" value="GET_ITEM">
        <param name="NAME" value="TextElements_1">
        <param name="ITEM_CLASS" value="CL_RSR_WWW_ITEM_TEXT_ELEMENTS">
        <param name="DATA_PROVIDER" value="Data_Provider">
        <param name="GENERATE_CAPTION" value="">
        <param name="GENERATE_LINKS" value="">
        <param name="SHOW_FILTERS" value="">
        <param name="SHOW_VARIABLES" value="">
        <param name="ELEMENT_TYPE_1" value="COMMON">
        <param name="ELEMENT_NAME_1" value="ROLLUPTIME">
        <param name="ONLY_VALUES" value="X">
        ITEM:            TextElements_1
    </object>
    In BEx Analyzer, you can display the date via the menu Business Explorer -> Display Text Elements -> General; you will see some info, such as 'author'... 'last refreshed'.
    hope this helps.

  • Any way to control the order in which data is exported?

    I found it surprisingly easy to set up a form to submit its contents via a formmail script -- perhaps too easy...
    Using Acrobat 9 (and the demo version of Acrobat X), I've modified a form that students fill out so that when they hit the submit button it pushes the form contents to a script which relays it to an e-mail address specified on the form -- no problems there.
    But the order in which those fields are listed in the e-mail bears no relation to the order the fields appear in the form -- or alphabetical order of the field names or data values or the tab order. It's the same order every time. And as far as I can tell the mailing script isn't doing anything to change the order of the data.
    This is mostly an issue of making the e-mail friendly for the supervisor who has to look at the e-mails as they come in. (I've done a number of searches on Google and here in the forums before posting and haven't found any references to similar problems, so maybe I do have a funky script.)
    I'm wondering if a) there's something native to the form that could change the order of the data and b) there's some straightforward way of controlling that. (I imagine I could write some JavaScript to control the output I had to, but it seems like a needless step.)
    Thanks for any insight anyone can provide.

    A solution would be to modify the server script to output in the order you want, as opposed to just looping over the collection of fields present in the incoming data as formmail type scripts typically do.
    That is what I wound up doing: making a copy of the mailing script and reformatting the message body string to contain just the data responses my colleague wanted, in the order he wanted. It beats the old method of requiring users to fill out the PDF, save the document and then e-mail in the PDF as an attachment.
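    The fix described above, iterating a fixed list of field names instead of whatever order the submitted collection arrives in, can be sketched like this. The field names and message format are invented for illustration; a real formmail script would do the same thing in its own language.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Build the e-mail body by walking a caller-supplied field order,
// looking each name up in the submitted form data.
public class OrderedMail {
    static String body(Map<String, String> submitted, String... order) {
        StringBuilder sb = new StringBuilder();
        for (String name : order) {
            // Missing fields print as blank rather than being skipped,
            // so the supervisor always sees the same layout.
            sb.append(name).append(": ")
              .append(submitted.getOrDefault(name, "")).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("Course", "Biology 101"); // arrival order differs from
        fields.put("Name", "A. Student");    // the order we want to print
        System.out.print(body(fields, "Name", "Course"));
    }
}
```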

  • Regarding the time-dependent master data error

    Hi Masters,
    I've loaded time-dependent master data from a flat file, only two fields. When I checked the corresponding characteristic in maintain master data, it contains by default a from-date of 01.01.2007 and a to-date of 31.12.9999 as well. Could you please help me rectify this error.
    Thanks in advance
    Raja.S

    Hi Antonino La Vela,
    I have 2 project managers, with different durations for different projects.
    The following is my data in the Excel sheet:
    PM Name                    To date             From Date           Costcenter
    Ragunath                    01.09.2007        01.06.2006           Project name1
    Ramana mani              01.02.2008         02.09.2007          Project name2
    While loading the above data, I'm getting the following in maintain master data:
    PM name                   To Date             From Date           Costcenter
                                     31.12.9999         01.01.1000
    Ragunath                   31.05.2007         01.01.1000
    Ragunath                   01.09.2007         01.06.2007            Project Name1
    Ragunath                   31.12.9999         02.09.2007 
    Ramana mani             01.09.2007          01.01.1000
    Raman mani               01.02.2008          02.09.2007           Project Name2
    Raman mani               31.12.9999           02.02.2008   
    Could you please help me understand how these unnecessary records are loaded by default?
    Thanks in Advance
    Raja.S

  • How to execute the downloaded RDL and then render it to PDF

    Dear all,
    I am trying to render an SSRS report from MS CRM 2013 (online version). I can download the report from the server through the MS CRM SDK, but I want to execute and render that RDL file in PDF format.
    Any suggestions?

    Hi Anwar,
    According to your description, you want to render a report which exists in CRM to a PDF file.
    In your scenario, you should publish the report for external use; then, in Visual Studio, make a web service reference to the SQL Server Reporting Services web service and use a snippet of code to render the file and save the output to a file. For more information, please refer to this article:
    Programmatically rendering a SQL Server Reporting Services (SSRS) report from Microsoft CRM 4.0 and capturing output in a file.
    Reference:
    MS CRM 2011: General approaches to generation of reports
    If you have any question, please feel free to ask.
    Best regards,
    Qiuyun Yu
    Qiuyun Yu
    TechNet Community Support
