Export cannot handle large numbers

Exporting to TXT or CSV fails when columns contain large numbers. I presume the export uses 32-bit signed integers. The result is blank fields wherever the contents are too large. I have tested on both Windows 2000 and Unix with the same results.
Eddie

A similar problem occurs with the visualization, as mentioned in:
acctno is null when displaying data
It may be that the problem with the EXPORT has the same cause as the problem with the visualization.
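
To make the suspected failure mode concrete, here is a minimal Java sketch (not the actual export code; the class name and values are made up) showing how a value larger than Integer.MAX_VALUE silently overflows when forced into a 32-bit signed integer, and how keeping it in a 64-bit long avoids the problem:

public class ExportOverflowDemo {
    public static void main(String[] args) {
        long value = 3_000_000_000L;     // larger than Integer.MAX_VALUE (2,147,483,647)
        int truncated = (int) value;     // silent overflow when narrowed to 32 bits
        System.out.println("64-bit field: " + value);       // 3000000000
        System.out.println("32-bit field: " + truncated);   // -1294967296
        // An exporter that writes the 64-bit value directly produces the correct CSV field
        System.out.println(value + "," + truncated);
    }
}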

Similar Messages

  • BIP cannot handle large CLOB data.

    We have OBIEE 1.3.4.1 on Red Hat; the database is Oracle 11.2.0.2 on the same box.
    In Publisher, I built a report with a table containing a CLOB column.
    BIP can handle CLOB data beyond the 4000-character limit of VARCHAR2, but it fails on one of the rows whose CLOB is over 25000 characters. The error message is "The report cannot be rendered because of an error, please contact the administrator". Is this really beyond the limit of Publisher, or is there a place where I can configure BIP to handle it?
    Thanks

    If I do not use any template and just show the XML in the BIP viewer, I get a more meaningful error: "A semi colon character was expected. Error processing resource 'http://cchdb.thinkstream.com:9704/xmlpserver/servlet/xdo'." ...
    10/31/2006  LA-DOC P&P W ORLEANS, LA      PROB: POSS OF COCAINE BEGIN 10/30/2006 END 10/30
    ----------------------^ This appears to be due to the '&'. How can I make BIP treat '&' as an ordinary character?
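
    In general, a literal '&' in XML data has to be escaped as '&amp;' before the document is parsed; '<', '>', quotes and apostrophes have equivalent entities. As a hedged illustration only (this is generic Java, not a BIP API, and the class and method names are made up), this is the kind of substitution the data needs:

    public class XmlEscapeDemo {
        // Escape the XML special characters; '&' must be replaced first
        static String escapeXml(String raw) {
            return raw.replace("&", "&amp;")
                      .replace("<", "&lt;")
                      .replace(">", "&gt;")
                      .replace("\"", "&quot;")
                      .replace("'", "&apos;");
        }

        public static void main(String[] args) {
            System.out.println(escapeXml("LA-DOC P&P W ORLEANS, LA"));
            // prints: LA-DOC P&amp;P W ORLEANS, LA
        }
    }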

  • HT3350 what can you do when you get this warning? "Scouting Book  Collection Index" could not be handled because Numbers cannot open files in the "Numbers Document" format.

    “Scouting Book  Collection Index” could not be handled because Numbers cannot open files in the “Numbers Document” format.

    This file was created in 2002 in MS Word, but that had been forgotten.
    When I went to open it, MS Word was not a program option on my laptop. For some reason, I assumed that either Pages or Numbers would open the file. How the file was created had been forgotten and it had no file Info, but I knew that it was a table of valuable information.
    The warning threw me off, because it led me to believe that it was a Numbers file, which Numbers would not open. A paradox.
    I took the file to a laptop running OS X Panther with MS Office, and it opened in MS Word. Problem solved.
    Thank you for your help.

  • Version-0.apversion could not be handled because Numbers cannot open files of this type

    Hi - I created a file in Numbers a couple of years ago and have been able to update it monthly since its creation.
    This morning I have tried to open the file and am getting the error "Version-0.apversion" could not be handled because Numbers cannot open files of this type.
    Any ideas for a fix?

    Hi Amelie,
    There are three major versions of Numbers:
    Numbers '08
    Numbers '09
    Numbers 3
    It is possible that an automatic update by App Store has installed the latest version of Numbers (Numbers 3) on your machine.
    Numbers 3 cannot open a Numbers '08 document.
    Which version of Numbers is giving that error message? (In Numbers, Menu > Numbers > About Numbers.)
    Look in your Applications folder for sub-folders such as iWork '08 and iWork '09.
    Please call back with questions.
    Regards,
    Ian.

  • Handling tables with large numbers of fields

    Hi
    What is the best practice to deal with tables having large numbers of fields? Ideally, I would like to create folders under a Presentation Table and group fields into folders (and leave fields that may be needed rarely in a folder named 'Other Information').
    Is there a way to do this in Oracle BI? Any alternatives?
    Thanks

    Answering my own question:
    http://oraclebizint.wordpress.com/2008/01/31/oracle-bi-ee-10133-nesting-folders-in-presentation-layer-and-answers/
    This is a working solution (create multiple presentation tables and enter '->' in their description so that they act as subfolders). It is definitely not intuitive and rather ugly, especially since reordering tables and columns doesn't seem to be possible (or is it, in some other non-obvious way?).
    Anyway it seems we have to live with this.

  • How do I convert an exported excel document to numbers? I am able to get the data into numbers but I can't read it.

    I am trying to get an Excel document I exported from Wufoo into Numbers on my Mac. It imports, but I cannot read the data.

    I got it to work!! Just had to select Commas (.csv) instead of excel for the export. What happened before was that I chose .xls to export the data and when I tried to open the file in Numbers, it showed an error.
    Import Warning - This is a tab delimited document, not a valid Excel document. The data might look different.
    The data was all symbols, etc.
    Thanks for replying to my question!

  • Business Partner records with large numbers of addresses -- Move-in issue

    Friends,
    Our recent CCS implementation (ECC 6.0 EhP3 & CRM 2007) included the creation of some Business Partner records with large numbers of addresses. Most of these are associated with housing authorities, large developers and large apartment complex owners. Some of the Business Partners have over 1000 address records, and one particular BP has over 6000 addresses that were migrated from our legacy system. We are experiencing very long run times when we try to execute move-ins and move-outs, because the system reads the full volume of addresses attached to the Business Partner. In many cases, the system simply times out before it can execute the transaction. SAP's suggestion is that we run a BAPI to cleanse the addresses and also implement a BAdI to prevent the creation of excess addresses.
    Two questions surround the implementation of this code. First, will the BAPI that cleanses the addresses wipe out all address records except for the standard address? That raises the need to ensure that the standard address on the BP record is the one we have identified as the proper mailing address. The second question is about the BAdI to prevent the creation of excess addresses. It looks like this BAdI will prevent the move-in address from updating the standard address on the BP record, which in the vast majority of cases is exactly what we would want.
    Does anyone have any experience with this situation of excess BP addresses and how did you handle the manipulation and cleansing of the data and how do you maintain it going forward?
    Our solution is ECC6.0Ehp3 with CRM2007...latest patch level
    Specifically, SAP suggested we apply/review these notes:
    Note 1249787 - Performance problem during move-in with huge addresses
    **applied this ....did not help
    Note 861528 - Performance in move-in for partner w/ large no of addresses
    **older ISU4.7 note
    Directly from our SAP message:
    use the function module
    BAPI_BUPA_ADDRESS_REMOVE or run BAPI_ISUPARTNER_CHANGE to delete
    unnecessary business partner addresses.
    Use BAdI ISU_MOVEIN_CUSTOMIZE to avoid the creation of unnecessary
    business partner addresses (cf. note 706686) in the future for that
    business partner.
    Note 706686 - Move-in: Avoid unnecessary business partner addresses
    Does anyone have any suggestions and have you used above notes/FMs to resolve something like this?
    Thanks,
    Nick

    Nick:
    One thing to understand is that the BAdI and BAPI are just the tools or mechanisms that will enable you to fix this situation. You or your development team will need to define the rules under which these tools are used. Let's take them one at a time.
    BAPI - the BAPI for business partner address maintenance. It would seem that you need to create a program which first reads the partners and the addresses assigned to them, and then compares these addresses to each other to find duplicates. The duplicates can then be removed, provided they are not used elsewhere in the system (i.e. on a contract account).
    BADI - the BAdI for business partner address maintenance. Here you would need to identify the particular scenarios where addresses should not be copied. I would expect that most move-ins would meet the criteria of adding the address and changing the standard address. But for some, such as landlords or housing complexes, you might not add an address because it already exists for the business partner, and you might not change the standard address because those accounts do not fall under that scenario. This will take some thinking and design to ensure that the address add/change functions are executed under the right circumstances.
    regards,
    bill.

  • Very Large Numbers Question

    I am a student with a question about how Java handles very large numbers. Regarding this from our teacher: "...the program produces values that
    are larger than Java can represent and the obvious way to test their size does not
    work. That means that a test that uses >= rather than < won't work properly, and you
    will have to devise something else..." I am wondering about the semantics of that statement.
    Does Java "know" the number in order to use it in other types of mathematical expressions, or does Java "see" the value only as gibberish?
    I am waiting on a response from the teacher on whether we are allowed to use BigInteger and the like, BTW. As the given program stands, double is used. Thanks for any help understanding this issue!

    You're gonna love this one...

    package forums;

    // Incrementing past Integer.MAX_VALUE silently wraps around to a negative value,
    // which is what finally makes the while condition false.
    class IntegerOverflowTesterator {
      public static void main(String[] args) {
        int i = Integer.MAX_VALUE - 1;
        while (i > 0) {
          System.out.println("DEBUG: i=" + i);
          i++;
        }
      }
    }

    You also need to handle the negative case... and that gets nasty real fast... A positive plus/times a positive may overflow, but so might a negative plus a negative.
    This is a decent summary of the underlying problem: http://mindprod.com/jgloss/gotchas.html#OVERFLOW.
    The POSIX specification is also worth reading regarding floating-point arithmetic standards... Start here http://en.wikipedia.org/wiki/POSIX I guess... and I suppose the JLS might be worth a look too: http://java.sun.com/docs/books/jls/second_edition/html/typesValues.doc.html
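
    If BigInteger turns out to be allowed, it sidesteps the overflow test entirely, and Math.addExact (Java 8+) is another way to detect overflow instead of testing with >= or <. A small sketch, not from the original assignment:

    import java.math.BigInteger;

    public class BigNumberDemo {
        public static void main(String[] args) {
            // Plain int arithmetic silently wraps around
            System.out.println(Integer.MAX_VALUE + 1);            // -2147483648

            // Math.addExact throws instead of wrapping, so overflow is detectable
            try {
                Math.addExact(Integer.MAX_VALUE, 1);
            } catch (ArithmeticException e) {
                System.out.println("overflow detected: " + e.getMessage());
            }

            // BigInteger grows as needed, so ordinary comparisons keep working
            BigInteger big = BigInteger.valueOf(Long.MAX_VALUE).multiply(BigInteger.TEN);
            System.out.println(big.compareTo(BigInteger.valueOf(Long.MAX_VALUE)) > 0);   // true
        }
    }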

  • Best practices for speeding up Mail with large numbers of mail?

    I have over 100,000 mails going back about 7 years in multiple accounts in dozens of folders using up nearly 3GB of disk space.
    Things are starting to drag - particularly when it comes to opening folders.
    I suspect the main problem is having large numbers of mails in those folders that are the slowest - like maybe a few thousand at a time or more.
    What are some best practices for dealing with very large amounts of mails?
    Are smart mailboxes faster to deal with? I would think they would be slower because the original emails would tend to not get filed as often, leading to even larger mailboxes. And the search time takes a lot, doesn't it?
    Are there utilities for auto-filing messages in large mailboxes to, say, divide them up by month to make the mailboxes smaller? Would that speed things up?
    Or what about moving older messages out of mail to a database where they are still searchable but not weighing down on Mail itself?
    Suggestions are welcome!
    Thanks!
    doug

    Smart mailboxes obviously cannot be any faster than real mailboxes, and storing large amounts of mail in a single mailbox is asking for trouble. Rather than organizing mail in mailboxes by month, however, what I like to do is organize it by year, with subfolders by topic for each year. You may also want to take a look at the following article:
    http://www.hawkwings.net/2006/08/21/can-mailapp-cope-with-heavy-loads/
    That said, it could be that you need to re-create the index, which you can do as follows:
    1. Quit Mail if it’s running.
    2. In the Finder, go to ~/Library/Mail/. Make a backup copy of this folder, just in case something goes wrong, e.g. by dragging it to the Desktop while holding the Option (Alt) key down. This is where all your mail is stored.
    3. Locate Envelope Index and move it to the Trash. If you see an Envelope Index-journal file there, delete it as well.
    4. Move any “IMAP-”, “Mac-”, or “Exchange-” account folders to the Trash. Note that you can do this with IMAP-type accounts because they store mail on the server and Mail can easily re-create them. DON’T trash any “POP-” account folders, as that would cause all mail stored there to be lost.
    5. Open Mail. It will tell you that your mail needs to be “imported”. Click Continue and Mail will proceed to re-create Envelope Index -- Mail says it’s “importing”, but it just re-creates the index if the mailboxes are already in Mail 2.x format.
    6. As a side effect of having removed the IMAP account folders, those accounts may be in an “offline” state now. Do Mailbox > Go Online to bring them back online.
    Note: For those not familiarized with the ~/ notation, it refers to the user’s home folder, i.e. ~/Library is the Library folder within the user’s home folder.

  • Why am I losing quality when exporting to PDF from Numbers?

    Hi guys, I need to export a fairly large table from Numbers to PDF so that I can get it into a report. It looks beautiful in Numbers, but when I go to print, size it the way I want, and save as PDF, I get a crappy-looking table. Lines have just randomly disappeared or faded. What is going on and how do I fix this?! It looks great in Numbers, so I don't understand why it should look any different when I print to PDF. Here is a screen shot inside Numbers, notice the lines look perfect... then a screen shot after exporting, in Preview, notice how some of the lines are totally gone. You'll see the rows for "depth" have a dotted line to divide them... in Preview the dotted line just disappears in one spot, but not in another. It's totally random and I've no idea how to fix it.

    Hi randiceleste,
    How are you applying that dotted Cell Border?
    Using 1 point dotted Border below a row in Numbers:
    Then Menu > File > Print... > Print... > PDF > Save as PDF
    What point size are you applying?
    Regards,
    Ian.

  • FCPX + Lion + Cannot handle AAC or Apple Lossless audio in H.264! (Clicks and Pops)

    This is a problem I've been having since I upgraded to Lion, and I've posted this response in other forums, but wanted to start a fresh one with a video I created of my findings. In a nutshell: FCPX in Lion cannot handle AAC audio in H.264 files since it will create audible clicks and pops when it subsequently renders the footage. This ***** because the iPhone 4 for example records as H.264 and AAC for the audio. Below are my findings:
    So I think I figured it out! (I'm pretty sure FCPX + Lion has issues with compressed audio formats, ESPECIALLY AAC and Apple Lossless!) There's a video of my findings below:
    1. I tried doing exactly the same thing I was trying to do on my machine on a co-worker's Snow Leopard MBP, and it worked out fine (it was just importing an iPhone 4 .mov and exporting it in FCPX). The output file was pretty much the same as what came in.
    2. I found out that only movies from my iPhone 4 were coming out all crazy in the audio, yet my Canon T2i files were fine. Both are H.264, yet the Canon T2i records audio as Linear PCM, while the iPhone 4 records audio as AAC.
    3. So I got an idea to use screen flow to record a youtube video and record the computer's audio. Then I decided to export with different formats, both as HD NTSC standards and as web standards with H.264.
    4. I found out that both NTSC and H.264 standards were perfectly fine, SO LONG AS THE AUDIO WAS LINEAR PCM (or uncompressed!).
    5. I then exported the same clip (both times as H.264) and ONLY changed how the audio was rendered (either Linear PCM/Uncompressed or AAC) and VOILA, I got click and pop artifacts ONLY in the AAC version. The ones that were output as uncompressed audio were totally fine!
    CONCLUSION:
    Final Cut Pro X on OS X Lion has issues with compressed audio, MAINLY AAC and Apple Lossless! Anything that is Linear PCM/Uncompressed should be fine! For example, MPEG 4 AAC Enhanced Low Delay at 320K came out 95% OK, one or two clicks.
    So if any of us are working with material where the audio came to us already as AAC (like an iPhone 4), then we have to rip the audio out somehow first (like through VLC for example) and import it separately as an uncompressed file.
    Here's a youtube video showing exactly what I'm talking about:
    http://www.youtube.com/watch?v=FDw4btShH0s
    Anybody else have any thoughts on this before Apple comes out with a fix?

    I was having the same problem and spent several hours on the phone with Apple working with one of their third-level support guys. There is definitely an issue with Lion and FCPX. I was told they would escalate to an FCPX engineer and get back to me in two days. In the meantime I went to my local Apple Store and was able to recreate the issue on every machine I tried. Apple did call back and confirm that there is an issue with the way FCPX and Lion handle the import and export of AAC files. They said it would hopefully be fixed in a future update.
    The best workaround I found is to take your AAC file and convert it to a WAV. I used Compressor to do this and it seems to have solved the problem.

  • Error adding large numbers

    I am adding large numbers and getting the wrong result. There seems to be some rounding taking place in the sum, but I am adding integers. I am using DASYLab 9.02, and the data is summed in the Arithmetic module. Example problem: 331153408 - 31570 = 331121838, but the output is 331121824. I tried making the variable where the inputs are stored 20 digits with 10 decimals, but that did not help, and I also tried dividing first by 1000 and 10000, only to get different answers. Is there a setting that needs to be configured differently?

    Hi Tom, thanks for the reply. I am reading a hex value in from a serial port. The number is large, and when I format it as hex on one channel it is off by a small amount; there is some rounding in the least significant digit. I then take another reading later and calculate the delta. Since I don't have the right values to begin with, my difference calculation is wrong. When I read as bytes through 8 channels, I can see the ASCII for each digit and that they are correctly displayed. Using a Formula module I can convert from ASCII to decimal so that I get the decimal equivalent of each hex character, and in the next formula I do the math to find the value of each hex digit in the place it holds. Then, using a Sum arithmetic module, I get the final value of the large number coming in. It is correct all the way up to the arithmetic sum. I tried cutting the large hex number into two parts and then adding up the weighted parts, and I still have the wrong answer in the display module. I also tried dividing the halves by 1000 prior to adding them so that I was working with smaller numbers in the summation, but that didn't help.
    So I did the math directly in the extended portion of the variables. The numbers add up properly there, but when I try to bring the correct sum back into the worksheet to display it, it is wrong again. It seems that a value around 04000000 hex is the limit: below that I get the right value displayed that was calculated in the variable field; above it there is some degree of variation. I can set the limit of cycles to a value below where the addition becomes problematic, or I can export the hex to a spreadsheet, do the math there and then bring it back in, but I will still have the same issue displaying the answer.
    The limitation doesn't seem to be in DASYLab in general, but in the Read, Formula and Constant Generator modules that read the variable back into the worksheet. It is displayed properly in the contents window.
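
    DASYLab internals aside, the symptom matches single-precision floating point: above roughly 2^24, not every integer can be represented exactly, so large sums get rounded to the nearest representable value. A small Java illustration using the numbers from the post (this only demonstrates the rounding behaviour, it is not DASYLab code):

    public class FloatPrecisionDemo {
        public static void main(String[] args) {
            int exact = 331153408 - 31570;           // 331121838, exact in 32-bit integer math
            float asFloat = 331153408f - 31570f;     // single precision: about 7 significant digits
            double asDouble = 331153408d - 31570d;   // double precision keeps this value exact
            System.out.println("int    : " + exact);               // 331121838
            System.out.println("float  : " + (long) asFloat);      // 331121824 (rounded)
            System.out.println("double : " + (long) asDouble);     // 331121838
        }
    }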

  • Cannot handle the request because a modal dialog or alert is active in indesign CS6, C#

    Hello,
    I have a bunch of InDesign CS6 files. I have written a piece of code in C# which opens the InDesign files and exports them to XML and PDF. If I run the exe manually, it works properly. But when I schedule it via Task Scheduler (so it runs at a set time without human intervention), it throws the error "Cannot handle the request because a modal dialog or alert is active", which results in the failure of my process. This error keeps annoying me. How can I overcome it?
    Any help would be appreciated. Thanks in advance.

    Hello,
    Thank you for your post.
    However, your issue is outside the scope of the VS General Questions forum, which mainly discusses usage of the Visual Studio IDE, such as
    the WPF & SL designer, the Visual Studio Guidance Automation Toolkit, Developer Documentation and Help System,
    and the Visual Studio editor.
    Because your issue involves InDesign CS6, which is a third-party product we don't support, I will move it to the Off-topic forum.
    Thanks,

  • Aperture Unable to handle large scans

    I am feeling quite frustrated with Apple over this program, Aperture. I have many large scans of my negatives, between 230 MB and 500 MB. When I bought Aperture I was assured by the Apple representative that I would be able to work with these images without a problem. It turns out that Aperture is unable to open any of them, and is unable to let me open these files with an external editor such as Adobe Photoshop.
    Has anyone else had this problem? Does anyone have any advice on how to solve this? Thanks.
    G5 imac   Mac OS X (10.4.3)  

    Given that it's a CMYK scan, I think that's your whole problem. Aperture really cannot handle that well.
    The way to tell Aperture not to make a copy on external edit is a little complex right now, but here goes...
    Go into the Aperture library with Finder - right click on the library itself, and select "Show package contents".
    You'll see your projects in here as files. Do the same thing (show package contents) to the project your file is in.
    Now you'll see some files and some import directories. Find the import directory that has a directory with the name of the master file and go in there, find a file ending with the extension ".apfile" that should start with the name of your master.
    Double click on that, and "Property List Editor" opens. Open the "Root" triangle and then look for an option called "isExternallyEditable". Click on the "No" and you can change it to "Yes". Save and exit.
    You are not quite done yet though! Now, re-open Aperture and have it rebuild the library - hold down "Alt-Cmd" when you click on Aperture in the dock and say "yes I want to rebuild my library". It will rebuild the whole thing.
    Now if you go to your image and say "Open in external editor" it will not create a new version. However any use of the adjustments in Aperture itself (like color or cropping or rotation) WILL create a new copied version, where they would not have before. That is because Aperture refuses to permanently store Aperture changes in any kind of master file so it has to make a new version to hold them in order to keep your master editable.
    If you want an easier approach consider this - if you want to Photoshop an Aperture TIFF version without Aperture making a new copy, you can also go into the library as described above, and simply open the stored master file directly in Photoshop, saving changes when you are done. This gives you the best of both worlds as you can adjust color or do crops and rotation in Aperture where you get that for "free" (no copies taking up space), while you can do any substantial clone or other more complex work in Photoshop to spruce up the master, as it were. You may have to restart Aperture to see the change take effect once you have saved over the old master, but then all versions you have created based on that master instantly hold Photoshop alterations. If you find yourself doing this often you can create a shortcut for that import group or other folders in a project either in the finder sidebar or the desktop. Or, you can also create a smart folder that holds all TIFF's in a project to work on directly.
    I have to import 16-bit TIFF's into Aperture today because my RAW format is not supported, and I sometimes take this route if an image needs clone work the Spot/Patch tool cannot handle (or is too cumbersome to perform as you can actually do very complex clone work in Aperture if you are patient and careful).

  • Airport Extreme with Ubee D3.0 - cannot download large files.

    Hello all Macoids!
    Here is my issue...
    I am on Charter Cable Network using Charter's provided modem, a Ubee D3.0. It is connected to an AirPort Extreme "main base" via Ethernet cable. That AirPort Extreme that I call "main base" distributes the Wi-Fi throughout the apartment to other AirPort devices. And it all works.
    However...
    I cannot download large files over Wi-Fi using the iMac, or even while connected to the AirPort Extreme main base via Ethernet cable on my MacBook. What is interesting is that inside iTunes I can download large files with TV show episodes just fine, but the update for an iPhone or an iPad will stop after 10 or 100 megs with the error message "Unknown Error 9006". Interestingly enough, if I drag the Download window in iTunes around while it is downloading the 1.4 GB iOS update file for the iPhone 5, it will complete fine. So it looks like the AirPort main base (or whichever Mac is used) loses the connection to the download server or the Ubee D3.0 modem unless I constantly "renew" it by dragging the Download window around.
    That is quite annoying to do for 15-20 minutes… The same happens on the MacBook Pro that is actually connected to the AirPort Extreme main base via Ethernet cable...
    Macbook running OS 10.9
    iMac running OS 10.8.5
    Same thing.
    I would've called Charter, but knowing well that they usually have no idea what is up, I thought to ask here if anyone has the same problem….?
    Any smart suggestions not based on experience?
    Anything appreciated very much!

    I ended up reformatting with HFS and the problem was solved, sort of.  The AEBS can now handle large files.  But that allowed me to expose yet another serious firmware bug (version 7.6.1), namely that if you use "account"-style fileshares, the fileshares are no longer reliably accessible, and frequent rebooting of the AEBS is needed to bring them back.  A quick test for whether this has happened is to attempt, at ten-minute intervals, to create a file on a read-write share.  You'll find it can work for up to a few hours, but at some point it will fail.  This makes the AEBS essentially unusable for account-based fileshares.
    With firmware 7.5, I'd noticed a variation of this problem, which was that editing the fileshare permissions on the AEBS sometimes resulted in corruption of the fileshare rights.  When this happened, you needed to reinitialize the AEBS completely.  So I hoped that in 7.6 they had fixed the problems.  They fixed that one but added the new one.
    For now, the workaround seems to be using a device-based password for the fileshares, and forgetting about account-based shares.  The huge problem presented by this approach is that all users have full access, so I await Apple's next attempt at stable firmware with great anticipation.  If only they had a beta test program, other than their users, we would not be in this near-constant state of working around serious bugs.
