Excel Import - No good for large-ish datasets?

Hi,
I'm trying to import some Excel data into an Oracle table...
The data is very simple - varchar2(9), varchar2(9), varchar2(11).
There are 40k rows in the spreadsheet with three columns corresponding to the above.
When I try to import from the spreadsheet (after mapping etc.), SD just seems to go to sleep. There's no indication that anything's happening and I can't cancel the import. In fact, I can't do anything - I have to use Task Manager to kill SD.
When I import a tiny number of rows (say 10) from a subset of the same spreadsheet, the import works fine.
I've tried to import 10k rows but SD goes to sleep again.
Has anybody managed to import anything other than piddling little datasets?
If so, what's the secret?
Thanking you...
(SD 1.1.2.25.79 - Oracle 10.2.0.1 - WinXP - Excel 2000)

I have experienced this same issue. I have 150k records (so I obviously needed to break the file up into three chunks to fit under the 65,536-row limit in Excel) that I needed to load into a table. SQL Developer seemed to hang after I tried loading the first chunk (about 55,000 records), and then it returned from the "busy" state without giving an error message or a confirmation message.
I then checked my table to see if any rows were loaded, but there were 0 records in it. The import had failed. I tried it again with a small sample file (about 100 records) and that worked fine. It only failed with the 55,000 record file. This was very frustrating.
Luckily we were able to use SAS (a separate analytics tool, not an Oracle product) to import the file.
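If the wizard keeps hanging, one workaround is to save the sheet as CSV and load it from a script in batches instead of in one shot. A minimal sketch in Python: the cursor would come from an Oracle driver such as python-oracledb, and the table name, column names, and batch size here are all made-up placeholders.

```python
# Sketch: batch-insert a 3-column CSV extract instead of using the GUI import.
# Table/column names and the batch size are assumptions, not from the thread.
import csv

def chunked(rows, size):
    """Yield successive batches of `size` rows from any iterable."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def load_csv(path, cursor, batch_size=5000):
    """Bulk-insert a 3-column CSV via executemany, one batch at a time."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        for batch in chunked(reader, batch_size):
            cursor.executemany(
                "INSERT INTO my_table (col_a, col_b, col_c) VALUES (:1, :2, :3)",
                batch,
            )
    cursor.connection.commit()
```

Batching keeps memory bounded and gives you progress feedback, which is exactly what the GUI import wasn't providing.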

Similar Messages

  • How can you tell photo quality is good for large photo books

I want to make a large photo book (13 x 10 inches) and I have a good-quality camera, but how do I tell whether the photos are good enough to blow up to that size?

    With a bit of arithmetic. Check the pixel size of your photos.
    To print a good quality book you will need at least 200 pixels per inch to get the necessary dpi in the printout.
So your photos should have a pixel size of at least 13 x 200 by 10 x 200 = 2600 x 2000 pixels.
    Check the Info panel for the pixel sizes of your selected photos.
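That arithmetic is easy to script. A quick sketch, assuming the 200 pixels-per-inch rule of thumb given above:

```python
def required_pixels(width_in, height_in, ppi=200):
    """Minimum pixel dimensions for a print of the given size in inches."""
    return (width_in * ppi, height_in * ppi)

def good_enough(photo_px, page_in, ppi=200):
    """True if a photo (w, h) in pixels can cover the page at the given ppi."""
    need_w, need_h = required_pixels(*page_in, ppi=ppi)
    return photo_px[0] >= need_w and photo_px[1] >= need_h

# A 13 x 10 inch page needs at least 2600 x 2000 pixels at 200 ppi.
```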

  • Excel shows an error for large data dumps

    Hi,
I have a report which displays 1,000,000 records. I created an agent and ran the report; the data is loaded into Excel, but when we open the Excel file it throws an error.
At minimum I need to display 500,000 records.
The error is: "A DDE error has occurred, and a description of the error cannot be displayed because it is too long. If the file name or path is long, try renaming the file or copying it into a different folder."
    Thanks,

    Hi,
You can filter the data at the report level or the dashboard level using prompts.
Example:
If you have a Period table, filter the report for year = 2012 only,
or
if you have a dashboard prompt, filter there.
Agents also have condition rules that apply when downloading the report.
Check this: makshu.blogspot.com/increase-row-limits-in-table-properties.html
    Regards
    VG
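The advice above amounts to shrinking the result set before it ever reaches Excel. A toy sketch of the same idea in Python; the record layout and field names are made up for illustration:

```python
# Filter records down to one year before export, keeping the result
# under Excel's row limit. Field names here are hypothetical.
records = [
    {"year": 2011, "amount": 10},
    {"year": 2012, "amount": 20},
    {"year": 2012, "amount": 30},
]

EXCEL_ROW_LIMIT = 1_048_576  # .xlsx limit (65,536 for legacy .xls)

filtered = [r for r in records if r["year"] == 2012]
assert len(filtered) <= EXCEL_ROW_LIMIT
```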

  • Combining Text in Automator for Excel import.

    I'm attempting to create an automator workflow that will string some text together so that it can be imported into an Excel worksheet. Here's my workflow so far.
    • New Text File
    Creates file temp.txt
    • Run AppleScript
    Opens file temp.txt
    • Get Specified Text (specified text below)
    Read Today
    Read Tomorrow
    Read This Week
    Read This Month
    • Filter Paragraphs (needed so the text is not treated as one big line of text and is displayed in the next step correctly)
    Return Paragraphs that are not empty
    • Choose from List (all 4 items displayed correctly, all selected by default)
    Select one item
    • Set Contents of TextEdit Document (temp.txt) (append)
    • Get Specified Text (specified text below (a semi-colon))
    • Set Contents of TextEdit Document (temp.txt) (append)
    • Get Specified Text (specified text below)
    keywords go here
    • Set Contents of TextEdit Document (temp.txt) (append)
    • Get Contents of TextEdit Document (results below)
    Read Today
    keywords go here"
    (Once I get this part working, I will add a URL onto it also).
So, all my content has arrived here, but on three different lines. Because it is on three different lines, it does not import into Excel correctly. I can't concatenate this using the usual AppleScript method because I would have to declare these items as variables. I've tried various ideas, but I cannot get the three items above to append onto one line, like this…
    Read Today;keywords go here
    without manual intervention, which sort of defeats the reason for trying to automate this.
    I hope I've explained this well enough that it makes sense.
    Any help or ideas are appreciated.

Well, it is good to know that it can be done in AppleScript, though I have not found a way to accomplish this. All the examples I have seen (from Googling and the AppleScript 1-2-3 book) use set statements with static strings and then concatenate those statements together. Since my values are dynamic, those examples don't help.
    What I have to work with are these lines of text in a new document.
    Read Today
    keywords go here
    The text will vary from run to run.
    What I need to end with is… (for use in Excel, import text files action)
    Read Today;keywords go here (line1 & line2 & line3)
    So, I don't know how to proceed from here, unless I can do something like set var1 to line1 of temp.txt, set var2 to line2 of temp.txt, etc.
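Outside Automator, the concatenation itself is trivial. A sketch in Python, assuming the fragments arrive one per line as described above:

```python
def join_fields(text, sep=";"):
    """Collapse the non-empty lines of `text` into one delimited record."""
    lines = [line.strip() for line in text.splitlines()]
    return sep.join(line for line in lines if line)

record = join_fields("Read Today\nkeywords go here\n")
# record is now "Read Today;keywords go here"
```

The same one-liner idea (read lines, drop empties, join with a delimiter) could also be done in a Run Shell Script action inside the workflow.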

  • Excel Importer for Visual Studio LightSwitch 2013

    Has anyone managed to get the 'Excel Importer for Visual Studio LightSwitch' extension working with Visual Studio 2013?

    Hi Kevin,
    I assume you mean this extension here -
    http://code.msdn.microsoft.com/silverlight/Excel-Importer-for-Visual-61dd4a90
From what I can tell, that extension does not yet support VS 2013. The source code is, of course, available, so you could attempt to fix it up yourself and tell us the result.
    Thanks.

  • AED not calculated properly for imported capital Goods

    Dear All,
I am facing an error while capturing the excise invoice for imported capital goods. The AED set-off is reflecting 100%, when it is supposed to be 50%. For raw materials the calculation is correct. Any solutions?
    Thanks.

    Hi
    check following link
    [http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/808c60ca-013b-2c10-34a2-94d1eb442e6f?QuickLink=index&overridelayout=true]
    Regards
    kailas ugale

  • ADC Caluculating 100% for Import Capital Good in j1iex part2

    Dear Gurus
    Hello everyone
I am now facing a problem when posting the excise invoice for an imported capital good: BED, ECess, and SECess are picked up at 50%, but AED is picked up at 100% of the value in the Part II entries.
This is correct per standard SAP Note 930756, but my client's requirement is to pick up only 50% for AED as well at the time of the Part II entries.
    Please help me

Hi,
It's not possible in standard SAP, since it is legally allowed to take 100% credit on ADC for capital goods in the current year itself.
It will also reflect in the J2I8 report or post to the ON HOLD account, since the code corrections in SAP Note 930756 are already implemented in your system.
    Thanks & Regards,

  • How to import matching RAWs for a large library?

    I've been using iPhoto for many years to manage JPEGs while just keeping RAWs on the side on a large disk. Since Aperture allows me to reference those RAWs without storing them locally, I'd like to do so. The only problem is, it doesn't seem to be possible to import matching RAWs for more than one project at a time. Since I have 50+ projects (maybe 100+ – I haven't counted) and enough RAWs that it takes Aperture a few minutes to load them all and find the matching ones for each project, it's really impractical to do this one project at a time. Surely there's a better way.

    You may have some library corruption but that's hard to prove.
    I would also consider hardware issues including the router and hard drives.
Where is your library content stored? Internal drive (Mac? PC?), external drive, or network storage?
    200GB isn't huge - I don't think it's the size that matters here.
    AC

  • MP3:  How large can import file be for 4 GB Nano?

I am having trouble importing one particular MP3 file (74,000 KB) into my 4 GB Nano, which is only filled to 1.3 GB. Is the file too big? It's an audiobook.

    The thread title "How large can import file be for 4 GB Nano?" is something of a give-away, no?

  • Can anyone answer this question? What is used to format a string so that it falls into a certain row/column in Excel? See text for in-depth question.

    I'm attaching a file that may help. Just remember this is my FIRST attempt at using NI/FieldPoint, so things that would be obvious to normal users more than likely would slide right by me. I'll try to explain what the text contains to make my problem clearer.
    I'm using an FP-1000, an FP-AI-100, and an FP-TB-10. I'm creating an application where I use the 8 channels from the FP-AI-100 to monitor a control system that we manufacture. The control system is in an environmental chamber, and I use the FP-TB-10 to monitor the temperature while it is in the chamber. The control system is run for 4 hours under varying environmental conditions while I monitor vital system voltages with the FP-AI-100.
    The application I'm building starts with the obvious FP Create.vi and so forth; I have no problem communicating with the FieldPoints, but the information I'm monitoring needs to be placed into a file with headers describing the information below them. I use a While Loop to monitor the FP-AI-100 and FP-TB-10 for the 4-hour period, but before that I create and open a file where I use Concatenate Strings.vi to enter the headers I need. I do this with constants, controls, tabs, and carriage returns entered into Concatenate Strings.vi in the order I want them to appear. I use row 1 to enter "OPERATOR:" (with a constant) and a control (so the operator may enter his name and have it recorded into the file), using a tab to separate the columns and a carriage return to drop down to row 2 for the next headers, where I use header names and tabs to separate the columns. This part works great: I end up with row 1 holding the operator information and row 2 holding the headers for all the channels I'm monitoring, each in its own column.
    The trouble happens when I write to this file during the While Loop. The information recorded for DATE:, TIME:, and channel 0 of the FP-AI-100 ends up right under its corresponding header on row 3, but the rest of the data ends up on row 4 in column 3 and goes down many rows, with some of the data not being stored at all. It varies how many rows are used starting at row 4, but they always stay in column 3. I use FP Read.vi, outputting to Array To Spreadsheet String.vi, outputting to Concatenate Strings.vi for each channel within the While Loop, and then into Write File.vi. I use Concatenate Strings.vi to place my data, or at least I thought you could do that somehow. Either I'm not quite doing something right, or there is a sampling/timing issue, or a write-to-file issue where things are being confused in the While Loop. If someone knows a better route to accomplish what I'm attempting here, I would be interested. It's not like I'm not giving it the old college try, but without proper training, and given the vagueness of the manuals, it's difficult to understand what every connection actually does. I'm trying, though, for everyone who happens to feel pity for me.
    Thanks for your help
    John Morris
    Glendinning Marine Product, Inc.
    Attachments:
    ALLchannels ‏273 KB

    > I appreciate your effort to help me, but there is still a slight cloud
    > in front of my eyes here.(I must mention that I'm using Labview 5.1
    Oh, too bad. I'm using LV 6.0.2 and tried to save the application as LV 5.1,
    but it didn't work.
    > Lets start with the easy one which is the SECOND main thing you
    > wanted to mention. In my application I used individual Create Tag.vi's
    > "so your saying that I can use just one Create Tag.vi and one Read.vi
    > and what ever I use to display the values will automatically to show
    > the individual channels" in other words if I use a "Indicator(DBL)"
    > comming out of the Read.vi what ever I use in the "PANEL" layout will
    > expand to show all 8 channel if I was using a FP-AI-100? Cool...
    Almost right. The 8 channels come out as a 1D array, ch0...ch7.
    > #1--How do I change the delimiter (TAB) to a delimiter (comma) in a
    > Array to Spreadsheet String.vi?
    > ...cut
    I don't use this Spreadsheet VI, because files coming out of it always
    start that time-consuming Excel wizard. Because of this, I programmed my
    own CSV-conversion VI. Maybe you could just use a common text editor, like
    WordPad, put the CSV examples from my last reply into a text file,
    and rename it to *.csv. Text editors terminate lines with "\r\n" by
    default, so this is a very quick way of testing.
    > FP_Analog_Logging example to record all eight channels it uses a Array
    I didn't find the VI you mention above, but there is another good one:
    look at examples\FieldPoint\DataLogging\FP Logger.vi.
    In a little case structure, at "false", there is a function called "Format
    Into String".
    Pop up on the format string and enable >>'\' Codes Display<<.
    Then change the format string to: %.;%-f%s
    A tab string is connected to the lower input of this function. Replace
    it with a comma.
    That should do it.
    >
    > #2--If using just one Create Tag.vi and one Read.vi and I have the
    > Item Name listed as ALL I take it that the information comming out of
    > the Read.vi is data for each channel in a String format starting with
    > channel 0 and ending with channel 7 for the FP-AI-100.
    Yes, just as explained above. If you put an indicator at the output to
    display the values, you can expand the display to show all 8 channels, but
    you can't name individual cells. In an array, all cells have identically
    named labels. If you want to name them individually, you have to attach the
    Array To Cluster function and put the indicator after it. Enable
    label display and name the output values individually.
    > #3--Now I use the following to "Format String" in the Array to
    > Spreadsheet string.vi (%.4f)but I don't see anyway of changing the
    > delimiter from (TAB) to (COMA)
    Explained above.
    > #4 you stated text strings need a "as prefix and sufix" and each
    > string is seperated by a coma, a period is used as decimal number
    > separator and lines have to be terminated by my question is
    > WHERE IS THIS ACCOMPLISHED? WHAT VI OR WHAT CONNECTOR PIN?
    Well, modifying strings is done with string functions.
    Above, I explained how the sample FP Logger.vi can be modified to do
    the job for numbers. The prefixing and suffixing with " you only need for
    strings that Excel should interpret as a string and nothing else as a
    string, even if there's a number inside. You usually need this only for
    headers. So it's easiest if you just write your string into a string
    constant or control and concatenate it to the previous CSV-file contents.
    Oh, I just noticed by looking at the above-mentioned example: inside the
    case structure, in the "true" case, there are error messages
    concatenated to the logfile. At this point it is important to use " when
    integrating the message into the CSV file, because an error message usually
    looks like this: >> ERROR 2345 in vi yxz <<
    Here you have text and number strings mixed in one line. Excel does not
    know whether it should interpret the number inside the error line as a
    separate number, splitting the line into three columns, i.e. the string
    before the number, the number, and the string after the number. So tell
    Excel by putting a " before and after the line.
    For this, expand the Format Into String function by one input, move down all
    the connections so you can insert a string constant with a " into the first
    argument connector. Change the contents of the string constant at the
    bottom argument from a tab to ", (quotation mark AND comma).
    > #5 You gave me an example of what a 3 column header could look like:
    > "col0","col1","col2"\r\n Now is this something that you enter
    > somewhere cause I know that \r is carriage return and \n is newline so
    > I take it that the above is entered somewhere maybe in the Write.vi to
    > the connector called Header (F)? See this is what confuses me because
    > NI manuals have no examples of certain connectors types being used or
    > any reference as to how they manipluate data with there varing type
    > inputs. Or maybe I'm just missing them.
    The example I mentioned above helps here too.
    Look at the Write File function. It receives data from a function named
    Concatenate Strings. Expand that function to get one more
    spare input at the bottom. Create a string constant. Switch the constant
    display to '\' mode. Enter "col0","col1","col2"\r\n into the constant.
    Connect the constant to the spare input.
    Doing this, every data line in the CSV file is followed by
    "col0","col1","col2"\r\n .
    I wish you a nice weekend,
    Rainer Ehrt
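    Outside LabVIEW, the same CSV rules discussed above (comma delimiter, quoted text fields, \r\n line terminators) can be reproduced with Python's csv module. A rough sketch, with a made-up mixed row for illustration:

    ```python
    # Write CSV the way Excel likes it: commas between fields, quotes around
    # text fields (so mixed text/number lines stay in one cell), \r\n endings.
    import csv
    import io

    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=",", quoting=csv.QUOTE_NONNUMERIC,
                        lineterminator="\r\n")
    writer.writerow(["col0", "col1", "col2"])               # quoted header strings
    writer.writerow([1.2345, 2.5, "ERROR 2345 in vi yxz"])  # numbers stay unquoted

    print(buf.getvalue())
    ```

    QUOTE_NONNUMERIC does mechanically what the reply describes by hand: every non-numeric field gets wrapped in quotation marks, so Excel never splits an error message across columns.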

  • UCMDB Excel import - update/delete relationship

    Hi, is there any solution to update or delete a relationship between objects via Excel import? I created one (Excel import), but the relationship is the wrong way round, so I want to change/update or delete all the wrong ones and create the good ones. Anyone have any ideas?

    Hi,
    Do you mind telling us which Excel version you are using? Which file are you using as the data source: Excel, Access, or something else? I tested in my environment, but I can't reproduce your issue.
    Would you like to share a sample data source through OneDrive, if possible?
    On the other hand, I recommend we follow this link to re-create the connection to test:
    http://exceluser.com/formulas/msquery-excel-relational-data.htm
    Please Note: Since the web site is not hosted by Microsoft, the link may change without notice. Microsoft does not guarantee the accuracy of this information.
    Regards,
    George Zhao
    TechNet Community Support

  • Maximum import file size for stills?

    I am getting an error importing stills that are large, about 6000 pixels wide. I wanted to zoom into the image in my 1920x1200 video. I couldn't find info on the maximum image size. Anyone know?
    thanks...

    Studio3D,
    I do a fair amount of work with stills in Premiere. I always plan the size necessary for the pans and re-size to exactly what is required, based on the Project's image dimensions. As I do not do any "random pans and zooms," (each is choreographed tightly) I do all of the re-sizing work in PS, before Import. However, with your hi-rez Project, you might well be pushing the limiting size in Premiere. You may have to step back, just a little, and tighten your pans to accommodate these limits.
    Good luck,
    Hunt
    PS I like the re-sizing algorithms in PS better than any I have worked with in PrP. Have not experimented with CS4, so I might change my tune, should they have changed their algorithms.

  • Keeping number of files in check is good for your Mac

    I just did what I thought was impossible - reduced the space on my hard drive from 2.6GB to 11GB! The Mac is flying again, hard drive is hardly audible, I am not missing anything useful, all the projects with roughs from stage 1 through 40 have been consolidated into a few examples, copies and back-ups of copies for "just in case" during a project have gone, and gems of ideas have been organised properly into present and future projects.
    I may sound like a totally disorganised worker, but not really: I can quickly find what I need, and I didn't really come across anything I had no recollection of, but I just fell into the trap of "whatever the size of your bin, the rubbish will accumulate accordingly".
    So now I have 28GB of pure hard drive data to clone to a wiped external drive and I am looking forward to being able to back up incrementally with quality files instead of dross for that rainy day.
    It makes me wonder how much useless data other people are struggling along with, and whether the upcoming trend for unlimited storage with Google and Amazon S3 services is really needed.

    lfdc:
    Good for you. Getting rid of un-needed material is an excellent way to free up disk capacity. And backing up to an external FW HDD, or archiving on CD or DVD, conserves more space. The graphical interface permits quite sophisticated filing systems, so that even if one has a lot of data it can be organized in a way that is easily accessed. For example, I have one major folder called DATA, in which I have nested folders, one of which, e.g., is Financial. In this I have other folders such as Taxes, Investments, etc. The Investments folder is further broken into the brokerage firms, and each of those into individual accounts, and each account with Bought and Sold for transaction statements. In this way, even with large masses of data, it is easily accessible if you know exactly where it should be. Of course, every now and then something gets saved in the wrong place, and that breaks the monotony of trying to find it.
    Incidentally, in your first sentence you said,"reduced the space on my hard drive from 2.6GB to 11GB!" I think you meant it the other way around (I hope).
    Good luck.
    cornelius

  • Problems editing large-ish avi

    I have an AVI file that is 43.2 MB. It plays in Windows Media Player absolutely fine. However, when it is imported into Premiere it hardly plays back at all: the first frame is 'frozen' and the sound jumps about. I have tried exporting the un-edited file as an AVI to test whether it is just the playback, and the exported file plays exactly the same in Windows Media Player as it does in Premiere (i.e., broken sound, frozen frame). I have come to the conclusion that the problem lies in Premiere rather than the file, seeing as the original file plays fine in WMP, but I don't want to jump to conclusions.
    Any ideas?
    I have also tried cutting a 7-second segment out of the original AVI in Premiere and playing/exporting that as an AVI, only to have the same problems. I have also tried exporting it as a 4 MB FLV file, only to get the same playback.
    Thanks
    Anna

    Large-ish avi:
    A duplication house was kind enough to convert a Digi-Beta tape to file format for us. Two problems, though: they used Wee-Willy-Winkie's super freeware capture program and provided us with two 180 GB AVIs compressed with the FRWU Darim Vision Forward codec.
    Thanks to GSpot, I was able to determine the codec and then find it available for download. Unfortunately, the codec wouldn't install on our 64 bit Vista system. I was able to get it to work on our 32 bit system and was finally able to see video on the timeline.
    This all happened because the dup house didn't follow my instructions which cost me about 4 extra hours of troubleshooting and workarounds.
    Premiere CS4 opened the 180 GB file without issue. If we weren't in a time crunch, I would make them do it over.
    Basically my file is about 4200 times larger than your large-ish file.
