Xcelsius messes up links when importing Excel source document

I am creating a dashboard using an Excel doc as my master source.  Because the dashboard data is all over my network, I have a master Excel doc whose cell values are links to cells in three other Excel spreadsheets on my network.  When I import the spreadsheet into Xcelsius, it says the links are wrong.  I click the "Change Source" button and notice the links point to my hard drive (c:\agency\agency production\2008 production.xls) instead of where they should point (o:\agency\agency production\2008 production.xls).  The master Excel spreadsheet has all the links correctly pointing to our network drives.
I've tried:
* recreating the master spreadsheet
* breaking the links and recreating them
* using FQDN-based paths for the links instead of drive letters (the two formats are illustrated just below)
I always get the same error.
Any help is appreciated.
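For reference, a drive-letter link and the equivalent UNC-style link look roughly like this in a cell formula; \\server\share is a placeholder for whatever O: actually maps to, and Sheet1/A1 are only illustrative:
        ='O:\agency\agency production\[2008 production.xls]Sheet1'!A1
        ='\\server\share\agency production\[2008 production.xls]Sheet1'!A1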

Hi Chris,
I'm actually trying to get one master Excel spreadsheet to link to several smaller source files. I've created all the references to the E:\ drive on our server, but when I import the model into Xcelsius the charts are all blank. This leads me to believe that Xcelsius can't work with linked Excel spreadsheets. Were you ever able to get this to work?
Thanks!
Tiffany

Similar Messages

  • Java CORBA ID CS5 7.0: problem with table formatting when importing an Excel 2007 document

    hello,
    I want to import an Excel 2007 document containing a table into a text frame. This is the code I use to set the import preferences:
         // fetch the application-level Excel import preferences
         ExcelImportPreference pref = aplicacion.getExcelImportPreferences();
         // request that the Excel table formatting be preserved on import
         pref.setTableFormatting(kTableFormattingOptionsExcelFormattedTable.value);
    but when I place the file into the text frame and export to a PDF I lose the table formatting; it always applies kTableFormattingOptionsExcelUnformattedTabbedText.value instead.
    This problem only happens with Excel 2007; with Excel 2003 files the import is fine.

    Apart from iTunes (which I rarely update since we never use it) they both had a Java update, which I'm installing now but I can't imagine that's connected (plus they both didn't have it). Both Macs are 10.6.8.
    I also repaired the permissions and cleaned caches on Friday. Today I created a new user/profile and tried the same thing on that. No luck.

  • Component to URL link a word/excel/pdf document inside Page

    Hello Experts,
    I would really appreciate it if somebody could tell me whether there is any standard component or functionality available to call a network URL link to a Word/Excel/PDF document inside an ABOUT tab page, for the definition and How-to screenshots of the dashboard.
    I am trying to avoid creating a custom component that uses the SAP UI5 HTML control to initialize an OLE object call to a given URL.
    Thanks
    Arun

    Well, Word and Excel documents are binary, so you will need a Binary LOB (BLOB).
    I have recently been storing/retrieving BLOBs through ASP/ADO . . .
    BLOBs are limited to 4GB, but ADO has a limit of 2GB. Hopefully, that will not be a problem.
    If you are using ODBC, then be aware that you need to enable LOBs by setting LOB=T in the connection string. If you are using ODBC with stored procedures, then there is a bug that causes the LOBs to be truncated to 32K, which is fixed in 9.2.0.6.5. A colleague tells me that the Microsoft ODBC Driver for Oracle works with LONGs but not with LOBs.
    If you are using OleDB with stored procedures, then you'll need to set the "SPPrmsLOB" property to TRUE.
    When uploading BLOBs through ADO, you need to set the parameter size to 1 higher than the actual size, otherwise you get an error.
    When retrieving the BLOBs through stored procedures as OUT parameters, you need to set the parameter size to larger than the BLOB, otherwise it gets truncated. I do not like having to specify 2GB for a file that might only be 32K, so I prefer to return a record set via a REF CURSOR.
    Tak
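    To make the ODBC note above concrete: the LOB switch simply rides along in the connection string. The DSN name and credentials below are placeholders, not values from this thread:
         DSN=OracleDSN;UID=scott;PWD=tiger;LOB=T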

  • Formats not retained when importing into 'other' documents

    When I import a Word document, a PDF, or any other document that I want to store in one of the categories (e.g. Research or Other), the document's formatting and layout are completely lost.
    Why is this?
    Are there any fixes for this?
    If not, is it on the 'road map' and when?

    Took suggestion and tried pasting.
    No, doesn't retain all formatting. Paragraphs and bullets are OK, but Tables and Indenting and Font size and type are lost. I expect any shading and anything else like that wouldn't come across either. A virtual plain text translation is what you end up with.
    Makes it difficult to insert scanned or pdf docs that can be part of your research for example.
    OK... let us know when formatting and layout do get on the roadmap.
    Thanks Sunny

  • Xcelsius Causes core dump when importing a workbook

    When I try to import a certain workbook, a message comes up saying that the server is busy and asking me to switch to the application. Switching to the application does nothing, and the only way to get out of this is to kill Xcelsius, which then causes a blue screen and core dump. Does anyone know how to address this problem?

    Hi there,
    I have posted a suggestion via this link [Server Busy issue with Xcelsius|Re: Server Busy], or you can refer to my following steps:
    1. Open the Windows Task Manager.
    2. Go to the Applications tab, look for the task Microsoft Excel - Compatibility Check and double-click it. This will bring you to the small Excel window that alerts you about the compatibility check of the Excel file.
    3. Click the Continue button to proceed with the import.
    4. Go back to Xcelsius and click the Retry button to import the Excel sheet.
    With that, the issue is solved.
    Hopefully these tips will be useful for you.
    Cheers,
    Danny Pham

  • Losing data when importing excel into an array

    Hi,
    I have been writing some code that imports data from an excel .xlsx worksheet file containing 890 columns and 150 rows.
    When I set the VI to import 702 columns everything works fine.  When I try to import 703 columns I get no data in my array.  The VI continues to run without any errors; it just gives me no results because there is no data in the initial array.
    Can anyone tell me if there is a setting somewhere I'm missing, or explain why this is happening? I have read that there should be no limit to the number of columns as long as you have fewer than 2^32 elements, which I am well below.  I am using LabVIEW 2009 and a PC with Windows 7 64-bit and 4 GB RAM, so there should be no problem with PC resources.
    Thanks,
    Ray.

    Hi Ray,
    702 is the limit when you use just 2 chars for the column name: "A" to "ZZ" allows for 702 columns.
    Probably a subVI somewhere deep in your Excel loading routine is limited to that naming scheme (or is only aware of older Excel versions)...
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
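    A quick way to sanity-check GerdW's 702 figure (a throwaway JavaScript sketch, not LabVIEW code; the function name is made up for illustration):
         // convert an Excel-style column name ("A", "ZZ", "AAA", ...) to its 1-based index
         function columnIndex(name) {
             var n = 0;
             for (var i = 0; i < name.length; i++) {
                 n = n * 26 + (name.charCodeAt(i) - 64);   // 'A' maps to 1 ... 'Z' maps to 26
             }
             return n;
         }
         columnIndex("ZZ");   // 702, the last column a two-letter naming scheme can reach
         columnIndex("AAA");  // 703, the first column that needs three letters
    So a routine that only generates two-letter column names runs out exactly at 702, which matches the failure Ray sees at column 703.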

  • Disappearing links when importing snippet...

    OK, so I have two snippets, both contain a Rectangle with an Image child.
    In one of them, that is all I have.
    In the other, the Rectangle is part of a group with another item.
    After importing the snippet, I relink to fix the image paths.
    On desktop, this works perfectly for both.
    For IDS server;
    1. The single, ungrouped, snippet gives me programmatic access to the link but then tells me the link doesn't exist after it has been relinked, so attempting to do link.update(); throws an error
    2. The grouped snippet works perfectly as per desktop
    So...
    1. Does anybody have any workarounds?
    2. Where can I report this bug?
    G

    This issue is fixed in 8.0.2 update.
    If anyone else encounters it, please update your server.
    G

  • The Source Monitor isn't showing the video, and crashes when importing to Source Monitor or Timeline

    The Source Monitor has no black box (the area within which the video displays); and when I try to load a file into the Source Monitor, the screen whites out and I get this:
    (This also happens when trying to load something directly into the timeline.)
    Problem signature:  
              Problem Event Name: APPCRASH  
              Application Name: Adobe Premiere Pro.exe  
              Application Version: 5.5.0.0  
              Application Timestamp: 4d8a89e8  
              Fault Module Name: Display.dll  
              Fault Module Version: 5.5.0.0  
              Fault Module Timestamp: 4d8a796f  
              Exception Code: c0000005  
              Exception Offset: 0000000000039e5c  
              OS Version: 6.1.7601.2.1.0.256.48  
              Locale ID: 1033  
              Additional Information 1: 393d  
              Additional Information 2: 393d5f134e9c66c475640b56bd3bc171  
              Additional Information 3: eb55  
              Additional Information 4: eb55406990add3f93298f7ceb7e12c9a

    You need to set the source patching:

  • Audio Tracks are not linked when imported

    I know I recorded in stereo, so I don't understand why the audio pair I imported is not linked (2 mono tracks) in the timeline. Does anyone know how to fix this?
    Adam

    There is no panning in stereo pairs. That's the way it works. You can't have separately panned tracks and still have them be stereo pairs. FCP works the same in this regard. If you captured in FCP as a stereo pair, the panning would be as you see it in FCE.

  • Form Wizard, Excel Sourced Documents, Dozens of Extra Fields

    I have a series of PDF documents which I generate on a weekly basis from MS Excel reports.
    It looks something like this:
    Columns: Description, Jan Sales, Feb Sales, Mar Sales, Apr Sales, May Sales, Order 1, Order 2, Order 3, Order 4
    Vendor 1
      Widget 1: 2, 5, 5, 4
      Widget 2: 4, 7, 2, 8, 7
    Vendor 2
      Widget A: 2, 9, 3, 8, 2
    When I run this through Form Wizard, I get a form field in every blank cell.  I'd really like form fields limited to the Order 1 column.  Is there anything I can do to trick the Form Wizard to limit its attention to this area?  Perhaps by modifying the underlying Excel form before printing to Acrobat?
    Edit:
    Maybe I should mention that it's the fact that I have 200 pages/week that motivates me to look for a reduction in the Form Wizard's initial field generation.  It takes me a little over 2 minutes to delete the 100+ extra fields on each page.  Eliminating the better part of a day's repetitive labor is kind of a big deal.
    Message was edited by: jwpfw

    Here are some ideas you can try. You can delete fields with JavaScript, so you could set up a script that loops through the fields and deletes them based on some criteria, such as the position on the page or the field name that Acrobat automatically generates (a rough sketch follows below). You can set up the script so it's activated from a custom toolbar button or menu item, or as part of a batch sequence (custom action). If you'd like specific help with the code, it would help if you could post a sample document that has the fields added.
    An alternative that you can use if the layout of the document will remain unchanged is to add the fields using a script, perhaps taking advantage of templates. So instead of running the Form Wizard to add the fields, you create a script to add the fields exactly where you want, with the field names and any other properties you want.
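    A rough sketch of the delete-by-position idea, run from the JavaScript console with the form open. The x-range (400 to 470 points) is a placeholder for wherever the Order 1 column actually sits; read it off one of your own fields via this.getField(name).rect first:
         // collect the names of every field whose left edge falls outside the Order 1 column
         var keepLeft = 400, keepRight = 470;   // placeholder coordinates
         var doomed = [];
         for (var i = 0; i < this.numFields; i++) {
             var name = this.getNthFieldName(i);
             var f = this.getField(name);
             if (f.rect[0] < keepLeft || f.rect[0] > keepRight) {
                 doomed.push(name);
             }
         }
         // remove them afterwards so the field count doesn't shift mid-loop
         for (var j = 0; j < doomed.length; j++) {
             this.removeField(doomed[j]);
         }
    The same loop could key off the auto-generated field names instead of coordinates, if those turn out to be more predictable.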

  • Numbers error on import Excel (.xls) document

    Simple formulas appear to fail on import from Excel simply because the number of rows in the spreadsheet is truncated by content.
    For example: =SUM(E7:E65536) will fail in many cases because there aren't 65536 rows in the table. Shortening this maximum fixes the error, but it is time consuming because the formulas are only visible in temporary warning notices and must be retyped. I wish these formulas could be kept as errors rather than being turned into warnings.
    I've read the documentation but can't see how to bundle the built-in functions to look for all the rows without specifying a specific range. Does anyone know how to set a wildcard maximum to table rows or columns instead of a number like 65536?

    Numbers isn't designed with the 16th-century, monolithic ledger (or infinite grid of cells) in mind. Related sets of data are meant to be grouped in individual tables, with only the number of rows necessary to contain your data (blank cells can work, but should be accounted for in more complicated formulas using the "IF"-suffixed functions), and with the formulas acting on that data (with the exception of simple summary functions like the one you describe) contained in separate tables.
    In this case, your formula would be:
    =SUM(Table 1::E), if the summation formula is in a different table
    or
    =SUM(E), if it is in the same table - like in a footer row. No accounting for row count necessary.
    Where "Table 1" is the name of the table that contains the data ("Table #" being the default) and "E" the name of the column (unless you add a header row then it is the name of the column). This way rows can be added or deleted at will without disrupting any of your formulas and will always include all of the cells in any number of rows in that column. Any numbers in the header and footer rows won't be included in the summation.
    Because Numbers and Excel have fundamentally different views of how the data in spreadsheets should be arranged, straight imports from Excel or attempting to force Numbers to work exactly like Excel won't always make sense.
    Ends up sounding more complicated than it is.

  • Manually changing a document version number when importing an existing document

    Hello.
    Like a lot of people, we are starting to make use of Sharepoint Online as part of an Office 365 subscription.
    One of the things we want to do is move a bunch of version controlled documents from their current location on a traditional network file share into a Sharepoint Document library, and then use approval and version history to control and track changes.
    At the moment these are stored with the version number as part of the file name, so we have something like "Company car policy - version 6.docx". When there's a new version, the old one is moved to an archive folder and a new copy is saved as "version 7.docx", which is outdated and far from ideal.
    I've created my library, tested that approvals work, and have found instructions on how to use a workflow to notify people that changes have been made. That's all fine, but if I upload an existing document into the library it will start the version number
    at 0.1 or 1.0 depending on whether I add it as a draft or final.
    What I need to be able to do is to manually change the version number to match our existing system without having someone sit there checking a document in and out. Some of them have version numbers in the 20s so this would be a huge waste of time.
    I can't believe I'm the first person to have this problem - can anyone help me please?

    You are not the only one, that is for sure!
    The only way (that I'm aware of and that is easy to accomplish) to achieve this is the following:
    * Upload your history v1, v2, v3, v4, v5, v6. Then it is really in sync with your current / old version numbers.
    * Or just upload the new / latest file and check it out and in x times, so the version number increases with every check-in.
    I know that this sounds weird/funny and time consuming. Without third-party tooling I don't see any other solution.
    With third-party tooling you can perform those actions (basically the same actions) more quickly and easily; however, you need to get yourself comfortable with such tooling. One of the tools (no, I'm not a sales agent or whatsoever) that I know can perform metadata changes is ShareGate. However, in PowerShell you can also perform a lot of magic, and without a doubt there are a lot of migration tools (AvePoint, Tzunami, etc.) that can also "mimic" those manual steps.
    My best suggestion is:
    * If you need to keep the old versions, upload them all one by one (if it is not too time consuming).
    Otherwise
    * Create an archive for old versions and start in SharePoint with version 1, which is exactly the same as the latest non-SharePoint version, and store that archive in a separate document library/folder/site. (Only keep old versions if there is a real need / business case or legal case to do that.)
    Kind regards,
    Michiel hamers
    If you think this answer is good: please choose mark as answer.
    Michiel Hamers www.SharePointman.nl Don't hesitate to contact me for a SharePoint/O365 question.

  • Import Excel 2007 into SQL Server 2005

    Hi
    When importing Excel 2007 data into SQL Server via the Import wizard, I choose "Microsoft Office 12.0 Access Database Engine OLE DB Provider" as the data source in the SQL Server Import and Export Wizard, then click Properties, switch to the All tab, enter the Excel file path in the Data Source field and "Excel 12.0" in the Extended Properties field, and then click OK to follow the wizard.
    I get the following error:
    Error 0xc0202009: Source An OLE DB error has occurred. Error code: 0x80004005
    Error 0xc02020e8: Source Opening a rowset for failed
    Exception from HRRESULT: 0xc02020e8 (Microsoft.sqlserver.dtspipelinewrap)
    I have recently upgraded to Vista and did not have this problem on XP. Does anybody know how to fix this? I really need to get the data into SQL Server 2005 (without using SSIS).
    Thanks!

    Hi Jin
    This is not a problem in SSIS; this is a problem occurring when I try to import data via the Import/Export wizard in Management Studio.
    This is the exact error I get:
    Error 0xc0202009: Source 
    - <table_name>[1]: An OLE DB error has occurred. 
    Error code:
    0x80004005
    Error 0xc02020e8: Source 
    - <table_name>[1]: Opening a rowset for "<table_name>" failed.
    Check that the object exists in the database.
    Additional information:
    Exception from HRRESULT: 0xc02020e8 (Microsoft.sqlserver.dtspipelinewrap)
    Many thanks!!
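    For anyone stuck at the same point: an alternative that avoids both SSIS and the wizard is to query the workbook directly from T-SQL through the same Access Database Engine (ACE) provider mentioned above. This is only a sketch; the file path, sheet name and target table are placeholders, and it assumes ad hoc distributed queries may be enabled on the server:
         -- allow ad hoc OPENROWSET queries (requires sysadmin rights)
         EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
         EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;
         -- read the sheet through the ACE OLE DB provider into a new table
         SELECT *
         INTO   dbo.ImportedData
         FROM   OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                           'Excel 12.0;Database=C:\data\source.xlsx;HDR=YES',
                           'SELECT * FROM [Sheet1$]');
    Note that the 32-/64-bit edition of the installed ACE provider has to match the SQL Server service, which is a frequent source of provider errors.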

  • Robohelp 11 not Linking or Importing .docx or docm

    I just got RoboHelp 11 installed. I specifically upgraded so that I could link my many Microsoft Word 2007 and Word 2010 documents, since RoboHelp 8 will not let you link .docx files.  When I try to link these Word docs, it only shows me options for .doc and .rtf files (as shown below). I have the Microsoft Office 2010 suite installed and usable on my computer.  As the tutorials state:
    You can create a RoboHelp project by importing a Word document. To import Word files, you must have Microsoft
    Word installed on your computer.
    Note: DOCX and DOCM formats are not supported by versions earlier than Microsoft Word 2007. See Microsoft Word
    I was able to create a new project using a .docx file, but even that project once created shows this same window without the .docx and .docm options when linking or importing new Word Documents.
    Is there a setting or an update that is needed to allow me to select .docx file types for Linking and Importing?

    When you go to open a document in an existing project as in your first image your dialog is only showing DOC and RTF whereas in the second image all the required options are showing. So the question has to be what is different and one possibility is that in the first you are browsing to a network and in the second you are browsing locally.
    Try copying the document you were trying to link to / import to your local drive and then try to link. Are the options then different? If yes, I would suggest a chat with your network guys.
    Let us know how that goes.
    See www.grainge.org for RoboHelp and Authoring tips
    @petergrainge

  • Importing excel files - problem with single quote

    When importing Excel files using SQL Developer 1.5, I can't get data with single quotes (') imported.
    When I run the generated insert statement in SQL*Plus I get "ORA-01756: quoted string not properly terminated", which is different from the error that SQL Developer gives me (see below).
    Also, I have a numeric value shown without a thousands separator in the XLS file that I'm trying to load into a varchar2 field. But the insert statements have added a thousands separator, which I don't want.
    REM Error starting at line 1 in command:
    REM INSERT INTO table (ID, NAME, CODE)
    REM VALUES (2427407, 'Ed-u-care Children's Center', '73,000');
    REM Error at Command Line:2 Column:37
    REM Error report:
    REM SQL Error: ORA-00917: missing comma
    REM 00917. 00000 - "missing comma"
    REM *Cause:   
    REM *Action:
    One last thing, TOAD gives a way to automap columns chosen from XLS to the columns in the database. It sure would be nice to have this functionality in SQL Developer.
    Thanks,
    Steve
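    For reference, a hand-corrected version of the insert above: the embedded quote has to be doubled to parse, and the thousands separator is dropped to match the plain value wanted in the varchar2 column:
         INSERT INTO table (ID, NAME, CODE)
         VALUES (2427407, 'Ed-u-care Children''s Center', '73000');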

    Did you consider both to be bugs (i.e., single quote issue and thousands comma separator issue)?
    Thanks
