MaxL for Import of Data & Dimension Info Using an ODBC Source (Oracle-based)

See below for syntax issues with using the SQL Interface in MaxL to load first a dimension build, then data. (Does anyone have experience with the exact syntax for this?) Thanks.

MAXL> import database ESSBASE_APPNAME.ESSBASE_DBNAME dimensions connect as ORACLE_USER identified by ODBC_DEFINITION using rules_file LdJob;
 6 - (1) Syntax error near end of statement.
50 - MaxL compilation failed.

MAXL> import database ESSBASE_APPNAME.ESSBASE_DBNAME data connect as ORACLE_USER identified by ODBC_DEFINITION using rules_file LdTurn;
 6 - (1) Syntax error near end of statement.
50 - MaxL compilation failed.

I think I found my error. I have to say 'import database dimensions from data_file...' instead of 'import database data from data_file...'. I don't know how to delete this post, so I am not deleting it.
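For what it's worth, the SQL-interface form of these statements usually also needs the `server` keyword before `rules_file` and a terminating `on error` clause; the sketch below is one plausible corrected version (the password quoting and the error-file names are assumptions, not from the thread):

```
import database ESSBASE_APPNAME.ESSBASE_DBNAME dimensions
    connect as ORACLE_USER identified by 'password'
    using server rules_file 'LdJob'
    on error write to 'dimload.err';

import database ESSBASE_APPNAME.ESSBASE_DBNAME data
    connect as ORACLE_USER identified by 'password'
    using server rules_file 'LdTurn'
    on error write to 'dataload.err';
```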

Similar Messages

  • [svn:fx-trunk] 11585: Initial check-in of CoreTech's pixel bender kernel files for the pixel bender filters we use to mimic AIM blendModes based on Vera's ok.

    Revision: 11585
    Author:   [email protected]
    Date:     2009-11-09 13:39:53 -0800 (Mon, 09 Nov 2009)
    Log Message:
    Initial check-in of CoreTech's pixel bender kernel files for the pixel bender filters we use to mimic AIM blendModes based on Vera's ok. Added the standard Flex copyright headers to what Core Tech passed along.
    QE notes: None
    Doc notes: None
    Bugs: None
    Reviewer: Me
    Tests run: Checkintests
    Is noteworthy for integration: Yes, perhaps, to downstream teams that want to rebuild the pixel bender filters
    Added Paths:
        flex/sdk/trunk/frameworks/projects/spark/src/spark/primitives/supportClasses/shaders/Color.pbk
        flex/sdk/trunk/frameworks/projects/spark/src/spark/primitives/supportClasses/shaders/ColorBurn.pbk
        flex/sdk/trunk/frameworks/projects/spark/src/spark/primitives/supportClasses/shaders/ColorDodge.pbk
        flex/sdk/trunk/frameworks/projects/spark/src/spark/primitives/supportClasses/shaders/Exclusion.pbk
        flex/sdk/trunk/frameworks/projects/spark/src/spark/primitives/supportClasses/shaders/Hue.pbk
        flex/sdk/trunk/frameworks/projects/spark/src/spark/primitives/supportClasses/shaders/Luminosity.pbk
        flex/sdk/trunk/frameworks/projects/spark/src/spark/primitives/supportClasses/shaders/LuminosityMaskFilter.pbk
        flex/sdk/trunk/frameworks/projects/spark/src/spark/primitives/supportClasses/shaders/Saturation.pbk
        flex/sdk/trunk/frameworks/projects/spark/src/spark/primitives/supportClasses/shaders/SoftLight.pbk

  • Roland MC500 into logic - setting tempo for imported midi data

    My mate has a Roland MC500, where he's programmed multi-tracked MIDI info for songs etc., and we've sussed how to import the channels into separate regions in Logic.
    We've set the tempo as per the tempo on the MC500, but after importing the track, after a certain length of time the tempo goes out of sync. All the MIDI data has come across, and when assigning soft-synths etc. it plays fine, but it just drifts out of tempo.
    Is there a way that, when you have imported MIDI data in this way, you can ask Logic to confirm/set the tempo?
    I've had a read through the manuals etc., but may have missed where it describes that bit.
    Thanks for any advice.
    andy

    Yes...
    You need to set the MC500 to send MMC over its MIDI outs; then in Logic you must set the sync "Blue button".
    The picture shows MTC, but if you scroll down the menu you will find the MMC settings; that must be turned ON in Logic's Synchronization settings.
    Logic must be set as the slave, because the Roland device is the one driving the sync!
    G

  • Excel integration for importing data

    Hi all,
    I'm currently looking at a way of testing the functionality of forms by importing a range of input data.
    The data is being collated in an Excel spreadsheet.  Does anyone know a way of integrating forms with the spreadsheet to populate the specified fields?
    Thanks,
    Rob.

    Thanks for this.
    I was more looking for using Excel as source to populate the fields within the form, rather than the other way round.
    Any ideas?
    Thanks,
    Rob.

  • How to exclude a selection in BW when importing transaction data from an InfoProvider

    Hi ,
    I am trying to import data from a BW InfoProvider into a BPC cube through Data Manager. It works OK if I select Year 2010-2014 and value type.
    But for the G/L account I want to import 100000 to 100129, exclude 100130, and again include 100131 to 999999. How do I achieve this?
    I selected the accounts 100000 to 100130 on one line and 100131 to 999999 on a different line; it did not work.
    When I select this it simply errors out with 0 records selected.
    Best regards,
    Sreedhar .

    Thank you, that was the issue. I used the same formula for INT_ORDER and I am getting:
    EXTERNAL INTERNAL FORMULA
    00000?????? ??????
    000001003951 is invalid or is a calculated member
    if i use
    EXTERNAL INTERNAL FORMULA
    * js:parseInt(%external%)
    I am getting
    1 ,NO_COST_CENTER,DS_CPX_ACTUAL,NO_EX_DEPT,NO_LOCATION,NO_PLAN_ID,NO_REQUEST_ID,USD,ACTUAL_TEST,ACT,160020,NaN,2012.DEC,1.00
    Line1 :Dimension: INT_ORDER member: NaN is invalid or is a calculated member
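    The `js:parseInt(%external%)` transformation behaves like the sketch below (plain JavaScript, with an explicit radix added): it strips the leading zeros as intended, but it returns `NaN` for blank or non-numeric externals, and BPC then rejects `NaN` as an invalid member.

    ```javascript
    // Sketch of what js:parseInt does to typical INT_ORDER externals.
    function convert(external) {
      return String(parseInt(external, 10));
    }

    console.log(convert("000001003951")); // "1003951" - leading zeros stripped
    console.log(convert(""));             // "NaN" - the member BPC rejects
    ```

    A conversion that only applies `parseInt` to numeric externals (or a `*` mapping restricted to numeric members) would avoid the `NaN` rows.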

  • The connection string for coded UI Data driven test using excel data source is not working

    Hello,
    I am using the Visual Studio 2012 coded UI test. I added the following connection strings to connect to an Excel data source:
    [DataSource("System.Data.Odbc", "Dsn=Excel Files;Driver={Microsoft Excel Driver (*.xls)};dbq=C:\\Users\\shaza.said.ITWORX\\Desktop\\Automation\\On-track Automation Framework\\On-track_Automation\\Automation data file.xls;defaultdir=.;driverid=790;maxbuffersize=2048;pagetimeout=5;readonly=true",
    "Sheet1$", DataAccessMethod.Sequential), TestMethod]
    [DataSource("System.Data.Odbc", "Dsn=Excel Files;dbq=|DataDirectory|\\Automation data file.xls;defaultdir=C:\\Users\\shaza.said.ITWORX\\Desktop\\Automation\\On-track Automation Framework\\On-track_Automation\\Automation data file.xls;driverid=1046;maxbuffersize=2048;pagetimeout=5",
    "Sheet1$", DataAccessMethod.Sequential), TestMethod]
    But i get the following error:
    "The unit test adapter failed to connect to the data source or to read the data. For more information on troubleshooting this error, see "Troubleshooting Data-Driven Unit Tests" (http://go.microsoft.com/fwlink/?LinkId=62412) in the MSDN Library.
    Error details: ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified"
    Thanks,
    Shaza
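    Error IM002 means the ODBC driver manager resolved neither a DSN named "Excel Files" nor a usable driver entry - commonly a 32-bit vs 64-bit DSN mismatch on the test host. A small sketch (the helper and the shortened path are illustrative, not part of the test framework) for inspecting which keys a connection string actually carries:

    ```python
    def parse_odbc(conn_str):
        """Split an ODBC connection string into a {key: value} dict."""
        parts = {}
        for item in conn_str.split(";"):
            if "=" in item:
                key, _, value = item.partition("=")
                parts[key.strip().lower()] = value.strip()
        return parts

    cs = ("Dsn=Excel Files;Driver={Microsoft Excel Driver (*.xls)};"
          "dbq=C:\\data\\Automation data file.xls;readonly=true")
    parts = parse_odbc(cs)
    print(parts["dsn"])     # Excel Files  <- must exist in the matching-bitness ODBC admin
    print(parts["driver"])  # {Microsoft Excel Driver (*.xls)}
    ```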

    Thanks for Adrian's help.
    Hi Shaza,
    From the error message, I suggest you follow Adrian's suggestion and check that the data source connection string is correct.
    In addition, you can refer to the following about how to create a data-driven coded UI test to check your issue:
    http://msdn.microsoft.com/en-us/library/ee624082.aspx
    Alternatively, you can try using a configuration file to define a data source for the coded UI test.
    For example:
    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <configSections>
        <section name="microsoft.visualstudio.testtools"
                 type="Microsoft.VisualStudio.TestTools.UnitTesting.TestConfigurationSection, Microsoft.VisualStudio.QualityTools.UnitTestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>
      </configSections>
      <connectionStrings>
        <add name="ExcelConn"
             connectionString="Dsn=Excel Files;dbq=E:\Unit Test\AddClass\AddUnitTest\add.xlsx;defaultdir=.;driverid=790;maxbuffersize=2048;pagetimeout=60;"
             providerName="System.Data.Odbc"/>
        <add name="ExcelConn1"
             connectionString="Dsn=Excel Files;dbq=E:\Unit Test\AddClass\AddUnitTest\sum.xlsx;defaultdir=.;driverid=790;maxbuffersize=2048;pagetimeout=60"
             providerName="System.Data.Odbc"/>
      </connectionStrings>
      <microsoft.visualstudio.testtools>
        <dataSources>
          <add name="ExcelDS_Addition" connectionString="ExcelConn"
               dataTableName="Addition$" dataAccessMethod="Sequential"/>
          <add name="ExcelDS_Multiply" connectionString="ExcelConn1"
               dataTableName="Multiply$" dataAccessMethod="Sequential"/>
        </dataSources>
      </microsoft.visualstudio.testtools>
    </configuration>
    For more information, please see: https://msdn.microsoft.com/en-us/library/ms243192.aspx
    Best Regards,

  • Technique for importing DV directly into iMovie without using the import function

    For those who don't want to spend extra time importing DV files, I offer the following technique which I deduced, tested and find very useful.
    I recently developed a very simple technique for getting DV clips into an iMovie06 project without importing them directly. First, I transform the clip, whatever the underlying codec, into DV and name the DV files in the order I want them to appear. (Use of free MPEG Streamclip is recommended)
    Then I create an iMovie06 project with the correct aspect ratio (4x3 or 16x9). Then I save and close the project in the same folder as the DV files.
    Then in Finder I right-click the iMovie06 project and use "show package contents" to reveal the contents of the project.
    I double click the Media folder and then select and drag all of the clips into the Media folder.
    Then I double click on the iMovie project. When it opens, it will announce that there are items in the trash. Those items are the clips which were just added to the folder prior to opening the project.
    Open the trash and systematically drag each clip into the clip menu spots in the sequence in which you wish to view them. Save the project and close the empty trash.
    The project is ready for further editing.
    This saves a great deal of time and seems to improve the video quality by avoiding a lengthy, second import process.
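    The Finder steps above can also be sketched from Terminal (paths and clip names here are illustrative, not from the post); an iMovie 06 project is just a package, i.e. a folder with a Media directory inside:

    ```shell
    # Work in a scratch directory so nothing real is touched.
    cd "$(mktemp -d)"

    # Stand-ins for a project package and some converted DV clips.
    mkdir -p MyProject.iMovieProject/Media dv_clips
    touch dv_clips/clip01.dv dv_clips/clip02.dv

    # Same effect as "show package contents" + dragging clips into Media.
    cp dv_clips/*.dv MyProject.iMovieProject/Media/
    ls MyProject.iMovieProject/Media
    ```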
    Cheers.

    Yes, I have been using this for any longer import:
    http://www.sjoki.uta.fi/~shmhav/iMovieHD_6_bugs.html#quick_DVimport
    Just remember to take care that the .dv clip you are dropping in the Media folder is REALLY a plain DV stream .dv clip and that it is of the same video standard (PAL/NTSC) as the iMovie project.
    Some gotchas: iMovie HD 6 accepts even DV-encoded .mov files or wrong video standard .dv clips straight to its Media folder, but adding titles to such clips produces colorful artifacts to the rendered clip. Also notice that iMovie 4's Full Quality DV export preset exports a file with the .dv suffix but it is, in fact, a DV-encoded .mov if the source iMovie 4 project has chapters in it (this is not an issue with iMovie HD's Full Quality DV export anymore).

  • Importing form data via XML - using the form interface?

    Hi!
    I'm developing Interactive Forms by Adobe and I want to import my form data via an XML file. The XML file and the email are created by my PDF document.
    Now this XML file has to be parsed by a report or something like that.
    Is it possible to use the form interface I implemented, or do I have to parse it manually?
    I think it should work with this interface, because it creates the PDF and it "knows" what to import/export.
    Am I wrong with my suspicion, or can you help me with this problem?
    Thanks & greets,
    Philip Gillißen

    I'm pretty impressed by Adobe and their staff (I do NOT refer to the community) and how helpful they are.
    Perhaps I didn't understand the meaning on the official customer support website, which states:
    The best way to contact us...
    Ask our experts
    Our community and staff are at your service 24/7
    Even worse, there is a similar question from 25/09/2013 (http://forums.adobe.com/message/5711946#5711946) with no feedback from Adobe staff at all.
    Great service, guys!
    Thanks a lot!

  • Corrupt data error when using Windows backup on Oracle

    Our SAP servers include a SDLT internal tape drive that we use for doing a complete system backup.  When using Windows Backup we get the following message in the backup log:
    WARNING: Portions of "\oracle\T00\sapdata1\protd_2\PROTD.DATA2" cannot be read.  The backed up data is corrupt or incomplete.
    This file will not restore correctly.
    This is occurring on a couple of different systems, but the weird thing is that it occurs on one Oracle drive in one system and the other Oracle drive on another system, i.e. the G: drive on our TST system and the E: drive on another. The E: and G: drives contain the main Oracle datafiles.
    Has anybody ever encountered something like this, and what can I do about it?
    Thanks;
    Gale S.

    Especially, no backup tool would know about the fact that an Oracle block had been changed after it was copied.
    Now I have to contradict: there are backup tools (e.g. OpenFile Manager from Legato/EMC) that track the filesystem changes and make sure the backups are consistent in the sense of filesystem blocks. We've been backing up some Oracle databases (non-SAP, 7.3.4 and 8.1.7) for ages now, and I never saw this kind of corruption.
    Well, these backup tools perform I/O-consistent copies. That's different from DB-block-consistent in the first place. Anyhow, because the DBMS synchronizes its I/Os with writing out the blocks, both lead to consistent blocks.
    I just wonder how the database would deal with such "inconsistent" files. Given that you start a backup at filesystem block 0, the database is being backed up, and a transaction is committed so that something is written to blocks 100, 200 and 500 while the backup is on block 300: if you now restore the database, it has the already-committed block 500 in the file, but 100 and 200 are not yet written in the database file. The redo will then find an already-committed transaction on 500. What magic will tell the engine that this is due to an online database backup? I mean, if you set the tablespace to backup mode, the engine "knows" the state - but if you just back up the file, the engine is unaware of that... I'm just curious how this is handled...
    Well the trick here is: we tell the database before we plan to copy the data. When a ONLINE backup is done, the tablespaces are set into BACKUP mode. This changes two things for us:
    1. The datafile headers of these tablespaces are not updated anymore although dirty blocks are still written out to these files. Since the control files that keep the SCN (system change number) are updated nevertheless, the database will know at recovery time that these files had been in online backup mode and do need recovery to become consistent again.
    2. The UNDO information is not only written out to the UNDO/ROLLBACK tablespace but additionally to the REDOLOGS. So, for the time where the tablespaces are in BACKUP mode, we have all the information needed to make a block consistent again - regardless of whether a transaction had been committed or rolled back - in the redolog. So a full recovery of all changes is possible with that.
    That's basically the "magic" behind this.
    If the copy of the datafile is done without this - well, then there's no magician in the world who would be able to get it consistent again. Online backups with Oracle are only possible with BACKUP mode or RMAN (OK, and possibly some 3rd-party hacks...).
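    Lars's two points map onto the standard BEGIN/END BACKUP sequence; a sketch with an illustrative tablespace name (not from the thread):

    ```sql
    ALTER TABLESPACE psapdata BEGIN BACKUP;  -- datafile headers frozen, extra redo written
    -- ... copy the tablespace's datafiles with the OS or backup tool ...
    ALTER TABLESPACE psapdata END BACKUP;    -- headers catch up again
    -- At restore time, the archived redo produced during the copy window is
    -- what makes the copied blocks consistent again.
    ```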
    KR Lars

  • Better to use Aperture or FCPX for import?

    I am starting from scratch with FCPX. I have a MBA with a portable thunderbolt drive, and right now this is my primary. I have an older 2008 iMac, and this is my workstation I use for everything when I am at home. I'm on the road more than at home and do most of my FCPX work on the road.
    I use Aperture (and iPhoto) primarily for media management. Now that it imports movies from my camcorder and DSLR, it's really the primary app we use. A huge side benefit is the disk savings offered over importing everything in iMovie.
    Are there any drawbacks to just using Aperture for importing? I can still use these videos, and convert them to prores, etc. with FCPX as needed for video projects, correct?
    85% of the iMovie (or now FCPX) stuff is just to share it online, send to family, etc. But the 15% of the projects are important ones, which is why I'm investing in FCPX. I don't always know when I import the footage if it is going to be included in an important project or not. Housekeeping-wise, it seems like the best way is to use Aperture and avoid the large files until I need them in FCPX.
    Seems obvious, but I know little so far about FCPX.
    Thanks for any insight,
    Jack

    I come from the video end, and I feel it's best to import video into a video application and stills into a stills application. If you import the video into Aperture you still have to get it into FCP. You then drag the video into FCP to edit it. You're doubling up the file, because FCP will want it in the event folder, not the Aperture library. When you drag a still into FCP it will remain linked to the Aperture library. Currently FCP imports the JPEG thumbnail from Aperture, not the RAW file, BTW. If you want the RAW image you have to export it from Aperture and bring it into FCP.

  • Using the Import Test Data Wizard

    Using Oracle HTMLDB 1.6.0.00.87
    Whenever we try to use the Import Test Data wizard, even using a simple text file such as:
    "Forname","Surname"
    Joe,Bloggs
    the file does not import correctly. We are trying to import to a new table and uploading a txt file with the above content. What we get in the set table properties is something like: Column Names : rom_wwv_flow_file_objects
    Data Type:VARCHAR2
    Format:
    Column Length: 30
    Upload: Yes
    Row 1: Where n
    Any ideas? We tried the same test at the UK Oracle User Group conference with success. Is there a setup problem on our server?
    Cheers
    Ty

    Problem solved.
    Ensure the correct character set is selected when importing.

  • Import Excel data into OWB 10.2

    I wanted to import Excel data into OWB 10gR2, so I followed the steps in the linked document. But I cannot perform the operations from Step 6 onwards in OWB 10gR2.
    In OWB 10.2, how can I create a new database link? In order to create a new module below ODBC I have to give the location, and I cannot create the location.
    Step 6: Create an ODBC Source Module and a Database Link
    Use the following steps to create an ODBC source module and database link:
    1. From the Warehouse Builder console, create an ODBC source module. On the navigation tree, ODBC modules are listed under the Others node of the Databases node.
    2. On the Connection Information page, click New Database Link to create a new database link that reads data using the data source created. Figure 3-3 shows the entries used on the New Database Link dialog.
    Note: Ensure that the initialization parameter GLOBAL_NAMES is set to FALSE in the database's initialization parameter file. FALSE is the default setting for this parameter.
    Figure 3-3 New Database Link Dialog
    Notice that the Oracle Service Name field uses the Oracle system identifier specified for the agent.
    3. Ensure that the Use for Heterogeneous Services option is selected. Because you are not accessing an Oracle database, you can enter any value for username and password.
    4. Create and test this database link. Close the New Database Link dialog.
    5. Leave the Schema name <unspecified>. Click the Change Schema button and select <unspecified>. The Connection Information page now looks as shown in Figure 3-4.
    Figure 3-4 Connection Information Page
    6. Create a new deployment location for the module or specify an existing location.
    http://www.oracle.com/technology/products/warehouse/pdf/Cases/case3.pdf
    But I cannot create this in OWB 10.2.
    Can anybody suggest the process?
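    As a sketch of what the case study's "New Database Link" dialog generates behind the scenes (the link name, TNS alias, and sheet name below are illustrative assumptions):

    ```sql
    -- Dummy credentials are fine because the target is the ODBC/Heterogeneous
    -- Services agent, not an Oracle database.
    CREATE DATABASE LINK excel_link
      CONNECT TO "dummy" IDENTIFIED BY "dummy"
      USING 'excel_agent';  -- TNS alias that points at the HS agent listener

    -- Quick test of the link:
    SELECT COUNT(*) FROM "Sheet1$"@excel_link;
    ```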

    Gillies,
    I am not able to create a new location for ODBC in the Connection Explorer.
    Which option should I choose: database link, host:port:service, or SQL*Net connection? If I take the database link option, what description do I have to enter in the database link box? Should I create the database link manually from SQL*Plus?

  • Import some data from one Oracle database to another...

    Hi Guys,
    We have two Oracle database servers on two different machines with the same schema.
    I want to import some data from a specific table in one Oracle database into the other.
    What is the way to do that?
    Please help, in detail.
    Imran Baig

    Hi,
    Thanks for the reply.
    The tables are already created in both of the Oracle databases; only the data varies. I just have to import a few records from one Oracle database to another, with the same user name and the table already existing.
    I have tried using a database link. I can view records from the other Oracle database, but as soon as I write an insert command Oracle hangs. I can't do anything.
    Is there any other way?
    Imran
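    Since the database link already lets you see the remote rows, the usual way to copy just a few of them is a single INSERT ... SELECT over the link (table, link, and predicate below are illustrative):

    ```sql
    INSERT INTO emp
      SELECT * FROM emp@remote_db
      WHERE empno BETWEEN 100 AND 120;
    COMMIT;  -- a distributed transaction stays pending until committed,
             -- which can look like a hang if the COMMIT/ROLLBACK is missing
    ```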

  • About exporting and importing an XML data file in a PDF form

    Hi all,
    I need help with importing an XML data file into a PDF form. When I write something in a text field with a fill color and typeface (font) and export the XML data using the email button, and then import that XML file into the same PDF that was used to export it, all the data appears in the controls but the color and font typeface do not. Does anyone have a suggestion for solving this problem?
    Thank you
    Saroj Neupane

    Can you post a sample form and data to [email protected] and I will have a look.

  • Import metadata WLST command from Java

    Hi All,
    Can anyone please provide me with Java code for importing a metadata file using WLST commands?

    Hi Joe,
    As a professional news and sports photographer I cannot use Aperture for this problem alone. I live and die by my captions and if a photo isn't captioned, or incorrectly captioned, it might as well not exist at all. Fixing the IPTC import problem is No 1 priority for me.
    As it is at the moment only half of the caption (description) comes in and some fields are missing entirely. This is only with RAW files, JPEG files import perfectly.
    Whilst we wait for this issue to be resolved (and PLEASE can it be soon!) is there anyway of working with the IPTC from the JPEG?
    If I shoot RAW and JPEG together can Aperture show both the RAW and the JPEG side by side so that I can lift from JPEG and stamp the RAW with the IPTC data? Even better, can something be written into Aperture where I can select all the RAW & JPEG images and just select a command saying 'copy IPTC data from RAW to JPEG'? This would be a very helpful temporary workaround.
    Finally, and I've put this in the feedback, can we have a much bigger window for IPTC data entry? Personally I would like a pop-up window similar to PhotoMechanic, and I'd like to be able to save my caption so that the next time I import images I can just selected the saved caption instead of going through all the fields setting it up again.
    For Aperture to be used as a DAM for my huge library I need the IPTC to be fully featured.
    Power Mac G5 Duel 2Ghz, 30" Cinema Display Mac OS X (10.4.7) MacBook
