Load several XSD files with ODI

I would need to load several XSD files (which means I would need to create a new DataServer for each XSD file, and a DataStore as output). Has anyone already implemented this for a ct? As the number of XSD files might be up to ~200, I need to automate this load.
thanks, regards
mio.
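
For illustration, here is a minimal Java sketch of how such an automation could start: walk a directory of XSD files and print, for each one, a candidate data server name and the style of JDBC URL used by ODI's XML (Sunopsis) driver, which could then feed scripted creation of the topology objects through the ODI SDK or a topology import. The directory path, the schema-naming convention and the URL properties shown are assumptions to verify against your ODI version.

    // Illustrative sketch only: enumerate the XSD files and print, for each one,
    // a candidate DataServer name plus the kind of JDBC URL used by ODI's XML
    // (Sunopsis) driver. The directory, the naming convention and the URL
    // properties (d = schema file, s = schema name) are assumptions; check them
    // against your ODI version before scripting topology creation.
    import java.io.File;

    public class ListXsdDataServers {
        public static void main(String[] args) {
            File xsdDir = new File("c:/odi/xsd");   // assumed location of the ~200 XSD files
            File[] xsds = xsdDir.listFiles((dir, name) -> name.toLowerCase().endsWith(".xsd"));
            if (xsds == null) return;
            for (File xsd : xsds) {
                String schema = xsd.getName().replaceAll("\\.xsd$", "").toUpperCase();
                String url = "jdbc:snps:xml?d=" + xsd.getAbsolutePath() + "&s=" + schema;
                System.out.println("DataServer XML_" + schema + " -> " + url);
            }
        }
    }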

Similar Messages

  • How to load several transformation files with a single action

    Hi everybody,
    We are loading data from BI cube into BPC cube. We are working on SAP BPC 7.0 version and we have designed several transformation files in order to load each key figure we need.
    Now, we want to load all the transformation files executing only one action. Which one is the best way to do it?
    We thought that it would be possible to build a single process chain, where we would call the target cube and all the transformation files. In this way, the administrator only has to execute once a package that would execute the process chain. We don't want the administrator to execute several times a package looking for the different transformation files.
    How can we do it? Is there any example or document related to it?
    Any idea out there?
    Kind regards
    Albert Mas

    Hi Scott,
    I am facing a problem when I run 2 rounds in one transformation file...
    I need to distribute a source field into BPC by making 2 conversion files... following is the data
    Transformation file
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = ,
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=YES
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT=
    ROUNDAMOUNT=
    CONVERTAMOUNTWDIM=ZOUTPUT
    *MAPPING
    CATEGORY=*NEWCOL(ACT)
    PAO=0COSTCENTER
    TIME=0FISCYEAR
    ZOUTPUT=0FUNDS_CTR
    SIGNEDDATA=0DEB_CRE_LC
    *CONVERSION
    PAO=PAO_CONVER.XLS
    ZOUTPUT=ZOUTPUT_CONVER.xls
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = ,
    AMOUNTDECIMALPOINT = .
    SKIP = 0
    SKIPIF =
    VALIDATERECORDS=YES
    CREDITPOSITIVE=YES
    MAXREJECTCOUNT=
    ROUNDAMOUNT=
    CONVERTAMOUNTWDIM=ZOUTPUT
    *MAPPING
    CATEGORY=*NEWCOL(ACT)
    PAO=0COSTCENTER
    TIME=0FISCYEAR
    ZOUTPUT=0FUNDS_CTR
    SIGNEDDATA=0DEB_CRE_LC
    *CONVERSION
    PAO=PAO_CONVER.XLS
    ZOUTPUT=AMOUNT_CONVER.XLS
    Conversion file 1 (PAO=PAO_CONVER.XLS)
    EXTERNAL    INTERNAL
    ID0001      F08001
    ID0002      F08001
    ID0003      F08001
    DG0001      F08001
    DG0002      F08001
    Conversion file 2 (ZOUTPUT=ZOUTPUT_CONVER.xls)
    EXTERNAL    INTERNAL    FORMULA
    ID0001      FX01        VALUE*1
    ID0002      FX01        VALUE*1
    ID0003      FX01        VALUE*.40
    DG0001      FX02        VALUE*1
    DG0002      FX02        VALUE*1
    Conversion file 3 (ZOUTPUT=AMOUNT_CONVER.XLS)
    EXTERNAL    INTERNAL    FORMULA
    ID0003      FX02        VALUE*.60
    I am getting the following error
    [Start validating transformation file]
    Validating transformation file format
    Start validation transformation 1/2
    Validating options...
    Validation of options was successful.
    Validating mappings...
    Validation of mappings was successful.
    Validating conversions...
    Validation of the conversion was successful
    Start validation transformation 2/2
    Validating options...
    Validation of options was successful.
    Validating mappings...
    Validation of mappings was successful.
    Validating conversions...
    Validation of the conversion was successful
    Creating the transformation xml file. Please wait...
    Transformation xml file has been saved successfully.
    Begin validate transformation file with data file...
    [Start test transformation file]
    Validate has successfully completed
    ValidateRecords = YES
    Reject count: 0
    Record count: 6
    Skip count: 0
    Accept count: 6
    0COSTCENTER is not a valid command or column 0COSTCENTER does not exist in source
    Validation with data file failed

  • How to load a flat file with utf8 format in odi as source file?

    Hi All,
    Does anybody know how we can load a flat file in UTF-8 format as a source file in ODI? Please guide me.
    Regards,
    Sahar

    Could you explain which problem you are facing?
    Francesco

  • Loading several flat files

    I want to load several flat files each week with the same rules file. The files are for instance week12.txt, week13.txt etc. The files are all in the same directory, and need to be loaded each time. Is it possible to use a wildcard with ESSCMD, or MaxL to create a simple loadscript? Or does anybody know another way to load several files?

    If you are using a Win OS, you could use a VBS to write a script. Call the VBS in a DOS batch file, then execute the newly created script. Note: in this example, I didn't fully build the IMPORT command. Additionally, I would add some form of error handling and logging.
    Const sDataPath = "c:\Files"
    Dim fso, oFolder, oFiles, oFile, oScript
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set oFolder = fso.GetFolder(sDataPath)
    Set oFiles = oFolder.Files
    Set oScript = fso.CreateTextFile("c:\script.scr", 2, False)
    oScript.WriteLine "LOGIN " & CHR(34) & "ASPEN" & CHR(34) & " " & CHR(34) & "ADMIN" & CHR(34) & " " & CHR(34) & "PASSWORD" & CHR(34) & ";"
    oScript.WriteLine "SELECT " & CHR(34) & "SAMPLE" & CHR(34) & " " & CHR(34) & "SAMPLE" & CHR(34) & ";"
    For Each oFile In oFiles
        oScript.WriteLine " IMPORT 3 " & CHR(34) & oFile.Name & CHR(34) & ";"
    Next
    oScript.WriteLine "LOGOUT;"
    oScript.WriteLine "EXIT;"
    oScript.Close
    Set oFolder = Nothing
    Set oFiles = Nothing
    Set oScript = Nothing
    Set fso = Nothing
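
    For environments without Windows Scripting Host, the same script-generation idea can be sketched in Java. The server, user, password and application names below are the same placeholders as in the VBScript, and the IMPORT command is deliberately left as incomplete as in the original example.

    // Sketch of the same idea in Java: write an ESSCMD load script that IMPORTs
    // every file found in a directory. Login and application names are placeholders.
    import java.io.File;
    import java.io.IOException;
    import java.io.PrintWriter;

    public class BuildEsscmdScript {
        public static void main(String[] args) throws IOException {
            File dataDir = new File("c:/Files");                       // placeholder data directory
            try (PrintWriter script = new PrintWriter(new File("c:/script.scr"))) {
                script.println("LOGIN \"ASPEN\" \"ADMIN\" \"PASSWORD\";");
                script.println("SELECT \"SAMPLE\" \"SAMPLE\";");
                File[] files = dataDir.listFiles();
                if (files != null) {
                    for (File f : files) {
                        if (f.isFile()) {
                            // IMPORT command is not fully built out, as in the VBS example
                            script.println(" IMPORT 3 \"" + f.getName() + "\";");
                        }
                    }
                }
                script.println("LOGOUT;");
                script.println("EXIT;");
            }
        }
    }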

  • How to load unicode data files with fixed records lengths?

    Hi!
    To load unicode data files with fixed record lengths (in terms of characters, not bytes!) using SQL*Loader manually, I found two ways:
    Alternative 1: one record per row
    SQL*Loader control file example (without POSITION, since POSITION always refers to bytes!):
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode.dat
    INTO TABLE STG_UNICODE
    TRUNCATE
    (
    A CHAR(2) ,
    B CHAR(6) ,
    C CHAR(2) ,
    D CHAR(1) ,
    E CHAR(4)
    )
    Datafile:
    001111112234444
    01NormalDExZWEI
    02ÄÜÖßêÊûÛxöööö
    03ÄÜÖßêÊûÛxöööö
    04üüüüüüÖÄxµôÔµ
    Alternative 2: variable length records
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode_var.dat "VAR 4"
    INTO TABLE STG_UNICODE
    TRUNCATE
    (
    A CHAR(2) ,
    B CHAR(6) ,
    C CHAR(2) ,
    D CHAR(1) ,
    E CHAR(4)
    )
    Datafile:
    001501NormalDExZWEI002702ÄÜÖßêÊûÛxöööö002604üuüüüüÖÄxµôÔµ
    Problems
    Implementing these two alternatives in OWB, I encounter the following problems:
    * How to specify LENGTH SEMANTICS CHAR?
    * How to suppress the POSITION definition?
    * How to define a flat file with variable length and how to specify the number of bytes containing the length definition?
    Or is there another way that can be implemented using OWB?
    Any help is appreciated!
    Thanks,
    Carsten.

    Hi Carsten
    If you need to support the LENGTH SEMANTICS CHAR clause in an external table then one option is to use the unbound external table and capture the access parameters manually. To create an unbound external table you can skip the selection of a base file in the external table wizard. Then when the external table is edited you will get an Access Parameters tab where you can define the parameters. In 11gR2 the File to Oracle external table can also add this clause via an option.
    Cheers
    David

  • How can we load a flat file with very, very long lines into a table?

    Hello:
    We have to load a flat file with OWB. The problem is that each of the lines in the file might be up to 30,000 characters long (up to 1,000 units of information in each line, each 30 characters long).
    Of course, our mapping should insert these units of information as independent rows in a table (1,000 rows, in our example).
    We do not know how to go about it. We usually load flat files using table functions, but we are not sure that they will be able to cope with these huge lines. And how should we pivot those lines? Will the Pivot operator do the trick? Or maybe we should pivot those lines outside the database before loading them?
    We are a bit lost. Any suggestion would be appreciated.
    Regards

    Yes, well, we could define a 1,000-column external table, and then map those 1,000 columns to the Pivot operator… perhaps it would work. But we have been investigating a little bit, and we think that we have found a better solution: there is a unix utility called “fold”. This utility can split our 30,000-character lines into 1,000 lines of 30 characters each: just what we needed. Then we can load the resulting file using an external table.
    We think this is a much better solution than handling 1,000 columns in the external table and in the Pivot operator.
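
    To make the folding step concrete, here is a minimal Java sketch of what the unix "fold -w 30" call does, as an alternative on platforms without that utility; the input and output file names and the fixed 30-character width are assumptions taken from this post.

    // Split each very long input line into fixed 30-character pieces, one piece per
    // output line, so the result can be loaded through a plain external table.
    // File names are illustrative placeholders.
    import java.io.*;
    import java.nio.charset.StandardCharsets;

    public class FoldLines {
        public static void main(String[] args) throws IOException {
            int width = 30; // assumed fixed unit length from the post
            try (BufferedReader in = new BufferedReader(new InputStreamReader(
                     new FileInputStream("wide_input.txt"), StandardCharsets.UTF_8));
                 PrintWriter out = new PrintWriter(new OutputStreamWriter(
                     new FileOutputStream("folded_output.txt"), StandardCharsets.UTF_8))) {
                String line;
                while ((line = in.readLine()) != null) {
                    for (int i = 0; i < line.length(); i += width) {
                        out.println(line.substring(i, Math.min(i + width, line.length())));
                    }
                }
            }
        }
    }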
    Thanks for your help.
    Regards

  • How to load a flat file with lot of records

    Hi,
    I am trying to load a flat file with hundreds of records into an apps table. When I create the process and deploy it onto the console, it asks for an input in an HTML form. Why does it ask for an input when I have specified the input file directory in my process? Is there any way around this where it just reads all the records from the flat file directly? Are custom queues in any way related to what I am about to do? Any documents on this process will be greatly appreciated. If anyone can help me on this, it will be great. Thank you guys....

    After deploying it, do you see whether it is active and its status is on from the BPEL console's BPEL Processes tab? It should not ask for input unless you are clicking it from the Dashboard tab. Do not click it from the Dashboard. Instead, put some files into the input directory. Wait a few seconds and you should see instances of the BPEL process being created, which start to process the files asynchronously.

  • Ability to process several raw files with the same content but with different exposure into the single picture

    Can you add to Lightroom the ability to process several raw files with the same content but different exposures into a single picture?
    The base raw files could come from exposure bracketing during shooting, for example.
    The goal is to get maximum detail in the darks and the lights (if we use "lights recovery" or "fill lights" we lose quality, because a single raw file simply does not have all the required information).
    A similar thing (not the same, just the idea) is High Dynamic Range photography in Adobe Photoshop.
    Thank you

    The plugin LR/Enfuse does this already. And of course Photomatix have a plugin available for Lightroom. This essentially amounts to pixel editing, which is beyond the range of Lightroom's metadata editing.

  • Combine several XML files with same structure

    Hello,
    I have several XML files with the same structure and I want to combine them and create a new XML file, to be able to compare that information easily. It does not look very difficult, but as I am very new to this I am not able to get it working.
    The structure of my actual files would be something similar to:
    Root->...-> Name->Address, Telephone
    And what I would like to have is something like
    Root->.... ->Address-> Name 1,Name 2....
    Root -> ...->Telephone-> Name 1, Name 2....
    Does anyone know how to do this?
    Thanks

    You could write an XSL transformation (stylesheet) that does this and transform your input file via a method like the following:
    /**
     * Transform XML file with a style sheet.
     * Example:
     *
     *   XMLTransformer t = new XMLTransformer();
     *   FileOutputStream fos = new FileOutputStream("C:/Project/result.html");
     *   String xmlFile = "C:/Project/source.xml";
     *   String styleSheet = "C:/Project/stylesheet.xsl";
     *   t.transform(xmlFile, styleSheet, fos);
     *
     * @param xmlfile The XML file to transform.
     * @param style Stylesheet to use for transformation.
     * @param outputStream OutputStream to write the transformed result to.
     */
    // Requires imports from java.io, javax.xml.parsers, javax.xml.transform,
    // javax.xml.transform.dom, javax.xml.transform.stream, org.w3c.dom and org.xml.sax.
    public void transform(String xmlfile, String style, OutputStream outputStream) {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        //factory.setNamespaceAware(true);
        //factory.setValidating(true);
        //todo: use inputstreams instead of file names
        try {
            File stylesheet = new File(style);
            File datafile   = new File(xmlfile);
            DocumentBuilder builder = factory.newDocumentBuilder();
            Document document = builder.parse(datafile);
            // Use a Transformer for output
            TransformerFactory tFactory = TransformerFactory.newInstance();
            StreamSource stylesource = new StreamSource(stylesheet);
            Transformer transformer = tFactory.newTransformer(stylesource);
            DOMSource source = new DOMSource(document);
            StreamResult sr = new StreamResult(outputStream);
            transformer.transform(source, sr);
        } catch (TransformerConfigurationException tce) {
            // Error generated by the parser
            System.out.println("\n** XMLTransformer Factory error");
            System.out.println("   " + tce.getMessage());
            // Use the contained exception, if any
            Throwable x = tce;
            if (tce.getException() != null) {
                x = tce.getException();
            }
            x.printStackTrace();
        } catch (TransformerException te) {
            // Error generated during the transformation
            System.out.println("\n** Transformation error");
            System.out.println("   " + te.getMessage());
            // Use the contained exception, if any
            Throwable x = te;
            if (te.getException() != null) {
                x = te.getException();
            }
            x.printStackTrace();
        } catch (SAXException sxe) {
            // Error generated by this application
            // (or a parser-initialization error)
            Exception x = sxe;
            if (sxe.getException() != null) {
                x = sxe.getException();
            }
            x.printStackTrace();
        } catch (ParserConfigurationException pce) {
            // Parser with specified options can't be built
            pce.printStackTrace();
        } catch (IOException ioe) {
            // I/O error
            ioe.printStackTrace();
        }
    }//transform()

  • Load one text file with 12 periods' data into 12 different periods at once?

    Hi guys,
    In one swoop, can we load one .txt file with 12 periods of data into 12 different periods?
    The scenario:
    Budget data is required for monthly comparative reporting with actuals, so we have 12 periods in our Budget version.
    From a non-SAP system we get one .txt file containing 12 periods worth of budget data,
    - it is in the correct format (and we don't want to create risk by opening and editing it) so it is loaded into each period and the other 11 periods of irrelevant data are obviously ignored.
    Some extra tasks (such as validation, cashflow calculations) are then performed per period.
    Because this has to be repeated 12 times, it can be a time-consuming process.
    We now have multiperiod monitor functionality (from EHP2) and I see how it works for the automatic tasks (very well).
    I'm aware that the guide says that manual tasks will be ignored during the automatic run, this is true.
    However, if I remember correctly EC-CS used to allow it with upload files, so I was expecting it in BCS 6.02.
    - Is there any way to load a file containing 12 periods' worth of data into 12 individual periods all at once?
    (NB we still have an improvement on the previous situation: the user can scroll between periods more quickly, load the file 12 times, then go back to the start and run all auto tasks at once.)
    One thought was to use a file server location with a hardcoded filename, but this would require work beyond my expertise.
    All suggestions welcome

    Hi
    I will suggest that mapping the path based on a common location is the best way, and then map the same logically in the upload method.
    If you are using Citrix then it becomes all the more easy to have a common location.
    Rgds
    Dheeraj

  • Loading XML(XSD) file into database by validating the format in file

    Hi,
    I have a file whose format is predefined as listed below
    <?xml version="1.0" encoding="utf-8" ?>
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
         <xs:element name="VVV">
              <xs:complexType>
                   <xs:sequence>
                        <xs:element name="VVVHeader">
                        <xs:complexType>
                        <xs:sequence>
                        <xs:element name="UniqueIdentifier" type="xs:SEQ0001"/>
                        <xs:element name="VVVFileType" type="xs:123"/>
                        <xs:element name="FileCreatedDatetime" type="xs:23-MAY-2006"/>
                        <xs:element name="ScanDatetime" type="xs:23-MAY-2006"/>
                        <xs:element name="BatchID" type="xs:456"/>
                        <xs:element name="BatchSequence" type="xs:1001"/>
                        <xs:element name="DataEntryDatetime" type="xs:23-MAY-2006"/>
                        </xs:sequence>
                        </xs:complexType>
                        </xs:element>
                        <xs:element name="VVVDetail">
                        <xs:complexType>
                        <xs:sequence>
                        <xs:element name="RRRLocalOffice" type="xs:Mumbai"/>
                        <xs:element name="Duplicate" type="xs:No"/>
                        <xs:element name="ReRegMarker" type="xs:boolean"/>
                        <xs:element name="RegistrationMark" type="xs:Vishnu"/>
                        <xs:element name="RegMarkCheckDigit" type="xs:12S"/>
                        <xs:element name="TaxClassCode" type="xs:67"/>
                        <xs:element name="ExternalMakeCode" type="xs:Maruti"/>
                        <xs:element name="ExternalModelCode" type="xs:800LX"/>
                        <xs:element name="RRRMakeCode" type="xs:Mar"/>
                        <xs:element name="RRRModelCode" type="xs:800LV"/>
                        <xs:element name="RRRVehicleBodyCode" type="xs:98"/>
                        <xs:element name="RRRColourCode" type="xs:red"/>
                        <xs:element name="RegistrationDate" type="xs:23-MAY-2006"/>
                        <xs:element name="ChassisNumber" type="xs:num123"/>
                        <xs:element name="HC" type="xs:1.22"/>
                        <xs:element name="UnWeight" type="xs:90"/>
                        <xs:element name="NoSeats" type="xs:four"/>
                        <xs:element name="NOx" type="xs:2.22"/>
                        <xs:element name="RevenueWeight" type="xs:34"/>
                        <xs:element name="CO2" type="xs:55"/>
                        <xs:element name="Particulates" type="xs:long"/>
                        <xs:element name="CO" type="xs:3.33"/>
                        <xs:element name="HCNOx" type="xs:4.44"/>
                        <xs:element name="TrailerWeight" type="xs:66"/>
                        <xs:element name="StationaryLevel" type="xs:77"/>
                        <xs:element name="EngineSpeed" type="xs:88"/>
                        <xs:element name="DriveBynature" type="xs:99"/>
                        <xs:element name="SMMTFleetCode" type="xs:yel"/>
                        <xs:element name="Purchasercode" type="xs:4004"/>
                        <xs:element name="IndustryofUse" type="xs:office"/>
                        <xs:element name="OriginalDealerCode" type="xs:tr56"/>
                        <xs:element name="SellingDealerCode" type="xs:se23"/>
                        <xs:element name="bill110" type="xs:srira"/>
                        <xs:element name="bill111" type="xs:mula"/>
                        <xs:element name="SalesType" type="xs:krish"/>
                        </xs:sequence>
                        </xs:complexType>
                        </xs:element>
                   </xs:sequence>
              </xs:complexType>
         </xs:element>
    </xs:schema>
    I would like to load this into a table by validating the format of the XSD file.
    I will be having several similar files.
    Steps I will be doing on this are
    1 > Pick up file from a directory one at a time
    2 > Validate the format of XSD file
    3 > If valid load into a table
    How can I achieve the above steps using PL/SQL?
    Please mail me your suggestions on a very urgent basis.
    Thanks in advance
    Vishnu

    Really this should be in the XML forum, but anyway....
    Just a small question... It looks as though you are using an XML schema (XSD) to store your table information along with its data, am I correct? Seems a very bizarre thing to do.
    If you are using Oracle 10g you can drop your file into the Oracle WebDAV area (XDB) and from there you can access the XML as an XMLTYPE using something like...
    SELECT rv.res.extract('/Resource/Contents/*')
    FROM   resource_view rv
    WHERE  lower(rv.any_path) = lower(lc_filename)
    Once you have it in an XMLTYPE variable from that query, and because your schema doesn't really define the datatypes etc., it's gonna be up to you to parse the XML using something like the DBMS_XMLDOM package and process that into create table statements or something.
    Certainly looks like you've got yourself a nice task to do there. Glad I'm not doing it.
    ;)

  • How to create java classes when multiple xsd files with same root element

    Hi,
    I got below error
    12/08/09 16:26:38 BST: [ERROR] Error while parsing schema(s).Location []. 'resultClass' is already defined
    12/08/09 16:26:38 BST: [ERROR] Error while parsing schema(s).Location []. (related to above error) the first definition appears here
    12/08/09 16:26:38 BST: Build errors for viafrance; org.apache.maven.lifecycle.LifecycleExecutionException: Internal error in the plugin manager executing goal 'org.jvnet.jaxb2.maven2:maven-jaxb2-plugin:0.7.1:generate': Mojo execution failed.
    I tried to generate Java classes from multiple XSD files, but I am getting the above error; each .xsd file contains an <xs:element name="resultClass"> declaration.
    So I removed all .xsd files except one, and now the Java classes are generated correctly. But I want to generate the Java classes with different names without changing the .xsd files.
    Can anyone please tell me how to resolve this?
    regards
    prasad.nadendla

    Gregory:
    If you want to upload several Java classes in one script, the solution is a .sql file, for example:
    set define ?
    create or replace and compile java source named "my.Sleep" as
    package my;
    import java.lang.Thread;
    public class Sleep {
        public static void main(String[] args) throws java.lang.InterruptedException {
            if (args != null && args.length > 0) {
                int s = Integer.parseInt(args[0]);
                Thread.sleep(s * 1000);
            } else {
                Thread.sleep(1000);
            }
        }
    }
    /
    create or replace and compile java source named "my.App" as
    package my;
    public class App {
        public static void main(String[] args) throws java.lang.InterruptedException {
            System.out.println(args[0]);
        }
    }
    /
    exit
    Then the .sql file can be run using the SQL*Plus, JDeveloper or SQL Developer tools.
    HTH, Marcelo.

  • Using a container to load several swf files and play them

    I need some help. I want to use several swf files and have them be called upon in a container file and play them in sequence. It's a presentation that needs to play thru but still have the ability to stop, click on items, open a popup and then continue on in the presentation. I am building all the individual "chapters" and their "sub-chapters" as swf files, with the hope that I can load them in order. I am relatively new to AS3. Help?

    If you will be loading swf files into a container then you will use the Loader class to accomplish that, so give that a looking over in the help documents and see what you can do.  If you have a problem getting it to work, post your code and describe what you have done.

  • Extracting data from Essbase & loading into flat file through ODI

    Hi,
    I want to extract data from Essbase and load it into a flat file through ODI (for extraction from Essbase I'm using a report script). I'm using these KMs: LKM Hyperion Essbase data to SQL, IKM SQL to FILE Append, and for reversing, RKM Hyperion Essbase. All the mappings have been done and the interface has been made. But when I execute the interface it throws the error below:
    ODI-1217: Session ESS_FILEI (114001) fails with return code 7000.
    ODI-1226: Step ESS_FILEI fails after 1 attempt(s).
    ODI-1240: Flow ESS_FILEI fails while performing a Loading operation. This flow loads target table ESS_FILE.
    ODI-1228: Task SrcSet0 (Loading) fails on the target FILE connection FILE_PS_ODI.
    Caused By: java.sql.SQLException: ODI-40417: An IOException was caught while creating the file saying The system cannot find the path specified
    at com.sunopsis.jdbc.driver.file.impl.commands.CommandCreateTable.execute(CommandCreateTable.java:62)
    at com.sunopsis.jdbc.driver.file.CommandExecutor.executeCommand(CommandExecutor.java:33)
    at com.sunopsis.jdbc.driver.file.FilePreparedStatement.execute(FilePreparedStatement.java:178)
    at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)
    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:619)
    Please let me know what I'm missing and how I can resolve this error.
    Thanks

    It seems that you are trying to use the file as your staging area. The Hyperion LKM extracts Essbase data into a DB staging area, which can then be used by your file IKM to load it into the file.
    You need to use an RDBMS for your staging area.

  • Picking up Random files with ODI

    I have a directory into which an application dumps random files. The files might be randomly named, but they have the same format. I need to be able to read each file and check whether it has the same schema; if so, load it into a common table...
    How do I go about doing this?
    I have created a file schema and manually reverse-engineered one file in the directory, and had it as part of the interface to load it into my target.
    My problem, however, is when we might have different file names for the same file format, e.g.:
    FlatFile.dat
    FlatFile 10-12-08.dat
    FlatFile 11-12-08.dat
    How do I pick up all the newly generated files automatically as part of the ETL process and load them?

    That is not a problem either... hehheehhe
    Put an ODI variable in your resource name, BUT in this case you will need to "know" the name to set the variable.
    I usually do that like:
    1) Create a procedure
    2) as the first step, build an OS command that generates a file with the names of all the files you need; on Windows it will be something like the following (a cross-platform Java sketch is shown after this list):
    cmd.exe /c dir c:/temp/sample_dir /b /a:-d > c:/temp/sample_dir/file_list.txt
    Now you have a file with a fixed name that contains all the file names.
    4) Using the Logical Schema for this directory, create a datastore (model table, I will call it file_of_files! hehehe) that points to the "file of files", with just one column (col_file_name, as an example).
    5) Create a new step in the procedure where, on the source tab, with File technology and the correct Logical Schema, you write:
    select col_file_name from file_of_files
    6) On the target tab, using Sunopsis API technology, call a package scenario.
    To create this package do:
    6.a) Drag and drop the variable you're using at resource name and set it as "Declare"
    6.b) Drag and drop the interface that use the dynamic file as source.
    6.c) generate the scenario
    the command at target will be something like:
    OdiStartScen "-SCEN_NAME=abc" "-SCEN_VERSION=001" "-LOG_LEVEL=5" "-ODI_VARIABLE_NAME=#col_file_name"
    In this way you can have your source dynamically!!!
    Does it make sense to you? (and help??)
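
    As a cross-platform alternative to the cmd.exe dir command in step 2 above, the "file of files" could also be produced by a small Java program. The directory path, the .dat extension filter and the file_list.txt name below are placeholders matching the example in the steps.

    // List every *.dat file in a directory and write the names to file_list.txt,
    // one per line, so the ODI datastore built on that file can read them.
    import java.io.File;
    import java.io.IOException;
    import java.io.PrintWriter;

    public class BuildFileList {
        public static void main(String[] args) throws IOException {
            File dir = new File("c:/temp/sample_dir");           // source directory (assumption)
            File listFile = new File(dir, "file_list.txt");      // the "file of files" read by ODI

            try (PrintWriter out = new PrintWriter(listFile, "UTF-8")) {
                File[] files = dir.listFiles();
                if (files != null) {
                    for (File f : files) {
                        // keep only data files; skip subdirectories and the list file itself
                        if (f.isFile() && f.getName().endsWith(".dat")) {
                            out.println(f.getName());
                        }
                    }
                }
            }
        }
    }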
