Issue while using SUNOPSIS MEMORY ENGINE (High Priority)

Hi Gurus,
While using the SUNOPSIS MEMORY ENGINE to generate a .csv file with a database table as the source, it throws an error like the following in the Operator:
ODI-1228: Task SrcSet0 (Loading) fails on the target SUNOPSIS ENGINE connection SUNOPSIS MEMORY ENGINE.
Caused By: java.sql.SQLException: unknown token
(LKM used : LKM Sql to Sql.
IKM used : IKM Sql to File Append.)
Can you please help me with this ASAP, as it has become a show stopper for me to proceed further.
Any help will be greatly appreciated.
Many Thanks,
Pavan
Edited by: Pavan. on Jul 11, 2012 10:22 AM

Hi All,
The issue got resolved successfully.
The solution is to change the work table prefixes E$_, I$_, J$_, ... to E_, I_, J_, ... (i.e. remove the '$' symbol) in the PHYSICAL SCHEMA of the SUNOPSIS MEMORY ENGINE, as per the information given below.
When running interfaces and using an XML or Complex File schema as the staging area, the "Unknown Token" error appears. This error is caused by the updated HSQL version (2.0). This new version of HSQL requires that table names containing a dollar sign ($) be surrounded by quotes. Temporary tables (Loading, Integration, and so forth) that are created by the Knowledge Modules do not meet this requirement on Complex File and HSQL technologies.
As a workaround, edit the Physical Schema definitions to remove the dollar sign ($) from all the Work Tables Prefixes. Existing scenarios must be regenerated with these new settings.
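For illustration, the underlying behaviour can be reproduced with plain SQL against an HSQL 2.x database (the table name C$_TEST below is only an example, not one of the actual work tables):
  CREATE TABLE C$_TEST (id INTEGER);     -- fails on HSQL 2.0: the unquoted $ raises the token error
  CREATE TABLE "C$_TEST" (id INTEGER);   -- works: a name containing $ must be quoted
  CREATE TABLE C_TEST (id INTEGER);      -- works: the workaround simply avoids $ in the prefixes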
It worked fine for me.
Thanks ,
Pavan Kumar

Similar Messages

  • Sunopsis Memory Engine Issue

    Hi,
    I recently started getting the following error whilst executing an interface which uses the Sunopsis Memory Engine as the staging area -
    ODI-1228: Task SrcSet0 (Loading) fails on the target SUNOPSIS_ENGINE connection SUNOPSIS_MEMORY_ENGINE.
    Caused By: java.sql.SQLException: statement is not in batch mode
    The interface had previously been executing fine for a number of days. After some debugging I narrowed it down to the fact that when the source file was loaded to the staging area in the interface, if it had more than 210 records it would generate this error. I'm struggling to work out why this should be the case as I'm sure this isn't a restriction I have ever encountered before. Any ideas would be appreciated!!

    No, nothing strange in that row. I can remove any record in the file or add any record at row 211 and the result is always the same. There are only 2 fields on each line, so it's a pretty basic recordset.

  • Sunopsis memory engine

    Hi all,
    I've got a doubt regarding the use of the Sunopsis memory engine as the staging area. All the examples and blogs explain things with Sunopsis. Can it survive any amount of data load, or do we have to go for some other means for bulk data loads? Can we create a customized staging area, like a dedicated schema or DB?
    thanks

    Sutirtha Roy wrote:
    Hi ,
    The Sunopsis memory engine should only be used when you do not have any relational schema to work with as the staging area.
    If your data volume increases, you are bound to get performance issues with the Sunopsis memory engine.
    The Sunopsis memory engine is mainly provided for small demo purposes.
    Thanks,
    Sutirtha
    I have to disagree; I don't believe it should only be used if there is no relational schema. There are occasions when it is very useful.
    There are many occasions, for instance loading to Planning or Essbase from a flat file, where there is no need to load into a relational staging area, and using the memory engine can outperform it and works perfectly well.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Mapping issues while using MapwithDefault Node function for Idoc

    Hi Experts,
    We are facing issues while trying to generate nodes on the receiver side even though they do not exist on the sender side.
    We are using MATMAS05 and we want the nodes E1MARCM and E1MARDM to be generated in the target structure.
    the structure is like this
    Matmas05
    idoc
    E1maram
    segment
          E1marcm
            segment
              msgfn
              .....other fields
              E1mardm (can be many segments)
                segment 
                  lgort
                  ..other fields
              E1mpgdm
                 segment
                 ...e1mpgdm fields
    the mapping has been done like:
    e1marcm -mapwithdefault-e1marcm
    constant -segment
    werks-mapwithdefault-werks
    other fields(one to one mapping)
    e1mardm-mapwith default-e1mardm
    lgort-mapwithdefault(value 1000)-lgort
    other fields -one to one mapping
    Now the problem we are facing is that when E1MARDM does not exist in the source structure, the values from the other E1MARDM node (which does exist) get written into it.
    We want only the lgort to be 1000 and the segment to be populated on the target side,
    like
    e1mardm
    segment
    lgort value 1000
    Could you please assist in solving this issue and give your valuable suggestions on how we could handle it?
    Thanks and regards,
    jyoti

    You will achieve this using a UDF. Map it like the below.
    e1mardm - mapWithDefault (Value: Constant) - e1mardm
                                                            lgort (A) ---
    e1mardm - mapWithDefault (Value: Exit) -- (EqualsS) (B) - (UDF1) - lgort
                                Constant (Value: Exit) ---
                                              Constant (Value: 1000) (C) ---
    For the other one-to-one mapping fields you should also use one more UDF.
                        Other one-to-one fields (A) ---
    e1mardm - mapWithDefault (Value: Exit) -- (EqualsS)    (B) - (UDF2) - Target
                                 Constant (Value: Exit) ---
    UDF1:
    Arguments A, B, C; select queue.
        // write your code here
        int j = 0;
        for (int i = 0; i < a.length; i++) {
            if (b[j].equals("true")) {
                result.addValue(c[0]);               // emit the constant value (1000)
                i--;                                 // re-visit the same source value on the next pass
                result.addValue(ResultList.CC);      // insert a context change
            } else {
                result.addValue(a[i]);               // pass the source value through
            }
            j++;
        }
    UDF2:
    Arguments A, B; select queue.
        // write your code here
        int j = 0;
        for (int i = 0; i < a.length; i++) {
            if (b[j].equals("true")) {
                i--;                                 // re-visit the same source value on the next pass
                result.addValue(ResultList.CC);      // insert a context change
            } else {
                result.addValue(a[i]);               // pass the source value through
            }
            j++;
        }

  • Is there any issue in using ABAP Memory ID to exchange between the programs

    Hi All,
    Do we expect any issues if we use ABAP memory IDs to exchange data between different programs?
    I was told by my colleagues that we can expect some unforeseen issues if we use ABAP memory IDs. These issues could be because of a refresh of the memory IDs in the standard program.
    Is that true? I need the experts' opinion on this question.
    Thanks a lot in advance.
    Regards,
    RSS

    I can think of such a case only if you pick a memory ID with some standard name. Anyhow, I can't imagine this happening without running a standard report on your machine from your custom report. ABAP memory is user dependent, so you have your own roll area wherein all run programs can communicate. If you don't run any standard report by means of SUBMIT, you don't have to worry about this aspect either.
    Furthermore, if you run a separate GUI session, or simply use /o in the same session, you open a new external session which gets its own new ABAP memory. So you don't affect your previous one at all.
    If you want to be extremely careful, use a memory ID with some custom, original name; i.e. I always use the naming convention NAME_OF_PROGRAM_XXXX where XXXX denotes its usage, e.g. XXXX = 'EMPLOYEES'. If I also don't use SUBMIT, I am 100% sure no other program touches/flushes this memory.
    Don't believe your colleagues and use ABAP memory whenever needed, but always consider the context of the program and where it lies in memory. If they persist, please send them here to discuss this matter, giving some good reason why they discourage you from doing so.
    BTW: this could be an issue with SAP memory, but with ABAP memory there is no chance.
    Regards
    Marcin

  • Short Dump TSV_TNEW_PAGE_ALLOC_FAILED while using shared memory objects

    Hi Gurus,
    We are using shared memory objects to store some data which we will be reading later. I have implemented the interface IF_SHM_BUILD_INSTANCE in the root class and am using its method BUILD for automatic area structuring.
    Today our developments moved from the dev system to the quality system, and while writing the data into the shared memory using the methods ATTACH_FOR_WRITE and DETACH_COMMIT in one report, we started getting the runtime error TSV_TNEW_PAGE_ALLOC_FAILED. This is raised when the method DETACH_COMMIT is called to commit the changes in the shared memory.
    Everything works fine before DETACH_COMMIT. I know that it is happening because the program ran out of extended memory, but I am not sure why it is happening at the DETACH_COMMIT call. If excessive memory is being used in the program, this runtime error should have been raised while calling the ATTACH_FOR_WRITE method or while filling the root class attributes. I am not sure why it is happening at the DETACH_COMMIT method.
    Many Thanks in advance.
    Thanks,
    Raveesh

    Hi raveesh,
    as Naimesh suggested: probably the system parameter for the shared memory area is too small. Compare the system parameters in dev and QA, and check what other shared memory areas are used.
    Regarding your question, why it does not fail at ATTACH_FOR_WRITE but then on DETACH_COMMIT:
    Probably ATTACH_FOR_WRITE will set an exclusive write lock on the shared memory data, then write to some kind of 'rollback' memory, and DETACH_COMMIT will really put the data into the shared memory area and release the lock. The 'rollback' memory is in the LUW's work memory, which is much bigger than the usual shared memory size.
    This is my assumption - don't know who can verify or reject it.
    Regards,
    Clemens

  • Issue while using SUBPARTITION clause in the MERGE statement in PLSQL Code

    Hello All,
    I am using the below code to update specific sub-partition data using oracle merge statements.
    I am getting the sub-partition name and passing this as a string to the sub-partition clause.
    The MERGE statement is failing, stating that the specified sub-partition does not exist. But the sub-partition does exist for the table.
    We are using Oracle 11gr2 database.
    Below is the code which I am using to populate the data.
    declare
       ln_min_batchkey        PLS_INTEGER;
       ln_max_batchkey        PLS_INTEGER;
       lv_partition_name      VARCHAR2 (32767);
       lv_subpartition_name   VARCHAR2 (32767);
    begin
       FOR m1 IN (  SELECT (year_val + 1) AS year_val, year_val AS orig_year_val
                      FROM (  SELECT DISTINCT TO_CHAR (batch_create_dt, 'YYYY') year_val
                                FROM stores_comm_mob_sub_temp
                            ORDER BY 1)
                  ORDER BY year_val)
       LOOP
          lv_partition_name :=
             scmsa_handset_mobility_data_build.fn_get_partition_name (
                p_table_name    => 'STORES_COMM_MOB_SUB_INFO',
                p_search_string => m1.year_val);
          FOR m2 IN (SELECT DISTINCT 'M' || TO_CHAR (batch_create_dt, 'MM') AS month_val
                       FROM stores_comm_mob_sub_temp
                      WHERE TO_CHAR (batch_create_dt, 'YYYY') = m1.orig_year_val)
          LOOP
             lv_subpartition_name :=
                scmsa_handset_mobility_data_build.fn_get_subpartition_name (
                   p_table_name     => 'STORES_COMM_MOB_SUB_INFO',
                   p_partition_name => lv_partition_name,
                   p_search_string  => m2.month_val);
             DBMS_OUTPUT.PUT_LINE ('The lv_subpartition_name => ' || lv_subpartition_name
                                   || ' and lv_partition_name=> ' || lv_partition_name);
             IF lv_subpartition_name IS NULL
             THEN
                DBMS_OUTPUT.PUT_LINE ('INSIDE IF => ' || m2.month_val);
                INSERT INTO STORES_COMM_MOB_SUB_INFO T1 (t1.ntlogin,
                                                         t1.first_name,
                                                         t1.last_name,
                                                         t1.job_title,
                                                         t1.store_id,
                                                         t1.batch_create_dt)
                   SELECT t2.ntlogin,
                          t2.first_name,
                          t2.last_name,
                          t2.job_title,
                          t2.store_id,
                          t2.batch_create_dt
                     FROM stores_comm_mob_sub_temp t2
                    WHERE TO_CHAR (batch_create_dt, 'YYYY') = m1.orig_year_val
                      AND 'M' || TO_CHAR (batch_create_dt, 'MM') = m2.month_val;
             ELSIF lv_subpartition_name IS NOT NULL
             THEN
                DBMS_OUTPUT.PUT_LINE ('INSIDE ELSIF => ' || m2.month_val);
                MERGE INTO (SELECT *
                              FROM stores_comm_mob_sub_info
                                   SUBPARTITION (lv_subpartition_name)) T1   --> Issue Here
                     USING (SELECT *
                              FROM stores_comm_mob_sub_temp
                             WHERE TO_CHAR (batch_create_dt, 'YYYY') = m1.orig_year_val
                               AND 'M' || TO_CHAR (batch_create_dt, 'MM') = m2.month_val) T2
                        ON (T1.store_id = T2.store_id AND T1.ntlogin = T2.ntlogin)
                WHEN MATCHED
                THEN
                   UPDATE SET
                      t1.postpaid_totalqty =
                         (NVL (t1.postpaid_totalqty, 0) + NVL (t2.postpaid_totalqty, 0)),
                      t1.sales_transaction_dt =
                         GREATEST (NVL (t1.sales_transaction_dt, t2.sales_transaction_dt),
                                   NVL (t2.sales_transaction_dt, t1.sales_transaction_dt)),
                      t1.batch_create_dt =
                         GREATEST (NVL (t1.batch_create_dt, t2.batch_create_dt),
                                   NVL (t2.batch_create_dt, t1.batch_create_dt))
                WHEN NOT MATCHED
                THEN
                   INSERT (t1.ntlogin,
                           t1.first_name,
                           t1.last_name,
                           t1.job_title,
                           t1.store_id,
                           t1.batch_create_dt)
                   VALUES (t2.ntlogin,
                           t2.first_name,
                           t2.last_name,
                           t2.job_title,
                           t2.store_id,
                           t2.batch_create_dt);
             END IF;
          END LOOP;
       END LOOP;
       COMMIT;
    end;
    Much appreciate your inputs here.
    Thanks,
    MK.
    (SORRY TO POST THE SAME QUESTION TWICE).
    Edited by: Maddy on May 23, 2013 10:20 PM

    Duplicate question
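    For context, the SUBPARTITION clause in the MERGE above expects a literal subpartition name, not the value of a PL/SQL variable, which is why the statement fails even though the sub-partition exists. A hedged sketch of one possible workaround (not posted in this thread) is to build the MERGE as dynamic SQL inside the ELSIF branch so that the resolved name is embedded as a literal; the column lists are abbreviated here:
       -- Sketch only: lv_subpartition_name should be validated (e.g. against
       -- USER_TAB_SUBPARTITIONS) before it is concatenated into the statement text.
       EXECUTE IMMEDIATE
          'MERGE INTO stores_comm_mob_sub_info SUBPARTITION (' || lv_subpartition_name || ') t1
           USING (SELECT *
                    FROM stores_comm_mob_sub_temp
                   WHERE TO_CHAR (batch_create_dt, ''YYYY'') = :y
                     AND ''M'' || TO_CHAR (batch_create_dt, ''MM'') = :m) t2
              ON (t1.store_id = t2.store_id AND t1.ntlogin = t2.ntlogin)
           WHEN MATCHED THEN
              UPDATE SET t1.postpaid_totalqty =
                            NVL (t1.postpaid_totalqty, 0) + NVL (t2.postpaid_totalqty, 0)
           WHEN NOT MATCHED THEN
              INSERT (t1.ntlogin, t1.store_id, t1.batch_create_dt)
              VALUES (t2.ntlogin, t2.store_id, t2.batch_create_dt)'
          USING m1.orig_year_val, m2.month_val;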

  • Authorization issue while using business content objects

    Hi all,
    I am getting an authorization error while loading from a DSO to a cube (for standard business content objects only). I am also not able to access the data in BEx reports from the standard cubes or DSOs.
    The user is assigned SAP_ALL.
    But there is no such issue while accessing user-defined DSOs or cubes.
    any solutions will be helpful.
    Regards,
    Varma

    Hi,
    Have you looked at the error through the SU53 t-code? There you should see which authorization is missing for the user.
    Regards, Federico

  • Issue to use all memory installed on server

    We are currently running Crystal Reports Server 2008 (12.1.0.882), Crystal 32-bit, on a powerful Dell R905 server with 64 GB of memory.
    There are currently 15 instances of Crystal running on this server, with each being allocated a maximum of 2 GB. The aim of the design is for the memory to be chunked into 2 GB pieces and allocated to the 15 instances of Crystal (2 GB x 15); therefore we don't want the instances to exceed the 2 GB limit imposed.
    But all 15 instances use a maximum of 6 GB of memory on the server even though we have 64 GB available; the page file is being used, the CPU goes to 100%, and performance is very slow.
    Using the /PAE switch in boot.ini, the server shows all 64 GB of memory, and all of this memory can be used by a test application.
    thanks for sharing any information to resolve this issue.
    Nick

    Hi Nick,
    I Believe you are using Crystal Reports Server Embedded and working with Lawson to resolve this issue which is why I moved this to the Java Development forum.
    To get more details are you running 15 instances of Crystal RAS according to the CCM?
    RAS is not PAE aware, and although you can use the boot.ini switch to increase this, we highly suggest you don't. I also don't believe we will limit it if you are trying to limit it to 2 GB.
    Can you check the number of CPU's enabled in License Manager according to the key code?
    Also, how many reports are you running at one time?
    How many users are logged in?
    Are the reports all on demand reports or are they being exported to some format and e-mailed out etc.?
    If you are working with Lawson, possibly what we can do is set up a conference call?
    Thank you
    Don
    PS - also note that Crystal Reports Server is a standalone version of the full Enterprise product, but it is limited to one server and one PC. It is not the same as Crystal Reports Server Embedded, which is a standalone Crystal Reports RAS used for application development only. There is no SAP interface to it.
    Edited by: Don Williams on Oct 24, 2009 9:53 PM

  • UIImagePickerController Custom Overlay Black Screen issue while using Video trimming controller

    Hello everyone,
    I have a problem opening the camera in recording mode using UIImagePickerController with a custom overlay. It opens properly the first time, but when I push to the next view controller and return back again to open the camera, the camera displays a black screen for some time.
    The logic in the intermediate view controller is for video trimming. For video trimming I have used the "SAVideoRangeSlider" control. SAVideoRangeSlider gets all the thumbnail images from the video I assign to it. It uses the AVAssetImageGenerator class provided by Apple to get image frames from videos, and it runs in the background using its handler method.
    Handler method:
    [self.imageGenerator generateCGImagesAsynchronouslyForTimes:times
                                                  completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                                                                      AVAssetImageGeneratorResult result, NSError *error) {
    I am nilling all the objects related to the image generator class when I come back to the camera screen; still, the black screen issue occurs while opening the camera.
    This issue occurs in iOS 7 but not in iOS 6.
    Does anyone have any idea about this? Please reply ASAP.

  • Network speed issues while using osx

    this may seem a little bit strange, and I can't make any sense out of it, but if anyone can figure out what my problem might be it would be greatly appreciated.
    I have an older dual-OS (Win XP, Leopard) MBP and have had a lot of problems with hardware/software (OS X) in the past, and have ended up, to my dismay, almost exclusively using Windows.
    Anyway, just yesterday I decided it was time to give OS X another chance. So besides having it hang constantly while doing simple tasks, I have noticed my network speed is drastically reduced.
    I ran a test on speedtest.net and ended up with a 500 kbps average download speed and 70 kbps upload speed, with a ping of 800 to the closest server in my town.
    I straight away restarted into Windows and ran the test again, and achieved 13500 kbps download and 600 kbps upload, with a ping of 66.
    NOTHING changed between the time of restarting. The router was left as it was, no other computers are switched on, and nothing is downloading.
    In the past when using OS X I have had frequent connection-dropping issues, but as I run wireless I always just attributed it to my router. Although since using Windows exclusively this has also ceased.
    I have looked through the AirPort settings and cannot find anything that looks like it may be the problem, although I'm not as experienced at getting into the back-end settings on OS X.
    The part that confuses me the most is the ping. Latency generally increases with distance, not so much speed. I don't understand how the same server situated nearby can give 66 ms in Windows and 800 in OS X.
    If anyone has any ideas, I thank you in advance.
    P.S. I have OS X 10.5.4 and was updating to 10.5.5, but the slow download speeds caused me to create this post. The update notes for 10.5.5 do mention an increase in network performance; if that solves this then I apologize, although I doubt this drastic speed difference could have existed for so long if everything is set up correctly.

    You are certainly not alone. I have the same problem. Download speed seems to be limited to 500KB/s (any program, any service) and 2.18MB/s using Windows. I tried two different routers with the same results. Even downloading the same files using rapidshare led to this. Very strange.
    This has nothing to do with the DNS server. My connection speed is pretty fast. With and without the recommended OpenDNS server settings. I just re-installed Leopard and updated to 10.5.6. Nothing changed.
    Seems like something is throttling the max. dl speed. Odd!

  • Cisco Nexus 3K Layer 3 Connectivity Issue while using Optical SFP

    Dear All,
    I am facing an L3 reachability issue between N3K switches, even in the same subnet. I have also checked that the VLAN is allowed on the trunk port.
    I can see the switch details as a CDP neighbour.
    We are using SVIs, and all the SVI and interface protocol statuses are up/up. To test, I connected a host directly to the N3K with an optical SFP on an access port and found that reachability failed, but after replacing the optical SFP with an Ethernet SFP module, reachability is okay.
    Please help me to resolve this issue.
    Thanks,
    Kannan,

    Hello Amit,
    Please find the following details.
    We use SFP-10G-LR modules on both ends; we also replaced them and checked with SFP-10G-SR modules as well.
    Software
      BIOS:      version 1.9.0
      loader:    version N/A
      kickstart: version 6.0(2)A1(1b)
      system:    version 6.0(2)A1(1b)
      Power Sequencer Firmware:
                 Module 1: version v3.1
      BIOS compile time:       10/13/2012
      kickstart image file is: bootflash:///n3500-uk9-kickstart.6.0.2.A1.1b.bin
      kickstart compile time:  9/5/2013 14:00:00 [09/05/2013 22:37:16]
      system image file is:    bootflash:///n3500-uk9.6.0.2.A1.1b.bin
      system compile time:     9/5/2013 14:00:00 [09/06/2013 02:25:01]
    Hardware
      cisco Nexus 3548 Chassis ("48x10GE Supervisor")
    Thanks for the reply, and sorry for my delayed response.

  • Transformation issue while using IF condition.

    hi everyone,
    I am using a BPEL transformation based on the condition of a field, using an IF condition.
    A       B          C          D
    10     20          30          40
    20     20          30          50
    30     30          20          60
    40     40          20          70
    Now I need to apply an IF condition in the transformation on field B: if B = 20, transfer the data to one table, and if B NE 20, to the other table.
    Now my issue is that it is able to differentiate the rows, but the data in the field in the second table remains the same, 20.
    I suppose it is storing the data in a buffer location and writing it after mapping with the fields in the source.
    Can anyone suggest how to get the actual value after differentiating the records?

    Declare an internal table with two fields, one for lar and another field for the seconds.
    IF v_lar01 = 'MSTD' AND v_vge01 = 'MIN'.
      it_time-lar     = lar1.
      it_time-seconds = t_plpo-vgw01 * 60.   " minutes to seconds
    ELSE.
      it_time-lar     = lar1.
      it_time-seconds = t_plpo-vgw01.        " seconds
    ENDIF.
    APPEND it_time.
    IF v_lar02 = 'MSTD' AND v_vge02 = 'MIN'.
      it_time-lar     = lar2.
      it_time-seconds = t_plpo-vgw02 * 60.   " minutes to seconds
    ELSE.
      it_time-lar     = lar2.
      it_time-seconds = t_plpo-vgw02.        " seconds
    ENDIF.
    APPEND it_time.
    And so on. At the end, sum it_time-seconds and do the calculation to get hours and minutes from the seconds.
    Mathews

  • Fonts issue while using FM 7.1

    Hi,
    I am using an XML file as the source and porting the content to FM 7.1. I have to port 8 languages, such as Arabic, Russian, Korean, and so on. I am facing the following issues with a few of the languages:
    Korean - The Korean fonts (Gulim, GulimChe, Dotum, and DotumChe (TrueType)) are available in the FrameMaker font list, but when I copy and paste the content into FrameMaker and apply the fonts, the content is not the same; there are “?” in between.
    Russian - The Russian font (Arial) is available in the FrameMaker font list, but when I copy and paste the content into FrameMaker, the content shows “?”.
    zh-hans - The Chinese Simplified font (SimSun) is available in the FrameMaker font list, but when I copy and paste the content into FrameMaker, the content shows “?”.
    zh-hant - The Chinese Traditional font (MingLiU) is available in the FrameMaker font list, but when I copy and paste the content into FrameMaker, the content shows “?”.
    Can anyone help me to resolve this issue or if there are any websites/fonts available to download?
    Thanks,
    Keerthi

    Keerthi,
    As Michael Kazlow mentioned, using FrameMaker 7.1 is one stumbling block. Using FrameMaker 8 or later will remove the problems with Russian and Eastern European languages, provided you are using fonts that support all the necessary characters. Arial, as FrameMaker 7 sees it, does not contain Cyrillic glyphs; add Russian keyboard settings to your system, and during this process a virtual font "Arial CYR" will be made available.
    With Arabic there is no real option apart from special tricks. FrameMaker does not and never did support right-to-left languages. All »solutions« I have heard of worked with tricks like mirrored fonts, could not take advantage of the built-in features of Windows, and therefore need a proprietary editor to prepare the input.
    Regarding all non-Western languages, copy & paste requires that your system is set to the appropriate system locale.
    I am wondering why you talk about XML and also copy & paste. Are you going to handcraft the documents while the input happens to be in an XML document, or are you planning to design a structured application? For the latter, please move the discussion to the Structured forum.
    - Michael

  • Issue while using CountNode()

    Hi All,
    I am stuck at this issue:-
    I am using a file adapter to read a file which has some repeated tags, and I want to count the number of repeated nodes. I came across this function and used it in my code, but instead of giving the total count of the nodes (in my case it should return 4), it gives 0 all the time. I have tried some changes in the function but it didn't help.
    Here is the schema file I am referring to:
    <?xml version="1.0" encoding="windows-1252" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                xmlns="http://www.example.org"
                targetNamespace="http://www.example.org"
                elementFormDefault="qualified">
      <xsd:element name="City">
        <xsd:complexType>
          <xsd:sequence>
            <xsd:element name="CityList" maxOccurs="unbounded">
              <xsd:complexType>
                <xsd:sequence>
                  <xsd:element name="LongName" type="xsd:string"/>
                  <xsd:element name="ShortName" type="xsd:string"/>
                  <xsd:element name="Language" type="xsd:string"/>
                  <xsd:element name="Capital" type="xsd:string"/>
                </xsd:sequence>
              </xsd:complexType>
            </xsd:element>
          </xsd:sequence>
        </xsd:complexType>
      </xsd:element>
    </xsd:schema>
    Here is my .bpel file, in which I have used countNodes():
    <partnerLinks>
        <partnerLink name="read_Dvm_value" partnerLinkType="ns1:Read_plt"
                     myRole="Read_role"/>
      </partnerLinks>
    <variables>
        <!-- Reference to the message passed as input during initiation -->
        <variable name="receiveInput_Read_InputVariable" messageType="ns1:Read_msg"/>
        <variable name="countNodes" type="xsd:int"/>
      </variables>
      <sequence name="main">
        <!-- Receive input from requestor. (Note: This maps to operation defined in BPELProcess1.wsdl) -->
        <receive name="receiveInput" variable="receiveInput_Read_InputVariable" createInstance="yes"
                 partnerLink="read_Dvm_value" portType="ns1:Read_ptt"
                 operation="Read"/>
        <assign name="Assign1">
          <copy>
            <from expression="ora:countNodes('receiveInput_Read_InputVariable','body','/ns2:City/ns2:CityList')"/>
            <to variable="countNodes"/>
          </copy>
        </assign>
      </sequence>
    </process>
    Please suggest.
    Thanks,
    Mohit

    Thanks Bob. I changed the code accordingly:
    New Code:
    rem Local object &oWorkApp1, &oWorkSheet1, &oWorkBook, &oRange, &oCells;
    Local string &newstring;
    &oWorkApp1 = CreateObject("COM", "Excel.Application");
    MessageBox(0, "", 0, 0, "&oWorkApp1:" | &oWorkApp1);
    &oWorkApp1.DisplayAlerts = False;
    ObjectSetProperty(&oWorkApp1, "Visible", False);
    &oWorkBook1 = ObjectGetProperty(&oWorkApp1, "Workbooks");
    rem &oWorkBook1.Open(&sDataFile);
    &oWorkBook1 = &oWorkApp1.Workbooks.Add();
    &rs_Awards = CreateRowset(Record.MSU_TEST_JOB);
    &rs_Awards.Fill("order by EMPLID");
    &oWorkSheet1 = &oWorkApp1.Worksheets("Sheet1");
    &oWorkSheet1.Name = "Karthik";
    For &ie = 1 To &rs_Awards.activerowcount
    &oWorkSheet1.Cells(&ie, 1).Value = &rs_Awards.getrow(&ie).MSU_TEST_JOB.EMPLID.Value;
    End-For;
    &oWorkSheet1 = &oWorkApp1.Worksheets("Sheet2");
    &oWorkSheet1.Name = "test";
    For &ie = 1 To &rs_Awards.activerowcount
    &oWorkSheet1.Cells(&ie, 1).Value = &rs_Awards.getrow(&ie).MSU_TEST_JOB.EMPLID.Value;
    End-For;
    &oWorkApp1.ActiveWorkbook.Saveas("\\nacrtcdell160\cust\temp\Directory.xls");
    ObjectSetProperty(&oWorkApp1, "Visible", True);
    &oWorkApp1.ActiveWorkbook.Close();
    &oWorkApp1.DisplayAlerts = true;
    &oWorkApp1.Quit();
    And we also have write permission on the path to write the file. Do I still need to change anything in this?
