SMP 1.0 issue/question with SMIL, XML, M3U

Trying to dynamically create a SMIL, XML, or M3U file to be played in SMP 1.0.  The page is generated from a PHP script and appears to be properly formatted.  The .src parameter works when it points to a static SMIL or M3U file, but as soon as we point the .src parameter to the dynamically generated PHP page, the SMP gives me the "we are unable to connect to the content you requested" error.
Please note:  The HTTP content types are identical between the static file and the dynamically generated file.
Any help you can offer would be appreciated.
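For reference, a minimal sketch of how the response headers of the static and dynamic URLs could be dumped and compared side by side (the URLs below are placeholders):

    // Minimal sketch (placeholder URLs): print every response header for the static
    // and the dynamically generated playlist so the two can be diffed by eye.
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.List;
    import java.util.Map;

    public class CompareHeaders {
        public static void main(String[] args) throws Exception {
            dumpHeaders("http://server.example/playlists/static.smil");   // placeholder URL
            dumpHeaders("http://server.example/playlists/dynamic.php");   // placeholder URL
        }

        static void dumpHeaders(String url) throws Exception {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            System.out.println("== " + url + " -> HTTP " + conn.getResponseCode());
            for (Map.Entry<String, List<String>> header : conn.getHeaderFields().entrySet()) {
                System.out.println("  " + header.getKey() + ": " + header.getValue());
            }
            conn.disconnect();
        }
    }

Diffing the two dumps should show whether anything beyond Content-Type (Content-Length, transfer encoding, a redirect) differs between the static and dynamic responses.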

Hi jurkiri,
My button does not have the immediate attribute set to true.
My button looks like this:
<af:commandButton text="Visualiser le rapport" id="commandButton3">
    <af:fileDownloadActionListener method="#{backing_statistiques.rapportChangementPlafond}"
                                   filename="rapport.pdf"
                                   contentType="application/pdf"/>
</af:commandButton>
If I do not use the fileDownloadActionListener and instead call a similar method directly as the button action, the parameters work but the button remains stuck in the pressed state.
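For reference, a rough sketch of the backing-bean method that the binding above expects; the class and the PDF-building helper are made up, and only the FacesContext/OutputStream signature matters for af:fileDownloadActionListener:

    // Hypothetical backing bean; af:fileDownloadActionListener calls a void method
    // taking (FacesContext, OutputStream) and streams whatever is written to the client.
    import java.io.IOException;
    import java.io.OutputStream;
    import javax.faces.context.FacesContext;

    public class StatistiquesBacking {

        public void rapportChangementPlafond(FacesContext facesContext,
                                             OutputStream outputStream) throws IOException {
            byte[] pdfBytes = buildReportPdf();   // placeholder for the real report generation
            outputStream.write(pdfBytes);
            outputStream.flush();                 // the listener handles the response headers
        }

        private byte[] buildReportPdf() {
            return new byte[0];                   // placeholder
        }
    }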

Similar Messages

  • Several issues/questions with 865PE Neo2 FIS2R

    OK, first off, thanks to the people here; the stickies are great and very informative.
    Although I do find the info outdated in that they all refer to old BIOS versions, other than that this is a fantastic forum.
    OK, so on with the questions.
    I'm using a Silicon Image IDE U133 RAID controller (PCI card); the IDE hard drive(s) do NOT have to be in RAID. It's just like a Promise card, but made by someone else.
    ANYTIME I have ANY type of hard drive hooked up to it, the motherboard takes nearly 5 minutes to boot to the active hard drive, even if I select the boot menu and then the RAID hard drives (the SATA drives, I mean).
    My active hard drives are two Maxtor 160 GB SATA drives in RAID 0 mode.
    If I have NO hard drives hooked to the controller card, the system boots directly to the active boot hard drive without delay.
    If I have an IDE hard drive set to boot off the SIL card, it takes just as long, which seems like ages.
    Does anyone know of issues with this mobo and certain PCI IDE hard drive controllers?
    I too am having blank screen issues with my 9700 Pro video card, but in my case it's nearly 2 out of 5 boots with no video, and many times I am forced to reach around the PC case and unplug the power cord for a few minutes in order to get the system to boot WITH video.
    Another issue that I have, and this is the WORST:
    Nearly 90% of the time, the motherboard will NOT, I repeat will NOT, boot from a floppy or CD-ROM. All I get is the on-screen message from a bootable floppy, like "Starting Windows 98" or "Starting Windows ME" (these are DOS boot disks made with Windows XP).
    I then get only a blinking cursor and nothing else.
    OR
    I get a blinking cursor and then must wait 5-10 minutes before anything else happens, and even then the CD-ROM or floppy usually won't complete the boot process.
    This happens regardless of whether I have anything hooked to that IDE controller card.
    I am now using the LATEST BIOS offered for this mobo and nothing has changed. In fact, according to the readme I was expecting to see some changes within the BIOS, but I see none. Maybe I read the readme wrong (say that 5 times fast).
    I can NOT run the RAM performance mode in anything other than Slow, which is making me very angry.
    I have Infineon RAM; I'm not sure what model, but I will report back when I find out.
    I do recall Sandra reporting it as supporting CAS 2.5 and other settings that appear to be FAST or TURBO, and yet I can't use those in the BIOS; otherwise my system reboots in Windows or I get memory speed errors during POST.
    Overall I'm VERY pleased with this mobo but am discouraged by the way it's treating me.
    I have a Pentium 4 3.0 GHz HT which is a NON-production model; it is a sample with the multiplier unlocked. I can go with a lower multiplier but NOT a higher one, so I assume it's unlocked but the CPU itself is limited to a 15x multiplier.
    I have tried tons of configurations and can run very stable at 3.45 GHz, but I MUST set the DRAM to 333 MHz, not 400, and then raise the FSB to around 225 or 230. Even then I still get serious power-offs when doing any heavy work on the PC.
    I have the CPU voltage at 1.575 V, yet Sandra reports it as 1.55 V.
    DRAM voltage is at 2.7 V (I am using DDR400)
    and AGP is at 1.7 V.
    With this CPU, what is a GOOD voltage to use?
    Is it OK to go higher than 2.7 V with DDR400?
    And for AGP, everything I have read says that 1.8 V will fry the 9700 Pro. Is this true?
    Sorry I'm asking a lot of questions, but if you can help me with any of them I would REALLY appreciate it, and I would pass along any info I learn!

    Well, as far as your PCI IDE controller questions are concerned, I would not know, as I have never used one; with the available Promise controller, you should try using that instead. As for your memory voltage, you can raise the voltage above 2.7 V, but unless you are really overclocking the FSB (or PBS as some call it), 2.7 V is the norm for this board; if you do NOT have high-performance RAM, you should check with the manufacturer before you go any higher than 2.7 V. As far as the AGP voltage is concerned, 1.6 V is the MAX you should set with any of the new NVIDIA FX or ATI Pro or XT series cards, because as you know, cards running at 8x AGP speed only require 0.8 V to begin with, and the AGP voltage is further reduced by the card's onboard electronics. And if you have one of the high-performance cards, as you know, these get 90% of their operating power from the 4-pin Molex PSU connector. Any AGP voltage higher than 1.6 V will only increase the temps of both the motherboard's chipset and the VPU itself. Sean REILLY875

  • Issues - OC4J with Adobe XML/PDF Access API (XPAAJ)

    Hi,
    I am facing some issues running XPAAJ with OC4J. I narrowed it down to compatibility issues between the XML implementation that comes with OC4J and the one used by Adobe. In particular, the problem is caused by the xmlparserv2.jar that comes with JDeveloper.
    Please let me know if you have seen this issue.
    Thanks
    MS

    Never mind. I found the problem. The code is fine. The problem was that I was passing simple XML as form data, but that only works for forms created in LiveDesigner. I had to create an XFDF string to pass in as data. That worked, but now I'm getting another error that I need to debug in the XFDF.
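    For anyone hitting the same thing, this is roughly the shape of XFDF payload that is needed; a small sketch with a hypothetical helper (the field name and value are placeholders and must match the fields defined in the PDF form):

        // Hypothetical helper that builds a minimal XFDF payload; field name/value are placeholders.
        public class XfdfBuilder {
            public static String buildXfdf(String fieldName, String fieldValue) {
                return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
                     + "<xfdf xmlns=\"http://ns.adobe.com/xfdf/\" xml:space=\"preserve\">"
                     + "<fields>"
                     + "<field name=\"" + fieldName + "\">"
                     + "<value>" + fieldValue + "</value>"
                     + "</field>"
                     + "</fields>"
                     + "</xfdf>";
            }
        }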

  • RAM issues/questions with Bridge CS4

    Hi,
    I've become a little perplexed with Bridge CS4. I constantly get an "out of memory" error with Bridge when working with many large files (21 megapixels) through Adobe Camera Raw, despite having a bulletproof system. First, some system specs:
    Vista 64-bit
    Intel i7 3.2GHz processor
    12 GB of 1600 MHz RAM
    Dedicated separate scratch disc
    This computer just freakin rips!
    So Photoshop (the 64-bit version) just runs like a breeze, despite Vista being its stupid self and taking up about 1.6 GB of RAM just for the OS. However, I am mainly wondering: if Bridge isn't a 64-bit program (it is installed in the x86 Program Files; only Photoshop is installed in the 64-bit Program Files), then maybe it can't utilize the extra RAM over 4 GB? If so, then it is severely crippled, limited to the 4 GB ceiling shared with the RAM-thirsty OS.
    Does anyone actually know and have any answers? I know the "out of memory" error has long been discussed but never seems to come to any solid conclusions. And really, there is no reason why Bridge should be out of memory on a 12 GB system with 9 GB just sitting free when it says it's supposedly "out of memory".
    Any ideas?
    Thanks

    If I remember correctly, the Windows paging file is set to a discrete size; I believe the recommendation is 1.5 x RAM size (about 18 GB for your 12 GB). If Bridge is running in 32-bit mode, I guess you only need a 6 GB paging file, but would more do anything? How large is it, and is it on a separate partition? Also consider this:
    If a paging file resides on a partition that contains other data, it  may experience fragmentation as it expands to satisfy the extra virtual memory that is required. An unfragmented paging file leads to faster virtual memory  access and to a greater chance of a dump-file capture that is free of significant errors.
    Is fragmentation a consideration?

  • Recording issues/questions with GB 8 and M-Audio Fast Track Pro

    Hello. I'm sure this topic has been covered and I think I must be missing something painfully obvious here. Okay. I want to record vocals onto a GB track using M-Audio's Fast Track Pro. I want to be able to hear myself through headphones while I'm recording vocals.
    1. I've installed the drivers.
    2. The M-Audio Icon appears in the System Preferences folder (other).
    3. When I go to GB, I go to the GB Preferences and select Fast Track Pro as the Input.
    <here's where I'm lost>
    4. I plug an XLR microphone into the front XLR connection and I have to turn the gain all the way up on the Fast Track Pro to even get a signal. When I do get a signal, it only registers on one channel.
    *I do understand that a microphone is a mono instrument; does GB automatically create stereo channels on a new track? If so, do I need to make it mono, and how do I do that?
    5. How do I route the audio to my headphones (plugged into the Mac) to monitor my recordings?
    I guess I'm confused with the system preferences vs. the preferences found in GB.
    Thank you for any assistance with this!

    When I do get a signal, it only registers on one channel.
    http://www.bulletsandbones.com/GB/GBFAQ.html#leftspeakeronly
    How do I route the audio to my headphones (plugged into the MAC) to monitor my recordings?
    Choose Built In for output in GB's prefs.
    I guess I'm confused with the system preferences vs. the preferences found in GB.
    Leave the system prefs alone; just change GB's.

  • Re: Issues/Questions with BAPI_COPAACTUALS_POSTCOSTDATA

    I cannot get the vendor from the source list to appear in my COPA document (KE23N). I'm using WWVND. Any thoughts?

    I got the profit center to work, and it is possible to load without the customer number KNDNR.
    My issue is when there IS a valid customer number: I get an error, but if I change just the KNDNR to a non-existent customer, it works. Here is my table entry:
    60     000003     KNDNR     0000021014
    By the way, when I use BAPI_COPAACTUALS_POSTCOSTDATA in an SE37 single test, I don't get an error.

  • Issue with inbound XML

    Hi All,
    We have an issue with inbound XML.
    The XML structure is as follows:
    <?xml version="1.0" encoding="UTF-8" ?>
    <SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
        <SOAP-ENV:Header>
          <Q-ENV:Header>
                  </Q-ENV:Header>
        </SOAP-ENV:Header>
        <SOAP-ENV:Body>
          <Q-ENV:Body>
            <Q-ENV:Content-Type>text/xml</Q-ENV:Content-Type>
            <Q-ENV:Message-Type>xCBL</Q-ENV:Message-Type>
            <Q-ENV:Encoding>UTF-8</Q-ENV:Encoding>
            <Q-ENV:Message-Body>
              <?xml version="1.0"?>
    <?soxtype urn:x-commerceone:document:com:commerceone:XCBL30:XCBL30.sox$1.0?>
    <OrderResponse>
    The issue is that inside <Q-ENV:Message-Body> we are receiving <?soxtype urn:x-commerceone:document:com:commerceone:XCBL30:XCBL30.sox$1.0?> before the OrderResponse header. It neither validates as valid XML, nor are we able to read the items after that processing instruction with graphical mapping or XSLT. If anybody has any idea, thanks.

    Hello,
    The issue is that inside <Q-ENV:Message-Body> we are receiving <?soxtype urn:x-commerceone:document:com:commerceone:XCBL30:XCBL30.sox$1.0?> before the OrderResponse header
    You can use a Java mapping for your requirement. The key is to convert the InputStream into a String, find/replace that value, and then write the result to the OutputStream.
    Here is a sample code using PI 7.1 API:
    https://wiki.sdn.sap.com/wiki/display/XI/SampleJAVAMappingcodeusingPI7.1+API
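    Outside the PI-specific wrapper, the core of that find/replace could look roughly like this (an untested sketch; the class name is made up, UTF-8 is assumed from the envelope, and in a real PI mapping the streams come from the mapping API shown in the wiki above):

        // Rough sketch: read the payload, strip the embedded XML declaration and the
        // stray soxtype processing instruction, and write the cleaned payload back out.
        import java.io.ByteArrayOutputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;

        public class StripSoxTypePI {

            public static void transform(InputStream in, OutputStream out) throws IOException {
                // Read the whole payload into a String (UTF-8 assumed, per the envelope)
                ByteArrayOutputStream buffer = new ByteArrayOutputStream();
                byte[] chunk = new byte[8192];
                int read;
                while ((read = in.read(chunk)) != -1) {
                    buffer.write(chunk, 0, read);
                }
                String payload = buffer.toString("UTF-8");

                // Remove the embedded XML declaration and the soxtype processing instruction
                payload = payload.replace("<?xml version=\"1.0\"?>", "");
                payload = payload.replace(
                    "<?soxtype urn:x-commerceone:document:com:commerceone:XCBL30:XCBL30.sox$1.0?>", "");

                out.write(payload.getBytes("UTF-8"));
            }
        }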
    Hope this helps,
    Mark

  • I have a question about using multiple ipads in our school.  Each of our teachers have a iPad and AppleTV in their classroom.  The issue is, with our classrooms so close in proximity to one another, is there a way to pair teacher

    I have a question about using multiple iPads in our school.  Each of our teachers has an iPad and an Apple TV in their classroom.  The issue is, with our classrooms so close in proximity to one another, is there a way to pair teacher #1's iPad to its Apple TV without affecting/projecting onto the adjacent teacher #2's classroom Apple TV?

    Not as such.
    Give the Apple TV units unique names and also enable the AirPlay password in Settings, with a unique password for each teacher.
    AC

  • Strange issue creating a BCP XML format file with a double dagger '‡' (alt + 0135) as the column terminator with bcp.exe

    Hi,
    I'm having issues generating a BCP XML format file using a fairly unusual column terminator, a double dagger symbol
    ‡ (alt + 0135) which I need to support.
    I'm experiencing this problem with bcp.exe for SQL2008 R2 and SQL2012.
    If I run the following command line:
    bcp MyDB.TMP.Test_Extract format nul -c -x -f "C:\BCP\format_file_dagger_test.xml" -T -S localhost\SQL2012 -t‡
    I end up with an XML format file like so:
    <?xml version="1.0"?>
    <BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
     <RECORD>
      <FIELD ID="1" xsi:type="CharTerm" TERMINATOR="ç" MAX_LENGTH="255" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
      <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="ç" MAX_LENGTH="50" 
    .. and so on.
    You will notice that TERMINATOR="ç" (lowercase c-cedilla) is output instead of TERMINATOR="‡". The ç character, strangely enough, is alt + 135 (and not alt + 0135), so this might be more than a coincidence! I know you can specify the code page, but that switch applies to the data being imported or extracted and not to the format file itself (I tried it anyway).
    In order to use the XML file for bulk import, I manually did a text substitution of the 'ç' character for '‡', and then BCP imports the '‡' data fine.
    This character swap doesn't occur if I generate a non-XML format file (the '‡' character is output in the format file correctly); however, that file produces other import errors, which I don't encounter if I use a standard delimiter like a comma. So I have stuck with the working XML format file, which I prefer.
    Does anyone know why this is happening? I'm planning to automate the generation of the XML format file and would like to avoid the additional step of text substitution if possible.
    Thank you.

    Hi Ham09,
    According to your description, we did a test and found that the terminator character is changed due to the code page of the operating system. When you choose a different time zone in the Date and Time settings and run the same bcp test, you will find that it exports a different TERMINATOR in your XML format file. For example, you can run the test with the character "ç" (alt + 135) in the (UTC-12:00) International Date Line West time zone and the (UTC+09:00) Osaka, Sapporo, Tokyo time zone, and check whether the terminators are different.
    By default, the field terminator is the tab character (represented as \t). To represent a paragraph mark, use \r\n.
    For more information about code pages on Windows, you can review the following article:
    http://msdn.microsoft.com/en-us/library/windows/desktop/dd317752(v=vs.85).aspx
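    If you keep the substitution workaround in the meantime, it is easy to script. A rough sketch (the path comes from your bcp command; the UTF-8 read/write is an assumption, so adjust it to however bcp actually wrote the file on your system):

        // Rough sketch: rewrite the generated XML format file, swapping the mis-encoded
        // c-cedilla terminator (U+00E7) for the intended double dagger (U+2021).
        import java.io.IOException;
        import java.nio.charset.StandardCharsets;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;

        public class FixFormatFileTerminator {
            public static void main(String[] args) throws IOException {
                Path formatFile = Paths.get("C:\\BCP\\format_file_dagger_test.xml");
                String content = new String(Files.readAllBytes(formatFile), StandardCharsets.UTF_8);
                content = content.replace("TERMINATOR=\"\u00E7\"", "TERMINATOR=\"\u2021\"");
                Files.write(formatFile, content.getBytes(StandardCharsets.UTF_8));
            }
        }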
    Regards,
    Sofiya Li
    TechNet Community Support

  • Export/Import Process in the UI for Variations Content Translation is Generating CMP Files with No XML

    We have a SharePoint 2010 Publishing Website that uses variations to deliver content in multiple languages. We are using a third-party translation company to translate publishing pages. The pages are
    exported using the export/import-via-the-UI process described here: "http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx".
    Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites. 
    For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file
    to CAB and extracting it, the only files it contains are DAT files.
    The only difference I can see between the sub-sites that generate CMP files with no XML files and those that do not is size. For example, there is one site that is 114 MB that produces a CMP file with no XML files. Small
    sites do not have this problem. If size is the problem, then I would think the process would generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the Export/Import Process in the UI is doing.
    This leads to two questions:
    1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
    2. If exporting using the UI will not work, can I use PowerShell? I have tried Export-SPWeb, but the Manifest.XML does not contain translatable content. I have not found any parameters that I can use with Export-SPWeb to cause the exported CMP to be in the same format as the one produced by the export/import process in the UI.
    As a next step, we could try developing custom code using the Publishing Service, but before doing this, I would like to understand why the Export/Import process in the UI generates a CMP that
    contains no XML files.
    If no one can answer this question, I would appreciate just some general help on understanding exactly what is happening with the Export/Import Process -- that is, the one that runs when you select
    the export or import option in the Site Manager drop down. Understanding what it is actually doing will help us troubleshoot why there are no XML files in certain export CMPs and assist with determining an alternate approach.
    Thanks in advance
    Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com

    I wanted to bump this post to see about getting some more responses to your problem. I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two subsites with the most content take a
    while to generate the .cmp file (one to two minutes of the browser loading bar spinning waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking like you that it must be a size issue. Not sure though. Did you
    ever happen to find a solution to this problem?

  • Issue in Creation of XML file from ABAP data

    Hi,
    I need to create an XML file, but I am facing some issues creating it in the required format.
    The required format is:
    <Header1 1st field="Value" 2nd field="Value">
       <Header2 1st field="Value" 2nd field="Value" ... up to 10 fields>
          <Header3 1st field="Value" 2nd field="Value" ... up to 6 fields/>
          <Header4 1st field="Value" 2nd field="Value" ... up to 4 fields/>
          <Header5 1st field="Value" 2nd field="Value" ... up to 6 fields/>
       </Header2>
    </Header1>
    I'm using CALL TRANSFORMATION to convert the ABAP data to the XML file.
    So can anybody please help with how to define the XML structure in transaction XSLT_TOOL?
    And one more thing: I need a condition on displaying the Header3, Header4, and Header5 values. If there is no record for a particular line item in Header3, 4, or 5, I don't want to display the full line item; this applies only to Header3, 4, and 5.
    Please help me get this resolved.

    Hello,
    you can use CALL TRANSFORMATION id, which will create an exact "print" of the ABAP data as XML.
    If you need to change the structure of the XML, you can alter your ABAP structure to match the requirements.
    Of course you can create your own XSLT, but that is not easy to describe and nobody will do that for you around here. If you would like to start with XSLT, you'd better start with a search.
    Regards Otto

  • Will the new 10.6.8 update fix SATA3 issues related to 2011 MacBook Pros?

    Will the new 10.6.8 update fix SATA3 issues related to 2011 MacBook Pros?

    It is against the TOU to speculate on these message boards.  I suggest you post your question on the MacRumors site.
    As I already stated, there is no 10.6.8 update.  If there were, it would be listed in Software Update.

  • Problem with the XML in Word 2007 (Word Template)

    Hi Experts,
    I am new to CRM 2007 and I have a problem with the XML structure of the Word template.
    First I built a web service with the Web Service Design Tool. Then I saw on the test page that it works.
    So I went to Document Templates and created a new template. The object type was BUS2000126 - CRM Business Activity, and the web service was the one I had created and tested.
    When I opened Word 2007 with the XML structure, I realized that something was wrong.
    The response on my test page from the web service tool had the following structure:
    response (test.types.p1.CrmostZlaWord5ReadResponse)
       Output (test.types.p1.CrmostZla010RoszlaWord5001)
            ZlaWord5 (test.types.p1.CrmostZla010Rosbtorder)
                Administrativeheaderoforder (test.types.p1.CrmostZla010Rosbtorderhea001)
                     Partiesinvolvedofheader (test.types.p1.CrmostZla010Rosbtheaderpa001)
                         Allpartiesinvolved (test.types.p1.CrmostZla010Rosbtpartnera002[]) Displaying 3 elements of 3
                              element1 (test.types.p1.CrmostZla010Rosbtpartnera002)
                                   Btpartneraddress (test.types.p1.CrmostZla010Rosbtpartnera001)
    My problem now is that the XML structure does not have that node "element1".
    Instead of "element1", there is the node "item" in my XML structure in Word 2007.
    I guess that is why the Btpartneraddress fields are not being filled in my Word document.
    Can anyone help me, or point me in the right direction so that I can change the XML?
    Thanks for your help
    André

    Hi André, I guess the issue comes from the fact that you selected "AllPartiesInvolved", which may contain any number of entries. So when you test your web service, you supply a key and then get a result for that key, and in that case you might get "element1" through "element3", for example, if there were 3 partners involved in your activity.
    But when you design your template, you don't have a key at that moment, so in the web service structure you have "item", which stands for all the possible entries you might retrieve at runtime. I guess you could use an index in your template to specify which item you need, but this is quite hazardous, so if I were you, I would not design my web service around "AllPartiesInvolved" but rather around a specific partner type, like the contact person for instance.
    Regards,
    Xavier

  • Strange Behavior with SQL/XML

    Our university has had, for quite some time now, a rather difficult situation with a very significant course handbook website that creates HTML via database XSL transformation of XML content generated with SQL/XML. The database is 10.2.0.2.0. The HTML is passed via a distributed database link to an Oracle Portal dynamic portlet for display. Some of the derived XML originates from XML Schema registered instance documents and some from relational storage.
    Let me explain. Occasionally, after a period of operating (could be one week, sometimes even less), we will experience a problem whereby our derived XML content (just prior to the PL/SQL database XSL transformation) will develop a parsing problem. Once this happens to one XSL transform of a course, it subsequently happens for every course viewed/transformed thereafter.
    Recently I experienced the problem and then added the XMLNS argument to the extract function. That seemed to fix things and I thought I had it licked.
    Today we started to get the same problem again. The error reported was:
    ORA-31011: XML parsing failed
    ORA-19202: Error occurred in XML processing
    LPX-00242: invalid use of ampersand ('&') character (use &amp;)
    Error at line 1
    Now this might have been reasonable for one course, but every course reported the same problem thereafter without fail. I managed to fix the issue, but in a way that really doesn't seem to make sense. What I did was alter the SQL/XML statement by adding in an extra XMLELEMENT and then recompile the function that returns an XMLTYPE based on the execution of the SQL statement. After compiling the function, everything returned to normal.
    This doesn't seem to make sense.
    I am pretty desperate for help on this. I really don't know how to progress the solution. I don't get much joy from Oracle Support as our application covers too many high tech areas. This is really shaking confidence in our organization's use of Oracle technology.

    Hi Mark.
    We've isolated the problem a bit more now and there is a support SR (6089662.994). At this stage we are unable to reproduce - I am testing out an export though. The problem only occurs when a certain condition has been reached. Once that happens then extraneous characters are produced in the XMLTYPE feed returned by a function. We are not at all sure about how to make this condition occur. This doesn't just happen with one SQL/XML cursor - I've seen it happen with 3 different cursors. All are located in the same function that returns an XMLTYPE.
    We are convinced the problem lies with one or more SQL/XML cursors located in the same function. Once one of these cursors malfunctions it will produce XML that is not well formed OR on occasion has content omitted but is well formed (there often seem to be extraneous characters). When the XML content is not well formed then the parsing stage (dbms_xmlparser.parseclob(myparser, xml_clob)) of XSL transformation crashes. This error then propagates to the dynamic portlet in Oracle Portal thus removing the content on the page. This happens for every execution of the cursor while the condition manifests, the only difference between calls being bind variables.
    The interesting thing is that the error if left to itself will eventually stop. The cursor seems to right itself eventually. On the last occurrence, the problem commenced at 4.30am and went for 3.5 hrs and then seemed to right itself.

  • Temp tables with a XML target

    Hi guys,
    I posted a question last week about sorting an element while generating an XML file but didn't get any answers; maybe my question wasn't clear. Here is another problem with the XML technology, which is beginning to upset me because I have no idea how to solve it.
    To sum up, my XML schema named QST is stored in the in-memory engine and not in an external database. I have a package with a set of interfaces, one for each XML table, and a final CreateXMLFile step.
    Most of the time, my package runs well but sometimes I get this error in one interface:
    Step 1:
        Query:
        drop table QST.C$_0QUESTION
        Warning message:
        0 : null : java.sql.SQLException: Table not found: QST_C$_0QUESTION in statement [drop table QST_C$_0QUESTION]
    Step 2:
        Query:
        create table QST.C$_0QUESTION ( .... )
        Error message:
        0 : null : java.sql.SQLException: The table C$_0QUESTION already exists in the schema QST
    This problem can occur at the creation of any working table C$, I$, E$ or SNP_CHECK_TAB.
    So it seems to be linked to the built-in memory engine, but what drives me mad is that I can't find a pattern to reproduce the error.
    Any idea would be greatly appreciated and I really hope this post will be more "inspiring" than my first one :D
    Thanks,
    Thierry

    Hi Rathish,
    thanks a lot for the time you spent helping me.
    As you say, the table name appears with an underscore in the error message... it's strange, but I think it's just a small bug in how ODI generates the error message. I say so because the package runs successfully most of the time and the XML I get is well built. But maybe not, so to answer your question:
    the KM I use is the original IKM SQL Incremental Update, as indicated in the ODI user guide,
    so there shouldn't be any problem.
    Best regards,
    Thierry

Maybe you are looking for

  • Function module throwing error from work area. Cannot find the problem...

    REPORT  ZPSMARTFORM1. tables: zptable1. types: begin of ty_zptable1,       f1 type zf1,       f2 type zf2,       f3 type zf3, end of ty_zptable1. data: itab type table of ty_zptable1 with header line. data: wa type ty_zptable1. select f1 f2 f3 from z

  • How Can I Setup Multiple External Editors In iPhoto'09? (8.12)

    My daughter has cortical visual impairment and we are trying to modify photos so that she can get more out of them.  Just about everything we need to do can be done in one program like photoshop but it can be tedious and if you know anything about ca

  • G5 Power Mac - os x

    My son-in-law bought an Apple G5 and it has bootleg software. I would like to buy 10.5 with iLife, iWork, etc.  I found a sealed copy on eBay.  Do you know of any problems with installing this software?

  • Concatenating two columns in obiee 11g

    Hi, I am getting a syntax error when I concatenate two columns from the same table, and I think the syntax is correct. Below is the column formula: (cast("Dim Stock Market"."Days High" as varchar))||'/'||(cast("Dim Stock Market"."Days Low" as varchar)) e

  • Log in Items not 'Sticking'.

    Howdy folks. Yes, I have taken the plunge from the PC world into the cool new world of Apple Macbook Pro Lion. I am also loving my iPhone 4s, iPad 2, Apple TV, etc. However, As I have been getting familiar with the OS X Lion, I have run into a bug wh