IDOC column order not the same as in the data source

Hi,
I am from a non-SAP background. We are using Informatica to pull data from SAP ECC using the Business Content for Integration, by pulling data from IDocs. Here is my problem:
1) We identified a particular data source (0FI_GL_4) for a full-mode data pull using Informatica. However, during extraction we found that the order of ports (columns) in the data source and in the generated IDoc is not the same. As a result, the loads are failing with data conversion or mismatch errors.
The question is: how do we ensure that the column order in the generated IDoc is the same as in 0FI_GL_4?
Thanks,
R.

Hi,
Please check the link below; it may be useful.
http://wiki.ittoolbox.com/index.php/Re-Connect_R/3_and_BW
Reg,
Venkat

Similar Messages

  • Enter a valid basic type - BW IDoc type RSSEND is not the same as the source system IDoc

    Hi,
    I am connecting the ERP system to the BW system, and am getting these errors:
    CHECK (context menu, source system):
        BW unknown in source system     (RSAR 8)
        The BW IDoc type RSSEND is not the same as the source system IDoc type     (RSAR 371)
    RESTORE (context menu, source system):
    Information: Enter a valid basic type
    Information help:
    Enter a valid basic type
        Message no. E0434
    Diagnosis
        An entry with the key /BD2CLNT100/LS//RSSEND//// is to be inserted in table EDP13.
        The value 'RSBEND' is not valid for field IDOCTYP.
    Procedure
        Check whether basic type RSBEND exists.
    Help required.

    Hi,
    Please check the link below; it may be useful.
    http://wiki.ittoolbox.com/index.php/Re-Connect_R/3_and_BW
    Reg,
    Venkat

  • Custom sort pivot table columns with Essbase as the data source

    Is it possible to sort columns in a pivot table according to an arbitrary value that I define when the data is coming from Essbase?
    For example, say I have a dimension called Soda, with values Coke, Diet Coke, Dr. Pepper and Diet Dr. Pepper. I create a report with a sales measure with the measure labels on the rows and the Soda dimension on the column. By default the columns will be sorted alphabetically:
              Coke        Diet Coke   Diet Dr. Pepper   Dr. Pepper
    Sales     1M          .5M         .75M              1.25M
    I want to create a report that looks like this:
              Coke        Diet Coke   Diet Dr. Pepper   Dr. Pepper
    Sales
    I think I could do this if the source was relational just by creating bins or creating a custom column with a case statement that assigns each Soda an arbitrary value and then sort on this value. Everything I've tried with Essbase as the source, though, results in:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 42043] An external aggregate is found in an outer query block. (HY000)
    Any ideas?

    Hi,
    1. You can try to solve the 'An external aggregate is found in an outer query block' error by changing the aggregation rule for your measure in both the physical and business layers.
    By default it is set to Aggr_External - change it to Sum.
    In the physical layer: Column properties -> Aggregation rule.
    In the business model: Column properties -> Aggregation tab -> Default aggregation rule.
    This may change the result - after changing it, check whether you still get correct values.
    2. Also, in case the desired order is the same as the order of members in the Essbase cube and you want to keep Aggr_External, you can create a calculated column that will help you with the sort (a sketch of the underlying idea follows the link below).
    See http://oraclebizint.wordpress.com/2008/04/28/oracle-bi-ee-101332-handling-sort-order-in-hyperion-essbase-931-evaluate-and-mdx/
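    For illustration, the CASE-based sort key that the relational approach described in the question relies on would look something like this in plain SQL; the table and column names are purely hypothetical, and against Essbase the calculated column from the link above would play the same role:
    -- Hypothetical table soda_sales(soda, sales): give each member an explicit
    -- rank with a CASE expression and sort on that instead of the member name.
    SELECT soda,
           sales,
           CASE soda
             WHEN 'Coke'            THEN 1
             WHEN 'Dr. Pepper'      THEN 2
             WHEN 'Diet Coke'       THEN 3
             WHEN 'Diet Dr. Pepper' THEN 4
             ELSE 99
           END AS sort_key
      FROM soda_sales
     ORDER BY sort_key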
    Hope this helps,
    Alex

  • Same layout, different data source

    Hello,
    Here is the setup :
    I have a couple of reports whose data is fetched from two different sources: a database and XML files.
    1. The layouts for both of them are exactly the same; just the data sources are different.
    2. I have created a top-level frame for each, enclosing all data fields for the respective data source.
    3. Based on a parameter being passed, I decide which top-level frame to hide/display, using a format trigger.
    4. The top-level frame enclosing the XML source fields is located 1 inch below (physically, not in terms of layers) the top-level frame enclosing the database source fields.
    Now the problem :
    When the top-level frame bound to the database data source is hidden, the top-level frame enclosing the XML source fields should move up automatically. This is not happening.
    I tried anchoring the top-level XML frame to the top-level database frame and setting 'Keep With Anchoring Object' = Yes on the top-level XML frame. That did not help either.
    Can anybody provide any insights?
    Thanks.

    An additional detail:
    The main section of my report layout has 2 vertical panels.
    My top-level frame fetching data from the database is on the first panel, and the top-level frame fetching data from the XML data source is on the second panel.
    If I try to keep both frames on the same panel, the second frame extends out of the panel boundary and is then tough to edit.

  • Why do columns not appear in the same order as the fields on my imported form?

    Hello All
    I am an Acrobat X user and make numerous forms through Acrobat. However, when I import a file into FormsCentral, the order of the fields in the table does not correspond to the order in which they appear on my form. The fields on my form are in a calendar format, so they appear from left to right. I'm wondering whether, when I do the import, the form is read by FormsCentral from top to bottom instead, or whether it is random. It's a real inconvenience and severely inhibits my workflow when I have to rearrange the columns to work with the data. Any suggestions or insight would be appreciated.
    Thanks
    Daryl

    Files have no order. File management programs like the Finder or Windows Explorer will display files according to your selected display sort - typically alpha by file name, date created, size, or date modified. So to view them in a selected order you need to sort on an attribute that accomplishes that. The easiest approach is usually to export the photos using the sequential naming option and then display them in alpha order.
    LN

  • I'd copied some pictures from a PC to a folder in Finder in my new iMac. The 'Date Created' in iMac is not the same as the 'Date Created' in the PC. Why? Is there a way to fix that?

    I'd copied some pictures from a PC to a folder in Finder in my new iMac. The 'Date Created' in iMac is not the same as the 'Date Created' in the PC. Why? Is there a way to fix that?

    View Menu -> Sort Photos is a good way to start.
    Some organising possibilities in iPhoto:
    I use Events simply as big buckets of Photos: Spring 08, July - Nov 06 are typical Events in my Library. I use keywords and Smart Albums extensively. I title the pics broadly.
    I keyword on a
    Who
    What
    Where basis (the When is in the photo's Exif metadata). I also rate the pics on a 1 - 5 star basis.
    Using this system I can find pretty much any pic in my 50k library in a couple of seconds.
    So, for example, I have a batch of pics titled 'Seattle 08' and a typical keywording might include: John, Anne, Landscape, mountain, trees, snow. With a rating included it's very easy to find the best pics we took at Mount Rainier.
    File -> New Smart Album
    set it to 'All'
    title contains Seattle
    keyword is mountain
    keyword is snow
    rating is 5 stars
    Or, want a chronological album of John from birth to today?
    New Smart Album
    Keyword is John
    Set the View options to Sort By Date Ascending
    Want only the best pics?
    add Rating is greater than 4 stars
    The best thing about this system is that it's dynamic. If I add 50 more pics of John  to the Library tomorrow, as I keyword and rate them they are added to the Smart Album.
    In the end, organisation is about finding the pics. The point is to make that pic or batch of pics findable fast. This system works for me.

  • Not getting all the data

    Hi,
    I have the below query, but the output data for the Unavailable column is not showing all the data with a condition_id that's not null.
    Does anyone have any ideas that I might need to try?
    SELECT I.SKU_ID,
    I.DESCRIPTION,
    TO_CHAR(I.EXPIRY_DSTAMP, 'DD-Mon-YYYY') expiry_dt,
    SUM(CASE WHEN I.CONDITION_ID IS NOT NULL THEN QTY_ON_HAND ELSE 0 END) AS Unavailable,
    SUM(CASE WHEN I.CONDITION_ID IS NULL THEN QTY_ON_HAND ELSE 0 END) AS Available,
    P_ILV.SUM_QTY_DUE + (NVL(P_ILV.SUM_TL,0) * NVL(S.USER_DEF_NUM_3,0)) AS Sum_Qty_Due,
    ROUND(ITXN_ILV.AVG_QTY, 2) AS Avg_Qty,
    ROUND(SUM(CASE WHEN I.CONDITION_ID IS NULL THEN QTY_ON_HAND ELSE 0 END)/CASE WHEN ITXN_ILV.AVG_QTY = 0 THEN 1 ELSE ITXN_ILV.AVG_QTY END,2) AS Days_Worth_Stock
    FROM INVENTORY I
    JOIN SKU S
    ON (I.SKU_ID = S.SKU_ID AND I.DESCRIPTION = S.DESCRIPTION)
    JOIN (SELECT P.SKU_ID, SUM (CASE WHEN P.TRACKING_LEVEL NOT LIKE 'C%' THEN P.QTY_DUE ELSE 0 END) AS SUM_QTY_DUE,
      SUM(CASE WHEN P.TRACKING_LEVEL LIKE 'C%' THEN P.QTY_DUE ELSE 0 END) SUM_TL
    FROM PRE_ADVICE_LINE P
    WHERE P.QTY_RECEIVED IS NULL
    GROUP BY P.SKU_ID) P_ILV
    ON (S.SKU_ID = P_ILV.SKU_ID)
    JOIN (SELECT ITXN.SKU_ID, SUM(ITXN.UPDATE_QTY)/(CEIL(TO_DATE($P{To_Date},'DD-Mon-YYYY') - TO_DATE($P{From_Date},'DD-Mon-YYYY')) + 1) AVG_QTY
    FROM INVENTORY_TRANSACTION ITXN
    WHERE ITXN.CODE = 'Shipment'
    AND ITXN.DSTAMP BETWEEN TRUNC(TO_DATE($P{From_Date},'DD-Mon-YYYY')) AND TRUNC(TO_DATE($P{To_Date},'DD-Mon-YYYY')+ 1) - 1/86400
    GROUP BY ITXN.SKU_ID) ITXN_ILV
    ON (S.SKU_ID = ITXN_ILV.SKU_ID)
    GROUP BY TO_CHAR(I.EXPIRY_DSTAMP, 'DD-Mon-YYYY'), I.SKU_ID, I.DESCRIPTION, P_ILV.SUM_QTY_DUE, NVL(P_ILV.SUM_TL,0), NVL(S.USER_DEF_NUM_3,0), ITXN_ILV.AVG_QTY, 2
    ORDER BY I.SKU_ID
    Thanks, Sam.
    Edited by: Sam Mardell on 08-May-2009 06:25

    OK Sam, one thing I would question in the JOIN between INVENTORY and SKU is the join on the DESCRIPTION column - I would reckon that SKU_ID should be enough (and it's not good design to have the DESCRIPTION in more than one place). I think that could be causing this issue. I've also included zeroes for NULLs in this.
    Try:
    SELECT I.SKU_ID,
         I.DESCRIPTION,
         TO_CHAR(I.EXPIRY_DSTAMP, 'DD-Mon-YYYY') expiry_dt,
         SUM(CASE WHEN TRIM(I.CONDITION_ID) IS NOT NULL THEN QTY_ON_HAND ELSE 0 END) AS Unavailable,
         SUM(CASE WHEN TRIM(I.CONDITION_ID) IS NULL THEN QTY_ON_HAND ELSE 0 END) AS Available,
         NVL(P_ILV.SUM_QTY_DUE,0) + (NVL(P_ILV.SUM_TL,0) * NVL(S.USER_DEF_NUM_3,0)) AS Sum_Qty_Due,
         NVL(ROUND(ITXN_ILV.AVG_QTY, 2),0) AS Avg_Qty,
     ROUND(NVL(SUM(CASE WHEN I.CONDITION_ID IS NULL THEN QTY_ON_HAND ELSE 0 END),0)/CASE WHEN ITXN_ILV.AVG_QTY = 0 THEN 1 ELSE ITXN_ILV.AVG_QTY END,2) AS Days_Worth_Stock
      FROM INVENTORY I
      JOIN SKU S
       ON (I.SKU_ID = S.SKU_ID)
      LEFT JOIN (SELECT P.SKU_ID, SUM (CASE WHEN P.TRACKING_LEVEL NOT LIKE 'C%' THEN P.QTY_DUE ELSE 0 END) AS SUM_QTY_DUE,
          SUM(CASE WHEN P.TRACKING_LEVEL LIKE 'C%' THEN P.QTY_DUE ELSE 0 END) SUM_TL
        FROM PRE_ADVICE_LINE P
       WHERE P.QTY_RECEIVED IS NULL
       GROUP BY P.SKU_ID) P_ILV
       ON (S.SKU_ID = P_ILV.SKU_ID)
    LEFT JOIN (SELECT ITXN.SKU_ID, SUM(ITXN.UPDATE_QTY)/(CEIL(TO_DATE($P{To_Date},'DD-Mon-YYYY') -  TO_DATE($P{From_Date},'DD-Mon-YYYY')) + 1) AVG_QTY
          FROM INVENTORY_TRANSACTION ITXN
       WHERE ITXN.CODE = 'Shipment'
          AND ITXN.DSTAMP BETWEEN TRUNC(TO_DATE($P{From_Date},'DD-Mon-YYYY')) AND TRUNC(TO_DATE($P{To_Date},'DD-Mon-YYYY')+ 1) - 1/86400
         GROUP BY ITXN.SKU_ID) ITXN_ILV
       ON (S.SKU_ID = ITXN_ILV.SKU_ID)
    GROUP BY TO_CHAR(I.EXPIRY_DSTAMP, 'DD-Mon-YYYY'), I.SKU_ID, I.DESCRIPTION, NVL(P_ILV.SUM_QTY_DUE,0), NVL(P_ILV.SUM_TL,0), NVL(S.USER_DEF_NUM_3,0), NVL(ROUND(ITXN_ILV.AVG_QTY, 2),0)
    ORDER BY I.SKU_ID
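    To make the effect of those join changes concrete, here is a stripped-down sketch using only the SKU and PRE_ADVICE_LINE columns from the query above: with the original INNER JOIN, a SKU that has no open pre-advice line disappears from the result entirely, whereas the LEFT JOIN plus NVL keeps it with a zero quantity.
    -- Stripped-down sketch (same tables/columns as above): the LEFT JOIN keeps
    -- SKUs with no open pre-advice lines; NVL turns the resulting NULL into 0.
    SELECT S.SKU_ID,
           NVL(SUM(P.QTY_DUE), 0) AS SUM_QTY_DUE
      FROM SKU S
      LEFT JOIN PRE_ADVICE_LINE P
        ON P.SKU_ID = S.SKU_ID
       AND P.QTY_RECEIVED IS NULL
     GROUP BY S.SKU_ID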

  • Newly added field in the data Source not getting populated

    Hello All,
    We have added a few fields to the Data Source. The Data Source is based on an InfoSet. We have included the fields in the InfoSet and have updated the code to fetch the values for the newly added fields.
    When we perform the test extraction for this Data Source in RSA3, the newly added fields are not getting populated with values. In the system-generated query's selection list, the newly added fields are not selected.
    Please let me know how to get the newly added fields selected in the system-generated query of the InfoSet.
    Regards,
    -Purnima

    Hi,
    As you said, you have added the field in the InfoSet. Have you included the same field in the data source? Try that if not.
    If you are trying to create a query in the source system to check the data - I guess in R/3 (ECC) - then you have to include the new field in the selection criteria (there is an option available in the top menu).
    I would suggest that before creating any query you go to RSO2, select the data source, display the field structure and check whether the field is there or not. You may also see InfoSet-level data directly via the data display from the top menu. Try that option as well to check.
    I hope it will help.
    THanks,
    S

  • BI publisher report is not showing all the data

    Hi All,
    I have created a report using BI Publisher in R12. The report is not showing all the records.
    I have checked the result XML; it also does not have all the data. My query returns 846 rows but my report only has 662 rows.
    What might be the issue? Please give me some ideas to resolve it.
    Thanks in advance.
    Regards,
    P.Kalidoss

    Hi Arun,
    In the following code:
    public SelectItem[] getAllPrinters() {
        if (allPrinters == null) {    // allPrinters is not defined here - what type of object is it?
            PrintService[] printers = PrintServiceLookup.lookupPrintServices(null, null);
            allPrinters = new SelectItem[printers.length];
            for (int i = 0; i < printers.length; i++) {
                SelectItem printer = new SelectItem(printers[i].getName(), printers[i].getName());
                allPrinters[i] = printer;
            }
        }
        return allPrinters;
    }
    the variable allPrinters is not defined. What type of object is it?
    And the same variable is also referenced here:
    <af:selectOneChoice label="Available Printers" partialTriggers="cb1"
                        value="#{pageFlowScope.applicationPrinterBean.selectedPrinter}"
                        id="soc1"
                        autoSubmit="true">
        <f:selectItems value="#{pageFlowScope.applicationPrinterBean.allPrinters}" id="si1"/>
    </af:selectOneChoice>
    Thanks.

  • JDBC sender channel running but not picking up the data from sp

    Hi,
    One of the JDBC sender channels in production is running at its scheduled time, but it is not picking up the data from the SQL side. We have checked with the SP side and they say the SP is running fine. No changes have been made to its configuration. The last message coming in RWB is 'Retry interval started', but that interval is 1 day and it has already been 3 days. I tried starting and stopping the channel, but to no avail. The channel was reactivated, but that also didn't help.
    Please help - what can be the reason for this?
    thanks.

    Hi,
    The JDBC adapter (the respective channel) is most likely locked in PI. Ideally, a lock is created for each polling interval, and once the processing is over the lock should be released/deleted automatically to allow the next polling interval. If the lock is not released by the system automatically, further polling will not happen as expected. (This may affect all sender JDBC adapters as well; I would recommend checking all sender JDBC communication channels.)
    You can see/delete the locks in Visual Admin.
    Go to Server -> Services -> Locking Adapter and click Refresh.
    The entries for the JDBC adapter (with the name $XIDBAD.JDBC2XI) should be deleted by selecting those particular entries and clicking 'Delete selected locks'.
    If you have more than one node, the same should be done on all server nodes.
    A temporary workaround would be creating/copying the existing channel in ID with the same properties and assigning it to the particular sender agreement.
    But the lock may be created again, which could potentially stop all your database interfaces. Hence I would suggest using 'Disconnect From Database After Processing of Each Message' in the Advanced tab of the sender JDBC adapter.
    Hope this solves your issue.
    PS: The same behaviour would be expected for all file adapters as well.

  • Persistent Store does not show the Data Source in WebLogic 12.1.3

    Hi All,
    I have installed WebLogic 12.1.3 on my Windows machine using Java 8, and am trying to configure a Data Source and a Persistent Store (of JDBC type).
    I'm able to create a Generic Data Source of type 'Oracle's Driver (Thin) for Instance connections; Versions: Any'. The test connection was also successful.
    When I proceed with the creation of a Persistent Store of JDBC type, the Data Source I created is not populated in the drop-down next to the Data Source label.
    Any suggestions as to why this is happening, or what I need to do to fix this issue?
    I'm unable to proceed further with my configuration because of this.

    Hi,
    In order to use the JDBC persistence store you have to use a non-XA JDBC driver for your Data Source.
    http://docs.oracle.com/cd/E23943_01/web.1111/e13701/store.htm#CNFGD221
    Point 13: When configuring a connection pool to use with the WebLogic JMS JDBC Store, use non-XA database drivers.
    http://middlewaremagic.com/weblogic/?p=586
    Hope it helps

  • Primarily using the iPad Air as an art portfolio, but photos I upload in order get rearranged. Other than changing the date they were shot, is there any solution to this? Can they be reorganized in iPhoto on the iPad?

    Primarily using the iPad Air as an art portfolio, but photos I upload in order get rearranged. Other than changing the date they were shot, is there any solution to this? Can they be reorganized in iPhoto on the iPad?

    You are welcome.
    Sorting and organizing photos on iOS devices is one of the things that really needs to be improved. It looks like the developers simply did not expect users to want to keep large photo libraries on an iOS device, where the ordering would matter. I'd send feedback to the developer team so that they see what users want and need:
    You can use this form:  Apple - iPhoto - Feedback

  • HP 34401A - When I run the Read Meas.vi, it gives an error if I do not turn off the data storage

    When I run the Read Meas.vi, it gives an error if I do not turn off the data storage. The same thing happens when I run the App. Example.vi. Does anyone know how to solve this problem? Thanks a lot.
    KL

    LoganS wrote:
    Hi KL,
    The Read Meas.vi is one of the subvi's in the App. Example.vi, so this problem is most likely the same problem in each case. From the help for Read Meas.vi:
    Data Storage instructs the device to store the data to be sent to either the internal or external buffer. If TRUE (default), the VI stores data in the on mode. If FALSE, the VI does not retain data in the off mode. Use the off mode only with the average min/max operation when you do not need to retrieve data. You cannot configure the meter for external buffering in the off mode.
    So the question is, are you trying to retrieve any data? If so, then as indicated in the above paragraph, you cannot retrieve data and have the data storage turned off. Good luck!
    Logan S.
    Yes, I need to retrieve data, so because of that problem I cannot really get any data. I am not sure whether the problem is due to the USB GPIB or not. But once I click the run arrow, everything goes fine from initialise, measurement... once it comes to Read Measurement, it gives an error (something like: VISA Wait on Event for RQS.vi->HP34401A Read Meas.vi->Untitled 1). I have no idea what it means.
    KL

  • Not able to pass data from one component to another component

    Hello All
    I am not able to pass data from one component to another component.
    I have done it like this:
    1) The main component (parent component) has the two child components below, embedded as used components.
    2) A Search component and a Details component.
    3) The Search component has buttons, say a button "X"; on clicking the button I navigate to the Details component view through FPM.
    4) When I click the button "X", I raise an event to call the parent business logic method. There I get a structure with values, bind this structure to the node, and map this node to the Details component interface node. FYI: I set a breakpoint there - the structure does have data, and I set the static attributes table on the node instance.
    5) In the Details component the node data is not coming through, i.e. it is empty.
    Thanks in advance.
    Br-
    CW
    Edited by: CarlinWilliams on Jul 4, 2011 9:21 AM

    Hi,
    When you use input Ext., check that the parent component is not used as a used component in the child component.
    Only in the parent component should the child components be used as used components, and the usage has to be created for the child components.
    The binding of the node should be done from the component controller of the parent component to the child node, by which you will be able to see a double arrow against the node. This should work.
    Thanks,
    Shailaja Ainala.

  • JDODataStoreException: The instance null does not exist in the data store

    I'm unable to figure out how this exception occurs.
    I have a class IDCounter which has a number of fields such as
    'm_Name' (String)
    'm_AccountName' (String)
    'm_UserName' (String)
    'm_Description' (String)
    'm_CreationDate' (Date)
    'm_LastModifiedDate' (Date)
    'm_DeletedDate' (Date)
    'm_Count' (long)
    The filter I'm using is "m_AccountName == \"test\" && m_UserName == \"test\" && m_DeletedDate == null"
    The generated SQL statement is "SELECT t0.M_IDX, t0.JDOCLASSX,
    t0.JDOLOCKX, t0.M_ACCOUNTNAMEX, t0.M_CREATIONDATEX, t0.M_DELETEDDATEX,
    t0.M_DESCRIPTIONX, t0.M_LASTMODIFIEDDATEX, t0.M_NAMEX, t0.M_USERNAMEX,
    t0.M_COUNTX FROM ABSTRACTENTITYX t0 WHERE ((t0.M_DELETEDDATEX IS NULL) AND
    t0.JDOCLASSX = 'com.ewarna.pdm.entities.IDCounter')"
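    Note that the generated WHERE clause above contains only the M_DELETEDDATEX and JDOCLASSX predicates. For comparison, if the account and user terms of the filter were pushed down to the database as well (rather than being evaluated in memory, which is an assumption here), the statement would be expected to end in something like:
    -- Sketch only: column names taken from the generated statement above,
    -- literal values taken from the JDOQL filter.
    SELECT t0.M_IDX
      FROM ABSTRACTENTITYX t0
     WHERE t0.M_ACCOUNTNAMEX = 'test'
       AND t0.M_USERNAMEX = 'test'
       AND t0.M_DELETEDDATEX IS NULL
       AND t0.JDOCLASSX = 'com.ewarna.pdm.entities.IDCounter'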
    Exception Trace:
    javax.jdo.JDODataStoreException: The instance null does not exist in the
    data store.
         at
    com.solarmetric.kodo.impl.jdbc.runtime.LazyResultList.instantiateRow(LazyResultList.java:165)
         at
    com.solarmetric.kodo.impl.jdbc.runtime.LazyResultList.get(LazyResultList.java:96)
         at java.util.AbstractList$Itr.next(AbstractList.java:416)
         at
    com.solarmetric.kodo.runtime.ResultListIterator.next(ResultListIterator.java:49)
         at
    com.solarmetric.kodo.impl.jdbc.runtime.ResultListFactory.createResultList(ResultListFactory.java:85)
         at
    com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.executeQuery(JDBCStoreManager.java:646)
         at
    com.solarmetric.kodo.impl.jdbc.runtime.JDBCQuery.executeQuery(JDBCQuery.java:150)
         at com.solarmetric.kodo.query.QueryImpl.executeWithMap(QueryImpl.java:580)
         at com.solarmetric.kodo.query.QueryImpl.execute(QueryImpl.java:428)
         at
    com.solarmetric.kodo.query.QueryImpl$SynchronizedQuery.execute(QueryImpl.java:1331)
         at
    com.ewarna.pdm.sessions.persistence.BasicQuery.getByAdvancedFormula(BasicQuery.java:78)
         at
    com.ewarna.pdm.sessions.persistence.BasicQuery.getByFormula(BasicQuery.java:119)
         at
    com.ewarna.pdm.sessions.persistence.BasicQuery.getByFormula(BasicQuery.java:95)
         at
    com.ewarna.pdm.sessions.persistence.BasicQuery.getAll(BasicQuery.java:131)
         at
    com.ewarna.pdm.sessions.persistence.GenericEntityManager$7.execute(GenericEntityManager.java:305)
         at
    com.ewarna.pdm.sessions.persistence.GenericEntityManager.execute(GenericEntityManager.java:251)
         ... 18 more

    You can no longer display a workbook. You receive an error message when opening it: <Internal error>: 1201 document storage
    Cause and prerequisites
    In very rare cases, when you store a workbook, you might not be able to open it again.
    Solution
    Function module BDS_PHIOS_GET_RIGHT has to be changed so that the last available version of the Workbooks can be displayed.
