Issue with 2LIS_02_SCL Data source

Hi Everybody,
I am facing the two issues below with the 2LIS_02_SCL data source:
1) It is fetching only records with ETENR (Delivery Schedule Line Counter) value '1' and ignoring the others (e.g. 2, 3 and 4). As a result, the data does not reconcile with the ECC system.
2) The standard field GLMNG is not getting any data, even though the data exists at table level (EKET). I have written code in the exit and the field is now being filled. The problem is that this code does not seem to take the ROCANCEL indicator into account: all the other key figures come in with a negative sign when ROCANCEL is 'X' or 'R', but this field always comes in positive, regardless of the ROCANCEL indicator, so the values are incorrect compared to ECC.
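I assume the fix is to apply the same sign logic in my enhancement code. A rough sketch of what I have in mind is below; it assumes the field is filled in the customer exit EXIT_SAPLRSAP_001 (include ZXRSAU01) and that the extract structure is MC02M_0SCL, so names may differ:

* Sketch only: negate the custom-filled GLMNG for cancellation /
* reversal records, mirroring how the standard key figures arrive.
DATA: l_s_scl TYPE mc02m_0scl.

CASE i_datasource.
  WHEN '2LIS_02_SCL'.
    LOOP AT c_t_data INTO l_s_scl.
      " ... existing code that fills l_s_scl-glmng from EKET ...
      IF l_s_scl-rocancel = 'X' OR l_s_scl-rocancel = 'R'.
        l_s_scl-glmng = l_s_scl-glmng * -1.
      ENDIF.
      MODIFY c_t_data FROM l_s_scl.
    ENDLOOP.
ENDCASE.

The idea is simply that the custom-filled quantity receives the same inversion for cancellation/reversal records that the standard key figures already get.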
Can anybody help me with this?
Regards,
Gopinath

Hi Gopinath:
   Have you already applied any SAP Note to solve this problem?
Please check if the SAP Note below is applicable to your system.
668177 - "LIS BW: wrong quantity for documents with invoice plan"
Regards,
Francisco Milán.

Similar Messages

  • Performance issue with Oracle data source

    Hi all,
    I've a rather strange problem that I'm stuck on and need some assistance with.
    I have a rules file which drags data in via an SQL data source that's an Oracle server. If I cut/paste the three sections of "select", "from" and "where" into SQL Developer and run the query, it takes less than 1 second to complete. When I run the "load data" with this rules file, or even use "Retrieve" while editing the rules file, it takes up to an hour to complete/retrieve the data.
    The table being used has millions of rows and I'm using one of the indexed fields to retrieve the data. It's as if the Essbase rules file is ignoring the index, or I have a config issue with the ODBC settings on the server that is causing the problem.
    ODBC.INI file entry for the Oracle server as follows (changed any sensitive info to xxx or 999).
    [XXX]
    Driver=/opt/data01/hyperion/common/ODBC-64/Merant/5.2/lib/ARora22.so
    Description=DataDirect 5.2 Oracle Wire Protocol
    AlternateServers=
    ApplicationUsingThreads=1
    ArraySize=60000
    CachedCursorLimit=32
    CachedDescLimit=0
    CatalogIncludesSynonyms=1
    CatalogOptions=0
    ConnectionRetryCount=0
    ConnectionRetryDelay=3
    DefaultLongDataBuffLen=1024
    DescribeAtPrepare=0
    EnableDescribeParam=0
    EnableNcharSupport=0
    EnableScrollableCursors=1
    EnableStaticCursorsForLongData=0
    EnableTimestampWithTimeZone=0
    HostName=999.999.999.999
    LoadBalancing=0
    LocalTimeZoneOffset=
    LockTimeOut=-1
    LogonID=xxx
    Password=xxx
    PortNumber=1521
    ProcedureRetResults=0
    ReportCodePageConversionErrors=0
    ServiceType=0
    ServiceName=xxx
    SID=
    TimeEscapeMapping=0
    UseCurrentSchema=1
    Can anyone please advise on this lack of performance?
    Thanks in advance
    Bagpuss

    One other thing that I've seen is that if your Oracle data source and Essbase server are in different geographic locations, you can get some delay when it retrieves data over the WAN. I guess there is some handshaking going on when passing the data from Oracle to Essbase (either by record or groups of records) that is slowed WAY down over the WAN.
    Our solution to this was to take the query out of the load rule, run it via SQL*Plus on a command line at the geographic location where the Oracle database is, then ftp the resulting file to where the Essbase server is.
    With upwards of 6 million records being retrieved, it took around 4 hours in the load rule, but running the query via command line took 10 minutes, then the ftp took less than 5.

  • Data load issue with export data source - BW 3.5

    Hi,
    We are facing issues in loading data with the help of an export data source.
    We have created an export data source on the 0PCA_C01 cube. With the help of this export datasource, we are loading data into another custom cube. The scenario works fine on the development server.
    But after we transported the objects to the quality server, data is not getting loaded into the custom target cube.
    It is extracting zero records. All transports are OK, and we generated the export datasource in quality before the transports. We also regenerated the export datasource after the transport and activated the InfoSource and update rules via the RS* programs. Every object is active, but data is not getting extracted.
    RSA3 for the 80PCA_C01 datasource isn't extracting any records in Quality, while records are extracted in Development. We are on BW 3.5 with patch level 19.
    Please guide us to resolve the issue.
    Thanks,
    Aditya

    Hi
    Make sure that you have the relevant roles & authorizations in Quality/PRS.
    You have to transport the source cube first and then generate the export data source in QAS. Then replicate the data sources for the BW QAS source system and make sure the replicated data source exists in QAS. Only then can you transport the new update rules for the second cube.
    Hope this helps and is clear.

  • 10g Reports issue with XML Data Source

    Hi,
    Has anybody ever encountered an issue with an Oracle 10g report using XML as the data source? What happens is that some of the values in the XML are printed in the wrong column.
    One of the elements in our XML file is a complex type with 10 elements under it. The first five are picked up properly, but the last five are not. Elements #6 to #9 have a minimum occurrence of 0. What happens is that when element #6 is not present but #7 is, the value of element #7 is passed on to element #6.
    The XSD and XSL files are both valid, since the reports were working when we were still using 9i. There is no hidden logic in the report which might cause this issue, i.e., the report just picks up the values from the XML and prints them in the appropriate columns.
    Any help will be greatly appreciated.

    XSD used
    <?xml version="1.0" encoding="UTF-8"?>
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified">
            <!-- trade instructions detail & trailer -->
            <xs:element name="TradeDetail">
                    <xs:complexType>
                            <xs:sequence>
                                    <xs:element ref="TradeType"/>
                                    <xs:element ref="TradeID"/>
                                    <xs:element ref="TradeDate"/>
                                    <xs:element ref="FundID"/>
                                    <xs:element ref="FundName"/>
                                    <xs:element ref="DollarValue" minOccurs="0"/>
                                    <xs:element ref="UnitValue" minOccurs="0"/>
                                    <xs:element ref="PercentageValue" minOccurs="0"/>
                                    <xs:element ref="OriginalTradeID" minOccurs="0"/>
                                    <xs:element ref="CancellationFlag"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <xs:element name="Instruction">
                    <xs:complexType>
                            <xs:sequence minOccurs="0">
                                    <xs:element ref="TradeDetail" maxOccurs="unbounded"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <!-- overall trade instruction message -->
            <xs:element name="InterchangeHeader">
                    <xs:complexType>
                            <xs:sequence>
                                    <xs:element ref="Instruction"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <!-- definition of simple elements -->
            <xs:element name="FundID" type="xs:string"/>
            <xs:element name="TradeType" type="xs:string"/>
            <xs:element name="TradeID" type="xs:string"/>
            <xs:element name="TradeDate" type="xs:string"/>
            <xs:element name="FundName" type="xs:string"/>
            <xs:element name="DollarValue" type="xs:decimal"/>
            <xs:element name="UnitValue" type="xs:decimal"/>
            <xs:element name="PercentageValue" type="xs:decimal"/>
            <xs:element name="OriginalTradeID" type="xs:string"/>
            <xs:element name="CancellationFlag" type="xs:string"/>
    </xs:schema>
    XML used
    <?xml version = '1.0' encoding = 'UTF-8'?>
    <InterchangeHeader xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="TradeInstruction.xsd">
       <Instruction>
          <TradeDetail>
             <TradeType>Purchase</TradeType>
             <TradeID>M000038290</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>ARO0011AU</FundID>
             <FundName>ABN Fund</FundName>
             <DollarValue>2111.53</DollarValue>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
          <TradeDetail>
             <TradeType>Redemption</TradeType>
             <TradeID>M000038292</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>ARO0011AU</FundID>
             <FundName>AMRO Equity Fund</FundName>
             <UnitValue>104881.270200</UnitValue>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
          <TradeDetail>
             <TradeType>ISPurchase</TradeType>
             <TradeID>M000038312</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>MLC0011AU</FundID>
             <FundName>Cash Fund</FundName>
             <OriginalTradeID>M000038311</OriginalTradeID>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
       </Instruction>
    </InterchangeHeader>
    XSLT used
    <?xml version="1.0" encoding="UTF-8"?>
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
            <xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
            <xsl:template match="/">
                    <InterchangeHeader>
                            <xsl:for-each select="InterchangeHeader/Instruction/TradeDetail">
                            <xsl:sort select="FundName"/>
                            <xsl:sort select="TradeDate"/>
                                    <TradeDetail>
                                            <TradeType><xsl:value-of select="TradeType"/></TradeType>
                                            <TradeID><xsl:value-of select="TradeID"/></TradeID>
                                            <TradeDate><xsl:value-of select="TradeDate"/></TradeDate>
                                            <FundID><xsl:value-of select="FundID"/></FundID>
                                            <FundName><xsl:value-of select="FundName"/></FundName>
                                            <DollarValue><xsl:value-of select="DollarValue"/></DollarValue>
                                            <UnitValue><xsl:value-of select="UnitValue"/></UnitValue>
                                            <PercentageValue><xsl:value-of select="PercentageValue"/></PercentageValue>
                                            <OriginalTradeID><xsl:value-of select="OriginalTradeID"/></OriginalTradeID>
                                            <CancellationFlag><xsl:value-of select="CancellationFlag"/></CancellationFlag>
                                    </TradeDetail>
                            </xsl:for-each>
                    </InterchangeHeader>
            </xsl:template>
    </xsl:stylesheet>

  • Issue with Treasury Data Source - 0CFM_DELTA_POSITIONS

    Hi all,
    I'm trying to use data sources 0CFM_INIT_POSITIONS & 0CFM_DELTA_POSITIONS to load treasury data.
    Through the INIT data source the data has been initialized and the history data loaded to BW from R/3. I'm able to see the entry in the time stamp table for the INIT data source. The initialization has also been done for DELTA_POSITIONS, but I couldn't see any entry in the time stamp table, whereas a queue has been created in RSA7.
    After that we made a few transactions in R/3 and tried to take that data through DELTA_POSITIONS. I checked in RSA3: the system fetches more than 5,000 records (which were not created after the initialization), and beyond that it gives a dump saying a termination point has been set in the program.
    Can someone please explain the logic on which these data sources work? Does any other setting need to be done?
    With regards,
    Muthu.


  • Master data load issue with flexible data source

    Hi All,
    I have the data source 1CL_OLIS001 for loading configuration data. The changed records are not being updated into this DS regularly.
    I have a delta load on this every day. It loads correct data for a month, then suddenly starts loading 0 records every day without failing. If I reset the delta, it works fine for a month or two and then gets corrupted again.
    Has anyone had this kind of issue before? Please suggest how to resolve this issue permanently.
    Thanks and Regards,
    Pooja

    Hi Pooja,
    1) It might be a database table space issue in the background. Please check it with BASIS.
    Regards,
    Marasa.

  • Issue with 0SRM_TD_IV data source delta

    Hi experts,
    I have an issue with invoice numbers not being updated correctly when one of the invoice items is deleted in SRM.
    For example:
    In SRM:
    Invoice number: 50001168
    Invoice item : 1, quantity: 30 - deleted ( deletion indicator - x)
    Invoice item : 2, quantity: 4
    Invoice item : 3, quantity: 7
    In the standard extractor (0SRM_TD_IV) in RSA3: exactly the same as the SRM data.
    In the standard DSO (0SRIV_D3): only the invoice number and invoice item 1 are present, which is supposed to be deleted in BI.
    Invoice number: 50001168
    Invoice item : 1, quantity: 30
    When I did a full repair for this invoice number from SRM, I got all three invoice items. It did not overwrite invoice item 1, which is deleted in SRM. The way it works in SRM is that the invoice item is not actually deleted but flagged with the deletion indicator X.
    Now everything looks good up to the extractor in RSA3. What should I do now to fix this issue? Note that we don't have transformation rules in BI 7.0; instead we are using transfer rules and update rules. Please let me know.
    Thanks,
    Chandra

    Hi Chandu,
    The way I understand your issue is that,
    when you do a full upload, you get the information correctly in RSA3 and DSO.
    when you run a delta, you are not getting all the line items.
    1) Please check whether there are any relevant SAP notes for this extractor to fix this delta issue.
    2) Please check whether there is any code in the update rules or transfer rules that deletes the missing line items (see the illustrative sketch below).
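    For point 2, this is the sort of statement to look for in the BW 3.x start routines (purely illustrative; the deletion-indicator field name DEL_IND is an assumption and will differ in your transfer/update rules):
    " Illustrative only: any DELETE on DATA_PACKAGE in a transfer-rule
    " or update-rule start routine silently drops rows during the load.
    DELETE DATA_PACKAGE WHERE del_ind = 'X'.
    A wrong or overly broad condition in a statement like this would explain line items going missing in the delta load while a full repair brings them in.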
    Thanks,
    Krishnan

  • Crystal Reports XI R2 SP 6 - Issue with setting Data Source Location

    Hello,
    After some initial difficulty installing CR XI R2 with SP 6 on a Windows XP machine (see thread Error on installation of CR XI R2 SP6), the Crystal Reports environment seemed fine. However, when I tried resetting the datasource location on an ODBC datasource between a development and production server, I get the message 'Some Tables could not be replaced as no match was found....' The tables in the two databases are identical, so that isn't the issue.
    Here is some additional information
    The issue seems related only to DSNs that point to a Progress database. I am able to reset datasource locations for DSNs that use the SQL Server driver and also for those that use the CR ODBC XML Driver 5.0. I am not able to reset the datasource locations on the DSNs that use a Progress OpenEdge 10.1 DB driver.
    I can create a new report using the DSN for the Progress driver and add tables, but the table names are coming up as aliases - i.e. if I add a table called PM_Plant, the table added to the report is PM_Plant1.
    I also found I can go into existing reports, rename the tables in the Database Expert to the alias (appending 1 to the end of the table name), and then I am able to repoint them using the datasource location screen.
    So it looks like there is a potential workaround, but I didn't run across any information saying we should need to do that.
    Any recommendations how to fix the issue?  
    Thanks,

    Hi Don,
    The reports were created with CR XI R1 on my PC initially and the Progress drivers have not changed since. The reports were deployed to a server and I pulled them back to my PC to test out any changes after the CR XI R2 SP 6 upgrade (so there is really only one machine involved, the one that had the upgrade).
    I did look at the settings for verifying and tried playing around with those and also with verifying the database but that didn't make any difference.
    I wasn't quite sure which registry keys to look at or what the values should be so wasn't able to pursue that option. 
    All the tables in the progress database use underscores as part of the table name (i.e. PM_Plant, PM_Company), do you think the upgrade to SP6 means that the underscore is now a reserved character and that is why the tables are getting aliased?  If so, do you know how to change the alias settings or the list of characters that are reserved?
    Just an FYI, I had to downgrade back to CR XI R1 at this point to get work done. If time allows, I'll retry the upgrade in about 5-7 weeks. I discussed the issue with a system administrator and we will try removing the Progress drivers and DSNs prior to trying the upgrade again, to see if that makes a difference. I'll also make sure to keep track of all Report Options and Options, and also registry keys, to see what changes.
    Thanks

  • 64-bit CR Enterprise won't talk with a 32-bit data source, and the 32-bit IDT won't use a 64-bit connection for a universe. How do I connect them?

    Hi.
    I installed the full suite of tools, as I am involved with a company intending to become a partner and reseller, so we are trying to get everything working.
    SAP Crystal Server 2013 SP1
    BusinessObjects Business Intelligence platform
    SAP Crystal Reports for Enterprise
    SAP HANA
    SAP Crystal Reports 2013 SP1
    SAP BusinessObjects Explorer
    SAP Crystal Dashboard Design 2013 SP1
    Crystal Server 2013 SP1 Client Tools
    SAP BusinessObjects Live Office 4.1 SP1
    .NET SDK
    I found out that BI was needed only after I installed all the others but I installed it without problem.
    My issue is that the Information Design Tool (IDT), which creates my universes successfully and even lets me open and see the data without a problem, is using a 32-bit data source (ODBC connected to SQL Server 2008).
    However, I am unable to load the .UNX in Crystal Reports (CR) 2011, so I used CR Enterprise (CRE) to connect to my universe. Now when I try to open the universe I start getting the following error:
    [Microsoft][ODBC Driver Manager] The specified DSN contains an architecture mismatch between the Driver and Application
    When I do searches online I get very generic information relating to setting up data sources. While I believe the problem is indeed with the data source, I don't believe it is setup wrong.
    When I access the universe using the IDT (which uses the 32-bit version of the data source to build its connection; it won't let me use a 64-bit one), I can load a table under the "result objects for query #1", press refresh, and get a list of data.
    When I access the same universe using CRE (which seems to use a 64-bit data source, but I am not sure) and follow the same process as above, I get the above error.
    If I cancel the process, when CRE lists the fields for the report I can later manually map these fields to an ODBC connection. But if I have to do this, what is the point of using universes at all, as the end user then has full access to all the various DBs available in the data source?
    Thanks in advance for any help you can provide.

    On the server where Crystal Reports Server is installed, create a 64-bit ODBC connection with the same name as the 32-bit connection.  CRS will then automatically use the 64-bit version of the connection.
    -Dell

  • I am having an issue with getting data usage alerts for my iPhone 4s

    I am having an issue with getting data usage alerts for my iPhone 4s from AT&T. I do not download anything huge at all.
    I looked into it and figured out that the phone dials out nightly at 12:29 am. I went into my settings and went to General > About > Diagnostics & Usage, then Diagnostics & Usage Data, to see this. I then clicked Don't Send, but I am still getting usage alerts. Can anyone help me please...
    Thanks

    Honestly, from reading the thread linked, they all come off as a bunch of whiney people that cannot be bothered to help themselves.
    Little to nothing in that thread indicates an issue beyond inept consumers.  Yes, I read several pages on the incessant gripes.  Very few made any actual attempts to troubleshoot issues before whining about the "Apple issue" and those that did actual troubleshooting got their issues resolved.
    So no, Apple has nothing to fix beyond a few specific devices that are experiencing hardware issues.
    If you have actually put forth effort and done the basic troubleshooting, take the device to Apple for evaluation and possible replacement.  Whining will get nothing accomplished.

  • 11i EBS XML Publisher Report with Multiple Data Source

    I need to create an XML Publisher report in 11i EBS that pulls data from another 10.7 EBS instance as well as from 11i EBS in a single report.
    I am not allowed to create extracts or use DB links.
    My problem is how to create the data source connection using a Java concurrent program.
    The approach I am trying is:
    1. Create a Java concurrent program to establish the connection to the 10.7 instance.
    2. Write the SQL queries in the data template with two data sources, one for 11i EBS and one for 10.7 EBS.
    3. The template will show the data from both queries in one report.
    Is there any other way to proceed using the data source API...
    thanks

    Option 1:
    The query should be the same at detail level; only the template has to be different for summary and detail.
    At runtime, the user can choose to see the detail or the summary.
    Disadvantage: if the data is huge, a single report has to handle all of it.
    Advantage: only one report.
    Option 2:
    Create two separate reports, summary and detail, with a different data definition and a different layout, kept as separate reports.
    Advantage: the query runs for whichever report the user executes (summary or detail), so you can write an efficient query for each.
    Disadvantage: two reports (query/template) to be maintained.

  • Issue with Pro Res sources when encoding in Media Encoder.

    There seems to be a big issue with Pro Res sources in Media Encoder. I've noticed that when exporting using the 'software only' mode my graphics and titles look horrible: they are pixelated around the edges and the compression looks bad. This only happens when the export is made from a Pro Res source; if I make the exact same file Uncompressed, the issue is resolved. If I use the 'Cuda' option (which I already know is the better option) the issue is also resolved. The thing is, in a work environment not all of our systems are Cuda enabled, and I would like to use Media Encoder as an exporting option overall. I love Media Encoder, it's fast and easy to use, but this Pro Res issue is huge because the majority of the time we are working in Pro Res. I also did a test out of Avid Media Composer to Media Encoder: I sent a reference file referencing the Avid MXF material and the issue is gone, so to my knowledge this seems to be a Pro Res only issue. The settings I am exporting to are 960 x 540 H.264 and also .mp4. This is coming from a 1080p source and yes, I do have 'maximum render quality' checked for best scaling. I understand that Software Only vs Cuda and OpenCL use different algorithms when scaling, but it seems crazy to me that it would look this much worse.
    Anyway, if somebody can please look into this that would be great; as it stands I can't continue to use Media Encoder, and making my source files Uncompressed every time to do an export is just not a workflow I want.
    On a side note, I've also recently noticed on a clip that had text over a grey background that there are lines going evenly all down the screen, even when exporting to a non-scaled 1080p .mp4. Once again, this issue goes away with any format but Pro Res. The weird part is this happens even with Cuda enabled. This is why I am thinking Media Encoder is having some sort of issue with the Pro Res codec.
    I am on a current new 27'' fully loaded iMac with the latest Adobe CC updates.
    Has anyone else experienced this?
    thank you

    No, it is why advance users do not use the wizard.
    Have a look at OpenOffice.org and its form creation. Once you add your fields in OpenOffice.org, just export to PDF and the form fields will be named just like you named in them in OpenOffice.org and the drop down values carry over.

  • CrystalReports error in XI 3.1 with Oracle Data Source

    Having trouble running Crystal Reports with an Oracle data source off XI 3.1 SP3. Reports hang through CMC or InfoView. We have no trouble on a (presumably) identical environment with the same reports. Basically this happened after a migration.
    The report can be run in the designer on the server box (suggesting the connection is good); however, once uploaded to the CMS, it hangs with an error. This is not about a specific report, only reports with an Oracle data source (the Oracle client is already installed and the connection established).
    Any comments appreciated.

    Thank you for your reply.
    Seems like a re-boot resolved the problem.

  • XmlDataProvider .... is gone completely in my Xaml file. Why? How many different ways to deal with xml data source through WPF

    I followed a procedure described in a book:
    1. Insert the "Inventory.xml" file into the project "WpfXmlDataBinding".
    2. Add the XML data source through the data panel of Blend for 2013, name it "InventoryXmlDataStore" and store it in the current document.
    3. Drag and drop the nodes from the Data panel onto the artboard.
    Then I checked my Xaml file against the one provided by the book
    Xaml file by the book:
    <Window.Resources>
        <!-- This part is missing in my xaml file -->
        <XmlDataProvider x:Key="InventoryDataSource"
                         Source="\Inventory.xml"
                         d:IsDataSource="True"/>
        <!-- This part is missing in my xaml file -->
        <DataTemplate x:Key="ProductTemplate">
            <StackPanel>
                <TextBlock Text="{Binding XPath=@ProductID}"/>
                <TextBlock Text="{Binding XPath=Cost}"/>
                <TextBlock Text="{Binding XPath=Description}"/>
                <CheckBox IsChecked="{Binding XPath=HotItem}"/>
                <TextBlock Text="{Binding XPath=Name}"/>
            </StackPanel>
        </DataTemplate>
    </Window.Resources>
    <Grid>
        <ListBox HorizontalAlignment="Left"
                 ItemTemplate="{DynamicResource ProductTemplate}"
                 ItemsSource="{Binding XPath=/Inventory/Product}"
                 Margin="89,65,0,77" Width="200"/>
    </Grid>
    my Xaml file:
    <Window x:Class="WpfXmlDataBinding.MainWindow"
            xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
            xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
            Title="MainWindow" Height="922" Width="874">
        <Window.Resources>
            <DataTemplate x:Key="ProductTemplate">
                <StackPanel>
                    <TextBlock Text="{Binding XPath=@ProductID}"/>
                    <TextBlock Text="{Binding XPath=Cost}"/>
                    <TextBlock Text="{Binding XPath=Description}"/>
                    <CheckBox IsChecked="{Binding XPath=HotItem}"/>
                    <TextBlock Text="{Binding XPath=Name}"/>
                </StackPanel>
            </DataTemplate>
        </Window.Resources>
        <Grid DataContext="{Binding Source={StaticResource InventoryXmlDataStore}}">
            <ListBox HorizontalAlignment="Left" Height="370"
                     ItemTemplate="{DynamicResource ProductTemplate}"
                     ItemsSource="{Binding XPath=/Inventory/Product}"
                     Margin="65,55,0,0" VerticalAlignment="Top" Width="270"/>
        </Grid>
    </Window>
    All looks quite the same except the <XmlDataProvider ...> part under <Window.Resources>, which is gone completely from my Xaml file.
    1. Why?
    2. How many different ways are there to deal with an XML data source in WPF?
    Thanks, guys.
    (PS: My "WpfXmlDataBinding" runs through without problems.)

    Never do yourself down Richard.
    Leave that to other people.
    It's quite common for smart developers to think they're not as good as they are.
    I coach a fair bit and it's a surprisingly common feeling.
    And to repeat.
    Never use anything that ends in ...Provider. They're for trivial demo apps. Transform the XML into objects and use them. Write it back as XML. Preferably, use a database.
    You want to read a little mvvm theory first.
    http://en.wikipedia.org/wiki/Model_View_ViewModel
    Whatever you do, don't read Josh Smith's explanation. I used to recommend it but it confuses the heck out of newbies. Leave that until later.
    Laurent Bugnion did a great presentation at mix10.  Unfortunately that doesn't seem to be working on the MS site, but I have a copy.  Download and watch:
    http://1drv.ms/1IYxl3z
    I'm writing an article at the moment which is aimed at beginners.
    http://social.technet.microsoft.com/wiki/contents/articles/30564.wpf-uneventful-mvvm.aspx
    The sample is just a collection of techniques really.
    I have a sample which involves no real data but is intended to illustrate some aspects of how viewmodels "do stuff" and how you use datatemplates to generate UI.
    I can't remember if I recommended it previously to you:
    https://gallery.technet.microsoft.com/WPF-Dialler-simulator-d782db17
    And I have working samples which are aimed at illustrating line of business architecture.  This is an incomplete step by step series but I  think more than enough to chew on once you've done the previous stuff.
    http://social.technet.microsoft.com/wiki/contents/articles/28209.wpf-entity-framework-mvvm-walk-through-1.aspx
    The write up for step2 is work in progress.
    https://gallery.technet.microsoft.com/WPF-Entity-Framework-MVVM-78cdc204
    Hope that helps.
    Recent Technet articles: Property List Editing;
    Dynamic XAML

  • Has anyone had issues with Administration\Data Import/Export\Data Import???

    Has anyone had issues with Administration\Data Import/Export\Data Import???
    I have a client who has recently upgraded from V2007 to V8.81. They were successfully using this standard function to import supplier prices into their master price list, but now it fails.
    I have looked at the file they are importing and it appears to be fine.
    On closer inspection, it did contain approx 46,000 entries, so I took the first 1,000 and created a test file, which imported fine.
    The only issue I found was speed, with the test file of 1,000 records taking about 30 minutes to import. It appeared to get slower and slower the further through the file it got!
    Based on this, I have estimated that the whole file would take about 13 hours to import. The client says that when they ran it on version 2007 it was far quicker.
    In practice, it does appear to run, but the speed is the issue. Having said this, I set the whole file to run last night (overnight), and this morning it appeared to have hung after about 2,307 rows, with nothing else being updated.
    Has anyone any ideas or is aware of performance issues like this?
    Thanks,
    Ian

    Always an option, but would you give your clients access to this tool?
    Not sure really.
    I have uploaded a copy of their database onto my test system and run the same routine. It's equally as slow.
    I can't gauge whether it's an issue with 8.81 that 2007 didn't have, as I only have the client's word for it; however, I have no reason to disbelieve them.
    Kind regards,
    Ian
