Adding BPAddresses after main BP data import

Hi,
I imported the main BP data and now I'm adding the BPAddresses, so the RecordKey is the link between the two spreadsheets. At the moment I left the LineNum field blank, but when I import using the DTW it gives this error for the very first RecordKey:
This entry already exists in the following table (ODBC -2035) Application-defined or object-defined error
Is this error caused because it is seeing the customer code/entry as a duplicate, or should the LineNum be set to 1?
The first import did not bring in any address info.
Thanks

Thanks for the help, but I'm still getting this error.
For the very first RecordKey, for CardCode AAO001-S:
AAO001-S
This entry already exists in the following table (ODBC -2035) Application-defined or object-defined error
The initial BP data import had no address information at all.
In the DTW I select:
Business Object = oBusinessPartners
Choose Data Files
I link the CSV for BusinessPartners as the source file and the CSV for BPAddresses.
The BPAddresses format in the spreadsheet:
RecordKey   LineNum   AddressType   AddressName
1           0         bo_BillTo     Postal address
1           1         bo_ShipTo     Physical address
I'm selecting import new objects
Any ideas? I'm unsure what the problem can be.
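For reference, here is a minimal sketch of how the two source files line up via RecordKey, using only the columns and values mentioned above (the real DTW templates contain many more columns):
BusinessPartners.csv
RecordKey,CardCode
1,AAO001-S
BPAddresses.csv
RecordKey,LineNum,AddressType,AddressName
1,0,bo_BillTo,Postal address
1,1,bo_ShipTo,Physical address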

Similar Messages

  • All of a sudden (video) files added to my iTunes library after a certain date aren't playable on other devices.

    hello!
    All of a sudden I have a weird issue with playing video-files added to my iTunes library after a specific date.
    This is the situation so far:
    I have encoded video for iTunes with the same settings for the last 3 years, and I know these videos play on all my devices (Apple TV and iPad). All of a sudden, videos added to my library after the 8th of September aren't playable on my Apple TV or iPad. I can play them directly in iTunes and by accessing them directly via the Finder and QuickTime.
    The following errors occur:
    When I pull the video from iTunes on my AppleTV.. the video simply doesn't play.
    When I pull the video from iTunes on my iPad I get a message that the file can't be found.
    When I push the video from iTunes to the Apple TV using AirPlay, I get a message that an error occurred while connecting to the AirPlay device "Apple TV" and that the necessary file could not be found. In addition, when the Apple TV is switched off it doesn't turn on when I do this, suggesting that something is missing on my Mac itself.
    None of the above errors pop up when I choose a video that was put in my library before the 9th of September; then everything works as expected.
    This is what I tried:
    Turning off and on home sharing on all devices including the mac.
    Logging in with a different Apple ID, though using the same ID as before.
    I created a new iTunes library and populated it with one video that was added after the 8th of September and one that was added before the 9th of September. Surprisingly, the older one played and the newer one didn't, even though they both used the same encoding settings. The new library was local on my Mac, while my original library is on a NAS.
    I encode my videos with Quicktime 7 Pro using the following settings:
    H.264 (current frame rate, key-frames:automatic, re-order frames, automatic bitsize)
    Medium Quality
    Multi-pass
    current resolution (which is always 1280 X 720)
    AAC sound
    44.100 kHz
    128kbps
    I'm using Mountain Lion 10.8.2 and iTunes 10.7. Mountain Lion was updated a few days after the 19th of September, and I think iTunes was as well; both versions were released after this issue seemed to pop up.
    I also have two video files, encoded on the same day with the same settings, that I dragged into the iTunes library on the 9th of September. One works, the other doesn't.
    So I wonder if there is somebody out there that can tell me what might have happened here and how I could fix this issue.
    Thank you in advance.

    Hi Seraphiel07.
    I have a very similar issue. Hardware involved:
    Apple MacMini (Mid 2011) - OSX 10.8.2. iTunes 10.7. Home Sharing enabled.
    Apple TV (3rd Generation). Home Sharing (access) enabled.
    Apple MacBook Pro (15" Mid-2009). OSX 10.8.2. iTunes 10.7. Home Sharing enabled.
    Apple iPad 2 (Wifi only). iOS 6
    Apple iPhone 4S. iOS 6
    All connected through:
    Apple Time Capsule 802.11n (4th Gen)
    I have an iTunes library on the MacBook Pro, shared over "Home Sharing" and another (duplicate content, different library name) iTunes library on my MacMini, which is also accessed through the Apple TV.
    All software is current (as of Sep 26th).
    All movies are iTunes purchases. 80 in total.
    All files are purchased (and home shared) under one account.
    My last two iTunes purchases, both HD movies bought on 09/17/2012 and 09/05/2012, refuse to appear on any device accessing either of the two libraries over Home Sharing. It should be noted that every other movie (the other 78) shows up and plays correctly, so I know it is not an issue with connectivity; otherwise I would see zero movies available.
    I have:
    1. Removed all content from each library and re-added it.
    2. Upgraded the OS on the MacMini to 10.8.2.
    3. Tested viewing the two missing movies on other Apple devices, with no luck.
    4. Turned Home Sharing on and off within iTunes and also on each device.
    It would appear to me that the issue is with the files themselves rather than the devices, as different devices and different software versions have been tested.
    All comments welcome.

  • Reset sequence after data import

    Hi all,
    I've got a problem where we import data into a table with an auto-incremented field for a primary key. After the data import, I can drop the sequence, check the max(primary key), and re-create it to start at the max value + 1. My problem is that I need to do this for a large number of tables and don't know how to write a script to do it. In the Create Sequence command, whenever I try to use a variable in the "START WITH" clause, I get an error.
    Thanks in advance,
    Raymond

    Spool the sequence creation script to a file and then run it.
    Or use dynamic SQL.
    Or you can "drive" the sequence forward by issuing select myseq.nextval from dual; the appropriate number of times. If you need to "drive" the sequence backwards, alter it with increment -1, issue select myseq.nextval from dual; the appropriate number of times, and then alter it back to the previous increment.
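    To illustrate the dynamic SQL route: CREATE SEQUENCE is DDL and cannot take a PL/SQL variable in its START WITH clause directly, but you can build the statement as a string and run it with EXECUTE IMMEDIATE. A minimal sketch, assuming placeholder names my_table, id and my_seq:
    DECLARE
      v_start NUMBER;
    BEGIN
      -- restart one past the current maximum key value
      SELECT NVL(MAX(id), 0) + 1 INTO v_start FROM my_table;
      -- DDL cannot use bind variables, so concatenate the value into the statement
      EXECUTE IMMEDIATE 'DROP SEQUENCE my_seq';
      EXECUTE IMMEDIATE 'CREATE SEQUENCE my_seq START WITH ' || v_start;
    END;
    /
    Looping over a list of table/sequence name pairs (or a query against USER_SEQUENCES) extends the same block to many tables.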

  • User Data Import

    I added about 800 users to a portal group via a User Data Import. All of the users were added to the group, which had a role assigned to it. But when the users logged in, they had no roles assigned to their accounts. However, everything worked fine when we removed a user and re-added them to the group. I do not want to do this for the rest of the 800 users, as that would take days. Does anyone know how we can rectify this issue?

    Have you tried a portal restart?
    It is not unlikely that the problem is UME caching (the cache will at least be reloaded after a restart, but there are probably other ways to refresh it as well).
    Dagfinn

  • Data import from EBS via FDMEE in FDM failed. Getting error message "Error connecting to AIF URL"

    FDM data import from EBS via FDMEE failed after rolling back the 11.1.2.3.500 patch. I'm getting the error message below in the ERPI Adapter log.
    *** clsGetFinData.fExecuteDataRule @ 2/18/2015 5:36:17 AM ***
    PeriodKey = 5/31/2013 12:00:00 AM
    PriorPeriodKey = 4/30/2013 12:00:00 AM
    Rule Name = 6001
    Execution Mode = FULLREFRESH
    System.Runtime.InteropServices.COMException (0x80040209): Error connecting to AIF URL.
    at Oracle.Erpi.ErpiFdmCommon.ExecuteRule(String userName, String ssoToken, String ruleName, String executionMode, String priorPeriodKey, String periodKey, String& loadId)
    at fdmERPIfinE1.clsGetFinData.fExecuteDataRule(String strERPIUserID, String strDataRuleName, String strExecutionMode, String strPeriodKey, String strPriorPeriodKey)
    Any help Please?
    Thanks

    Hi
    Getting this error in ErpiIntergrator0.log. The ODI session ID was not generated in ODI / FDMEE. If I import from FDMEE directly, it imports the data from EBS.
    <[ServletContext@809342788[app:AIF module:aif path:/aif spec-version:2.5 version:11.1.2.0]] Servlet failed with Exception
    java.lang.RuntimeException
    at com.hyperion.aif.servlet.FDMRuleServlet.doPost(FDMRuleServlet.java:76)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:301)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:27)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
    at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:324)
    at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:460)
    at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
    at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
    at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:163)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3730)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3696)
    at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2273)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2179)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1490)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)

  • Error Message while adding Item in Item Master Data- [Microsoft][SQL Server Native Client 10.0][SQL Server]Conversion failed when converting the nvarchar value 's008 01' to data type int. (CINF)

    Dear Experts
    I am getting the following error message while adding an item in Item Master Data. I modified the SBO_SP_TransactionNotification stored procedure in SQL Server as shown below, and after that I could not add the item:
    ALTER proc [dbo].[SBO_SP_TransactionNotification]
    @object_type nvarchar(20),                      -- SBO Object Type
    @transaction_type nchar(1),               -- [A]dd, [U]pdate, [D]elete, [C]ancel, C[L]ose
    @num_of_cols_in_key int,
    @list_of_key_cols_tab_del nvarchar(255),
    @list_of_cols_val_tab_del nvarchar(255)
    AS
    begin
    -- Return values
    declare @error  int                       -- Result (0 for no error)
    declare @error_message nvarchar (200)           -- Error string to be displayed
    select @error = 0
    select @error_message = N'Ok'
    --    IF @OBJECT_TYPE = '59' AND (@TRANSACTION_TYPE = 'A' or @TRANSACTION_TYPE = 'U')
      BEGIN
       IF EXISTS(
        SELECT T0.Price FROM IGN1 T0
        where  IsNull(T0.Price, '0') = '0' and T0.DocEntry = @list_of_cols_val_tab_del)
       BEGIN
        SELECT @ERROR=1,@ERROR_MESSAGE='Please insert the price !'
      END
    end
    -- Select the return values
    select @error, @error_message
    end

    Hi Rathna,
    Just put the SP like this, without the -- before the IF. The -- marks that line as a comment, so the IF is never evaluated and the price check runs for every object type; when you add an item, the value in @list_of_cols_val_tab_del (the ItemCode 's008 01') gets compared against the integer DocEntry column, which causes the conversion error. Uncomment the IF and it will work.
    IF @OBJECT_TYPE = '59' AND (@TRANSACTION_TYPE = 'A' or @TRANSACTION_TYPE = 'U')
      BEGIN
       IF EXISTS(
        SELECT T0.Price FROM IGN1 T0
        where  IsNull(T0.Price, '0') = '0' and T0.DocEntry = @list_of_cols_val_tab_del)
       BEGIN
        SELECT @ERROR=1,@ERROR_MESSAGE='Please insert the price !'
      END
    end
    Hope it helps

  • Data Import defaulting Null date to system date

    I am trying to import Campaigns using Data Import, and when End Date is NULL it defaults to system date. Is there any way to avoid this default? Even when I don't "map" the End Date, it still defaults it.
    Thanks!

    This field will always populate itself; what you have to do is establish a good date to put in it. What I have done is create a system default of Today() + 14, which means the end date will be roughly 14 days after the campaign was created. It is then up to the person running the campaign to change it if required.

  • Linking to a different document after a certain date

    I hope there is a way to do this.
    I have a PDF that I have added to a web page. It is a registration form to be filled out and sent in. The prices on the form reflect both early bird (until April 29) and regular prices.
    I have created a second registration form (not yet uploaded) which only reflects the regular prices, for use after the early bird date has passed. Once the early bird period has lapsed, I was going to upload the new PDF and change the link so that the second registration form is linked.
    I don't want to have them both available at the same time - less chance of someone coming back later and saying they didn't see the form with the early bird pricing.
    Is there a way I could set them both up on my site so that the first form is linked until the end of April 29, and on April 30 the link automatically brings people to the second form?
    THANKS
    deb

    The effort to do this by creating a dynamic page is just not worth it.
    Anyone saying it's too much effort is either lazy or simply doesn't know how to build dynamic pages in a simple manner. Furthermore, it's not just about you and your issue; it's about anyone that may be looking for a solution to the question you have asked: how to display a link based on whether a date has passed or not. Not an alternative suggestion, a solution.
    It's very easy to create a dynamic link based on the date. Of course bregent mentioned that it's not difficult, but there wasn't any mention of how easy it was (nor examples, for that matter). Just lots of talk about whether it was worth it or not, and an alternate, cumbersome suggestion. Bummer. Here's an example of the dynamic method with an explanation included. You can decide whether it's worth the effort or not. I estimate it's less than 5 minutes of work, even for a beginner. Plus you might learn a thing or two about dynamic scripting in the process, which is exciting.
    Change the filename of register.html to register.php, then use the following code to display a link depending on the date.
    <?php if (date("Y-m-d") < "2011-03-15") { ?>
    today is before March 15, 2011 put link to early bird pdf here.
    <?php } else { ?>
    today is on or after March 15, 2011 put link to regular rate pdf here.
    <?php } ?>
    You can use a 301 redirect with .htaccess on an Apache server to redirect register.html to register.php if many links to register.html are already established.
    You'd need two different PDF files to accomplish your task: early_bird.pdf to use before your set date and regular_rate.pdf to use after it.
    Then set things up to remove the early_bird.pdf file from the server once the date has passed: define a variable holding the filename and call the function that deletes it inside the branch that runs after the date, as shown in the example below:
    <?php if (date("Y-m-d") < "2011-03-15") { ?>
    today is before March 15, 2011 <a href="/images/early_bird.pdf">link to early bird pdf here</a>.
    <?php } else {
    // path to the early bird pdf on the server (document root + web path)
    $pdffile = $_SERVER['DOCUMENT_ROOT'] . "/images/early_bird.pdf";
    // remove the early bird pdf from the server the first time this branch runs;
    // the file_exists guard stops later requests from raising a warning
    if (file_exists($pdffile)) {
        unlink($pdffile);
    } ?>
    today is on or after March 15, 2011 <a href="/images/regular_rate.pdf">link to regular rate pdf here</a>.
    <?php } ?>

  • Creating record on second main table during import

    Hello all,
    I am importing data to a main table (materials), and I have a second main table linked to the materials main table to store supporting data.  Assuming I have a new record being imported that contains an entity that doesn't exist in that second main table, is it possible to create a record inside the second main table?  This functionality exists for lookup tables, if the lookup record doesn't exist you can configure the map to create the record in the lookup table.  Can the same thing be accomplished with multiple main tables?  I'm having trouble with this because I can't get any field aside from the primary key on the second main table to show up in the destination fields in the import manager.

    Hi,
    As you said: assuming a new record being imported contains an entity that doesn't exist in the second main table, is it possible to create a record inside the second main table?
    This scenario is quite possible.
    I have a workaround that should work:
    In this case you have to create two maps, one for the main (primary) table import and another for the second (secondary) main table.
    Before the main table import, the same file should first be imported into the second main table by placing it in the ready/inbound port of the second main table.
    That way, entities that do not exist in the secondary main table will be created, and existing records will be updated.
    Then, when the same source file is imported into the main table, the record entity will already exist in the secondary main table, so you will not face any issue importing through the main table.
    Kindly let me know if you face any issue.
    Thanks and Regards,
    Mandeep Saini

  • Main table data load – UNSPSC field is not loading

    I am new to SAP MDM
    I have the main table data that includes UNSPSC field. UNSPSC (hierarchy) table is already loaded.
    It works fine when I use import manager with field mapping and value mapping. (UNSPSC field value mapping is done).
    When I use the Import Server with the same map to load the main table data with the UNSPSC field (in this case the UNSPSC field value is different, but the UNSPSC lookup table has that value), the UNSPSC field is not loaded, while all the other fields, including images and PDFs, are loaded with the new values.
    If I go to the Import Manager, do the value mapping again for the UNSPSC field with the new value, save the map and then use the Import Server to load the data, it loads correctly.
    My question: when we use the Import Server, the main table data will have different UNSPSC code values each time, and it doesn't make sense to go into the Import Manager to do the value mapping and save the import map before every load.
    What am I missing here? Can anyone help me?

    Could anyone clarify this?
    Issue: UNSPSC field value mapping automatically by using the import server while loading the Main table.
    This issue was resolved yesterday and still works fine with the remote system MDC UNSPSC.
    Are there any settings in 'Set MDIS Unmapped Value Handling' (right-click on the Product Hierarchy field on the destination side)? By default it is set to 'Add' for both the working remote system and the non-working remote system.
    This is SAP MDM 5.5 SP6 and I am using the standard Product Master repository.
    I tried this in a different remote system, MDC R/3 & ERP, and it worked some of the time and didn't work later. When it works, the values are mapped automatically during the UNSPSC code field mapping.
    On the destination side, the main table is Products and the [Remote Key] field is displayed.
    In the source file I have only 3 fields: Product No, Product Name and UNSPSC Category, and UNSPSC Category is mapped to the destination Product Hierarchy field (lookup hierarchy).
    Do I have to map or clone any field and map it to the [Remote Key] field on the destination side? If yes, which field do I have to clone and map to the Remote Key field? Are any other settings necessary? I am not using matching with this field or any other field.
    Steve.

  • Data Import from Excel - Users Logged on

    Hello all,
    I am new to the forums and am learning SAP from a user and administrator perspective, so please excuse me if some of my questions seem obvious. I appreciate everyone's help very much.
    I have the task of setting up two new company profiles in addition to our current one.
    I am at the stage of importing Item Master and Business Partner data.
    1) What is the preferred method?  DTW vs. Import from Excel?  Is one more suitable than the other for this purpose?
    2) The help file for importing via excel states "If SAP Business One is running on a network, you have ensured that other users will be logged off the system during the import."  Could someone please explain:
    - What are the risks of running DTW or imports while users are logged in?
    - If users are logged into the live version of our company but I am testing DTW/excel imports from a TEST copy of that company are there still risks or is this okay?
    - If users are logged into one database but I am performing data imports on a different company database (for example I would be the only one logged into the companies currently being set up - is this okay even though users are logged into SAP server but accessing a different company?)
    Thanks very much!
    Richard

    Hi Kiwi,
    Referring to the Rules of Engagement, you have to post one thread per question. Please adhere to the rules from your next post.
    DTW is very much the preferred method: you can run a simulation to check whether your records are fine to be imported before importing them into the DB. Import from Excel will only help with importing BP and Item master data, and if you have to make any updates later it becomes a chore again. In DTW you can save the template, keep adding data, and changes can be updated in the template and re-imported.
    Please refer to the link below for "Differences between the Import From Excel and Data Transfer Workbench templates"; the screenshots for DTW are from a version before 2007A SP01 PL09.
    [http://wiki.sdn.sap.com/wiki/display/B1/DifferencesbetweentheImportFromExcelandDataTransferWorkbenchtemplatesandwhathappenswhentheyare+interchanged.]
    1. When users are logged on, it can cause some inconsistency; ensure no user is logged on to the live DB before performing any major DTW import.
    2. It is fine if you are test-importing data into a test DB on the same SQL server.
    Regards,
    Rakesh N

  • Data import/update on Custom Objects

    Hi,
    We are using Custom Object 1 for capturing site data within an opportunity. Since the custom object does not have the capability to check for duplicates, the users have entered data into this object that has a lot of duplicates, and data quality and integrity have been lost. I am trying to see if there is an option to export this data and re-import it after cleansing.
    I then realised that when importing custom objects, the only available matching option is the external ID. None of the sites entered by the users have an external unique ID. Also, there is no option to mass delete records within Custom Object 1.
    I understand that the only option to cleanse the data and re-import it into On Demand is web services. I want to use web services only as a last resort.
    Is there any other option to re-import the records into On Demand using the import utility after cleansing the data?
    I would like to know what the best practice is when using custom objects. Is it advisable to populate a default value in the external unique ID for custom objects while creating new records? If I had populated some value in the external unique IDs while creating those records, I would have had the option to update the existing records. Now I don't even have that option.
    I am looking for some suggestions for this issue.
    Thanks
    Swami

    Bobb,
    I exported the data and mapped the Row Id to the external unique ID. Like I said before, the external ID is blank in CRM On Demand when a record is created; we did not have any default value specified for it.
    When I try to import with the overwrite option, it does not find a match.
    I get the following error messages, as I expected:
    Row Id: AEMA-EYGFE     No matching record has been found. The import process will ignore this record.
    Row Id: AEMA-F8CPC     No matching record has been found. The import process will ignore this record.
    Row Id: AEMA-12CLIA     No matching record has been found. The import process will ignore this record.
    Unless I do a web service update of the External Unique ID in CRM On Demand, the import option will not work.
    Thanks
    Swami

  • Has anyone had issues with Administration\Data Import/Export\Data Import???

    I have a client who has recently upgraded from version 2007 to 8.81. They were successfully using this standard function to import supplier prices into their master price list, but now it fails.
    I have looked at the file they are importing and it appears to be fine.
    On closer inspection, the file contained approx. 46,000 entries, so I took the first 1,000 and created a test file, which imported fine.
    The only issue I found was speed, with the test file of 1,000 records taking about 30 minutes to import. It appeared to get slower and slower the further through the file it got.
    Based on this, I have estimated that the whole file would take about 13 hours to import. The client says that when they ran it on version 2007 it was far quicker.
    In practice it does appear to run, but the speed is the issue. Having said that, I set the whole file to run last night (overnight) and this morning it appeared to have hung after about 2,307 rows, with nothing else being updated.
    Has anyone any ideas or is aware of performance issues like this?
    Thanks,
    Ian

    Always an option, but would you give your clients access to this tool?
    Not sure really.
    I have uploaded a copy of their database onto my test system and run the same routine. It is equally as slow.
    I can't gauge whether it is an issue with 8.81 that 2007 didn't have, as I only have the client's word for it; however, I have no reason to disbelieve them.
    Kind regards,
    Ian

  • How can I get a PHP include to update automatically after a set date?

    Hi guys,
    I maintain a number of website for sports athletes and on their websites I have an include for their next event - just a small PHP file with when and where the next event is.
    After each event I have to manually update the include to the clients' next event which is a bit of a pain and quite time-consuming.
    Is there any way I can set up the include script to automatically update with a different file to include after a set date?
    Or, even better, can I set up the include itself to display the updated content after a specific date?
    This is what the include contains...
    <img class="event" src="/images/eventPicture.png" />
    <p class="nextEvent">Location<br />
    Series<br />
    1-3 January 2013</p>
    Thank you very much and I hope to hear from you!
    SM

    It's easier than setting up a database... I've been dabbling. You will still need to use a server-side language like PHP.
    First get the current time by adding this near the top of the page:
    <?php
    $date = time(); // current Unix timestamp
    ?>
    Then add the bit of PHP below in the position where you want the 'include' to appear (so from the 19th of April onwards, event_3.php will appear):
    <?php if ($date >= strtotime('19 April 2013')) { ?>
    <?php include ('event_3.php'); ?>
    <?php } ?>
    This isn't economical for many event dates, but if you only have a few it will do as a solution.
    For a series of events, check the latest date first and use elseif, so that each new event replaces the previous one once its date arrives:
    <?php
    if ($date >= strtotime('4 June 2013')) {
        include ('event_5.php');
    } elseif ($date >= strtotime('30 May 2013')) {
        include ('event_4.php');
    } elseif ($date >= strtotime('19 April 2013')) {
        include ('event_3.php');
    }
    ?>

  • How to Zero Debt After Account Closure Date in MDX Calculated Measure

    Thanks in advance to anyone who can help me with the below problem.
    I'm building a finance cube; it holds all customers we have billed and any cash received against their debt. I have a few calculated measures that work out Total Debt and Total Cash. From these I have a calculated measure called Total Debt Outstanding, which is (Total Debt - Total Cash).
    My problem: when an account is closed, the debt is not always zero. We don't always get all the cash to cover the debt, so we just write off the outstanding amount. I have a requirement to zero out the debt after the closure date has passed.
    If I look at the calculated measure [Total Debt Outstanding] today for all accounts, it includes debt figures from closed accounts. I need the calculated measure to ignore these so they don't inflate my current debt figure.
    Example
    Account 12345: Debt £3,142 added on 15/03/2013, Cash £1,687 received on 12/12/2013, Total Debt Outstanding £1,107. The account closed on 15/12/2013 with a balance of £1,107; after the closure date I need the Total Debt Outstanding figure to be £0 when you browse the cube with my date dimension.
    My date dimension Hierarchies is Year-Quarter-Month-Day
    I have tried using the Scope function to Zero the Total Debt Outstanding figure after closure date but can’t get it to work. 
    My scope calculation is below:
    Scope([Measures].[Total Debt Outstanding]);
    This = IIF([Date].[Year-Quarter-Month-Day].Currentmember > [ClaimData].[Closed].[Closed].Currentmember,0,[Measures].[Total Debt Outstanding]);
    End Scope;
    My Account Opening and Closing dates are in my ClaimData fact table, with Opening Date linked to the date dimension. I can’t link Closing date as we don’t always have a closing date until the account closes, so SSAS errors.
    Many Thanks

    Hi,
    Try comparing the key values of the members. For example:
    [Date].[Year-Quarter-Month-Day].Currentmember.Properties("key") > [ClaimData].[Closed].[Closed].Currentmember.Properties("key")
    Also think about an UnknownMember in the Date dimension; in that case you can link the Closing Date.
    Best regards,
    Aleksandr Severin
