Core Data: versioned model migration with a change to an existing attribute's data type.

Hi,
I want to ship an upgrade of my iOS app (currently on the App Store) that adds new functionality around the app's Core Data features.
In the new version, I want to change the data type of an attribute that is already present in the Core Data model of the existing App Store version.
For example, in version 1.0 the attribute "date" has the data type NSDate; in the new model version it needs to change from NSDate to NSString.
I tried lightweight migration, but per the documentation, lightweight migration does not support changing the data type of an existing attribute/entity in the Core Data model.
Please suggest an optimized solution for migrating the database along with the change in data type of the existing attribute of the entity in Core Data.
If further info is required, please ask.
Thanks in advance.
Regards,
Laxmikant

More Info: The two entries are actually pointing to the same object. If I save the context and restart, I only have one entry (I can also see this by looking at the XML store).
It seems that the NSTableView is getting messed up somehow after the FetchRequest. Still stumped.
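Since lightweight migration cannot change an attribute's type, the usual route is a custom (heavyweight) mapping model with an NSEntityMigrationPolicy, or a staged manual migration that copies every record. As a language-agnostic sketch of the per-record transformation step (Python used purely for illustration; the key names and the epoch-timestamp storage are assumptions, not Core Data API):

```python
from datetime import datetime, timezone

def date_to_string(timestamp: float) -> str:
    """Convert a stored date (seconds since the Unix epoch) into the
    ISO-8601 string the new model's String attribute will hold."""
    dt = datetime.fromtimestamp(timestamp, tz=timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%SZ")

def migrate_record(old: dict) -> dict:
    """Copy one source record into the destination shape,
    rewriting only the 'date' attribute (hypothetical key name)."""
    new = dict(old)
    new["date"] = date_to_string(old["date"])
    return new
```

In the app itself, this conversion would live in the migration policy's createDestinationInstances step (or a value expression in the mapping model), applied to every source instance of the entity.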

Similar Messages

  • IXOS 9.6.1 migration with a change in DBMS (Oracle to MS SQL)

    Hi all,
    We have a requirement to migrate an IXOS 9.6.1 Archive Server (DBMS: Oracle) from one server to another, changing the DBMS to SQL Server.
    Current: OpenText Archive Server 9.6.1 with Oracle
    Target: OpenText Archive Server 9.6.1 with MS SQL
    Can anyone please help me with the migration of the OpenText Archive Server?
    Thanks and regards,
    Chaitanya

    I turned on the jsp debug init-param for the system for more information about the error. I get the following:
    Request URI:/interop/jsp/css/CSSTemplate.jsp
    Exception:
    OracleJSP:oracle.jsp.parse.JspParseException: /jsp/css/provisionSummary.jsp: Line # 119, <bean:define id="appId" name="appName" property="key"/>
    Error: Tag attempted to define a bean which already exists: appId
    When I attempt to provision the user I created for administrative purposes, I also see the following:
    From SharedServices_Metadata.log:
    2009-09-18 15:55:32,399 [AJPRequestHandler-HTTPThreadGroup-6] ERROR com.hyperion.eie.common.cms.CMSFacade - org.apache.slide.structure.ObjectNotFoundException: No object found at /users/admin
    From SharedServices_Admin.log:
    2009-09-17 14:49:20,583 [Thread-13] ERROR com.hyperion.cas.server.CASAppRegistrationHandler.loadApplicationsFromCMS(CASAppRegistrationHandler.java:430) - AuthorizationException occured when loading applications : Refer log in debug mode for details
    How does one set these logs into debug mode ?

  • Opportunity Closing Date - Upon Status Change, Retain Existing Closing Date

    Hello,
    When an Opportunity is set to Won or Cancelled, the system automatically updates the Closing Date to today's date (i.e., the date on which the Opportunity was changed to Won or Cancelled).
    Does anyone know how to override this behavior? The requirement is to retain the existing Closing Date and NOT automatically set it to today's date.
    I cannot seem to find anything in config to override this. I'm guessing I'll need to implement a BAdI - just not sure which one...
    Thanks,
    Matt

    Prasenjit,
    Thanks for the reply, I more than appreciate it! Per your recommendation, after some debugging, I was able to locate the event :-). I have awarded you points!
    In case anyone comes across this issue, here's how I resolved it:
    In T-Code CRMV_EVENT, for Transaction Category BUS2000111 (i.e., Opty), Object Name STATUS, Event AFTER_CHANGE, and Attribute I1005, I specified my custom Function Module ZCRM_OPPORT_H_EXPECTEND_CMPD, which is simply a copy of the out-of-box Function Module CRM_OPPORT_H_EXPECTEND_CMPD_EC.
    In my custom FM, I simply commented out the code that defaults the EXPECT_END date to today's date.
    In addition, I had to make the following entry in table CRMC_FUNC_ASSIGN:
    FUNC_NAM = ZCRM_OPPORT_H_EXPECTEND_CMPD
    FUNCTION = CRM_OPPORT_H
    An entry is required in table CRMC_FUNC_ASSIGN in order for t-code CRMV_EVENT to recognize the custom Function Module.
    Thanks!

  • Change/Delete Existing Pricing Condition Type

    Hello
    I have a manual pricing condition type, and a rate (obtained from a custom source) was inserted programmatically into the line item whenever a new line item was added to the basket, using the CRM_ISA_BASKET_ITEMS BAdI.
    When the user changes the quantity for the same line item, I want to delete the existing (manual) condition rate, find the new current rate from the custom source, and add that to the line item.
    Question: how do I change/delete the existing condition rate?
    Cheers
    RJ
    Reference: E-Commerce Implementation

    Hello,
    You can use the Java condition value user exit for this.
    Thanks and Regards
    shanto aloor

  • Directory Migration with changing schema

    Hi,
    We are planning a directory migration as part of an implementation of Sun Identity Manager.
    The directory migration is from one set of servers to another and comprises an upgrade from Directory 5.2 to 6.3, and a schema change.
    We'd like to have a rollback plan that involves copying changes from the new directory to our legacy servers. A delay is acceptable.
    We're doing two things to our schema which increases the complexity:
    - Users are being segregated, e.g.:
    OLD: uid=123456,ou=People,ou=UK,dc=root
    NEW: uid=123456,ou=Internal,ou=Users,dc=root OR uid=567890,ou=External,ou=Users,dc=root
    - Replacing an OU hierarchy containing groups (groupOfUniqueNames) with nsRoleDN attributes on users.
    We'd like to avoid writing some kind of custom script to retrieve changes, modify them and insert them into the old directory.
    Directory Proxy doesn't seem to be the right tool for this job.
    Could anyone suggest an alternative?

    I have done this migration before - the directory information is not so bad to migrate; in fact, Novell IDM could do that piece for you.
    Migrating the files and security over will be an exercise in icacls.
    Logon scripts/Group Policy preferences will need to be reworked to convert the existing drive-mapping scripts; look at DFS for this so you only have to migrate once.
    If everything is planned out, the migration will stay smooth. Truly understanding the role of Novell IDM in the environment, analyzing the shared files and data (a good time to clean up, hint hint), and validating that the groups and security are set up correctly will aid immensely.
    I would also migrate in batches based on shared data rather than moving everyone in one big move; that way, as departments move, you can get the kinks out of the process.
    From a workstation that has the Novell client, you can script the data copy over to the new home with robocopy.
    Once the data is copied over, robocopy can keep it synced until the users are migrated.
    Novell can export directory structures with their permissions; scripting icacls will get the permissions repopulated on the MS side. The groups and users, of course, would need to be there already.
    Thanks,
    Brad Held
    Windorks.wordpress.com
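    For the DN restructuring described in the question (moving users from the old country-based tree into the Internal/External split, and turning group membership into nsRoleDN values), the per-entry rewrite can be sketched as follows (Python for illustration only; the internal/external criterion and the ou=Roles container are assumptions):

```python
def rewrite_dn(old_dn: str, internal: bool) -> str:
    """Move a user from the old country-based tree into the new split,
    e.g. uid=123456,ou=People,ou=UK,dc=root
      -> uid=123456,ou=Internal,ou=Users,dc=root."""
    rdn = old_dn.split(",", 1)[0]            # keep the uid=... RDN
    branch = "Internal" if internal else "External"
    return f"{rdn},ou={branch},ou=Users,dc=root"

def groups_to_roles(group_cns: list) -> list:
    """Replace groupOfUniqueNames membership with nsRoleDN values
    (role DNs under a hypothetical ou=Roles container)."""
    return [f"cn={cn},ou=Roles,dc=root" for cn in group_cns]
```

    The awkward part remains the reverse direction for rollback: changes captured against the new DNs/roles have to be mapped back to the legacy DNs and group entries, which is why a purely off-the-shelf replication tool struggles here.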

  • Exporting historical data to text file with MAX misses columns of data?

    I am using LabVIEW 7.1 with the DSC module 7.1 and want to export data to either Excel format or .txt.
    I have tried this with the historical data export in MAX, and also programmatically with the "Write Traces To Spreadsheet File.vi" available in the DSC module. All the tags in my tag engine file (*.scf) are defined to log data and events. Both the update tag engine deadband and the update database deadband are set to 0%.
    My exported Excel or text file seems reasonable, except that some columns of data are missing. I don't understand why data from these tags is not in the exported files, since they have the same setup in the .scf as other tags that are exported okay.
    All defined tags can be seen using NI HyperTrend or MAX, including the ones that are not correctly exported to file.
    Appreciate comments on this.
    Best regards,
    Ingvald Bardsen

    I am using LV and DSC 7.1 with a PCI-6251 and MAX 4.2. In fact, just one column of values does not make sense; the attached Excel export shows it. The last column, called ...V-002, is the problem. I put probes on the values to check them and they show correctly, but the exported file shows wrong values.
    I solved the problem of missing values in a column by putting 0% in the deadband field.
    Thank you for your help
    Attachments:
    qui, 2 de ago de 2007 - 132736.xls ‏21 KB

  • Data access in reports after changing Member Access profile

    Hi All
    I made changes to the member access profile of a user (while the current system was available for user planning).
    After making and applying these changes to the access profile, the Current View in the report accessible to the user got updated.
    But the problem was in reporting, where the updates did not happen.
    Please suggest the necessary steps so that the user gets an updated report as per the change in the member access profile.
    Thanks in advance.
    Regards
    Abhishek

    Hi Lokesh
    Thanks for the reply.
    1. Report is based on CV
    2. With another ID assigned to the same member access profile, the report shows complete data.
    I mean, with user ID X, 100 data sets show, while with user ID Y only 95 do, even though both X and Y have the same member access profile.
    Regards
    Abhishek

  • Purchase Ord delivery date changes SO schedule line date?

    Hi,
    I am purchasing a drop-ship item as per a sales order. The account assignment in the purchase order is the sales order.
    Why does the schedule line date in the sales order change in some cases when I change the delivery date in the purchase order, whereas in other cases it does not?
    Please help me out with this question.
    Sincerely.
    Puja

    Puja,
    The SO schedule line date should not change when you change the purchase order delivery date.
    However, if you change the PO del date, AND a user performs re-ATP against the sales order, OR if you are running backorder processing (rescheduling) of sales docs, then the SD item/schedule date may change to match the changed availability situation caused by the change of the delivery date of the PO.
    Look in the sales order in VA03, then Environment>changes.  Look to see if anyone or anything has 'touched' the SO item that is home to the schedule lines that were affected.
    Regards,
    DB49

  • Pass i_table with changing

    Hello all,
    I'm trying to pass an internal table with CHANGING.
    Code:
    FORM x.
      DATA: it_zfichpai TYPE TABLE OF zfichpai,
            wa_zfichpai LIKE LINE OF it_zfichpai.
      PERFORM create_md5key CHANGING wa_zfichpai-md5key
                                     it_zfichpai.
    ENDFORM.
    FORM create_md5key CHANGING hkey       TYPE zfichpai-md5key
                                p_zfichpai LIKE zfichpai. "<- doesn't work
    The LIKE doesn't work, and neither do LIKE TABLE OF, TYPE, or TYPE TABLE OF.
    One way around it is to declare it_zfichpai globally and use LIKE in the form declaration, but I want the it_zfichpai declaration to stay local.
    Need help

    Define a table type, like:
    TYPES: BEGIN OF ty_ltap,
             lgnum LIKE ltap-lgnum,
             tanum LIKE ltap-tanum,
             tapos LIKE ltap-tapos,
           END OF ty_ltap,
           tt_ltap TYPE TABLE OF ty_ltap.
    DATA: i_ltap TYPE tt_ltap.
    You can then use the table type in the definition of the subroutine:
    FORM subroutine CHANGING pi_ltap TYPE tt_ltap.
    ENDFORM.
    And call it with:
    PERFORM subroutine CHANGING i_ltap.
    regards,
    Hans

  • Flat file data load error: Exception with type CX_SY_CONVERSION_NO_NUM

    Hi Guys,
    I am trying to create a DataSource from a flat file (Excel data in CSV format), and when I create an InfoPackage on the DataSource and try to load, I get the error:
    Error 'An exception with the type CX_SY_CONVERSION_NO_NUM' at conversion exit RSDS_CONVERT_NUMBER (field FAB record 480, value 8)
    Can somebody help me with this?
    I am working in BI 7. In the Excel sheet all cells have the General format.
    I tried the PSA both typed and not typed; it didn't work.
    Thanks in advance

    I selected Data Format as "Separated with Separator (for example, CSV)",
    Data Separator: ,
    Escape Sign: ; (I didn't understand where this comes into play in a CSV file). I also left the Hex checkboxes unchecked.
    "PSA not Typed" is also unchecked.
    Am I missing something? Data types in the Excel spreadsheet (CSV file)?
    Thank you

  • Change labels Additional Customer Data

    Hi Expert Forum
    In transaction XD02, in the option "Additional Customer Data (Central)", I want to change the label "Attribute 1" to a new value, for example "Customer Group".
    How can I do this?
      Program name:      SAPLV02Z
      Screen number:     0100
      Program name:      SAPLV02Z
      Status:            A001
      Transparent table: KNA1
      Field name:        KATR1
      Data element:      KATR1
      DE supplement:     0
      Screen field:      KNA1-KATR1

    Go to CMOD -> Text Enhancements -> Keywords -> Change -> enter the data element and change the text.
    Thanks,
    SKJ

  • Only PSA with the 'Update subsequently in data targets'- BI 7.0

    Hello friends,
    We are currently on BI 7.0, and my concern is with the InfoPackage processing option 'Only PSA' with 'Update subsequently in data targets' checked. When I load the data, it loads into the PSA and I can see it in the monitor, and in Manage for the DSO it shows Transferred '328' and Added Records '0'. How do I manually update it from the PSA into the data target from this point?
    When I select 'PSA and then into Data Targets (package by package)', the DSO Manage screen shows Transferred 328 and Added Records '123'. So certainly the data is not being subsequently loaded from the PSA when the 'Only PSA' with 'Update subsequently in data targets' option is used. Please tell me how to update data from the PSA into the data target when 'Only PSA' with 'Update subsequently in data targets' is selected.
    Many thanks.

    Hi,
    In a BW 3.x data flow, 'Only PSA' with the 'Update subsequently in data targets' option in the InfoPackage will load up to the PSA only in the case of process chains.
    After this InfoPackage, we must add the additional step "Update from PSA to Target" in order to load to the target.
    Hope this fixes your issue.

  • Latest PowerQuery issues with data load to data models built with older version + issue when query is changed

    We have a tool built in excel + Powerquery version 2.18.3874.242 - 32 Bit (No PowerPivot) using data load to data model (not to workbook). There are data filters linked to excel cells, inserted in OData query before data is pulled.
    The Excel tool uses organisational credentials to authenticate.
    System config: Win 8.1, Office 2013 (32 bit)
    The tool runs for all users as long as they do not upgrade to PowerQuery_2.20.3945.242 (32-bit).
    Once upgraded, users can no longer get the data to load to the model. Data still loads to the workbook, but the model breaks down. Resetting "load to data model" erases all measures.
    Here are the exact errors users get:
    1. [DataSource.Error] Cannot parse OData response result. Error: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
    2. The Data Model table could not be refreshed: There isn't enough memory to complete this action. Try using less data or closing other applications. To increase memory available, consider ......

    Hi Nitin,
    Is this still an issue? If so, can you kindly provide the details that Hadeel has asked for?
    Regards,
    Michael Amadi
    Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to vote it as helpful :)
    Website: http://www.nimblelearn.com, Twitter:
    @nimblelearn

  • What version of Data Modeler comes with SQL Developer EA 3.1?

    Hi:
    I've looked but can't find which version of Data Modeler ships with SQL Developer 3.1 EA1. Is it Data Modeler EA 3.1 or the production Data Modeler 3.0 version?
    Thanks,
    Doc

    Hi:
    I ran an import, and here's what the log showed; it looks like SQL Developer EA 3.1 uses the production version of SQL Data Modeler, 3.0.
    Oracle SQL Developer Data Modeler 3.0.0.665.2
    Oracle SQL Developer Data Modeler Import Log
    Date and Time: 2011-11-14 14:19:42 PST
    Design Name:
    RDBMS: Oracle Database 10g

  • Data Center Design: Nexus 7K with VDC-core/VDC-agg model

    Dear all,
    I'm working on a collapsed VDC-core/VDC-agg model on the same chassis, with 2 redundant Cisco Nexus 7010s and a pair of Cisco 6509s used as a service chassis without VSS. Each VDC core has redundant links to 2 PEs based on the Cisco 7606.
    After reading many Cisco design documents, I'm wondering what the need for a core layer in a data center is, especially if it is small or medium-sized, has only one aggregation layer, and is dedicated to a virtualized multi-tenant environment. What drives the decision to have a core layer?
    Thanx

    If your data center is small enough not to require a core, then it's fine to run with a collapsed core (distribution + core on the same device). For a redundant design, you need to uplink all your distribution switches to each of your cores. If you have no cores, then you need a full mesh at your distribution layer (for full redundancy).
    Let's say you have only 4 distribution pairs, so 8 switches. For full redundancy, each one needs an uplink to every other, which means you need n(n-1)/2 = 28 total ports to connect all the switches together (assuming 1 link to each device). However, with redundant cores, the number of uplinks drops to 21 total links (this includes the link within each distribution pair and the link between the two cores). So you're only saving 7 links; you're not gaining much by adding a core.
    However, if you have 12 distribution pairs, so 24 switches, full redundancy means 276 links dedicated to this. If you add a core, this drops to 61 links. Here you see the payoff.
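    The link counts above can be checked with a quick calculation (a sketch; it assumes one link per connection, two cores, a cross-link within each distribution pair, and the core-to-core link):

```python
def full_mesh_links(n: int) -> int:
    """Links needed for a full mesh of n switches: n(n-1)/2."""
    return n * (n - 1) // 2

def core_design_links(n_switches: int, n_cores: int = 2) -> int:
    """Links with a redundant core layer: each distribution switch
    uplinks to every core, plus one link within each distribution
    pair, plus the core-to-core link."""
    return n_switches * n_cores + n_switches // 2 + 1
```

    With 8 distribution switches this gives 28 full-mesh links versus 21 with cores; with 24 switches, 276 versus 61, which is where the dedicated core layer pays off.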

Maybe you are looking for

  • How to delete the transparent frames that Pluraleyes adds to replaced audio clips in an event ?

    Hi there, I have a workflow issue working with Pluraleyes 3 and FCPX. MY CONTEXT So I imported all my clips into FCPX (about 300 from camera 1, about 300 from camera 2 and separate audio which ran almost with only very few interruptions on set). I sy

  • Attribute name images in table headers

    Hi all, it is quite easy to use attribute images for the text values in Publisher. But how can I use the attribute image for the attribute name in Publisher? I would like to have one column with an image in the header and text values to be shown for

  • Query is doing full table scan

    Hi All, The below query is doing full table scan. So many threads from application trigger this query and doing full table scan. Can you please tell me how to improve the performance of this query? Env is 11.2.0.3 RAC (4 node). Unique index on VZ_ID,

  • .class expected error

    I am consistently recieving an error message that says ".class expected" on the "Easter date = new Easter (int year);" line every time i try to compile my tester class. Can anyone help me figure out the problem? Here is my tester/driver class: * Test

  • Top row of keyboard wrong after goto snow leopard

    Just updated from tiger to snow leopard, now the top row keys (using 2 different keyboards that work ok on other macs correctly) do not work correctly: Speaker up vol key: starts dashboard Speaker down vol key: all windows hide Speaker on/off: shows