ETL design pattern with Data Integrator

Hi all
I have searched a lot on the SAP website and also on Google for a reliable document about ETL design patterns that applies to BO Data Integrator, but without success. Most of the links I found were for SSIS, Informatica, and so on.
I would be grateful if you could point me to a good link about ETL design patterns.
Thanks.

I would start with these two:
https://wiki.sdn.sap.com:443/wiki/display/BOBJ/ETLProjectGuidelines
https://wiki.sdn.sap.com:443/wiki/pages/viewpage.action?pageId=49414406

Similar Messages

  • Producer/Consumer Design Pattern with Classes

    I'm starting a new project which involves acquiring data from various pieces of equipment using a GPIB port. I thought this would be a good time to start using classes. I created a GPIB class which contains member data of: Address, Open State, Error; with member VIs such as Set Address, Get Address, Open, Close... general actions that all GPIB devices need to do. I then created a child class for a specific instrument (an Agilent N1912 power meter for this example) which inherits from the GPIB class but also adds member data such as Channel A power and Channel B power, and the associated member VIs to obtain the data from the hardware. This went fine, and I created a test VI for verification utilizing a typical Event Structure architecture.
    However, in other applications (without classes) I typically use the Producer/Consumer design pattern with an Event Structure so that the main loop is not delayed by any hardware interaction. My queue data is a cluster of an "action" enum and a variant to pass data. Is it OK to use this pattern with classes? I created a VI and it works fine; attached is a PNG of one case of it.
    Are there any problems doing it this way?
    Jason

    JTerosky wrote:
    [...] Is it OK to use this pattern with classes? [...] Are there any problems doing it this way?
    Including the error cluster as part of the private data is something I have never seen done and... well, I'll have to think about that one.
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction
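    For readers outside LabVIEW: the "action" enum plus variant cluster described above maps naturally onto a typed command object on a blocking queue. A minimal Java sketch of that idea follows; all names here (Action, Command, the GPIB address string) are illustrative assumptions, not from the original post:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    // One queue element: an "action" tag plus an untyped payload,
    // mirroring the enum-plus-variant cluster from the post.
    enum Action { OPEN, READ_POWER, CLOSE, SHUTDOWN }
    record Command(Action action, Object payload) {}

    public class ProducerConsumerDemo {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<Command> queue = new LinkedBlockingQueue<>();

            // Consumer loop: owns all hardware interaction so the UI never blocks.
            Thread consumer = new Thread(() -> {
                try {
                    while (true) {
                        Command cmd = queue.take();
                        if (cmd.action() == Action.SHUTDOWN) break;
                        System.out.println("Handling " + cmd.action() + ", payload=" + cmd.payload());
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            consumer.start();

            // Producer side: the event loop just enqueues commands and moves on.
            queue.put(new Command(Action.OPEN, "GPIB::13"));
            queue.put(new Command(Action.READ_POWER, "Channel A"));
            queue.put(new Command(Action.SHUTDOWN, null));
            consumer.join();
        }
    }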

  • Choice of design pattern for data acquisition system

    Hello all
    I am having trouble selecting a suitable design pattern / architecture for a data acquisition system.
    Here are the details of the desired system:
    There is data acquisition hardware, and I need to use it by observing parameters on the user interface.
    The data acquisition period and the channel list to scan should be chosen on the user interface. Besides, there are many user interface interactions, e.g. if the user selects a channel to add to the scan list, then I need to enable and make visible some other parts of the user interface.
    When the user completes the channel selection, he will press the button to start data acquisition. Then I also need to show the scanned values on a graph in real time and log them to a txt file.
    I know that I cannot use the producer/consumer pattern here as-is, because the data acquisition loop has to wait for the parameters to scan channels, and it runs at a period given by the user. So the user interface loop runs at a higher rate than the consumer loop (the data acquisition loop), which means the queue will keep growing; and if I use a notifier instead, it will lose some of the data coming from the user interface.
    Is there any idea about that? Is there a suitable design pattern for this case?
    Thanks in advance
    best regards 
    Veli BAYAR
    Embedded Systems Software and Hardware Engineer 
    "You live in a graphical world. Why not program in one?"

    johnsold wrote:
    Veli,
    I recommend the Producer/Consumer model with some modifications.
    You might need three loops.  I cannot tell for sure from your brief description.
    The User Interface loop responds to the user inputs for configuration and start/stop of acquisition. The parameters and commands are passed to the Data Acquisition loop via a queue. In this loop is a state machine which has Idle, Configuration, Acquisition, and Shutdown states (and perhaps others). The data is sent to the Processing loop via a different queue. The Processing loop performs any data processing, displays the data to the user, and saves it to file. A notifier can be used to send the Stop or shutdown command from the User Interface loop to the other loops.
    If the amount of processing is minimal and the file write times are not too long, the Processing loop functions might be able to occur in the Timeout case of the UI loop Event structure. This simplifies things somewhat, but is not as flexible when changes need to be made.
    I am not sure that a Design Pattern for this exact setup exists but it is basically a combination of the Producer/Consumer (Events) and Producer/Consumer (Data) Design Patterns.
    Lynn
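    LabVIEW is graphical, but the three-loop structure Lynn describes can be sketched in plain Java to make the data flow concrete: a command queue from the UI to an acquisition state machine, and a data queue from acquisition to processing. Everything below (class names, states, the stop flag standing in for the notifier) is an illustrative assumption, not part of the original thread:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicBoolean;

    // UI -> (command queue) -> Acquisition -> (data queue) -> Processing.
    public class ThreeLoopSketch {
        enum State { IDLE, CONFIGURATION, ACQUISITION, SHUTDOWN }

        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<String> commands = new LinkedBlockingQueue<>();
            BlockingQueue<Double> data = new LinkedBlockingQueue<>();
            AtomicBoolean stop = new AtomicBoolean(false); // stands in for the Stop notifier

            // Data Acquisition loop: a state machine fed by the command queue.
            Thread acquisition = new Thread(() -> {
                State state = State.IDLE;
                try {
                    while (state != State.SHUTDOWN) {
                        switch (state) {
                            case IDLE -> state = "configure".equals(commands.take())
                                    ? State.CONFIGURATION : State.IDLE;
                            case CONFIGURATION -> state = "start".equals(commands.take())
                                    ? State.ACQUISITION : State.CONFIGURATION;
                            case ACQUISITION -> {
                                data.put(Math.random());            // pretend hardware read
                                if ("stop".equals(commands.poll())) state = State.SHUTDOWN;
                                Thread.sleep(100);                  // user-chosen scan period
                            }
                            default -> { }
                        }
                    }
                } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
                stop.set(true);
            });

            // Processing loop: displays/logs the data; drains the queue after stop.
            Thread processing = new Thread(() -> {
                try {
                    while (!stop.get() || !data.isEmpty()) {
                        Double sample = data.poll(200, TimeUnit.MILLISECONDS);
                        if (sample != null) System.out.println("log + display: " + sample);
                    }
                } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            });

            acquisition.start();
            processing.start();

            // UI loop stand-in: configure, start, then stop after half a second.
            commands.put("configure");
            commands.put("start");
            Thread.sleep(500);
            commands.put("stop");
            acquisition.join();
            processing.join();
        }
    }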
    Check out this thread: http://forums.ni.com/t5/LabVIEW/Multiple-poll-case-structures-to-event-help/td-p/2551309
    There are discussions there about a 3-loop architecture that may help you.
    Jeff
    Jeffrey Zola

  • How to load an XML schema with Data Integrator ?

    Post Author: Kris Van Belle
    CA Forum: Data Integration
    Does anyone have experience with loading data from a regular table into an XML schema?
    What exactly are the steps to undertake? The DI user manual does not provide that much information...

    Post Author: bhofmans
    CA Forum: Data Integration
    Hi Kris,
    In the Designer.pdf there is a chapter called 'nested data' with more information, but you can also check this website with some detailed instructions and examples on how to build a nested relational data model (NRDM).
    http://www.consulting-accounting.com/time/servlet/ShowPage?COMPANYID=43&ELEMENTID=161
    (Thanks to Werner Daehn for putting this together).

  • Design patterns  for data access

    What design patterns can be used for data access to ensure that only a minimum amount of data is maintained in session?

    That's not a design pattern, that's business logic.
    Your application has to determine what "minimum" means.
    Martin Fowler has lots of patterns for data access in his "Patterns of Enterprise Application Architecture". Check them out.

  • SAP Standard Extractors with Data Integrator

    Can the SAP ERP or APO standard extractors (e.g. 2LIS_12_VCITM, etc.) be used and managed by Data Integrator?

    Hi,
    You might want to post the question in the EIM area; this forum is for the Integration Kit for SAP product.
    Ingo

  • Problem with data integration when using KCLJ

    Hello,
    For a project, I had to integrate a new field using transaction KCLJ.
    For this I extended the DDIC structure of the sender structure, and after that I updated the corresponding transfer rules.
    When I execute transaction KCLJ I have no error, and table BUT000 is updated with the data of the flat file.
    The problem is that it also erases 6 fields of BUT000; they're not in the sender structure and so have no transfer rules.
    Could you help me?

    Hi
    Please read this.
    External Data Transfer
    These activities are not relevant if you use a CRM/EBP system.
    In the following activities you make definitions for transfer of business partner data or business partner relationship data from an external system to a SAP System.
    Data transfer takes place in several stages:
    1. Relevant data is read from the external system and placed in a sequential file by the data selection program. The data structure of the file is defined in the sender structure.
    This procedure takes place outside of the SAP environment and is not supported by SAP programs. For this reason, data changes can be made at this point by the data selection program.
    2. The sequential file is stored on an application server or a presentation server.
    3. The SAP transfer program reads data from the file and places this in the sender structure. This does not change the data. This step is carried out internally by the system and does not affect the user.
    4. Following transfer rules that have to be defined, the transfer program takes the data from the sender structure and places it in the receiver structure. During this step you can change or convert data.
    The receiver structure is firmly defined in the SAP system. Assignment of the sender structure to the transfer program, and of the transfer program to the receiver structure is made using a defined transfer category.
    5. The data records in the receiver structure are processed one after the other and, if they do not contain any errors, they are saved in the database.
    Before you transfer external data for the first time, make the following determinations:
    The structure of the data in the external system may not match the structure expected by the SAP system. You may have to supplement data.
    There are two ways in which you can adapt the structure:
    You make the required conversions and enhancements within the data selection program prior to beginning the transfer to the SAP system. This will be the most practical solution in most cases since you have the most freedom at this point.
    You do the conversion using a specially developed transfer program and transfer rules.
    You then define the fields of the sender structure. The system offers you the option of automatically generating a sender structure that is compatible with the receiver structure.
    You define transfer rules to create rules according to which the fields of the sender structure are linked with those of the receiver structure.
    You now carry out the transfer.
    SAP Enhancements for External Data Transfer
    The following SAP enhancements are offered in the following areas of External Data Transfer:
    Four Customer Exits exist for the data transfer and for the conversion from IDoc segments. The exits are contained in the enhancement KKCD0001. As soon as the Customer Exits are activated, they are carried out for all sender structures or segments. The first two Customer Exits require minimal coding once they are activated. The sender structure concept is used when loading data into the SAP system; the segment concept is used in the context of distribution of the SAP system. In both cases it is a matter of a record of data to be transferred or converted.
    It is advisable to code a CASE instruction within the Customer Exit, in which (differentiated according to sender structure (REPID) or segment) different coding is executed. For the conversion from IDoc segments, the parameter REPID contains the name of the segment; the parameter GRPID is not filled for the conversion from IDoc segments. You should have a WHEN OTHERS branch within the CASE instruction, in which 'SENDER_SET' is assigned to 'SENDER_SET_NEW' or 'RECEIVER_SET' to 'RECEIVER_SET_NEW'; otherwise the return code will keep its initial value. You can view a possible solution in the code sample.
    The first Customer Exit is accessed before the summarizing or conversion. It is called up as follows:
    CALL CUSTOMER-FUNCTION '001'
      EXPORTING
        GRPID          = GRPID       " Origin
        REPID          = REPID       " Sender program
        SENDER_SET     = SENDER_SET  " Sender record
      IMPORTING
        SENDER_SET_NEW = SENDER_SET  " Modified sender record
        SUBRC          = SUBRC.      " Return code
    If the variable 'SUBRC' is initial, the modified record is processed further; otherwise it is skipped. The import parameter 'SENDER_SET_NEW' must be filled in the Customer Exit, as only this field, and not the field 'SENDER_SET', is processed further. In particular, this means that for records that otherwise receive no special handling you must assign the value of 'SENDER_SET' to the import parameter 'SENDER_SET_NEW'.
    The second Customer Exit is accessed after the summarization and before the update:
    CALL CUSTOMER-FUNCTION '002'
      EXPORTING
        REPID            = REPID         " Sender program
        GRPID            = GRPID         " Origin
        RECEIVER_SET     = RECEIVER_SET  " Summarized record
      IMPORTING
        RECEIVER_SET_NEW = RECEIVER_SET  " Modified summarized record
        SUBRC            = SUBRC.        " Return code
    The modified record is only updated if the variable 'SUBRC' is initial. The import parameter 'RECEIVER_SET_NEW' must be filled in the Customer Exit, since only this field, and not the field 'RECEIVER_SET', is updated.
    The third Customer Exit is used for replacing variables. It is called up when you load the transfer rules.
    CALL CUSTOMER-FUNCTION '003'
      EXPORTING
        REPID = REPID
        GRPID = GRPID
        VARIA = VARIA
        RFELD = RFELD
        VARTP = VARTP
      CHANGING
        KEYID = KEYID
      EXCEPTIONS
        VARIABLE_ERROR = 1.
    The parameters REPID and GRPID are supplied with the sender structure and the origin. The variable name is in the field VARIA. The name of the receiver field is in the parameter RFELD. The field VARTP contains the variable type; valid types are fixed values of the domain KCD_VARTYP. You transfer the variable values in the parameter KEYID. If an error occurs, you use the exception VARIABLE_ERROR.
    The fourth Customer Exit is required in EC-EIS only. It is called up after the summarization and before the determination of key figures. It is a necessary enhancement to the second Customer Exit, because changes to the keys must be taken into account before the database is checked to see whether records already exist for those keys.
    The function is called up as follows:
    CALL CUSTOMER-FUNCTION '004'
      CHANGING
        RECEIVER_SET = R
        SUBRC        = UE_SUBRC.
    The parameter RECEIVER_SET contains the receiver record to be changed; it is a changing parameter. If the exit is not used, no changes need to be made to the function module.
    The user exits can be found in the module pool 'SAPFKCIM'. If you want to use the Customer Exits, create a project and activate them with transaction 'CMOD'. The enhancement you must use for this is KKCD0001.
    Note that when programming customer exits, these will also run when corrected data records are imported into the data pool within the context of post-processing, for both test and real runs.
    I will provide some pointers soon. Give me some time.
    Hope this will help.
    Please reward suitable points.
    Regards
    - Atul

  • Factory Design Pattern with Java generics

    I was wondering if it is possible to implement the factory pattern using generics-like syntax with Java 5. Something like:
    IFactory factory = new ConcreteFactory();
    Car c=factory.CreateObject<Car>();
    I saw this article the other day http://weblogs.asp.net/pgielens/archive/2004/07/01/171183.aspx
    done in C# for the .NET Framework 2.0, and tried to re-implement it with Java 5, however I was less than fortunate; can someone do a functional conversion?

    I had to change the signature a bit, but this is the best I came up with:
    (I don't like that I have to write Type.class, but if someone has a better idea please share. I deliberately used classes as return types, but you can easily program to the interfaces as well.)
    use:
    ConcreteFactory cf=new ConcreteFactory();
    Car c=cf.Create(Car.class);
    public interface Vehicle {
        String getName();
    }

    public class Plane implements Vehicle {
        String name = "Mig 29";
        public String getName() {
            return name;
        }
    }

    public class Car implements Vehicle {
        String name = "Volvo";
        public String getName() {
            return name;
        }
    }

    // The bound must match the implementation below; otherwise
    // ConcreteFactory.Create would not actually override this method.
    public interface IFactory {
        <T extends Vehicle> T Create(Class<T> type);
    }

    public class ConcreteFactory implements IFactory {
        public <T extends Vehicle> T Create(Class<T> type) {
            try {
                return type.newInstance();
            } catch (InstantiationException e) {
                return null;
            } catch (IllegalAccessException e) {
                return null;
            }
        }
    }
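    A side note on the reflection call: Class.newInstance() has been deprecated since Java 9, because it propagates checked exceptions thrown by the no-arg constructor without wrapping them. On newer JDKs the factory body would typically be written as below (a sketch using the same hypothetical types as above):

    public class ConcreteFactory implements IFactory {
        public <T extends Vehicle> T Create(Class<T> type) {
            try {
                // Invokes the public no-arg constructor reflectively.
                return type.getDeclaredConstructor().newInstance();
            } catch (ReflectiveOperationException e) {
                return null; // or wrap and rethrow
            }
        }
    }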

  • Localization design patterns with TopLink

    Hello,
    We are creating web app with JSF and TopLink.
    Each table in the database may have several fields that can be localized.
    All localization resources are stored in a table that has these columns: table_name, column_name, language, localized_text.
    How would you suggest implementing localization using TopLink?

    Sorry, I missed one column.
    Columns in localization table: table_name, column_name, row_id, language, localized_text.
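    Whatever the mapping layer, the lookup this table implies is a composite key (table_name, column_name, row_id, language) resolving to localized_text. A minimal plain-JDBC sketch of such a resolver, assuming the localization table itself is named LOCALIZED_TEXTS (the table's own name was not given in the post, and this class is hypothetical):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    // Resolves one localized value from the shared localization table.
    public class LocalizationResolver {
        private final Connection connection;

        public LocalizationResolver(Connection connection) {
            this.connection = connection;
        }

        public String resolve(String table, String column, long rowId, String language)
                throws SQLException {
            String sql = "SELECT localized_text FROM LOCALIZED_TEXTS "
                       + "WHERE table_name = ? AND column_name = ? AND row_id = ? AND language = ?";
            try (PreparedStatement ps = connection.prepareStatement(sql)) {
                ps.setString(1, table);
                ps.setString(2, column);
                ps.setLong(3, rowId);
                ps.setString(4, language);
                try (ResultSet rs = ps.executeQuery()) {
                    // Caller decides the fallback (e.g. default language) on null.
                    return rs.next() ? rs.getString(1) : null;
                }
            }
        }
    }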

  • Performance degradation in Data Integrator Designer with Multi-user mode

    Post Author: ccastrillon
    CA Forum: Data Integration
    We are developing an information system based on a DataMart populated with data using ETL processes built with Data Integrator. Three of us are developing, and when we work at the same time we begin to have performance problems with Designer. Designer seems to freeze sometimes, and the development process becomes painful.
    Where is the problem? Accessing the repository? Is this a known bug?
    The Job Server? But it happens even when we don't launch any job, only when building ETL processes and manipulating objects (dataflows, workflows, etc.).
    We would appreciate any help. Thanks in advance.
    Carlos Castrilló

    Post Author: bhofmans
    CA Forum: Data Integration
    What do you mean by 'working at the same time'? You need 3 different repositories if you want to work with 3 developers, so there should be no impact at all when working simultaneously...
    -Ben.

  • Using Data Integrator Migration Mechanisms and Migration Tools

    Data Integrator provides two mechanisms for migrating jobs from development to test to production:
    - Export/import: this is the basic mechanism for migrating Data Integrator applications between phases. First you export jobs from the local repository to either a file or a database, then you can import them into another local repository.
    - Multi-user development: instead of exporting and importing applications, multi-user development provides a more secure check-in, check-out, and get mechanism, using a central repository to store the master copies of your application elements.
    Regardless of which migration mechanism you choose, Business Objects recommends you prepare for migration using one or more tools that best fit your development environment:
    - Naming conventions: just as Business Objects recommends you standardize Data Integrator object prefixes, suffixes, and path name identifiers to simplify your projects internally, we also recommend the use of naming conventions externally for migration purposes.
    - Datastore and system profiles: with multiple profiles, instead of a separate datastore (and datastore configuration) for each database instance, you can associate multiple datastore profiles with a single datastore connection.
    Each mechanism and tool is recommended for specific types of projects. For more detail, please see the Data Integrator Advanced Development and Migration Guide provided with Data Integrator 6.5.

    Hi Fahad!
    Thanks for your valuable information, but I don't have any background in data migration and such; I am a mobile communication developer, and this is my first assignment between two companies. The first company will give me the Oracle data, maybe in the shape of dump files; I will set up my virtual server and test there whether the data is valid. If the data is valid, then I will transfer it to the target platform.
    So for now I have no access to the source system the data is coming from. Can you please explain a little bit more about invalid objects, and whether export/import still works the same?
    Waiting for reply
    Regards
    Hani

  • Design pattern / data loading solution

    Hello all!
    I have been working on a few projects which involve loading data: sometimes remotely, sometimes locally, sometimes JSON, sometimes XML. The problem I am having is that due to the speed of development and the changing minds of various clients, I am finding my designs are too rigid, and I would like them to be more flexible. I have been trying to think of a reusable solution to data loading, and would like some advice, as I imagine many of you out there have had the same problem.
    What I would like to do is create a generic LoadingOperation abstract class, which has member variables of type Parser and Loader, which have parse() and loadData() methods respectively. Parser and Loader are interfaces, and classes that implement them could be XMLParser and JSONParser, LocalLoader and RemoteLoader, etc. With something like this I would like to have a new class extending LoadingOperation for each thing to be loaded, whether that's a local XML file, remote JSON, or whatever.
    The problem is that a specific Parser implementation cannot return custom data types without breaking the polymorphic behavior of the LoadingOperation class. I have been messing around with generics, declaring subclasses of LoadingOperation like
    class SpecificLoader extends LoadingOperation<CustomDataType>
    and doing similar things with the Parser classes, but this seems a bit weird.
    Does anyone have any suggestions on what I'm doing wrong / could be doing better? I want to be able to react quickly to changing specifications (ignoring the fact that they shouldn't be changing that much!) and have a logical separation of code, etc...
    thanks for any help!
    PS: I have also asked this question here: http://stackoverflow.com/questions/4329087/design-pattern-data-loading-solution

    rackham wrote:
    [...] Does anyone have any suggestions on what I'm doing wrong / could be doing better? [...]
    That depends on the specifics.
    The fact that it seems like processes are similar doesn't mean that they are in fact the same. My code editor and Word both seem to be basically the same but I am rather sure that generalizing between the two would be a big mistake.
    And I speak from experience (parsing customer data and attempting to generalize the process.)
    The problem with attempting to generalize is that you may generalize functionality that is not in fact the same. You then end up with conditional logic all over the place to deal with per-user differences; rather than saving time, that actually costs time, because the code becomes more fragile.
    That doesn't mean it isn't possible, but rather that you should ensure the behavior is in fact common before implementing anything.
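    If the generalization does turn out to be worth it for this case, the structure the question describes can be typed cleanly by making Parser generic and letting LoadingOperation carry the same type parameter, so no implementation has to cast. A minimal Java sketch (type names are either the poster's or hypothetical):

    // Loader fetches raw text from somewhere: a local file, HTTP, etc.
    interface Loader {
        String load();
    }

    // Parser turns raw text into a typed result; T removes the casting problem.
    interface Parser<T> {
        T parse(String raw);
    }

    // Composes a Loader with a Parser<T>; subclasses just fix T and supply parts.
    abstract class LoadingOperation<T> {
        private final Loader loader;
        private final Parser<T> parser;

        protected LoadingOperation(Loader loader, Parser<T> parser) {
            this.loader = loader;
            this.parser = parser;
        }

        public T run() {
            return parser.parse(loader.load());
        }
    }

    // Example subclass: a hypothetical operation producing a String.
    class SpecificLoader extends LoadingOperation<String> {
        SpecificLoader() {
            super(() -> "raw payload", raw -> raw.toUpperCase()); // stand-in loader/parser
        }
    }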

  • Data List Handler Design Pattern and paging queries

    I'm using TopLink in an MVC (regular beans, JSP, and Struts) application. I'm not using ADF, at least not until I make up my mind on the features-vs-portability tradeoff.
    Anyway, my application will have to use queries to show data to the user. This can easily be accomplished by putting the search parameters on my JSP and using ReadAllQuery to compose my query.
    My problem is that the query can return a large number of records, and showing all of them at once on the JSP might take longer than the user is willing to wait. That's where paging the results comes in handy.
    In another application of mine I do paging by manipulating the rownum of the query, but in that case I had control of the whole SQL statement. As this is the first time I'm working with TopLink, my feeling is that this can't be done here.
    There's the Data List Handler design pattern too, but that one also manipulates the SQL statement.
    I'm wondering now whether paging can be accomplished using TopLink..? Will I have to use ADF...? That kind of stuff.
    Thanks a lot to all.
    - Eduardo

    Eduardo,
    What you need to do is use cursors in TopLink. Scrollable cursors enable you to scroll through a result set from the database without reading the whole result set in a single database read. The ScrollableCursor class implements the Java ListIterator interface to allow for direct and relative access within the stream. Scrollable cursors also enable you to scroll forward and backward through the stream.
    Here is an example:
    ReadAllQuery query = new ReadAllQuery();
    query.setReferenceClass(Employee.class);
    query.useScrollableCursor();
    ScrollableCursor cursor = (ScrollableCursor) session.executeQuery(query);
    while (cursor.hasNext()) {
        System.out.println(cursor.next().toString());
    }
    cursor.close();
    I hope this answers your question.
    Deepak
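    Tying this back to the paging question: because the cursor supports direct and relative positioning, one page can be read by skipping ahead and taking a fixed number of rows, rather than materializing the whole result set. A sketch of that idea, under the assumption that ScrollableCursor's relative(int) advances the position the way a JDBC scrollable result set does:

    // Read page `pageNumber` (0-based) of size `pageSize` from the cursor.
    int pageSize = 25;
    int pageNumber = 3;
    ScrollableCursor cursor = (ScrollableCursor) session.executeQuery(query);
    cursor.relative(pageNumber * pageSize); // assumed: skips the earlier pages
    java.util.List<Object> page = new java.util.ArrayList<>();
    for (int i = 0; i < pageSize && cursor.hasNext(); i++) {
        page.add(cursor.next());
    }
    cursor.close();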

  • Real Advantages of Data Integrator integrated with SAP

    Hi
    Can anybody please tell me what the real advantages are of Data Integrator reading from SAP R/3 / ECC and performing ETL to SAP BW 7?
    I do not know enough about the integration between the two systems. I have a prospective client looking at the solution who needs some motivation.
    Thanks

    I would definitely not go down that route. There is good R/3 to BW integration without BODI in the middle. In this scenario I would see BODI as overhead only, which will slow your development down and will slow your day-to-day load processes down. I can't see any benefit at all.
    If you have complicated transformations, use ABAP in your transformations. That is much more powerful than scripting on the BODI side.
    If you do use BODI you will find it a struggle, first to extract data from R/3 and then to load data into BW. You will probably end up creating a lot of extra objects on both sides just to make BODI work. BODI will certainly not make your load processes perform better, and from a maintenance point of view I don't see any benefits at all.
    Maybe in the future when BODI has a better integration with both R/3 and BW it might become useful, but at this point in time I fail to see any benefits.
    I would only use BODI if I wanted to integrate data from various platforms, not for a single (SAP) platform.
    I would advise against a scenario using BODI for loading R/3 to BW.
    I hope this helps...
    Jan.

  • Data integrator designer login issue

    Post Author: jeffrey
    CA Forum: Data Integration
    What causes this Data Integrator Designer login problem? "LOGON EXCEPTION BODI-1112170: Cannot connect to repository. Please correct and retry."
    What is the possible solution for this error?
    Thank you.

    Post Author: bhofmans
    CA Forum: Data Integration
    There seems to be something wrong with the connection to your repository.
    To test the connectivity you can use the Repository Manager (installed on the same machine as your DI Designer), fill in the repository connection parameters and click 'Get Version'. This will check the connectivity and show you the version of your repository.
    Possible causes for this error:
    Database middleware not installed on the machine (which database type are you connecting to?)
    Database middleware not configured to connect to your repository (e.g. for Oracle, the instance must be defined in tnsnames.ora)
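    For the Oracle case, "defined in tnsnames.ora" means an entry like the following on the Designer machine (the alias, host, and service name are placeholders for your own values):

    DI_REPO =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = dbserver.example.com)(PORT = 1521))
        (CONNECT_DATA =
          (SERVICE_NAME = orcl)
        )
      )

    The repository connection in Designer / Repository Manager would then use DI_REPO as the Oracle connection name.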
