CSV flat file loading data issue

Environment: SQL Server 2008 R2
Purpose: Load source data which are stored in a csv file to the destination table
Problem: After setting up the flat file source transformation, including the columns and their output aliases (see the following picture), the load fails because SSIS set the data type to 50 characters while the actual length is 65. I also set the corresponding string field to 100 characters in the destination table, yet an error message still appeared, as shown in the following picture.
What could I do differently to overcome this issue? Please advise. (A quick way to verify the actual column widths in the file is sketched after the error list below.)

There are a number of error and warning messages, though not all of them are associated with that field:
[SSIS.Pipeline] Warning: Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available.  To resolve, run this package as an administrator, or on the system's console.
1. [Flat File Source [511]] Error: Data conversion failed. The data conversion for column "Version_cd" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
2. [Flat File Source [511]] Error: The "output column "Version_cd" (604)" failed because truncation occurred, and the truncation row disposition on "output column "Version_cd" (604)" specifies failure on truncation.
A truncation error occurred on the specified object of the specified component.
3. [Flat File Source [511]] Error: An error occurred while processing file "C:\Users\data\ETL files\dummydata.csv" on data row 2.
4. [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component "Flat File Source" (511) returned error code 0xC0202092.  The component returned a failure code when the pipeline engine called PrimeOutput().
The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
5. Task Data Flow Task failed
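
Not part of the original post, but since the root cause is a column-width mismatch, here is a minimal sketch (Python; it assumes the first row of the file holds column names and that the delimiter is a comma) for measuring the real maximum width of each column, so the Flat File Connection Manager's output column width (which defaults to 50) can be set at least that large:

    import csv

    # Measure the maximum character length per column of the CSV, so the
    # flat file source's column widths can be set at least that large.
    # Path is taken from the error message above; delimiter assumed ','.
    path = r"C:\Users\data\ETL files\dummydata.csv"

    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)          # assumption: first row = column names
        widths = [0] * len(header)
        for row in reader:
            for i, value in enumerate(row):
                if i < len(widths):
                    widths[i] = max(widths[i], len(value))

    for name, width in zip(header, widths):
        print(f"{name}: max length {width}")

With the true widths known, the usual fix is to raise the OutputColumnWidth for Version_cd on the Advanced page of the Flat File Connection Manager to at least the measured length.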

Similar Messages

  • Flat File loading: Initialize without Data Transfer is disabled in BI 7.0

    Hi experts,
    When loading through a flat file in BI 7.0, the InfoPackage-level option 'Initialization Delta Process with Data Transfer' comes up by default, but the option I want, 'Initialization Delta Process without Data Transfer', is disabled. (In the creation of the DataSource (flat file), on the Extraction tab, the delta process is set to FIL1 Delta Data (Delta Images).)
    Please provide a solution.
    regards
    Subba reddy.

    Hi Shubha,
    For flat file loads, please go through the following link:
    http://help.sap.com/saphelp_nw70/helpdata/EN/43/03450525ee517be10000000a1553f6/frameset.htm
    This will help.
    Regards,
    Mahesh

  • Urgent: Flat file load issue

    Hi Guru's,
    We are loading into a data target (ODS) via flat file. The problem is that the flat file shows the date field values correctly with 8 characters, but when we do a preview it shows only 7 characters, and the load does not go through.
    Does anyone know where the problem is, i.e. why the preview screen shows 7 characters for the dates when the flat file has 8?
    Thanks
    MK

    Hi Bhanu,
    How do I check whether a conversion is specified or not? And another thing: it is not just one field. We have 6 date fields, and all of them show 7 characters in the PSA after loading, whereas the flat file has 8 characters in all 6 date fields.
    In the PSA I checked the error messages; there are 2:
    First error message:
    The value '1# ' from field /BIC/ZACTDAYS is not convertible into the DDIC data type QUAN of the InfoObject in data record 7. The field content could not be transferred into the communication structure format.
    System Response
    The data to be loaded has a data error, or field /BIC/ZACTDAYS in the transfer structure is not mapped to a suitable InfoObject. The conversion of the transfer structure to the communication structure was terminated. The processing of data records with errors was continued in accordance with the settings in error handling for the InfoPackage (tab page: Update Parameters).
    Procedure
    Check the data conformity of your data source in the source system. On the 'Assignment IOBJ - Field' tab page in the transfer rules, check the InfoObject-field assignment for transfer-structure field /BIC/ZACTDAYS. If the data was temporarily saved in the PSA, you can also check the received data for consistency in the PSA. If the data is available and correct in the source system and in the PSA, you can activate debugging in the transfer rules by using update simulation on the Detail tab page in the monitor. This enables you to perform an error analysis for the transfer rules. You need ABAP experience to do this.
    Second error message:
    Diagnosis
    The transfer program attempted to execute an invalid arithmetic operation.
    System Response
    Processing was terminated and flagged as incorrect. You can carry out the update again any time you choose.
    Procedure
    1. Check that the data is accurate. In particular, check that the data types and lengths of the transferred data agree with the definitions of the InfoSource, and that the fixed values and routines defined in the transfer rules are maintained correctly. The error may also have arisen because you didn't specify the existing headers.
    2. If necessary, correct the data error and start processing again.
    Thanks
    MK
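
    Not from the original thread, but since the error text advises checking data conformity in the source, here is a minimal sketch (Python; the file name and the column positions of the six date fields are hypothetical) that flags values which are not 8-digit YYYYMMDD dates:

        import csv

        DATE_COLUMNS = [3, 4, 5, 6, 7, 8]   # hypothetical positions of the 6 date fields

        with open("upload.csv", newline="") as f:
            for line_no, row in enumerate(csv.reader(f), start=1):
                for col in DATE_COLUMNS:
                    value = row[col].strip() if col < len(row) else ""
                    if len(value) != 8 or not value.isdigit():
                        print(f"row {line_no}, column {col}: suspect date {value!r}")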

  • Transformation not generating - Flat file loading

    Hello guys, I hope you can help me with this little confusion I have on BI7 Flat file loading.
    I got a File (CSV) on my workstation. I am trying to load Master Data. Here is the example of my file and issues:
    Lets say, I have CSV file named "CarModel.CSV" on my PC.
    This file has 10 records and no attributes for this field.
    So the records should look like this (and they display correctly at the PSA and DataSource levels):
    A
    B
    C
    D
    E
    F
    My goal is to load the flat file data into an InfoObject (inserted in an InfoProvider).
    I created the source system, the DataSource, and all that stuff.
    I am now on the Display DataSource screen, under the Proposal tab. I set it to show 10 records and hit 'Load Example Data'; it works fine and shows all 10 records. However, in the bottom part of this screen, what should show as a field? In my case, at first it showed the first record ("A"). I didn't think that was correct, but I proceeded anyway, and the transformation could not be generated.
    I also tried deleting the "1" from the field "No. of header rows to be ignored", with the same result: no transformation. I know it should be very simple to load this data, but I am not sure what I am doing wrong. My questions are:
    1) What should show under the Field/Proposal tab in my case? Am I supposed to create this field myself?
    2) Should the system propose the field from my flat file header? In my case I don't have any header. Should I include a header in my CSV file, like "CarModel"? I am confused; please give me some info. Thanks.
    dk

    Hi,
    In the Field tab, you have to enter your InfoObject names in order; when you press Enter, it will automatically fill in the description and the other attributes.
    I guess you should have some header, e.g. "Customer, Cust ID"; that is what you have to enter as fields in the Proposal tab. Try that; see the example layout below.
    rgds,
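
    For illustration only (not from the original reply), a minimal master-data file for this scenario could look like the following, with a header line as the reply suggests and 'No. of header rows to be ignored' set to 1:

        CARMODEL
        A
        B
        C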

  • What are the settings for datasource and infopackage for flat file loading

    Hi,
    I'm trying to load data from a flat file to a DSO. Can anyone tell me what the settings for the DataSource and InfoPackage are for flat file loading?
    Please let me know.
    regards
    kumar

    Loading of transaction data in BI 7.0: a step-by-step guide on how to load data from a flat file into the BI 7 system.
    Uploading of Transaction data
    Log on to your SAP
    Transaction code RSA1—LEAD YOU TO MODELLING
    1. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( Transaction data )
    • In general tab give short, medium, and long description.
    • In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
    • In proposal tab load example data and verify it.
    • In the Field tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, since the server will map them automatically. If you do not map them in this Field tab, you have to map them manually during the transformation in the InfoProviders.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load data to the PSA. (Make sure to close the flat file during loading.)
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to create ODS( Data store object ) or Cube.
    • Specify a name for the ODS or cube and click Create.
    • From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
    • Click Activate.
    • Right click on ODS or Cube and select create transformation.
    • In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
    • Activate created transformation
    • Create a Data Transfer Process (DTP) by right-clicking the data target.
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
    4. Monitor
    Right-click the data target, select Manage, and in the Contents tab select Contents to view the loaded data. There are two tables in the ODS, the new table and the active table; to move data from the new table to the active table, you have to activate the request after selecting the loaded data. Alternatively, the monitor icon can be used.
    Loading of master data in BI 7.0:
    For Uploading of master data in BI 7.0
    Log on to your SAP
    Transaction code RSA1—LEAD YOU TO MODELLING
    1. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
    • In general tab give short, medium, and long description.
    • In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
    • In proposal tab load example data and verify it.
    • In the Field tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, since the server will map them automatically. If you do not map them in this Field tab, you have to map them manually during the transformation in the InfoProviders.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load data to the PSA. (Make sure to close the flat file during loading.)
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to select Insert Characteristics as info provider
    • Select required info object ( Ex : Employee ID)
    • Under that info object select attributes
    • Right click on attributes and select create transformation.
    • In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
    • Activate created transformation
    • Create Data transfer process (DTP) by right clicking the master data attributes
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
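
    As a hedged illustration of the kind of file this guide expects (the field names are hypothetical): the first line is the header that 'header rows to be ignored = 1' skips, and the separator is the comma configured in the Extraction tab:

        EMPLOYEEID,FIRSTNAME,COSTCENTER
        1001,Anna,CC10
        1002,Ben,CC20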

  • Flat-File Loading problem

    Hi Friends,
    I am struggling with a flat-file loading problem. I am trying to load a .csv file into a data target. I took all precautions while loading the data: I looked at the preview and simulated the data, and everything is OK. But when I schedule the load, I find 0 records in the monitor. The following is the STATUS message for the above problem:
       No data available
    Diagnosis
    The data request was a full update.
    In this case, the corresponding table in the source system does not
    contain any data.
    System response
    Info IDoc received with status 8.
    Procedure
    Check the data basis in the source system.
    Can anybody tell me what the problem is and the procedure to resolve it?
    Regards,
    Mahesh

    Hi Eugene,
    Thanks for the quick reply. The following is what the Details tab shows:
    Overall status: Missing, with messages or warnings
    Request: Missing messages
    Extraction: Everything is OK - Data request received - No data available, data selection ended
    Processing: No data
    Please guide me to locate the problem.
    Regards,
    Mahesh

  • Decimal point separator in flat file load

    Hi all
    In BI 7.0 I'm stuck trying to load a CSV flat file via DTP (without extracting from the PSA). The file I have to load has a euro amount field in this format:
    143565,56
    but the load fails. The only way to load it is to open the CSV with Notepad and 'find and replace' the commas with dots before starting the load, in order to have:
    143565.56
    I tried to modify the SU01 settings for the decimal separator, but without success, and I can't find any relevant setting in the DTP. Is there a way to load amounts with the comma as decimal separator, avoiding the 'find & replace' on the file?
    Thank you in advance
    Francesco

    Sorry, but your hints don't work for me.
    I checked the DataSource Extraction tab and specified the comma as decimal separator under 'User Select Entry', but without success (I tried both extracting from the PSA and not).
    I tried the SU01 user setting for the decimal separator, but it was already correct (comma); I think SU01 only controls the visualization of amounts, for example in the BEx.
    Before the upgrade to 7.0, the decimal separator was correctly the comma, and we have a lot of Excel macros that create CSVs with that separator, so this is becoming a serious problem...
    Just a question: where can I check (if such a transaction code exists) the BW system setting for decimal and thousand separators (not the user's ones)?
    Thank you all
    Francesco
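
    The thread leaves the find-and-replace workaround manual. As a sketch of automating it outside BW (Python; the file names and the amount column index are assumptions, and the field separator is assumed to be ';', as Excel typically writes in comma-decimal locales), only the amount column is touched:

        import csv

        # Rewrite the decimal comma in the amount column as a dot,
        # leaving the field separator alone. All names/indices assumed.
        with open("amounts_in.csv", newline="") as src, \
             open("amounts_out.csv", "w", newline="") as dst:
            reader = csv.reader(src, delimiter=";")
            writer = csv.writer(dst, delimiter=";")
            for row in reader:
                if len(row) > 2:
                    row[2] = row[2].replace(",", ".")   # 143565,56 -> 143565.56
                writer.writerow(row)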

  • Extraction, Flat File, Generic Data Source, Delta Initialization

    I have a couple of questions regarding data extraction.
    1. If your DataSource is a flat file, e.g. an Excel/CSV file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables to become an SAP source?
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calend. Day and Numeric Pointer? Which one is most common to select?
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    4. What are the steps, in terms of the InfoPackage, before, during and after delta initialization? I believe you can do it two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? What is the most common method, and why?
    5. If you want to add a field to a DataSource after using it for 6 months, and you want to do it without re-initializing the delta queue: you add the field in RSA6, then provide info for ABAP to populate the new field (info: name of the DataSource, extract structure, field added, name of the application table which contains the field). How does this work now that there is no setup table (it was deleted after initialization)? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    Thanks!

    Hi,
    1. If your DataSource is a flat file, e.g. an Excel/CSV file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables to become an SAP source?
    Once you create a DataSource for flat file extraction, it is specific to the file source system, hence you cannot change it to an application-table DataSource.
    In the InfoPackage you can change the source to the application server instead of the desktop; there is no need to change the DS.
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calend. Day and Numeric Pointer? Which one is most common to select?
    When we don't find a suitable standard extractor, we go for a generic one (e.g. if I want sales information along with finance information in one DataSource, we generally don't get a standard one, hence we go for a generic DS).
    Check the link below for more about generic DataSources:
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    For delta capturing you can use:
    Timestamp (if the table has a timestamp field, the last changed record in that table is stamped, so it is easy to get the delta based on the timestamp);
    Calday (if the table doesn't have a timestamp field, look for a calendar-day field, so the delta is based on the date on which documents are changed);
    Numeric pointer (if the table has neither of the above, we go for this option, where the delta is based on a steadily increasing numeric value).
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    A generic DataSource is nothing but extracting data directly from the database table, without any interface between the applications/systems.
    4. What are the steps, in terms of the InfoPackage, before, during and after delta initialization? I believe you can do it two ways:
    Full Update - Initialize Delta Process (without Data Transfer) - Delta Update, or
    Initialize Delta Process (with Data Transfer) - Delta Update.
    Am I right? What is the most common method, and why?
    Correct
    5. If you want to add a field to a DataSource after using it for 6 months, and you want to do it without re-initializing the delta queue: you add the field in RSA6, then provide info for ABAP to populate the new field (info: name of the DataSource, extract structure, field added, name of the application table which contains the field). How does this work now that there is no setup table (it was deleted after initialization)? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    Once you add the new field to the structure (DS), you will get the data from that point onwards, not historical data; hence the setup table concept does not apply (delta records come from the delta queue, not from the setup table).
    If you want historic data in the new field, then you need setup table deletion and a refill, etc.
    Hope it is clear.
    Regards,
    Satya
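
    To make the delta-pointer idea in point 2 concrete, here is a language-neutral sketch (Python; the table, field, and pointer storage are hypothetical stand-ins for what BW manages per DataSource): each load remembers the highest timestamp it has seen, and the next load picks up only records changed after it.

        from datetime import datetime, timezone

        # Hypothetical application table and stored delta pointer.
        records = [
            {"doc": "A1", "changed_at": datetime(2010, 1, 1, tzinfo=timezone.utc)},
            {"doc": "A2", "changed_at": datetime(2010, 1, 5, tzinfo=timezone.utc)},
        ]
        last_pointer = datetime(2010, 1, 2, tzinfo=timezone.utc)  # from the previous load

        # Delta extraction: only records changed after the stored pointer.
        delta = [r for r in records if r["changed_at"] > last_pointer]

        # Advance the pointer so the next run starts where this one ended.
        if delta:
            last_pointer = max(r["changed_at"] for r in delta)
        print(delta, last_pointer)   # only A2 is extracted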

  • Flat File - Delete data

    Hi Expert,
    I have a flat file as data source. I want to know how to handle deletions with flat files.
    For example, if this is my flat file data:
    Company...Costcenter....Stock
    A......B......10
    M.....Z.......20
    K......W......25
    Company and Costcenter are keys.
    If the above three records were loaded into an ODS last month, and this month I realize that the record A..B..10 is wrong and not needed, how do I delete it from the ODS? Do I need to send some indicator in the flat file for BW to understand that this is a delete entry?
    Another case: if the A..B record changes its value from 10 to 35, then I can send the same file (or the same record in a different file) again and it will overwrite the data, which is OK.
    Can the experts please explain these data load strategies?
    Jason

    Hello,
    For Case 1) Do a reverse posting of the record you wish to delete from the ODS, i.e. (A..B..-10).
    Case 2) If you are loading data only up to the ODS: your ODS has an active data table and a change log table, so the new record A..B..35 will overwrite the existing record, turning A..B..10 into A..B..35. If you are doing a further update to a cube, the change log will first create a reverse image of the existing record, i.e. A..B..-10, and then write the new record A..B..35:
    so --> A..B..10
             A..B..-10
             A..B..35, and the remaining result will be A..B..35.
    This works provided you maintain the delta method in your flat file.
    -- EnjoySAP:-)
    **Award Credits if you find the solution, Have a great day
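
    A small sketch (Python; the helper is hypothetical) of the change-log arithmetic the reply walks through: an overwrite on the ODS key produces a reverse image of the old record plus the new record, so an additive target such as a cube nets out to the correct value.

        # Key = (Company, Costcenter); value = Stock, as in the thread's example.
        def change_log_entries(old, new):
            """Overwrite delta: reverse image of the old record, then the new one."""
            entries = []
            if old is not None:
                entries.append({**old, "Stock": -old["Stock"]})   # e.g. A, B, -10
            entries.append(new)                                   # e.g. A, B, 35
            return entries

        old = {"Company": "A", "Costcenter": "B", "Stock": 10}
        new = {"Company": "A", "Costcenter": "B", "Stock": 35}
        log = change_log_entries(old, new)
        # Originally loaded 10, then -10 and 35 from the change log: net 35.
        print(old["Stock"] + sum(e["Stock"] for e in log))        # 35

    For the deletion case, sending only the reverse image (A, B, -10) nets the record out to zero, which is the 'reverse posting' the reply recommends.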

  • Best practise around handling time dependency for flat file loads

    Hi folks,
    This is a fairly common situation: handling time dependency for flat file loads. Can anyone share their experience with handling this? One common approach is to handle the time-validity changes within the flat file, where they are easily changeable by the user, but then again this is prone to user input errors. Another would be to handle this via a DSO. Possibly, this data could also be entered directly in BI using IP planning layouts. There is an IP planning function that allows loading flat file data, but it only works without the time-dependency factor.
    It would be great to hear thoughts or if anyone can point to a best practise document for such a scenario.
    Thanks.

    Bump!

  • Selection in IP for flat file Load

    Hi Experts ,
    I want to know whether I can have selections available for an InfoPackage created for a flat file DataSource.
    Your early response is highly appreciated.
    Regards
    Patil

    Hi,
       Yes, you can. But at the DataSource level you have to tick the selection fields, so that at InfoPackage level those fields will be available for selection.
    Regards
    Sankar

  • How to convert Flat file(.txt) data to an Idoc format(ORDERS05)

    Hi,
    How can I convert flat file (.txt) data into IDoc format (ORDERS05)? If any function module does this, please let me know.
    thanks in advance,
    Chand
    Moderator message : Duplicate post locked. Read forum rules before posting.
    Edited by: Vinod Kumar on Jul 26, 2011 11:11 AM

    Hi,
    For more information, please check this link:
    http://sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/46759682-0401-0010-1791-bd1972bc0b8a
    Have a look at the function module IDOC_XML_FROM_FILE. Maybe it helps...
    Regards

  • Unit Code, Commercial Code - Flat File loading vs BPS Loading

    Hi SDN Community,
    Recently, we moved our flat file loading to be performed by BPS interfaces.
    During flat file loads, the commercial code of all units of measure is verified, e.g. DAY.
    But when loaded via BPS, the unit code is verified, e.g. TAG.
    In the display of the BPS upload, it displays DAY, which is the commercial code.
    The only issue is that the customer is forced to use TAG in the BPS upload files.
    They wish to create another record in transaction CUNI with:
    Unit code = DAY
    Commercial code = DAY
    However, I have found that we cannot allocate the same commercial code to another unit code.
    Is this a design constraint, or a process error on my part?
    Thank you.
    Simon

    04.01.2010 - 02:22:53 CET - Reply by SAP     
    Dear customer,
    I do not fully understand how this works: the base table T006A has 2 entries, 1 for English and 1 for German. Should it not be that the English (EN) entry is used rather than the German (DE) one, so that DAY should be used? Can you please confirm my understanding of this?
    >>> If you check my attachment "SE16.xls", you can see that for language >>English<<, the internal format is TAG while the external format is DAY.
    - Are there any plans to modify the BPS functionality in newer SAP BW versions to allow the customer to use the same UOM as in flat file loads? Or is there an upgraded Support Package that allows this?
    - If not, would it be possible to make customer enhancements to allow this, depending on customer requirements?
    >>> There is no plan at this moment to change this BPS functionality in newer SAP BW versions or support packages.
    We really recommend that you use the internal format TAG in the upload file in BPS, which I think should be acceptable and feasible for you. All other ways of trying to use the external format are risky, and we cannot assure you that they will work well, as they are not SAP standard functionality. (I think it's not worth the risk, as the standard function, which requires the internal format, should not be too unacceptable.)
    Thanks for your understanding.
    I also don't think my BC-SRV-ASF-UOM colleagues would be able to help you a lot regarding this.
    Best Regards,
    Patricia Yang
    Support Consultant - Netweaver BW
    Global Support Center China
    SAP Active Global Support
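
    To make the internal-versus-external distinction concrete, here is a toy lookup in the spirit of table T006A (Python; the mapping content beyond TAG/DAY is illustrative): flat file loads verify the external (commercial) code, BPS expects the internal unit code, and the reverse lookup must be unambiguous, which is why two unit codes cannot share one commercial code.

        # Illustrative T006A-style mapping: internal unit -> external (commercial) code.
        internal_to_external = {"TAG": "DAY"}

        # The reverse lookup must be one-to-one; allowing DAY -> DAY as well
        # would make it ambiguous, which is the constraint hit in CUNI above.
        external_to_internal = {ext: unit for unit, ext in internal_to_external.items()}

        def to_internal(code: str) -> str:
            """Accept either format; return the internal unit code BPS expects."""
            if code in internal_to_external:
                return code
            return external_to_internal[code]   # KeyError for unknown codes

        print(to_internal("DAY"))   # TAG
        print(to_internal("TAG"))   # TAG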

  • What are the advantages of idoc compare to flat file. how data is secure

    What are the advantages of an IDoc compared to a flat file? How is data more secure in IDocs compared to flat files?

    Hi Ramana,
    In simple words, the main advantage of an IDoc over a flat file is security.
    Let me explain with a scenario. Say you have a flat file with all the data. If you have the flat file, you can modify the data in it if you want, and anyone else who can access the file can modify it too. That is the situation when you maintain the file on the presentation server.
    One level of security higher than the above is maintaining the flat file on the application server; there, even though most people do not have access to the file, a superuser who does have access may still modify or delete the data.
    So at both of those levels you don't have 100% security.
    That is where IDocs come into the picture. IDocs are simply data carriers; they are generated by a program, not manually, and the data is divided into a number of segments based on your program. So it is not easy to modify the data in these IDocs manually. If any change has to be made to the data, you have to modify the data in the application and then update the IDoc, or generate a new IDoc with the corrected data. So in this case not even a superuser can manipulate the data directly in the IDoc.
    I think you see my point.
    If you find it useful, mark the points.
    ~~Guduri

  • Strange behavior with csv flat file integration

    Hi,
    I am seeing some really strange behavior with CSV flat file integration.
    I defined the inbound file with | as the separator.
    Here is the sender payload result; extra columns appear!
    <row>
      <VAL1>D</VAL1>
      <VAL2 />
      <VAL3>01</VAL3>
      <VAL4 />
      <VAL5 />
      <VAL6>003160000</VAL6>
      <VAL7 />
      <col>003160001</col>
      <col />
      <col>2</col>
      <col />
      <col>91200604212</col>
      <col />
      <col>VIRTUAL DJ HOME EDITION</col>
      <col />
      <col>2</col>
      <col />
      <col>2</col>
      <col />
      <col />
      <col>2</col>
      <col />
      </row>
    When I changed the separator to ; in my file and in the inbound file configuration, it gives the following, and everything is OK:
    <row>
      <VAL1>D</VAL1>
      <VAL2>01</VAL2>
      <VAL3 />
      <VAL4>003160000</VAL4>
      <VAL5>003160001</VAL5>
      <VAL6>7</VAL6>
      <VAL7>91200604212</VAL7>
      <col>VIRTUAL DJ HOME EDITION</col>
      <col>10</col>
      <col>10</col>
      <col />
      <col>10</col>
      </row>
    Why does using a different separator change the payload?
    It would be good for me if I could use the pipe separator.
    Thanks
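
    Not from the original thread, but a quick sanity check (Python; the sample line is hypothetical) shows that pipe-delimited data parses cleanly when the delimiter is declared as '|', with empty fields staying empty rather than spawning extra columns; this suggests the extra <col> elements come from the adapter's content-conversion configuration rather than from the pipe character itself:

        import csv
        from io import StringIO

        sample = "D||01|||003160000||003160001"   # hypothetical pipe-separated record

        # csv treats '|' like any other single-character delimiter.
        row = next(csv.reader(StringIO(sample), delimiter="|"))
        print(row)   # ['D', '', '01', '', '', '003160000', '', '003160001']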


Maybe you are looking for

  • How to push a view to another view to a specific State in Mobile App?

    I want to open a second view that contains some states from the first view. Example: myFirstView  with a spark list component: So far I got this code for the view of the list to push to the other views: <fx:Script>         <![CDATA[             impor

  • How to return back my number

    I want to return my numbers

  • Server Newbie. TM help

    I had a WD ext HDD fire wired to my imac & backed TM directly to there. I have just set up Mac mini server & thought I would be able to plug in the ext HDD to the mac mini & then wirelessly back up my Imac to the same backup folder created when ti wa

  • Materialized view Vs Streams - can you recommend one?

    We have many materialized views (for change data capture) based on rowid as the source doesn't have primary keys. The source DB will be upgraded to 10/11g (from 9). Because the data would be exported and imported back, the rowids will be changed at t

  • IPad 2 discharging when plugged (10W)?

    Since using my brand new iPad 2 over the last two weeks or so I have noticed on more than one occasion that using it while plugged into that 10W adapter (220V) will actually result in the battery percent going down. For example today while using it (