Loading from source using DBLink

Hello all,
I am trying to load from a table that I can reach only through a DB link. I have a DB link created to a remote schema.
OWB: 10gR2; Oracle DB (local and target): 10gR2; Oracle DB (remote, source): 9iR2.
I created a source with the following steps:
- I created a normal location (host:port:service) to connect to my local DB, where I have the DB link that connects to the remote DB.
- I created a source with a new location (DB-link connection) using the above location and selecting the DB link.
- The above step created a location for the data only; for the metadata, OWB forced me to create another location. Hence I created another source using the same location created in the first step and selecting the DB link.
- Using the above, I was able to connect to the source and import the required views and MVs.
Next, I created a mapping using this source. When I deploy the mapping, I get an error, "table or view does not exist". The generated script is not using the location details at all.
I did try registering the locations manually (though they are normally deployed along with the mapping); still, it did not help.
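For what it's worth, the link itself resolves when I test it directly in SQL*Plus on the local DB. It was created and verified along these lines (the names here are placeholders for my actual link and objects):

-- hypothetical names; the real link points at the 9iR2 source schema
CREATE DATABASE LINK my_remote_link
  CONNECT TO src_owner IDENTIFIED BY src_pwd
  USING 'REMOTE_TNS_ALIAS';

-- both of these return data from the remote DB
SELECT * FROM dual@my_remote_link;
SELECT COUNT(*) FROM some_source_view@my_remote_link;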
Any ideas, please?
Thanks
Maruthi

I tried creating everything all over again.
A location created with Database Link as the location type is not listed in the 'Metadata location' tab of the source module editor.
Alternatively, I created the source module again, and when the create-module wizard prompted me for the location, I selected the location that was created with the DB link as the location type. When I clicked Finish, I got the message: 'API2421: A metadata location is not defined for the module. Import cannot continue without a metadata location. To continue with import, select ok and a metadata location will be created'.
When I proceed here, a new location is created and I am prompted for the connection details. I created this one as DB-link type as well, and the import then takes place.
When I go back and open the editor for the source module, I still don't see the location that OWB created.
Is this some kind of bug, or am I missing something?
Please respond as early as possible.
Thanks a lot in advance!
Maruthi

Similar Messages

  • Doesn't load from source system to PSA and DSO

    Hi Gurus,
    We have launched a delta load from the source system (POSDM) to a DSO. In the source system there are 5,000,000 rows, but none were loaded into the DSO or the PSA. The delta load finished OK, but no data was loaded.
    If the delta load is launched again, will those 5,000,000 rows be loaded, or can they not be loaded again?
    Any idea about this issue? Any feedback will be really appreciated.
    Thanks in advance.

    Hi David,
    Are you sure these 5 million records are delta records that should be pulled into BW as a delta?
    Did you count the number of records in the underlying tables?
    Which DataSource are you using?
    Delta loads are supposed to bring only the new or changed records since the last data load, not all the records.
    Since the request is green, as you said, and it still shows 0 records, it is possible that nothing has changed since the last delta. Try to see the details of the data load; if it failed, that may be the reason you are not able to see any records.
    If you schedule the delta again, it will bring the records changed or created after the last delta.
    If the delta was unsuccessful, manually set the QM status and the overall status of the request to red in the monitor for the request, delete it from the PSA and from all targets into which it was loaded, and schedule the delta again; it should then bring the delta records.
    Thanks
    Ajeet

  • Data Load from Source to Destination

    Hello Forum Members,
    I have two databases on different Solaris servers.
    I have to load data from a source table [Server 1] into a target current table [Server 2] (truncating the existing data and reloading from the source table), and every Wednesday from the target current table into a target history table [Server 2] (data always appended).
    Any advice how to solve the issue is highly appreciated.
    Thanks and Regards,
    Suresh

    Unless there is a history table on the remote server, the history table would not appear to be an appropriate candidate for a materialized view.
    You could, in theory, create the live table as a materialized view rather than doing a TRUNCATE (or DELETE) and INSERT ... SELECT in your stored procedure. Since it sounds like you need some coordination between the refreshing of the materialized view and the population of the history table, it's not obvious that creating the live table as a materialized view buys you anything.
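    If you stay with the stored-procedure approach, the basic shape would be something like this (a sketch only: the link name src_link, the table names, and the scheduler job are assumptions, not tested code):

    CREATE OR REPLACE PROCEDURE refresh_current AS
    BEGIN
      -- replace the current table's contents from the remote source
      EXECUTE IMMEDIATE 'TRUNCATE TABLE target_current';
      INSERT INTO target_current
        SELECT * FROM source_table@src_link;
      COMMIT;
    END;
    /

    -- every Wednesday, append the current snapshot to the history table
    BEGIN
      DBMS_SCHEDULER.create_job(
        job_name        => 'APPEND_HISTORY',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN INSERT INTO target_history SELECT * FROM target_current; COMMIT; END;',
        repeat_interval => 'FREQ=WEEKLY; BYDAY=WED',
        enabled         => TRUE);
    END;
    /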
    Justin

  • Data loading from source system takes long time.

    Hi,
    I am loading data from R/3 to BW. I am getting the following message in the monitor.
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
    and/or
    the maximum wait time for this request has not yet run out
    and/or
    the batch job in the source system has not yet ended.
    Current status
    in the source system
    Is there anything wrong with partner profile maintenance in the source system?
    Cheers
    Senthil

    Hi,
    I suggest you check a few places where you can see the status:
    1) SM37 job log (in the source system if the load is from R/3, or in BW if it's a datamart load): give the request name, and it should show you the details of the request. If it's active, make sure that the job log is getting updated at frequent intervals.
    Also see if there is any 'sysfail' for any data packet in SM37.
    2) SM66: get the job details (server name, PID, etc. from SM37) and see in SM66 whether the job is running or not (in the source system if the load is from R/3, or in BW if it's a datamart load). See if it's accessing/updating some tables or not doing anything at all.
    3) RSMO: see what is available in the Details tab. It may be stuck in the update rules.
    4) ST22: check if any short dump has occurred (in the source system if the load is from R/3, or in BW if it's a datamart load).
    5) SM58 and BD87: check for pending tRFCs and IDocs.
    Once you identify the error, you can rectify it.
    If all the records are in the PSA, you can pull them from the PSA to the target. Otherwise you may have to pull them again from the source InfoProvider.
    If it's running and you can see it active in SM66, you can wait some time to let it finish. You can also try SM50/SM51 to see what is happening at the system level, such as reading/inserting into tables.
    If you feel it's active and running, you can verify this by checking whether the number of records in the data tables has increased.
    SM21 (the system log) can also be helpful.
    Thanks,
    JituK

  • Selectively deleting load from ODS using Process Chain

    Hi All,
    I have a requirement to delete the last load of the current month, if any, while leaving the previous month's load untouched; this has to be done using a process chain.
    To elaborate on the required condition: say we already have data for June in the ODS (loaded on June 10). When the load for July comes, it must not touch the June data; but on July 2, the load from July 1 should be deleted without touching the June 10 load. (The process chain will run for the first 10 days of the month.)
    How can this be implemented? There seem to be some function modules for this, but is there any other method by which the same can be implemented without much use of ABAP?
    Any relevant suggestion will be awarded points.
    Regards,
    Samved

    Hi Bhanu,
    Thanks for the reply.
    I tried to generate the program, but I think that can only be used for selective deletion from the ODS.
    My requirement is basically to delete the complete request from the ODS, not a selective deletion; at the same time, I cannot use the complete-deletion option, as that would delete the previous month's request as well.
    Is there any option at the InfoPackage level that can be used for this type of deletion?
    Regards,
    Samved

  • Load from Source to Destination - Table structures not the same

    Hi
    I have to copy data from a source table to a destination table. The structures of the two are not the same. The number of records in the destination table should be half the number in the source. The reason: in the source, a column named (e.g.) c_type holds 'Up' in one row and 'Down' in another row. The same data is represented in the destination as one row, since it has more columns, for example up_name, down_name, up_dep, down_dep.
    How can I insert into the destination depending on the c_type column in the source?
    Example:
    I want to insert into destination.up_name where c_type = 'Up' and destination.down_name where c_type = 'Down'...
    and so on. How can I write my SQL query so that I have to write only one insert statement and put the right data in the right columns?

    Mass25 wrote:
    Number of records in destination table should be half number of records in source. The reason being in the source a column named (e.g.) c_type = 'Up' - one row or 'Down' - in another row. The same is represented in the destination as 1 row since the number of columns is more. Example up_name, down_name, up_dep, down_dep.
    Hope this is what you are looking for:
    SQL> WITH SOURCE AS
      2       (SELECT 1 id_col, 'UP' c_type, 'up_name_1' name_col,
      3               'up_dept_1' dept_name
      4          FROM DUAL
      5        UNION ALL
      6        SELECT 1 id_col, 'DOWN' c_type, 'down_name_1' name_col,
      7               'down_dept_1' dept_name
      8          FROM DUAL
      9        UNION ALL
    10        SELECT 2 id_col, 'UP' c_type, 'up_name_2' name_col,
    11               'up_dept_2' dept_name
    12          FROM DUAL
    13        UNION ALL
    14        SELECT 2 id_col, 'DOWN' c_type, 'down_name_2' name_col,
    15               'down_dept_2' dept_name
    16          FROM DUAL
    17        UNION ALL
    18        SELECT 3 id_col, 'UP' c_type, 'up_name_3' name_col,
    19               'up_dept_3' dept_name
    20          FROM DUAL
    21        UNION ALL
    22        SELECT 3 id_col, 'DOWN' c_type, 'down_name_3' name_col,
    23               'down_dept_3' dept_name
    24          FROM DUAL)
    25  SELECT * FROM SOURCE
    26  /
        ID_COL C_TY NAME_COL    DEPT_NAME
             1 UP   up_name_1   up_dept_1
             1 DOWN down_name_1 down_dept_1
             2 UP   up_name_2   up_dept_2
             2 DOWN down_name_2 down_dept_2
             3 UP   up_name_3   up_dept_3
             3 DOWN down_name_3 down_dept_3
    6 rows selected.
    SQL> WITH SOURCE AS
      2       (SELECT 1 id_col, 'UP' c_type, 'up_name_1' name_col,
      3               'up_dept_1' dept_name
      4          FROM DUAL
      5        UNION ALL
      6        SELECT 1 id_col, 'DOWN' c_type, 'down_name_1' name_col,
      7               'down_dept_1' dept_name
      8          FROM DUAL
      9        UNION ALL
    10        SELECT 2 id_col, 'UP' c_type, 'up_name_2' name_col,
    11               'up_dept_2' dept_name
    12          FROM DUAL
    13        UNION ALL
    14        SELECT 2 id_col, 'DOWN' c_type, 'down_name_2' name_col,
    15               'down_dept_2' dept_name
    16          FROM DUAL
    17        UNION ALL
    18        SELECT 3 id_col, 'UP' c_type, 'up_name_3' name_col,
    19               'up_dept_3' dept_name
    20          FROM DUAL
    21        UNION ALL
    22        SELECT 3 id_col, 'DOWN' c_type, 'down_name_3' name_col,
    23               'down_dept_3' dept_name
    24          FROM DUAL)
    25  SELECT s1.id_col, s1.name_col up_name, s1.dept_name up_dept,
    26         s2.name_col down_name, s2.dept_name down_dept
    27    FROM SOURCE s1 JOIN SOURCE s2
    28         ON (s1.id_col = s2.id_col AND s1.c_type = 'UP' AND s2.c_type = 'DOWN')
    29  /
        ID_COL UP_NAME     UP_DEPT     DOWN_NAME   DOWN_DEPT
             1 up_name_1   up_dept_1   down_name_1 down_dept_1
             2 up_name_2   up_dept_2   down_name_2 down_dept_2
             3 up_name_3   up_dept_3   down_name_3 down_dept_3
    3 rows selected.
    SQL>
    The source had 6 records, and it was self-joined to give 3 records, which can be populated in the destination table.
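    If you need the actual INSERT rather than just the SELECT, a single pass with conditional aggregation also works. This is only a sketch against the same sample data; DESTINATION and its column list are assumptions based on your description:

    INSERT INTO destination (id_col, up_name, up_dep, down_name, down_dep)
    SELECT id_col,
           -- take the 'UP' row's values for the up_* columns
           MAX(CASE WHEN c_type = 'UP'   THEN name_col  END) up_name,
           MAX(CASE WHEN c_type = 'UP'   THEN dept_name END) up_dep,
           -- and the 'DOWN' row's values for the down_* columns
           MAX(CASE WHEN c_type = 'DOWN' THEN name_col  END) down_name,
           MAX(CASE WHEN c_type = 'DOWN' THEN dept_name END) down_dep
      FROM source
     GROUP BY id_col;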
    Always post some sample data with the desired result. That will get you quicker responses.
    Regards,
    Jo

  • Issue with load from livecache

    Dear all
    In our APO system we have encountered a strange issue with data loaded from livecache into the infocube.
    The problem is in the InfoCube, where there are two requests with forecast version data containing the same amount of data.
    When I look in my InfoCube in APO, I see e.g. the following two request IDs for a given product:
    REQU_4CZ4SQ5PSM9V5PHJCHTIS24GK is the most recent load from Livecache (replacing any previous livecache loads).
    APO_R43U3KZLM0V3WVK112YEK3KKLE cannot be located; it is not available in the InfoCube manage section.
    Ultimately this results in double data for this given product.
    Do any of you have an idea as to what this APO_* request could be triggered from?
    This is the input from the BW-team:
    This indicates that an APO program has modified the InfoCube data without updating the request tab of the InfoCube manage screen, but it has updated the request ID of the data that was transformed. With subsequent loads into the InfoCube, these records are split based on the different request IDs and loaded collectively into BI.
    I hope you can help me out.
    Best regards,
    Anders Thinggaard

    Hi Visu
    It seems I made a typing error in my description of the problem. It is of course the DESTINATION combination, not the SOURCE combination, that is written into the infocube when realigning on the infocube.
    This, of course, creates the request ID starting with APO. Besides that I get the request starting with REQU when I load from livecache using an infopackage. So far so good.
    What confuses me, though, is that when I load back into livecache (load my planning area), it seems to pick up the correct amount of data (not taking the APO* request into account), whereas the load to my external BW out of my APO InfoCube seems to pick up both the REQU* and the APO* requests, resulting in duplicate data.
    Have you had this challenge as well?
    My first idea is to have the BW team write some ABAP leaving out any request ID starting with APO*. However, it seems to me that this is standard functionality of APO, and I'd like to get to the bottom of this...
    Best regards,
    Anders
    P.S. As I understand it, the copy logic only determines whether data from the source combination is added to or overwrites the data for the destination combination.

  • Error message from source system

    Hi All,
    We have an InfoCube and an ODS that are loaded from another ODS. This is included in a process chain.
    In the process chain, the InfoPackage execution step has failed. When I right-click and choose Display Messages, I find the following message in the Messages tab:
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the
    And in Indiv Messages tab,
    Delta upload is only possible after successful initialization.
    Errors in source system.
    I tried to update the data targets from the initial ODS manually, but in vain.
    Any Help is awarded with points.
    Thanks
    AP

    Hi,
    Go to the InfoPackage -> on the menu bar choose Scheduler -> select 'Initialization options for source system' -> there will be one request with green status -> select it and click on the delete symbol; this deletes the init. You can cross-verify this in the base ODS, where (after a refresh) you will no longer find the data mart status. Now select 'Initialize delta process', and under that select 'Init without data transfer' and schedule the load; it will update one record and set the init flag for your source ODS.
    Note: if the last request was not updated from the source ODS to the targets, you will also find a data mart status for it. Select that particular data mart status -> click on the delete symbol; it deletes the status for that particular request (if the request was already updated to the target, it will not let you delete it). Then run the delta load; it will update the request that was not yet loaded from the source ODS to the subsequent targets.
    Regards
    Sankar

  • Compiling Packages from Source

    I am having some issues using the "make" command when compiling any kind of source code. When doing so, I get the sort of error that one would expect when trying to build without headers. However, this is not the case: I do have the kernel headers installed (matching the kernel version, of course), and still I am unable to build. Interestingly enough, I am able to make packages from the AUR from source using packer, so I assume that I am able to make packages. Or am I painfully deluded in my understanding?

    TheHebes wrote: @tavianator - I have looked; the RTL8188CE driver is NOT in the AUR (packer -Ss r 8188 comes up with nothing). Furthermore, @graysky, I do have kernel26-headers installed. Both the kernel and headers are version 2.6.39 3-1. For the most part, by the looks of it, there is no PKGBUILD available, so managing this through pacman is not looking like an option. From what I gather, "make" is the only way this is going to compile.
    The driver you want is indeed in the AUR.  Look at the description:
    http://aur.archlinux.org/packages.php?ID=46797
    The driver name and version number exactly match the errors from make you posted above:
    TheHebes wrote:make[1]: *** No rule to make target `*/rtl8192ce_linux_2.6.0006.0321.2011/HAL/rtl8192'.  Stop.

  • Data load from R3 is slow and gives rfc connection error

    Hi all, are there any debug capabilities available in BI 7.0 when it comes to debugging data loads from R/3 using an RFC connection (the ALEREMOTE and BWREMOTE users)?
    Any ideas where I can look for a solution?
    thanks.

    Hi,
    Check the connection between R/3 and BW.
    Or it may be a network problem; contact your system admin or Basis people.
    thanks,
    dru

  • Error -90904 / Load from ASCII

    Hi,
    I have a LabVIEW VI containing the "Load from ASCII" module from SignalExpress. Running the VI results in the error message "Error -90904". "Load from ASCII" with exactly the same settings works fine under SignalExpress. What is going wrong here?
    Bjoern

    Bjoern,
    I'm sorry, this was my mistake!
    But there is a download for LabVIEW SignalExpress, as follows:
    http://digital.ni.com/softlib.nsf/webcategories/86256C220055D6BC862570A8002C71A0?opendocument&node=1...
    BR, Christian

  • Can we use 0INFOPROV as a selection in Load from Data Stream

    Hi,
    We have implemented BW-SEM BPS and BCS (SEM-BW - 602 and BI 7 ) in our company.
    We have two BPS cubes for Cost Center and Revenue Planning and we have Actuals Data staging cube, we use 0SEM_BCS_10 to load actuals.
    We created a MultiProvider on BPS cubes and Staging cube as a Source Data Basis for BCS.
    Issue:
    When loading plan data or actuals data into the BCS cube (0BCS_C11) using the Load from Data Stream method, we have a performance issue. We automated the load process in a process chain; sometimes it takes about 20 hours just for the plan data load for 3 group currencies, followed by the elimination tasks.
    What I noticed is that when loading plan data, for example, the system is also reading the actuals cube, which is not required; there is no selection available on the mapping or selection tab where I can restrict the data load to a particular cube.
    I tried to add 0INFOPROV to the data basis, but then it doesn't show up as a selection option in the data collection tasks.
    Is there a way to restrict the data load into BCS using this load option and to restrict which cube I read data from?
    I know that there is a filter BAdI available, but I am not sure how it works.
    Thanks !!
    Naveen Rao Kattela

    Thanks Eugene,
    We do have other characteristics, like value type (10 = actual, 20 = plan) and version (100 = USD actual, 200 = USD plan), but when I load data into BCS using the Load from Data Stream method, the request goes to all the underlying cubes, which in my case are the planning cubes and the actuals cube. I don't want the request to go to the actuals cube when I am running only the plan load; I think it's causing a performance issue.
    For this reason, I am wondering if I can use 0INFOPROV, as we do in BEx queries, to filter the InfoProvider so that the data load performance improves.
    I was able to bring 0INFOPROV into the data basis by adding it to the characteristics folder used by the data basis.
    I can see this InfoObject on the Data Stream Fields tab. I check-marked it for use in the selection and regenerated the data basis.
    I was expecting that this field would now be available for selection in the data collection method, but it's not.
    So if it's confirmed that there is no way to use 0INFOPROV as a selection, I will suggest to my client a redesign of the data basis itself.
    Thanks,
    Naveen Rao Kattela

  • Error is data loading from 3rd party source system with DBCONNECT

    Hi,
    We have just finished an upgrade of SAP BW 3.10 to SAP NW 7.0 EHP1.
    After the upgrade, we are facing a problem with data loads from a third party Oracle source system using DBConnect.
    The connection is working OK and we can see the tables in the source system. But we cannot load the data.
    The error in the monitor is as follows:
    'Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.'
    But, unfortunately, the error message has no further information.
    If we look at the job log in SM37, the job finished with the following log:
    27.10.2009 12:14:19 Job started                                                                                00           516          S 
    27.10.2009 12:14:19 Step 001 started (program RSBATCH1, variant &0000000000119, user ID RXSAHA)                    00           550          S 
    27.10.2009 12:14:23 Start InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG                                              RSM1          797          S 
    27.10.2009 12:14:24 Element NOAUTHORITYCHECK is not available in the container                                     OL           356          S 
    27.10.2009 12:14:24 InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG created request REQU_4FMXSQ6TLSK5CYLXPBOGKF31G     RSM1          796          S 
    27.10.2009 12:14:24 Job finished                                                                                00           517          S 
    In a BW 3.10 system, there is no message related to the element NOAUTHORITYCHECK. So I am wondering if this is something new in NW 7.0.
    Thanks in advance,
    Rajib

    Errors like this usually come down to a few things:
    1. The RFC connection failed.
    2. A problem in the source system.
    3. The Oracle consultants may be filling up the loads; check with them, and ask them to stop if so.
    4. IDoc processing problems.
    5. Memory issues.
    6. Also try picking up the DataSource, changing it, then reactivating it and running the load again.
    Also check the RFC connection in SM59. If it is OK, then check SAP Note 692195 for the authorization issue.
    Santosh

  • Error in loading data into PSA from source system.

    Hi Experts !!!
    Good morning .
    I am trying to load data from an SRM source system into a BI system. When I execute the InfoPackage, the data is not loaded into the PSA, and the status of the process immediately turns yellow. After some time it times out and turns red. When I checked the error documentation, I found that the IDoc response was not received from the source system. The detailed error is below.
    System Response
    There are idocs in the source system ALE outbox that did not arrive in the ALE inbox of BI.
    Further Analysis
    Check the TRFC log
    You can access this log using the wizard or the menu path 'Environment -> Transact. RFC -> In the source system'.
    However, I was not able to navigate to this path. I think that if I can navigate to it, I can manually push the IDocs from the source system to the BI system.
    Regards,
    Mandar.

    Hi,
    Check the DataSource in RSA3. If it is working fine and you are able to see the data in RSA3, there is no problem at the DataSource level; then check the mappings and any routines in BW for that DataSource. If this is also fine, check the options below.
    Also check for dumps in ST22 and SM21.
    Check the RFC connection between the ECC and BW systems, i.e. RSA1 -> Source Systems -> right-click on the source system and choose Check.
    The BWREMOTE or ALEREMOTE user must have the following profiles, so add them in BW; one of these two users runs in the background to extract the data from ECC:
    S_BI-WHM_RFC, S_BI-WHM_SPC, S_BI-WX_RFC
    And also check the following things:
    1. The connections from BW to ECC and from ECC to BW in SM59.
    2. The port, partner profiles, and message types in WE20, in both ECC and BW.
    3. Dumps in ST22 and SM21.
    4. If IDocs are stuck: see the OLTP IDoc numbers on the RSMO screen in BW (Details tab, at the bottom). Take the IDoc numbers, go to ECC, and check their status in WE05 or WE02. If there is an error, check the log; otherwise go to BD87 in ECC, enter the IDoc numbers, execute them manually, and then watch RSMO and refresh.
    5. Check for stuck LUWs in SM58 (User Name = *), run it to see the stuck LUWs, select your LUW, execute it manually, and check RSMO in BW.
    See on SDN:
    Re: Loading error in the production system
    IDOC getting struck at TRFC (SM58)
    1. Reorganization of the table ARFCSSTATE.
    2. Increasing the resources in the system (number of work processes and memory), as this issue can happen owing to a shortage of them.
    The source system tries to send the tRFC to the target, and if there are no work processes available it goes into the 'Transaction Recorded' state; from there it will not try to send this tRFC again, so you have to execute it manually.
    You can also increase the timeout parameter so that it tries a few more times to send before it actually goes into the recorded state.
    Regards,
    Suman

  • How to load from multiple sources to multiple destination dynamically

    I have more than a hundred tables on a different server (let's say SQL Server 1). I have to perform a load by copying the changed records of those hundred tables, with some condition, into a SQL Server 2 destination. I know how to build a data flow task in SSIS to extract data from a source and load it into a destination. With more than a hundred tables, I would need to create more than a hundred data flow tasks, which is very time-consuming. I have heard about loading from source to destination dynamically by looping through the tables and creating variables. Now, how do I do this? Remember, those hundred tables have the same column names on both the source and destination servers. How can I perform this initial load faster without using multiple data flow tasks in SSIS?
    Please help! Thank you!

    OK, if the schema of both the source and the destination is the same, you can just use the Import and Export Wizard, and the result can be saved as an SSIS package. If this needs to be developed separately as an SSIS package, the logic can be:
    1 - Get the list of tables using sysobjects or a metadata table where the whole table list is maintained (see the sketch after this list).
    2 - Loop through those tables using a Foreach Loop.
    3 - Inside the Foreach Loop, use a script to access the source connection (OLE DB source table name) and the destination connection (OLE DB destination table name).
    4 - Assign the source table connection dynamically from the variable fetched by the Foreach Loop above.
    5 - You can also use expressions for the OLE DB source table name and OLE DB destination table name inside the data flow task.
    6 - At the end, the data flow task explained above is used.
    The script task and the data flow task sit inside the Foreach Loop; above it, the full object list is fetched with an Execute SQL task using a SELECT and assigned to an Object variable.
    The simplest option is Import and Export.
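    For step 1 above, the Execute SQL Task query can be as simple as the following sketch (it assumes the tables you want live in the dbo schema of the source database):

    -- returns one row per table; map the full result set to an
    -- SSIS Object variable, then iterate it with a Foreach Loop
    SELECT TABLE_SCHEMA, TABLE_NAME
      FROM INFORMATION_SCHEMA.TABLES
     WHERE TABLE_TYPE = 'BASE TABLE'
       AND TABLE_SCHEMA = 'dbo'
     ORDER BY TABLE_NAME;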
    Happy to help! Thanks. Regards and good Wishes, Deepak. http://deepaksqlmsbusinessintelligence.blogspot.com/
