OSB. Atomic level. Namespace transformation and endpoint reachability.

Need your opinion...
1. Should an atomic proxy service convert physical-level service namespaces into its own (atomic) namespaces?
My latest practice shows that it depends on the situation. By default it should, but with complex services it looks bad.
For example, IBM CM has a big and complex WSDL structure that depends on its current internal entities - it is very time-consuming to follow the changes.
2. Can atomic and domain OSB services have an endpoint that is reachable by an external application?
The book "Definitive Guide to SOA OSB" says that they can n

1. I suppose that physical services don't always have namespaces at all, so my opinion on the first question is "it depends and it's up to you".
2. I think the main thing about the enterprise service level is the SLA, which is what makes services truly reusable across the enterprise. From this point of view, domain and atomic services don't actually have a corresponding SLA, which can cause problems with real shared reuse across the enterprise.
PS. Take a look at Oracle's "SOA Governance: Framework and Best Practices": http://www.oracle.com/technologies/soa/docs/oracle-soa-governance-best-practices.pdf
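To illustrate the kind of namespace conversion discussed in point 1, here is a minimal XQuery sketch that an atomic proxy might use to rebuild a physical service's payload under the atomic namespace. The target namespace "http://example.com/atomic/cm" and the function name are hypothetical, and a real IBM CM payload would be far more complex:

declare function local:to-atomic-ns($e as element()) as element() {
  (: rebuild each element under the atomic namespace, keeping local names :)
  element { QName("http://example.com/atomic/cm", local-name($e)) } {
    $e/@*,
    for $child in $e/node()
    return
      if ($child instance of element())
      then local:to-atomic-ns($child)
      else $child
  }
};

This is essentially a rename pass over the whole payload; the maintenance cost mentioned above comes from keeping such mappings (or the equivalent XSD/WSDL artifacts) in step with every change on the physical side.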

Similar Messages

  • Need help on OSB problem, need to transform and route

    Hi,
    I am new to OSB and I have a problem where I need to route a service call to 7 different services based on the value of one of the input request's parameters. After routing, I need to transform the input parameters into a strictly typed output with a different structure. I need suggestions on how to do it. I have read some tutorials about proxy services. I have WSDLs for both the input and the output.
    Regards

    You can call 7 different services. ALSB supports only one Route node, but provides you with the facility of Service Callout.
    You can try Service Callout. To use Service Callout, select Add an Action >> Communication >> Service Callout.
    You can also try Routing Options if you don't have any Proxy Service or Business Service to call.
    Hope this helps!
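    As a rough illustration of the routing part: the decision in a conditional branch or routing table usually comes down to an XQuery/XPath expression evaluated against the request. A minimal sketch, where the req namespace, the element names and the returned route keys are all hypothetical:

    declare namespace req = "http://example.com/request";
    (: stand-in for the OSB $body context variable :)
    declare variable $body as element() external;

    let $type := data($body/req:ProcessRequest/req:requestType)
    return
      if ($type = 'TYPE1') then 'RouteToService1'
      else if ($type = 'TYPE2') then 'RouteToService2'
      else 'RouteToDefault'

    The same pattern extends to all 7 target services, and the strictly typed output would then be produced by a separate XQuery transformation (e.g. a Replace action on $body) written against the target WSDL's schema.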

  • OSB message level authentication fault and predicate

    Hi,
    I have successfully configured MLS on my proxy service and it works fine.
    However, the fault thrown back by OSB does not give you much when authorization fails.
    Is it possible to get the predicate that has not been fulfilled? It would be more useful to the consumer of the service.
    Also, is it possible to at least trace/log the predicate on authorization failure?
    The logs contain the operation that failed but not the predicate
    [OSB Security:386004] Message-level access control denied access to proxy service Main/Proxy Service/LocalFundPS, operation findAllSchemes, subject: XXX
    Thanks
    Arnaud
    <faultcode>soapenv:Server</faultcode>
    <faultstring>
    BEA-386102: Message-level authorization denied
    </faultstring>
    <detail>
    <con:fault xmlns:con="http://www.bea.com/wli/sb/context">
    <con:errorCode>BEA-386102</con:errorCode>
    <con:reason>Message-level authorization denied</con:reason>
    <con:location>
    <con:path>request-pipeline</con:path>
    </con:location>
    </con:fault>
    </detail>
    </soapenv:Fault>

    I have authentication providers configured as below (in the same order)
    Custom authentication provider - REQUIRED
    DefaultAuthentication provider - OPTIONAL
    I configured message-level authentication in the proxy service for a custom username/password token.
    Created a new user in the WebLogic console.
    Invoked the proxy service with the newly created username/password.
    I expect the user should not be authenticated, since authentication with the custom authentication provider (which is REQUIRED) will fail.
    Instead, the user gets authenticated, the business service is invoked, and I get a valid response.
    If I open another browser window and try to log in with the WebLogic admin password, it does not let me in because the user is not authenticated by my custom provider.
    Hope this makes the scenario clear.

  • XQuery transformation and namespaces

    I'm transforming one XML document based on schema A into another XML document based on schema B. The transformation works OK, except for a problem with namespaces. My result document looks like this:
    <ns0:root xmlns:ns0="namespaceA">
         <ns0:request>
              <ns1:Element xmlns:ns1="namespaceB">value1</ns1:Element>
              <ns1:Attribute xmlns:ns1="namespaceB">value2</ns1:Attribute>
              <ns1:Attribute xmlns:ns1="namespaceB">value3</ns1:Attribute>
              <ns1:Attribute xmlns:ns1="namespaceB">value4</ns1:Attribute>
         </ns0:request>
    </ns0:root>
    The problem is that the namespace declaration is repeated in each element. I want to declare namespaceB in the root and then use only prefixes, so the document should look like this:
    <ns0:root xmlns:ns0="namespaceA" xmlns:ns1="namespaceB">
         <ns0:request>
              <ns1:Element>value1</ns1:Element>
              <ns1:Attribute>value2</ns1:Attribute>
              <ns1:Attribute>value3</ns1:Attribute>
              <ns1:Attribute>value4</ns1:Attribute>
         </ns0:request>
    </ns0:root>
    Is it possible to achieve that, and how?
    Thanks for the help.

    Hello!
    I saw this solution yesterday:
    It's best not to think of this as "removing the namespaces" but rather as
    "changing the element names". The name of an element consists of a namespace
    URI and a local name. So the name of the Email element, for example, is
    ("urn:mpeg:mpeg7:schema:2001", "email"). In your desired result, you want an
    Email element without the namespace, that is, you want the name of the
    element in the result to be ("", "email"). So your query has to rename every
    element, or to put it another way, it has to create an element whose name is
    different from the original.
    The way to do this is to write a recursive function something like this:
    (: any namespace URI will do for the f prefix :)
    declare namespace f = "http://example.com/functions";
    declare function f:strip-namespace($e as element()) as element() {
      element { QName((), local-name($e)) } {
        (: copy attributes and keep text, recursing into child elements :)
        for $child in $e/(@*, node())
        return
          if ($child instance of element())
          then f:strip-namespace($child)
          else $child
      }
    };
    AUTHOR: Michael Kay
    see at:
    http://www.x-query.com/pipermail/talk/2006-January/001062.html
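    For the original question (declaring namespaceB only once on the root), a minimal sketch, assuming the result elements are constructed in the query rather than copied from the source; the $in variable and its child element names are hypothetical:

    declare variable $in as element() external;

    <ns0:root xmlns:ns0="namespaceA" xmlns:ns1="namespaceB">
      <ns0:request>
        <ns1:Element>{ data($in/*:Element) }</ns1:Element>
        {
          for $a in $in/*:Attribute
          return <ns1:Attribute>{ data($a) }</ns1:Attribute>
        }
      </ns0:request>
    </ns0:root>

    Because xmlns:ns1 is declared on the root constructor it is in scope for the nested constructors, so most processors serialize the declaration only once, at the root. If the inner elements are copied from a source document instead, the copy-namespaces setting (e.g. declare copy-namespaces no-preserve, inherit;) usually determines whether the per-element declarations reappear.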

  • SQL DM - Data Transformation and Data Movement option ?

    I am using SQL DM 3.0.0.665. I need your thoughts on the following.
    We find that Erwin introduced Data Transformation and Data Movement functionality to support ETL needs. We were able to generate an ETL spec using this feature.
    Does SQL DM have any plans to introduce such features?
    How do we use the current SQL DM to build an ETL spec?
    Thanks for helping us out.

    Hello,
    I am currently experimenting with SQL Data Modeler to produce high level solution designs and ETL specifications.
    Have not completed what I am doing but so far have come up with the following:
    Current assumption I am working on:
    All objects specified within SQL Data Modeler will export to the Reporting Schema tables set up in an Oracle database. Once the data is within these tables, it will be a simple task to develop a SQL report to extract the data in any format required.
    1) There is nothing in the physical (Relational) Model section that supports this
    - though I have yet to fully use the Dimensional Modelling functionality which may have the mapping functionality required to specify an ETL
    2) We need diagrams of the processes as well as the ETL mapping
    - Process modelling is available in the Logical
    - Reverse engineer all Physical objects to become Logical objects, i.e. one Table to one Entity
    - For each Entity, set up an Information Structure
    (Currently this can only be done in a convoluted way, by creating a diagram, creating a Flow, editing the Flow and then drilling down)
    MESSAGE to SQL Data Modeler Support: Can things be set up so that Information Structures can be created directly from the Browser? The current method is a bit nonsensical.
    - You are now set up to use the Logical Process Modeling functionality to capture the ETL requirements
    - I advise that you reference the training to understand what primitive, composite and transformation process objects are
    - Also, take the time to understand what an external agent object is
    - Will assume you know what a Data Store is
    Here is the standard I am heading towards that seems feasible; I will need to run a proof of concept within the larger team to ensure it works, though:
    - A Logical model is kept that is one-for-one with the Physical
    (The only reason for this is that there is no process modeling functionality for the Physical objects)
    MESSAGE to SQL Data Modeler Support: Can you duplicate the Process Modeling available for the Logical so it is also available for the Physical objects? It would be a great help for specifying ETL jobs.
    - An External Agent is used to represent an external source e.g. Billing application
    - A primitive process is used to represent the high Level design
    - A composite process is used to specify processes which can be further broken down to ETL jobs
    - A transformation process is used to represent an ETL job
    Within a Transformation process you can specify the mapping from multiple sources to a target table
    There are some negatives to this approach:
    - You lose the physical schemas the tables are part of, though a naming convention will get round this
    - You need to maintain a Logical model that is one-for-one with the Physical; this is not a big overhead
    However, as I have stated in my message to the SQL Data Modeler support team, this would all be resolved if the Process Modeling functionality were also made available within the Physical objects environment.
    Please note that we have not yet adopted the above approach and are still assessing if SQL Data Modeler will meet this requirement to our satisfaction. The critical bit will be whether the data exports to the Reporting Schema; if it does, then we have plenty of SQL resource that can produce the reports required, provided the data can be captured.
    Hope that all helps.
    Also, hope I have not missed the point of your email.
    Kind regards,
    Yusef

  • TRANSFORMATION AND DTP

    Hi all,
    I have two questions:
    1. If I have a 3.x DataSource, can I still use Transformations and DTPs with it in BI 7.0?
    2. I have a cube and know the DataSources. Could someone give me the modelling steps that come next? I am using BI 7.0 and want to use Transformations and DTPs.
    Thanks

    1. If I have a 3.x DataSource, can I still use Transformations and DTPs with it in BI 7.0?
    Yes - you can.
    2. I have a cube and know the DataSources. Could someone give me the modelling steps that come next? I am using BI 7.0 and want to use Transformations and DTPs.
    Data loading in BI 7.0 from flat file extraction:
    First create a Cube or DSO with the same structure that you have in the flat file
    and activate it.
    -> Now go to the DataSource tab -> create a DataSource. Here you need to select the type of data, for example Transactional data -> mention your flat file name and file type in the Extraction tab, enter your InfoObject names in the Fields tab, load the preview data and activate it.
    Now select your DataSource, create an InfoPackage and schedule it; your data will now be loaded up to the PSA level.
    -> Now go to the InfoProvider, select your cube, right-click it, create transformations and activate them.
    -> Create a DTP, activate it and execute it.
    1) Create the DataSource. Here you can set/check the Source System fields.
    2) Create a Transformation for that DataSource (no more update rules/transfer rules).
    2.1) While creating the transformation for the DataSource, it will ask you for the data target name, so just assign where you want your data to be updated.
    DataSource -> Transformation -> (DTP) -> Data Target
    Now if you want to load data into the data target from the Source System DataSource:
    1) Create an InfoPackage for that DataSource. If you are creating an InfoPackage for a new DataSource, it will only allow you to update up to the PSA; all other options appear disabled.
    2) Now create a DTP (Data Transfer Process) for that DataSource.
    3) Now schedule the InfoPackage; once the data is loaded to the PSA, you can execute your DTP, which will load the data to the data target.
    Data Transfer Process (DTP) is now used to load data using the dataflow created by the Transformation. Here's how the DTP data load works:
    1) Load InfoPackage
    2) Data gets loaded into the PSA (hence why only PSA is selected)
    3) DTP gets "executed"
    4) Data gets loaded from PSA into the data target once the DTP has executed
    Hope it helps..

  • How to Activate Transformations and DTP in PRODUCTION

    Hi Experts,
    I have made some modifications to a DataSource in R/3. When I tried to run the InfoPackage in BI, I see the DataSource is inactive.
    So I activated the DataSource through the program RSDS_DATASOURCE_ACTIVATE_ALL in SE38; after that I see the Transformations and DTP are inactive.
    In 3.5 we have a program to activate the transfer rules, RSDS_TRANSTRU_ACTIVATE_ALL, but I don't know of a program to activate the update rules.
    Is there any program to activate the Transformations and DTPs in Production, rather than re-transporting the entire thing?
    Regards.

    There is no activation program for Transformations. In order to do that in production, you would have to have the client opened for modification and manually activate the Transformation(s) that are in a deactivated state. This isn't advisable, however. The best approach to activate deactivated Transformations is to capture them on a transport in the development environment and move that transport through the landscape to production.
    The SAP-delivered ABAP program RSDKDTPREPAIR is available for activating DTPs. There is no selection screen for this program, however, and therefore it would try to activate all DTPs. This program should only be executed in the background.
    Here are the other activation programs in case you need them or are interested (I realize that you mentioned one or more already):
    RSDG_CUBE_ACTIVATE (InfoCube)
    RSDG_IOBJ_ACTIVATE (InfoObject)
    RSDG_MPRO_ACTIVATE (MultiProvider)
    RSDG_ODSO_ACTIVATE (DSO)
    RSQ_ISET_MASS_OPERATIONS (InfoSet - Requires client to be opened)
    RSDS_DATASOURCE_ACTIVATE_ALL (DataSource)
    SAP delivered program RSDS_TRANSTU_ACTIVATE_ALL is for activating Transfer Structures, which aren't relevant for 7.x DataSources.

  • How to configure OSB Tracing level from Customization File?

    Hi,
    I want to configure the OSB Tracing level from the customization file rather than from the console.
    I checked the Oracle documentation for OSB tracing and could not find anything related to an OSB tracing property that we could configure from OSB Workshop.
    Is it possible to do so ?
    Any pointers will be highly helpful.
    Thanks,
    Anugoonj
    Edited by: Anugoonj Ranjan on Jan 20, 2011 1:27 PM

    The customization file is for replacing environment values that differ between domains, NOT for changing Operational Settings. You may use the "Configuration" section of the OSB Dashboard to change Operational Settings for many resources at once instead of doing it one-by-one on the console -
    http://download.oracle.com/docs/cd/E14571_01/doc.1111/e15867/configuration.htm#CACGEFAE
    Regards,
    Anuj
    Edited by: Anuj Dwivedi on Jan 20, 2011 6:10 PM

  • Reporting level Currency translation and group level Currency translation

    Hi All,
    Could anybody explain reporting-level currency translation and group-level currency translation to me? Please explain a step-by-step scenario.
    Thanks in advance.
    Setty.

    Hi Jian,
    In my recent implementation, the business only had one currency to deal with, i.e. USD. Data was coming from ECC and we loaded all the data in LC instead of USD using the transformation *NEWCOL(LC). Next, you can maintain a rate of 1 in the rate model and run the currency conversion.
    This will generate the same data values against USD. So your statement "if we load LC and then convert it into USD, the data value will be doubled" is incorrect. In the system, you will have the same set of values against LC as well as USD.
    I suggest that you configure Currency Conversion for future requirements, if any.
    Regarding BCF, balances from the previous year (Balance Sheet accounts) will need to be carried forward as opening balances to the next year, or else your BS won't give a true picture.
    Regards,
    Ashish

  • (CS5.1) Free Transform and Warp tool causing pixelation and quality drop.

    When I free transform and warp a layer to lie flat, sometimes I'll get this harsh pixelation at one end of the layer. It doesn't appear while I'm doing the warp; it just suddenly shows up when I hit Enter to apply the transformation.
    So for example, I'll be transforming this grid, which is originally about 9x this big.
    So when I'm running the transformation/warp, it looks like this
    Not great, but at least most of the lines are still intact on the left and back (the right side looks basically fine)
    But once I hit enter, I get this
    And now, like, half the lines I did have are gone, and what's left is super pixelated.
    Is there an interpolation setting I could be using? Is it something unavoidable from how distorted I make the layer?

    It is a general anti-aliasing (AA) problem with mapping texture onto surfaces at acute angles. In 3D programs this is solved by a technique called mipmapping: the 3D renderer calculates the degree of distortion and applies a blurring filter where it is needed, based on the depth of the 3D space, to solve this problem. Unfortunately Photoshop is not a 3D renderer and can't eliminate this. You can try to reduce it manually yourself by applying the same transformation to several differently blurred copies of the layer and then using masks, made by manually brushing over the problem areas, to reduce the artifact.
    Here's an example image of a 3D program solving the problem.
    and a quick attempt I made to illustrate the problem using your grid, after making it denser with several multiplied layers. The bottom part is the grid masked with different degrees of blurring at the problem areas. As you can see, this is not an easy task and the problem may be considered impossible to fix nicely. On top of that, as you can see, this approach may require constructing grid patterns on layers so that you can blur the eventual horizontal or vertical directions separately.
    Click the image to see it larger without the additional distortion of forum scaling

  • Move up one level in path and make new folder

    How do I get AppleScript to make a new folder one level up from myFolder, i.e.
    set myFolder to ¬
    (choose folder with prompt "Where are the image files for which you want to change exif data ?") as reference
    make new folder at myFolder with properties {name:"Gallery JPEGs"}
    makes a new folder inside myFolder, but how do I tell it to move up one level from myFolder and make a new folder?
    Thanks.
    Pedro

    You can use statements such as the following ones:
    tell application "Finder" to make new folder at container of myFolder with properties {name:"Gallery JPEGs"}
    or
    tell application "Finder" to make new folder at container of container of myFolder with properties {name:"Gallery JPEGs"}
    and so on…
    And if you want the folder “myFolder” to be contained by the folder “Gallery JPEGs”, use the following statements:
    tell application "Finder"
        make new folder at container of myFolder with properties {name:"Gallery JPEGs"}
        move myFolder to the result
    end tell
    Message was edited by: Pierre L.

  • Problem accessing /config_general/null/Default.action   Reason:There is no Action mapped for namespace/ config_general and action name default

    in use:
    vRO 5.1
    eclipse 3.7.2
    vRo plug-sdk 5.1
    steps:
    1. Create a plug-in project from the samples (choose Solar System)
    2. Find the .dar package and upload it via the vRO configuration
    3. The vRO configuration said it uploaded successfully, but the Solar System configuration is not properly configured.
    problem:
    Problem accessing /config_general/null/Default.action   Reason:There is no Action mapped for namespace/ config_general and action name default
    How to solve it??
    Thanks so much!!

    There was a problem on the CRM side... it's working now.

  • Using transforms and filters without device drivers

    Hello,
    I came across NIMS as a possible solution for some transforms and filtering, possibly even generating test signal data, for a seismic application. I'm in the process of evaluating NIMS for best possible fit for what we need/want to accomplish.
    Basically, we've got some seismic data, and we want to process that data through a series of transforms and filters to denoise and pick the data for seismic analysis. No sense reinventing the wheel if we can adopt and then adapt a third-party library like NIMS into our app.
    We do not necessarily need any device drivers, although I noticed installing NIMS requires them. Hopefully we can opt in or out depending on what's actually required. Can someone help clarify the nature of the driver dependency?
    Anyhow, like I said, I am evaluating it for the best possible fit in our application, but in the meantime, if someone can shed some light on the above concerns, questions, etc., that would be great.
    Thank you...
    Best regards.

    Glad to hear it!
    -Mike
    Applications Engineer
    National Instruments

  • SPM Transformation and Routine are missing in Delivered content

    Hi All,
    During the transport (from Development to Quality) we received the error message 'Transformation and Routine are not available in A Version', and hence the transport failed.
    I tried searching for this object (DSO 0ASA_DS00) in RSOR (Business Content), but all these objects are missing in the Delivered content itself. The object and system details are as follows:
    Object : DSO - 0ASA_DS00
                 TRFN - The TRFN 0E2R8KR45M0H68POUF6JWA5J7TNA4DS9 is not available in Delivered content
                 TRFN - The TRFN 0MM70WOQSHKX3SSQEDTOL7UAK90H15G8 is available in Delivered content, but the corresponding Routine is not available in delivered content
                 TRFN - The TRFN 0E2R8KR45M0H68POUF6JWA5J7TNA4DS9 is not available in Delivered content
    SAP_ABA          701          0008     SAPKA70108
    SAP_BASIS                          701          0008     SAPKB70108
    PI_BASIS                           701          0008     SAPK-70108INPIBASIS
    ST-PI                                  2008_1_700                                                0003     SAPKITLRD3
    SAP_BW          701          0008     SAPKW70108
    ANAXSA          210          0005     SAPK-21005INANAXSA
    BI_CONT          704          0009     SAPK-70409INBICONT
    OPMFND          210          0005     SAPK-21005INOPMFND
    ST-A/PI          01M_BCO700     0001     SAPKITAB7F
    When I tried collecting the object (DSO 0ASA_DS00) with 'Only necessary objects', I am not facing this error, but with 'In DataFlow before & After' we are facing this issue.
    We also found that 3 objects (TRFNs & Routines) are missing for the PO & Invoice process chains in the delivered content itself.
    We need to transport this object at the earliest; can anyone please guide me on how to proceed?
    Thanks..
    Regards
    Santhosh Kumar N

    Hi Rajesh,
    Thanks for the guidance. I have checked the TRFNs & Routines in the Business Content, but they are missing in the Delivered version itself.
    When we tried transporting the Spend process chains for PO & Invoices, we faced this issue.
    Please let me know if we need to exclude these missing objects and start transporting, or whether there is any other option to get these missing objects in.
    Please find the missing object details below:
    1. The TRFN 0E2R8KR45M0H68POUF6JWA5J7TNA4DS9 is not available in Delivered content
    2. The TRFN 0MM70WOQSHKX3SSQEDTOL7UAK90H15G8 is available in Delivered content, but the corresponding Routine is not available in delivered content.
    3. The TRFN 0E2R8KR45M0H68POUF6JWA5J7TNA4DS9 is not available in Delivered content.
    4. The Object '8CXV3JV4FQR295IVBCHN69QBU' of type 'Routine' is not available in Delivered content.
    Thanks
    Regards
    Santhosh Kumar N

  • Errors While Transporting Transformations and DTPs

    Hi Experts,
    I'm trying to transport transformations and DTPs from DEV to QA and am getting the following error messages. Does anyone know what's happening and how I can fix this?
    Thanks,
    Janice
    The following are excerpts from the log of the transport organizer, with further SAP-supplied information on the error messages.
    Transformations:
    Start of the after-import method RS_TRFN_AFTER_IMPORT for object type(s) TRFN (Activation Mode) 
    Activation of Objects with Type Transformation                                                  
    Checking Objects with Type Transformation                                                       
    Checking Transformation 06C3WE26JQY0VSZPNMOZFKD0W416PRU4                                        
    No rule exists                                                                               
    Target RSDS 0ACCOUNT_TEXT QA1CLNT400 is not allowed                                             
    Target RSDS 0ACCOUNT_TEXT QA1CLNT400 is not allowed
    Message no. RSTRAN802
    No rule exists
    Message no. RSTRAN514
    DTPs:
    Start of the after-import method RS_DTPA_AFTER_IMPORT for object type(s) DTPA (Activation Mode)
    Conversion of T version from DTP DTP_4FG5GXT9OLNN3YM3QFMW9V3W5 to M version...                 
    Conversion of T version from DTP DTP_4FK894515QD0O9UM4JSJPIB91 to M version...                 
    Activation of Objects with Type Data Transfer Process                                          
    Saving Objects with Type Data Transfer Process                                                 
    Saving Data Transfer Process DTP_4FKN7KAKCOLQ3ROIB7U7DV13A                                     
    Transformation 06C3WE26JQY0VSZPNMOZFKD0W416PRU4 is inactive (cannot be executed)               
    Error saving Data Transfer Process DTP_4FKN7KAKCOLQ3ROIB7U7DV13A                               
    Transformation 06C3WE26JQY0VSZPNMOZFKD0W416PRU4 is inactive (cannot be executed)
    Message no. RSTRAN715
    Diagnosis
    The transformation 06C3WE26JQY0VSZPNMOZFKD0W416PRU4 is inactive (not executable).
    This can be caused, for example, by a change made to the source object or target object of the transformation.
    The transformation connects the source  to the target .
    Procedure
    The transformation must be reactivated.
    Error saving Data Transfer Process DTP_4FKN7KAKCOLQ3ROIB7U7DV13A
    Message no. RSO843

    Thanks so much everyone for your suggestions. I've tried them all and here's what I now have:
    Anil - Active DataSources were included in the original transport. I tried retransporting and got the same error messages.
    Voodi - 0ACCOUNT_TEXT was resident in the Q system as a 3.x DataSource. I replicated it as a 7.0 DataSource in D and this is what was transported to Q. It is a 7.0 DataSource in Q. I got the same error messages when I tried transporting just the transformations.
    Jayaram - 0ACCOUNT is active in the Q system and is currently being used in transformations other than the one I'm trying to import.
    Godhuli - I not only re-replicated the DataSource in D, I reactivated the transformations and DTPs and resaved the InfoPackages. I tried retransporting with a new transport request and it failed with the same error messages.
    Is there anything else anyone can think of?
    Thanks,
    Janice
