Required: SD Data Flow with MM Integration

Dear Experts,
I have used the search functionality, but only got links.
Please explain in your own words: 1. the SD data flow, and 2. its integration with MM.
Points assured.
Thanks

Link Between SAP SD, MM & FI
The link between SD and MM:
1. When you create a sales order in SD, all the details of the items are copied from the material master in MM.
2. MRP and availability check related data are also taken from MM, although you can control this data in SD as well.
3. When you create an inbound/outbound delivery with reference to a sales order, the shipping point determination takes place with the help of the loading group, plant data, shipping conditions etc. This also refers to the material master.
4. The material which you enter in a sales order must be extended to the sales area of your sales order/customer, otherwise you cannot transact with this material.
There are many such links between SD and MM; a minimal ABAP check for point 4 is sketched below.
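To illustrate point 4, here is a minimal sketch (report name, material and sales area values are made up; assumes a release with inline declarations, 7.40+). It reads MVKE, the sales data table of the material master; no row for the given sales org/distribution channel means the material has not been extended to that sales area.

REPORT zsd_check_sales_area.

* Sample values - replace with your material and sales area.
DATA lv_matnr TYPE mvke-matnr VALUE 'MAT-001'.

* MVKE = material master, sales data. No row for the given
* VKORG/VTWEG means the material is not extended there.
SELECT SINGLE matnr FROM mvke
  INTO @DATA(lv_found)
  WHERE matnr = @lv_matnr
    AND vkorg = '1000'
    AND vtweg = '10'.
IF sy-subrc <> 0.
  WRITE / 'Material not extended to sales area 1000/10.'.
ENDIF.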
Now the link between SD and FI:
1. Whenever you create a delivery with reference to a sales order, a goods movement takes place in the background. E.g., in the case of a standard sales order, you create an outbound goods delivery to the customer.
Here movement type 601 takes place. This movement is configured in MM, and it also hits a G/L account in FI. Every such goods movement hits some G/L account.
2. The accounts posting in FI is done with reference to the billing documents (invoice, debit note, credit note etc.) created in SD. This is another link between SD and FI; a sketch of reading the accounting document posted for a billing document follows this list.
3. Tax determination: in tax determination too, there is a direct link between SD and FI.
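For point 2, a minimal sketch (the billing document number is made up; 7.40+ syntax): the accounting document header table BKPF records its originating object in AWTYP/AWKEY, and SD billing documents post with AWTYP = 'VBRK' and the invoice number in AWKEY.

* Find the FI accounting document(s) posted for an SD invoice.
DATA lv_vbeln TYPE vbrk-vbeln VALUE '0090000001'.

SELECT bukrs, belnr, gjahr
  FROM bkpf
  INTO TABLE @DATA(lt_bkpf)
  WHERE awtyp = 'VBRK'
    AND awkey = @lv_vbeln.

* lt_bkpf now holds company code, accounting document number and
* fiscal year of each FI document created from the billing document.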
SD Integration points with other modules
SD module is highly integrated with the other modules in SAP.
Sales Order –
Integration Points                      Module
• Availability Check            -       MM
• Credit Check                  -       FI
• Costing                       -       CO/MM
• Tax Determination             -       FI
• Transfer of Requirements      -       PP/MM
Delivery & Goods Issue –
Integration Points                      Module
• Availability Check            -       MM
• Credit Check                  -       FI
• Reduces Stock                 -       MM
• Reduces Inventory Value       -       FI/CO
• Requirement Eliminated        -       PP/MM
Billing –
Integration Points                      Module
• Debit A/R                     -       FI/CO
• Credit Revenue                -       FI/CO
• Updates G/L                   -       FI/CO
  (tax, discounts, surcharges, etc.)
• Milestone Billing             -       PS
Return Delivery & Credit Memo –
Integration Points                      Module
• Increases Inventory           -       MM
• Updates G/L                   -       FI
• Credit Memo                   -       FI
• Adjustment to A/R             -       FI
• Reduces Revenue               -       FI
SD Transaction Code Flow:
Inquiry / document type IN
T-codes: VA11 (create), VA12 (change), VA13 (display); tables VBAK, VBAP
Quotation / QT
T-codes: VA21 (create), VA22 (change), VA23 (display); tables VBAK, VBAP
Sales Order / OR
T-codes: VA01 (create), VA02 (change), VA03 (display); tables VBAK, VBAP
Delivery / LF
T-codes: VL01 (create), VL02 (change), VL03 (display); tables LIKP, LIPS
Billing / F2
T-codes: VF01 (create), VF02 (change), VF03 (display); tables VBRK, VBRP
The whole chain can also be traced through the document flow table VBFA, as sketched below.
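A minimal sketch of walking that chain (the order number is made up; 7.40+ syntax): VBFA links the preceding document (VBELV) to the subsequent document (VBELN), and VBTYP_N gives the category of the subsequent document ('J' = delivery, 'M' = invoice).

* List the deliveries and invoices created from one sales order.
DATA lv_order TYPE vbak-vbeln VALUE '0000012345'.

SELECT vbeln, vbtyp_n
  FROM vbfa
  INTO TABLE @DATA(lt_flow)
  WHERE vbelv = @lv_order.

LOOP AT lt_flow INTO DATA(ls_flow).
  CASE ls_flow-vbtyp_n.
    WHEN 'J'. WRITE: / 'Delivery (LIKP/LIPS):', ls_flow-vbeln.
    WHEN 'M'. WRITE: / 'Invoice (VBRK/VBRP):', ls_flow-vbeln.
  ENDCASE.
ENDLOOP.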
Regards,
Rajesh Banka
Reward suitable points.

Similar Messages

  • Purchasing flow with stock - Integration to FM

    Dear experts,
my customer would like to have the FM budget check only when the warehouse orders from the supplier (on the warehouse FC).
For this purpose I implemented a fixed FC assignment whenever the "Inventory Flag" is active. For specific movement types, I implemented an assignment of a statistical commitment item.
Now I face the following problem:
When posting the purchase order, everything works fine (I can see in the FMDERIVE trace that the inventory flag is active and therefore the FC will be filled).
When posting a goods receipt (transaction MIGO) on the same purchase order, the inventory flag is no longer active, as I see in the trace. The system throws an error message that the FC is not filled.
Do you know this issue and what to do?
    Thanks a lot for any hint,
    Thomas

    Hi Thomas,
    I have some information that may help you with your issue:
- Please check if you have defined in FMDERIVE a derivation rule to clear the commitment item or funds center field. You should adjust FMDERIVE so that it does not overwrite the inherited account assignment if you wish to maintain the account assignment across the whole MM document chain.
- You need to set the field status of the field FLG_INVNTRY_PSTNG in the purchase order to 'Display' in order to ensure the use of the automatically derived account assignments.
- On the other hand, you are posting a goods receipt (MIGO, probably movement type 101), where the related business transaction is RMWE. If this is your case, it is correct that the inventory posting flag is not relevant, as it is not an inventory posting.
If you want to get business transaction RMWA, you need to post goods issues and use a different movement type, e.g. 201K.
I believe that you are using scenario 1. Note 700485 says:
a) for scenario 1:
    A statistical commitment item must be derived for all goods issue postings concerning the warehouse (process RMWA in field TRANSAC). Since the relevant account assignments are already entered with the purchase order for the FM, SAP recommends changing these to mandatory fields by means of the field status.
Process RMWA is Goods Movement, done by transaction MB11.
Process RMWE is Goods Receipt for purchase order.
Please check your trace log and see which process appears there; I believe that the trace shows process RMWE and flag FLG_INVNTRY_PSTNG not marked when you are in MIGO.
Would you be so kind as to clarify RMWE from the MM point of view? Is it correct to create RMWE, movement 101, for stock?
It seems that the standard program is not considering your example with RMWE as a stock movement; it is expecting an RMWA process to be able to have the flag FLG_INVNTRY_PSTNG marked.
Please refer carefully to the warehouse concepts documentation. In order to bypass this error, you should work under one of the warehouse scenarios supported by SAP and adjust your customizing accordingly.
    In the link below the warehouse concepts are described with the related business transactions: (purchase requisition, purchase order, and goods issue).
    http://help.sap.com/erp2005_ehp_04/helpdata/en/f0/ca3f50260211d28a430000e829fbbd/frameset.htm
Please check the complete link, especially:
- the example of customizing settings for Scenario 1 available there (graphic);
- a posting example for Warehouse Concept 1;
- a posting example for consumable materials; and
- the overview of posting examples for consumable materials/warehouse concepts.
    I believe that this will help you with your issue.
    Best Regards,
    Vanessa.

  • Does the Palm Centro require a data plan?

Does the Palm Centro require a data plan? I want to switch back to my Centro, and I just want to make sure, before I purchase a new one, that I will not be required to pay the $29.99 a month like I do now with my Pixi. Can anyone give me a sure, 100% positive answer on whether this is a discontinued phone that does not require a data plan with activation? Thank you so much!

That's exactly what I thought, but I kept finding threads where people said it might require one. I thought, surely, they wouldn't. That just wouldn't be fair. Thank you so much for your answer. I really appreciate it.

  • Error on Data Flow Task MSSQL 2012 Clustered "Description: The version of Lookup is not compatible with this version of the DataFlow. "

We have an SSIS package that runs on clustered MSSQL 2012 Enterprise nodes and is failing. We use a job to execute the package.
    Environmental information:
    Product - Microsoft SQL Server Enterprise: Core-based Licensing (64-bit)
    Operating System - Microsoft Windows NT 6.1 (7601)
Platform - NT x64
    Version - MSSQL Version 11.0.3349.0
The package is set to 32-bit. All permissions verified. It runs in lower environments with the same MSSQL version. All environments are clustered. In the failing environment, all nodes are at the same service pack. I have not verified whether all nodes in the failing environment have SSIS installed. Data access is installed. We have other, simpler packages that run in this environment, just not this one. Time to ask the community for help!
    Error:
    Source: Data Flow Task - Data Flow Task (SSIS.Pipeline)     Description: The version of Lookup is not compatible with this version of the DataFlow.  End Error  Error:  Code: 0xC0048020    
    Description: Component "Conditional Split, clsid {7F88F654-4E20-4D14-84F4-AF9C925D3087}" could not be created and returned error code 0x80070005 "Access is denied.". Make sure that the component is registered correctly.  End Error 
    Description: The component is missing, not registered, not upgradeable, or missing required interfaces. The contact information for this component is "Conditional Split;Microsoft Corporation; Microsoft SQL Server; (C) Microsoft Corporation; All Rights
    Reserved; http://www.microsoft.com/sql/support;0".  End Error 
    (Left out shop specific information.  This is the first error in the errors returns by the job history for this package. )
    Thanks in advance.

    Hi DeveloperMax,
According to your description, the error occurs when you execute the package with an Agent job on clustered MSSQL 2012 Enterprise nodes.
As per my understanding, this issue can occur when you use SQL Server Agent to schedule a SQL Server Integration Services package in a 64-bit environment while the SSIS package references 32-bit DLLs or 32-bit drivers which are available only in 32-bit versions, so the job fails.
To fix this issue, run the package in 32-bit mode (this uses the 32-bit version of the DTExec.exe utility). To run a package in 32-bit mode from a 64-bit version of SQL Server Agent, go to the Job Step dialog box and select the "Use 32 bit runtime" option on the Execution options tab.
Besides, we should make sure that SQL Server Integration Services is installed on the failing environment.
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support

• Problem with context mapping and data flow in an FPM application

    Hi All,
I am trying to develop an ESS application using FPM. The requirement is to see the history of an employee in the second view.
The first view has just the overview information and the second one has the detail, so the records and the fields are the same on both views.
As per the FPM guidelines, the model resides in the Fc component and the respective Vc components use the model data accordingly.
I am executing the model in the Fc component by calling the executable method in the interface controller from the first view and then displaying the output data of the BAPI in the first view, which provides the overview information. This is working fine.
But when I try to map the same output node to the table UI for the second view, the record size comes up zero and thus no information is available.
To work around this, I am again executing the RFC in the interface controller of the second view to populate the records, which is incorrect as it has already been executed and the data is available for the first view.
I request you to let me know the correct approach to context mapping and data flow when using the FPM roadmap. Is there any standard method or approach available to deal with such requirements? Please let me know.
    Thanks in advance.
    Regards
    DK

    Hi Idhaya,
The model node is available in the Fc, and the Fc interface controller is used in the first Vc and the second Vc.
So the idea is: as the executable method is generated in the Fc, I have created a custom method to call the executable method in the Fc, where the input parameter is passed, and this custom method is finally called in the first Vc.
So now my first Vc is ready to call the custom method in the Fc and execute the RFC. Once the RFC is executed, the nodes in the Fc should get populated, which is the ideal case.
And as the Fc is used as a component in the second Vc, the same node is available to the UI elements.
But when I check the record size for the output node, it is always zero for the second Vc.
    Regards
    DK

  • Is there a mobile phone that can sync with ical, but does not require a data plan?

    Is there a mobile phone that can sync with ical, but does not require a data plan?

Thanks for the reply. I already have this info, but have spent several hours trying to cross-reference it with the product descriptions and the different carriers' websites. I was hoping there was an easier way.
    John Maisey wrote:
    Hi,
You can sync many phones using iSync. See here: http://support.apple.com/kb/ht2824
    Best wishes
    John M

  • SSIS Data Flow task using SharePoint List Adapter Setting SiteUrl won't work with an expression

    Hi,
    I'm trying to populate the SiteUrl from a variable that has been set based on a query to a SQL table that has a URL field.  Here are the steps I've taken and the result.
    Created a table with a url in it to reference a SharePoint Task List.
    Created a Execute SQL Task to grab the url putting the result set in a variable called SharePointUrl
    Created a For Each container and within the collection I use the SharePointUrl as the ADO object source variable and select rows in the first table.
    Still in the For Each container within the Variable mappings I have another Package Variable called PassSiteUrl2 and I set that to Index 0 or the value of the result set.
    I created a script task to then display the PassSiteUrl2 variable and it works great I see my url
    This is where it starts to suck eggs!!!!
I insert a Data Flow Task into my Foreach loop.
I insert a SharePoint List Adapter into my Data Flow.
Within my SharePoint List Adapter I set my list to be "Tasks", my list view to be "All Tasks", and then I set the URL to be another SharePoint site that has a task list, just so there is some default value to start with.
Now within my Data Flow I create an expression and set [SharePoint List Source].[SiteUrl] equal to my variable @[User::PassSiteUrl2].
I save everything and run my SSIS package, and it overlays the default [SharePoint List Source].[SiteUrl] with blanks in the SharePoint List Adapter, then throws an error that it is missing a URL.
So here is my question: why, if my package variable displays fine in my Control Flow, is it not seen (or seen as blanks) in the Data Flow expression? Anyone have any ideas???
    Thanks
    Donald R. Landry

    Thanks Arthur,
The scope of the variable is at the package level, and when I check to see if it can be moved, package level is the highest level. The EvaluateAsExpression property is set to True. Any other ideas?
I also tried the following: take the variable that has the URL in it and just assign it to the description of the Data Flow Task to see if it would show up there (the idea being that the value of my @[User::PassSiteUrl] should show in the description field when the package is run). That also shows up blank.
So I'm thinking it's my expression. All I do in the expression is set [SharePoint List Source].[SiteUrl] equal to @[User::PassSiteUrl] by dragging and dropping the variable into the expression box. Maybe the expression should be something else, or is there a way to say @[User::PassSiteUrl] = Dts.Variables("User::PassSiteUrl2").Value.ToString()?
In my script task I use Dts.Variables("User::PassSiteUrl2").Value.ToString() to display the value in the message box and that works fine.
    Donald R. Landry

• R11.0.3: An invoice with future dated payment - data flow

Product: FIN_AP
Date written: 2003-11-18
R11.0.3: An invoice with future dated payment - data flow
===============================================
PURPOSE
This note examines the data flow of an invoice that uses a future dated payment term.
    Explanation
    accounting rule = accrual
allowed future dated payment
1. Create a standard invoice (invoice amount: 2100, item amount: 2100, payment term: future dated payment) and transfer it to GL.
    * ap_invoice_distributions_all
    accounting_date = 2003-11-11
    accrual_posted_flag = Y
    posted_flag = Y
    * ap_trial_balance
    accounting_date = 2003-11-11
    distribution_amount = 2100
* The data is passed to the gl_interface table.
2. Post these transferred journals in the GL module.
* After the data is deleted from the gl_interface table:
    * gl_je_lines
    accounted_dr = 2100 charge
    accounted_cr = 2100 liability
3. Create a payment against this invoice with maturity date = 12-NOV-2003 and then transfer it to GL.
    * ap_invoice_payments_all
    accounting_date = 2003-11-11
    accrual_posted_flag = 'N'
    posted_flag = 'N'
    future_pay_posted_flag = 'Y'
    * ap_payment_distributions_all
    line_type_lookup_code = 'CASH'
    * ap_trial_balance
    accounting_date = 2003-11-11
    payment_amount = 2100
    * ap_checks_all
    status_lookup_code = 'NEGOTIABLE'
4. Post these payment journals in the GL module.
* After the data is deleted from the gl_interface table:
    * gl_je_lines
accounted_cr : 2100 future pay
    accounted_dr : 2100 liability
5. Clear this payment and then transfer it to GL.
For a future dated payment, if you use Cash Management, perform the bank reconciliation in Cash Management.
If you do not use Cash Management, you must run "automatic clearing for future dated payments" to clear the payment.
In this test case, Cash Management was not used; the payment was cleared by running automatic clearing for future dated payments.
    * ap_checks_all
    status_lookup_code = 'CLEARED BUT UNACCOUNTED'
Because Cash Management is not used, the status appears as above.
If Cash Management is used, the status changes to CLEARED or RECONCILED depending on the option.
6. Transfer the cleared payment to GL and post it in GL.
* At this point, nothing in the ap_trial_balance table is affected.
    * ap_invoice_payments_all
    accrual_posted_flag = 'Y'
    posted_flag = 'Y'
    * gl_je_lines
    accounted_dr= 2100 future pay
    accounted_cr = 2100 cash
    Example
    Reference Documents
    N/A

    Rohini,
Thanks for your answer; I was out for a few days, hence the delay.
What I am looking for is the complete usage model of the following ODS objects:
    1> 0BBP_SC     we have figured it out
    2> 0BBP_SCA    we have figured it out
    3> 0BBP_DOC    we have figured it out
    4> 0bbp_inv    we need help
         Looks like this can be populated from R3 using the following
Data Source     InfoSource      ODS Object
2LIS_02_ITM     2LIS_02_ITM     0BBP_INV
2LIS_02_HDR     2LIS_02_HDR     0BBP_INV
    5> 0BBP_DS     we need help
    6> 0bbp_po     we need help
The same combination is also used to populate 0BBP_PO. We suspect that 0BBP_DS will then be populated from both 0BBP_INV and 0BBP_PO; at least that is what we thought after activating the Business Content and reading the SAP help on Business Content for SRM.
We are doing a proof of concept and want to know all the ODS objects with data. We want to build an end-to-end model (shopping cart, approval, document flow, PO data with value, invoice data with value) and are struggling to understand the model surrounding these three ODS objects (0bbp_inv, 0bbp_ds, 0bbp_po). Any help in this area will be appreciated.
    Thanks
    Arun

  • Are unique GUIDs required in the data flow task in SSIS?

    Hi,
I have previously used SQL Server 2008 R2 to develop packages. I was told not to copy and paste packages, or create templates with tasks on the data flow, because this leads to multiple packages with the same GUID; the consequence might be calamitous problems at runtime when the engine can't distinguish between tasks in different packages.
My process was to create templates with control flow tasks only, manually create the data flow tasks, and then use BIDS Helper from CodePlex to regenerate GUIDs (this regenerated the GUIDs at the control flow level only).
I am now using SQL 2012. I have seen that if I copy and paste a package, the GUIDs at the data flow level are also copied. So my questions are:
1) Does having the same ID in different data flows cause problems, as I was led to believe?
2) If it does, has this been addressed in some way in SSIS 2012? And what is the best way of working around this, given I have many similar packages to develop?
    Thanks,
    Dan

SSIS doesn't really lend itself well to creating templates.
SQL 2012 is much improved when it comes to copying and pasting tasks, data flows and transformations between package designs, other than the usual issues with missing connection managers.

• I want the data flow of CO with SD and MM

My dear friends,
I want the data flow of CO with SD and MM, with examples please.
Ashok Kumar
Email: [email protected]


  • Data Flow task with error redirection hangs

I am migrating an SSIS package from 2005 to 2012. I have a package that, among other things, contains a data flow task with redirects. The source is a pipe-delimited flat file that we receive from an outside source. The file contains a bunch of bad data, including empty lines. I redirect the bad rows so I can provide an audit back to the list provider. The file has about 300K rows. Since I completed the migration wizard, the data flow task stalls at 72,173 rows. I can change the number of rows that get loaded by changing the DefaultBufferMaxRows and DefaultBufferSize values, but I can't get it anywhere near 300K. I decided to try rebuilding the data flow task from scratch and found that if I set it to ignore all errors the entire file will load, but when I add the redirect it hangs and does not give me any errors.
I am currently running it in debug mode from Visual Studio. I have not tried running it from the SQL Server Agent yet.
    Thanks in Advance.
    Alan

The error says it stopped on row 45200 and that the column AgentIdentifier returned status value 4, "Text was truncated...". This is one of the errors that I have to trap for. The field preceding AgentIdentifier is a remarks field that typically contains embedded pipe characters that throw off the rest of the row. There are some other errors that I typically find in the data file, but that one is the most frequent and is why I have to redirect, so I can report back to the client what rows they need to fix.
    Thanks for the suggestion.

  • Foreach Loop Container with a Data Flow Task looking for file from Connection Manager

So I have a Data Flow Task within a Foreach Loop Container. The Foreach Loop Container has a variable mapping of User::FileName to pass to the Data Flow Task.
The Data Flow Task has a Flat File Source, since we're looking to process .csv files. And the Flat File Source has a Flat File Connection Manager where I specified the file name when I created it. I thought you needed to do this even though it won't really use it, since it should be getting its file name from the Foreach Loop Container. But when attempting to execute, it blows up because it seems to be looking for the test file name that I indicated in the Flat File Connection Manager rather than the file it should be trying to process in User::FileName from the Foreach Loop Container.
What am I doing wrong here??? I thought you needed to indicate a file name within the Flat File Connection Manager even though it really won't be using it.
    Thanks for your review...I hope I've been clear...and am hopeful for a reply.
    PSULionRP

The Flat File Connection Manager's ConnectionString needs to be set to reference the variable used in the Foreach Loop, i.e. a property expression on ConnectionString that evaluates to @[User::FileName].
    Arthur My Blog

  • Help Required Regarding - SAP Job names using R3 data flows

We are calling a set of SAP jobs using R/3 data flows in Data Services. Whenever a job fails, we first kill the active SAP jobs by logging into SAP and then restart the jobs.
There are about 100-odd SAP jobs that we call using these Data Services jobs, so we wanted to kill the jobs using reusable code on the SAP side by passing the job name just before every R/3 flow, in case it is still in active status.
So I wanted to know if there are any shortcuts to retrieve the set of associated SAP job names, because it will be a tedious process to hardcode the SAP job names and pass them as parameters for all the 100+ SAP job names in the custom-defined reusable code.
Any help or advice on this, please!
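A minimal sketch of the retrieval part (the job-name prefix here is made up; TBTCO is the standard job header table, and status 'R' means running):

* List active SAP background jobs whose names share a prefix.
SELECT jobname, jobcount
  FROM tbtco
  INTO TABLE @DATA(lt_jobs)
  WHERE jobname LIKE 'ZDS_%'
    AND status  = 'R'.

Each JOBNAME/JOBCOUNT pair could then be passed to the abort logic, e.g. via the XBP interface (BAPI_XBP_JOB_ABORT), instead of hardcoding all 100+ names.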

"The program is not meeting the expectations and the problem is due to reflection."
Do we know this for certain?
"... my application gets the class name, field name etc. from an XML file so I don't know their method names beforehand. Now since every class instance corresponds to a row in the database and I have to call get and set methods of each class instance, the performance keeps on degrading as the number of columns and rows increases. Can somebody suggest some improvement regarding this and regarding creating multiple instances of the same object?"
Class.forName() will be using a hash already, so there is probably not much room for improvement.
Class.newInstance() probably does not take significantly more processing than a simple "new Fubar();".
Umpteen reflective method invocations (one per column) for each row/instance - are you saying these are the problem?
You can test this easily enough: if you comment out the reflective method invocations and leave the rest of your code untouched, does your application processing speed up significantly?

• Oracle to SAP BI with BCS - Best Data Flow Design

    Hi,
We are an SAP implementation team, currently at the Blueprint stage. My client is a retail business giant. The client has 50% of its transaction data and master data in an Oracle database. Now we are moving to BI 7.0 and also plan to use SAP BCS.
We would like to map all the existing Oracle tables to BI. Please provide any clue regarding the best data flow (from Oracle 10g to BI 7.0).
Your quick and valuable suggestions/links are highly appreciated.
    Warm Regards,
    Bab

    Hi Ashok,
You have mentioned that you have an Oracle 10g system as the data source, which perfectly sets the platform to extract the data from the Oracle system to the SAP BW system.
This can be done through DB Connect, with which you can select the necessary tables and form them into a DataSource in the BI system, then create the usual BI objects on top of the DataSource.
Once the data is in BI, you can use the BI cube, or form a replica of the cube in BCS application format, to use it in the BCS environment.
    Hope this helps,
    Regards,
    Rajesh.

  • Need help ASAP with Data Flow Task Flat File Connection

    Hey there,
I have a Data Flow Task within a Foreach Loop Container. The source of the flow is an ADO.NET connection and the destination is a Flat File Connection. I loop through a collection of strings in the Foreach loop. Based on the string content, I write some data to the same destination file in each iteration, overwriting the previous version.
    I am running into following Errors:
    [Flat File Destination [38]] Warning: The process cannot access the file because it is being used by another process.
    [Flat File Destination [38]] Error: Cannot open the datafile "Example.csv".
    [SSIS.Pipeline] Error: Flat File Destination failed the pre-execute phase and returned error code 0xC020200E.
I know what's happening but I don't know how to fix it. The first time through the Foreach loop, the destination file is updated. The second time is when this error pops up. I think it's because the first iteration is not closing the destination file. How do I force a close of the file within the Data Flow Task or through a subsequent Script Task?
This works within a SQL 2008 package on one server but not within a SQL 2012 package on a different server.
    Any help is greatly appreciated.
    Thanks! 

Thanks for the response, Narsimha. What do you mean by FELC?
    First time poster - what is the best way to show the package here?
