Production related data extraction

Hi all,
I am new to PP. I have a requirement to get the minimum production lot size, production throughput for products at a location, unit production cost, initial setup time, BOM input/output items, etc. I was trying the SQVI transaction, but I don't know which tables are suitable for the query.
It will be great to have some inputs.
Thanks
Mano

Hi,
Please refer to the following tables:
AUFK - Order master data
AFKO - Production order header data
AFPO - Production order item data
AFRU - Production order confirmations
RESB - Reservations/dependent requirements
S022 - Order operation data for work center
JEST - Object status
CRHD - Work center header
STPO - BOM item details
STKO - BOM header
PLKO - Routing header details
PLPO - Routing operation details
For the minimum lot size, check field BSTMI in table MARC.
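As a minimal ABAP sketch of the kind of query SQVI would build (the report name, selection parameters and output list here are illustrative, not from the thread), the minimum lot size could be read from MARC like this:

```abap
REPORT zpp_lotsize_demo.

" Illustrative selection screen: plant and material range
DATA gv_matnr TYPE marc-matnr.
PARAMETERS p_werks TYPE marc-werks.
SELECT-OPTIONS s_matnr FOR gv_matnr.

TYPES: BEGIN OF ty_lot,
         matnr TYPE marc-matnr,
         bstmi TYPE marc-bstmi,  " minimum lot size
       END OF ty_lot.
DATA: lt_lots TYPE STANDARD TABLE OF ty_lot,
      ls_lot  TYPE ty_lot.

START-OF-SELECTION.
  " Minimum lot size per material at the chosen plant
  SELECT matnr bstmi FROM marc
    INTO TABLE lt_lots
    WHERE werks = p_werks
      AND matnr IN s_matnr.

  LOOP AT lt_lots INTO ls_lot.
    WRITE: / ls_lot-matnr, ls_lot-bstmi.
  ENDLOOP.
```

The same pattern (join AFKO/AFPO by AUFNR, or STKO/STPO by STLNR) covers the order and BOM data listed above.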
Regards,
Sankaran

Similar Messages

  • Production Related Data

    Hi Gurus,
I have an ALV where I am displaying a few columns: Invoice #, Customer #, Sales Order #, Material #, Batch #, Serial #, Material Document # and Plant. As per my functional knowledge, most of these fields are sales data.
The problem is that I am supposed to display the manufacturing plant for a particular batch. Currently I am picking the plant from the MCHA/B/* tables, which gives me the distribution plant.
I was unable to find a link between the sales-related data and the production-related data so as to link the key fields and fetch the manufacturing plant.
Any input or suggestion would be highly appreciated.
    Regards,
    Neo.

Hi Neo,
If you are maintaining a plant profile then you can find it out; check view V_TWRF2 for the plant category.
Regards
Prabhu

  • How to do the CRM Product Master Data extraction?

    Hello guys,
    I have to upload the CRM Product Master Data from a CRM 6.0 system to a BI 7.0 system.
    The only information that I have is the infocube in the BI for the extraction is 0CRM_PROD.
I've checked the CRM product table COMM_PRODUCT and found that the entries in the table and the infocube don't correspond to each other exactly. I also cannot find an appropriate DataSource in the BI or CRM system for the data transfer.
Can anyone tell me anything about this kind of master data extraction to BI?
    Thanks in advance.
    Best Regards

    Hi,
Try generic extraction using RSO2: create a DataSource and give it the table name COMM_PRODUCT.
Bye.

  • Rsa3 Data extraction problem

    Hi All,
We have a DataSource called "0BBP_TD_CONTR_2". It collects contract-related data from SRM. We activated it using transaction RSA5 and extract data from SRM to BW. It works fine.
However, when we tried to extract data again we came across an error on the BW side pointing to the source system. When we checked the SRM side:
In RSA6: the DataSource is active.
In RSA3: when we extract data it gives the error "No corresponding product found".
When we double-click this error it says: "No corresponding product found, message no. COM_PRODUCT004".
So we cannot see the result of RSA3.
    Is there anyone who came across with such an error?
    Thanks in advance
    #Bill J.

    Hi Bill,
    Please take a look at the following SAP Note:
    I think it will also apply to SRM system.
    The error message "No corresponding products found" occurs if the product GUID from the R/3 System is not the product GUID under which the product has been created in the CRM/EBP System.
    This may have the following reasons:
    You create the product data of a CRM/EBP System by copying the product data of a different CRM/EBP System.
      If this different CRM/EBP System is linked to a different R/3 Back-end System, the products can no longer be identified during another download. The reason for this is that the different R/3 Back-end System has assigned other product GUIDs than the original R/3 Back-end System.
You overwrite material data in the R/3 Back-end System - including the contents of table NBDSMATG16 - with a copy from a different R/3 Back-end System. If material data from the first R/3 Back-end System has already been transferred to a CRM/EBP System, the corresponding products can no longer be identified in the CRM System during a new download attempt.
You linked the CRM/EBP System to a different R/3 Back-end System. This leads to the above problem if the same logical system is assigned to the new R/3 Back-end System as to the original R/3 Back-end System.
    You load materials from two R/3 Back-end Systems to which the same logical system is assigned respectively into a CRM System.
    Solution
The logical system of an R/3 Back-end System must not be changed if a download from this R/3 Back-end System has occurred.
    The creation of product data in a CRM/EBP System should not occur by means of a client copy. When you set up a new CRM/EBP System, the products should be created by means of a download from the connected R/3 Back-end System.
      If it is required to make a client copy, for example in the course of an upgrade test, the R/3 Back-end System must be copied to the new CRM system landscape as well. In the new CRM system landscape, you must not rename any logical systems. However, all RFC destinations have to be redefined in such a way that there is no longer a connection to the original system landscape.
      If this restriction is too strict, you must not change the logical system directly in table T000, but by means of Transaction BDLS, which must run three times in this process:
              o in the R/3 Backend:    LOGSYS OLTP old     -> LOGSYS OLTP new
              o in the CRM/EBP System: LOGSYS CRM/EBP old  -> LOGSYS CRM/EBP new
              o in the CRM/EBP System: LOGSYS OLTP old     -> LOGSYS OLTP new
In a production CRM/EBP System, the data should always be loaded from the corresponding production R/3 Back-end System. That is, the product master data should not be transferred from a different CRM/EBP System by means of a copy.
Entries of table NBDSMATG16 of an R/3 Back-end System must not be changed by a copy from a different system. That is, entries in this table must never be changed.
You must not replace the R/3 Back-end System of a CRM/EBP System with a different R/3 Back-end System just like that. When you replace the R/3 Back-end System, a corresponding new CRM/EBP System should be set up as well.
      If the setup of a new CRM/EBP System is not possible in this case, you have to copy the contents of table NBDSMATG16 from the original R/3 Back-end System into the new R/3 Back-end System. You can only do this in turn if no product data exchange with the new R/3 Back-end System has occurred, that is, if table NBDSMATG16 does not contain any entries yet.
      Then you have to assign the same logical system that was assigned to the old R/3 Back-end System to the new R/3 Back-end System. In addition, there must not be a connection or (Customizing) setting that points from the CRM/EBP System to the old R/3 Back-end System or vice versa.
      Recommendation: You should delete the definitions of the RFC destination (Transaction SM59).
If the problem described above occurs because you do not follow the above recommendations, a data inconsistency exists which has to be analyzed by SAP. From the product master point of view, you can solve this problem by deleting the affected products and creating them again by means of a download.
    However, this procedure is problematic or cannot be carried out in production systems if references to the affected products are already made in other objects (for example, in orders). In this case, these references would refer to product GUIDs that no longer exist in the system.
    In such cases, individual solutions (which are quite extensive) must be found.
If several R/3 Back-end Systems are connected (currently, this is only supported in EBP Systems), different logical systems must be assigned to them.
Let me know if you have any issues or questions.
    Regards
    Satish Arra

  • Is it possible to integrate relational data with OLAP cubes?

I have a web application that accesses cubes created from AWM via the OLAP API. I need to integrate a column from a relational table into the front-end application and display the column alongside the cube data.
    Is there any way to achieve the functionality from the OLAP API?

Can you explain how the relational data source relates to the OLAP data? Is it a master-detail relationship? If this is the case then you could consider the following:
1) It depends on how you are displaying the OLAP data. If you are using a non-BI Beans presentation bean and the keys are consistent across both data sources, it should be possible to create two separate queries and glue them together using the common keys within your data source module.
2) Alternatively, you could create a custom text measure within AWM and then use OLAP DML to extract the detail data and load it into a multi-line text variable that can be retrieved via the OLAP API. This might not work well if there is a large number of rows in the text variable to retrieve, as formatting the results within your application might get complicated. The OLAP DML help contains a lot of excellent examples that will help you create a program that uses SQL commands to load data.
    Hope this helps
    Keith Laker
    Oracle EMEA Consulting
    BI Blog: http://oraclebi.blogspot.com/
    DM Blog: http://oracledmt.blogspot.com/
    BI on Oracle: http://www.oracle.com/bi/
    BI on OTN: http://www.oracle.com/technology/products/bi/
    BI Samples: http://www.oracle.com/technology/products/bi/samples/

  • ECC Data extraction

    Hello All,
I have a situation in my department where I need to extract ECC data to a third-party database (MS SQL). Initially we were thinking of using SAP BW extractors to push the data out through SAP PI, but this doesn't seem to be feasible. Is this a fair assessment of the situation?
What other methods can we use for data extraction? We are evaluating ABAP programs, SAP Queries and BAPIs. Is there any other way we can extract this data set? The data set is required for SD, FI-AR/AP, MM and PP.
We do not need the data on a real-time basis, but along the same lines as a data warehouse extraction like SAP BW.
    Appreciate your help.
    Thanks

    Hi
I recently figured out what to do and here is what we do:
Not many would allow you to extract data directly from SAP R/3, but we have been successful in extracting data for our in-house implementation of BI at org level. You need to understand which tables of R/3 would give you which information.
Once that information is gathered, extracting the data with BODS is very simple. It is a bit tricky and not direct, but we do it as a process of different stages.
    We use BODS 3.1
    Step 1 :
    You need to create 2 connections on SAP R/3 Datastore :
1) conn 1:
Use the option generate and execute (ABAP execution option)
data transfer method: direct download
working directory on SAP server: here give the local path on which the BODS server is running, e.g. d:\bods
local directory: d:\bods
generated ABAP directory: d:\bods
execute in background: no
    Create data flow :
    SAPR/3 >>> (sap r/3 table>>>query transformation>>data transport (dat file))
    ||__query transformation >> target database (we have chosen oracle) . You could create a datastore for sql server.
And import that table under this connection.
Even for SAP R/3, import the table immediately after you create the SAP R/3 datastore.
We have used the replace-file option in the data transport of the dat file,
so that each time you run the job the file gets replaced.
    Step 2 :
Create a job for this workflow and run it.
It will create a dat file and an ABAP file in the specified d:\bods folder.
    Step 3 :
Send the ABAP file to the SAP BW / SAP team, requesting them to create this ABAP program and make it available on your SAP R/3 system.
(We found this is easy for anyone to do: all you need is to copy, paste, activate and transport it to the SAP R/3 system, and it does not require any time for ABAP programming as such.)
Note: for the SAP R/3 datastore, the user name being used needs access to all the required tables of SAP R/3, otherwise data access will not happen. You can check the data flow at each stage and it will say "not authorised" if you don't have access.
    Step 4 :
Now create another SAP R/3 connection, conn2:
This connection will use the option: execute preloaded (ABAP execution option)
execute in background: no
data transfer method: shared directory
working directory on SAP server: d:\bi_share
(Create a shared folder bi_share on the SAP R/3 server; the user on the datastore should have complete read and write access to this shared folder.)
application path to share directory: \\sapservername\bi_share
    Step  5 :
Now, if you have the ABAP program already available on your R/3 system,
re-run the job using the new datastore with execute preloaded,
which points to the shared directory on the SAP server itself.
This job can then be scheduled on the production BODS server.
    ======================================================
Wherever required, import the tables under the related datastore.
Although it might sound like a lengthy process, it actually takes very little time compared to the time involved in asking someone to code it in ABAP, or taking the help of BW, or whatever.
We have done everything on DEV BODS and tested it,
then we have moved everything into production.
And to date it has been successful for us.
We have taken the help of ABAP programmers only when we explicitly required some customised logic which was not available directly in the SAP R/3 system.
Otherwise, we generally pull all the fields once we take any table from SAP R/3, so that they can be used any time later, even if not required today.
And so, with a little effort, even a person who is new to SAP R/3 or BW or BODS, like me, is able to manage. So it is not difficult, I suppose.
Good luck.
We have used Oracle for the reporting database, and SQL Server should also work the same way.
We use SQL Server only for metadata information,
like the BO repository or the BODS repository.
    Cheers
    Indu
    Edited by: Indumathy Narayanan on Jul 27, 2011 8:23 AM
    Edited by: Indumathy Narayanan on Jul 27, 2011 8:24 AM
    Edited by: Indumathy Narayanan on Jul 27, 2011 8:27 AM

  • BODS 3.1 : SAP R/3 data extraction -What is the difference in 2 dataflows?

    Hi.
Can anyone advise what the difference is between the two data flows for extracting data from SAP R/3?
    1)DF1 >> SAPR/3 (R3/Table -query transformation-dat file) >>query transformation >> target
    This ABAP flow generates a ABAP program and a dat file.
    We can also upload this program and run jobs as execute preloaded option on datastore.
    This works fine.
    2) We also can pull the SAP R/3 table directly.
    DF2>>SAPR/3 table (this has a red arrow like in OHD) >> Query transformation >> target
    THIS ALSO Works fine. And we are able to see the data directly into oracle.
    Which can also be scheduled on a job.
But I am unable to understand the purpose of the different types of data extraction flows:
when to use which type of flow for data extraction,
and the advantages/disadvantages of the two data flows.
What we are not understanding is this:
if we can directly pull data from the R/3 table through a query transformation into the target table,
why use the flow of creating an R/3 data flow,
then do a query transformation again,
and then populate the target database?
There might be some practical reasons for using these two different types of flows for data extraction, which I would like to understand. Can anyone advise, please?
    Many thanks
    indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 3:25 PM

    Hi Jeff.
    Greetings. And many thanks for your response.
    Generally we pull the entire SAP R/3 table thro query transformation into oracle.
    For which we use R/3 data flow and the ABAP program, which we upload on the R/3 system
    so as to be able to use the option of Execute preloaded - and run the jobs.
Since we do not have any control over our R/3 servers, nor do we have anyone for ABAP programming,
we do not do anything at the SAP R/3 level.
I was doing trial-and-error testing on our workflows for our new requirement:
WF 1, which has some 15 R/3 tables.
For each table we have created a separate dataflow.
And in some dataflows, where the SAP tables had a lot of rows, I decided to pull them directly,
bypassing the ABAP flow.
    And still the entire work flow and data extraction happens ok.
    In fact i tried creating a new sample data flow and tested.
    Using direct download and - and also execute preloaded.
    I did not see any major difference in time taken for data extraction;
    Because anyhow we pull the entire Table, then choose whatever we want to bring into oracle thro a view for our BO reporting or aggregate and then bring data as a table for Universe consumption.
Actually, I was looking at other options to avoid this ABAP generation and the R/3 data flow because we are having problems in our dev and QA environments, which give delimiter errors, whereas in production it works fine. The production environment is an old setup of BODS 3.1; QA and dev are relatively new BODS environments, and they are the ones having this delimiter error.
    I did not understand how to resolve it as per this post : https://cw.sdn.sap.com/cw/ideas/2596
And trying to resolve this problem, I ended up trying to pull the R/3 table directly, without the ABAP workflow, just by trial and error of each and every drag-and-drop option, because we urgently had to do a POC and deliver the data for the entire E-Recruiting module of SAP.
I don't know whether I can do this direct pulling of data for the new job which I have created,
which has 2 workflows with 15 dataflows in each workflow,
and push this job into production.
And also whether I can bypass this ABAP flow and pull R/3 data directly in all the dataflows in the future, for any of our SAP R/3 data extraction requirements. This technical understanding of the difference between the 2 flows is not clear to us, and being new to the whole of ETL, I just wanted to know the pros and cons of this particular data extraction.
    As advised I shall check the schedules for a week, and then we shall move it probably into production.
    Thanks again.
    Kind Regards
    Indu
    Edited by: Indumathy Narayanan on Aug 22, 2011 7:02 PM

  • How to retrive relational data from an XMLType column in Oracle 10g R2

    Hi
I want to know how to retrieve the data which is in an XML document in an XMLType column in a table (or an XMLTable which holds the XML document). This XML document has to be queried with XQuery as relational data (not as an XML document).
If anybody has some ideas, please share them ASAP.
Please share an example, because I am new to XQuery.
    Thanks in Expectation,
    Selva.

Got it working now. I used the 'extract' function in my select statement, but had to add the .getStringVal() function. The extract function, just by itself, returns an XMLType. The call for the column in the SQL statement looked like this:
    extract(XML_CONTENT, '/ROOTOBJECT').getStringVal() xml_content
    Thanks so much for your help. Problem solved!

  • R/3 SP Import:Open data Extraction requests Error

We are getting the below error while Basis is applying a support package upgrade in the R/3 source system.
    Open Data Extraction Requests
    The system has found a number of open data extraction requests. These
    should be processed before starting the object import process, as
    changes to DDIC structures could prevent data extraction requests from
    being read after the import, thus causing them to terminate. You can
    find more information about this problem in SAP Note 328181.
    Call the Customizing Cockpit data extraction transaction and process all
    open extraction requests.
Initially we cleared the entries in LBWQ, RSA7 and SM13, but after clearing these entries we still get the same error.
For a support package upgrade in R/3, do we need to delete the setup tables in the production environment as well?
Is there any other way to upgrade the support package in the R/3 system without deleting the setup tables?
Please help us with your inputs urgently.
    Thanks
    Message was edited by:
            Ganesh S

    Thanks Siggi for the suggestion.
We have already cleared the V3 updates by running the RMBWV* jobs manually. After clearing all the entries in LBWQ, RSA7 and SM13 we still got the same error.
When we imported in the R/3 development system after deleting the setup tables, we didn't get the error. But we are concerned about deleting the setup tables in the production system,
even though the deltas are already running fine.
Is there any other way around this, or is deleting the setup tables the only option?
    Please help...

  • Invalid data status error during the data extraction

    Hi,
While extracting capacity data from the SNP capacity view to BW, I get an "invalid data status" error and the data extraction fails.
When I debugged the bad requests of the ODS object, I found that for a certain product (which has both positive and negative input and output quantities) co-product manufacturing orders were created, but this product was not marked as a co-product, and functionally that is fine.
How can I rectify the data extraction problem? Can you advise?
    Thanks,
    Dhanush

    Sir,
In my company, some production orders have the status "errors in cost calculation", i.e. CSER. How should we deal with these kinds of errors?

  • How to  fetch the relational  data from the xml file registered in xdb

    Hi,
I have to register an XML file into the XDB repository and fetch the data of the XML file as a relational structure through a select statement.
I used the below query to register the XML file in XDB:
DECLARE
  v_return BOOLEAN;
BEGIN
  v_return := DBMS_XDB.CREATERESOURCE(
    abspath => '/public/demo/xml/db_objects.xml',
    data    => BFILENAME('XML_DIR', 'db_objects.xml')
  );
  COMMIT;
END;
Now I have to fetch the values in the XML file as relational data.
Is that possible?
Can anyone help me?
    Regards,
    suresh.

When you transform your XML data to an XMLType you can do something like this, for example:
    select
    extractvalue(value(p),'/XMLRecord/Session_Id') session_id,
    extractvalue(value(p),'/XMLRecord/StatementId') StatementId,
    extractvalue(value(p),'/XMLRecord/EntryId') EntryId
    from
    table(xmlsequence(extract(xmltype('
    <XMLdemo>
    <FormatModifiers><FormatModifier>UTFEncoding</FormatModifier></FormatModifiers>
    <XMLRecord>
    <Session_Id>117715</Session_Id>
    <StatementId>6</StatementId>
    <EntryId>1</EntryId>
    </XMLRecord>
    </XMLdemo>
    '),'/XMLdemo/*'))) p
    where extractvalue(value(p),'/XMLRecord/Session_Id') is not null;
For this sample I've put readable XML in plain text and converted it to XMLType so you can run it on your own database.

  • Performance Tunning- data extraction from FMGLFLEXA table

    Hi,
I need to fetch data from the FMGLFLEXA table based on the condition below:
           WHERE rfund IN s_rfund
               AND rgrant_nbr IN s_rgnbr
               AND rbusa IN s_rbusa
               AND budat LE v_hbudat.
Please tell me how I can optimize this extraction, because in the production system there are lakhs of records.
    Regards,
    Shweta.

Create a secondary index on these fields; with it, data extraction from the table will be fast.
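As a hedged sketch (the target structure and its field list are illustrative assumptions, and the secondary index on RFUND/RGRANT_NBR/RBUSA is assumed to exist or be created first), the SELECT should also name only the fields it needs so the database can use that index:

```abap
" Illustrative: restrict the field list and keep the WHERE clause
" aligned with the secondary index fields so the optimizer can use it.
TYPES: BEGIN OF ty_fm,
         rfund      TYPE fmglflexa-rfund,
         rgrant_nbr TYPE fmglflexa-rgrant_nbr,
         rbusa      TYPE fmglflexa-rbusa,
         budat      TYPE fmglflexa-budat,
       END OF ty_fm.
DATA lt_result TYPE STANDARD TABLE OF ty_fm.

SELECT rfund rgrant_nbr rbusa budat
  FROM fmglflexa
  INTO TABLE lt_result
  WHERE rfund      IN s_rfund
    AND rgrant_nbr IN s_rgnbr
    AND rbusa      IN s_rbusa
    AND budat      LE v_hbudat.
```

Avoiding `SELECT *` here reduces both database and transfer load, which matters at this row count.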

  • Field symbols and data extract

Hi friends,
Can anybody please give me details about field symbols and data extracts: what is the purpose of using them, and what are the performance implications of the two?
If someone explains with an example, that would be great.
Please don't tell me to refer to the 980 or the help document.
Please help me out.

Field symbols offer functionality similar to pointers in other programming languages. A field symbol does not occupy memory for the data object; it points to the object that has been assigned to it. You use the ASSIGN statement to relate it to a data object. After the ASSIGN operation you can work with the field symbol in the same way as with the object itself.
You define a field symbol as follows:
FIELD-SYMBOLS <fs>.
DATA field VALUE 'X'.
ASSIGN field TO <fs>.
The field symbol <fs> now inherits all the properties of the assigned field, and you can work with the field symbol in the same way as with the field itself.
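A slightly fuller sketch of the same idea (the variable names are illustrative): any change made through the field symbol changes the assigned variable itself, because no copy is made.

```abap
DATA gv_flag TYPE c LENGTH 1 VALUE 'X'.
FIELD-SYMBOLS <fs> TYPE any.

ASSIGN gv_flag TO <fs>.
<fs> = 'Y'.          " writes through to gv_flag
WRITE: / gv_flag.    " now 'Y'
```

This write-through behaviour is exactly why field symbols are faster than copying work areas, e.g. in LOOP AT itab ASSIGNING <fs>.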

  • SPM data extraction question: invoice data

The documentation on data extraction (Master Data for Spend Performance Management) specifies that invoice transactions are extracted from the table BSEG (general ledger). On the project I'm currently working on, the SAP ERP team is quite worried about running queries on BSEG as it is massive.
However, the extract files are called BSIK and BSAK, which seems to suggest that the invoices are in reality extracted from those accounts payable tables.
Can someone clarify which tables are really used, and if it's the BSIK/BSAK tables, which fields are mapped?

    Hi Jan,
A few additional mitigation thoughts which may help along the way, as the same concerns came up during our project.
1) Sandbox stress testing
If available, take advantage of an ECC sandbox environment for extractor prototyping and performance impact analysis. BSEG can be huge (it contains all financial movements), so e.g. BI folks typically do not fancy a re-init load for the reasons outlined above. Ask Basis to copy a full year of all relevant transactional data (normally FI & PO data) onto the sandbox and then run the SPM extractors for a full-year extraction to get an idea of the extraction's system impact.
Even though system sizing and parameters may differ compared to your production box, you should still get a reasonable idea and direction about system impact.
2) In a second step you may then consider breaking down the data extraction (init/full mode for your project) into 12 monthly extracts for the full year (this gives you 12 files from which you init your SPM system), with significantly less system impact and more control (e.g. it can be scheduled overnight).
3) Business scenario
You may consider using the vendor-related movements in BSAK/BSIK instead of the massive BSEG cluster table as starting tables (and fetch/look up BSEG details only on a need basis) for the extraction process (the index advantages were outlined above already).
    Considering this we managed to extract Invoice data with reasonable source system impact.
    Rgrds,
    Markus

  • Relative date as selection option

    Hi Experts,
Could any of you please help me to define a relative date as a selection option?
I have a program for production order selection with a start date as a selection option (AFKO-GSTRP). I would like to run this program daily as a background job with a dynamic selection, e.g. always select orders with a start date from 5 days before today up to 5 days from now.
So I want to define a selection option which gives the user the possibility to enter only the number of days, and based on this and sy-datum the actual value for the selection can be calculated. But in a SELECT-OPTIONS definition up to now I have only referred to a real data object, and I cannot do it this time with the FOR clause. (My selection field has data element CO_DAT_REL.)
Could any of you please help me, as a beginner at programming, to define this relative date selection option?
    Thanks in advance

Some solutions (if I understood correctly):
- define a variant for the report and use this variant in the transaction definition; define it as a [system variant|http://help.sap.com/saphelp_nw70/helpdata/en/c0/980389e58611d194cc00a0c94260a5/frameset.htm] (name starting with CUS& or SAP&) so it will be transported automatically, and in this variant define the date range as a selection variable of type D: Dynamic date calculation, selecting one of the available variables via F4
- define a parameter field of type integer and build the range yourself by subtracting it from or adding it to the current date; you can define the result range as a select-option, just protect it in the PBO (AT SELECTION-SCREEN OUTPUT with a classical LOOP AT SCREEN/MODIFY SCREEN/ENDLOOP)
    Regards,
    Raymond
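Raymond's second suggestion could be sketched like this (the parameter and range names are illustrative): the user enters only a number of days, and the hidden GSTRP range is rebuilt from sy-datum on each run.

```abap
DATA gv_gstrp TYPE afko-gstrp.
PARAMETERS p_days TYPE i DEFAULT 5.
SELECT-OPTIONS s_gstrp FOR gv_gstrp NO-DISPLAY.

AT SELECTION-SCREEN.
  " Rebuild the hidden range as today +/- p_days
  CLEAR s_gstrp[].
  s_gstrp-sign   = 'I'.
  s_gstrp-option = 'BT'.
  s_gstrp-low    = sy-datum - p_days.
  s_gstrp-high   = sy-datum + p_days.
  APPEND s_gstrp.
```

Saved as a variant and scheduled in SM36, this gives the rolling window without any manual date entry.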
